The magic that is SQLite
Back in 2010, I was sitting at a client and using SQLite to do reconciliation between systems when we were doing a migration. Back then I thought: I must do a feature on SQLite as an “App of the week” because, honestly, it’s a beautiful thing.
SQLite has been in my toolset for some years; I used it again recently at a client to do data reconciliation and mine some statistics. It’s especially useful when combined with the excellent SQLiteStudio. I used it just the other day to do mappings between a whole lot of fields in a large interface by loading the data into a few tables and executing some queries, something that would have taken me absolute ages in a spreadsheet.
But my fondness for SQLite has reached a new level since I decided to use it as the database for my new website: softgui.de. That decision came after reading the following on the SQLite website that describes “Appropriate Uses For SQLite”:
The SQLite website (https://www.sqlite.org/) uses SQLite itself, of course, and as of this writing (2015) it handles about 400K to 500K HTTP requests per day, about 15-20% of which are dynamic pages touching the database. Dynamic content uses about 200 SQL statements per webpage. This setup runs on a single VM that shares a physical server with 23 others and yet still keeps the load average below 0.1 most of the time.
That alone sold me.
The advantages are immediately obvious: no need to rent an extra server to run a MySQL database, which is expensive. Administering MySQL or any other RDBMS server is a significant, time-intensive exercise. SQLite requires zero configuration; perhaps the odd VACUUM once in a while if you update content frequently, but that’s it.
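To show just how little setup is involved, here is a minimal sketch using Python’s built-in sqlite3 module. The filename `site.db` and the `posts` table are my own illustrative choices, not anything from a real site:

```python
import sqlite3

# No server, no config files, no user accounts: the database is just a
# file on disk, created on first connect. "site.db" is a hypothetical name.
conn = sqlite3.connect("site.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS posts (id INTEGER PRIMARY KEY, title TEXT)"
)
conn.commit()

# The only "maintenance" you might ever run: VACUUM reclaims free pages
# left behind by frequent updates and deletes.
conn.execute("VACUUM")
conn.close()
```

That really is the whole administration story.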
Instead, I have a single file, currently ~150 KB (there’s not much content on that site yet), and SQLite itself runs in-process, using a minimal amount of memory. The only drawback is that I don’t have direct remote access to the database and it doesn’t offer replication, but that works for this site.
So, instead of living on a separate server, the database for my website is actually just a file in my Git repository. (For now, that is; it might change once it gets large.) I just commit changes on the server and pull to my local dev environment before making further changes. I can even add content in the local dev environment, then push everything to the server again. Magic! Backups? No problem. Just copy the file somewhere safe every day. Done.
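One refinement on the copy-the-file backup: a plain copy is only safe when nothing is writing. SQLite’s online backup API handles the live case, and Python exposes it as `Connection.backup`. A small sketch, with `site.db` and `site-backup.db` as hypothetical filenames:

```python
import sqlite3

# Set up a tiny database to back up. "site.db" is a hypothetical name.
src = sqlite3.connect("site.db")
src.execute("CREATE TABLE IF NOT EXISTS posts (id INTEGER PRIMARY KEY, title TEXT)")
src.execute("INSERT INTO posts (title) VALUES ('hello')")
src.commit()

# SQLite's online backup API copies every page to the destination,
# and is safe even while the source database is in use.
dst = sqlite3.connect("site-backup.db")
src.backup(dst)
dst.close()
src.close()
```

The resulting `site-backup.db` is a complete, consistent copy you can stash anywhere.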
You should also definitely read the SQLite website’s explanation of why doing 200 queries per web page is no problem. By cutting out the overhead of communicating with a separate application, whether over the wire to another host or just in a separate memory space, queries become dirt cheap: https://sqlite.org/np1queryprob.html.
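You can see the effect for yourself. Because SQLite runs in the same process, a query is essentially a function call with no network round trip, so even the classic N+1 pattern is fast. A rough sketch (the table and row counts are my own illustration):

```python
import sqlite3
import time

# An in-memory database with 200 rows, one per "page" of content.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany(
    "INSERT INTO pages (body) VALUES (?)",
    [(f"page {i}",) for i in range(200)],
)
conn.commit()

# The classic N+1 pattern: one small query per row.
start = time.perf_counter()
for i in range(1, 201):
    conn.execute("SELECT body FROM pages WHERE id = ?", (i,)).fetchone()
elapsed = time.perf_counter() - start

print(f"200 queries in {elapsed * 1000:.2f} ms")
conn.close()
```

On any modern machine those 200 queries complete in a tiny fraction of a second, which is exactly the point the SQLite article makes.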
For more gushing over SQLite, I recommend the article Five reasons you should use SQLite in 2016. (It’s so last year, I know.) In it, the author explains all the good things: how concurrent reading and writing is no longer a problem, how well SQLite performs, and so on.
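The concurrency improvement the article refers to comes from WAL (write-ahead logging) mode, in which readers are never blocked by a writer. A small sketch, assuming a local file I’ve called `demo.db`:

```python
import sqlite3

# Enable WAL mode; the pragma returns the journal mode actually in effect.
conn = sqlite3.connect("demo.db")
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]

conn.execute("DROP TABLE IF EXISTS t")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.commit()

# Start a write that is not yet committed...
conn.execute("INSERT INTO t VALUES (1)")

# ...and a second connection can still read, unblocked. It sees the
# snapshot from before the commit, so the uncommitted row is invisible.
reader = sqlite3.connect("demo.db")
before = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]  # 0

conn.commit()
after = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]   # 1

reader.close()
conn.close()
```

In the older rollback-journal mode a committing writer briefly locks out readers entirely; WAL removes that stall, which matters for a website serving reads constantly.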
I used to think of SQLite as a toy database you only use while developing your Rails site locally. No longer. I have gained a whole new respect for it.