Python and sqlite3 - adding thousands of rows

Are you using transactions? By default, SQLite wraps every INSERT statement in its own transaction, which slows things way down.

By default, the Python sqlite3 module opens transactions implicitly before a Data Modification Language (DML) statement (i.e. INSERT/UPDATE/DELETE/REPLACE).

If you instead open a single transaction at the start and commit it once at the end, the inserts will run much faster.
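
A minimal sketch with the standard sqlite3 module (the `items` table and its columns are made up for illustration): using the connection as a context manager means all of the INSERTs share one transaction that is committed together at the end.

```python
import sqlite3

conn = sqlite3.connect("example.db")
conn.execute("CREATE TABLE IF NOT EXISTS items (name TEXT, value REAL)")

rows = [("item-%d" % i, i * 0.5) for i in range(10000)]

# The connection as a context manager commits once on success
# (or rolls back on an exception), so every INSERT below runs
# inside a single transaction instead of getting its own.
with conn:
    for name, value in rows:
        conn.execute("INSERT INTO items (name, value) VALUES (?, ?)",
                     (name, value))

conn.close()
```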


Did you try running the inserts inside a single transaction? If not, each insert is treated as its own transaction, and... well, you can read the SQLite FAQ on this here.
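
Another way to get the same effect, again assuming a hypothetical `items` table: a single `executemany()` call inserts all the rows inside one implicit transaction, which you commit once.

```python
import sqlite3

conn = sqlite3.connect("example.db")
conn.execute("CREATE TABLE IF NOT EXISTS items (name TEXT, value REAL)")

rows = [("item-%d" % i, i * 0.5) for i in range(10000)]

# One executemany() call inserts every row under the single implicit
# transaction opened by the module; commit() ends that transaction.
conn.executemany("INSERT INTO items (name, value) VALUES (?, ?)", rows)
conn.commit()
conn.close()
```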


In addition to running the inserts in bulk inside a single transaction, also try running VACUUM and ANALYZE on the database file. It helped with a similar problem of mine.
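
If you want to run those from Python, a quick sketch (file name is just an example): VACUUM cannot run inside a transaction, so commit any pending work first.

```python
import sqlite3

conn = sqlite3.connect("example.db")

# VACUUM must run outside a transaction, so make sure nothing is pending.
conn.commit()
conn.execute("VACUUM")   # rebuilds the database file, reclaiming free space
conn.execute("ANALYZE")  # gathers statistics the query planner can use
conn.close()
```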