02-14-2011 11:10 AM - edited 02-14-2011 04:10 PM
I have a table with about 3000 records (46 columns each record) in a SQLite database that I want to insert to another SQLite DB.
Initially I implemented it in the most straightforward way: reading all the records from the table into a cursor object, then traversing the cursor and inserting every record individually into the second database. That worked, but performance on the actual devices was really terrible.
I tried to optimize it by doing all the inserts in one transaction (using Database.beginTransaction() and Database.commitTransaction()). This significantly cut down the wait time, but it's still not good enough. In addition, every now and then I start getting a "Database out of memory" exception.
I've measured performance for that transaction on different devices; here are the results (min:sec):
Simulator (Torch 6.0)
DB w/no encryption
One minute for merging a single table is too much for our application. (For the sake of simplicity I've simplified my end goal here; in the real application I actually need to merge more than one table at a time.)
We have an Android app that does the same thing, and merging this table was a performance issue there as well. The solution was the SQLite ATTACH DATABASE command, which gave us a HUGE performance gain. Unfortunately the ATTACH/DETACH DATABASE commands are not supported by the BlackBerry SQLite implementation, so we cannot use the same approach here.
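For reference, the ATTACH-based merge we use on Android boils down to one set-based copy instead of a per-row cursor loop. Since the BlackBerry Java API doesn't expose this, here is only a sketch of the SQLite-level idea using Python's sqlite3 module; the table name `t` and file handling are illustrative:

```python
import sqlite3, tempfile, os

# Prepare a hypothetical source database on disk with some rows to merge.
fd, src_path = tempfile.mkstemp(suffix=".db")
os.close(fd)
src = sqlite3.connect(src_path)
src.execute("CREATE TABLE t (a INTEGER, b TEXT)")
src.executemany("INSERT INTO t VALUES (?, ?)", [(i, str(i)) for i in range(100)])
src.commit()
src.close()

# Destination database: attach the source and copy in one statement.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE t (a INTEGER, b TEXT)")
dst.execute("ATTACH DATABASE ? AS src", (src_path,))
dst.execute("INSERT INTO t SELECT * FROM src.t")  # set-based copy, no cursor loop
dst.commit()
dst.execute("DETACH DATABASE src")

n = dst.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(n)  # → 100
os.remove(src_path)
```

The whole copy runs inside the SQLite engine, which is why it is so much faster than traversing a cursor from application code.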
Another way I thought of to improve performance was to turn off the transaction journal using the SQLite PRAGMA command. Unfortunately, PRAGMA commands are also not supported on BlackBerry.
Does anyone have any idea what else I can try here to improve performance?
In addition, is there any way to prevent these "Database out of memory" exceptions I’m getting when inserting everything in one transaction?
02-15-2011 03:51 AM
In your single transaction, do you insert everything in one INSERT query, or do you launch n INSERT queries? I'd expect the first to be more efficient.
About the OutOfMemory: you could create one transaction per 1,000 inserts, for example.
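The chunked-transaction idea can be sketched at the SQLite level; since the BlackBerry Java API differs, this illustration uses Python's sqlite3 module, with a hypothetical table `t` and a chunk size of 1,000:

```python
import sqlite3

CHUNK = 1000  # commit every 1,000 rows to bound transaction memory

# Hypothetical source and destination databases with the same schema.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE t (a INTEGER, b TEXT)")
src.executemany("INSERT INTO t VALUES (?, ?)",
                [(i, str(i)) for i in range(3000)])
dst.execute("CREATE TABLE t (a INTEGER, b TEXT)")

# Copy row by row, but commit in chunks instead of per row or all at once.
pending = 0
for row in src.execute("SELECT a, b FROM t"):
    dst.execute("INSERT INTO t VALUES (?, ?)", row)
    pending += 1
    if pending == CHUNK:
        dst.commit()   # close the open transaction before it grows too large
        pending = 0
dst.commit()           # flush the final partial chunk

count = dst.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # → 3000
```

Committing per chunk keeps most of the speed benefit of a single big transaction while capping how much uncommitted state SQLite has to hold at once.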
Good luck resolving your issue; I hope somebody will find other solutions!
02-15-2011 06:44 AM
I also have a performance problem when I perform a SELECT statement over thousands of rows. Would it help to do it in a transaction? I didn't, because it's just one statement. What do you think?
02-15-2011 08:39 PM - edited 02-16-2011 09:41 AM
Yes, I use multiple INSERT queries to do all the inserts.
The problem with having everything in one INSERT query is that the multi-row VALUES syntax:
INSERT INTO 'tablename' ('column1', 'column2') VALUES ('data1', 'data2'), ('data1', 'data2'), ('data1', 'data2'), ('data1', 'data2');
is not supported by SQLite.
I found a different way of inserting everything in one SQL query that should be supported by SQLite:
INSERT INTO 'tablename' SELECT 'data1' AS 'column1', 'data2' AS 'column2' UNION SELECT 'data3', 'data4' UNION SELECT 'data5', 'data6' UNION SELECT 'data7', 'data8'
however, when I use the following code:

statement = db.createStatement(/* multi-row INSERT query using UNIONs */);
statement.prepare();
statement.bind(...);

the statement.prepare() call for such syntax throws an exception: "SQL logic error or missing database"
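For what it's worth, the UNION-based multi-row insert does prepare and run against stock SQLite, so the failure may be specific to the device's build or to how the statement is assembled. Here is a working sketch using Python's sqlite3 module (UNION ALL instead of UNION, to avoid an unnecessary duplicate-elimination pass); the table and column names are illustrative. Note that stock SQLite limits compound SELECTs to 500 terms by default, so very large batches would still need splitting:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (column1 TEXT, column2 TEXT)")

# Three rows in one statement via UNION ALL, with ? placeholders bound in order.
sql = ("INSERT INTO t (column1, column2) "
       "SELECT ? AS column1, ? AS column2 "
       "UNION ALL SELECT ?, ? "
       "UNION ALL SELECT ?, ?")
con.execute(sql, ("a1", "a2", "b1", "b2", "c1", "c2"))
con.commit()

rows = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(rows)  # → 3
```

If the same statement text fails on the device, it may be worth double-checking the quoting of identifiers in the generated SQL, since that is a common source of the generic "SQL logic error" message.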
As for the OutOfMemory exception: it is weird, but when I switched to a mode where each transaction has no more than a hundred inserts, I started getting this error even more often... I will try this again.
I don't think wrapping your SELECT statement in a transaction will help. It could actually be even slower, since the SQLite database will be writing to a journal (in case you do a rollback).
It is possible that it is not the SELECT statement that takes the time in your case, but traversing the cursor (I assume you read the results row by row). Try using the profiler to see which operation actually takes most of the time.
02-16-2011 08:30 AM - edited 02-16-2011 08:31 AM
Thanks for your very good posts, mnarinsky.
I don't know what else could help you at this point, but one thing comes to mind: do you have any indexes? Try dropping them and creating them again after you have inserted everything.
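The drop-then-rebuild idea works because SQLite otherwise updates every index on every insert; rebuilding once after the bulk load is a single sorted pass. A sketch of the pattern, again using Python's sqlite3 module for illustration (the table `t` and index name are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (a INTEGER, b TEXT)")
con.execute("CREATE INDEX idx_t_a ON t (a)")

rows = [(i, str(i)) for i in range(3000)]

con.execute("DROP INDEX idx_t_a")             # skip per-row index maintenance
con.executemany("INSERT INTO t VALUES (?, ?)", rows)
con.execute("CREATE INDEX idx_t_a ON t (a)")  # rebuild once, in bulk
con.commit()

count = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
indexes = [r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type = 'index'")]
print(count, indexes)  # → 3000 ['idx_t_a']
```

Whether this wins on a given device depends on how many indexes the table has and how large the batch is, so it is worth measuring both ways.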
03-15-2011 08:42 AM
I managed to get rid of the OutOfMemoryExceptions by always calling System.gc() before inserting lots of data.
Personally, I ditched SQLite because of its dismal performance.
Nowadays I use the file connection API and DataOutputStream. Inserting 800 rows using SQLite took 50 seconds (Curve 5.0, no indices, everything in the same transaction), while storing them with DataOutputStream took 0.7 seconds.
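The flat-file approach amounts to serializing each row as a fixed-layout binary record and reading them back with a mirror-image loop, which is what DataOutputStream/DataInputStream do in Java. A sketch of the same record format using Python's struct module, with a hypothetical two-field row (int plus length-prefixed UTF-8 string):

```python
import struct, io

rows = [(i, "row%d" % i) for i in range(800)]

# Write: big-endian int field, then string length, then the UTF-8 bytes
# (mirroring DataOutputStream.writeInt plus a length-prefixed string).
buf = io.BytesIO()
for a, b in rows:
    data = b.encode("utf-8")
    buf.write(struct.pack(">ii", a, len(data)))
    buf.write(data)

# Read: the exact mirror image of the write loop.
buf.seek(0)
out = []
while True:
    header = buf.read(8)
    if not header:
        break
    a, n = struct.unpack(">ii", header)
    out.append((a, buf.read(n).decode("utf-8")))

assert out == rows  # round-trip preserves every row
```

The trade-off is that you give up SQL querying entirely; this only pays off when the data is read back sequentially anyway.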
03-30-2011 10:00 AM
Thanks hrnt - explicitly calling the garbage collector between transactions seems to help prevent the Out of Memory exceptions.
I've also noticed that storing the database in eMMC memory (on devices that support eMMC) can significantly improve performance. It's still not good enough and won't work for all devices, but at least it's faster than storing it on an SD card: the test I mentioned above now takes half the time on the Torch (~15 seconds instead of 31 seconds).