python - sqlite memory usage issue for db file size > 2GB when creating index


I have a simple SQLite database with 2 tables.

table 1:
    col1: int (indexed)
    col2: text
    col3: int

table 2:
    col1: int
    col2: int
    col3: int

The first table grows to millions of rows; table 2 can have hundreds of millions of rows. Table 1, col2 is indexed after the data is entered. Indexes are also created on table 2, col1 and col2.
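For reference, a minimal sketch of the setup (the file name, table names, and column names are placeholders based on the description above):

import sqlite3

conn = sqlite3.connect("data.db")  # placeholder file name
cur = conn.cursor()

# Schema as described above
cur.execute("CREATE TABLE table1 (col1 INTEGER, col2 TEXT, col3 INTEGER)")
cur.execute("CREATE TABLE table2 (col1 INTEGER, col2 INTEGER, col3 INTEGER)")
cur.execute("CREATE INDEX idx_t1_col1 ON table1 (col1)")

# ... bulk inserts of millions / hundreds of millions of rows go here ...

# Indexes created after the data is loaded -- this is the step that
# fails once the database file grows past a few GB
cur.execute("CREATE INDEX idx_t1_col2 ON table1 (col2)")
cur.execute("CREATE INDEX idx_t2_col1 ON table2 (col1)")
cur.execute("CREATE INDEX idx_t2_col2 ON table2 (col2)")
conn.commit()
conn.close()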

The index creation works fine when the database file size is small (< 3.5 GB). When the database file size is > 3.5 GB, I see a memory error.

This is a Linux system. On a 32-bit kernel, a file size > 2 GB seems to cause the memory error during index creation; on a 64-bit kernel, the limit is > 3.5 GB.

from "top" program, see vm , rss usage goes 3.5gb on 64 bit system before die.

Has anyone seen this? Any suggestions on how to work around the issue? Has anyone had luck with SQLite at multi-GB file sizes plus index creation?

Use a newer SQLite version to avoid eating memory. (3.7.16 works for me.)
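You can check which SQLite library version your Python installation is actually linked against, since it is often older than the command-line sqlite3 on the same box:

import sqlite3

# Version of the SQLite C library the sqlite3 module is linked against
print(sqlite3.sqlite_version)
# Version of the Python binding itself
print(sqlite3.version)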

Ensure there is enough free space on /tmp, or move the temp directory elsewhere.
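One way to redirect SQLite's temporary files from Python is sketched below; the path is only an example, and the PRAGMA variant is deprecated but still honored by many builds:

import os
import sqlite3

# SQLITE_TMPDIR must be set before the first temp file is created,
# so do it before opening the connection.
os.environ["SQLITE_TMPDIR"] = "/path/with/space"  # placeholder path

conn = sqlite3.connect("data.db")

# Alternative: per-connection setting (deprecated, but widely supported)
conn.execute("PRAGMA temp_store_directory = '/path/with/space'")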


