I wrote a Python program that processes a large amount of text and stores the results in a multi-level dictionary. As a result, the dictionary grows very large (2 GB or more), eats up memory, and leads to slowness or a MemoryError.
So I'm looking at using sqlite3 instead of keeping the data in a Python dict.
But come to think of it, the entire sqlite3 database will still have to be accessible throughout the program's run. So in the end, wouldn't I hit the same problem, with memory eaten up by the large database?
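For context, the kind of port I have in mind is roughly this (a sketch with a made-up two-level count table, not my real schema):

```python
import sqlite3

# Hypothetical schema: a two-level dict like data[outer][inner] = count
# becomes one table keyed on (outer_key, inner_key), stored on disk.
conn = sqlite3.connect("data.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS counts ("
    "  outer_key TEXT, inner_key TEXT, value INTEGER,"
    "  PRIMARY KEY (outer_key, inner_key))"
)

def add(outer, inner, n=1):
    # Upsert: insert a new row, or add n to the existing count.
    conn.execute(
        "INSERT INTO counts VALUES (?, ?, ?) "
        "ON CONFLICT(outer_key, inner_key) DO UPDATE SET value = value + ?",
        (outer, inner, n, n),
    )

def get(outer, inner):
    row = conn.execute(
        "SELECT value FROM counts WHERE outer_key = ? AND inner_key = ?",
        (outer, inner),
    ).fetchone()
    return row[0] if row else 0

add("doc1", "hello")
add("doc1", "hello", 2)
conn.commit()
print(get("doc1", "hello"))  # 3
```

My worry is whether `data.db` here effectively ends up fully resident in memory anyway once I'm reading and writing it constantly.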
Sorry, my understanding of memory is a little shaky. I want to be clear on this before I go to the trouble of porting my program to a database.
Thank you in advance.