Issue 2: Memory error when importing massive tables?
Status:  Fixed
Owner:
Closed:  Mar 2010
Project Member Reported by Ryan.Bis...@gmail.com, Mar 9, 2010
I haven't seen any notes related to this, so it may just be an isolated issue
caused by something I'm doing wrong, but when attempting to import some of the
very large tables, e.g. eveNames or mapDenormalize, my process eventually
grows to consume my entire memory space and then dies. I have been able to
work around this by deleting the records from the CCP dump that have already
been imported. Is this a known issue, or am I doing something wrong?

Mar 10, 2010
Project Member #1 snagglepants
This is a known issue, but my machine has a ton of RAM, so I guess it never ended up
being too big a deal in my case. There has to be some way to do this more
efficiently, though. I've been playing with using the 'del' keyword to clean up Django
objects manually after each iteration, but that doesn't seem to be doing much.

It might be Django's query caching, or something funky with sqlite3. There has to be
some way to do this without bringing machines to their knees; let's see if we can
figure out how to handle it.
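
For illustration, the per-iteration cleanup I've been trying looks roughly like this
(a sketch only; rows and build_object are placeholder names, not the project's
actual importer code):

    for row in rows:                 # placeholder loop over the CCP dump rows
        obj = build_object(row)      # hypothetical helper that builds the Django object
        obj.save()
        # Explicitly drop the reference so it can be garbage-collected.
        # In practice this doesn't help much, which suggests the growth
        # is happening somewhere else.
        del obj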
Cc: snagglepants
Labels: -Priority-Medium Priority-High
Mar 10, 2010
Project Member #2 snagglepants
Got this figured out. I needed to add periodic db.reset_queries() calls to clear the
list of queries that Django keeps while DEBUG = True. If this drastically slows down
import speed, we can look at only doing it every so often. It isn't necessary for
small importers.
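
Roughly, the change amounts to something like this (a minimal sketch, not the actual
importer code; rows and import_row are placeholder names):

    from django.db import reset_queries

    for i, row in enumerate(rows):
        import_row(row)              # placeholder for the per-row import work
        if i % 1000 == 0:
            # With DEBUG = True, Django appends every executed SQL query to
            # connection.queries; clearing that list periodically keeps
            # memory usage flat on huge imports.
            reset_queries()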

SVN update to the latest revision and you should be good to go. Sorry about that!
Status: Fixed
Owner: snagglepants
Cc: -snagglepants
Mar 10, 2010
Project Member #3 Ryan.Bis...@gmail.com
Awesome, I'll give the change a go with mapDenormalize.