Issue 2: Memory error when importing massive tables?
I haven't seen any notes about this, so it may just be an isolated issue caused by something I'm doing wrong, but when importing some of the very large tables (e.g. eveNames or mapDenormalize), my process eventually grows to consume my entire memory space and then dies. I have been able to work around it by deleting the records from the CCP dump that have already been imported. Is this a known issue, or am I doing something wrong?
Mar 10, 2010
Got this figured out. I needed to call db.reset_queries() periodically to clear the list of queries that Django accumulates while DEBUG = True. If this drastically slows down import speed, we can look at calling it less often; it isn't necessary for the small importers. SVN update to the latest revision and you should be good to go. Sorry about that!
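For anyone hitting the same thing, here is a minimal sketch of the pattern. While DEBUG = True, Django appends every executed query to the connection's query log, so a multi-million-row import grows without bound unless django.db.reset_queries() is called periodically. The names below (bulk_import, import_row, reset_every) are illustrative, not the project's actual importer code:

```python
def bulk_import(rows, import_row, reset_fn, reset_every=1000):
    """Import rows, clearing Django's query log every `reset_every` rows.

    In a real Django importer, reset_fn would be django.db.reset_queries
    and import_row would do the actual INSERT, e.g.
    Model.objects.create(**row).
    """
    for i, row in enumerate(rows, start=1):
        import_row(row)
        # Without this, DEBUG = True keeps every query in memory
        # for the lifetime of the process.
        if i % reset_every == 0:
            reset_fn()

# Real usage (requires configured Django settings):
# from django.db import reset_queries
# bulk_import(ccp_rows, save_row, reset_queries)
```

Tuning reset_every trades a little per-batch overhead against how large the query log is allowed to grow between resets.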
Status:
Fixed
Owner: snagglepants Cc: -snagglepants
Mar 10, 2010
Awesome, I'll give the change a go with mapDenormalize.
Labels: -Priority-Medium Priority-High