
no clear memory leaks #17

Closed
GoogleCodeExporter opened this issue Apr 13, 2015 · 2 comments

Comments

@GoogleCodeExporter

Hello guys,

There are memory leaks while processing some files.
I use the following simple script to test:

import pefile
p = pefile.PE('/home/kirill/14439bcfb9c3a6087d67ea6c18b9516a.exe', fast_load=True)
p.parse_data_directories([0,1])
del p

It takes about 350 MB on Ubuntu (32-bit) and about 850 MB on Debian (64-bit), and I can't free the memory afterwards.
I use '1.2.10-76'

Best regards,
Kirill Yakovenko


Original issue reported on code.google.com by yakovenko87@gmail.com on 9 Jun 2010 at 1:26
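(Editor's note: the interpreter-level behavior discussed in the reply below can be checked without pefile. This is a hypothetical, stdlib-only sketch using tracemalloc; the large list is a stand-in for the parsed PE object, not part of the original report.)

```python
import tracemalloc

tracemalloc.start()

# Stand-in for the expensive parse: build, then drop, a large structure.
data = [bytes(1000) for _ in range(10_000)]
alive, _ = tracemalloc.get_traced_memory()   # current bytes while data is live

del data
freed, _ = tracemalloc.get_traced_memory()   # current bytes after the del

tracemalloc.stop()

# The Python-level allocation drops after `del`, even though tools like ps
# may still report the old RSS, since the OS has not reclaimed the pages.
print(alive > freed)
```

This distinguishes "Python freed the objects" from "the process returned memory to the OS", which is exactly the gap the next comment describes.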


@GoogleCodeExporter
Author

The file seems to have a corrupted import table. Other applications (such as PEiD) should take a similarly long time to parse this file's imports.

About the memory usage:
Python has freed the memory used by p, but has not returned it to the OS. If you are using ps to check memory, that may not be an entirely accurate view. Python appears to be clearing the memory correctly in this case.
Here is Python's memory profile before and after running your code:

Before running the code:
-------------------
Partition of a set of 27536 objects. Total size = 3485624 bytes.
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
     0  12340  45  1076696  31   1076696  31 str
     1   6377  23   523664  15   1600360  46 tuple
     2    359   1   272552   8   1872912  54 dict (no owner)
     3     69   0   215160   6   2088072  60 dict of module
     4   1666   6   199920   6   2287992  66 types.CodeType
     5    178   1   192688   6   2480680  71 dict of type
     6   1587   6   190440   5   2671120  77 function
     7    197   1   175264   5   2846384  82 type
     8    143   1   156008   4   3002392  86 dict of class
     9   1031   4    82480   2   3084872  89 __builtin__.wrapper_descriptor

After running the code:
------------------
Partition of a set of 26616 objects. Total size = 3247208 bytes.
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
     0  12177  46  1067048  33   1067048  33 str
     1   6358  24   522160  16   1589208  49 tuple
     2     69   0   215160   7   1804368  56 dict of module
     3   1666   6   199920   6   2004288  62 types.CodeType
     4    185   1   198488   6   2202776  68 dict of type
     5   1578   6   189360   6   2392136  74 function
     6    134   1   180368   6   2572504  79 dict (no owner)
     7    185   1   164416   5   2736920  84 type
     8    137   1   154328   5   2891248  89 dict of class
     9   1031   4    82480   3   2973728  92 __builtin__.wrapper_descriptor
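(Editor's note: the per-type tables above appear to come from a heap profiler such as guppy/heapy's hpy().heap(); that attribution is an assumption. A rough stdlib-only analogue can be built on the garbage collector's view of live objects:)

```python
import gc
from collections import Counter

def heap_census(top=10):
    """Count live objects per type name, most numerous first.

    This is coarser than a real heap profiler: it counts objects the GC
    tracks, and does not measure their sizes in bytes.
    """
    counts = Counter(type(o).__name__ for o in gc.get_objects())
    return counts.most_common(top)

for name, count in heap_census():
    print(f"{count:8d}  {name}")
```

Running such a census before and after `del p` is a quick way to confirm, as the comment above does, that the object counts actually drop.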

Original comment by kbandla%...@gtempaccount.com on 9 Jun 2010 at 9:14

@GoogleCodeExporter
Author

Fixed in revision 77. I've added some heuristics to detect the malformed entries in this file's import table. The heuristics should apply generically to other similarly malformed files.
Once the invalid entries are detected, parsing of those stops. Files like this one are now handled quickly and with low memory consumption.
Thanks for reporting this problem!
Thanks for reporting this problem!
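(Editor's note: the following is a hypothetical illustration of this kind of heuristic; the function names, the validity rules, and the MAX_IMPORTS cap are all invented for the sketch and are not pefile's actual revision-77 code.)

```python
MAX_IMPORTS = 4096  # assumed cap; a legitimate PE rarely imports more symbols

def looks_valid(name: bytes) -> bool:
    """Imported symbol names should be short, printable ASCII."""
    return 0 < len(name) < 256 and all(0x20 <= b < 0x7F for b in name)

def parse_imports(entries):
    """Stop at the first entry that fails the heuristics; return the good ones."""
    parsed = []
    for i, name in enumerate(entries):
        if i >= MAX_IMPORTS or not looks_valid(name):
            break  # bail out instead of burning memory on garbage entries
        parsed.append(name.decode("ascii"))
    return parsed

print(parse_imports([b"CreateFileA", b"ReadFile", b"\xff\xfe\x00garbage", b"ExitProcess"]))
# → ['CreateFileA', 'ReadFile']  (stops at the malformed third entry)
```

Bailing out early is what keeps memory bounded: a corrupted table can otherwise send the parser walking through megabytes of junk data.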

Original comment by ero.carr...@gmail.com on 16 Aug 2010 at 10:57

  • Changed state: Verified
