
Pprof doesn't open large "reco.xxxx._main_-end.heap" file #251

Closed

alk opened this issue Aug 22, 2015 · 9 comments

Comments


alk commented Aug 22, 2015

Originally reported on Google Code with ID 248

Hi. I run my executable with "HEAPCHECK=as-is
HEAP_CHECK_IGNORE_GLOBAL_LIVE=false ./app". As a result I get a
reco.xxx._main_-end.heap file of about 2.1 GB; my application uses about 2 GB
of RAM.
When I run the pprof command suggested after the program finishes, it hangs
and is later killed by the system.
What can I do to solve this problem?

I'm using google-perftools 1.5 on Scientific Linux 5.5 x64.

With best wishes, Konstantin

Reported by K.Gertsenberger on 2010-06-03 13:23:31
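For reference, the workflow described above boils down to roughly the following
commands (the binary name ./app and the dump filename are taken from the report;
text output is just one possible pprof mode):

   # run the program with the heap checker in "as-is" mode
   HEAPCHECK=as-is HEAP_CHECK_IGNORE_GLOBAL_LIVE=false ./app

   # then feed the end-of-run dump the checker wrote to pprof, e.g. as text
   pprof --text ./app reco.xxx._main_-end.heap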


alk commented Aug 22, 2015

Oh my! -- I never expected anyone to work with heap files that big.  Are you on
a 32-bit system or a 64-bit system?  (More to the point, are you using a 32-bit
perl or a 64-bit perl?)  It may be that perl just can't deal with files larger
than 2GB, if it uses a 32-bit signed int to hold file positions and the like.

Or it could just be that perl is making progress, but slowly.  When pprof is
running and looks hung, what happens when you attach to it with ptrace?

Reported by csilvers on 2010-06-03 13:34:28

  • Labels added: Type-Defect, Priority-Medium
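A quick way to answer the 32-bit-vs-64-bit perl question is to inspect the perl
build configuration, for example (a sketch; the fields queried are standard
perl -V / Config module entries):

   # look for 64-bit integer and large-file support in the perl build
   perl -V | grep -E 'use64bit|uselargefiles'
   perl -MConfig -e 'print "$Config{archname}: ivsize=$Config{ivsize} lseeksize=$Config{lseeksize}\n"'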


alk commented Aug 22, 2015

I'm taking part in a serious project based on the ROOT system, which is why my
macro uses 2 GB of memory. Does HEAPCHECK really need to produce a 2 GB output
file?

I'm working with Scientific Linux 5.5 x64 - a 64-bit system.
"perl --version
This is perl, v5.8.8 built for x86_64-linux-thread-multi".

I believe the "open" step is just very slow. When pprof is running and looks
hung, my computer works very, very slowly, which is why it's hard to watch the
ptrace command, but I'll try.

Reported by K.Gertsenberger on 2010-06-03 13:54:31


alk commented Aug 22, 2015

One thing you could try is to run your application under ptrace from the
beginning, something like
   ptrace -f <myapp>

Eventually, it will only be running the pprof code, and you can see what it's
doing.

Another possibility is that the profile is so big that pprof is using up all
your memory, and you're just swapping.  Another thing you could try is to run
'top' while pprof is hanging, to see how much memory it's using.

Reported by csilvers on 2010-06-03 14:02:32
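In practice the monitoring step suggested here looks something like the
following (using top plus strace, which a later comment notes is the command
actually meant; matching the process by name with pgrep is an assumption about
how it was started):

   # watch how much memory the pprof/perl process is using
   top -p "$(pgrep -f pprof | head -1)"

   # or attach a syscall trace to see whether it is still reading the profile
   strace -p "$(pgrep -f pprof | head -1)"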


alk commented Aug 22, 2015

Sorry, I can't find a ptrace command on my Scientific Linux 5.5 system.
I ran top:
perl's memory use grew (over about fifteen minutes) to 5700m VIRT and 1.8g RES
(I have 2 GB of physical memory and 3.9 GB of swap). Then the process was
killed. Can I fix this?

Reported by K.Gertsenberger on 2010-06-07 11:39:45


alk commented Aug 22, 2015

I'm sorry, I mistyped, the command is strace, not ptrace.

But it looks like the problem is just that perl is allocating a lot of memory.
Do you know perl at all?  You could look through the source code and see where
it's allocating memory for the heap profiler, and see if it really needs to
have all that data in memory at once.  There may be places you can move over to
a streaming model.

But it's very possible that it needs all the data it has, and you'll just need
more memory to handle such a huge heap profile.

Reported by csilvers on 2010-06-07 17:41:00
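To illustrate the streaming idea in perl: rather than slurping the whole .heap
file into memory, the parser can walk it one line at a time. This is only a
sketch, not the actual pprof code, and accumulate_record is a hypothetical
per-record handler:

   # hypothetical sketch of a streaming read of a heap-profile file
   my $heap_file = 'reco.xxx._main_-end.heap';
   open(my $fh, '<', $heap_file) or die "open $heap_file: $!";
   while (my $line = <$fh>) {
       chomp $line;
       accumulate_record($line);   # hypothetical: fold one record into running totals
   }
   close($fh);

   # versus the memory-hungry pattern of reading everything at once:
   #   my @all_lines = <$fh>;      # holds the entire multi-GB file in RAM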


alk commented Aug 22, 2015

Did you ever have a chance to look into this?  If you're willing to attach the heap
file (does google code accept such big attachments?), and ideally the executable as
well, I can try to take a look as well.  Or maybe you could put the datafile somewhere
accessible via http.

Reported by csilvers on 2010-08-02 04:46:52


alk commented Aug 22, 2015

Sorry, I was on vacation. I'll try to share the data via http next week.

Reported by K.Gertsenberger on 2010-09-03 10:41:35


alk commented Aug 22, 2015

Any more news on this?

Reported by csilvers on 2011-01-10 03:22:20


alk commented Aug 22, 2015

I'm afraid I'm going to have to close this as CannotReproduce -- I don't even know how
to make a profile file that big.  I'm sure there's a real issue here, with perl having
a bottleneck with large heap-profile files, but I don't know what it is.

If you can attach the profile file, and perhaps the executable, feel free to reopen
the bug.

Reported by csilvers on 2011-08-31 23:41:50

  • Status changed: CannotReproduce

alk closed this as completed Aug 22, 2015