Getting ??? for every source leak #108
Comments
Did you build your binary using …
Sure! I found a little workaround: I changed your formula to get a previous (functional) version of your Valgrind. I added system …
Do you also get this problem with a trivial example (like Hello World)? This is quite likely an issue with ML_(check_macho_and_get_rw_loads). Linkers like mold have been giving us a hard time on ELF-based systems, and the Valgrind code that assumed ELF means one text segment and one data segment keeps breaking. Now the code is supposed to peek at how many data segments there are and then trigger debuginfo reading once the text segment and the required number of data segments have been loaded. The Mach-O code is supposed to do the same thing. I don't know whether the MACH_HEADER and load_commands are present with the DSC. Also, it's quite possible that the fixed-size 4k buffer isn't big enough.
Good point @paulfloyd. I had such an issue on arm64 with some read-only segments having the wrong protection (no idea how that's possible), which was messing this up. Hopefully that will be fixed after the merge.
Context
I'm running macOS Catalina (Version 10.15.7). I recently downloaded the latest version of Valgrind, and now I see "???" for every source of leak. The previous version worked perfectly yesterday. Here are some logs:
What went wrong?
I ran Valgrind with the following flags on my program:
What did you expect to happen?
I expected Valgrind to show the source of each memory leak instead of displaying "???".
Information
uname -m: x86_64
sw_vers: 10.15.7
xcrun --sdk macosx --show-sdk-version: 11.1

Thank you for your help.