IDA runs out of memory

I am loading a ~1.5 GB ELF file (with DWARF info) into IDA. IDA keeps allocating memory until it has eaten up around 10 GB of RAM, then the system runs out of RAM and IDA shows an error.

Is this normal? Can I somehow limit IDA’s RAM usage or make it use the disk instead?

It’s pretty difficult to say without the file, but you can start by unchecking options in the “Load DWARF” dialog. For example, disabling type loading should skip some of the most time- and memory-consuming operations.

I will send the sample via the Help Center, but I want to know whether this is normal behaviour or something is wrong, i.e. how much RAM does an ELF with debug symbols typically need to be analyzed by IDA?

The debug info is exactly what I wanted to load into IDA, so disabling it is not an option for me.

Thank you for the sample, we’ll investigate why it takes so much memory and whether there’s a way to reduce it. While IDA does use mostly disk storage (the IDB) for storing data, during loading it does use some heap memory, so it’s possible to run out of memory for especially large files.

Unfortunately, the DWARF format is not optimized for the way we use it (full enumeration of all records), so some modification of our approach may be necessary.
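
For a rough idea of how much DWARF data has to be enumerated, you could dump the sizes of the .debug_* sections before loading. A minimal sketch using pyelftools (the file path is a placeholder):

```python
# Rough estimate of how much DWARF data a binary carries, using pyelftools.
# Install with: pip install pyelftools
from elftools.elf.elffile import ELFFile

def debug_section_sizes(path):
    """Return {section_name: size_in_bytes} for all .debug_* sections."""
    sizes = {}
    with open(path, "rb") as f:
        elf = ELFFile(f)
        for section in elf.iter_sections():
            if section.name.startswith(".debug"):
                sizes[section.name] = section["sh_size"]
    return sizes

if __name__ == "__main__":
    # Placeholder path; point it at your own binary.
    for name, size in sorted(debug_section_sizes("big_binary.elf").items()):
        print(f"{name:<20} {size / (1 << 20):8.1f} MiB")
```

A very large .debug_info / .debug_str is a good hint that loading the full type information will be expensive.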

BTW, my suggestion was not to “disable loading of debug info” but only to skip the most memory-consuming parts such as types. You should still get functions and global variable names.


I want to reconstruct some classes, so having correct data types is essential. I hope this issue will be resolved in IDA 9.3.

I was able to load your binary with a maximum memory usage of around 64 GiB.
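
If you want to check the peak usage on your own machine, you could poll the IDA process’s resident set size while it loads. A minimal sketch using psutil (the PID is passed on the command line; adjust for your setup):

```python
# Poll a running process's RSS and report the peak, using psutil.
# Install with: pip install psutil
import sys
import time
import psutil

def track_peak_rss(pid, interval=0.5):
    """Sample the process's resident set size until it exits; return peak bytes."""
    proc = psutil.Process(pid)
    peak = 0
    try:
        while proc.is_running():
            peak = max(peak, proc.memory_info().rss)
            time.sleep(interval)
    except psutil.NoSuchProcess:
        pass
    return peak

if __name__ == "__main__":
    # Usage: python track_mem.py <pid of the IDA process>
    peak = track_peak_rss(int(sys.argv[1]))
    print(f"Peak RSS: {peak / (1 << 30):.1f} GiB")
```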


So for now it’s impossible for me. Will this issue be solved in the next version, or is it not an issue at all and that’s simply what it takes to analyze such an executable?

By the way, since you are done analyzing the file, if you haven’t deleted the IDB, could you be so kind as to send it in the Help Center request? I searched a lot for that sample, and it’s quite a disappointment to find that I lack the hardware to have fun fiddling with it.