I don’t understand how out-of-memory works.
I have a Scala function that allocates a huge number of data structures and stores them in a hash table. Depending on how many fields the class I'm instantiating has, I can allocate somewhere between 22 million and 30 million such instances. However, the macOS Activity Monitor doesn't think memory is being stressed, although the CPU is pretty busy. I'm running on a 4-core laptop.
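To make the setup concrete, here is a minimal sketch of the allocation pattern I mean; the class name, field count, and key type here are placeholders, not my actual code:

```scala
import scala.collection.mutable

object AllocSketch {
  // Hypothetical stand-in for the real class; the number of Long
  // fields is an assumption that affects per-instance size.
  final case class Record(a: Long, b: Long, c: Long)

  // Fill a hash table with n instances, printing progress every million.
  def fillTable(n: Int): mutable.HashMap[Int, Record] = {
    val table = mutable.HashMap.empty[Int, Record]
    var i = 0
    while (i < n) {
      table(i) = Record(i.toLong, i.toLong, i.toLong)
      i += 1
      if (i % 1000000 == 0) println(s"allocated $i instances")
    }
    table
  }
}
```

In my real program the instance count reaches the tens of millions, which is where the heap pressure shows up.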
I've also doubled the heap size twice in the IntelliJ memory settings, from the default 2048 MB to 8092 MB. Without the increase, I run out of heap while running the program. With the setting at 8092 MB, the program has been running for more than 24 hours and has allocated more than 29 million instances (I print a message every million allocations).
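One comparison I could add to the program itself: the standard `java.lang.Runtime` API reports the JVM's own view of the heap (the `-Xmx` ceiling, the memory currently committed, and the part actually in use), which could be printed alongside the allocation counter and compared against what Activity Monitor shows:

```scala
object HeapStats {
  // Print the JVM's own heap numbers using the standard Runtime API.
  // maxMemory is the -Xmx ceiling; totalMemory is what the JVM has
  // committed so far; totalMemory - freeMemory is the part in use.
  def report(): Unit = {
    val rt = Runtime.getRuntime
    val mb = 1024L * 1024L
    println(s"max heap (-Xmx): ${rt.maxMemory / mb} MB")
    println(s"committed heap:  ${rt.totalMemory / mb} MB")
    println(s"used heap:       ${(rt.totalMemory - rt.freeMemory) / mb} MB")
  }
}
```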
Is there any relation between the 8 GB I've requested for the heap size and the memory meter in Activity Monitor?