Why does setting the -Xmx too high sometimes cause the JVM to fail, even if there’s available RAM?

Basically we’ve noticed that on some computers, setting the JVM option -Xmx (max heap size) sometimes causes the JVM to fail to initialize, even when there’s more than adequate RAM on the system.

So for example, on a 4gb machine, we have -Xmx1024m which fails but -Xmx800m works. I could understand on a 1gb machine, even a 2gb machine, but on a 4gb machine, especially considering that Windows, Linux, etc. can swap the RAM out, why does this fail?

I’ve seen a lot of threads and questions saying to reduce your max heap size, but none of them explain why it fails, which is what I’m really looking for.

Also, how do you tell the JVM to consume as much memory as it needs, up to a certain limit?
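The usual approach is to combine a small initial heap (-Xms) with the desired cap (-Xmx); the JVM then grows the heap on demand rather than committing it all up front. A minimal sketch that makes this visible (the class name and the suggested flags are illustrative, not from the original question):

```java
import java.util.ArrayList;
import java.util.List;

// Run with e.g.: java -Xms64m -Xmx800m HeapGrowth
// The committed heap starts near -Xms and grows on demand toward -Xmx.
public class HeapGrowth {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("committed heap at start: "
                + rt.totalMemory() / (1024 * 1024) + " MB");

        // Allocate a few 10 MB chunks and hold references so they are live.
        List<byte[]> chunks = new ArrayList<>();
        for (int i = 0; i < 5; i++) {
            chunks.add(new byte[10 * 1024 * 1024]);
        }

        System.out.println("committed heap after allocating ~50 MB: "
                + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("heap cap (from -Xmx): "
                + rt.maxMemory() / (1024 * 1024) + " MB");
    }
}
```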


Answer

It’s possible that this is due to virtual address space fragmentation. It may not be possible to reserve a contiguous 1024MB address range for the maximum potential size of the heap, depending on the load addresses of DLLs, threads’ stack locations, immovable native memory allocations, kernel reserved addresses and so forth, especially in a 32-bit process.
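One way to see the size of the reservation the JVM is attempting is to check what it actually granted for a given -Xmx. A minimal sketch (the class name is illustrative):

```java
// Run with e.g.: java -Xmx1024m MaxHeapProbe
// maxMemory() approximately reflects the -Xmx cap: the JVM must reserve a
// contiguous virtual address range of roughly this size at startup, and that
// reservation is what fails when a (typically 32-bit) address space is too
// fragmented, regardless of how much physical RAM is free.
public class MaxHeapProbe {
    public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("max heap this JVM reserved: " + maxMb + " MB");
    }
}
```

If the JVM fails to start at all with a given -Xmx, the reservation itself failed; this probe only reports the cap for values that do start.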

User contributions licensed under: CC BY-SA