From the GC logs, it appears that the heap size is being reduced across both the young and old generations after some time.
Below are three entries from the logs.
{Heap before gc invocations=5:
PSYoungGen total 44800K, used 44180K [0x18220000, 0x1b420000, 0x24a20000)
eden space 38400K, 100% used [0x18220000,0x1a7a0000,0x1a7a0000)
from space 6400K, 90% used [0x1ade0000,0x1b3853c8,0x1b420000)
to space 6400K, 0% used [0x1a7a0000,0x1a7a0000,0x1ade0000)
PSOldGen total 51200K, used 16K [0x0ba20000, 0x0ec20000, 0x18220000)
object space 51200K, 0% used [0x0ba20000,0x0ba24128,0x0ec20000)
PSPermGen total 32768K, used 12417K [0x03a20000, 0x05a20000, 0x0ba20000)
object space 32768K, 37% used [0x03a20000,0x046406c0,0x05a20000)
1047.599: [GC [PSYoungGen: 44180K->5990K(44032K)] 44197K->6006K(95232K), 0.0046671 secs]
Heap after gc invocations=5:
PSYoungGen total 44032K, used 5990K [0x18220000, 0x1b5a0000, 0x24a20000)
eden space 37632K, 0% used [0x18220000,0x18220000,0x1a6e0000)
from space 6400K, 93% used [0x1a7a0000,0x1ad79990,0x1ade0000)
to space 7552K, 0% used [0x1ae40000,0x1ae40000,0x1b5a0000)
PSOldGen total 51200K, used 16K [0x0ba20000, 0x0ec20000, 0x18220000)
object space 51200K, 0% used [0x0ba20000,0x0ba24128,0x0ec20000)
PSPermGen total 32768K, used 12417K [0x03a20000, 0x05a20000, 0x0ba20000)
object space 32768K, 37% used [0x03a20000,0x046406c0,0x05a20000)
}
{Heap before gc invocations=174:
PSYoungGen total 9728K, used 9666K [0x18220000, 0x18d30000, 0x24a20000)
eden space 8640K, 99% used [0x18220000,0x18a8fa58,0x18a90000)
from space 1088K, 94% used [0x18a90000,0x18b910f0,0x18ba0000)
to space 1344K, 0% used [0x18be0000,0x18be0000,0x18d30000)
PSOldGen total 51200K, used 21113K [0x0ba20000, 0x0ec20000, 0x18220000)
object space 51200K, 41% used [0x0ba20000,0x0cebe540,0x0ec20000)
PSPermGen total 32768K, used 12611K [0x03a20000, 0x05a20000, 0x0ba20000)
object space 32768K, 38% used [0x03a20000,0x04670fa0,0x05a20000)
26968.748: [GC [PSYoungGen: 9666K->1271K(9920K)] 30780K->22465K(61120K), 0.0025274 secs]
Heap after gc invocations=174:
PSYoungGen total 9920K, used 1271K [0x18220000, 0x18d50000, 0x24a20000)
eden space 8576K, 0% used [0x18220000,0x18220000,0x18a80000)
from space 1344K, 94% used [0x18be0000,0x18d1dd00,0x18d30000)
to space 1408K, 0% used [0x18a80000,0x18a80000,0x18be0000)
PSOldGen total 51200K, used 21194K [0x0ba20000, 0x0ec20000, 0x18220000)
object space 51200K, 41% used [0x0ba20000,0x0ced29e0,0x0ec20000)
PSPermGen total 32768K, used 12611K [0x03a20000, 0x05a20000, 0x0ba20000)
object space 32768K, 38% used [0x03a20000,0x04670fa0,0x05a20000)
}
{Heap before gc invocations=23195:
PSYoungGen total 1536K, used 1372K [0x18220000, 0x18440000, 0x24a20000)
eden space 896K, 100% used [0x18220000,0x18300000,0x18300000)
from space 640K, 74% used [0x183a0000,0x184173e8,0x18440000)
to space 640K, 0% used [0x18300000,0x18300000,0x183a0000)
PSOldGen total 6272K, used 4627K [0x0ba20000, 0x0c040000, 0x18220000)
object space 6272K, 73% used [0x0ba20000,0x0bea4d70,0x0c040000)
PSPermGen total 32768K, used 12930K [0x03a20000, 0x05a20000, 0x0ba20000)
object space 32768K, 39% used [0x03a20000,0x046c0aa0,0x05a20000)
71908.146: [GC [PSYoungGen: 1372K->533K(1536K)] 6000K->5423K(7808K), 0.0107964 secs]
Heap after gc invocations=23195:
PSYoungGen total 1536K, used 533K [0x18220000, 0x18460000, 0x24a20000)
eden space 896K, 0% used [0x18220000,0x18220000,0x18300000)
from space 640K, 83% used [0x18300000,0x18385438,0x183a0000)
to space 704K, 0% used [0x183b0000,0x183b0000,0x18460000)
PSOldGen total 6272K, used 4890K [0x0ba20000, 0x0c040000, 0x18220000)
object space 6272K, 77% used [0x0ba20000,0x0bee6868,0x0c040000)
PSPermGen total 32768K, used 12930K [0x03a20000, 0x05a20000, 0x0ba20000)
object space 32768K, 39% used [0x03a20000,0x046c0aa0,0x05a20000)
}
The relevant JVM parameters used are as follows:
-server
-Xms600m
-Xss256k
-XX:+UseParallelGC
-XX:+PrintHeapAtGC
-XX:+HeapDumpOnOutOfMemoryError
-Xloggc:verbose_gc.log
-Djavax.net.debug=none
-XX:+PrintGCDetails
-XX:+PrintTenuringDistribution
-XX:+PrintClassHistogram
-Xmx800m
-XX:NewRatio=1
-XX:CompileThreshold=1500
-XX:PermSize=32m
-XX:MaxPermSize=128m
An initial assumption was that the memory was being consumed by too many threads (based on the understanding that stack memory is separate from heap memory). This was shown to be false with a sample application that simply created new threads: when observed with VisualVM, the young and tenured generations filled up.
Is there any explanation for this behaviour?
Update #1 (correction and update): The sample application above also created an array object with each thread, and the array size was a non-trivial fraction of the total heap. The issue is observable on JDK 1.5.
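A minimal sketch of what such a sample application might look like (this is a hypothetical reconstruction, not the original code; the thread count and array size are illustrative): each thread allocates an array, so it is the heap, not the thread stacks, that fills up.

```java
import java.util.ArrayList;
import java.util.List;

public class ThreadAllocTest {
    // Keep references alive so the arrays are not collected immediately,
    // letting them accumulate in the young and then tenured generations.
    static final List<byte[]> retained = new ArrayList<byte[]>();

    public static void main(String[] args) throws InterruptedException {
        for (int i = 0; i < 10; i++) {
            Thread t = new Thread(new Runnable() {
                public void run() {
                    byte[] chunk = new byte[4 * 1024 * 1024]; // ~4 MB per thread (illustrative size)
                    synchronized (retained) {
                        retained.add(chunk);
                    }
                }
            });
            t.start();
            t.join();
        }
        System.out.println("retained arrays: " + retained.size());
    }
}
```

Watching this under VisualVM shows heap usage climbing with each thread, which is consistent with the correction above: the arrays, not the thread stacks, account for the heap growth.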
-R
Answer
Actually, this is normal JVM behavior. When the JVM notices that you have not been using some of the heap memory currently allocated to it for a while, it assumes your application can run with less than you are giving it, and reduces the total size of the heap during garbage collection (though never below the minimum heap size you specified with -Xms), releasing the memory back to the operating system.
If in the future it turns out your application actually needs to use more heap memory, the JVM will reserve more memory from the OS (up to the maximum heap size you specified with -Xmx).
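The committed heap (what the JVM has actually reserved from the OS) and the -Xmx ceiling can be observed directly from within a program via the Runtime API. This small probe is illustrative, not from the original post:

```java
public class HeapSizeProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long committed = rt.totalMemory(); // heap currently reserved from the OS
        long max = rt.maxMemory();         // the -Xmx ceiling
        System.out.println("committed: " + (committed / (1024 * 1024)) + " MB");
        System.out.println("max:       " + (max / (1024 * 1024)) + " MB");
        // committed <= max always holds; after the collector shrinks the heap,
        // committed drops, but it may grow again on demand up to max.
    }
}
```

Sampling these values periodically would show the committed size shrinking over time in a mostly-idle application, matching what the GC logs above record.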
I’ve seen this happen in the past with some of my webapps, and it seems to be normal behavior. The trouble is that this reduction of the heap can cause some unwanted full garbage collections when your application suddenly requires more memory than usual.
In my case, this decrease in heap size resulted in too many stop-the-world full garbage collections. We fixed the problem by bringing the minimum heap size closer to the maximum heap size. If you really want to avoid this behavior entirely, set the minimum heap size equal to the maximum heap size (here, -Xms800m alongside -Xmx800m).
Also note that, IIRC, the memory taken by thread stack space is not counted as part of the heap.
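This separation is easy to demonstrate: exhausting the stack via deep recursion raises a StackOverflowError even while plenty of heap remains free. A small sketch (illustrative, not from the original post):

```java
public class StackVsHeap {
    static int depth = 0;

    static void recurse() {
        depth++;
        recurse(); // unbounded recursion consumes stack frames, not heap
    }

    public static void main(String[] args) {
        try {
            recurse();
        } catch (StackOverflowError e) {
            // The stack is exhausted while the heap is barely touched,
            // showing that stack memory is accounted for separately.
            long freeHeap = Runtime.getRuntime().freeMemory() / (1024 * 1024);
            System.out.println("StackOverflowError at depth " + depth
                    + ", free heap ~" + freeHeap + " MB");
        }
    }
}
```

The depth at which the error occurs depends on the -Xss setting (256k in the flags above), while the free-heap figure depends only on -Xms/-Xmx.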