
Why is SerialGC chosen over G1GC?

I am running Java on very similar VMs and I can't find an explanation for why SerialGC is chosen over G1GC in one case. It's the same Java version, the same OS, and the same VM instance type on AWS; I suspect the only difference is the container settings, but I don't know how to pinpoint what changes. Is there a way to get an explanation of why the JVM decides on one setting or the other?

Java version in both cases:

Java(TM) SE Runtime Environment 18.3 (build 10.0.1+10)
Java HotSpot(TM) 64-Bit Server VM 18.3 (build 10.0.1+10, mixed mode)

When running Java in one case:

java -XX:+PrintFlagsFinal -XX:+PrintCommandLineFlags

Output:

Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF8
-XX:InitialHeapSize=253366976 -XX:MaxHeapSize=4053871616 -XX:+PrintCommandLineFlags -XX:+PrintFlagsFinal -XX:ReservedCodeCacheSize=251658240 -XX:+SegmentedCodeCache -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseSerialGC
[Global flags]
(...)
   bool UseG1GC                                  = false                                     {product} {default}
   bool UseParallelGC                            = false                                     {product} {default}
   bool UseSerialGC                              = true                                      {product} {ergonomic}

And the other:

Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF8
-XX:G1ConcRefinementThreads=8 -XX:InitialHeapSize=253480064 -XX:MaxHeapSize=4055681024 -XX:+PrintCommandLineFlags -XX:+PrintFlagsFinal -XX:ReservedCodeCacheSize=251658240 -XX:+SegmentedCodeCache -XX:+UseCompressedClassPointers -XX:+UseCompressedOops -XX:+UseG1GC
[Global flags]
(...)
   bool UseG1GC                                  = true                                      {product} {ergonomic}
   bool UseParallelGC                            = false                                     {product} {default}
   bool UseSerialGC                              = false                                     {product} {default}


Answer

{ergonomic} in -XX:+PrintFlagsFinal means that the flag was set automatically based on the number of available processors and the amount of RAM.

JDK 10 treats a machine as "server-class" if it has at least 2 available CPUs and 2 GB of RAM. This can be overridden with the -XX:+AlwaysActAsServerClassMachine or -XX:+NeverActAsServerClassMachine JVM flags.

The "server" configuration uses G1 as the default GC, while a "client" machine defaults to SerialGC.
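You can see this directly by forcing each mode and asking which GC ergonomics picks. The two commands below only use standard HotSpot flags plus a grep over the flag dump; adjust to taste:

java -XX:+NeverActAsServerClassMachine  -XX:+PrintFlagsFinal -version | grep -E 'Use(Serial|G1)GC'
java -XX:+AlwaysActAsServerClassMachine -XX:+PrintFlagsFinal -version | grep -E 'Use(Serial|G1)GC'

The first should report UseSerialGC = true {ergonomic}, the second UseG1GC = true {ergonomic}, regardless of the actual hardware.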

To calculate the number of available processors, the JDK takes into account not only the CPUs visible to the OS, but also processor affinity and cgroup limits, including the following (see the sketch after this list):

  • cpu.shares
  • cpu.cfs_quota_us
  • cpuset.cpus
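A quick cross-check is to print what the JVM itself believes it has. The snippet below uses only the standard java.lang.Runtime API (nothing container-specific); on a container-aware JDK these values already reflect the cgroup limits:

// PrintAvailable.java - prints the CPU count and max heap the JVM derived at startup
public class PrintAvailable {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("availableProcessors = " + rt.availableProcessors());
        System.out.println("maxMemory (MB)      = " + rt.maxMemory() / (1024 * 1024));
    }
}

If one environment reports a single available processor, that alone is enough for the JVM to treat it as a client-class machine and pick SerialGC; the max heap (by default roughly a quarter of the detected memory) gives a rough idea of how much RAM the JVM saw.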

Since you run Java in a container, it's likely that the container imposes cgroup limits that result in a smaller number of available CPUs or a smaller amount of memory. Use -Xlog:os+container=trace to find the effective limits in each particular environment.
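For example, running any trivial command with container logging enabled prints the limits the JVM detected at startup (-version is enough to start and stop the VM):

java -Xlog:os+container=trace -version

Compare the reported CPU shares, quota/period, and memory limit between the two environments; the one that ends up with fewer than 2 usable processors (or less than 2 GB of memory) should be the one that falls back to SerialGC.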
