I’m trying to determine how much heap any given TYPE_INT_ARGB BufferedImage will use so that, for a program that does some image processing, I can set a reasonable max heap based on the size of the image we feed it. I wrote the following program as a test, and then used it to determine the smallest maximum heap under which it would run without an OutOfMemoryError:
    import java.awt.image.BufferedImage;

    public class Test {
        public static void main(String[] args) {
            final int w = Integer.parseInt(args[0]);
            final int h = Integer.parseInt(args[1]);
            final BufferedImage img = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
            // Expected size, in MB, of the int[] backing the image (4 bytes per pixel).
            System.out.println((4*w*h) >> 20);
        }
    }
(The printed value is the expected size of the int[] in which the BufferedImage’s pixel data is stored.) What I expected to find was that the required max heap is something like x + c, where x is the size of the data array and c is a constant consisting of the sizes of the loaded classes, the BufferedImage object, etc. This is what I found instead (all values are in MB):
    4*w*h    min max heap
    -----    ------------
        5         -
       10        15
       20        31
       40        61
       80       121
      160       241
1.5x is a good fit for the observations. (Note that I found no minimum for the 5 MB image.) I don’t understand what I’m seeing. What are these extra bytes?
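For reference, here is a rough sketch of a variant of the test that compares the expected array size with the heap actually consumed by the allocation (the MeasureTest class is my own illustration, and the Runtime figures are only approximate since they depend on GC timing):

    import java.awt.image.BufferedImage;

    public class MeasureTest {
        public static void main(String[] args) {
            final int w = Integer.parseInt(args[0]);
            final int h = Integer.parseInt(args[1]);
            final Runtime rt = Runtime.getRuntime();
            final long before = rt.totalMemory() - rt.freeMemory();
            final BufferedImage img = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
            final long after = rt.totalMemory() - rt.freeMemory();
            // Expected size of the backing int[]: 4 bytes per pixel.
            System.out.println("expected: " + ((4L * w * h) >> 20) + " MB");
            System.out.println("measured: " + ((after - before) >> 20) + " MB");
            System.out.println(img.getWidth()); // keep img reachable
        }
    }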
Answer
On further investigation, the problem appears to be that the Old Generation of the heap cannot expand enough to accommodate the image’s data array, even though there is enough free memory in the heap as a whole.
For further details about how to expand the Old Generation, see this question.
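One way to see this directly is to dump the JVM’s heap memory pools after the allocation. The sketch below (the PoolDump class is illustrative only; the pool names, e.g. "PS Old Gen" or "Tenured Gen", vary by collector) prints how much of the heap each generation is allowed to use. Note also that on many HotSpot configurations the default is -XX:NewRatio=2, which caps the Old Generation at roughly two-thirds of -Xmx and would be consistent with the ~1.5x requirement observed above, but check your own VM’s defaults.

    import java.awt.image.BufferedImage;
    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryPoolMXBean;
    import java.lang.management.MemoryType;

    public class PoolDump {
        public static void main(String[] args) {
            final int w = Integer.parseInt(args[0]);
            final int h = Integer.parseInt(args[1]);
            final BufferedImage img = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
            // Report usage and limit of each heap pool (eden, survivor, old gen).
            for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
                if (pool.getType() == MemoryType.HEAP) {
                    long used = pool.getUsage().getUsed() >> 20;
                    long max = pool.getUsage().getMax(); // -1 if undefined
                    System.out.println(pool.getName() + ": used=" + used + " MB, max="
                            + (max < 0 ? "undefined" : (max >> 20) + " MB"));
                }
            }
            System.out.println(img.getWidth()); // keep img reachable
        }
    }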