users@glassfish.java.net

OutOfMemoryError and performance issues

From: Drinkwater, GJ (Glen)
Date: Wed, 24 Oct 2007 15:58:59 +0100

Hi

I am using GlassFish v2 b58g with a servlet that downloads large files,
but I keep getting an OutOfMemoryError.

I execute concurrent requests (10 to 100) against this servlet, which
reads a large file (20 to 50 MB) and streams it to the response output.

The server keeps consuming memory, up to the 550 MB I give it, in
byte[] arrays, which profiling links to
com.sun.enterprise.web.connector.grizzly.ssl.SSLWorkerThread.run().
Also, the more I increase the request-processing thread count (e.g.
from 5 to 10, 20, etc.), the quicker it runs out of memory. All of the
Eden, Survivor and Tenured heap space is taken up and never reclaimed,
even after the requests have stopped.

I also noticed memory jump from around 100 MB to 247 MB when a
download of a 60 MB file was requested.

When I run the same code on Tomcat 6.0.14, with a maximum of 50
threads in the HTTP thread pool, the memory footprint never exceeds
7 MB. Performance compared to Tomcat is also very poor, even after
trying the tips in
http://weblogs.java.net/blog/jfarcand/archive/2007/03/configuring_gri_2.html

Can anyone help?

Simple code in the servlet doGet():

        // op is the servlet response OutputStream
        byte[] bbuf = new byte[1024];
        int length;
        FileInputStream in = new FileInputStream(file);
        try {
            while ((length = in.read(bbuf)) != -1) {
                op.write(bbuf, 0, length);
            }
            op.flush();
        } finally {
            in.close();
        }
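For reference, the copy loop can be exercised on its own outside the
container; below is a minimal standalone sketch of the same buffered
copy the servlet performs (the class and method names and the 8 KB
buffer size are illustrative, not taken from the servlet above):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {

    // Copies everything from in to out through a fixed-size buffer
    // and returns the number of bytes transferred.
    public static long copy(InputStream in, OutputStream out, int bufSize)
            throws IOException {
        byte[] buf = new byte[bufSize];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        out.flush();
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] src = new byte[60 * 1024]; // stand-in for a file's contents
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(src), sink, 8192);
        System.out.println(copied == src.length && sink.size() == src.length);
        // prints "true"
    }
}
```

The loop itself only ever holds one buffer's worth of data, so the
per-request memory cost should be the buffer size, not the file size.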

Thanks,
Glen