users@glassfish.java.net

RE: Forcing chunked encoding while downloading a large file?

From: Kevin Regan <k.regan_at_f5.com>
Date: Tue, 8 May 2012 20:32:21 +0000

Hi Martin,

I'm using just the default GlassFish setup. I don't have any filters installed and I don't have any logging properties set in the application.

The only additional JARs in my web app (WEB-INF/lib directory) are the SLF4J JARs (slf4j-api.jar and slf4j-jdk.jar), used for printing debugging statements, which find their way into the standard GlassFish server.log file. Everything else is being picked up from the default GlassFish 3.1.2.

I am definitely seeing the problem. Would a stack trace from the OutOfMemoryError be useful in diagnosing the problem?

--Kevin

From: Martin Matula [mailto:martin.matula_at_oracle.com]
Sent: Tuesday, May 08, 2012 2:10 AM
To: users_at_glassfish.java.net
Subject: Re: Forcing chunked encoding while downloading a large file?

Hi Kevin,
That should not be happening. For an InputStream, Jersey returns -1 as the content length, i.e. it does not try to buffer the entity. For a File, it determines the content length using the File.length() method.
Are you using any filters in your application? E.g., the logging filter would cause the entity to get buffered.
Martin
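To illustrate the difference Martin describes, here is a minimal, self-contained sketch (plain java.io, no Jersey classes; the streamCopy helper and the 8 KiB buffer size are illustrative, not Jersey's actual implementation). An unbuffered entity writer copies the stream in fixed-size chunks, so heap use stays bounded by the buffer, whereas collecting the entity into a ByteArrayOutputStream first holds the entire file in memory:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    // Copy in fixed-size chunks; memory use is bounded by the buffer,
    // regardless of how large the source stream is.
    static long streamCopy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100_000];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        // A ByteArrayOutputStream sink is used here only to verify the copy;
        // in a servlet container the sink would be the response output stream.
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = streamCopy(new ByteArrayInputStream(data), sink);
        System.out.println(copied); // 100000
        System.out.println(java.util.Arrays.equals(data, sink.toByteArray())); // true
    }
}
```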

On May 4, 2012, at 11:24 PM, Kevin Regan wrote:


I'm downloading a large file with Jersey/Glassfish and I'm seeing a java.lang.OutOfMemoryError.

I'm returning a FileInputStream (I also tried File) from the Jersey handler, and it appears to be buffering the file into a ByteArrayOutputStream before returning it.

Is there any way to force Jersey/GlassFish to stream this large binary file to the client, either through chunked encoding or by not setting Content-Length?

Sincerely,
Kevin Regan
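The HTTP mechanics behind the question can be demonstrated without Jersey, using the JDK's built-in com.sun.net.httpserver (the ChunkedDemo class name and chunk sizes below are illustrative): passing a response length of 0 to sendResponseHeaders tells the server the length is unknown, so it omits Content-Length and responds with Transfer-Encoding: chunked while the handler streams the body.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

public class ChunkedDemo {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/file", exchange -> {
            // Response length 0 = "length unknown": the JDK server then
            // uses chunked transfer encoding instead of Content-Length.
            exchange.sendResponseHeaders(200, 0);
            try (OutputStream out = exchange.getResponseBody()) {
                byte[] chunk = new byte[8192];
                for (int i = 0; i < 4; i++) {
                    out.write(chunk); // stream 32 KiB without buffering it all
                }
            }
        });
        server.start();

        int port = server.getAddress().getPort();
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://localhost:" + port + "/file").openConnection();
        System.out.println(conn.getHeaderField("Transfer-Encoding")); // chunked

        int total = 0, n;
        byte[] buf = new byte[4096];
        try (InputStream in = conn.getInputStream()) {
            while ((n = in.read(buf)) != -1) total += n;
        }
        System.out.println(total); // 32768
        server.stop(0);
    }
}
```

In JAX-RS terms, returning a StreamingOutput (or an InputStream, as Martin notes above) similarly leaves the content length unknown, so the container can stream the response rather than buffer it.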