users@jersey.java.net

[Jersey] Re: Message body reader/writer over very large InputStream

From: Michael Delamere <michael.delamere_at_sevon-it.com>
Date: Thu, 18 Dec 2014 19:36:22 +0100

Is there any way of deleting my message before it ends up in Google's history?

 

Thanks and best regards,

Michael

 

From: Michael Delamere [mailto:michael.delamere_at_sevon-it.com]
Sent: Thursday, 18 December 2014 19:33
To: users_at_jersey.java.net
Subject: [Jersey] Re: Message body reader/writer over very large InputStream

 

Sorry, I don’t know how this happened. I must have clicked on the wrong email by mistake on my iPhone!

 

Best regards,

Michael

 

From: Michael Delamere [mailto:michael.delamere_at_sevon-it.com]
Sent: Thursday, 18 December 2014 17:24
To: users_at_jersey.java.net
Subject: [Jersey] Re: Message body reader/writer over very large InputStream

 


On 18.12.2014 at 16:49, Behrooz Nobakht <nobeh5_at_gmail.com> wrote:

Hi,

I admit up front that this has not been the best approach, but this is a legacy context, so please bear with me.

I have an application that is built on top of Jersey. One of its functions is to transfer files over the network through a RESTful API. We use a custom "message body reader/writer" for this purpose.
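For context, here is a minimal sketch of what such a streaming writer could look like; the class name, buffer size and media type are made up for illustration, and the idea is simply to copy from the source InputStream to the entityStream in small chunks rather than materializing the payload in memory first:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.lang.annotation.Annotation;
import java.lang.reflect.Type;

import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.MultivaluedMap;
import javax.ws.rs.ext.MessageBodyWriter;
import javax.ws.rs.ext.Provider;

/**
 * Hypothetical writer that copies the payload to the entity stream in
 * fixed-size chunks instead of building it up in memory.
 */
@Provider
@Produces(MediaType.APPLICATION_OCTET_STREAM)
public class StreamingFileBodyWriter implements MessageBodyWriter<InputStream> {

    private static final int BUFFER_SIZE = 8192;

    @Override
    public boolean isWriteable(Class<?> type, Type genericType,
                               Annotation[] annotations, MediaType mediaType) {
        return InputStream.class.isAssignableFrom(type);
    }

    @Override
    public long getSize(InputStream entity, Class<?> type, Type genericType,
                        Annotation[] annotations, MediaType mediaType) {
        // -1 means the length is unknown; the connector decides how to transfer it.
        return -1;
    }

    @Override
    public void writeTo(InputStream entity, Class<?> type, Type genericType,
                        Annotation[] annotations, MediaType mediaType,
                        MultivaluedMap<String, Object> httpHeaders,
                        OutputStream entityStream) throws IOException {
        // Copy in small chunks; only one buffer's worth is held in memory here.
        byte[] buffer = new byte[BUFFER_SIZE];
        int read;
        while ((read = entity.read(buffer)) != -1) {
            entityStream.write(buffer, 0, read);
        }
        entityStream.flush();
    }
}

Note that even with a writer like this, the connector underneath may still buffer the entity, which is exactly the problem described below.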

The problem is that sometimes the transferred files are very large. This eventually leads to an OutOfMemoryError in the JVM.

Basically, the API boils down to using the entityStream provided by the Jersey API to write the body of the message. At a lower level, Jersey uses an output stream provided by the URLConnection implementation. There are two cases:

· Using the default URL connection factory, which is built on top of the sun.* implementation.

· Using the Apache HTTP Client.

In both cases, the implementation buffers the entity in an instance of ByteArrayOutputStream. The problem starts when growing the internal array is no longer possible, and that is how the memory error appears.

Is there a workaround or solution for this, either at the configuration level or at the implementation level?

We start to see the problem with files around ~1.5 GB in size.
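For illustration only, and assuming a Jersey 2.x client (the property names below are the 2.x ones and should be checked against the version actually in use), this is a sketch of the kind of client configuration that asks the connector to use chunked transfer encoding instead of buffering the whole request entity:

import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;

import org.glassfish.jersey.client.ClientConfig;
import org.glassfish.jersey.client.ClientProperties;
import org.glassfish.jersey.client.RequestEntityProcessing;

public class ChunkedClientSketch {

    public static void main(String[] args) {
        ClientConfig config = new ClientConfig();
        // Ask the connector to send the request entity with chunked transfer
        // encoding instead of buffering it completely before sending.
        config.property(ClientProperties.REQUEST_ENTITY_PROCESSING,
                        RequestEntityProcessing.CHUNKED);
        // Optional: chunk size in bytes used by the default connector.
        config.property(ClientProperties.CHUNKED_ENCODING_SIZE, 8192);

        Client client = ClientBuilder.newClient(config);
        // ... build the WebTarget and POST the InputStream entity as usual ...
    }
}

With chunked encoding the content length does not have to be known up front, so the connector no longer needs to hold the whole entity in memory to compute it.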

Thanks,
Behrooz