[Jersey] Re: Message body reader/writer provider over very large InputStream

From: Behrooz Nobakht <>
Date: Mon, 22 Dec 2014 09:33:50 +0100


Your assumption was indeed correct. Thanks for the hint. I will give it a
try and provide feedback.


On Fri, Dec 19, 2014 at 12:11 PM, Marek Potociar <> wrote:

> You do not write whether the problem is on the client or server side, but
> from the content I assume client.
> To transfer large files you need to use chunked encoding and disable
> buffering. You may need to look into HttpUrlConnection or Apache HTTP
> Client configuration to disable buffering at the connector level.
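> In Jersey 2.x, a minimal client configuration along these lines could
> look as follows (a sketch, not taken from this thread; the chunk size
> is just an example value and the Apache connector line is optional):
>
>   import javax.ws.rs.client.Client;
>   import javax.ws.rs.client.ClientBuilder;
>
>   import org.glassfish.jersey.apache.connector.ApacheConnectorProvider;
>   import org.glassfish.jersey.client.ClientConfig;
>   import org.glassfish.jersey.client.ClientProperties;
>   import org.glassfish.jersey.client.RequestEntityProcessing;
>
>   public class ChunkedClientFactory {
>       public static Client newChunkedClient() {
>           ClientConfig config = new ClientConfig();
>           // Stream the request entity instead of buffering it in memory.
>           config.property(ClientProperties.REQUEST_ENTITY_PROCESSING,
>                   RequestEntityProcessing.CHUNKED);
>           // Size of each chunk in bytes (example value).
>           config.property(ClientProperties.CHUNKED_ENCODING_SIZE, 64 * 1024);
>           // Optionally route requests through the Apache HTTP Client connector.
>           config.connectorProvider(new ApacheConnectorProvider());
>           return ClientBuilder.newClient(config);
>       }
>   }
>
> With the default HttpUrlConnection-based connector, enabling chunked
> processing should translate to HttpURLConnection.setChunkedStreamingMode
> under the hood, so the request body is no longer accumulated in memory.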
> Cheers,
> Marek
> On 19 Dec 2014, at 07:18, Behrooz Nobakht <> wrote:
> Hi,
> This is a repost, as the original thread took an unintentional detour
> that may create the impression it has already been discussed and answered.
> I readily admit that this has not been the best approach, but this is
> a legacy context, so please bear with me.
> I have an application that is built on top of Jersey. One of its
> functionalities is to transfer files over the network through a
> RESTful API on top of Jersey.
> We use a custom “message body reader/writer” for this purpose.
> The problem is that the transferred files can sometimes be very
> large, which eventually leads to an OutOfMemoryError in the JVM.
> Basically, it boils down to using the entityStream provided by the
> Jersey API to write the body of the message. At a lower level, Jersey
> uses an output stream provided by the URLConnection implementation.
> For this, there are two cases:
> - Using the default URL connection factory on top of the sun.*
> implementation.
> - Using Apache HTTP Client.
> In both cases, the implementation uses an instance of
> ByteArrayOutputStream. The problem starts when growing its internal
> array is no longer possible, and that is how the memory error appears.
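> For reference, the writing side of our provider essentially boils
> down to something like this (a simplified sketch with made-up names,
> not the actual code):
>
>   import java.io.IOException;
>   import java.io.InputStream;
>   import java.io.OutputStream;
>   import java.lang.annotation.Annotation;
>   import java.lang.reflect.Type;
>
>   import javax.ws.rs.WebApplicationException;
>   import javax.ws.rs.core.MediaType;
>   import javax.ws.rs.core.MultivaluedMap;
>   import javax.ws.rs.ext.MessageBodyWriter;
>   import javax.ws.rs.ext.Provider;
>
>   @Provider
>   public class LargeStreamBodyWriter implements MessageBodyWriter<InputStream> {
>
>       @Override
>       public boolean isWriteable(Class<?> type, Type genericType,
>               Annotation[] annotations, MediaType mediaType) {
>           return InputStream.class.isAssignableFrom(type);
>       }
>
>       @Override
>       public long getSize(InputStream entity, Class<?> type, Type genericType,
>               Annotation[] annotations, MediaType mediaType) {
>           // -1 signals that the entity length is unknown.
>           return -1;
>       }
>
>       @Override
>       public void writeTo(InputStream entity, Class<?> type, Type genericType,
>               Annotation[] annotations, MediaType mediaType,
>               MultivaluedMap<String, Object> httpHeaders,
>               OutputStream entityStream) throws IOException, WebApplicationException {
>           // Copy in fixed-size chunks; only one small buffer is held here,
>           // but the connector underneath may still buffer the whole entity.
>           byte[] buffer = new byte[8192];
>           int read;
>           while ((read = entity.read(buffer)) != -1) {
>               entityStream.write(buffer, 0, read);
>           }
>           entityStream.flush();
>       }
>   }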
> Is there a workaround or solution for this, either at the
> configuration level or at the implementation level?
> We start to see the problem with files of around 1.5 GB.
> Thanks,
> Behrooz

-- Behrooz Nobakht