users@tyrus.java.net

websocket timeout?

From: Matthew Mah <matthew.y.mah_at_gmail.com>
Date: Mon, 03 Nov 2014 18:24:37 -0500

I am trying to detect websocket disconnects for an Android client. I
expect to be able to do this by using a RemoteEndpoint.Async, setting
its send timeout to a relatively small value (10000 ms = 10 s), and
sending ping messages regularly:

endpoint = session.getAsyncRemote();
endpoint.setSendTimeout(10000); // timeout in milliseconds
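
The pings come from a scheduled task, roughly like this (the executor
and the 5 s interval are simplified; the names are illustrative):

import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import javax.websocket.Session;

ScheduledExecutorService pinger = Executors.newSingleThreadScheduledExecutor();

void startHeartbeat(final Session session) {
    pinger.scheduleAtFixedRate(new Runnable() {
        @Override
        public void run() {
            try {
                // empty application data; pings may carry at most 125 bytes
                session.getAsyncRemote().sendPing(ByteBuffer.allocate(0));
            } catch (IOException e) {
                // never reached in my tests, even after the disconnect
            }
        }
    }, 0, 5, TimeUnit.SECONDS);
}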

However, when I disconnect the server endpoint from the network, the
Android client continues to send ping messages indefinitely without any
sign of trouble. I expected either session.isOpen() to start returning
false, or endpoint.sendPing(empty) to throw an exception.
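
In other words, I expected a check like the following to eventually
report the dead connection (a sketch; the helper name is mine):

import java.io.IOException;
import java.nio.ByteBuffer;
import javax.websocket.Session;

boolean connectionLooksDead(Session session) {
    if (!session.isOpen()) {
        return true;  // expectation 1: session reports closed after the timeout
    }
    try {
        session.getAsyncRemote().sendPing(ByteBuffer.allocate(0));
        return false;
    } catch (IOException e) {
        return true;  // expectation 2: the failed ping surfaces as an exception
    }
}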

I have verified with Wireshark that the Android device is sending the
pings to the server.

Is there something I am missing? How should the timeout show up?