I have a socket client written in Java that sends a message to a socket
server running on the iSeries written in RPG. The socket server is a
spawning socket server based on Scott Klement's code. I am having a
problem where the client times out but the server doesn't realize it.
The client sends a message and times out after 40 seconds if it
doesn't receive a response. The server receives a message, gets data,
and sends a response. Usually it only takes a second or two. On the
send of the response, from the server, I check the response code. If
the response code is negative, I assume the client didn't get the
response and back out the change. The java developer assures me that
once the timeout is reached, the socket connection is ended by the
client. I would expect to get a negative response code on the send
from the server, but that is not always the case. This week we had
some system issues causing the server to send responses two or three
minutes later. I would have expected the pipe to have been broken in
that amount of time.
Am I wrong to expect a negative response code on the send from the
server?
Is there anything I can tweak so the server recognizes that the client
has broken the pipe, so the send gets a negative return code?
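(For what it's worth, one workaround I've seen sketched for this situation
is to probe the connection before sending: if the client has done an
orderly close, a read on the server side returns end-of-file, and the
server can back out the change without ever attempting the send. The
snippet below is a minimal, self-contained Java sketch of that idea, not
your actual server code; the class name and the simulated client thread
are made up for illustration.)

```java
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class PeerCloseCheck {
    public static void main(String[] args) throws Exception {
        // Listen on an ephemeral port for the demo.
        try (ServerSocket server = new ServerSocket(0)) {
            int port = server.getLocalPort();

            // Simulated client: connect, then close immediately,
            // standing in for a client that gave up after its
            // 40-second timeout.
            Thread client = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", port)) {
                    // sends nothing, just closes
                } catch (IOException ignored) {
                }
            });
            client.start();

            try (Socket conn = server.accept()) {
                client.join(); // make sure the client has closed

                // read() returning -1 means the peer performed an
                // orderly close (sent FIN). At that point the server
                // knows the response would never be read and can
                // back out the change instead of sending.
                int b = conn.getInputStream().read();
                if (b == -1) {
                    System.out.println("peer closed");
                } else {
                    System.out.println("peer alive");
                }
            }
        }
    }
}
```

The caveat, and likely the reason your send returned success, is that a
probe like this only detects an orderly close; and a plain send() can
still "succeed" even after the client is gone, because success only means
the data landed in the local TCP send buffer, not that the client
received it. The error typically shows up on a *later* send, after the
peer's RST has come back.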
Thanks,
Mark Garton