Something odd is happening, or maybe not.

Assume someone is using a socket application (client) to
communicate with a web service.

Now, the socket() call completes successfully and returns a
descriptor.  But the connect() times out.  So we close the
socket (which, if I understand correctly, closes any open
connection and destroys the socket).

What happens, though, if there is no connection?  I assume
all that happens is the socket is destroyed.
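
That assumption should be right: close() on a socket that never
connected simply releases the descriptor; there is no connection to
tear down.  To make the scenario concrete, here is a minimal sketch
of the flow, assuming an ordinary BSD-sockets client in C (the host
and port are made up):

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

int main(void)
{
    /* socket() succeeds and returns a descriptor */
    int sd = socket(AF_INET, SOCK_STREAM, 0);
    if (sd < 0) {
        perror("socket");
        return 1;
    }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(80);                      /* made-up port */
    addr.sin_addr.s_addr = inet_addr("192.0.2.10"); /* made-up host */

    /* connect() times out */
    if (connect(sd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("connect");
        /* No connection was ever established, so this close()
           just releases the descriptor -- nothing else to tear
           down, and nothing that should touch any other socket. */
        close(sd);
        return 1;
    }

    /* ... use the connection ... */
    close(sd);
    return 0;
}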

But, in this case, the close() seems to be shutting down a
_different_ socket (one belonging to a client application that
is running alongside the web service client).

But that makes no sense...

So, one client app (VARPG) connects to the iSeries with a
socket.  Another client app creates a socket to talk to a
web service.  That second socket times out on the connect()
(although socket() completes fine).  A close() is issued on
the descriptor, and it shuts down the VARPG client as well?

The only way I can see this happening is if the socket
descriptor (SD) gets set back to zero and zero is the
descriptor the VARPG client's connection is using.  But I
don't see that in my app.  :)
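
For what it's worth, here is a hypothetical C sketch (all names
invented) of the failure mode I am guessing at.  Descriptors are
scoped to the job, so one close() could only touch the VARPG
connection if both sockets live in the same job -- but if they do,
a zero-initialized or stale descriptor variable would explain it,
because 0 is a perfectly valid descriptor number:

#include <unistd.h>

/* BUG: zero-initialized descriptor variable.  0 is a valid
   descriptor number, so if socket() never assigns this variable
   (or something resets it), the close() below targets whatever
   descriptor 0 happens to be in this job -- possibly a completely
   different application's socket. */
int web_sd = 0;

/* Safer: use -1 as the "no socket" sentinel. */
int safe_sd = -1;

void teardown(void)
{
    close(web_sd);        /* may close someone else's descriptor 0 */

    if (safe_sd >= 0) {   /* only close a descriptor we actually own */
        close(safe_sd);
        safe_sd = -1;     /* a stale value can never name a live SD */
    }
}

Initializing to -1 and resetting the variable right after close()
means a stale copy can never collide with a live descriptor.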

Just want to see if I'm crazy or missing something.
