On Thu, Dec 8, 2011 at 9:26 PM, CRPence <CRPbottle@xxxxxxxxx> wrote:

<<SNIP>> I need that RCMD on the Windows server!

So, it looks like I'm S.O.L. as far as running a Windows command like
XCOPY on the server. I might give a shout to our Server Ops to see
if there's a setting they can change on the server that allows an
FTP GET to work on an open file. This *was* the working behavior up
until a few months ago.

That was my recollection from old versions of Windows Server 200# and
past discussions looking for the same RCMD feature that FTP on IBM i
provides. But if an active REXEC daemon is, or could be made, available
on the Windows Server, then RUNRMTCMD should provide a means to issue
the XCOPY; e.g. RUNRMTCMD issued via the client request [FTP subcommand]
SYSCMD, or in a CLP wrapper for the FTP processing. Those terms should
provide good search criteria for finding past discussions on this list.
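
For example, a rough sketch (assuming an REXEC daemon is actually
listening on the Windows server; the host name, user, password, and
paths below are only placeholders):

    PGM
      /* Send the XCOPY to the REXEC daemon on the Windows server.   */
      /* The request fails with a connection-refused error if no     */
      /* daemon is listening on the REXEC port.                      */
      RUNRMTCMD  CMD('xcopy D:\data\export.csv D:\staging\ /Y') +
                 RMTLOCNAME('AAFTP01.XXXXXXX.NET' *IP)         +
                 RMTUSER(ftpuser) RMTPWD('password')
    ENDPGM

From within the IBM i FTP client the same request could be issued as
the subcommand SYSCMD RUNRMTCMD ..., or the whole FTP exchange could be
wrapped in a CLP that overrides the INPUT and OUTPUT files to source
members holding the FTP subcommands.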

I tried some rudimentary tests, and I think they have this (and all of our
servers) locked down pretty tight. For this particular process, we have a
special user name to log on to a server named AAFTP01.XXXXXXX.NET, but after
logging on, the FTP client is "looking" at a different server and in a
specific "root" folder, which is a few folders off of the drive's root
folder. I can also log on to the server using Windows' Remote Desktop
Connection with my own profile. I tried the RUNRMTCMD using both profiles,
and in both cases I got CPE3425 (A remote host refused an attempted
connect operation).


Of course any other communications method for which the server is
both "listening" <<SNIP>>

Understood and agree.

If there is such trust in the application actually being complete, even
though the file remains open, then why not have the application make the
copy after "completing" the work? Or better, just have the application
close the file, since if the application is truly done updating the file,
there is no need to keep the file open.

The software vendor is calling this "working as designed". Basically, the
application is always "on" / loaded, waiting for data to process. I will
contact the vendor to see what's involved to get the application to close
the file and open a new one at, say, midnight.


Why implement a work-around that must rely on the assumption that the
open and unavailable file is no longer being updated? Admittedly, that
assumption apparently was already being made; it just never manifested as
an error. At least changing the application to close the file avoids
making another copy on the server, only for that data to be copied yet
again to the FTP client.

Preaching to the choir, Chuck! I sent an inquiry to our Server Ops group
to see what they recommend.

Thanks again for your help!
- Dan
