All I see so far are the commands being sent by the send command. I would like
to see everything being received. I thought about wrapping the entire script
with a shell, but I am doubtful that will work. I am thinking that the expect
program might not pass on received data. I wonder if there is some way to pipe
the session through tee so that expect can do what it needs to do but I would
still have the original stream.
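Actually, maybe expect can capture it itself. As I understand it, putting
something like this near the top of the script should do it (file names are
just examples, and this is untested):

    log_file session.log          ;# transcript of everything sent and received
    exp_internal -f debug.log 0   ;# pattern-match diagnostics to a file, without cluttering the screen

log_file is supposed to record the whole session transcript from that point
on, and exp_internal should show exactly which patterns matched or timed out.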
I am not sure how to have the script run and capture what expect is receiving.
What adds to the mess is that if I run the script, the other side may run a
process to sweep the received file to another location. I have been told that
files are swept and that they aren't. My coworker has been tying up his
computer using FileZilla. And, when he looks moments later, the files are not
there. A straight answer from them would be nice. All I can get is what
minimal information that is being captured by the standard event log - which I
learned was not much in previous conversations.
Is there any issue with attempting to send multiple large files? Tomorrow, I am
going to try this thing with only one file. If that works, I might just invoke
two sftp sessions to get the files transferred, one after the other. Would
have been nice to have had both go in a single sFTP session. I was told that a
single file transfer seemed to work, at least one time. But, that was hard to
prove due to the file sweeping that may or may not have been suspended. The
best contact I have had there was somebody who had a script that could be used
to invoke FileZilla. He did not create it. He claims that all their users use
FileZilla. When I hear the word "all", I get suspicious as to what is really
going on on their end.
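On the single-session idea: if the prompt handling ever cooperates, the middle
of the expect script could just do both puts back to back, something along
these lines (paths and timeout values are invented):

    expect "sftp>"
    send "put /somewhere/bigfile1.dat\r"
    expect -timeout 14400 "sftp>"
    send "put /somewhere/bigfile2.dat\r"
    expect -timeout 14400 "sftp>"
    send "quit\r"
    expect eof

The long per-put timeouts are there because the transfers themselves can run
for hours.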
John McKee
Quoting Scott Klement <midrange-l@xxxxxxxxxxxxxxxx>:
What sort of troubles are you having capturing the session?
John McKee wrote:
I am still having problems communicating with this Windows box using sFTP.
I put different exit values in the script for each expect instruction.
The files that need to be sent are 500+ MB each, and there are two of them. The
expect script is doing one of two things now: 1) exiting with a return code I
have assigned to the last expect, where I am just looking for the sftp> prompt to send
a quit, or 2) exiting waiting for the password prompt.
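For reference, the shape of the script is roughly this (sanitized - host,
password, file name, and exit codes are stand-ins):

    #!/usr/bin/expect --
    set timeout 60
    spawn sftp someuser@remotehost
    expect {
        "password:" { send "notmyrealpassword\r" }
        timeout     { exit 11 }  ;# gave up waiting for the password prompt
    }
    expect {
        "sftp>"     { send "put bigfile.dat\r" }
        timeout     { exit 12 }  ;# never saw the first prompt
    }
    expect {
        -timeout 14400
        "sftp>"     { send "quit\r" }
        timeout     { exit 13 }  ;# transfer never finished
    }
    expect eof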
I had thought that the size of the files might have been a factor in the wait
time. More an act of desperation than anything else approaching rational
thinking. I bumped it up to an insanely high number. The only thing that gets
affected is how long I have to wait for the script to error out. I also
thought the jobq might be a factor. Normally, I have been sending this to
QBATCH. I don't recall the upper limit on active jobs in that jobq, but I do know
it is high enough that a number of jobs start and end. Last night, I submitted
to QTXTSRCH, which, as far as I can remember, has no limit set on active jobs.
Same results.
I was wondering if I can capture the session in a log. I want to be able to see
what expect is receiving as well as what is being sent. Is it possible to
redirect the session to a log file and still have expect work?
The people at the remote site are positive this is on my end. Something is
getting lost, that is obvious. Why FileZilla works and this script does not is
becoming a hassle. A coworker has been running FileZilla and has told me that
transfer time is in excess of two hours.
Any ideas on how to get a handle on what is going on?
John McKee