Hello Bala,

For what purpose are you archiving the files? What are the files that you are archiving? Why did you decide to do this using QShell instead of conventional means?

For example... if you are using this for system recovery, you really want to use the SAV CL command for stream files & directories, the SAVDLO command for folders and documents, and the SAVOBJ/SAVLIB commands for everything else. But if you do that, the result can only be restored to an IBM i system -- which makes sense for system recovery.
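For instance, a minimal sketch of that first case, saving a directory tree of stream files into a save file (the library, save file, and directory names here are just placeholders):

    /* create a save file, then save the directory tree into it */
    CRTSAVF FILE(MYLIB/ARCHSAVF)
    SAV DEV('/QSYS.LIB/MYLIB.LIB/ARCHSAVF.FILE') OBJ(('/home/bala/data')) SUBTREE(*ALL)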

But do you want to use this for system recovery? Or do you want to archive data for the purposes of transmitting it to another system over a network? In that case, the SAVxxx commands probably aren't what you're looking for -- you're probably going to want to use JAR, ZIP or TAR.
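For example, assuming Java is installed (and so the jar utility is available in Qshell), something along these lines produces a zip-compatible archive that any platform can read -- the paths are placeholders:

    # from Qshell: bundle the directory into a portable archive
    cd /home/bala/data
    jar cvf /home/bala/archive.jar .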

Unless you need to copy IBM i specific object types like *PGM, *DTAARA, *USRPRF, etc.... in which case, you're back to needing to use the SAV commands.
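A sketch of that case might look like this (the object, library, and save file names are made up for illustration):

    /* save a program and a data area into a save file */
    SAVOBJ OBJ(MYPGM MYDTAARA) LIB(MYLIB) DEV(*SAVF) SAVF(MYLIB/ARCHSAVF) OBJTYPE(*ALL)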

So we really can't answer your question until we understand what you're looking for.


Bala Rajamani wrote:
Hello,

I'm looking for a Qshell script to archive files in a sub-directory based on
a certain time-frame. I understand Qshell has a limited set of UNIX commands
available. Wondering if it would be possible to achieve that using a QShell
script?

Thanks in advance,
Bala.
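For the time-frame part of the question, a minimal Qshell sketch might look like this -- the directory, archive path, and 30-day cutoff are all assumptions, so adjust to taste:

    # archive regular files last modified more than 30 days ago
    cd /home/bala/data &&
    find . -type f -mtime +30 | pax -wf /home/bala/archive.tar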

