


I wonder if the OSS logrotate would work. I've not tried it, but it is used for IFS log files.


<man logrotate>
LOGROTATE(8) System Administrator's Manual LOGROTATE(8)

NAME
logrotate ‐ rotates, compresses, and mails system logs

SYNOPSIS
logrotate [--force] [--debug] [--state file] [--verbose] [--log file] [--mail command] config_file [config_file2 ...]

DESCRIPTION
logrotate is designed to ease administration of systems that generate large numbers of log files. It allows
automatic rotation, compression, removal, and mailing of log files. Each log file may be handled daily,
weekly, monthly, or when it grows too large.

Normally, logrotate is run as a daily cron job. It will not modify a log more than once in one day unless the criterion for that log is based on the log's size and logrotate is being run more than once each day, or unless the -f or --force option is used.

Any number of config files may be given on the command line. Later config files may override the options given in earlier files, so the order in which the logrotate config files are listed is important. Normally, a single config file which includes any other config files which are needed should be used. See below for more information on how to use the include directive to accomplish this. If a directory is given on the command line, every file in that directory is used as a config file.

If no command line arguments are given, logrotate will print version and copyright information, along with a
short usage summary. If any errors occur while rotating logs, logrotate will exit with non-zero status.

</man logrotate>
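In case it helps, here is a minimal sketch of how that might be driven from QSH, assuming logrotate is installed in the PASE environment (e.g. via the open source package repositories, if it is available there) and that the logs live under a hypothetical IFS path such as /home/jde/logs. The paths, pattern, and retention values are illustrative only:

# Write a throwaway config; paths and retention are illustrative only.
cat > /home/jde/logrotate-jde.conf <<'EOF'
/home/jde/logs/*.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}
EOF

# Run against that config, keeping state next to it. This could be
# scheduled daily (e.g. a job scheduler entry that runs QSH).
logrotate --state /home/jde/logrotate.state /home/jde/logrotate-jde.conf

Note that logrotate works on stream files in the IFS, so it would cover the JD Edwards logs rather than save files in QSYS.LIB.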


bryan


Andrew Lopez (SXS US) wrote on 9/22/2020 8:58 AM:
We use a number of QSH commands to clean the IFS of old log files produced by JD Edwards.

I have recently instituted a special backup to SAVF of sensitive files that we are having to open up to new users to maintain. These are created in library SAVFPRDDTA with names like CFG200922 (CFGyymmdd). I need to put in place some regular purges of these daily backups.

What I don't have a feel for is how common/accepted it would be to use QSH commands to clean up files in the traditional file system. We would be using something like:

find /QSYS.LIB/SAVFPRDDTA.LIB -type f -name 'CFG*.FILE' -mtime +60 -exec rm {} \;

This works, but I've never seen it done or demoed (though I could well have missed the examples). I could do a traditional dump of DSPOBJ or DSPFD and process the records, but it's not in my nature to create temporary files when the system can handle them better than I can.
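For what it's worth, a minimal sketch of that purge as it might run from QSH, assuming the CFGyymmdd save files described above and a 60-day retention. A -print pass is shown first so the candidate list can be reviewed before anything is deleted:

# Preview which save files would be purged (modified more than ~60 days ago).
find /QSYS.LIB/SAVFPRDDTA.LIB -type f -name 'CFG*.FILE' -mtime +60 -print

# Actual purge once the preview looks right; this could be wrapped in
# QSH CMD('...') on a job scheduler entry so it runs unattended.
find /QSYS.LIB/SAVFPRDDTA.LIB -type f -name 'CFG*.FILE' -mtime +60 -exec rm {} \;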

So, acceptable? Reasonable? I really prefer to not branch out into 'experimental' approaches and have no idea where this would fall in the IBM i world.




