


Noticed my grammar is off today 😊

Regards,
Richard Schoen
Web: http://www.richardschoen.net
Email: richard@xxxxxxxxxxxxxxxxx

-----Original Message-----
From: Richard Schoen
Sent: Wednesday, March 30, 2022 7:09 PM
To: midrange-l@xxxxxxxxxxxxxxxxxx
Subject: RE: Something is eating our disk space

Here's a little Python sample I created that crawls a directory tree into a specified table; you can then query by IFS directory or file size.

It supports regular IFS files as well as QSYS.LIB (libraries).

https://github.com/richardschoen/pymonfori/blob/main/pydircrawltodb.py

If the DB2 services don't work for you, this Python example should do the trick.
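For reference, the Db2 service route looks roughly like this. This is just a sketch: it assumes a release recent enough to have QSYS2.IFS_OBJECT_STATISTICS (added via 7.3/7.4 TRs), and the parameter and column names should be verified on your system.

-- List the 50 largest objects under the root directory,
-- walking the whole subtree (this can run a long time on '/').
SELECT PATH_NAME, DATA_SIZE
  FROM TABLE(QSYS2.IFS_OBJECT_STATISTICS(
         START_PATH_NAME => '/',
         SUBTREE_DIRECTORIES => 'YES'))
  ORDER BY DATA_SIZE DESC
  FETCH FIRST 50 ROWS ONLY;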

Example usage of the Python script:
python3 pydircrawltodb.py --dirname / --outputtable QGPL.DIR

The above should crawl your entire file system relatively quickly, including libraries via QSYS.LIB, and create the selected output table.
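To scope the crawl to a single library instead, something like this should work (an untested guess based on the flags above; the output table name DIRQGPL is just a placeholder):

python3 pydircrawltodb.py --dirname /QSYS.LIB/QGPL.LIB --outputtable QGPL.DIRQGPL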

After that you can query the resulting table for the largest objects, or slice it however you want. No more DISKTASKS needed when looking for large stuff.
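For example, a minimal query sketch (the OBJSIZE column name here is a hypothetical placeholder; check the CREATE TABLE statement in the script for the actual column names):

-- Show the 50 largest entries from the crawled table.
SELECT *
  FROM QGPL.DIR
  ORDER BY OBJSIZE DESC
  FETCH FIRST 50 ROWS ONLY;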

Note: Be aware it will crawl as fast as possible, so maybe run it during a slow period, or in batch at a lower priority.

Have fun, and report any issues on the GitHub site.

Regards,
Richard Schoen
Web: http://www.richardschoen.net
Email: richard@xxxxxxxxxxxxxxxxx

-----Original Message-----
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Filip Drzewiecki via MIDRANGE-L
Sent: Wednesday, March 30, 2022 12:49 PM
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxxxxxxxx>
Cc: Filip Drzewiecki <fdrzewiecki@xxxxxxxxxxxxxxxxx>
Subject: Something is eating our disk space


Hi,

I think I'm stuck and I need some help.
PRTDSKINF is showing that 30% of our 2 TB disk is eaten by "User Directories".
That's almost 600 GB.
I've used a few different methods (the old IFSTOOLS, running call qsrsrv parm("METRICS" '/' "EPFS"), and RTVDIRINF), and all of them show only about 65 GB of files.
I've checked '/tmp' and some other directories and can't find what is eating all that space.

I have not tried object_statistics yet, as it was taking ages and I could not figure out how to optimize it.

Filip

