Hi, I issued the following command about 30 mins ago and it's still running. I think there's at least 100k files in the hundred or so subdirectories. I'm guessing there's no faster way to get the info?
find /docs -type f | wc -l
Thanks
-----Original Message-----
From: Scott Klement <midrange-l@xxxxxxxxxxxxxxxx>
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxx>
Sent: Tue, Apr 19, 2011 5:05 pm
Subject: Re: QSH help
Hello,
On 4/19/2011 3:05 PM, fbocch2595@xxxxxxx wrote:
> ls -lR | wc -l
2067
This will give you more than just the number of files. It'll include
directories, plus it'll have extra lines for headings. (Which will add
to your count.)
I wouldn't use ls for this.
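(If you did want to stick with ls, one way to approximate a file-only
count, just as a sketch and not something from the original post, is to
keep only the lines that start with '-', since that's how 'ls -l' marks
regular files:

ls -lR | grep '^-' | wc -l

That still walks the whole tree, though, so it's no faster than find.)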
> find . -type f | wc -l
166
This will work better... but only includes "type f" (files). If that's
all you want, it's good.
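(If you also wanted directories included in the count, a variation
along these lines should work; this is only an illustration, not
something asked for in the original note:

find . \( -type f -o -type d \) | wc -l
)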
> find -type f | wc -l
find: 001-3026 usage: find [-H | -L | -P] [-Xdx] [-f file] file ...
[expression]
0
You've forgotten to specify the directory here. Syntax is:
find DIRECTORY -type f | wc -l
So this finds everything in the current directory (and its subdirectories):
find . -type f | wc -l
And this finds everything in the /tmp directory (and its subdirectories):
find /tmp -type f | wc -l
du -m
qsh: 001-0019 Error found searching for command du. No such path or directory.
There is no 'du' utility in QShell.
There's one in PASE, however... If you want to run the PASE one from
QShell, try:
/QOpenSys/usr/bin/du -m
(or add /QOpenSys/usr/bin to your PATH. Or use QP2TERM instead of
QShell, where it'll be in the path by default.)
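For example, making it available for the rest of the current QShell
session would look something like this (just a sketch, assuming the
PASE utilities really are under /QOpenSys/usr/bin on your system):

export PATH=/QOpenSys/usr/bin:$PATH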
Personally, I wouldn't use the -m flag. This causes the output to be
in megabytes, which will result in all of your small files being
unmeasurable. I'd probably use 'du -k' (kilobytes). But, I guess it
all depends on what you're looking for.
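For instance, against the /docs tree from the original question,
something like this (again just a sketch, assuming sort is in your
QShell path, which it normally is) reports each directory's size in
kilobytes, with the biggest ones sorted to the bottom:

/QOpenSys/usr/bin/du -k /docs | sort -n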