


OK. Looks like this is a winner:

QSH CMD('find -L ''/java'' -name ''*.java'' | xargs fgrep -il "SOMPFLM000"')

Thanks everybody.

PLA



Patrick L Archibald wrote:

Hey Scott

Thanks for all the info. I get "qsh: 001-0085 Too many arguments specified on command." when using the fgrep command on a large directory. It works OK on smaller directories.
With 'find -L /java -name *.java -print | xargs fgrep -li "sompflm000"' it runs OK interactively, but I'm having some trouble submitting it. I'm not sure what is going on with that. I'll keep at it.


Thanks again, PLA
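(A note on the submit trouble: nested quoting is a common culprit -- when QSH CMD('...') is wrapped in SBMJOB, every embedded apostrophe has to be doubled again, and it's easy to get wrong. One way to sidestep that entirely is to put the pipeline in an IFS script and submit QSH against the script instead. The path /home/pla/srchjava.sh below is made up:)

```shell
# /home/pla/srchjava.sh -- search every *.java file under /java
# (following symlinks with -L) for the string, case-insensitively,
# listing only the names of matching files
find -L /java -name '*.java' -print | xargs fgrep -li "sompflm000"
```

Then something like SBMJOB CMD(QSH CMD('/home/pla/srchjava.sh')) has no nested quotes in the CL string at all.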

Hi Patrick,



I use the Qshell commands "find" and "fgrep" to search java source files
for a string. It spawns a bunch of QP0ZSPWT jobs. Is there any
environment setting or some other setting to stop it from spawning all
these QZSHSH jobs?


Those jobs are where your "find" and "fgrep" commands are running. If you
didn't spawn them, your commands wouldn't work.



[SNIP]


QSH CMD('find -L ''/java'' -name ''*.java'' -exec fgrep -il ''sompflm000'' {} \;')


Okay, here you've got a lot of jobs.

1) The first job is the one in which the QSH command is running.  It's
potentially receiving keyboard input and doing display output.  It has
spawned a "worker" job in the background to run "find".

2) The worker job with "find" is running as fast as it can searching your
directory tree. As it runs, it sends output to the job that spawned it.
Each time it finds a file that matches the pattern *.java it spawns
another new job in which to run fgrep.


3) fgrep runs individually, in its own job, for each file found.  Its
output is sent back up to the find command, which in turn sends its
output back to the QShell job.

This is done this way so that it acts like Unix -- which is what QShell is
designed to do -- on a Unix system, a new process is created for each
command. However, Unix processes are somewhat more lightweight than OS/400
jobs are -- so the performance would be better on a real Unix system.
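(As an aside: many find implementations also accept a terminating "+" in place of "\;", which batches filenames into as few fgrep invocations as possible -- the same batching idea as xargs, without the pipe. Whether your QShell level supports it is worth a quick test:)

```shell
# '+' makes find collect pathnames and run fgrep with many files
# per invocation, instead of spawning one job per file as '\;' does
find -L /java -name '*.java' -exec fgrep -il 'sompflm000' {} +
```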





I think there is a new QZSHSH job spawned for every file in the
directories I'm searching. Is there any way to stop this?


As I said above, you can't eliminate them completely.  You can, however,
minimize the number of jobs required by having fgrep search many
different files all in one run.  For example:

     cd /java/src
     fgrep -Ril "sompflm000" *

This will search every single file in the /java/src directory (-R makes
it also search files in subdirectories) for "sompflm000". Since there's
only one invocation of "fgrep", it will only need 2 jobs (one for your
terminal, and one more for fgrep). That's a far cry from the potentially
thousands of jobs that you had when you were using "find".


The only problem with this approach is that it searches EVERYTHING (not
merely *.java).  That means that if there's a file that fgrep doesn't
understand, it might stop searching prematurely.

Another way to do it is to list the files individually. You can list many
different filenames on the fgrep command:


fgrep -il "sompflm000" file1.java file2.java file3.java

And that works nicely enough, but obviously it's not always practical to
list every single file individually.
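(A shell wildcard will do that listing for you, by the way, but it runs into the same wall: the shell expands the pattern into one long argument list before fgrep ever starts, and on a big directory that is exactly where the 001-0085 "Too many arguments" error comes from:)

```shell
# the shell turns *.java into an explicit list of file names;
# fine for small directories, but a huge expansion overflows the
# argument limit and triggers the 001-0085 error
fgrep -il "sompflm000" /java/src/*.java
```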

That's where the "xargs" Qshell command comes in. What "xargs" does is
construct a command for you. It works by reading the command string that
you give it as a parameter, appending all of the filenames that it reads
from its standard input, and then finally running the command.


For example:

find -L /java -name *.java -print | xargs fgrep -li "sompflm000"

The "find" command will, as before, find all of the filenames that match
the pattern "*.java". It will print those filenames out to its standard
output.


The pipe will cause the standard output from "find" to be the input to
"xargs". So xargs will generate a command string that starts with 'fgrep
-li "sompflm000"' and then it will append all of the filenames that "find"
has printed.
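(You can watch xargs do this by putting "echo" in front of the command it builds; a quick illustration with made-up filenames:)

```shell
# xargs appends the names read from its standard input onto the
# command string, so echo prints the fgrep command it would have run
printf '%s\n' a.java b.java | xargs echo fgrep -li sompflm000
# prints: fgrep -li sompflm000 a.java b.java
```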


This allows a total of 3-4 jobs to handle your entire request... one job
for the terminal, one job for "find", one job for "fgrep", and I'm not
sure if xargs requires its own job, or if it shares one with fgrep.
Either way, it's a far cry from the thousands of jobs that -exec would've
generated, and it will therefore run much more quickly.


The only problem is that sometimes the command-line that xargs generates
will be too long for QShell to run. If that happens, you may have to go
back to the -exec method.
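(If your xargs supports the standard -n flag, there's a middle ground: cap the number of filenames per generated command, so xargs runs fgrep in several medium-sized batches rather than one giant one -- still far fewer jobs than -exec with "\;". A sketch:)

```shell
# -n 100: at most 100 file names per fgrep invocation, keeping each
# generated command line comfortably under the length limit
find -L /java -name '*.java' -print | xargs -n 100 fgrep -il "sompflm000"
```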

Hopefully you'll find this to be useful ;)

_______________________________________________
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.













