


I believe there is an exit point for the IFS with V5R3. Bytware is using
this for automated virus scanning.
Is this it?
http://publib.boulder.ibm.com/infocenter/iseries/v5r3/ic2924/info/rzaii/rzaiimstexfile.htm

Rob Berendt
-- 
Group Dekko Services, LLC
Dept 01.073
PO Box 2000
Dock 108
6928N 400E
Kendallville, IN 46755
http://www.dekko.com





"JK" <johnking@xxxxxxx>
Sent by: midrange-l-bounces@xxxxxxxxxxxx
01/21/2005 04:16 PM
Please respond to: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxx>
To: "'Midrange Systems Technical Discussion'" <midrange-l@xxxxxxxxxxxx>
cc:
Subject: Monitoring IFS directory for file creation

Hello all,

 We need to automatically execute a job whenever files are created in a
particular IFS directory. The process needs to work no matter how the
files arrive: via FTP, drag-and-drop from Windows Explorer, or other
processes yet to be invented. I could write a sleeper program to poll the
directory every 'x' minutes, but would rather take advantage of a system
function if possible.

 This appears to be doable by journaling the IFS directory. The archives
are full of examples of doing this for DB2 files, but I want to make sure
it is viable for IFS directories before committing to management. Or maybe
there is a simpler way? Would someone be kind enough to critique this
and/or steer me in the right direction? I think I need to:

1) Create a journal and receiver.
2) Associate the IFS directory with that journal.
3) Submit a QBATCH job that uses RCVJRNE to watch the journal. 
4) The QBATCH job wakes when an entry appears in the journal. If it
determines that a 'file close' action occurred it will submit a job to
process and remove the IFS file.
5) QBATCH program goes back to sleep again.
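
 For reference, steps 1) through 3) above might look roughly like the CL
below. All library, journal, and path names are placeholders, and the
parameters are from memory, so verify them against the V5R3 command help
before relying on this:

```
/* 1) Create a journal receiver and journal (placeholder names)    */
CRTJRNRCV JRNRCV(MYLIB/IFSRCV0001)
CRTJRN    JRN(MYLIB/IFSJRN) JRNRCV(MYLIB/IFSRCV0001)

/* 2) Start journaling the IFS directory. STRJRN is the            */
/*    green-screen alternative to OpsNav for IFS objects.          */
STRJRN    OBJ(('/incoming')) JRN('/QSYS.LIB/MYLIB.LIB/IFSJRN.JRN')

/* 3) In the QBATCH job, receive entries through an exit program,  */
/*    filtering on the 'CS' (IFS object closed) entry type.        */
RCVJRNE   JRN(MYLIB/IFSJRN) EXITPGM(MYLIB/WATCHPGM) ENTTYP(CS)
```

If STRJRN and ENDJRN do accept IFS paths at this release, that would also
answer question 1) below without needing OpsNav.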

 I've completed steps 1) and 2) and manually added files to the monitored
directory. Sure 'nuff, the system creates multitudes of journal entries.
It appears 'CS - IFS object closed' is the one we want to watch.

Questions: 
1) Is OpsNav the only way to manage IFS journaling? I don't always have a
fully-loaded PC next to me and a green-screen command would be a nice
fallback.
2) Using drag-and-drop from Windows Explorer creates 50+ journal entries
for each file created: commitment control, attribute changes, stuff I
couldn't care less about. That seems wasteful, plus I'm confused about
exactly which entry indicates that the file is closed and safe to process.
Can someone enlighten me?
3) Does a program using RCVJRNE behave similarly to 'QRCVDTAQ'? That is,
does it wait patiently until an entry appears or does it require something
different?
4) What techniques should be used to ensure that the QBATCH program is
running, and how can it be restarted without re-processing existing
entries? A utility named 'DspAudLog' by Mr. Oguine, published by iSeries
in June 2000, records the last-used journal sequence number in a data
area. Is this still the best technique?
5) What else have I missed?
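
 On question 4), the data-area technique might be sketched as below. The
names are hypothetical, and the FROMENT and DELAY parameter details should
be checked against the RCVJRNE command help:

```
/* Hypothetical restart logic for the QBATCH monitor program:   */
/* resume after the last-processed sequence number, kept in a   */
/* data area that the exit program updates after each entry.    */
DCL       VAR(&SEQ) TYPE(*CHAR) LEN(10)
RTVDTAARA DTAARA(MYLIB/LASTSEQ) RTNVAR(&SEQ)
RCVJRNE   JRN(MYLIB/IFSJRN) EXITPGM(MYLIB/WATCHPGM) +
          FROMENT(&SEQ) ENTTYP(CS) DELAY(*NEXTENT)
```

If DELAY(*NEXTENT) works as I remember, RCVJRNE blocks until the next
entry arrives, which would also make it behave much like QRCVDTAQ with an
indefinite wait (question 3).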

Many thanks, JK

-- 
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list.
To post a message, email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.



