QDBJRN is just another journal, isn't it? Even if I RCVJRNE on it, it won't make the entries unavailable to another process, will it?
QAUDJRN won't necessarily capture file creations and deletions (unless the right value - *OBJMGT, maybe? - is set on QAUDLVL). Even if they were being captured, would the entry tell me whether the file was a source PF?
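For my own notes, I think the prerequisite setup looks roughly like this - the system value names are right, but the exact QAUDLVL values that produce create/delete entries are my assumption, so please correct me if I'm off:

  DSPSYSVAL SYSVAL(QAUDCTL)   /* must include *OBJAUD (and *AUDLVL)       */
  /* Assumption: *CREATE and *DELETE on QAUDLVL generate the CO (create)  */
  /* and DO (delete) entries under journal code T in QAUDJRN. Note that   */
  /* CHGSYSVAL replaces the whole list, so keep any values already set.   */
  CHGSYSVAL SYSVAL(QAUDLVL) VALUE('*CREATE *DELETE *OBJMGT')

Whether a CO entry would actually tell me the new file is a source PF is still the open question.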
QADBXREF: Yours is the first recommendation I've heard about not reading the QADBXREF file - not touching it, period. I've known that building logicals over that file is not recommended, but I've never run into problems querying it. Does IBM warn against this?
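If QADBXREF really is off limits, I'm guessing the DSPFD route would look something like this - the model file name and the source-vs-data column are from memory, so check them against QSYS before trusting this:

  DSPFD FILE(*ALL/*ALL) TYPE(*ATR) FILEATR(*PF) OUTPUT(*OUTFILE) OUTFILE(QTEMP/ALLPF)
  /* The *ATR/*PF outfile is built over a QSYS model file (QAFDPHY,       */
  /* I believe); one of its columns flags source vs. data physical        */
  /* files, so query that instead of QADBXREF.                            */
  RUNQRY QRY(*NONE) QRYFILE((QTEMP/ALLPF))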
The 1000 limit was just an example. Does the system limit the number of objects that can be audited? And are there performance implications to having a large number of objects audited (in this case only source files, which by nature are very low volume)?
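On the question of listing what's already audited (asked in my original note below), I'm assuming the DSPOBJD outfile carries the object auditing value, so something like this should produce a list - the column name is a guess on my part:

  DSPOBJD OBJ(*ALL/*ALL) OBJTYPE(*FILE) DETAIL(*FULL) OUTPUT(*OUTFILE) OUTFILE(QTEMP/ALLOBJ)
  /* Outfile is based on the QADSPOBJ model file; it has an object-       */
  /* auditing column (ODOBAT, if memory serves) - select the rows where   */
  /* that value is anything other than *NONE.                             */
  RUNQRY QRY(*NONE) QRYFILE((QTEMP/ALLOBJ))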
Thanks for giving me a few more things to think about!
GA
--- Vern Hamberg <vhamberg@xxxxxxxxxxxxxxxxxxxxxxxxx> wrote:

> I recommend RCVJRNE on QAUDJRN, not QDBJRN - I don't like messing with
> those cross-reference files. If you have some kind of object replication
> software, watch out for this, as they may need the entry to stay in the
> audit journal.
>
> No problem deleting an object with auditing turned on - all it is is a
> flag that is checked at every access to the object anyway - that's why
> there is minimal impact on the system, as you believe. The check is
> always done - extra overhead comes from writing the journal record, if
> needed.
>
> Don't interrogate QADBXREF - run DSPFD *ALL/*ALL fileatr(*pf)
> output(*outfile) and query the outfile - again, stay away from the
> system database cross-reference - the system is always touching that
> stuff.
>
> BTW, I'm curious about your 1000 limit - I could not find that in the
> Capacities Reference.
>
> HTH
>
> Vern
>
> At 01:01 PM 9/12/2003 -0700, you wrote:
> >Well, I was specifically hoping I wouldn't have to do a CHGOBJAUD on
> >each source file. Source files get created, get deleted, etc., and I
> >want to track all source files on the system, regardless of who creates
> >them or when they are created. However, if that is the only way, will
> >the following scenario work?
> >
> >Let's say I have an initial step where I do a CHGOBJAUD on every source
> >file known on the system. I can interrogate the QADBXREF file to do
> >this, since it tells whether a file is a source or data file.
> >
> >What if, immediately after that first step, I start a RCVJRNE on the
> >QDBJRN journal, which journals the QADBXREF file, and I look for source
> >files being created or deleted. When one is created, I could
> >immediately issue a CHGOBJAUD on it. (What happens when a file being
> >audited is deleted? Anything bad happen? Should I issue a CHGOBJAUD
> >OBJAUD(*NONE) on it?)
> >
> >And, assuming all of the above is viable, what happens if I have 1000
> >source files that I've started auditing? I'm asking this question as
> >it pertains to the limit on the number of objects that can be audited.
> >I don't really see where auditing source member adds, updates, and
> >deletes is going to impact system performance. But someone may educate
> >me otherwise.
> >
> >Is it possible to easily list all objects that are being audited?
> >
> >BTW, do I want to use OBJAUD(*CHANGE) on the CHGOBJAUD command?
> >
> >I'll see the wisdom of asking this on Friday afternoon. <g>
> >
> >TIA, GA
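P.S. For the archive, here is the rough flow I have in mind, putting Vern's suggestions together. The library, file, and exit program names are hypothetical and the parameters are from memory, so treat this as a sketch rather than tested code:

  /* 1. One-time pass: turn auditing on for every existing source PF.     */
  CHGOBJAUD OBJ(MYLIB/MYSRCF) OBJTYPE(*FILE) OBJAUD(*CHANGE)
  /* 2. Sit on the audit journal and watch for creates and deletes.       */
  /*    AUDEXIT is a hypothetical exit program that would issue the       */
  /*    CHGOBJAUD above for any newly created source file.                */
  RCVJRNE JRN(QSYS/QAUDJRN) EXITPGM(MYLIB/AUDEXIT) JRNCDE((T)) ENTTYP(CO DO)

Whether *CHANGE (as opposed to *ALL) is the right OBJAUD value for catching member adds, updates, and deletes is one of the things I'm hoping to confirm.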