


<vendor>


Hi Rob ..... about your journaling e-note embedded below .....

Not everyone has a terabyte of hard drive available for Journal Receivers
(JRs). We've seen shops past 90% DASD consumed because no one actively
manages JR accumulation. Our Stitch-in-Time software can store only the
information the user defines as pertinent for a given file. That translates
to a DASD footprint roughly 92% to 98% smaller than JRs for an equivalent
retention period.
www.unbeatenpath.com/software/sit/Stitch-in-Time.pdf

Beyond using less DASD, Stitch-in-Time would also make it DRAMATICALLY
simpler to find the source of rogue database updates. Consider the
current-events LWK example:

If you were attempting to determine which program was making mysterious
changes in rate fields by using JR data, you'd start by getting a list of all
records in LWK that were potentially (but not necessarily) changed. Then
you'd wade through that streamed data dump, looking for data in columns x
through y that changed in consecutive journal records. That's a chore because
there are no visual divisions between fields and because only a single
journal entry is viewable at a time ... so you'd have to jump back and forth
between two screens to spot value changes.

To avoid eye strain, a sophisticated iSeries guy could extract JR data for
LWK into two identical external data files and then purchase a utility to
exclude records with no data change. Then he'd create a query or SQL process
that compared data in columns x through y between the two files to find rogue
data changes. Records identified that way would include the rogue program name.
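For readers who want to try the JR-extract step themselves, the dump described above could be produced with DSPJRN to an outfile. This is a minimal sketch, assuming the journal named later in this thread; the library and output-file names are placeholders:

```
/* Extract update entries (before and after images) for LWK     */
/* into a database file that a query or SQL compare can read.   */
DSPJRN JRN(#MXJRN/BPCS_ALL) FILE((MYLIB/LWK)) +
       RCVRNG(*CURCHAIN) ENTTYP(UB UP) +
       OUTPUT(*OUTFILE) OUTFILFMT(*TYPE4) +
       OUTFILE(QTEMP/LWKJRNOUT)
```

Each *TYPE4 outfile record carries the job, user, and program name alongside the record image, so the program making the change shows up in the same rows you compare.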

The very sophisticated, less-eye-strain JR approach would consume 130 - 160
minutes. The same objective could be achieved in 2-4 minutes using
Stitch-in-Time software.

Now, if you wanted an email alert about an LWK data accident just seconds
after the bad data hit LWK, you could use Needle in a Haystack software. The
Needle alert would include the rogue program name plus all the other
information you'd need to fix the error immediately ... before it propagated
into CAP calculations.
www.unbeatenpath.com/software/needle/in-a-haystack.pdf


Warm regards,
Milt Habeck
Unbeaten Path
mhabeck@xxxxxxxxxx
(888) 874-8008
</vendor>








From: rob@xxxxxxxxx

Sent: Wednesday, February 29, 2012 12:47 PM

To: BPCS ERP System

Subject: Re: [BPCS-L] mysterious/rogue updates to rate fields in LWK file



First do this on your file

DSPFD LWK

File is currently journaled . . . . . . . . : Yes

Current or last journal . . . . . . . . . . : BPCS_ALL

Library . . . . . . . . . . . . . . . . . : #MXJRN



We're a MIMIX shop, so it has to be journaled. If you start journaling on
this file you can easily see who updated it, from what job, and using what
program.

See:

CRTJRNRCV

CRTJRN

STRJRNPF

DSPJRN

Heck, try it on your own test database somewhere.
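The four commands above fit together like this; a minimal sketch, assuming hypothetical library, journal, and receiver names (you'd only do this on a file that isn't already journaled):

```
/* 1. Create a receiver to hold the journal entries.            */
CRTJRNRCV JRNRCV(MYLIB/LWKRCV0001) THRESHOLD(100000)

/* 2. Create the journal and attach the receiver; let the       */
/*    system swap receivers but keep the old ones around.       */
CRTJRN JRN(MYLIB/LWKJRN) JRNRCV(MYLIB/LWKRCV0001) +
       MNGRCV(*SYSTEM) DLTRCV(*NO)

/* 3. Start journaling the file with before and after images.   */
STRJRNPF FILE(MYLIB/LWK) JRN(MYLIB/LWKJRN) IMAGES(*BOTH)

/* 4. Browse the entries: who, from what job, using what pgm.   */
DSPJRN JRN(MYLIB/LWKJRN) FILE((MYLIB/LWK))
```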



Drawback: there's a whole discussion to be had on journal receiver maintenance.

We've gone nuts with our journal receivers - they're so useful management
wants to keep them for 100 days. So we tie up a TB or so with all the
journaling we do. But what the heck, I have 1/2TB on my laptop.



Rob Berendt

Group Dekko











This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].
