I suggest you check the BPCS-L archives. Related topics have come up before.

You and your boss need to have a frank discussion with the leadership of
every corporate department regarding how far back the company needs to
preserve what kind of data, and in what level of detail, to satisfy IRS
audits, regulatory standards, and the ordinary business needs of each
department. For which people is GJD data good enough, and who needs actual
inventory history, costs contemporary to the transactions, or other data?
You should also seek some assurance that when those needs change, IT will
be informed in a timely manner.

Be aware that there are several reasons why you cannot retain some records
indefinitely. There are fields in BPCS which act as pointers across file
types, such as how many detail records a customer order can have, or how
many lines a shipping invoice can have; some of these fields are only 3-5
digits in size. It is worth knowing what the ceilings are, and for which of
your large files you are running out of numbers. As part of my end-year
check list, I would identify the files at risk of this happening, notify my
users, and initiate discussion of our alternatives before the estimated
crunch time (a sketch of such a check follows).
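As one way to automate that check, here is a minimal CL sketch. The
library BPCSF, the outfile names, and the sequence field THSEQ are
assumptions standing in for your own objects, and RUNSQL requires an OS
release that has it (older systems can use interactive SQL or Query/400
instead):

  PGM
    /* Dump member statistics (record counts) for a large file */
    DSPFD FILE(BPCSF/ITH) TYPE(*MBR) OUTPUT(*OUTFILE) +
          OUTFILE(QTEMP/ITHSTATS)
    /* Capture the highest assigned sequence number, to compare  */
    /* against the field's ceiling (THSEQ is a placeholder name) */
    RUNSQL SQL('CREATE TABLE QTEMP/SEQCHK AS +
                (SELECT MAX(THSEQ) AS HIGHSEQ FROM BPCSF/ITH) +
                WITH DATA') COMMIT(*NONE) NAMING(*SYS)
  ENDPGM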

When you set up your archiving policies to eliminate most size-related
performance issues, you need to know what reports against those archives
may be needed, and how they will work. Which master files are acceptable to
reference in a program accessing detail from several years ago, and which
of those master files also need to go into the archives?

Performance "issues" are not just created by the sheer size of files. They
are also created by how the files are organized relative to how inquiry and
reports need to access them, and whether your hardware is "right sized" to
the needs of your evolving enterprise. Performance can be enhanced by
expanding the reorg options from what vanilla BPCS has to offer, although
some need to be run on a restricted AS/400. Additional indexes can be
added, for the sequence needed for most popular accesses, and the tasks
which take longest to run. We had many reports, which take 2+ hours to run,
setup to run after midnite on scheduler, so that when the morning crew
arrived, they had many reports, which they would need that day, already
generated. We also had to do some policing with JOBQ vs. on-line. Some
tasks we ended their capability of being run on-line, and some tasks got
their very own JOBQ. Some of these solutions may be found on Midrange-L
although how BPCS files are accessed by various programs, need to be
carefully not violated.
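A minimal CL sketch of the dedicated JOBQ and the after-midnight
scheduling. The names BPCSLIB, LONGRPTS, and RPT123 are assumptions, not
vanilla BPCS objects:

  /* One-time setup: a single-threaded JOBQ so long reports run one */
  /* at a time instead of competing with daytime batch work         */
  CRTJOBQ JOBQ(BPCSLIB/LONGRPTS) TEXT('Long-running BPCS reports')
  ADDJOBQE SBSD(QGPL/QBATCH) JOBQ(BPCSLIB/LONGRPTS) MAXACT(1) +
           SEQNBR(90)

  /* Run a 2+ hour report at 00:30 every day, so its output is      */
  /* waiting when the morning crew arrives                          */
  ADDJOBSCDE JOB(NIGHTRPT) CMD(CALL PGM(BPCSLIB/RPT123)) +
             FRQ(*WEEKLY) SCDDAY(*ALL) SCDTIME(0030) +
             JOBQ(BPCSLIB/LONGRPTS)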

Many vanilla BPCS programs are inefficiently written. For example, if a
report is only to run data for a particular facility, warehouse, type of
inventory, etc., then when you have a humongous volume of records, it is
not efficient for the program to read 100% of the records in the file, then
go through a series of comparisons against the prompt screen criteria, only
to process a fraction of 1% of the total records. Indexes can be set up
over the common combinations people are interested in, so that only the
relevant records are accessed, and the entire report runs in 10 minutes
instead of 3 hours. You can use IBM *OUTFILEs to analyze which accesses are
most popular and take longest to run, to find the software most worthy of
performance analysis (a sketch of both follows).
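A minimal CL sketch of both ideas. The library BPCSF and the fields THWHS,
THTTYP, and THTDTE are placeholders to be checked against your own file
layouts; on releases without SQL, CRTLF over DDS accomplishes the same
index:

  /* Index over the selection fields a popular report filters on, */
  /* so it reads only the relevant slice instead of 100% of ITH   */
  RUNSQL SQL('CREATE INDEX BPCSF/ITHBYWHS +
              ON BPCSF/ITH (THWHS, THTTYP, THTDTE)') +
         COMMIT(*NONE) NAMING(*SYS)

  /* Dump object usage data (last-used date, days-used count) to  */
  /* an *OUTFILE, to see which objects earn performance attention */
  DSPOBJD OBJ(BPCSF/*ALL) OBJTYPE(*FILE) DETAIL(*FULL) +
          OUTPUT(*OUTFILE) OUTFILE(QTEMP/OBJUSAGE)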

I no longer have access to our statistics, because of my retirement about a
year ago, but dealing with ITH challenges and consequences was a major part
of my life over the last decade, so I remember a lot of the ITH "issues" on
BPCS 405 CD.
* IRS audits typically needed to see data up to 4 years back, and could go
up to 7 years back if they found any "problems" with the 4-year-old data.
They often needed to see reports which we did not normally run, and in
sequences other than how we normally ran our data, showing our inventory
activity, and the cost structure of that inventory, as it was for the year
they were auditing, now 4-5 years in the past.
* IRS rules and practices can evolve over time, so perhaps you should ask
your accounting leadership to inform IT any time the rules change on how
long which data needs to be preserved.
* In addition, we had regulatory agencies over portions of our industry,
and participated in some industry standards which we needed to keep
adhering to if we wanted to keep 90% of our customers. Whatever corporate
department works with those regulations and industry standards needs to
keep IT informed of any changes to what data needs to be preserved, in what
detail, for how long.
* We were also involved in a legal dispute, ultimately resolved in our
favor, during which certain data needed to be preserved in perpetuity for
as long as the dispute was going on. Any corporate department involved in
handling such topics needs to keep IT informed of what data needs to be
preserved for how long, for that kind of reason. I found it expedient to
create a "special" archives library just for the data needed by the legal
dispute (a sketch follows), then continue the rest of the archiving on the
normal schedule.
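A minimal CL sketch of such a legal-hold library. LEGALHOLD and the file
list are assumptions; you would duplicate every file the dispute touches:

  CRTLIB LIB(LEGALHOLD) TEXT('Litigation hold - do not purge')
  /* Duplicate the needed files, data included, into the hold library */
  CRTDUPOBJ OBJ(ITH) FROMLIB(BPCSF) OBJTYPE(*FILE) TOLIB(LEGALHOLD) +
            DATA(*YES)
  CRTDUPOBJ OBJ(GJD) FROMLIB(BPCSF) OBJTYPE(*FILE) TOLIB(LEGALHOLD) +
            DATA(*YES)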
* Our in-house accounting auditors typically showed up in April of year
N+1, then needed to see reports on our data for year N, in the condition
that data was in at various time periods of year N. After I got
confirmation that they were done with year N, I would schedule a weekend to
run the final end-year close on the General Ledger, because BPCS supports
only 6 months into year N+1 before you have to close year N.
* When management switched to a different auditing firm: surprise. To
satisfy them, 18 months of BPCS data was not good enough, because they
wanted to audit 2 years. Naturally, IT is never told how much data history
needs to be preserved until auditors ask for data which, yes, is in the
archives, but, no, we do not have software set up for every conceivable
combination.
* When I first started at my last job, the in-house auditors and IRS
auditors were happy with copies of our printed reports, which we typically
ran at end-fiscal and stored in file cabinets, in which one drawer held 1-3
months' worth, with maybe 1-2 file cabinets per fiscal year, going back
several years. But as the years went on, they needed to see the reports
electronically, the associated data electronically, and AS/400 reports
brought into Excel (a sketch follows).
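A minimal CL sketch of getting AS/400 data into something Excel opens
cleanly, using CPYTOIMPF to build a comma-delimited file. The file name and
stream-file path are assumptions, and very old releases may not have
CPYTOIMPF:

  /* Export the physical file to a comma-delimited stream file */
  CPYTOIMPF FROMFILE(BPCSF/ITH) +
            TOSTMF('/home/audit/ith_fy2015.csv') +
            MBROPT(*REPLACE) RCDDLM(*CRLF) +
            DTAFMT(*DLM) FLDDLM(',')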
* We ran into issues with the sheer size of reports. For example, some
reports tailor-made for auditor needs, once converted to Excel (which
typically made them larger), grew beyond what would fit on any magnetic
media we were using that was acceptable to the auditors. Many were also
larger than any ISP would accept attached to e-mail. For a time, Google
accepted bigger attachments than others, so I suggested gmail accounts for
the auditors to receive attachments. They never explained to me why that
was unacceptable. I suggested splitting reports into ranges of activity,
but that was not acceptable to the auditors or to management. I found share
sites on the web that would accept reports that large, but warned of the
cyber security risks; I think the contents of some of these reports are too
confidential to risk breaching. Management got private share sites set up,
so humongous reports could be uploaded and downloaded by personnel in
different locations.
* In one IRS audit, some combination of the IRS, the auditors, and
management informed me that sending the data via share site, or the other
solutions we had been using, was not acceptable, without explaining why to
me. Fortunately, one of the devices on our PC network was able to write to
something bigger than a CD that was acceptable to these people, so we got
the data from the AS/400 to a PC, onto the DVD or whatever it was, then
hand delivered it to the auditors.
* When we first started putting end-fiscal reports into Excel, the tools
management agreed to let us have, vastly inferior to what was available
through freeware, did not get negatives into Excel with the proper
appearance. We went to several versions of reports: the one seen by people
who worked with green bar, and the version headed for Excel, which was also
split into positives and negatives (a sketch of that split follows).
Negatives are items whose on-hand is negative because of posting float.
Because a handful of managers got conniptions over lots of negatives in
inventory reports, while most managers did not give a darn, part of
end-fiscal was catching up on the corrections to make negatives go away. So
in time, we went from thousands of items at end-month with negative on-hand
to dozens.
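A minimal CL sketch of the positives/negatives split using CPYF record
selection. IWI and the on-hand field name WIOPB are placeholders for your
own file and field:

  /* Pull just the negative on-hand items into a work file */
  CPYF FROMFILE(BPCSF/IWI) TOFILE(QTEMP/NEGONHAND) +
       MBROPT(*REPLACE) CRTFILE(*YES) +
       INCREL((*IF WIOPB *LT 0))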
* We were adding new records to ITH at the rate of approximately 1 million
per month, more than that when physical inventory and end-year cost
replacement were involved. We kept 2 years in the "live" BPCS, and as part
of end-year processing, created an archive of the end-year reality for the
data which would be needed to "recreate" the year just ending, from the
perspective of ITH, costs, sales, GL, and other topics (a sketch follows).
That comes to around 25 million ITH records. It was significantly larger
than that before several facilities got consolidated.
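A minimal CL sketch of the end-year archive copy, assuming an archive
library per year (ARC2015 here) and a placeholder transaction-date field
THTDTE:

  CRTLIB LIB(ARC2015) TEXT('BPCS archive - fiscal 2015')
  /* Duplicate the file layout, then copy the year's records in */
  CRTDUPOBJ OBJ(ITH) FROMLIB(BPCSF) OBJTYPE(*FILE) TOLIB(ARC2015) +
            DATA(*NO)
  /* Date below is CYYMMDD for 2015-12-31; adjust to your format */
  CPYF FROMFILE(BPCSF/ITH) TOFILE(ARC2015/ITH) MBROPT(*ADD) +
       INCREL((*IF THTDTE *LE 1151231))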
* My check list for end-fiscal steps included the anticipated time duration
of various reports and steps. INV900 could take up to 5 hours, and would
bomb if ANYONE signed onto BPCS before it completed. Since not signing onto
BPCS for various time frames was not politically feasible in our corporate
culture, for some end-fiscal steps we had to go to restricted AS/400
access, with several sub-systems turned back on which are normally not
running during other restricted access, like backup (a sketch follows).
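A CL sketch of that partially restricted window, run from the system
console; BPCSSBS is an assumed subsystem description for the end-fiscal
batch work:

  /* Take the system down to restricted state (console job only) */
  ENDSBS SBS(*ALL) OPTION(*CNTRLD) DELAY(600)
  /* After restricted state is reached, bring back only what the */
  /* end-fiscal steps need                                       */
  STRSBS SBSD(QGPL/BPCSSBS)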
* Physical inventory has to run after the INV900 of end-month, but before
the other end-month and end-year steps. I had to run a set of reports for
our auditors at each stage: when we were theoretically done with input for
end-month, before physical inventory launched; before INV900; after INV900;
after the physical tags were dumped in; and after costs were replaced for
the new year. Each set of reports took several DAYS to get done, and there
was great pressure from other interests to get back on the system sooner.
We developed multiple programs which could "reconstruct" these realities
later.
* There are archiving systems provided by various 3rd party vendors, such
as UPI. These systems manage BPCS files so that the archive's records
remain internally consistent with BPCS rules, such as the ITH sequence #
staying consistent when more records are added to the archive later. As
records are removed from the "live" BPCS, the records left behind still
"work" for BPCS, such as the IIM pointer to ITH items matching the count of
how many records exist per item, so that inquiry works correctly when
working backwards through them.
* These archiving systems address file integrity and solve performance
problems, but do not address the software needed to generate reports or
inquiries from the archived data. In most cases, with recent archives, you
can have a CL program which runs the standard report but substitutes the
archives library for the past year of interest (a sketch follows). Go far
enough back, and your current master files may no longer have info on some
of the customers, items, and GL accounts involved, so you also need to
factor in archiving the master files, capturing the master info before it
gets deleted.
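A minimal CL sketch of such a wrapper, with ARC2015, BPCSF, and INV260C as
assumptions standing in for the archive library, the live data library, and
the report program:

  PGM
    /* Put the archive year ahead of the live files, so the report */
    /* program resolves its files in ARC2015                       */
    ADDLIBLE LIB(ARC2015) POSITION(*BEFORE BPCSF)
    MONMSG MSGID(CPF2103) /* ignore "already on library list" */
    CALL PGM(INV260C)
    RMVLIBLE LIB(ARC2015)
  ENDPGM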
* While we normally needed ITH data, and that of many other files, going
back multiple years to be able to pass an IRS audit, and going back 18
months to satisfy our usual internal auditors, the people accessing the
equivalent data via BPCS inquiry and reports never needed to go that far
back. 95% of our users seldom needed more than 1 month of history. So we
identified who needed to go back further, and developed software so they
could get what they needed from the archives, thus relieving everyone else
of the performance burden.
* The volume of records flowing into ITH, and into its GJD counterpart, is
highly dependent on summarization rules, set at 2 levels: ITE and the GL
journal definition. While ITH could handle 99,999 records for any given
item before it broke, and we rarely came close to 9,999, GJD is more
restrictive: it could only handle 9,999 records of the same journal prefix
in one month before it broke. So we had different journal prefixes for
different inventory transactions (a monitoring sketch follows).
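A minimal sketch of watching for journal prefixes approaching that ceiling,
via CL's RUNSQL. The GJD column names here (JHJRNL for the prefix, JHPER
for the period) are placeholders to be checked against your file layout:

  /* List prefix/period combinations within 10% of the 9,999 limit */
  RUNSQL SQL('CREATE TABLE QTEMP/JRNCOUNTS AS +
              (SELECT JHJRNL, JHPER, COUNT(*) AS NBR +
               FROM BPCSF/GJD GROUP BY JHJRNL, JHPER +
               HAVING COUNT(*) > 9000) WITH DATA') +
         COMMIT(*NONE) NAMING(*SYS)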
* Inventory transactions can be captured into GJD in detail, in summary, or
at an intermediate level. During preparation for each up-coming end-year, I
sent a memo around regarding how much detail was being captured and for how
long we were storing various kinds of records in "live" BPCS vs. the
archives, and why, asking if we needed to keep any of this for a different
time period, such as shipping delivery tickets for less time and in less
detail, or which inventory transactions still needed to go into GL in
detail. When the auditors were consulted, they generally wanted more detail
than we were already capturing.
* We found bugs in vanilla BPCS, many of which were solved via the 3rd
party archiving.
* BPCS lets you delete items "no longer used" because the end customer has
left us, or because we have converted identical sub-assemblies into
commons. This includes items which have inventory history other than a last
physical saying none left (O transaction), or a cost getting zeroed (#
transaction). When INV900 runs, it only purges ancient ITH records which
also have an IIM record. Then later, when that same item # is assigned to
some different function, it starts out its life with ITH records from years
ago, which have no relevance to the replacement item (a detection sketch
follows).
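A minimal sketch of a check for that orphaned history, i.e. ITH records
whose item no longer has an IIM record, via CL's RUNSQL. THPROD and IMPROD
are placeholder column names:

  /* Items with history but no item master: INV900 will not purge */
  /* these, and a reused item # would inherit them                */
  RUNSQL SQL('CREATE TABLE QTEMP/ORPHANITH AS +
              (SELECT DISTINCT THPROD FROM BPCSF/ITH T +
               WHERE NOT EXISTS (SELECT 1 FROM BPCSF/IIM M +
                                 WHERE M.IMPROD = T.THPROD)) +
              WITH DATA') COMMIT(*NONE) NAMING(*SYS)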
* It is possible to add an item to inventory without all its "marbles."
Several vanilla BPCS programs, such as the INV260 series, get at inventory
by accessing a combination of "marbles," so that if some are missing, the
INV260 report omits that inventory. We discovered this when auditors asked
why certain items were on some "complete" (custom) inventory reports, but
not on other "complete" (vanilla) inventory reports. This discovery led us
to create a "null" inventory report: list just those items in inventory
which are screwed up one way or another, such as not having CMF detail for
the facility they now find themselves in (a sketch follows).
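A minimal sketch of that "null" inventory report, finding inventory records
with no CMF cost detail behind them, via CL's RUNSQL. IWI, CMF, and the
column names are placeholders for whichever "marbles" your version joins
on:

  /* Inventory records whose item lacks CMF cost detail */
  RUNSQL SQL('CREATE TABLE QTEMP/NULLINV AS +
              (SELECT WIPROD, WIWHS FROM BPCSF/IWI W +
               WHERE NOT EXISTS (SELECT 1 FROM BPCSF/CMF C +
                                 WHERE C.CMPROD = W.WIPROD)) +
              WITH DATA') COMMIT(*NONE) NAMING(*SYS)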

Alister Wm Macintyre (Al Mac)
Linked In https://www.linkedin.com/in/almacintyre
-----Original Message-----
From: BPCS-L [mailto:bpcs-l-bounces@xxxxxxxxxxxx] On Behalf Of Billy Waters
Sent: Monday, February 22, 2016 6:44 AM
To: bpcs-l@xxxxxxxxxxxx
Subject: [BPCS-L] ITH Size

Currently ITH has 9 million records. We are considering a process change
that could greatly increase the daily ITH records created. My boss is
concerned about the size of ITH and the performance considerations. If you
can, please respond with your ITH size and any other information you feel
would be of use. We cannot purge ITH at month end due to an internal
reporting process. We are working on another new process that will allow
purging of ITH at month end.

