


You have been given some awesome responsibilities that we all should have, but normally we are so distracted with the consequences of no one having these duties that we never have time to get to them. In other words, dirty data flowing through flawed reports, without easy ways for people to fix the errors they know about, can contribute to keeping us extremely busy; but perhaps we would be better off without the flawed reports, or without the dirty data, to which many end users are probably oblivious.
* Convenient access to a good, solid, big picture of how the various fields in BPCS files are used ... for that data mapping purpose, I think the single most useful tool you could have is the BPCS Reference Manual from http://www.dssolutionsinc.com/OverviewManual.asp. I have wanted it for years, but my employer won't buy it for me (there are lots of things I have wanted over the years to help me be more productive, but selling the value of that concept can be difficult for me), and I have been too cheap to cough up the $350.00 to buy it for myself. Look at http://radio.weblogs.com/0107846/stories/2002/11/08/bpcsDocSources.html for other such resource links. We might also have a discussion (this may already be in the BPCS_L archives) of why we are not using the XREF that came with BPCS.
* Quality assurance in reports ... vanilla BPCS, in-house modifications over the years, software from third parties, queries developed by end users, BPCS data transferred to spreadsheets on personal PCs ... one thing I have suggested (another Al proposal that went no place) is to give the outside auditors a report on which reports and inquiries get the heaviest usage in running our company (a datum you can get from *OUTFILE software-directory data), then with each annual audit let the auditors pick a handful of those heavily used information sources for a professional inspection, to see what flaws, if any, might be there.
* Quality assurance in data ... we know that there are many places where errors creep in, giving us dirty data ... are our workarounds adequate? Have we identified all the causes? How does Sarbanes-Oxley factor into this? I see little point in having auditors say that certain financial reports are OK, based on the immediate underlying data, when those reports sit on a foundation of questionable data sources. We should identify those problem data sources and clean them up long before it is time to check the annual reports.
* It also helps to have access to a test database, which can be a quick way of figuring out what happens if we do this or that ... sometimes looking at the source code is the fastest way; sometimes it is faster to run some trials of a nature that would not be a good idea in the live data.
* Another manual you should seek (one that I also want but do not have, for the same reasons), costing I think in the neighborhood of $135.00, is the BPCS Handbook for Auditors http://www.unbeatenpathintl.com/audhand.html ... the principle here is that people have certain expectations of the data in any ERP, and those expectations may be misplaced ... in other words, the data in the reports can be accurate, yet there can be a people problem in interpreting that data. This manual helps an auditor understand areas of discontinuity between the ERP and its users ... areas of common misconceptions for BPCS users worldwide, some of which might be found in the company you are auditing ... where to look for common problems.
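The report-usage audit idea above can be sketched in miniature. This is a hypothetical Python illustration, not anything shipped with BPCS: it assumes you have already extracted an execution log with one program name per recorded run (the extraction from *OUTFILE or job-accounting data is assumed, not shown), and simply ranks the heaviest-used reports so the auditors can pick from the top of the list.

```python
from collections import Counter

def top_reports(usage_records, n=5):
    """Rank report/inquiry programs by how often they were run.

    usage_records: iterable of program names, one entry per recorded
    execution (assumed to come from some usage extract; the program
    names below are made up for illustration).
    """
    return Counter(usage_records).most_common(n)

# Hypothetical sample log: names repeated in proportion to usage.
log = ["ORD500", "INV300", "ORD500", "GLD540", "ORD500", "INV300"]
print(top_reports(log, n=2))  # [('ORD500', 3), ('INV300', 2)]
```

The auditors would then inspect only the few sources at the head of this ranking, rather than the whole report inventory.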
You might be interested in a project I engaged in several years ago.
The boss (at the time) felt that we were having far too many errors of a certain kind, errors that were having catastrophic impact downstream.
For approximately 8 months, my software development focus was to try to intercept BPCS error messages before they got to the humans, and to take appropriate action every time. We also identified safer practices for end users, such as:
* Even if nothing goes wrong 99% of the time, capture spool file audit trails of your work, so that when something does go wrong we have some hope of figuring out how to fix the problem.
* If you are running this kind of application, ALWAYS send updates to the JOBQ, and let's periodically check to make sure everyone in that application is using the same JOBQ ... we also deactivated the ability to run certain jobs interactively on-line.
* If you are running this other kind of application, ALWAYS do your work interactively on-line.
* Perhaps you need to update some shop orders, to change the order quantity for example ... be aware that if labor is being keyed in and posted against that same shop order while you are working on it, there could be a collision. Here is how to avoid any such collision.
* When you are doing a string of related tasks, whatever you do, do not launch the early tasks to the JOBQ and then start the next steps on-line before you have confirmation that the earlier ones completed.
* We do not want to be posting regular activities to the General Ledger (via INV920 then GLD540) while those regular activities are ongoing ... it turned out our list of those activities was incomplete, as we found out in recent months.
* In the end-of-day processes, there are a bunch of reports listing a variety of glitches (a no-fault word meaning data that got mucked up) ... if the reports are empty, there is nothing to fix ... if there is stuff there, then get to it and fix it.
* When we have a hardware problem, such as a PC going down in the middle of updating customer orders, it is fairly predictable what kind of a mess that will cause to the work that was in progress when the problem occurred, so we have a series of steps, off a user menu, to follow to fix the problem, assuming the victims report it coherently to the people who know how to run the fixes.
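The shop-order collision warning in the list above can be illustrated with a generic optimistic-concurrency sketch. To be clear, this is not how BPCS itself handles record contention; it is a minimal Python illustration of the general technique, assuming each record carries a version counter: the update is applied only if nobody else (labor posting, say) changed the record between your read and your write, otherwise you re-read and retry.

```python
class StaleRecordError(Exception):
    """Raised when the record changed since we last read it."""

def update_order_qty(db, order_id, new_qty, expected_version):
    """Change an order quantity only if the record is unchanged
    since we read it (optimistic concurrency check)."""
    rec = db[order_id]
    if rec["version"] != expected_version:
        # Someone else updated the order in the meantime; caller
        # should re-read the record and retry the change.
        raise StaleRecordError(f"order {order_id} changed; re-read and retry")
    rec["qty"] = new_qty
    rec["version"] += 1

# Hypothetical in-memory "database" with one shop order.
db = {"SO1001": {"qty": 10, "version": 1}}
update_order_qty(db, "SO1001", 25, expected_version=1)
print(db["SO1001"])  # {'qty': 25, 'version': 2}
```

A second update that still carries `expected_version=1` would raise `StaleRecordError` instead of silently clobbering the labor posting's change, which is the collision the email is warning about.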
The above might not be precisely what you are looking for, but for us, what these practices accomplished was to eliminate major causes of dirty data. In recent years, though, we have been backsliding on some of this, and are getting dirty data again.


Mitch Damon wrote:
Hi folks,



Again I am turning to the wonderfully knowledgeable professionals on
this list for help and guidance.  I have recently been given
responsibility for ensuring that the data used to build management
reports, links to other systems and used in critical business processes
is accurate and used appropriately.  Of course this is a huge
undertaking and if the team hopes to get it done before we are retired
we will need a head start.  Does anyone have or know where I could get a
directory of the most important fields in BPCS with definitions of what
they are, where they are used and what programs are affected by them?
As always your input is greatly appreciated.



Thanks in advance,

Mitch Damon, CPIM
Data & Process Integrity Manager
Birds Eye Foods
Rochester, NY
(585) 383-1070 x 250

-
Al Macintyre http://www.ryze.com/go/Al9Mac
Find BPCS Documentation Suppliers http://radio.weblogs.com/0107846/stories/2002/11/08/bpcsDocSources.html
BPCS/400 Computer Janitor at http://www.globalwiretechnologies.com/
Part time I may get a commission from some BPCS vendors for helping them ... I will now say in a post if a product I am commenting on is also one potentially involved with this ... not that I have made anything from this yet.



This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].
