If you are on OnePoint Support (OGS), go to SSA OnePoint Online,
navigate as follows, and then download:

        1. Select Technical Documentation under Documentation.
        2. Filter by BPCS Products & Relationship Diagrams, File/Field,
           Program and Database References.
        3. The BPCS Cross Reference Zip Files of Access databases have
           everything you need.

Kindest Regards,

Gerry Duhon, CPIM
Director of Production Planning
Henry Technologies
1060 Lake Avenue
Woodstock, IL 60098 USA

gduhon@xxxxxxxxxxxxx
Woodstock Office 815.206.3317
Chatham Office 217.483.2406 Ext 360
Cell 815.814.6719

-----Original Message-----
From: bpcs-l-bounces@xxxxxxxxxxxx [mailto:bpcs-l-bounces@xxxxxxxxxxxx]On
Behalf Of bpcs-l-request@xxxxxxxxxxxx
Sent: Tuesday, September 21, 2004 4:50 PM
To: bpcs-l@xxxxxxxxxxxx
Subject: BPCS-L Digest, Vol 2, Issue 193


Send BPCS-L mailing list submissions to
        bpcs-l@xxxxxxxxxxxx

To subscribe or unsubscribe via the World Wide Web, visit
        http://lists.midrange.com/mailman/listinfo/bpcs-l
or, via email, send a message with subject or body 'help' to
        bpcs-l-request@xxxxxxxxxxxx

You can reach the person managing the list at
        bpcs-l-owner@xxxxxxxxxxxx

When replying, please edit your Subject line so it is more specific
than "Re: Contents of BPCS-L digest..."


Today's Topics:

   1. Re: Data mapping (Alister Wm Macintyre)
   2. RE: Data mapping (Damon, Mitch)
   3. Re: Data mapping (Danny Monselise)
   4. RE: Data mapping (Alister Wm Macintyre)
   5. Re: Data mapping (Alister Wm Macintyre)


----------------------------------------------------------------------

message: 1
date: Tue, 21 Sep 2004 14:53:34 -0500
from: Alister Wm Macintyre <macwheel99@xxxxxxxxxxx>
subject: Re: Data mapping

You have been given some awesome responsibilities, ones we all should
have, but normally we are so distracted by the consequences of no one
having these duties that we never have time to get to them.  In other
words, dirty data flowing through flawed reports, with no easy way for
people to fix the errors they know about, can keep us extremely busy,
but perhaps we would be better off without the flawed reports or the
dirty data, which many end users are probably oblivious to.
    * Convenient access to a good, solid, big picture of how the
various fields in BPCS files are used ... for the data mapping purpose,
I think the single most useful tool is the BPCS Reference Manual from
http://www.dssolutionsinc.com/OverviewManual.asp, which I have wanted
for years, but my employer won't buy it for me (there are lots of
things I have wanted over the years to help me be more productive, but
selling the value of that concept can be difficult for me), and I have
been too cheap to cough up the $350.00 to buy it for myself.  Look at
http://radio.weblogs.com/0107846/stories/2002/11/08/bpcsDocSources.html
for other such resource links.  We might also discuss (this may already
be in the BPCS-L archives) why we are not using the XREF that came with
BPCS.
    * Quality assurance in reports ... vanilla BPCS, in-house
modifications over the years, software from third parties, queries
developed by end users, BPCS data transferred to spreadsheets on
personal PCs.  One thing I have suggested (another Al proposal that
went nowhere) is to have the outside auditors get a report on which
reports and inquiries get the heaviest usage (a datum you can get from
an *OUTFILE of the software directory; see the sketch after this
list), then with each annual audit let the auditors pick a handful of
those heavily used information sources for a professional inspection
to see what flaws, if any, might be there.
    * Quality assurance in data ... we know there are many places
where errors creep in, giving us dirty data.  Are our workarounds
adequate?  Have we identified all the causes?  How does Sarbanes-Oxley
factor into this?  I see little point in having auditors say that
certain financial reports are OK based on the immediate underlying
data when those reports sit on a foundation of questionable data
sources.  We should identify those problem data sources and clean them
up long before it is time to check the annual reports.
    * It also helps to have access to a test database, which can be a
quick way of figuring out what happens if we do this or that ...
sometimes looking at the source code is the fastest way; sometimes it
is faster to run trials of a kind that would not be a good idea in the
live data.
    * Another manual you should seek (which I also want but do not
have, for the same reasons), costing I think in the neighborhood of
$135.00, is the BPCS Handbook for Auditors,
http://www.unbeatenpathintl.com/audhand.html ... the principle here is
that people have certain expectations of the data in any ERP, and
those expectations may be misplaced ... in other words, the data in
the reports can be accurate, but there can be a people problem in
interpreting that data, and this manual helps an auditor understand
areas of discontinuity between the ERP and its users ... common
misconceptions among BPCS users worldwide, some of which might be
found in the company you are auditing ... where to look for common
problems.
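On the usage-report idea above: a sketch of how that usage datum might
be gathered, assuming it comes from DSPOBJD (the library and file
names are hypothetical placeholders).  The *FULL detail outfile
carries each object's last-used date and days-used count, which a
query can then rank:

    /* Snapshot program usage so the auditors can pick their sample */
    DSPOBJD    OBJ(BPCSPGMS/*ALL) OBJTYPE(*PGM) DETAIL(*FULL) +
                 OUTPUT(*OUTFILE) OUTFILE(XREFLIB/PGMUSAGE)

    /* Then query PGMUSAGE, sorted descending on days-used count, */
    /* to list the most heavily used programs                     */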
You might be interested in a project I engaged in several years ago.
The boss (at the time) felt that we were having far too many errors of
a certain kind, errors that were having catastrophic impact
downstream.  For approximately 8 months, my software development focus
was to try to intercept BPCS error messages before they got to the
humans, and to take appropriate responses every time.  We also
identified safer practices for end users, such as:
    * Even if nothing goes wrong 99% of the time, capture spool-file
audit trails of your work, so that when something does go wrong, we
have some hope of figuring out how to fix the problem.
    * If you are running this kind of application, ALWAYS send updates
to the JOBQ, and let's periodically check to make sure everyone in
that application is using the same JOBQ ... we also deactivated the
ability to run certain jobs interactively on-line (a guard sketch
appears after this list).
    * If you are running this other kind of application, ALWAYS do
your work interactively on-line.
    * Perhaps you need to update some shop orders, changing the order
quantity for example ... be aware that if labor is being keyed in and
updated on that same shop order while you are working on it, there
could be a collision, so we spelled out how to avoid any such
collision.
    * When you are doing a string of related tasks, whatever you do,
do not launch the early tasks to the JOBQ and then start the next
steps on-line before you have confirmation of completion.
    * We do not want to be posting regular activities to the general
ledger (via INV920 then GLD540) while those regular activities are
ongoing ... it turned out our list of those activities was incomplete,
as we found out in recent months.
    * In the end-of-day processes, there are a bunch of reports
listing a variety of glitches (a no-fault word meaning data mucked
up) ... if the reports are empty, there is nothing to fix; if stuff is
there, then get to fixing it.
    * When we have a hardware problem, such as a PC going down in the
middle of updating customer orders, the kind of mess it will make of
the work in progress is fairly predictable, so we have a series of
steps off a user menu to follow to fix the problem, assuming the
victims report it coherently to the people who know how to run the
fixes.
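On deactivating interactive use: a minimal CL sketch of one way such a
batch-only guard can work (the program and message text are invented;
RTVJOBA TYPE does return '1' for an interactive job):

    PGM
    DCL        VAR(&JOBTYPE) TYPE(*CHAR) LEN(1)

    /* '0' = batch job, '1' = interactive job */
    RTVJOBA    TYPE(&JOBTYPE)
    IF         COND(&JOBTYPE *EQ '1') THEN(DO)
       /* Refuse to run; the *ESCAPE message ends the program */
       SNDPGMMSG  MSGID(CPF9898) MSGF(QCPFMSG) +
                    MSGDTA('This update runs in batch only - please +
                    submit it to the JOBQ') MSGTYPE(*ESCAPE)
    ENDDO

    /* ... the actual update logic would follow here ... */
    ENDPGM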
The above might not be precisely what you are looking for, but for us,
these practices eliminated major causes of dirty data ... though in
recent years we have been backsliding on some of this, and getting
dirty data again.

Mitch Damon wrote:
>Hi folks,
>
>
>
>Again I am turning to the wonderfully knowledgeable professionals on
>this list for help and guidance.  I have recently been given
>responsibility for ensuring that the data used to build management
>reports, links to other systems and used in critical business processes
>is accurate and used appropriately.  Of course this is a huge
>undertaking and if the team hopes to get it done before we are retired
>we will need a head start.  Does anyone have or know where I could get a
>directory of the most important fields in BPCS with definitions of what
>they are, where they are used, and what programs are affected by them?
>As always your input is greatly appreciated.
>
>
>
>Thanks in advance,
>
>Mitch Damon, CPIM
>Data & Process Integrity Manager
>Birds Eye Foods
>Rochester, NY
>(585) 383-1070 x 250

-
Al Macintyre  http://www.ryze.com/go/Al9Mac
Find BPCS Documentation Suppliers
http://radio.weblogs.com/0107846/stories/2002/11/08/bpcsDocSources.html
BPCS/400 Computer Janitor at http://www.globalwiretechnologies.com/
Part-time, I may get a commission from some BPCS vendors for helping
them ... I will now say in a post if a product I am commenting on is
one potentially involved with this ... not that I have made anything
from it yet.


------------------------------

message: 2
date: Tue, 21 Sep 2004 16:20:44 -0400
from: "Damon, Mitch" <MDamon@xxxxxxxxxxxxxxxxx>
subject: RE: Data mapping

Thanks Al,
You are, as always, a wealth of information and ideas, and why you
continue to labor for anyone other than yourself remains a mystery.
If not a bullseye, it is certainly in the first ring, and it will give
us some very interesting avenues to explore.

Much thanks,

Mitch Damon, CPIM
Data & Process Integrity Manager
Birds Eye Foods
Rochester, NY
(585) 383-1070 x 250




------------------------------

message: 3
date: Tue, 21 Sep 2004 16:27:32 -0400
from: "Danny Monselise" <dannym@xxxxxxxxxxxxxxxx>
subject: Re: Data mapping

I hope no one is trying to build a case for replacing your system ...

------------------------------

message: 4
date: Tue, 21 Sep 2004 15:46:54 -0500
from: Alister Wm Macintyre <macwheel99@xxxxxxxxxxx>
subject: RE: Data mapping

It may be a mystery to you, but not to me.

I have more than a few personal flaws, some in the areas of
self-discipline, organization, interpersonal relations, and other key
skills ... it is not that I lack these skills, just that I am less
than a master of them.  When working for someone else, I am a
workaholic to their cause, but in times past I made several attempts
at working for myself, and each of those enterprises was a total
bust ... while they lasted I had loads of fun, but finally I had to
quit because I could not sustain the negative cash flow.

I am now 60, and I am noticing some disturbing trends in my ability to
sustain certain kinds of activities that were never a problem when I
was younger, like working long hours, remembering some stuff, and
making progress at changing bad habits.

Problem-solving and research are endeavors I find satisfaction in.
Unfortunately, many of the problems that come my way arrive with
insufficient data for me to get my arms around them.

>Thanks Al,
>You are, as always, a wealth of information and ideas, and why you
>continue to labor for anyone other than yourself remains a mystery.
>If not a bullseye, it is certainly in the first ring, and it will
>give us some very interesting avenues to explore.
>
>Much thanks,
>
>Mitch Damon, CPIM
>Data & Process Integrity Manager
>Birds Eye Foods




------------------------------

message: 5
date: Tue, 21 Sep 2004 16:49:40 -0500
from: Alister Wm Macintyre <macwheel99@xxxxxxxxxxx>
subject: Re: Data mapping

In the absence of the kinds of resources I referred to earlier, you might
be interested in some of the kinds of tools I am able to use at GWT.

1. I have some excellent resource material supplied by the various
firms we have done business with over the years for BPCS education,
BPCS tech support, and conversion projects, and I also subscribe to
BPCS newsletters published by outfits we are not yet doing business
with.  I push this stuff out to the end users, and sometimes it goes
missing.
2. We have a version of BPCS that came with source code.  This is a
mixed blessing, since we do not have AS/Set.  In theory we can
back-trace field definitions, but I have found this to be very
cumbersome.
3. I have generated *OUTFILEs into a collection which I access via
Query/400 to generate reports of value to me and, in some cases, to
end users.
4. I use Query/400 quite heavily.
5. Some BPCS data in vanilla form is cumbersome to access or
reference, so I have created our own reference listings & inquiries
against the data ... for example, key in an item or customer and get a
picture of the special pricing on it; list some files, such as reason
codes and item classes, by the text description of those codes; try to
identify where coding is misleading, and create reference lists of
what the underlying reality is ... on BPCS-L there was a recent
reference to some of this, such as customer TERMS.

Examples from above

GO CMDREF gets you to a command whose output can be sent to an
*OUTFILE.  Put that command in a CL program that adds to the *OUTFILE
across several libraries, and we have an index of which BPCS programs
access which other objects.  Due to soft coding it is not 100%
complete, but it is usually a big help when I run a query to identify,
say, all the programs that touch a given file or any of its logicals,
or to back-trace how a report gets called.
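A minimal sketch of that accumulating CL, assuming the command in
question is DSPPGMREF (the library names are hypothetical
placeholders):

    PGM
    /* Rebuild the cross reference, then add each additional library */
    DSPPGMREF  PGM(BPCSPGMS/*ALL) OUTPUT(*OUTFILE) +
                 OUTFILE(XREFLIB/PGMREFS) OUTMBR(*FIRST *REPLACE)
    DSPPGMREF  PGM(BPCSMODS/*ALL) OUTPUT(*OUTFILE) +
                 OUTFILE(XREFLIB/PGMREFS) OUTMBR(*FIRST *ADD)
    /* Query PGMREFS by referenced-object name to list every program */
    /* that touches a given file or any of its logicals              */
    ENDPGM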

Use DSPFFD to get a printed layout of the fields of a file.
Alternatively, get at the compiled source of a program accessing that
file, which puts the field list into a nice format of field names,
definitions, and text descriptions; or create a dummy program that
does nothing with the file, just to get access to this.
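For example, against the item master IIM (the library name is a
hypothetical placeholder), either form can be run straight from a
command line:

    DSPFFD     FILE(BPCSF/IIM) OUTPUT(*PRINT)

    DSPFFD     FILE(BPCSF/IIM) OUTPUT(*OUTFILE) +
                 OUTFILE(QTEMP/IIMFIELDS)    /* queryable layout */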

Use RUNQRY *N followed by the file name, then F4 ... for example,
RUNQRY *N IIM ... and change the record-selection line at the bottom
of the F4 prompt to *YES.

I have placed all this in a CL program, so that from a menu option we
can specify which file, library, and member to run it against.

This lets us dump the contents of any BPCS file without having any
query definition set up.
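A minimal sketch of that CL wrapper (the program and parameter names
are invented; RCDSLT(*YES) is what turns on the record-selection
prompt):

    PGM        PARM(&FILE &LIB &MBR)
    DCL        VAR(&FILE) TYPE(*CHAR) LEN(10)
    DCL        VAR(&LIB)  TYPE(*CHAR) LEN(10)
    DCL        VAR(&MBR)  TYPE(*CHAR) LEN(10)

    /* No stored query definition: dump the file as-is, prompting */
    /* for record-selection criteria before the data is shown     */
    RUNQRY     QRY(*NONE) QRYFILE((&LIB/&FILE &MBR)) RCDSLT(*YES)
    ENDPGM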

I can enter some selection criteria, view the contents of the file,
then press F12 and change the selection criteria.  As I go, I jot down
notes on the earlier DSPFFD or external source list:

    * Which fields are totally unpopulated?
    * Which fields seem to be 100% populated with the exact same
value?
    * Which fields contain data that appears to be mislabeled?  For
example, something looks to me like a customer # or a facility, but it
is not labeled as such, nor does it use consistent BPCS
back-referencing.
    * Which fields are used for an obvious purpose where it is equally
obvious that we have some bad data in here, and of what nature?  For
example, a date field in which some of the dates lack the century
tells us there is another program to trace that is missing its Y2K
fix.

I can dump the file layout to a PDM/SEU source member, then annotate
it with my research notes.

It has been a while since I had time to mess with it, but I have a
string of data error mapping programs (a driver sketch follows this
list):
    * Use an *OUTFILE to get a directory of BPCS files.
    * That *OUTFILE is then input to a program that analyzes each file
in turn for the kinds of problems we have seen in the past, to see
whether any of those problems have reared their ugly heads again, or
whether past problems that once seemed not worth the trouble have
grown in size.
    * What is the oldest date of stuff in the file?  How many days,
months, or years ago is that?  Bold-print the sucker if it is more
than 3 years old.
    * If this is a detail file, match it against its relevant master
files and get a count of how many records are in there that ought to
have a customer master, vendor master, item master, etc. but don't.
    * Lots more.
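A minimal sketch of that driver, assuming the file directory comes
from DSPOBJD and that a hypothetical per-file checker named CHKFILE
holds the actual tests (oldest dates, orphan details, and so on):

    PGM
    /* QADSPOBJ in QSYS is the model file for the DSPOBJD outfile */
    DCLF       FILE(QSYS/QADSPOBJ)

    /* Refresh the directory of BPCS files (the library name is a */
    /* placeholder), then point the declared file at the result   */
    DSPOBJD    OBJ(BPCSF/*ALL) OBJTYPE(*FILE) OUTPUT(*OUTFILE) +
                 OUTFILE(QTEMP/FILELIST)
    OVRDBF     FILE(QADSPOBJ) TOFILE(QTEMP/FILELIST)

    /* Read each file name and hand it to the checker */
    READ: RCVF
    MONMSG     MSGID(CPF0864) EXEC(GOTO CMDLBL(DONE))
    CALL       PGM(CHKFILE) PARM(&ODOBNM &ODLBNM)
    GOTO       CMDLBL(READ)

    DONE: DLTOVR FILE(QADSPOBJ)
    ENDPGM

CHKFILE is where the oldest-date, orphan-detail, and same-value checks
from the list above would live.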

The last time I messed with this, I realized that as I added more
stuff worth checking, the program was approaching the practical
ceiling of how large a program can be, in terms of lines of code, so
my next plan was to split it into 4 programs: check accounting file
problems; check problems with "rules" files (those that control how
BPCS functions, such as the "engineering" files); check
work-in-process (orders, inventory) problems; and check all other file
problems.

My theory behind this is to get an inventory of our bad or dirty data,
make significant progress cleaning it up, and intercept its creation,
instead of the normal process: vast quantities of the stuff, users at
random colliding with pieces of it, we investigate the details of each
collision, and we end up with a band-aid on a problem for which we
have no good big picture.

Although I have made significant progress on this over the years, it is
what I work on when I run out of more important stuff to work on, which is
very seldom.

-
Al Macintyre  http://www.ryze.com/go/Al9Mac
Find BPCS Documentation Suppliers
http://radio.weblogs.com/0107846/stories/2002/11/08/bpcsDocSources.html
BPCS/400 Computer Janitor at http://www.globalwiretechnologies.com/
Part-time, I may get a commission from some BPCS vendors for helping
them ... I will now say in a post if a product I am commenting on is
one potentially involved with this ... not that I have made anything
from it yet.


------------------------------

_______________________________________________
This is the SSA's BPCS ERP System (BPCS-L) digest list
To post a message email: BPCS-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/bpcs-l
or email: BPCS-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/bpcs-l.



End of BPCS-L Digest, Vol 2, Issue 193
**************************************

