BPCS assigns various order numbers sequentially ... customer, purchase, shop,
RMA ... invoices, checks ... they get to 999,999 and then roll over to 1 again.
There are also places where we can reset the last number so that we can start
over at 1 again without waiting for it to reach 999,999. One reason we might
want to do this is in areas where many co-workers are keying in the order
number hundreds of times a day ... if they can key a 4-digit number rather
than a 6-digit one, there is an aggregate gain in their productivity, and
where there is a risk of a typo, fewer digits mean less risk per transaction.
Also, we use Query/400 a lot. Some people may have created a query on the
reasoning that this or that number in BPCS can go up to xxx,xxx but we are
only using up to x,xxx, so let's chop off the high-order digits so we can
cram more columns sideways onto the report. Then many people use the report
without realizing this has been done, the company's business grows to the
point that 9,999 becomes 10,xxx, and now we have a report that is losing the
high-order digit ... and who knows it?
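
To make that hazard concrete, here is a rough sketch in plain Python ... this
is not Query/400 or BPCS code, and the function name is made up ... it just
shows what any report column sized for 4 digits does once the numbers grow
past 9,999:

# Sketch only: a column sized for 4 digits silently drops the
# high-order digits once order numbers grow past 9,999.

def report_column(order_number, width=4):
    """Keep only the low-order 'width' digits, the way a
    too-narrow result field would."""
    return str(order_number % (10 ** width)).zfill(width)

for order in (1234, 9999, 10234, 123456):
    print(order, "->", report_column(order))

# 1234   -> 1234
# 9999   -> 9999
# 10234  -> 0234   <- high-order digit lost, no warning
# 123456 -> 3456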
Now my question is whether there is any place in BPCS where there is a risk
of collision between old numbers and new numbers. Suppose the last order
number issued is 1233 and we have a really old order still sitting out there
as number 1234. Then we want the next order number released to be 1235,
assuming we do not already have one like that, as opposed to BPCS trying to
create a duplicate 1234 and perhaps bombing. Is there any type of order,
invoice, check, etc. for which this is a risk? We are on 405 CD.
I already know about a couple of gotchas with shop orders and customer orders.
When we look at the detail on a shop order, such as with SFC300, some of that
detail comes from inventory history or labor history. Suppose we had a shop
order 123 six months ago, then we restarted the numbers and eventually reach
123 again ... not a duplicate order number, but six months ago there was
another shop order 123 for a different item number, different facility,
nothing similar, and when we look at the data for the latest order 123 it has
the earlier 123 data cluttered in. Thus it is smart not to reuse shop order
numbers within the time frame for which we keep labor and inventory history
on line.
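
A rough sketch in plain Python of why that clutter happens ... the record
layout, field names, and values here are invented for illustration, not the
actual BPCS history files ... but anything that selects history by order
number alone behaves like this:

# Sketch only: with a reused order number, selecting history by
# order number alone returns both the old and the new order 123.

labor_history = [
    {"order": 123, "item": "OLD-ITEM", "facility": "A", "posted": "six months ago"},
    {"order": 123, "item": "NEW-ITEM", "facility": "B", "posted": "this week"},
]

def detail_for_order(order_number, history):
    # Select on order number only, the way a detail inquiry or a
    # Query/400 join on that single field would.
    return [rec for rec in history if rec["order"] == order_number]

for rec in detail_for_order(123, labor_history):
    print(rec)
# Both records come back, even though only the recent one belongs
# to the current shop order 123.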
When customer order numbers are assigned, BPCS takes the last number issued,
adds one, and checks whether that number is in use. If not, it assigns it as
the next order; if it is taken, it adds one and tries again. This loop only
goes so far ... after trying something like 1,000 times it gives up and
bombs. Thus it would be prudent not to restart this numbering in a way that
runs it into a batch of old order numbers where many consecutive numbers are
already used up.
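
A rough sketch in plain Python of that assignment loop as I understand it ...
this is not the actual BPCS code, the 1,000-try limit is my recollection
rather than a verified constant, and the wrap back to 1 at 999,999 is my
assumption about how it behaves at the top of the range:

# Sketch only: probe forward from the last number issued, give up
# after a fixed number of tries.

MAX_TRIES = 1000        # limit is my recollection, not verified
ROLLOVER = 999_999      # assumed wrap point back to 1

def next_customer_order(last_issued, numbers_in_use):
    candidate = last_issued
    for _ in range(MAX_TRIES):
        candidate = candidate + 1 if candidate < ROLLOVER else 1
        if candidate not in numbers_in_use:
            return candidate
    raise RuntimeError("no free order number found; program bombs")

# If we restart numbering at 1 but orders 2 through 1500 still exist
# from years ago, the loop exhausts its tries and fails:
in_use = set(range(2, 1501))
try:
    print(next_customer_order(1, in_use))
except RuntimeError as exc:
    print("assignment failed:", exc)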
Are there any other gotchas associated with restarting numbers before they
get to the natural 999,999 roll-over point?
-
Al Macintyre http://www.ryze.com/go/Al9Mac
Find BPCS Documentation Suppliers
http://radio.weblogs.com/0107846/stories/2002/11/08/bpcsDocSources.html