From Al Macintyre at Central Industries

> From: pcrites@gbc.com (James Camp using Paul's id)
> BPCS/AS-400 Community,
> We need to build a small test database from an existing system.
> Initially, this will be BPCS, but the intention is to be able to use this
> tool for other systems which we have/will have.
> Is anyone aware of such a tool, similar to something called ExtractDB?
> Any and all responses would be appreciated.

My definition of a small test database from a larger system is one in which we have only the files that are necessary for whatever scenario or items are being tested or audited, not the entire BPCS baggage, and in which the selected data has inter-related continuity. Instead of tens of thousands of items, just a few score or a few hundred, not chosen at random, with intact BOM chains and the various other records essential to those items, and just those items. Not ALL the orders in the system, but a small portion, plus the necessary customers, vendors, and master file contents relevant to that selection.

I have never fulfilled this dream. It has been simpler to copy the static master files in their entirety, leave the dynamic files empty, and re-launch comparable data for simulation purposes.

I think AUDITORS have an interest similar to this TESTING perspective --- they want to select representative items and check that the data we have on those items is correct - all the data, not just on-hand totals. What is a good way to get that data to them to check? Then, if they see a pattern of error, that tells us which kinds of data need to be checked on a larger sampling of items.

1. We give them copies of various BPCS 405 reports that list items with summary information. We have also, on occasion, sent them directories of the kinds of data we have on any given item, and lists of the reports available, by our titles. This can be misleading, since the data might be the same on all items and the reports might be misnamed.
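The "intact BOM chains" requirement described above is the heart of any such extract: picking an item implicitly picks its whole component tree. A toy sketch of that selection step (generic Python with invented items, nothing BPCS-specific):

```python
# Toy sketch of "intact BOM chains": starting from a few chosen parent
# items, walk the product structure so every component and sub-component
# comes along.  The data here is invented, not a BPCS file layout.

bom = {  # parent item -> component items
    "BIKE":  ["FRAME", "WHEEL"],
    "WHEEL": ["RIM", "SPOKE", "HUB"],
    "FRAME": ["TUBE"],
    "PUMP":  ["HOSE"],   # unrelated item: should stay out of the extract
}

def bom_closure(selected):
    """Return the selected items plus everything reachable through the BOM."""
    keep, stack = set(), list(selected)
    while stack:
        item = stack.pop()
        if item in keep:
            continue          # components shared between BOMs: visit once
        keep.add(item)
        stack.extend(bom.get(item, []))
    return keep

items = bom_closure(["BIKE"])
# items == {'BIKE', 'FRAME', 'WHEEL', 'RIM', 'SPOKE', 'HUB', 'TUBE'}
```

Every other file in the little database - item master, costs, orders, history - would then be filtered down to that `items` set, which is what gives the extract its continuity.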
They do whatever they do and send us a list of items on which they want more information, which might be a tiny fraction of 1% of all our items.

2. We send them more detailed reports on those items.

I think that if it were not a violation of the SSA licensing agreement (I do not know whether it is or is not), it might be more useful to the auditors - especially if they are on an OS/400 similar to ours - to have a tool that says: for these item #s given to us by the auditors, copy just the relevant data from the relevant files, and send the auditors a database of just those items and their corresponding files, with lead times, costs, BOM, and history - the physical data, the external file layout, the source code of the file layout, no logicals - and let the auditors use it to find whatever correlations meet their fancy.

When we were on S/36 testing modifications, I sometimes copied only those files needed by the software to be tested into their own database (like a BPCS/400 environment). On BPCS/400 I have a hard enough job just refreshing the test environment with contemporary live production file data in its entirety, let alone extracting a smaller, more manageable, representative sampling for purposes of testing and education.

There are some AS/400 and BPCS/400 add-on products that might help with this task - I am just "thinking out loud" here - I have not actually done this.

Check out "BPCS Lite" at http://www.unbeatenpathintl.com/services.html - several other BPCS consultants offer AS/400 services aimed at eliminating what you do not need and helping your BPCS performance, but that is only a first step.

During our Y2K conversion we used FILE TRACK from OUTLOOK, http://www.outlookcomputing.com - our interest was in transferring modifications to new fields, resolving Y2K issues, and restructuring our numbering system as we merged 4 BPCS/36-style databases into 1 BPCS/400-style environment.
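The "copy just the relevant data from the relevant files" step imagined above boils down to plain SQL subsetting. A minimal sketch, with Python's sqlite3 standing in for the AS/400 database - table and column names are invented, not the actual BPCS file layouts:

```python
import sqlite3

# Sketch: given the item numbers the auditors picked, copy only the
# matching rows of each file into a separate extract database.
# sqlite3 stands in for the AS/400; names are illustrative only.

live = sqlite3.connect(":memory:")
live.executescript("""
    CREATE TABLE item_master (item TEXT, descr TEXT, lead_time INT);
    CREATE TABLE item_cost   (item TEXT, std_cost REAL);
    INSERT INTO item_master VALUES ('A100','Widget',14), ('B200','Gadget',30);
    INSERT INTO item_cost   VALUES ('A100',1.25), ('B200',9.99);
""")

wanted = ["A100"]                      # the auditors' item list
live.execute("ATTACH ':memory:' AS extract")
marks = ",".join("?" * len(wanted))
for tbl in ("item_master", "item_cost"):
    live.execute(f"CREATE TABLE extract.{tbl} AS "
                 f"SELECT * FROM main.{tbl} WHERE item IN ({marks})", wanted)

rows = live.execute("SELECT item, descr FROM extract.item_master").fetchall()
# rows == [('A100', 'Widget')] - B200 never reaches the extract
```

In real life the `extract` side would be another library/environment and the copy would have to respect the external file layouts, but the shape of the operation - one filtered copy per interconnected file - is the same.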
However, from what I remember of using File Track, it could also be used for a data extract project:

1. Designate your rules for what will be extracted - some extremes, representative data, some ranges - and File Track will do that for just the files you apply those rules to.
2. Copy records from other files that are interconnected, but ONLY the records needed for your selection criteria.
3. In later tests you can change your extraction rules.

Consider archiving tools, generally marketed for data mining, off-loading historical files until we really need them, spool file management, etc. Perhaps we could use the archiving tools to select the data we really need in a test environment; then, instead of actually removing what we have copied, and instead of it going to the archives, our test environment gets what we have selected.
http://www.arctools.com
http://www.helpsystems.com

There are many vendors with products that take AS/400 spool reports to a PC, where you then manipulate the data - such as Monarch from Data Watch, 800-445-3311 (I do not have their web site URL) - and I can cite over a dozen other places like this if you have the interest.

Thus one could use Query or other tools to identify the data we want in our little test database, direct the report not to spool but to an *OUTFILE, then use SQL to grab the records for those item#s, order#s, customer#s, etc. and copy them into the other environment's corresponding external file structures.

Another idea: when you do physical inventory, copy that file to some other environment and manipulate the contents - on the basis of item class, eliminating small potatoes, or whatever your criteria are - so that it holds only a small portion of your total inventory. Dump it into the opening balances of the test environment, then run the BPCS reorgs to eliminate zero records.

Al Macintyre
+---
| This is the BPCS Users Mailing List!
| To submit a new message, send your mail to BPCS-L@midrange.com.
| To subscribe to this list send email to BPCS-L-SUB@midrange.com.
| To unsubscribe from this list send email to BPCS-L-UNSUB@midrange.com.
| Questions should be directed to the list owner: dasmussen@aol.com
+---