Booth Martin wrote:
How have you guys made test data from large sets of data? I am looking
at about 30 files, some with many millions of records in them. Is it
an easy process, in your experience?
Not necessarily easy, because you'll be wanting the subset to maintain
the same relational integrity the full set has. I usually use SQL to
choose the master records I need, based on my test coverage needs. Then
I'll use SQL to select dependent child records from other files, WHERE
HISTCUST IN (SELECT CUST FROM TESTMASTER)... Be aware that you may need
multiple WHERE clauses to maintain integrity (CUST, ITEM and maybe DATE).
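The pattern above can be sketched in Python with SQLite (standing in for the actual DB2 for i files). The table and column names here (MASTER, HIST, TESTMASTER, CUST, HISTCUST, ITEM) are borrowed from the fragments in the post and are otherwise hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Stand-ins for the full "production" files: a customer master and a history file.
cur.execute("CREATE TABLE MASTER (CUST INTEGER, NAME TEXT)")
cur.execute("CREATE TABLE HIST (HISTCUST INTEGER, ITEM TEXT, QTY INTEGER)")
cur.executemany("INSERT INTO MASTER VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex"), (3, "Initech")])
cur.executemany("INSERT INTO HIST VALUES (?, ?, ?)",
                [(1, "A", 5), (2, "B", 7), (3, "C", 2), (1, "D", 9)])

# Step 1: pick the master records the test coverage needs.
cur.execute("CREATE TABLE TESTMASTER AS SELECT * FROM MASTER WHERE CUST IN (1, 3)")

# Step 2: pull only the child records that belong to those masters, so the
# subset keeps the same referential integrity as the full data set.
cur.execute("""CREATE TABLE TESTHIST AS
               SELECT * FROM HIST
               WHERE HISTCUST IN (SELECT CUST FROM TESTMASTER)""")

rows = cur.execute(
    "SELECT HISTCUST, ITEM FROM TESTHIST ORDER BY HISTCUST, ITEM").fetchall()
print(rows)  # [(1, 'A'), (1, 'D'), (3, 'C')]
```

If the child file's uniqueness depends on more than one key (CUST, ITEM, maybe DATE), the WHERE clause grows one IN-subquery (or a joined EXISTS) per key, as the post warns.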
Don't ever waver in your discipline! Write a script, either using the
IBM RUNSQLSTM command or the one in the FAQ or some other tool, but
don't ever manually include or update a record. Always use the test
script, and then you can reliably re-create the test data.
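The rebuild-from-a-script discipline might look like this in sketch form, again in Python/SQLite rather than a RUNSQLSTM source member, with a hypothetical `build_test_data` helper; the point is that rerunning it always recreates the identical subset:

```python
import sqlite3

def build_test_data(con, test_custs):
    """Drop and rebuild the test tables from scratch, never editing them by hand."""
    cur = con.cursor()
    for tbl in ("TESTMASTER", "TESTHIST"):
        cur.execute(f"DROP TABLE IF EXISTS {tbl}")
    marks = ",".join("?" * len(test_custs))
    cur.execute(f"CREATE TABLE TESTMASTER AS "
                f"SELECT * FROM MASTER WHERE CUST IN ({marks})", test_custs)
    cur.execute("""CREATE TABLE TESTHIST AS
                   SELECT * FROM HIST
                   WHERE HISTCUST IN (SELECT CUST FROM TESTMASTER)""")
    con.commit()

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE MASTER (CUST INTEGER, NAME TEXT)")
con.execute("CREATE TABLE HIST (HISTCUST INTEGER, ITEM TEXT)")
con.executemany("INSERT INTO MASTER VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
con.executemany("INSERT INTO HIST VALUES (?, ?)", [(1, "A"), (2, "B")])

build_test_data(con, [1])
build_test_data(con, [1])   # running it again yields the same test data
rows = con.execute("SELECT HISTCUST, ITEM FROM TESTHIST").fetchall()
print(rows)  # [(1, 'A')]
```

On the IBM side the same idea is a source member of plain SQL statements executed with RUNSQLSTM, so the whole subset can be torn down and rebuilt in one command.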
There are commercial tools to do this, but I've never worked at a place
willing to spend the money. Martin Rowe has a tool that's pretty cool.
In any case, I hope I got across some of the things to be considered
when building a test set.