Interesting idea on the dual F-specs for the same file. Similarly, I'd need two F-specs per logical file as well, although the files we encapsulate in service programs generally get by on just the keyed physical. I'll have to think on that (and, when I get time, play with coding it). I don't think the two file definitions would be too bad to implement in one of our service programs for testing, since the newer ones already access a data structure through the Get and Set procedures.
Jeff, I suppose that could be a consideration, although I think Rory's idea of using two F-specs might reduce the duplication that two service programs would create.
I appreciate the feedback and ideas.
From: rpg400-l-bounces@xxxxxxxxxxxx [mailto:rpg400-l-bounces@xxxxxxxxxxxx] On Behalf Of Rory Hewitt
Sent: Tuesday, April 03, 2012 5:04 PM
To: RPG programming on the IBM i / System i
Subject: Re: File Encapsulation Quandary
I assume the problem with #1 is that sometimes you have a file with read-only data that you don't want to have to copy into a test library (since you know you won't be updating it) - is that correct? If so, when does the file open fail - immediately on the OPEN (or on the implicit open if you're not using USROPN)? I see that the help for the UPDPROD parameter of the STRDBG command says this:
"...The exception to this is starting debug mode after a production library is already opened..."
Not sure whether it's possible to bypass this problem by 'opening' the production library first (or whether that would simply open you up to other problems).
An option might be to make your *PROD library into a *TEST library (so avoiding the UPDPROD issue altogether) and then include a trigger on your updateable files in that library which prompts the user before attempting any update/add?
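For what it's worth, attaching that kind of trigger would look something like the CL below. The file, library, and program names are made up, and the trigger program itself would have to implement the confirmation (for example, by rejecting the update when the user says no):

```
ADDPFTRG   FILE(TESTLIB/CUSTMAST) TRGTIME(*BEFORE) +
             TRGEVENT(*UPDATE) PGM(TESTLIB/CONFIRMTRG) +
             ALWREPCHG(*YES)
ADDPFTRG   FILE(TESTLIB/CUSTMAST) TRGTIME(*BEFORE) +
             TRGEVENT(*INSERT) PGM(TESTLIB/CONFIRMTRG)
```

A *BEFORE trigger can signal an error back to the database to block the operation, which is probably the cleanest way to make an "are you sure?" check enforceable.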
As far as #2 goes, would it work to simply have two F-spec definitions of the file, both specifying USROPN and EXTFILE, where one is U/A and one is input-only, and then open and use the correct one depending on the requirements (reading or writing)? A bit of a hassle, but it should work.
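A rough sketch of that dual-definition idea in fixed-form F-specs, with both definitions pointing at the same physical file. CUSTMAST, CUSTREC, and the internal names are all hypothetical, and the column alignment here is approximate:

```
      * Update/add definition - opened only when writing is needed
     FCustUpd   UF A E           K Disk    UsrOpn
     F                                     ExtFile('CUSTMAST')
     F                                     Rename(CUSTREC:UpdRec)
      * Input-only definition - record blocking applies on read loops
     FCustInp   IF   E           K Disk    UsrOpn
     F                                     ExtFile('CUSTMAST')
     F                                     Rename(CUSTREC:InpRec)
     F                                     Block(*Yes)
```

The RENAME is needed so the two copies of the record format don't collide, and each procedure would then OPEN whichever definition matches the access it needs (e.g. `open CustInp;` for read-only callers).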
Just some top-of-the-head thoughts.....
On Tue, Apr 3, 2012 at 2:35 PM, Kurt Anderson wrote:
Being a big fan of file encapsulation (essentially centralizing the
business logic relating to a file), I've created a fair number of
file-encapsulated service programs. They use native I/O. Since the
service program is a one-stop shop, the file is defined as Update/Add.
This causes a couple of issues:
1. File needs to be copied into a test library.
In our environment we don't have separate test and production
environments. In fact, most testing references a client's production
library. This has actually worked fine without consequence. We always
run STRDBG UPDPROD(*NO) for testing. What this means is that any file
that is encapsulated needs to be copied into a test library, otherwise
the service program won't open the file. This is a minor pain (though
it has occasionally caused problems in testing because a file already
existed in a test library and hadn't been refreshed).
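The test setup described above amounts to something like the following CL; the library, file, and program names are hypothetical:

```
/* Refresh the encapsulated file into the test library */
CPYF       FROMFILE(PRODLIB/CUSTMAST) TOFILE(TESTLIB/CUSTMAST) +
             MBROPT(*REPLACE) CRTFILE(*YES)
/* Debug with production updates disallowed */
STRDBG     PGM(TESTLIB/MYPGM) UPDPROD(*NO)
CALL       PGM(TESTLIB/MYPGM)
```

The "minor pain" is that the CPYF step has to be repeated for every encapsulated file, every time the production data changes.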
2. Files opened for update don't get record blocking on reads.
One of our main files that I really wanted to encapsulate couldn't be,
because it's a transaction file with millions of records. Losing
record blocking on read loops hurts. A lot.
3. Using SQL bloats the job log in test mode and slows single-record access.
IBM has really been pushing using SQL to access data, so I thought
this might be a good occasion to follow that path they've laid out.
Doing so addresses issues #1 and #2 above. I've modified one of our
file-encapsulated service programs to use SQL, and I think it works
pretty slick. One not-so-slick aspect, though, is the "chain."
Presumably closing an existing cursor, preparing the statement,
declaring the cursor, opening it, and then fetching is going to be a
lot slower than a simple CHAIN. I'm willing to live with that,
although I'm getting some beef about over-complicating it. In
addition, using SQL and running the program with STRDBG UPDPROD(*NO)
balloons the job log. Maybe I shouldn't care about the size of the job
log in test, but in one of my tests of 100,000 records (~80k chains),
the job log wrapped twice before I killed the job. In this particular
situation I may be able to load the file into an array or something,
but I know that won't always be the case. (I realize this is only an
issue in testing, but I can also foresee concerns about it slowing
testing down, since it's writing to the job log so much.)
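As a sketch of what that SQL "chain" looks like (all names here are hypothetical, and a PREPARE step would be added if the statement text is built dynamically), compare the cursor sequence with the static alternative:

```
      /free
        // Cursor-based lookup, roughly the sequence described above
        exec sql Close CustCsr;          // ignore SQLCOD if not open
        exec sql Declare CustCsr Cursor For
                 Select * From CUSTMAST Where CUSTID = :inCustId;
        exec sql Open CustCsr;
        exec sql Fetch CustCsr Into :CustDs;
        Found = (SqlCod = 0);

        // When the statement can be static, SELECT INTO avoids the
        // cursor ceremony entirely for single-record access:
        exec sql Select * Into :CustDs
                 From CUSTMAST
                 Where CUSTID = :inCustId;
        Found = (SqlCod = 0);
      /end-free
```

Where the key is unique and the statement doesn't need to be dynamic, the SELECT INTO form is the closer equivalent of a CHAIN and skips most of the per-call overhead.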
This had me wondering how other shops handle file encapsulation. I
know my last job had a completely separate test environment. That's
not likely to happen here anytime soon.
We also don't have change management software. Having files
encapsulated has made some of our file changes quite a bit easier.
Here is a sample of the code. If you have the time, I'd appreciate any feedback.
CustomCall Data Systems
This is the RPG programming on the IBM i / System i (RPG400-L) mailing list.
To post a message email: RPG400-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options, or email: RPG400-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives at http://archive.midrange.com/rpg400-l.