Kurt,
Can you tell me why you think file encapsulation is worth the effort? I've spent many years thinking about this, and I've found that I don't like the bottom-up mindset it imposes on application development.
I find that my development patterns take a top-down approach: defining business "objects" or entities and the methods (business transactions) that process those entities, then invoking whatever file I/O approach best serves the needs of the transaction.
At issue, to me, is that file encapsulation limits you to a handful of methods for retrieving your data, and limits your flexibility as a developer. Say, for example, that I only need a subset of fields from a particular file. Unless I specifically define a proc to return just the fields I want, the standard mode of operation is to simply return all of the fields in the file.
Instead of designing on top of the database, design the functions that support the meta-data contained within the database. In a CUST file, you might have a data function to retrieve the billing or shipping address, another for YTD summary details, another for A/R credit terms, and so forth...
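For example, just as a rough sketch in the newer free-form style (all of these names are made up for illustration, not from any real service program), the prototypes for those data functions might look something like:

    dcl-ds custAddress_t qualified template;
      name   varchar(40);
      street varchar(40);
      city   varchar(30);
      state  char(2);
      zip    char(10);
    end-ds;

    // Return only the billing address for one customer
    dcl-pr getCustBillingAddr likeds(custAddress_t);
      custId packed(9:0) const;
    end-pr;

    // Return only the A/R credit terms for one customer
    dcl-pr getCustCreditTerms varchar(20);
      custId packed(9:0) const;
    end-pr;

Callers get just the slice of CUST data they asked for, and the I/O behind each function can change without touching them.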
JMO,
-Eric DeLong
-----Original Message-----
From: rpg400-l-bounces@xxxxxxxxxxxx [mailto:rpg400-l-bounces@xxxxxxxxxxxx] On Behalf Of Kurt Anderson
Sent: Tuesday, April 03, 2012 4:35 PM
To: RPG programming on the IBM i / System i (rpg400-l@xxxxxxxxxxxx)
Subject: File Encapsulation Quandary
Being a big fan of file encapsulation (essentially centralizing the business logic relating to a file), I've created a fair number of file-encapsulated service programs. They use native I/O, and since each service program is a one-stop shop, the file is defined as Update/Add. This causes a few issues:
1. File needs to be copied into a test library.
In our environment we don't have separate test and production environments; in fact, most testing references a client's production library. This has actually worked fine without consequence, because we always run STRDBG UPDPROD(*NO) for testing. What this means, though, is that any file that is encapsulated needs to be copied into a test library; otherwise the service program won't open the file. This is a minor pain (though it has occasionally caused issues in testing because a file already existed in a test library and hadn't been updated).
2. Files opened for update don't use record blocking on reads.
One of our main files that I really wanted to encapsulate couldn't be, because it's a transaction file with millions of records. Losing blocking on read loops hurts. A lot.
3. Using SQL bloats the job log in test mode and slows single-record access.
IBM has really been pushing SQL for data access, so I thought this might be a good occasion to follow the path they've laid out. Doing so addresses issues #1 and #2 above. I've modified one of our file-encapsulated service programs to use SQL, and I think it works pretty slick. One not-so-slick aspect, though, is the "chain": closing any existing cursor, preparing the statement, declaring the cursor, and then fetching from it is presumably going to be a lot slower than a simple CHAIN. I'm willing to live with that, although I'm getting some beef about over-complicating it. In addition, using SQL and running the program with STRDBG UPDPROD(*NO) balloons the job log. Maybe I shouldn't care about the size of the job log in test, but in one of my tests of 100,000 records (~80k chains), the job log wrapped twice before I killed the job. In that situation I may be able to load the file into an array or something, but I know that won't always be the case. (I realize this is only an issue in testing, but I can also foresee concerns about it slowing testing down, since it writes to the job log so much.)
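For what it's worth, here is a rough sketch of the kind of retrieval procedure I mean, using that close/prepare/declare/open/fetch pattern (written in free-form for readability; the procedure, cursor, and file names are made up for illustration and aren't from the linked sample):

    dcl-ds cust_t qualified template;
      custId packed(9:0);
      name   varchar(40);
      city   varchar(30);
    end-ds;

    dcl-proc getCustomer export;
      dcl-pi *n likeds(cust_t);
        custId packed(9:0) const;
      end-pi;

      dcl-s stmt varchar(500);
      dcl-ds cust likeds(cust_t) inz;

      // Declarative; the precompiler just needs to see this
      // before the cursor is referenced.
      exec sql declare custCsr cursor for custStmt;

      stmt = 'select custid, name, city ' +
             'from CUSTMAST where custid = ?';

      exec sql close custCsr;        // first call gets SQLSTATE 24501 (not open); ignore it
      exec sql prepare custStmt from :stmt;
      exec sql open custCsr using :custId;
      exec sql fetch custCsr into :cust;
      exec sql close custCsr;

      return cust;
    end-proc;

For a simple keyed single-record read like this, a static SELECT ... INTO would avoid the prepare/declare/open overhead entirely; the cursor form is shown only because that's the pattern I described above.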
This had me wondering how other shops handle file encapsulation. I know my last job had a completely separate test environment, but that's not likely to happen here anytime soon.
We also don't have change management software. Having files encapsulated has made some of our file changes quite a bit easier.
Here is a sample of the code. If you have the time, I'd appreciate comments.
http://code.midrange.com/f5aa843519.html
Thanks,
Kurt Anderson
Sr. Programmer/Analyst
CustomCall Data Systems