

  • Subject: RE: Changing records in a big file
  • From: Joel Fritz <JFritz@xxxxxxxxxxxxxxxx>
  • Date: Fri, 4 Feb 2000 15:01:41 -0800

Blocked sequential I/O could be a very good choice depending on how many
records you need to change.  If it's a substantial proportion of the file,
it will definitely be your fastest choice.  Keyed I/O is very expensive
compared with a simple read.  Leave out the K in your F-spec if the file
has a key.
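For example, the difference is just the K in the F-spec (a rough RPG IV
sketch; the file name is made up and the column alignment is approximate,
so check it against your compiler):

```
 * Keyed access -- every READ chases the index:
FBIGFILE   UF   E           K DISK
 *
 * Arrival sequence -- drop the K and the system can
 * block sequential reads of the physical file:
FBIGFILE   UF   E             DISK
```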

Update is an expensive operation.  If you can write the output to a new
file with the same record format (rename the old one first), it will go
much faster.  Write all the records to the new file.

If you have to update in place, consider exception output.  It's dangerous
in a large program that has to be maintained, but not too bad in a short
one-shot program.
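A minimal sketch of exception output for an in-place update (RPG IV style;
file, field, and except names are hypothetical, and the O-spec layout is
approximate -- exception output writes back only the fields you list, which
is why it can beat a full-record UPDATE):

```
FBIGFILE   UF   E             DISK
 *
C                   READ      BIGFILE
C                   DOW       NOT %EOF(BIGFILE)
C                   IF        FIELDX < MINVAL
C                   EVAL      FIELDX = SOMEVAL
C                   EXCEPT    UPDREC
C                   ENDIF
C                   READ      BIGFILE
C                   ENDDO
C                   EVAL      *INLR = *ON
 *
 * Only FIELDX is written back on the exception output:
OBIGFILE   E            UPDREC
O                       FIELDX
```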

If you can go through the file sequentially, you can break the job into
three parts using OVRDBF to start at record numbers that will divide the
file in thirds and submit three simultaneous jobs.
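One way to sketch the split in CL (program and file names are made up; the
override has to run inside each submitted job so that POSITION(*RRN) takes
effect there):

```
PGM        PARM(&STRRRN)
DCL        VAR(&STRRRN) TYPE(*DEC) LEN(10 0)
OVRDBF     FILE(BIGFILE) POSITION(*RRN &STRRRN)
CALL       PGM(UPDPGM)  /* processes its third of the file */
DLTOVR     FILE(BIGFILE)
ENDPGM
```

You would then submit it three times, e.g. SBMJOB CMD(CALL SPLITCL
PARM(1)), PARM(1333334), and PARM(2666667) for a 4 million record file;
each copy of UPDPGM also needs to stop at its own end point rather than
reading to end of file.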

Another option is an update query of the form:

update filename
set fieldx = somevalue
where fieldx < minvalue

(or any logical condition involving the fields in the record), but I assume
you are looking for an RPG solution.

I used to do a lot of mailing list processing that involved exactly this
sort of thing.  On the F60 we had when I started, updating half the records
of a 4 million record file by writing the output to a new file would take
about half an hour running in batch in the middle of the day with a lot of
interactive users signed on.  Using update in place with the UPDAT opcode
would take more than twice as long.



Four hours sounds like an unreasonably long time but maybe I've been
spoiled.

> -----Original Message-----
> From: Chris Beck [mailto:CBeck@good-sam.com]
> Sent: Friday, February 04, 2000 1:43 PM
> To: RPG400-L@midrange.com
> Subject: Changing records in a big file
> 
> 
> I have a 4 million record file that I need to change some 
> records in if they meet certain criteria.  What are some 
> things I can do to speed this up?  Obviously, sequential 
> reading of the file is out of the question at over 4 hours to 
> run.  I would prefer not to use SETLL if possible, 
> because I don't want to have to go into the file and see 
> what record to start with. 
> 
> 
> thanks for any help.  
> 
> 
+---
| This is the RPG/400 Mailing List!
| To submit a new message, send your mail to RPG400-L@midrange.com.
| To subscribe to this list send email to RPG400-L-SUB@midrange.com.
| To unsubscribe from this list send email to RPG400-L-UNSUB@midrange.com.
| Questions should be directed to the list owner/operator: david@midrange.com
+---



