Without spending much time with this, my first recommendation would be to
stop using dynamic prepare to process your SQL statement.  Dynamic SQL must
analyze the statement and perform a full optimization pass before it
executes.  Does dbFILE change for each execution, or is the SQL always
targeting the same file for its update?

The format of your SQL is appropriate for changing many files, with
different field names in each file.  This is overkill if the file and field
names are static.

If dbFILE is static, you can write the statement using host variables or
parameter markers, which allows the SQL package to compile statistics within
its "access plan" to better optimize the execution of the SQL....

C/Exec SQL
C+  Update  FileXYZ 
C+     Set  empno  = :##EN
C+   Where  emplyr = :##ER and
C+            empno  = :##SS
C/End-Exec

The colon indicates to SQL that the value used in the query is supplied by
the HLL program.  
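
If dbFILE does change from call to call, a middle ground is to PREPARE the
statement once per file using parameter markers (?) and then EXECUTE it once
per employee, so the optimization pass is paid per file rather than per row.
A rough sketch -- the statement name UpdStmt and the surrounding variable
names are made up for illustration:

SQLCommand = 'update ' + dbFILE + ' set ' + dbEN1 + ' = ?'
           + ' where ' + dber + ' = ? and ' + dben1 + ' = ?';

C/Exec SQL
C+  Prepare  UpdStmt  From :SQLCommand
C/End-Exec
C* ... then inside the per-employee loop:
C/Exec SQL
C+  Execute  UpdStmt  Using :##EN, :##ER, :##SS
C/End-Exec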

Eric DeLong
Sally Beauty Company
MIS-Project Manager (BSG)
940-297-2863 or ext. 1863



-----Original Message-----
From: rpg400-l-bounces@xxxxxxxxxxxx
[mailto:rpg400-l-bounces@xxxxxxxxxxxx]On Behalf Of Tom Huff
Sent: Wednesday, February 01, 2006 10:38 PM
To: 'RPG programming on the AS400 / iSeries'
Subject: RE: Chaining with a very large file


I didn't create special indexes. Almost all (98%) of the files had a LF keyed
by Employer and Employee, which is the key I need to process by. I have to
process all records for an employee within an Employer.
Here is the SQL statement:
SQLCommand = 'update ' + dbFILE
           + ' set ' + dbEN1 + ' = ''' + ##EN + ''''
           + ' where ' + dber + ' = ''' + ##er + ''' and '
           + dben1 + ' = ''' + ##ss + '''';
Please let me know if anyone knows a better way to write this. I am fairly
new to using embedded SQL.
Thanks
Tom

-----Original Message-----
From: rpg400-l-bounces@xxxxxxxxxxxx [mailto:rpg400-l-bounces@xxxxxxxxxxxx]
On Behalf Of Aaron Bartell
Sent: Wednesday, February 01, 2006 8:03 PM
To: 'RPG programming on the AS400 / iSeries'
Subject: RE: Chaining with a very large file

>Sorry to burst SQL's bubble, but it cannot hold a candle to RPG.

I'd be curious to see your SQL statements and then the steps taken to make
sure there were indexes to optimize the SQL.  The reason I say that is that I
was doing SQL in an application and thought I was optimizing it well with the
indexes I created from my own reasoning, but when I took it through the
Visual Explain tooling in iSeries Navigator it turned out I was off on many
of my indexes.

If I remember correctly, you can turn on journaling to see which SQL
statements are being run against a file, exactly what access paths the
optimizer chose, and what it created dynamically as it went.
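
The more direct tool for that today is the SQL database monitor; a sketch of
the CL, with the output library and file names as placeholders:

STRDBMON OUTFILE(MYLIB/DBMONDTA) JOB(*) TYPE(*DETAIL)
/* ... run the workload to be measured ... */
ENDDBMON

The monitor output file can then be analyzed (for example through iSeries
Navigator) to see each statement and the access plan the optimizer chose.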

Might be worth taking a look into if you ever need to do SQL again.

Aaron Bartell


-----Original Message-----
From: rpg400-l-bounces@xxxxxxxxxxxx [mailto:rpg400-l-bounces@xxxxxxxxxxxx]
On Behalf Of Tom Huff
Sent: Wednesday, February 01, 2006 9:50 PM
To: 'RPG programming on the AS400 / iSeries'
Subject: RE: Chaining with a very large file

The last time I tested this, RPG using SETLL & read loop with a test in the
loop beat SQL by a factor of 10. I am changing all the records in an HR
system one employee at a time. The employee number is being changed from
SocSec to a generated number. There are about 600 files and each employee
has anywhere from 25 to 25000 records. SQL takes an average of 8.5 minutes
and RPG takes an average of 1 minute. 
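
The native I/O loop in question is roughly this shape -- the file, record
format, and field names here are made up for illustration:

C     EmpKey        KLIST
C                   KFLD                    Emplyr
C                   KFLD                    OldEmpNo
C*
C     EmpKey        SETLL     PAYHST
C     EmpKey        READE     PAYHST
C                   DOW       NOT %EOF(PAYHST)
C                   EVAL      EMPNO = NewEmpNo
C                   UPDATE    PAYHSTR
C     EmpKey        READE     PAYHST
C                   ENDDO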
The total number of records to change was too large to run over a weekend,
so we broke it down to one employee at a time, submitted when the new
employee badge is made.
The programming in SQL was shorter but the RPG was MUCH FASTER.
I wrote a log file of the number of records changed and the elapsed time
using both methods. We are changing about 250000 employees. 
Sorry to burst SQL's bubble, but it cannot hold a candle to RPG.
Thanks
Tom

-----Original Message-----
From: rpg400-l-bounces@xxxxxxxxxxxx [mailto:rpg400-l-bounces@xxxxxxxxxxxx]
On Behalf Of Wilt, Charles
Sent: Wednesday, February 01, 2006 12:28 PM
To: RPG programming on the AS400 / iSeries
Subject: RE: Chaining with a very large file

> -----Original Message-----
> From: rpg400-l-bounces@xxxxxxxxxxxx
> [mailto:rpg400-l-bounces@xxxxxxxxxxxx] On Behalf Of Douglas W. Palme
> Sent: Wednesday, February 01, 2006 12:15 PM
> To: RPG programming on the AS400 / iSeries
> Subject: RE: Chaining with a very large file
> 
> 
> Thanks for the tip, I know sql is out of the question for a file this 
> size and the machine we have.
> 

Nonsense; the logical file you need for native I/O would provide benefits
that SQL would use too.

If all you are doing is reading one record, then native I/O should be
faster.

On the other hand, if you want to read 100s or 1000s of records with the
same key, you'd find SQL faster.
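
A single-record read over native I/O is just a CHAIN against the keyed
logical -- a sketch with made-up file and key names:

C     EmpKey        CHAIN     PAYHST
C                   IF        %FOUND(PAYHST)
C*                  record is in the buffer after one probe of the index
C                   ENDIF

whereas SQL tends to win when one statement can touch the whole set at once.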



Charles


--
This is the RPG programming on the AS400 / iSeries (RPG400-L) mailing list
To post a message email: RPG400-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/rpg400-l
or email: RPG400-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives at
http://archive.midrange.com/rpg400-l.



