Damn, I think you have got it, Charles, concerning the SET. That would solve my having to use an UPDATE, but it would have to be in a program, which it is going to be anyway, so the program would have the following:

Open the stream file.
/Exec SQL Set :DummyField = OpenIFS('/home/yourdirectory/yourfile.txt')

Write the first header.
/Exec SQL Set :DummyField = Select Distinct WriteIFS(Build_String_Here) From Tables

Write the second header.
/Exec SQL Set :DummyField = Select Distinct WriteIFS(Build_String_Here) From Tables

Write the 1.5 million.
/Exec SQL Set :DummyField = Select Distinct WriteIFS(Build_String_Here) From 1.5Table

Close the stream file.
/Exec SQL Set :DummyField = CloseIFS()

Shoot, I just spotted a problem. Everything is variable, so the writes would have to be prepared, and I don't think I can use a prepared statement with a SET, or at least I don't know how to do it. Maybe the solution is to use EXECUTE IMMEDIATE with an INSERT into a temp table. You cannot use SELECT with EXECUTE IMMEDIATE, but I can do an "EXECUTE IMMEDIATE INSERT INTO temp_table SELECT DISTINCT WRITEIFS(Build_String_Here) FROM Tables" and declare the temp_table using DECLARE GLOBAL TEMPORARY TABLE, so we would end up with:

Build the dummy file.
/Exec SQL DECLARE GLOBAL TEMPORARY TABLE Temp_Table (DummyField Char(1))

Open the stream file.
/Exec SQL SET :DummyField = OpenIFS('filename')

Write the first header.
StringToExecute = 'INSERT INTO TEMP_TABLE SELECT DISTINCT WRITEIFS(' + BuildStringHere + ') FROM ' + Tables
/Exec SQL EXECUTE IMMEDIATE :StringToExecute

So on and so forth. The global temporary table disappears at the end.

As to your question about the performance, I think that any time you can have a system-level process do the work, it is going to be faster, and since the records never come into the HLL program, that makes it fast. To my way of thinking, SQL is always going to be faster than bringing records into an HLL program. In other words, set-at-a-time operations. The important point here is that the records never come into the HLL. All the processing goes on in SQL: SQL will read a record from the database, format the string, call the UDF, and go back to get the next record.

I think we are almost there. Thanks so much for the help.

-----Original Message-----
From: Wilt, Charles [mailto:CWilt@xxxxxxxxxxxx]
Sent: Thursday, May 26, 2005 9:19 AM
To: RPG programming on the AS400 / iSeries
Subject: RE: Help improve performance of RPG program - Long Post

> Ok, I get it. The reason that comes to mind is that I would have to call
> three programs vs. having to write one service program with four
> procedures. The programs in turn would have to call the procedures in the
> service program, so instead of one program, I would need to maintain four
> programs, and as far as I know, a stored procedure cannot be a service
> program. It must be a regular program.

I can see your point here. However, even with your procedures in a service program, you'd still have to have one program/stored procedure to call them.

I'm not convinced that it makes sense to handle it this way. Basically, I don't see any benefits to calling the procedures as UDFs instead of bound procedure calls.

However, how do you envision WRITEIFS() being used? It would seem to me that there are two ways:

1) WriteIFS(:StringToOutput)

or

2) Select WriteIFS(field1 concat ',' concat cast(field2 as char) concat ',' concat myDateFunc(field3)) into :errorFlag from INPUTFILE

Option one still requires data to be brought up from the DB into the HLL procedure to process.
Option two, on the other hand, may have some performance benefits, as the DB record doesn't get sent to the HLL, just a string. The burning question: is it faster to manipulate the fields in an HLL or in SQL using CAST?

> In SQL Server, I can just do SELECT CLOSEIFS() without having to specify
> a table name, but as far as I know you cannot do that in AS/400 SQL. I
> tried to think of a way to do it without using an UPDATE or an INSERT.
> That blank has to go somewhere. If I don't consume it, the system is
> going to throw it to the screen, and it is just a string constant. The
> only thing changing would be the file name string, so you just create an
> internal function, pass the string, and have it return the formatted
> string, then pass it to an internal function to PREPARE and execute. The
> code in the service program to implement it is probably going to be 20 or
> 30 lines of code, assuming I call my IFS service program. If someone has
> an idea of how to consume the blank without using an UPDATE or INSERT, I
> would love to hear about it.

Scalar UDFs can be used just about anywhere you can use an expression. So for instance, you can do:

set dummyVar = CloseIFS();

You don't need the dummy file/fields at all, just a dummy SQL procedure variable. I'd have the UDF return a SUCCESS/FAIL flag that I could check before continuing.

> As far as creating functions, you are really not creating anything. You
> are just registering with the database the name of the service program
> and procedure to call. Either way I have to write the same code to open
> the IFS file in the appropriate manner. The code in the service program
> would be identical in either case; I would just have three programs that
> would call the service program.
>
> Also, I was originally thinking of implementing the Open as one function,
> OPENIFS. The first parameter would be the same, the second would be
> optional *NEW or *APPEND, with *NEW being the default. I would still code
> two procedures, one for one parameter and one for two parameters. I went
> toward having two functions. Maybe I should do one. Your opinion?

> --
> This is the RPG programming on the AS400 / iSeries (RPG400-L) mailing list
> To post a message email: RPG400-L@xxxxxxxxxxxx
> To subscribe, unsubscribe, or change list options,
> visit: http://lists.midrange.com/mailman/listinfo/rpg400-l
> or email: RPG400-L-request@xxxxxxxxxxxx
> Before posting, please take a moment to review the archives
> at http://archive.midrange.com/rpg400-l.
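For reference, here is a minimal sketch of the flow being worked out in this thread, written as free-format RPGLE with embedded SQL (at the time of this exchange the EXEC SQL statements would have been coded in fixed form). It assumes the OpenIFS/WriteIFS/CloseIFS UDFs have already been written in the IFS service program and registered with CREATE FUNCTION, as discussed above; the path, column, and table names (HdrField1, HdrField2, HeaderTable) are placeholders, not anything from the original posts.

**free
// Sketch only: OpenIFS, WriteIFS and CloseIFS are the UDFs discussed in
// this thread and must already be registered (CREATE FUNCTION) over the
// IFS service program.  File, column and table names are placeholders.

dcl-s dummyField      char(1);
dcl-s buildString     varchar(2000);
dcl-s stringToExecute varchar(5000);

// Dummy temp table that consumes the value returned by WriteIFS().
// Depending on the naming option it may need to be qualified as
// Session.Temp_Table in the dynamic INSERT below.
exec sql DECLARE GLOBAL TEMPORARY TABLE Temp_Table (DummyField char(1));

// Open the stream file
exec sql SET :dummyField = OpenIFS('/home/yourdirectory/yourfile.txt');

// Write the first header: the column expression varies, so the statement
// is built at run time and run with EXECUTE IMMEDIATE
buildString = 'HdrField1 concat '','' concat char(HdrField2)';
stringToExecute = 'INSERT INTO Temp_Table '
                + 'SELECT DISTINCT WriteIFS(' + buildString + ') '
                + 'FROM HeaderTable';
exec sql EXECUTE IMMEDIATE :stringToExecute;

// ...same pattern for the second header and the 1.5 million detail rows...

// Close the stream file
exec sql SET :dummyField = CloseIFS();

*inlr = *on;

The declared temporary table is created in QTEMP and goes away on its own when the job ends, so nothing needs to be cleaned up afterwards, which matches the "global table disappears at the end" point above.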