I've almost completed a utility to help our shop create web pages for regular tabular data. It is optimized for creating tables-based reports (i.e., converting existing reports for intranet browsing). However, I have some performance questions.

The application architecture is a service program that can be bound to ILE RPG modules. The back end implements storage as user spaces in the job's QTEMP library. The requesting program sends data to the service program through prototyped calls, and the service program stores the unformatted data in the user space. When the requesting program calls for output, it specifies either standard output or an IFS location (currently, only the IFS is implemented, and it is the focus of current testing). The service program reads the user space, formats the data, then shuttles it to its final location.

I'm about 95% complete, with the exception of writing to standard output and some control fields. So far, the program works well, except for some performance issues. The major problem is twofold. Since I've only implemented the IFS portion, that's all I can test. However, the service program appears to consume major CPU cycles when formatting the data and/or writing to the IFS. Consequently, I only get 1-2K/sec when writing an HTML page.

In the report I'm currently testing, I have a summary portion and a detail portion, built as two separate pages that are hyperlinked. The summary portion is 30KB, and it takes 30-40 seconds to read the user space, format the HTML, and write to the IFS. The detail portion is roughly 1.5MB, and that process takes around an hour! (AS/400 model 720, 1GB RAM, 80GB disk.)

I'm unsure where the bottleneck(s) lie.

1) When storing the data to the user space, I have turned on automatic extendibility. Once the initial capacity fills, it takes noticeably longer to complete data storage. Does OS/400 actually create a new, larger user space and copy the data, or does it simply extend the current space? Would it be better to handle storage in a file, say keyed by handle and date/time written? I don't know anything about data queues, but I do anticipate generating pages up to 2MB in size. All I need to do is store a varying amount of data and read it back in FIFO order.

2) Reading from the user space appears to take little time, so I'm not terribly concerned here.

3) Formatting the HTML. I take "unformatted" data, along with some control fields, and generate the appropriate HTML code on the output stage. Currently, this involves many %trim/%trimr operations on the output data. I am using 32000-byte work fields for the data (since this is the current RPG IV limit). String operations take a long time. I thought about passing data through pointers, but I still need to concatenate the data into an output variable before writing it.

4) Writing the data to the IFS takes some time. Disk utilization obviously increases. Hopefully in the next few days I can implement standard output and see whether writing the data is an actual bottleneck.

I'm hoping someone has done similar work before; any tips would be greatly appreciated.

Thanks!
Loyd
--
Loyd Goodbar
lgoodbar@ispchannel.com
ICQ#504581
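On point 3, a likely culprit with many %trim/%trimr operations is that every concatenation of the form "out = %trimr(out) + piece" rescans and recopies the entire 32000-byte output field, so the total formatting cost grows roughly with the square of the page size; a 1.5MB detail page assembled this way does an enormous amount of redundant scanning. Keeping a running length and placing each piece directly at that offset (in RPG IV, %subst with a length counter) makes each append proportional only to the piece being added. The C sketch below illustrates the idea only; the buffer size, function name, and variable names are invented for illustration and are not taken from the original utility.

/* A minimal sketch, assuming C, of appending to an output buffer with a
 * tracked length instead of re-trimming the whole buffer on every append. */
#include <stdio.h>
#include <string.h>

#define OUT_SIZE 32000               /* mirrors the 32000-byte RPG work field */

static char   outBuf[OUT_SIZE];
static size_t outLen = 0;            /* running length of data already in outBuf */

/* Append 'len' bytes of already-trimmed data at the known end of the buffer.
 * Returns 0 on success, -1 if the buffer is full and should be flushed first. */
static int appendData(const char *data, size_t len)
{
    if (outLen + len > OUT_SIZE)
        return -1;                   /* caller flushes outBuf, then retries */
    memcpy(outBuf + outLen, data, len);  /* no scan of the existing contents */
    outLen += len;
    return 0;
}

int main(void)
{
    appendData("<tr><td>", 8);
    appendData("cell value", 10);
    appendData("</td></tr>\n", 11);
    printf("%.*s", (int)outLen, outBuf); /* prints the assembled HTML fragment */
    return 0;
}

The RPG IV equivalent of appendData is roughly "%subst(outBuf: outLen+1: n) = piece" followed by "outLen = outLen + n", with a flush when outLen nears 32000.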
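On point 4, IFS throughput depends heavily on how many write operations are issued: writing a table row or a few bytes at a time incurs the per-call overhead tens of thousands of times for a 1.5MB page, while flushing a full work field per write keeps the call count small. Below is a minimal C sketch of that pattern using the IFS stream-file APIs (open/write/close), which can also be prototyped and called directly from ILE RPG; the path and the HTML literal are placeholders, not taken from the original program, and a production version would also loop on partial writes.

/* A minimal sketch, assuming ILE C and the IFS stream-file APIs:
 * open the target file once, write the HTML in large blocks
 * (e.g. one full 32KB work field per write), and close once. */
#include <fcntl.h>
#include <sys/stat.h>
#include <unistd.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *html = "<html><body><table><tr><td>example</td></tr>"
                       "</table></body></html>\n";

    /* "/reports/summary.html" is a made-up IFS path for illustration. */
    int fd = open("/reports/summary.html",
                  O_WRONLY | O_CREAT | O_TRUNC, S_IRUSR | S_IWUSR);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* In the real utility this would be the 32KB assembly buffer,
     * flushed each time it fills; here it is one short literal. */
    if (write(fd, html, strlen(html)) < 0) {
        perror("write");
        close(fd);
        return 1;
    }

    return close(fd) == 0 ? 0 : 1;
}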