Hi Simon,

Hope you have time to help me.  I've got the program working with the
data queue and used several of your suggestions.  I have only one
problem: the program stops updating the records about twice a day.  If
I stop the program and restart it, it starts updating again.  I know it
reads the correct data queue records, because the first thing I do is
write them to a history file; there are just no updates.  There are no
messages in the job log.  Any ideas?
-----Original Message-----
From: Simon Coulter [mailto:shc@flybynight.com.au]
Sent: Friday, August 18, 2000 7:39 PM
To: COBOL400-L@midrange.com
Subject: Re: using DTAQ in COBOL



Hello Mary,

You wrote:
>We want the web program to send the data to a data queue and a COBOL
program
>to process the record and then wait for the next one. What makes the COBOL
>program set on the READ statement and how do you handle opening and closing
>of files?

You don't READ from the data queue.  You call the QRCVDTAQ API.  You
would submit the COBOL program as a batch process that opens its files
and then waits for data on the queue.  The API supports a wait-time
parameter that lets you decide to wait forever, wait for a finite
period, or not wait at all.
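
To illustrate, a minimal receive loop might look like the following
sketch.  The queue, library, field, and program names are mine; the
QRCVDTAQ parameter list is the standard required one (queue name,
library, returned data length, returned data, wait time).

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. DTAQRCV.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * Queue and library names are examples only.
       01  DTAQ-NAME    PIC X(10)          VALUE "ORDERSQ".
       01  DTAQ-LIB     PIC X(10)          VALUE "MYLIB".
       01  DATA-LENGTH  PIC S9(5)  COMP-3.
       01  DTAQ-ENTRY   PIC X(256).
      * Wait time: -1 = wait forever, 0 = return at once,
      * n > 0 = wait up to n seconds.
       01  WAIT-TIME    PIC S9(5)  COMP-3  VALUE -1.
       PROCEDURE DIVISION.
       MAIN-LOOP.
           CALL "QRCVDTAQ" USING DTAQ-NAME, DTAQ-LIB,
                                 DATA-LENGTH, DTAQ-ENTRY,
                                 WAIT-TIME.
      * Process the first DATA-LENGTH bytes of DTAQ-ENTRY here.
           GO TO MAIN-LOOP.
```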

>We want to halt the COBOL program and let the queue just build up while we
>run a backup on the files. Do you abort the job and let it close the files?
>Appreciate any advice.  

Be a little careful with this.  On RISC systems, the queue is sorted
(if keyed) on the receive, so if many entries arrive during the backup
the COBOL program may appear to stall when you restart it after the
backup -- it will catch up eventually.  Also, queues grow when entries
arrive faster than they are removed, so you may have space
considerations.

It would be nicer if you designed a communications method where the
COBOL program can shut down when it gets a particular queue entry.  You
could make the queue keyed and use a numeric priority as the key value.
Normal entries would be sent with a PTY of 5, END_NOW entries with a
PTY of 0, END_AFTER_PROCESSING entries with a PTY of 9, etc.  You might
want to send a SUSPEND entry which causes the program to close the
files and wait for a RESUME entry on the queue.
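
A keyed receive along those lines might look like this sketch (key
values and paragraph names are mine, and I assume DTAQ-NAME, DTAQ-LIB,
DATA-LENGTH, DTAQ-ENTRY, and WAIT-TIME are declared as for any
QRCVDTAQ call).  With a key order of "GE" and a search key of "0", the
API returns the waiting entry with the lowest key greater than or
equal to "0", so an END_NOW entry keyed 0 is received ahead of normal
work keyed 5:

```cobol
      * Extra parameters for a keyed receive (1-byte key).
       01  KEY-ORDER    PIC X(2)           VALUE "GE".
       01  KEY-LENGTH   PIC S9(3)  COMP-3  VALUE 1.
       01  KEY-VALUE    PIC X(1).
       01  SENDER-LEN   PIC S9(3)  COMP-3  VALUE 0.
       01  SENDER-INFO  PIC X(1).
      * ...
      * KEY-VALUE is updated on return, so reset it each time.
           MOVE "0" TO KEY-VALUE
           CALL "QRCVDTAQ" USING DTAQ-NAME, DTAQ-LIB,
                                 DATA-LENGTH, DTAQ-ENTRY, WAIT-TIME,
                                 KEY-ORDER, KEY-LENGTH, KEY-VALUE,
                                 SENDER-LEN, SENDER-INFO
           EVALUATE KEY-VALUE
               WHEN "0"  PERFORM END-NOW-PARA
               WHEN "5"  PERFORM PROCESS-ENTRY-PARA
               WHEN "9"  PERFORM END-AFTER-PARA
           END-EVALUATE
```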

You might also want to consider designing a throughput monitor into the
program so it can determine whether entries are arriving faster than it
can handle them, and if so submit a new copy of itself.  Once it has
caught up, the extra copies could simply shut down.
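
Submitting the extra copy from within COBOL can be done by running
SBMJOB through the QCMDEXC API.  A sketch (program, library, and job
names are invented; QCMDEXC takes the command string and its length as
packed 15,5):

```cobol
       01  CMD-STRING   PIC X(80)  VALUE
           "SBMJOB CMD(CALL PGM(MYLIB/DTAQRCV)) JOB(DTAQRCV2)".
      * QCMDEXC expects the command length as packed 15,5.
       01  CMD-LENGTH   PIC S9(10)V9(5) COMP-3 VALUE 80.
      * ...
           CALL "QCMDEXC" USING CMD-STRING, CMD-LENGTH
```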

You are limited only by your imagination and coding ability.

Regards,
Simon Coulter.


 FlyByNight Software         AS/400 Technical Specialists       
 Eclipse the competition - run your business on an IBM AS/400.  
                                                                
 Phone: +61 3 9419 0175      Mobile: +61 0411 091 400           
 Fax:   +61 3 9419 0175      mailto: shc@flybynight.com.au      
                                                                
 Windoze should not be open at Warp speed.                      

+---
| This is the COBOL/400 Mailing List!
| To submit a new message, send your mail to COBOL400-L@midrange.com.
| To subscribe to this list send email to COBOL400-L-SUB@midrange.com.
| To unsubscribe from this list send email to COBOL400-L-UNSUB@midrange.com.
| Questions should be directed to the list owner/operator: david@midrange.com
+---END

