> I am trying to determine the best way to pass the selections to the batch
> submission.  Should I use a USER SPACE, or a DATA QUEUE, or a uniquely named
> file with the selections (different field sizes, types)?

Depends on how large the data is and what the project's requirements are.
How many bytes of data are you expecting to send?

How important is performance?

> I have pondered a monstrous, structured parameter string, but I have heard
> such things are resource intensive.

What are you comparing it to?  Sure, passing a parameter via SBMJOB
requires more resources than passing it to a program in the same job.  But
by comparison to creating/using a file?!  Parameters use only a tiny
fraction of the resources that a file would use.

The best performance comes from never-ending batch jobs.  They sit there
all the time waiting for data to arrive on a data queue, and they create
the report as soon as an entry shows up.  You can have more than one job
waiting on the same queue if you want to process many simultaneous
requests quickly -- the first available job picks up the work.  This is
much more efficient because each job only has to start once, and starting
a job requires quite a bit of work.
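As a rough sketch, a never-ending job could be as simple as the CL below.
The queue name RPTQ, library QGPL, and report program BLDRPT are all made
up for illustration; QRCVDTAQ with a wait time of -1 waits forever for
the next entry:

```cl
/* Hypothetical never-ending batch job: submit it once and it     */
/* loops forever, waiting on data queue RPTQ for report requests. */
PGM
  DCL VAR(&FLDLEN) TYPE(*DEC)  LEN(5 0)
  DCL VAR(&DATA)   TYPE(*CHAR) LEN(256)
  DCL VAR(&WAIT)   TYPE(*DEC)  LEN(5 0) VALUE(-1)

LOOP:
  /* Wait time of -1 = wait forever for the next entry */
  CALL PGM(QRCVDTAQ) PARM('RPTQ' 'QGPL' &FLDLEN &DATA &WAIT)

  /* &DATA now holds one report request; go build the report */
  CALL PGM(BLDRPT) PARM(&DATA)
  GOTO CMDLBL(LOOP)
ENDPGM
```

You can submit several copies of this same job; they will all wait on the
same queue, and whichever one is idle receives the next entry.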

However, if you're running a query of some sort in the report, the time it
takes to submit a job will be (relatively) inconsequential.  Creating an
access path and filtering records out of a large file takes a TON of
resources...   in that case, you may not want to spend the extra time and
expense writing something as complex as the "never-ending job" scenario.

If the data to be passed is small enough to use parameters, that would be
your next best bet.  The downside to using parms is that you have to
submit a new job for every report.   But parms are much more efficient
than creating a file or a data queue, so if you're going to submit a new
job anyway, you might as well use them.   Make sure you take into account
the standard "gotcha" for passing parms longer than 32 bytes, though:
SBMJOB passes character values as literals, and literals longer than 32
bytes are passed at their exact length, so the receiving program can see
garbage beyond the end of the value.

A user space is also a good option if your data is going to be large.  You
could pass the name of the user space as a parameter or data queue entry
(if you're using never-ending jobs to create the reports) so that you can
generate a new user space for each instance of the report you want to run.
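A sketch of that approach in CL, using the QUSCRTUS API (the space name
RPT000001 and program RPTPGM are made-up examples).  Note that the space
has to live in a real library such as QGPL -- the submitted job gets its
own QTEMP, so a space created in QTEMP would not be visible to it:

```cl
PGM
  /* Qualified user space name: 10-char name + 10-char library */
  DCL VAR(&USRSPC) TYPE(*CHAR) LEN(20) VALUE('RPT000001 QGPL')
  DCL VAR(&SIZE)   TYPE(*INT)  LEN(4)  VALUE(65535)
  DCL VAR(&INIT)   TYPE(*CHAR) LEN(1)  VALUE(X'00')
  DCL VAR(&AUT)    TYPE(*CHAR) LEN(10) VALUE('*ALL')
  DCL VAR(&TEXT)   TYPE(*CHAR) LEN(50) VALUE('Report selections')

  /* Create the space, fill it (e.g. with QUSCHGUS), then hand */
  /* just its 20-byte qualified name to the batch job          */
  CALL PGM(QUSCRTUS) PARM(&USRSPC 'DATA' &SIZE &INIT &AUT &TEXT)
  SBMJOB CMD(CALL PGM(RPTPGM) PARM(&USRSPC)) JOB(REPORT)
ENDPGM
```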

If the data is going to be VERY large (over 16 MB, the maximum size of a
user space), you might get stuck using a PF or a stream file to pass the
data.  But I can't imagine query parms being that large.




This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].
