
Let me just make it clear that this thread has dealt almost exclusively with SCS and
other non-3812 spool files. 'Traditional' spool files are certainly of
this simple type, and they are also the easiest to handle (*IPDS and *USERASCII
are much more difficult).

As stated, the default values for CPYSPLF ignore blank lines.

This thread contained a tip on using CPYSPLF with *FCFC. But as
I read the tip, the extra control byte in position 1 of
each record was intended to be handled by a user-written program. For that
purpose *PRTCTL is easier: <quote>*PRTCTL:  Specifies
that the first four characters of every record contain skip- and space-before
values useful in HLL (high-level language)
programs.  This code can be viewed as SSSL, where SSS is the skip-before line
value and L is the space-before value.  SSS can
range from 001 through 255 to cause a skip to the specified line.  Once there,
L can be used to specify a spacing of 0, 1, 2, or 3
lines before printing of the record begins.  When one part of the code is used
(SSS or L), the other part is left blank. </quote>
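To illustrate the quoted rule, here is a minimal sketch (in Python rather than an AS/400 HLL, and with a hypothetical function name) of how a user-written program could interpret the four-character SSSL prefix on records copied with CTLCHAR(*PRTCTL):

```python
def parse_prtctl(record: str):
    """Split a *PRTCTL record into its control values and data.

    The first four characters form SSSL: either SSS (skip-before line,
    001-255) is used and L is blank, or SSS is blank and L (space-before,
    0-3) is used.
    """
    ctl, data = record[:4], record[4:]
    sss, l = ctl[:3], ctl[3:4]
    skip_to = int(sss) if sss.strip() else None   # skip to this line first
    space = int(l) if l.strip() else None         # lines to space before printing
    return skip_to, space, data
```

For example, `parse_prtctl("005 HEADING")` reports a skip to line 5, while `parse_prtctl("   1DETAIL")` reports a space-before of 1 line.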

However, there is an easier way to do this (and _this_ time using *FCFC):

CPYSPLF CTLCHAR(*FCFC)
Call my_splitting_program to write each part to a PF

Each 'part' can then be printed (either within my_splitting_program, as
suggested, or in any other way) without a print program, by:

OVRPRTF CTLCHAR(*FCFC)
CPYF TOFILE(QPRINT or QSYSPRT)

In this way you don't even need to know the meaning of the single control
character generated by *FCFC.
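A minimal sketch of such a splitting program (Python, names hypothetical): the only *FCFC detail it relies on is that the ANSI forms-control character '1' in position 1 means "skip to a new page", so records can be grouped into pages without decoding any other control values. The control byte is kept on every record so each part can still be printed via CTLCHAR(*FCFC):

```python
def split_fcfc_pages(records):
    """Group *FCFC records into pages.

    Each record begins with a one-byte ANSI forms-control character;
    '1' starts a new page.  Returns a list of pages, each a list of
    records with their control bytes intact.
    """
    pages, page = [], []
    for rec in records:
        if rec[:1] == "1" and page:   # '1' = skip to new page
            pages.append(page)
            page = []
        page.append(rec)
    if page:
        pages.append(page)
    return pages
```

The report's own page breaks (or the "definite spot" mentioned below) then tell you which pages belong to which recipient.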

This method might even retain a single text attribute (bold), because bold
is sometimes represented as double-strike (the same text printed again
with space-before = 0). In the old impact-printer days it was even
physically produced that way. I don't know; experiments will
show.

Henrik
http://hkrebs.dk

> From: Wills, Mike N. (TC) [mailto:MNWills@taylorcorp.com]
> Sent: Thursday, September 13, 2001 10:52 AM
> To: 'midrange-l@midrange.com'
> Subject: Break up spool file based on information within
>
>
> I have an interesting request, and I am not sure of the best way to do this.
>
> Brief History:
> We use a purchased product for our accounting systems. One of the reports in
> this software creates a report that we need to break up for distribution to
> other people. We don't want them to see everything, just the part that they
> are supposed to see. The report is laid out that there is a definite spot
> where you can tell where to break it apart. However, we are currently doing
> this manually.
>
> My questions:
> What I want to know is if anyone knows of an easy way to break this spool
> file apart automatically, without purchased products? We are willing to
> write a program to duplicate what the report does, in order to break this
> apart. But I am wondering if a program could be written to read through the
> report and copy the parts that we want out of the created spool file. We do
> have Brad's SPLTOOL to use (because these will be created into PDF files).
>
> Could this be done? Copy the spool file to a physical file, start reading
> through it. Since I know what positions to look at, use that in combination with
> the page number and break it apart that way using SPLTOOL? Any thoughts?
>
> Thanks for any advice...
>
> Mike Wills




This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature.