Our SPL2SPL (Duplicate Spooled File) software handles files over 16MB.

http://www.bvstools.com/spl2spl.html

It also allows you to change attributes of the new spooled file, such as
output queue, owner, form type, user data, etc.

Brad
www.bvstools.com

On Mon, Aug 3, 2015 at 7:20 AM, Kevin Bucknum <Kevin@xxxxxxxxxxxxxxxxxxx>
wrote:

I've tested it on some very large spool files and it works fine. This is
one that got processed this weekend.
Spooled file size (K) . . . . . . . . : 149528

It only reads one buffer at a time, and replaces the user space every
time. It's slower than doing it 16MB at a time, but most of my jobs
using this are batch, so I'm wary of over-optimizing it at this point.
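In outline, the flow is something like this (free-form RPG sketch only;
createUserSpace, getNextSplfBuffer, and putSplfBuffer are placeholder
wrappers standing in for the real QUSCRTUS/QSPGETSP/QSPPUTSP parameter
groups, and the space and library names are made up):

  // One reusable *USRSPC: each get replaces its contents, so the
  // copy never needs more than a single 16MB user space.
  createUserSpace('SPLCPY    MYLIB');        // placeholder wrapper over QUSCRTUS

  dow getNextSplfBuffer('SPLCPY    MYLIB');  // placeholder: one QSPGETSP call;
                                             // returns *OFF when no buffers remain
    putSplfBuffer('SPLCPY    MYLIB');        // placeholder wrapper over QSPPUTSP
  enddo;

  deleteUserSpace('SPLCPY    MYLIB');        // placeholder wrapper over QUSDLTUS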




Kevin Bucknum
Senior Programmer Analyst
MEDDATA/MEDTRON
Tel: 985-893-2550

-----Original Message-----
From: MIDRANGE-L [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of
paul.roy@xxxxxxx
Sent: Friday, July 31, 2015 4:06 PM
To: Midrange Systems Technical Discussion
Subject: Re: Duplicating large spoolfiles (greater than 16m)

Hi Kevin,

I had a look at your code... I think it does not handle spooled files
larger than 16MB (the maximum size of a single user space).
The logic would be to use QSPGETSP and store the data in multiple
user spaces (create a new user space if you get CPF3CAA), then read all
the user spaces and use QSPPUTSP to create the new spooled file.
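The shape of it is something like this (just a sketch, not my program;
the helper names are placeholders for the real QUSCRTUS/QSPGETSP/QSPPUTSP
parameter groups, and it assumes the wrappers pass a zero "bytes provided"
error code so CPF3CAA comes back as an escape message that MONITOR can trap):

  dcl-s i int(10) inz(1);
  dcl-s j int(10);
  dcl-s done ind inz(*off);

  // Phase 1: pull the spooled file into as many 16MB user spaces as needed.
  createUserSpace(spaceName(i));          // placeholder wrapper over QUSCRTUS
  dow not done;
    monitor;
      done = getSplfData(spaceName(i));   // placeholder wrapper over QSPGETSP
    on-error;                             // assumed CPF3CAA: this space is full
      i += 1;
      createUserSpace(spaceName(i));      // start the next space and continue
    endmon;
  enddo;

  // Phase 2: replay the user spaces, in order, into the new spooled file.
  for j = 1 to i;
    putSplfData(spaceName(j));            // placeholder wrapper over QSPPUTSP
  endfor;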

I wrote a program to do this some years ago... but unfortunately I am
not authorized to share the source code, as it is not "open source"...

Paul




From: "Kevin Bucknum" <Kevin@xxxxxxxxxxxxxxxxxxx>
To: <midrange-l@xxxxxxxxxxxx>
Date: 31/07/2015 19:46
Subject: Duplicating large spoolfiles (greater than 16m)
Sent by: "MIDRANGE-L" <midrange-l-bounces@xxxxxxxxxxxx>



This comes up every now and then, and several options are always
presented, but I've never been able to find a complete program using the
APIs. Based on something that Tommy Holden posted at

http://webcache.googleusercontent.com/search?q=cache:HkiFuHDAUvoJ:iprodeveloper.com/forums/aft/135681+&cd=2&hl=en&ct=clnk&gl=us

I came up with this.

http://code.midrange.com/0ae8a6e14f.html



The original post had copied in a spec for SPLA0100 instead of SPLA0200,
and I also added the ability to change the hold and save parameters,
since we used those with DUPCHGSPLF, which doesn't handle files greater
than 16MB.
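
For anyone hitting this later: SPLA0200 is the format that hands back the
internal job and spooled file identifiers that QSPGETSP wants. A minimal
sketch of that call (the QUSRSPLA parameter list below matches the
documented API as I understand it, but the SPLA0200 offsets are my reading
of the format and worth double-checking against the API documentation;
the spooled file name is just an example):

  dcl-pr QUSRSPLA extpgm('QUSRSPLA');
    receiver char(65535) options(*varsize);
    receiverLen int(10) const;
    format char(8) const;
    qualJobName char(26) const;
    intJobId char(16) const;
    intSplfId char(16) const;
    splfName char(10) const;
    splfNbr int(10) const;
    errorCode char(256) options(*varsize);
  end-pr;

  dcl-ds spla0200 qualified;
    bytesRtn int(10);
    bytesAvl int(10);
    intJobId char(16);      // internal job identifier (assumed offset 8)
    intSplfId char(16);     // internal spooled file identifier (assumed offset 24)
  end-ds;

  dcl-ds errCode qualified;
    bytesProv int(10) inz(0);   // 0 = let errors come back as escape messages
    bytesAvail int(10);
  end-ds;

  // '*' = spooled file is in the current job; 0 = *LAST file by that name.
  QUSRSPLA(spla0200 : %size(spla0200) : 'SPLA0200' :
           '*' : *blanks : *blanks : 'QSYSPRT' : 0 : errCode);

  // spla0200.intJobId and spla0200.intSplfId then feed QSPGETSP.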



Just adding the below for people searching for a solution:

CPA0702

QSPGETSP

QSPPUTSP



