/* global */
DCL VAR(&TXTDIR) TYPE(*CHAR) LEN(20) VALUE('/home/dir1/')
DCL VAR(&TXTNAM) TYPE(*CHAR) LEN(20)
DCL VAR(&TXTFQN) TYPE(*CHAR) LEN(40)
/* in the loop */
CHGVAR VAR(&TXTNAM) VALUE(&FILE_NAMES |< '.TXT')
CHGVAR VAR(&TXTFQN) VALUE(&TXTDIR |< &TXTNAM)
CPYTOIMPF <your parms> +
TOSTMF(&TXTFQN)
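For reference, `|<` in the snippet above is the symbolic form of the *TCAT operator, which trims trailing blanks from the left operand before concatenating; that is what lets fixed-length CL variables build a clean path. A minimal sketch for comparison (variable names here are illustrative, not from the posted program):

```cl
/* The three CL concatenation operators:                         */
/*   ||  (*CAT)  - concatenate as-is, keeping trailing blanks    */
/*   |>  (*BCAT) - trim trailing blanks, insert one blank        */
/*   |<  (*TCAT) - trim trailing blanks, no separator            */
DCL VAR(&NAME) TYPE(*CHAR) LEN(10) VALUE('ORDERS')
DCL VAR(&PATH) TYPE(*CHAR) LEN(40)
/* &NAME holds 'ORDERS    '; *TCAT drops the trailing blanks,    */
/* so &PATH becomes '/home/dir1/ORDERS.TXT'                      */
CHGVAR VAR(&PATH) VALUE('/home/dir1/' |< &NAME |< '.TXT')
```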
Roger Harman
COMMON Certified Application Developer – ILE RPG on IBM i on Power
OCEAN User Group – Vice-President, Membership (2014)

Date: Thu, 8 Jan 2015 11:21:35 -0600
Subject: Re: How to make one file into many and download into CSV TXT files
From: thomas.burrows.1957@xxxxxxxxx
To: midrange-l@xxxxxxxxxxxx
Want to thank everyone for the input. I ended up creating a separate file
with the variable value that the records were being grouped by, plus the
starting and stopping positions for those values. Then, using a CL program,
I read that file, doing a CRTDUPOBJ naming each file from the variable
value in the file, followed by a CPYF that copied just those records into
the newly created file.
Then did a CPYTOIMPF to copy the duplicated files (which only contain the
records with the same variable value) to the IFS.
All works pretty slick except for one little thing: I am not sure how to
change the TOSTMF to pick up the various values of my created files.
Have included the CL code.
PGM
DCL VAR(&NUM) TYPE(*DEC) LEN(4 0) VALUE(0)
DCLF FILE(FILE_NAME)
READ: RCVF
MONMSG MSGID(CPF0864) EXEC(GOTO CMDLBL(TAG_END))
CRTDUPOBJ OBJ(SAW_DATA) FROMLIB(FRNTST031) +
OBJTYPE(*FILE) TOLIB(QTEMP) +
NEWOBJ(&FILE_NAMES)
MONMSG MSGID(CPF2130)
CPYF FROMFILE(FRNTST031/SAW_DATA) +
TOFILE(QTEMP/&FILE_NAMES) +
MBROPT(*REPLACE) FROMRCD(&STR_POS) +
TORCD(&FIN_POS)
CPYTOIMPF FROMFILE(QTEMP/&FILE_NAMES) +
TOSTMF('/home/dir1/FILE_NAMES.TXT') +
MBROPT(*REPLACE) STMFCODPAG(*PCASCII) +
RCDDLM(*CR)
GOTO CMDLBL(READ)
TAG_END: ENDPGM
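Combining the snippet at the top of the thread with this program, the loop might look like the following. This is only a sketch: the declares, file names, and field names (&FILE_NAMES, &STR_POS, &FIN_POS) are taken from the posted code, and &TXTDIR/&TXTFQN are the helper variables from the reply above.

```cl
PGM
DCL VAR(&TXTDIR) TYPE(*CHAR) LEN(20) VALUE('/home/dir1/')
DCL VAR(&TXTFQN) TYPE(*CHAR) LEN(40)
DCLF FILE(FILE_NAME)
READ: RCVF
MONMSG MSGID(CPF0864) EXEC(GOTO CMDLBL(TAG_END))
CRTDUPOBJ OBJ(SAW_DATA) FROMLIB(FRNTST031) +
  OBJTYPE(*FILE) TOLIB(QTEMP) NEWOBJ(&FILE_NAMES)
MONMSG MSGID(CPF2130)
CPYF FROMFILE(FRNTST031/SAW_DATA) TOFILE(QTEMP/&FILE_NAMES) +
  MBROPT(*REPLACE) FROMRCD(&STR_POS) TORCD(&FIN_POS)
/* Build the stream-file name from the CL variable, then  */
/* pass the variable to TOSTMF instead of a literal.      */
CHGVAR VAR(&TXTFQN) VALUE(&TXTDIR |< &FILE_NAMES |< '.TXT')
CPYTOIMPF FROMFILE(QTEMP/&FILE_NAMES) TOSTMF(&TXTFQN) +
  MBROPT(*REPLACE) STMFCODPAG(*PCASCII) RCDDLM(*CR)
GOTO CMDLBL(READ)
TAG_END: ENDPGM
```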
The part I cannot figure out is how to pass a variable into
TOSTMF('/home/dir1/file_names.txt') so that "file_names" is not static and
can change with whatever value is in the CL variable &FILE_NAMES. As I said,
everything works except for this one part.
Thomas
On Mon, Jan 5, 2015 at 7:04 PM, Alan Campin <alan0307d@xxxxxxxxx> wrote:
I have my own API wrappers that I use that I could send you. They wrap the
Unix APIs to make them simple to use. I can include simple examples of
programs that open a text file and write to it.
On Mon, Jan 5, 2015 at 5:15 PM, Thomas Burrows <
thomas.burrows.1957@xxxxxxxxx> wrote:
Have to generate a file where we take some information and mix it in with
some set literals. Then there will be a separate record in the TXT file for
each record in the PF file.
Have never done the create-TXT-file process before. Do you have any
examples?
On Mon, Jan 5, 2015 at 5:52 PM, Alan Campin <alan0307d@xxxxxxxxx> wrote:
So if I am understanding you correctly, you are level breaking each time
that the M2RUN value changes, or do you call the program with M2RUN as a
parameter? In other words, are all the M2RUN values run each time?
Seems easy enough to break at each M2RUN value and open a new text file,
write the data straight to it, then close it and open a new file for the
next one. One simple solution might be to run a Group By on the M2RUN value
and pass each group to a procedure. Each procedure would open a text file,
write perhaps all the records for a given M2RUN value, and then close the
text file.
What type of manual massaging do you need to do?
On Mon, Jan 5, 2015 at 4:16 PM, Thomas Burrows <
thomas.burrows.1957@xxxxxxxxx> wrote:
To answer Booth's questions.
Up to 10,000 records in the original file.
At present that data would be in groups of up to 60 different values.
There could be one record with an individual M2RUN value, or up to 250
records with the same M2RUN value.
There is no standardized list of sorted categories.
End result would be 50 to 100 CSV files.
For Alan's question: there is only one field this file is sorted on, M2RUN.
All other values can be different, but they tend to be the same.
We are building files to input into a computer-directed SAW to cut pieces
of wood to specific lengths.
On Mon, Jan 5, 2015 at 4:55 PM, Booth Martin <booth@xxxxxxxxxxxx> wrote:
Can you add some texture to this? I am not clear as to what is happening.
-How many records are typically in the original file?
-Why is the file being sorted?
-Is the end result one .csv file or 50 to 100 .csv files?
-Are there 50 to 100 sort fields, or 50 to 100 records per sort category?
-Is there a standardized list of 100+ sort categories that do not normally
change from day to day (like sorting by a list of store locations, as
opposed to sorting by date or by volume)?
On 1/5/2015 2:56 PM, Thomas Burrows wrote:
Hi:
Is there a quick way to take one file that is sorted and break it into many
files according to the sort value? Then quickly download to a CSV TXT file.
Know how to do this in general, but each day there will be an unknown
number of individual sort fields. Usually under fifty, but could be maybe
75 to 100.
Hoping for a quicker solution than the rather hard solution I am thinking
of.
Thomas
--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.
This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].