It appears you're pulling a character value from the entry_data column. The challenge is when you try to pull a numeric value, such as a decimal or integer data type, from that column.
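If your IBM i level has the newer Db2 for i INTERPRET scalar function (added in a recent Technology Refresh, so it may not be available on every system), that can read packed or binary values straight out of the entry data. A rough sketch only; the offset, length, and DECIMAL(7,0) type are made-up placeholders for the journaled file's real layout, and 'MYLIB'/'MYJRN' are placeholder journal names:

  SELECT INTERPRET(
           SUBSTR(CAST(entry_data AS CHAR(5000) FOR BIT DATA), 21, 4)  -- hypothetical packed field at offset 21
           AS DECIMAL(7, 0)) AS some_amount
    FROM TABLE(QSYS2.DISPLAY_JOURNAL('MYLIB', 'MYJRN')) X;             -- placeholder journal library/name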
-----Original Message-----
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Mitchell, Dana
Sent: Thursday, May 2, 2019 4:09 PM
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxxxxxxxx>
Subject: RE: DISPLAY_JOURNAL and ENTRY_DATA parsing using pure SQL
Using ACS Run SQL Scripts, have a look at the example titled 'Password Failures over the last 24 hours'. I'm not much of an SQL guy, but it looks like that example is doing what you want to do:
CREATE OR REPLACE VIEW DANA.Password_Failures_24hrs
  FOR SYSTEM NAME PW_LAST24 (TIME, JOBNAME, USERNAME, IPADDR) AS
  SELECT ENTRY_TIMESTAMP,
         JOB_NUMBER CONCAT '/' CONCAT RTRIM(JOB_USER) CONCAT '/' CONCAT RTRIM(JOB_NAME) AS JOB_NAME,
         RTRIM(CAST(SUBSTR(entry_data, 2, 10) AS VARCHAR(10))),
         REMOTE_ADDRESS
    FROM TABLE (
           qsys2.display_journal('QSYS', 'QAUDJRN',                        -- Journal library and name
                                 STARTING_RECEIVER_NAME => '*CURAVLCHN',
                                 journal_entry_types => 'PW',              -- Journal entry types
                                 starting_timestamp => CURRENT TIMESTAMP - 24 HOURS -- Time period
           )
         ) X;
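Once the view exists, you query it like any other table, e.g.:

  SELECT TIME, USERNAME, IPADDR
    FROM DANA.Password_Failures_24hrs
   ORDER BY TIME DESC;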
Dana
-----Original Message-----
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Darren Strong
Sent: Wednesday, May 1, 2019 12:34 PM
To: midrange-l General Questions (midrange-l@xxxxxxxxxxxxxxxxxx) <midrange-l@xxxxxxxxxxxxxxxxxx>
Subject: DISPLAY_JOURNAL and ENTRY_DATA parsing using pure SQL
I developed a hybrid RPG and SQL program to query IBM's DISPLAY_JOURNAL SQL UDTF. If you've used it, you know that the ENTRY_DATA column comes back as an incomprehensible BLOB. I've used RPG to overlay this BLOB onto a data structure and write it out to a file whose structure matches the journaled file, so the data can be viewed. What I'm wondering is whether there is an SQL way to map this data onto a file and/or structure, so that I could build this process as a view rather than an outfile and further reduce unnecessary file I/O. If it were a view, then I imagine the program would just create the view, in say QTEMP, and I could query the fields as desired before the data was actually written anywhere.
The column I'm trying to parse is currently coded as follows:
cast(entry_data as char(5000) for bit data)
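Conceptually, what I picture is a view along these lines, where the field names, offsets, lengths, and the 'MYLIB'/'MYJRN' journal are just placeholders for the journaled file's real record layout:

  CREATE VIEW QTEMP.PARSED_ENTRIES AS
    SELECT ENTRY_TIMESTAMP,
           CAST(SUBSTR(CAST(entry_data AS CHAR(5000) FOR BIT DATA), 1, 10) AS CHAR(10)) AS FIELD_A,   -- placeholder field
           CAST(SUBSTR(CAST(entry_data AS CHAR(5000) FOR BIT DATA), 11, 25) AS CHAR(25)) AS FIELD_B   -- placeholder field
      FROM TABLE(QSYS2.DISPLAY_JOURNAL('MYLIB', 'MYJRN')) X;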