     As far as the join option goes (Join with default values), the SQL 
     equivalent is a LEFT OUTER JOIN, which uses the following syntax:
     
     SELECT xxx,xxx,xxxx FROM library/file1 F1 LEFT OUTER JOIN 
     library/file2 F2 ON (F1.xxx = F2.xxx AND F1.xxxx = F2.xxxx)
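     
     Note that Query/400's "join with default values" fills the unmatched 
     secondary fields with blanks/zeros rather than nulls, so if you want 
     that same appearance you can wrap the secondary columns in COALESCE 
     (or VALUE on older releases).  A sketch using the same placeholder 
     names (F2.yyyy is just another made-up column):
     
     SELECT F1.xxx, COALESCE(F2.xxxx, 0), COALESCE(F2.yyyy, ' ') 
     FROM library/file1 F1 LEFT OUTER JOIN 
     library/file2 F2 ON (F1.xxx = F2.xxx AND F1.xxxx = F2.xxxx)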
     
     hth
     eric.delong@pmsi-services.com


______________________________ Reply Separator _________________________________
Subject: QM/400 Conversion 
Author:  <MIDRANGE-L@midrange.com> at INET_WACO
Date:    6/13/00 10:54 AM


I have gone through the 28 (as of today) Query/400 queries that are run as 
part of the end-of-month process and converted them to QM/400 objects.  It 
wasn't as bad a job as I was fearing, based on comments on this list, but it 
wasn't a piece of cake either.
     
The major dilemma now is what to do with the six queries that create a 
file.  For the present, I just run the Query/400 versions and use QM/400 for 
the reporting function.  
     
What I would like to know is what options I have, because these six queries 
have variables that change (from/thru dates, accounting period, etc.) with 
each run.  
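
One option, sketched here with made-up names, is a QM query that uses 
substitution variables.  STRQMQRY can supply the values at run time through 
SETVAR and, if I read the command right, send the result set to a file 
through OUTPUT(*OUTFILE) OUTFILE(), so only the from/thru values change 
each month.  The query source would look something like this (assuming a 
numeric CCYYMMDD date field):

     SELECT acct, trndt, amount 
     FROM mylib/salehist 
     WHERE trndt BETWEEN &FROMDATE AND &THRUDATE 
     ORDER BY acct, trndt

and it would be run with something like STRQMQRY QMQRY(MYLIB/MTHEND01) 
OUTPUT(*OUTFILE) OUTFILE(MYLIB/MTHENDP) SETVAR((FROMDATE '20000501') 
(THRUDATE '20000531')).  It would be worth prompting the command on your 
release before counting on the exact parameter names.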
     
OPNQRYF is one option, but I'm not very fluent with that tool.  I would also 
guess a person could use an RPG program to "extract" the data needed, but 
then the data needs to be sorted.  Somewhere there is an example of creating 
DDS from the required "variables", which is then compiled to create a logical 
file.
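
The extract-and-sort idea could probably also be done in plain SQL, without 
an RPG program or a compiled logical file, if this release lets a QM query 
run an INSERT: clear a work file (CLRPFM is enough) and then repopulate it.  
A rough sketch, with the work file and columns as placeholders:

     INSERT INTO mylib/mthwrk 
     SELECT acct, trndt, amount 
     FROM mylib/salehist 
     WHERE trndt BETWEEN &FROMDATE AND &THRUDATE

The rows land in arrival sequence, so the reporting query still needs its 
ORDER BY; the work file itself does not have to be keyed.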
     
The easiest solution is to leave the queries as they are and manually change 
the variables before running the month-end reports!  BUT you take the chance 
of an error because you are doing it manually, so is it really the "better" 
alternative?
     
I have a question related to QM/400 and editing dates that are not defined 
as a date data type.  BPCS version 6.0.02 dates are a real mess (I'm being 
kind when I say that), and the only thing I have found is to not edit them, 
so the field is displayed as 20000531.  Not very elegant or user friendly.  
Is there a better solution for this problem?  I have changed some of the 
major files to redefine the date fields as date data types, but that also 
has its risks.
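
One thing that might dress the numbers up inside the query itself, without 
touching the files, is to build the display value in the select list.  A 
sketch assuming an 8-digit CCYYMMDD numeric field called ORDDTE (BPCS field 
names and layouts vary, so check the actual file first):

     SELECT SUBSTR(DIGITS(orddte),5,2) || '/' || 
            SUBSTR(DIGITS(orddte),7,2) || '/' || 
            SUBSTR(DIGITS(orddte),1,4) AS orddte_mdy 
     FROM mylib/salehist

If a true date is needed (for arithmetic, or for editing through the QM 
form), the same pieces can be joined with dashes and fed to DATE() instead, 
at the cost of an error on any record carrying a zero or garbage date.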
     
I have a question related to QM/400 and what people do to "emulate" 
Query/400 Option 2 of matched records, i.e., always take the primary file 
record even though there is no match on the secondary file.  I only have one 
of these to deal with, but how do I deal with QM/400 not having that 
capability?
     
Looking for some thoughts from the list.  Thanks for your input.
     
Dennis Munro
Badger Mining Corporation
dmunro@badgerminingcorp.com
(920) 361-2388 
     
