Hi!

There was a discussion about the 'best' way to do this in the DB2 mailing
list. I have not yet made a compilation of the contributions. From the
list's info:

> Contributions sent to this list are automatically archived. You can get a
> list of the available archive files by sending an "INDEX DB2-L" command
> to LISTSERV@AMERICAN.EDU. You can then order these files with a "GET
> DB2-L LOGxxxx" command, or using LISTSERV's database search facilities.
> Send an "INFO DATABASE" command for more information on the latter.

HTH,
-----------------------------------------------------------------
Dipl.-Ing. Rudolf Wiesmayr       Tel: +43 - 0732 / 7070 - 1720
Magistrat Linz, ADV/AE           Fax: +43 - 0732 / 7070 - 1555
Gruberstrasse 40-42              mailto://Rudolf.Wiesmayr@mag.linz.at
A-4041 Linz                      IBMMAIL: at3vs7vs@ibmmail.com
------ http://www.linz.at <<<--- Digital City Linz, Austria -----
- City of Linz: awarded by 'Speyerer Qualitaetswettbewerb 1996' -

-----Original Message-----
From: Walden Leverich [SMTP:walden@techsoftinc.com]
Sent: Friday, November 14, 1997 6:24 PM
To: Midrange List
Subject: SQL Existence Check

I would like to check for the existence of a record in a file using SQL.
There could be 0, 1, or more than 1 records that match the selection
criteria; I only care whether records exist, not how many. I know that SQL
has the EXISTS predicate, but I cannot use it by itself; it must appear in
the WHERE clause of a statement. I realize that I could do a SELECT
COUNT(*) WHERE ..., but this would require DB2/400 to read all the
matching records in order to count them. In my case, as soon as DB2 finds
one record it can stop looking. Any suggestions?

-Walden

PS. Yes, I know a single CHAIN or SETLL would accomplish this. I am
looking for an SQL solution.
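
For illustration, here is a minimal sketch of the two approaches mentioned
in the post. The ORDERS file and CUSTNO field are placeholder names, not
from the original message, and the one-row table SYSIBM.SYSDUMMY1 is
assumed to be available; any one-row table would serve the same purpose.
Whether the optimizer actually stops at the first match for the EXISTS
form is exactly the open question Walden raises.

-- Approach 1: count the matching rows. As noted in the post, this
-- forces DB2 to read every matching record just to produce the count.
SELECT COUNT(*)
  FROM ORDERS
 WHERE CUSTNO = 12345

-- Approach 2: use EXISTS inside the WHERE clause of another statement,
-- here a SELECT against an assumed one-row dummy table. The query
-- returns one row if a match exists and no rows otherwise.
SELECT 1
  FROM SYSIBM.SYSDUMMY1
 WHERE EXISTS (SELECT 1
                 FROM ORDERS
                WHERE CUSTNO = 12345)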