|
I like CTEs because I can split a (very) complex SQL statement into
smaller parts, and I can test each CTE on its own.
If you start nesting sub-selects, by the third level of nesting
you cannot read the query anymore ... and then try to
test such a sub-select.
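A minimal sketch of that testing workflow, using Python's sqlite3 as a stand-in for Db2 for i (the table and column names here are illustrative, not from the original post): the CTE body is kept as its own string, run and checked in isolation, and only then wrapped in WITH and joined.

```python
import sqlite3

# Hypothetical order-detail table, just for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE OrderDet (did INTEGER, status INTEGER, item TEXT);
INSERT INTO OrderDet VALUES (1, 10, 'a'), (1, 30, 'b'), (2, 20, 'c');
""")

# The CTE body is an ordinary SELECT, so it can be executed and
# verified on its own before being embedded in the larger statement.
cte_body = "SELECT did, MAX(status) AS maxstatus FROM OrderDet GROUP BY did"
rows_cte = con.execute(cte_body).fetchall()
print(rows_cte)

# Once the CTE checks out, the same text is wrapped in WITH and joined
# back to the base table to keep only each did's highest-status row.
full_query = f"""
WITH x AS ({cte_body})
SELECT o.* FROM OrderDet o
JOIN x ON o.did = x.did AND o.status = x.maxstatus
"""
rows_full = con.execute(full_query).fetchall()
print(rows_full)
```

The point is that the inner query never has to be mentally "un-nested" to debug it; the exact text you tested is the text the WITH clause runs.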
Mit freundlichen Grüßen / Best regards

Birgitta Hauser

"Shoot for the moon, even if you miss, you'll land among the stars." (Les Brown)
"If you think education is expensive, try ignorance." (Derek Bok)
"What is worse than training your staff and losing them? Not training them and keeping them!"
"Train people well enough so they can leave, treat them well enough so they don't want to." (Richard Branson)
-----Original Message-----
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Vernon Hamberg
Sent: Sunday, September 6, 2020 15:18
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxxxxxxxx>
Subject: Re: SQL Max question
+1 on the OLAP solution - there is so much those functions can do for
us, the more clear examples, the better.
I also like the CTE more - it crossed my mind, not sure why I didn't
mention it, at least - maybe being in the middle of preparing for the
upcoming POWERUp Virtual Conference.
Why do I prefer CTE's? It's easier to see what is happening - nesting
adds some confusion, trying to separate THAT SELECT statement from the
one surrounding it, well, that can be a challenge.
Gotta love it! So many different ways to do things, right? One motto
of the Perl language contingent - TMTOWTDI - There's More Than One Way
To Do It!
Vern
On 9/6/2020 7:42 AM, Birgitta Hauser wrote:
You can do it with a nested sub-select, but I prefer a Common Table
Expression:

With x as (Select #DID, Max(Status) MaxStatus
             from YourTable
             Group by #DID)
Select a.*
  from YourTable a join x on a.#DID = x.#DID
                         and a.Status = x.MaxStatus;
You can also get the result by using the ROW_NUMBER() OLAP
specification:

With x as (Select a.*,
                  Row_Number() Over(Partition by #DID
                                    Order By Status Desc) RowNbr
             from YourTable a)
Select *
  from x
  Where RowNbr = 1;
-----Original Message-----
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of Vernon Hamberg
Sent: Sunday, September 6, 2020 06:54
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxxxxxxxx>
Subject: Re: SQL Max question

Hello Art

If I may say back what I think you're saying -

You have a table with 1 row per DID# - that is, DID# is unique. And
status is not in this table.

And you want a result table that has all the columns from the first
one, plus a status column that has the highest status for the DID# in
that row.

OK, I think you are getting close and can get this with a JOIN and a
nested table expression - you want the 2nd table to have the DID# and
its highest status in it - then JOIN to that.

select table1.*, table2max.maxstatus
  from table1
  join (select did#, max(status) maxstatus
          from table2
         group by did#) table2max
    on table1.did# = table2max.did#

Then you can do pretty much what you want - you could create a temp
table using CREATE TABLE, for example, and make it based on this
SELECT statement.
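A runnable sketch of that "materialize the SELECT as a work table" step, again using Python's sqlite3 as a stand-in (table names `table1`/`table2` follow the example above; `did` is used without the `#` because SQLite does not accept it in unquoted identifiers; on Db2 for i the equivalent would be CREATE TABLE ... AS (SELECT ...) WITH DATA):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE table1 (did INTEGER, descr TEXT);
CREATE TABLE table2 (did INTEGER, status INTEGER);
INSERT INTO table1 VALUES (1, 'first'), (2, 'second');
INSERT INTO table2 VALUES (1, 5), (1, 9), (2, 3);
""")

# Materialize the JOIN against the per-did MAX as a work table, so
# reports and CSV extracts can run against a plain table.
con.execute("""
CREATE TABLE report AS
SELECT table1.*, table2max.maxstatus
  FROM table1
  JOIN (SELECT did, MAX(status) AS maxstatus
          FROM table2
         GROUP BY did) table2max
    ON table1.did = table2max.did
""")
rows = con.execute("SELECT * FROM report ORDER BY did").fetchall()
print(rows)
```

Each `table1` row comes out once, carrying the highest status found for its did in `table2`.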
HTH
Vern
On 9/5/2020 10:26 PM, Art Tostaine, Jr. wrote:
There is one row in my first table, keyed by DID. Other tables are
joined into this view, but they are 1-1 by DID. I joined it to a
status file that has DID and 1-20 status rows.
In my new table I want all of the fields but only one row per did
with the highest status.
My goal is to create a temp table that I can run reports, extract
to csv, etc.
I wonder if I could create one row per did and then do an update
with a select from the status file getting only the max row from it.
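That update-from-a-select idea can be done with a correlated subquery. A hedged sketch in Python's sqlite3 (the `master` and `statusfile` names are stand-ins for the tables described above):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE master (did INTEGER PRIMARY KEY, descr TEXT, status INTEGER);
CREATE TABLE statusfile (did INTEGER, status INTEGER);
INSERT INTO master VALUES (1, 'first', NULL), (2, 'second', NULL);
INSERT INTO statusfile VALUES (1, 5), (1, 9), (2, 3);
""")

# One row per did already exists in master; fill its status column
# from the status file, taking only the MAX per did via a
# correlated subquery.
con.execute("""
UPDATE master
   SET status = (SELECT MAX(status)
                   FROM statusfile
                  WHERE statusfile.did = master.did)
""")
rows = con.execute("SELECT * FROM master ORDER BY did").fetchall()
print(rows)
```

The same correlated-subquery UPDATE shape works on Db2 for i, though the JOIN-based CREATE TABLE discussed earlier avoids the separate update pass.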
On Sat, Sep 5, 2020 at 10:36 PM Alan Campin <alan0307d@xxxxxxxxx> wrote:

I am not completely understanding what you are doing here, but could
you respond with what you see the table structure would look like?
Normalizing table structure?
On Sat, Sep 5, 2020 at 6:52 PM Art Tostaine, Jr. <atostaine@xxxxxxxxx> wrote:
I have a table that has a DID#, many other columns, and a numeric
status field. I want the table to have all the columns and one row per
DID# with the highest status value.

I'd like to either do a delete or create another temp table with only
these records. Is this possible?

I've looked at GROUP BY but that won't let me keep all columns in the
table.

I also checked out PARTITION BY but that's not what I want.

Thank you
--
Art Tostaine
--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: https://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxxxxxxxx
Before posting, please take a moment to review the archives at https://archive.midrange.com/midrange-l.
Please contact support@xxxxxxxxxxxxxxxxxxxx for any subscription related questions.
Help support midrange.com by shopping at amazon.com with our affiliate link: https://amazon.midrange.com