Michael,

Thanks. This ended up being much more complicated, and then much simpler, than expected. I did some reading up on the OVER() clause and it clarified for me what was going on. It turns out this CTE had a GROUP BY aggregation clause that was somehow not allowing the OVER() clause to aggregate correctly. I removed the GROUP BY and then everything sorted itself out (no pun intended).
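
For anyone hitting the same snag: window functions are evaluated after GROUP BY, so an OVER() clause ranks the grouped rows, not the original detail rows. A minimal sketch of that evaluation order, using SQLite via Python with made-up sample data rather than the actual DB2 for i query from this thread:

```python
import sqlite3

# Three detail rows, two of them on the same LRDTE date.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (lrdte TEXT, lsdte TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        ("2011-04-17", "2011-04-02", 10),
        ("2011-04-17", "2011-04-02", 20),
        ("2011-04-18", "2011-04-02", 30),
    ],
)

# Without GROUP BY, DENSE_RANK sees all 3 detail rows.
detail = list(conn.execute(
    "SELECT lrdte, DENSE_RANK() OVER (ORDER BY lrdte) FROM orders"))

# With GROUP BY, DENSE_RANK ranks the 2 grouped rows instead.
grouped = list(conn.execute(
    "SELECT lrdte, SUM(qty), DENSE_RANK() OVER (ORDER BY lrdte) "
    "FROM orders GROUP BY lrdte"))

print(len(detail), len(grouped))  # prints "3 2"
```

The point is simply that the window function and the GROUP BY operate on different row sets, which is why mixing them inside one CTE can produce ranks that look wrong against the detail data.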

Thanks for the pointers.

Pete Helgren
Value Added Software, Inc
www.asaap.com
www.opensource4i.com


On 3/8/2011 5:42 PM, Schutte, Michael D wrote:
Just order by LRDTE

Should be... Dense_rank() over(partition by LSDTE order by LRDTE)





On Mar 8, 2011, at 5:04 PM, "Pete Helgren"<pete@xxxxxxxxxx> wrote:

Thanks. That is what I thought as well but even within the first
grouping of "date2" I get weird results when I add the partition by:

dense_rank() over(partition by lsdte order by lsdte, lrdte) as daycount

LRDTE LSDTE DayCount
04/17/11 04/02/11 1
04/17/11 04/02/11 1
04/17/11 04/02/11 1
04/17/11 04/02/11 1
04/17/11 04/02/11 1
04/17/11 04/02/11 1
04/18/11 04/02/11 2
04/18/11 04/02/11 1
04/18/11 04/02/11 2
04/18/11 04/02/11 2
04/18/11 04/02/11 1
04/21/11 04/02/11 3
04/21/11 04/02/11 2
04/21/11 04/02/11 3
04/21/11 04/02/11 3

There are other columns in this data set and the CTE itself is pretty
ugly. Could other columns in the data set cause the "partition by" to
break prematurely? I tried sorting the entire CTE with an ORDER BY
on date2 and date1, but I can't explain the "reset" of the ranking.

If I drop the 'partition by' I get the following:

04/17/11 04/02/11 1
04/17/11 04/02/11 1
04/17/11 04/02/11 1
04/17/11 04/02/11 1
04/17/11 04/02/11 1
04/17/11 04/02/11 1
04/18/11 04/02/11 2
04/18/11 04/02/11 2
04/18/11 04/02/11 2
04/18/11 04/02/11 2
04/18/11 04/02/11 2
04/21/11 04/02/11 3
04/21/11 04/02/11 3
04/21/11 04/02/11 3
04/21/11 04/02/11 3
04/21/11 04/02/11 3
04/22/11 04/02/11 4
04/22/11 04/02/11 4
04/22/11 04/02/11 4
04/22/11 04/02/11 4
04/22/11 04/02/11 4
05/02/11 05/16/11 5 <== Should be '1'
05/02/11 05/16/11 5
05/02/11 05/16/11 5
05/02/11 05/16/11 5


I could show you the whole SQL with the CTE but it IS a monster...

Pete Helgren
Value Added Software, Inc
www.asaap.com
www.opensource4i.com


On 3/8/2011 2:16 PM, Schutte, Michael D wrote:
You will need to add a PARTITION BY clause to the over(). Partitioning by date2 will cause the counter to start over whenever date2 changes.

So....

Dense_rank() over(partition by date2 order by date1)
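
A runnable sketch of this suggestion, using SQLite via Python rather than DB2 for i (date1/date2 stand in for LRDTE/LSDTE, and the rows are abbreviated from the thread's sample data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (item TEXT, date1 TEXT, date2 TEXT)")
conn.executemany(
    "INSERT INTO t VALUES (?, ?, ?)",
    [
        ("BB8179", "2011-04-17", "2011-04-02"),
        ("BE7488", "2011-04-18", "2011-04-02"),
        ("BD9608", "2011-04-21", "2011-04-02"),
        ("BD9607", "2011-05-07", "2011-05-28"),
        ("BB7942", "2011-05-10", "2011-05-28"),
    ],
)

# PARTITION BY date2 restarts the counter at 1 whenever date2 changes;
# ORDER BY date1 numbers the distinct date1 values within each partition.
sql = """
SELECT item, date1, date2,
       DENSE_RANK() OVER (PARTITION BY date2 ORDER BY date1) AS daycount
FROM t
ORDER BY date2, date1
"""
for item, d1, d2, rank in conn.execute(sql):
    print(item, d1, d2, rank)  # daycount runs 1,2,3 then resets to 1,2
```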



-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Pete Helgren
Sent: Tuesday, March 08, 2011 3:43 PM
To: Midrange Systems Technical Discussion
Subject: Dense_Rank Revisited

It looked like DENSE_RANK did the trick with my sample data, but when I
ran it against a larger data set I hit a snag that may not have been
apparent in the example I presented. Here is a slightly different
variation that illustrates the remaining issue.

BB8179 04/17/11 04/02/11
BE7214 04/17/11 04/02/11
BE7488 04/18/11 04/02/11
BE2178 04/18/11 04/02/11
BE1618 04/18/11 04/02/11
BD9608 04/21/11 04/02/11
BE2180 04/21/11 04/02/11
BE1696 04/21/11 04/02/11
BD9607 05/7/11 05/28/11
BB6382 05/7/11 05/28/11
BB7942 05/10/11 05/28/11
BE7487 05/10/11 05/28/11
BE7489 05/11/11 05/28/11
BE2179 05/11/11 05/28/11
BE8955 05/11/11 05/28/11


What I would like to do is count the unique occurrences of the first
date column within the *same* occurrence of the second date column.
Something like:

BB8179 04/17/11 1 04/02/11
BE7214 04/17/11 1 04/02/11
BE7488 04/18/11 2 04/02/11
BE2178 04/18/11 2 04/02/11
BE1618 04/18/11 2 04/02/11
BD9608 04/21/11 3 04/02/11
BE2180 04/21/11 3 04/02/11
BE1696 04/21/11 3 04/02/11
BD9607 05/7/11 1 05/28/11
BB6382 05/7/11 1 05/28/11
BB7942 05/10/11 2 05/28/11
BE7487 05/10/11 2 05/28/11
BE7489 05/11/11 3 05/28/11
BE2179 05/11/11 3 05/28/11
BE8955 05/11/11 3 05/28/11


I tried adding the second column to the ORDER BY clause, but that seemed
to make no difference. I also tried PARTITION BY on the second
column, but that didn't deliver the desired result either. Ideas?

--
Pete Helgren
Value Added Software, Inc
www.asaap.com
www.opensource4i.com

--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.


