


Thanks Eric, Vern and Chuck.

It makes sense now that you have explained it. I don't know why that did
not occur to me. I will give it a try and see what I get.

Thanks again!

Jim

On Thu, Mar 31, 2011 at 4:37 PM, DeLong, Eric <EDeLong@xxxxxxxxxxxxxxx> wrote:

No, the GROUP BY clause is used to render many rows down to one... Because
each "Weekxx" column is wrapped by the SUM() function, the values aggregate
to the grouped result row. Try it out!
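[A quick way to see the many-rows-down-to-one behavior outside the IBM i is to run it against SQLite from Python; the table and sample data below are stand-ins for DBTable, not the real schema:]

```python
import sqlite3

# Three detail rows for the same (item, year) collapse to a single
# aggregated row under GROUP BY.  Illustrative data only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DBTable (DBITEM INT, DBYEAR INT, WEEK INT, SHIPPED INT)")
conn.executemany(
    "INSERT INTO DBTable VALUES (?,?,?,?)",
    [(1, 2008, 26, 80), (1, 2008, 27, 78), (1, 2008, 28, 43)],
)
rows = conn.execute(
    "SELECT DBITEM, DBYEAR, SUM(SHIPPED) FROM DBTable GROUP BY DBITEM, DBYEAR"
).fetchall()
print(rows)  # -> [(1, 2008, 201)]
```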

-Eric DeLong

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:
midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Jim Essinger
Sent: Thursday, March 31, 2011 5:07 PM
To: Midrange Systems Technical Discussion
Subject: Re: SQL to create one row from multiple records - Revisited

Eric,

I wish I could be that succinct and clear! Yes, that is what I want to do.
It looks like the code you provided would give me one row for each input
record, and then I would have to summarize the resulting table?

Thanks,

Jim

On Thu, Mar 31, 2011 at 3:51 PM, DeLong, Eric <EDeLong@xxxxxxxxxxxxxxx> wrote:

Jim,

Sounds like you want to pivot your weekly counts into week columns?

The only strictly-SQL way I know to do this requires a series of CASE
expressions, one for each "week" column you populate. I'll show you the
pattern, and you can fill in the rest...

SELECT DBITEM, DBYEAR,
sum(case when week=1 then shipped else 0 end) as week01,
sum(case when week=2 then shipped else 0 end) as week02,
sum(case when week=3 then shipped else 0 end) as week03,
sum(case when week=4 then shipped else 0 end) as week04,
sum(case when week=5 then shipped else 0 end) as week05,
...
FROM DBTable
GROUP BY DBITEM, DBYEAR

(The ELSE 0 matters: without it, a week with no detail rows sums to NULL
instead of the zero you want in that bucket.)
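[The pattern above can be exercised end to end against SQLite from Python; column names here mirror the example but are assumptions, not the real IBM i schema:]

```python
import sqlite3

# Runnable sketch of the CASE-pivot pattern, using a few sample rows
# from the thread.  ELSE 0 makes empty week buckets come out as zero;
# with no ELSE branch, SUM() over all-NULL values would return NULL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DBTable (DBITEM INT, DBYEAR INT, WEEK INT, SHIPPED INT)")
conn.executemany(
    "INSERT INTO DBTable VALUES (?,?,?,?)",
    [(2, 2009, 1, 76), (2, 2009, 2, 96), (2, 2009, 13, 77)],
)
result = conn.execute("""
    SELECT DBITEM, DBYEAR,
           SUM(CASE WHEN WEEK = 1 THEN SHIPPED ELSE 0 END) AS week01,
           SUM(CASE WHEN WEEK = 2 THEN SHIPPED ELSE 0 END) AS week02,
           SUM(CASE WHEN WEEK = 3 THEN SHIPPED ELSE 0 END) AS week03
    FROM DBTable
    GROUP BY DBITEM, DBYEAR
""").fetchall()
print(result)  # -> [(2, 2009, 76, 96, 0)]
```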

Hth,
-Eric DeLong

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:
midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Jim Essinger
Sent: Thursday, March 31, 2011 4:32 PM
To: Midrange Systems Technical Discussion
Subject: SQL to create one row from multiple records - Revisited

SQL gurus!

Archive reference:

http://archive.midrange.com/midrange-l/201004/msg00143.html

I am always looking to expand my SQL knowledge and experience. I saw in
the above archive that there were some suggestions for taking data such
as the sample below (described in CSV format, but actually an IBM i
table) and converting it to a one-line record for each key pair.

Item,YEAR,WEEK,Shipped
1,2008,26,80
1,2008,27,78
1,2008,28,43
1,2008,29,64
1,2008,30,84
1,2008,31,41
1,2009,2,50
1,2009,5,9
1,2009,7,13
1,2009,8,35
1,2009,48,46
1,2009,49,76
1,2009,51,52
1,2009,52,60
1,2009,53,30
2,2009,1,76
2,2009,2,96
2,2009,13,77
2,2009,24,91
2,2009,25,90
3,2009,16,86
3,2009,17,144
3,2009,18,47
4,2009,19,77
4,2009,20,21
5,2009,21,105
5,2009,22,100
5,2009,23,75
5,2009,24,94
5,2009,25,73
1,2010,32,55
1,2010,33,69
1,2010,34,126

The archived thread talked about concatenating into a variable field. I
need to keep the numbers separate.

I want to use an SQL statement to create one record for each item and
year, then fill in the shipped number for the appropriate weeks, 1
through 54. SQL translates a date to a week number between 1 and 54. I
have summarized the existing file down to item, year, week, and shipped
numbers. If a week is missing a record, the amount should be zero for
that week bucket.

Again described as CSV but actually an IBM i table.

item,year,week01,week02,week03, ..... ,week51,week52,week53,week54
1,2008,0,0,0, ... ,80,78,43, 64,84,41, ....
1,2009,0,50,0,0,9,0,13,35, ...... ,46,76,52,60,30,0
1,2010,0,0,0, ... ,55,69,126, ....
2,2009,76,96, ... ,77,0, ... ,91,90, ...,
3,2009, ... ,86,144,47, ...
4,2009, ... ,77,21, ...
5,2009, ... ,105,100,75,94,73, ...
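[For reference, the date-to-week translation mentioned above can be sketched as follows; SQLite's strftime('%W') stands in for the DB2 for i WEEK() scalar function, and the two number weeks differently (%W runs 00-53, Monday-first, vs. WEEK's 1-54), so this only illustrates the shape of the query:]

```python
import sqlite3

# Turn a ship date into a week-of-year bucket.  strftime('%W') is a
# stand-in for DB2 for i's WEEK(); week-numbering rules differ between
# the two engines, so treat the exact boundaries as illustrative.
conn = sqlite3.connect(":memory:")
week = conn.execute(
    "SELECT CAST(strftime('%W', '2009-01-15') AS INTEGER)"
).fetchone()[0]
print(week)  # 2009-01-15 falls in Monday-first week 2
```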

Is there a somewhat easy way to do this with an SQL statement or UDF, or
am I going to have to write a program to do this?

Thanks!

Jim
--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing
list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.



This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].
