Year 2000. It's close to our hearts, if not omnipresent in our lives,
for the next couple of years.

I've been reading a little about how various computers and operating
systems count dates, what their epochs are, etc.

An interesting thought: UNIXes use a 32-bit signed integer to count
the number of seconds. Their birthdate (or epoch) is 1 Jan 1970. That
gives us roughly 68 years on either side of 1970, or about 1901-2038,
as the useful date range in UNIX (for date calcs).

Now, I don't even pretend to know the inner workings of OS/400 or the
AS/400 systems. However, it occurs to me that IBM markets the RISC
systems as 64-bit architecture. Therefore:

Why not establish a standard date conversion scheme: a 64-bit signed
integer representing the number of seconds from some date. The "some
date" is largely irrelevant. Here's why: 2^63 (one bit set aside for
the sign) gives us some 9.223372037x10^18, or about
9,223,372,037,000,000,000 seconds (according to my Texas Instruments
TI-35X calculator; I'm not really a math person, either). That works
out to 2.922710231x10^11 = 292,271,023,100 years. That is 292 BILLION
years. In each direction. Counting from 0 AD, 1970 AD, or 2000 AD
doesn't really make a difference.

Now, I didn't choose 64 because the AS/400 uses 64-bit technology, but
because 64 is simply the next power of 2 after 32.

So: could we not create a "standard" database date format, 64 bits
wide (8 bytes, no larger than current Year 2000 efforts), that
represents the date AND time of some event? Then add an OS API that
can convert the current date and time to that 64-bit number, and can
take a 64-bit number and make a date and time stamp from it.

The beauty of this solution (or why it is VERY elegant): if for
whatever reason scientists tell us the universe is a trillion years
old (which our nearly 600-billion-year range can't cover), just
increment the bit width in the API and the system register (because
we'll have 128-bit processors by then), and do a CHGPF (thank you
V3R7!) on the date field from 64 to 128 bits (a mere
5.391448763x10^30, or about 5 thousand billion billion billion years,
in each direction). Program code may have to be modified, but Y2K
taught us to document our code, right? Especially in regard to dates.

Computers that could take advantage of this:
AS/400 - with RISC.
UNIXes running on 64-bit processors and compiled for 64-bit. 
Pentiums (or is that Pro?) with their 64-bit data bus.

What does everyone think? I personally don't think it's impossible.
It's very much the cleanest format I could think of. It doesn't care
what your date format is. It is easily extendable. Thoughts? Or am I
crazy??? ^_^

Thanks for your time; we now return you to your regularly scheduled
year "00" programming.

 - lg -

--
A book: ...turn a page, read, read, read, read...
Television: ...click, watch, watch, watch, watch...
The Web: ...click, wait, wait, wait, wait, read...
lgoodbar@tecinfo.com  ICQ#504581  http://www.tecinfo.com/~lgoodbar/
+---
| This is the Midrange System Mailing List!
| To submit a new message, send your mail to "MIDRANGE-L@midrange.com".
| To unsubscribe from this list send email to MAJORDOMO@midrange.com
|    and specify 'unsubscribe MIDRANGE-L' in the body of your message.
| Questions should be directed to the list owner/operator: david@midrange.com
+---

