posted to rec.boats
From: Eisboch
Subject: OT: Save Windows XP


"Vic Smith" wrote in message
...
On Thu, 20 Mar 2008 14:53:01 -0500, wrote:

On Thu, 20 Mar 2008 11:51:14 -0600, Vic Smith
wrote:

When these mainframe apps were designed, conserving precious disk
space, fast I/O and frugal CPU time were more important than thinking
20 years into the future.


Actually, I am old enough to remember when 80 characters was the limit
and dates had 1-digit "years". Those folks seemed to handle the decade
change OK ... many times. That was a regular thing when the "media"
was a card. IBM used a 5-digit date on lots of transactions until 1970
or so.
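If memory serves, one common 5-digit layout was the "Julian" YYDDD: a
2-digit year plus the day of the year. Assuming that layout, decoding
one looks like this (a sketch in modern Python, obviously not anything
that ran on those machines):

from datetime import date, timedelta

def from_yyddd(raw):
    # Decode a 5-digit YYDDD "Julian" date, assuming 19xx.
    yy, ddd = int(raw[:2]), int(raw[2:])
    return date(1900 + yy, 1, 1) + timedelta(days=ddd - 1)

print(from_yyddd("69032"))   # 1969-02-01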


I'll answer both your responses here.
I worked on insurance and financial apps that handled tens of millions
of records daily. That's what most big companies of that type do, and
fixed dates are part of most transactions.
Some of my apps had 1,000+ programs/modules passing data and
interacting. They were initially designed with a 2-byte year that
was naturally woven into the data flow.
Almost all were in-house developed, and unique.
So I wasn't talking about PC stuff.
Even mainframe database software such as DB2 and IMS is restricted by
how the data is represented when the app is designed.
You simply can't represent a fixed year with one digit without serious
processing and computational drawbacks.
Century wasn't much of an issue when most of these systems were
developed with 2-digit years. A simple check sufficed: if the birth
year was greater than the current year, plug 18 into the date calc
as the century.
Hardly any of the many other fixed business-processing dates needed
that; they were all computed as 19xx.
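That windowing trick, sketched in Python (the function is my own
illustration, not anybody's production code):

def infer_birth_year(yy_birth, yy_current):
    # Classic 2-digit window: a stored birth year "greater than"
    # the current 2-digit year can't be this century, so assume
    # 18xx; otherwise assume 19xx.
    if yy_birth > yy_current:
        return 1800 + yy_birth
    return 1900 + yy_birth

print(infer_birth_year(92, 85))   # 1892 (run circa 1985)
print(infer_birth_year(40, 85))   # 1940

One byte saved per record, tens of millions of records a day -- that
was the whole tradeoff.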
The usual exceptions were amortization and maturity dates, which were
either calculated on the fly or kludged with a "century indicator" or
some such if stored. Almost any date complexity was acceptable if it
saved a byte. There were no standards.
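A hypothetical version of that kludge (the field layout here is
invented for illustration -- every shop rolled its own):

def unpack_flagged_date(raw):
    # One-character century flag ('0' = 19xx, '1' = 20xx),
    # then YYMMDD -- seven bytes instead of eight.
    century = 20 if raw[0] == "1" else 19
    yy, mm, dd = int(raw[1:3]), int(raw[3:5]), int(raw[5:7])
    return century * 100 + yy, mm, dd

print(unpack_flagged_date("1050115"))   # (2005, 1, 15)
print(unpack_flagged_date("0991231"))   # (1999, 12, 31)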
No need to get into binary manipulation/translation of bytes, since
that has its own serious drawbacks, not the least of them the human
interface.
Without getting into how these systems got to where they were circa
the late '90s, or getting too gearhead, there were big Y2K issues that
needed addressing; if they hadn't been, nearly every major business in
the country would have been dead in the water when 2000 hit.
Of course they were addressed, at the cost of many millions of
dollars.
There was no reason for the general population to be scared.
Personally, I never met anybody who was, but I heard of people
hoarding food and that kind of crap.
Anyway, I'm quickly forgetting all the old data tricks, and modern
systems like SAP, which I last worked with, use a full 8 bytes to
represent CCYYMMDD.
They throw in another 8 bytes or so for a full timestamp down to
thousandths of a second. That's occupied disk bytes, not what's
represented through a translation.
It's evident the SAP designers, not faced with the space constraints,
and having seen a century turn up close, saw the value of a full date.
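For comparison, an 8-character CCYYMMDD field looks like this (a
sketch only -- not a claim about SAP's actual internals):

from datetime import datetime

ccyymmdd = datetime(2008, 3, 20).strftime("%Y%m%d")
print(ccyymmdd, len(ccyymmdd.encode("ascii")))   # 20080320 8

Eight stored bytes, no windowing, no century flags -- unambiguous
until the year 10000.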
Don't hold me to all that though, as I'm quickly forgetting that too.
I'm even wondering if I just didn't make that all up.
Either way, I'm surprised to find it doesn't bother me much.

--Vic



Don't feel bad.
By the time I got to the end of your post, I forgot what you were talking
about.

Eisboch