Y2K cost savings will prove expensive
The Y2K computer bug — which current estimates indicate will cost from $600 billion to $1.6 trillion to correct worldwide — was born out of a desire to save money.
In the early 1960s, computer memory was very expensive, and there was a strong economic incentive to minimize the amount of memory needed to store a program and its data, according to a recent U.S. Senate report.1 To save money and memory, programmers represented four-digit years with only two digits, so 1968 or 1974 would be stored and processed as 68 and 74, respectively. The 19 in the year was implied, much as personal checks once had the number 19 preprinted on the dateline. Consequently, computers could not correctly calculate the difference between a year in the 20th century and a year in the 21st century. For example, the time between July 1, 1998, and July 1, 2005, is 7 years. However, a computer with a Y2K problem could calculate an answer of 93 years or -93 years, depending on the specific program. Calculations that used either of those results would be in error and might themselves cause subsequent problems.
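The arithmetic is easy to reproduce. The short Python sketch below is a hypothetical illustration, not code from the Senate report; it simply contrasts a two-digit date calculation with the correct four-digit one.

```python
# Hypothetical illustration of the Y2K date-difference error (not from the Senate report).

def elapsed_years_two_digit(start_year: int, end_year: int) -> int:
    """Mimic a legacy program that keeps only the last two digits of each year."""
    return (end_year % 100) - (start_year % 100)

def elapsed_years_four_digit(start_year: int, end_year: int) -> int:
    """Correct calculation using full four-digit years."""
    return end_year - start_year

print(elapsed_years_four_digit(1998, 2005))  # 7, the correct answer
print(elapsed_years_two_digit(1998, 2005))   # -93, because 05 - 98 = -93
```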
Early computer programmers assumed their successors would correct the problem, but the human tendency to take the path of least resistance and defer a complex, expensive fix allowed the problem to grow into today's high-priced crisis, the report noted.
Reference
1. United States Senate Special Committee on the Year 2000 Technology Problem. "Investigating the Impact of the Year 2000 Problem." March 2, 1999. Web site: http://www.senate.gov/~y2k/