Most computer history today is told in a linear fashion. The idea is that all the technology available today was invented in the mid-to-late 1980s, so modern technology is nothing more than a faster Apple II or a Commodore Amiga with more RAM.
The problem with this story is that it ignores the turning point of 1992. That year was a milestone in computer hardware and, just as important, the beginning of the internet age. From a hardware perspective, 1992 marked the transition from the older 16-bit home computers to the modern 32-bit PC. A few facts: the Atari ST, a typical 16-bit home computer, was first released in 1985. That means seven years later the hardware was widely available, but at the same time it was somewhat outdated. In contrast, a PC with an Intel 386SX chip running Windows 3.0 was a modern computer for that era. The main difference is that the 386SX PC is a 32-bit system with a built-in hard drive, while the Atari ST has neither.
The year 1992 was above all the end point of the 1980s computer generation. That means all the Amiga 500 and Atari ST machines, the Commodore 64, and the MS-DOS-based 286 PCs had become obsolete. Instead, the more modern 32-bit IBM PC compatible architecture was perceived as the new standard. The interesting point is that such a drastic decline has never happened again since that era; everything invented after 1992 remains valid today. This splits computer history into at least two periods: the time up to 1992, which meant 16-bit hardware, low-resolution graphics, floppy drives, and no internet; and the time after 1992, which means cheap PC-compatible hardware, GUI operating systems, high-resolution games, and of course the WWW.
From a hardware perspective there is a large difference between a 16-bit Atari ST and a 32-bit IBM PC. The latter can be described as a workstation: not a home computer but a professional machine with a built-in hard drive, an Ethernet connection, and the ability to run multiple applications at the same time. In contrast, a 16-bit home computer has to be treated as a toy, some sort of gaming console.
There was a big difference between the time before 1992 and the time after it. Before then, the question for every computer nerd was how to build and program a computer. What was available and common at the time were radios, TV screens, and electronic components like transistors. What was not available was a computer. So the average computer nerd was happy if he was able to build such a machine from scratch and type in a program from a magazine line by line. After 1992 the situation changed, because computer technology had become widely available. Every store sold standard PC hardware and ready-to-use software. The computer had become part of daily life and was used for practical applications like word processing and gaming.
It is interesting to see that around 1992 the articles in the computer journals became trivial. Before 1992 it was common to publish tutorials on how to program a computer in assembly or how to build something by hand. After 1992, the magazines instead tested existing software and made recommendations about which hardware to buy.
Before 1992, computing was a hobby and an adventure; after that date, it became trivial. This does not mean that computers went out of fashion, but rather that the number of possibilities increased drastically. The PC aggregated all the technology of the preceding decades: the Unix operating system was ported to PC hardware, video games were published only for the PC, and the device became the default tool for graphic designers, musicians, programmers, and even supercomputing applications.