Most descriptions in retro computing focus on the 1980s because it was the first decade in which 8-bit microcomputers were available as consumer products and the first Unix workstations appeared. But there are several arguments that make the 80s a poor choice for analyzing the past of computing. The first problem is that the computer hardware was very slow. Multitasking operating systems with GUIs were not available. Computers of this period were also rarely used for multimedia applications involving sound and graphics. And last but not least, the Internet, including the WWW, was at a very early stage.
A more powerful and less explored decade is the 2000s, which is introduced next. The main advantage of the 2000s compared to the 1980s was that all of these bottlenecks had been resolved. In the 2000s, powerful 32-bit PCs were available as consumer products. The Internet was widely used at universities, and even home computers were connected over early broadband connections. So we can say that the 2000s, not the 1980s, was the first decade in which computing in the modern sense was available at a low price.
In addition, the 2000s had another unique feature: it was the last decade in which Artificial Intelligence was absent. During the 2000s there was a common understanding among IT experts that neural networks and robotics were a dead end. On the one hand, computer science had demonstrated its power in the domains of the Internet, programming languages and desktop operating systems; at the same time, it was unclear how to realize AI systems such as computer vision, expert systems and robots. Even more, in the 2000s there was a widespread assumption that AI couldn't be realized at all, not in the 2000s, not in the 2040s and not in the 2080s.
So we can say there was an interesting contrast between powerful and widely used classical computers on the one hand, and the absence of neural networks and robotics on the other. The 2000s was perhaps the last decade in which computers were used only as machines, not as large language models, self-driving cars or multi-agent systems.
Of course, these subjects were researched in the 2000s, but they were not available as practical applications. All the important innovations, such as question-answering systems like IBM Watson, deep learning on GPU-accelerated hardware and multimodal datasets, arrived after the year 2010.
Let us list some important classical computing technologies available in the 2000s: Linux, the Apple iPhone, YouTube, social networks, ADSL, USB flash drives and Windows 2000.
So we can say that the 2000s was a great decade for computing. Most of today's technology has its roots in the 2000s. Unlike hardware from the 1980s, the technology from the 2000s can't be called retro or ancient, because with some goodwill it can still be used today.