It was the early 1980s, and a ten-year-old was on the train from Hanover to Munich. A resolute white-haired senior approached the boy and said with an imperious undertone: “Excuse me, the window seat belongs to me!” “No, it was reserved especially for me,” replied the ten-year-old, and after a brief exchange of “It belongs to me!”, “No, me!!”, “No, me!!!”, the unequal pair decided simply to compare their reservation cards. And lo and behold: the railway had made a mistake and assigned the seat twice. The senior knew immediately what was to blame: “It all comes from the computer. It doesn’t work, it just doesn’t work,” he intoned with a raised index finger.

What was met with great skepticism in the old Federal Republic was about to make a huge leap forward in the USA. Probably the best indication of this was Time magazine, still a benchmark in those days. Every year the publication names a “Person of the Year”, and in January 1983 there would have been good reasons to honor, say, US President Ronald Reagan or British Prime Minister Margaret Thatcher. Instead, for the first time, the magazine honored no one at all: under the headline “The Computer Moves In”, it declared a technical innovation “Machine of the Year”.

The cover showed a stylized white man sitting in front of a screen and keyboard. The trend was clearly toward the personal computer, though at the time it was not entirely clear where the technological journey would lead. Companies like IBM, Control Data or Cray still believed in gigantic data centers with many terminals, whose waste heat was used to warm the building that had been constructed around them. In terms of performance, these installations were far superior to the products of Apple, Commodore and the like.

The only disadvantage of these machines was that they swallowed enormous sums for construction, maintenance and programming (hardware and software people got along only moderately well with each other), and you could not take them home either. The latter turned out to be the decisive drawback. According to Time, 724,000 personal computers were sold in 1980 and 1.4 million a year later. In 1982 that number was expected to double again.

Especially among young adults in the USA, it suddenly became progressive to be seen with such a machine. It had been very different in the years before: until well into the 1970s, the scene largely kept to itself. Which is hardly surprising in view of a genius in his early phase like Steve Jobs. “Make a dent in the universe” was the slogan with which his company Apple sought to win over young, motivated staff. But when applicants showed up, they quickly noticed that the boss stank, quite literally.

Jobs had his own theory of how the body worked. For a while, at least, he was firmly convinced that a strict fruit diet prevented any kind of sweating, and therefore shied away from showering (the devil’s work!!!). Jobs was by no means alone with such quirks, and Bill Gates was hardly a conventional character either, which made the whole scene look, to respectable middle-aged Americans, like a kind of human zoo at best.

Alongside the first successes in word and data processing at a company like Apple, competitors such as Atari concentrated on developing electronic games. People who considered themselves smart objected in turn that these games did nothing for the optimal development of the young people they were designed for. An understandable impression, given the blank faces of youngsters who had achieved absolute mastery in disciplines such as “Asteroids” or “Pac-Man” and precious little in any other. Young arcade audiences could hardly hope to find themselves at Harvard, Princeton or Yale one day.

But with the rise of the PC, the triumph of another group of young people began: the ones who always had to watch out on the playground not to get beaten up and who were otherwise picked last for every team in sports. Once they sat at a computer keyboard, however, they were unbeatable. This nerd culture may not have appealed to girls at first, but it opened up a field of activity for boys the like of which had rarely been seen before.

The optimism with which commercials and brochures of those years promised an ever better world through ever greater computing power remains fascinating. But those were the messages of an industry that would soon reach a mass audience with Apple’s Macintosh or Microsoft’s DOS. All the more astonishing is the text that the colleagues at “Time” wrote back then to justify their choice. Not every detail could be right, of course, but the direction certainly was: the editors understood that humanity was facing nothing less than a revolution in its way of life.

The writers did not expect that the main use of home computers at the time, video games, would remain dominant. Today there is a huge market for exactly this kind of programming, but the machines did of course soon take on completely different tasks: students used them as better typewriters, scientists and economists used them to record their data, and with the advent of e-mail in the mid-1990s at the latest, communication across the planet changed. The “Time” editors even saw social networks and video conferencing coming, so choosing the machine was a prescient decision.

It goes without saying that the digital world has fallen far short of keeping every promise of happiness it ever made. Still, this story, much like that of James Watt’s steam engine, became a lesson in the fact that technical innovations are unstoppable, whatever arguments their conservative opponents may raise against them, even with good reason. And anyone who prefers things a size smaller should check whether their Commodore Amiga still works. Not to work on it, mind you. But its shoot-’em-ups were top class.
