“A rocket will never be able to leave the Earth’s atmosphere.”
- The New York Times - 13th January 1920.
This was the bold claim made by The New York Times in 1920. Although cars had already become commonplace at the time, and technology was moving at full tilt, the consensus was that some things were simply never going to change. The Times finally published a correction on 17th July 1969, as Apollo 11 made its way to the moon (better late than never…). So, what does this have to do with our conversation? Buckle up, Major Tom, and let me take you on a trip through the past.
The first swing at digitisation was taken in 1679 (yes, you read that correctly), when Gottfried Wilhelm Leibniz developed what was later recognised as the first binary number system. The language of 1s and 0s had been conceived, but it was so unfamiliar that he later published a full account describing the new phenomenon and explaining its applications.
In 1847, George Boole developed Boolean algebra, providing the algebraic framework for the mathematical analysis of logic that underpins the earlier-discovered binary system. His insight was that a logic built on just two values held untapped potential far beyond what anyone at the time could foresee.
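For readers who want to see these two ideas side by side, here is a small sketch in Python (purely illustrative; the helper name `to_binary` is my own) of Leibniz's binary representation of numbers and Boole's algebra of truth values:

```python
def to_binary(n: int) -> str:
    """Represent a non-negative integer using only 1s and 0s,
    via repeated division by 2 (Leibniz's binary system)."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder gives the next bit
        n //= 2
    return "".join(reversed(bits))

print(to_binary(42))  # -> 101010

# Boole's algebra works over the same two values, but with
# logical operations instead of arithmetic ones:
a, b = True, False
print(a and b, a or b, not a)  # -> False True True
```

Modern computing rests on exactly this pairing: numbers encoded as bits, and circuits that manipulate those bits with Boolean operations.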
In 1954, hardware started to catch up with theory. The American inventor and computer pioneer Reynold Johnson assembled an R&D team at an IBM lab, and their invention would later be recognised as the first hard disk drive. It held barely enough space for a single 3x4 picture and weighed about as much as a small car, but it symbolised something far greater: it laid the foundation for everything that followed and became the poster child of the age of digitisation.
The year 1963 didn’t just bring the Civil Rights movement and a cultural boom. Charles Bachman developed the Integrated Data Store (IDS), widely regarded as the first database management system, and it proved so effective that it heavily influenced the manufacturing industry.
Ten years later, things picked up where they had left off. Hamilton’s Pulsar marked the birth of the first digital watch, and the first digital camera followed soon after. Digital hardware was evolving at full speed, and history shows that nothing would stop it.
By 1982, 8.2% of American households owned a personal computer, a staggering figure considering that, a decade earlier, an overwhelming majority still hadn’t wrapped their heads around what a computer even was.
In 1990, General Instrument won the race to demonstrate the first all-digital television system, and Finland launched the world’s first 2G (GSM) network in 1991. The 90s were at open throttle and set yet another milestone for what followed.
Moving on to the new millennium (Y2K turned out to be a non-event, by the way), Google set out in 2004 to build a vast database of information, work that would later grow into the broader Google platform.
2008 was a big year for finance and economics enthusiasts. Bitcoin, the first cashless, decentralised, digital currency, was born. Many successful cryptocurrencies would follow, turning the field into a multi-billion-dollar phenomenon.
2015 was a brilliant year for the snowball effect. Platforms like Facebook, Instagram, and Twitter took over, revolutionising how people interacted on a daily basis.
The art of landing
So, you’ve hung around for the lift-off with the birth and refinement of binary code, stayed for the flight, and now comes the landing. Digitisation, in my opinion, witnessed two fundamental turning points. At genesis, the birth was an exploration into the unknown, with no clear direction or steer of innovation. By the mid-20th century, a clear breadcrumb trail had been laid, allowing new leaps to be made by building upon earlier inventions and driven by the needs of the time. The wave of digitisation followed and simultaneously helped define a new age. The entire history of digitisation has been “small steps for man, but a giant leap for mankind.”