Next month will mark forty years since I showed up for the first day of my first professional job. I knew BASIC – I had even learned to type RPG-II onto a deck of punch cards – but in reality I knew nothing.
In 1982, the field was not yet very professional. Most people working as software engineers were, like me, college dropouts.
I was fortunate to be mentored by two incredibly brilliant and (fortunately) very patient people. John taught me how to get the most out of tight assembly code on the Intel 8085A, while Ethan – who came from a background in minicomputers – showed me how to work with systems larger than a single processor and EEPROM.
I learned everything they could teach me – enough that when, a year later, my boss explained his Big New Idea to me, I was able to prototype it (hardware and software) from that explanation alone. This turned out to be RSA’s very first version of SecurID – the ancestor of many of the 2FA systems in use today. (Full disclosure: I’m not very good at crypto, and my first implementation was ridiculously easy to break. But it proved the concept.)
I spent the first decade of my career writing firmware for a range of communications devices: X.25 packet assemblers/disassemblers, modems, CSU/DSUs, and finally a line of dial-up networking equipment for a startup called Shiva Corporation that let users connect to an office network from anywhere to access files and printers.
Shiva followed with an IP gateway – at the time, only a few large universities and corporations had internet access – and I learned to code for TCP/IP. That gave me my own big new idea – inspired by William Gibson’s Neuromancer – for a virtual reality interface to the Internet.
While I was working on this, on the other side of the world, Tim Berners-Lee developed a protocol to connect all the computers in the world into a single hyperlinked resource. The Web changed everything (and continues to do so), providing a base on which I could build a 3D interface to the Internet: that interface was VRML.
The last five years of the 1990s felt like the technological equivalent of the Cambrian Explosion. The raw early websites soon gave way to sleek user interfaces, data browsers, media players, e-commerce and more. As more and more information found its way into cyberspace, it became ever more accessible and shareable. Almost overnight, humanity went from an information deficit to an endless, exhausting oversupply.
The Web 1.0 bubble burst in the early 2000s – a reset that wiped out nearly every idea that couldn’t be immediately monetized. The 3D web went off to the island of lost toys, never to be seen again – or so I thought.
Then Friendster made the web fun again; the endless depths of information became a space for human connection, as we found our friends, families, colleagues and neighbors, and used those connections to share and learn from one another. Social media felt like another revolution – nothing between us would ever be the same again.
It turned out that was just the opening.
From the day Steve Jobs took the stage in January 2007 with the first iPhone, only twelve years passed before half the adults on Earth owned a smartphone. These ubiquitous devices put all of our information and connections in the palms of our hands. We no longer wish to look away from their screens – the flashing lights and steady stream of notifications promise much, while delivering a steady diet of FOMO, disappointment, and negativity.
We haven’t yet learned to hold these devices at a safe distance – a distance that allows us to keep hold of ourselves. To do that, we need time and space to think and feel. Technology helps us spend our time so efficiently that we rarely realize we need to breathe – just breathe – in order to grow.
The 3D web is back – rebranded as “the metaverse”. I doubt any of us are prepared for the moment when we don the augmented reality spectacles soon to come from Apple, Meta, and Microsoft, and the screen becomes the whole world – when everything we see is mediated by the enormous infrastructure of computation, analysis and recommendation that we have built over the last forty years.
If we’ve learned anything over the past four decades, it’s that every innovation, no matter how wonderful, contains its opposite. The splendor of “knowledge at your fingertips” has laid the foundation for a planetary “amplifier of ignorance”. Massive human hyperconnectivity through social media has awakened and accelerated our tendency to form tribes. Our coming together is beginning to look more like our separation. Advances in artificial intelligence mean that the surveillance state can evolve without human personnel – or human oversight.
We never seem to understand that our strengths inevitably turn into weaknesses. A touch of humility could go a long way in helping us avoid tragedy as we navigate the next forty years. Where we can admit that we don’t know, we can find space to think, feel and breathe. ®