Who’s the father of modern computing? Ask that question, and you’ll get different answers depending on the age bracket…
To the youngest generation, the ones who had a tablet or phone from an early age, the only word to know in computing is "Google," so they'll likely answer Larry Page and Sergey Brin. The generation one tick up in the age bracket will likely answer Bill Gates or Steve Jobs.
None of these are even close. When we talk about "modern computing," we first have to draw a line somewhere.
So, “ancient computing” would involve giant mainframes, reel-to-reel tape, punched cards, text terminals, and blinkenlights as the old funny sign refers to them. “Modern computing” means interfacing with the computer via a mouse, pointer, stylus, icons, windows, a desktop, images and videos leaping up to display at one click, that sort of thing.
So where did the windows and icons and mouse pointer come from?
You might say “Microsoft,” but in fact Microsoft Windows basically ripped off their user interface from Apple – to the point that Apple sued in the now-famous “look and feel” lawsuit.
So, Apple, then?
Nope, Apple copied their interface from Xerox, and likewise (seeing the pattern here?) Xerox sued Apple for their UI design too.
So what, Xerox invented the desktop?
The Xerox Alto, introduced in 1973, was indeed the first computer built around a desktop user interface with a mouse, windows, buttons, and icons. Strictly speaking, though, the Alto was a research machine; Xerox's first commercially sold GUI computer was the Star, in 1981. But even that's not the beginning of the story, for there are many more layers to this onion.
The name to know is Douglas Engelbart
Douglas Engelbart was born January 30, 1925, in Portland, Oregon. The middle child of three, he began studies at Oregon State College, but was drafted midway through WWII and served as a radar technician in the US Navy. During that time, he read Vannevar Bush's 1945 essay "As We May Think," which greatly influenced him. In a nutshell: around the time of the Hiroshima and Nagasaki bombings that punctuated WWII, Bush argued that science should be pushing toward shared discoveries and the spread of knowledge, not building bigger bombs. The essay proposes that we create a "collective memory machine" – don't look now, but you're soaking in it!
This video covers some of his philosophy:
After the war, Douglas Engelbart earned his master's degree and Ph.D. in electrical engineering at UC Berkeley, and settled down at the Stanford Research Institute (SRI) in Menlo Park, California. There, he submitted a report titled "Augmenting Human Intellect: A Conceptual Framework," proposing the use of computers to coalesce the sum total of human scientific understanding. This got him a grant from ARPA to found the Augmentation Research Center (ARC), where he and engineer Bill English came up with a small, hand-sized wooden box with two perpendicular wheels on the bottom and a cord to plug it into the computer. The cord looked like a tail coming out the back of the thing, so engineers at the lab called it a "mouse."
The mouse was born! The patent application, filed in 1967 and granted in 1970, called it an "X-Y position indicator for a display system," an innovation for which Engelbart would go on to earn exactly zero dollars and zero cents in royalties. The patent belonged to SRI, after all, and years later SRI licensed the technology to Apple for a reported sum of around $40,000.
Along the way, Engelbart hosted the landmark event known to computer science students (and nobody else) as "The Mother of All Demos." Here's the whole thing at an hour and forty minutes (feel free to skip a bit):
None of this may seem impressive to us now, but this was the year 1968. The average technician interacted with the average computer by dumping a stack of punched cards into a hopper and hoping they didn’t jam. People watching this demo had their jaws on the floor by five minutes in. This was the first time anybody had seen windows, hypertext, graphics, efficient navigation and command input, video conferencing, the computer mouse, word processing, dynamic file linking, revision control, and a collaborative real-time editor – let alone all on one computer!
Douglas Engelbart was doomed to obscurity from then on
Most of his engineering team moved to the private sector to work at Xerox PARC, and now you know the rest of the Alto's story. Engelbart himself weathered a maze of buyouts and mergers: his laboratory was sold to Tymshare in 1977, Tymshare was bought out by McDonnell Douglas in 1984, and various aerospace executives grew less and less interested in what further ideas Douglas Engelbart had about the future of computing.
There was a Cold War on, after all, and no time for this day-dreaming!
So he retired from his day job and, in 1988, founded the Bootstrap Institute in partnership with his daughter Christina. Sustained by DARPA funding and management seminars, the organization was later renamed the Doug Engelbart Institute, with Christina as Executive Director, a capacity in which she serves to this day. But it was the '80s and '90s: the Gulf War and other distractions dominated public attention, and with the Graphical User Interface (GUI) already established in the consumer computing market, interest in foundational interface research largely dried up.
He went on to serve on the advisory boards of institutes and technology centers far and wide. He was certainly recognized among the highest cream of the crop in Silicon Valley, including a Turing Award in 1997 and the National Medal of Technology in 2000, but the public largely never heard his name. He passed away in 2013, not without a hail of awards in recognition of his lifetime achievements.
So why say he was doomed to obscurity? Because the system he envisioned to bring peace and understanding across the world was used to download cat videos and post status updates on Facebook. The inventions and innovations he freely gave away became gang turf for every computer company to sue every other computer company. And his contributions were buried while Bill Gates and Steve Jobs stood up front taking all the credit.
But his vision can live on, as explained here in this tribute:
“What kind of goal should I have for a career?” – Every June, a new crop of graduates ask themselves this question. We can go on facilitating the evolution of the collective IQ. Sure, we can still have our funny cat videos, too. But every now and then, it’s good to ask if we’re using the technology we have even now to its fullest potential, or whether we can improve the world some more with it.
Read more articles by Pete