NOTE: This is a draft, not the final piece. More thought needs to be given to it and the ideas developed further. Nevertheless, feel free to read this version for now.

The final-year courses that a university offered heavily influenced my enrolment decision. Course structures are subject to change, and the probability of at least some final-year courses changing by the time I get to them is substantial (note: you needn't wait to learn something). This isn't a bad thing. After all, I wouldn't complain if a current course were replaced with something new, something at the frontier of computer science; rather, I'd be excited. Looking at final-year courses allowed me to screen for universities that taught what I thought really mattered, that is, the universities offering courses on fresh topics and relatively newer ideas in computer science. There seemed to be a correlation between universities that took a more theoretical/mathematical approach to their computer science curriculum and those that taught what really mattered. Possibly this has to do with the nature of ideas and inventions. Does theory always precede practical use?

Areas at the forefront of science usually have higher barriers to entry, in the sense that fewer people understand them and there are therefore fewer people and resources to learn from. Consequently, you have an advantage if you can understand them. The people who do understand them, though, are likely very smart, and that's the disadvantage: you're competing with the best. That being said, a more glass-half-full perspective is that it gives you scope to collaborate with the best. I believed - and still do - that an understanding of the frontiers of science opens the door to innovation. Understanding a new technology is the first step in exploring how that new-found knowledge can be used, as well as in discovering problems you never knew existed. Accompanying every problem is an opportunity: the opportunity for the problem to be solved. However, not all problems are worth solving (see Paul Graham's Bus Ticket Theory of Genius).

I'd call what I just described a technology-first approach: understand the capabilities of the technology first, then make something useful based - loosely - on those boundaries. A short video I came across recently, of Steve Jobs talking about a user-first approach, made me re-evaluate my views. Perhaps a superior method of inventing is to be user-first: focus on the problem for the user and figure out the technological implementation later. This seems trivial; nevertheless, it's much more difficult than it looks to figure out what people actually want - people like Steve Jobs are particularly great at this - and if you base it too closely on what they already have, you'll end up with faster horses and not cars.

The truth is most people don't know what they want - in some cases, not even the inventors themselves! Take the phonograph, for example. When Thomas Edison built the first phonograph, he wrote about possible use cases for his invention: recording the last words of dying people, recording books for blind people to hear, and announcing the time sat at the top of his list of intended uses. He definitely wasn't convinced by the idea of the phonograph being used to reproduce music, even telling his assistant that it was of no commercial value. It wasn't until 20 years later that Edison acknowledged music reproduction as the main use of the phonograph. Unlike the layperson, Edison had spent his time experimenting with the telegraph and the telephone, which allowed him to see how the technology could be pushed and in which direction, but this understanding of the technology did not, evidently, constitute an understanding of its eventual applications. Or did it?

I'll give a counter-argument to my last point and leave it as an exercise for the reader to decide whether or not Edison understood the applications of his technology (as with all exercises left to the reader in my mathematics textbook, this too is "trivial").

The influence that society has on the success of an idea is unparalleled. The culture of a society in a given time period has the ability to make or break an idea, and you'll, of course, hear of startups being "too early".

"A new device merely opens a door; it does not compel one to enter. The acceptance or rejection of an invention, or the extent to which its implications are realized if it is accepted, depends quite as much upon the condition of a society, and upon the imagination of its leaders, as upon the nature of the technological item itself." - Lynn White Jr., Medieval Technology and Social Change

Continuing with the example of the phonograph, Edison planned that the device would be used for "recording books for blind people to hear". Whilst he may not have fully realised the tremendous potential of this use case for the average person - initially only suggesting it as a device for the blind - Edison's idea is nothing short of the audiobook. The phonograph was invented in 1877, yet the idea of an audiobook lay dormant, unexploited, until 1932, when the American Foundation for the Blind began recording books on vinyl records. Was Thomas Edison the protagonist in another story of being too early? The triumphant use case of the phonograph being unintended seems to suggest otherwise, but perhaps that is just a by-product of being serendipitously successful.

Why QWERTY? The invention of the typewriter exhibits a more subtle technology-first approach. The QWERTY layout we see on our keyboards today was deliberately designed to slow down typists. Letters were intentionally spread out over all of the keyboard rows, with a high proportion of the most common ones placed on the left-hand side - since most people are right-handed, this made typing more difficult for them. The reason for this premeditated murder of productivity? Two adjacent keys pressed in quick succession caused the mechanism to jam. The development of the typewriter was moulded around the technology of the time, rather than focusing on the user and figuring out the technology to facilitate the best experience later. More efficient keyboard layouts do exist; however, most believe the time for that change has long passed and that the ingrained QWERTY layout will live on. Long live QWERTY!

An interesting consequence to consider is that a seemingly small improvement could have accumulated into a gargantuan productivity gain. Who knows how the world would have been impacted by such a gain - maybe we'd have the flying cars everyone envisions? Of course, it could also be the case that overall productivity would be worse off: had the production of typewriters been halted in anticipation of improved technology, it would likely have been far more damaging to the world's productivity. Often an MVP is released to swiftly validate the idea and iterate - it's user-first. By contrast, the release of the typewriter appears to have been a technology-first release. Had it been an MVP, I believe the QWERTY layout would have been altered once the technology advanced, although it may have advanced at such a rate as to allow QWERTY to become an axiom of the typewriter anyway.

Having considered two inventions and the technology-first aspects of each, it seems appropriate to conclude that a solely technology-first approach cannot be optimal: if an invention isn't beneficial to the user, it will never catch on. A solely user-first approach is also unlikely to be promising, and could constrain the invention to either a poor or only marginally better solution. In all likelihood, the technology-first and user-first approaches are biconditional. Successful inventions require a combination of both, and not necessarily in equal amounts. Technology-first invention is still of paramount importance, and whilst a novel application of some new scientific discovery may not itself change the world, with a bit of luck and the right users, you can.