Revolutions
14 Oct 2017
I have both a short-term/selfish reason and a long-term/altruistic reason for my interest in machine learning (ML). This post is about my short-term/selfish reason.
As many have said, we are in the golden age of machine learning. ML is going to be the most important force in the world for at least the next 30 years – a force as transformative as the personal computer and the Internet, and arguably more so than the smartphone or cloud computing.
Revolutions – like personal computers, the Internet, mobile phones, and cloud computing – tend to create an orchard of creative opportunity, replete with big challenges but also low-hanging fruit. This is why so many people rushed to start Internet companies in the late 1990s – including a young South African immigrant who, in 1995, passed on a physics PhD program to build a company that put Yellow Pages on the web.
In fact, some companies of the Internet era solved problems that didn’t exist before the Internet (Google), while others emerged much later, after the dust of the dot-com bust had settled (Facebook, Airbnb).
The biggest winners tend to be those groups that 1) enter early, 2) identify fundamental problems that have suddenly become solvable, and 3) execute well, and consistently so.
To make concrete what I mean by “big winners”, here are some examples:
- Personal computers: Microsoft (1975), Apple (1976)
- Internet: Amazon (1994), Google (1998), Facebook (2004)
- Smartphones: WhatsApp (2009), Uber (2009), Snap (2011)
Thesis: all great businesses are associated with revolutions. These revolutions are often sparked by a sudden fall in the marginal cost of some key piece of enabling technology, such as transistors or microprocessors. For lack of a better word, let’s call this process “democratization”. Revolutions are often abetted by the “commoditization” (the transition to undifferentiated price competition) of critical infrastructure, such as server hardware or mobile service.
Notably, smaller revolutions spawn companies too. GitHub, which was recently acquired by Microsoft for a cool $7.5 billion, was founded in 2007 by a group of Ruby developers in San Francisco, at a time when the distributed version control system git was first starting to get attention. Here, git – and open source more broadly – was the revolution, and Tom Preston-Werner and Chris Wanstrath were the hackers who first recognized the opportunity (1, 2).
This leads me to my short-term/selfish reason. There is an explosion of interest in machine learning in both academia and industry, and a wide spectrum of problems (“verticals”) to which it can clearly be applied. If you’re betting your future on anything, betting on an area of immense promise is as good a gamble as any.