On Computer Science

15 Sep 2017

I wrote my first computer programs in 6th grade for a school “science” fair (anything remotely scientific qualified): two C programs, one to convert temperatures between Celsius and Fahrenheit, and one to solve a quadratic equation given its coefficients.
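Those programs were little more than a formula wrapped in input and output. The originals are long gone, but a converter of that sort looks roughly like this (a sketch of the general shape, not the original code):

```c
#include <stdio.h>

int main(void) {
    double t;
    char unit;

    /* Read a temperature and its unit, e.g. "100 C" or "212 F". */
    printf("Enter a temperature and unit (C or F): ");
    if (scanf("%lf %c", &t, &unit) != 2)
        return 1;

    if (unit == 'C' || unit == 'c')
        printf("%.2f C = %.2f F\n", t, t * 9.0 / 5.0 + 32.0);  /* F = 9C/5 + 32 */
    else if (unit == 'F' || unit == 'f')
        printf("%.2f F = %.2f C\n", t, (t - 32.0) * 5.0 / 9.0); /* C = 5(F-32)/9 */
    else
        printf("Unknown unit '%c'\n", unit);

    return 0;
}
```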

I was not impressed. I thought programming was dull and artificial. When a computer could go from bits and bytes in silicon to printing English text on my monitor, a seemingly incredible feat, why did we need humans to write code to teach it how to use arithmetic operations (1st grade math) to compute algebraic functions (8th grade math)?

I gave programming a second shot in the summer before 9th grade. I made more progress this time – learning the C language to a commendable extent, up to structures and pointers – but was surprised by the dryness of the process. So much effort to pick up a constructed human artifact, a mere tool for expressing computation. In my mind, the ideas worth learning were the ones discovered, not designed: ideas such as the wave-particle duality of light or the fundamental conservation laws – beautiful, physical truths about our universe, expressed in mathematics, that would exist even if no human had uncovered them.

My culminating project that summer was a Sudoku solver. Given a valid initialization of the Sudoku grid, my program could use the two or three basic heuristics I knew to output a solution to easy puzzles. Intrigued but not fascinated, I put programming on the back-burner once again.
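The solver is also lost, but the kind of heuristic I mean is the “naked single”: if an empty cell has exactly one digit that does not already appear in its row, column, or 3×3 box, fill it in. A sketch of one pass of that rule (the grid representation and function name are mine, not from the original program):

```c
#include <stdbool.h>

/* One pass of the "naked single" heuristic: for each empty cell (0),
 * mark every digit already used in its row, column, and 3x3 box;
 * if exactly one digit remains possible, write it in. Returns true
 * if any cell was filled, so a caller can loop until nothing changes. */
bool naked_single_pass(int grid[9][9]) {
    bool progress = false;
    for (int r = 0; r < 9; r++) {
        for (int c = 0; c < 9; c++) {
            if (grid[r][c] != 0)
                continue;
            bool used[10] = { false };
            for (int i = 0; i < 9; i++) {
                used[grid[r][i]] = true;                                  /* row    */
                used[grid[i][c]] = true;                                  /* column */
                used[grid[r / 3 * 3 + i / 3][c / 3 * 3 + i % 3]] = true;  /* box    */
            }
            int candidate = 0, count = 0;
            for (int d = 1; d <= 9; d++)
                if (!used[d]) { candidate = d; count++; }
            if (count == 1) {
                grid[r][c] = candidate;
                progress = true;
            }
        }
    }
    return progress;
}
```

Easy puzzles yield to repeated passes of rules like this; harder ones need search.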

The breakthrough happened in the summer after my freshman year of college, in 2014. The previous summer, I had learnt the basics of making apps for my iPhone 4, my first smartphone. I loved music, and I loved sharing it with friends and family – so I began work on LinkMeUp, an app to make sharing links fast, simple, and personal.

2014 was the height of the social-local-mobile boom. The idea that two students in a dorm room could build something for their phone almost as valuable as an old-guard car company struck me as bizarre and incredible. I was joining a revolution.

I think my biggest shock, and probably the biggest shock for anyone who builds something new for the first time, no matter how humble, was that it worked.

I had a vision for what this app would do, and after a few weeks of coding, I was using it – my family was using it, it was real.

There was instant gratification. I had an idea for a feature – say, forwarding links – and after half a day of hacking in Xcode, I had it in my hands.

For the first time, programming blew my mind. That I could build a module on my phone that could pull from anything on the Internet – from YouTube to my own web server – with just a single line of code, and that could communicate with a clone of itself installed on another phone in another country, simply by reading and writing messages to a database somewhere in the “cloud” – was almost as exciting to me as the idea that the same theory that explains nuclear power explains the Sun and stars and the mechanics of space and time.

I had felt some angst freshman year. Why was I studying computer science? I liked using mathematical models to explain real phenomena – an idea that underlay my interest in physics in high school, and my interest in economics early in college. What was computer science about? Why were computers, a human creation, worth studying?

For me, that summer with LinkMeUp was the turning point. At the risk of sounding a little clichéd, I knew at that point what I wanted to do for the rest of my life, and I haven’t looked back since.

I’m now many years down my journey with computers, and I’ll still admit that computer science, even at its deepest, isn’t as profound or as beautiful as physics.

But fascination is not what draws me to the field. It’s a more mature appreciation that it is computer scientists who helped put man on the moon, placed the world’s collective knowledge in a small device in my pocket, and put two billion people and counting on a previously faceless Internet. It’s an understanding that it is computer scientists – inventors and engineers, not thinkers or writers – who will tackle the big challenges of the 21st century, from slowing climate change to eradicating lethal diseases, from unlocking clean energy to scaling access to top-tier healthcare and education to the world.
