Earlier today, Jeff Atwood of Coding Horror posted “Please Don’t Learn to Code” as a response to the recent initiatives to get people programming. Here’s a quick summary: he advocates against popularizing coding on the basis that most people have jobs that won’t require it directly. He cites the mayor of New York as an example.
I respect Atwood a lot. So I reread his article several times throughout the day, trying to understand where he was coming from.
I still can’t believe how much he doesn’t get it.
The motivation behind the “everyone should code” movement is not to make everyone into a professional programmer—obviously, that would create a seriously dysfunctional society.
Instead, its primary goals are (1) to teach critical-thinking skills, and (2) to help people understand the computers that run their lives.
(Please note that I am simply advocating for introductory programming courses. Beyond that, each additional hour spent learning to code yields diminishing returns for those skills.)
Programming teaches you how to think.
The act of writing software involves the dissection of a problem, a deep understanding of each of the problem’s components, and the creation of precise instructions for a machine that will solve the problem. Debugging a piece of software requires the programmer to come up with a list of all the things that could be going wrong, and to find logical ways to prove that each of them is functioning correctly.
How are these logical skills not useful for a mayor? Or for any other job, for that matter—unless you are microwaving burgers for McDonald’s, these skills generalize quickly and are useful in many other situations.
(Note that I use the word “programmer” loosely: it refers to anybody who knows how to program, not just those who do it as their day jobs.)
Put another way:
We encourage kids to play a musical instrument or two in order to stimulate their cognitive development: does that mean we want everyone to play for the Chicago Symphony Orchestra?
We push for increased science education in schools so that future adults can understand scientific and medical discoveries relevant to their lives: does this mean we want every student to become a molecular biologist?
We encourage people to go to the gym so that they can stay fit: does this mean we want everyone to be in the NBA?
No, no, and no. That is the superficial way of seeing things. We do these things not because they are directly useful, but because they offer indirect benefits. That the reasons aren’t immediately apparent does not imply the activities are useless or frivolous.
Microprocessors run your life.
When you press the power button on a TV, are you closing a circuit to power the TV? No, you are giving input to a piece of software on a microprocessor, which then runs the TV’s wake-up function.
When you press the gas pedal on your car, are you mechanically changing some part of the engine to make it run faster? No, you are giving an input to a program running on a microprocessor. You do not drive the car through the gas pedal; the microprocessor screens your input and allows your input to guide it in driving the car when it makes sense.
Currently, the United States has about 100 CPUs per person, a number that is only going to increase as more things become digitized. More CPUs implies that we’ll need more people to write software for them.
But that job will fall to a small minority of the population. For everybody else, this means that a general understanding of the computer’s mechanisms (computer literacy) will become critical to becoming a contributing member of society. How else will a judge make an educated ruling in a trial between software companies? How else will Congress understand that blocking a data-transfer protocol won’t end piracy? These are only a few examples of the importance of a “good enough” understanding through basic programming. It’s only going to get worse.
You can yell at me about this article through Twitter. I’m @kevinchen.