Is Computer Science necessary or useful for programmers?

UK developer Mike Hadlow brought the wrath of the internet down upon him earlier this month when he suggested that not everybody can be taught to program. His post is an oblique swipe at the coding "bootcamps" that have been popping up in the last few years to address the programmer shortage that the Western world is purportedly facing. You might get the impression that this "how fast can we make any random person into a professional programmer" mindset is a relatively new phenomenon, but I've been observing it since at least 1998. That was when I was working, as a programmer, for an insurance company that rolled out a two-week training course in programming concepts for any of its non-programming employees who were interested in moving over to programming. My boss at the time commented sarcastically, "or they could spend four years in college like all of us did." So... is two weeks enough? Is four years too many? Does a degree in computer science (or anything else) teach us anything useful about programming?

I don't think anybody would argue that you don't need some sort of educational jumpstart if you want to start programming computers, either as a career or even as a hobby. If you put some random person in front of a computer and say "get started", they're not likely to produce anything you're going to be very happy with. So an aspiring programmer needs at least some basic education or training in computer programming; the question nobody seems to agree on is just how much. My own computer education started when my father brought home our first personal computer in 1982. It was a TI 99/4A (catchy name, eh?). It didn't do much, even by early 1980s standards. You could hook it up to a standard TV for output, and you could plug in a tape recorder to act as storage. It didn't come with any programs pre-loaded except for a BASIC interpreter, along with its manual. My interest back then was in playing video games; since it didn't come with any, I decided I would try my hand at writing one myself.

I learned a lot about the dialect of BASIC that the TI 99/4A shipped with. When we upgraded to a Commodore 64 a few years later, I re-learned C64 BASIC. I coded some rough games, but along the way I realized I was having a lot more fun writing these games than playing them (the fact that the gameplay was pretty lame was probably a contributing factor there). I spent a lot of time looking at other people's example programs to try to figure out how certain effects were achieved. One that really fascinated me was a 3D spinning cube example. When I looked at its source code, it seemed to work its magic by using a couple of mysterious functions called sin and cos. I looked them up in the manual, but the official explanations weren't all that helpful: "computes the sine/cosine of x (measured in radians)". I asked my mother what that meant and she told me that it was trigonometry, but that she didn't know how it worked. I looked forward to some day learning trigonometry so that I could finally understand the 3D spinning cube.
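
The mystery, it eventually turned out, was nothing more exotic than rotating each vertex a little further every frame. Here's a minimal sketch of the idea — in C rather than the original BASIC, and with names that are purely my own illustration — rotating a single point around the origin using sin and cos:

    #include <math.h>
    #include <stdio.h>

    #define PI 3.14159265358979323846

    /* Rotate the point (x, y) around the origin by 'angle' radians.
       A spinning-cube demo does essentially this to every vertex,
       nudging the angle a little further on each frame. */
    static void rotate(double x, double y, double angle,
                       double *out_x, double *out_y)
    {
        *out_x = x * cos(angle) - y * sin(angle);
        *out_y = x * sin(angle) + y * cos(angle);
    }

    int main(void)
    {
        /* Spin a single "vertex" through one full revolution in 8 steps */
        for (int step = 0; step < 8; step++) {
            double angle = step * (2.0 * PI / 8.0);
            double rx, ry;
            rotate(1.0, 0.0, angle, &rx, &ry);
            printf("step %d: (%6.3f, %6.3f)\n", step, rx, ry);
        }
        return 0;
    }
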

I started to bump up against the limits of what C64 BASIC could do performance-wise, so I started looking for alternatives. At that time, for the Commodore 64, the only options were BASIC or assembler. I got my hands on the Merlin64 assembler and its programming guide and started to dive into 6510 assembler programming. Assembler was radically different from BASIC — although coding in either was called "programming", interacting directly with the machine was an eye-opener. I remember being very frustrated and confused about how to declare a "variable" in assembler, since that was so fundamental to basic programming. When I finally had the eureka moment that a variable was just space in memory, a lot of programming concepts seemed to fall into place ("so that's what that 'stack' thing is for!").
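
That realization is still visible in C today. Purely as an illustrative sketch (my own example, not anything from the Merlin64 guide): every variable is ultimately just a named address somewhere in memory, and locals live in space carved off the stack for each call:

    #include <stdio.h>

    /* A "variable" is just a named chunk of memory. The & operator
       exposes the address that the name stands for. */

    int counter = 0;            /* a fixed location reserved at load time */

    void bump(void)
    {
        int temp = counter;     /* a location carved out of the stack for this call */
        temp++;
        counter = temp;
        printf("counter lives at %p, temp lives at %p (on the stack)\n",
               (void *)&counter, (void *)&temp);
    }

    int main(void)
    {
        bump();
        bump();
        printf("counter = %d\n", counter);
        return 0;
    }
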

When I started college in '87, I had to pick a major. I knew I wanted to get a job after college as a programmer, so I picked what seemed like the natural choice: computer science. Back then, a college discipline called "computer science" was relatively new, and I figured that since I already knew (or so I thought) quite a bit about programming computers, such a degree would be a breeze for me. I knew that I kept running into performance and maintainability problems with the programs I was trying to write, and I was looking forward to learning all the tips and tricks the pros used to make their programs work perfectly the first time, be easy to change, and run as fast as possible.

To my surprise, that wasn't what they taught me at all. They started out by prohibiting the GOTO statement (what?! How could you write programs without GOTOs?) and teaching me a structured programming language called Pascal. They made me write out flowcharts for my programs before I started typing them into a computer. They also taught me my coveted trigonometry, along with algebra, calculus, physics, statistics and probability. All of these new concepts kept bending my brain and challenging my initial arrogance: that I knew more or less everything there was to know besides a couple of function calls I hadn't come across yet.
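
Purely as an illustrative sketch (in C rather than the BASIC or Pascal of the story), here is the same count-to-ten loop written both ways — the goto habit I'd brought with me, and the structured form I was being taught:

    #include <stdio.h>

    int main(void)
    {
        int i;

        /* The BASIC habit: jump back to a label until done */
        i = 1;
    again:
        printf("%d\n", i);
        i++;
        if (i <= 10) goto again;

        /* The structured version: the loop's shape says what it does */
        for (i = 1; i <= 10; i++) {
            printf("%d\n", i);
        }

        return 0;
    }
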

I struggled with the coursework, and questioned it at times — I did know how to write programs, after all. They might have been slow, they might have had bugs in them, and they might have been impossible to make changes to (including bug fixes), but surely learning how to differentiate polynomials or invert matrices wasn't helping me write bug-free computer code in any language, much less the impractical, academic Pascal language that they insisted I use.

I stuck it out and finished my degree, though. I did go to school with a few people who dropped out and found good-paying programming jobs, but my mother convinced me (God bless her) that in the grand scheme of things, having that "piece of paper" would make a difference. I started working and, as I suspected, I didn't apply any of the stuff that they had taught me at college. Instead, as a professional programmer, I had to learn things that they never talked about in school: source-code control, SQL and networking, to name a few. Nobody ever asked me to so much as factor a quadratic equation or prove that an algorithm was or wasn't NP-complete. If I ever spent time trying to analyze the algorithmic efficiency of a particular routine, my boss would certainly have chastised me for wasting time on premature optimization. I felt ill-equipped to do my job. In spite of all the education I'd spent so much time, effort and money on, I still wrote code that had bugs I had to go back and fix. People would ask me inexplicable questions like "how long will it take to program this?" How long? I don't know... I'd spent the last four years either being told how long I had (if I was working for a grade) or just working on something personal at a leisurely pace, as and when I found it interesting.

More than 20 years (and a bonus master's degree) later, I'm still programming for a living. There's a "programmer shortage" now just as there was back then — or so I'm told. Whenever I'm involved in a hiring decision, plenty of experienced, degree-holding applicants apply, but we're still told that there are far more programming jobs available than there are programmers to fill them. "Programming boot camps" aim to fill this gap. The idea is to strip away all of the waste of a traditional college degree — advanced math, science, humanities, history — and replace it with more practical topics like how to use a debugger and the intricacies of CSS. By short-circuiting programming education and focusing only on the bare minimum of what a programmer needs to know, the thinking goes, we can churn out all the programmers the computer programming "industry" needs (and maybe even then some).

Maybe I'm just a grumpy old man (well, OK, I'm definitely a grumpy old man, but I may still be correct in spite of that), and maybe I'm just engaging in wishful thinking that the six years I spent obtaining a master's degree in computer science weren't a complete waste of time and money. But as much as a jump-start programming boot camp would have appealed to me when I was eighteen and ready to make lots of money as a superstar programmer, I have my own reservations about how effective such a program can really be. The mindset behind it seems to be "what's the least amount of learning I can undertake in order to be competent?" That sort of ruthless drive for efficiency is better suited to management "science", where people focus on maximizing ROI and counting every penny rather than producing a beautiful craft. As I look back at the history of, well, pretty much everything, it seems pretty consistent that if something is easy to learn or do, it's not particularly valuable. Yet there appears to be a pervasive belief that programming is easy to learn and easy to do, and somehow simultaneously valuable. This seems unlikely to me; it would be surprising if programming were the first human endeavor that was both easy and valuable. I freely and openly admit that almost nothing I studied in college was directly, vocationally applicable to my chosen career — but I still feel that the experience made me better at what I do.

It's been the position of academia for as long as there has been academia that college is not there to help you get a job, but instead to educate you — in other words, help you become smarter than you otherwise may have been, by exposing you to new concepts and ideas. It just so happens that this process has routinely been observed to correlate highly with professional ability. If I'm right, and academic study is more worthwhile to an aspiring programmer than targeted "this is the sequence of steps you go through to accomplish task X" training, then one would imagine that non-programming pursuits like learning to speak a foreign language, or to read music, or to play chess — all the things that are traditionally associated with intelligence — would make you a better programmer, in an indirect way. Similarly, learning calculus — as brain-stretching an exercise as I've ever come across — would either enhance or at least demonstrate intelligence.

The truth is, there's no generally accepted way to measure the performance or ability of a software developer — in fact, I think that's one of the things that separates programming from the skilled trades. If it's the case that smarter people make better programmers, as is undisputedly the case in so many other professions — elitist and snobby though I may be accused of being for suggesting so — then a college education will make you a better programmer, because college is designed to unlock and enhance raw intelligence and teach you how to learn most effectively.

That's not to say that self-taught programmers can't produce great programs without jumping through the hoops of a four-year (at least) college degree — I've worked with programmers I respected who had only a high school education. Like I said, I taught myself almost everything I know about actually producing code that solves a problem and serves a purpose, and I had taught myself a fair bit of it even before I started college. And if I could do it, I'm sure there are lots of other people who can. I've done well enough on standard measures of intelligence like GPA and GRE scores that I'm at least qualified to leave the house without a football helmet to protect me from accidents — but there are still people smarter than I am, who I'm sure could learn everything I've learned in half the time. As far as I can see, though, a capable programmer won't need a programming boot camp, and an incapable programmer won't be turned into a capable one by attending one. Neither will a computer science degree turn an incapable programmer into a capable one, no matter how good the instructors are — but a computer science degree will make a capable programmer a better programmer. There's an underlying philosophy — a theory — that pervades all software development, regardless of the programming languages, tools or methodology used. Computer science is the study of that underlying philosophy, with a bit of practicality sprinkled in; "quick-start" boot camps are all practicality with a bit of theory. In the long run, you'll be better off with the theory than with whatever happens to be in fashion today.

Although it may well be that our intelligence is fixed, determined by our DNA like eye color or height, and fundamentally unchangeable, we can increase our general knowledge. I know that you can learn all of the stuff that a programming bootcamp might propose to teach you "on the job" because I did — and I also know that I never would have learned calculus or linear algebra or even trigonometry if I hadn't been exposed to it in an academic setting. One of the biggest regrets I have now that I'm older is that I didn't take advantage of more of the learning opportunities that I had, back when the only responsibility I had was learning. I find it somewhat ironic that many of the people who argue that a computer science degree is irrelevant to the practice of programming — because it spends so much time on topics unrelated, or only tangentially related, to computer programming — are the same people who will turn around and anecdotally point out that some of the best programmers they've ever known had backgrounds in something like philosophy or journalism.

So, will a computer science college education make you a better programmer? I think it will, and I don't think it's unreasonable for programming to join the ranks of professions like law, medicine or even accounting where a four-year degree is a bare-minimum requirement for entry. Maybe we do need lots of programmers to face the challenges of the 21st century, but we don't need lots of "just enough to get by" programmers.

Merv, 2020-05-01
This was an excellent read, and I'm enjoying many other posts here.

I'm an instructor at one of these "bootcamps" and have struggled with this question often. How much can you teach someone in a few weeks? What's the trade-off between training for practicality and teaching theory? Just how important is intelligence in determining outcomes?

In my experience, bootcamp training can never replace 4 years of higher education. Rather than intelligence, perhaps the most important (immediately useful) trait gained at college is autodidacticism. That is, regardless of the degree, an industry programmer should be capable of teaching themselves new tricks. The discipline, persistence, and curiosity needed for self-driven, antagonistic learning seems to me the best indication of future (early) success. It can take a while for computer science theory to become more important than the ability to produce clean, functional, and timely code, and until that time one can get lost in a sea of flavor-of-the-month technologies.

A bootcamp geared towards college graduates, focused on those professional skills sometimes not taught in school (VCS, DBMS, networking), might work if it's not a replacement for college. At that point it would resemble an internship more than a bootcamp, though.
Josh, 2020-05-07
I suspect that every programmer is a self-taught programmer - it's just that some of us got a degree from a college, too.
NigelS, 2022-04-27
An interesting article, which I enjoyed.

The gentleman referred to at the start is indeed correct that "not everybody can be taught to program". The unfortunate nature of biology means some are far better at it than others. That said, as you observed, the idea behind these "Learn X in 24 hours" courses is simply to find people who can churn something out; so in that sense, everybody can learn to program... to a certain degree.

I guess the real question is why a person who becomes a programmer actually thinks they'll be producing anything of personal value at all? I mean, when you're paid a certain salary to turn up to a large institution to go to work with your 'team' all hastily churning out code - where will you find your value? How will you not be disillusioned? Naturally, I cannot speak for all programmers, but this sentiment has become increasingly common as I've gotten older, so perhaps there is something to it.

My own reasoning has led me to the conclusion that I'm unlikely to treat the career I once so desperately wanted as anything other than a money-making mechanism. Implement function X(), get government currency units (GCU). Migrate infrastructure to AWS, get more GCU. But then again, I've always seen it this way, and saved my real pleasures for my personal time. At the moment I get more satisfaction trying to analyze a simple algorithm, thinking about why long division works, or implementing an RFC than I ever would from work. Why? Because I have the time to think deeply, and that is what I truly enjoy.

I'll give my own background here, as I think it pertinent:

I obtained a First Class Honours Degree in Physics from university, largely by just memorizing material - in most subjects, I had no deep understanding. The degree taught me that I loved to think, and think deeply. After I left university I made a leisurely study of my mat