Reading about this topic on reddit set my mind to thinking about the practicality of a computer science degree. As many have complained, the issue is that a fresh CS graduate may not be able to code up even a half-assed solution to some trivial real-world problem. Sure, knowing the difference between an AVL tree and a red-black tree is an interesting bit of trivia, but that knowledge is almost useless for parsing files or writing dynamic web applications. How important is the theory, how important is the real world, and what role should education play?
I take a very practical approach when it comes to work (as opposed to the tumbling-down-the-stairs approach I take toward women). I’m the kind of guy who reads articles on Lisp and thinks, Man, these people must be really bored with their jobs if they need to make things more challenging by coding in some esoteric puzzle language. I got my undergrad in Computer Engineering (!Science) because I wanted a challenge and exposure to real-world applications. To me, the fact that a person can successfully earn a comp sci degree but be a floundering infant in a real job is a definite problem.
I have seen at least one person point out that a distinction needs to be made between computer science and software engineering. We expect CS grads to be software engineers, but they aren’t; even something fundamental like source control is a foreign concept to many of them. They are trained to memorize and regurgitate some abstract proof by induction using an incomprehensible jumble of ∀, ∃, ∧, and ∨ symbols, not to create maintainable solutions that are of use to society.
At this point we could get into a philosophical debate about the purpose of college education. The historical mindset seems to be that we are taught theory that will create a solid foundation for the on-the-job vocational learning that occurs after graduation. We don’t attend college to learn Perl or how to grant privileges on a MySQL database; we attend to learn the tuple relational calculus. I’m certainly not going to make the ignorant argument that theory is unimportant (tuple relational calculus excepted, of course), but a CS degree becomes so much more useful when it is coupled with vocational aspects.
Some changes that would help the situation:
- Group projects should either be eliminated or made more realistic. What actually happens now is that the one or two members who care wind up doing all the work. Throwing together people of vastly different abilities and priorities, with no command hierarchy, and telling them to sort out a project together, is one of the most absurd ideas in education.
- Students should be given enough programming projects in a variety of languages that they can easily solve trivial problems in a language they have never seen (given appropriate documentation). If a student has enough exposure to C, Java, and Ruby, then that student should be able to pick up PHP without much difficulty.
- At least one database class should be required.
- Students should develop/save/submit projects using version control.
- A class on security should be required, including hands-on examples (e.g. hack this box running an old, unpatched version of phpBB); see the SQL injection sketch after this list.
- Implementing a web-based project should be mandatory (at this stage in history, anyway). Students should know how HTML, CSS, JavaScript, your favorite programming language, and a DBMS all fit together; see the servlet sketch after this list.
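To make the security point concrete, here’s a minimal sketch of the classic SQL injection lesson in Java/JDBC. Everything here is hypothetical for illustration (the `users` table, the connection URL, the credentials); the lesson is the contrast between gluing user input into a query string and using a parameterized query.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class LoginCheck {

    // VULNERABLE: user input is concatenated straight into the SQL string.
    // Passing name = "' OR '1'='1" makes the WHERE clause match every row.
    static boolean naiveLogin(Connection conn, String name) throws SQLException {
        String sql = "SELECT id FROM users WHERE name = '" + name + "'";
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            return rs.next();
        }
    }

    // SAFER: a parameterized query; the driver treats the input as data, not SQL.
    static boolean saferLogin(Connection conn, String name) throws SQLException {
        String sql = "SELECT id FROM users WHERE name = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, name);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next();
            }
        }
    }

    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details; requires a MySQL JDBC driver on the classpath.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost/testdb", "user", "password")) {
            System.out.println(naiveLogin(conn, "' OR '1'='1")); // true: injection succeeds
            System.out.println(saferLogin(conn, "' OR '1'='1")); // false: input taken literally
        }
    }
}
```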
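And since the web-stack bullet is the one I feel strongest about, here’s a bare-bones sketch of how the pieces fit: a Java servlet that answers an HTTP GET, pulls rows out of a database, and writes HTML (with a stylesheet link) back to the browser. The `posts` table, the connection details, and the URL mapping are all hypothetical, and a real app would pool connections and escape its output, but the request → query → markup flow is the part students need to see end to end.

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical example: mapped to a URL like /posts in web.xml.
public class PostListServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        out.println("<html><head><title>Posts</title>");
        out.println("<link rel=\"stylesheet\" href=\"style.css\"></head><body><ul>");

        // Hypothetical database and table; a real app would use a connection pool
        // and HTML-escape the titles before printing them.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:mysql://localhost/blog", "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                 "SELECT title FROM posts ORDER BY created DESC");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                out.println("<li>" + rs.getString("title") + "</li>");
            }
        } catch (SQLException e) {
            throw new ServletException(e);
        }

        out.println("</ul></body></html>");
    }
}
```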
From what I have seen and heard, CS educators are already starting to become more progressive along these lines. This field moves fast, and knowing how to type a Java merge sort algorithm into Notepad.exe isn’t going to cut it. Students need a healthy dose of current technology and trends along with the theoretical backing behind them. Computing is a vital field – a world-changing field – and it’s time to stop handing our graduates a blank sheet of paper when they walk across the stage.