Joel Spolsky, Snake-Oil Salesman
If there is a lecturer in TCD’s CS department who doesn’t know of the problems and issues Joel just raised in his Capstone Projects post, they’re a rare bird indeed. But what Joel hasn’t mentioned — and what those lecturers can tell you, because they’ve been debating it for decades, writing papers on it, holding conferences and publishing peer-reviewed journals on the topic, as opposed to Joel’s one blog post — is that there are very specific and very good reasons why CS and CEng undergraduate courses don’t get to cover all the industry tools Joel uses.
To give a brief and non-exhaustive list:
- Undergraduate courses in CS and CEng are not there to teach industrial tools, but basic principles, usually ab initio to students just out of secondary school (high school for the US equivalent courses). This has implications:
- Everyone must work solo. You can learn to work in teams later (and certainly there are team projects all through the four years of the CS and CEng courses in TCD) but until you have a grasp of the fundamentals, team projects are worse than useless as they mask the student’s problems with the basics.
- No student is expected to graduate and be able to walk into an industrial role the next day without supervision or training, and no student has ever been expected to do that in Engineering since the first undergraduate course started in TCD in 1841. That’s why we have mentoring, why we have CPD processes, why we have Chartered Engineer (or Professional Engineer) titles granted by postgraduate programmes, it’s why there’s an entire structure there that’s been built up over hundreds of years of experience. Experience that we have paid for with lives in many cases.
- Everyone needs to work on the “interesting 10%” and leave the boilerplate code for later. If we had ten years for an undergrad degree, you can be very sure it’d be covered in detail, but we don’t. And four years only sounds like a long time if you’ve never taught such a course before and are missing details like the fact that the entire field of Computer Science has to be covered in those four years.
- Undergraduate courses lose their technical currency in something like five years on average (obviously different sectors age at different rates – web programming has a very fast cycle, embedded systems a very slow one). If we started students off on the latest fad language in year one, they’d graduate with an obsolete skill in year four. So instead we serve students better by choosing languages which allow the lessons to be taught clearly, or which are at least well-established and unlikely to vanish into obsolescence before they can graduate. That’s why moving basic courses to Erlang or Haskell is probably a poor idea.
- There is no such thing as the agreed best practice in industrial work. Some favour Agile methods, some regard them as toxic. What works in one sector of the industry will leave your business uncompetitive in another. What is a minor issue in one application will actually kill people in other applications. And different industry sectors have different governing legislation. So if Industry, that much-vaunted crucible where only the best practices survive the trial of the invisible hand of the free market, cannot come up with a single code of practice, what exactly would Joel have the universities teach?
- There is a duty of care to the students. There are many evangelists out there who promote one form of Agile methodology over another, who promote unit testing, who promote pair programming, who promote Scrum, and so forth. These methodologies are without doubt all very interesting – but they’re also unproven. If I’d started teaching students in 2005 with whatever the Agile Methodology De Jour was, by the time they graduated in 2009, that methodology had a fairly low probability of being unaltered, let alone the dominant industry methodology. That’s four years of a student’s life, committed to a methodology based solely on the hype its originators could muster together. That’s not just poor teaching and an invitation for justified lawsuits, it’s downright unethical and wrong.
Normally in an article (or blog post, in this case) like Joel’s, the last few paragraphs are where the author traditionally presents a solution to the problem he has elucidated. Joel has highlighted a perceived lack of experience with current industrial tools and practices amongst college students (a faulty perception, but nonetheless). He’s pointed out a root cause (poor time management) which is fair enough – it’s not the sole cause, it’s not even a primary one in most professional evaluations, but it’s a valid contributing factor. So now’s the time for the solution, yes?
But here is where Joel merely says “have the students use my product to track their usage of an unproven methodology”. No indication as to how we would approach the problem of explaining the relevant industrial methods, or how we’d select the specific one we’d teach, or why we should select it, or where it’s best applied and where it’s best avoided, what its strengths and weaknesses are, and so forth — just pure old-fashioned, ladies-and-gentlemen-this-product-cures-all-that-ails-ya, snake oil salesmanship.
Our students may indeed have poor time management skills on long timescales (a failing which comp.risks and The Daily WTF and IT Project Failures have been pointing out in industrial programmers for many years now, which to me indicates that industry does not necessarily have much to teach here). At least when Limoncelli wrote Time Management for Systems Administrators he was putting forward a set of skills that had proven to work for him in the field, and he was trying to pass on lessons learnt the hard way. I might not use the book as a college textbook (though I do use it myself personally), but I can at least respect his intent there. But Joel’s solution isn’t to teach better skills; it’s to sell his software tool to those students. Let’s throw ethics out the window for a moment and say we do just that. Now what? I can sell you the best chisel in all creation Joel, but I can’t make you into Michelangelo by doing so – I can just take some of your money and give you a tool you don’t know how to use in return.
Frankly, when I teach the CS7004 students how to use VCS and ticket tracking, it won’t be using FogBugz or any other proprietary system, it’ll be using systems they can explore without having to pay fees, like Mercurial and Trac. And if afterwards they need to go learn Git or Jira, they’ll know the fundamentals and it’ll take them only a few hours to make the transition. That is what university courses are for: to teach the fundamentals so that the students can pick up specific skills far more rapidly and evaluate tools according to their proposed use. Not to act as a captive and uncynical market for proprietary software tools.
And even after that, in the final paragraph, he admits himself that it won’t work – that you still require a manager to come in and do time management even in industry. That Scrum won’t teach you time management. That the only difference between college students and industrial programmers is that the latter have time management enforced upon them by a manager – in which case, were that the truth (and it’s not a universal truth, and I’ve witnessed that first-hand), then there would be no problem with college students whatsoever.
Gah. Well, perhaps it’s time I should exercise some time management techniques that I should have used years ago — and simply stop reading Joel.
You’re right, college is supposed to be about fundamentals. But lots of schools aren’t even doing that correctly.
Yes, not every assignment can be in teams, but having some team projects at all helps. Nothing I’ve ever done in a professional setting (save my startup, for obvious reasons) has been alone.
Mentioning bug tracking or VCS in some fashion would be helpful. Yeah, don’t teach them FogBugz, but at least an explanation of the general concepts and how to use them would be nice.
Teaching them the inner workings of Scrum might be silly, but some sort of insight into agile vs waterfall vs whatever would certainly be useful.
There has to be some kind of balance. I don’t think that college should be vocational school, but some level of preparedness would be nice.
But all of what you just described was in my undergraduate course over here, Steve – and that was sixteen years ago, for Pete’s sake. It is not a new idea. And yes, it’s not perfect, it’s always evolving – but it’s been that way since 1841, and the day it stops changing, the day we say “yeah, that’s as good as we can get it”, that’s the day the industry dies on its feet even if it doesn’t notice it at first.
And this is being academically studied in depth – there are whole journals by the IEEE and the ACM on the topic, there are multiple conferences, hundreds of academic papers every year, lots and lots of deep, peer-reviewed thought on the subject has been going on for decades.
I once read an article by Joel Spolsky titled ‘The Guerrilla Guide to Interviewing’ which struck me as particularly asinine, but I thought that perhaps it was an outlier. Later on I read a few of his blog posts and promptly crossed him off my list. Nowadays, if someone recommends a piece by Spolsky, the first thing I think is that maybe I should cross the recommender off my list too.
You have a button with your name I can put on my shirt. Really, I do admire you.
there is a trend for people like spolsky to try to convert a computer science education to a trade school education. computer science exists because it’s a wonderful academic blend of math and science and it is not a 2 year technical degree (with absolutely no disrespect toward 2 year degrees). “fogbugz” has nothing whatsoever to do with computer *science* joel (asshole).
Hahaha hilarious. I find most of Joel’s ideas mediocre and not even really “ideas” – more rehashing of what others have done better, as you say. He comes up with the occasional gem, but I think he should spend more time fixing the bugz in his buggy software — it would lend him more credibility.
The sole purpose of Joel’s advert is to suck people into arguing with it, citing it and advertising it for him as you just did.
However, I do have one question for you: I am having great success writing commercial parallel programs for technical computing using F#, thanks to inlined higher-order functions and the wait-free work-stealing task queues of the TPL. AFAIK, no free software supports this style of programming even though it is already being used in industry. How can you expect to teach it without using proprietary industrial technology when everything open source is already obsolete in this context?
I really think you’re onto something, so I really hate to do this, but I’m commenting to correct your spelling. It’s not “Agile Methodology De Jour”, it’s “Agile Methodology Du Jour.”
In my opinion the correct answer is to stop reading Joel. Limoncelli’s book is awesome but you don’t need him to tell you that.
I stopped reading Joel a while back. When an appreciable number of his posts started to take on the odor of an advert, it was time to “Put the blog down, Sir, and take a step back”.
That being said, I have to disagree that undergraduate education in computing sciences cannot provide more industry-oriented preparation. I’m curious what “web courses” that could’ve been taught 5 years ago have lost their currency in 2009. Javascript? Nope, still here. CSS sculpting? Nope, still here too. HTML crafting (sorry, had to pause there)? Ennngh, no. Now if you were talking about something like FrontPage, I wouldn’t consider that in the category of something that would be the center of a “web course”. Tools, perhaps, but technologies? What programming language that was widely in use 5 years ago has become utterly useless for a recent grad? I honestly can’t think of one, and please do tell me, as I’m sure to tell some of my fellow classmates not to bother with them (I returned to the university after 19 years in the industry to pursue a Masters degree).
I’m not saying that in some software engineering course Scrum should be THE only agile methodology espoused to the undergrads, I’m saying a survey of them should be covered as an essential requirement of completing their studies. Regardless of how much “supervision” a recent grad might be perceived as needing (and I’ve seen some who did, and more than a few who needed no such hand-holding), bestowing on them the (as another commenter put it) fundamentals shouldn’t consist of only things like predicate calculus, recursion unrolling, and algorithm analysis. Let’s be honest, the vast majority of CompSci grads aren’t going to go on to graduate programs, and it’s a disservice NOT to more adequately expose them to the realities of how “the industry” works.
Sorry, but one course in undergrad software engineering, and a senior project alone doth not preparation for working in the industry make.
I’m sure I’ll get my share of detractors, but articles elsewhere have made the same arguments, or similar. I wish I had the citation. In case someone might remember, there were two gentlemen, well-known professors who’d worked on military-grade software systems.
Joel has some nice ideas on management, but when he ventures outside of that realm and into technical topics, he’s a complete travesty.
Well, I think the author has missed the whole point.
I don’t like Joel either but I think he’s not saying what you think he’s saying.
The point is academia should be closer to industry.
I think it’s more related to what you do CS for. Someone, maybe, is doing CS to be an academic and do research and come up with an algorithm better than Dijkstra’s.
Most of us, really vocational, are there to change the world through technology, building software that makes people’s lives better.
And college should be more oriented to doing just that, how do you ship a product that can indeed make a change?
And I think that in fact most of the teachers (not all) don’t have a single clue how to do that.
When a wise man points at the moon, the fool looks at the finger.
Cheers
good post..
Joel is an over-hyped (over-commented) blogger ..most of his ‘ideas’ are untested, unproven dictums questioning the established just for the sake of questioning..
when i started reading his blog initially, i found it interesting and insightful. Now it seems, he has run out of ideas and posts because the world knows him for his posts..
an undergrad course is meant to touch upon topics so that students have a general understanding.
if someone wishes to specialize, one may go on to postgraduate study…(not sure how many postgraduate courses teach Joel’s tools)
about time management… my college always .. always starts sessions/semesters on the same day every year… declares results on same dates every year
And don’t forget how to survive the interminable meetings you will attend as a non-student. I swear that was the worst part of working after school (and that was 20+ years ago!).
Dang, I forgot my other complaint about Joel’s recent diatribe – that no CS profs ever worked in the *real* world. That is inconsistent with my upper-division classes. One guy (data structs, I think) used to regale us with his days at GM. Another (my graphics prof) I ran into doing contract work at Mentor Graphics. I think most had worked “in the real world” on more than one occasion.
Jon,
Given a solid background in the functional paradigm, concurrency and parallel algorithms, would you have had the right tools to learn the F# technical computing work you do?
Things like revision control software, ‘methodologies’, and paradigms (OO, functional, relational) are widespread and long-standing parts of our industry. The context of their history, evolution and purpose will lead students into the science beyond the reach of evangelists. My undergraduate education failed in this regard, and left me without the context to vanquish the traditional accounting/consulting company bureaucratic process mantra.
There is an important trade to software development. However, skilled tradespeople are undervalued at present in the Anglo world. So social factors prevent us pursuing an approach which acknowledges this, and we need to stick to creating professionals for the good of those educated.
How appropriate that an article railing against snake-oil salesmanship would get a comment trying to sell another snake-oil, namely, F# (or perhaps just some flying frog consulting).
F# is just a .Net implementation of (Oca)ML. Every idea in F# had its genesis (and working implementation) in academia and open source. To claim that “everything open source is already obsolete” is absurd and reveals either your lack of knowledge, or your lack of honesty. “TPL” is just a .NET library, a particular way of doing concurrency. It is not fundamental. It is unnecessary and immoral to chain students to (usually inferior) commercial products as you suggest.
And let’s be honest about F#. Whatever cool things you may be doing with F#, the reality is that F# will do little more than enable a small number of Microsoft-centric programmers to claim to be doing “functional” programming just as MFC enabled (much larger numbers of) them to claim to be doing “object-oriented” programming in the 90s.
Excellent article, Mark.
“…There is a duty of care to the students. There are many evangelists out there who promote one form of Agile methodology over another, who promote unit testing, who promote pair programming, who promote Scrum, and so forth. These methodologies are without doubt all very interesting – but they’re also unproven.”
Did you mean TDD? I might agree that TDD has some problems, but unit testing (or testing in general) is not unproven, it’s a fundamental aspect of CS.
*clap clap*
Reading this post has made my head lighter, has raised my spirits and made the world a better place in general for me. 😀
I appreciate your view, I can understand your frustration, and clearly Joel can step over the line with pimping his software from his bully pulpit.
But. Joel has a little bit of a right to pimp his software and methods. He’s a pretty smart guy. Every point you made is valid, and sure, quit reading Joel if you want.
But. I’ve learned things from Joel, thought he was a quack, felt glad for his contribution from the pro programming community, and… still value him.
I hope he comments on this post. I’d like to hear what he has to say.
“I think that it’s extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don’t think we are. I think we’re responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. Above all, I hope we don’t become missionaries. Don’t feel as if you’re Bible salesmen. The world has too many of those already. What you know about computing other people will learn. Don’t feel as if the key to successful computing is only in your hands. What’s in your hands, I think and hope, is intelligence: the ability to see the machine as more than when you were first led up to it, that you can make it more.”
Alan J. Perlis (April 1, 1922-February 7, 1990)
Joel has started a company and earnestly wants to be a bible salesman. Developing boring ‘project management’ software does not interest everyone and is in no way doing any good for the development of computer science. Colleges and universities are meant to produce people who could propel the development of this field further. We want our universities to strive to produce the next Vint Cerf and not a Joel Spolsky.
Knowing how to use Subversion etc. is of lesser importance, and teaching how to use such specific software should be left for these ‘fogcreek’ type ‘institutions’ to teach in their internship programs.
Universities should focus on more important things like understanding the theory of computation and other long-standing, brand-neutral fundamentals.
Writing a GUI is not at all important as far as developing ideas is concerned. It’s the underlying program logic that gets computed that is of the main importance. Developing a GUI is of no value for data compression software if you can’t implement the ‘compression’ part.
All the professors know nothing and Joel knows everything, huh. Why does Joel feel as if the key to successful computing is only in his hands!
Right on. Many “schools” (we like to call them “universities”) do not have separate CS and SE curricula, industry has been conditioned to accept that CS degrees indicate training as a developer, and academia is not doing much to make its CS courses more vocationally relevant OR to correct industry’s perception.
Even worse, from an academic perspective, is the amount of SE research being performed by CS graduates who don’t have a proper background in engineering or management (yes, management skills are a fundamental part of civil, electrical and mechanical engineering and part of the BOK for those professions).
To put it bluntly, no-one hires a physicist to build a bridge; CS must either be vocationally relevant or limited to academia and places where science (as opposed to engineering) is truly required.
I hate to repeat myself, but the entire post above was based on the idea that our CS courses are close to industry needs. But they’re not precisely what industry needs today. It’s the duty of care again. The university owes it to the student investing four years of their life to not teach them something that a company in industry thinks is highly important today – and may well drop tomorrow. The university is ethically obligated to take a longer-term view, over the career of the student, and prepare them for that as best they can.
As to the capability of the teachers, I’m not sure where you went to college, but it’s not the case in most colleges that the lecturers don’t have a clue. Certainly in my department we’ve got a large number of lecturers who have not only had lots of experience in industry, but who’ve been successes there. The founders of Havok lecture here, for example. Yes, there are one or two who haven’t been to industry – but they’re a small minority, not the majority.
I was referring to TDD, BDD and DbC and the other variations on the theme. Unit testing is definitely considered a mainstream practice, though not yet a fundamental one (It’s not been around long enough and tested long enough for that).
As someone who started working professionally before I went back to get my degree I feel I have a different perspective I’d like to share.
To address your three implications in order:
“* Everyone must work solo. You can learn to work in teams later (and certainly there are team projects all through the four years of the CS and CEng courses in TCD) but until you have a grasp of the fundamentals, team projects are worse than useless as they mask the student’s problems with the basics.”…
-Right. 90% of your work in the real world will be in teams, if not more. The classes I have learned the most in have been classes that focused on hands-on experience with problems too difficult to tackle solo in the time allotted. Have them learn team-work but also require them to apportion the project by component with clear responsibilities and require the use of version control from the beginning. I cannot tell you how frustrating it was to have to train a person with a bachelor’s in C.S. who told me “I don’t like version control. I never had to use it in college and I was fine.”
“* No student is expected to graduate and be able to walk into an industrial role the next day without supervision or training, and no student has ever been expected to do that in Engineering since the first undergraduate course started in TCD in 1841. That’s why we have mentoring, why we have CPD processes, why we have Chartered Engineer (or Professional Engineer) titles granted by postgraduate programmes, it’s why there’s an entire structure there that’s been built up over hundreds of years of experience. Experience that we have paid for with lives in many cases.”…
-Why not? A student SHOULD be able to walk into an industrial setting and start work without more supervision than the other workers. If they can’t, then that’s a failing on their part and the part of the institution. If what they learn in a university is not applicable to the profession, why learn it? I remember one professor early on in my college experience who stated “The waterfall method is the only real design method, all the others are fads.” I pointed out (as I said, early in my college experience 🙂) that wasn’t how we were actually doing it, and it wasn’t how any of our competitors or partners were doing it either, and was ignored. I have more recently had classes that focused on team work, actual problem solving, and choosing whatever methodology you feel suits the problem – IF you can defend that methodology choice.
“* Everyone needs to work on the “interesting 10%” and leave the boilerplate code for later. If we had ten years for an undergrad degree, you can be very sure it’d be covered in detail, but we don’t. And four years only sounds like a long time if you’ve never taught such a course before and are missing details like the fact that the entire field of Computer Science has to be covered in those four years.”…
-Four years is a long time, if not wasted trying to spoon-feed techniques by rote memorization. Teach students ‘how’ to learn, ‘how’ to research a subject, and the basics of how language constructs work rather than the specific syntax of a language. Let them use any language, but require them to defend their choice by listing the trade-offs as they relate to the problem.
-As for working only on the interesting bits, I see several problems with that. First and foremost, if they only work on the interesting bits they do not see how those bits work together except in theory – which means they have not internalized the concepts. Second, much of their work in the field of commercial programming will be writing boilerplate code, or if they are good and in a good shop – code to generate boilerplate code. They need to know how to do that, how to debug systems, etc. As a final thought, why is it I have never seen a course devoted entirely to the art of finding and fixing bugs, probably one of the most fundamentally important aspects of any programmer’s job? Why is it that highly useful computer security courses are optional upper-division electives instead of both an advanced upper-level version and a freshman “Always keep in mind the following patterns of potential vulnerabilities in your code, and here’s how to fix them” course?
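To make that last idea concrete, here’s a minimal sketch, nothing more, of the kind of vulnerability pattern and fix I have in mind (the buffer size and function names are invented purely for illustration):

```c
/* overflow_demo.c -- a sketch of one classic vulnerability pattern.
 * The buffer size and function names are invented for illustration.
 */
#include <stdio.h>
#include <string.h>

/* BAD: strcpy copies until it hits a NUL byte, with no bounds check.
 * Any input longer than 15 characters writes past the end of buf. */
void vulnerable(const char *input)
{
    char buf[16];
    strcpy(buf, input);
    printf("%s\n", buf);
}

/* BETTER: snprintf never writes more than sizeof(buf) bytes and
 * always NUL-terminates, truncating over-long input instead of
 * corrupting the stack. */
void fixed(const char *input)
{
    char buf[16];
    snprintf(buf, sizeof(buf), "%s", input);
    printf("%s\n", buf);
}

int main(void)
{
    fixed("this string is far too long for a 16-byte buffer");
    /* vulnerable() with the same input would smash the stack. */
    return 0;
}
```

That’s the whole lesson: one bad pattern, one safe replacement, and why. A freshman could absorb a catalogue of these in a single course.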
=Bill.Barnhill
To sell, yes. To advertise, yes (within guidelines we lay down in law for good reason). To pimp, or as in this case, to hawk it as a solution to foist on college courses? No, that’s over an ethical line.
Besides all of which, along with his right to sell, everyone else has a right to comment…
Hey, have you guys read the post?? Now I think Joel’s last post on multithreading, C++ and duct-tape programming was really off the mark and sounded like a repost from the nineties. But this one’s not so bad.
He was full of praise for Greg Wilson’s efforts in getting students to contribute to open source projects. Greg Wilson, by the way, is the guy behind Software Carpentry, which is trying to get other scientists to adopt computer science approaches to problem solving.
Hey, what Joel is saying is similar in intention to what projects like Google Summer of Code are trying to achieve for students!!
I worked in industry (mainly for investment banks, as I had an advanced degree in Econ/Statistics) before I did my CompSci degree. I found to my surprise that very few of the professors could actually code or really knew about database design (RDBMS/other), embedded systems, etc., although they had written books on the subjects! Indeed their source code betrayed all the signs that they were amateurs, in not addressing issues like maintainability, bug reduction, performance etc. And this is code they wrote for department/campus systems!! That said, my comp. sci. degree and time in academia were still useful as they helped to broaden my horizons beyond the industrial stuff. Anyone for natural language processing, Prolog, neural networks…
Eventually (believe it or not!!) you may need to use them!!
Bill,
- They shouldn’t be able to walk into an industrial job unsupervised. They also shouldn’t need more supervision than other workers who are just starting off. But the point was, your 21-year-old CS graduate is not going to start work on Monday and be the lead software architect on a medical software product by Tuesday. We have moved over to the university model of educating students in the basics, but we have still kept some of the trappings of the apprenticeship model as well (because, in the main, four years is simply not enough to train people properly – which is why engineering in TCD is heading down the road of a five-year course, starting by having most students flow from the undergrad to an MSc course, but eventually it’ll become a five-year course, it’s inevitable).
It’s not about their technical skill. It’s about judgement, it’s about learning the workplace they’re in (and every one has its own quirks), namely its specific practices and procedures, and it’s about their seasoning, for want of a better word.
– Four years is a long time to a highly-motivated, highly intelligent and curious student with a love for the subject. It is not a long time if you’re taking a wider spread of students (as is mandatory because of the level of industrial demand and the reality that not every developer is a Sergey Brin). It’s even shorter when you consider that you’re taking in ab initio students, and teaching everything from maths to numerical methods and estimation to SQL and database theory, to systems languages and embedded programming, to object-oriented programming, to procedural programming, to functional programming, to compilers, to building a computer from the ground up with a handful of chips, a wire-wrap tool and some wire, and a lot of patience and swearing. There is a lot in those courses and you can’t cherry-pick two genius-IQ-level students every four years and run with just those guys, you have to take anyone who meets the basic math and academic requirements and who expresses an interest. (Which isn’t to say you don’t try to push the genius level students as hard as you can, but they’re two out of hundreds on average every year).
– The electives tend to be on more focussed things than the mandatory courses. Everyone needs to know how to do recursion; not everyone thinks they need to know SQL or AI methods. There’s simply not enough time to make everything mandatory that we think might be useful. Believe me, we wish there was.
He’s full of praise for Greg – and then he dumps all over the students for not producing masterworks, then blames the colleges for not teaching them how to produce masterworks, then says the problem is that college students don’t do time management and college courses don’t teach it, then says Scrum may fix that (and then hawks his product)… and then follows up by saying actually, every programmer is lousy at time management (so that there’s no real difference between college students and industry programmers) and that in industry, they have to have managers come in and enforce time management on the programmers.
And you sound like you had a bad experience with your college professors. Mind you, when you noticed that their code wasn’t really maintainable and had performance issues or weird edge-case stuff — were you standing in front of a class of 200 students who were less than six months out of secondary school (or high school in the US) trying to explain recursion for the first time to them at the time?
So I guess the proven way is to click your way through it? Or if it’s an embedded system the user should start emulating sensor data? That’ll be fun.
Do you teach students that every time they make a change they should make all the possible clicks?
Building does not exist without testing. Unit testing is just a consistent and efficient way of testing your application/system/whatever.
Of course he has to end with the basic need for managers. He’s a manager.
George, just because something isn’t thought of as “fundamental” doesn’t mean that it’s thought to be useless. I wouldn’t teach someone TDD or BDD because I don’t know if they’ll be around in five years, but I would teach unit testing because it will be. It’s just not fundamental the way that C would be, or estimating complexity or other fundamental things. Give it a few years and it probably will be.
But don’t go asking folks to risk four years of other people’s lives on something you believe. Go back to university and teach what you know – then take on that ethical choice yourself.
Well, we obviously have a different set of rules. I would teach someone to “always test your code” just as I would teach him to “always free resources”. I can’t imagine saying to students “understanding what a pointer is is important, but making sure it won’t segfault is something you’ll learn later. Just screw everything up for now.”
Even if I take “only proven courses should be taught” for granted, I can still name a few dozen things I learned at university that are now completely useless or obsolete. I agree that universities shouldn’t be jumping on to teaching the new hotness, but this coin has another side: students learn obsolete stuff just because it was proven 20 years ago and the university hasn’t caught up.
Having said that, I completely agree that Joel went over the line.
Jon,
Please take a look at Scala and/or Erlang, which both support inlined higher-order functions and are both definitely not obsolete. (Actually Erlang was developed before C++.)
ilan
@Mark
“They shouldn’t be able to walk into an industrial job unsupervised. They also shouldn’t need more supervision than other workers who are just starting off.”
Technically, you’re right there. But frankly, having counseled more than a few of the fellow classmates around me who are walking out with just their BS/MSCS into the job world, THEY ARE, more and more, expected to be able to walk in and hit the ground running without major supervision, even in some of the more junior roles. Sounds ridiculous, but I’ve heard it enough times from the few who’ve found jobs that they’re immediately overwhelmed at how “it really is” out there.
“Never take advice on computer science from a seller of bug-tracking software.”
Very well written and insightful post.
Thank you,
Kent
I read Joel’s article very differently to you. First of all, and most importantly, I didn’t feel like he was advocating the use of any one tool. In my fourth year soft eng course (by coincidence, at UofT, now Greg Wilson’s school), our prof insisted we use a version control system, but he didn’t care which one. Most of the students had never heard of it, didn’t see the need for it and hated his insistence that we use it, but he was right. We also had to track all our bugs – again, not on any one product, and lots of us used notepad/emacs. But they had to be tracked. And we had to have a software development plan: agile, waterfall, whatever, he didn’t care, but we had to decide and write it down. (You guessed it – students hated that too). When Joel says “this might be a neat opportunity to use Scrum,” I read this just as a suggestion in this spirit – what’s important is that *some* method be decided upon and used.
This stuff seems to me to be almost as fundamental to being a good software engineer as knowing your complexity classes and algorithms. And CVS is almost as fundamental as C at this point. I think it’s totally legit to insist that they be taught in university.
Secondly, you ask another commenter: “were you standing in front of a class of 200 students who were less than six months out of secondary school trying to explain recursion for the first time to them at the time?” But the article specifically suggested that this be part of the last year of university, not the first. Personally, I think an introduction to the basics of the tools themselves might be in order even earlier, perhaps by insisting that source code be submitted as part of a git repository. Not because git is important, but because it teaches you to check in changes.
And as for him hawking Fogbugz – I guess my wetware spam filter just filters that out.
5 years ago, writing your website in Perl was a totally reasonable decision. Today, most people wouldn’t even think of it in the face of RoR, Django, or even PHP.
I’m not saying this disproves your post, but I think it should give you pause.
Neither can I. However, to carry your analogy back to the unit testing topic, what I’m saying is: we don’t teach how to use Boost’s pool library or how to use Hoard, but we do teach malloc() and free(), and not only that C doesn’t catch an out-of-bounds array access, but also why it doesn’t. Likewise we wouldn’t teach TDD or BDD, but we would teach unit tests and why they’re written and what they’re meant to do and what their strengths and weaknesses would be. Do you see what I mean?
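Just to be concrete about it, the sort of thing I’d put in front of students is a minimal, framework-free test like this sketch (the function under test, clamp, is invented purely for illustration):

```c
/* test_clamp.c -- a minimal, framework-free unit test sketch.
 * The function under test (clamp) is invented for illustration;
 * the point is what a unit test is for, not any one framework.
 */
#include <assert.h>
#include <stdio.h>

/* Function under test: constrain v to the range [lo, hi]. */
static int clamp(int v, int lo, int hi)
{
    if (v < lo) return lo;
    if (v > hi) return hi;
    return v;
}

int main(void)
{
    /* Each assertion pins down one piece of expected behaviour. */
    assert(clamp(5, 0, 10) == 5);    /* in range: unchanged  */
    assert(clamp(-3, 0, 10) == 0);   /* below range: clamped */
    assert(clamp(42, 0, 10) == 10);  /* above range: clamped */
    assert(clamp(0, 0, 10) == 0);    /* boundary value       */

    puts("all clamp tests passed");
    return 0;
}
```

Once a student understands why those assertions exist and what they do and don’t guarantee, picking up any particular framework or methodology on top of that is the easy part.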
No, it’s not.
First off, you’ll find a large body of people (myself included) who’ve worked with CVS and been badly bitten by it and who know its limitations and think it’s only still about because of the inertia of legacy systems and because there are worse solutions. Basing a course on a non-distributed VCS would be a mistake in my personal opinion.
Secondly, CVS has nowhere near the level of testing C has, nowhere near the size of a knowledgebase or userbase, and nowhere near the level of importance to a graduate.
All that said, yes, I think a VCS is a critical industry tool, one that should be taught in university courses, and that’s probably why it is taught there, though usually after things like recursion and control flow and a basic grasp of a language. That’s why I said so elsewhere:
Maybe I’m too old to comment (my CS degree was 1980).
As others have stated (maybe far more clearly), I think the university curriculum needs to remain tool/methodology agnostic while still teaching the fundamental principles and theory (e.g. students should understand unit testing, but it’s not necessary to teach test-driven development per se).
That being said… During my 4 years, I learned far more while working at the computer center help desk than I learned in any of my courses. First and foremost, I learned how to read and debug other people’s code. As part of that process, I also was exposed (at least informally) to collaborative development. Therefore, I can appreciate Joel’s point regarding the need for team projects.
Hi Mark, when it comes to VCS, I think we’re saying almost the same thing. Perhaps comparing CVS to C was an overstretch, but what I’m saying is really important is that students must know what VCS is and have at least a passing knowledge of how to check code in and why that’s important. My school didn’t cover this until 4th year, and I suspect would not have covered it at all except that my one teacher insisted.
As for distributed versus non-distributed – I don’t have a strong opinion there. I’d be surprised if a reasonable student familiar with non-distributed VCS couldn’t pick up DVCS at a later date, but it’s probably reasonable to teach DVCS in the first place.
Mark, I can’t figure out how to reply directly to your comment, my apologies.
None of that stuff was in my program. I went to a fairly large school, as well. You must have been lucky. I didn’t see any of those things at all.
Before I start what has turned into quite a rant, let me say that I am not a Joel fan, but I do agree with at least some elements of his post here.
Those practical elements might be in your course, but they’re not in a whole lot of other courses, and where they are covered that I know of, it’s not done well. I too had a capstone course that purported to explain version control systems and agile versus waterfall development, but it really only paid lip service to these concepts and very few students came out understanding the benefits of those systems. In fact, I distinctly remember one team that said version control made their project harder because they couldn’t work concurrently on files. That is an utter failure and did more harm than good to impressionable minds. The basic fact is that some practical concepts need to be more than just *taught*, they need to be underlying foundations of an entire curriculum.
I perfectly understand the argument that teaching towards the practical end can quickly put knowledge out of date, but for some practical skills, it is the WRONG attitude. Science students have to learn lab procedure, many Engineering students have to learn CAD. Computer Science students should learn at least version control, automated testing, and possibly some project automation (build tools). These are things that have been used in industry in some guise or another for decades and they are in no way becoming less relevant or outmoded. I don’t think anyone (even Joel) is arguing that particular instances of these tools have to be used.
I mean, for the love of God man, no one taught me how to get a C program to dynamically link against a shared object in any of my dozens of courses that used C and C++, let alone how to properly write a unit test or manage a project with version control. I mean, I went to a fairly prestigious private university, and that is a personally embarrassing lack of common practical knowledge. I don’t know what it’s like at the institution you teach at, but elsewhere the willful decision to not devote any focus to practical knowledge is producing students SO useless in any professional capacity that they might almost have been better off not attending college, just so that they could have spent 4 years learning practical skills from people they could interact with online. I do value the theoretical knowledge I learned, and I managed to come out on the practical end okay, but only because I read diatribes from people like Joel and learned where my institution was failing me.
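For the record, here’s roughly how small that missing dynamic-linking lesson is: a minimal sketch, assuming a POSIX system with dlopen (link with -ldl on Linux; the library name libgreet.so and its function greet are made up for illustration):

```c
/* main.c -- sketch of loading a shared object at runtime.
 * Assumes POSIX dlopen; on Linux, build with: gcc main.c -ldl
 * The library "./libgreet.so" and its symbol "greet" are
 * hypothetical, e.g. built from greet.c with:
 *     gcc -shared -fPIC -o libgreet.so greet.c
 */
#include <stdio.h>
#include <dlfcn.h>

int main(void)
{
    /* Open the shared object at runtime. */
    void *handle = dlopen("./libgreet.so", RTLD_NOW);
    if (!handle) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    /* Look up a symbol and cast it to the expected signature
     * (the POSIX-recommended idiom for function pointers). */
    void (*greet)(const char *);
    *(void **)(&greet) = dlsym(handle, "greet");
    if (!greet) {
        fprintf(stderr, "dlsym failed: %s\n", dlerror());
        dlclose(handle);
        return 1;
    }

    greet("world");    /* call into the shared object */
    dlclose(handle);   /* release the library */
    return 0;
}
```

Ten minutes of lecture time, tops, and no one ever showed it to us.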
And don’t you even dare knock Haskell. Those are some of the few people in the entirety of computing that are actually going out of their way to cleanly unite theory and practice. While they’re avoiding success at all costs, they’re moving the state of practical computing towards theoretical soundness far more than I can say of any of the pure academics I’ve ever met.
Now I’m angry, harumph!
I think we’re agreeing with each other all right, Adrian, and the differences are down to us having experience of different universities.
As to VCS-v-DVCS, I’d prefer to teach both, or at least that both exist. Like I said, start with RCS and SCCS, move on to CVS, then on to Mercurial and Git and look in each case at what they’re good at, where they’re best used and when they’re going to be a pain in the touche.
It sounds like you got the unpleasant end of the stick there Steve 🙁
I’ve been struggling with this question over the past 3 years myself as I finally went back to school to finish that CS degree I never got.
I was struck by several things (in most of my classes):
1) Whether it was required or not, we are told not to work together, as that is a violation of ethics and we’ll get booted. So, in most of my classes there are no team projects – ever. I would love team projects, if not to help me, then to be able to pass on some of my industry experience to fellow students. But no, that is forbidden in 85% of my classes.
2) Agile development, version control, etc., the tools of the trade are either: not taught; available to learn on your own; available as a higher-level course (that is optional, of course); or not even mentioned. This still amazes me, that there isn’t some introductory class on working in a team environment. We have a basic C/Unix tools course, which teaches us the basics of getting stuff done in Unix, so why not something for pair programming, team-work, project management, etc.?
I’ve waffled on solutions to this. Does it mean we have to have two tracks? One for scientific programmers who just want to continue on and do fun research-y things, and another for those of us who just want to code? Or should we intersperse classes like the Unix tools class, except for team programming? At what point do you make these required? At what point do you say that in order to take 300-level courses you must have this course before moving on?
I have a feeling the answer lies somewhere in the middle. I know that I missed a lot of fundamental stuff when I didn’t get my degree, and quite honestly I’m glad I went out into the world and did programming first and came back to learn the underpinnings. I’ve enjoyed my classes so much more because of it, so maybe that’s the answer. Have them take a year of classes and then throw them in the job pool and teach them all the tools they’ll need, then have them come back. They’ll be better for it.
@Larrik,
“5 years ago, writing your website in Perl was a totally reasonable decision. Today, most people wouldn’t even think of it in the face of RoR, Django, or even PHP.
I’m not saying this disproves your post, but I think it should give you pause.”
It gives me pause when I think of the CEO I recently met who is so entrenched with Perl that he DOES do website development in it (in his words, because he can. Glad I didn’t take that job, hmm.). So while it may sound like an unreasonable proposition, others would certainly be right to say “there’s better tools”. Ahhh, but what was it that led you to make such a veritable statement… practical experience. Which was garnered how? Hands-on experience. So, no, it doesn’t negate anything in my argument.
What might bake YOUR noodle is if you think that in 5 years people will say “Django? That’s preposterous, we’d never think of creating a web site with that framework.” What I was espousing was not drilling A specific language/framework/methodology to the exclusion of all others, but that MORE exposure to any (or AS MANY) should be included. If that means that you have to convince the Uni to drop a couple of humanities from the coursework to better prepare your engineers… so be it. I’ve seldom heard a good engineer quote Milton accurately anyway.
As a software developer, the only value my university education provides is an extra bullet at the bottom of my resume. 99% of what is relevant to my career was picked up outside of university through self learning.
The CS professors were way behind the students and taught either overly simplistic concepts, or outdated/irrelevant materials.
‘Struth! Well spoken, sir! Spolsky irritates like a glass shard in my shoe.