Men & Women & Careers in IT.


(OK, this is the third time I’m writing this. The first time it was in the comment form of a blog. For some reason, it just swallowed the message without posting it, blanking the edit box, and then giving an error saying the edit box was empty. So I tried again. The second time, not trusting a webpage textarea, I wrote it in a text editor so I’d have a copy if the website continued to be difficult. But I wrote it on the train into work, so I couldn’t post it immediately. I just closed up my laptop with the article unsaved in Crimson Editor. When I got to work and tried to upload it, my laptop refused to come out of hibernation, forcing me to power cycle it (twice). Text lost again. By the time I was halfway through the second version, I decided I should post it to my blog as well. Fortunately, Live Writer makes saving very easy (you get to skip that “Save File” dialog), so this time it may actually see the light.)

Sara Chipps (aka “Girl Developer”) just wrote an article about women in software development.

I see it as a bit more complicated.

Over my 20+ (gag) years as a professional software developer, I’ve noticed an interesting thing about the male:female ratio amongst software developers.

  • For Chinese developers, it’s very close to 1:1.
  • For Indians and Russians, it’s about 2:1.
  • For American-born developers of Western European descent, it’s around 20:1.

And that’s counting project leaders and other managerial roles. If we limit it to just coders, it gets close to triple digits — and it’s only that low because I worked with four fine American women programmers at one job back in the 80’s – before the big H-1B explosion.

I feel this is because programming ability, unlike being a doctor or lawyer, is not respected as a skill. Development isn’t a job one aspires to; it has become just another dead-end job for those who couldn’t hack med school.

Part of the problem can be traced to the fact that most Americans have absolutely no clue what a “computer person” does. They may not be able to perform surgery, but they do have a general idea what a surgeon is doing, and they can tell the doctors from the orderlies. But very few people know the difference between a computer programmer and a computer operator (“It’s the difference between writing a novel and running a printing press”). Most literally treat the ability to get a computer to do something as if it were a form of Black Magic (and yes, I do mean “literally” there). You just type in the memorized incantations and the computer suddenly does your will. Like wizardry, it’s a trait you are born with, not something that can be taught and developed. Most depictions in movies and TV treat the skill as something that surprises even its practitioners: we know how to cast the spell, but not how the spell works. (Unfortunately, this is becoming true…)

This, of course, shouldn’t be surprising from an American populace that generally seems proud of its inability to do math, and treats anyone who can do even the simplest arithmetic in his head as a freak.

Then there is accountability: we feel we are able to tell a good doctor from a bad doctor, and maybe a good lawyer from a bad lawyer, but if you have no clue what a person does, how can you rate them? They consider the teenager who can throw together an HTML page as much of a “computer genius” as a compiler author (or they would, if they had any clue what a “compiler author” was).

Star doctors save lives; star athletes fill stadiums, and as such deserve huge salaries. However, star developers, in the public’s mind (and unfortunately in the minds of upper management at many companies hiring developers), can be replaced by nerdy 16-year-olds. Salaries have plateaued: according to one salary survey, a developer with 20 years of experience can expect to make only about 30% more than one with just one year of experience. In the same survey, a similarly experienced lawyer can expect double the salary of a beginner.

We have a profession that is not respected, is not considered a learnable skill, in which experience counts for little and job security is scarce, whose average salary grows more mediocre with each passing year, and which management believes can be outsourced to third-world countries.

Side note: the trend toward outsourcing that’s been underway since the 90’s has an interesting dynamic. Hiring seems to have been based on the theory: “Who better to work on the Black Art of programming than people from the mystical lands of India and the Orient?” Now, India does have one of the best engineering schools in the world, but only a very tiny percentage of the population attends it. It’s much like assuming that because I’m American, I must have attended Harvard. In fact, a far greater percentage of Americans are Harvard grads than Indians are graduates of the Indian Institute of Technology. On the other hand, tall tales of Far East mysticism go back nearly a millennium. So it seems that hiring has gone from being based on sexism to being based on racism.

So, the real question is, not “Why so few Women?”, but actually, “Why so many men?”.

From what I’ve seen answering questions on various programming forums, it appears that every young boy who wants to start programming does so so that he can write, as his very first program, a First-Person Shooter game. And why not? Programming is Black Magic. Writing Grand Theft Auto is no more difficult than writing Hello World, right? Recently on StackOverflow, someone asked about writing a game. He mentioned that he wanted to write everything himself, instead of using a framework, because he “wanted it to be fast”. I had to explain that game frameworks are written by teams of experts in the field with, collectively, decades of experience in micro-optimizations to squeeze every last cycle out of each video card, so if he had any hope of his game being fast, he’d better use a framework.
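The point isn’t specific to games, and it’s easy to demonstrate with any expert-tuned library. Here’s a minimal Python sketch (my illustration, not from the StackOverflow exchange): a hand-rolled bubble sort, the “write everything myself” approach, against the built-in sorted(), which is a heavily optimized C implementation (Timsort) written and tuned by experts over many years.

```python
import random
import timeit

# The "write everything yourself so it's fast" approach: a hand-rolled
# bubble sort in pure Python.
def bubble_sort(items):
    items = list(items)
    n = len(items)
    for i in range(n):
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

# The "use the framework" approach: the built-in sorted(), a tuned
# C implementation of Timsort.
data = [random.random() for _ in range(2000)]

naive = timeit.timeit(lambda: bubble_sort(data), number=3)
tuned = timeit.timeit(lambda: sorted(data), number=3)
print(f"bubble sort: {naive:.3f}s  built-in sorted(): {tuned:.4f}s")
```

On 2,000 random numbers the hand-rolled version loses by several orders of magnitude, which is exactly the gap between a beginner’s first attempt and code produced by specialists.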

So, where are most women and many men turning to instead of software?

Well, if the Reagan/Bush/Bush era (and to a lesser, but still real, extent the Clinton era) has taught us anything, it’s that workers are scum. Only the very top of the ladder has any hope of true success. When evaluating career paths, unless someone has a “calling” to a particular job (actor, teacher, priest), the choices based on career potential basically run: doctor, lawyer, CEO. That’s where the money is. Developer has become a job you “fall into”, just slightly above being promoted from store clerk to assistant manager.

So, what can we do about this?

Damned if I know.

I suspect that high school biology and social studies classes help us appreciate the skills of doctors and lawyers. As far as I know, HS classes on computers are largely limited to using MS Word and Excel, teaching us to appreciate the secretaries we don’t have anymore.

So, should high schoolers be required to take a semester of programming? That would be nice, but I figure if you add a required course, you’ll have to drop an existing required course, and I’m not sure what I’d give the heave-ho to. What’s more, programming isn’t even my top choice for a course to add to the basic curriculum; recent events have shown that that slot clearly belongs to a course in Personal Finance.