Atlantic Magazine: Women developed computer science but men stole it

The folks at Atlantic Magazine, who were smart enough not to study computer science, have produced a video about how women developed computer science (but then men stole it). One interesting aspect of the video is the equation of getting a bachelor’s in CS and working as a programmer with “developing computer science”. Also notable is the heavy focus on feeble 1970s and 1980s consumer products from companies such as Microsoft and Apple. The Atlantic folks seem to think that the 1980s were a period of “revolutionary” progress in software. In their view, it was not that software ideas from the 1960s became practical to distribute more widely due to improvements in microprocessor performance (made possible by hardware engineers). It was that MS-DOS, Windows, and MacOS were breakthroughs in computer science.

Readers: What does this video tell you about how non-technical people see the tech world? If you are involved in hiring at a tech company, does it motivate you to hire women?

28 thoughts on “Atlantic Magazine: Women developed computer science but men stole it”

  1. I think the Atlantic workers should have studied computers. They are in a dying and failing industry and probably don’t get paid as much as an average tech worker.

  2. When programming involved working 9 to 5 for large bureaucracies such as the military and the Social Security Administration and large banks, women did well.

    Once boys were able to get their hands on computers outside a structured corporate setting (at first in school settings and later with PCs) and let their spergy tendencies come to the fore, then girls had no chance.

    Why isn’t the Atlantic demanding that more women be hired as drywall hangers (.3% female) or diesel mechanics (.5% female)?

  3. Academically, a lot was discovered in the 60s and 70s, wasn’t it? Then forgotten because it was all on typewritten reports in some university basement.

    The GUI made it possible to write more attractive products, but I would say there have been two great shots in the arm for the field as such: Open source (GNU, BSD, …) and, of course, the web (and commercial internet). Mid-80s and early 90s, respectively.

  4. Tom: Remember that those “typewritten reports” were generally available on the ARPAnet via FTP. The significant theoretical advances were also published in academic journals. The significant practical advances were available as commercial systems, usually on mainframes or minicomputers. If a practical advance had been made at a university it would be available commercially a few years later.

    The only thing that I can think of that was put on the back burner and is now all the rage is a brute force approach to artificial intelligence. Faster microprocessors and GPUs have made neural networks/machine learning/statistics of practical interest today when they were just of theoretical interest in the 1950s. See https://en.wikipedia.org/wiki/Artificial_neural_network for some history.

  5. Stop trying to muddy the waters. Women developed silicon wafer refining, modern chip fab processes, and semiconductor design. All stolen by men.

  6. I know a lady who got a law degree, worked as general counsel for a very large tech consulting firm, and moved up to CEO of its North America operations. Now all those programmers and computer scientists are reporting to her. Getting a BA worked out pretty well for her.

  7. Basic contributions to CS are overhyped for some and ignored for others. Emil Post’s constructions have priority over Turing’s, but he is largely ignored except by some math departments. Colossus was conceived by Max Newman (not John von Neumann) and built by Tommy Flowers, who made it purely electronic rather than building it from slower electro-mechanical elements — the first of its kind. The Turing machine is an abstraction and a close copy of real automata, such as advanced mechanical clocks, that had already existed for centuries. Analog calculators and computers existed for centuries too, and they did not follow what came to be known as the von Neumann architecture, which John von Neumann generalized as an employee of the DoD — that was his job. It is not his most notable achievement, but it is a very important and public one.
    One could argue that all the effort that went into consumer application development hindered real progress by consuming resources that could have been used elsewhere: who needs to rewrite the same functionality for a new platform every decade or less? Sometimes new programming tools are clearly inferior to past ones. I am saying this as someone who learns the latest programming frameworks with ease and uses them to make a living. I am not advocating central planning here; this seems to be a result of hybridizing central planning with private profits.

  8. I worked for big corp doing hardware and systems design and then moved into management from 1968 until 1999. We hired the best (BSCS and BSEE) engineers available until 1985ish. That was 95% men. Then we started getting “quotas” for different groups: women, minorities, and so forth. So for several years we had to hire lots of women and minorities. We mostly hired very smart recent graduates from good schools. So it worked OK, but there were issues. The gals wanted instant promotions and equality with all the guys who had worked in the company for years. They had issues working small skunkworks projects. They worked great in big projects with tons of engineers, lots of paper, and long development cycles. Maybe better than many of the men. They were better organized and liked code reviews and quality reviews better than the men did. They were OK coders and good people handlers. They communicated better than men in many cases. But most of the gals had trouble handling uncertainty or guessing about things. So it was more complex setting up and staffing jobs.

  9. Little Richard was just as influential as Chuck Berry, but Chuck got paid up front and Richard is still wailing “they stole it and I never got a dime!” Richard would probably identify as a woman starting out today, so it’s true: the men always steal it.

  10. But I also saw a big change around 1990 in the type of engineer coming out of school. For about 15 years, from 1975 to 1990, lots of people thought engineering was cool. It paid well and was a stable career. So the cool kids in high school were going to engineering school. For a while it was common to hire an engineering guy and his wife together out of school, both of them great-looking, socially skilled people.

    Then the tide changed. Personal computers and Apple and Microsoft were creating “nerd kids” who were going to college. It was no longer cool to be an engineer; it was nerdy. So the smart gals no longer went to engineering school. So I figure female enrollment in engineering school went down and is still going down. Ladies do not want to be nerds….

  11. Mmm, perhaps it was so at MIT. I only recall seeing those reports in more recent years, before that it was a matter of scrounging up funding so the library could buy a copy, then get it some weeks or months later. When you looked at the publication history of a journal article, it was almost always two years or more between submission and publication. But happily, in our field, the conference proceedings are where the action is at.

    That also reminds me of one of the other great enabling inventions I forgot, the search engine. Nowadays it’s so much easier to keep abreast of things.

    I also got to gawk at a couple of TI Explorer Lisp Machines back in the 80s. OK, they might be lacking some things we take for granted now, but not everything has gotten that much better.

  12. I wonder if the change can be attributed to two factors: (1) being a programmer is a bad job, and (2) family law has changed so as to enable women to avoid taking bad jobs.

    Why is programming a bad job/career when we are constantly told about some highly paid programmers in Silicon Valley? The average pay is not particularly high. Where the pay may be higher is in parts of the country where a family-size house costs $2-5 million so the actual standard of living of a programmer is low. It is hard to keep a job as a programmer in the long run because employers tend to come and go. Losing a programming job after age 50 is often the end of a career, so even if the pay is higher than in some other occupations the career is shorter so the total comp may be similar. Finally there is the social aspect. Being a programmer means that you aren’t going to benefit from social interactions the way that, e.g., a salesperson would.

    The Atlantic video says that women stopped being coders in the 1980s and 1990s. This happens to coincide with changes in family law. http://microeconomicinsights.org/divorce-laws-and-the-economic-behavior-of-married-couples/ says that women responded to the availability of no-fault divorce and 50/50 property division by withdrawing from the workforce. Presumably they quit the crummiest jobs (e.g., programmer) first. No-fault spread through the U.S. during the 1970s. Starting in 1990 with the introduction of statutory child support guidelines it became possible to obtain the same profit from an out-of-wedlock child as from the child of a long-term marriage and child support following a short-term marriage became as lucrative as had been alimony following a long-term marriage. The woman who wanted to have the after-tax spending power of a programmer could have a casual sexual encounter with two or three programmers (depending on the state; see http://www.realworlddivorce.com/ChildSupportLitigationWithoutMarriage ) and then collect child support. There was no need to go into an office every day and work with virtual punch cards.

    Subsidized public housing also became a lot more attractive in the 1980s and 1990s. A “single parent” was no longer relegated to a “project” but instead might be in a standard apartment building that had to appeal to market-rate tenants. The material lifestyle of a single parent with a more interesting/engaging 30-hour/week job (plus means-tested welfare benefits such as subsidized housing) might be comparable to the material lifestyle achievable by a programmer working 60 dreary hours per week. Why would this lead to differential behavior by men (keep the coding job) and women (quit the coding job)? It is a lot easier for women to qualify for welfare programs, e.g., by having a baby and retaining custody of the child. See https://philip.greenspun.com/blog/2015/06/01/book-review-the-redistribution-recession/ for how single mothers cut their hours as welfare programs expanded over the last 10 years.

  13. For instance, ACM published the first ten or twenty POPL proceedings on a DVD. From what I recall, the papers therein were largely typewritten. TeX was released in 1978, and I guess you also needed a graphic display and perhaps a laser printer to get the most out of it.

    When I located the report where ‘macros’ were first proposed, I recall finding it as the scanned copy of a rather non-pristine original. There might have been coffee stains.

  14. Tom: It might be true for Millennials that if information can’t be found via Facebook then it doesn’t exist. But even before ARPAnet people who were interested in advances in Computer Science managed to communicate those advances via conferences, journals, books (e.g., Knuth, the Organick book on Multics, Garey and Johnson on NP-Completeness), telephone, face-to-face meetings, etc. Universities invested heavily in libraries, for example, and the ground floor of the MIT AI/CS lab was devoted to a great CS-only library with hardcopy tech reports from other universities as well as the standard ACM and IEEE journals and conference proceedings. I don’t think that there is a strong case to be made that knowledge dissemination today is substantially better. If anything it is worse because more stuff is kept secret by corporations. IBM had little incentive to keep any improvement secret. They were renting mainframes. Customers had access to source code. IBM had the mainframe market more or less monopolized and Digital had the minicomputer market mostly monopolized. IBM even started its own journals to tell everyone how they were doing everything (see this 1964 paper on the System/360). Good luck getting this kind of information out of Apple, Facebook, or Google!

  15. Oh God, the MIT privilege. I would have loved being there. I got my PhD in the mid-90s, but far away (Sweden) from the shining library on the hill. In my case, at least, access to old and new papers has greatly improved since then, even though I’m no longer an academic or researcher.

  16. The 1980s were great for software and lots of great programs (IMO) were introduced. PCs and DOS brought computing to the masses while Excel brought good data analytical tools so everyone could do spreadsheets. Corporations are run on this program and it replaced most of the big tools that formerly ran on IBM mainframes. Word and the PC killed typewriters and carbon copies and whiteout. Together they totally reinvented modern office work and how business is conducted all over the world.

    The 2000s have been great for software as well. Cell phones are the latest platform that has spread all over the globe. Smart phones with apps (programs) for messaging and real time video and business data are changing where we work. People now do business at home or on the beach more efficiently with better real time data and tools than going to work. People no longer need an office to work. Corporation office suites and cube farms are becoming obsolete. People can be all over the globe and still work together and interact with each other efficiently.

  17. The Atlantic folks seem to think that the 1980s were a period of “revolutionary” progress in software.

    Funny, for me software in the 1980s was a huge step backward from reliable time-sharing and minicomputer systems to PCs and Macs that crashed routinely. People who worked through that generation of operating systems still have the involuntary twitch of hitting the “save” key every few seconds.

    Planet Money had an insightful episode on why women disappeared from Computer Science programs in the 1980s.

  18. Google is a limited solution to a problem that arose in the mid-90s, and as a development documentation tool it is a step back from the comprehensive electronic documentation of, say, the early 90s (available to every developer). I would argue that for the average Joe or Jane a programming skill is still profitable: a skilled programmer can live in a log cabin somewhere in the Rockies, working remotely, and make less than in Silicon Valley or full-time at a large corporation, but still around twice the median US household income. There are many employers that do not care about the age of their remote employees. Of course, if someone without inherited wealth wants to live well in Silicon Valley or Manhattan, then programming is the wrong occupation to choose.

  19. dean: What fraction of the American workforce wants to sit alone in a log cabin for 8+ hours per day? And then get pushed out of the field after about 25 years? Given long life expectancies isn’t it important to either (1) choose a career where old people are welcomed as workers, or (2) choose a career where there is a fat government-guaranteed inflation-adjusted pension? There are public schools everywhere in the U.S. Adjusted for the value of the pension, a public schoolteacher earns as much as a programmer and the job involves a lot more social contact. Why not be a schoolteacher instead? (If you love computers, be a computer programming teacher!) There are hospitals all over the U.S. Why not be part of a health care team? (Being a physician is an option that isn’t available to most people, but there are plenty of other jobs that pay well and don’t require getting into medical school.) It might be a lot easier for a 50-year-old fully credentialed and certified medical technician to get a job than for a 50-year-old programmer!

  20. philg #20: I am not sure about ‘pushed out of the field after 25 years’ if someone leads a healthy lifestyle. As I mentioned, remote employers do not care about their employees’ age. ‘Pushed out’ happens to employees of corporations that stage loaded competitions to justify outsourcing work, for financial benefits beyond getting the job done fast and right, or when one is interviewed by recent college graduates who do not feel comfortable around an older person who could probably outperform them in physical tasks too. Of course, no daily drink for older programmers, and if health is an issue this is not the right occupation for sure. I agree that getting a cushy government programming job is better, but it is hard to get without connections. State trooper is one job that can be attained without connections, but I believe applicants there cannot be older than 40. School teacher could be a good choice. I was of course using hyperbole when I mentioned the log cabin in the Rockies, although I have known programmers who did that too. It may make sense to spend a decade or so in a high-powered environment and then move away to the country, coding for less. Such places have all the modern infrastructure, and schools whose average performance far exceeds that of the average city school.

  21. Nobody is actually going to receive those pensions. All the pension funds are insolvent. Taking a government or union job with the expectation of fat retirement bennies has been an obviously bad plan for many years.

    I’m not sure how true it really is that skilled software engineers face serious age discrimination. There is a lot of lousy low skill grunt tier work where they know only desperate people will stick around for a couple years, so they stick to broke kids and H-1Bs. But ignoring that segment of jobs I’m not so sure about the stereotype of the unemployable 50 year old programmer with a track record of success. I think what really happens with some frequency is a guy maintains a horrible piece of crap system/application for twenty years, and then when that piece of crap is finally retired it turns out he’s not much good for anything else. Just don’t be that guy.

    There was talk above about nerdy guys driving the women out of software. That’s probably true, but it is probably dwarfed by the H-1B program and the large numbers of people with accents in the field. Any job where a lot of people have strong accents and eat weird food is LOW SOCIAL STATUS. This is actually much more important than the pay or working conditions. Archaeologists and art historians are broke, but not low status. In those fields you will be around people with valuable American social capital. The social prestige of engineering professions has been systematically driven down as a matter of policy, which turns into a self-reinforcing loop. Women are more attuned to these things than men.

  22. dean: I wasn’t talking about a 50-year-old being “pushed out” by an employer that continues to thrive but with 25-year-old replacement workers (though obviously that would be a management goal!). I was talking more about people who had been programmers at a company or on a project that folded. Employers of programmers are much less stable than employers of health care workers, for example, or teachers. Maybe your boss at Silicon Graphics loved you, but you probably don’t still have a job there! (see https://en.wikipedia.org/wiki/Silicon_Graphics ). If you were lucky enough to get picked up by Sun Microsystems you probably didn’t survive the layoffs there.

    In other words, I wasn’t saying it was hard to keep a job at 50. I was saying it was hard for any private industry programmer to keep any job for more than about 10 years. This means that there will be programmers in their 50s looking for work and, from what I have heard, employers are not excited about these folks.

  23. It’s not true that there are 100k/yr remote programming jobs just falling from the sky like so many raindrops of gold, either. Or 200k/yr, if you mean median household income for married families. The government’s own data about programming salaries tells that tale. The census can only find 10 percent of IT workers working at home, and it can only find that they are paid a median of 80k, not 100k.

    Phil is right about the career trajectory. If you aren’t running a consultant shop selling the labor of younger programmers and taking the majority share off the top, or if you aren’t in management, you have a problem as a 50something with an IT background.

  24. The Practical Conservative #24: ‘IT workers’ could mean anyone. $80K seems to be the absolute minimum for any software developer with 3+ years of work experience, even in non-affluent areas of the country.

  25. Any 50something just looking for a job is screwed. It doesn’t matter the profession. A broke MD or CFA sending out resumes at 55 is not in a great position. You have to have connections and reputation or capital by that age.

  26. Interestingly, the same publication, the Atlantic, also ran an article about how there are more women than ever in medicine and law (though progress has stalled).

    https://www.theatlantic.com/sexes/archive/2012/12/more-women-are-doctors-and-lawyers-than-ever-but-progress-is-stalling/266115/

    There’s a chart in there; it looks like almost 50% of newly minted JDs and MDs are now women. In 1984, when women got 37% of CS degrees (according to the Atlantic article/video under discussion), women got only about 30% of MD degrees and about 37% of JD degrees (I’m eyeballing this from the chart in the link above).

    Another thing to mention is that in San Francisco, ground zero for the economy-crippling shortage of software developers available for hire at multi-billion-dollar corporations, Registered Nurses actually earn about $133,600, which is considerably higher than the median salary for software developers in this region (check US News Best Jobs, which provides a roundup of BLS stats). Even dental hygienists in SF earn pretty close to the median salary for an application developer.

    So I suppose we should ask: why did the women who previously studied computer science leave to get law, medical, and nursing degrees, among other things? Are the women computer science lost now majoring in art history and working as baristas in coffee shops? Or are they high achievers who took advantage of the greater opportunities available to women to find better jobs?

  27. @philg #23: Maybe your boss at Silicon Graphics loved you, but you probably don’t still have a job there! (see https://en.wikipedia.org/wiki/Silicon_Graphics ). If you were lucky enough to get picked up by Sun Microsystems you probably didn’t survive the layoffs there.

    Or Wang Labs, Bell Labs, DEC, Data General, etc, etc, etc.

    Or for that matter, a close family member has experienced the same thing in banking – Wachovia, SouthTrust, Barnett Bank, NationsBank and on and on…

    @JeffB #27: In 1984, when women got 37% of CS degrees

    Yes, 40% of my 1985 BS Computer Science class were female. The 4-year program was rigorous and the female graduates were more than competent. Every graduate quickly got hired at $35K at one of the many Massachusetts high-tech companies around Rt. 128 or at the government defense contractors. Within about two years every female decided she’d rather be a “project manager/meeting-attender/tick-box checker” or go on to grad school for an MBA.
