Women used to be present in computer work in higher percentages than they are today. Ever wonder what happened? Turns out that the story of gender and the progress of computing are a lot more tightly linked than we once thought…
Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing by Marie Hicks (MIT Press, January 2017)
In 1944, Britain led the world in electronic computing. By 1974, the British computer industry was all but extinct. What happened in the intervening thirty years holds lessons for all postindustrial superpowers. As Britain struggled to use technology to retain its global power, the nation’s inability to manage its technical labor force hobbled its transition into the information age.
In Programmed Inequality, Marie Hicks explores the story of labor feminization and gendered technocracy that undercut British efforts to computerize. That failure sprang from the government’s systematic neglect of its largest trained technical workforce, simply because they were women. Women were a hidden engine of growth in high technology from World War II to the 1960s. As computing experienced a gender flip, becoming male-identified in the 1960s and 1970s, labor problems grew into structural ones, and gender discrimination caused the nation’s largest computer user—the civil service and sprawling public sector—to make decisions that were disastrous for the British computer industry and the nation as a whole.
Drawing on recently opened government files, personal interviews, and the archives of major British computer companies, Programmed Inequality takes aim at the fiction of technological meritocracy. With over thirty images, including period photographs and cartoons, the book conveys not only what happened but also the cultural texture of the time. Hicks explains why, even today, possessing technical skill is not enough to ensure women will rise to the top in science and technology fields. Programmed Inequality shows how the disappearance of women from the field had grave macroeconomic consequences for Britain, and why the United States risks repeating those errors in the twenty-first century.
You can view more details about the book on the MIT Press website.
******************************************
How To Kill Your Tech Industry
by Marie Hicks
In World War II, Britain invented the electronic computer. By the 1970s, its computing industry had collapsed—thanks to a labor shortage produced by sexism.
Visualization by Celine Nguyen.
If the incredible rise of computing is one of the biggest stories of the twentieth century, then the failure of the nation that invented the electronic computer to capitalize on it is undoubtedly one of history's most important cautionary tales.
In 1944, Britain led the world in electronic computing. The top-secret codebreaking computers that the British deployed at Bletchley Park worked round the clock to ensure the success of D-Day and the Allies’ victory in Europe. At a time when the best electronic computing technology in the United States was still only in the testing phase, British computers literally changed the world.
After the war, British computing breakthroughs continued, and British computers seemed poised to succeed across the board, competing with US technology on a global scale. But by the 1970s, a mere thirty years later, the country’s computing industry was all but dead.
What happened? The traditional history of computing would have you understand this change through the biographies of great men, and the machines they designed. It would gesture towards corporations’ grand global strategies, and the marketing that those companies pushed to try to define what computers were for an entire generation of workers. It would not, however, focus on the workers themselves. And by ignoring them, it would miss the reasons for this catastrophic failure—a failure that remains a cautionary tale for many other countries today, particularly the United States.
Early Retirement
In 1965, a young computer worker named Anne celebrated her retirement. Decked out in a punched paper tape train that was made to resemble a bride’s veil, Anne celebrated the end of her career with her fellow twenty-something women colleagues at the computer company. To modern eyes, a retirement party thrown for a woman in her twenties might seem incongruous—all the more so because Anne’s technical skills were much in demand. British businesses and government agencies were scrambling to hire people who had computer skills, and yet here was a computer worker retiring from the workforce at the beginning of her career.
A few years earlier, a novel called Anne in Electronics—Anne was a popular name in Britain in the mid-twentieth century—had been published. Young women might read a light, pulpy novel like this as they rode to work each day on the London Tube, or younger women might read it to see what they had to look forward to. The book reflected the experience of many working women, but made it more glamorous and exciting.
The real-life Anne shared many of the same talents and characteristics as the fictional one. She, too, was young, white, and technically skilled. The fictional Anne enjoys an exciting career in a growing field, as she proves her knack for technical work, laboring with quiet diligence each day on a secret, high-tech airplane project. She eventually shows up her male superiors by figuring out a critical engineering error that was holding up the project. While they flounder, trying desperately to figure out the flaw in their design, she hesitantly points it out, afraid of embarrassing them. In the process, she not only saves the project but wins the heart of the male coworker she had her eye on—and allows him to take credit for her breakthrough.
As women, neither the real Anne nor the imaginary one was making an unusual decision in putting her career behind her other life goals. In fact, each was making the socially expected, and strongly encouraged, choice. In the book, Anne stays in the workforce after marriage, but only after being admonished that she must put her husband’s career first. Even in fantasies, women weren’t allowed to think that their careers could come first—the best Anne might do was to try to juggle her work, her family, and her husband’s needs, knowing that if a ball needed to drop, it would be her career.
A young woman named Anne Davis wears a punch tape dress at her “retirement party” as she leaves her job to get married.
In the real world, this juggling act was difficult, and often impossible. Middle-class women who had the mettle and the privilege to try it encountered major obstacles. And the fact that it was still seen as completely inappropriate for women to hold authority over men in the workplace meant that women who struggled to stay in their jobs rarely got very far. So the real-life Anne celebrated her retirement with half a dozen other young women computer workers—most of whom would go down the same path in the next few years.
Collective Failures
When we talk about computing history, we rarely talk about failure. Narratives of technological progress are so deeply ingrained into our ways of seeing, understanding, and describing high tech that to focus on failure seems to miss the point. If technology is about progress, then what is the point of focusing on failure? Up until recently, we also rarely talked about women in relation to computing.
The first silence is related to the second. Women, after all, were seen as having largely failed in computing until recent historians’ attempts to correct that assumption. But as it turns out, technological failure and women’s erasure are intimately related in more than one way. When we put these facts together—our avoidance of failure, our ignoring of women in computing, and our tendency to see women’s contributions as less important—curious patterns start to emerge.
The failure of one unnamed and ignored postwar computer worker is a good place to start. In 1959, this particular computer programmer faced a very hectic year. She needed to program, operate, and test all of the computers in a major government computing center that was doing critical work. These computers didn’t just crunch numbers or automate low-level office work—they allowed the government to fulfill its duties to British citizens. Computers were beginning to control public utilities like electricity, and the massive and growing welfare state, which included the National Health Service, required complex, calculation-dense taxation systems. Though the welfare state was created by policy, it was technology that allowed it to function.
In addition to doing all of her normal work, our programmer also had to train two new hires. These new hires didn't have any of the required technical skills. But once she trained them, which took about a year, they stepped up into management roles. Their trainer, meanwhile, was demoted into an assistantship below them. She succeeded at her job, only to fail in her career.
That the trainer was a woman, and that her trainees were both young men, was no coincidence. Nor was it coincidental that, as a woman, she had the technical skills for a job like this while they did not. That’s because before computing became electronic, women were seen as ideal for what was considered mundane calculation work. Though this work often required advanced mathematics knowledge, it was perceived as unintellectual. Before a computer was a machine, it was a job classification—these women workers were literally called “computers.”
Even when electromechanical and then electronic computers came in, women continued to do computing work. They programmed, operated, troubleshot, tested, and even assembled these new machines. In fact, IBM UK measured the manufacturing of computers in “girl hours” (which were less expensive than “man hours”) because the people who built the machines were nearly all women. Meanwhile, the British government, the largest computer user in the nation, called their computer workers the “machine grades” and later the “excluded grades”—excluded from the equal pay measures brought into the Civil Service in the 1950s. Because their work was so feminized, the government declined to raise their pay to the men’s rate, on the grounds that the men’s wage for these jobs was almost never used; the lower, women’s wage had therefore become the default market rate for the work. So concentrated in machine work were women that the majority of women working in government did not gain equal pay.
By the mid-to-late 1960s, however, the low value assigned to computing work was starting to change. Not because the work itself was changing, but because the perception of the work was. Instead of being seen as intimidating behemoths that were only good for highly technical tasks, computers were now becoming widely integrated into government and industry. Their great power and potential were growing more apparent. Suddenly, low-status women workers were no longer seen as appropriate for this type of work—even though they had the technical skills to do the jobs.
So the UK faced a major problem: all of the workers who could do this work were no longer the type of workers that management wanted doing the work. Instead, managers wanted people who would eventually become managers themselves to control these newly important machines and all of the decision-making that was being programmed into them. That excluded women. In this era, women were not supposed to be in positions of power over men. Both implicit and explicit prohibitions prevented women from managing men or mixed-gender workforces.
Moving Out, Moving Up
Around the same time that the woman programmer trained two men to replace her, a young woman named Stephanie Shirley embarked on a technical career at the prestigious Dollis Hill research station—the same government agency where the Colossus codebreaking computers had been created during World War II. Shirley had been a child during the war, born in Germany, and she was Jewish. She was evacuated out of Nazi-occupied Europe with 10,000 other children on the Kindertransport, a humanitarian refugee program designed to take Jewish children to England. By comparison, the United States allowed in little more than 1,000 children through a similar program.
Grateful for the chances afforded by her adoptive country, Shirley set out to make the most of them. Yet early in her career she began to chafe at the confines of British culture. With a degree in math, a good work record, and a master’s degree on the way, Shirley was the perfect candidate for promotion—or so she thought. As she was denied promotion after promotion, she started to understand that her role was being defined by things other than her technical skill and education—that the much-vaunted “meritocracy” of the government service was anything but. “What shocked me was the discovery that, the more I became recognized as a serious young woman who was aiming high—whose long-term aspirations went beyond a merely subservient role—the more violently I was resented and the more implacably I was kept in my place,” she wrote in her memoir.
After being denied another promotion, one that she’d earned several times over, she eventually learned that the men evaluating her were resigning from the promotions board rather than making a decision on her case. “They disapproved on principle of women holding managerial posts,” she found out, so they would rather resign than consider her for a promotion. “I was devastated by this: it felt like a very personal rejection,” she recalled.
After hitting the glass ceiling first in government and then in industry, Shirley did what women were supposed to do—what the two Annes and so many other women had been encouraged to do—she got married and resigned from her position. But, unlike the Annes, she wasn’t happy about it. She still had the skills, the intelligence, and the drive to work in computing, and she knew many other women who were in the same situation—being stymied in their careers not because they weren’t good enough, but because they were women.
Because she saw the need for computers growing, she knew that people who could program would be essential. Only they could figure out how to unlock the potential of the new mainframes that so few managers understood, even as those managers earmarked hundreds of thousands of pounds to buy them. So although Shirley got married and started a family, she continued to work. In 1962, she started her own software company, Freelance Programmers, out of her home. When she had stationery made for her new company she half-jokingly put the name all in lowercase, because “we had no capital at all.”
Nevertheless, she began to recruit women who had similarly been forced into “early retirement” by having children or getting married. Once her business began to grow, she published an ad in the classified section of the Times of London seeking people for full-time programmer positions. It read: “Wonderful chance, but hopeless for anti-feminists.” In other words, the company had a woman boss. The ad also announced that there were “opportunities for retired programmers (female) to work from home.” In 1964, this was revolutionary.
Shirley’s freelance programmers worked from home in an era when computer time was so expensive that most programming was done on paper before being punched onto cards and then tested on an actual machine. Programming from home was therefore not a problem, as long as you had a telephone to collaborate with your co-workers. Indeed, one of her programmers was once chastised by a company for using too much computer time to debug a program. Programming without a computer was cheaper—and preferred.
Programming from home also allowed women to simultaneously take care of their young children and fulfill their domestic responsibilities. To make things seem more professional, Shirley played a tape recording of typewriter sounds in the background when she answered the phone at her house, in order to drown out the sounds her young son might make. And when she was unable to get contracts early on, she took her husband’s suggestion that she start signing her letters with her nickname instead: “Steve.” With Steve Shirley as the public face of the company, business began to take off.
The Real Ann(e) in Electronics
Like many startup founders, Shirley was meeting a need that was still barely understood. In the early 1960s, most software came packaged with the computer itself or was written in-house after a company purchased a mainframe. Software was not considered a product in its own right—and few people expected that customers would actually pay for it separately after spending so much money on a computer.
Shirley realized that they would. She had seen the need in both government and industry for programmers who could unleash the potential of expensive hardware with good software. Without software, after all, computers didn’t do anything, and with poor software they couldn’t fulfill their potential or justify their cost. Shirley also knew that British industry and government were getting rid of most of the people who had programming skills and training because they were women, thereby starving the entire country of the critical labor that it needed to modernize effectively.
Shirley scooped up this talent pool by giving women a chance to fulfill their potential. Offering flexible, family-friendly working hours and the ability to work from home, her business tapped into a deep well of discarded expertise. Because people who could do this work were an absolute necessity, the government and major British companies hired her and her growing team of women programmers to do mission-critical computer programming, for work ranging from payroll and accounting to cutting-edge projects like programming the “black box” flight recorder for the first commercial supersonic jet in the world: the Concorde.
A woman named Ann Moffatt led the Concorde programming team. And unlike the fictional Anne from Anne in Electronics, she kept the credit for her work. Working from home, Moffatt managed a team of women who also worked from their homes. In fact, this was the first time that Freelance Programmers had undertaken a project managed and staffed exclusively by remote workers—rather than being overseen by one of the four full-time managers who operated out of the small office space Shirley had rented a few years after starting the company out of her house. The remote arrangement worked so well that Moffatt went on to become technical director of the company, in charge of more than 300 home-based programmers.
Moffatt sits at her kitchen table in 1966, writing the code for the Concorde, while her baby looks on.
Much like Stephanie Shirley, Ann had begun working in technical roles in the 1950s, but had encountered a roadblock once she had children. The feminist business practices of Freelance Programmers let Ann continue her career and take care of her home and children, all the while contributing to Britain’s high-tech economy. In addition to being a major project for the company, the Concorde was a symbol of British high-technology pride and prestige—it flew successfully for decades, one of only two supersonic passenger airliners ever to enter commercial service. And Ann’s concurrent project functioned for even longer: the baby in the photograph is now fifty-three years old.
Losing The Lead
While Shirley, Moffatt, and hundreds of other women programmers created software that helped Britain advance further into the accelerating digital age, British industry and government struggled to hire, train, and retain their computer workers. Women had the technical skills, but were not supposed to be managers. Even the fact that women were wielding more power by controlling computers was viewed as dangerously out-of-bounds.
As computing became increasingly interwoven with all of the functions of the state—from the Bank of England to the Atomic Energy Authority—computer workers grew indispensable. By the late 1960s, the government began to fear losing control of the machines that allowed the state to function because they did not have a well-trained, permanent, reliable core of technical experts. The women who had the technical skills were judged unreliable because they were not aligned with management. They were seen as liminally working-class, temporary workers who should not rise above their current station. To elevate women further would upend the hierarchies of both government and industry, pushing low-status workers into high-status positions.
Ministers within government were so determined to create a cadre of male, management-oriented technocrats that they began, counterintuitively and in desperation, to lower the standards of technical skill needed for the jobs. Lowering standards of technical proficiency to create an elite class of male computer workers didn’t work, however. In fact, it made the problem worse, by producing a devastating labor shortage.
Well-heeled young men tapped for the positions often had no interest in derailing their management-bound careers by getting stuck in the “backwater” of computer work, which had still not fully shaken its association with low-level, feminized labor. Machine work in general was viewed as unintellectual and working-class, ensuring that men of the desired background had little interest in being swept up in the “industrialization of the office.” Most men who were trained for these positions, at great employer expense, left to take better, non-computing jobs within a year. As a result, the programming, systems analysis, and computer operating needs of government and industry went largely unmet. Although there were plenty of women who had the required skills, the government all but refused to hire them, and private industry largely refused to promote them.
Steve Shirley, Ann Moffatt, and their coworker Dee Shermer.
Soon, the most powerful people within the Civil Service had become convinced that the government could no longer function by trying to get more young men into computing: the numbers simply weren’t there. The shortage of “suitable” computer labor had risen to the level of a national security issue in the eyes of the state. Even low-level women computer workers held great power: when the all-women punching staff went on strike for better pay and working conditions, the massive new VAT (value added tax) system ground to a halt, derailing months of planning, to the horror of the men at the top. So they decided to approach the problem from a different angle: if there weren’t enough men for computer jobs, the number of these jobs needed to be reduced. They needed to find a way to do the same amount of computing work with fewer computer workers.
This meant ever more massive, powerful mainframes that could be run by centralized control and command. On the advice of the Minister of Technology, the UK decided to force the largest remaining British computer companies to merge into one huge firm that could provide government and industry with the sort of massive, centralized mainframe technologies they needed. In 1968, International Computers Limited (ICL) was born, and ordered to produce the machines that would allow Britain to meet its digital needs with its newly minimized and masculinized computer labor force.
Unfortunately, this change occurred right as the mainframe was on its way out, in a period when smaller and more decentralized systems were becoming the norm. This meant that by the time ICL delivered the product line they had been tasked with creating, in the mid-1970s, the British government no longer wanted it, and neither did any other potential customers. As the government realized their mistake—though not the underlying sexism that had caused it—they quickly withdrew their promised support for ICL, leaving the company in the lurch and finishing off what was left of the British computer industry.
Hiding Tech’s Mistakes
Stephanie Shirley’s company succeeded by taking advantage of the sexism intentionally built into the field of computing to exclude talented and capable technical women. At the same time, the rest of the British labor market discarded the most important workers of the emerging computer age, damaging the progress of every industry that used computers, the modernization projects of the public sector, and, most strikingly, the computer industry itself.
By utilizing just a small portion of this wasted talent, Shirley rescued many women’s skills from being discarded entirely and helped British industry and government fulfill some of the promise of computerization. But for every woman Shirley employed, there were always several more applicants she could not take on. The massive waste of human talent rippled upward, eventually destroying the British lead in computing and the British computer industry.
In computing, discrimination is as old as the field itself. And discrimination has shaped the field in ways we are only now coming to understand and admit. The technical labor shortage in the UK was produced by sexism—it did not represent a natural evolution of the field, nor a reflection of women’s talents, goals, or interests.
Computing history shows us that the “computer revolution” was never really meant to be a revolution in any social or political sense. People who were not seen as worthy of wielding power were deliberately excluded, even when they had the required technical skills. To a great extent, that process continues today. Now, as then, hierarchies are constructed through high tech to preserve powerful social and political structures.
That we have historically ignored the impact of women on computing—both the women who stayed in the field and the many more who were pushed out and shaped the field through their absence—shows how narratives of technological progress hide the mistakes of the past. Often, failures teach us more. But by assuming the tautology that technology always leads to progress, we become blinded to all of the situations in which the opposite has occurred.
For the contemporary US, the British example is a chilling lesson. In twentieth-century Britain, computers helped re-institutionalize ideas about women’s second-class status in society. They took away women’s ability to participate in the digital economy under the pretext that they should not be in charge of powerful machines even if they had the technical know-how.
Though these attitudes may seem antiquated today, a closer look at our own technological landscape reveals that we continue to ignore the role of women and other minority groups in technology fields—and the impact of technology on those minoritized groups. When Twitter or Facebook is accused of doing something that hurts women, it is seen as a niche concern. Women do not stand in for “people” in general in the eyes of technology behemoths—a major problem when those technologies increasingly define every aspect of how we live. The issues affecting white women and women of color, people of color of all genders, the LGBTQ population, and other groups are repeatedly constructed as not being of primary importance by the technology companies that structure our political and economic environment—perhaps because to view things in that way would reveal how badly technology is failing us.
The British aspiration to build a new technological empire through computing should sound familiar: it is eerily akin to our own current situation. The US approach to high technology in the Cold War was baldly imperial. But what many fail to realize is that it has remained so long after the fall of the Berlin Wall. The stock market bubble of the first internet boom did not herald a warmer, fuzzier era of more democratic computing. It inaugurated a new era of “greed is good,” and in the process, Silicon Valley learned that it could actively profit from social inequality. The only catch was it had to be willing to manufacture ever more of it, selling technological “advances” that were actively harmful to a progressive civil society under the guise of technosocial progress.
The dynamic continues to this day. Silicon Valley reaps enormous profits at the expense of the majority of users, and calls it progress. But technology’s alignment with actual progress has a long and uneven history, and its effects are rarely straightforward or fully foreseen. Real progress isn’t synonymous with building another app—it involves recognizing the problems in our society and confronting the uncomfortable fact that technology is a tool for wielding power over people. Too often, those who already hold power, those who are least able to recognize the flaws in our current systems, are the ones who decide our technological future.
This piece draws on the research for, and includes material from, the author’s book, Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing (MIT Press, 2017). Learn more at programmedinequality.com.
Marie Hicks is a historian of technology and a professor working on the intersection of gender, sexuality, technology, and power.