3 Understanding Inequities in Computer Science and Computer Science Education

Sara Vogel; Christopher Hoadley; Lauren Vogelstein; Bethany Daniel; Stephanie T. Jones; and Computer Science Educational Justice Collective

Chapter Overview

This chapter considers how inequities created by social beliefs, structures, and interactions influence computer science (CS) and CS education (CS Ed). It explores some of the historical and contemporary inequities that shape CS and CS Ed. The chapter focuses specifically on inequities that happen through the unjust distribution of CS resources and CS Ed learning opportunities, exclusion from CS and CS Ed, deficit narratives about marginalized groups in CS and CS Ed, erasure and a narrowing of what “counts” as CS and CS Ed, and embedded bias and the unjust impacts of technology on marginalized groups.

Chapter Objectives

After reading this chapter, I can:

  • Describe inequities that shape CS and CS Ed.
  • Identify how inequities impact CS students and CS teaching and learning.

Key Terms:

bias(es); deficit narratives; dysgraphia; erasure; exclusion; gatekeeping; inequity; intersectionality; marginalized/minoritized identities

Ms. Morales’ Story

Working toward equity in CS and CS Ed requires an understanding of the challenges created by the inequities that exist and persist in the industry and in classrooms, influencing what happens from day to day. Ms. Morales, an elementary CS teacher, recognizes some of the inequities that she and her students face and is working hard to change them.[1]

Ms. Morales is a New York City–based K-8 public school teacher involved in an equity-focused CS professional development (PD) community. Different experiences and aspects of Ms. Morales’ identity have shaped how she engages in equity work. When asked to describe herself during an interview with the authors of this guide, Ms. Morales named some of the identities she holds, including that of CS teacher, artist, and stepmother. She shared that she has Puerto Rican and Chinese heritage and speaks English and conversational Spanish with her family. She also described how, when she was younger, she was “dealing with mild dysgraphia in school and helping my brother who has hearing impairments.” Ms. Morales’ background shaped her desire to teach, and an early interest in technology eventually led her to become a CS teacher.

Ms. Morales also shared some of the issues that she needs to address as a CS teacher. Those challenges include things like:

  • meeting the needs of her students from different cultural and linguistic backgrounds,
  • supporting her students with disabilities,
  • helping students feel like they belong in CS Ed,
  • developing her own CS skills to support her students,
  • making sure that all of the students she serves have CS learning opportunities, and
  • having access to the technology (working computers, wireless internet access, etc.) needed to teach CS.

Ms. Morales can address some of these problems within her own classroom and in collaboration with educators at her school. Other problems are more complex and tied to broader systems of inequity. These problems require us to organize collectives and build power and resources to work toward change.

Inequities in CS Industry and CS Ed

Ms. Morales’ concerns represent inequities that impact CS students and teachers across the country. Many of them fall into five categories: (1) the unjust distribution of CS resources and learning opportunities; (2) exclusion from CS; (3) deficit narratives about marginalized groups; (4) erasure and a narrowing of what “counts” as valuable knowledge and practices; and (5) embedded biases and unjust impacts of technology. We look at the historical roots of these inequities and how they continue in the CS industry and CS Ed today. While we break them out into separate groups, it’s important to recognize that inequities are intertwined and often overlap, as the examples that follow illustrate.

Unjust Distribution of Resources

One foundational inequity is the reality that access to the resources that facilitate CS learning is not equitably distributed in U.S. society or across U.S. schools. Resources include things like technological tools, rigorous CS instruction, and well-qualified teachers. Not all students have the same access to the resources needed to learn CS.

Schools with large populations of racially marginalized and low-income students tend to have more limited access to technology (like computers, tablets, and modems) and the programs and infrastructure needed to do computing (like websites and apps, software, internet connections, and power; Blikstein, 2018). Historically, only school districts that served high-income students had the resources to spend on new technology. School districts serving low-income and racially marginalized students could not afford the high costs of computers (Sutton, 1991). Today, while most households and schools have access to devices, many groups lack access to broadband internet connectivity (King et al., 2022). Students might be “smartphone-only” internet users, because these tools tend to be less expensive than other computing devices (Pew Research Center, 2024). Smartphones lend themselves well to entertainment, information seeking, taking and sharing photos and videos, and communication. However, many tasks valued in school, like extended research and writing tasks, web publishing, and coding and programming are more difficult to accomplish on a smartphone. These realities demonstrate how societal hierarchies contribute to an unjust and inequitable distribution of and access to the technological resources needed to engage in computing.

Resources related to rigorous CS instruction are also inequitably distributed. When computers started appearing in classrooms in the 1980s and 1990s, their use was often tied to racially constructed norms of behavior that favored students racialized as white. These students were perceived as “better behaved” and had more opportunities to use computers (Arias, 1990; Sutton, 1991). This trend continues today. Students who are perceived as “struggling” or as “troublemakers” may end up being tracked on remediation pathways, giving them fewer chances to take electives like CS. Even in schools where all students have opportunities to take courses related to technology, middle- and upper-class white students are more likely to use computers for activities like programming. By contrast, lower-income and racially marginalized students are more likely to use computers for drill and practice (Margolis et al., 2008). This distinction is important, as it highlights the difference between simply using computers and actually engaging in practices of computing.

Lack of access to well-prepared CS educators is another example of inequitable distribution of resources. Students from historically marginalized groups are more likely to attend elementary schools with fewer CS Ed opportunities, and a recent study found that only 8% of low-income schools offered the AP Computer Science course compared to 37% of high-income schools (Code.org et al., 2022). This could be due in part to a lack of teachers who are qualified to teach CS. STEM teacher J. Lauren reflected on the impact that the unjust distribution of resources has on CS education:

There are countless students in underfunded schools across the country who may very well be incredible contributors to CS fields, but if they don’t have access to a computer for most of their education, or if their tech use is limited to a narrow suite of apps on an outdated school-owned iPad or Chromebook, they won’t even know that computer science and technology is a passion that they can have!

CS teachers need to be aware of the different ways that unjust distribution of resources can impact students. Educators may perpetuate inequity if they assume that all students have access to things like a computer or internet. Karime, a CS teacher, shared an example of how this plays out at school:

Low-income students are being denied the agency of working when and where they want to and are expected to give up their free time. I have students who were [enrolled] in CS [courses] against their wishes and do not have computers at home, so they are made to use the computer lab during lunch and after school. Instead of giving them computers (which [the school is] supposed to do), the CS teacher is being asked to rewrite the curriculum so that it can be completed on a phone. The inconvenience is enough to alienate someone from the field.

These examples illustrate how the unjust distribution of resources, including access to hardware and software, rigorous CS learning experiences, and well-qualified CS teachers, contributes to inequities in CS Ed.

Exclusion

At Ms. Morales’ school, CS learning experiences were offered mostly to 6th through 8th graders because of scheduling issues and misconceptions about what Computer Science for All was. How to structure things to achieve the “for all” goal was a challenge. Ms. Morales was often assigned more middle school classes than elementary classes and did not have enough time in her schedule to reach all the grades. These factors contributed to students in younger grades not consistently receiving CS instruction and feeling excluded from CS. The situation at Ms. Morales’ school shows how inequity often persists because of constraints that limit what is possible in a given space and time in tangible ways. It takes collective and intentional efforts to find creative solutions to overcome these challenges.

Nevertheless, in many cases, exclusion as a form of inequity is and has been built into the social fabric of CS. Exclusion can often be recognized by a lack of diversity and representation in CS Ed and CS fields. For example, those with certain identities may have easier access to CS courses, labs, and jobs than others who are excluded from those opportunities. Exclusion operates at two levels: an external level of oppression, where individuals may be kept from opportunities to participate in CS; and a psychological or emotional level of internalized oppression, where individuals may feel excluded and unwelcome in a space even though they are physically present.

External exclusion is often reproduced through practices like gatekeeping, or institutional systems and policies that control who gets to participate in opportunities and who has access to resources. For example, course policies in U.S. school systems may push students with marginalized identities onto an academic track that prevents them from enrolling in CS courses (Margolis et al., 2008). This gatekeeping in CS Ed also impacts the field. Keeping high schoolers from exploring CS in K-12 schooling may prevent them from choosing CS-related college majors or pursuing CS careers.

Exclusion also happens implicitly. Society and its institutions have invisibilized norms, policies, and practices that send messages about who can or should be involved in computing. For example, Ms. Morales’ teaching schedule seemed to be aimed toward the upper elementary and middle school grades, which, whether intentional or not, perpetuated a notion of not needing to provide coding or technology opportunities to younger students. This form of gatekeeping limits young children’s CS opportunities despite research that shows the benefits of teaching coding skills and CS concepts as a form of literacy through both “unplugged” activities and age-friendly tech tools (e.g., Bers, 2019; Wohl et al., 2015). Advocating for other teachers to integrate CS into their practice and ensuring they had access to CS resources they could use within their existing curriculum were some of the ways that Ms. Morales worked to include young children in CS at her school.

Exclusion and gatekeeping have persisted since the earliest days of the field of CS.[2] Backus (1980) described how early on, the CS field developed a culture of “priesthood,” where access to the power of digital computers was restricted to a chosen few. He described this priesthood as a fraternity of members selected because of “a colorful personality [or] … an extraordinary feat of coding … [instead of] for intellectual insight” (Backus, 1980, p. 127). These original programmers, generally white males who held social power and privilege, prided themselves on being a part of this select group. They resisted efforts to “make programming so easy that anyone could do it” (Backus, 1980, p. 128).

This resistance led to exclusion and the discrediting of work done by individuals who held marginalized identities. The differences between how early computer scientists Alexandra Forsythe and her husband George were treated illustrate this. Both Alexandra and George made significant contributions to the CS field. Alexandra wrote the first CS textbook (Forsythe et al., 1969), and George coined the term “computer science” and helped found the Department of Computer Science at Stanford University. Alexandra and George both began as Ph.D. students at Brown University. While George graduated with his Ph.D., a dean canceled Alexandra’s fellowship because, as Alexandra explained, “he [the dean] didn’t think I properly stayed on the sidelines like a woman should do” (Forsythe, 1979, p. 9). Despite the gatekeeping that prevented Alexandra from earning her doctoral degree, she went on to teach at Vassar College and eventually at Stanford alongside her husband.

Unfortunately, exclusion persists in CS industries and CS Ed today. Recent statistics illustrate how exclusion in CS impacts marginalized groups (100Kin10.org, 2021):

  • Women face greater exclusion from CS careers than women in STEM broadly.
  • Only 19% of CS bachelor’s degrees are awarded to women, even though women earn 85% of bachelor’s degrees in health-related STEM fields.
  • 74% of women in CS careers report workplace gender discrimination.
  • Only 3.4% of all CS doctorates in recent years were awarded to Black people.
  • Only 7% of those in computer occupations are Black, although Black people make up 11% of the overall workforce.
  • Native American/Alaskan Native students are the least likely of all racially minoritized students to attend schools with CS courses.[3]
  • Students labeled as English learners make up 11.2% of all students but only 5.6% of recent CS participants, and students with disabilities make up 12.9% of all students but only 7.6% of recent CS participants.

These statistics highlight how exclusion in CS and CS Ed occurs for those with socially marginalized identities around class, disability, gender, language, and race. But the data in the report did not look at how individuals who hold overlapping marginalized identities are excluded. Intersectionality (explored further in Chapter 5) helps us understand how those with multiply marginalized identities experience inequity at the intersections of their exclusion. While the data identify Black people and women as being excluded from CS, they do not consider how Black women are uniquely impacted because their identities lie at the intersections of being Black and being female. Similarly, the report noted that data often weren’t even available to understand many groups’ participation in CS. For example, there were no data on non-binary students and employees, limited data on those with disabilities, and no data on multilingual learners and employees beyond the PK-12 school system. The fact that data for these groups weren’t even collected emphasizes how exclusion continues to persist in CS and CS Ed.

Deficit Narratives About Marginalized Groups

Ms. Morales observed that many of the adults around her had a “mindset that kids can’t grasp computing at a young age.” But Ms. Morales resisted this story about her students. She knew it wasn’t true from her own experience. Observing her students after the COVID-19 pandemic, Ms. Morales recognized that even students in kindergarten and first grade “had experiences navigating websites and gaining new [computing] skills.” In identifying the incorrect assumptions she was hearing about her students, Ms. Morales was noticing another common inequity in CS and CS Ed: deficit narratives. In this case, those narratives were myths about young learners’ potential to do CS that created inequitable access to CS learning opportunities at her school.

Deficit narratives are broadly held beliefs, including stereotypes, that position groups of people as deficient or lacking in some way (Louie et al., 2021; Steele, 2011). In Ms. Morales’ school, young children were positioned as lacking the skills or ability to engage meaningfully in computing. Deficit narratives often show up as “common sense” statements like “kids can’t grasp computing at a young age.”[4] These statements make assumptions about the abilities of a group or blame individuals instead of the systems around them that reproduce inequities (Philip, 2011).

Deficit narratives might sound like, “Women just aren’t naturally as good at math as men,” or, “Students of color just aren’t interested in CS.”[5] What makes deficit narratives problematic is that they aren’t based in truth. Women can be just as successful in math as men (Steele, 2011), and studies have found that Black and Hispanic students are more likely to be interested in CS coursework than their white peers (Gallup, 2020).[6] Deficit narratives come out of assumptions about groups of people who differ in some way from what has been constructed as normal (and implicitly, the “ideal”) by society, not out of what is true. In the United States, society positions identities like white, male, upper-middle class, English-speaking, cis-gender, and heterosexual, among others as the “norm.” Individuals and groups who differ from these norms are positioned by society as lacking in some way, and the narratives told about them reproduce the social hierarchies that perpetuate inequity.

Deficit narratives about marginalized groups are embedded into the policies and practices of U.S. education in general. Historically, schooling was used as a tool to assimilate immigrant groups, to “civilize” Indigenous peoples, and to segregate Black people.[7] Deficit thinking about Black, Latine, low-income, immigrant, and other minoritized children persists in education today. Educators often offer deficit beliefs as rationales for how schools fail racially marginalized youth. These narratives include statements like, “the parents just don’t care, these children don’t have enough exposure/experiences, these children aren’t ready for school, their families don’t value education” and so on (Ladson-Billings, 2007, p. 318). Again, these narratives are not based in reality. Recent research shows that Black and Hispanic caregivers are actually more likely to value CS Ed and to recognize the need for CS in future careers than white caregivers (Gallup, 2020). However, deficit-based explanations can lead educators and institutions to lower expectations for marginalized youth, blaming students and families rather than the systems that underserve them.

Those lowered expectations often result in limited meaningful CS learning opportunities for low-income youth and students with marginalized identities. An examination of three different schools in Los Angeles showed how the forces that reproduced inequity went beyond the availability of technology and CS courses (Margolis et al., 2008). Low expectations were set for racially marginalized students regardless of whether they attended a school where a majority of the student body held racially marginalized identities or a school where they were integrated with white, upper-middle class students. School tracking policies funneled low-income and racially minoritized students into basic computing classes, while students who came from the most prepared, technologically equipped households were encouraged to take advanced CS coursework. The researchers concluded that “technology has not been the great equalizer because schools are providing different learning opportunities, and these opportunities vary according to the racial and socioeconomic demographics of the students” (Margolis et al., 2008, p. 134). More than a decade after that research, concerns still exist about “the unequal presence of CS in public schools, the quality of instruction, and the educators’ and counselors’ unconscious bias regarding who is ‘suited’ to take CS classes” (Blikstein, 2018, p. 34).

Ms. Morales appreciated arguments about the need for young children to get off screens and into settings where they play and interact with each other. But she had also read about the benefits of having students explore computational thinking through “unplugged activities” that don’t use technology, having young learners code digital stories using age-appropriate tools like Scratch Jr., and even having them engage in critical conversations about tech tools in their own lives (Bers, 2019; Wohl et al., 2015). Ms. Morales recognized that having moments of offline activities that reinforce key skills is important in all core subjects, including CS. She hoped to resist the deficit narratives that framed her young students as less capable at computing.

Erasure and Narrowing What “Counts” as Valuable

Another concern that Ms. Morales identified in her work as a CS educator was how to best include her students’ different backgrounds and meet their different needs. Ms. Morales’ students typically come from many different countries: Albania, Georgia, Honduras, Mexico, Russia, Turkey, and Uzbekistan, among others. Her students speak a range of languages, bring with them a variety of cultural experiences and practices, and have diverse abilities. Ms. Morales described how she wants her students to “be very comfortable” sharing and participating in class so that they feel like they belong. One challenge for Ms. Morales was that only a few coding websites (e.g., code.org and Scratch[8]) provide multiple languages for students to learn from. Given her students’ diverse language abilities, this meant that many resources that Ms. Morales might use presented challenges for her students from the very beginning. She recognized that the CS curriculum doesn’t always center or make space for her students’ different languages, experiences, and interests.

The lack of relevant curriculum for Ms. Morales’ diverse students is part of a broader inequity that comes from a narrowing within CS and CS Ed of what kinds of knowledge and computing “count” as valuable. The computational knowledge that is valued has typically been determined by people who hold socially powerful and privileged identities. Yet those who have historically been excluded also have rich knowledge that can contribute to and (re)shape the CS industry and CS Ed. Gaskins (2021) emphasized that although many creative “innovations [are] produced by ethnic groups,” these contributions “are often overlooked” (p. 5). The knowledge, practices, and contributions of marginalized and minoritized people often remain unacknowledged through a process known as erasure. Erasure refers to efforts that render invisible the presence and labor of minoritized and oppressed groups, contributing to a narrow view of what is valued in CS and CS Ed. Below, we share some examples of creative innovations that people marginalized based on their race and gender identities have contributed to the field, despite efforts to erase their contributions.

One historical example appears in the experiences of Katherine Johnson,[9] Dorothy Vaughan,[10] and Mary Jackson,[11] often known as the “Hidden Figures.”[12] These three Black women worked as human computers for NASA in the 1960s (see Shetterly, 2016). Their mathematical and computing expertise were central to supporting major innovations in space exploration and rocketry. Yet the women faced workplace discrimination and racial segregation throughout their employment. Their intellectual contributions were discredited and ignored for many years until they were finally recognized as mathematicians by NASA. Similar histories shaped the development of early electronic computing. Six women (Frances Bilas, Betty Jean Jennings, Ruth Lichterman, Kathleen McNulty, Elizabeth Snyder, and Marlyn Wescoff) were trained as human programmers for one of the first electronic digital computers. Despite their mathematical expertise and the computational skills needed for the job, the women’s work was framed as “clerical” work, and they received little credit (Schwartz, 2019).

Another stark example of erasure comes from the early days of CS. The Fairchild Semiconductor company was an important early influence in Silicon Valley. The company established a manufacturing plant in Shiprock, New Mexico, to utilize the skills of Navajo women. These women’s traditional knowledge of weaving and their handiwork skills proved essential to laying out computer circuitry. However, the Navajo women’s labor and expertise were never treated as an important intellectual contribution to CS. Instead, their work was labeled as a “labor of love” that reinforced Western stereotypes of Indigenous women as docile and their skills were reduced to simply having “nimble fingers” (Nakamura, 2014). The Fairchild company eventually called their Navajo collaboration a failure, although they continued to use techniques perfected by the Navajo women. Their decisions illustrate processes of erasure: they extracted the Indigenous knowledge held by the Navajo women; disconnected that knowledge from those women, their labor, and their cultural practices; and then erased the history of the Navajo influence on circuitry entirely.

Erasure and the narrowing of what counts as valuable in CS continue in CS Ed today. As Ms. Morales noticed, many CS curricula are narrow, focusing on specific sets of skills like knowing the syntax of particular programming languages or, for younger students, doing coding puzzles. This emphasis means that CS courses tend to be taught as if the code on screens, the people behind the screens, and the screens themselves were politically neutral, which is never the case (Benjamin, 2019). As a result, CS courses fail to engage students in exploring biases that are embedded in technology (Noble, 2018). CS Ed can also continue the history of erasure when curricula exclude people like the Hidden Figures or the Navajo women described above. Students often have limited opportunities to deeply engage with the worlds around their technology or the worlds they create through their tech. When students are rarely asked to confront injustices and act as agents of change, their opportunities to disrupt inequity by broadening what counts as valuable computing are limited.

Embedded Bias and Unjust Impacts of Technology

As Ms. Morales was talking to her colleague Ms. Kors, they named another inequity they deal with as CS teachers — recognizing biases embedded in technology. Ms. Kors shared an experience from her middle school class that illustrated this problem. She had asked her students to produce a Scratch project that shared a family story. One of her students, John, had lived most of his life in eastern Africa and was a recent arrival to the United States. He decided that he wanted to add a picture to his project to represent his mother — a Black woman who wears traditional Eritrean dress. John used Google Images to search for a picture. When he typed “woman” into the search bar, John and Ms. Kors noticed that almost all of the women in the images that came up were young, white, and wearing Western outfits. Ms. Kors shared this example with Ms. Morales to illustrate how the biases (prejudice in favor of or against a group) embedded in technologies and the negative impacts of technology can disproportionately affect minoritized and marginalized groups, including their CS students (see Vogel, 2020, for more on John and Ms. Kors’ experience).

There are many reasons to do CS, and these reasons differ across cultures. In many instances in the United States, CS has been used to advance military and law enforcement goals and has been linked to consumer capitalist purposes. Computing industries tend to focus on solving problems faced by white, male, middle-class/wealthy people (Vossoughi & Vakil, 2018). As a result, the interests of those groups become embedded as biases in algorithms, interfaces, and systems that make up the computing tools in our world, causing harm to marginalized groups.

Biases have been embedded into technology since the CS field began. The early “priesthood” of male computer scientists programmed in binary, a language of 0s and 1s that is compatible with the on/off switches of circuit boards. As the field grew, computer scientists continued to code in binary as a way to keep computing knowledge in the hands of a few (Backus, 1980). Over time, fluency in binary faded and other forms of in-group/out-group boundaries evolved. For example, students at many schools with computing programs developed “hacker” cultures and worked to create codes of ethics that sought to counter traditional militaristic and capitalistic objectives, including making software open to freely share and modify (Vadén & Stallman, 2002). Yet, even though this hacker culture was intended to open up access to computing, it often created new ways to exclude people, notably women (Steinmetz et al., 2019).

Many efforts to disrupt these embedded biases came from people marginalized in the CS field. Women like Kathleen Booth and Grace Hopper tried to create more accessible ways to program. Booth invented early assembly language (programs composed of mnemonic letters and digital numbers) that humans could read more easily than machine language (strings of binary digits). Hopper invented the first programming language that used elements of English grammar and vocabulary to further simplify programming. While these efforts still reproduced embedded biases that favor English as a dominant computing language, Booth, Hopper, and others shaped the trajectory of CS in ways that led to the creation of high-level, more user-friendly programming languages.
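The gap between machine language and the more human-readable notations that Booth and Hopper pioneered can be sketched in a few lines. This is an illustration only: the binary strings and the mnemonic shown in the comments are invented for the example, not drawn from any real machine’s instruction set.

```python
# Early programmers wrote values directly as strings of binary digits:
five = int("101", 2)   # the bits 1-0-1 encode the number 5
three = int("011", 2)  # the bits 0-1-1 encode the number 3

# An assembly-style mnemonic might express the next step as: ADD R1, R2
# A high-level language lets us state the same intent in near-English:
total = five + three
print(total)  # prints 8
```

Each level describes the same computation; what changes is how much specialized knowledge a person needs in order to read and write it, which is exactly what made higher-level languages a more accessible entry point into programming.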

Many of the tools and tech created and used today are treated as neutral and benign, despite biases that are embedded into their designs that make them anything but neutral. Scholar Ruha Benjamin named the racism that is embedded into digital technologies the “New Jim Code,” referencing the Jim Crow segregation policies that existed in the United States from 1877 through the 1950s (Benjamin, 2019). Ms. Kors and John discovered these racial biases in the search algorithms and data sets they used trying to find a picture to represent John’s mother. Their experience echoes similar research by scholar Safiya Noble (2018) who found that biases in Google search engine algorithms perpetuated the hypervisibility and hypersexualization of Black women that have been historical realities in the United States. Other examples of embedded biases in technology that lead to harm of individuals and communities include the following:

  • Social media algorithms that signal-boost racist and sexist ideologies in advance of elections (Guess et al., 2023).
  • Voice recognition software that frequently incorrectly processes language from people who speak with accents that deviate from a standard (Paul, 2017).
  • Cryptocurrency mining operations that emit carbon dioxide at the same rate as entire countries. This results in environmental impacts that especially harm low-income and racialized communities (Mahoney, 2024; United Nations University, 2023).
  • Body scanners used by airport security that have disproportionately flagged trans individuals for additional screening (Costanza-Chock, 2020).
  • Police departments that have used and mis-used facial recognition software in ways that have surfaced major privacy violations and racial profiling errors. These violations have in turn led to police violence against innocent people and false imprisonment (Najibi, 2020).
  • AI development that has relied on the underpaid, exploitative labor of content moderation workers (Data Workers Inquiry, 2024; Perrigo, 2023).
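One mechanism behind several of the harms above can be sketched in miniature: a system trained mostly on data from one group tends to fail for others. Everything in the sketch below, the words, the groups, and the memorization threshold, is synthetic and hypothetical, standing in for what a real model learns; it illustrates only how imbalance in training data reappears as imbalance in error rates.

```python
from collections import Counter

# Hypothetical training set for a toy "speech recognizer": 90 samples of
# group A's pronunciation of a word, but only 10 samples of group B's.
train = [("water", "A")] * 90 + [("wader", "B")] * 10

# "Training": memorize any pronunciation seen at least 20 times.
# (A crude threshold standing in for what a real model learns reliably.)
counts = Counter(pronunciation for pronunciation, _group in train)
recognized = {p for p, n in counts.items() if n >= 20}

def accuracy(test_samples):
    """Fraction of test utterances the toy recognizer accepts."""
    return sum(p in recognized for p in test_samples) / len(test_samples)

# Group A is recognized every time; group B never is. The 90/10 skew in
# the training data becomes a 100%/0% skew in who the tool works for.
group_a_accuracy = accuracy(["water"] * 10)  # 1.0
group_b_accuracy = accuracy(["wader"] * 10)  # 0.0
```

Real systems fail far less starkly than this caricature, but the direction of the skew is the same: whoever dominates the training data gets the most reliable tool.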

These realities make it clear that to work toward equitable outcomes in CS and CS Ed, we need definitions of equity that prepare us to resist or dismantle unjust technology. We need definitions that reimagine the purposes of technology and how it is created and ensure that the impacts of technology are more just.

Revisiting Ms. Morales’ Story

In their PD community, Ms. Morales and her colleagues grappled with the different inequities that we’ve considered here: the unjust distribution of CS resources; exclusion from CS; deficit narratives about marginalized groups in CS; erasure and a narrowing of what counts as CS; and embedded biases and the unjust impacts of technology. Recognizing what they and their students faced in their CS classrooms felt heavy. Yet Ms. Morales and her colleagues took strength from their relationships with each other. They were creative and dedicated teachers, and while they would not be able to solve all of these issues at once, they were committed to doing something to bring about changes within their communities. In Chapter 4, we’ll dive deeper into the work of their equity-focused PD community and explore how they developed definitions of equity that could help resist the inequities they were seeing.

Reflection Questions:

  1. Which inequities discussed in this chapter exist in your settings? The inequities considered here are an incomplete list. Are there other patterns of inequity that are apparent in your CS spaces?
  2. What have you noticed about the relationship between equity and inequity in your work? How do the ways that people define equity impact the problems and inequities they are working to solve?

Takeaways for Practice:

  • Analyze your CS Ed context through the lenses of the inequities discussed in this chapter:
    • Unjust Distribution of Resources: Evaluate the CS resources that are available. What do students have access to (or not)? How are they invited to engage in computing instead of simply “using computers”? What access to well-prepared CS educators do they have?
    • Exclusion: Evaluate the CS opportunities that are available. Who is included? Who is not present? How welcoming are those spaces to students with different marginalized and minoritized identities?
    • Deficit Narratives: Listen to the conversations around you. What do you notice about how students are discussed? Is the focus on students’ strengths and contributions or on what students are missing and lack? You may also want to analyze a school or district policy to see if/how deficit narratives are institutionalized in your setting.
    • Erasure: Evaluate your curriculum. Are there perspectives, voices, or identities that have been erased? Consider some ways you might include them and expand what counts as valuable in your space. The resource Historical Giants in Computing from Chapter 1 may be a helpful starting point.
    • Embedded Bias: Think about the different tech tools you use in your classroom. What kinds of biases might be embedded in these tools? How could you help students recognize these biases? What can you do as a CS Ed instructor to help students navigate the biases and create tools that challenge them?

Glossary

Term Definition
bias(es) Attitudes, beliefs, or actions for or against an idea or group when compared with another; may be conscious or unconscious.
deficit narratives Broadly held beliefs, including stereotypes, that identify groups of people as lacking or deficient in some way (Louie et al., 2021; Steele, 2011).
dysgraphia A learning disability that may affect a person’s physical ability to write and/or impact their ability to express their thoughts through writing.
erasure Processes and efforts that render invisible the presence and labor of marginalized and oppressed groups.
exclusion Processes and efforts that limit the presence and participation of marginalized and oppressed groups in a space.
gatekeeping Institutional policies and structures that control who gets to participate in opportunities and who has access to resources in ways that limit the participation of marginalized groups.
inequity Injustice or unfairness that is created and reproduced by social forces. It is important to remember that “fairness” does not mean “sameness,” so working to right unfairness and inequity does not mean just giving everyone the same thing.
intersectionality A theory that recognizes how people’s different identities (e.g., disability, gender, race) overlap and intersect, creating access to privilege or resulting in oppression in ways that cannot be understood or addressed by considering each identity separately (Crenshaw, 1991; Collins, 2019).
marginalized/minoritized identities Identity categories that are devalued in society, often because they are different from what society has established as the “ideal” norm. Those with marginalized/minoritized identities often face oppression and exclusion from mainstream society. We use both “marginalized” and “minoritized” as adjectives to emphasize how social processes actively construct inequity (Black et al., 2023). The term “minoritized” emphasizes historical systematic oppression and may be used regardless of whether an identity group actually represents a numerical minority in a context (see Black et al., 2023; Flores & Rosa, 2015).

References

100Kin10. (2021). unCommission research summary on the communities most excluded from STEM learning. https://docs.google.com/document/d/1xOgL3AvDkYjyjVTxgxaDUfpUIEC7c0czPUsDSfLhYzE/edit

Anderson, J. D. (1988). The education of Blacks in the South, 1860-1935. University of North Carolina Press.

Arias, M. B. (1990). Computer access for Hispanic secondary students. In C. J. Faltis & R. A. DeVillar (Eds.), Language minority students and computers (pp. 243-256). The Haworth Press. https://doi.org/10.1300/J025v07n01_12

Backus, J. (1980). Programming in America in the 1950s — Some personal impressions. In N. Metropolis, J. Howlett, & G. C. Rota (Eds.), A history of computing in the twentieth century (pp. 125-135). Academic Press. https://doi.org/10.1016/B978-0-12-491650-0.50017-4

Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim code (1st ed.). Polity.

Bers, M. U. (2019). Coding as another language: A pedagogical approach for teaching computer science in early childhood. Journal of Computers in Education, 6, 499-528. https://doi.org/10.1007/s40692-019-00147-3

Black, C., Cerdeña, J. P., & Spearman-McCarthy, E. V. (2023). I am not your minority. Lancet Regional Health Americas, 19. https://doi.org/10.1016/j.lana.2023.100464

Blikstein, P. (2018). Pre-college computer science education: A survey of the field. Google LLC. https://services.google.com/fh/files/misc/pre-college-computer-science-education-report.pdf

Code.org, CSTA, & ECEP Alliance (2022). 2022 State of Computer Science Education: Understanding Our National Imperative. https://advocacy.code.org/stateofcs

Collins, P. H. (2019). Intersectionality as critical social theory. Duke University Press.

Costanza-Chock, S. (2020). Design justice: Community-led practices to build the worlds we need. MIT Press. https://doi.org/10.7551/mitpress/12255.001.0001

Crenshaw, K. (1991). Mapping the margins: Intersectionality, identity politics, and violence against women of color. Stanford Law Review, 43(6), 1241-1299. https://doi.org/10.2307/1229039

Data Workers Inquiry. (2024). https://data-workers.org/

Fairclough, N. (2014). Language and power (3rd ed.). Routledge. https://doi.org/10.4324/9781315838250

Flores, N., & Rosa, J. (2015). Undoing appropriateness: Raciolinguistic ideologies and language diversity in education. Harvard Educational Review, 85(2), 149-171. https://doi.org/10.17763/0017-8055.85.2.149

Forsythe, A. (1979, May 16). An interview with Alexandra Forsythe [Interview]. Charles Babbage Institute, University of Minnesota. https://conservancy.umn.edu/server/api/core/bitstreams/563c9e36-1ecd-47a8-abfe-4204ec0ea11d/content

Forsythe, A. I., Keenan, T., Organick, E. I., & Sternberg, W. (1969). Computer science: A first course (1st ed.). Wiley.

Gallup, Inc. (2020). Current perspectives and continuing challenges in computer science education in U.S. K-12 schools. https://services.google.com/fh/files/misc/computer-science-education-in-us-k12schools-2020-report.pdf

Gaskins, N. R. (2021). Techno-vernacular creativity and innovation: Culturally relevant making inside and outside of the classroom. MIT Press. https://doi.org/10.7551/mitpress/12379.001.0001

Garfinkel, H. (1967). Studies in ethnomethodology. Polity Press.

Guess, A. M., Malhotra, N., Pan, J., Barberá, P., Allcott, H., Brown, T., Crespo-Tenorio, A., Dimmery, D., Freelon, D., Gentzkow, M., González-Bailón, S., Kennedy, E., Kim, Y. M., Lazer, D., Moehler, D., Nyhan, B., Rivera, C. V., Thomas, D. R., Thorson, E., … Tucker, J. A. (2023). How do social media feed algorithms affect attitudes and behavior in an election campaign? Science, 381(6656), 398-404. http://doi.org/10.1126/science.abp9364

King, H., Martin, M., McArdle, S., Goldberg, R., & DeSalvo, B. (2022, May 13). New digital equity act population viewer shows broadband access and demographic characteristics. United States Census Bureau. https://www.census.gov/library/stories/2022/05/mapping-digital-equity-in-every-state.html

Ko, A. J., Beitlers, A., Wortzman, B., Davidson, M., Oleson, A., Kirdani-Ryan, M., Druga, S., & Everson, J. (2024). Critically conscious computing: Methods for secondary education. https://criticallyconsciouscomputing.org

Ko, A. J., Oleson, A., Ryan, N., Register, Y., Xie, B., Tari, M., Davidson, M., Druga, S., & Loksa, D. (2020). It is time for more critical CS education. Communications of the ACM, 63(11), 31-33. https://doi.org/10.1145/3424000

Ladson-Billings, G. (2007). Pushing past the achievement gap: An essay on the language of deficit. Journal of Negro Education, 76(3), 316-323.

Louie, N., Adiredja, A. P., & Jessup, N. (2021). Teacher noticing from a sociopolitical perspective: The FAIR framework for anti-deficit noticing. ZDM-Mathematics Education, 53, 95-107. https://doi.org/10.1007/s11858-021-01229-2

Mahoney, A. (2024, July 9). Crypto-mining creates new environmental injustices for Black Texans. Capital B News. https://capitalbnews.org/crypto-mining-natural-gas-black-communities/

Margolis, J., Estrella, R., Goode, J., Holme, J. J., & Nao, K. (2008). Stuck in the shallow end: Education, race, and computing. MIT Press.

Najibi, A. (2020). Racial discrimination in face recognition technology. Harvard Online: Science Policy and Social Justice, 24. https://web.archive.org/web/20240730143205/https://projects.iq.harvard.edu/sciencepolicy/blog/racial-discrimination-face-recognition-technology

Nakamura, L. (2014). Indigenous circuits: Navajo women and the racialization of early electronic manufacture. American Quarterly, 66(4), 919-941. https://doi.org/10.1353/aq.2014.0070

Noble, S. U. (2018). Conclusion: Algorithms of oppression. In Algorithms of oppression (pp. 171-182). New York University Press. https://doi.org/10.2307/j.ctt1pwt9w5.11

Paul, S. (2017, March 20). Voice is the next big platform, unless you have an accent. Wired. https://www.wired.com/2017/03/voice-is-the-next-big-platform-unless-you-have-an-accent/

Perrigo, B. (2023, January 18). Exclusive: OpenAI used Kenyan workers on less than $2 per hour to make ChatGPT less toxic. Time. https://time.com/6247678/openai-chatgpt-kenya-workers/

Pew Research Center. (2024, November 13). Internet, broadband fact sheet. Pew Research Center. https://www.pewresearch.org/internet/fact-sheet/internet-broadband/

Philip, T. M. (2011). An “ideology in pieces” approach to studying change in teachers’ sensemaking about race, racism, and racial justice. Cognition and Instruction, 29(3), 297-329. https://doi.org/10.1080/07370008.2011.583369

Schwartz, O. (2019, March 25). Untold history of AI: Invisible women programmed America’s first electronic computer. IEEE Spectrum. https://spectrum.ieee.org/untold-history-of-ai-invisible-woman-programmed-americas-first-electronic-computer

Shetterly, M. L. (2016). Hidden figures: The American dream and the untold story of the Black women mathematicians who helped win the space race. William Morrow.

Steele, C. M. (2011). Whistling Vivaldi: How stereotypes affect us and what we can do. W. W. Norton.

Steinmetz, K. F., Holt, T. J., & Holt, K. M. (2019). Decoding the binary: Reconsidering the hacker subculture through a gendered lens. Deviant Behavior, 41(8), 936–948. https://doi.org/10.1080/01639625.2019.1596460

Sutton, R. E. (1991). Equity and computers in the schools: A decade of research. Review of Educational Research, 61(4), 475-503. https://doi.org/10.3102/00346543061004475

United Nations University. (2023, October 24). UN study reveals the hidden environmental impacts of Bitcoin: Carbon is not the only harmful by-product. https://unu.edu/press-release/un-study-reveals-hidden-environmental-impacts-bitcoin-carbon-not-only-harmful-product

Vadén, T., & Stallman, R. (2002). The hacker community and ethics [Interview]. GNU Operating System. https://www.gnu.org/philosophy/rms-hack.html

Vogel, S. (2020). Translanguaging about, with, and through code and computing: Emergent bi/multilingual middle schoolers forging computational literacies. [Doctoral Dissertation, City University of New York]. https://academicworks.cuny.edu/cgi/viewcontent.cgi?article=5015&context=gc_etds

Vossoughi, S., & Vakil, S. (2018). Toward what ends? A critical analysis of militarism, equity, and STEM education. In A. I. Ali & T. L. Buenavista (Eds.), Education at war: The fight for students of color in America’s public schools (1st ed., pp. 117–140). Fordham University Press. https://doi.org/10.2307/j.ctt2204pqp

Wohl, B., Porter, B., & Clinch, S. (2015). Teaching computer science to 5-7 year-olds: An initial study with Scratch, Cubelets, and unplugged computing. In J. Gal-Ezer, S. Sentance, & J. Vahrenhold (Eds.), WiPSCE ’15: Proceedings of the Workshop in Primary and Secondary Computing Education (pp. 55-60). Association for Computing Machinery. http://doi.org/10.1145/2818314.2818340


  1. Ms. Morales’ experiences are shared with permission.
  2. To learn more about the history of CS, see the book Critically Conscious Computing (Ko et al., 2024); https://criticallyconsciouscomputing.org/
  3. We use Native American/Alaskan Native here to match the term used in the original research report. See On Terminology for a full discussion of terms used in the guide.
  4. When naming something as “common sense,” we align with scholars who have pointed out that what is often considered common sense is neither common nor sensical (Fairclough, 2014; Garfinkel, 1967).
  5. This deficit narrative is also problematic because it collapses the different experiences of racially marginalized people into an overly broad category of “students of color” and fails to consider individual experiences.
  6. We use Hispanic here to match the term used in the original source. See On Terminology for a full discussion of terms used in the guide.
  7. U.S. public schooling was first founded by African Americans following their emancipation from enslavement and reflected their deep commitment to education. Segregation policies then worked to maintain underfunding in schools for Black and other marginalized students. See Anderson (1988) for a full discussion.
  8. Visit https://code.org and https://scratch.mit.edu/ to learn more.
  9. Learn more about Katherine Johnson at https://www.nasa.gov/centers-and-facilities/langley/katherine-johnson-biography/
  10. Learn more about Dorothy Vaughan at https://www.nasa.gov/people/dorothy-vaughan/
  11. Learn more about Mary Jackson at https://www.nasa.gov/people/mary-w-jackson-biography/
  12. Their story was told in Margot Lee Shetterly’s 2016 book, Hidden Figures: The American Dream and the Untold Story of the Black Women Mathematicians Who Helped Win the Space Race, and the 2016 film based on Shetterly’s book.

License

Icon for the Creative Commons Attribution-ShareAlike 4.0 International License

Understanding Inequities in Computer Science and Computer Science Education Copyright © 2025 by Sara Vogel; Christopher Hoadley; Lauren Vogelstein; Bethany Daniel; Stephanie T. Jones; and Computer Science Educational Justice Collective is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License, except where otherwise noted.