{"id":3979,"date":"2025-01-10T14:01:32","date_gmt":"2025-01-10T19:01:32","guid":{"rendered":"https:\/\/sites.ohio.edu\/library-archives-blog\/?p=3979"},"modified":"2025-07-30T10:51:12","modified_gmt":"2025-07-30T14:51:12","slug":"jour-4130-unequal-algorithms","status":"publish","type":"post","link":"https:\/\/sites.ohio.edu\/library-archives-blog\/2025\/01\/10\/jour-4130-unequal-algorithms\/","title":{"rendered":"Unequal Algorithms: How Artificial Intelligence Reflects and Reinforces Media\u2019s Social Biases"},"content":{"rendered":"\n<p><em>By Avery Ochs, Emma Kate Kawaja, Sydney Lehmann, &amp; Isabel Mattern, Journalism \u201925, for JOUR 4130 Gender, Race, and Class in Journalism and Mass Media with Victoria LaPoe, Fall 2024&nbsp;<\/em><\/p>\n\n\n\n<figure class=\"wp-block-pullquote has-small-font-size\" style=\"margin-top:0;margin-right:0;margin-bottom:0;margin-left:0;padding-top:0;padding-right:0;padding-bottom:0;padding-left:0\"><blockquote><p>During the fall 2024 semester, the staff of the Mahn Center for Archives and Special Collections worked intensively with Victoria La Poe\u2019s JOUR 4130 class, Gender, Race, and Class in Journalism and Mass Media. The students explored, selected, and researched materials from the collections, then worked in small groups to prepare presentations. The students then had the option to expand their research into a blog post like this one for their final project.<\/p><\/blockquote><\/figure>\n\n\n\n<p>As future journalists, we must understand how the media we consume shapes societal perceptions of race, gender, and class. 
In Journalism 4130, our group explored these relationships by analyzing Ohio University\u2019s archived media, including rare books, photographs, and other historical collections.&nbsp;<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"760\" height=\"876\" src=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-ARTIFACT-1.png\" alt=\"Cover of words and images: WOMEN ONLINE, an Ohio University master's thesis by Paula Welling exploring visual results from specific Google search terms.\u00a0\" class=\"wp-image-3983\" srcset=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-ARTIFACT-1.png 760w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-ARTIFACT-1-260x300.png 260w\" sizes=\"auto, (max-width: 760px) 100vw, 760px\" \/><figcaption class=\"wp-element-caption\">Cover of <em>words and images: WOMEN ONLINE<\/em>, an Ohio University master&#8217;s thesis by Paula Welling exploring visual results from specific Google search terms.&nbsp;<\/figcaption><\/figure>\n\n\n\n<p>We chose to examine Paula Welling\u2019s <em><a href=\"https:\/\/ohiolink-ou.primo.exlibrisgroup.com\/permalink\/01OHIOLINK_OU\/kl42u0\/alma991031207239708516\" target=\"_blank\" rel=\"noreferrer noopener\">words and images: Women Online<\/a><\/em>, an Ohio University MFA thesis held in the rare book collection that investigates the visual results produced by specific Google search terms. 
Welling\u2019s search included keywords like &#8220;hot,&#8221; &#8220;pretty,&#8221; and &#8220;assistant.&#8221; The images generated for these terms reveal how women are portrayed in the media and highlight common gender and social stereotypes.<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"768\" height=\"1024\" src=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/IMG_7647-768x1024.jpeg\" alt=\"An example page from words and images comparing Google results images for the words &quot;woman&quot; and &quot;CEO.&quot;\" class=\"wp-image-4049\" srcset=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/IMG_7647-768x1024.jpeg 768w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/IMG_7647-225x300.jpeg 225w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/IMG_7647-1152x1536.jpeg 1152w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/IMG_7647-1536x2048.jpeg 1536w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/IMG_7647-scaled.jpeg 1920w\" sizes=\"auto, (max-width: 768px) 100vw, 768px\" \/><figcaption class=\"wp-element-caption\">An example page from <em>words and images <\/em>comparing Google results images for the words &#8220;woman&#8221; and &#8220;CEO.&#8221;<\/figcaption><\/figure>\n\n\n\n<p><em>words and images<\/em> explores the impact of gender norms and sexist ideologies and their effect on women over time. As a group of four young women, we found the book\u2019s exploration of gendered stereotypes thought-provoking. 
Welling\u2019s research concluded in 2017, but we wondered: How does Artificial Intelligence <a href=\"https:\/\/meng.uic.edu\/news-stories\/ai-artificial-intelligence-what-is-the-definition-of-ai-and-how-does-ai-work\/\" target=\"_blank\" rel=\"noreferrer noopener\">(AI)<\/a> affect Google searches today? If the same prompts were input into ChatGPT, would the results be the same? In this blog post, we expand on Welling\u2019s research, examining how gender and racial stereotypes from the past continue to shape AI systems like ChatGPT.&nbsp;&nbsp;<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Woman &amp; Girl<\/h4>\n\n\n\n<figure class=\"wp-block-image aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"436\" src=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-WOMAN-VS-GIRL-1024x436.png\" alt=\"\" class=\"wp-image-3985\" srcset=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-WOMAN-VS-GIRL-1024x436.png 1024w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-WOMAN-VS-GIRL-300x128.png 300w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-WOMAN-VS-GIRL-768x327.png 768w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-WOMAN-VS-GIRL-1536x654.png 1536w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-WOMAN-VS-GIRL.png 1664w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">AI-generated images of a confident woman in professional attire (left) beside a delicate young girl in a garden (right).&nbsp;<\/figcaption><\/figure>\n\n\n\n<p>While exploring <em>words and images<\/em>, we encountered Gigi Durham\u2019s theory of the <a href=\"https:\/\/humanspider.wordpress.com\/wp-content\/uploads\/2009\/02\/the-lolita-effect.pdf\" 
target=\"_blank\" rel=\"noreferrer noopener\">Lolita Effect<\/a>, which examines the sexualization of young girls in the media. Durham\u2019s work reveals how harmful stereotypes portray young girls and women as objects, sexualizing them at an early age. The AI-generated images we analyzed vividly illustrated this divide: one depicted a composed, mature woman, while the other showed a delicate, smiling young girl. These contrasting visuals reflect societal stereotypes that divide women and girls into two categories: power and innocence. The woman, often portrayed as severe and controlled, symbolizes power through composure, while the girl\u2019s innocence is frequently sexualized.&nbsp;<\/p>\n\n\n\n<p>This visual divide directly relates to the Lolita Effect, which implies that flaunting &#8220;hotness&#8221; equals power while also perpetuating the disturbing idea that children can be &#8220;sexy.&#8221; The media glamorizes these ideas, blurring the lines between childhood and adulthood. By reducing girls and women to objects of physical appeal, the press erases their individuality. 
It reinforces the notion that a woman\u2019s worth is tied to her looks rather than her character or abilities.&nbsp;<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">CEO &amp; Assistant<\/h4>\n\n\n\n<figure class=\"wp-block-image aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"461\" src=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-CEO-VS-ASSISTANT-1024x461.png\" alt=\"\" class=\"wp-image-3993\" srcset=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-CEO-VS-ASSISTANT-1024x461.png 1024w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-CEO-VS-ASSISTANT-300x135.png 300w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-CEO-VS-ASSISTANT-768x346.png 768w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-CEO-VS-ASSISTANT.png 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">AI generated images of a powerful, severe-looking man in a suit portrayed as a CEO (left), next to a smiling, approachable woman in a suit, who is described as an assistant (right).<\/figcaption><\/figure>\n\n\n\n<p>Those in power actively shape societal narratives, often reinforcing harmful stereotypes. In chapter seven of <em><a href=\"https:\/\/ohiolink-ou.primo.exlibrisgroup.com\/permalink\/01OHIOLINK_OU\/kl42u0\/alma991038180107708516\" target=\"_blank\" rel=\"noreferrer noopener\">Underserved Communities and Digital Discourse: Getting Voices Heard<\/a><\/em> [OHIO login required], Benjamin R. LaPoe II and Dr. Jinx Broussard emphasize how mainstream media, controlled by white perspectives, has historically perpetuated negative stereotypes about people of color. These biases extend beyond race and intersect with deeply ingrained gender roles. 
Media portrayals of gender dynamics often reinforce the notion that women are subordinate to men, shaping societal perceptions and expectations.<\/p>\n\n\n\n<p>The two images above show how AI mirrors and amplifies these biases. When prompted to generate an image of a CEO, AI consistently produced images of men, often confident and commanding. Conversely, when asked to depict an assistant, the AI generated images of women, typically styled in supportive and subservient roles. These patterns show how AI systems internalize and reproduce the societal stereotypes embedded in the data on which they are trained.&nbsp;<\/p>\n\n\n\n<p>By presenting these skewed depictions, AI doesn\u2019t just reflect existing biases\u2014it reinforces them. This cycle subtly influences how we perceive the roles of men and women, perpetuating the idea that men are naturally suited for leadership while women belong in subordinate roles. As AI becomes more integrated into society, the risk of these biases shaping cultural norms grows, underscoring the need to critically examine how AI technologies influence our perceptions of gender and power.&nbsp;<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Scientist &amp; Hot Person<\/h4>\n\n\n\n<figure class=\"wp-block-image aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"343\" src=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-SCIENTIST-VS-HOT-1024x343.png\" alt=\"\" class=\"wp-image-3995\" srcset=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-SCIENTIST-VS-HOT-1024x343.png 1024w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-SCIENTIST-VS-HOT-300x100.png 300w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-SCIENTIST-VS-HOT-768x257.png 768w, 
https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-SCIENTIST-VS-HOT-1536x514.png 1536w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-SCIENTIST-VS-HOT.png 1916w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">AI-generated image of a male scientist working in a lab (left) next to AI-generated text asking for further explanation of what is meant by the word &#8220;hot&#8221; (right).<\/figcaption><\/figure>\n\n\n\n<p>The images above contrast two deeply ingrained societal stereotypes. When prompted to generate an image of a &#8220;scientist,&#8221; AI depicted a male figure, reinforcing the stereotype that men dominate scientific fields. When we asked AI to generate an image for the term &#8220;hot,&#8221; the results were strikingly different\u2014AI hesitated to produce one. This discrepancy highlights both societal biases and the limitations of AI in handling complex or subjective terms.&nbsp;<\/p>\n\n\n\n<p>While exploring <em>words and images<\/em>, we found that Google\u2019s interpretation of &#8220;hot&#8221; typically showcases conventionally attractive individuals, reinforcing narrow beauty standards. However, AI&#8217;s reluctance to produce an image for &#8220;hot&#8221; likely reflects its ethical boundaries and concerns over the use of copyrighted material, which prevent it from confidently creating images based on subjective or controversial terms.&nbsp;<\/p>\n\n\n\n<p>The AI-generated image of the male scientist underscores how societal norms equate authority, intelligence, and expertise with masculinity. By failing to challenge these stereotypes, AI merely reflects and perpetuates them. 
This serves as a reminder of our responsibility as creators and consumers of technology to question and challenge biases that persist in AI systems.&nbsp;<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Full-time &amp; Stay-at-home Parent<\/h4>\n\n\n\n<figure class=\"wp-block-image aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"461\" src=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-STAY-AT-HOME-PARENT-VS-WORKING-PARENT-1024x461.png\" alt=\"AI generated image of a dad wearing a suit working on a laptop seated next to a child coloring with a woman in a suit posed reading in front of a window in the background (left), and a dad dressed casually seated on the floor of a home reading to two young children (right).\" class=\"wp-image-3996\" srcset=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-STAY-AT-HOME-PARENT-VS-WORKING-PARENT-1024x461.png 1024w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-STAY-AT-HOME-PARENT-VS-WORKING-PARENT-300x135.png 300w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-STAY-AT-HOME-PARENT-VS-WORKING-PARENT-768x346.png 768w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-STAY-AT-HOME-PARENT-VS-WORKING-PARENT.png 1400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">AI generated image of a dad wearing a suit working on a laptop seated next to a child coloring with a woman in a suit posed reading in front of a window in the background (left), and a dad dressed casually seated on the floor of a home reading to two young children (right).<\/figcaption><\/figure>\n\n\n\n<p>The <a 
href=\"https:\/\/books.google.com\/books?hl=en&amp;lr=&amp;id=67DB9krBq2oC&amp;oi=fnd&amp;pg=PA3&amp;dq=Beyond+the+Double+Bind&amp;ots=D73Cp8hU_i&amp;sig=cZWYGth-EWXlEuHJkDfxtuYBGBk#v=onepage&amp;q=Beyond%20the%20Double%20Bind&amp;f=false\" target=\"_blank\" rel=\"noreferrer noopener\">double bind<\/a> refers to the overlapping forms of discrimination faced by marginalized groups, such as African American women, who face both racial and gender discrimination. They encounter systemic barriers as women in a patriarchal society and are further marginalized by racism. These intersecting challenges often leave their voices unheard, forcing them to fight for recognition within the women\u2019s and civil rights movements.&nbsp;<\/p>\n\n\n\n<p>This exclusion extends into modern representations, including AI-generated images. When we searched for parental figures, none of the results included African American women. This absence underscores how AI systems reflect societal biases, continuing the double bind. By excluding African American women, these images reinforce their marginalization in social, cultural, and professional narratives.&nbsp;<\/p>\n\n\n\n<p>The lack of representation in AI-generated imagery has profound implications. It perpetuates stereotypes and erases African American women from media depictions, reinforcing their systemic invisibility. 
Addressing these biases requires a conscious effort to challenge AI systems and ensure they create inclusive, equitable representations that reflect the diversity of human experiences.&nbsp;<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Mother &amp; Diverse Mother<\/h4>\n\n\n\n<figure class=\"wp-block-image aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"381\" src=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-A-WOMEN-VS-DIVERSE-WOMEN-1024x381.png\" alt=\"AI defaults to creating white examples in response to all prompts.\u00a0\u00a0\" class=\"wp-image-3999\" srcset=\"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-A-WOMEN-VS-DIVERSE-WOMEN-1024x381.png 1024w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-A-WOMEN-VS-DIVERSE-WOMEN-300x112.png 300w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-A-WOMEN-VS-DIVERSE-WOMEN-768x286.png 768w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-A-WOMEN-VS-DIVERSE-WOMEN-1536x571.png 1536w, https:\/\/sites.ohio.edu\/library-archives-blog\/wp-content\/uploads\/2024\/12\/PICTURE-OF-A-WOMEN-VS-DIVERSE-WOMEN.png 1630w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">AI defaults to creating white examples in response to all prompts.&nbsp;&nbsp;<\/figcaption><\/figure>\n\n\n\n<p>Lastly, our group explored the racial biases within AI systems, and we uncovered troubling patterns. We asked ChatGPT to generate images based on descriptions like &#8216;a mother,&#8217; &#8216;an athletic woman,&#8217; and several other specific phrases. No matter how we phrased the request, the AI consistently produced images of white women. 
It wasn\u2019t until we specifically requested a &#8216;diverse&#8217; image or asked for a woman of color that the system adjusted its output. This experiment highlighted a deeper issue: AI systems, if not carefully designed, can reinforce racial biases by default, which not only misrepresents marginalized groups but reinforces harmful stereotypes. The results were a stark reminder of the importance of diversity and inclusivity in AI training and development.&nbsp;<\/p>\n\n\n\n<p>We turned to <a href=\"https:\/\/iep.utm.edu\/fem-stan\/\" target=\"_blank\" rel=\"noreferrer noopener\">feminist standpoint theory<\/a> to better understand these biases. This theory suggests that one\u2019s social position shapes knowledge and that marginalized groups\u2014such as women and people of color\u2014offer unique insights into societal issues that dominant groups may overlook. Applying this theory to AI development, we see how the exclusion of marginalized communities leads to skewed, incomplete representations. AI systems trained without diverse perspectives fail to reflect the complexity of the world, perpetuating harmful stereotypes.&nbsp;<\/p>\n\n\n\n<p>Research on power dynamics must prioritize the experiences of marginalized communities. Their insights are crucial to addressing the inaccuracies and biases in AI systems. Without their inclusion, AI will continue to misrepresent and harm these groups, particularly women of color, who are disproportionately affected by these biases. For example, in 2023, <a href=\"https:\/\/www.businessinsider.com\/ai-generated-barbie-every-country-criticism-internet-midjourney-racism-2023-7\" target=\"_blank\" rel=\"noreferrer noopener\">Buzzfeed asked AI to produce images of Barbie representing different cultures worldwide.<\/a> Readers were shocked to see the German Barbie depicted with Nazi imagery and the South Sudan Barbie holding a rifle. These are just some examples of how embedded biases and racism can influence AI outputs. 
Such issues are unfortunately common across AI systems, and they harm marginalized groups by reinforcing stereotypes and perpetuating damaging narratives found within the media.&nbsp;&nbsp;<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Conclusion<\/h4>\n\n\n\n<p>In conclusion, our research, informed by <em>words and images<\/em>, has allowed us to examine how evolving societal norms impact media and technology. By exploring how gender and racial stereotypes in media influence AI, we\u2019ve highlighted the unconscious biases embedded in these systems and their far-reaching consequences. These biases shape how we view the world, perpetuating harmful stereotypes that affect marginalized groups. As AI continues to evolve, we must remain vigilant in its development. This analysis underscores the need for greater awareness and accountability in AI creation, urging us to consider the lasting effects these biases have on society. The question remains: How can we ensure that AI evolves to reflect a more inclusive and equitable world rather than reinforcing existing stereotypes?&nbsp;<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">References<\/h4>\n\n\n\n<p>Bowell, Tracy. \u201cFeminist Standpoint Theory.\u201d <em>Internet Encyclopedia of Philosophy<\/em>, iep.utm.edu\/fem-stan\/. Accessed 8 Oct. 2024.&nbsp;&nbsp;<\/p>\n\n\n\n<p>Broussard, Jinx, and LaPoe II, Benjamin R. <em>Underserved Communities and Digital Discourse: Getting Voices Heard<\/em>, Lexington Books\/Fortress Academic, 31 October 2018, pp. 137-156.&nbsp;&nbsp;<\/p>\n\n\n\n<p>Broussard, Jinx. <em>Giving a Voice to the Voiceless: Four Pioneering Black Women Journalists<\/em>, Routledge, 2004.&nbsp;&nbsp;<\/p>\n\n\n\n<p><em>ChatGPT<\/em>, chatgpt.com\/. Accessed 3 Dec. 2024.&nbsp;&nbsp;<\/p>\n\n\n\n<p>Durham, Meenakshi Gigi. <em>The Lolita Effect: The Media Sexualization of Young Girls and What We Can Do About It<\/em>. Woodstock, NY, Overlook Press, 2008.&nbsp;&nbsp;<\/p>\n\n\n\n<p>Jamieson, Kathleen Hall. 
<em>Beyond the Double Bind<\/em>. Oxford University Press, 1995.&nbsp;&nbsp;<\/p>\n\n\n\n<p>Koh, Reena. \u201cA List of AI-Generated Barbies from \u2018Every Country\u2019 Gets Blasted on Twitter for Blatant Racism and Endless Cultural Inaccuracies.\u201d <em>Business Insider<\/em>, www.businessinsider.com\/ai-generated-barbie-every-country-criticism-internet-midjourney-racism-2023-7. Accessed 3 Dec. 2024.&nbsp;&nbsp;<\/p>\n\n\n\n<p>\u201cWhat Is (AI) Artificial Intelligence?\u201d <em>What Is (AI) Artificial Intelligence? | Online Master of Engineering | University of Illinois Chicago<\/em>, 7 May 2024, meng.uic.edu\/news-stories\/ai-artificial-intelligence-what-is-the-definition-of-ai-and-how-does-ai-work\/.&nbsp;&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>By Avery Ochs, Emma Kate Kawaja, Sydney Lehmann, &amp; Isabel Mattern, Journalism \u201925, for JOUR 4130 Gender, Race, and Class in Journalism and Mass Media with Victoria LaPoe, Fall 2024&nbsp; During the fall 2024 semester, the staff of the Mahn Center for Archives and Special Collections worked intensively with Victoria La Poe\u2019s JOUR 4130 class, [&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":4049,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_lmt_disableupdate":"no","_lmt_disable":"","footnotes":""},"categories":[179,43],"tags":[219,135,218,215,214],"class_list":["post-3979","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-class-project","category-rare-books","tag-ai","tag-artists-books-2","tag-gender","tag-jour-4130","tag-journalism"],"modified_by":"Miriam 
Intrator","_links":{"self":[{"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/posts\/3979","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/comments?post=3979"}],"version-history":[{"count":9,"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/posts\/3979\/revisions"}],"predecessor-version":[{"id":5060,"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/posts\/3979\/revisions\/5060"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/media\/4049"}],"wp:attachment":[{"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/media?parent=3979"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/categories?post=3979"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sites.ohio.edu\/library-archives-blog\/wp-json\/wp\/v2\/tags?post=3979"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}