Artificial Intelligence and Sexual Health Services: Critiquing and Expanding AI’s Presence in the Sex Education and Sexual Wellness Industry

Freedman, Ethan

LGBT 320A: LGBT Health

Dr. Lindsay Toman

Hamilton, New York, United States

Submitted in partial fulfillment of the requirements for LGBT Health 320A

Colgate University, May 7, 2024

Acknowledgements:

As I close out my final semester at Colgate and one of my favorite classes in which I participated, I would like to leave room for gratitude. I thank, first and foremost, Dr. Lindsay Toman for their assistance, guidance, recommendations, presence, and the class atmosphere they helped to generate, which contributed to the curiosity displayed in this paper. It was beyond fun being in one of your first classes at Colgate, and even more fun to round out my own experience at the university with you. I also thank every student in the course who showed up, participated, provided feedback, and brought their entire selves to class, or at least whatever part could be retrieved that day. This piece would be nothing without my peers from Spring 2024 LGBT Health 320A. With these brief acknowledgements, I present to you a work of my own volition.

Abstract

This paper explores the integration of Artificial Intelligence (AI) in sexual health services, with a focus on its application in sex education and sexual wellness. As AI continues to evolve, its intersection with the medical industrial complex offers both innovative solutions and ethical dilemmas, particularly within the realms of sex and sexuality education. The research highlights how AI, through machine learning and data analytics, can significantly contribute to the destigmatization and dissemination of sexual health knowledge, thereby enhancing public health outcomes. However, it also addresses the concerns related to privacy, empathy, and the maintenance of human touch in digital interactions. By examining various case studies and current implementations of AI in sexual health contexts, the paper argues for a balanced approach that maximizes benefits while mitigating risks. This involves considering user-centered design, inclusivity, and ethical frameworks in AI applications to ensure they support rather than replace human expertise. The ultimate goal is to leverage AI’s potential to foster a more informed and healthy society regarding sexual health issues.

Keywords: Artificial Intelligence, Sexual Health, Sex Education, Ethical AI, Public Health

Artificial Intelligence and Sexual Health Services:

Critiquing and Expanding AI’s Presence in the Sex Education and Sexual Wellness Industry

I feel like a silly person adding a tally mark to the 60,000+ other articles of research on artificial intelligence created each year, but I persist (Tobin et al., 2019). Artificial intelligence, also known as AI, machine learning, or alternative intelligence, has risen in interest since 2014, when academic articles first began to come out in masses; now machine learning is in businesses, homes, individuals' hands, and all around us (Tobin et al., 2019). While it may seem like tech companies are at the forefront of guiding these alternative ways of learning, Hoff (2023) reminds us that governments play a key role in imagining where artificial intelligence can be applied. One of the many niches of industry that technology caters to, with encouragement from governments, is the medical industrial complex (MIC) (Neelam, 2022; Hoff, 2023).

Presently, AI sits in tension with medical industries and their production motives. Although still progressing in strength, one area where artificial intelligence can be carefully applied is assisting sexual health education on sex and sexuality (Young et al., 2021; Nadarzynski et al., 2020; Nadarzynski et al., 2021). While many of the benefits of implementing machine learning around these topics center on eliminating stigma and providing knowledge, the capacity of alternative learning models to make our relationship to sex and sexuality more positive lies in the hands of those who are most eager to use AI as a tool that aids them, not a tool on which they depend. Artificial intelligence intertwines with the medical industrial complex because of how useful it appears to be, and its ability to provide the facets of sex education that many agents struggle to find assistance for, or that generally fall through the cracks, is enormous.

Literature Review:

What is Artificial Intelligence: Intentions of Establishment

Artificial intelligence research goes through periods called AI winters, understood as times of "disillusionment with the technology," but on the other side of a winter there is often a breakthrough moment (Tobin et al., 2019). Tobin et al. (2019) seek to provide a brief historical understanding of AI, situating the field as one defined primarily by tension. With the capacity of alternative intelligence systems appearing endless, there is an extreme desire for economic strength and security among countries competing with one another for resources and power (Tobin et al., 2019). As each country inquires into the ways AI can assist it, governments release artificial intelligence strategies in their race toward global control, strategies that center AI workforces, research development, and inferences about how AI production will affect economics, social spheres, and continued interest in the subject. While American approaches call for billions of dollars' worth of research, China sought to reach globally advanced AI levels by 2020 and to take a leading role in AI theory, technology, and applications by 2030 (Tobin et al., 2019).

Neelam (2022), in tandem with Tobin et al. (2019), positions artificial intelligence as a technology that creates systems, where the goal of alternative learning methods is to build system functions that approximate human models of learning, reasoning, problem solving, and cognition. Countries are beyond interested in applying AI to their systems because of its role in management, its capacity to collect information, and its ability to find solutions to complex problems (Neelam, 2022). In Hoff's "Unavoidable Futures?" (2023), we are reminded that governments are influencing the construction of artificial intelligence and that they specifically favor its application in the imaginaries of healthcare systems. While Neelam (2022) reminds us that machine learning is a combination of scientific and engineering fields, Hoff (2023) looks at AI tech solutions to healthcare challenges to explore how governments legitimize, promote, and reassure motives for machine learning. Because governments are active participants in shaping global sociotechnical imaginaries of AI in healthcare, a discourse analysis of Dutch government documents reveals motives for generating positive imaginaries that mask the risks that can result (Hoff, 2023). By legitimizing AI as inevitable and beneficial, promoting and encouraging professionals to adopt AI, and reassuring professionals that their roles will not be undermined, the risks are hidden behind a facade of ignorance, all while the technologies remain virtually brand new (Hoff, 2023).

AI and Healthcare Systems: The Capacity to Intervene in Sexual Health

One facet of the healthcare industry whose effectiveness is overlooked and underestimated is the medical industrial, tech-centered approach to compulsory sexuality, sex education, and sexual wellness. The reality is that AI's potential to solve problems in sexuality-related healthcare services reaches down to microscopic levels, with one in five people experiencing a sexually transmitted infection (STI) (Young et al., 2021). Despite how prominent STIs are, the attention they require in the United States has been lacking, with Centers for Disease Control and Prevention funding staying static alongside rising rates of infection (Young et al., 2021). Young et al. (2021) believe that AI applications in the sexual health facets of contraception and STI prevention can address epidemics by predicting infection rates from the public data such systems are capable of consuming as their learning material. By looking at patterns in healthcare records and trends in stigmatized care, information could be provided to key stakeholders, such as parents, marginalized communities, and governments, who are active in providing sexual health resources (Young et al., 2021).
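To make the kind of prediction Young et al. (2021) gesture toward more concrete, the sketch below trains a simple model on synthetic "health record" features to estimate STI risk. This is a minimal illustration under my own assumptions, not the authors' methodology; every feature name and data point is invented for demonstration.

```python
# Minimal illustrative sketch (not Young et al.'s method): fit a simple model
# on synthetic health-record-style features to estimate STI risk.
# All feature names and data are hypothetical, for demonstration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2_000

# Hypothetical de-identified features: age, screenings in the past year,
# prior positive test, and an index of local clinic access.
X = np.column_stack([
    rng.integers(18, 65, n),      # age
    rng.poisson(1.5, n),          # screenings in past year
    rng.binomial(1, 0.15, n),     # prior positive test
    rng.uniform(0, 1, n),         # clinic-access index (0 = low access)
])
# Synthetic outcome loosely tied to the features, purely for illustration.
logit = -2.5 + 1.4 * X[:, 2] - 0.8 * X[:, 3] + 0.1 * X[:, 1]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# In principle, predicted probabilities could inform where prevention
# resources are directed; here we only report accuracy on held-out data.
print("Held-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

Any real deployment would of course depend on consented, representative data and clinical oversight rather than a toy model of this kind.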

Young et al. (2021) highlight that machines, much like humans, carry biases when they have received little education; building racial integration and positive exposure to diverse racial and ethnic groups into AI systems would help stop perpetual forms of violence in STI prevention methodologies. For a system to have discriminatory biases conjures frightening imaginaries when artificial intelligence is the operating software gatekeeping healthcare resources.
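One way to make this concern tangible is a short audit that compares a model's error rates across demographic groups. The sketch below is a generic fairness check, not a procedure drawn from Young et al. (2021); the groups, data, and decision threshold are all assumptions for illustration.

```python
# Hypothetical audit sketch: compare false negative rates across groups to
# surface disparate performance. Not drawn from Young et al. (2021); the
# synthetic data, group labels, and 0.5 threshold are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 1_000
df = pd.DataFrame({
    "group": rng.choice(["A", "B", "C"], size=n),   # demographic group (synthetic)
    "y_true": rng.binomial(1, 0.2, size=n),         # true outcome (synthetic)
    "y_score": rng.uniform(0, 1, size=n),           # model risk score (synthetic)
})
df["y_pred"] = (df["y_score"] >= 0.5).astype(int)

def false_negative_rate(g: pd.DataFrame) -> float:
    positives = g[g["y_true"] == 1]
    if len(positives) == 0:
        return float("nan")
    return float((positives["y_pred"] == 0).mean())

# A large gap between groups would signal that the model under-serves some
# populations, the kind of harm the paragraph above warns about.
for name, g in df.groupby("group"):
    print(name, false_negative_rate(g))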

While racial bias is a fear held by academics and researchers of AI healthcare systems, sexual and reproductive health (SRH) services are already experiencing a digital evolution that Nadarzynski et al. (2020) examined thoroughly. Interested in analyzing the acceptability of digital services like video consultations, live web chats with healthcare providers, and artificial intelligence chatbots, Nadarzynski et al. (2020) surveyed over 250 respondents in England, asking participants to communicate their preferences among healthcare mediums. In England, SRH services and funding for education have been drastically reduced while digital interventions have increased in the form of websites, social networking, and text-based platforms that provide information about STIs and risk factors (Nadarzynski et al., 2020). Because people often have low health literacy, online services that are easily comprehensible were thought to be most impactful, yet a vast majority of participants preferred face-to-face consultations with healthcare providers. SRH chatbots were seen as untimely for those experiencing STI symptoms, less desirable for those already in clinical treatment, and possibly unsuitable for high-risk populations. There was moderate acceptability of innovative chatbot services and hesitancy toward AI-led chatbots due to concerns around confidentiality, cybersecurity, and a lack of empathy and trust, but SRH services are expanding fast, and increased digitalization is decreasing the cost of effective services (Nadarzynski et al., 2020). Low acceptability of technologies in sexual health services means that research on implementing more technological solutions must center patient perspectives to produce equitable, engaging, cost-effective artificial learning methodologies.

Given these low levels of acceptability, Nadarzynski et al. (2021) set out to explore the barriers to and facilitators of engagement with sexual health chatbots via semi-structured online interviews, asking participants to interact with chatbots advising on STIs and relevant services. The United Kingdom-based study had participants ranging from 18 to 50 years of age who primarily thought chatbots could aid sexual education methodologies by providing useful information and bridging connections with resources, but chatbots were still perceived as inferior to the internet and health professionals (Nadarzynski et al., 2021). AI chatbots were limited in their interactivity, constrained in their content, dodgy with privacy, untrustworthy, and inaccurate at times (Nadarzynski et al., 2021), highlighting that their effectiveness and safety must be established before they can serve as useful, empathetic, agency-reiterating sex educational resources.

A Deeper Look at the Endless Relevancy of Alternative Learning Methods for Sex, Sexuality, and Healthcare Related Information

Sexually transmitted infections and AI exemplify what is often a stigmatized conversation, but prevention and mitigation are only one facet centered in discourse about how artificial intelligence can intervene in our lives. Prevention-centered missions are a prime example, with data to back them up, and they ride a wave of public health concern spanning generations; however, a larger level on which machine learning can operate is intimacy itself. In Alexander and Yescavage's "Sex and the AI: Queering Intimacies," Spike Jonze's film Her is unpacked to explore sexual technology's potential for queering intimacies and ideologies around technological relationships. While there are many cultural and moral implications of AI in intimate relationships, the film offers a nuanced depiction of people and operating systems that undoes heteronormative understandings of affective relationships (Alexander & Yescavage, 2018). While Alexander and Yescavage (2018) conclude with machine learning's capacity to queer conceptions of intimacy, there is a cry for ethical and inclusive approaches to implementing AI in human relations.

Sexual health spans many realms, and the desire for intimacy with machines portrayed by a film like Her is pervasive in society. It is a compulsory sexuality that sexualizes the artificial form of intelligence, but what if AI could help us better connect with each other to generate intimacy? Furlo et al. (2021) asked dating app users who were women and LGBTQIA+ to interact with AI-prompted conversations and to imagine dating apps as a safe, consensual, normalized space for discussing consent and sexual boundaries before meeting in person. The focus was on women and LGBTQIA+ users, despite the research being applicable to all, because sexual violence and disregard of consent boundaries are commonly experienced by people within these marginalized groups.

Referencing the use of "consent apps" and mobile applications designed to mediate consent, Furlo et al. (2021) wondered how dating apps could be designed to better facilitate the discovery of potential sexual partners rather than merely transactional consensual sex. While there were questions about which consensual practices should be computer mediated, the foundational boundaries for operationalizing consent were that (1) all partners must express explicit consent, (2) consent must be given for specific acts, and (3) sexual preferences, boundaries, and consent practices must be discussed before activity starts (Furlo et al., 2021). With dating apps already hosting overt conversations about sex and sexuality before in-person engagement, AI was observed steering conversations toward topics of consent and sexual boundaries, "tailored" to a system where these kinds of conversations were already taking place; a simplified sketch of such a nudge follows below. The capacity of AI to assist in the moral development of agents engaging in sexual conversations means that dating apps could play a key role in normalizing consent and boundaries.
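As a very simplified illustration of how an app might nudge a conversation toward the three criteria above, the hypothetical sketch below scans a chat for consent-related topics and suggests a prompt when none has surfaced. Real AI-mediated systems like those Furlo et al. (2021) discuss would rely on far richer language models; the keyword lists and prompt text here are invented.

```python
# Hypothetical sketch of an AI-mediated nudge: scan a chat for whether
# consent-related topics have surfaced and, if not, suggest a prompt.
# Keyword lists and prompt text are invented for illustration; a real
# system would use a trained language model rather than keywords.
from typing import List, Optional

CONSENT_TOPICS = {
    "explicit_consent": ["consent", "are you comfortable", "is this okay"],
    "specific_acts": ["boundaries", "off limits", "not into"],
    "preferences_before": ["before we meet", "expectations", "preferences"],
}

SUGGESTED_PROMPT = (
    "You two seem to be planning to meet. Would you like to share your "
    "boundaries and what you are and are not comfortable with first?"
)

def suggest_consent_prompt(messages: List[str]) -> Optional[str]:
    """Return a suggested prompt if no consent topic has appeared yet."""
    text = " ".join(messages).lower()
    covered = {
        topic
        for topic, phrases in CONSENT_TOPICS.items()
        if any(phrase in text for phrase in phrases)
    }
    # Only nudge when none of the three operational criteria have surfaced.
    return None if covered else SUGGESTED_PROMPT

# Example: a conversation that has not yet touched on consent or boundaries.
print(suggest_consent_prompt(["Hey! Free this weekend?", "Yes, Saturday works."]))
```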

While artificial intelligence can assist us in dating and intimacy, consent and boundaries seep into its capacity for security, a domain that generates a great deal of interest, as Neelam (2022) reminds us. Media and surveillance are huge applicable systems for AI, capable of tracking, monitoring, and involving policy in their methods of maintaining and overseeing civilians (Neelam, 2022). While China has already implemented many AI surveillance epistemologies, the problems we are asking alternative learning methods to survey leave out the potential for sexual health security and truly positive imaginaries. How can AI protect sex and sexuality, and do we need it to?

Conclusion: Expanding the Imaginaries of AI and Sexual Health Implementation

Artificial intelligence, despite stigma, has exploded as a field of research and technological advancement capable of expanding the possibilities of entire countries and the world (Tobin et al., 2019), and its application in medical industries can be represented by efforts to prevent and mitigate sexually transmitted infections (Neelam, 2022). While it is known that AI creates new challenges for governments in a medical context, the ability of technology to advance the medical field has created polarities of support and resistance (Hoff, 2023). Rising rates of STIs and a lack of federal attention have made AI a prime tool for stigmatized topics like sexual health (Young et al., 2021), but fears of discrimination, inaccuracy, low information, lack of empathy, and illusions of trust continue to prevent the mass uptake of new interventions (Nadarzynski et al., 2020). Although initially struggling, AI is slowly being adopted as effective and safe resources that patients trust are created (Nadarzynski et al., 2020); however, the realm of sexual health in which AI can operate goes beyond STI prevention (Alexander & Yescavage, 2018). Dating apps and intimacy are among the many contexts in which alternative learning models can assist, and they may play a key role in establishing boundaries and consent between agents (Furlo et al., 2021). That being said, AI can also undo intimacies as we currently know them.

That AI is expansive enough to operate across industries of medicine, sexuality, education, and security highlights how powerful the tool can be. It is a scary thought for many, as fears of delegitimizing the human experience are highly prevalent in discourse, but much depends on who is in control of implementation. While many governments are eager to develop massive machine learning systems, artificial intelligence is something capable of being used by anyone with access to the internet. With this in mind, the kinds of problems the technology can address range from minuscule to all-encompassing, but we must continue to be critical of those who utilize it. Data is everything to a machine whose digestive powers let it learn, and data can be biased; however, the data that feeds a system can be as niche as we desire.

In my experience as a sex educator who has worked in facets of adult media and film, I have learned a lot about pain and pleasure, their intersection in kink, and the extent to which adult entertainment platforms screen for maladaptive content (Hillinger, 2023; Freedman, 2024). What has historically been people sitting at a computer for hours screening pornography for signs of violence before it can be uploaded to websites can now be done by artificial intelligence, but it depends on how we guide it. I have always been curious about the extent to which people can differentiate between expressions and vocalizations of pain and pleasure depicted in adult film, pornography, and a range of media types, given the notion that adult content is capable of perpetuating violent and harmful ideals around sex and sexuality as well as positive ones. Bekenova and Bekenova (2023) looked at AI and the efforts made to capture and classify emotion via audio. With emotion recognition being so important in governing human behavior, figuring out a way for AI to mimic these learning processes can be helpful in classification-based audio interventions (Bekenova & Bekenova, 2023). Classifying audio and its emotional representations of pain and pleasure may be beneficial in separating truly harmful content from positive content. Whether feminist, ethical, mainstream, gonzo, heteronormative, queer, or anything else, affective displays of pain and pleasure mix with methodologies for communicating, and historically violent forms of sex education like pornography can utilize AI to better flag maladaptive content. Moreover, it spares the people whose job it is to screen for maladaptive content (Hillinger, 2023).
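To give a concrete, if simplified, sense of the audio classification Bekenova and Bekenova (2023) describe, the sketch below summarizes labeled clips with standard MFCC features and trains an off-the-shelf classifier. This is a generic pipeline assumed for illustration, not their published method; the file paths and emotion labels are placeholders.

```python
# Simplified, generic pipeline for emotion classification from audio, in the
# spirit of Bekenova and Bekenova (2023) but not their published method.
# File paths and labels below are placeholders, not a real dataset.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def mfcc_features(path: str, sr: int = 16_000) -> np.ndarray:
    """Load a clip and summarize it as the mean of its MFCC coefficients."""
    y, sr = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

# Placeholder dataset: in practice this would be many labeled clips.
clips = [
    ("clip_001.wav", "distress"),
    ("clip_002.wav", "pleasure"),
    # ... many more labeled clips
]

X = np.array([mfcc_features(path) for path, _ in clips])
y = np.array([label for _, label in clips])

# A small classifier over the pooled features; with a real dataset this is
# where cross-validation and careful label curation would matter most.
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.predict([mfcc_features("new_clip.wav")]))
```

Even in a sketch this small, the stakes discussed above are visible: whoever curates the labels decides what counts as "distress" versus "pleasure," which is exactly where human guidance of the tool matters.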

In closing, the capacity of artificial intelligence is scarily endless and stretches as far as users' imaginations, and an essay on AI's attachment to sex and sexuality should be representative of this, but we must continue to be critical of how these systems currently operate to ensure positive outcomes upon their inevitable implementation. Whether it is the general desire for AI, its valuable assistance in medical settings, hyper-specific examples of help in sex education, or the expansion it offers to cyber intimacies, artificial intelligence is only in its infancy, and still beyond impressive.

Bibliography

Alexander, J., & Yescavage, K. (2018). Sex and the AI: Queering intimacies. Science Fiction Film and Television, 11(1), 73–96.

Bekenova, S., & Bekenova, A. (2023). Emotion recognition and classification based on audio data using AI. E3S Web of Conferences, 420, 10040. https://doi.org/10.1051/e3sconf/202342010040

Freedman, E. (2024). Sex education methodologies and pornography: Accountability of adult film companies as sex educators. Unpublished manuscript.

Furlo, N., Gleason, J., Feun, K., & Zytko, D. (2021). Rethinking Dating Apps as Sexual Consent Apps: A New Use Case for AI-Mediated Communication. Companion Publication of the 2021 Conference on Computer Supported Cooperative Work and Social Computing, 53–56. https://doi.org/10.1145/3462204.3481770

Hillinger, S. (Director). (2023). Money Shot: The Pornhub Story [Documentary film]. Netflix.

Hoff, J.-L. (2023). Unavoidable futures? How governments articulate sociotechnical imaginaries of AI and healthcare services. Futures, 148, 103131. https://doi.org/10.1016/j.futures.2023.103131

Nadarzynski, T., Bayley, J., Llewellyn, C., Kidsley, S., & Graham, C. A. (2020). Acceptability of artificial intelligence (AI)-enabled chatbots, video consultations and live webchats as online platforms for sexual health advice. BMJ Sexual & Reproductive Health, 46(3), 210–217. https://doi.org/10.1136/bmjsrh-2018-200271

Nadarzynski, T., Puentes, V., Pawlak, I., Mendes, T., Montgomery, I., Bayley, J., Ridge, D., & Newman, C. (2021). Barriers and facilitators to engagement with artificial intelligence (AI)-based chatbots for sexual and reproductive health advice: A qualitative analysis. Sexual Health, 18(5), 385–393. https://doi.org/10.1071/SH21123

Neelam, M. (2022). Aspects of artificial intelligence. In J. Karthikeyan, S.-H. Ting, & Y.-J. Ng (Eds.), Learning outcomes of classroom research (pp. 250–256). L'Ordine Nuovo Publication. ISBN 978-93-92995-15-6.

Tobin, S., Jayabalasingham, B., Huggett, S., de Kleijn, M., & Lawlor, B. (2019). A brief historical overview of artificial intelligence research. Information Services & Use, 39(4), 291–296. https://doi.org/10.3233/ISU-190060

Young, S. D., Crowley, J. S., & Vermund, S. H. (2021). Artificial intelligence and sexual health in the USA. The Lancet Digital Health, 3(8), e467–e468. https://doi.org/10.1016/S2589-7500(21)00117-5