Exploring the role of ethics in designing period tracking applications in a post-Roe v. Wade world.

-Sarvani Polisetty


Abstract

My goal for this synthesis paper is to explore the landscape of data privacy in mHealth tracking applications. Through this, I want to draw attention to the privacy concerns of using period and fertility tracking applications in a surveillance economy post-Roe v. Wade, and to examine what role design ethics can play in alleviating the problem for users. To explore the landscape of healthcare data privacy, I start with the guidelines and laws that protect the data privacy of mHealth application users and regulate the practices of tracking applications. I then examine how period data tracking is regulated, why tracking certain private user data raises serious concerns, how application developers and app stores exploit gaps in the law for unethical data collection, and ultimately what various perspectives suggest users (people who track period data) should do to safeguard themselves in today's world, and how design ethics can help, if it can.

Keywords: Data Privacy, Period Tracking, FemTech, Menstrual Surveillance, Surveillance Economy, Abortion Rights, Roe v. Wade, Post-Dobbs, HIPAA, Healthcare Data, Design Ethics


Introduction

In today's world, where data is considered power, there is cause for concern for people in the United States who track their personal health data through technology. While data privacy risks exist in most domains, threats to personal health data can leave users especially vulnerable and, in some cases, may even put their lives at risk. A domain like female health data tracking poses particular threats to users in the United States, where strict laws now govern certain reproductive health processes. In June 2022, the U.S. Supreme Court overturned Roe v. Wade, revoking the nationwide right to abortion and leaving each state to decide its own stance on abortion rights. Close to half the states in the country have since criminalized abortion or come very close to doing so. [9]


Data privacy in healthcare

Over the past few years, there has been a steady rise in the adoption of personal health-tracking devices. Menstrual data tracking applications are among the fastest-growing businesses in the "FemTech" industry, widely regarded as positive health literacy tools that have gained broad adoption. [1] These applications, which collect large volumes of health metrics, have raised questions about the impact of potential data leaks. A new form of health data surveillance, "menstrual surveillance," has emerged in the FemTech market, tracking some of the most sensitive data in women's lives. [2,3] Sensitive health data has proven vulnerable to malicious third parties such as insurance companies, cybercriminals, and big-data predictive analytics companies, which access and use patient data without users' knowledge or consent. [2]


Regulations for Healthcare Data Privacy


“Hardly any US laws contain upstream requirements that minimize or otherwise limit data collection.”[4] 

The existing federal healthcare privacy laws, the Health Insurance Portability and Accountability Act (HIPAA) Privacy and Security Rules, fail to address the digital data security concerns of today's world. With patient records becoming increasingly digital and cloud-hosted, and with the rise of wearable and mobile health tracking, the amount of health data available digitally has grown rapidly over the years [4]. Although HIPAA, while weak in comparison to the General Data Protection Regulation (GDPR) in Europe [4,5], protects some basic patient data privacy rights, areas such as period tracking that are not covered under its guidelines are left all the more vulnerable to data threats. The HIPAA regulations, once thought extensive and well enforced, were written in 1996 and fall short in accounting for the ways data is stored and shared today. As health data increasingly moves to the cloud and consumer health-tracking companies encourage users to collect huge amounts of data themselves, the security of this data is put at risk. Further, with the 2024 election results in the United States, there is fear about how strongly these regulations will be enforced going forward.

GDPR vs. HIPAA

HIPAA lays out a domain-specific set of regulations for healthcare, unlike its European counterpart, the GDPR, which focuses on consumer data protection across domains. Another differentiator is that HIPAA focuses on the entities that control data, such as insurance companies and healthcare providers, rather than on the type of data itself. Health data handled by entities outside the traditional healthcare system, such as fitness app developers, is not controlled by the HIPAA guidelines. [5,6]

The GDPR enforces its regulations so that users' privacy while using technology is protected. It "adopts a comprehensive and rights-based approach, emphasizing individual rights to privacy, data portability, and the 'right to be forgotten.'" [7] Penalties for violating these guidelines are consistent across the regulation's jurisdiction, unlike in the US, where penalties vary from state to state. This comparison underscores why period tracking apps fall into regulatory gaps, leaving users' sensitive data exposed. [13]

Some studies [8,9] show that even the GDPR fails to completely protect consumer data privacy in applications downloaded from the Play Store and App Store. Their authors suggest that penalties and repercussions are not strictly enforced, allowing perpetrators to carry out acts such as collecting data without user consent, failing to display a privacy policy, and collecting location information without users' consent or knowledge.

Because data privacy regulations such as these are under-enforced and riddled with gaps and loopholes, application developers and the App Store and Play Store use this to their advantage. A study of 30 different period tracking applications describes how the applications do not respect users' privacy: existing regulations do not stop period- and fertility-related data on these apps from being shared with third parties. The study points to how these apps are labeled on the Play Store and App Store as one reason. Some are labeled "Health and Wellness," some "Medical," and others "Communication." This miscategorization makes it difficult to maintain consistency across the apps and the regulations they are bound by, and some applications take advantage of it to misuse the data.


Impact of Abortion Bans on Period Data Privacy

While data protection concerns around period and fertility tracking apps have existed for a long time, they did not become a topic of debate and alarm until June 2022, when the federal right to abortion in the United States was revoked. This led to a rise in anguish and concern over what the ruling means for individual women's bodily rights and over who holds the authority to make such laws. It also translated into concerns about data privacy on period and fertility tracking applications. Some research shows that users do not trust these apps, based on past experiences with data being shared with third parties. Many period tracking applications were found to be unethically collecting, storing, and sending data to unauthorized parties, violating their privacy policies and the trust of their users. [10,11,12] Users feared their data could reach the wrong hands, including governments in states that have criminalized abortion. Many women across the country are upset with the ruling and concerned about the implications of this breach of their privacy.

Under the current abortion laws, period and fertility data tracked by users could be interpreted to predict cases where abortions might have occurred. Whoever holds the data could interpret it, correctly or incorrectly, in ways the users never intended, posing a threat to users who did not wish to share their private, sensitive data with any third parties.

Users who had long used period and fertility tracking applications, even while aware of the privacy issues, did so because they felt the benefits of tracking their reproductive health outweighed the risks. After Roe v. Wade was overturned, however, many applications saw a sudden drop in users due to these underlying privacy concerns. While some users resorted to deleting the applications, others remained unaware of the implications of these sensitive data breaches; they did not realize such activity was happening or what the repercussions might be if the data were accessed by the government. Some users did not know the applications' privacy policies or found them too hard to comprehend, and some were unaware of what they were consenting to when accepting those policies. Some applications promised data privacy and did not follow through; others were found to use dark patterns to get users to sign up, making their policies too hard for users to understand. [11,12,2]

Studies examining period tracking apps find that many gather and retain private user information without users' knowledge, including location, sexual activity, and mental health issues. Some applications access location data, and some upload data straight to their servers. The studies report a significant rise in concern among users since the ruling. Sixteen applications suggest they may share data with authorities, and two apps collect more data than they disclose. User reviews show suspicion and a preference for European apps. The research makes several recommendations, such as restricting data collection, allowing anonymous use, offering local storage, and advising users to verify permissions and avoid revealing real information. [10] The research also showed that although companies portray their privacy policies as protecting users and safeguarding their data, the reality was far from it. These companies had to balance protecting user privacy with being ready to comply with government requests for data; some emphasized technical features, like keeping user data anonymous, while others were less clear about their practices. [12]


The Role of Design Ethics in Addressing Data Privacy

This topic could be termed a wicked problem that may not have a clear design solution capable of solving it entirely. While changing policies in healthcare and women's health rights may be outside the purview of HCI, a designer can nevertheless propose solutions that help users navigate the wickedness of women's health regulations. One way designers can help is by designing to reduce the collection of user data. Flo, a period tracking application, introduced a feature to track data without logging in or creating an account: users can track in an "Anonymous Mode" that helps protect their private information. [14] Other proposed solutions include saving data locally on mobile devices rather than on the cloud, reducing the possibility of it being accessed by an unintended audience. There were also suggestions to move away from digital tracking altogether and use physical methods, such as a paper calendar or journal, to track period and fertility data. This made me realize that technology may not hold the solution to every problem, and that non-digital tools could sometimes help. A dilemma I was stuck with here was wondering whether using a non-digital solution was "running away from the problem" rather than trying to solve it.
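To make the local-storage idea discussed above concrete, the sketch below is a minimal, hypothetical design (the `LocalCycleLog` class and its file layout are my own invention, not any real app's implementation). It keeps cycle entries in a plain JSON file on the device, with no account, no network calls, and no fields beyond the dates themselves:

```python
import json
from datetime import date
from pathlib import Path


class LocalCycleLog:
    """Minimal on-device cycle log: no account, no server, no identifiers.

    Entries are ISO date strings only -- the data-minimization principle
    of storing nothing beyond what the core feature needs.
    """

    def __init__(self, path: Path):
        self.path = path
        # Load any previously saved entries from the local file.
        if self.path.exists():
            self.entries = json.loads(self.path.read_text())
        else:
            self.entries = []

    def log_period_start(self, day: date) -> None:
        self.entries.append(day.isoformat())
        # Persist locally; the file never leaves the device.
        self.path.write_text(json.dumps(self.entries))

    def erase_all(self) -> None:
        """Support the user's right to delete everything, instantly."""
        self.entries = []
        if self.path.exists():
            self.path.unlink()
```

Local storage does not defend against device seizure, which is why the literature also recommends encryption at rest and easy one-tap deletion; the `erase_all` method gestures at the latter.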


Recommendations for Ethical Design and Policy Alignment

Several articles argue that better policies and application designs are needed to protect the bodily autonomy of users. They call for regulatory bodies to protect users' privacy, pointing out that HIPAA compliance, regulated by the Department of Health and Human Services (HHS), does not cover data protection for period tracking. When health laws change the way users interact with applications online and give rise to privacy concerns, there must be laws that evaluate those concerns and propose better rules. The articles also call on designers and app developers to consider the ethics of sharing private and sensitive data: they question the need to save the data on the cloud and the effectiveness of period tracking calendars, and they urge designers to review the collection of data such as mood and location and allow only the input of absolutely required data [2,5]. While one paper argued that users also bear responsibility for understanding privacy policies before signing up for such applications [1], the general consensus was that we need better ethical design and policy regulation around data protection for women's reproductive health, taking into account current methods of data tracking and sharing, to protect users' data and possibly their lives.
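The recommendation to allow only absolutely required data could be enforced at the input layer itself. The sketch below is a hypothetical helper (the field names and allow-list are illustrative, not drawn from any cited app): anything outside the allow-list, such as location or mood, is dropped before it can ever be stored.

```python
# Allow-list of the only fields the core feature actually requires.
REQUIRED_FIELDS = {"cycle_start"}
OPTIONAL_ALLOWED: set = set()  # deliberately empty: no mood, no location


def minimize_entry(raw: dict) -> dict:
    """Return only allow-listed fields; raise if required data is missing.

    Fields not on the allow-list (location, mood, sexual activity, etc.)
    are dropped rather than stored -- data minimization by construction.
    """
    missing = REQUIRED_FIELDS - raw.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    allowed = REQUIRED_FIELDS | OPTIONAL_ALLOWED
    return {k: v for k, v in raw.items() if k in allowed}
```

For example, `minimize_entry({"cycle_start": "2024-01-05", "location": "..."})` would keep only the `cycle_start` field. The design choice here is an allow-list rather than a block-list: new sensitive fields added later are excluded by default instead of leaking through.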


Conclusion

Through this synthesis paper exploration, I have discovered multiple facets of this topic. There were many times I had to pause and wonder why some problems were so complicated to address. At times I felt hopeless and helpless discovering the number of stakeholders in this domain, and realizing that even if we do want change to occur, it might take decades to see it happen. In a capitalist surveillance economy that typically puts profits before ethics, there is an imminent need for policymakers, designers, and developers of "FemTech" applications to consider the perils of unfair, unethical practices for the sake of humanity. I wonder how much change can be created by using, or abolishing, terms that label technology as female-centric; I still struggle to understand whether setting such technology apart marginalizes it further from the center of discussions or highlights the need for more people from the community to be involved in decision-making. As a designer trying to explore this field, I do not have a straightforward argument that provides the answer, which is what makes this the wicked problem it is. I am, however, optimistic that we will see change in the future. Research in female healthcare technology is still highly inadequate at understanding the problems women face and how they are affected by laws differently from other genders. Recently coming across a few pleasantly surprising changes, with applications I use every day incorporating guidelines against UX dark patterns, makes me optimistic that bringing such issues to light through research can help change the world one research paper at a time.


References

[1] Kelly, B. G., & Habib, M. (2023). Missed period? The significance of period-tracking applications in a post-Roe America. Sexual and reproductive health matters, 31(4), 2238940. https://doi.org/10.1080/26410397.2023.2238940

[2] Drew Harwell. 2019. Is your pregnancy app sharing your intimate data with your boss? https://www.washingtonpost.com/technology/2019/04/10/tracking-your-pregnancy-an-app-may-be-more-public-than-you-think/

[3] Amy Olivero. 2022. Privacy and digital health data: The femtech challenge. https://iapp.org/news/a/privacy-and-digital-health-data-the-femtech-challenge/

[4] N. Terry, Existential challenges for healthcare data protection in the United States, Ethics, Medicine and Public Health, Volume 3, Issue 1, 2017, Pages 19-27, ISSN 2352-5525, https://doi.org/10.1016/j.jemep.2017.02.007

[5] M. Landesberg, T. Levin, C. Curtin, O. Lev, Privacy online: a report to congress, US Federal Trade Commission, Washington, D.C (1998)

[6] Simplification, H. A. (2006). Office of the Secretary 45 CFR Parts 160 and 164.

[7] Seun Solomon Bakare, Adekunle Oyeyemi Adeniyi, Chidiogo Uzoamaka Akpuokwe, & Nkechi Emmanuella Eneh. (2024). DATA PRIVACY LAWS AND COMPLIANCE: A COMPARATIVE REVIEW OF THE EU GDPR AND USA REGULATIONS. Computer Science & IT Research Journal, 5(3), 528-543. https://doi.org/10.51594/csitrj.v5i3.859

[8] Maryam Mehrnezhad and Teresa Almeida. 2021. Caring for Intimate Data in Fertility Technologies. In CHI Conference on Human Factors in Computing Systems (CHI ’21), May 8–13, 2021, Yokohama, Japan. ACM, New York, NY, USA, 11 pages. https://doi.org/10.1145/3411764.3445132

[9] Center for Reproductive Rights. After Roe Fell: Abortion Laws by State (Dec 2024). https://reproductiverights.org/maps/abortion-laws-by-state/

[10] Zikan Dong, Liu Wang, Hao Xie, Guoai Xu, and Haoyu Wang. 2022. Privacy Analysis of Period Tracking Mobile Apps in the Post-Roe v. Wade Era. In 37th IEEE/ACM International Conference on Automated Software Engineering (ASE '22), October 10–14, 2022, Rochester, MI, USA. ACM, New York, NY, USA, 6 pages. https://doi.org/10.1145/3551349.3561343

[11] Jiaxun Cao, Hiba Laabadli, Chase Mathis, Rebecca Stern, and Pardis Emami-Naeini. 2024. “I Deleted It After the Overturn of Roe v. Wade”: Understanding Women’s Privacy Concerns Toward Period-Tracking Apps in the Post Roe v. Wade Era. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 22 pages. https://doi.org/10.1145/3613904.3642042

[12] Qiurong Song, Rie Helene (Lindy) Hernandez, Yubo Kou, and Xinning Gui. 2024. “Our Users’ Privacy is Paramount to Us”: A Discourse Analysis of How Period and Fertility Tracking App Companies Address the Roe v. Wade Overturn. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 21 pages. https://doi.org/10.1145/3613904.3642384

[13] Kollnig, K. & Binns, R. & Van Kleek, M. & Lyngs, U. & Zhao, J. & Tinsman, C.& Shadbolt, N. (2021). Before and after GDPR: tracking in mobile apps. Internet Policy Review, 10(4). https://doi.org/10.14763/2021.4.1611

[14] 2022. Flo, the leading female health app, launches ‘Anonymous Mode’ to further protect reproductive health information in wake of Roe v. Wade decision. https://flo.health/press-center/flo-launches-anonymous-mode

[15] Citron, Danielle Keats, Intimate Privacy in a Post-Roe World (March 13, 2023). Florida Law Review, Forthcoming, Virginia Public Law and Legal Theory Research Paper No. 2023-22, Available at SSRN: https://ssrn.com/abstract=4387341

[16] Arielle A. J. Scoglio and Sameera S. Nayak. Alignment of state-level policies and public attitudes towards abortion legality and government restrictions on abortion in the United States. Social Science & Medicine, Volume 320, 2023, 115724, ISSN 0277-9536. https://doi.org/10.1016/j.socscimed.2023.115724

[17] Han, C., Reyes, I., Elazari Bar On, A., Reardon, J., Feal, A., Egelman, S., & Vallina-Rodriguez, N. (2019, May). Do you get what you pay for? Comparing the privacy behaviors of free vs. paid apps. In Workshop on Technology and Consumer Protection (ConPro 2019), in conjunction with the 39th IEEE Symposium on Security and Privacy, 23 May 2019, San Francisco, CA, USA.

[18] Sajid, A., & Abbas, H. (2016). Data privacy in cloud-assisted healthcare systems: state of the art and future challenges. Journal of medical systems, 40(6), 155.

[19] Behrendt, F., & Sheller, M. (2023). Mobility data justice. Mobilities, 19(1), 151–169. https://doi.org/10.1080/17450101.2023.2200148

[20] Paldan, K., Sauer, H. & Wagner, NF. Promoting inequality? Self-monitoring applications and the problem of social justice. AI & Soc 38, 2597–2607 (2023). https://doi.org/10.1007/s00146-018-0835-7

[21] Taylor, L. (2023). Data justice, computational social science and policy. In Handbook of computational social science for policy (pp. 41-56). Cham: Springer International Publishing.

[22] Bernard P, Charafeddine R, Frohlich KL, Daniel M, Kestens Y, Potvin L (2007) Health inequalities and place: a theoretical conception of neighbourhood. Soc Sci Med (1982) 65(9):1839–1852. https://doi.org/10.1016/j.socscimed.2007.05.037

[23] D'Ignazio, C., Klein, L. F. (2023). Data Feminism. United Kingdom: MIT Press.

[24] Ana O Henriques, Hugo Nicolau, Anna R. L. Carter, Kyle Montague, Reem Talhouk, Angelika Strohmayer, Sarah Rüller, Cayley Macarthur, Shaowen Bardzell, Colin M. Gray, and Eleonore Fournier-Tombs. 2024. Fostering Feminist Community-Led Ethics: Building Tools and Connections. In Companion Publication of the 2024 ACM Designing Interactive Systems Conference (DIS '24 Companion). Association for Computing Machinery, New York, NY, USA, 424–428. https://doi.org/10.1145/3656156.3658385

[25] Najd Alfawzan, Markus Christen, Giovanni Spitale, Nikola Biller-Andorno, et al. 2022. Privacy, Data Sharing, and Data Security Policies of Women’s mHealth Apps: Scoping Review and Content Analysis. JMIR mHealth and uHealth 10, 5 (2022), e33735.

[26] Nastaran Bateni, Jasmin Kaur, Rozita Dara, and Fei Song. 2022. Content Analysis of Privacy Policies Before and After GDPR. IEEE, 1–9.

[27]  Shamal Ahmed Hama Aziz and Sara Kamal Othman. 2020. Speech acts uses in persuasion and deception in marketing discourse. Journal of University of Babylon for Humanities 28, 6 (2020), 12.

[28] Flora Garamvolgyi. 2022. Why US women are deleting their period tracking apps. The Guardian. Retrieved October 27, 2022 from https://www.theguardian.com/world/2022/jun/28/why-us-woman-are-deleting-their-period-tracking-apps

[29] Olivia Goldhill. 2019. “FemTech” is not and should not be a thing. Quartz. Retrieved January 15, 2023 from https://qz.com/1586815/why-femtech-is-a-sexistcategory/

[30] Tobias Dehling and Ali Sunyaev. 2023. A design theory for transparency of information privacy practices. Information Systems Research (2023).