
Tech & Personal Freedoms
Kids’ Safety Online
The rising mental health crisis among young people, including increasing rates of depression, anxiety, and suicide, has spurred growing concern about the impact of social media on adolescent well-being. The Biden-Harris administration’s Kids Online Health and Safety Task Force and various state-level initiatives are working to address these concerns by limiting harmful social media practices and enhancing data privacy. However, it remains unclear how effective these laws will be, or how to balance protecting youth from harmful content with preserving meaningful online engagement, especially for marginalized groups such as LGBTQ+ youth. The full scope of social media’s impact is also unclear, with research showing mixed results that range from positive to negative outcomes. Given the complexity of the issue, there is an urgent need for objective, evidence-based research to inform effective policies that protect youth without unintended consequences. This presents an opportunity for philanthropic support to drive informed, data-driven solutions for youth digital safety and mental health.
-
Rates of depression, anxiety, and suicide among young people are on the rise.
Social media’s role in shaping adolescent well-being is increasingly under scrutiny.
There is a pressing need for solutions to protect youth mental health in the digital age.
-
The Biden-Harris administration’s Kids Online Health and Safety Task Force aims to understand and regulate digital platforms' impact on youth.
There is uncertainty about the incoming administration’s approach to regulation.
States like California, New York, Illinois, and Colorado have begun implementing laws to restrict harmful social media practices (e.g., limiting notifications, improving data privacy).
As such, objective, data-driven research is required to inform these federal and state policy decisions.
-
Social media can have both positive and negative effects on adolescent well-being.
Research on the topic is inconsistent, with some studies showing negligible effects, while others find significant negative impacts, particularly on girls' mental health.
The multifaceted nature of social media complicates efforts to understand its full impact, as the effects vary based on how it is used, individual differences, and contextual factors.
-
Many young people, especially those from marginalized communities (e.g., LGBTQ+ youth), rely on the internet for connection and support.
Social media offers a critical space for youth to find belonging and resources that may be unavailable offline.
Parental verification systems, intended to enhance safety, could be problematic for youth from unsupportive or absent family environments.
Need: A balanced approach is required to protect youth from harmful content and addictive behaviors, while still allowing them meaningful participation in online spaces.
-
Without solid research, well-intentioned regulations could unintentionally harm vulnerable youth populations by restricting access to important online resources or limiting digital skill development.
The lack of clarity on social media’s effects means that laws could either be too restrictive or insufficiently protective, potentially exacerbating the problem.
-
This issue presents an urgent opportunity for donors to fund research that can drive policy decisions and create evidence-based solutions for protecting youth.
Your support can ensure that young people’s rights to access, connect, and engage online are respected while safeguarding their mental health and well-being.
Funding is essential to conduct rigorous studies, assess the real-world impact of current laws, and develop effective, data-driven policies.
Protecting Digital Identity
Digital identity technologies provide individuals with the ability to verify and authenticate their identity when accessing government services and public resources. These innovations hold transformative potential to enhance inclusivity, particularly for marginalized communities, by improving access to public assistance programs and essential financial services.
In the United States, initiatives such as Login.gov—a shared federal platform for identity proofing and authentication—represent early steps toward building a robust national digital identity infrastructure. However, implementing such systems faces significant challenges, including:
Ensuring equity and accessibility for all populations, including those without access to digital devices or broadband.
Balancing privacy and data protection with functionality.
Addressing potential harms, such as identity fraud, surveillance risks, and disproportionate impacts on marginalized groups.
Developing standards and frameworks that adapt to evolving security threats.
Philanthropic support can play a catalytic role by funding research on successful global examples, convening stakeholders from diverse sectors, and fostering a multidisciplinary approach to crafting U.S.-specific recommendations for implementing digital identity technologies.
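To give a concrete sense of what “authentication” means in practice for a shared platform such as Login.gov, the sketch below builds a generic OpenID Connect authorization request, the standards-based flow that shared identity providers typically support. This is an illustrative assumption, not actual Login.gov configuration: the endpoint URL, client ID, and redirect URI are hypothetical placeholders.

```python
# Illustrative only: constructing a generic OpenID Connect authorization URL,
# the first step of a "sign in with a shared identity provider" flow.
# The endpoint, client_id, and redirect_uri below are hypothetical placeholders.
import secrets
from urllib.parse import urlencode

IDP_AUTHORIZE_URL = "https://idp.example.gov/openid_connect/authorize"  # placeholder

def build_authorization_url(client_id: str, redirect_uri: str) -> str:
    """Return the URL a relying-party application redirects a user to for sign-in."""
    params = {
        "response_type": "code",             # authorization-code flow
        "client_id": client_id,              # identifier issued to the agency application
        "redirect_uri": redirect_uri,        # where the provider sends the user afterwards
        "scope": "openid email",             # identity attributes being requested
        "state": secrets.token_urlsafe(16),  # CSRF protection, checked on return
        "nonce": secrets.token_urlsafe(16),  # binds the eventual ID token to this request
    }
    return f"{IDP_AUTHORIZE_URL}?{urlencode(params)}"

# Example: a benefits application redirects the user here to sign in and prove identity.
print(build_authorization_url("example-agency-app", "https://benefits.example.gov/callback"))
```

The design point this illustrates is that a shared identity layer lets many agency services rely on one proofing and sign-in flow rather than each building its own, which is where the efficiency and inclusion gains described below come from.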
-
Social Inclusion: Digital identity can expand access to public programs and financial services for underserved populations, including unbanked individuals and those without traditional identification documents.
Efficiency Gains: Streamlined authentication systems can reduce administrative burdens and improve user experiences for citizens and governments alike.
Economic Development: A trusted digital identity ecosystem can unlock economic opportunities, such as enabling secure online transactions and fostering innovation in e-governance.
Global Learnings: Success stories like India’s Aadhaar system or Estonia’s e-Residency program offer valuable insights on scalable implementation.
-
Equity and Accessibility: Ensuring digital identity systems do not exclude those with limited digital literacy, no internet access, or disabilities.
Privacy and Security Risks: Mitigating risks of data breaches, unauthorized surveillance, and misuse of personal information.
Trust and Adoption: Building public confidence in digital identity systems, particularly in communities with historical mistrust of government technology.
Interoperability and Standards: Developing interoperable systems across federal, state, and local levels while addressing conflicting regulations.
Cost and Sustainability: Balancing the financial investment required for infrastructure with long-term maintenance needs.
-
Authoritarian governments or entities may use digital identity systems to surveil and suppress dissent or opposition.
Poorly implemented systems can deny services to those who cannot meet stringent identity verification standards, exacerbating existing vulnerabilities.
Private firms managing digital identity systems may prioritize profit, leading to exploitation of user data or exclusion of populations that offer less commercial value.
Weak regulatory frameworks often fail to safeguard sensitive personal information, exposing individuals to risks of identity theft or fraud.
Historical mistrust of governments or global organizations makes communities skeptical of participating in digital identity programs.
-
Strategic investments, collaborative policymaking, and philanthropic support can drive progress toward a secure and inclusive digital identity ecosystem.
Multilateral institutions and private donors are working to help developing countries design inclusive, privacy-respecting ID systems. Philanthropies could fund pilot programs in countries with large, unregistered populations.
It is essential to fund advocacy organizations that work on improving global digital rights, data privacy policies, and ethical tech use in developing countries.
Non-Consensual Deepfake Pornography
Deepfake pornography – non-consensual, AI-generated material depicting individuals in sexually explicit contexts – has grown in volume alongside the rapid development of generative AI models. The impact of this trend is tragically evident: in 2023, the United States cybersecurity firm Home Security Heroes reported 95,820 deepfake videos online (a 550% increase from 2019), 98% of which were pornographic. In addition, students and law enforcement officials alike report deepfake image-based sexual abuse with increasing frequency. A number of state-level bills address deepfake generation; however, at least 23 states have no laws restricting it. At the federal level, two pieces of legislation currently aim to provide victims with some form of protection: the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act of 2024 and the Preventing Deepfakes of Intimate Images Act.
-
AI-assisted sexual abuse inflicts many different types of harm on victims, including, but not limited to:
Emotional Harm – In addition to instigating a heavy sense of mistrust and isolation, AI-assisted sexual abuse can also result in PTSD, body dysmorphia, anxiety, and depression
Reputational Harm – Deepfake abuse is comparable to “revenge porn” and is often created and shared with the intention to humiliate victims and disrupt public perception of their persona
Occupational/Economic Harm – An extension of reputational harm; there have been a number of cases in which victims of AI-assisted sexual abuse are fired by their employers because of generated material
-
This is an issue that disproportionately impacts women: 99% of all victims of AI-assisted sexual abuse identify as women
Need: Framing deepfake abuse as a feminist issue, in addition to an online safety, privacy, and moderation issue, will allow us to more effectively challenge sexist norms in online spaces
-
Because the relevant technologies (e.g., GANs and GPTs) are so accessible, there has been a growing number of cases in which underage students are the targets of AI-assisted sexual abuse
Young victims are increasingly susceptible to long-term psychological harm as a result of abuse.
Need: In addition to addressing deepfakes via legislation, targeted education initiatives should be implemented in schools; for example, existing sexual harassment prevention curricula should be expanded to cover AI-assisted sexual abuse.
-
The DEFIANCE Act of 2024 establishes a civil remedy for individuals subjected to “digital forgery”: victims may recover monetary damages from producers and holders of deepfaked images. The Act also offers other mechanisms to restore victim privacy. The bill, sponsored by Rep. Alexandria Ocasio-Cortez (D-NY), has passed the Senate and is currently before the House.
The Preventing Deepfakes of Intimate Images Act would create a criminal penalty for distributing non-consensual, sexually explicit deepfakes. The bill was introduced in the House in 2023.
Need: Federal deepfake legislation would grant victims the ability to seek aid in all states. Supporting these bills is a necessary first step.
-
Current federal-level initiatives take a reactive approach to deepfake policy, in that they grant victims the ability to seek support and/or damages after they have been subjected to abuse
These bills are preventative in some sense, but they do not address the generation software itself. Implementing new systems – for example, irremovable and traceable digital watermarks on AI-generated images (see the illustrative sketch after this list) – could be a stronger preventative measure
Need: More research needs to be done on ways to institute proactive measures for deepfake prevention
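To make the watermarking idea above concrete, the sketch below shows the simplest possible form of an embedded provenance tag: hiding a short payload in the least-significant bits of an image array. This is an illustrative toy only, not a description of any deployed system; the function names and payload string are hypothetical, and real provenance schemes (such as cryptographically signed metadata or model-level watermarking) are designed to survive cropping, compression, and re-encoding, which this sketch is not.

```python
# Toy sketch only: hide a short provenance payload in the least-significant
# bits of an image array. Real watermarking/provenance systems are far more
# robust; all names and the payload below are hypothetical.
import numpy as np

def embed_payload(image: np.ndarray, payload: bytes) -> np.ndarray:
    """Hide `payload` in the least-significant bits of a uint8 image array."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = image.flatten()  # flatten() returns a copy, so the input is untouched
    if bits.size > flat.size:
        raise ValueError("payload too large for this image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits  # overwrite the LSBs
    return flat.reshape(image.shape)

def extract_payload(image: np.ndarray, n_bytes: int) -> bytes:
    """Recover the first `n_bytes` of a payload hidden by embed_payload."""
    bits = image.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

# Example: tag a synthetic image with a hypothetical generator identifier.
img = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
tag = b"GEN-MODEL-ID:0042"
tagged = embed_payload(img, tag)
assert extract_payload(tagged, len(tag)) == tag
```

A scheme like this illustrates traceability (the tag identifies the generating model), but it is trivially removable; the “irremovable” watermarks referenced above remain an open research problem, which is part of why further study is needed.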
-
The rapid evolution of generative image software has led to a sharp increase in the number of AI-generated images and, in parallel, a drastic increase in the number of non-consensual deepfakes. It is therefore essential to develop empathetic policy that can match the issue’s growth
Understanding this as an intersectional issue (feminism, child safety) allows for the inclusion of more – and more relevant – voices of support
Research into proactive policy solutions that enable innovation, but not at the cost of user safety, is a worthy philanthropic endeavor
Organizations Working Towards Progress
-
AnitaB.org works to advance and empower women and nonbinary technologists through a range of avenues, from research dedicated to alleviating gender disparity in the tech industry to programs and events such as the Grace Hopper Celebration, which hosts the world’s largest gathering of women and nonbinary technologists.
Email giving@anitab.org to learn more about donating.
-
Rewriting the Code is a registered 501(c)(3) non-profit organization that empowers students and early career women in technology to belong in all spaces where the future is shaped. Through intersectional communities, actionable data insights, and continuous engagement, the organization advocates for equity, fosters connections, and supports the next generation of tech leaders globally.
Email contact@rewritingthecode.org or visit https://rewritingthecode.org/contact/ to learn more about donating.
-
The Society of Women Engineers (SWE) is dedicated to giving women engineers a unique place and voice within the technology and engineering sectors. SWE establishes engineering as a highly desirable career for women through training and development programs, networking opportunities, scholarships, and outreach and advocacy activities.
Email funddevelopment@swe.org or hq@swe.org to learn more about donating.
-
The Center for Humane Technology is dedicated to raising awareness of the negative impacts of technology on individuals and society through powerful media, strategic partnerships, and resources that empower leaders to take bold, coordinated action. One of their key achievements is producing the Emmy-winning documentary The Social Dilemma (2020), which exposes the harmful effects of social media. The center also has a dedicated policy team whose work spans from supporting state-level initiatives like Vermont’s Kids Code to engaging in high-level federal policy discussions, such as the Senate’s bipartisan AI Insight Forum. The team focuses on four issue areas, one of which is future generations.
Email giving@humanetech.com to discuss the impact your gift could have.
-
As digital technology becomes deeply embedded in every aspect of life, it is reshaping how we connect, work, and understand our mental and physical well-being. The Center for Digital Ethics (CDE) is at the forefront of this field, conducting primary research on digital well-being as one of its four core programs. This research focuses on understanding the impact of technology on mental health, social relationships, and overall wellness. CDE’s scholars, including Cal Newport, Will Fleisher, and Meg Leta Jones, are leaders in the field, with their work featured prominently in both academic circles and mainstream media. By addressing the pressing issues of our digital age, CDE is shaping the future of technology’s role in society. Your support can help further this essential work.
Contact digitalethics@georgetown.edu to discuss the impact your gift could have.
-
The Beeck Center for Social Impact + Innovation at Georgetown University is dedicated to advancing social change through data-driven and technology-focused solutions. A key initiative is the Digital Benefits Network (DBN), which supports government efforts to deliver public benefits services that are accessible, effective, and equitable. DBN has launched a research agenda to provide in-depth learnings about identity verification and authentication practices across core social safety net programs.
Contact beeckcenter@georgetown.edu to discuss the impact your gift could have.
-
Access Now is an international organization dedicated to defending and extending the digital rights of users at risk. Through its #WhyID campaign, the organization critically examines digital identification systems, highlighting potential risks such as privacy infringements, increased surveillance, and the marginalization of vulnerable populations. Access Now advocates for human rights-centered approaches to digital identity, emphasizing data minimization and user autonomy to prevent discrimination and uphold individual dignity.
Contact melissa@accessnow.org to discuss the impact your gift could have.
-
The mission of the Knight-Georgetown Institute (KGI) is to channel insights and evidence from the research and public interest communities into actionable solutions that shape technology policy and industry decisions. One of KGI’s key areas of focus is platform governance, which examines how the design of digital platforms impacts user behavior, community dynamics, and societal outcomes. KGI is working to strengthen the connections between research and policy, building relationships among leading academics, legislators, and enforcement authorities to address these complex questions, including those related to youth, the First Amendment, and Section 230. Through its research and partnerships, KGI aims to ensure that platform governance evolves in a way that effectively mitigates harm and promotes the public good.
Email knightgeorgetown@georgetown.edu to learn more about donating.
-
The Trevor Project is committed to safeguarding the health and well-being of LGBTQ+ youth through a variety of impactful initiatives. One key aspect of this work is conducting innovative research that advances knowledge in LGBTQ+ mental health and suicide prevention. Recent studies have focused on the unique experiences of LGBTQ+ youth online, ensuring their voices are amplified and their specific needs are addressed in the digital age. Drawing from these findings, The Trevor Project has developed TrevorSpace, an affirming online community for LGBTQ+ young people between the ages of 13 and 24. Additionally, the organization leverages this research to advocate in Congress for stronger protections and policies for the LGBTQ+ community, working to ensure that the needs of youth are prioritized in both digital spaces and legal frameworks.
To learn more about the Trevor Project and to donate, visit their website https://www.thetrevorproject.org/