Paragon Policy Fellowship, Carbon Hack 24 and a Master’s in Design for Responsible AI
Allen Institute for AI (AI2) & Center for Democracy & Technology, Future of Life Institute + All Tech is Human, Encode Justice, Humane Tech Community

👋 Hello, Humane Tech Pioneers!
The rise of AI deepfakes poses significant threats to personal privacy, the integrity of our election systems, and the overall reliability of our information ecosystem. This challenge also opens up critical career paths in technology, policy, and education. You can be part of the solution.
In this week’s newsletter:
🚨 NEWS - AI deepfakes, election security, and the vast reach of Facebook's trackers.
🎓 LEARNING PATHWAYS - Paragon Policy Fellowship, Carbon Hack 24 and a Master’s in Design for Responsible AI.
🌐 ORGANIZATIONS - Allen Institute for AI (AI2), Center for Democracy & Technology, Future of Life Institute.
👥 PEOPLE & COMMUNITIES - Five Responsible AI Pioneers + All Tech is Human, Encode Justice, Humane Tech Community.
⚙️ TOOLS & RESOURCES - The Responsible Tech Guide; Safe Foundation Model Deployment; Understanding Private Surveillance Providers and Technologies; Regulating AI Deepfakes and Synthetic Media in the Political Arena.
HUMANE TECH NEWS
TRUST & SAFETY

Source: Midjourney (AI-generated Image)
Summary: Recently, explicit deepfake images of pop star Taylor Swift went viral on social media, particularly on X, amassing millions of views before being taken down.
Takeaway: Deepfakes targeting real people expose flaws in both current software and social media policies, which urgently need strengthening to curb the creation and spread of these harmful AI-generated manipulations.
CYBER & DEMOCRACY

Source: Midjourney (AI-generated Image)
Summary: OpenAI outlines its approach to safeguarding the 2024 elections, emphasizing abuse prevention, AI content transparency, and authoritative voting information access. Strategies include enhancing safety protocols, collaborating with organizations like NASS, and ensuring transparent AI-generated content.
Takeaway: This shows how AI companies can act responsibly to mitigate the potential negative impacts of generative AI on the integrity of elections worldwide. While self-regulation is important, protecting democracy demands industry-wide standards, built through a collaborative effort between AI companies, lawmakers, and civil society.
SURVEILLANCE CAPITALISM

Source: Midjourney (AI-generated Image)
Summary: According to new research co-published by Consumer Reports and The Markup, Facebook users are monitored by thousands of companies, with data on each person shared by an average of 2,230 companies. The authors argue that Meta's tracking practices are invasive and give companies too much power over users' personal information, and they call for more transparency and control over how Meta collects and shares user data.
Takeaway: Companies using our data to gather details about our lives, without our full awareness, exemplify what Shoshana Zuboff calls "Surveillance Capitalism." This constant tracking can have a chilling effect, as the data can be used to manipulate our choices and discriminate against us based on predictions.
LEARNING PATHWAYS
LEARNING PATHWAY
Online
What is it?
The Paragon Fellowship offers hands-on policy experience in science and technology governance at the state and local level. Fellows tackle current or emerging tech-related issues relevant to their state and local governments while contributing directly to informed policymaking.
Who is the Program For?
Future Tech Policy Practitioners - Ideally suited for individuals who envision roles on government tech/innovation boards, working with elected officials to craft ethical guidelines, etc. This provides invaluable on-the-ground experience.
Policy Students Seeking Tech Specialization - It could be a great addition for those studying broader governance and seeking real-world projects to understand how science and tech intersect with rulemaking.
"Community Impact" Motivated Individuals - If your goal is less about climbing the career ladder and more about shaping your immediate surroundings, working directly with local governing bodies offers a great opportunity.
Future Fit Criteria
Exponential Technologies ⭐️⭐️⭐️ While not deeply centered on emerging tech, the program implicitly requires fellows to grapple with its implications. A fellow working on broadband infrastructure, for example, must consider how future changes to speed and access will affect governance; similarly, AI procurement policies may have long-term repercussions for communities.
Real-World Challenges ⭐️⭐️⭐️⭐️⭐️ The Paragon Fellowship is hyper-focused on addressing tangible problems of direct importance to government partners. This ensures fellows engage with complex real-world scenarios affecting society, mirroring challenges those in tech policy will face.
Innovation Skills ⭐️⭐️⭐️ Innovation here lies in devising creative policy solutions that address tech-related problems for communities. It might sometimes involve adapting pre-existing ideas to localized, specific needs.
Human Skills ⭐️⭐️⭐️⭐️⭐️ This is a significant strength of the fellowship. Communicating technical ideas and policy recommendations to government leaders requires empathy, clarity, and diplomacy. This transferable skillset is essential in the increasingly diverse landscape of tech policy.
Work-Based Learning ⭐️⭐️⭐️⭐️⭐️ The cornerstone of the Paragon Fellowship is direct integration with actual governments on genuine policy needs. It goes beyond theory, and this immersion makes it extremely future-fit.
Industry Connections ⭐️⭐️⭐️⭐️ Building a strong network in local and state government is the most likely direct benefit. While the policy sector may be less glamorous than some tech giants, there is growing demand for experts who can navigate this interface.
LEARNING PATHWAY
by Green Software Foundation
Online
What is it?
Carbon Hack 24 is an online hackathon aimed at developing software solutions to reduce the ecological footprint of software, focusing on carbon emissions, water usage, and land impact.
Who is the Program For?
This program is for technology professionals, developers, engineers, and students interested in creating sustainable software solutions. It prepares participants for careers in green software development, environmental impact analysis, and sustainability consulting.
Future Fit Criteria
Exponential Technologies ⭐️⭐️⭐️⭐️
Focuses on sustainable software practices and the Impact Framework for measuring software's environmental impact, including cloud, mobile, and AI technologies.
Real-World Challenges ⭐️⭐️⭐️⭐️
Participants work in teams to tackle real-world problems related to software's ecological footprint, using the Impact Framework to measure and reduce environmental impacts (see the sketch after this section for the kind of calculation involved).
Innovation Skills ⭐️⭐️⭐️⭐️
Encourages innovation in creating or enhancing tools for measuring and reducing software's environmental impact, including developing new model plugins and improving the Impact Framework's performance and usability.
Human Skills ⭐️⭐️⭐️⭐️⭐️
While the hackathon focuses on technical skills, collaboration and teamwork are essential, fostering communication and problem-solving abilities among participants.
Work-Based Learning ⭐️⭐️⭐️⭐️
Offers practical experience in developing real solutions to environmental challenges in software, with opportunities for recognition and awards.
Industry Connections ⭐️⭐️⭐️⭐️⭐️
Participants gain exposure to the Green Software Foundation's network, including potential sponsors and partners interested in sustainable software solutions. The hackathon also highlights early-stage innovations and talent.
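To make the idea of "measuring software's environmental impact" concrete, below is a minimal, illustrative Python sketch of the Green Software Foundation's Software Carbon Intensity (SCI) formula, SCI = ((E × I) + M) per R, which is broadly the kind of calculation the Impact Framework helps automate. The figures and the API-call scenario are invented for illustration and are not part of the hackathon materials.

```python
def sci(energy_kwh: float, grid_intensity_g_per_kwh: float,
        embodied_gco2eq: float, functional_units: float) -> float:
    """Software Carbon Intensity: SCI = ((E * I) + M) per R.

    E = operational energy used by the software (kWh)
    I = carbon intensity of the electricity grid (gCO2eq/kWh)
    M = embodied hardware emissions amortised over the measurement window (gCO2eq)
    R = functional unit, e.g. number of API calls served
    """
    operational_gco2eq = energy_kwh * grid_intensity_g_per_kwh  # emissions from running the software
    return (operational_gco2eq + embodied_gco2eq) / functional_units


# Hypothetical scenario: 1.2 kWh consumed to serve 10,000 API calls on a
# 450 gCO2eq/kWh grid, with 30 gCO2eq of embodied emissions attributed to the window.
print(f"{sci(1.2, 450, 30, 10_000):.4f} gCO2eq per API call")  # -> 0.0570
```

In the hackathon, participants feed measured or modelled values like these into the Impact Framework and its plugins rather than a hand-rolled script.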
LEARNING PATHWAY
by Elisava - Barcelona School of Design and Engineering
Part-time, Low Residency, On-Campus
What is it?
A program focusing on creative research, critical thinking, and strategic decision-making in AI, addressing its societal, ethical, and environmental impacts from a design perspective.
Who is the Program For?
Professionals in tech, social innovation, or media seeking to responsibly integrate AI into their work, emphasizing ethics, sustainability, and social justice in digital transformations.
Future Fit Criteria
Exponential Technologies ⭐️⭐️⭐️⭐️
This master's program teaches you how to design solutions for automation and automated decision-making in both private and public sector organizations, focusing on implementing Responsible AI principles with an emphasis on explainability.
Real-World Challenges ⭐️⭐️⭐️⭐️
You learn how to translate principles like fairness into actionable strategies, addressing risks and using AI to tackle global challenges.
Innovation Skills ⭐️⭐️⭐️⭐️
In the Creative Research & Imagination Collaboratory (Module 1), you learn to collaborate effectively with people from diverse backgrounds and skillsets. You develop the ability to research complex themes creatively, cultivate intuitive problem-solving approaches, and practice insightful reflection.
Human Skills ⭐️⭐️⭐️⭐️⭐️
The program is built around values like humility, responsibility, diversity, empathy, and solidarity. These principles foster a collaborative learning space where participants agree on guidelines that prioritize inclusivity and address community challenges.
Work-Based Learning ⭐️⭐️⭐️⭐️
The program culminates in a final project, which starts with team and project setup in Term II and evolves through Term III. It includes regular design critiques, peer feedback, and guidance from tutors and remote experts, and ends with a public presentation.
Industry Connections ⭐️⭐️⭐️⭐️⭐️
Postgraduate students at Elisava engage in over 250 extracurricular internships annually, facilitated through partnerships with leading companies. These internships, accessed via Elisava's portal or through student proposals, complement the curriculum and offer industry connections with firms like Folch, Run Design, and Accenture, among others.
ORGANIZATIONS
ORG FOR YOUR NEXT STEP
Seattle WA, USA
What they do:
The Allen Institute for AI (AI2), founded by Paul Allen, is dedicated to advancing the state of artificial intelligence through high-impact research and engineering, currently led by CEO Ali Farhadi. Their work aims to solve significant challenges in AI for the common good, fostering scientific breakthroughs and developing open-source tools to propel the field forward.
Workplace highlights:
AI2 champions a culture of innovation, collaboration, and inclusivity, with a strong emphasis on continuous learning and ethical AI development. It fosters a dynamic, supportive environment where diversity, open dialogue, and societal impact are key pillars.
Careers:
AI2 seeks innovative thinkers passionate about AI, including researchers, engineers, and data scientists, who value collaboration, diversity, and the ethical implications of their work. They prioritize individuals eager to keep learning and to make a societal impact through AI advancements. Check open roles here.
AI2's internship program offers hands-on experience in AI research and development, targeting students passionate about AI's ethical and societal impacts. Interns work closely with experts, gaining insight into cutting-edge AI projects and contributing to meaningful advancements. Learn more about the AI2 internship program here.
ORG FOR YOUR NEXT STEP
Washington, D.C. (USA); Brussels (Belgium)
What they do:
The Center for Democracy & Technology is a leading nonpartisan, nonprofit organization focused on advancing civil rights and civil liberties in the digital age. While individual rights are central to their mission, CDT works to shape technology policy and architecture with a broader focus, aiming to balance individual liberties with societal impacts and the public good. Their efforts include advocating for responsible AI practices, privacy, security, online expression, and fostering a healthy online civic space.
Workplace highlights:
CDT fosters a culture of innovation, collaboration, and inclusivity, emphasizing diversity, equality, and inclusion. Their work environment encourages continuous learning and experimentation in small multidisciplinary teams, aiming to advance tech policy with equity and democratic values at its core. Learn more
Careers:
CDT seeks individuals passionate about digital rights, including policy experts, researchers, and advocates. They value collaboration, diversity, and the ethical implications of technology. CDT encourages those inspired by an open and free internet to join their team. Career opportunities can be explored here.
CDT's Academic Year Externship offers graduate and PhD students the chance to work with their Research Team on projects like drafting literature reviews and data analysis. Unpaid and for credit, it's open to all majors. Learn more
ORG FOR YOUR NEXT STEP
Boston, MA (USA) and around the world
What they do:
The Future of Life Institute is dedicated to mitigating existential risks facing humanity, with a focus on artificial intelligence, biotechnology, and nuclear weapons. Its mission is to steer the development of transformative technologies towards benefiting life and avoiding large-scale risks. FLI engages in research, policy advocacy, and grantmaking to ensure that technologies such as AI are developed and managed responsibly, aiming to prevent catastrophic outcomes and enhance the future of life on Earth.
Workplace highlights:
FLI has a diverse and multidisciplinary team, including experts from government institutions, industry, academia, and various fields such as behavioral sciences, medicine, machine learning, engineering, law, and design. This diversity fosters a culture of innovation and collaboration, crucial for addressing the complex challenges at the intersection of technology and society.
Careers:
The Future of Life Institute is looking for individuals with a diverse range of expertise, including behavioral sciences, medicine, machine learning, engineering, law, and design. Their team is distributed across Europe and the US and meets in person as a full team once a year. FLI also accepts internship applications on a rolling basis. Learn more about careers at FLI.
PEOPLE AND COMMUNITIES
Timnit Gebru, Founder & Executive Director of the Distributed AI Research Institute (DAIR)
Dr. Joy Buolamwini, Author of the book Unmasking AI, and Artist-in-Chief & President of The Algorithmic Justice League
Fei-Fei Li, Sequoia Professor of Computer Science and Co-Director of the Institute for Human-Centered AI (HAI) at Stanford University
Mutale Nkonde, Visiting Policy Fellow at the Oxford Internet Institute and CEO of AI for the People
Tristan Harris, Co-founder of the Center for Humane Technology
COMMUNITY
Online & in NY, USA
All Tech is Human aims to create ethical, responsible, and inclusive tech. It unites technologists, policymakers, and academics to ensure tech aligns with societal values and needs, fostering dialogue and action for technology that benefits all.
Who is it for:
"All Tech is Human" is for technologists, policymakers, academics, ethicists, industry leaders, and anyone interested in ethical technology.
Main Initiatives:
Responsible Tech University Ecosystem: a network of academic programs, faculty, student clubs, and research institutes that are working to address the ethical and societal implications of technology.
Responsible Tech Mentorship Program: a free program that connects aspiring responsible tech practitioners with experienced mentors.
COMMUNITY
Online, 40+ US States, 30+ Countries
Encode Justice is a youth-led movement advocating for human-centered artificial intelligence. It focuses on addressing AI's societal impacts, such as surveillance and disinformation, and promotes safety, justice, and ethical governance in AI development and application.
Who is it for:
Encode Justice is for high school and college students globally who are passionate about ethical AI and interested in advocacy, policy-making, and education regarding the responsible use of technology.
Main Initiatives:
Youth Open Letter on AI Risks: Demands action to mitigate the dangers of AI and to include young people in crafting the policies that will shape our technological future.
Educational Outreach: Provides workshops and develops curriculum to educate students about AI implications.
Public Campaigns: Organizes campaigns against measures that could expand the use of biased algorithms, such as California's ballot measure.
COMMUNITY
Online & Global
The Humane Tech Community is dedicated to realigning technology with humanity's best interests. It focuses on mitigating the negative impacts of technology on well-being, freedom, and society, and on ensuring that technology's evolution is beneficial and humane.
Who is it for:
The Humane Tech Community caters to a broad audience including parents, students, technologists, scientists, government officials, businesses, and educators. It aims to engage and educate these groups on the importance of humane technology.
Main Initiatives:
Pyramid of Wellbeing: Focuses on promoting technology and habits that enhance overall well-being, advocating for tech that enables a happier life.
Pyramid of Freedom: Addresses the challenges to freedom posed by technology, aiming to ensure technology supports human flourishing and freedom of expression.
Pyramid of Society: Seeks to improve societal interaction through technology, promoting inclusive access and creating a safe space for humanity to thrive.
TOOLS AND RESOURCES
🧭 The Responsible Tech Guide - created by All Tech Is Human, helps you navigate the responsible tech landscape and find ways to get involved. LINK
✅ Guidance for Safe Foundation Model Deployment - created by Partnership on AI, aimed at model providers, it is designed to ensure the safe and responsible deployment of LLMs. LINK
👁️ Understanding Private Surveillance Providers and Technologies - created by Privacy International and Geneva Center for Security Sector Governance, a policy paper examining the risks of private surveillance tech and urging stronger rules for "private security providers." LINK
👾 Regulating AI Deepfakes and Synthetic Media in the Political Arena - created by the Brennan Center for Justice, it addresses the challenge of AI Deepfakes in Politics and calls for regulation to safeguard democracy. LINK
Thank you for reading!
Do you know of any learning pathways, communities, or resources focused on humane tech? Reach out by replying to this newsletter, or via direct message on LinkedIn.