93% found this document useful (30 votes)
67K views

Responsible Tech Guide by All Tech Is Human

How do you get involved in the growing Responsible Tech field? This guide is a comprehensive look at the vibrant Responsible Tech ecosystem and ways for YOU to get more involved. Stay in touch with All Tech Is Human at AllTechIsHuman.org Slack group: bit.ly/ResponsibleTechSlack Newsletter: AllTechIsHuman.Substack.com All projects & links: Linktr.ee/AllTechIsHuman
Copyright
© Attribution (BY)
93% found this document useful (30 votes)
67K views

Responsible Tech Guide by All Tech Is Human

How do you get involved in the growing Responsible Tech field? This guide is a comprehensive look at the vibrant Responsible Tech ecosystem and ways for YOU to get more involved. Stay in touch with All Tech Is Human at AllTechIsHuman.org Slack group: bit.ly/ResponsibleTechSlack Newsletter: AllTechIsHuman.Substack.com All projects & links: Linktr.ee/AllTechIsHuman
Copyright
© Attribution (BY)
You are on page 1/ 160

Responsible Tech

Guide | 2024
N
D
•L
C
•D
YC
N

ALL TECH IS HUMAN | RESPSONSIBLE TECH GUIDE 2024 | XX


Contents

Introduction 4

About All Tech Is Human 11

An Evolving Ecosystem 16
Responsible AI 22
Trust & Safety 51
Youth, Tech, and Wellbeing 65
Public Interest Technology 85
Cyber & Democracy 105
Tech Policy 118

Moving Forward 139

Acknowledgments 155

All Tech Is Human Team 157

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 03


RESPONSIBLE TECH GUIDE

Introduction
Welcome
If you are interested in tackling thorny tech and society issues, you have
found the right place.

The Responsible Tech Guide is our organization’s flagship resource


designed to help you navigate the people, organizations, and ideas that
makeup the rapidly-evolving Responsible Tech movement. Using All Tech
Is Human’s three pillars for ecosystem-building (community, education,
careers), this guide provides actionable ways to “find the others,” to better
understand the variety of resources and upskilling opportunities available,
and to learn about all of the emerging careers where you can do incredibly
meaningful work helping to build a better tech future.

It is easy to feel like you are alone in deeply caring about ensuring that our
tech future is aligned with public interest, but in reality there is a vibrant
Responsible Tech ecosystem comprised of thousands of people from a
wide variety of backgrounds,
disciplines, and perspectives. This is
your invitation to join in and find
where you can add value; whether
that’s change through civil society,
government, industry, or
academia.

Complex issues require diverse


participation and collective action.

Thank you for doing your part!

David Ryan Polgar


Founder, All Tech Is Human
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 05
Executive Summary
Responsible Tech is a multistakeholder, multidisciplinary
approach that integrates principles from ethics, law,
technology, and social sciences to help create novel
technology solutions that are aligned with the larger goals of
individual liberties and societal well-being. To co-create a
better tech future, we believe that we must:

Define a more cohesive Responsible Tech ecosystem.


We are bringing together communities from Responsible AI, Trust & Safety,
Public Interest Technology, and more under the larger umbrella term
“Responsible Tech.”

Illuminate pathways for others to join the Responsible Tech ecosystem.


We are creating avenues for finding community, relevant resources and
education opportunities, and a better understanding of the seismic shift in
the career landscape.

Encourage us to move at the speed of tech.


We are emboldening organizations and our community to create a better
networked and collaborative ecosystem to ensure that the speed of tech
innovation does not outpace society’s ability to understand the impact.

Expand where and how positive change happens.


We are supporting multiple ways in which positive change happens from
inside and outside by providing insight into industry, academic, research,
entrepreneurial, and governmental roles.

Activate the collective efforts of Responsible Tech.


We are driving new ways to connect to more opportunities for knowledge-
sharing and collaboration, and to take action to co-create a better tech
future, together.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 06


Our 2024 Responsible Tech Guide is designed to provide
an overview of the people, organizations, and ideas of the
growing Responsible Tech movement. Its purpose is to
enable readers to learn more about the Responsible Tech
ecosystem and to find actionable ways to get involved.

This year, we synthesized learnings from our mentorship program, large Slack
community, working groups, livestreams, and dozens of in-person gatherings
to provide a comprehensive and informed view of the Responsible Tech
ecosystem and how you can best contribute your perspective and add value
as we tackle complex tech and society issues.

The Responsible Tech Guide, along with all of the work of All Tech Is Human
(ATIH), is focused on three pillars for creating a strong Responsible Tech
ecosystem: building community, increasing education, and providing
career resources and guidance.

Please note: descriptions and information about organizations and resources


mentioned in the Responsible Tech Guide utilize language directly from the
entities’ websites.

Responsible Tech requires the participation of


diverse stakeholders across demographics,
geographies, and roles in the development and
regulation of technology. That way, you are
empowered by knowledge, supported by
community, and inspired to co-create a better
tech future. Find more resources at
alltechishuman.org.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 07


Our eighteen interviews derive
from six continents and
represent all six key areas of
Responsible Tech including
Responsible AI, Trust & Safety,
and Public Interest Technology. Afua Bruce Audrey Tang
Page 94 Page 102

Baroness Beeban Kidron Camille François Cansu Canca Eugenio V Garcia


Page 74 Page 60 Page 47 Page 131

Gemma Galdon Clavell Johanna Weaver Dr. Joy Buolamwini June Okal
Page 45 Page 137 Page 36 Page 134

Lyel Lacoff Resner Navrina Singh Oumou Ly Sean Litton


Page 97 Page 41 Page 99 Page 62

Sinead Bovell Sonia Livingstone Stéphane Duguin Trisha Prabhu


Page 39 Page 81 Page 116 Page 78

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 08


Responsible Tech Needs You

Emerging | Students or career-changers getting started in


Responsible Tech
For young people and students, we created a global Responsible Tech
University Network with intentional agnosticism to disciplines.

For career-changers, we recommend reading our resources to


understand the ecosystem and find support through mentorship, mixers,
and our Slack community.

Mid-Career | Some level of experience in Responsible Tech


Many in our community seek mentorship while mentoring others getting
started in the field. By getting involved in our working groups, attending
our gatherings, and using our free resources, they can find their niche
within the responsible tech ecosystem.

Established | Experienced in Responsible Tech


Individuals with years of experience in Responsible Tech pay their
knowledge forward by mentoring others, speaking at our gatherings,
and getting interviewed for our reports.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 09


Environmental
Studies
How are
technologies and International
computing altering
Computer Science our environment? Relations
+ Engineering Anthropology What role can
How can I develop How does Economics technology play in
technologies technology and In what ways can international affairs
responsibly? culture influence one we balance and politics?
another? responsible
innovation and
economic growth?

Law Digital Design


How can we ensure What is the impact of
the legal protections thoughtful design on
of individuals in technology?
Education digital spaces?
How will technology
shape the way we
learn?
Statistics
How can we

Responsible
demystify the
statistical foundations
of "AI" and "Big Information
Tech Data"? Science
How is information
Philosophy stored and
Sociology disseminated online?
How can we harness
theories of In what ways does
philosophy and technology impact
ethics to better our social
shape technology? organizations and
relationships?

Art
Social Work How does
Psychology Health How can we apprise technology shape the
How can individuals about way we view art and
What role can
technology media?
Community technology play in their digital rights and
influence our minds developing a more protections?
and behavior? Development
equitable
How can
healthcare system?
communities Policy
leverage technology As technology
for equity and becomes more
access? ingrained in society,
how can policy
evolve to the voice
of citizens?

We need a diverse range of backgrounds in the


Responsible Tech ecosystem.

We can’t solve complex tech and society issues alone.


Instead, we must incorporate multiple disciplines and
backgrounds to ensure a better understanding of the
evolving ways technology impacts us.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 10


RESPONSIBLE TECH GUIDE

About All Tech Is Human


About All Tech Is Human (ATIH)
We are building the world’s largest multistakeholder,
multidisciplinary network in Responsible Tech. This
powerful network, strategically connected to key levers
of change, helps build a stronger and more diverse
Responsible Tech ecosystem. Our unique approach is
designed to help society tackle thorny tech & society
issues while moving at the speed of tech.

Since 2018, All Tech Is Human’s Responsible Tech network has


expanded to 42,000+ individuals, with an average growth rate
doubling year-over-year (YoY) across channels.

We also discovered that 20% of ATIH’s network hold decision-maker


executive leadership roles, balanced evenly by 25% entry-level
contributors. This reflects our “grassroots-power model” engagement
strategy to bring all stakeholders to the table.

The line waiting to get into ATIH’s Responsible Tech Mixer + Book Launch With Kashmir Hill in NYC.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 12


42,000+
Unique
individuals
across
channels

41%
Individuals in the
tech industry make

20%
up a plurality of the
network

C-suite/Exec-level

98
25% Entry-level
Countries ATIH
reaches globally*

Average YoY
growth rate
on LinkedIn 163%
NY SF DC

LONDON
100+
100+ cities globally
with at least 40+
Top geographic locations members each

Download 2024 Ecosystem Pulse Report *Countries ATIH reaches globally was
updated August 2024

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 13


All Tech Is Human’s Ten Principles
01 The future of technology is intertwined with the future of democracy
and the human condition.

02 To align our tech future with the public interest, we need to involve the
public.

03 We need collective action in tech, not just individual thought


leadership.

04 No application without representation — not about us without us.

05 Combining multiple stakeholders, disciplines, and perspectives


requires an agnostic space for understanding and knowledge-
sharing.

06 People often struggle to “find the others” and discover the wide
variety of people and orgs committed to co-creating a better tech
future.

07 Technology is not just for technologists; we need all disciplines


involved.

08 Top-down models have power but often lack a diversity of ideas;


grassroots models have ideas but often lack power. We unite these
models.

09 Tech innovation moves fast, while our consideration of its impact often
moves slow. We need to reduce the gulf between these.

10 There is a growing awareness of the root causes of our current dilemma,


but limited action toward understanding values, trade-offs, and best
paths forward.

We cannot align our tech future with the public interest unless we actively
involve the public. All Tech Is Human’s approach brings together people of
all backgrounds and skill levels to learn from each other, build community,
and co-create a better tech future. Find out more at alltechishuman.org.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 14


Meaningful Impact in 2024
Since the release of last year’s Responsible Tech Guide
(2023), All Tech Is Human has continued to connect ideas,
people, and organizations globally. Our rapid growth is an
indication of a desire and a demand for change. It is a
reflection of our local-to-global community, a perspective
that drives us to listen, learn, and capture possibilities in
order to create more opportunities for meaningful impact.

ATIH’s Action Through Partnerships


We were honored to partner with Thorn, a nonprofit that builds technology
to defend children from sexual abuse, on a historic alliance including
Amazon, Anthropic, Civitai, Google, Meta, Metaphysic, Microsoft, Mistral AI,
OpenAI, and Stability AI, committing to implement principles to guard
against the production and dissemination of AI-generated child sexual
abuse material (AIG-CSAM) and other sexual harms against children.

ATIH’s Action Through Education


We launched our newest program, The University Network, with a plan
to launch 100 clubs by May 2025. A springboard for action, this program
serves our mission to grow and diversify the entire Responsible Tech
community by extending opportunities for learning and career growth to
affiliated student clubs at colleges and universities across the globe.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 15


RESPONSIBLE TECH GUIDE

An Evolving Ecosystem
Evolution of Responsible Tech

The pace of tech innovation moves so quickly, it can be a


challenge to keep track of technological progress from day
to day, let alone year to year! In addition to the myriad of
small changes since 2018, the year ATIH was founded, we
have seen a few major shifts:

Generative AI
OpenAI released ChatGPT in November of 2022, and with it came the
widespread use of consumer chatbots in classrooms and workplaces around
the world. The broad accessibility of consumer chatbots brought responsible
AI considerations into common consciousness.

Trust & Safety Shift


Elon Musk acquired Twitter in April of 2022, gutting Trust & Safety teams and
setting a precedent for other social platforms to follow suit; this drastically
shifted platform trust & safety functions toward third-party vendors and
consulting services.

Global Tech Regulations


New tech-focused legislation has come into effect, addressing everything
from privacy (GDPR) to artificial intelligence (EU AI Act), to platform
transparency (DSA) to child safety (KOSA).

“Responsible Tech calls for transforming tech from


within, external oversight and public accountability,
rigorous research, and re-imagining what technology in
the public interest can be.”
Rebekah Tweed
Executive Director, All Tech Is Human

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 17


Key Moments That Matter

2024
The EU AI Act comes into
force, regulating the use
of AI to ensure safe,
2024 transparent, and non-
US Senate passes The Kids discriminatory practices.
Online Services Act, (August)
protecting minors from
harmful material on social
channels and creating a
2023
‘duty of care’ for online US issues Executive order
platforms. (July) on Artificial Intelligence,
establishing new standards
for safe, secure, and
trustworthy AI. (October)
2022
OpenAI releases ChatGPT,
kicking off the consumer
Generative AI era.
(November)
2022
Digital Services Act is
adopted by the EU,
addressing disinformation
2022 and illegal content.
Elon Musk buys Twitter and (October)
eliminates roughly 80% of its
staff, including entire Trust &
Safety teams. (April)
2018
General Data Protection
Regulation (GDPR) takes
effect, focusing on
information privacy. (May)

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 18


Six Pillars of Responsible Tech

All Tech Is Human’s extensive and growing community


reflects the key activities, expertise, and interests in the
Responsible Tech ecosystem.

Top Areas of Interest and Expertise Among the ATIH Community


All Tech Is Human community members’ interests and expertise are diverse
and overlapping. Among Slack members, Responsible AI is a top interest for
83% of members. Many of tech’s challenges are multidimensional and our
membership’s focus reflects that. Download 2024 Ecosystem Pulse Report

Community Areas of Interest N=1,800


(Source: Slack Signup Survey, 2024 Ecosystem Pulse Report)

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 19


Responsible AI
Responsible AI is at the forefront of ethical technology development. It
emphasizes the need for transparency, fairness, and accountability in AI
systems. Our community of ethical practitioners works to ensure that
applications of AI benefit society without causing harm or reinforcing bias.
In the responsible technology space, it’s important to craft guidelines,
implement robust testing, and advocate for policies that prioritize ethical
considerations while respecting human rights and equity.

Trust & Safety


Trust & Safety teams play a vital role in maintaining the integrity of online
platforms and digital spaces. Practitioners are not only dedicated to
combating harmful content, disinformation, and cyber threats, they also
work to foster an environment of trust among users. The Trust & Safety
field is evolving, and difficult tradeoffs and challenges are emerging in
the work of promoting a secure and trustworthy digital ecosystem.

Public Interest Technology


Public Interest Tech champions technology serving the greater good.
Practitioners in this space work to address societal challenges through
innovative, sustainable solutions, with expertise spanning issue areas
like civic tech, data, and digital accessibility. This work involves cross-
sector collaboration with governments, civil society, and communities to
create digital tools and policies that prioritize wellbeing and equitable
access for all.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 20


Youth, Tech, and Wellbeing
The intersection of youth, tech, and wellbeing features a range of
stakeholders steeped in the unique and emerging challenges between
young people and technology. There is an emphasis on digital literacy,
online safety, and the responsible use of technology, particularly in the
realm of social media. Youth advocacy in the space promotes meaningful
tech legislation, digital literacy, and wellbeing tools, ensuring mental and
emotional health while navigating the digital landscape.

Cyber & Democracy


This vertical is relatively new, having been added in 2024, and evolving
from the Tech & Democracy vertical. It reflects our community’s sentiment
that Responsible Tech must be secure, privacy-preserving, resilient, and
rights-respecting to work in the public’s interest. This means following
cybersecurity best practices, upholding human rights and democratic
values in cyberspace, and promoting equitable access while centering
people’s lived experiences.

Tech Policy
Tech Policy professionals work on regulation and governance of current
and emerging technologies. This can happen through company and
industry policies as well as through government frameworks and
legislation. As technology continues to touch so many aspects of society,
tech policy is intertwined with technology’s impact on individuals, society
and public interest. As a result, tech policy is very multidisciplinary and
can cover a wide range of issues from advocacy, freedom of expression,
education, health, and online safety just to name a few.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 21


AN EVOLVING ECOSYSTEM

Responsible AI
Responsible AI
Responsible AI has been a In 2024, our focus has shifted from
uncovering potential harms to
key focus for ATIH since our
identifying the best governance
founding in 2018 when the mechanisms for aligning AI usage
first sets of principles and with the public interest.
guidelines were being
This year, we have been providing
drawn up by pioneering opportunities to understand and
NGOs and early-adopting engage in evaluation of artificial
tech giants. intelligence systems through public
accountability mechanisms,
Immediately following the release of including:
our Responsible Tech Guide in
2020, we formed a working group An AI Governance maturity
and released our first RAI report, model we released in
“The Business Case for AI Ethics.” partnership with TechBetter,
An algorithmic bias bounty
In the few short years since, we challenge with Humane
have seen RAI’s center of gravity Intelligence that we co-hosted,
shift from novel and theoretical A partnership with Thorn to
societal harms - the purview of secure commitments from
academic researchers - to universal Amazon, Anthropic, Google,
concerns that are of interest to Meta, Microsoft, OpenAI, and
every person with a smartphone others to guard against AI-
and access to a consumer chatbot. generated child sexual abuse
material (AIG-CSAM).
RAI impacts are ubiquitous now:
Automated decision-making With these initiatives, we seek to not
systems only provide access to opportunities
Biometric surveillance tech but also to maximize the impact of
Autonomous weapons our collective efforts.
Deepfakes & info integrity
Job displacement
Security and safety risks
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 23
RESPONSIBLE AI

Key Terms & Definitions AI Governance: A system of laws,


policies, frameworks, practices, and
Responsible AI: The practice of processes at international, national,
designing, building, deploying, and organizational levels. AI
operationalizing, and monitoring AI governance helps various
systems in a manner that [adheres] stakeholders implement, manage,
to principles of validity and oversee, and regulate the
reliability, safety, fairness, security development, deployment, and use
and resilience, accountability and of AI technology. It also helps
transparency, explainability and manage associated risks to ensure
interpretability, and privacy. (WEF) AI aligns with stakeholders'
objectives, is developed and used
Accountability: The role of AI actors responsibly and ethically, and
should be considered when seeking complies with applicable legal and
accountability for the outcomes of regulatory requirements. (IAPP)
AI systems. The relationship
between risk and accountability Algorithmic Impact Assessment:
associated with AI and technological Algorithmic impact
systems more broadly differs across assessments...are a tool for
cultural, legal, sectoral, and societal assessing possible societal impacts
contexts. Maintaining organizational of an algorithmic system before the
practices and governing structures system is in use–with ongoing
for harm reduction can help lead to monitoring often advised. (Ada
more accountable systems. (NIST) Lovelace Institute)

Accuracy: Accuracy is the closeness Model Card: A brief document that


of results of observations, discloses information about an AI
computations, or estimates to the model, such as explanations about
true values or the values accepted intended use, performance metrics,
as being true. (NIST) and benchmarked evaluation in
various conditions (e.g. across
different cultures or demographics).
(IAPP)

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 24


RESPONSIBLE AI

Explainability: Explainability refers


to a representation of the
mechanisms underlying AI systems’
operation. (NIST)
ATIH IN ACTION
All Tech Is Human recently
Interpretability: Interpretability
partnered with the non-profit
refers to the meaning of AI systems’
Humane Intelligence to host an
output in the context of their
Algorithmic Bias Bounty
designed functional purposes.
Challenge hackathon to prompt
(NIST)
LLMs to elicit outcomes
demonstrating factuality errors,
Reliability: Reliability is defined as
bias, or misdirection. Read more.
the “ability of an item to perform as
required, without failure, for a given
time interval, under given
conditions.” (NIST) Red Teaming: The process of
testing the safety, security, and
Resilient: AI systems (as well as the performance of an AI system
ecosystems in which they are through an adversarial lens. This is
deployed) may be said to be re- typically done through the
silient if they can withstand simulation of attacks on the model
unexpected adverse events or to evaluate it against certain
unexpected changes in their envi- benchmarks and try to make it
ronment or use – or if they can behave in unintended or
maintain their functions and inappropriate ways. Red teaming
structure in the face of internal and often reveals security risks, biases,
external change and degrade safely and other harms. The results of such
and gracefully when this is testing are passed along to the
necessary. (NIST) model developers for evaluation
and remediation. Developers use
Robustness: Robustness or red teaming to improve a model
generalizability is defined as the before and after releasing it to the
“ability of a system to maintain its public. (IAPP)
level of performance under a variety
of circumstances.” (NIST)
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 25
RESPONSIBLE AI

Watermarking: The process of addressing issues such as harmful


embedding subtle or visually bias and discrimination. (NIST)
imperceptible patterns in AI-
generated content or metadata that Privacy-enhanced: Privacy values
can only be detected by computers. such as anonymity, confidentiality,
Watermarking helps with the and control generally should guide
detection and labeling of AI- choices for AI system design,
generated content, promoting development, and deployment.
transparency. (IAPP) Privacy-related risks may influence
security, bias, and transparency and
Safety: The characteristic in which come with tradeoffs with these other
an AI system does not, under characteristics. (NIST)
defined conditions, lead to a state in
which human life, health, property, Validation: Validation is the
or the environment is endangered. “confirmation, through the provision
(NIST) of objective evidence, that the re-
quirements for a specific intended
Secure: AI systems that can use or application have been
maintain confidentiality, integrity, fulfilled.” (NIST)
and availability through protection
mechanisms that prevent
unauthorized access and use may
be said to be secure. (NIST)

Transparency: Transparency
reflects the extent to which
information about an AI system and
its outputs are available to
individuals interacting with such a
system – regardless of whether they
are even aware that they are doing
so. (NIST)

Fairness: Fairness in AI includes


concerns for equality and equity by
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 26
RESPONSIBLE AI

Select Organizations growing number of models capable


of producing material harm to our
A large and growing number of world. Together they operationalize
impactful organizations are focused AI risk management to protect the
future of our world.”
on Responsible AI. These
organizations research, advocate,
Alan Turing Institute: “The Alan
monitor, and evaluate AI systems to Turing Institute, headquartered in
mitigate their known and potential the British Library, London, was
harms, working to ensure that these created as the national institute for
tools are aligned with the public data science in 2015. In 2017, as a
interest. result of a government
recommendation, they added
Ada Lovelace Institute: “An artificial intelligence to their remit.”
independent research institute with
a mission to ensure that data and AI Algorithmic Justice League: “AJL’s
work for people and society. They mission is to raise public awareness
about the impacts of AI, equip
believe that a world where data and
advocates with resources to bolster
AI work for people and society is a
campaigns, build the voice and
world in which the opportunities, choice of the most impacted
benefits, and privileges generated communities, and galvanize
by data and AI are justly and researchers, policymakers, and
equitably distributed and industry practitioners to prevent AI
experienced.” harms.”

AI Risk and Vulnerability Alliance Center for AI and Digital Policy:


(ARVA): “Their mission is to “The Center for AI and Digital Policy
empower communities to recognize, aims to ensure that artificial
diagnose, and manage intelligence and digital policies
promote a better society, more fair,
vulnerabilities in AI that affect them.
more just, and more accountable – a
Our vision is to work across
world where technology promotes
disciplines to make AI safer for broad social inclusion based on
everyone by exposing fundamental rights, democratic
vulnerabilities in the AI ecosystem. institutions, and the rule of law. As
They build tools for practitioners
and communities to interrogate the
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 27
RESPONSIBLE AI

an independent non-profit research ForHumanity: “Their purpose is to


organization, they assess national AI examine and analyze the downside
policies and practices, train AI policy risks associated with the ubiquitous
leaders, and promote democratic advance of AI & Automation. With
values for AI.” the assistance of the ForHumanity
Fellows, they developed
Data & Society Research Institute: Independent Audit and Governance
“Data & Society is an independent of Contact Tracing. They are
nonprofit research organization. developing the larger work of
They study the social implications of Independent Audit of AI Systems to
data, automation, and AI, producing build an infrastructure of trust for the
original research to ground world's autonomous systems.”
informed public debate about
emerging technology. Data &
Society believes that technology
policy must be grounded in
empirical evidence, account for
tech’s real-world impacts, and serve
the public.”

Distributed AI Research Institute


(DAIR): “An interdisciplinary and
globally distributed AI research
institute rooted in the belief that AI
is not inevitable, its harms are
preventable, and when its
production and deployment include
diverse perspectives and deliberate
processes it can be beneficial. Our
research reflects our lived
experiences and centers our
communities.”

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 28


RESPONSIBLE AI

Humane Intelligence: “Humane RAI Institute: “The Responsible AI


Intelligence is a tech nonprofit Institute (RAI Institute) is a global
building a community of practice and member-driven non-profit
around algorithmic evaluations. The dedicated to enabling successful
organization, led by Dr. Rumman responsible AI efforts in
organizations. The RAI Institute’s
Chowdhury, is building a
conformity assessments and
programming platform for model
certifications for AI systems support
evaluators and individuals seeking
practitioners as they navigate the
to learn more about model complex landscape of creating,
evaluations.” selling or buying AI products.”

International Association of
Algorithmic Auditors (IAAA): “The
IAAA is a community of practice that
aims to advance and organize the
algorithmic auditing profession,
promote AI auditing standards,
certify best practices and contribute
to the emergence of Responsible
AI.”

Partnership on AI: “Partnership on


AI develops tools, recommend-
ations, and other resources by
inviting voices from across the AI
community and beyond to share
insights that can be synthesized into
actionable guidance. They then
work to drive adoption in practice,
inform public policy, and advance
public understanding. Through
dialogue, research, and education,
PAI is addressing the most
important and difficult questions
concerning the future of AI.”
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 29
RESPONSIBLE AI

The Future Society: “The Future


Society (TFS) is an independent
501(c)(3) nonprofit organization
based in the United States and
Europe. Their mission is to align
Job Levels
artificial intelligence through better
According to ATIH’s job board,
governance. We develop, advocate
most Responsible AI roles posted
for, and facilitate the implementation
require 5-6 years (35%), 7-9 years
of AI governance mechanisms,
(32%), and 10+ years (23%).
ranging from laws and regulations to
voluntary frameworks such as global
Job Titles
principles, norms, standards, and
AI Ethics & Safety Manager
corporate policies.”
Sr. Director, Responsible AI
Testing, Evaluation, and
Women in AI Ethics: “Women in AI
Alignment
Ethics™ (WAIE) is a fiscally
Technical AI Ethicist/AI Red
sponsored project of Social Good
Teamer
Fund, a California nonprofit
Director, Responsible AI
corporation and registered 501(c)(3)
Research Scientist, Model
organization with a mission to
Safety
increase recognition,
Applied Scientist,
representation, and empowerment
Responsible AI
of brilliant women in this space who
AI/ML Privacy Engineer
are working hard to save humanity
Software Developer/Engineer
from the dark side of AI.”
Data Scientist, Responsible AI
Research
See a full list of organizations:
Analyst/Associate/Researcher
alltechishuman.org/responsible-
, Trustworthy ML
tech-organizations.
Research Engineer,
Trustworthy AI

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 30


RESPONSIBLE AI

Resources Generative AI Red Teaming


Challenge: Transparency Report
AI Incident Database (PAI): The AI (Humane Intelligence): Humane
Incident Database is dedicated to Intelligence, Seed AI, and AI Village
indexing the collective history of partnered to hold the first public red
harms or near harms realized in the teaming event for closed source API
real world by the deployment of models at DEF CON 2023. The
artificial intelligence systems. Like report provides some insights into
similar databases in aviation and the potential and promise of public
computer security, the AI Incident red teaming, framed around the
Database aims to learn from Generative AI Red Teaming
experience so we can prevent or Challenge conducted at AI Village
mitigate bad outcomes. within DEF CON 31.

AI Risk Management Framework Global AI Legislation Tracker


(NIST): In collaboration with the (IAPP): Countries worldwide are
private and public sectors, NIST has designing and implementing AI
developed a framework to better governance legislation and policies
manage risks to individuals, commensurate to the velocity and
organizations, and society associated variety of proliferating AI-powered
with artificial intelligence (AI). The technologies. This tracker identifies
NIST AI Risk Management legislative or policy developments
Framework (AI RMF) is intended for or both in a subset of jurisdictions.
voluntary use and to improve the
ability to incorporate trustworthiness Fireside Chat: Dr. Joy Buolamwini
considerations into the design, in conversation with Sinead Bovell
development, use, and evaluation of | Unmasking AI book talk
AI products, services, and systems.
Fireside Chat: Kashmir Hill in
EU AI Act (European Parliament): conversation with Rebekah Tweed
The use of artificial intelligence in the | Your Face Belongs To Us book
EU will be regulated by the AI Act, talk
the world’s first comprehensive AI
law.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 31


RESPONSIBLE AI

OECD AI Principles (OECD): abuse material (AIG-CSAM) and


Countries use the OECD AI Principles other sexual harms against children.
and related tools to shape policies
and create AI risk frameworks, AI Risk Repository (MIT): The AI Risk
building a foundation for global Repository has three parts: The AI
interoperability between jurisdictions. Risk Database captures 700+ risks
The European Union, the Council of extracted from 43 existing
Europe, the United States, and the frameworks, with quotes and page
United Nations and other jurisdictions numbers. The Causal Taxonomy of
use the OECD’s definition of an AI AI Risks classifies how, when, and
system and lifecycle in their why these risks occur. The Domain
legislative and regulatory frameworks Taxonomy of AI Risks classifies
and guidance. these risks into seven domains.

Responsible AI Governance Maturity AI Vulnerability Database (AVID):


Model hackathon report (TechBetter Open-source knowledge base of
& ATIH): an interdisciplinary research failure modes for AI models,
project to develop a Responsible AI datasets, and systems. Includes a
Governance Maturity Model that is taxonomy of the different avenues
rigorous, based on widely accepted through which an AI system can fail,
resources, and empirically informed. and a database of evaluation
examples.
Safety by Design for Generative AI:
Preventing Child Sexual Abuse Blueprint for an AI Bill of Rights
(Thorn & ATIH): In collaboration with (White House OSTP): White paper
Thorn and All Tech Is Human, published by the White House
Amazon, Anthropic, Civitai, Google, Office of Science and Technology
Invoke, Meta, Metaphysic, Microsoft, Policy. It is intended to support the
Mistral AI, OpenAI, and Stability AI development of policies and
have publicly committed to Safety by practices that protect civil rights and
Design principles. These principles promote democratic values in the
guard against the creation and building, deployment, and
spread of AI-generated child sexual governance of automated systems.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 33


RESPONSIBLE AI

ATIH Livestreams

Responsible AI: Discussing the


Governance Maturity Model: panel
discussion on the AI Governance
Maturity Model hackathon report
released in partnership with
TechBetter.

Finding a Career in AI Auditing w/


Cathy O'Neil, Dr. Gemma Galdon-
Clavell, and Shea Brown: panel
discussion on careers in algorithmic
auditing and the launch of the
International Association of
Algorithmic Auditors (IAAA). How can we mitigate AI risks?
livestream: panel discussion on
Responsible Tech Author Series: methods for managing AI's risks with
The Algorithm - Hilke Schellmann: a Dr. Rebecca Portnoff, Dr. Hany Farid,
conversation with the author on AI in and Maggie Munts, moderated by
Human Resources and AI’s usage in Sandra Khalil.
deciding who gets hired, monitored,
promoted, and fired. This Month in RAI with Rebekah
Tweed and Renée Cummings:
Dr. Rumman Chowdhury on AI Policy & elections
Strategies for AI Transparency: a RAI Best Practices
fireside chat on the "Generative AI RAI Literacy & innovation
Red Teaming Challenge: Executive Order on Safe,
Transparency Report" on the Secure, and Trustworthy AI
performance of eight state-of-the-art
large language models (LLMs) from All Tech Is Human has a weekly
the first public red teaming event for livestream series. Join us every
closed source API models, held by Thursday at 1pm ET. Can't make it?
Humane Intelligence, Seed AI and AI All livestreams are freely available
Village at DEF CON 2023. as videos.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 33
RESPONSIBLE AI

Looking Ahead Clarity on IP/Copyright Issues


Licensing deals will continue to be
With the ubiquitousness of made between AI developers and
generative AI usage across sectors, major publishers, as civil litigation
we expect to see Responsible AI continues through the U.S. court
remain in the spotlight–along with system. This will begin to clarify the
its risks and harms. These current murkiness regarding the
challenges require ongoing legality of using copyright-protected
management, and new challenges data to train large language models,
constantly emerging. and when and whether “fair use”
applies.
EU AI Act and the Brussels Effect
The EU AI Act is a comprehensive
regulatory framework for AI that
could become the benchmark for AI ATIH IN ACTION
regulation worldwide. Companies All Tech Is Human partnered
outside the EU will choose to with Humane Intelligence to
comply with the Act’s requirements host an event on Governing AI
globally, to avoid the complexity of with four panel discussions
adhering to multiple sets of featuring experts including Dr.
regulations (the Brussels Effect). Rumman Chowdhury,
Watch for other countries or regions Amandeep Singh Gill (UN),
to develop their own AI regulations Areeq Chowdhury (The Royal
based on the EU’s model. Society), and CNN’s Donie
O’Sullivan. Humane Intelligence
Increase in AI Governance Roles has been a key partner for All
As companies increase the adoption Tech Is Human, with additional
of generative AI tools while gatherings focused on
navigating emerging areas of risk Responsible AI.
and new legal requirements, we will
see a sharp uptick in the number of
available AI Governance roles.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 34


RESPONSIBLE AI

Information Integrity Cost & Sustainability


Consequential elections around the There could be increased pressure
globe will continue to keep the on AI developers to find solutions
spotlight on information integrity as that bring down the high cost and
the capabilities of generative AI unsustainable method of training
tools continue improving. large language models, which could
Deepfakes distributed across social include an uptick in the usage of
platforms will become increasingly small language models instead.
indistinguishable from authentic
data.
ATIH IN ACTION
Last September, All Tech Is
Non-Consensual Pornography
Human hosted its Responsible
As more and more people–both
Tech Summit: Shaping Our
private citizens and high profile
Digital Future in partnership with
public figures–become victims of AI-
the Canadian Consulate in New
generated non-consensual
York. The picture below is from a
pornography, efforts to mitigate
panel on the future of digital
these harms could be accelerated.
spaces and public goods.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 35


RESPONSIBLE AI

Dr. Joy Buolamwini


Founder of the Algorithmic Justice League

“In my work, art has been crucial to advancing the


conversation about algorithmic bias, making ideas more
accessible and adding nuance.”

In your opinion, what is the biggest tech &


society issue we are currently facing?
The increased adoption of biometric tools
integrated in government services,
deployed at airports, schools, hospitals, and
our own homes is establishing the
infrastructure for a biometric surveillance
state apparatus that will far exceed the
wildest dreams of the most brutal
authoritarian governments. When our faces,
voices, and those of our loved ones are
taken, manipulated, and deployed to
pollute our information networks, we risk no
longer being able to trust what we see with
our eyes or hear with our ears while losing
access to basic services without submitting
our intimate biometrics information.

How does your role help tackle thorny tech & society issues?
I am a poet of code and the author of Unmasking AI: My Mission to Protect
What is Human in a World of Machines. I tell stories that make daughters of
diaspora dream and sons of privilege pause. As the founder of the
Algorithmic Justice League, I lead an organization that uses poetry, creative
science communication, and research to prevent the deployment of harmful
AI systems and increase accountability in the creation of technology.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 36


RESPONSIBLE AI

(Interview with Dr. Joy Buolamwini - continued)

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
Especially earlier in my career, I sensed that focusing too much on subjective
experience or leaning into my artistic side might lead to not being taken
seriously. There’s a genuine concern that if you don’t come from a technical
enough background or if you show artistic inclinations you may be dismissed
by the tech establishment. However, in my work, art has been crucial to
advancing the conversation about algorithmic bias, making ideas more
accessible, and adding nuance.

For example, when I was a member of the EU Global Tech Panel, I shared a
video of my spoken word algorithmic audit poem, “AI Ain’t I A Woman?”
which shows different facial analysis technologies failing on the faces of
iconic Black women. With that piece, I highlight how these technologies fail
and what those failures mean. I reference Sojourner Truth’s “Ain’t I A Woman”
speech and her struggle to be recognized in her full humanity within a
political system. Simultaneously, you see on screen a facial detection system
failing to find a human face in a picture of a young Oprah Winfrey. This art
piece informed conversations about the integration of computer vision
systems including facial recognition into lethal autonomous systems. The
power of storytelling in that work connected me with everyone in the room,
allowing me to speak truth to power. In a time when creative work is being
mechanized and devalued, I urge us to keep making space for artists,
storytellers, and poets.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 37


RESPONSIBLE AI

(Interview with Dr. Joy Buolamwini - continued)

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
In 2023, AJL gave out the first-ever Gender Shades Justice Award to Robert
Williams. The Gender Shades Justice Award is designed to honor a recipient
who has experienced AI harm, taken the risk of voicing their experiences, and
actively worked to prevent future harm. While there are contests and awards
that recognize the efforts of researchers evaluating AI systems, this award
seeks to address the fact that there are few avenues through which those
impacted by AI harm are publicly celebrated and recognized for their
contributions in the fight for algorithmic justice. I want to see more impacted
individuals and communities centered in responsible tech conversations.

What is your vision of a better tech future and how we can move towards
that vision?
As I write in Unmasking AI, Algorithmic Justice means the people have a
voice and a choice in determining and shaping the algorithmic decisions that
shape their lives. That when harms are perpetuated by AI systems there is
accountability in the form of redress to correct the harms inflicted. That we
do not settle on notions of fairness that
do not take historical and social factor
into account. That the creators of AI
reflect their societies, That data does
not destine you to unfair discrimination.
That your hue is not a cue to dismiss
your humanity. And that AI is for the
people and by the people and not just
privileged few. We get there by
developing mechanisms and
communities of practice that surface
the limitations of AI systems so we can
reach our greater aspirations.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 38


RESPONSIBLE AI

Sinead Bovell
Futurist, WAYE Founder

“My vision of a better tech future is one that reflects the


diverse aspirations of society, not just those of a select few
with outsized resources.”

In your opinion, what is the biggest tech &


society issue we are currently facing?
The most pressing challenge we face today
is the breakdown of our information
ecosystems, which are vital for a healthy
democracy. Coupled with this is the
accelerating pace of technological
innovation, outpacing our ability to adapt
existing infrastructures and ecosystems.
This widening gap threatens our capacity to
ensure that technology develops safely,
responsibly, and in alignment with the
public interest.

How does your role help tackle thorny


tech & society issues?
Strategic foresight is essential for identifying emerging risks and challenges
before they escalate. By anticipating these issues and plausible futures, we
gain critical time to mitigate potential harms while building systems that
support both technological innovation and the public interest. Moreover, by
equipping young people with the tools to understand emerging technologies
and practice strategic foresight, we empower them to design the futures they
want to be a part of. By meaningfully including youth in conversations about
the future, we ensure they have a voice in shaping the world they will inherit.
This approach, blending foresight with education empowerment, allows us to
guide innovation responsibly, ensuring that it aligns with societal values and
creates a more inclusive future for all.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 39
RESPONSIBLE AI

(Interview with Sinead Bovell - continued)

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
My background is rooted in finance and chemistry, with a master’s in
business, where I was first introduced to exponential technologies and the
practice of strategic foresight. This sparked my passion for understanding
the future of innovation. After transitioning into management consulting, I
took an unexpected turn into the fashion industry, which ultimately led me to
launch my own tech education company. One of the key lessons I’ve
learned is that you have the right to rewrite your career story as many times
as necessary. Your career grows and evolves alongside you, reflecting your
experiences and aspirations.

What is your vision of a better tech future and how can we move towards
that vision?
My vision of a better tech future is one that reflects the diverse aspirations
of society, not just those of a select few with outsized resources. It’s a future
where we adopt a long-term lens to planning, allowing us to proactively
shape the future rather than simply react to it. In this future, individuals feel
empowered by the decisions we make today, knowing that these choices
are informed by a broader, more inclusive vision. By embracing strategic
foresight and thoughtful collaboration, we can create a future where
technology serves the collective good and aligns with the values and needs
of all.

Sinead Bovell is one of Mozilla’s 2024 Rise25 honorees;


learn more about Rise25 at Rise25.mozilla.org.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 40


RESPONSIBLE AI

Navrina Singh
Founder and CEO, Credo AI
“I believe we need more representation from marginalized
communities—those who have historically been overlooked
in tech. These are the voices most impacted by AI bias,
disinformation, and lack of inclusivity, yet they are often
absent from the conversations shaping AI's future.”
In your opinion, what is the biggest tech &
society issue we are currently facing?
The biggest tech and society issue we’re
facing today is the erosion of trust in AI and
technology as a whole. While AI offers
immense potential to solve complex global
challenges, it also introduces significant
risks—such as bias, privacy invasion,
disinformation, and lack of accountability.
The rapid pace of AI development has
outstripped the creation of governance
frameworks to manage these risks and
emerging risks, leaving a dangerous gap
between innovation and responsible use.
This gap not only threatens societal values
but also national competitiveness.
Countries and organizations that fail to implement responsible AI
governance will lose the trust of their citizens and global partners, ultimately
falling behind in the race for AI leadership. The future of global
competitiveness hinges on the ability to balance innovation with ethical
responsibility. At Credo AI, we believe that closing this gap is critical to
ensuring AI serves humanity in a positive, equitable way. Our mission at
Credo AI is to create the infrastructure of trust for AI, helping organizations
govern their AI systems responsibly and ensuring that technology empowers
rather than harms. By embedding transparency, accountability, and values
into AI systems, we help organizations build trust, drive innovation
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 41
RESPONSIBLE AI

(Interview with Navrina Singh - continued)

responsibly, and ensure that their AI systems empower rather than harm.
Trust is not just essential for societal well-being—it’s the foundation for
sustained leadership in the AI-driven economy.

How does your role help tackle thorny tech & society issues?
As the Founder and CEO of Credo AI, my role is centered around building
the infrastructure of trust for AI and ensuring that this transformative
technology is developed and deployed responsibly. I lead our vision and
strategy to address critical challenges at the intersection of technology and
society, focusing on creating governance frameworks that align AI with
ethical standards. I collaborate closely with governments, global
organizations, and enterprises to implement accountability measures for AI
systems.

A key part of my role is bridging the gap between innovation and regulation,
ensuring that AI's rapid development doesn't outpace our ability to manage
its societal impacts. This is vital not only for protecting human rights but also
for maintaining national competitiveness in the global AI race. At Credo AI,
we create software tools and frameworks that help organizations align their
AI systems with their values, standards, and regulations. We guide them in
navigating complex AI capabilities and understanding its risks, ensuring their
AI technologies contribute positively to society while driving business
growth. In this capacity, I’m privileged to lead a movement that is
transforming how we govern AI, making sure it remains a force for both
societal progress and national innovation.

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I founded Credo AI in March 2020, drawing on over 20 years of experience
in technology, including roles at Qualcomm and Microsoft, as well as
advisory positions with Mozilla AI, the U.S. Department of Commerce, OECD
and the United Nations. Credo AI is the manifestation of my life’s work,
values, and convictions—ensuring AI serves humanity responsibly. My
journey started in India, where my father’s military service and my mother’s
dedication as an educator instilled in me the values of hard work, integrity,
and resilience.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 42
RESPONSIBLE AI

(Interview with Navrina Singh - continued)

At 20, I left India to pursue a master’s degree in the U.S., navigating


challenges as a minority in tech. These experiences, coupled with my passion
for fairness and ethics, shaped my desire to drive change in the AI
ecosystem. After years in engineering and executive roles, I recognized the
urgent need for AI governance, particularly as AI’s impact on humanity and
our planet became clear. At Credo AI, we are building tools to ensure AI
systems have governance and oversight. My advice? Follow your passion,
stay grounded in your values, and be relentless in pursuing the change you
want to see in the world. As I often remind myself, "Success isn't just about
what you achieve, it's about the values you uphold along the way.” These
principles continue to guide my work and the mission of Credo AI—ensuring
that AI continues to serve humanity.

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
In the Responsible Tech/AI ecosystem, we need a broad range of voices and
backgrounds to truly address the complex challenges AI and emerging
technologies pose to society. First and foremost, I believe we need more
representation from marginalized communities—those who have historically
been overlooked in tech. These are the voices most impacted by AI bias,
disinformation, and lack of inclusivity, yet they are often absent from the
conversations shaping AI's future. We also need greater participation from
disciplines beyond just technology and engineering. Ethicists, sociologists,
policymakers, legal experts, and human rights advocates must be at the table,
bringing a multidisciplinary approach to AI governance. The societal impacts
of AI touch everything—from privacy and fairness to accountability and
transparency—and solving these issues requires a diverse set of skills and
perspectives. Additionally, it's crucial to bring in more perspectives from the
Global South and emerging markets. Often, the conversation around AI
governance is dominated by voices from Western countries, but the global
impact of AI demands input from all corners of the world. Lastly, I'd love to
see more collaboration with young innovators—students and early-career
professionals who are both tech-savvy and values-driven. They are the future
of responsible AI, and their fresh perspectives can help us navigate this
evolving landscape. A truly inclusive Responsible Tech ecosystem is essential
to building technology that serves all of humanity.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 43
RESPONSIBLE AI

(Interview with Navrina Singh - continued)

What is your vision of a better tech future and how can we move towards
that vision?
My vision for a better tech future is one where innovation not only advances
rapidly but remains deeply aligned with human values. As we push the
boundaries of technology, including the development of non-human
intelligence and superintelligent systems, it is imperative that these
advancements are in service of humanity. In my view, the true power of AI lies
not in its ability to surpass human intelligence but in its potential to amplify
our collective well-being, creativity, and progress.

To move toward this future, we must ensure that AI and other emerging
technologies are built with governance and oversight. This requires a
profound shift—embedding governance and Responsible AI frameworks from
the outset, rather than treating them as afterthoughts. We must foster
collaboration across diverse voices, including technologists, ethicists, and
policymakers, and focus on creating systems that respect human rights and
dignity. At Credo AI, we are committed to driving this vision forward by
providing the tools to govern AI responsibly and ensure intelligence evolves
in ways that benefit all of humanity. Innovation without humanity is empty, but
when innovation is aligned with our core ethical principles, it can propel
society forward in ways we have yet to imagine.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 44


RESPONSIBLE AI

Gemma Galdon Clavell


CEO, Eticas.ai

“Our ability to correct the course of Big Tech will determine


the rights, quality of life and opportunities of generations to
come.”

In your opinion, what is the biggest tech &


society issue we are currently facing?
AI bias. AI systems are designed to be
biased. The process of identifying patterns
in datasets will always skew results towards
majority patterns, pushing outliers and
minorities towards invisibility. As bias is an
AI feature, specific steps need to be taken
to identify and protect outliers in AI systems,
processes, and decisions. Failing to address
bias, fairness, and discrimination in tech
endangers rights, liberties, and many of the
democratic wins of the last century. Bias in
AI is the biggest challenge we face.

How does your role help tackle thorny tech & society issues?
Tackling bias issues in AI requires more than principles and general
commitments -it demands specific engineering practices and fairness metrics.
After 15 years in the Tech Accountability space, we realized that for these
efforts to exist, they had to be validated. Eticas.ai is a startup focused on
developing AI auditing software that validates fairness efforts. As an
independent party, we can set the benchmarks and standards, and help
those who want to pioneer fairness tools by giving them a competitive
advantage.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 45


RESPONSIBLE AI

(Interview with Gemma Galdon Clavell - continued)

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I've been in the responsible data/tech space for over 15 years. I completed
my PhD on tech and policy and was one of the few people working on tech
from the social sciences. My multi-disciplinary work was so unique that it was
relatively easy to get attention and funding, and create a space for socio-
technical innovation around Responsible Tech. Today, we are one of the few
organizations doing technical work from a digital rights and AI accountability
perspective. So my career has not been linear nor planned! I would advice
anyone seeking to enter this space to be prepared to work very hard: we are
inventing the future. Our ability to correct the course of Big Tech will
determine the rights, quality of life and opportunities of generations to come.

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
Non-US voices! The US is an incredibly diverse country, but internal diversity
cannot make up for the wealth of knowledge and ideas that come from other
cultural, linguistic and historical contexts. As a Hispanic in the US, I also miss
seeing more Latinxs!

What is your vision of a better tech future and how can we move towards
that vision?
A future where tech and AI contribute to fairness and equality instead of
eroding it. A future where tech helps us fix long-established discrimination
and power dynamics, instead of worsening them. And, of course, a future
where all AI is audited so that we can be sure that innovation outcomes are
safe and trustworthy.

Gemma Galdon Clavell is one of Mozilla’s 2024 Rise25


honorees; learn more about Rise25 at Rise25.mozilla.org.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 46


RESPONSIBLE AI

Cansu Canca
Founder & Director, AI Ethics Lab; Director of
Responsible AI Practice & Research Associate
Professor in Philosophy, Northeastern University
“Technology exists to serve humanity, and its purpose
should align with our most fundamental values.”

In your opinion, what is the biggest tech &


society issue we are currently facing?
As AI systems become more deeply
embedded in our daily lives, they are not
just transforming how we access
information but reshaping the very structure
of society. This shift goes beyond mere
technological advancement—it's about
reconfiguring power, opportunity, and
autonomy. AI now influences fundamental
decisions in criminal justice, healthcare,
finance, and education, often in ways that
are opaque or invisible to the public. These
systems determine who gets access to
resources, what opportunities are available,
and even which life paths we can imagine for ourselves. The troubling reality
is that if individuals are unaware of the possibilities that exist—or the
mechanisms of inequality that AI perpetuates—they cannot strive for a better
future or fight against injustice. This is both a matter of social justice and
personal autonomy.

As AI increasingly shapes our world and our lives, ensuring the ethical design
and transparent governance of these systems becomes critical. In other
words, the biggest issue is not a singular problem but a broader, complex
systems challenge: how to build and use AI to enhance social justice and
empower personal autonomy.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 47
RESPONSIBLE AI

(Interview with Cansu Canca - continued)

How does your role help tackle thorny tech & society issues?
My work is very hands-on. I am the lead AI ethicist to multidisciplinary teams
of computer scientists, philosophers, legal scholars, and designers. We
provide consulting to companies and organizations that are developing
and/or deploying AI systems to make their AI ethically better and to put in
place a responsible AI governance framework so that all AI that they develop
or procure is assessed for its risks and is not deployed unless potential risks
are effectively mitigated. For example, we help insurance companies, banks,
educational institutions, and healthcare providers create “fairer” AI models,
establish responsible AI workflows, and integrate ethics impact assessments
that engage key stakeholders, and create effective user interface for better
human decision-making using AI. Additionally, I advise international
organizations on AI ethics. This includes working with the UN and INTERPOL
to design and develop a responsible AI innovation toolkit for law enforcement
agencies, as well as as well as collaborating with the World Economic Forum
on a responsible AI investment playbook.

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I am trained as a philosopher, specifically as an applied ethicist. I have a PhD
in philosophy, and I worked on ethics and health for over a decade,
conducting research on end-of-life decisions and life-saving resource
allocation, advising healthcare officials
on public health policy decisions,
consulting hospitals on complex cases,
and training physicians and medical
students. This background provided me
with robust technical knowledge and
tools from philosophy, along with real-life
experience in complex, life-and-death
situations. I transitioned to AI ethics
initially because of AI-driven healthcare
technologies and the lack of ethical
assessment of these echnologies. I was
too early—this was around 2015 when I Photo from Mozilla Rise25 award ceremony, held
first began exploring AI ethics. August 13, 2024 in Dublin, Ireland.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 48


RESPONSIBLE AI

(Interview with Cansu Canca - continued)

At the time, the role of "AI ethicist" was not yet a thing, and there were no
positions available in academia or industry. So, I founded AI Ethics Lab to fill
this evident gap. I focused on building knowledge and networks, diving
deeply into AI ethics problems, collaborating with practitioners and
researchers, designing tools, developing methods, and creating training.

Early scandals, like Facebook's involvement with Cambridge Analytica,


created the perfect storm for AI Ethics Lab to gain momentum and become a
leading voice in shaping the applied AI ethics field. For those aspiring to enter
this space, my advice is always the same: AI ethics is a multidisciplinary field,
but don’t let that mislead you. To excel, you must be grounded in the
knowledge, methods, and tools of a specific discipline. Choose your area
carefully and develop expertise in it. On top of that, learn the basics of other
disciplines you’ll need to collaborate with so you can work effectively across
fields. Don’t get stuck in the limbo of interdisciplinary work by knowing a little
about everything without mastering any one area.

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
Philosophers! At the heart of AI ethics lies the question of ethics itself—what
is the right thing to do or the right policy to implement in a given situation?
This is a deeply philosophical inquiry, and no other discipline can fully answer
it. Philosophy has spent over two millennia grappling with questions of
morality, justice, and human welfare, yet it is often overlooked in discussions
of responsible tech. We risk missing out on a vast body of knowledge that
could guide us through these complex dilemmas.

That said, philosophers cannot work in isolation. To truly navigate the ethical
challenges of AI, we need the insights of social scientists, computer scientists,
engineers, and policy experts. Another critical, yet frequently
underrepresented, voice in this ecosystem is that of designers. The ethical
use of AI hinges on how well people understand these systems, and that
understanding is largely shaped by design. User interfaces play a vital role in
communicating what AI can and cannot do, making designers essential to the
responsible tech conversation—not peripheral participants, but integral
partners in creating ethical AI systems.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 49


RESPONSIBLE AI

(Interview with Cansu Canca - continued)

What is your vision of a better tech future and how can we move towards
that vision?

We live in a deeply unfair and unsustainable world, where human life is driven
and shaped by a curious measure called “productivity”. But technology,
particularly AI, has the potential to shift the balance toward a more just
distribution of wealth and opportunity, a more sustainable planet, and lives
driven by meaning rather than market utility. While this vision may seem
utopian, every step toward it matters. Can we leverage AI to advance racial
justice in the criminal justice system, promote gender equity in healthcare, or
foster financial inclusion for marginalized communities? Can we harness AI’s
productivity to empower everyone—not just the privileged few—to pursue
goals that they find personally meaningful? The answer is an unequivocal yes.

Technology exists to serve humanity, and its purpose should align with our
most fundamental values. As philosophers have long argued, this boils down
to reducing suffering (of all sentient beings), enhancing self-determination,
and upholding fairness and justice. By keeping these principles at the core of
AI development, we can work toward a future where technology genuinely
elevates humanity.

Cansu Canca is one of Mozilla’s 2024 Rise25 honorees; learn


more about Rise25 at Rise25.mozilla.org.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 50


AN EVOLVING ECOSYSTEM

Trust & Safety


Trust & Safety
The term “Trust & Safety” The recognition that Trust and
Safety issues as well as
refers to the policies,
techniques used by bad actors
practices, and products that are constantly developing,
enable users of online requiring ongoing and
services and platforms to evolutionary responses” (Trust
and Safety Professional
feel safe. It’s the umbrella Association. “Trust and Safety
term that covers common Curriculum.” Accessed August
threats like fraud in a digital 15, 2024).
landscape that has
At All Tech Is Human, we catalyze
drastically expanded in the global change in the Trust & Safety
past three decades. field through robust community-
building, research and resource
The Trust and Safety Professional development, awareness-building,
Association (TSPA) names the and workforce development. We’ve
following four factors in “the connected individuals and
evolution of its function: organizations from various sectors
to collaborate on important Trust
The proliferation of online and Safety challenges.
services and user-generated
content; By creating this agnostic space for
The recognition among many dialogue, our organization has
large internet companies that bridged critical information gaps and
their products can be exploited fostered cross-sector collaboration
in ways that bring about to tackle complex issues. Our
unintended consequences; organization has also led a working
The realization that if efforts are group of Trust and Safety
not undertaken to mitigate such professionals to produce upskilling
misuse and abuse, both users’ resources.
experiences and company
reputations and profits are likely We are actively leaning into our
to erode; awareness-building capacity to
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 52
TRUST & SAFETY

provide insights and platforms, social platforms, and


recommendations to policymakers search engines may take to
and regulators on governing healthy implement these principles.
digital spaces.

All Tech Is Human’s recent


collaboration with Thorn to
establish generative AI principles
around child safety, with
commitments from key tech
companies, showcases the
importance of multistakeholder
collaboration in the field of Trust &
Safety. Safety by Design for
Generative AI: Preventing Child
Sexual Abuse outlines these
collectively defined principles, along
with providing mitigations and
actionable strategies that AI
developers, providers, data-hosting

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 53


TRUST & SAFETY

Key Terms & Definitions Misinformation: False or inaccurate


information that is spread without
Trust & Safety (T&S): A the intent to deceive.
multidisciplinary field focused on
creating secure, inclusive, and Phishing: A fraudulent attempt to
ethical digital environments. T&S obtain sensitive information, such as
teams work to protect users from usernames, passwords, or credit
harmful content, behaviors, and card details, by disguising oneself
interactions while ensuring that as a trustworthy entity in electronic
digital platforms remain trustworthy communications. Phishing is a
and reliable. common cybersecurity threat that
Trust and Safety teams work to
Content Moderation: The process prevent.
of monitoring and managing user-
generated content on digital Harassment: Unwanted and
platforms. This includes identifying, aggressive behavior directed at
reviewing, and removing content individuals or groups, often
that violates community guidelines involving threats, intimidation, or
or legal standards, such as hate abuse. Harassment can occur in
speech, misinformation, or graphic various forms, including
violence. cyberbullying, doxxing, and trolling,
and is a key focus for trust and
User-Generated Content (UGC): safety teams.
Any content, such as text, images,
videos, or reviews, created and Trolling: The act of deliberately
shared by users on digital platforms. provoking or upsetting others online
UGC is a key area of focus for trust by posting inflammatory or off-topic
and safety efforts due to the messages. Trolls often seek to
potential for harmful or disrupt conversations and cause
inappropriate content. emotional distress, making trolling a
significant challenge for content
Disinformation: Deliberately false moderation.
information spread with the intent to
mislead or deceive others.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 54


TRUST & SAFETY

Hate Speech: Any communication Transparency Report: A public


that belittles or discriminates against document issued by digital
individuals or groups based on platforms detailing their efforts in
attributes such as race, religion, content moderation, including the
ethnicity, gender, sexual orientation, number of takedown requests
or disability. Hate speech is often received, content removed, and
prohibited by platform guidelines actions taken against harmful
and is a primary target for content behaviors. Transparency reports are
moderation efforts. an important tool for accountability
and trust-building in the digital
Doxxing: The act of publicly sharing space.
private or personal information
about an individual without their Content Policy: A set of rules and
consent, often with malicious intent. guidelines that define what is and
Doxxing can lead to harassment, isn't allowed on a digital platform.
threats, and physical harm, making it Content policies are developed by
a serious concern for Trust and Trust and Safety teams to align with
Safety teams. legal requirements, community
standards, and company values.
Spam: Unsolicited and often
irrelevant messages sent in bulk, Safety by Design: A proactive
typically for advertising or phishing approach to designing digital
purposes. Spam can clutter digital products and services with user
platforms and pose security risks, safety as a foundational principle.
making its detection and removal a This involves integrating safety
priority for trust and safety teams. considerations into every stage of
product development, from
Community Guidelines: The rules conception to deployment, to
and standards set by digital prevent harmful experiences and
platforms to govern user behavior foster trust.
and content. These guidelines are
enforced by Trust and Safety teams
to ensure a safe and respectful
environment for all users.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 55


TRUST & SAFETY

Select Organizations internet, powered by its community


of integrity professionals. From the
Digital Trust & Safety Partnership: website: “We are engineers, product
“Trust and Safety is a critical function managers, researchers, analysts,
at many of the world’s leading digital data scientists, operations
service companies. The people specialists, policy experts and more,
working in Trust and Safety help to with decades of combined
promote a safer and more experience across numerous
trustworthy internet. The internet is platforms. We understand the
more than pixels and algorithms. It is systemic causes of problems on the
guided by Trust and Safety social internet and how to mitigate
professionals who spend their days them. We have seen (and built!)
keeping people safe from abuse. successful and unsuccessful
Each company is guided by its own solutions. We bring this experience
values, product, aims, digital tools, and expertise directly to the people
and human-led processes to make theorizing, building, and governing
tough decisions about how to enable the social internet.”
a broad range of human expression
and conduct, while working to Trust and Safety Forum: “Since the
identify and prevent harmful content Spring of 2022, the Trust & Safety
and conduct. Until now, the field of Forum (T&SF) offers a cohesive and
Trust and Safety has not yet international space open to all
developed the kinds of best practices stakeholders, from platforms to
and assessments that have been regulators, inclusive of trusted
crucial to maturing and organizing flaggers, solutions providers, and
other tech disciplines like governments committed to a trusted
cybersecurity. That’s why leading and safer digital environment today
technology companies are coming and for the future. The T&SF offers a
together to form the Digital Trust & place and time to connect with
Safety Partnership.” different stakeholders to discuss
and advance collaborative
Integrity Institute: “The Integrity initiatives, develop innovative
Institute advances the theory and processes and plan solutions,
practice of protecting the social ensuring that the digital

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 56


TRUST & SAFETY

environment remains a place to share


knowledge, to build communities, to
develop opportunities and empower
people.”

Trust and Safety Foundation: “The


Trust and Safety Foundation works to Job Levels
improve society’s understanding of According to ATIH’s job board,
the trust and safety field through most Trust & Safety roles require
educational and research programs 5-6 years (37%), 7-9 years (32%),
and interdisciplinary dialogue.” and 3-4 years (32%).

Trust and Safety Professional Job Titles


Association: “The Trust & Safety Manager, Trust and Safety
Professional Association (TSPA) is a Senior Manager, Moderation
501(c)(6) non-partisan membership Operations
association that supports the global Fraud Technical Investigator,
community of professionals who Platform Abuse
develop and enforce principles, Content & Online Safety
policies, and practices that define Compliance Program Mgr.
acceptable behavior and content Machine Learning Engineer,
online and/or facilitated by digital Account Integrity
technologies. TSPA works to create Product Mgr, Content Safety
and foster a global community of trust Incident Analyst
and safety professionals, Data Scientist, Trust & Safety
collaborating with them to build a Fraud and Safety Analyst
community of practice, and providing Community Safety Specialist
support as they do the challenging Scaled Abuse Lead, Trust &
work of keeping online platforms Safety
safe.”

See a full list of organizations:


http://www.alltechishuman.org/trust
-and-safety.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 57


TRUST & SAFETY

Resources Digital Trust & Safety Partnership,


Safe Assessments Report: DTSP
The Trust & Safety Professional members have created evaluation
Association, Trust & Safety criteria for trust & safety best
Curriculum: The Trust & Safety practices at major tech companies.
Professional Association’s Curriculum
Development Working Group has Digital Trust & Safety Partnership,
written a series of comprehensive Best Practices Framework: DTSP
modules on trust & safety, including members have also created and
key roles and functions of T&S teams, facilitated the adoption of a best
a glossary of terms and definitions, practices framework, with the focus
and other foundational insights. of “articulating industry efforts to
address online Content and
Stanford Internet Observatory, Trust Conduct-Related Risks.”
& Safety Teaching Consortium:
Similarly to TSPA’s Curriculum
Development Working Group, the
Trust & Safety Teaching Consortium
is a coalition of T&S professionals
that hosts a 13-module reading list
with associated slides, lectures and
readings, with the intent of expanding
the audiences to whom T&S topics
are taught.

Integrity Institute, Best Practices


Guides for Platforms: The Integrity
Institute’s members have produced a
body of research to improve
platforms’ integrity processes, from
algorithmic risk assessment and
mitigation to transparency to defining
election integrity and supporting
elections online.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 58


TRUST & SAFETY

Looking Ahead More impactful cross-sector


collaboration
Increased globalization and With the growing role of academic
standardization of practices institutions and civil society
We will see continued efforts to organizations in trust & safety, we
internationally unify the business will see more meaningful
language around trust & safety, and opportunities to leverage cross-
continued development of sector strengths in the T&S solution
resources at the standard-setting space.
level.
Applications of AI and general
More focus on the global majority adaptation to emerging technology
and marginalized communities The trust & safety field will continue
We will see the continued to adapt to an evolving tech
development of resources landscape, including the refined use
equipping tech companies to of AI to address the unique threats
respect cultural differences on their posed by emerging technology.
platforms while maintaining user
protection and privacy. Implications of automating the
Trust & Safety function
Evolving legal and regulatory We will see an increased focus on
landscape the effects of automation on staffing,
With the push for regulating digital content moderation, and policy.
spaces, we will continue to see laws
that aim to address online harm,
data privacy, and platform ATIH IN ACTION
accountability and transparency. In June 2023, All Tech Is Human
This will increasingly complicate the and the Atlantic Council hosted
T&S compliance space. the NYC launch event for the
Taskforce for a Trustworthy
Professionalization of the field Future Web’s report, Scaling
We will see the growing Trust on the Web.
professionalization of the T&S field
with formal degree programs and
career pathways.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 59
TRUST & SAFETY

Camille François
Professor, Columbia University School of
International & Public Affairs
“I'd like our technological futures to be open, preferably open
source when that can be on the menu, with a generous side
of pluralism and safety.”

What is the biggest tech & society issue


we are currently facing?
Technology is developed and deployed
faster than we can understand and mitigate
its negative impacts on society; that's the
core challenge spanning AI, social
networks, and immersive technologies. This
translates into a number of urgent issues,
from the increased use of AI in armed
conflict to the sustainability of journalism, to
information operations targeting elections
around the world...I'm not fond of picking
one issue to rule them all. What we need is
a broad range of talent and safety
innovation across all domains!

How does your role help tackle thorny tech & society issues?
I've been fortunate to tackle these issues from a few different vantage points:
as an executive leader in Silicon Valley, as a researcher and a professor, and
as an advisor to policymakers in the U.S., E.U. and France.

Each of these positions translates in a different way to addressing these


issues. And at Columbia, I'm focused on the next generation of talent, ideas
and systems to tackle the most pressing tech and society challenges.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 60


TRUST & SAFETY

(Interview with Camille François- continued)

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
II had plenty of luck, good mentors, and followed my curiosity wherever it led.

I initially came to these issues from a cybersecurity perspective, and so bring


a little bit of a hacker mindset to the safety field. I was also told plenty of
times that my degree in human rights would be wholly irrelevant to the career
I was trying to build, which thankfully turned out to not be true! To those
wanting to grow in the field: welcome! We need fresh perspectives, diverse
disciplines, and storied backgrounds coming together to tackle these issues.

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
We've talked about how much we need diverse backgrounds and expertise
in this field; this in turn creates the need for us to be good at bridging silos.
For instance, right now, there remain large disconnects between the
Responsible AI and Trust & Safety fields, at a moment when it's critical that
they join forces. Beyond the responsible tech ecosystem, I continue to
advocate for more diversity throughout the technology industry itself, where
the leadership and governance structures of large companies remain far
behind the curve in representing the diversity of individuals and communities
using their products.

What is your vision of a better tech future


and how can we move towards that vision?
I'd like our technological futures to be open,
preferably open source when that can be on
the menu, with a generous side of pluralism
and safety. I think about this a lot in the
context of contemporary artificial intelligence
developments, where avoiding a
monoculture of large models seems urgent
and where we're still lacking a lot of
accessible safety tools and safeguards.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 61


TRUST & SAFETY

Sean Litton
President and CEO, Tech Coalition

“The Tech Coalition is trying to build a safer internet for kids.


One that helps them learn, play and creatively express
themselves, but that also has safeguards to protect them
against people who want to harm them. This is a whole of
society challenge that requires a whole of society response.”

What is the biggest tech & society issue


we are currently facing?
The number one issue that we are all facing
is how do we grapple with the impact of
generative AI? These extraordinary new AI
tools show such promise for positive impact
in so many fields but there is also risk of
misuse. How do we take full advantage of
that incredible potential for good without
suffering the consequences of
unanticipated misuse? For over a year, our
members have been engaged in
constructive dialogue with child safety
experts, U.S. and international law
enforcement, and global regulators to build
a shared understanding of the potential risks predatory actors pose to
children through generative AI. Our members have also come together to
collaboratively develop strategies to mitigate those risks. This type of
collaboration vastly accelerates industry’s progress on ensuring user safety
versus individual companies attempting to figure all of this out in isolation.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 62


TRUST & SAFETY

(Interview with Sean Litton - continued)

How does your role help tackle thorny tech & society issues?
I lead the Tech Coalition, the industry alliance of more than 40 global tech
companies of varying sizes and services tackling the challenge of online child
sexual abuse and exploitation (OCSEA). Our efforts are focused on building
industry capacity to protect children as they enjoy the same online tools and
services we all use to connect, share, and learn. The Tech Coalition provides
a safe space for our industry members to collaborate on the most significant
issues relating to OCSEA and identify, develop and deliver new initiatives that
drive real results. We also provide critical insights, practical resources and
step-by-step guides to support individual companies as they build their
capacity to combat OCSEA. Lastly, we foster constructive engagement and
dialogue between industry and third-party stakeholders to enhance
understanding, build trust, and drive collective action to combat OCSEA.

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I am a seminary drop out that found his calling and people at Notre Dame
Law School. After a few years as a lawyer with Kirkland & Ellis in Washington,
DC (extraordinary training for which I am forever grateful), I joined a startup
human rights organization called International Justice Mission (IJM) and
moved to the Philippines to work with local authorities to investigate and
prosecute cases of child sexual exploitation and abuse. This was supposed to
be a temporary break from practicing law in the US but I loved the work and
people so much that I stayed with IJM for 20 years, moving into leadership as
it continued to grow and eventually serving as President. In late 2020, I
began having conversations with several tech companies about leading the
relaunch of the Tech Coalition (it had originally started in 2006 but had
struggled to gain sustained momentum because it had no full-time staff).
Given the scale of the companies involved and their level of commitment, it
was clear to me that this was perhaps the single most leveraged and strategic
opportunity to drive global impact on child protection that I would ever
encounter. I gratefully accepted the opportunity to lead the Coalition and was
soon joined by a phenomenal team of leaders who have rapidly built
momentum and trust with our members. Together, we are driving real impact
for children. Here is my career advice: Take risks. Work hard. Be grateful.
Take care of yourself and your team.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 63
TRUST & SAFETY

(Interview with Sean Litton - continued)

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
Trust and safety issues manifest on all types of platforms and apps. I think
there are opportunities to better engage app developers, gamers, and other
tech entrepreneurs to build our collective understanding of potential risks and
what works to mitigate them.

What is your vision of a better tech future and how we can more towards
that vision?
The Tech Coalition is trying to build a safer internet for kids. One that helps
them learn, play and creatively express themselves, but that also has
safeguards to protect them against people who want to harm them. This is a
whole of society challenge that requires a whole of society response. Child
sexual abuse is not a new problem. Certainly, the tech industry must do its
part to prevent, detect and report
attempted abuse online. This is my
focus every day. But as a society,
we need a more open and informed
dialogue on child sexual abuse. We
cannot avoid it. We cannot continue
to be surprised when it shows up in
an institution we trusted. Around the
world, there needs to be a greater
investment in ensuring individuals at
risk of offending against children
have access to the assistance they
need so that they do not offend.
There needs to be a greater
investment in the National Center
for Missing and Exploited Children’s Cybertipline that receives more than 30
million reports a year from industry relating to online child sexual exploitation
and abuse. And there needs to be a corresponding increase in investment in
global law enforcement to ensure that they have the capacity to respond to
these reports. Currently, they do not. We have to look at the whole pipeline.
Industry must do its part to ensure its products are safe AND we all have to
work together. This is what is required to keep our children safe.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 64
AN EVOLVING ECOSYSTEM

Youth, Tech and Wellbeing


Youth, Tech, and Wellbeing
The past year has seen tradeoffs for child safety this Fall.
Supported by the Oak Foundation,
tremendous developments
the report will delve into best
in the Youth, Tech, and practices for balancing privacy and
Wellbeing space, with safety in encrypted environments.
numerous youth or
This past year, we also launched our
intergenerationally-led University Network, including a
initiatives launching. global, interdisciplinary Board of
Directors representing 40+
This area remains a critical focus universities from North and South
area for us, with a growing sub- America, Europe, Asia, Africa, and
community of researchers, students, Australia. Under the leadership of
educators, activists, policymakers, Dr. Steven Kelts, the University
health professionals, and platform Network aims to expand the
representatives in this work stream. responsible tech movement to over
100 universities and equip students
In August 2024, the US Senate with the knowledge and skills
overwhelmingly passed the Kids necessary to explore a career at the
Online Safety Act (KOSA), which intersection of social impact and
would codify a “duty of care” for tech.
platforms (with more meaningful,
proactive harm prevention for minor We were equally excited to launch a
users) and give minor users more set of guidelines on AI-generated
options to make informed, child sexual abuse material (CSAM)
responsible decisions to protect this year, a project in collaboration
themselves online. The bill has with Thorn. Eleven tech companies
moved on to a vote in the House have committed to these Safety by
later this fall. Design principles, including
Amazon, Google, Meta, Microsoft,
On the heels of this development, and OpenAI. The effort signals a
All Tech Is Human will launch a commitment to protecting youth as
report outlining the evolving the future of Generative AI unfolds.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 66


YOUTH, TECH, AND WELLBEING

Key Terms & Definitions from risks such as cyberbullying,


exploitation, and exposure to
Youth: The United Nations, for inappropriate content.
statistical purposes, defines “youth”
as those persons between the ages Social Media Addiction: A
of 15 and 24 years. compulsive and excessive use of
social media platforms, leading to
Digital Wellbeing: The impact of negative consequences on a
digital experiences on physical, person's life and mental health.
mental, and emotional health,
including how time spent online Digital Footprint: The trail of data
affects sleep, social connections, that users leave behind when they
and cognitive development. use the internet, including social
media activity, browsing history, and
Digital Literacy: The ability to find, personal information.
evaluate, and create information
using digital technologies. It Child Sexual Abuse Material
includes understanding the ethical (CSAM): Any visual depiction of
use of digital tools and navigating sexually explicit conduct involving a
the digital environment safely. minor. The term is often used in the
context of online safety to address
Screen Time: The amount of time the distribution and creation of such
spent using devices with screens content through digital platforms.
such as smartphones, tablets,
computers, and televisions.
ATIH IN ACTION
Cyberbullying: The use of digital On June 27, 2024, All Tech Is
platforms (social media, messaging Human partnered with the LEGO
apps, gaming platforms, etc.) to Group and the Royal Society to
harass, threaten, or humiliate others, host Responsible Tech London,
particularly among adolescents. featuring panels and a mixer
focused on digital wellbeing for
Online Safety: Practices and tools youth with a mix of stakeholders.
designed to protect Internet users

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 67


YOUTH, TECH, AND WELLBEING

Select Organizations this century. Building on the success


of the youth-led campaign, Design It
Children and Screens: Institute of For Us has grown into a youth-led
Digital Media and Child coalition to advocate for safer online
Development: “Children and platforms and social media. The new
Screens: Institute of Digital Media first-of-its-kind “Design It For Us”
and Child Development is an coalition aims to drive and achieve
international non-profit organization key policy reforms to protect kids,
founded in 2013 to understand and teens, and young adults online
address compelling questions through the mobilization of youth
regarding media’s impact on child activists, leaders, and voices. The
development through youth-led coalition is spearheaded
interdisciplinary dialogue; objective, by two Co-Chairs and a Core Team
scientific research; and information of young people between the age of
sharing.” 18 and 26. The coalition is
supported by an array of youth
Common Sense Media: “Common activists, youth-led organizations,
Sense has been the leading source and advisors.”
of entertainment and technology
recommendations for families and Encode Justice: “Encode Justice is
schools. Together with policymakers, the world’s first and largest youth
industry leaders, and global media movement for safe, equitable AI.
partners, they’re building a digital Powered by 1,000 young people
world that works better for all kids, across every inhabited continent,
their families, and their communities.” we believe AI must be steered in a
direction that benefits society.”
Design It For Us: “Design It For Us
was an innovative multimedia effort Family Online Safety Institute
for young people to share why online (FOSI): “The Family Online Safety
spaces should be designed for them. Institute brings a unique,
The campaign elevated youth voices international perspective to the
and secured the unanimous passage potential risks, harms as well as the
of the California AADC, the most rewards of our online lives. FOSI’s
significant tech accountability bill to 30+ members are among the
pass anywhere in the United States leading telecommunications, social
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 68
YOUTH, TECH, AND WELLBEING

media, cybersecurity, gaming and together interdisciplinary dialogue,


internet companies in the world. public information, and rigorous,
Their work encompasses public objective research. Endorsed and
policy, industry best practices, and informed by academics, parents,
good digital parenting.” policy makers, and other experts,
these principles “were also shaped
#GoodforMEdia: “An initiative from by what children and young people
the Stanford Center for Youth told us they needed from the digital
Mental Health & Wellbeing, world to thrive.”
#GoodforMEdia advocates for
helping youth practice healthier ways Mothers Against Media Addiction
of engaging with media. We take a (MAMA): “Mothers Against Media
nuanced approach to technology and Addictions wants kids to experience
social media’s effects on youth childhood without constant
mental health, because we recognize distraction and manipulation by tech
it’s not black and white.” companies. They focus on working
to create change and awareness in
Headstream Innovation: these areas so children can learn
“Headstream envisions “a near-term better in schools, and teachers can
future where mental health care is as teach. They also advocate for policy
prioritized as physical health care. that allows “families and teachers,
That access to timely and culturally not tech companies, decide what
competent care will unburden young their children see.”
people, across all backgrounds, from
the mental well-being constraints that
prevent them from thriving and
harnessing their full power.“

The Log Off Movement: “LOG OFF is


a youth-led organization committed
to helping kids, teens, and young
people build healthy relationships
with social media and online
platforms.”

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 69


YOUTH, TECH, AND WELLBEING

SWGFfL: “SWGfL is a not-for-profit Thorn: “Thorn is a nonprofit that


charity ensuring everyone can benefit builds technology to defend
from technology free from harm. children from sexual abuse.
Forming 1/3 of the UK Safer Internet Founded in 2012, the organization
Centre, our experts advise schools, creates products and programs to
public bodies, and industry on empower the platforms and people
appropriate actions to take in regard who have the ability to defend
to safeguarding and advancing children. Thorn’s tools have helped
positive online safety policies. SWGfL the tech industry detect and report
has been at the forefront of online millions of child sexual abuse files
safety for the past two decades, on the open web, connected
delivering engaging presentations investigators and NGOs with critical
and training to a wide variety of information to help them solve
audiences nationally and cases faster and remove children
internationally. Their work has from harm, and provided parents
brought online safety to the forefront and youth with digital safety
of public attention, ensuring everyone resources to prevent abuse.”
can develop their understanding of
what online safety truly means in an
ever-changing world.”

Tech Coalition: “The Tech Coalition is ATIH IN ACTION


an alliance of global tech companies On September 17, 2024, All Tech Is
who are working together to combat Human will partner with Safe
child sexual exploitation and abuse Online to bring 100 stakeholders
online. The Tech Coalition convenes together for a Safety by Design
and aligns the global industry - event. Attendees will represent
pooling knowledge, upskilling civil society, government, industry,
members, and strengthening all links and academia.
in the chain — so that the smallest
startups have access to the same
level of knowledge and technical
expertise as the largest tech
companies in the world.”

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 70


YOUTH, TECH, AND WELLBEING

Young People’s Alliance: “A post-


partisan youth advocacy nonprofit
founded and led by college
Job Levels
students. “As students, we’ve seen
According to ATIH’s job board,
firsthand how young people’s
most Youth, Tech, & Wellbeing
interests are sidelined by our
roles require 5-6 years (40%), 3-4
political system. Our mission is to
years (40%), and 0-2 years (33%).
empower young people to shape
their future.”
Job Titles
5Rights Foundation: “5Rights works Child Safety Analyst
to put children’s needs and rights at Policy Manager, Domain
the very heart of digital design. It's Specialist, Child Safety
their mission to “ensure that the Child Safety Enforcement
same freedoms, protections and Specialist
privileges that young people are Technical Program Manager
entitled to offline, also apply online. of Child Safety
5Rights Foundation was started as a Child Safety Manager,
set of principles that would bring Government Affairs & Policy
Global Head of Child Safety
See a full list of organizations: Child Safety Engineering
https://alltechishuman.org/youth- Analyst
tech-wellbeing. Threat Investigator, Child
Safety
Law Enforcement Response
Team - Child Safety Specialist
Policy Design Manager, Child
Safety and Emotional and
Psychological Harm
Child Safety Emerging Risk
Analyst
Program Director, Child
Protection and Technology
Researcher, Child Safety

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 71


YOUTH, TECH, AND WELLBEING

Resources #GoodforMEdia provides Guides


for youth navigating social media.
The Archewell Foundation Parent’s
Network works to build an National Center of Excellence on
empowered, informed, and Social Media and Youth Mental
connected global community of Health (American Academy of
families. Pediatrics) supports the mental
health of children and adolescents
Common Sense Media’s K-12 Digital as they navigate social media.
Citizenship Curriculum was
designed and developed in 5Rights Foundation’s Digital
partnership with Project Zero at Childhood Report addresses
the Harvard Graduate School of childhood development milestones
Education, guided by research with in the digital environment.
thousands of educators.
Fairplay provides a Screens in
FOSI. Media Literacy Flashcards: Schools Action Kit for parents.
Each card provides a simple
definition and conversation starter Tech Coalition provides a
question to help your child begin to Knowledge Hub of resources for
learn and talk about media literacy. protecting children from online
sexual exploitation and abuse.
Thorn. Youth Perspectives on
Online Safety report: An annual
report of youth attitudes and
experiences and Discussion Guides
for parents to talk to kids about
online safety topics.

The Responsible Innovation in


Technology for Children (RITEC)
project, co-founded by UNICEF and
the LEGO Group released a report.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 72


YOUTH, TECH, AND WELLBEING

Looking Ahead to youth, tech, and wellbeing,


recognizing the diverse cultural
Personalized Experiences Online contexts in which young people use
We can expect a rise in technology. International
personalized tooling for wellbeing, collaborations can lead to the
with specific designs for young development of global standards for
people. comprehensive youth digital
wellbeing.
Continued Activism and Youth-Led
Initiatives
We will continue to see youth taking
ATIH IN ACTION
an active role in shaping their future
In February 2024, All Tech Is
with technology.
Human hosted Responsible Tech
DC: A Better Tech Future for
Enhanced Digital Literacy and
Youth for 200 technologists, youth
Education
leaders, policymakers, and
Digital literacy will continue to be a
researchers in the youth safety
focal point in our education system,
space. Check out the panel on
with an emphasis on teaching the
effective design for youth here!
discernment necessary to evaluate
online content and experiences,
ATIH specializes in uniting a broad
understand the implications of a
range of individuals who come
digital footprint, and protect one’s
together to surface values, tensions,
privacy.
trade-offs, and best practices.
Collaboration is key to tackling
Wellbeing by Design
thorny tech & society issues!
We will see a greater focus on
inserting wellbeing principles into
the design of tech products. This
includes robust measures to prevent
unsafe experiences on platforms.

Global and Cultural Considerations


We will see a more global approach

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 73


YOUTH, TECH, AND WELLBEING

Baroness Beeban Kidron


Member of UK House of Lords, Chair 5Rights Foundation,
Fellow and Advisor to Institute for Ethics in AI, University of
Oxford and Advisor to AI Ethics Institute
“A responsible tech sector follows the needs and norms that
we have already created in our desire to live together with a
commitment to delivering benefit, security, and wellbeing to
the broadest number of people.”

What is the biggest tech & society issue we


are currently facing?
The lack of corporate responsibility for the
creation, distribution, and impact of tech.
Regulatory innovation (avoidance) by tech
companies impacts almost every area of
public and private life. From the lack of local
housing in tourist hot spots; discord at
democratically arrived at outcomes;
pervasive death threats that chase women
out of public life; or the cost to child
development of monopolizing their (and their
parents) attention - among many others - the
impact of the tech sector is clearly evidenced.
Taking advantage of the conceptual idea of
being neutral 'infrastructure' to avoid
responsibility. This out-of-date concept, undermined by the ubiquitous use of
tech to personalize, editorialize and commercialize the user experience has
been allowed to impact each area of public and private life with no regard for
society and the communities from which they profit. A situation met with fury,
but a fundamentally inadequate response, from lawmakers and courts in
ensuring any sense of an equality of arms between the tech sector, legacy
businesses, communities or individual users.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 74


YOUTH, TECH, AND WELLBEING

(Interview with Baroness Beeban Kidron - continued)

Tell us about your career journey. How did your career grow, and what
advice would you give to others wanting to be in a similar position?
For the first 30 years I was a film director. I made feature films in the UK and
US and, between the narrative films, I made documentaries on issues that
interested me. In 2012, when the smartphone became the price point an adult
might give to a child, I made a film about children and the Internet. That film
changed my life. And led to me being a full-time legislator and campaigner.

Spending time with children in play and communication, developing


relationships and learning, I realized that the technology they were using,
which increasingly defined their childhood, had not been designed with them
in mind. Indeed, it was undermining their autonomy and ability to develop.
From interrupting eye contact with parents, increasingly passive interactions
inhibiting language development and being ever-present to the demands of
online services designed to monopolize attention, all presented a huge
obstacle to their emotional development - even before one considered the
more visible harms of grooming, multiple versions of self-harm, suicide and
sexual abuse. The issues were deep and broad (much broader than the
public discourse) and in that moment I saw a generational miscarriage of
justice. I have tried to fight this injustice. I have never made a film again.

Going from directing to policymaking may appear discordant, but many of the
competencies of bringing together a large team and walking them towards an
as yet unseeable goal have been helpful. And over the last decade or so,
many more people have joined in the fight for a different kind of
technologically-enabled childhood. One not based on commercial interest,
but one that would build the digital world children deserve.

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
Children and young people. Educators. Parents. And where a product or
service is aimed at a particular community, I would like to see that community
have a voice. I do regular workshops with children and young people on
subjects that range from AI and Ed Tech to body image and social cohesion.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 75


YOUTH, TECH, AND WELLBEING

(Interview with Baroness Beeban Kidron - continued)

Almost always they finish the workshop with a sense that they have found a
language to express what is happening to them. That they have understood that
the hours they spent doomscrolling was not so much their weakness as the
success of a product team that has spent months and millions of dollars to get
them to do just that.

Similarly, educators have a very clear idea of what good looks like, and which
aspects of digital norms and digital design are messing up the kids in their
classroom. Like the children, they are rarely seeking a tech-free world, but are
desperate for someone - anyone - to detoxify the race for attention that defines
this generation of products and services. The cost to learning of poor Ed Tech
that does not mirror pedagogical needs, the tired or anxious children who pay
more attention to their device than their learning, and the extraordinary cost of
mis/disinformation - whether about public health and elections, or the kid in their
class - is heartbreaking to educators. So, too, is the automated prevalence of
misogyny, suicide ideations, self-harm, pornography, and AI or otherwise child
sexual abuse. They don’t understand why they are so confined by what and how
they teach, but this is allowed to undermine children unabated and unpunished.
Parents are confused and guilty; Unable, sometimes unwilling, to set themselves
in opposition to something that has grasped hold of themselves as well as their
children. Family conflict and a feeling that there are only bad choices - many
parents perhaps even most - would like clear rules of the game on addictive
practices - because at least then they could make some realistic conditions.

Finally, the impact on communities is so huge, I believe they should have a say.
Let towns or communities say if they want Airbnb, some might prefer to
guarantee the supply of houses for locals. Let teachers (or education boards)
have pedagogical criteria for Ed Tech that can be easily understood rather than
flood the market with expensive untested services designed by generalists. Let
services that do accounting, sell cars or holidays, have to meet the professional
standards of the sector, make sure that surge pricing is transparent and
understood (maybe even limited) so that a girl alone at night is not suddenly
faced with walking home because they can no longer afford the ride and so on
and so forth. As it stands, we are paying the costs of a model that has innovated
its way out of responsibilities that are practical and protective of those that they
impact.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 76


YOUTH, TECH, AND WELLBEING

(Interview with Baroness Beeban Kidron - continued)

What is your vision of a better tech future and how can we move towards
that vision?
A responsible tech sector follows the needs and norms that we have already
created in our desire to live together with a commitment to delivering benefits,
security and wellbeing to the broadest number of people. Depending on how
you analyse how we got here, tech is either the cause or simply took advantage
of a period in which corporates have been given licence to make money without
making any artifacts. It has unleashed a movement of wealth to an ever-fewer,
ever-richer group of people. The freedoms they trumpet in their defence
(speech, movement of capital, responsibility for impact, etc.) are also unequally
distributed. So a female politician can give up the public square, give up her
right to wellbeing or give up her job - but does not have the freedom to go to
work unabused. Ditto, it is now best practice not to post more than 40% of a
child online so that they cannot be scraped and turned into AI child abuse
material. Similarly, the freedom to make products that entice others into a state
of compulsion, is something that costs society dearly, as does the attention-
seeking mis and disinformation that feeds conspiracy theories and that social
media coffers.

In short, my view is one in which the gatekeepers and purveyors have


responsibilities that are equal to their impact. That services are assessed and
required to work in a way that meets the obligations that we have (largely)
already articulated to protect workers, citizens, communities, and individuals.
And in which human dignity, freedom to participate, and the protection of
vulnerable groups such as children are put on an equal footing with the rights of
commercial companies. And it should be done in haste because AI untrammeled
will exacerbate many of the ills we have already experienced. Indeed, if we
could turn the tanker around and put tech at the service of humanity and then
bring to bear the benefits of AI, it would be the new and exciting frontier that is
claimed. As it stands, it is a gold rush in the hands of the usual suspects seeking
power and profit, in a race in which the experiences and livelihoods of the
broader community are often conceived as collateral damage. There is a much
richer path than the one we are on.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 77


YOUTH, TECH, AND WELLBEING

Trisha Prabhu
Founder and CEO, ReThink;
Founder and President, ReThink Citizens
“So many young people wonder, “What can I do? What can I
offer?” In fact, we young people are uniquely positioned to
build the digital world that we want to live in. So go for it!
Make your impact.”

What is the biggest tech & society issue we


are currently facing?
The silent pandemic of online harassment
that today’s youth are experiencing.
According to Pew, 46% -- nearly half -- of US
youth aged 13-17 have been cyberbullied.
Historically vulnerable groups, like youth of
color and members of the LGBTQIA+
community, are disproportionately targeted,
often because of their identities. As a young
person myself -- and in conversation with
youth across the country and globally -- I’ve
also increasingly heard about and witnessed
how emerging forms of online hate are
exacerbating harm.
Non-consensual sexual deepfakes, for instance, are seriously threatening the
wellbeing of young women. Today's young people deserve better. We
deserve an internet that prioritizes our interests. Making that vision a reality is,
in my view, the biggest issue ahead of us.

How does your role help tackle thorny tech & society issues?
I’m the Founder and CEO of ReThink, a global movement with the mission of
eradicating online hate and cultivating a new generation of responsible digital
citizens. ur flagship product is our ReThink technology, an internationally
acclaimed tool that helps students pause, review, and ReThink before they
say something harmful or offensive online.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 78
YOUTH, TECH, AND WELLBEING

(Interview with Trisha Prabhu - continued)

By intervening in-the-moment, we stop this speech proactively (over 93% of


the time), before the damage is done, and teach youth crucial online and
offline decision-making skills. ReThink has been recognized and celebrated
by Google, MIT, Harvard, Mozilla, The Elevate Prize Foundation, and Prince
Harry and Meghan, The Duke and Duchess of Sussex.

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I started ReThink when I was just 13 years old. After experiencing and
witnessing cyberbullying, I knew that I couldn’t be a bystander -- I had to be
an Upstander. I was particularly passionate about ensuring that we weren’t
putting the burden on cyberbullying victims -- or on youth, generally -- to
respond to online hate; instead, I wanted a system that was safe by design.
The result was ReThink, a solution that
stopped cyberbullying before it happened.
My advice to fellow young people passionate
about building a better internet is to know
how powerful your voice is. So many young
people wonder, “What can I do? What can I
offer?” In fact, we young people are uniquely
positioned to build the digital world that we
want to live in. So go for it! Make your impact.

What backgrounds or voices would you like


to see more of in the Responsible Tech
ecosystem?
Young people and people with lived
experience. In this space, there’s often a lot
of talking about young people without young
people. It's time to change that. Bring us in,
and don’t stop there. Give us the funds,
legitimacy, agency, and backing we need to
pursue our own ideas. Create ecosystems
that invite more young people to speak up
and take action. Ensure that it’s diverse

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 79


YOUTH, TECH, AND WELLBEING

(Interview with Trisha Prabhu - continued)

youth -- from all walks of life -- that are contributing to this work. It’s equally
crucial that we invite more people with lived experience with harm into this
work. These folks, more than anyone, know how the system is failing. They
know what needs to change. We need to learn from them; we need to center
their perspectives and insights. In doing so, we can ensure that they are the
last to suffer.

What is your vision of a better tech future and how can we move towards
that vision?
My vision of a better tech future is centered around an internet that is kind,
affirming and inclusive. It’s an internet where young people are accepted and
celebrated as they are. It’s an internet that young people help to
create/sustain. In that vein, in this future, young people are at the forefront of
the movement to build a just digital experience. Whether it’s in trust and
safety-related work, policy work, or technology innovation, young people
have the agency and support they need to construct the world that they want.
As a young person I recently spoke to said, “Wouldn’t that be awesome?”

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 80


YOUTH, TECH, AND WELLBEING

Sonia Livingstone
LSE Professor and Director,
Digital Futures for Children
“So many young people wonder, ‘What can I do? What can I
offer?’ In fact, we young people are uniquely positioned to
build the digital world that we want to live in. So go for it!
Make your impact.”

What is the biggest tech & society issue we


are currently facing?
There’s an ever-growing list of difficult tech
& society problems to be tackled, but
underpinning many of them is the fact that
tech has become infrastructural in everyday
life. In short, often unaccountable and
powerful companies provide the digital
products and services for information,
relationships, health, education, work,
welfare, justice, politics, intimacy and more.
Throughout human history, these have
mostly been either personal (and hence
unobserved, and private to us) or public
(and hence accountable to public scrutiny and public interest). Now, they all
rely on data-driven and profit-hungry businesses, usually headquartered
outside our jurisdiction. No wonder the long list of problems often boils down
to problems with the business models of the tech sector, their associated
risks of surveillance and exploitation, and their evasion of the legislation and
regulation established by democratic governments. One paradoxical result is
that the platforms know a huge amount of their users to provide highly
personalised services in increasingly automated ways, on the one hand. And
yet, on the other, these same services do not (even, will not), in crucial ways
respect the needs, rights or circumstances of users as either individuals or

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 81


YOUTH, TECH, AND WELLBEING

(Interview with Sonia Livingstone - continued)

population segments – such as children or people with disabilities or


refugees or multiple other groups. This paradox both generates a host of
difficult tech & society problems and also greatly impedes finding the
solutions.

How does your role help tackle thorny tech & society issues?
I lead the Digital Futures for Children (DFC) centre, a joint LSE and 5Rights
centre which facilitates research for a rights-respecting digital world for
children. We support an evidence base for advocacy, facilitate dialogue
between academics and policymakers, and amplify children’s voices, in
accordance with the UN Committee on the Rights of the Child’s authoritative
statement, General Comment No. 25, on how to implement the UNCRC in
relation to the digital environment.

A core ambition of our work is that providers of digital products and services
that impact children in one way or another should find ways to keep children
in mind throughout their work and embed children’s rights into their provision.
We have consulted innovators, practitioners, experts and children, learning
for example of designers’ everyday dilemmas about how to consult children,
meet the needs of different age groups, balance protection and participation,
and know when they have got it right. To find answers for them, we drew on
the collected wisdom of many rights-based, ethical and value-sensitive
organisations and we held a consultation with children.

The result is the DFC’s Child Rights by Design toolkit which guides providers
of digital products and services impacting children in ways to. Its 11 principles
are distilled from the articles of the UN Convention on the Rights of the Child:
1. Equity & Diversity, 2. Best Interests, 3. Consultation, 4. Age Appropriate, 5.
Responsibility, 6. Participation, 7. Privacy, 8. Safety, 9. Wellbeing, 10.
Development, and 11. Agency. See here

Of course, the task of embedding children’s rights in the digital environment


continues. We are now planning new research on multiple challenges posed
to children’s rights by digital innovation, seeking to engage with and input
into processes of policymaking at national and international levels.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 82


YOUTH, TECH, AND WELLBEING

(Interview with Sonia Livingstone - continued)

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I began as a social psychologist fascinated by questions of agency, voice and
power in our mediated world. My research has always taken a comparative,
critical and contextualised approach, examining how changing conditions of
mediation reshape everyday practices and possibilities for action. For the last
thirty years, I’ve focused on the digital lives of children and young people – at
home, with friends, at school and more broadly. My role as professor at the
London School of Economics and Political Science has supported me in
developing my research, teaching, networking and engagement in many
ways. We created a successful multidisciplinary Department of Media and
Communications. And more recently, with 5Rights Foundation, we’ve created
the Digital Futures for Children (DFC) centre to advance research and
advocacy at the intersection of children: rights: digital.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 83


YOUTH, TECH, AND WELLBEING

(Interview with Sonia Livingstone - continued)

There are several ways of telling the story of my career (so far!), but they
probably all centre on the combination of passion, purpose and hard work.
I’ve been very fortunate in my mentors and collaborators over the years. I do
find it hard to offer advice, because everyone is different, as are the
conditions of their lives. But I might say – figure out what you value, what you
can contribute, who can help you and who you can help. For more about me,
see www.sonialivingstone.net.

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
My focus is on children's rights in the digital environment. In pursuing that
aim, I tend to find that child rights experts are still figuring out the
technicalities and regulation of the digital environment, while digital rights
experts may not have spent much time thinking about children and the
specific issues that arise regarding their rights. So while I enjoy trying to
bridge these domains, my guess is that there are many different versions of
this challenge across the Responsible Tech ecosystem.

In other words, all the needed backgrounds and voices are probably present,
but they may not yet all be comfortably in conversation with each other.
There are lots of ways of asking questions, describing problems, weighing
evidence or defining concepts. It takes a lot of mutual discussion and
challenge to find our way forward.

What is your vision of a better tech future and how can we move towards
that vision?
This is the hardest question to answer. Probably, there are lots of visions of a
better tech future, and the most important way ahead, of course, is to consult
people - I would say, especially children and young people as they have the
greatest stake in the future, and also especially those who are more
marginalised or under-represented or least heard. Then we must think, better
for whom? And how would we decide what’s better? Really, the question is
not so much, what’s a better tech future but, what role should tech play in a
better future for all.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 84


AN EVOLVING ECOSYSTEM

Public Interest Tech


Public Interest Technology
Public Interest Technology Key aspects of public interest
technology include:
(PIT) aims to harness the
Cross-Sector Collaboration:
power of technology to Engaging experts from various
advance human rights, fields—such as law, engineering,
promote fairness, and social sciences, and government
—to tackle complex societal
improve the well-being of challenges through technology.
society as a whole. Equity and Justice: Addressing
inequalities and ensuring that
It refers to the development, technology benefits everyone,
deployment, and governance of especially those who have been
technology in ways that prioritize historically marginalized.
the public good, social justice, and Accountability and
ethical considerations. PIT Transparency: Implementing
encompasses a multidisciplinary practices that make the use of
approach, bringing together technology in public life
technologists, policymakers, transparent and accountable to
activists, and citizens to ensure that the public.
technology serves the needs of all Inclusive Participation: Involving
people, particularly marginalized or diverse communities in the
underserved populations. development and decision-
making processes related to
All Tech Is Human has been heavily technology.
involved in efforts to build and
develop PIT career pathways, from ATIH’s extensive Careers
our partnership with Fordham programming leans heavily into PIT,
University to identify the especially our talent matchmaking
competencies most relevant for PIT platform focused on connecting
roles to the multiple PIT career fairs social impact talent with impactful
we have been involved in hosting roles in the public sector.
and organizing at universities like
Stanford, Pepperdine, and the
University of Washington.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 86
PUBLIC INTEREST TECHNOLOGY

Key Terms & Definitions collection and its use are connected
Some terms and definitions that are to broader social and political
important for understanding Public concerns, and how data-driven
Interest Technology include the systems can be designed more
following: equitably. (Emory University)

Public Interest Technology: Public Digital Divide: The digital divide is


Interest Technology refers to a set the gap between those who have
of practices to design, deploy, and affordable access, skills, and
govern technology in ways that support to effectively engage online
advance the public interest. and those who do not. (National
Interdisciplinary by nature, it Digital Inclusion Alliance)
involves the ability to assess and
respond to the core ethical, legal, Digital Equity: Digital equity is a
policy, social, economic and political condition in which all individuals and
implications of technology. (New communities have the information
America) technology capacity needed for full
participation in our society,
Public Good: The phrase “tech for democracy, and economy. Digital
good” often begs the question, equity is necessary for civic and
“good for whom?”; PIT explicitly cultural participation, employment,
embraces the concept of lifelong learning, and access to
technology that is good for the essential services.
public and for improving public life.
Digital Human Rights: Human rights
Civic Tech: Civic technology, or civic apply both online and offline. Digital
tech, enhances the relationship technologies provide new means to
between the people and exercise human rights, but they are
government with software for too often used to violate human
communications, decision-making, rights. Data protection and privacy,
service delivery, and political digital identity, the use of
process. surveillance technologies, online
violence and harassment, are of
Data Justice: Data justice considers particular concern. (United Nations)
how questions about data, its
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 87
PUBLIC INTEREST TECHNOLOGY

Digital Safety: Digital safety is about


preventing and reducing harm in the
digital space. This includes
moderating illegal or harmful content,
driving responsible platform design
and governance, and empowering
users to tailor their online
experiences. (World Economic
Forum)

Human-centered Design: Human-


centered design is an approach to
interactive systems that aims to make
systems usable and useful by
focusing on the users, their needs
and requirements, and by applying
human factors/ergonomics, and
usability knowledge and techniques.
(International Organization for
Standardization)

Social Impact: Social impact is the


change (either positive or negative)
for people and communities that
happens as a result of a deliberate
activity or service. The term is
sometimes used as the reflection of
social and environmental outcomes
as measurements. (Stanford
University)

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 88


PUBLIC INTEREST TECHNOLOGY

Select Organizations and Presidential Innovation Fellows:


Key Programs “The Presidential Innovation Fellows
program is a competitive fellowship
Code for America: “Code for America program that pairs top innovators
is a 501 civic tech non-profit from the private sector, non-profits,
organization that was founded by and academia with top innovators in
Jennifer Pahlka in 2009, "to promote government to collaborate on
‘civic hacking’, and to bring 21st solutions that aim to deliver
century technology to government." significant results in months, not
Federal, state, and local governments years.”
often lack the budget, expertise, and
resources to efficiently deploy TechCongress: “TechCongress
modern software.” places computer scientists,
engineers, and other technologists to
Coding it Forward: “Coding it serve as technology policy advisors
Forward is a nonprofit for early-career to Members of Congress through our
technologists creating new pathways Senior Congressional Innovation
into public interest technology Fellowship (mid-career pipeline), the
through a summer fellowship Congressional Innovation Fellowship
program. Since 2017, CiF has placed (early-career pipeline), and the
691 Fellows at 80 local, state, and Congressional Digital Service
federal government offices Fellowship.”
nationwide.”
United States Digital Corps: “USDC,
New America’s Public Interest a part of the U.S. General Services
Technology University Network: Administration’s Technology
“The Public Interest Technology Transformation Services, aims to
University Network (PIT-UN) fosters create a government technology
collaboration between universities workforce that reflects the diversity of
and colleges to build the field of America. They offer early career
public interest technology and technologists the opportunity to work
nurture a new generation of civic- on some of our most pressing
minded technologists.” challenges.”

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 89


PUBLIC INTEREST TECHNOLOGY

United States Digital Response:


“USDR is a nonprofit, nonpartisan
organization that works alongside
governments at all levels to ensure
they have the capacity to meet the
Job Levels
public's needs. USDR places
According to ATIH’s job board,
experienced, pro-bono technologists
most Public Interest Technology
to work with government and
roles require 5-6 years (47%), 3-4
organizations responding to crises to
years (47%), and 7-9 years (27%).
quickly deliver critical services.

Job Titles
United States Digital Services: The
Data Scientist
United States Digital Service is a
Software Engineer
technology unit housed within the
Product Manager
Executive Office of the President of
Product Designer
the United States. It provides
UX Designer
consultation services to federal
Digital Designer & Accessibility
agencies on information technology.
Analyst
It seeks to improve and simplify
Research Engineer
digital service, and to improve federal
Deputy Director of Network
websites.
Services
Innovation Manager
18F: “18F is a digital services agency
Program Officer for Technology
within the Technology
Technical Director
Transformation Services department
Project Manager
of the General Services
Statewide Spatial Equity Analyst
Administration of the United States
Government. Their purpose is to
deliver digital services and
technology products.”

See a full list of organizations:


alltechishuman.org/public-interest-
technology.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 90


PUBLIC INTEREST TECHNOLOGY

Resources improved with student input, and


includes a series of custom “plays”
New America PIT-UN resources (New designed to help new students ramp
America): PIT-UN has provided up quickly to a Public Interest
programming and funding for Technology mindset.
member universities to grow public
interest technology on their What is PIT? workbook (New
campuses and in their communities America PIT-UN): With support from
since 2019. They provide a robust set a 2022 PIT-UN Network Challenge
of resources available to the public. Grant, Denise Ferebee (Rust
College) and Zina Parker (LeMoyne-
The Public Interest Technologist Owen College) led a group of faculty
(MIT): The Public Interest from five universities in creating a
Technologist is an online publication workbook for educators and
aimed at helping the MIT community students interested in public interest
think together about the social technology.
responsibilities of students, faculty,
staff and alumni who design and
implement technologies of various
kinds.

Spotlight on PIT (ATIH): A


compilation of information and
resources on PIT by All Tech Is
Human.

Datasets for Good (New America):


open datasets that can be used in
your development and public interest
technology research.

Policy Innovation Lab Playbook


(Carnegie Mellon University): The
PIL Playbook was designed and

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 91


PUBLIC INTEREST TECHNOLOGY

Fast Forward: Tech Nonprofit Breaking the Mold: Investing in Racial


Playbook: Fast Forward bridges the Diversity in Tech Report (Civil Rights
tech and nonprofit sectors to build Privacy & Technology Table): A report
capacity for tech nonprofits, so they for investors from Civil Rights Privacy
can scale solutions to our world’s and Technology Table
most urgent problems.
Case Studies in PIT (Harvard): three
Consumer Reports Innovation Lab: different case studies of public interest
CR empowers and informs tech research projects (PhD
consumers, incentivizes corporations dissertation)
to act responsibly, and helps
policymakers prioritize the rights and
interests of consumers to shape a ATIH IN ACTION
truly consumer-driven marketplace. All Tech Is Human hosted a
And, because software has radically Responsible Tech Mixer and
transformed the marketplace and Speaker Series in August 2023
consumers’ lives, CR launched that focused on how
Innovation Lab — to design, technology should be viewed
prototype and scale new solutions to as critical infrastructure. Lyel
the problems facing consumers Resner, Saima Akhtar, Matt
today. Mitchell, and Claire Yang
discuss the importance of
The PIT Landscape (Boston multistakeholder collaboration
University): sheds light on the and the ways inequitable
priorities of PIT-UN members, and infrastructure produces
opportunities for future growth. The inequitable outcomes. Watch
report draws on both an in-depth the panel discussion:
member survey and a broad scan of “Technology is Infrastructure”
related activities, academic programs
and research initiatives underway at
43 academic institutions that made
up the membership of PIT-UN as of
the summer of 2021.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 92


PUBLIC INTEREST TECHNOLOGY

Looking Ahead dozens of academics at universities


across the United States have been
Increased Focus on Equity building academic programs and
PIT explicitly centers the initiatives explicitly geared toward
experiences of historically PIT field building. At All Tech Is
marginalized groups who have been Human, we are able to connect
both targeted and neglected by these academic programs and the
technology. We expect to see an students who graduate from them to
increased focus on the importance the professional opportunities for
of diverse perspectives among which they have been prepared. As
those tasked with the responsibility these programs increasingly engage
of designing, developing, and in cross-sector collaboration,
deploying the technologies that preferred professional
impact the public. competencies are being identified,
which will lead to a workforce that is
Professionalization of the Field much better equipped for the
Public Interest Technology as a available roles.
professional field is relatively
nascent, but resources and talent Influx of Hiring in the Public Sector
have been pouring into the space, Due in large part to the White House
especially within academia and civil Executive Order on AI in October of
society organizations, leading to a 2023, a PIT hiring spree is
robust infrastructure and strong underway in the U.S. federal
foundation for the growing field. We government. This could lead to a
expect this steady investment will substantial increase in the number
result in a more comprehensive of Public Interest Technologists at
understanding of career pathways every level, as visibility for and
and a holistic professionalization of interest in these roles rises.
the field.

Increased Alignment Between


Academic Programs and PIT Roles
Primarily due to the foundational
efforts of New America’s PIT-UN,

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 93


PUBLIC INTEREST TECHNOLOGY

Afua Bruce
Author, The Tech That Comes Next;
Principal, ANB Advisory Group
“My vision for a better tech future is one where technology
truly centers equity and justice, enabling people to support
their communities, run their businesses, and explore their
interests.”

In your opinion, what is the biggest tech &


society issue we are currently facing?
One of the biggest tech and society issues
we currently face is the same issue we have
faced for the last several decades: systemic
exclusion from technical design,
development, and implementation
processes. Various technological and
societal structures marginalize some
communities, limiting their access to and
participation in tech development. As a
result, we see products that don’t work for
everyone, algorithms that harm certain
populations, technical infrastructure with
negative side effects on environmental and
personal health, and technical
implementations that prevent people from accessing resources to which they are
legally entitled.

How does your role help tackle thorny tech & society issues?
In my current role, I support organizations across sectors to develop and sustain
public interest tech projects. Whether working with funders to design and run
various grant and investment programs, or with nonprofits to develop strong
technical and organizational strategies, or with the private sector to build
responsible tech products and resources, I support leaders in tackling tech and

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 94


PUBLIC INTEREST TECHNOLOGY

(Interview with Afua Bruce - continued)

society issues through well-executed programs. I also do a fair amount of


speaking and training on tech and society issues – sharing my own insights
from studying and working across the growing field. Finally, I get to develop
partnerships and work in collaboration with a number of phenomenal leaders
in different sectors.

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I started my career as a software engineer at IBM. After taking a leave of
absence from IBM to get my MBA, I joined the FBI as a Special Advisor and
held several leadership positions in various science and technology strategy
and program management roles. Being at the FBI showed me how
intertwined technology, policy, and society are; I knew I wanted to do more
work at that intersection. My career moves from the FBI, to the White House,
to a think tank, to a nonprofit, and now to leading a consulting firm have been
driven by my desire to influence technology
from various angles. My advice to people
wanting to work in similar positions is to
identify what matters most to you, and be
open to working on those issues in different
forms. Once I realized that interdisciplinary,
collaborative, and inclusive science and
technology in the public interest mattered
to me, I was open to new partnerships and
new places.

What backgrounds or voices would you


like to see more of in the Responsible
Tech ecosystem?
All Tech is Human has done a good job of
including many voices in conversations
about how to build responsible
technologies. The community should
continue to work towards reflecting the
racial, ethnic, socio-economic, and ability

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 95


PUBLIC INTEREST TECHNOLOGY

(Interview with Afua. Bruce - continued)

diversity of the world. I would also like to hear more voices from a wider
range of science and technology fields. Especially with the growth of
generative AI, responsible tech affects everything from how spacecraft is
designed to how new materials are discovered to how drugs are developed
to how roads are constructed; subject matter experts from all these
disciplines should be included in the responsible tech ecosystem. Finally, we
need more storytellers and artists to help tell what’s possible with more
responsible technology and to help people imagine a better future.

What is your vision of a better tech future and how can we move towards
that vision?
My vision for a better tech future is one where technology truly centers equity
and justice, enabling people to support their communities, run their
businesses, and explore their interests. Realizing that vision requires
collaboration across sectors, including changemakers, technologists, social
impact organizations, and funders, to improve tech design and development
processes. Additionally, I would like to see more work done on sustainable
business models and funding structures for responsible tech work; without
advancements in this area, I worry we will see a number of great short-term
changes and successes, but fewer long-term, large-scale impacts.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 96


PUBLIC INTEREST TECHNOLOGY

Lyel Lacoff Resner


Visiting Faculty, Head of Public Interest
Technology Studio, Cornell Tech
“I think our responsible tech conversations often over-
emphasize the technology itself. We need to pay closer
attention to the incentive structures that drive its
development.”

What is the biggest tech & society issue we


are currently facing?
Ultimately, most technology companies are
still funded in a way that often places
enormous emphasis on short-term growth
and even conflates ‘valuation’ with societal
value.

How does your role help tackle thorny tech


& society issues?
My work with founders, GPs, LPs, and
students generally centers on getting
agreement that when we create technology
we create spreads of value — accruing to
some stakeholders and perhaps destroying
or extracting it from others. And while our precise values or politics may differ,
it’s incumbent that we endeavor to do an honest accounting of the spread of
value creation, destruction, or extraction that happens when we create
technology or technology companies.

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I think often about a story from Kurt Vonnegut. He was at the party of a
billionaire with the author Joseph Heller, and remarks that their host had made
more money in a single day than Heller had ever earned from
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 97
PUBLIC INTEREST TECHNOLOGY

(Interview with Lyel Lacoff Resner- continued)

Catch-22. And Heller responds, “Yes, but I have something he will never have
— enough.” When you work in tech the wealth that exists can be disorienting.
“Enough” is something different to everyone but truly knowing what is
enough allows you to build a career driven by your values. And looking back
on that truly is priceless.

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
These days, the Responsible Tech ecosystem seems to have generally
healthy levels of representation. I would like to see more asset owners and
allocators — namely LPs — be more intentional about engaging these voices.

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I think our responsible tech conversations often over-emphasize the
technology itself. We need to pay closer attention to the incentive structures
that drive its development.

In our recent SSIR piece, my collaborator Dr. Wilneida Negron and I framed it
this way: We need more models of mission-driven catalytic capital to support
tech companies committed to building a more just, equitable, and sustainable
private sector. We need values-driven capital to become exponentially more
active at the seed and startup stages,
to influence the ethos with which tech
companies are encouraged to grow.
Imagine, if this year’s YC crop of 400+
companies was inspired to create value
for public benefit, rather than simply
“build something that people want.”
The actions funders and impact investors
take in the next few years could help to
speed the type of transformation that’s
needed to ensure that the next
generation of Big Tech companies is
grounded in the public interest.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 98


PUBLIC INTEREST TECHNOLOGY

Oumou Ly
Senior Advisor for Technology and Ecosystem
Security, the White House
“My advice to others is to embrace complexity. Be curious,
push boundaries, and invest in relationships that challenge
and support your growth. Understanding how to
communicate across disciplines and find common ground
will make you an indispensable part of any team.”

Tell us about your role. How does your role


help tackle thorny tech & society issues?
As the Senior Advisor for Technology and
Ecosystem Security at the White House, my
work centers on creating stronger security
outcomes across the digital ecosystem. This
includes establishing guardrails for emerging
technology, including AI systems, and
strengthening cybersecurity across the
traditional software ecosystem. I am also
focused on strengthening the cyber and
technology workforce.
For example, my team authored the National
Cyber Workforce and Education Strategy and
are currently working to implement it. We also are working to implement the
National Cybersecurity Strategy and the President's AI Executive Order. The
aim of this work is to create stronger incentives toward security across all
elements of society to create and sustain opportunity for a greater cross-
section of Americans. These approaches help ensure that technology doesn’t
just optimize upon itself, but facilitates resilience, security, and public trust.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 99


PUBLIC INTEREST TECHNOLOGY

(Interview with Oumou Ly - continued)

Tell us about your career journey. How did your career grow, and what
advice would you give to others wanting to be in a similar position?
My first role in this space was in the U.S. Senate, where I was a member of
Leader Schumer's national security team. Part of my role was to advise the
Senator on nominations of senior national security leaders, including the
Chairman of the Joint Chiefs of Staff and the Secretaries of Defense, State,
and Homeland Security. I remember staffing the Senator for a meeting with
Secretary Ash Carter, who talked at length about technology, innovation,
cybersecurity and how they work together to create a holistic approach to
national security. It was one of the first times I had heard a leader so incisively
articulate a policy vision which drew on the interrelation of these traditionally
distinct policy domains. After that, I started to think about how I could build a
non-traditional, multidisciplinary career.

For me, career growth has come intentionally seeking interdisciplinary


opportunities and building friendships and relationships across sectors and
industries. Because the challenges we face draw on various domains, it's
important not only to have technical understanding, but an ability to bridge
diverse perspectives.

My advice to others is to embrace complexity. Be curious, push boundaries,


and invest in relationships that challenge and support your growth.
Understanding how to communicate across disciplines and find common
ground will make you an indispensable part of any team.

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
To address the complex challenges at the intersection of tech and society,
we need a broader range of voices in the responsible tech ecosystem. In
particular, I’d like to see more representation from underrepresented
communities—especially those who have historically borne the brunt of the
negative impacts of emerging technologies, such as marginalized racial and
ethnic groups, people with disabilities, and economically disadvantaged
communities. We also need a strong presence from non-technical fields like

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 100


PUBLIC INTEREST TECHNOLOGY

(Interview with Oumou Ly - continued)

sociology, anthropology, and ethics. Bringing in these perspectives helps


ensure that we’re not only building technologies that are technically sound,
but also equitable, inclusive, and aligned with our values.

What is your vision of a better tech future and how can we move towards
that vision?
Today, we are making decisions that will determine how technology shapes
the future. My vision for a better tech future is one where innovation isn’t just
about creating efficiencies but about improving lives across the board—
whether through better access to healthcare, education, or economic
opportunity. In this future, technology enables everyone to uplevel their way
of life.

To get there, we need to rethink how we develop technology. Security,


fairness, and inclusivity should be at the core of every innovation. This
requires collaboration across sectors—government, industry, and civil society
—working together to ensure technology serves the public good.
Moreover, I believe we'd be well served by measuring innovation more
holistically. By focusing on how technology can empower individuals and
uplift society, we can ensure that advancements are truly transformative.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 101


PUBLIC INTEREST TECHNOLOGY

Audrey Tang
Senior Fellow, Project Liberty Institute

“Stay curious and remain committed to the public interest.


Don't shy away from unconventional paths, as they often
lead to the most impactful innovations.”

What is the biggest tech & society issue we


are currently facing?
The erosion of trust and the rise of epistemic
chaos are the most pressing issues today.
This crisis is driven by the proliferation of
fraudulent behavior, fragmented information
ecosystems, and increasingly sophisticated
manipulative technologies. As trust erodes, it
has profound implications on democratic
processes, mental health, interpersonal
relationships, and our collective sense of
reality. When people struggle to discern
truth from falsehood or agree on basic facts,
meaningful dialogue becomes challenging,
making it difficult to address the global
challenges we face.

How does your role help tackle thorny tech & society issues?
As Taiwan’s first Digital Minister (2016-2024), I worked to bridge the gap
between technology and society. Through Presidential Hackathon, Ideathon,
Join.gov.tw, the Participation Officers network, and Alignment Assemblies,
we ensured that digital advancements enhanced democratic values and civic
engagement, leading Taiwan to top rankings in Asia on Internet Freedom
and Democracy indices. Now, as a Senior Fellow at the Project Liberty
Institute, I am extending this mission globally. My work focuses on shaping
ethical governance models for digital platforms that prioritize transparency,
user agency, and democratic participation.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 102
PUBLIC INTEREST TECHNOLOGY

(Interview with Audrey Tang - continued)

By collaborating with international experts, I aim to design decentralized


social networks that resist polarization and foster public trust, leading to a
more equitable and democratic digital future.

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
My career journey has been unconventional, driven by a passion for open
collaboration. I began coding at age 8 and left formal education at 14 to fully
engage in internet development. This path gave me a deep appreciation for
decentralized, collaborative efforts. A pivotal moment in my career was my
involvement with g0v (pronounced "gov zero"), a grassroots movement in
Taiwan focused on government transparency. This experience taught me the
importance of bridging the gap between technology and society to create
inclusive and accessible systems. Serving as Taiwan's Digital Minister, I
played a key role in shaping our response to COVID-19 and safeguarding
against cyber interference, grounded in civic participation and co-creation.
My advice: Stay curious and remain committed to the public interest. Don't
shy away from unconventional paths, as they often lead to the most impactful
innovations. Cultivate empathy and strive to
understand diverse perspectives. Always
remember that technology should empower
and bridge across communities, not divide
them.

What backgrounds or voices would you like


to see more of in the Responsible Tech
ecosystem?
I would like to see greater representation from
spiritual and art communities; these groups
bring a unique perspective that is often
overlooked in development and governance.
Spiritual communities offer deep insights into
ethics, mindfulness, and
the human condition, helping to ensure that
our advancements are aligned with values

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 103


PUBLIC INTEREST TECHNOLOGY

(Interview with Audrey Tang - continued)

that promote well-being and interconnectedness. Their focus on inner


development and ethical considerations can guide the creation of technology
that truly serves humanity. Artists, on the other hand, are natural visionaries
who can push the boundaries of how we think about technology and its
impact on society. Their creativity and ability to imagine alternative futures
can inspire more innovative and human-centered design. Integrating these
voices will help create a holistic approach to technology that honors the full
spectrum of human experience and fosters a deeper connection between
technology and the human spirit.

What is your vision of a better tech future and how we can move towards
that vision?
My vision of a better tech future centers on "pro-social media," a
transformative approach that prioritizes bridging divides over amplifying
conflicts. Unlike current social media, which often exploits emotional
reactions for engagement, pro-social media would use algorithms that
promote constructive dialogue and highlight common ground, even among
differing viewpoints.

Moving toward this vision requires advancements in language models to


create ranking systems that value curiosity, context, and consensus potential
over mere engagement. This shift would require a rethinking of how platforms
operate, focusing on interoperability between publishing platforms and a
choice of community-steerable recommendation engines. In addition to
technological changes, promoting "prebunking" — raising awareness about
polarization tactics — can empower communities to critically assess the
content they encounter. By equipping society with both the tools and the
mindset to resist divisive tactics, we can create a more resilient and cohesive
digital environment. Ultimately, this vision calls for collaboration between
governments, tech platforms, and civil society to build systems that genuinely
serve the public good.

Audrey Tang is one of Mozilla’s 2024 Rise25 honorees;


learn more about Rise25 at Rise25.mozilla.org.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 104


AN EVOLVING ECOSYSTEM

Cyber & Democracy


Cyber & Democracy
In parallel, governments must work
The cyber and democracy nationally to pass legislation that
vertical aims to foster promotes security and economic
technology that is secure, innovation and development that is
human-centered.
resilient, open, trustworthy,
and stable, and upholds Internationally, governments work
democratic and human with other countries through
rights principles and bilateral and multilateral channels to
promote cooperation on cyber
institutions. issues, build cyber capacity, further
digital rights, and work toward
Technology that aligns with this equitable digital access.
approach follows cybersecurity
best practices and is privacy- Civil society and academia also play
preserving and rights-respecting an important role through pushing
while promoting sustainable for both government and industry
economic innovation. accountability, conducting research
on cyber harms, being
Advancing secure technology for implementers, contributing to
democracy starts from the design standards setting and regulation to
phase and continues throughout its ensure they adhere to human rights,
lifecycle, including governance on offer cyber workforce development
both the private and public sector programs, advocating for
side. technology to take into account
vulnerable and marginalized
On the industry side, this means populations and other actions that
working to ensure that technology fill in societal gaps.
is not misused to carry out online
harms such as cybercrime and Due to the cross-sector nature of
attacks, influence operations, these issues, trust and partnership
disinformation campaigns, and development is essential. In cyber,
unlawful surveillance. as the saying goes, we’re only as
safe as the weakest link.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 106
CYBER & DEMOCRACY

Key Terms & Definitions Cyber Capacity Building (CCB): The


Global Forum on Cyber Expertise, a
Below are some definitions that a leading NGO in helping countries
reader might come across when build cyber capacity, defines cyber
learning more about cyber & capacity building as “equipping
democracy issues. The terms are not individuals and organizations with the
all-encompassing and are not meant knowledge, skills and tools they need
to be a definitive list. Instead, they to protect themselves and their
are meant as an entry point for digital assets.” At the community
anyone interested in learning more level, this can include offering
about these issues, regardless of cybersecurity training to interested
their background, and are meant to job seekers to training diplomats in
be accessible to the general public. cybersecurity issues to help shape
national strategies and discuss
Cybersecurity: At its essence, cybersecurity issues in multilateral
cybersecurity is protecting fora.
information and systems from
people with malicious intentions. Cyber Confidence Building
The goal of cybersecurity is to keep Measures (CCBMs): The
people safe, both digitally and from Organization for Security and Co-
real world harms. operation in Europe (OSCE), one of
the first intergovernmental agencies
Cyber Attack / Cyber Incident: to adopt cyber confidence building
Broadly, a cyber incident is the measures, broadly defines CCBMs as
attempt to gain unauthorized access actions that increase trust between
to systems to alter, destroy, or steal states by reducing the risk of
data. While there is debate about misunderstanding and escalation that
the exact definition and difference of could lead to conflict. One example
“incident” and “attack,” especially in of a confidence building measure is
international law, “incident” tends to an international directory between
indicate lower level impact than states that lists point of contacts for
“attack.” countries to help facilitate
communication and transparency.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 107


CYBER & DEMOCRACY

Cyber Crime: Generally, cybercrime Cyber Strategy: A plan detailing an


is when someone conducts illegal organization or nation state’s
activity with a computer. However, objectives in cyberspace and how
caution must be exercised when they plan to achieve it.
categorizing actions as “cybercrime”
and context must be taken into Cyber Resilience: The ability to
account since this term can recover after a cyber attack. As
sometimes be used by authoritarian cyber incidents become more
governments to suppress dissent and common, cyber resilience is
human rights. becoming a more widely accepted
concept in place of preventing
Cyber Diplomacy: Being a relatively cyber incidents.
new term, “cyber diplomacy” refers to
the act of discussing cyber-related Digital Rights: The concept that
matters in diplomatic contexts to human rights apply and must be
further nation state objectives. maintained on the internet.

Critical Infrastructure: Essential


services such as power, water, and
electricity which are now often
connected to the internet.

Cyber Norms: Generally accepted


behaviors in cyberspace that are not
legally binding. For example, not
using computers to illegally access
(“commit a cybercrime”) a nation’s
critical infrastructure.

Cyber Policy: Principles and


approaches to cyberspace, whether
an organization or a state, that
manage cyber risk while pursuing
strategic objectives.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 108


CYBER & DEMOCRACY

Select Organizations architecture, with a focus on the


rights of the individual. CDT supports
Below are some highlighted laws, corporate policies, and
organizations. To view a more technological tools that protect
comprehensive list, please visit the privacy and security and enable free
Cyber & Democracy vertical speech online.”
webpage at:
alltechishuman.org/cyber-and- CyberPeace Institute “We develop
democracy. programs to support communities
vulnerable to threats in cyberspace.
Access Now “Access Now defends By monitoring, assessing and
and extends the digital rights of communicating on the cyber threats
people and communities at risk. By to these communities, we can work
combining direct technical support, together with partners to better
strategic advocacy, grassroots protect them.”
grantmaking, and convenings such as
RightsCon, we fight for human rights EU Disinfo Lab “EU DisinfoLab is a
in the digital age.” young independent NGO focused on
researching and tackling
Atlantic Council “Driven by our sophisticated disinformation
mission of “shaping the global future campaigns targeting the EU, its
together,” the Atlantic Council is a member states, core institutions, and
nonpartisan organization that core values.”
galvanizes US leadership and
engagement in the world, in Institute for Security and
partnership with allies and partners, Technology “The Institute for
to shape solutions to global Security and Technology builds
challenges.” solutions to enhance the security of
the global commons. Our goal is to
Center for Democracy & Technology provide tools and insights for
“The Center for Democracy & companies and governments to
Technology is a 501(c)(3) working to outpace emerging global security
promote democratic values by threats. Our non-traditional approach
shaping technology policy and is biased towards action, as we build

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 109


CYBER & DEMOCRACY

trust across domains, provide See a full list of organizations:


unprecedented access, and deliver https://alltechishuman.org/cyber-
and implement solutions.” and-democracy.

National Democratic Institute “NDI


believes all people have the right to
live in a world that respects their
dignity, security, and political rights— Job Levels
and the digital world is no exception. According to ATIH’s job board,
NDI's Democracy and Technology most Cyber & Democracy roles
(DemTech) division seeks to foster an posted require 5-6 years (39%),
inclusive and global digital 3-4 years (36%), and 7-9 years
ecosystem in which: Democratic (25%).
values are protected, promoted, and
can thrive; Governments are more Job Titles
transparent and inclusive; and All Security Analyst
citizens are empowered to hold their IT Specialist
government accountable.” Cyber Security Analyst
Senior Product Manager –
Project Liberty “Project Liberty is Cybersecurity Risk &
stitching together an ecosystem of Controls
technologists, academics, VP Product Security
policymakers and citizens committed Cyber Threat Intelligence
to building a better internet—where Analyst
the data is ours to manage, the Engineering Manager,
platforms are ours to govern, and the Security Incident Response
power is ours to reclaim. ‘Project Senior Director of
Liberty is not a tech project,’ as Cybersecurity Programs
founder Frank McCourt told Kara Director of AI & Democracy
Swisher. ‘It’s a democracy project. Cyber Security Researcher
And we need everyone involved.’” Chief Information Security
Officer

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 110


CYBER & DEMOCRACY

Resources All Tech is Human. Strengthening


the Information Ecosystem (2024):
All Tech Is Human. Responsible Tech This short report synthesizes key
Mixer (August 2024): This panel findings and recommendations from
from All Tech Is Human’s Responsible All Tech Is Human’s collaboration
Tech Mixer looks at public interest with the Consulate General of
technology, including cyber Finland in New York in March 2024,
workforce development, and offers which brought together over 85
career advice to those looking to cross-sector stakeholders to discuss
break into the field. ways to foster information integrity.
The program looked at best
All Tech Is Human. Protecting the practices in building resilience and
Public through Cybersecurity relationships to advance trust in the
Cooperation (2024): This livestream information ecosystem while also
showcases how partnerships and exploring concrete ways to combat
community building help create a disinformation.
safer cyberspace through such
efforts as cyber capacity building
programs, workforce development
initiatives, and community building.
ATIH IN ACTION
All Tech Is Human. Cybersecurity for ATIH partnered with the
All: Social Impact Careers in Cyber Consulate General of Canada in
(2024): This livestream highlighted New York last March to host
speakers who positively impact Tech & Democracy: a Better
society through their cyber work Tech Future Summit. This was
without having to program or code. one of three gatherings done
They share how they use cyber to with the Consulate General of
keep the public safe, what type of Canada in New York. Read more
skills they draw on, and how they about the event.
found themselves in the cyber field.
This livestream is particularly geared
toward those who know nothing
about cybersecurity or cyber policy
and are looking to learn more.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 111
CYBER & DEMOCRACY

Access Now. Digital Security Center for Cybersecurity Policy &


Helpline: “Access Now’s Digital Law. Distilling Cyber Policy:
Security Helpline works with “Distilling Cyber Policy is the
individuals and organizations around podcast for those that want to follow
the world to keep them safe online. If and understand global public policy
you’re at risk, we can help you events and developments related to
improve your digital security cybersecurity. We separate the facts
practices to keep out of harm’s way. from the hype, and boil out the
If you’re already under attack, we wonky jargon so you can keep up
provide rapid-response emergency with the latest developments
assistance.” impacting the future of security.”

Access Now. #KeepItOn: “The Center for Strategic and


#KeepItOn coalition brings together International Studies. Cyber from
hundreds of civil society the Start: “The Cyber from the Start
organizations and our allies from podcast unveils the roots of today’s
around the world – in government, cybersecurity policies for critical
international institutions, media, the infrastructure, surveillance,
private sector, and beyond – to fight espionage, warfare and privacy.”
for an end to internet shutdowns.
Global Forum on Cyber Expertise. Center for Strategic and
International Studies. Inside Cyber
Cybil Knowledge Portal: “The Diplomacy: “Inside Cyber
globally owned one-stop knowledge Diplomacy presents a wide-ranging
hub brings together knowledge on and thought-provoking look at
international cyber capacity building.” international cybersecurity, its
challenges, and practices. Through
Article 19. Article 19 Podcast: “Our candid interviews with experts
podcasts delve into aspects of around the world, co-hosts Jim
freedom of expression; from Lewis and Chris Painter explore how
exploring the right to protest, to diplomacy and negotiation have
fighting for basic freedoms under shaped the field.”
repressive regimes.”

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 112


CYBER & DEMOCRACY

The Lawfare Institute / Brookings. NYU’s Center for Social Media and
Cyberlaw Podcast: The Lawfare Politics. CSMaP Newsletter: Learn
Podcast features discussions with about their latest work to strengthen
experts, policymakers, and opinion democracy by conducting rigorous
leaders at the nexus of national research, advancing evidence-based
security, law, and policy, including public policy, and training the next
cybersecurity and governance. generation of scholars.

Council on Foreign Relations. Cyber


Week in Review: “Digital and
Cyberspace Policy program updates
on cybersecurity, digital trade,
internet governance, and online
privacy bimonthly.”

Center on National Security at


Fordham Law. Center on National
Security Cyber Brief: “The Aon CNS
Cyber Brief is a weekly roundup of
cyber news that highlights
developments across the digital
domain, including cybersecurity law
and policy, the technology industry,
and cryptocurrency.”

EU Cyber Direct. EU Cyber Direct


Newsletter: Learn about the latest
publications from EU Cyber Direct on
policy support, research, outreach
and capacity building in the field of
cyber diplomacy.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 113


CYBER & DEMOCRACY

Looking Ahead especially important in critical areas


such as hospitals and government
Proliferation of Generative AI services.
With large language models (LLMs)
now accessible to the public, such AI-driven Cyber Threat Detection
as ChatGPT or Llama, there is AI and machine learning (ML) are
concern that these models will be being increasingly used to identify
used to help enhance cyberattacks. and mitigate cyber threats in real-
While threat actors have not yet time. By analyzing large datasets
used AI in a novel way to conduct and recognizing patterns, AI can
cyberattacks, this doesn’t mean they detect anomalies, identify malware,
won’t in the near future. and stop intrusions faster than
traditional methods.
Continued Cyber Workforce
Shortage To both contribute to the larger
While the cybersecurity talent pool cyber policy ecosystem while
has grown recently, the shortage meeting our community needs, the
still remains with 4 million cyber & democracy vertical will
professionals needed to fill the gap. continue to:

Increasing Governance Efforts Highlight the multidisciplinary,


Between such endeavors as the EU global nature of cyber policy:
Cyber Resilience Act, EU Digital We will continue to show that all
Operational Resilience Act, the backgrounds are needed in the
release of more guidance on SEC’s cyber policy space through our
Cybersecurity Disclosure Rules, and livestreams, in-person panel
the UN Cybercrime Treaty, just to discussions, and profile
name a few, we’re seeing cyber interviews while also
governance development at the considering global
national, regional, and global level. circumstances and contexts.

Shifting Emphasis to Resilience


With the rise in cyber incidents and
data breaches, the focus is shifting
from prevention to recovery. This is
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 114
CYBER & DEMOCRACY

Foster an open and accessible


cyber & democracy community:
A low barrier to entry will remain
a cornerstone of All Tech Is
Human’s cyber & democracy
programming. This includes
keeping programs free or at a ATIH IN ACTION
low cost and ensuring that In March 2024, All Tech Is
jargon doesn’t make its cyber Human partnered with the
and democracy resources Consulate General of Finland in
inaccessible. New York to co-host
Provide resources and Strengthening the Information
education for cyber policy: Ecosystem.
Through our job board, working
group, focus area webpage, and We brought together 75 experts
mentorship program, we will from across civil society,
provide an avenue for government, industry, and
individuals to learn about and academia to discuss methods to
work on cyber & democracy fortify information integrity, and
issues. The flexible structure best practices in building
allows both those looking to relationships to advance trust
break into the field and those and resilience in the information
looking to upskill to partake. ecosystem. We also explored
concrete ways to combat
As the saying goes, “you have to disinformation and strategic
see it to be it.” But once you see it, influence operations on both a
there still needs to be an on ramp technical and societal level.
that you can find. All Tech Is Human Read more
aims to do just that by highlighting
individuals and career pathways,
and providing free resources and
learning opportunities both within
and outside All Tech Is Human.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 115


CYBER & DEMOCRACY

Stéphane Duguin
CEO, CyberPeace Institute

“The convergence of accelerating technologies, AI,


Quantum, Brain Machine Interface, creates very fast
modification in human - tech interaction. This creates great
opportunity but also enormous threats.”

What is the biggest tech & society issue we


are currently facing?
We are facing a simultaneous convergence
and disconnection. The convergence of
accelerating technologies, AI, Quantum,
Brain Machine Interface, creates very fast
modification in human - tech interaction. This
creates great opportunity but also enormous
threats. If i would chose one, i’ll choose the
unacceptable risk of autonomous
cyberattacks.

How does your role help tackle thorny tech


& society issues?
I lead the CyberPeace Institute. We strive for
accountability in cyberspace. Independent and neutral, we investigate and
analyse the human impact and harm of cyberattacks. We deliver free
cybersecurity assistance, advance the enforcement of international laws and
norms and forecast digital threats to international peace and security.

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I worked for 25 years in law enforcement and 5 years in civil society. My
journey is investigating organised crime, especially cybercrime, and terrorist

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 116


CYBER & DEMOCRACY

(Interview with Stephanie Duggan - continued)

use of tech, and engaging the digital policy and tech community to do so at
scale. I grew from investigator to senior manager, and I led the creation of
innovative operational capabilities to address new threats: The European
CyberCrime Center, The EU Internet Referral Unit, the European Innovation
Lab and now, the CyberPeace Institute.

Advice: Never underestimate the value of fieldwork, and how it can help your
journey towards systemic changes. It is one thing to theorise about
cybercrime, it is something else to investigate, arrest, interview, and bring a
cybercriminal to justice. It is one thing to discuss about victims, it is something
else to be called at 2 o’clock in the morning by someone who lost everything
because of a cyberattack.

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
Representatives of victim groups or association. Experts in human behavior,
and how tech modifies it.

What is your vision of a better tech future and how can we move towards
that vision?
A tech future which will not be about tech for the sake of tech, but about how
it supports an ambitious human agenda.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 117


AN EVOLVING ECOSYSTEM

Tech Policy
Tech Policy

Tech policy, broadly Effective tech policy requires a


nuanced understanding of
speaking, is the regulation
emerging technologies, regulatory
and governance of current frameworks, and the global
and emerging technologies. implications of digital
This can happen through transformation. As technology
evolves, so must our policies,
company and industry adapting to new challenges and
policies as well as through opportunities to foster a safe,
government regulations and equitable, and innovative digital
future.
legislation.
To govern the development,
As technology touches so many deployment, and regulation of
aspects of society, we continue to technology in society, the
see tech policy intertwined with importance of well-crafted tech
technology’s impact on individuals, policy cannot be overstated.
society, and public interest. As a Internationally, tech policy must
result, the tech policy space is navigate the complexities of global
multidisciplinary. It can cover a wide governance. The cross-border
range of issues from advocacy, nature of the Internet and digital
freedom of expression, education, technologies means that policies in
health, and online safety. one country can have far-reaching
implications elsewhere.
Its dynamic intersection of International collaboration and
technology, law, and governance harmonization of regulations are
shapes our digital landscape by necessary to address global
balancing technological innovation challenges such as data flows,
with societal safeguards. It addresses cyber threats, and ethical use of
critical issues in data privacy, emerging technologies. Ultimately,
cybersecurity, AI ethics, and digital tech policy is about steering tech
inclusion, ensuring that technological development in a direction that
advancements align with public maximizes societal benefit while
interest and human rights. minimizing potential risks.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 119
TECH POLICY

Key Terms & Definitions Service briefer, a digital or online


platform is generally a computer
Content Moderation: The Trust & application or service that provides
Safety Professional Association content or service through the
defines content moderation as internet.
“process of reviewing online user-
generated content for compliance Digital/Online Products: Broadly, a
against a digital platform’s policies digital or online product is a product
regarding what is and what is not or service that utilizes software that
allowed to be shared on their provides a function for users.
platform.”
Emerging Technology: This may be
Content Policy: Generally, content developing a new technology or
policies are guidelines and roles existing technology that is used in a
that guide what can appear on different way.
digital and online platforms as set by
the company. Frameworks: The U.S. National
Institute of Technology (NIST)
Digital Governance: Within a global describes frameworks as indicators
tech policy context, according to of how technologies should be built
CSIS, digital governance is the and interrelate with each other.
“norms, institutions, and standards
that shape the regulation around the Regulation: Within tech policy
development and use” of conversations, this refers to
technology and the internet. There government imposing controls and
tends to be elements of politics, restrictions on how tech companies
power, and geopolitical interests act and the technologies that are
due to the ubiquity of technology. developed and deployed.

Digital Rights: The concept that Regulatory Body: A government


human rights apply and must be entity that is in charge of creating
maintained on the internet. rules, in this context, on tech
companies. However, not all
Digital/Online Platforms: According regulators are able to enforce the
to a U.S. Congressional Research rules, which is where a lot of tension
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 120
TECH POLICY

comes into play. reports about what goes on their


platforms and sharing them publicly
Risk & Compliance: Risk refers to or granting researchers access to
the function of an organization that platform data to better understand
seeks to understand potential the inner workings.
pitfalls that might hinder operations
or goals and its corresponding
response. Compliance refers to the
organization’s adherence to rules,
laws, and standards. The term “risk
& compliance” refers to the practice
of identifying potential risks to an
organization and developing
appropriate controls and
procedures to mitigate them while
ensuring compliance with relevant
laws and regulations.

Standards Setting: This is the act of


deciding on standards,
requirements, and or guidelines for
new and future technologies. In tech
policy discussions, standards
settings usually involve power,
politics, and geopolitical motives as
actors try to create rules that benefit
them and those they represent.

Transparency: This is the idea that


online platforms should share
information and/or access about the
platform with external audiences.
This may entail the platform writing

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 121


TECH POLICY

Select Organizations Digital Impact Alliance “Digital


transformation is moving at
Aspen Digital “A program of the breakneck speed. So, we focus our
Aspen Institute, we bring together energy on the opportunities – and
thinkers and doers from around the issues – that need the most
world to uncover new ideas and attention. Today, that’s how data
spark actions that empower can be unlocked to help people get
communities and strengthen access to services; policymakers
democracy. We engage and advance make informed choices; private
perspectives from industry, sector innovation to flourish; and in
government, and civil society to find all ways, fuel trust and
clarity in the chaos of public empowerment.”
discourse and chart a path forward.”
Internet Law & Policy Foundry “The
Center for Humane Technology “Our Internet Law & Policy Foundry is a
mission is to align technology with collaborative collection of early-
humanity’s best interests. We career Internet law and policy
envision a world with technology that professionals passionate about
respects our attention, improves our technology and disruptive
well-being, and strengthens innovation. The Foundry offers
communities.” members a platform for professional
development, constructive debate,
The Center for Democracy & and network-building within a cohort
Technology (CDT) “The Center for of skilled attorneys and policy
Democracy & Technology (CDT) is analysts eager to help shape the
the leading nonpartisan, nonprofit development of Internet law and
organization fighting to advance civil policy.”
rights and civil liberties in the digital
age. We shape technology policy, Knight First Amendment Institute
governance, and design with a focus at Columbia University “The Knight
on equity and democratic values. First Amendment Institute defends
Established in 1994, CDT has been a the freedoms of speech and the
trusted advocate for digital rights press in the digital age through
since the earliest days of the strategic litigation, research, policy
internet” advocacy, and public education. Our

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 122


TECH POLICY

aim is to promote a system of free content governance and


expression that is open and inclusive, mis/disinformation and privacy, and
that broadens and elevates public their impact on underrepresented
discourse, and that fosters creativity, communities, predominantly living in
accountability, and effective self- low- and middle-income countries.”
government.”
Tech Policy Press “We publish
Oxford Internet Institute, University opinion and analysis. At a time of
of Oxford “The Oxford Internet great challenge to democracies
Institute—founded in 2001—is a globally, we seek to advance a pro-
multidisciplinary research and democracy movement in tech and
teaching department of the University tech policy.”
of Oxford, dedicated to the social
science of the Internet. Digital TheBridge “Non-partisan
connections are now embedded in organization breaking down silos
almost every aspect of our daily lives, and connecting professionals across
and research on individual and technology, policy, politics, and
collective behaviour online is crucial society — building stronger, more
to understanding our social, collaborative relationships.
economic, and political world.” TheBridge community consists of
leaders and knowledge-seekers at
Tech Global Institute “Tech Global the intersection of tech, policy, and
Institute (TGI) is a policy lab with a politics. Our community is wide-
mission to reduce equity and ranging across the public and
accountability gaps between private sectors including federal
technology platforms and the Global elected officials, technical experts,
South. We work at the intersection of policy experts, investors, corporate
private technology companies, civil CEOs, academics, community
society and government to reduce organizers, government employees
equity gaps in the Global South. We and political rising stars across the
provide thought leadership, advocate country.”
for inclusive policies and legislations
and develop nuanced research on a
range of Internet and technology
topics, including artificial intelligence,
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 123
TECH POLICY

Tony Blair Institute for Global


Change “We help governments and
leaders turn bold ideas into reality.
We do it by advising on strategy,
policy and delivery, unlocking the
power of technology across all three.
Job Levels
We do it by sharing our insights, so
According to ATIH’s job board,
everyone can benefit. And we do it to
most Tech Policy roles posted
help build more open, inclusive and
require 5-6 years (40%), 3-4 years
prosperous countries for people
(34%), and 7-9 years (30%).
everywhere.”

Job Titles
Office of the Secretary-General's
Director, US AI Governance
Envoy on Technology “The Office of
Tech Policy Manager
the UN Secretary-General’s Envoy on
Senior Counsel, Intellectual
Technology is dedicated to
Property and Technology
advancing the United Nations’ digital
Associate General Counsel,
cooperation agenda, ensuring that
Privacy Data Governance
technological advancements benefit
Policy Researcher
all of humanity while mitigating
Senior Policy Adviser, Data
associated risks. The office serves as
Policy Analyst: AI and
a bridge between the UN system,
Emerging Technology
member states, the tech community,
Policy Manager - AI
and other stakeholders, promoting
Governance
partnerships and harnessing
Policy Enforcement Manager,
technology for the Sustainable
Trust and Safety
Development Goals (SDGs).”
Product Policy Specialist
AI Policy Director -
See a full list of organizations:
Government Relations
https://alltechishuman.org/tech-
policy.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 124


TECH POLICY

Resources Experience on the sidelines of All


Tech Is Human’s annual
All Tech Is Human. Tech & Responsible Tech Summit.
Democracy Report: “The Tech &
Democracy report will center around Bipartisan Policy Center. Tech On
four key topics - Misinformation and The Hill: “...This podcast dives deep
Disinformation, Digital Governance & into the day's top congressional
Policy, Online Radicalization and news, focusing on how the latest
Extremism, and Technology & bills, debates, and discussions in
Elections, as well as best practices Congress impact the ever-evolving
related to multistakeholder world of tech regulations. From data
collaboration, civil society privacy and antitrust concerns to
empowerment, and accountability. In cybersecurity and digital rights, we
addition, the report will feature profile break down complex legislative
interviews with 40+ individuals jargon into digestible insights,
actively working in the space, as well ensuring you stay informed and
as related resources from over 100 ahead of the curve. Whether you're
organizations at the intersection of a tech enthusiast, a policy wonk, or
technology and democracy.” just someone looking to understand
the implications of congressional
All Tech Is Human. All Tech Is decisions on the digital landscape,
Human, McGill University, and the this podcast is your go-to source.”
Consulate General of Canada in
New York’s Participatory
Democracy to Govern Big Tech: The
Canadian Experience: On September
14, 2023, All Tech Is Human, the
Consulate General of Canada in New
York, and McGill University’s Centre
for Media, Technology and
Democracy held the discussion
Participatory Democracy to Govern
Big Tech: The Canadian

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 125


TECH POLICY

Center for Democracy & Technology. disinformation and microchips, we


Tech Talk: “CDT’s Tech Talk is a explore how today’s technology is
podcast where we dish on tech and shaping our world — and driving the
Internet policy, while also explaining policy decisions, global rivalries and
what these policies mean to our daily industries that will matter tomorrow.”
lives.”
Tech Policy Press. The Sunday
Data & Society. Data & Society: Show: “The Sunday Show is the
“Presenting timely conversations weekly (and sometimes more often)
about the purpose and power of podcast from Tech Policy Press.”
technology that bridge our
interdisciplinary research with Carnegie Endowment for
broader public conversations about International Peace. Tracking
the societal implications of data and Technology Regulations in Africa:
automation.” “...The Africa Technology Regulatory
Tracker: the first continent-wide
Internet Law & Policy Foundry. Tech aggregate of digital economy laws,
Policy Grind: “On the Tech Policy policies, and regulations in Africa
Grind Podcast, we discuss the most developed by Carnegie’s Africa
pressing issues at the intersection of Program. Our digital economy
law and technology. We chat with framework is divided into four digital
friends and fellows of the Internet economy pillars.”
Law and Policy Foundry about their
perspectives on emerging topics in Center for European Policy Analysis.
tech law and policy. From AI to Transatlantic Tech Policy Tracker:
cybersecurity, internet governance, “CEPA’s Transatlantic Tech Policy
privacy, and more - join us weekly to Tracker charts the key tech policy
dig into the latest in tech policy!” and business developments around
the globe. From antitrust to
Politico. Politico Tech: “The telecommunications and artificial
POLITICO Tech podcast is your daily intelligence to European digital
download on the disruption that regulation, this interactive tool
technology is bringing to politics and allows users to search and find
policy. From AI and the metaverse to

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 126


TECH POLICY

news items compiled since the stakeholders across civil society,


beginning of 2020.” industry, platforms, and regulatory
bodies, this panel will illuminate the
Ranking Digital Rights. The Big Tech impacts of the Online Safety Act –
Scorecard: “Each year, Ranking and how we can all co-create in a
Digital Rights evaluates and ranks 14 more secure internet.”
of the world’s most powerful digital
platforms on their policies and All Tech Is Human & Crisp, A Kroll
practices affecting people’s rights to Business. Tech Policy Panel: “AI
freedom of expression and privacy.” Ethicist and Data Activist in
Residence Renée Cummings, Yoti
Tech Policy Press. Tech Policy Chief Policy and Regulatory Officer
Tracker: “Tech Policy Press is Julie Dawson, and Chief Intelligence
tracking laws and regulations, along Officer at Crisp, a Kroll Business
with government investigations and John-Orr Hanna discuss the
litigation, that will shape the rules and pathways to creating a more
accountability for tech companies. inclusive tech policy future.”
You can click on an item for more
information and use the 'filter by' tool
to narrow down the list by topic,
government, and type. Check back ATIH IN ACTION
often for new additions and updates All Tech Is Human hosted a
to existing items.” panel discussion focused on
how the Online Safety Act
All Tech Is Human. How Will the balances individual and
Online Safety Act Influence the collective privacy concerns at
Internet?: “In this expert panel, we our Responsible Tech London
explored how the Online Safety Act gathering last December.
balances individual and collective Watch now.
privacy and security concerns. We
also interrogated how new regulation
stands to hold existing tech
companies accountable while
encouraging competition. Based on
the insight of a wide range of
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 127
TECH POLICY

All Tech Is Human & Consulate All Tech Is Human & Consulate
General of Canada in New York. General of Canada in New York.
Responsible Tech Summit: Responsible Tech Summit: Shaping
Improving Digital Space: Held on Our Digital Future: “On Sept. 14,
May 20, 2023, All Tech Is Human 2023, at SVA Theatre, All Tech Is
collaborated with the Consul General Human, in collaboration with the
of Canada in New York to bring Consulate General of Canada New
together 120 leaders across civil York, held its annual Responsible
society, government, industry, and Tech Summit which, brought together
academia who are working toward over 280+ stakeholders from across
improving the health and vibrancy of civil society, academia, industry, and
digital spaces. These are the people government to discuss how to co-
and organizations focused on create a better tech future.”
reducing harms, expanding
education, and reimagining what an
ideal digital space aligned with the
human experience looks like.”

All Tech Is Human & Consulate


General of Canada in New York.
Tech & Democracy: A Better Tech
Future Summit: “On March 1, 2023,
120 individuals from across civil
society, government, industry, and
academia came together at the
Consulate General of Canada in New
York to discuss the opportunities to
co-create a tech future aligned with
the public interest and the trade-offs
necessary to achieve it. The
gathering coincided with the release
of All Tech Is Human’s Tech &
Democracy report.”

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 128


TECH POLICY

Looking Ahead Provide the latest discussion of


recent tech policy changes: Our
Increased Regulation mixers, summits, and
From the EU Digital Services Act livestreams provide our
and AI Act to the UK Online Safety community with the opportunity
Bill to the U.S. Executive Order on to hear the most up-to-date
Personal Data, policy makers and analysis of tech policy issues.
legislators are stepping up to Furthermore, our weekly
regulate the flow of data, algorithms, livestream series enables us to
and online platforms. deliver expert discussions on
breaking changes.
Influence of Geopolitics Share resources and create
From U.S. export controls to China space for aspiring tech policy
to legislation calling for ByteDance professionals: Between our
to divest from TikTok to growing mentorship program, job board,
concerns over a fragmented working groups, slack, and
internet, the past year has seen reports we host a variety of
geopolitical issues play out through resources for aspiring tech
tech policy issues. policy folks. We also plan to
launch a tech policy curriculum
Focus on Technology to Promote in 2025.
Economic Innovation
Countries all over the world are
looking to technology to drive
economic growth and their
governance decisions reflect this.
For instance, the EU-Japan data
flow agreement came into force
while the U.S. sought to prevent
data access to countries of concern,
both with the goal of fostering
economic prosperity.

To contribute to the tech policy


space while providing for the
community, we will continue to: ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 129
TECH POLICY

Create a space for tech policy


experts to discuss wicked and
thorny tech policy issues: Many
issues within the tech policy
umbrella are complex. For
instance, how do we balance ATIH IN ACTION
age verification with privacy? In September 2023, Author, Policy
How do we ensure global Advocate, and Professor at Columbia
majority voices are part of Law School, Tim Wu, joined Justin
governance conversations when Hendrix, CEO and Editor, Tech Policy
most big tech platforms are Press, to discuss antitrust legislation
located in the West? We pride and privacy at All Tech Is Human’s
ourselves on being able to Responsible Tech Summit at SVA
create an agnostic place where Theatre. Watch now.
individuals from different sectors
and backgrounds can have
vibrant discussions on
challenging topics.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 130


TECH POLICY

Eugenio V Garcia
Tech Diplomat, Ministry of Foreign Affairs of Brazil

“Global co-governance can only succeed with the


participation of a wide range of players in a multistakeholder
setting, with developing countries actively engaged.”

What is the biggest tech & society issue we


are currently facing?
So many challenges, especially for
developing countries and those who are
unconnected or underserved. Digital
exclusion is a major issue. Developing
countries are facing significant dilemmas as
they navigate the ongoing technological
revolution. For many of them, the promise of
technology still looks like a distant dream.
The rapid pace of technological
advancements has created a widening gap
between technologically advanced nations
and those still struggling with poor digital
infrastructure, outdated communication
networks, and a shortage of computing power. The lack of access to the
Internet, limited technical expertise, and insufficient investment in technology
education are major barriers to fully participating in the global digital
economy. This technological divide exacerbates existing inequalities, making
it harder for these nations to leverage technology for economic development
and social progress.

How does your role help tackle thorny tech & society issues?
As a Tech Diplomat, my role is to build bridges to connect people and
promote Tech for Good. Technology should be beneficial for everyone,

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 131


TECH POLICY

(Interview with Eugenio V Garcia - continued)

everywhere, respecting the rich cultural diversity of the world’s population


and the local needs of different societies across the globe. We need to
promote further dialogue and bridge the gap between technologists
(developers, engineers, experts from private companies) and policymakers
(leaders, politicians, diplomats, governmental officials, parliamentarians) to
explore opportunities for interconnection between these two worlds, with
special regard to policymaking in the digital domain.

How did your career grow, and


what advice would you give to
others wanting to be in a similar
position?
My job has been rising in
importance in a globalized digital
ecosystem. We need more experts
on board. Technology is too
important to be left to technologists
alone. This is why Tech Diplomats
need to be bilingual: speak the
languages of both diplomacy and
technology to bridge the gap
between technologists and
policymakers. Global co-
governance can only succeed with the participation of a wide range of players in a
multistakeholder setting, with developing countries actively engaged. Unless we
take the concerns of the majority of the world’s population seriously, reaping the
rewards of the technological revolution will be a privilege confined to a minority, or
worse still, controlled by a few hands.

What backgrounds or voices would you like to see more of in the Responsible
Tech ecosystem?
Global tech policymaking demands responsible strategies to prevent disturbing
scenarios, build commonly accepted rules and minimum standards, and foster
international cooperation to avoid strategic uncertainty. Predictability by means of
norm-setting is in everyone’s interest. Effective global governance means that

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 132


TECH POLICY

(Interview with Eugenio V Garcia - continued)

international issues are to be addressed in good faith by all interested parties,


following procedures commonly agreed upon to uphold the rule of law and
fairness. Tech Diplomacy will be increasingly called to manage the globalized
digital ecosystem as it stands today to reach political solutions that can
accommodate all views and concerns as much as possible. In this larger
debate, we need the participation of more scholars and policymakers of the
Global South.

These all-important issues concern all societies, and their consideration


should not be confined to a few influential actors, requiring instead political
will, inclusiveness, and more diverse representation in a plurality of settings.

What is your vision of a better tech future and how can we move towards
that vision?
In my view, Tech Diplomats from developing countries can play an active role
in a number of areas: pushing for international cooperation; engaging in
global policymaking; supporting efforts to ensure responsible use of new
technologies; exchanging views and coordinating positions; promoting a
common vision for the future; or joining forces with other partners in cross-
region institutions.

They can also embrace multi-stakeholderism; engage civil society, private


companies, researchers, and other stakeholders in bona fide cross-cultural
dialogues; pursue geographical and gender balance to ensure broad
representation at all levels; promote normative leadership; invest in capacity-
building to empower people and foster tech literacy among vulnerable
communities; seek inputs from marginalized groups and neglected
audiences; and learn from multidisciplinary perspectives by bringing new
voices to the table.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 133


TECH POLICY

June Okal
Senior Manager, Global Stakeholder
Engagement, Eastern and South Africa, ICANN

“I would love to see a world and a space where Africa and


the rest of the Majority World isn’t having ‘how do we catch
up conversations’ rather, we share experiences and
learnings based on our unique contexts and take up home -
bred solutions built for and by us.”

What is the biggest tech & society issue we


are currently facing?
Internet Fragmentation. While the world shifts
to AI as the next ‘frontier and buzzword of the
season’, majority of the world is still grappling
with little or no connectivity, poor energy
infrastructure to power the technology, lack
of digital skills, unaffordability of devices and
high costs of connection, unfriendly and
difficult business environments and core
socio - economic issues that are at the
forefront of their minds.

How does your role help tackle thorny tech


& society issues?
I am the primary representative of the Internet Corporation for Assigned
Names and Numbers (ICANN) in the Eastern, Southern African and Indian
Ocean countries and act as, what I like to call a translator between ICANN
and our community groups including governments, internet industry, and
media and communications professionals. We are uniquely positioned and
entrusted to advocate strongly for an even playing field, especially for Africa.
The Coalition for Digital Africa (www.coalitionfordigitalafrica.africa) is an

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 134


TECH POLICY

(Interview with June Okal - continued)

initiative that aims to bring together stakeholders to expand the Internet in


Africa with 14 regional and international partner organizations working
together to expand the reach of information, tackling vital issues such as
Internet accessibility, affordability, infrastructure, local content creation, and
capacity development. So far, we’ve had projects with impact and reach in 34
African countries (and intentionally expanding), supported the expansion of
ICANN Managed Root Server (IMRS) infrastructure ensuring internet users in
Africa now have more resilient and stable access to DNS Root Zone services,
with 80% Africa-based Internet DNS root queries to the IMRS now resolved in
the region, trained 22 African Members of Parliament from 15 countries, and
had them at an ICANN meeting for the first time for a collaborative workshop
on the role of ICANN, the importance of the DNS, how the Internet works and
the unique role legislators have in the ecosystem, amongst other impactful
projects.

How did your career grow, and what advice would you give to others
wanting to be in a similar position?
I always wanted to have across-the-board experience in the tech industry in
order to appreciate the unique challenges and nuances of each. For instance,
I started work in tech policy at the equivalent of an ICT Department, but for
the Government of Kenya supporting in setting up ICT Standards to be
adopted, then worked at the copyright regulator where we ratified rules on
copyright exemptions for the visually challenged as well as introducing the
first Kenyan legal instrument on intermediary liability, then engaged at a
leading think tank KICTANet that seeks to catalyze ICT reforms through a
multistakeholder approach, to offering legal and regional client advisory at a
boutique and specialist TMT law firm, o setting up the office and helping
protect clients intellectual for one of the largest IP firms globally, shifting to
Google Interfacing with a user base of +1B, then to a passive infrastructure
telecommunications provider, then at ARTICLE 19 advocating for freedom of
expression and information with a focus on the Domain Name System, to
Meta Connectivity advocating for high speed connectivity for Africa’s people
and most recently at Harvard’s Berkman Klein Center for Internet and Society.
The transition at each point has been exciting, some more seamless than
others, but because the foundation is the same, it is simpler to build upon it.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 135


TECH POLICY

(Interview with June Okal - continued)

Key learnings: Stakeholder engagement and policy work is a marathon, not a


sprint. It takes time, energy and resources to see the return on investment
and impact. Also, loyalty is to yourself; the human resource (you) is one of the
most important puzzle pieces in the organizational organization, be mindful of
your wellness and wellbeing - whatever that means to you. Lastly, clarity of
purpose and the why of work you seek to do is so important. When all else
fails, your why acts as a true north and offers guidance. Be good at what you
do, be kind and build a solid network.

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
Young, black and African-talented persons, especially women and non-binary
persons and more voices from the majority world. I recently spent a month in
Brazil and disappointingly in most of the rooms I was the only black African
female which felt like I had to represent the whole continent, our unique
context and experience in use and adoption of technology. I would love to
see a world and a space where Africa and the rest of the Majority World isn’t
having ‘how do we catch up conversations’ rather, we share experiences and
learnings based on our unique contexts and take up home-bred solutions
built for and by us. Several organizations continue to run programmes that
seek to bridge the gap and I would love to see more success stories of the
impact and return on investment for the benefit of the community.

What is your vision of a better tech future and how can we move towards
that vision?
A better leveled playing field across the global stage using tech as an enabler
with tangible socio-economic impact. Collaboration, collaboration,
collaboration. A friend Cecil Yongo recently tweeted (X’ed?) ‘ Privileged
Africans (like me) have a moral duty to do everything in their power to open
up opportunities for the brightest young Africans.’ and this goes beyond just
Africans, we need to create and welcome new voices in the space and go
against gatekeeper norms that have been rooted in the ecosystem. I deeply
believe in abundance and if the numbers on meaningful participation in tech
is anything to go by, there is still so much room for others to come in and take
up space. "If you want to go fast, go alone. If you want to go far, go together" -
African Proverb.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 136


TECH POLICY

Johanna Weaver
Founding Director, Tech Policy Design Centre
My advice to those wanting to enter the field? Don't be
scared of tech; it's everywhere and it's already part of your
job - whether you know it or not. It’s no longer possible to
be a credible policy practitioner if you don't have tech policy
literacy.

What is the biggest tech & society issue we


are currently facing?
We face a plethora of thorny tech & society
issues, but the root to solving them all is
awakening people to the fact that we each
have agency to demand that technology be
made differently. It is possible to design tech
that reenforces democracy, social media that
fosters connection and community, and tech
that helps address - not exacerbate - the
climate crisis. That is why a movement like
All Tech Is Human is so important - it's
growing the critical mass to think differently
about technology and its impact on our lives.

How does your role help tackle thorny tech & society issues?
Our mission is to shape technology for the benefit of humanity. We believe
technology shapes society, but we must remember that people have the
power to shape technology. This includes through its technical design, but
also through the laws, policies, and standards that govern the development
and use of tech. By harnessing each of these, we can shape a future where
humans, the environment, and technology thrive.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 137


TECH POLICY

(Interview with Professor Johanna Weaver - continued)

What backgrounds or voices would you like to see more of in the


Responsible Tech ecosystem?
We all use tech, so we can't solve complex tech and society issues if we
aren't providing space - and listening to - all voices. Tech policy should not be
made in silos, rather, it should be informed by the diverse stakeholders who
develop, use, and govern technology.

Co-designing solutions is at the core of everything we do at the Tech Policy


Design Centre. To get more granular, I'd particularly like to see more
amplification of Indigenous and first national perspectives on technology
governance; there is much we can learn about their long history of innovative
use of technology to care for community and environment. More generally, I
think we also need much louder voices in and from the Asia-Pacifc. We have
more than half of the world's Internet users, and yet we remain
underrepresented in tech policy conversations globally.

What is your vision of a better tech future and how can we move towards
that vision?
My vision is a future in which technology makes our lives better, not just
easier. It is possible to shape a future in which people, the environment and
technology thrive. But we can't be passive, we need to take action today to
create this postive future tomorrow.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 138


RESPONSIBLE TECH GUIDE

Moving Forward
Co-creating a Better Future

Over the next five years, All Tech Is Human will focus
intently on ensuring our tech future is designed collectively.
This realignment requires a paradigm shift in how society
designs, develops, and deploys technologies that deeply
impact people and our social fabric. Your voice is a
necessary part of this goal. This will allow for a better
approach to tackling thorny tech & society issues.

This is an ambitious goal, but one we are actively working towards by


building the world’s largest multistakeholder, multidisciplinary network in
Responsible Tech. With a network of 42,000 and growing, we have plans to
understand and activate hundreds of thousands of individuals over the next
two years, and millions after five years.

How are we doing this?


We are creating a different approach that is not top-down but
instead operates as a flywheel to understand and activate the
growing Responsible Tech community across disciplines and
geographic locations.

In other words, instead of a traditional unilateral communication


where your role is to distribute information, with All Tech Is
Human, your role is to actively participate in and contribute to the
hivemind of intelligence.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 140


Hurdles for Getting Involved

Where to start?
It can be overwhelming to understand what the Responsible Tech
ecosystem looks like and determine how to get more involved. Individuals
often get inspired by a book, a movie, an article, or a personal experience
as a catalyst to making a positive difference in our tech future; however,
they may struggle to understand how. The Responsible Tech Guide that
you are reading is designed to help guide you on the many ways that you
can grow in the Responsible Tech ecosystem.

Finding Community
It can oftentimes feel like a solitary pursuit for individuals who want to make
a difference, so it is essential to illuminate pathways into the ecosystem.
There are thousands of people and organizations committed to the
Responsible Tech movement. The Responsible Tech Guide shows you
numerous ways that you can build greater community through All Tech Is
Human’s activities, along with the hundreds of other incredible
organizations in the ecosystem.

Getting Support and Mentorship to Grow in Responsible Tech


Many struggle to find the necessary support to better understand the
ecosystem, expand their network, upskill, and find career opportunities in
Responsible Tech. Our activities at All Tech Is Human shine a light on these
issue areas and offer free educational resources, mentorship, and other
upskilling opportunities.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 141


ATIH Theory of Change

Our current approach to complex tech and society issues


is not working.

The speed of innovation is


CO
MM
greatly outpacing our ability NG
E

UN
to consider its impact, we do

A
CH

ITY
not have an appropriate mix
of backgrounds involved in
the process, and we do not

N
have enough knowledge-

IO
CA

EE

AT
R

sharing and collaborative RS


ED
UC
opportunities.

All Tech Is Human is building a better approach to how we can tackle


thorny tech & society issues. Structured as a complex adaptive system, we
leverage the collective intelligence of the community, diversify the
traditional tech pipeline, and move at the speed of tech.

We are designed to understand the ecosystem while influencing it at the


same time. Our wide range of activities provide perpetual insight and the
ability to reach thousands of individuals across civil society, government,
industry, and academia.

Change happens through our


community-building,
educational resources, and
career-related activities.
Community Education Careers

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 142


Launching a Supporters Network
All Tech Is Human was launched in 2018 and grew from a
bootstrapped grassroots organization to receiving its first
funding in 2021, which has massively expanded the reach
and overall activities that support the Responsible Tech
ecosystem.

To ensure our sustainability as the backbone of the Responsible Tech


movement, we are launching our Responsible Tech Supporters Network.
This network of funders, companies, and individuals will help provide the
necessary yearly funding to operate our numerous activities.

Activities include our large Slack community, Mentorship program, University


Network, Responsible Tech Job Board, working groups, regular in-person
gatherings, weekly livestream series, and upcoming courses.

Learn more about the Responsible Tech Supporters


Network

All Tech Is Human will launch its Responsible Tech Supporters


Network membership in early 2025. To learn more, please fill
out our interest form here. The supported activities are
centered on our three pillars of community, education, and
careers.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 143


Change in Action
The All Tech Is Human network is a proxy for the growth
of Responsible Tech and public interest tech movements.
From the start, our strategy has been to “find the others”
through convenings, collaborative platforms, and
amplification of community contributors’ work.

In July 2024, All Tech Is Human


released the Ecosystem Pulse
Report, a first-of-its-kind report that
explores the question, “Who are the
Humans of All Tech Is Human?” It
features key findings like:

The Responsible Tech


network’s growth to 42,000
individuals
41% of the network comprises
tech industry professionals
20% of the network holds
decision-maker leadership roles
in their companies
Public Interest Technology and
AI are key areas of interest and
expertise within the community

The top areas of expertise, which are diverse and overlapping, include
Responsible AI, Tech & Democracy, and Tech Policy.

Read more about All Tech Is Human’s vision to maintain a space that
depicts the diverse and global voices in Responsible Tech. Download the
Ecosystem Pulse Report here.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 144


“All Tech Is Human is fostering a
collaborative environment where diverse
voices can come together to drive
meaningful change in tech.”
Sandra Khalil
Associate Director, All Tech Is Human

In our Slack community, you will find They are researchers, builders,
channels to discuss each of our 6 policymakers, industry
focus areas, location-specific professionals, organizers, and
channels for hubs around the world, educators sharing and executing
and career-focused channels to post our vision of building a better future
job openings and share resources. for all.

For more information, including Examples of affiliate projects include


how to join our Slack community, growing All Tech Is Human’s
visit: alltechishuman.org/slack- thought leadership and convening
community. footprint, moderating our Slack
workspace and broader community
Affiliates Program ethos, and managing our community
In the fall of 2023, All Tech Is Working Groups and curriculum-
Human launched its inaugural building efforts (see below).
cohort of affiliates. We sought to
expand the Responsible Tech For more information and to apply
movement’s reach by appointing 42 to the next cohort, visit
trusted and dedicated community alltechishuman.org/affiliates.
members representing 11 countries
around the world.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 145


All Tech Is Human has held dozens of in-person gatherings in our
key hubs: New York City, Washington, D.C., San Francisco, and
London. In addition, groups are using our large Slack community to
independently organize gatherings across the globe!

Our official gatherings have brought together thousands of individuals across


civil society, government, industry, and academia who are committed to
tackling thorny tech & society issues. Quite simply: good things happen when
we unite a diverse range of stakeholders to learn from each other, share
knowledge, and find avenues to co-create a better tech future.

All Tech Is Human hosts a variety of gatherings: large Responsible Tech


Mixers, curated gatherings on specific topic areas, workshops, roundtables,
and our annual Responsible Tech Summit that draws a global audience.

2024 event registrations on Eventbrite to-date


(Source: Eventbrite)

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 146


All Tech Is Human specializes in uniting multiple stakeholders who
come together to surface values, tensions, trade-offs, and best
practices. Recent attendees have come from:

Civil Society
Ada Lovelace Institute, Accountable Tech, Alan Turing Institute, Algorithmic
Justice League, Berkman Klein Center, Center for Democracy & Technology,
Center for Humane Technology, Consumer Reports, Integrity Institute,
Encode Justice, Project Liberty, Thorn, World Economic Forum, World Wide
Web Foundation, and more.

Government and Multilateral Institutions


British Consulate in New York, Canadian Consulate in New York, European
Commission, Finnish Consulate in New York, Ofcom, United Nations, UNDP,
UNICEF, USAID, and more.

Industry
Amazon, Discord, EY, Google, Hinge, IBM, Meta, Microsoft, OpenAI, Oracle,
Pinterest, Reddit, Spotify, TikTok, Vimeo, Yahoo, and more.

Academia
Columbia University, Cornell Tech, Georgetown, Harvard University, New
York University, Oxford University, Princeton University, Stanford University,
and more.

Key Funders in Responsible Tech


Ford Foundation, Kapor Foundation, Oak Foundation, Omidyar Network,
Open Society Foundations, Patrick J. McGovern Foundation, Schmidt Futures,
Siegel Family Endowment, and more.

Journalists
Axios, Bloomberg, Fast Company, MIT Technology Review, New York Times,
Rest of World, Tech Policy Press, Washington Post, WIRED, and more.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 147


At All Tech is Human, we are deeply invested in educating
the next generation and providing free upskilling
opportunities. Our ongoing initiatives include:

Working Groups Mentorship Program


Six ongoing working groups Over 1,500 graduates from our
correlating with our six key free mentorship program,
focus areas, along with a where we place three
broader educational working mentees with a mentor for a
group to help inform our six-month duration. Learn
work. Learn more more

University Network Livestream Series


This network provides Held Thursdays at 1:00pm ET,
educational opportunities and our free weekly series
co-mentorship with student-to- illuminates a vibrant
student interactions and shared ecosystem and provides
learnings across the network. significant learning
Learn more opportunities for all. See all

Reports and Guides


Aside from our annual Responsible Tech Summit every
September, our organization releases 2- 4 reports per
year. Recent reports include Tech & Democracy, AI &
Human Rights, Improving Social Media, and the HX Report:
Aligning Our Tech Future With Our Human Experience.
Read our reports and guides.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 148


“Responsible Tech is a fast-growing and
ever-changing career space. Students need
insight into the latest trends. As this sector
expands, the UNet is pushing to include
voices previously left out, and regions
without a thriving tech sector.”
Steven Kelts
Director, Responsible Tech University Network, All Tech Is Human

Responsible Tech University Network

The Responsible Tech University ATIH-affiliated guest speakers


Network makes ATIH’s huge on-campus and online to talk
community of professionals a about their own career
resource for student growth. We pathways.
empower students by bringing them Opportunities for project-based
in contact with a diverse cross- learning, sourced from our
section of people in responsible extensive network across civil
tech. How do we do this? society, academia, government,
and industry.
Exclusive internship and Learning materials produced by
externship listings, building on ATIH working groups to bring
ATIH’s wildly successful the expertise of our community
responsible tech job board. into discussions on campus.
A student-focused mentorship Access to certifications and
program that links college badges from major providers.
students with early career
professionals. Get involved as a speaker, mentor,
Job-shadowing opportunities for or contributor! Email Steven at
advanced students to learn the steven@alltechishuman.org.
ropes in major organizations.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 149
“Working Groups are composed of
responsible tech practitioners from all
over the world. It’s a great way to meet
(and do meaningful work with) other
responsible tech practitioners!”
Sarah Welsh, PhD
Director, Mentorships and Working Groups, All Tech Is Human

Working Groups

Our Working Groups leverage the We currently have working groups


collective intelligence of the focused on the six key areas of
Responsible Tech community. We Responsible Tech:
combine individuals from a wide
range of disciplines across the Responsible AI
globe, merging the deeply Trust & Safety
experienced with those newer to Tech & Democracy
the ecosystem. Tech Policy
Public Interest Technology
We draw on our working groups to Youth, Tech, and Wellbeing
help inform topic-specific reports,
knowledge hubs, curricula, and Working group members can join a
more, in order to help shape the group according to their area of
future of the Responsible Tech expertise, or opt to learn
Ecosystem. something new alongside subject
matter experts.
These groups are a cross-section of
our community that is vetted and Check out the Working Groups
organized by the All Tech Is Human interest form to learn more.
team.
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 150
The Responsible Tech Ecosystem is made up of
individuals who come to this space through
different pathways, bringing unique skills,
perspectives, and diverse backgrounds. The
Careers vertical at ATIH is dedicated to helping our
community find their space in this ecosystem and
organizations find the right people to grow.

Responsible Tech Job Board

Our Responsible Tech Board hard-to-fill positions to ATIH’s


regularly showcases 400+ jobs from expansive network, giving them
across the world. This means visibility and connection to the right
everything from large organizations types of talent.
to small startups, governments to
civil society, for-profits to nonprofits. Our job board, along with all of our
Our curated list of opportunities career resources, recognize that
covers roles from across our six important change happens inside of
focus areas. the tech industry, from research and
oversight, and from re-imagining our
Our job board allows job seekers to tech future.
gain a better understanding of
what’s out there and search for Visit our Responsible Tech Job
social impact roles. Board and join our talent pool.

Additionally, All Tech Is Human’s


Responsible Tech Job Board allows
organizations free access to post

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 151


“The most common career path to a
Responsible Tech job is a nonlinear one. At
All Tech Is Human, we know that, and we
are working to bridge the gap between our
multifaceted talent with the social impact
organizations who need them. A career is
what happens in retrospect.”

Ali Feldhausen
Director of Career Development, All Tech Is Human

Talent Matchmaking

Through our wide range of activities, incorporated Tekalo, a tool built to


including our Responsible Tech Job bridge the gap between tech talent
Board, working groups, mentorship and impact-driven opportunities.
program, summits, livestreams,
reports, and Slack community of This program will:
10,000+ members, ATIH is uniquely Power more open roles and
positioned to understand both candidates for the talent pool
responsible tech talent and the Grow our services and support
organizations seeking to hire. for Responsible Tech orgs
Support the growth of a healthy
Leveraging that unique space, Responsible Tech ecosystem at
ATIH’s Talent Matchmaking the speed of tech
colleagues provide nonprofits, for-
profits, and government entities with Learn more about our
direct connections to the talent they Matchmaking services.
need to solve tech's thorniest
problems. This year, we are also
excited to announce that we have

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 152


“Running this program makes me feel
hopeful for the future because it shows how
important it is to so many people to do
meaningful work, build connections, and
affect change in their organizations. We
have a great community here!”
Sarah Welsh, PhD
Director, Mentorships and Working Groups, All Tech Is Human

Mentorship Program

All Tech Is Human developed this represent people from Public


free program in order to help build Interest Technology, Trust & Safety,
the Responsible Tech pipeline – by Responsible AI, and more.
facilitating connections and career
development among talented We are particularly grateful to the
students, career changers, and stellar mentors who contribute their
practitioners. time and expertise towards guiding
the next generation. Mentors can
The Mentorship Program is an expect an opportunity to shape the
impactful way to get involved, give interests of aspiring Responsible
back to the community, create a Tech practitioners, as well as
stronger network, and empower networking opportunities with other
aspiring Responsible Tech mentors. Mentees can expect
practitioners. career guidance and peer support.

To date, nearly 1000 mentees have Join the waitlist for our 2025
gone through the mentorship Mentorship Program.
program since its inception. Our
mentorship program cohorts
ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 153
Learn More, Take Action
After you read the Responsible Tech Guide, here are ways
you can stay involved:

01 Build community by joining our Slack community of over 10,000


members across 98 countries.

02 Apply to our free mentorship program, which has now mentored


over 1,500 people in Responsible Tech.

03 Attend one of our in-person gatherings, or find an independently-


organized gathering in your city.

04 Participate in our weekly livestream series, taking place every


Thursday at 1:00pm ET.

05 Are you a student? Join our growing Responsible Tech University


Network and help start an All Tech Is Human-affiliated club.

06 Join one of our seven open working groups, focused on our six key
areas of Responsible Tech, plus a general education group.

07 Stay abreast of the Responsible Tech ecosystem by joining our


newsletter.

08 Are you growing your career in Responsible Tech? Join our talent
pool and use our Responsible Tech Job Board, along with our
career resources and services.

09 Read our previous reports, along with upcoming reports focused on


Responsible AI, Trust & Safety, Public Interest Technology, and more.

10 Be on the lookout for our Responsible Tech 101 courses arriving in


2025, along with additional upskilling resources.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 154


RESPONSIBLE TECH GUIDE

Acknowledgments
Thank You
If we thanked everyone who has contributed in some way to
the Responsible Tech Guide, we’d have to add an entirely
new report. The Responsible Tech Guide has been informed
by our current working groups (over 500 people), previous
reports that have involved over 1,000 people in their
creation, our mentorship program with over 1,500
participants, and the 10,000+ members in our Slack
community who are regularly sharing resources, insights,
and feedback that we synthesize into the guide.

We would also like to thank and acknowledge the support we receive from
Siegel Family Endowment, Patrick J. McGovern Foundation, the Future of
Online Trust & Safety Fund, Mozilla, Schmidt Futures, and Oak Foundation.
We have also previously received support from the Ford Foundation and
Project Liberty.

Our Responsible Tech Guide is an evolving process that builds off of previous
versions. The 2024 Responsible Tech Guide is the fifth edition, as the first
guide was released in September 2020. In that time, we’ve activated
hundreds of volunteers who have helped shape the version you’re reading
today.

All Tech Is Human has been an organization with big ambitions to develop a
better approach to tackling thorny tech & society issues. Thank you to
everyone who has believed in our mission and helped spread the word.
Your participation in our various activities helps create a stronger hub of
knowledge to the benefit of the entire community. Thank you.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 156


RESPONSIBLE TECH GUIDE

All Tech Is Human Team


All Tech Is Human Team

David Ryan Polgar Rebekah Tweed Sandra Khalil Sherine Kazim


Founder & President Executive Director Associate Director Executive Director,
Strategic Ops

Sarah Welsh, PhD Ali Feldhausen Steven Kelts Josh Chapdelaine


Director, Mentorship Director, Career Director, Responsible Program Manager, Comms
and Working Groups Development Tech University Network and Special Projects

Elisa Fox, Matt Skomarovsky Alexis Crews Renée Cummings


Program Manager, Senior Fullstack Senior Fellow, Senior Fellow, AI, Data
Events and Cyber & Engineer Information Integrity and Public Policy
Democracy

Learn more about how we are tackling complex tech & society issues on
our website, or email us at hello@alltechishuman.org. You can also
submit press & media inquiries and download our press kit.

ALL TECH IS HUMAN | RESPONSIBLE TECH GUIDE 2024 | 158


N
D
•L
C
•D
YC
N

ALL TECH IS HUMAN | RESPSONSIBLE TECH GUIDE 2024 | XX


Let’s co-create a
better tech future!

You might also like