What are some simple things I can do to contribute to AI safety?
Even if you are not in a position to make major life changes, here are some ways you can start contributing:
Learn More
To learn more about AI alignment, consider these options:
- Keep exploring our website.
- Complete an online course. AI Safety Fundamentals is a popular option that offers courses on both alignment and governance. There is also Intro to ML Safety, which follows a more empirical curriculum. Getting into these courses can be competitive, but all the material is also available online for self-study. There are also other courses you can take.
- Read books (we recommend The Alignment Problem), watch videos, or listen to podcasts.
Join the Community
Joining the community is a great way to find friends who are interested and will help you stay motivated.
- Join a local group for AI safety, Effective Altruism¹, or LessWrong. You can also organize your own!
- Join online communities such as Rob Miles’s Discord (where you can help write and edit this website) or the AI Alignment Slack.
- Write thoughtful comments on platforms where people discuss AI safety, such as LessWrong.
- Attend an EAGx conference for networking opportunities.
Here’s a list of existing AI safety communities.
Get Into Action - Donate, Volunteer, and Reach Out
Donating money, volunteering your time, or reaching out to decision-makers are all ways to support work on AI safety:
- Donate to AI safety projects.
- Help us write and edit the articles on this website so that other people can learn about AI alignment more easily. You can always ask on Discord for feedback on things you write.
- Write to local politicians about policies to reduce existential risk from AI.
- Join a protest group such as PauseAI.
If you don’t know where to start, consider signing up for a navigation call with AI Safety Quest to learn what resources are out there and to find social support.
If you’re overwhelmed, you could look at our more bite-sized suggestions.
¹ Not all EA groups focus on AI safety; contact your local group to find out if it’s a good match. ↩︎