What should I do with my idea for helping with AI alignment?
Maybe you've thought of a project that seems really valuable for someone to do, but nobody seems to be doing it. AI safety (a research field about how to prevent risks from advanced artificial intelligence) is still a small field, so there are many such gaps.
To make progress, you could go to Alignment Ecosystem Development to get more people involved with your idea, start an AI Safety Quest group for it, or find collaborators in some other way. It also helps to make yourself “discoverable”. For advice, you could talk to AI Safety Support, or visit EA hubs or attend EA Global to exchange ideas.

Consider using red-teaming or murphyjitsu to find the most likely ways your project could fail, so you can avoid them. If you’re not sure the project is really something you want to do, try goal factoring. If you are sure, you can apply for funding for your project from philanthropists interested in promoting AI safety.