I’d like to get deeper into the AI alignment literature. Where should I look?

The AI Safety Fundamentals course is a great way to get up to speed on alignment. You can apply to take it alongside other students with a mentor, though be aware that many applicants are turned away. Alternatively, you can work through the materials independently, ideally with a self-organized group of others taking the course at the same time.

Other great ways to explore:

You might also consider reading Rationality: A-Z, which teaches reasoning skills that are valuable for anyone trying to think clearly about complex issues. The Rationalist's Guide to the Galaxy is a shorter, more accessible option focused specifically on AI.