Apart Research
Funding ends June 2025: Urgent support for proven AI safety pipeline converting technical talent from 26+ countries into published contributors
Centre pour la Sécurité de l'IA
4M+ views on AI safety: Help us replicate and scale this success with more creators
Oliver Habryka
Funding for LessWrong.com, the AI Alignment Forum, Lighthaven and other Lightcone Projects
Remmelt Ellen
Cost-efficiently support new careers and new organisations in AI Safety.
Ryan Kidd
Help us support more research scholars!
Connor Axiotes
Filming a feature-length documentary on risks from AI for a non-technical audience on streaming services
PIBBSS
Fund unique research approaches, field diversification, and the scouting of novel ideas by experienced researchers, supported by the PIBBSS research team
Jørgen Ljønes
We provide research and support to help people move into careers that effectively tackle the world’s most pressing problems.
Guy
Out of This Box: The Last Musical (Written by Humans)
Joep Meindertsma
Help the largest AI activist group grow
Epoch AI
Constance Li
Field building on AI's impact on nonhumans
Incubate AI safety research and develop the next generation of global AI safety talent via research sprints and research fellowships
Peter Wildeford
Michaël Rubens Trazzi
How California became ground zero in the global debate over who gets to shape humanity's most powerful technology
PauseAI US
SFF main round did us dirty!
Holly Elmore
Help me organize US activities promoting a moratorium on frontier AI, expanding the Overton window and increasing public pressure in its favor.
Request for Retroactive Funding
Distilling AI safety research into a complete learning ecosystem: textbook, courses, guides, videos, and more.