Oliver Habryka
Funding for LessWrong.com, the AI Alignment Forum, Lighthaven and other Lightcone Projects
ampdot
Community exploring and predicting potential risks and opportunities arising from a future that involves many independently controlled AI systems
Ryan Kidd
Help us support more research scholars!
Greg Colbourn
Centre hosting EAs’ research projects and upskilling at ~1/3 of the average cost of funding such projects remotely
PIBBSS
Funding unique approaches to research, field diversification, and the scouting of novel ideas by experienced researchers, supported by the PIBBSS research team
Nuño Sempere
A foresight and emergency response team seeking to react quickly to calamities
Jørgen Ljønes
We provide research and support to help people move into careers that effectively tackle the world’s most pressing problems.
ALERT := Active Longtermist Emergency Response Team
Apart Research
Incubate AI safety research and develop the next generation of global AI safety talent via research sprints and research fellowships
Jordan Braunstein
Combining Kickstarter-style functionality with transitional anonymity to decrease the risk and raise the expected value of participating in collective action.
Joel Becker
Boosting advocacy for investment in and deployment of technologies for improving indoor air quality
Piotr Zaborszczyk
Reach the university that trained close to 20% of OpenAI’s early employees
PauseAI US
SFF main round did us dirty!
Grace Braithwaite
A Cambridge Biosecurity Hub and Cambridge Infectious Diseases Symposium on Avoiding Worst-Case Scenarios
Siao Si Looi
12 months of funding for 3 people to work full-time on projects supporting AI safety efforts
Camille Berger
A workshop and software for learning to handle disagreements effectively.
Orpheus Lummis
Non-profit facilitating progress in AI safety R&D through events
Support the growth of an international AI safety research and talent program
Florent Berthet
French center for AI safety
Angie Normandale
Seeding a business that finds grants and high-net-worth individuals beyond EA