OpenAI Safety Fellowship Open Application: External Researchers Join AI Safety Research, Deadline May 3

Chain News ABMedia

OpenAI announces the opening of applications for the “OpenAI Safety Fellowship,” a pilot program for external researchers. It is designed to support independent AI safety and alignment research and to cultivate the next generation of AI safety talent.

Program duration and location

The Fellowship will run from September 14, 2026 to February 5, 2027, a span of roughly five months. The primary workspace is Constellation in Berkeley, California, where Fellows work alongside other researchers on site; applicants may also choose to participate remotely.

Priority research areas

OpenAI says the program will prioritize research questions directly related to the safety of existing and future systems, including:

Safety evaluation

Ethics

Robustness

Scalable mitigations

Privacy-preserving safety methods

Agentic oversight

High-severity misuse domains

OpenAI also emphasizes that it hopes the selected research will be empirically grounded, technically rigorous, and a useful reference for the broader research community.

Eligibility and benefits

The program welcomes applicants from diverse backgrounds, including computer science, the social sciences, information security, privacy, and human-computer interaction (HCI). OpenAI states that evaluation criteria prioritize research ability, technical judgment, and execution, rather than specific academic credentials.

Selected Fellows will receive:

A monthly stipend

Compute support

Ongoing guidance from an internal OpenAI mentor

API credits and other appropriate resources

Notably, Fellows will not receive access to internal OpenAI systems; research must be conducted independently as external work. The program requires letters of recommendation and expects each Fellow to produce concrete research outputs (such as papers, benchmarks, or datasets) before the program ends.

Application deadline: May 3

Applications are now open, with a deadline of May 3, 2026. OpenAI plans to notify applicants of results by July 25, 2026. For the application link and detailed eligibility requirements, refer to the official application form; questions can be emailed to openaifellows@constellation.org.

This article OpenAI Safety Fellowship opens for applications: External researchers join AI safety research, deadline May 3 first appeared on Chain News ABMedia.
