Paul Christiano
| Paul Christiano | |
|---|---|
| Thesis | Manipulation-resistant online learning (2017) |
| Doctoral advisor | Umesh Vazirani |
| Website | paulfchristiano |
Paul Christiano is an American researcher in the field of artificial intelligence (AI), with a specific focus on AI alignment, the subfield of AI safety research that aims to steer AI systems toward human interests. He serves as Head of Safety for the U.S. AI Safety Institute within NIST. He previously led the language model alignment team at OpenAI and went on to found and head the non-profit Alignment Research Center (ARC), which works on theoretical AI alignment and evaluations of machine learning models. In 2023, Christiano was named one of the TIME 100 Most Influential People in AI (TIME100 AI).
In September 2023, Christiano was appointed to the UK government's Frontier AI Taskforce advisory board. Before joining the U.S. AI Safety Institute, he was an initial trustee of Anthropic's Long-Term Benefit Trust.