Clinical Advisor
We're teaching machines to recognize when someone is in crisis. That's a hard problem, and getting it wrong has real consequences. We need someone who's sat with people in those moments and can tell us when our system's intuitions are off.
This isn't clinical practice: no patient contact, no liability. It's advisory. Reviewing our risk taxonomy, gut-checking edge cases, telling us when we're overconfident or missing something obvious. You'd be the person we call when a classification feels wrong and we can't figure out why.
We're also aware that most crisis research is Western-centric, and that distress presents differently across cultures. We'd value someone who thinks about these gaps.
The work is mostly async. Annotating outputs, reviewing test cases, occasional calls when we're stuck. We're looking for someone who finds this problem genuinely interesting, not just professionally adjacent.
You'd be a good fit if
You've done suicide risk assessment in practice (psychiatry, crisis services, clinical psychology, whatever the context). You know what C-SSRS is and have opinions about its limitations. You're curious about what AI can and can't do here. You can explain why something matters to people who don't share your training.
Apply
Tell us about yourself. No cover letter needed—just the basics and anything you'd like us to know.
We read every application. If there's a fit, we'll reach out within 5 days. If not, we'll let you know. No ghosting.