Olivier Elemento, Director of the Englander Institute for Precision Medicine at Weill Cornell Medicine, shared a post on LinkedIn:
“Five hard problems in biomedicine becoming tractable with AI
Five years ago, protein structure prediction was biology’s hardest problem. AlphaFold changed that overnight. Here are five more problems where I think we’re seeing that same shift.
Virtual cells and organs with dynamic simulations
Why it’s hard: Understanding how cellular circuits work—how genes, proteins, and pathways interact dynamically—requires capturing responses to perturbations across contexts. That needs massive perturbation data and models that learn causal mechanisms, not just correlations.
What’s changing: Large Perturbation Models like GSK’s LPM learn general perturbation-response rules that transfer across cell types. These virtual cell models enable in silico therapeutic design—from drug combinations to engineered cell therapies.
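To make the modeling idea concrete, here is a minimal, purely illustrative sketch in PyTorch (this is not GSK’s LPM; the class name, dimensions, and setup are hypothetical): a cell-state encoder and a perturbation embedding feed a decoder that predicts the change in expression, which is how such models can reuse perturbation effects learned in one cell type when simulating another.

```python
import torch
import torch.nn as nn

class PerturbationResponseModel(nn.Module):
    """Toy model: predict post-perturbation expression from a baseline
    cell state plus a learned perturbation embedding."""
    def __init__(self, n_genes=2000, n_perturbations=500, d=128):
        super().__init__()
        self.cell_encoder = nn.Sequential(nn.Linear(n_genes, d), nn.ReLU())
        self.perturbation_embedding = nn.Embedding(n_perturbations, d)
        # The decoder predicts a *change* in expression, so perturbation
        # effects can transfer to cell states not seen during training.
        self.decoder = nn.Sequential(nn.Linear(2 * d, d), nn.ReLU(), nn.Linear(d, n_genes))

    def forward(self, baseline_expr, perturbation_id):
        z_cell = self.cell_encoder(baseline_expr)
        z_pert = self.perturbation_embedding(perturbation_id)
        delta = self.decoder(torch.cat([z_cell, z_pert], dim=-1))
        return baseline_expr + delta  # predicted post-perturbation profile

model = PerturbationResponseModel()
expr = torch.randn(4, 2000)          # 4 cells x 2000 genes (toy data)
pert = torch.randint(0, 500, (4,))   # perturbation indices
predicted = model(expr, pert)        # shape: (4, 2000)
```

Models of this kind are typically trained on large perturbation screens (for example, CRISPR or drug perturbations with single-cell readouts) and evaluated on held-out cell types and perturbations.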
In silico drug design and antibody engineering
Why it’s hard: Designing molecules that bind targets and work in living systems requires navigating enormous chemical space. For antibodies, structural stability and biological plausibility often conflict.
What’s changing: David Baker’s lab recently published RFdiffusion antibodies in Nature—designing functional antibodies entirely computationally, no animal immunizations needed. Methods like Germinal and mBER achieve 4-38% success rates. We’re moving from discovering drugs to designing them.
True multi-modal diagnostics
Why it’s hard: Diagnostic data is siloed—pathology in one system, imaging in another, genomics in a third. Real diagnosis requires synthesizing across all.
What’s changing: Multi-modal foundation models are beginning to integrate diverse data types into unified representations. The frontier is building systems that synthesize pathology, imaging, genomics, and labs across the full diagnostic spectrum.
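As a sketch of the fusion idea (illustrative only, not any specific published model; module names and sizes are invented), the code below encodes each modality separately, projects everything into a shared embedding space, and averages whatever modalities are available for a given patient:

```python
import torch
import torch.nn as nn

class MultiModalDiagnosticModel(nn.Module):
    """Toy fusion model: encode each modality separately, project into a
    shared embedding space, and pool into one patient representation."""
    def __init__(self, dims, d=256, n_classes=10):
        super().__init__()
        # One encoder per modality; `dims` maps modality name -> input size.
        self.encoders = nn.ModuleDict(
            {name: nn.Sequential(nn.Linear(dim, d), nn.ReLU()) for name, dim in dims.items()}
        )
        self.classifier = nn.Linear(d, n_classes)

    def forward(self, inputs):
        # Average the available modality embeddings so missing modalities
        # (common in real records) can simply be omitted.
        embeddings = [self.encoders[name](x) for name, x in inputs.items()]
        fused = torch.stack(embeddings).mean(dim=0)
        return self.classifier(fused)

model = MultiModalDiagnosticModel({"pathology": 512, "imaging": 256, "genomics": 1024, "labs": 64})
batch = {"pathology": torch.randn(2, 512), "genomics": torch.randn(2, 1024)}
logits = model(batch)   # works even with only two of the four modalities present
```

Averaging is the simplest possible fusion choice; published systems typically use attention-based fusion, but the core requirement the sketch illustrates, tolerating partially available modalities, is the same.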
Structuring unstructured EHR data
Why it’s hard: Clinical notes are filled with noise, abbreviations, contradictions, and tremendous variation. Converting this to structured data has been a decades-long challenge.
What’s changing: Large language models with agentic capabilities and tool use extract structured information from clinical text. This unlocks clinical trial matching, AI diagnostic assistants, and more. The challenge shifts from “can we” to “can we do it safely enough”.
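A minimal sketch of that extraction step, assuming a generic text-in, text-out model endpoint (call_llm here is a hypothetical placeholder returning a canned response, and the note and schema are invented for illustration):

```python
import json

NOTE = "62 yo F w/ HTN, T2DM on metformin. Denies chest pain. A1c 8.2 -> adjust dose."

PROMPT = f"""Extract the following fields from the clinical note as JSON:
patient_age (int), conditions (list of strings), medications (list of strings).
Return JSON only.

Note: {NOTE}"""

def call_llm(prompt: str) -> str:
    # Placeholder for whatever model endpoint is actually used; it returns a
    # canned response here so the sketch runs end to end.
    return '{"patient_age": 62, "conditions": ["hypertension", "type 2 diabetes"], "medications": ["metformin"]}'

def extract_structured(note_prompt: str) -> dict:
    raw = call_llm(note_prompt)
    record = json.loads(raw)
    # Minimal validation: safety-critical pipelines would also map terms to a
    # standard terminology and route low-confidence extractions to human review.
    assert isinstance(record.get("patient_age"), int)
    assert all(isinstance(c, str) for c in record.get("conditions", []))
    return record

print(extract_structured(PROMPT))
```

The validation step is where the "safely enough" question lives: the extraction itself is increasingly easy, while checking and auditing it at scale is the harder part.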
Autonomous surgery
Why it’s hard: Surgery on soft, variable tissue requires real-time adaptation to unexpected anatomy. Traditional programming can’t capture this complexity.
What’s changing: Hierarchical systems like SRT-H pair a high-level model that plans the next surgical step with a low-level model that executes the corresponding motions. By separating strategic planning from motor control, these systems handle complete procedures autonomously.
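A toy sketch of that plan/execute split (illustrative only, not SRT-H’s actual architecture; the class names and subtasks are invented):

```python
import random

class HighLevelPlanner:
    """Strategic layer: decides *what* to do next from the task state."""
    def next_subtask(self, completed):
        plan = ["expose_target", "clip_vessel", "cut_tissue", "retract"]
        remaining = [step for step in plan if step not in completed]
        return remaining[0] if remaining else None

class LowLevelController:
    """Motor layer: decides *how* to do it from the current observation."""
    def act(self, subtask, observation):
        # Stand-in for a learned visuomotor policy that outputs tool motions.
        return {"subtask": subtask, "tool_delta": [random.uniform(-1, 1) for _ in range(3)]}

def run_procedure(n_steps_per_subtask=3):
    planner, controller = HighLevelPlanner(), LowLevelController()
    completed = []
    while (subtask := planner.next_subtask(completed)) is not None:
        for _ in range(n_steps_per_subtask):
            observation = {"image": None}              # dummy sensor input
            command = controller.act(subtask, observation)
            # ...send `command` to the robot and read back new observations...
        completed.append(subtask)
    return completed

print(run_procedure())
```

The point of the split is that the planner can replan when the anatomy looks different than expected, while the controller only ever has to solve the local motion problem in front of it.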
These problems share a pattern: they require handling complexity—whether biological, physical, or buried in the data—that exceeded what traditional computational approaches could handle. AI is changing what’s feasible. The shift from intractable to tractable doesn’t mean solved, but we can now make progress where we were stuck.”