A team of researchers at Mass General Brigham has developed an artificial intelligence tool that can analyze a selfie to estimate a person’s biological age and predict their likelihood of surviving cancer treatment. The algorithm, called FaceAge, uses facial features to gauge a person’s biological — not just chronological — age, potentially offering insight into overall health, disease progression, and even life expectancy.
In clinical tests involving over 6,000 cancer patients, FaceAge found that, on average, patients' biological age was about five years older than their chronological age — a red flag for poorer outcomes. Researchers also tested the system against predictions from 10 clinicians. The result? FaceAge outperformed the doctors in estimating survival outcomes for patients receiving palliative care.
The findings, published in The Lancet Digital Health, suggest that a simple photograph could someday help shape treatment plans for cancer patients, and possibly predict aging trajectories and health risks more broadly.
How it works
FaceAge was trained on more than 58,000 images of presumably healthy individuals. When tested on cancer patients, it assessed visible facial aging markers to produce a biological age score. Researchers found that patients whose faces looked younger than their chronological age fared significantly better in cancer treatment than those who appeared biologically older.
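The study does not publish FaceAge's code, but face-based age estimators of this kind typically pass a photograph through a convolutional network trained to regress an age value, which is then compared with the patient's chronological age. The sketch below, in Python with PyTorch and torchvision, illustrates that general pattern only; the model class, preprocessing choices, and weight file are illustrative assumptions, not FaceAge's actual implementation.

```python
# Illustrative sketch of a face-based age regression model.
# This is NOT FaceAge's published architecture or weights; names and paths are hypothetical.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

class FaceAgeEstimator(nn.Module):
    """CNN backbone with a single regression output: estimated biological age in years."""
    def __init__(self):
        super().__init__()
        backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        backbone.fc = nn.Linear(backbone.fc.in_features, 1)  # replace classifier with 1-value head
        self.backbone = backbone

    def forward(self, x):
        return self.backbone(x).squeeze(-1)

# Standard ImageNet-style preprocessing for a pre-cropped face photo (assumed input format).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def estimate_face_age(model: FaceAgeEstimator, photo_path: str) -> float:
    """Return the model's predicted biological age for a single face photo."""
    image = Image.open(photo_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        return model(batch).item()

# Hypothetical usage: the clinically relevant quantity in the study is the gap between
# the predicted biological age and the patient's chronological age.
# model = FaceAgeEstimator(); model.load_state_dict(torch.load("faceage_weights.pt")); model.eval()
# age_gap = estimate_face_age(model, "patient_selfie.jpg") - chronological_age
```

In practice, such a model would only produce a meaningful score after supervised training on a large labeled photo set, which in FaceAge's case the researchers report was more than 58,000 images.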
Dr. Hugo Aerts, director of the Artificial Intelligence in Medicine program at Mass General Brigham and a co-senior author of the study, said the goal is to provide doctors with objective data that augments — not replaces — human judgment. “A photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans,” he said.
More than skin deep
Dr. Ray Mak, another co-senior author, sees broader implications. “As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual’s aging trajectory,” he said. FaceAge could eventually be used for early detection of age-related diseases or even general lifespan predictions.
AI advocates say this kind of technology could help eliminate bias in health care by offering data-driven analysis of patient health, rather than relying on subjective impressions.
Experts urge caution
Dr. Harvey Castro, an emergency medicine physician and AI futurist not involved in the project, praised the tool’s potential. He said it formalizes what doctors have long called the “eyeball test” — the gut feeling of how healthy or frail a patient appears.
But he also warned of ethical and technical risks. “If the training data lacks diversity, we risk producing biased results,” he said. Additionally, privacy concerns loom large: Who owns your facial data? How is it stored? Could patients be psychologically harmed if told they look “older” than they are?
And there’s the risk of misusing the information. Doctors might change treatment plans based on a computer’s prediction, or insurance companies could seek access to this data for their own calculations.
Not a replacement for doctors
The researchers and outside experts agree on one point: AI should enhance — not override — clinical decisions. “It cannot replace the empathy, context, and humanity that define medicine,” Dr. Castro emphasized.
FaceAge is not yet approved for clinical use, and more research is underway to test its accuracy across diverse populations and diseases. But it opens a new frontier in what your face might reveal — beyond genetics or family history — about your long-term health.
For now, it’s a powerful reminder that how you age may be more than skin deep — and that modern medicine is getting a whole lot better at reading between the lines.