A smartphone-based system combines autofluorescence and white light imaging with machine learning for accurate identification of oral lesions that require referral to cancer specialists
Peer-Reviewed Publication
Analysis of an anatomic site using the mobile Detection of Oral Cancer (mDOC) model involves multiple inputs: images of clinically relevant regions are masked, cropped, and resized for analysis, then passed through the mDOC system along with oral cancer risk factors. The output is a referral decision, “Refer” or “Do Not Refer,” for oral cancer evaluation.
Credit: R. Mitbander et al., doi 10.1117/1.BIOS.2.4.042307
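The masking, cropping, and resizing described in the figure caption is a standard image-preprocessing step. Below is a minimal sketch of what it could look like, assuming a binary region mask and an arbitrary 224 × 224 target size; the library choices, function name, and dimensions are illustrative and not taken from the paper.

```python
# Illustrative preprocessing sketch (not the authors' code): keep only the
# clinically relevant region, crop to its bounding box, and resize to the
# fixed input size a downstream classifier would expect.
import numpy as np
from PIL import Image

def prepare_site_image(image: Image.Image, mask: np.ndarray, size=(224, 224)) -> Image.Image:
    """`mask` is a boolean HxW array marking the clinically relevant region."""
    rgb = np.asarray(image.convert("RGB"))
    rgb = rgb * mask[..., None]                      # zero out irrelevant pixels
    ys, xs = np.where(mask)                          # bounding box of the masked region
    crop = rgb[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    return Image.fromarray(crop).resize(size)        # fixed size for the model
```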
Oral cancer remains a serious health concern, often diagnosed too late for effective treatment—even though the mouth is easily accessible for routine examination. Dentists and dental hygienists are frequently the first to spot suspicious lesions, but many lack the specialized training to distinguish between benign and potentially malignant conditions. To address this gap, researchers led by Rebecca Richards-Kortum at Rice University have developed and tested a low-cost, smartphone-based imaging system called mDOC (mobile Detection of Oral Cancer). Their recent study, published in Biophotonics Discovery, evaluates how well this system can help dental professionals decide when to refer patients to oral cancer specialists.
The mDOC device combines white light and autofluorescence imaging with machine learning to assess oral lesions. Autofluorescence imaging uses blue light to detect changes in tissue fluorescence, which can signal abnormal growth. However, this method alone can be misleading, as benign conditions like inflammation also reduce fluorescence. To improve accuracy, the mDOC system uses a deep learning algorithm that analyzes both image data and patient risk factors—such as age, smoking habits, and anatomic location—to make referral recommendations.
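The article does not detail the network architecture, but a model that analyzes both image data and patient risk factors is typically built as a two-branch network whose embeddings are fused before the final decision. The sketch below is written under that assumption; the pretrained backbone, layer sizes, and number of risk factors are illustrative stand-ins, not the actual mDOC design.

```python
# Illustrative sketch of a two-branch "image + risk factors" referral model.
import torch
import torch.nn as nn
from torchvision import models

class ReferralNet(nn.Module):
    def __init__(self, num_risk_factors: int = 4):
        super().__init__()
        # Image branch: a generic pretrained CNN; the final fc layer is replaced
        # so the backbone returns a 512-dimensional image embedding.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        backbone.fc = nn.Identity()
        self.image_encoder = backbone
        # Risk-factor branch: a small MLP over tabular inputs such as age,
        # smoking status, and anatomic site (encoded numerically).
        self.risk_encoder = nn.Sequential(nn.Linear(num_risk_factors, 16), nn.ReLU())
        # Fusion head: concatenate both embeddings and output one logit.
        self.head = nn.Linear(512 + 16, 1)

    def forward(self, image, risk_factors):
        z_img = self.image_encoder(image)             # (batch, 512)
        z_risk = self.risk_encoder(risk_factors)      # (batch, 16)
        logit = self.head(torch.cat([z_img, z_risk], dim=1))
        return torch.sigmoid(logit)                   # probability of "Refer"
```

In practice the white light and autofluorescence views would both feed the image branch in some form (for example, encoded separately or stacked), a detail this sketch leaves out.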
In this study, researchers collected data from 50 patients at two community dental clinics in Houston, Texas. Each patient underwent imaging of up to five oral sites using the mDOC device. The images were reviewed by expert clinicians, and their referral decisions served as the ground truth for training and testing the algorithm. The team used a rehearsal training method, combining new data with previously collected images from high-prevalence and healthy populations to improve the model’s performance in typical dental settings, where suspicious lesions are rare.
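Rehearsal training, as described above, mixes previously collected examples back into each fine-tuning batch so the model adapts to the new low-prevalence setting without forgetting the earlier high-prevalence and healthy-population data. A minimal sketch of that batching idea follows; the 50/50 mixing ratio and data representation are assumptions for illustration, not values from the paper.

```python
# Illustrative sketch of rehearsal-style batching: each batch combines newly
# collected dental-clinic examples with "rehearsed" examples drawn from the
# previously collected datasets.
import random

def rehearsal_batches(new_data, old_data, batch_size=32, rehearsal_fraction=0.5):
    """Yield training batches; `new_data` and `old_data` are lists of
    (inputs, label) pairs. The split ratio is an illustrative assumption."""
    n_old = int(batch_size * rehearsal_fraction)
    n_new = batch_size - n_old
    random.shuffle(new_data)
    for start in range(0, len(new_data), n_new):
        batch = new_data[start:start + n_new] + random.sample(old_data, n_old)
        random.shuffle(batch)
        yield batch
```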
The final model was tested on a holdout dataset representing a low-prevalence population. It achieved an area under the ROC curve (AUC-ROC) of 0.778, with a sensitivity of 60% and a specificity of 88%. In other words, the system correctly flagged 60% of the sites that experts recommended for referral while avoiding unnecessary referrals in most cases. Notably, the mDOC algorithm outperformed the dental providers themselves, who had 0% sensitivity and 100% specificity, meaning they missed every case that required referral.
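For readers who want to run this kind of evaluation on their own data, the reported quantities can be computed directly from model scores and expert labels. The numbers below are made up, and the 0.5 threshold is only one possible operating point on the ROC curve.

```python
# Illustrative sketch of computing AUC-ROC, sensitivity, and specificity
# from model scores on a held-out test set (example data is synthetic).
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

y_true = np.array([1, 0, 0, 1, 0, 0, 1, 0])                    # 1 = expert said "Refer"
y_score = np.array([0.9, 0.2, 0.4, 0.3, 0.1, 0.7, 0.8, 0.2])   # model output probabilities

auc = roc_auc_score(y_true, y_score)                 # area under the ROC curve
y_pred = (y_score >= 0.5).astype(int)                # choose an operating point
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # fraction of expert-referral sites flagged
specificity = tn / (tn + fp)   # fraction of non-referral sites left alone
print(f"AUC={auc:.3f}  sensitivity={sensitivity:.2f}  specificity={specificity:.2f}")
```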
While the system misclassified two of five referral sites, those lesions had resolved by the time of the specialist visit, suggesting that mDOC may have correctly predicted that no further evaluation was needed. However, the algorithm also produced 21 false positives, indicating room for improvement in specificity.
The study highlights the potential of mDOC to support early detection and referral decisions in dental clinics, especially where access to specialists is limited. With an average imaging time of just 3.5 minutes, the system fits easily into routine dental workflows. Future improvements may include collecting more detailed patient history and refining the algorithm to reduce false positives.
For details, see the original Gold Open Access article by R. Mitbander et al., “Optimization of a mobile imaging system to aid in evaluating patients with oral lesions in a dental care setting,” Biophoton. Discovery 2(4), 042307 (2025), doi: 10.1117/1.BIOS.2.4.042307.
Journal: Biophotonics Discovery