
Breast Tumor Ultrasound: Clinical Applications, Diagnostic Features, and Integration with AI

Majd Oteibi*1, Adam Tamimi2, Kaneez Abbas3, Gabriel Tamimi4, Danesh Khazaei5, Hadi Khazaei6

1,2,4Validus Institute Inc

3Athreya Med Tech

5Portland State University

6Portland State University/ Athreya Med Tech

*Corresponding Author

DOI: https://doi.org/10.51584/IJRIAS.2025.100800034

Received: 30 July 2025; Accepted: 05 August 2025; Published: 02 September 2025

ABSTRACT

Breast tumor ultrasound has become a cornerstone technique in both the screening and diagnosis of breast cancer due to its accessibility, non-invasiveness, and high sensitivity. This is especially true in dense breast tissue, where mammography may be less effective. Clinically, ultrasound is employed to distinguish between benign and malignant lesions, guide biopsies, and monitor treatment response. Essential diagnostic features include shape (irregular vs. round/oval), margins (non-circumscribed, spiculated), internal echotexture, and posterior acoustic phenomena. Benign lesions tend to appear smooth, well-circumscribed, and homogeneous, whereas malignancies often present as irregular, hypoechoic masses with non-circumscribed margins and posterior shadowing.

Recent advances focus on the integration of artificial intelligence (AI) with breast ultrasound. AI-enhanced ultrasound systems, including deep learning radiomics and commercial decision-support tools, can improve diagnostic accuracy, consistency, and efficiency. This approach facilitates earlier detection and helps reduce unnecessary biopsies, particularly for less experienced radiologists. Automated Breast Ultrasound (ABUS) combined with AI radiomics extracts high-dimensional image features, supporting predictive modeling for disease characterization and treatment planning.

The accompanying illustrations detail typical ultrasound appearances of benign and malignant breast masses, highlight key ultrasound features, and showcase an AI-assisted workflow. The included flowchart demonstrates the clinical algorithm: from initial patient presentation, through ultrasound image acquisition and feature characterization, to AI-integrated decision support and follow-up management steps.

This convergence of advanced imaging and AI integration marks a paradigm shift in breast cancer care, enabling more precise, timely, and cost-effective pathways from detection to management.

Keywords: Breast cancer, Ultrasound, Synthetic Tissue, AI technology, Innovation

Figure 1.  Image summary of the steps of breast ultrasound imaging and machine learning process.

INTRODUCTION

Breast cancer is the most common cancer in women globally and the second leading cause of cancer death in women. In 2022, there were 2.3 million new cases of breast cancer worldwide, and death rates were highest in low-income countries due to later diagnosis and more limited access to care.

The breast ultrasonography techniques discussed in this article therefore show promise in the search for more accessible, less intrusive, and more affordable breast cancer screening methods.

Ethics Statements

Animal Studies:

This manuscript does not present any information involving animal subjects.

Human Studies:

No human subjects were involved in this manuscript.

Financial Disclosures:

There are no financial disclosures to make for this manuscript.

Conflict of Interest:

No conflict of interest is presented in this manuscript.

Funding Information:

No funding was received to develop this manuscript.

Disclaimer:

The following manuscript explores an original experiment conducted by the Validus Institute Inc team and collaborators from the Materials and Mechanical Engineering lab at Portland State University. The experiment is original, and not all aspects discussed will have references to prior published articles. All images marked as courtesy of Validus Institute Inc were taken by Validus Institute Director Dr. Majd Oteibi to document the progression of this research project.

Overview of Breast Ultrasound Modalities and Clinical Applications

Ultrasound imaging is a non-invasive diagnostic tool that uses high-frequency sound waves to create real-time images of internal structures. It is widely used to assess and differentiate between benign and malignant tumors, particularly in organs like the breast (Table 1).

Table 1. Ultrasound Features of Benign vs. Malignant Breast Tumors

| Feature | Benign Lesions | Malignant Lesions |
|---|---|---|
| Shape | Oval or round | Irregular, lobulated |
| Margins | Smooth, circumscribed | Indistinct, spiculated, angular |
| Orientation | Wider-than-tall (parallel to skin) | Taller-than-wide (non-parallel; perpendicular growth) |
| Echogenicity | Isoechoic or hyperechoic; homogeneous | Hypoechoic; heterogeneous |
| Posterior Features | Posterior acoustic enhancement (bright behind) | Posterior shadowing (dark behind), or none |
| Internal Echo Pattern | Uniform, possible cystic areas | Heterogeneous; internal echoes vary; may show necrosis |
| Calcifications | Coarse macrocalcifications (if present) | Microcalcifications (tiny, punctate, suspicious) |
| Vascularity (Doppler) | Minimal or peripheral | Increased internal or central vascularity |
| Compressibility | Compressible | Non-compressible |
| Mobility | Mobile on palpation/ultrasound probe pressure | Fixed or less mobile |
| Elasticity (if elastography is used) | Soft or intermediate | Stiffer than the surrounding tissue |
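The checklist in Table 1 lends itself to a simple computational encoding. The sketch below is a hypothetical Python illustration of a rule-based suspicion count, not part of this study's workflow; the field names and equal weighting are assumptions made for demonstration only.

```python
from dataclasses import dataclass

@dataclass
class LesionFeatures:
    """Sonographic features adapted from Table 1 (hypothetical encoding)."""
    irregular_shape: bool            # irregular/lobulated vs. oval/round
    non_circumscribed_margins: bool  # indistinct, spiculated, or angular
    taller_than_wide: bool           # non-parallel orientation
    hypoechoic_heterogeneous: bool   # echogenicity pattern
    posterior_shadowing: bool        # posterior acoustic feature
    microcalcifications: bool        # tiny, punctate calcifications
    internal_vascularity: bool       # increased central flow on Doppler

def suspicion_score(f: LesionFeatures) -> int:
    """Count malignant-leaning features (0-7); higher suggests more suspicion."""
    return sum([
        f.irregular_shape,
        f.non_circumscribed_margins,
        f.taller_than_wide,
        f.hypoechoic_heterogeneous,
        f.posterior_shadowing,
        f.microcalcifications,
        f.internal_vascularity,
    ])

# Example: an irregular, shadowing, taller-than-wide hypoechoic lesion
lesion = LesionFeatures(True, True, True, True, True, False, False)
print(suspicion_score(lesion))  # -> 5 malignant-leaning features
```

Such a count is only a teaching aid; actual interpretation weighs features in context and culminates in a BI-RADS assessment, as described below.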

Table 2. Summary of breast lesion types and how each can appear on ultrasound

Typical Benign Lesions (Examples):

  • Fibroadenoma: Oval, smooth, hypoechoic, wider-than-tall, with gentle lobulations.
  • Cyst: Anechoic (black), posterior enhancement, well-defined.
  • Lipoma: Isoechoic or hyperechoic, compressible, thin capsule.

 Typical Malignant Lesions (Examples):

  • Invasive Ductal Carcinoma (IDC): Irregular shape, spiculated margins, hypoechoic, posterior shadowing, taller-than-wide.
  • Invasive Lobular Carcinoma (ILC): May appear subtle or ill-defined, often hypoechoic with minimal shadowing.

Figure 2. Ultrasound images of different breast tissues, showing malignant vs. benign clinical correlation

The Breast Imaging-Reporting and Data System (BI-RADS), developed by the American College of Radiology, serves as a standardized framework for interpreting and reporting breast imaging findings, including ultrasound examinations. The BI-RADS ultrasound classification categorizes lesions based on their sonographic characteristics, providing an estimate of the likelihood of malignancy and guiding appropriate clinical management. This system enhances communication between radiologists and referring clinicians and facilitates consistency in decision-making and follow-up recommendations. Table 3 summarizes the BI-RADS ultrasound categories, associated risk of malignancy, and corresponding clinical actions.

Table 3. BI-RADS Ultrasound Classification (Breast Imaging-Reporting and Data System)

| Category | Description | Risk of Malignancy | Typical Action |
|---|---|---|---|
| BI-RADS 1 | Negative | 0% | Routine screening |
| BI-RADS 2 | Benign finding | 0% | Routine screening |
| BI-RADS 3 | Probably benign | <2% | Short-interval follow-up (e.g., 6 months) |
| BI-RADS 4A | Low suspicion for malignancy | ~2–10% | Recommend biopsy |
| BI-RADS 4B | Moderate suspicion | ~10–50% | Recommend biopsy |
| BI-RADS 4C | High suspicion | ~50–95% | Recommend biopsy |
| BI-RADS 5 | Highly suggestive of malignancy | >95% | Immediate biopsy or surgical consultation |
| BI-RADS 6 | Known biopsy-proven malignancy | 100% (by definition) | Treatment planning |
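As a minimal sketch of how the categories in Table 3 might drive software-based decision support, the lookup below maps each BI-RADS category to its typical action. This is an illustration only, assuming a simple dictionary structure; it is not a clinical tool and not part of the study's software.

```python
# Hypothetical lookup built from Table 3; illustration only, not a
# substitute for radiologist judgment or the ACR BI-RADS atlas.
BIRADS_ACTIONS = {
    "1":  ("Negative",                        "0%",      "Routine screening"),
    "2":  ("Benign finding",                  "0%",      "Routine screening"),
    "3":  ("Probably benign",                 "<2%",     "Short-interval follow-up (e.g., 6 months)"),
    "4A": ("Low suspicion for malignancy",    "~2-10%",  "Recommend biopsy"),
    "4B": ("Moderate suspicion",              "~10-50%", "Recommend biopsy"),
    "4C": ("High suspicion",                  "~50-95%", "Recommend biopsy"),
    "5":  ("Highly suggestive of malignancy", ">95%",    "Immediate biopsy or surgical consultation"),
    "6":  ("Known biopsy-proven malignancy",  "100%",    "Treatment planning"),
}

def recommend(category: str) -> str:
    """Return the Table 3 description, risk estimate, and typical action."""
    desc, risk, action = BIRADS_ACTIONS[category.upper()]
    return f"BI-RADS {category}: {desc} (risk {risk}) -> {action}"

print(recommend("4B"))  # BI-RADS 4B: Moderate suspicion (risk ~10-50%) -> Recommend biopsy
```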

Examples of Ultrasound Findings by Category

BI-RADS 2 (Benign):

  • Oval, smooth, well-circumscribed fibroadenoma
  • Simple cyst (anechoic, posterior enhancement, no internal echoes)
  • Lipoma or intramammary lymph node

BI-RADS 3 (Probably Benign):

  • Hypoechoic, oval lesion with circumscribed margins
  • Wider-than-tall
  • Stable fibroadenoma in young women

 BI-RADS 4 (Suspicious):

  • Irregular, hypoechoic mass
  • Angular or indistinct margins
  • Slight posterior shadowing or mixed posterior features

 BI-RADS 5 (Highly Suspicious):

  • Irregular or spiculated mass
  • Taller-than-wide
  • Strong posterior shadowing
  • Marked internal vascularity
  • Associated suspicious lymph nodes

METHODS/INTERVENTION

Creating synthetic tissue components, process and steps

The creation of synthetic phantom breast tissue plays a vital role in developing and validating AI-based ultrasound imaging systems. For this purpose, a multilayered phantom was constructed utilizing a controlled polyvinyl chloride (PVC) matrix plasticized with dioctyl terephthalate (DEHT) softener to emulate the acoustic and mechanical properties of human breast tissue. PVC was selected as the primary matrix material for this phantom due to its superior durability and long shelf life compared to traditional gelatin-based phantoms, which, while simpler to prepare, are prone to degradation over time and offer limited usability for prolonged imaging studies.

A ratio of 16.7 grams of PVC powder was combined with 100 milliliters of DEHT softener, producing a 16.7% w/v mixture. This blend was initially homogeneous and runny, appearing as a milky white liquid upon mixing. The mixture was steadily heated with constant stirring (Figure 3 A-B). All heating steps were conducted under a fume hood to ensure safe handling of any fumes or vapors generated during the process. As the temperature approached approximately 130°C, the solution thickened and developed a honey-like viscosity (Figure 3C). Continued heating to approximately 170°C led to a decrease in viscosity and the mixture acquired an amber-clear appearance, indicating readiness for molding (Figure 3D).
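The 16.7% w/v figure follows directly from the recipe: %w/v is grams of solute per 100 mL of liquid, so 16.7 g of PVC in 100 mL of DEHT gives 16.7% w/v. The helper below is a minimal sketch for scaling the batch to larger molds at the same concentration; it is illustrative and not part of the published protocol.

```python
def pvc_mass_for_batch(target_w_v_percent: float, volume_ml: float) -> float:
    """Grams of PVC powder needed for a given %w/v and DEHT volume.

    %w/v = grams of solute per 100 mL of liquid, so the paper's recipe
    (16.7 g in 100 mL of DEHT) corresponds to 16.7% w/v.
    """
    return target_w_v_percent / 100.0 * volume_ml

print(round(pvc_mass_for_batch(16.7, 100.0), 2))  # 16.7 g, the recipe as published
print(round(pvc_mass_for_batch(16.7, 350.0), 2))  # 58.45 g for a hypothetical 350 mL batch
```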

Prior to casting, a cube mold was coated internally with mineral oil to facilitate demolding. The hot PVC-DEHT mixture was first poured as a thin base layer and allowed to cool briefly, just enough for the polymer to support added features without significant deformation. Artificial lesions, in our case, 3D-printed with ABS, were then placed on top of the partially solidified base (Figure 3E), after which a second aliquot of the hot mixture was poured to fully encapsulate these inclusions. Following a similar cooling interval, a silicone breast implant was positioned atop the set lesion layer. A final portion of the mixture was applied to submerge the implant, leaving only a portion of the nipple-areolar complex exposed to replicate anatomical fidelity (Figure 3 F-G).

After assembly, the phantom was allowed to cool to room temperature to ensure complete solidification. The finished construct could be easily removed from the mold, yielding a stable, layered model suitable for ultrasound investigation. This fabrication protocol provides a reproducible approach for generating physiologically relevant phantoms intended for AI/ML training and imaging research.

Figure 3. Fabrication and structure of the synthetic breast phantom. (A) Addition of powdered PVC into DEHT under heating and stirring. (B) Homogeneous, milky-white dispersion after full mixing. (C) Thickening of mixture at ~130 °C with honey-like viscosity. (D) Final amber-clear solution at ~170 °C, maple syrup consistency. (E) 3D-printed lesions prior to embedding. (F) Molded phantom showing stratified layers and embedded structures. (G) Fully cured phantom removed from mold, with visible breast implant and exposed nipple–areolar complex. Courtesy of Validus Institute Inc and PSU collaborators, 2025

Using the ultrasound probe and the process of capturing ultrasound images

The Butterfly iQ probe was selected for this study due to its portability, user-friendly interface, and ability to deliver imaging quality comparable to conventional ultrasound machines. The study utilized the Butterfly iQ in conjunction with the Butterfly App on a 10th-generation Apple iPad. Real-time imaging was displayed on the app interface as the probe, connected to the iPad via USB-C, was applied to a surface.

The ultrasound procedure began with the application of conductive gel to the probe. As sound waves travel more effectively through fluid mediums, the gel facilitates optimal transmission by reducing air interference between the probe and the skin. With the gel applied, the probe was placed in contact with the general area of concern. It is important to establish a consistent orientation during scanning by referencing the probe’s side light indicator, which identifies the probe’s lateral orientation relative to the patient.

The Butterfly probe provides a downward-facing view, where the upper portion of the screen represents the most superficial structures. Proper grip and pressure must be maintained to ensure consistent contact and high-quality imaging. The scanning process involves first sweeping across the transverse axis, followed by the longitudinal axis, adjusting planes as needed to fully evaluate the region of interest. Images were interpreted live through the Butterfly App interface during the scanning process.

Figure 4. www.butterflynetwork.com

Techniques for handling a Butterfly iQ probe are similar to those for conventional ultrasound devices: the user must be aware of orientation, proper “pencil” grip, and hand positioning. The user can adopt the “PART” technique to ensure optimal image acquisition. This includes adjusting Pressure gently to maintain good contact without distorting tissue, Angling the probe to better visualize deeper structures, Rotating it 90 degrees to alternate between transverse and longitudinal planes, and Tilting the probe sideways, upwards, or downwards to sweep through different tissue layers. Additionally, effective image optimization involves adjusting depth and gain settings on the interface as needed. Utilizing the app’s preset modes, such as vascular, MSK, or abdominal, can further enhance image quality tailored to the anatomical region being assessed.

Figure 5. Synthetic tissue and the Butterfly iQ3 ultrasound probe used to capture images. Probe image: www.butterflynetwork.com; synthetic tissue image courtesy of Validus Institute Inc, 2025

In this experiment, the following presets were used: soft tissue, small organ, and musculoskeletal (MSK). These were chosen to match the depth of the specific lesions and tissue, since the team captured images of several different synthetic tissues.

Figure 6. Ultrasound image captured using synthetic tissue. The image shows the measurements of the breast tissue; an ellipse and vertical and horizontal lines were used to start the annotation process. Courtesy of Validus Institute Inc., 2025

Understanding the ultrasound images and annotation process

Depending on the device being utilized, images are shown in real time on the interface. In this testing, a 10th-generation Apple iPad was used with the Butterfly iQ3 probe, which connects via USB-C.

To start the annotation process, the researcher froze the live feed to capture an image and then added an ellipse, lines, or notes directly on the screen.

The size of the lesion in the captured breast image was identified and marked. The probe orientation, transverse or longitudinal, was also identified and marked.

The position of the cyst relative to the side of the breast imaged, and whether the image was taken proximally or distally in relation to the hypothetical patient’s head, were marked with annotations for both benign cystic lesions and solid masses.

The edges of benign cystic lesions are outlined with an ellipse, and lines were added to record the lesions’ measurements. For malignant lesions, the annotation process involves horizontal and vertical line measurements as well as labels for the organ name, the probe plane (transverse vs. longitudinal), and the location of the tumor relative to the side of the breast captured in the image.

To capture the borders of a cystic lesion, we selected the ellipse tool and drew an oval or circle, depending on the shape of the cyst, to mark the margins of the lesion. We then employed vertical and horizontal lines to record the cyst’s measurements. The side of the breast that was imaged and the type of organ were identified so that annotations could be applied to both benign cysts and solid masses; solid masses were annotated in the same way, by identifying the organ type and the side of the breast captured in the image. This work is based on the team’s original experiment.
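To make the annotation schema concrete, the sketch below shows one hypothetical way to store an annotated frame as a structured record. The field names and the ellipse-area helper are assumptions for illustration; they do not represent the Butterfly App’s actual export format.

```python
from dataclasses import dataclass, asdict
import json
import math

@dataclass
class LesionAnnotation:
    """One annotated ultrasound frame, mirroring the workflow above."""
    organ: str          # e.g., "breast"
    side: str           # "left" or "right"
    probe_plane: str    # "transverse" or "longitudinal"
    lesion_type: str    # "cystic" or "solid"
    width_cm: float     # horizontal caliper line
    height_cm: float    # vertical caliper line

    def ellipse_area_cm2(self) -> float:
        # Treat the two caliper lines as the axes of the marked ellipse.
        return math.pi * (self.width_cm / 2) * (self.height_cm / 2)

ann = LesionAnnotation("breast", "left", "transverse", "cystic", 1.8, 1.1)
print(json.dumps(asdict(ann)))           # serializable record for a labeled dataset
print(round(ann.ellipse_area_cm2(), 2))  # -> 1.56 cm^2
```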

Figure 7. Left cystic breast ultrasound image, taken in the transverse plane using the synthetic tissue model. Measurements and annotations are shown in the picture. After the measurements are taken, the annotation process begins by marking the side of the lesion, the name of the lesion, and the position of the handheld ultrasound probe. Here we see an oval cystic lesion of the left breast with regular, clearly marked edges. Courtesy of Validus Institute Inc, 2025

Figure 8. Left solid breast ultrasound image, taken in the transverse plane using the synthetic tissue model. Measurements and annotations are shown in the picture. Courtesy of Validus Institute Inc, 2025

OUTCOME AND FUTURE RESEARCH

The integration of conventional breast ultrasound with artificial intelligence (AI) represents a transformative advancement in breast cancer diagnosis and management. This innovation significantly enhances diagnostic accuracy and streamlines clinical workflows, especially in settings where radiological expertise may be limited. AI-assisted platforms have proven effective in differentiating benign from malignant lesions with higher consistency, reducing inter-observer variability, and minimizing unnecessary biopsies. Features like deep learning–based radiomics and Automated Breast Ultrasound (ABUS) have allowed the extraction and analysis of complex imaging biomarkers, supporting personalized treatment strategies and improving patient outcomes. The clinical algorithm described, ranging from initial imaging to AI-supported decision-making, demonstrates the potential for integrated systems to provide more efficient and accurate diagnostic pathways.
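The paper does not publish model code, but a hedged sketch can show the general shape of such decision support: tabular lesion features (like those in Table 1) feeding a classifier that outputs a malignancy probability. Everything below, including the toy data, is illustrative only.

```python
# Minimal sketch: a toy benign/malignant classifier over Table 1-style
# features. Real systems are trained on large, curated, multi-institution
# image datasets, not on six hand-made feature vectors.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: [irregular_shape, non_circumscribed_margins, taller_than_wide]
X = np.array([
    [0, 0, 0], [0, 1, 0], [0, 0, 1],  # benign-leaning toy examples
    [1, 1, 1], [1, 1, 0], [1, 0, 1],  # malignant-leaning toy examples
])
y = np.array([0, 0, 0, 1, 1, 1])      # 0 = benign, 1 = malignant (toy labels)

clf = LogisticRegression().fit(X, y)
p_malignant = clf.predict_proba([[1, 1, 1]])[0][1]
print(f"P(malignant) for an irregular, spiculated, taller-than-wide lesion: {p_malignant:.2f}")
```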

Limitations: 

There are constraints to using synthetic tissue: only certain aspects of the breast can be mimicked in a synthetic model, which can lead to bias in labeling the images captured from synthetic tissues.

In addition, results obtained with a handheld, portable ultrasound probe can be influenced by user training, which affects the credibility of the images and of the research results. Training staff in proper ultrasound technique and image capture is therefore a crucial part of ultrasound screening, as is synthetic tissue development. A further limitation was the limited availability of published information that could be replicated. This experiment is original in its design, and our team went through many trials before perfecting it and arriving at the best approach for conducting it.

Another limitation is the limited funding options and the small data sample used. It is therefore important to consider other funding resources, such as grants or small business loans, to support conducting this study with a larger sample size and to replicate it on a bigger scale. This is an important step toward training the AI model and producing reliable, valid AI-integrated models in the future.

Future Research:

Future research in breast ultrasound and AI integration will likely focus on the following areas:

  1. Improved Model Generalizability:
Continued work is needed to train AI models on more diverse, multi-institutional datasets to ensure robust performance across different populations, equipment vendors, and clinical environments.
  2. Real-Time AI Integration:
Development of real-time, point-of-care AI solutions that assist clinicians during live scanning, offering dynamic feedback to enhance diagnostic accuracy and reduce scanning time.
  3. Multimodal Imaging Fusion:
Research into fusing AI-enhanced ultrasound with mammography, MRI, or elastography could yield more comprehensive diagnostic systems that capitalize on the strengths of each modality.
  4. Predictive and Prognostic Modeling:
Expanding the role of AI from diagnosis to long-term outcome prediction, including recurrence risk, treatment response, and survival analysis, by combining imaging data with genomic, clinical, and pathological information.
  5. Validation through Prospective Clinical Trials:
Large-scale, prospective validations are needed to assess real-world effectiveness, cost-benefit ratios, and clinical outcomes driven by AI-assisted ultrasound systems.
  6. Ethical and Regulatory Frameworks:
Establishing clear guidelines and standards around the deployment, transparency, and interpretability of AI in breast imaging to ensure safe, equitable, and trustworthy clinical use.

In summary, the coupling of ultrasound imaging with AI holds the promise of redefining breast cancer care by enabling earlier detection, better risk stratification, and more personalized treatment strategies.

In addition, the integration of synthetic tissue models in breast ultrasound screening has yielded several promising outcomes, significantly advancing our research in a practical way. One of the most notable benefits is the ability to standardize and enhance training for radiologists and sonographers. Synthetic tissue phantoms, engineered to mimic the acoustic and mechanical properties of human breast tissue, allow for repeated, consistent practice without risk to patients. This has improved diagnostic accuracy, particularly in detecting subtle lesions or abnormalities in dense breast tissue.

Synthetic tissues have proven useful for capturing reliable images from many different angles of the breast, using both transverse and longitudinal probe handling during image capture. Furthermore, synthetic tissue enables the safe integration and testing of AI algorithms: by generating controlled scenarios with known pathologies, as was demonstrated in our experiment, synthetic models allow for training and validation of machine learning systems in a reproducible environment.
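To make the reproducibility point concrete, here is a minimal sketch of splitting annotated phantom-scan records, whose ground-truth pathology is known by construction, into training and validation sets. The record layout and counts are hypothetical; the study does not publish a dataset format.

```python
# Hypothetical sketch: reproducible train/validation split of phantom-scan
# records with known ground-truth labels (known because the lesions were
# fabricated and placed deliberately).
import random

records = [
    {"scan_id": i, "lesion": "cystic" if i % 2 else "solid"}
    for i in range(20)  # toy stand-ins for annotated phantom frames
]

rng = random.Random(42)  # fixed seed -> the same split every run
rng.shuffle(records)
split = int(0.8 * len(records))
train, val = records[:split], records[split:]
print(len(train), len(val))  # 16 training / 4 validation records
```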

CONCLUSION

The use of synthetic tissue with ultrasound imaging of the breast has proven to be a reliable technique. As synthetic tissue-supported imaging and AI modelling improve lesion characterization, clinicians can more confidently distinguish benign from suspicious masses. The combination of portable ultrasonography and artificial intelligence offers the prospect of non-invasive, easily accessible, and reasonably priced screening in a variety of healthcare settings.

AI-assisted ultrasound training will aid the interpretation of lesions, leading to a reduction in invasive procedures as fewer core biopsies will be needed. By providing a stable reference, these models support ongoing quality assurance, reducing variability between screenings and across different equipment and operators. The training of AI-assisted ultrasound technologies using synthetic tissue modules, as demonstrated in our experiment, shows promising results. This reduces the ethical concerns and logistical limitations of relying solely on human subjects and accelerates the development of robust, generalizable AI tools for early breast cancer detection.

The use of AI-assisted ultrasound imaging has the potential to revolutionize the treatment of breast cancer by facilitating improved risk assessment, earlier detection, and more individualized treatment plans. Thorough validation, interdisciplinary cooperation, and ongoing learning integrated into clinical workflows will be essential for the long-term success of this innovative process.

ACKNOWLEDGMENTS

We acknowledge BDSIL 2025 for sharing their ideas regarding this research process. The ideas in this research were further developed following participation in the Biomedical Data Science Innovation Lab (BDSIL) 2025, which encouraged new interdisciplinary collaborations and inspired innovative ideas. The following is a summary provided by Dr. John Darrell Van Horn, Ph.D., M.Eng., FOHBM:

“The Biomedical Data Science Innovation Lab (BDSIL; https://www.innovation.lab.virginia.edu/), through support from the National Institute of General Medical Sciences (NIGMS; NIH Grant# R25GM139080), is an intensive, interdisciplinary research training and team science program designed to catalyze novel collaborations between early-career investigators from biomedical and quantitative backgrounds. Through a carefully structured combination of mentorship, creativity training, facilitated ideation, and collaborative grant development, BDSIL provides participants with the tools, environment, and professional support needed to generate high-impact, data-driven research proposals. The program emphasizes cross-disciplinary communication, the co-creation of new ideas at the interface of biomedicine and data science, and the development of innovative solutions to complex health-related challenges. Participation in the BDSIL fostered lasting research partnerships and offered invaluable guidance from expert mentors, ultimately accelerating the trajectory of the work described in this manuscript.” -courtesy of Dr. John Darrell Van Horn, Ph.D., M.Eng., FOHBM

Special thanks to Dr. John Darrell Van Horn for sharing this detailed summary, and to the BDSIL team and all the mentors for their stimulating questions during this learning experience. Based on their questions and ideas, our team will take this project further and develop a study involving multi-omics as part of our new research, which will be part of our upcoming paper and research project for 2025-2026. Thank you for all your valuable insights.

Special thanks to Portland State University for their collaboration and training program, especially Dr. Faryar Etesami, Emeritus Professor of Mechanical Engineering, Portland State University, and Mr. Behrooz Khajehee, M.Sc. in Artificial Intelligence for Science and Technology, Joint Program: University of Milan, Milano-Bicocca, and Pavia, Italy.

Thank you to all our team members and collaborators!

REFERENCES

  1. Tabár, L. et al. Swedish two-county trial: impact of mammographic screening on breast cancer mortality during 3 decades. Radiology 260, 658–663 (2011).
  2. Lehman, C. D. et al. National performance benchmarks for modern screening digital mammography: update from the Breast Cancer Surveillance Consortium. Radiology 283, 49–58 (2017).
  3. Bray, F. et al. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J. Clin. 68, 394–424 (2018).
  4. The Canadian Task Force on Preventive Health Care. Recommendations on screening for breast cancer in average-risk women aged 40–74 years. CMAJ 183, 1991–2001 (2011).
  5. Marmot, M. G. et al. The benefits and harms of breast cancer screening: an independent review. Br. J. Cancer 108, 2205–2240 (2013).
  6. Lee, C. H. et al. Breast cancer screening with imaging: recommendations from the Society of Breast Imaging and the ACR on the use of mammography, breast MRI, breast ultrasound, and other technologies for the detection of clinically occult breast cancer. J. Am. Coll. Radiol. 7, 18–27 (2010).
  7. Oeffinger, K. C. et al. Breast cancer screening for women at average risk: 2015 guideline update from the American Cancer Society. J. Am. Med. Assoc. 314, 1599–1614 (2015).
  8. Siu, A. L. Screening for breast cancer: U.S. Preventive Services Task Force recommendation statement. Ann. Intern. Med. 164, 279–296 (2016).
  9. Center for Devices & Radiological Health. MQSA National Statistics (US Food and Drug Administration, 2019; accessed 16 July 2019); http://www.fda.gov/radiation-emitting-products/mqsa-insights/mqsa-national-statistics.
  10. Cancer Research UK. Breast Screening (CRUK, 2017; accessed 26 July 2019); https://www.cancerresearchuk.org/about-cancer/breastcancer/screening/breast-screening.
  11. Elmore, J. G. et al. Variability in interpretive performance at screening mammography and radiologists’ characteristics associated with accuracy. Radiology 253, 641–651 (2009).
  12. Lehman, C. D. et al. Diagnostic accuracy of digital screening mammography with and without computer-aided detection. JAMA Intern. Med. 175, 1828–1837 (2015).
  13. Tosteson, A. N. A. et al. Consequences of false-positive screening mammograms. JAMA Intern. Med. 174, 954–961 (2014).
  14. Houssami, N. & Hunter, K. The epidemiology, radiology and biological characteristics of interval breast cancers in population mammography screening. NPJ Breast Cancer 3, 12 (2017).
  15. Gulshan, V. et al. Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs. J. Am. Med. Assoc. 316, 2402–2410 (2016).
  16. Esteva, A. et al. Dermatologist-level classification of skin cancer with deep neural networks. Nature 542, 115–118 (2017).
  17. De Fauw, J. et al. Clinically applicable deep learning for diagnosis and referral in retinal disease. Nat. Med. 24, 1342–1350 (2018).
  18. Ardila, D. et al. End-to-end lung cancer screening with three-dimensional deep learning on low-dose chest computed tomography. Nat. Med. 25, 954–961 (2019).
  19. Topol, E. J. High-performance medicine: the convergence of human and artificial intelligence. Nat. Med. 25, 44–56 (2019).
  20. Moran, S. & Warren-Forward, H. The Australian Breast Screen workforce: a snapshot. Radiographer 59, 26–30 (2012).
  21. Wing, P. & Langelier, M. H. Workforce shortages in breast imaging: impact on mammography utilization. AJR Am. J. Roentgenol. 192, 370–378 (2009).
  22. Rimmer, A. Radiologist shortage leaves patient care at risk, warns royal college. BMJ 359, j4683 (2017).
  23. Nakajima, Y., Yamada, K., Imamura, K. & Kobayashi, K. Radiologist supply and workload: international comparison. Radiat. Med. 26, 455–465 (2008).
  24. Rao, V. M. et al. How widely is computer-aided detection used in screening and diagnostic mammography? J. Am. Coll. Radiol. 7, 802–805 (2010).
  25. Gilbert, F. J. et al. Single reading with computer-aided detection for screening mammography. N. Engl. J. Med. 359, 1675–1684 (2008).
  26. Giger, M. L., Chan, H.-P. & Boone, J. Anniversary paper: history and status of CAD and quantitative image analysis: the role of Medical Physics and AAPM. Med. Phys. 35, 5799–5820 (2008).
  27. Fenton, J. J. et al. Influence of computer-aided detection on performance of screening mammography. N. Engl. J. Med. 356, 1399–1409 (2007).
  28. Kohli, A. & Jha, S. Why CAD failed in mammography. J. Am. Coll. Radiol. 15, 535– 537 (2018).
  29. Rodriguez-Ruiz, A. et al. Stand-alone artificial intelligence for breast cancer detection in mammography: comparison with 101 radiologists. J. Natl. Cancer Inst. 111, 916–922 (2019).
  30. Wu, N. et al. Deep neural networks improve radiologists’ performance in breast cancer screening. IEEE Trans. Med. Imaging https://doi.org/10.1109/TMI.2019.2945514 (2019).
  31. Zech, J. R. et al. Variable generalization performance of a deep learning model to detect pneumonia in chest radiographs: a cross-sectional study. PLoS Med. 15, e1002683 (2018).
  32. Becker, A. S. et al. Deep learning in mammography: diagnostic accuracy of a multipurpose image analysis software in the detection of breast cancer. Invest. Radiol. 52, 434–440 (2017).
  33. Ribli, D., Horváth, A., Unger, Z., Pollner, P. & Csabai, I. Detecting and classifying lesions in mammograms with deep learning. Sci. Rep. 8, 4165 (2018).
  34. McKinney, S.M., Sieniek, M., Godbole, V. et al. International evaluation of an AI system for breast cancer screening. Nature 577, 89–94 (2020). https://doi.org/10.1038/s41586-019-1799-6.
  35. Lutz, Stephen, Edward Chow, and Peter Hoskin. “Radiation Oncology in Palliative Cancer Care.” https://onlinelibrary.wiley.com/doi/epdf/10.1002/9781118607152.fmatter
  36. Silva, Helbert Eustáquio Cardoso da, Glaucia Nize Martins Santos, André Ferreira Leite, Carla Ruffeil Moreira Mesquita, Paulo Tadeu de Souza Figueiredo, Cristine Miron Stefani, and Nilce Santos de Melo. “The use of artificial intelligence tools in cancer detection compared to the traditional diagnostic imaging methods: An overview of the systematic reviews.” Plos one 18, no. 10 (2023): e0292063.
  37. Hosny, Ahmed, Chintan Parmar, John Quackenbush, Lawrence H. Schwartz, and Hugo JWL Aerts. “Artificial intelligence in radiology.” Nature Reviews Cancer 18, no. 8 (2018): 500-510.
  38. Khazaei, Hadi, Danesh Khazaei, and Faryar Etesami (Emeritus Professor, Department of Mechanical Engineering, Portland State University). Unveiling the Future: Convergence of Engineering, Medicine, and Technology in Biomedical and Biotechnology.
  39. Majd Oteibi, Adam Tamimi, Kaneez Abbas, Gabriel Tamimi, Danesh Khazaei, Hadi Khazaei. “Advancing Digital Health using AI and Machine Learning Solutions for Early Ultrasonic Detection of Breast Disorders in Women.” IJRSI (Volume XI, Issue XI), ISSN No. 2321-2705 (2024).
  40. Cai, Y., Dai, F., Ye, Y. et al.The global burden of breast cancer among women of reproductive age: a comprehensive analysis. Sci Rep 15, 9347 (2025). https://doi.org/10.1038/s41598-025-93883-9
  41. Berg WA, Gutierrez L, NessAiver MS, et al. Diagnostic accuracy of mammography, clinical examination, US, and MR imaging in preoperative assessment of breast cancer. Radiology. (2004); 233(3):830-849. doi:10.1148/radiol.2333031484
  42. D’Orsi CJ, Sickles EA, Mendelson EB, et al. ACR BI-RADS® Atlas, Breast Imaging Reporting and Data System. 5th edition. Reston, VA: American College of Radiology; 2013.
  43. Khazaei, Hadi et al. “3D Ultrasound using 3D Phantom Models for Oculofacial Injuries in Emergencies.” International Journal of Research and Innovation in Applied Science (2023). Springer, Cham. https://doi.org/10.1007/978-3-031-85768-3_16
  44. Nakajima, N., Isner, J.D., Harrell, E.R., & Daniels, C.A. Investigation of Polyvinyl Chloride Plastisol Tissue-Mimicking Materials with an Open-Source, Accessible Fabrication Protocol for Medical Ultrasound. Journal of Applied Clinical Medical Physics, 20(8), 191–199 (2025). https://doi.org/10.1002/acm2.12661
  45. Khazaei, H., Khazaei, D., Junejo, N., Ng, J.D., Etesami, F.. Evaluation of Optic Nerve Sheath Diameter Measurements in Eye Phantom Imaging Using POCUS and AI. In: Khazaei, H. (eds) Fundamentals of Orbital Inflammatory Disorders. Springer, Cham. (2025). https://doi.org/10.1007/978-3-031-85768-3_18
