June 16, 2025 by g4qwj

Classification of Mammographic Breast Microcalcifications Using a Deep Convolutional Neural Network: A BI-RADS–Based Approach

Abstract

Purpose 

The goal of this retrospective cohort study was to investigate the potential of a deep convolutional neural network (dCNN) to accurately classify microcalcifications in mammograms with the aim of obtaining a standardized observer-independent microcalcification classification system based on the Breast Imaging Reporting and Data System (BI-RADS) catalog.

Materials and Methods 

Over 56,000 images of 268 mammograms from 94 patients were labeled into 3 classes according to the BI-RADS standard: “no microcalcifications” (BI-RADS 1), “probably benign microcalcifications” (BI-RADS 2/3), and “suspicious microcalcifications” (BI-RADS 4/5). Using the preprocessed images, a dCNN was trained and validated, generating 3 types of models: BI-RADS 4 cohort, BI-RADS 5 cohort, and BI-RADS 4 + 5 cohort. For the final validation of the trained dCNN models, a test data set consisting of 141 images of 51 mammograms from 26 patients, labeled according to the corresponding BI-RADS classification from the radiological reports, was applied. The performance of the dCNN models was evaluated by classifying each of the mammograms and computing the accuracy relative to the classification from the radiological reports. For visualization, probability maps of the classification were generated.
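For readers curious how the labeling step maps the BI-RADS catalog onto the three training classes, a minimal Python sketch follows; it is our own illustration, not the study's code, and the class indices are arbitrary assumptions:

```python
# Map BI-RADS categories to the three training classes described above:
# BI-RADS 1 -> "no microcalcifications", 2/3 -> "probably benign",
# 4/5 -> "suspicious". The integer class indices are illustrative.
BIRADS_TO_CLASS = {
    1: 0,  # no microcalcifications
    2: 1,  # probably benign
    3: 1,
    4: 2,  # suspicious
    5: 2,
}

def label_images(birads_scores):
    """Convert a list of per-image BI-RADS scores into class labels."""
    return [BIRADS_TO_CLASS[s] for s in birads_scores]
```

For example, `label_images([1, 3, 5])` yields one label per image, grouping BI-RADS 2/3 and 4/5 together as in the study design.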

Results 

The accuracy on the validation set after 130 epochs was 99.5% for the BI-RADS 4 cohort, 99.6% for the BI-RADS 5 cohort, and 98.1% for the BI-RADS 4 + 5 cohort. Confusion matrices of the “real-world” test data set were generated for the 3 cohorts, with the radiological reports serving as ground truth. The resulting accuracy was 39.0% for the BI-RADS 4 cohort, 80.9% for the BI-RADS 5 cohort, and 76.6% for the BI-RADS 4 + 5 cohort. The probability maps exhibited excellent image quality with correct classification of microcalcification distribution.
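The confusion-matrix evaluation described above boils down to tallying predictions against the report labels; a minimal, illustrative Python sketch (not the study's code):

```python
def confusion_matrix(y_true, y_pred, n_classes=3):
    """Rows = ground truth (radiological report), columns = dCNN prediction."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

def accuracy(y_true, y_pred):
    """Fraction of cases where prediction matches the report label."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
```

The diagonal of the matrix holds the correctly classified cases; accuracy is the diagonal sum divided by the total.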

Conclusions 

dCNNs can be trained to successfully classify microcalcifications on mammograms according to the BI-RADS classification system, acting as a standardized quality control tool that provides the expertise of a team of radiologists.

Read the full article online here:

Download the PDF of the article

June 16, 2025 by g4qwj

Automatic and standardized quality assurance of digital mammography and tomosynthesis with deep convolutional neural networks

Abstract

Objectives

The aim of this study was to develop and validate a commercially available AI platform for the automatic determination of image quality in mammography and tomosynthesis considering a standardized set of features.

Materials and methods

In this retrospective study, 11,733 mammograms and synthetic 2D reconstructions from tomosynthesis of 4200 patients from two institutions were analyzed by assessing the presence of seven features that impact image quality with regard to breast positioning. Deep learning was applied to train five dCNN models to detect the presence of anatomical landmarks and three dCNN models for localization features. The validity of the models was assessed by calculating the mean squared error on a test dataset and comparing the results with readings by experienced radiologists.
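The mean squared error used to validate the regression models is simply the average squared difference between predicted and reference measurements (e.g., distances or angles of breast positioning); an illustrative sketch, not the study's code:

```python
def mean_squared_error(y_true, y_pred):
    """MSE between reference measurements and model predictions."""
    assert len(y_true) == len(y_pred)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```

A lower MSE means the model's distance and angle estimates sit closer to the radiologists' reference measurements.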

Results

Accuracies of the dCNN models ranged between 93.0% for the nipple visualization and 98.5% for the depiction of the pectoralis muscle in the CC view. Calculations based on regression models allow for precise measurements of distances and angles of breast positioning on mammograms and synthetic 2D reconstructions from tomosynthesis. All models showed almost perfect agreement compared to human reading with Cohen’s kappa scores above 0.9.
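Cohen's kappa, the agreement measure cited above, corrects raw agreement for the agreement expected by chance. A self-contained, illustrative Python sketch:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two nominal ratings of the same cases.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is chance agreement from the raters' marginal frequencies.
    """
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[k] * freq_b[k] for k in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

Values above 0.8 are conventionally read as "almost perfect" agreement, which is why kappa scores above 0.9 here indicate near-identical dCNN and human ratings.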

Conclusions

An AI-based quality assessment system using a dCNN allows for precise, consistent and observer-independent rating of digital mammography and synthetic 2D reconstructions from tomosynthesis. Automation and standardization of quality assessment enable real-time feedback to technicians and radiologists that shall reduce a number of inadequate examinations according to PGMI (Perfect, Good, Moderate, Inadequate) criteria, reduce a number of recalls and provide a dependable training platform for inexperienced technicians.

Key points

  1. Deep convolutional neural network (dCNN) models have been trained for classification of mammography imaging quality features.

  2. AI can reliably classify diagnostic image quality of mammography and tomosynthesis.

  3. Quality control of mammography and tomosynthesis can be automated.

Read the full article online here:

Download the PDF of the article

June 16, 2025 by g4qwj

Fully automatic classification of automated breast ultrasound (ABUS) imaging according to BI-RADS using a deep convolutional neural network

Abstract

Purpose

The aim of this study was to develop and test a post-processing technique for detection and classification of lesions according to the BI-RADS atlas in automated breast ultrasound (ABUS) based on deep convolutional neural networks (dCNNs).

Methods and materials

In this retrospective study, 645 ABUS datasets from 113 patients were included; 55 patients had lesions classified as having a high probability of malignancy. Lesions were categorized as BI-RADS 2 (no suspicion of malignancy), BI-RADS 3 (probability of malignancy < 3%), or BI-RADS 4/5 (probability of malignancy > 3%). A deep convolutional neural network was trained after data augmentation with images of lesions and normal breast tissue, and a sliding-window approach for lesion detection was implemented. The algorithm was applied to a test dataset containing 128 images, and performance was compared with the readings of two experienced radiologists.
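The sliding-window approach scans the image patch by patch and records a score at each position. The sketch below illustrates the idea on a toy 2D image with a stand-in scoring function; in the study, the score at each position would come from the trained dCNN:

```python
def sliding_window_scores(image, window, step, score_fn):
    """Slide a square window over a 2D image (list of rows) and score each patch.

    Returns (row, col, score) tuples; high-scoring positions mark
    candidate lesion locations.
    """
    h, w = len(image), len(image[0])
    results = []
    for r in range(0, h - window + 1, step):
        for c in range(0, w - window + 1, step):
            patch = [row[c:c + window] for row in image[r:r + window]]
            results.append((r, c, score_fn(patch)))
    return results
```

Applying this slice by slice through an ABUS volume yields the slice-wise lesion localization mentioned in the results.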

Results

Results of calculations performed on single images showed an accuracy of 79.7% and an AUC of 0.91 [95% CI: 0.85–0.96] in categorization according to BI-RADS. Moderate agreement between the dCNN and the ground truth was achieved (κ: 0.57 [95% CI: 0.50–0.64]), which is comparable with human readers. Analysis of the whole dataset improved categorization accuracy to 90.9% with an AUC of 0.91 [95% CI: 0.77–1.00], while achieving almost perfect agreement with the ground truth (κ: 0.82 [95% CI: 0.69–0.95]), performing on par with human readers. Furthermore, the object localization technique allowed slice-wise detection of lesion position.
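The AUC values reported above can be computed without plotting an ROC curve, using the rank-based (Mann–Whitney) formulation: the AUC equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. An illustrative sketch:

```python
def auc_from_scores(scores_pos, scores_neg):
    """AUC = probability that a random positive case scores higher
    than a random negative case; ties count as 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.5 corresponds to chance-level ranking and 1.0 to perfect separation of malignant from benign cases.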

Conclusions

Our results show that a dCNN can be trained to detect and distinguish lesions in ABUS according to the BI-RADS classification with similar accuracy as experienced radiologists.

Key Points

 A deep convolutional neural network (dCNN) was trained for classification of ABUS lesions according to the BI-RADS atlas.

 A sliding-window approach allows accurate automatic detection and classification of lesions in ABUS examinations.

Read the full article online here:

Download the PDF of the article

June 16, 2025 by g4qwj

Diagnostic accuracy of automated ACR BI-RADS breast density classification using deep convolutional neural networks

Abstract

Objectives

High breast density is a well-known risk factor for breast cancer. This study aimed to develop and adapt two (MLO, CC) deep convolutional neural networks (DCNN) for automatic breast density classification on synthetic 2D tomosynthesis reconstructions.

Methods

In total, 4605 synthetic 2D images (1665 patients, age: 57 ± 37 years) were labeled according to the ACR (American College of Radiology) density categories (A–D). Two DCNNs, each with 11 convolutional layers and 3 fully connected layers, were trained on 70% of the data, while 20% was used for validation. The remaining 10% served as a separate test dataset of 460 images (380 patients). All mammograms in the test dataset were read blinded by two radiologists (reader 1 with 2 and reader 2 with 11 years of dedicated experience in breast imaging), and their consensus was used as the reference standard. The inter- and intra-reader reliabilities were assessed by calculating Cohen’s kappa coefficients, and diagnostic accuracy measures of the automated classification were evaluated.
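The 70/20/10 split described above can be sketched as follows; splitting at the patient level avoids leaking one patient's images across sets. This is our own illustration, not the study's code:

```python
import random

def split_patients(patient_ids, seed=42):
    """Shuffle patients deterministically and split ~70/20/10
    into train/validation/test sets."""
    ids = sorted(patient_ids)  # fix ordering before the seeded shuffle
    rng = random.Random(seed)
    rng.shuffle(ids)
    n = len(ids)
    n_train, n_val = int(0.7 * n), int(0.2 * n)
    return ids[:n_train], ids[n_train:n_train + n_val], ids[n_train + n_val:]
```

All images of a given patient then inherit that patient's split assignment.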

Results

The two models for MLO and CC projections had a mean sensitivity of 80.4% (95%-CI 72.2–86.9), a specificity of 89.3% (95%-CI 85.4–92.3), and an accuracy of 89.6% (95%-CI 88.1–90.9) in the differentiation between ACR A/B and ACR C/D. DCNN versus human and inter-reader agreement were both “substantial” (Cohen’s kappa: 0.61 versus 0.63).
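Sensitivity and specificity for the clinically relevant A/B-versus-C/D distinction follow directly from the 2×2 counts; an illustrative sketch (the label encoding is our own assumption):

```python
def sens_spec(y_true, y_pred):
    """Sensitivity and specificity for binary labels:
    1 = dense breast (ACR C/D), 0 = non-dense (ACR A/B)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)
```

Sensitivity here is the fraction of dense breasts correctly flagged; specificity is the fraction of non-dense breasts correctly passed.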

Conclusion

The DCNN allows accurate, standardized, and observer-independent classification of breast density based on the ACR BI-RADS system.

Key Points

 A DCNN performs on par with human experts in breast density assessment for synthetic 2D tomosynthesis reconstructions.

 The proposed technique may be useful for accurate, standardized, and observer-independent breast density evaluation of tomosynthesis.

 

Read the full article online here:

Download the PDF of the article

June 16, 2025 by g4qwj

Classification of Mammographic Breast Microcalcifications Using a Deep Convolutional Neural Network

Abstract

The aim of this study was to investigate the potential of a machine learning algorithm to classify breast cancer solely by the presence of soft tissue opacities in mammograms, independent of other morphological features, using a deep convolutional neural network (dCNN). Soft tissue opacities were classified based on their radiological appearance using the ACR BI-RADS atlas. We included 1744 mammograms from 438 patients to create 7242 icons by manual labeling. The icons were sorted into three categories: “no opacities” (BI-RADS 1), “probably benign opacities” (BI-RADS 2/3) and “suspicious opacities” (BI-RADS 4/5). A dCNN was trained (70% of data), validated (20%) and finally tested (10%). A sliding window approach was applied to create colored probability maps for visual impression. Diagnostic performance of the dCNN was compared to human readout by experienced radiologists on a “real-world” dataset. The accuracies of the models on the test dataset ranged between 73.8% and 89.8%. Compared to human readout, our dCNN achieved a higher specificity (100%, 95% CI: 85.4–100%; reader 1: 86.2%, 95% CI: 67.4–95.5%; reader 2: 79.3%, 95% CI: 59.7–91.3%), while its sensitivity (84.0%, 95% CI: 63.9–95.5%) was lower than that of the human readers (reader 1: 88.0%, 95% CI: 67.4–95.4%; reader 2: 88.0%, 95% CI: 67.7–96.8%). In conclusion, a dCNN can be used for the automatic detection as well as the standardized and observer-independent classification of soft tissue opacities in mammograms, independent of the presence of microcalcifications. Human decision making in accordance with the BI-RADS classification can be mimicked by artificial intelligence.
 

Read the full article online here:

Download the PDF of the article

August 22, 2024 by g4qwj

Discover DISS Analytics 3.0

We are thrilled to announce the launch of DISS Analytics 3.0, the latest version of our customized business intelligence (BI) software tailored specifically for the healthcare industry.

 

Over the past few years, we have developed a user-friendly BI tool for non-technical users, enabling them to visualize data and make informed decisions. With DISS Analytics, users can monitor key metrics and performance indicators in real time, allowing for agile decision-making.

 

We continuously enhance our software by integrating cutting-edge technology, ensuring our customers can fully leverage their data.

What’s New and Remarkable in DISS Analytics 3.0?

 

Enhanced User Experience and Performance Improvement

DISS Analytics 3.0 introduces significant improvements to the user interface, especially on the mobile version, allowing users to navigate reports and adjust filters easily. We have focused on the UI design to ensure a smoother and more intuitive user experience. With a streamlined design, healthcare professionals can access vital data and insights, facilitating data-driven decision-making.

 

In addition, we have also made substantial updates to the underlying code for performance enhancements. The platform now offers faster and more efficient data processing with reduced loading times.

 

Advanced Predictive Analytics with Scenarios

The latest version includes advanced predictive analytics capabilities with scenario features. This functionality allows healthcare organizations to adjust variables such as costs, revenue, and staffing levels to see how these changes could impact departmental performance. The scenarios feature enables more accurate forecasting and strategic planning.
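Conceptually, a scenario applies a set of adjustments to baseline figures and recomputes a derived indicator such as departmental margin. The toy sketch below is our own illustration of the idea; the variable names are assumptions, not the actual DISS Analytics API:

```python
def apply_scenario(baseline, adjustments):
    """Return a new figure set with fractional adjustments applied.

    `adjustments` maps a variable name to a fractional change,
    e.g. {"costs": 0.05} models a 5% cost increase.
    """
    return {k: v * (1 + adjustments.get(k, 0.0)) for k, v in baseline.items()}

def margin(figures):
    """A derived indicator recomputed per scenario."""
    return figures["revenue"] - figures["costs"] - figures["staffing"]
```

Running several adjustment sets through the same derived indicators is what makes side-by-side forecasting and strategic planning possible.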

 

Latest Artificial Intelligence Model

Another relevant advancement in DISS Analytics 3.0 is the latest Artificial Intelligence model integrated into the platform. This model enhances the mobile experience by allowing users to dynamically interact with their data. Thanks to generative AI, users can now ask questions about their data and receive quick, accurate responses on demand.

Whether you are a small clinic or a large medical center, DISS Analytics software offers straightforward data visualization capabilities, helping you enhance operational and financial efficiency while improving patient care delivery.

For more information on how DISS Analytics can transform your data analytics experience or to schedule a demo, please visit our website or contact our sales team.

November 1, 2023 by Rubén García

KPIs for Radiology

A question we often get is “What Key Performance Indicators (KPIs) are essential to help monitor the performance and success of a radiology business?” The reality is that only the specific department or center knows what is best for them, but there are certainly some key KPIs that can be thought of as industry best practices for a radiology business:

 

1. Patient Volume: Track the number of patients served over a specific period. This KPI provides insights into the demand for your services and helps identify trends or seasonal variations.

 

2. Referral Patterns: Monitor the sources and volume of referrals from physicians, hospitals, and other healthcare providers. Understanding referral patterns helps identify key referral sources and allows you to strengthen relationships with them.

 

3. Turnaround Time: Measure the time it takes to deliver diagnostic reports or images to referring physicians. Efficient turnaround times are crucial for patient satisfaction and maintaining strong relationships with referring providers.

 

4. Report Accuracy: Assess the accuracy of diagnostic reports through quality assurance programs and peer review processes. This KPI ensures that the interpretations provided are reliable and meet industry standards.

 

5. Revenue and Profitability: Monitor financial performance, including revenue generated and profitability. Track revenue per modality or service to identify areas of high profitability and potential growth opportunities.

 

6. Equipment Utilization: Measure the utilization of imaging equipment to ensure optimal use and identify any underutilized resources. This KPI helps with equipment maintenance and replacement planning.

 

7. Patient Satisfaction: Gather patient feedback through surveys or online reviews to assess patient satisfaction levels. This KPI helps identify areas for improvement in patient experience and enables you to make necessary adjustments.

 

8. Referring Physician Satisfaction: Gauge the satisfaction levels of referring physicians through surveys or direct communication. Building strong relationships with referring physicians is essential for business growth and success.

 

9. Staff Productivity: Assess the productivity of radiologists and other staff members by tracking the number of cases read or procedures performed per day or hour. This KPI helps optimize staffing levels and workload distribution.

 

10. Market Share: Evaluate your business’s market share by comparing it to competitors in your region. This KPI provides insights into your market position and growth potential.

 

Without a doubt, the selection of specific KPIs will vary based on the goals and objectives of your radiology business. However, we know for sure that what isn’t measured can’t be improved, so regularly monitoring your specific KPIs and using the insights gained to drive improvements can help optimize your operations and achieve business growth.
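As a concrete example, turnaround time (KPI 3) can be computed directly from exam-completion and report timestamps; the sketch below is illustrative:

```python
from datetime import datetime

def mean_turnaround_hours(exams):
    """Average hours from exam completion to signed report.

    `exams` is a list of (completed_at, reported_at) datetime pairs.
    """
    total = sum((reported - completed).total_seconds()
                for completed, reported in exams)
    return total / len(exams) / 3600
```

Tracking this number per modality or per referring provider makes it easy to spot where reporting delays accumulate.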

August 15, 2023 by Rubén García

DISS and deepc bring Cutting-Edge AI to Puerto Rico & Latin America

  • deepc and DISS announce their partnership
  • This partnership will be the bridge for Puerto Rico, Dominican Republic,

Guatemala, Honduras, and Jamaica healthcare professionals to work with cutting-edge AI technology and ultimately provide better patient care.

Munich, Germany, July 28th, 2023

DISS, a company founded by professionals with extensive experience in the market of medical diagnostic imaging and with the mission of bringing the best medical solutions to healthcare professionals in the Latin world, announces its partnership with deepc, the leading AI Operating System for radiologists. deepc has created the first vendor-neutral cloud-to-cloud connection for integrating AI-based imaging and reporting tools.

This partnership enables easy access to cloud technologies and artificial intelligence, benefiting Latin American health professionals. With a single integration to the PACS, medical institutions can select from all of deepc’s curated AI tools and easily leverage them in their clinical workflows. The deepc integration significantly reduces the technical effort and the required legal agreements for data protection.

“Unquestionably, artificial intelligence is the future; we are very proud to partner with deepc and be pioneers in implementing artificial intelligence programs in our territories. We are both technologically advanced companies with similar values of innovation, quality and a sense of urgency for excellent customer service,” said Rubén A. García, Chief Technology Officer at DISS, in recognition of this new and exciting partnership.

One of the key benefits of this collaboration is the integration of AI classifiers into the radiology workflow. This Operating System will support physicians in reporting X-rays, MR, or CT scans, helping them prioritize cases and shortening the time from emergency room admission to discharge, providing for efficient allocation of scarce staff resources and supporting better medical care. The result is exceptional accuracy and efficiency, which leads to better patient outcomes.

From a technical perspective, less IT effort and maximum data protection through central integration enable radiology departments to adopt multiple AI solutions with minimal required resources. Only a one-time integration via a standard interface is required, with data stored exclusively in an encrypted form. This multi-level security concept ensures data protection of sensitive patient data at all times.

John Moulden, CCO of deepc, stated, “We are very pleased to enter this partnership with DISS to empower healthcare professionals in Puerto Rico and LATAM with the cutting-edge technology of globally leading AI solutions for more than 50 clinical indications”.

Join us on our mission to save lives and revolutionize the future of healthcare.

About DISS

Since its founding in 2001, DISS has prided itself in being a company with extensive experience in the sales, installation, distribution and service of specialized medical solutions for radiology and cardiology, as well as in offering the most advanced Digital Solutions for the healthcare field. DISS locations include Puerto Rico, the Dominican Republic, Guatemala, Honduras, Jamaica and the Virgin Islands, supported by local teams of specialists and service engineers.

About deepc

deepc has developed the radiology AI platform deepcOS, providing clinicians with easy access to an ecosystem of regulatory-approved, globally leading AI solutions for more than 50 clinical indications. deepc offers easy installation, with one contract, billing, service, and support framework in compliance with all data protection and cybersecurity requirements.

Press contact deepc:

Nerilda Meda I Marketing Manager

nerilda.meda@deepc.ai 

www.deepc.ai LinkedIn/company/deepchealth


July 26, 2023 by Rubén García

Challenges For Radiology Centers

Radiology is not exempt from the challenges that all businesses face. Inflation, a lack of qualified personnel, and reimbursement pressures are some of the challenges that a radiology business faces. However, growing a radiology business in the face of current challenges is not impossible, as long as you take a strategic approach and execute well on several key strategies:

1. Enhance Technology and Equipment: Stay up to date with the latest technological advancements in radiology. Investing in modern equipment and software can improve efficiency, accuracy, and patient satisfaction. Consider acquiring advanced imaging modalities or upgrading your existing equipment to offer a wider range of diagnostic services.

2. Expand Service Offerings: Identify potential service gaps or areas of high demand in your region. Consider adding specialized services such as interventional radiology, nuclear medicine, or musculoskeletal imaging to broaden your offerings. Collaborate with referring physicians and hospitals to understand their needs and develop tailored services accordingly.

3. Strengthen Referral Relationships: Establish strong relationships with referring physicians and hospitals in your area. Enhance communication channels, collaborate on patient care, and provide prompt reports and consultations. Regularly meet with physicians to understand their evolving needs and explore opportunities for collaboration.

4. Improve Patient Experience: Focus on delivering exceptional patient experiences. Streamline appointment scheduling, reduce wait times, and ensure clear and effective communication. Provide a comfortable and welcoming environment within your facility. Invest in patient-centric initiatives such as patient portals, online appointment booking, and transparent billing processes.

5. Collaborate with Research Institutions: Partner with research institutions or universities to participate in clinical trials or research projects. Engaging in cutting-edge research can enhance your reputation, attract talented radiologists, and open doors for collaboration with academic institutions.

6. Network with Industry Professionals: Attend radiology conferences, seminars, and workshops to network with industry professionals. Stay updated on the latest trends and developments in radiology and build connections that may lead to new opportunities, partnerships, or collaborations.

7. Monitor Key Performance Indicators (KPIs): Regularly track and analyze key performance indicators such as patient volume, referral patterns, revenue, and turnaround times. Use this data to identify areas for improvement, make informed business decisions, and measure the success of your growth strategies.

8. Develop a Strong Online Presence: In today’s digital age, having a robust online presence is crucial for business growth. Create an informative and user-friendly website that highlights your services, expertise, and patient testimonials. Optimize your website for search engines to improve visibility. Utilize social media platforms to engage with patients and share educational content.

Remember that all the right strategies take time and persistence. Nothing happens overnight, so don’t be discouraged by a lack of immediate results: by adopting these strategies, continuously adapting to market changes, and providing excellent patient care, you will be positioned for success.

May 3, 2023 by Rubén García

Potential Benefits of AI in CT Interpretation

Computed Tomography (CT) scans are an important tool for radiologists in diagnosing and treating a wide range of medical conditions. These scans generate detailed images of the body’s internal structures, which can be difficult for human radiologists to interpret accurately in some cases. To help overcome this challenge, advanced AI tools are being developed to assist radiologists in the interpretation of CT scans.

One such tool is machine learning algorithms that can identify patterns and anomalies in CT scans that may be difficult for human radiologists to detect. For example, these algorithms can be trained to identify small lung nodules, which can be early signs of lung cancer. The algorithms can also analyze the shape, size, and texture of nodules to predict their malignancy, helping radiologists make more accurate diagnoses.

Another advanced AI tool used in the interpretation of CT scans is deep learning algorithms. These algorithms use artificial neural networks to analyze CT images and identify patterns that may be missed by human radiologists. One example is the use of deep learning algorithms to identify and classify liver lesions. This technology can help radiologists make faster and more accurate diagnoses, improving patient outcomes.

AI-powered software can also assist in the segmentation of CT images, which involves separating the image into different regions of interest. This technology can help radiologists identify and analyze specific structures or abnormalities in CT scans. For example, AI-powered software can segment the liver in a CT scan to help radiologists detect and classify liver lesions more accurately.

Natural Language Processing (NLP) is another advanced AI tool that can help radiologists in the interpretation of CT scans. NLP algorithms can analyze radiology reports generated from CT scans and extract key information, such as diagnoses and findings. This technology can help radiologists identify patterns and trends in patient data, improving patient care and outcomes.

Finally, AI-powered software can assist radiologists in generating reports from CT scans. This technology can automatically identify and summarize key findings in CT scans, reducing the time it takes for radiologists to generate reports. This can help improve workflow and reduce wait times for patients.

In conclusion, advanced AI tools are being developed to assist radiologists in the interpretation of CT scans. These tools include machine learning algorithms, deep learning algorithms, segmentation software, natural language processing, and report generation software. These technologies can help improve the accuracy and speed of CT scan interpretation, leading to better patient outcomes.
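As a toy illustration of the NLP idea, finding terms can be spotted in report text as below; the term list is our own assumption, and a production system would add trained models, negation handling, and a full clinical vocabulary:

```python
import re

# Illustrative finding terms only; real clinical NLP uses trained models,
# negation detection, and comprehensive terminologies.
FINDING_TERMS = ["nodule", "lesion", "mass", "calcification"]

def extract_findings(report_text):
    """Return the finding terms mentioned in a radiology report."""
    text = report_text.lower()
    return [t for t in FINDING_TERMS if re.search(r"\b" + t + r"s?\b", text)]
```

Even this crude keyword spotting hints at how structured information can be pulled from free-text reports for trend analysis.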