About MIDA
<a href="https://github.com/MIMBCD-UI/" title="Meta" target="_blank"> <img src="https://github.com/MIMBCD-UI/meta/blob/master/headers/breast_cancer_women.png" alt="header" /> </a>The goal of this work is to propose a new methodology for fully automated breast cancer detection and segmentation from multi-modal medical images introducing clinical covariates. This protocol provides the following novelties. First, it lies on the use of deep Convolutional Neural Networks (CNNs) that will incorporate several image modalities: 3D Magnetic Resonance Imaging (MRI), UltraSound (US) or MammoGraphy (MG) views. This is the first methodology that is able to classify a whole exam, containing all the above image modalities. In this work, we are interested to develop a new assistant (i.e. a system that will provide diagnosis support as a second opinion), for other human tissues, besides the breast. In the end, the work will embrace other anatomies such as the aorta, spinal column, epicardial fat, body fat, heart, lungs, and muscle.
Abstract
The goal of this research is the study, design, development, and evaluation of novel AI-based visual representations supported by intelligent agents for medical imaging diagnosis. For this purpose, we build on recent achievements in the accuracy of intelligent agents. To address the effects of varied visual representations, the intelligent agents are applied to the breast cancer domain across different levels of medical expertise and multiple clinical workflows. Despite the promise of assisting clinicians in the decision-making process, there are two initial challenges that this research aims to address: (i) the lack of available and curated medical data to be consumed by the AI algorithms; and (ii) the fact that medical professionals often find it challenging to understand how an AI system transforms their initial input into a final decision. The goal is to evaluate the impact of introducing several techniques through an intelligent agent, specifically by focusing on how multimodality and AI assistance can add value to the medical workflow within a User-Centered Design process. Indeed, this research provides a clinicians' evaluation and results for several high-level goals, namely: (a) understand clinicians' response to AI assistance; (b) find out how they interact with (and accept) these systems; and (c) discover how AI assistance affects the User eXperience (UX) of clinicians. A clinically oriented intelligent agent system can be achieved by mimicking the medical diagnosis process and looking for the presence of patient-relevant clues in breast cancer images. This approach (i.e., HAII, bringing the human and the AI together) tends to be easier for clinicians to adopt and leads to better patient outcomes. In the end, the research discusses the importance, as well as the pros and cons, of introducing intelligent agents into the medical imaging workflow. Nevertheless, the work is still in progress, and some goals have not yet been accomplished; these will be addressed as future work.
Research
As these developments are research work, we ask for the support of our scientific community to help us along the way. With this work, we hope to provide valuable information regarding our research topics and theories. You can also follow and support our research work on ResearchGate.
Introduction
The Medical Imaging Diagnosis Assistant (MIDA) research work involves the collaborative effort of two Portuguese research institutions: ISR/IST and ITI. This work proposes the development of a methodology for breast cancer detection and targeting using AI assistance for medical imaging technologies. In short, it proposes the development of an Assistant for the diagnosis of cancer pathologies based on medical imaging technologies.
Goals
The goal of this work is to research and develop an assistant supporting clinicians (e.g., physicians, doctors, or radiologists) in Medical Imaging (MI) diagnosis, in the context of Human-Computer Interaction (HCI) applied to Health Informatics (HI). In particular, we aim at collecting expert annotations at a large (big data) scale. To reach these goals, we need to study the user experience of our Interactive Machine Learning (iML) system as a Human-In-The-Loop (HITL) approach, while applying User Research (UR) methods, such as the User-Centered Design (UCD) approach, involving end users in the system development. We therefore study the working practice of clinicians to determine how their workflow can be adapted to a new recommendation system for medical imaging diagnosis. It is crucial to understand how our system can be introduced into the workflow of clinicians, helping them to make the patient's diagnosis effectively. Furthermore, we need to understand the impact of our system in terms of promoting accurate diagnosis and the right patient care.

The knowledge and findings from these studies, as well as from the State-Of-The-Art (SOTA), will allow us to create and evaluate new HCI and HI practices and to distil findings and guidelines for clinical experts to use. Moreover, understanding best practices and design patterns and establishing guidelines is impactful for the healthcare industry, as richer, more appealing, and well-designed recommendation systems are crucial to a solid foundation of HCI applied to HI. As a proof of concept, we intend to develop an Assistant for the monitoring and diagnosis of breast lesions across various MI modalities.
Imaging modalities to include in the work (see the sketch after this list) are:
- MammoGraphy images;
- CranioCaudal views;
- MedioLateral Oblique views;
- UltraSound images;
- MRI volumes.
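As a minimal, hypothetical Python sketch, the container below shows one way a single exam could group these modalities for downstream processing. The class name, field names, and array shapes are illustrative assumptions only, not the project's actual data model.

```python
# Illustrative only: a hypothetical container grouping the modalities of one
# exam so they can be handled together. Field names and shapes are assumptions.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

import numpy as np


@dataclass
class BreastExam:
    """All imaging data collected for a single patient exam."""
    patient_id: str
    # Mammography views keyed by view name, e.g. "CC" or "MLO".
    mg_views: Dict[str, np.ndarray] = field(default_factory=dict)
    us_images: List[np.ndarray] = field(default_factory=list)
    mri_volume: Optional[np.ndarray] = None  # 3D array (depth, height, width)

    def available_modalities(self) -> List[str]:
        """Report which of the listed modalities are present in this exam."""
        present = []
        if self.mg_views:
            present.append("MG")
        if self.us_images:
            present.append("US")
        if self.mri_volume is not None:
            present.append("MRI")
        return present


if __name__ == "__main__":
    exam = BreastExam(
        patient_id="anon-0001",
        mg_views={"CC": np.zeros((224, 224)), "MLO": np.zeros((224, 224))},
        us_images=[np.zeros((224, 224))],
    )
    print(exam.available_modalities())  # ['MG', 'US']
```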
Requirements
- HTML;
- CSS;
- JavaScript;
- Python;
- MATLAB.
Media
Advisorship
Advisor
Professor Jacinto C. Nascimento (ISR/IST)
Co-Advisor
Professor Nuno J. Nunes (M-ITI/IST)
Acknowledgements
Assistance provided by everyone involved is greatly appreciated. This work was partially supported by national funds through FCT and IST under the UID/EEA/50009/2013 R&D unit strategic plan and the BL89/2017-IST-ID grant.
Supporting
Our organization is a non-profit. However, we have many needs across our activities, from infrastructure to services, and we welcome time, contributions, and help to support our team and projects.
<span> <a href="https://opencollective.com/oppr" target="_blank"> <img src="https://opencollective.com/oppr/tiers/backer.svg" width="220"> </a> </span>Contributors
This project exists thanks to all the people who contribute. [Contribute].
Backers
Thank you to all our backers! 🙏 [Become a backer]
<span> <a href="https://opencollective.com/oppr#backers" target="_blank"> <img src="https://opencollective.com/oppr/backers.svg?width=890"> </a> </span>Sponsors
Support this project by becoming a sponsor. Your logo will show up here with a link to your website. [Become a sponsor]
<span> <a href="https://opencollective.com/oppr/sponsor/0/website" target="_blank"> <img src="https://opencollective.com/oppr/sponsor/0/avatar.svg"> </a> </span> <br /> <span> <a href="http://www.fct.pt/" title="FCT" target="_blank"> <img src="https://github.com/mida-project/meta/blob/master/brands/fct_footer.png?raw=true" alt="fct" width="20%" /> </a> </span> <span> <a href="https://www.fccn.pt/en/" title="FCCN" target="_blank"> <img src="https://github.com/mida-project/meta/blob/master/brands/fccn_footer.png?raw=true" alt="fccn" width="20%" /> </a> </span> <span> <a href="https://www.ulisboa.pt/en/" title="ULisboa" target="_blank"> <img src="https://github.com/mida-project/meta/blob/master/brands/ulisboa_footer.png?raw=true" alt="ulisboa" width="20%" /> </a> </span> <span> <a href="http://tecnico.ulisboa.pt/" title="IST" target="_blank"> <img src="https://github.com/mida-project/meta/blob/master/brands/ist_footer.png?raw=true" alt="ist" width="20%" /> </a> </span> <span> <a href="http://hff.min-saude.pt/" title="HFF" target="_blank"> <img src="https://github.com/mida-project/meta/blob/master/brands/hff_footer.png?raw=true" alt="hff" width="20%" /> </a> </span>