Helper AI output in LOINC
Helper AI gets even more helpful! Our helper AI now outputs in the standardized Logical Observation Identifiers Names and Codes (LOINC) nomenclature. LOINC provides a set of universal names and ID codes for identifying clinical information, which helps facilitate communication, generalizability, and testing.
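To illustrate the idea, here is a minimal sketch of what LOINC-coded output can look like. The LOINC codes in the lookup table are real, but the function, field names, and output format are hypothetical illustrations, not MD.ai's actual schema:

```python
# Hypothetical sketch: wrapping raw model findings in LOINC codes.
# The codes below are real LOINC entries; the schema is illustrative only.
LOINC_CODES = {
    "heart rate": "8867-4",   # Heart rate
    "hemoglobin": "718-7",    # Hemoglobin [Mass/volume] in Blood
    "glucose": "2345-7",      # Glucose [Mass/volume] in Serum or Plasma
}

def to_loinc(finding: str) -> dict:
    """Return the finding as a record with its universal LOINC identifier."""
    code = LOINC_CODES.get(finding.lower())
    return {"finding": finding, "loinc_code": code}

print(to_loinc("Heart rate"))  # {'finding': 'Heart rate', 'loinc_code': '8867-4'}
```

Because every system reading this output resolves `8867-4` to the same concept, downstream consumers do not need to parse free-text finding names.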
MD.ai’s robust model support allows easy deployment, validation, and testing of models. As described here, you can easily upload any model of your choice, run it on your MD.ai dataset, and visualize the predictions in a simple, user-friendly interface. You can even copy model outputs over to annotations and edit them as needed for easy AI-assisted annotating. In addition to our proprietary Helper AI models, there is also a library of pre-trained public models available for 1-click deployment in MD.ai, which you can learn about here.
Navigate while cine is on
Navigate while cine is on! You can now navigate to different exams and series while cine is active. For example, halfway through viewing an ultrasound video you can jump to a different video – no need to press pause first.
We improved our documentation – for example, learn how to run exported models locally here.
Continue to modify label scope as desired! We fixed a bug that prevented changing a label's scope from global to local.
Annotations tied to a dataset are now deleted along with the dataset, so your next .json annotation export will contain only relevant data.
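Concretely, the effect of this fix can be sketched with a simplified, hypothetical export structure (the field names below are illustrative, not MD.ai's actual .json schema):

```python
import json

# Hypothetical, simplified export structure. Each annotation
# references the dataset it belongs to via "datasetId".
export = {
    "datasets": [{"id": "D_1", "name": "Chest CT"}],
    "annotations": [
        {"id": "A_1", "datasetId": "D_1"},
        {"id": "A_2", "datasetId": "D_deleted"},  # orphan from a deleted dataset
    ],
}

# With the fix, annotations whose dataset no longer exists are removed,
# so the export only contains annotations tied to existing datasets.
live_ids = {d["id"] for d in export["datasets"]}
export["annotations"] = [a for a in export["annotations"] if a["datasetId"] in live_ids]

print(json.dumps(export["annotations"]))  # only A_1 remains
```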