Python Client Library

Warning

This library is in active development and currently pre-alpha. The API may change significantly in future releases.

Tutorials with Jupyter Notebooks

To get started, we recommend looking at the Jupyter notebooks we have prepared. The following notebooks show how to classify chest vs. abdomen X-rays using TensorFlow/Keras with TFRecords, and using the fast.ai library, which is based on PyTorch.

Chest/Abdomen X-Ray images classification using different deep learning libraries

Chest/Abdomen X-Ray Annotator Project URL: https://public.md.ai/annotator/project/PVq9raBJ/workspace
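
The lessons begin by downloading a project's images and annotations with the client library. A minimal sketch of that first step, assuming a personal access token generated from your MD.ai user settings; the token and output path below are placeholders, and the project ID is the public Chest/Abdomen project above:

```python
import mdai

# Connect to the public MD.ai domain; replace the placeholder token
# with one generated under your MD.ai user settings.
mdai_client = mdai.Client(domain='public.md.ai', access_token='YOUR_ACCESS_TOKEN')

# Download the images and annotations for the Chest/Abdomen project
# (project ID PVq9raBJ, from the URL above) into a local directory.
p = mdai_client.project('PVq9raBJ', path='./lesson1-data')
```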

Introduction to deep learning for medical imaging lessons

Additionally, we created Jupyter notebooks covering the basics of using the client library to download and parse annotation data, and to train and evaluate different deep learning models for classification, semantic and instance segmentation, and object detection problems in the medical imaging domain. A typical download-and-parse flow is sketched after the lesson list below.

See the links for these lessons below.

  • Lesson 1. Classification of chest vs. abdominal X-rays using TensorFlow/Keras Github Annotator
  • Lesson 2. Lung X-Rays Semantic Segmentation using UNets. Github Annotator
  • Lesson 3. RSNA Pneumonia detection using the Kaggle data format Github Annotator
  • Lesson 3. RSNA Pneumonia detection using the MD.ai Python client library Github Annotator
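
After the download step above, the lessons map annotation labels to integer classes, select a dataset, and prepare it for training. A hedged sketch of that flow; the label and dataset IDs below are hypothetical placeholders, and the real ones can be listed with `show_label_groups()` and `show_datasets()`:

```python
# Inspect the project's label groups and datasets to find their IDs.
p.show_label_groups()
p.show_datasets()

# Map annotation label IDs to integer class values; these IDs are
# hypothetical placeholders copied from show_label_groups() output.
labels_dict = {
    'L_abdomen': 0,  # abdomen X-ray
    'L_chest': 1,    # chest X-ray
}
p.set_labels_dict(labels_dict)

# Select a dataset by ID (also a placeholder) and prepare it, which
# associates each image ID with its class annotations for training.
dataset = p.get_dataset_by_id('D_example')
dataset.prepare()
image_ids = dataset.get_image_ids()
```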

Running Jupyter notebooks on Google Colab

It's easy to run a Jupyter notebook on Google's Colab with free (time-limited) GPU use. For example, you can open a Github-hosted notebook directly: go to https://colab.research.google.com/notebook, select the "GITHUB" tab, and add the Lesson 1 URL: https://github.com/mdai/ml-lessons/blob/master/lesson1-xray-images-classification.ipynb

To use the GPU, in the notebook menu go to Runtime -> Change runtime type, switch to Python 3, and turn on GPU. See more Colab tips.
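
To confirm the GPU runtime is active, you can run a quick check inside the notebook. A minimal example, assuming the Colab runtime's preinstalled TensorFlow 2.x:

```python
import tensorflow as tf

# Lists the GPUs visible to TensorFlow; an empty list means the
# notebook is still running on a CPU-only runtime.
print(tf.config.list_physical_devices('GPU'))
```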

Advanced: How to run on Google Cloud Platform with Deep Learning Images

You can also run the notebooks with more powerful GPUs on the Google Cloud Platform. In this case, you need to authenticate to the Google Cloud Platform, create a private virtual machine instance running one of Google's Deep Learning images, and import the lessons. See the instructions below.

GCP Deep Learning Images How-To