Once the build status for your model is Success, you can use the model to perform inference on the dataset. The steps are as follows:
- Select the dataset you want to use for inference if your project has multiple datasets (e.g. train, val, test).
- Select the model version you want to use. A tick mark will appear to the left of the chosen model.
- Select the scope of the inference task, i.e. whether you want the model to run on the entire dataset, a single exam, a series, or a single image.
- Click RUN INFERENCE and wait for the inference task to complete.
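The selections above can be sketched as an API-style request. Note that this is a purely illustrative sketch, not MD.ai's actual client interface; all names and fields here are hypothetical.

```python
# Hypothetical sketch mirroring the UI selections for an inference task.
# Field names and the request shape are illustrative assumptions only.

VALID_SCOPES = {"dataset", "exam", "series", "instance"}

def build_inference_request(dataset_id, model_version, scope, target_id=None):
    """Assemble an inference request from the selections made in the UI."""
    if scope not in VALID_SCOPES:
        raise ValueError(f"scope must be one of {sorted(VALID_SCOPES)}")
    if scope != "dataset" and target_id is None:
        raise ValueError("a target_id is required unless scope is 'dataset'")
    return {
        "dataset_id": dataset_id,        # which dataset to run on (e.g. train, val, test)
        "model_version": model_version,  # the selected model version
        "scope": scope,                  # dataset, exam, series, or instance
        "target_id": target_id,          # specific exam/series/instance, if any
    }

request = build_inference_request("D_train", "v2", "series", target_id="S_001")
```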
You will receive an email notification when your model task completes, along with a notification in the UI.
You can view your model's predictions under the Model Outputs tab. You can also hover over the predicted labels for more details about the model used, along with fine-grained analysis of the predictions such as class probabilities or Grad-CAM visualizations.
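Class probabilities like those shown on hover are typically derived from a model's raw outputs (logits) with a softmax. A minimal, generic illustration follows; this is not MD.ai's internal code, just the standard computation:

```python
import math

def softmax(logits):
    """Convert raw model logits into class probabilities that sum to 1.

    Subtracting the max logit first keeps the exponentials numerically stable.
    """
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Three-class example: the largest logit gets the highest probability.
probs = softmax([2.0, 1.0, 0.1])
```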
Model task resource navigation
Once a task runs on a given resource (dataset, exam, series, or instance), you can easily navigate to that specific resource in the workspace to compare its annotations and model predictions for better error analysis.
If the model task ran:
- On the whole dataset, the dataset information will be shown and you can click to navigate to the dataset.
- On an exam, the exam information will be shown along with its dataset. You can click the dataset item to navigate to the dataset, or click the exam item to directly navigate to the specific exam.
- On a series, the series information will be shown along with its dataset. You can click the dataset item to navigate to the dataset, or click the series item to directly navigate to the series.
- On an instance, the instance information will be shown along with its dataset. You can click the dataset item to navigate to the dataset, or click the instance item to directly navigate to the instance.
Alerts will be shown if the resources have been deleted or if you do not have view access to specific datasets based on your user assignment.
Upload and Run Inference
Instead of running models on existing resources on MD.ai, you can also quickly test models on external data that you upload, using our Upload and Run Inference feature. Select one or more models you want to test and upload images (maximum total file size is 500MB). Inference will then run automatically on these images, and you can view the results quickly and efficiently. The uploaded images are added to a Temporary Dataset that is automatically deleted after 60 minutes of inactivity.
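If you are scripting uploads, a simple client-side check against the 500MB total limit can avoid failed submissions. The sketch below assumes a binary (1024-based) megabyte; whether the platform counts MB as 1000-based or 1024-based is an assumption here:

```python
# Hypothetical pre-upload check against the 500MB total file-size limit.
MAX_TOTAL_BYTES = 500 * 1024 * 1024  # assumption: 1024-based megabytes

def can_upload(file_sizes_bytes):
    """Return True if the combined size of all files fits under the limit."""
    return sum(file_sizes_bytes) <= MAX_TOTAL_BYTES

can_upload([200 * 1024**2, 250 * 1024**2])  # True: 450 MB total
can_upload([300 * 1024**2, 300 * 1024**2])  # False: 600 MB total
```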