Model Inferences
Inference sets include all of your model predictions for your dataset
Overview
Within Aquarium, we refer to model results and predictions as inferences. These inferences are the outputs of a model that was trained on the corresponding ground truth values, so they relate directly back to your labeled data. We can upload ground truth labels and inferences for the same image, point cloud, etc., which allows us to compare the results between our ground truth labels and our inferences.
When working with Aquarium, for each model inference we create an object called an InferencesFrame and assign it to an Inferences set.
An InferencesFrame contains all of the information related to model predictions: predicted bounding boxes, predicted classifications, and all inference metadata.
For real examples of uploading inference data, please look at our quickstart guides!
Prerequisites to Uploading Model Inferences
To ensure the following steps go smoothly, this guide assumes you already:
Have access to your model predictions, including their:
Confidence scores
Classifications
To view your data once it has been uploaded, make sure you have selected and set up the appropriate data sharing method for your team.
Creating and Formatting Your Inference Data
To ingest your model inferences, there are two main objects you'll work with: InferencesFrame and Inferences.
Assuming a Project and a LabeledDataset have been created in Aquarium, let's also upload your model inferences. Inferences, like labels, must be matched to a frame within the dataset. For each LabeledFrame in your dataset, we will create an InferencesFrame and then assign the appropriate inferences to that InferencesFrame.
Then, we add those InferencesFrames to the Inferences object in order to upload your model inferences into Aquarium. This usually means looping through your data, creating an InferencesFrame for each frame, and adding it to the Inferences object.
If you have generated your own embeddings and want to use them during your inference data uploads, please also see this section for additional guidance!
When defining an inferences frame, it is important that you use the same frame_id as your labeled frame. Aquarium associates labels with inferences by their corresponding frame ID, so we define a new InferencesFrame object using the frame_id that was used to create the LabeledFrame. Think of an InferencesFrame as a container for your model inferences.
Defining these objects looks like this:
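The sketch below assumes the client library is imported as al and that frame_id matches an existing LabeledFrame in your dataset; see the API docs for the full set of constructor options.

```python
import aquariumlearning as al

# Use the same frame_id that was used for the corresponding LabeledFrame
# so Aquarium can associate these inferences with the right labeled frame.
frame_id = "frame_0001"
inf_frame = al.InferencesFrame(frame_id=frame_id)
```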
Once you've defined your frame, we need to associate some data with it!
Adding Model Inferences to Your Inference Frame
Like labeled datasets, the Inferences object is also made of frames of type InferencesFrame, which contain inferred values.
Inference frames can contain zero or more inference objects. The type of prediction, such as bounding boxes or classifications, depends on the type of ML task for your project.
Adding data to an InferencesFrame usually corresponds 1:1 with a label format, with the addition of a confidence parameter.
For example, this is what it looks like when you add a bounding box label to a 2D image:
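This sketch follows the pattern from the quickstart guides; treat the exact argument names (for example sensor_id) as illustrative and confirm them against the API docs for your client version.

```python
# Ground truth: a 2D bounding box label added to a LabeledFrame
labeled_frame.add_label_2d_bbox(
    sensor_id="camera_0",        # the image sensor the box was drawn on
    label_id="frame_0001_gt_0",  # unique ID for this label
    classification="car",
    top=100,
    left=250,
    width=80,
    height=60,
)
```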
And this is what it looks like to add a bounding box that represents the model inference to a 2D image:
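The inference version below uses the same bounding box fields plus a confidence value; again, the exact argument names are illustrative and worth confirming against the API docs.

```python
# Model prediction: same bounding box fields, plus a confidence score
inf_frame.add_inference_2d_bbox(
    sensor_id="camera_0",
    label_id="frame_0001_inf_0",  # unique ID for this inference
    classification="car",
    top=102,
    left=248,
    width=84,
    height=58,
    confidence=0.87,              # model confidence for this prediction
)
```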
There is quite a bit of overlap between adding data to a LabeledFrame and an InferencesFrame, and you can see all of the different options in the API docs!
Putting It All Together
Now that we've discussed the general steps for adding inference/prediction data, here is an example of what this would look like for a 2D object detection task (you can view the quickstart guide for more details on the dataset used below):
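The sketch below assumes your predictions live in a hypothetical my_predictions dict keyed by frame ID; adapt the field names to however your predictions are actually stored, and check the API docs for the exact method signatures.

```python
import aquariumlearning as al

al_client = al.Client()
al_client.set_credentials(api_key="YOUR_API_KEY")

inferences = al.Inferences()

# my_predictions is a hypothetical structure:
# {frame_id: [{"class": ..., "top": ..., "left": ..., "width": ...,
#              "height": ..., "confidence": ...}, ...]}
for frame_id, preds in my_predictions.items():
    # Reuse the frame_id from the corresponding LabeledFrame
    inf_frame = al.InferencesFrame(frame_id=frame_id)

    for i, pred in enumerate(preds):
        inf_frame.add_inference_2d_bbox(
            sensor_id="camera_0",
            label_id=f"{frame_id}_inf_{i}",
            classification=pred["class"],
            top=pred["top"],
            left=pred["left"],
            width=pred["width"],
            height=pred["height"],
            confidence=pred["confidence"],
        )

    inferences.add_frame(inf_frame)
```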
Uploading Your Inferences Set
Now that we have everything all set up, let's submit your new inference dataset to Aquarium!
As a reminder, Aquarium does some processing of your data after it is submitted, like indexing metadata and possibly calculating embeddings, so you may see a delay before your inferences show up in the UI. You can view some examples of what to expect, as well as tips for troubleshooting your upload, here!
Submitting Your Dataset
You can submit your Inferences set to be uploaded into Aquarium by calling .create_inferences().
To spot check your data immediately, you can set the preview_first_frame flag to True. This prints a link in the console to a preview frame, which allows you to make sure your data and inferences look right.
This is an example of what the create_inferences() call will look like:
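The sketch below assumes the project and dataset names you used when uploading your LabeledDataset, plus an inferences_id of your choosing; confirm the exact parameter names against the API docs.

```python
al_client.create_inferences(
    "your_project_name",
    "your_dataset_name",
    inferences=inferences,
    inferences_id="model_v1_inferences",
    preview_first_frame=True,  # prints a preview link to spot check a frame
)
```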
After kicking off your inferences upload, it can take anywhere from minutes to multiple hours depending on your dataset size.
You can monitor your uploads under the "Streaming Uploads" tab in the project view. Here is a guide on how to find that page.
Once the upload has completed, you'll be able to see your dataset details on the specific Project page in Aquarium, and on the righthand side you can see all of your available inferences.
Quickstart Examples
For examples of how to upload labeled datasets and inferences, check out our quickstart examples.