Welcome to RAIC Foundry, your platform for building robust, scalable vision AI solutions.

Create a RAIC Account

  1. Fill out the sign-up form at raiclabs.com/foundry and submit.
  2. You will receive a welcome email from RAIC Labs. Don't forget to check your spam folder!
  3. Set your password via the link in the welcome email.
  4. Access the portal at synthetaic.raic.ai.

CLI and SDK Installation

In addition to the portal, RAIC Foundry provides a CLI and a Python SDK, both of which are distributed in a single pip package.

  • The recommended Python version is 3.12.
  • Working within a virtual environment is highly recommended.

To install the raic-foundry package (shown here in a fresh virtual environment on macOS or Linux):
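
    python3.12 -m venv .venv          # isolated environment (Python 3.12 is recommended)
    source .venv/bin/activate         # on Windows: .venv\Scripts\activate
    pip install raic-foundry          # installs both the CLI and the Python SDK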

Ingest Data

From the Foundry Portal:

Supported formats are standard imagery (.jpg, .png, .bmp, .gif) and GeoTIFF (.tif).

  1. Navigate to Data Sources
  2. Click on the Create A New Data Source button
  3. Enter your data source name
  4. (Coming Soon): Upload a zip file of your imagery using the Select A File button
  5. Click the Create button and you will be redirected to the data source details page
  6. On this page you can use the Azure Blob Storage information to run azcopy commands that upload your imagery (see the example below)
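
An azcopy upload typically looks like the following; the local folder name is illustrative, and the destination URL and SAS token come from the data source details page:

    # Recursively copy a local folder of imagery into the data source's
    # blob container. Replace the placeholders with the values shown on
    # the details page.
    azcopy copy "./my-imagery" \
      "https://<storage-account>.blob.core.windows.net/<container>?<sas-token>" \
      --recursive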

From the Foundry CLI:

Supported imagery formats:
  • Archive files (.zip, .tar, .bz2, .gz, .xz) will be unpacked
  • Geospatial raster files (all single-file formats supported by GDAL; multi-file formats are not yet supported) will be reprojected to EPSG:4326 GeoTIFF (.tif)
  • GeoTIFF (.tif) files larger than 9792px in width or height will be split into tiles of at most 9792px
  • Standard imagery formats (.jpg, .png, .bmp, .gif) are ingested unchanged
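
As a sketch of what CLI ingestion might look like (the executable name, subcommand, and flag below are assumptions, not the documented interface):

    # HYPOTHETICAL: consult the CLI's built-in help for the real command names.
    raic datasource create --name "My Imagery" ./my-imagery.zip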

From the Foundry SDK:

Supported imagery formats and transformations are identical to the Foundry CLI list above.
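
A minimal SDK sketch; the import path, class, and method names are assumptions about the raic-foundry API, not confirmed signatures:

    # HYPOTHETICAL sketch -- the module, class, and method names below are
    # assumptions about the raic-foundry SDK; consult the SDK docs for the
    # real interface.
    from raic.foundry.datasources import DataSource  # assumed import path

    # Create a data source and upload a local folder of imagery.
    data_source = DataSource.create(name="My Imagery", path="./my-imagery")
    print(data_source.id)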

Creating Inference Runs to Generate Object Detections and RAIC Embeddings

From the Foundry Portal:
  1. Navigate to Inference Runs
  2. Click the Run button and the Run Inference wizard will appear
  3. Leave the Domain set to Image
  4. Select an existing Data Source from the dropdown, or type into the text box to autocomplete
    • To create a new Data Source, click on the Plus icon
    • Click the Next button
  5. Select a Universal Detector model. This model performs generalized object detection so that object embeddings can be generated
    • Click the Next button
    • Set the model's IOU, Confidence, and Max Detects thresholds, or keep the suggested defaults
    • Click the Next button
  6. Select a Vectorizer model. This model performs the vectorization into the RAIC embedding space
    • Click the Next button
  7. There is no need to pick a RAIC Vision Model in this case; click Next
  8. Enter a Name for this inference run or keep the suggested default
  9. Click the Create button

From the Foundry CLI:
  • The name, core-model, and vectorizer parameters can optionally be included; otherwise you will be prompted for them
  • When specifying the models, either the name (if unique) or the UUID of the model can be used
  • There are no prompts for iou, confidence, or max detects, so any values other than the model defaults must be passed as arguments (see the sketch below)
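
A hypothetical invocation; the executable name and subcommand are assumptions, and the flag spellings simply mirror the parameters described above:

    # HYPOTHETICAL: the executable, subcommand, and exact flag spellings are
    # assumptions. Models and data sources may be referenced by name (if
    # unique) or by UUID.
    raic inference run \
      --data-source "My Imagery" \
      --name "my-first-run" \
      --core-model "Universal Detector" \
      --vectorizer "baseline-vectorizer" \
      --iou 0.5 --confidence 0.25 --max-detects 50
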
From the Foundry SDK:
  • The name, universal detector model, and vectorizer parameters can optionally be included; otherwise the default models will be used
  • When specifying the data source, either the name (if unique) or the UUID of the data source can be used
  • When specifying the models, either the name (if unique) or the UUID of the model can be used
  • For the Universal Detector model, if no iou, confidence, or max detects values are specified, the model's defaults will be used (see the sketch below)
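
A minimal sketch, again with assumed (not documented) import paths, class names, and keyword arguments derived from the parameters above:

    # HYPOTHETICAL sketch -- every name below is an assumption about the
    # raic-foundry SDK, not its documented API.
    from raic.foundry.inference import InferenceRun  # assumed import path

    run = InferenceRun.new(
        name="my-first-run",          # optional; a default name is generated
        data_source="My Imagery",     # name (if unique) or UUID
        universal_model=None,         # None -> default Universal Detector model
        vectorizer=None,              # None -> default vectorizer model
        iou=0.5,                      # omit iou/confidence/max_detects to use
        confidence=0.25,              # the Universal Detector model's defaults
        max_detects=50,
    )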

For additional documentation, please refer to the Foundry SDK and API docs.

For support, contact support@raiclabs.com.