Version: 0.95

About external models

Many foundation model suite workflows can use third-party external services to run inference over your data. For more information about how to use these models, see Using external models.

Available external providers and models

Predictive use case support

| Provider | Model Type(s) |
|---|---|
| Amazon SageMaker | Text2Text |
| Azure Machine Learning | Text2Text |
| Azure OpenAI | Text2Text |
| Hugging Face | Text2Text, QA, DocVQA |
| OpenAI | Text2Text |
| Vertex AI | Text2Text |
| Custom inference (OpenAI API specification) | Text2Text |
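
The custom inference option in the table above accepts any endpoint that follows the OpenAI API specification. As a rough illustration of what such an endpoint must accept, the sketch below sends a chat completion request with the openai Python client pointed at a hypothetical self-hosted base URL and model name; the URL, API key, and model identifier are placeholders, not values from this documentation.

```python
from openai import OpenAI

# Hypothetical OpenAI-compatible endpoint; replace the base URL, API key,
# and model name with the values for your own inference service.
client = OpenAI(
    base_url="https://inference.example.com/v1",  # placeholder URL
    api_key="YOUR_API_KEY",                       # placeholder credential
)

# A minimal chat completion request, the core call defined by the
# OpenAI API specification for Text2Text-style prompting.
response = client.chat.completions.create(
    model="my-custom-text2text-model",  # placeholder model identifier
    messages=[{"role": "user", "content": "Summarize this contract clause: ..."}],
)

print(response.choices[0].message.content)
```

Any service that responds correctly to this request shape can be used as a custom inference endpoint.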

Generative use case support

| Provider | Model Type(s) | Management interface |
|---|---|---|
| Amazon SageMaker | Text2Text | SDK only |

Supported model types

Snorkel Flow supports prompting for specific model types across external model providers to cover a variety of use cases. The list below includes only models that the Snorkel team has tested; endpoints whose specifications are compatible with these providers can also be swapped in.

| Model Type | Supported Models | Supported Applications |
|---|---|---|
| Text2Text | Hugging Face Text2Text and Text Generation models<br/>OpenAI chat models<br/>Azure OpenAI chat models<br/>Azure Machine Learning Text2Text and Text Generation models<br/>Amazon SageMaker Text2Text and Text Generation models<br/>Vertex AI PaLM and Gemini models<br/>Custom inference service Text2Text models | Text Classification<br/>Sequence Tagging<br/>PDF Extraction |
| QA | Hugging Face Question Answering models | Sequence Tagging |
| DocVQA | Hugging Face Document Question Answering models | PDF Extraction |
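
To make the distinction between these model types concrete, the sketch below runs one Hugging Face model of each type locally through the transformers pipeline API to show what each model type expects and returns; in Snorkel Flow these models are served from the external providers listed above. The model checkpoints and the input file name are illustrative choices, not values prescribed by Snorkel Flow, and the DocVQA pipeline additionally requires an OCR backend such as pytesseract.

```python
from transformers import pipeline

# Text2Text: free-form generation from a prompt (e.g. classification or
# extraction phrased as an instruction).
text2text = pipeline("text2text-generation", model="google/flan-t5-small")
print(text2text("Classify the sentiment of this review: 'The product arrived broken.'"))

# QA: extractive question answering over a provided context; the answer is
# a span of the context, which maps naturally onto sequence tagging.
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")
print(qa(
    question="Who is the lessee?",
    context="This lease is between Acme Corp (lessor) and Jane Doe (lessee).",
))

# DocVQA: question answering directly over a document image, used for
# PDF extraction. "invoice.png" is a placeholder file path.
docvqa = pipeline("document-question-answering", model="impira/layoutlm-document-qa")
print(docvqa(image="invoice.png", question="What is the invoice total?"))
```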

What's next?

You can now use an external model by linking external services and configuring an external model endpoint.