About external models
Many foundation model suite workflows can use third-party external services to run inference over your data. For more information about how to use these models, see Using external models.
Available external providers and models
Predictive use case support
| Provider | Model Type(s) |
|---|---|
| Amazon SageMaker | Text2Text |
| Azure Machine Learning | Text2Text |
| Azure OpenAI | Text2Text |
| Hugging Face | Text2Text, QA, DocVQA |
| OpenAI | Text2Text |
| Vertex AI | Text2Text |
| Custom inference (OpenAI API specification) | Text2Text |
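The last row covers custom inference services that expose an endpoint compatible with the OpenAI API specification. As a rough illustration only (not Snorkel Flow configuration), the sketch below shows the kind of chat-completions request such an endpoint is expected to accept; the URL, API key, and model name are placeholders.

```python
# A minimal sketch of the request shape an OpenAI-compatible Text2Text endpoint
# is expected to handle. The URL, API key, and model name are placeholders.
import requests

ENDPOINT_URL = "https://your-inference-service.example.com/v1/chat/completions"  # placeholder
API_KEY = "your-api-key"  # placeholder

payload = {
    "model": "your-model-name",  # placeholder model identifier
    "messages": [
        {"role": "system", "content": "You are a document classifier."},
        {"role": "user", "content": "Classify the sentiment of: 'The service was excellent.'"},
    ],
    "temperature": 0.0,
}

response = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
# OpenAI-style responses return generated text under choices[0].message.content.
print(response.json()["choices"][0]["message"]["content"])
```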
Generative use case support
| Provider | Model Type(s) | Management interface |
|---|---|---|
| Amazon SageMaker | Text2Text | SDK only |
Supported model types
Snorkel Flow supports prompting for specific model types across external model providers, covering a variety of use cases. The table below lists only models that the Snorkel team has tested; endpoints with specifications compatible with these providers can be swapped in as well.
| Model Type | Supported Models | Supported Applications |
|---|---|---|
| Text2Text | Hugging Face Text2Text and Text Generation models<br>OpenAI chat models<br>Azure OpenAI chat models<br>Azure Machine Learning Text2Text and Text Generation models<br>Amazon SageMaker Text2Text and Text Generation models<br>Vertex AI PaLM and Gemini models<br>Custom inference service Text2Text models | Text Classification<br>Sequence Tagging<br>PDF Extraction |
| QA | Hugging Face Question Answering models | Sequence Tagging |
| DocVQA | Hugging Face Document Question Answering models | PDF Extraction |
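To make the model types concrete, the sketch below uses the Hugging Face transformers library to run one model of each type locally. The checkpoints named here are common public examples chosen for illustration, not models tested or configured by Snorkel Flow, and the DocVQA pipeline assumes an OCR backend such as pytesseract is installed.

```python
# A minimal sketch of the three supported model types using Hugging Face
# transformers pipelines. Model checkpoints are illustrative assumptions.
from transformers import pipeline

# Text2Text: free-form generation from a prompt (used for Text Classification,
# Sequence Tagging, and PDF Extraction prompts).
text2text = pipeline("text2text-generation", model="google/flan-t5-base")
print(text2text("Classify the sentiment: The service was excellent.")[0]["generated_text"])

# QA: extract an answer span from a provided context (used for Sequence Tagging).
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")
print(qa(question="Who wrote the report?", context="The report was written by Jane Doe in 2021."))

# DocVQA: answer questions about a document image (used for PDF Extraction).
docvqa = pipeline("document-question-answering", model="impira/layoutlm-document-qa")
print(docvqa(image="path/to/page.png", question="What is the invoice total?"))
```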
What's next?
You can now use an external model by linking external services and configuring an external model endpoint.