Supported OpenAI Models
You can import these large language models from Hugging Face, or models staged in an OCI Object Storage bucket, into OCI Generative AI, create endpoints for those models, and use them in the Generative AI service.
These models use an open-weight transformer architecture with a Mixture-of-Experts (MoE) design, optimized for efficient, high-quality language reasoning and large context handling. For more information, see GptOss in the Hugging Face documentation.
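For example, to stage a model for import, you can download its files from Hugging Face and upload them to an Object Storage bucket. The following is a minimal sketch, assuming the huggingface_hub and oci Python packages are installed and an OCI config file exists at the default location; the bucket name and object prefix are placeholders, not values defined by the service.

```python
# Minimal sketch: download an open-weight model from Hugging Face and upload
# its files to an OCI Object Storage bucket so an import can reference them.
# Bucket name and prefix below are placeholders for illustration.
import os

from huggingface_hub import snapshot_download
import oci

# Download the model files locally (gated or private repos also need a
# Hugging Face token).
local_dir = snapshot_download(repo_id="openai/gpt-oss-20b")

config = oci.config.from_file()
object_storage = oci.object_storage.ObjectStorageClient(config)
namespace = object_storage.get_namespace().data
upload_manager = oci.object_storage.UploadManager(object_storage)

bucket_name = "genai-imported-models"   # placeholder bucket
prefix = "openai/gpt-oss-20b"           # placeholder object prefix

# Upload every downloaded file, preserving the repository's relative paths.
for root, _, files in os.walk(local_dir):
    for name in files:
        path = os.path.join(root, name)
        object_name = f"{prefix}/{os.path.relpath(path, local_dir)}"
        upload_manager.upload_file(namespace, bucket_name, object_name, path)
```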
GptOss
| Hugging Face Model ID | Model Capability | Recommended Dedicated AI Cluster Unit Shape |
|---|---|---|
| openai/gpt-oss-20b | TEXT_TO_TEXT | H100_X1 |
| openai/gpt-oss-120b | TEXT_TO_TEXT | H100_X2 |
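Hosting an imported model requires a dedicated AI cluster with the recommended unit shape. The following is a minimal sketch using the OCI Python SDK; the compartment OCID is a placeholder, and it assumes the unit shape string from the table is accepted verbatim by the API.

```python
# Minimal sketch: create a hosting dedicated AI cluster with the recommended
# unit shape for openai/gpt-oss-20b. The compartment OCID is a placeholder,
# and the unit shape string is assumed to match what the API accepts.
import oci

config = oci.config.from_file()
client = oci.generative_ai.GenerativeAiClient(config)

details = oci.generative_ai.models.CreateDedicatedAiClusterDetails(
    compartment_id="ocid1.compartment.oc1..example",  # placeholder
    display_name="gpt-oss-20b-hosting",
    type="HOSTING",
    unit_shape="H100_X1",  # recommended shape from the table above (assumed value)
    unit_count=1,
)

response = client.create_dedicated_ai_cluster(details)
print(response.data.id, response.data.lifecycle_state)
```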
Note
- To import a fine-tuned version of a model, the fine-tuned model must use the same transformers version as the original model and have a parameter count within ±10% of the original.
- If the instance type for the recommended unit shape isn’t available in your region, select a higher-tier shape (for example, an H100 shape instead of an A100-80G shape).
- For prerequisites and how to import models, see Managing Imported Models (New).
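After importing a model and creating an endpoint for it, you can send requests to that endpoint through the Generative AI inference API. The following is a minimal sketch using the OCI Python SDK; the endpoint OCID, compartment OCID, and region service endpoint are placeholders.

```python
# Minimal sketch: send a chat request to a hosted imported model through its
# dedicated endpoint. OCIDs and the region service endpoint are placeholders.
import oci
from oci.generative_ai_inference import GenerativeAiInferenceClient
from oci.generative_ai_inference.models import (
    ChatDetails,
    DedicatedServingMode,
    GenericChatRequest,
    Message,
    TextContent,
)

config = oci.config.from_file()
client = GenerativeAiInferenceClient(
    config,
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",  # placeholder region
)

chat_details = ChatDetails(
    compartment_id="ocid1.compartment.oc1..example",               # placeholder
    serving_mode=DedicatedServingMode(
        endpoint_id="ocid1.generativeaiendpoint.oc1..example"      # placeholder
    ),
    chat_request=GenericChatRequest(
        messages=[
            Message(
                role="USER",
                content=[TextContent(text="Summarize Mixture-of-Experts in one sentence.")],
            )
        ],
        max_tokens=256,
        temperature=0.2,
    ),
)

response = client.chat(chat_details)
# The generated text is in the generic chat response choices.
print(response.data.chat_response.choices[0].message.content[0].text)
```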