Use Meta Llama 3.3 70B in all OCI Generative AI regions

OCI Generative AI now supports Meta's 70-billion-parameter Llama 3.3 instruct model. This text-only model outperforms both Llama 3.1 70B and Llama 3.2 90B on text tasks. You can access the model through the Console's chat interface, the API, or a dedicated endpoint, all without managing the underlying infrastructure.
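As a minimal sketch of API access, the following uses the OCI Python SDK's Generative AI Inference client to send a chat request to the model on demand. The service endpoint (Chicago region here), compartment OCID, and the `meta.llama-3.3-70b-instruct` model name are assumptions for illustration; substitute your own region, compartment, and the model name shown in your Console.

```python
import oci

# Loads credentials from the default ~/.oci/config profile.
config = oci.config.from_file()

# Assumed regional endpoint; replace with the endpoint for your region.
client = oci.generative_ai_inference.GenerativeAiInferenceClient(
    config,
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
)

models = oci.generative_ai_inference.models

# Llama models use the GENERIC chat format.
chat_request = models.GenericChatRequest(
    api_format=models.BaseChatRequest.API_FORMAT_GENERIC,
    messages=[
        models.UserMessage(
            content=[models.TextContent(text="Summarize Llama 3.3 in one sentence.")]
        )
    ],
    max_tokens=400,
    temperature=0.7,
)

chat_details = models.ChatDetails(
    # Placeholder compartment OCID; use your own.
    compartment_id="ocid1.compartment.oc1..exampleuniqueID",
    # On-demand serving; the model name is an assumption based on Console naming.
    serving_mode=models.OnDemandServingMode(model_id="meta.llama-3.3-70b-instruct"),
    chat_request=chat_request,
)

response = client.chat(chat_details)
print(response.data.chat_response.choices[0].message.content[0].text)
```

For dedicated hosting, you would swap `OnDemandServingMode` for `DedicatedServingMode` with your endpoint OCID; the rest of the request is unchanged.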

Key Highlights
  • Accepts text-only inputs and produces text-only outputs.
  • Uses the same prompt format as Llama 3.1 70B.
  • Supports the same code interpreter as Llama 3.1 70B and retains the 128,000-token context length.
  • Offers improved reasoning, coding, math, and instruction following compared to its Llama 3.1 70B predecessor. See the Llama 3.3 model card.
  • Available for on-demand inferencing, dedicated hosting, and fine-tuning.

Available Regions
  • Brazil East (Sao Paulo)
  • Germany Central (Frankfurt)
  • Japan Central (Osaka)
  • UK South (London)
  • US Midwest (Chicago)

Important Note: Before you use this model, review Meta's Llama 3.3 Acceptable Use Policy.

For a list of offered models, see Pretrained Foundational Models in Generative AI. For information about the service, see the Generative AI documentation.