PromptTokensDetails¶
class oci.generative_ai_inference.models.PromptTokensDetails(**kwargs)¶
Bases: object

Breakdown of tokens used in the prompt.
Methods

__init__(**kwargs)
    Initializes a new PromptTokensDetails object with values from keyword arguments.

Attributes

cached_tokens
    Gets the cached_tokens of this PromptTokensDetails.
__init__(**kwargs)¶
Initializes a new PromptTokensDetails object with values from keyword arguments. The following keyword arguments are supported (corresponding to the getters/setters of this class):

Parameters:
    cached_tokens (int) – The value to assign to the cached_tokens property of this PromptTokensDetails.
cached_tokens¶
Gets the cached_tokens of this PromptTokensDetails. Cached tokens present in the prompt.

Returns:
    The cached_tokens of this PromptTokensDetails.
Return type:
    int
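As a rough sketch of how this generated model behaves, the class below mimics the kwargs-to-property pattern described above. It is an illustrative stand-in, not the SDK source; in real code you would import PromptTokensDetails from oci.generative_ai_inference.models, and typically read it from a chat response's usage field rather than construct it yourself.

```python
class PromptTokensDetails:
    """Illustrative sketch of the generated model (not the SDK source).

    Keyword arguments passed to __init__ correspond to the class's
    getters/setters; here only cached_tokens is supported.
    """

    def __init__(self, **kwargs):
        # Unset keyword arguments leave the attribute as None.
        self._cached_tokens = kwargs.get("cached_tokens")

    @property
    def cached_tokens(self):
        # Cached tokens present in the prompt.
        return self._cached_tokens

    @cached_tokens.setter
    def cached_tokens(self, cached_tokens):
        self._cached_tokens = cached_tokens


details = PromptTokensDetails(cached_tokens=128)
print(details.cached_tokens)
```

Because the real model is value-holding only, reading `cached_tokens` on a response object is the usual interaction; the setter exists mainly for the SDK's own deserialization.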