Policy Setup

Setting up ML Applications policies.

Policy setup is critical to:

  • Ensure Functionality: To make your ML Application fully functional, you must grant the necessary privileges. If a service in your solution lacks a required privilege, the solution fails. For example, not granting your ML Job access to your subnet can break your application.

  • Follow the Least-Privilege Principle: To maintain security, it's crucial to grant only the necessary privileges. For example, instead of granting broad permissions such as "manage object-family" across the entire tenancy, grant specific actions, such as reading a bucket in a particular compartment for a specific resource principal.

  • Implement Tenant Isolation: ML Applications also let you implement tenant isolation. This ensures that workloads running on behalf of a customer can access only the resources owned by that customer, providing an extra layer of security.

An AI/ML solution built as an ML Application typically relies on several OCI services (such as Data Science, Object Storage, Networking, and Logging). Configure the correct policies for all these services, including those specific to ML Application resources.
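
For illustration, the following policy statements show the level of granularity to aim for. This is a sketch only: the dynamic group, compartment, and bucket names are placeholders, and the exact statements your application needs depend on the services it uses.

    Allow dynamic-group ml-app-job-dyn-group to read objects in compartment ml-app-dev where target.bucket.name = 'ml-app-model-artifacts'
    Allow dynamic-group ml-app-job-dyn-group to use virtual-network-family in compartment ml-app-dev
    Allow dynamic-group ml-app-job-dyn-group to use log-content in compartment ml-app-dev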

Simplifying Policy Setup with a Sample Infrastructure Project

To ease this process, ML Applications provides a sample infrastructure project. This project helps you set up production-grade policies and other prerequisites for developing your ML Applications. It supports all scenarios, including providers and consumers in the same tenancy, different tenancies, and even internal ML Application environments.

The sample infrastructure project is available here: sample-project-policies.

Clone this project and use it to prepare the infrastructure for all environments where ML Applications are deployed. It creates least-privilege policies that implement tenant isolation, and, in addition to the policies, it creates a compartment, a tag namespace, and a tag.
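
For example, assuming you have Git access to the repository (the URL below is a placeholder), you can clone it and review the environment folders:

    git clone <sample-project-policies-repository-url>
    cd sample-project-policies
    ls environments/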

The project includes documentation in README.md files where you can find detailed information about the project and its components.

Follow these steps to configure an environment for ML Applications, including the necessary policies. A consolidated command-line sketch follows the steps.

  1. Prepare the environment folder:

    • You can use the default dev environment folder (environments/dev) as is, or use it as a template for a custom environment.
    • To create your custom environment:
      • Make a copy of the dev environment folder (environments/dev).
      • Rename the copied folder to match the name of your environment.
  2. Configure the environment:

    • Navigate to the folder corresponding to your environment (for example, environments/dev).
    • Edit the input_variables.tfvars file to configure your environment settings; an illustrative example appears at the end of this topic.
  3. Run Terraform to create resources:

    1. Initialize Terraform in your environment folder:

      terraform init
    2. Apply your configuration to create a compartment, policy, tag namespace, and tag:
      terraform apply -var-file=input_variables.tfvars
    3. If necessary, destroy the created resources:
      terraform destroy -var-file=input_variables.tfvars
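
As a consolidated sketch of the preceding steps, assuming the project is cloned into a folder named sample-project-policies and the custom environment is called prod, the command-line flow might look like this:

    cd sample-project-policies
    cp -r environments/dev environments/prod    # use the dev environment as a template
    cd environments/prod
    # edit input_variables.tfvars with your environment settings
    terraform init
    terraform plan -var-file=input_variables.tfvars     # optional: preview the resources
    terraform apply -var-file=input_variables.tfvars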
Note

  • Depending on your network, you might need to configure an HTTP proxy.

  • The Terraform scripts create a top-level compartment named ml-app-<name-of-your-application>-<environment-suffix>.
  • You can preview the resources that Terraform creates by running:

    terraform plan -var-file=input_variables.tfvars
  • If your application requires a specific subnet, specify its ID in the input_variables.tfvars file (see the example below).
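
As an illustration only, an input_variables.tfvars file might look like the following. The variable names here are hypothetical; the real ones are defined by the project and documented in its README.md files.

    # Hypothetical variable names - check the project's README.md files for the real ones.
    environment_name = "dev"
    region           = "us-ashburn-1"
    # Optional: ID of an existing subnet, if your application requires one
    subnet_id        = "ocid1.subnet.oc1.iad.<unique-id>"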