Load Data into Autonomous Database from Azure Blob Storage or Azure Data Lake Storage

This example shows you how to load data from Azure Blob Storage or Azure Data Lake Storage to Autonomous Database.
You have various options to perform data loading into Autonomous Database, such as:

Using UI options: You can use the Data Studio Load tool user interface to create credentials for the cloud store location, select files containing data, and run data load jobs.

Using SQL commands: You can call the DBMS_CLOUD package directly, as shown in the steps in this example.

All these methods use the same PL/SQL package, DBMS_CLOUD, for loading data. However, Data Studio provides additional benefits over SQL commands: it not only helps to analyze the source and create table definitions, but also performs validation checks.
You require Azure access credentials for user account authentication and an object URL for accessing the object in your Azure Storage account container.

To load data from Azure Blob Storage or Azure Data Lake Storage:

1. Create credentials for the Azure user account in the Autonomous Database.
2. Copy data from Azure Blob Storage or Azure Data Lake Storage to the database.
Prepare for Loading Data from Azure Blob Storage or Azure Data Lake Storage

Verify the prerequisites and prepare for loading data from Azure Blob Storage or Azure Data Lake Storage.
Prerequisites
A data file, for example, azure-data.txt, exists in the Azure Storage account container that you can import. The sample file in this example has the following contents:
1,Azure Direct Sales
2,Azure Tele Sales
3,Azure Catalog
4,Azure Internet
5,Azure Partners
On the Azure side, log in to your Azure Storage account and do the following:
Grant the required role, for example Storage Blob Data Contributor, on your Azure Storage account to the user that accesses it.
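For reference, the role assignment and access-key retrieval can be done with the Azure CLI. The assignee, resource group, and scope values below are placeholders, not values taken from this example:

```shell
# Assign the Storage Blob Data Contributor role (placeholder assignee and scope).
az role assignment create \
  --assignee "user@example.com" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/dbazure"

# List the storage account access keys; one of these is used later as the
# DBMS_CLOUD credential password.
az storage account keys list --account-name dbazure --resource-group <resource-group>
```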
Steps for Loading Data from Azure Blob Storage or Azure Data Lake Storage

Follow these steps to load data from Azure Blob Storage or Azure Data Lake Storage to Autonomous Database.
Store the Azure Storage account credentials in your Autonomous Database and specify a credential name. This enables the database to authenticate with your Azure Storage account and access the items in the Azure Storage account container.
SET DEFINE OFF
BEGIN
DBMS_CLOUD.CREATE_CREDENTIAL(
credential_name => 'AZURE_CRED_NAME',
username => 'username',
password => 'password'
);
END;
/
Note: Here, the username is your Azure Storage account name and the password is your user access key. Creating a credential to access Azure Blob Storage or Azure Data Lake Storage is not required if you enable Azure service principal. See Use Azure Service Principal to Access Azure Resources for more information.
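As a sketch of the service principal alternative: it is enabled on the database side with DBMS_CLOUD_ADMIN.ENABLE_PRINCIPAL_AUTH, assuming a release of Autonomous Database that provides this procedure; the database user and tenant ID below are placeholders:

```sql
BEGIN
  DBMS_CLOUD_ADMIN.ENABLE_PRINCIPAL_AUTH(
    provider => 'AZURE',
    username => 'ADMIN',  -- placeholder database user
    params   => JSON_OBJECT('azure_tenantid' value 'placeholder-tenant-id')
  );
END;
/
```

After enabling it, DBMS_CLOUD procedures can reference the reserved credential name AZURE$PAT instead of a user-created credential.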
Optionally, you can test access to Azure Blob Storage or Azure Data Lake Storage as shown in this example.
SELECT * FROM DBMS_CLOUD.LIST_OBJECTS('AZURE_CRED_NAME', 'https://dbazure.blob.core.windows.net/my-azure-container/');
Create a table in your database where you want to load the data.
CREATE TABLE myazuretable (id NUMBER, name VARCHAR2(64));
Import data from the Azure Blob Storage or Azure Data Lake Storage container to your Autonomous Database. Specify the table name and the Azure credential name, followed by the Azure Blob Storage or Azure Data Lake Storage object URL.
BEGIN
DBMS_CLOUD.COPY_DATA(
table_name => 'myazuretable',
credential_name => 'AZURE_CRED_NAME',
file_uri_list => 'https://dbazure.blob.core.windows.net/my-azure-container/azure-data.txt',
format => json_object('delimiter' value ',')
);
END;
/
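DBMS_CLOUD.COPY_DATA accepts additional format options beyond delimiter. As a hedged variant of the call above (skipheaders and rejectlimit are documented DBMS_CLOUD format parameters; the file here is assumed, for illustration, to carry a header row):

```sql
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'myazuretable',
    credential_name => 'AZURE_CRED_NAME',
    file_uri_list   => 'https://dbazure.blob.core.windows.net/my-azure-container/azure-data.txt',
    format          => json_object(
                         'delimiter'   value ',',
                         'skipheaders' value '1',   -- skip one header row
                         'rejectlimit' value '10')  -- tolerate up to 10 bad records
  );
END;
/
```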
You have successfully imported data from Azure Blob Storage or Azure Data Lake Storage to your Autonomous Database. You can run this statement to verify the data in your table.
SELECT * FROM myazuretable;
ID NAME
-- ------------------
 1 Azure Direct Sales
 2 Azure Tele Sales
 3 Azure Catalog
 4 Azure Internet
 5 Azure Partners
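Each DBMS_CLOUD.COPY_DATA run is also recorded in the user_load_operations view, which points at per-load log and bad-record tables. A query along these lines (column names taken from the DBMS_CLOUD documentation) shows the status of recent loads:

```sql
SELECT table_name, status, rows_loaded, logfile_table, badfile_table
  FROM user_load_operations
 WHERE type = 'COPY'
 ORDER BY start_time DESC;
```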