Connections let you connect Data Transforms to the various technologies reachable from your OCI network.
This section describes the generic steps to create a connection. The displayed connection detail options may vary depending on the selected connection type.
Apart from the connection types listed in Supported Connection Types, you can create custom connectors, which you can use to connect Data Transforms to any JDBC-supported data source. See Create Custom Connectors.
To create a new connection:
From the left pane of the Home page, click the Connections tab.
The Connections page appears.
Click Create Connection.
The Create Connection page slides in.
Do one of the following:
In the Select Type field, enter the name or part of the name of the connection type.
Select the type of connection that you wish to create.
Databases - Allows you to configure any connection type for supported database types.
Applications - Allows you to configure any connection type for supported applications.
Services - Allows you to configure any connection type for supported services.
After selecting the required connection type, click Next.
The Connection Name field is pre-populated with a default name. You can edit this value.
For Connection Details, provide the connection details for the selected type such as:
Connection -
JDBC URL - The URL to connect to the data server.
For connectors that use an autoREST driver that provides the model files along with the driver, specify the server name and other properties required to connect to that data source.
User - The user name, if required, for connecting to the server.
Password - The password for connecting to the server.
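As an illustration, JDBC URLs typically take shapes like the following (the hosts, ports, and service or database names below are placeholders, not values from your environment):

```
jdbc:oracle:thin:@//dbhost.example.com:1521/ORCLPDB1
jdbc:mysql://dbhost.example.com:3306/sales_db
```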
Advanced Options
Array Fetch Size - When reading large volumes of data from a data server, Oracle Data Transforms fetches successive batches of records. This value is the number of rows (records read) requested by Oracle Data Transforms on each communication with the data server.
Batch Update Size - When writing large volumes of data into a data server, Oracle Data Transforms pushes successive batches of records. This value is the number of rows (records written) in a single Oracle Data Transforms INSERT command.
Note
Set Batch Update Size to 1000 or less for loading tables with BLOB data type columns.
Degree of Parallelism for Target - This value indicates the number of threads allowed for a loading task. The default value is 1. The maximum number of threads allowed is 99.
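Array Fetch Size and Batch Update Size both describe the same pattern from opposite sides: records move in fixed-size groups rather than one at a time. A rough Python sketch of the batching idea (illustrative only, not Data Transforms internals):

```python
def batches(rows, batch_update_size):
    """Yield successive batches of rows, mirroring how records are
    grouped so that each batch becomes a single INSERT round trip."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_update_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# Writing 2500 records with Batch Update Size = 1000 means
# three round trips: 1000 + 1000 + 500 rows.
sizes = [len(b) for b in batches(range(2500), 1000)]
```

Larger values mean fewer round trips but more memory per communication; the 1000-row cap for BLOB columns noted above keeps each batch's payload bounded.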
Note
Connection details are specific to each connection type, so the options above vary based on the selected connection type. For the default connection that is created during provisioning, only the User and Password fields are editable. All the other fields are disabled.
After providing all the required connection details, click Test Connection to test the connection.
If the test connection fails, do one of the following:
Check whether the Autonomous Database from which you are accessing Data Transforms is configured to use a private endpoint.
If a connection between Autonomous Databases on a private endpoint fails with a "Mismatch with server cert DN" error, check whether you have specified the ssl_server_dn_match=yes property in the JDBC URL. See Troubleshoot Mismatch with Server Cert DN Error for more information.
Click Create.
The new connection is created.
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
Select Edit to edit the provided connection details.
Select Test Connection to test the created connection.
Click Export to export the connection. See Export Objects.
Select Delete Schema to delete schemas.
Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
Name of the connection.
Technology associated with the created connection.
Supported Connection Types This topic lists the connection types that are supported for connecting to Data Transforms.
Create Custom Connectors The Custom Connections page of the Administration tab of Oracle Data Transforms helps you to create custom connectors that point to any JDBC supported data sources.
Create a Delta Share Connection Databricks Delta Share is an open protocol for secure data sharing. Oracle Data Transforms integrates with Delta Share to load data to Oracle Autonomous Database. You can use the Delta Share connection to load data from Databricks or Oracle Data Share.
Create and use an Oracle Cloud Infrastructure Generative AI Connection Oracle Cloud Infrastructure (OCI) Generative AI enables organizations to automate text summarization and dynamic content generation. Data Transforms integrates with OCI Generative AI to support the use of embedding vectors in a data flow.
Create an Oracle Financials Cloud Connection You can fetch real time transactional data from Oracle Financials Cloud REST endpoints, import the data entities into Data Transforms, and use them as a source in a data flow.
Create and Use an Oracle NetSuite Connection You can use the Oracle NetSuite JDBC Driver or OAuth 2.0 authentication to connect to the Oracle NetSuite application. For Oracle NetSuite connections, Data Transforms allows you to load pre-built dataflows and workflows that you can run to transfer data from NetSuite to your target schema.
Create a REST Server Connection You can connect to any REST service endpoint, import the data entities into Data Transforms, and use them as source in a data flow.
This topic lists the connection types that are supported for connecting to Data Transforms.
Government cloud regions are required to support FIPS compliance standards for data protection. Data Transforms in Government realms is FIPS 140-2 Level 1 compliant. Make sure the connection sources are FIPS compliant to ensure secure communication between the servers.
Note
APPLIES TO: Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.
For the connectors that require driver installation, you need to copy the jar files to the /u01/oracle/transforms_home/userlibs directory before you add the connection.
Apart from the connection types listed here, you can create custom connectors, which you can use to connect Data Transforms to any JDBC-supported data source. See Create Custom Connectors.
Name | Type | Supported in Data Integrator: Web Edition | Supported in Data Transforms built into Autonomous Database | Supported in Data Transforms built into OCI GoldenGate
Make sure that the system variable property sql_require_primary_key is set to OFF. Otherwise, an ADW to MySQL mapping could fail with a "Table does not exist" error.
If the MySQL Heatwave database is created with high availability, then the write operation is not supported. Make sure that the system variable property sql_require_primary_key is set to OFF. Otherwise, an ADW to MySQL Heatwave mapping could fail with a "Table does not exist" error.
Netezza | Database | Yes | No | No | Depends on the driver | -
Oracle Data Transforms uses the Netezza JDBC driver to connect to an NCR Netezza database. This driver must be installed in your Data Transforms userlibs directory. You can download the Netezza JDBC driver from the IBM website.
For Data Integrator Web Edition, the write operation is supported only on Oracle cloud database targets. For details, refer to the Oracle terms of use before deploying the image from OCI Marketplace.
Oracle Analytics Cloud | Application | Yes | Yes | Yes | No | -
Oracle Business Intelligence Cloud (BICC) Connector
When using multiple JDBC connections for Oracle Service Cloud in Oracle Data Transforms, ensure that each connection is uniquely identified to avoid conflicts. To do this, add the DatabaseName property to the JDBC URL.
Data Transforms uses the Teradata JDBC Driver to connect to a Teradata database. To use Teradata as a data source, the Teradata Gateway for JDBC must be running, and this driver must be installed in your Data Transforms userlibs directory. You can download the JDBC driver from the Teradata website.
Teradata 17+ | Database | Yes | No | No | Depends on the driver | -
Data Transforms uses the Teradata JDBC Driver to connect to a Teradata database. To use Teradata as a data source, the Teradata Gateway for JDBC must be running, and this driver must be installed in your Data Transforms userlibs directory. You can download the JDBC driver from the Teradata website.
APPLIES TO: Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.
The Custom Connections page of the Administration tab of Oracle Data Transforms helps you to create custom connectors that point to any JDBC-supported data source.
The custom connectors will be listed in the Create Connection page where you can use them to connect data sources to Data Transforms. See Work with Connections for more information.
To create a new connector:
In the left pane, click Administration.
A warning message appears.
Click Continue.
In the left pane, click Custom Connections.
The Custom Connections screen appears.
Click Create Connection Type.
The Create Connection Type page appears.
From the Category drop-down, select the type of connection that you wish to create: database, application, or service.
Enter a name for the connection.
Enter the name of the JDBC driver of the source connection. For example, oracle.jdbc.OracleDriver.
Note
For connectors that require driver installation, you need to copy the jar files to the /u01/oracle/transforms_home/userlibs directory before you add the connection.
Click OK.
The newly created custom connection appears in the list and is available in the Create Connection page.
Click the Actions icon next to the selected connection to perform the following operations:
Select Edit to edit the provided connection details.
Click Export to export the connection. See Export Objects.
Select Delete to delete the created connection.
Note
You cannot delete custom connectors that have existing connections.
Create a Data Transforms Connection for Remote Data Load
You can connect to an existing Data Transforms instance and run a data load remotely.
To create this connection, you need to specify the URL of the Data Transforms instance along with the name of the ODI REST API from where you want to run the data load.
To define a Data Transforms connection:
From the left pane of the Home page, click the Connections tab.
The Connections page appears.
Click Create Connection.
The Create Connection page slides in.
For Select Type:
In the Name field, enter a name for the new connection.
Select Services as the type of connection that you wish to create.
In the Endpoint URL textbox, enter the URL of the ODI REST API from where you want to run the data load. Enter the URL in the format http://<host-ip-address>:<port>/odi-rest.
In the User text box, enter SUPERVISOR as the user name.
In the Password text box, enter the ODI Supervisor password.
After providing all the required connection details, click Test Connection to test the established connection.
Click Create.
The new connection is created.
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
Select Edit to edit the provided connection details.
Select Test Connection to test the created connection.
Click Export to export the connection. See Export Objects.
Select Delete Schema to delete schemas.
Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
Name of the connection.
Technology associated with the created connection.
Databricks Delta Share is an open protocol for secure data sharing. Oracle Data Transforms integrates with Delta Share to load data to Oracle Autonomous Database. You can use the Delta Share connection to load data from Databricks or Oracle Data Share.
To use Databricks as a source, you need to specify the URL of the Delta Sharing server along with the bearer token that lets you access the Delta Lake share server. To use Oracle Data Share as a source, you need to specify the URL for the token end point along with a client ID and the secret key.
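For the Oracle Data Share case, the token endpoint, client ID, and secret key are a standard OAuth 2.0 client-credentials exchange. A hedged sketch of the request that obtains the bearer token (the endpoint URL and credentials below are placeholders):

```python
from urllib.parse import urlencode

def build_token_request(token_endpoint_url, client_id, client_secret):
    """Assemble the OAuth 2.0 client-credentials POST used to obtain
    a short-lived bearer token from the share's token endpoint."""
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    return token_endpoint_url, headers, body

url, headers, body = build_token_request(
    "https://shares.example.com/oauth/token", "my-client-id", "my-secret"
)
# POSTing this body to `url` returns JSON containing an access token,
# which plays the same role as the Databricks bearer token described below.
```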
From the left pane of the Home page, click the Connections tab.
Connections page appears.
Click Create Connection.
Create Connection page slides in.
Do one of the following:
In the Select Type field, enter the name or part of the name of the connection type.
Select the Databases tab.
Select Delta Share as the connection type.
Click Next.
The Connection Name field is pre-populated with a default name. You can edit this value.
In the Share Endpoint URL textbox, enter the URL of the Delta Sharing server. Enter the value in the <host>:<port>/<shareEndpoint>/ format.
In the Connection section, do one of the following:
Select Oracle Data Share and provide the Token Endpoint URL, Client ID, and Client Secret for accessing the share.
You can get this information from the Delta Share Profile JSON document that you can download from the activation link that is provided by the Data Share provider to access their share.
Select Databricks and in the Bearer Token text box enter the token for connecting to the Delta Sharing server.
If you need to use a proxy to access the Delta Share Server or Delta Share Storage, configure the following settings:
In the Proxy Host textbox, enter the host name of the proxy server to be used for the connection.
In the Proxy Port textbox, enter the port number of the proxy server.
Select the following checkboxes depending on where the proxy is required:
Use Proxy to access Delta Share Server
Use Proxy to access Delta Share Storage
Click Test Connection to test the established connection.
After providing all the required connection details, click Create.
The new connection is created.
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
Select Edit to edit the provided connection details.
Select Test Connection to test the created connection.
Click Export to export the connection. See Export Objects.
Select Delete Schema to delete schemas.
Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
Name of the connection.
Technology associated with the created connection.
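The Share Endpoint URL and bearer token map directly onto the open Delta Sharing REST protocol, which can help when troubleshooting a failed test connection. A minimal sketch of the "list shares" request the protocol defines (host and token are hypothetical; the actual network call is left commented out):

```python
import json
import urllib.request

def list_shares_request(share_endpoint_url, bearer_token):
    """Build the Delta Sharing 'list shares' request:
    GET <endpoint>/shares with an Authorization: Bearer header."""
    url = share_endpoint_url.rstrip("/") + "/shares"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {bearer_token}"}
    )

req = list_shares_request(
    "https://sharing.example.com:443/delta-sharing/", "my-bearer-token"
)
# To actually query the server (requires network access and a valid token):
# with urllib.request.urlopen(req) as resp:
#     shares = json.load(resp)["items"]
```

If this request succeeds outside Data Transforms but the connection test fails, the proxy settings above are the usual suspect.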
Creating and Running a Delta Share Data Load
To load data from Delta Share into Oracle Autonomous Database, the Oracle connection user must be an Admin user. Admin privileges are required so that the Oracle user can create and insert data into tables in another schema.
When you run the data load, Data Transforms loads the data onto a corresponding table in the target schema. The data load runs incrementally. The very first time you run a data load, Data Transforms copies all the data into new tables. For every subsequent data load run, it only uploads the changes. Any additions or deletions in the records will reflect in the target tables. Note that if there is any metadata change in the table, for example a column is added, Data Transforms creates a new table to load the data on to the target server. You can create a workflow, add the data load as a step, and create a schedule to run the workflow at a predefined time interval. See Create a New Workflow.
To create and run a Delta Share data load:
Do one of the following:
On the Home page, click Load Data. The Create Data Load wizard appears.
In the Create Data Load tab, enter a name if you want to replace the default value, add a description, and select a project from the drop-down.
On the Home page, click Projects, and then the required project tile. In the left pane, click Data Loads, and then click Create Data Load. The Create Data Load wizard appears.
In the Create Data Load tab, enter a name if you want to replace the default value and add a description.
Click Next.
In the Source Connection tab:
From the Connection Type drop-down, select Delta Share.
From the Connection drop-down, select the required connection from which you wish to add the data entities.
From the Share drop-down, select the share that you want to load tables from. The drop-down lists all the shares for the selected connection.
Click Next.
In the Target Connection tab:
From the Connection Type drop-down, select Oracle as the connection type.
Note
This drop-down lists only JDBC type connections.
From the Connection drop-down, select the required connection to which you wish to load the data entities.
Enter a unique name in the Schema textbox.
Click Save.
The Data Load Detail page appears listing all the tables in the selected share with their schema names.
Note
For Delta Share data loads, the Data Load Detail page only includes the option to run the data load. You cannot apply different actions (incremental merge, incremental append, recreate, truncate, append) to the data entities before loading them to the target schema. This is to make sure that the data is consistent between the Delta Sharing server and the target schema.
Click the run icon to run the data load.
A confirmation prompt appears when the data load starts successfully.
To check the status of the data load, see the Status panel on the right below the Target Schema details. For details about the Status panel, see Monitor Status of Data Loads, Data Flows, and Workflows. This panel shows links to the jobs that execute to run this data load. Click the link to monitor the progress on the Job Details page. For more information about jobs, see Create and Manage Jobs.
All the loaded data entities along with their details are listed in the Data Entities page. To view the statistics of the data entities, click the Actions icon next to the data entity, click Preview, and then select the Statistics tab. See View Statistics of Data Entities for information.
Create an Oracle Business Intelligence Cloud Connector Connection
Oracle Business Intelligence Cloud Connector (BICC) allows you to extract business data from a data source and load it into Autonomous Database.
To create an Oracle BICC connection you need to first configure external storage using the OCI Object Storage Connection tab in the BICC Console. You need to specify these connection details when you define the connection in Oracle Data Transforms.
You can use the BICC connection to choose the offerings whose data stores you want to extract. Data Transforms uses the Oracle Object Storage data server used by Oracle BICC to stage the extracted files, which you can then use as a source for mapping. Note that you cannot use an Oracle BICC connection as a target for mapping.
To define an Oracle BICC connection:
From the left pane of the Home page, click the Connections tab.
The Connections page appears.
Click Create Connection.
The Create Connection page slides in.
Do one of the following:
In the Select Type field, enter the name or part of the name of the connection type.
Select the Applications tab.
Select Oracle BI Cloud Connector as the connection type.
Click Next.
The Connection Name field is pre-populated with a default name. You can edit this value.
Enter the URL in the BI Cloud Connector Service URL textbox.
In the Connection section, enter the following details:
In the User text box enter the user name configured in the Oracle BI Cloud Connector Console.
In the Password text box enter the password configured in the Oracle BI Cloud Connector Console.
In the Storage section, enter the following details:
In the External Storage BICC Name text box enter the name of the external storage as it appears in the Oracle BI Cloud Connector Console.
In the External Storage Bucket text box specify the bucket into which extracts are uploaded. Bucket names are obtained in the OCI Console.
In the External Storage Name Space text box specify the namespace. Namespace is obtained in the OCI Console.
In the External Storage Region text box enter the OCI Object Storage region.
In the External Storage User text box enter your Oracle Cloud Infrastructure username.
In the External Storage Token text box enter the auth token.
Click Test Connection to test the established connection.
Click Create.
The new connection is created.
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
Select Edit to edit the provided connection details.
Select Test Connection to test the created connection.
Click Export to export the connection. See Export Objects.
Select Delete Schema to delete schemas.
Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
Name of the connection.
Technology associated with the created connection.
Create and use an Oracle Cloud Infrastructure Generative AI Connection
Oracle Cloud Infrastructure (OCI) Generative AI enables organizations to automate text summarization and dynamic content generation. Data Transforms integrates with OCI Generative AI to support the use of embedding vectors in a data flow.
Before you create an OCI Generative AI connection, create an Oracle Database 23ai connection. Data Transforms will use this connection to test the OCI Generative AI connection. To create the OCI Generative AI connection you need to specify details such as the OCI URL, User OCID, Tenancy OCID, Compartment OCID, Private Key, and Fingerprint information.
You can fetch real time transactional data from Oracle Financials Cloud REST endpoints, import the data entities into Data Transforms, and use them as a source in a data flow.
To create an Oracle Financials Cloud connection you need to choose a temporary schema where Data Transforms can create data entities after the reverse-engineering operation.
To define an Oracle Financials Cloud connection:
From the left pane of the Home page, click the Connections tab.
The Connections page appears.
Click Create Connection.
The Create Connection page slides in.
Do one of the following:
In the Select Type field, enter the name or part of the name of the connection type.
Select the Applications tab.
Select Oracle Financials Cloud as the connection type.
Click Next.
The Connection Name field is pre-populated with a default name. You can edit this value.
In the REST Service URL textbox, enter the URL of the endpoint that services the REST resources.
In the Proxy Host textbox, enter the host name of the proxy server to be used for the connection.
In the Proxy Port textbox, enter the port number of the proxy server.
In the User text box enter the user name for connecting to the REST endpoint.
In the Password text box enter the password for connecting to the REST endpoint.
Choose a connection from the Staging Connection drop-down list. The list displays only existing Autonomous Database connections. To use a different connection, create the connection before you reach this page.
After providing all the required connection details, click Create.
The new connection is created.
Click Test Connection to test the established connection.
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
Select Edit to edit the provided connection details.
Select Test Connection to test the created connection.
Click Export to export the connection. See Export Objects.
Select Delete Schema to delete schemas.
Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
Name of the connection.
Technology associated with the created connection.
You can use the Oracle NetSuite JDBC Driver or OAuth 2.0 authentication to connect to the Oracle NetSuite application. For Oracle NetSuite connections, Data Transforms allows you to load pre-built dataflows and workflows that you can run to transfer data from NetSuite to your target schema.
Configuring Access Permissions Required for Building Data Warehouse
Before you create a NetSuite connection or use the Build Data Warehouse Wizard in Data Transforms, you need to log in to NetSuite as an administrator, enable SuiteAnalytics Connect, create a custom role, and set the access permissions that are required to build the NetSuite Data Warehouse.
To configure the access permissions:
Log in to NetSuite as an administrator using the following URL:
Check whether the SUITEANALYTICS CONNECT feature is enabled.
Go to Setup > Company > Enable Features. Click the Analytics tab and verify that the SuiteAnalytics Connect box is checked.
Add a custom role.
Go to Setup > Users/Roles > User Management > Manage Roles. Click New Role, add the required details, and click Save to create a custom role.
Assign the required permissions to the custom role.
Go to Setup > Users/Roles > User Management > Manage Roles. Click Customize next to the name of the custom role for which you would like to add the SuiteAnalytics Connect permission.
Click the Transactions tab under the Permissions tab and assign the following permissions:
Account Detail
Accounting Lists
Accounting Management
Accounts
Accounts Payable
Accounts Payable Graphing
Accounts Payable Register
Accounts Receivable
Accounts Receivable Graphing
Accounts Receivable Register
Adjust Inventory
Adjust Inventory Worksheet
Amortization Reports
Amortization Schedules
Audit Trail
Balance Sheet
Bank Account Registers
Bill Purchase Orders
Billing Schedules
Bills
Bin Putaway Worksheet
Bin Transfer
Blanket Purchase Order
Build Assemblies
CRM Groups
Calendar
Cash Sale
Cash Sale Refund
Charge
Charge - Run Rules
Charge Rule
Check
Classes
Commission Feature Setup
Commission Reports
Commit Orders
Commit Payroll
Competitors
Component Where Used
Contacts
Count Inventory
Create Allocation Schedules
Credit Card
Credit Card Refund
Credit Card Registers
Credit Memo
Credit Returns
Currency
Currency Revaluation
Custom Recognition Event Type
Customer Deposit
Customer Payment
Customer Refund
Customers
Deferred Expense Reports
Deleted Records
Departments
Deposit
Deposit Application
Documents and Files
Edit Forecast
Edit Manager Forecast
Email Template
Employee Commission Transaction
Employee Commission Transaction Approval
Employee Record
Employee Reminders
Employees
Enter Opening Balances
Enter Vendor Credits
Equity Registers
Establish Quotas
Estimate
Events
Expense Report
Expenses
Export Lists
Fair Value Dimension
Fair Value Formula
Fair Value Price
Financial Statements
Find Transaction
Fixed Asset Registers
Fulfill Orders
Fulfillment Request
General Ledger
Generate Price Lists
Generate Statements
Imported Employee Expenses
Inbound Shipment
Income
Income Statement
Inventory
Inventory Status Change
Invoice
Invoice Approval
Invoice Sales Orders
Item Fulfillment
Item Receipt
Item Revenue Category
Items
Lead Snapshot/Reminders
Locations
Long Term Liability Registers
Make Journal Entry
Manage Accounting Periods
Manage Payroll
Mass Updates
Memorized Transactions
Mobile Device Access
Net Worth
Non Posting Registers
Notes Tab
Opportunity
Other Asset Registers
Other Current Asset Registers
Other Current Liability Registers
Other Lists
Other Names
Ownership Transfer
Pay Bills
Pay Sales Tax
Pay Tax Liability
Paycheck Journal
Payroll Items
Perform Search
Phone Calls
Post Vendor Bill Variances
Posting Period on Transactions
Price Books
Price Plans
Process Payroll
Project Revenue Rules
Purchase Contract
Purchase Order
Purchase Order Reports
Purchases
Reconcile
Reconcile Reporting
Refund Returns
Report Customization
Report Scheduling
Request For Quote
Requisition
Resource
Return Authorization
Revalue Inventory Cost
Revenue Arrangement
Revenue Arrangement Approval
Revenue Element
Revenue Recognition Field Mapping
Revenue Recognition Plan
Revenue Recognition Reports
Revenue Recognition Rule
Revenue Recognition Schedules
SOAP Web Services
Sales
Sales By Partner
Sales By Promotion
Sales Order
Sales Order Approval
Sales Order Fulfillment Reports
Sales Order Reports
Sales Order Transaction Report
Set Up Budgets
Set Up SOAP Web Services
Statement Charge
Statistical Account Registers
Store Pickup Fulfillment
Subscription Change Orders
Subscription Plan
Subscriptions
Subsidiaries
SuiteAnalytics Connect
SuiteAnalytics Workbook
Tasks
Tax
Track Messages
Transaction Detail
Transfer Funds
Transfer Inventory
Transfer Order
Transfer Order Approval
Trial Balance
Unbilled Receivable Registers
Unbuild Assemblies
Units
Vendor Bill Approval
Vendor Payment Approval
Vendor Request For Quote
Vendor Return Auth. Approval
Vendor Return Authorization
Vendor Returns
Vendors
Work Calendar
Work Order
Work Order Close
Work Order Completion
Work Order Issue
Add the SuiteAnalytics Connect – Read All permission.
Click the Setup tab under the Permissions tab, select SuiteAnalytics Connect – Read All from the drop-down, and click Add.
Click Save to apply these permissions to the custom role.
Assign the custom role to a user. When you create a connection to NetSuite from Data Transforms, you will need to enter the credentials of this user to connect to the data server. See Creating the Oracle NetSuite Connection for information about creating the connection.
To assign the custom role to the user, go to Setup > Users / Roles > Manage Users. Click Edit next to the name of the user, assign the custom role, and click Save.
To verify the access permissions, log in as the user that has the custom role assigned.
Go to Analytics > Datasets. Click New Dataset. This page lists all the tables and record types that the user has access to. Search for the “transaction” table, for example, to verify whether the user has access to it.
Creating the Oracle NetSuite Connection
You can create an Oracle NetSuite connection using JDBC connectivity or OAuth 2.0 authentication.
To define an Oracle NetSuite connection:
From the left pane of the Home page, click the Connections tab.
The Connections page appears.
Click Create Connection.
The Create Connection page slides in.
Do one of the following:
In the Select Type field, enter the name or part of the name of the connection type.
Select the Applications tab.
Select Oracle NetSuite as the connection type.
Click Next.
The Connection Name field is pre-populated with a default name. You can edit this value.
To specify the connection details, do one of the following:
To use JDBC connectivity, specify the following details:
JDBC URL - Enter the URL of the SuiteAnalytics Connect server to be used for the connection.
User - Enter the user name for connecting to the data server.
Password - Enter the password for connecting to the data server.
Account ID - Enter the account ID for connecting to the data server.
Role ID - Enter the role ID for connecting to the data server.
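For orientation, a SuiteAnalytics Connect JDBC URL often takes a shape like the following; the account ID, host, port, and property values here are purely illustrative, so confirm the exact URL on your account's SuiteAnalytics Connect information page:

```
jdbc:ns://123456.connect.api.netsuite.com:1708;ServerDataSource=NetSuite2.com;Encrypted=1
```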
To use OAuth 2.0 authentication, click the OAuth 2.0 switch and then specify the following details:
Account ID - Enter the account ID for connecting to the data server. You can get this information by logging into the NetSuite account and viewing the SuiteAnalytics Connect information.
Role ID - Enter the role ID for connecting to the data server. You can get this information by logging into the NetSuite account and viewing the SuiteAnalytics Connect information.
Client ID - Enter the client ID for connecting to the data server.
To obtain the client ID, create an Integration record in NetSuite by enabling the OAuth 2.0 Client Credentials Flow. Copy and save the Client ID that is displayed when the Integration record is successfully created.
Note
NetSuite no longer supports the RSA PKCSv1.5 scheme for token signing for the NetSuite OAuth 2.0 client credentials flow. Any integrations that rely on the RSA PKCSv1.5 scheme will need to be updated to use the RSA-PSS scheme. Refer to the Oracle NetSuite documentation for more information.
Public Certificate and Private Key - Use OpenSSL commands to generate the key pair in the required PEM format. Paste the contents of public.pem in the Public Certificate field and the contents of private.pem in the Private Key field.
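As an illustration, a key pair in PEM format can be generated with OpenSSL as follows; the file names, key size, validity period, and subject are placeholders, so adjust them to your requirements:

```shell
# Generate a 4096-bit RSA private key (private.pem) and a matching
# self-signed public certificate (public.pem) valid for 365 days.
# NetSuite signs OAuth 2.0 client-credentials tokens with this key
# using the RSA-PSS scheme, per the note above.
openssl req -new -x509 -newkey rsa:4096 -sha256 -nodes -days 365 \
  -subj "/CN=datatransforms-m2m" \
  -keyout private.pem -out public.pem
```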
Certificate ID - Enter the Certificate ID for connecting to the data server. To get the certificate ID, use the NetSuite OAuth 2.0 Client Credentials (M2M) Setup to add the public certificate file (auth-cert.pem) to the certificate key list, and copy the generated Certificate ID.
If the source that you want to use for mapping is a saved search, you must also specify the following details in Saved Search Extraction:
Application ID: Enter the NetSuite Application ID for Data Transforms.
Version: Enter the NetSuite version number.
Select the checkbox in Build Data Model to install pre-built dataflows and workflows that you can run to extract data from NetSuite and move it to your Oracle target schema using the Build Data Warehouse wizard.
Click Test Connection to test the established connection.
After providing all the required connection details, click Create.
The new connection is created.
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
Select Edit to edit the provided connection details.
Select Test Connection to test the created connection.
Select Build Data Warehouse to select the functional areas and create the NetSuite Data Warehouse in the target schema. See Using the Build Data Warehouse Wizard for more information.
Select Export to export the connection. See Export Objects.
Select Delete Schema to delete the schema.
Select Delete Connection to delete the created connection.
You can also search for the required connection to view its details based on the following filters:
Name of the connection.
Technology associated with the created connection.
Using the Build Data Warehouse Wizard
Data in your NetSuite account is grouped into business or subject areas in the Analytics Warehouse. The Build Data Warehouse wizard allows you to select the areas that you want to include in the newly created Data Warehouse.
To use the Build Data Warehouse Wizard:
On the Home page, click the Connections tab. The Connections page appears.
Click the Actions icon next to the Oracle NetSuite connection that you want to use to build the data warehouse and click Build Data Warehouse.
The Build Data Warehouse wizard opens.
From the Connection drop-down list, choose the Autonomous Database connection where your target schema resides.
From the Staging Schema drop-down list, all schemas corresponding to the selected connection are listed in two groups:
Existing Schema (schemas that you have already imported into Oracle Data Transforms), and
New Database Schema (schemas that you have not yet imported).
Select the schema that you want to use from the drop-down list.
Similarly, select the Target Schema.
Click Next.
Select the NetSuite Business Areas that you want to use to transfer data from the NetSuite Data Warehouse to the target schema.
Click Save.
Data Transforms starts the process to build the data warehouse. Click Jobs on the left pane of the Home page to monitor the progress of the process. When the job completes successfully, Data Transforms creates a Project folder that includes all the pre-built workflows and dataflows, which you can run to transfer data from the NetSuite connection to your target schema. See Running the Pre-Built Workflows to Load Data into the Target Schema for more information.
Running the Pre-Built Workflows to Load Data into the Target Schema
When the Build Data Warehouse wizard completes successfully, Data Transforms creates
a project that includes all the pre-built data flows and workflows that you can run
to extract data from a NetSuite connection and load it into your target schema.
To view and run the pre-built workflows:
Click Projects on the left pane of the Home page and select the newly created NetSuite project.
Click Workflows in the left pane. The following pre-built workflows are listed in the Project Details page:
Stage NetSuite Source to SDS
Extract Transaction Primary Keys
Load SDS to Warehouse
Apply Deletes
All Workflows
Click the Actions icon next to the workflow that you want to run and click Start.
Oracle recommends that you run All Workflows to execute all the pre-built workflows.
To see the status of the workflow, click Jobs from the left pane in the current project. When the job completes successfully, all the data from the NetSuite connection is loaded into the target schema.
Creating the Oracle Object Storage Connection
You can use Data Transforms to upload data from Oracle Object Storage to Autonomous Database.
The OCI Object Storage dedicated endpoints feature allows OCI customers to securely access the storage buckets. See Object Storage Dedicated Endpoints for more information. You need to use the new URL format when you create Object Storage connections in Data Transforms. For users that already have an Object Storage connection, the existing URL is automatically updated to the new URL format.
To create an Oracle Object Storage connection you need to have an Oracle Cloud Infrastructure username and an auth token. See Getting an Auth Token for information about how to generate the auth token. You need to specify these details when you define the connection in Oracle Data Transforms.
Note the following:
To use an Oracle Object Storage connection to import data into Data Transforms, you must use a public IP address to access the compute node. If you want to use a private IP address to access the Object Storage service, make sure that you have access to the Internet.
The supported file format for loading data from Oracle Object Storage to Autonomous Database and vice versa is CSV.
The supported data types are Numeric, Double, String, and Date.
Data load is not supported for Oracle Object Storage connections.
To create a mapping to Object Storage, the source technology must support the DBMS_CLOUD package. Otherwise, the mapping will fail.
Data Transforms does not support cross-realm connectivity for Object Storage. For example, you cannot connect from Data Transforms configured in a Government Cloud region to Object Storage that is in a commercial realm.
To define an Oracle Object Storage connection:
From the left pane of the Home page, click the Connections tab.
The Connections page appears.
Click Create Connection.
The Create Connection page slides in.
Do one of the following:
In the Select Type field, enter the name or part of the name of the connection type.
Select the Databases tab.
Select Oracle Object Storage as the connection type.
Click Next.
The Connection Name field is pre-populated with a default name. You can edit this value.
Enter the URL in the Object Storage URL textbox. You can enter the URL in either the dedicated endpoint format or the standard regional endpoint format.
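Illustrative examples of the two formats, using placeholder values for namespace, region, and bucket (the exact host names are assumptions based on the OCI Object Storage dedicated endpoints documentation; substitute your own values):

```
https://<namespace>.objectstorage.<region>.oci.customer-oci.com/n/<namespace>/b/<bucket>/o
https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o
```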
Credential mode is available only to Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.
The values for Region, Namespace and Bucket are auto-populated based on the URL provided.
To select the Connection Mode, do one of the following:
Select Swift Connectivity, and provide the following details:
In the User Name text box enter your Oracle Cloud Infrastructure username. For tenancies that support identity domains, specify the domain name along with the username. For example, <identity-domain-name>/<username>.
In the Token text box enter the auth token.
(This is applicable only to Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.) Select Credential and provide the ODI credential in the Enter Credential text box.
You must create the credential in the repository and in the Autonomous Database that you created during instance creation. When you create a data flow to map data from Object Storage to Autonomous Database, you must create the ODI credential in the target schema as well. Before you run the mapping, make sure that you select the step and, in the Properties panel, set the Create credential KM option to false. Otherwise, the credential-based connection will fail.
To create the credential, execute the following script:
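A minimal sketch of such a script using the DBMS_CLOUD package follows; the credential name shown is a placeholder, and you supply your OCI username and auth token:

```sql
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'OBJ_STORE_CRED',  -- placeholder credential name
    username        => '<oci-username>',  -- your OCI user name
    password        => '<auth-token>'     -- the auth token, not your console password
  );
END;
/
```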
Creating a REST Server Connection
You can connect to any REST service endpoint, import the data entities into Data Transforms, and use them as a source in a data flow.
To create a generic REST connector, you need to provide the JDBC URL, username, and password to connect to the endpoint. You can also create and upload a config file that contains information such as the authentication methods, endpoints, and tables that you want to import data entities from.
The Application tab on the Create Connection page includes two connection options to create a generic REST connection: Generic REST and Generic REST Config. This topic has the following sections:
Creating a Generic REST Connection
Creating a Generic REST Connection Using a Config File
The newly created connections are displayed in the Connections page.
Click the Actions icon next to the selected connection to perform the following operations:
Select Edit to edit the provided connection details.
Select Test Connection to test the created connection.
Click Export to export the connection. See Export Objects.
Select Delete Schema to delete schemas.
Select Delete Connection to delete the created connection.
You can also search for the required connection to know its details based on the following filters:
Name of the connection.
Technology associated with the created connection.
Creating a Generic REST Connection
To create this connection you need to specify the REST service URL and choose a temporary schema where Data Transforms can create data entities after the reverse-engineering operation.
To define a REST server connection:
From the left pane of the Home page, click the Connections tab.
The Connections page appears.
Click Create Connection.
The Create Connection page slides in.
Do one of the following:
In the Select Type field, enter the name or part of the name of the connection type.
Select the Applications tab.
Select Generic Rest as the connection type.
Click Next.
The Connection Name field is pre-populated with a default name. You can edit this value.
In the REST Service URL textbox, enter the URL of the endpoint that services the REST resources.
In the Proxy Host textbox, enter the host name of the proxy server to be used for the connection.
In the Proxy Port textbox, enter the port number of the proxy server.
In the User text box enter the user name for connecting to the REST endpoint.
In the Password text box enter the password for connecting to the REST endpoint.
Choose a connection from the Staging Connection drop-down list. The list displays only existing Autonomous Database connections. To use a different connection, create the connection before you reach this page.
After providing all the required connection details, click Test Connection to test the connection.
Click Create.
The new connection is created.
Creating a Generic REST Connection Using a Config File
APPLIES TO: Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.
To create a generic REST connector, you need the JDBC URL, username, password, and a config file. The config file is a model file with the file_name.rest naming convention that you need to upload when you create a REST Server connection. You need to specify the endpoints, table mappings, and the authentication methods to create the config file. You can create the config file using any text editor.
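The exact keys supported in the config file are defined by the underlying driver, so treat the following as a hypothetical sketch only; the table name and endpoint below are invented for illustration, and you should refer to the driver documentation for the authoritative schema:

```json
{
  "employees": {
    "#path": "https://example.com/api/v1/employees"
  }
}
```

Here, the top-level key becomes the table name exposed to Data Transforms, and the endpoint it maps to supplies the rows.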
To define a REST server connection using a config file:
From the left pane of the Home page, click the Connections tab.
The Connections page appears.
Click Create Connection.
The Create Connection page slides in.
Do one of the following:
In the Select Type field, enter the name or part of the name of the connection type.
Select the Applications tab.
Select Generic Rest Config as the connection type.
Click Next.
The Connection Name field is pre-populated with a default name. You can edit this value.
Use the Config File text box to upload the config file that you want to use.
In the JDBC URL textbox, enter the URL to connect to the server.
In the User and Password text boxes enter the user name and password for connecting to the REST endpoint. You may leave these fields blank if these values are not applicable or are already mentioned in the JDBC URL.
After providing all the required connection details, click Test Connection to test the connection.