Work with Connections

Connections help you to connect Data Transforms to various technologies reachable from your OCI network.

This section describes the generic steps to create a connection. The displayed connection detail options may vary depending on the selected connection type.

Apart from the connection types listed in Supported Connection Types, you can create custom connectors, which you can use to connect Data Transforms to any JDBC-supported data source. See Create Custom Connectors.

To create a new connection:

  1. From the left pane of the Home page, click the Connections tab.
    Connections page appears.
  2. Click Create Connection.
    Create Connection page slides in.
  3. Do one of the following:
    • In the Select Type field, enter the name or part of the name of the connection type.
    • Select the type of connection that you wish to create.
      • Databases - Allows you to configure any connection type for supported database types.
      • Applications - Allows you to configure any connection type for supported applications.
      • Services - Allows you to configure any connection type for supported services.
  4. After selecting the required connection type, click Next.
  5. The Connection Name field is pre-populated with a default name. You can edit this value.
  6. For Connection Details, provide the connection details for the selected type such as:
    • Connection -
      • JDBC URL - The URL to connect to the data server. For example:

        jdbc:weblogic:sqlserver://hostname:port[;property=value[;...]]

        For connectors that use an autoREST driver that provides the model files along with the driver, specify the server name and other properties required to connect to that data source. For example:

        jdbc:weblogic:autorest://servername;[property=value[;...]]
      • User - The user name, if required, for connecting to the server.
      • Password - The password for connecting to the server.
    • Advanced Options
      • Array Fetch Size - When reading large volumes of data from a data server, Oracle Data Transforms fetches successive batches of records. This value is the number of rows (records read) requested by Oracle Data Transforms on each communication with the data server.
      • Batch Update Size - When writing large volumes of data into a data server, Oracle Data Transforms pushes successive batches of records. This value is the number of rows (records written) in a single Oracle Data Transforms INSERT command.
      • Degree of Parallelism for Target - This value indicates the number of threads allowed for a loading task. The default value is 1. The maximum number of threads allowed is 99.
      Note

      Connection details are specific to the selected connection type, and the options above vary accordingly. For the default connection that is created during provisioning, only the User and Password fields are editable. All the other fields are disabled.
  7. After providing all the required connection details, click Test Connection to test the connection.

    If the test connection fails, check whether the Autonomous Database from where you are accessing Data Transforms is configured to use a private endpoint.

  8. Click Create.
    The new connection is created.
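
For illustration only, a filled-in version of the SQL Server JDBC URL template shown in step 6 might look like the following. The host name, port, and databaseName property value are hypothetical placeholders, and the available connection properties depend on the driver you use:

    jdbc:weblogic:sqlserver://dbhost.example.com:1433;databaseName=SALES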

The newly created connections are displayed in the Connections page.

Click the Actions icon (Actions icon) next to the selected connection to perform the following operations:

  • Select Edit to edit the provided connection details.
  • Select Test Connection to test the created connection.
  • Click Export to export the connection. See Export Objects.
  • Select Delete Schema to delete schemas.
  • Select Delete Connection to delete the created connection.

You can also search for the required connection to know its details based on the following filters:

  • Name of the connection.
  • Technology associated with the created connection.

Supported Connection Types

This topic lists the connection types that are supported for connecting to Data Transforms.

Note

APPLIES TO: Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.
  • For the connectors that require driver installation, you need to copy the jar files to the /u01/oracle/transforms_home/userlibs directory before you add the connection.
  • Apart from the connection types listed here, you can create custom connectors, which you can use to connect Data Transforms to any JDBC-supported data source. See Create Custom Connectors.
Name Type Supported in Data Integrator: Web Edition Supported in Data Transforms built into Autonomous Database Supported in Data Transforms built into OCI GoldenGate Supports Write Operation Notes
Aha! Application Yes Yes Yes No  
Ahrefs Application Yes Yes Yes No  
Amazon Aurora Database Yes Yes Yes Yes  
Amazon EMR Hive Database Yes Yes Yes No  
Amazon Redshift Database Yes Yes Yes Yes  
Apache Hive Database Yes Yes Yes No  
Apache Spark SQL Database Yes Yes Yes No  
AWS S3 Database Yes Yes Yes No  
Azure Billing Application Yes Yes Yes No  
Azure Compute Database Yes Yes Yes No  
Azure Data Lake Storage Database Yes Yes Yes No  
Azure Reserved VM Instances Database Yes Yes Yes No  
Azure Resource Health Application Yes Yes Yes No  
Azure SQL Database Database Yes Yes Yes Yes  
Azure Synapse Analytics Database Yes Yes Yes Yes  
BigCommerce Application Yes Yes Yes No Requires driver installation
Cassandra Database Yes Yes Yes Yes  
Cloudera CDH Hive Database Yes Yes Yes No  
Cloudera Impala Database Yes Yes Yes No  
Confluence Cloud Database Yes Yes Yes No  
Data Transforms Service Yes Yes Yes No For instructions on connecting to an existing Data Transforms instance, see Create a Data Transforms Connection for Remote Data Load.
DataStax Application Yes Yes Yes Yes  
Delta Share Application Yes Yes Yes No For instructions on creating a connection using Delta Share, see Create a Delta Share Connection
DocuSign Database Yes Yes Yes No  
eBay Application Yes Yes Yes No Requires driver installation
EnterpriseDB Database Yes Yes Yes Yes  
FinancialForce Application Yes Yes Yes Yes  
FourSquare Application Yes Yes Yes No  
Generic Rest Application Yes Yes Yes No For information about connecting to any REST service endpoint to create a connection, see Create a REST Server Connection.
Generic Rest Config Application Yes No No No For information about connecting to any REST service endpoint to create a connection, see Create a REST Server Connection.
GitHub Application Yes Yes Yes No  
Google Ads Application Yes No No Depends on the driver Requires driver installation
Google AdSense Application Yes Yes Yes No  
Google Analytics Application Yes Yes Yes No  
Google BigQuery Database Yes Yes Yes No  
Google Calendar Application Yes Yes Yes No  
Google Campaign Manager Application Yes Yes Yes No  
Google Contacts Application Yes Yes Yes No  
Google Drive Database Yes Yes Yes No  
Google Search Ads 360 Application Yes Yes Yes No  
Greenplum Database Yes Yes Yes No  
Hortonworks Hive Database Yes Yes Yes No  
HubSpot Application Yes Yes Yes No  
Hypersonic SQL Database Yes Yes Yes Yes  
IBM BigInsights Database Yes Yes Yes No  
IBM DB2 Hosted Database Yes Yes Yes Yes  
IBM DB2 UDB Database Yes Yes Yes Yes  
IBM DB2 Warehouse Database Yes Yes Yes Yes  
IBM DB2/400 Database Yes Yes Yes Yes  
Informix Database Yes Yes Yes No  
Jira Application Yes Yes Yes No  
Klaviyo Application Yes Yes Yes No  
Magento Application Yes No No Depends on the driver Requires driver installation
Mailchimp Application Yes Yes Yes No  
MapR Hive Database Yes Yes Yes No  
Marketo Application Yes Yes Yes No  
Microsoft Dynamics 365 Application Yes Yes Yes Yes  
Microsoft SharePoint Application Yes Yes Yes Yes  
Microsoft SQL Server Database Yes Yes Yes Yes  
Mongo DB Database Yes Yes Yes Yes  
MySQL Database Yes Yes Yes Yes Make sure that the system variable sql_require_primary_key is set to OFF (see the example after this table). Otherwise, an ADW to MySQL mapping could fail with a “Table does not exist” error.
MySQL Heatwave Database Yes Yes Yes Yes If the MySQL Heatwave database is created with high availability, then write operation is not supported. Make sure that the system variable sql_require_primary_key is set to OFF (see the example after this table). Otherwise, an ADW to MySQL Heatwave mapping could fail with a “Table does not exist” error.
Netezza Database Yes No No Depends on the driver Oracle Data Transforms uses the Netezza JDBC driver to connect to an NCR Netezza database. This driver must be installed in your Data Transforms userlibs directory. See Download the Netezza JDBC driver for more information.
Oracle Database Yes Yes Yes Yes For Data Integrator Web Edition, write operation is supported only on Oracle cloud database targets. For details refer to the Oracle terms of use before deploying the image from OCI marketplace.
Oracle Analytics Cloud Application Yes Yes Yes No  
Oracle Business Intelligence Cloud (BICC) Connector Application Yes Yes Yes No For information about creating a connection using Oracle Business Intelligence Cloud (BICC) Connector, see Create an Oracle Business Intelligence Cloud Connector Connection.
Oracle EBS Application Yes Yes Yes Yes  
Oracle Financials Cloud Application Yes Yes Yes No For information about creating a connection using Oracle Financials Cloud, see Create an Oracle Financials Cloud Connection.
Oracle Fusion ERP Application Yes Yes Yes No  
Oracle Fusion Sales Application Yes Yes Yes No  
Oracle Fusion Service Application Yes Yes Yes No  
Oracle GoldenGate – OCI Service Yes Yes Yes Yes  
Oracle Marketing Cloud Application Yes Yes Yes Yes  
Oracle NetSuite Application Yes Yes Yes No For information about creating a connection using Oracle NetSuite, see Create and Use an Oracle NetSuite Connection.
Oracle Object Storage Database Yes Yes Yes Yes For information about creating a connection using Oracle Object Storage, see Create an Oracle Object Storage Connection.
Oracle People Soft Application Yes Yes Yes No  
Oracle Sales Cloud Application Yes Yes Yes No  
Oracle Service Cloud Application Yes Yes Yes No  
Oracle SIEBEL Application Yes Yes Yes No  
PayPal Application Yes Yes Yes No  
Pivotal HD Database Yes Yes Yes No  
Pivotal HDB Database Yes Yes Yes No  
PostgreSQL Database Yes Yes Yes Yes  
Qmetry Application Yes Yes Yes No  
QuickBooks Online Application Yes Yes Yes No  
QuickBooks Payments Application Yes Yes Yes No  
Quora Ads Application Yes Yes Yes No  
Sage Application Yes Yes Yes No  
Salesforce Chatter Application Yes Yes Yes No  
Salesforce.com Application Yes Yes Yes Yes  
SAP BW/4HANA Database Yes Yes Yes No  
SAP HANA Application Yes Yes Yes No  
SAP NetWeaver Database Yes Yes Yes No  
SAP S/4HANA Cloud Application Yes Yes Yes No  
Semrush Application Yes Yes Yes No  
ServiceNow Service Yes Yes Yes No  
Shopify Application Yes Yes Yes No Requires driver installation
Snowflake Database Yes Yes Yes Yes  
Square Application Yes Yes Yes No  
Stripe Application Yes Yes Yes No  
Sybase As Anywhere Database Yes Yes Yes Yes  
Sybase As Enterprise Database Yes Yes Yes Yes  
Sybase AS IQ Database Yes Yes Yes Yes  
TeamCity Application Yes Yes Yes No  
Teradata Database Yes No No Depends on the driver Data Transforms uses the Teradata JDBC Driver to connect to a Teradata Database. To use Teradata as a data source the Teradata Gateway for JDBC must be running, and this driver must be installed in your Data Transforms userlibs directory. You will find the driver here: https://downloads.teradata.com/download/connectivity/jdbc-driver.
Teradata 17+ Database Yes No No Depends on the driver Data Transforms uses the Teradata JDBC Driver to connect to a Teradata Database. To use Teradata as a data source the Teradata Gateway for JDBC must be running, and this driver must be installed in your Data Transforms userlibs directory. You will find the driver here: https://downloads.teradata.com/download/connectivity/jdbc-driver.
Tumblr Application Yes Yes Yes No  
Twitter Application Yes Yes Yes No  
Veeva CRM Application Yes Yes Yes Yes  
Volusion Application Yes Yes Yes No  
Wistia Application Yes Yes Yes No  
WooCommerce Application Yes No No Depends on the driver Requires driver installation
WordPress Application Yes Yes Yes No  
Workday Application Yes No No Depends on the driver Requires driver installation
Xero Application Yes Yes Yes No Requires driver installation
Yelp Application Yes Yes Yes No  
Zendesk Application Yes Yes Yes No Requires driver installation
Zoho CRM Application Yes Yes Yes No  
Zoom Application Yes Yes Yes No  
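
Where the notes for MySQL and MySQL Heatwave above refer to the sql_require_primary_key system variable, the following is a minimal sketch of how a database administrator might check and disable it on the MySQL server. It assumes sufficient privileges; on managed MySQL services the variable may instead have to be changed through the service's configuration settings:

    -- Check the current value
    SHOW VARIABLES LIKE 'sql_require_primary_key';
    -- Disable the primary key requirement for the running server
    SET GLOBAL sql_require_primary_key = OFF;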

Create Custom Connectors

APPLIES TO: Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.

The Custom Connections page of the Administration tab of Oracle Data Transforms helps you to create custom connectors that point to any JDBC-supported data source.
The custom connectors will be listed in the Create Connection page where you can use them to connect data sources to Data Transforms. See Work with Connections for more information.

To create a new connector:

  1. In the left pane, click Administration.

    A warning message appears.

  2. Click Continue.
  3. In the left pane, click Custom Connections.

    Custom Connections screen appears.

  4. Click Create Connection Type.

    The Create Connection Type page appears.

  5. From the Category drop-down, select the type of connection that you wish to create: database, application, or service.
  6. Enter a name for the connection.
  7. Enter the name of the JDBC Driver.
  8. Click OK.

The newly created custom connection appears in the list and is available in the Create Connection page.
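
As an illustration only, if you were registering a custom connector for a PostgreSQL-compatible source (the driver class and URL below are examples, not defaults), the values might look like this, assuming the driver jar has already been copied to the userlibs directory:

    JDBC Driver : org.postgresql.Driver
    JDBC URL    : jdbc:postgresql://dbhost.example.com:5432/sales

The JDBC URL itself is entered later, on the Create Connection page, when you create a connection that uses this custom connector.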

Click the Actions icon (Actions icon) next to the selected connection to perform the following operations:

  • Select Edit to edit the provided connection details.
  • Click Export to export the connection. See Export Objects.
  • Select Delete to delete the created connection.
    Note

    You cannot delete custom connectors that have existing connections.

Create a Data Transforms Connection for Remote Data Load

You can connect to an existing Data Transforms instance and run a data load remotely.

To create this connection, you need to specify the URL of the Data Transforms instance along with the ODI REST API endpoint from where you want to run the data load.

To define a Data Transforms connection:

  1. From the left pane of the Home page, click the Connections tab.
    Connections page appears.
  2. Click Create Connection.
    Create Connection page slides in.
  3. For Select Type,
    • In the Name field, enter a name for the connection.
    • Select Services as the type of connection that you wish to create.
  4. In the Endpoint URL textbox, enter the URL of the ODI REST API from where you want to run the data load. Enter the URL in the format http://<host-ip-address>:<port>/odi-rest.
  5. In the User text box enter SUPERVISOR as the user name.
  6. In the Password text box enter the ODI Supervisor password.
  7. After providing all the required connection details, click Test Connection to test the established connection.
  8. Click Create.
    The new connection is created.
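
For illustration only, a filled-in endpoint URL of the format described in step 4 might look like the following; the host address and port are hypothetical placeholders for your own Data Transforms instance:

    http://203.0.113.15:9999/odi-rest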

The newly created connections are displayed in the Connections page.

Click the Actions icon (Actions icon) next to the selected connection to perform the following operations:

  • Select Edit to edit the provided connection details.
  • Select Test Connection to test the created connection.
  • Click Export to export the connection. See Export Objects.
  • Select Delete Schema to delete schemas.
  • Select Delete Connection to delete the created connection.

You can also search for the required connection to know its details based on the following filters:

  • Name of the Connection.
  • Technology associated with the created Connection.

Create a Delta Share Connection

Databricks Delta Share is an open protocol for secure data sharing. Oracle Data Transforms integrates with Delta Share to load data to Oracle Autonomous Database. You can use the Delta Share connection to load data from Databricks or Oracle Data Share.

To use Databricks as a source, you need to specify the URL of the Delta Sharing server along with the bearer token that lets you access the Delta Lake share server. To use Oracle Data Share as a source, you need to specify the URL for the token end point along with a client ID and the secret key.

Creating the Delta Share Connection

To define a Delta Share connection:

  1. From the left pane of the Home page, click the Connections tab.

    Connections page appears.

  2. Click Create Connection.

    Create Connection page slides in.

  3. Do one of the following:
    • In the Select Type field, enter the name or part of the name of the connection type.
    • Select the Databases tab.
  4. Select Delta Share as the connection type.
  5. Click Next.
  6. The Connection Name field is pre-populated with a default name. You can edit this value.
  7. In the Share Endpoint URL textbox, enter the URL of the Delta Sharing server. Enter the value in the <host>:<port>/<shareEndpoint>/ format.
  8. In the Connection section, do one of the following:
    • Select Oracle Data Share and provide the Token Endpoint URL, Client ID, and Client Secret for accessing the share.

      You can get this information from the Delta Share Profile JSON document that you can download from the activation link provided by the Data Share provider to access their share. This document is also where you get the Share Endpoint URL. (A sketch of a profile document appears after this procedure.)

    • Select Databricks and in the Bearer Token text box enter the token for connecting to the Delta Sharing server.
  9. If you need to use a proxy to access the Delta Share Server or Delta Share Storage, configure the following settings:
    • In the Proxy Host textbox, enter the host name of the proxy server to be used for the connection.
    • In the Proxy Port textbox, enter the port number of the proxy server.
    • Select the following checkboxes depending on where the proxy is required:
      • Use Proxy to access Delta Share Server
      • Use Proxy to access Delta Share Storage
  10. Click Test Connection to test the established connection.
  11. After providing all the required connection details, click Create.

    The new connection is created.
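
For reference, the following is a minimal sketch of a Delta Share profile document. The field names follow the open Delta Sharing protocol, the values are placeholders, and a profile supplied for an Oracle Data Share may additionally carry the token endpoint URL, client ID, and client secret used in step 8:

    {
      "shareCredentialsVersion": 1,
      "endpoint": "https://sharing.example.com:443/delta-sharing/",
      "bearerToken": "<bearer token>",
      "expirationTime": "2025-12-31T00:00:00.0Z"
    }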

The newly created connections are displayed in the Connections page.

Click the Actions icon (Actions icon) next to the selected connection to perform the following operations:

  • Select Edit to edit the provided connection details.
  • Select Test Connection to test the created connection.
  • Click Export to export the connection. See Export Objects.
  • Select Delete Schema to delete schemas.
  • Select Delete Connection to delete the created connection.

You can also search for the required Connection to know its details based on the following filters:

  • Name of the Connection.
  • Technology associated with the created Connection.

Creating and Running a Delta Share Data Load

To load data from Delta Share into Oracle Autonomous Database, the Oracle connection user must be an Admin user. Admin privileges are required so that the Oracle user can create and insert data into tables in another schema.

When you run the data load, Data Transforms loads the data onto a corresponding table in the target schema. The data load runs incrementally. The very first time you run a data load, Data Transforms copies all the data into new tables. For every subsequent data load run, it only uploads the changes. Any additions or deletions in the records are reflected in the target tables. Note that if there is any metadata change in the table, for example a column is added, Data Transforms creates a new table to load the data onto the target server. You could create a workflow, add the data load as a step, and create a schedule to run the workflow at a predefined time interval.

To create and run a Delta Share data load:

  1. Do one of the following:
    • On the Home page, click Load Data. The Create Data Load wizard appears.

      In the Create Data Load tab, enter a name if you want to replace the default value, add a description, and select a project from the drop-down.

    • On the Home page, click Projects, and then the required project tile. In the left pane, click Data Loads, and then click Create Data Load. The Create Data Load wizard appears.

      In the Create Data Load tab, enter a name if you want to replace the default value and add a description.

  2. Click Next.
  3. In the Source Connection tab,
    1. From the Connection Type drop-down, select Delta Share.
    2. From the Connection drop-down, select the required connection from which you wish to add the data entities.
    3. From the Share drop-down, select the share that you want to load tables from. The drop-down lists all the shares for the selected connection.
    4. Click Next.
  4. In the Target Connection tab,
    1. From the Connection Type drop-down, select Oracle as the connection type.
      Note

      This drop-down lists only JDBC type connections.
    2. From the Connection drop-down, select the required connection to which you wish to load the data entities.
    3. Enter a unique name in the Schema textbox.
    4. Click Save.
    The Data Load Detail page appears listing all the tables in the selected share with their schema names.
    Note

    For Delta Share data loads, the Data Load Detail page only includes the Execute option. You cannot apply different actions - incremental merge, incremental append, recreate, truncate, append - on the data entities before loading them to the target schema. This is to make sure that the data is consistent between the Delta Sharing server and the target schema.
  5. Click the Execute icon to run the data load.

    A confirmation prompt appears when the data load starts successfully.

To check the status of the data load, see the Status panel on the right below the Target Schema details. For details about the Status panel, see Monitor Status of Data Loads, Data Flows, and Workflows. This panel shows links to the jobs that execute to run this data load. Click the link to monitor the progress on the Job Details page. For more information about jobs, see Create and Manage Jobs.

All the loaded data entities along with their details are listed in the Data Entities page. To view the statistics of the data entities, click the Actions icon (Actions icon) next to the data entity, click Preview, and then select the Statistics tab. See View Statistics of Data Entities for information.

Create an Oracle Business Intelligence Cloud Connector Connection

Oracle Business Intelligence Cloud Connector (BICC) allows you to extract business data from a data source and load it into Autonomous Database.

To create an Oracle BICC connection you need to first configure external storage using the OCI Object Storage Connection tab in the BICC Console. You need to specify these connection details when you define the connection in Oracle Data Transforms.

You can use the BICC connection to choose the offerings whose data stores you want to extract. Data Transforms uses an Oracle Object Storage Data Server used by Oracle BICC to stage the extracted files, which you can then use as a source for mapping. Note that you cannot use an Oracle BICC connection as a target for mapping.

To define an Oracle BICC connection,

  1. From the left pane of the Home page, click the Connections tab.
    Connections page appears.
  2. Click Create Connection.
    Create Connection page slides in.
  3. Do one of the following:
    • In the Select Type field, enter the name or part of the name of the connection type.
    • Select the Applications tab.
  4. Select Oracle BI Cloud Connector as the connection type.
  5. Click Next.
  6. The Connection Name field is pre-populated with a default name. You can edit this value.
  7. Enter the URL in the BI Cloud Connector Service URL textbox.
  8. In the Connection section, enter the following details:
    • In the User text box enter the user name configured in the Oracle BI Cloud Connector Console.
    • In the Password text box enter the password configured in the Oracle BI Cloud Connector Console.
  9. In the Storage section, enter the following details:
    • In the External Storage BICC Name text box enter the name of the external storage as it appears in the Oracle BI Cloud Connector Console.
    • In the External Storage Bucket text box specify the bucket into which extracts are uploaded. Bucket names are obtained in the OCI Console.
    • In the External Storage Name Space text box specify the namespace. Namespace is obtained in the OCI Console.
    • In the External Storage Region text box enter the OCI Object Storage region.
    • In the External Storage User text box enter your Oracle Cloud Infrastructure username.
    • In the External Storage Token text box enter the auth token.
  10. Click Test Connection to test the established connection.
  11. Click Create.
    The new connection is created.
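
For illustration only, the storage details in step 9 might be filled in as follows. Every value shown is a hypothetical placeholder that you replace with the values from your own Oracle BI Cloud Connector Console and OCI Console:

    External Storage BICC Name : BICC_EXTRACT_STORAGE
    External Storage Bucket    : bicc-extracts
    External Storage Name Space: axexamplenspc
    External Storage Region    : us-ashburn-1
    External Storage User      : jane.doe@example.com
    External Storage Token     : <auth token generated in the OCI Console>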

The newly created connections are displayed in the Connections page.

Click the Actions icon (Actions icon) next to the selected connection to perform the following operations:

  • Select Edit to edit the provided connection details.
  • Select Test Connection to test the created connection.
  • Click Export to export the connection. See Export Objects.
  • Select Delete Schema to delete schemas.
  • Select Delete Connection to delete the created connection.

You can also search for the required Connection to know its details based on the following filters:

  • Name of the Connection.
  • Technology associated with the created Connection.

Create an Oracle Financials Cloud Connection

You can fetch real time transactional data from Oracle Financials Cloud REST endpoints, import the data entities into Data Transforms, and use them as a source in a data flow.

To create an Oracle Financials Cloud connection you need to choose a temporary schema where Data Transforms can create data entities after the reverse-engineering operation.

To define an Oracle Financials Cloud connection,

  1. From the left pane of the Home page, click the Connections tab.
    Connections page appears.
  2. Click Create Connection.
    Create Connection page slides in.
  3. Do one of the following:
    • In the Select Type field, enter the name or part of the name of the connection type.
    • Select the Applications tab.
  4. Select Oracle Financials Cloud as the connection type.
  5. Click Next.
  6. The Connection Name field is pre-populated with a default name. You can edit this value.
  7. In the REST Service URL textbox, enter the URL of the endpoint that services the REST resources.
  8. In the Proxy Host textbox, enter the host name of the proxy server to be used for the connection.
  9. In the Proxy Port textbox, enter the port number of the proxy server.
  10. In the User text box enter the user name for connecting to the REST endpoint.
  11. In the Password text box enter the password for connecting to the REST endpoint.
  12. Choose a connection from the Staging Connection drop-down list. The list displays only existing Autonomous Database connections. To use a different connection, create the connection before you reach this page.
  13. After providing all the required connection details, click Create.
    The new connection is created.
  14. Click Test Connection to test the established connection.
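
For illustration only, the REST Service URL in step 7 is usually the base URL of your Oracle Financials Cloud instance; the value below is purely a hypothetical example:

    https://myinstance.fa.us2.oraclecloud.com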

The newly created connections are displayed in the Connections page.

Click the Actions icon (Actions icon) next to the selected connection to perform the following operations:

  • Select Edit to edit the provided connection details.
  • Select Test Connection to test the created connection.
  • Click Export to export the connection. See Export Objects.
  • Select Delete Schema to delete schemas.
  • Select Delete Connection to delete the created connection.

You can also search for the required Connection to know its details based on the following filters:

  • Name of the Connection.
  • Technology associated with the created Connection.

Create and Use an Oracle NetSuite Connection

You can use the Oracle NetSuite JDBC Driver or OAuth 2.0 authentication to connect to the Oracle NetSuite application. For Oracle NetSuite connections, Data Transforms allows you to load pre-built dataflows and workflows that you can run to transfer data from NetSuite to your target schema.

Creating the Oracle NetSuite Connection

You can create an Oracle NetSuite connection using JDBC connectivity or OAuth 2.0 authentication.

To define an Oracle NetSuite connection:

  1. From the left pane of the Home page, click the Connections tab.

    Connections page appears.

  2. Click Create Connection.

    Create Connection page slides in.

  3. Do one of the following:
    • In the Select Type field, enter the name or part of the name of the connection type.
    • Select the Applications tab.
  4. Select Oracle NetSuite as the connection type.
  5. Click Next.
  6. The Connection Name field is pre-populated with a default name. You can edit this value.
  7. To specify the connection details, do one of the following:
    • To use JDBC connectivity, specify the following details:
      • JDBC URL - Enter the URL of the SuiteAnalytics Connect server to be used for the connection.
      • User - Enter the user name for connecting to the data server.
      • Password - Enter the password for connecting to the data server.
      • Account ID - Enter the account ID for connecting to the data server.
      • Role ID - Enter the role ID for connecting to the data server.
    • To use OAuth 2.0 authentication, click the OAuth 2.0 switch and then specify the following details:
      • Username - Enter the name of the user who has role access to log in to NetSuite using an OAuth 2.0 connection.
      • Account ID - Enter the account ID for connecting to the data server. You can get this information by logging into the NetSuite account and viewing the SuiteAnalytics connect information.
      • Role ID - Enter the role ID for connecting to the data server. You can get this information by logging into the NetSuite account and viewing the SuiteAnalytics connect information.
      • Client ID - Enter the client ID for connecting to the data server.

        To obtain the client ID, create an Integration record in NetSuite by enabling OAuth 2.0 Client Credentials Flow. Copy and save the Client ID that is displayed when the Integration Record is successfully created.

      • Public Certificate and Private Key - Use the OpenSSL commands to generate the key pair in the required PEM format. For example,

        openssl req -x509 -newkey rsa:4096 -sha256 -keyout auth-key.pem -out auth-cert.pem -nodes -days 730

        Paste the contents of auth-cert.pem in the Public Certificate field. Paste the contents of auth-key.pem in the Private Key field.

      • Certificate ID - Enter the Certificate ID for connecting to the data server.

        To get the certificate ID, use the NetSuite OAuth 2.0 Client Credentials (M2M) Setup to add the public certificate file (auth-cert.pem) to the certificate key list and copy the generated Certificate ID.

  8. If the source that you want to use for mapping is a saved search, you need to also specify the following details in Saved Search Extraction:
    • Application ID: Enter the NetSuite Application ID for Data Transforms.
    • Version: Enter the NetSuite version number.
  9. Select the checkbox in Build Data Model to install pre-built dataflows and workflows that you can run to extract data from NetSuite and move it to your Oracle target schema using the Build Data Warehouse wizard.
  10. Click Test Connection to test the established connection.
  11. After providing all the required connection details, click Create.

    The new connection is created.
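
If you want to inspect the generated certificate from step 7 before adding it to the NetSuite certificate key list, a standard OpenSSL command such as the following can be used; it assumes the auth-cert.pem file name from the earlier example:

    openssl x509 -in auth-cert.pem -noout -subject -dates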

The newly created connections are displayed in the Connections page.

Click the Actions icon (Actions icon) next to the selected connection to perform the following operations:

  • Select Edit to edit the provided connection details.
  • Select Test Connection to test the created connection.
  • Select Build Data Warehouse to select the functional areas and create the NetSuite Data Warehouse in the target schema. See Using the Build Data Warehouse Wizard for more information.
  • Click Export to export the connection. See Export Objects.
  • Select Delete Schema to delete schemas.
  • Select Delete Connection to delete the created connection.

You can also search for the required Connection to know its details based on the following filters:

  • Name of the Connection.
  • Technology associated with the created Connection.

Using the Build Data Warehouse Wizard

Data in your NetSuite account is grouped into business or subject areas in the Analytics Warehouse. The Build Data Warehouse wizard allows you to select the areas that you want to include in the newly created Data Warehouse.

To use the Build Data Warehouse Wizard:

  1. On the Home page, click the Connections tab. The Connections page appears.
  2. Click the Actions icon (Actions icon) next to the Oracle NetSuite connection that you want to use to build the data warehouse and click Build Data Warehouse.

    The Build Data Warehouse wizard opens.

  3. From the Connection drop-down list, choose the Autonomous Database connection where your target schema resides.
  4. From the Staging Schema drop-down, all schemas corresponding to the selected connection are listed in two groups:
    • Existing Schema (ones that you've imported into Oracle Data Transforms) and
    • New Database Schema (ones that you've not yet imported).
    Select the schema that you want to use from the drop-down.
  5. Similarly select the Target Schema.
  6. Click Next.
  7. Select the NetSuite Business Areas that you want to use to transfer data from the NetSuite Data Warehouse to the target schema.
  8. Click Save.

    Data Transforms starts the process to build the data warehouse. Click Jobs on the left pane of the Home page to monitor the progress of the process. When the job completes successfully, Data Transforms creates a Project folder that includes all the pre-built workflows and dataflows, which you can run to transfer data from the NetSuite connection to your target schema. See Running the Pre-Built Workflows to Load Data into the Target Schema for more information.

Running the Pre-Built Workflows to Load Data into the Target Schema

When the Build Data Warehouse wizard completes successfully, Data Transforms creates a project that includes all the pre-built data flows and workflows that you can run to extract data from a NetSuite connection and load it into your target schema.

To view and run the pre-built workflows:

  1. Click Projects on the left pane of the Home page and select the newly created NetSuite project.
  2. Click Workflows in the left pane. The following pre-built workflows are listed in the Project Details page:
    • Stage NetSuite Source to SDS
    • Extract Transaction Primary Keys
    • Load SDS to Warehouse
    • Apply Deletes
    • All Workflows
  3. Click the Actions icon (Actions icon) next to the workflow you want to run and click Start.

    Oracle recommends that you run All Workflows to execute all the pre-built workflows.

    To see the status of the workflow, click Jobs from the left pane in the current project. When the job completes successfully, all the data from the NetSuite connection is loaded into the target schema.

Create an Oracle Object Storage Connection

You can use Data Transforms to upload data from Oracle Object Storage to Autonomous Database.

The OCI Object Storage dedicated endpoints feature allows OCI customers to securely access the storage buckets. See Object Storage Dedicated Endpoints for more information. You need to use the new URL format when you create Object Storage connections in Data Transforms. For users that already have an Object Storage connection, the existing URL is automatically updated to the new URL format.

To create an Oracle Object Storage connection you need to have an Oracle Cloud Infrastructure username and an auth token. See Getting an Auth Token for information about how to generate the auth token. You need to specify these details when you define the connection in Oracle Data Transforms.

Note the following:

  • To use an Oracle Object Storage connection to import data into Data Transforms, you must use a public IP address to access the compute node. If you want to use a private IP address to access the Object Storage service, make sure that you have access to the Internet.
  • The supported file format for loading data from Oracle Object Storage to Autonomous Database and vice versa is CSV.
  • The supported data types are Numeric, Double, String, and Date.
  • Data load is not supported for Oracle Object Storage connections.

To define an Oracle Object Storage connection,

  1. From the left pane of the Home page, click the Connections tab.
    Connections page appears.
  2. Click Create Connection.
    Create Connection page slides in.
  3. Do one of the following:
    • In the Select Type field, enter the name or part of the name of the connection type.
    • Select the Databases tab.
  4. Select Oracle Object Storage as the connection type.
  5. Click Next.
  6. The Connection Name field is pre-populated with a default name. You can edit this value.
  7. Enter the URL in the Object Storage URL textbox. You can enter the URL in either of the following formats:
    • URL with fully qualified domain name.

      For example,

      https://<namespace>.swiftobjectstorage.<your-region>.oci.customer-oci.com/v1/<your-namespace>/<your-bucket>
      https://<namespace>.objectstorage.<your-region>.oci.customer-oci.com/n/<your-namespace>/b/<your-bucket>/o
    • If you want to use the URL provided by the OCI Console, specify the URL only till the name of the bucket.

      For example,

      https://<namespace>.swiftobjectstorage.<your-region>.oci.customer-oci.com/v1/<your-namespace>/<your-bucket>
      https://<namespace>.objectstorage.<your-region>.oci.customer-oci.com/n/<your-namespace>/b/<your-bucket>/o
    • If you choose Credential as the Connection Mode (see step 8), specify the URL till bucketname/o

      For example,

      https://<namespace>.objectstorage.<your-region>.oci.customer-oci.com/n/<your-namespace>/b/<your-bucket>/o/
      Note

      Credential mode is available only to Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.
    The values for Region, Namespace and Bucket are auto-populated based on the URL provided.
  8. To select the Connection Mode do one of the following:
    • Select Swift Connectivity, and provide the following details:
      • In the User Name text box enter your Oracle Cloud Infrastructure username. For tenancies that support identity domains, specify the domain name along with the username. For example, <identity-domain-name>/<username>.
      • In the Token text box enter the auth token.
    • (This is applicable only to Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.) Select Credential and provide the ODI credential in the Enter Credential text box.

      You must create the credential in the repository and in the Autonomous Database that you created during instance creation. When you create a data flow to map data from Object Storage to Autonomous Database you need to create the ODI credential in the target schema as well. Before you run the mapping, make sure that you select the step and in the Properties panel, set the Create credential KM option to false. Otherwise, the credential-based connection will fail.

      To create the credential, execute the following script (a verification query appears after this procedure):

        begin
          DBMS_CLOUD.create_credential(
            credential_name => '<Credential name>',
            username        => '<oci username>',
            password        => '<auth token>'
          );
        end;
  9. Click Create.
    The new connection is created.
  10. Click Test Connection to test the established connection.
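
After creating the credential described in step 8, you can optionally confirm that it exists by querying the standard Oracle data dictionary view for credentials, connected as the same database user; for example:

    SELECT credential_name, username FROM user_credentials;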

The newly created connections are displayed in the Connections page.

Click the Actions icon (Actions icon) next to the selected connection to perform the following operations:

  • Select Edit to edit the provided connection details.
  • Select Test Connection to test the created connection.
  • Click Export to export the connection. See Export Objects.
  • Select Delete Schema to delete schemas.
  • Select Delete Connection to delete the created connection.

You can also search for the required Connection to know its details based on the following filters:

  • Name of the Connection.
  • Technology associated with the created Connection.

Create a REST Server Connection

You can connect to any REST service endpoint, import the data entities into Data Transforms, and use them as a source in a data flow.

To create a generic REST connector, you need to provide the JDBC URL, username, and password to connect to the endpoint. You can also create and upload a config file that contains information such as the authentication methods, endpoints, and tables that you want to import data entities from.

The Application tab on the Create Connection page includes two connection options to create a generic REST connection - Generic REST and Generic REST Config. This topic has the following sections:

  • Creating a Generic REST Connection
  • Creating a Generic Rest Connection Using a Config File

The newly created connections are displayed in the Connections page.

Click the Actions icon (Actions icon) next to the selected connection to perform the following operations:

  • Select Edit to edit the provided connection details.
  • Select Test Connection to test the created connection.
  • Click Export to export the connection. See Export Objects.
  • Select Delete Schema to delete schemas.
  • Select Delete Connection to delete the created connection.

You can also search for the required connection to know its details based on the following filters:

  • Name of the connection.
  • Technology associated with the created connection.

Creating a Generic REST Connection

To create this connection you need to specify the REST service URL and choose a temporary schema where Data Transforms can create data entities after the reverse-engineering operation.

To define a REST server connection:

  1. From the left pane of the Home page, click the Connections tab.

    Connections page appears.

  2. Click Create Connection.

    Create Connection page slides in.

  3. Do one of the following:
    • In the Select Type field, enter the name or part of the name of the connection type.
    • Select the Applications tab.
  4. Select Generic Rest as the connection type.
  5. Click Next.
  6. The Connection Name field is pre-populated with a default name. You can edit this value.
  7. In the REST Service URL textbox, enter the URL of the endpoint that services the REST resources.
  8. In the Proxy Host textbox, enter the host name of the proxy server to be used for the connection.
  9. In the Proxy Port textbox, enter the port number of the proxy server.
  10. In the User text box enter the user name for connecting to the REST endpoint.
  11. In the Password text box enter the password for connecting to the REST endpoint.
  12. Choose a connection from the Staging Connection drop-down list. The list displays only existing Autonomous Database connections. To use a different connection, create the connection before you reach this page.
  13. After providing all the required connection details, click Test Connection to test the connection.
  14. Click Create.

    The new connection is created.

Creating a Generic Rest Connection Using a Config File

APPLIES TO: Data Transforms that is available as a separate listing on Marketplace called Data Integrator: Web Edition.

To create a generic REST connector, you need the JDBC URL, username, password, and a config file. The config file is a model file with the file_name.rest naming convention that you need to upload when you create a REST Server connection. You need to specify the endpoints, table mappings, and the authentication methods to create the config file. You can create the config file using any text editor.

To define a REST server connection using a config file:
  1. From the left pane of the Home page, click the Connections tab.

    Connections page appears.

  2. Click Create Connection.

    Create Connection page slides in.

  3. Do one of the following:
    • In the Select Type field, enter the name or part of the name of the connection type.
    • Select the Applications tab.
  4. Select Generic Rest Config as the connection type.
  5. Click Next.
  6. The Connection Name field is pre-populated with a default name. You can edit this value.
  7. Use the Config File text box to upload the config file that you want to use.
  8. In the JDBC URL textbox, enter the URL to connect to the server.
  9. In the User and Password text boxes enter the user name and password for connecting to the REST endpoint. You may leave these fields blank if these values are not applicable or are already mentioned in the JDBC URL.
  10. After providing all the required connection details, click Test Connection to test the connection.
  11. Click Create.

    The new connection is created.
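
For illustration only, the JDBC URL in step 8 generally follows the autoREST form shown earlier in this chapter. A sketch with a hypothetical host name, where any additional properties depend on your endpoint and driver, might look like:

    jdbc:weblogic:autorest://api.example.com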