Connecting to Secure Clusters with Kerberos and SSL Enabled Using Spark3 Thrift Server
- On a local system, create a Kerberos directory for the SSL-enabled ODH cluster for Hive.
- Copy spark.service.keytab from the mn0 node (the Spark3 Thrift Server node) of the ODH cluster to the Kerberos directory, and then rename it to oac.keytab.
- Copy /etc/krb5.conf from the mn0 node of the ODH cluster to the Kerberos directory, and then rename it to krb5conf (an example copy command is sketched after this list).
- Update the admin_server and kdc entries in krb5conf with the public IP of the cluster's mn0 node instead of the hostname (see the krb5conf excerpt after this list).
- Create a file named service_details.json inside the Kerberos directory. For example:

      {
        "Host" : "<Public IP of Spark3 Thrift Server node (mn0)>",
        "Port" : "10000",
        "ServicePrincipalName" : "spark/<FQDN of Spark3 Thrift Server node (mn0)>@<REALM_NAME>"
      }
- Create a zip of the Kerberos directory. For example:

      $ ls -1 kerberos
      krb5conf
      oac.keytab
      service_details.json
      $ zip -r spark3tskerb.zip kerberos/*
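The two copy steps above can be run with scp from the local system. The following is a minimal sketch only: the opc SSH user, the private key path, the keytab location /etc/security/keytabs/spark.service.keytab, and the IP 203.0.113.10 are assumptions that may differ on your cluster, and reading the keytab on the node may require sudo.

    # Sketch with placeholder values; adjust the user, key, keytab path, and IP for your cluster.
    $ mkdir -p kerberos
    $ scp -i ~/.ssh/odh_key opc@203.0.113.10:/etc/security/keytabs/spark.service.keytab kerberos/oac.keytab
    $ scp -i ~/.ssh/odh_key opc@203.0.113.10:/etc/krb5.conf kerberos/krb5conf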
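After the kdc and admin_server update, the [realms] section of krb5conf might look like the excerpt below. The realm name EXAMPLE.COM and the IP 203.0.113.10 are placeholders for your cluster's realm and the public IP of the mn0 node.

    [realms]
      EXAMPLE.COM = {
        kdc = 203.0.113.10
        admin_server = 203.0.113.10
      }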
To create a connection for a Kerberos-enabled ODH cluster:

- Open the navigation menu and select Analytics & AI. Under Analytics, select Analytics Cloud.
- To connect to an Oracle Analytics Cloud instance, select the compartment in which you created the instance. If needed, create an instance. See Creating an OAC Instance.
- Select the instance name.
- Select Analytics Home Page.
- Select Create, and then select Connection.
- Select Spark.
- Enter a name for the connection, and then enter the remaining details with the following specifics:
  - Authentication Type: Select Kerberos.
  - Client Credentials: Select spark3tskerb.zip from the local system.
  - Authentication: Select Always use these credentials.
- Select Save.
- To verify the connection, go to the OAC home page and select Connect to Your Data.
- Select the connection you created. If successful, the Apache Hive database tables are listed.
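Independently of Oracle Analytics Cloud, you can sanity-check the same host, port, and service principal from any machine with a Hive or Spark client installed, for example with beeline. This is only a sketch: the use of the service keytab for kinit, the truststore path, and the password are assumptions; substitute the values for your cluster.

    # Placeholders: FQDN, realm, public IP, truststore path, and truststore password.
    $ kinit -kt kerberos/oac.keytab spark/<FQDN of Spark3 Thrift Server node (mn0)>@<REALM_NAME>
    $ beeline -u "jdbc:hive2://<Public IP of Spark3 Thrift Server node (mn0)>:10000/default;principal=spark/<FQDN of Spark3 Thrift Server node (mn0)>@<REALM_NAME>;ssl=true;sslTrustStore=/path/to/truststore.jks;trustStorePassword=<password>" -e "show tables;"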