- If it is integrated with a Cloudera cluster, it must be installed on a gateway node that is managed by Cloudera Manager.
- If it is integrated with Cloudera Data Platform, it must be installed on an edge node of the cluster.
- If it is integrated with a Hortonworks cluster, it must be installed on an Ambari/Hadoop client that is managed by Hortonworks Ambari.
- Customers who originally installed an earlier version on a non-edge node will still be supported. If the software is not installed on an edge node, you may be required to copy over files from the cluster and to synchronize these files after upgrades. The cluster upgrade process is more complicated.
- This requirement does not apply to the following cluster integrations:
- AWS EMR
- Azure Databricks
Where possible, you should install the same version of Java on the node as on the cluster.

Supported Java versions:
- Java 8 (recommended)
- Java 11 (runtime only)
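To confirm which Java line a node is running, you can parse the version string printed by `java -version`. The helper below is a minimal sketch; the `java_major` function name and the sample version strings are illustrative and not part of the product.

```shell
#!/bin/sh
# Extract the Java major version from a `java -version` style string.
# Legacy scheme: "1.8.0_302" -> 8; modern scheme: "11.0.16" -> 11.
java_major() {
  case "$1" in
    1.*) echo "$1" | cut -d. -f2 ;;
    *)   echo "$1" | cut -d. -f1 ;;
  esac
}

java_major "1.8.0_302"   # prints 8
java_major "11.0.16"     # prints 11
```

On a live node you would feed it the real output, for example: `java_major "$(java -version 2>&1 | awk -F '"' '/version/ {print $2; exit}')"`.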
Notes on Java versions:
- OpenJDK 8 is supported.
- NOTE: If you are using Azure Databricks as a datasource, verify that OpenJDK v1.8.0_302 or earlier is installed on the node. Java 8 is required; there is a known issue with TLS v1.3.
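The Azure Databricks version check above can be scripted. This is a sketch under the assumption that the installed version string follows the `1.8.0_NNN` pattern; the `jdk8_update_ok` function name is hypothetical.

```shell
#!/bin/sh
# Return success (0) if a Java 8 version string is at or below update 302,
# the last build before the known TLS v1.3 issue noted above.
jdk8_update_ok() {
  case "$1" in
    1.8.0_*) ;;                      # only the 1.8.0 line is checked here
    *) echo "not a Java 8 version: $1" >&2; return 2 ;;
  esac
  update="${1##*_}"                  # "1.8.0_302" -> "302"
  [ "${update}" -le 302 ]
}

jdk8_update_ok "1.8.0_292" && echo "1.8.0_292: OK for Azure Databricks"
jdk8_update_ok "1.8.0_312" || echo "1.8.0_312: newer than 1.8.0_302"
```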
- There are additional requirements related to the Java JDK listed in the Hadoop Components section below.
- If you are integrating your instance with S3, you must install the Oracle JRE 1.8 onto the node. No other version of Java is supported for S3 integration. For more information, see S3 Access in the Configuration Guide.

Cloudera supported distributions
See Supported Deployment Scenarios for Cloudera in the Install Guide.
Hortonworks supported distributions
See Supported Deployment Scenarios for Hortonworks in the Install Guide.
EMR supported distributions
- Java must be installed on each node of the cluster. For more information, see https://www.cloudera.com/documentation/enterprise/latest/topics/cdh_ig_jdk_installation.html.
- The versions of Java on the node and on the Hadoop cluster do not have to match.
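Because the node and the Hadoop cluster do not have to run the same Java major version, a pre-flight check can simply report whether they match rather than fail. A minimal sketch, with hard-coded example version strings standing in for the real `java -version` output on each side:

```shell
#!/bin/sh
# Compare the Java major version on this node against the cluster's.
# A mismatch is reported but is supported, per the requirement above.
java_major() {
  case "$1" in
    1.*) echo "$1" | cut -d. -f2 ;;
    *)   echo "$1" | cut -d. -f1 ;;
  esac
}

node_java="1.8.0_302"      # example: node runs Java 8
cluster_java="11.0.16"     # example: cluster runs Java 11

if [ "$(java_major "$node_java")" = "$(java_major "$cluster_java")" ]; then
  echo "Java major versions match"
else
  echo "Java major versions differ (this is supported)"
fi
```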
Supported Spark versions:
- Spark 2.3
- Spark 2.4
- Spark 3.0.1