D toc |
---|
Platform Node
The node where the platform software is installed must meet the following requirements.
Item | Description | Status or Value |
---|---|---|
Operating System | The operating system on the node must be one of the supported 64-bit versions. For more information, see System Requirements. | |
Cores | Minimum of four cores. | |
RAM | Minimum of 16 GB of dedicated RAM. | |
Disk Space | Minimum of 20 GB of free disk space. | |
Internet Access | If your data sources are available over an Internet connection, the platform must be permitted to use that connection. | |
Databases | The platform requires access to its databases. For more information, see System Requirements. | |
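As a quick sanity check, the minimums above can be verified directly on the candidate node. The following is a minimal sketch, assuming a Linux node with Python 3; the thresholds come from the table above, while the install path checked for free space (/opt) is an assumption you should adjust.

```python
# check_node.py - sketch: verify the candidate node against the minimums above.
# Assumes Linux + Python 3. The install path (/opt) is an assumption; adjust it.
import os
import shutil

MIN_CORES = 4          # from the checklist: minimum of four cores
MIN_RAM_GB = 16        # minimum of 16 GB of dedicated RAM
MIN_DISK_GB = 20       # minimum of 20 GB of free disk space
INSTALL_PATH = "/opt"  # assumption: where the platform will be installed

def ram_gb():
    """Total physical memory in GB, via sysconf (Linux)."""
    pages = os.sysconf("SC_PHYS_PAGES")
    page_size = os.sysconf("SC_PAGE_SIZE")
    return pages * page_size / 1024 ** 3

checks = {
    "cores": (os.cpu_count() or 0, MIN_CORES),
    "ram_gb": (ram_gb(), MIN_RAM_GB),
    "disk_gb": (shutil.disk_usage(INSTALL_PATH).free / 1024 ** 3, MIN_DISK_GB),
}

for name, (actual, minimum) in checks.items():
    status = "OK" if actual >= minimum else "FAIL"
    print(f"{name}: {actual:.1f} (minimum {minimum}) ... {status}")
```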
Cloud Infrastructure
If you are deploying the platform into a cloud infrastructure, please verify the following.
Item | Description | Status or Value |
---|---|---|
AWS | The platform can be deployed on Amazon Web Services (AWS). | |
Azure | The platform can be deployed on Microsoft Azure. | |
Hadoop Cluster
If your instance of the platform is integrated with a Hadoop cluster, please verify the following requirements.
NOTE: The number of nodes in your cluster and the number of cores and amount of memory on each data node can affect the performance of running jobs on the cluster. If you have questions about cluster size, please contact Support.
Item | Description | Status or Value |
---|---|---|
Cluster Type and Version | Type and version of the Hadoop cluster. Verify that your cluster type and version appear in the list of supported versions. | |
Number of data nodes | Total number of data nodes. | |
Data node - number of cores | Number of cores on each data node. | |
Data node - memory | Amount of RAM (GB) on each data node. | |
Upgrade plans | If there are planned upgrades to the cluster, please review the list of supported versions to verify that the new version is supported within the timeframe of your upgrade plans. | |
Hadoop Cluster Details
During installation and configuration, you may need to specify the following configuration information to successfully integrate the platform with the cluster.
Item | Description | Status or Value |
---|---|---|
Namenode host | Host name of the namenode on the cluster. | |
Namenode port | Port number of the namenode on the cluster. | |
Secondary namenode host | Host name of the secondary namenode on the cluster. | |
Secondary namenode port | Port number of the secondary namenode on the cluster. | |
Namenode service name | Name of the namenode service. | |
ResourceManager host | Host name of the ResourceManager on the cluster. | |
ResourceManager port | Port number of the ResourceManager on the cluster. | |
Secondary ResourceManager host | Host name of the secondary ResourceManager on the cluster. | |
Secondary ResourceManager port | Port number of the secondary ResourceManager on the cluster. | |
Hive host | Host name of the Hive server on the cluster. For more information, see Configure for Hive in the Configuration Guide. | |
Hive port | Port number of the Hive server on the cluster. For more information, see Configure for Hive in the Configuration Guide. | |
HttpFS host | Host name of the HttpFS server on the cluster. | |
HttpFS port | Port number of the HttpFS server on the cluster. | |
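After recording the host and port values above, it can be worth confirming that each endpoint is reachable from the node before you begin installation. This is a minimal sketch; every host name below is a placeholder, and the ports shown are common Hadoop defaults rather than values required by the platform.

```python
# probe_endpoints.py - sketch: confirm the cluster endpoints recorded above
# are reachable from the node. Hosts and ports are placeholders; the ports
# shown are common Hadoop defaults, not platform requirements.
import socket

ENDPOINTS = {
    "namenode": ("namenode.example.com", 8020),
    "resourcemanager": ("rm.example.com", 8032),
    "hiveserver2": ("hive.example.com", 10000),
    "httpfs": ("httpfs.example.com", 14000),
}

for name, (host, port) in ENDPOINTS.items():
    try:
        # A plain TCP connect is enough to verify routing and firewall rules.
        with socket.create_connection((host, port), timeout=5):
            print(f"{name} ({host}:{port}): reachable")
    except OSError as exc:
        print(f"{name} ({host}:{port}): NOT reachable ({exc})")
```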
Hadoop Cluster Security
Item | Description | Status or Value |
---|---|---|
HDFS service user | By default, the platform uses a single HDFS service user to access the cluster. When Kerberos is enabled, this user is used to impersonate other users on the cluster. For more information on required users, see Required Users and Groups. | |
HDFS transfer encryption | Optionally, the cluster can be configured to use SSL/TLS on data transfer for HDFS. On the cluster, this setting is defined through the dfs.encrypt.data.transfer property. | |
SSL on HTTP endpoints | Is encryption applied to the WebHDFS/HttpFS endpoints? | |
Kerberos | Cluster has been enabled for Kerberos. For more information on the integration, see Configure for Kerberos Integration in the Configuration Guide. | |
KDC | Name of the Key Distribution Center (KDC) for Kerberos. For more information on the integration, see Configure for Kerberos Integration in the Configuration Guide. | |
Kerberos realm | Realm of the Key Distribution Center for Kerberos. For more information, see Configure for Kerberos Integration in the Configuration Guide. | |
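The cluster-side security settings in this table can usually be read straight out of the cluster's hdfs-site.xml. The sketch below reports the two properties most relevant to this checklist; the file path is the customary location on many distributions but is an assumption.

```python
# check_hdfs_security.py - sketch: read hdfs-site.xml and report the security
# properties named in the checklist. The path is the usual location on many
# distributions, but is an assumption; adjust it for your cluster.
import xml.etree.ElementTree as ET

HDFS_SITE = "/etc/hadoop/conf/hdfs-site.xml"  # assumption

PROPERTIES = [
    "dfs.encrypt.data.transfer",  # HDFS transfer encryption
    "dfs.http.policy",            # HTTPS_ONLY => SSL on HTTP endpoints
]

root = ET.parse(HDFS_SITE).getroot()
found = {
    prop.findtext("name"): prop.findtext("value")
    for prop in root.iter("property")
}

for name in PROPERTIES:
    print(f"{name} = {found.get(name, '(not set, default applies)')}")
```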
Firewall
Item | Description | Status or Value |
---|---|---|
Firewall between users and cluster | If a firewall is present between users and the cluster, the default web application port must be opened for user access. See below. | |
Web application port | By default, the web application is available through a preconfigured port on the node. As needed, this value can be modified. For more information, see System Ports. | |
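To confirm that the firewall rule is in place, you can test the web application port from a machine on the user side of the firewall. A minimal sketch; the host name and port number are placeholders for your own values (see System Ports for the default).

```python
# check_webapp_port.py - sketch: run from a user's machine to confirm the
# web application port is open through the firewall. Host and port are
# placeholders; see System Ports for the platform's default port.
import socket

HOST = "platform.example.com"  # placeholder: the node's host name
PORT = 3005                    # placeholder: your configured web application port

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"{HOST}:{PORT} is open")
except OSError as exc:
    print(f"{HOST}:{PORT} is blocked or unreachable ({exc})")
```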
Connectivity
Item | Description | Status or Value |
---|---|---|
Primary backend storage | The following primary storage environments are supported: HDFS or S3. For more information, see Set Base Storage Layer in the Configuration Guide. | |
Hive | For more information, see Configure for Hive in the Configuration Guide. | |
Relational Connections | Supported connections include Oracle, SQL Server, Teradata, Tableau, Salesforce, and more. For more information, see Connection Types. | |
Redshift | For more information, see Create Redshift Connections in the Configuration Guide. | |
Desktop Environments
The following requirements apply to end-user desktop environments.
Item | Description | Status or Value |
---|---|---|
Google Chrome version | The application requires that users connect using a supported version of Google Chrome. For a list of supported versions, see Desktop Requirements. | |
Desktop Application | If Google Chrome is not available, users can connect to the application using a custom desktop application. For more information, see Install Desktop Application in the Install Guide. | |
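Where desktops are centrally managed, the installed Chrome version can be collected with a short script and compared against Desktop Requirements. A minimal sketch; the binary names are assumptions that cover common Linux installations.

```python
# check_chrome.py - sketch: report the installed Google Chrome version so it
# can be compared against Desktop Requirements. The binary names are
# assumptions covering common Linux installations.
import shutil
import subprocess

CANDIDATES = ["google-chrome", "google-chrome-stable", "chrome"]

for name in CANDIDATES:
    path = shutil.which(name)
    if path:
        # "google-chrome --version" prints e.g. "Google Chrome 120.0.6099.71"
        out = subprocess.run([path, "--version"], capture_output=True, text=True)
        print(out.stdout.strip() or out.stderr.strip())
        break
else:
    print("Google Chrome not found on PATH")
```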
Extras
Deployment or use of these features requires additional configuration or development external to the application. Related content may not be available in printed format.
Item | Description | Status or Value |
---|---|---|
Cluster Compression | The platform can integrate with clusters that are compressed using Bzip2, Gzip, or Snappy. For more information, see Enable Integration with Compressed Clusters in the Configuration Guide. | |
Single Sign-On | The application can integrate with supported Single Sign-On solutions. For more information, see the Configuration Guide. | |
SSL for the platform | You can apply an SSL certificate to the platform's web application to enable HTTPS connections. | |
API | You can manage aspects of your flows, datasets, and connections through publicly available application programming interfaces (APIs). For more information, see API Reference in the Developer's Guide. | |
UDF | You can create custom user-defined functions (UDFs) for deployment into the platform. For more information on the list of available functions, see Language Index in the Language Reference Guide. For more information on UDFs, see User-Defined Functions in the Developer's Guide. | |
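As an illustration of the API row above, the sketch below issues a single authenticated request. It is hypothetical throughout: the base URL, the /v4/connections path, and the bearer-token scheme are placeholders, and the real routes and authentication are documented in the API Reference in the Developer's Guide.

```python
# list_connections.py - hypothetical sketch of an API call. The base URL,
# endpoint path, and token are placeholders; consult the API Reference in
# the Developer's Guide for the real routes and authentication scheme.
import json
import urllib.request

BASE_URL = "https://platform.example.com"  # placeholder
ENDPOINT = "/v4/connections"               # hypothetical endpoint path
TOKEN = "YOUR_ACCESS_TOKEN"                # placeholder credential

req = urllib.request.Request(
    BASE_URL + ENDPOINT,
    headers={"Authorization": f"Bearer {TOKEN}"},
)
with urllib.request.urlopen(req) as resp:
    print(json.dumps(json.load(resp), indent=2))
```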