
...

  1. Review Planning Guide: Please review and verify Install Preparation and its sub-topics.
  2. Acquire Assets: Acquire the installation package for your operating system and your license key. For more information, contact Support.
    1. If you are completing the installation without Internet access, you must also acquire the offline versions of the system dependencies. See Install Dependencies without Internet Access.
  3. Deploy Hadoop cluster: In this scenario, the platform does not create a Hadoop cluster. See below.

    Info

    NOTE: Installation and maintenance of a working Hadoop cluster is the responsibility of the customer. Guidance is provided below on the requirements for integrating the platform with the cluster.

  4. Deploy the node: The platform software must be installed on an edge node of the cluster. Details are below.
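
Before installing the software on the edge node, you may want to confirm that the node can reach the cluster's core services. The following is a minimal preflight sketch, not part of the installer: the hostnames are placeholders, and the ports are common defaults for the HDFS NameNode (8020) and YARN ResourceManager (8032), so substitute the values for your cluster.

# Preflight check: confirm this edge node can open TCP connections to the
# core Hadoop services. Hostnames are placeholders and the ports are common
# defaults; replace both with the values for your cluster.
import socket

SERVICES = {
    "HDFS NameNode": ("namenode.example.com", 8020),
    "YARN ResourceManager": ("resourcemanager.example.com", 8032),
}

def is_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, (host, port) in SERVICES.items():
    status = "OK" if is_reachable(host, port) else "UNREACHABLE"
    print(f"{name:<22} {host}:{port}  {status}")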

...

In your enterprise infrastructure, you must deploy a cluster running a supported version of Hadoop to manage the expected data volumes of your jobs. For more information on suggested sizing, see Sizing Guidelines in the Install Preparation area.

...

Info

NOTE: By default, smaller jobs are executed on the Photon running environment. Larger jobs are executed using Spark on the integrated Hadoop cluster. Spark must be installed on the cluster. For more information, see System Requirements in the Install Preparation area.

...

Additional users may be required. For more information, see Required Users and Groups in the Install Preparation area.
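
If you want to confirm that the expected accounts already exist on the node, the following is a minimal sketch; the user and group names are placeholders only, and the authoritative list is in Required Users and Groups.

# Check whether expected service accounts exist on this node (Linux/Unix).
# The names below are placeholders; consult Required Users and Groups in the
# Install Preparation area for the accounts your deployment actually needs.
import grp
import pwd

REQUIRED_USERS = ["example_service_user"]    # placeholder
REQUIRED_GROUPS = ["example_service_group"]  # placeholder

for user in REQUIRED_USERS:
    try:
        pwd.getpwnam(user)
        print(f"user  {user}: present")
    except KeyError:
        print(f"user  {user}: MISSING")

for group in REQUIRED_GROUPS:
    try:
        grp.getgrnam(group)
        print(f"group {group}: present")
    except KeyError:
        print(f"group {group}: MISSING")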

Deploy the Node

An edge node of the cluster is required to host the platform software. For more information on the requirements of this node, see System Requirements

...

Info

NOTE: You can verify operations using the Photon running environment at this time. While you can also try to run a job on the Hadoop cluster, additional configuration may be required to complete the integration. These steps are listed under Next Steps below.
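
Separately, you can compare the node's local resources against the published System Requirements. The sketch below is illustrative only: it assumes a Linux node with the install volume mounted under /opt, and the thresholds are example values rather than the official minimums.

# Report basic node resources for comparison against System Requirements.
# Assumes a Linux node; /opt and the MIN_* thresholds are example values,
# not the official minimums documented in System Requirements.
import os
import shutil

MIN_CORES = 8            # example value
MIN_MEM_GIB = 64         # example value
MIN_FREE_DISK_GIB = 100  # example value

def total_memory_gib() -> float:
    """Total physical memory in GiB, read from /proc/meminfo."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemTotal:"):
                return int(line.split()[1]) / (1024 ** 2)  # kB -> GiB
    return 0.0

cores = os.cpu_count() or 0
mem_gib = total_memory_gib()
free_gib = shutil.disk_usage("/opt").free / (1024 ** 3)

print(f"CPU cores : {cores} (example minimum: {MIN_CORES})")
print(f"Memory    : {mem_gib:.1f} GiB (example minimum: {MIN_MEM_GIB})")
print(f"Free disk : {free_gib:.1f} GiB under /opt (example minimum: {MIN_FREE_DISK_GIB})")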

...

Required Platform Configuration

This section covers the following topics, some of which should already be completed:

  • Set Base Storage Layer - The base storage layer must be set once and never changed.
  • Create Encryption Key File - If you plan to integrate the platform with any relational sources, including Hive or Redshift, you must create an encryption key file and store it on the node. A generic key-generation sketch appears after this list.
  • Running Environment Options - Depending on your scenario, you may need to perform additional configuration for your available running environment(s) for executing jobs.
  • Profiling Options - In some environments, tweaks to the settings for visual profiling may be required. You can disable visual profiling if needed.
  • Configure for Spark - If you are enabling the Spark running environment, please review and verify the configuration for integrating the platform with the Hadoop cluster instance of Spark.
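
For the encryption key file item above, the following is a generic sketch of creating a random key file with owner-only permissions; the key length, encoding, and path shown are illustrative assumptions, and the actual format and location required by the platform are described in Create Encryption Key File.

# Generic illustration: write a random key file readable only by its owner.
# The key length, encoding, and path are illustrative assumptions; follow
# Create Encryption Key File for the format and location the platform expects.
import base64
import os
import secrets

KEY_PATH = "/path/to/encryption.key"  # placeholder path
KEY_BYTES = 32                        # assumed 256-bit key for illustration

key_material = base64.b64encode(secrets.token_bytes(KEY_BYTES))

# O_EXCL fails if the file already exists; 0o600 restricts access to the owner.
fd = os.open(KEY_PATH, os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o600)
with os.fdopen(fd, "wb") as key_file:
    key_file.write(key_material)

print(f"Wrote {KEY_BYTES * 8}-bit key (base64-encoded) to {KEY_PATH}")
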
Configure for Hadoop
Enable Integration with Compressed Clusters

If the Hadoop cluster uses compression, additional configuration is required.
Enable Integration with Cluster High Availability

If you are integrating with high availability on the Hadoop cluster, please complete these steps.

  • HttpFS must be enabled in the platform when you integrate with high availability on the Hadoop cluster. HttpFS is also required in other, less common cases. See Enable HttpFS.
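
Once HttpFS is enabled, you can confirm that it is reachable from the node with a call to its WebHDFS-compatible REST API. The sketch below assumes the HttpFS default port (14000), a placeholder hostname and user, and simple authentication via the user.name parameter; Kerberos-secured clusters require SPNEGO instead.

# HttpFS reachability check via its WebHDFS-compatible REST API.
# Assumes the default HttpFS port (14000), a placeholder host and user, and
# simple authentication (user.name); Kerberized clusters need SPNEGO instead.
import json
import urllib.request

HTTPFS_HOST = "httpfs.example.com"  # placeholder
HTTPFS_PORT = 14000                 # HttpFS default
HADOOP_USER = "exampleuser"         # placeholder

url = (
    f"http://{HTTPFS_HOST}:{HTTPFS_PORT}/webhdfs/v1/"
    f"?op=LISTSTATUS&user.name={HADOOP_USER}"
)

with urllib.request.urlopen(url, timeout=10) as response:
    listing = json.load(response)

# Print the type and name of each entry in the HDFS root directory.
for entry in listing["FileStatuses"]["FileStatus"]:
    print(entry["type"], entry["pathSuffix"])
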
Configure for Hive

Integration with the Hadoop cluster's instance of Hive.
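
If you want a quick check of HiveServer2 connectivity from the node, the sketch below uses the third-party PyHive client with a placeholder host and user and the default HiveServer2 port (10000); it is independent of the platform's own Hive configuration, and Kerberos-secured clusters require different authentication settings.

# HiveServer2 connectivity check using the third-party PyHive client
# (pip install "pyhive[hive]"). Host and username are placeholders; port
# 10000 is the HiveServer2 default. Kerberos-secured clusters require
# different authentication settings than this simple example.
from pyhive import hive

conn = hive.Connection(
    host="hiveserver2.example.com",  # placeholder
    port=10000,
    username="exampleuser",          # placeholder
    database="default",
)

cursor = conn.cursor()
cursor.execute("SHOW DATABASES")
for (database_name,) in cursor.fetchall():
    print(database_name)

cursor.close()
conn.close()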


Configure for KMS

Integration with the Hadoop cluster's key management system (KMS) for encrypted transport. Instructions are provided for distribution-specific versions of Hadoop.
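
To confirm that the cluster's KMS is reachable from the node, the sketch below calls the Hadoop KMS REST API's key-names endpoint. The host is a placeholder; the port is an assumption (9600 is the Hadoop 3 default, while some older distributions use 16000); and it assumes simple authentication via user.name, so Kerberized clusters need SPNEGO, and listing key names may additionally require KMS ACL permissions.

# Reachability check against the Hadoop KMS REST API (key-names endpoint).
# Host is a placeholder; port 9600 is the Hadoop 3 default (older releases
# often use 16000). Assumes simple authentication via user.name; Kerberized
# clusters need SPNEGO, and listing keys may require KMS ACL permissions.
import json
import urllib.request

KMS_HOST = "kms.example.com"   # placeholder
KMS_PORT = 9600                # assumed default; adjust for your distribution
HADOOP_USER = "exampleuser"    # placeholder

url = f"http://{KMS_HOST}:{KMS_PORT}/kms/v1/keys/names?user.name={HADOOP_USER}"

with urllib.request.urlopen(url, timeout=10) as response:
    key_names = json.load(response)

print("KMS reachable; keys visible to this user:", key_names)
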
Configure Security

A list of topics on applying additional security measures to the platform and how it integrates with Hadoop.

Configure SSO for AD-LDAP

Please complete these steps if you are integrating with your enterprise's AD/LDAP Single Sign-On (SSO) system.
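
Before configuring SSO, you may want to confirm that the AD/LDAP server accepts a bind from this node. The sketch below uses the third-party ldap3 library; the server URL, bind DN, and password are placeholders, and this is only a connectivity and credential check, not the platform's SSO configuration.

# Basic AD/LDAP bind check using the third-party ldap3 library
# (pip install ldap3). The server URL, bind DN, and password are placeholders;
# this only verifies connectivity and credentials from this node.
from ldap3 import ALL, Connection, Server

server = Server("ldaps://ldap.example.com:636", get_info=ALL)  # placeholder URL
conn = Connection(
    server,
    user="CN=svc_bind,OU=Service Accounts,DC=example,DC=com",  # placeholder DN
    password="change-me",                                      # placeholder
)

if conn.bind():
    print("Bind succeeded. Naming contexts:", server.info.naming_contexts)
    conn.unbind()
else:
    print("Bind failed:", conn.result)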