This section describes how to configure the platform for integration with the KMS system for Hortonworks Data Platform. It assumes that access to the cluster is gated by Ranger. Before you begin, verify the prerequisites. See Configure for KMS.
NOTE: These changes should be applied through the management console for the Hadoop cluster before pushing the client configuration files to the nodes of the cluster.
In the following sections:
- [hadoop.user] - the user ID accessing the cluster component
- [hadoop.group] - the appropriate group of the user accessing the cluster component
Add the user to the KMS site file. In Ambari on the Hortonworks cluster, navigate to Configs > Advanced > kms-site. Add the following properties:
<property>
<name>hadoop.kms.proxyuser.[hadoop.user].users</name>
<value>*</value>
</property>
<property>
<name>hadoop.kms.proxyuser.[hadoop.user].groups</name>
<value>*</value>
</property>
<property>
<name>hadoop.kms.proxyuser.[hadoop.user].hosts</name>
<value>*</value>
</property>
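For example, if the user ID used to access the cluster is svc_etl (a hypothetical value; substitute your own user), the entries would read:

<!-- Example only: replace svc_etl with your actual cluster access user -->
<property>
<name>hadoop.kms.proxyuser.svc_etl.users</name>
<value>*</value>
</property>
<property>
<name>hadoop.kms.proxyuser.svc_etl.groups</name>
<value>*</value>
</property>
<property>
<name>hadoop.kms.proxyuser.svc_etl.hosts</name>
<value>*</value>
</property>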
If you are using Ranger's Key Management System, additional configuration is required.
NOTE: These changes apply to the Hortonworks cluster only. Make changes through Ambari; avoid editing configuration files directly. These configuration files do not need to be shared with the platform.
If you are using Hive, add the Hive user and group information to kms-site.xml:
<property>
<name>hadoop.kms.proxyuser.[hadoop.user].users</name>
<value>*</value>
</property>
<property>
<name>hadoop.kms.proxyuser.[hadoop.user].groups</name>
<value>*</value>
</property>
<property>
<name>hadoop.kms.proxyuser.[hadoop.user].hosts</name>
<value>*</value>
</property>
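For example, assuming the Hive service user is hive (a common default; verify the value for your cluster), the Hive entries would look like the following sketch:

<!-- Example only: assumes the Hive service user is named 'hive' -->
<property>
<name>hadoop.kms.proxyuser.hive.users</name>
<value>*</value>
</property>
<property>
<name>hadoop.kms.proxyuser.hive.groups</name>
<value>*</value>
</property>
<property>
<name>hadoop.kms.proxyuser.hive.hosts</name>
<value>*</value>
</property>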
If Kerberos is deployed, verify the following properties in kms-site.xml:
<property>
<name>hadoop.kms.authentication.type</name>
<value>kerberos</value>
<description> Authentication type for the KMS. Can be either "simple" or "kerberos".</description>
</property>
<property>
<name>hadoop.kms.authentication.kerberos.keytab</name>
<value>/etc/security/keytabs/spnego.service.keytab</value>
<description> Path to the keytab with credentials for the configured Kerberos principal.</description>
</property>
<property>
<name>hadoop.kms.authentication.kerberos.principal</name>
<value>HTTP/<FQDN_OF_KMS_HOST>@<YOUR_HADOOP_REALM></value>
<description>The Kerberos principal to use for the HTTP endpoint. The principal must start with 'HTTP/' as per the Kerberos HTTP SPNEGO specification.</description>
</property>
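For reference, a populated principal entry might look like the following, assuming a KMS host of kms01.example.com and a Kerberos realm of EXAMPLE.COM (both hypothetical values):

<!-- Example only: substitute your KMS host FQDN and Kerberos realm -->
<property>
<name>hadoop.kms.authentication.kerberos.principal</name>
<value>HTTP/kms01.example.com@EXAMPLE.COM</value>
</property>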
If you are using Kerberos authentication for KMS, also verify the following properties in kms-site.xml:
<property>
<name>hadoop.kms.proxyuser.hdfs.users</name>
<value>*</value>
</property>
<property>
<name>hadoop.kms.proxyuser.hdfs.groups</name>
<value>*</value>
</property>
<property>
<name>hadoop.kms.proxyuser.hdfs.hosts</name>
<value>*</value>
</property>
NOTE: The following changes need to be applied to the Hortonworks cluster configuration files and then shared with the platform. For more information on the files required by the platform, see Configure for Hadoop.
Edit core-site.xml and make the following change:
hadoop.security.key.provider.path=kms://http@<KMS_HOST>:9292/kms |
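If you maintain core-site.xml directly instead of through Ambari, the same setting can be expressed in standard property form (a sketch; substitute your KMS host):

<!-- Equivalent XML form of the key provider path setting -->
<property>
<name>hadoop.security.key.provider.path</name>
<value>kms://http@<KMS_HOST>:9292/kms</value>
</property>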
Edit hdfs-site.xml and make the following change:
dfs.encryption.key.provider.uri=kms://http@<KMS_HOST>:9292/kms |
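The equivalent XML form for hdfs-site.xml is shown below (a sketch; substitute your KMS host):

<!-- Equivalent XML form of the encryption key provider URI setting -->
<property>
<name>dfs.encryption.key.provider.uri</name>
<value>kms://http@<KMS_HOST>:9292/kms</value>
</property>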
NOTE: The following change is required only if Ranger's KMS system is enabled. If so, this change enables access to files that are stored in secured folders.
Edit dbks-site.xml and update the following property:
NOTE: If the existing value is hdfs, you may leave it as-is.
hadoop.kms.blacklist.DECRYPT_EEK='-'
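In XML form, the updated dbks-site.xml entry would look like the following sketch:

<!-- Value per the update statement above: a single hyphen -->
<property>
<name>hadoop.kms.blacklist.DECRYPT_EEK</name>
<value>-</value>
</property>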
Verify that the updated cluster configuration files have been deployed to the platform.
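One quick way to check KMS connectivity from a node that has received the updated configuration is to list the available encryption keys with the Hadoop key shell (a sketch; substitute your KMS host, and authenticate via kinit first if Kerberos is enabled):

# Lists the encryption keys visible to the current user through the configured KMS
hadoop key list -provider kms://http@<KMS_HOST>:9292/kms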
After the configuration is complete, you can try to import a dataset from a source stored in a cluster location managed by KMS, assuming that any required authentication configuration has been completed. See Import Data Page.
For more information, see Configure Hadoop Authentication.