
Integrating Hadoop Security with Active Directory

  Important: With CDH 5.1 and higher, clusters managed by Cloudera Manager 5.1 (and higher) do not require a local MIT KDC and are able to integrate directly with an Active Directory KDC. Cloudera recommends you use a direct-to-AD setup. For instructions, see Enabling Kerberos Authentication Using the Wizard.
If direct integration with AD is not currently possible, use the following instructions to configure a local MIT KDC to trust your AD server:
  1. Run an MIT Kerberos KDC and realm local to the cluster and create all service principals in this realm.
  2. Set up one-way cross-realm trust from this realm to the Active Directory realm. Using this method, there is no need to create service principals in Active Directory, but Active Directory principals (users) can be authenticated to Hadoop. See Configuring a Local MIT Kerberos Realm to Trust Active Directory.

Configuring a Local MIT Kerberos Realm to Trust Active Directory

On the Active Directory Server

  1. Add the local realm trust to Active Directory with this command:
    netdom trust YOUR-LOCAL-REALM.COMPANY.COM /Domain:AD-REALM.COMPANY.COM /add /realm /passwordt:<TrustPassword>
  2. Set the proper encryption type with this command:

    On Windows 2003 R2:

    ktpass /MITRealmName YOUR-LOCAL-REALM.COMPANY.COM /TrustEncryp <enc_type>

    On Windows 2008:

    ksetup /SetEncTypeAttr YOUR-LOCAL-REALM.COMPANY.COM <enc_type>

    The <enc_type> parameter specifies AES, DES, or RC4 encryption. Refer to the documentation for your version of Windows Active Directory to find the <enc_type> parameter string to use.

  3. Get and verify the list of encryption types set with this command:

    On Windows 2008:

    ksetup /GetEncTypeAttr YOUR-LOCAL-REALM.COMPANY.COM
      Important: Make sure the encryption type you specify is supported on both your version of Windows Active Directory and your version of MIT Kerberos.
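
For example, on a Windows 2008 domain controller, the complete sequence might look like the following sketch. The trust password (Trust4Hadoop!) and the AES-256 encryption type are placeholder values for illustration only; substitute the values appropriate for your environment.

netdom trust YOUR-LOCAL-REALM.COMPANY.COM /Domain:AD-REALM.COMPANY.COM /add /realm /passwordt:Trust4Hadoop!
ksetup /SetEncTypeAttr YOUR-LOCAL-REALM.COMPANY.COM AES256-CTS-HMAC-SHA1-96
ksetup /GetEncTypeAttr YOUR-LOCAL-REALM.COMPANY.COM

Whatever trust password you choose here must be used again when you add the cross-realm krbtgt principal on the MIT KDC server, as described in the next section.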

On the MIT KDC Server

Type the following command in the kadmin.local or kadmin shell to add the cross-realm krbtgt principal. Use the same password you used in the netdom command on the Active Directory Server.

kadmin:  addprinc -e "<enc_type_list>" krbtgt/YOUR-LOCAL-REALM.COMPANY.COM@AD-REALM.COMPANY.COM

where the <enc_type_list> parameter specifies the encryption types this cross-realm krbtgt principal will support: AES, DES, or RC4. You can list multiple encryption types in this parameter; what matters is that at least one of them matches the encryption type of the tickets granted by the KDC in the remote realm. For example:

kadmin:  addprinc -e "rc4-hmac:normal des3-hmac-sha1:normal" krbtgt/YOUR-LOCAL-REALM.COMPANY.COM@AD-REALM.COMPANY.COM
  Note: The cross-realm krbtgt principal that you add in this step must have at least one entry that uses the same encryption type as the tickets issued by the remote KDC. If no entry matches, the symptom is that authenticating as a principal in the local realm lets you run Hadoop commands successfully, while authenticating as a principal in the remote realm does not.
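
To confirm which key types the cross-realm principal actually carries, you can inspect it with kadmin's getprinc command; the listing includes one Key: entry per encryption type, and at least one of these should match the type configured on the Active Directory side. For example:

kadmin:  getprinc krbtgt/YOUR-LOCAL-REALM.COMPANY.COM@AD-REALM.COMPANY.COM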

On All of the Cluster Hosts

  1. Verify that both Kerberos realms are configured on all of the cluster hosts. The default realm and the domain realm should remain set to the MIT Kerberos realm that is local to the cluster.
    [realms]
      AD-REALM.CORP.FOO.COM = {
        kdc = ad.corp.foo.com:88
        admin_server = ad.corp.foo.com:749
        default_domain = foo.com
      }
      CLUSTER-REALM.CORP.FOO.COM = {
        kdc = cluster01.corp.foo.com:88
        admin_server = cluster01.corp.foo.com:749
        default_domain = foo.com
      }
  2. To properly translate principal names from the Active Directory realm into local names within Hadoop, you must configure the hadoop.security.auth_to_local setting in the core-site.xml file on all of the cluster machines. The following example translates all principal names with the realm AD-REALM.CORP.FOO.COM into the first component of the principal name only. It also preserves the standard translation for the default realm (the cluster realm).
    <property>
      <name>hadoop.security.auth_to_local</name>
      <value>
        RULE:[1:$1@$0](^.*@AD-REALM\.CORP\.FOO\.COM$)s/^(.*)@AD-REALM\.CORP\.FOO\.COM$/$1/g
        RULE:[2:$1@$0](^.*@AD-REALM\.CORP\.FOO\.COM$)s/^(.*)@AD-REALM\.CORP\.FOO\.COM$/$1/g
        DEFAULT
      </value>
    </property>
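
A quick way to spot-check these rules is the HadoopKerberosName class that ships with Hadoop, which prints the short name that a given principal resolves to under the current configuration. Assuming the example realm above and a placeholder user jdoe:

hadoop org.apache.hadoop.security.HadoopKerberosName jdoe@AD-REALM.CORP.FOO.COM

If the rules are in effect, the output shows the principal mapping to the short name jdoe.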

For more information about name mapping rules, see Configuring the Mapping from Kerberos Principals to Short Names.
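
Once the trust, the krb5.conf entries, and the mapping rules are in place on every host, a simple end-to-end check is to authenticate as an Active Directory user and run a Hadoop command (jdoe is again a placeholder for any AD user with access to the cluster):

kinit jdoe@AD-REALM.CORP.FOO.COM
hadoop fs -ls /

If kinit succeeds but the Hadoop command fails with an authentication error, revisit the encryption types on the cross-realm krbtgt principal, as described in the Note above.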
