I am trying to set up Kerberos authentication for HBase following this documentation: http://hbase.apache.org/0.94/book/security.html, and so far I have made very little progress.
This is HBase 1.1.1 from Apache, without any Cloudera additions. The host machine runs CentOS 6.5.
I have already set up the Kerberos KDC and client following the instructions at https://gist.github.com/ashrithr/4767927948eca70845db. The KDC is on the same machine as the HBase instance I am trying to secure.
In short, here is the current state of the environment: the keytab file is located at /opt/hbase.keytab.
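For reference, the principal and keytab were created following the gist above; roughly, the steps look like this (a sketch using MIT Kerberos kadmin.local on the KDC host, not my exact session):
$ kadmin.local -q "addprinc -randkey hbase/xxx.mycompany.com@XXX.MYCOMPANY.COM"
$ kadmin.local -q "ktadd -k /opt/hbase.keytab hbase/xxx.mycompany.com@XXX.MYCOMPANY.COM"
# the keytab must be readable by the OS user that runs the HBase daemons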
Contents of hbase-site.xml:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>file:///opt/hbase-data/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/opt/hbase-data/zookeeper</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.security.authentication</name>
    <value>kerberos</value>
  </property>
  <property>
    <name>hbase.security.authorization</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.coprocessor.region.classes</name>
    <value>org.apache.hadoop.hbase.security.token.TokenProvider</value>
  </property>
  <property>
    <name>hbase.master.keytab.file</name>
    <value>/opt/hbase.keytab</value>
  </property>
  <property>
    <name>hbase.master.kerberos.principal</name>
    <value>hbase/_HOST@XXX.MYCOMPANY.COM</value>
  </property>
  <property>
    <name>hbase.regionserver.kerberos.principal</name>
    <value>hbase/_HOST@XXX.MYCOMPANY.COM</value>
  </property>
  <property>
    <name>hbase.regionserver.keytab.file</name>
    <value>/opt/hbase.keytab</value>
  </property>
</configuration>
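Note that _HOST in the principals above is substituted with the machine's fully qualified hostname at startup, so as a sanity check (a sketch, assuming the standard MIT Kerberos client tools) the output of hostname -f should match the host part of the principal stored in the keytab:
$ hostname -f                    # should print xxx.mycompany.com
$ klist -kt /opt/hbase.keytab    # should list hbase/xxx.mycompany.com@XXX.MYCOMPANY.COM entries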
This is pseudo-distributed mode, and I have not bothered with HDFS, to keep things as simple as possible.
However, when I start HBase with ./start-hbase.sh, I get the following error in regionserver.log:
2015-10-20 17:33:18,068 INFO [regionserver/xxx.mycompany.com/172.24.4.60:16201] regionserver.HRegionServer: reportForDuty to master=xxx.mycompany.com,16000,1445349909162 with port=16201, startcode=1445349910087
2015-10-20 17:33:18,071 WARN [regionserver/xxx.mycompany.com/172.24.4.60:16201] ipc.AbstractRpcClient: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2015-10-20 17:33:18,071 FATAL [regionserver/xxx.mycompany.com/172.24.4.60:16201] ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:609)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:154)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:735)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:732)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:732)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:885)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:854)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1180)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.RegionServerStatusProtos$RegionServerStatusService$BlockingStub.regionServerStartup(RegionServerStatusProtos.java:8982)
at org.apache.hadoop.hbase.regionserver.HRegionServer.reportForDuty(HRegionServer.java:2260)
at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:893)
at java.lang.Thread.run(Thread.java:745)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
... 18 more
2015-10-20 17:33:18,072 WARN [regionserver/xxx.mycompany.com/172.24.4.60:16201] regionserver.HRegionServer: error telling master we are up
com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to xxx.mycompany.com/172.24.4.60:16000
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:223)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.RegionServerStatusProtos$RegionServerStatusService$BlockingStub.regionServerStartup(RegionServerStatusProtos.java:8982)
at org.apache.hadoop.hbase.regionserver.HRegionServer.reportForDuty(HRegionServer.java:2260)
at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:893)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Could not set up IO Streams to xxx.mycompany.com/172.24.4.60:16000
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:777)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:885)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:854)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1180)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
... 5 more
Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:677)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:635)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:743)
... 9 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:609)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:154)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:735)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:732)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:732)
... 9 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
... 18 more
2015-10-20 17:33:18,073 WARN [regionserver/xxx.mycompany.com/172.24.4.60:16201] regionserver.HRegionServer: reportForDuty failed; sleeping and then retrying.
I believe Kerberos itself is working, because I can get a ticket:
$ klist -ekt hbase.keytab
Keytab name: FILE:hbase.keytab
KVNO Timestamp Principal
---- ----------------- --------------------------------------------------------
3 10/19/15 17:11:42 hbase/xxx.mycompany.com@XXX.MYCOMPANY.COM (arcfour-hmac)
3 10/19/15 17:11:42 hbase/xxx.mycompany.com@XXX.MYCOMPANY.COM (des3-cbc-sha1)
3 10/19/15 17:11:42 hbase/xxx.mycompany.com@XXX.MYCOMPANY.COM (des-cbc-crc)
$ kinit -kt /opt/hbase.keytab hbase/xxx.mycompany.com@XXX.MYCOMPANY.COM
[userx1@gms-01 logs]$ klist
Ticket cache: FILE:/tmp/krb5cc_2369
Default principal: hbase/xxx.mycompany.com@XXX.MYCOMPANY.COM
Valid starting Expires Service principal
10/20/15 17:49:32 10/21/15 03:49:32 krbtgt/XXX.MYCOMPANY.COM@XXX.MYCOMPANY.COM
renew until 10/27/15 16:49:32
The HBase shell throws the same exception as above when I try to run the status command (or any other command).
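For completeness, the shell session looks roughly like this (a sketch, not the exact transcript), using the ticket obtained via the kinit shown above:
$ hbase shell
hbase(main):001:0> status
# fails with the same "GSS initiate failed ... No valid credentials provided" exception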
If anyone has any suggestions or advice, please let me know.
Thanks in advance.
I ran into the same problem and solved it by starting the region server separately. When starting the daemons, make sure you have a valid Kerberos TGT.
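Concretely, "starting the daemons separately with a valid TGT" can look something like this (a sketch using the stock hbase-daemon.sh script; adjust paths and the principal for your environment):
$ kinit -kt /opt/hbase.keytab hbase/xxx.mycompany.com@XXX.MYCOMPANY.COM
$ klist                                  # confirm a valid TGT is present before starting anything
$ ./bin/hbase-daemon.sh start master
$ ./bin/hbase-daemon.sh start regionserver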