Today my colleagues wanted to use Hive from Zeppelin. It was the first time anyone had used Hive on this new kerberized cluster, and unfortunately they hit an authentication issue, so I had to debug it.

The client machine had hadoop-client and hive installed, and all the needed keytabs were placed in the config dirs with the right permissions, but it still could not connect to the cluster. The log always showed that authentication failed.
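
Just for reference, this is roughly how I check such a client setup; the keytab path and principal below are placeholders, not the real ones:

ls -l /path/to/user.keytab        # must be readable by the user who runs hive / zeppelin
klist -kt /path/to/user.keytab    # list the principals stored in the keytab
kinit -kt /path/to/user.keytab user@PG.COM && klist   # make sure a TGT lands in the local cache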

First, let me describe our new cluster. It is a cluster for the P&G Group, managed by Cloudera Manager with an enterprise license. The nodes were all installed with CDH 5.10.1 and the other components from parcels of the same version. The client was deployed manually from CDH 5.9.0 rpm packages, plus an Apache community version of Zeppelin. I think that if I didn't use Cloudera Manager, managing the cluster would be much easier. CM is the biggest nightmare in the history of big data; the first big problem is that you never know where the manager puts the config files.

Now, back to our topic: Kerberos. For debugging, I logged in to a cluster node and tried the hive and beeline commands; they both seemed to work fine.

On pg-dmp-slavexx.hadoop

hive
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0

Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.10.1-1.cdh5.10.1.p0.10/jars/hive-common-1.1.0-cdh5.10.1.jar!/hive-log4j.properties
WARNING: Hive CLI is deprecated and migration to Beeline is recommended.
hive> show databases;
OK
default
Time taken: 1.661 seconds, Fetched: 1 row(s)
hive>

Well, that looks good. Now beeline:

beeline -u 'jdbc:hive2://pg-dmp-master2.hadoop:10000/default;principal=hive/pg-dmp-master2.hadoop@PG.COM'
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
scan complete in 1ms
Connecting to jdbc:hive2://pg-dmp-master2.hadoop:10000/default;principal=hive/pg-dmp-master2.hadoop@PG.COM
Connected to: Apache Hive (version 1.1.0-cdh5.10.1)
Driver: Hive JDBC (version 1.1.0-cdh5.10.1)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.1.0-cdh5.10.1 by Apache Hive
0: jdbc:hive2://pg-dmp-master2.hadoop:10000/d> show databases;
INFO  : Compiling command(queryId=hive_20170503222424_9512c898-9822-4659-b07b-f8abb2fd50b7): show databases
INFO  : Semantic Analysis Completed
INFO  : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:database_name, type:string, comment:from deserializer)], properties:null)
INFO  : Completed compiling command(queryId=hive_20170503222424_9512c898-9822-4659-b07b-f8abb2fd50b7); Time taken: 0.004 seconds
INFO  : Executing command(queryId=hive_20170503222424_9512c898-9822-4659-b07b-f8abb2fd50b7): show databases
INFO  : Starting task [Stage-0:DDL] in serial mode
INFO  : Completed executing command(queryId=hive_20170503222424_9512c898-9822-4659-b07b-f8abb2fd50b7); Time taken: 0.013 seconds
INFO  : OK
+----------------+--+
| database_name  |
+----------------+--+
| default        |
+----------------+--+
1 row selected (0.106 seconds)
0: jdbc:hive2://pg-dmp-master2.hadoop:10000/d>

Well, that looks good too.

Second step: I logged in to the 5.9.0 client and tried the same commands, and hive failed.

hive
2017-05-03 22:09:28,228 WARN  [main] mapreduce.TableMapReduceUtil: The hbase-prefix-tree module jar containing PrefixTreeCodec is not present.  Continuing without it.

Logging initialized using configuration in file:/etc/hive/conf.dist/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:541)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:206)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:324)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:285)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:260)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:514)
        ... 8 more
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1530)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3037)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3056)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3281)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:201)
        ... 12 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1528)
        ... 19 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: GSS initiate failed
        at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:430)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:240)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1528)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3037)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3056)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3281)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:201)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:324)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:285)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:260)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:514)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:477)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:240)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        ... 24 more

And beeline failed too.

beeline -u 'jdbc:hive2://pg-dmp-master2.hadoop:10000/default;principal=hive/pg-dmp-master2.hadoop@PG.COM'
2017-05-03 22:27:44,881 WARN  [main] mapreduce.TableMapReduceUtil: The hbase-prefix-tree module jar containing PrefixTreeCodec is not present.  Continuing without it.
scan complete in 1ms
Connecting to jdbc:hive2://pg-dmp-master2.hadoop:10000/default;principal=hive/pg-dmp-master2.hadoop@PG.COM

17/05/03 22:27:46 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:202)
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:167)
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:187)
        at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142)
        at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207)
        at org.apache.hive.beeline.Commands.connect(Commands.java:1457)
        at org.apache.hive.beeline.Commands.connect(Commands.java:1352)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:52)
        at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1130)
        at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1169)
        at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:810)
        at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:890)
        at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:510)
        at org.apache.hive.beeline.BeeLine.main(BeeLine.java:493)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
        ... 35 more
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: jdbc:hive2://pg-dmp-master2.hadoop:10000/default;principal=hive/pg-dmp-master2.hadoop@PG.COM: GSS initiate failed (state=08S01,code=0)
Beeline version 1.1.0-cdh5.9.0 by Apache Hive
beeline>

When I saw "HS2 may be unavailable", I thought the HiveServer2 process must be down, so I checked it in CM and with ps aux, but it was still alive. Then I tried telnet hiveserver2 10000 and telnet metastore_server 9083, and both connected fine. So I didn't know what was happening. Reading the log, there is a SASL error. WTF?
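
For reference, the checks were roughly these (reconstructed from memory; the metastore host name is left as a placeholder):

ps aux | grep -iE 'hiveserver2|metastore'
# both ports answered, so the services themselves were reachable
telnet pg-dmp-master2.hadoop 10000
telnet <metastore-host> 9083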

It looked like a Kerberos authentication issue. I googled the whole day, but found nothing useful.

So I opened hive-env.sh and added the following line. I added the same line to another client which was working well, to compare.

export HADOOP_OPTS="-Dsun.security.krb5.debug=true ${HADOOP_OPTS}"

On the good client, it shows Kerberos debug info like this:

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
scan complete in 2ms
Connecting to jdbc:hive2://pg-dmp-master2.hadoop:10000/default;principal=hive/pg-dmp-master2.hadoop@PG.COM
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
>>>KinitOptions cache name is /tmp/krb5cc_0
>>>DEBUG <CCacheInputStream>  client principal is xianglei@PG.COM
>>>DEBUG <CCacheInputStream> server principal is krbtgt/PG.COM@PG.COM
>>>DEBUG <CCacheInputStream> key type: 23
>>>DEBUG <CCacheInputStream> auth time: Wed May 03 18:29:34 CST 2017
>>>DEBUG <CCacheInputStream> start time: Wed May 03 18:29:34 CST 2017
>>>DEBUG <CCacheInputStream> end time: Thu May 04 18:29:34 CST 2017
>>>DEBUG <CCacheInputStream> renew_till time: Wed May 10 18:29:33 CST 2017
>>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL;
>>>DEBUG <CCacheInputStream>  client principal is xianglei@PG.COM
>>>DEBUG <CCacheInputStream> server principal is X-CACHECONF:/krb5_ccache_conf_data/fast_avail/krbtgt/PG.COM@PG.COM@PG.COM
>>>DEBUG <CCacheInputStream> key type: 0
>>>DEBUG <CCacheInputStream> auth time: Thu Jan 01 08:00:00 CST 1970
>>>DEBUG <CCacheInputStream> start time: null
>>>DEBUG <CCacheInputStream> end time: Thu Jan 01 08:00:00 CST 1970
>>>DEBUG <CCacheInputStream> renew_till time: null
>>> CCacheInputStream: readFlags() 
Found ticket for xianglei@PG.COM to go to krbtgt/PG.COM@PG.COM expiring on Thu May 04 18:29:34 CST 2017
Entered Krb5Context.initSecContext with state=STATE_NEW
Found ticket for xianglei@PG.COM to go to krbtgt/PG.COM@PG.COM expiring on Thu May 04 18:29:34 CST 2017
Service ticket not found in the subject
>>> Credentials acquireServiceCreds: same realm
default etypes for default_tgs_enctypes: 23.
>>> CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
>>> EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
>>> KdcAccessibility: reset
>>> KrbKdcReq send: kdc=pg-dmp-master2.hadoop TCP:88, timeout=3000, number of retries =3, #bytes=621
>>> KDCCommunication: kdc=pg-dmp-master2.hadoop TCP:88, timeout=3000,Attempt =1, #bytes=621
>>>DEBUG: TCPClient reading 612 bytes
>>> KrbKdcReq send: #bytes read=612
>>> KdcAccessibility: remove pg-dmp-master2.hadoop
>>> EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
>>> KrbApReq: APOptions are 00100000 00000000 00000000 00000000
>>> EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
Krb5Context setting mySeqNumber to: 575633251
Created InitSecContextToken:
0000: 01 00 6E 82 02 1B 30 82   02 17 A0 03 02 01 05 A1  ..n...0.........
0010: 03 02 01 0E A2 07 03 05   00 20 00 00 00 A3 82 01  ......... ......
0020: 45 61 82 01 41 30 82 01   3D A0 03 02 01 05 A1 08  Ea..A0..=.......
0030: 1B 06 50 47 2E 43 4F 4D   A2 28 30 26 A0 03 02 01  ..PG.COM.(0&....
0040: 00 A1 1F 30 1D 1B 04 68   69 76 65 1B 15 70 67 2D  ...0...hive..pg-
0050: 64 6D 70 2D 6D 61 73 74   65 72 32 2E 68 61 64 6F  dmp-master2.hado
0060: 6F 70 A3 82 01 00 30 81   FD A0 03 02 01 17 A1 03  op....0.........
0070: 02 01 05 A2 81 F0 04 81   ED 7C 10 DA F1 10 84 5A  ...............Z
0080: EF 26 A4 1F 75 47 E7 AD   18 DE 05 1F B8 F8 9D 2F  .&..uG........./
0090: A1 CB 55 11 1E 19 56 0D   1C 9D B1 6D E3 84 FD A5  ..U...V....m....
00A0: 06 70 06 64 5C 6A F7 05   CE AA 38 6D 53 62 08 23  .p.d\j....8mSb.#
00B0: 2B 4A 8F 77 BB 1F A1 8D   CC A9 5B 31 A5 7A 85 21  +J.w......[1.z.!
00C0: 34 98 9F FD D4 B9 25 74   6A E5 5D FE 77 B1 73 27  4.....%tj.].w.s'
00D0: B1 54 E5 46 05 61 BF 0E   39 9E 1C 2E 3B 03 4A 39  .T.F.a..9...;.J9
00E0: 11 8D D3 F9 8F 23 FA 42   89 A0 1D E4 0C 10 05 C4  .....#.B........
00F0: 12 99 4F 69 6A 0D C6 E1   D0 F0 B3 8B DA 05 AF 35  ..Oij..........5
0100: 9D F1 33 3D A2 8C B1 1A   C9 77 1E 54 99 03 E0 8A  ..3=.....w.T....
0110: D4 20 F9 BC 34 23 7F 4C   A5 DC E4 90 0D 73 74 07  . ..4#.L.....st.
0120: 59 10 13 7C B0 44 5F 20   CE D2 C1 F2 BF 75 77 96  Y....D_ .....uw.
0130: DF 08 7A FF BB 7C 1F 7C   7C 0F 98 90 C2 0F 4D E9  ..z...........M.
0140: 81 A3 1F 64 D7 12 31 1E   A9 0C 78 33 46 66 5A DE  ...d..1...x3FfZ.
0150: F6 8E F6 02 F2 11 1C 8C   F6 BB 0C 4F FB C2 39 DB  ...........O..9.
0160: 7A F3 94 0D 95 28 A4 81   B8 30 81 B5 A0 03 02 01  z....(...0......
0170: 17 A2 81 AD 04 81 AA B7   6B 3E 91 7B 6A 78 A3 35  ........k>..jx.5
0180: E5 40 C3 24 C6 8A 90 29   D6 CC 9A 6C D1 97 DE 58  .@.$...)...l...X
0190: 18 1E B4 E5 B6 8D D3 53   F7 D4 E9 D5 ED E6 F1 E7  .......S........
01A0: AB 7F 16 B3 A6 EB F1 4B   FA FF 23 2E C7 01 60 1E  .......K..#...`.
01B0: 19 45 C0 1C 0C AA 0A 4E   3F A2 50 AD 01 7B FF 97  .E.....N?.P.....
01C0: 31 85 FD 18 34 73 4B 7A   1C 6A 98 2D BD 9E 76 86  1...4sKz.j.-..v.
01D0: 53 A0 78 AF E1 D4 0E 47   7B 78 6E CE 26 64 BB E0  S.x....G.xn.&d..
01E0: A4 72 EE D5 72 23 45 E8   F3 26 F3 CD A8 55 ED 83  .r..r#E..&...U..
01F0: 57 0D C0 F5 F3 38 2B 10   66 10 8D E7 2F F7 01 FE  W....8+.f.../...
0200: 0A 19 57 7E 62 95 CB A1   33 A2 C4 43 CA E6 49 71  ..W.b...3..C..Iq
0210: 63 E6 01 EF 6A A1 4E E2   FC 36 66 65 D6 41 B4 F9  c...j.N..6fe.A..
0220: 64                                                 d

Entered Krb5Context.initSecContext with state=STATE_IN_PROCESS
>>> EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
Krb5Context setting peerSeqNumber to: 15371956
Krb5Context.unwrap: token=[60 30 06 09 2a 86 48 86 f7 12 01 02 02 02 01 11 00 ff ff ff ff 81 6d f2 03 73 5b 76 3c 92 69 4f 82 dc b2 40 63 f9 2d de 4f f8 7c af 41 01 01 00 00 01 ]
Krb5Context.unwrap: data=[01 01 00 00 ]
Krb5Context.wrap: data=[01 01 00 00 ]
Krb5Context.wrap: token=[60 30 06 09 2a 86 48 86 f7 12 01 02 02 02 01 11 00 ff ff ff ff 4d 06 d1 37 3b 4c 57 96 72 04 26 e2 af 91 90 81 b2 f3 e8 d6 07 8e d3 7a 01 01 00 00 01 ]
Connected to: Apache Hive (version 1.1.0-cdh5.10.1)
Driver: Hive JDBC (version 1.1.0-cdh5.10.1)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.1.0-cdh5.10.1 by Apache Hive
0: jdbc:hive2://pg-dmp-master2.hadoop:10000/d>

On the failed node, it prints no credential cache info at all; it just fails:

beeline -u 'jdbc:hive2://pg-dmp-master2.hadoop:10000/default;principal=hive/pg-dmp-master2.hadoop@PG.COM'
2017-05-03 22:27:44,881 WARN  [main] mapreduce.TableMapReduceUtil: The hbase-prefix-tree module jar containing PrefixTreeCodec is not present.  Continuing without it.
scan complete in 1ms
Connecting to jdbc:hive2://pg-dmp-master2.hadoop:10000/default;principal=hive/pg-dmp-master2.hadoop@PG.COM
Java config name: null
Native config name: /etc/krb5.conf
Loaded from native config
17/05/03 22:27:46 [main]: ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:202)
        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:167)
        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
        at java.sql.DriverManager.getConnection(DriverManager.java:571)
        at java.sql.DriverManager.getConnection(DriverManager.java:187)
        at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142)
        at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207)
        at org.apache.hive.beeline.Commands.connect(Commands.java:1457)
        at org.apache.hive.beeline.Commands.connect(Commands.java:1352)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:52)
        at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1130)
        at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1169)
        at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:810)
        at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:890)
        at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:510)
        at org.apache.hive.beeline.BeeLine.main(BeeLine.java:493)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
        ... 35 more
HS2 may be unavailable, check server status
Error: Could not open client transport with JDBC Uri: jdbc:hive2://pg-dmp-master2.hadoop:10000/default;principal=hive/pg-dmp-master2.hadoop@PG.COM: GSS initiate failed (state=08S01,code=0)
Beeline version 1.1.0-cdh5.9.0 by Apache Hive
beeline>

I created a new principal and a new keytab for the failed node, but it still failed. And notice: there is no Kerberos authentication debug info on the failed node at all.
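
The principal and keytab were created more or less like this (a sketch; the admin principal, user principal and keytab path are placeholders):

kadmin -p admin/admin@PG.COM -q "addprinc -randkey someuser@PG.COM"
kadmin -p admin/admin@PG.COM -q "xst -k /path/to/someuser.keytab someuser@PG.COM"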

Then I started thinking: Kerberos creates a local credential cache for the application, so it doesn't have to hit the KDC every time hive is called; authentication is done against the local cache. Maybe the failed node was not reading that local Kerberos cache? So I copied Hadoop's core-site.xml into the Hive conf path. That core-site.xml contains a property named hadoop.security.auth_to_local with the value DEFAULT. And the issue was solved.
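
If you hit the same thing, the fix boils down to making the Hive client read the same core-site.xml as the cluster. The commands below are a sketch only; the paths are assumptions, adjust them to your own config layout:

# check whether the local ticket cache is visible at all
klist
# copy core-site.xml from a working cluster node into the Hive client conf dir
scp pg-dmp-slavexx.hadoop:/etc/hadoop/conf/core-site.xml /etc/hive/conf/
# confirm the principal-to-user mapping rule is in place
grep -A 1 'hadoop.security.auth_to_local' /etc/hive/conf/core-site.xml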

Here are the other logs.

/tmp/root/hive.log on the failed node

2017-05-03 22:09:30,656 ERROR [main]: transport.TSaslTransport (TSaslTransport.java:open(315)) - SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:430)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:240)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1528)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3037)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3056)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3281)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:201)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:324)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:285)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:260)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:514)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
        ... 36 more
2017-05-03 22:09:30,661 WARN  [main]: hive.metastore (HiveMetaStoreClient.java:open(439)) - Failed to connect to the MetaStore Server...
2017-05-03 22:09:31,663 ERROR [main]: transport.TSaslTransport (TSaslTransport.java:open(315)) - SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:430)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:240)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1528)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3037)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3056)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3281)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:201)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:324)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:285)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:260)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:514)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
        ... 36 more
2017-05-03 22:09:31,665 WARN  [main]: hive.metastore (HiveMetaStoreClient.java:open(439)) - Failed to connect to the MetaStore Server...
2017-05-03 22:09:32,666 ERROR [main]: transport.TSaslTransport (TSaslTransport.java:open(315)) - SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
        at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:430)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:240)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1528)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3037)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3056)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3281)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:201)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:324)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:285)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:260)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:514)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
        ... 36 more
2017-05-03 22:09:32,667 WARN  [main]: hive.metastore (HiveMetaStoreClient.java:open(439)) - Failed to connect to the MetaStore Server...
2017-05-03 22:09:33,674 WARN  [main]: metadata.Hive (Hive.java:registerAllFunctionsOnce(204)) - Failed to register all functions.
java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1530)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3037)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3056)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3281)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:201)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:324)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:285)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:260)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:514)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1528)
        ... 19 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: GSS initiate failed
        at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:430)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:240)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1528)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:67)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:82)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3037)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3056)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3281)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:201)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:324)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:285)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:260)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:514)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:628)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:477)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:240)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        ... 24 more

hiveserver2.log

2017-05-03 22:27:46,471 ERROR org.apache.thrift.server.TThreadPoolServer: [HiveServer2-Handler-Pool: Thread-63]: Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:793)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:790)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:356)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1900)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:790)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
        at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
        ... 10 more

metastore server log

2017-05-03 22:58:16,642 ERROR org.apache.thrift.server.TThreadPoolServer: [pool-4-thread-90]: Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:793)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:790)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:356)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1900)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:790)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
        at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
        ... 10 more
2017-05-03 22:58:17,646 ERROR org.apache.thrift.server.TThreadPoolServer: [pool-4-thread-91]: Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:793)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:790)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:356)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1900)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:790)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
        at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
        ... 10 more
2017-05-03 22:58:18,648 ERROR org.apache.thrift.server.TThreadPoolServer: [pool-4-thread-92]: Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:793)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:790)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:356)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1900)
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:790)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.transport.TTransportException: Peer indicated failure: GSS initiate failed
        at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:199)
        at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
        at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
        at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
        ... 10 more
