How to connect SparkSQL in a Kerberos environment

How do I connect SparkSQL in a Kerberos environment?
I don't see any documentation about Kerberos settings (keytab or principal/password), so I simply ran kinit in the shell and then java -jar metabase.jar, but it does not seem to work.

Below is the detailed log, including the SparkSQL JDBC settings.

12-09 09:34:27 INFO metabase.driver :: Initializing driver :sparksql…
12-09 09:34:27 DEBUG plugins.init-steps :: Loading plugin namespace metabase.driver.sparksql…
12-09 09:34:27 DEBUG plugins.jdbc-proxy :: Registering JDBC proxy driver for class metabase.driver.FixedHiveDriver…
Load lazy loading driver :sparksql took 963.0 µs
12-09 09:34:27 WARN util.NativeCodeLoader :: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
12-09 09:34:28 ERROR transport.TSaslTransport :: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
at metabase.driver.FixedHiveConnection.<init>(Unknown Source)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at clojure.lang.Reflector.invokeConstructor(Reflector.java:294)
at metabase.driver.FixedHiveDriver$driver_connect.invokeStatic(FixedHiveDriver.clj:20)
at metabase.driver.FixedHiveDriver$driver_connect.invoke(FixedHiveDriver.clj:16)
at metabase.driver.FixedHiveDriver.connect(Unknown Source)
at metabase.plugins.jdbc_proxy$proxy_driver$reify__66297.connect(jdbc_proxy.clj:32)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at clojure.java.jdbc$get_driver_connection.invokeStatic(jdbc.clj:271)
at clojure.java.jdbc$get_driver_connection.invoke(jdbc.clj:250)
at clojure.java.jdbc$get_connection.invokeStatic(jdbc.clj:411)
at clojure.java.jdbc$get_connection.invoke(jdbc.clj:274)
at clojure.java.jdbc$db_query_with_resultset_STAR_.invokeStatic(jdbc.clj:1093)
at clojure.java.jdbc$db_query_with_resultset_STAR_.invoke(jdbc.clj:1075)
at clojure.java.jdbc$query.invokeStatic(jdbc.clj:1164)
at clojure.java.jdbc$query.invoke(jdbc.clj:1126)
at clojure.java.jdbc$query.invokeStatic(jdbc.clj:1142)
at clojure.java.jdbc$query.invoke(jdbc.clj:1126)
at metabase.driver.sql_jdbc.connection$can_connect_QMARK_.invokeStatic(connection.clj:159)
at metabase.driver.sql_jdbc.connection$can_connect_QMARK_.invoke(connection.clj:154)
at metabase.driver.sql_jdbc$fn__67312.invokeStatic(sql_jdbc.clj:35)
at metabase.driver.sql_jdbc$fn__67312.invoke(sql_jdbc.clj:34)
at clojure.lang.MultiFn.invoke(MultiFn.java:234)
at metabase.driver.util$can_connect_with_details_QMARK_$fn__19311.invoke(util.clj:31)
at metabase.util$do_with_timeout$fn__6284.invoke(util.clj:334)
at clojure.core$binding_conveyor_fn$fn__5754.invoke(core.clj:2030)
at clojure.lang.AFn.call(AFn.java:18)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
… 46 more
12-09 09:34:28 ERROR transport.TSaslTransport :: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
at metabase.driver.FixedHiveConnection.<init>(Unknown Source)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at clojure.lang.Reflector.invokeConstructor(Reflector.java:294)
at metabase.driver.FixedHiveDriver$driver_connect.invokeStatic(FixedHiveDriver.clj:20)
at metabase.driver.FixedHiveDriver$driver_connect.invoke(FixedHiveDriver.clj:16)
at metabase.driver.FixedHiveDriver.connect(Unknown Source)
at metabase.plugins.jdbc_proxy$proxy_driver$reify__66297.connect(jdbc_proxy.clj:32)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at clojure.java.jdbc$get_driver_connection.invokeStatic(jdbc.clj:271)
at clojure.java.jdbc$get_driver_connection.invoke(jdbc.clj:250)
at clojure.java.jdbc$get_connection.invokeStatic(jdbc.clj:411)
at clojure.java.jdbc$get_connection.invoke(jdbc.clj:274)
at clojure.java.jdbc$db_query_with_resultset_STAR_.invokeStatic(jdbc.clj:1093)
at clojure.java.jdbc$db_query_with_resultset_STAR_.invoke(jdbc.clj:1075)
at clojure.java.jdbc$query.invokeStatic(jdbc.clj:1164)
at clojure.java.jdbc$query.invoke(jdbc.clj:1126)
at clojure.java.jdbc$query.invokeStatic(jdbc.clj:1142)
at clojure.java.jdbc$query.invoke(jdbc.clj:1126)
at metabase.driver.sql_jdbc.connection$can_connect_QMARK_.invokeStatic(connection.clj:159)
at metabase.driver.sql_jdbc.connection$can_connect_QMARK_.invoke(connection.clj:154)
at metabase.driver.sql_jdbc$fn__67312.invokeStatic(sql_jdbc.clj:35)
at metabase.driver.sql_jdbc$fn__67312.invoke(sql_jdbc.clj:34)
at clojure.lang.MultiFn.invoke(MultiFn.java:234)
at metabase.driver.util$can_connect_with_details_QMARK_$fn__19311.invoke(util.clj:31)
at metabase.util$do_with_timeout$fn__6284.invoke(util.clj:334)
at clojure.core$binding_conveyor_fn$fn__5754.invoke(core.clj:2030)
at clojure.lang.AFn.call(AFn.java:18)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
… 46 more
12-09 09:34:28 ERROR driver.util :: Database connection error
java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://yjd-dn-50-24.meizu.mz:10016/log;principal=hive/yjd-dn-50-24.meizu.mz@MEIZU.COM;transportMode=http&useUnicode=true&characterEncoding=utf8&autoReconnect=true&tinyInt1isBit=false&useSSL=false: GSS initiate failed
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:231)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
at metabase.driver.FixedHiveConnection.<init>(Unknown Source)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at clojure.lang.Reflector.invokeConstructor(Reflector.java:294)
at metabase.driver.FixedHiveDriver$driver_connect.invokeStatic(FixedHiveDriver.clj:20)
at metabase.driver.FixedHiveDriver$driver_connect.invoke(FixedHiveDriver.clj:16)
at metabase.driver.FixedHiveDriver.connect(Unknown Source)
at metabase.plugins.jdbc_proxy$proxy_driver$reify__66297.connect(jdbc_proxy.clj:32)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at clojure.java.jdbc$get_driver_connection.invokeStatic(jdbc.clj:271)
at clojure.java.jdbc$get_driver_connection.invoke(jdbc.clj:250)
at clojure.java.jdbc$get_connection.invokeStatic(jdbc.clj:411)
at clojure.java.jdbc$get_connection.invoke(jdbc.clj:274)
at clojure.java.jdbc$db_query_with_resultset_STAR_.invokeStatic(jdbc.clj:1093)
at clojure.java.jdbc$db_query_with_resultset_STAR_.invoke(jdbc.clj:1075)
at clojure.java.jdbc$query.invokeStatic(jdbc.clj:1164)
at clojure.java.jdbc$query.invoke(jdbc.clj:1126)
at clojure.java.jdbc$query.invokeStatic(jdbc.clj:1142)
at clojure.java.jdbc$query.invoke(jdbc.clj:1126)
at metabase.driver.sql_jdbc.connection$can_connect_QMARK_.invokeStatic(connection.clj:159)
at metabase.driver.sql_jdbc.connection$can_connect_QMARK_.invoke(connection.clj:154)
at metabase.driver.sql_jdbc$fn__67312.invokeStatic(sql_jdbc.clj:35)
at metabase.driver.sql_jdbc$fn__67312.invoke(sql_jdbc.clj:34)
at clojure.lang.MultiFn.invoke(MultiFn.java:234)
at metabase.driver.util$can_connect_with_details_QMARK_$fn__19311.invoke(util.clj:31)
at metabase.util$do_with_timeout$fn__6284.invoke(util.clj:334)
at clojure.core$binding_conveyor_fn$fn__5754.invoke(core.clj:2030)
at clojure.lang.AFn.call(AFn.java:18)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed
at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:204)
… 36 more
12-09 09:34:28 DEBUG middleware.log :: POST /api/database 400 807.3 ms (0 DB calls)
{:valid false,
:dbname
"Could not open client transport with JDBC Uri: jdbc:hive2://yjd-dn-50-24.meizu.mz:10016/log;principal=hive/yjd-dn-50-24.meizu.mz@MEIZU.COM;transportMode=http&useUnicode=true&characterEncoding=utf8&autoReconnect=true&tinyInt1isBit=false&useSSL=false: GSS initiate failed",
:message
"Could not open client transport with JDBC Uri: jdbc:hive2://yjd-dn-50-24.meizu.mz:10016/log;principal=hive/yjd-dn-50-24.meizu.mz@MEIZU.COM;transportMode=http&useUnicode=true&characterEncoding=utf8&autoReconnect=true&tinyInt1isBit=false&useSSL=false: GSS initiate failed"}

@hicenlee
Did you try connecting with another client through kinit, preferably a Java client?

Looking at the log you provided, it contains several errors, most importantly:
GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
https://community.cloudera.com/t5/Support-Questions/beeline-returns-quot-Failed-to-find-any-Kerberos-tgt-quot/td-p/96908

And it seems like you might need some extra parameters for Java as well:
https://stackoverflow.com/questions/32205087/javax-security-sasl-saslexception-gss-initiate-failed-caused-by-gssexception
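In case it helps, those extra parameters are usually passed as JVM system properties when launching the jar. This is only a sketch — the krb5.conf path below is an assumption; adjust it for your environment:

```shell
# Point the JVM at the Kerberos config and let it fall back to the
# ticket cache created by kinit (instead of requiring a JAAS login).
# sun.security.krb5.debug=true is optional but helps diagnose
# "Failed to find any Kerberos tgt" errors.
java \
  -Djava.security.krb5.conf=/etc/krb5.conf \
  -Djavax.security.auth.useSubjectCredsOnly=false \
  -Dsun.security.krb5.debug=true \
  -jar metabase.jar
```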

Or could it simply be a problem with sudo vs regular user?
https://stackoverflow.com/questions/48093826/hadoop-kerberos-hdfs-command-failed-to-find-any-kerberos-tgt-even-though-i-ha/48105052
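The sudo-vs-regular-user issue usually comes down to which credential cache the Metabase process can see, since kinit writes to a per-user cache. A quick way to check (the cache path below is an example, not your actual value):

```shell
# Show the credential cache and TGT for the *current* user.
klist

# If Metabase runs as a different user (sudo, a service account, etc.),
# it won't see your cache. Either run kinit as that user, or point the
# process at an explicit cache file before starting it:
export KRB5CCNAME=/tmp/krb5cc_1000   # example path; check your uid
klist                                # verify the TGT is now visible
```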

Anyway, I would recommend first making it work through kinit with a client you know, since you should then be able to reuse the same configuration in Metabase.

@flamber

I have connected to the Spark Thrift server through the beeline client successfully (kinit first, then connected). So it seems the connection parameters and the Thrift server config are OK.

I have previously connected to Kerberos-secured Hive/HBase/Hadoop clusters from Java code, which required some additional setup code. But I know nothing about Clojure.
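For reference, the "additional code" from plain Java typically looks something like the sketch below. This is a hypothetical example, not Metabase's actual code: the principal, keytab path, host, and port are placeholders, and it needs hadoop-common and hive-jdbc on the classpath plus a reachable Kerberized cluster to actually run:

```java
import java.sql.Connection;
import java.sql.DriverManager;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosHiveExample {
    public static void main(String[] args) throws Exception {
        // Tell Hadoop's security layer to use Kerberos, then log in
        // from a keytab so no interactive kinit is needed.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
                "user@EXAMPLE.COM",       // placeholder client principal
                "/path/to/user.keytab");  // placeholder keytab path

        // Note: the principal in the JDBC URL is the *server's*
        // principal (hive/host@REALM), not the logged-in user's.
        String url = "jdbc:hive2://host.example.com:10016/log;"
                + "principal=hive/host.example.com@EXAMPLE.COM";
        try (Connection c = DriverManager.getConnection(url)) {
            System.out.println("connected: " + !c.isClosed());
        }
    }
}
```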

[haiyang1@yjd-etl-0-115 ~]$ beeline -u "jdbc:hive2://yjd-dn-50-24.meizu.mz:10016/log;principal=hive/yjd-dn-50-24.meizu.mz@MEIZU.COM;transportMode=http&useUnicode=true&characterEncoding=utf8&autoReconnect=true&tinyInt1isBit=false&useSSL=false"
/opt/hadoop/etc/hadoop/hadoop-env.sh: line 24: ulimit: open files: cannot modify limit: Operation not permitted
/opt/spark/conf/spark-env.sh: line 48: ulimit: open files: cannot modify limit: Operation not permitted
/opt/hadoop/etc/hadoop/hadoop-env.sh: line 24: ulimit: open files: cannot modify limit: Operation not permitted
Connecting to jdbc:hive2://yjd-dn-50-24.meizu.mz:10016/log;principal=hive/yjd-dn-50-24.meizu.mz@MEIZU.COM;transportMode=http&useUnicode=true&characterEncoding=utf8&autoReconnect=true&tinyInt1isBit=false&useSSL=false
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/tez-0.9.1-minimal/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
19/12/09 10:22:12 INFO Utils: Supplied authorities: yjd-dn-50-24.meizu.mz:10016
19/12/09 10:22:12 INFO Utils: Resolved authority: yjd-dn-50-24.meizu.mz:10016
19/12/09 10:22:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
19/12/09 10:22:13 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://yjd-dn-50-24.meizu.mz:10016/log;principal=hive/yjd-dn-50-24.meizu.mz@MEIZU.COM;transportMode=http&useUnicode=true&characterEncoding=utf8&autoReconnect=true&tinyInt1isBit=false&useSSL=false
Connected to: Spark SQL (version 2.2.1)
Driver: Hive JDBC (version 1.2.1.spark2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.2.1.spark2 by Apache Hive
0: jdbc:hive2://yjd-dn-50-24.meizu.mz:10016/l> show tables;
+-----------+-------------------------------------------+--------------+
| database  | tableName                                 | isTemporary  |
+-----------+-------------------------------------------+--------------+
| log       | ads_intf_nginx_log_project_prov_stat      | false        |
| log       | ads_rpt_nginx_log_cc_attack               | false        |
| log       | ads_rpt_nginx_log_project_pvuv_stat       | false        |
| log       | ads_rpt_nginx_log_project_region_isp      | false        |
| log       | ads_rpt_nginx_log_project_request_status  | false        |
| log       | ads_rpt_nginx_log_project_resource_file   | false        |
| log       | ads_rpt_nginx_log_project_upload_file     | false        |
| log       | ads_rpt_nginx_log_project_uri_bottom      | false        |
| log       | ads_rpt_nginx_log_project_uri_new         | false        |
| log       | bdl_fdt_adlog                             | false        |
| log       | bdl_fdt_adlog_tmp0                        | false        |
| log       | dwd_anystream_nginx_log_detail_h          | false        |
| log       | dwd_log_nginx_business_detail             | false        |
| log       | dwd_log_nginx_business_detail_h           | false        |
| log       | dwm_cc_attack_nginx_log                   | false        |
| log       | dwm_nginx_log_project_stat_mild           | false        |
| log       | dwm_nginx_log_project_uri_new_all         | false        |
| log       | idl_fdt_maillog_conn                      | false        |
| log       | ods_anystream_nginx_log                   | false        |
| log       | ods_dim_nginx_project_to_group_c          | false        |
| log       | ods_dim_nginx_project_to_ip_c             | false        |
| log       | ods_meizu_mail_user_c                     | false        |
+-----------+-------------------------------------------+--------------+
22 rows selected (0.536 seconds)

@hicenlee
Okay. Just want to make sure: are you running beeline and Metabase on the same machine?
If not, then I’m out of ideas, and let’s hope someone else posts a solution.