
连接端口 配置hive_Zeppelin带有Kerberos认证的Hive解释器的配置

Published: 2024/10/8, by 豆豆

2. Installing and configuring the Zeppelin connection to Hive

Zeppelin version: 0.8.2; Hive version: 3.0.0

2.1. Install and start Hive 3

2.1.1. Configure HiveServer2

To integrate Zeppelin with Hive, both the Hive metastore service and the HiveServer2 service must be running.

First, configure HiveServer2 for Hive:

conf/hive-site.xml:

<property>
  <name>hive.server2.thrift.port</name>
  <value>10005</value>
</property>
<property>
  <name>hive.server2.thrift.bind.host</name>
  <value>hadoop01</value>
</property>
<property>
  <name>hive.server2.long.polling.timeout</name>
  <value>5000</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://hadoop01:9083</value>
</property>

Start the metastore and HiveServer2:

nohup bin/hive --service metastore &
nohup bin/hive --service hiveserver2 &

2.2. Connect with beeline:

beeline -u "jdbc:hive2://hadoop01:10005/feedback;principal=hive/_HOST@YOUR_REALM;auth=kerberos"
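The YOUR_REALM placeholder in the JDBC URL can be looked up from the default_realm entry of /etc/krb5.conf. A minimal sketch, assuming a standard krb5.conf layout (the realm_from_krb5 helper, the sample file, and the EXAMPLE.COM realm are our own illustrations, not part of Hive or the Kerberos tooling):

```shell
# Extract the default Kerberos realm from a krb5.conf-style file.
# Helper name and sample path are hypothetical; adapt as needed.
realm_from_krb5() {
  awk -F= '/^[[:space:]]*default_realm/ { gsub(/[[:space:]]/, "", $2); print $2 }' "$1"
}

# Demonstrate against a sample krb5.conf (EXAMPLE.COM is a placeholder realm):
printf '[libdefaults]\n  default_realm = EXAMPLE.COM\n' > /tmp/krb5.sample
realm_from_krb5 /tmp/krb5.sample
```

On a real cluster you would run `realm_from_krb5 /etc/krb5.conf` and substitute the result for the realm part of the beeline URL above.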

2.3. Configuring the Zeppelin Hive interpreter under Kerberos authentication

default.driver = org.apache.hive.jdbc.HiveDriver

# The default Thrift port is 10000; the HTTP transport port is 10001.
# If Hive runs behind a load balancer, do not connect to these ports
# directly; use the load balancer's address and port instead.
# YOUR_REALM is the Kerberos realm; replace it with the actual realm
# (check /etc/krb5.conf if you are unsure).
default.url = jdbc:hive2://localhost:10000/default;principal=hive/_HOST@YOUR_REALM

default.user = hive

zeppelin.jdbc.auth.type = KERBEROS

# The user running Zeppelin needs read permission on hive.keytab.
zeppelin.jdbc.keytab.location = /home/xxxxx/hive.keytab

# Replace with your actual principal.
zeppelin.jdbc.principal = hive@YOUR_REALM

default.proxy.user.property = hive.server2.proxy.user
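Because the Zeppelin process user needs read access to the keytab, a quick permission check can save a debugging round trip. A small sketch (check_keytab is our own helper; the keytab path is the placeholder from the configuration above):

```shell
# Report whether a keytab file is readable by the current user.
# check_keytab is a hypothetical helper for illustration only.
check_keytab() {
  if [ -r "$1" ]; then echo "readable"; else echo "NOT readable"; fi
}

check_keytab /home/xxxxx/hive.keytab
```

Run this as the user that launches Zeppelin; if it reports NOT readable, fix the keytab's ownership or permissions before restarting the interpreter.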

2.4. Add dependency jars

You can either copy the jars directly into the interpreter/jdbc/ directory or add them through the Zeppelin web UI.

Why copy them into interpreter/jdbc/? The Hive interpreter we configured belongs to the JDBC interpreter group, and the interpreter/jdbc directory is added to the CLASSPATH when the interpreter starts.
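The copy route can be sketched as follows. The copy_jars helper is our own; the jar names and the HIVE_HOME/HADOOP_HOME/ZEPPELIN_HOME layouts are typical for Hive 3 and Hadoop installations but are assumptions to verify against your environment (hadoop-common is commonly needed for the Kerberos UserGroupInformation classes):

```shell
# Copy every existing jar matching the given globs into a destination
# directory, silently skipping globs that match nothing.
copy_jars() {  # usage: copy_jars DEST GLOB...
  local dest=$1; shift
  for jar in "$@"; do
    [ -e "$jar" ] && cp "$jar" "$dest"
  done
  return 0
}

# Example invocation; paths and jar names are assumptions, not guarantees.
copy_jars "$ZEPPELIN_HOME/interpreter/jdbc/" \
  "$HIVE_HOME"/jdbc/hive-jdbc-*-standalone.jar \
  "$HADOOP_HOME"/share/hadoop/common/hadoop-common-*.jar
```

After copying, restart the interpreter so the new jars are picked up on its CLASSPATH.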

2.5. Error encountered during configuration:

org.apache.zeppelin.interpreter.InterpreterException: Error in doAs
	at org.apache.zeppelin.jdbc.JDBCInterpreter.getConnection(JDBCInterpreter.java:479)
	at org.apache.zeppelin.jdbc.JDBCInterpreter.executeSql(JDBCInterpreter.java:692)
	at org.apache.zeppelin.jdbc.JDBCInterpreter.interpret(JDBCInterpreter.java:820)
	at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:103)
	at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:632)
	at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
	at org.apache.zeppelin.scheduler.ParallelScheduler$JobRunner.run(ParallelScheduler.java:162)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.UndeclaredThrowableException
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1700)
	at org.apache.zeppelin.jdbc.JDBCInterpreter.getConnection(JDBCInterpreter.java:471)
	... 13 more
Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://10.16.0.83:10005/feedback;principal=mcloud/bj-jd-appsubmit-node-prod-0002.tendcloud.com@HADOOP.COM;auth=kerberos: GSS initiate failed
	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:256)
	at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:208)
	at org.apache.commons.dbcp2.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:79)
	at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:205)
	at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:861)
	at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435)
	at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
	at org.apache.commons.dbcp2.PoolingDriver.connect(PoolingDriver.java:129)
	at java.sql.DriverManager.getConnection(DriverManager.java:664)
	at java.sql.DriverManager.getConnection(DriverManager.java:270)
	at org.apache.zeppelin.jdbc.JDBCInterpreter.getConnectionFromPool(JDBCInterpreter.java:425)
	at org.apache.zeppelin.jdbc.JDBCInterpreter.access$000(JDBCInterpreter.java:91)
	at org.apache.zeppelin.jdbc.JDBCInterpreter$2.run(JDBCInterpreter.java:474)
	at org.apache.zeppelin.jdbc.JDBCInterpreter$2.run(JDBCInterpreter.java:471)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
	... 14 more
Caused by: org.apache.thrift.transport.TTransportException: GSS initiate failed
	at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232)
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:316)
	at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
	at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:51)
	at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:48)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
	at org.apache.hadoop.hive.metastore.security.TUGIAssumingTransport.open(TUGIAssumingTransport.java:48)
	at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:343)
	at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:228)
	... 32 more

Solution: adding the following property to the interpreter settings resolved the error:

default.proxy.user.property=hive.server2.proxy.user
