Troubleshooting: Kerberos Authentication Exception
ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
Solution
Add Kerberos authentication.
1) Generate the keytab file (this can only be done as a Kerberos admin user):
kadmin.local
xst -norandkey -k chen.keytab chenweidong@HADOOP.COM
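As a quick sanity check (optional, using the file and principal from the example above), you can list the entries in the exported keytab:
# The principal chenweidong@HADOOP.COM should appear in the output
klist -kt chen.keytab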
2) Add the following command to the shell script:
kinit -kt chen.keytab chenweidong@HADOOP.COM
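A minimal sketch of how the top of s_base.sh might look after this change (the actual job logic is not shown in the original; the Sqoop line below is only a placeholder):

#!/bin/bash
# Obtain a Kerberos TGT from the keytab before any Hive/Sqoop command runs.
# chen.keytab is localized into the action's working directory by the
# <file> element added in step 3, so the relative path resolves.
kinit -kt chen.keytab chenweidong@HADOOP.COM

# ... original job logic follows here, e.g. a Sqoop import into Hive (placeholder):
# sqoop import --connect "$JDBC_URL" --table "$1" --hive-import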
3) In the Hue web UI, attach the keytab file to the Oozie workflow:
<workflow-app name="user_bank" xmlns="uri:oozie:workflow:0.5">
    <start to="shell-bcd1"/>
    <kill name="Kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="shell-bcd1">
        <shell xmlns="uri:oozie:shell-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <exec>/user/chenweidong/s_base.sh</exec>
            <argument>user_bank</argument>
            <file>/user/chenweidong/s_base.sh#s_base.sh</file>
            <file>/user/chenweidong/chen.keytab#chen.keytab</file>
            <capture-output/>
        </shell>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>
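The second <file> element is what makes this work: Oozie copies /user/chenweidong/chen.keytab from HDFS into the shell action's working directory under the symlink name after the #, which is why the kinit command in step 2 can reference it by the relative path chen.keytab.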
Exception log
18/09/12 16:28:17 INFO hive.metastore: Trying to connect to metastore with URI thrift://master.prodcdh.com:9083
18/09/12 16:28:17 ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1685)
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:532)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:297)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1700)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3554)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3606)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3586)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3840)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:246)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:229)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:386)
at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:330)
at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:310)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:286)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.createHiveDB(BaseSemanticAnalyzer.java:228)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.<init>(BaseSemanticAnalyzer.java:207)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.<init>(SemanticAnalyzer.java:359)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzerFactory.get(SemanticAnalyzerFactory.java:304)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:537)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1347)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1480)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1267)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1257)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:409)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:342)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:489)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:505)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:808)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:774)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:701)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:341)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:246)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:543)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:634)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:232)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:241)
at org.apache.sqoop.Sqoop.main(Sqoop.java:250)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
... 62 more
18/09/12 16:28:17 WARN hive.metastore: Failed to connect to the MetaStore Server...
18/09/12 16:28:17 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
18/09/12 16:28:18 INFO hive.metastore: Trying to connect to metastore with URI thrift://master.prodcdh.com:9083
18/09/12 16:28:18 ERROR transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271)