
Hive reports an error on startup
From hive-site.xml:

    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://node1:3306/hive?createDatabaseIfNotExist=true</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>root</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>admin</value>
    </property>

Hive reports the error below when starting. Has anyone run into this? Please take a look, thanks.
Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://node1:3306/hive?createDatabaseIfNotExist=true, username = root. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Access denied for user (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1055)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3558)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3490)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:919)
    at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:3996)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1284)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2142)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:781)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:46)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:352)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:284)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:187)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
    at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
    at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
    at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
    ... 46 more
Caused by: java.sql.SQLException: Access denied for user (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1055)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3558)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3490)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:919)
    at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:3996)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1284)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2142)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:781)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:46)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:352)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:284)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:187)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
Add a grant:

grant all on *.* to 'root'@'node1' identified by 'password';
The poster above is right. Log in to MySQL and run

grant all on *.* to 'root'@'node1' identified by 'password';

to re-grant the privileges.
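The key detail is that MySQL matches privileges on 'user'@'host', so the grant must name the host the Hive CLI connects from, and the password must equal javax.jdo.option.ConnectionPassword in hive-site.xml. A minimal sketch that builds the matching statements; 'node1' and 'admin' are assumptions taken from this thread's JDBC URL and config:

```shell
# Assumed values: 'node1' is the client host from the JDBC error message,
# 'admin' is the password configured in hive-site.xml.
HIVE_HOST=node1
METASTORE_PW=admin
# Emit a grant whose user@host matches the connecting Hive client:
echo "GRANT ALL ON *.* TO 'root'@'${HIVE_HOST}' IDENTIFIED BY '${METASTORE_PW}';"
echo "FLUSH PRIVILEGES;"
```

Pipe the emitted statements into the server (for example `... | mysql -u root -p`), then start the Hive CLI again.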
Today I ran into a problem (Hive 0.9.0 and HBase 0.93.3): Hive was linked to HBase and the data had been written into HBase, but selecting from the Hive table raised the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.mapred.TableMapReduceUtil.initCredentials(Lorg/apache/hadoop/mapred/JobConf;)V
    at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:419)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:281)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:320)
    at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:154)
    at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1377)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:269)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:689)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:557)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
This is a version incompatibility between Hive and HBase. Using hbase-0.94.7 together with hive-0.9.0 fixes it. Look in /usr/local/hive/lib/ to see which HBase jar Hive was built against, and run that HBase version.
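The bundled jar's filename carries the version Hive expects, so a quick check is to strip the prefix and suffix off that name. A minimal sketch; the filename below is a hypothetical example of what the ls would return on this install:

```shell
# On a real install you would read the name from Hive's lib directory:
#   jar="$(ls /usr/local/hive/lib/ | grep '^hbase-.*\.jar$')"
# Hypothetical example of that filename:
jar="hbase-0.94.7.jar"
ver="${jar#hbase-}"   # drop the "hbase-" prefix
ver="${ver%.jar}"     # drop the ".jar" suffix
echo "Hive was built against HBase ${ver}"   # prints "Hive was built against HBase 0.94.7"
```

Install the matching HBase release (or rebuild Hive against yours) so the mapred.TableMapReduceUtil method signatures line up.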
Check which HBase jar ships with Hive:
# ls /usr/local/hive/lib/
Check the HBase version:
# hbase shell
Create the mapped table from the Hive CLI (not the hbase shell):
hive> CREATE TABLE hbase_table_1(key int, value string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val") TBLPROPERTIES ("hbase.table.name" = "hbase_fudk");
Verify from the HBase side:
# hbase shell
> describe 'hbase_fudk'
> put 'hbase_fudk','','cf1:val',''
> scan 'hbase_fudk'
Query it back from Hive:
hive> select * from hbase_table_1;
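For Hive 0.9-era HBase integration, the HBase handler, HBase, and ZooKeeper jars also have to be on Hive's classpath at query time; the documented way is the CLI's --auxpath option. A sketch only: the jar paths, jar versions, and quorum host below are assumptions for this layout, not values confirmed by the post:

```shell
# Paths, versions, and host are assumptions; adjust to your install.
hive --auxpath /usr/local/hive/lib/hive-hbase-handler-0.9.0.jar,/usr/local/hive/lib/hbase-0.94.7.jar,/usr/local/hive/lib/zookeeper-3.4.3.jar \
     -hiveconf hbase.zookeeper.quorum=node1
```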
hadoop - Hive - InvocationTargetException null - Stack Overflow
I am trying to connect to Hive with Beeline or Hue, and I am getting the error below:
11:56:18,312 ERROR sentry.org.apache.thrift.transport.TSaslTransport: SASL negotiation failure
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
at sentry.org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
at sentry.org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
at sentry.org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:1)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient$UgiSaslClientTransport.baseOpen(SentryPolicyServiceClient.java:115)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient$UgiSaslClientTransport.access$000(SentryPolicyServiceClient.java:77)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient$UgiSaslClientTransport$1.run(SentryPolicyServiceClient.java:101)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient$UgiSaslClientTransport$1.run(SentryPolicyServiceClient.java:99)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient$UgiSaslClientTransport.open(SentryPolicyServiceClient.java:99)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient.<init>(SentryPolicyServiceClient.java:151)
at org.apache.sentry.provider.db.SimpleDBProviderBackend.<init>(SimpleDBProviderBackend.java:52)
at org.apache.sentry.provider.db.SimpleDBProviderBackend.<init>(SimpleDBProviderBackend.java:48)
at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.sentry.binding.hive.authz.HiveAuthzBinding.getAuthProvider(HiveAuthzBinding.java:247)
at org.apache.sentry.binding.hive.authz.HiveAuthzBinding.<init>(HiveAuthzBinding.java:88)
at org.apache.sentry.binding.hive.authz.HiveAuthzBinding.<init>(HiveAuthzBinding.java:81)
at org.apache.sentry.binding.hive.HiveAuthzBindingHook.<init>(HiveAuthzBindingHook.java:98)
at sun.reflect.GeneratedConstructorAccessor11.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:374)
at org.apache.hadoop.hive.ql.hooks.HookUtils.getHooks(HookUtils.java:59)
at org.apache.hadoop.hive.ql.Driver.getHooks(Driver.java:1162)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:440)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:352)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:995)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:988)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:98)
at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:163)
at org.apache.hive.service.cli.session.HiveSessionImpl.runOperationWithLogCapture(HiveSessionImpl.java:514)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:222)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:204)
at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:168)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:316)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1373)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1358)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge20S$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge20S.java:608)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:244)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
... 47 more
11:56:18,313 WARN org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:hive/@ (auth:KERBEROS) cause:sentry.org.apache.thrift.transport.TTransportException: GSS initiate failed
11:56:18,313 ERROR org.apache.hadoop.hive.ql.Driver: FAILED: InvocationTargetException null
java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedConstructorAccessor12.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.sentry.binding.hive.authz.HiveAuthzBinding.getAuthProvider(HiveAuthzBinding.java:247)
at org.apache.sentry.binding.hive.authz.HiveAuthzBinding.<init>(HiveAuthzBinding.java:88)
at org.apache.sentry.binding.hive.authz.HiveAuthzBinding.<init>(HiveAuthzBinding.java:81)
at org.apache.sentry.binding.hive.HiveAuthzBindingHook.<init>(HiveAuthzBindingHook.java:98)
at sun.reflect.GeneratedConstructorAccessor11.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:374)
at org.apache.hadoop.hive.ql.hooks.HookUtils.getHooks(HookUtils.java:59)
at org.apache.hadoop.hive.ql.Driver.getHooks(Driver.java:1162)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:440)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:352)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:995)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:988)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:98)
at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:163)
at org.apache.hive.service.cli.session.HiveSessionImpl.runOperationWithLogCapture(HiveSessionImpl.java:514)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:222)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:204)
at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:168)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:316)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1373)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1358)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge20S$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge20S.java:608)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:244)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1567)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient$UgiSaslClientTransport.open(SentryPolicyServiceClient.java:99)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient.<init>(SentryPolicyServiceClient.java:151)
at org.apache.sentry.provider.db.SimpleDBProviderBackend.<init>(SimpleDBProviderBackend.java:52)
at org.apache.sentry.provider.db.SimpleDBProviderBackend.<init>(SimpleDBProviderBackend.java:48)
... 33 more
Caused by: sentry.org.apache.thrift.transport.TTransportException: GSS initiate failed
at sentry.org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221)
at sentry.org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297)
at sentry.org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:1)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient$UgiSaslClientTransport.baseOpen(SentryPolicyServiceClient.java:115)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient$UgiSaslClientTransport.access$000(SentryPolicyServiceClient.java:77)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient$UgiSaslClientTransport$1.run(SentryPolicyServiceClient.java:101)
at org.apache.sentry.provider.db.service.thrift.SentryPolicyServiceClient$UgiSaslClientTransport$1.run(SentryPolicyServiceClient.java:99)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
... 37 more
I have found a temporary workaround: if I restart the Hive service every day, everything works fine, but I have to restart it daily. I want a permanent solution. Could anyone please help me solve this?
The issue here is not with Hive but with your Kerberos credentials. The tickets issued by the Kerberos KDC stay valid for a fixed number of hours, after which they expire. Before running Hive, run kinit to obtain a fresh ticket-granting ticket (TGT); that way you will not have to restart the Hive service every time.
A simple test is to run the klist command before and after running your Hive service and compare the ticket expiry times.
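To avoid the daily manual step entirely, the kinit can run non-interactively from a keytab on a schedule. A sketch of that setup; the keytab path and principal below are assumptions, not values from the question:

```shell
# Obtain a fresh TGT non-interactively (keytab path and principal are assumed):
kinit -kt /etc/security/keytabs/hive.keytab hive@EXAMPLE.COM
# Inspect the cached ticket and its expiry time:
klist
# Renew before the ticket lifetime elapses, e.g. every 8 hours via cron:
# 0 */8 * * * kinit -kt /etc/security/keytabs/hive.keytab hive@EXAMPLE.COM
```

With the cron entry in place, the service account always holds a valid ticket and the daily Hive restart becomes unnecessary.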