Hive SQL queries run from the Hue web UI return no results and fail with a timeout

The error logged by Hue:
[28/Jul/2017 11:23:29 +0800] decorators ERROR error running <function execute at 0x7fa741ddc8c0>
Traceback (most recent call last):
File "/home/hadoop/.versions/hue-3.10.0/desktop/libs/notebook/src/notebook/decorators.py", line 81, in decorator
return func(*args, **kwargs)
File "/home/hadoop/.versions/hue-3.10.0/desktop/libs/notebook/src/notebook/api.py", line 109, in execute
response['handle'] = get_api(request, snippet).execute(notebook, snippet)
File "/home/hadoop/.versions/hue-3.10.0/desktop/libs/notebook/src/notebook/connectors/hiveserver2.py", line 64, in decorator
return func(*args, **kwargs)
File "/home/hadoop/.versions/hue-3.10.0/desktop/libs/notebook/src/notebook/connectors/hiveserver2.py", line 199, in execute
db.use(query.database)
File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/dbms.py", line 613, in use
return self.client.use(query)
File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 1053, in use
data = self._client.execute_query(query)
File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 748, in execute_query
return self.execute_query_statement(statement=query.query['query'], max_rows=max_rows, configuration=configuration)
File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 752, in execute_query_statement
(results, schema), operation_handle = self.execute_statement(statement=statement, max_rows=max_rows, configuration=configuration, orientation=orientation)
File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 780, in execute_statement
res = self.call(self._client.ExecuteStatement, req)
File "/home/hadoop/.versions/hue-3.10.0/apps/beeswax/src/beeswax/server/hive_server2_lib.py", line 625, in call
res = fn(req)
File "/home/hadoop/.versions/hue-3.10.0/desktop/core/src/desktop/lib/thrift_util.py", line 376, in wrapper
raise StructuredException('THRIFTSOCKET', str(e), data=None, error_code=502)
StructuredException: timed out (code THRIFTSOCKET): None
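The last frame of the traceback is Hue's Thrift wrapper converting a low-level socket timeout into the StructuredException shown above. A minimal sketch of that conversion (the class and function here are simplified stand-ins, not Hue's actual API):

```python
import socket

class StructuredException(Exception):
    """Simplified stand-in for Hue's StructuredException (attribute names are assumptions)."""
    def __init__(self, code, message, data=None, error_code=500):
        super().__init__("%s (code %s): %s" % (message, code, data))
        self.code = code
        self.message = message
        self.data = data
        self.error_code = error_code

def call_with_timeout_mapping(fn, req):
    """Invoke a Thrift client method; map a socket timeout to a 502,
    as desktop/lib/thrift_util.py does in the traceback above."""
    try:
        return fn(req)
    except socket.timeout as e:
        raise StructuredException('THRIFTSOCKET', str(e), data=None, error_code=502)
```

So any Thrift call that exceeds the connection timeout surfaces as `timed out (code THRIFTSOCKET): None`, regardless of which SQL statement triggered it.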

First attempt: edit the config file

/home/hadoop/.versions/hue-3.10.0/desktop/conf/hue.ini

changing

server_conn_timeout=120

to

server_conn_timeout=1200

and restart the Hue service. This did not seem to help.
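For context, server_conn_timeout lives in the [beeswax] section of hue.ini; a minimal fragment with the raised value (other keys omitted):

```ini
[beeswax]
  # Timeout in seconds for Thrift calls to HiveServer2
  server_conn_timeout=1200
```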

Searching online turned up the advice that running Hive queries through the Hue web UI requires the HiveServer2 service to be running.

Checking conf/hue.ini showed that Hue connects to port 10000 on uhadoop-bwgkeu-master2. On master2, ps -ef | grep hive showed HiveServer2 was already running, so I restarted the service, which resolved the problem.

[hadoop@uhadoop-bwgkeu-master2 log]$ ps -ef|grep hive
hadoop : ? :: /usr/java/jdk1..0_60/bin/java -Xmx1024m -Dhive.log.dir=/var/log/hive -Dhive.log.file=hive-server2.log -Dhive.log.threshold=INFO -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/hadoop -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1000m -Xmx4096m -Xms1024m -XX:NewRatio= -XX:MaxHeapFreeRatio= -XX:MinHeapFreeRatio= -XX:+UseParNewGC -XX:-UseGCOverheadLimit -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /home/hadoop/hive/lib/hive-service-1.2..jar org.apache.hive.service.server.HiveServer2 --hiveconf hive.aux.jars.path=file:///home/hadoop/hbase/lib/activation-1.1.jar,file:///home/hadoop/hbase/lib/antisamy-1.4.3.jar,file:///home/hadoop/hbase/lib/apacheds-i18n-2.0.0-M15.jar,file:///home/hadoop/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar,file:///home/hadoop/hbase/lib/api-asn1-api-1.0.0-M20.jar,file:///home/hadoop/hbase/lib/api-util-1.0.0-M20.jar,file:///home/hadoop/hbase/lib/asm-3.2.jar,file:///home/hadoop/hbase/lib/avro-1.7.6-cdh5.8.0.jar,file:///home/hadoop/hbase/lib/aws-java-sdk-core-1.10.6.jar,file:///home/hadoop/hbase/lib/aws-java-sdk-kms-1.10.6.jar,file:///home/hadoop/hbase/lib/aws-java-sdk-s3-1.10.6.jar,file:///home/hadoop/hbase/lib/batik-css-1.7.jar,file:///home/hadoop/hbase/lib/batik-ext-1.7.jar,file:///home/hadoop/hbase/lib/batik-util-1.7.jar,file:///home/hadoop/hbase/lib/bsh-core-2.0b4.jar,file:///home/hadoop/hbase/lib/commons-beanutils-1.7.0.jar,file:///home/hadoop/hbase/lib/commons-beanutils-core-1.7.0.jar,file:///home/hadoop/hbase/lib/commons-cli-1.2.jar,file:///home/hadoop/hbase/lib/commons-codec-1.9.jar,file:///home/hadoop/hbase/lib/commons-collections-3.2.2.jar,file:///home/hadoop/hbase/lib/commons-compress-1.4.1.jar,file:///home/hadoop/hbase/lib/commons-configuration-1.6.jar,file:///home/hadoop/hbase/lib/commons-daem
on-1.0.3.jar,file:///home/hadoop/hbase/lib/commons-digester-1.8.jar,file:///home/hadoop/hbase/lib/commons-el-1.0.jar,file:///home/hadoop/hbase/lib/commons-fileupload-1.2.jar,file:///home/hadoop/hbase/lib/commons-httpclient-3.1.jar,file:///home/hadoop/hbase/lib/commons-io-2.4.jar,file:///home/hadoop/hbase/lib/commons-lang-2.6.jar,file:///home/hadoop/hbase/lib/commons-logging-1.2.jar,file:///home/hadoop/hbase/lib/commons-math-2.1.jar,file:///home/hadoop/hbase/lib/commons-math3-3.1.1.jar,file:///home/hadoop/hbase/lib/commons-net-3.1.jar,file:///home/hadoop/hbase/lib/core-3.1.1.jar,file:///home/hadoop/hbase/lib/curator-client-2.7.1.jar,file:///home/hadoop/hbase/lib/curator-framework-2.7.1.jar,file:///home/hadoop/hbase/lib/curator-recipes-2.7.1.jar,file:///home/hadoop/hbase/lib/disruptor-3.3.0.jar,file:///home/hadoop/hbase/lib/esapi-2.1.0.jar,file:///home/hadoop/hbase/lib/findbugs-annotations-1.3.9-1.jar,file:///home/hadoop/hbase/lib/gson-2.2.4.jar,file:///home/hadoop/hbase/lib/guava-12.0.1.jar,file:///home/hadoop/hbase/lib/hadoop-annotations-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-auth-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-aws-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-client-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-common-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-hdfs-2.6.0-cdh5.4.9-tests.jar,file:///home/hadoop/hbase/lib/hadoop-hdfs-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-mapreduce-client-app-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-mapreduce-client-common-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-mapreduce-client-core-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-yarn-api-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-yarn-client-2.6.0-cdh5.4.9.jar,file:///home/hadoop/
hbase/lib/hadoop-yarn-common-2.6.0-cdh5.4.9.jar,file:///home/hadoop/hbase/lib/hadoop-yarn-server-commo
hadoop Jun28 ? :: /usr/java/jdk1..0_60/bin/java -Xmx1024m -Dhive.log.dir=/var/log/hive -Dhive.log.file=hive-metastore.log -Dhive.log.threshold=INFO -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/hadoop -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,console -Djava.library.path=/home/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1000m -Xmx4096m -Xms1024m -XX:NewRatio= -XX:MaxHeapFreeRatio= -XX:MinHeapFreeRatio= -XX:+UseParNewGC -XX:-UseGCOverheadLimit -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /home/hadoop/hive/lib/hive-service-1.2..jar org.apache.hadoop.hive.metastore.HiveMetaStore
hadoop Jun28 ? :: /usr/java/jdk1..0_60/bin/java -cp /home/hadoop/lib/hadoop-lzo.jar:/home/hadoop/share/hadoop/common/*:/home/hadoop/share/hadoop/common/lib/*:/home/hadoop/share/hadoop/yarn/*:/home/hadoop/share/hadoop/yarn/lib/*:/home/hadoop/spark/lib/*:/home/hadoop/hive/lib/*:/home/hadoop/hbase/lib/*:/home/hadoop/spark/conf/:/home/hadoop/spark/jars/*:/home/hadoop/conf/ -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.history.HistoryServer
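A process showing up in ps does not guarantee the Thrift port is actually answering; a hung HiveServer2 produces exactly this THRIFTSOCKET timeout. A small probe to check whether the port accepts connections (hostname and port taken from the hue.ini settings in this setup):

```python
import socket

def hiveserver2_reachable(host, port=10000, timeout=5.0):
    """Return True if a TCP connection to HiveServer2's Thrift port
    succeeds within `timeout` seconds, False otherwise."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (host from conf/hue.ini in this setup):
# hiveserver2_reachable('uhadoop-bwgkeu-master2', 10000)
```

Note this only proves the listener accepts TCP connections; a session-level hang would still need the restart described above.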

Notes on the configuration:

[beeswax]
hive_server_host=uhadoop-bwgkeu-master2 # host running HiveServer2
hive_server_port= # HiveServer2 port (10000 in this setup)
hive_conf_dir=/home/hadoop/hive/conf # Hive configuration directory
server_conn_timeout= # connection timeout, in seconds
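The section above can be sanity-checked programmatically. The standard-library configparser handles a flat [beeswax] section like this one (the values below are illustrative, filled in from this post's setup; a full hue.ini with nested [[...]] sections would need Hue's configobj-based loader instead):

```python
from configparser import ConfigParser

SAMPLE_HUE_INI = """
[beeswax]
hive_server_host=uhadoop-bwgkeu-master2
hive_server_port=10000
hive_conf_dir=/home/hadoop/hive/conf
server_conn_timeout=1200
"""

cfg = ConfigParser()
cfg.read_string(SAMPLE_HUE_INI)
beeswax = cfg["beeswax"]

# The timeout Hue applies to its HiveServer2 Thrift socket, in seconds.
timeout_s = beeswax.getint("server_conn_timeout")
print(beeswax["hive_server_host"], timeout_s)  # → uhadoop-bwgkeu-master2 1200
```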

Host mappings on the cluster:

# cat /etc/hosts

10.19.128.248 uhadoop-bwgkeu-master1
10.19.196.141 uhadoop-bwgkeu-core4
10.19.80.123 uhadoop-bwgkeu-core2
10.19.185.160 uhadoop-bwgkeu-core1
10.19.91.236 uhadoop-bwgkeu-core3
10.19.72.208 uhadoop-bwgkeu-master2

A simple Hive query to verify that queries now return results:
SELECT s07.description, s07.salary, s08.salary,
s08.salary - s07.salary
FROM
sample_07 s07 JOIN sample_08 s08
ON ( s07.code = s08.code)
WHERE
s07.salary < s08.salary
ORDER BY s08.salary-s07.salary DESC
LIMIT 1000

An even simpler Hive query example:
SELECT s07.description, s07.salary
FROM
sample_07 s07
LIMIT 10

Querying from the Hive shell:

# hive
hive> select * from tbl_push_user_req limit ;
