
Assorted Hive Problems

Error while cleaning up the server resources
00:30:07.590 [Druid-ConnectionPool-Destroy-1262024801] ERROR com.alibaba.druid.util.JdbcUtils - close connection error
java.sql.SQLException: Error while cleaning up the server resources
	at org.apache.hive.jdbc.HiveConnection.close(HiveConnection.java:640)
	at com.alibaba.druid.util.JdbcUtils.close(JdbcUtils.java:85)
	at com.alibaba.druid.pool.DruidDataSource.shrink(DruidDataSource.java:3194)
	at com.alibaba.druid.pool.DruidDataSource$DestroyTask.run(DruidDataSource.java:2938)
	at com.alibaba.druid.pool.DruidDataSource$DestroyConnectionThread.run(DruidDataSource.java:2922)
Caused by: org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe
	at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:161)
	at org.apache.thrift.transport.TSaslTransport.flush(TSaslTransport.java:501)
	at org.apache.thrift.transport.TSaslClientTransport.flush(TSaslClientTransport.java:37)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
	at org.apache.hive.service.cli.thrift.TCLIService$Client.send_CloseSession(TCLIService.java:173)
	at org.apache.hive.service.cli.thrift.TCLIService$Client.CloseSession(TCLIService.java:165)
	at sun.reflect.GeneratedMethodAccessor112.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hive.jdbc.HiveConnection$SynchronizedHandler.invoke(HiveConnection.java:1299)
	at com.sun.proxy.$Proxy137.CloseSession(Unknown Source)
	at org.apache.hive.jdbc.HiveConnection.close(HiveConnection.java:638)
	... 4 common frames omitted
Caused by: java.net.SocketException: Broken pipe
	at java.net.SocketOutputStream.socketWrite0(Native Method)
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:109)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:153)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at org.apache.thrift.transport.TIOStreamTransport.flush(TIOStreamTransport.java:159)

Analysis: a Statement executed a SQL query but did not receive a result within the allotted time. After the timeout exception was thrown, the code immediately tried to close the Connection, and that close itself failed, producing the broken-pipe error above.
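This error also appears when Druid's eviction thread (the DestroyTask in the trace) closes an idle pooled connection whose server side HiveServer2 has already dropped, e.g. because the server's session idle timeout is shorter than the pool's eviction interval. A hedged mitigation sketch, assuming the druid-spring-boot-starter property names (adapt the keys if you configure DruidDataSource programmatically, and verify your Hive version accepts the validation query):

```properties
# Keep idle connections alive and evict them before the server drops them.
# Values are illustrative; tune against your HiveServer2 idle-session timeout.
spring.datasource.druid.validation-query=select 1
spring.datasource.druid.test-while-idle=true
spring.datasource.druid.keep-alive=true
# Run the evictor more often than the server-side idle timeout
spring.datasource.druid.time-between-eviction-runs-millis=60000
spring.datasource.druid.min-evictable-idle-time-millis=300000
```

If the pool is sized and evicted correctly, the remaining occurrences of this ERROR during shutdown are usually benign: the server has already torn down the session, so the client-side CloseSession call has nothing left to write to.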

The query did not generate a result set!

The error message is as follows:

java.sql.SQLException: The query did not generate a result set!
	at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:376)
	at com.alibaba.druid.pool.DruidPooledStatement.executeQuery(DruidPooledStatement.java:308)

The offending line of code:

rs = ps.executeQuery(item);

Analysis: executeQuery() is only valid for statements that return a result set (SELECT, and in Hive also SHOW/DESCRIBE/EXPLAIN). DDL statements such as CREATE or DROP, and DML that produces no result set, must be run through execute() (or executeUpdate()).
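A minimal sketch of that routing rule. The class and method names here are my own invention for illustration; only the dispatch logic matters, and in real code you would pass the chosen string to Statement.execute() or Statement.executeQuery() accordingly:

```java
public class SqlDispatch {
    // Returns true if the statement is expected to produce a ResultSet,
    // i.e. it is safe to run through Statement.executeQuery().
    // Anything else (DDL, INSERT/UPDATE/DELETE, ...) should go through
    // Statement.execute() or Statement.executeUpdate().
    static boolean returnsResultSet(String sql) {
        String verb = sql.trim().split("\\s+")[0].toLowerCase();
        return verb.equals("select") || verb.equals("show")
                || verb.equals("describe") || verb.equals("explain")
                || verb.equals("with");
    }

    public static void main(String[] args) {
        System.out.println(returnsResultSet("SELECT * FROM t"));        // true
        System.out.println(returnsResultSet("CREATE TABLE t (a INT)")); // false
    }
}
```

With a guard like this, the DDL string that triggered "The query did not generate a result set!" would be routed to execute() instead of executeQuery().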

No such file or directory

The error message is as follows:

java.sql.SQLException: Disk I/O error: Failed to open HDFS file hdfs://ppdhdpha/user/hive/warehouse/cszc.db/cs_phone_multi_call_pool_single/904924957a18ee1f-170b897c00000005_129981828_data.0.parq
Error(2): No such file or directory
Root cause: RemoteException: File does not exist: /user/hive/warehouse/cszc.db/cs_phone_multi_call_pool_single/904924957a18ee1f-170b897c00000005_129981828_data.0.parq
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:85)
	at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:75)
	at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getBlockLocations(FSDirStatAndListingOp.java:152)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1909)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:735)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:415)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
	at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:279)
	at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:375)
	at com.alibaba.druid.pool.DruidPooledStatement.executeQuery(DruidPooledStatement.java:308)

Analysis: the NameNode reports that the Parquet data file no longer exists, most likely because the underlying files were deleted or replaced (for example by a concurrent INSERT OVERWRITE or a table rebuild) after the query plan had already recorded their paths. The "Disk I/O error: Failed to open HDFS file" wording and the `.parq` file name suggest the query ran through Impala; if so, refreshing the table's cached metadata (REFRESH, or INVALIDATE METADATA for larger changes) before retrying typically resolves the stale file references.

References

Error while cleaning up the server resources
hive-query-cant-generate-result-set-via-jdbc

Please credit the source when republishing: www.mshxw.com
Original article: https://www.mshxw.com/it/696199.html