Error when running a Spark program in IDEA: The root scratch dir: /tmp/hive on HDFS should be writable

Symptom

On a Windows 10 PC, a Spark program run from IDEA (connected to Hive) fails. The code is similar to the following:

 import java.io.File;
 import org.apache.spark.sql.SparkSession;

 // Build a local SparkSession with Hive support enabled.
 // APP_NAME, MASTER and WAREHOUSE_LOC are constants defined elsewhere in the project.
 SparkSession spark = SparkSession.builder()
        .appName(APP_NAME)
        .config("spark.master", MASTER)
        .config("spark.sql.warehouse.dir", new File(WAREHOUSE_LOC).getAbsolutePath())
        .enableHiveSupport()   // requires the spark-hive dependency on the classpath
        .getOrCreate();

It fails with the following error:

java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: r--r--r--;
org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: r--r--r--;
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:116)
	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:104)
	at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39)
Analysis

From the error log, this looks like a file/directory permission problem: at startup, Hive checks that its root scratch dir /tmp/hive is writable, and here the permissions are only r--r--r--. On a local Windows run, the path /tmp/hive resolves against the drive the program runs from, which is H:\tmp\hive in this case.
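As a side note, the permission bits in the message can be read back programmatically. Below is a minimal sketch (not from the original post; it assumes the Hadoop client libraries bundled with Spark are on the classpath, and it omits exception handling). On Windows, the local Hadoop FileSystem reads these bits via winutils.exe, which is why the fix below relies on that tool:

 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.FileSystem;
 import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.fs.permission.FsPermission;

 // Inspect the Hive root scratch dir on the local filesystem.
 // "/tmp/hive" resolves against the current drive (H:\tmp\hive here).
 FileSystem fs = FileSystem.get(new Configuration());
 FsPermission perm = fs.getFileStatus(new Path("/tmp/hive")).getPermission();
 System.out.println("/tmp/hive permissions: " + perm);  // prints r--r--r-- before the fix

This roughly mirrors the startup check that fails here, so if it prints anything other than writable rwx-style bits, the fix below applies.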

Solution:

Grant 777 permissions on /tmp/hive with the following command:

%HADOOP_HOME%\bin\winutils.exe chmod 777 H:\tmp\hive
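After running it, you can verify the change with the same tool's ls subcommand (winutils supports ls in addition to chmod):

%HADOOP_HOME%\bin\winutils.exe ls H:\tmp\hive

The permission column should now show rwxrwxrwx for the directory; if it still shows r--r--r--, rerun the chmod command.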

If the HADOOP_HOME environment variable has not been configured yet, set it up first. The steps are as follows:

Download the winutils bundle hadoop.dll-and-winutils.exe-for-hadoop2.7.3-on-windows_X64-master.zip, extract it to the root of the D: drive, and set the environment variable: HADOOP_HOME = D:\hadoop-lib
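If you prefer the command line over the System Properties dialog, setx can write the variable (an assumption here: the extracted folder is named or renamed so that winutils.exe ends up at D:\hadoop-lib\bin\winutils.exe):

setx HADOOP_HOME "D:\hadoop-lib"

setx writes to the user environment, so restart IDEA and any open terminals before retrying the Spark program.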
