1. Error:

```
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=Admin, access=EXECUTE, inode="/tmp/hive":jinghang:supergroup:drwx------
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
```

2. Cause: insufficient permissions. The client user Admin has no EXECUTE access to the HDFS directory /tmp/hive, which is owned by jinghang:supergroup with mode drwx------.

3. Solutions
Method 1: modify the Hadoop configuration file conf/hdfs-site.xml to disable HDFS permission checking.
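A minimal sketch of the change, assuming a Hadoop 2.x/3.x cluster (on very old 1.x releases the property is named `dfs.permissions`). Restart the NameNode after editing. Note that this turns off access control cluster-wide, so it is only appropriate for dev/test environments:

```xml
<!-- hdfs-site.xml: disable HDFS permission checking (insecure; dev/test only) -->
<property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
</property>
```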
For reference, the Maven dependency used by the project:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.4.3</version>
</dependency>
```
Method 2: directly change the permissions of the file or directory being accessed (note the leading dash on the subcommand):

```shell
hdfs dfs -chmod 777 /test
```

For the error above the target would be /tmp/hive; add `-R` to apply the change recursively. Keep in mind that mode 777 removes all access control on the path, so this is also a dev/test shortcut rather than a production fix.
Method 3: set HADOOP_USER_NAME in code, so the Hadoop client connects as a user that owns the directory:

```java
System.setProperty("HADOOP_USER_NAME", "jinghang");
```
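A minimal, self-contained sketch of method 3 (the class name `HadoopUserExample` is hypothetical). The important point is ordering: the property must be set before the first Hadoop `FileSystem` or `SparkSession` is created, because the Hadoop client resolves and caches the login user once at initialization:

```java
public class HadoopUserExample {
    public static void main(String[] args) {
        // Must run before any Hadoop FileSystem/SparkSession is constructed:
        // the client reads HADOOP_USER_NAME once, at login time, and caches it.
        System.setProperty("HADOOP_USER_NAME", "jinghang");

        // All Hadoop client objects created from here on act as user "jinghang".
        System.out.println(System.getProperty("HADOOP_USER_NAME"));
    }
}
```

Setting the environment variable `HADOOP_USER_NAME=jinghang` before launching the JVM achieves the same effect without code changes.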