Starting Spark on a YARN cluster produces the following warning:
WARN Client:66 - Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
Solution
Create a directory on HDFS:
hdfs dfs -mkdir -p /home/hadoop/spark_jars
Upload Spark's jars:
hdfs dfs -put /opt/module/spark-2.3.0-yarn/jars/* /home/hadoop/spark_jars/
Add the following setting to spark-defaults.conf under Spark's conf directory:
spark.yarn.jars=hdfs://master:9000/home/hadoop/spark_jars/*
Note that the value must point at the HDFS directory the jars were uploaded to, not at the local jars path under SPARK_HOME.
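The steps above can be scripted. This is a minimal sketch, assuming the values used in this note (NameNode at master:9000, local jars under /opt/module/spark-2.3.0-yarn/jars, HDFS target /home/hadoop/spark_jars); adjust them to your cluster. The upload step is skipped when no hdfs client is on the PATH, and the config line is written to a scratch file here; on a real cluster it belongs in $SPARK_HOME/conf/spark-defaults.conf.

```shell
#!/bin/sh
# Paths assumed from the steps above; change for your cluster.
SPARK_HOME=/opt/module/spark-2.3.0-yarn
JARS_DIR=/home/hadoop/spark_jars
NAMENODE=hdfs://master:9000

# Create the HDFS directory and upload the jars
# (skipped when no hdfs client is available).
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -mkdir -p "$JARS_DIR"
    hdfs dfs -put "$SPARK_HOME"/jars/* "$JARS_DIR"/
fi

# Build the spark.yarn.jars line pointing at the *HDFS* directory,
# and write it to a scratch spark-defaults.conf for inspection.
CONF="$(mktemp -d)/spark-defaults.conf"
echo "spark.yarn.jars=$NAMENODE$JARS_DIR/*" > "$CONF"
cat "$CONF"
```

Running the script prints the line to append to spark-defaults.conf: `spark.yarn.jars=hdfs://master:9000/home/hadoop/spark_jars/*`.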
After restarting the Spark job, the warning no longer appears.



