Caused by: org.apache.flink.yarn.YarnClusterDescriptor$YarnDeploymentException: The YARN application unexpectedly switched to state FAILED during deployment. Diagnostics from YARN: Application application_1647077633391_0005 failed 1 times (global limit =2; local limit is =1) due to AM Container for appattempt_1647077633391_0005_000001 exited with exitCode: -1000 Failing this attempt.Diagnostics: [2022-03-12 17:54:46.879]File file:/home/jkop/.flink/application_1647077633391_0005/plugins/metrics-datadog/flink-metrics-datadog-1.11.0.jar does not exist java.io.FileNotFoundException: File file:/home/jkop/.flink/application_1647077633391_0005/plugins/metrics-datadog/flink-metrics-datadog-1.11.0.jar does not exist
Cause 1: a dependency cannot be found. Add the following configuration to mapred-site.xml, then restart HDFS and YARN:
<property>
  <name>mapreduce.application.classpath</name>
  <value>$HADOOP_HOME/share/hadoop/common/*,$HADOOP_HOME/share/hadoop/common/lib/*,$HADOOP_HOME/share/hadoop/hdfs/*,$HADOOP_HOME/share/hadoop/hdfs/lib/*,$HADOOP_HOME/share/hadoop/mapreduce/*,$HADOOP_HOME/share/hadoop/mapreduce/lib/*,$HADOOP_HOME/share/hadoop/yarn/*,$HADOOP_HOME/share/hadoop/yarn/lib/*</value>
</property>
or
<property>
  <name>yarn.app.mapreduce.am.env</name>
  <value>HADOOP_MAPRED_HOME=${HADOOP_HOME}</value>
</property>
<property>
  <name>mapreduce.map.env</name>
  <value>HADOOP_MAPRED_HOME=${HADOOP_HOME}</value>
</property>
<property>
  <name>mapreduce.reduce.env</name>
  <value>HADOOP_MAPRED_HOME=${HADOOP_HOME}</value>
</property>
<property>
  <name>mapreduce.application.classpath</name>
  <value>${HADOOP_HOME}/etc/hadoop,${HADOOP_HOME}/share/hadoop/common/*,${HADOOP_HOME}/share/hadoop/common/lib/*,${HADOOP_HOME}/share/hadoop/hdfs/*,${HADOOP_HOME}/share/hadoop/hdfs/lib/*,${HADOOP_HOME}/share/hadoop/mapreduce/*,${HADOOP_HOME}/share/hadoop/mapreduce/lib/*,${HADOOP_HOME}/share/hadoop/yarn/*,${HADOOP_HOME}/share/hadoop/yarn/lib/*</value>
</property>
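After editing mapred-site.xml, the change only takes effect once HDFS and YARN are restarted. A minimal sketch of that restart, assuming the standard Hadoop sbin scripts are on the PATH (script names and locations can differ by distribution):

```shell
# Stop YARN first, then HDFS (scripts live in $HADOOP_HOME/sbin)
stop-yarn.sh
stop-dfs.sh

# Bring the cluster back up: HDFS first, then YARN
start-dfs.sh
start-yarn.sh

# Sanity check: the resolved classpath should now include the
# share/hadoop/* directories configured above
hadoop classpath
```

These commands require a running Hadoop installation, so outputs are environment-specific.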
Cause 2: when the Flink job writes its logs, some HDFS data blocks appear to be corrupted. To fix this:
Switch back to your own user, then run `hdfs fsck / -delete`; the corrupted blocks will be cleaned up.
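The repair above can be sketched as a short shell session. Note that `-delete` permanently removes the affected files, so it is worth inspecting the fsck report before deleting anything (assumes an HDFS client configured against the cluster):

```shell
# Inspect filesystem health first; look for CORRUPT blocks in the report
hdfs fsck /

# List the affected files without deleting anything
hdfs fsck / -list-corruptfileblocks

# Only then remove the corrupted files (this is irreversible)
hdfs fsck / -delete
```

If the corruption is limited to old log files, losing them is usually acceptable; otherwise, try to re-replicate or restore the files before resorting to `-delete`.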



