By default, Ambari cannot manage Elasticsearch or Flink, but third-party custom service components for both can be found online.
Reference for installing the Elasticsearch custom component: ElasticAmbari/README.md at master · ChengYingOpenSource/ElasticAmbari · GitHub
Reference for installing the Flink custom component: "Ambari 2.7.5 installing Flink 1.13.2" (韦不二's blog on CSDN)
In practice, however, both custom components share the same problem: every time the service is restarted, it fails with an error roughly like the following:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.1/services/Flink/package/scripts/flink.py", line 173, in <module>
    Master().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 980, in restart
    self.stop(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.1/services/Flink/package/scripts/flink.py", line 98, in stop
    pid = str(sudo.read_file(status_params.flink_pid_file))
  File "/usr/lib/ambari-agent/lib/resource_management/core/sudo.py", line 151, in read_file
    with open(filename, "rb") as fp:
IOError: [Errno 2] No such file or directory: u'/var/run/flink/flink.pid'
The error shows that the directory holding the pid file is missing; if that directory is created manually, the service starts normally. The initial guess is that Ambari changed something in HDP 3.1.5 and these custom components were never updated for it. Since manually creating the pid directory before every restart is clearly not acceptable, the fix is simply to patch the component's startup script.
Locate the script named in the traceback and add one line to its start function to create the pid directory:
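The failure mode is easy to reproduce outside Ambari: opening a pid file whose parent directory does not exist raises IOError with errno 2 (ENOENT), exactly as in the traceback. This sketch uses a throwaway temp path, not the real /var/run/flink/flink.pid:

```python
import os
import tempfile

# A pid file path whose parent directory ("flink/") was never created
pid_file = os.path.join(tempfile.mkdtemp(), "flink", "flink.pid")

try:
    with open(pid_file, "rb") as fp:
        pid = fp.read()
except IOError as e:
    # errno 2 == ENOENT: "No such file or directory"
    print(e.errno)  # 2
```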
Directory([status_params.flink_pid_dir], owner=params.flink_user, group=params.flink_group)
The file to modify is /var/lib/ambari-agent/cache/stacks/HDP/3.1/services/Flink/package/scripts/flink.py.
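A complementary hardening, sketched here as a standalone function rather than taken from the component, is to make the stop path tolerate a missing pid file instead of crashing (the real flink.py reads the file via Ambari's sudo.read_file and would need the same guard around that call):

```python
import os

def read_pid(pid_file):
    """Return the stored pid as an int, or None when the pid file is
    missing or empty, e.g. after a reboot cleared /var/run.

    Sketch only; not part of the published custom component.
    """
    if not os.path.isfile(pid_file):
        return None
    with open(pid_file) as fp:
        content = fp.read().strip()
    return int(content) if content else None

print(read_pid("/var/run/flink/no-such.pid"))  # None
```

With this guard, stop() can treat "no pid file" as "nothing to stop" and return cleanly, so a restart proceeds to start() instead of aborting.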
The Elasticsearch component needs the same treatment:
In /var/lib/ambari-agent/cache/common-services/ELASTICSEARCH/7.13.4/package/scripts/ElasticSearchService.py, add a call to self.__creatPidDirectory() and define the helper method:
def __creatPidDirectory(self):
    import params
    # Create the directory that will hold the pid file, if it is missing
    name = os.path.dirname(params.elasticSearchPidFile)
    if not os.path.exists(name):
        os.makedirs(name, mode=0o755)
        Utils.chown(name, params.elasticSearchUser, params.elasticSearchGroup)
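Outside Ambari, the helper boils down to the following standalone sketch (ensure_pid_directory is a hypothetical name; the Utils.chown step is omitted because changing ownership requires root):

```python
import os
import tempfile

def ensure_pid_directory(pid_file, mode=0o755):
    # Standalone equivalent of __creatPidDirectory: create the pid
    # file's parent directory if it is missing. Ownership handling
    # (Utils.chown in the Ambari helper) is left out here.
    pid_dir = os.path.dirname(pid_file)
    if not os.path.exists(pid_dir):
        os.makedirs(pid_dir, mode=mode)
    return pid_dir

# Demo against a throwaway temp path, not the real pid location
pid_dir = ensure_pid_directory(
    os.path.join(tempfile.mkdtemp(), "elasticsearch", "elasticsearch.pid"))
print(os.path.isdir(pid_dir))  # True
```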