
How to run Celery workers with a scalable Django app on AWS Elastic Beanstalk?



This is how I set up Celery with Django on Elastic Beanstalk with good scalability.

Keep in mind that the "leader_only" option of container_commands only works on environment rebuild or deployment of the application. If the service runs long enough, the leader node may be removed by Elastic Beanstalk. To deal with that, you may have to apply instance protection for your leader node. Check: http://docs.aws.amazon.com/autoscaling/latest/userguide/as-instance-termination.html#instance-protection-instance
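If you do apply instance protection, it can be set from the AWS CLI; a sketch of the call (the instance ID and Auto Scaling group name below are placeholders you would replace with your own):

```shell
# Protect the leader instance from scale-in events so the
# Auto Scaling group does not terminate the node running celery.
aws autoscaling set-instance-protection \
    --instance-ids i-0123456789abcdef0 \
    --auto-scaling-group-name my-eb-asg \
    --protected-from-scale-in
```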

Add a bash script with the celery worker and beat configuration.

Add file root_folder/.ebextensions/files/celery_configuration.txt:

#!/usr/bin/env bash

# Get django environment variables
celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g' | sed 's/%/%%/g'`
celeryenv=${celeryenv%?}

# Create celery configuration script
celeryconf="[program:celeryd-worker]
; Set full path to celery program if using virtualenv
command=/opt/python/run/venv/bin/celery worker -A django_app --loglevel=INFO

directory=/opt/python/current/app
user=nobody
numprocs=1
stdout_logfile=/var/log/celery-worker.log
stderr_logfile=/var/log/celery-worker.log
autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600

; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true

; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998

environment=$celeryenv

[program:celeryd-beat]
; Set full path to celery program if using virtualenv
command=/opt/python/run/venv/bin/celery beat -A django_app --loglevel=INFO --workdir=/tmp -S django --pidfile /tmp/celerybeat.pid

directory=/opt/python/current/app
user=nobody
numprocs=1
stdout_logfile=/var/log/celery-beat.log
stderr_logfile=/var/log/celery-beat.log
autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600

; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true

; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998

environment=$celeryenv"

# Create the celery supervisord conf script
echo "$celeryconf" | tee /opt/python/etc/celery.conf

# Add configuration script to supervisord conf (if not there already)
if ! grep -Fxq "[include]" /opt/python/etc/supervisord.conf
  then
  echo "[include]" | tee -a /opt/python/etc/supervisord.conf
  echo "files: celery.conf" | tee -a /opt/python/etc/supervisord.conf
fi

# Reread the supervisord config
supervisorctl -c /opt/python/etc/supervisord.conf reread

# Update supervisord in cache without restarting all services
supervisorctl -c /opt/python/etc/supervisord.conf update

# Start/Restart celeryd through supervisord
supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-beat
supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd-worker

Take care about the execution of the script during deployment, but only on the main node (leader_only: true). Add file root_folder/.ebextensions/02-python.config:

container_commands:
  04_celery_tasks:
    command: "cat .ebextensions/files/celery_configuration.txt > /opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh && chmod 744 /opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh"
    leader_only: true
  05_celery_tasks_run:
    command: "/opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh"
    leader_only: true

Beat is configurable via a separate django application, without the need to redeploy: https://pypi.python.org/pypi/django_celery_beat.
Storing task results is a good idea too: https://pypi.python.org/pypi/django_celery_results
File requirements.txt:

celery==4.0.0
django_celery_beat==1.0.1
django_celery_results==1.0.1
pycurl==7.43.0 --global-option="--with-nss"

Configure celery for the Amazon SQS broker (get your desired endpoint from the list: http://docs.aws.amazon.com/general/latest/gr/rande.html) in root_folder/django_app/settings.py:

...
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'sqs://%s:%s@' % (aws_access_key_id, aws_secret_access_key)
# Due to error on lib region N Virginia is used temporarily. please set it on Ireland "eu-west-1" after fix.
CELERY_BROKER_TRANSPORT_OPTIONS = {
    "region": "eu-west-1",
    'queue_name_prefix': 'django_app-%s-' % os.environ.get('APP_ENV', 'dev'),
    'visibility_timeout': 360,
    'polling_interval': 1
}
...
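One pitfall with the broker URL above: AWS secret keys can contain characters such as '/' and '+', which break the sqs:// URL unless they are percent-encoded first. A minimal sketch of safe URL construction (the helper name and sample credentials are made up for illustration):

```python
from urllib.parse import quote

def sqs_broker_url(aws_access_key_id, aws_secret_access_key):
    # Percent-encode both credentials so characters like '/' and '+'
    # survive inside the sqs://key:secret@ URL.
    return 'sqs://%s:%s@' % (
        quote(aws_access_key_id, safe=''),
        quote(aws_secret_access_key, safe=''),
    )

print(sqs_broker_url('AKIAEXAMPLE', 'abc/def+ghi'))
```

With encoded credentials the result can be assigned directly to CELERY_BROKER_URL in place of the plain string formatting shown above.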

Celery configuration for the django_app Django application

Add file root_folder/django_app/celery.py:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'django_app.settings')

app = Celery('django_app')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

Modify file root_folder/django_app/__init__.py:

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from django_app.celery import app as celery_app

__all__ = ['celery_app']

