Queue-specific Celery events

I have two Django projects, each with its own Celery app:

- fooproj.celery_app
- barproj.celery_app

Each app runs its own Celery worker:

celery worker -A fooproj.celery_app -l info -E -Q foo_queue
celery worker -A barproj.celery_app -l info -E -Q bar_queue

Here is how I configure the Celery apps:

import os
from celery import Celery
from django.conf import settings


# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings.local')


app = Celery('celery_app', broker=settings.BROKER_URL)
app.conf.update(
    CELERY_ACCEPT_CONTENT=['json'],
    CELERY_TASK_SERIALIZER='json',
    CELERY_RESULT_SERIALIZER='json',
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
    CELERY_SEND_EVENTS=True,
    CELERY_DEFAULT_QUEUE=settings.CELERY_DEFAULT_QUEUE,
    CELERY_DEFAULT_EXCHANGE=settings.CELERY_DEFAULT_EXCHANGE,
    CELERY_DEFAULT_ROUTING_KEY=settings.CELERY_DEFAULT_ROUTING_KEY,
    CELERY_DEFAULT_EXCHANGE_TYPE='direct',
    CELERY_ROUTES=('proj.celeryrouters.MainRouter',),
    CELERY_IMPORTS=(
        'apps.qux.tasks',
        'apps.lorem.tasks',
        'apps.ipsum.tasks',
        'apps.sit.tasks'
    ),
)
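CELERY_IMPORTS only tells the worker which task modules to load at startup. Each of those modules holds ordinary tasks bound to the app; a hypothetical sketch of one such module (the task name, body and import path are made up for illustration):

# apps/qux/tasks.py -- hypothetical example of one of the imported task modules
from fooproj.celery_app import app


@app.task
def some_qux_task(item_id):
    # Placeholder body; the real project defines its own task logic here.
    return item_id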

My router class:

from django.conf import settings


class MainRouter(object):
    """
    Routes Celery tasks to a proper exchange and queue
    """
    def route_for_task(self, task, args=None, kwargs=None):
        return {
            'exchange': settings.CELERY_DEFAULT_EXCHANGE,
            'exchange_type': 'direct',
            'queue': settings.CELERY_DEFAULT_QUEUE,
            'routing_key': settings.CELERY_DEFAULT_ROUTING_KEY,
        }
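With CELERY_ROUTES pointing at this class, Celery instantiates it and calls route_for_task() for every task being sent, so all tasks land on the project's default exchange and queue. A quick sanity check of what the router returns (a minimal sketch; the task name is invented and DJANGO_SETTINGS_MODULE is assumed to point at fooproj's settings):

from proj.celeryrouters import MainRouter

router = MainRouter()
print(router.route_for_task('apps.qux.tasks.some_qux_task'))
# With fooproj's settings this prints:
# {'exchange': 'foo_exchange', 'exchange_type': 'direct',
#  'queue': 'foo_queue', 'routing_key': 'foo_routing_key'}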

fooproj has these settings:

BROKER_URL = 'redis://localhost:6379/0'
CELERY_DEFAULT_EXCHANGE = 'foo_exchange'
CELERY_DEFAULT_QUEUE = 'foo_queue'
CELERY_DEFAULT_ROUTING_KEY = 'foo_routing_key'

barproj has these settings:

BROKER_URL = 'redis://localhost:6379/1'
CELERY_DEFAULT_EXCHANGE = 'bar_exchange'
CELERY_DEFAULT_QUEUE = 'bar_queue'
CELERY_DEFAULT_ROUTING_KEY = 'bar_routing_key'

As you can see, each project uses its own Redis database as the broker, its own MySQL database as the result backend, and its own exchange, queue and routing key.

I'm trying to run two celery events processes, one for each app:

celery events -A fooproj.celery_app -l info -c djcelery.snapshot.Camera
celery events -A barproj.celery_app -l info -c djcelery.snapshot.Camera

The problem is that both celery events processes are picking up tasks from all of my Celery workers! So in fooproj's database I can see task results coming from barproj.

Any idea how to solve this?

Solution:

From http://celery.readthedocs.org/en/latest/getting-started/brokers/redis.html:

Monitoring events (as used by flower and other tools) are global and is not affected by the virtual host setting.

This is caused by a limitation in Redis. The Redis PUB/SUB channels are global and not affected by the database number.
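The limitation is easy to reproduce with redis-py: a subscriber connected to database 0 still receives messages published from database 1, because SELECT only scopes keys, not PUB/SUB channels (a minimal sketch; the channel name is arbitrary and unrelated to the channel kombu actually uses for events):

import redis

# Subscriber connected to database 0
sub = redis.StrictRedis(host='localhost', port=6379, db=0)
p = sub.pubsub()
p.subscribe('demo_channel')
p.get_message(timeout=1)  # consume the subscribe confirmation

# Publisher connected to database 1
pub = redis.StrictRedis(host='localhost', port=6379, db=1)
pub.publish('demo_channel', 'hello from db 1')

# The subscriber on db 0 receives it anyway: channels are server-wide.
print(p.get_message(timeout=1))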

This seems to be one of Redis's caveats :(
