
php - Symfony and Heroku worker: can I use the database as a communication mechanism between the front-end and the background worker?

Question:

I need a Heroku worker to do some background tasks.

Now, reading the article Background jobs with workers in PHP, I found that the architecture designed in the article uses RabbitMQ as the messaging system between the web dyno and the worker dyno.

But I don't want to use RabbitMQ, as it is really too complex at this stage.

So, as the communication mechanism between the web dyno and the worker dyno, I'd like to use one of these two alternatives:

  1. Simply the database (largely my best choice :P)
  2. AWS SQS (I can use it, but the database would be better for my needs)
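To make option 1 concrete, here is a minimal sketch of using a database table as the queue. The table and column names (`jobs`, `payload`, `status`) are hypothetical, and an in-memory SQLite database stands in for the real Heroku database so the example is self-contained:

```php
<?php
// Sketch: a database table as a job queue (names are hypothetical).
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE jobs (id INTEGER PRIMARY KEY, payload TEXT, status TEXT DEFAULT 'pending')");

// Web dyno: enqueue a task by inserting a row.
$db->prepare('INSERT INTO jobs (payload) VALUES (?)')->execute(['censor this text']);

// Worker dyno: claim the oldest pending job...
$job = $db->query("SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")
          ->fetch(PDO::FETCH_ASSOC);

// ...process it, then mark it done (the rough equivalent of basic_ack).
$db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
```

The `status` column plays the role of RabbitMQ's acknowledgement: a job stays `pending` until the worker has finished it, so a crashed worker leaves the row available to retry.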

Now, the example provided using RabbitMQ uses a callback to keep the script alive and continuously receive new messages from the queue:

$callback = function($msg) use($app) {
    $app['monolog']->debug('New task received for censoring message: ' . $msg->body);
    try {
        // call the "censor" API and pass it the text to clean up
        $result = $app['guzzle']->get('censor', ['query' => ['corpus' => $msg->body]]);
        $result = json_decode($result->getBody());
        if($result) {
            $app['monolog']->debug('Censored message result is: ' . $result->censored_text);
            // store in Redis
            $app['predis']->lpush('opinions', $result->censored_text);
            // mark as delivered in RabbitMQ
            $msg->delivery_info['channel']->basic_ack($msg->delivery_info['delivery_tag']);
        } else {
            $app['monolog']->warning('Failed to decode JSON, will retry later');
        }
    } catch(Exception $e) {
        $app['monolog']->warning('Failed to call API, will retry later');
    }
};

$channel->basic_qos(null, 1, null);
$channel->basic_consume('task_queue', '', false, false, false, false, $callback);

// loop over incoming messages
while(count($channel->callbacks)) {
    $channel->wait();
}

My question is: how can I "emulate" the $channel->wait() command without using RabbitMQ?

In other words, how can I make the worker dyno read the queue continuously, from the database or from AWS SQS, and start processing messages as they appear?

Should I use a scheduled job via Heroku Scheduler that starts the dyno? (Not applicable: see here why.)

Or is there another flow that I'm not considering?

Or maybe creating a Symfony command-line app based on the front-end app is the definitive solution? Would it run without stopping?
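Without a broker, the usual way to "emulate" `$channel->wait()` is a polling loop that sleeps when the queue is empty. A sketch, reusing the same hypothetical `jobs` table (in-memory SQLite here so it is self-contained; the loop breaks when the queue is empty only so the example terminates — a real worker would sleep and keep polling):

```php
<?php
// Sketch: emulate $channel->wait() with a database polling loop.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE jobs (id INTEGER PRIMARY KEY, payload TEXT, status TEXT DEFAULT 'pending')");

$insert = $db->prepare('INSERT INTO jobs (payload) VALUES (?)');
$insert->execute(['first']);
$insert->execute(['second']);

$processed = [];
while (true) {
    $job = $db->query("SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")
              ->fetch(PDO::FETCH_ASSOC);
    if ($job === false) {
        // A real worker dyno would sleep(5) here and continue polling
        // forever; we stop so the example terminates.
        break;
    }
    $processed[] = $job['payload']; // "process" the message
    $db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}
```

With AWS SQS the same shape applies, except that `ReceiveMessage` with long polling (`WaitTimeSeconds` up to 20) blocks for you, so the loop needs no explicit sleep.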

Answer:

If you are using Heroku PostgreSQL (for example), you could have your worker process listen for asynchronous notifications, which the database can emit by creating appropriate triggers.

See this nice post on Postgres Pub-Sub features for details.

So, your web dyno can insert or update records in the database, which can automatically trigger notifications to your worker dyno.

Your worker dyno runs forever, and simply listens on the notification channel, processing any messages it receives.

If you are using a different database, it may or may not have similar capabilities. But this is definitely doable (and in fact very easy) using Postgres.
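As an illustration of the trigger-plus-listener idea, here is a sketch using PHP's pgsql extension. It is not self-contained (it assumes a reachable Postgres server), and the connection string, table, and channel names are hypothetical:

```php
<?php
// Sketch: Postgres LISTEN/NOTIFY instead of row polling (assumes the pgsql
// extension and a real Postgres server; all names are hypothetical).
$conn = pg_connect('host=localhost dbname=app');

// One-time setup: a trigger that notifies the 'jobs' channel on every insert.
pg_query($conn, <<<'SQL'
CREATE OR REPLACE FUNCTION notify_new_job() RETURNS trigger AS $$
BEGIN
    PERFORM pg_notify('jobs', NEW.id::text);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER jobs_notify AFTER INSERT ON jobs
    FOR EACH ROW EXECUTE PROCEDURE notify_new_job();
SQL);

// Worker dyno: subscribe and wait for notifications.
pg_query($conn, 'LISTEN jobs');
while (true) {
    $notify = pg_get_notify($conn, PGSQL_ASSOC);
    if ($notify === false) {
        usleep(200000); // nothing yet; back off briefly
        continue;
    }
    // $notify['payload'] holds the new job's id; fetch and process it here.
}
```

The advantage over plain polling is that the worker only touches the `jobs` table when the database has told it a new row exists.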
