Building Bulletproof Background Tasks: Drupal Queue Workers and Ultimate Cron

A client recently asked for text-to-speech (TTS) on all of their articles. This seemed straightforward, and I thought I could simply generate the audio on insert/update. That was until I realised the load this was going to create: TTS generation could take up to ten minutes per article depending on length, because I had to split up the text and send separate requests to stay within the API's limits.
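As a rough illustration of that splitting step (the helper below and its character limit are hypothetical, not the client code - real TTS APIs each have their own caps), a payload can be broken on sentence boundaries so each request stays under the limit:

```php
<?php

/**
 * Splits text into chunks under a character limit, breaking on sentences.
 *
 * Hypothetical helper for illustration only; adjust the limit and the
 * sentence regex to whatever the target TTS API actually requires.
 */
function split_for_tts(string $text, int $max_chars = 4000): array {
  // Split after sentence-ending punctuation followed by whitespace.
  $sentences = preg_split('/(?<=[.!?])\s+/', trim($text));
  $chunks = [];
  $current = '';
  foreach ($sentences as $sentence) {
    $candidate = $current === '' ? $sentence : $current . ' ' . $sentence;
    if (strlen($candidate) > $max_chars && $current !== '') {
      // Adding this sentence would overflow the chunk, so flush it.
      $chunks[] = $current;
      $current = $sentence;
    }
    else {
      $current = $candidate;
    }
  }
  if ($current !== '') {
    $chunks[] = $current;
  }
  return $chunks;
}
```

Each chunk can then be sent as its own API request, which is exactly why a single long article turns into many slow round-trips.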
Enter background task processing - in my case, queue workers, where the heavy stuff gets handled in the background.
Most of the time I would hang the job off default cron and away we go, but with its 240-second timeout and our already loaded default cron run, even a queue worker processed there just wouldn't cut it.
The solution I came to pairs queue workers with Ultimate Cron, creating a system robust enough to handle any demanding task. The secret isn't in the complexity - it's in understanding how these two technologies complement each other to create something greater than the sum of their parts.
The Problem with Default Drupal Cron
Drupal's default cron system is powerful on its own - it does everything, sort of. Every task runs at the same frequency, during the same window, competing for the same resources. Your lightweight cache clearing waits behind your heavyweight data processing. Your users refresh the page while your server chokes on a thirty-minute import job.
This is where most Drupal sites hit their first real scaling wall. Not because of traffic, not because of content volume, but because they're trying to do everything at once. One of the huge caveats of default cron is its 240-second time limit. Once a run exceeds this, the remaining tasks are cut short and cancelled.
Enter Ultimate Cron
Ultimate Cron transforms Drupal's chaotic cron into a well-orchestrated performance. Instead of every task running whenever default cron fires, you get granular control over when each task executes, how often, and during which hours.
Think of it as giving each of your background tasks its own dedicated time slot. Your AI content generation can run during low-traffic hours at 3 AM. Your user notification system can run every few minutes during business hours. Your heavy data processing can operate once daily when your server isn't busy serving actual users.
The installation is straightforward - composer require drupal/ultimate_cron - and you're away. Once installed, configuration at /admin/config/system/cron/jobs is a breeze.
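For reference, each Ultimate Cron job ends up as exportable configuration. The sketch below is an approximation of what a job export might look like (key names can vary between Ultimate Cron versions, so treat it as illustrative rather than canonical) - here scheduling a job with a crontab-style rule to run every 15 minutes between 1 and 5 AM:

```yaml
# ultimate_cron.job.custom_content_cron.yml (illustrative export)
id: custom_content_cron
title: 'Custom content AI processing'
module: custom_content
callback: custom_content_cron
scheduler:
  id: crontab
  configuration:
    # Standard crontab syntax: minute, hour, day, month, weekday.
    rules:
      - '*/15 1-5 * * *'
```

In practice you would set this up through the UI at /admin/config/system/cron/jobs and export it with the rest of your site configuration.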
Queue Workers
While Ultimate Cron handles the scheduling, queue workers handle the heavy lifting. They're designed from the ground up for tasks that are too complex, too time-consuming, or too failure-prone for traditional cron jobs.
Here's the beautiful part: queue workers are inherently fault-tolerant. If a task fails, it can be retried. If your server runs out of memory processing one item, the rest of the queue remains intact. If you need to scale processing, you can run multiple workers simultaneously.
Let's look at a real-world implementation - a simplified version of an AI content generation worker that processes articles in the background:
<?php

declare(strict_types=1);

namespace Drupal\custom_content\Plugin\QueueWorker;

use Drupal\Core\Entity\EntityTypeManagerInterface;
use Drupal\Core\Logger\LoggerChannelFactoryInterface;
use Drupal\Core\Plugin\ContainerFactoryPluginInterface;
use Drupal\Core\Queue\QueueWorkerBase;
use Drupal\Core\Queue\RequeueException;
use Symfony\Component\DependencyInjection\ContainerInterface;

/**
 * Processes AI content generation tasks.
 *
 * @QueueWorker(
 *   id = "content_ai_generation",
 *   title = @Translation("Content AI Generation Worker")
 * )
 */
final class ContentGenerationWorker extends QueueWorkerBase implements ContainerFactoryPluginInterface {

  private $aiService;
  private $entityTypeManager;
  private $logger;

  public function __construct(
    array $configuration,
    $plugin_id,
    $plugin_definition,
    $aiService,
    EntityTypeManagerInterface $entityTypeManager,
    LoggerChannelFactoryInterface $loggerFactory,
  ) {
    parent::__construct($configuration, $plugin_id, $plugin_definition);
    $this->aiService = $aiService;
    $this->entityTypeManager = $entityTypeManager;
    $this->logger = $loggerFactory->get('content_generation');
  }

  public static function create(ContainerInterface $container, array $configuration, $plugin_id, $plugin_definition): self {
    return new self(
      $configuration,
      $plugin_id,
      $plugin_definition,
      $container->get('custom_content.ai_service'),
      $container->get('entity_type.manager'),
      $container->get('logger.factory')
    );
  }

  public function processItem($data): void {
    if (!isset($data['nid']) || !isset($data['language'])) {
      $this->logger->error('Queue item missing required data.');
      return;
    }

    $nid = $data['nid'];
    $language = $data['language'];

    // Always load the fresh version from the database.
    $node_storage = $this->entityTypeManager->getStorage('node');
    $node_storage->resetCache([$nid]);
    $node = $node_storage->load($nid);

    if (!$node || $node->bundle() !== 'article') {
      $this->logger->warning('Node @nid not found or not an article.', ['@nid' => $nid]);
      return;
    }

    // Handle translations properly.
    if ($node->hasTranslation($language)) {
      $node = $node->getTranslation($language);
    }

    $this->logger->info('Processing AI generation for node @nid in @lang.', [
      '@nid' => $nid,
      '@lang' => $language,
    ]);

    try {
      // Cast to string: an empty body field returns NULL.
      $content = (string) $node->get('body')->value;
      if (empty(trim(strip_tags($content)))) {
        $this->logger->notice('Empty content for node @nid, skipping.', ['@nid' => $nid]);
        return;
      }

      $summary = $this->aiService->generateSummary($content);
      if ($summary) {
        $current_summary = $node->get('field_ai_summary')->value;
        if ($current_summary !== $summary) {
          $node->set('field_ai_summary', [
            'value' => $summary,
            'format' => 'rich_text',
          ]);
          $node->save();
          $this->logger->info('AI summary generated for node @nid.', ['@nid' => $nid]);
        }
      }
      else {
        $this->logger->warning('AI service returned empty summary for node @nid.', ['@nid' => $nid]);
      }
    }
    catch (\Exception $e) {
      $this->logger->error('Error processing node @nid: @message', [
        '@nid' => $nid,
        '@message' => $e->getMessage(),
      ]);
      // Throwing RequeueException releases the item so it can be retried.
      throw new RequeueException('Failed to process item, requeuing.');
    }
  }

}
The beauty of this implementation lies in its resilience. Notice how the RequeueException allows failed items to be retried automatically, while comprehensive logging ensures you can track exactly what's happening with each task.
The Cron Connection
Now comes the crucial part - connecting your queue worker to Ultimate Cron. You need a cron job that processes items from your queue at the right frequency, during the right hours, without overwhelming your server.
Here's how you create the bridge in your custom module:
use Drupal\Core\Queue\RequeueException;

/**
 * Implements hook_cron().
 */
function custom_content_cron() {
  $queue = \Drupal::queue('content_ai_generation');
  $queue_worker = \Drupal::service('plugin.manager.queue_worker')
    ->createInstance('content_ai_generation');

  $processed = 0;
  // Process in small batches for AI tasks.
  $max_items = 10;

  while ($processed < $max_items && ($item = $queue->claimItem())) {
    try {
      $queue_worker->processItem($item->data);
      $queue->deleteItem($item);
      $processed++;
    }
    catch (RequeueException $e) {
      // Release the item so it can be claimed and retried later.
      $queue->releaseItem($item);
    }
    catch (\Exception $e) {
      // Critical failure - remove the item and log the error.
      \Drupal::logger('custom_content')->error('Queue processing failed: @message', [
        '@message' => $e->getMessage(),
      ]);
      $queue->deleteItem($item);
    }
  }

  if ($processed > 0) {
    \Drupal::logger('custom_content')->info('Processed @count AI generation tasks.', [
      '@count' => $processed,
    ]);
  }
}
The magic happens when you configure this in Ultimate Cron. Navigate to /admin/config/system/cron/jobs, discover your new tasks, and set up your scheduling. Your main cron might run every few minutes for lightweight tasks, while your AI processing queue runs every 15 minutes during off-peak hours only.
Adding Items to Your Queue
The final piece of the puzzle is populating your queue with work. This typically happens when users create or modify content:
// In your module's .module file. Requires `use Drupal\node\NodeInterface;`
// at the top of the file.

/**
 * Implements hook_ENTITY_TYPE_update() for node entities.
 */
function custom_content_node_update(NodeInterface $node) {
  // Only queue published articles for AI processing.
  if ($node->bundle() === 'article' && $node->isPublished()) {
    $queue = \Drupal::queue('content_ai_generation');
    $queue->createItem([
      'nid' => $node->id(),
      'language' => $node->language()->getId(),
    ]);
  }
}
This approach ensures that every published article automatically gets queued for AI processing, but the actual work happens in the background without affecting the user's experience.
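To sanity-check the pipeline during development, Drush can inspect and drain the queue manually (these commands assume Drush 9+; the queue name matches the worker's plugin ID):

```shell
# List queues and how many items are waiting in each.
drush queue:list

# Process every item currently in the content_ai_generation queue.
drush queue:run content_ai_generation

# Cap a run by time (in seconds) when testing heavy items.
drush queue:run content_ai_generation --time-limit=60
```

This lets you watch items flow through the worker and check the logs without waiting for a scheduled cron run.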
The Payoff: Performance Without Compromise
When you combine queue workers with Ultimate Cron properly, something remarkable happens. Your heaviest tasks become invisible to users. Your server resources get used efficiently. Your error handling becomes robust and transparent.
More importantly, you've built a system that scales. Need to process more content? Increase the batch size or frequency. Hit a performance bottleneck? Split tasks into multiple specialized queues. Experiencing failures? The retry mechanisms ensure nothing gets lost.
This isn't just about making your current site faster - it's about building an architecture that can handle whatever demands come next, whether that's AI processing, bulk operations, external API integrations, or data processing pipelines.
The difference between sites that scale gracefully and sites that crumble under load often comes down to decisions made early in their architecture. Queue workers and Ultimate Cron give you that foundation from day one, turning what could be your biggest performance liability into your most reliable asset.
Your users will never know how much work is happening behind the scenes. And that's exactly how it should be.