Node.js developers often reach for Promise.all() when handling multiple asynchronous operations. However, this seemingly innocent approach can cause serious performance issues in production environments. Understanding the limitations of Promise.all() and implementing proper batch execution strategies is crucial for building robust, scalable Node.js applications.

Understanding the Promise.all Problem

The Promise.all() method provides no throttling: every operation you pass it is already running, so all of them execute concurrently, which can exhaust system resources and crash applications in production scenarios. This approach has three critical limitations that impact application performance and reliability.

Resource Exhaustion Issues

When dealing with large numbers of concurrent operations, Promise.all() can overwhelm system resources:

// ❌ Problematic approach
const urls = [...Array(1000).keys()].map(i => `https://api.example.com/data/${i}`);
await Promise.all(urls.map(url => fetch(url)));

This code attempts to open 1000 TCP connections simultaneously, potentially causing:

  • Memory exhaustion
  • Network timeout errors
  • API rate limiting
  • Application crashes

Promise.all Limitations

The Promise.all() method has three fundamental limitations that make it risky for production workloads:

  1. No Concurrency Control - All operations execute simultaneously
  2. No Timeout Handling - A single hanging promise stalls the entire batch
  3. Fail-Fast Error Handling - One rejection rejects the whole batch immediately, and the results of every fulfilled operation are lost (see the snippet after this list)
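
A minimal sketch of the fail-fast behavior, contrasted with the built-in Promise.allSettled(), which reports every outcome but still offers no concurrency control or timeouts:

// ❌ Fail-fast: one rejection discards every fulfilled value
try {
  await Promise.all([
    Promise.resolve('ok'),
    Promise.reject(new Error('boom')),
    Promise.resolve('also ok')
  ]);
} catch (err) {
  console.error(err.message); // 'boom'; the two fulfilled values are unreachable
}

// Promise.allSettled() reports every outcome instead of failing fast
const settled = await Promise.allSettled([
  Promise.resolve('ok'),
  Promise.reject(new Error('boom'))
]);
// [ { status: 'fulfilled', value: 'ok' },
//   { status: 'rejected', reason: Error: boom } ]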

Implementing Batch Execution Strategies

Batch execution provides controlled concurrency with proper error handling and timeout management. This approach ensures optimal resource utilization while maintaining application stability.

Core Batch Executor Implementation

The batch executor manages concurrent operations with configurable limits and graceful error handling:

/**
 * Batch Executor
 * Runs async tasks with concurrency, per-task timeout, and graceful errors.
 */

export default async function batchExecutor({
  tasks,               // Array<() => Promise<any>>
  maxConcurrency = 5,  // How many run at once
  maxTimeout,          // Milliseconds before a task is aborted
  onError              // (err, index) => void  — optional centralized logger
}) {
  if (!Array.isArray(tasks) || tasks.length === 0) return [];

  const results = new Array(tasks.length).fill(null);
  let cursor = 0;

  // Helper that wraps a promise with a timeout (it stops waiting; it does not abort the underlying work)
  function withTimeout(promise, ms, idx) {
    if (!ms) return promise;
    return new Promise((resolve, reject) => {
      const timer = setTimeout(() => {
        reject(new Error(`Task ${idx} timed out after ${ms}ms`));
      }, ms);

      promise
        .then(resolve)
        .catch(reject)
        .finally(() => clearTimeout(timer));
    });
  }

  // One worker = one seat in the club
  async function worker() {
    while (cursor < tasks.length) {
      const index = cursor++;
      try {
        const task = tasks[index];
        results[index] = await withTimeout(task(), maxTimeout, index);
      } catch (err) {
        results[index] = null; // keep the array length stable
        if (onError) onError(err, index);
        else throw err; // fail-fast if no custom handler
      }
    }
  }

  // Spin up N workers
  const workers = Array.from(
    { length: Math.min(maxConcurrency, tasks.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}
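
A note on the design: instead of pre-splitting tasks into fixed chunks, the executor uses a shared cursor that every worker advances, so a new task starts the moment any worker frees up. That keeps exactly maxConcurrency operations in flight until the queue drains, and because each result is written back to its original index, the output order always matches the input order.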

Timeout Management

Proper timeout handling prevents an individual operation from blocking the entire batch. The withTimeout helper used by the executor above is worth a closer look:

// ✅ Proper timeout implementation
function withTimeout(promise, ms, idx) {
  if (!ms) return promise;
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => {
      reject(new Error(`Task ${idx} timed out after ${ms}ms`));
    }, ms);

    promise
      .then(resolve)
      .catch(reject)
      .finally(() => clearTimeout(timer));
  });
}
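
One caveat: withTimeout stops waiting for a slow task, but the underlying operation keeps running in the background. For operations that accept an AbortSignal, such as fetch, the work itself can be cancelled. A minimal sketch, assuming Node 18+ where AbortSignal.timeout() and a global fetch are available:

// ✅ Cancels the request itself, not just the wait
async function fetchWithTimeout(url, ms) {
  // AbortSignal.timeout(ms) aborts the request after ms milliseconds
  const res = await fetch(url, { signal: AbortSignal.timeout(ms) });
  if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`);
  return res;
}

const tasks = urls.map(url => () => fetchWithTimeout(url, 10_000));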

Practical Implementation Examples

Batch execution can be applied to various real-world scenarios, from file processing to API calls and database operations.

File Download Management

Download multiple files with controlled concurrency and timeout protection:

import batchExecutor from './batch.js';
import fs from 'fs/promises';
// Node 18+ ships a global fetch; on older versions, import it from 'node-fetch'

const urls = [...Array(100).keys()].map(i => `https://api.example.com/files/${i}`);

await fs.mkdir('./downloads', { recursive: true });

const tasks = urls.map((url, i) => async () => {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`);
  // res.buffer() is deprecated; arrayBuffer() works with native fetch and node-fetch v3
  const buffer = Buffer.from(await res.arrayBuffer());
  // Name files by index: Date.now() can collide when downloads finish together
  await fs.writeFile(`./downloads/${i}.jpg`, buffer);
  return url;
});

await batchExecutor({
  tasks,
  maxConcurrency: 3,
  maxTimeout: 10_000,
  onError: (err, idx) => console.error(`Download ${idx} failed:`, err.message)
});
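
With these settings, at most three downloads are in flight at any moment, any download that stalls past ten seconds is abandoned instead of blocking the batch, and a failed download is logged while the rest continue.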

Mixed Operation Types

Handle different types of asynchronous operations within the same batch:

const tasks = [
  async () => (await db.query('SELECT COUNT(*) FROM users')).rows[0].count,
  async () => (await fetch('https://api.external.com/data')).json(),
  async () => fs.readFile('./config.json', 'utf8')
];

const results = await batchExecutor({
  tasks,
  maxConcurrency: 2,
  maxTimeout: 5_000,
  onError: (err, i) => console.warn(`Job ${i} failed:`, err.message)
});
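
Because the executor writes each result back to its original index, results[0] holds the user count, results[1] the parsed API payload, and results[2] the config file contents; any task that failed or timed out appears as null.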

Performance Comparison and Best Practices

Understanding when to use batch execution versus other concurrency patterns is crucial for optimal application performance.

Feature Comparison

Feature                     Promise.all()    Batch Executor
Runs everything at once     ✅                ❌ (throttled)
Keeps order                 ✅                ✅
Per-task timeout            ❌                ✅
Partial failure handling    ❌                ✅
Resource exhaustion guard   ❌                ✅

Concurrency Optimization

Choose appropriate concurrency levels based on your specific use case; a rough heuristic is sketched after this list:

  • CPU-intensive tasks: Lower concurrency (2-4)
  • I/O-bound operations: Higher concurrency (10-20)
  • Network requests: Moderate concurrency (5-10)
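
As a starting point, a default can be derived from the machine itself. A rough sketch, where the task-kind labels are this article's categories rather than anything standardized:

import os from 'os';

// Rough defaults per task kind; tune against real metrics
function defaultConcurrency(kind) {
  const cores = os.cpus().length;
  switch (kind) {
    case 'cpu':     return Math.min(Math.max(cores - 1, 2), 4); // 2-4
    case 'network': return 5;                                   // 5-10
    case 'io':      return 10;                                  // 10-20
    default:        return 5;
  }
}

await batchExecutor({ tasks, maxConcurrency: defaultConcurrency('network') });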

Error Handling Strategies

Implement robust error handling for production environments:

// ✅ Comprehensive error handling
const results = await batchExecutor({
  tasks,
  maxConcurrency: 5,
  maxTimeout: 5000,
  onError: (err, index) => {
    console.error(`Task ${index} failed:`, err.message);
    // Log to monitoring service
    // Send alert if critical
    // Retry logic if appropriate
  }
});
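
When onError is supplied, the executor keeps going after a failure, so failed indices can be collected and reprocessed in a second pass. A small sketch building on the executor above:

const failedIndexes = [];

const results = await batchExecutor({
  tasks,
  maxConcurrency: 5,
  maxTimeout: 5000,
  onError: (err, index) => {
    console.error(`Task ${index} failed:`, err.message);
    failedIndexes.push(index); // remember it for a second pass
  }
});

// Re-run only the failures, more conservatively
if (failedIndexes.length > 0) {
  await batchExecutor({
    tasks: failedIndexes.map(i => tasks[i]),
    maxConcurrency: 2,
    maxTimeout: 10_000,
    onError: (err, i) => console.error(`Retry of task ${failedIndexes[i]} failed:`, err.message)
  });
}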

Advanced Batch Processing Techniques

For complex scenarios, consider implementing additional features like retry logic, progress tracking, and dynamic concurrency adjustment.

Retry Logic Implementation

Add automatic retry capabilities for failed operations:

// Linear backoff: wait delay, then 2 × delay, then 3 × delay, ...
async function withRetry(fn, maxRetries = 3, delay = 1000) {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await fn();
    } catch (error) {
      if (i === maxRetries - 1) throw error; // out of attempts; surface the error
      await new Promise(resolve => setTimeout(resolve, delay * (i + 1)));
    }
  }
}

const tasks = urls.map(url => () => withRetry(() => fetch(url)));
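
Two interactions to be aware of: the executor's maxTimeout wraps the whole task, so it bounds all retry attempts plus their backoff delays combined, and should be sized accordingly (or omitted) for retried tasks. Also, fetch() only rejects on network failures; to retry HTTP error responses as well, check res.ok inside the retried function and throw explicitly.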

Progress Tracking

Monitor batch execution progress for better user experience:

function createProgressTracker(total) {
  let completed = 0;
  return {
    increment: () => {
      completed++;
      console.log(`Progress: ${completed}/${total} (${Math.round(completed/total*100)}%)`);
    }
  };
}
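
To wire the tracker into the executor, wrap each task so it reports completion. A minimal sketch:

const progress = createProgressTracker(tasks.length);

const trackedTasks = tasks.map(task => async () => {
  try {
    return await task();
  } finally {
    progress.increment(); // counts successes and failures alike
  }
});

await batchExecutor({ tasks: trackedTasks, maxConcurrency: 5 });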

Final Thoughts: Use a Batch Executor Instead of Promise.all()

Batch execution provides a robust alternative to Promise.all() for production Node.js applications. By implementing controlled concurrency, proper timeout handling, and graceful error recovery, developers can build applications that scale efficiently while maintaining stability under load.

The batch executor pattern is particularly valuable for:

  • Processing large datasets
  • Managing API rate limits
  • Handling file operations
  • Database batch operations

Remember that successful batch processing requires careful consideration of concurrency levels, timeout values, and error handling strategies tailored to your specific use case. Start with conservative settings and adjust based on monitoring and performance metrics.

✨ Thank you for reading, and I hope you found this helpful. I'd sincerely appreciate your feedback in the comments section.