for await of VS Promise.all - javascript

Is there any difference between this:
const promises = await Promise.all(items.map(e => somethingAsync(e)));
for (const res of promises) {
  // do some calculations
}
And this?
for await (const res of items.map(e => somethingAsync(e))) {
  // do some calculations
}
I know that in the first snippet, all the promises are fired at the same time, but I'm not sure about the second. Does the for loop wait for the first iteration to be done before calling the next promise? Or are all the promises fired at the same time, with the inside of the loop acting like a callback for them?

Yes, they absolutely are different. for await is supposed to be used with asynchronous iterators, not with arrays of pre-existing promises.
Just to make clear,
for await (const res of items.map(e => somethingAsync(e))) …
works the same as
const promises = items.map(e => somethingAsync(e));
for await (const res of promises) …
or
const promises = [somethingAsync(items[0]), somethingAsync(items[1]), …];
for await (const res of promises) …
The somethingAsync calls are happening immediately, all at once, before anything is awaited. Then, they are awaited one after another, which is definitely a problem if any one of them gets rejected: it will cause an unhandled promise rejection error. Using Promise.all is the only viable choice to deal with the array of promises:
for (const res of await Promise.all(promises)) …
See Waiting for more than one concurrent await operation and Any difference between await Promise.all() and multiple await? for details.
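To make the rejection hazard concrete, here is a minimal sketch (the sleep and fail helpers are made up for the demo). Awaiting the promises one by one means a later rejection has no handler once the loop has already thrown:
const sleep = ms => new Promise(r => setTimeout(r, ms));
const fail = async ms => { await sleep(ms); throw new Error(`rejected after ${ms}ms`); };
(async () => {
  const promises = [fail(10), fail(50)]; // both start immediately
  try {
    for await (const res of promises) console.log(res);
  } catch (err) {
    console.error('caught:', err.message); // only the first rejection is caught
  }
  // fail(50) rejects later with nothing subscribed to it -> unhandled rejection,
  // whereas await Promise.all(promises) would have subscribed to both.
})();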

The need for for await ... arises when, on an asynchronous iterator, the computation of the current iteration depends on some of the previous iterations. If there are no dependencies, Promise.all is your choice. The for await construct was designed to work with asynchronous iterators, although - as in your example - you can use it with an array of promises.
See the example paginated data in the book javascript.info for an example using an asynchronous iterator that can't be rewritten using Promise.all:
(async () => {
  for await (const commit of fetchCommits('javascript-tutorial/en.javascript.info')) {
    console.log(commit.author.login);
  }
})();
Here the fetchCommits async iterator makes a request to fetch the commits of a GitHub repo. The fetch responds with a JSON of 30 commits, and also provides a link to the next page in the Link header. Therefore the next iteration can only start after the previous iteration has the link for the next request:
async function* fetchCommits(repo) {
  let url = `https://api.github.com/repos/${repo}/commits`;
  while (url) {
    const response = await fetch(url, {
      headers: {'User-Agent': 'Our script'},
    });
    const body = await response.json(); // (array of commits)
    // The URL of the next page is in the headers, extract it using a regexp
    let nextPage = response.headers.get('Link').match(/<(.*?)>; rel="next"/);
    nextPage = nextPage?.[1];
    url = nextPage;
    for (let commit of body) { // yield commits one by one, until the page ends
      yield commit;
    }
  }
}

As you said, Promise.all will send all the requests in one go and then you will get the responses when all of them complete.
In the second scenario, you will send the requests in one go but receive the responses one by one.
See this small example for reference.
let i = 1;
function somethingAsync(time) {
  console.log("fired");
  return delay(time).then(() => Promise.resolve(i++));
}
const items = [1000, 2000, 3000, 4000];
function delay(time) {
  return new Promise((resolve) => {
    setTimeout(resolve, time)
  });
}
(async() => {
  console.time("first way");
  const promises = await Promise.all(items.map(e => somethingAsync(e)));
  for (const res of promises) {
    console.log(res);
  }
  console.timeEnd("first way");
  i = 1; // reset counter
  console.time("second way");
  for await (const res of items.map(e => somethingAsync(e))) {
    // do some calculations
    console.log(res);
  }
  console.timeEnd("second way");
})();
You could try it here as well - https://repl.it/repls/SuddenUselessAnalyst
Hope this helps.

Actually, using the for await syntax does fire the promises all at once.
This small piece of code proves it:
const sleep = s => {
  return new Promise(resolve => {
    setTimeout(resolve, s * 1000);
  });
}
const somethingAsync = async t => {
  await sleep(t);
  return t;
}
(async () => {
  const items = [1, 2, 3, 4];
  const now = Date.now();
  for await (const res of items.map(e => somethingAsync(e))) {
    console.log(res);
  }
  console.log("time: ", (Date.now() - now) / 1000);
})();
stdout:
time: 4.001
But the inside of the loop doesn't act "as a callback". If I reverse the array, all the logs appear at once. I suppose that the promises are fired at once and the runtime just waits for the first one to resolve before going to the next iteration.
EDIT: Actually, using for await is bad practice when we use it with something other than an asynchronous iterator; best is to use Promise.all, according to @Bergi in his answer.

Related

Javascript async await - why is my await not stopping execution of following code until resolved? [duplicate]

There are quite a few topics posted about how async/await behaves in the JavaScript map function, but still, a detailed explanation of the two examples below would be nice:
const resultsPromises = myArray.map(async number => {
  return await getResult(number);
});
const resultsPromises = myArray.map(number => {
  return getResult(number);
});
Edited: this is of course a fictional case, just opened for debate on why, how, and when the map function should wait for the await keyword. Solutions for modifying this example by calling Promise.all() are kind of not the aim of this question.
getResult is an async function
The other answers have pretty well covered the details of how your examples behave, but I wanted to try to state it more succinctly.
const resultsPromises = myArray.map(async number => {
  return await getResult(number);
});
const resultsPromises = myArray.map(number => {
  return getResult(number);
});
Array.prototype.map synchronously loops through an array and transforms each element to the return value of its callback.
Both examples return a Promise.
async functions always return a Promise.
getResult returns a Promise.
Therefore, if there are no errors you can think of them both in pseudocode as:
const resultsPromises = myArray.map(/* map each element to a Promise */);
As zero298 stated and alnitak demonstrated, this very quickly (synchronously) starts off each promise in order; however, since they're run in parallel each promise will resolve/reject as they see fit and will likely not settle (fulfill or reject) in order.
Either run the promises in parallel and collect the results with Promise.all or run them sequentially using a for * loop or Array.prototype.reduce.
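Before reaching for a library, both options can be sketched in vanilla JS (a minimal sketch; getResult is assumed to return a promise, as in the question):
const inParallel = arr => Promise.all(arr.map(getResult));
const inSequence = async arr => {
  const results = [];
  for (const n of arr) {
    results.push(await getResult(n)); // each call waits for the previous one to settle
  }
  return results;
};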
Alternatively, you could use a third-party module for chainable asynchronous JavaScript methods I maintain to clean things up and--perhaps--make the code match your intuition of how an async map operation might work:
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));
const getResult = async n => {
  await delay(Math.random() * 1000);
  console.log(n);
  return n;
};
(async () => {
  console.log('parallel:');
  await AsyncAF([1, 2, 3]).map(getResult).then(console.log);
  console.log('sequential:');
  await AsyncAF([1, 2, 3]).series.map(getResult).then(console.log)
})();
<script src="https://unpkg.com/async-af@7.0.12/index.js"></script>
async/await is useful when you want to flatten your code by removing the .then() callbacks or if you want to implicitly return a Promise:
const delay = n => new Promise(res => setTimeout(res, n));
async function test1() {
  await delay(200);
  // do something useful here
  console.log('hello 1');
}
async function test2() {
  return 'hello 2'; // this returned value will be wrapped in a Promise
}
test1();
test2().then(console.log);
However, in your case, you are not using await to replace a .then(), nor are you using it to return an implicit Promise since your function already returns a Promise. So they are not necessary.
Parallel execution of all the Promises
If you want to run all Promises in parallel, I would suggest simply returning the result of getResult with map() to generate an array of Promises. The Promises will be started sequentially but will eventually run in parallel.
const resultsPromises = indicators.map(getResult);
Then you can await all promises and get the resolved results using Promise.all():
const data = [1, 2, 3];
const getResult = x => new Promise(res => {
  return setTimeout(() => {
    console.log(x);
    res(x);
  }, Math.random() * 1000)
});
Promise.all(data.map(getResult)).then(console.log);
Sequential execution of the Promises
However, if you want to run each Promise sequentially and wait for the previous Promise to resolve before running the next one, then you can use reduce() and async/await like this:
const data = [1, 2, 3];
const getResult = x => new Promise(res => {
  return setTimeout(() => {
    console.log(x);
    res(x);
  }, Math.random() * 1000)
});
data.reduce(async (previous, x) => {
  const result = await previous;
  return [...result, await getResult(x)];
}, Promise.resolve([])).then(console.log);
Array.prototype.map() is a function that transforms Arrays. It maps one Array to another Array. The most important part of its function signature is the callback. The callback is called on each item in the Array and what that callback returns is what is put into the new Array returned by map.
It does not do anything special with what gets returned. It does not call .then() on the items, it does not await anything. It synchronously transforms data.
That means that if the callback returns a Promise (which all async functions do), all the promises will be "hot" and running in parallel.
In your example, if getResult() returns a Promise or is itself async, there isn't really a difference between your implementations. resultsPromises will be populated by Promises that may or may not be resolved yet.
If you want to wait for everything to finish before moving on, you need to use Promise.all().
Additionally, if you only want 1 getResults() to be running at a time, use a regular for loop and await within the loop.
If the intent of the first code snippet was to have a .map call that waits for all of the Promises to be resolved before returning (and to have those callbacks run sequentially) I'm afraid it doesn't work like that. The .map function doesn't know how to do that with async functions.
This can be demonstrated with the following code:
const array = [1, 2, 3, 4, 5];
function getResult(n) {
  console.log('starting ' + n);
  return new Promise(resolve => {
    setTimeout(() => {
      console.log('finished ' + n);
      resolve(n);
    }, 1000 * (Math.random() + 1)); // Math.random() takes no arguments
  });
}
let promises = array.map(async (n) => {
  return await getResult(n);
});
console.log('map finished');
Promise.all(promises).then(console.log);
Where you'll see that the .map call finishes immediately before any of the asynchronous operations are completed.
If getResult always returns a promise and never throws an error then both will behave the same.
Some promise-returning functions can throw errors before the promise is returned; in this case, wrapping the call to getResult in an async function will turn that thrown error into a rejected promise, which can be useful.
As has been stated in many comments, you never need return await - it is equivalent to adding .then(result => result) on the end of a promise chain - it is (mostly) harmless but unnecessary. Just use return.
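The "(mostly)" is worth spelling out: inside a try/catch, return await does make a difference, because awaiting lets the function's own catch block see the rejection. A minimal sketch, reusing getResult from above:
async function withFallback() {
  try {
    return await getResult(1); // a rejection here is caught below
  } catch (err) {
    return 'fallback'; // with a bare `return`, the rejection would reach the caller instead
  }
}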

node.js, return the same promise for all callers until a promise is resolved

I have a heavy promise function that gets called multiple times with the same parameters.
I want it to avoid re-running the same calculations for the same params. Instead, if a consecutive call has the same params, return the promise created by the previous one.
Normally, for such cases, I would use memoizee package (or memoize in lodash).
However, this promise returns a large dataset, which can be problematic to keep in memory. Since these packages cache the results in-memory, they won't be a good fit for this case.
A simplified example:
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms))
async function sleepOneSec() {
  console.log("going to sleep");
  await sleep(1000)
}
async function myApp() {
  await Promise.all([1, 2, 3].map(async item => {
    console.log('item', item);
    await sleepOneSec();
  }));
  console.log('promise.all is done');
  await sleepOneSec();
  console.log('completed');
}
myApp().then(() => {}).catch(err => console.log(err));
currently, without any memoization, the output is:
item 1
going to sleep
item 2
going to sleep
item 3
going to sleep
promise.all is done
going to sleep
completed
Ideally I want the results to be:
item 1
going to sleep
item 2
item 3
promise.all is done
going to sleep
completed
OK, here's an implementation for sleepOneSec(num) that generates the output you asked for. It creates a cache for the promise from sleepOneSec() that is indexed by the arguments. The promise is removed from the cache as soon as it resolves or rejects so it is only reused while "in progress":
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms))
const sleepCache = new Map();
async function sleepOneSec(num) {
  // serialize args
  const args = JSON.stringify({ num });
  let p = sleepCache.get(args);
  if (!p) {
    console.log("going to sleep");
    p = sleep(num);
    sleepCache.set(args, p);
    // when this promise resolves, remove it from the cache
    // so it won't be used any more
    const remove = () => sleepCache.delete(args);
    p.then(remove, remove);
  }
  return p;
}
async function myApp() {
  await Promise.all([1, 2, 3].map(async item => {
    console.log('item', item);
    await sleepOneSec(1000);
  }));
  console.log('promise.all is done');
  await sleepOneSec(1000);
  console.log('completed');
}
myApp().then(() => {}).catch(err => console.log(err));
If the arguments to the function in question contain objects, then you have to make sure they are uniquely serializable with JSON.stringify() or you need to manually convert them into something that is representative of their uniqueness that is serializable by JSON.stringify(). It's also important to manually handle serialization of any objects because JSON.stringify() does not necessarily stringify properties in the same order which would break the comparison. This is why I asked about the arguments in your real code. All you offered me when I asked about what the real arguments are was a single numeric argument so that's what is shown here. But, if you have objects as arguments, those have to be handled carefully so the caching works properly.
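For flat argument objects, one hedged way to get an order-stable cache key is to pass the sorted key list as JSON.stringify's replacer (a sketch, not a general deep solution):
// {a: 1, b: 2} and {b: 2, a: 1} now produce the same cache key
const stableKey = obj => JSON.stringify(obj, Object.keys(obj).sort());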

Await multiple promises to resolve while returning the value of each promise as soon as it is resolved

The problem
While working with a restful API, I had to make multiple requests to retrieve data around a single search. The problem that I seem to be facing is that as the results are returned from a large database, some promises take FOREVER to resolve.
Current solution
Currently I make all the requests in a loop while adding the promises to an array and then using await Promise.all() to wait for them to be resolved but this makes loading times > 30 seconds at times even when the earliest promise resolved within a few seconds.
I am looking for a way that I can 'Lazy Load' the results in. I do have access to the restful server so any changes in either the front-end or back-end would help however I would prefer the changes to be at the front end.
Edit 1
My bad I didn't put any references to the code that I am currently using. For reference here is what I am currently doing
async function retrieve_data() {
  let request_urls = [list of api endpoints to fetch]
  let promises = []
  for (let url of request_urls)
    promises.push( fetch(url) )
  await Promise.all( promises )
  // Do something with the returned results
}
The solution I want
async function retrieve_data() {
  let request_urls = [list of api endpoints to fetch]
  let promises = []
  for (let url of request_urls)
    promises.push( fetch(url) )
  let results = Promise.some_magic_function()
  // Or server side
  res.write(Promise.some_magic_function()) // If results are received in chunks
  // Do something with the returned results
}
Edit 2
So while I did find Promise.any() and Promise.all() to fix half of my problems, I can't seem to work out how to write a function that adds the value of each fulfilled promise to an array as soon as it is fulfilled. While these solutions do work for my use case, I just can't help but think that there must be a better solution.
You can handle the individual results as you're populating the array for Promise.all, which you can use to know when they've all finished.
It's tricky without any code in the question to work with, but here's a simple example:
const promises = [1, 2, 3, 4, 5].map(async (num) => {
  const result = await doRequest(num);
  console.log(`Immediate processing for "${result}"`);
  return result; // If you're going to use the contents of the array from `Promise.all`'s fulfillment
});
const allResults = await Promise.all(promises);
console.log(`All processing done, all results:`);
for (const result of allResults) {
  console.log(result);
}
Live Example:
const rndDelay = () => new Promise((resolve) => setTimeout(resolve, Math.round(Math.random() * 1000) + 100));
async function doRequest(num) {
  console.log(`Requesting ${num}`);
  await rndDelay();
  return `Result for ${num}`;
}
async function main() {
  const promises = [1, 2, 3, 4, 5].map(async (num) => {
    const result = await doRequest(num);
    console.log(`Immediate processing for "${result}"`);
    return result; // If you're going to use the contents of the array from `Promise.all`'s fulfillment
  });
  const allResults = await Promise.all(promises);
  console.log(`All processing done, all results:`);
  for (const result of allResults) {
    console.log(result);
  }
}
main()
  .catch((error) => console.error(error));
Notice that the callback we give map (in this example) is an async function, so it returns a promise and we can use await within it.
If you can't use an async wrapper where you're creating the promises, that isn't a problem, you can fall back to .then:
const promises = [1, 2, 3, 4, 5].map((num) => {
  return doRequest(num)
    .then((result) => {
      console.log(`Immediate processing for "${result}"`);
      return result; // If you're going to use the contents of the array from `Promise.all`'s fulfillment
    });
});
const allResults = await Promise.all(promises);
console.log(`All processing done, all results:`);
for (const result of allResults) {
  console.log(result);
}
const rndDelay = () => new Promise((resolve) => setTimeout(resolve, Math.round(Math.random() * 1000) + 100));
async function doRequest(num) {
  console.log(`Requesting ${num}`);
  await rndDelay();
  return `Result for ${num}`;
}
async function main() {
  const promises = [1, 2, 3, 4, 5].map((num) => {
    return doRequest(num)
      .then((result) => {
        console.log(`Immediate processing for "${result}"`);
        return result; // If you're going to use the contents of the array from `Promise.all`'s fulfillment
      });
  });
  const allResults = await Promise.all(promises);
  console.log(`All processing done, all results:`);
  for (const result of allResults) {
    console.log(result);
  }
}
main()
  .catch((error) => console.error(error));
Extract the requests to different async functions and process them the moment they resolve:
// Request that responds in 2 seconds
async function req1() {
  let response = await http.get(`${serverRoot}/api/req1`);
  // Work on response from req 1. Everything below this line executes after 2 seconds
}
// Request that responds in 30 seconds
async function req2() {
  let response = await http.get(`${serverRoot}/api/req2`);
  // Work on response from req 2. Everything below this line executes after 30 seconds
}
// Request that responds in 10 seconds
async function req3() {
  let response = await http.get(`${serverRoot}/api/req3`);
  // Work on response from req 3. Everything below this line executes after 10 seconds
}
Then you can go ahead and add them all in an array and wait for all of them to finish, in order to, let's say, hide a loading indicator:
loading = true;
//Notice that we have to invoke the functions inside the array
await Promise.all([req1(), req2(), req3()]);
//Everything below this line will run after the longest request is finished, 30 seconds
loading = false;

Javascript: await inside of loop issue

I want to use an ESLint plugin in my project with Webpack, but it does not let me use await inside a loop.
According to the ESLint docs, the recommendation is to remove await from the loop and just add a Promise.all afterwards.
Wrong example:
async function foo(things) {
  const results = [];
  for (const thing of things) {
    // Bad: each loop iteration is delayed until the entire asynchronous operation completes
    results.push(await bar(thing));
  }
  return baz(results);
}
Correct example:
async function foo(things) {
  const results = [];
  for (const thing of things) {
    // Good: all asynchronous operations are immediately started.
    results.push(bar(thing));
  }
  // Now that all the asynchronous operations are running, here we wait until they all complete.
  return baz(await Promise.all(results));
}
But in my code I just merge data into one array which comes from HTTP request:
async update() {
  let array = [];
  for (url of this.myUrls) {
    const response = await this.getData(url);
    array = await array.concat(response);
  }
}
Is it possible to remove await from this loop and add a Promise just for the array concatenation? I have no idea how to do it...
If you like one-liners.
const array = await Promise.all(this.myUrls.map((url)=> this.getData(url)));
In this case, the map method returns a bunch of promises based on the URL and your getData method. Promise.all waits until all of your promises are resolved. These promises run in parallel.
You can use promise like this:
function update() {
  let array = [], req = [];
  for (url of this.myUrls) {
    req.push(this.getData(url));
  }
  return Promise.all(req).then((data) => {
    console.log(data);
    return data;
  })
}
If I'm understanding you correctly, your getData function returns an array?
Using the one-liner supplied by Anarno, we can wait until all promises are resolved and then concatenate all the arrays.
const allResults = await Promise.all(this.myUrls.map((url) => this.getData(url)));
let finalArray = [];
allResults.forEach((item) => finalArray = finalArray.concat(item));
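In an ES2019+ environment the same concatenation can be written with Array.prototype.flat():
const finalArray = (await Promise.all(this.myUrls.map((url) => this.getData(url)))).flat();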

Is it possible to take advantage of Mongo threads from a NodeJS app without forking the node process? [duplicate]

I have some code that is iterating over a list that was queried out of a database and making an HTTP request for each element in that list. That list can sometimes be a reasonably large number (in the thousands), and I would like to make sure I am not hitting a web server with thousands of concurrent HTTP requests.
An abbreviated version of this code currently looks something like this...
function getCounts() {
  return users.map(user => {
    return new Promise(resolve => {
      remoteServer.getCount(user) // makes an HTTP request
        .then(() => {
          /* snip */
          resolve();
        });
    });
  });
}
Promise.all(getCounts()).then(() => { /* snip */});
This code is running on Node 4.3.2. To reiterate, can Promise.all be managed so that only a certain number of Promises are in progress at any given time?
P-Limit
I have compared promise concurrency limitation with a custom script, bluebird, es6-promise-pool, and p-limit. I believe that p-limit has the simplest, most stripped-down implementation for this need. See their documentation.
Requirements
To be compatible with async/await in the example
ECMAScript 2017 (version 8)
Node version > 8.2.1
My Example
In this example, we need to run a function for every URL in the array (like, maybe an API request). Here this is called fetchData(). If we had an array of thousands of items to process, concurrency would definitely be useful to save on CPU and memory resources.
const pLimit = require('p-limit');
// Example concurrency of 3 promises at once
const limit = pLimit(3);
let urls = [
  "http://www.exampleone.com/",
  "http://www.exampletwo.com/",
  "http://www.examplethree.com/",
  "http://www.examplefour.com/",
]
// Create an array of our promises using map (fetchData() returns a promise)
let promises = urls.map(url => {
  // wrap the function we are calling in the limit function we defined above
  return limit(() => fetchData(url));
});
(async () => {
  // Only three promises are run at once (as defined above)
  const result = await Promise.all(promises);
  console.log(result);
})();
The console log result is an array of your resolved promises response data.
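For completeness, a hedged stand-in for fetchData() so the snippet runs as-is (any promise-returning function works here):
// a fake request used only for this demo
const fetchData = url => new Promise(resolve =>
  setTimeout(() => resolve(`response from ${url}`), 1000)
);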
Using Array.prototype.splice
while (funcs.length) {
  // 100 at a time
  await Promise.all(funcs.splice(0, 100).map(f => f()))
}
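Here funcs is assumed to be an array of promise-returning functions, e.g.:
const funcs = urls.map(url => () => fetch(url));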
Note that Promise.all() doesn't trigger the promises to start their work, creating the promise itself does.
With that in mind, one solution would be to check whenever a promise is resolved whether a new promise should be started or whether you're already at the limit.
However, there is really no need to reinvent the wheel here. One library that you could use for this purpose is es6-promise-pool. From their examples:
var PromisePool = require('es6-promise-pool')
var promiseProducer = function () {
  // Your code goes here.
  // If there is work left to be done, return the next work item as a promise.
  // Otherwise, return null to indicate that all promises have been created.
  // Scroll down for an example.
}
// The number of promises to process simultaneously.
var concurrency = 3
// Create a pool.
var pool = new PromisePool(promiseProducer, concurrency)
// Start the pool.
var poolPromise = pool.start()
// Wait for the pool to settle.
poolPromise.then(function () {
  console.log('All promises fulfilled')
}, function (error) {
  console.log('Some promise rejected: ' + error.message)
})
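As a hedged sketch of what that producer might look like over a list of URLs (urls and fetchData are assumptions for the demo):
var urls = ['http://www.exampleone.com/', 'http://www.exampletwo.com/']
var index = 0
var promiseProducer = function () {
  if (index < urls.length) {
    return fetchData(urls[index++]) // next work item as a promise
  }
  return null // no work left; the pool drains and then settles
}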
If you know how iterators work and how they are consumed, you wouldn't need any extra library, since it can become very easy to build your own concurrency yourself. Let me demonstrate:
/* [Symbol.iterator]() is equivalent to .values()
const iterator = [1,2,3][Symbol.iterator]() */
const iterator = [1,2,3].values()
// loop over all items with for..of
for (const x of iterator) {
  console.log('x:', x)
  // notice how this loop continues the same iterator
  // and consumes the rest of the iterator, making the
  // outer loop not log any more x's
  for (const y of iterator) {
    console.log('y:', y)
  }
}
We can use the same iterator and share it across workers.
If you had used .entries() instead of .values() you would have gotten an iterator that yields [index, value], which I will demonstrate below with a concurrency of 2.
const sleep = t => new Promise(rs => setTimeout(rs, t))
const iterator = Array.from('abcdefghij').entries()
// const results = [] || Array(someLength)
async function doWork (iterator, i) {
  for (let [index, item] of iterator) {
    await sleep(1000)
    console.log(`Worker#${i}: ${index},${item}`)
    // in case you need to store the results in order
    // results[index] = item + item
    // or if the order does not matter
    // results.push(item + item)
  }
}
const workers = Array(2).fill(iterator).map(doWork)
// ^--- starts two workers sharing the same iterator
Promise.allSettled(workers).then(console.log.bind(null, 'done'))
The benefit of this is that you can have a generator function instead of having everything ready at once.
What's even more awesome is that you can do stream.Readable.from(iterator) in Node (and eventually in WHATWG streams as well), and with a transferable ReadableStream this makes the approach potentially very useful in the future if you are working with web workers for performance.
Note: the difference between this and the example async-pool is that it spawns two workers, so if one worker throws an error for some reason at, say, index 5, it won't stop the other worker from doing the rest. You just go from 2 concurrency down to 1 (so it won't stop there). So my advice is that you catch all errors inside the doWork function.
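For instance, a hedged sketch of guarding each task so that one rejection doesn't kill a worker:
async function doWork (iterator, i) {
  for (let [index, item] of iterator) {
    try {
      await sleep(1000) // the actual task goes here
    } catch (err) {
      console.error(`Worker#${i} failed on item ${index}:`, err)
    }
  }
}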
Instead of using promises for limiting http requests, use node's built-in http.Agent.maxSockets. This removes the requirement of using a library or writing your own pooling code, and has the added advantage of more control over what you're limiting.
agent.maxSockets
By default set to Infinity. Determines how many concurrent sockets the agent can have open per origin. Origin is either a 'host:port' or 'host:port:localAddress' combination.
For example:
var http = require('http');
var agent = new http.Agent({maxSockets: 5}); // 5 concurrent connections per origin
var request = http.request({..., agent: agent}, ...);
If making multiple requests to the same origin, it might also benefit you to set keepAlive to true (see docs above for more info).
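For example, a hedged combination of both options:
var agent = new http.Agent({ maxSockets: 5, keepAlive: true });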
bluebird's Promise.map can take a concurrency option to control how many promises should be running in parallel. Sometimes it is easier than .all because you don't need to create the promise array.
const Promise = require('bluebird')
function getCounts() {
  return Promise.map(users, user => {
    return new Promise(resolve => {
      remoteServer.getCount(user) // makes an HTTP request
        .then(() => {
          /* snip */
          resolve();
        });
    });
  }, {concurrency: 10}); // <---- at most 10 http requests at a time
}
As all others in this answer thread have pointed out, Promise.all() won't do the right thing if you need to limit concurrency. But ideally you shouldn't even want to wait until all of the Promises are done before processing them.
Instead, you want to process each result ASAP as soon as it becomes available, so you don't have to wait for the very last promise to finish before you start iterating over them.
So, here's a code sample that does just that, based partly on the answer by Endless and also on this answer by T.J. Crowder.
// example tasks that sleep and return a number
// in real life, you'd probably fetch URLs or something
const tasks = [];
for (let i = 0; i < 20; i++) {
  tasks.push(async () => {
    console.log(`start ${i}`);
    await sleep(Math.random() * 1000);
    console.log(`end ${i}`);
    return i;
  });
}
function sleep(ms) { return new Promise(r => setTimeout(r, ms)); }
(async () => {
  for await (let value of runTasks(3, tasks.values())) {
    console.log(`output ${value}`);
  }
})();
async function* runTasks(maxConcurrency, taskIterator) {
  // Each async iterator is a worker, polling for tasks from the shared taskIterator
  // Sharing the iterator ensures that each worker gets unique tasks.
  const asyncIterators = new Array(maxConcurrency);
  for (let i = 0; i < maxConcurrency; i++) {
    asyncIterators[i] = (async function* () {
      for (const task of taskIterator) yield await task();
    })();
  }
  yield* raceAsyncIterators(asyncIterators);
}
async function* raceAsyncIterators(asyncIterators) {
  async function nextResultWithItsIterator(iterator) {
    return { result: await iterator.next(), iterator: iterator };
  }
  /** @type Map<AsyncIterator<T>,
      Promise<{result: IteratorResult<T>, iterator: AsyncIterator<T>}>> */
  const promises = new Map(asyncIterators.map(iterator =>
    [iterator, nextResultWithItsIterator(iterator)]));
  while (promises.size) {
    const { result, iterator } = await Promise.race(promises.values());
    if (result.done) {
      promises.delete(iterator);
    } else {
      promises.set(iterator, nextResultWithItsIterator(iterator));
      yield result.value;
    }
  }
}
There's a lot of magic in here; let me explain.
This solution is built around async generator functions, which many JS developers may not be familiar with.
A generator function (aka function* function) returns a "generator," an iterator of results. Generator functions are allowed to use the yield keyword where you might have normally used a return keyword. The first time a caller calls next() on the generator (or uses a for...of loop), the function* function runs until it yields a value; that becomes the next() value of the iterator. But the subsequent time next() is called, the generator function resumes from the yield statement, right where it left off, even if it's in the middle of a loop. (You can also yield*, to yield all of the results of another generator function.)
An "async generator function" (async function*) is a generator function that returns an "async iterator," which is an iterator of promises. You can call for await...of on an async iterator. Async generator functions can use the await keyword, as you might do in any async function.
In the example, we call runTasks() with an array of task functions. runTasks() is an async generator function, so we can call it with a for await...of loop. Each time the loop runs, we'll process the result of the latest completed task.
runTasks() creates N async iterators, the workers. (Note that the workers are initially defined as async generator functions, but we immediately invoke each function, and store each resulting async iterator in the asyncIterators array.) The example calls runTasks with 3 concurrent workers, so no more than 3 tasks are launched at the same time. When any task completes, we immediately queue up the next task. (This is superior to "batching", where you do 3 tasks at once, await all three of them, and don't start the next batch of three until the entire previous batch has finished.)
runTasks() concludes by "racing" its async iterators with yield* raceAsyncIterators(). raceAsyncIterators() is like Promise.race() but it races N iterators of Promises instead of just N Promises; it returns an async iterator that yields the results of resolved Promises.
raceAsyncIterators() starts by defining a promises Map from each of the iterators to promises. Each promise is a promise for an iteration result along with the iterator that generated it.
With the promises map, we can Promise.race() the values of the map, giving us the winning iteration result and its iterator. If the iterator is completely done, we remove it from the map; otherwise we replace its Promise in the promises map with the iterator's next() Promise and yield result.value.
In conclusion, runTasks() is an async generator function that yields the results of racing N concurrent async iterators of tasks, so the end user can just for await (let value of runTasks(3, tasks.values())) to process each result as soon as it becomes available.
I suggest the library async-pool: https://github.com/rxaviers/async-pool
npm install tiny-async-pool
Description:
Run multiple promise-returning & async functions with limited concurrency using native ES6/ES7
asyncPool runs multiple promise-returning & async functions in a limited concurrency pool. It rejects immediately as soon as one of the promises rejects. It resolves when all the promises complete. It calls the iterator function as soon as possible (under the concurrency limit).
Usage:
const timeout = i => new Promise(resolve => setTimeout(() => resolve(i), i));
await asyncPool(2, [1000, 5000, 3000, 2000], timeout);
// Call iterator (i = 1000)
// Call iterator (i = 5000)
// Pool limit of 2 reached, wait for the quicker one to complete...
// 1000 finishes
// Call iterator (i = 3000)
// Pool limit of 2 reached, wait for the quicker one to complete...
// 3000 finishes
// Call iterator (i = 2000)
// Iteration is complete, wait until running ones complete...
// 5000 finishes
// 2000 finishes
// Resolves, results are passed in given array order `[1000, 5000, 3000, 2000]`.
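The usage above assumes the pool function has been imported, e.g.:
const asyncPool = require('tiny-async-pool');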
Here is my ES7 solution to a copy-paste friendly and feature-complete Promise.all()/map() alternative, with a concurrency limit.
Similar to Promise.all(), it maintains return order as well as a fallback for non-promise return values.
I also included a comparison of the different implementations, as it illustrates some aspects a few of the other solutions have missed.
Usage
const asyncFn = delay => new Promise(resolve => setTimeout(() => resolve(), delay));
const args = [30, 20, 15, 10];
await asyncPool(args, arg => asyncFn(arg), 4); // concurrency limit of 4
Implementation
async function asyncBatch(args, fn, limit = 8) {
  // Copy arguments to avoid side effects
  args = [...args];
  const outs = [];
  while (args.length) {
    const batch = args.splice(0, limit);
    const out = await Promise.all(batch.map(fn));
    outs.push(...out);
  }
  return outs;
}
async function asyncPool(args, fn, limit = 8) {
  return new Promise((resolve) => {
    // Copy arguments to avoid side effect, reverse queue as
    // pop is faster than shift
    const argQueue = [...args].reverse();
    let count = 0;
    const outs = [];
    const pollNext = () => {
      if (argQueue.length === 0 && count === 0) {
        resolve(outs);
      } else {
        while (count < limit && argQueue.length) {
          const index = args.length - argQueue.length;
          const arg = argQueue.pop();
          count += 1;
          const out = fn(arg);
          const processOut = (out, index) => {
            outs[index] = out;
            count -= 1;
            pollNext();
          };
          if (typeof out === 'object' && out.then) {
            out.then(out => processOut(out, index));
          } else {
            processOut(out, index);
          }
        }
      }
    };
    pollNext();
  });
}
Comparison
// A simple async function that returns after the given delay
// and prints its value to allow us to determine the response order
const asyncFn = delay => new Promise(resolve => setTimeout(() => {
  console.log(delay);
  resolve(delay);
}, delay));
// List of arguments to the asyncFn function
const args = [30, 20, 15, 10];
// As a comparison of the different implementations, a low concurrency
// limit of 2 is used in order to highlight the performance differences.
// If a limit greater than or equal to args.length is used the results
// would be identical.
// Vanilla Promise.all/map combo
const out1 = await Promise.all(args.map(arg => asyncFn(arg)));
// prints: 10, 15, 20, 30
// total time: 30ms
// Pooled implementation
const out2 = await asyncPool(args, arg => asyncFn(arg), 2);
// prints: 20, 30, 15, 10
// total time: 40ms
// Batched implementation
const out3 = await asyncBatch(args, arg => asyncFn(arg), 2);
// prints: 20, 30, 10, 15
// total time: 45ms
console.log(out1, out2, out3); // prints: [30, 20, 15, 10] x 3
// Conclusion: Execution order and performance is different,
// but return order is still identical
Conclusion
asyncPool() should be the best solution as it allows new requests to start as soon as any previous one finishes.
asyncBatch() is included as a comparison as its implementation is simpler to understand, but it should be slower in performance, as all requests in the same batch are required to finish before the next batch can start.
In this contrived example, the non-limited vanilla Promise.all() is of course the fastest, while the others could perform more desirably in a real-world congestion scenario.
Update
The async-pool library that others have already suggested is probably a better alternative to my implementation as it works almost identically and has a more concise implementation with a clever usage of Promise.race(): https://github.com/rxaviers/async-pool/blob/master/lib/es7.js
Hopefully my answer can still serve an educational value.
Semaphore is a well-known concurrency primitive that was designed to solve similar problems. It's a very universal construct; implementations of Semaphore exist in many languages. This is how one would use a Semaphore to solve this issue:
async function main() {
  const s = new Semaphore(100);
  const res = await Promise.all(
    entities.map((user) =>
      s.runExclusive(() => remoteServer.getCount(user))
    )
  );
  return res;
}
I'm using implementation of Semaphore from async-mutex, it has decent documentation and TypeScript support.
If you want to dig deep into topics like this you can take a look at the book "The Little Book of Semaphores" which is freely available as PDF here
Unfortunately there is no way to do it with native Promise.all, so you have to be creative.
This is the quickest, most concise way I could find without using any outside libraries.
It makes use of a newer JavaScript feature called an iterator. The iterator basically keeps track of what items have been processed and what haven't.
In order to use it in code, you create an array of async functions. Each async function asks the same iterator for the next item that needs to be processed. Each function processes its own item asynchronously, and when done asks the iterator for a new one. Once the iterator runs out of items, all the functions complete.
Thanks to @Endless for inspiration.
const items = [
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2'
]
// get a cursor that keeps track of what items have already been processed.
let cursor = items.entries();
// create 5 for loops that each run off the same cursor which keeps track of location
Array(5).fill().forEach(async () => {
  for (let [index, url] of cursor) {
    console.log('getting url is ', index, url)
    // run your async task instead of this next line
    var text = await fetch(url).then(res => res.text())
    console.log('text is', text.slice(0, 20))
  }
})
Here goes a basic example for streaming and p-limit. It streams an HTTP read stream to MongoDB.
const stream = require('stream');
const util = require('util');
const pLimit = require('p-limit');
const es = require('event-stream');
const JSONStream = require('JSONStream'); // JSONStream.parse is used in the pipeline below
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;
const pipeline = util.promisify(stream.pipeline)
const outputDBConfig = {
  dbURL: 'yr-db-url',
  collection: 'some-collection'
};
const limit = pLimit(3);
const yrAsyncStreamingFunction = async (readStream) => {
  const mongoWriteStream = streamToMongoDB(outputDBConfig);
  const mapperStream = es.map((data, done) => {
    let someDataPromise = limit(() => yr_async_call_to_somewhere())
    someDataPromise.then(
      function handleResolve(someData) {
        data.someData = someData;
        done(null, data);
      },
      function handleError(error) {
        done(error)
      }
    );
  })
  await pipeline(
    readStream,
    JSONStream.parse('*'),
    mapperStream,
    mongoWriteStream
  );
}
So many good solutions. I started out with the elegant solution posted by @Endless and ended up with this little extension method that does not use any external libraries nor does it run in batches (although it assumes you have features like async, etc):
Promise.allWithLimit = async (taskList, limit = 5) => {
  const iterator = taskList.entries();
  let results = new Array(taskList.length);
  let workerThreads = new Array(limit).fill(0).map(() =>
    new Promise(async (resolve, reject) => {
      try {
        let entry = iterator.next();
        while (!entry.done) {
          let [index, promise] = entry.value;
          try {
            results[index] = await promise;
          }
          catch (err) {
            results[index] = err;
          }
          // advance even after a rejection, otherwise the worker would
          // re-await the same rejected promise forever
          entry = iterator.next();
        }
        // No more work to do
        resolve(true);
      }
      catch (err) {
        // This worker is dead
        reject(err);
      }
    }));
  await Promise.all(workerThreads);
  return results;
};
const demoTasks = new Array(10).fill(0).map((v, i) => new Promise(resolve => {
  let n = (i + 1) * 5;
  setTimeout(() => {
    console.log(`Did nothing for ${n} seconds`);
    resolve(n);
  }, n * 1000);
}));
var results = Promise.allWithLimit(demoTasks);
@tcooc's answer was quite cool. Didn't know about it and will leverage it in the future.
I also enjoyed @MatthewRideout's answer, but it uses an external library!!
Whenever possible, I give a shot at developing this kind of thing on my own, rather than going for a library. You end up learning a lot of concepts which seemed daunting before.
class Pool {
  constructor(maxAsync) {
    this.maxAsync = maxAsync;
    this.asyncOperationsQueue = [];
    this.currentAsyncOperations = 0
  }
  runAnother() {
    if (this.asyncOperationsQueue.length > 0 && this.currentAsyncOperations < this.maxAsync) {
      this.currentAsyncOperations += 1;
      this.asyncOperationsQueue.pop()()
        .then(() => { this.currentAsyncOperations -= 1; this.runAnother() },
              () => { this.currentAsyncOperations -= 1; this.runAnother() })
    }
  }
  add(f) { // the argument f is a function of signature () => Promise
    return new Promise((resolve, reject) => {
      this.asyncOperationsQueue.push(
        () => f().then(resolve).catch(reject)
      )
      // schedule after queueing, so a task added to an idle pool starts right away
      this.runAnother();
    })
  }
}
//#######################################################
// TESTS
//#######################################################
function dbCall(id, timeout, fail) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      if (fail) {
        reject(`Error for id ${id}`);
      } else {
        resolve(id);
      }
    }, timeout)
  })
}
const dbQuery1 = () => dbCall(1, 5000, false);
const dbQuery2 = () => dbCall(2, 5000, false);
const dbQuery3 = () => dbCall(3, 5000, false);
const dbQuery4 = () => dbCall(4, 5000, true);
const dbQuery5 = () => dbCall(5, 5000, false);
const cappedPool = new Pool(2);
const dbQuery1Res = cappedPool.add(dbQuery1).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery2Res = cappedPool.add(dbQuery2).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery3Res = cappedPool.add(dbQuery3).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery4Res = cappedPool.add(dbQuery4).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery5Res = cappedPool.add(dbQuery5).catch(i => i).then(i => console.log(`Resolved: ${i}`))
This approach provides a nice API, similar to thread pools in scala/java.
After creating one instance of the pool with const cappedPool = new Pool(2), you provide promises to it with simply cappedPool.add(() => myPromise).
Obviously we must ensure that the promise does not start immediately, and that is why we must "provide it lazily" with the help of the function.
Most importantly, notice that the result of the method add is a Promise which will be completed/resolved with the value of your original promise! This makes for a very intuitive use.
const resultPromise = cappedPool.add( () => dbCall(...))
resultPromise
  .then(actualResult => {
    // Do something with the result from the DB
  })
This solution uses an async generator to manage concurrent promises with vanilla JavaScript. The throttle generator takes 3 arguments:
An array of values to be supplied as arguments to a promise-generating function. (e.g. an array of URLs.)
A function that returns a promise. (e.g. returns a promise for an HTTP request.)
An integer that represents the maximum concurrent promises allowed.
Promises are only instantiated as required in order to reduce memory consumption. Results can be iterated over using a for await...of statement.
The example below provides a function to check promise state, the throttle async generator, and a simple function that returns a promise based on setTimeout. The async IIFE at the end defines the reservoir of timeout values, sets up the async iterable returned by throttle, then iterates over the results as they resolve.
If you would like a more complete example for HTTP requests, let me know in the comments.
Please note that Node.js 16+ is required in order to use async generators.
const promiseState = function( promise ) {
  const control = Symbol();
  return Promise
    .race([ promise, control ])
    .then( value => ( value === control ) ? 'pending' : 'fulfilled' )
    .catch( () => 'rejected' );
}
const throttle = async function* ( reservoir, promiseClass, highWaterMark ) {
  let iterable = reservoir.splice( 0, highWaterMark ).map( item => promiseClass( item ) );
  while ( iterable.length > 0 ) {
    await Promise.any( iterable );
    const pending = [];
    const resolved = [];
    for ( const currentValue of iterable ) {
      if ( await promiseState( currentValue ) === 'pending' ) {
        pending.push( currentValue );
      } else {
        resolved.push( currentValue );
      }
    }
    console.log({ pending, resolved, reservoir });
    iterable = [
      ...pending,
      ...reservoir.splice( 0, highWaterMark - pending.length ).map( value => promiseClass( value ) )
    ];
    yield Promise.allSettled( resolved );
  }
}
const getTimeout = delay => new Promise( ( resolve, reject ) => {
  setTimeout(resolve, delay, delay);
} );
( async () => {
  const test = [ 1100, 1200, 1300, 10000, 11000, 9000, 5000, 6000, 3000, 4000, 1000, 2000, 3500 ];
  const throttledRequests = throttle( test, getTimeout, 4 );
  for await ( const timeout of throttledRequests ) {
    console.log( timeout );
  }
} )();
The concurrent function below will return a Promise which resolves to an array of resolved promise values, while implementing a concurrency limit. No 3rd party library.
// waits 50 ms then resolves to the passed-in arg
const sleepAndResolve = s => new Promise(rs => setTimeout(() => rs(s), 50))
// queue 100 promises
const funcs = []
for (let i = 0; i < 100; i++) funcs.push(() => sleepAndResolve(i))
// run the promises with a max concurrency of 10
concurrent(10, funcs)
  .then(console.log) // prints [0,1,2...,99]
  .catch(() => console.log("there was an error"))
/**
 * Run concurrent promises with a maximum concurrency level
 * @param concurrency The number of concurrently running promises
 * @param funcs An array of functions that return promises
 * @returns a promise that resolves to an array of the resolved values from the promises returned by funcs
 */
function concurrent(concurrency, funcs) {
  return new Promise((resolve, reject) => {
    let index = -1;
    const p = [];
    for (let i = 0; i < Math.max(1, Math.min(concurrency, funcs.length)); i++)
      runPromise();
    function runPromise() {
      if (++index < funcs.length)
        (p[p.length] = funcs[index]()).then(runPromise).catch(reject);
      else if (index === funcs.length)
        Promise.all(p).then(resolve).catch(reject);
    }
  });
}
Here's the Typescript version if you are interested
/**
 * Run concurrent promises with a maximum concurrency level
 * @param concurrency The number of concurrently running promises
 * @param funcs An array of functions that return promises
 * @returns a promise that resolves to an array of the resolved values from the promises returned by funcs
 */
function concurrent<V>(concurrency: number, funcs: (() => Promise<V>)[]): Promise<V[]> {
  return new Promise((resolve, reject) => {
    let index = -1;
    const p: Promise<V>[] = []
    for (let i = 0; i < Math.max(1, Math.min(concurrency, funcs.length)); i++) runPromise()
    function runPromise() {
      if (++index < funcs.length) (p[p.length] = funcs[index]()).then(runPromise).catch(reject)
      else if (index === funcs.length) Promise.all(p).then(resolve).catch(reject)
    }
  })
}
No external libraries. Just plain JS.
It can be resolved using recursion.
The idea is that initially we immediately execute the maximum allowed number of queries and each of these queries should recursively initiate a new query on its completion.
In this example I populate successful responses together with errors, and I execute all queries, but it's possible to slightly modify the algorithm if you want to terminate batch execution on the first failure.
async function batchQuery(queries, limit) {
  limit = Math.min(queries.length, limit);
  return new Promise((resolve, reject) => {
    const responsesOrErrors = new Array(queries.length);
    let startedCount = 0;
    let finishedCount = 0;
    let hasErrors = false;
    function recursiveQuery() {
      let index = startedCount++;
      doQuery(queries[index])
        .then(res => {
          responsesOrErrors[index] = res;
        })
        .catch(error => {
          responsesOrErrors[index] = error;
          hasErrors = true;
        })
        .finally(() => {
          finishedCount++;
          if (finishedCount === queries.length) {
            hasErrors ? reject(responsesOrErrors) : resolve(responsesOrErrors);
          } else if (startedCount < queries.length) {
            recursiveQuery();
          }
        });
    }
    for (let i = 0; i < limit; i++) {
      recursiveQuery();
    }
  });
}
async function doQuery(query) {
  console.log(`${query} started`);
  const delay = Math.floor(Math.random() * 1500);
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      if (delay <= 1000) {
        console.log(`${query} finished successfully`);
        resolve(`${query} success`);
      } else {
        console.log(`${query} finished with error`);
        reject(`${query} error`);
      }
    }, delay);
  });
}
const queries = new Array(10).fill('query').map((query, index) => `${query}_${index + 1}`);
batchQuery(queries, 3)
  .then(responses => console.log('All successful', responses))
  .catch(responsesWithErrors => console.log('All with several failed', responsesWithErrors));
So I tried to make some of the examples shown here work for my code, but since this was only for an import script and not production code, using the npm package batch-promises was surely the easiest path for me.
batch-promises
Easily batch promises
NOTE: Requires runtime to support Promise or to be polyfilled.
Api
batchPromises(int: batchSize, array: Collection, i => Promise: Iteratee)
The Promise: Iteratee will be called after each batch.
Use:
import batchPromises from 'batch-promises';
batchPromises(2, [1, 2, 3, 4, 5], i => new Promise((resolve, reject) => {
  // The iteratee will fire after each batch resulting in the following behaviour:
  // @ 100ms resolve items 1 and 2 (first batch of 2)
  // @ 200ms resolve items 3 and 4 (second batch of 2)
  // @ 300ms resolve remaining item 5 (last remaining batch)
  setTimeout(() => {
    resolve(i);
  }, 100);
}))
.then(results => {
  console.log(results); // [1,2,3,4,5]
});
Recursion is the answer if you don't want to use external libraries
downloadAll(someArrayWithData) {
  var self = this;
  var tracker = function(next) {
    return self.someExpensiveRequest(someArrayWithData[next])
      .then(function() {
        next++; // This updates the next in the tracker function parameter
        if (next < someArrayWithData.length) { // Did I finish processing all my data?
          return tracker(next); // Go to the next promise
        }
      });
  }
  return tracker(0);
}
Expanding on the answer posted by @deceleratedcaviar, I created a 'batch' utility function that takes as arguments: an array of values, a concurrency limit, and a processing function. Yes, I realize that using Promise.all this way is more akin to batch processing than true concurrency, but if the goal is to limit the number of excessive HTTP calls at one time, I go with this approach due to its simplicity and no need for an external library.
async function batch(o) {
  let arr = o.arr
  let resp = []
  while (arr.length) {
    let subset = arr.splice(0, o.limit)
    let results = await Promise.all(subset.map(o.process))
    resp.push(results)
  }
  return [].concat.apply([], resp)
}
let arr = []
for (let i = 0; i < 250; i++) { arr.push(i) }
async function calc(val) { return val * 100 }
(async () => {
  let resp = await batch({
    arr: arr,
    limit: 100,
    process: calc
  })
  console.log(resp)
})();
One more solution with a custom promise library (CPromise):
using generators Live codesandbox demo
import { CPromise } from "c-promise2";
import cpFetch from "cp-fetch";
const promise = CPromise.all(
  function* () {
    const urls = [
      "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=1",
      "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=2",
      "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=3",
      "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=4",
      "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=5",
      "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=6",
      "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=7"
    ];
    for (const url of urls) {
      yield cpFetch(url); // add a promise to the pool
      console.log(`Request [${url}] completed`);
    }
  },
  { concurrency: 2 }
).then(
  (v) => console.log(`Done: `, v),
  (e) => console.warn(`Failed: ${e}`)
);
// yeah, we are able to cancel the task and abort pending network requests
// setTimeout(() => promise.cancel(), 4500);
using mapper Live codesandbox demo
import { CPromise } from "c-promise2";
import cpFetch from "cp-fetch";
const promise = CPromise.all(
  [
    "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=1",
    "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=2",
    "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=3",
    "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=4",
    "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=5",
    "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=6",
    "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=7"
  ],
  {
    mapper: (url) => {
      console.log(`Request [${url}]`);
      return cpFetch(url);
    },
    concurrency: 2
  }
).then(
  (v) => console.log(`Done: `, v),
  (e) => console.warn(`Failed: ${e}`)
);
// yeah, we are able to cancel the task and abort pending network requests
// setTimeout(() => promise.cancel(), 4500);
Warning: this has not been benchmarked for efficiency and does a lot of array copying/creation.
If you want a more functional approach, you could do something like:
import chunk from 'lodash.chunk';

const maxConcurrency = (max) => (dataArr, promiseFn) =>
  chunk(dataArr, max).reduce(
    async (agg, batch) => [
      ...(await agg),
      ...(await Promise.all(batch.map(promiseFn)))
    ],
    []
  );
and then you could use it like:
const randomFn = (data) =>
  new Promise((res) => setTimeout(
    () => res(data + 1),
    Math.random() * 1000
  ));

const result = await maxConcurrency(5)(
  [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
  randomFn
);
console.log('result+++', result);
I had been using the bottleneck library, which I actually really liked, but in my case it wasn't releasing memory and kept tanking long-running jobs... which isn't great for running the massive jobs that you likely want a throttling/concurrency library for in the first place.
I needed a simple, low-overhead, easy-to-maintain solution. I also wanted something that kept the pool topped up, rather than simply batching predefined chunks... In the case of a downloader, this will stop that nGB file from holding up your queue for minutes/hours at a time, even though the rest of the batch finished ages ago.
This is the Node.js v16+, no-dependency, async generator solution I've been using instead:
const promiseState = function (promise) {
  // A promise could never resolve to a unique symbol unless it was in this scope
  const control = Symbol();
  // This helps us determine the state of the promise... A little heavy, but it
  // beats a third-party promise library. The control is the second element
  // passed to Promise.race() since it will only resolve first if the promise
  // being tested is pending.
  return Promise
    .race([promise, control])
    .then((value) => (value === control) ? 'pending' : 'fulfilled')
    .catch(() => 'rejected');
};

const throttle = async function* (reservoir, promiseFunction, highWaterMark) {
  let iterable = reservoir.splice(0, highWaterMark).map((item) => promiseFunction(item));
  while (iterable.length > 0) {
    // When a promise has resolved we have space to top it up to the high water mark...
    await Promise.any(iterable);
    const pending = [];
    const resolved = [];
    // This identifies the promise(s) that have resolved so that we can yield them
    for (const currentValue of iterable) {
      if (await promiseState(currentValue) === 'pending') {
        pending.push(currentValue);
      } else {
        resolved.push(currentValue);
      }
    }
    // Put the remaining promises back into iterable, and top it up to the high water mark
    iterable = [
      ...pending,
      ...reservoir.splice(0, highWaterMark - pending.length).map((value) => promiseFunction(value))
    ];
    yield Promise.allSettled(resolved);
  }
};
// This is just an example of what would get passed as "promiseFunction"...
// This can be the function that returns your HTTP request promises
const getTimeout = (delay) => new Promise((resolve, reject) => setTimeout(resolve, delay, delay));

// This is just the async IIFE that bootstraps this example
(async () => {
  const test = [1000, 2000, 3000, 4000, 5000, 6000, 1500, 2500, 3500, 4500, 5500, 6500];
  for await (const timeout of throttle(test, getTimeout, 4)) {
    console.log(timeout);
  }
})();
I have a solution that creates chunks and uses the .reduce function to wait for each chunk's Promise.all to finish. I also add some delay, in case the promises have call-rate limits.
export function delay(ms: number) {
  return new Promise<void>((resolve) => setTimeout(resolve, ms));
}

export const chunk = <T>(arr: T[], size: number): T[][] => [
  ...Array(Math.ceil(arr.length / size)),
].map((_, i) => arr.slice(size * i, size + size * i));

const myIdList = []; // all items
const groupedIdList = chunk(myIdList, 20); // grouped by 20 items

// myPromise is your per-item async function
await groupedIdList.reduce(async (prev, subIdList) => {
  await prev;
  const data = await Promise.all(subIdList.map(myPromise));
  // Make sure we wait for 500 ms after processing every page to prevent overloading the calls.
  await delay(500);
}, Promise.resolve());
Using the tiny-async-pool ES9 for await...of API, you can do the following:
const asyncPool = require("tiny-async-pool");

const getCount = async (user) => [user, await remoteServer.getCount(user)];
const concurrency = 2;

for await (const [user, count] of asyncPool(concurrency, users, getCount)) {
  console.log(user, count);
}
The above asyncPool function returns an async iterator that yields as soon as a promise completes (under the concurrency limit), and it rejects immediately as soon as one of the promises rejects.
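Since the iterator rejects on the first failure, you may want a try/catch around the loop. A minimal sketch, reusing the names from the snippet above (users and remoteServer remain placeholders):
(async () => {
  try {
    for await (const [user, count] of asyncPool(concurrency, users, getCount)) {
      console.log(user, count);
    }
  } catch (err) {
    // the first rejected getCount call lands here and stops the iteration
    console.error("pool failed:", err);
  }
})();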
It is possible to limit requests to the server by using https://www.npmjs.com/package/job-pipe
Basically you create a pipe and tell it how many concurrent requests you want:
const pipe = createPipe({ throughput: 6, maxQueueSize: Infinity })
Then you take your function which performs the call and force it through the pipe, so that only a limited number of calls happen at the same time:
const makeCall = async () => {...}
const limitedMakeCall = pipe(makeCall)
Finally, you call this method as many times as you need, as if it were unchanged, and it will limit itself on how many parallel executions it can handle:
await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
....
await limitedMakeCall()
Profit.
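Putting the pieces together, a hedged end-to-end sketch; the require form, the request target, and the call count are assumptions for illustration, and only createPipe and the pipe wrapper come from the answer above:
const { createPipe } = require('job-pipe'); // import form assumed

const pipe = createPipe({ throughput: 6, maxQueueSize: Infinity });

// placeholder request function; substitute your real call
const makeCall = async (id) => fetch(`https://example.com/api/items/${id}`);
const limitedMakeCall = pipe(makeCall);

(async () => {
  // 100 invocations, but at most 6 in flight at any moment
  const results = await Promise.all(
    Array.from({ length: 100 }, (_, i) => limitedMakeCall(i))
  );
  console.log(results.length);
})();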
I suggest not downloading packages and not writing hundreds of lines of code:
async function async_arr<T1, T2>(
  arr: T1[],
  func: (x: T1) => Promise<T2> | T2, // can be sync or async
  limit = 5
) {
  let results: T2[] = [];
  let workers = [];
  let current = Math.min(arr.length, limit);
  async function process(i: number) {
    if (i < arr.length) {
      results[i] = await Promise.resolve(func(arr[i]));
      await process(current++);
    }
  }
  for (let i = 0; i < current; i++) {
    workers.push(process(i));
  }
  await Promise.all(workers);
  return results;
}
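A quick usage sketch of the function above (the id list and the doubling handler are made up):
(async () => {
  const ids = Array.from({ length: 20 }, (_, i) => i);
  // at most 5 handlers in flight at any moment; results keep input order
  const results = await async_arr(ids, async (id) => id * 2, 5);
  console.log(results); // [0, 2, 4, ..., 38]
})();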
Here's my recipe, based on killdash9's answer.
It lets you choose the behaviour on exceptions (Promise.all vs. Promise.allSettled).
// Given an array of async functions, runs them in parallel,
// with at most maxConcurrency simultaneous executions.
// Except for that, behaves the same as Promise.all,
// unless allSettled is true, where it behaves as Promise.allSettled.
function concurrentRun(maxConcurrency = 10, funcs = [], allSettled = false) {
  if (funcs.length <= maxConcurrency) {
    const ps = funcs.map(f => f());
    return allSettled ? Promise.allSettled(ps) : Promise.all(ps);
  }
  return new Promise((resolve, reject) => {
    let idx = -1;
    const ps = new Array(funcs.length);
    function nextPromise() {
      idx += 1;
      if (idx < funcs.length) {
        (ps[idx] = funcs[idx]()).then(nextPromise).catch(allSettled ? nextPromise : reject);
      } else if (idx === funcs.length) {
        (allSettled ? Promise.allSettled(ps) : Promise.all(ps)).then(resolve).catch(reject);
      }
    }
    for (let i = 0; i < maxConcurrency; i += 1) nextPromise();
  });
}
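A brief usage sketch (the task list is made up). Note that concurrentRun takes an array of functions rather than promises, so nothing starts running before it is called:
const tasks = Array.from({ length: 25 }, (_, i) => async () => {
  await new Promise((resolve) => setTimeout(resolve, 100));
  return i;
});

concurrentRun(5, tasks).then((results) => console.log(results.length)); // 25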
I know there are a lot of answers already, but I ended up using a very simple solution that requires no library or sleep, only a few commands. Promise.all() simply lets you know when all the promises passed to it are settled. So, you can check on the queue intermittently to see if it is ready for more work; if so, add more processes.
For example:
// init vars (assumes this runs inside an async function)
const batchSize = 5
const calls = []
// loop through data and run processes
for (let [index, data] of [1, 2, 3].entries()) {
  // pile on async processes
  calls.push(doSomethingAsyncWithData(data))
  // every 5th concurrent call, wait for them to finish before adding more
  if ((index + 1) % batchSize === 0) await Promise.all(calls)
}
// clean up any data left to process if smaller than the batch size
const allFinishedProcs = await Promise.all(calls)
A good solution for controlling the maximum number of promises/requests is to split your list of requests into pages, and produce requests for only one page at a time.
The example below makes use of the iter-ops library:
import {pipeAsync, map, page, wait} from 'iter-ops';

const i = pipeAsync(
  users, // make it asynchronous
  page(10), // split into pages of 10 items each
  map(p => Promise.all(p.map(u => u.remoteServer.getCount(u)))), // map into requests
  wait() // resolve each page in the pipeline
);

// below triggers processing page-by-page:
for await (const p of i) {
  //=> p = resolved page of data
}
This way, it won't try to create more requests/promises than the size of one page.
