Pause HTTP fetch() requests - javascript

As seen in the Stack Overflow question "How do I cancel an HTTP fetch() request?", we can nowadays abort a fetch() request using AbortController.
Instead of aborting a fetch() request, I'd like to merely pause it and resume it later. Is it possible?

You can achieve a pause of sorts using generators:
async function* fetchBooks() {
  const response = await fetch('https://dummyjson.com/todos')
    .then(res => res.json());
  yield response;
}

const gen = fetchBooks();
console.log(gen);
// paused - no request has been sent yet

// run it
gen.next().then(res => console.log(res.value));
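Note that this does not pause a request that is already in flight: once a fetch() has been sent, it either completes or gets aborted. What the generator gives you is control over when each request starts. Here is a hedged sketch of pausing between several requests (the second URL is an assumed endpoint, purely for illustration):
// each `yield` is a pause point; the next request is only issued when the
// consumer calls .next() again, so the pacing is controlled from outside
async function* fetchInSteps(urls) {
  for (const url of urls) {
    const res = await fetch(url); // this fetch starts only when the generator is resumed
    yield await res.json();       // pause here until the next .next() call
  }
}

(async () => {
  const it = fetchInSteps([
    'https://dummyjson.com/todos',
    'https://dummyjson.com/products', // assumed endpoint, for illustration
  ]);

  const first = await it.next();  // fires the first request
  console.log(first.value);

  // ... later, whenever you decide to "resume":
  const second = await it.next(); // fires the second request
  console.log(second.value);
})();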

Related

for await of VS Promise.all

Is there any difference between this:
const promises = await Promise.all(items.map(e => somethingAsync(e)));
for (const res of promises) {
  // do some calculations
}
And this?
for await (const res of items.map(e => somethingAsync(e))) {
  // do some calculations
}
I know that in the first snippet, all the promises are fired at the same time, but I'm not sure about the second. Does the for loop wait for the first iteration to finish before calling the next promise? Or are all the promises fired at the same time, with the inside of the loop acting as a callback for them?
Yes, they absolutely are different. for await is supposed to be used with asynchronous iterators, not with arrays of pre-existing promises.
Just to make clear,
for await (const res of items.map(e => somethingAsync(e))) …
works the same as
const promises = items.map(e => somethingAsync(e));
for await (const res of promises) …
or
const promises = [somethingAsync(items[0]), somethingAsync(items[1]), …];
for await (const res of promises) …
The somethingAsync calls are happening immediately, all at once, before anything is awaited. Then, they are awaited one after another, which is definitely a problem if any one of them gets rejected: it will cause an unhandled promise rejection error. Using Promise.all is the only viable choice to deal with the array of promises:
for (const res of await Promise.all(promises)) …
See Waiting for more than one concurrent await operation and Any difference between await Promise.all() and multiple await? for details.
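To see the rejection pitfall concretely, here is a hedged sketch: the second promise rejects while the loop is still awaiting the first one, so nothing has attached a handler to it yet (the delays are made up for illustration):
const delay = ms => new Promise(res => setTimeout(res, ms));

const promises = [
  delay(1000).then(() => 'ok'),                        // settles after 1s
  delay(10).then(() => { throw new Error('boom'); }),  // rejects after 10ms
];

(async () => {
  // the loop is still awaiting promises[0] when promises[1] rejects,
  // so at that moment the rejection has no handler attached
  for await (const res of promises) {
    console.log(res);
  }
})().catch(e => console.log('caught:', e.message));

// Depending on the runtime this is reported as an unhandled rejection
// (a warning in older Node versions, a fatal error by default in newer ones),
// whereas `for (const res of await Promise.all(promises))` subscribes to
// all of the promises immediately.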
The need for for await ... arises with an asynchronous iterator, where the computation of the current iteration depends on some of the previous iterations. If there are no dependencies, Promise.all is your choice. The for await construct was designed to work with asynchronous iterators, although, as in your example, you can also use it with an array of promises.
See the paginated data example in the book javascript.info for an asynchronous iterator that can't be rewritten using Promise.all:
(async () => {
  for await (const commit of fetchCommits('javascript-tutorial/en.javascript.info')) {
    console.log(commit.author.login);
  }
})();
Here the fetchCommits async iterator makes a request to fetch the commits of a GitHub repo. The fetch responds with JSON for 30 commits, and also provides a link to the next page in the Link header. Therefore the next iteration can only start after the previous iteration has obtained the link for the next request:
async function* fetchCommits(repo) {
  let url = `https://api.github.com/repos/${repo}/commits`;

  while (url) {
    const response = await fetch(url, {
      headers: {'User-Agent': 'Our script'},
    });
    const body = await response.json(); // array of commits

    // The URL of the next page is in the headers, extract it using a regexp
    let nextPage = response.headers.get('Link').match(/<(.*?)>; rel="next"/);
    nextPage = nextPage?.[1];
    url = nextPage;

    for (let commit of body) { // yield commits one by one, until the page ends
      yield commit;
    }
  }
}
As you said, Promise.all will send all the requests in one go, and you will then get the responses once all of them have completed.
In the second scenario, you also send the requests in one go, but you receive the responses one by one, in order.
See this small example for reference.
let i = 1;

function somethingAsync(time) {
  console.log("fired");
  return delay(time).then(() => Promise.resolve(i++));
}

const items = [1000, 2000, 3000, 4000];

function delay(time) {
  return new Promise((resolve) => {
    setTimeout(resolve, time);
  });
}

(async () => {
  console.time("first way");
  const promises = await Promise.all(items.map(e => somethingAsync(e)));
  for (const res of promises) {
    console.log(res);
  }
  console.timeEnd("first way");

  i = 1; // reset counter
  console.time("second way");
  for await (const res of items.map(e => somethingAsync(e))) {
    // do some calculations
    console.log(res);
  }
  console.timeEnd("second way");
})();
You could try it here as well - https://repl.it/repls/SuddenUselessAnalyst
Hope this helps.
Actually, using the for await syntax does fire the promises all at once.
The small piece of code proves it:
const sleep = s => {
  return new Promise(resolve => {
    setTimeout(resolve, s * 1000);
  });
};

const somethingAsync = async t => {
  await sleep(t);
  return t;
};

(async () => {
  const items = [1, 2, 3, 4];
  const now = Date.now();
  for await (const res of items.map(e => somethingAsync(e))) {
    console.log(res);
  }
  console.log("time: ", (Date.now() - now) / 1000);
})();
stdout:
time: 4.001
But the inside of the loop doesn't act "as a callback". If I reverse the array, all the logs appear at once. I suppose that the promises are fired at once and the runtime just waits for the first one to resolve to go to the next iteration.
EDIT: Actually, using for await with anything other than an asynchronous iterator is bad practice; it is best to use Promise.all, according to @Bergi in his answer.
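For completeness, here is a hedged sketch of the kind of asynchronous iterator for await is meant for: with a lazy async generator, each somethingAsync call is only started when the loop asks for the next value, so the iterations really do run one after another (names reused from the snippets above):
async function* lazySomethingAsync(items) {
  for (const e of items) {
    // this call is only made when the consumer asks for the next value
    yield somethingAsync(e);
  }
}

(async () => {
  for await (const res of lazySomethingAsync([1000, 2000, 3000, 4000])) {
    console.log(res); // each iteration starts only after the previous one settled
  }
})();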

How to make simultaneous calls to an API?

I have an async function that calls my API simultaneously, but Promise.all() resolves after about 2 seconds for me (one request resolves in about 200ms). That means my requests are resolving one after another, not simultaneously, even though they were started asynchronously.
// don't run in client side of browser - CORS
async asyncData () {
  axios.interceptors.request.use(config => {
    console.log('Start: ', new Date)
    return config
  })
  const promise1 = axios.get('https://jsonplaceholder.typicode.com/comments')
  const promise2 = axios.get('https://jsonplaceholder.typicode.com/posts')
  const promise3 = axios.get('https://jsonplaceholder.typicode.com/todos')
  const promise4 = axios.get('https://jsonplaceholder.typicode.com/users')
  const promise5 = axios.get('https://jsonplaceholder.typicode.com/photos')
  const promise6 = axios.get('https://jsonplaceholder.typicode.com/photos')
  const promise7 = axios.get('https://jsonplaceholder.typicode.com/photos')
  const promise8 = axios.get('https://jsonplaceholder.typicode.com/photos')
  const promise9 = axios.get('https://jsonplaceholder.typicode.com/photos')
  await Promise.all([
    promise1,
    promise2,
    promise3,
    promise4,
    promise5,
    promise6,
    promise7,
    promise8,
    promise9,
  ]).then(data => {
    console.log('End: ', new Date)
  })
}
I am confused because the code below resolves in 4000ms, which means the promises start simultaneously and resolve once the longest timeout (4000ms) ends. This time the promises are not resolving in a chain:
async asyncData () {
  function startTimer (time) {
    return new Promise(function (resolve) {
      setTimeout(function () {
        resolve()
      }, time)
    })
  }
  console.time('Timer')
  Promise.all([
    startTimer(1000),
    startTimer(4000),
    startTimer(2000),
  ]).then(function () {
    console.timeEnd('Timer')
  })
}
P.S. Sorry for bad English ¯\_(ツ)_/¯
Your code is fine. The problem is that large numbers of simultaneous network requests, especially those made to the same endpoint, often take longer to resolve than if only one request was made, even if you made all those requests in parallel.
A look at the Chrome devtools network panel shows what happens: all the requests are fired off immediately, but the subsequent ones take longer to resolve. Often this is caused by the endpoint (here, jsonplaceholder) throttling requests.
If you were making requests to your own server, and both your server and the script source had sufficient bandwidth, connectivity, and no throttling, the parallel requests would resolve at least close to the same time.
(Browsers may restrict the total number of connections allowed at any one time, as may OSs)
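If you want to verify this yourself, you can time each request individually rather than only timing the Promise.all as a whole. A rough sketch (Node, using the same jsonplaceholder endpoints; the exact numbers will vary):
const axios = require('axios');

const urls = [
  'https://jsonplaceholder.typicode.com/comments',
  'https://jsonplaceholder.typicode.com/posts',
  'https://jsonplaceholder.typicode.com/todos',
  'https://jsonplaceholder.typicode.com/users',
  'https://jsonplaceholder.typicode.com/photos',
];

(async () => {
  const started = Date.now();
  await Promise.all(urls.map(async (url, i) => {
    await axios.get(url);
    // every request starts at roughly the same time,
    // but the later ones typically finish later
    console.log(`request ${i} done after ${Date.now() - started}ms`);
  }));
  console.log(`all done after ${Date.now() - started}ms`);
})();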

Axios not returning promise, not synchronous

In my React app, I am making an axios call within a forEach loop; however, the program doesn't wait for the response of the GET request but rather executes the next part of the program. When the promise eventually resolves, it is too late. How do I pause the execution of the program until axios has returned from its request?
In a nutshell, I need axios to return the result and then continue to run the rest of the program in a synchronous manner.
I have tried using the async/await options; however, they don't seem to do the trick. I need axios to run synchronously.
pids.forEach((i1) => loadedContent.forEach((i2, idx) => {
  if (i1 === i2.available_versions.version[0].pid || i1 === i2.versionPid) {
    axios.get(`https://programmes.api.hebel.com/dougi/api/versions?anshur${i1}`).then((response) => {
      xml2js.parseString(response.data, function(err, result) {
        loadedContent[idx].nCrid =
          result.gui.results[0].vastilp[0].roundup[0].identifier[0]._
        alert(loadedContent[idx].nCrid)
        // Need it to go into here, however it is bypassing this and continuing
        // on to line 111
      });
    }).catch(e => {
      console.log('error', e);
    });
  }
}))
I expect axios to run synchronously, or at the very least wait until the promise has been resolved before continuing on.
Awaiting the asynchronous call would effectively 'pause' the execution until the value is available. With the way your program is currently written with .then(), why don't you try it with fetch?
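Here is a hedged sketch of what that 'pausing' looks like in practice, keeping axios but replacing forEach with for...of (forEach ignores the promises its callback returns, so await inside it will not hold up the outer loop). It assumes a recent xml2js that exposes parseStringPromise; on older versions you would wrap parseString in a promise instead:
// names mirror the question; the loop structure is the only real change
async function resolveCrids(pids, loadedContent) {
  for (const i1 of pids) {
    for (const [idx, i2] of loadedContent.entries()) {
      if (i1 === i2.available_versions.version[0].pid || i1 === i2.versionPid) {
        // execution pauses here until the GET request resolves
        const response = await axios.get(
          `https://programmes.api.hebel.com/dougi/api/versions?anshur${i1}`
        );
        const result = await xml2js.parseStringPromise(response.data);
        loadedContent[idx].nCrid =
          result.gui.results[0].vastilp[0].roundup[0].identifier[0]._;
      }
    }
  }
}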
You can use Bluebird's Promise.map method to execute your async code synchronously. forEach is a synchronous function, so it is expected that the next loop iteration starts before the previous one has finished. You can also use async/await to execute the calls in the correct order. Here is how I would do it with async/await and Promise.map:
// Promise.map comes from Bluebird: const Promise = require('bluebird')
await Promise.map(pids, async i1 => {
  await Promise.map(loadedContent, async (i2, idx) => {
    if (i1 === i2.available_versions.version[0].pid || i1 === i2.versionPid) {
      const response = await axios.get(`https://programmes.api.hebel.com/dougi/api/versions?anshur${i1}`);
      xml2js.parseString(response.data, function (err, result) {
        loadedContent[idx].nCrid =
          result.gui.results[0].vastilp[0].roundup[0].identifier[0]._;
        alert(loadedContent[idx].nCrid);
      });
    }
  });
});

Multiple fetch() with one signal in order to abort them all

I saw this other answer: https://stackoverflow.com/a/47250621/2809729
So can I abort multiple fetch requests using just one signal?
At the time I was writing the question, I had already found the solution in this post on how to abort fetches, from one of the pioneers who worked on the implementation of abort.
So yes you can :)
Here a brief example:
async function fetchStory({ signal } = {}) {
  const storyResponse = await fetch('/story.json', { signal });
  const data = await storyResponse.json();

  const chapterFetches = data.chapterUrls.map(async url => {
    const response = await fetch(url, { signal });
    return response.text();
  });

  return Promise.all(chapterFetches);
}
In the above example, the same signal is used for the initial fetch, and for the parallel chapter fetches. Here's how you'd use fetchStory:
const controller = new AbortController();
const signal = controller.signal;

fetchStory({ signal }).then(chapters => {
  console.log(chapters);
});
In this case, calling controller.abort() will reject the promise of the in-progress fetches.
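The rejection is a DOMException named "AbortError", so you typically want to swallow that case and rethrow everything else. A small sketch:
fetchStory({ signal })
  .then(chapters => console.log(chapters))
  .catch(err => {
    if (err.name === 'AbortError') {
      console.log('Fetches were aborted'); // expected when controller.abort() runs
    } else {
      throw err; // real failures still surface
    }
  });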
I abstracted the solution to this problem, and came up with "fetch-tx", fetch transaction, which provides a single abort function to all related fetch operations.
you can find it here:
fetch-tx

Is there a way to short circuit async/await flow?

All four functions that are called below in update return promises.
async function update() {
  var urls = await getCdnUrls();
  var metadata = await fetchMetaData(urls);
  var content = await fetchContent(metadata);
  await render(content);
  return;
}
What if we want to abort the sequence from outside, at any given time?
For example, while fetchMetaData is being executed, we realize we no longer need to render the component and we want to cancel the remaining operations (fetchContent and render). Is there a way to abort/cancel these operations from outside the update function?
We could check against a condition after each await, but that seems like an inelegant solution, and even then we will have to wait for the current operation to finish.
The standard way to do this now is through AbortSignals
async function update({ signal } = {}) {
  // pass these to methods to cancel them internally in turn
  // this is implemented throughout Node.js and most of the web platform
  try {
    var urls = await getCdnUrls({ signal });
    var metadata = await fetchMetaData(urls);
    var content = await fetchContent(metadata);
    await render(content);
  } catch (e) {
    if (e.name !== 'AbortError') throw e;
  }
  return;
}

// usage
const ac = new AbortController();
update({ signal: ac.signal });
ac.abort(); // cancel the update
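For this to actually stop work, the inner functions have to honor the signal themselves. A hedged sketch of what that might look like, assuming getCdnUrls fetches a JSON list from some endpoint (the URL is a placeholder) and that non-fetch steps check the signal manually:
async function getCdnUrls({ signal } = {}) {
  // fetch() aborts the request and rejects with an AbortError
  // as soon as the signal is aborted
  const res = await fetch('/cdn-urls.json', { signal });
  return res.json();
}

function assertNotAborted(signal) {
  // manual check for steps that don't take a signal natively
  if (signal?.aborted) {
    throw new DOMException('Aborted', 'AbortError');
  }
}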
OLD 2016 content below, beware dragons
I just gave a talk about this - this is a lovely topic but sadly you're not really going to like the solutions I'm going to propose as they're gateway-solutions.
What the spec does for you
Getting cancellation "just right" is actually very hard. People have been working on just that for a while and it was decided not to block async functions on it.
There are two proposals attempting to solve this in ECMAScript core:
Cancellation tokens - which adds cancellation tokens that aim to solve this issue.
Cancelable promise - which adds catch cancel (e) { } syntax and throw.cancel syntax, aiming to address this issue.
Both proposals changed substantially over the last week, so I wouldn't count on either to arrive in the next year or so. The proposals are somewhat complementary and are not at odds.
What you can do to solve this from your side
Cancellation tokens are easy to implement. Sadly, the sort of cancellation you'd really want (aka "third state" cancellation, where cancellation is not an exception) is impossible with async functions at the moment since you don't control how they're run. You can do two things:
Use coroutines instead - bluebird ships with sound cancellation using generators and promises which you can use.
Implement tokens with abortive semantics - this is actually pretty easy so let's do it here
CancellationTokens
Well, a token signals cancellation:
class Token {
  constructor(fn) {
    this.isCancellationRequested = false;
    this.onCancelled = []; // actions to execute when cancelled
    this.onCancelled.push(() => this.isCancellationRequested = true);
    // expose a promise to the outside
    this.promise = new Promise(resolve => this.onCancelled.push(resolve));
    // let the user add handlers
    fn(f => this.onCancelled.push(f));
  }
  cancel() { this.onCancelled.forEach(x => x()); } // invoke every registered handler
}
This would let you do something like:
async function update(token) {
  if (token.isCancellationRequested) return;
  var urls = await getCdnUrls();
  if (token.isCancellationRequested) return;
  var metadata = await fetchMetaData(urls);
  if (token.isCancellationRequested) return;
  var content = await fetchContent(metadata);
  if (token.isCancellationRequested) return;
  await render(content);
  return;
}

var token = new Token(() => {}); // we don't need any special handling here
update(token);
// ...
if (updateNotNeeded) token.cancel(); // will abort asynchronous actions
Which is a really ugly way that would work, optimally you'd want async functions to be aware of this but they're not (yet).
Optimally, all your interim functions would be aware and would throw on cancellation (again, only because we can't have third-state) which would look like:
async function update(token) {
  var urls = await getCdnUrls(token);
  var metadata = await fetchMetaData(urls, token);
  var content = await fetchContent(metadata, token);
  await render(content, token);
  return;
}
Since each of our functions are cancellation aware, they can perform actual logical cancellation - getCdnUrls can abort the request and throw, fetchMetaData can abort the underlying request and throw and so on.
Here is how one might write getCdnUrl (note the singular) using the XMLHttpRequest API in browsers:
function getCdnUrl(url, token) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url);
  var p = new Promise((resolve, reject) => {
    xhr.onload = () => resolve(xhr);
    xhr.onerror = e => reject(new Error(e));
    token.promise.then(x => {
      try { xhr.abort(); } catch(e) {} // ignore abort errors
      reject(new Error("cancelled"));
    });
  });
  xhr.send();
  return p;
}
This is as close as we can get with async functions without coroutines. It's not very pretty but it's certainly usable.
Note that you'd want to avoid cancellations being treated as exceptions. This means that if your functions throw on cancellation, you need to filter those errors in the global error handlers, e.g. process.on("unhandledRejection", e => ...) and such.
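A hedged sketch of such a filter in Node, assuming cancellations carry the "cancelled" message used in getCdnUrl above:
process.on('unhandledRejection', (reason) => {
  // ignore our own cancellation signals, surface everything else
  if (reason instanceof Error && reason.message === 'cancelled') return;
  console.error('Unhandled rejection:', reason);
  process.exitCode = 1;
});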
You can get what you want using Typescript + Bluebird + cancelable-awaiter.
Now that all evidence points to cancellation tokens not making it into ECMAScript, I think the best solution for cancellation is the Bluebird implementation mentioned by @BenjaminGruenbaum; however, I find the usage of coroutines and generators a bit clumsy and hard on the eyes.
Since I'm using Typescript, which now support async/await syntax for es5 and es3 targets, I've created a simple module which replaces the default __awaiter helper with one that supports bluebird cancellations: https://www.npmjs.com/package/cancelable-awaiter
Unfortunately, there is no support for cancellable promises so far. There are some custom implementations, e.g. the following, which extends/wraps a promise to be cancellable and resolvable:
function promisify(promise) {
  let _resolve, _reject
  let wrap = new Promise(async (resolve, reject) => {
    _resolve = resolve
    _reject = reject
    let result = await promise
    resolve(result)
  })
  wrap.resolve = _resolve
  wrap.reject = _reject
  return wrap
}
Usage: Cancel promise and stop further execution immediately after it
async function test() {
  // Create promise that should be resolved in 3 seconds
  let promise = new Promise(resolve => setTimeout(() => resolve('our resolved value'), 3000))
  // extend our promise to be cancellable
  let cancellablePromise = promisify(promise)
  // Cancel promise in 2 seconds.
  // if you comment this line out, then promise will be resolved.
  setTimeout(() => cancellablePromise.reject('error code'), 2000)
  // wait promise to be resolved
  let result = await cancellablePromise
  // this line will never be executed!
  console.log(result)
}
In this approach, a promise itself is executed till the end, but the caller code that awaits promise result can be 'cancelled'.
Unfortunately, no, you can't control the execution flow of the default async/await behaviour – that does not mean the problem itself is impossible, it means that you need to change your approach a bit.
First of all, your proposal of wrapping every async line in a check is a working solution, and if you have just a couple of places with such functionality, there is nothing wrong with it.
If you want to use this pattern pretty often, the best solution is probably to switch to generators: while not so widespread, they allow you to define each step's behaviour, and cancellation is the easiest to add. Generators are pretty powerful, but, as I've mentioned, they require a runner function and are not as straightforward as async/await; a sketch of such a runner follows.
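A hedged sketch of what such a runner might look like: it drives a generator that yields promises step by step and simply stops advancing once cancelled (getCdnUrls and fetchMetaData are the placeholder functions from the question):
// minimal cancellable runner for a generator that yields promises
function run(gen) {
  const it = gen();
  let cancelled = false;

  async function step(prev) {
    if (cancelled) return;               // stop advancing the generator
    const { value, done } = it.next(prev);
    if (done) return value;
    return step(await value);            // wait for the yielded promise, then continue
  }

  return {
    promise: step(),
    cancel: () => { cancelled = true; }, // no further steps will run
  };
}

// usage
const { promise, cancel } = run(function* () {
  const urls = yield getCdnUrls();
  const metadata = yield fetchMetaData(urls);
  return metadata;
});
// later: cancel();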
Another approach is to create a cancellable token pattern – you create an object which is filled in by the function that wants to implement this functionality:
async function updateUser(token) {
  let cancelled = false;
  // we don't reject, since we don't have access to
  // the returned promise
  // so we just don't call other functions, and reject
  // in the end
  token.cancel = () => {
    cancelled = true;
  };

  const data = await wrapWithCancel(fetchData)();
  const userData = await wrapWithCancel(updateUserData)(data);
  const userAddress = await wrapWithCancel(updateUserAddress)(userData);
  const marketingData = await wrapWithCancel(updateMarketingData)(userAddress);

  // because we've wrapped all functions, in case of cancellation
  // we'll just fall through to this point, without calling any of the
  // actual functions. We also can't reject by ourselves, since
  // we don't have control over the returned promise
  if (cancelled) {
    throw { reason: 'cancelled' };
  }

  return marketingData;

  function wrapWithCancel(fn) {
    return data => {
      if (!cancelled) {
        return fn(data);
      }
    };
  }
}

const token = {};
const promise = updateUser(token);
// wait some time...
token.cancel(); // user will be updated anyway
I've written articles, both on cancellation and generators:
promise cancellation
generators usage
To summarize – you have to do some additional work in order to support cancellation, and if you want to have it as a first-class citizen in your application, you have to use generators.
Here is a simple example with a promise:
let resp = await new Promise(function(resolve, reject) {
  // simulating time consuming process
  setTimeout(() => resolve('Promise RESOLVED !'), 3000);
  // hit a button to cancel the promise
  $('#btn').click(() => resolve('Promise CANCELED !'));
});
Please see this codepen for a demo
Using CPromise (c-promise2 package) this can be easily done in the following way
(Demo):
import CPromise from "c-promise2";

async function getCdnUrls() {
  console.log(`task1:start`);
  await CPromise.delay(1000);
  console.log(`task1:end`);
}

async function fetchMetaData() {
  console.log(`task2:start`);
  await CPromise.delay(1000);
  console.log(`task2:end`);
}

function* fetchContent() {
  // using generators is the recommended way to write asynchronous code with CPromise
  console.log(`task3:start`);
  yield CPromise.delay(1000);
  console.log(`task3:end`);
}

function* render() {
  console.log(`task4:start`);
  yield CPromise.delay(1000);
  console.log(`task4:end`);
}

const update = CPromise.promisify(function* () {
  var urls = yield getCdnUrls();
  var metadata = yield fetchMetaData(urls);
  var content = yield* fetchContent(metadata);
  yield* render(content);
  return 123;
});

const promise = update().then(
  (v) => console.log(`Done: ${v}`),
  (e) => console.warn(`Fail: ${e}`)
);

setTimeout(() => promise.cancel(), 2500);
Console output:
task1:start
task1:end
task2:start
task2:end
task3:start
Fail: CanceledError: canceled
Just like in regular code, you should throw an exception from the first function (or from each of the subsequent functions) and have a try block around the whole set of calls. No need for extra if-elses. That's one of the nice bits about async/await: you get to keep error handling the way we're used to in regular code.
As for cancelling the other operations, there is no need to. They will not actually start until their expressions are encountered by the interpreter, so the second async call will only start after the first one finishes without errors. Other tasks might get the chance to execute in the meantime, but for all intents and purposes, this section of code is serial and will execute in the desired order.
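A hedged sketch of that pattern, assuming getCdnUrls itself throws (or rejects) when the work should stop:
async function update() {
  var urls = await getCdnUrls();              // throws/rejects if the update should stop
  var metadata = await fetchMetaData(urls);   // never starts if the line above threw
  var content = await fetchContent(metadata);
  await render(content);
}

(async () => {
  try {
    await update();
  } catch (e) {
    // a single catch covers every step of the sequence
    console.log('update stopped:', e.message);
  }
})();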
This answer I posted may help you to rewrite your function as:
async function update() {
  var get_urls = comPromise.race([getCdnUrls()]);
  var get_metadata = get_urls.then(urls => fetchMetaData(urls));
  var get_content = get_metadata.then(metadata => fetchContent(metadata));
  var get_render = get_content.then(content => render(content));
  await get_render;
  return;
}
// this is the cancel command so that later steps will never proceed:
get_urls.abort();
But I am yet to implement the "class-preserving" then function so currently you have to wrap every part you want to be able to cancel with comPromise.race.
I created a library called @kaisukez/cancellation-token
The idea is to pass a CancellationToken to every async function, then wrap every promise in AsyncCheckpoint. So that when the token is cancelled, your async function will be cancelled in the next checkpoint.
This idea came from tc39/proposal-cancelable-promises
and conradreuter/cancellationtoken.
How to use my library
Refactor your code
// from this
async function yourFunction(param1, param2) {
  const result1 = await someAsyncFunction1(param1)
  const result2 = await someAsyncFunction2(param2)
  return [result1, result2]
}

// to this
import { AsyncCheckpoint } from '@kaisukez/cancellation-token'

async function yourFunction(token, param1, param2) {
  const result1 = await AsyncCheckpoint.after(token, () => someAsyncFunction1(param1))
  const result2 = await AsyncCheckpoint.after(token, () => someAsyncFunction2(param2))
  return [result1, result2]
}
Create a token then call your function with that token
import { CancellationToken, CancellationError } from '@kaisukez/cancellation-token'

const [token, cancel] = CancellationToken.source()

// spawn background task (run async function without using `await`)
CancellationError.ignoreAsync(() => yourAsyncFunction(token, param1, param2))

// ... do something ...

// then cancel the background task
await cancel()
So this is the solution to the OP's question:
import { CancellationToken, CancellationError, AsyncCheckpoint } from '@kaisukez/cancellation-token'

async function update(token) {
  var urls = await AsyncCheckpoint.after(token, () => getCdnUrls());
  var metadata = await AsyncCheckpoint.after(token, () => fetchMetaData(urls));
  var content = await AsyncCheckpoint.after(token, () => fetchContent(metadata));
  await AsyncCheckpoint.after(token, () => render(content));
  return;
}

const [token, cancel] = CancellationToken.source();

// spawn background task (run async function without using `await`)
CancellationError.ignoreAsync(() => update(token))

// ... do something ...

// then cancel the background task
await cancel()
Example written in Node with Typescript of a call which can be aborted from outside:
import { EventEmitter } from 'events';

function cancelable(asyncFunc: Promise<void>): [Promise<void>, () => boolean] {
  class CancelEmitter extends EventEmitter { }
  const cancelEmitter = new CancelEmitter();

  const promise = new Promise<void>(async (resolve, reject) => {
    cancelEmitter.on('cancel', () => {
      resolve();
    });
    try {
      await asyncFunc;
      resolve();
    } catch (err) {
      reject(err);
    }
  });

  return [promise, () => cancelEmitter.emit('cancel')];
}
Usage:
const asyncFunction = async () => {
  // doSomething
}

const [promise, cancel] = cancelable(asyncFunction());

setTimeout(() => {
  cancel();
}, 2000);

(async () => await promise)();
