Multiple fetch() with one signal in order to abort them all - javascript

I saw this other answer: https://stackoverflow.com/a/47250621/2809729
So can I abort multiple fetch request using just one signal?

At the time I was writing the question, I had already found the solution in this post on how to abort fetches, from one of the pioneers who worked on the implementation of abort.
So yes you can :)
Here's a brief example:
async function fetchStory({ signal } = {}) {
  const storyResponse = await fetch('/story.json', { signal });
  const data = await storyResponse.json();

  const chapterFetches = data.chapterUrls.map(async url => {
    const response = await fetch(url, { signal });
    return response.text();
  });

  return Promise.all(chapterFetches);
}
In the above example, the same signal is used for the initial fetch, and for the parallel chapter fetches. Here's how you'd use fetchStory:
const controller = new AbortController();
const signal = controller.signal;

fetchStory({ signal }).then(chapters => {
  console.log(chapters);
});
In this case, calling controller.abort() will abort every in-progress fetch and reject its promise.
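For completeness, a minimal sketch of how an abort surfaces on the consuming side (reusing the fetchStory above); an aborted fetch rejects with a DOMException named AbortError:
const controller = new AbortController();

fetchStory({ signal: controller.signal })
  .then(chapters => console.log(chapters))
  .catch(err => {
    if (err.name === 'AbortError') {
      console.log('Story fetches were aborted');
    } else {
      throw err; // a genuine network or parse failure
    }
  });

// later, e.g. when the user navigates away:
controller.abort();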

I abstracted the solution to this problem and came up with fetch-tx (fetch transaction), which provides a single abort function for all related fetch operations.
You can find it here:
fetch-tx


How to consume a promise and use the result later on with my code?

I'm new to async operations and js. Here is my question.
I have a Person class. I want to init Person instance using data that I get from an API call.
class Person {
  constructor(data) {
    this.data = data;
  }
}
I am making an API call using Axios. I get a response and want to use it in my class.
const res = axios.get('https://findperson.com/api/david');
const david = new Person(res);
I understand that res is a promise at this stage, and that I need to consume it.
How should I do it?
How can I take the response and use it properly?
axios.get() returns a promise of an object which contains the returned data, status, headers, etc.
async function getPerson() {
  try {
    const res = await axios.get('https://findperson.com/api/david');
    const david = new Person(res.data);
    // do something with david
  } catch (error) {
    console.log(error);
  }
}
or
function getPerson() {
  axios
    .get('https://findperson.com/api/david')
    .then(res => {
      const david = new Person(res.data);
      // do something with david
    })
    .catch(console.log);
}
Inside another async function, or at the top level of a module, or at the REPL (in Node 16.6+, or some earlier versions with the --experimental-repl-await feature enabled), you can just use await.
const res = await axios.get('https://findperson.com/api/david');
That will wait for the promise to be resolved and unwrap it to store the contained value in res.
If you want to get the value out of the world of async and into synchro-land, you have to do something with it via a callback function:
axios.get('https://findperson.com/api/david').then(
res => {
// do stuff with res here
});
... but don't be fooled; without an await, any code that comes after that axios.get call will run immediately, without waiting for the callback. So you can't do something like copy res to a global variable in the callback and then expect it to be set in subsequent code; it has to be callbacks all the way down.
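A minimal sketch of that trap (the res variable here is just for illustration):
let res; // hoping to fill this from the callback
axios.get('https://findperson.com/api/david').then(r => {
  res = r; // runs later, once the response has arrived
});
console.log(res); // undefined: this line runs before the callback fires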
You can do this:
axios.get('https://findperson.com/api/david').then(res => {
  const david = new Person(res.data);
});
Or in an async function (see async/await for JavaScript):
const res = await axios.get('https://findperson.com/api/david');
const david = new Person(res.data);

Make an async multiple update in a database

I have an array of elements to insert in a database. For each of them, I have to check their integrity (I send "Bad request" if I don't find an element):
let ret = []
const { idElement, type, description, name } = req.body
let promises = []
req.body.pjs.forEach((pj) => {
  promises.push(new Promise(async function(resolve, reject) {
    const { rows } = await db.query(`SELECT * FROM files WHERE uuid = '${pj.uuid}' AND name = '${pj.name}'`)
    if (rows.length == 0) { res.status(400).send("Bad request!") }
    const idFile = rows[0].id
    await db.query(`UPDATE elements
      SET base = base || '{"type":"file","valeur":"${idFile}","description":"${description}","name":"${pj.name}"}'::json
      WHERE id = ${idElement};`)
    resolve({id: idElement, name: pj.name, val: idFile, description: description})
  }))
});
(async function() {
  const asyncFunctions = promises
  await asyncFunctions.reduce(async (previousPromise, nextAsyncFunction) => {
    await previousPromise;
    const r = await nextAsyncFunction();
    ret.push(r)
  }, Promise.resolve());
})();
res.send(ret)
I took the example from the paragraph "3) one-by-one" here: https://dev.to/afifsohaili/dealing-with-promises-in-an-array-with-async-await-5d7g
This trick works for a lot of use cases in other parts of my code, but not for this particular case. I have this error:
const r = await nextAsyncFunction();
TypeError: nextAsyncFunction is not a function
And I don't know why. If anybody could give me a hand, it would be very kind :)
The error message is correct: the second parameter of the reduce callback is the next entry of the array being reduced, which in this case is the promises array. That entry is a promise, not a function.
So the immediate solution is to await the promise without trying to call it:
const r = await nextAsyncFunction; // no () on the end
Why the name nextAsyncFunction was used instead of nextPromise (or some variation thereof) is not self-evident; it's certainly confusing and led to errors.
Aside from that there seems to be some bugs waiting to happen:
If the "Bad request" message is sent, the code continues to execute and tries to update the database and resolve the promise pushed by the forEach function. Subsequently res.send(ret) will (is likely to?) error as an attempt to send a second set of response headers. Try thowing a Bad Request error and catching it in a promise catch handler to send the 400 response.
There is no attempt to wait for the asynchronous processing to finish before executing res.send(ret), which would send an empty array if it succeeded.
The reduce(async (previousPromise, nextPromise) construct is a rather complicated way of waiting for promises to be resolved in turn; the same can be done with for ... of:
(async function() {
  for (const promise of promises) {
    ret.push(await promise);
  }
})()
.then(() => res.send(ret))
.catch(() => {
  // server error response?
});
Handling requests that are a mixture of valid and invalid pj request values may require further attention.
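Putting those fixes together, here's a sketch of how the whole handler might look, assuming an Express-style req/res and a node-postgres style db.query that accepts placeholder parameters (parameterized queries also close the SQL-injection hole in the original string interpolation):
// Sketch only: app, db, and the route path are assumptions
app.post('/elements', async (req, res) => {
  const { idElement, description } = req.body;
  try {
    const ret = [];
    for (const pj of req.body.pjs) {
      const { rows } = await db.query(
        'SELECT * FROM files WHERE uuid = $1 AND name = $2',
        [pj.uuid, pj.name]
      );
      if (rows.length === 0) {
        // throw instead of responding here, so only one response is ever sent
        const err = new Error('Bad request!');
        err.status = 400;
        throw err;
      }
      const idFile = rows[0].id;
      await db.query(
        `UPDATE elements
         SET base = base || $1::json
         WHERE id = $2`,
        [JSON.stringify({ type: 'file', valeur: idFile, description, name: pj.name }), idElement]
      );
      ret.push({ id: idElement, name: pj.name, val: idFile, description });
    }
    res.send(ret); // only sent once all updates have finished
  } catch (err) {
    res.status(err.status || 500).send(err.message);
  }
});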

Node async loop - how to make this code run in sequential order?

I know there are several posts about this, but according to those I've found, this should work correctly.
I want to make an http request in a loop and I do not want the loop to iterate until the request callback has been fired. I'm using the async library like so:
const async = require("async");
const request = require("request");
let data = [
  "Larry",
  "Curly",
  "Moe"
];
async.forEachOf(data, (result, idx, callback) => {
  console.log("Loop iterated", idx);
  let fullUri = "https://jsonplaceholder.typicode.com/posts";
  request({
    url: fullUri
  },
  (err, res, body) => {
    console.log("Request callback fired...");
    if (err || res.statusCode !== 200) return callback(err);
    console.log(result);
    callback();
  });
});
What I see is:
Loop iterated 0
Loop iterated 1
Loop iterated 2
Request callback fired...
Curly
Request callback fired...
Larry
Request callback fired...
Moe
What I need to see is:
Loop iterated 0
Request callback fired...
Curly
Loop iterated 1
Request callback fired...
Larry
Loop iterated 2
Request callback fired...
Moe
Also, if there's a built-in way to do the same thing (async/await? Promise?) and the async library could be removed, that'd be even better.
I've seen some examples of recursion out there that are clever, but when I put this to use in a much more complex situation (e.g. multiple request calls per-loop, etc.) I feel like that approach is hard to follow, and isn't as readable.
You can ditch async altogether and go for async/await quite easily.
Promisify your request and use async/await
Just turn request into a Promise so you can await on it.
Better yet just use request-promise-native that already wraps request using native Promises.
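If you'd rather not add a dependency, wrapping request by hand is only a few lines. A sketch (requestP is a name introduced here for illustration):
const request = require('request')

// turn request's callback API into a Promise so it can be awaited
function requestP(options) {
  return new Promise((resolve, reject) => {
    request(options, (err, res, body) => {
      if (err) return reject(err)
      if (res.statusCode !== 200) return reject(new Error('HTTP ' + res.statusCode))
      resolve(body)
    })
  })
}

// usage: const body = await requestP({ url: fullUri })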
Serial example
From then on it's a slam dunk with async/await:
const rp = require('request-promise-native')

const users = [1, 2, 3, 4]
const results = []

// note: `await` like this must live inside an `async` function
// (or a module / REPL that supports top-level await)
for (const idUser of users) {
  const result = await rp('http://foo.com/users/' + idUser)
  results.push(result)
}
Parallel example
Now, the problem with the above solution is that it's slow - the requests run serially. That's not ideal most of the time.
If you don't need the result of the previous request for the next request, just go ahead and do a Promise.all to fire parallel requests.
const users = [1, 2, 3, 4]

const pendingPromises = []
for (const idUser of users) {
  // Here we won't `await` on *each and every* request.
  // We'll just prepare it and push it into an Array
  pendingPromises.push(rp('http://foo.com/users/' + idUser))
}

// Then we `await` on a `Promise.all` of those requests,
// which will fire all the prepared promises *simultaneously*
// and resolve when all have been completed
const results = await Promise.all(pendingPromises)
Error handling
Error handling in async/await is provided by plain-old try..catch blocks, which I've omitted for brevity.
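For instance, here is a sketch of the serial loop from above with a try..catch added (whether you rethrow, retry, or just record the failure is up to your use case):
async function fetchUsers(users) {
  const results = []
  for (const idUser of users) {
    try {
      results.push(await rp('http://foo.com/users/' + idUser))
    } catch (err) {
      // record the failure and carry on with the remaining users
      console.error('Request for user ' + idUser + ' failed:', err.message)
    }
  }
  return results
}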
If you have many (thousands) of urls to process it's best to define a batch size and recursively call the process function to process one batch.
It's also best to limit amount of active connections, you can use this to throttle active connections or connections within a certain time (only 5 per second).
Last but not least; if you use Promise.all you want to make sure not all successes are lost when one promise rejects. You can catch rejected requests and return a Fail type object so it'll then resolve with this Fail type.
The code would look something like this:
const async = require("async");
//lib comes from: https://github.com/amsterdamharu/lib/blob/master/src/index.js
const lib = require("lib");
const request = require("request");

const Fail = function(reason){ this.reason = reason; };
const isFail = o => (o && o.constructor) === Fail;

const requestAsPromise = fullUri =>
  new Promise(
    (resolve, reject) =>
      request({
        url: fullUri
      },
      (err, res, body) => {
        console.log("Request callback fired...");
        if (err || res.statusCode !== 200) return reject(err);
        console.log("Success:", fullUri);
        resolve([res, body]);
      })
  );

const process =
  handleBatchResult =>
  batchSize =>
  maxFunction =>
  urls =>
    urls.length === 0
      ? Promise.resolve() // stop recursing once every batch is done
      : Promise.all(
          urls.slice(0, batchSize)
            .map(
              url =>
                maxFunction(requestAsPromise)(url)
                .catch(err => new Fail([err, url])) //catch reject and resolve with fail object
            )
        )
        .then(handleBatchResult)
        .catch(panic => console.error(panic))
        .then( //recursively call itself with the next batch
          _ =>
            process(handleBatchResult)(batchSize)(maxFunction)(urls.slice(batchSize))
        );

const handleBatch = results => { //this will handle the results of a batch
  //maybe write successes to a file, but certainly write the failed ones
  //  so you can retry them later
  const successes = results.filter(result => !isFail(result));
  //failed are the requests that failed
  const failed = results.filter(isFail);
  //to get the failed urls you can do
  const failedUrls = failed.map(fail => fail.reason[1]);
};

const per_batch_1000_max_10_active =
  process(handleBatch)(1000)(lib.throttle(10));

//start the process
per_batch_1000_max_10_active(largeArrayOfUrls)
  .then(
    result => console.log("Process done"),
    err => console.error("This should not happen:", err)
  );
In your batch handler you can store the failed requests to a file to retry them later (each failed item is a Fail whose reason holds [error, url], so const [error, uri] = failedResultItem.reason;). You should give up if a high proportion of requests are failing.
After the batch handler there is a .catch; that is your panic mode. It should never fail there, so I'd advise piping errors to a file (on Linux).

Is there a way to short circuit async/await flow?

All four functions that are called in update below return promises.
async function update() {
  var urls = await getCdnUrls();
  var metadata = await fetchMetaData(urls);
  var content = await fetchContent(metadata);
  await render(content);
  return;
}
What if we want to abort the sequence from outside, at any given time?
For example, while fetchMetaData is being executed, we realize we no longer need to render the component and we want to cancel the remaining operations (fetchContent and render). Is there a way to abort/cancel these operations from outside the update function?
We could check against a condition after each await, but that seems like an inelegant solution, and even then we will have to wait for the current operation to finish.
The standard way to do this now is through AbortSignals
async function update({ signal } = {}) {
  // pass these to methods to cancel them internally in turn
  // this is implemented throughout Node.js and most of the web platform
  try {
    var urls = await getCdnUrls({ signal });
    var metadata = await fetchMetaData(urls);
    var content = await fetchContent(metadata);
    await render(content);
  } catch (e) {
    if (e.name !== 'AbortError') throw e;
  }
  return;
}
// usage
const ac = new AbortController();
update({ signal: ac.signal });
ac.abort(); // cancel the update
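For this to work, each step has to actually honor the signal. As a sketch (the URL and the delay helper are illustrative, not part of the original code): a fetch-based getCdnUrls just forwards the signal, and non-fetch work can listen for the abort event itself.
// fetch supports AbortSignal natively: just forward it
async function getCdnUrls({ signal } = {}) {
  const res = await fetch('/cdn-urls.json', { signal });
  return res.json();
}

// for non-fetch work, wire the signal up yourself
function delay(ms, { signal } = {}) {
  return new Promise((resolve, reject) => {
    const t = setTimeout(resolve, ms);
    signal?.addEventListener('abort', () => {
      clearTimeout(t);
      reject(new DOMException('Aborted', 'AbortError'));
    }, { once: true });
  });
}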
OLD 2016 content below, beware dragons
I just gave a talk about this - this is a lovely topic but sadly you're not really going to like the solutions I'm going to propose as they're gateway-solutions.
What the spec does for you
Getting cancellation "just right" is actually very hard. People have been working on just that for a while and it was decided not to block async functions on it.
There are two proposals attempting to solve this in ECMAScript core:
Cancellation tokens - which adds cancellation tokens that aim to solve this issue.
Cancelable promise - which adds catch cancel (e) { syntax and throw.cancel syntax which aims to address this issue.
Both proposals changed substantially over the last week, so I wouldn't count on either arriving in the next year or so. The proposals are somewhat complementary and are not at odds.
What you can do to solve this from your side
Cancellation tokens are easy to implement. Sadly, the sort of cancellation you'd really want (aka "third state" cancellation, where cancellation is not an exception) is impossible with async functions at the moment, since you don't control how they're run. You can do two things:
Use coroutines instead - bluebird ships with sound cancellation using generators and promises which you can use.
Implement tokens with abortive semantics - this is actually pretty easy so let's do it here
CancellationTokens
Well, a token signals cancellation:
class Token {
  constructor(fn = () => {}) { // default lets you construct a Token without a handler setup
    this.isCancellationRequested = false;
    this.onCancelled = []; // actions to execute when cancelled
    this.onCancelled.push(() => this.isCancellationRequested = true);
    // expose a promise to the outside
    this.promise = new Promise(resolve => this.onCancelled.push(resolve));
    // let the user add handlers
    fn(f => this.onCancelled.push(f));
  }
  cancel() { this.onCancelled.forEach(x => x()); } // actually invoke each handler
}
This would let you do something like:
async function update(token) {
  if (token.isCancellationRequested) return;
  var urls = await getCdnUrls();
  if (token.isCancellationRequested) return;
  var metadata = await fetchMetaData(urls);
  if (token.isCancellationRequested) return;
  var content = await fetchContent(metadata);
  if (token.isCancellationRequested) return;
  await render(content);
  return;
}

var token = new Token(); // don't need any special handling here
update(token);
// ...
if (updateNotNeeded) token.cancel(); // will abort asynchronous actions
This is a really ugly way that would work; optimally you'd want async functions to be aware of this, but they're not (yet).
Optimally, all your interim functions would be aware and would throw on cancellation (again, only because we can't have third-state) which would look like:
async function update(token) {
  var urls = await getCdnUrls(token);
  var metadata = await fetchMetaData(urls, token);
  var content = await fetchContent(metadata, token);
  await render(content, token);
  return;
}
Since each of our functions are cancellation aware, they can perform actual logical cancellation - getCdnUrls can abort the request and throw, fetchMetaData can abort the underlying request and throw and so on.
Here is how one might write getCdnUrl (note the singular) using the XMLHttpRequest API in browsers:
function getCdnUrl(url, token) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url);
  var p = new Promise((resolve, reject) => {
    xhr.onload = () => resolve(xhr);
    xhr.onerror = e => reject(new Error(e));
    token.promise.then(x => {
      try { xhr.abort(); } catch(e) {} // ignore abort errors
      reject(new Error("cancelled"));
    });
  });
  xhr.send();
  return p;
}
This is as close as we can get with async functions without coroutines. It's not very pretty but it's certainly usable.
Note that you'd want to avoid cancellations being treated as exceptions. This means that if your functions throw on cancellation you need to filter those errors on the global error handlers process.on("unhandledRejection", e => ... and such.
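A sketch of such a filter in Node, matching the new Error("cancelled") thrown above:
process.on("unhandledRejection", e => {
  if (e && e.message === "cancelled") return; // swallow our cancellations
  console.error("Unhandled rejection:", e); // real errors still surface
});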
You can get what you want using Typescript + Bluebird + cancelable-awaiter.
Now that all evidence points to cancellation tokens not making it into ECMAScript, I think the best solution for cancellation is the bluebird implementation mentioned by @BenjaminGruenbaum; however, I find the usage of co-routines and generators a bit clumsy and uneasy on the eyes.
Since I'm using Typescript, which now support async/await syntax for es5 and es3 targets, I've created a simple module which replaces the default __awaiter helper with one that supports bluebird cancellations: https://www.npmjs.com/package/cancelable-awaiter
Unfortunately, there is no support for cancellable promises so far. There are some custom implementations, e.g. the following, which extends/wraps a promise to be cancellable and resolvable:
function promisify(promise) {
  let _resolve, _reject
  let wrap = new Promise(async (resolve, reject) => {
    _resolve = resolve
    _reject = reject
    let result = await promise
    resolve(result)
  })
  wrap.resolve = _resolve
  wrap.reject = _reject
  return wrap
}
Usage: cancel the promise and stop further execution immediately after it:
async function test() {
  // Create a promise that should be resolved in 3 seconds
  let promise = new Promise(resolve => setTimeout(() => resolve('our resolved value'), 3000))
  // extend our promise to be cancellable
  let cancellablePromise = promisify(promise)
  // Cancel the promise in 2 seconds.
  // if you comment this line out, then the promise will be resolved.
  setTimeout(() => cancellablePromise.reject('error code'), 2000)
  // wait for the promise to be resolved
  let result = await cancellablePromise
  // this line will never be executed!
  console.log(result)
}
In this approach, a promise itself is executed till the end, but the caller code that awaits promise result can be 'cancelled'.
Unfortunately, no, you can't control the execution flow of default async/await behaviour – that does not mean the problem itself is unsolvable, but it does mean you need to change your approach a bit.
First of all, your proposal about wrapping every async line in a check is a working solution, and if you have just a couple of places with such functionality, there is nothing wrong with it.
If you want to use this pattern often, the best solution is probably to switch to generators: while not so widespread, they let you define each step's behaviour, so adding cancel is easy. Generators are pretty powerful, but, as I've mentioned, they require a runner function and are not as straightforward as async/await.
Another approach is the cancellable token pattern – you create an object which is filled in by the function that wants to support cancellation:
async function updateUser(token) {
  let cancelled = false;
  // we don't reject, since we don't have access to
  // the returned promise
  // so we just don't call other functions, and reject
  // in the end
  token.cancel = () => {
    cancelled = true;
  };

  const data = await wrapWithCancel(fetchData)();
  const userData = await wrapWithCancel(updateUserData)(data);
  const userAddress = await wrapWithCancel(updateUserAddress)(userData);
  const marketingData = await wrapWithCancel(updateMarketingData)(userAddress);

  // because we've wrapped all functions, in case of cancellation
  // we'll just fall through to this point, without calling any of
  // the actual functions. We also can't reject by ourselves, since
  // we don't have control over the returned promise
  if (cancelled) {
    throw { reason: 'cancelled' };
  }

  return marketingData;

  function wrapWithCancel(fn) {
    return data => {
      if (!cancelled) {
        return fn(data);
      }
    };
  }
}

const token = {};
const promise = updateUser(token);
// wait some time...
token.cancel(); // user will be updated anyway
I've written articles, both on cancellation and generators:
promise cancellation
generators usage
To summarize – you have to do some additional work in order to support cancellation, and if you want it as a first-class citizen in your application, you have to use generators.
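To make that concrete, here is a minimal sketch of a generator-based runner with cancellation (the run helper is illustrative; libraries such as bluebird's coroutines offer a hardened version of this idea):
// run() steps through the yielded promises; cancel() stops before the next step
function run(genFn) {
  const gen = genFn();
  let cancelled = false;
  const promise = (async () => {
    let input;
    while (true) {
      const { value, done } = gen.next(input);
      if (done) return value;
      input = await value; // wait for the current step to settle
      if (cancelled) throw new Error('cancelled');
    }
  })();
  return { promise, cancel: () => { cancelled = true; } };
}

// usage: write `yield` where you would have written `await`
const task = run(function* () {
  const urls = yield getCdnUrls();
  const metadata = yield fetchMetaData(urls);
  return metadata;
});
task.cancel(); // the runner stops before resuming the generator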
Here is a simple example with a promise:
let resp = await new Promise(function(resolve, reject) {
  // simulating a time-consuming process
  setTimeout(() => resolve('Promise RESOLVED !'), 3000);
  // hit a button to cancel the promise
  $('#btn').click(() => resolve('Promise CANCELED !'));
});
Please see this codepen for a demo
Using CPromise (the c-promise2 package), this can easily be done in the following way (Demo):
import CPromise from "c-promise2";

async function getCdnUrls() {
  console.log(`task1:start`);
  await CPromise.delay(1000);
  console.log(`task1:end`);
}

async function fetchMetaData() {
  console.log(`task2:start`);
  await CPromise.delay(1000);
  console.log(`task2:end`);
}

function* fetchContent() {
  // using generators is the recommended way to write asynchronous code with CPromise
  console.log(`task3:start`);
  yield CPromise.delay(1000);
  console.log(`task3:end`);
}

function* render() {
  console.log(`task4:start`);
  yield CPromise.delay(1000);
  console.log(`task4:end`);
}

const update = CPromise.promisify(function* () {
  var urls = yield getCdnUrls();
  var metadata = yield fetchMetaData(urls);
  var content = yield* fetchContent(metadata);
  yield* render(content);
  return 123;
});

const promise = update().then(
  (v) => console.log(`Done: ${v}`),
  (e) => console.warn(`Fail: ${e}`)
);

setTimeout(() => promise.cancel(), 2500);
Console output:
task1:start
task1:end
task2:start
task2:end
task3:start
Fail: CanceledError: canceled
Just like in regular code, you should throw an exception from the first function (or each of the subsequent functions) and have a try block around the whole set of calls. No need for extra if-elses. That's one of the nice things about async/await: you get to keep error handling the way we're used to from regular code.
As for cancelling the other operations: there is no need to. They will not actually start until their expressions are encountered by the interpreter, so the second async call will only start after the first one finishes without errors. Other tasks might get the chance to execute in the meantime, but for all intents and purposes this section of code is serial and will execute in the desired order.
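A tiny sketch of that behaviour (a and b are placeholders for your async steps):
async function demo() {
  const x = await a(); // if a() rejects, we jump straight past b()
  const y = await b(x); // b() is not even called until a() has resolved
  return y;
}
demo().catch(err => console.error('chain aborted:', err));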
This answer I posted may help you to rewrite your function as:
async function update() {
  var get_urls = comPromise.race([getCdnUrls()]);
  var get_metadata = get_urls.then(urls => fetchMetaData(urls));
  var get_content = get_metadata.then(metadata => fetchContent(metadata));
  var get_render = get_content.then(content => render(content));
  await get_render;
  return;
}
// this is the cancel command so that later steps will never proceed:
get_urls.abort();
But I have yet to implement the "class-preserving" then function, so currently you have to wrap every part you want to be able to cancel with comPromise.race.
I created a library called #kaisukez/cancellation-token
The idea is to pass a CancellationToken to every async function, then wrap every promise in an AsyncCheckpoint, so that when the token is cancelled, your async function is cancelled at the next checkpoint.
This idea came from tc39/proposal-cancelable-promises
and conradreuter/cancellationtoken.
How to use my library
Refactor your code
// from this
async function yourFunction(param1, param2) {
  const result1 = await someAsyncFunction1(param1)
  const result2 = await someAsyncFunction2(param2)
  return [result1, result2]
}

// to this
import { AsyncCheckpoint } from '#kaisukez/cancellation-token'

async function yourFunction(token, param1, param2) {
  const result1 = await AsyncCheckpoint.after(token, () => someAsyncFunction1(param1))
  const result2 = await AsyncCheckpoint.after(token, () => someAsyncFunction2(param2))
  return [result1, result2]
}
Create a token then call your function with that token
import { CancellationToken, CancellationError } from '#kaisukez/cancellation-token'

const [token, cancel] = CancellationToken.source()

// spawn a background task (run an async function without using `await`)
CancellationError.ignoreAsync(() => yourAsyncFunction(token, param1, param2))

// ... do something ...

// then cancel the background task
await cancel()
So this is the solution to the OP's question:
import { CancellationToken, CancellationError, AsyncCheckpoint } from '#kaisukez/cancellation-token'

async function update(token) {
  var urls = await AsyncCheckpoint.after(token, () => getCdnUrls());
  var metadata = await AsyncCheckpoint.after(token, () => fetchMetaData(urls));
  var content = await AsyncCheckpoint.after(token, () => fetchContent(metadata));
  await AsyncCheckpoint.after(token, () => render(content));
  return;
}

const [token, cancel] = CancellationToken.source();

// spawn a background task (run an async function without using `await`)
CancellationError.ignoreAsync(() => update(token))

// ... do something ...

// then cancel the background task
await cancel()
Example, written in Node with TypeScript, of a call which can be aborted from outside:
import { EventEmitter } from 'events'; // needed for the emitter below

function cancelable(asyncFunc: Promise<void>): [Promise<void>, () => boolean] {
  class CancelEmitter extends EventEmitter { }
  const cancelEmitter = new CancelEmitter();

  const promise = new Promise<void>(async (resolve, reject) => {
    cancelEmitter.on('cancel', () => {
      resolve();
    });
    try {
      await asyncFunc;
      resolve();
    } catch (err) {
      reject(err);
    }
  });

  return [promise, () => cancelEmitter.emit('cancel')];
}
Usage:
const asyncFunction = async () => {
  // doSomething
}

const [promise, cancel] = cancelable(asyncFunction());

setTimeout(() => {
  cancel();
}, 2000);

(async () => await promise)();

Using rx.js, how do I emit a memoized result from an existing observable sequence on a timer?

I'm currently teaching myself reactive programming with rxjs, and I've set myself a challenge of creating an observable stream which will always emit the same result to a subscriber no matter what.
I've memoized the creation of an HTTP "GET" stream given a specific URL, and I'm trying to act on that stream every two seconds, with the outcome being that for each tick of the timer, I'll extract a cached/memoized HTTP result from the original stream.
import superagent from 'superagent';
import _ from 'lodash';
// Cached GET function, returning a stream that emits the HTTP response object
var httpget = _.memoize(function(url) {
  var req = superagent.get(url);
  req = req.end.bind(req);
  return Rx.Observable.fromNodeCallback(req)();
});

// Assume this is created externally and I only have access to response$
var response$ = httpget('/ontologies/acl.ttl');

// Every two seconds, emit the memoized HTTP response
Rx.Observable.timer(0, 2000)
  .map(() => response$)
  .flatMap($ => $)
  .subscribe(response => {
    console.log('Got the response!');
  });
I was sure that I'd have to stick a call to replay() in there somewhere, but no matter what I do, a fresh HTTP call is initiated every two seconds. How can I structure this so that I can construct an observable from a URL and have it always emit the same HTTP result to any subsequent subscribers?
EDIT
I found a way to get the result I want, but I feel like I am missing something, and should be able to refactor this with a much more streamlined approach:
var httpget = _.memoize(function(url) {
  var subject = new Rx.ReplaySubject();
  try {
    superagent.get(url).end((err, res) => {
      if (err) {
        subject.onError(err);
      }
      else {
        subject.onNext(res);
        subject.onCompleted();
      }
    });
  }
  catch (e) {
    subject.onError(e);
  }
  return subject.asObservable();
});
Your first code sample is actually closer to the way to do it:
var httpget = _.memoize(function(url) {
  var req = superagent.get(url);
  return Rx.Observable.fromNodeCallback(req.end, req)();
});
However, this isn't working because there appears to be a bug in fromNodeCallback. As a workaround until that is fixed, I think you are actually looking for AsyncSubject instead of ReplaySubject. The latter works, but the former is designed for exactly this scenario (and doesn't have the overhead of an array creation plus runtime checks for cache expiration, if that matters to you).
var httpget = _.memoize(function(url) {
  var subject = new Rx.AsyncSubject();
  var req = superagent.get(url);
  Rx.Observable.fromNodeCallback(req.end, req)().subscribe(subject);
  return subject.asObservable();
});
Finally, though map appreciates that you are thinking of it, you can simplify your timer code by using the flatMap overload that takes an Observable directly:
Rx.Observable.timer(0, 2000)
  .flatMap(response$)
  .subscribe(response => {
    console.log('Got the response');
  });
Unless I am getting your question wrong, Observable.combineLatest does just that for you: it caches the last emitted value of your observable.
This code sends the request once and then gives the same cached response every 200 ms:
import reqPromise from 'request-promise';
import {Observable} from 'rx';

let httpGet_ = (url) =>
  Observable
    .combineLatest(
      Observable.interval(200),
      reqPromise(url),
      (count, response) => response
    );

httpGet_('http://google.com/')
  .subscribe(
    x => console.log(x),
    e => console.error(e)
  );
