multi_req_perform {httr2} | R Documentation
This variation on req_perform() performs multiple requests in parallel.
Unlike req_perform(), it always succeeds: it will never throw an error.
Instead, it returns error objects, which are your responsibility to
handle.
Exercise caution when using this function; it's easy to pummel a server with many simultaneous requests. Only use it with hosts designed to serve many files at once.
Usage

multi_req_perform(reqs, paths = NULL, pool = NULL, cancel_on_error = FALSE)
Arguments

reqs: A list of requests.

paths: An optional list of paths, if you want to download the request
  bodies to disk. If supplied, must be the same length as reqs.

pool: Optionally, a curl pool made by curl::new_pool().

cancel_on_error: Should all pending requests be cancelled when you hit
  an error? Set this to TRUE to stop all requests as soon as one fails.
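For example, to keep the load on a single host modest you could pass a
pool with a lower per-host connection limit, and optionally download
the response bodies to disk. A minimal sketch (the tempfile() paths
and the host_con value are illustrative choices, not defaults):

# One path per request; curl::new_pool()'s host_con caps concurrent
# connections to a single host.
paths <- replicate(length(reqs), tempfile(fileext = ".json"))
resps <- multi_req_perform(
  reqs,
  paths = paths,
  pool = curl::new_pool(host_con = 2)
)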
Value

A list the same length as reqs, where each element is either a
response or an error.
Limitations

* Will not retrieve a new OAuth token if it expires part way through
  the requests.
* Does not perform throttling with req_throttle().
* Does not attempt retries as described by req_retry().
* Only consults the cache set by req_cache() before/after all requests.

In general, where req_perform() might make multiple requests due to
retries or OAuth failures, multi_req_perform() will only ever make one.
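Because retries are not attempted, one way to approximate req_retry()
is to re-run just the failed requests yourself. A minimal sketch,
assuming a single extra pass is enough:

resps <- multi_req_perform(reqs)
failed <- vapply(resps, inherits, logical(1), what = "error")
if (any(failed)) {
  # Re-perform only the requests that errored and splice the new
  # results back into place.
  resps[failed] <- multi_req_perform(reqs[failed])
}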
Examples

# Requesting these 4 pages one at a time would take four seconds:
reqs <- list(
request("https://httpbin.org/delay/1"),
request("https://httpbin.org/delay/1"),
request("https://httpbin.org/delay/1"),
request("https://httpbin.org/delay/1")
)
# But it's much faster if you request in parallel
system.time(resps <- multi_req_perform(reqs))

reqs <- list(
request("https://httpbin.org/status/200"),
request("https://httpbin.org/status/400"),
request("FAILURE")
)
# multi_req_perform() will always succeed
resps <- multi_req_perform(reqs)
# you'll need to inspect the results to figure out which requests failed
fail <- vapply(resps, inherits, "error", FUN.VALUE = logical(1))
resps[fail]
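# The remaining elements are regular httr2 response objects; as an
# illustration (not part of the original example), you might pull out
# their status codes with resp_status():
ok <- resps[!fail]
lapply(ok, resp_status)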