xdnuts {XDNUTS}
R Documentation
Description

This function generates multiple Markov chains for sampling from both continuous and discontinuous posterior distributions using a variety of algorithms. Classic Hamiltonian Monte Carlo (Duane et al. 1987), NUTS (Hoffman et al. 2014), and XHMC (Betancourt 2016) are embedded into the framework described in Nishimura et al. (2020), which makes it possible to handle such posteriors. Furthermore, for each method, samples from the trajectories can be recycled using the approach proposed by Nishimura and Dunson (2020). This is exploited to improve the estimate of the mass matrix during the warm-up phase without incurring a relevant additional computational cost.
Usage

xdnuts(
theta0,
nlp,
args,
k,
N = 1000,
K = 3,
method = "NUTS",
tau = NULL,
L = NULL,
thin = 1,
control = set_parameters(),
parallel = FALSE,
loadLibraries = NULL,
loadRObject = NULL,
verbose = FALSE,
hide = FALSE
)
Arguments

theta0
a list containing the starting values for each chain. These starting values are vectors whose length equals the dimension of the parameter space.
nlp
a function which evaluates the negative log posterior and its gradient with respect to the parameters that do not induce any discontinuity in the posterior distribution (more generally, the first d - k parameters, where d is the dimension of the parameter space).
args
a list containing the inputs for the negative log posterior function.
k
an integer value stating the number of parameters that determine a discontinuity in the posterior distribution. In fact, since the algorithm proposed in Nishimura et al. (2020) also works in the fully continuous case, k more generally indicates the number of parameters for which that algorithm is exploited.
N
the number of draws from the posterior distribution, after warm-up, for each chain. Default value is 1000.
K
the number of recycled samples per iteration used by default during the warm-up phase. Default value is 3.
method
a character value which defines the type of algorithm to exploit: either "NUTS" (the default), "XHMC", or "HMC".
tau
the threshold for the virial termination criterion (Betancourt 2016). Used only when method = "XHMC".
L
the desired length of the trajectory of the classic Hamiltonian Monte Carlo algorithm. Required if method = "HMC".
thin
the thinning factor: the number of draws generated per stored iteration of each chain, all but one of which are discarded. Default value is 1.
control
an object of class control_xdnuts, as produced by the function set_parameters.
parallel
a boolean value specifying whether the chains must be run in parallel. Default value is FALSE.
loadLibraries
a character vector indicating the names of the packages to load on each cluster if parallel = TRUE.
loadRObject
a character vector indicating the names of the R objects to load on each cluster if parallel = TRUE.
verbose
a boolean value for printing all the information regarding the sampling process.
hide
a boolean value that, if set to TRUE, suppresses printing to the console.
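To make the nlp, args, and k conventions concrete, here is a minimal sketch for a d-dimensional standard Gaussian target. The exact calling convention assumed below (the function returns the negative log posterior value when its third argument is TRUE and the gradient otherwise) is an assumption for illustration, not taken from the package documentation; consult the package vignette for the required signature.

```r
# Sketch of an nlp function for a d-dimensional standard Gaussian target.
# ASSUMPTION: nlp returns the negative log posterior when its third
# argument is TRUE, and the gradient of the negative log posterior otherwise.
nlp <- function(par, args, eval_nlp = TRUE) {
  if (eval_nlp) {
    0.5 * sum(par^2)  # negative log density, up to an additive constant
  } else {
    par               # gradient of the negative log density
  }
}

args <- list()  # no additional data needed for this toy target
k <- 0          # fully continuous posterior: no discontinuous coordinates
```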
Value

a list of class XDNUTS containing:

- a list of the same length as theta0, each element containing the output from the function main_function.
- the dimension of the parameter space.
- the number of parameters that lead to a discontinuous posterior distribution or, more generally, for which the algorithm of Nishimura et al. (2020) is exploited.
- the number of recycled samples for each iteration during the sampling phase.
- the number of posterior draws for each chain.
- the MCMC method used: either "NUTS", "XHMC", or "HMC".
- the threshold for the virial termination criterion (Betancourt 2016); this value is nonzero only if method = "XHMC".
- the desired trajectory length of the classic Hamiltonian Monte Carlo algorithm specified by the user; this argument is necessary if method = "HMC".
- the number of discarded samples for every final iteration, as specified by the user.
- an object of class control_xdnuts, the output of the function set_parameters with arguments specified by the user.
- the boolean value specified by the user regarding the printing of sampling process information.
- the boolean value specified by the user regarding parallel processing.
References

Betancourt M (2016). "Identifying the optimal integration time in Hamiltonian Monte Carlo." arXiv preprint arXiv:1601.00225.

Betancourt M (2017). "A conceptual introduction to Hamiltonian Monte Carlo." arXiv preprint arXiv:1701.02434.

Duane S, Kennedy AD, Pendleton BJ, Roweth D (1987). "Hybrid Monte Carlo." Physics Letters B, 195(2), 216-222.

Hoffman MD, Gelman A (2014). "The No-U-Turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo." Journal of Machine Learning Research, 15(1), 1593-1623.

Nishimura A, Dunson D (2020). "Recycling Intermediate Steps to Improve Hamiltonian Monte Carlo." Bayesian Analysis, 15(4). doi:10.1214/19-ba1171.

Nishimura A, Dunson DB, Lu J (2020). "Discontinuous Hamiltonian Monte Carlo for discrete parameters and discontinuous likelihoods." Biometrika, 107(2), 365-380.
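Examples

As an illustrative sketch (not taken from the package documentation), the following shows a complete call for a toy bivariate Gaussian target. The three-argument calling convention assumed for nlp, and the guard around the call, are assumptions for illustration; consult the package vignette for the exact required signature.

```r
# Toy target: standard bivariate Gaussian, fully continuous (k = 0).
# ASSUMPTION: nlp returns the negative log posterior when its third
# argument is TRUE, and its gradient otherwise.
nlp <- function(par, args, eval_nlp = TRUE) {
  if (eval_nlp) {
    0.5 * sum(par^2)  # negative log density, up to an additive constant
  } else {
    par               # gradient of the negative log density
  }
}

# Four chains, each started from a random point in R^2.
theta0 <- lapply(1:4, function(i) rnorm(2))

# Run the sampler only if the package is installed.
if (requireNamespace("XDNUTS", quietly = TRUE)) {
  fit <- XDNUTS::xdnuts(theta0 = theta0, nlp = nlp, args = list(),
                        k = 0, N = 1000, method = "NUTS")
}
```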