r2jags - How to interpret some syntax (n.adapt, update..) in jags?


I feel confused by the following syntax in JAGS, for example:

    n.iter=100,000 thin=100 n.adapt=100 update(model,1000,progress.bar = "none")

Currently I think:

n.adapt=100 means that the first 100 draws are set as burn-in,

n.iter=100,000 means that the MCMC chain has 100,000 iterations, including the burn-in.

I have checked the explanation for this question many times, but I am still not sure whether my interpretation of n.iter and n.adapt is correct, or how to understand update() and thinning.

Could someone explain this to me?

This answer is based on the package rjags, which takes an n.adapt argument. First I discuss the meanings of adaptation, burn-in, and thinning, and then I discuss the syntax (I sense that you are well aware of the meaning of burn-in and thinning, but not of adaptation; a full explanation may make this answer more useful to future readers).

Burn-in: As you probably understand from introductions to MCMC sampling, some number of iterations from the MCMC chain must be discarded as burn-in. This is because, prior to fitting the model, you don't know whether you have initialized the MCMC chain within the characteristic set, the region of reasonable posterior probability. Chains initialized outside this region take a finite (sometimes large) number of iterations to find the region and begin exploring it. MCMC samples from this period of exploration are not random draws from the posterior distribution. Therefore, it is standard to discard the first portion of each MCMC chain as "burn-in". There are several post-hoc techniques to determine how much of the chain must be discarded.
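For example, if you keep the full chains, a minimal sketch of inspecting convergence and discarding burn-in post-hoc with the coda package might look like the following (the object name my.samples, the cutoff of 10,000 iterations, and the choice of these particular diagnostics are illustrative assumptions, not the only way to do it):

    library(coda)
    # my.samples is assumed to be an mcmc.list returned by coda.samples()
    traceplot(my.samples)     # visually judge where the chains settle down
    gelman.diag(my.samples)   # Gelman-Rubin convergence diagnostic across chains
    # Keep only iterations after the assumed burn-in point (here, iteration 10,000)
    post.burnin <- window(my.samples, start = 10001)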

Thinning: A separate problem arises because, in all but the simplest models, MCMC sampling algorithms produce chains in which successive draws are substantially autocorrelated. Thus, summarizing the posterior based on all iterations of the MCMC chain (post burn-in) may be inadvisable, as the effective posterior sample size can be much smaller than the analyst realizes (note that Stan's implementation of Hamiltonian Monte Carlo sampling dramatically reduces this problem in some situations). Therefore, it is standard to make inference on "thinned" chains, where only a fraction of the MCMC iterations are used in inference (e.g. only every fifth, tenth, or hundredth iteration, depending on the severity of the autocorrelation).
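As a rough sketch of how you might gauge how severe the autocorrelation is (and therefore how aggressively to thin), again using coda and assuming my.samples is an mcmc.list of saved draws:

    library(coda)
    # Autocorrelation of each monitored parameter at several lags
    autocorr.diag(my.samples)
    # Effective sample size; if this is much smaller than the number of saved
    # iterations, the draws are strongly autocorrelated
    effectiveSize(my.samples)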

Adaptation: The MCMC samplers that JAGS uses to sample the posterior are governed by tunable parameters that affect their precise behavior. Proper tuning of these parameters can produce gains in speed or in de-correlation of the sampling. JAGS contains machinery to tune these parameters automatically, and does so as it draws posterior samples. This process is called adaptation, but it is non-Markovian; the resulting samples do not constitute a Markov chain. Therefore, burn-in must be performed separately after adaptation. It is incorrect to substitute the adaptation period for the burn-in. However, sometimes only a relatively short burn-in is necessary post-adaptation.
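If JAGS warns that adaptation is incomplete after the n.adapt iterations, one option (sketched here under the assumption that your compiled model object is called my.model; the return-value behavior of adapt() is as I understand it from the rjags documentation) is to run additional adaptation iterations before burning in:

    library(rjags)
    # Run 1,000 more adaptation iterations; adapt() is assumed to return TRUE
    # once the samplers report that adaptation is complete
    ok <- adapt(my.model, n.iter = 1000, end.adaptation = FALSE)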

The syntax: Let's look at a highly specific example (the code in the OP doesn't actually show where parameters like n.adapt or thin get used). We'll ask rjags to fit the model in such a way that each step is clear.

    library(rjags)

    n.chains = 3
    n.adapt = 1000
    n.burn = 10000
    n.iter = 20000
    thin = 50

    # x is a list pointing JAGS to where the data are;
    # y is a vector or function giving initial values
    my.model <- jags.model("mymodel.txt", data = x, inits = y,
                           n.chains = n.chains, n.adapt = n.adapt)

    # Burn-in
    update(my.model, n.burn)

    # params is a character vector of parameters for which to set trace monitors
    # (i.e. we want posterior inference on these parameters)
    my.samples <- coda.samples(my.model, params, n.iter = n.iter, thin = thin)

jags.model() builds the directed acyclic graph and then performs the adaptation phase for the number of iterations given by n.adapt. update() performs the burn-in on each chain by running the MCMC for n.burn iterations without saving any of the posterior samples (skip this step if you want to examine the full chains and discard a burn-in period post-hoc). coda.samples() (an rjags function that returns the samples as coda-format mcmc.list objects) runs each MCMC chain for the number of iterations specified by n.iter, but it does not save every iteration. Instead, it saves only every nth iteration, where n is given by thin. Again, if you want to determine your thinning interval post-hoc, there is no need to thin at this stage. One advantage of thinning at this stage is that the coda syntax makes it simple to do; you don't have to understand the structure of the MCMC object returned by coda.samples() and thin it yourself. The bigger advantage of thinning at this stage is realized if n.iter is very large. For example, if the autocorrelation is really bad, you might run 2 million iterations and save only every thousandth (thin=1000). If you didn't thin at this stage, you (and your RAM) would need to manipulate an object with three chains of 2 million numbers each. But by thinning as you go, the final object has only 2 thousand numbers in each chain.
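If you instead decide to thin post-hoc, a minimal sketch using coda's window() (assuming my.samples was saved without thinning and that keeping every 50th draw is enough to tame the autocorrelation) could be:

    library(coda)
    # Keep every 50th saved iteration of each chain
    thinned <- window(my.samples, thin = 50)
    summary(thinned)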

