Compute convergence and efficiency diagnostics from high-dimensional MCMC outputs (mcmc.list, mcmc, or numeric matrices) using a block-wise, vectorized backend while preserving the classic return layout of compute_diag_from_mcmc.

Usage

compute_diag_from_mcmc_alt(samples, runtime_s)

Arguments

samples

An mcmc.list, mcmc, or numeric matrix of samples. Chains must share at least one common parameter name.
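The harmonization step described under Details can be illustrated with plain matrices. This is a hedged sketch, not the package implementation; `chain1` and `chain2` are made-up inputs:

```r
# Sketch of chain harmonization (illustrative only): truncate to the
# shortest common length and align to the shared parameter names.
set.seed(1)
chain1 <- matrix(rnorm(300), nrow = 100, dimnames = list(NULL, c("a", "b", "c")))
chain2 <- matrix(rnorm(240), nrow =  80, dimnames = list(NULL, c("b", "c", "d")))

common <- intersect(colnames(chain1), colnames(chain2))   # "b", "c"
n_keep <- min(nrow(chain1), nrow(chain2))                 # 80 iterations

harmonized <- lapply(list(chain1, chain2),
                     function(ch) ch[seq_len(n_keep), common, drop = FALSE])
```

After this step every chain has the same dimensions, which is what makes the block-wise vectorized diagnostics possible.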

runtime_s

Numeric scalar; total runtime in seconds associated with the MCMC run(s), assumed to be the wall-clock time for the full set of chains.

Value

A data.frame with one row per parameter and the classic columns expected by compute_diag_from_mcmc:

target

Parameter name (column name in the MCMC object).

ESS

Effective sample size used for efficiency summaries (worst-chain ESS).

AE_ESS_per_it

Algorithmic efficiency: ESS / n_iter, where n_iter is the common (post-truncation) number of iterations per chain.

ESS_per_sec

Computational efficiency: ESS / runtime_s.

time_s_per_ESS

Seconds per effective sample: runtime_s / ESS.

Family

Top-level node family (substring of target before the first "[").
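For illustration, this split is equivalent to stripping everything from the first "[" onward; names without brackets are kept whole (a sketch of the rule, the package may implement it differently):

```r
# Derive the node family from parameter names: drop "[" and anything after it.
targets <- c("N[1,2]", "logit_p[3]", "sigma")
family  <- sub("\\[.*$", "", targets)
# family is c("N", "logit_p", "sigma")
```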

Details

This function targets models with tens of thousands of monitored parameters (e.g., large hierarchical stock-assessment or life-cycle models such as WGNAS, GEREM, and Scorff LCM), for which naive per-parameter loops can become slow or memory-bound.

Chains are first harmonized by truncating to the shortest common length and aligning to a common set of parameter names. Diagnostics are then computed in column blocks to control memory usage. The ESS column in the output corresponds to a conservative worst-chain ESS (minimum across chains), and the classic efficiency metrics are derived as:

  • AE_ESS_per_it = ESS / n_iter,

  • ESS_per_sec = ESS / runtime_s,

  • time_s_per_ESS = runtime_s / ESS.
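Given per-chain ESS values, the three metrics follow directly. The sketch below uses made-up numbers; `ess_by_chain` is a hypothetical chains-by-parameters matrix, not a package object:

```r
# Toy per-chain ESS values for two chains and two parameters (assumed inputs).
ess_by_chain <- rbind(chain1 = c("N[1]" = 900, "N[2]" = 450),
                      chain2 = c("N[1]" = 800, "N[2]" = 500))
n_iter    <- 1000   # iterations per chain after truncation
runtime_s <- 120    # total wall-clock seconds for all chains

ESS            <- apply(ess_by_chain, 2, min)  # conservative worst-chain ESS
AE_ESS_per_it  <- ESS / n_iter
ESS_per_sec    <- ESS / runtime_s
time_s_per_ESS <- runtime_s / ESS

data.frame(target = names(ESS), ESS, AE_ESS_per_it, ESS_per_sec,
           time_s_per_ESS, row.names = NULL)
```

Taking the minimum across chains means a single poorly mixing chain drags the reported efficiency down, which is the intended conservative behavior.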

The goal is backward compatibility with downstream consumers of compute_diag_from_mcmc (plots, summaries, bottleneck detectors), while scaling to very large parameter vectors via a vectorized backend.

See also

compute_diag_from_mcmc, effectiveSize, Vehtari et al. (2021), Bayesian Analysis 16(2):667–718.