mashr - Multivariate Adaptive Shrinkage
Implements the multivariate adaptive shrinkage (mash) method of Urbut et al. (2019) <DOI:10.1038/s41588-018-0268-8> for estimating and testing large numbers of effects in many conditions (or many outcomes). Mash takes an empirical Bayes approach to testing and effect estimation: it estimates patterns of similarity among conditions, then exploits these patterns to improve the accuracy of the effect estimates. The core linear algebra is implemented in C++ for fast model fitting and posterior computation.
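The empirical Bayes idea behind mash can be sketched for a single effect: an observed effect vector is shrunk under a mixture of multivariate normal priors whose covariance matrices encode patterns of sharing across conditions. A minimal NumPy illustration with made-up covariance patterns and weights (not the package's C++ implementation or its API):

```python
import numpy as np
from scipy.stats import multivariate_normal

def posterior_mean(bhat, V, Us, pis):
    """Posterior mean of one multivariate effect under a mixture prior:
    b ~ sum_k pi_k N(0, U_k), and bhat | b ~ N(b, V)."""
    # marginal likelihood of bhat under each mixture component
    liks = np.array([multivariate_normal.pdf(bhat, mean=np.zeros(len(bhat)), cov=U + V)
                     for U in Us])
    w = pis * liks
    w /= w.sum()                       # posterior mixture weights (responsibilities)
    # per-component posterior means: U (U + V)^{-1} bhat
    means = [U @ np.linalg.solve(U + V, bhat) for U in Us]
    return sum(wk * mk for wk, mk in zip(w, means))

# made-up example: two conditions, two candidate covariance patterns
V  = np.eye(2) * 0.5                   # sampling covariance of bhat
Us = [np.eye(2), np.ones((2, 2))]      # "independent" and "fully shared" patterns
pis = np.array([0.5, 0.5])
print(posterior_mean(np.array([1.0, 0.9]), V, Us, pis))
```

Because the "shared" pattern fits the nearly equal observed effects well, the posterior mean is pulled toward a common value as well as toward zero.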
Last updated 5 months ago
openblas, gsl, cpp, openmp
11.04 score 91 stars 3 dependents 624 scripts 513 downloads

mixsqp - Sequential Quadratic Programming for Fast Maximum-Likelihood Estimation of Mixture Proportions
Provides an optimization method based on sequential quadratic programming (SQP) for maximum likelihood estimation of the mixture proportions in a finite mixture model where the component densities are known. The algorithm is expected to obtain solutions at least as accurate as the state-of-the-art MOSEK interior-point solver (called by function "KWDual" in the 'REBayes' package), and to arrive at them more quickly when the number of samples is large and the number of mixture components is not too large. The package implements the "mix-SQP" algorithm, with some improvements, described in Y. Kim, P. Carbonetto, M. Stephens & M. Anitescu (2020) <DOI:10.1080/10618600.2019.1689985>.
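The underlying problem is maximizing sum_i log(L[i,] @ pi) over the probability simplex, where L holds the known component densities evaluated at each sample. A plain EM sketch of that same problem in Python (EM is shown only as a simple baseline; mix-SQP itself solves it with a sequential quadratic programming method):

```python
import numpy as np

def mix_em(L, n_iter=200):
    """Maximize sum_i log(L[i] @ pi) over the simplex by EM.
    L[i, k] = density of sample i under known component k.
    (Plain EM for illustration; mix-SQP uses an SQP solver instead.)"""
    n, k = L.shape
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        P = L * pi                          # E-step: unnormalized responsibilities
        P /= P.sum(axis=1, keepdims=True)   # normalize per sample
        pi = P.mean(axis=0)                 # M-step: update mixture proportions
    return pi

# made-up example: data drawn mostly from the first of two known components
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 800), rng.normal(3, 1, 200)])
L = np.column_stack([np.exp(-0.5 * (x - m) ** 2) for m in (0.0, 3.0)])
print(mix_em(L))   # proportions near [0.8, 0.2]
```

The constant normalization factor of the normal density cancels in the responsibilities, so it is omitted from L.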
Last updated 1 year ago
openblas, cpp
8.68 score 11 stars 23 dependents 87 scripts 4.8k downloads

ebnm - Solve the Empirical Bayes Normal Means Problem
Provides simple, fast, and stable functions to fit the normal means model using empirical Bayes. For available models and details, see function ebnm(). A detailed introduction to the package is provided by Willwerscheid and Stephens (2023) <arXiv:2110.00152>.
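For intuition, the normal means model with a N(0, a^2) prior has a closed-form posterior mean once a^2 is estimated from the marginal distribution of the observations. A Python sketch of that one special case, using moment matching in place of maximum-likelihood fitting (ebnm itself supports many more prior families):

```python
import numpy as np

def eb_normal_means(x, s):
    """Empirical Bayes shrinkage for the normal means model with a
    N(0, a^2) prior, a simple special case of what ebnm handles.
    Model: x[j] ~ N(theta[j], s^2), theta[j] ~ N(0, a^2)."""
    # marginally x[j] ~ N(0, a^2 + s^2), so estimate a^2 by moment matching
    a2 = max(np.mean(x ** 2) - s ** 2, 0.0)
    shrink = a2 / (a2 + s ** 2)        # posterior mean multiplier
    return shrink * x

# made-up example: 1000 true means with noise of known standard deviation 1
rng = np.random.default_rng(0)
theta = rng.normal(0, 2, 1000)         # true means
x = theta + rng.normal(0, 1, 1000)     # noisy observations
est = eb_normal_means(x, 1.0)
```

The shrunken estimates have lower mean squared error than the raw observations whenever the prior variance is finite.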
Last updated 10 months ago
8.15 score 12 stars 1 dependent 145 scripts 268 downloads

varbvs - Large-Scale Bayesian Variable Selection Using Variational Methods
Fast algorithms for fitting Bayesian variable selection models and computing Bayes factors, in which the outcome (or response variable) is modeled using a linear regression or a logistic regression. The algorithms are based on the variational approximations described in "Scalable variational inference for Bayesian variable selection in regression, and its accuracy in genetic association studies" (P. Carbonetto & M. Stephens, 2012, <DOI:10.1214/12-BA703>). This software has been applied to large data sets with over a million variables and thousands of samples.
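The Bayes factor for including a single variable with a normal effect prior has a closed form, which gives a feel for the quantities involved (a hypothetical single-variable simplification for illustration, not the package's variational algorithm or its API):

```python
import numpy as np

def log_bf(x, y, sigma2=1.0, sa=1.0):
    """Log Bayes factor for including one variable in a no-intercept
    linear regression y ~ N(x*b, sigma2*I) with effect prior
    b ~ N(0, sigma2*sa), versus the null model b = 0."""
    xx, xy = x @ x, x @ y
    s2 = sigma2 / (xx + 1.0 / sa)        # posterior variance of the effect
    mu = s2 * xy / sigma2                # posterior mean of the effect
    return 0.5 * np.log(s2 / (sigma2 * sa)) + mu ** 2 / (2 * s2)

# made-up example: one variable with a real effect, one pure noise
rng = np.random.default_rng(2)
n = 500
x_signal = rng.normal(size=n)
x_null = rng.normal(size=n)
y = 0.5 * x_signal + rng.normal(size=n)
print(log_bf(x_signal, y), log_bf(x_null, y))
```

The signal variable yields a large positive log Bayes factor, while the noise variable's stays near or below zero.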
Last updated 2 years ago
cpp
4.85 score 2 dependents 146 scripts 263 downloads