# Ajay Jasra

## Contact Details

Name: Ajay Jasra

## Pubs By Year

## Pub Categories

- Statistics - Computation (36)
- Statistics - Methodology (13)
- Mathematics - Statistics (9)
- Statistics - Theory (9)
- Mathematics - Probability (5)
- Mathematics - Numerical Analysis (3)
- Mathematics - Dynamical Systems (2)
- Statistics - Applications (1)
- Mathematics - Optimization and Control (1)

## Publications Authored By Ajay Jasra

In this article we develop a new sequential Monte Carlo (SMC) method for multilevel (ML) Monte Carlo estimation. In particular, the method can be used to estimate expectations with respect to a target probability distribution over an infinite-dimensional and non-compact space as given, for example, by a Bayesian inverse problem with Gaussian random field prior. Under suitable assumptions the MLSMC method has the optimal $O(\epsilon^{-2})$ bound on the cost to obtain a mean-square error of $O(\epsilon^2)$. Read More
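The multilevel telescoping decomposition behind MLSMC-type estimators can be sketched in a simpler plain Monte Carlo setting. The sketch below is a generic multilevel Monte Carlo estimator, not the paper's MLSMC construction: the test SDE (an Ornstein-Uhlenbeck process), the level schedule $2^l$, and the fixed per-level sample size are illustrative assumptions.

```python
import math
import random

def euler_endpoint(increments):
    """Euler-Maruyama endpoint for dX = -X dt + dW on [0, 1], X_0 = 1,
    driven by the supplied Brownian increments (a hypothetical test SDE)."""
    dt = 1.0 / len(increments)
    x = 1.0
    for dw in increments:
        x += -x * dt + dw
    return x

def mlmc_estimate(L, N, rng):
    """Multilevel telescoping estimator of E[X_1^2]:
    E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}],
    where level l uses 2^l Euler steps and each increment term couples
    the fine and coarse paths by sharing the Brownian increments."""
    estimate = 0.0
    for l in range(L + 1):
        acc = 0.0
        for _ in range(N):
            n_fine = 2 ** l
            dt = 1.0 / n_fine
            incs = [rng.gauss(0.0, math.sqrt(dt)) for _ in range(n_fine)]
            p_fine = euler_endpoint(incs) ** 2
            if l == 0:
                acc += p_fine
            else:
                # coarse path: sum consecutive pairs of fine increments
                coarse = [incs[2 * i] + incs[2 * i + 1] for i in range(n_fine // 2)]
                acc += p_fine - euler_endpoint(coarse) ** 2
        estimate += acc / N
    return estimate
```

Coupling fine and coarse levels through shared increments makes the variance of the increment terms decay with $l$; that decay is the source of the optimal $O(\epsilon^{-2})$ cost in the multilevel setting.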

In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed and it is shown under some assumptions that for a given level of mean square error, this method for ABC has a lower cost than i. Read More
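For context, the basic ABC mechanism that such multilevel methods accelerate replaces likelihood evaluation with simulation and an accept/reject step. A minimal rejection-ABC sketch on a hypothetical Gaussian toy model follows; the prior, tolerance, and scalar summary are illustrative assumptions, not from the article.

```python
import random

def abc_rejection(y_obs, prior_sample, simulate, eps, n_accept, rng):
    """Rejection ABC: draw theta from the prior, simulate pseudo-data from
    the model, and keep theta whenever the simulated data lands within
    tolerance eps of the observation. Kept draws target the ABC posterior."""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - y_obs) <= eps:
            accepted.append(theta)
    return accepted

# Toy example (hypothetical): y ~ N(theta, 1) with a Uniform(-5, 5) prior.
rng = random.Random(0)
draws = abc_rejection(
    y_obs=1.0,
    prior_sample=lambda r: r.uniform(-5.0, 5.0),
    simulate=lambda theta, r: r.gauss(theta, 1.0),
    eps=0.2,
    n_accept=200,
    rng=rng,
)
posterior_mean = sum(draws) / len(draws)  # sits near the observation
```

Shrinking `eps` sharpens the approximation but lowers the acceptance rate, which is exactly the cost trade-off that multilevel strategies target.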

We introduce a new class of Monte Carlo based approximations of expectations of random variables whose laws are not available directly, but only through certain discretizations. Sampling from the discretized versions of these laws can typically introduce a bias. In this paper, we show how to remove that bias, by introducing a new version of multi-index Monte Carlo (MIMC) that has the added advantage of reducing the computational effort, relative to i. Read More

In this paper we consider filtering and smoothing of partially observed chaotic dynamical systems that are discretely observed, with an additive Gaussian noise in the observation. These models are found in a wide variety of real applications and include the Lorenz '96 model. In the context of a fixed observation interval $T$, observation frequency $h$ and Gaussian observation variance $\sigma_Z^2$, we show under assumptions that the filter and smoother are well approximated by a Gaussian when $\sigma^2_Z h$ is sufficiently small. Read More

In this article we consider static Bayesian parameter estimation for partially observed diffusions that are discretely observed. We work under the assumption that one must resort to discretizing the underlying diffusion process, for instance using the Euler-Maruyama method. Given this assumption, we show how one can use Markov chain Monte Carlo (MCMC) and particularly particle MCMC [Andrieu, C. Read More

In this article we consider importance sampling (IS) and sequential Monte Carlo (SMC) methods in the context of 1-dimensional random walks with absorbing barriers. In particular, we develop a very precise variance analysis for several IS and SMC procedures. We take advantage of some explicit spectral formulae available for these models to derive sharp and explicit estimates; this provides stability properties of the associated normalized Feynman-Kac semigroups. Read More
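A minimal version of importance sampling for this kind of absorption problem can be sketched as follows; the barrier level, biased proposal probability, and symmetric target walk are illustrative assumptions, not the procedures analysed in the article.

```python
import random

def is_hit_prob(a, p_sim, n_samples, rng):
    """Importance-sampling estimate of P(hit +a before -a) for the symmetric
    simple random walk started at 0. Paths are simulated under a biased
    up-probability p_sim and reweighted by the likelihood ratio
    (0.5 / p_sim)^{#up} * (0.5 / (1 - p_sim))^{#down}. By symmetry the
    true value is 1/2, which serves as a sanity check."""
    total = 0.0
    for _ in range(n_samples):
        x, weight = 0, 1.0
        while -a < x < a:
            if rng.random() < p_sim:
                x += 1
                weight *= 0.5 / p_sim
            else:
                x -= 1
                weight *= 0.5 / (1.0 - p_sim)
        if x == a:  # absorbed at the upper barrier
            total += weight
    return total / n_samples
```

Biasing the proposal toward the barrier of interest concentrates simulation effort on the event being estimated; the likelihood-ratio weights keep the estimator unbiased.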

This article presents results on the concentration properties of the smoothing and filtering distributions of some partially observed chaotic dynamical systems. We show that, rather surprisingly, for the geometric model of the Lorenz equations, as well as some other chaotic dynamical systems, the smoothing and filtering distributions do not concentrate around the true position of the signal, as the number of observations tends to infinity. Instead, under various assumptions on the observation noise, we show that the expected value of the diameter of the support of the smoothing and filtering distributions remains lower bounded by a constant times the standard deviation of the noise, independently of the number of observations. Read More

Pricing options is an important problem in financial engineering. In many scenarios of practical interest, the prices of financial options associated to an underlying asset reduce to computing an expectation w.r. Read More

Particle filters are a powerful and flexible tool for performing inference on state-space models. They involve a collection of samples evolving over time through a combination of sampling and re-sampling steps. The re-sampling step is necessary to ensure that weight degeneracy is avoided. Read More
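The resampling step described above can be sketched as plain multinomial resampling, one of several standard schemes (the article's methods are not reduced to this):

```python
import random

def multinomial_resample(particles, weights, rng):
    """One resampling step: draw N particles with replacement, with
    probability proportional to the importance weights, then reset all
    weights to 1/N. Low-weight particles are discarded and high-weight
    ones duplicated, which is what keeps weight degeneracy in check."""
    n = len(particles)
    resampled = rng.choices(particles, weights=weights, k=n)
    return resampled, [1.0 / n] * n
```

Lower-variance alternatives such as systematic or residual resampling follow the same interface, replacing only how the indices are drawn.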

In this article we introduce two new estimates of the normalizing constant (or marginal likelihood) for partially observed diffusion (POD) processes, with discrete observations. One estimate is biased but non-negative and the other is unbiased but not almost surely non-negative. Our method uses the multilevel particle filter of Jasra et al (2015). Read More

We propose using the multiplicative (or Chung-Lu random graph) model as a prior on graphs for Bayesian inference of Gaussian graphical models (GGMs). In the multiplicative model, each edge is chosen independently with probability equal to the product of the connectivities of the end nodes. This class of prior is parsimonious yet highly flexible; it can be used to encourage sparsity or graphs with a pre-specified degree distribution when such prior knowledge is available. Read More
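Sampling a graph from the multiplicative model as described is direct; the sketch below assumes connectivities already scaled to lie in [0, 1] so that each pairwise product is a valid edge probability.

```python
import random

def sample_multiplicative_graph(connectivity, rng):
    """Draw an undirected graph from the multiplicative (Chung-Lu) model:
    edge {i, j} is included independently with probability
    connectivity[i] * connectivity[j]."""
    n = len(connectivity)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < connectivity[i] * connectivity[j]:
                edges.add((i, j))
    return edges
```

The expected degree of node $i$ is approximately its connectivity times the sum of the others', so choosing small connectivities encourages sparsity while a heavy-tailed connectivity vector yields a heavy-tailed degree distribution.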

This paper considers uncertainty quantification for an elliptic nonlocal equation. In particular, it is assumed that the parameters which define the kernel in the nonlocal operator are uncertain and a priori distributed according to a probability measure. It is shown that the induced probability measure on some quantities of interest arising from functionals of the solution to the equation with random inputs is well-defined; as is the posterior distribution on parameters given observations. Read More

This article considers the sequential Monte Carlo (SMC) approximation of ratios of normalizing constants associated to posterior distributions which in principle rely on continuum models. Therefore, the Monte Carlo estimation error and the discrete approximation error must be balanced. A multilevel strategy is utilized to substantially reduce the cost to obtain a given error level in the approximation as compared to standard estimators. Read More

Flexible regression methods where interest centres on the way that the whole distribution of a response vector changes with covariates are very useful in some applications. A recently developed technique in this regard uses the matrix-variate Dirichlet process as a prior for a mixing distribution on a coefficient in a multivariate linear regression model. The method is attractive, particularly in the multivariate setting, for the convenient way that it allows for borrowing strength across different component regressions and for its computational simplicity and tractability. Read More

In this paper the filtering of partially observed diffusions, with discrete-time observations, is considered. It is assumed that only biased approximations of the diffusion can be obtained, for choice of an accuracy parameter indexed by $l$. A multilevel estimator is proposed, consisting of a telescopic sum of increment estimators associated to the successive levels. Read More

In this paper, we provide bounds on the asymptotic variance for a class of sequential Monte Carlo (SMC) samplers designed for approximating multimodal distributions. Such methods combine standard SMC methods and Markov chain Monte Carlo (MCMC) kernels. Our bounds improve upon previous results, and unlike some earlier work, they also apply in the case when the MCMC kernels can move between the modes. Read More

We observe an undirected graph $G$ without multiple edges and self-loops, which is to represent a protein-protein interaction (PPI) network. We assume that $G$ evolved under the duplication-mutation with complementarity (DMC) model from a seed graph, $G_0$, and we also observe the binary forest $\Gamma$ that represents the duplication history of $G$. A posterior density for the DMC model parameters is established, and we outline a sampling strategy by which one can perform Bayesian inference; that sampling strategy employs a particle marginal Metropolis-Hastings (PMMH) algorithm. Read More

In this article we consider the approximation of expectations w.r.t. Read More

We consider Bayesian online static parameter estimation for state-space models. This is an important problem, but it is computationally challenging, as state-of-the-art exact methods often have a computational cost that grows with the time parameter; perhaps the most successful algorithm is that of SMC2 [9]. We present a version of the SMC2 algorithm whose computational cost does not grow with the time parameter. Read More

In this article we consider a Bayesian inverse problem associated to elliptic partial differential equations (PDEs) in two and three dimensions. This class of inverse problems is important in applications such as hydrology, but the complexity of the link function between unknown field and measurements can make it difficult to draw inference from the associated posterior. We prove that for this inverse problem a basic SMC method has a Monte Carlo rate of convergence with constants which are independent of the dimension of the discretization of the problem; indeed convergence of the SMC method is established in a function space setting. Read More

We consider the numerical approximation of the filtering problem in high dimensions, that is, when the hidden state lies in $\mathbb{R}^d$ with $d$ large. For low dimensional problems, one of the most popular numerical procedures for consistent inference is the class of approximations termed particle filters or sequential Monte Carlo methods. However, in high dimensions, standard particle filters (e. Read More

This article provides a new theory for the analysis of forward and backward particle approximations of Feynman-Kac models. Such formulae are found in a wide variety of applications and their numerical (particle) approximations are required due to their intractability. Under mild assumptions, we provide sharp and non-asymptotic first order expansions of these particle methods, potentially on path space and for possibly unbounded functions. Read More

The objective of this article is to study the asymptotic behavior of a new particle filtering approach in the context of hidden Markov models (HMMs). In particular, we develop an algorithm where the latent-state sequence is segmented into multiple shorter portions, with an estimation technique based upon a separate particle filter in each portion. The partitioning facilitates the use of parallel processing. Read More

We observe $n$ sequences at each of $m$ sites, and assume that they have evolved from an ancestral sequence that forms the root of a binary tree of known topology and branch lengths, but the sequence states at internal nodes are unknown. The topology of the tree and branch lengths are the same for all sites, but the parameters of the evolutionary model can vary over sites. We assume a piecewise constant model for these parameters, with an unknown number of change-points and hence a trans-dimensional parameter space over which we seek to perform Bayesian inference. Read More

In the following article we consider approximate Bayesian computation (ABC) for certain classes of time series models. In particular, we focus upon scenarios where the likelihoods of the observations and parameter are intractable, by which we mean that one cannot evaluate the likelihood even up to a positive unbiased estimate. This paper reviews and develops a class of approximation procedures based upon the idea of ABC, but, specifically maintains the probabilistic structure of the original statistical model. Read More

In the following article we consider the time-stability associated to the sequential Monte Carlo (SMC) estimate of the backward interpretation of Feynman-Kac Formulae. This is particularly of interest in the context of performing smoothing for hidden Markov models (HMMs). We prove a central limit theorem (CLT) under weaker assumptions than adopted in the literature. Read More

We propose sequential Monte Carlo based algorithms for maximum likelihood estimation of the static parameters in hidden Markov models with an intractable likelihood using ideas from approximate Bayesian computation. The static parameter estimation algorithms are gradient based and cover both offline and online estimation. We demonstrate their performance by estimating the parameters of three intractable models, namely the alpha-stable distribution, g-and-k distribution, and the stochastic volatility model with alpha-stable returns, using both real and synthetic data. Read More

**Category:** Statistics - Methodology

This work focuses on sampling from hidden Markov models (Cappe et al, 2005) whose observations have intractable density functions. We develop a new sequential Monte Carlo (Doucet et al, 2000 and Gordon et al, 1993) algorithm and a new particle marginal Metropolis-Hastings (Andrieu et al, 2010) algorithm for these purposes. We build from Jasra et al (2013) and Whiteley et al (2013) to construct the sequential Monte Carlo (SMC) algorithm (which we call the alive twisted particle filter). Read More

We consider the inverse problem of estimating the initial condition of a partial differential equation, which is only observed through noisy measurements at discrete time intervals. In particular, we focus on the case where Eulerian measurements are obtained from the time and space evolving vector field, whose evolution obeys the two-dimensional Navier-Stokes equations defined on a torus. This context is particularly relevant to the area of numerical weather forecasting and data assimilation. Read More

In several implementations of Sequential Monte Carlo (SMC) methods it is natural, and important in terms of algorithmic efficiency, to exploit the information of the history of the samples to optimally tune their subsequent propagations. In this article we provide a carefully formulated asymptotic theory for a class of such \emph{adaptive} SMC methods. The theoretical framework developed here will cover, under assumptions, several commonly used SMC algorithms. Read More

In the following article we provide an exposition of exact computational methods to perform parameter inference from partially observed network models. In particular, we consider the duplication attachment (DA) model which has a likelihood function that typically cannot be evaluated in any reasonable computational time. We consider a number of importance sampling (IS) and sequential Monte Carlo (SMC) methods for approximating the likelihood of the network model for a fixed parameter value. Read More

We develop a fast variational approximation scheme for Gaussian process (GP) regression, where the spectrum of the covariance function is subjected to a sparse approximation. Our approach enables uncertainty in covariance function hyperparameters to be treated without using Monte Carlo methods and is robust to overfitting. Our article makes three contributions. Read More

We consider the computation of the permanent of a binary $n \times n$ matrix. It is well-known that exact computation is a #P-complete problem. A variety of Markov chain Monte Carlo (MCMC) computational algorithms have been introduced in the literature whose cost, in order to achieve a given level of accuracy, is $O(n^7 \log^4(n))$. Read More
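To see why approximation is needed at all: the best-known exact method, Ryser's inclusion-exclusion formula, already requires exponential time. A minimal sketch of that standard algorithm (not the MCMC methods discussed here):

```python
def permanent(m):
    """Exact permanent via Ryser's inclusion-exclusion formula:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
    (-1)^{|S|} * prod_i (sum_{j in S} a_ij).
    Cost is O(2^n * n^2), illustrating why exact computation of this
    #P-complete quantity is hopeless beyond small n."""
    n = len(m)
    total = 0
    for s in range(1, 1 << n):  # nonempty column subsets, as bitmasks
        prod = 1
        for i in range(n):
            prod *= sum(m[i][j] for j in range(n) if s >> j & 1)
        total += (-1) ** bin(s).count("1") * prod
    return (-1) ** n * total
```

For a 0/1 matrix the permanent counts perfect matchings in the associated bipartite graph, which is the quantity the MCMC schemes approximate in polynomial time.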

**Category:** Statistics - Computation

In the following article we develop a particle filter for approximating Feynman-Kac models with indicator potentials. Examples of such models include approximate Bayesian computation (ABC) posteriors associated with hidden Markov models (HMMs) or rare-event problems. Such models require the use of advanced particle filter or Markov chain Monte Carlo (MCMC) algorithms e. Read More

In the following article we consider approximate Bayesian parameter inference for observation driven time series models. Such statistical models appear in a wide variety of applications, including econometrics and applied mathematics. This article considers the scenario where the likelihood function cannot be evaluated point-wise; in such cases, one cannot perform exact statistical inference, including parameter estimation, which often requires advanced computational algorithms, such as Markov chain Monte Carlo (MCMC). Read More

The simulation of genealogical trees backwards in time, from observations up to the most recent common ancestor (MRCA), is hindered by the fact that, while approaching the root of the tree, coalescent events become rarer, with a corresponding increase in computation time. The recently proposed "Time Machine" tackles this issue by stopping the simulation of the tree before reaching the MRCA and correcting for the induced bias. We present a computationally efficient implementation of this approach that exploits multithreading. Read More

In this article we propose an improvement on the sequential updating and greedy search (SUGS) algorithm of Wang and Dunson for fast fitting of Dirichlet process mixture models. The SUGS algorithm provides a means for very fast approximate Bayesian inference for mixture data which is particularly of use when data sets are so large that many standard Markov chain Monte Carlo (MCMC) algorithms cannot be applied efficiently, or take a prohibitively long time to converge. In particular, these ideas are used to initially interrogate the data, and to refine models such that one can potentially apply exact data analysis later on. Read More

In this article we focus on Maximum Likelihood estimation (MLE) for the static parameters of hidden Markov models (HMMs). We will consider the case where one cannot or does not want to compute the conditional likelihood density of the observation given the hidden state because of increased computational complexity or analytical intractability. Instead we will assume that one may obtain samples from this conditional likelihood and hence use approximate Bayesian computation (ABC) approximations of the original HMM. Read More

In this note we introduce an estimate for the marginal likelihood associated to hidden Markov models (HMMs) using sequential Monte Carlo (SMC) approximations of the generalized two-filter smoothing decomposition (Briers, 2010). This estimate is shown to be unbiased and a central limit theorem (CLT) is established. This latter CLT also allows one to prove a CLT associated to estimates of expectations w. Read More

We consider a method for approximate inference in hidden Markov models (HMMs). The method circumvents the need to evaluate conditional densities of observations given the hidden states. It may be considered an instance of Approximate Bayesian Computation (ABC) and it involves the introduction of auxiliary variables valued in the same space as the observations. Read More

Motivated by a challenging problem in financial trading we are presented with a mixture of regressions with variable selection problem. In this regard, one is faced with data which possess outliers, skewness and, simultaneously, due to the nature of financial trading, one would like to be able to construct clusters with specific predictors that are fairly sparse. We develop a Bayesian mixture of lasso regressions with $t$-errors to reflect these specific demands. Read More

Sequential Monte Carlo (SMC) methods are a class of techniques to sample approximately from any sequence of probability distributions using a combination of importance sampling and resampling steps. This paper is concerned with the convergence analysis of a class of SMC methods where the times at which resampling occurs are computed online using criteria such as the effective sample size. This is a popular approach amongst practitioners but there are very few convergence results available for these methods. Read More
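The effective sample size criterion mentioned above is simple to state; a minimal sketch follows (the threshold $N/2$ noted in the comment is a common convention, not one prescribed by the paper):

```python
def effective_sample_size(weights):
    """Effective sample size ESS = 1 / sum(w_i^2) for normalized weights.
    ESS ranges from 1 (all mass on one particle) to N (uniform weights);
    adaptive SMC implementations typically resample online whenever ESS
    drops below a threshold such as N / 2."""
    total = sum(weights)
    normalized = [w / total for w in weights]
    return 1.0 / sum(w * w for w in normalized)
```

Because the resampling times depend on the realized weights, they are themselves random, which is precisely what complicates the convergence analysis of these adaptive schemes.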

Cluster analysis of biological samples using gene expression measurements is a common task which aids the discovery of heterogeneous biological sub-populations having distinct mRNA profiles. Several model-based clustering algorithms have been proposed in which the distribution of gene expression values within each sub-group is assumed to be Gaussian. In the presence of noise and extreme observations, a mixture of Gaussian densities may over-fit and overestimate the true number of clusters. Read More

This paper presents a simulation-based framework for sequential inference from partially and discretely observed point process (PP) models with static parameters. Taking on a Bayesian perspective for the static parameters, we build upon sequential Monte Carlo (SMC) methods, investigating the problems of performing sequential filtering and smoothing in complex examples, where current methods often fail. We consider various approaches for approximating posterior distributions using SMC. Read More

In this article we consider Bayesian parameter inference associated to partially-observed stochastic processes that start from a set $B_0$ and are stopped or killed at the first hitting time of a known set $A$. Such processes occur naturally within the context of a wide variety of applications. The associated posterior distributions are highly complex and posterior parameter inference requires the use of advanced Markov chain Monte Carlo (MCMC) techniques. Read More

This report is a collection of comments on the Read Paper of Fearnhead and Prangle (2011), to appear in the Journal of the Royal Statistical Society Series B, along with a reply from the authors. Read More

In a recent paper, Beskos et al (2011), the Sequential Monte Carlo (SMC) sampler introduced in Del Moral et al (2006), Neal (2001) has been shown to be asymptotically stable in the dimension of the state space $d$ at a cost that is only polynomial in $d$, when $N$, the number of Monte Carlo samples, is fixed. More precisely, it has been established that the effective sample size (ESS) of the ensuing (approximate) sample and the Monte Carlo error of fixed dimensional marginals will converge as $d$ grows, with a computational cost of $\mathcal{O}(Nd^2)$. In the present work, further results on SMC methods in high dimensions are provided as $d\to\infty$ and with $N$ fixed. Read More

This article establishes sufficient conditions for a linear-in-time bound on the non-asymptotic variance of particle approximations of time-homogeneous Feynman-Kac formulae. These formulae appear in a wide variety of applications including option pricing in finance and risk sensitive control in engineering. In direct Monte Carlo approximation of these formulae, the non-asymptotic variance typically increases at an exponential rate in the time parameter. Read More

Let $\mathscr{P}(E)$ be the space of probability measures on a measurable space $(E,\mathcal{E})$. In this paper we introduce a class of nonlinear Markov chain Monte Carlo (MCMC) methods for simulating from a probability measure $\pi\in\mathscr{P}(E)$. Nonlinear Markov kernels (see [Feynman--Kac Formulae: Genealogical and Interacting Particle Systems with Applications (2004) Springer]) $K:\mathscr{P}(E)\times E\rightarrow\mathscr{P}(E)$ can be constructed to, in some sense, improve over MCMC methods. Read More

Approximate Bayesian computation (ABC) is a popular technique for approximating likelihoods and is often used in parameter estimation when the likelihood functions are analytically intractable. Although the use of ABC is widespread in many fields, there has been little investigation of the theoretical properties of the resulting estimators. In this paper we give a theoretical analysis of the asymptotic properties of ABC based maximum likelihood parameter estimation for hidden Markov models. Read More