Ankit Garg

Pub Categories

Computer Science - Computational Complexity (6)
Quantum Physics (4)
Computer Science - Information Theory (3)
Computer Science - Learning (3)
Mathematics - Information Theory (3)
Computer Science - Data Structures and Algorithms (2)
Mathematics - Algebraic Geometry (1)
Statistics - Machine Learning (1)
Mathematics - Commutative Algebra (1)
Computer Science - Computational Geometry (1)
Computer Science - Operating Systems (1)
Computer Science - Architecture (1)
Mathematics - Combinatorics (1)
Mathematics - Probability (1)
Mathematics - Classical Analysis and ODEs (1)

Publications Authored By Ankit Garg

We prove a Chernoff-type bound for sums of matrix-valued random variables sampled via a random walk on an expander, confirming a conjecture of Wigderson and Xiao up to logarithmic factors in the deviation parameter. Our proof is based on a recent multi-matrix extension of the Golden-Thompson inequality due to Sutter et al. \cite{Sutter2017}, as well as an adaptation of an argument for the scalar case due to Healy \cite{healy08}.
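
For orientation, a hedged sketch of the shape of such a bound, stated here for independent samples (the precise constants, and the dependence on the spectral gap of the expander in the random-walk setting, are left unspecified): if $X_1, \dots, X_k$ are i.i.d. $d \times d$ Hermitian random matrices with $\mathbb{E}[X_i] = 0$ and $\|X_i\| \le 1$, then
$$\Pr\left[\left\|\frac{1}{k}\sum_{i=1}^{k} X_i\right\| \ge \epsilon\right] \;\le\; d \cdot e^{-\Omega(\epsilon^2 k)}.$$
The result above establishes a bound of this form when the $X_i$ are instead read off the vertices visited by a random walk on an expander graph.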

Checkpoint-restart is now a mature technology. It allows a user to save and later restore the state of a running process. The new plugin model for the upcoming version 3.

One of the best lower bound methods for the quantum communication complexity of a function H (with or without shared entanglement) is the logarithm of the approximate rank of the communication matrix of H. This measure is essentially equivalent to the approximate gamma_2 norm and generalized discrepancy, and subsumes several other lower bounds. All known lower bounds on quantum communication complexity in the general unbounded-round model can be shown via the logarithm of approximate rank, and it was an open problem to give any separation at all between quantum communication complexity and the logarithm of the approximate rank.
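
For context, the lower bound in question has roughly the following shape (a hedged paraphrase, with constants and the precise approximation parameter suppressed):
$$Q^*(H) \;\ge\; \Omega\big(\log \widetilde{\mathrm{rank}}(M_H)\big),$$
where $Q^*(H)$ is the (entanglement-assisted) quantum communication complexity of $H$, $M_H$ is its communication matrix, and $\widetilde{\mathrm{rank}}(M_H)$ is the minimum rank of a real matrix that is entrywise close to $M_H$.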

Design matrices are sparse matrices in which the supports of different columns intersect in a few positions. Such matrices come up naturally when studying problems involving point sets with many collinear triples. In this work we consider design matrices with block (or matrix) entries.
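
For concreteness, the scalar notion being generalized is commonly parametrized as follows (a hedged restatement, not necessarily the paper's exact definition): an $m \times n$ matrix is a $(q, k, t)$-design matrix if every row has at most $q$ nonzero entries, every column has at least $k$ nonzero entries, and the supports of every two distinct columns intersect in at most $t$ rows. In the block variant considered here, scalar entries are replaced by matrix blocks and supports are measured at the block level.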

The celebrated Brascamp-Lieb (BL) inequalities (and their extensions) are an important mathematical tool, unifying and generalizing numerous inequalities in analysis, convex geometry and information theory. While their structural theory is very well understood, far less is known about computing their main parameters. We give polynomial-time algorithms that decide the feasibility of a BL-datum, compute the optimal BL-constant, and provide a weak separation oracle for the BL-polytope.
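
For reference, a hedged sketch of the standard form of these inequalities: a BL-datum consists of linear maps $B_j : \mathbb{R}^n \to \mathbb{R}^{n_j}$ and exponents $p_j \ge 0$ for $j = 1, \dots, m$, and the associated inequality states that for all non-negative integrable functions $f_j$,
$$\int_{\mathbb{R}^n} \prod_{j=1}^{m} f_j(B_j x)^{p_j} \, dx \;\le\; C \prod_{j=1}^{m} \left( \int_{\mathbb{R}^{n_j}} f_j \right)^{p_j},$$
where the smallest such constant $C$ is the BL-constant of the datum, finite exactly when the datum is feasible.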

Data compression is a fundamental problem in quantum and classical information theory. A typical version of the problem is that the sender Alice receives a (classical or quantum) state from some known ensemble and needs to transmit it to the receiver Bob with average error below some specified bound. We consider the case in which the message can have a variable length and the goal is to minimize its expected length.
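
As a point of comparison, in the classical lossless setting the benchmark for the expected length is the Shannon entropy: for a source $X$, an optimal prefix-free code (e.g. Huffman coding) has expected codeword length satisfying
$$H(X) \;\le\; \mathbb{E}[\ell(X)] \;<\; H(X) + 1.$$
The variable-length problem above asks for the analogous tradeoff between expected length and the allowed average error when the messages are drawn from a classical or quantum ensemble.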

In this paper we present a deterministic polynomial time algorithm for testing if a symbolic matrix in non-commuting variables over $\mathbb{Q}$ is invertible or not. The analogous question for commuting variables is the celebrated polynomial identity testing (PIT) for symbolic determinants. In contrast to the commutative case, which has an efficient probabilistic algorithm, the best previous algorithm for the non-commutative setting required exponential time (whether or not randomization is allowed).
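
A small standard example illustrating the gap between the two settings (given here for orientation; it is not claimed to appear in the paper): the skew-symmetric symbolic matrix
$$A = \begin{pmatrix} 0 & x & y \\ -x & 0 & z \\ -y & -z & 0 \end{pmatrix}$$
has determinant zero whenever $x, y, z$ commute, since every odd-dimensional skew-symmetric matrix is singular, yet it is invertible over the free skew field when the variables do not commute. Deciding on which side of this divide a given symbolic matrix falls is exactly the problem addressed above.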

We study the tradeoff between the statistical error and communication cost of distributed statistical estimation problems in high dimensions. In the distributed sparse Gaussian mean estimation problem, each of the $m$ machines receives $n$ data points from a $d$-dimensional Gaussian distribution with unknown mean $\theta$ which is promised to be $k$-sparse. The machines communicate by message passing and aim to estimate the mean $\theta$.
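
To make the setup concrete, here is a minimal Python sketch of the data model together with a naive baseline in which every machine transmits its full $d$-dimensional local mean and a coordinator averages and hard-thresholds; this only illustrates the problem, not the protocols or lower bounds of the paper, and the parameter values are arbitrary.

    import numpy as np

    # Illustrative parameters: m machines, n samples per machine,
    # dimension d, sparsity k (values are arbitrary, not from the paper).
    m, n, d, k = 20, 50, 1000, 10
    rng = np.random.default_rng(0)

    # Unknown k-sparse mean theta.
    theta = np.zeros(d)
    support = rng.choice(d, size=k, replace=False)
    theta[support] = rng.normal(size=k)

    # Each machine observes n samples from N(theta, I_d) and computes its local mean.
    local_means = np.stack([
        rng.normal(loc=theta, scale=1.0, size=(n, d)).mean(axis=0)
        for _ in range(m)
    ])

    # Naive baseline: every machine sends its whole local mean
    # (about m*d reals of communication); the coordinator averages
    # them and keeps only the k largest coordinates in magnitude.
    avg = local_means.mean(axis=0)
    estimate = np.zeros(d)
    top_k = np.argsort(np.abs(avg))[-k:]
    estimate[top_k] = avg[top_k]

    print("squared error of the naive estimate:", np.sum((estimate - theta) ** 2))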

We prove a near optimal round-communication tradeoff for the two-party quantum communication complexity of disjointness. For protocols with $r$ rounds, we prove a lower bound of $\tilde{\Omega}(n/r + r)$ on the communication required for computing disjointness of input size $n$, which is optimal up to logarithmic factors. The previous best lower bound was $\Omega(n/r^2 + r)$ due to Jain, Radhakrishnan and Sen [JRS03].

We explore the connection between dimensionality and communication cost in distributed learning problems. Specifically, we study the problem of estimating the mean $\vec{\theta}$ of an unknown $d$-dimensional Gaussian distribution in the distributed setting. In this problem, the samples from the unknown distribution are distributed among $m$ different machines.

Most image-search approaches today rely on text-based tags associated with the images; these tags are mostly human-generated and subject to various kinds of errors. The results of a query to the image database can therefore be misleading and may not satisfy the user's requirements. In this work we propose an approach to automate this tagging process, in which image results can be fine-filtered based on a probabilistic tagging mechanism.
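
As a toy illustration of this kind of probabilistic filtering (the tag set, scores, and threshold below are hypothetical, not taken from the paper):

    # Each image carries automatically generated tags with confidence
    # scores in [0, 1]; a query returns the images whose score for the
    # queried tag clears a threshold, ranked by that score.
    images = {
        "img_001.jpg": {"beach": 0.92, "sunset": 0.40, "dog": 0.05},
        "img_002.jpg": {"beach": 0.30, "city": 0.85},
        "img_003.jpg": {"beach": 0.75, "sunset": 0.70},
    }

    def search(tag, threshold=0.5):
        hits = [(name, tags.get(tag, 0.0)) for name, tags in images.items()
                if tags.get(tag, 0.0) >= threshold]
        return sorted(hits, key=lambda pair: pair[1], reverse=True)

    print(search("beach"))  # [('img_001.jpg', 0.92), ('img_003.jpg', 0.75)]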