Jiecao Chen


Pub Categories

Computer Science - Data Structures and Algorithms (6)
Computer Science - Learning (1)

Publications Authored By Jiecao Chen

Clustering large datasets is a fundamental problem with many applications in machine learning. Data is often collected on different sites, and clustering then needs to be performed in a distributed manner with low communication. We would like the quality of the clustering in the distributed setting to match that in the centralized setting, in which all the data resides on a single site.

In this paper we study the extraction of representative elements in the data stream model in the form of submodular maximization. Unlike previous work on streaming submodular maximization, we are interested only in the recent data, and study the maximization problem over sliding windows. We provide a general reduction from the sliding-window model to the standard streaming model; our approach therefore works for general constraints as long as there is a corresponding algorithm in the standard streaming model.
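A reduction of this shape needs a standard streaming algorithm to plug in. Below is a minimal single-pass threshold rule for monotone submodular maximization under a cardinality constraint, illustrated on a set-coverage objective; the objective, threshold, and stream are illustrative assumptions, not the paper's algorithm.

```python
# Single-pass threshold rule for monotone submodular maximization under a
# cardinality constraint k, shown on a set-coverage function (illustrative).

def coverage(selected):
    """f(S) = number of distinct items covered by the chosen sets."""
    covered = set()
    for s in selected:
        covered |= s
    return len(covered)

def threshold_stream(stream, k, tau):
    """Keep an arriving element iff its marginal gain is at least tau."""
    S = []
    for e in stream:
        if len(S) == k:
            break
        if coverage(S + [e]) - coverage(S) >= tau:
            S.append(e)
    return S

stream = [{1, 2}, {2, 3}, {3}, {4, 5, 6}, {1, 6}]
S = threshold_stream(stream, k=2, tau=2)  # keeps {1,2} and {4,5,6}
```

A full streaming algorithm would run several thresholds in parallel to guess the optimum; a single fixed threshold suffices to show the one-pass structure that the sliding-window reduction builds on.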

Linear sketching algorithms have been widely used for processing large-scale distributed and streaming datasets. Their popularity is largely due to the fact that linear sketches can be naturally composed in the distributed model and efficiently updated in the streaming model. The errors of linear sketches are typically expressed in terms of the sum of the coordinates of the input vector excluding the largest ones, i.e., the mass on the tail of the vector.
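As a concrete illustration of why linearity gives composability, here is a minimal Count-Min-style sketch: updates are additive, and two sketches built with the same hash functions merge by entrywise table addition, exactly the property exploited in the distributed model. The hashing scheme and parameters are illustrative, and this is not the construction studied in the paper.

```python
import numpy as np

class CountMin:
    """Minimal Count-Min sketch (a linear sketch); parameters illustrative."""

    def __init__(self, width=64, depth=4, seed=0):
        rng = np.random.default_rng(seed)
        self.width, self.depth = width, depth
        self.salts = rng.integers(1, 2**31 - 1, size=depth)  # per-row hashing
        self.table = np.zeros((depth, width), dtype=np.int64)

    def _cols(self, x):
        return [(int(self.salts[r]) * hash((r, x))) % self.width
                for r in range(self.depth)]

    def update(self, x, delta=1):
        for r, c in enumerate(self._cols(x)):
            self.table[r, c] += delta

    def query(self, x):
        return min(self.table[r, c] for r, c in enumerate(self._cols(x)))

    def merge(self, other):
        self.table += other.table  # linearity: sketches add entrywise

# Two sites sketch locally; a coordinator merges by addition.
site_a, site_b = CountMin(), CountMin()   # same seed => same hash functions
for _ in range(5):
    site_a.update("a")
site_b.update("a")
site_b.update("b")
site_a.merge(site_b)
est = site_a.query("a")  # >= true count 6, inflated only by hash collisions
```

Because each cell sums true counts plus collisions, the merged estimate upper-bounds the true frequency, with error controlled by the mass of the remaining coordinates.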

We undertake a systematic study of sketching a quadratic form: given an $n \times n$ matrix $A$, create a succinct sketch $\textbf{sk}(A)$ which can produce (without further access to $A$) a multiplicative $(1+\epsilon)$-approximation to $x^T A x$ for any desired query $x \in \mathbb{R}^n$. While a general matrix does not admit non-trivial sketches, positive semi-definite (PSD) matrices admit sketches of size $\Theta(\epsilon^{-2} n)$, via the Johnson-Lindenstrauss lemma, achieving the "for each" guarantee, namely, for each query $x$, with a constant probability the sketch succeeds. (For the stronger "for all" guarantee, where the sketch succeeds for all $x$'s simultaneously, again there are no non-trivial sketches.)
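The JL-based "for each" sketch can be spelled out directly: factor a PSD $A$ as $B^T B$, and store $SB$ for a random Gaussian $S$ with $O(\epsilon^{-2})$ rows, so that $\|SBx\|^2$ estimates $x^T A x = \|Bx\|^2$ for each fixed query. The dimensions and seed below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 30, 400                       # m ~ eps^-2 rows (eps ~ 0.1 here)

M = rng.standard_normal((n, n))
A = M.T @ M                          # a PSD matrix
B = np.linalg.cholesky(A).T          # A = B^T B
S = rng.standard_normal((m, n)) / np.sqrt(m)
sk = S @ B                           # the sketch: m x n instead of n x n

x = rng.standard_normal(n)
est = np.linalg.norm(sk @ x) ** 2    # JL estimate of x^T A x
true = x @ A @ x
```

This is the standard JL argument behind the $\Theta(\epsilon^{-2} n)$ "for each" upper bound mentioned above; the failure probability is over the draw of $S$, per query.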

We study the problem of compressing a weighted graph $G$ on $n$ vertices, building a "sketch" $H$ of $G$, so that given any vector $x \in \mathbb{R}^n$, the value $x^T L_G x$ can be approximated up to a multiplicative $1+\epsilon$ factor from only $H$ and $x$, where $L_G$ denotes the Laplacian of $G$. One solution to this problem is to build a spectral sparsifier $H$ of $G$, which, using the result of Batson, Spielman, and Srivastava, consists of $O(n \epsilon^{-2})$ reweighted edges of $G$ and has the property that simultaneously for all $x \in \mathbb{R}^n$, $x^T L_H x = (1 \pm \epsilon) x^T L_G x$. The $O(n \epsilon^{-2})$ bound is optimal for spectral sparsifiers.
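For intuition, the quantity being preserved expands over the edges: $x^T L_G x = \sum_{(u,v) \in E} w_{uv}(x_u - x_v)^2$, which is why keeping a reweighted subset of edges can preserve the quadratic form. A minimal check of this identity on an illustrative three-vertex graph:

```python
import numpy as np

edges = [(0, 1, 2.0), (1, 2, 1.0), (0, 2, 0.5)]  # (u, v, weight); illustrative
n = 3

def laplacian(edges, n):
    """Build L_G: degrees on the diagonal, minus weights off-diagonal."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    return L

def quad_form(edges, x):
    """x^T L_G x written as a sum over edges."""
    return sum(w * (x[u] - x[v]) ** 2 for u, v, w in edges)

x = np.array([1.0, 0.0, -1.0])
L = laplacian(edges, n)
```

A spectral sparsifier keeps this edge sum within $1 \pm \epsilon$ for every $x$ simultaneously, using only $O(n\epsilon^{-2})$ reweighted edges.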

Modern data management systems often need to deal with massive, dynamic, and inherently distributed data sources. We collect the data using a distributed network, and at the same time try to maintain a global view of the data at a central coordinator using a minimal amount of communication. Such applications are captured by the distributed monitoring model, which has attracted considerable attention in recent years.
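As a toy instance of the model (not the paper's algorithm): suppose the coordinator tracks the total number of events seen across the sites, and each site reports its local counter only when it doubles. The coordinator's sum $E$ of last-reported values then satisfies $E \le T < 2E$ for the true total $T$, with only $O(\log n)$ messages per site. All class and method names here are illustrative.

```python
# Toy distributed-monitoring protocol: report on doubling (illustrative).

class Coordinator:
    def __init__(self):
        self.latest = {}      # last value reported by each site
        self.messages = 0

    def report(self, site_id, value):
        self.latest[site_id] = value
        self.messages += 1

    def estimate(self):
        # E <= true total < 2E, since each site is at most a factor 2 ahead
        return sum(self.latest.values())

class Site:
    def __init__(self, coordinator, site_id):
        self.coord, self.id = coordinator, site_id
        self.count, self.last_reported = 0, 0

    def increment(self):
        self.count += 1
        if self.last_reported == 0 or self.count >= 2 * self.last_reported:
            self.last_reported = self.count
            self.coord.report(self.id, self.count)

coord = Coordinator()
site = Site(coord, 0)
for _ in range(100):
    site.increment()          # triggers reports at counts 1, 2, 4, ..., 64
```

The tension this model studies is exactly the one in the trade shown here: a coarser reporting rule saves messages but loosens the coordinator's view, and vice versa.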