Differential privacy is a cryptographically motivated privacy definition that has gained significant attention over the past few years. Differentially private solutions enforce privacy by adding random noise to the data, or to a function computed on the data, and the challenge in designing such algorithms is to optimize the tradeoff between privacy, accuracy, and sample size. This work studies differentially private statistical estimation, and shows upper and lower bounds on the convergence rates of differentially private approximations to statistical estimators. Our results reveal a connection between differential privacy and the notion of B-robustness in robust statistics: unless an estimator is B-robust, it cannot be approximated well under differential privacy over a large class of distributions. We then provide an upper bound on the convergence rate of a differentially private approximation to a B-robust estimator with bounded range, and show that the bounded range condition is necessary if we wish to ensure a strict form of differential privacy.
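As background for the noise-addition approach mentioned above, the following is a minimal sketch of the standard Laplace mechanism applied to a bounded-range mean estimator. It is illustrative only, not the estimator analyzed in this work; the helper names (`laplace_mechanism`, `private_mean`) and the assumption that records lie in a known interval are ours.

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Release `value` with epsilon-differential privacy by adding
    Laplace noise with scale sensitivity / epsilon."""
    rng = rng or np.random.default_rng()
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

def private_mean(data, lower, upper, epsilon, rng=None):
    """Hypothetical epsilon-DP mean estimate, assuming each record lies
    in [lower, upper]; changing one of n records then moves the mean by
    at most (upper - lower) / n, which bounds its sensitivity."""
    n = len(data)
    clipped = np.clip(data, lower, upper)  # enforce the assumed bounds
    sensitivity = (upper - lower) / n
    return laplace_mechanism(clipped.mean(), sensitivity, epsilon, rng)

# The noise scale, and hence the estimation error, shrinks as the sample
# size n grows and grows as the privacy budget epsilon shrinks.
rng = np.random.default_rng(0)
data = rng.normal(loc=0.5, scale=0.1, size=10_000)
print(private_mean(data, lower=0.0, upper=1.0, epsilon=0.5, rng=rng))
```

The bounded range of the estimator plays the same role here as the clipping interval: without it, a single record could move the output arbitrarily far, and no finite noise scale would suffice for strict differential privacy.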