We study the complexity of the entire regularization path for least squares regression with a 1-norm penalty, known as the Lasso. Every regression parameter of the Lasso changes piecewise linearly as a function of the regularization value, and the number of these changes, i.e., the kinks of the path, is regarded as the Lasso's complexity. Experimental results using exact path following exhibit polynomial complexity of the Lasso in the problem size. Alas, the path complexity of the Lasso on artificially designed regression problems is exponential. We use smoothed analysis as a mechanism for bridging the gap between worst-case settings and the de facto low complexity. Our analysis assumes that the observed data has a tiny amount of intrinsic noise. Under this assumption, we prove that the Lasso's complexity is polynomial in the problem size.
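
As an illustration only (not part of the paper), the exact path following mentioned above can be reproduced with the LARS algorithm; the sketch below uses scikit-learn's lars_path on a randomly generated regression problem with small additive noise and counts the breakpoints of the resulting piecewise-linear path. The problem sizes and noise level are arbitrary choices for demonstration.

# Illustrative sketch: count breakpoints of the exact Lasso path via LARS.
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
n, p = 100, 50                               # problem size (samples, features)
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = rng.standard_normal(5)            # sparse ground-truth coefficients
y = X @ beta + 0.1 * rng.standard_normal(n)  # observations with small intrinsic noise

# lars_path with method="lasso" follows the regularization path exactly;
# `alphas` lists the regularization values at which the active set, and hence
# the slope of the piecewise-linear coefficient path, changes.
alphas, active, coefs = lars_path(X, y, method="lasso")

# Number of kinks = number of linear segments of the path.
print(f"path breakpoints: {len(alphas) - 1} for n={n}, p={p}")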