The ability to compress observational data and accurately estimate physical parameters relies heavily on informative summary statistics. In this paper, we introduce the use of mutual information (MI) as a means of evaluating the quality of summary statistics in inference tasks. MI can assess the sufficiency of summaries and provides a quantitative basis for comparison. We show that several commonly adopted metrics for comparing statistics can be viewed as MI estimators operating under different assumptions. Based on this, we propose to estimate MI using the Barber-Agakov lower bound with normalizing-flow-based variational distributions. To demonstrate the effectiveness of our method, we compare three summary statistics (namely the power spectrum, bispectrum, and scattering transform) in the context of inferring reionization parameters from mock SKA images. We find that this approach correctly assesses the informativeness of the different summary statistics and allows us to select the optimal statistic for our inference task.
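As a minimal illustration of the Barber-Agakov bound used above, the sketch below estimates MI on a toy linear-Gaussian problem, where a conditional Gaussian stands in for the normalizing-flow variational distribution (for jointly Gaussian variables the optimal Gaussian q makes the bound tight, so the estimate can be checked against the analytic MI). All variable names and the toy data-generating model are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

# Toy model (illustrative assumption): parameter theta ~ N(0, 1),
# "summary statistic" x = theta + Gaussian noise with std 0.5.
rng = np.random.default_rng(0)
n = 200_000
theta = rng.normal(0.0, 1.0, n)
x = theta + rng.normal(0.0, 0.5, n)

# Variational distribution q(theta | x) = N(a*x + b, s^2), fitted by
# least squares -- a stand-in for the flow-based q in the paper.
A = np.column_stack([x, np.ones(n)])
(a, b), *_ = np.linalg.lstsq(A, theta, rcond=None)
resid = theta - (a * x + b)
s2 = resid.var()
log_q = -0.5 * np.log(2 * np.pi * s2) - resid**2 / (2 * s2)

# Barber-Agakov lower bound: I(theta; x) >= H(theta) + E[log q(theta|x)].
# The prior entropy H(theta) is known analytically in simulation settings.
H_theta = 0.5 * np.log(2 * np.pi * np.e * 1.0)
ba_bound = H_theta + log_q.mean()

# Analytic MI for this linear-Gaussian toy: 0.5 * log(1 + var_theta/var_noise).
true_mi = 0.5 * np.log(1 + 1.0 / 0.25)
print(f"BA lower bound: {ba_bound:.4f} nats, analytic MI: {true_mi:.4f} nats")
```

With a sufficiently expressive q (e.g. a normalizing flow), the bound tightens toward the true MI; here the Gaussian q is already optimal, so the two values should agree up to Monte Carlo error.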