We are very grateful for the highly positive and valuable feedback from the reviewers. In our submission, we presented a new strategy for k-nearest neighbour search that completely sidesteps the curse of dimensionality by avoiding space partitioning. While R1 did not feel sufficiently familiar with the area to review our submission, R2 possesses deep expertise in the area and described our submission as “very significant”, “very interesting”, “very well written” and “extremely promising”, and R3 characterized it as “conceptually simple and elegant”.

We deeply appreciate R2’s insightful feedback and pointers to related work that we inadvertently missed, which make the advantages of the proposed method over existing methods more apparent and further strengthen our central argument against space partitioning. Specifically, the suggestion to characterize the running time of the proposed method in terms of the intrinsic dimension of the dataset enables direct theoretical comparisons to several existing methods: whereas the running times of cover trees and various flavours of randomized partition trees depend exponentially on the intrinsic dimension d', the proposed method reduces this dependence to sublinear (in fact, it is bounded). We will include this comparison and the references mentioned by R2 in the final manuscript. We also thank R2 for the feedback on the clarity of the presentation and will incorporate the suggested changes in the final manuscript.

We highly appreciate R3 for pointing out the mistake in Section 2, which we will correct in the final manuscript. We also thank R3 for the pointer to related work and the suggestions on the presentation, all of which we will incorporate in the final manuscript.