Given $\bx_j = \btheta + \bepsilon_j$, $j=1,\ldots,n$, where $\btheta \in \RR^d$ is an unknown parameter and
$\bepsilon_j$ are i.i.d. Gaussian noise vectors,
we study the estimation of $f(\btheta)$ for a given smooth function $f:\RR^d \rightarrow \RR$ that has an additive structure.
Building on a recent work that introduced an effective bias reduction technique based on iterative bootstrap, we derive
a bias-reduced estimator.
By establishing normal approximation results, we show that the proposed estimator achieves asymptotic normality under a weaker smoothness requirement than is needed for general smooth functions, owing to the additive structure.
These results further imply that the proposed estimator is asymptotically efficient.
We prove matching upper and lower bounds on the mean squared error, which show that the proposed estimator is minimax optimal over the smoothness class considered.
Numerical simulation results validate our analysis and demonstrate the superior performance of the proposed estimator
over the plug-in approach in terms of bias reduction and the construction of confidence~intervals.