Keywords: [ Computational Learning Theory ] [ Learning Theory ]

Abstract:
In many practical applications, heuristic or approximation algorithms
are used to efficiently solve the task at hand. However, their
solutions frequently fail to satisfy natural monotonicity properties
that the optimal solution is expected to have. In this work, we develop algorithms that
restore monotonicity in the parameters of interest.
Specifically, given oracle access to a possibly non-monotone function,
we provide an algorithm that restores monotonicity while
degrading the expected value of the function by at most $\epsilon$. The
number of queries required is at most logarithmic in $1/\epsilon$ and
exponential in the number of parameters. We also give a lower bound
showing that this exponential dependence is necessary.
Finally, we obtain improved query complexity bounds for restoring the
weaker property of $k$-marginal monotonicity. Under this weaker notion, only every
$k$-dimensional projection of the function is required to be
monotone. The resulting query complexity scales exponentially only in $k$ and polynomially in the number of parameters.
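To make the notion of restoring monotonicity concrete, the sketch below monotonizes an explicitly given function on a finite grid by replacing its value at each point with the maximum over all coordinatewise-dominated grid points. This is only an illustration of the property being restored, written under the assumption of full access to the grid values; it is not the query-efficient algorithm described above, which works from a small number of oracle queries.

import numpy as np

def monotone_upper_envelope(f_values: np.ndarray) -> np.ndarray:
    # Returns g(x) = max_{y <= x} f(y), the smallest coordinatewise-monotone
    # function dominating f on the grid. Illustration only: this reads every
    # grid value, unlike the query-efficient setting in the abstract.
    g = f_values.copy()
    # Taking a cumulative maximum along each axis in turn yields, at every
    # grid point x, the maximum of f over all points y with y <= x.
    for axis in range(g.ndim):
        g = np.maximum.accumulate(g, axis=axis)
    return g

# Example: a 2-D grid with a non-monotone dip at the center.
f = np.array([[0.0, 0.2, 0.1],
              [0.3, 0.25, 0.5],
              [0.4, 0.6, 0.55]])
g = monotone_upper_envelope(f)
# g is monotone along both coordinates.
assert all(g[i, j] <= g[i + 1, j] for i in range(2) for j in range(3))
assert all(g[i, j] <= g[i, j + 1] for i in range(3) for j in range(2))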