We optimize a collision-induced cooling process for ultracold atoms in the nondegenerate regime. The process uses a Feshbach resonance, rather than the rf radiation of conventional evaporative cooling, to selectively expel hot atoms from the trap. Using functional minimization, we show analytically that in the optimal cooling process the resonance energy must be tuned so that it linearly follows the temperature. Here, optimal cooling is defined as maximizing the phase-space density after a fixed cooling duration. The analytical results are confirmed by numerical Monte Carlo simulations. To model more realistic experimental conditions, we also include loss processes: background losses do not change our conclusions, while additional nonresonant two-body losses favor a lower initial resonance energy with a nonlinear dependence on temperature.
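
As a rough illustration of the scheme (not the paper's actual model or simulation), the following minimal Monte Carlo sketch expels atom pairs whose relative collision energy falls within a narrow window around a resonance energy tuned as $\varepsilon_\mathrm{res} = \eta\, k_B T$, and tracks the harmonic-trap phase-space density $\propto N/T^3$. The values of $\eta$, the resonance width, the collision rate, and the units are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Illustrative parameters (assumptions, not values from the paper) ---
N0 = 50_000    # initial atom number
T0 = 1.0       # initial temperature (units with k_B = m = 1)
eta = 4.0      # linear tuning law: eps_res = eta * k_B * T (assumed slope)
width = 0.2    # resonance window half-width, in units of k_B * T (assumed)
n_steps = 400  # number of cooling steps

# Sample 3D velocities from a Maxwell-Boltzmann distribution at T0.
v = rng.normal(0.0, np.sqrt(T0), size=(N0, 3))

def temperature(v):
    # Kinetic equipartition: (3/2) k_B T = <m v^2 / 2>, with k_B = m = 1.
    return np.mean(np.sum(v**2, axis=1)) / 3.0

for _ in range(n_steps):
    n = len(v)
    if n < 2:
        break
    T = temperature(v)
    eps_res = eta * T  # resonance energy linearly follows the temperature

    # Sample random collision partners (rate fixed as a fraction of N).
    n_pairs = max(1, n // 10)
    i = rng.integers(0, n, size=n_pairs)
    j = rng.integers(0, n, size=n_pairs)
    ok = i != j
    i, j = i[ok], j[ok]

    # Relative collision energy E_rel = mu * v_rel^2 / 2, with mu = m / 2.
    e_rel = 0.25 * np.sum((v[i] - v[j]) ** 2, axis=1)

    # Pairs on resonance undergo enhanced inelastic loss and are expelled;
    # since eps_res exceeds the mean collision energy, these are hot atoms.
    hit = np.abs(e_rel - eps_res) < width * T
    lost = np.unique(np.concatenate([i[hit], j[hit]]))
    keep = np.ones(n, dtype=bool)
    keep[lost] = False
    v = v[keep]
    if len(v) < 2:
        break

    # Crude stand-in for elastic rethermalization: redraw all velocities
    # at the post-loss temperature.
    v = rng.normal(0.0, np.sqrt(temperature(v)), size=v.shape)

T = temperature(v)
# Harmonic-trap phase-space density scales as N / T^3 (up to constants).
print(f"N: {N0} -> {len(v)},  T: {T0:.3f} -> {T:.3f}")
print(f"phase-space-density gain: {(len(v) / T**3) / (N0 / T0**3):.2f}")
```

Because the removal window scales with $T$, the selectivity of the expulsion is self-similar as the gas cools, which is the qualitative content of the linear tuning law; with the assumed $\eta$ the sketch yields a net gain in $N/T^3$ despite the atom loss.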