Dissipative systems often exhibit wavelength-dependent loss rates. One prominent example is Rydberg polaritons formed by electromagnetically induced transparency, which have long been a leading candidate for studying the physics of interacting photons and also hold promise as a platform for quantum information. In this system, dissipation takes the form of quantum diffusion, i.e., a loss rate proportional to $k^2$ (with $k$ the wavevector) that vanishes at long wavelengths as $k \to 0$. Here, we show that one-dimensional condensates subject to this type of loss are unstable to long-wavelength density fluctuations in an unusual manner: after a prolonged period in which the condensate appears to relax to a uniform state, local depleted regions form rapidly and spread ballistically throughout the system. We connect this behavior to the leading-order equation for the nearly uniform condensate, a dispersive analog of the Kardar-Parisi-Zhang equation, which develops singularities in finite time. Furthermore, we show that the wavefronts of the depleted regions are described by purely dissipative solitons within a pair of hydrodynamic equations, with no counterpart in lossless condensates. We close by discussing conditions under which such singularities and the resulting solitons can be physically realized.
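For orientation, a minimal mean-field sketch of such $k^2$-dependent loss (illustrative only, not the specific model analyzed here; $m$, $\eta$, and $g$ are generic placeholder parameters) is a Gross-Pitaevskii-type equation whose kinetic term acquires an anti-Hermitian, diffusion-like part:
$$
i\,\partial_t \psi(x,t) = -\frac{1}{2m}\bigl(1 - i\eta\bigr)\,\partial_x^2 \psi + g\,|\psi|^2\psi ,
\qquad
\Gamma_k = \frac{\eta\,k^2}{2m} \xrightarrow[\;k \to 0\;]{} 0 ,
$$
so a plane-wave component $\psi \propto e^{ikx}$ decays at a rate $\Gamma_k \propto k^2$ that vanishes in the long-wavelength limit, which is the sense in which the dissipation acts as quantum diffusion.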