We provide an asymptotic theory for penalized least squares estimators of locally constant functions with finitely many jumps that are blurred by an operator and by random noise. Differences from the direct case are highlighted; in particular, it turns out that a sqrt(n) rate of convergence for estimating the jump locations is generic in the inverse case. Moreover, the jump locations are jointly asymptotically normal, which allows one to construct confidence regions for the graph of a function with a finite number of jumps. A minimax argument shows that the penalized least squares estimators are rate optimal; if the kernel has a singularity, this rate can be improved. Applications from biophysics and materials science are discussed.
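The setting can be illustrated with a minimal simulation sketch (not the paper's method or notation; the kernel, noise level, and fitting routine below are illustrative assumptions): a step function is blurred by a convolution operator, corrupted by noise, and the jump location and levels are recovered by least squares over candidate step functions, with a penalty on the number of jumps deciding between the zero-jump and one-jump fits.

```python
import numpy as np

# Hypothetical example: observe Y_i = (K f)(x_i) + noise, where f is a
# step function and K is a blurring (convolution) operator. Estimate f
# by penalized least squares: RSS + gamma * (#jumps).
rng = np.random.default_rng(0)
n = 100
x = np.linspace(0.0, 1.0, n)

# True locally constant function with one jump at 0.5.
f_true = np.where(x < 0.5, 0.0, 1.0)

# Blur with a simple moving-average kernel (stand-in for the operator).
width = 9
kernel = np.ones(width) / width
y = np.convolve(f_true, kernel, mode="same") + 0.05 * rng.standard_normal(n)

def blur(v):
    # Forward operator applied to a candidate piece indicator.
    return np.convolve(v, kernel, mode="same")

def fit_zero_jump(y):
    # Best constant fit under the blurred model.
    d = blur(np.ones(n))
    c = (d @ y) / (d @ d)
    return np.sum((y - c * d) ** 2)

def fit_one_jump(y):
    # For each candidate jump index, fit the two levels by least squares
    # on the blurred indicators of the two pieces; keep the best RSS.
    best = None
    for j in range(5, n - 5):
        A = np.column_stack([blur((x < x[j]).astype(float)),
                             blur((x >= x[j]).astype(float))])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = np.sum((y - A @ coef) ** 2)
        if best is None or rss < best[0]:
            best = (rss, x[j], coef)
    return best

gamma = 0.5  # illustrative jump penalty
rss0 = fit_zero_jump(y)
rss1, jump_hat, levels = fit_one_jump(y)
n_jumps = 1 if rss1 + gamma < rss0 else 0
print(n_jumps, jump_hat, levels)
```

Estimating the jump location here amounts to locating the minimum of the residual sum of squares over the candidate grid; the abstract's point is that in this blurred (inverse) setting the accuracy of that location estimate is generically of order sqrt(n), in contrast to the faster rates attainable in the direct observation model.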