Using the metric regularity of infinite-dimensional set-valued mappings, we derive stability criteria for saddle points of nonsmooth optimization problems in Hilbert spaces. A main ingredient is an explicit pointwise characterization of the regular coderivative of the subdifferential of convex integral functionals. We apply these stability results to study the convergence of an extension of the Chambolle--Pock primal-dual algorithm to nonsmooth optimization problems involving nonlinear operators between function spaces. We show, both theoretically and numerically, the applicability of an accelerated variant of the algorithm to parameter identification problems for an elliptic partial differential equation with non-differentiable data-fitting terms. Convergence is guaranteed if either the data is finite-dimensional (with the unknown still in a function space) or a small amount of Moreau--Yosida regularization is applied.
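For orientation, the following is a minimal sketch of one common form of such a primal-dual iteration for a saddle-point reformulation of $\min_{x} G(x) + F(K(x))$ with a nonlinear operator $K$; the step sizes $\tau,\sigma$, the over-relaxation step, and the update order shown here are illustrative assumptions rather than the precise scheme analyzed in the paper.
\begin{align*}
x^{k+1} &:= \operatorname{prox}_{\tau G}\bigl(x^{k} - \tau\,[\nabla K(x^{k})]^{*} y^{k}\bigr), && \text{(primal proximal step, linearized adjoint)}\\
\bar x^{k+1} &:= 2x^{k+1} - x^{k}, && \text{(over-relaxation)}\\
y^{k+1} &:= \operatorname{prox}_{\sigma F^{*}}\bigl(y^{k} + \sigma\,K(\bar x^{k+1})\bigr). && \text{(dual proximal step)}
\end{align*}
When $K$ is linear, $\nabla K(x^{k})^{*} = K^{*}$ and this reduces to the standard Chambolle--Pock iteration.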