Recently, several local and global linear convergence rate results for Douglas–Rachford splitting have appeared in the literature. Many of these are derived under strong monotonicity, Lipschitz continuity, and/or cocoercivity assumptions, and most focus on the convex optimization setting and the dual application of Douglas–Rachford splitting, i.e., the alternating direction method of multipliers. In the monotone inclusion setting for Douglas–Rachford splitting, Lions and Mercier showed a linear convergence rate bound under the assumption that one of the two operators is strongly monotone and Lipschitz continuous. In this talk, we show that this rate is not sharp and present a sharp contraction factor for the Douglas–Rachford operator under the stated assumptions. We also present contraction factors, sharp in many cases, for the Douglas–Rachford operator under different combinations of strong monotonicity, Lipschitz continuity, and cocoercivity assumptions. When these stronger properties are split between the two operators, our results rely on the introduced notions of negatively averaged operators and negatively linearly regular operators, meaning that the negation of the operator is averaged or linearly regular, respectively.
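
For reference, a minimal sketch of the standard setting behind these statements, assuming the monotone inclusion $0 \in Ax + Bx$ for maximally monotone operators $A$ and $B$ on a Hilbert space; the step size $\gamma > 0$ and the symbols $J_{\gamma A}$, $R_{\gamma A}$, $\mu$, $L$, and $\beta$ below follow common usage and are not taken from the talk itself:

% Resolvent, reflected resolvent, and the Douglas–Rachford operator
\[
  J_{\gamma A} := (I + \gamma A)^{-1}, \qquad
  R_{\gamma A} := 2 J_{\gamma A} - I, \qquad
  T_{\mathrm{DR}} := \tfrac{1}{2}\bigl(I + R_{\gamma B} R_{\gamma A}\bigr).
\]
% The regularity properties combined in the talk, for an operator C and all x, y
\begin{align*}
  \text{$\mu$-strong monotonicity:} \quad & \langle Cx - Cy,\, x - y \rangle \ge \mu \, \|x - y\|^2,\\
  \text{$L$-Lipschitz continuity:} \quad & \|Cx - Cy\| \le L \, \|x - y\|,\\
  \text{$\beta$-cocoercivity:} \quad & \langle Cx - Cy,\, x - y \rangle \ge \beta \, \|Cx - Cy\|^2.
\end{align*}

In this setting, if $T_{\mathrm{DR}}$ is a contraction with factor $c < 1$, the fixed-point iteration $z^{k+1} = T_{\mathrm{DR}} z^k$ converges linearly at rate $c$, and $x^k = J_{\gamma A} z^k$ converges to a solution of the inclusion; the contraction factors discussed in the talk are bounds on $c$ under the various assumption combinations.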