We introduce new primal-dual algorithms to minimize the sum of three convex functions, each having its own oracle. Namely, the first one is smooth, with a possibly stochastic gradient oracle; the second is proximable; and the third is the composition of a proximable function with a linear operator. By leveraging variance reduction, we prove convergence to an exact solution with sublinear or linear rates, depending on strong convexity properties. The proposed theory is simple and unified under the umbrella of stochastic Davis-Yin splitting, which we design for the first time in this work. Our theory covers several settings that no existing algorithm tackles; we illustrate their importance with real-world applications, and we demonstrate the efficiency of our algorithms through numerical experiments.
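To make the three-term template concrete, here is a minimal NumPy sketch of the classic deterministic Davis-Yin iteration on a toy instance (the toy problem, variable names, and step size are illustrative assumptions, not taken from the paper; the paper's algorithms further allow a stochastic gradient oracle and a linear operator inside the third term):

```python
import numpy as np

# Toy instance of min_x f(x) + g(x) + h(x):
#   f(x) = 0.5 * ||A x - b||^2        (smooth, gradient oracle)
#   g    = indicator of {x >= 0}      (proximable: projection)
#   h(x) = lam * ||x||_1              (proximable: soft-thresholding)
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))
b = rng.standard_normal(20)
lam = 0.1

def grad_f(x):                       # gradient of the smooth term
    return A.T @ (A @ x - b)

def prox_g(v):                       # projection onto the nonnegative orthant
    return np.maximum(v, 0.0)

def prox_h(v, t):                    # prox of t*lam*||.||_1: soft-thresholding
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad_f
gamma = 1.8 / L                      # step size in (0, 2/L)

# Deterministic Davis-Yin three-operator splitting iteration
z = np.zeros(A.shape[1])
for _ in range(5000):
    x_g = prox_g(z)
    x_h = prox_h(2 * x_g - z - gamma * grad_f(x_g), gamma)
    z = z + x_h - x_g                # fixed-point update

# At a fixed point, x_g = x_h is a solution of the problem.
print(np.linalg.norm(x_h - x_g))
```

The stochastic variants studied in the paper replace `grad_f` with a variance-reduced stochastic estimator while keeping this splitting structure.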