From the flashes of fireflies to Josephson junctions and power infrastructure, networks of coupled phase oscillators provide a powerful framework for describing synchronization phenomena in many natural and engineered systems. Most real-world networks are subject to noisy, random inputs, which can inhibit synchronization. While noise is unavoidable, here we show that there exist optimal noise patterns that minimize desynchronizing effects and can even enhance order. Specifically, using analytical arguments we show that in a two-oscillator model there is a sharp transition from a regime where the optimal synchrony-enhancing noise is perfectly anti-correlated to one where the optimal noise is correlated. More generally, we then use numerical optimization methods to demonstrate that anti-correlated noise patterns can optimally enhance synchronization in large complex oscillator networks. Our results may have implications for real-world networks such as power grids and neuronal networks, which are subject to significant amounts of correlated input noise.
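The two-oscillator setting described above can be explored numerically. Below is a minimal sketch, not the paper's actual model or parameters: two coupled Kuramoto-type phase oscillators integrated with the Euler-Maruyama scheme, driven by Gaussian noise whose cross-correlation `c` can be tuned from perfectly anti-correlated (`c = -1`) to perfectly correlated (`c = +1`). The function names, coupling strength, frequency detuning, and noise amplitude are all illustrative assumptions.

```python
import numpy as np

def simulate_pair(K=1.0, domega=0.1, sigma=0.5, c=-1.0,
                  dt=0.01, steps=20000, seed=0):
    """Euler-Maruyama integration of two coupled phase oscillators
    driven by noise with cross-correlation c in [-1, 1].
    Returns the time-averaged Kuramoto order parameter r.
    All parameter values are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(2)
    omega = np.array([+domega / 2, -domega / 2])  # frequency detuning
    r_sum = 0.0
    for _ in range(steps):
        # Pairwise sine coupling: K * sin(theta_j - theta_i)
        coupling = K * np.sin(theta[::-1] - theta)
        # Build two unit-variance noise increments with correlation c
        z = rng.standard_normal(2)
        xi = np.array([z[0], c * z[0] + np.sqrt(1.0 - c**2) * z[1]])
        theta = theta + (omega + coupling) * dt + sigma * np.sqrt(dt) * xi
        # Kuramoto order parameter r = |mean of unit phasors|
        r_sum += np.abs(np.mean(np.exp(1j * theta)))
    return r_sum / steps

# Compare phase coherence under anti-correlated vs. correlated input noise
r_anti = simulate_pair(c=-1.0)
r_corr = simulate_pair(c=+1.0)
print(f"r (anti-correlated): {r_anti:.3f}, r (correlated): {r_corr:.3f}")
```

Varying `c`, `K`, and `sigma` in such a sandbox is one way to probe the transition between the anti-correlated and correlated optimal-noise regimes that the analytical arguments identify; the sketch itself makes no claim about where that transition lies.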