Overcoming the influence of noise and imperfections in quantum devices is one of the main challenges for viable quantum applications. In this article, we present a family of protocols, which we denote "superposed quantum error mitigation," that enhance the fidelity of single gates or entire computations by performing them in coherent superposition. Our results demonstrate that these methods achieve significant noise suppression for most kinds of decoherence and for standard experimental parameter regimes. The protocols can be either deterministic, so that the outcome is never postselected, or probabilistic, in which case the resulting state must be discarded unless a well-specified condition is met. With sufficiently many resources and under broad assumptions, our methods can yield the desired output state with unit fidelity. Finally, we analyze our approach in the gate-based, measurement-based, and interferometric models of quantum computation, demonstrating its applicability in all cases and investigating the fundamental mechanisms on which it relies.
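To illustrate the basic mechanism, the following is a minimal numerical sketch (not the paper's exact protocol) of one probabilistic variant: an ancilla prepared in |+⟩ routes the state through two independent copies of a depolarizing channel, and the ancilla is then postselected in |+⟩. The interference of the two noisy branches partially suppresses the error, yielding a higher fidelity than a single pass through the channel. The channel parameters and the choice of input state |0⟩ are illustrative assumptions.

```python
import numpy as np

# Pauli matrices and single-qubit depolarizing Kraus operators.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarizing_kraus(p):
    """Kraus operators of the depolarizing channel with error probability p."""
    return [np.sqrt(1 - p) * I2,
            np.sqrt(p / 3) * X,
            np.sqrt(p / 3) * Y,
            np.sqrt(p / 3) * Z]

def fidelity(psi, rho):
    """Fidelity <psi|rho|psi> of a density matrix with a pure state."""
    return float(np.real(psi.conj() @ rho @ psi))

p = 0.2                                        # illustrative noise strength
psi = np.array([1, 0], dtype=complex)          # system input state |0>
sigma = np.outer(psi, psi.conj())
kraus = depolarizing_kraus(p)

# Unmitigated reference: one pass through the depolarizing channel.
rho_noisy = sum(K @ sigma @ K.conj().T for K in kraus)
f_plain = fidelity(psi, rho_noisy)

# Superposed protocol: with the ancilla in |+> controlling which of two
# independent channel copies acts, projecting the ancilla back onto |+>
# gives the (unnormalized) output
#   sum_{i,j} [(K_i + K_j)/2] sigma [(K_i + K_j)/2]^dag .
rho_out = np.zeros((2, 2), dtype=complex)
for Ki in kraus:
    for Kj in kraus:
        M = 0.5 * (Ki + Kj)
        rho_out += M @ sigma @ M.conj().T
rho_out /= np.trace(rho_out).real              # renormalize after postselection

f_mitigated = fidelity(psi, rho_out)
print(f"plain fidelity:     {f_plain:.4f}")
print(f"mitigated fidelity: {f_mitigated:.4f}")
```

For p = 0.2 the single-pass fidelity is 1 − 2p/3 ≈ 0.867, while the postselected superposed branch yields roughly 0.878: the cross terms between the two branches add a coherent, noise-free component to the output. This toy model only sketches the interference effect the abstract alludes to; the deterministic variants and the unit-fidelity limit require the full constructions of the article.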