The quantum relative entropy is a measure of the distinguishability of two quantum states, and it is a unifying concept in quantum information theory: many information measures, such as entropy, conditional entropy, mutual information, and entanglement measures, can be derived from it. As such, there has been broad interest in generalizing the notion in order to further understand its most basic properties, one of which is the data processing inequality. The quantum f-divergence of Petz is one generalization of the quantum relative entropy, and it also leads to other relative entropies, such as the Petz-Rényi relative entropies. In this paper, I introduce the optimized quantum f-divergence as a related generalization of the quantum relative entropy. I prove that it satisfies the data processing inequality, and the method of proof relies upon the operator Jensen inequality, similar to Petz's original approach. Interestingly, the sandwiched Rényi relative entropies are particular examples of the optimized f-divergence. Thus, one benefit of this paper is that there is now a single, unified approach for establishing the data processing inequality for both the Petz-Rényi and sandwiched Rényi relative entropies, for the full range of parameters for which it is known to hold. This paper also discusses other aspects of the optimized f-divergence, such as the classical case, the classical-quantum case, and how to construct optimized f-information measures.
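For context, the standard definitions of the quantities named in the abstract are as follows (this background is not part of the abstract itself). For quantum states rho and sigma:

```latex
% Quantum relative entropy (defined when supp(rho) is contained in supp(sigma)):
\[
D(\rho\|\sigma) = \operatorname{Tr}\bigl[\rho\,(\log\rho - \log\sigma)\bigr].
\]

% Petz-Renyi relative entropy, for suitable alpha:
\[
D_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\,
\log\operatorname{Tr}\bigl[\rho^{\alpha}\,\sigma^{1-\alpha}\bigr].
\]

% Sandwiched Renyi relative entropy, for suitable alpha:
\[
\widetilde{D}_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1}\,
\log\operatorname{Tr}\!\left[\left(
\sigma^{\frac{1-\alpha}{2\alpha}}\,\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}
\right)^{\!\alpha}\right].
\]

% Data processing inequality under a quantum channel N
% (the property established in the paper for the optimized f-divergence):
\[
D\bigl(\mathcal{N}(\rho)\,\|\,\mathcal{N}(\sigma)\bigr) \le D(\rho\|\sigma).
\]
```

Both Rényi families converge to the quantum relative entropy in the limit alpha approaching 1, which is one sense in which these quantities generalize it.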

Publication Source (Journal or Book title)

Journal of Physics A: Mathematical and Theoretical