
Construction of all general symmetric informationally complete measurements

Gilad Gour and Amir Kalev

Published 4 August 2014 © 2014 IOP Publishing Ltd
Citation: Gilad Gour and Amir Kalev 2014 J. Phys. A: Math. Theor. 47 335302. DOI: 10.1088/1751-8113/47/33/335302


Abstract

We construct the set of all general (i.e. not necessarily rank 1) symmetric informationally complete (SIC) positive operator valued measures (POVMs). In particular, we show that any orthonormal basis of a real vector space of dimension ${{d}^{2}}-1$ corresponds to some general SIC POVM and vice versa. Our constructed set of all general SIC POVMs contains weak SIC POVMs for which each POVM element can be made arbitrarily close to a multiple of the identity. On the other hand, it remains open whether, for all finite dimensions, our constructed family contains a rank 1 SIC POVM.


1. Introduction

The development of coherent quantum technologies depends on the ability to evaluate how well one can prepare or create a particular quantum state [1]. Such an evaluation can be carried out by making appropriate quantum measurements on a sequence of quantum systems that were prepared in exactly the same way. The goal in quantum tomography is to find efficient quantum measurements whose statistics completely determine the state on which the measurement is carried out. Such quantum measurements are said to be informationally complete [2, 3]. In the framework of quantum mechanics, quantum measurements are represented by positive-operator-valued measures (POVMs) [4]. Informationally complete POVMs have been studied extensively in the last decade [1–11] due to their appeal both from a foundational perspective [6] and from a practical perspective for the purposes of quantum state tomography [7] and quantum key distribution [8].

A particularly attractive subset of informationally complete POVMs are symmetric informationally complete (SIC) POVMs [9–12], for which the operator inner products of all pairs of POVM elements are the same. Most of the literature on SIC POVMs focuses on rank 1 SIC POVMs (i.e. all the POVM elements are proportional to rank 1 projectors). Such rank 1 SIC POVMs have been shown analytically to exist in dimensions $d=1,...,16,19,24,28,35,48$, and numerically for all dimensions $d\leqslant 67$ (see [11] and references therein). However, despite the enormous effort of recent years, it is still not known whether rank 1 SIC POVMs exist in all finite dimensions.

The interest in rank 1 SIC POVMs stems from the fact that for quantum tomography rank 1 SIC POVMs are maximally efficient at determining the state of the system [10]. On the other hand, rank 1 POVMs have a disadvantage as they completely erase the original state of the system being measured. Moreover, if the system being measured is a subsystem of a bigger composite system, then a rank 1 POVM applied to one part of the system will destroy all the correlations (both classical and quantum) with the other parts. Therefore, there is a tradeoff between the efficiency of a measurement (for the purposes of tomography) and non-disturbance (for the purpose of post-processing). For this reason, a 'weaker' version of SIC POVMs is very useful for tomography that is followed by other quantum information processing tasks. Properties of general SIC POVMs were also studied in [13] and more recently in [14].

In this paper we construct the family of all general SIC POVMs and thereby prove that general SIC POVMs exist in all finite dimensions. In particular, we show that the set of all general SIC POVMs is as big as the set of all orthonormal bases of a real vector space of dimension ${{d}^{2}}-1$. With every SIC POVM in the family we associate a parameter a that determines how close the SIC POVM is to a rank 1 SIC POVM. At one extreme value of a, our construction shows that weak SIC POVMs, with the property that all the operators in the POVM are close to (but not equal to) a constant factor times the identity, always exist. At the other extreme, whether rank 1 SIC POVMs exist in all finite dimensions depends on whether the family of general SIC POVMs contains a rank 1 POVM.

This paper is organized as follows. In section 2 we discuss general properties of SIC POVMs. In section 3 we introduce our construction of all SIC POVMs. Then in section 4 we study SIC POVMs constructed from the generalized Gell-Mann matrices. In section 5 we discuss the connection of our general SIC POVMs with rank 1 SIC POVMs. In section 6 we give an example to illustrate our construction for a rank 1 SIC POVM, and we end in section 7 with a discussion.

2. Properties of general SIC POVMs

We denote by ${{{\rm H}}_{d}}$ the set of all d × d Hermitian matrices. ${{{\rm H}}_{d}}$ can be viewed as a ${{d}^{2}}$-dimensional vector space over the real numbers, equipped with the Hilbert–Schmidt inner product $(A,B)={\rm Tr}(AB)$ for $A,B\in {{{\rm H}}_{d}}$.

Definition 1. A set of ${{d}^{2}}$ positive-semidefinite operators ${{\{{{P}_{\alpha }}\}}_{\alpha =1,...,{{d}^{2}}}}$ in ${{{\rm H}}_{d}}$ is called a general SIC POVM if

  • (1)  
    it is a POVM: ${{P}_{\alpha }}\geqslant 0$ and $\sum _{\alpha =1}^{{{d}^{2}}}\;{{P}_{\alpha }}=I$, where I is the d × d identity matrix, and
  • (2)  
    it is symmetric: ${\rm Tr}(P_{\alpha }^{2})={\rm Tr}(P_{\beta }^{2})\ne \frac{1}{{{d}^{3}}}$ for all $\alpha ,\beta \in \{1,2,...,{{d}^{2}}\}$, and ${\rm Tr}({{P}_{\alpha }}{{P}_{\beta }})={\rm Tr}({{P}_{\alpha ^{\prime} }}{{P}_{\beta ^{\prime} }})$ for all $\alpha \ne \beta $ and $\alpha ^{\prime} \ne \beta ^{\prime} $.

Remark. In the definition above we assumed that ${\rm Tr}(P_{\alpha }^{2})\ne \frac{1}{{{d}^{3}}}$ since otherwise (see below) all ${{P}_{\alpha }}=\frac{1}{{{d}^{2}}}I$. Moreover, we will see that conditions (1) and (2) are sufficient to ensure that the set $\{{{P}_{\alpha }}\}$ is a basis for ${{{\rm H}}_{d}}$ (for rank one SIC POVMs this was shown, for example, in [9]). Thus, SIC POVMs, as their name suggests, are informationally complete.
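
For concreteness, the two conditions of definition 1 can be tested numerically. The following is a minimal sketch, assuming NumPy is available; the function name is_general_sic and the tolerance tol are illustrative choices and not part of the construction.

    import numpy as np

    def is_general_sic(P, tol=1e-9):
        """Numerically test the two conditions of definition 1 for a list of d x d matrices."""
        d = P[0].shape[0]
        if len(P) != d * d:
            return False
        # (1) POVM: every element is positive semidefinite and they sum to the identity
        psd = all(np.linalg.eigvalsh(p).min() > -tol for p in P)
        complete = np.allclose(sum(P), np.eye(d), atol=tol)
        # (2) symmetry: all purities Tr(P^2) are equal (and different from 1/d^3),
        #     and all pairwise overlaps Tr(P_a P_b), a != b, are equal
        purities = np.array([np.trace(p @ p).real for p in P])
        overlaps = np.array([np.trace(P[a] @ P[b]).real
                             for a in range(len(P)) for b in range(a + 1, len(P))])
        symmetric = (np.ptp(purities) < tol and abs(purities[0] - 1.0 / d**3) > tol
                     and np.ptp(overlaps) < tol)
        return psd and complete and symmetric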

Denoting by $a\equiv {\rm Tr}(P_{\alpha }^{2})$ and $b\equiv {\rm Tr}({{P}_{\alpha }}{{P}_{\beta }})$ for $\alpha \ne \beta $, we have

$d={\rm Tr}(I)={\rm Tr}\Big[\Big(\sum _{\alpha =1}^{{{d}^{2}}}{{P}_{\alpha }}\Big)^{2}\Big]={{d}^{2}}a+{{d}^{2}}({{d}^{2}}-1)b.$

Thus, the parameters a and b are not independent and b is given by

$b=\frac{1-da}{d({{d}^{2}}-1)}.$    (1)

Also the traces of all the elements in a SIC POVM are equal:

${\rm Tr}({{P}_{\alpha }})={\rm Tr}\Big({{P}_{\alpha }}\sum _{\beta =1}^{{{d}^{2}}}{{P}_{\beta }}\Big)=a+({{d}^{2}}-1)b=\frac{1}{d}.$

Thus, a is the only parameter defining the 'type' of a general SIC POVM. The range of a is given by

$\frac{1}{{{d}^{3}}}\lt a\leqslant \frac{1}{{{d}^{2}}},$    (2)

where the strict inequality above follows from the Cauchy–Schwarz inequality ${\rm Tr}({{P}_{\alpha }})={\rm Tr}({{P}_{\alpha }}I)\lt \sqrt{da}$. This is a strict inequality since otherwise all ${{P}_{\alpha }}=\frac{1}{{{d}^{2}}}I$. Note also that $a=1/{{d}^{2}}$ if and only if all ${{P}_{\alpha }}$ are rank one.

The set ${{\{{{P}_{\alpha }}\}}_{\alpha \,=\,1,...,{{d}^{2}}}}$ forms a basis for ${{{\rm H}}_{d}}$. To see that all ${{P}_{\alpha }}$ are linearly independent we follow the same lines as in [9]. Suppose there is a set of ${{d}^{2}}$ real numbers ${{\{{{r}_{\alpha }}\}}_{\alpha =1,...,{{d}^{2}}}}$ satisfying

$\sum _{\alpha =1}^{{{d}^{2}}}{{r}_{\alpha }}{{P}_{\alpha }}=0.$

Then by taking the trace on both sides we get $\sum _{\alpha =1}^{{{d}^{2}}}{{r}_{\alpha }}=0$. Moreover, multiplying the equation above by ${{P}_{\beta }}$ and taking the trace gives

$0=\sum _{\alpha =1}^{{{d}^{2}}}{{r}_{\alpha }}{\rm Tr}({{P}_{\alpha }}{{P}_{\beta }})=a{{r}_{\beta }}+b\sum _{\alpha \ne \beta }{{r}_{\alpha }}=(a-b){{r}_{\beta }},$

where we have used $\sum _{\alpha =1}^{{{d}^{2}}}{{r}_{\alpha }}=0$. Now, from equation (1) it follows that a = b only if $a=1/{{d}^{3}}$. Since this value is not in the domain of a (see equation (2)), we conclude that $a\ne b$ and therefore all ${{r}_{\beta }}=0$. Thus, general SIC POVMs are informationally complete POVMs.

The dual basis ${{\{{{Q}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}}}$ of an informationally complete POVM ${{\{{{P}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}}}$ is a basis of ${{{\rm H}}_{d}}$ satisfying

${\rm Tr}({{Q}_{\alpha }}{{P}_{\beta }})={{\delta }_{\alpha ,\beta }}.$    (3)

It is simple to check that any density matrix $\rho \in {{{\rm H}}_{d}}$ can be expressed as

$\rho =\sum _{\alpha =1}^{{{d}^{2}}}{{p}_{\alpha }}{{Q}_{\alpha }},$    (4)

where ${{p}_{\alpha }}\equiv {\rm Tr}\left( {{P}_{\alpha }}\rho \right)$ are the probabilities associated with the informationally complete measurement $\{{{P}_{\alpha }}\}$. Thus, the existence of a dual basis to an informationally complete POVM shows that a d × d density matrix $\rho $ can be viewed as a ${{d}^{2}}$-dimensional probability vector $\vec{p}=({{p}_{1}},...,{{p}_{{{d}^{2}}}})$.

In [15, 16] it was shown that the dual basis to a basis of positive-semidefinite matrices cannot itself consist of only positive-semidefinite matrices. Therefore, the requirement that $\rho $ in (4) is positive-semidefinite imposes a constraint on the probability vectors $\vec{p}$ that correspond to a density matrix $\rho $. For example, the vector $\vec{p}=(1,0,...,0)$ corresponds to $\rho ={{Q}_{1}}$. Thus, if ${{Q}_{1}}$ is not positive-semidefinite then the vector $\vec{p}=(1,0,...,0)$ does not correspond to a density matrix. The set of all probability vectors $\vec{p}=({{p}_{1}},...,{{p}_{{{d}^{2}}}})$ that correspond to density matrices forms a simplex (in the convex set of ${{d}^{2}}$-dimensional probability vectors) that depends on the choice of the POVM basis $\{{{P}_{\alpha }}\}$ and, in particular, its dual.

The calculation of the dual basis for an informationally complete POVM generally involves cumbersome expressions. However, for a general SIC POVM ${{\{{{P}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}}}$ with parameter a as above, the dual basis is given by

${{Q}_{\alpha }}=\frac{d({{d}^{2}}-1)}{a{{d}^{3}}-1}{{P}_{\alpha }}-\frac{d(1-ad)}{a{{d}^{3}}-1}I.$

Using the symmetry properties of ${{P}_{\alpha }}$, it is straightforward to check that the ${{Q}_{\alpha }}$ s above indeed satisfy equation (3). Note that for a rank one SIC POVM ($a=1/{{d}^{2}}$) this reduces to ${{Q}_{\alpha }}=d(d+1){{P}_{\alpha }}-I$.
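
As an illustration, the dual-basis formula above and the expansion (4) can be verified numerically. The following is a minimal sketch, assuming NumPy; the helper names dual_basis and reconstruct are illustrative.

    import numpy as np

    def dual_basis(P):
        """Dual basis of a general SIC POVM: Q = d(d^2-1)/(a d^3 - 1) P - d(1 - a d)/(a d^3 - 1) I."""
        d = P[0].shape[0]
        a = np.trace(P[0] @ P[0]).real                 # the purity parameter a
        coeff = d * (d**2 - 1) / (a * d**3 - 1)
        shift = d * (1 - a * d) / (a * d**3 - 1)
        return [coeff * p - shift * np.eye(d) for p in P]

    def reconstruct(P, rho):
        """Recover rho from the measurement probabilities p_alpha = Tr(P_alpha rho), equation (4)."""
        Q = dual_basis(P)
        probs = [np.trace(p @ rho).real for p in P]
        return sum(pr * q for pr, q in zip(probs, Q))

    # Example check for a general SIC POVM P (e.g. one obtained from the construction of section 3):
    # Q = dual_basis(P)
    # print(all(np.isclose(np.trace(Q[i] @ P[j]).real, float(i == j))
    #           for i in range(len(P)) for j in range(len(P))))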

3. Construction of general SIC POVMs

Let ${{{\rm T}}_{d}}\subset {{{\rm H}}_{d}}$ be the $({{d}^{2}}-1)$-dimensional subspace of ${{{\rm H}}_{d}}$ consisting of all d × d traceless Hermitian matrices. To construct general SIC POVMs we need the following two Lemmas:

Lemma 1. Let ${{\{{{R}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}-1}}$ be a basis of ${{{\rm T}}_{d}}$ such that

${\rm Tr}({{R}_{\alpha }}{{R}_{\beta }})=r\left({{d}^{2}}{{\delta }_{\alpha ,\beta }}-1\right)$    (5)

for some $0\lt r\in \mathbb{R}$. Then, for any real $t\ne 0$, the set ${{\{{{P}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}}}$ defined by

${{P}_{\alpha }}=\frac{1}{{{d}^{2}}}I+t{{R}_{\alpha }}\quad (\alpha =1,...,{{d}^{2}}-1),\qquad {{P}_{{{d}^{2}}}}=\frac{1}{{{d}^{2}}}I-t\sum _{\beta =1}^{{{d}^{2}}-1}{{R}_{\beta }},$    (6)

is a symmetric basis of ${{{\rm H}}_{d}}$ according to definition 1.

The proof is given in appendix A. We now construct a basis ${{\{{{R}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}-1}}$ of ${{{\rm T}}_{d}}$ that satisfies the conditions in lemma 1. Let ${{\{{{F}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}-1}}$ be an orthonormal basis of ${{{\rm T}}_{d}}$, ${\rm Tr}({{F}_{\alpha }})=0$ and ${\rm Tr}({{F}_{\alpha }}{{F}_{\beta }})={{\delta }_{\alpha ,\beta }}$, and let

${{R}_{\alpha }}=x{{F}_{\alpha }}+y\sum _{\beta \ne \alpha }{{F}_{\beta }},$

where x and y are some non-zero real constants. We then get

${\rm Tr}({{R}_{\alpha }}{{R}_{\beta }})=\left[{{x}^{2}}+({{d}^{2}}-2){{y}^{2}}\right]{{\delta }_{\alpha ,\beta }}+\left[2xy+({{d}^{2}}-3){{y}^{2}}\right](1-{{\delta }_{\alpha ,\beta }}).$    (7)

We would like to find x and y such that ${\rm Tr}({{R}_{\alpha }}{{R}_{\beta }})$ above has the form of equation (5). We therefore look for x and y such that for any $\alpha \ne \beta \in \{1,2,...,{{d}^{2}}-1\}$

${\rm Tr}({{R}_{\alpha }}{{R}_{\beta }})=-\frac{1}{{{d}^{2}}-1}{\rm Tr}(R_{\alpha }^{2}).$    (8)

From equation (7) we get

$\frac{{\rm Tr}({{R}_{\alpha }}{{R}_{\beta }})}{{\rm Tr}(R_{\alpha }^{2})}=\frac{2\omega +{{d}^{2}}-3}{{{\omega }^{2}}+{{d}^{2}}-2}\qquad (\alpha \ne \beta ),$    (9)

where $\omega \equiv x/y$. Comparing equations (8) and (9) gives two solutions for $\omega $:

${{\omega }_{+}}=1-d(d-1)\qquad {\rm and}\qquad {{\omega }_{-}}=1-d(d+1).$
Without loss of generality, we take y = 1 so that $x=\omega $. Thus, for any real $t\ne 0$, we define:

${{P}_{\alpha ,\pm }}=\frac{1}{{{d}^{2}}}I+t{{R}_{\alpha ,\pm }},\qquad {{R}_{\alpha ,\pm }}={{\omega }_{\pm }}{{F}_{\alpha }}+\sum _{\beta \ne \alpha }{{F}_{\beta }}.$

Denoting by $F=\sum _{\alpha =1}^{{{d}^{2}}-1}{{F}_{\alpha }}$, we get

${{P}_{\alpha ,\pm }}=\frac{1}{{{d}^{2}}}I+t\left[F-d(d\mp 1){{F}_{\alpha }}\right]\quad (\alpha =1,...,{{d}^{2}}-1),\qquad {{P}_{{{d}^{2}},\pm }}=\frac{1}{{{d}^{2}}}I\mp t(d\mp 1)F.$    (10)

Lemma 2. Let ${{\{{{P}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}}}$ be a symmetric basis of ${{{\rm H}}_{d}}$ according to definition 1. Then, for any real $t\ne 0$, the two sets ${{\{{{F}_{\alpha ,\pm }}\}}_{\alpha =1,2,...,{{d}^{2}}-1}}$

are orthonormal bases of ${{{\rm T}}_{d}}$.

Proof. Assuming that ${{\{{{P}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}}}$ is a symmetric basis of ${{{\rm H}}_{d}}$, it is straightforward to check that the ${{F}_{\alpha ,\pm }}$ form an orthonormal operator basis of ${{{\rm T}}_{d}}$, i.e. ${\rm Tr}({{F}_{\alpha ,\pm }})=0$ and ${\rm Tr}({{F}_{\alpha ,\pm }}{{F}_{\beta ,\pm }})={{\delta }_{\alpha ,\beta }}$. □

Most importantly, the two orthonormal bases $\{{{F}_{\alpha ,\pm }}\}$ are related to each other by an orthogonal transformation. Given a symmetric basis ${{\{{{P}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}}}$ in ${{{\rm H}}_{d}}$, we can write it in the two ways of equation (10), once in terms of $\{{{F}_{\alpha ,+}}\}$ and once in terms of $\{{{F}_{\alpha ,-}}\}$. Therefore, without loss of generality, we can express all symmetric operator bases of ${{{\rm H}}_{d}}$ in a given structure, say $\{{{P}_{\alpha ,-}}\}$ of equation (10).

We are now ready to introduce the construction of all general SIC POVMs. Let $\{{{F}_{\alpha }}\}$ (with $\alpha =1,...,{{d}^{2}}-1$) be an orthonormal basis of ${{{\rm T}}_{d}}$; that is, ${\rm Tr}({{F}_{\alpha }})=0$ and ${\rm Tr}({{F}_{\alpha }}{{F}_{\beta }})={{\delta }_{\alpha ,\beta }}$ for all $\alpha ,\beta \in \{1,2,...,{{d}^{2}}-1\}$. For example, for d = 2 the three normalized Pauli matrices form such a basis for ${{{\rm T}}_{2}}$. For higher dimensions one can take for example the generalized Gell-Mann matrices. Given such a fixed basis for ${{{\rm T}}_{d}}$, we define two real numbers ${{t}_{0}}$ and ${{t}_{1}}$ as follows. Let $F\equiv \sum _{\alpha =1}^{{{d}^{2}}-1}{{F}_{\alpha }}$ and for $\alpha =1,...,{{d}^{2}}-1$ let ${{\lambda }_{\alpha }}$ and ${{\mu }_{\alpha }}$ be the maximum and minimum eigenvalues of ${{R}_{\alpha }}\equiv F-d(d+1){{F}_{\alpha }}$, respectively. Denote also by ${{\lambda }_{{{d}^{2}}}}$ and ${{\mu }_{{{d}^{2}}}}$ the maximum and minimum eigenvalues of ${{R}_{{{d}^{2}}}}\equiv (d+1)F$, respectively. Note that for all $\alpha $, ${{\lambda }_{\alpha }}$ are positive and ${{\mu }_{\alpha }}$ are negative since ${\rm Tr}(F)={\rm Tr}({{F}_{\alpha }})=0$. We denote

${{t}_{0}}\equiv -\frac{1}{{{d}^{2}}\,{{{\rm max} }_{\alpha }}{{\lambda }_{\alpha }}}\lt 0,\qquad {{t}_{1}}\equiv -\frac{1}{{{d}^{2}}\,{{{\rm min} }_{\alpha }}{{\mu }_{\alpha }}}\gt 0.$
Theorem 3. For any non-zero $t\in [{{t}_{0}},{{t}_{1}}]$ the set of ${{d}^{2}}$ operators

${{P}_{\alpha }}=\frac{1}{{{d}^{2}}}I+t\left[F-d(d+1){{F}_{\alpha }}\right]\quad (\alpha =1,...,{{d}^{2}}-1),\qquad {{P}_{{{d}^{2}}}}=\frac{1}{{{d}^{2}}}I+t(d+1)F,$    (11)

forms a general SIC POVM in ${{{\rm H}}_{d}}$. Moreover, any general SIC POVM can be obtained in this way.

Proof. By construction we have $\sum _{\alpha =1}^{{{d}^{2}}}{{P}_{\alpha }}=I$. Moreover, the real parameters ${{t}_{0}}$ and ${{t}_{1}}$ have been chosen in such a way to ensure that all ${{\{{{P}_{\alpha }}\}}_{\alpha =1,...,{{d}^{2}}}}$ are positive-semidefinite. Thus, $\{{{P}_{\alpha }}\}$ is a POVM. Since ${\rm Tr}({{F}_{\alpha }})={\rm Tr}(F)=0$ and ${\rm Tr}({{F}_{\alpha }}F)=1$ we get for $\alpha ,\beta \in \{1,2,...,{{d}^{2}}-1\}$

${\rm Tr}({{P}_{\alpha }}{{P}_{\beta }})=\frac{1}{{{d}^{3}}}+{{t}^{2}}{{(d+1)}^{2}}\left({{d}^{2}}{{\delta }_{\alpha ,\beta }}-1\right),\qquad {\rm Tr}({{P}_{\alpha }}{{P}_{{{d}^{2}}}})=\frac{1}{{{d}^{3}}}-{{t}^{2}}{{(d+1)}^{2}},\qquad {\rm Tr}(P_{{{d}^{2}}}^{2})=\frac{1}{{{d}^{3}}}+{{t}^{2}}{{(d+1)}^{2}}({{d}^{2}}-1).$

Thus, for all $\alpha ,\beta \in \{1,2,...,{{d}^{2}}-1\}$ with $\alpha \ne \beta $, ${\rm Tr}({{P}_{\alpha }}{{P}_{\beta }})={\rm Tr}({{P}_{\alpha }}{{P}_{{{d}^{2}}}})=1/{{d}^{3}}-{{t}^{2}}{{(d+1)}^{2}}$. Similarly, ${\rm Tr}(P_{\alpha }^{2})={\rm Tr}(P_{{{d}^{2}}}^{2})$ for all $\alpha =1,2,...,{{d}^{2}}-1$.

It therefore remains to show that any general SIC POVM can be obtained in this way. Indeed, isolating ${{F}_{\alpha }}$ from equation (11) gives

${{F}_{\alpha }}=\frac{1}{t\,d{{(d+1)}^{2}}}\left[\frac{1}{d}I-(d+1){{P}_{\alpha }}+{{P}_{{{d}^{2}}}}\right],\quad \alpha =1,...,{{d}^{2}}-1.$    (12)

Thus, if ${{\{{{P}_{\alpha }}\}}_{\alpha =1,...,{{d}^{2}}}}$ is a general SIC POVM then one can easily verify that the set ${{\{{{F}_{\alpha }}\}}_{\alpha =1,...,{{d}^{2}}-1}}$ defined in (12) is orthonormal. This completes the proof. □

The parameter a associated with our general SIC POVMs is given by

$a={\rm Tr}(P_{\alpha }^{2})=\frac{1}{{{d}^{3}}}+{{t}^{2}}({{d}^{2}}-1){{(d+1)}^{2}}.$    (13)

The maximum value of a is obtained when $t={{t}_{m}}\equiv {\rm max} \{|{{t}_{0}}|,{{t}_{1}}\}$ and depends on the choice of the orthonormal basis $\{{{F}_{\alpha }}\}$. Thus, for a given basis $\{{{F}_{\alpha }}\}$, the construction above generates general SIC POVMs for any $\frac{1}{{{d}^{3}}}\lt a\leqslant a({{t}_{m}})$. The value of $a({{t}_{m}})$ is always bounded from above by $1/{{d}^{2}}$, and it is equal to $1/{{d}^{2}}$ only if the resulting SIC POVM consists of rank 1 operators. We therefore denote

${{a}_{{\rm max} }}\equiv {{{\rm max} }_{\{{{F}_{\alpha }}\}}}\;a({{t}_{m}}),$    (14)

where the maximum is taken over all orthonormal bases $\{{{F}_{\alpha }}\}$ of ${{{\rm T}}_{d}}$. We say here that $\{{{F}_{\alpha }}\}$ is an optimal basis of ${{{\rm T}}_{d}}$ if its $a({{t}_{m}})$ value equals ${{a}_{{\rm max} }}$. The conjecture that rank 1 SIC POVMs exist in all finite dimensions is equivalent to the conjecture that ${{a}_{{\rm max} }}$ is always equal to $1/{{d}^{2}}$. We finally note that theorem 3 establishes a bijection between orthonormal bases of ${{{\rm T}}_{d}}$ and general SIC POVMs. Thus a given basis corresponds to a SIC POVM. By applying an orthogonal transformation on ${{{\rm T}}_{d}}$ we obtain a different basis that corresponds, in general, to a different SIC POVM through the bijection of theorem 3.
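
For concreteness, the construction of theorem 3 can be implemented numerically. The following is a minimal sketch, assuming NumPy; the helper names t_interval and general_sic_povm are illustrative, and the choice of t within $[{{t}_{0}},{{t}_{1}}]$ is left to the user.

    import numpy as np

    def t_interval(F_basis):
        """Endpoints t0 < 0 < t1 of the allowed parameter interval for an orthonormal basis of T_d."""
        d = F_basis[0].shape[0]
        F = sum(F_basis)
        R = [F - d * (d + 1) * Fa for Fa in F_basis] + [(d + 1) * F]
        lam_max = max(np.linalg.eigvalsh(r).max() for r in R)   # max_alpha lambda_alpha > 0
        lam_min = min(np.linalg.eigvalsh(r).min() for r in R)   # min_alpha mu_alpha < 0
        return -1.0 / (d**2 * lam_max), -1.0 / (d**2 * lam_min)

    def general_sic_povm(F_basis, t):
        """The d^2 operators of equation (11) for a non-zero t in [t0, t1]."""
        d = F_basis[0].shape[0]
        F = sum(F_basis)
        R = [F - d * (d + 1) * Fa for Fa in F_basis] + [(d + 1) * F]
        t0, t1 = t_interval(F_basis)
        assert t != 0 and t0 <= t <= t1, "t must be non-zero and lie in [t0, t1]"
        return [np.eye(d) / d**2 + t * r for r in R]

For any admissible non-zero t, the output should pass the is_general_sic check sketched in section 2.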

4. SIC POVM from generalized Gell-Mann operator basis

We now calculate the maximal value of the parameter a for the case where the operator basis $\{{{F}_{\alpha }}\}$ is the generalized Gell-Mann basis. The generalized Gell-Mann operators are a set of ${{d}^{2}}-1$ operators which form a basis for ${{{\rm T}}_{d}}$. We can label them by two indices $n,m$, each taking integer values $n,m=1,2,\ldots ,d$, such that

${{G}_{nm}}=\frac{1}{\sqrt{2}}\left(|n\rangle \langle m|+|m\rangle \langle n|\right)\ \ {\rm for}\ n\lt m,\qquad {{G}_{nm}}=\frac{-{\rm i}}{\sqrt{2}}\left(|n\rangle \langle m|-|m\rangle \langle n|\right)\ \ {\rm for}\ n\gt m,$    (15)

and

${{G}_{nn}}=\frac{1}{\sqrt{n(n+1)}}\left(\sum _{k=1}^{n}|k\rangle \langle k|-n\,|n+1\rangle \langle n+1|\right)$    (16)

for $n=1,2,\ldots ,d-1$. The pre-factors in equations (15) and (16) were chosen such that ${\rm Tr}(G_{nm}^{2})=1$ for all $n,m$.
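
For reference, the generalized Gell-Mann basis with the normalization ${\rm Tr}(G_{nm}^{2})=1$ can be generated as follows. This is a minimal sketch, assuming NumPy; the zero-based indexing and the ordering of the returned list are arbitrary choices.

    import numpy as np

    def gell_mann_basis(d):
        """Return the d^2 - 1 generalized Gell-Mann matrices, normalized so that Tr(G^2) = 1."""
        basis = []
        for n in range(d):
            for m in range(n + 1, d):
                S = np.zeros((d, d), dtype=complex)          # symmetric off-diagonal element
                S[n, m] = S[m, n] = 1.0 / np.sqrt(2)
                A = np.zeros((d, d), dtype=complex)          # antisymmetric off-diagonal element
                A[n, m], A[m, n] = -1j / np.sqrt(2), 1j / np.sqrt(2)
                basis += [S, A]
        for n in range(1, d):                                # diagonal elements G_nn
            D = np.zeros((d, d), dtype=complex)
            D[:n, :n] = np.eye(n) / np.sqrt(n * (n + 1))
            D[n, n] = -n / np.sqrt(n * (n + 1))
            basis.append(D)
        return basis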

We first bound the eigenvalues of $F={{\sum }_{\alpha }}{{F}_{\alpha }}={{\sum }_{n,m}}{{G}_{nm}}$ using Weyl's inequality [17]. The inequality states that the eigenvalues of $F=N+V$, where N and V are d × d hermitian matrices, are bounded as

${{n}_{i}}+{{v}_{d}}\leqslant {{f}_{i}}\leqslant {{n}_{i}}+{{v}_{1}},\qquad i=1,\ldots ,d,$    (17)

where ${{f}_{1}}\geqslant \cdots \geqslant {{f}_{d}}$, ${{n}_{1}}\geqslant \cdots \geqslant {{n}_{d}}$, and ${{v}_{1}}\geqslant \cdots \geqslant {{v}_{d}}$ are the eigenvalues of F, N and V, respectively. In our case F can be represented as the sum of two d × d hermitian matrices,

$F=N+V,\qquad N=\sum _{n=1}^{d-1}{{G}_{nn}},\qquad V=\sum _{n\ne m}{{G}_{nm}},$

where N is a diagonal matrix. The maximal and minimal eigenvalues of N are given by ${{n}_{1}}=\sum _{n=1}^{d-1}1/\sqrt{n(n+1)}$ and ${{n}_{d}}=-\sqrt{(d-1)/d}$. In order to bound the eigenvalues of F, using inequality (17), we need to evaluate the eigenvalues v of V, whose eigenvalue equation can be written as

$0={\rm det} \left(\sqrt{2}\,(V-vI)\right)\equiv {\rm det} (\tilde{V}),$

where $\tilde{V}$ is the d × d matrix with diagonal entries $-\sqrt{2}v$, entries $1-{\rm i}$ above the diagonal and $1+{\rm i}$ below it. To solve this equation we use the identity

${\rm det} \left( \begin{matrix} A & B \\ C & D \end{matrix} \right)={\rm det} (D)\,{\rm det} \left(A-B{{D}^{-1}}C\right),$    (18)

which holds for invertible D. Identifying $A=-\sqrt{2}v$ (a scalar), $B=(1-{\rm i})(1,\ldots ,1)={{C}^{\dagger }}$ (row and column vectors with $d-1$ components), and $D={{\tilde{V}}_{d-1}}$, where ${{\tilde{V}}_{d-1}}$ is the $(d-1)\times (d-1)$ matrix with exactly the same structure as $\tilde{V}$, we obtain the recursive relation

${\rm det} ({{\tilde{V}}_{k}})=\left(-\sqrt{2}v-{{B}_{k-1}}\tilde{V}_{k-1}^{-1}B_{k-1}^{\dagger }\right){\rm det} ({{\tilde{V}}_{k-1}}),$    (19)

where ${{B}_{k}}$ is the same vector as B but with k elements. To solve this recursive relation we simply need to give the solution for k = 1,

${\rm det} ({{\tilde{V}}_{1}})=-\sqrt{2}v.$

We therefore have an analytical solution (in the form of a recursive relation) for the eigenvalues of V; hence, we can use them to bound the eigenvalues of F according to equation (17),

${{n}_{d}}+{{v}_{d}}\leqslant {{f}_{i}}\leqslant {{n}_{1}}+{{v}_{1}},\qquad i=1,\ldots ,d.$

Next we bound the eigenvalues of ${{R}_{nm}}=F-d(d+1){{G}_{nm}}$. For n = m with $n=1,\ldots ,d-1$,

${{R}_{nn}}=\sum _{m=1}^{d-1}{{G}_{mm}}-\tilde{d}\,{{G}_{nn}}+V,$    (20)

where $\tilde{d}=d(d+1)$. The largest eigenvalue of the matrices $\sum _{m=1}^{d-1}{{G}_{m,m}}-\tilde{d}{{G}_{n,n}}$ is $\sum _{m=1}^{d-1}1/\sqrt{m(m+1)}+\tilde{d}\sqrt{(d-1)/d}$ while the smallest eigenvalue is $-\sqrt{(d-1)/d}-\tilde{d}\sum _{m=1}^{d-1}1/\sqrt{m(m+1)}$. Using inequality (17), the eigenvalues of ${{R}_{nn}}$ for all $n=1,\ldots ,d-1$ are bounded by

$-\sqrt{\frac{d-1}{d}}-\tilde{d}\sum _{m=1}^{d-1}\frac{1}{\sqrt{m(m+1)}}+{{v}_{d}}\;\leqslant \;{\rm eig}({{R}_{nn}})\;\leqslant \;\sum _{m=1}^{d-1}\frac{1}{\sqrt{m(m+1)}}+\tilde{d}\sqrt{\frac{d-1}{d}}+{{v}_{1}}.$

To bound the eigenvalues of ${{R}_{nm}}$ with $n\ne m$ let us first look at ${{R}_{12}}$,

${{R}_{12}}=F-\tilde{d}\,{{G}_{12}}=N+W,\qquad W\equiv V-\tilde{d}\,{{G}_{12}}.$

The eigenvalue equation of the non-diagonal matrix W reads

$0={\rm det} \left(\sqrt{2}\,(W-wI)\right),$    (21)

where the matrix inside the determinant differs from $\tilde{V}$ only in its $(1,2)$ and $(2,1)$ entries, which are $1-\tilde{d}-{\rm i}$ and $1-\tilde{d}+{\rm i}$, respectively (with w in place of v on the diagonal). Except for the first row and column, the matrix in the above equation (i.e. its $(d-1)\times (d-1)$ inner block) equals ${{\tilde{V}}_{d-1}}$. Therefore, to solve equation (21), one can use the same recursive relation of equation (19), with the same definitions of $A,B,C$ and D as before up to the $(d-1)$th 'level', and at the dth level use the $(d-1)$-component vectors ${{B}_{d-1}}=(1-\tilde{d}-{\rm i},1-{\rm i},\cdots ,1-{\rm i})=C_{d-1}^{\dagger }$. Thus, we find the maximum and minimum eigenvalues of W, ${{w}_{1}}$ and ${{w}_{d}}$, which enable us to bound the eigenvalues of ${{R}_{12}}$,

$-\sqrt{\frac{d-1}{d}}+{{w}_{d}}\;\leqslant \;{\rm eig}({{R}_{12}})\;\leqslant \;\sum _{m=1}^{d-1}\frac{1}{\sqrt{m(m+1)}}+{{w}_{1}}.$    (22)

By writing ${{R}_{nm}}$ with $n\ne m$ as

${{R}_{nm}}=N+{{W}_{nm}},\qquad {{W}_{nm}}\equiv V-\tilde{d}\,{{G}_{nm}},$

one can show that the eigenvalues of ${{W}_{nm}}=V-\tilde{d}{{G}_{nm}}$ are equal to the eigenvalues of W, and therefore the eigenvalues of ${{R}_{nm}}$ are all bounded by the bounds appearing in equation (22).

Upon defining ${{\lambda }_{{\rm max} }}$ and ${{\lambda }_{{\rm min} }}$ as, respectively, the largest upper bound and the smallest lower bound obtained above for the eigenvalues of the operators ${{R}_{\alpha }}$ (including ${{R}_{{{d}^{2}}}}=(d+1)F$), and

${{t}_{0}}=-\frac{1}{{{d}^{2}}}\frac{1}{{{\lambda }_{{\rm max} }}},\qquad {{t}_{1}}=-\frac{1}{{{d}^{2}}}\frac{1}{{{\lambda }_{{\rm min} }}},$

we obtain ${{t}_{m}}={\rm max} \{|{{t}_{0}}|,{{t}_{1}}\}$. The optimal SIC POVM associated with the Gell-Mann basis is then given by equation (11) with the allowed value of t of largest magnitude, $|t|={{t}_{m}}$.

We have numerically calculated ${{t}_{m}}$ as a function of the dimension and plotted in figure 1 the ratio of ${{t}_{m}}$ to the value it would take if a rank-1 SIC POVM existed in this dimension, $1/{{(d(d+1))}^{3/2}}$. We found that as the dimension grows ${{t}_{m}}{{(d(d+1))}^{3/2}}$ quickly drops to zero, that is, the Gell-Mann basis is far from an optimal basis.
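
The quantity plotted in figure 1 can be sketched numerically as follows, reusing the illustrative helpers gell_mann_basis (section 4 sketch) and t_interval (section 3 sketch); this is a minimal sketch of the calculation, not the code used for the figure.

    for d in range(2, 9):
        t0, t1 = t_interval(gell_mann_basis(d))     # illustrative helpers sketched earlier
        tm = max(abs(t0), t1)
        ratio = tm * (d * (d + 1)) ** 1.5            # equals 1 exactly when the basis yields a rank 1 SIC POVM
        print(d, ratio)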

Figure 1. A plot of ${{t}_{m}}{{(d(d+1))}^{3/2}}$ as a function of the dimension d. Here ${{t}_{m}}$ is calculated for the Gell-Mann basis, and the quantity $0\leqslant {{t}_{m}}{{(d(d+1))}^{3/2}}\leqslant 1$ indicates how close the SIC POVM is to a rank-1 POVM (the upper bound). As the dimension grows, ${{t}_{m}}{{(d(d+1))}^{3/2}}$ quickly drops to zero, that is, the Gell-Mann basis results in a SIC POVM whose elements are close to the identity operator.

5. Rank 1 SIC POVMs

From the expression for a, equation (13), we see that if $a=1/{{d}^{2}}$ then

$t=\frac{1}{{{\left(d(d+1)\right)}^{3/2}}},$    (23)

where we assumed without loss of generality that t is positive since we can always replace the orthonormal basis $\{{{F}_{\alpha }}\}$ with the orthonormal basis $\{-{{F}_{\alpha }}\}$. Substituting this value of t in equation (12) gives:

${{F}_{\alpha }}=\frac{1}{\sqrt{d(d+1)}}\left(I+{{\Pi }_{{{d}^{2}}}}-(d+1){{\Pi }_{\alpha }}\right)$    (24)

for rank 1 SIC POVMs where ${{\Pi }_{\alpha }}\equiv d{{P}_{\alpha }}$ and ${{\Pi }_{{{d}^{2}}}}\equiv d{{P}_{{{d}^{2}}}}$ are rank 1 projections.

If $\{{{P}_{\alpha }}\}$ is a rank 1 SIC POVM then the eigenvalues of ${{F}_{\alpha }}$ in (24) do not depend on $\alpha $ since ${\rm Tr}({{\Pi }_{\alpha }}{{\Pi }_{{{d}^{2}}}})=1/(d+1)$ for all $\alpha =1,2,...,{{d}^{2}}-1$. A straightforward calculation shows that the eigenvalues of the rank 2 matrix ${{\Pi }_{{{d}^{2}}}}-(d+1){{\Pi }_{\alpha }}$ are given by $\left( -d\pm \sqrt{{{d}^{2}}+4d} \right)/2$. Thus, for rank 1 SIC POVMs the eigenvalues of all ${{F}_{\alpha }}$ are given by:

${{\gamma }_{1,2}}=\frac{1}{\sqrt{d(d+1)}}\left(1+\frac{-d\pm \sqrt{{{d}^{2}}+4d}}{2}\right),\qquad {{\gamma }_{3}}=\cdots ={{\gamma }_{d}}=\frac{1}{\sqrt{d(d+1)}}.$    (25)

This observation indicates that while the class of general SIC POVMs in ${{{\rm H}}_{d}}$ is relatively big (any orthonormal basis $\{{{F}_{\alpha }}\}$ yields a general SIC POVM), the class of rank 1 SIC POVMs in ${{{\rm H}}_{d}}$ (if it exists) is extremely small. The fact that all $\{{{F}_{\alpha }}\}$ have the same eigenvalues implies that there exist ${{d}^{2}}-1$ unitary matrices $\{{{U}_{\alpha }}\}$ such that ${{F}_{\alpha }}={{U}_{\alpha }}DU_{\alpha }^{\dagger }$, where $D={\rm diag}\{{{\gamma }_{1}},...,{{\gamma }_{d}}\}$ is a diagonal matrix with the ${{\gamma }_{k}}$ s as in equation (25). The orthogonality relation reads ${\rm Tr}({{U}_{\alpha }}DU_{\alpha }^{\dagger }{{U}_{\beta }}DU_{\beta }^{\dagger })={{\delta }_{\alpha ,\beta }}$. Thus, if the set $\{{{U}_{\alpha }}\}$ forms a group, the orthogonality relations can be written simply as ${\rm Tr}({{U}_{\alpha }}DU_{\alpha }^{\dagger }D)=0$ for ${{U}_{\alpha }}\ne I$. This is somewhat reminiscent of the more standard construction of rank 1 SIC POVMs with the Weyl–Heisenberg group [10, 11].
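
As a quick numerical illustration of equations (24) and (25), one may take any two unit vectors whose overlap satisfies $|\langle \psi |\phi \rangle {{|}^{2}}=1/(d+1)$, the value attained by elements of a rank 1 SIC POVM, and inspect the spectrum of ${{\Pi }_{{{d}^{2}}}}-(d+1){{\Pi }_{\alpha }}$. The following is a minimal sketch, assuming NumPy; the particular vectors chosen are arbitrary, only their overlap matters.

    import numpy as np

    d = 4                                              # any dimension d >= 2 will do here
    psi = np.zeros(d); psi[0] = 1.0                    # |psi>
    phi = np.zeros(d)
    phi[0], phi[1] = 1.0 / np.sqrt(d + 1), np.sqrt(d / (d + 1.0))   # |<psi|phi>|^2 = 1/(d+1)

    Pi_alpha, Pi_d2 = np.outer(psi, psi), np.outer(phi, phi)        # rank 1 projectors
    spec = np.sort(np.linalg.eigvalsh(Pi_d2 - (d + 1) * Pi_alpha))

    # the two nonzero eigenvalues agree with (-d -+ sqrt(d^2 + 4d))/2; the other d - 2 vanish
    print(spec[0], (-d - np.sqrt(d**2 + 4.0 * d)) / 2)
    print(spec[-1], (-d + np.sqrt(d**2 + 4.0 * d)) / 2)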

6. Example in d = 2

The simplest example to construct is in dimension d = 2. In this case, we take the three normalized Pauli matrices

${{F}_{1}}=\frac{{{\sigma }_{x}}}{\sqrt{2}}=\frac{1}{\sqrt{2}}\left( \begin{matrix} 0 & 1 \\ 1 & 0 \end{matrix} \right),\qquad {{F}_{2}}=\frac{{{\sigma }_{y}}}{\sqrt{2}}=\frac{1}{\sqrt{2}}\left( \begin{matrix} 0 & -{\rm i} \\ {\rm i} & 0 \end{matrix} \right),\qquad {{F}_{3}}=\frac{{{\sigma }_{z}}}{\sqrt{2}}=\frac{1}{\sqrt{2}}\left( \begin{matrix} 1 & 0 \\ 0 & -1 \end{matrix} \right),$

to be our fixed orthonormal basis for the vector space of 2 × 2 traceless Hermitian matrices ${{{\rm T}}_{2}}$. With this choice we get

$F={{F}_{1}}+{{F}_{2}}+{{F}_{3}}=\frac{1}{\sqrt{2}}\left({{\sigma }_{x}}+{{\sigma }_{y}}+{{\sigma }_{z}}\right).$

The eigenvalues of the three matrices $F-d(d+1){{F}_{\alpha }}=F-6{{F}_{\alpha }}$ ($\alpha =1,2,3$) are all the same and are given by ${{\lambda }_{\alpha }}=3\sqrt{3/2}$ and ${{\mu }_{\alpha }}=-3\sqrt{3/2}$. Also the eigenvalues of $3F$ are the same and given by ${{\lambda }_{4}}=3\sqrt{3/2}$ and ${{\mu }_{4}}=-3\sqrt{3/2}$. Thus, ${{t}_{0}}=-\frac{1}{12}\sqrt{\frac{2}{3}}$, ${{t}_{1}}=\frac{1}{12}\sqrt{\frac{2}{3}}$, and for any non-zero $t\in [{{t}_{0}},{{t}_{1}}]$ the four matrices

${{P}_{\alpha }}=\frac{1}{4}I+t\left(F-6{{F}_{\alpha }}\right)\quad (\alpha =1,2,3),\qquad {{P}_{4}}=\frac{1}{4}I+3tF,$

form a general SIC POVM. Note that ${{t}_{1}}$ equals the maximum possible value given in equation (23). Thus, for $t={{t}_{1}}$ we get the following rank 1 SIC POVM:

${{P}_{1}}=\frac{1}{4}I+\frac{\sqrt{3}}{36}\left(-5{{\sigma }_{x}}+{{\sigma }_{y}}+{{\sigma }_{z}}\right),\qquad {{P}_{2}}=\frac{1}{4}I+\frac{\sqrt{3}}{36}\left({{\sigma }_{x}}-5{{\sigma }_{y}}+{{\sigma }_{z}}\right),$

${{P}_{3}}=\frac{1}{4}I+\frac{\sqrt{3}}{36}\left({{\sigma }_{x}}+{{\sigma }_{y}}-5{{\sigma }_{z}}\right),\qquad {{P}_{4}}=\frac{1}{4}I+\frac{\sqrt{3}}{12}\left({{\sigma }_{x}}+{{\sigma }_{y}}+{{\sigma }_{z}}\right).$

Note that the above rank 1 SIC POVM is equivalent (up to a global rotation) to the original one first introduced in [9, 12]. Our example shows, however, that this rank 1 SIC POVM can be obtained from the three Pauli matrices. Thus, the three normalized Pauli matrices form an optimal basis for ${{{\rm T}}_{d=2}}$. The generalized Gell-Mann matrices, which reduce to the normalized Pauli matrices for d = 2, are not the optimal basis for $d\gt 2$, and thus do not correspond to a rank 1 SIC POVM in higher dimensions, as indicated in figure 1.
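
This example can be reproduced numerically with the illustrative helpers t_interval and general_sic_povm from the sketch in section 3 (a minimal sketch, assuming NumPy): at $t={{t}_{1}}$ all four elements indeed become rank 1.

    import numpy as np

    sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
    sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
    F_basis = [sigma_x / np.sqrt(2), sigma_y / np.sqrt(2), sigma_z / np.sqrt(2)]

    t0, t1 = t_interval(F_basis)                       # illustrative helper from the section 3 sketch
    P = general_sic_povm(F_basis, t1)                  # illustrative helper from the section 3 sketch

    print(np.isclose(t1, np.sqrt(2.0 / 3.0) / 12.0))   # t1 = (1/12) sqrt(2/3)
    print([np.linalg.matrix_rank(p, tol=1e-9) for p in P])   # all four ranks equal 1
    print(np.allclose(sum(P), np.eye(2)))              # resolution of the identity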

7. Discussion

We have constructed the complete set of general SIC POVMs, given in equation (11). Our construction shows that the family of general SIC POVMs is as big as the set of all orthonormal bases in ${{{\rm T}}_{d}}$. Fixing an orthonormal basis $\{{{F}_{\alpha }}\}$, every other orthonormal basis $\{F_{\alpha }^{\prime }\}$ of ${{{\rm T}}_{d}}$ can be obtained from $\{{{F}_{\alpha }}\}$ via $F_{\alpha }^{\prime }={{\sum }_{\beta }}{{A}_{\alpha \beta }}{{F}_{\beta }}$, where A is a real orthogonal matrix. Thus, every element in the group $O({{d}^{2}}-1)$ (i.e. the set of $({{d}^{2}}-1)\times ({{d}^{2}}-1)$ real orthogonal matrices) defines a one-parameter set of general SIC POVMs (see equation (11)). Typically, distinct orthogonal matrices correspond to distinct SIC POVMs.

The SIC POVMs defined in equation (11) can be made arbitrarily close to $\frac{1}{{{d}^{2}}}I$ by taking t to be close enough to zero. These weak SIC POVMs do not disturb the state of the system much and can therefore be used for tomography that is followed by other quantum information processing tasks. Note that, by equation (11), any orthonormal basis of ${{{\rm T}}_{d}}$ can be used to construct a weak SIC POVM. This implies that the set of weak SIC POVMs is extremely big, whereas the set of rank 1 SIC POVMs (if it exists) is extremely small.

The purity parameter a of a general SIC POVM determines how close it is to a rank 1 SIC POVM. If a is very close to $1/{{d}^{3}}$ then the general SIC POVM is full rank, whereas if it is very close to $1/{{d}^{2}}$ it is close to being rank 1. From our construction it is obvious that if there exists a SIC POVM with some $a={{a}_{0}}\gt 1/{{d}^{3}}$, then there exist SIC POVMs with any a in the range $(1/{{d}^{3}},{{a}_{0}}]$. Thus, it is natural to look for the highest possible value that a can take, which we denoted in equation (14) by ${{a}_{{\rm max} }}$. From both the analytical and numerical evidence we know that ${{a}_{{\rm max} }}=1/{{d}^{2}}$ for low dimensions, since in small dimensions there exist rank 1 SIC POVMs [11]. However, since in higher dimensions it seems to be a hard task to construct rank 1 SIC POVMs, one can try to find a lower bound for ${{a}_{{\rm max} }}$. In section 4 we considered such a bound by constructing SIC POVMs through the generalized Gell-Mann operator basis. As one may expect, we found that this lower bound quickly approaches its lowest possible value $1/{{d}^{3}}$ as the dimension increases. However, one can numerically search for bases which give interesting bounds by applying orthogonal transformations to the generalized Gell-Mann basis and looking for those transformations for which a increases. We leave this analysis for future work.
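
A brute-force version of such a search might look as follows. This is a minimal sketch, assuming NumPy and SciPy and reusing the illustrative helpers t_interval (section 3 sketch) and gell_mann_basis (section 4 sketch); it applies random orthogonal rotations to the Gell-Mann basis and keeps the largest value of $a({{t}_{m}})$ found, computed from equation (13).

    import numpy as np
    from scipy.stats import ortho_group

    def a_of_basis(F_basis):
        """The purity a(t_m) of equation (13) for a given orthonormal basis of T_d."""
        d = F_basis[0].shape[0]
        t0, t1 = t_interval(F_basis)                   # illustrative helper from the section 3 sketch
        tm = max(abs(t0), t1)
        return 1.0 / d**3 + tm**2 * (d**2 - 1) * (d + 1)**2

    d = 3
    G = gell_mann_basis(d)                             # illustrative helper from the section 4 sketch
    best = a_of_basis(G)
    for _ in range(200):
        A = ortho_group.rvs(d**2 - 1)                  # random element of O(d^2 - 1)
        rotated = [sum(A[i, j] * G[j] for j in range(d**2 - 1)) for i in range(d**2 - 1)]
        best = max(best, a_of_basis(rotated))
    print(best, 1.0 / d**2)                            # compare with the rank 1 value 1/d^2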

Acknowledgments

We would like to thank the anonymous referees who suggested improving the derivation of theorem 3 and considering the particular case of SIC POVMs constructed from the generalized Gell-Mann basis. GG thanks Rob Spekkens and Nolan Wallach for many fruitful discussions that initiated this project, Markus Grassl for helpful correspondence and comments on the first draft of this paper, and Christopher Fuchs for pointing out reference [13]. GG's research was partially supported by NSERC and by the Air Force Office of Scientific Research as part of the Transformational Computing in Aerospace Science and Engineering Initiative under grant FA9550-12-1-0046. AK's research is supported by NSF Grant PHY-1212445.

Appendix A. Proof of Lemma 1

Proof. To prove that the operators ${{P}_{\alpha }}$, $\alpha =1,2,...,{{d}^{2}}$, of equation (6) are symmetric according to definition 1, we first note that ${\rm Tr}({{P}_{\alpha }})=1/d$ for all $\alpha =1,2,...,{{d}^{2}}$. For $\alpha ,\beta \in \{1,2,...,{{d}^{2}}-1\}$ we have

${\rm Tr}({{P}_{\alpha }}{{P}_{\beta }})=\frac{1}{{{d}^{3}}}+{{t}^{2}}{\rm Tr}({{R}_{\alpha }}{{R}_{\beta }})=\frac{1}{{{d}^{3}}}+r{{t}^{2}}\left({{d}^{2}}{{\delta }_{\alpha ,\beta }}-1\right).$

We also have for $\alpha \in \{1,2,...,{{d}^{2}}-1\}$

${\rm Tr}({{P}_{\alpha }}{{P}_{{{d}^{2}}}})=\frac{1}{{{d}^{3}}}-{{t}^{2}}\sum _{\beta =1}^{{{d}^{2}}-1}{\rm Tr}({{R}_{\alpha }}{{R}_{\beta }})=\frac{1}{{{d}^{3}}}-r{{t}^{2}},$

and

${\rm Tr}(P_{{{d}^{2}}}^{2})=\frac{1}{{{d}^{3}}}+{{t}^{2}}\sum _{\alpha ,\beta =1}^{{{d}^{2}}-1}{\rm Tr}({{R}_{\alpha }}{{R}_{\beta }})=\frac{1}{{{d}^{3}}}+r{{t}^{2}}({{d}^{2}}-1).$

Thus, all that is left to show is that ${{\{{{P}_{\alpha }}\}}_{\alpha =1,2,...,{{d}^{2}}}}$ is linearly independent. Indeed, suppose $\sum _{\alpha =1}^{{{d}^{2}}}{{s}_{\alpha }}{{P}_{\alpha }}=0$. Note first that by taking the trace on both sides we get $\sum _{\alpha =1}^{{{d}^{2}}}{{s}_{\alpha }}=0$. Thus,

$0=\sum _{\alpha =1}^{{{d}^{2}}}{{s}_{\alpha }}{{P}_{\alpha }}=t\sum _{\alpha =1}^{{{d}^{2}}-1}\left({{s}_{\alpha }}-{{s}_{{{d}^{2}}}}\right){{R}_{\alpha }}.$

Recall that $t\ne 0$ and ${{R}_{\alpha }}$ are ${{d}^{2}}-1$ linearly independent matrices in ${{{\rm H}}_{d}}$. Thus, ${{s}_{\alpha }}={{s}_{{{d}^{2}}}}$, and since $\sum _{\alpha =1}^{{{d}^{2}}}{{s}_{\alpha }}=0$ we get ${{s}_{\alpha }}=0$ for all $\alpha =1,2,...,{{d}^{2}}$. □
