I'm finding that the left eigenvectors returned by ZGEEV can be way off the expected values, even for small (and apparently simple) matrices. Consider this 2 x 2 real matrix:
A(1,1) = -1.3258386681093
A(2,1) = 113.7872739924765
A(1,2) = -1.0361073988597
A(2,2) = 3.2366984478923
(I'm aware that ZGEEV is for complex matrices, but it should work with real matrices too, right?)
If I calculate the residual vector
x = vl^T A - w vl^T
where vl is an individual left eigenvector and w the corresponding eigenvalue, then I find that I'm far from the zero vector I expect. Specifically, I find
abs(x) = (21.1352576040521,2.01680117864654)
for both of the left eigenvectors of A.
Pasted below is sample Fortran 95 code (using the F77 interfaces provided by LAPACK95) that demonstrates the problem. The residuals of the right eigenvectors come out fine, by the way.
Many thanks,
Rich
Code:
program test_geev
   use f77_lapack
   implicit none

   integer, parameter :: WP = KIND(0.D0)

   complex(WP) :: A(2,2)   ! original matrix, kept for the residual check
   complex(WP) :: A_(2,2)  ! working copy, overwritten by ZGEEV
   complex(WP) :: W(2)     ! eigenvalues
   complex(WP) :: VL(2,2)  ! left eigenvectors (one per column)
   complex(WP) :: VR(2,2)  ! right eigenvectors (one per column)
   complex(WP) :: WORK(4)  ! LWORK >= 2*N
   real(WP)    :: RWORK(4) ! size 2*N
   integer     :: info
   integer     :: i

   A(1,1) = -1.3258386681093_WP
   A(2,1) = 113.7872739924765_WP
   A(1,2) = -1.0361073988597_WP
   A(2,2) = 3.2366984478923_WP

   A_ = A

   call ZGEEV('V', 'V', SIZE(A_, 1), A_, SIZE(A_, 1), W, &
        VL, SIZE(VL, 1), VR, SIZE(VR, 1), WORK, SIZE(WORK), RWORK, info)

   print *, 'info:', info

   do i = 1, 2
      print *, 'i:', i
      print *, 'abs(left residual vector):', ABS(MATMUL(VL(:,i), A) - W(i)*VL(:,i))
      print *, 'abs(right residual vector):', ABS(MATMUL(A, VR(:,i)) - W(i)*VR(:,i))
   end do

end program test_geev
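Update: re-reading the ZGEEV documentation, the left eigenvectors appear to be defined via the conjugate transpose, u**H * A = lambda * u**H, not the plain transpose that my residual uses. If that reading is right, the large residual is expected whenever the eigenvalues are complex (as they are here). A quick pure-Python sketch of the same 2 x 2 matrix, assuming that convention (all names here are mine, not LAPACK's):

```python
import cmath

# The matrix from the post above (real, but with a complex eigenvalue pair).
A = [[-1.3258386681093, -1.0361073988597],
     [113.7872739924765, 3.2366984478923]]

# Eigenvalues of a 2x2 matrix from the characteristic polynomial.
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = cmath.sqrt(tr * tr - 4.0 * det)  # negative discriminant -> complex pair
eigs = [(tr + disc) / 2.0, (tr - disc) / 2.0]

herm_norms, plain_norms = [], []
for lam in eigs:
    # Assumed ZGEEV convention: u**H * A = lam * u**H.  For real A this
    # makes u an eigenvector of A**T with eigenvalue conj(lam).
    mu = lam.conjugate()
    u = [complex(A[1][0]), mu - A[0][0]]     # null vector of A**T - mu*I
    n = (abs(u[0]) ** 2 + abs(u[1]) ** 2) ** 0.5
    u = [z / n for z in u]                   # normalize to unit length

    # Residual with the conjugate transpose (the documented convention):
    herm = [sum(u[k].conjugate() * A[k][j] for k in range(2))
            - lam * u[j].conjugate() for j in range(2)]
    # Residual with the plain transpose (what my Fortran program computes):
    plain = [sum(u[k] * A[k][j] for k in range(2)) - lam * u[j]
             for j in range(2)]

    herm_norms.append(max(abs(z) for z in herm))
    plain_norms.append(max(abs(z) for z in plain))
    print('abs(u^H A - w u^H):', [abs(z) for z in herm])
    print('abs(u^T A - w u^T):', [abs(z) for z in plain])
```

Since u**T * A = conj(w) * u**T for real A, the plain-transpose residual is (conj(w) - w) * u**T, whose entries have magnitude 2*|Im(w)|*|u_j| — which works out to roughly the (21.1, 2.0) I reported above, so the eigenvectors themselves look fine and only my residual formula is suspect.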

