Hello,
I'd like to know how to use both MPI and ScaLAPACK simultaneously in F90.
I'm building a program that diagonalizes a large matrix using the pdsyev routine.
First, I want to evaluate the elements of the matrix to be diagonalized (referred to as H)
in parallel with MPI, by dividing the loops among the processes
so that each rank stores its own portion of H
(the rows of H are split into nprocs blocks and each rank holds one block).
After that, I want to diagonalize H with the pdsyev routine.
My problems are:
1) A run-time error ('SIGSEGV, segmentation fault occurred') occurs at
CALL BLACS_GRIDINFO( CONTXT, NPROW, NPCOL, MYROW, MYCOL )
What is wrong?
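For reference, this is the initialization order I understand the ScaLAPACK sample programs to use (a sketch based on my reading of the examples; the check on myrow is my own addition):
=============================================================
      ! set up the BLACS process grid before building any descriptor
      call blacs_pinfo(iam, nprocs)                   ! process id and process count
      call blacs_get(-1, 0, contxt)                   ! get the default system context
      call blacs_gridinit(contxt, 'R', nprow, npcol)  ! row-major nprow x npcol grid
      call blacs_gridinfo(contxt, nprow, npcol, myrow, mycol)
      if (myrow == -1) then
         ! this process is not in the grid (nprow*npcol < nprocs);
         ! it must not take part in the ScaLAPACK calls
      end if
=============================================================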
2) What is the relation between the MPI rank and the BLACS process id 'IAM' (from pdsyev's sample program)?
The elements of H are stored per MPI rank,
and must be passed into the BLACS-distributed local arrays via PDELSET(A,I,J,DESCA,H(I,J)).
Can I use both MPI and BLACS variables simultaneously?
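In other words, I am assuming that blacs_pinfo numbers the processes the same way as mpi_comm_rank when BLACS is built on top of MPI, but I have not verified this; the check I have in mind is a sketch like:
=============================================================
      ! sanity-check sketch: compare the MPI rank with the BLACS process id
      call mpi_comm_rank(mpi_comm_world, myrank, ierr)
      call blacs_pinfo(iam, nprocs)
      if (iam /= myrank) then
         write (*,*) 'numbering differs: mpi rank =', myrank, &
                     ', blacs iam =', iam
      end if
=============================================================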
If anyone has ideas, please give me suggestions.
Thank you in advance.
The basic structure of my program is shown below:
=============================================================
      double precision h(m,n), a(n/nprow,n/npcol)  ! local rows of H; local part of distributed A

      call mpi_init(ierr)
      call mpi_comm_size(mpi_comm_world, nprocs, ierr)
      call mpi_comm_rank(mpi_comm_world, myrank, ierr)

      m = n/nprocs          ! rows of H per rank (assumes nprocs divides n)
      istart = myrank*m     ! global row offset of this rank's block

!     each rank evaluates its own m rows of H
      do j = 1, n
         do i = 1, m
            h(i,j) = ...    ! evaluate global element H(istart+i, j)
         end do
      end do
      call mpi_barrier(mpi_comm_world, ierr)
!     define nb, nprow, npcol here
      call blacs_pinfo(iam, nprocs)
      call blacs_get(-1, 0, contxt)
      call blacs_gridinit(contxt, 'R', nprow, npcol)
      call blacs_gridinfo(contxt, nprow, npcol, myrow, mycol)
!     lda must be at least the local leading dimension, numroc(n,nb,myrow,0,nprow)
      call descinit(desca, n, n, nb, nb, 0, 0, contxt, lda, info)
      call descinit(descz, n, n, nb, nb, 0, 0, contxt, lda, info)
!     scatter the locally stored rows of H into the distributed matrix A
      do j = 1, n
         do i = 1, m
            ii = istart + i
            call pdelset(a, ii, j, desca, h(i,j))
         end do
      end do

      call pdsyev('N', 'U', n, a, 1, 1, desca, w, z, 1, 1, descz, &
                  work, lwork, info)

      call blacs_gridexit(contxt)
      call blacs_exit(1)    ! 1 = MPI will be finalized by mpi_finalize below
      call mpi_finalize(ierr)
      stop
      end
================================================================
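For completeness, I size the pdsyev workspace with the usual lwork = -1 workspace query (a sketch; the allocatable work array is my assumption):
=============================================================
      ! workspace-query sketch: with lwork = -1, pdsyev returns the
      ! required workspace size in the first element of the work array
      double precision wquery(1)
      call pdsyev('N', 'U', n, a, 1, 1, desca, w, z, 1, 1, descz, &
                  wquery, -1, info)
      lwork = int(wquery(1))
      allocate(work(lwork))
      call pdsyev('N', 'U', n, a, 1, 1, desca, w, z, 1, 1, descz, &
                  work, lwork, info)
=============================================================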

