On Oct 27, 2010, at 3:26 AM, Benjamin Sanderse wrote:
> I have a somewhat related question regarding sending data to Matlab.
> For a while I have been sending vectors back and forth between Matlab and
> PETSc, and that works perfectly.
>
> In addition, I also want to send some information such as the number of iterations,
> the norm of the residual, and the solution time to Matlab. This gives me some headaches
> when I run the code in parallel:
>
> - Can I simply send a PetscInt like number of iterations to Matlab? I tried
> PetscIntView, but this does not work:
> PetscInt iterations;
> ierr = KSPGetIterationNumber(ksp,&iterations);CHKERRQ(ierr);
> fd = PETSC_VIEWER_SOCKET_WORLD;
> ierr = PetscIntView(1,&iterations,fd);CHKERRQ(ierr);
>
> PETSc-Matlab communication hangs without an error on the PETSc side.
You can do a PetscIntView(), BUT since each process is sending a single
integer (you are passing 1 as the first argument on all processes), you
need to read all of those integers on the Matlab side with read(fd,size,'int32').
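For example, a minimal sketch (untested; the helper name send_iterations is made up, and note that PetscIntView() takes a pointer to the data):

```c
#include <petscksp.h>

/* Hypothetical helper: every process sends its (identical) iteration
   count through the socket viewer; with n processes the Matlab side must
   then do read(fd,n,'int32') to drain all n integers. */
PetscErrorCode send_iterations(KSP ksp)
{
  PetscErrorCode ierr;
  PetscInt       iterations;

  ierr = KSPGetIterationNumber(ksp,&iterations);CHKERRQ(ierr);
  /* Pass the ADDRESS of the data, since PetscIntView() takes an array. */
  ierr = PetscIntView(1,&iterations,PETSC_VIEWER_SOCKET_WORLD);CHKERRQ(ierr);
  return 0;
}
```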
>
> - As an alternative I tried to create a vector and send it to Matlab with VecView.
> This works, although it results (for this example) in an nx1 vector (n = no. of
> processors) being received by Matlab, while I actually just want a 1x1
> vector:
> ierr = VecCreateMPI(PETSC_COMM_WORLD,1,PETSC_DECIDE,&to_matlab);CHKERRQ(ierr);
> ierr = KSPGetIterationNumber(ksp,&iterations);CHKERRQ(ierr);
> fd = PETSC_VIEWER_SOCKET_WORLD;
> ierr = VecSet(to_matlab,iterations);CHKERRQ(ierr);
> ierr = VecView(to_matlab,fd);CHKERRQ(ierr);
You are creating a Vec with local size 1, so its total size is the number of
processes. If you want a Vec of total size one, then use
ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&to_matlab);CHKERRQ(ierr);
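A minimal sketch of that (untested; written against a current PETSc API, where VecDestroy() takes the address of the Vec; the helper name is made up):

```c
#include <petscksp.h>

/* Hypothetical helper: ship the iteration count to Matlab as a Vec of
   GLOBAL size 1, so Matlab receives a 1x1 vector regardless of the
   number of processes. */
PetscErrorCode send_iteration_count(KSP ksp)
{
  PetscErrorCode ierr;
  PetscInt       iterations;
  Vec            to_matlab;

  ierr = KSPGetIterationNumber(ksp,&iterations);CHKERRQ(ierr);
  /* PETSC_DECIDE for the LOCAL size, 1 for the GLOBAL size */
  ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&to_matlab);CHKERRQ(ierr);
  /* Safe with a collective VecSet(): iterations is identical everywhere */
  ierr = VecSet(to_matlab,(PetscScalar)iterations);CHKERRQ(ierr);
  ierr = VecView(to_matlab,PETSC_VIEWER_SOCKET_WORLD);CHKERRQ(ierr);
  ierr = VecDestroy(&to_matlab);CHKERRQ(ierr);
  return 0;
}
```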
>
> - So although I am not too happy with an nx1 vector instead of a 1x1 vector, I
> could live with that. A bigger problem is that if, instead of the number of
> iterations, I want to pass the solution time to a vector, I get an error:
>
> PetscReal time1,time2,t_solve;
>
> ierr = VecCreateMPI(PETSC_COMM_WORLD,1,PETSC_DECIDE,&to_matlab);CHKERRQ(ierr);
> fd = PETSC_VIEWER_SOCKET_WORLD;
> ierr = PetscGetTime(&time1);CHKERRQ(ierr);
> // some matrix solve
> ierr = PetscGetTime(&time2);CHKERRQ(ierr);
> t_solve = time2-time1;
> ierr = VecSet(to_matlab,t_solve);CHKERRQ(ierr);
What do you want to send to Matlab? The sum of the times from all processes?
ALL of the times? The maximum time? If you want the sum or the max, then use
MPI_Allreduce() first and pass the result into the Vec. If you want all of the
times, then you do not want VecSet(); you want VecSetValues(), with each
process setting its own time into its own position in the vector.
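A minimal sketch of both options (untested; the helper name send_time and its arguments are made up, and it is written against a current PETSc API):

```c
#include <petscsys.h>
#include <petscvec.h>

/* Hypothetical helper: send the solve time t_local to Matlab.
   all_times = PETSC_FALSE -> 1x1 vector holding the maximum over all ranks;
   all_times = PETSC_TRUE  -> n x 1 vector holding each rank's own time. */
PetscErrorCode send_time(double t_local,PetscBool all_times)
{
  PetscErrorCode ierr;
  Vec            to_matlab;

  if (!all_times) {
    double t_max;
    /* Reduce first so every process calls VecSet() with the SAME value --
       the per-process times differing is exactly what triggers the
       "Same value should be used across all processors!" error. */
    ierr = MPI_Allreduce(&t_local,&t_max,1,MPI_DOUBLE,MPI_MAX,PETSC_COMM_WORLD);CHKERRQ(ierr);
    ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&to_matlab);CHKERRQ(ierr);
    ierr = VecSet(to_matlab,t_max);CHKERRQ(ierr);
  } else {
    PetscMPIInt rank;
    PetscInt    idx;
    PetscScalar val = t_local;
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
    ierr = VecCreateMPI(PETSC_COMM_WORLD,1,PETSC_DECIDE,&to_matlab);CHKERRQ(ierr);
    idx  = rank;
    /* Each process inserts its own time at its own index. */
    ierr = VecSetValues(to_matlab,1,&idx,&val,INSERT_VALUES);CHKERRQ(ierr);
    ierr = VecAssemblyBegin(to_matlab);CHKERRQ(ierr);
    ierr = VecAssemblyEnd(to_matlab);CHKERRQ(ierr);
  }
  ierr = VecView(to_matlab,PETSC_VIEWER_SOCKET_WORLD);CHKERRQ(ierr);
  ierr = VecDestroy(&to_matlab);CHKERRQ(ierr);
  return 0;
}
```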
Barry
> ierr = VecView(to_matlab,fd);CHKERRQ(ierr);
>
> this produces the following error:
> [1]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [1]PETSC ERROR: Invalid argument!
> [1]PETSC ERROR: Same value should be used across all processors!
> [1]PETSC ERROR:
> ------------------------------------------------------------------------
>
> When I run PETSc with 1 processor there is no error. Any ideas?
>
>
> Ben
>
>
> ----- Original Message -----
> From: "Barry Smith" <bsmith at mcs.anl.gov>
> To: "PETSc users list" <petsc-users at mcs.anl.gov>
> Sent: Tuesday, October 26, 2010 10:57:04 PM
> Subject: Re: [petsc-users] Writing PETSc matrices
>
>
Use PetscBinaryWrite('filename',sparsematlabmatrix). I do not know why your
second argument has quotes around it; it should be the Matlab matrix variable
itself, not a string.
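> On the PETSc side, a matrix written this way can be read back with
> MatLoad(). A minimal, untested sketch (the filename is illustrative, and
> the MatLoad() calling sequence has changed across PETSc versions; this
> uses the current one):

```c
#include <petscmat.h>

int main(int argc,char **argv)
{
  Mat            A;
  PetscViewer    fd;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,(char*)0,(char*)0);if (ierr) return ierr;
  /* "output.ex" is the file written in Matlab with
     PetscBinaryWrite('output.ex',A_matlab); the name is illustrative. */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"output.ex",FILE_MODE_READ,&fd);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetType(A,MATAIJ);CHKERRQ(ierr);
  ierr = MatLoad(A,fd);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&fd);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
```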
>
> Barry
>
>
> On Oct 26, 2010, at 3:33 PM, Nunion wrote:
>
>> Hello,
>>
>> I am new to PETSc and programming. I have a question concerning writing
>> PETSc matrices in binary from binary matrices [compressed/uncompressed]
>> generated in Matlab. I am attempting to use the files in the /bin/matlab
>> directory, in particular the PetscBinaryWrite.m file. However, the usage;
>>
>> PetscBinaryWrite('matrix.mat','output.ex') does not seem to work. I also
>> tried using the examples in the /mat directory; however, Matlab does not
>> support the writing of complex matrices in ASCII.
>>
>> Thanks in advance,
>>
>> Tom
>