Re: pickle/dump
2014-08-21 22:17 GMT+02:00 Daniel Wheeler daniel.wheel...@gmail.com:

> On Wed, Aug 20, 2014 at 6:12 PM, Seufzer, William J. (LARC-D307) bill.seuf...@nasa.gov wrote:
>> Thanks Dan, This works... but I also made the change to nonUniformGrid3D.py as well. I noticed the simple edits, made them by hand, and re-installed FiPy in both environments.
>
> I missed that. Thanks for pointing it out.
>
>> Just a note (mainly for anyone else who runs into this): any data files created with the old code will still not be readable with this code update in the non-Trilinos environment. Both environments need to have the updated code. Hope I stated that clearly!
>
> In my experience, pickling doesn't work very well for long- or medium-term data storage. For medium/short-term storage (the lifetime of a project), I always save just the numpy arrays (not with pickle, but with Pandas or numpy.savetxt), not the FiPy objects, to avoid the kinds of problems you're having. I don't currently have a good solution for long-term data storage.

For long-term storage, specific serialize and deserialize methods are needed, in text format for the very long term (across OS and architecture boundaries). For example, Boost.Serialization for C++ classes (http://www.boost.org/doc/libs/1_56_0/libs/serialization/doc/index.html) is easy to add to a class. For Python, working on top of json (https://docs.python.org/2/library/json.html) seems like the way to go.

Benny

> --
> Daniel Wheeler

___
fipy mailing list
fipy@nist.gov
http://www.ctcms.nist.gov/fipy
[ NIST internal ONLY: https://email.nist.gov/mailman/listinfo/fipy ]
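[Editor's note: Benny's json suggestion might look like the minimal sketch below for run parameters. The parameter names and file name are made up for illustration; any flat dict of numbers and strings works the same way.]

```python
import json

# Hypothetical run parameters -- the names here are illustrative only.
params = {'nx': 100, 'ny': 10, 'dt': 0.1, 'solver': 'scipy'}

# Plain-text JSON survives OS/architecture boundaries and library upgrades,
# unlike a pickle of a live FiPy object.
with open('params.json', 'w') as f:
    json.dump(params, f, indent=2, sort_keys=True)

with open('params.json') as f:
    restored = json.load(f)
```

Restoring gives back an ordinary dict, readable by any JSON-aware tool, with no dependency on FiPy or Trilinos.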
Re: pickle/dump
On Aug 22, 2014, at 9:49 AM, Daniel Wheeler daniel.wheel...@gmail.com wrote:

> Benny, thanks for the insights. I was thinking more along the lines of standard ways to store meshes, field variables, equations, and parameters, with standard representations independent of FiPy and more interoperable with other solvers, viewers, etc. I really need to finish up my Xdmf branch (http://www.xdmf.org).
Re: pickle/dump
Thanks for the note. We may be able to institute a hack in our pickle routines to accommodate legacy dumps (we've done it in other contexts). We really need to start versioning our data files to make that process easier.

On Aug 20, 2014, at 6:12 PM, Seufzer, William J. (LARC-D307) bill.seuf...@nasa.gov wrote:

> Thanks Dan, This works... but I also made the change to nonUniformGrid3D.py as well. I noticed the simple edits, made them by hand, and re-installed FiPy in both environments.
>
> Just a note (mainly for anyone else who runs into this): any data files created with the old code will still not be readable with this code update in the non-Trilinos environment. Both environments need to have the updated code. Hope I stated that clearly!
>
> Cheers, Bill
>
> On Aug 15, 2014, at 5:58 PM, Daniel Wheeler daniel.wheel...@gmail.com wrote:
>
>> Hi Bill, Sorry for taking so long to respond. I tried running the scripts again in the different conda environments (with and without Trilinos) and do indeed get the same error as you. I probably didn't switch environments properly when I tried this before. Anyway, I think I've fixed the issue; the changes I made are http://matforge.org/fipy/changeset/b7246011a00584b5e6757404b2b51ca47d71004a/fipy/ and the ticket is http://matforge.org/fipy/ticket/669. You'll have to fetch from the main repository and check out the ticket669-pickle_comm branch to get these changes if they are important to you.
>>
>> Cheers, Daniel
>>
>> On Mon, Aug 11, 2014 at 3:54 PM, Seufzer, William J. (LARC-D307) bill.seuf...@nasa.gov wrote:
>>
>>> Thanks Dan, I ran the example codes that you provided and still have the issue. I'm running the writer code on a cluster with PBS (that is, I can't just invoke MPI from the command line) to create the 'dump.gz' file with 16 cores. I then copy the file to the desktop, and when I try to open it I get:
>>>
>>>     therm: python fipyreaddump.py
>>>     Traceback (most recent call last):
>>>       File "fipyreaddump.py", line 4, in <module>
>>>         v = fp.tools.dump.read('data.dump')
>>>       File "/Users/wseufzer/anaconda/lib/python2.7/site-packages/FiPy-3.1-py2.7.egg/fipy/tools/dump.py", line 151, in read
>>>         return unpickler.load()
>>>       File "/Users/wseufzer/anaconda/lib/python2.7/site-packages/FiPy-3.1-py2.7.egg/fipy/tools/comms/mpi4pyCommWrapper.py", line 55, in __setstate__
>>>         from PyTrilinos import Epetra
>>>     ImportError: No module named PyTrilinos
>>>
>>> It appears that in pickling/dumping the cell variable, information regarding PyTrilinos is stored. I am successful if I set the environment variable FIPY_SOLVERS to 'scipy' on the cluster, run the code with one core, and then bring that file to the desktop machine. I'm enclosing an example file with nx=100 and ny=10, written on the cluster.
>>>
>>> Cheers, Bill
>>
>> --
>> Daniel Wheeler
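[Editor's note: Bill's FIPY_SOLVERS workaround amounts to the following shell sketch. `write_dump.py` is a hypothetical stand-in for the writer script; the point is that with scipy solvers selected, the pickle carries a serial communicator and never references PyTrilinos.]

```shell
# Force FiPy's scipy solvers (and hence a serial communicator) when writing
# the dump, so the resulting file can be read on a machine without Trilinos.
export FIPY_SOLVERS=scipy
echo "FIPY_SOLVERS is now: $FIPY_SOLVERS"
# python write_dump.py   # hypothetical writer script, run on a single core
```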
Re: pickle/dump
On Wed, Aug 20, 2014 at 6:12 PM, Seufzer, William J. (LARC-D307) bill.seuf...@nasa.gov wrote:

> Thanks Dan, This works... but I also made the change to nonUniformGrid3D.py as well. I noticed the simple edits, made them by hand, and re-installed FiPy in both environments.

I missed that. Thanks for pointing it out.

> Just a note (mainly for anyone else who runs into this): any data files created with the old code will still not be readable with this code update in the non-Trilinos environment. Both environments need to have the updated code. Hope I stated that clearly!

In my experience, pickling doesn't work very well for long- or medium-term data storage. For medium/short-term storage (the lifetime of a project), I always save just the numpy arrays (not with pickle, but with Pandas or numpy.savetxt), not the FiPy objects, to avoid the kinds of problems you're having. I don't currently have a good solution for long-term data storage.

--
Daniel Wheeler
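[Editor's note: Daniel's array-only approach might look like the sketch below. The names `x`, `y`, `phi`, and the file name are illustrative; in a real script they would come from a FiPy mesh and CellVariable, e.g. `x, y = mesh.cellCenters` and `phi = v.value`.]

```python
import numpy as np

# Stand-ins for FiPy data: cell-center coordinates and a field value.
x = np.linspace(0.0, 1.0, 5)
y = np.linspace(0.0, 1.0, 5)
phi = x * y

# One plain-text file per checkpoint: columns are x, y, value.
np.savetxt('checkpoint.txt', np.column_stack((x, y, phi)), header='x y phi')

# Reading it back needs only numpy -- no Trilinos, no FiPy version match.
data = np.loadtxt('checkpoint.txt')
x2, y2, phi2 = data.T
```

The trade-off is that mesh topology and equations are not stored, only the arrays, which is exactly why it sidesteps the unpickling problem in this thread.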
Re: pickle/dump
Hi Bill,

Sorry for taking so long to respond. I tried running the scripts again in the different conda environments (with and without Trilinos) and do indeed get the same error as you. I probably didn't switch environments properly when I tried this before. Anyway, I think I've fixed the issue; the changes I made are http://matforge.org/fipy/changeset/b7246011a00584b5e6757404b2b51ca47d71004a/fipy/ and the ticket is http://matforge.org/fipy/ticket/669. You'll have to fetch from the main repository and check out the ticket669-pickle_comm branch to get these changes if they are important to you.

Cheers, Daniel

On Mon, Aug 11, 2014 at 3:54 PM, Seufzer, William J. (LARC-D307) bill.seuf...@nasa.gov wrote:

> Thanks Dan, I ran the example codes that you provided and still have the issue. I'm running the writer code on a cluster with PBS (that is, I can't just invoke MPI from the command line) to create the 'dump.gz' file with 16 cores. I then copy the file to the desktop, and when I try to open it I get:
>
>     therm: python fipyreaddump.py
>     Traceback (most recent call last):
>       File "fipyreaddump.py", line 4, in <module>
>         v = fp.tools.dump.read('data.dump')
>       File "/Users/wseufzer/anaconda/lib/python2.7/site-packages/FiPy-3.1-py2.7.egg/fipy/tools/dump.py", line 151, in read
>         return unpickler.load()
>       File "/Users/wseufzer/anaconda/lib/python2.7/site-packages/FiPy-3.1-py2.7.egg/fipy/tools/comms/mpi4pyCommWrapper.py", line 55, in __setstate__
>         from PyTrilinos import Epetra
>     ImportError: No module named PyTrilinos
>
> It appears that in pickling/dumping the cell variable, information regarding PyTrilinos is stored. I am successful if I set the environment variable FIPY_SOLVERS to 'scipy' on the cluster, run the code with one core, and then bring that file to the desktop machine. I'm enclosing an example file with nx=100 and ny=10, written on the cluster.
>
> Cheers, Bill

--
Daniel Wheeler
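[Editor's note: the traceback shows the failure happens in `__setstate__`, which unconditionally imports PyTrilinos. The sketch below is NOT the actual ticket669 change, just an illustration of the kind of guard that makes such a wrapper unpickle gracefully when Trilinos is absent; the class and attribute names are hypothetical.]

```python
import pickle

class CommWrapper(object):
    """Hypothetical comm wrapper whose unpickling no longer
    hard-requires PyTrilinos (sketch, not FiPy's real code)."""
    def __init__(self):
        self.label = 'comm wrapper (sketch)'
        self.comm = None

    def __getstate__(self):
        # Drop the live communicator: it is environment-specific.
        state = self.__dict__.copy()
        state.pop('comm', None)
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        try:
            from PyTrilinos import Epetra   # parallel environment
            self.comm = Epetra.PyComm()
        except ImportError:
            self.comm = None                # serial fallback

w = CommWrapper()
w.comm = object()                           # stand-in for an Epetra comm
w2 = pickle.loads(pickle.dumps(w))          # succeeds without PyTrilinos
```

On a machine without PyTrilinos, `w2.comm` comes back as `None` instead of raising ImportError, which is the behavior Bill needed on his laptop.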
Re: pickle/dump
Thanks Dan,

I ran the example codes that you provided and still have the issue. I'm running the writer code on a cluster with PBS (that is, I can't just invoke MPI from the command line) to create the 'dump.gz' file with 16 cores. I then copy the file to the desktop, and when I try to open it I get:

    therm: python fipyreaddump.py
    Traceback (most recent call last):
      File "fipyreaddump.py", line 4, in <module>
        v = fp.tools.dump.read('data.dump')
      File "/Users/wseufzer/anaconda/lib/python2.7/site-packages/FiPy-3.1-py2.7.egg/fipy/tools/dump.py", line 151, in read
        return unpickler.load()
      File "/Users/wseufzer/anaconda/lib/python2.7/site-packages/FiPy-3.1-py2.7.egg/fipy/tools/comms/mpi4pyCommWrapper.py", line 55, in __setstate__
        from PyTrilinos import Epetra
    ImportError: No module named PyTrilinos

It appears that in pickling/dumping the cell variable, information regarding PyTrilinos is stored. I am successful if I set the environment variable FIPY_SOLVERS to 'scipy' on the cluster, run the code with one core, and then bring that file to the desktop machine. I'm enclosing an example file with nx=100 and ny=10, written on the cluster.

[Attachment: data.dump]

Cheers,
Bill

On Aug 5, 2014, at 3:29 PM, Daniel Wheeler daniel.wheel...@gmail.com wrote:

> Hi Bill,
>
> I tried to do this using conda environments and didn't have a problem. I ran
>
>     import fipy as fp
>     m = fp.Grid2D(nx=10, ny=10)
>     v = fp.CellVariable(mesh=m, value=m.x * m.y)
>     fp.dump.write(v, 'dump.gz')
>
> in a conda environment with PyTrilinos installed, using "mpirun -np 2 python bill.py", and then I changed to an environment without Trilinos installed:
>
>     $ source activate notril
>     $ python -c "from PyTrilinos import Epetra"
>     ImportError: No module named PyTrilinos
>
> I then ran
>
>     import fipy as fp
>     v = fp.tools.dump.read('data.gz')
>     print v
>
> with "python bill1.py" and it worked fine. I did this with the "develop" branch, but I doubt it matters. Does the above work for you? Can you dump the file in an environment with PyTrilinos and then read it in an environment without PyTrilinos?
>
> Cheers,
> Daniel
>
> On Fri, Aug 1, 2014 at 11:56 AM, Seufzer, William J. (LARC-D307) bill.seuf...@nasa.gov wrote:
>
>> FiPy, I would like to be able to dump (fipy.tools.dump) data while running on a cluster and later visualize data from the dump files on my laptop. The dump.write goes well on the cluster, but when I try dump.read on my laptop I get:
>>
>>     File "/Users/wseufzer/anaconda/lib/python2.7/site-packages/FiPy-3.1-py2.7.egg/fipy/tools/comms/mpi4pyCommWrapper.py", line 55, in __setstate__
>>       from PyTrilinos import Epetra
>>     ImportError: No module named PyTrilinos
>>
>> I would rather not have to build Trilinos on my laptop. So yes, I do have PyTrilinos installed on the cluster with FiPy. I made attempts at using communicator = None, and at using the SerialCommWrapper, but I may not have used the correct syntax; I got various errors.
>>
>> Thanks, Bill
Re: pickle/dump
Hi Bill,

I tried to do this using conda environments and didn't have a problem. I ran

    import fipy as fp
    m = fp.Grid2D(nx=10, ny=10)
    v = fp.CellVariable(mesh=m, value=m.x * m.y)
    fp.dump.write(v, 'dump.gz')

in a conda environment with PyTrilinos installed, using "mpirun -np 2 python bill.py", and then I changed to an environment without Trilinos installed:

    $ source activate notril
    $ python -c "from PyTrilinos import Epetra"
    ImportError: No module named PyTrilinos

I then ran

    import fipy as fp
    v = fp.tools.dump.read('data.gz')
    print v

with "python bill1.py" and it worked fine. I did this with the "develop" branch, but I doubt it matters. Does the above work for you? Can you dump the file in an environment with PyTrilinos and then read it in an environment without PyTrilinos?

Cheers, Daniel

On Fri, Aug 1, 2014 at 11:56 AM, Seufzer, William J. (LARC-D307) bill.seuf...@nasa.gov wrote:

> FiPy, I would like to be able to dump (fipy.tools.dump) data while running on a cluster and later visualize data from the dump files on my laptop. The dump.write goes well on the cluster, but when I try dump.read on my laptop I get:
>
>     File "/Users/wseufzer/anaconda/lib/python2.7/site-packages/FiPy-3.1-py2.7.egg/fipy/tools/comms/mpi4pyCommWrapper.py", line 55, in __setstate__
>       from PyTrilinos import Epetra
>     ImportError: No module named PyTrilinos
>
> I would rather not have to build Trilinos on my laptop. So yes, I do have PyTrilinos installed on the cluster with FiPy. I made attempts at using communicator = None, and at using the SerialCommWrapper, but I may not have used the correct syntax; I got various errors.
>
> Thanks, Bill

--
Daniel Wheeler
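[Editor's note: for context, `fipy.tools.dump` is, as far as I know, essentially pickle written through a gzipped file. The standalone sketch below illustrates that mechanism with a plain numpy array standing in for a CellVariable; the function names mirror `dump.write`/`dump.read` but are not FiPy's actual code.]

```python
import gzip
import pickle
import numpy as np

def write(obj, filename):
    # Roughly what fipy.tools.dump.write does: pickle into a gzipped file.
    with gzip.open(filename, 'wb') as f:
        pickle.dump(obj, f)

def read(filename):
    with gzip.open(filename, 'rb') as f:
        return pickle.load(f)

v = np.arange(10.0)        # stand-in for a CellVariable's data
write(v, 'dump.gz')
v2 = read('dump.gz')
```

Because it is still pickle underneath, anything reachable from the dumped object (including a Trilinos-backed communicator) must be importable at read time, which is exactly what this thread tripped over.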