On 3 October 2013 09:49, Chris Richardson <[email protected]> wrote:
> On 03/10/2013 09:40, Maximilian Albert wrote:
>>
>> 2013/10/3 Garth N. Wells <[email protected]>:
>>
>>> What's required is the right abstraction for handling Functions and
>>> files. I think the hashing approach is more of a hack. What about
>>> something along the lines of:
>>>
>>> Function u(V);
>>> Function w(V);
>>>
>>> HDF5Function hdf5_function_file("my_filename.h5", "w");
>>> hdf5_function_file.register(u, "u_name");
>>> hdf5_function_file.register(w, "w_name");
>>>
>>> hdf5_function_file.parameters["common_mesh"] = true;
>>> hdf5_function_file.parameters["write_mesh_once"] = true;
>>>
>>> // Write all registered functions
>>> hdf5_function_file.write();
>>>
>>> // Write all registered functions again
>>> hdf5_function_file.write();
>>>
>>> // Write u only
>>> hdf5_function_file.write("u_name");
>>
>>
>> I can't comment on the efficiency/implementation side of things, but
>> from a user's point of view my first reaction is that I like this
>> idea.
>>
>> My question is: how does this relate to time-dependent problems? Would
>> it be easy to associate timestep information with the saved functions
>> through the interface suggested above? From a UI point of view I would
>> imagine that something like this makes sense:
>>
>> // Write all functions at timestep t=0
>> hdf5_function_file.write(t=0);
>>
>> // Write u only at timestep t=2.5
>> hdf5_function_file.write("u_name", t=2.5);
>>
>> (If no timestep is provided, it could just increase in steps of 1 or so.)
>>
>> Are there any fundamental problems with this approach I'm missing? If
>> not, is it something you'd be willing to implement/support? Also,
>> could this be easily integrated with XDMF files, so that animations
>> (e.g. in Paraview) would use the correct timesteps? I haven't checked
>> recently, but a while ago whenever a field was saved in dolfin this
>> created a new timestep in the XDMF file, so that it was impossible to
>> animate a timeseries of two fields simultaneously.
>>
>
> I am not entirely convinced that this extra level of complexity is required.
> I think HDF5File should be a generic container which can accept
> different types of object inside it, rather than having different types of
> file for different types of object.
>
The point is to reduce the complexity for the user via a wrapper that manages
the IO details for a Function. The IO to file would still be handled through
HDF5File, and a user could still work at a lower level directly with HDF5File
if they wish. This is cleaner because HDF5File can stay more abstract; I think
it's too much to ask one class to manage the IO details of all object types.

Garth

> It is quite reasonable to attach tags (such as timestamps) to HDF5 datasets,
> and we should support this through the HDF5 attributes interface.
>
> The HDF5File interface to Function is fundamentally incompatible with
> visualisation, because it supports a wider range of FunctionSpaces.
>
> Chris
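
For concreteness, below is a minimal sketch of what such a wrapper could look
like. Nothing here is an existing DOLFIN class: the name HDF5FunctionFile, the
add() method (note that "register", used in the proposal above, is a reserved
keyword in C++ and cannot be a method name) and the per-timestep dataset
naming are all illustrative, and the sketch assumes a dolfin::HDF5File with a
(filename, mode) constructor and a write(Function, name) method, roughly as
DOLFIN provided at the time.

  #include <map>
  #include <sstream>
  #include <string>
  #include <dolfin.h>

  // Hypothetical wrapper -- not an existing DOLFIN class. All actual IO is
  // delegated to dolfin::HDF5File, which stays a generic container.
  class HDF5FunctionFile
  {
  public:
    // Open (or create) the underlying HDF5 file.
    HDF5FunctionFile(std::string filename, std::string mode)
      : _file(filename, mode), _counter(0) {}

    // Register a Function under a dataset name; nothing is written yet.
    void add(const dolfin::Function& u, std::string name)
    { _functions[name] = &u; }

    // Write all registered Functions. If no time is supplied, an internal
    // counter (0, 1, 2, ...) is used as the time value.
    void write() { write(static_cast<double>(_counter++)); }

    void write(double t)
    {
      std::map<std::string, const dolfin::Function*>::const_iterator it;
      for (it = _functions.begin(); it != _functions.end(); ++it)
        write(it->first, t);
    }

    // Write a single registered Function at time t; the dataset name is
    // made unique per time step, e.g. "u_name/2.5".
    void write(std::string name, double t)
    {
      std::ostringstream s;
      s << name << "/" << t;
      _file.write(*_functions[name], s.str());
    }

  private:
    dolfin::HDF5File _file;
    std::map<std::string, const dolfin::Function*> _functions;
    std::size_t _counter;
  };

The wrapper is deliberately thin: HDF5File remains the generic container Chris
describes, while the Function-specific bookkeeping (registered names, time
values, write-the-mesh-once flags and so on) lives in one small class on top
of it, which is the reduction in user-facing complexity argued for above.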

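On the attribute side, tagging a dataset with a time value is straightforward
at the HDF5 level. The sketch below goes through the raw HDF5 C API rather
than anything in DOLFIN, and the dataset path passed in (e.g. "/u_name/0") is
only a placeholder for wherever the Function data actually ends up in the
file.

  #include <hdf5.h>

  // Attach a scalar double attribute named "timestamp" to an existing
  // dataset, using the plain HDF5 C API.
  void tag_dataset_with_time(const char* filename, const char* dataset,
                             double t)
  {
    hid_t file  = H5Fopen(filename, H5F_ACC_RDWR, H5P_DEFAULT);
    hid_t dset  = H5Dopen2(file, dataset, H5P_DEFAULT);
    hid_t space = H5Screate(H5S_SCALAR);

    hid_t attr = H5Acreate2(dset, "timestamp", H5T_NATIVE_DOUBLE, space,
                            H5P_DEFAULT, H5P_DEFAULT);
    H5Awrite(attr, H5T_NATIVE_DOUBLE, &t);

    H5Aclose(attr);
    H5Sclose(space);
    H5Dclose(dset);
    H5Fclose(file);
  }

  // e.g. tag_dataset_with_time("my_filename.h5", "/u_name/0", 2.5);

Whether a tool like Paraview actually picks the value up is a separate
question (XDMF carries its own time information in the XML), but an attribute
does give post-processing and restart code a well-defined place to look for
the time value.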