We could have two tarballs, the petsc release with backward compatibility and one without, and randomly parcel them out to downloaders. Then wait two releases, collect all the petsc-maint data from both sets of users, and see which set is more of a pain in the ass.
   Barry

On Dec 14, 2012, at 8:29 PM, Karl Rupp <rupp at mcs.anl.gov> wrote:

> Alright, then let's have it this way. :-)
>
>>> It won't fix any complaints because people will just live with their
>>> out-of-date code until we remove the legacy support and THEN they will
>>> complain. (And by then we'll have half-forgotten what we did, so it will
>>> be harder to help people with their complaints.) Far better to force
>>> them to change immediately than to drag it out.
>>
>> I agree with Barry here. All these measures are made for conscientious
>> users who read the release notes and look at warnings. They are
>> wonderful; however, like Santa, they also do not exist.
>>
>> It has to work for the worst (automatic conversion), or it's not worth it.
>>
>>    Matt
>>
>>> Barry
>>>
>>> On Dec 14, 2012, at 4:13 PM, Karl Rupp <rupp at mcs.anl.gov> wrote:
>>>
>>>> Hey,
>>>>
>>>> On 12/13/2012 03:46 PM, Jed Brown wrote:
>>>>> I'm sure that users would appreciate one release of deprecation.
>>>>> It's not hard to implement when deprecating a routine entirely, but
>>>>> it's trickier when changing the interface to an existing routine. It
>>>>> can be achieved through a "feature test macro" that asks for the old
>>>>> version, though this still requires that the user "change their code"
>>>>> (or preprocessor definitions) to build with the new version. Some
>>>>> projects include the version number in the API, but that looks ugly
>>>>> and confusing to the user, especially after the old version has been
>>>>> removed.
>>>>
>>>> I'm also in favor of offering a 'grace period' for the main user
>>>> functions, at least for the most frequently used ones. I don't have a
>>>> good strategy at hand for dealing with changes to existing functions,
>>>> though.
>>>>
>>>>> It's technically feasible for PETSc to offer this, but it's still not
>>>>> a trivial amount of effort and doesn't fix all the user's complaints.
>>>>
>>>> If it fixes half of the user complaints we would get otherwise, it's
>>>> probably already worth the effort...
>>>>
>>>> Best regards,
>>>> Karli
>>>>
>>>>> On Sat, Dec 8, 2012 at 6:05 PM, Karl Rupp <rupp at mcs.anl.gov> wrote:
>>>>>
>>>>> Hey,
>>>>>
>>>>> this thread is sufficiently young that I can add an interesting
>>>>> statement from LLVM rather than opening a new thread:
>>>>>
>>>>> "Another major aspect of LLVM remaining nimble (and a controversial
>>>>> topic with clients of the libraries) is our willingness to reconsider
>>>>> previous decisions and make widespread changes to APIs without
>>>>> worrying about backwards compatibility. Invasive changes to LLVM IR
>>>>> itself, for example, require updating all of the optimization passes
>>>>> and cause substantial churn to the C++ APIs. We've done this on
>>>>> several occasions, and though it causes pain for clients, it is the
>>>>> right thing to do to maintain rapid forward progress. To make life
>>>>> easier for external clients (and to support bindings for other
>>>>> languages), we provide C wrappers for many popular APIs (which are
>>>>> intended to be extremely stable) and new versions of LLVM aim to
>>>>> continue reading old .ll and .bc files."
>>>>>
>>>>> as well as
>>>>>
>>>>> "Despite its success so far, there is still a lot left to be done, as
>>>>> well as the ever-present risk that LLVM will become less nimble and
>>>>> more calcified as it ages. While there is no magic answer to this
>>>>> problem, I hope that the continued exposure to new problem domains, a
>>>>> willingness to reevaluate previous decisions, and to redesign and
>>>>> throw away code will help. After all, the goal isn't to be perfect,
>>>>> it is to keep getting better over time."
>>>>>
>>>>> (Page 3 in [1])
>>>>>
>>>>> The whole article is a good read. Modularity and reusability don't
>>>>> seem to be something compiler people outside LLVM have really cared
>>>>> about (I haven't checked this claim, though).
>>>>>
>>>>> Overall, the take-away is that they also argue in favor of preserving
>>>>> a clean and consistent design even though it requires sacrificing
>>>>> backwards compatibility. Also, they are aware of the hassle for their
>>>>> users and try to make the transition less painful. A similar model
>>>>> should work with PETSc, e.g. by using pragma messages to point at
>>>>> deprecated functionality for a while. This would allow users to still
>>>>> obtain an executable when moving to a newer version, but at the same
>>>>> time give them a clear indication that they should migrate to the new
>>>>> interface soon (and not necessarily *immediately*).
>>>>>
>>>>> Best regards,
>>>>> Karli
>>>>>
>>>>> [1] http://www.drdobbs.com/architecture-and-design/the-design-of-llvm/240001128
>>>>> [2] http://blog.llvm.org/2011/12/nvidia-cuda-41-compiler-now-built-on.html
>>>>>
>>>>> On 11/26/2012 06:08 PM, Jed Brown wrote:
>>>>>
>>>>>     Point in favor of evolutionary libraries over the Matryoshka
>>>>>     dolls that arise when interfaces are frozen forever.
>>>>>
>>>>>     http://akkartik.name/blog/libraries2
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which
>> their experiments lead.
>> -- Norbert Wiener
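
[Editor's note: to make the two mechanisms discussed in the thread concrete, here is a minimal C sketch of (a) Karl's suggestion of warning at compile time about deprecated functionality and (b) Jed's "feature test macro" for a routine whose calling sequence changes. Every name in it (LIB_DEPRECATED, LIB_WANT_OLD_API, LibSolverSetTol, LibSolverSolve, ...) is a hypothetical stand-in, not the actual PETSc API, and the attribute form shown assumes GCC 4.5+ or Clang.]

/* deprecation_sketch.c -- hypothetical illustration only, not real PETSc API. */
#include <stdio.h>

/* Deprecation marker: on GCC/Clang every call to a marked routine produces a
 * compile-time warning naming the replacement; elsewhere it is a no-op. */
#if defined(__GNUC__) || defined(__clang__)
#  define LIB_DEPRECATED(why) __attribute__((deprecated(why)))
#else
#  define LIB_DEPRECATED(why)
#endif

/* Case 1: a routine was replaced outright.  The old name survives for one
 * release as a thin wrapper, but warns at every call site. */
int LibSolverSetTolerances(double rtol, double atol)   /* new interface */
{
  printf("rtol=%g atol=%g\n", rtol, atol);
  return 0;
}

LIB_DEPRECATED("use LibSolverSetTolerances() instead")
static int LibSolverSetTol(double rtol)                /* old interface */
{
  return LibSolverSetTolerances(rtol, 1e-50);
}

/* Case 2: an existing routine changed its calling sequence (it gained a
 * 'restart' argument).  A user who defines LIB_WANT_OLD_API keeps the old
 * calling sequence for one release, mapped onto the new routine. */
int LibSolverSolve(double rtol, int restart)           /* new interface */
{
  printf("solve: rtol=%g restart=%d\n", rtol, restart);
  return 0;
}

#if defined(LIB_WANT_OLD_API)
#  define LibSolverSolve(rtol) LibSolverSolve((rtol), 30) /* old sequence */
#endif

int main(void)
{
  LibSolverSetTol(1e-8);      /* legacy call: still builds, but warns */
#if defined(LIB_WANT_OLD_API)
  LibSolverSolve(1e-8);       /* old calling sequence, via the macro */
#else
  LibSolverSolve(1e-8, 30);   /* new calling sequence */
#endif
  return 0;
}

[In this sketch, legacy users would build with -DLIB_WANT_OLD_API for one release; dropping that definition and the deprecated wrapper at the following release completes the migration without an abrupt break.]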
