On 15.11.2013 02:14, Jan Pazdziora wrote:
> On Thu, Nov 14, 2013 at 03:40:52PM +0100, Petr Spacek wrote:

>> In reality, it means that you can re-run the OpenStack installer on the
>> same machine/set of machines (with the same configuration, of
>> course!) and it will re-do everything again. You can re-run

> The point is that it should *not* redo everything again.
Okay, maybe "re-do" is not the best description ever ;-)

>> the installer again and again without any harm!

>> This solves the case where something went wrong during the installation,
>> the installation was aborted and the machine was left in some
>> inconsistent state. Think about e.g. network failure during
>> installation, improper configuration which prevented the installation
>> from finishing (crap in DNS), some intermittent and mysterious
>> errors in the Dogtag installer and so on.

>> It does mean that you don't need to recycle the whole machine if
>> something went wrong during installation ...

> I am not sure idempotence focuses on failures, really. It says that
> if your previous installer run passed, the subsequent run should
No, it says that you will reach the same state again and again - if the conditions are the same. It does not apply only to successful installations.

> detect that individual pieces of configuration are already in place
> and it should not re-run the steps to re-create them. The implication
> being that it should only configure the pieces missing.
I agree, "re-do" wasn't the best wording.
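
To make it concrete, this is roughly the kind of step I have in mind (a minimal hypothetical Python sketch with made-up names and a stand-in path, not the actual installer code):

# hypothetical sketch of an idempotent step; names and path are made up
import os

DNS_CONFIG = "/tmp/example-dns.conf"   # stand-in path for illustration

def ensure_dns_config(content):
    """Converge the file to the desired state; do nothing if it is already there."""
    if os.path.exists(DNS_CONFIG):
        with open(DNS_CONFIG) as f:
            if f.read() == content:
                return "unchanged"     # piece already in place, skip it
    with open(DNS_CONFIG, "w") as f:   # create or repair the missing piece
        f.write(content)
    return "configured"

Running this twice with the same input ends in the same state: the second run only verifies the piece instead of re-creating it, and an aborted run can simply be repeated once the external problem is fixed.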

> But if your installation is borked beyond repair for whatever reason,
> it does not mean that the setup will be self-healing.
Sure, but you describe the case where the installation failed for whatever reason and you ran it again without any change. In that case it will fail again, idempotently :-)

The point is that you have a chance to fix the problem (reconfigure the firewall, DNS, etc.), run the installer again, and it will either finish the installation or fail later on some other problem. It means that you don't need to start from scratch. (This is exactly what the speaker described at the conference.)

Also, it allows you to reproduce the failure again and again (without any change to the existing system) and dig towards the cause. This can be a major advantage if you don't know why it failed.

> I think that our installers do a pretty decent job of cleaning after
> themselves if things fail (rolling back the changes), which I find
> much more important than leaving a mess around with the intent of fixing
> it upon the next run. Leaving things in a consistent state is of higher
> value than idempotence.

I think that these go hand in hand. IMHO it is much harder to write and maintain an uninstaller if you have just an imperative installer, because you have to read its code and manually write the uninstaller.

I think you are in a much better position with a declarative installer, because you can extract a lot of information from the 'declarations' and generate the uninstaller from them. (I don't say that it will be 100% bulletproof, but I'm pretty sure this will lower the chance that you add something to the installer but forget to add the same step to the uninstaller.)
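
Very roughly, what I mean (a toy sketch with a made-up step format, nothing like the real installer code):

# toy sketch: steps declared once as data, install and uninstall derived from it
import os

STEPS = [
    {"type": "file", "path": "/tmp/example-foo.conf", "content": "bar=1\n"},
    {"type": "service", "name": "example-daemon"},
]

def install(steps):
    for step in steps:
        if step["type"] == "file":
            with open(step["path"], "w") as f:
                f.write(step["content"])
        elif step["type"] == "service":
            print("would enable service", step["name"])   # stand-in for the real action

def uninstall(steps):
    # derived from the same declarations, applied in reverse order,
    # so a step added to STEPS is automatically covered by the uninstaller
    for step in reversed(steps):
        if step["type"] == "file" and os.path.exists(step["path"]):
            os.remove(step["path"])
        elif step["type"] == "service":
            print("would disable service", step["name"])  # stand-in for the real action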

--
Petr^2 Spacek

