Just asked about this, and I was wrong. But you can also use:
execute(parallel(foo), ...) as the parallel decorator can just be called as
a wrapping function.

-goose
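A minimal sketch of that call-time wrapping, assuming Fabric 1.x; the
uptime task and the host names are placeholders, not taken from this
thread:

    from fabric.api import execute, parallel, run, task

    def uptime():
        run("uptime")

    @task
    def deploy():
        # parallel(uptime) returns a parallel-enabled copy of the function,
        # so execute() runs it on all listed hosts concurrently and only
        # returns once every host has finished.
        execute(parallel(uptime), hosts=["web1", "web2"])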
On Thu, May 10, 2012 at 1:54 PM, Morgan Goose <[email protected]> wrote:
> Hmm, I think perhaps I may have been wrong that it takes all env
> params. Will look and see if there is a bug or feature request for
> that. But decorating with @parallel should still be honored.
>
> -goose
>
> On Thu, May 10, 2012 at 11:16 AM, anatoly techtonik <[email protected]>
> wrote:
>> Hi Morgan,
>>
>> Thanks for the advice, but keyword arguments from the environment
>> don't work with execute(). The code below gives a traceback:
>>
>> @task
>> def test():
>>     pass
>>
>> @task
>> def deploy():
>>     execute(test, parallel=True)
>>
>> Traceback (most recent call last):
>>   File "/usr/lib/python2.7/site-packages/fabric/main.py", line 712, in main
>>     *args, **kwargs
>>   File "/usr/lib/python2.7/site-packages/fabric/tasks.py", line 327, in execute
>>     results['<local-only>'] = task.run(*args, **new_kwargs)
>>   File "/usr/lib/python2.7/site-packages/fabric/tasks.py", line 112, in run
>>     return self.wrapped(*args, **kwargs)
>>   File "/prdir/fabfile.py", line 95, in deploy
>>     execute(test, parallel=True)
>>   File "/usr/lib/python2.7/site-packages/fabric/tasks.py", line 327, in execute
>>     results['<local-only>'] = task.run(*args, **new_kwargs)
>>   File "/usr/lib/python2.7/site-packages/fabric/tasks.py", line 112, in run
>>     return self.wrapped(*args, **kwargs)
>> TypeError: test() takes no arguments (1 given)
>>
>>
>> On Wed, May 2, 2012 at 2:02 AM, Morgan Goose <[email protected]> wrote:
>>> Notice the section in
>>> http://docs.fabfile.org/en/1.4.1/api/core/tasks.html#fabric.tasks.execute
>>> where it mentions that "Any other arguments or keyword arguments will
>>> be passed verbatim into task when it is called, so execute(mytask,
>>> 'arg1', kwarg1='value') will (once per host) invoke mytask('arg1',
>>> kwarg1='value')."
>>>
>>> It is referring, albeit not clearly, to the fact that any keyword that
>>> is given will be used as if it were a member of the env dict. So you
>>> can use any of these env var names to specify things, e.g.
>>> parallel=True:
>>> http://docs.fabfile.org/en/1.4.1/usage/env.html#full-list-of-env-vars
>>>
>>> Also, I fumbled the name of the decorator. It's serial:
>>> http://docs.fabfile.org/en/1.4.1/api/core/decorators.html#fabric.decorators.serial
>>>
>>> You can also specify custom args to fab tasks:
>>> http://docs.fabfile.org/en/1.4.1/usage/fab.html#per-task-arguments
>>>
>>> With that, you could make the deploy task take a role name or a list
>>> of hosts to use as the local and remote lists in your script.
>>>
>>> -goose
>>>
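A sketch of that per-task-arguments approach, assuming Fabric 1.x; the
task bodies and the targets parameter are placeholders, not taken from
this thread:

    from fabric.api import execute, task

    @task
    def test():
        pass          # placeholder

    @task
    def update():
        pass          # placeholder

    @task
    def deploy(targets=""):
        # Split a semicolon-separated host string supplied on the command
        # line, e.g.:  fab deploy:targets="a;b;c;d"
        host_list = targets.split(";") if targets else []
        execute(test, hosts=host_list)
        execute(update, hosts=host_list)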
>>> On Tue, May 1, 2012 at 12:01 PM, anatoly techtonik <[email protected]>
>>> wrote:
>>>> Hi Morgan,
>>>>
>>>> I've tried your example, which is:
>>>>
>>>> local = ['a','b']
>>>> remote = ['c','d']
>>>>
>>>> def test(): pass
>>>> def update(): pass
>>>>
>>>> @task
>>>> @sequential
>>>> def deploy():
>>>>     execute(test, hosts=local+remote)
>>>>     execute(update, hosts=local+remote)
>>>>
>>>> At first it didn't run at all with Fabric 1.4.1, being unable to call
>>>> the @sequential decorator:
>>>>
>>>> Traceback (most recent call last):
>>>> ...
>>>>   File "/checkout83/fabfile.py", line 20, in <module>
>>>>     @sequential
>>>> NameError: name 'sequential' is not defined
>>>>
>>>> I commented out the @sequential, after which `fab deploy` ran
>>>> sequentially:
>>>> local test
>>>> remote test
>>>> local update
>>>> remote update
>>>>
>>>> That's much better, but it is still impossible to specify hosts from
>>>> the command line: `fab -H local deploy` still runs an additional
>>>> command on the remote, and `fab -H local,remote deploy` runs
>>>> everything twice on both hosts.
>>>>
>>>> I couldn't find parallel among the params for execute -
>>>> http://docs.fabfile.org/en/1.4.1/api/core/tasks.html#fabric.tasks.execute
>>>> - so I've just decorated the functions:
>>>>
>>>> local = ['a','b']
>>>> remote = ['c','d']
>>>>
>>>> @task
>>>> @parallel
>>>> def test(): pass
>>>>
>>>> @task
>>>> @parallel
>>>> def update(): pass
>>>>
>>>> @task
>>>> def deploy():
>>>>     execute(test, hosts=local+remote)
>>>>     execute(update, hosts=local+remote)
>>>>
>>>> I must admit that, although not ideal, this works. I just need to
>>>> make sure that deploy is always called with an empty host list, and I
>>>> need to find a way to specify hosts from the command line for the
>>>> subtasks.
>>>> --
>>>> anatoly t.
>>>>
>>>>
>>>> On Fri, Apr 27, 2012 at 11:48 PM, Morgan Goose <[email protected]>
>>>> wrote:
>>>>> The example I gave you and how to run it did all of those things,
>>>>> sans the parallel. Execute has parallel as a param. Use that in your
>>>>> master task's executes, and you'll have everything. If that still
>>>>> isn't what you're looking for, we're going to need more information,
>>>>> e.g. like you did before, where you said how you're seeing it run and
>>>>> how you'd instead like it to run, complete with how you ran it with
>>>>> fab and how the fabfile looks.
>>>>>
>>>>> -goose
>>>>>
>>>>> On Thu, Apr 26, 2012 at 10:32 PM, anatoly techtonik <[email protected]>
>>>>> wrote:
>>>>>> Thanks for the explanation. I read the tutorial. Still, I can't see
>>>>>> how to construct a system that satisfies the following usability
>>>>>> requirements:
>>>>>>
>>>>>> 1. Define (or override) hosts from the command line
>>>>>> 2. Execute each scheduled task in parallel
>>>>>> 3. Wait until the previous task completes successfully on all
>>>>>>    servers before moving to the next one
>>>>>> 4. Ability to run each task separately, or use a master task as a
>>>>>>    helper command
>>>>>>
>>>>>> Command-line hosts (1) are needed to test new nodes and to control
>>>>>> deployment from a 3rd-party application.
>>>>>> Parallel execution (2) and (3) is critical to minimize server
>>>>>> downtime.
>>>>>> A master task (4) is also highly desired, because the full
>>>>>> deployment process contains a lot of steps.
>>>>>>
>>>>>> It seems that this could be possible with a @mastertask decorator
>>>>>> that would make a task run without any servers at all. With it, the
>>>>>> following could work as expected.
>>>>>>
>>>>>> @task
>>>>>> @parallel
>>>>>> def test(): pass
>>>>>>
>>>>>> @task
>>>>>> @parallel
>>>>>> def update(): pass
>>>>>>
>>>>>> @mastertask
>>>>>> def deploy():
>>>>>>     execute(test)
>>>>>>     execute(update)
>>>>>>
>>>>>> --
>>>>>> anatoly t.
>>>>>>
>>>>>>
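There is no @mastertask decorator in Fabric, but a rough approximation of
the behavior described above, using only decorators that do exist in
Fabric 1.x, might look like the sketch below (untested; @runs_once keeps
deploy() from repeating per host, and the run() commands and host names
are placeholders):

    from fabric.api import execute, parallel, run, runs_once, task

    @task
    @parallel
    def test():
        run("make test")        # placeholder command

    @task
    @parallel
    def update():
        run("make update")      # placeholder command

    @task
    @runs_once
    def deploy():
        execute(test)       # fans out over the -H host list and waits
        execute(update)     # starts only after test finished everywhere

    # Hosts chosen at run time:
    #   fab -H local,remote deploy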
>>>>>> On Fri, Apr 27, 2012 at 1:28 AM, Morgan Goose <[email protected]>
>>>>>> wrote:
>>>>>>> I just read that last line, where you said you wanted to run them
>>>>>>> with hosts defined at runtime. You can either use functions to
>>>>>>> generate the hosts lists, or, with the example I gave, change it up
>>>>>>> to get rid of the deploy task and then run the fabfile like:
>>>>>>> $ fab -H local,remote test update
>>>>>>>
>>>>>>> As that will be sequential and honor the host list as well. It will
>>>>>>> also run the test on all hosts before moving to the next task
>>>>>>> listed, update.
>>>>>>>
>>>>>>> -goose
>>>>>>>
>>>>>>> On Thu, Apr 26, 2012 at 3:25 PM, Morgan Goose <[email protected]>
>>>>>>> wrote:
>>>>>>>> First, have you read the tutorial? This is the section that will
>>>>>>>> get you in on host list defining, and point you to the more
>>>>>>>> detailed information that I link to below it:
>>>>>>>> http://docs.fabfile.org/en/1.4.1/tutorial.html#defining-connections-beforehand
>>>>>>>> http://docs.fabfile.org/en/1.4.1/usage/execution.html#host-lists
>>>>>>>>
>>>>>>>> As to what you want to do: each task will loop over its list of
>>>>>>>> hosts to run on. So, top down, it's TaskA over all of HostListA's
>>>>>>>> hosts; then, if there are no errors, it moves to the other task
>>>>>>>> with the same or a unique host list (defined either globally or
>>>>>>>> per task).
>>>>>>>>
>>>>>>>> For what you're wanting to do, you would construct the fabfile
>>>>>>>> like this:
>>>>>>>>
>>>>>>>> local = ['a','b']
>>>>>>>> remote = ['c','d']
>>>>>>>>
>>>>>>>> def test(): pass
>>>>>>>> def update(): pass
>>>>>>>>
>>>>>>>> @task
>>>>>>>> @sequential
>>>>>>>> def deploy():
>>>>>>>>     execute(test, hosts=local+remote)
>>>>>>>>     execute(update, hosts=local+remote)
>>>>>>>>
>>>>>>>>
>>>>>>>> Then all hosts will have the test task run on them before moving
>>>>>>>> on to looping over the two host lists and running the update. You
>>>>>>>> then call it simply with:
>>>>>>>>
>>>>>>>> $ fab deploy
>>>>>>>>
>>>>>>>> And you never run the test and update tasks themselves. Nor do you
>>>>>>>> define anything to be parallel, as parallel runs on the tasks would
>>>>>>>> test on both hosts in parallel, and then deploy to both hosts in
>>>>>>>> parallel. This way, running sequentially and explicitly defined to
>>>>>>>> do so, it will fail fast on the test hosts if one dies, because
>>>>>>>> the host list's order is honored.
>>>>>>>>
>>>>>>>> -goose
>>>>>>>>
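As noted further up in this thread, the decorator in that example is
actually named serial, not sequential. A corrected sketch (Fabric 1.3+
assumed; the host names are placeholders, and the list variables are
renamed so they don't shadow fabric.api.local):

    from fabric.api import execute, serial, task

    local_hosts = ['a', 'b']
    remote_hosts = ['c', 'd']

    def test(): pass        # placeholder
    def update(): pass      # placeholder

    @task
    @serial
    def deploy():
        # Each execute() finishes on every host before the next one starts.
        execute(test, hosts=local_hosts + remote_hosts)
        execute(update, hosts=local_hosts + remote_hosts)

    # Run with:
    #   $ fab deploy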
>>>>>>>> On Wed, Apr 25, 2012 at 1:13 PM, anatoly techtonik
>>>>>>>> <[email protected]> wrote:
>>>>>>>>> On Sat, Apr 21, 2012 at 7:51 PM, Jeff Forcier <[email protected]>
>>>>>>>>> wrote:
>>>>>>>>>> On Fri, Apr 20, 2012 at 5:31 AM, anatoly techtonik
>>>>>>>>>> <[email protected]> wrote:
>>>>>>>>>>
>>>>>>>>>>> Is it possible in fabric to wait until a subtask completes on
>>>>>>>>>>> all servers successfully before moving to the next step?
>>>>>>>>>>
>>>>>>>>>> You need to use execute() to treat subroutines as if they were
>>>>>>>>>> full-fledged tasks. execute() is the machinery that says "take
>>>>>>>>>> this function and run it once per host in this list of hosts."
>>>>>>>>>> Right now that machinery is just being applied implicitly to
>>>>>>>>>> your deploy() task, and you're probably using env.hosts to set
>>>>>>>>>> your host list.
>>>>>>>>>>
>>>>>>>>>> Remove env.hosts and maybe make that host list a role in
>>>>>>>>>> env.roledefs. Then you can do this:
>>>>>>>>>>
>>>>>>>>>> env.roledefs = {'myrole': ['a', 'b', 'c']}
>>>>>>>>>>
>>>>>>>>>> def test(): ...
>>>>>>>>>> def update(): ...
>>>>>>>>>>
>>>>>>>>>> def deploy():
>>>>>>>>>>     execute(test, role='myrole')
>>>>>>>>>>     execute(update, role='myrole')
>>>>>>>>>>
>>>>>>>>>> That should have the effect you want: "fab deploy" => first
>>>>>>>>>> test() runs once per host, and when it's all done, update() will
>>>>>>>>>> run once per host. deploy() itself will end up running only one
>>>>>>>>>> time total -- it's just a "meta" task now.
>>>>>>>>>
>>>>>>>>> It took time to realize that execute() iterates over the global
>>>>>>>>> list of hosts. I expected the following two to be equivalent, but
>>>>>>>>> they were not:
>>>>>>>>>
>>>>>>>>> 1. fab -H local,remote test update
>>>>>>>>> 2. fab -H local,remote deploy
>>>>>>>>>
>>>>>>>>> I used the script without roles:
>>>>>>>>>
>>>>>>>>> def test(): ...
>>>>>>>>> def update(): ...
>>>>>>>>> def deploy():
>>>>>>>>>     execute(test)
>>>>>>>>>     execute(update)
>>>>>>>>>
>>>>>>>>> The 1st execution variant is fully synchronous (i.e. the next
>>>>>>>>> task doesn't start until the previous one finishes) and gave the
>>>>>>>>> sequence:
>>>>>>>>>
>>>>>>>>> local test
>>>>>>>>> local update
>>>>>>>>> remote test
>>>>>>>>> remote update
>>>>>>>>>
>>>>>>>>> but the 2nd variant with subtasks was confusing (I indented to
>>>>>>>>> see what's going on):
>>>>>>>>>
>>>>>>>>> local deploy
>>>>>>>>>   local test
>>>>>>>>>   remote test
>>>>>>>>>   local update
>>>>>>>>>   remote update
>>>>>>>>> remote deploy
>>>>>>>>>   local test
>>>>>>>>>   remote test
>>>>>>>>>   local update
>>>>>>>>>   remote update
>>>>>>>>>
>>>>>>>>> I found fabric pretty counter-intuitive in this case. I tried to
>>>>>>>>> fix that without roles by explicitly passing the current host:
>>>>>>>>>
>>>>>>>>> def test(): ...
>>>>>>>>> def update(): ...
>>>>>>>>> def deploy():
>>>>>>>>>     execute(test, host=env.host_string)
>>>>>>>>>     execute(update, host=env.host_string)
>>>>>>>>>
>>>>>>>>> This gives:
>>>>>>>>> local deploy
>>>>>>>>> local test
>>>>>>>>> local update
>>>>>>>>> remote deploy
>>>>>>>>> remote test
>>>>>>>>> remote update
>>>>>>>>>
>>>>>>>>> Still not the desired behavior. The desired is:
>>>>>>>>> deploy
>>>>>>>>> local test
>>>>>>>>> remote test
>>>>>>>>> wait
>>>>>>>>> local update
>>>>>>>>> remote update
>>>>>>>>>
>>>>>>>>> I've tried using the @parallel decorator for the deploy task and
>>>>>>>>> it seemed to work fine at first:
>>>>>>>>> local deploy
>>>>>>>>> remote deploy
>>>>>>>>> remote test
>>>>>>>>> local test
>>>>>>>>> local update
>>>>>>>>> remote update
>>>>>>>>>
>>>>>>>>> But the test step did not synchronize - local update executed
>>>>>>>>> while remote test was still running. It looks like roles are the
>>>>>>>>> only option, but I'd like to avoid hardcoding server names at all
>>>>>>>>> costs. Is that possible?
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> anatoly t.
>>>>>>>>>
_______________________________________________
Fab-user mailing list
[email protected]
https://lists.nongnu.org/mailman/listinfo/fab-user
