Issue #20013 has been updated by Hunter Haugen.

> It would be much better if the code didn’t automatically delete those 
> repositories when it was done

It only removes them if the tests pass.

For background, there are 4 rake tasks (namespacing should be revised imho):

- `spec` (runs `spec_prep` & `spec_standalone`, and if successful `spec_clean`)
- `spec_prep` (downloads fixtures)
- `spec_standalone` (runs spec tests)
- `spec_clean` (removes fixtures)

There are two use cases in which fixtures are used:

1. Automated test environments that are rebuilt on each new test run (like 
Travis or Jenkins). These are not impacted by the cleaning/non-cleaning of 
fixtures.
2. Manual test runs by a human. These are impacted by re-clones and could 
benefit from not automatically cleaning fixtures.

To address the second use case, I would propose that the `spec` task runs 
`spec_prep` and `spec_standalone`; the task `spec_clean` should only be called 
manually.
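
A minimal sketch of that wiring (the task bodies here are placeholders, not 
the actual puppetlabs_spec_helper implementations):

```ruby
require 'rake'
include Rake::DSL # makes `task` available outside a Rakefile

# Placeholder bodies standing in for the real puppetlabs_spec_helper tasks.
task :spec_prep do
  puts 'downloading fixtures'
end

task :spec_standalone do
  puts 'running spec tests'
end

task :spec_clean do
  puts 'removing fixtures'
end

# Proposed: `spec` no longer invokes spec_clean, so fixtures survive between
# manual runs; clean up explicitly with `rake spec_clean` when you want to.
task spec: %i[spec_prep spec_standalone]
```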

Additionally, the `spec_prep` task should be updated to only run a `git fetch` 
if the repo already exists and the last fetch was more than 24 hours ago (by 
checking the mtime of `${GIT_DIR}/FETCH_HEAD` or something). A fetch is a 
low-impact command compared to a clone, though, so this may be overkill.

> to at least use a shallow clone one revision deep to ensure that the absolute 
> minimum of content is transferred on each run.

I often mess with fixtures by checking out older tags when doing spec runs (to 
identify breaking behaviour), so I would recommend full clones, especially if 
we make re-fetching a low-impact event.
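
For what it's worth, the problem with shallow clones is easy to demonstrate 
locally; this throwaway script (temporary paths only, nothing here is real 
puppetlabs_spec_helper code) shows an older tag being unreachable in a 
`--depth 1` clone until the clone is unshallowed:

```ruby
require 'tmpdir'

# Returns [checkout_ok_before_unshallow, checkout_ok_after_unshallow].
def shallow_tag_demo
  Dir.mktmpdir do |dir|
    origin = File.join(dir, 'origin')
    # Build a tiny upstream repo with two tagged releases.
    system('git', 'init', '-q', origin)
    Dir.chdir(origin) do
      2.times do |i|
        File.write('init.pp', "v#{i + 1}")
        system("git add . && git -c user.email=t@t -c user.name=t " \
               "commit -qm v#{i + 1} && git tag #{i + 1}.0.0")
      end
    end
    clone = File.join(dir, 'fixture')
    # Shallow clone: only the newest commit (and its tag) is transferred.
    system('git', 'clone', '-q', '--depth', '1', "file://#{origin}", clone)
    Dir.chdir(clone) do
      # The older tag was not transferred, so this checkout fails...
      before = system('git', 'checkout', '-q', '1.0.0', err: File::NULL)
      system('git', 'fetch', '-q', '--unshallow', '--tags')
      # ...and succeeds only after pulling down the full history anyway.
      after = system('git', 'checkout', '-q', '1.0.0', err: File::NULL)
      [before, after]
    end
  end
end
```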

----------------------------------------
Feature #20013: puppetlabs_spec_helper wastes substantial time and network 
bandwidth recloning repos every run
https://projects.puppetlabs.com/issues/20013#change-94749

* Author: Daniel Pittman
* Status: Needs Decision
* Priority: Normal
* Assignee: Hunter Haugen
* Category: 
* Target version: 
* Affected Puppet version: 
* Keywords: 
* Branch: 
----------------------------------------
When I use `rake spec` in a module using `puppetlabs_spec_helper` I am greeted, 
on every single run, with this:

<pre>
⚡ rake spec
Cloning into 'spec/fixtures/modules/apt'...
remote: Counting objects: 949, done.
remote: Compressing objects: 100% (440/440), done.
remote: Total 949 (delta 555), reused 861 (delta 492)
Receiving objects: 100% (949/949), 138.76 KiB, done.
Resolving deltas: 100% (555/555), done.
HEAD is now at e01bbb6 Merge pull request #112 from richardc/patch-1
Cloning into 'spec/fixtures/modules/mongodb'...
remote: Counting objects: 117, done.
remote: Compressing objects: 100% (72/72), done.
remote: Total 117 (delta 35), reused 103 (delta 28)
</pre>

For my project, with a half dozen transitive dependencies, that runs to around 
1MB of data fetched *every run*.

Aside from the network cost, which is substantial and notable if you are, oh, 
developing somewhere that you pay by the MB for data, this also means waiting 
longer for the download of content than for the tests to run.  That makes the 
cycle time pretty damn terrible.

It would be much better if the code didn't automatically delete those 
repositories when it was done, or if that isn't practical, to at least use a 
shallow clone one revision deep to ensure that the absolute minimum of content 
is transferred on each run.

