Is it possible to pull all the test-only pieces of code out into a 
"test_only" file that is ignored by git? Include the test code in such a 
way that if the "test_only" file is not present, the production code 
still holds together.
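
A minimal sketch of that idea, reusing the names from your snippet (the 
test_only module name comes from the suggestion above; its run() helper 
is just a placeholder I picked):

============================
  # your_module.py -- committed as usual
  def real_code(filename):
      return do_stuff(filename)    # do_stuff() as in your example

  if __name__ == "__main__":
      try:
          import test_only         # test_only.py is listed in .gitignore
      except ImportError:
          pass                     # no test file present: nothing to run
      else:
          test_only.run(real_code)

  # test_only.py -- ignored by git, edit it as often as you like
  def run(real_code):
      f = real_code(r"/tmp/thing")
      print f["some arbitrary test case"]
============================

With test_only.py listed in .gitignore, a plain "git add" never sees the 
test code, and on a machine where the file doesn't exist the import 
fails quietly and the module behaves as pure production code.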

On Friday, July 6, 2012 11:42:58 AM UTC-7, Tim Chase wrote:
>
> I've got a number of Python modules where the debugging code changes 
> on a regular basis, but is of no relevance to the history of the 
> file.  For a snippet, it might be something like 
>
> ============================ 
>   def real_code(x): 
>     return do_stuff(x) 
>
>   # version everything above the next line 
>   if __name__ == "__main__":   # everything from here on 
>     filename = r"/tmp/thing"   # should be ignored 
>     f = real_code(filename)    # every time 
>     print f["some arbitrary test case"] 
> ============================ 
>
> The files are data-source files from providers, and the value I want 
> to check (the subscripting in this case) is usually some case with 
> which we're having a particular problem.  Because it's just there to 
> test new cases in the providers' data, and the tests aren't 
> repeatable against every source data file (we don't keep huge 
> histories of the gigs of files/data to allow retesting later), 
> there's no point keeping it in the file's history. 
>
> Currently, when I check it in, I do an "add -p" and just skip the 
> hunks after the "if __name__ ==".  I was wondering if there was a 
> better (i.e., automatic) way to delimit that a certain segment of 
> the file should never be considered "changed" and that git should 
> ignore it when adding/committing the file. 
>
> Thanks, 
>
> -tkc 
>
