Stephan Beyer <s-be...@gmx.net> writes:

>> +int bisect_clean_state(void)
>> +{
>> +    int result = 0;
>> +
>> +    /* There may be some refs packed during bisection */
>> +    struct string_list refs_for_removal = STRING_LIST_INIT_NODUP;
>> +    for_each_ref_in("refs/bisect", mark_for_removal, (void *) &refs_for_removal);
>> +    string_list_append(&refs_for_removal, xstrdup("BISECT_HEAD"));
>> +    result = delete_refs(&refs_for_removal, REF_NODEREF);
>> +    refs_for_removal.strdup_strings = 1;
>> +    string_list_clear(&refs_for_removal, 0);
>
> Is there an advantage to populating a list (with duplicated strings),
> handing it to delete_refs(), and then clearing the list (and strings),
> instead of just doing a single delete_ref() (or whatever the singular
> function is called) in the callback?

Depending on the ref backend, removing multiple refs in one call may
be a lot more efficient than issuing a separate removal for each ref
in the same set, and I think the comment upfront hints that the code
was written exactly with that in mind.  Removing N refs from a packed
refs file one at a time would involve a loop that runs N times, each
iteration loading the file, locating one entry among possibly hundreds
of refs, and then rewriting the file.
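
To make the cost concrete, here is a standalone sketch (illustration
only, not Git code; the file layout and the remove_lines() helper are
made up) of a packed-refs-like store that keeps one refname per line.
The one-at-a-time approach pays a full load-filter-rewrite cycle per
ref, while the batched call pays it exactly once:

#include <stdio.h>
#include <string.h>

/*
 * Rewrite `path`, dropping every line that starts with one of the
 * `n` prefixes in `doomed`; one full read and one full rewrite.
 */
static int remove_lines(const char *path, const char **doomed, int n)
{
	char buf[1024];
	FILE *in = fopen(path, "r");
	FILE *out = in ? fopen("packed-refs.lock", "w") : NULL;
	int i, skip;

	if (!in || !out) {
		if (in)
			fclose(in);
		return -1;
	}
	while (fgets(buf, sizeof(buf), in)) {
		skip = 0;
		for (i = 0; i < n; i++)
			if (!strncmp(buf, doomed[i], strlen(doomed[i])))
				skip = 1;
		if (!skip)
			fputs(buf, out);
	}
	fclose(in);
	fclose(out);
	return rename("packed-refs.lock", path);
}

int main(void)
{
	const char *doomed[] = { "refs/bisect/bad", "refs/bisect/good-1" };

	/*
	 * One at a time, as a delete_ref()-in-the-callback scheme
	 * would end up doing: N full cycles.
	 *
	 *	for (int i = 0; i < 2; i++)
	 *		remove_lines("packed-refs", &doomed[i], 1);
	 */

	/* Batched: a single cycle covers all N refs. */
	return remove_lines("packed-refs", doomed, 2) ? 1 : 0;
}

Handing the whole string_list to delete_refs() leaves the backend
free to take the single-rewrite path.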

Besides, it is bad taste in general to delete each individual item
from underneath an iterator while it is walking them, isn't it?
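
For instance, with a plain array (a generic illustration, nothing to
do with the ref iterator specifically), deleting the current element
mid-iteration silently skips its successor:

#include <stdio.h>

int main(void)
{
	int v[] = { 1, 2, 2, 3 };
	int n = 4, i, j;

	for (i = 0; i < n; i++) {
		if (v[i] == 2) {
			/* "delete" v[i] by shifting the tail left */
			for (j = i; j < n - 1; j++)
				v[j] = v[j + 1];
			n--;
			/* the element that slid into v[i] is never examined */
		}
	}
	for (i = 0; i < n; i++)
		printf("%d ", v[i]);	/* prints "1 2 3 ", not "1 3 " */
	printf("\n");
	return 0;
}

A ref iterator can fail in its own ways when the store is mutated
under it, but the collect-first-then-delete shape of the patch
sidesteps that whole class of problems.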
