Thanks for the reply Bergi.

Correct me if I'm wrong, but writing something like:
const b = Array.from(a.values().filter(…).map(…))

still requires 2 iterations over the array.
You're theoretically right that the running time complexity is still linear.
However, now that async functions can be used within each iteration, in
real-life production code there's a significant difference between running
1M iterations and ~2M (an iteration can involve a round trip to a database
or a 3rd-party API).
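
To make the scenario concrete, here's a rough sketch of the kind of
single-pass loop I end up writing today; lookup() is just a placeholder
for a database or 3rd-party API call, not a real function:

    // Chained form: filter walks all N items, then map walks the
    // survivors again, so in the worst case ~2N iterations.
    const b = a.filter(x => x.active).map(x => x.value);

    // Single-pass form: one traversal, with the async work done
    // per kept item (the placeholder calls here run sequentially).
    async function enrich(a) {
      const out = [];
      for (const item of a) {
        if (!item.active) continue;      // "filter" step
        out.push(await lookup(item.id)); // "map" step, placeholder async call
      }
      return out;
    }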

I'd be happy to hear more from Gus about how his proposal would handle the
scenario I've described.

Thanks,
Roma

On Sat, Jun 22, 2019 at 12:56 AM Bergi <[email protected]> wrote:

> Hi!
>
> > However, when I want to write performance sensitive code, chaining these
> > functions is not a good approach.
> > const b = a.filter().map()
> >
> > will require 2 traversals over the whole array, up to 2*N iterations (if
> > the filter passes all items).
>
> Actually, the number of passes hardly matters. It's still linear
> complexity. What makes this slow is the allocation of the unnecessary
> temporary array.
>
> > I suggest adding a capability to streamline items to these functions.
>
> We don't need streams, JavaScript already has iterators. What we do need
> are proper helper functions for those - see the existing proposal at
> <https://github.com/tc39/proposal-iterator-helpers>. You then can write
>
>      const b = Array.from(a.values().filter(…).map(…))
>
> or
>
>      for (const x of a.values().filter(…).map(…))
>          console.log(x);
>
> kind regards,
>   Bergi
>
