If, on average, your producers don't outrun your consumers, and only do so in
limited bursts, you could try storing some of the buffered data on disk until
the burst is over?
Hard to imagine it would be reliable, though.
--
>> Read the docs: http://akka.io/docs/
>> Check the FAQ:
>> http://doc.akka.io/docs/akka/current/additional/faq.html
>> Search the archives: https://groups.google.com/group/akka-user
Sounds like your producers are outrunning your consumers?
Have you looked at akka streams?
--
Just had a thought, I guess it was done this way because of the focus on
immutability? Iterator is inherently mutable, and can only be traversed
once...
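That single-traversal restriction is easy to see with plain JDK types; this sketch (no Akka involved) contrasts an Iterable, which can hand out a fresh Iterator per traversal, with a lone Iterator, which is exhausted after one pass:

```java
import java.util.Iterator;
import java.util.List;

public class IteratorOnce {
    public static void main(String[] args) {
        List<Integer> xs = List.of(1, 2, 3);

        // An Iterable can produce a fresh Iterator for every traversal,
        // so iterating twice sees all elements both times.
        int firstPass = 0, secondPass = 0;
        for (int x : xs) firstPass += x;
        for (int x : xs) secondPass += x;
        System.out.println(firstPass + " " + secondPass); // 6 6

        // A single Iterator, by contrast, is stateful: once drained,
        // it has nothing left to replay.
        Iterator<Integer> it = xs.iterator();
        while (it.hasNext()) it.next();
        System.out.println(it.hasNext()); // false
    }
}
```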
On Wednesday, September 27, 2017 at 12:36:17 AM UTC-7, Bwmat wrote:
>
> Another thought - returning a 'true' Iterable
Another thought - returning a 'true' Iterable (meaning that it actually
supports multiple iterations) involves, in the general case, materializing
the sequence (which may be arbitrarily long...) into memory, somewhat
subverting one of the goals of streams, since it prevents laziness.
On Tuesday
>
> It means though that if you start 2 streams using a part of that mapConcat,
> they would end up accessing the same iterator, potentially concurrently.
The functor uses the elements from the flow to create the iterators, so
there's no reason it has to return the same objects in multiple
materializ
The functor already can return a separate Iterable per-invocation, unless I
misunderstood your comment?
To clarify, I meant, does the implementation of
the
http://doc.akka.io/japi/akka/current/akka/stream/javadsl/Flow.html#mapConcat-akka.japi.function.Function-
stage ever need to obtain mult
I'm talking about
http://doc.akka.io/japi/akka/current/akka/stream/javadsl/Flow.html#mapConcat-akka.japi.function.Function-
I had a flow that was emitting Iterator, and I wanted a step to 'chunk'
the values to a given chunk size N (in other words, every iterator returned
by the 'chunker' would
completed with upstream
failure.
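The 'chunker' step described above (the message is truncated in this archive) can be sketched with plain JDK collections; note that in Akka Streams the built-in grouped(n) stage already groups a stream's elements into lists of size n. The chunk helper below is hypothetical, named here only for illustration:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class Chunker {
    // Hypothetical helper: drains an Iterator into consecutive lists of at
    // most n elements (the last chunk may be shorter).
    static <T> List<List<T>> chunk(Iterator<T> it, int n) {
        List<List<T>> chunks = new ArrayList<>();
        while (it.hasNext()) {
            List<T> current = new ArrayList<>(n);
            while (it.hasNext() && current.size() < n) {
                current.add(it.next());
            }
            chunks.add(current);
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> xs = List.of(1, 2, 3, 4, 5);
        System.out.println(chunk(xs.iterator(), 2)); // [[1, 2], [3, 4], [5]]
    }
}
```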
On Friday, September 22, 2017 at 10:54:41 AM UTC-7, Bwmat wrote:
>
> >causing the 'error checker' stage to complete 'successfully'
>
> Sorry, to be clear, I don't try to swallow the error; is it possible to
> detect
>causing the 'error checker' stage to complete 'successfully'
Sorry, to be clear, I don't try to swallow the error; is it possible to
detect that an outlet was closed because of an error?
On Friday, September 22, 2017 at 10:50:27 AM UTC-7, Bwmat wrote:
>
>
ponseFor' is a simple method that doesn't do anything
special (w.r.t. akka streams; the materializer isn't actually used anymore).
On Thursday, September 21, 2017 at 6:05:27 PM UTC-7, Bwmat wrote:
>
> I'm having trouble extracting a minimized reproducer. When I try 'sim
what operations you’re doing… Please share a
> minimised reproducer so we could advise.
> Error handling works as documented and is “forward propagated” through a
> stream, and operators are free to handle it in any way they want.
>
> —
> Konrad `kto.so` Malawski
> Akka <http:/
g custom GraphStages, note that by default they will not use the
> SupervisionStrategy so recover may be the better approach.
>
> Julian
>
> On Wednesday, September 20, 2017 at 3:54:30 AM UTC+1, Bwmat wrote:
>>
>> For some more context, the flow throwing the exception was
will try to use the
> loggers config, but there is none, so empty.
>
> Works: http://github.com/ktoso/akka-logging-example
>
> —
> Konrad `kto.so` Malawski
> Akka <http://akka.io> @ Lightbend <http://lightbend.com>
>
> On 20 September 2017 at 08:
logging.html ?
>
> —
> Konrad `kto.so` Malawski
> Akka <http://akka.io> @ Lightbend <http://lightbend.com>
>
> On 20 September 2017 at 08:30:53, Bwmat (bwmat.r...@gmail.com
> ) wrote:
>
> I have the following code
>
> public static void main(String[]
sed to the builder method in that question.
On Tuesday, September 19, 2017 at 7:49:53 PM UTC-7, Bwmat wrote:
>
> I'm doing a PoC in akka streams, and I just had to track down a case where
> the mapping functor in an instance of Flow.map() threw an exception. This
> seems to simply
I'm doing a PoC in akka streams, and I just had to track down a case where
the mapping functor in an instance of Flow.map() threw an exception. This
seems to simply close the stream, but _not_ report it anywhere. Other
stages were getting completed implicitly because of it, but the error
didn't
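The behaviour described here — a failure terminating things quietly unless something observes it — has a plain-JDK analogue in CompletableFuture, sketched below without any Akka dependency (with Akka Streams one would typically attach a handler to the materialized CompletionStage, or recover/log inside the stream):

```java
import java.util.concurrent.CompletableFuture;

public class SilentFailure {
    public static void main(String[] args) throws Exception {
        // The async task fails, but nothing is reported anywhere by default:
        CompletableFuture<Integer> f = CompletableFuture.supplyAsync(() -> {
            throw new IllegalStateException("boom");
        });

        // The error only surfaces once someone explicitly observes it.
        // (supplyAsync wraps the thrown exception in a CompletionException.)
        f.handle((value, err) -> {
            System.out.println("observed: " + err.getCause().getMessage());
            return null;
        }).get();
    }
}
```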
I have the following code
public static void main(String[] args) throws InterruptedException
{
    final Config configuration = ConfigFactory.load(ConfigFactory.parseString(
        "akka {" +
        "  loggers = [\"akka.event.slf4j.Slf4jLogger\"]\r\n" +
        "
ding into requestContextBuilderStage are pushed,
but my breakpoint in RequestContextFactory.apply() never gets hit, and when
I do an break-all in the debugger, it seems like none of my code is on any
of the stacks. Any ideas?
On Monday, September 18, 2017 at 7:39:50 PM UTC-7, Bwmat wrote:
>
>
I'm using akka-stream_2.11-2.5.4.jar, and I'm getting an exception with the
following stack trace when building my graph:
Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: -1
at
akka.stream.impl.AtomicTraversalBuilder.assign(TraversalBuilder.scala:550)
at
akka.stream.
The type is just Object, and it's not documented in the linked javadoc.
Don't have any working code yet, so can't easily check at runtime myself.
--
llSwitch.shutdown();
}
}
so that I can get a Source/Sink pair to use in building the graph. Is this
a good idea? Will I 'leak' these channels if I don't explicitly close()
them?
I'll only ever need to use .out() once for my use-case.
On Wednesday, September 6, 2017 at 6:29:0
I'm kind of confused how to use `MergeHub`.
I'm designing a flow graph that uses `Flow.mapAsync()`, where the given
function creates another flow graph, and then runs it with `Sink.ignore()`,
and returns that `CompletionStage` as the value for `Flow.mapAsync()` to
wait for. The nested flow will
I have the following line of code:
Source> toRetryHubSource = MergeHub.of(ResponseContext.class);
I get a warning, because ResponseContext is a generic class, but I can't
seem to specify the generic type arguments without errors.
I can get it compiling (with warnings about unchecked casts), bu
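The warning comes from Java's class literals rather than from MergeHub specifically: a literal like ResponseContext.class has the raw type Class<ResponseContext>, and there is no literal of type Class<ResponseContext<T>>. A minimal plain-JDK sketch of the usual workaround (a localized unchecked cast); the of helper here is a hypothetical stand-in for an API like MergeHub.of:

```java
import java.util.ArrayList;
import java.util.List;

public class GenericToken {
    // Hypothetical stand-in for an API that takes a Class<T> token.
    static <T> List<T> of(Class<T> clazz) {
        return new ArrayList<>();
    }

    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        // List.class has type Class<List>; Class<List<String>> has no literal.
        // The double cast is unchecked but safe as long as nothing relies on
        // the erased type parameter at runtime.
        Class<List<String>> token = (Class<List<String>>) (Class<?>) List.class;
        List<List<String>> sources = of(token);
        System.out.println(sources.isEmpty()); // true
    }
}
```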
How would I get one part of the graph access to a materialized value from another
part? I don't want to watch it externally.
--
Whoops,
>I'm not sure how to implement the 'Data Extractor' stage.
should have been
>I'm not sure how to implement the '*Response Handler*' stage.
On Thursday, August 31, 2017 at 1:43:45 PM UTC-7, Bwmat wrote:
>
> I'm just getting started w
I'm designing a graph with a cycle, and to allow termination, I want one of
the graph stages to be able to detect the completion of another (far
removed) graph stage, so it can itself complete.
My first idea was to use an AtomicBoolean, but then I thought it would be
nice to avoid manual synchr
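A plain-JDK sketch of the completion-signal idea (no Akka types here; in Akka Streams, watchTermination materializes a CompletionStage that completes when the wrapped stage does): instead of sharing an AtomicBoolean, one part completes a future and the far-removed part subscribes to it.

```java
import java.util.concurrent.CompletableFuture;

public class CompletionSignal {
    public static void main(String[] args) {
        // Hypothetical stand-in for a materialized completion signal.
        CompletableFuture<Void> upstreamDone = new CompletableFuture<>();

        // The far-removed observer subscribes rather than polling a shared flag.
        upstreamDone.thenRun(() -> System.out.println("observer: upstream finished"));

        // Elsewhere in the graph, the watched stage completes.
        upstreamDone.complete(null);
        System.out.println(upstreamDone.isDone()); // true
    }
}
```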
I'm talking about DI in the constructor parameter sense here.
I've gone through some of the documentation, but one thing that's not clear to
me is the expected model of how to efficiently execute many instances of
'similar' graphs.
To elaborate on what I mean, I'm working on a PoC where I'm cre
Also, I'm curious how backpressure works in the created stream, especially the
child Iterator; does it only buffer one element?
--
Sorry, that second "Source>" was meant to be "Source"
--
I'm trying to create a flow that I can consume via something like an Iterator.
I'm implementing a library that exposes an iterator-like interface, so that
would be the simplest thing for me to consume.
The graph I've designed so far is essentially a Source>. One thing
I see so far is to flatten it to
I'm just getting started with Akka & Akka Streams, and trying to create a
little PoC for my project.
I want to create a flow graph like
https://drive.google.com/file/d/0B8Cu6-NkpXpCX1BuUEVtY3VPVzg/view?usp=sharing
I'm not sure how to implement the 'Data Extractor' stage. My problem is
that I w