I suddenly realized that in the near future I'll also need another 
"substream" to fetch some data from other parts of the system, which is 
needed to generate MetaInfo, so ConveyorStage should look something like 
this: 
<https://lh6.googleusercontent.com/--mZBz0S28u8/VQW5oBwINfI/AAAAAAAAAF8/nJWbWBthKsg/s1600/ConveyorStage.png>
So conceptually ConveyorStage has 2 inputs (*ByteString* and 
*System.Response*) and 3 outputs (*System.Request*, *ByteString* and 
*MetaInfo*). With my current approach ConveyorStage would be defined as 
*class ConveyorStage extends Actor with 
ActorPublisher[CustomTriple[ByteString, MetaInfo, SystemRequest]] with 
ActorSubscriber*, which looks a bit weird and doesn't scale well. AFAIU 
such a multi-input multi-output Processor can be implemented via 
*PartialFlowGraph*, but I don't know how to maintain state inside such a 
materialized Processor instance (all the inputs and outputs are 
interdependent), or whether it's even possible. I also looked through the 
ActorPublisher and ActorSubscriber source code and tend to believe that an 
actor-based Processor with multiple inputs and outputs could be 
implemented, but it really feels like a hack (lots of things are marked as 
internal API).
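To make the interdependence concrete, here is a library-free sketch of the 
stateful routing core such a 2-in/3-out stage would need, independent of 
how the stream plumbing is wired. All names here are hypothetical (this is 
plain Scala, not Akka API), and *Vector[Byte]* stands in for *ByteString*:

```scala
// Hypothetical sketch of the 2-in / 3-out stage's logic; the stream
// machinery is elided, only the stateful routing core is shown.
object ConveyorLogicSketch {
  type Bytes = Vector[Byte] // stand-in for akka.util.ByteString
  final case class MetaInfo(bytesSeen: Int)
  final case class SystemRequest(query: String)
  final case class SystemResponse(payload: String)

  sealed trait In                                          // the 2 inputs
  final case class DataIn(bytes: Bytes) extends In
  final case class ResponseIn(resp: SystemResponse) extends In

  sealed trait Out                                         // the 3 outputs
  final case class DataOut(bytes: Bytes) extends Out
  final case class MetaOut(meta: MetaInfo) extends Out
  final case class RequestOut(req: SystemRequest) extends Out

  // The interdependent state: bytes seen so far, and whether we are
  // still waiting for a System.Response before emitting MetaInfo.
  final case class State(bytesSeen: Int, awaitingResponse: Boolean)

  def step(state: State, in: In): (State, List[Out]) = in match {
    case DataIn(bytes) =>
      val s = state.copy(bytesSeen = state.bytesSeen + bytes.length)
      // Pass data through; ask the system for extra info on first data.
      val requests: List[Out] =
        if (!state.awaitingResponse) List(RequestOut(SystemRequest("lookup")))
        else Nil
      (s.copy(awaitingResponse = true), DataOut(bytes) :: requests)
    case ResponseIn(_) =>
      // Response arrived: now we can emit the MetaInfo.
      (state.copy(awaitingResponse = false),
       List(MetaOut(MetaInfo(state.bytesSeen))))
  }

  // Drive the state machine over a whole input sequence.
  def run(ins: List[In]): List[Out] =
    ins.foldLeft((State(0, awaitingResponse = false), List.empty[Out])) {
      case ((s, acc), in) =>
        val (s2, outs) = step(s, in)
        (s2, acc ++ outs)
    }._2
}
```

Whatever Processor shape ends up hosting this, the core question is how to 
keep a *step*-like function and its *State* alive across all inlets and 
outlets of one materialized instance.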
I really don't want to go back to passing custom actors representing 
MetaInfoStorage and System to each conveyor stage and handling demand via 
custom ACK-like commands, because that feels like reinventing reactive 
streams. 
Could somebody recommend the most idiomatic approach to this problem?
Any help would be greatly appreciated! Thanks in advance!
On Sunday, March 15, 2015 at 4:13:49 AM UTC+3, [email protected] wrote:
>
> Hi! I'm struggling to build something like a conveyor system for file 
> processing. The main idea is to have combinable conveyor stages which can 
> operate on ByteStrings streamed through them, and this use case seems to 
> be a perfect fit for reactive streams. However, the conveyor stages also 
> need to extract some meta information from the ByteStrings passing through 
> them and somehow send it down to other parts of the system, and I'm not 
> sure what the best/most idiomatic way to do that is in the world of 
> reactive streams. My current solution is to have stages defined as *class 
> ConveyorStage extends Actor with ActorPublisher[Either[ByteString, 
> MetaInfo]] with ActorSubscriber* and then split the ByteStrings and 
> MetaInfo to different sinks at the end, but it doesn't look conceptually 
> right. I think the most idiomatic way would be to have a Processor which 
> is part of two separate streams at the same time and can emit to either of 
> them according to its inner logic, but I can't find any references in the 
> documentation on how to use the same "singleton" actor processor in 
> different materialized streams. Is this the right direction, or am I 
> overengineering it?
> I'm quite new to all this reactive stuff and there's a high chance that I 
> have completely misunderstood the concept, so any 
> advice/clarification/guideline would be greatly appreciated. Thanks in 
> advance!
>
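For reference, the split-by-Either part of the current approach described in 
the quoted question can be sketched without any streaming machinery at all. 
This is plain Scala with hypothetical names; *Vector[Byte]* stands in for 
*ByteString*:

```scala
// Hypothetical sketch: the collection analogue of routing a mixed
// Either[ByteString, MetaInfo] stream into two separate sinks.
object SplitSketch {
  type Bytes = Vector[Byte] // stand-in for akka.util.ByteString
  final case class MetaInfo(key: String, value: String)

  def split(events: Seq[Either[Bytes, MetaInfo]]): (Seq[Bytes], Seq[MetaInfo]) = {
    // Separate data elements from metadata elements, preserving order
    // within each "sink".
    val (lefts, rights) = events.partition(_.isLeft)
    (lefts.collect { case Left(b) => b },
     rights.collect { case Right(m) => m })
  }
}
```

The downside, as noted above, is that the types force every stage to speak 
the union of everything flowing through it.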
