Hey,
I'm trying to design a stream that processes hundreds of thousands of files
row by row, reading each file lazily. Each file's InputStream has to be
closed once the file has been consumed, so creating an ActorPublisher per
file that closes the underlying stream when it finishes seems like the best
idea. The per-file streams then have to be merged into a single one. My
question is: can I do something like the following for hundreds of
thousands of files, or is it a bad idea? I can't think of another approach
right now. (A sketch of the publisher itself follows the graph code below.)
Thank you
import akka.NotUsed
import akka.actor.Props
import akka.stream.SourceShape
import akka.stream.scaladsl.{GraphDSL, Merge, Source}

val merged: Source[Row, NotUsed] =
  Source.fromGraph(GraphDSL.create() { implicit b =>
    import GraphDSL.Implicits._
    // one ActorPublisher-backed source per file
    val actorSources = files.map(file =>
      Source.actorPublisher[Row](Props(classOf[BatchActor], file)))
    val merge = b.add(Merge[Row](actorSources.size))
    actorSources.zipWithIndex.foreach { case (src, i) => src ~> merge.in(i) }
    SourceShape(merge.out)
  })
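
For context, a minimal sketch of the kind of BatchActor I have in mind.
readRow is a placeholder for the actual row parsing (returning None at end
of file); the important part is that the InputStream gets closed in
postStop, whether the stage completes normally or is cancelled:

import java.io.{BufferedReader, File, FileReader}
import akka.stream.actor.ActorPublisher
import akka.stream.actor.ActorPublisherMessage.{Cancel, Request}

class BatchActor(file: File) extends ActorPublisher[Row] {
  private val reader = new BufferedReader(new FileReader(file))

  def receive = {
    case Request(_) =>
      // emit rows while there is downstream demand and data left
      while (totalDemand > 0 && isActive) {
        readRow(reader) match {       // readRow: placeholder parser
          case Some(row) => onNext(row)
          case None      => onCompleteThenStop()
        }
      }
    case Cancel =>
      context.stop(self)
  }

  // runs on completion, cancellation and failure alike
  override def postStop(): Unit = reader.close()
}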