Hi, I am trying to get a sense of how many streams we can process in parallel on a Spark Streaming cluster (Hadoop YARN). Is there any benchmark for this? We need a large number of streams (original plus transformed) to be processed in parallel; the number is approximately 30,0000.
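For context, a back-of-envelope calculation may help frame the question. In classic (receiver-based) Spark Streaming, each input stream's receiver pins one core for the lifetime of the application, so the receiver count alone bounds the cluster size. The sketch below assumes receiver-based sources and reads the figure above as 30,000; the cores-per-executor value is a hypothetical YARN container size, not from the original post.

```python
# Back-of-envelope capacity estimate (assumption: receiver-based input
# DStreams, where each receiver occupies one executor core full-time).
num_streams = 30_000          # reading the question's "30,0000" as 30,000
cores_per_executor = 4        # hypothetical YARN container size
receiver_cores = num_streams  # one dedicated core per receiver

# Ceiling division: executors needed just to host the receivers,
# before any cores are left over for actual processing.
executors_needed = -(-receiver_cores // cores_per_executor)
print(executors_needed)
```

At this scale the receiver overhead alone dominates, which is why consolidating many logical streams onto fewer physical receivers (or using receiverless/direct sources) is usually necessary.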
Thanks,
Negi

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Number-of-Spark-streams-in-Yarn-cluster-tp7386.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.