[ https://issues.apache.org/jira/browse/SPARK-22589?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-22589.
----------------------------------
    Resolution: Incomplete
> Subscribe to multiple roles in Mesos
[ https://issues.apache.org/jira/browse/SPARK-22589?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-22589:
---------------------------------
    Labels: bulk-closed  (was: )

> Subscribe to multiple roles in Mesos
… just passing a comma-separated list of frameworks (hence, splitting on that
string) would already be sufficient to leverage this capability.

> Subscribe to multiple roles in Mesos
> ------------------------------------
>
>     Key: SPARK-22589
>
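The splitting suggested in the comment above can be sketched in a few lines. This is an illustrative sketch only: the `parse_roles` function name and the example value are hypothetical, not Spark's actual Mesos configuration handling.

```python
def parse_roles(value: str) -> list[str]:
    """Split a comma-separated roles string into individual role names,
    trimming whitespace and dropping empty entries."""
    return [role.strip() for role in value.split(",") if role.strip()]

# A hypothetical comma-separated setting yields one role per entry.
roles = parse_roles("spark, dev,prod")
```

Splitting on the comma and trimming each piece is enough to turn a single string-valued setting into a list of roles, which is the point the comment makes.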
Fabiano Francesconi created SPARK-22589:
----------------------------------------

    Summary: Subscribe to multiple roles in Mesos
        Key: SPARK-22589
        URL: https://issues.apache.org/jira/browse/SPARK-22589
    Project: Spark
Issue
… many other consumers there are. I think this ticket can now be closed
(just re-open it if you don't believe so). Maybe it'll be worth opening a KIP
on Kafka to have some APIs to allow Spark to be a bit more "optimized", but it
all seems okay for now. Cheers!

> Kafka Consumer should be able to subscribe to more than one topic partition
[ https://issues.apache.org/jira/browse/SPARK-20287?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Stephane Maarek closed SPARK-20287.
-----------------------------------
    Resolution: Not A Problem
> Kafka Consumer should be able to subscribe to more than one topic partition
> ---------------------------------------------------------------------------
>
>     Key: SPARK-20287
>     URL: https://issues.apache.org/jira/browse/SPARK-20287
>     Project: Spark
[ https://issues.apache.org/jira/browse/SPARK-20287?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15966973#comment-15966973 ]

Stephane Maarek commented on SPARK-20287:
-----------------------------------------
[~c...@koeninger.org]
How about using the subscribe …

… api doesn't have a way for a single consumer to subscribe to multiple
partitions, but only read a particular range of messages from one of them.

The max capacity is just a simple way of dealing with what is basically an LRU
cache - if someone creates topics dynamically and then stops sending …

… that has to re-coordinate XX number of Kafka Consumers should one go down.
That's more expensive if you have 100 consumers versus a few. But as you said,
it should be performance-limitation-driven; right now that'd be speculation.
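The max-capacity behaviour described above, a bounded cache that evicts its least-recently-used entry, can be sketched as follows. This is a minimal illustration of the LRU idea under discussion; the `ConsumerCache` class and its method names are hypothetical, not the actual Spark consumer cache implementation.

```python
from collections import OrderedDict

class ConsumerCache:
    """Bounded cache keyed by (topic, partition). Once max_capacity is
    exceeded, the least-recently-used entry is evicted."""

    def __init__(self, max_capacity: int):
        self.max_capacity = max_capacity
        self._entries = OrderedDict()

    def get_or_create(self, key, factory):
        if key in self._entries:
            self._entries.move_to_end(key)          # mark as recently used
            return self._entries[key]
        value = factory()                           # e.g. build a consumer
        self._entries[key] = value
        if len(self._entries) > self.max_capacity:
            self._entries.popitem(last=False)       # evict the oldest entry
        return value

    def __contains__(self, key):
        return key in self._entries
```

With `max_capacity=2`, creating entries for three topic-partitions evicts the first one, which then has to be recreated on the next access; that recreation cost is the expense the comment alludes to when many consumers are in play.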
> Kafka Consumer should be able to subscribe to more than one topic partition

Thanks for the discussion, I appreciate it.
> Kafka Consumer should be able to subscribe to more than one topic partition
> ---------------------------------------------------------------------------
>
>     Key: SPARK-20287
>     URL: https://issues.apache.org/jira/browse/SPARK-20287
… a problem, and then propose a solution?

> Kafka Consumer should be able to subscribe to more than one topic partition
> ---------------------------------------------------------------------------
>
>     Key: SPARK-20287
>     URL: https://issues.apache.org/jira/browse/SPARK-20287
Stephane Maarek created SPARK-20287:
------------------------------------

    Summary: Kafka Consumer should be able to subscribe to more than
             one topic partition
        Key: SPARK-20287
        URL: https://issues.apache.org/jira/browse/SPARK-20287
subscribe to spark issues
[ https://issues.apache.org/jira/browse/SPARK-9556?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tathagata Das resolved SPARK-9556.
----------------------------------
       Resolution: Fixed
    Fix Version/s: 1.5.0

Make all BlockGenerators subscribe to rate limit updates
[ https://issues.apache.org/jira/browse/SPARK-9556?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-9556:
-----------------------------------
    Assignee: Tathagata Das  (was: Apache Spark)

Make all BlockGenerators subscribe to rate limit updates
[ https://issues.apache.org/jira/browse/SPARK-9556?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-9556:
-----------------------------------
    Assignee: Apache Spark  (was: Tathagata Das)

Make all BlockGenerators subscribe to rate limit updates
… for this issue:
https://github.com/apache/spark/pull/7913

Make all BlockGenerators subscribe to rate limit updates
--------------------------------------------------------

        Key: SPARK-9556
        URL: https://issues.apache.org/jira/browse/SPARK-9556
    Project: Spark
Tathagata Das created SPARK-9556:
---------------------------------

    Summary: Make all BlockGenerators subscribe to rate limit updates
        Key: SPARK-9556
        URL: https://issues.apache.org/jira/browse/SPARK-9556
    Project: Spark
Issue
Best Regards,
Jeff Zhang