Did you check the worker process logs?  What are the emit counts for the
components in the UI?
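For reference, this is roughly what I'd run on each supervisor box. The paths are a guess based on your nimbus log (which mentions /app/storm) -- substitute wherever storm.log.dir actually points on your install:

```shell
# Hypothetical log location -- adjust to your storm.log.dir.
LOG_DIR="${STORM_LOG_DIR:-/app/storm/logs}"

# List the worker logs for the assigned slots (6700-6703 by default).
# If there are no worker-*.log files at all, the workers never launched.
ls "$LOG_DIR"/worker-*.log 2>/dev/null || true

# Scan them for failures that never surface in the UI.
grep -iE 'error|exception|refused|timed out' "$LOG_DIR"/worker-*.log 2>/dev/null || true
```

If the worker logs exist but stop right after startup, that usually points at the workers failing to reach each other or ZooKeeper rather than at the topology itself.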

Also, I haven't run Storm on AWS personally, but the following line seems strange
to me:

:node->host {"bd340179-5905-4233-ad5d-44b47b228177" "*.compute.amazonaws.com"}

Generally this would be bound to a specific hostname, not a wildcard like
*.<domain>. If the supervisor is advertising a name the other nodes can't
resolve (or that resolves to the wrong machine), that would fit executors
being assigned but never reported alive.
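On EC2 this kind of thing is usually fixed by pinning the hostname each daemon advertises via storm.local.hostname. A minimal sketch, assuming a default conf layout -- the address below is a made-up example; put each node's own resolvable DNS name in that node's storm.yaml:

```yaml
# conf/storm.yaml on each node (example address -- substitute the
# node's real, resolvable public or private DNS name)
storm.local.hostname: "ec2-203-0-113-10.compute-1.amazonaws.com"
```

After changing it, restart the nimbus and supervisor daemons and resubmit the topology.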

On Mon, Apr 13, 2015 at 10:22 AM, Matthew Sharpe <[email protected]>
wrote:

> Hi,
>
> I've set up a small Storm cluster on AWS that is giving me headaches. At
> the moment I've got a situation where the cluster (a nimbus node and 3
> supervisors) will successfully run in local mode my particular topology.
> However, as soon as I try to run the topology in distributed mode there are
> no errors (that I can see) in any of the logs - the topology appears in the
> UI but never shows any tuples emitted by the spouts.
>
> In order to check it wasn't my topology I cloned the storm-starter project
> and have tried with the RollingTopWords example - running locally I get a
> stream of output but as soon as I specify that we should run in a
> distributed fashion I get a success message (topology successfully
> submitted). The topology appears on the UI as successfully submitted but I
> see no progress.
>
> It's a bit of a weird one as I don't know which logs to share to show the
> problem as the whole point is I'm not seeing any errors!
>
> The last entry in the supervisor logs pertains to the set up of the
> supervisor (i.e. it's not logged anything about the topologies that have
> been submitted)
>
> The last entries in the nimbus logs are probably a bit more illuminating:
>
> 2015-04-13 14:09:25 b.s.d.nimbus [INFO] Executor
> production-topology-2-1428932653:[2 2] not alive
> 2015-04-13 14:09:25 b.s.d.nimbus [INFO] Executor
> production-topology-2-1428932653:[3 3] not alive
> 2015-04-13 14:09:25 b.s.d.nimbus [INFO] Executor
> production-topology-2-1428932653:[4 4] not alive
> 2015-04-13 14:09:25 b.s.s.EvenScheduler [INFO] Available slots:
> (["bd340179-5905-4233-ad5d-44b47b228177" 6702]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6701]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6700]
> ["9de97b6f-3d3f-4797-8ac0-e89501de48d3" 6703]
> ["9de97b6f-3d3f-4797-8ac0-e89501de48d3" 6702]
> ["9de97b6f-3d3f-4797-8ac0-e89501de48d3" 6701]
> ["9de97b6f-3d3f-4797-8ac0-e89501de48d3" 6700]
> ["2874d18d-e0d1-46e0-a50d-85ce645ec4aa" 6703]
> ["2874d18d-e0d1-46e0-a50d-85ce645ec4aa" 6702]
> ["2874d18d-e0d1-46e0-a50d-85ce645ec4aa" 6701]
> ["2874d18d-e0d1-46e0-a50d-85ce645ec4aa" 6700])
> 2015-04-13 14:09:25 b.s.d.nimbus [INFO] Reassigning
> production-topology-2-1428932653 to 1 slots
> 2015-04-13 14:09:25 b.s.d.nimbus [INFO] Reassign executors: [[2 2] [3 3]
> [4 4] [5 5] [6 6] [7 7] [8 8] [9 9] [10 10] [11 11] [12 12] [13 13] [14 14]
> [15 15] [1 1]]
> 2015-04-13 14:09:25 b.s.d.nimbus [INFO] Setting new assignment for
> topology id production-topology-2-1428932653:
> #backtype.storm.daemon.common.Assignment{:master-code-dir
> "/app/storm/nimbus/stormdist/production-topology-2-1428932653", :node->host
> {"bd340179-5905-4233-ad5d-44b47b228177" "*.compute.amazonaws.com"},
> :executor->node+port {[2 2] ["bd340179-5905-4233-ad5d-44b47b228177" 6702],
> [3 3] ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [4 4]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [5 5]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [6 6]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [7 7]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [8 8]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [9 9]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [10 10]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [11 11]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [12 12]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [13 13]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [14 14]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [15 15]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702], [1 1]
> ["bd340179-5905-4233-ad5d-44b47b228177" 6702]}, :executor->start-time-secs
> {[2 2] 1428934165, [3 3] 1428934165, [4 4] 1428934165, [5 5] 1428934165, [6
> 6] 1428934165, [7 7] 1428934165, [8 8] 1428934165, [9 9] 1428934165, [10
> 10] 1428934165, [11 11] 1428934165, [12 12] 1428934165, [13 13] 1428934165,
> [14 14] 1428934165, [15 15] 1428934165, [1 1] 1428934165}}
>
> If anybody can help that'd be greatly appreciated!
>
> Thanks,
>
> Matt
>
> --
> Check out my blog: dogdogfish.com
>
