Bit of a long shot since my example is related to Storm, not Hadoop, and my 
experience was five years ago, but this blog post might help:

http://derek.troywest.com/articles/trident-in-clojure

and this related ticket: https://github.com/sorenmacbeth/marceline/issues/9

I could run a Storm topology locally, or on a single node, but not when 
deployed to a Storm cluster. It came down to the way Storm chops up 
topologies and serialises them across the network to run on remote nodes. 
There are a few different ways of implementing interfaces in Clojure, and 
(at the time at least) they had different semantics in the 
serialise-reconstruct-and-run world.
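
For what it's worth, here is a minimal sketch (not from the original post; 
the interface and names are made up) of two of those mechanisms side by 
side. A reify instance belongs to an anonymous, per-form class, whereas 
defrecord generates a named class, and that difference can matter once 
instances are serialised and reconstructed on a remote JVM:

(definterface IStep
  (run []))

;; reify: instance of an anonymous, per-form class; that class (and any
;; closed-over values) must be loadable on the remote node that
;; deserialises it.
(def reified-step
  (reify IStep
    (run [_] :from-reify)))

;; defrecord: generates a named class with a known constructor, which
;; tends to be more predictable to reconstruct on another JVM.
(defrecord RecordStep []
  IStep
  (run [_] :from-record))

The blog post above walks through the Storm/Marceline specifics.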

This may also be helpful; it has been fixed since I wrote the blog 
post: http://dev.clojure.org/jira/browse/CLJ-1208

Good luck!

On Saturday, 5 January 2019 08:35:22 UTC+11, cloje wrote:
>
> I have a project that I'm trying to run on Hadoop in cluster mode. 
>
> I have written and executed the code on my local machine and on a single 
> node on Hadoop. But when I try to execute the same jar file in cluster 
> mode with 5 executors, it bombs at a function and says: Attempting to call 
> unbound fn, #'my-func-name.
>
> Has anyone faced the same kind of issue?
> Was there a workaround?
>
