I am using Spark Streaming with Python. For each RDD I call a map, i.e.
myrdd.map(myfunc); myfunc lives in a separate Python module. In yet another
Python module I have a global list, mylist, that's populated with metadata.
I can't get myfunc to see mylist...it's always empty on the workers.
Alternatively, I guess I could pass mylist to map explicitly.
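(A minimal sketch of the "pass mylist to map" alternative, with hypothetical stand-ins for myfunc and mylist since the real ones aren't shown. The idea is that each Spark worker re-imports your modules in a fresh process, so a module-level global populated only on the driver stays empty there; binding the list into the mapped function makes it travel with the pickled closure instead. Plain map is used here so the sketch runs without Spark.)

```python
from functools import partial

# Hypothetical stand-in for the real myfunc: it takes the metadata as an
# explicit argument instead of reading a module-level global (which is
# empty on workers, since each worker re-imports the module fresh).
def myfunc(record, mylist):
    return (record, len(mylist))

mylist = ["meta1", "meta2"]  # stand-in for the metadata list

# Bind mylist into the function before handing it to map; with Spark this
# would be myrdd.map(partial(myfunc, mylist=mylist)) -- the bound value is
# pickled as part of the closure and shipped to the workers.
mapped = list(map(partial(myfunc, mylist=mylist), ["a", "b"]))
# mapped == [("a", 2), ("b", 2)]
```

If the metadata is large or reused across many tasks, Spark's sc.broadcast(mylist) (reading mylist_bc.value inside myfunc) avoids re-shipping it with every task.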

Any suggestions?

Thanks,
Vadim