Hi guys,

I am thinking of this because of Dotan's email (thanks, Dotan). Currently
people are using different versions of Hadoop. They will definitely have
problems if their Hadoop server's version differs from the one Samza is
compiled against. That hurts the user experience, whether the user is a
veteran or a newbie.

In Spark, they provide a way to configure the Hadoop version at build time,
in both their latest and previous releases:
http://spark.apache.org/docs/latest/building-with-maven.html
http://spark.apache.org/docs/0.9.0/
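
For reference, the Spark docs linked above expose the Hadoop version as a
build property. Roughly (the exact property and profile names vary by Spark
release, so treat this as a sketch, not exact commands):

```shell
# Maven build: pick the Hadoop version via a system property
# (profile names like -Pyarn depend on the Spark release)
mvn -Dhadoop.version=2.2.0 -DskipTests clean package

# SBT build (0.9.x era): the Hadoop version comes from an env variable
SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly
```

We could do something similar with a Gradle project property, the same way
the Scala version is switched today.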

Maybe we should consider adding this to our 0.7.0 release as well. Since we
are already able to switch the Scala version, there should be no technical
difficulty. The risk is that Samza is not extensively tested against other
Hadoop versions. If that risk is a concern, we can at least provide simple
instructions to help users build Samza against other Hadoop versions.

What do you think?

Thanks,

Fang, Yan
[email protected]
+1 (206) 849-4108
