Hey QiangCai,
Thank you for your reply. I have Spark 1.6.2 and I also tried with
-Dspark.version=1.6.2, but the result is the same; I am still getting the same
exception.
Could this exception happen if I have a different Scala version?
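For reference, this is how I am checking which Scala and Spark versions my installation actually runs (just a rough sketch, assuming spark-shell starts against the same Spark 1.6.2 installation, so the CarbonData build profile, 2.10 vs 2.11, can be matched to it):

    // run inside spark-shell
    // prints the Scala series this Spark build was compiled against, e.g. "version 2.10.5"
    println(scala.util.Properties.versionString)
    // prints the Spark version actually on the classpath, expected to be 1.6.2
    println(org.apache.spark.SPARK_VERSION)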
GitHub user rahulforallp opened a pull request:
https://github.com/apache/incubator-carbondata/pull/269
CARBONDATA-345
Be sure to do all of the following to help us incorporate your contribution
quickly and easily:
- [ ] Make sure the PR title is formatted like