Use Spark XML version 0.3.3:

<dependency>
  <groupId>com.databricks</groupId>
  <artifactId>spark-xml_2.10</artifactId>
  <version>0.3.3</version>
</dependency>
On Fri, Jun 17, 2016 at 4:25 PM, VG wrote:
It proceeded with the jars I mentioned.
However, no data is getting loaded into the data frame...
sob sob :(
On Fri, Jun 17, 2016 at 4:25 PM, VG wrote:
Hi Siva,

This is what I have for jars. Did you manage to run with these or different
versions?

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.6.1</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.6.1</version>
</dependency>
<dependency>
  <groupId>com.databricks</groupId>
  <artifactId>spark-xml_2.10</artifactId>
  <version>0.2.0</version>
</dependency>
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.10.6</version>
</dependency>
Thanks
VG
On Fri, Jun 17, 2016 at 4:16 PM
Hi Marco,

I ran it in the IDE (IntelliJ) as well. It works fine.
VG, make sure the right jar is on the classpath.

--Siva
On Fri, Jun 17, 2016 at 4:11 PM, Marco Mistroni wrote:
And your Eclipse path is correct?
I suggest, as Siva did before, building your jar and running it via
spark-submit by specifying the --packages option.
It's as simple as running this command:

spark-submit --packages com.databricks:spark-xml_: --class
Indeed, if you have only these lines to run, why
Try to import the class and see if you get a compilation error:
import com.databricks.spark.xml
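If the import compiles but the job still fails at runtime, a quick runtime probe can confirm whether the spark-xml classes are actually on the classpath. This is my own sketch, not part of the thread; it assumes `com.databricks.spark.xml.DefaultSource` is the data-source class the spark-xml jar ships:

```java
// Generic classpath probe (a sketch): returns whether a class can be loaded
// by the current classloader. If the spark-xml class below is missing,
// the jar is not on the runtime classpath.
public class ClasspathCheck {
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Assumed: DefaultSource is the class spark-xml registers as its data source.
        String cls = "com.databricks.spark.xml.DefaultSource";
        System.out.println(cls + (isOnClasspath(cls) ? ": found" : ": NOT on classpath"));
    }
}
```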
Siva
On Fri, Jun 17, 2016 at 4:02 PM, VG wrote:
nopes. eclipse.
On Fri, Jun 17, 2016 at 3:58 PM, Siva A wrote:
If you are running from an IDE, are you using IntelliJ?
On Fri, Jun 17, 2016 at 3:20 PM, Siva A wrote:
Can you try to package it as a jar and run it using spark-submit?
Siva
On Fri, Jun 17, 2016 at 3:17 PM, VG wrote:
I am trying to run from the IDE and everything else is working fine.
I added the spark-xml jar and now I ended up with this dependency error:

16/06/17 15:15:57 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.NoClassDefFoundError:
scala/collection/GenTraversableOnce$class
at
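A NoClassDefFoundError on scala/collection/GenTraversableOnce$class usually means the Scala binary versions on the classpath disagree (e.g. a _2.10 artifact running against a different scala-library than it was compiled for). A sketch of aligned Maven coordinates, assuming Scala 2.10 as in the versions discussed in this thread:

```
<!-- All Spark artifacts and scala-library should share one Scala binary version -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.10.6</version>
</dependency>
<dependency>
  <!-- the _2.10 suffix must match the scala-library 2.10.x above -->
  <groupId>com.databricks</groupId>
  <artifactId>spark-xml_2.10</artifactId>
  <version>0.3.3</version>
</dependency>
```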
Hi Siva,
I still get a similar exception (see the highlighted section; it is
looking for the DataSource)
16/06/17 15:11:37 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.ClassNotFoundException: Failed to find
data source: xml. Please find packages at http://spar
So are you using spark-submit or spark-shell?
You will need to launch either by passing the --packages option (like in the
example below for spark-csv). You will need to know
--packages com.databricks:spark-xml_:
hth
On Fri, Jun 17, 2016 at 10:20 AM, VG wrote:
If it's not working,
add the package list when executing spark-submit/spark-shell, like below:
$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.10:0.3.3
$SPARK_HOME/bin/spark-submit --packages com.databricks:spark-xml_2.10:0.3.3
On Fri, Jun 17, 2016 at 2:56 PM, Siva A wrote:
Just try to use "xml" as the format, like below:

SQLContext sqlContext = new SQLContext(sc);
DataFrame df = sqlContext.read()
    .format("xml")
    .option("rowTag", "row")
    .load("A.xml");

FYR: https://github.com/databricks/spark-xml
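To illustrate what rowTag selects (my own aside, not from the thread): spark-xml treats every element whose tag matches rowTag as one DataFrame row. A plain-JDK sketch of that selection, using a hypothetical inline document instead of A.xml:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class RowTagDemo {
    // Counts elements with the given tag name -- the same set of elements
    // spark-xml would turn into DataFrame rows when rowTag is set to `tag`.
    static int countRows(String xml, String tag) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        return doc.getElementsByTagName(tag).getLength();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<rows><row><a>1</a></row><row><a>2</a></row></rows>";
        System.out.println(countRows(xml, "row")); // prints 2
    }
}
```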
--Siva
On Fri, Jun 17
Apologies for that.
I am trying to use spark-xml to load data from an XML file.
Here is the exception:
16/06/17 14:49:04 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.ClassNotFoundException: Failed to find
data source: org.apache.spark.xml. Please find packages
Too little info.
It'll help if you can post the exception and show your sbt file (if you are
using sbt), and provide minimal details on what you are doing.
kr
On Fri, Jun 17, 2016 at 10:08 AM, VG wrote:
Failed to find data source: com.databricks.spark.xml
Any suggestions to resolve this?