You can create a DataFrame directly from a Cassandra table using something like
this:

val dfCassTable = sqlContext.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> "your_column_family", "keyspace" -> "your_keyspace"))
  .load()
Then you can get the schema:
val dfCassTableSchema = dfCassTable.schema
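To give a fuller picture of the roll-up Neeraj asked about: once the seconds table is loaded this way, the minutes table can be built with a groupBy on a truncated timestamp and written back through the same connector. This is only a sketch, assuming a live Spark/Cassandra setup; the table names ("seconds_table", "minutes_table") and columns ("ts", "value") are hypothetical placeholders, not anything from the original thread.

```scala
// Sketch only: assumes a running SQLContext with the Spark Cassandra connector
// on the classpath, and a seconds table with hypothetical columns
// "ts" (timestamp) and "value" (double).
import org.apache.spark.sql.functions._

val dfSeconds = sqlContext.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> "seconds_table", "keyspace" -> "your_keyspace"))
  .load()

// Truncate each timestamp to minute granularity and aggregate.
val dfMinutes = dfSeconds
  .groupBy(date_format(col("ts"), "yyyy-MM-dd HH:mm").as("minute"))
  .agg(sum("value").as("total"), avg("value").as("average"))

// Write the roll-up back to a (pre-created) minutes table.
dfMinutes.write
  .format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> "minutes_table", "keyspace" -> "your_keyspace"))
  .save()
```

Since the schema is read from Cassandra at load time, the groupBy/agg step works against whatever columns the existing table defines; only the column names above need adjusting.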
Mohammed
Author: Big Data Analytics with Spark
-Original Message-
From: justneeraj [mailto:justnee...@gmail.com]
Sent: Tuesday, May 10, 2016 2:22 AM
To: user@spark.apache.org
Subject: Reading table schema from Cassandra
Hi,
We are using Spark Cassandra connector for our app.
I am trying to create higher-level roll-up tables, e.g. a minutes table from a
seconds table. If my tables are already defined, how can I read the schema of a
table, so that I can load it into a DataFrame and create the aggregates?
Any help would be much appreciated.
Thanks,
Neeraj
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Reading-table-schema-from-Cassandra-tp26915.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org