https://spark.apache.org/docs/2.0.2/sql-programming-guide.html#json-datasets

"Spark SQL can automatically infer the schema of a JSON dataset and load it
as a DataFrame. This conversion can be done using SQLContext.read.json() on
either an RDD of String, or a JSON file."

val df = spark.sql("SELECT json_encoded_blob_column FROM table_name") // a Cassandra query (Cassandra stores blobs in hexadecimal)

json_encoded_blob_column is encoded in hexadecimal. It would be great to have these blobs interpreted and loaded as a DataFrame automatically, but for now, is there any way to load or parse json_encoded_blob_column into a DataFrame?
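
For context, what I imagine ending up with is something along these lines (a rough sketch only, assuming the column comes back as a hex string that decodes to a UTF-8 JSON document, and using read.json on an RDD of String as the quoted 2.0.2 docs describe; the table and column names are just the ones from the query above):

import spark.implicits._

// Assumption: the blob is returned as a hex string (possibly "0x"-prefixed)
// whose decoded bytes are a UTF-8 JSON document.
val raw = spark.sql("SELECT json_encoded_blob_column FROM table_name")

val jsonStrings = raw.map { row =>
  val hex = row.getString(0).stripPrefix("0x")
  val bytes = hex.sliding(2, 2).map(Integer.parseInt(_, 16).toByte).toArray
  new String(bytes, "UTF-8")
}

// Spark 2.0.2: read.json accepts an RDD[String] and infers the schema.
val df = spark.read.json(jsonStrings.rdd)
df.printSchema()

If the connector instead returns the blob as binary (Array[Byte]) rather than a hex string, the decoding step would obviously change, but the read.json part should be the same.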
