All,

In Spark 1.6.0, we used

val jdbcDF = sqlContext.read.format(-----)

to create a DataFrame over JDBC.

In Spark 2.1.x, we see it is

val jdbcDF = *spark*.read.format(-----)

Does that mean we should not be using sqlContext going forward? Also, we
see that sqlContext is not auto-initialized when running spark-shell.
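For reference, here is a minimal sketch of what we are doing now in 2.x, using the spark-shell's built-in `spark` session (the URL, table name, and credentials below are placeholders, not our real values):

```scala
// Spark 2.x: SparkSession (`spark` in spark-shell) is the unified entry point
val jdbcDF = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://dbhost:5432/mydb") // placeholder JDBC URL
  .option("dbtable", "schema.tablename")               // placeholder table
  .option("user", "username")                          // placeholder credentials
  .option("password", "password")
  .load()

// The old entry point is still reachable from the session for legacy code:
val sqlContext = spark.sqlContext
```

Is accessing it via `spark.sqlContext` like this the recommended path, or should legacy code be migrated entirely?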
Please advise, thanks

Best, Ravion
