> /latest/databricks_guide/index.html#04%20SQL,%20DataFrames%20%26%20Datasets/02%20Introduction%20to%20DataFrames%20-%20scala.html
>
>
> On Sun, Nov 20, 2016 at 1:24 PM, pandees waran <pande...@gmail.com> wrote:
>
>> have you tried using "." access method?
>>
The following is my dataframe schema:

root
 |-- name: string (nullable = true)
 |-- addresses: array (nullable = true)
 |    |-- element: struct (containsNull = true)
 |    |    |-- street: string (nullable = true)
 |    |    |-- city: string (nullable = true)
I want to
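A minimal sketch of the "." access method suggested above, against a DataFrame with the schema shown (the sample data and object name here are assumptions, not from the thread): explode the addresses array into one row per element, then reach the struct fields by name with dot notation.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.explode

case class Address(street: String, city: String)
case class Person(name: String, addresses: Seq[Address])

object DotAccessExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.master("local").appName("dot access").getOrCreate()
    import spark.implicits._

    // hypothetical rows matching the printSchema output above
    val df = Seq(Person("alice", Seq(Address("1 Main St", "Seattle")))).toDF()

    // one row per address; "addr.street" / "addr.city" use the "." access method
    val flat = df
      .select($"name", explode($"addresses").as("addr"))
      .select($"name", $"addr.street", $"addr.city")

    flat.show()
    spark.stop()
  }
}
```

`explode` turns each array element into its own row, after which the struct behaves like an ordinary nested column.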
so it's not
> supported. You can define your class which is supported by SQL Encoder, and
> convert this generated class to the new class in `parseLine`.
>
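The suggestion above might look like the following sketch. All class and accessor names here are assumptions (a typical protobuf-java generated class): the generated Person is not encoder-friendly, so `parseLine` converts it to a plain case class that Spark's SQL Encoder can handle.

```scala
import scala.collection.JavaConverters._

// encoder-friendly case classes (assumed shapes mirroring the .proto)
case class AddressRecord(street: String, city: String)
case class PersonRecord(name: String, age: Int, tags: Seq[String],
                        addresses: Seq[AddressRecord])

object ParseLineSketch {
  // hypothetical parseLine: raw bytes -> generated protobuf class -> case class
  def parseLine(bytes: Array[Byte]): PersonRecord = {
    val p = Person.parseFrom(bytes) // protobuf-generated class (assumed)
    PersonRecord(
      p.getName,
      p.getAge,
      p.getTagsList.asScala.toList,
      p.getAddressesList.asScala.map(a => AddressRecord(a.getStreet, a.getCity)).toList
    )
  }
}
```

The case classes carry only types the encoder supports (strings, ints, Seqs), so `Dataset[PersonRecord]` works where `Dataset[Person]` does not.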
> On Wed, Nov 16, 2016 at 4:22 PM, shyla deshpande <deshpandesh...@gmail.com
> > wrote:
>
>> Ryan,
>>
  .format("console")
  .start()
query.awaitTermination()
}
On Thu, Nov 17, 2016 at 11:30 AM, shyla deshpande <deshpandesh...@gmail.com>
wrote:
> val spark = SparkSession.builder
>   .master("local")
>   .appName("spark session example")
val spark = SparkSession.builder
  .master("local")
  .appName("spark session example")
  .getOrCreate()

import spark.implicits._

val dframe1 = spark.readStream.format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "student")
  .load()
How do I deserialize
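One possible answer to the deserialization question, continuing from the `dframe1` defined above. The `Student` protobuf class, its accessors, and the `StudentRecord` case class are all assumptions: cast the Kafka `value` column to binary, then parse each record in a map.

```scala
import spark.implicits._

// encoder-friendly target type (assumed fields)
case class StudentRecord(name: String, grade: Int)

val students = dframe1
  .selectExpr("CAST(value AS BINARY) AS value")
  .as[Array[Byte]]
  .map { bytes =>
    val s = Student.parseFrom(bytes)       // protobuf-generated class (assumed)
    StudentRecord(s.getName, s.getGrade)   // assumed accessors
  }
```

The key point is that the Kafka source delivers raw bytes in `value`; all protobuf work happens inside the map, and only the plain case class reaches the encoder.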
  optional string city = 2;
}

message Person {
  optional string name = 1;
  optional int32 age = 2;
  optional Gender gender = 3;
  repeated string tags = 4;
  repeated Address addresses = 5;
}
On Wed, Nov 16, 2016 at 3:04 PM, shyla deshpande <deshpandesh...@gmail.com>
wrote:
> Thanks for the
final val NAME_FIELD_NUMBER = 1
final val AGE_FIELD_NUMBER = 2
final val GENDER_FIELD_NUMBER = 3
final val TAGS_FIELD_NUMBER = 4
final val ADDRESSES_FIELD_NUMBER = 5
}
On Wed, Nov 16, 2016 at 1:28 PM, Shixiong(Ryan) Zhu <shixi...@databricks.com
> wrote:
> Could you provide the Person class?
> to fix it (http://docs.scala-lang.org/overviews/reflection/thread-safety.html).
> AFAIK, the only way to fix it is upgrading to Scala 2.11.
>
> On Wed, Nov 16, 2016 at 11:16 AM, shyla deshpande <
> deshpandesh...@gmail.com> wrote:
>
>> I am using protobuf to encode. This may
I am using protobuf to encode. This may not be related to the new release issue:
Exception in thread "main" scala.ScalaReflectionException: <none> is not a term
    at scala.reflect.api.Symbols$SymbolApi$class.asTerm(Symbols.scala:199)
    at
Is it OK to use ProtoBuf for sending messages to Kafka? I do not see anyone
using it.
Please direct me to some code samples of how to use it in Spark Structured
Streaming.
Thanks again..
On Sat, Nov 12, 2016 at 11:44 PM, shyla deshpande <deshpandesh...@gmail.com>
wrote:
> Thanks
/jaceklaskowski
>
>
> On Sat, Nov 12, 2016 at 4:07 PM, Luciano Resende <luckbr1...@gmail.com>
> wrote:
>
> If you are interested in Akka streaming, it is being maintained in Apache
> Bahir. For Akka there isn't a structured streaming version yet, but we
> would
> b
I am using ProtoBuf for Kafka messages with Spark Streaming because ProtoBuf
is already being used in the system.
Some sample code and reading material for using ProtoBuf for Kafka messages
with Spark Streaming would be helpful.
Thanks
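For the classic Spark Streaming (DStream) path asked about above, a sketch using the spark-streaming-kafka-0-10 integration is below. The topic name, group id, and the protobuf-generated `Person` class are assumptions; each Kafka record's value arrives as raw bytes and is parsed per record.

```scala
import org.apache.kafka.common.serialization.{ByteArrayDeserializer, StringDeserializer}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

object ProtoKafkaStream {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("proto-kafka")
    val ssc = new StreamingContext(conf, Seconds(5))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[ByteArrayDeserializer],
      "group.id" -> "person-consumer",
      "auto.offset.reset" -> "latest"
    )

    val stream = KafkaUtils.createDirectStream[String, Array[Byte]](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, Array[Byte]](Seq("person"), kafkaParams)
    )

    // parse each record's value bytes with the generated protobuf class (assumed)
    val persons = stream.map(record => Person.parseFrom(record.value()))
    persons.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Using `ByteArrayDeserializer` for the value keeps Kafka out of the serialization business entirely; protobuf parsing stays in the Spark map.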
I am using Spark 2.0.1. I wanted to build a data pipeline using Kafka,
Spark Streaming and Cassandra using Structured Streaming. But the Kafka
source support for Structured Streaming is not yet available, so now I am
trying to use Akka Streams as the source for Spark Streaming.
Want to make sure I
I am using spark-cassandra-connector_2.11.
On Mon, Nov 7, 2016 at 3:33 PM, shyla deshpande <deshpandesh...@gmail.com>
wrote:
> Hi ,
>
> I am trying to do structured streaming with the wonderful SparkSession,
> but cannot save the streaming data to Cassandra.
>
> If any
Hi ,
I am trying to do structured streaming with the wonderful SparkSession, but
cannot save the streaming data to Cassandra.
If anyone has got this working, please help
Thanks
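At the time of this thread there was no Cassandra sink for structured streaming, so one common workaround was structured streaming's `ForeachWriter` with the DataStax Java driver. The sketch below is not from the thread; the keyspace, table, column names, and row type are all assumptions.

```scala
import com.datastax.driver.core.{Cluster, PreparedStatement, Session}
import org.apache.spark.sql.ForeachWriter

case class PersonRow(name: String, city: String)

class CassandraPersonWriter(host: String) extends ForeachWriter[PersonRow] {
  @transient private var cluster: Cluster = _
  @transient private var session: Session = _
  @transient private var insert: PreparedStatement = _

  override def open(partitionId: Long, version: Long): Boolean = {
    cluster = Cluster.builder().addContactPoint(host).build()
    session = cluster.connect()
    // keyspace/table/columns are assumptions for this sketch
    insert = session.prepare("INSERT INTO my_keyspace.person (name, city) VALUES (?, ?)")
    true
  }

  override def process(row: PersonRow): Unit =
    session.execute(insert.bind(row.name, row.city))

  override def close(errorOrNull: Throwable): Unit = {
    if (session != null) session.close()
    if (cluster != null) cluster.close()
  }
}

// usage (assuming personDS: Dataset[PersonRow] built from the Kafka source):
// personDS.writeStream.foreach(new CassandraPersonWriter("localhost")).start()
```

Opening the connection in `open` and closing it in `close` keeps one session per partition per trigger, which is the intended lifecycle of a `ForeachWriter`.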
Hi Jaya!
Thanks for the reply. Structured streaming works fine for me with a socket
text stream. I think structured streaming with a Kafka source is not yet
supported.
If anyone has got it working with a Kafka source, please provide some sample
code or direction.
Thanks
On Sun, Nov 6, 2016
I am trying to do Structured Streaming with Kafka Source. Please let me
know where I can find some sample code for this. Thanks
provided, in this case they
> would only be provided you run your application with spark-submit or
> otherwise have Spark's JARs on your class path. How are you launching your
> application?
>
> On Fri, Nov 4, 2016 at 2:00 PM, shyla deshpande <deshpandesh...@gmail.com>
> wrote:
object App {
  import org.apache.spark.sql.functions._
  import org.apache.spark.sql.SparkSession

  def main(args: Array[String]) {
    println("Hello World!")
    val sparkSession = SparkSession.builder
      .master("local")
      .appName("spark session example")
      .getOrCreate()
  }
}
Hi Shyla,
>
> there is the documentation for setting up IDE -
> https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup
>
> I hope this is helpful.
>
>
> 2016-11-04 9:10 GMT+09:00 shyla deshpande <deshpandesh...@gmail.com>:
>
Hello Everyone,
I just installed Spark 2.0.1, spark shell works fine.
Was able to run some simple programs from the Spark Shell, but find it hard
to make the same program work when using IntelliJ.
I am getting the following error.
Exception in thread "main" java.lang.NoSuchMethodError:
ich...@databricks.com>
wrote:
> I'm not aware of any open issues against the kafka source for structured
> streaming.
>
> On Tue, Nov 1, 2016 at 4:45 PM, shyla deshpande <deshpandesh...@gmail.com>
> wrote:
>
>> I am building a data pipeline using Kafka, Spark streaming and C
I am building a data pipeline using Kafka, Spark streaming and Cassandra.
Wondering if the issues with Kafka source fixed in Spark 2.0.1. If not,
please give me an update on when it may be fixed.
Thanks
-Shyla