Re: BUG: spark.readStream .schema(staticSchema) not receiving schema information

2020-03-28 Thread Zahid Rahman
Thanks for the tip! But if the first thing you come across is somebody using the trim function to strip away the spaces in /etc/hostnames, like so, from: 127.0.0.1 hostname local to: 127.0.0.1hostnamelocal then there is a log error message showing the outcome of unnecessarily using the trim

Re: BUG: spark.readStream .schema(staticSchema) not receiving schema information

2020-03-28 Thread Zahid Rahman
So the schema is limited to holding only the DEFINITION of the schema. For example, as you say, the columns, i.e. first column User: Int, second column password: String. Not the location of the source, i.e. a CSV file with or without a header, or SQL DB tables. I am pleased that for once I am wrong about it being another bug,
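A minimal sketch (Scala, column names taken from the thread) of what such a schema definition holds: only column names and types, with nothing about where the data lives.

    import org.apache.spark.sql.types.{StructType, StructField, IntegerType, StringType}

    // The schema describes only the shape of the rows: column names and types.
    val staticSchema = new StructType()
      .add(StructField("User", IntegerType))
      .add(StructField("password", StringType))

    // Nothing here records the source location (CSV path, header option, JDBC table);
    // that information is supplied separately on the reader.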

Re: BUG: spark.readStream .schema(staticSchema) not receiving schema information

2020-03-28 Thread Zahid Rahman
Very kind of you. On Sat, 28 Mar 2020, 15:24 Russell Spitzer wrote: > This is probably more of a question for the user support list, but I > believe I understand the issue. > > Schema inside of Spark refers to the structure of the output rows, for > example the schema for a particular

Re: BUG: spark.readStream .schema(staticSchema) not receiving schema information

2020-03-28 Thread Russell Spitzer
This is probably more of a question for the user support list, but I believe I understand the issue. Schema inside of Spark refers to the structure of the output rows; for example, the schema for a particular DataFrame could be (User: Int, Password: String), two columns where the first is User of type
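A hedged sketch of the point being made, assuming a Scala SparkSession named spark and a hypothetical CSV directory: the schema declares the structure of the output rows and is handed to the streaming reader with .schema(...), while source details such as the path and header option are set separately.

    import org.apache.spark.sql.types._

    // Structure of the output rows: two columns, User of type Int and Password of type String.
    val schema = new StructType()
      .add("User", IntegerType)
      .add("Password", StringType)

    // The streaming reader does not infer schemas by default, so the schema is supplied explicitly.
    val streamingDF = spark.readStream
      .schema(schema)            // declares the row structure, not the data location
      .option("header", "true")  // source options are configured separately
      .csv("/path/to/csv-dir")   // hypothetical path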

BUG: spark.readStream .schema(staticSchema) not receiving schema information

2020-03-27 Thread Zahid Rahman
Hi, version: spark-3.0.0-preview2-bin-hadoop2.7. As you can see from the code: STEP 1: I create an object of type static frame which holds all the information about the data source (CSV files). STEP 2: Then I create a variable called staticSchema, assigning it the schema information from the
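A minimal sketch of the two steps described above, assuming Scala, a SparkSession named spark, and a hypothetical CSV directory: the schema is captured from a static batch read and then re-used on the streaming reader.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("schema-example")
      .master("local[*]")
      .getOrCreate()

    // STEP 1: a static (batch) read that infers the schema from the CSV files.
    val staticFrame = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/path/to/csv-dir")      // hypothetical path

    // STEP 2: capture the inferred schema from the static DataFrame.
    val staticSchema = staticFrame.schema

    // Re-use that schema on the streaming reader, which does not infer schemas by default.
    val streamingFrame = spark.readStream
      .schema(staticSchema)
      .option("header", "true")
      .csv("/path/to/csv-dir")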