Hi all,
# Program Sketch
1. I create a HiveContext `hiveContext`.
2. With that context, I create a DataFrame `df` from a JDBC relational
table.
3. I register the DataFrame `df` via
df.registerTempTable("TESTTABLE")
4. I start a HiveThriftServer2 via
HiveThriftServer2.startWithContext(hiveContext)
The TESTTABLE
Ted, thx. Should I repost?
On 31.10.2015 at 17:41, "Ted Yu" wrote:
> From the result of http://search-hadoop.com/?q=spark+Martin+Senne ,
> Martin's post Tuesday didn't go through.
>
> FYI
>
> On Sat, Oct 31, 2015 at 9:34 AM, Nicholas Chammas <nicholas
> che archives
> (which is precisely one of the motivations for proposing to migrate to
> Discourse), but there is a more readable archive on another unofficial site
> <http://search-hadoop.com/m/q3RTtzu5vu1tD3w52&subj=Discourse+A+proposed+alternative+to+the+Spark+User+list>.
Having written a post last Tuesday, I'm still not able to see it on
Nabble. And yes, my subscription to u...@apache.spark.org was
successful (rechecked a minute ago).
Worse, I have no indication (and no confirmation) of whether my post was
accepted, rejected, or anything else.
This is very L4M3 and so 80ies.
Hi all,
# Program Sketch
1. I create a HiveContext `hiveContext`.
2. With that context, I create a DataFrame `df` from a JDBC relational
table.
3. I register the DataFrame `df` via
df.registerTempTable("TESTTABLE")
4. I start a HiveThriftServer2 via
HiveThriftServer2.startWithContext(hiveContext)
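A minimal sketch of the four steps above, against the Spark 1.x API. The app name, JDBC URL, and table name are placeholders, and in a real deployment the master and JDBC driver jar come from spark-submit:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

val sc = new SparkContext(new SparkConf().setAppName("thrift-sketch"))

// 1. HiveContext
val hiveContext = new HiveContext(sc)

// 2. DataFrame from a JDBC relational table (URL/table are placeholders)
val df = hiveContext.read.format("jdbc")
  .option("url", "jdbc:postgresql://host:5432/db")
  .option("dbtable", "some_table")
  .load()

// 3. Register it so it is visible by name in SQL
df.registerTempTable("TESTTABLE")

// 4. Expose this context's temp tables over the Thrift/JDBC server
HiveThriftServer2.startWithContext(hiveContext)
```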
When will window functions be integrated into Spark (without a HiveContext)?
On 10 August 2015 at 23:04:22, Michael Armbrust wrote:
You will need to use a HiveContext for window functions to work.
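For reference, the window-function API at that time (Spark 1.4/1.5) lived in `org.apache.spark.sql.expressions` and only resolved against a HiveContext-backed DataFrame. A minimal sketch, where `df` and its columns "a" (grouping) and "x" (ordering) are made-up placeholders:

```scala
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.avg

// Window spec: rows grouped by "a", ordered by "x" within each group
val w = Window.partitionBy("a").orderBy("x")

// Running average of "x" per group; fails on a plain SQLContext in 1.x
val withRunningAvg = df.withColumn("running_avg", avg("x").over(w))
```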
On Mon, Aug 10, 2015 at 1:26 PM, Jerry
the moment. Can someone please confirm this?
- Alias information is not displayed via DataFrame.printSchema (or at
least I did not find a way to display it).
Cheers,
Martin
2015-07-31 22:51 GMT+02:00 Martin Senne :
> Dear Michael, dear all,
>
> a minimal example is listed below.
println("joined2DF:")
joined2DF.show
joined2DF.printSchema
joined2DF.filter(joined2DF("y").isNotNull).show
// joined2DF:
// +---+-----+----+----+
// |  x|    a|   x|   y|
// +---+-----+----+----+
// |  1|hello|null|null|
// |  2|  bob|   2|   5|
// +---+-----+----+----+
.isNotNull).show still contains null values
in the column y. This doesn't really have anything to do with nullable,
which is only a hint to the system so that we can avoid null checking when
we know that there are no null values. If you provide the full code I can
try and see if this is a bug.
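The nullable hint Michael describes is visible on each field of a DataFrame's schema. A small sketch against the Spark 1.x API, where `df` stands for any DataFrame:

```scala
// Each StructField carries a nullable flag; it is metadata for the
// optimizer, not a constraint enforced on the data itself.
df.schema.fields.foreach { f =>
  println(s"${f.name}: type=${f.dataType}, nullable=${f.nullable}")
}
```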
Dear Michael, dear all,
motivation:
object OtherEntities {
  case class Record(x: Int, a: String)
  case class Mapping(x: Int, y: Int)

  val records = Seq(Record(1, "hello"), Record(2, "bob"))
  val mappings = Seq(Mapping(2, 5))
}
Now I want to perform a *left outer join* on records and mappings.
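For reference, the expected semantics of that left outer join can be sketched with plain Scala collections (this is not the Spark API, just the result one would expect: every Record is kept, paired with an optional Mapping matched on x):

```scala
case class Record(x: Int, a: String)
case class Mapping(x: Int, y: Int)

val records  = Seq(Record(1, "hello"), Record(2, "bob"))
val mappings = Seq(Mapping(2, 5))

// Index the right side by the join key, then keep every left row,
// pairing it with Some(mapping) on a match and None otherwise.
val byX    = mappings.map(m => m.x -> m).toMap
val joined = records.map(r => (r, byX.get(r.x)))

joined.foreach(println)
// (Record(1,hello),None)
// (Record(2,bob),Some(Mapping(2,5)))
```

In Spark's tabular result, the `None` row is what surfaces as the all-null y column for Record(1, "hello").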