Yeah, I had no specific reason; BaseSessionStateBuilder is probably better.
I'll Jira it up.
On Thu, Sep 27, 2018 at 4:47 PM Herman van Hovell
wrote:
> Hey Russell,
>
> I took a quick look at your patch. I think it is more in line with the way
> the current extensions work if you call the extension …
Yes, the "startWithContext" code predates SparkSessions in Thriftserver, so
it doesn't really work the way you want it to with session initialization.
On Thu, Sep 27, 2018 at 11:13 AM Russell Spitzer
wrote:
> While that's easy for some users, we basically want to load up some
> functions by default into all session catalogues regardless of who made
> them. …
I wrote a quick patch and attached it if anyone wants to think about this
in context. I can always rebase this to master.
On Thu, Sep 27, 2018 at 1:39 PM Russell Spitzer
wrote:
> And in case anyone is wondering, the reason I want this may be avoided with
> DataSourceV2 depending on some of the function pushdown discussions. …
And in case anyone is wondering, the reason I want this may be avoided with
DataSourceV2 depending on some of the function pushdown discussions. We
want to add functions which work only with the Cassandra DataSource (ttl
and writetime); I've done the work to add in the custom expressions and
analysis …
It would be a @dev internal API, I think.
If we wanted to go extremely general with post-session init, it could be
added to SparkExtensions:

def postSessionInit(session: SparkSession): Unit

which would allow you to do just about anything after sessionState has
finished initializing.
Or if we specifically …
Thoughts on what the API would look like?
On Thu, Sep 27, 2018 at 11:13 AM Russell Spitzer
wrote:
> While that's easy for some users, we basically want to load up some
> functions by default into all session catalogues regardless of who made
> them. We do this with certain rules and strategies using the
> SparkExtensions …
While that's easy for some users, we basically want to load up some
functions by default into all session catalogues regardless of who made
them. We do this with certain rules and strategies using the
SparkExtensions, so all apps that run through our submit scripts get a
config parameter added and …
You're talking about users starting Thriftserver or SqlShell from the
command line, right? It's much easier if you are starting a Thriftserver
programmatically, so that you can register functions when initializing a
SparkContext and then call HiveThriftServer2.startWithContext using that
context.
On We…
I've been looking recently at possible avenues to load new functions into
the Thriftserver and SqlShell at launch time. I basically want to preload a
set of functions in addition to those already present in the Spark code.
I'm not sure there is at present a way to do this, and I was wondering if
any …