A workable solution is probably to create your own SQLContext by extending HiveContext, override the `analyzer`, and add your own rule to do the rewriting.
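The idea can be illustrated with a self-contained sketch. Note that `Plan`, `Relation`, `Project`, and `transformUp` below are simplified stand-ins for Catalyst's `TreeNode`/`LogicalPlan` machinery, not the real Spark classes; in an actual HiveContext subclass you would register a `Rule[LogicalPlan]` with the analyzer (e.g. via its extended resolution rules, depending on your Spark version).

```scala
// Simplified model of a Catalyst-style plan tree. NOT the real Spark API --
// just a sketch of how a rewrite rule pattern-matches and rebuilds nodes.
sealed trait Plan {
  // Bottom-up transform, analogous to TreeNode.transformUp in Catalyst.
  def transformUp(rule: PartialFunction[Plan, Plan]): Plan = {
    val withNewChildren = mapChildren(_.transformUp(rule))
    if (rule.isDefinedAt(withNewChildren)) rule(withNewChildren)
    else withNewChildren
  }
  def mapChildren(f: Plan => Plan): Plan
}

// A table reference, qualified by database name.
case class Relation(database: String, table: String) extends Plan {
  def mapChildren(f: Plan => Plan): Plan = this
}

// A projection over a child plan (models "select * from ...").
case class Project(columns: Seq[String], child: Plan) extends Plan {
  def mapChildren(f: Plan => Plan): Plan = copy(child = f(child))
}

// The "hook": a rule that redirects every test.* relation to production.*.
object RewriteDatabase {
  def apply(plan: Plan): Plan = plan.transformUp {
    case Relation("test", t) => Relation("production", t)
  }
}
```

Applied to a plan for `select * from test.table`, the rule returns the same tree with the relation rewritten: `RewriteDatabase(Project(Seq("*"), Relation("test", "table")))` yields `Project(Seq("*"), Relation("production", "table"))`, while relations in other databases pass through unchanged.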

From: r7raul1...@163.com
Sent: Thursday, September 17, 2015 11:08 AM
To: Cheng, Hao; user
Subject: Re: RE: spark sql hook

Example:
select * from test.table   change to   select * from production.table

________________________________
r7raul1...@163.com

From: Cheng, Hao <hao.ch...@intel.com>
Date: 2015-09-17 11:05
To: r7raul1...@163.com; user <user@spark.apache.org>
Subject: RE: spark sql hook
Catalyst TreeNode is a very fundamental API, so I'm not sure what kind of hook you need. A concrete example would make your requirement easier to understand.

Hao

From: r7raul1...@163.com
Sent: Thursday, September 17, 2015 10:54 AM
To: user
Subject: spark sql hook


I want to modify some SQL tree nodes before execution. I can do this with a hook in Hive. Does Spark support such a hook? Any advice?
________________________________
r7raul1...@163.com
