[ https://issues.apache.org/jira/browse/KNOX-844?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kevin Risden resolved KNOX-844.
-------------------------------
    Resolution: Fixed

Marking as resolved because this is in the latest user guide.

https://knox.apache.org/books/knox-1-1-0/user-guide.html#Avatica
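
For reference, a client connection through the Knox-proxied Avatica endpoint looks roughly like the sketch below. This is a minimal, untested example: the gateway host, topology name ("default"), credentials, and query are placeholders rather than values taken from this issue, and the URL properties shown are the standard Avatica thin-client options rather than anything Knox-specific.

{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PhoenixViaKnoxExample {
    public static void main(String[] args) throws Exception {
        // The Phoenix "thin" driver speaks the Avatica protocol that Knox proxies.
        Class.forName("org.apache.phoenix.queryserver.client.Driver");

        // Gateway host, topology name ("default"), and credentials are placeholders.
        // The JVM also needs to trust the gateway's TLS certificate.
        String url = "jdbc:phoenix:thin:"
                + "url=https://knox.example.com:8443/gateway/default/avatica;"
                + "serialization=PROTOBUF;"
                + "authentication=BASIC;"
                + "avatica_user=guest;"
                + "avatica_password=guest-password";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1 FROM SYSTEM.CATALOG LIMIT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}
{code}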

> Add documentation for support of Apache Phoenix via Knox
> --------------------------------------------------------
>
>                 Key: KNOX-844
>                 URL: https://issues.apache.org/jira/browse/KNOX-844
>             Project: Apache Knox
>          Issue Type: Improvement
>          Components: Site
>    Affects Versions: 0.11.0
>            Reporter: John McParland
>            Assignee: Josh Elser
>            Priority: Major
>             Fix For: 0.11.0
>
>         Attachments: KNOX-844.001.patch, KNOX-844.002.patch, Knox_Phoenix.png
>
>
> We would like to access data stored in Hadoop (especially HBase) using 
> traditional tools which rely on ODBC connections and SQL.
> Phoenix provides the SQL interface to HBase, and Hortonworks provides an 
> [ODBC Connector for 
> Phoenix|http://hortonworks.com/hadoop-tutorial/bi-apache-phoenix-odbc/].
> However, that connection is unsecured with respect to access from outside 
> the perimeter of the Big Data Platform.
> This ticket should address that by allowing the ODBC connection to Phoenix 
> to be proxied through Knox, enforcing perimeter-level security.
> h4. Acceptance Criteria
> - Connections to Phoenix via Knox are allowed only with valid credentials, 
> as enforced by Knox.
> - Connections to Phoenix via Knox are NOT allowed if Knox is presented with 
> invalid credentials.
> - Connections to Phoenix via Knox can be made via an ODBC connector.
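
The credential checks above can also be exercised from the client side: with invalid credentials the gateway should reject the request before it ever reaches Phoenix, and the client simply sees a failed connection attempt. A minimal sketch, reusing the same hypothetical gateway URL as in the example earlier in this thread; the exact exception type surfaced by the thin client may vary.

{code:java}
import java.sql.DriverManager;

public class InvalidCredentialsCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.phoenix.queryserver.client.Driver");

        // Same hypothetical Knox Avatica endpoint, but with deliberately wrong credentials.
        String badUrl = "jdbc:phoenix:thin:"
                + "url=https://knox.example.com:8443/gateway/default/avatica;"
                + "serialization=PROTOBUF;"
                + "authentication=BASIC;"
                + "avatica_user=guest;"
                + "avatica_password=wrong-password";

        try {
            DriverManager.getConnection(badUrl).close();
            System.out.println("Unexpected: connection succeeded with invalid credentials");
        } catch (Exception expected) {
            // Knox should reject the request (typically HTTP 401 Unauthorized);
            // the thin client reports this as a failed connection.
            System.out.println("Rejected as expected: " + expected.getMessage());
        }
    }
}
{code}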



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
