Re: [RESULT][VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-03-12 Thread Branko Čibej
Just tagging the [RESULT] onto the subject so that the vote status gets
updated.

On 24.02.2013 04:36, Devaraj Das wrote:
 Hi folks,
 With 10 binding +1 votes, this vote has passed.  Thanks to everyone who
 voted.
 Devaraj.
  +1 (binding)

 On Feb 15, 2013, at 11:22 AM, Devaraj Das wrote:

 Hi Folks,

 Thanks for participating in the discussion. I'd like to call a VOTE
 for acceptance of Apache Knox Hadoop Gateway Project into the
 Incubator. The vote will close on Feb 22 at 6:00 p.m.

 [ ]  +1 Accept Apache Knox Hadoop Gateway Project into the Incubator
 [ ]  +0 Don't care.
 [ ]  -1 Don't accept Apache Knox Hadoop Gateway Project into the
 Incubator because...

 Full proposal is pasted at the bottom of this email, and the
 corresponding wiki is http://wiki.apache.org/incubator/knox. Only
 VOTEs from Incubator PMC members are binding.

 Here's my +1 (binding).

 Thanks,
 Devaraj.

 -

 Knox Gateway Proposal

 Abstract

 Knox Gateway is a system that provides a single point of secure access
 for Apache Hadoop clusters.

 Proposal

 The Knox Gateway (“Gateway” or “Knox”) is a system that provides a
 single point of authentication and access for Apache Hadoop services
 in a cluster. The goal is to simplify Hadoop security for both users
 (i.e., those who access the cluster data and execute jobs) and
 operators (i.e., those who control access and manage the cluster). The
 Gateway runs as a server (or cluster of servers) that serves one or
 more Hadoop clusters.

 * Provide perimeter security to make Hadoop security setup easier
 * Support authentication and token verification security scenarios
 * Deliver users a single cluster end-point that aggregates
   capabilities for data and jobs
 * Enable integration with enterprise and cloud identity management
   environments

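The "single cluster end-point" goal above amounts to mapping one external URL space onto many internal service URLs. A minimal sketch of that idea follows; the hostnames, ports, and paths are hypothetical, not Knox's actual configuration:

```python
# Illustrative sketch of a gateway routing table: one external endpoint
# fronting several internal Hadoop services. All hosts/ports/paths here
# are made up for illustration.

ROUTES = {
    "/gateway/webhdfs": "http://namenode.internal:50070/webhdfs/v1",
    "/gateway/templeton": "http://hcat.internal:50111/templeton/v1",
    "/gateway/oozie": "http://oozie.internal:11000/oozie/v1",
}

def resolve(request_path: str) -> str:
    """Translate an external gateway path into an internal service URL."""
    for prefix, backend in ROUTES.items():
        if request_path.startswith(prefix):
            # Re-attach the remainder of the request path to the backend.
            return backend + request_path[len(prefix):]
    raise KeyError(f"no route for {request_path}")
```

A real gateway would also rewrite response bodies and enforce authentication before forwarding, but the routing table captures the aggregation idea.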
 Background

 An Apache Hadoop cluster is presented to consumers as a loose
 collection of independent services. This makes it difficult for users
 to interact with Hadoop, since each service maintains its own method
 of access and security. Likewise, for operators, configuration and
 administration of a secure Hadoop cluster is complex, and many Hadoop
 clusters are insecure as a result.

 The goal of the project is to provide coverage for all existing Hadoop
 ecosystem projects. In addition, the project will be extensible to
 allow for new and/or proprietary Hadoop components without requiring
 changes to the gateway source code. The gateway is expected to run in
 a DMZ environment where it will provide controlled access to these
 Hadoop services. In this way Hadoop clusters can be protected by a
 firewall and only limited access provided through the firewall for the
 gateway. The authentication components of the gateway will be modular
 and extensible such that it can be integrated with existing security
 infrastructure.
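One way to picture the "modular and extensible" authentication components described above is a chain of providers tried in order until one accepts the request. The provider names and credential checks below are illustrative assumptions, not Knox's actual API:

```python
# Illustrative sketch of a pluggable authentication chain, not Knox's
# real provider interface. Each provider inspects request headers and
# returns a user name, or None if it cannot authenticate the caller.
from typing import Callable, Optional

AuthProvider = Callable[[dict], Optional[str]]

def basic_auth(headers: dict) -> Optional[str]:
    """Toy provider: accepts 'Authorization: Basic <user>' literally."""
    value = headers.get("Authorization", "")
    if value.startswith("Basic "):
        return value[len("Basic "):]
    return None

def token_auth(headers: dict) -> Optional[str]:
    """Toy provider: accepts a pre-verified token header."""
    if headers.get("X-Auth-Token") == "valid-token":  # stand-in check
        return "token-user"
    return None

def authenticate(headers: dict, providers: list) -> str:
    """Run the provider chain; fail if no provider accepts the request."""
    for provider in providers:
        user = provider(headers)
        if user is not None:
            return user
    raise PermissionError("no provider accepted the request")
```

New mechanisms (Kerberos, SAML, an enterprise directory) would slot in as additional providers without touching the chain itself, which is the extensibility property the proposal describes.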

 Rationale

 Organizations struggling with Hadoop cluster security end up either a)
 running Hadoop without security or b) slowing their adoption of
 Hadoop. The Gateway aims to provide perimeter security that integrates
 more easily into organizations’ existing security infrastructure.
 Doing so will simplify security for these organizations and benefit
 all Hadoop stakeholders (i.e. users and operators). Additionally,
 making a dedicated perimeter security project part of the Apache
 Hadoop ecosystem will prevent fragmentation in this area and further
 increase the value of Hadoop as a data platform.

 Current Status

 Prototype available, developed by the list of initial committers.

 Meritocracy

 We desire to build a diverse developer community around Gateway
 following the Apache Way. We want to make the project open source and
 will encourage contributors from multiple organizations following the
 Apache meritocracy model.

 Community

 We hope to extend the user and developer base in the future and build
 a solid open source community around Gateway. Apache Hadoop has a
 large ecosystem of open source projects, each with a strong community
 of contributors. All project communities in this ecosystem have an
 opportunity to participate in the advancement of the Gateway project
 because ultimately, Gateway will enable the security capabilities of
 their project to be more enterprise friendly.

 Core Developers

 Gateway is currently being developed by several engineers from
 Hortonworks - Kevin Minder, Larry McCay, John Speidel, Tom Beerbower
 and Sumit Mohanty. All the engineers have deep expertise in
 middleware, security &amp; identity systems and are quite familiar with
 the Hadoop ecosystem.

 Alignment

 The ASF is a natural host for Gateway given that it is already the
 home of Hadoop, Hive, Pig, HBase, Oozie and other emerging big data
 software projects. Gateway is designed to solve the security
 challenges familiar to the Hadoop ecosystem family of projects.

 Known Risks

 Orphaned products &amp; Reliance on Salaried Developers

 The core developers plan to work full time on the project. We believe
 that this project will be of general interest to many Hadoop users and
 will attract a diverse set of contributors. We intend to demonstrate
 this by having contributors from several organizations recognized

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-23 Thread Devaraj Das
Hi folks,
With 10 binding +1 votes, this vote has passed.  Thanks to everyone who
voted.
Devaraj.
 +1 (binding)


Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-19 Thread Vinod Kumar Vavilapalli
+1 (non-binding)

Thanks,
+Vinod


Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-19 Thread Hitesh Shah
+1 ( non-binding )

-- Hitesh

On Feb 14, 2013, at 5:26 PM, Devaraj Das wrote:

 Hi Folks,
 
 Thanks for participating in the discussion. I'd like to call a VOTE
 for acceptance of Apache Knox Hadoop Gateway Project into the
 Incubator. The vote will close on Feb 21 at 6:00 p.m.
 
 [ ]  +1 Accept Apache Open Climate Workbench into the Incubator
 [ ]  +0 Don't care.
 [ ]  -1 Don't accept Apache Open Climate Workbench into the Incubator 
 because...
 
 Full proposal is pasted at the bottom of this email, and the
 corresponding wiki is http://wiki.apache.org/incubator/knox. Only
 VOTEs from Incubator PMC members are binding.
 
 Here's my +1 (binding).
 
 Thanks,
 Devaraj.
 
 p.s. In the last day, Tom White has been added as a mentor, and
 Venkatesh Seetharam has been added in the list of initial committers.
 
 

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-19 Thread Vinod Kumar Vavilapalli

This thread is dead because of needed edits to the proposal, please vote on the 
other voting thread.

Thanks,
+Vinod

On Feb 19, 2013, at 11:43 AM, Hitesh Shah wrote:

 +1 ( non-binding )
 
 -- Hitesh
 

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-19 Thread Hitesh Shah
+1 ( non-binding )

-- Hitesh 


Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-19 Thread Arun C Murthy
+1 (binding)

On Feb 15, 2013, at 11:22 AM, Devaraj Das wrote:

 Hi Folks,
 
 Thanks for participating in the discussion. I'd like to call a VOTE
 for acceptance of Apache Knox Hadoop Gateway Project into the
 Incubator. The vote will close on Feb 22 at 6:00 p.m.
 
 [ ]  +1 Accept Apache Knox Hadoop Gateway Project into the Incubator
 [ ]  +0 Don't care.
 [ ]  -1 Don't accept Apache Knox Hadoop Gateway Project into the
 Incubator because...
 
 Full proposal is pasted at the bottom of this email, and the
 corresponding wiki is http://wiki.apache.org/incubator/knox. Only
 VOTEs from Incubator PMC members are binding.
 
 Here's my +1 (binding).
 
 Thanks,
 Devaraj.
 
 -
 
 Knox Gateway Proposal
 
 Abstract
 
 Knox Gateway is a system that provides a single point of secure access
 for Apache Hadoop clusters.
 
 Proposal
 
 The Knox Gateway (“Gateway” or “Knox”) is a system that provides a
 single point of authentication and access for Apache Hadoop services
 in a cluster. The goal is to simplify Hadoop security for both users
 (i.e. who access the cluster data and execute jobs) and operators
 (i.e. who control access and manage the cluster). The Gateway runs as
 a server (or cluster of servers) that serve one or more Hadoop
 clusters.
 
 - Provide perimeter security to make Hadoop security setup easier
 - Support authentication and token verification security scenarios
 - Deliver users a single cluster end-point that aggregates capabilities
   for data and jobs
 - Enable integration with enterprise and cloud identity management environments
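The "single cluster end-point" goal above can be sketched in a few lines. This is an illustrative sketch only: the URL layout ("/gateway/<cluster>/<service>") and the class name are assumptions made for illustration, not Knox's actual API.

```java
// Sketch: one gateway host:port fronts every service in every cluster.
// The "/gateway/<cluster>/<service>" path scheme is hypothetical.
public class GatewayUrls {
    private final String gatewayBase; // e.g. "https://gateway.example.com:8443"

    public GatewayUrls(String gatewayBase) {
        this.gatewayBase = gatewayBase;
    }

    /** Every back-end service is reached through the same end-point. */
    public String serviceUrl(String cluster, String service, String path) {
        return gatewayBase + "/gateway/" + cluster + "/" + service + path;
    }

    public static void main(String[] args) {
        GatewayUrls gw = new GatewayUrls("https://gateway.example.com:8443");
        // Same end-point, different back-end services:
        System.out.println(gw.serviceUrl("prod", "webhdfs", "/v1/tmp?op=LISTSTATUS"));
        System.out.println(gw.serviceUrl("prod", "oozie", "/v1/jobs"));
    }
}
```

The point of the aggregation is that clients need to know (and operators need to expose) only one address, regardless of how many services the cluster runs.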
 
 Background
 
 An Apache Hadoop cluster is presented to consumers as a loose
 collection of independent services. This makes it difficult for users
 to interact with Hadoop since each service maintains its own method
 of access and security. Likewise, for operators, configuration and
 administration of a secure Hadoop cluster is complex, and many Hadoop
 clusters are insecure as a result.
 
 The goal of the project is to provide coverage for all existing Hadoop
 ecosystem projects. In addition, the project will be extensible to
 allow for new and/or proprietary Hadoop components without requiring
 changes to the gateway source code. The gateway is expected to run in
 a DMZ environment where it will provide controlled access to these
 Hadoop services. In this way Hadoop clusters can be protected by a
 firewall and only limited access provided through the firewall for the
 gateway. The authentication components of the gateway will be modular
 and extensible such that it can be integrated with existing security
 infrastructure.
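The "modular and extensible" authentication idea above can be sketched as a pluggable provider contract: deployments swap in LDAP, Kerberos, or SSO integrations behind one interface. The names below are hypothetical, invented for illustration, and are not Knox's real provider SPI.

```java
import java.util.Map;

// Sketch of pluggable authentication: every provider implements one
// hypothetical contract, so the gateway core never changes when a new
// identity back-end is integrated.
public class AuthChain {
    /** Hypothetical contract: true if the credentials are accepted. */
    public interface AuthenticationProvider {
        boolean authenticate(String user, String credential);
    }

    /** Toy in-memory provider standing in for an LDAP/AD integration. */
    public static class StaticProvider implements AuthenticationProvider {
        private final Map<String, String> users;

        public StaticProvider(Map<String, String> users) {
            this.users = users;
        }

        public boolean authenticate(String user, String credential) {
            return credential != null && credential.equals(users.get(user));
        }
    }

    public static void main(String[] args) {
        AuthenticationProvider provider =
                new StaticProvider(Map.of("alice", "s3cret"));
        System.out.println(provider.authenticate("alice", "s3cret"));  // accepted
        System.out.println(provider.authenticate("mallory", "guess")); // rejected
    }
}
```

A real deployment would substitute a provider that delegates to the organization's existing identity infrastructure; the gateway only depends on the interface.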
 
 Rationale
 
 Organizations struggling with Hadoop cluster security end up either
 a) running Hadoop without security or b) slowing their adoption of
 Hadoop. The Gateway aims to provide perimeter security that integrates
 more easily into existing organizations’ security infrastructure.
 Doing so will simplify security for these organizations and benefit
 all Hadoop stakeholders (i.e. users and operators). Additionally,
 making a dedicated perimeter security project part of the Apache
 Hadoop ecosystem will prevent fragmentation in this area and further
 increase the value of Hadoop as a data platform.
 
 Current Status
 
 A prototype is available, developed by the initial committers.
 
 Meritocracy
 
 We desire to build a diverse developer community around Gateway
 following the Apache Way. We want to make the project open source and
 will encourage contributors from multiple organizations following the
 Apache meritocracy model.
 
 Community
 
 We hope to extend the user and developer base in the future and build
 a solid open source community around Gateway. Apache Hadoop has a
 large ecosystem of open source projects, each with a strong community
 of contributors. All project communities in this ecosystem have an
 opportunity to participate in the advancement of the Gateway project
 because ultimately, Gateway will enable the security capabilities of
 their project to be more enterprise friendly.
 
 Core Developers
 
 Gateway is currently being developed by several engineers from
 Hortonworks - Kevin Minder, Larry McCay, John Speidel, Tom Beerbower
 and Sumit Mohanty. All the engineers have deep expertise in
 middleware, security & identity systems and are quite familiar with
 the Hadoop ecosystem.
 
 Alignment
 
 The ASF is a natural host for Gateway given that it is already the
 home of Hadoop, Hive, Pig, HBase, Oozie and other emerging big data
 software projects. Gateway is designed to solve the security
 challenges familiar to the Hadoop ecosystem family of projects.
 
 Known Risks
 
 Orphaned products & Reliance on Salaried Developers
 
 The core developers plan to work full time on the project. We believe
 that this project will be of general interest to many Hadoop users and
 will attract a diverse set of contributors. We intend to demonstrate
 this by having contributors from several organizations recognized as
 committers by 

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-18 Thread Mahadev Konar
+1 (binding).

mahadev

On Feb 17, 2013, at 1:13 PM, Tom White wrote:

 +1
 
 Tom
 
 On Fri, Feb 15, 2013 at 7:22 PM, Devaraj Das d...@hortonworks.com wrote:

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-18 Thread Daniel Kulp
+1 

Dan


On Feb 15, 2013, at 2:22 PM, Devaraj Das d...@hortonworks.com wrote:


Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-17 Thread Tom White
+1

Tom

On Fri, Feb 15, 2013 at 7:22 PM, Devaraj Das d...@hortonworks.com wrote:

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-16 Thread Alex Karasulu
+1 (binding)


On Sat, Feb 16, 2013 at 4:08 AM, Arun C Murthy a...@hortonworks.com wrote:

 +1 (binding)

 Arun

 On Feb 14, 2013, at 5:26 PM, Devaraj Das wrote:

  Hi Folks,
 
  Thanks for participating in the discussion. I'd like to call a VOTE
  for acceptance of Apache Knox Hadoop Gateway Project into the
  Incubator. The vote will close on Feb 21 at 6:00 p.m.
 
  [ ]  +1 Accept Apache Open Climate Workbench into the Incubator
  [ ]  +0 Don't care.
  [ ]  -1 Don't accept Apache Open Climate Workbench into the Incubator
 because...
 
  Full proposal is pasted at the bottom of this email, and the
  corresponding wiki is http://wiki.apache.org/incubator/knox. Only
  VOTEs from Incubator PMC members are binding.
 
  Here's my +1 (binding).
 
  Thanks,
  Devaraj.
 
  p.s. In the last day, Tom White has been added as a mentor, and
  Venkatesh Seetharam has been added in the list of initial committers.
 
  

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-16 Thread Alan Gates
+1.

Alan.

On Feb 15, 2013, at 5:03 PM, Owen O'Malley wrote:

 +1
 
 
 On Fri, Feb 15, 2013 at 11:22 AM, Devaraj Das d...@hortonworks.com wrote:
 
 Hi Folks,
 
 Thanks for participating in the discussion. I'd like to call a VOTE
 for acceptance of Apache Knox Hadoop Gateway Project into the
 Incubator. The vote will close on Feb 22 at 6:00 p.m.
 
 [ ]  +1 Accept Apache Knox Hadoop Gateway Project into the Incubator
 [ ]  +0 Don't care.
 [ ]  -1 Don't accept Apache Knox Hadoop Gateway Project into the
 Incubator because...
 
 Full proposal is pasted at the bottom of this email, and the
 corresponding wiki is http://wiki.apache.org/incubator/knox. Only
 VOTEs from Incubator PMC members are binding.
 
 Here's my +1 (binding).
 
 Thanks,
 Devaraj.
 
 -
 
 Knox Gateway Proposal
 
 Abstract
 
 Knox Gateway is a system that provides a single point of secure access
 for Apache Hadoop clusters.
 
 Proposal
 
 The Knox Gateway (“Gateway” or “Knox”) is a system that provides a
 single point of authentication and access for Apache Hadoop services
 in a cluster. The goal is to simplify Hadoop security for both users
 (i.e. who access the cluster data and execute jobs) and operators
 (i.e. who control access and manage the cluster). The Gateway runs as
 a server (or cluster of servers) that serve one or more Hadoop
 clusters.
 
 Provide perimeter security to make Hadoop security setup easier
 Support authentication and token verification security scenarios
 Deliver users a single cluster end-point that aggregates capabilities
 for data and jobs
 Enable integration with enterprise and cloud identity management
 environments
 
 Background
 
 An Apache Hadoop cluster is presented to consumers as a loose
 collection of independent services. This makes it difficult for users
 to interact with Hadoop since each service maintains it’s own method
 of access and security. As well, for operators, configuration and
 administration of a secure Hadoop cluster is a complex and many Hadoop
 clusters are insecure as a result.
 
 The goal of the project is to provide coverage for all existing Hadoop
 ecosystem projects. In addition, the project will be extensible to
 allow for new and/or proprietary Hadoop components without requiring
 changes to the gateway source code. The gateway is expected to run in
 a DMZ environment where it will provide controlled access to these
 Hadoop services. In this way Hadoop clusters can be protected by a
 firewall and only limited access provided through the firewall for the
 gateway. The authentication components of the gateway will be modular
 and extensible such that it can be integrated with existing security
 infrastructure.
 
 Rationale
 
 Organizations that are struggling with Hadoop cluster security result
 in a) running Hadoop without security or b) slowing adoption of
 Hadoop. The Gateway aims to provide perimeter security that integrates
 more easily into existing organizations’ security infrastructure.
 Doing so will simplify security for these organizations and benefit
 all Hadoop stakeholders (i.e. users and operators). Additionally,
 making a dedicated perimeter security project part of the Apache
 Hadoop ecosystem will prevent fragmentation in this area and further
 increase the value of Hadoop as a data platform.
 
 Current Status
 
 Prototype available, developed by the list of initial committers.
 
 Meritocracy
 
 We desire to build a diverse developer community around Gateway
 following the Apache Way. We want to make the project open source and
 will encourage contributors from multiple organizations following the
 Apache meritocracy model.
 
 Community
 
 We hope to extend the user and developer base in the future and build
 a solid open source community around Gateway. Apache Hadoop has a
 large ecosystem of open source projects, each with a strong community
 of contributors. All project communities in this ecosystem have an
 opportunity to participate in the advancement of the Gateway project
 because ultimately, Gateway will enable the security capabilities of
 their project to be more enterprise friendly.
 
 Core Developers
 
 Gateway is currently being developed by several engineers from
 Hortonworks - Kevin Minder, Larry McCay, John Speidel, Tom Beerbower
 and Sumit Mohanty. All the engineers have deep expertise in
 middleware, security  identity systems and are quite familiar with
 the Hadoop ecosystem.
 
 Alignment
 
 The ASF is a natural host for Gateway given that it is already the
 home of Hadoop, Hive, Pig, HBase, Oozie and other emerging big data
 software projects. Gateway is designed to solve the security
 challenges familiar to the Hadoop ecosystem family of projects.
 
 Known Risks
 
 Orphaned products  Reliance on Salaried Developers
 
 The core developers plan to work full time on the project. We believe
 that this project will be of general interest to many Hadoop users and
 will attract a diverse set of contributors. We intend to demonstrate
 

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Devaraj Das
Oops. Sorry. Will re-initiate the vote.

On Thu, Feb 14, 2013 at 8:25 PM, Mattmann, Chris A (388J)
chris.a.mattm...@jpl.nasa.gov wrote:
 s/Apache Open Climate Workbench/Apache Knox Hadoop Gateway/ :)

 May want to resend the [VOTE] thread.

 On 2/14/13 5:26 PM, Devaraj Das d...@hortonworks.com wrote:

Hi Folks,

Thanks for participating in the discussion. I'd like to call a VOTE
for acceptance of Apache Knox Hadoop Gateway Project into the
Incubator. The vote will close on Feb 21 at 6:00 p.m.

[ ]  +1 Accept Apache Open Climate Workbench into the Incubator
[ ]  +0 Don't care.
[ ]  -1 Don't accept Apache Open Climate Workbench into the Incubator
because...

Full proposal is pasted at the bottom of this email, and the
corresponding wiki is http://wiki.apache.org/incubator/knox. Only
VOTEs from Incubator PMC members are binding.

Here's my +1 (binding).

Thanks,
Devaraj.

p.s. In the last day, Tom White has been added as a mentor, and
Venkatesh Seetharam has been added in the list of initial committers.


Knox Gateway Proposal

Abstract

Knox Gateway is a system that provides a single point of secure access
for Apache Hadoop clusters.

Proposal

The Knox Gateway (³Gateway² or ³Knox²) is a system that provides a
single point of authentication and access for Apache Hadoop services
in a cluster. The goal is to simplify Hadoop security for both users
(i.e. who access the cluster data and execute jobs) and operators
(i.e. who control access and manage the cluster). The Gateway runs as
a server (or cluster of servers) that serve one or more Hadoop
clusters.

Provide perimeter security to make Hadoop security setup easier
Support authentication and token verification security scenarios
Deliver users a single cluster end-point that aggregates capabilities
for data and jobs
Enable integration with enterprise and cloud identity management
environments

Background

An Apache Hadoop cluster is presented to consumers as a loose
collection of independent services. This makes it difficult for users
to interact with Hadoop since each service maintains it¹s own method
of access and security. As well, for operators, configuration and
administration of a secure Hadoop cluster is a complex and many Hadoop
clusters are insecure as a result.

The goal of the project is to provide coverage for all existing Hadoop
ecosystem projects. In addition, the project will be extensible to
allow for new and/or proprietary Hadoop components without requiring
changes to the gateway source code. The gateway is expected to run in
a DMZ environment where it will provide controlled access to these
Hadoop services. In this way Hadoop clusters can be protected by a
firewall and only limited access provided through the firewall for the
gateway. The authentication components of the gateway will be modular
and extensible such that it can be integrated with existing security
infrastructure.
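The single-end-point idea described above can be pictured as a URL-rewriting rule: the gateway exposes one external address in the DMZ and maps path prefixes to the internal service addresses hidden behind the firewall. The following is a minimal illustrative sketch; the service names, ports, and URL layout are hypothetical assumptions, not Knox's actual topology or API.

```python
# Illustrative sketch of perimeter routing: one external gateway URL is
# rewritten to an internal Hadoop service URL. All hostnames, ports, and
# the path layout below are hypothetical, for illustration only.
from urllib.parse import urlsplit

# Internal services reachable only from the gateway's side of the firewall.
INTERNAL_SERVICES = {
    "webhdfs": "http://namenode.internal:50070",
    "templeton": "http://hcat.internal:50111",
    "oozie": "http://oozie.internal:11000",
}

def rewrite(gateway_url: str) -> str:
    """Map https://gateway/<cluster>/<service>/<rest> to an internal URL."""
    parts = urlsplit(gateway_url)
    # path looks like /<cluster>/<service>/<rest...>
    _, cluster, service, *rest = parts.path.split("/")
    base = INTERNAL_SERVICES[service]
    tail = "/".join(rest)
    query = f"?{parts.query}" if parts.query else ""
    return f"{base}/{tail}{query}"

print(rewrite("https://gateway:8443/prod/webhdfs/v1/tmp?op=LISTSTATUS"))
# -> http://namenode.internal:50070/v1/tmp?op=LISTSTATUS
```

Because only the gateway's external address is exposed through the firewall, clients see a single aggregated end-point while each backing service stays unreachable from outside.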

Rationale

Organizations that struggle with Hadoop cluster security end up
either a) running Hadoop without security or b) slowing their adoption
of Hadoop. The Gateway aims to provide perimeter security that
integrates more easily into existing organizations’ security
infrastructure.
Doing so will simplify security for these organizations and benefit
all Hadoop stakeholders (i.e. users and operators). Additionally,
making a dedicated perimeter security project part of the Apache
Hadoop ecosystem will prevent fragmentation in this area and further
increase the value of Hadoop as a data platform.

Current Status

Prototype available, developed by the list of initial committers.

Meritocracy

We desire to build a diverse developer community around Gateway
following the Apache Way. We want to make the project open source and
will encourage contributors from multiple organizations following the
Apache meritocracy model.

Community

We hope to extend the user and developer base in the future and build
a solid open source community around Gateway. Apache Hadoop has a
large ecosystem of open source projects, each with a strong community
of contributors. All project communities in this ecosystem have an
opportunity to participate in the advancement of the Gateway project
because ultimately, Gateway will enable the security capabilities of
their project to be more enterprise friendly.

Core Developers

Gateway is currently being developed by several engineers from
Hortonworks - Kevin Minder, Larry McCay, John Speidel, Tom Beerbower
and Sumit Mohanty. All the engineers have deep expertise in
middleware, security & identity systems and are quite familiar with
the Hadoop ecosystem.

Alignment

The ASF is a natural host for Gateway given that it is already the
home of Hadoop, Hive, Pig, HBase, Oozie and other emerging big data
software projects. Gateway is designed to solve the security
challenges familiar to the Hadoop ecosystem family of projects.

Known Risks

Orphaned products & Reliance on Salaried Developers

The core developers plan to work full time on the project. We believe
that this project will be of general interest to many Hadoop users and
will attract a diverse set of contributors. We intend to demonstrate
this by having contributors from several organizations recognized as
committers by the time Knox graduates from incubation.

[VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Devaraj Das
Hi Folks,

Thanks for participating in the discussion. I'd like to call a VOTE
for acceptance of Apache Knox Hadoop Gateway Project into the
Incubator. The vote will close on Feb 22 at 6:00 p.m.

[ ]  +1 Accept Apache Knox Hadoop Gateway Project into the Incubator
[ ]  +0 Don't care.
[ ]  -1 Don't accept Apache Knox Hadoop Gateway Project into the
Incubator because...

Full proposal is pasted at the bottom of this email, and the
corresponding wiki is http://wiki.apache.org/incubator/knox. Only
VOTEs from Incubator PMC members are binding.

Here's my +1 (binding).

Thanks,
Devaraj.

-

Knox Gateway Proposal

Abstract

Knox Gateway is a system that provides a single point of secure access
for Apache Hadoop clusters.

Proposal

The Knox Gateway (“Gateway” or “Knox”) is a system that provides a
single point of authentication and access for Apache Hadoop services
in a cluster. The goal is to simplify Hadoop security for both users
(i.e. those who access the cluster data and execute jobs) and operators
(i.e. those who control access and manage the cluster). The Gateway runs
as a server (or a cluster of servers) that serves one or more Hadoop
clusters.

- Provide perimeter security to make Hadoop security setup easier
- Support authentication and token verification security scenarios
- Deliver users a single cluster end-point that aggregates capabilities
  for data and jobs
- Enable integration with enterprise and cloud identity management
  environments

Background

An Apache Hadoop cluster is presented to consumers as a loose
collection of independent services. This makes it difficult for users
to interact with Hadoop, since each service maintains its own method
of access and security. Likewise, for operators, configuration and
administration of a secure Hadoop cluster is complex, and many Hadoop
clusters are insecure as a result.

The goal of the project is to provide coverage for all existing Hadoop
ecosystem projects. In addition, the project will be extensible to
allow for new and/or proprietary Hadoop components without requiring
changes to the gateway source code. The gateway is expected to run in
a DMZ environment where it will provide controlled access to these
Hadoop services. In this way Hadoop clusters can be protected by a
firewall and only limited access provided through the firewall for the
gateway. The authentication components of the gateway will be modular
and extensible such that it can be integrated with existing security
infrastructure.
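The modular, extensible authentication described above can be sketched as a chain of pluggable providers: the gateway accepts a request if any configured provider validates its credentials. This is a minimal illustration under assumed interfaces; the provider classes and credential shapes below are hypothetical, not Knox's actual provider API.

```python
# Sketch of a pluggable authentication chain. Provider names, the
# credential dictionary shape, and the any-provider-accepts policy are
# hypothetical illustrations of "modular and extensible" authentication.
from abc import ABC, abstractmethod

class AuthProvider(ABC):
    @abstractmethod
    def authenticate(self, credentials: dict) -> bool: ...

class LdapProvider(AuthProvider):
    def __init__(self, directory: dict):
        self.directory = directory  # stand-in for a real LDAP server

    def authenticate(self, credentials: dict) -> bool:
        user = credentials.get("user")
        password = credentials.get("password")
        return user is not None and self.directory.get(user) == password

class TokenProvider(AuthProvider):
    def __init__(self, valid_tokens: set):
        self.valid_tokens = valid_tokens

    def authenticate(self, credentials: dict) -> bool:
        return credentials.get("token") in self.valid_tokens

def gateway_accepts(providers, credentials) -> bool:
    # Accept the request if any configured provider validates it.
    return any(p.authenticate(credentials) for p in providers)

providers = [LdapProvider({"alice": "s3cret"}), TokenProvider({"tok-123"})]
print(gateway_accepts(providers, {"user": "alice", "password": "s3cret"}))  # True
print(gateway_accepts(providers, {"token": "tok-999"}))                     # False
```

New providers (e.g. one backed by an existing enterprise identity system) would plug in by implementing the same interface, without changes to the gateway itself.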

Rationale

Organizations that struggle with Hadoop cluster security end up
either a) running Hadoop without security or b) slowing their adoption
of Hadoop. The Gateway aims to provide perimeter security that
integrates more easily into existing organizations’ security
infrastructure.
Doing so will simplify security for these organizations and benefit
all Hadoop stakeholders (i.e. users and operators). Additionally,
making a dedicated perimeter security project part of the Apache
Hadoop ecosystem will prevent fragmentation in this area and further
increase the value of Hadoop as a data platform.

Current Status

Prototype available, developed by the list of initial committers.

Meritocracy

We desire to build a diverse developer community around Gateway
following the Apache Way. We want to make the project open source and
will encourage contributors from multiple organizations following the
Apache meritocracy model.

Community

We hope to extend the user and developer base in the future and build
a solid open source community around Gateway. Apache Hadoop has a
large ecosystem of open source projects, each with a strong community
of contributors. All project communities in this ecosystem have an
opportunity to participate in the advancement of the Gateway project
because ultimately, Gateway will enable the security capabilities of
their project to be more enterprise friendly.

Core Developers

Gateway is currently being developed by several engineers from
Hortonworks - Kevin Minder, Larry McCay, John Speidel, Tom Beerbower
and Sumit Mohanty. All the engineers have deep expertise in
middleware, security & identity systems and are quite familiar with
the Hadoop ecosystem.

Alignment

The ASF is a natural host for Gateway given that it is already the
home of Hadoop, Hive, Pig, HBase, Oozie and other emerging big data
software projects. Gateway is designed to solve the security
challenges familiar to the Hadoop ecosystem family of projects.

Known Risks

Orphaned products & Reliance on Salaried Developers

The core developers plan to work full time on the project. We believe
that this project will be of general interest to many Hadoop users and
will attract a diverse set of contributors. We intend to demonstrate
this by having contributors from several organizations recognized as
committers by the time Knox graduates from incubation.

Inexperience with Open Source

All of the core developers are active users and followers of open
source. As well, Hortonworks and the affiliated 

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Owen O'Malley
+1


On Fri, Feb 15, 2013 at 11:22 AM, Devaraj Das d...@hortonworks.com wrote:

 Hi Folks,

 Thanks for participating in the discussion. I'd like to call a VOTE
 for acceptance of Apache Knox Hadoop Gateway Project into the
 Incubator. The vote will close on Feb 22 at 6:00 p.m.

 [ ]  +1 Accept Apache Knox Hadoop Gateway Project into the Incubator
 [ ]  +0 Don't care.
 [ ]  -1 Don't accept Apache Knox Hadoop Gateway Project into the
 Incubator because...

 Full proposal is pasted at the bottom of this email, and the
 corresponding wiki is http://wiki.apache.org/incubator/knox. Only
 VOTEs from Incubator PMC members are binding.

 Here's my +1 (binding).

 Thanks,
 Devaraj.


Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Chris Douglas
+1 (binding) -C

On Fri, Feb 15, 2013 at 11:22 AM, Devaraj Das d...@hortonworks.com wrote:
 Hi Folks,

 Thanks for participating in the discussion. I'd like to call a VOTE
 for acceptance of Apache Knox Hadoop Gateway Project into the
 Incubator. The vote will close on Feb 22 at 6:00 p.m.

 [ ]  +1 Accept Apache Knox Hadoop Gateway Project into the Incubator
 [ ]  +0 Don't care.
 [ ]  -1 Don't accept Apache Knox Hadoop Gateway Project into the
 Incubator because...

 Full proposal is pasted at the bottom of this email, and the
 corresponding wiki is http://wiki.apache.org/incubator/knox. Only
 VOTEs from Incubator PMC members are binding.

 Here's my +1 (binding).

 Thanks,
 Devaraj.


Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Mattmann, Chris A (388J)
+1 binding.

Cheers,
Chris 

Sent from my iPad

On Feb 15, 2013, at 11:23 AM, Devaraj Das d...@hortonworks.com wrote:

 Hi Folks,
 
 Thanks for participating in the discussion. I'd like to call a VOTE
 for acceptance of Apache Knox Hadoop Gateway Project into the
 Incubator. The vote will close on Feb 22 at 6:00 p.m.
 
 [ ]  +1 Accept Apache Knox Hadoop Gateway Project into the Incubator
 [ ]  +0 Don't care.
 [ ]  -1 Don't accept Apache Knox Hadoop Gateway Project into the
 Incubator because...
 
 Full proposal is pasted at the bottom of this email, and the
 corresponding wiki is http://wiki.apache.org/incubator/knox. Only
 VOTEs from Incubator PMC members are binding.
 
 Here's my +1 (binding).
 
 Thanks,
 Devaraj.
 

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Alejandro Abdelnur
+1 non binding

Alejandro
(phone typing)

On Feb 15, 2013, at 5:45 PM, Mattmann, Chris A (388J) 
chris.a.mattm...@jpl.nasa.gov wrote:

 +1 binding.
 
 Cheers,
 Chris 
 
 Sent from my iPad
 
 On Feb 15, 2013, at 11:23 AM, Devaraj Das d...@hortonworks.com wrote:
 
 Hi Folks,
 
 Thanks for participating in the discussion. I'd like to call a VOTE
 for acceptance of Apache Knox Hadoop Gateway Project into the
 Incubator. The vote will close on Feb 22 at 6:00 p.m.
 
 [ ]  +1 Accept Apache Knox Hadoop Gateway Project into the Incubator
 [ ]  +0 Don't care.
 [ ]  -1 Don't accept Apache Knox Hadoop Gateway Project into the
 Incubator because...
 
 Full proposal is pasted at the bottom of this email, and the
 corresponding wiki is http://wiki.apache.org/incubator/knox. Only
 VOTEs from Incubator PMC members are binding.
 
 Here's my +1 (binding).
 
 Thanks,
 Devaraj.
 

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Arun C Murthy
+1 (binding)

Arun

On Feb 14, 2013, at 5:26 PM, Devaraj Das wrote:

 Hi Folks,
 
 Thanks for participating in the discussion. I'd like to call a VOTE
 for acceptance of Apache Knox Hadoop Gateway Project into the
 Incubator. The vote will close on Feb 21 at 6:00 p.m.
 
 [ ]  +1 Accept Apache Knox Hadoop Gateway Project into the Incubator
 [ ]  +0 Don't care.
 [ ]  -1 Don't accept Apache Knox Hadoop Gateway Project into the
 Incubator because...
 
 Full proposal is pasted at the bottom of this email, and the
 corresponding wiki is http://wiki.apache.org/incubator/knox. Only
 VOTEs from Incubator PMC members are binding.
 
 Here's my +1 (binding).
 
 Thanks,
 Devaraj.
 
 p.s. In the last day, Tom White has been added as a mentor, and
 Venkatesh Seetharam has been added in the list of initial committers.
 
 
 Knox Gateway Proposal
 
 Abstract
 
 Knox Gateway is a system that provides a single point of secure access
 for Apache Hadoop clusters.
 
 Proposal
 
 The Knox Gateway (“Gateway” or “Knox”) is a system that provides a
 single point of authentication and access for Apache Hadoop services
 in a cluster. The goal is to simplify Hadoop security for both users
 (i.e. who access the cluster data and execute jobs) and operators
 (i.e. who control access and manage the cluster). The Gateway runs as
 a server (or cluster of servers) that serve one or more Hadoop
 clusters.
 
 Provide perimeter security to make Hadoop security setup easier
 Support authentication and token verification security scenarios
 Deliver users a single cluster end-point that aggregates capabilities
 for data and jobs
 Enable integration with enterprise and cloud identity management environments
 
 Background
 
 An Apache Hadoop cluster is presented to consumers as a loose
 collection of independent services. This makes it difficult for users
 to interact with Hadoop since each service maintains it’s own method
 of access and security. As well, for operators, configuration and
 administration of a secure Hadoop cluster is a complex and many Hadoop
 clusters are insecure as a result.
 
 The goal of the project is to provide coverage for all existing Hadoop
 ecosystem projects. In addition, the project will be extensible to
 allow for new and/or proprietary Hadoop components without requiring
 changes to the gateway source code. The gateway is expected to run in
 a DMZ environment where it will provide controlled access to these
 Hadoop services. In this way Hadoop clusters can be protected by a
 firewall and only limited access provided through the firewall for the
 gateway. The authentication components of the gateway will be modular
 and extensible such that it can be integrated with existing security
 infrastructure.
 
 Rationale
 
 Organizations that are struggling with Hadoop cluster security result
 in a) running Hadoop without security or b) slowing adoption of
 Hadoop. The Gateway aims to provide perimeter security that integrates
 more easily into existing organizations’ security infrastructure.
 Doing so will simplify security for these organizations and benefit
 all Hadoop stakeholders (i.e. users and operators). Additionally,
 making a dedicated perimeter security project part of the Apache
 Hadoop ecosystem will prevent fragmentation in this area and further
 increase the value of Hadoop as a data platform.
 
 Current Status
 
 Prototype available, developed by the list of initial committers.
 
 Meritocracy
 
 We desire to build a diverse developer community around Gateway
 following the Apache Way. We want to make the project open source and
 will encourage contributors from multiple organizations following the
 Apache meritocracy model.
 
 Community
 
 We hope to extend the user and developer base in the future and build
 a solid open source community around Gateway. Apache Hadoop has a
 large ecosystem of open source projects, each with a strong community
 of contributors. All project communities in this ecosystem have an
 opportunity to participate in the advancement of the Gateway project
 because ultimately, Gateway will enable the security capabilities of
 their project to be more enterprise friendly.
 
 Core Developers
 
 Gateway is currently being developed by several engineers from
 Hortonworks - Kevin Minder, Larry McCay, John Speidel, Tom Beerbower
 and Sumit Mohanty. All the engineers have deep expertise in
 middleware, security & identity systems and are quite familiar with
 the Hadoop ecosystem.
 
 Alignment
 
 The ASF is a natural host for Gateway given that it is already the
 home of Hadoop, Hive, Pig, HBase, Oozie and other emerging big data
 software projects. Gateway is designed to solve the security
 challenges familiar to the Hadoop ecosystem family of projects.
 
 Known Risks
 
 Orphaned products & Reliance on Salaried Developers
 
 The core developers plan to work full time on the project. We believe
 that this project will be of general interest to many Hadoop users and
 will attract a diverse set of contributors. We intend to demonstrate
 this by having contributors from several organizations recognized as
 committers by the time Knox graduates from incubation.

[VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-14 Thread Devaraj Das
Hi Folks,

Thanks for participating in the discussion. I'd like to call a VOTE
for acceptance of Apache Knox Hadoop Gateway Project into the
Incubator. The vote will close on Feb 21 at 6:00 p.m.

[ ]  +1 Accept Apache Open Climate Workbench into the Incubator
[ ]  +0 Don't care.
[ ]  -1 Don't accept Apache Open Climate Workbench into the Incubator because...

Full proposal is pasted at the bottom of this email, and the
corresponding wiki is http://wiki.apache.org/incubator/knox. Only
VOTEs from Incubator PMC members are binding.

Here's my +1 (binding).

Thanks,
Devaraj.

p.s. In the last day, Tom White has been added as a mentor, and
Venkatesh Seetharam has been added in the list of initial committers.


Knox Gateway Proposal

Abstract

Knox Gateway is a system that provides a single point of secure access
for Apache Hadoop clusters.

Proposal

The Knox Gateway (“Gateway” or “Knox”) is a system that provides a
single point of authentication and access for Apache Hadoop services
in a cluster. The goal is to simplify Hadoop security for both users
(i.e. who access the cluster data and execute jobs) and operators
(i.e. who control access and manage the cluster). The Gateway runs as
a server (or cluster of servers) that serves one or more Hadoop
clusters.

Provide perimeter security to make Hadoop security setup easier
Support authentication and token verification security scenarios
Deliver users a single cluster end-point that aggregates capabilities
for data and jobs
Enable integration with enterprise and cloud identity management environments

Background

An Apache Hadoop cluster is presented to consumers as a loose
collection of independent services. This makes it difficult for users
to interact with Hadoop since each service maintains its own method
of access and security. For operators, meanwhile, configuration and
administration of a secure Hadoop cluster is complex, and many Hadoop
clusters are insecure as a result.

The goal of the project is to provide coverage for all existing Hadoop
ecosystem projects. In addition, the project will be extensible to
allow for new and/or proprietary Hadoop components without requiring
changes to the gateway source code. The gateway is expected to run in
a DMZ environment where it will provide controlled access to these
Hadoop services. In this way Hadoop clusters can be protected by a
firewall and only limited access provided through the firewall for the
gateway. The authentication components of the gateway will be modular
and extensible such that it can be integrated with existing security
infrastructure.

Rationale

Organizations that struggle with Hadoop cluster security tend to
either a) run Hadoop without security or b) slow their adoption of
Hadoop. The Gateway aims to provide perimeter security that integrates
more easily into existing organizations’ security infrastructure.
Doing so will simplify security for these organizations and benefit
all Hadoop stakeholders (i.e. users and operators). Additionally,
making a dedicated perimeter security project part of the Apache
Hadoop ecosystem will prevent fragmentation in this area and further
increase the value of Hadoop as a data platform.

Current Status

Prototype available, developed by the list of initial committers.

Meritocracy

We desire to build a diverse developer community around Gateway
following the Apache Way. We want to make the project open source and
will encourage contributors from multiple organizations following the
Apache meritocracy model.

Community

We hope to extend the user and developer base in the future and build
a solid open source community around Gateway. Apache Hadoop has a
large ecosystem of open source projects, each with a strong community
of contributors. All project communities in this ecosystem have an
opportunity to participate in the advancement of the Gateway project
because ultimately, Gateway will enable the security capabilities of
their project to be more enterprise friendly.

Core Developers

Gateway is currently being developed by several engineers from
Hortonworks - Kevin Minder, Larry McCay, John Speidel, Tom Beerbower
and Sumit Mohanty. All the engineers have deep expertise in
middleware, security & identity systems and are quite familiar with
the Hadoop ecosystem.

Alignment

The ASF is a natural host for Gateway given that it is already the
home of Hadoop, Hive, Pig, HBase, Oozie and other emerging big data
software projects. Gateway is designed to solve the security
challenges familiar to the Hadoop ecosystem family of projects.

Known Risks

Orphaned products & Reliance on Salaried Developers

The core developers plan to work full time on the project. We believe
that this project will be of general interest to many Hadoop users and
will attract a diverse set of contributors. We intend to demonstrate
this by having contributors from several organizations recognized as
committers by the time Knox graduates from incubation.

Inexperience with Open Source


Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-14 Thread Mattmann, Chris A (388J)
s/Apache Open Climate Workbench/Apache Knox Hadoop Gateway/ :)

May want to resend the [VOTE] thread.
