[RESULT][VOTE] Apache cTAKES 3.0.0-incubating RC7 release

2013-02-15 Thread Chen, Pei
More than 72 hours have passed, and the vote for cTAKES 3.0.0-incubating *passes*
with six +1 votes (three binding):

+1 (binding)
* Chris Mattmann
* Jörn Kottmann
* Chris Douglas

+1 (non-binding)
* Pei Chen
* Oleg Tikhonov
* James Masanz

There were no -1 or +0 votes cast. 

Pei, Oleg, and James voted on the ctakes-...@incubator.apache.org
vote thread [1].

I will publish the release, then announce it as soon as the
artifacts are available.

Thanks to everyone for participating!

Best,
Pei







Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Devaraj Das
Oops. Sorry. Will re-initiate the vote.

On Thu, Feb 14, 2013 at 8:25 PM, Mattmann, Chris A (388J)
chris.a.mattm...@jpl.nasa.gov wrote:
 s/Apache Open Climate Workbench/Apache Knox Hadoop Gateway/ :)

 May want to resend the [VOTE] thread.

 On 2/14/13 5:26 PM, Devaraj Das d...@hortonworks.com wrote:

Hi Folks,

Thanks for participating in the discussion. I'd like to call a VOTE
for acceptance of Apache Knox Hadoop Gateway Project into the
Incubator. The vote will close on Feb 21 at 6:00 p.m.

[ ]  +1 Accept Apache Open Climate Workbench into the Incubator
[ ]  +0 Don't care.
[ ]  -1 Don't accept Apache Open Climate Workbench into the Incubator
because...

Full proposal is pasted at the bottom of this email, and the
corresponding wiki is http://wiki.apache.org/incubator/knox. Only
VOTEs from Incubator PMC members are binding.

Here's my +1 (binding).

Thanks,
Devaraj.

p.s. In the last day, Tom White has been added as a mentor, and
Venkatesh Seetharam has been added in the list of initial committers.


[VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Devaraj Das
Hi Folks,

Thanks for participating in the discussion. I'd like to call a VOTE
for acceptance of Apache Knox Hadoop Gateway Project into the
Incubator. The vote will close on Feb 22 at 6:00 p.m.

[ ]  +1 Accept Apache Knox Hadoop Gateway Project into the Incubator
[ ]  +0 Don't care.
[ ]  -1 Don't accept Apache Knox Hadoop Gateway Project into the
Incubator because...

Full proposal is pasted at the bottom of this email, and the
corresponding wiki is http://wiki.apache.org/incubator/knox. Only
VOTEs from Incubator PMC members are binding.

Here's my +1 (binding).

Thanks,
Devaraj.

-

Knox Gateway Proposal

Abstract

Knox Gateway is a system that provides a single point of secure access
for Apache Hadoop clusters.

Proposal

The Knox Gateway (“Gateway” or “Knox”) is a system that provides a
single point of authentication and access for Apache Hadoop services
in a cluster. The goal is to simplify Hadoop security for both users
(i.e., those who access the cluster data and execute jobs) and operators
(i.e., those who control access and manage the cluster). The Gateway runs as
a server (or cluster of servers) that serves one or more Hadoop
clusters. The project aims to:

* Provide perimeter security to make Hadoop security setup easier
* Support authentication and token verification security scenarios
* Deliver users a single cluster end-point that aggregates capabilities
  for data and jobs
* Enable integration with enterprise and cloud identity management environments
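
To make the "single cluster end-point" goal above concrete, here is a small
illustrative sketch (not part of the proposal) of what the user-facing side
could look like: one authenticated HTTP call to a single gateway address,
which fronts whichever Hadoop service actually answers the request (a
WebHDFS-style LISTSTATUS call is used as the example). The gateway host,
port, path, and credentials below are hypothetical.

    // Hypothetical client-side view of the "single end-point" goal: the user
    // talks to one gateway address with one set of credentials, regardless of
    // which Hadoop service ultimately serves the request.
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Base64;

    public class GatewayClientSketch {
        public static void main(String[] args) throws Exception {
            // Assumed gateway address and WebHDFS-style path; both are examples only.
            URL url = new URL("https://gateway.example.com:8443/webhdfs/v1/tmp?op=LISTSTATUS");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();

            // Single perimeter credential; a real deployment would use whatever
            // the gateway's pluggable authentication provider requires.
            String credentials = Base64.getEncoder()
                    .encodeToString("guest:guest-password".getBytes("UTF-8"));
            conn.setRequestProperty("Authorization", "Basic " + credentials);

            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);  // JSON directory listing returned by WebHDFS
                }
            }
        }
    }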

Background

An Apache Hadoop cluster is presented to consumers as a loose
collection of independent services. This makes it difficult for users
to interact with Hadoop, since each service maintains its own method
of access and security. Likewise, for operators, configuration and
administration of a secure Hadoop cluster is complex, and many Hadoop
clusters are insecure as a result.

The goal of the project is to provide coverage for all existing Hadoop
ecosystem projects. In addition, the project will be extensible to
allow for new and/or proprietary Hadoop components without requiring
changes to the gateway source code. The gateway is expected to run in
a DMZ environment where it will provide controlled access to these
Hadoop services. In this way, Hadoop clusters can be protected by a
firewall, with only limited access provided through the firewall to the
gateway. The authentication components of the gateway will be modular
and extensible so that the gateway can be integrated with existing security
infrastructure.
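
The paragraph above describes the intended deployment: the gateway is the only
host reachable through the firewall, it authenticates each request at the
perimeter, and it then relays the request to the protected Hadoop services.
As a rough, hypothetical sketch of that flow (not Knox code; host names,
ports, and the hard-coded credential are made up for illustration), a toy
perimeter proxy could look like the following, using only JDK classes and
WebHDFS as the example back-end:

    // Toy perimeter gateway: authenticate once at the edge, then forward the
    // request to an internal Hadoop REST service. Illustration only.
    import com.sun.net.httpserver.HttpExchange;
    import com.sun.net.httpserver.HttpServer;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.InetSocketAddress;
    import java.net.URL;
    import java.util.Base64;

    public class ToyPerimeterGateway {
        // Hypothetical WebHDFS endpoint, reachable only from inside the DMZ.
        private static final String WEBHDFS = "http://namenode.internal:50070";

        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8443), 0);
            server.createContext("/webhdfs/", ToyPerimeterGateway::handle);
            server.start();
        }

        static void handle(HttpExchange ex) throws IOException {
            // 1. Perimeter authentication: one pluggable check at the edge.
            //    HTTP Basic against a hard-coded credential stands in for LDAP,
            //    SSO tokens, or whatever the deployment integrates with.
            String expected = "Basic " + Base64.getEncoder()
                    .encodeToString("guest:guest-password".getBytes("UTF-8"));
            String auth = ex.getRequestHeaders().getFirst("Authorization");
            if (auth == null || !auth.equals(expected)) {
                ex.sendResponseHeaders(401, -1);
                ex.close();
                return;
            }

            // 2. Forward the authenticated request (GET only, for brevity) to
            //    the internal service behind the firewall.
            URL target = new URL(WEBHDFS + ex.getRequestURI());
            HttpURLConnection conn = (HttpURLConnection) target.openConnection();
            conn.setRequestMethod(ex.getRequestMethod());
            int code = conn.getResponseCode();

            // 3. Relay the upstream response back to the caller.
            InputStream in = code < 400 ? conn.getInputStream() : conn.getErrorStream();
            byte[] body = (in == null) ? new byte[0] : in.readAllBytes();
            ex.sendResponseHeaders(code, body.length == 0 ? -1 : body.length);
            if (body.length > 0) {
                OutputStream out = ex.getResponseBody();
                out.write(body);
                out.close();
            } else {
                ex.close();
            }
        }
    }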

Rationale

Organizations that struggle with Hadoop cluster security often end up
either a) running Hadoop without security or b) slowing their adoption of
Hadoop. The Gateway aims to provide perimeter security that integrates
more easily into existing organizations’ security infrastructure.
Doing so will simplify security for these organizations and benefit
all Hadoop stakeholders (i.e. users and operators). Additionally,
making a dedicated perimeter security project part of the Apache
Hadoop ecosystem will prevent fragmentation in this area and further
increase the value of Hadoop as a data platform.

Current Status

A prototype is available, developed by the initial committers.

Meritocracy

We desire to build a diverse developer community around Gateway
following the Apache Way. We want to make the project open source and
will encourage contributors from multiple organizations following the
Apache meritocracy model.

Community

We hope to extend the user and developer base in the future and build
a solid open source community around Gateway. Apache Hadoop has a
large ecosystem of open source projects, each with a strong community
of contributors. All project communities in this ecosystem have an
opportunity to participate in the advancement of the Gateway project
because, ultimately, Gateway will enable the security capabilities of
their projects to be more enterprise-friendly.

Core Developers

Gateway is currently being developed by several engineers from
Hortonworks - Kevin Minder, Larry McCay, John Speidel, Tom Beerbower
and Sumit Mohanty. All the engineers have deep expertise in
middleware, security & identity systems and are quite familiar with
the Hadoop ecosystem.

Alignment

The ASF is a natural host for Gateway given that it is already the
home of Hadoop, Hive, Pig, HBase, Oozie and other emerging big data
software projects. Gateway is designed to solve the security
challenges familiar to the Hadoop ecosystem family of projects.

Known Risks

Orphaned Products & Reliance on Salaried Developers

The core developers plan to work full time on the project. We believe
that this project will be of general interest to many Hadoop users and
will attract a diverse set of contributors. We intend to demonstrate
this by having contributors from several organizations recognized as
committers by the time Knox graduates from incubation.

Inexperience with Open Source

All of the core developers are active users and followers of open
source. As well, Hortonworks and the affiliated 

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Owen O'Malley
+1


On Fri, Feb 15, 2013 at 11:22 AM, Devaraj Das d...@hortonworks.com wrote:


[VOTE] Release Apache Crunch 0.5.0 (incubating) RC0

2013-02-15 Thread Josh Wills
Hello,

This is a call for a vote on releasing the following candidate as Apache
Crunch 0.5.0 (incubating). This is our third release at Apache, and it
fixes the following issues:

https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12313526&version=12323476

The vote will be open for at least 72 hours. We received 1 IPMC member vote
from Patrick Hunt on the vote thread on crunch-dev, and will need two more
IPMC votes in order to make the release.

Release artifacts:
http://people.apache.org/~jwills/crunch-0.5.0-incubating-RC0/

Maven staging repo:
https://repository.apache.org/content/repositories/orgapachecrunch-228/

The tag to be voted upon:
https://git-wip-us.apache.org/repos/asf?p=incubator-crunch.git;a=tag;h=e60ace8424109dc941b13262d43dab659ffaca8a

Crunch's KEYS file:
http://www.apache.org/dist/incubator/crunch/KEYS

Thanks,
Josh


Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Chris Douglas
+1 (binding) -C

On Fri, Feb 15, 2013 at 11:22 AM, Devaraj Das d...@hortonworks.com wrote:

Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Mattmann, Chris A (388J)
+1 binding.

Cheers,
Chris 

Sent from my iPad

On Feb 15, 2013, at 11:23 AM, Devaraj Das d...@hortonworks.com wrote:


Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Alejandro Abdelnur
+1 non binding

Alejandro
(phone typing)

On Feb 15, 2013, at 5:45 PM, Mattmann, Chris A (388J) 
chris.a.mattm...@jpl.nasa.gov wrote:


Re: [VOTE] Accept Apache Knox Hadoop Gateway Project into the Incubator

2013-02-15 Thread Arun C Murthy
+1 (binding)

Arun

On Feb 14, 2013, at 5:26 PM, Devaraj Das wrote:
