Build failed in Jenkins: Phoenix | Master #2194

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104547, done.
remote: Compressing objects: ... [carriage-return progress output elided; log ends at 45% (22178/49284)]
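
The repeated "error: Could not read <sha1>" lines mean the cached clone in the H35 workspace has a corrupt object database, so every fetch aborts with status 128 before a single build step runs (the same four objects fail in every build below). A minimal recovery sketch, assuming shell access to the build node; the workspace path is illustrative, not taken from this log:

    # Check the cached clone; if fsck finds corruption, discard it and re-clone.
    cd /home/jenkins/workspace/Phoenix-master
    git fsck --full || {
        cd .. && rm -rf Phoenix-master
        git clone https://git-wip-us.apache.org/repos/asf/phoenix.git Phoenix-master
    }

The Git plugin's "Wipe out repository & workspace" behaviour does the equivalent from the Jenkins job configuration.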

Build failed in Jenkins: Phoenix | Master #2193

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104547, done.
remote: Compressing objects: ... [carriage-return progress output elided; log ends at 45% (22178/49284)]

Build failed in Jenkins: Phoenix | Master #2192

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104547, done.
remote: Compressing objects: ... [carriage-return progress output elided; log ends at 45% (22178/49284)]

Build failed in Jenkins: Phoenix | Master #2191

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104547, done.
remote: Compressing objects: ... [carriage-return progress output elided; log ends at 45% (22178/49284)]

Build failed in Jenkins: Phoenix | Master #2190

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104547, done.
remote: Compressing objects: ... [carriage-return progress output elided; log ends at 45% (22178/49284)]

Build failed in Jenkins: Phoenix | Master #2189

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104547, done.
remote: Compressing objects: ... [carriage-return progress output elided; log ends at 45% (22178/49284)]

Build failed in Jenkins: Phoenix | Master #2188

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104547, done.
remote: Compressing objects: ... [carriage-return progress output elided; log ends at 45% (22178/49284)]

Build failed in Jenkins: Phoenix | Master #2187

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104547, done.
remote: Compressing objects: ... [carriage-return progress output elided; log ends at 45% (22178/49284)]

Build failed in Jenkins: Phoenix-4.x-HBase-1.2 #517

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H21 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git rev-parse origin/4.x-HBase-1.2^{commit} # timeout=10
Checking out Revision 2804121837d77e048f6ce5cacee0084d5258220a 
(origin/4.x-HBase-1.2)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2804121837d77e048f6ce5cacee0084d5258220a
Commit message: "PHOENIX-4942 Move MetaDataEndpointImplTest to integration test"
 > git rev-list --no-walk 2804121837d77e048f6ce5cacee0084d5258220a # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
MAVEN_OPTS=-Xmx3G

[EnvInject] - Variables injected successfully.
[Phoenix-4.x-HBase-1.2] $ /bin/bash -xe /tmp/jenkins1954605226682493682.sh
+ echo 'DELETING ~/.m2/repository/org/apache/htrace. See 
https://issues.apache.org/jira/browse/PHOENIX-1802'
DELETING ~/.m2/repository/org/apache/htrace. See 
https://issues.apache.org/jira/browse/PHOENIX-1802
+ echo 'CURRENT CONTENT:'
CURRENT CONTENT:
+ ls /home/jenkins/.m2/repository/org/apache/htrace
htrace
htrace-core
htrace-core4
htrace-zipkin
[Phoenix-4.x-HBase-1.2] $ /home/jenkins/tools/maven/latest3/bin/mvn -U clean 
install -Dcheckstyle.skip=true
[INFO] Scanning for projects...
Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/felix/maven-bundle-plugin/2.5.3/maven-bundle-plugin-2.5.3.pom
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[ERROR] Unresolveable build extension: Plugin 
org.apache.felix:maven-bundle-plugin:2.5.3 or one of its dependencies could not 
be resolved: Failed to read artifact descriptor for 
org.apache.felix:maven-bundle-plugin:jar:2.5.3 @ 
[WARNING] Reporting configuration should be done in <reporting> section, not in 
maven-site-plugin <configuration> as reportPlugins parameter. @ line 467, 
column 24
 @ 
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]   
[ERROR]   The project org.apache.phoenix:phoenix:4.14.0-HBase-1.2 
( has 1 error
[ERROR] Unresolveable build extension: Plugin 
org.apache.felix:maven-bundle-plugin:2.5.3 or one of its dependencies could not 
be resolved: Failed to read artifact descriptor for 
org.apache.felix:maven-bundle-plugin:jar:2.5.3: Could not transfer artifact 
org.apache.felix:maven-bundle-plugin:pom:2.5.3 from/to central 
(https://repo.maven.apache.org/maven2): Received fatal alert: protocol_version 
-> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
[ERROR] [Help 2] 
http://cwiki.apache.org/confluence/display/MAVEN/PluginManagerException
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files 
were found. Configuration error?
Not sending mail to unregistered user ankitsingha...@gmail.com
Not sending mail to unregistered user k.me...@salesforce.com
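
The "Received fatal alert: protocol_version" handshake failure is the known symptom of Maven Central dropping TLS 1.0/1.1 support in June 2018: a JDK 7 JVM offers TLS 1.0 by default, and repo.maven.apache.org now rejects it. A workaround sketch, assuming the job must stay on an older JDK (moving the build to JDK 8+, which negotiates TLS 1.2 by default, makes the property unnecessary):

    # Force the Maven JVM to offer TLS 1.2 when downloading from Central.
    export MAVEN_OPTS="${MAVEN_OPTS} -Dhttps.protocols=TLSv1.2"
    mvn -U clean install -Dcheckstyle.skip=true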


Build failed in Jenkins: Phoenix-4.x-HBase-1.2 #516

2018-10-15 Thread Apache Jenkins Server
See 


Changes:

[tdsilva] PHOENIX-4942 Move MetaDataEndpointImplTest to integration test

--
Started by an SCM change
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H21 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git rev-parse origin/4.x-HBase-1.2^{commit} # timeout=10
Checking out Revision 2804121837d77e048f6ce5cacee0084d5258220a 
(origin/4.x-HBase-1.2)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2804121837d77e048f6ce5cacee0084d5258220a
Commit message: "PHOENIX-4942 Move MetaDataEndpointImplTest to integration test"
 > git rev-list --no-walk 9ab708061ad6c2fa3c0f006c4feaea6ba1cc658c # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
MAVEN_OPTS=-Xmx3G

[EnvInject] - Variables injected successfully.
[Phoenix-4.x-HBase-1.2] $ /bin/bash -xe /tmp/jenkins2532134438438522.sh
+ echo 'DELETING ~/.m2/repository/org/apache/htrace. See 
https://issues.apache.org/jira/browse/PHOENIX-1802'
DELETING ~/.m2/repository/org/apache/htrace. See 
https://issues.apache.org/jira/browse/PHOENIX-1802
+ echo 'CURRENT CONTENT:'
CURRENT CONTENT:
+ ls /home/jenkins/.m2/repository/org/apache/htrace
htrace
htrace-core
htrace-core4
htrace-zipkin
[Phoenix-4.x-HBase-1.2] $ /home/jenkins/tools/maven/latest3/bin/mvn -U clean 
install -Dcheckstyle.skip=true
[INFO] Scanning for projects...
Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/felix/maven-bundle-plugin/2.5.3/maven-bundle-plugin-2.5.3.pom
[ERROR] [ERROR] Some problems were encountered while processing the POMs:
[ERROR] Unresolveable build extension: Plugin 
org.apache.felix:maven-bundle-plugin:2.5.3 or one of its dependencies could not 
be resolved: Failed to read artifact descriptor for 
org.apache.felix:maven-bundle-plugin:jar:2.5.3 @ 
[WARNING] Reporting configuration should be done in <reporting> section, not in 
maven-site-plugin <configuration> as reportPlugins parameter. @ line 467, 
column 24
 @ 
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]   
[ERROR]   The project org.apache.phoenix:phoenix:4.14.0-HBase-1.2 
( has 1 error
[ERROR] Unresolveable build extension: Plugin 
org.apache.felix:maven-bundle-plugin:2.5.3 or one of its dependencies could not 
be resolved: Failed to read artifact descriptor for 
org.apache.felix:maven-bundle-plugin:jar:2.5.3: Could not transfer artifact 
org.apache.felix:maven-bundle-plugin:pom:2.5.3 from/to central 
(https://repo.maven.apache.org/maven2): Received fatal alert: protocol_version 
-> [Help 2]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/ProjectBuildingException
[ERROR] [Help 2] 
http://cwiki.apache.org/confluence/display/MAVEN/PluginManagerException
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files 
were found. Configuration error?
Not sending mail to unregistered user ankitsingha...@gmail.com
Not sending mail to unregistered user k.me...@salesforce.com


phoenix git commit: PHOENIX-4942 Move MetaDataEndpointImplTest to integration test

2018-10-15 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.4 7c5c7f6c6 -> 1b561ce85


PHOENIX-4942 Move MetaDataEndpointImplTest to integration test


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/1b561ce8
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/1b561ce8
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/1b561ce8

Branch: refs/heads/4.x-HBase-1.4
Commit: 1b561ce852c6bfae0e1812cfa6732648e81e24b9
Parents: 7c5c7f6
Author: Thomas D'Silva 
Authored: Mon Oct 15 22:17:24 2018 -0700
Committer: Thomas D'Silva 
Committed: Mon Oct 15 22:19:09 2018 -0700

--
 .../phoenix/end2end/MetaDataEndpointImplIT.java | 301 +++
 .../coprocessor/MetaDataEndpointImplTest.java   | 299 --
 2 files changed, 301 insertions(+), 299 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/1b561ce8/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
new file mode 100644
index 000..f14af9e
--- /dev/null
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
@@ -0,0 +1,301 @@
+package org.apache.phoenix.end2end;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotNull;
+import static org.junit.Assert.fail;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.SQLException;
+import java.util.Arrays;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.hadoop.hbase.HConstants;
+import org.apache.hadoop.hbase.TableName;
+import org.apache.hadoop.hbase.client.HTable;
+import org.apache.phoenix.coprocessor.TableViewFinderResult;
+import org.apache.phoenix.coprocessor.ViewFinder;
+import org.apache.phoenix.end2end.ParallelStatsDisabledIT;
+import org.apache.phoenix.exception.SQLExceptionCode;
+import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
+import org.apache.phoenix.schema.PColumn;
+import org.apache.phoenix.schema.PTable;
+import org.apache.phoenix.schema.TableNotFoundException;
+import org.apache.phoenix.util.PhoenixRuntime;
+import org.junit.Test;
+
+import com.google.common.base.Joiner;
+import com.google.common.collect.ImmutableMap;
+import com.google.common.collect.Lists;
+import com.google.common.collect.Maps;
+
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+public class MetaDataEndpointImplIT extends ParallelStatsDisabledIT {
+private final TableName catalogTable = 
TableName.valueOf(PhoenixDatabaseMetaData.SYSTEM_CATALOG_NAME_BYTES);
+private final TableName linkTable = 
TableName.valueOf(PhoenixDatabaseMetaData.SYSTEM_CHILD_LINK_NAME_BYTES);
+
+/*
+  The tree structure is as follows: Where ParentTable is the Base Table
+  and all children are views and child views respectively.
+
+ParentTable
+  / \
+leftChild   rightChild
+  /
+   leftGrandChild
+ */
+
+@Test
+public void testGettingChildrenAndParentViews() throws Exception {
+String baseTable = generateUniqueName();
+String leftChild = generateUniqueName();
+String rightChild = generateUniqueName();
+String leftGrandChild = generateUniqueName();
+Connection conn = DriverManager.getConnection(getUrl());
+String ddlFormat =
+"CREATE TABLE IF NOT EXISTS " + baseTable + "  (" + " PK2 VARCHAR 
NOT NULL, V1 VARCHAR, V2 VARCHAR "
++ " CONSTRAINT NAME_PK PRIMARY KEY (PK2)" + " )";
+conn.createStatement().execute(ddlFormat);
+
+conn.createStatement().execute("CREATE VIEW " + rightChild + " AS 
SELECT * FROM " + baseTable);
+conn.createStatement().execute("CREATE VIEW " + leftChild + " (carrier 
VARCHAR) AS SELECT * FROM " + baseTable);
+con
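
The rename from MetaDataEndpointImplTest to MetaDataEndpointImplIT in the diffstat above is what actually moves the class between suites: under the usual Maven surefire/failsafe convention that Phoenix follows, *Test classes run with maven-surefire-plugin in the test phase, while *IT classes under src/it run with maven-failsafe-plugin in the verify phase. A sketch of how the two suites are invoked, assuming that standard plugin wiring (not shown in this thread):

    # Unit tests only (*Test classes, surefire):
    mvn test
    # Unit plus integration tests (*IT classes, failsafe):
    mvn verify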

phoenix git commit: PHOENIX-4942 Move MetaDataEndpointImplTest to integration test

2018-10-15 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/master 0d32f8700 -> 02259651e


PHOENIX-4942 Move MetaDataEndpointImplTest to integration test


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/02259651
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/02259651
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/02259651

Branch: refs/heads/master
Commit: 02259651ed66ba9dcde7acfb277bf129fa01249d
Parents: 0d32f87
Author: Thomas D'Silva 
Authored: Mon Oct 15 22:23:06 2018 -0700
Committer: Thomas D'Silva 
Committed: Mon Oct 15 22:23:06 2018 -0700

--
 .../coprocessor/MetaDataEndpointImplIT.java | 294 ---
 .../phoenix/end2end/MetaDataEndpointImplIT.java | 294 +++
 2 files changed, 294 insertions(+), 294 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/02259651/phoenix-core/src/it/java/org/apache/phoenix/coprocessor/MetaDataEndpointImplIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/coprocessor/MetaDataEndpointImplIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/coprocessor/MetaDataEndpointImplIT.java
deleted file mode 100644
index 3b7686d..000
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/coprocessor/MetaDataEndpointImplIT.java
+++ /dev/null
@@ -1,294 +0,0 @@
-package org.apache.phoenix.coprocessor;
-
-import static org.junit.Assert.assertEquals;
-import static org.junit.Assert.assertNotNull;
-import static org.junit.Assert.fail;
-
-import java.sql.Connection;
-import java.sql.DriverManager;
-import java.sql.SQLException;
-import java.util.Arrays;
-import java.util.List;
-import java.util.Map;
-
-import org.apache.hadoop.hbase.HConstants;
-import org.apache.hadoop.hbase.TableName;
-import org.apache.phoenix.end2end.ParallelStatsDisabledIT;
-import org.apache.phoenix.exception.SQLExceptionCode;
-import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
-import org.apache.phoenix.schema.PColumn;
-import org.apache.phoenix.schema.PTable;
-import org.apache.phoenix.schema.TableNotFoundException;
-import org.apache.phoenix.util.PhoenixRuntime;
-import org.junit.Test;
-
-import com.google.common.base.Joiner;
-import com.google.common.collect.ImmutableMap;
-import com.google.common.collect.Lists;
-import com.google.common.collect.Maps;
-
-/**
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- * http://www.apache.org/licenses/LICENSE-2.0
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-public class MetaDataEndpointImplIT extends ParallelStatsDisabledIT {
-private final TableName catalogTable = 
TableName.valueOf(PhoenixDatabaseMetaData.SYSTEM_CATALOG_NAME_BYTES);
-private final TableName linkTable = 
TableName.valueOf(PhoenixDatabaseMetaData.SYSTEM_CHILD_LINK_NAME_BYTES);
-
-/*
-  The tree structure is as follows: Where ParentTable is the Base Table
-  and all children are views and child views respectively.
-
-ParentTable
-  / \
-leftChild   rightChild
-  /
-   leftGrandChild
- */
-
-@Test
-public void testGettingChildrenAndParentViews() throws Exception {
-String baseTable = generateUniqueName();
-String leftChild = generateUniqueName();
-String rightChild = generateUniqueName();
-String leftGrandChild = generateUniqueName();
-Connection conn = DriverManager.getConnection(getUrl());
-String ddlFormat =
-"CREATE TABLE IF NOT EXISTS " + baseTable + "  (" + " PK2 VARCHAR 
NOT NULL, V1 VARCHAR, V2 VARCHAR "
-+ " CONSTRAINT NAME_PK PRIMARY KEY (PK2)" + " )";
-conn.createStatement().execute(ddlFormat);
-
-conn.createStatement().execute("CREATE VIEW " + rightChild + " AS 
SELECT * FROM " + baseTable);
-conn.createStatement().execute("CREATE VIEW " + leftChild + " (carrier 
VARCHAR) AS SELECT * FROM " + baseTable);
-conn.createStatement().execute("CREATE VIEW " + leftGrandChild + " 
(dropped_calls BIGINT) AS SELECT * FROM " + leftChild);
-
-PTable table = PhoenixRuntime.getTable(conn, b

phoenix git commit: PHOENIX-4942 Move MetaDataEndpointImplTest to integration test

2018-10-15 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.3 db02372fa -> 3763709e8


PHOENIX-4942 Move MetaDataEndpointImplTest to integration test


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/3763709e
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/3763709e
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/3763709e

Branch: refs/heads/4.x-HBase-1.3
Commit: 3763709e8a57534d1a1a81892a62ad363c36bcdf
Parents: db02372
Author: Thomas D'Silva 
Authored: Mon Oct 15 22:17:24 2018 -0700
Committer: Thomas D'Silva 
Committed: Mon Oct 15 22:18:49 2018 -0700

--
 .../phoenix/end2end/MetaDataEndpointImplIT.java | 301 +++
 .../coprocessor/MetaDataEndpointImplTest.java   | 299 --
 2 files changed, 301 insertions(+), 299 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/3763709e/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
new file mode 100644
index 000..f14af9e
--- /dev/null
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
@@ -0,0 +1,301 @@
+package org.apache.phoenix.end2end;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotNull;
+import static org.junit.Assert.fail;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.SQLException;
+import java.util.Arrays;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.hadoop.hbase.HConstants;
+import org.apache.hadoop.hbase.TableName;
+import org.apache.hadoop.hbase.client.HTable;
+import org.apache.phoenix.coprocessor.TableViewFinderResult;
+import org.apache.phoenix.coprocessor.ViewFinder;
+import org.apache.phoenix.end2end.ParallelStatsDisabledIT;
+import org.apache.phoenix.exception.SQLExceptionCode;
+import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
+import org.apache.phoenix.schema.PColumn;
+import org.apache.phoenix.schema.PTable;
+import org.apache.phoenix.schema.TableNotFoundException;
+import org.apache.phoenix.util.PhoenixRuntime;
+import org.junit.Test;
+
+import com.google.common.base.Joiner;
+import com.google.common.collect.ImmutableMap;
+import com.google.common.collect.Lists;
+import com.google.common.collect.Maps;
+
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+public class MetaDataEndpointImplIT extends ParallelStatsDisabledIT {
+private final TableName catalogTable = 
TableName.valueOf(PhoenixDatabaseMetaData.SYSTEM_CATALOG_NAME_BYTES);
+private final TableName linkTable = 
TableName.valueOf(PhoenixDatabaseMetaData.SYSTEM_CHILD_LINK_NAME_BYTES);
+
+/*
+  The tree structure is as follows: Where ParentTable is the Base Table
+  and all children are views and child views respectively.
+
+ParentTable
+  / \
+leftChild   rightChild
+  /
+   leftGrandChild
+ */
+
+@Test
+public void testGettingChildrenAndParentViews() throws Exception {
+String baseTable = generateUniqueName();
+String leftChild = generateUniqueName();
+String rightChild = generateUniqueName();
+String leftGrandChild = generateUniqueName();
+Connection conn = DriverManager.getConnection(getUrl());
+String ddlFormat =
+"CREATE TABLE IF NOT EXISTS " + baseTable + "  (" + " PK2 VARCHAR NOT NULL, V1 VARCHAR, V2 VARCHAR "
++ " CONSTRAINT NAME_PK PRIMARY KEY (PK2)" + " )";
+conn.createStatement().execute(ddlFormat);
+
+conn.createStatement().execute("CREATE VIEW " + rightChild + " AS SELECT * FROM " + baseTable);
+conn.createStatement().execute("CREATE VIEW " + leftChild + " (carrier VARCHAR) AS SELECT * FROM " + baseTable);
+con
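The diff above is truncated by the archive mid-statement. For readers following along, here is a minimal sketch of how a test like this typically finishes, creating the grandchild view and checking the inherited columns. It uses only the imports visible in the diff (PhoenixRuntime, PTable, PColumn, Lists, Arrays, List); the grandchild's added column name and the expected column ordering are illustrative assumptions, not the remainder of the committed test.

    // Complete the tree from the class comment: leftGrandChild hangs off leftChild.
    conn.createStatement().execute("CREATE VIEW " + leftGrandChild
        + " (dropped_calls BIGINT) AS SELECT * FROM " + leftChild);  // hypothetical column

    // A view inherits its parent's columns, so the grandchild should expose the
    // base table's PK2/V1/V2, leftChild's CARRIER, and its own DROPPED_CALLS.
    PTable grandChild = PhoenixRuntime.getTable(conn, leftGrandChild.toUpperCase());
    List<String> names = Lists.newArrayList();
    for (PColumn c : grandChild.getColumns()) {
        names.add(c.getName().getString());
    }
    assertEquals(Arrays.asList("PK2", "V1", "V2", "CARRIER", "DROPPED_CALLS"), names);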

phoenix git commit: PHOENIX-4942 Move MetaDataEndpointImplTest to integration test

2018-10-15 Thread tdsilva
Repository: phoenix
Updated Branches:
  refs/heads/4.x-HBase-1.2 9ab708061 -> 280412183


PHOENIX-4942 Move MetaDataEndpointImplTest to integration test


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/28041218
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/28041218
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/28041218

Branch: refs/heads/4.x-HBase-1.2
Commit: 2804121837d77e048f6ce5cacee0084d5258220a
Parents: 9ab7080
Author: Thomas D'Silva 
Authored: Mon Oct 15 22:17:24 2018 -0700
Committer: Thomas D'Silva 
Committed: Mon Oct 15 22:17:24 2018 -0700

--
 .../phoenix/end2end/MetaDataEndpointImplIT.java | 301 +++
 .../coprocessor/MetaDataEndpointImplTest.java   | 299 --
 2 files changed, 301 insertions(+), 299 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/28041218/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
new file mode 100644
index 000..f14af9e
--- /dev/null
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/MetaDataEndpointImplIT.java
@@ -0,0 +1,301 @@
+package org.apache.phoenix.end2end;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertNotNull;
+import static org.junit.Assert.fail;
+
+import java.io.IOException;
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.SQLException;
+import java.util.Arrays;
+import java.util.List;
+import java.util.Map;
+
+import org.apache.hadoop.hbase.HConstants;
+import org.apache.hadoop.hbase.TableName;
+import org.apache.hadoop.hbase.client.HTable;
+import org.apache.phoenix.coprocessor.TableViewFinderResult;
+import org.apache.phoenix.coprocessor.ViewFinder;
+import org.apache.phoenix.end2end.ParallelStatsDisabledIT;
+import org.apache.phoenix.exception.SQLExceptionCode;
+import org.apache.phoenix.jdbc.PhoenixDatabaseMetaData;
+import org.apache.phoenix.schema.PColumn;
+import org.apache.phoenix.schema.PTable;
+import org.apache.phoenix.schema.TableNotFoundException;
+import org.apache.phoenix.util.PhoenixRuntime;
+import org.junit.Test;
+
+import com.google.common.base.Joiner;
+import com.google.common.collect.ImmutableMap;
+import com.google.common.collect.Lists;
+import com.google.common.collect.Maps;
+
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ * http://www.apache.org/licenses/LICENSE-2.0
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+public class MetaDataEndpointImplIT extends ParallelStatsDisabledIT {
+private final TableName catalogTable = TableName.valueOf(PhoenixDatabaseMetaData.SYSTEM_CATALOG_NAME_BYTES);
+private final TableName linkTable = TableName.valueOf(PhoenixDatabaseMetaData.SYSTEM_CHILD_LINK_NAME_BYTES);
+
+/*
+  The tree structure is as follows, where ParentTable is the base table
+  and all children are views or child views, respectively:
+
+ParentTable
+  / \
+leftChild   rightChild
+  /
+   leftGrandChild
+ */
+
+@Test
+public void testGettingChildrenAndParentViews() throws Exception {
+String baseTable = generateUniqueName();
+String leftChild = generateUniqueName();
+String rightChild = generateUniqueName();
+String leftGrandChild = generateUniqueName();
+Connection conn = DriverManager.getConnection(getUrl());
+String ddlFormat =
+"CREATE TABLE IF NOT EXISTS " + baseTable + "  (" + " PK2 VARCHAR NOT NULL, V1 VARCHAR, V2 VARCHAR "
++ " CONSTRAINT NAME_PK PRIMARY KEY (PK2)" + " )";
+conn.createStatement().execute(ddlFormat);
+
+conn.createStatement().execute("CREATE VIEW " + rightChild + " AS SELECT * FROM " + baseTable);
+conn.createStatement().execute("CREATE VIEW " + leftChild + " (carrier VARCHAR) AS SELECT * FROM " + baseTable);
+con

Build failed in Jenkins: Phoenix-omid2 #125

2018-10-15 Thread Apache Jenkins Server
See 


Changes:

[jamestaylor] Revert "PHOENIX-4874 psql doesn't support date/time with values smaller

--
[...truncated 549.24 KB...]
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: java.net.BindException: Address already in use

[ERROR] testUpdateStatsWithMultipleTables[mutable=true,transactionProvider=OMID,isUserTableNamespaceMapped=false,columnEncoded=false](org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT)  Time elapsed: 0.097 s  <<< ERROR!
com.google.common.util.concurrent.UncheckedExecutionException: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: java.net.BindException: Address already in use

[ERROR] testUpdateStats[mutable=true,transactionProvider=OMID,isUserTableNamespaceMapped=false,columnEncoded=false](org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT)  Time elapsed: 0.084 s  <<< ERROR!
com.google.common.util.concurrent.UncheckedExecutionException: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: java.net.BindException: Address already in use

[ERROR] testGuidePostWidthUsedInDefaultStatsCollector[mutable=true,transactionProvider=OMID,isUserTableNamespaceMapped=false,columnEncoded=false](org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT)  Time elapsed: 0.151 s  <<< ERROR!
com.google.common.util.concurrent.UncheckedExecutionException: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: java.net.BindException: Address already in use

[ERROR] testNoDuplicatesAfterUpdateStatsWithDesc[mutable=true,transactionProvider=OMID,isUserTableNamespaceMapped=false,columnEncoded=false](org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT)  Time elapsed: 0.181 s  <<< ERROR!
com.google.common.util.concurrent.UncheckedExecutionException: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: java.net.BindException: Address already in use

[ERROR] testNoDuplicatesAfterUpdateStatsWithSplits[mutable=true,transactionProvider=OMID,isUserTableNamespaceMapped=false,columnEncoded=false](org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT)  Time elapsed: 0.159 s  <<< ERROR!
com.google.common.util.concurrent.UncheckedExecutionException: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: java.net.BindException: Address already in use

[ERROR] testUpdateEmptyStats[mutable=true,transactionProvider=OMID,isUserTableNamespaceMapped=false,columnEncoded=false](org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT)  Time elapsed: 0.372 s  <<< ERROR!
com.google.common.util.concurrent.UncheckedExecutionException: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: java.net.BindException: Address already in use

[ERROR] testWithMultiCF[mutable=true,transactionProvider=OMID,isUserTableNamespaceMapped=false,columnEncoded=false](org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT)  Time elapsed: 0.178 s  <<< ERROR!
com.google.common.util.concurrent.UncheckedExecutionException: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: 0.0.0.0/0.0.0.0:43214
Caused by: java.net.BindException: Address already in use
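All seven errors above share one root cause: the test mini-cluster binds a hard-coded port (43214) that is still held by an earlier run. A common remedy in JVM test harnesses, shown here only as a general sketch and not as the fix applied in Phoenix, is to ask the kernel for a free ephemeral port before starting the service:

    import java.io.IOException;
    import java.net.ServerSocket;

    final class FreePort {
        // Binding port 0 lets the OS choose an unused port; we read it back and
        // release the socket for the service under test to claim. A small race
        // window remains, but deterministic collisions on a fixed port disappear.
        static int get() throws IOException {
            try (ServerSocket s = new ServerSocket(0)) {
                return s.getLocalPort();
            }
        }
    }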

[INFO] Running org.apache.phoenix.end2end.SystemCatalogIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 96.918 s - in org.apache.phoenix.end2end.SystemCatalogIT
[INFO] Running org.apache.phoenix.end2end.SystemTablePermissionsIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 145.562 s - in org.apache.phoenix.end2end.SystemTablePermissionsIT
[INFO] Running org.apache.phoenix.end2end.TableDDLPermissionsIT
[ERROR] Tests run: 12, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 945.757 s <<< FAILURE! - in org.apache.phoenix.end2end.SplitIT
[ERROR] testSimpleSelectAfterTableSplit(org.apache.phoenix.end2end.SplitIT)  Time elapsed: 623.231 s  <<< ERROR!
org.apache.hadoop.hbase.exceptions.TimeoutIOException: java.util.concurrent.TimeoutException: The procedure 72 is s

phoenix git commit: Revert "PHOENIX-4874 psql doesn't support date/time with values smaller than milliseconds(Rajeshbabu)"

2018-10-15 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/omid2 24ffcf5c8 -> 0ce663c53


Revert "PHOENIX-4874 psql doesn't support date/time with values smaller than 
milliseconds(Rajeshbabu)"

This reverts commit 34b8fe86b40f6cc2a05395640044e9dd7e1a1a8f.


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/0ce663c5
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/0ce663c5
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/0ce663c5

Branch: refs/heads/omid2
Commit: 0ce663c53089efa9821a4423fb973b274ef67638
Parents: 24ffcf5
Author: James Taylor 
Authored: Mon Oct 15 15:56:20 2018 -0700
Committer: James Taylor 
Committed: Mon Oct 15 15:56:20 2018 -0700

--
 .../phoenix/util/csv/CsvUpsertExecutor.java | 20 +++-
 .../phoenix/util/json/JsonUpsertExecutor.java   |  3 --
 .../util/AbstractUpsertExecutorTest.java| 51 +---
 3 files changed, 20 insertions(+), 54 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/0ce663c5/phoenix-core/src/main/java/org/apache/phoenix/util/csv/CsvUpsertExecutor.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/util/csv/CsvUpsertExecutor.java b/phoenix-core/src/main/java/org/apache/phoenix/util/csv/CsvUpsertExecutor.java
index d2529f7..cd40b44 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/util/csv/CsvUpsertExecutor.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/util/csv/CsvUpsertExecutor.java
@@ -20,7 +20,6 @@ package org.apache.phoenix.util.csv;
 import java.sql.Connection;
 import java.sql.PreparedStatement;
 import java.sql.SQLException;
-import java.sql.Timestamp;
 import java.sql.Types;
 import java.util.List;
 import java.util.Properties;
@@ -31,7 +30,6 @@ import org.apache.commons.csv.CSVRecord;
 import org.apache.hadoop.hbase.util.Base64;
 import org.apache.hadoop.hbase.util.Bytes;
 import org.apache.phoenix.expression.function.EncodeFormat;
-import org.apache.phoenix.jdbc.PhoenixConnection;
 import org.apache.phoenix.query.QueryServices;
 import org.apache.phoenix.query.QueryServicesOptions;
 import org.apache.phoenix.schema.IllegalDataException;
@@ -43,7 +41,6 @@ import org.apache.phoenix.schema.types.PTimestamp;
 import org.apache.phoenix.schema.types.PVarbinary;
 import org.apache.phoenix.util.ColumnInfo;
 import org.apache.phoenix.util.DateUtil;
-import org.apache.phoenix.util.ReadOnlyProps;
 import org.apache.phoenix.util.UpsertExecutor;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -128,9 +125,9 @@ public class CsvUpsertExecutor extends UpsertExecutor<CSVRecord> {
 private final String binaryEncoding;
 
 SimpleDatatypeConversionFunction(PDataType dataType, Connection conn) {
-ReadOnlyProps props;
+Properties props;
 try {
-props = conn.unwrap(PhoenixConnection.class).getQueryServices().getProps();
+props = conn.getClientInfo();
 } catch (SQLException e) {
 throw new RuntimeException(e);
 }
@@ -142,23 +139,23 @@ public class CsvUpsertExecutor extends UpsertExecutor<CSVRecord> {
 String dateFormat;
 int dateSqlType = dataType.getResultSetSqlType();
 if (dateSqlType == Types.DATE) {
-dateFormat = props.get(QueryServices.DATE_FORMAT_ATTRIB,
+dateFormat = props.getProperty(QueryServices.DATE_FORMAT_ATTRIB,
 DateUtil.DEFAULT_DATE_FORMAT);
 } else if (dateSqlType == Types.TIME) {
-dateFormat = props.get(QueryServices.TIME_FORMAT_ATTRIB,
+dateFormat = props.getProperty(QueryServices.TIME_FORMAT_ATTRIB,
 DateUtil.DEFAULT_TIME_FORMAT);
 } else {
-dateFormat = props.get(QueryServices.TIMESTAMP_FORMAT_ATTRIB,
+dateFormat = props.getProperty(QueryServices.TIMESTAMP_FORMAT_ATTRIB,
 DateUtil.DEFAULT_TIMESTAMP_FORMAT);
 
 }
-String timeZoneId = props.get(QueryServices.DATE_FORMAT_TIMEZONE_ATTRIB,
+String timeZoneId = props.getProperty(QueryServices.DATE_FORMAT_TIMEZONE_ATTRIB,
 QueryServicesOptions.DEFAULT_DATE_FORMAT_TIMEZONE);
 this.dateTimeParser = DateUtil.getDateTimeParser(dateFormat, dataType, timeZoneId);
 } else {
 this.dateTimeParser = null;
 }
 this.codec = codec;
-this.binaryEncoding = props.get(QueryServices.UPLOAD_BINARY_DATA_TYPE_ENCODING,
+this.binaryEncoding = props.getProperty(QueryServices.UPLOAD_BINARY_DATA_TYP
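For context, the keys read here are standard Phoenix client settings, whichever properties object they come from. A caller can supply them when opening a connection; the JDBC URL and the format values below are illustrative, not defaults:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.Properties;
    import org.apache.phoenix.query.QueryServices;

    Properties info = new Properties();
    // Format used when parsing CSV text into DATE columns, plus its time zone.
    info.setProperty(QueryServices.DATE_FORMAT_ATTRIB, "yyyy-MM-dd");
    info.setProperty(QueryServices.DATE_FORMAT_TIMEZONE_ATTRIB, "UTC");
    Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181", info);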

[12/22] phoenix git commit: Revert "PHOENIX-4825 Replace usage of HBase Base64 implementation with java.util.Base64"

2018-10-15 Thread jamestaylor
Revert "PHOENIX-4825 Replace usage of HBase Base64 implementation with 
java.util.Base64"

This reverts commit 22934e5af7af79580bf54feeb7667eccafaafc71 in order to 
support JDK 1.7 for 4.x releases.


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/8c76e7c9
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/8c76e7c9
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/8c76e7c9

Branch: refs/heads/omid2
Commit: 8c76e7c9775f8695a513168ac3ba3db467d54482
Parents: e62be9c
Author: Ankit Singhal 
Authored: Fri Oct 5 16:51:37 2018 -0700
Committer: Ankit Singhal 
Committed: Fri Oct 5 16:51:37 2018 -0700

--
 .../org/apache/phoenix/end2end/QueryMoreIT.java |  7 ++---
 .../phoenix/mapreduce/CsvBulkImportUtil.java|  8 ++
 .../util/PhoenixConfigurationUtil.java  |  7 ++---
 .../apache/phoenix/schema/types/PVarbinary.java |  4 +--
 .../phoenix/util/csv/CsvUpsertExecutor.java |  4 +--
 .../phoenix/util/json/JsonUpsertExecutor.java   |  4 +--
 .../util/AbstractUpsertExecutorTest.java| 12 
 .../util/TenantIdByteConversionTest.java| 30 
 8 files changed, 26 insertions(+), 50 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/8c76e7c9/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
index 528fe7f..04272fa 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/QueryMoreIT.java
@@ -31,13 +31,12 @@ import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.sql.Statement;
 import java.util.ArrayList;
-import java.util.Base64;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
 import java.util.Properties;
 
-import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.hadoop.hbase.util.Base64;
 import org.apache.hadoop.hbase.util.Pair;
 import org.apache.phoenix.jdbc.PhoenixConnection;
 import org.apache.phoenix.query.QueryServices;
@@ -279,7 +278,7 @@ public class QueryMoreIT extends ParallelStatsDisabledIT {
 values[i] = rs.getObject(i + 1);
 }
 conn = getTenantSpecificConnection(tenantId);
-pkIds.add(Bytes.toString(Base64.getEncoder().encode(PhoenixRuntime.encodeColumnValues(conn, tableOrViewName.toUpperCase(), values, columns))));
+pkIds.add(Base64.encodeBytes(PhoenixRuntime.encodeColumnValues(conn, tableOrViewName.toUpperCase(), values, columns)));
 }
 return pkIds.toArray(new String[pkIds.size()]);
 }
@@ -297,7 +296,7 @@ public class QueryMoreIT extends ParallelStatsDisabledIT {
 PreparedStatement stmt = conn.prepareStatement(query);
 int bindCounter = 1;
 for (int i = 0; i < cursorIds.length; i++) {
-Object[] pkParts = PhoenixRuntime.decodeColumnValues(conn, tableName.toUpperCase(), Base64.getDecoder().decode(cursorIds[i]), columns);
+Object[] pkParts = PhoenixRuntime.decodeColumnValues(conn, tableName.toUpperCase(), Base64.decode(cursorIds[i]), columns);
 for (int j = 0; j < pkParts.length; j++) {
 stmt.setObject(bindCounter++, pkParts[j]);
 }

http://git-wip-us.apache.org/repos/asf/phoenix/blob/8c76e7c9/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
index bf5a538..ff9ff72 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/CsvBulkImportUtil.java
@@ -17,11 +17,9 @@
  */
 package org.apache.phoenix.mapreduce;
 
-import java.util.Base64;
-
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.fs.Path;
-import org.apache.hadoop.hbase.util.Bytes;
+import org.apache.hadoop.hbase.util.Base64;
 import org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil;
 import org.apache.phoenix.query.QueryConstants;
 import org.apache.phoenix.query.QueryServices;
@@ -70,7 +68,7 @@ public class CsvBulkImportUtil {
 
 @VisibleForTesting
 static void setChar(Configuration conf, String confKey, char charValue) {
-conf.set(confKey, Bytes.toString(Base64.getEncoder().encode(Character.toString(charValue).getBytes())));
+conf.set(confKey, Base64.encodeBytes(Character.toString(charValue)
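The revert swaps every java.util.Base64 call (JDK 8+) back to HBase's Base64 helper so the 4.x line keeps building on JDK 1.7. The two call shapes, side by side, distilled from the diff above as a small sketch:

    import org.apache.hadoop.hbase.util.Bytes;

    byte[] data = Bytes.toBytes("example");

    // JDK 8+ form that was reverted:
    String s8 = Bytes.toString(java.util.Base64.getEncoder().encode(data));
    byte[] back8 = java.util.Base64.getDecoder().decode(s8);

    // JDK 1.7-compatible HBase helper restored by this commit:
    String s7 = org.apache.hadoop.hbase.util.Base64.encodeBytes(data);
    byte[] back7 = org.apache.hadoop.hbase.util.Base64.decode(s7);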

[10/22] phoenix git commit: PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

2018-10-15 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/e62be9c8/python/phoenixdb/avatica/__init__.py
--
diff --git a/python/phoenixdb/avatica/__init__.py b/python/phoenixdb/avatica/__init__.py
deleted file mode 100644
index 53776d7..000
--- a/python/phoenixdb/avatica/__init__.py
+++ /dev/null
@@ -1,16 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-from .client import AvaticaClient  # noqa: F401

http://git-wip-us.apache.org/repos/asf/phoenix/blob/e62be9c8/python/phoenixdb/avatica/client.py
--
diff --git a/python/phoenixdb/avatica/client.py b/python/phoenixdb/avatica/client.py
deleted file mode 100644
index ea00631..000
--- a/python/phoenixdb/avatica/client.py
+++ /dev/null
@@ -1,510 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-"""Implementation of the JSON-over-HTTP RPC protocol used by Avatica."""
-
-import re
-import socket
-import pprint
-import math
-import logging
-import time
-from phoenixdb import errors
-from phoenixdb.avatica.proto import requests_pb2, common_pb2, responses_pb2
-
-try:
-import httplib
-except ImportError:
-import http.client as httplib
-
-try:
-import urlparse
-except ImportError:
-import urllib.parse as urlparse
-
-try:
-from HTMLParser import HTMLParser
-except ImportError:
-from html.parser import HTMLParser
-
-__all__ = ['AvaticaClient']
-
-logger = logging.getLogger(__name__)
-
-
-class JettyErrorPageParser(HTMLParser):
-
-def __init__(self):
-HTMLParser.__init__(self)
-self.path = []
-self.title = []
-self.message = []
-
-def handle_starttag(self, tag, attrs):
-self.path.append(tag)
-
-def handle_endtag(self, tag):
-self.path.pop()
-
-def handle_data(self, data):
-if len(self.path) > 2 and self.path[0] == 'html' and self.path[1] == 'body':
-if len(self.path) == 3 and self.path[2] == 'h2':
-self.title.append(data.strip())
-elif len(self.path) == 4 and self.path[2] == 'p' and self.path[3] == 'pre':
-self.message.append(data.strip())
-
-
-def parse_url(url):
-url = urlparse.urlparse(url)
-if not url.scheme and not url.netloc and url.path:
-netloc = url.path
-if ':' not in netloc:
-netloc = '{}:8765'.format(netloc)
-return urlparse.ParseResult('http', netloc, '/', '', '', '')
-return url
-
-
-# Defined in phoenix-core/src/main/java/org/apache/phoenix/exception/SQLExceptionCode.java
-SQLSTATE_ERROR_CLASSES = [
-('08', errors.OperationalError),  # Connection Exception
-('22018', errors.IntegrityError),  # Constraint violation.
-('22', errors.DataError),  # Data Exception
-('23', errors.IntegrityError),  # Constraint Violation
-('24', errors.InternalError),  # Invalid Cursor State
-('25', errors.InternalError),  # Invalid Transaction State
-('42', errors.ProgrammingError),  # Syntax Error or Access Rule Violation
-('XLC', errors.OperationalError),  # Execution exceptions
-('INT', errors.InternalError),  # Phoenix internal error
-]
-
-# Relevant properties as defined by https://calcite.apache.org/avatica/docs/client_reference.html
-OPEN_CONNECTION_PROPERTIES = (
-'user',  # User for the database connection
-'password',  # Password for the user
-)
-
-
-def raise_sql_error(code, sqlstate, message):
-for pr

[06/22] phoenix git commit: PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

2018-10-15 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/e62be9c8/python/phoenixdb/phoenixdb/avatica/proto/requests_pb2.py
--
diff --git a/python/phoenixdb/phoenixdb/avatica/proto/requests_pb2.py b/python/phoenixdb/phoenixdb/avatica/proto/requests_pb2.py
new file mode 100644
index 000..203f945
--- /dev/null
+++ b/python/phoenixdb/phoenixdb/avatica/proto/requests_pb2.py
@@ -0,0 +1,1206 @@
+# Generated by the protocol buffer compiler.  DO NOT EDIT!
+# source: requests.proto
+
+import sys
+_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import message as _message
+from google.protobuf import reflection as _reflection
+from google.protobuf import symbol_database as _symbol_database
+from google.protobuf import descriptor_pb2
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+from . import common_pb2 as common__pb2
+
+
+DESCRIPTOR = _descriptor.FileDescriptor(
+  name='requests.proto',
+  package='',
+  syntax='proto3',
+  
serialized_pb=_b('\n\x0erequests.proto\x1a\x0c\x63ommon.proto\"(\n\x0f\x43\x61talogsRequest\x12\x15\n\rconnection_id\x18\x01
 \x01(\t\"0\n\x17\x44\x61tabasePropertyRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"P\n\x0eSchemasRequest\x12\x0f\n\x07\x63\x61talog\x18\x01 
\x01(\t\x12\x16\n\x0eschema_pattern\x18\x02 
\x01(\t\x12\x15\n\rconnection_id\x18\x03 
\x01(\t\"\x95\x01\n\rTablesRequest\x12\x0f\n\x07\x63\x61talog\x18\x01 
\x01(\t\x12\x16\n\x0eschema_pattern\x18\x02 
\x01(\t\x12\x1a\n\x12table_name_pattern\x18\x03 
\x01(\t\x12\x11\n\ttype_list\x18\x04 \x03(\t\x12\x15\n\rhas_type_list\x18\x06 
\x01(\x08\x12\x15\n\rconnection_id\x18\x07 
\x01(\t\"*\n\x11TableTypesRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"\x89\x01\n\x0e\x43olumnsRequest\x12\x0f\n\x07\x63\x61talog\x18\x01 
\x01(\t\x12\x16\n\x0eschema_pattern\x18\x02 
\x01(\t\x12\x1a\n\x12table_name_pattern\x18\x03 
\x01(\t\x12\x1b\n\x13\x63olumn_name_pattern\x18\x04 
\x01(\t\x12\x15\n\rconnection_id\x18\x05 \x01(\t\"(\n\x0fTypeInfoReque
 st\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"\xa1\x01\n\x18PrepareAndExecuteRequest\x12\x15\n\rconnection_id\x18\x01
 \x01(\t\x12\x0b\n\x03sql\x18\x02 \x01(\t\x12\x15\n\rmax_row_count\x18\x03 
\x01(\x04\x12\x14\n\x0cstatement_id\x18\x04 
\x01(\r\x12\x16\n\x0emax_rows_total\x18\x05 
\x01(\x03\x12\x1c\n\x14\x66irst_frame_max_size\x18\x06 
\x01(\x05\"c\n\x0ePrepareRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\x0b\n\x03sql\x18\x02 \x01(\t\x12\x15\n\rmax_row_count\x18\x03 
\x01(\x04\x12\x16\n\x0emax_rows_total\x18\x04 
\x01(\x03\"\x80\x01\n\x0c\x46\x65tchRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\x14\n\x0cstatement_id\x18\x02 \x01(\r\x12\x0e\n\x06offset\x18\x03 
\x01(\x04\x12\x1b\n\x13\x66\x65tch_max_row_count\x18\x04 
\x01(\r\x12\x16\n\x0e\x66rame_max_size\x18\x05 
\x01(\x05\"/\n\x16\x43reateStatementRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"D\n\x15\x43loseStatementRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\x14\n\x0cstatement_id\x18\x02 \x01(\r\"\x8b\x01\n\x15Op
 enConnectionRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12.\n\x04info\x18\x02 \x03(\x0b\x32 
.OpenConnectionRequest.InfoEntry\x1a+\n\tInfoEntry\x12\x0b\n\x03key\x18\x01 
\x01(\t\x12\r\n\x05value\x18\x02 
\x01(\t:\x02\x38\x01\"/\n\x16\x43loseConnectionRequest\x12\x15\n\rconnection_id\x18\x01
 \x01(\t\"Y\n\x15\x43onnectionSyncRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12)\n\nconn_props\x18\x02 
\x01(\x0b\x32\x15.ConnectionProperties\"\xc7\x01\n\x0e\x45xecuteRequest\x12)\n\x0fstatementHandle\x18\x01
 \x01(\x0b\x32\x10.StatementHandle\x12%\n\x10parameter_values\x18\x02 
\x03(\x0b\x32\x0b.TypedValue\x12\'\n\x1f\x64\x65precated_first_frame_max_size\x18\x03
 \x01(\x04\x12\x1c\n\x14has_parameter_values\x18\x04 
\x01(\x08\x12\x1c\n\x14\x66irst_frame_max_size\x18\x05 
\x01(\x05\"m\n\x12SyncResultsRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\x14\n\x0cstatement_id\x18\x02 \x01(\r\x12\x1a\n\x05state\x18\x03 
\x01(\x0b\x32\x0b.QueryState\x12\x0e\n\x06offset\x18\x04 \x01(\x04\"&\n\rCommi
 tRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"(\n\x0fRollbackRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"b\n\x1dPrepareAndExecuteBatchRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\x14\n\x0cstatement_id\x18\x02 
\x01(\r\x12\x14\n\x0csql_commands\x18\x03 
\x03(\t\"4\n\x0bUpdateBatch\x12%\n\x10parameter_values\x18\x01 
\x03(\x0b\x32\x0b.TypedValue\"a\n\x13\x45xecuteBatchRequest\x12\x15\n\rconnection_id\x18\x01
 \x01(\t\x12\x14\n\x0cstatement_id\x18\x02 \x01(\r\x12\x1d\n\x07updates\x18\x03 
\x03(\x0b\x32\x0c.UpdateBatchB\"\n org.apache.calcite.avatica.protob\x06proto3')
+  ,
+  dependencies=[common__pb2.DESCRIPTOR,])
+_sym_db.RegisterFileDescriptor(DESCRIPTOR)
+
+
+
+
+_CATALOGSREQUEST = _descriptor.Descriptor(
+  name='CatalogsRequest',
+  full_name='CatalogsRequest',
+  filename=None,
+  file=D

[19/22] phoenix git commit: PHOENIX-4358 Case Sensitive String match on SqlType in PDataType (Dave Angulo)

2018-10-15 Thread jamestaylor
PHOENIX-4358 Case Sensitive String match on SqlType in PDataType (Dave Angulo)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/9556b8e1
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/9556b8e1
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/9556b8e1

Branch: refs/heads/omid2
Commit: 9556b8e10361286b3a7ef9402620be29d2422355
Parents: 7580b7e
Author: Thomas D'Silva 
Authored: Fri Oct 12 13:46:15 2018 -0700
Committer: Thomas D'Silva 
Committed: Fri Oct 12 13:47:42 2018 -0700

--
 .../main/java/org/apache/phoenix/schema/types/PDataType.java   | 2 +-
 .../java/org/apache/phoenix/schema/types/PDataTypeTest.java| 6 ++
 2 files changed, 7 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/9556b8e1/phoenix-core/src/main/java/org/apache/phoenix/schema/types/PDataType.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/schema/types/PDataType.java b/phoenix-core/src/main/java/org/apache/phoenix/schema/types/PDataType.java
index 1e29d6f..eba6079 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/schema/types/PDataType.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/schema/types/PDataType.java
@@ -1041,7 +1041,7 @@ public abstract class PDataType implements DataType, Comparable

http://git-wip-us.apache.org/repos/asf/phoenix/blob/9556b8e1/phoenix-core/src/test/java/org/apache/phoenix/schema/types/PDataTypeTest.java
--
diff --git a/phoenix-core/src/test/java/org/apache/phoenix/schema/types/PDataTypeTest.java b/phoenix-core/src/test/java/org/apache/phoenix/schema/types/PDataTypeTest.java
index 4b02cea..e868f4e 100644
--- a/phoenix-core/src/test/java/org/apache/phoenix/schema/types/PDataTypeTest.java
+++ b/phoenix-core/src/test/java/org/apache/phoenix/schema/types/PDataTypeTest.java
@@ -1949,4 +1949,10 @@ public class PDataTypeTest {
 }
 }
 }
+
+@Test
+public void testFromSqlTypeName() {
+assertEquals(PVarchar.INSTANCE, PDataType.fromSqlTypeName("varchar"));
+}
+
 }
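The new test pins down the behavior PHOENIX-4358 fixes: SQL type names now resolve regardless of case. A short usage illustration, assuming the fixed fromSqlTypeName and nothing else beyond what the diff shows:

    import org.apache.phoenix.schema.types.PDataType;
    import org.apache.phoenix.schema.types.PVarchar;

    // Before the fix only the uppercase form matched; after it, both resolve.
    assert PDataType.fromSqlTypeName("VARCHAR") == PVarchar.INSTANCE;
    assert PDataType.fromSqlTypeName("varchar") == PVarchar.INSTANCE;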



[15/22] phoenix git commit: PHOENIX-3955: Ensure KEEP_DELETED_CELLS, REPLICATION_SCOPE, and TTL properties stay in sync between the physical data table and index tables

2018-10-15 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/dd3e55f1/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
index ab4678a..b68637a 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/query/ConnectionQueryServicesImpl.java
@@ -18,6 +18,8 @@
 package org.apache.phoenix.query;
 import static java.util.concurrent.TimeUnit.MILLISECONDS;
 import static org.apache.hadoop.hbase.HColumnDescriptor.TTL;
+import static org.apache.hadoop.hbase.HColumnDescriptor.REPLICATION_SCOPE;
+import static org.apache.hadoop.hbase.HColumnDescriptor.KEEP_DELETED_CELLS;
 import static org.apache.phoenix.coprocessor.MetaDataProtocol.MIN_SYSTEM_TABLE_TIMESTAMP;
 import static org.apache.phoenix.coprocessor.MetaDataProtocol.PHOENIX_MAJOR_VERSION;
 import static org.apache.phoenix.coprocessor.MetaDataProtocol.PHOENIX_MINOR_VERSION;
@@ -62,6 +64,7 @@ import static org.apache.phoenix.util.UpgradeUtil.addViewIndexToParentLinks;
 import static org.apache.phoenix.util.UpgradeUtil.getSysCatalogSnapshotName;
 import static org.apache.phoenix.util.UpgradeUtil.moveChildLinks;
 import static org.apache.phoenix.util.UpgradeUtil.upgradeTo4_5_0;
+import static org.apache.phoenix.util.UpgradeUtil.syncTableAndIndexProperties;
 
 import java.io.IOException;
 import java.lang.management.ManagementFactory;
@@ -101,11 +104,13 @@ import java.util.concurrent.atomic.AtomicInteger;
 
 import javax.annotation.concurrent.GuardedBy;
 
+import com.google.common.base.Strings;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hadoop.hbase.HColumnDescriptor;
 import org.apache.hadoop.hbase.HConstants;
 import org.apache.hadoop.hbase.HRegionLocation;
 import org.apache.hadoop.hbase.HTableDescriptor;
+import org.apache.hadoop.hbase.KeepDeletedCells;
 import org.apache.hadoop.hbase.NamespaceDescriptor;
 import org.apache.hadoop.hbase.NamespaceNotFoundException;
 import org.apache.hadoop.hbase.TableExistsException;
@@ -775,62 +780,94 @@ public class ConnectionQueryServicesImpl extends DelegateQueryServices implement
 PTableType tableType, Map<String, Object> tableProps, List<Pair<byte[], Map<String, Object>>> families,
 byte[][] splits, boolean isNamespaceMapped) throws SQLException {
 String defaultFamilyName = (String)tableProps.remove(PhoenixDatabaseMetaData.DEFAULT_COLUMN_FAMILY_NAME);
-HTableDescriptor tableDescriptor = (existingDesc != null) ? new HTableDescriptor(existingDesc)
-: new HTableDescriptor(physicalTableName);
+HTableDescriptor newTableDescriptor = (existingDesc != null) ? new HTableDescriptor(existingDesc)
+: new HTableDescriptor(TableName.valueOf(physicalTableName));
+
+HColumnDescriptor dataTableColDescForIndexTablePropSyncing = null;
+if (tableType == PTableType.INDEX || MetaDataUtil.isViewIndex(Bytes.toString(physicalTableName))) {
+byte[] defaultFamilyBytes =
+defaultFamilyName == null ? QueryConstants.DEFAULT_COLUMN_FAMILY_BYTES : Bytes.toBytes(defaultFamilyName);
+
+final HTableDescriptor baseTableDesc;
+if (MetaDataUtil.isViewIndex(Bytes.toString(physicalTableName))) {
+// Handles indexes created on views for single-tenant tables and
+// global indexes created on views of multi-tenant tables
+baseTableDesc = this.getTableDescriptor(Bytes.toBytes(MetaDataUtil.getViewIndexUserTableName(Bytes.toString(physicalTableName))));
+} else if (existingDesc == null) {
+// Global/local index creation on top of a physical base table
+baseTableDesc = this.getTableDescriptor(SchemaUtil.getPhysicalTableName(
+Bytes.toBytes((String) tableProps.get(PhoenixDatabaseMetaData.DATA_TABLE_NAME)), isNamespaceMapped)
+.getName());
+} else {
+// In case this is a local index created on a view of a multi-tenant table, the
+// DATA_TABLE_NAME points to the name of the view instead of the physical base table
+baseTableDesc = existingDesc;
+}
+dataTableColDescForIndexTablePropSyncing = baseTableDesc.getFamily(defaultFamilyBytes);
+// It's possible that the table has specific column families and none of them are declared
+// to be the DEFAULT_COLUMN_FAMILY, so we choose the first column family for syncing properties
+if (dataTableColDescForIndexTablePropSyncing == null) {
+dataTableColDescForIndexTablePropSyncing = baseTableDesc.getColumnFamilies()[0];
+}
+}
 // By default
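The new block above picks which HBase column-family descriptor to copy properties from: the view-index's user table, the data table named in DATA_TABLE_NAME for a fresh global or local index, or the existing descriptor for a local index on a view of a multi-tenant table. The intended end-to-end effect, per the commit subject, is that TTL, KEEP_DELETED_CELLS, and REPLICATION_SCOPE set on a data table carry over to its index tables. A hypothetical SQL-level illustration (table and index names invented, not from the commit):

    Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181");
    conn.createStatement().execute("CREATE TABLE T (K VARCHAR PRIMARY KEY, V VARCHAR)");
    conn.createStatement().execute("CREATE INDEX T_IDX ON T (V)");
    // TTL is one of the synced properties; after this statement the HBase
    // descriptor of T_IDX is expected to end up with TTL=86400 as well.
    conn.createStatement().execute("ALTER TABLE T SET TTL=86400");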

[13/22] phoenix git commit: PHOENIX-4855 Continue to write base table column metadata when creating a view in order to support rollback (addendum)

2018-10-15 Thread jamestaylor
PHOENIX-4855 Continue to write base table column metadata when creating a view in order to support rollback (addendum)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/c90d090a
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/c90d090a
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/c90d090a

Branch: refs/heads/omid2
Commit: c90d090a1fba7c2937a5b91e4ac0f1fe379ec539
Parents: 8c76e7c
Author: Thomas D'Silva 
Authored: Sat Oct 6 12:40:54 2018 -0700
Committer: Thomas D'Silva 
Committed: Sat Oct 6 12:42:01 2018 -0700

--
 .../java/org/apache/phoenix/jdbc/PhoenixDatabaseMetaData.java | 7 ---
 1 file changed, 4 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/c90d090a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixDatabaseMetaData.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixDatabaseMetaData.java b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixDatabaseMetaData.java
index 18c9000..52dfe99 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixDatabaseMetaData.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixDatabaseMetaData.java
@@ -737,9 +737,10 @@ public class PhoenixDatabaseMetaData implements DatabaseMetaData {
 boolean isSalted = table.getBucketNum()!=null;
 boolean tenantColSkipped = false;
 List columns = table.getColumns();
-columns = Lists.newArrayList(columns.subList(isSalted ? 1 : 0, columns.size()));
+int startOffset = isSalted ? 1 : 0;
+columns = Lists.newArrayList(columns.subList(startOffset, columns.size()));
 for (PColumn column : columns) {
-if (isTenantSpecificConnection && column.equals(table.getPKColumns().get(0))) {
+if (isTenantSpecificConnection && column.equals(table.getPKColumns().get(startOffset))) {
 // skip the tenant column
 tenantColSkipped = true;
 continue;
@@ -874,7 +875,7 @@ public class PhoenixDatabaseMetaData implements DatabaseMetaData {
 byte[] keySeqBytes = ByteUtil.EMPTY_BYTE_ARRAY;
 int pkPos = table.getPKColumns().indexOf(column);
 if (pkPos!=-1) {
-short keySeq = (short) (pkPos + 1 - (isSalted ? 1 : 0) - (tenantColSkipped ? 1 : 0));
+short keySeq = (short) (pkPos + 1 - startOffset - (tenantColSkipped ? 1 : 0));
 keySeqBytes = PSmallint.INSTANCE.toBytes(keySeq);
 }
 cells.add(KeyValueUtil.newKeyValue(rowKey, TABLE_FAMILY_BYTES, KEY_SEQ_BYTES,
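The KEY_SEQ arithmetic above subtracts one slot for the salt byte and one for the hidden tenant column. A quick worked example with invented values:

    // Salted, tenant-specific table whose PK is (SALT, TENANT_ID, ENTITY_ID).
    int pkPos = 2;                    // ENTITY_ID's index in getPKColumns()
    int startOffset = 1;              // salted, so the salt column is skipped
    boolean tenantColSkipped = true;  // tenant connection hides TENANT_ID
    short keySeq = (short) (pkPos + 1 - startOffset - (tenantColSkipped ? 1 : 0));
    // keySeq == 1: ENTITY_ID is reported as the first visible PK column.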



[11/22] phoenix git commit: PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

2018-10-15 Thread jamestaylor
PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

Includes updated L&N for requests-kerberos. Tries to detect when the
host system doesn't have necessary dependencies to run the test

Closes #344

Signed-off-by: Josh Elser 


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/e62be9c8
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/e62be9c8
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/e62be9c8

Branch: refs/heads/omid2
Commit: e62be9c820aaf05266500cc509d4a89658cb6918
Parents: e4d170c
Author: Lev Bronshtein 
Authored: Wed Aug 29 17:19:51 2018 -0400
Committer: Josh Elser 
Committed: Fri Oct 5 18:41:18 2018 -0400

--
 LICENSE |   18 +
 NOTICE  |2 +
 dev/release_files/LICENSE   |   18 +
 dev/release_files/NOTICE|4 +
 .../src/it/bin/test_phoenixdb.py|   39 +
 .../src/it/bin/test_phoenixdb.sh|   79 +
 .../end2end/SecureQueryServerPhoenixDBIT.java   |  424 +
 pom.xml |   14 +-
 python/NEWS.rst |   44 -
 python/README.md|   93 +
 python/README.rst   |  136 --
 python/RELEASING.rst|   12 -
 python/ci/build-env/Dockerfile  |7 -
 python/ci/phoenix/Dockerfile|   33 -
 python/ci/phoenix/docker-entrypoint.sh  |   24 -
 python/ci/phoenix/hbase-site.xml|   12 -
 python/doc/Makefile |  192 --
 python/doc/api.rst  |   30 -
 python/doc/conf.py  |  287 ---
 python/doc/index.rst|   27 -
 python/doc/versions.rst |3 -
 python/docker-compose.yml   |   21 -
 python/examples/basic.py|   27 -
 python/examples/shell.py|   33 -
 python/gen-protobuf.sh  |   38 -
 python/phoenixdb/NEWS.rst   |   44 +
 python/phoenixdb/README.rst |  136 ++
 python/phoenixdb/RELEASING.rst  |   12 +
 python/phoenixdb/__init__.py|   68 -
 python/phoenixdb/avatica/__init__.py|   16 -
 python/phoenixdb/avatica/client.py  |  510 --
 python/phoenixdb/avatica/proto/__init__.py  |0
 python/phoenixdb/avatica/proto/common_pb2.py| 1667 --
 python/phoenixdb/avatica/proto/requests_pb2.py  | 1206 -
 python/phoenixdb/avatica/proto/responses_pb2.py |  917 --
 python/phoenixdb/ci/build-env/Dockerfile|7 +
 python/phoenixdb/ci/phoenix/Dockerfile  |   33 +
 .../phoenixdb/ci/phoenix/docker-entrypoint.sh   |   24 +
 python/phoenixdb/ci/phoenix/hbase-site.xml  |   12 +
 python/phoenixdb/connection.py  |  187 --
 python/phoenixdb/cursor.py  |  347 
 python/phoenixdb/doc/Makefile   |  192 ++
 python/phoenixdb/doc/api.rst|   30 +
 python/phoenixdb/doc/conf.py|  287 +++
 python/phoenixdb/doc/index.rst  |   27 +
 python/phoenixdb/doc/versions.rst   |3 +
 python/phoenixdb/docker-compose.yml |   21 +
 python/phoenixdb/errors.py  |   93 -
 python/phoenixdb/examples/basic.py  |   27 +
 python/phoenixdb/examples/shell.py  |   33 +
 python/phoenixdb/gen-protobuf.sh|   39 +
 python/phoenixdb/phoenixdb/__init__.py  |   72 +
 python/phoenixdb/phoenixdb/avatica/__init__.py  |   16 +
 python/phoenixdb/phoenixdb/avatica/client.py|  502 ++
 .../phoenixdb/avatica/proto/__init__.py |0
 .../phoenixdb/avatica/proto/common_pb2.py   | 1667 ++
 .../phoenixdb/avatica/proto/requests_pb2.py | 1206 +
 .../phoenixdb/avatica/proto/responses_pb2.py|  917 ++
 python/phoenixdb/phoenixdb/connection.py|  187 ++
 python/phoenixdb/phoenixdb/cursor.py|  347 
 python/phoenixdb/phoenixdb/errors.py|   93 +
 python/phoenixdb/phoenixdb/tests/__init__.py|   44 +
 python/phoenixdb/phoenixdb/tests/dbapi20.py |  857 +
 .../phoenixdb/phoenixdb/tests/test_avatica.py   |   25 +
 .../phoenixdb/tests/test_connection.py  |   42 +
 python/phoenixdb/phoenixdb/tests/test_db.py |   99 ++
 .../phoenixdb/phoenixdb/tests/test_dbapi20.py   |  122 ++
 python/phoenixdb/phoenixdb/tests/test_errors.py |   60 +
 python/phoenixdb/phoenixdb/tests/test_types.py  |  327 
 python/phoenixdb/phoenixdb/types.py |  202 +++
 python/phoenixdb

[16/22] phoenix git commit: PHOENIX-3955: Ensure KEEP_DELETED_CELLS, REPLICATION_SCOPE, and TTL properties stay in sync between the physical data table and index tables

2018-10-15 Thread jamestaylor
PHOENIX-3955: Ensure KEEP_DELETED_CELLS, REPLICATION_SCOPE, and TTL properties stay in sync between the physical data table and index tables


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/dd3e55f1
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/dd3e55f1
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/dd3e55f1

Branch: refs/heads/omid2
Commit: dd3e55f10e6e8ee7373a7c8ad18110cb79afa2d7
Parents: 2ded8b6
Author: Chinmay Kulkarni 
Authored: Sun Oct 7 21:12:55 2018 -0700
Committer: Thomas D'Silva 
Committed: Wed Oct 10 13:16:21 2018 -0700

--
 .../apache/phoenix/end2end/AlterTableIT.java|   5 +-
 .../org/apache/phoenix/end2end/BaseQueryIT.java |  15 +-
 .../apache/phoenix/end2end/CreateTableIT.java   |  27 +-
 .../phoenix/end2end/PropertiesInSyncIT.java | 494 +++
 .../end2end/QueryDatabaseMetaDataIT.java|   7 +-
 .../apache/phoenix/end2end/SetPropertyIT.java   |  64 ++-
 .../org/apache/phoenix/end2end/SplitIT.java |  17 +
 .../org/apache/phoenix/tx/TransactionIT.java|   4 +-
 .../phoenix/exception/SQLExceptionCode.java |   6 +-
 .../query/ConnectionQueryServicesImpl.java  | 485 +-
 .../apache/phoenix/schema/MetaDataClient.java   | 112 +++--
 .../apache/phoenix/schema/TableProperty.java|   4 +-
 .../org/apache/phoenix/util/MetaDataUtil.java   |  44 +-
 .../org/apache/phoenix/util/UpgradeUtil.java| 142 +-
 14 files changed, 1187 insertions(+), 239 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/dd3e55f1/phoenix-core/src/it/java/org/apache/phoenix/end2end/AlterTableIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/AlterTableIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/AlterTableIT.java
index 2cac1a6..7af62b3 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/AlterTableIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/AlterTableIT.java
@@ -925,7 +925,8 @@ public class AlterTableIT extends ParallelStatsDisabledIT {
 // set HColumnProperty property when adding a pk column and other key value columns should work
 ddl = "ALTER TABLE "
 + dataTableFullName
-+ " ADD k3 DECIMAL PRIMARY KEY, col2 bigint, CF.col3 bigint IN_MEMORY = true, CF.IN_MEMORY=false, CF.REPLICATION_SCOPE = 1";
++ " ADD k3 DECIMAL PRIMARY KEY, col2 bigint, CF.col3 bigint IN_MEMORY = true,"
++ " CF.IN_MEMORY=false, REPLICATION_SCOPE = 1";
 conn.createStatement().execute(ddl);
 // assert that k3 was added as new pk
 ResultSet rs = conn.getMetaData().getPrimaryKeys("", schemaName, dataTableName);
@@ -946,7 +947,7 @@ public class AlterTableIT extends ParallelStatsDisabledIT {
 assertEquals(2, columnFamilies.length);
 assertEquals("0", columnFamilies[0].getNameAsString());
 assertEquals(true, columnFamilies[0].isInMemory());
-assertEquals(0, columnFamilies[0].getScope());
+assertEquals(1, columnFamilies[0].getScope());
 assertEquals("CF", columnFamilies[1].getNameAsString());
 assertEquals(false, columnFamilies[1].isInMemory());
 assertEquals(1, columnFamilies[1].getScope());

http://git-wip-us.apache.org/repos/asf/phoenix/blob/dd3e55f1/phoenix-core/src/it/java/org/apache/phoenix/end2end/BaseQueryIT.java
--
diff --git a/phoenix-core/src/it/java/org/apache/phoenix/end2end/BaseQueryIT.java b/phoenix-core/src/it/java/org/apache/phoenix/end2end/BaseQueryIT.java
index ed3669c..e88dc57 100644
--- a/phoenix-core/src/it/java/org/apache/phoenix/end2end/BaseQueryIT.java
+++ b/phoenix-core/src/it/java/org/apache/phoenix/end2end/BaseQueryIT.java
@@ -53,20 +53,20 @@ public abstract class BaseQueryIT extends ParallelStatsDisabledIT {
 protected static final String[] GLOBAL_INDEX_DDLS =
 new String[] {
 "CREATE INDEX %s ON %s (a_integer DESC) INCLUDE (" + " A_STRING, "
-+ "B_STRING, " + "A_DATE) %s",
++ "B_STRING, " + "A_DATE)",
 "CREATE INDEX %s ON %s (a_integer, a_string) INCLUDE (" + "B_STRING, "
-+ "A_DATE) %s",
++ "A_DATE)",
 "CREATE INDEX %s ON %s (a_integer) INCLUDE (" + " A_STRING, "
-+ "B_STRING, " + "A_DATE) %s",
++ "B_STRING, " + "A_DATE)",
 NO_IND

[05/22] phoenix git commit: PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

2018-10-15 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/e62be9c8/python/phoenixdb/phoenixdb/cursor.py
--
diff --git a/python/phoenixdb/phoenixdb/cursor.py b/python/phoenixdb/phoenixdb/cursor.py
new file mode 100644
index 000..8be7bed
--- /dev/null
+++ b/python/phoenixdb/phoenixdb/cursor.py
@@ -0,0 +1,347 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import logging
+import collections
+from phoenixdb.types import TypeHelper
+from phoenixdb.errors import ProgrammingError, InternalError
+from phoenixdb.avatica.proto import common_pb2
+
+__all__ = ['Cursor', 'ColumnDescription', 'DictCursor']
+
+logger = logging.getLogger(__name__)
+
+# TODO see note in Cursor.rowcount()
+MAX_INT = 2 ** 64 - 1
+
+ColumnDescription = collections.namedtuple('ColumnDescription', 'name type_code display_size internal_size precision scale null_ok')
+"""Named tuple for representing results from :attr:`Cursor.description`."""
+
+
+class Cursor(object):
+"""Database cursor for executing queries and iterating over results.
+
+You should not construct this object manually, use :meth:`Connection.cursor() ` instead.
+"""
+
+arraysize = 1
+"""
+Read/write attribute specifying the number of rows to fetch
+at a time with :meth:`fetchmany`. It defaults to 1 meaning to
+fetch a single row at a time.
+"""
+
+itersize = 2000
+"""
+Read/write attribute specifying the number of rows to fetch
+from the backend at each network roundtrip during iteration
+on the cursor. The default is 2000.
+"""
+
+def __init__(self, connection, id=None):
+self._connection = connection
+self._id = id
+self._signature = None
+self._column_data_types = []
+self._frame = None
+self._pos = None
+self._closed = False
+self.arraysize = self.__class__.arraysize
+self.itersize = self.__class__.itersize
+self._updatecount = -1
+
+def __del__(self):
+if not self._connection._closed and not self._closed:
+self.close()
+
+def __enter__(self):
+return self
+
+def __exit__(self, exc_type, exc_value, traceback):
+if not self._closed:
+self.close()
+
+def __iter__(self):
+return self
+
+def __next__(self):
+row = self.fetchone()
+if row is None:
+raise StopIteration
+return row
+
+next = __next__
+
+def close(self):
+"""Closes the cursor.
+No further operations are allowed once the cursor is closed.
+
+If the cursor is used in a ``with`` statement, this method will
+be automatically called at the end of the ``with`` block.
+"""
+if self._closed:
+raise ProgrammingError('the cursor is already closed')
+if self._id is not None:
+self._connection._client.close_statement(self._connection._id, self._id)
+self._id = None
+self._signature = None
+self._column_data_types = []
+self._frame = None
+self._pos = None
+self._closed = True
+
+@property
+def closed(self):
+"""Read-only attribute specifying if the cursor is closed or not."""
+return self._closed
+
+@property
+def description(self):
+if self._signature is None:
+return None
+description = []
+for column in self._signature.columns:
+description.append(ColumnDescription(
+column.column_name,
+column.type.name,
+column.display_size,
+None,
+column.precision,
+column.scale,
+None if column.nullable == 2 else bool(column.nullable),
+))
+return description
+
+def _set_id(self, id):
+if self._id is not None and self._id != id:
+self._connection._client.close_statement(self._connection._id, self._id)
+self._id = id
+
+def _set_signature(self, signature):
+self._signature = signature
+self._column_data_types = []
+self._parameter_data_types = []
+if signature is None

[04/22] phoenix git commit: PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

2018-10-15 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/e62be9c8/python/phoenixdb/phoenixdb/types.py
--
diff --git a/python/phoenixdb/phoenixdb/types.py b/python/phoenixdb/phoenixdb/types.py
new file mode 100644
index 000..f41355a
--- /dev/null
+++ b/python/phoenixdb/phoenixdb/types.py
@@ -0,0 +1,202 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import sys
+import time
+import datetime
+from decimal import Decimal
+from phoenixdb.avatica.proto import common_pb2
+
+__all__ = [
+'Date', 'Time', 'Timestamp', 'DateFromTicks', 'TimeFromTicks', 'TimestampFromTicks',
+'Binary', 'STRING', 'BINARY', 'NUMBER', 'DATETIME', 'ROWID', 'BOOLEAN',
+'JAVA_CLASSES', 'JAVA_CLASSES_MAP', 'TypeHelper',
+]
+
+
+def Date(year, month, day):
+"""Constructs an object holding a date value."""
+return datetime.date(year, month, day)
+
+
+def Time(hour, minute, second):
+"""Constructs an object holding a time value."""
+return datetime.time(hour, minute, second)
+
+
+def Timestamp(year, month, day, hour, minute, second):
+"""Constructs an object holding a datetime/timestamp value."""
+return datetime.datetime(year, month, day, hour, minute, second)
+
+
+def DateFromTicks(ticks):
+"""Constructs an object holding a date value from the given UNIX timestamp."""
+return Date(*time.localtime(ticks)[:3])
+
+
+def TimeFromTicks(ticks):
+"""Constructs an object holding a time value from the given UNIX timestamp."""
+return Time(*time.localtime(ticks)[3:6])
+
+
+def TimestampFromTicks(ticks):
+"""Constructs an object holding a datetime/timestamp value from the given UNIX timestamp."""
+return Timestamp(*time.localtime(ticks)[:6])
+
+
+def Binary(value):
+"""Constructs an object capable of holding a binary (long) string value."""
+return bytes(value)
+
+
+def time_from_java_sql_time(n):
+dt = datetime.datetime(1970, 1, 1) + datetime.timedelta(milliseconds=n)
+return dt.time()
+
+
+def time_to_java_sql_time(t):
+return ((t.hour * 60 + t.minute) * 60 + t.second) * 1000 + t.microsecond // 1000
+
+
+def date_from_java_sql_date(n):
+return datetime.date(1970, 1, 1) + datetime.timedelta(days=n)
+
+
+def date_to_java_sql_date(d):
+if isinstance(d, datetime.datetime):
+d = d.date()
+td = d - datetime.date(1970, 1, 1)
+return td.days
+
+
+def datetime_from_java_sql_timestamp(n):
+return datetime.datetime(1970, 1, 1) + datetime.timedelta(milliseconds=n)
+
+
+def datetime_to_java_sql_timestamp(d):
+td = d - datetime.datetime(1970, 1, 1)
+return td.microseconds // 1000 + (td.seconds + td.days * 24 * 3600) * 1000
+
+
+class ColumnType(object):
+
+def __init__(self, eq_types):
+self.eq_types = tuple(eq_types)
+self.eq_types_set = set(eq_types)
+
+def __eq__(self, other):
+return other in self.eq_types_set
+
+def __cmp__(self, other):
+if other in self.eq_types_set:
+return 0
+if other < self.eq_types:
+return 1
+else:
+return -1
+
+
+STRING = ColumnType(['VARCHAR', 'CHAR'])
+"""Type object that can be used to describe string-based columns."""
+
+BINARY = ColumnType(['BINARY', 'VARBINARY'])
+"""Type object that can be used to describe (long) binary columns."""
+
+NUMBER = ColumnType([
+'INTEGER', 'UNSIGNED_INT', 'BIGINT', 'UNSIGNED_LONG', 'TINYINT', 'UNSIGNED_TINYINT',
+'SMALLINT', 'UNSIGNED_SMALLINT', 'FLOAT', 'UNSIGNED_FLOAT', 'DOUBLE', 'UNSIGNED_DOUBLE', 'DECIMAL'
+])
+"""Type object that can be used to describe numeric columns."""
+
+DATETIME = ColumnType(['TIME', 'DATE', 'TIMESTAMP', 'UNSIGNED_TIME', 'UNSIGNED_DATE', 'UNSIGNED_TIMESTAMP'])
+"""Type object that can be used to describe date/time columns."""
+
+ROWID = ColumnType([])
+"""Only implemented for DB API 2.0 compatibility, not used."""
+
+BOOLEAN = ColumnType(['BOOLEAN'])
+"""Type object that can be used to describe boolean columns. This is a 
phoenixdb-specific extension."""
+
+
+# XXX ARRAY
+
+if sys.version_info[0] < 3:
+_long = long  # noqa: F821
+else:
+_long = int
+
+JAVA_CLASSES = {
+'bool_value': [
+('java.lang.Boolean', common_pb2.BOOLEA

[08/22] phoenix git commit: PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

2018-10-15 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/e62be9c8/python/phoenixdb/connection.py
--
diff --git a/python/phoenixdb/connection.py b/python/phoenixdb/connection.py
deleted file mode 100644
index 593a242..000
--- a/python/phoenixdb/connection.py
+++ /dev/null
@@ -1,187 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import logging
-import uuid
-import weakref
-from phoenixdb import errors
-from phoenixdb.avatica.client import OPEN_CONNECTION_PROPERTIES
-from phoenixdb.cursor import Cursor
-from phoenixdb.errors import ProgrammingError
-
-__all__ = ['Connection']
-
-logger = logging.getLogger(__name__)
-
-
-class Connection(object):
-"""Database connection.
-
-You should not construct this object manually, use :func:`~phoenixdb.connect` instead.
-"""
-
-cursor_factory = None
-"""
-The default cursor factory used by :meth:`cursor` if the parameter is not specified.
-"""
-
-def __init__(self, client, cursor_factory=None, **kwargs):
-self._client = client
-self._closed = False
-if cursor_factory is not None:
-self.cursor_factory = cursor_factory
-else:
-self.cursor_factory = Cursor
-self._cursors = []
-# Extract properties to pass to OpenConnectionRequest
-self._connection_args = {}
-# The rest of the kwargs
-self._filtered_args = {}
-for k in kwargs:
-if k in OPEN_CONNECTION_PROPERTIES:
-self._connection_args[k] = kwargs[k]
-else:
-self._filtered_args[k] = kwargs[k]
-self.open()
-self.set_session(**self._filtered_args)
-
-def __del__(self):
-if not self._closed:
-self.close()
-
-def __enter__(self):
-return self
-
-def __exit__(self, exc_type, exc_value, traceback):
-if not self._closed:
-self.close()
-
-def open(self):
-"""Opens the connection."""
-self._id = str(uuid.uuid4())
-self._client.open_connection(self._id, info=self._connection_args)
-
-def close(self):
-"""Closes the connection.
-No further operations are allowed, either on the connection or any
-of its cursors, once the connection is closed.
-
-If the connection is used in a ``with`` statement, this method will
-be automatically called at the end of the ``with`` block.
-"""
-if self._closed:
-raise ProgrammingError('the connection is already closed')
-for cursor_ref in self._cursors:
-cursor = cursor_ref()
-if cursor is not None and not cursor._closed:
-cursor.close()
-self._client.close_connection(self._id)
-self._client.close()
-self._closed = True
-
-@property
-def closed(self):
-"""Read-only attribute specifying if the connection is closed or 
not."""
-return self._closed
-
-def commit(self):
-"""Commits pending database changes.
-
-Currently, this does nothing, because the RPC does not support
-transactions. Only defined for DB API 2.0 compatibility.
-You need to use :attr:`autocommit` mode.
-"""
-# TODO can support be added for this?
-if self._closed:
-raise ProgrammingError('the connection is already closed')
-
-def cursor(self, cursor_factory=None):
-"""Creates a new cursor.
-
-:param cursor_factory:
-This argument can be used to create non-standard cursors.
-The class returned must be a subclass of
-:class:`~phoenixdb.cursor.Cursor` (for example :class:`~phoenixdb.cursor.DictCursor`).
-A default factory for the connection can also be specified using the
-:attr:`cursor_factory` attribute.
-
-:returns:
-A :class:`~phoenixdb.cursor.Cursor` object.
-"""
-if self._closed:
-raise ProgrammingError('the connection is already closed')
-cursor = (cursor_factory or self.cursor_factory)(self)
-self._cursors.append(weakref.ref(cursor, self._cursors.remove))
-
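For reference, the usage pattern the docstrings above describe, as a minimal sketch assuming a Phoenix Query Server on the default port 8765:

    import phoenixdb

    with phoenixdb.connect('http://localhost:8765/', autocommit=True) as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT 1")
        print(cursor.fetchall())
    # __exit__ called close(), which also closed any still-open cursors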

[18/22] phoenix git commit: PHOENIX-4966 Implement unhandledFilters in PhoenixRelation so that spark only evaluates filters when required

2018-10-15 Thread jamestaylor
PHOENIX-4966 Implement unhandledFilters in PhoenixRelation so that spark only 
evaluates filters when required


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/7580b7e9
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/7580b7e9
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/7580b7e9

Branch: refs/heads/omid2
Commit: 7580b7e9249a276f109daf57840b963f7721092a
Parents: ed183ef
Author: Thomas D'Silva 
Authored: Thu Oct 11 15:46:48 2018 -0700
Committer: Thomas D'Silva 
Committed: Fri Oct 12 13:47:04 2018 -0700

--
 .../org/apache/phoenix/spark/PhoenixSparkIT.scala   | 14 +++---
 .../org/apache/phoenix/spark/PhoenixRelation.scala  | 16 
 2 files changed, 19 insertions(+), 11 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/7580b7e9/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala
--
diff --git 
a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala 
b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala
index b8e44fe..4e11acc 100644
--- a/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala
+++ b/phoenix-spark/src/it/scala/org/apache/phoenix/spark/PhoenixSparkIT.scala
@@ -285,13 +285,13 @@ class PhoenixSparkIT extends AbstractPhoenixSparkIT {
 // Make sure we got the right value back
 assert(res.first().getLong(0) == 1L)
 
-/*
-  NOTE: There doesn't appear to be any way of verifying from the Spark 
query planner that
-  filtering is being pushed down and done server-side. However, since 
PhoenixRelation
-  implements PrunedFilteredScan, debugging has shown that both the SELECT 
columns and WHERE
-  predicates are being passed along to us, which we then forward it to 
Phoenix.
-  TODO: investigate further to find a way to verify server-side pushdown
- */
+val plan = res.queryExecution.sparkPlan
+// filters should be pushed into phoenix relation
+assert(plan.toString.contains("PushedFilters: [IsNotNull(COL1), IsNotNull(ID), " +
+  "EqualTo(COL1,test_row_1), EqualTo(ID,1)]"))
+// spark should run the filters on the rows returned by Phoenix
+assert(!plan.toString.contains("Filter (((isnotnull(COL1#8) && isnotnull(ID#7L)) " +
+  "&& (COL1#8 = test_row_1)) && (ID#7L = 1))"))
   }
 
   test("Can persist a dataframe using 'DataFrame.saveToPhoenix'") {

http://git-wip-us.apache.org/repos/asf/phoenix/blob/7580b7e9/phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRelation.scala
--
diff --git 
a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRelation.scala 
b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRelation.scala
index d2eac8c..38bf29a 100644
--- 
a/phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRelation.scala
+++ 
b/phoenix-spark/src/main/scala/org/apache/phoenix/spark/PhoenixRelation.scala
@@ -36,11 +36,12 @@ case class PhoenixRelation(tableName: String, zkUrl: 
String, dateAsTimestamp: Bo
 but this prevents having to load the whole table into Spark first.
   */
   override def buildScan(requiredColumns: Array[String], filters: Array[Filter]): RDD[Row] = {
+val (pushedFilters, unhandledFilters) = buildFilter(filters)
 new PhoenixRDD(
   sqlContext.sparkContext,
   tableName,
   requiredColumns,
-  Some(buildFilter(filters)),
+  Some(pushedFilters),
   Some(zkUrl),
   new Configuration(),
   dateAsTimestamp
@@ -62,12 +63,13 @@ case class PhoenixRelation(tableName: String, zkUrl: 
String, dateAsTimestamp: Bo
 
   // Attempt to create Phoenix-accepted WHERE clauses from Spark filters,
   // mostly inspired from Spark SQL JDBCRDD and the couchbase-spark-connector
-  private def buildFilter(filters: Array[Filter]): String = {
+  private def buildFilter(filters: Array[Filter]): (String, Array[Filter]) = {
 if (filters.isEmpty) {
-  return ""
+  return ("" , Array[Filter]())
 }
 
 val filter = new StringBuilder("")
+var unsupportedFilters = Array[Filter]()
 var i = 0
 
 filters.foreach(f => {
@@ -92,12 +94,18 @@ case class PhoenixRelation(tableName: String, zkUrl: 
String, dateAsTimestamp: Bo
 case StringStartsWith(attr, value) => filter.append(s" ${escapeKey(attr)} LIKE ${compileValue(value + "%")}")
 case StringEndsWith(attr, value) => filter.append(s" ${escapeKey(attr)} LIKE ${compileValue("%" + value)}")
 case StringContains(attr, value) => filter.append(s" ${escapeKey(attr)} LIKE ${compileValue("%" + value + "%")}")
+case _ => unsupportedFilters = unsupportedFilters :+ f
   }
 
   i = i + 1
 })
 
-filter.toSt
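The essence of the change is that buildFilter now reports which filters it could not compile, so Spark re-evaluates only those instead of every row. A minimal Python sketch of the same partitioning idea (compile_one is a hypothetical stand-in for the match cases above):

    def compile_one(f):
        """Hypothetical translator: only equality filters become SQL here."""
        op, attr, value = f
        if op == "=":
            return "\"%s\" = '%s'" % (attr, value)
        return None

    def build_filter(filters):
        """Split filters into a pushed WHERE clause and unhandled leftovers."""
        pushed, unhandled = [], []
        for f in filters:
            sql = compile_one(f)
            if sql is None:
                unhandled.append(f)   # Spark must re-check these client-side
            else:
                pushed.append(sql)
        return " AND ".join(pushed), unhandled

    where, leftover = build_filter([("=", "COL1", "test_row_1"), ("~", "COL1", "x.*")])
    # where == "\"COL1\" = 'test_row_1'"; leftover holds the regex filter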

[09/22] phoenix git commit: PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

2018-10-15 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/e62be9c8/python/phoenixdb/avatica/proto/requests_pb2.py
--
diff --git a/python/phoenixdb/avatica/proto/requests_pb2.py 
b/python/phoenixdb/avatica/proto/requests_pb2.py
deleted file mode 100644
index 203f945..000
--- a/python/phoenixdb/avatica/proto/requests_pb2.py
+++ /dev/null
@@ -1,1206 +0,0 @@
-# Generated by the protocol buffer compiler.  DO NOT EDIT!
-# source: requests.proto
-
-import sys
-_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
-from google.protobuf import descriptor as _descriptor
-from google.protobuf import message as _message
-from google.protobuf import reflection as _reflection
-from google.protobuf import symbol_database as _symbol_database
-from google.protobuf import descriptor_pb2
-# @@protoc_insertion_point(imports)
-
-_sym_db = _symbol_database.Default()
-
-
-from . import common_pb2 as common__pb2
-
-
-DESCRIPTOR = _descriptor.FileDescriptor(
-  name='requests.proto',
-  package='',
-  syntax='proto3',
-  
serialized_pb=_b('\n\x0erequests.proto\x1a\x0c\x63ommon.proto\"(\n\x0f\x43\x61talogsRequest\x12\x15\n\rconnection_id\x18\x01
 \x01(\t\"0\n\x17\x44\x61tabasePropertyRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"P\n\x0eSchemasRequest\x12\x0f\n\x07\x63\x61talog\x18\x01 
\x01(\t\x12\x16\n\x0eschema_pattern\x18\x02 
\x01(\t\x12\x15\n\rconnection_id\x18\x03 
\x01(\t\"\x95\x01\n\rTablesRequest\x12\x0f\n\x07\x63\x61talog\x18\x01 
\x01(\t\x12\x16\n\x0eschema_pattern\x18\x02 
\x01(\t\x12\x1a\n\x12table_name_pattern\x18\x03 
\x01(\t\x12\x11\n\ttype_list\x18\x04 \x03(\t\x12\x15\n\rhas_type_list\x18\x06 
\x01(\x08\x12\x15\n\rconnection_id\x18\x07 
\x01(\t\"*\n\x11TableTypesRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"\x89\x01\n\x0e\x43olumnsRequest\x12\x0f\n\x07\x63\x61talog\x18\x01 
\x01(\t\x12\x16\n\x0eschema_pattern\x18\x02 
\x01(\t\x12\x1a\n\x12table_name_pattern\x18\x03 
\x01(\t\x12\x1b\n\x13\x63olumn_name_pattern\x18\x04 
\x01(\t\x12\x15\n\rconnection_id\x18\x05 \x01(\t\"(\n\x0fTypeInfoReque
 st\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"\xa1\x01\n\x18PrepareAndExecuteRequest\x12\x15\n\rconnection_id\x18\x01
 \x01(\t\x12\x0b\n\x03sql\x18\x02 \x01(\t\x12\x15\n\rmax_row_count\x18\x03 
\x01(\x04\x12\x14\n\x0cstatement_id\x18\x04 
\x01(\r\x12\x16\n\x0emax_rows_total\x18\x05 
\x01(\x03\x12\x1c\n\x14\x66irst_frame_max_size\x18\x06 
\x01(\x05\"c\n\x0ePrepareRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\x0b\n\x03sql\x18\x02 \x01(\t\x12\x15\n\rmax_row_count\x18\x03 
\x01(\x04\x12\x16\n\x0emax_rows_total\x18\x04 
\x01(\x03\"\x80\x01\n\x0c\x46\x65tchRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\x14\n\x0cstatement_id\x18\x02 \x01(\r\x12\x0e\n\x06offset\x18\x03 
\x01(\x04\x12\x1b\n\x13\x66\x65tch_max_row_count\x18\x04 
\x01(\r\x12\x16\n\x0e\x66rame_max_size\x18\x05 
\x01(\x05\"/\n\x16\x43reateStatementRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"D\n\x15\x43loseStatementRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\x14\n\x0cstatement_id\x18\x02 \x01(\r\"\x8b\x01\n\x15Op
 enConnectionRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12.\n\x04info\x18\x02 \x03(\x0b\x32 
.OpenConnectionRequest.InfoEntry\x1a+\n\tInfoEntry\x12\x0b\n\x03key\x18\x01 
\x01(\t\x12\r\n\x05value\x18\x02 
\x01(\t:\x02\x38\x01\"/\n\x16\x43loseConnectionRequest\x12\x15\n\rconnection_id\x18\x01
 \x01(\t\"Y\n\x15\x43onnectionSyncRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12)\n\nconn_props\x18\x02 
\x01(\x0b\x32\x15.ConnectionProperties\"\xc7\x01\n\x0e\x45xecuteRequest\x12)\n\x0fstatementHandle\x18\x01
 \x01(\x0b\x32\x10.StatementHandle\x12%\n\x10parameter_values\x18\x02 
\x03(\x0b\x32\x0b.TypedValue\x12\'\n\x1f\x64\x65precated_first_frame_max_size\x18\x03
 \x01(\x04\x12\x1c\n\x14has_parameter_values\x18\x04 
\x01(\x08\x12\x1c\n\x14\x66irst_frame_max_size\x18\x05 
\x01(\x05\"m\n\x12SyncResultsRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\x14\n\x0cstatement_id\x18\x02 \x01(\r\x12\x1a\n\x05state\x18\x03 
\x01(\x0b\x32\x0b.QueryState\x12\x0e\n\x06offset\x18\x04 \x01(\x04\"&\n\rCommi
 tRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"(\n\x0fRollbackRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\"b\n\x1dPrepareAndExecuteBatchRequest\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\x14\n\x0cstatement_id\x18\x02 
\x01(\r\x12\x14\n\x0csql_commands\x18\x03 
\x03(\t\"4\n\x0bUpdateBatch\x12%\n\x10parameter_values\x18\x01 
\x03(\x0b\x32\x0b.TypedValue\"a\n\x13\x45xecuteBatchRequest\x12\x15\n\rconnection_id\x18\x01
 \x01(\t\x12\x14\n\x0cstatement_id\x18\x02 \x01(\r\x12\x1d\n\x07updates\x18\x03 
\x03(\x0b\x32\x0c.UpdateBatchB\"\n org.apache.calcite.avatica.protob\x06proto3')
-  ,
-  dependencies=[common__pb2.DESCRIPTOR,])
-_sym_db.RegisterFileDescriptor(DESCRIPTOR)
-
-
-
-
-_CATALOGSREQUEST = _descriptor.Descriptor(
-  name='CatalogsRequest',
-  full_name='CatalogsRequest',
-  filename=None,
-  file=DESCRIPTOR,
-  containing_type=None,

[07/22] phoenix git commit: PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

2018-10-15 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/e62be9c8/python/phoenixdb/phoenixdb/avatica/proto/common_pb2.py
--
diff --git a/python/phoenixdb/phoenixdb/avatica/proto/common_pb2.py 
b/python/phoenixdb/phoenixdb/avatica/proto/common_pb2.py
new file mode 100644
index 000..3c99502
--- /dev/null
+++ b/python/phoenixdb/phoenixdb/avatica/proto/common_pb2.py
@@ -0,0 +1,1667 @@
+# Generated by the protocol buffer compiler.  DO NOT EDIT!
+# source: common.proto
+
+import sys
+_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
+from google.protobuf.internal import enum_type_wrapper
+from google.protobuf import descriptor as _descriptor
+from google.protobuf import message as _message
+from google.protobuf import reflection as _reflection
+from google.protobuf import symbol_database as _symbol_database
+from google.protobuf import descriptor_pb2
+# @@protoc_insertion_point(imports)
+
+_sym_db = _symbol_database.Default()
+
+
+
+
+DESCRIPTOR = _descriptor.FileDescriptor(
+  name='common.proto',
+  package='',
+  syntax='proto3',
+  
serialized_pb=_b('\n\x0c\x63ommon.proto\"\xc0\x01\n\x14\x43onnectionProperties\x12\x10\n\x08is_dirty\x18\x01
 \x01(\x08\x12\x13\n\x0b\x61uto_commit\x18\x02 
\x01(\x08\x12\x17\n\x0fhas_auto_commit\x18\x07 
\x01(\x08\x12\x11\n\tread_only\x18\x03 
\x01(\x08\x12\x15\n\rhas_read_only\x18\x08 
\x01(\x08\x12\x1d\n\x15transaction_isolation\x18\x04 
\x01(\r\x12\x0f\n\x07\x63\x61talog\x18\x05 \x01(\t\x12\x0e\n\x06schema\x18\x06 
\x01(\t\"S\n\x0fStatementHandle\x12\x15\n\rconnection_id\x18\x01 
\x01(\t\x12\n\n\x02id\x18\x02 \x01(\r\x12\x1d\n\tsignature\x18\x03 
\x01(\x0b\x32\n.Signature\"\xb0\x01\n\tSignature\x12 \n\x07\x63olumns\x18\x01 
\x03(\x0b\x32\x0f.ColumnMetaData\x12\x0b\n\x03sql\x18\x02 
\x01(\t\x12%\n\nparameters\x18\x03 
\x03(\x0b\x32\x11.AvaticaParameter\x12&\n\x0e\x63ursor_factory\x18\x04 
\x01(\x0b\x32\x0e.CursorFactory\x12%\n\rstatementType\x18\x05 
\x01(\x0e\x32\x0e.StatementType\"\xad\x03\n\x0e\x43olumnMetaData\x12\x0f\n\x07ordinal\x18\x01
 \x01(\r\x12\x16\n\x0e\x61uto_increment\x18\x02 \x
 01(\x08\x12\x16\n\x0e\x63\x61se_sensitive\x18\x03 
\x01(\x08\x12\x12\n\nsearchable\x18\x04 
\x01(\x08\x12\x10\n\x08\x63urrency\x18\x05 
\x01(\x08\x12\x10\n\x08nullable\x18\x06 \x01(\r\x12\x0e\n\x06signed\x18\x07 
\x01(\x08\x12\x14\n\x0c\x64isplay_size\x18\x08 \x01(\r\x12\r\n\x05label\x18\t 
\x01(\t\x12\x13\n\x0b\x63olumn_name\x18\n 
\x01(\t\x12\x13\n\x0bschema_name\x18\x0b \x01(\t\x12\x11\n\tprecision\x18\x0c 
\x01(\r\x12\r\n\x05scale\x18\r \x01(\r\x12\x12\n\ntable_name\x18\x0e 
\x01(\t\x12\x14\n\x0c\x63\x61talog_name\x18\x0f 
\x01(\t\x12\x11\n\tread_only\x18\x10 \x01(\x08\x12\x10\n\x08writable\x18\x11 
\x01(\x08\x12\x1b\n\x13\x64\x65\x66initely_writable\x18\x12 
\x01(\x08\x12\x19\n\x11\x63olumn_class_name\x18\x13 
\x01(\t\x12\x1a\n\x04type\x18\x14 
\x01(\x0b\x32\x0c.AvaticaType\"}\n\x0b\x41vaticaType\x12\n\n\x02id\x18\x01 
\x01(\r\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\x11\n\x03rep\x18\x03 
\x01(\x0e\x32\x04.Rep\x12 \n\x07\x63olumns\x18\x04 
\x03(\x0b\x32\x0f.ColumnMetaData\x12\x1f\n\tcomponent\x18
 \x05 
\x01(\x0b\x32\x0c.AvaticaType\"\x91\x01\n\x10\x41vaticaParameter\x12\x0e\n\x06signed\x18\x01
 \x01(\x08\x12\x11\n\tprecision\x18\x02 \x01(\r\x12\r\n\x05scale\x18\x03 
\x01(\r\x12\x16\n\x0eparameter_type\x18\x04 
\x01(\r\x12\x11\n\ttype_name\x18\x05 \x01(\t\x12\x12\n\nclass_name\x18\x06 
\x01(\t\x12\x0c\n\x04name\x18\x07 
\x01(\t\"\xb3\x01\n\rCursorFactory\x12#\n\x05style\x18\x01 
\x01(\x0e\x32\x14.CursorFactory.Style\x12\x12\n\nclass_name\x18\x02 
\x01(\t\x12\x13\n\x0b\x66ield_names\x18\x03 
\x03(\t\"T\n\x05Style\x12\n\n\x06OBJECT\x10\x00\x12\n\n\x06RECORD\x10\x01\x12\x15\n\x11RECORD_PROJECTION\x10\x02\x12\t\n\x05\x41RRAY\x10\x03\x12\x08\n\x04LIST\x10\x04\x12\x07\n\x03MAP\x10\x05\"9\n\x05\x46rame\x12\x0e\n\x06offset\x18\x01
 \x01(\x04\x12\x0c\n\x04\x64one\x18\x02 \x01(\x08\x12\x12\n\x04rows\x18\x03 
\x03(\x0b\x32\x04.Row\"\"\n\x03Row\x12\x1b\n\x05value\x18\x01 
\x03(\x0b\x32\x0c.ColumnValue\"3\n\x10\x44\x61tabaseProperty\x12\x0c\n\x04name\x18\x01
 \x01(\t\x12\x11\n\tfunctions\x18\x02 \x03(
 \t\"4\n\x0bWireMessage\x12\x0c\n\x04name\x18\x01 
\x01(\t\x12\x17\n\x0fwrapped_message\x18\x02 
\x01(\x0c\"\x87\x01\n\x0b\x43olumnValue\x12\x1a\n\x05value\x18\x01 
\x03(\x0b\x32\x0b.TypedValue\x12 \n\x0b\x61rray_value\x18\x02 
\x03(\x0b\x32\x0b.TypedValue\x12\x17\n\x0fhas_array_value\x18\x03 
\x01(\x08\x12!\n\x0cscalar_value\x18\x04 
\x01(\x0b\x32\x0b.TypedValue\"\xf2\x01\n\nTypedValue\x12\x12\n\x04type\x18\x01 
\x01(\x0e\x32\x04.Rep\x12\x12\n\nbool_value\x18\x02 
\x01(\x08\x12\x14\n\x0cstring_value\x18\x03 
\x01(\t\x12\x14\n\x0cnumber_value\x18\x04 
\x01(\x12\x12\x13\n\x0b\x62ytes_value\x18\x05 
\x01(\x0c\x12\x14\n\x0c\x64ouble_value\x18\x06 
\x01(\x01\x12\x0c\n\x04null\x18\x07 \x01(\x08\x12 \n\x0b\x61rray_value\x18\x08 
\x03(\x0b\x32\x0b.TypedValue\x12\x1c\n\x0e\x63omponent_type\x18\t 
\x01(\x0e\x32\x04.Rep\

[03/22] phoenix git commit: PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

2018-10-15 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/e62be9c8/python/phoenixdb/types.py
--
diff --git a/python/phoenixdb/types.py b/python/phoenixdb/types.py
deleted file mode 100644
index f41355a..000
--- a/python/phoenixdb/types.py
+++ /dev/null
@@ -1,202 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import sys
-import time
-import datetime
-from decimal import Decimal
-from phoenixdb.avatica.proto import common_pb2
-
-__all__ = [
-'Date', 'Time', 'Timestamp', 'DateFromTicks', 'TimeFromTicks', 'TimestampFromTicks',
-'Binary', 'STRING', 'BINARY', 'NUMBER', 'DATETIME', 'ROWID', 'BOOLEAN',
-'JAVA_CLASSES', 'JAVA_CLASSES_MAP', 'TypeHelper',
-]
-
-
-def Date(year, month, day):
-"""Constructs an object holding a date value."""
-return datetime.date(year, month, day)
-
-
-def Time(hour, minute, second):
-"""Constructs an object holding a time value."""
-return datetime.time(hour, minute, second)
-
-
-def Timestamp(year, month, day, hour, minute, second):
-"""Constructs an object holding a datetime/timestamp value."""
-return datetime.datetime(year, month, day, hour, minute, second)
-
-
-def DateFromTicks(ticks):
-"""Constructs an object holding a date value from the given UNIX 
timestamp."""
-return Date(*time.localtime(ticks)[:3])
-
-
-def TimeFromTicks(ticks):
-"""Constructs an object holding a time value from the given UNIX 
timestamp."""
-return Time(*time.localtime(ticks)[3:6])
-
-
-def TimestampFromTicks(ticks):
-"""Constructs an object holding a datetime/timestamp value from the given 
UNIX timestamp."""
-return Timestamp(*time.localtime(ticks)[:6])
-
-
-def Binary(value):
-"""Constructs an object capable of holding a binary (long) string value."""
-return bytes(value)
-
-
-def time_from_java_sql_time(n):
-dt = datetime.datetime(1970, 1, 1) + datetime.timedelta(milliseconds=n)
-return dt.time()
-
-
-def time_to_java_sql_time(t):
-return ((t.hour * 60 + t.minute) * 60 + t.second) * 1000 + t.microsecond // 1000
-
-
-def date_from_java_sql_date(n):
-return datetime.date(1970, 1, 1) + datetime.timedelta(days=n)
-
-
-def date_to_java_sql_date(d):
-if isinstance(d, datetime.datetime):
-d = d.date()
-td = d - datetime.date(1970, 1, 1)
-return td.days
-
-
-def datetime_from_java_sql_timestamp(n):
-return datetime.datetime(1970, 1, 1) + datetime.timedelta(milliseconds=n)
-
-
-def datetime_to_java_sql_timestamp(d):
-td = d - datetime.datetime(1970, 1, 1)
-return td.microseconds // 1000 + (td.seconds + td.days * 24 * 3600) * 1000
-
-
-class ColumnType(object):
-
-def __init__(self, eq_types):
-self.eq_types = tuple(eq_types)
-self.eq_types_set = set(eq_types)
-
-def __eq__(self, other):
-return other in self.eq_types_set
-
-def __cmp__(self, other):
-if other in self.eq_types_set:
-return 0
-if other < self.eq_types:
-return 1
-else:
-return -1
-
-
-STRING = ColumnType(['VARCHAR', 'CHAR'])
-"""Type object that can be used to describe string-based columns."""
-
-BINARY = ColumnType(['BINARY', 'VARBINARY'])
-"""Type object that can be used to describe (long) binary columns."""
-
-NUMBER = ColumnType([
-'INTEGER', 'UNSIGNED_INT', 'BIGINT', 'UNSIGNED_LONG', 'TINYINT', 'UNSIGNED_TINYINT',
-'SMALLINT', 'UNSIGNED_SMALLINT', 'FLOAT', 'UNSIGNED_FLOAT', 'DOUBLE', 'UNSIGNED_DOUBLE', 'DECIMAL'
-])
-"""Type object that can be used to describe numeric columns."""
-
-DATETIME = ColumnType(['TIME', 'DATE', 'TIMESTAMP', 'UNSIGNED_TIME', 'UNSIGNED_DATE', 'UNSIGNED_TIMESTAMP'])
-"""Type object that can be used to describe date/time columns."""
-
-ROWID = ColumnType([])
-"""Only implemented for DB API 2.0 compatibility, not used."""
-
-BOOLEAN = ColumnType(['BOOLEAN'])
-"""Type object that can be used to describe boolean columns. This is a 
phoenixdb-specific extension."""
-
-
-# XXX ARRAY
-
-if sys.version_info[0] < 3:
-_long = long  # noqa: F821
-else:
-_long = int
-
-JAVA_CLASSES = {
-'bool_value': [
-('java.lang.Boolean', common_pb2.BOOLEAN, None, None),
-],
-'string_

[01/22] phoenix git commit: PHOENIX-4946 Switch from HC's annotations (since removed) to JCIP annotations

2018-10-15 Thread jamestaylor
Repository: phoenix
Updated Branches:
  refs/heads/omid2 c6a45bcf1 -> 24ffcf5c8


PHOENIX-4946 Switch from HC's annotations (since removed) to JCIP annotations

Avoids an old httpclient artifact conflicting with Hadoop3 implementation.

Signed-off-by: Sergey Soldatov 


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/e4d170c8
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/e4d170c8
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/e4d170c8

Branch: refs/heads/omid2
Commit: e4d170c841215496d02b06a6bbdb6b5c12a0f4c5
Parents: fd9af91
Author: Josh Elser 
Authored: Wed Oct 3 17:43:05 2018 -0400
Committer: Josh Elser 
Committed: Fri Oct 5 11:47:59 2018 -0400

--
 phoenix-core/pom.xml   | 6 --
 .../src/main/java/org/apache/phoenix/cache/HashCache.java  | 3 ++-
 .../main/java/org/apache/phoenix/compile/GroupByCompiler.java  | 3 ++-
 .../java/org/apache/phoenix/memory/ChildMemoryManager.java | 5 +++--
 .../java/org/apache/phoenix/memory/GlobalMemoryManager.java| 4 +++-
 .../main/java/org/apache/phoenix/parse/FunctionParseNode.java  | 3 ++-
 .../src/main/java/org/apache/phoenix/query/QueryServices.java  | 3 ++-
 .../src/main/java/org/apache/phoenix/schema/ColumnRef.java | 3 ++-
 .../main/java/org/apache/phoenix/schema/KeyValueSchema.java| 3 ++-
 .../src/main/java/org/apache/phoenix/schema/PNameImpl.java | 5 +++--
 10 files changed, 21 insertions(+), 17 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/e4d170c8/phoenix-core/pom.xml
--
diff --git a/phoenix-core/pom.xml b/phoenix-core/pom.xml
index aa59b72..3a32407 100644
--- a/phoenix-core/pom.xml
+++ b/phoenix-core/pom.xml
@@ -283,12 +283,6 @@
   protobuf-java
   ${protobuf-java.version}
 
-
-
-  org.apache.httpcomponents
-  httpclient
-  4.0.1
-
 
   log4j
   log4j

http://git-wip-us.apache.org/repos/asf/phoenix/blob/e4d170c8/phoenix-core/src/main/java/org/apache/phoenix/cache/HashCache.java
--
diff --git a/phoenix-core/src/main/java/org/apache/phoenix/cache/HashCache.java 
b/phoenix-core/src/main/java/org/apache/phoenix/cache/HashCache.java
index 764fd17..80e37ce 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/cache/HashCache.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/cache/HashCache.java
@@ -21,7 +21,8 @@ import java.io.Closeable;
 import java.io.IOException;
 import java.util.List;
 
-import org.apache.http.annotation.Immutable;
+import net.jcip.annotations.Immutable;
+
 import org.apache.phoenix.hbase.index.util.ImmutableBytesPtr;
 import org.apache.phoenix.schema.tuple.Tuple;
 

http://git-wip-us.apache.org/repos/asf/phoenix/blob/e4d170c8/phoenix-core/src/main/java/org/apache/phoenix/compile/GroupByCompiler.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/GroupByCompiler.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/GroupByCompiler.java
index 0a9e1bc..4777c29 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/GroupByCompiler.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/GroupByCompiler.java
@@ -23,8 +23,9 @@ import java.util.Collections;
 import java.util.Comparator;
 import java.util.List;
 
+import net.jcip.annotations.Immutable;
+
 import org.apache.hadoop.hbase.util.Pair;
-import org.apache.http.annotation.Immutable;
 import org.apache.phoenix.compile.OrderPreservingTracker.Ordering;
 import org.apache.phoenix.coprocessor.BaseScannerRegionObserver;
 import org.apache.phoenix.exception.SQLExceptionCode;

http://git-wip-us.apache.org/repos/asf/phoenix/blob/e4d170c8/phoenix-core/src/main/java/org/apache/phoenix/memory/ChildMemoryManager.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/memory/ChildMemoryManager.java 
b/phoenix-core/src/main/java/org/apache/phoenix/memory/ChildMemoryManager.java
index da009fb..f5ad5dd 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/memory/ChildMemoryManager.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/memory/ChildMemoryManager.java
@@ -17,8 +17,9 @@
  */
 package org.apache.phoenix.memory;
 
-import org.apache.http.annotation.GuardedBy;
-import org.apache.http.annotation.ThreadSafe;
+import net.jcip.annotations.GuardedBy;
+import net.jcip.annotations.ThreadSafe;
+
 import org.apache.phoenix.exception.SQLExceptionCode;
 import org.apache.phoenix.exception.SQLExceptionInfo;
 

http://git-wip-us.apache.org/repos/asf/phoenix/blob/e4d170c8/phoenix-core/src/main/java

[02/22] phoenix git commit: PHOENIX-4688 Support SPNEGO for python driver via requests-kerberos

2018-10-15 Thread jamestaylor
http://git-wip-us.apache.org/repos/asf/phoenix/blob/e62be9c8/python/requests-kerberos/tests/test_requests_kerberos.py
--
diff --git a/python/requests-kerberos/tests/test_requests_kerberos.py 
b/python/requests-kerberos/tests/test_requests_kerberos.py
new file mode 100644
index 000..ebaca37
--- /dev/null
+++ b/python/requests-kerberos/tests/test_requests_kerberos.py
@@ -0,0 +1,904 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+
+"""Tests for requests_kerberos."""
+
+import base64
+from mock import Mock, patch
+from requests.compat import urlparse
+import requests
+import warnings
+
+
+try:
+import kerberos
+kerberos_module_name='kerberos'
+except ImportError:
+import winkerberos as kerberos  # On Windows
+kerberos_module_name = 'winkerberos'
+
+import requests_kerberos
+import unittest
+from requests_kerberos.kerberos_ import _get_certificate_hash
+
+# kerberos.authGSSClientInit() is called with the service name (HTTP@FQDN) and
+# returns 1 and a kerberos context object on success. Returns -1 on failure.
+clientInit_complete = Mock(return_value=(1, "CTX"))
+clientInit_error = Mock(return_value=(-1, "CTX"))
+
+# kerberos.authGSSClientStep() is called with the kerberos context object
+# returned by authGSSClientInit and the negotiate auth token provided in the
+# http response's www-authenticate header. It returns 0 or 1 on success; 0
+# indicates that authentication is progressing but not complete.
+clientStep_complete = Mock(return_value=1)
+clientStep_continue = Mock(return_value=0)
+clientStep_error = Mock(return_value=-1)
+clientStep_exception = Mock(side_effect=kerberos.GSSError)
+
+# kerberos.authGSSClientResponse() is called with the kerberos context which
+# was initially returned by authGSSClientInit and has been mutated by a call to
+# authGSSClientStep. It returns a string.
+clientResponse = Mock(return_value="GSSRESPONSE")
+
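Taken together, the mocks stand in for this underlying pykerberos call sequence; a condensed sketch (the service string is illustrative, and the first challenge is empty just as in preemptive auth):

    import kerberos

    result, ctx = kerberos.authGSSClientInit("HTTP@www.example.org")  # 1 on success
    kerberos.authGSSClientStep(ctx, "")             # 0 = continue, 1 = complete
    client_token = kerberos.authGSSClientResponse(ctx)
    headers = {"Authorization": "Negotiate %s" % client_token}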
+# Note: we're not using the @mock.patch decorator:
+# > My only word of warning is that in the past, the patch decorator hides
+# > tests when using the standard unittest library.
+# > -- sigmavirus24 in https://github.com/requests/requests-kerberos/issues/1
+
+
+class KerberosTestCase(unittest.TestCase):
+
+def setUp(self):
+"""Setup."""
+clientInit_complete.reset_mock()
+clientInit_error.reset_mock()
+clientStep_complete.reset_mock()
+clientStep_continue.reset_mock()
+clientStep_error.reset_mock()
+clientStep_exception.reset_mock()
+clientResponse.reset_mock()
+
+def tearDown(self):
+"""Teardown."""
+pass
+
+def test_negotate_value_extraction(self):
+response = requests.Response()
+response.headers = {'www-authenticate': 'negotiate token'}
+self.assertEqual(
+requests_kerberos.kerberos_._negotiate_value(response),
+'token'
+)
+
+def test_negotate_value_extraction_none(self):
+response = requests.Response()
+response.headers = {}
+self.assertTrue(
+requests_kerberos.kerberos_._negotiate_value(response) is None
+)
+
+def test_force_preemptive(self):
+with patch.multiple(kerberos_module_name,
+authGSSClientInit=clientInit_complete,
+authGSSClientResponse=clientResponse,
+authGSSClientStep=clientStep_continue):
+auth = requests_kerberos.HTTPKerberosAuth(force_preemptive=True)
+
+request = requests.Request(url="http://www.example.org")
+
+auth.__call__(request)
+
+self.assertTrue('Authorization' in request.headers)
+self.assertEqual(request.headers.get('Authorization'), 'Negotiate GSSRESPONSE')
+
+def test_no_force_preemptive(self):
+with patch.multiple(kerberos_module_name,
+authGSSClientInit=clientInit_complete,
+authGSSClientResponse=clientResponse,
+authGSSClientStep=clientStep_continue):
+auth = requests_kerberos.HTTPKerberosAuth()
+
+request = requests.Request(url="http://www.example.org")
+
+auth.__call__(request)
+
+self.assertTrue('Authorization' not in request.headers)
+
+def test_generate_request_header(self):
+with patch.multiple(kerberos_module_name,
+authGSSClientInit=clientInit_complete,
+authGSSClientResponse=clientResponse,
+authGSSClientStep=clientStep_continue):
+response = requests.Response()
+response.url = "http://www.example.org/"
+response.headers = {'www-authenticate': 'negotiate token'}
+host = urlparse(response.url).hostname
+auth = requests_kerberos.HTTPKerberosAuth()
+self.assertEqual(
+aut

[17/22] phoenix git commit: PHOENIX-4964 ORDER BY should use a LOCAL index even if the query is not fully covered.

2018-10-15 Thread jamestaylor
PHOENIX-4964 ORDER BY should use a LOCAL index even if the query is not fully 
covered.


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/ed183eff
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/ed183eff
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/ed183eff

Branch: refs/heads/omid2
Commit: ed183eff0003c3be5d0f2a0b581a1f20f3b1857b
Parents: dd3e55f
Author: Lars Hofhansl 
Authored: Thu Oct 11 22:49:07 2018 -0700
Committer: Lars Hofhansl 
Committed: Thu Oct 11 22:49:07 2018 -0700

--
 .../phoenix/end2end/index/LocalIndexIT.java | 59 
 .../apache/phoenix/optimize/QueryOptimizer.java |  9 ++-
 2 files changed, 66 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/ed183eff/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
index e260969..5a59c81 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
@@ -266,6 +266,65 @@ public class LocalIndexIT extends BaseLocalIndexIT {
 }
 indexTable.close();
 }
+
+@Test
+public void testLocalIndexUsedForUncoveredOrderBy() throws Exception {
+String tableName = schemaName + "." + generateUniqueName();
+String indexName = "IDX_" + generateUniqueName();
+TableName physicalTableName = 
SchemaUtil.getPhysicalTableName(tableName.getBytes(), isNamespaceMapped);
+String indexPhysicalTableName = physicalTableName.getNameAsString();
+
+createBaseTable(tableName, null, "('e','i','o')");
+try (Connection conn1 = getConnection()) {
+conn1.createStatement().execute("UPSERT INTO " + tableName + " 
values('b',1,2,4,'z')");
+conn1.createStatement().execute("UPSERT INTO " + tableName + " 
values('f',1,2,3,'a')");
+conn1.createStatement().execute("UPSERT INTO " + tableName + " 
values('j',2,4,2,'a')");
+conn1.createStatement().execute("UPSERT INTO " + tableName + " 
values('q',3,1,1,'c')");
+conn1.commit();
+conn1.createStatement().execute("CREATE LOCAL INDEX " + indexName 
+ " ON " + tableName + "(v1)");
+
+String query = "SELECT * FROM " + tableName +" ORDER BY V1";
+ResultSet rs = conn1.createStatement().executeQuery("EXPLAIN "+ 
query);
+
+HBaseAdmin admin = driver.getConnectionQueryServices(getUrl(), 
TestUtil.TEST_PROPERTIES).getAdmin();
+int numRegions = admin.getTableRegions(physicalTableName).size();
+
+assertEquals(
+"CLIENT PARALLEL " + numRegions + "-WAY RANGE SCAN OVER "
++ indexPhysicalTableName + " [1]\n"
++ "SERVER FILTER BY FIRST KEY ONLY\n"
++ "CLIENT MERGE SORT",
+QueryUtil.getExplainPlan(rs));
+
+rs = conn1.createStatement().executeQuery(query);
+String v = "";
+while(rs.next()) {
+String next = rs.getString("v1");
+assertTrue(v.compareTo(next) <= 0);
+v = next;
+}
+rs.close();
+
+query = "SELECT * FROM " + tableName +" ORDER BY V1 DESC NULLS 
LAST";
+rs = conn1.createStatement().executeQuery("EXPLAIN "+ query);
+assertEquals(
+"CLIENT PARALLEL " + numRegions + "-WAY REVERSE RANGE SCAN 
OVER "
++ indexPhysicalTableName + " [1]\n"
++ "SERVER FILTER BY FIRST KEY ONLY\n"
++ "CLIENT MERGE SORT",
+QueryUtil.getExplainPlan(rs));
+
+rs = conn1.createStatement().executeQuery(query);
+v = "zz";
+while(rs.next()) {
+String next = rs.getString("v1");
+assertTrue(v.compareTo(next) >= 0);
+v = next;
+}
+rs.close();
+
+}
+}
 
 @Test
 public void testLocalIndexScanJoinColumnsFromDataTable() throws Exception {

http://git-wip-us.apache.org/repos/asf/phoenix/blob/ed183eff/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java 
b/phoenix-core/src/main/java/org/apache/phoenix/optimize/QueryOptimizer.java
index 6d668cc.
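The observable effect of the optimizer change is exactly what the new test asserts: an ORDER BY on a column covered only by a LOCAL index now compiles to a range scan over the index plus a client merge sort, rather than a full data-table scan. A hedged reproduction from the Python driver (URL and table name are illustrative, not part of this commit):

    import phoenixdb

    # Assumes a local index on V1, as in testLocalIndexUsedForUncoveredOrderBy.
    conn = phoenixdb.connect('http://localhost:8765/', autocommit=True)
    cursor = conn.cursor()
    cursor.execute("EXPLAIN SELECT * FROM MY_TABLE ORDER BY V1")
    plan = "\n".join(row[0] for row in cursor.fetchall())
    assert "RANGE SCAN OVER" in plan and "CLIENT MERGE SORT" in plan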

[22/22] phoenix git commit: Use Java 1.7 for 4.x-HBase-1.3

2018-10-15 Thread jamestaylor
Use Java 1.7 for 4.x-HBase-1.3


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/24ffcf5c
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/24ffcf5c
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/24ffcf5c

Branch: refs/heads/omid2
Commit: 24ffcf5c8ec7abbbf4e304aae20c40460091072c
Parents: 26f13fb
Author: James Taylor 
Authored: Mon Oct 15 15:16:16 2018 -0700
Committer: James Taylor 
Committed: Mon Oct 15 15:16:16 2018 -0700

--
 pom.xml | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/24ffcf5c/pom.xml
--
diff --git a/pom.xml b/pom.xml
index c3784ff..f6a8a6a 100644
--- a/pom.xml
+++ b/pom.xml
@@ -142,8 +142,8 @@
   maven-compiler-plugin
   3.0
   
-1.8
-1.8
+1.7
+1.7
   
 
 

[14/22] phoenix git commit: PHOENIX-4859 Using local index in where statement for join (only rhs table) query fails (Rajeshbabu)

2018-10-15 Thread jamestaylor
PHOENIX-4859 Using local index in where statement for join (only rhs table) query fails (Rajeshbabu)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/2ded8b64
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/2ded8b64
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/2ded8b64

Branch: refs/heads/omid2
Commit: 2ded8b64cdea48e89e7a0c936a59913a00345416
Parents: c90d090
Author: Rajeshbabu Chintaguntla 
Authored: Tue Oct 9 16:04:41 2018 +0530
Committer: Rajeshbabu Chintaguntla 
Committed: Tue Oct 9 16:04:41 2018 +0530

--
 .../phoenix/end2end/index/LocalIndexIT.java | 29 
 .../phoenix/compile/ExpressionCompiler.java |  2 +-
 .../apache/phoenix/compile/JoinCompiler.java|  2 +-
 .../phoenix/compile/ProjectionCompiler.java |  4 +--
 .../compile/TupleProjectionCompiler.java|  2 +-
 .../phoenix/schema/LocalIndexDataColumnRef.java | 18 ++--
 6 files changed, 44 insertions(+), 13 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/2ded8b64/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
index ed1cf45..e260969 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
@@ -684,6 +684,35 @@ public class LocalIndexIT extends BaseLocalIndexIT {
 conn1.close();
 }
 
+@Test
+public void testLocalIndexSelfJoin() throws Exception {
+  String tableName = generateUniqueName();
+  String indexName = "IDX_" + generateUniqueName();
+  Connection conn1 = DriverManager.getConnection(getUrl());
+  if (isNamespaceMapped) {
+  conn1.createStatement().execute("CREATE SCHEMA IF NOT EXISTS " + 
schemaName);
+  }
+String ddl =
+"CREATE TABLE "
++ tableName
++ " (customer_id integer primary key, postal_code 
varchar, country_code varchar)";
+conn1.createStatement().execute(ddl);
+conn1.createStatement().execute("UPSERT INTO " + tableName + " 
values(1,'560103','IN')");
+conn1.commit();
+conn1.createStatement().execute(
+"CREATE LOCAL INDEX " + indexName + " ON " + tableName + 
"(postal_code)");
+ResultSet rs =
+conn1.createStatement()
+.executeQuery(
+"SELECT * from "
++ tableName
++ " c1, "
++ tableName
++ " c2 where c1.customer_id=c2.customer_id 
and c2.postal_code='560103'");
+assertTrue(rs.next());
+conn1.close();
+}
+
 private void copyLocalIndexHFiles(Configuration conf, HRegionInfo 
fromRegion, HRegionInfo toRegion, boolean move)
 throws IOException {
 Path root = FSUtils.getRootDir(conf);

http://git-wip-us.apache.org/repos/asf/phoenix/blob/2ded8b64/phoenix-core/src/main/java/org/apache/phoenix/compile/ExpressionCompiler.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/ExpressionCompiler.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/ExpressionCompiler.java
index 9daa744..077e1af 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/ExpressionCompiler.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/ExpressionCompiler.java
@@ -376,7 +376,7 @@ public class ExpressionCompiler extends 
UnsupportedAllParseNodeVisitorhttp://git-wip-us.apache.org/repos/asf/phoenix/blob/2ded8b64/phoenix-core/src/main/java/org/apache/phoenix/compile/JoinCompiler.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/compile/JoinCompiler.java 
b/phoenix-core/src/main/java/org/apache/phoenix/compile/JoinCompiler.java
index 36bfc5f..880fa72 100644
--- a/phoenix-core/src/main/java/org/apache/phoenix/compile/JoinCompiler.java
+++ b/phoenix-core/src/main/java/org/apache/phoenix/compile/JoinCompiler.java
@@ -869,7 +869,7 @@ public class JoinCompiler {
 if (columnRef.getTableRef().equals(tableRef)
 && (!retainPKColumns || 
!SchemaUtil.isPKColumn(columnRef.getColumn( {
 if (columnRef instanceof LocalIndexColumnRef) {
-sourceColumns.add(new 
LocalIndexDataColumn

[21/22] phoenix git commit: Merge branch '4.x-HBase-1.3' into omid2

2018-10-15 Thread jamestaylor
Merge branch '4.x-HBase-1.3' into omid2


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/26f13fb3
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/26f13fb3
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/26f13fb3

Branch: refs/heads/omid2
Commit: 26f13fb31812f30757198dafe095615a374880cf
Parents: c6a45bc db02372
Author: James Taylor 
Authored: Mon Oct 15 15:10:54 2018 -0700
Committer: James Taylor 
Committed: Mon Oct 15 15:10:54 2018 -0700

--
 LICENSE |   18 +
 NOTICE  |2 +
 dev/release_files/LICENSE   |   18 +
 dev/release_files/NOTICE|4 +
 phoenix-core/pom.xml|6 -
 .../apache/phoenix/end2end/AlterTableIT.java|5 +-
 .../org/apache/phoenix/end2end/BaseQueryIT.java |   15 +-
 .../apache/phoenix/end2end/CreateTableIT.java   |   27 +-
 .../phoenix/end2end/PropertiesInSyncIT.java |  494 ++
 .../end2end/QueryDatabaseMetaDataIT.java|7 +-
 .../org/apache/phoenix/end2end/QueryMoreIT.java |7 +-
 .../apache/phoenix/end2end/SetPropertyIT.java   |   64 +-
 .../org/apache/phoenix/end2end/SplitIT.java |   17 +
 .../phoenix/end2end/index/LocalIndexIT.java |  143 +-
 .../org/apache/phoenix/tx/TransactionIT.java|4 +-
 .../org/apache/phoenix/cache/HashCache.java |3 +-
 .../phoenix/compile/ExpressionCompiler.java |2 +-
 .../apache/phoenix/compile/GroupByCompiler.java |3 +-
 .../apache/phoenix/compile/JoinCompiler.java|2 +-
 .../phoenix/compile/ProjectionCompiler.java |4 +-
 .../compile/TupleProjectionCompiler.java|2 +-
 .../phoenix/exception/SQLExceptionCode.java |6 +-
 .../phoenix/iterate/BaseResultIterators.java|3 +-
 .../phoenix/jdbc/PhoenixDatabaseMetaData.java   |7 +-
 .../phoenix/mapreduce/CsvBulkImportUtil.java|8 +-
 .../util/PhoenixConfigurationUtil.java  |7 +-
 .../phoenix/memory/ChildMemoryManager.java  |5 +-
 .../phoenix/memory/GlobalMemoryManager.java |4 +-
 .../apache/phoenix/optimize/QueryOptimizer.java |9 +-
 .../apache/phoenix/parse/FunctionParseNode.java |3 +-
 .../query/ConnectionQueryServicesImpl.java  |  485 +++--
 .../org/apache/phoenix/query/QueryServices.java |3 +-
 .../org/apache/phoenix/schema/ColumnRef.java|3 +-
 .../apache/phoenix/schema/KeyValueSchema.java   |3 +-
 .../phoenix/schema/LocalIndexDataColumnRef.java |   18 +-
 .../apache/phoenix/schema/MetaDataClient.java   |  112 +-
 .../org/apache/phoenix/schema/PNameImpl.java|5 +-
 .../apache/phoenix/schema/TableProperty.java|4 +-
 .../apache/phoenix/schema/types/PDataType.java  |2 +-
 .../apache/phoenix/schema/types/PVarbinary.java |4 +-
 .../org/apache/phoenix/util/MetaDataUtil.java   |   44 +-
 .../org/apache/phoenix/util/UpgradeUtil.java|  142 +-
 .../phoenix/util/csv/CsvUpsertExecutor.java |4 +-
 .../phoenix/util/json/JsonUpsertExecutor.java   |4 +-
 .../phoenix/schema/types/PDataTypeTest.java |6 +
 .../util/AbstractUpsertExecutorTest.java|   12 +-
 .../util/TenantIdByteConversionTest.java|   30 +-
 .../src/it/bin/test_phoenixdb.py|   39 +
 .../src/it/bin/test_phoenixdb.sh|   79 +
 .../end2end/SecureQueryServerPhoenixDBIT.java   |  424 +
 .../apache/phoenix/spark/PhoenixSparkIT.scala   |   14 +-
 .../apache/phoenix/spark/PhoenixRelation.scala  |   16 +-
 pom.xml |   14 +-
 python/NEWS.rst |   44 -
 python/README.md|   93 +
 python/README.rst   |  136 --
 python/RELEASING.rst|   12 -
 python/ci/build-env/Dockerfile  |7 -
 python/ci/phoenix/Dockerfile|   33 -
 python/ci/phoenix/docker-entrypoint.sh  |   24 -
 python/ci/phoenix/hbase-site.xml|   12 -
 python/doc/Makefile |  192 --
 python/doc/api.rst  |   30 -
 python/doc/conf.py  |  287 ---
 python/doc/index.rst|   27 -
 python/doc/versions.rst |3 -
 python/docker-compose.yml   |   21 -
 python/examples/basic.py|   27 -
 python/examples/shell.py|   33 -
 python/gen-protobuf.sh  |   38 -
 python/phoenixdb/NEWS.rst   |   44 +
 python/phoenixdb/README.rst |  136 ++
 python/phoenixdb/RELEASING.rst  |   12 +
 python/phoenixdb/__init__.py|   68 -
 python/phoenixdb/avatica/__init__.py 

[20/22] phoenix git commit: PHOENIX-4967 Reverse scan along LOCAL index does not always return all data.

2018-10-15 Thread jamestaylor
PHOENIX-4967 Reverse scan along LOCAL index does not always return all data.


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/db02372f
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/db02372f
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/db02372f

Branch: refs/heads/omid2
Commit: db02372fa6b02340c30cd48f093fd45979fe30f6
Parents: 9556b8e
Author: Lars Hofhansl 
Authored: Sat Oct 13 14:36:42 2018 -0700
Committer: Lars Hofhansl 
Committed: Sat Oct 13 14:36:42 2018 -0700

--
 .../phoenix/end2end/index/LocalIndexIT.java | 55 +++-
 .../phoenix/iterate/BaseResultIterators.java|  3 +-
 2 files changed, 56 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/db02372f/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
--
diff --git 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
index 5a59c81..d70a505 100644
--- 
a/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
+++ 
b/phoenix-core/src/it/java/org/apache/phoenix/end2end/index/LocalIndexIT.java
@@ -298,11 +298,15 @@ public class LocalIndexIT extends BaseLocalIndexIT {
 
 rs = conn1.createStatement().executeQuery(query);
 String v = "";
+int i = 0;
 while(rs.next()) {
 String next = rs.getString("v1");
 assertTrue(v.compareTo(next) <= 0);
 v = next;
+i++;
 }
+// see PHOENIX-4967
+assertEquals(4, i);
 rs.close();
 
 query = "SELECT * FROM " + tableName +" ORDER BY V1 DESC NULLS 
LAST";
@@ -316,16 +320,65 @@ public class LocalIndexIT extends BaseLocalIndexIT {
 
 rs = conn1.createStatement().executeQuery(query);
 v = "zz";
+i = 0;
 while(rs.next()) {
 String next = rs.getString("v1");
 assertTrue(v.compareTo(next) >= 0);
 v = next;
+i++;
 }
+// see PHOENIX-4967
+assertEquals(4, i);
 rs.close();
 
 }
 }
-
+
+@Test
+public void testLocalIndexReverseScanShouldReturnAllRows() throws 
Exception {
+String tableName = schemaName + "." + generateUniqueName();
+String indexName = "IDX_" + generateUniqueName();
+TableName physicalTableName = 
SchemaUtil.getPhysicalTableName(tableName.getBytes(), isNamespaceMapped);
+String indexPhysicalTableName = physicalTableName.getNameAsString();
+
+createBaseTable(tableName, null, "('e','i','o')");
+try (Connection conn1 = getConnection()) {
+conn1.createStatement().execute("UPSERT INTO " + tableName + " 
values('b',1,2,4,'z')");
+conn1.createStatement().execute("UPSERT INTO " + tableName + " 
values('f',1,2,3,'a')");
+conn1.createStatement().execute("UPSERT INTO " + tableName + " 
values('j',2,4,2,'b')");
+conn1.createStatement().execute("UPSERT INTO " + tableName + " 
values('q',3,1,1,'c')");
+conn1.commit();
+conn1.createStatement().execute("CREATE LOCAL INDEX " + indexName 
+ " ON " + tableName + "(v1)");
+
+String query = "SELECT V1 FROM " + tableName +" ORDER BY V1 DESC 
NULLS LAST";
+ResultSet rs = conn1.createStatement().executeQuery("EXPLAIN "+ 
query);
+
+HBaseAdmin admin = driver.getConnectionQueryServices(getUrl(), 
TestUtil.TEST_PROPERTIES).getAdmin();
+int numRegions = admin.getTableRegions(physicalTableName).size();
+
+assertEquals(
+"CLIENT PARALLEL " + numRegions + "-WAY REVERSE RANGE SCAN 
OVER "
++ indexPhysicalTableName + " [1]\n"
++ "SERVER FILTER BY FIRST KEY ONLY\n"
++ "CLIENT MERGE SORT",
+QueryUtil.getExplainPlan(rs));
+
+rs = conn1.createStatement().executeQuery(query);
+String v = "zz";
+int i = 0;
+while(rs.next()) {
+String next = rs.getString("v1");
+assertTrue(v.compareTo(next) >= 0);
+v = next;
+i++;
+}
+// see PHOENIX-4967
+assertEquals(4, i);
+rs.close();
+
+}
+}
+
 @Test
 public void testLocalIndexScanJoinColumnsFromDataTable() throws Exception {
 String tableName = schemaName + "." + generateUniqueName();

http://git-wip-us.apache.org/repos/asf/phoenix/bl

Build failed in Jenkins: Phoenix | Master #2186

2018-10-15 Thread Apache Jenkins Server
See 


Changes:

[rchintaguntla] PHOENIX-4859 Using local index in where statement for join 
(only rhs

--
[...truncated 157.48 KB...]
[INFO] Running org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.117 s 
- in org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.001 
s - in org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Running org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.666 s 
- in org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.811 s 
- in org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Running org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.001 s 
- in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Running org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 61.781 
s - in org.apache.phoenix.tx.TransactionIT
[INFO] Running org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.781 s 
- in org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 572.734 
s - in org.apache.phoenix.end2end.join.SortMergeJoinLocalIndexIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 276.762 
s - in org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 52, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 312.203 
s - in org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Failures: 
[ERROR]   ConcurrentMutationsIT.testLockUntilMVCCAdvanced:385 Expected data 
table row count to match expected:<1> but was:<0>
[ERROR]   ConcurrentMutationsIT.testRowLockDuringPreBatchMutateWhenIndexed:329 
Expected data table row count to match expected:<1> but was:<0>
[ERROR] Errors: 
[ERROR]   NullIT.testEmptyStringValue:108->testNoStringValue:82 » PhoenixIO 
org.apache.p...
[ERROR]   UpsertSelectAutoCommitIT.testUpsertSelectDoesntSeeUpsertedData:167 » 
DoNotRetryIO
[ERROR]   
MutableIndexSplitForwardScanIT.testSplitDuringIndexScan:30->MutableIndexSplitIT.testSplitDuringIndexScan:87->MutableIndexSplitIT.splitDuringScan:152
 » StaleRegionBoundaryCache
[ERROR]   
MutableIndexSplitForwardScanIT.testSplitDuringIndexScan:30->MutableIndexSplitIT.testSplitDuringIndexScan:87->MutableIndexSplitIT.splitDuringScan:152
 » StaleRegionBoundaryCache
[ERROR]   
MutableIndexSplitReverseScanIT.testSplitDuringIndexScan:30->MutableIndexSplitIT.testSplitDuringIndexScan:87->MutableIndexSplitIT.splitDuringScan:152
 » StaleRegionBoundaryCache
[ERROR]   
MutableIndexSplitReverseScanIT.testSplitDuringIndexScan:30->MutableIndexSplitIT.testSplitDuringIndexScan:87->MutableIndexSplitIT.splitDuringScan:152
 » StaleRegionBoundaryCache
[INFO] 
[ERROR] Tests run: 3385, Failures: 2, Errors: 6, Skipped: 10
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (HBaseManagedTimeTests) 
@ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test 
(NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] 
[INFO] ---
[INFO]  T E S T S
[INFO] ---
[INFO] Running 
org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.001 
s - in 
org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[INFO] Running org.apache.phoenix.end2end.ChangePermissionsIT
[INFO] Running 
org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.61 s - 
in org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Running 
org.apache.phoenix.end2end.ColumnEncodedImmutableTxStatsCollectorIT
[INFO] Running 
org.apache.phoenix.end2end.ColumnEncodedImmutableNonTxStatsCollectorIT
[INFO] Running 
org.apache.phoenix.end2end.ColumnEncodedMutableNonTxStatsCollectorIT
[WARNING] Tests run: 28, Failures: 0, Errors: 0, Skipped: 4, Time el

Build failed in Jenkins: Phoenix-omid2 #122

2018-10-15 Thread Apache Jenkins Server
See 

--
[...truncated 76.88 KB...]
[INFO] --- license-maven-plugin:2.11:check (check-license) @ omid-examples ---
[INFO] Checking licenses...
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ omid-examples 
---
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/omid/omid-examples/0.8.2.11-SNAPSHOT/omid-examples-0.8.2.11-SNAPSHOT.jar
[INFO] Installing 
 to 
/home/jenkins/.m2/repository/org/apache/omid/omid-examples/0.8.2.11-SNAPSHOT/omid-examples-0.8.2.11-SNAPSHOT.pom
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/omid/omid-examples/0.8.2.11-SNAPSHOT/omid-examples-0.8.2.11-SNAPSHOT-bin.tar.gz
[INFO] 
[INFO] ---< org.apache.omid:omid-packaging >---
[INFO] Building Omid Packaging 0.8.2.1-SNAPSHOT [19/19]
[INFO] [ pom ]-
[INFO] 
[INFO] --- maven-assembly-plugin:2.4:single (default) @ omid-packaging ---
[INFO] Reading assembly descriptor: maven/assembly/src.xml
[INFO] Building tar: 

[INFO] 
[INFO] --- license-maven-plugin:2.11:check (check-license) @ omid-packaging ---
[INFO] Checking licenses...
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ omid-packaging 
---
[INFO] Installing 
 to 
/home/jenkins/.m2/repository/org/apache/omid/omid-packaging/0.8.2.1-SNAPSHOT/omid-packaging-0.8.2.1-SNAPSHOT.pom
[INFO] Installing 

 to 
/home/jenkins/.m2/repository/org/apache/omid/omid-packaging/0.8.2.1-SNAPSHOT/omid-packaging-0.8.2.1-SNAPSHOT-src.tar.gz
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Omid 0.8.2.11-SNAPSHOT . SUCCESS [  1.184 s]
[INFO] Common . SUCCESS [  6.702 s]
[INFO] State Machine .. SUCCESS [  0.337 s]
[INFO] Commit Table ... SUCCESS [  0.601 s]
[INFO] Metrics  SUCCESS [  0.302 s]
[INFO] Transaction Client . SUCCESS [  1.021 s]
[INFO] Shims Aggregator for HBase . SUCCESS [  0.053 s]
[INFO] Shims layer for HBase 1.x .. SUCCESS [  3.319 s]
[INFO] HBase Common ... SUCCESS [  0.722 s]
[INFO] HBase Commit Table . SUCCESS [  0.818 s]
[INFO] Codahale Metrics ... SUCCESS [  0.207 s]
[INFO] Benchmarks . SUCCESS [ 15.008 s]
[INFO] Timestamp Storage .. SUCCESS [  1.396 s]
[INFO] HBase tools  SUCCESS [  0.460 s]
[INFO] TSO and TO Servers . SUCCESS [ 12.865 s]
[INFO] HBase Client ... SUCCESS [  1.316 s]
[INFO] HBase Coprocessors . SUCCESS [  1.402 s]
[INFO] Omid Client Examples ... SUCCESS [  5.987 s]
[INFO] Omid Packaging 0.8.2.1-SNAPSHOT  SUCCESS [  0.730 s]
[INFO] 
[INFO] BUILD SUCCESS
[INFO] 
[INFO] Total time: 56.163 s
[INFO] Finished at: 2018-10-15T18:40:37Z
[INFO] 
[Phoenix-omid2] $ /home/jenkins/tools/maven/latest3/bin/mvn -U clean install 
-Dcheckstyle.skip=true
[INFO] Scanning for projects...
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-core:jar:4.14.0-HBase-1.3
[WARNING] Reporting configuration should be done in <reporting> section, not in 
maven-site-plugin <configuration> as reportPlugins parameter. @ 
org.apache.phoenix:phoenix-core:[unknown-version], 
 line 65, 
column 23
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for 
org.apache.phoenix:phoenix-flume:jar:4.14.0-HBase-1.3
[WARNING] Reporting configuration should be done in <reporting> section, not

Build failed in Jenkins: Phoenix-omid2 #121

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by user mujtaba
[EnvInject] - Loading node environment variables.
Building remotely on H30 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git rev-parse origin/omid2^{commit} # timeout=10
Checking out Revision c6a45bcf1e842fb8261ae1bdfd6a9283a2763cae (origin/omid2)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c6a45bcf1e842fb8261ae1bdfd6a9283a2763cae
Commit message: "Merge branch '4.x-HBase-1.3' into omid2"
 > git rev-list --no-walk c6a45bcf1e842fb8261ae1bdfd6a9283a2763cae # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
MAVEN_OPTS=-Xmx3G

[EnvInject] - Variables injected successfully.
[Phoenix-omid2] $ /bin/bash -xe /tmp/jenkins3121781570937164448.sh
+ rm -rf inomid
+ git clone https://git-wip-us.apache.org/repos/asf/incubator-omid.git inomid
Cloning into 'inomid'...
+ cd inomid
+ git checkout phoenix-integration
Switched to a new branch 'phoenix-integration'
Branch phoenix-integration set up to track remote branch phoenix-integration 
from origin.
+ /home/jenkins/tools/maven/latest3/bin/mvn install 
-Dhttps.protocols=TLSv1.2 -DskipTests -P hbase-1
[INFO] Scanning for projects...
Downloading from central: 
https://repo.maven.apache.org/maven2/org/codehaus/jettison/jettison/1.3.3/jettison-1.3.3.pom
Downloading from apache release: 
https://repository.apache.org/content/repositories/releases/org/codehaus/jettison/jettison/1.3.3/jettison-1.3.3.pom
Downloading from java.net: 
http://download.java.net/maven/2/org/codehaus/jettison/jettison/1.3.3/jettison-1.3.3.pom
Downloading from repository.jboss.org: 
http://repository.jboss.org/nexus/content/groups/public-jboss/org/codehaus/jettison/jettison/1.3.3/jettison-1.3.3.pom
Downloaded from repository.jboss.org: 
http://repository.jboss.org/nexus/content/groups/public-jboss/org/codehaus/jettison/jettison/1.3.3/jettison-1.3.3.pom
 (4.3 kB at 11 kB/s)
[INFO] 
[INFO] Reactor Build Order:
[INFO] 
[INFO] Omid                                       [pom]
[INFO] Common                                     [jar]
[INFO] State Machine                              [jar]
[INFO] Commit Table                               [jar]
[INFO] Metrics                                    [jar]
[INFO] Transaction Client                         [jar]
[INFO] Shims Aggregator for HBase                 [pom]
[INFO] Shims layer for HBase 1.x                  [jar]
[INFO] HBase Common                               [jar]
[INFO] HBase Commit Table                         [jar]
[INFO] Codahale Metrics                           [jar]
[INFO] Benchmarks                                 [jar]
[INFO] Timestamp Storage                          [jar]
[INFO] HBase tools                                [jar]
[INFO] TSO and TO Servers                         [jar]
[INFO] HBase Client                               [jar]
[INFO] HBase Coprocessors                         [jar]
[INFO] Omid Packaging                             [pom]
Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-deploy-plugin/2.7/maven-deploy-plugin-2.7.pom
[WARNING] Failed to retrieve plugin descriptor for 
org.apache.maven.plugins:maven-deploy-plugin:2.7: Plugin 
org.apache.maven.plugins:maven-deploy-plugin:2.7 or one of its dependencies 
could not be resolved: Failed to read artifact descriptor for 
org.apache.maven.plugins:maven-deploy-plugin:jar:2.7
Downloading from central: 
https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-release-plugin/2.3.2/maven-release-plugin-2.3.2.pom
[WARNING] Failed to retrieve plugin descriptor for 
org.apache.maven.plugins:maven-rel

Build failed in Jenkins: Phoenix | Master #2185

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104465, done.
remote: Compressing objects:  45% (22145/49211) [carriage-return progress output truncated]
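
The same four unreadable object hashes recur in every failed master build collected 
here, which points to a corrupted git object store in the job's cached workspace on 
H35 rather than a transient network fault. A minimal recovery sketch, assuming shell 
access to the build node; none of these commands appear in the job itself, and 
$WORKSPACE stands in for the hypothetical checkout directory:

    # Verify the cached clone; if git fsck reports unreadable objects,
    # discard the workspace and re-clone from the canonical remote.
    cd "$WORKSPACE"
    if ! git fsck --full; then
        cd .. && rm -rf "$WORKSPACE"
        git clone https://git-wip-us.apache.org/repos/asf/phoenix.git "$WORKSPACE"
    fi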

Build failed in Jenkins: Phoenix | Master #2184

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104465, done.
remote: Compressing objects:  45% (22145/49211) [carriage-return progress output truncated]

Build failed in Jenkins: Phoenix | Master #2183

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104465, done.
remote: Compressing objects:  45% (22145/49211) [carriage-return progress output truncated]

Build failed in Jenkins: Phoenix | Master #2182

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104465, done.
remote: Compressing objects:  45% (22145/49211) [carriage-return progress output truncated]

Build failed in Jenkins: Phoenix | Master #2181

2018-10-15 Thread Apache Jenkins Server
See 

--
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on H35 (ubuntu xenial) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git fetch --tags --progress 
 > https://git-wip-us.apache.org/repos/asf/phoenix.git 
 > +refs/heads/*:refs/remotes/origin/*
ERROR: Error fetching remote repo 'origin'
hudson.plugins.git.GitException: Failed to fetch from 
https://git-wip-us.apache.org/repos/asf/phoenix.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:888)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1155)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1186)
at hudson.scm.SCM.checkout(SCM.java:504)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1208)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:574)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1794)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.plugins.git.GitException: Command "git fetch --tags 
--progress https://git-wip-us.apache.org/repos/asf/phoenix.git 
+refs/heads/*:refs/remotes/origin/*" returned status code 128:
stdout: 
stderr: error: Could not read ba1fd712ae8cdc0fc20805a7b03b7df7b0258542
error: Could not read 78daf2e3273e56f9dc6254c0bb12d45f882ac4da
error: Could not read 3b03e1b0bddaf3b52637187ff556fbb39b1dddef
error: Could not read 9d2cf0bb33c8c4b6c893007cc36db6c5d1b74686
remote: Counting objects: 104465, done.
remote: Compressing objects:  45% (22145/49211) [carriage-return progress output truncated]

phoenix git commit: PHOENIX-4859 Using local index in where statement for join (only rhs table) query fails-addendum(Rajeshbabu)

2018-10-15 Thread rajeshbabu
Repository: phoenix
Updated Branches:
  refs/heads/master 483979104 -> 0d32f8700


PHOENIX-4859 Using local index in where statement for join (only rhs table) 
query fails-addendum(Rajeshbabu)


Project: http://git-wip-us.apache.org/repos/asf/phoenix/repo
Commit: http://git-wip-us.apache.org/repos/asf/phoenix/commit/0d32f870
Tree: http://git-wip-us.apache.org/repos/asf/phoenix/tree/0d32f870
Diff: http://git-wip-us.apache.org/repos/asf/phoenix/diff/0d32f870

Branch: refs/heads/master
Commit: 0d32f870044c5fefdcf9a08e59e6c3c2ef3dc5d1
Parents: 4839791
Author: Rajeshbabu Chintaguntla 
Authored: Mon Oct 15 21:22:10 2018 +0530
Committer: Rajeshbabu Chintaguntla 
Committed: Mon Oct 15 21:22:10 2018 +0530

--
 .../java/org/apache/phoenix/schema/LocalIndexDataColumnRef.java| 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/phoenix/blob/0d32f870/phoenix-core/src/main/java/org/apache/phoenix/schema/LocalIndexDataColumnRef.java
--
diff --git 
a/phoenix-core/src/main/java/org/apache/phoenix/schema/LocalIndexDataColumnRef.java
 
b/phoenix-core/src/main/java/org/apache/phoenix/schema/LocalIndexDataColumnRef.java
index 835cb6a..de0fe4c 100644
--- 
a/phoenix-core/src/main/java/org/apache/phoenix/schema/LocalIndexDataColumnRef.java
+++ 
b/phoenix-core/src/main/java/org/apache/phoenix/schema/LocalIndexDataColumnRef.java
@@ -41,7 +41,7 @@ public class LocalIndexDataColumnRef extends ColumnRef {
 TableName.create(tRef.getTable().getSchemaName().getString(), 
tRef.getTable()
 .getParentTableName().getString())), 
context.getConnection(), false)
 
.resolveTable(context.getCurrentTable().getTable().getSchemaName().getString(),
-
context.getCurrentTable().getTable().getParentTableName().getString()),
+tRef.getTable().getParentTableName().getString()),
 IndexUtil.getDataColumnFamilyName(indexColumnName), IndexUtil
 .getDataColumnName(indexColumnName));
 position = context.getDataColumnPosition(this.getColumn());



Build failed in Jenkins: Phoenix Compile Compatibility with HBase #787

2018-10-15 Thread Apache Jenkins Server
See 


--
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on H25 (ubuntu xenial) in workspace 

[Phoenix_Compile_Compat_wHBase] $ /bin/bash /tmp/jenkins4298379798401616056.sh
core file size  (blocks, -c) 0
data seg size   (kbytes, -d) unlimited
scheduling priority (-e) 0
file size   (blocks, -f) unlimited
pending signals (-i) 386407
max locked memory   (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files  (-n) 6
pipe size(512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority  (-r) 0
stack size  (kbytes, -s) 8192
cpu time   (seconds, -t) unlimited
max user processes  (-u) 10240
virtual memory  (kbytes, -v) unlimited
file locks  (-x) unlimited
core id : 0
core id : 1
core id : 2
core id : 3
core id : 4
core id : 5
physical id : 0
physical id : 1
MemTotal:   98957728 kB
MemFree:18685008 kB
Filesystem  Size  Used Avail Use% Mounted on
udev 48G 0   48G   0% /dev
tmpfs   9.5G  346M  9.1G   4% /run
/dev/sda1   364G  273G   73G  80% /
tmpfs48G 0   48G   0% /dev/shm
tmpfs   5.0M 0  5.0M   0% /run/lock
tmpfs48G 0   48G   0% /sys/fs/cgroup
/dev/loop3   32M   32M 0 100% /snap/snapcraft/1594
/dev/loop4   87M   87M 0 100% /snap/core/5145
tmpfs   1.0M 0  1.0M   0% /var/snap/lxd/common/ns
/dev/loop5   88M   88M 0 100% /snap/core/5328
/dev/loop6   28M   28M 0 100% /snap/snapcraft/1803
tmpfs   9.5G 0  9.5G   0% /run/user/910
/dev/loop1   28M   28M 0 100% /snap/snapcraft/1871
/dev/loop15  67M   67M 0 100% /snap/lxd/8942
/dev/loop16  67M   67M 0 100% /snap/lxd/8959
/dev/loop17  88M   88M 0 100% /snap/core/5548
/dev/loop2   67M   67M 0 100% /snap/lxd/9206
apache-maven-2.2.1
apache-maven-3.0.4
apache-maven-3.0.5
apache-maven-3.1.1
apache-maven-3.2.1
apache-maven-3.2.5
apache-maven-3.3.3
apache-maven-3.3.9
apache-maven-3.5.0
apache-maven-3.5.2
apache-maven-3.5.4
latest
latest2
latest3


===
Verifying compile level compatibility with HBase 0.98 with Phoenix 
4.x-HBase-0.98
===

Cloning into 'hbase'...
Switched to a new branch '0.98'
Branch 0.98 set up to track remote branch 0.98 from origin.
[ERROR] Plugin org.codehaus.mojo:findbugs-maven-plugin:2.5.2 or one of its 
dependencies could not be resolved: Failed to read artifact descriptor for 
org.codehaus.mojo:findbugs-maven-plugin:jar:2.5.2: Could not transfer artifact 
org.codehaus.mojo:findbugs-maven-plugin:pom:2.5.2 from/to central 
(https://repo.maven.apache.org/maven2): Received fatal alert: protocol_version 
-> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
Build step 'Execute shell' marked build as failure
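
The "Received fatal alert: protocol_version" failure typically means the JVM running 
Maven offered only TLS 1.0/1.1 while repo.maven.apache.org requires TLS 1.2, a common 
symptom of builds pinned to older JDK 7 toolchains. A hedged workaround sketch, 
mirroring the -Dhttps.protocols flag already passed in the Phoenix-omid2 job above; 
the MAVEN_OPTS export is an assumption, not part of this job's shell step:

    # Force the HTTPS client to negotiate TLS 1.2 when resolving plugins
    # from Maven Central, then retry the build.
    export MAVEN_OPTS="-Dhttps.protocols=TLSv1.2"
    mvn -B clean install -DskipTests -Dhttps.protocols=TLSv1.2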