The pherf cleanup PRs are ready at https://issues.apache.org/jira/browse/PHOENIX-6114.
I will start the next RC as soon as they are approved (or vetoed).

regards
Istvan

On Thu, Feb 4, 2021 at 8:18 AM Istvan Toth <[email protected]> wrote:

> I've pushed the addendum to fix the phoenix-pherf test failure to master
> and 4.x (but not to 4.16, which also needs it).
>
> However, on closer inspection, we still have a lot of problems with pherf:
>
> pherf-cluster.py seems to have been missed in the python3 support pass,
> and doesn't even start.
> I highly suspect that even if it were able to start, we haven't kept /lib
> updated with all the dependencies it needs to work properly.
> Additionally, I don't see any advantage over pherf-standalone, and it is
> the only reason we carry a huge /lib directory of out-of-date jars in the
> assembly, so I think we should just remove it.
>
> We also generate a shaded phoenix-pherf-minimal JAR, which needs
> phoenix-compat-hbase to work, but we do not publish multiple versions of
> that, and it is also dubious whether it works at all with the recent
> changes.
>
> I will try to solve the above problems by creating a shaded phoenix-pherf
> jar that works the way phoenix-queryserver's does, removing
> pherf-cluster.py, and dropping 95% of the contents of the /lib dir in the
> assembly.
>
> Now that I have enumerated all the pherf problems, I think it's worth
> delaying the next RC by a day or two to fix them.
>
> regards
> Istvan
>
> On Thu, Feb 4, 2021 at 4:58 AM Istvan Toth <[email protected]> wrote:
>
>> -1 because of the Pherf test classpath regression.
>>
>> On Thu, Feb 4, 2021 at 4:57 AM Istvan Toth <[email protected]> wrote:
>>
>>> The above exception usually happens when you use the official upstream
>>> HBase artifacts for building.
>>> See BUILDING.md on how to rebuild HBase for Hadoop 3.
>>>
>>> However, I also broke phoenix-pherf's test classpath, which probably
>>> affects 4.x, too.
>>>
>>> Expect an addendum to PHOENIX-6360 and an RC3 coming soon.
>>>
>>> Sorry for all the RC noise.
>>>
>>> Istvan
>>>
>>> On Thu, Feb 4, 2021 at 4:10 AM Xinyi Yan <[email protected]> wrote:
>>>
>>>> Huh. With my JDK8 environment, mvn clean install -DskipTests doesn't
>>>> have an issue. However, mvn clean verify -Dhbase.profile=2.4 seems to
>>>> have a problem, see the following:
>>>>
>>>> -------------------------------------------------------------------------------
>>>> Test set: org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache
>>>> -------------------------------------------------------------------------------
>>>> Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 14.021 s
>>>> <<< FAILURE! - in org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache
>>>> testMultipleRegions(org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache)
>>>> Time elapsed: 0.342 s <<< ERROR!
>>>> java.lang.IncompatibleClassChangeError: Found interface
>>>> org.apache.hadoop.hdfs.protocol.HdfsFileStatus, but class was expected
>>>> at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:536)
>>>> at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.access$400(FanOutOneBlockAsyncDFSOutputHelper.java:112)
>>>> at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$8.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:616)
>>>> at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper$8.doCall(FanOutOneBlockAsyncDFSOutputHelper.java:611)
>>>> at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>>>> at org.apache.hadoop.hbase.io.asyncfs.FanOutOneBlockAsyncDFSOutputHelper.createOutput(FanOutOneBlockAsyncDFSOutputHelper.java:624)
>>>> at org.apache.hadoop.hbase.io.asyncfs.AsyncFSOutputHelper.createOutput(AsyncFSOutputHelper.java:53)
>>>> at
>>>> org.apache.hadoop.hbase.regionserver.wal.AsyncProtobufLogWriter.initOutput(AsyncProtobufLogWriter.java:180)
>>>> at org.apache.hadoop.hbase.regionserver.wal.AbstractProtobufLogWriter.init(AbstractProtobufLogWriter.java:166)
>>>> at org.apache.hadoop.hbase.wal.AsyncFSWALProvider.createAsyncWriter(AsyncFSWALProvider.java:113)
>>>> at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:662)
>>>> at org.apache.hadoop.hbase.regionserver.wal.AsyncFSWAL.createWriterInstance(AsyncFSWAL.java:130)
>>>> at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:848)
>>>> at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.rollWriter(AbstractFSWAL.java:551)
>>>> at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.init(AbstractFSWAL.java:492)
>>>> at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:161)
>>>> at org.apache.hadoop.hbase.wal.AbstractFSWALProvider.getWAL(AbstractFSWALProvider.java:63)
>>>> at org.apache.hadoop.hbase.wal.WALFactory.getWAL(WALFactory.java:296)
>>>> at org.apache.phoenix.hbase.index.write.recovery.TestPerRegionIndexWriteCache.setUp(TestPerRegionIndexWriteCache.java:109)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>>>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>>>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>>>> at
>>>> org.junit.internal.runners.statements.RunBefores.invokeMethod(RunBefores.java:33)
>>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
>>>> at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:61)
>>>> at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>>>> at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
>>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
>>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
>>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
>>>> at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
>>>> at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
>>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>>>> at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>>>> at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
>>>> at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
>>>> at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
>>>> at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
>>>> at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
>>>> at
>>>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
>>>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
>>>> at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
>>>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
>>>>
>>>> On Wed, Feb 3, 2021 at 12:19 PM Istvan Toth <[email protected]> wrote:
>>>>
>>>> > Please vote on this Apache Phoenix release candidate,
>>>> > phoenix-5.1.0RC2
>>>> >
>>>> > The VOTE will remain open for at least 72 hours.
>>>> >
>>>> > [ ] +1 Release this package as Apache Phoenix 5.1.0
>>>> > [ ] -1 Do not release this package because ...
>>>> >
>>>> > The tag to be voted on is 5.1.0RC2:
>>>> >
>>>> > https://github.com/apache/phoenix/tree/5.1.0RC2
>>>> >
>>>> > The release files, including signatures and digests, as well as the
>>>> > CHANGES.md and RELEASENOTES.md included in this RC, can be found at:
>>>> >
>>>> > https://dist.apache.org/repos/dist/dev/phoenix/phoenix-5.1.0RC2/
>>>> >
>>>> > Maven artifacts are available in a staging repository
>>>> > (orgapachephoenix-1212) at:
>>>> >
>>>> > https://repository.apache.org/#stagingRepositories
>>>> >
>>>> > Artifacts were signed with the 0x794433C7 key, which can be found in:
>>>> >
>>>> > https://dist.apache.org/repos/dist/release/phoenix/KEYS
>>>> >
>>>> > To learn more about Apache Phoenix, please see
>>>> >
>>>> > http://phoenix.apache.org/
>>>> >
>>>> > Thanks,
>>>> > Istvan
>>>> >
>>>>
>>>
