Sure, here is the result (as an attachment).
It looks like a problem with the locale used for timestamp and number formatting.
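The attached report (user.country=DE, user.language=de) suggests the failures come from the JVM's default locale: the tests expect "1,234.46" and "Sunday" but get "1.234,46" and "Sonntag". A minimal sketch of the locale-dependent formatting involved; the class name LocaleDemo is made up for illustration:

```java
import java.text.NumberFormat;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class LocaleDemo {
    public static void main(String[] args) {
        double value = 1234.46;

        // Decimal and grouping separators swap between English and German locales.
        System.out.println(NumberFormat.getNumberInstance(Locale.US).format(value));      // 1,234.46
        System.out.println(NumberFormat.getNumberInstance(Locale.GERMANY).format(value)); // 1.234,46

        // Day names are localized too: "Sunday" vs. "Sonntag" (2017-01-01 was a Sunday).
        LocalDate date = LocalDate.of(2017, 1, 1);
        System.out.println(date.format(DateTimeFormatter.ofPattern("EEEE", Locale.US)));
        System.out.println(date.format(DateTimeFormatter.ofPattern("EEEE", Locale.GERMANY)));
    }
}
```

If this is indeed the cause, forcing an English locale on the forked test JVM (for example `-Duser.language=en -Duser.country=US` via Surefire's argLine) might make the tests pass; that is an untested assumption about the NiFi build, not a verified fix.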


Joe Witt <[email protected]> schrieb am Do., 11. Jan. 2018 um 10:01 Uhr:

> Please do share the test failure in the POI module.  The build is stable
> in Travis, and most of us build on OS X all the time, so basic build tests
> should be pretty stable, and we always want that to be the case.
> You can also just run with the tests skipped:
>
> mvn clean install -DskipTests -Pdir-only
>
> In NiFi 1.5.0 changes were made to Hive, HDFS, HBase code regarding
> handling of Kerberos tickets.
> https://issues.apache.org/jira/browse/NIFI-3472
>
> It is very possible this improves your situation.  It is best we do not
> get into vendor-specific support here, but rather speak to the Apache
> NiFi code this community puts out.
>
> Thanks
>
> On Thu, Jan 11, 2018 at 1:48 AM, Georg Heiler <[email protected]>
> wrote:
>
>> By now I can confirm that, according to the documentation and our admin,
>> it must be the old NiFi 1.2.
>>
>> Also, following your suggestion, trying 1.5 fails to build on OS X:
>> [INFO] nifi-poi-processors ................................ FAILURE [
>>  6.617 s]
>>
>> @Juan Pablo Gardella:
>> Client {
>>   com.sun.security.auth.module.Krb5LoginModule required
>>   useKeyTab=true
>>   keyTab="xxx"
>>   principal="dddd"
>>   storeKey=true
>>   useTicketCache=false
>>   serviceName="zookeeper";
>> };
>> This should be located in nifi_jaas.conf, right?
>> Anyway, useTicketCache is already set to false.
>>
>>
>>
>> Schneider, Jonathan <[email protected]> wrote on Wed, 10 Jan 2018 at
>> 17:26:
>>
>>> I don’t have the ability to test the 1.5.0 RC at this time.  I’m not sure
>>> how well you could overlay it over HDF’s flavor of NiFi.  If I get a
>>> chance, I’ll try to load a new server, install 1.5 on it, and then load my
>>> flows into it.  The problem I foresee is joining the
>>> Ambari-managed Kerberos realm.
>>>
>>>
>>>
>>> *Jonathan Schneider*
>>>
>>> Hadoop/UNIX Administrator, STSC
>>>
>>> SCL Health
>>>
>>> 17501 W. 98th St, Pillars 25-33
>>>
>>> Lenexa, KS  66219
>>>
>>> P: 913.895.2999
>>>
>>> [email protected]
>>>
>>> www.sclhealthsystem.org
>>>
>>>
>>>
>>> *From:* Matt Burgess [mailto:[email protected]]
>>> *Sent:* Wednesday, January 10, 2018 10:24 AM
>>>
>>>
>>> *To:* [email protected]
>>> *Subject:* Re: [EXTERNAL EMAIL]Re: Kerberos hive failure to renew
>>> tickets
>>>
>>>
>>>
>>> To Joe's point, this may not be an issue in the upcoming 1.5.0 release
>>> as it may have been fixed under [1].
>>>
>>>
>>>
>>> Regards,
>>>
>>> Matt
>>>
>>>
>>>
>>> [1] https://issues.apache.org/jira/browse/NIFI-3472
>>>
>>> On Wed, Jan 10, 2018 at 11:14 AM, Georg Heiler <
>>> [email protected]> wrote:
>>>
>>> Regarding the stack trace, I will clarify tomorrow. But it is pretty
>>> similar and caused by the ticket renewal failure.
>>>
>>> Georg Heiler <[email protected]> schrieb am Mi. 10. Jan. 2018
>>> um 17:13:
>>>
>>> No. For sure some 3.0.x, but not entirely sure which one. Just realized
>>> that this must then be NiFi 1.2 :(
>>>
>>> Schneider, Jonathan <[email protected]> wrote on Wed, 10 Jan 2018 at
>>> 17:11:
>>>
>>> HDF 3.0.0?
>>>
>>>
>>>
>>> *From:* Georg Heiler [mailto:[email protected]]
>>>
>>> *Sent:* Wednesday, January 10, 2018 10:07 AM
>>>
>>>
>>> *To:* [email protected]
>>> *Subject:* Re: [EXTERNAL EMAIL]Re: Kerberos hive failure to renew
>>> tickets
>>>
>>>
>>>
>>> Hive is 1.2.1
>>>
>>> Joe Witt <[email protected]> wrote on Wed, 10 Jan 2018 at 17:04:
>>>
>>> Interesting.  Not what I thought it might have been.
>>>
>>> Can you share the following:
>>> - NiFi config details for the Hive processors and any controller services
>>> - Hive version.
>>>
>>> And then let's see if someone who knows Hive and our NiFi components
>>> for it far better than I do can chime in :)
>>>
>>>
>>>
>>> On Wed, Jan 10, 2018 at 8:56 AM, Schneider, Jonathan <[email protected]>
>>> wrote:
>>> > For reference, the specific error I get is:
>>> >
>>> > 2018-01-10 09:55:55,988 ERROR [Timer-Driven Process Thread-10]
>>> o.apache.nifi.processors.hive.PutHiveQL
>>> PutHiveQL[id=3a4f82fd-015f-1000-0000-00005aa22fb2] Failed to update Hive
>>> for
>>> StandardFlowFileRecord[uuid=7ba71cdb-7557-4eab-bd2d-bd89add1c73f,claim=StandardContentClaim
>>> [resourceClaim=StandardResourceClaim[id=1515205062419-12378,
>>> container=default, section=90], offset=342160,
>>> length=247],offset=0,name=vp_employmentstat.orc,size=247] due to
>>> java.sql.SQLException: org.apache.thrift.transport.TTransportException:
>>> org.apache.http.client.ClientProtocolException; it is possible that
>>> retrying the operation will succeed, so routing to retry:
>>> java.sql.SQLException: org.apache.thrift.transport.TTransportException:
>>> org.apache.http.client.ClientProtocolException
>>> > java.sql.SQLException:
>>> org.apache.thrift.transport.TTransportException:
>>> org.apache.http.client.ClientProtocolException
>>> >         at
>>> org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:308)
>>> >         at
>>> org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:241)
>>> >         at
>>> org.apache.hive.jdbc.HivePreparedStatement.execute(HivePreparedStatement.java:98)
>>> >         at
>>> org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
>>> >         at
>>> org.apache.commons.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:172)
>>> >         at
>>> org.apache.nifi.processors.hive.PutHiveQL.lambda$null$3(PutHiveQL.java:218)
>>> >         at
>>> org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
>>> >         at
>>> org.apache.nifi.processors.hive.PutHiveQL.lambda$new$4(PutHiveQL.java:199)
>>> >         at
>>> org.apache.nifi.processor.util.pattern.Put.putFlowFiles(Put.java:59)
>>> >         at
>>> org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:101)
>>> >         at
>>> org.apache.nifi.processors.hive.PutHiveQL.lambda$onTrigger$6(PutHiveQL.java:255)
>>> >         at
>>> org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
>>> >         at
>>> org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
>>> >         at
>>> org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:255)
>>> >         at
>>> org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1118)
>>> >         at
>>> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:147)
>>> >         at
>>> org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47)
>>> >         at
>>> org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:132)
>>> >         at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>> >         at
>>> java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
>>> >         at
>>> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
>>> >         at
>>> java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
>>> >         at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>> >         at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>> >         at java.lang.Thread.run(Thread.java:745)
>>> > Caused by: org.apache.thrift.transport.TTransportException:
>>> org.apache.http.client.ClientProtocolException
>>> >         at
>>> org.apache.thrift.transport.THttpClient.flushUsingHttpClient(THttpClient.java:297)
>>> >         at
>>> org.apache.thrift.transport.THttpClient.flush(THttpClient.java:313)
>>> >         at
>>> org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:73)
>>> >         at
>>> org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
>>> >         at
>>> org.apache.hive.service.cli.thrift.TCLIService$Client.send_ExecuteStatement(TCLIService.java:223)
>>> >         at
>>> org.apache.hive.service.cli.thrift.TCLIService$Client.ExecuteStatement(TCLIService.java:215)
>>> >         at sun.reflect.GeneratedMethodAccessor69.invoke(Unknown Source)
>>> >         at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> >         at java.lang.reflect.Method.invoke(Method.java:498)
>>> >         at
>>> org.apache.hive.jdbc.HiveConnection$SynchronizedHandler.invoke(HiveConnection.java:1374)
>>> >         at com.sun.proxy.$Proxy174.ExecuteStatement(Unknown Source)
>>> >         at
>>> org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:299)
>>> >         ... 24 common frames omitted
>>> > Caused by: org.apache.http.client.ClientProtocolException: null
>>> >         at
>>> org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:187)
>>> >         at
>>> org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:118)
>>> >         at
>>> org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
>>> >         at
>>> org.apache.thrift.transport.THttpClient.flushUsingHttpClient(THttpClient.java:251)
>>> >         ... 35 common frames omitted
>>> > Caused by: org.apache.http.HttpException: null
>>> >         at
>>> org.apache.hive.jdbc.HttpRequestInterceptorBase.process(HttpRequestInterceptorBase.java:86)
>>> >         at
>>> org.apache.http.protocol.ImmutableHttpProcessor.process(ImmutableHttpProcessor.java:132)
>>> >         at
>>> org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:183)
>>> >         at
>>> org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
>>> >         at
>>> org.apache.http.impl.execchain.ServiceUnavailableRetryExec.execute(ServiceUnavailableRetryExec.java:85)
>>> >         at
>>> org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:111)
>>> >         at
>>> org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
>>> >         ... 38 common frames omitted
>>> > Caused by: org.apache.http.HttpException: null
>>> >         at
>>> org.apache.hive.jdbc.HttpKerberosRequestInterceptor.addHttpAuthHeader(HttpKerberosRequestInterceptor.java:68)
>>> >         at
>>> org.apache.hive.jdbc.HttpRequestInterceptorBase.process(HttpRequestInterceptorBase.java:74)
>>> >         ... 44 common frames omitted
>>> > Caused by: java.lang.reflect.UndeclaredThrowableException: null
>>> >         at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1884)
>>> >         at
>>> org.apache.hive.service.auth.HttpAuthUtils.getKerberosServiceTicket(HttpAuthUtils.java:83)
>>> >         at
>>> org.apache.hive.jdbc.HttpKerberosRequestInterceptor.addHttpAuthHeader(HttpKerberosRequestInterceptor.java:62)
>>> >         ... 45 common frames omitted
>>> > Caused by: org.ietf.jgss.GSSException: No valid credentials provided
>>> (Mechanism level: Failed to find any Kerberos tgt)
>>> >         at
>>> sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
>>> >         at
>>> sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
>>> >         at
>>> sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
>>> >         at
>>> sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
>>> >         at
>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
>>> >         at
>>> sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
>>> >         at
>>> org.apache.hive.service.auth.HttpAuthUtils$HttpKerberosClientAction.run(HttpAuthUtils.java:183)
>>> >         at
>>> org.apache.hive.service.auth.HttpAuthUtils$HttpKerberosClientAction.run(HttpAuthUtils.java:151)
>>> >         at java.security.AccessController.doPrivileged(Native Method)
>>> >         at javax.security.auth.Subject.doAs(Subject.java:422)
>>> >         at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
>>> >         ... 47 common frames omitted
>>> >
>>> >
>>> > -----Original Message-----
>>> > From: Joe Witt [mailto:[email protected]]
>>> > Sent: Wednesday, January 10, 2018 9:55 AM
>>> > To: [email protected]
>>> > Subject: Re: [EXTERNAL EMAIL]Re: Kerberos hive failure to renew tickets
>>> >
>>> > Cool.  This is probably fixed in Apache NiFi 1.5.0 but please share
>>> the stack dump when it is stuck.
>>> >
>>> > bin/nifi.sh dump
>>> >
>>> > Then send us the logs dir content.
>>> >
>>> > Thanks
>>> >
>>> > On Wed, Jan 10, 2018 at 8:54 AM, Schneider, Jonathan <[email protected]>
>>> wrote:
>>> >> Joe,
>>> >>
>>> >> I can reproduce this easily.  Set up a connection to a kerberized
>>> Hive instance.  After 24 hours you will get errors about an expired TGT.
>>> Restarting the NiFi process is the only way I've found to get it to renew
>>> the TGT.
>>> >>
>>> >>
>>> >> -----Original Message-----
>>> >> From: Joe Witt [mailto:[email protected]]
>>> >> Sent: Wednesday, January 10, 2018 9:53 AM
>>> >> To: [email protected]
>>> >> Subject: [EXTERNAL EMAIL]Re: Kerberos hive failure to renew tickets
>>> >>
>>> >> *** CAUTION!  This email came from outside SCL Health. Do not open
>>> >> attachments or click links if you do not recognize the sender. ***
>>> >>
>>> >> Georg
>>> >>
>>> >> We'd need to see what you mean to really understand.  Can you please
>>> >> share the NiFi logs directory content, and if the flow is stuck/locked
>>> >> up, please share a NiFi thread dump, which will be in the logs after
>>> >> you first run bin/nifi.sh dump.
>>> >>
>>> >> thanks
>>> >>
>>> >> On Wed, Jan 10, 2018 at 8:50 AM, Georg Heiler <
>>> [email protected]> wrote:
>>> >>> Hi,
>>> >>> In production I observe problems with ticket renewal for the NiFi
>>> >>> Hive processor.
>>> >>>
>>> >>> A workaround is to restart the Hive service, but that doesn't seem
>>> >>> right.
>>> >>>
>>> >>> Is there a real fix for this problem?
>>> >>>
>>> >>> Best, Georg
>>> >
>>>
>>>
>
<?xml version="1.0" encoding="UTF-8"?>
<testsuite xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="https://maven.apache.org/surefire/maven-surefire-plugin/xsd/surefire-test-report.xsd" name="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="4.873" tests="11" errors="0" skipped="0" failures="4">
  <properties>
    <property name="java.runtime.name" value="Java(TM) SE Runtime Environment"/>
    <property name="sun.boot.library.path" value="/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/lib"/>
    <property name="java.vm.version" value="25.152-b16"/>
    <property name="gopherProxySet" value="false"/>
    <property name="java.vm.vendor" value="Oracle Corporation"/>
    <property name="maven.multiModuleProjectDirectory" value="/Users/geoheil/development/nifi"/>
    <property name="java.vendor.url" value="http://java.oracle.com/"/>
    <property name="path.separator" value=":"/>
    <property name="guice.disable.misplaced.annotation.check" value="true"/>
    <property name="java.vm.name" value="Java HotSpot(TM) 64-Bit Server VM"/>
    <property name="file.encoding.pkg" value="sun.io"/>
    <property name="user.country" value="DE"/>
    <property name="sun.java.launcher" value="SUN_STANDARD"/>
    <property name="sun.os.patch.level" value="unknown"/>
    <property name="java.vm.specification.name" value="Java Virtual Machine Specification"/>
    <property name="user.dir" value="/Users/geoheil/development/nifi"/>
    <property name="java.runtime.version" value="1.8.0_152-b16"/>
    <property name="java.awt.graphicsenv" value="sun.awt.CGraphicsEnvironment"/>
    <property name="java.endorsed.dirs" value="/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/lib/endorsed"/>
    <property name="os.arch" value="x86_64"/>
    <property name="java.io.tmpdir" value="/var/folders/lz/vlbj1rj12dzbvbmj16jbp0k80000gn/T/"/>
    <property name="line.separator" value="&#10;"/>
    <property name="java.vm.specification.vendor" value="Oracle Corporation"/>
    <property name="os.name" value="Mac OS X"/>
    <property name="classworlds.conf" value="/usr/local/Cellar/maven/3.5.2/libexec/bin/m2.conf"/>
    <property name="sun.jnu.encoding" value="UTF-8"/>
    <property name="java.library.path" value="/usr/local/cuda/lib:/usr/local/cuda:/usr/local/cuda/extras/CUPTI/lib:/Library/oracle/instantclient_12_1:/Users/geoheil/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:."/>
    <property name="maven.conf" value="/usr/local/Cellar/maven/3.5.2/libexec/conf"/>
    <property name="java.specification.name" value="Java Platform API Specification"/>
    <property name="java.class.version" value="52.0"/>
    <property name="sun.management.compiler" value="HotSpot 64-Bit Tiered Compilers"/>
    <property name="os.version" value="10.13.2"/>
    <property name="library.jansi.path" value="/usr/local/Cellar/maven/3.5.2/libexec/lib/jansi-native"/>
    <property name="http.nonProxyHosts" value="local|*.local|169.254/16|*.169.254/16"/>
    <property name="user.home" value="/Users/geoheil"/>
    <property name="user.timezone" value="Europe/Vienna"/>
    <property name="java.awt.printerjob" value="sun.lwawt.macosx.CPrinterJob"/>
    <property name="java.specification.version" value="1.8"/>
    <property name="file.encoding" value="UTF-8"/>
    <property name="user.name" value="geoheil"/>
    <property name="java.class.path" value="/usr/local/Cellar/maven/3.5.2/libexec/boot/plexus-classworlds-2.5.2.jar"/>
    <property name="java.vm.specification.version" value="1.8"/>
    <property name="sun.arch.data.model" value="64"/>
    <property name="java.home" value="/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre"/>
    <property name="sun.java.command" value="org.codehaus.plexus.classworlds.launcher.Launcher package"/>
    <property name="java.specification.vendor" value="Oracle Corporation"/>
    <property name="user.language" value="de"/>
    <property name="user.language.format" value="en"/>
    <property name="awt.toolkit" value="sun.lwawt.macosx.LWCToolkit"/>
    <property name="java.vm.info" value="mixed mode"/>
    <property name="java.version" value="1.8.0_152"/>
    <property name="java.ext.dirs" value="/Users/geoheil/Library/Java/Extensions:/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/lib/ext:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java"/>
    <property name="sun.boot.class.path" value="/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/lib/sunrsasign.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home/jre/classes"/>
    <property name="java.vendor" value="Oracle Corporation"/>
    <property name="maven.home" value="/usr/local/Cellar/maven/3.5.2/libexec"/>
    <property name="file.separator" value="/"/>
    <property name="java.vendor.url.bug" value="http://bugreport.sun.com/bugreport/"/>
    <property name="sun.cpu.endian" value="little"/>
    <property name="sun.io.unicode.encoding" value="UnicodeBig"/>
    <property name="socksNonProxyHosts" value="local|*.local|169.254/16|*.169.254/16"/>
    <property name="ftp.nonProxyHosts" value="local|*.local|169.254/16|*.169.254/16"/>
    <property name="sun.cpu.isalist" value=""/>
  </properties>
  <testcase name="testProcessAllSheets" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="3.039"/>
  <testcase name="testProcessASpecificSheetThatDoesExist" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="1.451"/>
  <testcase name="testSkipRows" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="0.104">
    <failure message="expected:&lt;1234[.46,12:00:00 PM,£   123.45&#10;1234.5,Sunday\, January 01\, 2017,¥   123.45&#10;1\,234.46,1/1/17 12:00,$   1\,023.45&#10;1\,234.4560,12:00 PM,£   1\,023.45&#10;9.88E+08,2017/01/01/ 12:00,¥   1\,023.45&#10;9.877E+08,,&#10;9.]8765E+08,,&#10;&gt; but was:&lt;1234[\,46,12:00:00 PM,£   123\,45&#10;1234\,5,Sonntag\, Januar 01\, 2017,¥   123\,45&#10;1.234\,46,1/1/17 12:00,$   1.023\,45&#10;1.234\,4560,12:00 PM,£   1.023\,45&#10;9\,88E+08,2017/01/01/ 12:00,¥   1.023\,45&#10;9\,877E+08,,&#10;9\,]8765E+08,,&#10;&gt;" type="org.junit.ComparisonFailure"><![CDATA[org.junit.ComparisonFailure: 
expected:<1234[.46,12:00:00 PM,£   123.45
1234.5,Sunday\, January 01\, 2017,¥   123.45
1\,234.46,1/1/17 12:00,$   1\,023.45
1\,234.4560,12:00 PM,£   1\,023.45
9.88E+08,2017/01/01/ 12:00,¥   1\,023.45
9.877E+08,,
9.]8765E+08,,
> but was:<1234[\,46,12:00:00 PM,£   123\,45
1234\,5,Sonntag\, Januar 01\, 2017,¥   123\,45
1.234\,46,1/1/17 12:00,$   1.023\,45
1.234\,4560,12:00 PM,£   1.023\,45
9\,88E+08,2017/01/01/ 12:00,¥   1.023\,45
9\,877E+08,,
9\,]8765E+08,,
>
	at org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest.testSkipRows(ConvertExcelToCSVProcessorTest.java:146)
]]></failure>
  </testcase>
  <testcase name="testSkipColumns" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="0.019">
    <failure message="expected:&lt;Numbers,Money&#10;1234[.456,$   123.45&#10;1234.46,£   123.45&#10;1234.5,¥   123.45&#10;1\,234.46,$   1\,023.45&#10;1\,234.4560,£   1\,023.45&#10;9.88E+08,¥   1\,023.45&#10;9.877E+08,&#10;9.]8765E+08,&#10;&gt; but was:&lt;Numbers,Money&#10;1234[\,456,$   123\,45&#10;1234\,46,£   123\,45&#10;1234\,5,¥   123\,45&#10;1.234\,46,$   1.023\,45&#10;1.234\,4560,£   1.023\,45&#10;9\,88E+08,¥   1.023\,45&#10;9\,877E+08,&#10;9\,]8765E+08,&#10;&gt;" type="org.junit.ComparisonFailure"><![CDATA[org.junit.ComparisonFailure: 
expected:<Numbers,Money
1234[.456,$   123.45
1234.46,£   123.45
1234.5,¥   123.45
1\,234.46,$   1\,023.45
1\,234.4560,£   1\,023.45
9.88E+08,¥   1\,023.45
9.877E+08,
9.]8765E+08,
> but was:<Numbers,Money
1234[\,456,$   123\,45
1234\,46,£   123\,45
1234\,5,¥   123\,45
1.234\,46,$   1.023\,45
1.234\,4560,£   1.023\,45
9\,88E+08,¥   1.023\,45
9\,877E+08,
9\,]8765E+08,
>
	at org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest.testSkipColumns(ConvertExcelToCSVProcessorTest.java:172)
]]></failure>
  </testcase>
  <testcase name="testDataFormatting" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="0.013"/>
  <testcase name="testHandleUnsupportedXlsFile" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="0.009"/>
  <testcase name="testCustomDelimiters" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="0.017">
    <failure message="expected:&lt;...mestamps|Money&#10;1234[.456|1/1/17|$   123.45&#10;1234.46|12:00:00 PM|£   123.45&#10;1234.5|Sunday, January 01, 2017|¥   123.45&#10;1,234.46|1/1/17 12:00|$   1,023.45&#10;1,234.4560|12:00 PM|£   1,023.45&#10;9.88E+08|2017/01/01/ 12:00|¥   1,023.45&#10;9.877E+08||&#10;9.]8765E+08||&#10;&gt; but was:&lt;...mestamps|Money&#10;1234[,456|1/1/17|$   123,45&#10;1234,46|12:00:00 PM|£   123,45&#10;1234,5|Sonntag, Januar 01, 2017|¥   123,45&#10;1.234,46|1/1/17 12:00|$   1.023,45&#10;1.234,4560|12:00 PM|£   1.023,45&#10;9,88E+08|2017/01/01/ 12:00|¥   1.023,45&#10;9,877E+08||&#10;9,]8765E+08||&#10;&gt;" type="org.junit.ComparisonFailure"><![CDATA[org.junit.ComparisonFailure: 
expected:<...mestamps|Money
1234[.456|1/1/17|$   123.45
1234.46|12:00:00 PM|£   123.45
1234.5|Sunday, January 01, 2017|¥   123.45
1,234.46|1/1/17 12:00|$   1,023.45
1,234.4560|12:00 PM|£   1,023.45
9.88E+08|2017/01/01/ 12:00|¥   1,023.45
9.877E+08||
9.]8765E+08||
> but was:<...mestamps|Money
1234[,456|1/1/17|$   123,45
1234,46|12:00:00 PM|£   123,45
1234,5|Sonntag, Januar 01, 2017|¥   123,45
1.234,46|1/1/17 12:00|$   1.023,45
1.234,4560|12:00 PM|£   1.023,45
9,88E+08|2017/01/01/ 12:00|¥   1.023,45
9,877E+08||
9,]8765E+08||
>
	at org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest.testCustomDelimiters(ConvertExcelToCSVProcessorTest.java:201)
]]></failure>
  </testcase>
  <testcase name="testQuoting" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="0.017">
    <failure message="expected:&lt;...rs,Timestamps,Money&#10;[1234.456,1/1/17,$   123.45&#10;1234.46,12:00:00 PM,£   123.45&#10;1234.5,&quot;Sunday, January 01, 2017&quot;,¥   123.45&#10;&quot;1,234.46&quot;,1/1/17 12:00,&quot;$   1,023.45&quot;&#10;&quot;1,234.4560&quot;,12:00 PM,&quot;£   1,023.45&quot;&#10;9.88E+08,2017/01/01/ 12:00,&quot;¥   1,023.45&quot;&#10;9.877E+08,,&#10;9.8765E+08],,&#10;&gt; but was:&lt;...rs,Timestamps,Money&#10;[&quot;1234,456&quot;,1/1/17,&quot;$   123,45&quot;&#10;&quot;1234,46&quot;,12:00:00 PM,&quot;£   123,45&quot;&#10;&quot;1234,5&quot;,&quot;Sonntag, Januar 01, 2017&quot;,&quot;¥   123,45&quot;&#10;&quot;1.234,46&quot;,1/1/17 12:00,&quot;$   1.023,45&quot;&#10;&quot;1.234,4560&quot;,12:00 PM,&quot;£   1.023,45&quot;&#10;&quot;9,88E+08&quot;,2017/01/01/ 12:00,&quot;¥   1.023,45&quot;&#10;&quot;9,877E+08&quot;,,&#10;&quot;9,8765E+08&quot;],,&#10;&gt;" type="org.junit.ComparisonFailure"><![CDATA[org.junit.ComparisonFailure: 
expected:<...rs,Timestamps,Money
[1234.456,1/1/17,$   123.45
1234.46,12:00:00 PM,£   123.45
1234.5,"Sunday, January 01, 2017",¥   123.45
"1,234.46",1/1/17 12:00,"$   1,023.45"
"1,234.4560",12:00 PM,"£   1,023.45"
9.88E+08,2017/01/01/ 12:00,"¥   1,023.45"
9.877E+08,,
9.8765E+08],,
> but was:<...rs,Timestamps,Money
["1234,456",1/1/17,"$   123,45"
"1234,46",12:00:00 PM,"£   123,45"
"1234,5","Sonntag, Januar 01, 2017","¥   123,45"
"1.234,46",1/1/17 12:00,"$   1.023,45"
"1.234,4560",12:00 PM,"£   1.023,45"
"9,88E+08",2017/01/01/ 12:00,"¥   1.023,45"
"9,877E+08",,
"9,8765E+08"],,
>
	at org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest.testQuoting(ConvertExcelToCSVProcessorTest.java:118)
]]></failure>
  </testcase>
  <testcase name="testProcessASheetWithBlankCells" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="0.013"/>
  <testcase name="testNonExistantSpecifiedSheetName" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="0.121"/>
  <testcase name="testMultipleSheetsGeneratesMultipleFlowFiles" classname="org.apache.nifi.processors.poi.ConvertExcelToCSVProcessorTest" time="0.011"/>
</testsuite>
