[jira] [Commented] (ARROW-6206) [Java][Docs] Document environment variables/java properties

2019-08-21 Thread Jim Northrup (Jira)


[ 
https://issues.apache.org/jira/browse/ARROW-6206?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16911955#comment-16911955
 ] 

Jim Northrup commented on ARROW-6206:
-

(previously responded by email; sorry if this creates a dupe)

 

I admire Arrow for doing a thing well.  I hope that if I simply call “mvn 
maven-versions-plugin:latest” in the future, this simple JDBC code will work 
better than before.
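
(Presumably that means the versions-maven-plugin; a hedged guess at the actual 
goal:)

{code:shell}
# hedged guess: the versions-maven-plugin goal that bumps
# dependencies to their latest released versions
mvn versions:use-latest-versions
{code}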

 

I appreciate the attention to detail.

 

I think the gist of this discussion is that TensorFlow one-hot columns may 
quickly test the expected norms of Arrow.  Likewise, time-series datasets have 
us blowing gaskets all over the place in terms of time-to-completion and RAM 
when using pandas.  What do we do with a 300-gig numpy dataset living in swap 
that takes 3 days to build? There are no LSTM examples that demonstrate 
anything but toy datasets.

 

Turbodbc looks like a good fit for reducing transcription times.

 

For what I need in the space of Arrow, I think the ideal tool is something that 
works in and out of numpy and delegates to and from Apache Geode or Hazelcast 
as the main substrate.

 

If, perchance, Arrow can act as a window onto memory grids, all the better.

 

As I find the time for signups and 2FAs, I will compose this for the mailing lists.

 

> [Java][Docs] Document environment variables/java properties
> ---
>
> Key: ARROW-6206
> URL: https://issues.apache.org/jira/browse/ARROW-6206
> Project: Apache Arrow
>  Issue Type: Improvement
>  Components: Documentation, Java
>Reporter: Micah Kornfield
>Assignee: Ji Liu
>Priority: Major
>  Labels: pull-request-available
> Fix For: 0.15.0
>
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>
> Specifically, "-Dio.netty.tryReflectionSetAccessible=true" for JVMs >= 9 and 
> BoundsChecking/NullChecking for get.
>  
>  
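
For context, a hedged illustration of the flags this issue asks to document 
(the jar and main class are placeholders; the two arrow.* names are the 
BoundsChecking/NullChecking switches from Arrow's Java memory module):

{code:shell}
# io.netty.tryReflectionSetAccessible lets netty allocate direct buffers on JDK 9+.
# arrow.enable_unsafe_memory_access disables per-access bounds checking;
# arrow.enable_null_check_for_get disables the null check in get() calls.
java -Dio.netty.tryReflectionSetAccessible=true \
     -Darrow.enable_unsafe_memory_access=true \
     -Darrow.enable_null_check_for_get=false \
     -classpath app.jar Main
{code}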



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (ARROW-6206) [Java][Docs] Document environment variables/java properties

2019-08-20 Thread Jim Northrup (Jira)


[ 
https://issues.apache.org/jira/browse/ARROW-6206?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16911465#comment-16911465
 ] 

Jim Northrup commented on ARROW-6206:
-

> Could you provide a link to the text you quoted? I'd be interested in reading 
> it.
 
This is the benefit of having written what amounts to a netty analog over the 
course of 4 years, including an SSL/TLS sockets layer for HTTP at one point.  
Ultimately, there is danger in long-lived services using NIO, end of story.

The process cleanup of the underlying OS will be the best protection against 
Java NIO/JNI memory handles. If you have a daemon or other long-running thing, 
or you must use direct buffers, assume that the reference counting is imperfect, 
and it will bite you one day (it may take days) if you trust it.  So things that 
use NIO should be short-lived and, wherever possible, process-encapsulated; a 
minimal scoping sketch follows.
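
A minimal sketch of that scoping advice, using only Arrow's public allocator 
API (the vector name and row count are arbitrary): hold every direct-memory 
owner in try-with-resources so release is deterministic rather than GC-driven.

{code:java}
import org.apache.arrow.memory.RootAllocator;
import org.apache.arrow.vector.IntVector;

public class ScopedAllocation {
    public static void main(String[] args) {
        // try-with-resources releases the native memory deterministically,
        // in reverse declaration order, even if an exception is thrown midway
        try (RootAllocator allocator = new RootAllocator(Long.MAX_VALUE);
             IntVector vector = new IntVector("values", allocator)) {
            vector.allocateNew(1024);
            for (int i = 0; i < 1024; i++) {
                vector.setSafe(i, i);
            }
            vector.setValueCount(1024);
        } // direct buffers are freed here, not at some future GC
    }
}
{code}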

Netty is the JBoss-endorsed C10K Java representative with the popular 
market share. If I understand correctly, Arrow is a team that picked up 
netty-derived off-heap tools naively and demonstrated that in 2019 they are 
still prone to some gotchas that are a little bit stronger than edge cases, 
even when the unit tests pass.  Indeed, my initial testing with writing JDBC 
to Arrow on kilobytes of records succeeded well, and gave me the confidence to 
assume this would do the job faster than Python.  And so began this thread, on 
800+ megabytes of data.

Considering the age and size of the netty ecosystem, there is no lack of 
scrutiny or open-source virtue here.  It is a VM-level weakness that Java NIO 
is still something like peanuts in the kitchen; you should really put a 
consumer-facing notice on where NIO is and is not present.





> [Java][Docs] Document environment variables/java properties
> ---
>
> Key: ARROW-6206
> URL: https://issues.apache.org/jira/browse/ARROW-6206
> Project: Apache Arrow
>  Issue Type: Improvement
>  Components: Documentation, Java
>Reporter: Micah Kornfield
>Assignee: Ji Liu
>Priority: Major
>  Labels: pull-request-available
> Fix For: 0.15.0
>
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>
> Specifically, "-Dio.netty.tryReflectionSetAccessible=true" for JVMs >= 9 and 
> BoundsChecking/NullChecking for get.
>  
>  



--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Commented] (ARROW-6206) [Java][Docs] Document environment variables/java properties

2019-08-13 Thread Jim Northrup (JIRA)


[ 
https://issues.apache.org/jira/browse/ARROW-6206?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16906033#comment-16906033
 ] 

Jim Northrup commented on ARROW-6206:
-

NIO is not going to go away, and Java is not going to stop harboring 
unreproducible NIO bugs.

Is there a charter for which Java use cases will be supported, and THEN, which 
among these will leverage NIO, and which among these can use pure heap 
implementations of objects exclusively?

The utilities should abandon all hope of stability or useful benchmarks while 
there is an NIO component in a piece of code.  The Oracle engineers this year 
are certainly not on the same page as the JDK 8 team, or the JDK 6 team.

Unsafe/NIO use cases number about two:

If you're utilizing mmap'd files to minimize page faults, go there.
If you're talking to cross-platform structs and mailboxes, you have no choice.
If you're squirreling away heap objects using something greater than the -Xmx 
setting, you should probably engineer it through mmap file access instead of 
using native handles directly; direct handles are extremely unstable in my 
experience. A minimal mmap sketch follows.
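
A minimal sketch of the mmap route, using only java.nio from the standard 
library (the path and the 1 GiB size are arbitrary placeholders):

{code:java}
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class MmapScratch {
    public static void main(String[] args) throws IOException {
        long size = 1L << 30; // 1 GiB backed by the file, not by the -Xmx heap
        try (FileChannel ch = FileChannel.open(Paths.get("/tmp/scratch.bin"),
                StandardOpenOption.CREATE, StandardOpenOption.READ,
                StandardOpenOption.WRITE)) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, 0, size);
            buf.putLong(0, 42L); // the kernel pages this in and out on demand
            // the backing pages belong to the OS page cache, not the JVM heap,
            // which is the stability argument made above
        }
    }
}
{code}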
 


> [Java][Docs] Document environment variables/java properties
> ---
>
> Key: ARROW-6206
> URL: https://issues.apache.org/jira/browse/ARROW-6206
> Project: Apache Arrow
>  Issue Type: Improvement
>  Components: Documentation, Java
>Reporter: Micah Kornfield
>Priority: Major
>
> Specifically, "-Dio.netty.tryReflectionSetAccessible=true" for JVMs >= 9 and 
> BoundsChecking/NullChecking for get.
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Commented] (ARROW-6202) [Java] Exception in thread "main" org.apache.arrow.memory.OutOfMemoryException: Unable to allocate buffer of size 4 due to memory limit. Current allocation: 2147483646

2019-08-13 Thread Jim Northrup (JIRA)


[ 
https://issues.apache.org/jira/browse/ARROW-6202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16905982#comment-16905982
 ] 

Jim Northrup commented on ARROW-6202:
-

For the record, we have pre-TensorFlow column counts of about 14,000 one-hot 
attributes.  We are seeing numpy RAM requirements of 160 gigs.
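
(For scale, assuming dense float64: a row of 14,000 one-hot attributes costs 
14,000 × 8 B ≈ 112 KB, so 160 GB corresponds to only about 1.4 million rows 
materialized densely.)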



> [Java] Exception in thread "main" 
> org.apache.arrow.memory.OutOfMemoryException: Unable to allocate buffer of 
> size 4 due to memory limit. Current allocation: 2147483646
> ---
>
> Key: ARROW-6202
> URL: https://issues.apache.org/jira/browse/ARROW-6202
> Project: Apache Arrow
>  Issue Type: Bug
>  Components: Java
>Affects Versions: 0.14.1
>Reporter: Jim Northrup
>Priority: Major
>  Labels: jdbc
>
> JDBC query results exceed the native heap even when using generous -Xmx settings. 
> For roughly 800 megabytes of CSV/flat-file result set, Arrow is unable to hold 
> the contents in RAM long enough to persist them to disk, without explicit 
> knowledge beyond the unit-test sample code.
> source:
> https://github.com/jnorthrup/jdbc2json/blob/master/src/main/java/com/fnreport/QueryToFeather.kt#L83
> {code:java}
> Exception in thread "main" org.apache.arrow.memory.OutOfMemoryException: 
> Unable to allocate buffer of size 4 due to memory limit. Current allocation: 
> 2147483646
> at 
> org.apache.arrow.memory.BaseAllocator.buffer(BaseAllocator.java:307)
> at 
> org.apache.arrow.memory.BaseAllocator.buffer(BaseAllocator.java:277)
> at 
> org.apache.arrow.adapter.jdbc.JdbcToArrowUtils.updateVector(JdbcToArrowUtils.java:610)
> at 
> org.apache.arrow.adapter.jdbc.JdbcToArrowUtils.jdbcToFieldVector(JdbcToArrowUtils.java:462)
> at 
> org.apache.arrow.adapter.jdbc.JdbcToArrowUtils.jdbcToArrowVectors(JdbcToArrowUtils.java:396)
> at 
> org.apache.arrow.adapter.jdbc.JdbcToArrow.sqlToArrow(JdbcToArrow.java:225)
> at 
> org.apache.arrow.adapter.jdbc.JdbcToArrow.sqlToArrow(JdbcToArrow.java:187)
> at 
> org.apache.arrow.adapter.jdbc.JdbcToArrow.sqlToArrow(JdbcToArrow.java:156)
> at com.fnreport.QueryToFeather$Companion.go(QueryToFeather.kt:83)
> at 
> com.fnreport.QueryToFeather$Companion$main$1.invokeSuspend(QueryToFeather.kt:95)
> at 
> kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
> at kotlinx.coroutines.DispatchedTask.run(Dispatched.kt:241)
> at 
> kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:270)
> at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:79)
> at 
> kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:54)
> at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source)
> at 
> kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt:36)
> at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source)
> at com.fnreport.QueryToFeather$Companion.main(QueryToFeather.kt:93)
> at com.fnreport.QueryToFeather.main(QueryToFeather.kt)
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Commented] (ARROW-6202) [Java] Exception in thread "main" org.apache.arrow.memory.OutOfMemoryException: Unable to allocate buffer of size 4 due to memory limit. Current allocation: 2147483646

2019-08-13 Thread Jim Northrup (JIRA)


[ 
https://issues.apache.org/jira/browse/ARROW-6202?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16905974#comment-16905974
 ] 

Jim Northrup commented on ARROW-6202:
-

Full program execution below, from stderr.
Record count: 6.6 million records.

{code:shell}
++ dirname ../../jdbc2json/bin/flatsql.sh
+ JDIR=../../jdbc2json/bin/../
+ exec java -classpath 
'../../jdbc2json/bin/..//target/*:../../jdbc2json/bin/..//target/lib/*' 
com.fnreport.QueryToFlat 
'jdbc:informix-sqli://10.14.18.30:9088/i_l01:INFORMIXSERVER=;user=;password='
 'select  b.so_invoice_date,
   sum(b.sol_shipped_qty) ag_qty,a.available_stock,
   b.item_code,
   b.so_whse_code,
   b.division,
   b.department,
   b.category,
   b.class,
   b.item_group,
   b.cust_group,
   b.cust_category,
   b.cust_segment,
   b.cust_subsegment,
   b.sol_price_rule,
   b.del_method
from informix.ZEMP_STOCK_DAILY as a
 inner join isnt.ZEMP_SALES_BY_ITEM_FOR_REPORT as b
on  
a.whse_code = b.so_whse_code
and a.stock_code = b.item_code
and a.stock_date = b.so_invoice_date
and b.isbulk = 0
group by a.available_stock,
 b.so_invoice_date,
 b.item_code,
 b.so_whse_code,
 b.division,
 b.department,
 b.category,
 b.class,
 b.item_group,
 b.cust_group,
 b.cust_category,
 b.cust_segment,
 b.cust_subsegment,
 b.sol_price_rule,
 b.del_method'
using sql: select  b.so_invoice_date,
   sum(b.sol_shipped_qty) ag_qty,a.available_stock,
   b.item_code,
   b.so_whse_code,
   b.division,
   b.department,
   b.category,
   b.class,
   b.item_group,
   b.cust_group,
   b.cust_category,
   b.cust_segment,
   b.cust_subsegment,
   b.sol_price_rule,
   b.del_method
from informix.ZEMP_STOCK_DAILY as a
 inner join isnt.ZEMP_SALES_BY_ITEM_FOR_REPORT as b
on  
a.whse_code = b.so_whse_code
and a.stock_code = b.item_code
and a.stock_date = b.so_invoice_date
and b.isbulk = 0
group by a.available_stock,
 b.so_invoice_date,
 b.item_code,
 b.so_whse_code,
 b.division,
 b.department,
 b.category,
 b.class,
 b.item_group,
 b.cust_group,
 b.cust_category,
 b.cust_segment,
 b.cust_subsegment,
 b.sol_price_rule,
 b.del_method
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.
loading meta for ZEMP_STOCK_DAILY
(driver-specific potential meta, [TABLE_CAT, TABLE_SCHEM, TABLE_NAME, 
COLUMN_NAME, DATA_TYPE, TYPE_NAME, COLUMN_SIZE, BUFFER_LENGTH, DECIMAL_DIGITS, 
NUM_PREC_RADIX, NULLABLE, REMARKS, COLUMN_DEF, SQL_DATA_TYPE, SQL_DATETIME_SUB, 
CHAR_OCTET_LENGTH, ORDINAL_POSITION, IS_NULLABLE, SCOPE_CATALOG, SCOPE_SCHEMA, 
SCOPE_TABLE, SOURCE_DATA_TYPE, IS_AUTOINCREMENT])
loading meta for ZEMP_SALES_BY_ITEM_FOR_REPORT
(driver-specific potential meta, [TABLE_CAT, TABLE_SCHEM, TABLE_NAME, 
COLUMN_NAME, DATA_TYPE, TYPE_NAME, COLUMN_SIZE, BUFFER_LENGTH, DECIMAL_DIGITS, 
NUM_PREC_RADIX, NULLABLE, REMARKS, COLUMN_DEF, SQL_DATA_TYPE, SQL_DATETIME_SUB, 
CHAR_OCTET_LENGTH, ORDINAL_POSITION, IS_NULLABLE, SCOPE_CATALOG, SCOPE_SCHEMA, 
SCOPE_TABLE, SOURCE_DATA_TYPE, IS_AUTOINCREMENT])
[(ZEMP_STOCK_DAILY, [(whse_code, (DATE(13), 4)), (stock_date, (DOUBLE(7), 10)), 
(stock_code, (TINYINT(1), 16)), (stk_description, (TINYINT(1), 30)), 
(stk_apn_number, (TINYINT(1), 20)), (stock_group, (TINYINT(1), 4)), (sht_class, 
(TINYINT(1), 4)), (sht_category, (TINYINT(1), 4)), (sht_department, 
(TINYINT(1), 4)), (sht_division, (TINYINT(1), 4)), (stk_stock_status, 
(TINYINT(1), 1)), (whse_avg_cost, (FLOAT(5), 19)), (opening_balance, (FLOAT(5), 
14)), (transfers_in, (FLOAT(5), 14)), (transfers_out, (FLOAT(5), 14)), 
(whse_adjustments, (FLOAT(5), 14)), (cm_sales_qty, (FLOAT(5), 14)), 
(on_consignment, (FLOAT(5), 14)), (whse_qty_on_hand, (FLOAT(5), 19)), 
(awaiting_putaway, (FLOAT(5), 14)), (factory_orders, (FLOAT(5), 14)), 
(whse_back_orders, (FLOAT(5), 14)), (forward_orders, (FLOAT(5), 14)), 
(current_orders, (FLOAT(5), 14)), (qty_on_order, (FLOAT(5), 14)), 
(qty_in_transit, (FLOAT(5), 14)), (available_stock, (FLOAT(5), 20)), (dms, 
(FLOAT(5), 14)), (dmspromo, (FLOAT(5), 14)), (buffer, (FLOAT(5), 14)), 
(pradms1, (FLOAT(5), 14)), (pradms2, (FLOAT(5), 14)), (time_insert, (CHAR(10), 
23)), (lastbuydate, (DOUBLE(7), 10)), (kvi, (TINYINT(1), 4)), (condition_code, 
(TINYINT(1), 1)), (control_code, 
{code}

[jira] [Created] (ARROW-6202) Exception in thread "main" org.apache.arrow.memory.OutOfMemoryException: Unable to allocate buffer of size 4 due to memory limit. Current allocation: 2147483646

2019-08-11 Thread Jim Northrup (JIRA)
Jim Northrup created ARROW-6202:
---

 Summary: Exception in thread "main" 
org.apache.arrow.memory.OutOfMemoryException: Unable to allocate buffer of size 
4 due to memory limit. Current allocation: 2147483646
 Key: ARROW-6202
 URL: https://issues.apache.org/jira/browse/ARROW-6202
 Project: Apache Arrow
  Issue Type: Bug
Affects Versions: 0.14.1
Reporter: Jim Northrup



JDBC query results exceed the native heap even when using generous -Xmx settings. 

For roughly 800 megabytes of CSV/flat-file result set, Arrow is unable to hold 
the contents in RAM long enough to persist them to disk, without explicit 
knowledge beyond the unit-test sample code.

source:
https://github.com/jnorthrup/jdbc2json/blob/master/src/main/java/com/fnreport/QueryToFeather.kt#L83


{code:java}
Exception in thread "main" org.apache.arrow.memory.OutOfMemoryException: Unable 
to allocate buffer of size 4 due to memory limit. Current allocation: 2147483646
at org.apache.arrow.memory.BaseAllocator.buffer(BaseAllocator.java:307)
at org.apache.arrow.memory.BaseAllocator.buffer(BaseAllocator.java:277)
at 
org.apache.arrow.adapter.jdbc.JdbcToArrowUtils.updateVector(JdbcToArrowUtils.java:610)
at 
org.apache.arrow.adapter.jdbc.JdbcToArrowUtils.jdbcToFieldVector(JdbcToArrowUtils.java:462)
at 
org.apache.arrow.adapter.jdbc.JdbcToArrowUtils.jdbcToArrowVectors(JdbcToArrowUtils.java:396)
at 
org.apache.arrow.adapter.jdbc.JdbcToArrow.sqlToArrow(JdbcToArrow.java:225)
at 
org.apache.arrow.adapter.jdbc.JdbcToArrow.sqlToArrow(JdbcToArrow.java:187)
at 
org.apache.arrow.adapter.jdbc.JdbcToArrow.sqlToArrow(JdbcToArrow.java:156)
at com.fnreport.QueryToFeather$Companion.go(QueryToFeather.kt:83)
at 
com.fnreport.QueryToFeather$Companion$main$1.invokeSuspend(QueryToFeather.kt:95)
at 
kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(Dispatched.kt:241)
at 
kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:270)
at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:79)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:54)
at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source)
at 
kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt:36)
at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source)
at com.fnreport.QueryToFeather$Companion.main(QueryToFeather.kt:93)
at com.fnreport.QueryToFeather.main(QueryToFeather.kt)
{code}
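
A hedged workaround sketch (not the reporter's code; the single BIGINT column, 
batch size, and file name are illustrative): stream the ResultSet in bounded 
batches and write each batch to disk immediately, so no single Arrow buffer has 
to hold the whole result.

{code:java}
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import org.apache.arrow.memory.RootAllocator;
import org.apache.arrow.vector.BigIntVector;
import org.apache.arrow.vector.VectorSchemaRoot;
import org.apache.arrow.vector.ipc.ArrowFileWriter;

public class ChunkedExport {
    static final int BATCH = 64 * 1024; // bounded batch size; tune to taste

    public static void main(String[] args) throws Exception {
        // args[0] = JDBC url, args[1] = query
        try (RootAllocator alloc = new RootAllocator(Long.MAX_VALUE);
             BigIntVector qty = new BigIntVector("ag_qty", alloc);
             VectorSchemaRoot root = VectorSchemaRoot.of(qty);
             FileChannel out = FileChannel.open(Paths.get("result.arrow"),
                     StandardOpenOption.CREATE, StandardOpenOption.WRITE);
             Connection conn = DriverManager.getConnection(args[0]);
             ResultSet rs = conn.createStatement().executeQuery(args[1]);
             ArrowFileWriter writer = new ArrowFileWriter(root, null, out)) {
            writer.start();
            int row = 0;
            while (rs.next()) {
                qty.setSafe(row++, rs.getLong("ag_qty"));
                if (row == BATCH) {            // flush a full batch to disk
                    root.setRowCount(row);
                    writer.writeBatch();
                    row = 0;
                }
            }
            if (row > 0) {                     // flush the final partial batch
                root.setRowCount(row);
                writer.writeBatch();
            }
            writer.end();
        }
    }
}
{code}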





--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Comment Edited] (ARROW-5412) [Java] Integration test fails with UnsupportedOperationException

2019-08-11 Thread Jim Northrup (JIRA)


[ 
https://issues.apache.org/jira/browse/ARROW-5412?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16904736#comment-16904736
 ] 

Jim Northrup edited comment on ARROW-5412 at 8/11/19 8:43 PM:
--

Setting the system property as the first line of main() succeeds on JDK 12, 
where the latest release as of this comment still throws the exception cited.

{code:kotlin}
@JvmStatic
fun main(vararg args: String) = runBlocking {
    System.setProperty("io.netty.tryReflectionSetAccessible", "true")
    go(*args)
}
{code}
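
One ordering caveat worth noting (hedged): netty reads this property when its 
PlatformDependent class initializes, so the in-main() approach works only if no 
netty class has been loaded yet.  Passing the flag on the command line 
sidesteps that:

{code:shell}
# equivalent, and immune to class-initialization ordering
java -Dio.netty.tryReflectionSetAccessible=true \
     -classpath "target/*:target/lib/*" com.fnreport.QueryToFeather
{code}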




was (Author: jimn235):
Setting the system property as the first line of main() succeeds on JDK 12, 
where the latest release as of this comment still throws the exception cited.

@JvmStatic
fun main(vararg args: String) = runBlocking{
   System.setProperty( "io.netty.tryReflectionSetAccessible","true") 
   go(*args) 
}




> [Java] Integration test fails with UnsupportedOperationException
> 
>
> Key: ARROW-5412
> URL: https://issues.apache.org/jira/browse/ARROW-5412
> Project: Apache Arrow
>  Issue Type: Bug
>  Components: Integration, Java
>Reporter: Benjamin Kietzman
>Assignee: Bryan Cutler
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 0.14.0
>
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> Running the java integration test fails with an exception:
> {code}
> $ java -classpath 
> ~/arrow/java/tools/target/arrow-tools-0.14.0-SNAPSHOT-jar-with-dependencies.jar
>  -Dio.netty.tryReflectionSetAccessible=false 
> org.apache.arrow.tools.Integration -a ~/tmp/1832930b_simple.json_as_file -j 
> ~/arrow/integration/data/simple.json -c VALIDATE
> ...
> Incompatible files
> sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
> 08:55:43.597 [main] ERROR org.apache.arrow.tools.Integration - Incompatible 
> files
> java.lang.UnsupportedOperationException: sun.misc.Unsafe or 
> java.nio.DirectByteBuffer.<init>(long, int) not available
>   at 
> io.netty.util.internal.PlatformDependent.directBuffer(PlatformDependent.java:399)
>   at io.netty.buffer.NettyArrowBuf.getDirectBuffer(NettyArrowBuf.java:233)
>   at io.netty.buffer.NettyArrowBuf.nioBuffer(NettyArrowBuf.java:223)
>   at io.netty.buffer.ArrowBuf.nioBuffer(ArrowBuf.java:245)
>   at 
> org.apache.arrow.vector.ipc.message.ArrowRecordBatch.computeBodyLength(ArrowRecordBatch.java:211)
>   at 
> org.apache.arrow.vector.ipc.message.MessageSerializer.serialize(MessageSerializer.java:175)
>   at 
> org.apache.arrow.vector.ipc.ArrowWriter.writeRecordBatch(ArrowWriter.java:119)
>   at 
> org.apache.arrow.vector.ipc.ArrowFileWriter.writeRecordBatch(ArrowFileWriter.java:61)
>   at 
> org.apache.arrow.vector.ipc.ArrowWriter.writeBatch(ArrowWriter.java:107)
>   at 
> org.apache.arrow.tools.Integration$Command$2.execute(Integration.java:171)
>   at org.apache.arrow.tools.Integration.run(Integration.java:118)
>   at org.apache.arrow.tools.Integration.main(Integration.java:69)
> {code}
> Looking through netty's source, it looks like this exception is [emitted 
> here|https://github.com/netty/netty/blob/master/common/src/main/java/io/netty/util/internal/PlatformDependent.java#L343-L344].
> {code}
> $ apt search jdk | grep installed
> default-jre/bionic-updates,bionic-security,now 2:1.11-68ubuntu1~18.04.1 amd64 
> [installed,automatic]
> default-jre-headless/bionic-updates,bionic-security,now 
> 2:1.11-68ubuntu1~18.04.1 amd64 [installed,automatic]
> libslf4j-java/bionic,bionic,now 1.7.25-3 all [installed,automatic]
> openjdk-11-jdk/bionic-updates,bionic-security,now 11.0.3+7-1ubuntu2~18.04.1 
> amd64 [installed]
> openjdk-11-jdk-headless/bionic-updates,bionic-security,now 
> 11.0.3+7-1ubuntu2~18.04.1 amd64 [installed,automatic]
> openjdk-11-jre/bionic-updates,bionic-security,now 11.0.3+7-1ubuntu2~18.04.1 
> amd64 [installed,automatic]
> openjdk-11-jre-headless/bionic-updates,bionic-security,now 
> 11.0.3+7-1ubuntu2~18.04.1 amd64 [installed,automatic]
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Comment Edited] (ARROW-5412) [Java] Integration test fails with UnsupportedOperationException

2019-08-11 Thread Jim Northrup (JIRA)


[ 
https://issues.apache.org/jira/browse/ARROW-5412?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16904736#comment-16904736
 ] 

Jim Northrup edited comment on ARROW-5412 at 8/11/19 8:42 PM:
--

Setting the system property as the first line of main() succeeds on JDK 12, 
where the latest release as of this comment still throws the exception cited.

@JvmStatic
fun main(vararg args: String) = runBlocking{
   System.setProperty( "io.netty.tryReflectionSetAccessible","true") 
   go(*args) 
}





was (Author: jimn235):
Setting the system property as the first line of main() succeeds on JDK 12, 
where the latest release as of this comment still throws the error cited.

 
@JvmStatic
fun main(vararg args: String) = runBlocking {
    System.setProperty("io.netty.tryReflectionSetAccessible", "true")
    go(*args)
}

> [Java] Integration test fails with UnsupportedOperationException
> 
>
> Key: ARROW-5412
> URL: https://issues.apache.org/jira/browse/ARROW-5412
> Project: Apache Arrow
>  Issue Type: Bug
>  Components: Integration, Java
>Reporter: Benjamin Kietzman
>Assignee: Bryan Cutler
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 0.14.0
>
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> Running the java integration test fails with an exception:
> {code}
> $ java -classpath 
> ~/arrow/java/tools/target/arrow-tools-0.14.0-SNAPSHOT-jar-with-dependencies.jar
>  -Dio.netty.tryReflectionSetAccessible=false 
> org.apache.arrow.tools.Integration -a ~/tmp/1832930b_simple.json_as_file -j 
> ~/arrow/integration/data/simple.json -c VALIDATE
> ...
> Incompatible files
> sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
> 08:55:43.597 [main] ERROR org.apache.arrow.tools.Integration - Incompatible 
> files
> java.lang.UnsupportedOperationException: sun.misc.Unsafe or 
> java.nio.DirectByteBuffer.<init>(long, int) not available
>   at 
> io.netty.util.internal.PlatformDependent.directBuffer(PlatformDependent.java:399)
>   at io.netty.buffer.NettyArrowBuf.getDirectBuffer(NettyArrowBuf.java:233)
>   at io.netty.buffer.NettyArrowBuf.nioBuffer(NettyArrowBuf.java:223)
>   at io.netty.buffer.ArrowBuf.nioBuffer(ArrowBuf.java:245)
>   at 
> org.apache.arrow.vector.ipc.message.ArrowRecordBatch.computeBodyLength(ArrowRecordBatch.java:211)
>   at 
> org.apache.arrow.vector.ipc.message.MessageSerializer.serialize(MessageSerializer.java:175)
>   at 
> org.apache.arrow.vector.ipc.ArrowWriter.writeRecordBatch(ArrowWriter.java:119)
>   at 
> org.apache.arrow.vector.ipc.ArrowFileWriter.writeRecordBatch(ArrowFileWriter.java:61)
>   at 
> org.apache.arrow.vector.ipc.ArrowWriter.writeBatch(ArrowWriter.java:107)
>   at 
> org.apache.arrow.tools.Integration$Command$2.execute(Integration.java:171)
>   at org.apache.arrow.tools.Integration.run(Integration.java:118)
>   at org.apache.arrow.tools.Integration.main(Integration.java:69)
> {code}
> Looking through netty's source, it looks like this exception is [emitted 
> here|https://github.com/netty/netty/blob/master/common/src/main/java/io/netty/util/internal/PlatformDependent.java#L343-L344].
> {code}
> $ apt search jdk | grep installed
> default-jre/bionic-updates,bionic-security,now 2:1.11-68ubuntu1~18.04.1 amd64 
> [installed,automatic]
> default-jre-headless/bionic-updates,bionic-security,now 
> 2:1.11-68ubuntu1~18.04.1 amd64 [installed,automatic]
> libslf4j-java/bionic,bionic,now 1.7.25-3 all [installed,automatic]
> openjdk-11-jdk/bionic-updates,bionic-security,now 11.0.3+7-1ubuntu2~18.04.1 
> amd64 [installed]
> openjdk-11-jdk-headless/bionic-updates,bionic-security,now 
> 11.0.3+7-1ubuntu2~18.04.1 amd64 [installed,automatic]
> openjdk-11-jre/bionic-updates,bionic-security,now 11.0.3+7-1ubuntu2~18.04.1 
> amd64 [installed,automatic]
> openjdk-11-jre-headless/bionic-updates,bionic-security,now 
> 11.0.3+7-1ubuntu2~18.04.1 amd64 [installed,automatic]
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)


[jira] [Commented] (ARROW-5412) [Java] Integration test fails with UnsupportedOperationException

2019-08-11 Thread Jim Northrup (JIRA)


[ 
https://issues.apache.org/jira/browse/ARROW-5412?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16904736#comment-16904736
 ] 

Jim Northrup commented on ARROW-5412:
-

Setting the system property as the first line of main() succeeds on JDK 12, 
where the latest release as of this comment still throws the error cited.

 
@JvmStatic
fun main(vararg args: String) = runBlocking {
    System.setProperty("io.netty.tryReflectionSetAccessible", "true")
    go(*args)
}

> [Java] Integration test fails with UnsupportedOperationException
> 
>
> Key: ARROW-5412
> URL: https://issues.apache.org/jira/browse/ARROW-5412
> Project: Apache Arrow
>  Issue Type: Bug
>  Components: Integration, Java
>Reporter: Benjamin Kietzman
>Assignee: Bryan Cutler
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 0.14.0
>
>  Time Spent: 1h 20m
>  Remaining Estimate: 0h
>
> Running the java integration test fails with an exception:
> {code}
> $ java -classpath 
> ~/arrow/java/tools/target/arrow-tools-0.14.0-SNAPSHOT-jar-with-dependencies.jar
>  -Dio.netty.tryReflectionSetAccessible=false 
> org.apache.arrow.tools.Integration -a ~/tmp/1832930b_simple.json_as_file -j 
> ~/arrow/integration/data/simple.json -c VALIDATE
> ...
> Incompatible files
> sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
> 08:55:43.597 [main] ERROR org.apache.arrow.tools.Integration - Incompatible 
> files
> java.lang.UnsupportedOperationException: sun.misc.Unsafe or 
> java.nio.DirectByteBuffer.<init>(long, int) not available
>   at 
> io.netty.util.internal.PlatformDependent.directBuffer(PlatformDependent.java:399)
>   at io.netty.buffer.NettyArrowBuf.getDirectBuffer(NettyArrowBuf.java:233)
>   at io.netty.buffer.NettyArrowBuf.nioBuffer(NettyArrowBuf.java:223)
>   at io.netty.buffer.ArrowBuf.nioBuffer(ArrowBuf.java:245)
>   at 
> org.apache.arrow.vector.ipc.message.ArrowRecordBatch.computeBodyLength(ArrowRecordBatch.java:211)
>   at 
> org.apache.arrow.vector.ipc.message.MessageSerializer.serialize(MessageSerializer.java:175)
>   at 
> org.apache.arrow.vector.ipc.ArrowWriter.writeRecordBatch(ArrowWriter.java:119)
>   at 
> org.apache.arrow.vector.ipc.ArrowFileWriter.writeRecordBatch(ArrowFileWriter.java:61)
>   at 
> org.apache.arrow.vector.ipc.ArrowWriter.writeBatch(ArrowWriter.java:107)
>   at 
> org.apache.arrow.tools.Integration$Command$2.execute(Integration.java:171)
>   at org.apache.arrow.tools.Integration.run(Integration.java:118)
>   at org.apache.arrow.tools.Integration.main(Integration.java:69)
> {code}
> Looking through netty's source, it looks like this exception is [emitted 
> here|https://github.com/netty/netty/blob/master/common/src/main/java/io/netty/util/internal/PlatformDependent.java#L343-L344].
> {code}
> $ apt search jdk | grep installed
> default-jre/bionic-updates,bionic-security,now 2:1.11-68ubuntu1~18.04.1 amd64 
> [installed,automatic]
> default-jre-headless/bionic-updates,bionic-security,now 
> 2:1.11-68ubuntu1~18.04.1 amd64 [installed,automatic]
> libslf4j-java/bionic,bionic,now 1.7.25-3 all [installed,automatic]
> openjdk-11-jdk/bionic-updates,bionic-security,now 11.0.3+7-1ubuntu2~18.04.1 
> amd64 [installed]
> openjdk-11-jdk-headless/bionic-updates,bionic-security,now 
> 11.0.3+7-1ubuntu2~18.04.1 amd64 [installed,automatic]
> openjdk-11-jre/bionic-updates,bionic-security,now 11.0.3+7-1ubuntu2~18.04.1 
> amd64 [installed,automatic]
> openjdk-11-jre-headless/bionic-updates,bionic-security,now 
> 11.0.3+7-1ubuntu2~18.04.1 amd64 [installed,automatic]
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)