[jira] [Created] (ARROW-3479) [R] Support to write record_batch as stream
Javier Luraschi created ARROW-3479: -- Summary: [R] Support to write record_batch as stream Key: ARROW-3479 URL: https://issues.apache.org/jira/browse/ARROW-3479 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Currently, one can only export a record batch to a file:
{code:java}
record <- arrow::record_batch(data.frame(a = c(1,2,3)))
record$to_file()
{code}
But to improve performance in Spark's R bindings through sparklyr, record batches should also support writing to streams that return R raw vectors, as follows:
{code:java}
record <- arrow::record_batch(data.frame(a = c(1,2,3)))
record$to_stream()
{code}
-- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Created] (ARROW-3484) [R] Support to read record_batch from stream
Javier Luraschi created ARROW-3484: -- Summary: [R] Support to read record_batch from stream Key: ARROW-3484 URL: https://issues.apache.org/jira/browse/ARROW-3484 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi https://issues.apache.org/jira/browse/ARROW-3479 enabled
{code:java}
record <- arrow::record_batch(data.frame(a = c(1,2,3)))
stream <- record$to_stream()
{code}
This issue tracks implementing the reverse operation by running:
{code:java}
read_record_batch_stream(stream)
{code}
[jira] [Created] (ARROW-3547) [R] Protect against Null crash when reading from RecordBatch
Javier Luraschi created ARROW-3547: -- Summary: [R] Protect against Null crash when reading from RecordBatch Key: ARROW-3547 URL: https://issues.apache.org/jira/browse/ARROW-3547 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Reprex:
{code:java}
tbl <- tibble::tibble(
  int = 1:10,
  dbl = as.numeric(1:10),
  lgl = sample(c(TRUE, FALSE, NA), 10, replace = TRUE),
  chr = letters[1:10]
)

batch <- record_batch(tbl)
bytes <- write_record_batch(batch, raw())

stream_reader <- record_batch_stream_reader(bytes)
batch1 <- read_record_batch(stream_reader)
batch2 <- read_record_batch(stream_reader)

# Crash
as_tibble(batch2)
{code}
While users should check for NULL entries by running:
{code:java}
if (!batch2$is_null()) as_tibble(batch2)
{code}
it is harsh to trigger a crash; we should consider protecting all functions that use RecordBatch pointers so they fail with an R error instead, for instance:
{code:java}
List RecordBatch__to_dataframe(const std::shared_ptr<arrow::RecordBatch>& batch) {
  if (batch == nullptr) Rcpp::stop("Can't read from NULL record batch.");
  // ...
}
{code}
[jira] [Created] (ARROW-3591) [R] Support to collect decimal type
Javier Luraschi created ARROW-3591: -- Summary: [R] Support to collect decimal type Key: ARROW-3591 URL: https://issues.apache.org/jira/browse/ARROW-3591 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Collecting decimal types from `sparklyr` through:
{code:java}
library(sparklyr)
sc <- spark_connect(master = "local")

sdf_len(sc, 3) %>%
  dplyr::mutate(new = 1) %>%
  dplyr::collect()
{code}
causes:
{code:java}
Error in RecordBatch__to_dataframe(x) : cannot handle Array of type decimal
{code}
[jira] [Created] (ARROW-3604) [R] Support to collect int64 as ints
Javier Luraschi created ARROW-3604: -- Summary: [R] Support to collect int64 as ints Key: ARROW-3604 URL: https://issues.apache.org/jira/browse/ARROW-3604 Project: Apache Arrow Issue Type: Improvement Reporter: Javier Luraschi
[jira] [Created] (ARROW-3614) [R] Handle Type::TIMESTAMP from Arrow to R
Javier Luraschi created ARROW-3614: -- Summary: [R] Handle Type::TIMESTAMP from Arrow to R Key: ARROW-3614 URL: https://issues.apache.org/jira/browse/ARROW-3614 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi
[jira] [Created] (ARROW-3615) [R] Support for NaN
Javier Luraschi created ARROW-3615: -- Summary: [R] Support for NaN Key: ARROW-3615 URL: https://issues.apache.org/jira/browse/ARROW-3615 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi
[jira] [Created] (ARROW-3647) [R] Crash after unloading bit64 package
Javier Luraschi created ARROW-3647: -- Summary: [R] Crash after unloading bit64 package Key: ARROW-3647 URL: https://issues.apache.org/jira/browse/ARROW-3647 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi
{code:java}
# create array with int64 values
library(arrow)
x <- bit64::as.integer64(1:10)
a <- array(x)

# unload package
detach("package:arrow", unload = TRUE)

# crash
a$as_vector()
{code}
[jira] [Created] (ARROW-3657) [R] Require bit64 package
Javier Luraschi created ARROW-3657: -- Summary: [R] Require bit64 package Key: ARROW-3657 URL: https://issues.apache.org/jira/browse/ARROW-3657 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Assignee: Javier Luraschi
{code:java}
devtools::install_github("apache/arrow", subdir = "r")
{code}
{code:java}
Error in loadNamespace(j <- i[[1L]], c(lib.loc, .libPaths()), versionCheck = vI[[j]]) : there is no package called ‘bit64’
{code}
[jira] [Created] (ARROW-3693) [R] Invalid buffer for null characters with null data
Javier Luraschi created ARROW-3693: -- Summary: [R] Invalid buffer for null characters with null data Key: ARROW-3693 URL: https://issues.apache.org/jira/browse/ARROW-3693 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Attachments: Screen Shot 2018-11-02 at 10.27.11 PM.png I'm hitting this from data coming from Spark while retrieving this data frame:
{code:java}
  default
a TRUE
b TRUE
{code}
Error:
{code:java}
error: Failed to fetch data: invalid data in buffer 2
{code}
The problem is that it is possible to have a NULL character array with offsets set to 0. Notice that in the example above Spark returns two batches, so effectively we are trying to parse:
{code:java}
  default
b TRUE
{code}
where the data array is NULL and the offsets are 0 for the columns. Here is a snapshot while debugging StringArray_to_Vector: !Screen Shot 2018-11-02 at 10.27.11 PM.png!
[jira] [Created] (ARROW-3702) [R] POSIXct mapped to DateType not TimestampType?
Javier Luraschi created ARROW-3702: -- Summary: [R] POSIXct mapped to DateType not TimestampType? Key: ARROW-3702 URL: https://issues.apache.org/jira/browse/ARROW-3702 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Why was POSIXct mapped to [DateType|https://arrow.apache.org/docs/cpp/classarrow_1_1_date_type.html#a6aea1fcfd9f998e8fa50f5ae62dbd7e6] and not [TimestampType|https://arrow.apache.org/docs/cpp/classarrow_1_1_timestamp_type.html#a88e0ba47b82571b3fc3798b6c099499b]? What are the pros and cons of each approach? This is mostly to interoperate with Spark, which chose to map POSIXct to timestamps since in Spark (not Arrow) dates do not have a time component. There is a way to make this work in Spark with POSIXct mapped to DateType, by mapping DateType to timestamps, so I'm mostly looking to understand the tradeoffs. One particular question: timestamps in Arrow seem to support time zones, so wouldn't it make more sense to map POSIXct to timestamps?
[jira] [Created] (ARROW-3780) [R] Failed to fetch data: invalid data when collecting int16
Javier Luraschi created ARROW-3780: -- Summary: [R] Failed to fetch data: invalid data when collecting int16 Key: ARROW-3780 URL: https://issues.apache.org/jira/browse/ARROW-3780 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Repro from sparklyr unit test:
{code:java}
library(dplyr)
library(sparklyr)
library(arrow)

sc <- spark_connect(master = "local")

hive_type <- tibble::frame_data(
  ~stype,     ~svalue, ~rtype,    ~rvalue, ~arrow,
  "smallint", "1",     "integer", "1",     "integer"
)

spark_query <- hive_type %>%
  mutate(
    query = paste0("cast(", svalue, " as ", stype, ") as ",
                   gsub("\\(|\\)", "", stype), "_col")
  ) %>%
  pull(query) %>%
  paste(collapse = ", ") %>%
  paste("SELECT", .)

spark_types <- DBI::dbGetQuery(sc, spark_query) %>%
  lapply(function(e) class(e)[[1]]) %>%
  as.character()
{code}
Actual: error: Failed to fetch data: invalid data
[jira] [Created] (ARROW-3783) [R] Incorrect collection of float type
Javier Luraschi created ARROW-3783: -- Summary: [R] Incorrect collection of float type Key: ARROW-3783 URL: https://issues.apache.org/jira/browse/ARROW-3783 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Repro from `sparklyr`:
{code:java}
library(sparklyr)
library(arrow)

sc <- spark_connect(master = "local")
DBI::dbGetQuery(sc, "SELECT cast(1 as float)")
{code}
Actual:
{code:java}
  CAST(1 AS FLOAT)
1       1065353216
{code}
Expected:
{code:java}
  CAST(1 AS FLOAT)
1                1
{code}
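The bogus value is recognizable: 1065353216 is 0x3F800000, the IEEE-754 bit pattern of 1.0f read back as a 32-bit integer, which suggests the float buffer is being collected as int32. A quick Python sketch (illustration only, not Arrow code) demonstrating the reinterpretation:

```python
import struct

# Pack 1.0 as a 32-bit IEEE-754 float, then reinterpret the same four
# bytes as a little-endian signed 32-bit integer.
bits = struct.unpack("<i", struct.pack("<f", 1.0))[0]
print(bits)       # 1065353216, the value returned in the repro above
print(hex(bits))  # 0x3f800000
```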
[jira] [Created] (ARROW-3784) [R] Array with type fails with x is not a vector
Javier Luraschi created ARROW-3784: -- Summary: [R] Array with type fails with x is not a vector Key: ARROW-3784 URL: https://issues.apache.org/jira/browse/ARROW-3784 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi
{code:java}
array(1:10, type = int32())
{code}
Actual:
{code:java}
Error: `x` is not a vector
{code}
Expected:
{code:java}
arrow::Array
[ 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 ]
{code}
[jira] [Created] (ARROW-3794) [R] Consider mapping INT8 to integer() not raw()
Javier Luraschi created ARROW-3794: -- Summary: [R] Consider mapping INT8 to integer() not raw() Key: ARROW-3794 URL: https://issues.apache.org/jira/browse/ARROW-3794 Project: Apache Arrow Issue Type: Improvement Reporter: Javier Luraschi The Arrow::BINARY type maps better to R's raw(), while Arrow::INT8 maps better to R's integer(): currently, NAs are not supported when collecting INT8s, and numerical operations can't be performed against raw().
[jira] [Created] (ARROW-3795) [R] Support for retrieving NAs from INT64 arrays
Javier Luraschi created ARROW-3795: -- Summary: [R] Support for retrieving NAs from INT64 arrays Key: ARROW-3795 URL: https://issues.apache.org/jira/browse/ARROW-3795 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi I have a repro using sparklyr, but this is likely also reproducible through the C++ bindings:
{code:java}
library(sparklyr)
library(arrow)

sc <- spark_connect(master = "local")
DBI::dbGetQuery(sc, "SELECT cast(NULL as bigint)")
{code}
Actual:
{code:java}
  CAST(NULL AS BIGINT)
1 -4332462841530417152
{code}
Expected:
{code:java}
  CAST(NULL AS BIGINT)
1                   NA
{code}
[jira] [Created] (ARROW-3804) [R] Consider lowering required R runtime
Javier Luraschi created ARROW-3804: -- Summary: [R] Consider lowering required R runtime Key: ARROW-3804 URL: https://issues.apache.org/jira/browse/ARROW-3804 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Currently R 3.5 is required, but only R 3.1 is needed by the package's functionality and dependencies.
[jira] [Created] (ARROW-4450) Distribute in Debian Repo
Javier Luraschi created ARROW-4450: -- Summary: Distribute in Debian Repo Key: ARROW-4450 URL: https://issues.apache.org/jira/browse/ARROW-4450 Project: Apache Arrow Issue Type: Improvement Reporter: Javier Luraschi Distribute Arrow in the Debian repo: [https://www.debian.org/doc/manuals/distribute-deb/distribute-deb.html#adding-packages-to-debian] Required to publish the Arrow R package to CRAN.
[jira] [Created] (ARROW-4451) [Packaging] Distribute in Fedora Repo
Javier Luraschi created ARROW-4451: -- Summary: [Packaging] Distribute in Fedora Repo Key: ARROW-4451 URL: https://issues.apache.org/jira/browse/ARROW-4451 Project: Apache Arrow Issue Type: Improvement Components: Packaging Reporter: Javier Luraschi Distribute Arrow in the Fedora repo: [https://fedoraproject.org/wiki/New_package_process_for_existing_contributors] Required to publish the Arrow R package to CRAN.
[jira] [Created] (ARROW-4565) [R] Reading records with all non-null decimals SEGFAULTs
Javier Luraschi created ARROW-4565: -- Summary: [R] Reading records with all non-null decimals SEGFAULTs Key: ARROW-4565 URL: https://issues.apache.org/jira/browse/ARROW-4565 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Repro:
{code:java}
library(sparklyr)
library(arrow)

sc <- spark_connect(master = "local")
sdf_len(sc, 10^5) %>% dplyr::mutate(batch = id %% 10)
{code}
This produces the following under Arrow 0.12; there is no repro under Arrow 0.11:
{code:java}
*** caught segfault ***
address 0x10, cause 'memory not mapped'

Traceback:
 1: RecordBatch__to_dataframe(x, use_threads = use_threads)
 2: `as_tibble.arrow::RecordBatch`(record_entry)
 3: tibble::as_tibble(record_entry)
 4: arrow_read_stream(.)
 5: function_list[[i]](value)
 6: freduce(value, `_function_list`)
 7: `_fseq`(`_lhs`)
 8: eval(quote(`_fseq`(`_lhs`)), env, env)
 9: eval(quote(`_fseq`(`_lhs`)), env, env)
10: withVisible(eval(quote(`_fseq`(`_lhs`)), env, env))
11: invoke_static(sc, "sparklyr.ArrowConverters", "toArrowBatchRdd", sdf, session, time_zone) %>% arrow_read_stream() %>% dplyr::bind_rows()
12: arrow_collect(object, ...)
{code}
Notice that the following cast is unsupported; I can add a test if someone can come up with a way of creating a decimal type.
{code:java}
batch <- table(tibble::tibble(x = 1:10))
batch$cast(schema(x = decimal()))
{code}
{code:java}
Error in Decimal128Type__initialize(precision, scale) : argument "precision" is missing, with no default
{code}
I'll send a PR with a fix...
[jira] [Created] (ARROW-4725) [C++] Dictionary tests disabled under MinGW builds
Javier Luraschi created ARROW-4725: -- Summary: [C++] Dictionary tests disabled under MinGW builds Key: ARROW-4725 URL: https://issues.apache.org/jira/browse/ARROW-4725 Project: Apache Arrow Issue Type: Test Components: C++ Reporter: Javier Luraschi Follow-up needed for [arrow/pull/3693/files|https://github.com/apache/arrow/pull/3693/files]. Under cpp/src/arrow/CMakeLists.txt, the PR disabled the array-dict-test.cc test by adding:
{code:java}
if(WIN32)
  add_arrow_test(array-test
                 SOURCES array-test.cc array-binary-test.cc
                         array-list-test.cc array-struct-test.cc)
else()
  add_arrow_test(array-test
                 SOURCES array-test.cc array-binary-test.cc array-dict-test.cc
                         array-list-test.cc array-struct-test.cc)
endif()
{code}
This should be reverted and investigated further. The build error that including this test triggers is the following:
{code:java}
/arrow-array-test.dir/objects.a(array-dict-test.cc.obj):array-dict-test.cc:(.text+0xb9a2): undefined reference to `arrow::DictionaryBuilder::DictionaryBuilder(std::shared_ptr const&, arrow::MemoryPool*)'
CMakeFiles/arrow-array-test.dir/objects.a(array-dict-test.cc.obj):array-dict-test.cc:(.text+0xcb8a): undefined reference to `arrow::DictionaryBuilder::DictionaryBuilder(std::shared_ptr const&, arrow::MemoryPool*)'
CMakeFiles/arrow-array-test.dir/objects.a(array-dict-test.cc.obj):array-dict-test.cc:(.text+0xeef8): undefined reference to `arrow::DictionaryBuilder::DictionaryBuilder(std::shared_ptr const&, arrow::MemoryPool*)'
CMakeFiles/arrow-array-test.dir/objects.a(array-dict-test.cc.obj):array-dict-test.cc:(.text+0x10240): undefined reference to `arrow::DictionaryBuilder::DictionaryBuilder(std::shared_ptr const&, arrow::MemoryPool*)'
CMakeFiles/arrow-array-test.dir/objects.a(array-dict-test.cc.obj):array-dict-test.cc:(.text+0x104fc): undefined reference to `arrow::DictionaryBuilder::AppendArray(arrow::Array const&)'
CMakeFiles/arrow-array-test.dir/objects.a(array-dict-test.cc.obj):array-dict-test.cc:(.text+0x108ef): undefined reference to `arrow
{code}
[jira] [Created] (ARROW-4726) [C++] IntToFloatingPoint tests disabled under 32bit builds
Javier Luraschi created ARROW-4726: -- Summary: [C++] IntToFloatingPoint tests disabled under 32bit builds Key: ARROW-4726 URL: https://issues.apache.org/jira/browse/ARROW-4726 Project: Apache Arrow Issue Type: Improvement Components: C++ Reporter: Javier Luraschi Follow-up needed for [arrow/pull/3693/files|https://github.com/apache/arrow/pull/3693/files]. Under cpp/src/arrow/compute/kernels/cast-test.cc, the TestCast/IntToFloatingPoint test was disabled by adding:
{code:java}
#if ARROW_BITNESS >= 64
#endif
{code}
This should be reverted and investigated further.
[jira] [Created] (ARROW-4724) [C++] Python not being built nor tested under MinGW builds
Javier Luraschi created ARROW-4724: -- Summary: [C++] Python not being built nor tested under MinGW builds Key: ARROW-4724 URL: https://issues.apache.org/jira/browse/ARROW-4724 Project: Apache Arrow Issue Type: Test Components: C++ Reporter: Javier Luraschi Follow-up needed for [arrow/pull/3693/files|https://github.com/apache/arrow/pull/3693/files]. appveyor-cpp-build-mingw.bat has not yet enabled Python tests; we need to revert -DARROW_PYTHON=OFF. The suggestion was to use:
{code:java}
diff --git a/ci/appveyor-cpp-build-mingw.bat b/ci/appveyor-cpp-build-mingw.bat
index 06e8b7f7..3a853031 100644
--- a/ci/appveyor-cpp-build-mingw.bat
+++ b/ci/appveyor-cpp-build-mingw.bat
@@ -24,6 +24,15 @@
 set INSTALL_DIR=%HOMEDRIVE%%HOMEPATH%\install
 set PATH=%INSTALL_DIR%\bin;%PATH%
 set PKG_CONFIG_PATH=%INSTALL_DIR%\lib\pkgconfig
 
+for /f "usebackq" %%v in (`python3 -c "import sys; print('.'.join(map(str, sys.version_info[0:2])))"`) do (
+  set PYTHON_VERSION=%%v
+)
+
+set PYTHONHOME=%MINGW_PREFIX%\lib\python%PYTHON_VERSION%
+set PYTHONPATH=%PYTHONHOME%
+set PYTHONPATH=%PYTHONPATH%;%MINGW_PREFIX%\lib\python%PYTHON_VERSION%\lib-dynload
+set PYTHONPATH=%PYTHONPATH%;%MINGW_PREFIX%\lib\python%PYTHON_VERSION%\site-packages
+
{code}
However, this suggestion currently triggers a build error in Travis:
{code:java}
[ 43%] Building CXX object src/arrow/CMakeFiles/arrow_objlib.dir/ipc/json-simple.cc.obj
[ 44%] Building CXX object src/arrow/CMakeFiles/arrow_objlib.dir/ipc/message.cc.obj
[ 44%] Building CXX object src/arrow/CMakeFiles/arrow_objlib.dir/ipc/metadata-internal.cc.obj
[ 45%] Building CXX object src/arrow/CMakeFiles/arrow_objlib.dir/ipc/reader.cc.obj
[ 45%] Building CXX object src/arrow/CMakeFiles/arrow_objlib.dir/ipc/writer.cc.obj
[ 45%] Built target arrow_objlib
make: *** [Makefile:141: all] Error 2
C:\projects\arrow\cpp\build>goto scriptexit
{code}
Therefore, additional investigation is needed.
[jira] [Created] (ARROW-4834) [R] Feature flag to disable parquet
Javier Luraschi created ARROW-4834: -- Summary: [R] Feature flag to disable parquet Key: ARROW-4834 URL: https://issues.apache.org/jira/browse/ARROW-4834 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Support an ARROW_R_PARQUET_OFF feature flag that disables building the R package with Parquet support.
[jira] [Created] (ARROW-4911) [R] Support for building package for Windows
Javier Luraschi created ARROW-4911: -- Summary: [R] Support for building package for Windows Key: ARROW-4911 URL: https://issues.apache.org/jira/browse/ARROW-4911 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi
[jira] [Created] (ARROW-4995) [R] Make sure winbuilder tests pass for package
Javier Luraschi created ARROW-4995: -- Summary: [R] Make sure winbuilder tests pass for package Key: ARROW-4995 URL: https://issues.apache.org/jira/browse/ARROW-4995 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi Current submission to winbuilder triggers the following errors:
{code:java}
* using log directory 'd:/RCompile/CRANguest/R-release/arrow.Rcheck'
* using R version 3.5.3 (2019-03-11)
* using platform: x86_64-w64-mingw32 (64-bit)
* using session charset: ISO8859-1
* checking for file 'arrow/DESCRIPTION' ... OK
* this is package 'arrow' version '0.12.0.9000'
* package encoding: UTF-8
* checking CRAN incoming feasibility ... NOTE
Maintainer: 'Javier Luraschi '
New submission
Version contains large components (0.12.0.9000)
* checking package namespace information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for hidden files and directories ... NOTE
Found the following hidden files and directories:
  .travis.yml
These were most likely included in error. See section 'Package structure' in the 'Writing R Extensions' manual.
CRAN-pack does not know about .travis.yml
* checking for portable file names ... OK
* checking whether package 'arrow' can be installed ... OK
* checking installed package size ... NOTE
installed size is 8.6Mb
sub-directories of 1Mb or more:
  R 1.9Mb
  libs 6.5Mb
* checking package directory ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... NOTE
Non-standard file/directory found at top level:
  'clang_format.sh'
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* loading checks for arch 'i386'
** checking whether the package can be loaded ... OK
** checking whether the package can be loaded with stated dependencies ... OK
** checking whether the package can be unloaded cleanly ... OK
** checking whether the namespace can be loaded with stated dependencies ... OK
** checking whether the namespace can be unloaded cleanly ... OK
** checking loading without being on the library search path ... OK
** checking use of S3 registration ... OK
* loading checks for arch 'x64'
** checking whether the package can be loaded ... OK
** checking whether the package can be loaded with stated dependencies ... OK
** checking whether the package can be unloaded cleanly ... OK
** checking whether the namespace can be loaded with stated dependencies ... OK
** checking whether the namespace can be unloaded cleanly ... OK
** checking loading without being on the library search path ... OK
** checking use of S3 registration ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... [8s] OK
* checking Rd files ... OK
* checking Rd metadata ... OK
* checking Rd line widths ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking line endings in shell scripts ... OK
* checking line endings in C/C++/Fortran sources/headers ... OK
* checking line endings in Makefiles ... OK
* checking compilation flags in Makevars ... OK
* checking for GNU extensions in Makefiles ... OK
* checking for portable use of $(BLAS_LIBS) and $(LAPACK_LIBS) ... OK
* checking pragmas in C/C++ headers and code ... OK
* checking compiled code ... OK
* checking examples ...
** running examples for arch 'i386' ... [1s] OK
** running examples for arch 'x64' ... [1s] OK
* checking for unstated dependencies in 'tests' ... OK
* checking tests ...
** running tests for arch 'i386' ... [7s] ERROR
  Running 'testthat.R' [6s]
Running the tests in 'tests/testthat.R' failed.
Complete output:
  > # Licensed to the Apache Software Foundation (ASF) under one
  > # or more contributor license agreements. See the NOTICE file
  > # distributed with this work for additional information
  > # regarding copyright ownership. The ASF licenses this file
  > # to you under the Apache License, Version 2.0 (the
  > # "License"); you may not use this file except in compliance
  > # with the License. You may obtain a copy of the License at
  > #
  > # http://www.apache.org/licenses/LICENSE-2.0
  > #
  > # Unless required by applicable law or agreed to in writing,
  > # software distributed under the License is distributed on an
  > # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
  > # KIND, either expr
{code}
[jira] [Created] (ARROW-5015) [R] Validate winlibs binaries with hash
Javier Luraschi created ARROW-5015: -- Summary: [R] Validate winlibs binaries with hash Key: ARROW-5015 URL: https://issues.apache.org/jira/browse/ARROW-5015 Project: Apache Arrow Issue Type: Improvement Components: R Reporter: Javier Luraschi See [https://github.com/apache/arrow/pull/4011#discussion_r269229280] It is a common practice to download binaries from the winlibs R repo; however, for the arrow project, validating the package against a SHA256/SHA512 hash is desired.
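The kind of check intended can be sketched as follows in Python (the file name and the idea of comparing against a published digest are illustrative; the actual validation would live in the R package's install tooling):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 16):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: compare against the published digest before unpacking.
# expected = "..."  # published SHA-256 for the downloaded winlibs archive
# assert sha256_of("arrow-binaries.zip") == expected, "checksum mismatch"
```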