[R-pkg-devel] Changing License

2018-08-30 Thread Charles Determan
R developers,

It has come to my attention that some of the code I am distributing in one
of my packages was previously licensed under the MIT license.  I originally
released my package under the GPL-3 license.  Would it be more appropriate
for me to change the license to MIT?  I know no one here is likely a lawyer,
but I would like to hear whether there is any issue in changing the license
of a currently released package, as it is my understanding that my package
should carry the MIT license instead.
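For concreteness, the change would only touch the License field in
DESCRIPTION, going from the current

License: GPL-3

to CRAN's usual MIT form (which, as I understand it, additionally expects the
two-line LICENSE template file naming the year and copyright holder):

License: MIT + file LICENSE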

Regards,
Charles



[R-pkg-devel] S4 generics package?

2018-07-06 Thread Charles Determan
Greetings,

I know in Bioconductor there is the BiocGenerics package (
https://bioconductor.org/packages/release/bioc/html/BiocGenerics.html).  I
have not been able to find any similar package on CRAN.  I use S4 methods
quite often and a package like this would be very useful for future
packages I am developing.

My question generally is, should I consider using BiocGenerics as a
dependency if I normally submit to CRAN?  My understanding is that I would
need to submit any package which uses BiocGenerics to Bioconductor to meet
dependency requirements.  I have no issue with Bioconductor, I even have
another package published on there.  However, the packages I am developing
are not specific to bioinformatics.

That said, if no such 'generics' package exists on CRAN I am curious what
people would think of me possibly developing a 'peer' package similar to
BiocGenerics for CRAN packages.
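For a sense of what I mean, such a 'peer' package would mostly export bare
generics for downstream packages to attach methods to, along these lines
(all names purely illustrative):

# the generics-only package exports stubs like this:
setGeneric("normalize", function(object, ...) standardGeneric("normalize"))

# a downstream package then only needs to register methods:
setClass("myData", representation(x = "numeric"))
setMethod("normalize", "myData", function(object, ...) {
    object@x <- object@x / max(object@x)  # rescale the slot to [0, 1]
    object
})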

Regards,
Charles



[R-pkg-devel] Unit Testing CUDA Volunteers

2018-01-12 Thread Charles Determan
R Developers,

I am currently developing some additional CUDA based packages to
bring more GPU computing to the R community.  However, I recently lost
access to an NVIDIA GPU, so I am unable to run unit tests on the
functions I create.  I can confirm the packages build on a continuous
integration platform (e.g. Travis CI), but I am not aware of any that
provides an NVIDIA GPU for testing.

As you are all aware, unit testing is critical for the integrity of our
packages, so I would like to openly ask for volunteers who would be
willing to test CUDA based packages.  I previously petitioned for
testers of my previous package (gpuR), but that was open to anyone
given the OpenCL backend.  Naturally, this testing would be at your
convenience and there would be no additional responsibilities beyond
simply reporting whether the tests pass or fail.  If you are interested,
please reply to me off list and we can coordinate with my git repositories.
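Concretely, a testing session would amount to something like the following
(the repository name is a placeholder, not a real repo):

# install the development version with its tests, then run them
devtools::install_github("cdeterman/someCudaPkg")  # hypothetical repo
testthat::test_package("someCudaPkg")  # assumes the tests were installed
# then reply off list with the pass/fail summary and your sessionInfo()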

Kind Regards,
Charles



Re: [R-pkg-devel] Exporting S3 methods for base generics

2017-06-26 Thread Charles Determan
Ah, I see now.  I came across this previous post (
http://r.789695.n4.nabble.com/override-pmin-pmax-for-my-own-matrix-td4715903.html)
which mentioned the caveat that all the elements passed to ... must be of
the same class.  When I pass two of the same class it works, but I believe I
will need to go back to S3 if I want different classes passed to the call.
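A small sketch of that restriction as I now understand it from ?dotsMethods
(the S4 method is selected only when every argument in ... shares the class;
the pmax(a, 0) behavior below is my reading of the error quoted earlier):

setClass("myclass", representation(x = "numeric"))
setGeneric("pmax", signature = "...")
setMethod("pmax", "myclass", function(..., na.rm = FALSE) "myclass pmax")

a <- new("myclass", x = c(1, 2, 3))
pmax(a, a)  # all arguments are 'myclass', so the S4 method is selected
pmax(a, 0)  # mixed classes: falls through to the default and errors as above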

Charles

On Mon, Jun 26, 2017 at 12:42 PM, Charles Determan <cdeterma...@gmail.com>
wrote:

> Could you point to one of these packages you refer to?  I'm still having
> problems and not sure why at the moment.
>
> Thanks
>
> On Mon, Jun 26, 2017 at 12:32 PM, Joris Meys <joris.m...@ugent.be> wrote:
>
>> Hi Charles,
>>
>> my mistake. I forgot that pmax has an extra argument na.rm. I'm surprised
>> you could define the method, as this normally should return an error from
>> conformMethod().
>>
>> So:
>> #' @rdname pmax
>> setGeneric("pmax", signature = "...")
>>
>> should work. I've used this myself in quite a number of packages.
>>
>> Cheers
>> Joris
>>
>> On Mon, Jun 26, 2017 at 7:20 PM, Charles Determan <cdeterma...@gmail.com>
>> wrote:
>>
>>> Thanks for the reply Joris, although I am not sure what I could be doing
>>> wrong.  I implement exactly the lines you show and yet I just get the
>>> following error when I call 'pmax' on the class.
>>>
>>> > pmax(x, 0)
>>>
>>> Error in mmm < each :
>>>   comparison (3) is possible only for atomic and list types
>>> In addition: Warning message:
>>> In is.na(mmm) : is.na() applied to non-(list or vector) of type 'S4'
>>>
>>> Regards,
>>> Charles
>>>
>>> On Mon, Jun 26, 2017 at 12:10 PM, Joris Meys <joris.m...@ugent.be>
>>> wrote:
>>>
>>>> Hi Charles,
>>>>
>>>> if a generic exists already in the base, you only have to export the
>>>> actual S3 method. Your problem is that base::pmax() is not a generic S3
>>>> function. So R gives you the correct warning: the S3 generic in your
>>>> package will always mask the base pmax function. And that's not really a
>>>> problem, especially since you ensured the base functionality with your
>>>> default method.
>>>>
>>>> If you want to avoid that warning, use S4.
>>>>
>>>> #' @rdname pmax
>>>> setGeneric("pmax")
>>>>
>>>> #' @rdname pmax
>>>> #' @method pmax myclass
>>>> #' @export
>>>> setMethod("pmax",
>>>>   "myclass",
>>>>   function(...){
>>>>  # do some stuff
>>>> })
>>>>
>>>> More information on how to deal with dots can be found on the help page
>>>> ?dotsMethods.
>>>>
>>>> If you have a generic in the base package (eg plot is such one), you
>>>> only define the method and use:
>>>>
>>>> #' @export
>>>> plot.myclass <- function(x, y, ...){
>>>># do some more stuff
>>>> }
>>>>
>>>> Cheers
>>>> Joris
>>>>
>>>>
>>>>
>>>> On Mon, Jun 26, 2017 at 6:28 PM, Charles Determan <
>>>> cdeterma...@gmail.com> wrote:
>>>>
>>>>> Greetings R users,
>>>>>
>>>>> I was wondering how others are exporting S3 methods in their packages
>>>>> when
>>>>> the generic is in 'base'.  For example, let's say I want to export a
>>>>> new
>>>>> pmax method.  The only way I have found to get this to work is by
>>>>> redefining the function with 'UseMethod' and setting the default
>>>>> method.
>>>>>
>>>>> #' @export
>>>>> pmax <- function(...){ UseMethod("pmax") }
>>>>> #' @export
>>>>> pmax.default <- function(..., na.rm=FALSE){ base::pmax(..., na.rm=na.rm) }
>>>>>
>>>>> setClass("myclass")
>>>>>
>>>>> #' @export
>>>>> pmax.myclass <- function(..., na.rm = FALSE){
>>>>> print('myclass pmax!')
>>>>> }
>>>>>
>>>>> Although this works, I get the 'warning'
>>>>>
>>>>> The following objects are masked from 'package:base':
>>>>>
>>>>> pmax

Re: [R-pkg-devel] Exporting S3 methods for base generics

2017-06-26 Thread Charles Determan
Could you point to one of these packages you refer to?  I'm still having
problems and not sure why at the moment.

Thanks

On Mon, Jun 26, 2017 at 12:32 PM, Joris Meys <joris.m...@ugent.be> wrote:

> Hi Charles,
>
> my mistake. I forgot that pmax has an extra argument na.rm. I'm surprised
> you could define the method, as this normally should return an error from
> conformMethod().
>
> So:
> #' @rdname pmax
> setGeneric("pmax", signature = "...")
>
> should work. I've used this myself in quite a number of packages.
>
> Cheers
> Joris
>
> On Mon, Jun 26, 2017 at 7:20 PM, Charles Determan <cdeterma...@gmail.com>
> wrote:
>
>> Thanks for the reply Joris, although I am not sure what I could be doing
>> wrong.  I implement exactly the lines you show and yet I just get the
>> following error when I call 'pmax' on the class.
>>
>> > pmax(x, 0)
>>
>> Error in mmm < each :
>>   comparison (3) is possible only for atomic and list types
>> In addition: Warning message:
>> In is.na(mmm) : is.na() applied to non-(list or vector) of type 'S4'
>>
>> Regards,
>> Charles
>>
>> On Mon, Jun 26, 2017 at 12:10 PM, Joris Meys <joris.m...@ugent.be> wrote:
>>
>>> Hi Charles,
>>>
>>> if a generic exists already in the base, you only have to export the
>>> actual S3 method. Your problem is that base::pmax() is not a generic S3
>>> function. So R gives you the correct warning: the S3 generic in your
>>> package will always mask the base pmax function. And that's not really a
>>> problem, especially since you ensured the base functionality with your
>>> default method.
>>>
>>> If you want to avoid that warning, use S4.
>>>
>>> #' @rdname pmax
>>> setGeneric("pmax")
>>>
>>> #' @rdname pmax
>>> #' @method pmax myclass
>>> #' @export
>>> setMethod("pmax",
>>>   "myclass",
>>>   function(...){
>>>          # do some stuff
>>> })
>>>
>>> More information on how to deal with dots can be found on the help page
>>> ?dotsMethods.
>>>
>>> If you have a generic in the base package (eg plot is such one), you
>>> only define the method and use:
>>>
>>> #' @export
>>> plot.myclass <- function(x, y, ...){
>>># do some more stuff
>>> }
>>>
>>> Cheers
>>> Joris
>>>
>>>
>>>
>>> On Mon, Jun 26, 2017 at 6:28 PM, Charles Determan <cdeterma...@gmail.com
>>> > wrote:
>>>
>>>> Greetings R users,
>>>>
>>>> I was wondering how others are exporting S3 methods in their packages
>>>> when
>>>> the generic is in 'base'.  For example, let's say I want to export a new
>>>> pmax method.  The only way I have found to get this to work is by
>>>> redefining the function with 'UseMethod' and setting the default method.
>>>>
>>>> #' @export
>>>> pmax <- function(...){ UseMethod("pmax") }
>>>> #' @export
>>>> pmax.default <- function(..., na.rm=FALSE){ base::pmax(..., na.rm=na.rm) }
>>>>
>>>> setClass("myclass")
>>>>
>>>> #' @export
>>>> pmax.myclass <- function(..., na.rm = FALSE){
>>>> print('myclass pmax!')
>>>> }
>>>>
>>>> Although this works, I get the 'warning'
>>>>
>>>> The following objects are masked from 'package:base':
>>>>
>>>> pmax
>>>>
>>>>
>>>> I would like the package build and loading to be as clean as possible, but
>>>> if this is acceptable and not considered a problem I will let it go.  It
>>>> just seems odd that one would need to redefine the generic when the docs
>>>> for 'pmax' state that it will also work on classed S3 objects, but
>>>> perhaps I am reading this incorrectly.
>>>>
>>>> Thanks,
>>>> Charles
>>>>
>>>>
>>>
>>>
>>>
>>> --
>>> Joris Meys
>>> Statistical consultant
>>>
>>> Ghent University
>>> Faculty of Bioscience Engineering
>>> Department of Mathematical Modelling, Statistics and Bio-Informatics
>>>
>>> tel : +32 9 264 59 87
>>> joris.m...@ugent.be
>>> ---
>>> Disclaimer : http://helpdesk.ugent.be/e-maildisclaimer.php
>>>
>>
>>
>
>
> --
> Joris Meys
> Statistical consultant
>
> Ghent University
> Faculty of Bioscience Engineering
> Department of Mathematical Modelling, Statistics and Bio-Informatics
>
> tel : +32 9 264 59 87
> joris.m...@ugent.be
> ---
> Disclaimer : http://helpdesk.ugent.be/e-maildisclaimer.php
>



Re: [R-pkg-devel] Exporting S3 methods for base generics

2017-06-26 Thread Charles Determan
Thanks for the reply Joris, although I am not sure what I could be doing
wrong.  I implement exactly the lines you show and yet I just get the
following error when I call 'pmax' on the class.

> pmax(x, 0)

Error in mmm < each :
  comparison (3) is possible only for atomic and list types
In addition: Warning message:
In is.na(mmm) : is.na() applied to non-(list or vector) of type 'S4'

Regards,
Charles

On Mon, Jun 26, 2017 at 12:10 PM, Joris Meys <joris.m...@ugent.be> wrote:

> Hi Charles,
>
> if a generic exists already in the base, you only have to export the
> actual S3 method. Your problem is that base::pmax() is not a generic S3
> function. So R gives you the correct warning: the S3 generic in your
> package will always mask the base pmax function. And that's not really a
> problem, especially since you ensured the base functionality with your
> default method.
>
> If you want to avoid that warning, use S4.
>
> #' @rdname pmax
> setGeneric("pmax")
>
> #' @rdname pmax
> #' @method pmax myclass
> #' @export
> setMethod("pmax",
>   "myclass",
>   function(...){
>  # do some stuff
> })
>
> More information on how to deal with dots can be found on the help page
> ?dotsMethods.
>
> If you have a generic in the base package (eg plot is such one), you only
> define the method and use:
>
> #' @export
> plot.myclass <- function(x, y, ...){
># do some more stuff
> }
>
> Cheers
> Joris
>
>
>
> On Mon, Jun 26, 2017 at 6:28 PM, Charles Determan <cdeterma...@gmail.com>
> wrote:
>
>> Greetings R users,
>>
>> I was wondering how others are exporting S3 methods in their packages when
>> the generic is in 'base'.  For example, let's say I want to export a new
>> pmax method.  The only way I have found to get this to work is by
>> redefining the function with 'UseMethod' and setting the default method.
>>
>> #' @export
>> pmax <- function(...){ UseMethod("pmax") }
>> #' @export
>> pmax.default <- function(..., na.rm=FALSE){ base::pmax(..., na.rm=na.rm) }
>>
>> setClass("myclass")
>>
>> #' @export
>> pmax.myclass <- function(..., na.rm = FALSE){
>> print('myclass pmax!')
>> }
>>
>> Although this works, I get the 'warning'
>>
>> The following objects are masked from 'package:base':
>>
>> pmax
>>
>>
>> I would like the package build and loading to be as clean as possible, but
>> if this is acceptable and not considered a problem I will let it go.  It
>> just seems odd that one would need to redefine the generic when the docs
>> for 'pmax' state that it will also work on classed S3 objects, but
>> perhaps I am reading this incorrectly.
>>
>> Thanks,
>> Charles
>>
>>
>
>
>
> --
> Joris Meys
> Statistical consultant
>
> Ghent University
> Faculty of Bioscience Engineering
> Department of Mathematical Modelling, Statistics and Bio-Informatics
>
> tel : +32 9 264 59 87
> joris.m...@ugent.be
> ---
> Disclaimer : http://helpdesk.ugent.be/e-maildisclaimer.php
>



[R-pkg-devel] Exporting S3 methods for base generics

2017-06-26 Thread Charles Determan
Greetings R users,

I was wondering how others are exporting S3 methods in their packages when
the generic is in 'base'.  For example, let's say I want to export a new
pmax method.  The only way I have found to get this to work is by
redefining the function with 'UseMethod' and setting the default method.

#' @export
pmax <- function(...){ UseMethod("pmax") }
#' @export
pmax.default <- function(..., na.rm=FALSE){ base::pmax(..., na.rm=na.rm) }

setClass("myclass")

#' @export
pmax.myclass <- function(..., na.rm = FALSE){
print('myclass pmax!')
}

Although this works, I get the 'warning'

The following objects are masked from 'package:base':

pmax


I would like the package build and loading to be as clean as possible, but
if this is acceptable and not considered a problem I will let it go.  It
just seems odd that one would need to redefine the generic when the docs
for 'pmax' state that it will also work on classed S3 objects, but
perhaps I am reading this incorrectly.

Thanks,
Charles



Re: [R-pkg-devel] Windows specific compiler for CUDA builds

2017-03-16 Thread Charles Determan
Thanks Duncan,

You say there aren't a lot of people who know how to do that.  Do you know
of anyone who would?  I assume Dirk would be a likely person given the use
of Rtools with Rcpp.  I am happy to try and work at this, as I have a vested
interest in getting CUDA packages to become functional on Windows systems,
but I need somewhere to begin.  Basically I'm just looking for how to swap
out the MinGW g++ for the VS cl compiler.  On a Linux system I can create
a .R/Makevars file to switch the CXX variable, but I don't know how on
Windows.
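For reference, the Linux-side file I mean is simply this (the compiler name
is illustrative; any drop-in C++ compiler on the PATH works):

# ~/.R/Makevars
CXX = g++-4.9

What I don't know is whether a ~/.R/Makevars.win with CXX pointed at cl
would even be honored by R's build machinery on Windows.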

Charles

On Thu, Mar 16, 2017 at 10:41 AM, Duncan Murdoch <murdoch.dun...@gmail.com>
wrote:

> On 16/03/2017 11:00 AM, Charles Determan wrote:
>
>> Greetings,
>>
>> Not sure if this should be on the Rcpp list, but it isn't strictly about
>> Rcpp itself so much as package building involving Rcpp, so I am posting it here.
>>
>> I am often working on GPU packages that use either OpenCL or CUDA.  OpenCL
>> is nice because it doesn't require a special additional compiler and I can
>> build it across platforms with relative ease.  With CUDA, it requires the
>> 'nvcc' compiler.  This is where my problem comes in.  On Windows the
>> 'nvcc'
>> requires the use of the 'cl' compiler within Visual Studio and the
>> resulting object files, AFAIK, cannot be linked to object files created by
>> g++ (via Rtools).  Everything works great on Linux (where the same
>> compiler
>> is used for everything) but on a Windows system this is causing a lot of
>> headaches.
>>
>> So, at the moment, my conclusion is that it is simply not possible to
>> build
>> a CUDA package that can be installed on a Windows system.  To my
>> knowledge,
>> no CUDA based R package has a Windows installation functional (please
>> state
>> otherwise if I am wrong).
>>
>> My only thought would be if it would be possible to have the Windows build
>> use 'cl' for the entire build process.  Perhaps that would allow all the
>> files to be linked together and create the necessary shared object at the
>> end?  Obviously the preference is to use Rtools but until NVIDIA updates
>> their special compiler to support MinGW tools I don't think that is
>> possible.
>>
>
> In principle it should be possible to use cl.  In practice, it will
> require someone to work out the details of doing it and to maintain it (by
> testing R-devel regularly to make sure changes there don't cause trouble
> for it).  There aren't a lot of people who know how to do that (e.g. I
> don't).  If you are willing to volunteer to do this (or can recruit someone
> to do it), go ahead.  Assuming you do a good job, we can put your patches
> into the base code.
>
> Duncan Murdoch
>



[R-pkg-devel] Windows specific compiler for CUDA builds

2017-03-16 Thread Charles Determan
Greetings,

Not sure if this should be on the Rcpp list, but it isn't strictly about
Rcpp itself so much as package building involving Rcpp, so I am posting it here.

I am often working on GPU packages that use either OpenCL or CUDA.  OpenCL
is nice because it doesn't require a special additional compiler and I can
build it across platforms with relative ease.  With CUDA, it requires the
'nvcc' compiler.  This is where my problem comes in.  On Windows the 'nvcc'
requires the use of the 'cl' compiler within Visual Studio and the
resulting object files, AFAIK, cannot be linked to object files created by
g++ (via Rtools).  Everything works great on Linux (where the same compiler
is used for everything) but on a Windows system this is causing a lot of
headaches.

So, at the moment, my conclusion is that it is simply not possible to build
a CUDA package that can be installed on a Windows system.  To my knowledge,
no CUDA based R package has a Windows installation functional (please state
otherwise if I am wrong).

My only thought would be if it would be possible to have the Windows build
use 'cl' for the entire build process.  Perhaps that would allow all the
files to be linked together and create the necessary shared object at the
end?  Obviously the preference is to use Rtools but until NVIDIA updates
their special compiler to support MinGW tools I don't think that is
possible.

Any insight would be appreciated,

Charles



[R-pkg-devel] Makevars.win cppflags

2017-03-15 Thread Charles Determan
I am working on an R package that contains some CUDA code.  As such, it
needs to use the 'nvcc' compiler.  I then need to use some of the R header
files such as R.h.  On a Linux machine, I can handle this with my configure
script using autotools, which substitutes @R_INCL@ via AC_SUBST(R_INCL).
However, AFAIK, I cannot do this on a Windows machine.  I am trying to
write the Makevars.win file to create the appropriate variables that I can
pass to the nvcc compiler.  Here is an excerpt:

R_INCL = $(shell "${R_HOME}/bin/R" CMD config --cppflags)

CU_INCL = -I../inst/include $(R_INCL)

%.o: %.cu $(cu_sources)
	$(NVCC) $(CU_ARCH) $(CU_ARGS) $(CU_INCL) $< -c


However, I keep getting truncated output from --cppflags.  If
I run R CMD config --cppflags from the command line I get:

-IC:/Users/cdeterman/Documents/R/R-3.3.2/include
-IC:/Users/cdeterman/Documents/R/R-3.3.2/include/x64

but when it is run during the build/install I get

-IC:/Users/CDETER~1/DOCUME~1/R/R-33~1.2/include
-IC:/Users/CDETER~1/DOCUME~1/R/R-33~1.2/include/x64

and the compilation fails, saying that
-IC:/Users/CDETER~1/DOCUME~1/R/R-33~1.2/include is not found, whereas the
full path is found.
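(Aside: the ~1 forms appear to be ordinary Windows 8.3 "short names" for
the same directories rather than genuinely broken paths; this can be checked
from R on the machine in question, using the path from the output above:

normalizePath("C:/Users/CDETER~1/DOCUME~1/R/R-33~1.2/include")
# should expand back to the full C:/Users/cdeterman/... form shown earlier

so the failure may lie in how the flags reach nvcc rather than in the paths
themselves.)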

Any assistance is appreciated.

Regards,
Charles



Re: [Bioc-devel] Updating package

2017-01-05 Thread Charles Determan
That seems to have been the issue.  It had been some time since I needed to
commit to Bioconductor and I forgot my username was different.  Thank you,
Martin, for pointing that simple problem out.  I have now run 'dcommit' on my
'devel' branch, so if I understand correctly the changes should be with
Bioconductor to apply accordingly.

Regards,
Charles

On Thu, Jan 5, 2017 at 9:22 AM, Martin Morgan <martin.mor...@roswellpark.org> wrote:

> On 01/05/2017 10:18 AM, Charles Determan wrote:
>
>> Not to be pushy, but did anyone have any insights on this?  I would hate
>> to have my package still sitting without the bug fix it requires.
>>
>
> maybe svn credentials,
>
>  git svn rebase --username=c.determan
>
> ??
>
>
>> Thanks,
>> Charles
>>
>> On Wed, Jan 4, 2017 at 11:15 AM, Charles Determan <cdeterma...@gmail.com>
>> wrote:
>>
>> Hi,
>>>
>>> I received a notification that my package was failing some tests.  I have
>>> since made the necessary fixes and pushed my changes to the github repo.
>>> Previously this would result in http://gitsvn.bioconductor.
>>> org/git-push-hook updating the package for bioconductor.  I noticed
>>> however that this delivery fails and that the git-svn bridge is no longer
>>> available.
>>>
>>> How should I modify my current repository to update my package
>>> accordingly?
>>>
>>> I have my local changes in the 'devel' branch.
>>> I ran the update_remotes.sh script.
>>> I run 'git svn rebase' but I get the following error:
>>>
>>> Can't create session: Unable to connect to a repository at URL '
>>> https://hedgehog.fhcrc.org/bioconductor/trunk/madman/Rpacks/OmicsMarkeR
>>> ':
>>> Unexpected server error 500 'Internal Server Error' on
>>> '/bioconductor/trunk/madman/Rpacks/OmicsMarkeR' at
>>> /mingw64/share/perl5/site_perl/Git/SVN.pm line 717.
>>>
>>> What did I miss here?
>>> Thanks,
>>>
>>> Charles
>>>
>>>
>>
>>
>
>



Re: [Bioc-devel] Updating package

2017-01-05 Thread Charles Determan
Not to be pushy, but did anyone have any insights on this?  I would hate
to have my package still sitting without the bug fix it requires.

Thanks,
Charles

On Wed, Jan 4, 2017 at 11:15 AM, Charles Determan <cdeterma...@gmail.com>
wrote:

> Hi,
>
> I received a notification that my package was failing some tests.  I have
> since made the necessary fixes and pushed my changes to the github repo.
> Previously this would result in http://gitsvn.bioconductor.org/git-push-hook
> updating the package for bioconductor.  I noticed
> however that this delivery fails and that the git-svn bridge is no longer
> available.
>
> How should I modify my current repository to update my package
> accordingly?
>
> I have my local changes in the 'devel' branch.
> I ran the update_remotes.sh script.
> I run 'git svn rebase' but I get the following error:
>
> Can't create session: Unable to connect to a repository at URL '
> https://hedgehog.fhcrc.org/bioconductor/trunk/madman/Rpacks/OmicsMarkeR':
> Unexpected server error 500 'Internal Server Error' on
> '/bioconductor/trunk/madman/Rpacks/OmicsMarkeR' at
> /mingw64/share/perl5/site_perl/Git/SVN.pm line 717.
>
> What did I miss here?
> Thanks,
>
> Charles
>



Re: [R-pkg-devel] Recommendation on qr method export

2016-08-03 Thread Charles Determan
Thanks for the input, Peter.  I suppose I never looked at lm() to see how it
was used there.  Given that, though, do you see any reason not to
create an S3/S4 method if different functions use different methods?
I'm just looking for some design guidance from the community here before I
implement something that may cause more confusion.
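For context, base::qr is already an S3 generic (it dispatches via UseMethod,
with qr.default doing the LINPACK/LAPACK work), so mechanically the method is
easy to provide; the design question is only whether the differently-shaped
result would surprise downstream code.  A sketch, with the class name and
result fields hypothetical:

# picked up automatically by base::qr() dispatch
qr.myclass <- function(x, ...) {
    # the C++ backend would be called here; the result carries Q, R and
    # 'betas' instead of the qr/qraux/rank/pivot components base R returns
    structure(list(Q = NULL, R = NULL, betas = NULL), class = "myclassQR")
}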

Regards,
Charles

On Tue, Aug 2, 2016 at 4:34 PM, Peter Dalgaard <pda...@gmail.com> wrote:

> Not strictly what you're asking, but at some point it may be important to
> note that the "QR" method used by lm() and friends (notably anova() and
> aov()) actually relies on successive orthogonalization. This does yield a
> QR decomposition but the reverse is not true. A generic X=QR decomposition
> does orthogonalize, but it does not necessarily hold that the first k
> columns of Q spans the same subspace as the first k columns of X. LINPACK's
> QR happens to be implemented as successive orthogonalization, but LAPACK's
> is not, so only the former is usable with lm().
>
> So, I suppose what I am getting at is that not even lm() uses qr(), it
> calls LINPACK directly.
>
> -pd
>
>
> > On 02 Aug 2016, at 21:17 , Charles Determan <cdeterma...@gmail.com>
> wrote:
> >
> > Hello,
> >
> > I am currently working on an implementation of QR decomposition
> (leveraging
> > a C++ library via Rcpp).  Naturally I would like to have the R syntax as
> > similar as possible to base R 'qr' function.  Given the class structure
> of
> > my package my instinct was to export an S4/S3 method.
> >
> > However, the QR decomposition doesn't store the final QR matrix in the
> same
> > format as base R via LINPACK, nor does it return 'qraux', 'rank' or
> 'pivot'
> > objects but instead a 'betas' object.  The final 'R' and 'Q' matrices are
> > in fact identical to those ultimately returned by qr.R or qr.Q.
> >
> > So my question is, given these differences, should I just create a
> > different function name or would creating a qr.myclass dispatch be
> > acceptable (whether S3 or S4)?  I would prefer the latter as I would like
> > the classes to potentially take advantage of previously written code
> using
> > 'qr'.
> >
> > Thanks,
> > Charles
> >
>
>



Re: [Rd] R external pointer and GPU memory leak problem

2016-05-16 Thread Charles Determan
Hi Yuan,

I think this is likely more appropriate for the r-sig-hpc mailing list.
However, regarding your design and your comment about R's 'current' GPU
package (I don't know which one you mean; gputools?), you should look at two
other packages.  I believe the gmatrix package (
https://cran.r-project.org/web/packages/gmatrix/index.html) implements
exactly what you are trying to do in NVIDIA specific code.  There is also
the gpuR package (https://cran.r-project.org/web/packages/gpuR/index.html),
which implements the same object 'on GPU' functionality you desire,
but in OpenCL, so it works on 'all' GPUs.

If you really want to continue your development, I strongly recommend you
look into using Rcpp and its XPtr class for external pointers.  XPtr
handles the pointer protection and the finalizer without you needing to
worry about them.
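A minimal sketch of the XPtr idea (inline via Rcpp::cppFunction for brevity;
in a package this would live in src/).  With the second constructor argument
TRUE a finalizer that deletes the object is registered for you; for GPU
allocations you would instead supply a custom finalizer calling cudaFree
through XPtr's finalizer template argument:

Rcpp::cppFunction('
SEXP make_buffer(int n) {
    // heap object owned by the external pointer
    std::vector<double>* v = new std::vector<double>(n);
    Rcpp::XPtr< std::vector<double> > ptr(v, true);  // true => delete on GC
    return ptr;
}')
buf <- make_buffer(1000L)  # freed automatically once buf is collected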

Regards,
Charles

On Sat, May 14, 2016 at 10:43 AM, Yuan Li  wrote:

> My question is based on a project I have partially done, but there is
> still something I'm not clear on.
>
> My goal is to create an R package containing GPU functions (some from the
> NVIDIA CUDA library, some self-defined CUDA functions).
>
> My design is quite different from R's current GPU packages: I want to
> create an R object (external pointer) pointing to a GPU address, and run my
> GPU functions directly on the GPU side without transferring back and forth
> between CPU and GPU.
>
> I used the R external pointer to implement my design. But I found I have
> memory leak problems on GPU side, I can still fix it by running gc()
> function explicitly in R side, but I'm just wondering if I missed something
> in my C code. Would you please indicate my mistake, because this is my
> first time write a R package, and I could possibly made some terrible
> mistakes.
>
> Actually, I have written a bunch of GPU functions which can run on the GPU
> side with the object created by the following create function, but the
> memory leak kills me if I need to deal with a huge dataset.
>
> Here is my create function: I create a GPU pointer x, allocate GPU
> memory for x, then make an R external pointer ext based on x, and copy the
> CPU vector input to my GPU external pointer ext:
>
>
> /*
> define function to create a vector in GPU
> by transferring a R's vector to GPU.
> input is R's vector and its length,
> output is a R external pointer
> pointing to GPU vector(device)
> */
> SEXP createGPU(SEXP input, SEXP n)
> {
>     int *lenth = INTEGER(n);
>     PROTECT(input = AS_NUMERIC(input));
>     double *temp = REAL(input);
>     double *x;  /* here is the step which causes the memory leak */
>     cudacall(cudaMalloc((void**)&x, *lenth * sizeof(double)));
>     /* protect the R external pointer from finalizer */
>     SEXP ext = PROTECT(R_MakeExternalPtr(x, R_NilValue, R_NilValue));
>     R_RegisterCFinalizerEx(ext, _finalizer, TRUE);
>
>     /* copying CPU to GPU */
>     cublascall(cublasSetVector(*lenth, sizeof(double), temp, 1,
>                R_ExternalPtrAddr(ext), 1));
>     UNPROTECT(2);
>     return ext;
> }
>
>
>
> Here is my finalizer for my create function:
>
> /*
> define finalizer for R external pointer
> input is R external pointer, function will finalize the pointer
> when it is not in use.
> */
> static void _finalizer(SEXP ext)
> {
>     if (!R_ExternalPtrAddr(ext))
>         return;
>     double *ptr = (double *) R_ExternalPtrAddr(ext);
>     Rprintf("finalizer invoked once \n");
>     cudacall(cudaFree(ptr));
>     R_ClearExternalPtr(ext);
> }
>
>
> My create function can run smoothly, but if I run the create function too
> many times, it shows out of memory for my GPU device, which clearly implies
> a memory leak problem.  Can anybody help?  Thanks a lot in advance!
>



Re: [Rd] Single-threaded aspect

2016-05-12 Thread Charles Determan
Thank you Simon for the detailed reply.  That explains much more of what I
was looking for from the R side.

Dirk, I'm sorry if I seem hung up on anything here, but I am trying to
understand the details.  My reply about XPtr, or XPtr on arma/Eigen, was to
confirm my understanding was correct, which it appears it was.  I was not
aware that the RVector/RMatrix objects don't connect back to R, as I am just
now familiarizing myself with the package; that explains more of my
confusion.  I will look at doing the work within the compiled code as you
have suggested.

Regards,
Charles

On Thu, May 12, 2016 at 9:18 AM, Dirk Eddelbuettel <e...@debian.org> wrote:

>
> On 12 May 2016 at 13:11, Mark van der Loo wrote:
> | Charles,
> |
> | 1. Perhaps this question is better directed at the R-help or
> | R-package-devel mailing list.
> |
> | 2. It basically means that R itself can only evaluate one R expression at a time.
> |
> | The parallel package circumvents this by starting multiple R-sessions and
> | dividing workload.
> |
> | Compiled code called by R (such as C++ code through Rcpp or C-code
> through
> | base R's interface) can execute multi-threaded code for internal
> purposes,
> | using e.g. openMP. A limitation is that compiled code cannot call R's C
> API
> | from multiple threads (in many cases). For example, it is not thread-safe
> | to create R-variables from multiple threads running in C. (R's variable
> | administration is such that the order of (un)making them from compiled
> code
> | matters).
>
> Well put.
>
> | I am not very savvy on Rcpp or XPtr objects, but it appears that Dirk
> | provided answers about that in your SO-question.
>
> Charles seems to hang himself up completely about a small detail, failing
> to
> see the forest for the trees.
>
> There are (many) working examples of parallel (compiled) code with R. All
> of
> them stress (and I simplify here) that can you touch R objects, or call
> back
> into R, for fear of any assignment or allocation triggering an R event.  R
> being single-threaded it cannot do this.
>
> My answer to this problem is to only use non-R data structures.  That is
> what RcppParallel does in the actual parallel code portions in all
> examples -- types RVector and RMatrix do NOT connect back to R.  There are
> several working examples.  That is also what the OpenMP examples at the
> Rcpp Gallery do.
>
> Charles seems to be replying 'but I use XPtr' or 'I use XPtr on arma::mat
> or Eigen::MatrixXd' and seems to forget that these are proxy objects to
> SEXPs.  XPtr just wraps the SEXP for external pointers; Arma's and Eigen's
> matrices are performant via RcppArmadillo and RcppEigen because we use R
> memory via proxies.  All of that is 'too close to R' for comfort.
>
> So the short answer is:  enter compiled code from R, set a mutex (either
> conceptually or explicitly), _copy_ your data into plain C++ data
> structures and go to town in parallel via OpenMP and other multithreaded
> approaches.  Then collect the result, release the mutex and move back up.
>
> I hope this helps.
>
> Dirk
>
> |
> | Best,
> | Mark
> |
> | On Thu, 12 May 2016 at 14:46, Charles Determan <cdeterma...@gmail.com> wrote:
> |
> | > R Developers,
> | >
> | > Could someone help explain what it means that R is single threaded?  I
> am
> | > trying to understand what is actually going on inside R when users
> want to
> | > parallelize code.  For example, using mclapply or foreach (with some
> | > backend) somehow allows users to benefit from multiple CPUs.
> | >
> | > Similarly there is the RcppParallel package for RMatrix/RVector
> objects.
> | > But none of these address the general XPtr objects in Rcpp.  Some
> readers
> | > here may recognize my question on SO (
> | >
> | >
> http://stackoverflow.com/questions/37167479/rcpp-parallelize-functions-that-return-xptr
> | > )
> | > where I was curious about parallel calls to C++/Rcpp functions that
> return
> | > XPtr objects.  I am being a little more persistent here as this
> limitation
> | > provides a very hard stop on the development on one of my packages that
> | > heavily uses XPtr objects.  It's not meant to be a criticism or
> intended to
> | > be rude, I just want to fully understand.
> | >
> | > I am willing to accept that it may be impossible currently but I want
> to at
> | > least understand why it is impossible so I can explain to future users
> why
> | > parallel functionality is not available.  Which just echoes my original
> | > question, what does it mean that R is single threaded?

Re: [Rd] Single-threaded aspect

2016-05-12 Thread Charles Determan
Thanks for the replies.  Regarding the answer by Dirk, I still don't feel I
understand the reasoning why mclapply or foreach cannot handle XPtr
objects.  Rather than clutter the SO question with more comments: I had
gotten the impression that this was a limitation inherited from R objects
(an XPtr being a proxy for an R object, according to Dirk's
comment).  If this is not the case, I could repost this on Rcpp-devel
unless it could be migrated.

Regards,
Charles

On Thu, May 12, 2016 at 8:11 AM, Mark van der Loo <mark.vander...@gmail.com>
wrote:

> Charles,
>
> 1. Perhaps this question is better directed at the R-help or
> R-package-devel mailing list.
>
> 2. It basically means that R itself can only evaluate one R expression at a time.
>
> The parallel package circumvents this by starting multiple R-sessions and
> dividing workload.
>
> Compiled code called by R (such as C++ code through Rcpp or C-code through
> base R's interface) can execute multi-threaded code for internal purposes,
> using e.g. openMP. A limitation is that compiled code cannot call R's C API
> from multiple threads (in many cases). For example, it is not thread-safe
> to create R-variables from multiple threads running in C. (R's variable
> administration is such that the order of (un)making them from compiled code
> matters).
>
> I am not very savvy on Rcpp or XPtr objects, but it appears that Dirk
> provided answers about that in your SO-question.
>
> Best,
> Mark
>
> On Thu, 12 May 2016 at 14:46, Charles Determan <cdeterma...@gmail.com> wrote:
>
>> R Developers,
>>
>> Could someone help explain what it means that R is single threaded?  I am
>> trying to understand what is actually going on inside R when users want to
>> parallelize code.  For example, using mclapply or foreach (with some
>> backend) somehow allows users to benefit from multiple CPUs.
>>
>> Similarly there is the RcppParallel package for RMatrix/RVector objects.
>> But none of these address the general XPtr objects in Rcpp.  Some readers
>> here may recognize my question on SO (
>>
>> http://stackoverflow.com/questions/37167479/rcpp-parallelize-functions-that-return-xptr
>> )
>> where I was curious about parallel calls to C++/Rcpp functions that return
>> XPtr objects.  I am being a little more persistent here as this limitation
>> provides a very hard stop on the development on one of my packages that
>> heavily uses XPtr objects.  It's not meant to be a criticism or intended
>> to
>> be rude, I just want to fully understand.
>>
>> I am willing to accept that it may be impossible currently but I want to
>> at
>> least understand why it is impossible so I can explain to future users why
>> parallel functionality is not available.  Which just echoes my original
>> question, what does it mean that R is single threaded?
>>
>> Kind Regards,
>> Charles
>>
>>
>



[Rd] Single-threaded aspect

2016-05-12 Thread Charles Determan
R Developers,

Could someone help explain what it means that R is single threaded?  I am
trying to understand what is actually going on inside R when users want to
parallelize code.  For example, using mclapply or foreach (with some
backend) somehow allows users to benefit from multiple CPUs.
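(For instance, on a Unix-alike each mclapply worker is a forked R process
with its own process id, i.e. separate processes rather than threads within
one R:

library(parallel)
unlist(mclapply(1:4, function(i) Sys.getpid(), mc.cores = 2))
# typically two distinct PIDs, neither of them the master's Sys.getpid()

yet within any single R process, expressions are still evaluated one at a
time.)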

Similarly there is the RcppParallel package for RMatrix/RVector objects.
But none of these address the general XPtr objects in Rcpp.  Some readers
here may recognize my question on SO (
http://stackoverflow.com/questions/37167479/rcpp-parallelize-functions-that-return-xptr)
where I was curious about parallel calls to C++/Rcpp functions that return
XPtr objects.  I am being a little more persistent here as this limitation
provides a very hard stop on the development on one of my packages that
heavily uses XPtr objects.  It's not meant to be a criticism or intended to
be rude, I just want to fully understand.

I am willing to accept that it may be impossible currently but I want to at
least understand why it is impossible so I can explain to future users why
parallel functionality is not available, which just echoes my original
question: what does it mean that R is single threaded?

Kind Regards,
Charles



Re: [Rd] [R] TensorFlow in R

2016-04-01 Thread Charles Determan
Hi Axel,

Looks like the only thing right now is rflow (
https://github.com/terrytangyuan/rflow).  It appears to simply wrap around
the Python bindings.  I am not aware of any others.  It will be interesting
to keep an eye on.

Regards,
Charles


On Fri, Apr 1, 2016 at 11:32 AM, Axel Urbiz  wrote:

> Hi All,
>
> I didn't have much success through my Google search in finding any active
> R-related projects to create a wrapper around TensorFlow in R. Anyone know
> if this is on the go?
>
> Thanks,
> Axel.
>
>



Re: [R-pkg-devel] Development Testers?

2016-02-06 Thread Charles Determan
Thanks Boris,

My main concern, I suppose, is whether this mailing list is an appropriate
place to occasionally solicit additional testers.  Just curious what people
(i.e. the admins) would think about using the list for such a purpose.

Regards,
Charles

On Friday, February 5, 2016, Boris Steipe <boris.ste...@utoronto.ca> wrote:

> Of course I can speak only for myself, but I'd help out if asked and I
> think in general you'll find people here to be a pretty helpful bunch.
>
> Cheers,
> Boris
>
>
> On Feb 5, 2016, at 8:30 AM, Charles Determan <cdeterma...@gmail.com> wrote:
>
> > I'm not sure if this question is appropriate for the R mailing lists but
> > I'm not sure where else to ask.
> >
> > I am wondering if there is any means by which package authors can solicit
> > testers for a package.  I believe this is often referred to as 'crowd
> > sourced testing'.
> >
> > I am aware of continuous integration platforms and unit testing (e.g.
> > testthat) but that only gets me so far.  In my particular instance I am
> > developing packages for GPU computing (maybe this request would be best
> on
> > the HPC mailing list?).  As such, I can't possibly have access to every
> > type of GPU.  I would like to find a means of 'recruiting' users who may
> > have different pieces of hardware to test my package.  Of course this
> won't
> > be exhaustive but any additional testing that others could provide would
> > make the package all the more stable.
> >
> > Regards,
> > Charles
> >
>
>



[R-pkg-devel] Development Testers?

2016-02-05 Thread Charles Determan
I'm not sure if this question is appropriate for the R mailing lists but
I'm not sure where else to ask.

I am wondering if there is any means by which package authors can solicit
testers for a package.  I believe this is often referred to as 'crowd
sourced testing'.

I am aware of continuous integration platforms and unit testing (e.g.
testthat) but that only gets me so far.  In my particular instance I am
developing packages for GPU computing (maybe this request would be best on
the HPC mailing list?).  As such, I can't possibly have access to every
type of GPU.  I would like to find a means of 'recruiting' users who may
have different pieces of hardware to test my package.  Of course this won't
be exhaustive but any additional testing that others could provide would
make the package all the more stable.

Regards,
Charles



Re: [Bioc-devel] C library or C package API for regular expressions

2016-01-23 Thread Charles Determan
Hi Jiri,

I believe you can use the BH package. It contains most of the Boost headers.
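A minimal sketch of the client side (note that BH ships only the header-only
parts of Boost, so I believe the compiled Boost.Regex library itself is not
included):

# in the client package's DESCRIPTION:
LinkingTo: BH

# then in src/ the headers are simply included, e.g.
# #include <boost/lexical_cast.hpp>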

Regards,
Charles

On Saturday, January 23, 2016, Jiří Hon  wrote:

> Dear package developers,
>
> I would like to ask you for advice. Please, what is the most seamless
> way to use regular expressions in C/C++ code of R/Bioconductor package?
> Is it allowed to bundle some C/C++ library for that (like PCRE or
> Boost.Regex)? Or is there existing C API of some package I can depend on
> and import?
>
> Thank you a lot for your attention and please have a nice day :)
>
> Jiri Hon
>
>


[R-pkg-devel] R CMD check NOTE - Long paths in package

2015-10-12 Thread Charles Determan
Greetings,

I have a package which provides headers for a C++ library (similar to the
BH package).  However, the C++ library has some heavily nested components
within its structure, so when I run R CMD check I get the following NOTE:

Tarballs are only required to store paths of up to 100 bytes and cannot
store those of more than 256 bytes, with restrictions including to 100
bytes for the final component.

Is this a major problem?  As this is a 'NOTE' I am under the impression
that the long paths are not critical but I want to make sure.

Thank you,
Regards,

Charles



Re: [Bioc-devel] Trouble installing S4Vectors?

2015-08-28 Thread Charles Determan
Thanks Herve, that solved the problem.  It installed correctly now.

Regards,
Charles

On Thu, Aug 27, 2015 at 4:14 PM, Hervé Pagès hpa...@fredhutch.org wrote:

 On 08/27/2015 10:21 AM, Charles Determan wrote:

 I am simply using gcc-4.8

 Here is the full output:

 BioC_mirror: http://bioconductor.org
 Using Bioconductor version 3.1 (BiocInstaller 1.18.4), R version 3.2.2.
 Installing package(s) ‘S4Vectors’
 trying URL
 '
 http://bioconductor.org/packages/3.1/bioc/src/contrib/S4Vectors_0.6.3.tar.gz
 '
 Content type 'application/x-gzip' length 187491 bytes (183 KB)
 ==
 downloaded 183 KB

 * installing *source* package ‘S4Vectors’ ...
 ** libs
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c AEbufs.c -o AEbufs.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c DataFrame_class.c -o DataFrame_class.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c Hits_class.c -o Hits_class.o
 Hits_class.c: In function ‘Hits_new’:
 Hits_class.c:157:13: warning: ‘q_hits2’ may be used uninitialized in
 this function [-Wmaybe-uninitialized]
 tsort_hits(q_hits2, s_hits, qh_out, sh_out, nhit,
   ^
 Hits_class.c:144:7: note: ‘q_hits2’ was declared here
int *q_hits2, *qh_out, *sh_out;
 ^
 Hits_class.c:233:14: warning: ‘revmap’ may be used uninitialized in this
 function [-Wmaybe-uninitialized]
defineVar(install(translateChar(symbol)), revmap, revmap_envir);
^
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c List_class.c -o List_class.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c R_init_S4Vectors.c -o R_init_S4Vectors.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c Rle_class.c -o Rle_class.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c Rle_utils.c -o Rle_utils.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c SEXP_utils.c -o SEXP_utils.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c SimpleList_class.c -o SimpleList_class.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c anyMissing.c -o anyMissing.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c eval_utils.c -o eval_utils.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c hash_utils.c -o hash_utils.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c int_utils.c -o int_utils.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c logical_utils.c -o logical_utils.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c safe_arithm.c -o safe_arithm.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c sort_utils.c -o sort_utils.o
 ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
 -std=c99 -c str_utils.c -o str_utils.o
 str_utils.c: In function ‘get_svn_time’:
 str_utils.c:259:2: warning: implicit declaration of function ‘tzset’
 [-Wimplicit-function-declaration]
tzset();
^
 str_utils.c:261:18: error: ‘timezone’ undeclared (first use in this
 function)
utc_offset = - (timezone / 3600);
^
 str_utils.c:261:18: note: each undeclared identifier is reported only
 once for each function it appears in
 make: *** [str_utils.o] Error 1
 ERROR: compilation failed for package ‘S4Vectors’
 * removing ‘/home/cdeterman/R/x86_64-pc-linux-gnu-library/3.2/S4Vectors’


 The -std=c99 compilation flag seems to be causing this. On our build
 machines that use gcc (i.e. Linux, Windows, and Snow Leopard), we use
 -std=gnu99. Check what flag your R is using by running:

   R CMD config CC

 in your terminal. My understanding is that gcc doesn't fully support
 c99 yet, and that R packages are only required to support gnu99.
 AFAIK the standard procedure for configuring/compiling/installing
 R should set this to gnu99 when gcc is the default compiler.
 So I'm not not sure why R is configured to use -std=c99 on your
 system.

 Anyway I just made a change to S4Vectors so now it should compile
 with -std=c99. The change is in versions 0.6.4 (release) and 0.7.14
 (devel) which should become available in the next 24 hours or less.
 You might run into compilation problems with other packages that
 don't support -std=c99 though so I would still encourage you to change
 that setting (just edit R_HOME/etc/Makeconf if you know what you are
 doing or ask a system admin to look into this).

 Cheers,
 H.



 On Thu, Aug 27, 2015 at 12:04 PM, Hervé Pagès <hpa...@fredhutch.org> wrote:

 Hi Charles,

 What compiler do you use? Please show the entire output of
 biocLite("S4Vectors") plus your sessionInfo(). Thanks!

[Bioc-devel] Trouble installing S4Vectors?

2015-08-27 Thread Charles Determan
I am trying to run the basic install with biocLite.  I am running R 3.2.2
on Ubuntu 14.04.

I install the BiocInstaller package with

source("http://bioconductor.org/biocLite.R")

I then try to install just S4Vectors, but it errors out in the
str_utils.c file with the following error:

str_utils.c: In function ‘get_svn_time’:
str_utils.c:259:2: warning: implicit declaration of function ‘tzset’
[-Wimplicit-function-declaration]
  tzset();
  ^
str_utils.c:261:18: error: ‘timezone’ undeclared (first use in this
function)
  utc_offset = - (timezone / 3600);
  ^
str_utils.c:261:18: note: each undeclared identifier is reported only once
for each function it appears in
make: *** [str_utils.o] Error 1
ERROR: compilation failed for package ‘S4Vectors’

Did I miss something?

Thanks,
Charles



Re: [Bioc-devel] Trouble installing S4Vectors?

2015-08-27 Thread Charles Determan
I am simply using gcc-4.8

Here is the full output:

BioC_mirror: http://bioconductor.org
Using Bioconductor version 3.1 (BiocInstaller 1.18.4), R version 3.2.2.
Installing package(s) ‘S4Vectors’
trying URL '
http://bioconductor.org/packages/3.1/bioc/src/contrib/S4Vectors_0.6.3.tar.gz
'
Content type 'application/x-gzip' length 187491 bytes (183 KB)
==
downloaded 183 KB

* installing *source* package ‘S4Vectors’ ...
** libs
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c AEbufs.c -o AEbufs.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c DataFrame_class.c -o DataFrame_class.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c Hits_class.c -o Hits_class.o
Hits_class.c: In function ‘Hits_new’:
Hits_class.c:157:13: warning: ‘q_hits2’ may be used uninitialized in this
function [-Wmaybe-uninitialized]
   tsort_hits(q_hits2, s_hits, qh_out, sh_out, nhit,
 ^
Hits_class.c:144:7: note: ‘q_hits2’ was declared here
  int *q_hits2, *qh_out, *sh_out;
   ^
Hits_class.c:233:14: warning: ‘revmap’ may be used uninitialized in this
function [-Wmaybe-uninitialized]
  defineVar(install(translateChar(symbol)), revmap, revmap_envir);
  ^
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c List_class.c -o List_class.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c R_init_S4Vectors.c -o R_init_S4Vectors.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c Rle_class.c -o Rle_class.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c Rle_utils.c -o Rle_utils.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c SEXP_utils.c -o SEXP_utils.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c SimpleList_class.c -o SimpleList_class.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c anyMissing.c -o anyMissing.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c eval_utils.c -o eval_utils.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c hash_utils.c -o hash_utils.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c int_utils.c -o int_utils.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c logical_utils.c -o logical_utils.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c safe_arithm.c -o safe_arithm.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c sort_utils.c -o sort_utils.o
ccache gcc-4.8 -I/usr/share/R/include -DNDEBUG  -fpic  -Wall -O3
-std=c99 -c str_utils.c -o str_utils.o
str_utils.c: In function ‘get_svn_time’:
str_utils.c:259:2: warning: implicit declaration of function ‘tzset’
[-Wimplicit-function-declaration]
  tzset();
  ^
str_utils.c:261:18: error: ‘timezone’ undeclared (first use in this
function)
  utc_offset = - (timezone / 3600);
  ^
str_utils.c:261:18: note: each undeclared identifier is reported only once
for each function it appears in
make: *** [str_utils.o] Error 1
ERROR: compilation failed for package ‘S4Vectors’
* removing ‘/home/cdeterman/R/x86_64-pc-linux-gnu-library/3.2/S4Vectors’

On Thu, Aug 27, 2015 at 12:04 PM, Hervé Pagès hpa...@fredhutch.org wrote:

 Hi Charles,

 What compiler do you use? Please show the entire output of
 biocLite("S4Vectors") plus your sessionInfo(). Thanks!

 H.


 On 08/27/2015 06:19 AM, Charles Determan wrote:

 I am trying to run the basic install with biocLite.  I am running R 3.2.2
 on Ubuntu 14.04.

 I install the BiocInstaller package with

 source("http://bioconductor.org/biocLite.R")

 I then try again to just install S4Vectors but it errors out on the
 src_utils file with the following error:

 str_utils.c: In function ‘get_svn_time’:
 str_utils.c:259:2: warning: implicit declaration of function ‘tzset’
 [-Wimplicit-function-declaration]
tzset();
^
 str_utils.c:261:18: error: ‘timezone’ undeclared (first use in this
 function)
utc_offset = - (timezone / 3600);
^
 str_utils.c:261:18: note: each undeclared identifier is reported only once
 for each function it appears in
 make: *** [str_utils.o] Error 1
 ERROR: compilation failed for package ‘S4Vectors’

 Did I miss something?

 Thanks,
 Charles

 [[alternative HTML version deleted]]

 ___
 Bioc-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/bioc-devel


 --
 Hervé Pagès

 Program in Computational Biology
 Division of Public Health Sciences
 Fred Hutchinson Cancer Research Center
 1100 Fairview Ave. N, M1-B514
 P.O. Box 19024
 Seattle, WA 98109-1024

 E-mail: hpa...@fredhutch.org
 Phone:  (206) 667-5791
 Fax:(206) 667-1319

[Bioc-devel] Bioconductor Build Still Failing on Travis CI

2015-08-24 Thread Charles Determan
This initially came to my attention with this question on StackOverflow (
http://stackoverflow.com/questions/32139378/travis-ci-failing-to-install-bioconductor?noredirect=1#comment52184421_32139378).
I have noticed that my own bioconductor package has continued to fail over
the past several days when it is trying to install Bioconductor
dependencies.  I am able to install Bioconductor packages locally, but the
Travis environment continues to fail.  I want to check here to find out
whether there is still something going on with Bioconductor or whether this
is a Travis CI issue.

Note, my package built successfully previously and I have made no changes
to my .travis.yml file.
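
For reference, the Bioconductor part of a .travis.yml along these lines is
only a few lines (package name illustrative, not my actual file):

    language: r
    bioc_packages:
      - Biobase

so there is not much on my side that could have broken.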

Regards,
Charles

[[alternative HTML version deleted]]

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Bioc-devel] using covr with the parallel package

2015-08-17 Thread Charles Determan
This may be more appropriate to ask the covr package maintainer on his
github page (https://github.com/jimhester/covr/issues).
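
(If I follow the workaround you describe, it would look something like the
sketch below, with 'x' and 'f' as placeholders:

    ncores <- getOption("mc.cores", 1L)
    res <- parallel::mclapply(x, f,
                              mc.cores = ncores,
                              mc.preschedule = (ncores == 1L))

i.e. prescheduling stays TRUE whenever only one core is in use.)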

Charles

On Mon, Aug 17, 2015 at 1:13 PM, Leonard Goldstein 
goldstein.leon...@gene.com wrote:

 Hi Jim,

 I noticed that when covr calculates test coverage, functions called
 inside mclapply or mcmapply with argument mc.preschedule = FALSE are
 considered untested (even if unit tests exist)

 When running checks I only use one core. So an easy fix would be to
 set mc.preschedule to TRUE if mc.cores = 1. But I was wondering if you
 are aware of this behavior and whether there is a way to avoid it?

 Many thanks for your help.

 Leonard

 ___
 Bioc-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/bioc-devel


[[alternative HTML version deleted]]

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [R-pkg-devel] CRAN submission which requires third party software?

2015-08-12 Thread Charles Determan
Hi Dirk,

I was primarily concerned with the requirement of the SDK.  Strictly
speaking, the SDK isn't actually need it to compile the package (assuming
the CRAN servers have the OpenCL library).  I would just need to have
nearly all the tests skipped on CRAN and the package would essentially be
worthless to someone installing it without the SDK.
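
What I have in mind for the tests is simply guarding them, e.g. with
testthat (a sketch -- the test name is made up):

    test_that("gpuMatrix algebra works", {
        skip_on_cran()
        # ... tests that require a working OpenCL SDK ...
    })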

Another concern is that the package requires a C++11 compiler.  This isn't
an issue on Linux systems but I don't know how to accomplish a build on
Windows as the Rtools compiler is still using 4.6 (as I am sure you know).
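
(On the C++11 side, my understanding is that R itself only needs the
declaration -- 'SystemRequirements: C++11' in the DESCRIPTION plus, in
src/Makevars:

    CXX_STD = CXX11

-- but that still leaves the question of whether the Rtools 4.6 toolchain
can actually compile the code.)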

Thanks,
Charles

On Tue, Aug 11, 2015 at 12:13 PM, Dirk Eddelbuettel e...@debian.org wrote:


 On 11 August 2015 at 11:49, Charles Determan wrote:
 | I am beginning to reach the completion of a new package I would like to
 | submit to CRAN.  However, there are a few aspects of it that I would like
 | some guidance prior to submitting.
 |
 | 1. It requires some third party software (i.e. an OpenCL SDK) to be
 | installed before it will compile (it contains c++ code).  As such, I
 don't
 | believe it would build successfully when submitted to CRAN.  I can verify
 | it builds and tests successfully on Travis CI but not sure how CRAN would
 | approach this.

 So you are saying it needs _new tools to build_ it?  That may be tough.

 | 2. It requires a support package which is another header only package
 which
 | mirrors the structure of BH.

 I see no difficulty with this part.

 Dirk

 | I know this has been accomplished by some other packages like gputools (
 | https://cran.r-project.org/web/packages/gputools/index.html) which
 requires
 | the CUDA toolkit to be installed.  I also don't know how to approach the
 | companion package submission (i.e. that others could use just like the BH
 | package).
 |
 | Anyone have any recommendations on how I should prepare for submitting
 this
 | package.
 |
 | Thank you,
 | Charles
 |
 |   [[alternative HTML version deleted]]
 |
 | __
 | R-package-devel@r-project.org mailing list
 | https://stat.ethz.ch/mailman/listinfo/r-package-devel

 --
 http://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org


[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] CRAN submission which requires third party software?

2015-08-11 Thread Charles Determan
I am nearing completion of a new package that I would like to
submit to CRAN.  However, there are a few aspects of it on which I would
like some guidance prior to submitting.

1. It requires some third party software (i.e. an OpenCL SDK) to be
installed before it will compile (it contains c++ code).  As such, I don't
believe it would build successfully when submitted to CRAN.  I can verify
it builds and tests successfully on Travis CI but not sure how CRAN would
approach this.

2. It requires a support package which is another header only package which
mirrors the structure of BH.

I know this has been accomplished by some other packages like gputools (
https://cran.r-project.org/web/packages/gputools/index.html) which requires
the CUDA toolkit to be installed.  I also don't know how to approach the
companion package submission (i.e. that others could use just like the BH
package).

Anyone have any recommendations on how I should prepare for submitting this
package?
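
My tentative plan for point 1 is simply to declare the dependency in the
DESCRIPTION, along the lines of:

    SystemRequirements: OpenCL SDK (e.g. AMD APP SDK, Intel OpenCL SDK,
        or the NVIDIA CUDA toolkit)

but I do not know whether CRAN considers that sufficient.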

Thank you,
Charles

[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[Bioc-devel] Fwd: OmicsMarkeR problems reported by the Build/check report for BioC 3.1

2015-07-21 Thread Charles Determan
I apologize if this is the incorrect place to post this but I need some
assistance with the following build report.  I submitted my fix for this
problem two days ago to my git repo which has the corresponding svn
bridge.  Everything builds correctly on my system and Travis CI but doesn't
appear to do so for bioconductor's servers.  I tried to access my svn to
check if my updates were mirrored from my git repo at
https://hedgehog.fhcrc.org/bioconductor/trunk/madman/Rpacks/OmicsMarkeR but
after logging in all I get is an 'Internal Server Error'.

I want to address this problem but I don't know how to proceed at this
point.

Regards,
Charles


-- Forwarded message --
From: bbs-nore...@bioconductor.org
Date: Tue, Jul 21, 2015 at 1:12 PM
Subject: OmicsMarkeR problems reported by the Build/check report for BioC
3.1
To: cdeterma...@gmail.com


[This is an automatically generated email. Please don't reply.]

Hi OmicsMarkeR maintainer,

According to the Build/check report for BioC 3.1,
the OmicsMarkeR package has the following problem(s):

  o ERROR for 'R CMD build' on zin2. See the details here:

http://bioconductor.org/checkResults/3.1/bioc-LATEST/OmicsMarkeR/zin2-buildsrc.html

Please take the time to address this then use your Subversion account
when you are ready to commit a fix to your package.

Notes:

  * This was the status of your package at the time this email was sent to
you.
Given that the online report is updated daily (in normal conditions) you
could see something different when you visit the URL(s) above,
especially if
you do so several days after you received this email.

  * It is possible that the problems reported in this report are false
positives,
either because another package (from CRAN or Bioconductor) breaks your
package (if yours depends on it) or because of a Build System problem.
If this is the case, then you can ignore this email.

  * Please check the report again 24h after you've committed your changes
to the
package and make sure that all the problems have gone.

  * If you have questions about this report or need help with the
maintenance of your package, please use the Bioc-devel mailing list:

  http://bioconductor.org/help/mailing-list/

(all package maintainers are requested to subscribe to this list)

For immediate notification of package build status, please
subscribe to your package's RSS feed. Information is at:

http://bioconductor.org/developers/rss-feeds/

Thanks for contributing to the Bioconductor project!

[[alternative HTML version deleted]]

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Bioc-devel] Fwd: OmicsMarkeR problems reported by the Build/check report for BioC 3.1

2015-07-21 Thread Charles Determan
I can get in with 'readonly' but not my user account.  Is there any way for
me to reset it?

Charles

On Tue, Jul 21, 2015 at 1:44 PM, Dan Tenenbaum dtene...@fredhutch.org
wrote:



 - Original Message -
  From: Charles Determan cdeterma...@gmail.com
  To: bioc-devel@r-project.org
  Sent: Tuesday, July 21, 2015 11:23:00 AM
  Subject: [Bioc-devel] Fwd: OmicsMarkeR problems reported by the
 Build/check report for BioC 3.1
 
  I apologize if this is the incorrect place to post this but I need
  some
  assistance with the following build report.  I submitted my fix for
  this
  problem two days ago to my git repo which has the corresponding svn
  bridge.  Everything builds correctly on my system and Travis CI but
  doesn't
  appear to do so for bioconductor's servers.  I tried to access my svn
  to
  check if my updates were mirrored from my git repo at
  https://hedgehog.fhcrc.org/bioconductor/trunk/madman/Rpacks/OmicsMarkeR
  but
  after logging in all I get is an 'Internal Server Error'.
 

 I can't reproduce this; if I paste that url into a web browser and log in
 either with my svn
 username and password or readonly/readonly, I don't get an error.

  I want to address this problem but I don't know how to proceed at
  this
  point.


 The error you were emailed about was in release. The problem here is that
 something in the CRAN package assertive has changed. It looks like you have
 not modified release at all.

 The error in devel is different:


 http://www.bioconductor.org/checkResults/devel/bioc-LATEST/OmicsMarkeR/zin1-checksrc.html

 Not sure if this is related to the assertive package or if it's something
 introduced in your commit. As I recall, before this commit, both release
 and devel had the same error that we are now seeing in release.

 FYI, to make changes to release, you need to check out


 https://hedgehog.fhcrc.org/bioconductor/branches/RELEASE_3_1/madman/Rpacks/OmicsMarkeR

 Dan



 
  Regards,
  Charles
 
 
  -- Forwarded message --
  From: bbs-nore...@bioconductor.org
  Date: Tue, Jul 21, 2015 at 1:12 PM
  Subject: OmicsMarkeR problems reported by the Build/check report
  for BioC
  3.1
  To: cdeterma...@gmail.com
 
 
  [This is an automatically generated email. Please don't reply.]
 
  Hi OmicsMarkeR maintainer,
 
  According to the Build/check report for BioC 3.1,
  the OmicsMarkeR package has the following problem(s):
 
o ERROR for 'R CMD build' on zin2. See the details here:
 
 
 http://bioconductor.org/checkResults/3.1/bioc-LATEST/OmicsMarkeR/zin2-buildsrc.html
 
  Please take the time to address this then use your Subversion account
  when you are ready to commit a fix to your package.
 
  Notes:
 
* This was the status of your package at the time this email was
sent to
  you.
  Given that the online report is updated daily (in normal
  conditions) you
  could see something different when you visit the URL(s) above,
  especially if
  you do so several days after you received this email.
 
* It is possible that the problems reported in this report are
false
  positives,
  either because another package (from CRAN or Bioconductor) breaks
  your
  package (if yours depends on it) or because of a Build System
  problem.
  If this is the case, then you can ignore this email.
 
* Please check the report again 24h after you've committed your
changes
  to the
  package and make sure that all the problems have gone.
 
* If you have questions about this report or need help with the
  maintenance of your package, please use the Bioc-devel mailing
  list:
 
http://bioconductor.org/help/mailing-list/
 
  (all package maintainers are requested to subscribe to this list)
 
  For immediate notification of package build status, please
  subscribe to your package's RSS feed. Information is at:
 
  http://bioconductor.org/developers/rss-feeds/
 
  Thanks for contributing to the Bioconductor project!
 
[[alternative HTML version deleted]]
 
  ___
  Bioc-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/bioc-devel
 


[[alternative HTML version deleted]]

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Bioc-devel] Fwd: OmicsMarkeR problems reported by the Build/check report for BioC 3.1

2015-07-21 Thread Charles Determan
Okay, I made it in and committed the bug fix (svn commit).  The package
should build properly now.  Thank you for your help.
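
For anyone else hitting this, the steps were roughly as follows (the commit
message is paraphrased):

    svn checkout https://hedgehog.fhcrc.org/bioconductor/branches/RELEASE_3_1/madman/Rpacks/OmicsMarkeR
    cd OmicsMarkeR
    # apply the fix and bump the version, then
    svn commit -m "fix release build error"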

Charles

On Tue, Jul 21, 2015 at 1:58 PM, Dan Tenenbaum dtene...@fredhutch.org
wrote:



 - Original Message -
  From: Charles Determan cdeterma...@gmail.com
  To: Dan Tenenbaum dtene...@fredhutch.org
  Cc: bioc-devel@r-project.org
  Sent: Tuesday, July 21, 2015 11:52:49 AM
  Subject: Re: [Bioc-devel] Fwd: OmicsMarkeR problems reported by the
 Build/check report for BioC 3.1
 
 
 
  I can get in with 'readonly' but not my user account. Is there any
  way for me to reset it?

 See
 http://www.bioconductor.org/developers/how-to/git-mirrors/#i-dont-know-my-subversion-username-andor-password-what-do-i-do

 Dan



 
  Charles
 
 
 
  On Tue, Jul 21, 2015 at 1:44 PM, Dan Tenenbaum 
  dtene...@fredhutch.org  wrote:
 
 
 
 
  - Original Message -
   From: Charles Determan  cdeterma...@gmail.com 
   To: bioc-devel@r-project.org
   Sent: Tuesday, July 21, 2015 11:23:00 AM
   Subject: [Bioc-devel] Fwd: OmicsMarkeR problems reported by the
   Build/check report for BioC 3.1
  
   I apologize if this is the incorrect place to post this but I need
   some
   assistance with the following build report. I submitted my fix for
   this
   problem two days ago to my git repo which has the corresponding svn
   bridge. Everything builds correctly on my system and Travis CI but
   doesn't
   appear to do so for bioconductor's servers. I tried to access my
   svn
   to
   check if my updates were mirrored from my git repo at
  
 https://hedgehog.fhcrc.org/bioconductor/trunk/madman/Rpacks/OmicsMarkeR
   but
   after logging in all I get is an 'Internal Server Error'.
  
 
  I can't reproduce this; if I paste that url into a web browser and
  log in either with my svn
  username and password or readonly/readonly, I don't get an error.
 
   I want to address this problem but I don't know how to proceed at
   this
   point.
 
 
  The error you were emailed about was in release. The problem here is
  that something in the CRAN package assertive has changed. It looks
  like you have not modified release at all.
 
  The error in devel is different:
 
 
 http://www.bioconductor.org/checkResults/devel/bioc-LATEST/OmicsMarkeR/zin1-checksrc.html
 
  Not sure if this is related to the assertive package or if it's
  something introduced in your commit. As I recall, before this
  commit, both release and devel had the same error that we are now
  seeing in release.
 
  FYI, to make changes to release, you need to check out
 
 
 https://hedgehog.fhcrc.org/bioconductor/branches/RELEASE_3_1/madman/Rpacks/OmicsMarkeR
 
  Dan
 
 
 
  
   Regards,
   Charles
  
  
   -- Forwarded message --
   From:  bbs-nore...@bioconductor.org 
   Date: Tue, Jul 21, 2015 at 1:12 PM
   Subject: OmicsMarkeR problems reported by the Build/check report
   for BioC
   3.1
   To: cdeterma...@gmail.com
  
  
   [This is an automatically generated email. Please don't reply.]
  
   Hi OmicsMarkeR maintainer,
  
   According to the Build/check report for BioC 3.1,
   the OmicsMarkeR package has the following problem(s):
  
   o ERROR for 'R CMD build' on zin2. See the details here:
  
  
 http://bioconductor.org/checkResults/3.1/bioc-LATEST/OmicsMarkeR/zin2-buildsrc.html
  
 
 
   Please take the time to address this then use your Subversion
   account
   when you are ready to commit a fix to your package.
  
   Notes:
  
   * This was the status of your package at the time this email was
   sent to
   you.
   Given that the online report is updated daily (in normal
   conditions) you
   could see something different when you visit the URL(s) above,
   especially if
   you do so several days after you received this email.
  
   * It is possible that the problems reported in this report are
   false
   positives,
   either because another package (from CRAN or Bioconductor) breaks
   your
   package (if yours depends on it) or because of a Build System
   problem.
   If this is the case, then you can ignore this email.
  
   * Please check the report again 24h after you've committed your
   changes
   to the
   package and make sure that all the problems have gone.
  
   * If you have questions about this report or need help with the
   maintenance of your package, please use the Bioc-devel mailing
   list:
  
   http://bioconductor.org/help/mailing-list/
  
   (all package maintainers are requested to subscribe to this list)
  
   For immediate notification of package build status, please
   subscribe to your package's RSS feed. Information is at:
  
   http://bioconductor.org/developers/rss-feeds/
  
   Thanks for contributing to the Bioconductor project!
  
   [[alternative HTML version deleted]]
  
   ___
   Bioc-devel@r-project.org mailing list
   https://stat.ethz.ch/mailman/listinfo/bioc-devel
  
 
 


[[alternative HTML version deleted]]

[Rd] Why doesn't R have a float data type?

2015-06-30 Thread Charles Determan
This is strictly a curiosity question.  I am aware that R doesn't possess a
float data type.  I also don't mean to request that such functionality be
implemented, as I'm sure it would require a large amount of work with
potential backward compatibility conflicts.  But I wanted to know why R has
never had a float data type available.

Regards,
Charles

[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Why doesn't R have a float data type?

2015-06-30 Thread Charles Determan
Hi Greg, I was referring to the single precision type.  Your points were
what I expected.  I just wanted to ask the R community if there was any
other reason than 'there wasn't any reason to implement it'.
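
(For the record, the flag you mention is easy to see from R:

    x <- as.single(1:5)
    typeof(x)            # still "double"
    attr(x, "Csingle")   # TRUE -- honoured only when passing to .C/.Fortran

so the storage really is unchanged.)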

Thanks,
Charles

On Tue, Jun 30, 2015 at 12:29 PM, Greg Snow 538...@gmail.com wrote:

 My understanding is that R does have a float type, it is just called
 double instead of float.

 If you are referring to a single precision floating point type, then R
 does have the as.single function, but that does not really change
 the way the number is stored, just sets a flag so that the proper
 conversion is done when passing to the .C or .fortran functions.
 The original S language and S+ would store things in single precision
 if needed, but for computations these values were almost always
 converted to doubles for precision.  By the time R was developed the
 memory saving of using single precision instead of double precision
 was not as big an issue, so I expect that nobody ever considered it
 worth the effort to fully implement the single precision storage.

 If you mean something else other than the above by float data type
 then please give us more details so that we can better answer the
 question.

 On Tue, Jun 30, 2015 at 10:42 AM, Charles Determan
 cdeterma...@gmail.com wrote:
  This is strictly a curiosity question.  I am aware that R doesn't possess a
  float data type.  I also don't mean to request that such functionality be
  implemented as I'm sure it would require a large amount of work with
  potential back compatibility conflicts.  But I wanted to know why R has
  never had a float data type available?
 
  Regards,
  Charles
 
  [[alternative HTML version deleted]]
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel



 --
 Gregory (Greg) L. Snow Ph.D.
 538...@gmail.com


[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[R-pkg-devel] Creating a Makevars that includes objects from another compiler

2015-06-05 Thread Charles Determan
Greetings,

Quick context, I have been working on developing a set of packages to make
GPU computing as simple as possible with R.  I have a functional package
that works very nicely with OpenCL based code (gpuR -
https://github.com/cdeterman/gpuR) but I also am building a companion CUDA
backend package (gpuRcuda - https://github.com/cdeterman/gpuRcuda) which is
stripped down to a single function at the moment to work out the compiling
issue described below.

The problem is, CUDA requires me to use the NVCC compiler.  So I cannot
just rely on the magic of Rcpp, I need to create a Makevars file.  I have
done so and I am very close but I cannot get the shared object file to be
created with NVCC.  The individual files compile with NVCC just fine, the
build script just keeps switching back to g++ for the shared object file.

A quick excerpt from my build script

'''
/usr/local/cuda-7.0/bin/nvcc -arch=sm_30 -Xcompiler -fPIC -Xcudafe
--diag_suppress=boolean_controlling_expr_is_constant -I/usr/share/R/include
-I/usr/include
-I/home/cdeterman/R/x86_64-pc-linux-gnu-library/3.2/RViennaCL/include
-I/home/cdeterman/R/x86_64-pc-linux-gnu-library/3.2/RcppEigen/include
viennacl_sgemm.cu -c

# NOW IT SWITCHES TO G++?

ccache g++-4.8 -shared -L/usr/lib/R/lib -Wl,-Bsymbolic-functions
-Wl,-z,relro -o gpuRcuda.so RcppExports.o viennacl_cudaMatrix_sgemm.o
-L/usr/lib/R/lib -lR
'''

You can see my current Makevars file here (
https://github.com/cdeterman/gpuRcuda/blob/master/src/Makevars).

I have read in the 'Writing R Extensions' manual that this is possible with
the OBJECTS macro (
http://cran.r-project.org/doc/manuals/r-release/R-exts.html#Creating-shared-objects)
however I cannot figure out how to use it properly (and haven't been able
to find an example).  I have looked at the other packages on CRAN that also
use CUDA (such as WideLM which I tried to emulate) but I can't seem to
figure out why R keeps defaulting the shared library creation to g++.
Perhaps my shared object section of my Makevars is incorrect?
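
For concreteness, this is roughly the pattern I am attempting (simplified
from the Makevars linked above, with flags abbreviated, so treat it as a
sketch):

    NVCC = /usr/local/cuda-7.0/bin/nvcc
    CUDA_OBJS = viennacl_sgemm.o

    # OBJECTS tells R which object files make up the shared library
    OBJECTS = RcppExports.o $(CUDA_OBJS)

    all: $(SHLIB)

    # compile the .cu file with nvcc instead of the default compiler
    viennacl_sgemm.o: viennacl_sgemm.cu
    	$(NVCC) -arch=sm_30 -Xcompiler -fPIC -c $< -o $@

The custom rule gets the .cu file compiled with nvcc, but the final link of
the .so is still done by R's default rule, hence the switch back to g++.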

Any insights would be sincerely appreciated,
Charles

[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Creating a Makevars that includes objects from another compiler

2015-06-05 Thread Charles Determan
A quick followup: even when I try to use g++ I cannot seem to pass the
objects (it keeps omitting the .o file generated from nvcc).  How can I
have my Makevars file pass a defined list of object files to the final
shared library call?

Alternate  gpuRcuda.so block where the build output still doesn't use the
OBJS variable (note, I confirmed that the OBJS variable does contain all .o
files)

gpuRcuda.so: $(OBJS)
	$(CXX) $< -o $@ $(R_LIBS) $(LIBS)

Regards,
Charles

On Fri, Jun 5, 2015 at 9:42 AM, Charles Determan cdeterma...@gmail.com
wrote:

 Greetings,

 Quick context, I have been working on developing a set of packages to make
 GPU computing as simple as possible with R.  I have a functional package
 that works very nicely with OpenCL based code (gpuR -
 https://github.com/cdeterman/gpuR) but I also am building a companion
 CUDA backend package (gpuRcuda - https://github.com/cdeterman/gpuRcuda)
 which is stripped down to a single function at the moment to work out the
 compiling issue described below.

 The problem is, CUDA requires me to use the NVCC compiler.  So I cannot
 just rely on the magic of Rcpp, I need to create a Makevars file.  I have
 done so and I am very close but I cannot get the shared object file to be
 created with NVCC.  The individual files compile with NVCC just fine, the
 build script just keeps switching back to g++ for the shared object file.

 A quick excerpt from my build script

 '''
 /usr/local/cuda-7.0/bin/nvcc -arch=sm_30 -Xcompiler -fPIC -Xcudafe
 --diag_suppress=boolean_controlling_expr_is_constant -I/usr/share/R/include
 -I/usr/include
 -I/home/cdeterman/R/x86_64-pc-linux-gnu-library/3.2/RViennaCL/include
 -I/home/cdeterman/R/x86_64-pc-linux-gnu-library/3.2/RcppEigen/include
 viennacl_sgemm.cu -c

 # NOW IT SWITCHES TO G++?

 ccache g++-4.8 -shared -L/usr/lib/R/lib -Wl,-Bsymbolic-functions
 -Wl,-z,relro -o gpuRcuda.so RcppExports.o viennacl_cudaMatrix_sgemm.o
 -L/usr/lib/R/lib -lR
 '''

 You can see my current Makevars file here (
 https://github.com/cdeterman/gpuRcuda/blob/master/src/Makevars).

 I have read in the 'Writing R Extensions' manual that this is possible
 with the OBJECTS macro (
 http://cran.r-project.org/doc/manuals/r-release/R-exts.html#Creating-shared-objects)
 however I cannot figure out how to use it properly (and haven't been able
 to find an example).  I have looked at the other packages on CRAN that also
 use CUDA (such as WideLM which I tried to emulate) but I can't seem to
 figure out why R keeps defaulting the shared library creation to g++.
 Perhaps my shared object section of my Makevars is incorrect?

 Any insights would be sincerely appreciated,
 Charles



[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] example fails during R CMD CHECK but works interactively?

2015-05-15 Thread Charles Determan
Does anyone else have any thoughts about troubleshooting the R CMD check
environment?
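
(For anyone trying to reproduce this: I have been re-running the failing
example both interactively and with the variables Dan suggested, e.g.

    R_DEFAULT_PACKAGES= LC_COLLATE=C R --vanilla -e \
      'example("big.matrix", package = "bigmemory")'

where "big.matrix" stands in for the failing topic, and I still cannot
trigger the allocation error outside of R CMD check.)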

Charles

On Wed, May 13, 2015 at 1:57 PM, Charles Determan cdeterma...@gmail.com
wrote:

 Thank you Dan but it isn't my tests that are failing (all of them pass
 without problem) but one of the examples from the inst/examples directory.
 I did try, however, to start R with the environmental variables as you
 suggest but it had no effect on my tests.

 Charles

 On Wed, May 13, 2015 at 1:51 PM, Dan Tenenbaum dtene...@fredhutch.org
 wrote:



 - Original Message -
  From: Charles Determan cdeterma...@gmail.com
  To: r-devel@r-project.org
  Sent: Wednesday, May 13, 2015 11:31:36 AM
  Subject: [Rd] example fails during R CMD CHECK but works interactively?
 
  Greetings,
 
  I am collaborating with developing the bigmemory package and have run
  in to
  a strange problem when we run R CMD CHECK.  For some reason that
  isn't
  clear to us one of the examples crashes stating:
 
  Error:  memory could not be allocated for instance of type big.matrix
 
  You can see the output on the Travis CI page at
  https://travis-ci.org/kaneplusplus/bigmemory where the error starts
  at line
  1035.  This is completely reproducible when running
  devtools::check(args='--as-cran') locally.  The part that is
  confusing is
  that the calls work perfectly when called interactively.
 
  Hadley comments on the 'check' page of his R packages website (
  http://r-pkgs.had.co.nz/check.html) regarding test failing following
  R CMD
  check:
 
  Occasionally you may have a problem where the tests pass when run
  interactively with devtools::test(), but fail when in R CMD check.
  This
  usually indicates that you’ve made a faulty assumption about the
  testing
  environment, and it’s often hard to figure it out.
 
  Any thoughts on how to troubleshoot this problem?  I have no idea
  what
  assumption we could have made.

 Note that R CMD check runs R with environment variables set as follows
 (at least on my system; you can check $R_HOME/bin/check to see what it does
 on yours):

  R_DEFAULT_PACKAGES= LC_COLLATE=C

 So try starting R like this:

  R_DEFAULT_PACKAGES= LC_COLLATE=C  R

 And see if that reproduces the test failure. The locale setting could
 affect tests of sort order, and the default package setting could
 potentially affect other things.

 Dan



 
  Regards,
  Charles
 
[[alternative HTML version deleted]]
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 




[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] example fails during R CMD CHECK but works interactively?

2015-05-13 Thread Charles Determan
Greetings,

I am collaborating on developing the bigmemory package and have run into
a strange problem when we run R CMD CHECK.  For some reason that isn't
clear to us one of the examples crashes stating:

Error:  memory could not be allocated for instance of type big.matrix

You can see the output on the Travis CI page at
https://travis-ci.org/kaneplusplus/bigmemory where the error starts at line
1035.  This is completely reproducible when running
devtools::check(args='--as-cran') locally.  The part that is confusing is
that the calls work perfectly when called interactively.

Hadley comments on the 'check' page of his R packages website (
http://r-pkgs.had.co.nz/check.html) regarding test failing following R CMD
check:

Occasionally you may have a problem where the tests pass when run
interactively with devtools::test(), but fail when in R CMD check. This
usually indicates that you’ve made a faulty assumption about the testing
environment, and it’s often hard to figure it out.

Any thoughts on how to troubleshoot this problem?  I have no idea what
assumption we could have made.

Regards,
Charles

[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] example fails during R CMD CHECK but works interactively?

2015-05-13 Thread Charles Determan
Thank you Dan, but it isn't my tests that are failing (all of them pass
without problem) but one of the examples from the inst/examples directory.
I did try, however, to start R with the environmental variables as you
suggest but it had no effect on my tests.

Charles

On Wed, May 13, 2015 at 1:51 PM, Dan Tenenbaum dtene...@fredhutch.org
wrote:



 - Original Message -
  From: Charles Determan cdeterma...@gmail.com
  To: r-devel@r-project.org
  Sent: Wednesday, May 13, 2015 11:31:36 AM
  Subject: [Rd] example fails during R CMD CHECK but works interactively?
 
  Greetings,
 
  I am collaborating with developing the bigmemory package and have run
  in to
  a strange problem when we run R CMD CHECK.  For some reason that
  isn't
  clear to us one of the examples crashes stating:
 
  Error:  memory could not be allocated for instance of type big.matrix
 
  You can see the output on the Travis CI page at
  https://travis-ci.org/kaneplusplus/bigmemory where the error starts
  at line
  1035.  This is completely reproducible when running
  devtools::check(args='--as-cran') locally.  The part that is
  confusing is
  that the calls work perfectly when called interactively.
 
  Hadley comments on the 'check' page of his R packages website (
  http://r-pkgs.had.co.nz/check.html) regarding test failing following
  R CMD
  check:
 
  Occasionally you may have a problem where the tests pass when run
  interactively with devtools::test(), but fail when in R CMD check.
  This
  usually indicates that you’ve made a faulty assumption about the
  testing
  environment, and it’s often hard to figure it out.
 
  Any thoughts on how to troubleshoot this problem?  I have no idea
  what
  assumption we could have made.

 Note that R CMD check runs R with environment variables set as follows (at
 least on my system; you can check $R_HOME/bin/check to see what it does on
 yours):

  R_DEFAULT_PACKAGES= LC_COLLATE=C

 So try starting R like this:

  R_DEFAULT_PACKAGES= LC_COLLATE=C  R

 And see if that reproduces the test failure. The locale setting could
 affect tests of sort order, and the default package setting could
 potentially affect other things.

 Dan



 
  Regards,
  Charles
 
[[alternative HTML version deleted]]
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 


[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Bioc-devel] Submitting a new package

2015-03-13 Thread Charles Determan Jr
I have finished a new package recently that I would like to submit to
bioconductor.  I have read through the package guidelines and I believe I
understand the package submission process but embarrassingly I cannot
figure out how to actually submit the package.  Could someone kindly
explain where I actually submit the package for review?


Regards,

-- 
Dr. Charles Determan, PhD
Integrated Biosciences

[[alternative HTML version deleted]]

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Bioc-devel] Submitting a new package

2015-03-13 Thread Charles Determan Jr
Thanks Dan, could you confirm that the email statement (packages NEAR
bioconductor POINT org) means packa...@bioconductor.org?  I don't
recognize the NEAR notation.

Charles

On Fri, Mar 13, 2015 at 1:59 PM, Dan Tenenbaum dtene...@fredhutch.org
wrote:



 - Original Message -
  From: Charles Determan Jr deter...@umn.edu
  To: Bioc-devel@r-project.org
  Sent: Friday, March 13, 2015 11:57:24 AM
  Subject: [Bioc-devel] Submitting a new package
 
  I have finished a new package recently that I would like to submit to
  bioconductor.  I have read through the package guidelines and I
  believe I
  understand the package submission process but embarrassingly I cannot
  figure out how to actually submit the package?  Could someone kindly
  explain where I actually submit the package for review?
 

 See the very bottom of
 http://bioconductor.org/developers/package-submission/

 Dan


 
  Regards,
 
  --
  Dr. Charles Determan, PhD
  Integrated Biosciences
 
[[alternative HTML version deleted]]
 
  ___
  Bioc-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/bioc-devel
 




-- 
Dr. Charles Determan, PhD
Integrated Biosciences

[[alternative HTML version deleted]]

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


[Bioc-devel] BiocCheck Error 'duplicate label install'?

2014-12-31 Thread Charles Determan Jr
Greetings,

Apologies if this is a double post; I was so overzealous in my initial
submission that I didn't wait for my membership to be confirmed.

I have 'finished' creating a package I would like to submit to
bioconductor.  As per the instructions on the bioconductor website I am
trying to get the package to pass the BiocCheck function but I keep getting
the following error:

Error in parse_block(g[-1], g[1], params.src) : duplicate label 'install'

The package also passes through R CMD Check with no errors or warnings.  If
someone could kindly explain what this error means I would sincerely
appreciate it.
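
My only guess so far is that knitr is objecting to two chunks in the
vignette sharing the same label, i.e. something like this made-up .Rnw
example:

    <<install, eval=FALSE>>=
    # installation instructions
    @

    %% ... and further down, a second chunk reusing the label:
    <<install>>=
    @

(the same would apply to two chunks labelled 'install' in an .Rmd file),
but R CMD Check passes, so I am not certain that is it.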

Regards,

-- 
Dr. Charles Determan, PhD
Integrated Biosciences

[[alternative HTML version deleted]]

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


[Rd] modifying some package code

2012-05-24 Thread Charles Determan Jr
Greetings,

I am working on modifying some code from the nlme package.  I have had many
discussions on the mixed models mailing list and have been directed to
simply 'hack' the source code so that the degrees of freedom generated by
one function can be used in the output of another function that doesn't
generate them.  My current holdup is an error regarding an object called
'inner_perc_table' that is passed to the .C function.  The error states
that the object 'inner_perc_table' is not found.  My confusion lies in the
fact that when I run the original script, it recognizes the object just
fine.  At no point is the object defined, and I cannot currently find such
code in the package's source.  Perhaps someone here is familiar with the
nlme package and could assist me in some form.  If you need further
information, please ask, as I don't know if there is a general answer for
this type of question or if you will need the actual code.

Regards,
Charles

[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] modifying some package code

2012-05-24 Thread Charles Determan Jr
Simon,

Thank you for this valuable information.  However, you must forgive some
ignorance on my part.  If R-registerRoutines defines the native function,
how should I go about fixing this issue?  Would I copy the init.c to the
base package (where I have the new function)?

Thanks,
Charles

On Thu, May 24, 2012 at 11:58 AM, Simon Urbanek simon.urba...@r-project.org
 wrote:


 On May 24, 2012, at 12:25 PM, Charles Determan Jr wrote:

  Greetings,
 
  I am working on modifying some code from the nlme package.  I have had
 many
  discussions on the mixed models mailing list and have been directed to
  simply 'hack' the source code to have the degrees of freedom generated by
  one function to use in the output of another function that doesn't
 generate
  them.  My current holdup is an error regarding a .c file called
  'inner_perc_table' called by the .C function.  The error states that the
  object 'inner_perc_table' is not found.  My confusion lies in the fact
 that
  when I run the original script, it recognizes the part just fine.  At no
  point is the object defined and I cannot currently find such a code in
 the
  package's source.  Perhaps someone here is familiar with the nlme package
  and could assist me in some form.  If you need further information,
 please
  ask as I don't know if there is a general answer for this type of
 question
  or if you will need the actual code.
 

 The (unexported) object contains a cached reference to the native function
 (see ?getNativeSymbolInfo) and is defined by R_registerRoutines in
 src/init.c. This is a typical optimization in R packages to avoid costly
 lookup of symbols and to provide check for native arguments.

 Cheers,
 Simon





[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel