Dear R community,
I found that the x label goes missing when I save this plot to a TIFF file:
justsample <- rnorm(n=1095*3,mean=100,sd=10)
justsample <- as.data.frame(matrix(justsample,ncol=3))
dd <- seq(from=as.Date("1985-01-01"), to =as.Date("1987-12-31"), by='day')
y <- data.frame(Year=substr(dd,1,4),
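A minimal sketch of the usual fix, not the poster's actual plot: when a label disappears on a TIFF device, the figure margins are often too small for the chosen size and resolution, so set the device size, resolution, and margins together (the file name and margin values here are illustrative).

```r
# Write a simple base plot to a TIFF device with explicit size, resolution,
# and margins so the axis labels have room to be drawn.
tiff("plot.tiff", width = 6, height = 4, units = "in", res = 300)
par(mar = c(4.5, 4.5, 1, 1))            # bottom/left margins big enough for labels
plot(1:10, xlab = "Day", ylab = "Value")
dev.off()
```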
I have been using nlsr() to fit s curves to Covid-19 data over the past few
weeks and I have not had any issues.
Bernard
Sent from my iPhone so please excuse the spelling!
> On May 13, 2020, at 5:16 PM, Abby Spurdle wrote:
>
> Hi Christofer,
>
> This doesn't really answer your question.
>
> It's possible that Martin's package, cobs, can do this, but not sure,
> I haven't tried it.
> And there may be other R packages for fitting splines/smoothers to
> data, subject to shape constraints.
Further to my previous post.
I read through the documentation for the cobs package.
And (someone
It looks like you are using the str_nth_currency() function from the strex
package but we have no idea of what the pdf files are or how you are
importing them into R.
We need a lot more information on what you are doing "before" you use the
function.
Have a look at
Hi Christofer,
This doesn't really answer your question.
But if the goal is to fit an S-shaped curve to data, with increased
flexibility...
(I'm assuming that's the goal).
...then I'd like to note the option of splines (or smoothing), subject
to shape constraints...
My guess is it's probably
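A hedged sketch of the shape-constrained spline idea mentioned above, assuming the cobs package and its `constraint` argument work as documented (I have not tried it myself, and the data here are invented):

```r
# Fit a monotone increasing B-spline to noisy S-shaped data with cobs().
library(cobs)
set.seed(1)
x <- seq(0, 10, length.out = 80)
y <- 100 / (1 + exp(-(x - 5))) + rnorm(80, sd = 3)
fit <- cobs(x, y, constraint = "increase")   # shape constraint: monotone increasing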
On 2020-05-13 19:54 +, Poling, William wrote:
> I have R on personal laptop for
> consultative purposes from time to time,
> however, I cannot move data,
> confidentiality constraints, as you can
> imagine.
>
> I have initiated another IT ticket with
> organization, I think I will get to
Hello Abby and thank you for your response.
You're surely correct.
I have not worked a problem like this previously, however, I am learning fast.
I did not think I would need to apply mathematical formulas and calculations for
this task; math in general is not my primary area of expertise, but always
On 2020-05-13 13:13 -0700, Jeff Newmiller wrote:
> In general, any time you deal with floating
> point numbers having different magnitudes,
> you risk pushing some low precision bits
> out of the result. Simply changing the
> sequence of calculations such as a literal
> polynomial evaluation
> "determine the largest concentration of members in the smallest radius"
I haven't read the whole thread, and I'm not familiar with this topic.
However, looking at it from an intuitive perspective, isn't the
smallest radius zero?
If the concentration means the number of "members" divided by the
In general, any time you deal with floating point numbers having different
magnitudes, you risk pushing some low precision bits out of the result. Simply
changing the sequence of calculations such as a literal polynomial evaluation
versus Horner's method can obtain different results. Take a
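A small illustration of the point about evaluation order (my own toy example, not from the thread): two algebraically identical evaluations of p(x) = (x - 1)^7 near x = 1, where massive cancellation exposes the low-order bits.

```r
# Coefficients of (x - 1)^7 expanded, constant term first.
cf <- c(-1, 7, -21, 35, -35, 21, -7, 1)
x <- 1.0001

# Literal evaluation: sum the terms coefficient * x^k directly.
literal <- sum(cf * x^(0:7))

# Horner's method: nested multiplication from the highest-order term.
horner <- 0
for (k in 8:1) horner <- horner * x + cf[k]

# The exact value is (1e-4)^7 = 1e-28, far below the rounding noise of the
# individual terms (which are of size up to ~35), so both computed values are
# dominated by round-off and generally differ from each other.
exact <- (x - 1)^7
c(literal = literal, horner = horner, exact = exact)
```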
I have R on personal laptop for consultative purposes from time to time,
however, I cannot move data, confidentiality constraints, as you can imagine.
I have initiated another IT ticket with organization, I think I will get to the
bottom of this at some point.
Thank you.
WHP
Proprietary
On 2020-05-13 19:27 +, Poling, William wrote:
> if this is something I can do without IT admin access.
Hi! O.T. on laptops: Also, perhaps it is
easier to find another laptop. There seem
to be some great ThinkPads readily available
anywhere around the U.S., like an X61 or X220
or
Hi Rasmus, thank you I will see if this is something I can do without IT admin
access.
In the meantime I have reloaded rmarkdown to the local drive.
package ‘rmarkdown’ successfully unpacked and MD5 sums checked
The downloaded binary packages are in
On 2020-05-13 18:46 +, Poling, William via R-help wrote:
> Hello all.
>
> I am still struggling with this issue.
>
> It appears that new installations are going
> to local drive.
>
> # #Test 05/13/2020
> # install.packages("abjutils")
> # package ‘abjutils’ successfully unpacked and MD5
On 2020-05-13 11:44 -0700, Jeff Newmiller wrote:
> Depending on reproducibility in the least
> significant bits of floating point
> calculations is a bad practice. Just
> because you decide based on this one
> example that one implementation of BLAS is
> better than another does not mean that
Depending on reproducibility in the least significant bits of floating point
calculations is a bad practice. Just because you decide based on this one
example that one implementation of BLAS is better than another does not mean
that will be true for all specific examples. IMO you are drawing
Hello all.
I am still struggling with this issue.
It appears that new installations are going to local drive.
# #Test 05/13/2020
# install.packages("abjutils")
# package ‘abjutils’ successfully unpacked and MD5 sums checked
#
# The downloaded binary packages are in
#
On 2020-05-13 13:04 -0400, J C Nash wrote:
> On 2020-05-13 11:28 a.m., Rasmus Liland wrote:
> >
> > I get another solution on my Linux i7-7500U
> >
> > > D %*% solve(D)
> >              [,1] [,2]
> > [1,] 1.000000e+00    0
> > [2,] 8.881784e-16    1
> > > sessionInfo()
> > BLAS:
Note that my sessionInfo() gave
R version 4.0.0 (2020-04-24)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Linux Mint 19.3
Matrix products: default
BLAS: /usr/lib/x86_64-linux-gnu/blas/libblas.so.3.7.1
LAPACK: /usr/lib/x86_64-linux-gnu/lapack/liblapack.so.3.7.1
So you have an older R
Hello all. Thank you in advance for any additional suggestions.
I have with, Jim's help, found some traction in my pursuit of this problem.
"determine the largest concentration of members in the smallest radius"
However, I need guidance in efficiencies as I will explain below.
1. I have used
On 2020-05-09 11:40 -0400, J C Nash wrote:
>
> > solve(D)
>      [,1] [,2]
> [1,] -2.0  1.0
> [2,]  1.5 -0.5
> > D %*% solve(D)
>      [,1]         [,2]
> [1,]    1 1.110223e-16
> [2,]    0 1.000000e+00
> >
Dear list,
I get another solution on my Linux i7-7500U
laptop, but the same solution on
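For anyone wanting to reproduce the round-trip check being compared across machines: the inverse shown in the quote, [-2, 1; 1.5, -0.5], is the inverse of [1, 2; 3, 4], so D can be reconstructed (this is my reconstruction, not code from the thread).

```r
# Reconstruct D from the inverse shown above and measure the round-trip error.
D <- matrix(c(1, 2, 3, 4), nrow = 2, byrow = TRUE)
err <- D %*% solve(D) - diag(2)

# The residual sits at the level of double-precision epsilon (~2.2e-16);
# exactly which entry carries it can differ between BLAS builds, which is
# what the two sessions in this thread are seeing.
max(abs(err))
```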
Many moons ago (I think early 80s) I looked at some of the global optimizers,
including several based on intervals. For problems of this size, your suggestion
makes a lot of sense, though it has been so long since I looked at those
techniques
that I will avoid detailed comment.
I've not looked
Also, in the full curve referenced on Wikipedia, the parameters Q and M are
confounded - you only need one or the other, but not both. If you are using both
and trying to estimate them both, you will have problems.
I have fitted these curves quite easily using the Solver in Excel.
Bernard
Sent
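The confounding Bernard describes can be checked numerically: Q enters only through Q * exp(-B * (t - M)), which equals exp(-B * (t - M2)) with M2 = M + log(Q) / B, so Q can always be absorbed into a shifted M (parameter names follow the Wikipedia article; the values below are arbitrary).

```r
# Verify that (Q, M) and (1, M + log(Q)/B) give identical curves.
B <- 0.8; Q <- 3; M <- 2
t <- seq(0, 10, by = 0.5)
lhs <- Q * exp(-B * (t - M))
rhs <- exp(-B * (t - (M + log(Q) / B)))
all.equal(lhs, rhs)   # TRUE: the two parameterisations are indistinguishable
```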
John, have you ever looked at interval optimization as an alternative since it
can lead to provably global minima?
Bernard
Sent from my iPhone so please excuse the spelling!
> On May 13, 2020, at 8:42 AM, J C Nash wrote:
>
> The Richards' curve is analytic, so nlsr::nlxb() should work
On 2020-05-13 06:44 -0700, Jeff Newmiller wrote:
> On May 13, 2020 6:33:03 AM PDT, Manish Mukherjee wrote:
> >
> > How to extract this value from a number
> > of PDF files and put it in a data frame.
>
> they could be part of embedded bitmaps.
Dear Manish and Jeff,
I recently found the
PDF files are actually "programs" that place graphic symbols on pages, and the
order in which those symbols are placed (the order in which most pdf-to-text
conversions return characters) may have nothing to do with how they appear
visually. There is not even a guarantee that those symbols are
Hi All,
Need some help with the following code. I have a number of PDF files, and the
first page of those files gives a currency value $xxx,xxx,xxx. How can I extract
this value from a number of PDF files and put it in a data frame? I am able to
do it for a single file
with the code where
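A hedged sketch of one way to loop this over many files, assuming the PDFs are text-based (not scanned bitmaps, per Jeff's caveat) and that pdftools::pdf_text() can recover the first page; the directory name and the currency regex are my illustrative assumptions, not the poster's code.

```r
# Extract the first $x,xxx,... pattern from page 1 of each PDF in a folder.
files <- list.files("reports", pattern = "\\.pdf$", full.names = TRUE)
amounts <- vapply(files, function(f) {
  page1 <- pdftools::pdf_text(f)[1]
  m <- regmatches(page1, regexpr("\\$[0-9]{1,3}(,[0-9]{3})*", page1))
  if (length(m)) m else NA_character_   # NA when no currency value is found
}, character(1))
result <- data.frame(file = basename(files), amount = amounts)
```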
The Richards' curve is analytic, so nlsr::nlxb() should work better than nls()
for getting derivatives --
the dreaded "singular gradient" error will likely stop nls(). Also likely,
since even a 3-parameter
logistic can suffer from it (my long-standing Hobbs weed infestation problem
below), is
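As a minimal illustration of the 3-parameter logistic fit under discussion (simulated data, not the Hobbs weed data): using the self-starting SSlogis() lets nls() find its own starting values, which sidesteps the starting-value problem, though not the singular-gradient fragility of the full Richards form.

```r
# Simulate a clean sigmoid with noise and fit a 3-parameter logistic.
set.seed(1)
t <- 1:12
y <- 100 / (1 + exp(-(t - 6) / 1.5)) + rnorm(12, sd = 2)
fit <- nls(y ~ SSlogis(t, Asym, xmid, scal))
coef(fit)   # estimates near Asym = 100, xmid = 6, scal = 1.5
```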
Dear Sir,
I am so sorry that, due to certain inconveniences, I was late in trying your
suggested code and replying to your email.
Thank you very much for your wonderful solution and suggestion for my
problem. As before, your suggested code has worked wonderfully. I even
successfully imported the
Good morning Jim.
This is an awesome start, the visualization is splendid, thank you very much.
I have signed on to r-sig-geo and submitted my question there. I have no idea
of the volume of traffic on that list
However, hopefully, I will gain additional insight into how to determine max
number of
Shouldn't be hard to set up with nls(). (I kind of suspect that the Richards
curve has more flexibility than data can resolve, especially the subset
(Q,B,nu) seems highly related, but hey, it's your data...)
-pd
> On 13 May 2020, at 11:26 , Christofer Bogaso
> wrote:
>
> Hi,
>
> Is there
Hi Christofer
Try FlexParamCurve or maybe drc package.
Cheers
Petr
> -----Original Message-----
> From: R-help On Behalf Of Christofer Bogaso
> Sent: Wednesday, May 13, 2020 11:26 AM
> To: r-help
> Subject: [R] Fitting Richards' curve
>
> Hi,
>
> Is there any R package to fit Richards'
Good morning Bert. I will sign up for r-sig-geo and review your suggested link
as well, thank you very much for your response.
WHP
-----Original Message-----
From: Bert Gunter
Sent: Tuesday, May 12, 2020 8:30 PM
To: Poling, William
Cc: r-help@r-project.org
Subject: [EXTERNAL]
Hi,
Is there any R package to fit Richards' curve in the form of
https://en.wikipedia.org/wiki/Generalised_logistic_function
I found there is one package grofit, but currently defunct.
Any pointer appreciated.
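For reference, one common parameterisation of the curve on that Wikipedia page written out as an R function (parameter names follow the article: A, K, B, nu, Q, M, with the C in the denominator fixed at 1); this is just the formula, not a fitting routine.

```r
# Generalised logistic (Richards) function.
richards <- function(t, A, K, B, nu, Q, M) {
  A + (K - A) / (1 + Q * exp(-B * (t - M)))^(1 / nu)
}

# With A = 0, K = 1, nu = Q = 1 this reduces to the plain logistic,
# whose value at its midpoint t = M is 0.5.
richards(0, A = 0, K = 1, B = 1, nu = 1, Q = 1, M = 0)   # 0.5
```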
Hi Stefano,
Given only one observation point you will find it difficult. If your
automatic weather station is in the low area where the foehn wind is
felt, it can only be distinguished from a dry katabatic wind if the
upwind conditions are known. There is a similar but milder version of
this in
Well, let's hope that was my big screw up for today...
On Wed, May 13, 2020 at 4:39 PM peter dalgaard wrote:
>
> Hans? Try Heinz ;-)
>
> Actually listed as a quote _in_ Abby's, originally by Greg Snow, but w/o
> attribution...
>
> -pd
>
>
>
> > On 13 May 2020, at 02:23 , Jim Lemon wrote:
> >
>
Hans? Try Heinz ;-)
Actually listed as a quote _in_ Abby's, originally by Greg Snow, but w/o
attribution...
-pd
> On 13 May 2020, at 02:23 , Jim Lemon wrote:
>
> Sorry, it was listed in Hans' email as a reply from you. Far be it
> from me to speak for someone else.
>
> Jim
>
> On Wed,
Hi Bill,
A while ago I devised a couple of functions to accumulate millions of
geographic locations of events and then display the resulting matrix
of values on an existing plot. This may be of use to you, at least in
the visualization of the density of the locations. As your example
data only
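A hedged sketch of the binning idea Jim describes (my own toy function, not his actual code): count events into a 2-D grid, yielding a matrix of densities that can then be drawn over an existing plot.

```r
# Bin (x, y) event coordinates into an nbins x nbins count matrix.
bin2d <- function(x, y, nbins = 100) {
  xi <- cut(x, nbins, labels = FALSE)   # cut() spans range(x), so no NAs
  yi <- cut(y, nbins, labels = FALSE)
  table(factor(xi, levels = 1:nbins), factor(yi, levels = 1:nbins))
}

set.seed(42)
counts <- bin2d(rnorm(1e5), rnorm(1e5))
sum(counts)   # every event lands in exactly one cell: 1e5
```

This scales to millions of points because the work is a single pass plus a table; the resulting matrix can be rendered with image() or as colored cells over a map.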