Re: [R-sig-Geo] NASA's Black Marble monthly data: Reprojection isn't accurate

2023-09-09 Thread Michael Sumner
The extent of this one is

70, 80, 10, 20

Later versions of GDAL determine this automatically (I'm not sure when)

 rast("VNP46A3.A2018091.h25v07.001.2021125122857.h5")
class   : SpatRaster
dimensions  : 2400, 2400, 26  (nrow, ncol, nlyr)
resolution  : 0.00417, 0.00417  (x, y)
extent  : 70, 80, 10, 20  (xmin, xmax, ymin, ymax)
coord. ref. : lon/lat Unknown datum based upon the GRS 1980 Authalic Sphere
ellipsoid
sources :
VNP46A3.A2018091.h25v07.001.2021125122857.h5://AllAngle_Composite_Snow_Covered

VNP46A3.A2018091.h25v07.001.2021125122857.h5://AllAngle_Composite_Snow_Covered_Num

VNP46A3.A2018091.h25v07.001.2021125122857.h5://AllAngle_Composite_Snow_Covered_Quality
  ... and 23 more source(s)
varnames: AllAngle_Composite_Snow_Covered
  AllAngle_Composite_Snow_Covered_Num
  AllAngle_Composite_Snow_Covered_Quality
  ...

Be very careful with canned steps to "fix" georeferencing: make sure it
actually needs fixing, and check that the result is right. Ultimately the
best way to "fix" it is to go to the source of the code that interfaces
the data, which here is GDAL, and report the problem there - but clearly
GDAL has been updated since whatever version was in use in the Stack
Overflow answer.
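
For reference, the extent GDAL now reports can be cross-checked against the
tile indices in the filename (h25v07). A minimal sketch, assuming the usual
linear lat/lon tile convention (h counts 10-degree columns east from -180,
v counts 10-degree rows south from +90):

```r
# Expected extent of a 10-degree tile, assuming h is counted east from
# -180 and v south from +90 (the convention the filename h25v07 implies).
tile_extent <- function(h, v) {
  c(xmin = -180 + h * 10, xmax = -180 + (h + 1) * 10,
    ymin = 90 - (v + 1) * 10, ymax = 90 - v * 10)
}

tile_extent(25, 7)   # 70, 80, 10, 20: matches the extent shown above
```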

Cheers, Mike

On Sun, Sep 10, 2023 at 1:17 AM Nikolaos Tziokas 
wrote:

> I downloaded NASA's Black Marble monthly nighttime light NTL data, VNP46A3
> <
> https://ladsweb.modaps.eosdis.nasa.gov/missions-and-measurements/products/VNP46A3/
> >.
> In a previous question
> <
> https://gis.stackexchange.com/questions/466571/extent-not-found-on-nasas-black-marble-monthly-images-how-to-set-it/466574?noredirect=1#comment761916_466574
> >
> of mine the reprojection worked perfectly but now it seems that it doesn't.
> For example, I wanted to download NTL data for the city of Mumbai, India.
> After reprojecting the NTL (product 5 (All_Angles_Snow_Free) from the .h5)
> the result is attached here
>  >.
>
> At the bottom of the image is a shapefile of Mumbai (downloaded from GADM),
> and the red circle at the top indicates where Mumbai is in the NTL image.
> Clearly something is not right.
>
> I downloaded the image from here
> <
> https://ladsweb.modaps.eosdis.nasa.gov/missions-and-measurements/products/VNP46A3/
> >
> (LAADS-DAAC, Level-1 and Atmosphere Archive & Distribution System
> Distributed Active Archive Center). The code I used to extract the NTL
> radiance image is:
>
> library(terra)
>
> wd <- "path/"
>
> r <- rast(paste0(wd, "VNP46A3.A2018091.h25v07.001.2021125122857.h5"))
> crs(r) <- "epsg:4326"
>
> 2400*(15/(60*60))
>
> h = 25
> v = 7
>
> ext(r) = c(-180+h*10,-180+(h+1)*10, (v-2)*10,(v-1)*10)
>
> ntl <- r[[5]]
>
> writeRaster(ntl, paste0(wd, "ntl.tif"), overwrite = TRUE)
>
> Why did the code work perfectly in the previous question but not now?
> From here
> 
> you can download the .h5 image if you don't want to use NASA's website. I
> am using R 4.3.1 and RStudio 2023.06.2+561.
>
> --
> Tziokas Nikolaos
> Cartographer
>
> Tel:(+44)07561120302
> LinkedIn 
>
> [[alternative HTML version deleted]]
>
> ___
> R-sig-Geo mailing list
> R-sig-Geo@r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>


-- 
Michael Sumner
Software and Database Engineer
Australian Antarctic Division
Hobart, Australia
e-mail: mdsum...@gmail.com



[R-sig-Geo] NASA's Black Marble monthly data: Reprojection isn't accurate

2023-09-09 Thread Nikolaos Tziokas
I downloaded NASA's Black Marble monthly nighttime light NTL data, VNP46A3
.
In a previous question

of mine the reprojection worked perfectly but now it seems that it doesn't.
For example, I wanted to download NTL data for the city of Mumbai, India.
After reprojecting the NTL (product 5 (All_Angles_Snow_Free) from the .h5)
the result is attached here
.

At the bottom of the image is a shapefile of Mumbai (downloaded from GADM),
and the red circle at the top indicates where Mumbai is in the NTL image.
Clearly something is not right.

I downloaded the image from here

(LAADS-DAAC, Level-1 and Atmosphere Archive & Distribution System
Distributed Active Archive Center). The code I used to extract the NTL
radiance image is:

library(terra)

wd <- "path/"

r <- rast(paste0(wd, "VNP46A3.A2018091.h25v07.001.2021125122857.h5"))
crs(r) <- "epsg:4326"

2400*(15/(60*60))

h = 25
v = 7

ext(r) = c(-180+h*10,-180+(h+1)*10, (v-2)*10,(v-1)*10)

ntl <- r[[5]]

writeRaster(ntl, paste0(wd, "ntl.tif"), overwrite = TRUE)

Why did the code work perfectly in the previous question but not now?
From here

you can download the .h5 image if you don't want to use NASA's website. I
am using R 4.3.1 and RStudio 2023.06.2+561.
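
[Editorial note: a possible culprit, not confirmed by the poster, is the
y-extent formula. `(v-2)*10` and `(v-1)*10` give 50 and 60 for v = 7,
whereas counting v southward from +90 gives 10 and 20, the extent reported
in the reply above. The two formulas coincide only at v = 5, which could
explain why an earlier tile appeared to work. A quick check of the
arithmetic:]

```r
h <- 25
v <- 7

y_post  <- c((v - 2) * 10, (v - 1) * 10)      # formula in the post: 50 60
y_south <- c(90 - (v + 1) * 10, 90 - v * 10)  # v counted south from +90: 10 20
```

If this diagnosis is right, replacing the extent line with
`ext(r) = c(-180 + h*10, -180 + (h+1)*10, 90 - (v+1)*10, 90 - v*10)`
should place the tile correctly.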

-- 
Tziokas Nikolaos
Cartographer

Tel:(+44)07561120302
LinkedIn 



Re: [R-sig-Geo] Calculating median age for a group of US census blocks?

2023-09-09 Thread Kevin Zembower via R-sig-Geo
Dr. Snow, thanks so much for your response to my question.

I think I'm going to stick with the lower- and upper-bounds method I
described, even though it gives a wider range for the median age than
other methods. I read the vignette for 'survival' as well as the
chapters on survival from MASS and another book I have, and couldn't
make heads or tails of it, much less how to apply it to this question.
In the unlikely event of someone asking me to explain or defend my
conclusions on median age for my neighborhood population, I would be
lost about survival statistics, but could manage, with numerous hand-
waves, to explain my method. I'm an old, retired guy who thinks
statistics are fun, not someone with any kind of professional training
or credentials.

Thank you, again, for your thoughtful and thorough response. I
appreciate your help.

-Kevin
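
[Editorial note: the lower/upper-bounds method described above can be
sketched with made-up bin counts; the bins and counts here are purely
illustrative, not the actual P12 data.]

```r
# Illustrative binned age data: each row is (lower, upper, count).
# Assigning everyone in a bin to its lower bound gives a lower bound
# on the median age; assigning to the upper bound gives an upper bound.
bins <- data.frame(lower = c(0, 20, 30, 40, 65),
                   upper = c(19, 29, 39, 64, 84),
                   count = c(100, 80, 120, 150, 87))

median_lo <- median(rep(bins$lower, bins$count))
median_hi <- median(rep(bins$upper, bins$count))
c(median_lo, median_hi)   # the true median lies between these two values
```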

On Tue, 2023-09-05 at 11:31 -0600, Greg Snow wrote:
> Kevin,
> 
> Your idea of substituting the minimum and maximum values of the
> ranges
> will work for computing bounds on the median age, and for the median
> age you should not need to drop the 85+ group (unless more than 50%
> of
> people are in that group).  The mean is another issue.
> 
> Another approach that may give you a smaller interval and more
> statistically justified range would be to turn to survival analysis
> techniques and treat the values from the table as interval censored
> data.  If the data appears to come from a known distribution then you
> can use parametric survival techniques to fit the distribution (see
> the `survreg` function in the `survival` package).  Or, there are
> packages that fit non-parametric models to interval censored data
> (`Icens` and `interval` for example) that can then be used to
> estimate
> a confidence interval on the median age (and possibly the mean age,
> but with limitations).  For the 85+ group you can treat them as right
> censored, or interval censored from 85 to infinity, or interval
> censored from 85 to some value like 100 or 120 (there is a small
> chance that someone in the table is over 100, but it is rare; I think
> the current oldest reported living person is in the hundred and teens,
> so 120 would be safe).
> 
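
[Editorial note: the interval-censoring suggestion above might look like the
following sketch, using simulated ages rather than real census counts; the
bin width, sample size, and right-censoring cutoff are all made up for
illustration.]

```r
library(survival)

# Simulate ages, then collapse them to 10-year bins to mimic a binned table.
set.seed(1)
age   <- pmin(pmax(rnorm(500, mean = 42, sd = 18), 0), 110)
lower <- floor(age / 10) * 10
upper <- lower + 10
upper[lower >= 80] <- NA  # treat the top bin as right-censored, as suggested

# Parametric fit to interval-censored data; for a gaussian model the
# intercept estimates the mean (and median) age.
fit <- survreg(Surv(lower, upper, type = "interval2") ~ 1, dist = "gaussian")
coef(fit)
```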
> On Thu, Aug 31, 2023 at 1:48 PM Kevin Zembower via R-sig-Geo
>  wrote:
> > 
> > Sorry to resurrect a long-dead thread, but I'm still struggling
> > with my
> > desire to assign a median age to the population in a group of US
> > census
> > blocks. I'm using the data from the US Census table P12, which bins
> > the
> > ages into ranges.
> > 
> > I'm convinced (thank you!) that I can't compute the exact median
> > age.
> > Can I compute the lower and upper bounds of the median age? Can I
> > assign all the people in a binned age range (say "20 to 29 years")
> > to
> > the lower limit of the range, then compute the median of those
> > ages,
> > and say that the true median age is between this lower limit and
> > the
> > upper one, computed similarly?
> > 
> > If this is valid, how do I deal with the "85 years and older" bin?
> > I
> > have 9 people 85 years and older, out of a total population of 537
> > people in my group of census blocks. For the lower bounds of the
> > median, I assign all 9 the age of 85. What can I do for the upper
> > bounds?
> > 
> > I've done this, and found that the true median age is between 40
> > and 44
> > years old, if I drop all the "85 years and older" population as NA.
> > The
> > true mean is between 39.96 and 43.46, similarly.
> > 
> > One thought: If there are 9 people in the "85 years and older"
> > group,
> > should I drop them and also drop the 9 youngest ages?
> > 
> > I look forward to reading your thoughts. Thank you for any advice
> > and
> > guidance.
> > 
> > -Kevin
> > 
> > On Tue, 2023-08-08 at 12:00 +0200, r-sig-geo-requ...@r-project.org
> > wrote:
> > > 
> > > Message: 2
> > > Date: Mon, 7 Aug 2023 18:33:41 +
> > > From: Kevin Zembower 
> > > To: "r-sig-geo@r-project.org" 
> > > Subject: [R-sig-Geo] Calculating median age for a group of US
> > > census
> > >     blocks?
> > > 
> > > Hello, all,
> > > 
> > > I'd like to obtain the median age for a population in a specific
> > > group
> > > of US Decennial census blocks. Here's an example of the problem:
> > > 
> > > ## Example of calculating median age of population in census
> > > blocks.
> > > library(tidyverse)
> > > library(tidycensus)
> > > 
> > > counts <- get_decennial(
> > >  geography = "block",
> > >  state = "MD",
> > >  county = "Baltimore city",
> > >  table = "P1",
> > >  year = 2020,
> > >  sumfile = "dhc") %>%
> > >  mutate(NAME = NULL) %>%
> > >  filter(substr(GEOID, 6, 11) == "271101" &
> > >     substr(GEOID, 12, 15) %in% c(3000, 3001, 3002)
> > >     )
> > > 
> > > ages <- get_decennial(
> > >