Nelly Reduan | 27 May 19:05 2016

Building a prediction raster when the statistical model was built from sampling units of different sizes


I would like to build a predictive map of capture success from a GAM. To build the GAM, I used data on capture
success (the dependent variable in the model) and proportions of land cover types (the independent variables in
the model) that were estimated within 50 trapping sites (thus, sample size = 50 observations). The size of the
trapping sites ranges from 24 km² to 236 km² (mean = 133 km²). I tested whether capture success
varied between the trapping sites, and it did not. Is it
statistically correct to create a prediction raster map with cells of equal size given that the trapping
sites are of different sizes? If it is correct, should I use raster cells of 5 km x 5 km based on the minimum
size of the trapping sites, 16 km x 16 km based on the maximum size, or 12 km x 12 km based on the mean size? If it is not
correct, what should I do? Instead of calculating proportions of land cover types within trapping sites,
should I calculate the proportions within a buffer around the centroid of each trapping site?
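As a rough sketch of what the equal-cell prediction could look like with the raster package (all object names here are hypothetical assumptions, not from the original post: a fitted mgcv GAM `gam_fit` with predictors `prop_forest` and `prop_crop`, and a fine-resolution categorical land-cover raster `landcov` with class 1 = forest, class 2 = crop):

```r
library(raster)
library(mgcv)

# Cells per ~12 km target resolution (near the mean site size)
fact <- round(12000 / res(landcov)[1])

# Proportion of each land-cover class within every coarse (~12 km) cell
prop_forest <- aggregate(landcov == 1, fact = fact, fun = mean)
prop_crop   <- aggregate(landcov == 2, fact = fact, fun = mean)

preds <- stack(prop_forest, prop_crop)
names(preds) <- c("prop_forest", "prop_crop")

# Equal-resolution prediction raster of capture success
pred_map <- predict(preds, gam_fit, type = "response")
```

The idea is that whatever cell size is chosen, the predictors are recomputed as proportions at that same scale, so the model sees inputs comparable to those it was fitted on.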

Thank you very much for your time.

Have a nice day.



R-sig-Geo mailing list
R-sig-Geo <at>
Dr Didier G. Leibovici | 27 May 13:51 2016

R on Mac building up memory usage


I guess this may not be specific to r-sig-geo, but as I am using

in this script, that may be the reason (perhaps I should try running
something else to check).

So basically I am running some code reading 61288 features and other
things. If I run it once, gc() gives:
 > gc()
           used (Mb) gc trigger  (Mb) max used  (Mb)
Ncells 1833926 98.0    5103933 272.6  9968622 532.4
Vcells 2437534 18.6    7056348  53.9 11036325  84.3

and the system monitor says R is using 3.3 GB.

Then I remove everything with rm(list = ls()) and run it again, trying different
sets of parameters, for example.
The second run shows similar gc() output, but R is using 6.4 GB:
 > gc()
           used (Mb) gc trigger  (Mb) max used  (Mb)
Ncells 1834325 98.0    6323353 337.8  9968622 532.4
Vcells 2439267 18.7    7572947  57.8 11832730  90.3

After a while and a few other computations, R is using 10 GB,
and gc() gives
 > gc()
           used (Mb) gc trigger  (Mb)  max used  (Mb)
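A minimal sketch of one way to keep per-run memory contained: do each parameter set inside a function call, so its local objects become collectable when the call returns, and call gc() between runs. (The `run_once` body here is a stand-in for the real spatial computation; note also that the memory a process shows in the system monitor can stay high even after gc(), since the allocator does not always return freed pages to the OS.)

```r
run_once <- function(p) {
  x <- rnorm(1e6) * p      # stand-in for the heavy spatial computation
  mean(x)                  # return only the small result, not the big objects
}

results <- lapply(c(0.5, 1, 2), function(p) {
  out <- run_once(p)
  gc(verbose = FALSE)      # reclaim memory before the next parameter set
  out
})
```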

Miluji Sb | 26 May 23:31 2016

Match Coordinates to NUTS 2 ID

Dear all,

I have downloaded the NUTS 2 level data from

# Download Administrative Level data from EuroStat
temp <- tempfile(fileext = ".zip")

# Read data
EU_NUTS <- readOGR(dsn = "./NUTS_2010_60M_SH/data", layer =

# Subset NUTS 2 level data
map_nuts2 <- subset(EU_NUTS, STAT_LEVL_ == 2)

I also have data for a variable by coordinates, which looks like this:

structure(list(LON = c(-125.25, -124.75, -124.25, -124.25, -124.25,
-124.25), LAT = c(49.75, 49.25, 42.75, 43.25, 48.75, 49.25),
    yr = c(2.91457704560515, 9.94774197180345, -2.71956412885765,
    -0.466213169185147, -36.6645659563374, 10.5168056769535)), .Names =
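A sketch of the point-in-polygon step with sp::over, assuming `map_nuts2` from the readOGR()/subset() steps above (the Eurostat shapefile is in lat/lon, so the coordinates should only need a matching CRS declaration; otherwise spTransform() first) and a data frame shaped like the structure() output:

```r
library(sp)

dat <- data.frame(LON = c(-125.25, -124.75, -124.25),
                  LAT = c(49.75, 49.25, 42.75),
                  yr  = c(2.91, 9.95, -2.72))

pts <- SpatialPoints(dat[, c("LON", "LAT")],
                     proj4string = CRS(proj4string(map_nuts2)))

# over() returns, for each point, the attribute row of the polygon it
# falls in (NA when the point lies outside every NUTS 2 region)
dat$NUTS_ID <- over(pts, map_nuts2)$NUTS_ID
```

(The example coordinates above are in North America and would all come back NA against a European NUTS layer; they are only there to mirror the posted data.)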

Gwennaël Bataille | 25 May 15:53 2016

Overlay between polygons and their intersecting lines

Dear all,

I can't find a solution for the following problem:

When I first intersect a line with 2 polygons (splitting it into 2
segments) and then use an overlay to get, for each segment, the attribute
of the overlapping polygon, I sometimes get two answers (i.e. a small
point of the segment overlapping one polygon, and the rest of the segment overlapping another).

The functions I use for this are:


and sp::over

Do you have any idea how I could get the attribute of the polygon the 
segments are "mostly overlapping"?

Below is a reproducible example.

Thank you very much in advance for any hint on this.

Best regards,


Reproducible example:

matrix1 <- cbind(x= c(250300, 250451, 250494, 250300),

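One possible way to pick the "mostly overlapping" polygon (a sketch, not a tested answer): measure the length of each segment's intersection with every polygon and keep the polygon holding the greatest shared length. This assumes rgeos plus SpatialLines `segs` and SpatialPolygons `polys` built as in the reproducible example:

```r
library(sp)
library(rgeos)

mostly_over <- function(seg, polys) {
  shared <- sapply(seq_along(polys), function(i) {
    inter <- gIntersection(seg, polys[i])
    if (is.null(inter)) 0 else gLength(inter)
  })
  which.max(shared)   # index of the polygon holding most of the segment
}
```

Tiny slivers then contribute near-zero length and lose to the polygon containing the bulk of the segment.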

Mark R Payne | 25 May 11:58 2016

Raster: Can't read raster-created NCDF back in


I have a rasterBrick object that I have created through a series of
manipulations and written to disk using writeRaster(x,format="CDF"). In
another, independent script, I then need to read that netcdf file back in.
The following commands work:

> b <- brick("")
> plot(b)

but these don't:

> crop(b,extent(320,340,55,60))
Error in ncvar_get_inner(ncid2use, varid2use, nc$var[[li]]$missval,
addOffset,  :
  Error: variable has 2 dims, but start has 3 entries.  They must match!
> readAll(b)
Error in ncvar_get_inner(ncid2use, varid2use, nc$var[[li]]$missval,
addOffset,  :
  Error: variable has 2 dims, but start has 3 entries.  They must match!

which is just strange. Does anyone have any ideas what might be going on?

The file is available here (it's 60 kB)

The netcdf header and my session info follow.

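A possible workaround (an assumption on my part, not a confirmed fix): read the variable directly with ncdf4 and rebuild the raster, bypassing raster's dimension handling. Here `capture.nc`, the variable name `zvalue`, and the extent values are all placeholders:

```r
library(ncdf4)
library(raster)

nc   <- nc_open("capture.nc")
vals <- ncvar_get(nc, "zvalue")     # a lon x lat matrix for a 2-D variable
nc_close(nc)

# Orientation may need adjusting: netCDF typically stores rows bottom-up,
# while raster expects top-down
r2 <- raster(t(vals[, ncol(vals):1]),
             xmn = 320, xmx = 340, ymn = 55, ymx = 60)
```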

Hakim Abdi | 25 May 11:43 2016

Re: MODIS package's runGdal() returns error: dataFormat='GTiff', format not supported

Thanks for all your help Chris, the original error has been corrected, and
now there is another error when trying to run runGdal(). I removed the
previous version of MODIS and installed version 0.10-34 (develop branch) as
per Florian's suggestion. Everything runs fine until it gets to accessing
the server and downloading the file. The same happens to my colleague who
is running a 64-bit Ubuntu 14.04 machine with R 3.3.0. The issue seems to
stem from the fact that in the getStruc.R file FtpDayDirs[1]==FALSE. In
fact FtpDayDirs[1] is NULL, which in turn is due to the fact that
getStruc.R failed to connect to (port 80: Timed out).
Is anyone else having this issue? I tried accessing on my browser and it does seem to be down.
Running getStruc.R and selecting LAADS instead of LPDAAC seems to work, but
I don't know how to incorporate this into the runGdal() command.

Hakim Abdi

R version 3.2.5 (2016-04-14) -- "Very, Very Secure Dishes"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-w64-mingw32/x64 (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or

Eduardo Diez | 25 May 05:15 2016

Bearing angle of UTM projected SpatialLines

Dear everyone,
I'm used to calculating the compass angles (clockwise from due North) of
line features projected in UTM using the tool Linear Directional Mean
from ArcGIS.
I could find functions performing a similar task, but they take origin ->
destination points and work only in lat/lon (geographic coordinates), namely
geosphere::bearing and maptools::gzAzimuth. (Somehow they give different
results for the same set of points, by around 0.05 degrees, maybe because of
the trigonometry.)
Because of the nature of my work it has to be in UTM.
Perhaps someone skilled in trigonometry can figure it out, but I'm not.
Is there a way to do this in R?
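In projected coordinates the trigonometry is simple: a sketch of the compass bearing (degrees clockwise from north) of a segment, given its start and end points in UTM metres. Note atan2 takes the easting difference first, which is what converts the mathematical angle into a compass angle:

```r
utm_bearing <- function(x1, y1, x2, y2) {
  (atan2(x2 - x1, y2 - y1) * 180 / pi) %% 360
}

utm_bearing(0, 0, 0, 1)    # due north -> 0
utm_bearing(0, 0, 1, 0)    # due east  -> 90
utm_bearing(0, 0, -1, 0)   # due west  -> 270
```

For a SpatialLines feature, the same formula can be applied to the first and last rows of coordinates(line)[[1]][[1]]. It will not match geosphere::bearing exactly, since the UTM grid north differs slightly from true north away from the central meridian.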


ASANTOS via R-sig-Geo | 25 May 03:17 2016

Calculate each polygon's percentage inside circles

Dear members,

        I am trying to calculate the percentage of each polygon inside circles
of an arbitrary radius in a shapefile object with the code below. My output
needs to look like this (the first two columns are the circle-centre
coordinates, followed by each polygon's percentage):

"pts$x"   "pts$y" "ID1" "ID2" "ID3" "ID4"
180486.1  330358.8  16.3   0.2  NA   17.3
179884.4  331420.8  88.3   NA   96.3 NA
179799.6  333062.5  25.3   22.3 0.5  NA

      For this I tried:


# Create 4 polygons (I create polygons because it is simpler than attaching
# a shapefile to a post)

sr1=Polygons(list(Polygon(cbind(c(180114, 180553, 181127, 181477, 
181294, 181007, 180409,
   180162, 180114), c(332349, 332057, 332342, 333250, 333558, 333676,
   332618, 332413, 332349)))),'1')
sr2=Polygons(list(Polygon(cbind(c(180042, 180545, 180553, 180314, 
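A sketch of the area-overlap step with rgeos, assuming the polygons sr1..sr4 are combined as `srs <- SpatialPolygons(list(sr1, sr2, sr3, sr4))`, the circle centres sit in `pts` (columns x, y) and the radius is `r`, as in the code above. The percentage is taken here as the share of each polygon's area falling inside each circle:

```r
library(sp)
library(rgeos)

centres <- SpatialPoints(pts[, c("x", "y")])
circles <- gBuffer(centres, byid = TRUE, width = r)

# Rows = circles, columns = polygons; NA where there is no overlap
pct <- sapply(seq_along(srs), function(j) {
  sapply(seq_along(circles), function(i) {
    inter <- gIntersection(circles[i], srs[j])
    if (is.null(inter)) NA else 100 * gArea(inter) / gArea(srs[j])
  })
})
```

The result can then be cbind-ed to the centre coordinates to reproduce the table shown above.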

yasmine mohamed | 23 May 15:23 2016

TSA error using packet 1 any (sp) is not TRUE

Hello Guys,

I am trying to do Trend Surface Analysis (TSA) via OLS, with rainfall as a function of longitude and latitude. The
TSA needs a SpatialGridDataFrame, which is grd. The code ran fine, but the graph has the following message
written inside:

Error using packet 1: any(sp) is not TRUE

Here is my Code:

rain.tsm <- lm(rain~I(LON^2)+ I(LAT^2)+I(LON*LAT), data=yr757.eda)



gridded(grd) = TRUE # Make it a grid

grd$tsm.pred <- predict(rain.tsm, grd)

pts <- list('sp.points', yr757.eda, pch=1, cex=0.7, col='black')

spplot(grd, zcol='tsm.pred', first= FALSE, scales=list(draw=T), main="Rainfall Estimates (Trend
Surface Model)", sp.layout=pts)



Run focal function on a multi-layer raster

I have a multi-layer raster, in this case a RasterBrick, on which I would like to apply a focal function to
each layer and get another RasterBrick as a result.

I am trying to use calc, but apparently I am not assembling my function in the proper way:

r <- raster(ncol=50, nrow=50)
b <- brick(r,r,r,r,r,r)
b <- b * 1:6

myfocalfun <- function(x) {
  b.f <- focal(x, w = matrix(1, 5, 5), mean)
}

b1 <- calc(b, myfocalfun)

Error in .calcTest(x[1:5], fun, na.rm, forcefun, forceapply) : 
cannot use this function

What am I missing here?
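A workaround sketch: focal() only accepts a single RasterLayer, which is presumably why calc() rejects the function. Applying it layer by layer and restacking gives the multi-layer result (values are filled in here so the arithmetic and smoothing have something to work on):

```r
library(raster)

r <- raster(ncol = 50, nrow = 50)
values(r) <- runif(ncell(r))         # give the layers values to smooth
b <- brick(r, r, r, r, r, r) * 1:6

b1 <- stack(lapply(1:nlayers(b), function(i) {
  focal(b[[i]], w = matrix(1, 5, 5), fun = mean)
}))
b1 <- brick(b1)                      # back to a RasterBrick if needed
```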
 -- Thiago V. dos Santos

PhD student
Land and Atmospheric Science
University of Minnesota
Daniela Nava | 20 May 14:53 2016

spatial data with binomial distribution

Hello,

I'm trying to simulate a grid with a spatially dependent binomial distribution. For the normal
distribution, the geoR package has the command grf(). Does anyone have an idea how I can do this?

Thanks a lot, Daniela.
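One possible approach (my assumption, not an established geoR recipe): simulate a Gaussian random field with grf(), pass it through the logistic function to obtain spatially correlated success probabilities, then draw binomial values on the grid:

```r
library(geoR)

set.seed(1)
sim <- grf(400, grid = "reg", cov.pars = c(1, 0.25))  # 20 x 20 regular grid
p   <- plogis(sim$data)        # spatially dependent probabilities in (0, 1)
y   <- rbinom(length(p), size = 10, prob = p)         # binomial grid values
```

The spatial dependence of the latent field carries over to the probabilities, so nearby cells tend to have similar binomial outcomes.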


Daniela Trentin Nava
MSc in Statistics from UFPE
Professor at UTFPR - Campus Toledo
Alternative email: dnava <at>

Happiness is a decision. You are as happy as you decide to be. 
