Helen Sofaer | 29 Aug 06:52 2014

raster to dataframe with xy=TRUE, na.rm=TRUE

Hi all,

I'm trying to convert a RasterBrick to a data frame while adding the
coordinates and dropping cells that were masked to NA. This
combination of options gives me an error when the mask is done with an sp object.

Some reproducible code:

usa = getData('GADM', country = 'USA', level = 0)
r1 = raster()
values(r1) = 1:ncell(r1)
r1.b = brick(r1, r1, r1, r1)
r1.b.mask = mask(r1.b, usa)

r1.b.df = as.data.frame(r1.b.mask, xy = TRUE, na.rm = TRUE)

The error is:

Error in data.frame(..., check.names = FALSE) :
  arguments imply differing number of rows: 64800, 1109
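
One workaround (a sketch, not a fix for as.data.frame() itself): rasterToPoints() returns the cell coordinates together with the layer values and drops the masked (NA) cells, so it produces the same kind of table in one step.

```r
# Workaround sketch, assuming the reproducible example above has been run:
# rasterToPoints() returns a matrix with x, y and one column per layer,
# omitting cells that are NA.
library(raster)

pts <- rasterToPoints(r1.b.mask)
r1.b.df <- as.data.frame(pts)
head(r1.b.df)
```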

Looks like it wants to combine all the coordinates with just the subset of non-NA cells.
(Continue reading)

Dan Rosauer | 29 Aug 03:03 2014

function gbm.perspec() in package dismo


Can anyone explain use of the 'smooth' parameter in the function dismo::gbm.perspec()  ?

It sounds like just what I need.  The help says:

smooth             controls smoothing of the predicted surface

and the default value is "none".

But what values can be used, other than "none"?
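
When the help page is this terse, printing the function source usually reveals the accepted values: any `if (smooth == "...")` branch inside gbm.perspec() shows what the argument is compared against. A generic sketch:

```r
# Inspect the source of dismo::gbm.perspec and pull out every line
# that mentions the `smooth` argument.
library(dismo)
grep("smooth", deparse(body(gbm.perspec)), value = TRUE)
```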


Dan Rosauer
Postdoctoral Researcher
Moritz Lab
Ecology, Evolution & Genetics
Research School of Biology
Gould Building, Daley Road
Australian National University
Canberra ACT 0200

+61 413 950 275 (mobile)
+61 2 6125 1028 (office)
dan.rosauer <at> anu.edu.au

(Continue reading)

MacQueen, Don | 29 Aug 01:57 2014

Using spTransform() to reproduce another software package's transformation

The program I work for has specified the use of a local coordinate reference system and a method for
transforming and projecting from WGS84 long/lat to the local system. They use ESRI products to convert
from long/lat to the local system.

Since I do everything in R, naturally I wish to use spTransform() to replicate their conversion. I’ve
been using spTransform() for a number of years now, and thought I understood what I’ve been doing.

But I have run into trouble. I would appreciate any advice.

I believe I have a reproducible example. Toward the end of this email are R expressions (based on dput) that
will create two SpatialPoints objects that are used in the example. They need to be created first, before
running the example.

## before adding further detail and the example, here are some references

(1) http://downloads2.esri.com/support/TechArticles/Geographic_Transformations_10.1.zip
(2) http://resources.arcgis.com/en/help/main/10.2/index.html#/Equation_based_methods/003r00000012000000/
(3) http://resources.arcgis.com/en/help/main/10.2/index.html#/Grid_based_methods/003r00000013000000/

The program's specified CRS is epsg 26743 = California State Plane Zone 3 NAD27 US feet (out of my control!)

The specified method for transforming and projecting from WGS84 long/lat to the local CRS consists of two steps:
 1) transform and project to epsg 2227 = California State Plane Zone 3 NAD83 US feet
 2) transform to epsg 26743 = California State Plane Zone 3 NAD27 US feet

When doing the steps in the ESRI software's projection tool:
 step 1) use what ESRI calls "NAD_1983_To_WGS_1984_5"  (wkid 1515 in reference 1)
 step 2) use what ESRI calls "NAD_1927_To_NAD_1983_NADCON"  (wkid 1241 in reference 1)
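
In sp/rgdal terms the two specified steps might be sketched as below (the input point here is hypothetical). Note that spTransform() delegates the datum shifts to PROJ.4, so whether this reproduces ESRI's wkid 1515 and 1241 transformations depends on the datum parameters and NADCON grid files your PROJ.4 installation uses — which is the most likely source of any discrepancy.

```r
library(sp)
library(rgdal)

# hypothetical input point in WGS84 long/lat
pt.wgs84 <- SpatialPoints(cbind(-121.7, 37.7),
                          proj4string = CRS("+proj=longlat +datum=WGS84"))

# step 1: project to EPSG 2227 (CA State Plane Zone 3, NAD83, US feet)
pt.2227 <- spTransform(pt.wgs84, CRS("+init=epsg:2227"))

# step 2: transform to EPSG 26743 (CA State Plane Zone 3, NAD27, US feet)
pt.26743 <- spTransform(pt.2227, CRS("+init=epsg:26743"))

coordinates(pt.26743)
```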
(Continue reading)

alannie | 29 Aug 01:54 2014

ENFA: Contrasting Scores between species

Hi All!

I am interested in contrasting the marginality and specificity scores from ENFA
for a group of closely related species that live in the same general region.
They differ from each other in breeding system.

I have the following questions:
1) If marginality for the species is above 1, how would you interpret the

I realize that in the Hirzel et al. 2002 paper it states "A large value
(close to one) means that the species lives in a very particular habitat
relative to the reference set."  

However, I am not sure if a value of 1.2 is less marginal than a value of 4.5.

In this example, 4.5 is large, but it is further from one than 1.2 is.

Would any value closest to 1 be the least marginalized?

2) Whatever the result, what would be the best way to analyze a set of data
contrasting marginality scores of different groups? I have been using a z
test, but perhaps there is a better way you can enlighten me to!

I am looking at species that evolved from one another, by the way.
Something like: green species that evolved from yellow species and seeing if
the marginality is different. 
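
A sketch of what such a group comparison could look like if each species contributes one marginality score per breeding-system group (all values below are invented for illustration; a rank-based test is shown alongside the parametric one, since these samples are typically small):

```r
# hypothetical marginality scores for two groups of related species
marg.green  <- c(1.2, 1.5, 2.1, 1.8)
marg.yellow <- c(3.9, 4.5, 3.2, 4.1)

# Welch two-sample t-test on the scores
t.test(marg.green, marg.yellow)

# rank-based alternative for small, possibly non-normal samples
wilcox.test(marg.green, marg.yellow)
```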

(Continue reading)

Felinto COSTA | 28 Aug 20:56 2014

Spatiotemporal analysis of real estate market data

Hello all.

I graduated in 1985 (in engineering) and this year I'm
about to finish my graduate studies (geostatistics).

My field of work is related to the real estate market, and I think spatio-temporal statistics would be very
interesting for analyzing the evolution of land prices at different points of my city over time.

Specifically, I'm focusing on suitable models that simultaneously consider the spatial distribution of
land price and its variability over the last 15 years, in order to use them to predict unsampled points (in
time and space) and to plot isovalue surfaces for different time periods (as snapshots).

My sample consists of more than 3,500 points dispersed in time and space, collected over 15 years of real
estate transactions, with UTMX and UTMY (coordinates), V (land value per square meter), and D (date in
MM/DD/YYYY, from 1999 to 2014).

There are some R packages that seem adequate for this purpose, but I'm a little confused about establishing
the basics of a workflow for the analysis.
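
A minimal sketch of the usual gstat/spacetime workflow (the object and column names here are assumptions, not from your data): build an STIDF from the points, dates, and values; compute and fit a spatio-temporal variogram; then predict at unsampled space-time locations with krigeST().

```r
library(sp)
library(spacetime)
library(gstat)

# assume a data.frame `sales` with columns UTMX, UTMY, V, and D (dates)
# sales$D <- as.Date(sales$D, format = "%m/%d/%Y")
pts  <- SpatialPoints(sales[, c("UTMX", "UTMY")])
stdf <- STIDF(pts, sales$D, data = sales["V"])

# empirical spatio-temporal variogram, then fit a model (e.g. "metric";
# the vgm() and stAni starting values below are placeholders)
vst    <- variogramST(V ~ 1, data = stdf)
vgm.st <- vgmST("metric", joint = vgm(1, "Exp", 5000, 0), stAni = 200)
fit    <- fit.StVariogram(vst, vgm.st)

# predict at unsampled space-time locations (an STF prediction grid,
# not constructed here) with krigeST:
# pred <- krigeST(V ~ 1, data = stdf, newdata = st.grid, modelList = fit)
```

The snapshot maps you describe then fall out of predicting on the same spatial grid at several time points.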

Appreciate any help.

Com os melhores cumprimentos

INCORP Londrina/PR
Felinto COSTA, engº
incorpld <at> onda.com.br
Tomislav Hengl | 28 Aug 17:10 2014

Comparison of prediction performance (mapping accuracy) - how to test if a method B is significantly more accurate than method A?

Dear list,

I'm trying to standardize a procedure to compare performance of 
competing spatial prediction methods. I know that this has been 
discussed in various literature and on various mailing lists, but I 
would be interested in any opinion I could get.

I am comparing (see below) 2 spatial prediction methods 
(regression-kriging and inverse distance interpolation) using 5-fold 
cross-validation and then testing if the difference between the two is 
significant. What I concluded is that there are two possible tests for 
the final residuals:
1. an F-test to compare the variances (of the cross-validation residuals),
2. a t-test to compare the mean values.

Both tests might be important, nevertheless the F-test ("var.test") 
seems to be more interesting to really be able to answer "is the method 
B significantly more accurate than method A?". It appears that the 
second test ("t.test") is only important if it fails -> which would mean 
that one of the methods systematically over or under-estimates the mean 
value (which should be 0). Did I maybe miss some important test?
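
With invented residual vectors, the two tests look like this; and since both methods are validated at the same points, a paired t-test on absolute (or squared) errors is a third option that directly asks whether one method's errors are smaller:

```r
set.seed(1)
# hypothetical cross-validation residuals for two methods at the same points
res.RK  <- rnorm(100, mean = 0, sd = 1.0)   # regression-kriging
res.IDW <- rnorm(100, mean = 0, sd = 1.5)   # inverse distance interpolation

# 1. F-test: do the residual variances differ?
var.test(res.RK, res.IDW)

# 2. t-test: is a method biased (mean residual != 0)?
t.test(res.RK)

# paired comparison of error magnitudes at the same validation points
t.test(abs(res.RK), abs(res.IDW), paired = TRUE)
```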

Thank you!

R> library(GSIF)
R> library(gstat)
R> library(sp)
R> set.seed(2419)
R> demo(meuse, echo=FALSE)
(Continue reading)

Justin Michell | 28 Aug 14:44 2014

Verify units of distance between coordinates

Dear geo R group

I have a data frame like this:

df <- data.frame(Lon = c(29.6000, 29.7333, 30.3887, 30.6667, 30.6833, 30.8667),
                 Lat =
                 LonWater = c(29.63333, 29.63333, 30.25000, 30.65000, 30.35444, 30.83278),
                 LatWater = c(-4.31667, -4.31667, -4.76667, -1.35000, -2.46667, -3.57000),
                 DstClW = c(0.5842815, 0.3004491, 0.3870362, 0.2837918, 0.4340793, 0.1897561))

At these locations (Lon, Lat pairs) I calculated the shortest distance to a water source (DstClW) and where
that source is (LonWater, LatWater).

I want to now determine what units DstClW is in, and also verify that these distances make sense and were
calculated correctly. 

Any suggestions as to how this might be done?
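
One sketch for checking (the Lat value below is a placeholder, since the Lat column is cut off above): recompute one row's distance both as plain Euclidean distance on the long/lat values (units: decimal degrees) and as a great-circle distance with spDists(..., longlat = TRUE) (units: km), then see which one matches DstClW.

```r
library(sp)

p1 <- c(29.60000, -4.31667)   # Lon, Lat of the site (Lat assumed here)
w1 <- c(29.63333, -4.31667)   # LonWater, LatWater

# Euclidean distance on unprojected coordinates: decimal degrees
sqrt(sum((p1 - w1)^2))

# great-circle distance: kilometres
spDists(rbind(p1), rbind(w1), longlat = TRUE)
```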

Justin Michell
Sancta Vega | 27 Aug 19:06 2014

Error with RGEOS

Just for dissolving, I use the rgeos package with gUnaryUnion, but I get
this error: "No UnaryUnion in this version of GEOS". I removed the package and
installed it again, but I get the same error. Is it linked to my version of R
or something else? This is my code:

     summary(utah)
     regional <- gUnaryUnion(utah, utah$region)

Following your suggestion, I am asking my question with more details:

For my R version:
R version 3.0.2 (2013-09-25) -- "Frisbee Sailing"
Copyright (C) 2013 The R Foundation for Statistical Computing
Platform: i686-pc-linux-gnu (32-bit)

From library(rgeos) I get this warning:
rgeos version: 0.3-6, (SVN revision 450)
 GEOS runtime version: 3.2.2-CAPI-1.6.2
 Polygon checking: TRUE

Thank you
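
UnaryUnion was added in GEOS 3.3.0, and the rgeos startup message above shows your runtime is 3.2.2, which is why the function is unavailable. A sketch of a version check plus a fallback that works on older GEOS (assuming `utah` is the SpatialPolygonsDataFrame from the code above):

```r
library(rgeos)

version_GEOS0()   # gUnaryUnion() needs a runtime >= 3.3.0

if (version_GEOS0() >= "3.3.0") {   # string comparison; fine for these versions
  regional <- gUnaryUnion(utah, id = utah$region)
} else {
  # cascaded union is available on older GEOS and also dissolves by id
  regional <- gUnionCascaded(utah, id = utah$region)
}
```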

Sancta Vega | 27 Aug 17:30 2014

RGEOS error

Just for dissolving, I use the rgeos package with "gUnaryUnion",
but I get this error: "No UnaryUnion in this version of GEOS".
I removed the package and installed it again, but I get the same error.
Is it linked to my version of R or something else?
Thank you in advance.

Aseem Sharma | 27 Aug 01:48 2014

Clip smaller domain from large domain netCDF file

I have a huge (~30 GB) .nc file (NC_FORMAT_NETCDF4_CLASSIC) for the
whole country (141.00 to 52.00 W, 41.00 to 84.00 N).
I am trying to clip this big dataset to a small region-specific domain
(120.00 to 130.00 W, 50.00 to 60.00 N).
I have been trying to do this with the ncdf4 R package but could not figure
out how.
Please suggest how I should proceed.
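
One sketch that avoids loading the whole 30 GB into memory: the raster package (with ncdf4 installed) reads netCDF lazily, so crop() with an extent only pulls the requested window. If the file stores west longitudes as negative, the domain is -130 to -120 (the filename and varname below are assumptions):

```r
library(raster)   # uses ncdf4 under the hood to read .nc files lazily

b <- brick("big_country_file.nc", varname = "tas")  # hypothetical names

# clip to 120-130 W, 50-60 N (negative longitudes if the file uses -180..180)
e <- extent(-130, -120, 50, 60)
b.small <- crop(b, e)

# write the clipped subset back out as netCDF
writeRaster(b.small, "clipped.nc", format = "CDF")
```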

Thank you,

Srinivas V | 26 Aug 11:55 2014

Subsetting Raster Time Series


I would like to subset the CRU dataset to a particular time period
(1980-2013). Is there an option to do this within the raster package? I can
manually specify the layers to drop, but I would like to drop them based
on a time period. I'm doing this to ensure two datasets cover the same
time period.

I would appreciate any advice on dealing with this issue. Thanks!
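
If the layer dates are stored in the z slot (as the `temp@z` below suggests), a sketch of date-based subsetting: recover the dates with getZ(), find the layer indices inside 1980-2013, and subset with `[[ ]]` (assuming the z values are, or can be coerced to, Date objects):

```r
library(raster)

z   <- getZ(temp)                        # layer dates, e.g. class Date
idx <- which(z >= as.Date("1980-01-01") &
             z <= as.Date("2013-12-31"))

temp.sub <- temp[[idx]]                  # keep only the 1980-2013 layers
temp.sub <- setZ(temp.sub, z[idx])       # carry the dates along
```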


temp@z

> temp
class       : RasterBrick
dimensions  : 360, 720, 259200, 1356  (nrow, ncol, ncell, nlayers)
resolution  : 0.5, 0.5  (x, y)
extent      : -180, 180, -90, 90  (xmin, xmax, ymin, ymax)
coord. ref. : +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
data source : /media/data/data_cru/tmp/cru_ts3.22.1901.2013.tmp.dat.nc
names       : X1901.01.16, X1901.02.15, X1901.03.16, X1901.04.16, 
X1901.05.16, X1901.06.16, X1901.07.16, X1901.08.16, X1901.09.16, 
(Continue reading)