Elaine McVey | 29 May 18:20 2015

Missing projection file?

I'm new to the list and getting started with using R for geospatial data.
I'm having a problem I can't solve in which the projection file seems to be
missing.  This is the error I get:

proj4string(d) <- CRS("+init=epsg:28992")

Error in CRS("+init=epsg:28992") : no system list, errno: 2

It's confusing because when I load the package, it seems to successfully
autodetect the PROJ.4 files:

> library(rgdal)
rgdal: version: 0.9-2, (SVN revision 526)
Geospatial Data Abstraction Library extensions to R successfully loaded
Loaded GDAL runtime: GDAL 1.11.2, released 2015/02/10
Path to GDAL shared files:
Loaded PROJ.4 runtime: Rel. 4.9.1, 04 March 2015, [PJ_VERSION: 491]
Path to PROJ.4 shared files: (autodetected)

But when I do this, it returns zero:

.Call("PROJcopyEPSG", tempfile(), PACKAGE = "rgdal")

The only answer I've found from googling this is that epsg has to be
lowercase, which it is in my code.

I'm on a Mac running Yosemite, with R 3.2.0, and working inside RStudio
(version 0.99.441).  Other packages and versions from my session info:

rgeos_0.3-8          ggmap_2.4            ggplot2_1.0.1        rgdal_0.9-2
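A minimal diagnostic sketch (assumptions: the bundled-proj directory and the PROJ_LIB path below are illustrative and vary by install) that may help narrow down where CRS() is looking for the epsg table:

```r
# Check whether rgdal can find the PROJ.4 support files (the "epsg" table
# in particular). The directory below is illustrative; on a Mac it depends
# on how PROJ.4 was installed (Homebrew, CRAN binary, source build, etc.).
library(rgdal)
proj_dir <- system.file("proj", package = "rgdal")  # "" if none is bundled
file.exists(file.path(proj_dir, "epsg"))
# If the system PROJ.4 is missing its data files, pointing PROJ_LIB at a
# directory that contains "epsg" *before* loading rgdal may fix CRS():
# Sys.setenv(PROJ_LIB = "/usr/local/share/proj")  # assumed location
```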
(Continue reading)

stevenhuyi | 29 May 13:00 2015

to forecast a disease risk using a dynamic model

Hi everyone,

I've recently been struggling to find a model to forecast schistosomiasis
risk. Let me introduce the data first. Parasitological data, including the
number of infected individuals and the population at risk, are available for
one province from 1997 to 2010 at the county level (i.e., polygon data by
county), along with some risk factors (e.g., temperature, rainfall, and
wetness). I want to forecast the disease risk, that is, the risk in 2011,
2012, and so on.

What comes to mind first is a time series model (i.e., ARIMA) or Kalman
filtering, but schistosomiasis often occurs in clusters, so spatial
correlation should be considered. Besides, as the parasitological data are
count data, it would be better to fit them with a Poisson or negative
binomial distribution. Unfortunately, neither time series models nor Kalman
filtering can handle these issues. I tried the INLA package in R the other day.
It can take into account the issues I'm concerned about, but it seems that
it can only smooth the historical data; it cannot forecast.
I have some references, but no example is related to forecast. 
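For what it's worth, INLA can produce forecasts by leaving future responses as NA: the posterior predictive for those rows is the forecast. A minimal sketch, assuming illustrative column names (county, year, cases, pop, temp, rain) and a pre-built neighbourhood graph file:

```r
library(INLA)
# d: one row per county-year; rows for 2011 onward have cases = NA.
# INLA returns the posterior predictive for the NA rows, i.e. the forecast.
form <- cases ~ temp + rain +
  f(county, model = "besag", graph = "county.graph") +  # spatial CAR term
  f(year, model = "rw1")                                # temporal trend
fit <- inla(form, family = "poisson", data = d, E = pop,
            control.predictor = list(compute = TRUE, link = 1))
fit$summary.fitted.values[is.na(d$cases), ]  # forecasts with credible intervals
```

This is a spatiotemporal hierarchical model of exactly the data/state/parameter form described below, so it may be closer to the target than it first appears.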

A popular idea for fitting spatiotemporal data is a hierarchical model,
composed of a data-level model, a state-level model that produces the data,
and finally a parameter level that enters the former two levels. Many
statisticians, like Noel Cressie, advocate this approach. I think this would
be my target. However, there seems to be no R package that implements such
hierarchical models for polygon data; I know some can deal with point data,
like spTimer (but it still cannot deal with count

What I said above might not be correct and welcome your comments and, most
(Continue reading)

Barry Rowlingson | 29 May 12:13 2015

Javascript All The Things!

[Apologies for the memey subject: http://knowyourmeme.com/memes/x-all-the-y]

Developers can't fail to have noticed the rise of Javascript, both in
the browser and outside of it. But it has now crept into R, and into
R-spatial packages.

Javascript, like most languages, has its right to exist and its place.
I'm just not sure that using it to call functionality that exists
perfectly well elsewhere in R is the place.

 For example, there is a Javascript library called "turf.js" that does
GIS operations like polygon overlay, buffering, point-in-polygon etc.
That's great, because now my web mapping interface can do those in the
browser. The user can draw a polygon, the browser can select the data
and do something with it without a round-trip and a load on the
server. Brilliant. That same Javascript can also run perfectly well
outside a browser via a JS interpreter such as node or V8, so it's
possible to use turf.js to write JavaScript scripts that do GIS
operations. Again, brilliant. If you are a JS programmer developing
systems in JS that need this, you've now got it.

 Now there is a package (V8) to run Javascript in R. Great, if
there's some JS functionality you want to call. But why should you use
it to access functionality you've got in a perfectly good R package
already? The turf.js code has been wrapped in the "lawn" package, so
you can call turf's GIS functions from R. Some people on twitter seem
to think this is novel, and suddenly you can now do "GIS in R!". They
seem to have not noticed we've been doing it for the past umpteen
years with things like gpclib, and lately rgeos.

(Continue reading)

Gregovich, Dave P (DFG | 29 May 02:37 2015

Setting raster x- and y-resolutions equal

I have a stack of about 40 rasters with about 1 x 10^7 cells each. I had not realized it, but it turns out the
x-resolution and y-resolution differ at about the 13th decimal place. I would like to feed this stack
into a function (dynatopmodel::upslope.area) that requires xres(x) == yres(x).
Because the difference between the x- and y-resolutions is so minute, I am fine with simply setting them
equal to each other; that would save me the time of resampling all of the bands in the stack.
However, I have not been able to make this work, as shown below using a
single band that mimics a band from the stack I am working with:


#Example 1, my situation, which is not working....
rast <- raster(nrows=4072, ncols=3111, xmn=656419.81022503704298, xmx=730403.88166906696279,
rast[] <- rnorm(1:ncell(rast))
proj4string(rast) <- '+init=epsg:26931'
#xres and yres differ!!!!....

#and then two examples that work, Example 2, with a raster of the same size but resetting the resolution to
res(rast) <- c(5,5)....
rast <- raster(nrows=4072, ncols=3111, xmn=656419.81022503704298, xmx=730403.88166906696279,
rast[] <- rnorm(1:ncell(rast))
proj4string(rast) <- '+init=epsg:26931'
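A possible shortcut (a sketch only, assuming the ~1e-13 difference really is numerical noise): rewrite the extent so both resolutions equal their mean. Because the grid dimensions do not change, the cell values are kept and no resampling is needed.

```r
library(raster)
# toy raster whose x- and y-resolution differ only at machine precision
r <- raster(nrows = 100, ncols = 100,
            xmn = 0, xmx = 100 + 1e-11, ymn = 0, ymx = 100)
r[] <- rnorm(ncell(r))
xres(r) == yres(r)                     # FALSE: resolutions differ minutely

target <- mean(res(r))                 # common resolution to impose
extent(r) <- extent(xmin(r), xmin(r) + ncol(r) * target,
                    ymin(r), ymin(r) + nrow(r) * target)
xres(r) == yres(r)                     # now TRUE; values are untouched
```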
(Continue reading)

Chris English | 28 May 12:28 2015

STIDF - endTime, STI -> Track


I am wondering about the role of endTime in STIDF objects. I am examining eye-tracking data (previously
cleaned of blinks) in relation to presented stimuli that are, for some subjects, an optical illusion
and for others not. I want to examine where subjects were looking and when.

My process is to make an STIDF from the eye-tracking data for each case, and an STSDF recording which
stimulus was presented, where, and for how long, then convert the STIDF to a Track and do some
'over' analysis.

If I build my endTime for the STIDF using the delta() function on N samples, I think I get something
like N-1 endTimes, or else every sample is an endTime, so N = N.

If instead I think of endTimes as intervals during which there is a cross hair and some tangential
stimulus on the screen, with endTime being when a subject responds in some manner, I can't build an
STIDF, due to the following test:

> eye_5v1_stidf <- STIDF(eye_5v1_sp, eye_5v1_time, eye_5v1_data,
+ eye_5v1_endTime)
Error: nrow(object@time) == length(object@endTime) is not TRUE
> nrow(eye_5v1_time)
[1] 4724
> length(eye_5v1_endTime)
[1] 63

Indeed, it is not true. But what information do I have in endTime other than my sensor sampling rate adjusted
for blinks?  What I hoped to
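One way to satisfy the constructor (a sketch using the object names above, and assuming each sample should inherit the end of the response interval it falls in, and that eye_5v1_time is a zoo/xts index) is to expand the 63 interval ends to one per sample:

```r
library(spacetime)
library(zoo)
# Map each of the 4724 sample times onto one of the 63 response times and
# use that response time as the sample's endTime.
idx <- findInterval(as.numeric(index(eye_5v1_time)),
                    as.numeric(eye_5v1_endTime)) + 1
idx <- pmin(idx, length(eye_5v1_endTime))   # clamp samples past the last response
endTime_full <- eye_5v1_endTime[idx]
length(endTime_full)                        # 4724, matching nrow(eye_5v1_time)
eye_5v1_stidf <- STIDF(eye_5v1_sp, eye_5v1_time, eye_5v1_data, endTime_full)
```

This keeps the response-interval information while meeting the one-endTime-per-observation requirement.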
(Continue reading)

Antonio Bubbico | 27 May 12:55 2015

distance between municipalities and provincial capital


I have data from a census in which the observations are municipalities. I
would like to calculate the distance from each municipality to its own
provincial capital.

Do you have any suggestions?
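A minimal sketch, assuming municipalities and capitals are SpatialPointsDataFrame objects in WGS 84 that both carry a province identifier (all object names are illustrative):

```r
library(sp)
# line up each municipality with its own provincial capital
cap_idx <- match(munis$province, capitals$province)

# great-circle distance (km) from each municipality to its capital
munis$dist_km <- sapply(seq_len(nrow(munis)), function(i)
  spDistsN1(coordinates(munis)[i, , drop = FALSE],
            coordinates(capitals)[cap_idx[i], ],
            longlat = TRUE))
```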


*Dr Antonio Bubbico*
Postdoctoral Researcher

School of Agriculture and Food Science
University College Dublin
Belfield, Dublin 4
Ireland

Tel: +353899528966
e-mail: antonio.bubbico <at> gmail.com
Room: 1.27

MacQueen, Don | 27 May 00:25 2015

Hole attribute in Polygon lost when made into Polygons object?

I must be missing something basic
(either in understanding, or possibly being blind to a typo!)

Here is the example data:

m1 <-structure(c(2.8, 8, 8.2, 3.9, 1.6,
    9.2, 6.8, 3, 1.1, 4.2),
   .Dim = c(5L, 2L))

Pnh <- Polygon(m1, hole=FALSE)
Ph  <-  Polygon(m1, hole=TRUE)

Pnhs <- Polygons(list(Pnh), ID=1)
Phs <- Polygons(list(Ph), ID=1)

> Pnh@hole
> Ph@hole
[1] TRUE

> Pnhs@Polygons[[1]]@hole
> Phs@Polygons[[1]]@hole

It appears that the hole attribute has been lost??

R 3.1.2
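A hedged diagnostic along these lines may show what is happening: in sp, hole status is closely tied to ring direction, so it is worth checking whether the constructors recomputed it from the vertex order (the ringDir slot):

```r
library(sp)
m1 <- structure(c(2.8, 8, 8.2, 3.9, 1.6,
                  9.2, 6.8, 3, 1.1, 4.2),
                .Dim = c(5L, 2L))
p <- Polygon(m1, hole = TRUE)
p@ringDir                                   # 1 = clockwise, -1 = counter-clockwise
Polygons(list(p), ID = 1)@Polygons[[1]]@hole

# Reversing the vertex order flips the ring orientation, and with it the
# hole flag that the constructors may derive from it:
Polygon(m1[nrow(m1):1, ], hole = TRUE)@ringDir
```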
(Continue reading)

John Wasige | 26 May 19:41 2015

residuals from linear regression of two rasterstack - time series

Hi everybody! I need your help: how can I calculate the residuals from a linear
regression of two raster stacks (time series)? Thanks for your help.
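A minimal sketch, assuming two stacks y_stack and x_stack (illustrative names) with one layer per time step, using raster::calc to fit the per-cell regression and return the residual series as a new multi-layer object:

```r
library(raster)
n <- nlayers(y_stack)                 # assumed equal to nlayers(x_stack)

resid_fun <- function(v) {
  y <- v[1:n]                         # response time series for this cell
  x <- v[(n + 1):(2 * n)]             # predictor time series for this cell
  if (anyNA(x) || anyNA(y)) return(rep(NA_real_, n))
  unname(residuals(lm(y ~ x)))        # one residual per time step
}

res_stack <- calc(stack(y_stack, x_stack), resid_fun)
```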


John Wasige

stevenhuyi | 26 May 10:59 2015

a problem with creating a neighbours list

Hi everyone,
I have a simple problem, which I know will have a simple solution, but I
just can't tackle it.

I have a shapefile with regions, including an attribute containing a
regional id. I used the following R script to create a neighbours list:

anhui <- readShapePoly("endemic.shp")
zzz<- poly2nb(anhui,row.names=anhui$COUNTY_ID)
nb2INLA("anhui.graph", zzz)
Anhui.adj <- paste(getwd(),"/Anhui.graph",sep="")

However, the resulting graph file is just as follows:
1 5 3 4 5 6 7
2 2 3 7
3 3 1 2 7
4 2 1 5

The id numbers are not the original id numbers in my shapefile (which are
actually 1289, 1290, ...). Is there a way to get a neighbours list that
uses the original id numbers? Thanks for your help.
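One hedged workaround, assuming the zzz object from the code above: nb2INLA always writes 1-based sequential indices, but poly2nb stores the supplied row.names in the neighbour list's region.id attribute, so a lookup table can translate graph indices back to the original COUNTY_IDs:

```r
library(spdep)
# graph index i in the INLA graph file corresponds to region.id[i]
ids <- attr(zzz, "region.id")
id_map <- data.frame(graph_id  = seq_along(ids),
                     county_id = ids)
head(id_map)   # e.g. graph_id 1 paired with county_id 1289, and so on
```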


View this message in context: http://r-sig-geo.2731867.n2.nabble.com/a-problem-with-creating-a-neighbours-list-tp7588325.html
Sent from the R-sig-geo mailing list archive at Nabble.com.
(Continue reading)

sadaoui | 24 May 22:37 2015

What is the metric projection equivalent to WGS 84


What is a good projection to use in order to calculate the areas
(in square metres) of my polygons, which are located in different parts of
the world?
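There is no single metric projection that is accurate everywhere, so two common options (a sketch, assuming a SpatialPolygons object polys in WGS 84) are geodesic areas computed directly on the ellipsoid, or a global equal-area projection:

```r
# Option 1: geodesic area on the WGS 84 ellipsoid, no projection needed
library(geosphere)
areaPolygon(polys)                     # square metres

# Option 2: project to an equal-area CRS (cylindrical equal area here)
library(sp)
library(rgdal)
library(rgeos)
polys_ea <- spTransform(polys, CRS("+proj=cea +lon_0=0 +datum=WGS84 +units=m"))
gArea(polys_ea, byid = TRUE)           # square metres, one value per polygon
```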

best regards

View this message in context: http://r-sig-geo.2731867.n2.nabble.com/What-is-the-metric-projection-equivalent-to-WGS-84-tp7588321.html
Robert | 24 May 14:00 2015

Regional count data age standardisation

Hi there,

I've been looking for an R script/tutorial that will enable me to iterate
through a number of areas, each with different age-group populations and
case counts, in order to derive a single summary rate for comparing areas.

I would be very grateful if someone could please point me to some resources
and examples.
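One possible starting point (a sketch, assuming the epitools package and a long data frame d with columns area, cases, and pop by age group, plus a shared standard population std_pop; all names are illustrative):

```r
library(epitools)
# direct age standardisation: one age-adjusted summary rate per area
rates <- sapply(split(d, d$area), function(a)
  ageadjust.direct(count = a$cases, pop = a$pop,
                   stdpop = std_pop)["adj.rate"])
sort(rates)   # comparable age-adjusted rates across areas
```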



View this message in context: http://r-sig-geo.2731867.n2.nabble.com/Regional-count-data-age-standardisation-tp7588318.html