Course description
With continuing advances in geographic information science and geospatial
technologies, spatially referenced information has become increasingly available
over the past decades and is now an important source of information in
scientific research and decision-making processes. To take full advantage
of this rich collection of spatial (and temporal) data, statistical analysis is
often necessary, e.g., to extract implicit knowledge such as spatial relations and
patterns that are not explicit in the data. Spatial data analysis distinguishes
itself from classical data analysis in that it focuses on the locations,
areas, distances, relationships, and interactions of measurements that are typically
referenced as points, lines, and areal units in geographic space. Over the
past decades, a wealth of theory, methods, and tools for spatial analysis has
been developed from different perspectives, converging into the fruitful fields of
geographic information science (GIScience) and spatial statistics.
The purpose of this class is to present commonly used methods and current
trends in spatial and spatiotemporal data analysis, along with innovative applications
in relevant fields (e.g., environmental science and engineering, natural resources
management, ecology, public health, climate sciences, civil engineering, and the social
sciences). In this class, we will review the basic principles of spatiotemporal
analysis and modeling, and discuss commonly used methods and tools. Students
are expected to actively participate in lectures, complete lab assignments,
read assigned articles, and develop a project of their own choice or one directly related
to their thesis/dissertation topic. The following topics will be covered in the
class, but may be adjusted to meet students' interests:
• Exploratory spatial data analysis
• Space-time geostatistics
• Spatial point process and species distribution modeling
• Spatiotemporal disease mapping
• Time series map analysis and change detection
Prerequisites
Prerequisites for this course include an understanding of basic concepts in spatial
analysis and statistics, which can be fulfilled with introductory statistics courses or
a graduate-level GIS course. Students from different disciplines are welcome;
please contact the instructor should there be any questions about the prerequisites.

Learning outcomes
After completing this course, students are expected to be able to:
• formulate real-world problems in the context of spatial and spatiotemporal
analysis with a knowledge of basic concepts and principles in this field;
• understand commonly used concepts and methods in statistical analysis of
spatiotemporal data;
• apply appropriate spatial and spatiotemporal analytical methods to solve
the formulated problems, and critically review alternative methods;
• utilize programmable scientific computing tools (e.g., R) to make maps,
solve spatial and spatiotemporal analysis problems, and evaluate the
results of alternative methods.
Readings
• A reading list of articles will be provided. The following books will be
frequently referred to:
– Bivand, R. S., Pebesma, E. J., & Gómez-Rubio, V. (2008). Applied Spatial
Data Analysis with R. Springer (eBook available through the TTU library).
– Cressie, N., & Wikle, C. K. (2011). Statistics for Spatio-Temporal Data.
John Wiley & Sons.
Sample course outline

| Day | Sample topics                        | Readings      | Hours |
|-----|--------------------------------------|---------------|-------|
| 1   | Class overview and introduction      | Handouts      | 3     |
| 2   | Point pattern analysis               | Ch. 7 BPG     | 3     |
| 3   | Species distribution modeling        | Handouts      | 3     |
| 4   | Space-time geostatistics             | Ch. 8 BPG     | 3     |
| 5   | Spatiotemporal regression            | Ch. 9, 10 BPG | 3     |
| 6   | Time series map analysis             | Handouts      | 3     |
| 7   | Discussion and student presentations |               | 5     |

("BPG" refers to Bivand, Pebesma, and Gómez-Rubio (2008), listed under Readings.)

Background of Instructor
Dr. Guofeng Cao is an Assistant Professor in the Department of Geosciences
at Texas Tech University. His research interests include geographic information
science and systems (GIS), cyberGIS, and spatiotemporal statistics, with a
primary focus on statistical learning of complex spatial and spatiotemporal
patterns across different domains. His research has been supported by various
funding agencies. He has published 45 peer-reviewed papers, including 30 journal
articles. He received a B.S. in Earth Science from Zhejiang University, an M.S. in
GIS from the Chinese Academy of Sciences, and an M.A. in Statistics and a Ph.D. in
Geography from the University of California, Santa Barbara. He also had several
years of industry experience before returning to academia.

https://github.com/surfcao/summer2018-cug

---
title: "Day 1: Use R as GIS"
output: github_document
---

```{r global_options, results='asis', warning=FALSE}
knitr::opts_chunk$set(fig.width=12, fig.height=8, fig.path='Figs/', warning=FALSE, message=FALSE)
```

```{r load, echo=F, eval=T}
rm(list=ls())
x <- c("sp", "rgdal", "rgeos", "maptools", "classInt", "RColorBrewer", "GISTools", "maps", "raster", "ggmap")
#install.packages(x) # warning: this may take a number of minutes
lapply(x, library, character.only = TRUE) #load the required packages
```

# Spatial Objects

| | Without attributes | With attributes |
| ----- | ------------------ | -------------- |
|Points | SpatialPoints | SpatialPointsDataFrame|
|Lines | SpatialLines | SpatialLinesDataFrame|
|Polygons | SpatialPolygons | SpatialPolygonsDataFrame|
|Raster | SpatialGrid | SpatialGridDataFrame|
|Raster | SpatialPixels | SpatialPixelsDataFrame|
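
As a quick sketch of how these classes fit together (using made-up coordinates and an assumed `price` attribute, not data from this course), a `SpatialPointsDataFrame` can be built directly from a coordinate matrix and a data frame:

```{r sp_sketch, echo=T, eval=F}
# minimal sketch: construct sp objects by hand from made-up coordinates
xy <- cbind(lon = c(-101.90, -101.85), lat = c(33.60, 33.55))
pts <- SpatialPoints(xy, proj4string = CRS("+init=epsg:4326"))       # geometry only
class(pts)
ptsDF <- SpatialPointsDataFrame(pts, data = data.frame(price = c(120000, 95000)))
class(ptsDF)                                                          # geometry + attributes
```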

```{r load_library1, echo=T, eval=T}
LubbockBlock<-readShapePoly("Data/LubbockBlockNew.shp") #read polygon shapefile
class(LubbockBlock)
HouseLocation<-read.csv("Data/HouseLatLon.csv") #read GPS data
class(HouseLocation)
coordinates(HouseLocation)<-c('Lon', 'Lat') # promote the data frame to a SpatialPointsDataFrame
class(HouseLocation)
cropland<-raster("Data/Lubbock_CDL_2013_USDA.tif")
class(cropland)

tmin <- getData("worldclim", var = "tmin", res = 10) # downloads the WorldClim monthly minimum-temperature layers (a RasterStack)
class(tmin)
```

`readOGR()` from the **rgdal** package reads the same shapefile through GDAL/OGR and imports its coordinate reference system when a .prj file is present:

```{r load_library2, echo=T, eval=T}
LubbockBlock<-readOGR("./Data", "LubbockBlockNew") #read polygon shapefile via GDAL/OGR
class(LubbockBlock)
proj4string(LubbockBlock) # the imported CRS (NA if no .prj file is present)
```

# Mapping with R

## Basic Mapping

```{r mapping, echo=T, eval=T}
LubbockBlock<-readShapePoly("Data/LubbockBlockNew.shp") #read polygon shapefile
plot(LubbockBlock,axes=TRUE, col=alpha("gray70", 0.6)) #plot Lubbock block shapefile
#add title, scalebar, north arrow, and legend
HouseLocation<-read.csv("Data/HouseLatLon.csv") #read GPS data
price<-HouseLocation$TotalPrice
nclr<-5 # number of color classes
priceclr<-brewer.pal(nclr, "Reds")
brks<-classIntervals(price, nclr, style="quantile") # quantile class breaks
colcode<-findColours(brks, priceclr) # map each house price to a color

points(HouseLocation$Lon, HouseLocation$Lat, pch=19, col=colcode, cex=0.5) # add houses on top of the Lubbock block shapefile
title(main="Houses on Sale in Lubbock, 2014")

legend(-101.95, 33.65, legend=names(attr(colcode, "table")), fill=attr(colcode, "palette"), cex=0.5, bty="n")
#map.scale(x=-101.85, y=33.49,0.001,"Miles",4,0.5,sfcol='red')
north.arrow(xb=-101.95, yb=33.65, len=0.005, lab="N")

#plot raster
plot(cropland)
#plot raster stack
tmin <- getData("worldclim", var = "tmin", res = 10) # this will download
plot(tmin)
```
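
As an alternative to assembling the choropleth by hand, the `spplot()` method from **sp** produces a classed map in a single call. A minimal sketch (assuming `LubbockBlock` carries the numeric `Pop2010` attribute used later in this document):

```{r mapping_spplot, echo=T, eval=F}
# minimal sketch: lattice-based choropleth of block population
spplot(LubbockBlock, "Pop2010",
       col.regions = brewer.pal(7, "Reds"), cuts = 6,
       main = "Population by Census Block, Lubbock (2010)")
```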

## Mapping with static Google Maps

```{r mapping2, echo=F, eval=F}
library(RgoogleMaps)
lubbock=geocode('lubbock')

newmap <- GetMap(center = c(lubbock$lat, lubbock$lon), zoom = 12, destfile = "newmap.png", maptype = "roadmap")

PlotOnStaticMap(newmap, lat=HouseLocation$Lat, lon=HouseLocation$Lon, col='red')
lubbockPoly<-SpatialPolygons(LubbockBlock@polygons, proj4string=CRS("+init=epsg:4326")) # note: lowercase 'epsg' in the +init string
PlotPolysOnStaticMap(newmap, lubbockPoly, col=alpha('blue', 0.2))
```
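
The **ggmap** package (loaded at the top of this document) offers a similar static-map workflow; a minimal sketch, noting that recent versions of ggmap may require registering a Google Maps API key first:

```{r mapping2b, echo=T, eval=F}
# minimal sketch: ggmap basemap with the house locations overlaid
basemap <- get_map(location = "Lubbock, TX", zoom = 12, maptype = "roadmap")
houses <- read.csv("Data/HouseLatLon.csv")
ggmap(basemap) +
  geom_point(data = houses, aes(x = Lon, y = Lat), color = "red", size = 0.8)
```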

## Mapping with dynamic Google Maps

```{r mapping3, echo=F, eval=F}
library(plotGoogleMaps)

data(meuse)
coordinates(meuse)=~x+y
proj4string(meuse) = CRS('+init=epsg:28992')
plotGoogleMaps(meuse, filename='meuse.html')

HouseLocation<-read.csv("Data/HouseLatLon.csv") #read GPS data
coordinates(HouseLocation)<-c('Lon', 'Lat')
proj4string(HouseLocation)=CRS('+init=epsg:4326')
plotGoogleMaps(HouseLocation, filename='house.html')

ic = iconlabels(meuse$zinc, height=12)
plotGoogleMaps(meuse, iconMarker=ic, mapTypeId='ROADMAP', filename='meuse2.html')

#plot raster
data(meuse.grid)
coordinates(meuse.grid)<-c('x', 'y')
meuse.grid<-as(meuse.grid, 'SpatialPixelsDataFrame')
proj4string(meuse.grid) <- CRS('+init=epsg:28992')
mapMeuseCl<- plotGoogleMaps(meuse.grid,zcol= 'dist',at=seq(0,0.9,0.1),colPalette= brewer.pal(9,"Reds"), filename='meuse3.html')

#plot polygons
proj4string(LubbockBlock)=CRS("+init=epsg:4326")
m<-plotGoogleMaps(LubbockBlock, zcol="Pop2010", filename='MyMap6.htm', mapTypeId='TERRAIN', colPalette=brewer.pal(7,"Reds"), strokeColor="white")

#plot lines: contours derived from the meuse.grid raster
im<-as.image.SpatialGridDataFrame(meuse.grid['dist'])
cl<-ContourLines2SLDF(contourLines(im))
proj4string(cl) <- CRS('+init=epsg:28992')
mapMeuseCl<- plotGoogleMaps(cl, zcol='level', strokeWeight=1:9, filename='myMap6.htm', mapTypeId='ROADMAP')

```

## Changing map projections

```{r projection, eval=T }

#project a vector

boundary<-readShapePoly('Data/boundary')
proj4string(boundary) <- CRS("+proj=utm +zone=17 +datum=WGS84 +units=m +no_defs +ellps=WGS84 +towgs84=0,0,0")
proj4string(boundary)
boundaryProj<-spTransform(boundary, CRS("+init=epsg:3857"))
proj4string(boundaryProj)

#project a raster
proj4string(cropland)
plot(cropland)
aea <- CRS("+proj=aea +lat_1=29.5 +lat_2=45.5 +lat_0=37.5 +lon_0=-96 +x_0=0 +y_0=0 +datum=NAD83 +units=m +no_defs") # Albers equal-area conic (ESRI:102003)
projCropland=projectRaster(cropland, crs=aea)
plot(projCropland)
```
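
A related, common step is reprojecting vector data so it lines up with a raster before any overlay. A minimal sketch, assuming the house CSV and the `cropland` raster used earlier are available:

```{r projection2, eval=F}
# minimal sketch: give the house points an explicit WGS84 CRS, then
# transform them into the same CRS as the cropland raster
house <- read.csv("Data/HouseLatLon.csv")
coordinates(house) <- c("Lon", "Lat")
proj4string(house) <- CRS("+init=epsg:4326")
houseProj <- spTransform(house, CRS(proj4string(cropland)))
proj4string(houseProj)
```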

# Spatial analysis with R

```{r load_library4, echo=F, eval=T}

#subsetting a spatial dataframe
LubbockBlock<-readOGR("./Data", "LubbockBlockNew") #read polygon shapefile

selection = LubbockBlock[LubbockBlock$Pop2010>500,]
plot(selection)

#select by clicking
selected = click(LubbockBlock)

extent = drawExtent()

extent=as(extent,'SpatialPolygons')
proj4string(extent)=proj4string(selection)

# perform erase (remove the drawn extent from the selection)
plot(erase(selection, extent))

poly = drawPoly()
proj4string(poly) = proj4string(LubbockBlock)

# perform clip
cropselection = crop(LubbockBlock,poly)
plot(cropselection)

```
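
Beyond selection and clipping, geometry operations from **rgeos** (loaded earlier) can be chained in the same way. A minimal sketch using the `selection` object from the chunk above; the buffer width is an arbitrary value in the layer's map units (degrees here):

```{r geom_ops, echo=T, eval=F}
# minimal sketch: centroids of the selected blocks, then a buffer around them
cents <- gCentroid(selection, byid = TRUE)
buf <- gBuffer(cents, byid = TRUE, width = 0.01) # expect a warning on unprojected data
plot(selection)
plot(buf, border = "blue", add = TRUE)
```
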
## Vector analysis (overlay)

```{r vector, echo=T, eval=T }
# point-in-polygon overlay: which bear sightings fall inside a national park?

# Datasets
# * CSV table of (fictionalized) brown bear sightings in Alaska, each
# containing an arbitrary ID and spatial location specified as a
# lat-lon coordinate pair.
# * Polygon shapefile containing the boundaries of US National Parks
# greater than 100,000 acres in size.

bears <- read.csv("Data/bear-sightings.csv")
coordinates(bears) <- c("longitude", "latitude")

# read in National Parks polygons
parks <- readOGR("Data", "10m_us_parks_area")

# tell R that bear coordinates are in the same lat/lon reference system as the parks data
proj4string(bears) <- proj4string(parks)

# combine is.na() with over() to do the containment test; note that we
# need to "demote" parks to a SpatialPolygons object first
inside.park <- !is.na(over(bears, as(parks, "SpatialPolygons")))

# calculate what fraction of sightings were inside a park
mean(inside.park)
## [1] 0.1720648

# determine which park contains each sighting and store the park name as an attribute of the bears data
bears$park <- over(bears, parks)$Unit_Name

# draw a map big enough to encompass all points, then add in park boundaries superimposed upon a map of the United States
plot(bears)
map("world", region="usa", add=TRUE)
plot(parks, border="green", add=TRUE)
legend("topright", cex=0.85, c("Bear in park", "Bear not in park", "Park boundary"), pch=c(16, 1, NA), lty=c(NA, NA, 1), col=c("red", "grey", "green"), bty="n")
title(expression(paste(italic("Ursus arctos"), " sightings with respect to national parks")))

# plot bear points with separate colors inside and outside of parks
points(bears[!inside.park, ], pch=1, col="gray")
points(bears[inside.park, ], pch=16, col="red")

# write the augmented bears dataset to CSV
write.csv(bears, "bears-by-park.csv", row.names=FALSE)

# ...or create a shapefile from the points
writeOGR(bears, ".", "bears-by-park", driver="ESRI Shapefile")
```
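
As a quick follow-up using the objects created above, the park attribute attached via `over()` can be tabulated to see where sightings concentrate:

```{r vector2, echo=T, eval=F}
# sightings per park; bears outside every park have park == NA
sort(table(bears$park), decreasing = TRUE)
sum(is.na(bears$park)) # number of sightings outside all parks
```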

## Raster analysis

```{r raster, eval=T, echo=T}

tmin=getData('worldclim', var='tmin', res=10)

# Raster calculator: difference between January and October minimum temperature
tdiff <- tmin$tmin1 - tmin$tmin10

## the following is faster for large datasets
tdiff <- overlay(tmin$tmin1, tmin$tmin10, fun=function(x, y){return(x - y)})

elevation <- getData("alt", country = "ESP")
slope <- terrain(elevation, opt = "slope")
aspect <- terrain(elevation, opt = "aspect")
hill <- hillShade(slope, aspect, 40, 270)
plot(hill, col = grey(0:100/100), legend = FALSE, main = "Spain")
plot(elevation, col = rainbow(25, alpha = 0.35), add = TRUE)

#contours

contour(elevation)
```
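
Raster values can also be queried at point locations with `extract()`. A minimal sketch against the WorldClim stack downloaded above, using two made-up lon/lat coordinates:

```{r raster_extract, eval=F, echo=T}
# minimal sketch: sample the January minimum-temperature layer at two points
pts <- cbind(lon = c(-101.9, -3.7), lat = c(33.6, 40.4)) # made-up locations
extract(tmin$tmin1, pts) # WorldClim stores temperature as degrees C x 10
```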

```{r raster2, eval=F, echo=T}
#crop raster
plot(hill, col = grey(0:100/100), legend = FALSE, main = "Spain")
plot(elevation, col = rainbow(25, alpha = 0.35), add = TRUE)
extent=drawExtent()
cropElev <- crop(elevation, extent)
plot(cropElev)
```
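
Whereas `crop()` trims a raster to a rectangular extent, `mask()` sets cells outside a polygon to `NA`. A minimal sketch assuming the Spain elevation raster from above, with a country outline fetched via `getData()`:

```{r raster3, eval=F, echo=T}
# minimal sketch: mask the elevation raster to the Spanish national boundary
spain <- getData("GADM", country = "ESP", level = 0) # SpatialPolygonsDataFrame
elevMasked <- mask(elevation, spain)
plot(elevMasked, main = "Elevation masked to Spain")
```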
