Course description
With the continuing advances of geographic information science and geospatial
technologies, spatially referenced information has become easily and increasingly
available over the past decades and now serves as an important information source
in scientific research and decision-making processes. To take full advantage
of this rich collection of spatial (and temporal) data, statistical analysis is
often necessary, e.g., to extract implicit knowledge such as spatial relations and
patterns that are not explicit in the data. Spatial data analysis distinguishes
itself from classical data analysis in that it focuses on the locations,
areas, distances, relationships, and interactions of measurements that are usually
referenced as points, lines, and areal units in geographic space. Over the
past decades, a plethora of theories, methods, and tools for spatial analysis have
been developed from different perspectives and have converged into the fruitful fields of
geographic information science (GIScience) and spatial statistics.
The purpose of this class is to present commonly used methods and current
trends in spatial and spatiotemporal data analysis, as well as innovative applications
in relevant fields (e.g., environmental science and engineering, natural resources
management, ecology, public health, climate science, civil engineering, and the social
sciences). In this class, we will review the basic principles of spatiotemporal
analysis and modeling and discuss commonly used methods and tools. Students
are expected to actively participate in class lectures, complete lab assignments,
read assigned articles, and develop a project of their own choice or one directly related
to their thesis/dissertation topics. The following topics will be covered in the
class, but can be adjusted to meet students' interests:
• Exploratory spatial data analysis
• Space-time geostatistics
• Spatial point process and species distribution modeling
• Spatiotemporal disease mapping
• Time series map analysis and change detection
Prerequisites
Prerequisites for this course include an understanding of basic concepts of spatial
analysis and statistics, which can be fulfilled with basic statistics courses or
a graduate-level GIS course. Students from different disciplines are welcome;
please contact the instructor should there be any questions about the prerequisites.

Learning outcomes
After completing this course, students are expected to be able to:
• formulate real-world problems in the context of spatial and spatiotemporal
analysis with a knowledge of basic concepts and principles in this field;
• understand commonly used concepts and methods in statistical analysis of
spatiotemporal data;
• apply appropriate spatial and spatiotemporal analytical methods to solve
the formulated problems, and be able to critically review alternative
methods;
• utilize programmable scientific computing tools (e.g., R) to make maps,
solve spatial and spatiotemporal analysis problems, and evaluate and assess
the results of alternative methods.
Readings
• A reading list of articles will be provided. The following books will be
frequently referred to for reading:
– Bivand, Roger S., Pebesma, Edzer J., and Gómez-Rubio, Virgilio
(2008), Applied Spatial Data Analysis with R, Springer (eBook available at the TTU library); referred to as "BPG" in the outline below.
– Cressie, N., & Wikle, C. K. (2011). Statistics for Spatio-temporal
Data. John Wiley & Sons.
Sample course outline

Day  Sample topics                          Readings        Hours
1    Class overview and introduction        Handouts        3
2    Point pattern analysis                 Ch. 7, BPG      3
3    Species distribution modeling          Handouts        3
4    Space-time geostatistics               Ch. 8, BPG      3
5    Spatiotemporal regression              Ch. 9-10, BPG   3
6    Time series map analysis               Handouts        3
7    Discussion and student presentations   —               5

Background of Instructor
Dr. Guofeng Cao is an Assistant Professor in the Department of Geosciences
at Texas Tech University. His research interests include geographic information
science and systems (GIS), cyberGIS, and spatiotemporal statistics, with a
primary focus on statistical learning of complex spatial and spatiotemporal
patterns across different domains. His research has been supported by various
funding agencies. He has published 45 peer-reviewed papers, including 30 journal
articles. He received a B.S. in Earth Science from Zhejiang University, an M.S. in
GIS from the Chinese Academy of Sciences, and an M.A. in Statistics and a Ph.D. in
Geography from the University of California, Santa Barbara. He also had several
years of industry experience before moving back to academia.

https://github.com/surfcao/summer2018-cug

---
title: "Day 1: Use R as GIS"
output: github_document
---

```{r global_options, results='asis', warning=FALSE}
knitr::opts_chunk$set(fig.width=12, fig.height=8, fig.path='Figs/', warning=FALSE, message=FALSE)
```

```{r load, echo=F, eval=T}
rm(list=ls())
x <- c("sp", "rgdal", "rgeos", "maptools", "classInt", "RColorBrewer", "GISTools", "maps", "raster", "ggmap")
#install.packages(x) # warning: this may take a number of minutes
lapply(x, library, character.only = TRUE) #load the required packages
```

# Spatial Objects

| | Without attributes | With attributes |
| ----- | ------------------ | -------------- |
|Points | SpatialPoints | SpatialPointsDataFrame|
|Lines | SpatialLines | SpatialLinesDataFrame|
|Polygons | SpatialPolygons | SpatialPolygonsDataFrame|
|Raster | SpatialGrid | SpatialGridDataFrame|
|Raster | SpatialPixels | SpatialPixelsDataFrame|
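
As a quick, self-contained illustration of how the geometry-only classes relate to their `*DataFrame` counterparts, the following is a minimal sketch using made-up coordinates and attributes (not part of the lab data):

```{r sp_classes_example, echo=T, eval=T}
# minimal sketch: two made-up points in lon/lat (not part of the lab data)
xy <- cbind(lon = c(-101.90, -101.85), lat = c(33.58, 33.60))
pts <- SpatialPoints(xy, proj4string = CRS("+init=epsg:4326"))  # geometry only
class(pts)
# attach an attribute table to promote the points to a SpatialPointsDataFrame
pts.df <- SpatialPointsDataFrame(pts, data = data.frame(id = 1:2, price = c(120000, 95000)))
class(pts.df)
summary(pts.df)
```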

```{r load_library1, echo=T, eval=T}
LubbockBlock<-readShapePoly("Data/LubbockBlockNew.shp") #read polygon shapefile
class(LubbockBlock)
HouseLocation<-read.csv("Data/HouseLatLon.csv") #read GPS data
class(HouseLocation)
coordinates(HouseLocation)<-c('Lon', 'Lat')
class(HouseLocation)
cropland<-raster("Data/Lubbock_CDL_2013_USDA.tif")
class(cropland)

tmin <- getData("worldclim", var = "tmin", res = 10) # this will download
class(tmin)
```

```{r load_library2, echo=T, eval=T}
LubbockBlock<-readOGR("./Data", "LubbockBlockNew") #read polygon shapefile
class(LubbockBlock)
```
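
One practical difference between the two readers: `readOGR` reads the accompanying `.prj` file, so the coordinate reference system is attached automatically, while `readShapePoly` leaves it unset. A quick check, assuming the shapefile above ships with a `.prj` file:

```{r check_crs, echo=T, eval=T}
# CRS picked up from the .prj file by readOGR (assuming a .prj ships with the shapefile)
proj4string(LubbockBlock)
# readShapePoly leaves the CRS unset; it would need to be assigned manually, e.g.:
# proj4string(LubbockBlock) <- CRS("+init=epsg:4326")
```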

# Mapping with R

## Basic Mapping

```{r mapping, echo=T, eval=T}
LubbockBlock<-readShapePoly("Data/LubbockBlockNew.shp") #read polygon shapefile
plot(LubbockBlock,axes=TRUE, col=alpha("gray70", 0.6)) #plot Lubbock block shapefile
#add title, scalebar, north arrow, and legend
HouseLocation<-read.csv("Data/HouseLatLon.csv") #read GPS data
price<-HouseLocation$TotalPrice
nclr<-5
priceclr<-brewer.pal(nclr, "Reds")
class<-classIntervals(price, nclr, style="quantile")
clocode<-findColours(class, priceclr)

points(HouseLocation$Lon, HouseLocation$Lat, pch=19, col=clocode, cex=0.5) #add houses on top of Lubbock block shapefile
title(main="Houses on Sale in Lubbock, 2014")

legend(-101.95, 33.65, legend=names(attr(clocode, "table")), fill =attr(clocode, "palette"), cex=0.5, bty="n")
#map.scale(x=-101.85, y=33.49,0.001,"Miles",4,0.5,sfcol='red')
north.arrow(xb=-101.95, yb=33.65, len=0.005, lab="N")

#plot raster
plot(cropland)
#plot raster stack
tmin <- getData("worldclim", var = "tmin", res = 10) # this will download
plot(tmin)
```

## Mapping with static Google Maps

```{R mapping2, echo=F, eval=F}
library(RgoogleMaps)
lubbock=geocode('lubbock')

newmap <- GetMap(center = c(lubbock$lat, lubbock$lon), zoom = 12, destfile = "newmap.png", maptype = "roadmap")

PlotOnStaticMap(newmap, lat=HouseLocation$Lat, lon=HouseLocation$Lon, col='red')
lubbockPolys <- SpatialPolygons(LubbockBlock@polygons, proj4string=CRS("+init=epsg:4326"))
PlotPolysOnStaticMap(newmap, lubbockPolys, col=alpha('blue', 0.2))
```

## Mapping with dynamic Google Maps

```{R mapping3, echo=F, eval=F}
library(plotGoogleMaps)

data(meuse)
coordinates(meuse)=~x+y
proj4string(meuse) = CRS('+init=epsg:28992')
plotGoogleMaps(meuse, filename='meuse.html')

HouseLocation<-read.csv("Data/HouseLatLon.csv") #read GPS data
coordinates(HouseLocation)<-c('Lon', 'Lat')
proj4string(HouseLocation) <- CRS('+init=epsg:4326')
plotGoogleMaps(HouseLocation, filename='house.html')

ic = iconlabels(meuse$zinc, height=12)
plotGoogleMaps(meuse, iconMarker=ic, mapTypeId='ROADMAP', filename='meuse2.html')

#plot raster
data(meuse.grid)
coordinates(meuse.grid)<-c('x', 'y')
meuse.grid<-as(meuse.grid, 'SpatialPixelsDataFrame')
proj4string(meuse.grid) <- CRS('+init=epsg:28992')
mapMeuseCl<- plotGoogleMaps(meuse.grid,zcol= 'dist',at=seq(0,0.9,0.1),colPalette= brewer.pal(9,"Reds"), filename='meuse3.html')

#plot polygons
proj4string(LubbockBlock) <- CRS("+init=epsg:4326")
m <- plotGoogleMaps(LubbockBlock, zcol="Pop2010", filename='MyMap6.htm', mapTypeId='TERRAIN', colPalette=brewer.pal(7, "Reds"), strokeColor="white")

#plot lines (contour lines derived from the distance surface)
im <- as.image.SpatialGridDataFrame(meuse.grid['dist'])
cl <- ContourLines2SLDF(contourLines(im))
proj4string(cl) <- CRS('+init=epsg:28992')
mapMeuseCl <- plotGoogleMaps(cl, zcol='level', strokeWeight=1:9, filename='myMap6.htm', mapTypeId='ROADMAP')

```

## Changing map projections

```{r projection, eval=T }

#project a vector

boundary <- readShapePoly('Data/boundary') # readShapePoly does not read the .prj, so assign the CRS manually
proj4string(boundary) <- CRS("+proj=utm +zone=17 +datum=WGS84 +units=m +no_defs +ellps=WGS84 +towgs84=0,0,0")
proj4string(boundary)
boundaryProj <- spTransform(boundary, CRS("+init=epsg:3857"))
proj4string(boundaryProj)

#project a raster
proj4string(cropland)
plot(cropland)
aea <- CRS("+proj=aea +lat_1=29.5 +lat_2=45.5 +lat_0=37.5 +lon_0=-96 +datum=NAD83 +units=m +no_defs") # ESRI:102003, USA Contiguous Albers equal-area conic
projCropland=projectRaster(cropland, crs=aea)
plot(projCropland)
```

# Spatial analysis with R

```{r load_library4, echo=T, eval=F}

#subsetting a spatial dataframe
LubbockBlock<-readOGR("./Data", "LubbockBlockNew") #read polygon shapefile

selection = LubbockBlock[LubbockBlock$Pop2010>500,]
plot(selection)

#select by clicking (click, drawExtent, and drawPoly below require interactive input, hence eval=F for this chunk)
selected = click(LubbockBlock)

extent = drawExtent()

extent=as(extent,'SpatialPolygons')
proj4string(extent)=proj4string(selection)

# perform erase
plot(erase(selection, extent))

poly = drawPoly()
proj4string(poly) = proj4string(LubbockBlock)

# perform clip
cropselection = crop(LubbockBlock,poly)
plot(cropselection)

```
## Vector analysis (overlay)

```{r vector, echo=T, eval=T }
# point-in-polygon overlay: bear sightings vs. national park boundaries

# Datasets
# * CSV table of (fictionalized) brown bear sightings in Alaska, each
# containing an arbitrary ID and spatial location specified as a
# lat-lon coordinate pair.
# * Polygon shapefile containing the boundaries of US National Parks
# greater than 100,000 acres in size.

bears <- read.csv("Data/bear-sightings.csv")
coordinates(bears) <- c("longitude", "latitude")

# read in National Parks polygons
parks <- readOGR("Data", "10m_us_parks_area")

# tell R that bear coordinates are in the same lat/lon reference system as the parks data
proj4string(bears) <- proj4string(parks)

# combine is.na() with over() to do the containment test; note that we
# need to "demote" parks to a SpatialPolygons object first
inside.park <- !is.na(over(bears, as(parks, "SpatialPolygons")))

# calculate what fraction of sightings were inside a park
mean(inside.park)
## [1] 0.1720648

# determine which park contains each sighting and store the park name as an attribute of the bears data
bears$park <- over(bears, parks)$Unit_Name

# draw a map big enough to encompass all points, then add in park boundaries superimposed upon a map of the United States
plot(bears)
map("world", region="usa", add=TRUE)
plot(parks, border="green", add=TRUE)
legend("topright", cex=0.85, c("Bear in park", "Bear not in park", "Park boundary"), pch=c(16, 1, NA), lty=c(NA, NA, 1), col=c("red", "grey", "green"), bty="n")
title(expression(paste(italic("Ursus arctos"), " sightings with respect to national parks")))

# plot bear points with separate colors inside and outside of parks
points(bears[!inside.park, ], pch=1, col="gray")
points(bears[inside.park, ], pch=16, col="red")

# write the augmented bears dataset to CSV
write.csv(bears, "bears-by-park.csv", row.names=FALSE)

# ...or create a shapefile from the points
writeOGR(bears, ".", "bears-by-park", driver="ESRI Shapefile", overwrite_layer=TRUE)
```

## Raster analysis

```{r raster, eval=T, echo=T}

tmin=getData('worldclim', var='tmin', res=10)

# Raster calculator
diff=tmin$tmin1 - tmin$tmin10

## the following code is faster for large datasets.
diff2 <- overlay(tmin$tmin1, tmin$tmin10, fun=function(x, y){return(x - y)})

elevation <- getData("alt", country = "ESP")
slope <- terrain(elevation, opt = "slope")
aspect <- terrain(elevation, opt = "aspect")
hill <- hillShade(slope, aspect, 40, 270)
plot(hill, col = grey(0:100/100), legend = FALSE, main = "Spain")
plot(elevation, col = rainbow(25, alpha = 0.35), add = TRUE)

#contours

contour(elevation)
```

```{r raster2, eval=F, echo=T}
#crop raster
plot(hill, col = grey(0:100/100), legend = FALSE, main = "Spain")
plot(elevation, col = rainbow(25, alpha = 0.35), add = TRUE)
extent=drawExtent()
cropElev <- crop(elevation, extent)
plot(cropElev)
```
