Thursday, March 30, 2017

Post 3: Gathering Data

Introduction
The goal of this lab was to become familiar with a few sources for collecting data and to determine the accuracy of the data collected. The data collected were for Trempealeau County, Wisconsin. A map of Trempealeau County was created using the land cover, crop cover, and digital elevation model data gathered from these different resources (Figure 1). In addition, a table was created to display all of the available information about each dataset in order to assess its accuracy.

Methods
The data for this map were collected from various sources:

US Department of Transportation-Bureau of Transportation Statistics: https://www.rita.dot.gov/bts/sites/rita.dot.gov.bts/files/publications/national_transportation_atlas_database/index.html

United States Geological Survey-National Map Viewer: http://nationalmap.gov/about.html

USDA Geospatial Data Gateway: http://datagateway.nrcs.usda.gov/

Trempealeau County Land Records: http://www.tremplocounty.com/tchome/landrecords

USDA NRCS Web Soil Survey: http://websoilsurvey.sc.egov.usda.gov/App/HomePage.htm

SSURGO data from the USDA NRCS Web Soil Survey were not directly compatible with ArcGIS, so the data tables were reformatted in Microsoft Access and imported into the geodatabase created for this map. Once the Trempealeau County geodatabase was placed in a folder titled 'Sub', the desired rasters were clipped and reprojected to the shape of Trempealeau County using Python scripting. The script is displayed in a previous blog post.
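The full clip-and-reproject script appears in the previous post, so it is not repeated here. As a rough illustration only, a batch workflow of that kind might look like the sketch below; the workspace path, geodatabase name, boundary feature class, and output names are placeholders rather than the values used in the original script.

#a rough sketch, not the original script: list the rasters, clip each to the county
#boundary, and reproject it to match the boundary's coordinate system
import arcpy
from arcpy import env

env.workspace = r"Q:\StudentCoursework\CHupy\Sub\Trempealeau.gdb"   #assumed path and geodatabase name
env.overwriteOutput = True

boundary = "TrempealeauBoundary"                    #hypothetical county boundary feature class
out_sr = arcpy.Describe(boundary).spatialReference  #target coordinate system taken from the boundary

for raster in arcpy.ListRasters():
    clipped = raster + "_clip"
    #clip the raster to the county boundary polygon
    arcpy.Clip_management(raster, "#", clipped, boundary, "", "ClippingGeometry")
    #reproject the clipped raster to the boundary's coordinate system
    arcpy.ProjectRaster_management(clipped, raster + "_proj", out_sr, "NEAREST")

print "Clipping and reprojection complete"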

Data Accuracy
Below is a table (Table 1) that displays information about various attributes of the different datasets used to create the map. Many of these attributes were not included in the metadata of each dataset.
Table 1. Data Accuracy table.
Results
Below is the map (Figure 1) that was created using the various sources of data collected above. The trails feature class was added from the Trempealeau County geodatabase. The map was not created for any purpose other than displaying the collected data.
Figure 1. This map was created to demonstrate the use of data gathered from different sources.
Conclusion
Collecting the data was difficult because each source had a different way of downloading its data, so this lab was helpful for getting the GIS class used to retrieving information and getting our feet wet. Assessing data accuracy was even more difficult, because not every data source openly displayed the information needed for this lab, and it was frustrating trying to find a value for every field. Even so, this lab showed that assessing the accuracy of data is both important and necessary.

Wednesday, March 15, 2017

Post 2: Python Scripts

Introduction: Python Script 5
This exercise created a script that produced a weighted impact map. The map combines five factors that were used to create another map. In this script, the streams factor was weighted 1.5 times that of the other factors, and the other rasters were then added to the weighted raster to create the final raster.
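The script for this exercise is not reproduced in this post. As a rough illustration only, the weighted overlay step might look like the sketch below, built with Spatial Analyst raster algebra; the geodatabase path and factor raster names are placeholders, not the originals.

#a rough sketch of the weighted overlay, not the original script; the geodatabase
#path and factor raster names are placeholders
import arcpy
from arcpy import env
from arcpy.sa import Raster

env.workspace = r"Q:\StudentCoursework\CHupy\Impact.gdb"
env.overwriteOutput = True
arcpy.CheckOutExtension("Spatial")

#weight the streams factor 1.5 times the other factors
streams_wt = Raster("streams_rank") * 1.5

#add the remaining four factor rasters to the weighted streams raster
impact = streams_wt + Raster("slope_rank") + Raster("landcover_rank") + \
         Raster("soils_rank") + Raster("roads_rank")

impact.save("impact_weighted")
print "Weighted impact raster created"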

Introduction: Python Script 4
This exercise created a script to find all historic major attractions within the maritime climate zone. The script adds field delimiters, builds SQL statements, creates feature layers from those statements with the MakeFeatureLayer_management function, and then selects the historic attractions that fall completely within the maritime zone.

#import system modules
import arcpy

#set environments
from arcpy import env
env.workspace = r"Q:\StudentCoursework\CHupy\GEOG.337.001.2175\KLEINSAS\DemoLab\SanDiego.gdb"
arcpy.env.overwriteOutput = True

fc1= "Climate"
fc2= "MajorAttractions"

#Add field delimiter for the TYPE and ESTAB
field1=arcpy.AddFieldDelimiters(fc1,"TYPE")
field2=arcpy.AddFieldDelimiters(fc2,"ESTAB")

#Build an SQL Statement for the Field TYPE=Maritime
maritimeSQLExp=field1+"="+"'Maritime'"

#Build an SQL Statement for the Field ESTAB<1956
historicSQLExp=field2+">0and"+field2+"<1956"

#Pass SQL Statement to make a New Feature Layer for Maritime
arcpy.MakeFeatureLayer_management(fc1, "MaritimeLyr", maritimeSQLExp)

#Pass SQL Statement to make a New Feature Layer for Historic
arcpy.MakeFeatureLayer_management(fc2, "HistoricLyr", historicSQLExp)

#Select features from the historic layers that are completely within the maritime layer and make a new selection
arcpy.SelectLayerByLocation_management("HistoricLyr", "COMPLETELY_WITHIN", "MaritimeLyr", "", "NEW_SELECTION")

featCount = arcpy.GetCount_management("HistoricLyr")
print "Number of historic features selected: {}".format(featCount)

arcpy.CopyFeatures_management("HistoricLyr", "HistoricMaritime")

Introduction: Python Script 3
The goal of this exercise was to gain experience describing features and using FOR/IN loops.

#import system modules
import arcpy

#set environments
from arcpy import env
env.workspace = "Q:\StudentCoursework\CHupy\GEOG.337.001.2175\KLEINSAS\Python\Demo3.gdb"
arcpy.env.overwriteOutput = True


#set variables for the parameters and buffer the random points by 10 miles.
infc = "Mi_random"
outfc = "Mi_random_buff"
dist = "10 miles"
sidetype = "Full"
endtype = "Round"

#describe the type of dataset and the spatial reference of Mi_random
desc = arcpy.Describe(infc)
sr = desc.spatialReference
print "Dataset Type: " + desc.datasetType
print "Spatial Reference: " + sr.name


arcpy.Buffer_analysis(infc, outfc, dist, sidetype, endtype, "#", "#")

#set variables for the parameters and clip the buffer by the state of Michigan.

clipfc = "MI"
outclipfc = "buff_clip"
arcpy.Clip_analysis(outfc, clipfc, outclipfc)

fc_list = arcpy.ListFeatureClasses()
for name in fc_list:
    desc = arcpy.Describe(name)
    featCount = arcpy.GetCount_management(name)
    print "Name: {} Shape: {} SR: {} Count: {}".format(desc.name, desc.shapeType, desc.spatialReference.name, featCount)


print"the print is complete"

Introduction: Python Script 2
The goal of this exercise was to gain more experience using PyScripter. In this script, the Buffer and Clip tools were run from PyScripter.

#import system modules
import arcpy

#set environments
from arcpy import env
env.workspace = "Q:\StudentCoursework\CHupy\GEOG.337.001.2175\KLEINSAS\Python\Demo2.gdb"
arcpy.env.overwriteOutput = True

#set up the buffer tool
arcpy.Buffer_analysis("Mi_random", "Mi_random_buff", "10 MILES", "FULL", "ROUND", "#", "#")

#clip the buffer by the state of Michigan
arcpy.Clip_analysis("Mi_random_buff", "MI", "Mi_random_buff_clip")


print("The script has completed")


Introduction: Python Script 1
Python is a scripting language used throughout much of the computer science field, and it is also applicable to geographic information systems. This post presents the first Python script used in this class, which clips, edits, and establishes a coordinate system for multiple feature classes at once. The script was written in PyScripter for Python 2.7.
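The original script is not reproduced in this post. As a rough illustration only, a batch clip-and-project loop of that kind might look like the sketch below; the workspace, boundary feature class, and coordinate system are assumptions rather than the values used in class.

#a minimal sketch of the batch workflow described above; the workspace, boundary
#feature class, and coordinate system are assumptions, not the original values
import arcpy
from arcpy import env

env.workspace = r"Q:\StudentCoursework\CHupy\Demo1.gdb"
env.overwriteOutput = True

boundary = "StudyArea"                    #hypothetical clip boundary feature class
sr = arcpy.SpatialReference(26915)        #e.g. NAD 1983 UTM Zone 15N

for fc in arcpy.ListFeatureClasses():
    if fc == boundary:
        continue
    #clip each feature class to the study area boundary
    arcpy.Clip_analysis(fc, boundary, fc + "_clip")
    #project the clipped feature class into the target coordinate system
    arcpy.Project_management(fc + "_clip", fc + "_utm", sr)

print "Batch clip and projection complete"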