Friday, October 23, 2015

Gathering Data and Assessing Accuracy

Goals and Objectives:

The goal of this lab was to familiarize myself with the process of gathering and downloading data from different organizations on the internet. After the data was gathered, I had to import it into ArcGIS and project it from all of its different source coordinate systems into a single common coordinate system. I also had to design a geodatabase to store all the data, which had to be done by writing Python scripts in PyScripter. You can view my scripts here. The challenge of this lab was writing scripts that would actually work, as well as keeping all the downloaded data organized and easily accessible. Like all the posts/labs in this blog, this lab shares the common goal of building a risk and sustainability model for sand mining in western Wisconsin. All of the data downloaded for this lab was focused on Trempealeau County, both for proximity and to minimize the sheer amount of data stored.
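As a minimal sketch of how the geodatabase setup could look in Python (the folder path here is just a placeholder, not necessarily what my actual script used):

import arcpy

# Hypothetical folder path -- substitute your own working directory
working_folder = r"C:\GIS\Lab3"
gdb_name = "TMP.gdb"

# Create the TMP file geodatabase only if it does not already exist,
# so the script can be re-run without throwing errors
if not arcpy.Exists(working_folder + "\\" + gdb_name):
    arcpy.CreateFileGDB_management(working_folder, gdb_name)
    print("Created " + gdb_name)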

Methods:

The first step in this lab was to obtain our data from the internet (Figure A). The specific datasets required for this lab are described in the methods below.


Figure A: the basic workflow diagram Professor Hupy provided in our lab instructions.

All the datasets were first downloaded to a temporary folder, then unzipped and extracted to a working folder. I downloaded them to a temporary folder first in order to save storage space, since temporary files are deleted from the system at the end of every month. We then sorted through the extracted files, keeping only the ones we needed: a railroad feature class, soils information, DEM elevation rasters, and national land use and land cover data. These were all moved to a master folder along with the TMP geodatabase. As mentioned before, this data was used in the next step, the scripting process in Python.
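The download-and-extract step can be scripted as well. Here is a rough sketch, assuming Python 2.7 (the version that ships with ArcGIS Desktop 10.x; in Python 3 you would use urllib.request.urlretrieve instead). The URL and paths are made up for illustration:

import urllib
import zipfile

# Hypothetical URL and paths -- each real dataset came from a different source
url = "http://www.example.com/data/rails.zip"
temp_zip = r"C:\Temp\rails.zip"
working_folder = r"C:\GIS\Lab3\working"

# Download the zipped dataset to the temporary folder
urllib.urlretrieve(url, temp_zip)

# Unzip and extract everything into the working folder
with zipfile.ZipFile(temp_zip, "r") as archive:
    archive.extractall(working_folder)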

The four datasets listed above were then projected into the same coordinate system as the TMP geodatabase, NAD83_HARN_WISCRS Trempealeau County (Feet), and clipped to the Trempealeau County boundary. Finally, they were loaded into the geodatabase, where we were able to use them effectively and create some maps. After all was said and done, all unnecessary and redundant data was deleted.
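My full scripts are linked above; as a rough sketch of what this step looks like in arcpy for one of the vector datasets (the paths and feature class names here are invented for illustration):

import arcpy
arcpy.env.overwriteOutput = True

# Hypothetical paths -- adjust to match your own folders and geodatabase
working_folder = r"C:\GIS\Lab3\working"
gdb = r"C:\GIS\Lab3\TMP.gdb"
boundary = gdb + "\\trempealeau_boundary"  # county boundary feature class

# Read the target coordinate system (NAD83 HARN WISCRS Trempealeau
# County Feet) from a feature class already stored in the geodatabase
target_sr = arcpy.Describe(boundary).spatialReference

# Project the railroads feature class into that coordinate system,
# then clip it to the county boundary and store the result in the gdb
arcpy.Project_management(working_folder + "\\railroads.shp",
                         gdb + "\\railroads_proj", target_sr)
arcpy.Clip_analysis(gdb + "\\railroads_proj", boundary,
                    gdb + "\\railroads_clipped")

The rasters (the DEM and the land cover data) would go through the same projection-and-clip sequence, but with the raster tools arcpy.ProjectRaster_management and arcpy.Clip_management instead.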

Data Accuracy:

Using the metadata from each dataset, I was able to investigate the accuracy of the data. This is very important because our data came from many different sources and therefore has varying degrees of accuracy. Delving into the metadata helped me better understand where each dataset came from, how frequently it is updated, who produced it, its resolution, and so on. Figure B below shows the accuracy information for each dataset.

Figure B: Data Quality Table
NA represents information that was unavailable or that I could not locate in the metadata.

Conclusion:

I learned how to download and organize a ridiculous amount of data from different sources, though it was probably minuscule compared to what professional-level GIS users handle, but I digress. I feel like this is a great skill to have learned and to now hone. I also learned a lot about metadata and the data itself; it was tedious, but worth it, and I feel smarter now. Using Python to automate tasks in ArcGIS is a great skill to have begun learning because it will prove to be very helpful.







