Sunday, July 26, 2015

Custom Tools within ArcMap

ArcGIS, being the powerful program that it is, provides ample customization abilities, particularly the creation of custom tools by way of Python scripts.  Tools are accessed within the Desktop interface via dialog boxes, which receive the user's input file locations, desired output location, and values for whatever parameters may be specified.  One of the handiest things about them, though, is that a GIS user of nearly any skill level can create a custom tool from a Python script and easily format it to run within the ArcMap interface.


A custom dialog window such as this looks nearly identical to the system-provided tools within ArcGIS, but the process behind the user interface is a custom Python script.  One need only create a new toolbox in ArcMap and go through the steps of the Add Script wizard to specify the script's location and the parameters required to run the tool.  After a few minor edits within the script itself- mainly replacing lines that hard-code file paths with GetParameter functions- the script can be run with any valid user-specified parameters.
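As a minimal sketch of that edit (the guarded import and helper function are my own illustration; the file path is a made-up example), the change from a hard-coded path to a dialog parameter looks something like this:

```python
# Sketch of adapting a standalone script to run as an ArcMap script tool.
# The guarded import lets the same file run outside ArcGIS for testing;
# inside ArcMap, arcpy.GetParameterAsText reads the value the user
# entered into the tool dialog.
try:
    import arcpy
    HAVE_ARCPY = True
except ImportError:
    HAVE_ARCPY = False

def get_param_text(index, fallback):
    """Return the dialog value for parameter `index`, or `fallback`
    when the script is run outside ArcMap."""
    if HAVE_ARCPY:
        return arcpy.GetParameterAsText(index) or fallback
    return fallback

# Before: in_fc = r"C:\data\rivers.shp"    (hard-coded path)
# After:
in_fc = get_param_text(0, r"C:\data\rivers.shp")
```

The same script then runs both from the tool dialog and from a plain Python session, which makes testing much less painful.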

  
The Results window, which opens upon the script's execution and displays messages about its success or failure, can be customized from within the source script as well.  Print statements, used in the Python environment to print messages to the Interactive Window, are changed to AddMessage functions, which send output information to the Results window instead.  The above are the output messages from the custom script, modified to become a custom tool within ArcMap.  The messages specified to display upon successful completion appear here; if any errors or exceptions are encountered, they are replaced with the appropriate red-lettered warnings and notifications.
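The swap itself can be sketched as below. AddMessage, AddWarning and AddError are real arcpy functions; the print fallback and the report() wrapper are only my own scaffolding so the sketch runs outside ArcGIS:

```python
# Sketch of routing script messages to the ArcMap Results window.
try:
    import arcpy
    def report(text, level="message"):
        # Dispatch to the matching Results-window function.
        {"message": arcpy.AddMessage,
         "warning": arcpy.AddWarning,
         "error": arcpy.AddError}[level](text)
        return text
except ImportError:
    def report(text, level="message"):
        # Outside ArcGIS, fall back to a plain labeled print.
        print("[%s] %s" % (level.upper(), text))
        return text

report("Reclassification completed successfully.")
report("Output already exists and will be overwritten.", "warning")
```

Messages added with AddError are the ones that surface as the red-lettered notifications and flag the tool run as failed.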

Saturday, July 25, 2015

Analyzing Statistical "Hotspots" with GIS

Using GIS to identify and locate "hotspots," or areas with statistically significant concentrations of various phenomena, is both intuitive and useful, given its powerful ability to both analyze and visually display this kind of information.  Martino et al. (2014) use GIS to this end to explore locational patterns and statistical algorithms employed in the analysis of brain cancer in "Spatio-temporal hotspots and application on a disease analysis case via GIS."  With this research the authors sought to identify locations with statistically significant proportions of brain cancer incidence in New Mexico, with further consideration of temporal factors, for the years 1973 to 1991.  The data used for the research was a collection of 5,000 point locations of cancer incidence for the indicated time period, with the enumeration units being the counties that compose the state of New Mexico.  The authors' examination focused on a comparison of two methods of cluster analysis- the Extended Fuzzy C-Means (EFCM) and the Extended Gustafson-Kessel (EGK) algorithms- employed within the GIS environment.  The EFCM algorithm employs a circular hotspot area, determined by the patterns of point-event clusters across space, and makes a temporal comparison over a specified period of time.  The EGK method is an analogous procedure that produced very similar results, but employs an ellipsoidal hotspot area and uses a different membership calculation within one of its formulas.  Analysis of the results led the authors to conclude that both algorithms, used within a GIS, are equally effective in predicting the spatial diffusion of this disease, and may be of use in larger-scale studies seeking to identify temporal and/or spatial determinants within a specified study area.
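For a feel of the membership idea underlying both methods, here is a sketch of the standard fuzzy c-means membership update- not the authors' EFCM or EGK extensions, just the textbook formula they build on, with illustrative coordinates:

```python
import math

def fcm_memberships(points, centers, m=2.0):
    """Standard fuzzy c-means membership update,
    u_ij = 1 / sum_k (d_ij / d_kj) ** (2 / (m - 1)),
    the degree to which each point belongs to each cluster center."""
    result = []
    for (px, py) in points:
        dists = [math.hypot(px - cx, py - cy) for (cx, cy) in centers]
        if 0.0 in dists:
            # Point sits exactly on a center: full membership there.
            result.append([1.0 if d == 0.0 else 0.0 for d in dists])
        else:
            result.append([1.0 / sum((d_i / d_k) ** (2.0 / (m - 1.0))
                                     for d_k in dists)
                           for d_i in dists])
    return result

# A point midway between two centers belongs equally (0.5) to each.
u = fcm_memberships([(0.5, 0.0)], [(0.0, 0.0), (1.0, 0.0)])
```

Unlike a hard classification, every point carries a degree of membership in every cluster, which is what lets the hotspot boundary be drawn as a fuzzy region rather than a crisp one.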


Reference
Martino, F., Sessa, S., Barillari, U. S., & Barillari, M. R. (2014). Spatio-temporal hotspots and application on a disease analysis case via GIS. Soft Computing - A Fusion Of Foundations, Methodologies And Applications, (12), 2377.
http://dl.acm.org/citation.cfm?id=2689536

Monday, July 20, 2015

Working with Rasters in Python

The corollary to working with feature geometry in vector datasets is working with raster datasets, done in Python through the arcpy.sa module, which is made available by the Spatial Analyst extension.  This module comes complete with its own tools, functions and idiosyncrasies, which were the focus of this week's Python script.


     
Reclassification of raster data can be a cumbersome process using the Spatial Analyst tools available in ArcMap, particularly when the required result is a classification raster combining several reclassified inputs.  The same task is relatively simple by comparison within a Python script, which holds each intermediary input as a temporary dataset and saves only user-specified results.  The Python script also completes with a few short lines of code what would otherwise require the creation of multiple separate raster datasets with the tools in the Spatial Analyst toolbox.  The efficiency with which the code can reclassify various raster datasets, then combine them into a final output of areas containing specified characteristics, cannot be overstated.  The above image represents an area meeting all of a set of specified conditions- namely that the land has a particular type of landcover, and a specified range of slope and aspect.  Two input raster datasets were employed in its creation- one with landcover types, and another with elevation.  The elevation data was used to derive slope and aspect datasets, and the landcover was reclassified to contain only those areas categorized as "forested."  The resulting output is the combination of the suitable areas within each input, and represents the land within the specified areal extent that meets the landcover, slope and aspect suitability requirements.
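The reclassify-and-combine pattern can be sketched in plain Python- in arcpy this would be Reclassify() and map algebra on Raster objects, but nested lists stand in for the grids here so the logic is visible, and the class codes and values are purely illustrative:

```python
def reclassify(grid, predicate):
    """Return a 0/1 suitability grid: 1 where predicate(cell) is true."""
    return [[1 if predicate(v) else 0 for v in row] for row in grid]

def combine(*grids):
    """Cell-wise AND of several 0/1 grids (all criteria must be met)."""
    rows, cols = len(grids[0]), len(grids[0][0])
    return [[int(all(g[r][c] for g in grids)) for c in range(cols)]
            for r in range(rows)]

landcover = [[41, 42], [21, 41]]          # 41/42 = forested (example codes)
slope     = [[ 5, 30], [10,  8]]          # degrees, illustrative values
forest_ok = reclassify(landcover, lambda v: v in (41, 42))
slope_ok  = reclassify(slope, lambda v: v <= 20)
suitable  = combine(forest_ok, slope_ok)  # -> [[1, 0], [0, 1]]
```

Each intermediate grid plays the role of the temporary in-memory raster mentioned above; only the final combined result would be saved to disk.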

Tuesday, July 14, 2015

Geometries in Python

Geometry, within the context of GIS and Python programming, refers to the points, connecting lines and arrays of points that compose point, polyline and polygon features.  All of the vector data objects used within a GIS can be expressed in terms of these geometries, which can be accessed relatively easily with a Python script.


The task at hand for this topic was the creation of a script that output the above- the object ID, a sequential "vertex ID," the x, y coordinates and feature name- all taken from a polyline shapefile of a set of rivers in Hawaii.  The script had to iterate through each line segment and print the coordinates of all of the vertices, placing each pair on a separate line of the text file.  This process makes use of the "search cursor" concept introduced last week, which allows the script to access each row (or record) within the feature class.  Also put into play is the for loop, which provides the means of iterating through each row (in this case each river), and further through each line segment and set of point vertices that compose it.  The 25 rivers within the feature class are represented as 25 different object IDs (the first number of each line), with each coordinate pair (vertex) given a sequential ID (the second number), the x, y pair, and finally the feature's name, taken from the name attribute field.  Reading and outputting data such as this from feature classes is a very useful task for a script to complete, as the same steps performed manually would quickly become time consuming, especially when working with a large number of datasets.
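The loop structure can be sketched as follows. Plain tuples stand in for the rows an arcpy.da.SearchCursor would yield, and the river name and coordinates are made-up examples, not values from the actual Hawaii dataset:

```python
# Sketch of the nested-loop vertex listing described above.
def write_vertices(rows, out_path):
    """Write one 'OID vertexID X Y Name' line per vertex."""
    with open(out_path, "w") as out:
        for oid, parts, name in rows:
            vertex_id = 0
            for part in parts:              # a polyline can have several parts
                for (x, y) in part:
                    vertex_id += 1
                    out.write("%d %d %.1f %.1f %s\n"
                              % (oid, vertex_id, x, y, name))

rivers = [(1, [[(0.0, 0.0), (1.0, 2.0)]], "Example Stream")]
write_vertices(rivers, "river_vertices.txt")
```

The outer loop walks the rows (rivers), the middle loop walks each line part, and the inner loop numbers and writes each vertex- the same three levels of iteration the assignment required.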

Sunday, July 12, 2015

Practicing Damage Assessment: Structural Damage Caused by Hurricane Sandy

Assessing damage in the aftermath of a natural disaster is a weighty task- often there is significant loss of life and property involved, but someone has to actually quantify exactly how much of what is damaged or gone altogether.  This can be readily accomplished using aerial photos and GIS analysis.


Above is an aerial image of a block on the New Jersey shore, taken shortly after Hurricane Sandy's late October 2012 landfall.  This image was used in conjunction with one of the same areal extent taken before the storm.  Comparison of the two images allows for a quick, though not especially accurate, inventory of the structures directly affected and some of the damage on the ground.  The final quantification of that damage is more than likely completed with reference to actual field-derived ground information, but image analysis such as this provides a decent starting estimate.  When the pre- and post-storm images are compared, a point layer of structure locations can be created, with descriptions of the types of damage observed stored as feature attributes.  Upon completion, the layer can be symbolized according to observed damage and presented in map form with an image like the one here.  This type of analysis produces maps that give a good general overview of damage resulting from storm events such as Hurricane Sandy, and can also be used to quantify the extent of that damage.

 Creating a feature layer of the actual coastline and drawing a buffer around it allows for a count of buildings, and their associated damage levels, within a specified distance of the water's edge.  The results of this analysis, in the table above, indicate that the majority of the damage- about 60%- occurred in the area 100 to 200 meters from the water.  Quantitative results in a table like this also support the intuitive assumption that the level of damage is greatest nearest the water's edge, and that the magnitude of structural damage generally decreases with increasing distance.  It is worth mentioning, though, that despite the quantitative nature of this summary, the data it is based on is anything but.  Subjective interpretation by one person, visually comparing aerial images, would hardly suffice as an accurate damage assessment in a real-life scenario.
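The distance-band tally itself reduces to a small bit of bookkeeping once each structure's distance from the coastline is known. In the sketch below the distances are made-up stand-ins for what a buffer or near analysis would produce:

```python
# Sketch: count damage points by 100 m distance band from the coastline.
from collections import Counter

def band(dist, width=100):
    """Label a distance with its ring, e.g. 150 -> '100-200 m'."""
    lo = int(dist // width) * width
    return "%d-%d m" % (lo, lo + width)

# Illustrative distances (metres) of damaged structures from the water.
damage_pts = [30, 80, 120, 150, 190, 240]
counts = Counter(band(d) for d in damage_pts)
# Here the 100-200 m ring holds 3 of the 6 points, i.e. half the damage.
```

Dividing each band's count by the total gives the percentage figures reported in the summary table.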

Monday, July 6, 2015

Using Python to Manipulate Spatial Data (and also possibly to test your ability to refrain from gratuitous use of hyperbole in writing...)

Computer programming, writing scripts, everything involved with learning the "language of the machines" (so to speak) isn't supposed to be easy.  No one will tell you it's easy.  Going into your first class on the subject is intimidating, to say the least, and there isn't really anything to decently prepare an average intellect for the exact nature of the challenge ahead.  The minor victories along the way are as sweet as any devoted student has experienced- spending hours experimenting with this and that to persuade your script to perform the way you require, and finally getting it right.  The rush of satisfaction is unparalleled, but the dark and frustrating hours toiling to get there are enough to render helpless even the most stalwart resolve.  This week and last week were dedicated to grasping the finer points of spatial data manipulation with Python.  Some of the concepts introduced were a bit cryptic for a novice, but extensive experimentation with them in creating the assigned script was sufficient to reveal their basic workings, if not some of their many vagaries.




The above screenshot is, once again, a section from the printed output of the script created for the assignment.  The simple output displayed is, like the previous assignments this semester, the mere tip of the iceberg that was the required script.  The required outputs, some of which appear in the screen capture, varied from a simple geodatabase created in a specified location to a search cursor that queried a feature class's table and output specific attributes of records meeting certain criteria.  Suffice it to say, getting the right output from the myriad processes the script performed was an epic test of patience for your humble blog author, and fortunately (for all involved) did not result in a frustrated burst of destructive rage.  For example, the concept of a "search cursor" is perhaps not so intuitive to the nascent programmer, so employing its functionality became a hands-on lesson, as reading about or listening to someone describe these kinds of things isn't always sufficient to create an understanding.  The use of SQL expressions in Python, and the necessity of "field delimiters" within them, is another case in point from this week's assignment.  The plethora of capabilities of the humble list within a Python script is also not immediately evident, but when iterated through with a for loop this unassuming structure can be used to complete many a useful task.  In the end, though, after all of the frustration of acquiring these skills, the objective has been met- the concepts introduced in this series of lessons are now fully grasped, and a working understanding of the material is the prize.  That renders the difficult process of getting there completely worthwhile.
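The field-delimiter business can be sketched briefly. arcpy.AddFieldDelimiters is the real helper for this; the fallback below just mimics the double-quote convention used by shapefiles and file geodatabases, and the file and field names are made-up examples:

```python
# Sketch: building a where clause with proper field delimiters.
try:
    from arcpy import AddFieldDelimiters
except ImportError:
    def AddFieldDelimiters(datasource, field):
        # Shapefiles and file geodatabases delimit fields with
        # double quotes; other formats differ, hence the helper.
        return '"%s"' % field

def make_where(datasource, field, value):
    """Build a clause such as "FEATURE" = 'County Seat'."""
    return "%s = '%s'" % (AddFieldDelimiters(datasource, field), value)

where = make_where("cities.shp", "FEATURE", "County Seat")
```

A clause built this way can then be passed as the where_clause argument of a search cursor, so the query syntax stays valid regardless of the data format.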
     

Sunday, July 5, 2015

Sea Level Rise and Coastal Flooding

Weather events have the ability to impact large numbers of people in this country in very drastic ways.  This was made very evident with Hurricane Sandy's landfall along the eastern seaboard in 2012, and hopefully brought some attention to the fact that any increase in sea level, either on its own or in combination with a large weather event, can displace, harm and cause great economic impact to a large portion of the population.  Planning and analysis of the likelihood and nature of these kinds of potential catastrophes is essential.



Hawaii, because of its isolated location in the warm tropical waters of the central Pacific, is especially vulnerable to water-related catastrophes such as tsunamis and hurricanes.  On all of the Hawaiian Islands, many of the schools and other public facilities located at the lower elevations nearest the shore run tsunami drills in order to prepare for these events.  The above map details some of the potential impact of a 6-foot sea level rise on the District of Honolulu, on the southeastern edge of the island of Oahu.  The elevation of the district is measured, and the extent of a given rise in water level can be anticipated by isolating those areas below the corresponding threshold.  The potential depth of the water at these lower shoreline locations can be calculated as well, and overlaid with various census measures to better describe the characteristics of the population directly affected.  The analysis completed in the creation of this map revealed that the total population within the 6-foot sea level rise flood zone is 60,005 persons, the majority of whom are Caucasian, under the age of 65, and do not own the residence they occupy.  These kinds of measures are important in modeling the impacts of sea level rise, as awareness of especially vulnerable populations within the flood zone is necessary to anticipate the measures needed to protect the affected population at large.
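The threshold-and-depth step reduces to simple cell-wise arithmetic. In the sketch below, nested lists stand in for the elevation raster and the elevation values are purely illustrative:

```python
# Sketch: flag cells below a sea-level-rise threshold and compute the
# water depth (rise minus elevation) in the flooded cells.
def flood_depth(elev_grid, rise_ft):
    """Return depth where flooded (elevation below the rise), else None."""
    return [[round(rise_ft - e, 2) if e < rise_ft else None for e in row]
            for row in elev_grid]

elev = [[2.0, 8.0], [5.5, 1.0]]          # feet above sea level (example)
depth = flood_depth(elev, 6.0)           # -> [[4.0, None], [0.5, 5.0]]
```

The flooded (non-None) cells define the flood-zone extent, which can then be overlaid with census units to tally the affected population as described above.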