Saturday, June 23, 2012

Module 4 - Review of Python Syntax

Greetings fellow Argonauts,

This week’s assignment was a very good overview of some of the critical Python skills we will need to improve to (eventually) become GIS Development Masterminds.  I am lucky enough to have a background in programming, and I can imagine that if you had never programmed before, these exercises might be difficult and frustrating.  All I can do is tell you to hang in there!  Completing a project (in the real world just as in a class) is all about iterative problem solving.  Rarely will something work the first time, and it will usually take some research time to solve the problem (Thanks Google!).  Trust me, your programming skills WILL improve with each problem that is placed before you.  Your mind may not be used to thinking like the computer, so it is almost like a workout at the gym for your brain.  At first you won’t be able to do much, but with persistence you will amaze yourself (and your coworkers).  So hang in there, use the message board, use the Internet, and talk to others you know who might be struggling with similar problems.

Now on to what I learned this week:

1.      I REALLY prefer PythonWin over IDLE.  I leaned on the copy/paste and find/replace functionality heavily, especially during the 4th assignment.

2.      You should familiarize yourself with some of the basic Python functions.  One of the first things I do when I learn a new language is look for a Quick Reference or Cheat Sheet that gives me a list of the basics.

3.      Lists are a very useful data type that I wasn’t really familiar with from my other programming languages.  I can foresee them being very useful in ArcGIS when we start thinking about layers, features, etc.

4.      Some of the skills you are learning in this class are going to help you learn other languages (if you choose to go that path).  The “import module” pattern, for example, and the “.” hierarchy of functions show up in several languages.

5.      You WILL see CSV files in the real world quite a bit, so that lesson is very useful (there is a short sketch after this list that ties CSV files, lists, and loops together).

6.      Reading and writing text files is also a very valuable skill that will come up quite a bit in your development duties.

7.      Loops are another great place to use a cheat sheet to remember the format (although the indentation format is kind of growing on me).
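
To tie a few of these together (lists, CSV files, text output, and loops), here is a minimal sketch of the kind of pattern I mean.  The file name and the NAME/POP columns are made up purely for illustration.

    import csv

    # Hypothetical input file - the name and the NAME/POP columns are made up
    csv_path = "cities.csv"

    cities = []                          # a list to hold one dictionary per row
    with open(csv_path) as f:
        reader = csv.DictReader(f)       # uses the header row for field names
        for row in reader:               # a basic for loop over the file
            cities.append(row)

    # Loop over the list and write a simple report to a text file
    with open("report.txt", "w") as out:
        for city in cities:
            out.write("%s has a population of %s\n" % (city["NAME"], city["POP"]))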
 

Preparing MEDS

This week's assignment was about MEDS (Minimum Essential Data Sets).    Our task was to put together some critical data for NORAD.    Here is a screenshot of my layers at the completion of this assignment:

Thursday, June 21, 2012

Participation Assignment

The article I chose to report on concerned the Ogden, Utah Police Department’s use of GIS technology to improve their ability to handle the “suppression, detection, and investigation of crime”.  The Department has created what they call a Real Time Crime Center (RTCC).  This system is the result of over 10 years of improvements.

The goal of their project was to maximize the efficiency of their staff of 144 sworn officers in their community of 85,000 residents.  In the past they had used the ESRI line of products to analyze their criminal incident information, but it wasn’t until 2008 that they started deploying what they call “significant enhancements”.  These included dynamic hot spot mapping (which shows the intensity levels of criminal activity spatially), improved web access (making the information easier to retrieve for its officers), and Automated Vehicle Location (which allowed them to see the location of all their vehicles in real time).  Another technology they embrace, from an ESRI partner, is called “Pictometry”.  This allows their staff to look at aerial imagery not only from the top-down view, but also obliquely from all four directions.  This helps them detect features such as cover where a criminal might be hidden.  Pictometry is of special interest to me, as I have started using it as part of my internship.  In the news recently it looks like Apple and Google will both be tackling this type of imagery, so it will be interesting to see if that technology can be implemented in Ogden as well.

The Ogden Police Department really took the RTCC to new levels of usefulness in 2011 when they began to integrate other data into the system, such as local warrants, criminal history, jail data, arrest affidavits, and property information.  This helped improve the frequency of data retrieval from the system from monthly (in some cases) down to near-real time.  They have developed their systems to be standards compliant with the Microsoft Fusion Core Solution (MFCS), which is the result of a partnership between Microsoft and ESRI.  They describe the MFCS system as “an easy-to-use, quick-to-configure solution that combines the robust capabilities of the Esri ArcGIS Advanced Enterprise Server and Microsoft Office SharePoint Server 2010.”  They can add data sources from numerous third-party data providers, such as surveillance cameras, a blimp, and briefing notes.

This article really inspired me to look at some of the processes where I work currently.  We gather real time data in several systems that could be really beneficial to see in a near real-time spatial web application.   We generate some reports for board meetings that would be a whole lot more useful if they were pulled up on a live web site.   Now if I only had the free time…..



Washington DC Crime Mapping

Here is my latest work on the Homeland Security project for Washington DC Crime mapping.


And here is my Process Summary for parts 3 & 4 (which were the only parts required this week).

STEP 3: Produce a Map of Police Stations with Crime Proximity
1) Used my DCMaps template to create my new crime2.mxd document
2) Exported some of my symbolized data to layer files so I could use them more easily
3) Ran the Multiple Ring Buffer tool from the Analysis tools in the toolbox, set at 0.5, 1, and 2 miles (see the arcpy sketch after this list)
4) On the next step, it would not let me add a field on the Crime data until I exported it to be part of the GDB file.
5) Set the new Event field to 1 using the calculator
6) Performed my spatial join to create the buf_crime
7) Removed buffer from the TOC
8) Answered Q7 & Q8
9) On Q9 I have an interesting issue.   My sum of events doesn’t match my original count of crimes (off by 61).   So I am going to do my best to answer Q9.
10) Answered Questions 9-12
11) Prepared my deliverable (I went with an 11x17 map, I hope that is acceptable).
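
The Step 3 geoprocessing above could also be scripted.  Here is a rough arcpy sketch of the same workflow, assuming ArcGIS 10.x; the workspace path and feature class names (police_stations, crime) are made up for illustration, and the Sum merge rule on the Event field is easier to set up interactively, so treat this as an outline only.

    import arcpy
    arcpy.env.workspace = r"S:\DC_Crime\crime.gdb"   # hypothetical geodatabase
    arcpy.env.overwriteOutput = True

    # Multiple ring buffers at 0.5, 1, and 2 miles around the stations
    arcpy.MultipleRingBuffer_analysis("police_stations", "station_buffers",
                                      [0.5, 1, 2], "Miles")

    # Add an Event field to the crime points and set every record to 1
    arcpy.AddField_management("crime", "Event", "SHORT")
    arcpy.CalculateField_management("crime", "Event", "1", "PYTHON")

    # Spatial join so each buffer ring picks up the crime events that fall in it
    arcpy.SpatialJoin_analysis("station_buffers", "crime", "buf_crime",
                               "JOIN_ONE_TO_ONE", "KEEP_ALL")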

STEP 4: Produce Density Maps of Burglaries, Homicide, and Sex Abuse Crimes
1) Created my crime3.mxd
2) Added the Crimes to it (with the added Events column)
3) Selected all the burglaries
4) Answered Question 13
5) Ran the Spatial Analyst density tool on the crime data using the different search radius values (see the sketch after this list)
6) Saved the 1500 version out as a layer file
7) Answered Q14
8) Created my additional data frames and set the coordinate systems all to match.
9) Next worked on homicide and Sex Abuse maps
10) Answered Q15
11) Answered Q16
12) Prepared the map deliverable for this assignment.
13) Posted to dropbox and blog.
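
I ran the density tool interactively, but for the record, a scripted kernel density version might look roughly like this.  It assumes the Spatial Analyst extension; the layer name, output names, and the 500/1000/1500 search radii are just illustrative.

    import arcpy
    from arcpy.sa import KernelDensity

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"S:\DC_Crime\crime.gdb"   # hypothetical geodatabase

    # Run the density calculation with a few different search radii to compare
    for radius in [500, 1000, 1500]:
        density = KernelDensity("burglaries", "NONE", search_radius=radius)
        density.save("burg_density_%d" % radius)

    arcpy.CheckInExtension("Spatial")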

Thursday, June 14, 2012

Intro to Python - Module 3


Summary of what I learned:

Exercise 3a:

·        While the Shell is good for a lot of things, working on multiple-line code segments is not one of them.

·        One of the big benefits of working in a Text Editor is the global Find/Replace functionality that we have become accustomed to.

·        Notepad++ can be temperamental.  Whenever I tried to work with it, it would muck up my paths to my S: drive, not only from within Notepad++ but even from Windows Explorer.

Exercise 3b:

·        It is important to recognize places where you will need to use both single and double quotation marks, such as in contractions used in text strings.

·        Embedded variables are an extremely useful yet simple way to personalize your text strings (see the sketch after this list).

·        Variable naming can lead to a lot of confusion.   You want to make sure to check for proper case sensitivity to ensure that you reference the correct variable.
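
Here is a tiny sketch of the quoting and variable-embedding points above; the names are made up.

    # A contraction forces you to mix quote styles (or escape the apostrophe)
    message = "It's a good idea to plan out your quotation marks"

    # Embedding variables keeps the text personalized
    teacher = "Mr. Smith"
    course = "GIS Programming"
    print("Welcome to " + course + ", taught by " + teacher)
    print("Welcome to %s, taught by %s" % (course, teacher))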

Exercise 3c:

·        Be sure to take full advantage of the auto-complete functionality that ArcGIS gives you with Python.

·        In addition to auto-complete, the ArcGIS Python window also offers drag-and-drop functionality to create code for you.

·        Don’t forget to import your arcpy library!

·        Be mindful of your environment settings.  These are key to finding all the awesome data results you have created! (See the short sketch below.)
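
A minimal sketch of those two points, with a workspace path that is just an example:

    import arcpy

    arcpy.env.workspace = r"S:\GIS_Data\Module3"   # where inputs live and outputs land
    arcpy.env.overwriteOutput = True               # lets you rerun a script without errors

    print(arcpy.env.workspace)                     # confirm where your results will show up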

Assignment 3a:

·        Python has great tools for automating repetitive tasks such as directory creation (see the sketch after this list).

·        Lists are a very useful concept that can help reduce “code-bloat”.
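
This is not the assignment itself, just the pattern: a short sketch of folder creation driven by a list, with a made-up root path and folder names.

    import os

    root = r"S:\GIS_Projects"                      # hypothetical root folder
    folders = ["data", "scripts", "results", "docs"]

    for name in folders:
        path = os.path.join(root, name)
        if not os.path.exists(path):               # skip folders that already exist
            os.makedirs(path)
            print("Created " + path)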

Assignment 3b:

·        String literals are very helpful in this assignment; they really help minimize the time it takes to format your strings.

·        You can use a “while” loop to make this code more streamlined (see the sketch after this list).

·        Using the data in a LIST format really helps with readability.
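
Again, not the graded code, just the pattern it relies on: a while loop stepping through a list and printing formatted strings.  The names and numbers are made up.

    players = [("Bart", 95), ("Lisa", 88), ("Maggie", 72)]

    i = 0
    while i < len(players):
        name, score = players[i]
        print("%s scored %d" % (name, score))   # string formatting keeps this tidy
        i += 1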

Challenge Exercise:

·        For this exercise, you can set the parameters using the ArcGIS interface rather than trying to do it from within the Python script.

·         You can then read these parameters as follows:    inYear = gp.GetParameterAsText(0)
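
For anyone scripting this, the newer arcpy module exposes the same call as the old gp geoprocessor object.  A minimal sketch (the second parameter and the message are hypothetical):

    import arcpy

    inYear = arcpy.GetParameterAsText(0)      # first parameter from the tool dialog
    outFolder = arcpy.GetParameterAsText(1)   # second parameter (hypothetical)

    arcpy.AddMessage("Processing year %s into %s" % (inYear, outFolder))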

This was a fun assignment and challenge exercise.  I am looking forward to what comes next!

Wednesday, June 13, 2012

Natural Hazards - Tsunamis

This week's assignment looked at how GIS can be used with regard to tsunamis.

Here is my map and process summary for this week:

Process Summary

Part I: Building a File Geodatabase
1. Copied the data from the R: drive and unzipped
2. Started ArcCatalog
3. Documented Metadata
4. Navigated to my S:\GIS4048-ApplicationsInGIS\W4-NaturalHazards-Tsunamis\Tsunami directory
5. Previewed the Roads shapefile to make sure the Preview tab works properly.
6. Created the Tsunami.gdb File Geodatabase
7. Created the Transportation Feature Dataset and set the projected coordinate system to WGS 1984 (see the arcpy sketch at the end of this part)
8. Added the Rails_UTM and the Roads_UTM shapefiles to the File Geodatabase
9. Started ArcMap
10. Now we need to convert the data in Excel into a spatial feature
11. First I added the Excel Sheet as a table
12. Next I right-clicked it and chose “Display XY Data”
13. I chose to import the Coordinate system and chose my File Geodatabase and the Feature Dataset as the place to import from.
14. Added the Administrative and Damage_Assesment feature datasets and loaded the recommended shapefiles into each.
15. Made sure my File Geodatabase matched the lab instructions
16. Opened ArcToolbox.
17. Ran the Build Raster Attribute Table tool on each of the DEMs
18. Created the FukuDEM and SendDEM Raster Datasets
19. Load Data process completed
20. Calculated Statistics on both DEMs.  It is interesting to try to preview the data before running the statistics.   It appears solid black.   Post processing it appears normal.
21. Took a screenshot and prepared it as my first deliverable
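
For reference, here is a rough arcpy outline of the Part I geodatabase build.  The paths and names follow my notes above, but a few details are assumptions, so treat this as a sketch rather than the lab procedure.

    import arcpy

    folder = r"S:\GIS4048-ApplicationsInGIS\W4-NaturalHazards-Tsunamis\Tsunami"
    arcpy.CreateFileGDB_management(folder, "Tsunami.gdb")
    gdb = folder + "\\Tsunami.gdb"

    # Feature dataset, importing the coordinate system from an existing shapefile
    sr = arcpy.Describe(folder + "\\Roads_UTM.shp").spatialReference
    arcpy.CreateFeatureDataset_management(gdb, "Transportation", sr)

    # Load the transportation shapefiles into the feature dataset
    arcpy.FeatureClassToGeodatabase_conversion(
        [folder + "\\Rails_UTM.shp", folder + "\\Roads_UTM.shp"],
        gdb + "\\Transportation")

    # Build attribute tables and statistics for the DEMs once they are loaded
    for dem in ["FukuDEM", "SendDEM"]:
        arcpy.BuildRasterAttributeTable_management(gdb + "\\" + dem)
        arcpy.CalculateStatistics_management(gdb + "\\" + dem)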

Part II: Fukushima Radiation Exposure Zones
1. Started ArcMap
2. Opened Tsunami.mxd
3. Exported the Fukushima II Power Plant to its own Feature Class
4. Answered Q1
5. Created my Multiple Ring buffer.  Pretty cool tool! (See the arcpy outline after this list.)
6. Adjusted my Symbology
7. Clipped the Roads feature
8. Did a Select by Location query to answer Q2
9. Exported these cities out to Evac_zone_cities
10. Posted my screenshot for Q3
11. Answered questions 4 through 7 using that same tool and the statistics tool for the sum of population
12. Worked on my map deliverable
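
Here is an arcpy outline of the Part II geoprocessing.  The layer names and the ring distances are placeholders (use the values from the lab), so this is only a sketch of the sequence.

    import arcpy
    arcpy.env.workspace = r"S:\GIS4048-ApplicationsInGIS\W4-NaturalHazards-Tsunamis\Tsunami\Tsunami.gdb"

    # Evacuation zones around the plant (feature name and distances are placeholders)
    arcpy.MultipleRingBuffer_analysis("Fukushima_II_Plant", "evac_zones",
                                      [5, 10, 20], "Miles")

    # Clip the roads to the zones, then select the cities that fall inside
    arcpy.Clip_analysis("Roads_UTM", "evac_zones", "evac_roads")
    arcpy.MakeFeatureLayer_management("Cities", "cities_lyr")
    arcpy.SelectLayerByLocation_management("cities_lyr", "INTERSECT", "evac_zones")
    arcpy.CopyFeatures_management("cities_lyr", "Evac_zone_cities")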

Thursday, June 7, 2012

SWOPPAT Announcements - New Tool

It has come to our attention that there has been a great demand for a tool that will allow SWOPPAT staff to add two additional columns to their shapefiles: Acres and HectaAcres.  We at SWOPPAT GIS Programming have made this possible with the new SWOPPATAAHACT, “The Save the World One Python Program at a Time Acre and HectaAcre Calculation Tool” (™ and © pending).

This tool can be accessed from your toolbar as follows:



This tool can save you countless hours.  Let's say, for example, that you create 3 shapefiles per week, and that calculating the Acres and HectaAcres takes you 3 minutes each time.  That would be 9 minutes a week.  When you look at this over the course of the year, you see that this tool can save you 468 minutes a year.  That is over 7 and a half hours a year, almost an entire work day!
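
For the curious, a calculation like this can be scripted with field creation plus the geometry area keywords in the field calculator.  Here is a simplified sketch, assuming ArcGIS 10.x, a projected shapefile, and a made-up input path (the HectaAcres column here is filled with hectares):

    import arcpy

    shp = r"S:\data\parcels.shp"          # hypothetical input shapefile

    for field, expr in [("Acres", "!shape.area@ACRES!"),
                        ("HectaAcres", "!shape.area@HECTARES!")]:
        arcpy.AddField_management(shp, field, "DOUBLE")
        arcpy.CalculateField_management(shp, field, expr, "PYTHON")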


Natural Hazards - Hurricanes

This week's assignment asked us to look at the damage caused by Hurricane Katrina in three counties in Mississippi.  There are three maps, a graph, and a table that I have submitted here, along with a Process Summary.
Deliverable #1- A map showing elevation/bathymetry of the Mississippi Coast counties with places, types of water, barrier islands, and hydrography.
Deliverable #2 - A map of flooded land of the Mississippi Coast after Hurricane Katrina.
Deliverable #3 - A map showing infrastructure and health facilities at risk from storm surge.

Deliverable #4 - A bar graph showing percentage of total flooded land by land-cover type.

Deliverable #5 - A table showing various land types that were flooded measured in acres and square miles.
Process Summary

1.      Set up my directory on S: Drive and copied over the necessary data from the R: drive
2.      Answered Questions 1-4 (Returned to them at several points during the lab)
3.      Opened ArcCatalog and previewed the data in both spatial and tabular form
4.      Examined the Metadata to complete the chart as part of Q5
5.      Organized my directory structure so that I have a S:\GIS4048-ApplicationsInGIS\W3-NaturalHazards-Hurricanes\Hurricanes\Project1\results path for my results
6.      Created a new map document in ArcGIS and called it coast1
7.      Created my Map Document Properties and set to relative paths.
8. Saved work
9.      Set my coordinate system by importing the coordinate system from the land cover feature
10.  Set up my Environment in Toolbox to handle current and scratch workspaces, Output Coordinate System, Raster Analysis settings and the mask to base the results on the 3 counties of interest
11.  Added the elevation raster.
12.  Answered Q6
13.  Symbolized the color ramp for the Elevation.   The tip about right clicking to see the names will come in handy at work and at my internship.
14.  Added the Counties and set up the labels
15.  Answered Q7
16.  Added the Places, symbolized, and answered Q8
17.  Added Islands to the map
18.  Added swamps and rivers and symbolized appropriately.
19.  This will be the base map that will give us a starting point for the rest of the assignment
20.  Answered Q9 and 10
21.  Saved my map (will come back to it later to get it ready to be a deliverable, as it is currently very busy)
22.  Created Coast2.mxd
23.  Documented the map, set the data frame properties, and set the environments just like the previous map
24.  Turned on Spatial Analyst
25.  Used the Math Tools in Spatial Analyst to convert meters to feet
26.  Used the Logical toolset to get the lands where elevation is less than or equal to 15 feet and saved it as flooded_land (see the Spatial Analyst sketch at the end of this summary)
27.  Added Landcover
28.  Reclassified Landcover using the Spatial Analyst reclassify tool.
29.  Used Editor to label the reclassified data
30.  Saved the reclassified data as a layer file
31.  Added the counties and labeled them
32.  Obtained the Statistics on the Count field
33.  Answered Q12
34.  Created a new field called “Percent” as float
35.  Used the field calculator to set this new column to equal COUNT/1380797*100
36.  Verified that this totaled to 100% using the statistics tool.
37.  Saved my coast 2 document and graph.   I will come back to prep it for delivery later
38.  Created my coast3.mxd and set the description, coordinate system information, and environment
39.  Added my infrastructure data to the MXD and symbolized appropriately.
40.  Answered Q15 and Q16 (I found a cool Spatial Analyst tool for this step: Spatial Analyst Tools | Extraction | "Extract Values to Points")
41.  Started my coast4.mxd document
42.  Documented the map, set the coordinate info, and set the environment settings.
43.  Prepared my deliverables and Blog Post Entries.
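
To close out, here is a rough map-algebra sketch of the flooded-land steps above (meters to feet, the 15-foot threshold, and the land cover reclassification).  It assumes the Spatial Analyst extension; the raster names and the reclassification values are illustrative only.

    import arcpy
    from arcpy.sa import Raster, Reclassify, RemapValue

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"S:\GIS4048-ApplicationsInGIS\W3-NaturalHazards-Hurricanes\Hurricanes\Project1"

    elev_ft = Raster("elevation") * 3.2808       # convert meters to feet
    flooded = elev_ft <= 15                      # 1 where flooded, 0 elsewhere
    flooded.save("flooded_land")

    # Reclassify land cover into simplified classes (old/new values are placeholders)
    landcover_rc = Reclassify("landcover", "Value",
                              RemapValue([[11, 1], [21, 2], [22, 2], [41, 3]]))
    landcover_rc.save("landcover_rc")

    arcpy.CheckInExtension("Spatial")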