Special Topics in GIS - Lab 2: Statistics - Analyze
This week's assignment involved doing some statistical analysis on the data we put together last week on meth lab locations. We used Ordinary Least Squares (OLS) regression, which was a first for me. I am not sure I mastered it, but hopefully I'll get some good feedback on my assignment and learn more about this powerful tool that GIS users have at their disposal.
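Since the tool was new to me, I tried writing down what the same run would look like scripted with arcpy. This is only a sketch: the workspace, layer name, and field names (UID, MethRate, and the explanatory variables) are placeholders, not the actual lab data.

import arcpy

arcpy.env.workspace = r"C:\GIS\SpecialTopics\Lab2"  # placeholder workspace

# Ordinary Least Squares from the Spatial Statistics toolbox; the output
# feature class gets the residual fields used for the StdResiduals map.
arcpy.OrdinaryLeastSquares_stats(
    "meth_labs",                     # input feature class (placeholder name)
    "UID",                           # unique ID field (placeholder)
    "meth_labs_ols",                 # output feature class
    "MethRate",                      # dependent variable (placeholder)
    "Income;Renters;PopDensity")     # explanatory variables (placeholders)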
For the blog, we are asked to post our OLS Results table and StdResiduals Map. Here they are:
This site was created as a portfolio for David Jenness. The work on this blog is for the University of West Florida's GIS Online Certificate Program in 2012.
Thursday, August 9, 2012
GIS 4048 Final Project Presentation
For my Final Project, I chose to use GIS to help decide where to place a new dog park in Leon County, Florida to best serve the population.
For this project we had to create a Process Summary, a PowerPoint Presentation (along with supporting maps) as well as a Slide by Slide summary in PDF form.
Link to Slide by Slide Summary
Link to PowerPoint Presentation
I experimented with ArcGIS Explorer Online for extra credit, but was unable to get the presentation to a point where I was happy enough with it to turn it in. The raster files especially were giving me trouble, and I had too many other things going on to dive into it any deeper.
Saturday, July 21, 2012
Urban Planning: Location Descriptions
This week's assignment helped us use criteria provided by a couple to determine the optimal neighborhood to live in when moving to Alachua County, Florida. We were tasked with providing 3 maps: a basemap, a map showing the 4 criteria (proximity to a hospital, proximity to UF, percentage of population between ages 40-49, and median house value), and a final map that took these multiple considerations into account.
Process Summary
From the “Step 4: Analysis” major section, include your process for Steps 7 and 8 (weighted analysis).
1) Saved my Location2 as Location3.mxd
2) Removed all data frames (Adding a new frame and importing the Projection information first)
3) Added the 4 feature classes that we created in the previous section to the MXD (hosp, school, rightage, and value)
4) Cleaned up the symbology for these layers
5) Started Model Builder
6) Added the 4 feature classes to Model Builder
7) Added the Weighted Overlay tool to the model
8) Connected the 4 feature classes to the Weighted Overlay box
9) Answered Q13
10) Answered Q14
11) Answered Q15
12) Answered Q16
13) Made sure the scale was set in the correct (reverse) order
14) Note: when reordering the scale, be careful that it doesn’t retain the last value instead of replacing it with the first value
15) Gave each feature class a 25% weight (this overlay step is sketched in code after this list)
16) Gave the output the name WEIGHT1 and stored in the GeoDatabase
17) Ran the model
18) Added Weight 1 to the display and closed the model
19) Added Places to the mxd
20) Added Sel_tracts and made hollow
21) Added the school and hospital to the project and symbolized accordingly
22) Selected the three best areas that met the criteria and exported them out as Weight1Best3
23) Set up my model with the revised parameters making sure that my scales were set properly.
24) Set up the map layout
25) Answered Q17
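For reference, here is roughly how the weighted overlay portion of this model (steps 7 through 16) would look scripted with arcpy. Treat it as a sketch: the remap values, the 1-to-9 evaluation scale, and the geodatabase path are all assumptions, and the inputs are just the four layers by name.

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")

# Placeholder remap standing in for the reversed scale set in steps 13-14.
remap = RemapValue([[1, 9], [2, 8], [3, 7]])

# Each of the four inputs gets a 25% influence, as in step 15.
wo_table = WOTable(
    [["hosp",     25, "VALUE", remap],
     ["school",   25, "VALUE", remap],
     ["rightage", 25, "VALUE", remap],
     ["value",    25, "VALUE", remap]],
    [1, 9, 1])  # assumed evaluation scale: 1 to 9 by 1

weight1 = WeightedOverlay(wo_table)
weight1.save(r"C:\GIS\Location3.gdb\WEIGHT1")  # hypothetical path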
Saturday, July 14, 2012
Urban Planning - GIS for Local Government
This week's multipart exercise had a lot of interesting aspects to it. Having been around offices that have dealings with parcel information, I can vouch for the usefulness of this type of knowledge.
Without further delay, here are some of my maps from this week's assignment:
Process Summary
OBJECTIVE 2:
1) Created a new blank MXD and saved as Parcels.mxd
2) Added the Parcels Feature Class from the results.gdb
3) Looked at the attributes and noticed the owner names are missing.
4) Opened the CSV in Excel and deleted the last column since it is unsupported.
5) Saved and closed the CSV
6) Added the CSV to the map
7) I had some problems on my join until I renamed the PARCEL field in Excel to PARCELID. Then my join seemed to work properly.
8) Exported my join out to the Results.gdb as Parcels_Join
9) Selected the key parcel by attribute and exported it as Zuko_Parcel to the Results.gdb
10) Did a select by location to choose all the parcels that intersected our parcel. Answered Q10
11) Exported these out as ADJ_Parcels
12) Added the zoning feature class
13) Clipped the zoning feature class to the adjacent parcels
14) Created the Parcels_Zone feature class by using the Intersect tool
15) Answered Q11
16) Added a MAPKEY Field to the table and set it to be the same value as the OBJECTID_1 field
17) Having a problem because when I intersect the Adjacent Parcels and the Zoning, I end up with 25 features instead of 9. Most of the excess are small slivers where the polygons don’t quite match up.
18) I was able to get around this in an unusual way. I created a JOIN on my Adjacent Parcels with the Parcels_Zone feature class using the MAPKEY field. This gave me the ZONING info I needed. Not sure if this was the best approach overall, but it worked in this case.
19) Update: it didn’t work in this case after all. The zoning appears to be wrong.
20) I finally had to move past this problem by setting a tolerance of 5 feet, which returned 10 records (two for parcel 9). I deleted the smaller of these parcels to get down to my count of 9. Still not sure why this was the case, but I needed to move on (see the sketch after this list).
21) Added all my required map elements to prepare for deliverable
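For anyone who hits the same sliver problem, the step 20 workaround boils down to the Intersect tool's cluster tolerance parameter. A sketch, with placeholder names and paths (5 feet is simply the value that worked for me):

import arcpy

# The cluster tolerance snaps nearly coincident boundaries together before
# intersecting, which is what eliminated the sliver polygons.
arcpy.Intersect_analysis(
    ["ADJ_Parcels", "zoning"],           # inputs (placeholder names)
    r"C:\GIS\Results.gdb\Parcels_Zone",  # output (hypothetical path)
    "ALL",                               # carry over all attributes
    "5 Feet")                            # XY cluster tolerance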
OBJECTIVE 3:
1) Saved the MXD as Zoning_DDP to start next section
2) Ran the Grid Index Features from the Data Driven Pages part of the Cartography Tools and got my 24 results
3) Ran the Select by location to weed these indexes down to 12
4) Exported these records out to a Zoning_Index table
5) Added my zone descriptions to the table
6) Created a new Data Frame to separate the index from the content
7) Added the Data Driven Pages toolbar
8) Ran the Data Driven Pages Setup wizard
9) Added the Page number Dynamic Text
10) Added the locator map and set the properties to see which section we are currently working on.
11) Added dynamic text for a few fields
12) Added my Required Map Elements
13) Exported out to a PDF for all 24 pages (a scripted version is sketched after this list)
14) I noticed that it changed the settings on my scale bar from .1 to .11, which I don’t like. Not sure how to fix this problem, unfortunately; I looked at my original document and it still shows .1.
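The export in step 13 can also be scripted with arcpy.mapping, which is handy if the map book ever needs to be regenerated. A minimal sketch with placeholder paths:

import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\GIS\Zoning_DDP.mxd")             # hypothetical path
mxd.dataDrivenPages.exportToPDF(r"C:\GIS\Zoning_MapBook.pdf", "ALL")  # all pages
del mxd  # release the document lock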
Sunday, July 8, 2012
Urban Planning - GIS for Local Government: Participation Assignment
Part I - Research Local Property Appraisal Services
Q1: Suwannee County Florida has a Property appraiser site at: http://www.suwanneepa.com/
Q2: Due to the recent flooding and the limitations of a smaller staff, the most recent historical data seems to be March 2012 (for the month of February) rather than June. The highest sale price for a property was $756,000. The previous sale price was $1,000,000 in 2005.
Q3: The assessed land value is $201,480. You can click on the Tax Estimator to see that the taxes are all paid up (and pretty reasonable).
Q4: It is interesting to see the decline in price since 2005. I also don’t understand enough about how the taxable values work. Even though the property still sold for over three-quarters of a million dollars, the “Land Value” is only at $201,480 and the “Taxable” value is at $24,039.
Part II - Mapping Assessment Values
Here is my map for this participation assignment:
Question 1:
The beauty of this method of showing parcels is that you can see several parcels that stand out from the rest. If I had this data in front of me, my first question would be about the blue and green parcels. You mentioned in the assignment that these are lands that are not available for development. After ruling those out, my next questions would be:
- Yellow Parcel: Ref No 071S312000001001 - Why is the land value only $24,700?
- Red Parcel: Ref No 071S312000013001 - Why is the land value $33,250?
- Shouldn’t they match the majority of the other parcels, which are valued at $27,075?
- After I answered the questions on these, I would look at all the parcels that are light orange (5 of them). Why is their value $24,983 vs. the $27,075 of the majority of the neighborhood?
There may be a very good answer for these questions, but the point of the exercise is to show how this makes the review process much easier.
Wednesday, July 4, 2012
Homeland Security: Protect
Our assignment this week was to make several maps concerning the security of the NORAD facility. We were also asked to post one of our maps here with a brief description:
This is my map for the Heliport location near the facility.
Here are some key steps from the Process Summary:
III. Site surveillance Locations around Critical Infrastructure
i. Generate Hillshade
1. Created a new MXD
2. Turned on the 3D Analyst Extension
3. Turned on the 3D Analyst Toolbar
4. Added the Elevation DEM
5. Ran the Hillshade 3D tool with Azimuth 270 and Altitude 39 (sketched in code after this list)
6. Added the Orthoimagery data layer and set the transparency to 60%
7. Saved the MXD
8. Launched ArcCatalog
9. Created the DJ_Surveil_pts shapefile
10. Added to my MXD
11. Changed symbology to a 12-point red triangle
12. Restored the Orthoimagery transparency to 0
13. Started editing the DJ_Surveil_pts shapefile
14. Added about 13 points to the MXD and saved edits
15. Saved the Map
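Here is a quick sketch of the hillshade step (step 5) in arcpy, assuming the DEM layer is simply named "elevation"; the azimuth and altitude are the values from the lab.

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")

hillshade = Hillshade("elevation", 270, 39)  # azimuth 270, altitude 39
hillshade.save(r"C:\GIS\hillshade")          # hypothetical output path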
ii. Generate Viewshed
1. Selected Elevation in the 3D Analyst Toolbar
2. Zoomed to the extent of the DEM and turned off the Orthoimagery
3. Set the input raster to the DEM, and the DJ_Surveil_pts shapefile as the observer features
4. After the viewshed ran, set the transparency level on it to 50%
5. Turned on the Orthoimagery and zoomed to its extent
6. Most of my imagery was covered by a green area, which meant line of sight was good.
7. Added a field to the shapefile called OFFSETA
8. Edited the shapefile and used the calc tool to set this new value to 10 for all 13 points.
9. Ran the new viewshed to DJ_viewshed10 (steps 7-9 are sketched in code after this list)
10. You can see with this new viewshed that more area is visible.
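Steps 7-9 scripted, as promised. The nice part is that the Viewshed tool reads the OFFSETA field from the observer points automatically, so adding the field and recalculating is all it takes to raise the observers; layer names and paths are placeholders.

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")

# Steps 7-8: add the observer height field and set it to 10 for every point.
arcpy.AddField_management("DJ_Surveil_pts.shp", "OFFSETA", "DOUBLE")
arcpy.CalculateField_management("DJ_Surveil_pts.shp", "OFFSETA", "10", "PYTHON")

# Step 9: rerun the viewshed; OFFSETA is picked up from the attribute table.
vs10 = Viewshed("elevation", "DJ_Surveil_pts.shp")
vs10.save(r"C:\GIS\DJ_viewshed10")  # hypothetical path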
iii. Create line of sight
1. Turned off all layers except viewshed10, Surveil points, and Orthoimagery
2. Created several line of sight elements
3. Saved my map
4. Selected one of my lines of sight elements and used the “Create Profile Graph” to show the line of sight
5. Changed my titles on my graph
6. Exported graph and added it to the layout to produce my next deliverable.
v. View 3D line of sight
1. Started ArcScene
2. Added the elevation data
3. Added the Orthoimagery data
4. Opened the Base Heights tab
5. Set to “Floating on a custom surface” and chose the DEM
6. Zoomed to the extent of the OrthoImagery. A little slow given the remote connection, but VERY cool!
7. Went back to ArcMap without closing ArcScene
8. Copied and pasted my line of sight to get a 3D view of the visibility data that we generated.
9. Exported out to JPG
10. Started cleaning up my deliverables for submittal
Saturday, June 23, 2012
Preparing MEDS
This week's assignment was about MEDS (Minimum Essential Data Sets). Our task was to put together some critical data for NORAD. Here is a screenshot of my layers at the completion of this assignment:
Thursday, June 21, 2012
Participation Assignment
The article I chose to report on concerned the Ogden, Utah Police Department’s use of GIS technology to improve their ability to handle the “suppression, detection, and investigation of crime”. The Department has created what they call a Real Time Crime Center (RTCC). This system is the result of over 10 years of improvements.
The goal of their project was to maximize the efficiency of their staff of 144 sworn officers in their community of 85,000 residents. In the past they had used the ESRI line of products to analyze their criminal incident information, but it wasn’t until 2008 that they started deploying what they call “significant enhancements”. These included dynamic hot spot mapping (which shows the intensity levels of criminal activity spatially), improved web access (making the information easier to retrieve for its officers), and Automated Vehicle Location (which allowed them to see the location of all their vehicles in real time). Another technology they embrace, from an ESRI partner, is called “Pictometry”. This allows their staff to look at aerial imagery not only from the top-down view, but also from all four directions. This helps them detect features such as cover where a criminal might be hidden. Pictometry is of special interest to me, as I have started using it as part of my internship. In the news recently it looks like Apple and Google will both be tackling this type of imagery, so it will be interesting to see if they will be able to implement this technology in Ogden as well.
The Ogden Police Department really took the RTCC to new levels of usefulness in 2011, when they began to integrate other data into the system such as local warrants, criminal history, jail data, arrest affidavits, and property information. This helped improve the frequency of data retrieval from the system from monthly (in some cases) down to near-real time. They have developed their systems to be standards compliant with the Microsoft Fusion Core Solution (MFCS), which is the result of a partnership between Microsoft and ESRI. They describe the MFCS system as “an easy-to-use, quick-to-configure solution that combines the robust capabilities of the Esri ArcGIS Advanced Enterprise Server and Microsoft Office SharePoint Server 2010”. They can add data sources from numerous third-party data providers such as surveillance cameras, a blimp, and briefing notes.
This article really inspired me to look at some of the processes where I work currently. We gather real-time data in several systems that could be really beneficial to see in a near real-time spatial web application. We generate some reports for board meetings that would be a whole lot more useful if they were pulled up on a live web site. Now if I only had the free time…
Washington DC Crime Mapping
Here is my latest work on the Homeland Security project for Washington DC Crime mapping.
And here is my Process Summary for parts 3 & 4 (which were the only parts required this week):
STEP 3: Produce a Map of Police Stations with Crime Proximity
1) Used my DCMaps template to create my new crime2.mxd document
2) Exported some of my symbolized data to layer files so I could use them more easily
3) Ran the Multiple Ring tool from the Analysis tools in the toolbox, set at .5, 1, and 2 miles (sketched in code after this list)
4) On the next step, it would not let me add a field on the Crime data until I exported it to be part of the GDB file.
5) Set the new Event field to 1 using the calculator
6) Performed my spatial join to create the buf_crime
7) Removed buffer from the TOC
8) Answered Q7 & Q8
9) On Q9 I have an interesting issue. My sum of events doesn’t match my original count of crimes (off by 61). So I am going to do my best to answer Q9.
10) Answered Questions 9-12
11) Prepared my deliverable (I went with an 11x17 map, I hope that is acceptable).
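Steps 3 and 6 translate roughly to the two geoprocessing calls below; layer names and paths are placeholders. With JOIN_ONE_TO_ONE, the spatial join's Join_Count field records how many crimes fall in each ring, which would be another way to cross-check the Event sums from Q9.

import arcpy

# Step 3: 0.5, 1, and 2 mile rings around the stations.
arcpy.MultipleRingBuffer_analysis(
    "police_stations", r"C:\GIS\DC.gdb\station_rings",
    [0.5, 1, 2], "Miles", "Distance", "ALL")

# Step 6: join crimes to the ring they fall inside.
arcpy.SpatialJoin_analysis(
    "station_rings", "crimes", r"C:\GIS\DC.gdb\buf_crime",
    "JOIN_ONE_TO_ONE", "KEEP_ALL")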
STEP 4: Produce Density Maps of Burglaries, Homicide, and Sex Abuse Crimes
1) Created my crime3.mxd
2) Added the Crimes to it (with the added Events column)
3) Selected all the burglaries
4) Answered Question 13
5) Ran the spatial analysis on the crime density using the different buffer values (sketched in code after this list)
6) Saved the 1500 version out as a layer file
7) Answered Q14
8) Created my additional data frames and set the coordinate systems all to match.
9) Next worked on homicide and Sex Abuse maps
10) Answered Q15
11) Answered Q16
12) Prepared the map deliverable for this assignment.
13) Posted to dropbox and blog.
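And a sketch of the density step (step 5), assuming the selected burglaries were made into their own layer; the cell size and the 1500 search radius are placeholders for the values tried in the lab.

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")

# Population field "NONE" means each burglary point counts once.
density = KernelDensity("burglaries", "NONE", 100, 1500)  # cell size, search radius
density.save(r"C:\GIS\DC.gdb\burg_density_1500")          # hypothetical path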
Wednesday, June 13, 2012
Natural Hazards - Tsunamis
This week's assignment looked at how GIS can be used in regard to tsunamis.
Here is my map and process summary for this week:
Process Summary
Part I: Building a File Geodatabase
1. Copied the data from the R: drive and unzipped
2. Started ArcCatalog
3. Documented Metadata
4. Navigated to my S:\GIS4048-ApplicationsInGIS\W4-NaturalHazards-Tsunamis\Tsunami directory
5. Previewed the Roads shapefile to make sure the Preview tab works properly.
6. Created the Tsunami.gdb File Geodatabase
7. Created the Transportation Feature Dataset and set the projected coordinate system to WGS 1984
8. Added the Rails_UTM and the Roads_UTM shapefiles to the File Geodatabase
9. Started ArcMap
10. Now we need to convert the data in Excel into a spatial feature (a scripted version is sketched after this list)
11. First I added the Excel sheet as a table
12. Next I right-clicked it and chose “Display XY Data”
13. I chose to import the Coordinate system and chose my File Geodatabase and the Feature Dataset as the place to import from.
14. Added the Administrative and Damage_Assesment shapefiles and added the recommended shapefiles to each.
15. Made sure my File Geodatabase matched the lab instructions
16. Opened ArcToolbox.
17. Ran the Build Raster Attribute Table tool on each of the DEMs
18. Created the FukuDEM and SendDEM Raster Datasets
19. Load Data process completed
20. Calculated Statistics on both DEMs. It is interesting to try to preview the data before running the statistics. It appears solid black. Post processing it appears normal.
21. Took a screenshot and prepared it as my first deliverable
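Steps 10-13 have an arcpy equivalent in Make XY Event Layer. A sketch, where the sheet name, coordinate fields, spatial reference code, and paths are all assumptions:

import arcpy

# Placeholder spatial reference (WGS 1984 UTM zone 54N covers northeast Japan).
sr = arcpy.SpatialReference(32654)

arcpy.MakeXYEventLayer_management(
    r"C:\GIS\Tsunami_data.xls\Sheet1$",  # Excel sheet added as a table
    "X", "Y",                            # assumed coordinate fields
    "quake_events", sr)

# Persist the in-memory event layer into the file geodatabase.
arcpy.CopyFeatures_management("quake_events", r"C:\GIS\Tsunami.gdb\quake_events")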
Part II: Fukushima Radiation Exposure Zones
1. Started ArcMap
2. Opened Tsunami.mxd
3. Exported the Fukushima II Power Plant to its own Feature Class
4. Answered Q1
5. Created my Multiple Ring Buffer. Pretty cool tool!
6. Adjusted my Symbology
7. Clipped the Roads feature
8. Did a Select by Location query to answer Q2 (steps 8-9 are sketched in code after this list)
9. Exported these cities out to Evac_zone_cities
10. Posted my screenshot for Q3
11. Answered questions 4 through 7 using that same tool and the statistics tool for the sum of population
12. Worked on my map deliverable
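Steps 8-9 in script form, with placeholder layer names; the selection feeds straight into the Evac_zone_cities export.

import arcpy

arcpy.MakeFeatureLayer_management("cities", "cities_lyr")
arcpy.SelectLayerByLocation_management("cities_lyr", "INTERSECT", "evac_buffers")

# Only the selected cities get copied out.
arcpy.CopyFeatures_management("cities_lyr", r"C:\GIS\Tsunami.gdb\Evac_zone_cities")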
Thursday, June 7, 2012
Natural Hazards - Hurricanes
This week's assignment asked us to look at the damage caused by Hurricane Katrina in three counties in Mississippi. There are three maps, a graph, and a table that I have submitted here, along with a Process Summary.
Process Summary
1. Set up my directory on S: Drive and copied over the necessary data from the R: drive
2. Answered Questions 1-4 (Returned to them at several points during the lab)
3. Opened ArcCatalog and previewed the data in both spatial and tabular form
4. Examined the Metadata to complete the chart as part of Q5
5. Organized my directory structure so that I have a S:\GIS4048-ApplicationsInGIS\W3-NaturalHazards-Hurricanes\Hurricanes\Project1\results path for my results
6. Created a new map document in ArcGIS and called it coast1
7. Created my Map Document Properties and set to relative paths.
8. Saved work
9. Set my coordinate system by importing the coordinate system from the land cover feature
10. Set up my Environment in Toolbox to handle current and scratch workspaces, Output Coordinate System, Raster Analysis settings and the mask to base the results on the 3 counties of interest
11. Added the elevation raster.
12. Answered Q6
13. Symbolized the color ramp for the Elevation. The tip about right clicking to see the names will come in handy at work and at my internship.
14. Added the Counties and set up the labels
15. Answered Q7
16. Added the Places, symbolized, and answered Q8
17. Added Islands to the map
18. Added swamps and rivers and symbolized appropriately.
19. This will be the base map that will give us a starting point for the rest of the assignment
20. Answered Q9 and 10
21) Saved my map (will come back to it later to get it ready to be a deliverable, as it is currently very busy)
22. Created Coast2.mxd
23. Documented the map, set the data frame properties, and set the environments just like the previous map
24. Turned on Spatial Analyst
25) Used the Math tools in Spatial Analyst to convert meters to feet
26) Used the Logical toolset to get the lands where elevation is less than or equal to 15 feet and saved it as flooded_land (steps 25-26 are sketched in code after this list)
27. Added Landcover
28. Reclassified Landcover using the Spatial Analyst reclassify tool.
29. Used Editor to label the reclassified data
30. Saved the reclassified data as a layer file
31. Added the counties and labeled them
32. Obtained the Statistics on the Count field
33. Answered Q12
34. Created a new field called “Percent” as float
35. Used the field calculator to set this new column to equal COUNT/1380797*100
36. Verified that this totaled to 100% using the statistics tool.
37. Saved my coast 2 document and graph. I will come back to prep it for delivery later
38. Created my coast3.mxd and set the description, coordinate system information, and environment
39. Added my infrastructure data to the MXD and symbolized appropriately.
40) Answered Q15 and Q16 (I found a cool Spatial Analyst tool for this step: Spatial Analyst Tools | Extraction | "Extract Values to Points")
41. Started my coast4.mxd document
42. Documented the map, set the coordinate info, and set the environment settings.
43. Prepared my deliverables and Blog Post Entries.
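In map algebra terms, steps 25-26 look something like the sketch below; the raster name, the rounded conversion factor, and the output path are assumptions.

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")

elev_ft = Times(Raster("elevation"), 3.2808)  # Math toolset: meters to feet
flooded = LessThanEqual(elev_ft, 15)          # Logical toolset: 1 where <= 15 ft
flooded.save(r"C:\GIS\Project1\results\flooded_land")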
Deliverable #1 - A map showing elevation/bathymetry of the Mississippi Coast counties with places, types of water, barrier islands, and hydrography.
Deliverable #2 - A map of flooded land of the Mississippi Coast after Hurricane Katrina.
Deliverable #3 - A map showing infrastructure and health facilities at risk from storm surge.
Deliverable #4 - A bar graph showing percentage of total flooded land by land-cover type.
Deliverable #5 - A table showing various land types that were flooded, measured in acres and square miles.
Wednesday, May 30, 2012
Natural Hazards / Earthquakes
This week's assignment really was a great way to shake off the cobwebs from the break and start making some maps again. We concentrated on using some tools in ArcGIS that I hadn't used before and that will prove VERY helpful in my current job.
For part 1, we considered the impact on infrastructure in the event of an earthquake near Memphis, Tennessee. On this map, I decided to make it 11" x 17" rather than 8.5" x 11" to try to best display all three scenarios (dams, Interstates, and railroads).
For Part 2, we looked at the peak ground velocity of the aftershocks.
For Part 3, we examined aftershocks that were greater than magnitude 3.0
And for part 4, we looked at the intensity of the aftershocks.
We also created a couple of graphs along the way using the built in ESRI tools:
Earthquakes
Part I
Here is my Process Summary for the lab this week:
1) Copied all the data from the R: drive to the S: drive
2) Unzipped the files
3) Opened up the NewMadrid.mxd file
4) Used the Identify tool to research the data
5) It appears that Memphis would suffer the most damage in an earthquake in this region
6) Performed a select by location to see the areas that would be affected
7) Next we learned about the summary feature, which I wasn’t aware of. I can see this being VERY useful in my day to day work.
8) We can see that there are 91 areas that would be greatly impacted by this event
9) Next we calculated the population density of the counties. Again, a VERY useful lab so far, and I have just started!
10) Next we intersected the county data (including population density) with the New Madrid data
11) Created the PopMMI table
12) Using the summarize tool, we can see that 60,088,857 people live in the affected areas (sketched in code after this list)
13) Created the graph as instructed
14) Added my numeric values to the Modified Mercalli Scale data
15) Created my clipped Istaterisk layer
16) Next I did a SELECT BY ATTRIBUTES to determine all the areas where the intensity was 8 or greater
17) Then I did a SELECT BY LOCATION to see which Interstates intersected these areas
18) Clipped this data to the map to see my at-risk Interstates
19) Did a similar process for railroads in the area that had an index of 10 or higher
20) Finished this section by doing the dams at risk as well
21) For my map on this section, I wanted to try to think outside the box a little bit, so I chose a different map size of 11x17 instead of the normal 8.5x11. I hope this was OK for this map, which displays all 3 results on it.
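The summarize in step 12 is the Summary Statistics tool under the hood. A sketch with assumed field names:

import arcpy

arcpy.Statistics_analysis(
    "PopMMI",                # intersect output from step 10
    r"C:\GIS\PopMMI_sum",    # output table (hypothetical path)
    [["POP", "SUM"]],        # total population (assumed field name)
    "MMI")                   # optional case field: one row per intensity class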
Earthquakes
Part II
1) Now on to the second part of the lab
2) Opened the Northridge1.mxd map
3) Made sure the Spatial Analyst extension was enabled (under Customize, rather than Tools)
4) Symbolized the dots for the building damage and arranged layers as instructed
5) Did a select by attribute to choose only the red- and yellow-tagged buildings
6) Set up the Kernel Density tool as instructed
7) Cleared the selection and turned off unused layers
8) Set the lowest level of symbolization to “No Color”
9) Used the Effects toolbar to set visibility to 15%
10) Saved the damage pattern as a layer file
11) Added the layer back in and symbolized
12) Opened the Geology layer
13) Examined the Liquefaction layer
14) Turned on the Stations layer
15) Opened the Spline Interpolation tool
16) Ran this tool with the recommended settings (sketched in code after this list)
17) Created the PGV layer and set symbols/transparency/etc.
18) Prepared map
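Step 16 scripted, with assumed layer and field names and the tool's default regularized settings:

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")

# Interpolate peak ground velocity from the station points.
pgv = Spline("stations", "PGV", 100, "REGULARIZED", 0.1)  # cell size is a placeholder
pgv.save(r"C:\GIS\pgv_surface")  # hypothetical path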
Earthquakes
Part III
1) Now on to the third part of the lab
2) Opened up the Northridge2.mxd map
3) Added the text file while changing the projection to WGS
4) Exported this data out to a shapefile
5) Removed the CSV file from the project
6) Selected the main shock by doing a select by attribute
7) Created a new layer to represent the main shock and symbolized it as a red triangle
8) Created my map of aftershocks greater than magnitude 3 and symbolized appropriately. I am a little worried that my dots are in a “grid” format rather than randomly scattered, but hopefully that is how it is supposed to look. I actually reloaded the project from the R: drive to ensure that this should be fine.
9) I didn’t see any of the ArcScene work this lab referred to
Earthquakes
Part IV
1) Now on to the fourth part of the lab
2) Opened up the Northridge3.mxd map
3) Created the summary table similar to the process we did earlier in the lab
4) Created my graph and exported it out as a JPG file
5) Posted my contents to dropbox and blog
I really enjoyed this week's work even though it was very time consuming. I learned a lot of new tips with ArcGIS that I can start applying to my job already. I hope this is a sign of what kind of labs to expect for this class, because it really is helpful. I tried to make my maps a little different from each other, just to try to get back into the swing of making maps each week.