
What’s the creepiest place on the Oregon Coast? -Deciding on necessary variables: The first step in answering this question was to extract my variables from the word ‘creepy’. Abandoned buildings are creepy. So are cemeteries. Especially when it’s cloudy and dark. And especially when they’re near steep cliffs. I used these as my variables.

-Data: Through ghosttown.com, which lists ghost towns with descriptions, I found data on the number and nature of abandoned buildings at each site within my area of interest (sites shown in Fig. 1), and whether a cemetery was present. I went to oregonclimateandweatherpatterns.com for relatively recent information on average annual precipitation across the state, which I used as a proxy for cloudy, dark conditions. For cliff locations, I found digital elevation models (DEMs) at nationalmap.gov.

-Converting data into a useful format: I had to download ten DEMs from nationalmap.gov to cover my area of interest; the lowest resolution I could find was 1 arc-second. I recorded the number corresponding to each DEM so I could remember where it fit in the framework, then used the ‘mosaic’ tool to stitch them together in the appropriate order. The resulting DEM was a complete model of my entire area of interest, ready for analysis in ArcMap (Fig. 2). I also created a hillshade out of curiosity, but it was useless at the scale of my data. I had a hard time finding average annual precipitation data, but eventually realized that I only cared about the precipitation value at each ghost town point. That let me take a complete precipitation map (Fig. 3) from oregonclimateandweatherpatterns.com as a .jpg, add it as data in ArcMap, georeference it (using GCS 1983) onto a topographic base map with the spline transformation, and save it as a .tif. This worked because, as long as I could correlate my ghost town points with precipitation values on that map, I would have all the comparative data I needed. Converting the ghost town data was trickier. I used Google Earth to find each listed ghost town in the area and pin it, saving all 15 original points as a KML file for the ‘KML to Layer’ tool in ArcMap (Fig. 1).
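The mosaicking step works like pasting equally sized tiles onto a layout grid. A minimal sketch in plain Python, with a hypothetical 2×2 layout of tiny 2×2 tiles standing in for the ten real 1-arc-second DEMs:

```python
# Sketch of mosaicking adjacent DEM tiles into one grid (hypothetical data),
# mirroring what ArcMap's 'mosaic' tool does with georeferenced rasters.

def mosaic(tiles, layout):
    """Stitch equally sized tiles together according to a 2-D layout of keys."""
    tile_rows = len(next(iter(tiles.values())))
    out = []
    for layout_row in layout:
        for r in range(tile_rows):
            row = []
            for key in layout_row:
                row.extend(tiles[key][r])  # append this tile's r-th row
            out.append(row)
    return out

# Four hypothetical 2x2 elevation tiles arranged in a 2x2 frame.
tiles = {
    "NW": [[10, 11], [12, 13]], "NE": [[20, 21], [22, 23]],
    "SW": [[30, 31], [32, 33]], "SE": [[40, 41], [42, 43]],
}
dem = mosaic(tiles, [["NW", "NE"], ["SW", "SE"]])  # one seamless 4x4 grid
```

Recording which layout slot each tile belongs to, as described above, is exactly what the `layout` argument captures here.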
I then created an .xls file in Microsoft Excel with attributes for each ghost town: a subjective scariness value for its remaining structures, the presence of a cemetery, an empty column for precipitation values to be filled in later from the georeferenced image, and a column called ‘coast’. The ‘coast’ column let me flag which points actually fell within the frame of my raster dataset; I had been guessing at a general area, and wanted a simple definition query on that value to drop points outside it. I joined the table to the point data in ArcMap on the ‘Name’ column.
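The table join itself is just a keyed lookup, and the ‘coast’ definition query is a filter on the joined result. A sketch in plain Python; the town records and attribute values are illustrative placeholders, not the project's real data:

```python
# Sketch of joining spreadsheet attributes to point features on 'Name',
# as ArcMap's table join does, then filtering on the 'coast' flag.
# Names, coordinates, and attribute values are hypothetical.

points = [
    {"Name": "TownA", "x": -123.95, "y": 45.52},
    {"Name": "TownB", "x": -123.90, "y": 45.10},
]
attributes = [
    {"Name": "TownA", "structures": 4, "cemetery": 1, "coast": 1},
    {"Name": "TownB", "structures": 2, "cemetery": 0, "coast": 0},
]

# Index the attribute rows by the join key, then merge onto each point.
by_name = {row["Name"]: row for row in attributes}
joined = [{**pt, **by_name[pt["Name"]]} for pt in points]

# The 'definition query' on coast == 1 drops out-of-area points.
kept = [pt for pt in joined if pt["coast"] == 1]
```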

-Analysis: To analyze the DEM, I used the ‘slope’ tool, calculating slope angle and portraying it with variations in color. I symbolized the result with only two classes, one for angles above 65 degrees and one for angles below, using no fill color for the latter: I only wanted to see elevation changes steep enough to be called cliffs, though the 65-degree threshold was somewhat arbitrary. I didn’t include the resulting raster here because at this scale it appears only as a spray of spots. To be useful, the cliffs had to lie within a certain distance of the ghost town points, so I analyzed proximity next. After a definition query on the ‘coast’ column cut my ghost towns from 15 to 12 (three sat just outside my raster’s extent), I put a 5-mile buffer around each remaining point with the ‘buffer’ tool. I could then see which cells steeper than 65 degrees fell within 5 miles of each point. I added a field to the point attribute table to record the number of cliffs within that radius, counting only linear strings of connected cliff cells, because many cells in that slope range appeared disconnected. I also read the average annual precipitation at each point from the georeferenced map and entered those values into the attribute table. I then had to weight my variables against each other by importance to create a creepiness-suitability layer. Because the variables were now stored in the point data, and point data cannot be used in the ‘weighted overlay’ tool, I converted each variable column to raster data with the ‘feature to raster’ tool.
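The buffer test boils down to a distance check: a cliff cell counts for a town if it lies within 5 miles of that town's point. A sketch using a haversine great-circle distance; the town coordinate and cliff-cell list are hypothetical:

```python
# Sketch of the cliff-proximity test: count candidate cliff cells within
# 5 miles of one ghost town point. Coordinates here are hypothetical.
import math

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in miles between two lat/lon points."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

town = (45.52, -123.95)  # hypothetical ghost town point (lat, lon)
cliff_cells = [(45.53, -123.96),   # ~1 mile away: counts
               (45.60, -123.80),   # ~9 miles away: outside the buffer
               (46.20, -123.95)]   # far to the north: outside the buffer

nearby = [c for c in cliff_cells
          if miles_between(town[0], town[1], c[0], c[1]) <= 5.0]
```

In the actual workflow the buffer polygons do this test spatially; the arithmetic is the same idea.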
I also had to reclassify the ‘structures’ and ‘precipitation’ rasters to integer values, because their original value ranges were unusable in the ‘weighted overlay’ tool. With the conversions done, I plugged each variable raster into the ‘weighted overlay’ tool, subjectively assigning an importance of 5% to the presence of cliffs, 10% to average annual precipitation, 30% to the creepiness of remaining structures, and 55% to the presence of a cemetery. The resulting raster had only 12 cells of data, with four possible values ranging from best (creepiest) to worst (least creepy).
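Under the hood, a weighted overlay is just a weighted sum over the reclassified integer scores. A sketch using the weights above, with hypothetical reclassified values (1 = low, 4 = high) for two made-up towns:

```python
# Sketch of the weighted-overlay arithmetic: combine reclassified integer
# scores with the percentage weights used above. Town names and the
# reclassified scores are hypothetical.

weights = {"cliffs": 0.05, "precip": 0.10, "structures": 0.30, "cemetery": 0.55}

towns = {
    "TownA": {"cliffs": 3, "precip": 4, "structures": 4, "cemetery": 4},
    "TownB": {"cliffs": 1, "precip": 2, "structures": 2, "cemetery": 1},
}

def creepiness(scores):
    """Weighted sum of reclassified variable scores, one cell's overlay value."""
    return sum(weights[k] * v for k, v in scores.items())

# Rank towns from creepiest to least creepy.
ranked = sorted(towns, key=lambda t: creepiness(towns[t]), reverse=True)
```

Because the weights sum to 100%, the result stays on the same 1-to-4 scale as the inputs, which is why the reclassification to integers matters.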

-Presentation: To make the data more presentable, I had to clip the raster and precipitation data. I first found a shapefile of state boundaries, selected Oregon, created a new layer from just that state, and clipped the DEM to its shape with the ‘extract by mask’ tool. Before clipping, the map was unpresentable: the low elevations along the coast made for a very dark DEM ending in a black ocean. I then clipped the precipitation surface to that raster with the same tool. I exaggerated the sinuous, linear cliffs I had counted using the freehand drawing tool. I also converted the weighted-overlay raster into points larger than individual cells with the ‘raster to point’ tool, creating a new feature class. I then tied the weighted ranks to the point symbology, making higher-ranked (creepier) points larger and slightly more intense in color. In layout view, I copied the legend from the precipitation surface for use in my own legend, then explained each of the 12 ghost towns’ rankings alongside the map, because the rankings alone would not have been very useful without the unseen data in my attribute table; I could have placed the attribute table in the layout, but it was too information-rich and unwieldy.
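The graduated symbology amounts to a simple rank-to-symbol mapping: each of the four overlay values gets a marker size and a color intensity that grow with creepiness. A sketch with hypothetical size and intensity scales:

```python
# Sketch of graduated symbology: map each weighted-overlay rank
# (1 = least creepy ... 4 = creepiest) to a marker size and color
# intensity, so creepier points draw larger and more intense.
# The specific size/intensity scales are hypothetical.

def symbol_for(rank):
    """Return (marker size in pt, color intensity 0-1) for a rank 1-4."""
    size = 6 + 4 * (rank - 1)           # 6, 10, 14, 18 pt
    intensity = 0.4 + 0.2 * (rank - 1)  # 0.4, 0.6, 0.8, 1.0
    return size, round(intensity, 2)

# Build the full symbol table for the four possible overlay values.
symbols = {rank: symbol_for(rank) for rank in range(1, 5)}
```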

Fig. 1: KML-to-Layer data against the topographic base map

Fig. 2: The complete DEM, made out of ten smaller, mosaicked DEMs

Fig. 3: Precipitation data

Fig. 4: Using a state boundary shapefile, I created a new layer from this selection and clipped my coastal raster sets against it, including my full DEM and precipitation dataset, shown here