Tuesday, October 25, 2016

Distance Azimuth Survey - Week 1

Introduction

For this exercise, students are to conduct a survey of trees in a park on the UWEC campus using a basic surveying technique: distance and azimuth. After obtaining three GPS coordinate points, students then physically measure the distance and azimuth of nearby trees and record the measurements, which are then transferred into an Excel spreadsheet for use in ArcMap.


Methods

The materials used in this exercise include the following:
  • Laser distance finder (for measuring the distance from the GPS coordinate points to the trees)
  • High quality compass
  • GPS device (for obtaining GPS coordinates)
  • Measuring tape (specially designed for measuring tree diameters, in centimeters)
  • Field notebook
The study area was located in Putnam Park on the UWEC campus, which features a variety of deciduous and coniferous trees (figure 1). The following attributes were recorded for the survey:
  • Distance (from GPS coordinate points to the trees)
  • Azimuth (from GPS point to the tree being measured)
  • Tree type (ash, oak, pine, etc.)
  • Tree diameter (in centimeters)
The distance and azimuth are crucial for being able to make a map of the tree locations in ArcMap, and the tree type and diameter will serve as attribute data for each tree measured. 


Figure 1: Putnam Park trees

The steps that were taken for collecting the data are as follows:
  1. Pick three points from which to gather tree data, use a GPS device to record the coordinates of these points, and choose a number of trees (10 trees were measured from each point) on which to record measurements
  2. Standing on the exact GPS-located spot, one person would stay on the point with the laser distance finder while another walked to a tree of choice with the receiver; the person on the point would then calculate the distance from the point to the tree, keeping each unit of the distance finder as level as possible
  3. Using the compass, the person standing on the point would line up the compass with the center of the tree and record the azimuth
  4. Using the tree measuring tape, the person next to the tree would wrap the tape around the tree at chest height and record the tree diameter
  5. The tree type was then recorded where the species was determined based on the type of leaves, bark, and any other features that would indicate the type of tree it was
After the survey was conducted, the next step was to combine all of the recorded data and enter it into an Excel spreadsheet with the following columns:
  • x: longitudinal GPS data (cells formatted to numerical with 6 decimal places)
  • y: latitudinal GPS data (cells formatted to numerical with 6 decimal places)
  • Distance: distance from GPS point to tree (cells formatted to numerical with 2 decimal places)
  • Azimuth: compass direction from GPS point to tree (cells formatted to numerical with 1 decimal place)
  • DBH: tree diameter in centimeters (cells formatted to numerical with 1 decimal place)
  • Tree_type: type of tree
  • P_number: GPS point number (1, 2, and 3)
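As an illustrative sketch (the record values and column names here are made up, not actual survey data), the decimal formatting conventions above can be applied programmatically before the data ever reaches ArcMap:

```python
# Hypothetical sketch: formatting one survey record to match the
# spreadsheet conventions listed above. The values are invented.
record = {
    "x": -91.4986223,      # longitude, formatted to 6 decimal places
    "y": 44.7956741,       # latitude, formatted to 6 decimal places
    "Distance": 12.3456,   # meters, formatted to 2 decimal places
    "Azimuth": 237.2,      # degrees, formatted to 1 decimal place
    "DBH": 34.5,           # centimeters, formatted to 1 decimal place
    "Tree_type": "oak",
    "P_number": 1,
}

# Decimal places per column, as described in the list above
decimals = {"x": 6, "y": 6, "Distance": 2, "Azimuth": 1, "DBH": 1}

formatted = {
    key: (f"{value:.{decimals[key]}f}" if key in decimals else value)
    for key, value in record.items()
}
print(formatted["x"], formatted["Distance"])  # -> -91.498622 12.35
```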
Once the data was entered and formatted, the next step was to import the data into ArcMap in order to create a map of the tree points. The steps taken to be able to use the Excel data to create the map are as follows:
  1. Project the data frame in NAD 1983 HARN Wisconsin CRS Eau Claire (meters)
  2. Create a work folder for the project, connect to the folder, and create a file geodatabase within that folder 
  3. Bring the Excel spreadsheet into ArcMap to look it over before performing tools
  4. First tool: ArcToolbox > Data Management Tools > Features > Bearing Distance to Line, and then enter the following into the appropriate fields:
    • Input table: Excel table
    • Output: tree_survey (save in previously created geodatabase)
    • x field: longitudinal field
    • y field: latitudinal field
    • Distance field: Distance column (meters)
    • Bearing field: Azimuth column
  5. Once all of this is entered, run the tool
  6. After the tool is run, run the next tool: ArcToolbox > Data Management Tools > Features > Feature Vertices to Points, and then enter the following into the appropriate fields:
    • Input features: output from Bearing Distance to Line (tree_survey)
    • Output: tree_points (save in the geodatabase)
  7. Once everything is entered, run the tool
The 'Bearing Distance to Line' tool takes the x-coordinate, y-coordinate, azimuth, and distance fields and creates a new feature class containing lines (figure 2). The 'Feature Vertices to Points' tool then takes the vertices of those lines and creates a feature class of points (figure 3).
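The geometry behind this step can be sketched in a few lines. This is a simplified planar version of what 'Bearing Distance to Line' does, assuming projected coordinates in meters (such as the Eau Claire county coordinate system used here); it mirrors the idea, not the tool's actual implementation:

```python
import math

def tree_position(x0, y0, distance_m, azimuth_deg):
    """Planar endpoint computed from an origin point, a distance in
    meters, and an azimuth in degrees clockwise from north. Assumes
    projected coordinates in meters; a sketch of the geometry, not
    ArcMap's implementation."""
    theta = math.radians(azimuth_deg)
    # Azimuth is measured from north, so sin gives the east offset
    # and cos gives the north offset.
    return (x0 + distance_m * math.sin(theta),
            y0 + distance_m * math.cos(theta))

# A tree 10 m due east (azimuth 90 degrees) of the survey point:
print(tree_position(0.0, 0.0, 10.0, 90.0))
```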

Figure 2: Lines created from the 'Bearing Distance to Line' tool
Figure 3: Points created from line vertices using the 'Feature Vertices to Points' tool


Results/Discussion

The resulting image produced from the survey can be seen below (figure 4) where tree points (green triangles) are shown against an aerial image basemap. While the final image appears to be quite accurate and is representative of the data collected, there were some obstacles that arose and adjustments made along the way.

Figure 4: Tree points produced from survey

The biggest problem occurred after the 'Bearing Distance to Line' tool was initially run: two of the three GPS points and their resulting lines were inaccurate. GPS point 3 was highly inaccurate, falling approximately 3,000 meters south of the actual location (figure 5, lower red dot at the bottom of the blue ellipse). This error could have been a result of one or both of the following:
  1. Equipment error: the GPS unit did not calculate the correct coordinate (highly unlikely, especially for GPS point 3, but possible), or
  2. Human error: the GPS coordinates could have either been recorded and/or transferred into Excel incorrectly (most likely cause)
Figure 5: The three GPS points (red dots within the blue ellipse), where GPS point 3 is located at the bottom and shows just how far off it was compared to the other two points

GPS point 2 was also not in the correct location, appearing in the middle of the parking lot behind the Davies Center, about 75 meters north of the actual location (top group of lines in figure 6 below). GPS point 1 appeared to be accurate, so no correction was needed. The GPS points were all chosen along Putnam Drive (the path running from the upper left corner to the lower right corner in figure 6), so to line up GPS points 2 and 3 with the path, their latitudinal coordinates were adjusted.
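As a rough sanity check on the size of that correction, a 75-meter offset can be converted to degrees of latitude (one degree of latitude is roughly 111,320 meters everywhere on Earth):

```python
# Rough sketch: how large the GPS point 2 correction is in degrees of
# latitude. One degree of latitude is approximately 111,320 meters.
offset_m = 75.0
meters_per_degree_lat = 111_320
offset_deg = offset_m / meters_per_degree_lat
print(round(offset_deg, 6))  # ~0.000674 degrees
```

So the edit to the spreadsheet amounted to shifting the latitude by less than a thousandth of a degree, which is why the coordinates needed to be stored with six decimal places.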

Figure 6: Result of 'Bearing Distance to Line' tool after GPS point 3 (left group of lines) was corrected, but GPS point 2 (upper group of lines) is still inaccurate

Using the 'Identify' tool in ArcMap, clicking a location directly above GPS point 3 and below GPS point 2 displayed the coordinates of the clicked spot. The latitude values it returned were used to replace the corresponding coordinates in the Excel table. After the table was updated, the 'Bearing Distance to Line' tool was run again, this time with a more accurate result (see figure 7 below).

Figure 7: Result of the 'Bearing Distance to Line' tool after the latitudinal coordinates were adjusted in Excel for GPS points 2 and 3

Since the collection of data for this exercise was a group effort, it's difficult to discern just how accurate the adjusted coordinates for GPS points 2 and 3 are (GPS point 3 appears to be slightly off path). The highest confidence in accuracy goes to GPS point 1 which was not adjusted. Unfortunately for this exercise, sub-meter accuracy is crucial for pinpointing trees to their exact location, so if someone were to use the map produced from this survey, it may be difficult to locate the trees from GPS points 2 and 3 if the adjustments made for them were not accurate enough.


Conclusions

Overall this survey was fairly successful. All of the attribute data was measured and recorded without too many problems (although identifying certain tree species was somewhat difficult) and the methods worked well for accomplishing these tasks. However, due to equipment or human error, the resulting map featured inaccuracies (some worse than others). Going forward in future activities and surveys, it will be important to be extra cautious, pay close attention to measurements and data recording, and double check that data transfers are correct.

Tuesday, October 18, 2016

Creation of a Digital Elevation Surface - Week 2

Introduction

In the previous week, a 45"x45" (114x114 cm) sandbox terrain of the students' own design was surveyed, with x, y, and z data points collected in order to translate the physical terrain into a digital elevation surface map. A grid was constructed on top of the sandbox that allowed sample points to be taken every 5 centimeters along the x and y axes. These points were entered directly into an Excel spreadsheet that could then be imported into a geodatabase and used as a shapefile in ArcMap.

Data normalization is the process of organizing data in a uniform way for ease of use; in this case, the x, y, and z columns were saved in a numerical format in Excel where each value featured one decimal place.

Over 500 data points (533 in total) were recorded, each consisting of an x, y, and z value representing its position on the terrain. The interpolation procedures used in ArcMap take these points and run them through an algorithm to create a surface map; the resulting maps differ from one another based on the interpolation method used.


Methods

Five different interpolation methods were used to create the surface maps including:

  1. IDW (Inverse Distance Weighted): The IDW tool "estimates cell values by averaging the values of sample data points in the neighborhood of each processing cell. The closer a point is to the center of the cell being estimated, the more influence, or weight, it has in the averaging process." (Comparing Interpolation Methods). This method produces the best results when sampling is dense in areas of variation but the average cannot be higher than the highest input or lower than the lowest input.
  2. Natural Neighbors: This interpolation method "finds the closest subset of input samples to a query point and applies weights to them based on proportionate areas to interpolate a value (Sibson, 1981)." (Comparing Interpolation Methods). 
  3. Kriging: The kriging interpolation method "is an advanced geostatistical procedure that generates an estimated surface from a scattered set of points with z-values." (Comparing Interpolation Methods). Kriging is a processor-intensive method where the input dataset will affect the speed of execution.
  4. Spline: The spline method "uses an interpolation method that estimates values using a mathematical function that minimizes overall surface curvature, resulting in a smooth surface that passes exactly through the input points." (Comparing Interpolation Methods). The spline method produces a very smooth surface.
  5. TIN (Triangular Irregular Network): In ArcMap, "TINs are a form of vector based digital geographic data and are constructed by triangulating a set of vertices (points). The vertices are connected with a series of edges to form a network of triangles...The edges of TINs form contiguous, nonoverlapping triangular facets and can be used to capture the position of linear features that play an important role in a surface, such as ridgelines or stream courses." (About TIN Surfaces, 2008). TINs can have a higher resolution in more highly variable areas and lower resolution in areas that are less variable and capture features such as mountain peaks well but tend to be more expensive to build and process.
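As a rough illustration of how one of these methods works, a minimal inverse-distance-weighted estimate can be written in a few lines. This is a simplified sketch of the idea behind ArcMap's IDW tool, with no search neighborhood, barriers, or other tool options:

```python
import math

def idw(sample_points, query, power=2):
    """Minimal inverse-distance-weighted estimate at a query (x, y).
    sample_points is a list of (x, y, z) tuples. Nearer samples get
    larger weights (1 / distance**power), so the estimate can never
    fall outside the range of the input z values."""
    num = den = 0.0
    for x, y, z in sample_points:
        d = math.hypot(query[0] - x, query[1] - y)
        if d == 0.0:
            return z  # query coincides exactly with a sample point
        w = 1.0 / d ** power
        num += w * z
        den += w
    return num / den

# Four made-up elevation samples at the corners of a 10x10 square:
samples = [(0, 0, 10.0), (10, 0, 20.0), (0, 10, 20.0), (10, 10, 10.0)]
print(idw(samples, (5, 5)))  # symmetric case: the plain average, 15.0
```

The "egg carton" artifact noted in the Results section comes from exactly this weighting: the surface is pulled toward each sample's z value near that sample and relaxes toward the local average in between.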
After the surface maps were created in ArcMap using the previously mentioned interpolation methods, each map was then brought into ArcScene in order to render a 3D image. After each 3D image was rendered, it was then exported as a JPEG image to be used in a final map layout in ArcMap.



Results/Discussion

The results of each interpolation along with its 3D rendered image can be found below (figures 2-6) and a picture of the actual sandbox terrain has been placed below (figure 1) as a reference to compare how the results match up with the surveyed terrain.


Figure 1: Sandbox terrain


IDW: This appears to be one of the worst results with the surface resembling an "egg carton" pattern where many little hills and depressions cover the entire surface that do not reflect the true nature of the surveyed terrain. In reality, the terrain is much smoother.

Figure 2: IDW interpolation method

Kriging: This appears to be the best result out of the five interpolation methods that were used. The features are captured well and flow into each other smoothly; it isn't bumpy like the IDW result, nor is it angular like the TIN.

Figure 3: Kriging interpolation method

Natural: Although the Natural method looks good, it doesn't capture the depth of the depression in the lower left hand corner well enough and the surface as a whole appears "grid-like" in a way that resembles the actual grid that was constructed over the top of the terrain and provided the sample points.
Figure 4: Natural Neighbors interpolation method

Spline: This method provided one of the best results of the terrain where features flow smoothly into one another and the ridge peaks and low valley points are captured well. However, the surface as a whole has a somewhat "bubbly" appearance to it whereas the actual terrain was more smoothed out.

Figure 5: Spline interpolation method

TIN: While the TIN captures ridge peaks and valleys well, it's too "edgy" in comparison to the actual terrain and the elevation differences do not flow well into each other.

Figure 6: TIN interpolation method

In a future survey, although 5 cm increments were already a fairly dense sample spread, tighter sample points could be collected in areas of extreme relief. Also, having a flatter, more solid surface underneath the terrain would help provide more accurate elevation measurements if the same method for measuring the elevation that was used in this lab is to be deployed in the future. 


Conclusions

This survey relates to any other field-based survey that collects sample points to represent a whole population (or in this case surface) where the sample points in this exercise were elevations points along a grid on an x-y plane. Depending on the scale, it is not always going to be realistic to perform a terrain survey as detailed as the one in this exercise (5 cm increments). Increment values would have to be based on the size and scope of the particular survey. For example, if the sandbox had been twice as large, the 5 cm increments may have become 10 cm increments instead. Terrain with varying amounts of relief may require denser sample points while terrain with little relief may not require nearly as many sample points. Interpolation methods can also be used in remote sensing applications to determine surface composition, for example, if there is any missing data within the area of interest.  


Sources:

Comparing Interpolation Methods: http://pro.arcgis.com/en/pro-app/tool-reference/3d-analyst/comparing-interpolation-methods.htm#ESRI_SECTION1_6A0EECC499AA4BB191961A99AFA9352F 

About TIN Surfaces: http://webhelp.esri.com/arcgisdesktop/9.2/index.cfm?TopicName=About_TIN_surfaces

Tuesday, October 11, 2016

Creation of a Digital Elevation Surface - Week 1

Introduction

Sampling is a shortcut method that involves gathering data on just part of a whole population and using that data to represent the whole population. Spatially speaking, and for this exercise specifically, sample points were measured and recorded along a grid for the elevation of a sandbox terrain in order to create a digital elevation surface representative of the actual sandbox terrain. The sampling technique that was employed was a systematic point sampling method where elevation points were measured and recorded along a grid at the grid intersections. The objectives for this exercise include:

  • Using critical thinking skills to devise an improvised survey technique for mapping out a sandbox terrain with x, y, and z fields
  • Transferring the recorded terrain points into ArcMap and creating a digital elevation surface map

Methods

For this exercise, a systematic point sampling technique was used to record the points of the terrain using a grid where intersecting points were measured and recorded (figure 1 shows how the grid was set up, where each intersection of string indicated where elevation was measured, including where the string met the wood sides). This method was chosen because it provided a neat and organized grid across the terrain that allowed for unbiased and consistent measurements along the x and y axes. Measurements could have been taken in the middle of each grid square, but this would have resulted in less accurate measurements due to having to "eyeball" the exact middle of each square.

Figure 1: Terrain with grid and measuring tools

The sandbox was set up outside of Phillips Hall across the street in a patch of lawn. The 45"x45" (114x114 cm) box (inner square) was then filled with sand. From here, each group developed their own terrain that was to include the following features (refer to figure 2):
  • Ridge - surrounding the hill in the middle in a circular pattern
  • Hill - in the middle
  • Depression - lower left corner
  • Valley - lower right corner
  • Plain - upper right corner

Figure 2: Terrain with developed features

A variety of tools were used to set up the grid and make and record the elevation measurements including:
  • Meter stick - measuring out 5 cm increments on top of the wooden box to insert push pins and for measuring the straight wire when elevation measurements were taken
  • Collapsible ruler - used as a place holder when taking elevation measurements on the grid
  • Push pins - inserted every 5 cm along the top of the wooden box for weaving string around
  • String - woven around the pins along the x and y axes to create the grid
  • Thin straight wire - used to capture depth of elevation of terrain along the grid intersections
  • Laptop - recording x, y, and z values of terrain
For setting up the grid, 5 cm increments were measured from the origin (near the student's foot in figure 2) along the x and y axes and the push pins were inserted at these increments. After the pins were inserted, string was woven around them in a parallel fashion starting in one corner and going from one side to the other, then when one axis was done the string was woven perpendicularly again in a parallel fashion and going from one side to the other until the grid was complete (refer to figure 1). Once this was complete, the measurements and subsequent recording of measurements began starting at the origin (which was right up against the corner of the wooden box). A thin straight wire was used to push straight down into the sand until it hit the ground, then marked at where it met the top of the sand, and pulled out and measured with the meter stick from the marked location to the bottom of the wire where it touched the ground underneath the sand (see figure 3).
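The sample layout can be sketched as a simple nested loop over 5 cm grid increments. This is an idealized count; the actual survey recorded 533 points, since points along the box edges were also measured and, as discussed later, some rows were sampled only every other intersection:

```python
# Simplified sketch of the sample layout: grid intersections every
# 5 cm starting from the origin of the 114 cm inner box. The real
# survey's count (533) differs from this idealized grid.
spacing = 5          # cm between grid lines
size = 114           # cm, inner box dimension (45")
coords = list(range(0, size + 1, spacing))   # 0, 5, ..., 110
grid = [(x, y) for y in coords for x in coords]
print(len(coords), len(grid))  # -> 23 529
```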

Figure 3: Measuring elevation with straight wire and meter stick

After each recording, a collapsible ruler that stretched across the wooden box and hung over each side was moved to the next grid line as a placeholder, keeping track of which point was to be measured next. The ground acted as the chosen "sea level" in this exercise, which avoided negative numbers in the z field. The top of the wooden box could have been chosen as sea level instead, but that would have produced negative values, and spots where the ridge and hill rose above the box could have caused inaccurate z values due to the grid points lying above the plane of the box top. The data was entered into a spreadsheet on a laptop by one student while the other two students conducted the measurements. Recording the data directly on the laptop saved time: had the data been recorded in a notebook, it would have needed to be transferred into a spreadsheet anyway, so entering it directly avoided this extra step.


Results/Discussion

The final number of recorded points totaled 533. The sample values are as follows:
  • Minimum value: 6.6
  • Maximum value: 23.2
  • Mean: 14.6
  • Median: 14.4
  • Standard Deviation: 2.57
There was a 16.6 cm difference between the highest and lowest elevation points. The median, at 14.4 cm, falls almost directly between the maximum and minimum values. The average value is just slightly above the median value at 14.6 cm. 
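These statistics can be reproduced from the z (elevation) column with Python's standard library. The values below are a small made-up sample for illustration, not the actual 533 survey points:

```python
import statistics

# Illustrative sketch: computing the summary statistics reported
# above from a z (elevation) column. These seven values are invented
# stand-ins for the real 533-point dataset.
z = [6.6, 12.0, 14.4, 14.6, 16.1, 18.3, 23.2]

print(min(z), max(z))                      # minimum and maximum
print(round(statistics.mean(z), 1))        # mean
print(statistics.median(z))                # median
# Population standard deviation; the lab may instead have used the
# sample standard deviation (statistics.stdev) -- an assumption here.
print(round(statistics.pstdev(z), 2))
```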

The sampling related to the chosen method quite well. Hundreds of elevation points were collected that represented the terrain well. The sampling technique did not change from how it was originally planned out. It was even discussed before the survey that only every other point along the grid in the "plain" area of the terrain would be measured due to consistent elevation values and that is exactly what occurred. The only thing that was improved upon was using the collapsible ruler as a placeholder for measuring the points. The resulting data is what was expected where there was an elevation measured and recorded at every point on the grid except for a few rows in the "plain" area where only every other point was measured. Some problems (and subsequent solutions) that occurred during sampling included:
  1. Hitting hard objects in the sand with the wire on the way down, or hitting soft ground and pushing the wire through it, gave inaccurate elevation measurements. Solution: the wire was pushed firmly through objects in the sand and pressed lightly against the ground once it reached that point.
  2. Losing track of which point or row was being measured. Solution: The collapsible ruler was used to mark the column that was being measured but did not mark the row. For this, the color of the pin in the wood on that row was used to mark the row being measured. 
  3. The laptop ran out of battery with about three rows remaining. Solution: The remaining data points were recorded in a notebook and transferred over into the spreadsheet after the computer was reconnected to a power source. 

Conclusion

The sampling technique used in this exercise, which involved measuring and recording points along a grid that covered a surface area, relates to the definition of sampling given at the beginning of the introduction, where a small part of the data was recorded and then used to represent the whole population (terrain). Sampling, in this situation, provided a less time consuming method for gathering elevation data points on the terrain than measuring the elevation at every possible location (which would be an extremely tedious and time consuming effort). With over 500 points collected, the number of data points adequately sampled the extent of the terrain. If the survey were to be refined to better match the desired sampling density, it would be by measuring areas of higher relief more closely (every 2.5 cm as opposed to 5 cm) and the areas with lower relief less closely (every 7.5 cm as opposed to 5 cm).

Tuesday, October 4, 2016

Hadleyville Cemetery Mapping - Week 3

Introduction

The biggest problem that the Hadleyville cemetery faces is the loss of all records and maps detailing who is buried in the cemetery and where. The cemetery has been in use since 1865, and many of the tombstones may be broken or illegible. The data must now be generated from scratch because there is no background material to reference. A GIS will allow the creation of an attribute table attached to a feature class representing the individual plots at the cemetery. Because the data will be entered into a geodatabase that can be viewed, analyzed, and updated, this is more than just a simple map project. A GPS unit, field notebooks, and a UAS drone will be used to gather the necessary data. The overall objectives include identifying as many stones and burials as possible and entering the gathered data into a spreadsheet as accurately as possible for use in creating the GIS map. The GIS will include a basemap and plotted grave marker points with attribute data such as first and last name, year of birth and death, and a picture of the plot, just to name a few.

Study Area

The study area is located in Eleva, WI, in the northwestern part of Trempealeau County (see figure 1 below). The cemetery sits on the south side of County Road HH, out in the countryside surrounded by fields and trees.
Figure 1: Eleva Locator Map
The data was collected in early September about two weeks before the official start of the fall season. 

Methods

The class used a combination of a survey-grade GPS unit, a UAS drone, cameras, and field notebooks to conduct the survey. The GPS unit was used to plot the grave markers, the UAS drone gathered aerial imagery of the cemetery to be used as a basemap, cameras were used to take close-up pictures of the grave markers, and the field notebooks were used to gather observed attribute data off of the grave markers. Due to time constraints in the field, data was collected at a quicker pace, which may have led to data entry errors; this was evident when comparing recorded attribute data with the pictures that were taken of the grave markers. The attribute data from the grave markers was recorded through observation in field notebooks in an organized fashion. The graveyard was "mapped" out in grid form, where students designated grave markers with a letter for the row followed by the number of the grave marker within that row (refer to figure 2 below for clarity). The recorded data was then transferred onto an online spreadsheet that all students were able to access and edit, which provided each student with the same, normalized table. The final, standardized attributes that the class agreed upon were:
  •  PointID (a letter and number associated with each grave marker)
  •  Notes
  •  Joint tombstone (whether or not a grave marker was shared between two or more people)
  •  Legible
  • First name
  • Last name
  • Middle initial
  • Year of birth
  • Year of death
  • Standing (if the headstone was upright or not)
  • Marker type (type of material marker was made from), and
  • Occupancy number (number of people that shared the same stone)
Figure 2: Cemetery Grid Map
Some issues that occurred when transferring data included making sure everyone labeled his/her grave markers with the proper PointID, students using different terminology in the same field (for instance, some entered 'Partially' as opposed to 'Somewhat' in the 'Legible' field), and some confusion over what exactly students meant by their entries in the 'Notes' field. Overall, however, the data normalization process went smoothly considering how much effort was needed to collaborate on how to proceed with the data entry and normalization process.
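The terminology mismatches could also have been cleaned up programmatically. Below is a hypothetical sketch of that normalization step for the 'Legible' field; the synonym table is an assumption for illustration, not the class's actual mapping:

```python
# Hypothetical sketch: mapping the varied terms students used in the
# 'Legible' field onto one agreed vocabulary. The synonym table is an
# assumption, not the class's actual mapping.
LEGIBLE_SYNONYMS = {
    "yes": "Yes", "y": "Yes",
    "no": "No", "n": "No",
    "partially": "Partially", "somewhat": "Partially",
    "partial": "Partially",
}

def normalize_legible(raw):
    """Return the standardized term; unknown values pass through
    unchanged (trimmed) so they can be reviewed by hand."""
    return LEGIBLE_SYNONYMS.get(raw.strip().lower(), raw.strip())

print(normalize_legible("Somewhat"))  # -> Partially
```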

Using the UAS imagery in the GIS and the survey data in conjunction with the grid map in figure 2, it was possible to then plot each grave marker in the GIS and join the attribute table to the plotted points. The UAS provided high resolution imagery that made it possible to clearly see each individual grave marker which allowed for relatively easy plotting. 

Results/Discussion

The final result featured a grave marker attribute table (see figure 3 below) and a GIS map with locator maps included (see figure 4 below).
Figure 3: Graves Attribute Table (partial)

Figure 4: Hadleyville Cemetery GIS Map
The time spent creating the GIS itself was much longer than the time spent collecting the data. The survey GPS only collected a small portion of the grave marker plot points, so that method ended up not being utilized. Instead, students used the grid map along with their recorded data to plot each grave marker as best as possible. The time spent in the field collecting data was approximately two hours, while transferring the data and creating the GIS took several hours and included transferring the recorded data into a spreadsheet, plotting each individual grave marker, attaching attributes and photos to each one, and creating the final map. Possible sources of error include data being inaccurately recorded and transferred, attaching attribute data and photos to the incorrect grave markers, and not precisely plotting the grave markers (as some were located underneath trees and could not be seen in the UAS imagery).

Having more time or using a different type of GPS unit for plotting the grave markers would have allowed for more thorough and faster data point collection and subsequent plotting in the GIS, and having created the cemetery grid along with a normalized spreadsheet prior to recording the attribute data would have allowed all students to be on the same page and made collecting data much more efficient. 

Conclusion

Overall, the methods served the objectives quite well, although the work could have been done more efficiently had students known beforehand how to handle the data collection; but such is the way of fieldwork. While the UAS provided accurate imagery for the basemap, the recorded data was prone to human error in recording and transfer, which became evident when comparing the data in the attribute table to the grave marker photos. These sources of error are not negligible and may only be acceptable where a grave marker was difficult to read; otherwise the attribute data should have been recorded and transferred accurately. Although it is not 100% accurate, this GIS project provides something better than the original situation, which was no records of any kind whatsoever. The survey was successful overall, as it provided a GIS map with plotted grave markers, attribute data, and photos that can now be updated and edited in the future.

Sources

Eleva locator map (figure 1): https://en.wikipedia.org/wiki/Eleva,_Wisconsin

Cemetery grid map (figure 2): Marcus Sessler