Posted on June 25, 2013 by Carl Lipo
Uluhe ferns (Dicranopteris linearis) are native to Hawaii, and they protect other native species such as hala trees (Pandanus tectorius). Mapping the spatial distribution of native Hawaiian species like uluhe ferns with remote sensing imagery is a necessary step toward protecting the limited number of remaining native species from non-native, invasive species. This paper demonstrates how object-based image analysis (OBIA) in eCognition can be used to classify uluhe ferns in WorldView-2 imagery. The overall accuracy of the classification was about 86% (kappa = 0.77) for ferns, grass, and trees in the Ka’a’awa Valley.
Uluhe ferns grow in very dense thickets, often more than 3 m deep, because new fronds grow on top of old or dying ones (Russell et al., 1998). As a result, the thickets are green at the top, brown underneath, and grey at the bottom. This makes it possible to differentiate ferns from grass in remote sensing imagery: viewed in true color, the ferns appear rougher in texture and darker green with some brown mixed in, while grass is very smooth and bright green. The ferns also cover large areas that are only sparsely populated with trees, which are usually native species such as hala; this pattern is also visible in the imagery. In addition, because the ferns grow on top of each other and completely surround native trees, it is difficult for invasive species to move in and displace the native vegetation. Thus, uluhe ferns could potentially be used to protect other native species in ecosystem management. Finally, the ferns tend to grow on very steep slopes at mid elevations, which meant that a DEM could be used to help extract ferns in eCognition.
The objective of this research was to map the uluhe ferns located at the back of the valley using WorldView-2 imagery and eCognition. The eCognition software (Definiens Developer 8.0.1) performs object-based image analysis, which takes objects rather than pixels as the unit of analysis. Objects are created through a segmentation command in which the user defines the scale parameter, shape, compactness, and image layer weights. These settings help the user attain the best segmentation, in which each object contains only one class, and objects are not so small that there is no real difference in the range of values covered by the classes the user is trying to define.
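eCognition's segmentation and classification are proprietary, but the core object-based idea can be sketched in plain Python: pixels are first grouped into objects, and classification then operates on per-object statistics (for example, the object mean) rather than on individual pixels. The tiny grid, the object labels, and the threshold below are invented for illustration; they are not the actual imagery or segmentation.

```python
# Minimal illustration of object-based vs. pixel-based analysis.
# A tiny "image" of NIR reflectance values; the object labels are
# hand-made here, whereas eCognition derives them by segmentation.
nir = [
    [0.82, 0.80, 0.31, 0.30],
    [0.79, 0.81, 0.29, 0.33],
    [0.78, 0.77, 0.32, 0.28],
]
labels = [
    [1, 1, 2, 2],
    [1, 1, 2, 2],
    [1, 1, 2, 2],
]

def object_means(values, labels):
    """Average the pixel values inside each object (segment)."""
    sums, counts = {}, {}
    for row_v, row_l in zip(values, labels):
        for v, l in zip(row_v, row_l):
            sums[l] = sums.get(l, 0.0) + v
            counts[l] = counts.get(l, 0) + 1
    return {l: sums[l] / counts[l] for l in sums}

means = object_means(nir, labels)
# Classify each whole object at once, e.g. by a reflectance threshold.
classes = {l: ("vegetation" if m > 0.5 else "bare") for l, m in means.items()}
print(classes)  # {1: 'vegetation', 2: 'bare'}
```

The point of the object-based approach is that a speckled pixel here or there no longer flips the classification: the object's summary statistic carries the decision.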
The study area is the Ka’a’awa Valley, on the island of Oahu in Hawaii. The valley is part of the Kualoa Ranch, which is involved in cattle ranching, tourism, and filming. The valley is therefore considered fairly pristine, but we need to be aware that plants may grow in certain places because of human activity or plant management rather than natural processes.
For this research I covered only the back of the Ka’a’awa Valley, because GPS points of vegetation collected throughout the entire valley showed that there were no ferns at the front. Field data were collected in June 2013. A total of 44 GPS points were taken with a Garmin GPSMAP 60CSx at places with ferns or grass, to help distinguish the two when looking at the WorldView-2 imagery. Two to five points were taken for each fern patch, depending on its size, but only one point was taken for grass. The WorldView-2 imagery and the Oahu DEM used for this research were from 2011. When I overlaid the GPS points on the imagery, only one of the fern patches I had surveyed actually fell on an area of the 2011 imagery that resembled a fern patch; it is shown in Figure 1 with a red circle at the top of the map. All the other points seemed to lie on trees or grass. The other two red circles on the map mark potential fern patches that look similar to the patch with GPS points. I remember seeing two fern patches at approximately those places, but I could not collect GPS points there because they were very high up on the valley wall.
In order to conduct the OBIA in eCognition, a subset of the WorldView-2 imagery covering only the back of the Ka’a’awa Valley was used, since the full scene extends well beyond my study area. I then ran multiresolution segmentation with scale parameter = 50, shape = 0.1, and compactness = 0.9; for the image layer weights, NIR2 and Yellow were given values of 5 while all other bands had values of 1. To classify the uluhe ferns, I set minimum and maximum boundaries for a variable, and objects that met the criteria were assigned to a temporary class called ‘extra1’. This command is called a ruleset. Another ruleset then selected objects only from class ‘extra1’ and assigned them to ‘extra2’. This process was repeated until I reached a set of objects that I considered to be ferns, which I then assigned to ‘Uluhe Ferns’. The same process was used for the other classes, but for trees I also used the distance to shadow objects: I first classified some objects as shadow with one ruleset, then created another ruleset stating that any object less than 50 pixels from a shadow should be classified as ‘Tree’. The rulesets used mean, standard deviation, and texture variables. I also created a slope raster with ArcToolbox in ArcGIS 10.1 and imported it into eCognition so that each object’s slope values could be used to classify the ferns.
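The sequential ruleset logic described above (each rule keeps only objects from the previous temporary class that fall inside a min/max band for one feature, ending with the final class assignment) can be sketched roughly as follows. The objects, feature names, and thresholds here are invented placeholders, not the actual values used in eCognition.

```python
# Hedged sketch of eCognition-style sequential rulesets: each rule
# narrows the previous (temporary) class using a min/max band on one
# object feature. Objects and thresholds are illustrative only.
objects = [
    {"id": 1, "mean_nir": 0.62, "slope": 35, "class": "unclassified"},
    {"id": 2, "mean_nir": 0.64, "slope": 5,  "class": "unclassified"},
    {"id": 3, "mean_nir": 0.20, "slope": 40, "class": "unclassified"},
]

def apply_rule(objects, from_class, to_class, feature, lo, hi):
    """Reassign objects of `from_class` whose feature lies in [lo, hi]."""
    for obj in objects:
        if obj["class"] == from_class and lo <= obj[feature] <= hi:
            obj["class"] = to_class

# Rule 1: spectrally fern-like objects go to the temporary class 'extra1'.
apply_rule(objects, "unclassified", "extra1", "mean_nir", 0.5, 0.8)
# Rule 2: of those, only objects on steep slopes become 'Uluhe Ferns'.
apply_rule(objects, "extra1", "Uluhe Ferns", "slope", 20, 60)

print([o["class"] for o in objects])
# ['Uluhe Ferns', 'extra1', 'unclassified']
```

Chaining the rules this way means each later criterion (such as slope) only has to separate the remaining candidates, not the whole scene.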
Figure 2 shows the final classification map, and Figure 3 shows the same area in true color. The uluhe ferns at the center of the map in Figure 2 are in the same places that I considered to be ferns. However, the ferns in the top left corner of the map look like trees in Figure 3, so this could be overclassification, or there may be ferns underneath the trees whose spectral values the image captured. I had no way to verify this because I could not access that portion of the valley.
The accuracy of the OBIA was assessed by selecting samples for each class and running the Error Matrix based on Samples statistic in eCognition’s Accuracy Assessment Tool. To do this, I first created a new point shapefile in ArcMap 10.1 and populated it with the Create Random Points tool. The points were then buffered at 5 m with the Buffer tool. The buffer was imported into eCognition, where I selected samples by assigning the object containing each random point to a specific class. Afterwards, I noticed that no fern samples had been selected because of the small area the ferns cover, so I chose the three sites that I knew to be ferns as fern samples. This is why the values for ferns in Table 2 are all 1.
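The sampling step above amounts to drawing random points over the scene and recording, for each point, the class of the object it falls inside. A minimal sketch of that lookup, using made-up rectangular object extents in place of real eCognition objects:

```python
import random

random.seed(0)  # reproducible illustration

# Illustrative object extents (x0, y0, x1, y1) and their mapped classes;
# these are invented stand-ins for the classified eCognition objects.
objects = [
    {"bbox": (0, 0, 50, 50),   "mapped": "Grass"},
    {"bbox": (50, 0, 100, 50), "mapped": "Tree"},
]

def object_at(x, y):
    """Return the object whose extent contains the point (assumes full coverage)."""
    for obj in objects:
        x0, y0, x1, y1 = obj["bbox"]
        if x0 <= x < x1 and y0 <= y < y1:
            return obj
    return None

# Draw random sample points and record the mapped class at each one.
# In the real workflow the reference class at each point comes from
# field knowledge or photo interpretation, not from the map itself.
samples = []
for _ in range(5):
    x, y = random.uniform(0, 100), random.uniform(0, 50)
    samples.append(object_at(x, y)["mapped"])
print(samples)
```

This also makes the fern problem visible: a class covering a tiny fraction of the area is unlikely to be hit by a small random sample, which is why the fern sites had to be added by hand.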
The Kappa Index of Agreement (KIA) is shown for each class along with the overall value, which was 0.77. A KIA close to 1 means the classification is in near-perfect agreement with the reference samples; for the uluhe ferns this value is inflated, since I only selected samples that I knew would be accurately classified. A KIA close to 0 means the agreement is no better than what would be expected by chance (Viera & Garrett, 2005). The overall classification accuracy was 86%.
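The kappa statistic reported by eCognition can be computed directly from an error matrix: it is the observed agreement minus the agreement expected by chance, divided by one minus the chance agreement. The matrix below is illustrative, not the actual error matrix from this study.

```python
# Cohen's kappa from a confusion matrix (rows = mapped class,
# columns = reference class). Values are illustrative only.
matrix = [
    [40,  3,  2],   # Ferns
    [ 4, 50,  6],   # Grass
    [ 1,  7, 60],   # Trees
]

def kappa(matrix):
    """kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    total = sum(sum(row) for row in matrix)
    observed = sum(matrix[i][i] for i in range(len(matrix))) / total
    # Chance agreement: sum over classes of (row total * column total) / total^2.
    chance = sum(
        sum(matrix[i]) * sum(row[i] for row in matrix)
        for i in range(len(matrix))
    ) / total ** 2
    return (observed - chance) / (1 - chance)

print(round(kappa(matrix), 3))
```

Because chance agreement is subtracted out, kappa is always lower than the raw overall accuracy, which is why the study reports 86% accuracy but kappa = 0.77.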
The main limitation of this research was that it was difficult to identify the uluhe ferns in the WorldView-2 imagery, because the GPS points were collected in June 2013 while the imagery was from 2011; ferns that existed in 2013 seemed not to exist in 2011. The spatial resolution of the WorldView-2 imagery also made it difficult to tell grass and ferns apart, and time constraints prevented a more refined classification. I also did not know the spectral values of the different vegetation types, which would have helped greatly in setting the thresholds for classifying uluhe ferns, grass, or trees. Collecting them, however, would have been dangerous: I would have had to walk up a very steep slope through very dense fern thickets with both hands occupied holding a spectrometer and the Yuma, and a third hand would have been needed to press the button on the Yuma’s screen to trigger a spectrometer reading.
For future research, I would like to try using LiDAR data and an iFSAR DSM instead of the DEM. I did not have a chance to use the iFSAR DSM because I only gained access to it two days before the presentation and paper were due. I would also like to compare UAV and WorldView-2 imagery to see which is better for classifying uluhe ferns. Ideally, field data would be collected in the same year as the imagery.
Oahu DEM Source: National Geophysical Data Center, NESDIS, NOAA, U.S. Department of Commerce (2011)
Russell, A. E., Raich, J. W. and Vitousek, P. M. (1998), The ecology of the climbing fern Dicranopteris linearis on windward Mauna Loa, Hawaii. Journal of Ecology, 86(5), 765–779.
Viera, A. J., & Garrett, J. M. (2005). Understanding interobserver agreement: the kappa statistic. Fam Med, 37(5), 360-363.
WorldView-2 imagery Source: DigitalGlobe (2011)
Revision @ 09/28/15 9:09 pm by Carl Lipo
Revision @ 09/28/15 9:09 pm by Audrey
Revision @ 09/28/15 9:09 pm by Greg Hosilyk