
GRI Blog

Mapping of Invasive Phragmites in the Pearl River Coastal Wetlands and the Results of its Eradication Efforts

November 21, 2016 - Sathish Samiappan, Gray Turnage
Visible Imagery - Pearl River
Figure 1 Visible imagery collected near Pearlington, MS on September 23, 2014
Journal Article: Using unmanned aerial vehicles for high-resolution remote sensing to map invasive Phragmites australis in coastal wetlands

Phragmites australis (common reed) is a perennial grass commonly found in brackish and freshwater wetlands. This invasive grass, with an average height of 15 ft., appears to be rapidly outcompeting native species in many areas of the US. Since European colonization, at least two subspecies of common reed have been introduced from Eurasia (haplotype M) and Africa (haplotype I). Haplotype M is rapidly displacing the native subspecies throughout its range, except in the southeastern US. Haplotype I appears to be the predominant subspecies of Phragmites invading wetland ecosystems along the Gulf Coast. Phragmites seeds abundantly and spreads rapidly by vegetative growth from stout, creeping rhizomes. Invasion of native ecosystems by Phragmites has been shown to have negative impacts on the local ecology, most notably through decreased biodiversity. In highly braided water bodies, like the delta of the Pearl River in southeastern Louisiana, Phragmites can also be a navigation hazard to small boats by reducing visibility, as the plant regularly reaches heights greater than 15 ft. Once established, Phragmites spreads, outcompetes native plants for resources, and can eventually form large monocultures. In this paper, we describe our efforts to map invasive Phragmites using an Unmanned Aerial System (UAS) in the Pearl River delta. We also demonstrate the effectiveness of this approach in terms of both accuracy and cost.

Map of Study Area
Figure 2 Map of the study area. Pearl River delta along the Louisiana and Mississippi state border
Resource managers currently use a variety of tools, including mowing, grazing, burning, and herbicides, to control Phragmites. All of these methods rely on knowing the location of Phragmites before management efforts begin. The pervasive spread of common reed in wetlands makes precise mapping a unique challenge. Mapping is usually achieved through a variety of methods, including satellite imagery, manned aircraft, and walking around or through a Phragmites stand with a GPS unit. Although satellite imagery is easily accessible, its spatial resolution is not sufficient for detecting short stands of Phragmites (~2-6 ft.) or individual plants. This can allow a Phragmites stand to reestablish after management efforts have been completed, with individual plants and small stands acting as refugia. The revisit rates of satellites further limit the availability of near real-time data, and the long wait times can hamper management efforts because the plants may have spread beyond the last known border of a stand. Manned aircraft can alleviate the wait time and poor resolution associated with satellite data; however, they are seldom an economically viable option, as they involve high costs and can be prone to pilot error. Manually mapping a stand by walking its perimeter with a GPS unit requires many man-hours and carries the dangers associated with groundwork (i.e., dangerous wildlife or difficult terrain). All of these methods have drawbacks, such as reduced image resolution, long periods between updated photos, poor cost efficiency, pilot error, and the hazards of groundwork. These hindrances can result in incorrect Phragmites stand locations, which in turn hamper or slow management efforts. To overcome these drawbacks, we used a UAS capable of collecting geo-referenced, high-resolution visible imagery: an Altavian Nova UAS with an 18-megapixel Canon EOS Rebel camera. This system negates the low resolution and long update times of satellite imagery, the cost and potential pilot error associated with manned aircraft, and the hazards of on-the-ground fieldwork.

Imagery Collected on September 23, 2014
Figure 3 The Red-Green-Blue images captured using a Canon EOS Rebel SL1 on an Altavian Nova platform on 23 September 2014
An Unmanned Aerial System (UAS) is a reusable motorized aerial vehicle that can fly autonomously or semi-autonomously, or can be flown manually by a ground-based pilot using a remote control and/or a ground control station (GCS). These platforms are usually equipped with photogrammetric measurement systems, including, but not limited to, still, video, thermal, infrared, multispectral, and hyperspectral cameras. Figure 1 shows two overlapped image mosaics collected with a visible wavelength camera. Each UAS platform has unique limitations associated with its payload capacity, flight time, and altitude. To determine its flight trajectory, a UAS has an integrated navigation system based on global navigation satellites and inertial navigation, a barometric altimeter, and a directional compass. UASs open new avenues, such as near real-time, low-cost aerial data collection, as an alternative to classical manned aerial systems. The major benefits of employing UASs are their ability to fly in inaccessible regions, their safety, near real-time or real-time data availability for applications such as disaster response, mapping capabilities similar to a manned aircraft, and the ability to map regions at very high spatial resolution at low cost. In short, small UASs can be used as mapping platforms over small-scale areas.

The study area is the lower Pearl River basin, located between southwestern Mississippi and southeastern Louisiana (see Figure 2). The region can be classified as a tidal freshwater marsh: such regions are influenced by the daily influx of tides, yet they have a salinity of less than 0.5 ppt. The region lies in the delta of the Pearl River and drains into the Gulf of Mexico. Due to their heterogeneous habitat, tidal freshwater wetlands harbor diverse communities of plants and animals. This region has one of the healthiest marsh complexes in the Southeast and supports between 120 and 140 fish species and approximately 40 species of mussels, making it one of the most species-rich river systems in the US. Brackish (mesohaline) marsh is found in the lower marsh zone near the mouth of the Pearl River.

Automated Phragmites Mapping
Figure 4 Classification results of Upper Bayou (marked in yellow)
The data used in this study were collected in the lower Pearl River basin west of Pearlington, Mississippi, north and south of U.S. Highway 90, an area totaling over 3,200 acres. Data collected over Desert Island and Deer Island (see Figure 1) on 23 September 2014 were used for creating the texture-based Phragmites map. The Altavian Nova weighs approximately 15 lbs. with payload, with an 8.8 ft. wingspan and a 4.9 ft. length. It can capture data on flights lasting up to 90 minutes, and it is waterproof, enabling water landings in the study area. The camera used was a modified Canon EOS Rebel SL1 with a payload and lens setup that gave approximately 2x2 inch pixels from an altitude of 750 ft. The internal IR filter had been removed from the camera for prior missions to allow it to capture a response in the near-infrared region of the electromagnetic spectrum; this configuration is often referred to as a color infrared camera. An external filter was used on this mission to restrict the camera's response to the visible range. Each image is 5184x3456 pixels with 14 bits/band. Images obtained from each flight were mosaicked on a per-flight basis in Agisoft Photoscan Pro. The UAS stores the latitude, longitude, and altitude of the aircraft for each image taken; this information was uploaded to Photoscan Pro to give initial camera positions. Individual images with 50% side overlap and 70% forward overlap were used for creating the image mosaics, which were typically produced at the high quality setting in Photoscan Pro for both camera alignment and generation of the digital elevation model from the stereo imagery. Images were exported as 3184x3184 tiles and stitched together into a virtual mosaic using the Geospatial Data Abstraction Library (GDAL) software. Georeferencing was performed using only the flight telemetry data; ground control points and post-flight corrections were not used.
The mosaics produced were typically within a few meters or better of their true position. Should higher accuracy be desired, the imagery can be rectified against the base maps within ESRI ArcMap or against National Agricultural Imagery Program (NAIP) imagery. Because of the filter modifications (the removed internal IR filter and the added external filter), it was necessary to adjust the mapping from raw values to color values before creating the mosaic: the camera's metering sensor sensed a different spectral range than its imaging sensor, so the imagery had to be white-balanced in a post-processing step. White balancing removes unrealistic color casts from pixels that should appear white. The open source software RawTherapee was used to create white-balance profiles that were applied to the data. For optimal results, we needed to apply the profiles manually and adjust as necessary, as a single white-balance profile did not always produce the desired results. Two profiles were used: the first as the default, and the other when the imagery contained water that appeared purple. Typically, white balancing had minimal effect on the end results of the classification algorithm.
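To illustrate the idea behind the white-balancing step, the sketch below applies a simple gray-world correction, which scales each channel so its mean matches the overall mean intensity. This is not the RawTherapee profile workflow the study actually used; the function name and the gray-world assumption are illustrative only.

```python
import numpy as np

def gray_world_balance(img):
    """Gray-world white balance: scale each channel so its mean matches
    the overall mean intensity, removing a uniform color cast."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, img.shape[2]).mean(axis=0)
    gain = channel_means.mean() / channel_means   # per-channel gain factors
    balanced = img * gain                         # apply gains to every pixel
    return np.clip(balanced, 0, 255).astype(np.uint8)
```

Applied to an image with a reddish cast, the corrected channels end up with roughly equal means; profile-based workflows like the one described above achieve the same goal with hand-tuned reference colors instead of a global assumption.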

Texture Feature Extraction and Classification

Phragmites Eradication by Burning
Figure 5 The Multispectral (Red, Green, Blue, Red Edge and Near Infrared) images captured using a Mica Sense camera on a Precision Hawk Lancaster platform on 3 March 2016
Texture is a critical feature in imagery; it can compensate for a lack of spectral richness when classifying land covers. Visible imagery collected from a UAS has very high spatial resolution, so texture can be a particularly meaningful feature for distinguishing Phragmites from other plant species. Visual inspection of the visible imagery revealed unique properties of Phragmites, mainly its roughness, granulation, and regularity, which motivated the use of texture features for Phragmites classification. Grey level co-occurrence matrix (GLCM) based texture features were used in this study. A GLCM is a matrix, G, whose number of rows and columns equals the number of intensity levels (I_N) in an image. The large image mosaic was divided into sub-images of 100x100 pixels, and GLCM features were extracted for each sub-image with a 20% overlap between sub-images to avoid edge effects. Refer to the journal article linked above for more details about the GLCM algorithm implemented in this work. Prior to running the classifications, three Phragmites patches were selected for use as ground truth (GT) patches. These patches were accessed by boat; once the field crew was on site, a crew member walked around each patch with a handheld GPS unit (a Trimble Geo 7X with sub-decimeter accuracy) to record the patch boundary. Navigation to and around patches was difficult due to location, terrain, and vegetation, which further highlights the need for studies of this nature to decrease the dangers and costs of fieldwork.
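As a rough illustration of this feature-extraction pipeline, the following numpy sketch quantizes each sub-image, builds a GLCM for a one-pixel horizontal offset, and extracts three common Haralick-style statistics. The function names, the choice of eight gray levels, and the single offset are assumptions for illustration; the study's own implementation, described in the linked journal article, differs in its details.

```python
import numpy as np

def glcm(sub, levels=8):
    """GLCM for a horizontal offset of one pixel, normalized to sum to 1."""
    q = (sub.astype(int) * levels) // 256            # quantize to `levels` gray levels
    G = np.zeros((levels, levels))
    np.add.at(G, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)  # count co-occurrences
    return G / G.sum()

def glcm_features(G):
    """Haralick-style statistics commonly used as texture features."""
    i, j = np.indices(G.shape)
    contrast = np.sum((i - j) ** 2 * G)
    energy = np.sum(G ** 2)
    homogeneity = np.sum(G / (1.0 + np.abs(i - j)))
    return np.array([contrast, energy, homogeneity])

def tile_features(mosaic, win=100, overlap=0.2):
    """Slide a win x win window with the given overlap across a mosaic
    and extract a feature vector per sub-image."""
    step = int(win * (1 - overlap))
    feats = []
    for r in range(0, mosaic.shape[0] - win + 1, step):
        for c in range(0, mosaic.shape[1] - win + 1, step):
            feats.append(glcm_features(glcm(mosaic[r:r + win, c:c + win])))
    return np.array(feats)
```

A perfectly uniform sub-image yields zero contrast and maximal energy, while a rough Phragmites canopy yields high contrast, which is the separation the classifier exploits.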

Phragmites Eradication
Figure 6 Burned regions are marked in yellow
After the fieldwork was complete, the image mosaic from the UAS flight was loaded into ESRI's ArcMap program, and the boundaries of the same three patches were digitized manually by selecting boundary locations based on direct visual inspection of the mosaic. The digitized (DIG) boundaries were then compared to the GT boundaries collected in the field. There was little noticeable difference between the two sets of boundaries (GT and DIG) when inspected visually; thus, we assumed that DIG Phragmites patch boundaries could be used as a surrogate ground truth when running classifications. We then returned to the field to verify DIG Phragmites patch locations along river channels and roadways. During the revisit, we did not navigate around these patches with a GPS unit or visit patches that were inland from a river channel or roadway.

Naive Bayes and maximum likelihood classifiers were initially tested to generate the classification maps, but the results were not encouraging. A Support Vector Machine (SVM) classifier was then found to be a good choice for classifying the extracted texture features. An SVM is a kernel-based classification algorithm that has been shown to be effective for classifying land cover types. The texture features from the GLCM were linearly scaled to the range -1.0 to 1.0; this normalization removed differences in the numerical ranges of the features. Optimal SVM parameters (penalty C and kernel width gamma) were computed using a grid search algorithm. After obtaining the optimal parameters, SVM classifiers were trained and tested using the LIBSVM library. The texture features and spectral bands from the imagery were used to train the SVMs. The problem was set up as a binary classification between Phragmites (P) and non-Phragmites (NP). Training objects for both the P and NP classes were selected randomly from multiple regions throughout the image mosaic using ground truth information, photographs, and field notes, which ensured a representative sample from every region in the study area.
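The scaling and parameter-search steps described above can be sketched as follows. This is a generic illustration rather than the LIBSVM tooling the study used; the grid of powers of two follows common SVM practice, and `train_eval` is a placeholder for whatever cross-validated accuracy function is plugged in.

```python
import numpy as np
from itertools import product

def scale_features(X, lo=-1.0, hi=1.0):
    """Linearly scale each feature column to [lo, hi], as done before SVM training."""
    mn, mx = X.min(axis=0), X.max(axis=0)
    span = np.where(mx > mn, mx - mn, 1.0)   # guard against constant columns
    return lo + (hi - lo) * (X - mn) / span

def grid_search(train_eval,
                Cs=2.0 ** np.arange(-5, 16, 2),
                gammas=2.0 ** np.arange(-15, 4, 2)):
    """Exhaustive (C, gamma) search; `train_eval(C, gamma)` should return a
    cross-validated accuracy, and the best-scoring pair is returned."""
    return max(product(Cs, gammas), key=lambda p: train_eval(*p))
```

In practice `train_eval` would wrap LIBSVM (or scikit-learn's `SVC`) with k-fold cross-validation on the scaled training features; the coarse power-of-two grid is usually followed by a finer search around the best cell.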

Results and Eradication Efforts

Mapping of Phragmites and eradication are two independent, uncoordinated operations. This paper demonstrates the ability of the texture-based algorithm to accurately map invasive Phragmites by comparing the mapped stands with the areas eradicated by coastal resource management agencies. When comparing these two results, readers should keep in mind that burning typically extends beyond the targeted stands, so the eradicated areas are larger than the mapped regions. Figure 3 shows the area under study (also shown in Figure 2). Figure 4 shows the automated map created from SVM classification of GLCM texture features, with Phragmites areas shown in yellow. The region under study is north of Honey Island and west of Desert Island in the Pearl River delta. Figure 5 shows imagery collected at a later date (March 2016), after the eradication efforts took place; this imagery was collected using a Precision Hawk Lancaster UAS with a 5-band RedEdge sensor from MicaSense.

Discussion

The image classification methods refined in this work, using very high spatial resolution imagery (2x2 inch pixels), allowed for the mapping of small to large stands of invasive Phragmites in the Pearl River delta region. The method was found to be applicable to stands of all heights, although classification accuracy varied from site to site depending on the density of patches. Our study considered the following major questions about automatic mapping of invasive Phragmites: 1) can UASs be used for successful mapping of Phragmites, 2) can texture features based on the GLCM be used to distinguish Phragmites from non-Phragmites, and 3) can Phragmites be mapped using only low-altitude, high spatial resolution visible imagery? The results suggest each question can be answered in the affirmative. Based on the experimental results and a comparison with the eradication maps, texture features are able to distinguish Phragmites stands. Several important conclusions can be drawn from this study. The GLCM is a computationally efficient and effective technique for classifying Phragmites stands using visible imagery with pixel sizes of 5x5 cm. The ground truth information used in this study was a combination of field visits, photographs, and visual analysis of the high-resolution UAS imagery by an expert in the field of aquatic invasive species. In several inaccessible areas, native trees were surrounded by dense stands of Phragmites, which led to the assumption that the entire area was Phragmites; this human error resulted in higher omission errors in several regions of the image. As a result, future work will be directed toward the inclusion of additional spectral bands such as red edge and near-infrared, which help to differentiate vegetation from non-vegetation and, in some cases, between two different plant types. Unfortunately, the multispectral cameras presently available for small UASs have lower spatial resolution than RGB cameras. Digital surface models that provide height information will also be considered to better classify Phragmites, as it is the tallest grass in these wetlands.

Evaluation of Unmanned Aerial Vehicles (UAVs) for Estimating Distribution and Damage of Feral Swine

November 21, 2016 - Sathish Samiappan
Nine Head of Cattle
Nine head of cattle imaged from 200ft above
The national feral swine population is currently estimated to exceed five million animals. Over the past several years their numbers have increased significantly, and feral swine are now known to exist in at least 40 states. In 1982, feral swine were thought to occur in only a small percentage of counties located in 17 states. Based on data from APHIS Wildlife Services' National Wildlife Disease Program, the Southeastern Cooperative Wildlife Disease Study, and APHIS Veterinary Services' Feral Swine Tracking and Monitoring Data, feral swine are now present in approximately 40% of all counties in the United States. The geographic expansion is primarily due to humans transplanting them to new areas to increase hunting opportunities, while increased local population sizes are mainly due to their high reproductive capacity and ability to thrive in a wide range of habitats. Feral swine cause extensive damage to crops, forests, and livestock; nationally, damage is estimated at 1.5 billion dollars per year. We conducted unmanned aerial vehicle (UAV) based surveys at selected areas where feral swine damage is occurring. We evaluated the resolution necessary to obtain censuses of feral swine using UAVs and automated pattern recognition techniques, and determined whether UAV surveys can distinguish between feral swine and deer damage.

Trapped Hog
Hog trapped in a cage from 200ft imagery
Thermal Snapshot of Trapped Hog
Thermal video snapshot of a trapped hog
Uniformity analysis to identify crop damage

Determining corn crop uniformity across a large field is of tremendous value for monitoring plant health and damage caused by hogs and deer. Texture modeling techniques were investigated to map three different crop densities (low, medium, and high) on a corn field using visible imagery collected with UAVs.

Texture Modeling Techniques:

Gray Level Co-Occurrence Matrix (GLCM)
The GLCM is a statistical method of examining texture that considers the spatial relationship of pixels; it is also known as the gray-level spatial dependence matrix. The GLCM characterizes the texture of an image by calculating how often pairs of pixels with specific values occur in a specified spatial relationship, creating the matrix, and then extracting statistical measures from it.
Segmentation-based Fractal Texture Analysis (SFTA)
The SFTA extraction algorithm decomposes the input image into a set of binary images, from which the fractal dimensions of the resulting regions are computed to describe segmented texture patterns. SFTA has been successfully used for content-based image retrieval (CBIR) and image classification, achieving higher precision and accuracy than other widely employed feature extraction methods such as Haralick features and Gabor filter banks. Additionally, SFTA was at least 3.7 times faster than Gabor and 1.6 times faster than Haralick with respect to feature extraction time.
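A minimal sketch of the two core ingredients of SFTA, threshold-based binary decomposition and box-counting fractal dimension, might look like the following. The threshold scheme here is a simple linear spacing rather than SFTA's two-threshold binary decomposition, and the function names are illustrative.

```python
import numpy as np

def box_count_dimension(binary):
    """Estimate the fractal dimension of a binary image by box counting:
    count occupied boxes at several scales, then fit log(count) vs log(1/size)."""
    sizes, counts = [], []
    s = min(binary.shape) // 2
    while s >= 2:
        h = (binary.shape[0] // s) * s
        w = (binary.shape[1] // s) * s
        boxes = binary[:h, :w].reshape(h // s, s, w // s, s)
        sizes.append(s)
        counts.append(max(boxes.any(axis=(1, 3)).sum(), 1))  # occupied boxes
        s //= 2
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

def sfta_features(img, n_thresholds=4):
    """Decompose an image into binary images via evenly spaced thresholds
    and take the fractal dimension of each as a texture feature."""
    ts = np.linspace(img.min(), img.max(), n_thresholds + 2)[1:-1]
    return np.array([box_count_dimension(img > t) for t in ts])
```

A solid filled region comes out with dimension close to 2, while ragged crop-damage boundaries fall between 1 and 2, which is what makes the measure useful as a texture descriptor.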
Wavelet Texture Analysis
Wavelet texture analysis is based on the application of a 2D wavelet transform to each raw sub-image, which essentially consists of transforming a matrix of numbers (pixel intensities, as we are analyzing single-channel or grey-level images) into another, with the same size (same overall number of wavelet coefficients), containing blocks of coefficients for different scales (from the finest to the coarsest scale, which is known as the decomposition depth) and along three different directions (horizontal, vertical and diagonal).
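A single level of this decomposition can be sketched with the Haar wavelet, the simplest 2D wavelet, in plain numpy. The band-energy features and the averaging normalization (instead of orthonormal scaling) are simplifying assumptions for illustration.

```python
import numpy as np

def haar2d(img):
    """One level of the 2D Haar transform: returns the approximation block (LL)
    and the horizontal/vertical/diagonal detail blocks (LH, HL, HH)."""
    a = img.astype(np.float64)
    # rows: average and difference of adjacent pixel pairs
    lo = (a[:, 0::2] + a[:, 1::2]) / 2
    hi = (a[:, 0::2] - a[:, 1::2]) / 2
    # columns: repeat on both half-band results
    ll = (lo[0::2, :] + lo[1::2, :]) / 2
    lh = (lo[0::2, :] - lo[1::2, :]) / 2
    hl = (hi[0::2, :] + hi[1::2, :]) / 2
    hh = (hi[0::2, :] - hi[1::2, :]) / 2
    return ll, lh, hl, hh

def wavelet_energy(img, depth=2):
    """Texture features: mean energy of each detail band at each level,
    recursing on the approximation block down to the decomposition depth."""
    feats, ll = [], img
    for _ in range(depth):
        ll, lh, hl, hh = haar2d(ll)
        feats += [np.mean(lh ** 2), np.mean(hl ** 2), np.mean(hh ** 2)]
    return np.array(feats)
```

A smooth region concentrates its energy in the LL block (detail energies near zero), while directional texture such as crop rows shows up in one detail band more than the others.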
Texture Analysis based on Gabor Filters
In image processing, a Gabor filter is a linear filter used for edge detection. The frequency and orientation representations of Gabor filters are similar to those of the human visual system, and they have been found to be particularly appropriate for texture representation and discrimination. In the spatial domain, a 2D Gabor filter is a Gaussian kernel function modulated by a sinusoidal plane wave.

Simple cells in the visual cortex of mammalian brains can be modeled by Gabor functions. Thus, image analysis with Gabor filters is thought to be similar to perception in the human visual system.

Usually, a filter bank consisting of Gabor filters with various scales and rotations is created. The filters are convolved with the signal (image), resulting in a so-called Gabor space. This process is closely related to processes in the primary visual cortex.
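A sketch of such a filter bank, assuming the standard real-valued Gabor formulation (a Gaussian envelope multiplied by a cosine carrier), is shown below; the parameter defaults are illustrative rather than tuned values from any study.

```python
import numpy as np

def gabor_kernel(size=21, wavelength=6.0, theta=0.0, sigma=4.0, gamma=0.5):
    """2D Gabor filter: Gaussian envelope modulated by a cosine plane wave.
    `theta` rotates the filter; `wavelength` sets the sinusoid period;
    `gamma` controls the envelope's aspect ratio."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

def gabor_bank(scales=(4.0, 8.0), n_orientations=4):
    """Filter bank spanning several wavelengths and evenly spaced orientations."""
    thetas = np.arange(n_orientations) * np.pi / n_orientations
    return [gabor_kernel(wavelength=w, theta=t) for w in scales for t in thetas]
```

Convolving an image with each kernel in the bank and stacking the response magnitudes gives the "Gabor space" described above, from which per-pixel or per-window texture features can be taken.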
Hog Damage at Study Site
Hog damage at the study site delineated by a human expert with field knowledge (not from looking at the UAV imagery)
Visible Damages Caused by Hogs
Visible damage caused by hogs in the field, shown on the right

Evaluation of Unmanned Aerial Vehicles (UAVs) for Estimating Distribution and Abundance of Waterbirds on Catfish Aquaculture Facilities

November 21, 2016 - Sathish Samiappan
Double-crested cormorants, American white pelicans, and other fish-eating birds are abundant in Mississippi, and there is considerable stakeholder concern over the apparent increase in utilization of cultured catfish by these birds. These factors have raised concerns regarding depredation impacts and economic losses to the catfish aquaculture industry attributable to fish-eating birds.

Proportional use and count information are important in determining the economic impact of fish-eating birds on the catfish aquaculture industry. Historically, surveys have been conducted from the ground or by air using certified pilots, typically in fixed-wing aircraft. Recent advances in unmanned aerial vehicle (UAV) technology may provide a more cost-effective and less risky way to conduct aerial surveys of waterbirds on catfish aquaculture facilities and at their roost sites.

We conducted UAV-based surveys at selected catfish aquaculture facilities in the primary catfish-producing areas of Mississippi. We evaluated the resolution and extent of coverage necessary to support remotely sensed, pattern-recognition-based UAV censuses of fish-eating birds.

Precision Hawk and Robota UAV platforms were used to collect visible imagery at different altitudes from two sites. The first site is located north of Leland in Washington County, Mississippi (-90.891425, 33.447374 decimal degrees). Imagery from four flights at 200 ft, 400 ft, and 600 ft altitudes was collected over the 80-acre site using a Sony RX-100 20 MP visible camera on a Robota Tritan UAV. This data was collected on 09/08/2014 and 10/18/2014. The ground resolutions corresponding to the 200 ft, 400 ft, and 600 ft flights are approximately 1.2 cm, 2.5 cm, and 3.5 cm (0.5 inch to 1.25 inch), respectively. The second site is located south of Indianola in Humphreys County, Mississippi (-90.539764, 33.327089 decimal degrees). Three flights at 200 ft, 400 ft, and 600 ft were collected over the 120-acre site using a visible Canon camera on a Precision Hawk Lancaster UAV. This data was collected on 03/24/2015. The ground resolutions corresponding to the 200 ft, 400 ft, and 600 ft flights are approximately 1.5 cm, 3 cm, and 4.5 cm (0.75 inch to 2 inch), respectively. For all seven flights, a large image mosaic was constructed from the overlapped individual images. At both locations, American white pelicans and blue herons were spotted; we could not spot double-crested cormorants at these locations to collect imagery.

Pattern Recognition Experiments and Results

Pattern recognition algorithms such as color segmentation and template matching were applied to the imagery to automatically count the birds.

Figures 1 and 3 show two snapshots of white pelicans and blue herons, respectively, that were used for color segmentation, and Figures 2 and 4 show the results of the segmentation. It can be clearly observed that this algorithm effectively segments the birds in the imagery. In the first step, the algorithm requires manual or automatic selection of the color of the target object to be segmented. Delta-E color segmentation is then performed in the Lab color space. This algorithm is very fast and accurate at segmenting objects with a single dominant color tone (averaged over a region), such as white pelicans and blue herons. A connected-component object counting algorithm is then used to count the segmented objects, yielding an estimate of the number of birds in the imagery.
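The two stages described here, Delta-E thresholding followed by connected-component counting, can be sketched as below. The sketch assumes the imagery has already been converted to the Lab color space and uses the simple CIE76 Delta-E (Euclidean distance in Lab); the threshold value, minimum blob size, and function names are illustrative.

```python
import numpy as np
from collections import deque

def delta_e_mask(lab_img, target_lab, threshold=12.0):
    """CIE76 Delta-E segmentation: Euclidean distance in Lab space to a
    reference color; pixels within `threshold` are treated as the target."""
    d = np.linalg.norm(lab_img - np.asarray(target_lab, dtype=float), axis=2)
    return d < threshold

def count_objects(mask, min_size=5):
    """4-connected component labeling via flood fill; counts blobs (birds)."""
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    count = 0
    for r0 in range(h):
        for c0 in range(w):
            if mask[r0, c0] and not seen[r0, c0]:
                size, queue = 0, deque([(r0, c0)])
                seen[r0, c0] = True
                while queue:                       # flood-fill one component
                    r, c = queue.popleft()
                    size += 1
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < h and 0 <= cc < w and mask[rr, cc] and not seen[rr, cc]:
                            seen[rr, cc] = True
                            queue.append((rr, cc))
                if size >= min_size:               # ignore speckle noise
                    count += 1
    return count
```

The `min_size` filter plays the role of rejecting isolated misclassified pixels so that only bird-sized blobs contribute to the count.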

Figure 5 shows a snapshot where both white pelicans and blue herons were present. Color segmentation is very useful for identifying and segmenting a bird purely by its color, whereas a template matching algorithm can identify birds based on both color and shape. Figure 6 shows the result of the template matching algorithm identifying eight blue herons in the image.
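A direct, unoptimized sketch of template matching by normalized cross-correlation on a single-channel image is shown below; production pipelines would typically use an optimized library routine, and the detection threshold here is an illustrative value.

```python
import numpy as np

def match_template(image, template, threshold=0.9):
    """Normalized cross-correlation: slide the template over the image and
    return (row, col) positions where the correlation exceeds `threshold`."""
    th, tw = template.shape
    t = template - template.mean()                 # zero-mean template
    t_norm = np.sqrt((t ** 2).sum())
    hits = []
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            win = image[r:r + th, c:c + tw].astype(float)
            w = win - win.mean()                   # zero-mean window
            denom = np.sqrt((w ** 2).sum()) * t_norm
            if denom > 0 and (w * t).sum() / denom > threshold:
                hits.append((r, c))
    return hits
```

Because the correlation is normalized, matching is insensitive to overall brightness differences between the template and the scene, which helps when birds are imaged under varying illumination.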

Figure 1
Figure 1 Snapshot of a region where several white pelicans were spotted in the imagery
Figure 2
Figure 2 Result of Delta-E color segmentation algorithm to segment the birds
Figure 3
Figure 3 Snapshot of a region where several blue herons were spotted in the imagery
Figure 4
Figure 4 Result of Delta-E color segmentation algorithm to segment the birds

Figure 5
Figure 5 Snapshot of a region where several white pelican and blue herons were spotted in the imagery
Figure 6
Figure 6 Result of Template matching algorithm to segment the blue herons