
GRI Blog

Mapping of Invasive Phragmites (Common Reed) in Gulf of Mexico Coastal Wetlands Using Multispectral Imagery and Small Unmanned Aerial Systems

December 14, 2016 - Sathish Samiappan, Gray Turnage
The Study Site
Figure 1. The study site (shown in grey) near Pearlington, Mississippi, USA (about 66 ha) selected for developing and evaluating techniques to map invasive Phragmites.
Typical Plant Reflectance Characteristics
Figure 2. Typical plant reflectance characteristics of various wetland plants and spectral response of MicaSense RedEdge camera.
PrecisionHawk Lancaster
Figure 3. Precision Hawk Lancaster UAV with 5-band MSRE multispectral sensor
Camera Trigger Points
Figure 4. Camera trigger point locations (shown as Green dots) where the images are captured with MSRE camera to construct the orthomosaic.
Study Area with Ground Reference
Figure 5. The study area with ground reference of Phragmites (boundaries are shown in yellow).
Phragmites Classification Maps
Figure 6. Phragmites classification maps produced using MSRE bands, DSM, NDVI, SAVI, and MAP - MOI as features (Phragmites GR areas are outlined in yellow and the classification map is overlaid in orange with 50% transparency)
Journal Article: Mapping of Invasive Phragmites (Common Reed) in Gulf of Mexico Coastal Wetlands Using Multispectral Imagery and Small Unmanned Aerial Systems

Introduction

In coastal wetlands of the Gulf of Mexico, the invasive plant species Phragmites australis (common reed) rapidly alters the ecology of a site by shifting plant communities from heterogeneous mixtures of plant species to homogeneous stands of Phragmites. Phragmites grows in very dense stands at an average height of 4.6 m and outcompetes native plants for resources. To restore affected wetlands, resource managers require an accurate map of Phragmites locations. Previous studies have used satellite- and manned-aircraft-based remote-sensing images to map Phragmites over relatively large areas at a coarse scale; however, low-altitude, high-spatial-resolution pixel-based classification approaches can improve mapping accuracy. This study explores supervised classification methods to accurately map Phragmites in coastal wetlands at the delta of the Pearl River in Louisiana and Mississippi, USA, using high-resolution (8 cm ground sample distance) multispectral imagery collected from a small unmanned aerial system (UAS) platform at an altitude of 120 m. We create a map through pixel-based support vector machine (SVM) classification using the blue, green, red, red edge, and near-infrared spectral bands, along with a digital surface model (DSM), vegetation indices, and morphological attribute profiles as features. This study also demonstrates the effects of different features and their usefulness in generating an accurate map of Phragmites locations. Accuracy assessment based on a) a subset of training/testing samples (to show classifier performance) and b) the entire ground reference map (to show the quality of mapping) is demonstrated. Kappa, overall accuracy, class accuracies, and their confidence intervals are reported. An overall accuracy of 91% and a Kappa of 0.63 are achieved. The results of this study indicate that features such as morphological attribute profiles are very useful in accurately mapping invasive Phragmites compared to existing region-based approaches.
The complete research and analysis of this study is to be published in the special UAV research issue of the International Journal of Remote Sensing: http://dx.doi.org/10.1080/01431161.2016.1271480
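The overall accuracy and Kappa figures reported above come from a standard confusion-matrix calculation. A minimal sketch, using a hypothetical two-class confusion matrix (the counts below are illustrative, not the study's actual tallies):

```python
import numpy as np

def accuracy_metrics(confusion):
    """Overall accuracy and Cohen's kappa from a confusion matrix.

    Rows are reference (ground truth) classes, columns are predicted classes.
    """
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    observed = np.trace(confusion) / total          # overall accuracy
    # Chance agreement from the row and column marginals
    expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total**2
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa

# Hypothetical 2-class (Phragmites / non-Phragmites) confusion matrix
cm = [[900, 40],
      [50, 110]]
oa, kappa = accuracy_metrics(cm)
print(f"overall accuracy = {oa:.2f}, kappa = {kappa:.2f}")
```

Kappa is typically well below overall accuracy when one class dominates, which is why both figures are reported in the abstract.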

Study area and image acquisition

Our study site is located in the delta of the lower Pearl River which spans southwest Mississippi and southeast Louisiana. The region is classified as a freshwater tidal marsh which is influenced by the daily influx of tides, yet it has a salinity of less than 0.5 ppt. This area harbors a diverse ecological community. The region around Desert Island (about 66 ha) was chosen as a study site to evaluate the supervised classification (Figure 1). The site is in the lower Pearl River basin west of Pearlington, Mississippi and north and south of U.S. Highway 90. This site was selected because of its ecological and landscape diversity as it contains all prominent land/vegetation cover types in the region and is a reasonable size and location that allowed for the collection of GR data.

High-resolution (8 cm) multispectral imagery was acquired over the Desert Island area on 3 September 2015. The imagery has three visible bands (blue (480 nm), green (560 nm), and red (670 nm)), as well as red edge (720 nm) and near-infrared (840 nm) bands. The responses of different wetland plant species and the bandwidth of each spectral band of the MSRE camera are shown in Figure 2. Ground control points and white reference panels were used to help generate orthomosaics and white-balancing profiles. Each image is 1280x960 pixels with 12 bits per pixel per band. Images obtained from the flight were mosaicked on a per-flight basis using the cloud-based MicaSense Atlas (MSA) platform. The PHawk used in this work is a fully autonomous, fixed-wing unmanned aircraft that is hand launched and can capture data on flights lasting up to 45 minutes at a cruising speed of 50 km per hour. It is built with a single electric motor, and the aircraft weighs approximately 2.4 kg with payload and has a 150 cm wingspan (Figure 3). The PHawk platform utilizes ArduPilot, an open-source unmanned aerial vehicle hardware and software platform sold by 3D Robotics. The optimal flight plan for data collection is computed by the onboard flight computer based on the shape of the survey area, wind speed, and wind direction once the aircraft reaches the target altitude. PHawk has an open-source ground-based flight instrument application known as Mission Planner that provides information about the flight situation of the aircraft, such as altitude, airspeed, and remaining battery energy. Mission Planner is based on the ArduPilot open-source autopilot project and helps set up, configure, and tune the aircraft for optimum performance. Mission Planner can save and load autonomous flight plans to the aircraft with point-and-click entry of waypoints on Google Maps. The camera trigger points recorded by the aircraft's built-in GPS unit are shown in Figure 4.
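The red and near-infrared bands listed above drive the vegetation indices used later as classification features. A small sketch of the standard NDVI and SAVI formulas (the reflectance values below are illustrative only, not measured values from this study):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-10)   # epsilon avoids divide-by-zero

def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index; L=0.5 is the common default."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (1 + L) * (nir - red) / (nir + red + L)

# Illustrative reflectance values for two vegetated pixels
nir = np.array([0.45, 0.50])
red = np.array([0.08, 0.10])
print(ndvi(nir, red))   # higher values indicate denser green vegetation
print(savi(nir, red))
```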
The imagery collected from the UAS and the locations of the Phragmites are shown in Figure 5. The Phragmites boundaries are shown in yellow. The locations of the samples used to train and test the supervised SVM classifier are also indicated in Figure 5 as blue and red patches, corresponding to the NP (non-Phragmites) and P (Phragmites) classes, respectively.

The UAS records latitude, longitude, and altitude of the aircraft for each image taken. The imagery had 70% side overlap and 70% in-track overlap. This information was uploaded to MSA to generate orthomosaics and a DSM.

Feature Extraction and Classification

In our previous research, the usefulness of texture-based features for mapping invasive Phragmites was studied in detail for simulated natural colour imagery. Although computationally expensive, texture features and SVM classification were shown to be a reliable and low-cost method for mapping Phragmites using simulated natural colour images. In this research, multispectral imagery collected from a UAS using a 5-band MSRE camera has been used to map regions infested by invasive Phragmites. Features such as computed DSMs, NDVI, SAVI, and morphological attribute profiles (MAPs) are studied with SVMs as the underlying supervised classifier. The proposed system for mapping invasive Phragmites is based on a stacked feature framework composed of features from four different sources. This combination of features was chosen because of the complexity of the scene under study. The relationship between the choice of features and statistically significant classification accuracy can only be established by carrying out actual experiments; depending on the complexity of the remotely sensed data, the choices may vary broadly.
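The stacked-feature idea can be sketched as follows: per-pixel features (spectral bands, DSM height, vegetation indices, attribute-profile responses) are concatenated into one vector, rescaled, and passed to an SVM. This is an illustrative reconstruction with synthetic data and scikit-learn, not the authors' actual pipeline:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for the per-pixel feature stack: five spectral bands,
# DSM height, NDVI, SAVI, and one attribute-profile response (9 features)
n = 400
X_p  = rng.normal(0.6, 0.1, size=(n, 9))   # "Phragmites" pixels
X_np = rng.normal(0.4, 0.1, size=(n, 9))   # "non-Phragmites" pixels
X = np.vstack([X_p, X_np])
y = np.array([1] * n + [0] * n)

# Scale each feature to a common range, then classify with an RBF-kernel SVM
clf = make_pipeline(MinMaxScaler(feature_range=(-1, 1)), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.score(X, y))
```

In a real workflow each pixel of the orthomosaic would contribute one such feature vector, and the trained classifier would be applied to every pixel to produce the map in Figure 6.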

Results

When compared to alternative methods such as human scouting and image collection from manned aircraft, classification maps produced from UAS imagery are of greater value for resource managers. The amount of time, cost, and risk involved can be considerably reduced with a UAS based mapping system.

Our analysis was performed on a computer with a 3.20 GHz Intel Xeon W3565 quad-core processor running a 64-bit Microsoft Windows operating system with 12 GB of RAM. The overall processing time per km2 was approximately 15 minutes. This is a significant reduction compared to our previous texture-based mapping study (50 minutes per km2) using simulated natural colour imagery (Samiappan 2016). The cost of mapping Phragmites per km2, excluding the cost of the UAS, camera, and the software used to mosaic and classify, is about $40.

Mapping of Invasive Phragmites in the Pearl River Coastal Wetlands and the Results of its Eradication Efforts

November 21, 2016 - Sathish Samiappan, Gray Turnage
Visible Imagery - Pearl River
Figure 1 Visible imagery collected near Pearlington, MS on 23 September 2014
Journal Article: Using unmanned aerial vehicles for high-resolution remote sensing to map invasive Phragmites australis in coastal wetlands

Phragmites australis (common reed) is a perennial grass commonly found in brackish and freshwater wetlands. This invasive grass, with an average height of 15 ft., appears to be rapidly outcompeting native species in many areas of the US. Since colonization by Europeans, at least two subspecies of common reed have been introduced from Eurasia (haplotype M) and Africa (haplotype I). Haplotype M is rapidly displacing the native subspecies throughout its range, except for the southeastern US. Haplotype I appears to be the predominant subspecies of Phragmites invading wetland ecosystems along the Gulf Coast. Phragmites seeds abundantly and spreads rapidly by vegetative growth from stout, creeping rhizomes. Invasion of native ecosystems by Phragmites has been shown to have negative impacts on the local ecology, most notably through decreased biodiversity. In highly braided water bodies, like the delta of the Pearl River in southeastern Louisiana, Phragmites can also be a navigation hazard to small boats by reducing visibility, as the plant regularly reaches heights greater than 15 ft. Once established, Phragmites spreads, outcompetes native plants for resources, and can eventually form large monocultures. In this paper, we elaborate our efforts to map invasive Phragmites using an Unmanned Aerial System (UAS) in the US Pearl River delta. We also demonstrate the effectiveness of this approach in terms of both accuracy and cost.

Map of Study Area
Figure 2 Map of the study area. Pearl River delta along the Louisiana and Mississippi state border
Resource managers currently use a variety of tools, including mowing, grazing, burning, and herbicides, to control Phragmites. All of these methods rely on knowing the location of Phragmites before management efforts are implemented. The pervasive spread of common reed in wetlands presents a unique challenge for precise mapping. Mapping is usually achieved through a variety of methods, including satellite imagery, manned aircraft, and walking around or through a Phragmites stand with a GPS unit. Although satellite imagery is easily accessible, its spatial resolution is not sufficient for the detection of short stands of Phragmites (~2-6 ft.) or individual plants. This can lead to the reestablishment of a Phragmites stand after management efforts have been completed, because individual plants and small stands act as refugia. The revisit rates of satellites further limit the availability of near real-time data, and the long wait times associated with satellite imagery can affect management efforts, as the plants may have spread beyond the last known border of a stand. Manned aircraft can alleviate the wait time and poor resolution associated with satellite data; however, they are seldom an economically viable option, as they involve high costs and can be prone to pilot error. Walking around a Phragmites stand with a GPS unit to map it manually involves many man-hours as well as the dangers associated with groundwork (i.e. dangerous wildlife or difficult terrain navigation). All of these methods have drawbacks, such as reduced image resolution, long periods between updated photos, poor cost efficiency, pilot error, and the dangers associated with groundwork. These hindrances can result in incorrect Phragmites stand locations, which in turn hampers or slows management efforts. To overcome these drawbacks we used a UAS capable of collecting geo-referenced high-resolution visible imagery: in particular, an Altavian Nova UAS with an 18-megapixel Canon EOS Rebel camera. This system negates the low resolution and long update times of satellite imagery, the cost issues and potential pilot error associated with manned aircraft, and the hazards of on-the-ground fieldwork by humans.

Imagery Collected on September 23, 2014
Figure 3 The Red-Green-Blue images captured using a Canon EOS Rebel SL1 on an Altavian Nova platform on 23 September 2014
An Unmanned Aerial System (UAS) is a reusable motorized aerial vehicle that can fly autonomously, semi-autonomously, or under manual control by a ground-based pilot using a remote control and/or a ground control station (GCS). These platforms are usually equipped with photogrammetric measurement systems, including, but not limited to, still, video, thermal, infrared, multispectral, and hyperspectral cameras. Figure 1 shows two overlapped image mosaics collected with a visible-wavelength camera. Each UAS platform has unique limitations associated with the payload being carried, flight time, and altitude. To determine its flight trajectory, a UAS has an integrated navigation system based on global navigation satellites and inertial navigation, a barometric altimeter, and a directional compass. UASs open new avenues such as near real-time, low-cost aerial data collection as an alternative to classical manned aerial systems. The major benefits of employing UASs are their ability to fly in inaccessible regions, their safety, near real-time or real-time data availability for applications such as disaster response, mapping capabilities similar to those of a manned aircraft, and the ability to map regions at very high spatial resolution at low cost. In short, small UASs can be used as mapping platforms for small-scale areas.

The study area is the lower Pearl River basin, located between southwest Mississippi and southeast Louisiana in the US (see Figure 2). The region can be classified as a tidal freshwater marsh. Such regions are influenced by the daily influx of tides, yet they have a salinity of less than 0.5 ppt. This region is located in the delta of the Pearl River and drains into the Gulf of Mexico. Due to the heterogeneous habitat, tidal freshwater wetlands harbor diverse communities of plants and animals. This region has one of the healthiest marsh complexes in the Southeast and supports between 120 and 140 fish species and approximately 40 species of mussels, making it one of the most species-rich river systems in the US. Brackish or mesohaline marsh is found in the lower marsh zone near the mouth of the Pearl River.

Automated Phragmites Mapping
Figure 4 Classification results of Upper Bayou (marked in yellow)
The data used in this study was collected in the lower Pearl River basin west of Pearlington, Mississippi and north and south of U.S. Highway 90, an area totaling over 3,200 acres. Data collected over Desert Island and Deer Island (see Figure 1) on 23 September 2014 were used for creating the texture-based Phragmites map. The Altavian Nova weighs approximately 15 lbs. with payload, with an 8.8 ft. wingspan and a 4.9 ft. length. It can capture data on flights lasting up to 90 minutes, and the airframe is waterproof, enabling water landings in the study area. The camera used is a modified Canon EOS Rebel SL1 with a payload and lens setup that gave approximately 2x2 inch pixels from an altitude of 750 ft. The internal IR filter was removed from the camera for prior missions to allow it to capture a response in the near-infrared region of the electromagnetic spectrum; this is often referred to as a Color Infrared camera. An external filter was used to restrict the response of the camera on this mission to the visible range. The size of the imagery is 5184x3456 pixels with 14 bits per band. Images obtained from each flight are mosaicked on a per-flight basis in Agisoft Photoscan Pro. The UAS stores the latitude, longitude, and altitude of the aircraft for each image taken, and this information was uploaded to Photoscan Pro to give initial camera positions. Individual images with 50% side overlap and 70% forward overlap are used for creating the image mosaics, which were typically produced at the high-quality setting in Photoscan Pro for both camera alignment and generation of the digital elevation model from the stereo imagery. Images were exported in a tiled format of size 3184x3184 and stitched together into a virtual mosaic using Geospatial Data Abstraction Library (GDAL) software. Georeferencing was performed using only the flight telemetry data; ground control points and post-flight corrections were not used.
The mosaics produced were typically within a few meters or better of their true positions. Should higher accuracy be desired, imagery can be rectified using the base maps within ESRI ArcMap or National Agricultural Imagery Program (NAIP) imagery. Because of the filter changes (the internal filter removed and the external filter used), it was necessary to adjust the mapping from raw values to color values of the imagery from the payload before creating the mosaic. The issue is that the metering sensor of the camera sensed a different spectral range than the imaging sensor; effectively, the image must be white-balanced in a post-processing step. The white-balance process removes unrealistic color casts in pixels that are actually white. The open-source software RawTherapee was used to create white-balance profiles that were applied to the data. For optimal results, we needed to apply the profiles manually and adjust as necessary, as a single white-balancing profile did not always produce the desired results. Two profiles were used: the first was the default; the other was used if the imagery had water that appeared purple. Typically, white balancing had minimal effect on the end results of the classification algorithm.
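One common way to implement the correction described above is to scale each channel so that a known white reference region becomes neutral. A minimal NumPy sketch of this idea (not the RawTherapee profile workflow itself, and the toy image values are illustrative):

```python
import numpy as np

def white_balance(image, white_region):
    """Scale each channel so a known white reference region becomes neutral.

    image: float array (H, W, 3) in [0, 1]; white_region: slice or mask
    selecting pixels that should be white (e.g. a reference panel).
    """
    img = image.astype(float)
    ref = img[white_region].reshape(-1, 3).mean(axis=0)   # observed panel color
    gains = ref.mean() / ref                               # per-channel gains
    return np.clip(img * gains, 0.0, 1.0)

# Toy image with a green cast; the top-left 2x2 block is the "white" panel
img = np.full((4, 4, 3), [0.7, 0.9, 0.6])
balanced = white_balance(img, np.s_[:2, :2])
print(balanced[0, 0])   # panel pixel channels are now equal (neutral gray)
```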

Texture Feature Extraction and Classification

Phragmites Eradication by Burning
Figure 5 The Multispectral (Red, Green, Blue, Red Edge and Near Infrared) images captured using a MicaSense camera on a Precision Hawk Lancaster platform on 3 March 2016
Texture is a critical feature in imagery; it can compensate for a lack of richness in spectral resolution when classifying land covers. Visible imagery collected from the UAS has a very high spatial resolution, so texture can be a more meaningful feature for distinguishing Phragmites from other plant species. Visual inspection of the visible imagery revealed unique properties of Phragmites, mainly roughness, granulation, and regularity. This observation motivated the use of texture features for Phragmites classification. Grey-level co-occurrence matrix (GLCM) based texture features were used in this study. A GLCM is a matrix, G, where the number of rows and columns is equal to the number of intensity levels (I_N) in an image. A large image mosaic was divided into sub-images of 100 x 100 pixels, and the GLCM features were extracted for each sub-image with an overlap of 20% between sub-images to avoid edge effects. Further details about the GLCM algorithm implemented in this work are given in the accompanying journal article. Prior to running the classifications, three Phragmites patches were selected to be used as ground truth (GT) patches. These patches were accessed by boat. Once the field crew was on site, a crew member walked around the patches with a handheld GPS unit to record the patch boundaries. The GPS unit used was a Trimble Geo 7X with sub-decimeter accuracy. Navigation to and around the patches was difficult due to location, terrain, and vegetation. This further highlights the need for studies of this nature to decrease the dangers and costs of fieldwork.

Phragmites Eradication
Figure 6 Burned regions are marked in yellow
After the fieldwork was complete, the image mosaic from the UAS flight was loaded into ESRI's ArcMap program. The boundaries of the same three patches were digitized manually by selecting the boundary locations of the patches based on direct visual inspection of the image mosaic in ArcMap. After the digitized (DIG) boundaries were completed, they were compared to the GT boundaries that had been collected in the field. There was little noticeable difference between the two boundaries (GT and DIG) when inspected visually. Thus, we assumed that DIG Phragmites patch boundaries could be used as surrogate ground truth of Phragmites patches when running classifications. We then returned to the field to verify DIG Phragmites patch locations along river channels and roadways. We did not navigate around these patches with a GPS unit or visit patches that were inland from a river channel or roadway during the revisit.

Naive Bayes and maximum likelihood classifiers were initially tested to generate the classification maps, but the results with these classifiers were not encouraging. A Support Vector Machine (SVM) classifier was then found to be a good choice for classifying the extracted texture features. An SVM is a kernel-based classification algorithm that has been shown to be effective for classifying land cover types. The texture features from the GLCM were linearly scaled to the range -1.0 to 1.0; this normalized the numerical differences between feature values. Optimal SVM parameters (penalty C and kernel parameter gamma) were computed using a grid search algorithm. After obtaining the optimal parameters, SVM classifiers were trained and tested using the LIBSVM library. The texture features and spectral bands from the imagery were used to train the SVMs. The binary classification problem was set up to classify Phragmites (P) and non-Phragmites (NP). The training objects for both the P and NP classes were selected randomly from multiple regions throughout the image mosaic using ground truth information, photographs, and field notes. This ensured a representative sample from every region in the study area.
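The scaling and grid search over C and gamma described above can be sketched with scikit-learn's `GridSearchCV` (the study used LIBSVM directly; the synthetic features and parameter grid below are placeholder assumptions):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(2)
# Synthetic texture features already scaled to [-1, 1]
X = np.vstack([rng.uniform(-1, 0, (100, 8)),    # "NP" samples
               rng.uniform(0, 1, (100, 8))])    # "P" samples
y = np.array([0] * 100 + [1] * 100)

# Cross-validated grid search over penalty C and RBF kernel width gamma
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]},
                    cv=5)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```

The best (C, gamma) pair from the grid is then used to train the final classifier on all training objects.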

Results and Eradication efforts

Mapping of Phragmites and eradication are two independent, non-coordinated operations. This paper demonstrates the ability of the texture-based algorithm to accurately map invasive Phragmites by comparing its output with the areas eradicated by the coastal resource management agency. When comparing these two results, readers should keep in mind that burning causes collateral damage, so the eradicated areas are always larger than the targeted regions. Figure 3 shows the area under study (also shown in Figure 2). Figure 4 shows the automated map created from SVM classification using GLCM texture features; the Phragmites areas are shown in yellow. The region under study is north of Honey Island and west of Desert Island in the delta of the Pearl River. Figure 5 shows the imagery collected at a later date (March 2016), after the eradication efforts took place. This imagery was collected using a Precision Hawk Lancaster UAS with a 5-band RedEdge sensor from MicaSense.

Discussion

The image classification methods refined in this work using very high spatial resolution imagery (2x2 inch pixels) allowed for the mapping of small to large stands of invasive Phragmites in the Pearl River delta region. The method was found to be applicable for stands of all heights; however, the classification accuracy for each site varies depending on the density of the patches. Our study considered the following major questions about automatic mapping of invasive Phragmites: 1) can UASs be used for successful mapping of Phragmites, 2) can texture features based on the GLCM be used to distinguish Phragmites from non-Phragmites, and 3) can Phragmites be mapped using only low-altitude, high-spatial-resolution visible imagery? The results suggest each question can be answered in the affirmative. Based on the experimental results and comparison with the eradication maps, texture features are able to distinguish Phragmites stands. Several important conclusions can be drawn from this study. The GLCM is a computationally efficient and effective technique for classifying Phragmites stands using visible imagery with pixel sizes of 5x5 cm. The ground truth information used in this study is a combination of field visits, photographs, and visual analysis of the high-resolution UAS imagery by an expert in the field of aquatic invasive species. In several inaccessible areas, native trees were surrounded by dense stands of Phragmites, which led to the assumption that the entire area was Phragmites. This human error resulted in higher omission errors in several regions of the image. As a result, future work will be directed toward the inclusion of additional spectral bands such as red edge and near-infrared. These bands help to differentiate vegetation from non-vegetation and, in some cases, between two different plant types. Unfortunately, the presently available multispectral cameras for small UASs have a lower spatial resolution than RGB cameras.
Digital surface models that provide height information will also be considered to better classify Phragmites, as it is the tallest grass in these wetlands.

Evaluation of Unmanned Aerial Vehicles (UAVs) for Estimating Distribution and Damage of Feral Swine

November 21, 2016 - Sathish Samiappan
Nine Head of Cattle
Nine head of cattle imaged from 200 ft above
The national feral swine population is currently estimated to exceed five million animals. Over the past several years their numbers have increased significantly, and feral swine are now known to exist in at least 40 states. In 1982, feral swine were thought to occur in only a small percentage of counties located in 17 states. Based on data from APHIS Wildlife Services' National Wildlife Disease Program, the Southeastern Cooperative Wildlife Disease Study, and APHIS Veterinary Services' Feral Swine Tracking and Monitoring Data, feral swine are now present in approximately 40% of all counties in the United States. The geographic expansion is primarily due to humans transplanting them to new areas to increase hunting opportunities, while increased local population sizes are mainly due to their high reproductive capacity and ability to thrive in a wide range of habitats. Feral swine cause extensive damage to crops, forests, and livestock; nationally, damage is estimated at $1.5 billion per year. We conducted unmanned aerial vehicle (UAV) based surveys at selected areas where feral swine damage is occurring. We evaluated the resolution necessary to obtain censuses of feral swine using UAVs and automated pattern recognition techniques, and to determine whether UAV surveys can distinguish between feral swine and deer damage.

Trapped Hog
Hog trapped in a cage, from 200 ft imagery
Thermal Snapshot of Trapped Hog
Thermal video snapshot of a trapped hog
Uniformity analysis to identify crop damages

Determining corn crop uniformity over a large field is of tremendous value for monitoring plant health and damage caused by hogs and deer. Texture modelling techniques were investigated to map three different crop densities (low, medium, and high) in a corn field using visible imagery collected by UAVs.

Texture Modeling Techniques:

Gray Level Co-Occurrence Matrix (GLCM)
GLCM is a statistical method of examining texture that considers the spatial relationships of pixels; it is also known as the gray-level spatial dependence matrix. The GLCM characterizes the texture of an image by calculating how often pairs of pixels with specific values occur in a specified spatial relationship, and then extracting statistical measures from the resulting matrix.
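The pair-counting step can be written directly in NumPy. A bare-bones sketch for a single offset (one pixel to the right), with a tiny hand-checkable image:

```python
import numpy as np

def glcm(image, levels, offset=(0, 1)):
    """Count how often gray-level pairs occur at the given pixel offset."""
    dr, dc = offset
    g = np.zeros((levels, levels), dtype=int)
    rows, cols = image.shape
    for r in range(rows - dr):
        for c in range(cols - dc):
            g[image[r, c], image[r + dr, c + dc]] += 1
    return g

img = np.array([[0, 0, 1],
                [0, 1, 1],
                [2, 2, 2]])
print(glcm(img, levels=3))
```

Statistics such as contrast, energy, and homogeneity are then computed from the (usually normalized) matrix `g` to form the texture feature vector.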
Segmentation-based Fractal Texture Analysis (SFTA)
The SFTA extraction algorithm decomposes the input image into a set of binary images, from which the fractal dimensions of the resulting regions are computed in order to describe segmented texture patterns. In the past, SFTA has been successfully used for content-based image retrieval (CBIR) and image classification, achieving higher precision and accuracy than other widely employed feature extraction methods such as Haralick features and Gabor filter banks. Additionally, SFTA was at least 3.7 times faster than Gabor and 1.6 times faster than Haralick with respect to feature extraction time.
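The core SFTA ingredients, threshold-based binary decomposition and box-counting fractal dimension estimation, can be sketched as follows. This is a simplified illustration under assumed fixed thresholds; the full SFTA algorithm selects threshold pairs via multi-level Otsu and also measures region size and mean gray level:

```python
import numpy as np

def box_count_dimension(binary):
    """Estimate the fractal dimension of a binary image by box counting."""
    sizes = [2, 4, 8, 16]
    counts = []
    for s in sizes:
        h = binary.shape[0] // s * s
        w = binary.shape[1] // s * s
        blocks = binary[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())   # occupied boxes
    # Slope of log(count) vs log(1/size) estimates the dimension
    slope = np.polyfit(np.log(1.0 / np.array(sizes)),
                       np.log(np.maximum(counts, 1)), 1)[0]
    return slope

rng = np.random.default_rng(3)
image = rng.integers(0, 256, size=(64, 64))

# Fixed-threshold decomposition into binary images, as in SFTA
thresholds = [64, 128, 192]
for lo, hi in zip(thresholds[:-1], thresholds[1:]):
    binary = (image >= lo) & (image < hi)
    print(lo, hi, box_count_dimension(binary))
```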
Wavelet Texture Analysis
Wavelet texture analysis is based on the application of a 2D wavelet transform to each raw sub-image, which essentially consists of transforming a matrix of numbers (pixel intensities, as we are analyzing single-channel or grey-level images) into another, with the same size (same overall number of wavelet coefficients), containing blocks of coefficients for different scales (from the finest to the coarsest scale, which is known as the decomposition depth) and along three different directions (horizontal, vertical and diagonal).
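A single level of the 2D Haar transform, the simplest wavelet, illustrates the approximation block and the three directional detail blocks described above. A minimal sketch assuming even image dimensions (the energy of each detail block serves as a texture feature):

```python
import numpy as np

def haar_step(x):
    """One level of a 2D Haar transform: approximation + 3 detail blocks."""
    # Averages and differences of adjacent columns
    lo = (x[:, 0::2] + x[:, 1::2]) / 2
    hi = (x[:, 0::2] - x[:, 1::2]) / 2
    # Same operation along the rows
    ll = (lo[0::2] + lo[1::2]) / 2      # approximation (coarse scale)
    lh = (lo[0::2] - lo[1::2]) / 2      # horizontal detail
    hl = (hi[0::2] + hi[1::2]) / 2      # vertical detail
    hh = (hi[0::2] - hi[1::2]) / 2      # diagonal detail
    return ll, lh, hl, hh

rng = np.random.default_rng(4)
sub = rng.normal(size=(64, 64))

# Mean energy of each detail block is a texture feature; recursing on `ll`
# gives coarser scales up to the chosen decomposition depth
ll, lh, hl, hh = haar_step(sub)
features = [np.square(b).mean() for b in (lh, hl, hh)]
print(features)
```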
Texture Analysis based on Gabor Filters
In image processing, a Gabor filter is a linear filter used for edge detection. The frequency and orientation representations of Gabor filters are similar to those of the human visual system, and they have been found to be particularly appropriate for texture representation and discrimination. In the spatial domain, a 2D Gabor filter is a Gaussian kernel function modulated by a sinusoidal plane wave.

Simple cells in the visual cortex of mammalian brains can be modeled by Gabor functions. Thus, image analysis with Gabor filters is thought to be similar to perception in the human visual system.

Usually, a filter bank consisting of Gabor filters with various scales and rotations is created. The filters are convolved with the signal (image), resulting in a so-called Gabor space. This process is closely related to processes in the primary visual cortex.
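A Gabor filter bank over several orientations can be built directly from the Gaussian-times-sinusoid definition above. A minimal sketch; the size, wavelength, and sigma values are arbitrary illustrative choices:

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """2D Gabor kernel: Gaussian envelope times a sinusoidal plane wave."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)        # rotated coordinate
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

# A small bank over four orientations; convolving each kernel with the
# image produces the "Gabor space" responses used as texture features
bank = [gabor_kernel(size=15, wavelength=6, theta=t, sigma=3)
        for t in (0, np.pi/4, np.pi/2, 3*np.pi/4)]
print(len(bank), bank[0].shape)
```

In practice, multiple scales (wavelengths) are included as well, and statistics of each filtered image (e.g. mean and variance of the response magnitude) form the feature vector.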
Hog Damage at Study Site
Hog damage at the study site delineated by a human expert with field knowledge (not from looking at the UAV imagery)
Visible Damages Caused by Hogs
Visible Damages caused by hogs in the field on right

Evaluation of Unmanned Aerial Vehicles (UAVs) for Estimating Distribution and Abundance of Waterbirds on Catfish Aquaculture Facilities

November 21, 2016 - Sathish Samiappan
Double-crested cormorants, American white pelicans, and other fish-eating birds are abundant in Mississippi, and there is considerable stakeholder concern over the apparent increase in utilization of cultured catfish by these birds. These factors have resulted in concerns regarding depredation impacts and economic losses to the catfish aquaculture industry attributable to fish-eating birds.

Proportional use and count information are important in determining the economic impact of fish-eating birds on the catfish aquaculture industry. Historically, surveys have been conducted from the ground or by air using certified pilots, typically in fixed-wing aircraft. Recent advances in unmanned aerial vehicle (UAV) technology may provide a more cost-effective and less risky way to conduct aerial surveys of waterbirds on catfish aquaculture facilities and at their roost sites.

We conducted UAV-based surveys at selected catfish aquaculture facilities in the primary catfish-producing areas of Mississippi. We evaluated the resolution and extent of coverage necessary to support UAV remote sensing and pattern-recognition-based censuses of fish-eating birds.

PrecisionHawk and Robota UAV platforms were used to collect visible imagery at different altitudes from two sites. The first site is located north of Leland in Washington County, Mississippi (-90.891425, 33.447374 decimal degrees). Imagery from four flights at 200 ft, 400 ft, and 600 ft was collected over the 80-acre site using a visible Sony RX100 20 MP camera on a Robota Triton UAV. These data were collected on 09/08/2014 and 10/18/2014. The ground resolutions at 200 ft, 400 ft, and 600 ft are approximately 1.2 cm, 2.5 cm, and 3.5 cm (about 0.5 in to 1.4 in), respectively. The second site is located south of Indianola in Humphreys County, Mississippi (-90.539764, 33.327089 decimal degrees). Imagery from three flights at 200 ft, 400 ft, and 600 ft was collected over the 120-acre site using a visible Canon camera on a PrecisionHawk Lancaster UAV. These data were collected on 03/24/2015. The corresponding ground resolutions are approximately 1.5 cm, 3 cm, and 4.5 cm (about 0.6 in to 1.8 in), respectively. For all seven flights, a large image mosaic was constructed from the overlapping individual images. American white pelicans and blue herons were spotted at both locations; double-crested cormorants were not observed at these locations during imagery collection.
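The relationship between flight altitude and ground resolution reported above follows the standard ground sample distance (GSD) formula: GSD = altitude × pixel pitch / focal length. A minimal sketch, assuming nominal Sony RX100 optics (10.4 mm wide-angle focal length, roughly 2.4 µm pixel pitch — values not stated in the post):

```python
def ground_sample_distance(altitude_m, focal_length_mm, pixel_pitch_um):
    """GSD in metres per pixel: altitude * pixel pitch / focal length."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

FT_TO_M = 0.3048
# Nominal (assumed) Sony RX100 values: 10.4 mm focal length, ~2.4 um pixel pitch.
for alt_ft in (200, 400, 600):
    gsd_cm = ground_sample_distance(alt_ft * FT_TO_M, 10.4, 2.4) * 100
    print(f"{alt_ft} ft -> {gsd_cm:.1f} cm/pixel")   # ~1.4, ~2.8, ~4.2 cm
```

With these assumed optics the computed GSDs land close to the 1.2-3.5 cm range reported for the first site; small differences come from lens zoom setting and terrain relief.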

Pattern Recognition Experiments and Results

Pattern recognition algorithms such as color segmentation and template matching were applied to the imagery to automatically count the birds.

Figures 1 and 3 show snapshots of white pelicans and blue herons, respectively, used for color segmentation, and Figures 2 and 4 show the segmentation results. The algorithm clearly segments the birds in the imagery. It first requires manual or automatic selection of the color of the target object; Delta-E color segmentation is then performed in the Lab color space. This algorithm is fast and accurate for segmenting objects with a single dominant color tone (averaged over a region), such as white pelicans and blue herons. A connected-component counting algorithm is then applied to the segmented objects to estimate the number of birds in the imagery.
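The Delta-E pipeline above (Lab conversion, color-distance threshold, connected-component count) can be sketched in pure numpy. This is an illustrative sketch, not the study's code; the helper names, the CIE76 Delta-E metric, and the tolerance value are assumptions, and a production pipeline would use a library conversion (e.g., skimage's `rgb2lab`) instead of the hand-rolled one here.

```python
import numpy as np
from collections import deque

def rgb_to_lab(rgb):
    """Convert an (H, W, 3) sRGB array in [0, 1] to CIELAB (D65 white point)."""
    rgb = np.asarray(rgb, dtype=float)
    lin = np.where(rgb > 0.04045, ((rgb + 0.055) / 1.055) ** 2.4, rgb / 12.92)
    M = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = lin @ M.T
    xyz /= np.array([0.95047, 1.0, 1.08883])   # normalise by the D65 white point
    f = np.where(xyz > 0.008856, np.cbrt(xyz), 7.787 * xyz + 16.0 / 116.0)
    L = 116.0 * f[..., 1] - 16.0
    a = 500.0 * (f[..., 0] - f[..., 1])
    b = 200.0 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def segment_by_delta_e(rgb, target_rgb, tol=15.0):
    """Binary mask of pixels within tol Delta-E (CIE76) of the target colour."""
    lab = rgb_to_lab(rgb)
    target = rgb_to_lab(np.asarray(target_rgb, dtype=float).reshape(1, 1, 3))[0, 0]
    return np.linalg.norm(lab - target, axis=-1) < tol

def count_objects(mask):
    """Count 4-connected components (one per bird) with a BFS flood fill."""
    seen = np.zeros_like(mask, dtype=bool)
    count = 0
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                count += 1
                queue = deque([(i, j)])
                seen[i, j] = True
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
    return count

# Toy scene: two white "birds" on a dark background.
img = np.zeros((20, 20, 3))
img[2:5, 2:5] = 1.0
img[10:14, 12:16] = 1.0
mask = segment_by_delta_e(img, target_rgb=(1.0, 1.0, 1.0))
print(count_objects(mask))   # 2
```

On real imagery the target Lab color would be the region-averaged sample picked in the manual/automatic selection step, and small components would be filtered out by area before counting.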

Figure 5 shows a snapshot where both white pelicans and blue herons were present. Color segmentation is very useful for identifying and segmenting a bird purely by its color, whereas a template matching algorithm can identify birds based on both color and shape. Figure 6 shows the template matching algorithm identifying eight blue herons in the image.
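Template matching is commonly implemented as a sliding-window normalized cross-correlation between a bird template and the image. A minimal grayscale sketch, assuming hypothetical helper names and a correlation threshold (neither is from the study):

```python
import numpy as np

def match_template(image, template, threshold=0.95):
    """Locate a template in a grayscale image via normalised cross-correlation."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    hits = []
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            window = image[i:i + th, j:j + tw]
            w = window - window.mean()
            denom = np.sqrt((w * w).sum()) * t_norm
            if denom > 0 and (w * t).sum() / denom >= threshold:
                hits.append((i, j))   # top-left corner of each match
    return hits

# Toy example: find two copies of a 3x3 bright cross in a dark image.
template = np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]], dtype=float)
image = np.zeros((12, 12))
image[2:5, 3:6] = template
image[7:10, 7:10] = template
print(match_template(image, template))   # [(2, 3), (7, 7)]
```

Because the correlation is normalized, the match score is invariant to brightness and contrast shifts, which helps when birds appear under different lighting; in practice one template per pose/orientation is matched and overlapping detections are merged.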

Figure 1
Figure 1 Snapshot of a region where several white pelicans were spotted in the imagery
Figure 2
Figure 2 Result of Delta-E color segmentation algorithm to segment the birds
Figure 3
Figure 3 Snapshot of a region where several blue herons were spotted in the imagery
Figure 4
Figure 4 Result of Delta-E color segmentation algorithm to segment the birds

Figure 5
Figure 5 Snapshot of a region where several white pelicans and blue herons were spotted in the imagery
Figure 6
Figure 6 Result of the template matching algorithm identifying the blue herons

Deer Island

June 27, 2016
In the spring and early summer of 2016, the Mississippi State University Geosystems Research Institute deployed an array of UASs to image Deer Island, just off the Mississippi Gulf Coast. The principal objectives were habitat analysis and monitoring of the island's morphological dynamics.

We collected a series of overlapping RGB, CIR, and 5-band multispectral images to create high-resolution visible maps, high-resolution vegetation maps, and 3D point clouds. We also collected several flyover videos of the island to provide context for the mapping and to promote the island.
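Vegetation maps from multispectral imagery of this kind are typically derived from band-ratio indices such as NDVI, computed per pixel from the near-infrared and red bands. A minimal sketch (the function name and toy values are illustrative, not from the project):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - red) / (NIR + red), with zero-sum pixels masked to 0."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    out = np.zeros_like(denom)
    valid = denom != 0
    out[valid] = (nir[valid] - red[valid]) / denom[valid]
    return out

# Toy 2x2 reflectance values: healthy vegetation reflects strongly in NIR.
nir = np.array([[0.50, 0.80], [0.10, 0.00]])
red = np.array([[0.10, 0.20], [0.10, 0.00]])
print(ndvi(nir, red))   # high values (~0.6-0.7) indicate dense vegetation
```

NDVI ranges from -1 to 1; dense green vegetation typically scores above ~0.5, while bare sand and water score near or below zero, which is what makes the index useful for mapping island habitat.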

The UAVs we used included an Altavian F6500, an Altavian F7200, and a DJI Phantom 4. The cameras we used on the Altavian UAVs were a Canon EOS Rebel SL1 (one unmodified and another modified to allow us to sense in the NIR band) and a Micasense RedEdge.

The project was funded by and was in collaboration with the Mississippi Secretary of State's Office and the Mississippi Department of Marine Resources.

Pier Off Deer Island
Coast of Deer Island

Improving Created Wetland Function with Data from Unmanned Aerial Vehicles

April 22, 2016
Bayou Dupont 1
Bayou Dupont 2
Large-scale coastal wetland creation is typically accomplished by placing hydraulically dredged material into a confined area where it de-waters and settles to a designed elevation. The result is a flat homogeneous platform with minimal habitat variability.

Approaches to increase habitat function on newly created marsh often include the removal of barriers to tidal exchange (such as containment dikes) and the excavation of tidal channels to provide variability in habitat and increase the land/water edge interface. Typically, the location and specification of these additional habitat features are part of the initial project design and do not consider post-construction site conditions, which could otherwise be integrated to improve habitat function. By identifying a means to measure minor variability in the elevation of the constructed wetland platform, project managers can connect areas of lower elevation through the excavation of tidal channels. Identifying and connecting existing depressions in the newly created wetland platform can maximize habitat function and minimize the cost of installing functional features.
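Once a digital surface model of the platform exists, flagging candidate depressions can be as simple as thresholding cells that sit below the platform's mean elevation. A deliberately minimal sketch, assuming a gridded DSM in metres and an arbitrary 5 cm threshold (real workflows would use hydrologic sink-filling rather than a global mean):

```python
import numpy as np

def find_depressions(dsm, threshold=0.05):
    """Flag cells more than `threshold` metres below the platform mean elevation."""
    dsm = np.asarray(dsm, dtype=float)
    return dsm < (dsm.mean() - threshold)

# Toy platform: flat at 0.50 m with a shallow depression at 0.30 m.
dsm = np.full((10, 10), 0.50)
dsm[4:7, 4:7] = 0.30
mask = find_depressions(dsm)
print(mask.sum())   # 9 candidate depression cells
```

The flagged cells would then be grouped into contiguous depressions and evaluated for connection to the tidal channel network.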

Bayou Dupont 3
At the Bayou Dupont marsh creation project in southeast Louisiana, the NOAA Restoration Center partnered with the Northern Gulf Institute and Mississippi State University to collect high-resolution imagery using unmanned aerial vehicles. The imagery and associated ground control points were used to create a digital surface model using structure from motion (SfM), a range imaging technique that estimates three-dimensional structure from two-dimensional image sequences. This pilot project will help evaluate the value of digital surface models created with SfM from UAV-collected data. Lessons learned will support advances in low-cost data acquisition and effective management of large-scale created wetland projects.

Mel Landry III
NOAA Restoration Center

Mapping Sensor Integration for EMILY Unmanned Surface Vehicles

April 22, 2016
EMILY USV 1
The Emergency Integrated Lifesaving Lanyard (EMILY) Unmanned Surface Vehicle (USV) was originally designed to be deployed in rugged shallow water conditions by lifeguards as a surf rescue craft, and possesses a number of characteristics well suited to this operational environment. Our current project is focused on integrating sonar sensors on EMILY USVs, in order to produce detailed maps of shallow coastal seafloor bathymetry and imagery.

The first sensor added to the EMILYs was a single beam sonar, which uses reflected sound waves to measure the distance from the bottom of the USV to the seafloor. These observations are corrected with vehicle pitch, heading, and roll information, collected with an inertial measurement unit, to produce maps of coastal water depth. The second sensor added to the EMILYs was a side scan sonar, which uses obliquely reflected sound waves to produce an image of a swath of the seafloor beneath the USV. Automated processing and georeferencing of the sonar data will be done on a dedicated survey computer, and the resulting products will be stored on a ruggedized solid-state drive.
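The attitude correction mentioned above amounts to projecting the slant range measured along the hull's down axis onto the vertical: the range is foreshortened by cos(pitch) × cos(roll). A minimal sketch, assuming a hull-fixed downward beam and small angles (the function name and draft parameter are illustrative, not the project's software):

```python
import math

def corrected_depth(slant_range_m, pitch_deg, roll_deg, transducer_draft_m=0.0):
    """Project a single-beam slant range to vertical depth using vehicle attitude.

    Assumes the beam is fixed along the hull's down axis, so the measured
    range exceeds the true vertical depth by 1 / (cos(pitch) * cos(roll)).
    """
    vertical = (slant_range_m
                * math.cos(math.radians(pitch_deg))
                * math.cos(math.radians(roll_deg)))
    return vertical + transducer_draft_m

# A 3.00 m slant range with 10 deg pitch and 5 deg roll shortens to ~2.94 m.
print(round(corrected_depth(3.00, 10.0, 5.0), 3))   # 2.943
```

Heading from the IMU does not change the depth itself but georeferences each corrected sounding along the survey track.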

The resulting coastal-survey-capable EMILY USVs will have the capacity to map coastal habitats and seafloor features at high resolution while conducting pre-programmed survey missions. This new capability has the potential to improve seafloor survey efficiency relative to traditional crewed vessels. Additionally, these EMILY USVs could be rapidly deployed to the locations of natural disasters (e.g., hurricane landfall) to survey coastal waters for morphological change and the presence of submerged debris, including channel obstructions, which represent hazards to vessel navigation.

EMILY USV 2
EMILY USV 3
EMILY USV 4
EMILY USV 5
EMILY USV 6
EMILY USV 7

EMILY USV 8
EMILY USV on Water

Development of a Cost-effective, Efficient Method to Control Fish-eating Bird Abundance at Aquaculture Facilities

April 11, 2016
UAV
Pelicans and other fish-eating birds can cause significant damage to a catfish farmer's way of life. In recent years, the number of catfish ponds in the Delta has decreased by 50%. Farmers report seeing more fish-eating birds and say that their usual scaring methods are no longer effective. Current scaring methods typically rely on a bird chaser, who uses pyrotechnics and strategic culling with a shotgun while driving around the complex in a vehicle. Birds become accustomed to these noises when they occur frequently at regular intervals and intensities. Given the costs of depredation, disease spread, and harassment, catfish farmers need better and more cost-efficient ways to scare fish-eating birds off their ponds than the commonly used tactic of human harassment.

Researchers 1
Another scare tactic that has not yet been tested is the use of unmanned aerial vehicles (UAVs). UAVs have become increasingly popular for wildlife research. For catfish farmers, this method could be highly effective at scaring pelicans and other fish-eating birds away from their facilities. Using UAVs would require less labor, and with today's rapid advances in technology, it could become much cheaper than human harassment. However, the efficacy of UAVs as avian scaring devices has not been assessed. That is the goal of our current research project at Mississippi State University: to develop a new scaring tactic for catfish farmers that is effective and cost-efficient and keeps bird habituation to a minimum.

Researchers 2
This study is a collaborative research effort involving the USDA/WS National Wildlife Research Center and Mississippi State University's Department of Wildlife, Fisheries, and Aquaculture and Geosystems Research Institute. UAV pilots David Young and Sean Meacham, from Mississippi State's Geosystems Research Institute, will remotely fly a Phantom II quadcopter around the perimeter of the ponds and then focus on harassing any birds still in the area. A Department of Wildlife, Fisheries, and Aquaculture graduate student (Ciera Rhodes) and NWRC staff are measuring the immediate percent reduction in bird abundance by counting the number of birds that return during the first hour following harassment. In addition to the UAV trials, we are also observing the catfish farm's bird chasers during their normal routines, so we can compare the two techniques and determine which scaring tactic is more effective against fish-eating birds. This research is funded by the U.S. Department of Agriculture, Wildlife Services' National Wildlife Research Center (NWRC). The mission of the NWRC is to apply scientific expertise to resolve human-wildlife conflicts while maintaining the quality of the environment shared with wildlife.

If you have any questions, contact Ciera Rhodes at car267@msstate.edu.

GBNERR Wildfire

March 09, 2016
Grand Bay Fire Flight February 2016
Over the past year, the Grand Bay National Estuarine Research Reserve (GBNERR) has partnered with NOAA's Northern Gulf Institute (NGI) and the Geosystems Research Institute (GRI) at Mississippi State University to utilize Unmanned Aircraft Systems (UAS) for a variety of missions. These missions include: high resolution vegetation mapping along GBNERR Sentinel Site research infrastructure, monitoring a simulated disaster response exercise, and mapping the extent of a marsh wildfire. These missions were possible through the support of the NOAA UAS Program Office and the NERR UAS working group.

The most recent mission was flown in response to a wildfire that burned from February 11 to February 18, 2016, across 4,246 acres of marsh and upland habitat within the Grand Bay National Estuarine Research Reserve, Grand Bay National Wildlife Refuge, and adjacent lands. GBNERR wanted imagery of the fire for mapping the affected marsh/upland habitats and analyzing vegetation regeneration. Efficient coordination between GBNERR, NGI, and GRI at MSU allowed the mission to be organized quickly, funding to be identified, and the flight to be vetted through the U.S. Fish and Wildlife Service. On February 25 and 26, an Altavian Nova Block III was flown over the wildfire carrying a Micasense RedEdge payload.

Five-band imagery (blue, green, red, red edge, and near infrared) of almost the entire wildfire area was obtained at 8 cm ground resolution in three flights. Overflights near the GBNERR headquarters, which was on the periphery of the wildfire area, were deemed too dangerous due to visibility limitations, personnel in the building, and high-voltage power lines in the area.