Coastal and Environmental Remote Sensing from Unmanned Aerial Vehicles: An Overview
Victor V. Klemas
Abstract

Klemas, V.V., 2015. Coastal and environmental remote sensing from unmanned aerial vehicles: An overview. Journal of Coastal Research, 31(5), 1260–1267.

Unmanned aerial vehicles (UAVs) offer a viable alternative to conventional platforms for acquiring high-resolution remote-sensing data at lower cost and increased operational flexibility. UAVs include various configurations of unmanned aircraft, multirotor helicopters (e.g., quadcopters), and balloons/blimps of different sizes and shapes. Quadcopters and balloons fill a gap between satellites and aircraft when a stationary monitoring platform is needed for relatively long-term observation of an area. UAVs have advanced designs to carry small payloads and integrated flight control systems, giving them semiautonomous or fully autonomous flight capabilities. Miniaturized sensors are being developed/adapted for UAV payloads, including hyperspectral imagers, LIDAR, synthetic aperture radar, and thermal infrared sensors. UAVs are now used for a wide range of environmental applications, such as coastal wetland mapping, LIDAR bathymetry, flood and wildfire surveillance, tracking oil spills, urban studies, and Arctic ice investigations.

INTRODUCTION

Environmental remote sensing is primarily concerned with the collection and interpretation of information about the oceans, land, and atmosphere from a remote vantage point. Remotely sensed data has been used successfully to predict the weather and to track hurricanes; observe coastal dynamics and detect pollutants; and map coastal land cover, including tidal wetlands, forests, agriculture, and urban areas (Jensen, 2007). Although most of these applications require a certain amount of field data, the remotely sensed information makes it possible to efficiently observe large areas at suitable spatial and temporal resolutions.

Depending on resolution requirements and cost limitations, the remote-sensing platforms can range from helicopters or balloons just above the surface of the Earth, to aircraft at low and midaltitudes, to satellites hundreds of kilometers away in space. Traditionally, satellites have offered large-area coverage, multispectral imaging, and a reliable revisit time for environmental change studies, yet they lacked the spatial resolution required by many applications. High spatial resolution satellite data are now available, but the high resolution and frequent, flexible overflights offered by airborne sensors are more suitable for a wide range of applications, such as land-use mapping, wetlands mapping, coastline delineation, LIDAR bathymetry, and tracking oil slicks. For example, an aircraft overflight can be timed to view a coastal wetland during low tide to improve the detection of emergent and submerged aquatic vegetation (SAV). A low-altitude aircraft with multispectral or hyperspectral imagers can accurately map wetland, SAV, and coral reef habitats (Chust et al., 2008; Garono et al., 2004; Klemas, 2013; Phinn, Stow, and Zedler, 1996; Purkis, 2005; Purkis and Klemas, 2011; Yang and Artigas, 2010). The imagery can be integrated with GPS position information and used as layers in a GIS for a wide range of mapping and modeling applications.

However, manned aircraft overflights can be costly (Klemas, 2011). If one uses small, unmanned aerial platforms, the cost drops dramatically. GPS-guided unmanned aerial vehicles (UAVs) can obtain very high spatial resolution (<10 cm) imagery of specific landscape features, with revisit times determined by the operator rather than by fixed satellite revisit schedules (Lechner et al., 2012). As a result, UAVs, such as drones, quadcopters, balloons, and blimps, are now being used effectively in many environmental studies.

The objective of this article is to acquaint the reader with the types, advantages, and problems of various unmanned aerial platforms that are available for remote-sensing applications.

FIXED-WING UAVs

UAVs are powered aircraft operated remotely or autonomously with preprogrammed flight planning. The two main types of UAV configurations are fixed wing (airplane) and rotary wing (helicopter). The UAVs, sometimes known as drones, offer a viable alternative to conventional platforms for acquiring high-resolution remote-sensing data at lower cost, increased operational flexibility, and greater versatility. The ability of UAVs to capture imagery concurrent with field observations solves a common remote-sensing problem resulting from differences in the acquisition of ground and remote-sensing data.

Military applications of UAV technology have been steadily transferred to civilian and research applications. Improvements in the design of flight control systems have transformed these platforms into research-grade tools capable of acquiring high-quality images and geophysical/biological measurements (Hugenholtz et al., 2012). There have also been advances in the development of miniaturized sensors, specifically designed or adapted for UAV payloads, including hyperspectral imagers, LIDAR, synthetic aperture radar, and thermal and other sensors.

Fixed-wing UAVs have been flown over cities and wetlands to assess damage after hurricanes, floods, and earthquakes. For example, when smoke grounded other aircraft during a 2009 forest fire in Circle, AK, a Predator drone provided infrared imagery that allowed officials to determine that no evacuation was necessary. During the accident at Japan's Fukushima Daiichi nuclear power plant, large drones analyzed the emergency from high altitude, whereas backpack-sized drones inspected the crippled reactors at close quarters (Conniff and McClaran, 2011).

According to Hugenholtz et al. (2012), the Federal Aviation Administration (FAA) has divided UAVs into five categories by weight, ranging from micro (<0.9 kg) through mini, tactical, and medium-altitude long-endurance to high-altitude long-endurance (>13,636 kg) craft. There is also a category of small unmanned aerial vehicles (SUAVs) weighing less than 25 kg, which is being made available for civil operation in the national airspace system. Because of their low cost, UAVs in this category have been the main choice for remote-sensing research projects.

These SUAVs have advanced designs to carry small payloads and integrated flight control systems, giving them semiautonomous or fully autonomous flight capabilities. In autonomous UAVs, the acquisition of remote-sensing data is preprogrammed with flight planning software that calculates waypoints for image acquisition based on the desired ground resolution, amount of image overlap, and area to be surveyed. Digital cameras are commonly part of the payload. During the flight, the UAV's autopilot can communicate via a telemetry link to a ground station running flight control software (Hugenholtz et al., 2012). For example, the AeroVironment RQ-14A “Dragon Eye” has a 110-cm wingspan and weighs about 2.5 kg. Launched by hand or with a bungee cord, the small plane is controlled by GPS coordinates entered into its guidance system with a standard laptop computer. Once aloft, it can transmit video images of the coastal landscape in real time (Edwards, 2009).
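As a rough illustration of this flight-planning step (a minimal Python sketch, not any particular vendor's planning software; the camera parameters in the example are hypothetical), the flying height and waypoint spacings follow directly from the desired ground sample distance and the chosen image overlaps:

```python
import math

def flight_plan(gsd_m, image_width_px, image_height_px,
                sensor_width_mm, focal_length_mm,
                forward_overlap=0.8, side_overlap=0.7):
    """Return flying height and waypoint spacings for a nadir photo survey.

    gsd_m            -- desired ground sample distance (m/pixel)
    sensor_width_mm  -- physical sensor width (mm)
    focal_length_mm  -- lens focal length (mm)
    Overlaps are fractions of the image footprint (0-1).
    """
    pixel_pitch_mm = sensor_width_mm / image_width_px       # size of one pixel on the sensor
    height_m = gsd_m * focal_length_mm / pixel_pitch_mm     # similar triangles: H = GSD * f / pitch
    footprint_w = gsd_m * image_width_px                    # across-track ground footprint (m)
    footprint_h = gsd_m * image_height_px                   # along-track ground footprint (m)
    photo_spacing = footprint_h * (1.0 - forward_overlap)   # distance between exposures along a line
    line_spacing = footprint_w * (1.0 - side_overlap)       # distance between adjacent flight lines
    return height_m, photo_spacing, line_spacing

# Example: 5-cm GSD with a hypothetical 16-mm lens on a 23.5-mm-wide, 4912 x 3264 sensor
h, dp, dl = flight_plan(0.05, 4912, 3264, 23.5, 16.0)
print(f"fly at ~{h:.0f} m; photo every {dp:.0f} m; flight lines {dl:.0f} m apart")
```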

Figure 1 shows the National Aeronautics and Space Administration (NASA) midsized UAV—the Sensor Integrated Environmental Remote Research Aircraft (SIERRA)—which has a 6-m wingspan and a 45-kg exchangeable payload. The UAV was designed by the U.S. Naval Research Laboratory and developed at NASA's Ames Research Center. It has a range of about 1,000 km and a maximum endurance of 10 hours. Its maximum speed is 148 km/h, and its service ceiling is 3,600 m. The SIERRA UAV has been used in many environmental studies, including monitoring floods, wildfires, and volcano hazards; mapping geologic faults; and exploring geothermal and mineral resources (Oleson, 2013).

Figure 1. 

The Sensor Integrated Environmental Remote Research Aircraft (SIERRA) on the tarmac at NASA's Crows Landing Airport in California. Credits: Oleson and Glen (2013).


One of the most intensive applications of UAVs has been monitoring and mapping the extent, biomass, and health of vegetation cover, including tidal wetlands, forests, and agricultural crops. Multispectral and hyperspectral imagers are being developed for incorporation into UAVs to discriminate and map various vegetation species. Thus, UAVs will be able to map and classify coastal vegetation types using just a camera, GPS, and an inertial measurement unit that tracks the position of the UAV in space. For example, a fixed-wing UAV and a helicopter working in tandem have been used to locate weeds in remote rangelands and spray them with herbicide (Marris, 2013). UAVs have also been used to map wetlands and survey estuaries for harmful algal blooms.

Vousdoukas et al. (2011) used small UAVs to generate high-quality, time-averaged images that provided information on the nearshore, including sand bar morphology, the locations of rip channels, and the dimensions of surf/swash zones. A set of more than 6,000 images obtained at altitudes ranging from 150 to 500 m was processed using semiautomatic techniques to create georeferenced snapshots, time-averaged images, and mosaics of the nearshore zone. A comparison with ground-based system images showed that the use of SUAVs provides not only increased spatial coverage but also a more favorable vantage point, as well as portability and rapid, flexible deployment.
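The time-averaging step itself is conceptually simple; a minimal sketch, assuming the frames have already been rectified and co-registered (the semiautomatic georeferencing used by Vousdoukas et al. is not reproduced here, and the file layout is hypothetical), is:

```python
import numpy as np
import imageio.v3 as iio   # any image I/O library would serve equally well

def time_average(frame_paths):
    """Average a stack of co-registered nearshore frames into a 'timex' image.

    Persistent features (sand bars, rip channels, swash limits) emerge as
    bright or dark bands once individual breaking waves are averaged out.
    All frames must share the same dimensions and registration.
    """
    stack = np.stack([iio.imread(p).astype(np.float64) for p in frame_paths])
    return stack.mean(axis=0).astype(np.uint8)

# timex = time_average(sorted(glob.glob("rectified/*.png")))  # hypothetical folder of rectified frames
```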

Pereira et al. (2009) performed video surveillance and coastal monitoring operations with six different types of fixed-wing UAVs having wingspans ranging from 1 to 6 m, with autonomous takeoff and landing capabilities. They demonstrated their applicability not only to military reconnaissance but also to the detection of coastal hazards, fishing surveillance, coastal erosion studies, etc.

The UAVs were equipped with off-the-shelf autopilots controlled by an onboard computer. They used small wireless cameras, capable of real-time video transmission over distances of up to 8 km, and multispectral cameras for more thorough analysis. The drones were developed in a modular way to accommodate the payloads of the most demanding missions, such as search and rescue (SAR). The smaller UAVs could be hand-launched, whereas the larger ones were more suitable for endurance flights and larger payloads.

The RAVEN (Remote Aerial Vehicle for Environment Monitoring) project (O'Young and Hubbard, 2007) was designed to develop control technology enabling a long-endurance UAV to perform safe and effective coastal and maritime surveillance operations in harsh coastal environments. The project focuses on techniques that allow a long-endurance UAV to carry out inspections under extreme weather conditions while obtaining high-texture imagery and providing broad-area coverage. The UAV carries a PC/104 computer connected to a high-resolution digital still camera and a digital video camera, as well as a frame grabber that stores the video on a hard disk. Image processing is performed on the PC/104 on board the UAV, and the video is transmitted through a satellite phone (Iridium) for supervision purposes only. Currently, the authors are working on target detection and tracking. Their next step will be detect/see and avoid (DSA) technology, which will enable commercial UAVs to fly beyond-line-of-sight missions in nonsegregated airspace.

Rajan, Sousa, and Niiler (2014) combined aerial views from a UAV (drone) with measurements from autonomous underwater vehicles (AUVs) to get an unprecedented look at coastal waters off southern Portugal. From the stern of a ship, the scientists launched several AUVs into the water. They then launched a cylindrical drifter buoy (the target) that broadcast its position to the AUVs and served as a mobile hotspot, allowing the underwater vehicles to communicate with the drone and support vessel. The drone was an off-the-shelf aircraft, about 2 m long, with a GoPro camera mounted slightly behind its nose. After the aerial drone spotted the buoy, the underwater and surface vehicles converged on that area to measure the surrounding ocean environment: current direction, speed, and strength, as well as water temperature and salinity. In another experiment, Oliveira et al. (2013) combined AUVs, UAVs, and ships to test new techniques for tracking multiple marine animals in real time, collecting environmental data in the water around each animal, and, at the same time, controlling and coordinating the diverse group of ships and robotic vehicles.

Lechner et al. (2012) used a UAV to study subsidence resulting from underground coal mining, which can alter the structure of overlying rock formations. This activity changes hydrological conditions and affects ecological communities and swamp species sensitive to hydrologic changes. The UAV (Kahu Hawk) carried modified digital cameras, and the imagery was classified using object-based image analysis (OBIA) methods to characterize swamp land cover on the Newnes Plateau near Sydney, Australia. Characterizing the spatial distribution of the swamps was key to identifying long-term changes in swamp conditions.

The Kahu Hawk was a GPS-guided, electric-powered small aircraft with a 2-m wingspan, weighing about 4 kg. It was controlled from a small ground station consisting of a single laptop and a transmitter for communicating with the UAV. Flight paths were uploaded before takeoff, although the UAV could also be flown manually. Imagery was acquired using a Sony NEX-5 micro digital single-lens reflex (DSLR) camera system with a 16-mm lens. This camera system included two cameras: (1) a regular camera for acquiring imagery in the visible spectrum, and (2) a modified full-spectrum camera with a near-infrared filter. The imagery covered an area of about 26 ha from 121 m above ground level, and it took about 45 minutes to complete the overflights. This study demonstrated that images from low-altitude UAVs and OBIA image analysis methods can be combined to accurately classify swamp vegetation extents (Lechner et al., 2012).
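The object-based workflow, segmenting the imagery first and then classifying whole objects rather than individual pixels, can be sketched with open-source tools as follows. This is a simplified analogue of OBIA, not the authors' actual processing chain; the training data, segmentation parameters, and class scheme are placeholders:

```python
import numpy as np
from skimage.segmentation import slic
from skimage.measure import regionprops
from sklearn.ensemble import RandomForestClassifier

def obia_classify(image, training_features, training_labels, n_segments=800):
    """Toy object-based classification: segment, then label each object.

    image             -- H x W x bands float array (e.g., visible + near-infrared composite)
    training_features -- per-object feature vectors (mean band values) with known classes
    training_labels   -- class label for each training object
    """
    # 1. Segment the scene into spectrally homogeneous objects
    segments = slic(image, n_segments=n_segments, compactness=10, start_label=1)
    # 2. Per-object features: mean value of every band inside each segment
    regions = regionprops(segments)
    feats = []
    for region in regions:
        rows, cols = region.coords[:, 0], region.coords[:, 1]
        feats.append(image[rows, cols].mean(axis=0))
    # 3. Train a classifier on labeled objects and predict a class for every segment
    clf = RandomForestClassifier(n_estimators=200).fit(training_features, training_labels)
    object_classes = clf.predict(np.array(feats))
    # 4. Paint the predicted class back onto the segment map
    classified = np.zeros(segments.shape, dtype=int)
    for region, cls in zip(regions, object_classes):
        classified[tuple(region.coords.T)] = cls
    return classified
```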

To improve hurricane-intensity forecasts, the National Oceanic and Atmospheric Administration (NOAA) has been sending manned aircraft into hurricanes for more than 50 years to collect information vital to understanding hurricane dynamics and accurately measuring storm strength. The interface between the near-surface, high-wind environment of air and sea is critical in hurricane dynamics but very risky for manned aircraft to observe directly. More recently, NOAA has used UAVs as instrument platforms capable of transmitting data from within hurricanes at flight levels at or below 150 m altitude, where it is too dangerous to send manned aircraft. During a flight into a hurricane, the UAV spirals into the eye of the storm, riding the waves of air turbulence, while its pilot controls the aircraft from a safe ground station (Rule, 2008). For example, in 2007, an Aerosonde UAV spent 7.5 hours navigating Hurricane Noel's boundary layer in wind gusts of up to 115 km/h, flying as low as 100 m above the ocean. That mission, totaling more than 17 hours, set records for duration and data gathered.

UAVs provide a safe way for scientists to study remote or inaccessible regions. Light-aircraft crashes are a major killer of wildlife biologists and a danger to pilots flying in remote areas, such as the Arctic. For example, UAVs are being used in the polar regions of the Earth for a variety of tasks, from monitoring ice dynamics and the ozone layer to counting seal populations. During some missions, the UAVs have to fly 50 m off the ice in winds of 60 km/h and temperatures of −40°C. One research project is using UAVs to map melt ponds to explain why the edges of Greenland are melting so quickly. Some of the UAVs are catapulted from ships; others are launched from pickup trucks or from icy airstrips. The payloads consist of various combinations of instruments, including cameras, radar, LIDAR, infrared sensors, and chemical-analysis tools (Castelvecchi, 2010; Hutt, 2014; Marris, 2013).

UAVs have been used around the world in recent years to help conserve orangutans, elephants, tigers, and other endangered species. They have also started to provide a remarkable look at hard-to-reach places, like orangutan nests high in the jungles of Sumatra and Borneo, and have shown potential for catching poachers and stopping illegal logging (Averett, 2014). Carrying digital cameras, the UAVs provide geo-referenced photos that can be fed into image-recognition algorithms to significantly improve the accuracy of population counts. Some of these UAVs also carry high-definition cameras with high-power zoom for day flights and thermal imaging cameras for nighttime activities (Platt, 2013).

ROTARY WING UAVs

The advantages offered by helicopters during military and search/rescue operations are well known. Helicopters are also used in many environmental projects, including wetland mapping, LIDAR bathymetry, flood/earthquake damage assessment, oil-slick tracking, urban studies, etc. Helicopters have one major advantage over fixed-wing aircraft in that they can hover over a target site, descend for a closer inspection, and change altitude to provide imagery for mapping at preferred spatial resolutions. For example, helicopters were used to rescue inhabitants of New Orleans after Hurricane Katrina made landfall and to assess the damage to the city once the hurricane had passed (Klemas, 2009; NOAA, 2008). Helicopters can also be used to estimate the extent of an oil slick and to track the movement of the spilled oil or other pollution plume (Klemas, 2010; Pike, 2014). Unmanned helicopters offer similar advantages, but at much lower cost.

Very high spatial resolution remote-sensing images and digital elevation models (DEMs) are widely used in coastal management applications. For example, they are used to quantify morphosedimentary changes of the coastal fringe, including cross-shore and longshore sediment transport, and as input to hydrodynamic numerical models. Spatial resolution, precision, and accuracy are critical parameters of a DEM. Presently, many DEMs built from aerial or satellite images with a spatial resolution coarser than 50 cm are not accurate enough for most applications. An unmanned photogrammetric helicopter (DRELIO) has been developed to address this problem (Delacourt et al., 2009). It is equipped with an autopilot system; after the flight plan is loaded, no ground communication is needed from takeoff to landing. The flight altitude can reach 100 m above the ground, and DRELIO can operate in winds of up to 50 km/h. A reflex camera with high-quality interchangeable optics is carried onboard. Depending on the focal length and flying altitude, the resolution of the images varies from 1 to 5 cm, with ground coverage ranging from about 50 by 75 m up to 250 by 375 m. With stereoscopic images and GPS positioning of reference points in the images, DRELIO facilitates building a DEM and an orthorectified image with a spatial resolution better than 5 cm. The DRELIO system, which produces DEMs comparable to those obtained with LIDAR, currently appears to be more flexible and efficient than other UAVs and less expensive than LIDAR surveys (Delacourt et al., 2009).
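One common use of such repeat DEMs is differencing them to quantify erosion and accretion volumes along the coastal fringe. A minimal sketch, assuming two co-registered DEM grids and an illustrative vertical-uncertainty threshold (both hypothetical), is:

```python
import numpy as np

def sediment_budget(dem_before, dem_after, cell_size_m, uncertainty_m=0.05):
    """Volumetric change between two co-registered DEMs of the same beach.

    Cells whose elevation change is within +/- uncertainty_m are treated as
    noise at the DEM's vertical accuracy and ignored. Returns erosion and
    accretion volumes in cubic metres.
    """
    dz = dem_after - dem_before
    dz[np.abs(dz) < uncertainty_m] = 0.0     # discard change below DEM accuracy
    cell_area = cell_size_m ** 2
    accretion = dz[dz > 0].sum() * cell_area
    erosion = -dz[dz < 0].sum() * cell_area
    return erosion, accretion
```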

Unmanned helicopters come in many different configurations. One popular, stable design is the quadcopter, a multirotor helicopter that is lifted and propelled by four rotors. Quadcopters use two pairs of identical fixed-pitch propellers: two rotating clockwise and two counterclockwise. Vehicle motion is controlled by altering the rotation rate of one or more rotors, thereby changing their torque loads and thrust/lift characteristics. Quadcopters use an electronic control system and electronic sensors to stabilize the aircraft. They have several advantages over single-rotor helicopters. The four-rotor design allows quadcopters to be relatively simple in design yet highly maneuverable and reliable. They do not require mechanical linkages to vary the pitch angle of the rotor blades as they spin, which simplifies the design and maintenance of the vehicle. The use of four rotors also allows each individual rotor to have a smaller diameter than the equivalent single-rotor helicopter, minimizing damage if a crash occurs (Wikipedia, 2014). Typical flying times are in the tens of minutes before the battery needs to be recharged.
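The mapping from pilot (or autopilot) commands to the four rotor speeds is a simple linear "mixer." The sketch below assumes a plus-shaped frame and arbitrary sign conventions; real flight controllers wrap this step in feedback loops driven by the stabilization sensors:

```python
def quad_mixer(throttle, roll, pitch, yaw):
    """Map normalized throttle/roll/pitch/yaw commands to four motor outputs
    for a quadcopter in a '+' configuration (front, right, rear, left).

    Speeding up one diagonal pair and slowing the other changes net reaction
    torque (yaw); unbalancing front/rear or left/right thrust tilts the craft
    (pitch/roll). Sign conventions here are illustrative only.
    """
    front = throttle + pitch - yaw   # front and rear rotors spin one way...
    rear  = throttle - pitch - yaw
    right = throttle - roll + yaw    # ...left and right spin the opposite way
    left  = throttle + roll + yaw
    # Clamp each command to the valid motor range [0, 1]
    return [max(0.0, min(1.0, m)) for m in (front, right, rear, left)]

# Hover with a small nose-down pitch command:
print(quad_mixer(0.5, 0.0, -0.1, 0.0))
```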

Figure 2 shows a Phantom quadcopter in the midprice range of about $800. It carries an integrated 14-MP camera that records 1080p HD video (high-definition video with 1,080 horizontal lines of vertical resolution) to a micro-SD (secure digital) card and streams live first-person view (FPV) video and telemetry over Wi-Fi to a free Vision application (app) for iOS and Android. The Wi-Fi link allows live, remote video streaming from up to 330 m away. An integrated camera tilt motor automatically compensates for the Phantom's single-axis motion by tilting the camera for smoother video; the tilt can also be controlled manually via the Vision app. The quadcopter can fly up to 25 minutes on a single charge using the included lithium polymer battery, which is easily removable and contains a built-in charge-remaining indicator.

Figure 2. 

DJI Phantom 2 Vision Quadcopter with integrated camera. Credits: Amazon (2014).


The need for aircraft with greater maneuverability and hovering ability has led to a rise in quadcopter use in environmental monitoring and mapping. They are now widely used in wetland mapping, flood damage assessment, coastal dynamics studies, urban planning, etc. For example, M. Madden (personal communication) used a quadcopter to map wetlands in Georgia. Her team used a $300 Parrot quadcopter with two built-in cameras, one pointed down for nadir video and the other forward. The video records to a flash drive, and a cell phone is used for control. Most quadcopters can carry a small camera, and many have a mount for a GoPro camera; the maximum payload is typically about 1 kg. Hexacopters and octocopters can carry heavier cameras, cost tens of thousands of dollars, and are often used in movie/TV applications. New systems capable of carrying multispectral and hyperspectral imagers are being designed.

BLIMPS, BALLOONS, AND KITES

Blimps and balloons fill a gap between satellites and aircraft. Blimps and similar airships are by nature low-flying, slow, long-endurance aircraft that provide the stationary monitoring platform needed for long-term observation of an area. They are excellent instrument platforms, being steady and vibration-free. A blimp, unlike an airplane, does not need fuel to stay aloft; it is filled with helium, which is lighter than air, so the buoyant force keeps the blimp afloat. Tethered balloons carrying remote-sensing imagers are being used to study radiometric issues in terrestrial and coastal ecosystems by effectively bridging the gap between measurements made on the ground and those acquired by airplane or satellite. The main advantage balloons and kites have over airplanes is that they can hover over a site and observe environmental changes over long periods from various altitudes, providing high-resolution images at low cost (Chen and Vierling, 2006; Vierling et al., 2006).
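The available payload follows directly from the buoyancy relation: net lift is the weight of displaced air minus the weight of the helium and of the envelope itself. A back-of-the-envelope calculation (all numbers hypothetical, at roughly sea-level conditions) shows why small camera blimps lift only a few kilograms:

```python
RHO_AIR_KG_M3 = 1.225      # approximate sea-level air density
RHO_HELIUM_KG_M3 = 0.169   # helium density at similar temperature and pressure

def net_lift_kg(envelope_volume_m3, envelope_and_rig_mass_kg):
    """Free lift of a helium envelope: buoyant mass minus gas mass minus structure mass."""
    gross_lift = (RHO_AIR_KG_M3 - RHO_HELIUM_KG_M3) * envelope_volume_m3
    return gross_lift - envelope_and_rig_mass_kg

# A hypothetical ~5 m^3 envelope with ~3 kg of fabric, fins, and tether hardware
# leaves roughly 2 kg of free lift -- the order of payload quoted for small blimps.
print(net_lift_kg(5.0, 3.0))
```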

Kite and balloon aerial photography has been used for scientific surveys, meteorological observations, and military surveillance for more than a century. For example, Boike and Yoshikawa (2003) used a camera suspended from a kite to obtain high-resolution pictures in Alaska for geometric analyses of ice-wedge networks and for mapping water and vegetation. Rigid delta kites are used in light to moderate winds, and parafoil kites are flown in moderate to strong winds. The payload ranges from 0.5 to 2 kg, and the size of the kite depends on wind conditions. The photographs provide an important data set for monitoring changes in permafrost patterns, periglacial processes, and coastal vegetation over time. Homemade kites with attached camera rigs have also been used in archeology and in some developing countries to map local vegetation, obtain population counts in villages, and assess complex humanitarian emergencies (Sklaver et al., 2006).

Balloons come in various sizes and shapes. They are used not only to measure meteorological properties at various altitudes in the atmosphere but also to monitor environmental changes at specific sites. A tethered balloon can provide an excellent platform for imaging and tracking coastal dynamics, such as monitoring sediment transport over an oyster bed for several tidal cycles. Shaw et al. (2012) developed low-cost, compact multispectral imaging systems operating in the blue, red, and near-infrared (NIR) spectral bands for deployment on tethered balloons. The red and NIR bands are used primarily for identifying and monitoring coastal vegetation through the Normalized Difference Vegetation Index (NDVI), whereas the blue band can be used for studying water properties, such as turbidity (Cihlar, St. Laurent, and Dyer, 1991; Goward et al., 1991; Jensen, 2007). The imagers, designed by students, were carried by tethered balloons to altitudes of about 50 m.
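NDVI itself is a simple band ratio; a minimal sketch, assuming co-registered red and NIR arrays from such a balloon imager, is:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from co-registered red and NIR bands."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / (nir + red + 1e-9)   # small constant avoids division by zero

# Dense, healthy vegetation typically yields NDVI well above ~0.4, bare sediment values
# near zero, and open water negative values; any threshold is scene-dependent.
# veg_mask = ndvi(red_band, nir_band) > 0.4
```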

Another inexpensive balloon configuration is the Helikite (helium balloon plus kite) designed, built, and tested at the University of Delaware (J.R. Jo, personal communication). The Helikite Aerial Platform (HAP) is a multialtitude, high spatial resolution system that can fly up to a 1.2-km altitude and carry a 5-kg load. With a diameter of 3 m, it is a lighter-than-air balloon, but it is not bounced around by the wind because it also works like a kite. The HAP is a versatile monitoring system that can be modified as required to include a digital camera, a thermal infrared camera, GPS equipment, an inertial measurement unit, or more sophisticated instruments. Most important, it can be deployed rapidly and easily. The HAP has been used in Chesapeake Bay and will be used in Delaware Bay to map coastal wetlands and to obtain continuous images of river and coastal plumes with very high spatial and temporal resolutions. The images will be analyzed to study wetland changes and the evolution of coastal plumes, which play an important role in the mixing between high-nutrient terrestrial plumes and salt water (Jo, 2013). Helikites of different designs have been used by other researchers to study relatively small sites, including submerged aquatic vegetation (Visser, Wallis, and Sinnott, 2013).

Vierling et al. (2006) describe a tethered balloon remote-sensing system called the Short Wave Aerostat-Mounted Imager (SWAMI). The SWAMI was designed to acquire colocated video imagery and hyperspectral data to study basic remote-sensing problems and to link landscape-level trace-gas fluxes with spatially and temporally appropriate spectral observations. The SWAMI can fly at altitudes up to 2 km to bridge the spatial gap between radiometric measurements collected near the surface and those acquired by aircraft or satellites. The equipment on the SWAMI platform includes a dual-channel, hyperspectral spectroradiometer, video camera, GPS unit, thermal infrared sensor, and several meteorological and control sensors. Data acquisition, sensor pointing, and other functions can be controlled from the ground via wireless transmission (Vierling et al., 2006).

Digital surface models (DSMs) and the digital mapping of topography provide powerful databases for understanding, modeling, and analyzing terrestrial environments and landscapes. Kushida et al. (2009) upgraded an automated forest DSM extraction method using balloon stereo photography. Recently, the importance of automated DSM extraction in forest management and ecosystem studies has also led to the development of laser-scanning systems on low-altitude aircraft and balloons. Airborne laser scanning and photogrammetry are different yet complementary techniques. One benefit of photogrammetry is that past, historical photographs can be used for temporal change analysis.
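The photogrammetric principle behind such DSM extraction is the classic parallax relation for vertical stereo photographs: elevation differences are proportional to the parallax differences measured between the two photos. A one-line sketch of the relation (modern pipelines automate the matching, but the underlying geometry is the same; the example values are hypothetical) is:

```python
def height_from_parallax(flying_height_m, parallax_difference_mm, base_parallax_mm):
    """Classic parallax equation for vertical stereo pairs: dh = H * dp / (b + dp),
    where b is the absolute stereoscopic parallax of the reference surface."""
    return flying_height_m * parallax_difference_mm / (base_parallax_mm + parallax_difference_mm)

# A 2-mm parallax difference on photos with a 90-mm base parallax, flown at 300 m,
# corresponds to roughly 6.5 m of relief:
print(height_from_parallax(300.0, 2.0, 90.0))
```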

Mobile laser scanning (MLS) is a new tool for collecting LIDAR topographic data; it permits regional coverage approaching that of laser swath-mapping surveys, with the high data density, autonomy, and ease of deployment of ground-based terrestrial laser scanning. MLS involves moving the laser scanner while acquiring data and removing the noise introduced by the motion of the platform using integrated, onboard global navigation satellite system (GNSS) and inertial navigation system (INS) measurements. An example of MLS use was provided by Brooks et al. (2013), who described a BLIDAR (balloon or backpack LIDAR) system that weighed only 14 kg and could be rapidly deployed by scientists studying the physics of sudden and substantial changes associated with earthquakes, landslides, or large ocean waves. BLIDAR's portability also allows the study of more static or gradually changing phenomena.
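The core of the MLS georeferencing step is combining each laser range with the platform's GNSS position and INS attitude. The sketch below shows the idea for a single return; the scanner geometry, rotation order, and frame conventions are simplified assumptions rather than those of any specific system:

```python
import numpy as np

def georeference_return(range_m, scan_angle_rad, platform_pos_enu, roll, pitch, yaw,
                        lever_arm=np.zeros(3)):
    """Direct georeferencing of one laser return (simplified cross-track scanner model).

    The return is expressed in the scanner frame, rotated into the local level
    frame using the INS attitude (roll, pitch, yaw in radians), then translated
    by the GNSS position. The lever arm is the scanner offset from the GNSS
    antenna, assumed zero here.
    """
    # Beam vector in the scanner frame: scanning in the cross-track/vertical plane
    beam = np.array([0.0,
                     range_m * np.sin(scan_angle_rad),
                     -range_m * np.cos(scan_angle_rad)])
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Rotations about each body axis, composed as yaw * pitch * roll
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return np.asarray(platform_pos_enu) + Rz @ Ry @ Rx @ (beam + lever_arm)
```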

Of all the balloon shapes, the aerodynamic blimp shape has proven most efficient and economical for use in calm to light wind conditions (<15 km/h). Manned blimps are frequently deployed at major sports events, such as football games, to provide live TV coverage. Unmanned blimps can be compact and light, effective during day and night, and certainly cost less to operate or purchase than manned blimps or aircraft. Unmanned blimps have been used in many environmental applications (Pike, 2014; Shaw et al., 2012; Vierling et al., 2006; Visser, Wallis, and Sinnott, 2013).

Many types of blimps are available commercially for use in small-format aerial photography. Figure 3 shows a typical blimp constructed of a single layer of urethane that retains helium well and is durable under field conditions. This blimp is about 4 m long and has a diameter of several meters. It has a maximum payload lift of 2 kg. Four rigid tail fins stabilize the blimp while in flight, and multiple attachment points along the keel allow for fastening the tether line and camera system weighing up to 1.5 kg.

Figure 3. 

Helium blimp with camera rig in flight. It measures 4 m long, is easily deployable, and can lift a typical camera rig used for aerial photography. Credits: Aber (2014).


A helium-filled blimp/balloon has been deployed from a vessel during oil recovery operations (Pike, 2014). The tethered balloon/blimp carried a downward-looking camera designed to transmit the images wirelessly to an onboard terminal, where the pictures could be displayed in real time. The balloon was fitted with a fabric fin to stabilize it so that steady pictures could be received, and it was deployed and recovered using a winch on the tether line. It offered a view of more than 6 km when flying at an altitude of 150 m. This technique increases oil recovery rates and produces savings, especially if dispersants are used on the oil (Pike, 2014).

CONCLUSIONS

UAVs are powered aircraft operated remotely or autonomously with preprogrammed flight planning. The two main types of UAV configurations are fixed wing (airplane) and rotary wing (helicopter). The UAVs, sometimes known as drones, offer a viable alternative to conventional platforms for acquiring high-resolution remote-sensing data at lower cost, increased operational flexibility, and greater versatility. Unmanned helicopters (e.g., quadcopters) have a major advantage over fixed-wing aircraft because they can hover over a target site, descend for a closer inspection, and change altitude to provide imagery for mapping at preferred spatial resolutions. UAV prices range from several hundred to tens of thousands of dollars, depending on their lift capability, stability, range, instrument payload, etc. In comparison, manned aircraft overflights are more expensive, and remote-sensing satellites provide only relatively coarse resolution and have long repeat cycles.

Advances in UAV technology allow flexibility in delivering timely data for longer durations and can be tailored to required spatial and temporal resolutions. Military applications of fixed and rotary wing UAV technology have been steadily transferred to civilian and research applications. Improvements in the design of flight control systems have transformed these platforms into research-grade tools capable of acquiring high-quality images and geophysical/biological measurements. There have also been advances in the development of miniaturized sensors, specifically designed or adapted for UAV payloads, including hyperspectral imagers, LIDAR, synthetic aperture radar, thermal infrared units, and other sensors.

Blimps and balloons of various sizes and shapes can be used as stable instrument platforms for mapping and monitoring environmental conditions and coastal dynamics. The main advantage these airships have over airplanes is that they can hover over specific sites and observe environmental changes over long periods while providing high-resolution images at low cost. For example, a camera on a tethered balloon/blimp, launched from a vessel conducting oil recovery, can provide pictures that are transmitted and displayed on the vessel in real time. This can help increase oil recovery rates and reduce costs.

UAVs have the capability to effectively fill current observation gaps in environmental remote sensing and provide critical information needed for coastal change research, wetland mapping, water resources forecasting, ecosystem monitoring, natural hazard prediction, and damage assessment. UAVs expand our ability to observe dynamic landscape-altering events and to conduct surveys in logistically challenging areas. For example, UAVs are replacing manned aircraft in studies of remote and dangerous areas, such as the polar regions.

There are still technical and legal hurdles that stand in the way of wider UAV use. One of the most difficult challenges facing the operation of UAVs is the insertion of these systems into nonsegregated airspace, which stems from the strict safety requirements applied to manned aircraft (Pereira et al., 2009). For example, the FAA is proposing that, for commercial use, drones must weigh less than 55 pounds, fly at a maximum speed of 100 mph, and remain no more than 500 feet above the ground. Drones would not be allowed to fly at night or over densely populated areas. Thus, regulations, particularly in the United States, are placing strict limits on where and how one can use UAVs.

LITERATURE CITED

Aber, J.S., 2014. Helium Blimp for SFAP: ES 555 Small Format Aerial Photography. http://academic.emporia.edu/aberjame/airphoto/blimp/blimp.htm
Amazon, 2014. DJI Phantom 2 Vision Quadcopter with Integrated FPV Camcorder. http://www.amazon.com/DJI-Phantom-Quadcopter-Integrated-Camcorder/dp/B00FW78710
Averett, N., 2014. Technology: Eyes in the sky—Drones are poised to revolutionize ecology and even save scientists' lives—if the feds clear them for takeoff. Audubon, July–August 2014, 58–63.
Brooks, B.A.; Glennie, C.; Hudnut, K.W.; Ericksen, T., and Hauser, D., 2013. Mobile laser scanning applied to the earth sciences. Eos, Transactions of the American Geophysical Union, 94(36), 1–3.
Castelvecchi, D., 2010. Invasion of the drones: Unmanned aircraft take off in polar exploration. Scientific American, 302, 25–27. doi:10.1038/scientificamerican0310-25
Chen, X. and Vierling, L., 2006. Spectral mixture analyses of hyperspectral data acquired using a tethered balloon. Remote Sensing of Environment, 103, 338–350.
Chust, G.; Galparsoro, I.; Borja, Á.; Franco, J., and Uriarte, A., 2008. Coastal and estuarine habitat mapping using LiDAR height and intensity and multispectral imagery. Estuarine, Coastal and Shelf Science, 78(4), 633–643.
Cihlar, J.; St.-Laurent, A., and Dyer, J.A., 1991. Relation between the Normalized Difference Vegetation Index and ecological variables. Remote Sensing of Environment, 35(2–3), 279–298.
Conniff, R. and McClaran, R., 2011. Ready for takeoff: Aerial drones are not just for the military any more. Smithsonian, June 2011, 41–54.
Delacourt, C.; Allemand, P.; Jaud, M.; Grandjean, P.; Deschamps, A.; Ammann, J.; Cuq, V., and Suanez, S., 2009. DRELIO: An unmanned helicopter for imaging coastal areas. In: da Silva, C.P. (ed.), Proceedings of the ICS, Journal of Coastal Research, Special Issue No. 56, pp. 1489–1493.
Edwards, O., 2009. Under the radar with unmanned aerial vehicles. Smithsonian Magazine, March 2009, 26–27.
Garono, R.J.; Simenstad, C.A.; Robinson, R., and Ripley, H., 2004. Using high spatial resolution hyperspectral imagery to map intertidal habitat structure in Hood Canal, Washington, U.S.A. Canadian Journal of Remote Sensing, 30(1), 54–63.
Goward, S.N.; Markham, B.; Dye, D.G.; Dulaney, W., and Yang, J., 1991. Normalized Difference Vegetation Index measurements from the Advanced Very High Resolution Radiometer. Remote Sensing of Environment, 35(2–3), 257–277.
Hugenholtz, C.H.; Moorman, B.J.; Riddell, K., and Whitehead, K., 2012. Small unmanned aircraft systems for remote sensing and Earth science research. Eos, Transactions of the American Geophysical Union, 93(25), 236.
Hutt, M., 2014. Land remote sensing program: Unmanned aircraft systems. USGS Rocky Mountain Geographic Science Center News Release. http://rmgsc.cr.usgs.gov/UAS/
Jensen, J.R., 2007. Remote Sensing of the Environment: An Earth Resource Perspective. Upper Saddle River, New Jersey: Prentice-Hall.
Klemas, V., 2009. The role of remote sensing in predicting and determining coastal storm impacts. Journal of Coastal Research, 25(6), 1264–1275.
Klemas, V., 2010. Tracking oil slicks and predicting their trajectories using remote sensors and models: Case studies of the Sea Princess and Deepwater Horizon oil spills. Journal of Coastal Research, 26(5), 789–797.
Klemas, V., 2011. Remote sensing of wetlands: Case studies comparing practical techniques. Journal of Coastal Research, 27(3), 418–427.
Klemas, V., 2013. Airborne remote sensing of coastal features and processes: An overview. Journal of Coastal Research, 29(2), 239–255.
Kushida, K.; Yoshino, K.; Nagano, T., and Ishida, T., 2009. Automated 3D forest surface model extraction from balloon stereo photographs. Photogrammetric Engineering and Remote Sensing, 75(1), 25–35.
Lechner, A.M.; Fletcher, A.; Johansen, K., and Erskine, P., 2012. Characterizing upland swamps using object-based classification methods and hyper-spatial resolution imagery derived from an unmanned aerial vehicle. Proceedings of the XXII ISPRS Congress, Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume I-4 (Melbourne, Australia, ISPRS), pp. 101–106.
Marris, E., 2013. Drones in science: Fly and bring me data. Nature, 498, 156–158. doi:10.1038/498156a
NOAA (National Oceanic and Atmospheric Administration), 2008. Hurricane Katrina. http://www.ncdc.noaa.gov/oa/research/2005/katrina.html
Oleson, T., 2013. Droning on for science: Unmanned aerial vehicles take off in geosciences research. Earth Magazine, 2014, 1–6. http://www.earthmagazine.org/article/droning-science
Oliveira, M.; Rajan, K.; Sousa, J., and Niiler, E., 2013. Fish-tracking robots take to the seas and skies of Portugal. Monterey Bay Aquarium Research Institute News Release, 13 July 2013, pp. 1–3.
O'Young, S. and Hubbard, P., 2007. RAVEN: A maritime surveillance project using small UAV. Proceedings of the 2007 IEEE Conference on Emerging Technologies & Factory Automation (Patras, Greece, IEEE), pp. 904–907.
Pereira, E.; Bencatel, R.; Correira, J.; Felix, L.; Goncalves, G.; Morgano, J., and Sousa, J., 2009. Unmanned air vehicles for coastal and environmental research. In: da Silva, C.P. (ed.), Proceedings of the ICS, Journal of Coastal Research, Special Issue No. 56, pp. 1557–1561.
Phinn, S.R.; Stow, D.A., and Zedler, J.B., 1996. Monitoring wetland habitat restoration in Southern California using airborne multispectral video data. Restoration Ecology, 4(4), 412–422.
Pike, D., 2014. Ocean Eye oil spill aerial tracking. Maritime Journal, 23 January 2014. http://www.maritimejournal.com/news101/pollution-control/ocean-eye-oil-spill-aerial-tracking
Platt, J.R., 2013. Crowd-funded drones could help protect Kenyan rhinos. Scientific American, 8 January 2013, pp. 1–5.
Purkis, S. and Klemas, V., 2011. Remote Sensing and Global Environmental Change. Oxford, U.K.: Wiley-Blackwell.
Purkis, S.J., 2005. A 'reef-up' approach to classifying coral habitats from IKONOS imagery. IEEE Transactions on Geoscience and Remote Sensing, 43, 1375–1390.
Rajan, K.; Sousa, J., and Niiler, E., 2014. Eyes in the sky—and the sea. Discover, April 2014.
Rule, E., 2008. Unmanned aircraft systems for hurricane research. Earth System Monitor, November 2008.
Shaw, J.A.; Nugent, P.W.; Kaufman, N.A.; Pust, N.J.; Mikes, D.; Knierim, C.; Faulconer, N.; Larimer, R.M.; DesJardins, A.C., and Knighton, W.B., 2012. Multispectral imaging systems on tethered balloons for optical remote sensing education and research. Journal of Applied Remote Sensing, 6. doi:10.1117/1.JRS.6.063613
Sklaver, B.A.; Manangan, A.; Bullard, S.; Svanberg, A., and Handzel, T., 2006. Rapid imagery through kite aerial photography in a complex humanitarian emergency. International Journal of Remote Sensing, 22(21–22), 4709–4714.
Vierling, L.A.; Fersdahl, M.; Chen, X.; Li, Z., and Zimmerman, P., 2006. The Short Wave Aerostat-Mounted Imager (SWAMI): A novel platform for acquiring remotely sensed data from a tethered balloon. Remote Sensing of Environment, 103(3), 255–264.
Visser, F.; Wallis, C., and Sinnott, A.M., 2013. Optical remote sensing of submerged aquatic vegetation: Opportunities for shallow clear-water streams. Limnologica, 43(5), 388–398.
Vousdoukas, M.A.; Pennucci, G.; Holman, R.A., and Conley, D.C., 2011. A semi-automatic technique for rapid environmental assessment in the coastal zone using small unmanned aerial vehicles (SUAVs). In: Furmanczyk, K.; Giza, A., and Terefenko, P. (eds.), ICS 2011 Proceedings, Journal of Coastal Research, Special Issue No. 64, pp. 1755–1759.
Wikipedia, 2014. Quadcopter. http://en.wikipedia.org/wiki/Quadcopter
Yang, J. and Artigas, F.J., 2010. Mapping salt marsh vegetation by integrating hyperspectral and LiDAR remote sensing. In: Wang, J. (ed.), Remote Sensing of Coastal Environment. Boca Raton, Florida: CRC Press, pp. 173–190.
© Coastal Education & Research Foundation 2015
Victor V. Klemas "Coastal and Environmental Remote Sensing from Unmanned Aerial Vehicles: An Overview," Journal of Coastal Research 31(5), 1260-1267, (1 September 2015). https://doi.org/10.2112/JCOASTRES-D-15-00005.1
Received: 8 January 2015; Accepted: 24 February 2015; Published: 1 September 2015
KEYWORDS
blimp remote sensing
coastal remote sensing
drone remote sensing
quadcopter remote sensing