    Aerial imaging is not new – scientists have been using data gathered from manned flights for decades. More recently, Earth-observing satellites have become the dominant method of collecting aerial data.

    Satellite monitoring of terrestrial and marine environments has been done predominantly from low Earth orbits (LEOs). LEOs allow for regular passes, with satellites circling the Earth approximately once every 90 minutes.

    Satellites have significantly advanced remote sensing of marine and terrestrial environments. However, satellite imagery has drawbacks: the resolution is low (large pixel size), and although a lot of data can be accessed easily for free, it has not been tailored to the specific needs of individual research projects.

    Manned aerial flights in fixed-wing aeroplanes can achieve higher-resolution images (smaller pixels) depending on the flight parameters. However, motion blur can be an issue with images taken from fixed-wing planes. It is also expensive to hire planes and helicopters for this type of work, especially as different conditions for monitoring need to line up – for example, weather, flight paths and tides.

    Unmanned aerial vehicles – drones – offer solutions and new opportunities for scientists to collect data and monitor environments at an even higher resolution than planes or helicopters.

    Marine science and drones

    NIWA marine ecologist Dr Leigh Tait was looking at the recovery and resilience of kelp forests associated with the 2016 Kaikōura earthquake. With funding from the Sustainable Seas National Science Challenge, he worked with specialised technicians and scientists to optimise drones for collecting marine data for this research.

    Leigh needed to survey the vegetation on the uplifted reef areas along the Kaikōura coast. The coastal uplift covered a vast area and some of the specific sites they wanted to look at were very difficult to access, so the team turned to drone technology.

    The advantages of drone mapping the marine environments along the Kaikōura coast were the simplicity in timing drone flights to target specific conditions and events. For example, Leigh needed to map during low tide to see the intertidal zone, and good weather conditions were required to operate. Flights were done in the late morning to midday to help minimise shadows, glare and reflections caused by the Sun.

    Discover more about Leigh's work on mapping the recovery of kelp forests following the 2016 Kaikōura earthquake.

    Species-level mapping with drones

    This was the first time drones equipped with multispectral cameras (cameras that see visible light and infrared) had been used to survey marine vegetation over whole reef areas in New Zealand, so the team needed to start by configuring the drones to work for the intended environment.

    To optimise a drone for species-level mapping, you need to combine a number of elements:

    • Set the physical parameters for the drone flight.
    • Select a camera that captures the light wavelengths that best support the identification of different species – many look very alike!
    • Train a computer application, loaded with the drone images, to identify the vegetation pictured in them.
    • Carry out validation work to ensure that the images and end outputs are accurate.

    Flight of the drone

    To do species-level mapping of the vegetation, the drones had a low flight path to capture close-up high-resolution data (pixels <10 mm2 on the ground). Usually, a high flight path is used to cover large areas, but this means lower resolution (bigger pixels). Lower flight paths mean that longer flight times are required to cover an area, so there is a big trade-off between the area covered and the pixel resolution. This can be a challenge when there are multiple parameters for drone flight involved.
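    The trade-off between flight height and pixel size can be sketched with the standard photogrammetric ground sampling distance (GSD) formula. The camera values below are illustrative assumptions only – the article does not state the specifications of the drone used in this study.

```python
def ground_sampling_distance(altitude_m, focal_length_mm, pixel_size_um):
    """Ground sampling distance (mm of ground per image pixel) for a
    downward-pointing camera: altitude x sensor pixel pitch / focal length,
    with all values converted to millimetres first."""
    altitude_mm = altitude_m * 1000
    pixel_size_mm = pixel_size_um / 1000
    return altitude_mm * pixel_size_mm / focal_length_mm

# Illustrative camera: 5.5 mm focal length, 3.75 micrometre pixel pitch.
# Flying ten times higher makes each ground pixel ten times bigger -
# far fewer flight lines are needed, but fine detail is lost.
print(round(ground_sampling_distance(12, 5.5, 3.75), 2))   # low flight path
print(round(ground_sampling_distance(120, 5.5, 3.75), 2))  # high flight path
```

    The linear relationship is why covering a large area at species-level resolution demands so many more (and longer) flights.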

    Other key parameters for Leigh’s work, outside of weather conditions and time of day, were the ocean conditions. The ocean has to be calm and clear as water turbidity from wave action and seafoam can stop cameras from being able to capture clear images through the water column, which is essential to identify species of interest, or to photograph the bottom of the ocean (the benthic layer).

    Wave action also hinders image stitching, where a number of images are joined together to build up a bigger picture of an area. It can also blur species of interest, meaning automated algorithms cannot be used to map them.

    Light intensity needs to be at certain levels. If it is too low, the camera on the drone requires a higher ISO. Higher ISOs enable the camera to operate with less light, but the image is then affected by too much grain. ‘Grain’ is an old film term that refers to the texture of an image when printed from the film negative. In digital images, this texture can also be a response to low light but is more correctly referred to as ‘noise’. Compare a night-time or low-light image that you’ve taken with your phone to a daytime image, and you’ll get a sense of how noise impacts the image’s resolution!

    Multiband imaging

    A multispectral camera was chosen for the drone because it had six lenses with filters that could be used to block all but the desired wavelength.

    Multiband imaging involves taking several pictures using light of different colours or wavelengths from the electromagnetic spectrum. It combines multiple images, with each capturing light of specific wavelengths. For example, a multiband image for identifying plants may consist of images that highlight green, red, blue and infrared light. Each highlights a different aspect of the plant: chlorophyll reflects green light, cell structure is highlighted by infrared, and pigments are captured in the red and blue images. When the images are combined, specific plant species can be identified.
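    One common way bands are combined is a spectral index such as NDVI (normalised difference vegetation index), which contrasts red and near-infrared reflectance. This is a generic example of the idea, not the specific band combination used in Leigh’s study, and the reflectance values are invented for illustration.

```python
def ndvi(red, nir):
    """Normalised difference vegetation index for one pixel.

    Healthy vegetation absorbs red light (for photosynthesis) and strongly
    reflects near-infrared, so NDVI approaches +1 over dense plant cover
    and sits near zero over bare surfaces."""
    if red + nir == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Illustrative reflectance values on a 0-1 scale:
vegetation = ndvi(red=0.05, nir=0.60)  # high index - likely plant cover
bare_rock = ndvi(red=0.30, nir=0.35)   # low index - little vegetation
print(round(vegetation, 2), round(bare_rock, 2))
```

    Computing an index like this for every pixel turns a stack of band images into a single map that separates vegetated from bare areas.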

    Multiband imaging is also used to identify objects other than plants. Using a combination of different wavelengths, this technique can be used to identify cloud cover or physical features like vegetation cover or animal colonies.

    Leigh was looking at kelp and other macroalgal species and so the team optimised the cameras to best capture these. In turn, some of these images were then combined to further the accuracy of machine learning in picking up the different species of interest.

    Data validation

    The project combined the drone data with field sampling at a number of sites using transects. Transects were laid out prior to the launch of the drone. This transect data was then used as a validation dataset to gauge accuracy.
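    Conceptually, validation comes down to comparing the drone-derived classification with the field observations recorded along the same transects. The species labels below are made-up examples of that comparison.

```python
def classification_accuracy(field_labels, drone_labels):
    """Fraction of transect points where the drone map agrees with the
    field observation made at the same location."""
    assert len(field_labels) == len(drone_labels)
    matches = sum(f == d for f, d in zip(field_labels, drone_labels))
    return matches / len(field_labels)

# Hypothetical transect: field-recorded cover vs drone classification.
field = ["kelp", "kelp", "rock", "ulva", "kelp", "rock"]
drone = ["kelp", "ulva", "rock", "ulva", "kelp", "rock"]
print(classification_accuracy(field, drone))  # 5 of 6 points agree
```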

    Machine learning

    The high-resolution images (small pixel size) and extra spectral bands (six bands of light as opposed to just standard RGB – red-green-blue) from drones were used to optimise algorithms and to build machine learning to classify vegetation species. An algorithm is a set of instructions for a computer to solve a problem or reach a goal. With machine learning, accurately classified images are loaded into the application as training images. The computer looks at all the training images and the class (such as a species of seaweed) assigned by the user and trains itself by identifying features that are unique to that class (such as colour or shape). When new images come through for analysis, the machine compares them at pixel level with all the existing images. The machine reaches a species classification based on how closely the new image resembles the preloaded training images.
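    A minimal sketch of this train-then-classify idea is a nearest-centroid rule on per-pixel band values: average the spectral values of each labelled class, then assign new pixels to the closest average. The real system is far more sophisticated, and the two-band spectral values below are invented for illustration.

```python
def train_centroids(training_pixels):
    """Average the band values of labelled pixels to get one 'typical'
    spectral signature (centroid) per class."""
    sums, counts = {}, {}
    for bands, label in training_pixels:
        acc = sums.setdefault(label, [0.0] * len(bands))
        for i, value in enumerate(bands):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(bands, centroids):
    """Assign the class whose centroid is closest to this pixel
    (squared Euclidean distance across all spectral bands)."""
    def distance(centroid):
        return sum((b - c) ** 2 for b, c in zip(bands, centroid))
    return min(centroids, key=lambda label: distance(centroids[label]))

# Invented two-band (red, near-infrared) training pixels.
training = [
    ((0.10, 0.60), "kelp"), ((0.12, 0.55), "kelp"),
    ((0.30, 0.32), "rock"), ((0.28, 0.35), "rock"),
]
centroids = train_centroids(training)
print(classify((0.11, 0.58), centroids))  # spectrally resembles the kelp pixels
```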

    The sky’s the limit!

    Dr Leigh Tait is now working with other organisations interested in using the drone technology and marine species identification algorithms to map other marine areas. In related work, he is looking to optimise underwater remotely operated vehicles (ROVs) for use by biosecurity staff who assess the bottom of boats in ports for pest plants and animals that might have hitched a ride.

    Related content

    Discover more about Dr Leigh Tait's work mapping the recovery of kelp forests following the Kaikōura earthquake.

    Aeronavics is a company that specialises in drone technology. Learn about this company and their work on designing and producing UAVs to service industries that include film and television, search and rescue, farming and forestry.

    In this video, discover how these specially designed drones are being used to find, follow and provide information to protect the highly endangered Māui dolphin.

    The Connected article Three drones features Aeronavics drones keeping black rhinos free from poachers in South Africa and helping Australian mines keep track of the amount of material they process.

    The Connected article Amazing algorithms introduces and explains the concept of algorithms with concrete examples from everyday life, mathematics and computer programming.

    Dr Wolfgang Rack is a senior lecturer in the Gateway Antarctica programme at the University of Canterbury. One of his research projects involves measuring sea-ice thickness in Antarctica using the CryoSat-2 satellite.

    Dr Adrian McDonald relies on satellite data to look at factors that affect the Antarctic climate, the ozone hole and their interactions. Learn more about his work in Satellites to study Antarctic atmosphere.

    Useful links

    In this recorded webinar from Sustainable Seas National Science Challenge, learn more about Dr Leigh Tait’s research on monitoring kelp and seaweed biodiversity of coastal marine ecosystems with drones.

    For a visual interactive showing the different areas mapped by Leigh’s team and a selection of drone images, look at Earthquake impacts to Kaikōura’s rocky shore.


    This article has been developed using resources from the Sustainable Seas National Science Challenge and Dr Leigh Tait.

    Further funding for this research was provided by the Ministry of Business, Innovation and Employment Endeavour Fund.

      Published 23 January 2020