A new Android smartphone app for geospatial mapping from drones and kites
This paper will describe the development, testing and deployment of a free-to-use, open-source, Android-based smartphone application for capturing geotagged aerial photographs for grassroots remote sensing (RS) and mapping applications. Historically, RS data have been acquired by sensors on platforms such as piloted aircraft or satellites, but a new self-service and, to some extent, 'grassroots' (participatory and distributed) RS revolution is underway, making use of drones and kites as platforms for proximal observation of environmental phenomena. A growing number of papers in the geosciences and in landscape ecology utilise such platforms for cost-effective, self-service acquisition of RS data. These platforms cannot carry the heavy payloads used on satellites or aircraft, but they offer a more flexible, responsive way of gathering survey data, and their low-flying capability means that very fine-grained data can be captured easily. The current scientific focus for drone- and kite-based aerial mapping relies on automatically triggered camera systems, followed by complex post-processing algorithms (e.g. computer-vision-based 'structure-from-motion' software) to convert the resulting aerial photographs into orthorectified maps and point clouds. Whilst these approaches generate high-quality products, for many applications their complexity is a barrier to uptake: wiring the camera to an autopilot trigger is non-trivial, and the post-processing stage demands expensive, complex software and high-performance computing. For many basic mapping applications, the workflow is too complex and the detail in the products exceeds what is really needed. We asked: what if a basic smartphone, with its plethora of on-board sensors (accelerometer, GPS, compass, camera), could be used to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites?
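To illustrate why low-flying platforms can capture such fine-grained data, the ground sampling distance (GSD) of a nadir photograph follows directly from the pinhole-camera relation between flying height, focal length, sensor width and image width. The sketch below uses illustrative parameter values, not figures measured from the handsets described here.

```python
def ground_sampling_distance(flying_height_m, focal_length_mm,
                             sensor_width_mm, image_width_px):
    """Estimate the ground sampling distance (metres per pixel) of a
    nadir photograph using the pinhole-camera relation:
        GSD = (flying_height * sensor_width) / (focal_length * image_width)
    Millimetre units cancel, so the result is in metres per pixel.
    """
    return (flying_height_m * sensor_width_mm) / (focal_length_mm * image_width_px)


# Illustrative values (hypothetical, not from the OnePlus One or Acer
# handsets): 50 m flying height, 4.7 mm focal length, 6.2 mm-wide
# sensor, 4160 px-wide image.
gsd = ground_sampling_distance(50.0, 4.7, 6.2, 4160)
print(f"{gsd * 100:.1f} cm/pixel")
```

At a kite- or drone-typical 50 m flying height, the pixel footprint is on the order of centimetres, far finer than conventional satellite or airborne products.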
We built an Android application to test the capability of standard smartphones as remote sensing devices. The application uses a visual coding 'scheme blocks' framework, so that users can customise their own data capture tools in the field. In our presentation we will demonstrate the coding framework, and then show the results gathered when we used the app to collect data during test flights on various kite and lightweight drone platforms. We have also developed a simple-to-use, open-source geospatial toolkit that processes the metadata stored by the app into geographical information system (GIS)-ready GeoTIFF images; we will demonstrate how this works in our presentation. Two Android smartphones were used in testing: a high-specification OnePlus One handset and a lower-cost Acer Liquid Z3 handset. We will show that the best results were obtained with the higher-specification phone attached to a single-line kite or a gliding drone. Finally, we will use data collected with the app over a farmyard to demonstrate the power of the resulting fine-grained products for a simple application: advising farmers about small-scale interventions they can make to improve the quality of water run-off from their farms. The app can be downloaded freely and used wherever an Android smartphone and an aerial platform are available to deliver rapid spatial data (e.g. in disaster zones, in teaching or for grassroots democratic mapping).
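A GeoTIFF carries its georeferencing as a six-parameter affine geotransform. The sketch below shows one way such a transform could be derived from the kind of metadata a smartphone records (GPS position, compass heading, and a pixel size estimated from flying height); the function name and the simplifying flat-Earth, nadir-view assumptions are ours for illustration, and do not necessarily reflect how the toolkit itself is implemented.

```python
import math

def geotransform_from_metadata(lat_deg, lon_deg, pixel_size_m,
                               heading_deg, width_px, height_px):
    """Build a GDAL-style affine geotransform for a nadir photograph.

    Assumes the GPS fix marks the photo centre, a flat-Earth
    approximation near that point, and an image rotated clockwise
    from north by the compass heading. Returns
        (x_origin, x_res, x_rot, y_origin, y_rot, y_res)
    such that:
        lon = x_origin + col * x_res + row * x_rot
        lat = y_origin + col * y_rot + row * y_res
    """
    # Metres per degree (spherical approximation); longitude shrinks
    # with the cosine of latitude.
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(lat_deg))

    theta = math.radians(heading_deg)
    # Column axis points towards heading + 90 deg, row axis towards
    # heading + 180 deg; project both onto east/north, then degrees.
    x_res = pixel_size_m * math.cos(theta) / m_per_deg_lon
    x_rot = -pixel_size_m * math.sin(theta) / m_per_deg_lon
    y_rot = -pixel_size_m * math.sin(theta) / m_per_deg_lat
    y_res = -pixel_size_m * math.cos(theta) / m_per_deg_lat

    # Shift the GPS fix (photo centre) back to the top-left corner.
    x_origin = lon_deg - (width_px / 2) * x_res - (height_px / 2) * x_rot
    y_origin = lat_deg - (width_px / 2) * y_rot - (height_px / 2) * y_res
    return (x_origin, x_res, x_rot, y_origin, y_rot, y_res)
```

With a heading of zero this reduces to the familiar north-up case: positive x resolution, negative y resolution, and zero rotation terms.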