A framework for enhanced localization of marine mammals using auto-detected video and wearable sensor data fusion

Joaquin Gabaldon, Ding Zhang, Kira Barton, M. Johnson-Roberson, and Alex Shorter
Conference Paper, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2505-2510, September 2017

Abstract

Accurate localization of biological agents offers researchers and institutions the opportunity to gain new knowledge about the individual and group behaviors of biosystems. This paper presents a sensor-fusion approach for tracking biological agents, combining automated video logging with magnetic, angular rate, and gravity (MARG) sensor and inertial measurement unit (IMU) data, using professionally managed dolphins as the representative example. Our video-logging method enables accurate, automated detection of dolphin locations through a combination of Laplacian of Gaussian (LoG) and multi-orientation elliptical blob detection. These detections are fused with MARG/IMU measurements to generate a localization estimate through a series of drift-correcting Kalman and gradient-descent filters, finalized with Incremental Smoothing and Mapping (iSAM2) pose-graph localization.
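The LoG step of the detection pipeline can be illustrated with a minimal sketch. This is not the paper's implementation (which adds multi-orientation elliptical blob detection on top); it only shows the core idea that a bright blob of radius roughly sigma*sqrt(2) yields a strong negative LoG response at its center. The synthetic frame and the `detect_blob` helper are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def detect_blob(frame, sigma):
    """Return (row, col) of the strongest bright-blob LoG response.

    A bright blob produces a strongly negative Laplacian of Gaussian
    response at its center, so we take the argmin of the filtered frame.
    """
    response = gaussian_laplace(frame.astype(float), sigma=sigma)
    return np.unravel_index(np.argmin(response), response.shape)

# Synthetic 64x64 frame: one bright Gaussian "dolphin" blob at (30, 40).
yy, xx = np.mgrid[0:64, 0:64]
frame = np.exp(-((yy - 30) ** 2 + (xx - 40) ** 2) / (2 * 3.0 ** 2))

row, col = detect_blob(frame, sigma=3.0)
print(row, col)  # peak response lands at the blob center
```

In a real overhead video frame, the argmin would be replaced by thresholding and non-maximum suppression so that multiple animals can be detected per frame.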
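The drift-correction idea behind the fusion stage can likewise be sketched with a scalar Kalman filter: dead-reckoned position from biased inertial velocity drifts without bound, while sparse absolute position fixes (standing in for the video detections) keep the fused estimate bounded. All numbers here (bias, noise levels, fix rate) are invented for illustration and do not come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n, true_v = 0.1, 200, 1.0
true_x = true_v * dt * np.arange(n)

# IMU-style velocity with a constant bias: pure integration drifts.
imu_v = true_v + 0.2 + 0.05 * rng.standard_normal(n)
# Video-style absolute position fixes, noisy but drift-free.
video_x = true_x + 0.1 * rng.standard_normal(n)

x, P = 0.0, 1.0            # state estimate and its variance
Q, R = 0.05 ** 2, 0.1 ** 2  # process and measurement noise variances

dead_reckoned = np.cumsum(imu_v * dt)
est = np.empty(n)
for k in range(n):
    # Predict: integrate velocity; Q absorbs bias and sensor noise.
    x += imu_v[k] * dt
    P += Q
    # Update: fuse a video position fix every 20 steps.
    if k % 20 == 0:
        K = P / (P + R)
        x += K * (video_x[k] - x)
        P *= 1.0 - K
    est[k] = x

print("dead-reckoned error:", abs(dead_reckoned[-1] - true_x[-1]))
print("fused error:        ", abs(est[-1] - true_x[-1]))
```

The paper's pipeline goes further, feeding such filtered estimates into an iSAM2 pose graph so that past states are also smoothed when new detections arrive, rather than only correcting the current state.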

BibTeX

@conference{Gabaldon-2017-130164,
author = {Joaquin Gabaldon and Ding Zhang and Kira Barton and M. Johnson-Roberson and Alex Shorter},
title = {A framework for enhanced localization of marine mammals using auto-detected video and wearable sensor data fusion},
booktitle = {Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems},
year = {2017},
month = {September},
pages = {2505--2510},
}