Developing a quality assurance plan for telemetry studies: A necessary management tool for an effective study

Edited by: Noah S. Adams, John W. Beeman, and John H. Eiler

Abstract

Telemetry has been used throughout the world to answer questions associated with research, management, and monitoring programs and to monitor animal behavior and population dynamics. Many telemetry projects have been developed to study the passage, behavior, and survival of migrating adult and juvenile salmonids at hydroelectric projects on the mainstem Columbia and Snake rivers (Skalski et al. 2001a, 2001b, 2002; Keefer et al. 2004; Goniea et al. 2006; Plumb et al. 2006). Telemetry-based field evaluations of salmon survival through hydroelectric projects are costly because of the technology (tags, telemetry systems, infrastructure, etc.) and personnel required to conduct them. Given the cost of implementing these projects, and the financial and conservation implications of the decisions made from the research results (e.g., forgone electricity production and conservation of threatened and endangered animals), it is paramount to ensure that quality data are collected by documenting all procedures, training, and data checks and by putting sound protocols and quality assurance and control procedures in place.

Telemetry studies can pose unique data collection, processing, and analysis challenges. For instance, inferences about entire populations are made from study animals that are captured, held, and tagged at disparate locations, so great care must be taken to ensure that any potential biases arising from field procedures are minimized (Peven et al. 2005). Released study animals are interrogated remotely by telemetry systems throughout the study area. Because these systems record continuously, they can produce large numbers of detections over a short time frame, including false-positive detections from weak or erroneous records. The result can be large data sets (many thousands of lines) that require significant postprocessing. Data reduction, whether done manually or with software or programming code within a software package, is needed to discern noise from valid data and extract the pertinent information for analysis. In either case, consistent, well-documented procedures need to be in place to ensure quality results and allow the study methods to be repeated.
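As a concrete illustration of the kind of data reduction described above, the sketch below implements one widely used rule of thumb: a detection is accepted only if the same tag is recorded several times on the same receiver within a short window, and isolated hits are treated as likely noise. This is not the chapter's method; the column names, input file name, and threshold values are assumptions chosen for illustration, and any filter actually used should be documented in the quality assurance plan and tuned to the telemetry system in use.

```python
# Minimal sketch, not from the chapter: one common data-reduction step for
# telemetry detection files, flagging likely false-positive detections.
# The column names (tag_id, receiver, timestamp), the input file name, and the
# thresholds MIN_HITS and WINDOW are illustrative assumptions only.
import csv
from collections import defaultdict
from datetime import datetime, timedelta

MIN_HITS = 3                    # assumed minimum detections needed to accept an event
WINDOW = timedelta(minutes=10)  # assumed time window for clustering detections

def load_detections(path):
    """Read raw detection records from a CSV with tag_id, receiver, timestamp columns."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield {
                "tag_id": row["tag_id"],
                "receiver": row["receiver"],
                "timestamp": datetime.fromisoformat(row["timestamp"]),
            }

def filter_false_positives(detections):
    """Keep detections that occur in clusters of at least MIN_HITS on the same
    receiver within WINDOW; isolated hits are treated as likely noise."""
    grouped = defaultdict(list)
    for d in detections:
        grouped[(d["tag_id"], d["receiver"])].append(d)

    accepted = []
    for records in grouped.values():
        records.sort(key=lambda r: r["timestamp"])
        for d in records:
            # Count detections of the same tag on the same receiver near this one.
            neighbors = sum(
                1 for r in records if abs(r["timestamp"] - d["timestamp"]) <= WINDOW
            )
            if neighbors >= MIN_HITS:
                accepted.append(d)
    return accepted

if __name__ == "__main__":
    valid = filter_false_positives(load_detections("raw_detections.csv"))
    print(f"{len(valid)} detections accepted for analysis")
```

Whatever rule is chosen, recording its parameters (e.g., the MIN_HITS and WINDOW values above) alongside the processed data supports the consistency and repeatability the chapter emphasizes.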

Publication type: Book chapter
Title: Developing a quality assurance plan for telemetry studies: A necessary management tool for an effective study
Chapter: 9.3
DOI: 10.47886/9781934874264.ch20
Year Published: 2012
Language: English
Publisher: American Fisheries Society
Publisher location: Bethesda, MD
Contributing office(s): Western Fisheries Research Center
Larger Work Type: Book
Larger Work Subtype: Monograph
Larger Work Title: Telemetry techniques: A user guide for fisheries research
Online Only (Y/N): N
Additional Online Files (Y/N): N