The ongoing sixth mass extinction constitutes a critical threat to the future of human civilization because of the associated degradation of ecosystem services. The actions proposed to bend the curve of mass extinction are heterogeneous and depend on the target ecosystem. In this context, technology plays a crucial role in keeping a record of the state of species and ecosystems, identifying causes of extinction and degradation, assessing the effectiveness of mitigation measures, and monitoring the evolution of the environment while collecting data to drive future actions and support informed decisions. Manual (i.e., by humans) off-site analytics of the data collected by sensors has been the standard procedure in the conservation field for many years. As the capabilities of sensors evolved, manual information processing became the major limiting factor preventing full exploitation of the possibilities offered by technology. To overcome this situation, artificial intelligence (AI), and more specifically its embodiment in the form of deep neural networks (DNNs), is expected to be the most important catalyst of advances in the next few years. At the moment, cloud-based services are transforming the aforementioned classical paradigm of manual off-site analytics into automatic off-site analytics. However, the bottleneck of conveying all the data to the cloud remains. Ultimately, the objective is automatic on-site analytics, that is, the systems deployed for nature monitoring should be able to sense their environment, process the data locally, and distill information of interest for researchers, managers, and conservationists. The challenges to the successful realization of these capabilities are considerable. DNNs, which are today's de facto standard implementation of AI because of their high accuracy in inference tasks, are computationally heavy and memory-hungry, not only during training but also during inference in real deployments. With the present project, we intend to contribute to the realization of automatic on-site analytics through the design and implementation of a sensing-processing edge platform that will integrate and fuse visual, acoustic, and environmental data for smart nature monitoring at prescribed locations. This platform will be built under the principles of low power, low cost, and accurate inference, i.e., it must provide specialists with highly reliable information so that they can make decisions with minimal manual analysis. As a first step, in this 2-year project we will make use of off-the-shelf components carefully selected for the targeted platform. As a long-term goal, we aim to design and integrate custom chips on the basis of the experience acquired with commercial components, in order to create a miniaturized system in line with the long-envisioned concept of smart dust.
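To make the target of automatic on-site analytics concrete, the following minimal Python sketch illustrates the intended processing loop: visual, acoustic, and environmental data are captured and classified locally by a lightweight model, and only compact event summaries leave the node instead of raw data streams. All function and variable names here (capture_frame, capture_audio, read_env_sensors, tiny_classifier) are hypothetical placeholders for illustration, not components of the actual platform.

```python
# Conceptual sketch of the automatic on-site analytics loop described above.
# Raw multimodal data stay on the device; only compact, decision-ready
# events are reported, avoiding the cloud-transfer bottleneck.

import random
import time


def capture_frame():
    # Placeholder for an image from the visual sensor.
    return [random.random() for _ in range(16)]


def capture_audio():
    # Placeholder for an audio snippet from the acoustic sensor.
    return [random.random() for _ in range(16)]


def read_env_sensors():
    # Placeholder for environmental readings (temperature, humidity, ...).
    return {"temperature_c": 21.5, "humidity_pct": 63.0}


def tiny_classifier(frame, audio):
    # Stand-in for a lightweight DNN running on the edge device:
    # fuses visual and acoustic features into a single event score.
    score = (sum(frame) + sum(audio)) / (len(frame) + len(audio))
    return ("possible_detection", score) if score > 0.5 else ("no_event", score)


def report(event):
    # Only the distilled event (a few bytes) would be transmitted,
    # instead of streaming raw images and audio to the cloud.
    print("REPORT:", event)


if __name__ == "__main__":
    for _ in range(3):  # a few sensing cycles for illustration
        label, score = tiny_classifier(capture_frame(), capture_audio())
        if label != "no_event":
            report({"label": label, "score": round(score, 3),
                    "env": read_env_sensors(), "timestamp": time.time()})
        time.sleep(0.1)  # duty cycling keeps power consumption low
```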
Project TED2021-131835B-I00 funded by MICIU/AEI/10.13039/501100011033 and European Union NextGenerationEU/PRTR.