In many rural areas, wild boars are a growing threat to livestock and farmland. This project aims to help farmers in such areas detect the presence of boars using sound recognition and send real-time alerts through a mobile app, BoarMap. Whether the goal is to track their movement or deter them, this smart system provides a non-invasive early warning tool.
This project was created not only because it was technically challenging but also because it provides real-world value — something that could benefit people or the environment in a meaningful way. While brainstorming ideas, problems emerged that are often overlooked but still impactful, especially in rural communities. With the hardware and tools already available — microphones, microcontrollers, and access to Edge Impulse — the idea of building something practical around sound detection was born. Or, more specifically, the idea of a system to detect wild boar activity. This project is meant to act as a solution that could help farmers and rural residents protect their land and livestock, using simple, energy-efficient technology.
Creating the project
- Dataset:
For this project, a dataset first needed to be created. The dataset was split into three main sound categories:
1. Boars
2. Humans
3. Forest/environment sounds
The data was collected by first acquiring recordings for all the sound categories. Most of the recordings were gathered from online compilations and were later converted to .wav format, ensuring they all had the same sample rate and duration.
Each file was trimmed and labeled accordingly. To convert the recordings into their appropriate formats a Python script was used to facilitate the process. The code used to achieve this was attached in the project.
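The conversion script itself is attached to the project; as an illustration, the trimming-to-equal-duration step could look like the sketch below. The target duration constant is an assumption — the project does not state the exact clip length used.

```python
import wave

TARGET_SECONDS = 1.0  # assumed clip length; the project does not state the exact value


def trim_wav(src_path, dst_path, seconds=TARGET_SECONDS):
    """Trim (or zero-pad) a WAV file to a fixed duration, keeping its format."""
    with wave.open(src_path, "rb") as src:
        rate = src.getframerate()
        want_frames = int(rate * seconds)
        frames = src.readframes(want_frames)
        # zero-pad clips shorter than the target so every file has equal duration
        frame_size = src.getsampwidth() * src.getnchannels()
        frames += b"\x00" * (want_frames * frame_size - len(frames))
        with wave.open(dst_path, "wb") as dst:
            dst.setnchannels(src.getnchannels())
            dst.setsampwidth(src.getsampwidth())
            dst.setframerate(rate)
            dst.writeframes(frames)
```

Running such a script over each labeled folder yields files that all share the same duration, which Edge Impulse expects when windowing the audio.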
Next, the data was uploaded to Edge Impulse, a platform for training machine learning models for edge devices, by "manually" uploading the sounds using the microphone on the Seeed Studio XIAO nRF52840 board.
After uploading the data to Edge Impulse, the next step was preprocessing it.
To create an "impulse", the following parameters were set:
As a processing block, the Audio (MFE) block was added. This block processes raw audio and converts it into features the model can understand: it transforms the waveform into a time-frequency representation (a spectrogram-like matrix), making it easier for the model to learn the differences between boar, forest, and human sounds.
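The MFE block's exact DSP is Edge Impulse's own implementation; the core idea of turning a waveform into a spectrogram-like matrix can be sketched (illustratively, not the actual MFE code) as a windowed short-time Fourier transform:

```python
import numpy as np


def spectrogram(samples, frame_len=512, hop=256):
    """Turn a 1-D waveform into a (frames x frequency-bins) magnitude matrix.

    Each row is the magnitude spectrum of one windowed slice of audio; stacking
    the rows over time gives the spectrogram-like matrix the model learns from.
    Frame and hop sizes here are illustrative, not Edge Impulse's defaults.
    """
    frames = []
    for start in range(0, len(samples) - frame_len + 1, hop):
        frame = samples[start:start + frame_len] * np.hanning(frame_len)
        frames.append(np.abs(np.fft.rfft(frame)))
    return np.array(frames)
```

The real MFE block additionally maps these frequency bins onto a mel-scaled filterbank, which compresses the spectrum in a way that matches how sounds are perceived.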
After training and testing the model, its accuracy wasn't the most reliable. There may be several reasons for the weaker accuracy, even though the dataset was accurate and well-extracted.
Once the accuracy was satisfactory, the trained model was deployed as an Arduino library (C++ code) ready for use on the embedded target, the Seeed Studio XIAO nRF52840 board.
- Preparing the code:
This code runs on an Arduino-compatible microcontroller for real-time sound classification using an Edge Impulse model and a PDM microphone. It's designed to detect wild boar sounds and send alerts via a LoRa network when such sounds are reliably recognized.
At startup, USB serial communication is initialized, and the LoRa module is powered on. The code sets up the LoRa connection using AT commands and OTAA credentials (DevEUI, AppEUI, AppKey). After joining the network, the Edge Impulse classifier and microphone buffers are initialized.
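The join sequence can be sketched as the list of AT commands sent to the module. The command syntax below follows Seeed's LoRa-E5 AT firmware, which is an assumption based on the AT+MSG form used elsewhere in this project; other LoRa modules use different command sets, and the credential values are of course placeholders.

```python
def otaa_join_commands(dev_eui, app_eui, app_key):
    """Build the AT command sequence for an OTAA join (LoRa-E5-style syntax).

    The DevEUI/AppEUI/AppKey arguments are the OTAA credentials provisioned
    on the LoRaWAN network server; the command names assume Seeed's LoRa-E5
    AT firmware and may differ on other modules.
    """
    return [
        f'AT+ID=DevEui,"{dev_eui}"',
        f'AT+ID=AppEui,"{app_eui}"',
        f'AT+KEY=APPKEY,"{app_key}"',
        'AT+MODE=LWOTAA',  # select over-the-air activation
        'AT+JOIN',         # start the join procedure
    ]
```

On the device, each command is written to the module's serial port and the firmware waits for the module's acknowledgment before sending the next one.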
In the main loop, the system records a slice of audio, waits for the buffer to fill, and converts the data for inference. The classifier results, including labels, confidence scores, and optionally anomaly scores, are printed to the serial monitor. If the label "Boar" has a confidence above 0.5, it’s considered a detection.
To reduce false positives, a rolling buffer of the last 20 classifications is used. If "boar" is detected in 15 or more of the last 20 slices, an alert is sent via LoRa (AT+MSG="1"), along with a simulated sound effect. If not, and 30+ seconds have passed since the last "no boar" message, it sends AT+MSG="0".
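The device code is C++, but the 15-of-20 debounce logic can be mirrored as a small Python sketch. The window, threshold, and confidence values come from the description above; the class name is illustrative.

```python
from collections import deque

WINDOW = 20        # slices kept in the rolling buffer
THRESHOLD = 15     # positive slices needed to raise an alert
CONFIDENCE = 0.5   # per-slice confidence cutoff for the "Boar" label


class BoarDebouncer:
    """Raise an alert only when 15 of the last 20 slices were positive."""

    def __init__(self):
        self.history = deque(maxlen=WINDOW)  # old entries fall off automatically

    def update(self, boar_confidence):
        """Record one slice's confidence; return True when an alert should fire."""
        self.history.append(boar_confidence > CONFIDENCE)
        return sum(self.history) >= THRESHOLD
```

Requiring a run of agreeing slices trades a little latency for far fewer spurious LoRa messages, which matters on a battery-powered node.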
Audio capture uses interrupt-driven callbacks from the PDM library, which swap between two buffers to avoid data loss. The classifier uses a helper function to convert int16 samples to float. On completion or shutdown, the microphone stops and buffers are freed to maintain system stability.
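The int16-to-float conversion is a one-line scaling; sketched in Python (the device version does the same in C++), with the scale factor being 2^15, the full-scale magnitude of a signed 16-bit sample:

```python
INT16_SCALE = 32768.0  # 2**15; full-scale magnitude of a signed 16-bit sample


def int16_to_float(samples):
    """Map signed 16-bit PCM samples into the [-1.0, 1.0) float range."""
    return [s / INT16_SCALE for s in samples]
```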
- Application design:
The mobile application is developed using Android Studio and is designed to assist farmers in monitoring wild boar activity through sensor data. It consists of two main activities:
1. A configuration activity
2. A map activity for visualizing the sensor location and boar status.
The configuration activity allows the user to set up the sensor's location with ease. The farmer visits the field or installation site with a mobile device and simply presses the “Get your current phone location” button. This action captures the device’s GPS coordinates, which can then be saved locally on the phone.
Once saved, the map activity reads this stored location and places a marker accordingly on the map interface. The second activity displays a satellite map with the sensor's position clearly marked, along with color-coded icons that indicate the presence or absence of wild boars based on the latest sensor data. A red marker icon indicates that boar sounds have been detected at that sensor location, a green one indicates that no boars were detected, and an orange one indicates a connection interruption.
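The marker color decision can be sketched as below. The "1"/"0" status values match the LoRa messages described earlier; the staleness timeout is a hypothetical value, since the write-up does not state how the app decides a connection is interrupted.

```python
STALE_AFTER_S = 60  # hypothetical timeout; the project does not state one


def marker_color(last_message, seconds_since_last):
    """Map the latest sensor message to a map-marker color.

    last_message: payload of the most recent LoRa uplink ("1" = boar, "0" = no boar).
    seconds_since_last: time elapsed since that uplink arrived.
    """
    if seconds_since_last > STALE_AFTER_S:
        return "orange"  # no recent data: connection interruption
    return "red" if last_message == "1" else "green"
```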
This visual feedback helps the user monitor boar movement remotely and take preventive action to protect crops. Additionally, the application sends a notification to the user's phone whenever a boar is detected, ensuring the farmer is alerted immediately even without actively checking the app.
The application ensures a user-friendly experience, enabling fast setup in the field and real-time status updates, making it a practical and efficient tool for modern farming and wildlife management.