BeePi: Honeybees Meet AI: Stage 2
The story of BeePi, a multi-sensor citizen science platform for electronic beehive monitoring, continues to unfold!
We are very grateful for the generous support of our 2017 Kickstarter backers, which allowed us to complete the first stage of the BeePi project. The objectives of Stage 1 were 1) to build two BeePi monitors and to deploy and stress test them in local hives in the beekeeping seasons of 2017 and 2018; 2) to collect and analyze multi-sensor data; and 3) to share our data and findings broadly.
We're ready to move on to Stage 2 of the BeePi project. Our vision is that in the future significant practical and scientific benefits will come from transforming our beehives into immobile robots that use AI to monitor the health of their bee colonies with multiple sensors, analyze the data, and alert beekeepers of any deviations from the norm.
Why Do We Need to Monitor Beehives?
We need to monitor them because many honeybee colonies are failing. Typical culprits of bee colony failures are Varroa mites, foulbrood, chalkbrood, and bad weather. Other contributing factors include pesticides (e.g., neonicotinoids), monoculture (the increasing emphasis on growing large quantities of the same crop), and transportation stress caused by long hauls of commercial beehives on semitrucks.
The average annual loss for beekeepers varies from 25% to 50%. The cost of equipment, bee packages, maintenance, and transportation is so high that profit margins for beekeepers are very small. Human beekeepers cannot monitor their hives continuously due to problems with logistics and fatigue. Intelligent autonomous beehive monitoring will reduce the number of invasive hive inspections that disrupt the natural life cycle of bee colonies. It will also lower transportation costs: beekeepers will no longer need to drive long distances to their far-flung bee yards and will be able to monitor their beehives remotely.
Where We Have Been and What We Have Given Back to the Community
The BeePi project's motto is replicate and share. During Stage 1, we collected, analyzed, and curated the following data sets and made them publicly available.
BUZZ2 - a curated and labeled dataset of 12,914 audio samples captured by BeePi monitors deployed on Langstroth hives with Carniolan bees in 2018.
BEE1 - a curated and labeled dataset of 54,382 images obtained from videos captured by BeePi monitors placed on Langstroth hives with Italian honeybee colonies in 2017.
BEE2_1S - a curated and labeled dataset of 58,201 images obtained from videos captured by BeePi monitors placed on first supers of Langstroth hives with Carniolan bees in 2018.
BEE2_2S - a curated and labeled dataset of 54,678 images obtained from videos captured by BeePi monitors placed on second supers of Langstroth hives with Carniolan bees in 2018.
All of our research publications with detailed explanations of our experiments with these datasets, algorithms, and classification models are available on my LinkedIn page.
Where We Are Going in Stage 2
Objective 1: Build Five More BeePi Units. The BeePi hardware has been sufficiently stress tested and iteratively modified. With this fundraiser, we aim to build five more BeePi monitors and deploy them in local beekeepers' apiaries. Videos 1 and 2, shot in mid-May 2019, show how a BeePi monitor is deployed on a hive with a newly installed bee package.
Video 1. Field Installation of a BeePi Monitor (Part 1): Installing a bee package into a hive on which a BeePi monitor will be placed.
Video 2. Field Installation of a BeePi Monitor (Part 2): Placing a BeePi monitor on a newly hived bee colony.
Objective 2: Directional Bee Traffic Estimation. Our current vision-based bee traffic estimation algorithm, which returns counts of moving bees on or near the landing pads of Langstroth beehives, is omnidirectional: it detects bees moving in any direction (up, down, or lateral). Video 3 below gives you a sense of how our current algorithm works. The BeePi monitor's camera looks straight down on the landing pad, and the green squares are drawn in real time around detected moving bees. We plan to enhance this algorithm to estimate directional bee traffic. Given a video, the algorithm will return three numbers: the count of bees that moved down (i.e., toward the beehive), the count of bees that moved up (i.e., away from the hive), and the count of bees that moved laterally (i.e., sideways).
Video 3. Omnidirectional bee motion detection in BeePi.
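To give a flavor of the directional counting idea, here is a minimal sketch, not the BeePi implementation itself. It assumes that an upstream motion detector has already produced centroid tracks for individual bees, that the camera looks straight down with image y increasing toward the hive entrance, and that the function and threshold names are hypothetical:

```python
def classify_track(track, lateral_ratio=1.0):
    """Classify a bee track (a list of (x, y) centroids) as 'down', 'up', or 'lateral'.

    Assumes image coordinates where y grows toward the hive entrance, so a
    positive net vertical displacement means the bee moved toward the hive.
    A track counts as lateral when its horizontal displacement exceeds the
    vertical one by more than lateral_ratio.
    """
    (x0, y0), (x1, y1) = track[0], track[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) > lateral_ratio * abs(dy):
        return "lateral"
    return "down" if dy > 0 else "up"


def directional_counts(tracks):
    """Tally tracks into the three counts the enhanced algorithm would report."""
    counts = {"down": 0, "up": 0, "lateral": 0}
    for track in tracks:
        counts[classify_track(track)] += 1
    return counts
```

For example, `directional_counts([[(10, 5), (12, 40)], [(50, 60), (52, 10)], [(0, 20), (30, 22)]])` returns `{"down": 1, "up": 1, "lateral": 1}`. The hard part in practice is producing reliable tracks from video of many small, fast-moving bees; the classification step above is the easy tail end.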
Objective 3: Sensor Fusion. It's highly unlikely that intelligent beehive robots will use single sensors (i.e., only audio, only video, only temperature, only weight, etc.). If a beehive is to model human beekeepers' decision making, it must use multiple sensors, because human beekeepers use multiple sensors. What this means, in practical terms, is that the intelligent beehive's AI must be capable of sensor fusion - the ability to integrate and analyze data from multiple data streams. To achieve this objective, we'll work on multi-modular pattern recognition and tracking models that correlate video, audio, temperature, and human beekeeper observations. Funding permitting, we'll integrate a chemosensor into our sensor set. As an example, consider the three plots below from one of our beehives (let's call it beehive X) monitored by a BeePi monitor in 2018. The first two plots (see Figures 1 and 2) are generated automatically by BeePi. The third plot (see Figure 3) was generated from a human beekeeper's log. The multi-modular models will look for correlations between and among these streams of data to detect deviations from normalcy.
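A first practical hurdle in sensor fusion is that the streams are sampled at different times: temperature readings, traffic counts, and beekeeper log entries rarely share timestamps. The sketch below, a simplification and not BeePi's actual pipeline, shows one common approach: interpolate two timestamped streams onto a shared time grid and compute their Pearson correlation as a crude fusion signal (the function name and grid are assumptions for illustration):

```python
import numpy as np

def fuse_streams(t1, v1, t2, v2, grid):
    """Correlate two (timestamp, value) streams after aligning them in time.

    t1/v1 and t2/v2 are the timestamps and values of the two streams;
    `grid` is a common time grid onto which both are linearly interpolated.
    Returns the Pearson correlation coefficient of the aligned streams.
    """
    a = np.interp(grid, t1, v1)  # stream 1 resampled onto the grid
    b = np.interp(grid, t2, v2)  # stream 2 resampled onto the grid
    return float(np.corrcoef(a, b)[0, 1])
```

A multi-modular model would of course go well beyond pairwise linear correlation, but even this simple statistic can flag deviations from normalcy, e.g., bee traffic that suddenly stops tracking ambient temperature the way it did in previous weeks.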
Objective 4: Data Curation and Sharing. To make our vision of beehives as intelligent immobile robots a reality, we need terabytes (I repeat - terabytes) of good quality multi-sensor data collected over multiple years (quite possibly decades) and at multiple locations. It's a long haul with no quick fixes or panaceas in sight. It is of paramount importance that the collected and curated datasets and models be publicly available. Ultimately, this is the only guarantee of quality control through replicability by independent 3rd parties, which is the essence of citizen science.
A fundamental, long-term objective of the BeePi project has been (and will remain) to provide a replicable open design/open data/open source citizen science platform for electronic beehive monitoring: a set of software tools and hardware designs that let researchers, practitioners, enthusiasts, and citizen scientists monitor their own or someone else's beehives and share their findings and insights without private, centrally managed data repositories, patents, artificial data standards, or any other top-down control structures that stifle innovation, participation, engagement, and sharing. In other words, 100% bottom-up and grassroots.
All beekeeping is local. There may be a few universal rules of thumb, but, generally speaking, what works for me may not necessarily work for you, which is why we want people to replicate and share. Build your own BeePi monitors and create your own blogs and data/code repositories so that other people can benefit from your local findings and improve on them. It's an old, well-tried principle on which Linux, Wikipedia, OpenOffice, OpenAI, OpenCV, Octave, Tor, and many other projects have been successfully built for the benefit of all. By supporting the BeePi project you're supporting scientific endeavor and citizen science.
BeePi Tote Bags
Ambient Beehive Audio Tracks Collection
This collection includes the following tracks.
1. Bee Symphony (String Ensemble)
2. Curious Turkeys (Acoustic Drums with a Touch of Acoustic Snare)
3. Static Noise Chant (Polysynthesizer)
4. Rain Drops Dancing on a Tin Can (Marimba)
5. Waves, Birds, and Bells (Synthesizer)
6. Horses and Bumble Bees (Acoustic Drums with a Touch of Acoustic Snare)
7. Little Mouse Tooh Visits a Beehive (Synthetic Voice)
8. Sad Cricket Watching Fireworks (Music Box)
9. Bee Concerto with a Rain Drop Theme (Acoustic Grand Piano)
For technical details on how to digitally synthesize music out of ambient beehive audio samples, see V. Kulyukin, M. Putnam, and S. Reka, "Digitizing Buzzing Signals into A440 Piano Note Sequences and Estimating Forager Traffic Levels from Images in Solar-Powered, Electronic Beehive Monitoring," Proceedings of the International MultiConference of Engineers and Computer Scientists 2016, Vol. I, IMECS 2016, March 16-18, 2016, Hong Kong.
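The core of any A440 mapping is snapping a dominant frequency extracted from a buzz sample to the nearest note of the equal-tempered scale. The following is a hedged illustration of that one step, not the paper's exact method; the function name is an assumption:

```python
import math

# Pitch-class names for the 12 semitones of an octave, starting at C.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_note(freq_hz):
    """Snap a frequency in Hz to the nearest A440 equal-tempered note.

    Returns (note name with octave, exact frequency of that note).
    Uses the standard MIDI convention: note 69 is A4 = 440 Hz, and each
    semitone step multiplies the frequency by 2**(1/12).
    """
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    name = NOTE_NAMES[midi % 12] + str(midi // 12 - 1)
    exact = 440.0 * 2 ** ((midi - 69) / 12)
    return name, exact
```

For instance, `nearest_note(440.0)` returns `("A4", 440.0)`, and a buzz component near 261.63 Hz snaps to C4 (middle C). A full synthesis pipeline would also need to extract dominant frequencies per time window (e.g., via an FFT) and render the resulting note sequence with an instrument voice.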
Risks and challenges
Any project that deals with deploying hardware devices into real-world environments carries hardware failure risks. If any hardware components malfunction, they can be easily replaced, because all of them are off-the-shelf and readily available from multiple online vendors. I am qualified to complete this project. I have a Ph.D. in Computer Science and have been doing R&D work in various areas of Computer Science, mostly AI-related, for over 20 years. I know how to manage R&D to achieve concrete results. You can check out my LinkedIn profile (see above) for more details. I am also a beekeeper, and I love beekeeping.