
Multi-Sensor Feeder: Automated and Easy-To-Use Animal Monitoring Tool for Citizens


Formal Metadata

Title
Multi-Sensor Feeder: Automated and Easy-To-Use Animal Monitoring Tool for Citizens
License
CC Attribution 3.0 Unported:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Production Year: 2022

Content Metadata

Abstract
Environmental changes can have different causes at the local level (e.g. soil sealing) as well as at the global level (e.g. climate change). To detect these changes and to find patterns in their causes, it is necessary to collect broad environmental data, both temporally and spatially. Citizens can play an essential role in collecting these data (Goodchild, 2007). In particular, we developed a system which enables citizens to monitor the occurrence and distribution of birds and provides the collected data to the public, so that both researchers and citizens can derive conclusions from them. With our automated approach we want to support other citizen science solutions like eBird (Sullivan et al. 2014), where contributors manually report their sightings. To this end, we built a prototypical bird feeder equipped with several sensors, together with the infrastructure to process the data collected by the feeder. The feeder is easy to reproduce at a reasonable price by following an openly available manual. This allows anyone to build the feeder on their own, enabling a large distribution across many locations. The feeder automatically detects when a bird is visiting it, takes an image of the bird, determines the species and connects the observation with environmental data such as temperature or light intensity. All the collected data are published on an open access platform developed for this purpose. Incorporating other surrounding factors, such as the proximity of the feeder station to the nearest forest or to a large street, makes it possible to pursue various questions regarding the occurrence of birds. For example: how does the immediate environment affect bird abundance? Do sealed surfaces have a negative effect compared to a flowering garden? The developed weatherproof bird feeder is equipped with multiple sensors. The standard equipment includes a motion sensor to detect whether a bird is currently visiting the feeder, a camera to take images of the birds, a balance to weigh the birds, and a sensor to measure the ambient temperature and air pressure. In addition to the standard sensors, further sensors were tested with the prototype which usefully supplement the monitoring but are not strictly necessary for the operation of the station. A microphone is suited to record the calls of the birds or, more generally, the surrounding noises. A brightness sensor can be valuable to draw conclusions about whether birds visit the feeder in relation to light conditions, and a sensor measuring air pollution (e.g. PM10) can be used to investigate whether air quality influences bird occurrence. Besides, the usual camera can be replaced by an infrared camera to capture animals which visit the feeder at night. Thus, the station is expandable and customizable depending on the individual use cases or research questions. The environmental sensor data is continuously logged and sent to the open access platform, whereby the corresponding interval can be set by the user. Once the motion sensor detects a movement, the camera starts recording and the scale and microphone start to store values. As long as the motion sensor detects movement, camera, microphone and balance keep running. After the movement has finished, a lightweight recognition model is used to check whether a bird is depicted in the images. If this is the case, all data collected during the movement, including the respective environmental data, are sent as a package to the open access platform.
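To make the described flow more concrete, the following is a minimal Python sketch of such a detection-and-upload loop. The sensor helpers, the lightweight bird check and the upload URL are stand-ins invented for this sketch; they do not correspond to the project's actual code.

    # Minimal sketch of the feeder's detection-and-upload loop (illustrative only).
    import time
    import requests

    UPLOAD_URL = "https://example.org/api/movements"     # hypothetical endpoint

    def motion_detected():    return False                # stand-in for the motion sensor
    def capture_image():      return b""                  # stand-in for a camera frame
    def read_weight():        return 0.0                  # stand-in for the balance
    def read_environment():   return {"temperature_c": 20.0, "pressure_hpa": 1013.0}
    def bird_in_images(imgs): return bool(imgs)           # stand-in for the lightweight model

    while True:
        if motion_detected():
            images, weights = [], []
            while motion_detected():                      # record as long as the movement lasts
                images.append(capture_image())            # microphone recording omitted for brevity
                weights.append(read_weight())
                time.sleep(0.2)
            if bird_in_images(images):                    # only upload if a bird is depicted
                package = {"weights": weights,
                           "environment": read_environment(),
                           "images": len(images)}         # real code would attach the media
                requests.post(UPLOAD_URL, json=package, timeout=10)
        time.sleep(0.5)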
In order to process the data collected by the station, we have developed various methods and software for data storage, analysis and sharing. The data processing is done on a centralized server. Communication with this server is enabled through a RESTful API and a website. Feeder entities created on the server can receive environmental data as well as movement packages. When movements are sent, the server analyzes the number of birds and identifies the species with artificial intelligence. In addition to storing the data, the server makes them available to users in two ways. First, the data is downloadable as raw JSON via the API, which enables others to use it for their own research. Second, the data is presented on our website in a way that makes it easily inspectable for everyone. However, uploads to the server are not only possible via our stations; it is also open for data gathered by other systems. Further, it is also possible to upload images of birds and receive the recognized species. The feeder is designed so that it can be replicated by anyone. The corresponding instructions will be published shortly. The code to run the station and the server is available via GitHub (github.com/CountYourBirds). Moreover, different options for the validation of the data, especially of the species classification, are implemented. One step is the automatic validation by sensor values or metadata. For instance, if a standard camera recognizes a bird but it is currently night (detected by a light sensor or the time of day), or the balance detects nothing, the observation is discarded. Further validation can come from actual people. An interface is provided which shows people the recorded values, and especially the images, together with the automatically recognized species. The depicted data can be validated to find corrupt sensors and to weed out mistakes made in the image classification. Additionally, the server-side evaluation of the data is supplemented by a plausibility check of the recognized species: it is checked whether the species can occur in that geographic region or at that time of the year. As next steps we want to conduct workshops with citizens and experts, both for putting together the stations and for evaluating the data and the station itself. In general, a strength of our implemented approach is that it is easily adaptable to other use cases, especially to the detection of other animals. For example, with small adaptations the feeder could be used to detect or count mammals like squirrels, or insects like butterflies and bees.
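As an illustration of the raw JSON access mentioned above, here is a minimal Python sketch of pulling the data via the API. The base URL and endpoint paths are assumptions for this sketch; the real routes are described in the CountYourBirds documentation.

    # Illustrative download of station data as raw JSON (endpoint paths are assumed).
    import requests

    BASE_URL = "https://example.org/api"                  # hypothetical base URL

    stations = requests.get(f"{BASE_URL}/stations", timeout=10).json()
    for station in stations:
        env = requests.get(f"{BASE_URL}/stations/{station['id']}/environment",
                           timeout=10).json()
        movements = requests.get(f"{BASE_URL}/stations/{station['id']}/movements",
                                 timeout=10).json()
        print(station["id"], len(env), "environment records,",
              len(movements), "movement packages")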
Transcript: English (auto-generated)
Yeah, I'd like to welcome you to our talk "Multi-Sensor Feeder: an automated and easy-to-use bird monitoring tool for citizens". We conducted this work together with the two colleagues who were already named before, and a further colleague who could not be present today, Thomas Bartoschek, and we come from the Institute for Geoinformatics in Münster in Germany. Right, here I can present you our contents, but I will mainly talk about our so-called multi-sensor feeder, which I will present to you in a few minutes. But let's first start with the topic of biodiversity loss. So at this point in time, more species are threatened with extinction than ever before.
This is a statistic on the left-hand side by the IUCN, the International Union for Conservation of Nature, and they found out that the number of endangered species doubled from 2007 to 2019 to over 14,000, and this is especially true for birds as well. Yeah, this is also evident from the graphic on the right-hand side, which was created by the Nature and Biodiversity Conservation Union (NABU) in Germany. They found out that the total number of breeding pairs fell by 14 percent between 1998 and 2009; for example, for the starlings there is a decrease of 42 percent, which amounts to about 2.6 million breeding pairs.
Yeah, the reasons for this are oftentimes anthropogenic impacts, such as excessive land use and the destruction of nature, and there is really a need to change something. But to change something, we need to know what we can change or where we can do something. And that is what we work on. So we have three areas which help us in our research subject. First, citizen science: there are already several projects working in this field, for example projects where citizens count birds in their backyards, or an application where citizens can track invasive alien species,
and other possibilities, for example iNaturalist, where you can validate data recorded by other citizens. There are also some quite interesting open source tools like the senseBox and open source weather stations, which you can also use and operate as a citizen. Right, however, continuous data gathering is also time-consuming for citizens, and therefore it is quite useful to use automation. This can be done by several sensors, for example by an RFID chip, a camera, or even a microphone to record sound, and
it is also possible to combine different sensors and to use further information. Yeah, artificial intelligence can be a game-changer in this subject. So artificial intelligence can be used to identify an animal, but not only to identify an animal, also to identify the species of the identified animal. You can use already existing models, you can train your own model, and you can combine different parameters. So for example, you can use the appearance of the animal, or the motion and the environment where the animal is living, to raise the probability of the prediction which you make with AI.
Right, and this brings me to our research project. We want to develop, or are developing, an automated and easy-to-use bird monitoring tool for citizens, which we call the multi-sensor feeder. The overall goal of this work is to contribute to the evaluation of the biodiversity in people's gardens or on balconies by monitoring birds and environmental factors. What is important in this context? First, we want all the data which is collected by the stations in the gardens to be available as open access. Second, we want to identify the species of the visiting birds automatically. And we not only want to collect data about the birds visiting the stations, but also about the environment around the station. And the station should be reproducible, affordable and easy to use for anyone, so that every citizen can build the station on their own, which leads to a high distribution of stations and thus to a high amount of data. Now I come to our approach. The approach was to test different configurations, in terms of the hardware, so different RAM sizes of the microcomputer, different cameras, different kinds of detection for how we detect a movement. Also in terms of the software we tried out different options, for example for the data storage or the processing location. And for the casing of the station we also tried different variants, for example we changed the kind of timber, but I will talk about that in a minute. We had certain criteria while testing these different configurations: what was important for us is a high usability, functionality, and a reasonable price. Right, and with this I come to our main result, the multi-sensor feeder. On the left-hand side you can see all components needed to build up the station,
and I will explain them in a minute in a bit more detail. On the right-hand side you can see the built-up station in a garden, mounted to a wooden stake. Right, and first I will introduce you to the technical components. On the left-hand side you see the different components, and on the right-hand side you see the name and the corresponding costs. So first of all, we have a microcomputer, a Raspberry Pi 4 Model B with two gigabytes of RAM. We have a camera, which is the version 2 Raspberry Pi camera with 8 megapixels, and we furthermore use a microphone and a balance, which consists of a load cell and a weight sensor. Right, and furthermore an environmental sensor which measures temperature and humidity. Yes, the price for all these technical components is slightly below 100 euros, and the prices were taken from the online seller BerryBase,
which we chose as a vendor because it offered most of the hardware needed at a reasonable price and had good service. Now we come to the casing components. We mainly use a beech wood multiplex plate with a thickness of nine millimeters, out of which we created, or cut out, the individual parts. You furthermore need a plastic lid for the station, a wooden perch where the bird can land, a case for the camera, roofing felt for weather resistance, and some further material like screws to attach and assemble the station. Yeah, and lastly, the casing components cost about 40 euros, so that we come to a final price of about 140 euros as the production price for the whole station.
Yeah, now I come to the recognition process. But firstly, you can see the station from a side view: the interior is divided into two different parts. First, there is space for the food in the front, so that it is reachable by the bird landing on the perch in front of the station, and in the back you can see some space for the technical components like the Raspberry Pi. Of course the roof is removable, so you can reach inside to make some adaptions to the technology or to refill the food. Right. So the balance in the front measures the weight in a short-term interval, and the environment sensor at the bottom, which is connected to the microcomputer through a hole, measures temperature and humidity every X minutes, so the user can set how often the environmental sensor should take a measurement. Right. And now if a bird lands on the perch, then what we call a movement starts: the balance detects a change in the weight, and this is how we recognize a movement. Then the camera starts recording and films the bird, and the microphone starts recording the environment, so the sound of the environment. And if the bird leaves, the balance again detects a change, because no weight is measured anymore, so the movement is over. With this, the camera stops recording and the microphone also stops recording, and the microcomputer starts to send the data as a movement package to our server. Then the server is able to detect the bird species by the use of artificial intelligence, and stores all the data on our server.
With this, the data can be shared, and therefore we use a platform where researchers as well as citizens can access all the data collected by the stations. Right, the collected data is available in real time via our API. There is some usual data, of course, like a time frame, but what is especially important, marked in red here, is the environmental data, so the temperature and humidity measured by the station. And if we have a look at the movement data, the weight of the visiting bird is stored, the surrounding sound is available, and there is an AI-based species recognition. So in this case, it is a great tit with a prediction score of 96 percent.
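For illustration, a movement package returned by the API could look roughly like the following; the field names and the numeric values are made up for this sketch, only the species result mirrors the example just mentioned.

    # Hypothetical shape of a movement package (illustrative field names and values).
    movement = {
        "station_id": 1,
        "start": "2022-06-14T09:31:05Z",
        "end": "2022-06-14T09:31:27Z",
        "environment": {"temperature_c": 21.4, "humidity_percent": 58},
        "weight_g": 17.8,
        "audio_url": "https://example.org/media/movement-123.wav",   # hypothetical
        "video_url": "https://example.org/media/movement-123.mp4",   # hypothetical
        "species": {"name": "great tit", "score": 0.96},
    }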
Yeah, but our data, or the data collected by the stations, is not only available via the API in real time, but also via our website. Here you can see on the left-hand side the stations which are currently visited by birds, and on the right-hand side you can see one single station on our platform, where you can find the recordings of the last three birds who visited the station, as video and as audio. Also the weight is shown, and also the species detected by our server. Furthermore, you can see some time series of the environmental data in terms of temperature and humidity, and at the bottom you can see the total number of birds who visited the corresponding station yesterday and today.
Yeah, with this I want to come to one important point: everything we are doing is open. The research code is available on GitHub and Zenodo, and we provide open documentation for our API. We use open source tools, for example for the server we use Docker, Flask and Nginx. Yeah, concerning the model which recognizes the bird species, we use iNaturalist data together with a MobileNetV2 model made by Google, and for the map on our website we use Leaflet as well as React.
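As a rough illustration of how a species classifier in this spirit can be assembled, here is a sketch with TensorFlow/Keras: MobileNetV2 as a frozen feature extractor plus a new classification head trained on labelled bird images. The species count, image size and training data are assumptions; this is not the authors' actual training code.

    # Sketch: MobileNetV2 backbone with a new species classification head.
    import tensorflow as tf

    NUM_SPECIES = 50          # assumption: number of species to distinguish
    IMG_SIZE = (224, 224)

    base = tf.keras.applications.MobileNetV2(
        input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
    base.trainable = False    # keep the pre-trained backbone frozen at first

    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(NUM_SPECIES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds/val_ds: labelled bird images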
Right, and if you want to build up the station yourself, we have a do-it-yourself manual available online on GitHub, and furthermore you can of course visit our platform to have a look at the different stations and to get some more information about our project. Right, now I come to our discussion. So we decided to use as microcomputer a Raspberry Pi 4 Model B with two gigabytes of RAM. This is a relatively common tool with a lot of documentation, and it is easily adaptable to our needs. There are many compatible sensors available with enough documentation, and the two gigabytes of RAM are quite enough for our purpose. Concerning the camera, we use the Raspberry Pi camera module version 2 with 8 megapixels. This is due to usability reasons, as it is easily attachable and the price is quite reasonable for such sufficient quality. For example, if we used the High Quality Camera, it would be much more expensive and not very usable due to its size.
Now I come to the detection of the birds. There are quite a few different options. First, a motion sensor could be used and attached near the camera. But for that, manual settings are required, like setting a time delay or the sensitivity, and these are only changeable manually, not by software, and of course it is an additional sensor. But there are also positive arguments: you can also detect movement in the background, so not only movement directly in front of the station. This is also true for pixel change detection. The good thing here is that it is, in a way, already installed, because we do not need an additional sensor; you can simply do it with the camera. But it is challenging to define a threshold at which a pixel change counts as motion, and it requires permanent analysis, which needs a lot of processing power. Nevertheless, like the motion sensor, it is interesting if you also want to cover motion in the background.
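For illustration, a pixel-change check like the one discussed could be sketched with OpenCV as follows; the threshold values are arbitrary examples, not the ones evaluated in the project.

    # Sketch of pixel-change motion detection: compare consecutive frames and
    # treat a large enough fraction of changed pixels as motion.
    import cv2

    PIXEL_DIFF = 25        # per-pixel intensity difference that counts as "changed"
    CHANGED_RATIO = 0.02   # fraction of changed pixels that counts as motion

    cap = cv2.VideoCapture(0)                      # camera index 0
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:                                    # this loop must run continuously,
        ok, frame = cap.read()                     # which is what makes it costly
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)
        changed = (diff > PIXEL_DIFF).mean()       # fraction of pixels that changed
        if changed > CHANGED_RATIO:
            print("motion detected")               # here the recording would be triggered
        prev_gray = gray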
But we finally decided to use the balance, so a certain change in the detected weight counts as a movement. With this, there is a low number of false recordings, because the camera only starts recording if a weight is really measured by the balance, and thus there is a lower processing requirement, which enables us to use a low-cost microcomputer. In terms of the choice of the wood, we first used a dark-colored multiplex plate including a film layer as weather protection. But over time we recognized that the station with this black material was really heating up, and we decided to use a beech plate instead, for which we nevertheless recommend a glaze to make it even more weather resistant. In terms of the general size, it is important that there is enough space for the technical components and the food, and that it is not too big and still mountable by everyone. For the roof, the angle and length are important: if you take a detailed look at the front, there is a small overhang, so there is some space for food that the bird can really reach. The food silo is similar: you need enough space for the food and a sufficient angle so that the food can really roll out. And in terms of the balance position, it needs to be far enough away from the camera so that the bird can be recorded.
But if you use a plate There's a lot of place for unwanted stuff like dirt or feckle and this influences in the whole recognition process and just towards measured weights Yeah, then another thing we thought about was Recording videos or images. So if you record video video says more space for information and more
It is probably more interesting for the citizens because they can really watch the videos Images instead need less disk space and are easier to send via network Finally, we decided to record videos because if we only got one single image of the bird probably only parts of the bird
depicted Yeah, we now use in 30 frames per second video whereby we use every 10th frame Which is then finally analyzed by our image recognition model concerning the processing You could do the image recognition on the microcontroller or on the server if you do it on the microcontroller
More processing power for the microcomputer is needed. So probably then two gigabits of RAM are not enough But then if you do it there You only need to send for example short text information like this piece as predicted by the model and then it's not required to use Really Wi-Fi you could also use a low power network like LoRaWAN instead of Wi-Fi
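A minimal sketch of the frame sampling just described, assuming OpenCV and a placeholder classifier call:

    # Sketch: keep every 10th frame of a recorded movement video (~3 per second at 30 fps).
    import cv2

    cap = cv2.VideoCapture("movement-123.mp4")    # hypothetical recording file
    frames = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % 10 == 0:                       # every 10th frame
            frames.append(frame)
        index += 1
    cap.release()
    # predictions = [classify_species(f) for f in frames]   # placeholder model call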
Right, and in terms of sending the data, I already explained that we send movement packages. But in addition, we also send environment packages, where we send only the environment information like temperature and humidity. We do this so that it is also possible to answer research questions depending on the environment, although a bit more processing power is needed.
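For illustration, sending such an environment package at a user-configurable interval could look roughly like this; the sensor reading and the endpoint are placeholders, not the station's real code.

    # Sketch: periodic environment logging, sent separately from movement packages.
    import time
    import requests

    INTERVAL_MINUTES = 10                                  # user-configurable logging interval
    ENV_URL = "https://example.org/api/environment"        # hypothetical endpoint

    def read_environment():
        # placeholder: the real station would query its temperature/humidity sensor here
        return {"temperature_c": 20.0, "humidity_percent": 55.0}

    while True:
        payload = read_environment()
        payload["timestamp"] = time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())
        requests.post(ENV_URL, json=payload, timeout=10)
        time.sleep(INTERVAL_MINUTES * 60)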
Right, concerning privacy: the stations are built up in the gardens of private persons, and the recordings and positions of the stations are available on our homepage, so there is really a need to ensure data privacy. Currently, the camera focuses on the perch, the background is blurred quite a bit, and recordings are only stored when a weight is recognized. And the citizens of course need to agree that the data is stored. But there are some steps we want to take in the future. For example, the location could be blurred via hexagons, as other communities are doing, so that you do not see the exact corresponding locations, but they are generalized into a large-scale hexagon.
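A minimal sketch of such a location blurring, assuming the H3 library were used (the library choice, resolution and coordinates are assumptions, not a decision of the project):

    # Sketch: generalize a station's coordinates to a coarse hexagon (h3-py v3 API).
    import h3

    lat, lng = 51.969, 7.596                       # illustrative coordinates, not a real station
    cell = h3.geo_to_h3(lat, lng, 5)               # hexagon id at resolution 5 (several km across)
    approx_lat, approx_lng = h3.h3_to_geo(cell)    # publish the hexagon centre instead
    print(cell, approx_lat, approx_lng)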
Another idea is a lightweight image recognition on the Raspberry Pi to detect unwanted content, so that if, for example, a person is depicted, the data is not sent to our server. Right. There are also some limitations. You need certain do-it-yourself tools, for example a 3D printer or further tools like a drill or a saw, but these should normally be available in a well-equipped home workshop. You need to think about the station proportions: a big bird is probably not able to land on the small perch, and a small bird standing on the perch is probably too far away from the food to reach it. And of course the food is also a limiting factor: different food affects the station differently, as the visiting birds change with the chosen food. You need to think about weather protection, so there is a need to ensure that all the different sensors are weather protected. We started to put our stations out in nature in May, so they have been running until today; let's see how long they will keep doing so. And of course there is a need for validation, because we are a citizen science project. It is only due to the citizen science approach that our project works, because this way we can really collect a lot of data. But yeah, as every citizen is a bit different, the stations are a bit different, and also the collected data is different and therefore needs to be validated. Yeah. Now I come to future work.
Our idea is to use some further sensors, for example a particulate matter sensor, a loudness sensor, or an infrared camera to also make recordings during the night. We thought about using an alternative microcontroller, which could lower the production costs. We also thought about a standalone mode, so that we, for example, change the network connection to LoRaWAN or cellular, or use not a fixed power cable but instead a battery or solar power. We also thought about the detection of individual birds, so that you do not say there were a hundred birds today, but fifty times the same bird. We thought about the validation; this could also be done in a more automated way. And we thought about training our own model, because we now really collect a lot of image data, and thus we can train our own model or combine it with a pre-trained model. And of course it is also interesting to make the station available for further organisms; we already know there were some squirrels, and it would of course also be interesting to track them.
Yeah, and with this I come to the end. Our presentation is also available via this link, and the paper is already published. On our website you can see some more information about our project in general, but of course you can also see the stations there. And while you think about some great questions, you can watch a video about the birds who already visited our stations.
Thank you