
🐦 AI-powered bird identification for Frigate NVR. Classifies birds from your feeder camera using TensorFlow Lite and displays them in a real-time dashboard.

Yet Another WhosAtMyFeeder (YA-WAMF)

A bird classification system that integrates with Frigate NVR to automatically identify birds visiting your feeder using advanced AI models.

Features at a Glance

  • Advanced AI Classification - MobileNetV2, ConvNeXt, or EVA-02 models (up to 91% accuracy)
  • Multi-Sensor Verification - Correlates visual detections with BirdNET-Go audio
  • Smart Notifications - Discord, Telegram, Pushover, Email with customizable filters + Notification Center
  • Video Analysis - Automatic scanning of 15+ frames (temporal ensemble) for improved accuracy
  • LLM Insights - AI-powered behavioral analysis (Gemini/OpenAI/Claude)
  • Leaderboard AI Insights - Analyze detection charts for trends and weather correlations
  • Home Assistant Integration - Sensors, automation, and dashboard cards
  • BirdWeather Reporting - Contribute to community science
  • Real-time Dashboard - Live updates, video playback, species statistics
  • Notification Center - Pinned progress for long-running jobs and a full notifications view
  • Public View (Guest Mode) - Share a read-only dashboard with rate limits and optional camera name hiding

About This Project

A personal project built with AI-assisted coding, inspired by the original WhosAtMyFeeder. When I noticed the original project wasn't being maintained, I saw an opportunity to learn and build something better.

Built with help from AI coding assistants as an experiment in what's possible with modern development tools. Feedback and contributions are welcome!

Live Instance

A public instance of YA-WAMF is available here:

What It Does

When Frigate detects a bird at your feeder, YA-WAMF:

  1. Grabs the snapshot image
  2. Runs it through an advanced AI model (MobileNetV2, ConvNeXt, or EVA-02)
  3. Cross-references with audio detections from BirdNET-Go for multi-sensor confirmation
  4. Automatically analyzes the video clip (optional) using a temporal ensemble of 15+ frames for higher accuracy
  5. Sends rich notifications to Discord, Pushover, Telegram, or Email (OAuth/SMTP)
  6. Enriches detections with local weather data and behavior analysis via LLMs (Gemini/OpenAI/Claude)
  7. Tracks all your visitors in a dashboard, with taxonomic name normalization
  8. Proxies video clips from Frigate with full streaming and seeking support
  9. Reports detections to BirdWeather (optional) for community science contribution
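The temporal ensemble in step 4 boils down to averaging per-frame class probabilities before picking a winner, so a single noisy frame matters less. A minimal sketch of the idea (the function and the scores below are illustrative, not YA-WAMF's actual code):

```python
# Sketch of a temporal ensemble: average each species' probability
# across sampled frames, then take the arg-max.
# The per-frame scores here are made-up illustrative numbers.

def ensemble_classify(frame_scores: list[dict[str, float]]) -> tuple[str, float]:
    """Average each species' probability over all frames and return
    the best (species, mean_score) pair."""
    totals: dict[str, float] = {}
    for scores in frame_scores:
        for species, p in scores.items():
            totals[species] = totals.get(species, 0.0) + p
    n = len(frame_scores)
    averaged = {s: t / n for s, t in totals.items()}
    best = max(averaged, key=averaged.get)
    return best, averaged[best]

frames = [
    {"Blue Tit": 0.60, "Great Tit": 0.40},
    {"Blue Tit": 0.80, "Great Tit": 0.20},
    {"Blue Tit": 0.70, "Great Tit": 0.30},
]
species, score = ensemble_classify(frames)
print(species, round(score, 2))  # the averaged score smooths out frame noise
```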

Advanced Features:

  • Auto Video Analysis: Automatically downloads and scans 15+ frames from the event clip to verify snapshot detections.
  • Multi-Platform Notifications: Native support for Discord, Pushover, Telegram, and Email with customizable filters (species, confidence, audio-only).
  • Accessibility & i18n: Screen-reader friendly UI, live announcements toggle, and multilingual interface/notifications.
  • Multi-Sensor Correlation: Matches visual detections with audio identifications from BirdNET-Go (now with live dashboard widget!).
  • Backfill Tool: Missed some events? Scan your Frigate history to import and classify past detections.
  • AI Naturalist Insight: One-click behavioral analysis of your visitors using state-of-the-art LLMs.
  • Elite Accuracy: Support for state-of-the-art EVA-02 Large models (~91% accuracy).
  • Taxonomy Normalization: Automatic Scientific ↔ Common name mapping using iNaturalist data.
  • iNaturalist Submissions (Beta): Owner-reviewed submissions are implemented but currently untested due to App Owner approval limits. Testers welcome.
  • Camera Preview: Expand a camera in Settings → Connection to see a live snapshot preview.

Note: To preview the iNaturalist submission UI without OAuth, enable Debug UI (DEBUG_UI_ENABLED=true) and toggle Settings → Debug → iNaturalist preview UI.

  • Fast Path Efficiency: Skip local AI and use Frigate's sublabels directly to save CPU.
  • Home Assistant Integration: Full support for tracking the last detected bird and daily counts in HA.
  • Observability: Built-in Prometheus metrics, Telemetry (opt-in), and real-time MQTT diagnostics.
  • Public View (Guest Mode): Optional read-only sharing with rate limits and privacy controls.

Documentation

For detailed guides on setup, integrations, and troubleshooting, please see the Full Documentation Suite.

How It Works

Here's the flow from bird to identification:

┌─────────────┐     MQTT Event      ┌─────────────┐
│   Frigate   │ ─────────────────>  │  YA-WAMF    │
│   (NVR)     │   "bird detected"   │  Backend    │
└─────────────┘                     └──────┬──────┘
                                           │
                                           v
                                    ┌──────────────┐
                                    │ Fast Path:   │
                                    │ Use Frigate  │
                                    │ Sublabels?   │
                                    └──────┬───────┘
                                           │
                                     (No)  v  (Yes)
                                    ┌──────────────┐
                                    │  AI Engine   │
                                    │ (TFLite/ONNX)│
                                    └──────┬───────┘
                                           │
                                           v
                                    ┌──────────────┐
                                    │ Save to DB & │
                                    │ Notify User  │
                                    └──────┬───────┘
                                           │
                                           v
                                    ┌──────────────┐
                                    │ Auto Video   │
                                    │ Analysis     │
                                    │ (Background) │
                                    └──────────────┘

Step by step:

  1. Frigate spots a bird - Your camera picks up movement, and Frigate's object detection identifies it as a bird
  2. MQTT message sent - Frigate publishes an event to frigate/events on your MQTT broker
  3. YA-WAMF receives the event - The backend is subscribed to that MQTT topic and picks up the message
  4. Efficiency Check - If "Trust Frigate Sublabels" is enabled and Frigate already has a label, YA-WAMF uses it instantly.
  5. Classification runs - Otherwise, the image goes through a local model (TFLite or ONNX) trained on bird species.
  6. Results stored & Notified - The detection is saved, and notifications (Discord/Telegram/Pushover) are fired immediately.
  7. Deep Analysis - If enabled, a background task waits for the video clip to finalize, then scans it frame-by-frame to refine the ID.
  8. Dashboard updates - The frontend gets real-time updates via Server-Sent Events (SSE).
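Steps 3-5 come down to a small decision made on each MQTT message. The sketch below parses a Frigate-style event payload and takes the fast path when a sublabel already exists (the payload shape follows Frigate's documented event schema; the helper function is hypothetical, not the project's actual code):

```python
import json

def handle_event(payload: str, trust_sublabels: bool) -> str:
    """Decide how to classify a Frigate 'bird' event.
    Returns the sublabel on the fast path, or 'run-local-ai' otherwise."""
    event = json.loads(payload)
    after = event.get("after", {})
    if after.get("label") != "bird":
        return "ignore"                      # not a bird event
    sub_label = after.get("sub_label")
    if trust_sublabels and sub_label:
        return sub_label                     # fast path: reuse Frigate's label
    return "run-local-ai"                    # fall through to TFLite/ONNX

msg = json.dumps({"type": "end",
                  "after": {"camera": "feeder", "label": "bird",
                            "sub_label": "House Sparrow"}})
print(handle_event(msg, trust_sublabels=True))   # House Sparrow
```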

Quick Start

Prerequisites

  • Docker and Docker Compose installed
  • Frigate NVR running with MQTT enabled
  • MQTT broker accessible (typically Mosquitto running alongside Frigate)
  • Basic knowledge of Docker networking

Installation

1. Download configuration files:

mkdir ya-wamf && cd ya-wamf
curl -O https://raw.githubusercontent.com/Jellman86/YetAnother-WhosAtMyFeeder/main/docker-compose.yml
curl -O https://raw.githubusercontent.com/Jellman86/YetAnother-WhosAtMyFeeder/main/.env.example
cp .env.example .env

2. Configure your environment:

Edit .env with your settings:

# Docker network (check with: docker network ls)
DOCKER_NETWORK=frigate

# Frigate instance
FRIGATE_URL=http://frigate:5000

# MQTT broker (usually 'mosquitto' if running in Docker)
MQTT_SERVER=mosquitto
MQTT_PORT=1883

# MQTT authentication (if required)
MQTT_AUTH=true
MQTT_USERNAME=mqtt_user
MQTT_PASSWORD=secret_password

# Timezone
TZ=Europe/London

3. Verify Docker network:

Ensure the network specified in .env exists and matches your Frigate setup:

docker network ls

4. Create directories and start:

mkdir -p config data/models
docker compose up -d

5. Access the dashboard:

Open http://localhost:9852 (or http://YOUR_SERVER_IP:9852)

6. Download the AI model:

In the web UI, go to Settings and click the model download button. The model is saved to data/models/ and persists across updates.

Public View (Guest Mode) at a Glance

Guest mode is read-only and rate-limited. Guests can view detections and any existing AI Naturalist analysis, but cannot change settings, delete items, or run new AI analysis. You can hide camera names and limit the public history window in Settings > Public Access.

Verification

Check logs to confirm everything is working:

docker compose ps                    # Check container status
docker compose logs backend -f       # Follow backend logs

# You should see:
# MQTT config: auth=True port=1883 server=mosquitto
# Connected to MQTT topic=frigate/events

Troubleshooting

| Issue | Solution |
| --- | --- |
| MQTT connection failed | Verify `DOCKER_NETWORK` matches Frigate's network; check MQTT hostname and credentials |
| Frontend not loading | Run `docker compose ps` to check health; view logs with `docker compose logs frontend` |
| No detections | Confirm Frigate is detecting birds; check backend logs for events; verify the model was downloaded in Settings |

For detailed troubleshooting, see the Troubleshooting Guide.

Configuration

All settings are managed through the web UI under Settings. Configuration is persisted to config/config.json.

Key Settings

| Setting | Description | Default |
| --- | --- | --- |
| Frigate URL | Frigate instance for fetching media | `http://frigate:5000` |
| MQTT Server | MQTT broker hostname | `mqtt` |
| Classification Threshold | Minimum confidence for detections (0-1) | `0.7` |
| Min Confidence Floor | Reject detections below this score | `0.4` |
| Trust Frigate Sublabels | Use Frigate's labels instead of local AI | Enabled |
| Auto Video Analysis | Analyze full video clips for accuracy | Disabled |
| AI Model | MobileNet (Fast), ConvNeXt (High), EVA-02 (Elite) | MobileNet |
| BirdWeather Token | Upload detections to BirdWeather | (none) |
| BirdNET-Go Topic | MQTT topic for audio detections | `birdnet/text` |
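Because configuration persists to config/config.json, settings typically resolve as defaults overlaid with whatever the user has saved. A sketch of that merge (the key names mirror the table above but are illustrative, not the file's actual schema):

```python
import json

# Illustrative defaults mirroring the Key Settings table.
# These key names are hypothetical; YA-WAMF's real schema may differ.
DEFAULTS = {
    "frigate_url": "http://frigate:5000",
    "mqtt_server": "mqtt",
    "classification_threshold": 0.7,
    "min_confidence_floor": 0.4,
    "trust_frigate_sublabels": True,
    "auto_video_analysis": False,
}

def load_config(raw_json: str) -> dict:
    """Overlay user settings from config.json onto the defaults."""
    user = json.loads(raw_json)
    return {**DEFAULTS, **user}

cfg = load_config('{"classification_threshold": 0.85, "auto_video_analysis": true}')
print(cfg["classification_threshold"], cfg["mqtt_server"])  # 0.85 mqtt
```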

Security & Authentication

YA-WAMF v2.6.0 introduces a robust built-in authentication system.

šŸ” Built-in Authentication

  • Setup Wizard: On first run, you'll be prompted to set an admin username and password.
  • Guest Mode: Optionally enable a "Public View" to share your bird detections with friends (read-only) while keeping settings and admin tools secure.
  • Security: Includes login rate limiting, session management, and security headers.

šŸ‘‰ Read the Full Authentication & Access Control Guide

šŸ”‘ Legacy API Key (Deprecated)

If you are upgrading from an older version using YA_WAMF_API_KEY, your setup will continue to work. However, this method is deprecated and will be removed in v2.9.0. I recommend migrating to the new password-based system via Settings > Security.

For detailed upgrade instructions, see the Migration Guide.

Tech Stack

  • Backend: Python 3.12, FastAPI, SQLite
  • Frontend: Svelte 5, Tailwind CSS
  • ML Engine: ONNX Runtime & TensorFlow Lite
  • Messaging: MQTT for Frigate events, SSE for live UI updates
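The SSE channel mentioned above is just newline-delimited text on the wire: each event is one or more `data:` lines terminated by a blank line. A generic parser sketch following the SSE wire format (not the project's frontend code):

```python
def parse_sse(stream: str):
    """Yield the `data:` payloads from a Server-Sent Events stream.
    A blank line terminates each event; multi-line data is joined."""
    data_lines = []
    for line in stream.splitlines():
        if line.startswith("data:"):
            data_lines.append(line[5:].lstrip())
        elif line == "" and data_lines:
            yield "\n".join(data_lines)
            data_lines = []

sample = 'data: {"species": "Eurasian Blue Tit"}\n\ndata: heartbeat\n\n'
print(list(parse_sse(sample)))
# ['{"species": "Eurasian Blue Tit"}', 'heartbeat']
```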

Video Playback & Bandwidth

YA-WAMF includes a robust video proxy that streams clips directly from Frigate. This supports:

  • Instant Playback: Starts playing immediately without waiting for the whole file.
  • Seeking: You can jump to any part of the video (scrubbing) thanks to HTTP Range support.
  • Bandwidth Control: If you are on a metered connection or want to reduce load, you can disable "Fetch Video Clips" in the Settings. This prevents the backend from fetching heavy video files.

Home Assistant Integration

YA-WAMF includes a custom component for Home Assistant to bring your bird sightings into your smart home.

Features:

  • Last Bird Detected Sensor: Shows the name of the most recent visitor with all metadata (score, camera, weather) as attributes.
  • Daily Count Sensor: Keeps track of how many birds have visited today.
  • Camera Entity: (Optional) Proxy for the latest bird snapshot.
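Once the sensors are in, a Home Assistant automation can react to new visitors. A hedged example, assuming the integration exposes a sensor like `sensor.yawamf_last_bird` (check Developer Tools → States for the real entity id, and substitute your own notify service):

```yaml
# Illustrative automation; the entity id and notify service are assumptions.
automation:
  - alias: "Announce bird at feeder"
    trigger:
      - platform: state
        entity_id: sensor.yawamf_last_bird
    condition:
      - condition: template
        value_template: "{{ trigger.to_state.state not in ['unknown', 'unavailable'] }}"
    action:
      - service: notify.mobile_app_phone
        data:
          message: "{{ trigger.to_state.state }} spotted at the feeder!"
```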

Setup:

  1. Copy the custom_components/yawamf folder to your Home Assistant custom_components directory.
  2. Restart Home Assistant.
  3. If the integration icon doesn't appear right away, hard-refresh Home Assistant or clear the browser cache (icons are cached).
  4. Add the integration via Settings > Devices & Services > Add Integration.
  5. Enter your YA-WAMF backend URL (e.g., http://192.168.1.50:9852).

Help Improve YA-WAMF

This project is actively developed and your feedback is valuable!

How to contribute:

  • Report bugs - Open an issue for bugs or feature requests
  • Share feedback - Let me know what works and what doesn't
  • Enable telemetry - Turn on anonymous usage stats in Settings > Connections (see Telemetry Spec)
  • Test features - Try video analysis, notifications, and integrations in your environment

Contributing

Feel free to open PRs if you have improvements to share. Just keep in mind this is a hobby project maintained in spare time.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Thanks To

  • The original WhosAtMyFeeder project for the idea
  • Frigate for being such a great NVR
  • BirdNET-Go for the excellent audio classification integration
  • Ben Jordan on YouTube for his inspiring bird detection video
  • The AI assistants that helped build this thing
