Cliff Notes — TL;DR
Managing a vast array of security cameras across multiple properties can generate an overwhelming amount of footage. To build a reliable, easily browsable timeline of activity at the remote Tainé property, we implemented an automated camera chronology system within Home Assistant. This system takes pseudo-random daily snapshots from 8 different cameras, organizes them logically into area-based subdirectories, and synchronizes them directly to a secure AWS S3 bucket. The result is a rock-solid, zero-maintenance pipeline that permanently archives organized daily snapshots to the cloud — without bloating local storage or Git repositories.
Background & Business Context
As the Tainé property undergoes development and agricultural work, maintaining a visual history is critical for tracking progress. Three core challenges drove this initiative:
- Repository Bloat: Taking daily photos from 8 cameras generates roughly 875 MB of binary data a year. Pushing this to our existing AWS CodeCommit Git repository would quickly degrade repo performance and hit service limits.
- Data Organization: A flat directory of generic camera names makes it impossible to efficiently review a timeline of specific areas — like the Nursery or the Shamba.
- Container Ephemerality: Home Assistant's SSH/Terminal add-on resets its environment upon reboot, routinely wiping out manual package installations (like the AWS CLI) and temporary credential files.
The goal was to build a resilient, structured, cloud-archived visual timeline that requires zero daily management.
> "A flat directory of generic camera names makes it impossible to efficiently review a timeline of specific areas."
Architecture Overview
The solution utilizes a combination of Home Assistant automations, local directory scripting, and AWS IAM policies.
System Components
- Storage Target: A dedicated, private AWS S3 bucket, `s3://idonny-home-assistant-media`.
- Security: A highly restrictive IAM policy that grants the `home-assistant` user only the permissions needed to read/write that exact bucket.
- Local Buffer: A `.gitignore`'d local directory on the Raspberry Pi (`/config/taine_chronology/`) acting as a staging area.
- System Sync: The AWS CLI, invoked via a Home Assistant `shell_command`, pushes staged changes up to S3.
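Wired together, the local buffer and sync components reduce to a single `shell_command` entry roughly like the sketch below. The command name `taine_chronology_sync`, the key prefix, and the `--only-show-errors` flag choice are assumptions; the bucket and staging path come from the component list above.

```yaml
# configuration.yaml (sketch): push the staged chronology up to S3.
# aws s3 sync only uploads new/changed files, so repeated runs are cheap.
shell_command:
  taine_chronology_sync: >-
    aws s3 sync /config/taine_chronology/
    s3://idonny-home-assistant-media/taine_chronology/
    --only-show-errors
```

Because `s3 sync` is incremental and idempotent, the automation can call it after every snapshot without re-uploading the whole archive.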
What We Built
1. The Tainé Daily Camera Chronology
We deployed a new automation that orchestrates the entire capture-and-sync process. To prevent data gaps caused by mid-day server reboots — which interrupt standard HA delay blocks — we implemented a deterministic, date-seeded pseudo-random template trigger.
The automation guarantees:
- Two photos per day: one AM, one PM.
- A mandatory minimum of 5 hours between the two shots.
- Snapshots fire only when `input_select.konza_time_of_day_option` confirms it is daylight.
- 24-hour timestamps embedded directly in the filename (e.g., `YYYY-MM-DD_0815_tanktower.jpg`) to prevent overwriting and allow easy timeline sorting.
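The date-seeded idea can be sketched in a few lines of shell. Hashing today's date yields an offset that varies across days but is stable within a day, so an automation that restarts mid-day recomputes the exact same two capture times instead of losing its place. The window boundaries and the multiplier constant below are illustrative assumptions, not the production template.

```shell
#!/bin/sh
# Derive two reproducible capture times from today's date.
# Windows and the hash multiplier are illustrative assumptions.
seed=$(date +%Y%m%d)                       # same value all day, changes daily
offset=$(( (seed * 2654435761) % 180 ))    # 0-179 min, deterministic per date

am=$(( 7 * 60 + offset ))                  # AM shot lands in a 07:00-09:59 window
pm=$(( am + 5 * 60 + 30 ))                 # PM shot: at least 5 h after the AM shot

printf 'AM %02d:%02d  PM %02d:%02d\n' \
    $(( am / 60 )) $(( am % 60 )) $(( pm / 60 )) $(( pm % 60 ))
```

A template trigger comparing the current time against these computed values fires at most once per slot, regardless of how many reboots happen in between.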
2. Area-Aligned Data Structuring
Prior to this project, camera entity IDs were inherited generically from the Ring integration. To solve this, the cameras were strategically renamed within the HA UI to match their formal physical areas: Nursery, Shamba, Mainroad, Workers' House, and Septic Corner.
We then scanned the HA API to map these new IDs and executed a sweeping refactor across `automations.yaml` and the Lovelace dashboards to repair any broken dependencies. A pre-flight `shell_command` was designed to automatically build the matching 1:1 subdirectory framework locally before any snapshot is written.
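The pre-flight step amounts to a small idempotent script along these lines. The area slugs are derived from the names above, and the `tanktower` camera-to-area mapping is a hypothetical example; in production the base path would be `/config/taine_chronology` on the Pi.

```shell
#!/bin/sh
# Pre-flight scaffold (sketch). In production BASE is /config/taine_chronology;
# here it falls back to a temp dir so the sketch runs anywhere.
BASE="${CHRONOLOGY_BASE:-$(mktemp -d)/taine_chronology}"

for area in nursery shamba mainroad workers_house septic_corner; do
    mkdir -p "$BASE/$area"    # idempotent: safe to run before every snapshot
done

# 24-hour timestamp baked into the filename so repeat shots never overwrite.
stamp=$(date +%Y-%m-%d_%H%M)
echo "$BASE/nursery/${stamp}_tanktower.jpg"
```

Because `mkdir -p` succeeds whether or not the directories exist, the automation can run this unconditionally before every capture.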
3. Bulletproofing the AWS CLI on HAOS
During testing, a full Home Assistant restart revealed that the SSH container wiped out both the `aws` binary and the `~/.aws/` credentials folder, breaking the Git sync pipeline. We engineered a permanent fix:
- Added `aws-cli` to the persistent `packages` list in the SSH add-on configuration.
- Migrated AWS credentials into the permanent `/config/.aws/` directory.
- Set custom environment variables (`AWS_CONFIG_FILE` and `AWS_SHARED_CREDENTIALS_FILE`) to permanently route authentication away from the ephemeral root home directory.
- Patched the background `git-sync.sh` cron job with a self-healing routine that reinstalls the AWS CLI in seconds if it is ever wiped.
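The self-healing guard boils down to a few lines like the following. The function name `ensure_tool` is a hypothetical illustration, and `apk` is an assumption based on the Alpine base of HAOS add-ons; only the `/config/.aws/` paths come from the fix described above.

```shell
#!/bin/sh
# Point the AWS CLI at credentials that survive container rebuilds.
export AWS_CONFIG_FILE=/config/.aws/config
export AWS_SHARED_CREDENTIALS_FILE=/config/.aws/credentials

# Reinstall a tool if the ephemeral container has wiped it since the last run.
ensure_tool() {
    tool="$1"; shift
    command -v "$tool" >/dev/null 2>&1 && return 0
    echo "git-sync: $tool missing, reinstalling" >&2
    "$@"                       # run the supplied installer command
}

# In git-sync.sh this would be invoked as:
#   ensure_tool aws apk add --no-cache aws-cli
```

Running this at the top of every cron cycle means a reboot costs at most one reinstall, after which the sync proceeds normally.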
Benefits Realized
The new system improved on the old across every key dimension:
| Area | Before | After |
|---|---|---|
| Image Storage | Local only / Bloated Git | Scalable, cost-effective AWS S3 |
| Snapshot Timing | Rigid or reboot-vulnerable | Date-seeded pseudorandom; reboot-resilient |
| Image Organization | Flat folder | Grouped by 5 distinct physical areas |
| Dashboard Integrity | Broken after renaming | Repaired via Raw Configuration YAML edits |
| AWS Auth Resilience | Wiped upon HA restart | Permanently anchored in /config volume |
| User Visibility | Silent operation | Push notifications delivered to iOS upon sync |
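The sync notification in the last row can be sketched as a Home Assistant service call along these lines; the `notify.mobile_app_iphone` target name is an assumption, as is the exact message text.

```yaml
# Sketch: final action of the chronology automation, after the S3 sync runs.
- service: notify.mobile_app_iphone
  data:
    title: "Tainé chronology synced"
    message: "Daily snapshots pushed to s3://idonny-home-assistant-media"
```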
Conclusion
By leveraging AWS S3 for binary storage and applying strict area-based taxonomy to our naming conventions, we transformed a noisy fleet of security cameras into an organized, cloud-synced visual diary. The system perfectly handles the ephemeral nature of the Home Assistant container environment, ensuring that the chronology automation — and the underlying Git sync pipeline — can confidently survive endless system reboots.
The groundwork laid today ensures the Tainé timeline will quietly document multi-year progress without human intervention.