| Monitoring Station | ⇨ | Message Queue | ⇨ | Data Collector | ⇨ | Data Publisher | ⇨ | Static Website |
|---|---|---|---|---|---|---|---|---|
| Raspberry Pi Zero + SDS011 | | AWS SQS | | GitHub Actions | | GitHub Actions | | GitHub Pages |
- The Monitoring Station reads PM2.5 and PM10 from the sensor and sends the metrics to the Message Queue.
- These messages are collected by the Data Collector.
- The Data Publisher computes the hourly averages and publishes the data as a static website on GitHub Pages.
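The exact message format is not specified here; as an illustration only, each queue message could carry a JSON body like the following (the field names are assumptions, not the project's actual schema):

```python
import json
from datetime import datetime, timezone

# Hypothetical message body sent by the station for one reading.
# Field names are illustrative; the project's real schema may differ.
reading = {
    "timestamp": datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc).isoformat(),
    "pm25": 7.4,   # PM2.5 in µg/m³
    "pm10": 12.1,  # PM10 in µg/m³
}
body = json.dumps(reading)
```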
Create an account in AWS and:

- Create an SQS FIFO queue named `airquality.fifo`.
- Create a user for the monitor and grant it the following permissions:
  - `SQS:SendMessage`
  - `SQS:GetQueueUrl`
- Create a user for the collector and grant it the following permissions:
  - `SQS:DeleteMessage`
  - `SQS:GetQueueUrl`
  - `SQS:ReceiveMessage`
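The two IAM users can be given minimal inline policies scoped to the queue. A sketch of the policy documents (the queue ARN below is a placeholder — substitute your own account ID and region):

```python
import json

# Placeholder ARN; replace the account ID and region with your own.
QUEUE_ARN = "arn:aws:sqs:eu-west-1:123456789012:airquality.fifo"

def make_policy(actions):
    """Build a minimal IAM policy document allowing `actions` on the queue."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": actions,
            "Resource": QUEUE_ARN,
        }],
    }

monitor_policy = make_policy(["sqs:SendMessage", "sqs:GetQueueUrl"])
collector_policy = make_policy(
    ["sqs:DeleteMessage", "sqs:GetQueueUrl", "sqs:ReceiveMessage"]
)

print(json.dumps(monitor_policy, indent=2))
```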
The station is composed of an SDS011 sensor and a Raspberry Pi Zero running a Debian-based distribution (I use DietPi).
1. **Setup AWS credentials**
   - Set up the AWS credentials for the monitor IAM user.
2. **Install dependencies**
   - Run `pip3 install boto3 pyserial`.
3. **Install the service**
   - Copy `monitor/airquality.py` to `/root/airquality.py`.
   - Copy `monitor/airquality.service` to `/etc/systemd/system/airquality.service`.
   - Run `systemctl daemon-reload && systemctl enable airquality.service`.
The monitor sends a message every 10 seconds, which keeps it within the AWS SQS free tier.
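The monitor's core loop can be sketched as follows, assuming the standard SDS011 serial protocol (10-byte data frames) on `/dev/ttyUSB0`; the frame parser follows the sensor's published frame layout, while the queue name matches the setup above:

```python
import struct
import time

def parse_sds011_frame(frame: bytes):
    """Decode a 10-byte SDS011 data frame into (pm25, pm10) in µg/m³.

    Frame layout: 0xAA 0xC0, PM2.5 low/high, PM10 low/high,
    two device-ID bytes, checksum (sum of bytes 2..7 mod 256), 0xAB.
    """
    if len(frame) != 10 or frame[0] != 0xAA or frame[1] != 0xC0 or frame[9] != 0xAB:
        raise ValueError("not an SDS011 data frame")
    if sum(frame[2:8]) % 256 != frame[8]:
        raise ValueError("bad checksum")
    pm25, pm10 = struct.unpack("<HH", frame[2:6])  # little-endian raw values
    return pm25 / 10.0, pm10 / 10.0

def main():
    # Hardware/AWS side of the sketch; port and payload fields are assumptions.
    import json, serial, boto3  # pyserial and boto3 from the install step
    sqs = boto3.client("sqs")
    queue_url = sqs.get_queue_url(QueueName="airquality.fifo")["QueueUrl"]
    with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=2) as port:
        while True:
            pm25, pm10 = parse_sds011_frame(port.read(10))
            sqs.send_message(
                QueueUrl=queue_url,
                MessageBody=json.dumps({"ts": time.time(), "pm25": pm25, "pm10": pm10}),
                MessageGroupId="airquality",           # FIFO queues require a group ID
                MessageDeduplicationId=str(time.time()),
            )
            time.sleep(10)  # one reading every 10 s stays within the free tier

if __name__ == "__main__":
    main()
```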
The collector gathers the data sent to the queue every 15 minutes and stores it in a git branch named `state`. The branch must be checked out as a git worktree under the `.state` folder.
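A sketch of the collection step, assuming JSON message bodies and a CSV file under `.state/` (the file name and field names are assumptions); the pure conversion is kept separate from the AWS calls:

```python
import json

def to_csv_lines(bodies):
    """Convert JSON message bodies into `timestamp,pm25,pm10` CSV lines."""
    lines = []
    for body in bodies:
        msg = json.loads(body)
        lines.append(f'{msg["ts"]},{msg["pm25"]},{msg["pm10"]}')
    return lines

def collect():
    # AWS side; the queue name and file layout are assumptions.
    import boto3
    sqs = boto3.client("sqs")
    queue_url = sqs.get_queue_url(QueueName="airquality.fifo")["QueueUrl"]
    while True:
        resp = sqs.receive_message(
            QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20
        )
        messages = resp.get("Messages", [])
        if not messages:
            break  # queue drained
        with open(".state/data.csv", "a") as f:
            for line in to_csv_lines(m["Body"] for m in messages):
                f.write(line + "\n")
        # Delete only after the data has been written to the worktree.
        for m in messages:
            sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=m["ReceiptHandle"])

if __name__ == "__main__":
    collect()
```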
It is implemented with GitHub Actions. Set up the following secrets:

- `AWS_ACCESS_KEY_ID`
- `AWS_SECRET_ACCESS_KEY`
The publisher computes the hourly averages and publishes the results as a static website on GitHub Pages. The data and the website are stored in a branch named `gh-pages`. The branch must be checked out as a git worktree under the `.gh-tree` folder.
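The averaging step can be sketched as follows (the record format is an assumption: plain tuples of unix timestamp and the two PM values):

```python
from collections import defaultdict
from datetime import datetime, timezone

def hourly_averages(records):
    """Average (unix_timestamp, pm25, pm10) readings per UTC hour.

    Returns {hour_iso: (avg_pm25, avg_pm10)}, ordered by hour.
    """
    buckets = defaultdict(list)
    for ts, pm25, pm10 in records:
        # Truncate the timestamp to the start of its UTC hour.
        hour = datetime.fromtimestamp(ts, tz=timezone.utc).replace(
            minute=0, second=0, microsecond=0
        )
        buckets[hour.isoformat()].append((pm25, pm10))
    return {
        hour: (
            sum(p25 for p25, _ in vals) / len(vals),
            sum(p10 for _, p10 in vals) / len(vals),
        )
        for hour, vals in sorted(buckets.items())
    }
```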
The chart is built with Plotly.
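Plotly (and Plotly.js on the static site) accepts a plain data/layout structure, so the figure can be emitted as JSON without any plotting dependency; a minimal sketch for the hourly series (trace names and titles are illustrative):

```python
import json

def build_figure(hours, pm25, pm10):
    """Build a Plotly-compatible figure dict with one line per pollutant."""
    return {
        "data": [
            {"type": "scatter", "mode": "lines", "name": "PM2.5",
             "x": hours, "y": pm25},
            {"type": "scatter", "mode": "lines", "name": "PM10",
             "x": hours, "y": pm10},
        ],
        "layout": {
            "title": {"text": "Hourly air quality"},
            "yaxis": {"title": {"text": "µg/m³"}},
        },
    }

# Serialized figure, ready to be embedded in the published page.
figure_json = json.dumps(build_figure(["2024-01-01T00:00"], [7.4], [12.1]))
```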
Collected and published data are stored on separate branches that are managed as git worktrees. They are handled by stateful-action in the GitHub workflow.