Commit 02cf5c5: Refactor for stateful-action

jorgebg committed May 1, 2020
1 parent c307d2d
Showing 16 changed files with 94 additions and 113,591 deletions.
66 changes: 30 additions & 36 deletions .github/workflows/main.yml
```diff
@@ -1,55 +1,49 @@
 name: Build
 
-#on:
-#  schedule:
-#    - cron: '0/15 * * * *'
-#  push:
-#    branches:
-#      - master
+on:
+  schedule:
+    - cron: '0/15 * * * *'
+#  push:
+#    branches:
+#      - master
 
 jobs:
   build:
     runs-on: ubuntu-latest
 
     steps:
    # Checks-out the repository under $GITHUB_WORKSPACE
    - uses: actions/checkout@v2
 
+    - uses: jorgebg/stateful-action@v0.1
+      with:
+        branch: state
+
+    - uses: jorgebg/stateful-action@v0.1
+      with:
+        branch: gh-pages
+
     - name: Set up Python
       uses: actions/setup-python@v1
       with:
-        # 100 last commits
-        fetch-depth: 100
         python-version: 3.8
 
-    - name: Run collector
+    - name: Set up Git
+      run: |
+        git config user.email "airquality@jorgebg.com"
+        git config user.name "Air Quality Bot"
+    - name: Install dependencies
+      run: |
+        pip3 install -r requirements.txt
+    - name: Collect the data
       env:
         AWS_DEFAULT_REGION: 'eu-west-1'
         AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
         AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
-      working-directory: ${{ github.workspace }}
       run: |
-        git config --global user.email "airquality@jorgebg.com"
-        git config --global user.name "Air Quality Bot"
-        pip3 install setuptools
-        pip3 install boto3
         python3 collector.py
     - name: Process the data
-      working-directory: ${{ github.workspace }}
       run: |
         python3 processor.py
-        if git diff-files --quiet
-        then
-          echo "Data didn't change"
-        else
-          git add .
-          git commit -m"Publish metrics"
-          git push
-        fi
-    - name: Squash the commits of the day
-      working-directory: ${{ github.workspace }}
+    - name: Publish the data
       run: |
-        DATE=$(date +%F)
-        git rev-list -1 --before="${DATE}T00:00:00Z" master
-        git reset $(git rev-list -1 --before="${DATE}T00:00:00Z" master)
-        git add .
-        git commit -m"Metrics of $DATE"
-        git push -f
+        python3 publisher.py
```
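The "Collect the data" step runs `collector.py`, whose contents are not part of this diff. As a hedged sketch only: a collector for this setup would receive and delete messages from the `airquality.fifo` queue with boto3. The JSON field names (`timestamp`, `pm25`, `pm10`) are assumptions for illustration, not taken from this commit:

```python
import json


def parse_body(body):
    """Decode one SQS message body into a (timestamp, pm25, pm10) tuple.

    The JSON field names are assumptions, not taken from this commit.
    """
    record = json.loads(body)
    return (record["timestamp"], float(record["pm25"]), float(record["pm10"]))


def drain_queue(queue_name="airquality.fifo", batch_size=10):
    """Receive and delete all pending messages, returning parsed rows.

    boto3 is imported lazily so the pure helper above also works without it.
    """
    import boto3

    sqs = boto3.client("sqs")
    queue_url = sqs.get_queue_url(QueueName=queue_name)["QueueUrl"]
    rows = []
    while True:
        resp = sqs.receive_message(
            QueueUrl=queue_url,
            MaxNumberOfMessages=batch_size,  # SQS allows at most 10 per call
            WaitTimeSeconds=1,
        )
        messages = resp.get("Messages", [])
        if not messages:
            return rows
        for msg in messages:
            rows.append(parse_body(msg["Body"]))
            sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```

The workflow would then write `rows` into the `.state` worktree and commit it, which is what the subsequent `Process the data` step builds on.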
6 changes: 5 additions & 1 deletion .gitignore
```diff
@@ -1,4 +1,8 @@
 .vscode
 __pycache__
 .env
-.aws.config
\ No newline at end of file
+.aws.config
+
+# Stateful Action
+.state/
+.gh-pages/
```
14 changes: 0 additions & 14 deletions Pipfile

This file was deleted.

195 changes: 0 additions & 195 deletions Pipfile.lock

This file was deleted.

68 changes: 49 additions & 19 deletions README.md
````diff
@@ -1,32 +1,62 @@
-# Air quality
+# Air quality monitoring station
 
+## Architecture
+
-SDS011 → Raspberry Pi Zero [`airquality.py`] → AWS SQS → GitHub Actions [`collector.py` `processor.py` `plot.py`] → GitHub Pages
-## Installation
+| Monitoring Station || Message Queue || Data Collector || Data Publisher || Static Website |
+| - | - | - | - | - | - | - | - | - |
+| Raspberry Pi Zero + SDS011 | | AWS SQS | | GitHub Actions | | GitHub Actions | | GitHub Pages |
 
-### Station
+1. The Monitoring Station reads PM2.5 and PM10 from the sensor and sends the metrics to the Message Queue.
+2. These messages are collected by the Data Collector.
+3. The Data Publisher computes the hourly averages and publishes the data as a static website on GitHub Pages.
 
-#### Install dependencies
+## Components setup
 
-```sh
-pip3 install boto3 pyserial
-```
+### Message Queue
 
-Setup [AWS credentials](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html).
+Create an account in AWS and:
 
-#### Install airquality.service
+1. Create an **[SQS FIFO queue](https://console.aws.amazon.com/sqs)** named `airquality.fifo`.
+2. Create a [user](https://console.aws.amazon.com/iam) for the **monitor** and give it the following permissions:
+   * `SQS:SendMessage`
+   * `SQS:GetQueueUrl`
+3. Create a [user](https://console.aws.amazon.com/iam) for the **collector** and give it the following permissions:
+   * `SQS:DeleteMessage`
+   * `SQS:GetQueueUrl`
+   * `SQS:ReceiveMessage`
 
-Copy `airquality.py` to `/root/airquality.py`.
 
-Copy `airquality.service` to `/etc/systemd/system/airquality.service` and then run:
+### Monitoring Station
 
-```sh
-systemd install airquality.service
-systemctl daemon-reload
-```
+Composed of an [SDS011](http://inovafitness.com/en/a/chanpinzhongxin/95.html) sensor and a [Raspberry Pi Zero](https://www.raspberrypi.org/products/raspberry-pi-zero-w/) running a Debian-based distribution (I use [DietPi](https://dietpi.com/)).
 
-### Collector
+1. Set up AWS credentials
+   * Set up [AWS credentials](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html) for the monitor IAM user.
 
-Setup GitHub Actions.
+2. Install dependencies
+   * Run `pip3 install boto3 pyserial`
 
+3. Install the service
+   * Copy `monitor/airquality.py` to `/root/airquality.py`.
+   * Copy `monitor/airquality.service` to `/etc/systemd/system/airquality.service`.
+   * Run `systemctl enable airquality.service && systemctl daemon-reload`
+
+It sends a message every 10 seconds, so it fits the AWS SQS free tier.
+
+### Data Collector
+
+It collects the data sent to the queue every 15 minutes and stores it in a git branch named `state`. The branch must be checked out as a git worktree under the `.state` folder.
+
+It's implemented on GitHub Actions. Set up the following [secrets](https://help.github.com/en/actions/configuring-and-managing-workflows/creating-and-storing-encrypted-secrets):
+- `AWS_ACCESS_KEY_ID`
+- `AWS_SECRET_ACCESS_KEY`
+
+### Data Publisher
+
+It computes the hourly averages and publishes the results as a static website on GitHub Pages. The data and the website are stored in a branch named `gh-pages`. The branch must be checked out as a git worktree under the `.gh-pages` folder.
+
+The chart is built with [Plotly](https://plotly.com/javascript/).
+
+## Git worktrees
+
+Collected and published data are stored on different branches that are managed as [git worktrees](https://git-scm.com/docs/git-worktree). They are handled by [stateful-action](https://github.com/jorgebg/stateful-action) on the GitHub workflow.
````
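The hourly averaging described for the Data Publisher can be sketched with the standard library alone. This is an illustration, not the repository's `publisher.py`; the record layout (ISO 8601 timestamp plus two PM readings) is an assumption:

```python
from collections import defaultdict


def hourly_averages(records):
    """Group (iso_timestamp, pm25, pm10) records by hour and average each hour.

    Timestamps are assumed ISO 8601, e.g. "2020-05-01T12:34:56Z"; the hour
    key is the first 13 characters ("2020-05-01T12").
    """
    buckets = defaultdict(list)
    for ts, pm25, pm10 in records:
        buckets[ts[:13]].append((pm25, pm10))
    return {
        hour: (
            sum(v[0] for v in vals) / len(vals),  # mean PM2.5
            sum(v[1] for v in vals) / len(vals),  # mean PM10
        )
        for hour, vals in sorted(buckets.items())
    }
```

The resulting per-hour series is what a Plotly chart on the `gh-pages` site would consume.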