
Hosting multiple subsites under a serverless website instance

· 7 min read
Chiwai Chan
Tinkerer

Introduction

Recently, I was tasked with coming up with a solution for a single website instance to host various pockets of documentation scattered across a growing number of Git repositories; each repository hosts documentation for a specific subject domain written in Markdown format - you may have come across README.md files all over the internet, which are a classic example of Markdown.

Here is a list of requirements based on what the solution has to solve:

  • Website Hosting: the documentation website must be accessible from anywhere over the public internet. Optionally, we could limit access to a list of whitelisted IPs.
  • Authentication: access is only granted to those who should have it. Federating an IdP is ideal, e.g. Azure AD.
  • Serverless.
  • Host multiple sets of documentation scattered across multiple Azure DevOps Git Repositories.
  • Versioning: store each set of documentation in source control for all its goodness.
  • Format: create the documentation in plain text without having to worry much about styling and formatting. This is where the Markdown file format comes in.
  • Pipelines: detect changes to documentation that in turn trigger builds and deployments.
  • Azure AD Federation for SSO: this is especially useful for organisations with many applications and users, so existing credentials can be re-used and managed in the same way.

Solution

Serverless Website Hosting Infrastructure

[Image: website infrastructure diagram]

The Serverless Website Hosting Infrastructure I am about to describe is built on top of an AWS sample solution found here. I added resources on top of the example to suit our needs.

  • The user visits https://docs.example.co.nz from a browser on any device.
  • CloudFront: We are leveraging this component as the Content Delivery Network for the website, using the standard pattern of serving the CDN from an S3 Bucket.
    • Successful Lambda@Edge Check Auth: Static website content stored in S3 is only served if the user is authenticated - a valid JWT (JSON Web Token) is found in the request.
    • Unsuccessful Lambda@Edge Check Auth: Return an HTTP 302 in the response to the user's browser to redirect the user to Cognito so they can sign in.
    • This CloudFront instance is configured with the following settings:
      • Website content is cached for 30 minutes; each expired content file is retrieved from S3 individually.
      • Configured with the Alternative Domain Name: docs.example.co.nz
      • Configured with an SSL certificate for the sub-domain docs.example.co.nz using ACM (AWS Certificate Manager); the certificate is free and is automatically renewed and managed by AWS.
  • Lambda@Edge: Validates every incoming request to CloudFront for the existence of a cookie containing the user's authentication information/JWT (see the sketch after this list).
    • No authentication information: Respond to CloudFront that the user needs to log in.
    • Contains authentication cookie: Exchange the authentication information for a JWT and store the JWT in the cookies of the HTTP response.
  • S3: This bucket is used as a CloudFront Origin and contains the static content files for the Documentation Website, e.g. HTML/CSS/JS/images.
  • Amazon Cognito: This is the component used as the entry point for authentication into the website; we federate Azure AD as an IdP using SAML integration - the user is redirected to Azure AD for authentication.
    • Post back: When Cognito receives a SAML Assertion/Token from Azure AD after a successful login, a profile of that user is saved into Cognito's User Pool by collecting the user attributes (claims) from the SAML Assertion.
  • Azure Active Directory: This solution federates Azure AD into Cognito using SAML; I suggest following this walkthrough if you have a requirement for Azure AD Federation: https://aws.amazon.com/blogs/security/how-to-set-up-amazon-cognito-for-federated-authentication-using-azure-ad/
    • Successful authentication: the IdP posts a SAML Assertion/Token back to Cognito.
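
To make the Check Auth step concrete, below is a minimal sketch of what a Lambda@Edge viewer-request handler can look like, written in Python. It is an illustration only: the AWS sample this solution builds on ships its own implementation, the cookie name and Cognito hosted UI domain here are assumptions, and a real handler must verify the JWT's signature and expiry rather than just its presence.

from urllib.parse import quote

COGNITO_DOMAIN = "https://example.auth.us-east-1.amazoncognito.com"  # assumed hosted UI domain
CLIENT_ID = "REPLACE_WITH_APP_CLIENT_ID"  # assumed Cognito app client id

def handler(event, context):
    request = event["Records"][0]["cf"]["request"]
    headers = request.get("headers", {})

    # Collect cookies from the incoming viewer request.
    cookies = {}
    for header in headers.get("cookie", []):
        for pair in header["value"].split(";"):
            if "=" in pair:
                name, value = pair.strip().split("=", 1)
                cookies[name] = value

    # If a JWT cookie is present, pass the request through so CloudFront
    # serves the content from S3. A real handler must also verify the
    # token's signature and expiry instead of trusting its presence.
    if "id_token" in cookies:
        return request

    # Otherwise, answer with an HTTP 302 that sends the browser to the
    # Cognito hosted sign-in page.
    redirect_uri = quote("https://docs.example.co.nz" + request["uri"], safe="")
    return {
        "status": "302",
        "statusDescription": "Found",
        "headers": {
            "location": [{
                "key": "Location",
                "value": COGNITO_DOMAIN + "/login?client_id=" + CLIENT_ID
                         + "&response_type=code&redirect_uri=" + redirect_uri,
            }]
        },
    }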

Set up instructions - Website Infrastructure

  • Create an AWS CloudFormation stack for the Website Hosting Infrastructure from the existing YML file "templates/aws-website-infrastructure.yml" found in this repository. We'll need the Stack's Outputs later on when we create the AWS Pipeline.

[Images: CloudFormation create stack and stack Outputs]

Azure DevOps and AWS CodePipeline

[Image: deployment pipeline diagram]

There are 2 types of pipelines that make up the end-to-end pipeline for this solution: the first is on the Azure side to push Markdown files into AWS; the other is on the AWS side to compile the Markdown files and deploy them into S3, where the website content is hosted.

In the Azure pipeline we take the raw documentation (Markdown) from a Git repository hosted in Azure DevOps Git Repositories. Each time a set of code changes is pushed into any one of the Git repositories, it triggers an Azure Pipeline "Run"; the Azure Pipeline uploads the Markdown and asset files to a centralised S3 bucket repository (created by the Website Infrastructure CloudFormation Stack earlier).

Each Azure DevOps repository hosts documentation for a specific domain topic; this pipeline pattern is designed to cater for a growing number of repositories whose documentation must all be hosted within a single website instance, so the Azure Pipeline needs to be configured for each Azure DevOps Git Repository. Once the Markdown files are converted to HTML during the CodeBuild stage of the CodePipeline execution, the output files are uploaded to the S3 bucket that is served behind the CloudFront/Website stack. A sketch of the upload step follows below.
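
To illustrate what the Azure Pipeline's upload stage boils down to, here is a rough Python equivalent. The real pipeline is defined in "templates/azure-pipeline.yml" and uses pipeline tasks rather than a script, and the zip layout and object key shown here are assumptions for illustration.

import shutil
import boto3

SOURCE_ZIP_BUCKET = "REPLACE-WITH-SOURCE-ZIP-BUCKET-NAME"  # from the stack Outputs
SUB_SITE_NAME = "my-docs-repo"  # hypothetical; one value per repository

# Zip the repository's Markdown and asset files, then upload the archive to
# the central bucket; the object arriving in SourceZipBucket is what kicks
# off the AWS CodePipeline.
archive = shutil.make_archive(SUB_SITE_NAME, "zip", root_dir="docs")
boto3.client("s3").upload_file(archive, SOURCE_ZIP_BUCKET, SUB_SITE_NAME + ".zip")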

Set up instructions - Azure & AWS Pipelines

1 This step can be skipped if the website infrastructure was previously set up for another (the first) set of documentation; in that case, re-use the Access Keys created at that time in subsequent steps. Create a set of Access Keys for an AWS IAM User with a policy that permits the following actions on the "SourceZipBucket" bucket created in the Website Infrastructure CloudFormation stack earlier:

  • s3:PutObject
  • s3:GetObject
  • s3:DeleteObject
  • s3:ListBucket
Example policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::${REPLACE-WITH-SOURCE-ZIP-BUCKET-NAME}/*",
        "arn:aws:s3:::${REPLACE-WITH-SOURCE-ZIP-BUCKET-NAME}"
      ]
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    }
  ]
}

2 Create a new ADO pipeline from the existing YML file "templates/azure-pipeline.yml" in this repository.

[Images: Azure DevOps pipeline setup, steps 1-4]

Use these as the variables for the Pipeline, keeping the same casing:

  • S3-documentation-bucket-name: use the Outputs value of "SourceZipBucket" from the AWS CloudFormation Website Infrastructure Stack created earlier - this is the same S3 bucket name used in the IAM User policy.
  • AWS_ACCESS_KEY_ID: The value of the Access Key ID created earlier.
  • AWS_SECRET_ACCESS_KEY: The value of the Secret Access Key created earlier.
  • AWS_REGION: The region where the SourceZipBucket was created in.
  • sub-site-name: This is the name of the URL path for this set of documentation; it could be the name of the Azure DevOps Repository for easy reference. E.g. https://docs.example.co.nz/${sub-site-name}

3 Hit Run to start a pipeline execution.

4 Skip this step if you skipped Step 1. Create a CloudFormation stack for the Pipeline that deploys new documentation, using the CloudFormation YML file "templates/aws-pipeline.yml" in this repository. Use the following as the Parameter values for the Pipeline:

  • SourceBucket: This is the Outputs value of "SourceZipBucket" from the AWS CloudFormation Website Infrastructure Stack created earlier.
  • StaticFilesBucket: This is the Outputs value of "DocumentationS3Bucket" from the AWS CloudFormation Website Infrastructure Stack created earlier.

[Image: AWS CloudFormation pipeline stack creation]

Populate the website skeleton for Docusaurus

The CodeBuild instance in the pipeline runs a set of commands that takes the Markdown and asset files and produces as output the HTML equivalents of the entire website, for all sub-sites. For the CodeBuild instance to run successfully, it expects the skeleton files in the root of the "DocumentationS3Bucket" S3 Bucket found in the Outputs of the Website Infrastructure CloudFormation Stack; this is how Docusaurus knows how to render the Markdown files into HTML.

To generate the skeleton files and upload them to the S3 bucket, run the following commands on a local machine:

npx create-docusaurus@latest website classic
aws s3 cp website/ s3://${DocumentationS3Bucket}/ --recursive

Smart Cat Feeder – Part 2

· 8 min read
Chiwai Chan
Tinkerer

[Image: Seeed Studio XIAO ESP32C3]

The source code for this blog can be found in my GitHub repository: https://github.com/chiwaichan/aws-iot-cat-feeder. This repository only includes the source code for the solution implemented up to this stage of the project.

In the end I decided to go with the Seeed Studio XIAO ESP32C3 implementation of the ESP32 micro-controller for $4.99 (USD). I also ordered some other bits and pieces from AliExpress that are going to take some time to arrive.

In this Part 2 of the blog series I will demonstrate the exchange of messages (JSON payloads) using the MQTT protocol between the ESP32 and the AWS IoT Core Service, as well as the exchange of messages between a Lambda Function and the ESP32 - this Lambda is written in Python and is intended to replace the Lambda triggered by the IoT button event found in Part 1.

Prerequisites if you'd like to try out the solution using the source code

  • An AWS account.
  • An IoT button. Follow Part 1 of this blog series to onboard your IoT button into the AWS IoT 1-Click Service.
  • Create 2 Certificates in the AWS IoT Core Service. One certificate is for the ESP32 to publish and subscribe to Topics in IoT Core, and the other is used by the IoT button's Lambda to publish a message to a Topic subscribed to by the ESP32.

[Image: AWS IoT certificate list]

Create a Certificate using the recommended One-Click option.

[Image: AWS IoT create certificate]

Download the following files and take note of which device (the ESP32 or the IoT Lambda) you'd like to use this certificate for:

[Image: AWS IoT certificate created - download files]

Activate the Certificate.

[Image: AWS IoT certificate activated]

Click on Done, then repeat the steps to create the second Certificate. If you prefer scripting to console clicks, the sketch below achieves the same result.
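
Alternatively, a boto3 sketch like the one below creates and activates a certificate and saves the generated files; run it once per certificate. This is a convenience I am adding here, not a step from the original walkthrough, and the Amazon Root CA file is still downloaded separately from Amazon Trust Services.

import boto3

iot = boto3.client("iot")

# One-call equivalent of the console's one-click certificate creation:
# generates a key pair, issues the certificate and activates it.
response = iot.create_keys_and_certificate(setAsActive=True)

for filename, content in [
    ("certificate.pem.crt", response["certificatePem"]),
    ("private.pem.key", response["keyPair"]["PrivateKey"]),
    ("public.pem.key", response["keyPair"]["PublicKey"]),
]:
    with open(filename, "w") as f:
        f.write(content)

# Keep the ARN handy for attaching policies and Things later.
print(response["certificateArn"])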

Publish ESP32 States to AWS IoT Core

[Image: Seeed Studio XIAO ESP32C3 with AWS IoT architecture]

The diagram above depicts the components required for the ESP32 to send the States of the Cat Feeder. I've yet to decide what to send, but examples could be: 1) battery level; 2) cat weight (based on a cat's RFID chip and somehow weighing them while they eat); 3) how much food is remaining in the feeder. So many options.

  1. ESP32: This is the micro-controller that will eventually have a bunch of hardware components whose States we will take, then publish to a Topic.
  2. MQTT: This is the lightweight pub/sub protocol used to send IoT messages over TCP/IP to AWS IoT Core.
  3. AWS IoT Core: This is the service that receives the published State messages and routes them to subscribers.
  4. IoT Topic: The ESP32 publishes its State messages to a Topic, making them available to any downstream subscribers.
  5. Do something later on: I'll decide later what to do downstream with the State values. This could be anything really, e.g. saving a time series of the data into a database or a bunch of DynamoDB tables, or alerting me to charge the Cat Feeder's battery based on a customisable threshold - see the sketch after this list for one possibility.
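
To make the downstream idea concrete, here is a minimal sketch of one possible consumer, assuming an AWS IoT Rule forwards State messages from the Topic to a Python Lambda that writes each reading to DynamoDB. The table name and payload fields (battery_level, food_remaining) are hypothetical illustrations, not a schema from this project.

import time
from decimal import Decimal

import boto3

table = boto3.resource("dynamodb").Table("cat-feeder-states")  # hypothetical table

def _num(value):
    # DynamoDB rejects Python floats; numbers must be sent as Decimal.
    return Decimal(str(value)) if isinstance(value, (int, float)) else value

def handler(event, context):
    # With an IoT Rule, the MQTT payload arrives as the event itself.
    table.put_item(Item={
        "thing_name": event.get("thing_name", "cat-feeder"),
        "timestamp": int(time.time()),
        "battery_level": _num(event.get("battery_level")),
        "food_remaining": _num(event.get("food_remaining")),
    })
    return {"status": "stored"}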

Instructions to try out the Arduino/ESP32 part of the solution for yourself

  1. Install the Arduino IDE.
  2. Follow this AWS blog on setting up an IoT device, starting from "Installing and configuring the Arduino IDE" through to "Configuring and flashing an ESP32 IoT device". The blog walks through preparing the Arduino IDE and flashing the ESP32 with a Sketch.
  3. Clone the Arduino source code from my GitHub repository: https://github.com/chiwaichan/aws-iot-cat-feeder
  4. Go to the "secrets.h" tab and replace the following variables:

[Image: Arduino IDE secrets.h tab]

  • WIFI_SSID: This is the name of your WiFi Access Point.
  • WIFI_PASSWORD: The password for your WiFi.
  • AWS_IOT_ENDPOINT: This is the regional endpoint of your AWS IoT Core Service.

[Image: AWS IoT endpoint]

  • AWS_CERT_CA: The content of the Amazon Root CA 1 file created in the prerequisites for the first certificate.
  • AWS_CERT_CRT: The content of the xxxxx.cert.pem file created in the prerequisites for the first certificate.
  • AWS_CERT_PRIVATE: The content of the xxxxx.private.key file created in the prerequisites for the first certificate.
  5. Flash the code onto the ESP32.

[Image: Arduino IDE flashing the code]

You might need to push a button on the micro-controller during the flashing process, depending on your ESP32 micro-controller.

  6. Check the Arduino console to ensure the ESP32 can connect to AWS IoT and publish messages.

[Image: Arduino console output]

  7. Verify the MQTT messages are received by AWS IoT Core.

[Image: AWS IoT MQTT test client]

Sending a message to the ESP32 when the IoT button is pressed

[Image: architecture diagram - Seeed IoT button to ESP32]

The diagram above depicts the components used to send a message to the ESP32 each time the Seeed AWS IoT button is pressed.

  1. AWS IoT button: this is the IoT button I detailed in Part 1; it's a physical button that can be anywhere in the world, which I can press to feed the fur babies once the final solution is built.
  2. AWS Lambda: This replaces the Lambda from the previous blog with the one shown in the diagram - a sketch of it follows this list.
  3. IoT Topic: The Lambda publishes a message along with the type of button event (single click, long click or double click) to the Topic "cat-feeder/action"; the value of the event is subject to what is supported by the IoT button you use.
  4. AWS IoT Core: This is the service that forwards messages to the ESP32 micro-controllers subscribed to Topics.
  5. ESP32: We will see details of the button event from each click in the Arduino console once this part is set up.
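
The heart of that Lambda looks something like the following sketch. Note this is a simplification: the repository's actual Lambda authenticates to AWS IoT Core over MQTT using the Certificate stored in Secrets Manager, whereas this version uses the boto3 iot-data API purely to show the message flow; the event shape is the standard AWS IoT 1-Click payload.

import json

import boto3

iot_data = boto3.client("iot-data")  # resolves the account's IoT endpoint

def handler(event, context):
    # IoT 1-Click reports SINGLE, DOUBLE or LONG in the event payload.
    click_type = event["deviceEvent"]["buttonClicked"]["clickType"]

    # Forward the click type to the Topic the ESP32 subscribes to.
    iot_data.publish(
        topic="cat-feeder/action",
        qos=1,
        payload=json.dumps({"action": click_type}),
    )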

Instructions to set up the AWS IoT button part of the solution

  1. Take the 3 files created for the second Certificate in the AWS IoT Core Service in the prerequisites, then create 3 AWS Secrets Manager "Other type of secret: Plaintext" values, one Secret value per file. This provides the Lambda Function with the Certificate it needs to call AWS IoT Core; the sketch below shows how the Lambda can read these back.

[Image: AWS Secrets Manager secrets]
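
For reference, a Lambda can read each stored file back at runtime with a call like the one below; the secret name shown is a hypothetical example, yours will match the names you chose above.

import boto3

secrets = boto3.client("secretsmanager")

def get_cert_file(secret_name: str) -> str:
    # Plaintext secrets are returned verbatim in SecretString.
    return secrets.get_secret_value(SecretId=secret_name)["SecretString"]

cert_pem = get_cert_file("cat-feeder-lambda-cert-crt")  # hypothetical name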

  2. Get a copy of the AWS code from my GitHub repository: https://github.com/chiwaichan/aws-iot-cat-feeder

  3. In a terminal, go into the aws folder and run the commands found in the "sam-commands.text" file, being sure to replace the following values in the commands to reflect the values for your AWS account. This will create a CloudFormation Stack of the AWS IoT Services used by this entire solution.

  • YOUR_BUCKET_NAME
  • Value for IoTEndpoint
  • Value for CatFeederThingLambdaCertName: this is the name of the long certificate value found in IoT Core, created in the prerequisites as the second certificate.
  • Value for CatFeederThingLambdaSecretNameCertCA, e.g. "cat-feeder-lambda-cert-ca-aaVaa2" - check the name in Secrets Manager.
  • Value for CatFeederThingLambdaSecretNameCertCRT
  • Value for CatFeederThingLambdaSecretNameCertPrivate
  • Value for CatFeederThingControllerCertName: this is the name of the long certificate value found in IoT Core, created in the prerequisites as the first certificate, the one used by the ESP32.
  • Find the Lambda created in the CloudFormation stack and Test the Lambda to manually trigger the event.
  • If you have set up an IoT 1-Click Button as described in Part 1, you can replace that Lambda with the one created by the CloudFormation Stack: go to the "AWS IoT 1-Click" Service and edit the "template" for the CatFeeder project.

[Image: AWS IoT 1-Click project template - Lambda]

  4. Press the IoT Button in the following ways:
  • Single Click
  • Double Click
  • Long Click
  5. Verify the button events are received by the ESP32 by going to the Arduino console, where you should see something like this:

[Image: Arduino console showing AWS IoT MQTT messages]

What's next?

I recently got a Creality3D Ender-3 V2 printer, and I have many known unknowns to get up to speed with regarding the fundamentals of 3D printing and all the tools, techniques and software associated with it. I'll attempt to print an enclosure to house the ESP32 controller, the wires, the power supply/battery (if I can source a battery that lasts for more than a month on a single charge) and, most importantly, the dry cat food; I'd like to use some mechanical components to dispense food each time we press the IoT button described in Part 1. I'll talk in depth about the progress made on the 3D printing in Part 3.