
4 posts tagged with "AWS CDK"

AWS CDK


Rockit Apple Payslip Analyzer with GenAI Chatbot using Bedrock and Streamlit

· 4 min read
Chiwai Chan
Tinkerer

It's the time of year when I normally have to start doing taxes, not for myself but for my parents. Mum works at various fruit picking / packing places in Hawkes Bay throughout the year, so that means there are all sorts of Payslips from different employers for the last financial year. Occasionally Mum asks me about specific details in her weekly payslips, and that usually means: download a PDF from an email -> open up the PDF -> find what she's asking for -> look at the PDF -> can't find it, so ask what Mum meant -> find the answer -> explain it to her.

Solution & Goal

The usual format. The challenge: create a Generative AI conversational chat that enables Mum to ask, in her natural language, for specific details of her Payslips.

And the goal: outsource the work to AI = more time to play. :-)

Success Criteria

  • Automatically extract details from Payslips - I've only tested it on Payslips from Rockit Apple.
  • Enable the end user to ask for details of a Payslip in Cantonese
  • Retrieve data from an Athena Table where the extracted Payslip details are stored
  • Create a Chatbot to receive questions in Cantonese about the user's Payslips stored in the Athena Table, and generate a response back to the user in Cantonese - a rough, hypothetical sketch of such a Bedrock call follows this list
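
To give a sense of the conversational piece, here is a minimal, hypothetical sketch of asking a Bedrock model a Cantonese question using boto3; the model ID, prompt wording and response parsing are my own assumptions and not the app's actual code.

import json
import boto3

# Hypothetical sketch: ask a Bedrock model a question in Cantonese about payslip data.
# The model ID, prompt and payslip context are assumptions, not the app's exact configuration.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

question = "我上星期張payslip實收幾多錢?"  # "What was my net pay last week?"

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [
        {
            "role": "user",
            "content": f"Answer in Cantonese using this payslip data: <payslip rows go here>\n\nQuestion: {question}",
        }
    ],
}

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
)
print(json.loads(response["body"].read())["content"][0]["text"])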

So what's the Architecture?

Architecture

Note

I've only tried it for Payslips generated by this employer: Rockit Apple

Deploy it for yourself to try out

Prerequisites

  • Python 3.12 installed - the only version I've validated
  • Pip installed
  • Node.js and npm Installed
  • CDK installed - using npm install -g aws-cdk
  • AWS CLI Profile configured

Deployment CLI Commands

  • Open up a terminal
  • And run the following commands
git clone git@github.com:chiwaichan/rockitapple-payslip-analyzer-with-genai-chatbot-using-bedrock-streamlit.git 
cd rockitapple-payslip-analyzer-with-genai-chatbot-using-bedrock-streamlit
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
cdk deploy

If all goes well

You should see this as a result of calling the cdk deploy command

CDK Deploy

Check that the CloudFormation Stack is being created in the AWS Console

CloudFormation Create

Click on it to see the Events, Resources and Output for the Stack

CloudFormation Create Events

Find the link to the S3 Bucket to upload Payslip PDFs into: in the Stack's Resources, find the S3 Bucket with a Logical ID that starts with "sourcepayslips" and click on its Physical ID link.

S3 Buckets

Upload your PDF Payslips into this bucket.

S3 Source Payslip PDFs

Find the link to the S3 Bucket where the extracted data will be stored for the Athena Table: in the Stack's Resources, find the S3 Bucket with a Logical ID that starts with "PayslipAthenaDataBucket" and click on its Physical ID link.

CloudFormation S3 Buckets

There you will find a JSON file; it should take a few minutes to appear after you upload the PDF.

Athena Table JSON file in S3 Bucket

It was created by the Lambda shown in the architecture diagram we saw earlier. The Lambda uses Amazon Textract to extract the data from each Payslip using OCR, and Textract's Queries feature lets us describe, in natural language, exactly what we want to pull out of a PDF. Find the "app.py" file shown in the folder structure in the screenshot below; you can modify the wording of the Questions the Lambda function uses to extract details from the Payslip to suit the specific wording of your own Payslips. The result of each Question is saved to the Athena table using the column name shown next to the Question.

Textract Queries
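
For context, here is a minimal sketch of what calling Textract's Queries feature with boto3 can look like for a PDF stored in S3; the bucket name, object key and query wording below are placeholders rather than the exact values used by the Lambda in this repo.

import boto3

# Hypothetical sketch of Textract's Queries feature against a Payslip PDF stored in S3;
# bucket, key and query wording are placeholders.
textract = boto3.client("textract")

job = textract.start_document_analysis(
    DocumentLocation={"S3Object": {"Bucket": "my-source-payslips-bucket", "Name": "payslip.pdf"}},
    FeatureTypes=["QUERIES"],
    QueriesConfig={
        "Queries": [
            {"Text": "What is the net pay?", "Alias": "net_pay"},
            {"Text": "What is the pay period end date?", "Alias": "pay_period_end"},
        ]
    },
)

# Fetch the result once the job completes; a Lambda would typically wait for the completion notification
result = textract.get_document_analysis(JobId=job["JobId"])
answers = [block for block in result.get("Blocks", []) if block["BlockType"] == "QUERY_RESULT"]
print(answers)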

What it looks like in action

Go to the CloudFormation Stack's Outputs to get the URL to open the Streamlit Application's frontend.

Click the value for the Key "StreamlitFargateServiceServiceURL"

Streamlit URL

That will take you to a Streamlit App hosted in the Fargate Container shown in the architecture diagram

Streamlit App

Let's try out some examples

Example 1 Example 2 Example 3 Example 4 1 payslip

Things don't always go well

Error

You can tweak the Athena Queries generated by the LLM by providing specific examples tailored to your Athena Table and its column names and values - a technique known as Few-Shot Learning. Modify this file to tweak the queries fed into the few-shot examples used by Bedrock and the Streamlit app; an illustration of the general shape of such examples is sketched below the screenshot.

Few Shot Examples
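
As an illustration, a few-shot example for text-to-SQL typically pairs a natural-language question with the exact Athena query you want the LLM to imitate. The snippet below is a hypothetical shape of such examples; the table and column names are assumptions, not the repo's actual schema.

# Hypothetical few-shot examples pairing a question with the Athena SQL the LLM should imitate;
# table and column names are assumptions, not the repo's actual schema.
few_shot_examples = [
    {
        "question": "How much was I paid in the week ending 10 March 2024?",
        "sql": "SELECT net_pay FROM payslips WHERE pay_period_end = DATE '2024-03-10'",
    },
    {
        "question": "How many hours did I work last financial year?",
        "sql": "SELECT SUM(hours_worked) FROM payslips "
               "WHERE pay_period_end BETWEEN DATE '2023-04-01' AND DATE '2024-03-31'",
    },
]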

Thanks to this repo

I was able to learn and build my first GenAI app: AWS Samples - genai-quickstart-pocs

I based my app on the example for Athena: I wrapped the Streamlit app into a Fargate Container and added Textract to extract Payslip details from PDFs, and this app was the output of that.

FeedMyFurBabies – Storing Historical AWS IoT Core MQTT State data in Amazon Timestream

· 3 min read
Chiwai Chan
Tinkerer

In the code examples I have shared in the past, when I sent and received IoT messages and states to and from AWS IoT Core Topics, I only implemented subscribers that perform some functionality when an MQTT message is received on a Topic. While that was useful when my FurBaby was fed - in the case where the Cat Feeder was triggered to drop Temptations into the bowls - we did not keep a record of the feeds or the State of the Cat Feeder in any form of data store over time; this meant we did not track when or how many times food was dropped into a bowl.

In this blog, I will demonstrate how to store the data in the MQTT messages sent to AWS IoT Core by ingesting it into an Amazon Timestream database; Timestream is a fully managed, serverless time-series database, so we can leverage it without worrying about maintaining the database infrastructure.

Architecture

Architecture

In this architecture we have two AWS IoT Core Topics, and each Topic has an IoT Rule associated with it that sends all the data from every MQTT message received on that Topic - there is the ability to filter the messages, but we are not using it - into a corresponding Amazon Timestream table. A rough sketch of that wiring in CDK is shown below.
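
As an illustration of one Topic-to-Timestream rule, the CDK snippet below sketches an IoT Topic Rule that forwards every message on "cat-feeder/states" into a Timestream table. Treat it as a hypothetical sketch: it assumes it sits inside a Stack's __init__ (hence self), and the database/table names and the timestream_write_role IAM role are assumptions rather than the repo's exact constructs.

from aws_cdk import aws_iot as iot

# Hypothetical sketch: an IoT Topic Rule that forwards every "cat-feeder/states" message
# into a Timestream table. Names and the IAM role are assumptions.
iot.CfnTopicRule(
    self, "CatFeederStatesToTimestream",
    topic_rule_payload=iot.CfnTopicRule.TopicRulePayloadProperty(
        sql="SELECT * FROM 'cat-feeder/states'",  # no filtering, forward everything
        actions=[
            iot.CfnTopicRule.ActionProperty(
                timestream=iot.CfnTopicRule.TimestreamActionProperty(
                    database_name="catFeeders",
                    table_name="catFeedersStates",
                    dimensions=[
                        iot.CfnTopicRule.TimestreamDimensionProperty(
                            name="device_location",
                            value="${device_location}",
                        )
                    ],
                    role_arn=timestream_write_role.role_arn,  # assumed role with timestream:WriteRecords
                )
            )
        ],
    ),
)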

Deploying the reference architecture

git clone git@github.com:chiwaichan/feedmyfurbabies-cdk-iot-timestream.git
cd feedmyfurbabies-cdk-iot-timestream
cdk deploy

git remote rm origin
git remote add origin https://git-codecommit.us-east-1.amazonaws.com/v1/repos/feedmyfurbabies-cdk-iot-timestream-FeedMyFurBabiesCodeCommitRepo
git push --set-upstream origin main

Here is a link to my GitHub repository where this reference architecture is hosted: https://github.com/chiwaichan/feedmyfurbabies-cdk-iot-timestream

Simulate an IoT Thing to Publish MQTT Messages to IoT Core Topic

In the root directory of the repository is a script that simulates an IoT Thing by constantly publishing MQTT messages to the "cat-feeder/states" Topic; ensure you have the AWS CLI installed on your machine with a default profile, as the script relies on it, and ensure the Access Keys used by the default profile have permission to call "iot:Publish".

It sends a random number for the "food_capacity", ranging from 0 to 100, to represent the percentage of food remaining in a cat feeder, and a value for the "device_location", as we are scaling out the number of cat feeders placed around the house. Be sure to send the same JSON structure in your MQTT messages if you decide not to use the provided script; a rough sketch of doing this with boto3 follows the screenshot below.

publish mqtt messages script
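
If you would rather publish the messages yourself, a minimal sketch with boto3 might look like the following; the topic name and JSON keys match what is described above, while the actual values are just examples.

import json
import random
import boto3

# Minimal sketch of publishing a cat feeder state message; payload values are examples.
iot_data = boto3.client("iot-data")  # uses your default AWS CLI profile credentials

payload = {
    "food_capacity": random.randint(0, 100),  # percentage of food remaining
    "device_location": "lounge",
}

iot_data.publish(
    topic="cat-feeder/states",
    qos=1,
    payload=json.dumps(payload),
)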

Query the data stored in the Amazon Timestream Database/Table

Now let's jump into the AWS Console, go to the Timestream service and open the "catFeedersStates" Table; then click on "Actions" and choose the "Query table" option to go to the Query editor.

timestream table

The Query editor will show a default query statement; click "Run" and you will see in the Query results the data from the MQTT messages generated by the script, as ingested from the IoT Topic "cat-feeder/states".

timestream table query
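
You can also run the same kind of query programmatically; the example below is a hypothetical sketch using the Timestream query client, with the database name assumed.

import boto3

# Hypothetical sketch of querying the ingested state data; the database name is an assumption.
query_client = boto3.client("timestream-query")

query = 'SELECT * FROM "catFeeders"."catFeedersStates" ORDER BY time DESC LIMIT 10'
result = query_client.query(QueryString=query)

for row in result["Rows"]:
    print([datum.get("ScalarValue") for datum in row["Data"]])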

FeedMyFurBabies – Using Custom Resources in AWS CDK to create AWS IoT Core Keys and Certificates

· 9 min read
Chiwai Chan
Tinkerer

In a previous blog I talked about switching from CloudFormation templates to AWS CDK as my preference for infrastructure as code for provisioning my AWS IoT Core resources; I mentioned at the time that using AWS CDK would improve my productivity and let me focus on iterating and building. Although I switched to CDK for the reasons I described in that blog, there are some CloudFormation limitations that cannot be addressed just by switching to CDK alone.

In this blog I will talk about CloudFormation Custom Resources:

  • What are CloudFormation Custom Resources?
  • What is the problem I am trying to solve?
  • How will I solve it?
  • How am I using Custom Resources with AWS CDK?

CloudFormation Custom Resources allow you to write custom logic in AWS Lambda functions to provision resources, whether those resources live in AWS (you might ask why not just use CloudFormation or CDK: keep reading), on-premises or in other public clouds. These Custom Resource Lambda functions are configured within a CloudFormation template and are hooked into a CloudFormation Stack's lifecycle during the create, update and delete phases; to allow these lifecycle stages to happen, the logic must be implemented in the Lambda function's code.

What is the problem I am trying to solve?

My AWS IoT Core reference architecture relies on two sets of certificates and private keys; they are used to authenticate each Thing device connecting to AWS IoT Core - this ensures that only trusted devices can establish a connection.

In the CloudFormation template version of my reference architecture, the deployment instructions required manually creating 2 Certificates in the AWS Console for the IoT Core service, because CloudFormation doesn't directly support the creation of certificates for AWS IoT Core; as shown in the screenshot below.

CloudFormation Stacks

There is nothing wrong with creating the certificates manually within the AWS Console when you are trying out my example for the purpose of learning, but it would be best to be able to deploy the entire set of resources using infrastructure as code, so we can achieve consistent, repeatable deployments with as little effort as possible. If you are someone completely new to AWS, coding and IoT, my deployment instructions would be very overwhelming and the chances of you successfully deploying a fully functional example would be very slim.

How will I solve it?

If you got this far and actually read what was written up to this point, you have probably guessed that the solution is Custom Resources: so let's talk about how the problem described above was solved.

So we know Custom Resources are part of the solution, but one important thing we need to understand is that, even though there isn't the ability to create the certificates directly using CloudFormation, there is support for creating them using the AWS SDK Boto3 Python library: create_keys_and_certificate.

create_keys_and_certificate

So essentially, we are able to create the AWS IoT Core certificates using CloudFormation (in an indirect way), but it requires the help of Custom Resources (a Lambda function) and the AWS Boto3 Python SDK.

The Python code below is what I have in the Custom Resource Lambda function; it demonstrates the use of the Boto3 SDK to create the AWS IoT Core Certificates. As a bonus, I am leveraging the Lambda function to save the Certificates into the AWS Systems Manager Parameter Store, which makes things much simpler by centralising the Certificates in a single location, without the engineer deploying this reference architecture having to manually copy, paste and manage the Certificates - as I forced readers to do in the original version of this reference architecture deployment. The code below also manages the lifecycle of the Certificates when the CloudFormation Stack is deleted, by deleting the Certificates it created during the create phase of the lifecycle.

The overall flow to create the certificates is: Create a CloudFormation Stack --> Invoke the Custom Resource --> invoke the Boto3 IoT "create_keys_and_certificate" API --> save the certificates in Systems Manager Parameter Store

import os
import sys
import json
import logging as logger
import requests
import boto3
from botocore.config import Config
from botocore.exceptions import ClientError

import time

logger.getLogger().setLevel(logger.INFO)


def get_aws_client(name):
    return boto3.client(
        name,
        config=Config(retries={"max_attempts": 10, "mode": "standard"}),
    )


def create_resources(thing_name: str, stack_name: str, encryption_algo: str):

    c_iot = get_aws_client("iot")
    c_ssm = get_aws_client("ssm")

    result = {}

    # Download the Amazon Root CA file and save it to Systems Manager Parameter Store
    url = "https://www.amazontrust.com/repository/AmazonRootCA1.pem"
    response = requests.get(url)

    if response.status_code == 200:
        amazon_root_ca = response.text
    else:
        logger.error(f"Failed to download Amazon Root CA file. Status code: {response.status_code}")

    try:
        # Create the keys and certificate for a thing and save them each as Systems Manager Parameter Store values later
        response = c_iot.create_keys_and_certificate(setAsActive=True)
        certificate_pem = response["certificatePem"]
        private_key = response["keyPair"]["PrivateKey"]
        result["CertificateArn"] = response["certificateArn"]
    except ClientError as e:
        logger.error(f"Error creating certificate, {e}")
        sys.exit(1)

    # store certificate and private key in SSM param store
    try:
        parameter_private_key = f"/{stack_name}/{thing_name}/private_key"
        parameter_certificate_pem = f"/{stack_name}/{thing_name}/certificate_pem"
        parameter_amazon_root_ca = f"/{stack_name}/{thing_name}/amazon_root_ca"

        # Saving the private key in Systems Manager Parameter Store
        response = c_ssm.put_parameter(
            Name=parameter_private_key,
            Description=f"Certificate private key for IoT thing {thing_name}",
            Value=private_key,
            Type="SecureString",
            Tier="Advanced",
            Overwrite=True
        )
        result["PrivateKeySecretParameter"] = parameter_private_key

        # Saving the certificate pem in Systems Manager Parameter Store
        response = c_ssm.put_parameter(
            Name=parameter_certificate_pem,
            Description=f"Certificate PEM for IoT thing {thing_name}",
            Value=certificate_pem,
            Type="String",
            Tier="Advanced",
            Overwrite=True
        )
        result["CertificatePemParameter"] = parameter_certificate_pem

        # Saving the Amazon Root CA in Systems Manager Parameter Store.
        # Although this file is publicly available to download, it is included to provide a complete set of files to try out this working example with as much ease as possible
        response = c_ssm.put_parameter(
            Name=parameter_amazon_root_ca,
            Description=f"Amazon Root CA for IoT thing {thing_name}",
            Value=amazon_root_ca,
            Type="String",
            Tier="Advanced",
            Overwrite=True
        )
        result["AmazonRootCAParameter"] = parameter_amazon_root_ca
    except ClientError as e:
        logger.error(f"Error creating secure string parameters, {e}")
        sys.exit(1)

    try:
        response = c_iot.describe_endpoint(endpointType="iot:Data-ATS")
        result["DataAtsEndpointAddress"] = response["endpointAddress"]
    except ClientError as e:
        logger.error(f"Could not obtain iot:Data-ATS endpoint, {e}")
        result["DataAtsEndpointAddress"] = "stack_error: see log files"

    return result


# Delete the resources created for a thing when the CloudFormation Stack is deleted
def delete_resources(thing_name: str, certificate_arn: str, stack_name: str):
    c_iot = get_aws_client("iot")
    c_ssm = get_aws_client("ssm")

    try:
        # Delete all the Systems Manager Parameter Store values created to store a thing's certificate files
        parameter_private_key = f"/{stack_name}/{thing_name}/private_key"
        parameter_certificate_pem = f"/{stack_name}/{thing_name}/certificate_pem"
        parameter_amazon_root_ca = f"/{stack_name}/{thing_name}/amazon_root_ca"
        c_ssm.delete_parameters(Names=[parameter_private_key, parameter_certificate_pem, parameter_amazon_root_ca])
    except ClientError as e:
        logger.error(f"Unable to delete parameter store values, {e}")

    try:
        # Clean up the certificate by firstly revoking it then followed by deleting it
        c_iot.update_certificate(certificateId=certificate_arn.split("/")[-1], newStatus="REVOKED")
        c_iot.delete_certificate(certificateId=certificate_arn.split("/")[-1])
    except ClientError as e:
        logger.error(f"Unable to delete certificate {certificate_arn}, {e}")


def handler(event, context):
    props = event["ResourceProperties"]
    physical_resource_id = ""
    response_data = {}

    try:
        # Check if this is a Create and we're failing Creates
        if event["RequestType"] == "Create" and event["ResourceProperties"].get(
            "FailCreate", False
        ):
            raise RuntimeError("Create failure requested, logging")
        elif event["RequestType"] == "Create":
            logger.info("Request CREATE")

            resp_lambda = create_resources(
                thing_name=props["CatFeederThingLambdaCertName"],
                stack_name=props["StackName"],
                encryption_algo=props["EncryptionAlgorithm"]
            )

            resp_controller = create_resources(
                thing_name=props["CatFeederThingControllerCertName"],
                stack_name=props["StackName"],
                encryption_algo=props["EncryptionAlgorithm"]
            )

            # The values in the response_data could be used in the CDK code, for example as Outputs for the CloudFormation Stack deployed
            response_data = {
                "CertificateArnLambda": resp_lambda["CertificateArn"],
                "PrivateKeySecretParameterLambda": resp_lambda["PrivateKeySecretParameter"],
                "CertificatePemParameterLambda": resp_lambda["CertificatePemParameter"],
                "AmazonRootCAParameterLambda": resp_lambda["AmazonRootCAParameter"],
                "CertificateArnController": resp_controller["CertificateArn"],
                "PrivateKeySecretParameterController": resp_controller["PrivateKeySecretParameter"],
                "CertificatePemParameterController": resp_controller["CertificatePemParameter"],
                "AmazonRootCAParameterController": resp_controller["AmazonRootCAParameter"],
                "DataAtsEndpointAddress": resp_lambda["DataAtsEndpointAddress"],
            }

            # Use the ARNs of the pair of certificates created as the PhysicalResourceId used by the Custom Resource
            physical_resource_id = response_data["CertificateArnLambda"] + "," + response_data["CertificateArnController"]
        elif event["RequestType"] == "Update":
            logger.info("Request UPDATE")
            response_data = {}
            physical_resource_id = event["PhysicalResourceId"]
        elif event["RequestType"] == "Delete":
            logger.info("Request DELETE")

            certificate_arns = event["PhysicalResourceId"]
            certificate_arns_array = certificate_arns.split(",")

            resp_lambda = delete_resources(
                thing_name=props["CatFeederThingLambdaCertName"],
                certificate_arn=certificate_arns_array[0],
                stack_name=props["StackName"],
            )

            resp_controller = delete_resources(
                thing_name=props["CatFeederThingControllerCertName"],
                certificate_arn=certificate_arns_array[1],
                stack_name=props["StackName"],
            )
            response_data = {}
            physical_resource_id = certificate_arns
        else:
            logger.info("Should not get here in normal cases - could be REPLACE")

        send_cfn_response(event, context, "SUCCESS", response_data, physical_resource_id)
    except Exception as e:
        logger.exception(e)
        sys.exit(1)


def send_cfn_response(event, context, response_status, response_data, physical_resource_id):
    response_body = json.dumps({
        "Status": response_status,
        "Reason": "See the details in CloudWatch Log Stream: " + context.log_stream_name,
        "PhysicalResourceId": physical_resource_id,
        "StackId": event['StackId'],
        "RequestId": event['RequestId'],
        "LogicalResourceId": event['LogicalResourceId'],
        "Data": response_data
    })

    headers = {
        'content-type': '',
        'content-length': str(len(response_body))
    }

    requests.put(event['ResponseURL'], data=response_body, headers=headers)

How am I using Custom Resources with AWS CDK?

What I am about to describe in this section can also be applied to a regular CloudFormation template; as a matter of fact, CDK generates a CloudFormation template behind the scenes during the Synth phase. The latest version of my IoT Core reference architecture is implemented using AWS CDK: https://chiwaichan.co.nz/blog/2024/02/02/feedmyfurbabies-i-am-switching-to-aws-cdk/

If you want to get straight into deploying the CDK version of the reference architecture, go here: https://github.com/chiwaichan/feedmyfurbabies-cdk-iot

In my CDK code, I provision the Custom Resource Lambda function and the associated IAM Role and Policies using the Python code below. The line of code "code=lambda_.Code.from_asset("lambdas/custom-resources/iot")" loads the Custom Resource Lambda function code shown earlier.

# IAM Role for the Custom Resource Lambda Function
custom_resource_lambda_role = iam.Role(
    self, "CustomResourceExecutionRole",
    assumed_by=iam.ServicePrincipal("lambda.amazonaws.com")
)

# IAM Policy allowing the Lambda to manage IoT certificates
iot_policy = iam.PolicyStatement(
    actions=[
        "iot:CreateCertificateFromCsr",
        "iot:CreateKeysAndCertificate",
        "iot:DescribeEndpoint",
        "iot:AttachPolicy",
        "iot:DetachPolicy",
        "iot:UpdateCertificate",
        "iot:DeleteCertificate"
    ],
    resources=["*"]  # Modify this to restrict to specific resources
)

# IAM Policy allowing the Lambda to manage the Parameter Store values
ssm_policy = iam.PolicyStatement(
    actions=[
        "ssm:PutParameter",
        "ssm:DeleteParameters"
    ],
    resources=[f"arn:aws:ssm:{self.region}:{self.account}:parameter/*"]  # Modify this to restrict to specific parameters
)

logging_policy = iam.PolicyStatement(
    actions=[
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
    ],
    resources=["arn:aws:logs:*:*:*"]
)

custom_resource_lambda_role.add_to_policy(iot_policy)
custom_resource_lambda_role.add_to_policy(ssm_policy)
custom_resource_lambda_role.add_to_policy(logging_policy)

# Define the Lambda function
custom_lambda = lambda_.Function(
    self, 'CustomResourceLambdaIoT',
    runtime=lambda_.Runtime.PYTHON_3_8,
    handler="app.handler",
    code=lambda_.Code.from_asset("lambdas/custom-resources/iot"),
    timeout=Duration.seconds(60),
    role=custom_resource_lambda_role
)


# Properties to pass to the custom resource
custom_resource_props = {
    "EncryptionAlgorithm": "ECC",
    "CatFeederThingLambdaCertName": f"{cat_feeder_thing_lambda_name.value_as_string}",
    "CatFeederThingControllerCertName": f"{cat_feeder_thing_controller_name.value_as_string}",
    "StackName": f"{construct_id}",
}

# Create the Custom Resource
custom_resource = CustomResource(
    self, 'CustomResourceIoT',
    service_token=custom_lambda.function_arn,
    properties=custom_resource_props
)
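
As a follow-on, the values returned in response_data by the Custom Resource Lambda can be read back in CDK with get_att_string, for example to surface them as Stack Outputs; the snippet below is a small sketch assuming it sits in the same Stack as the code above.

from aws_cdk import CfnOutput

# Sketch: surface attributes returned by the Custom Resource Lambda as CloudFormation Stack Outputs.
CfnOutput(
    self, "IoTDataAtsEndpoint",
    value=custom_resource.get_att_string("DataAtsEndpointAddress"),
)
CfnOutput(
    self, "CertificateArnLambdaOutput",
    value=custom_resource.get_att_string("CertificateArnLambda"),
)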

When you execute "cdk deploy" using the CLI on the CDK reference architecture, CDK will synthesize a CloudFormation template from the Python CDK code, and then create a CloudFormation Stack using the synthesized template for you.

For more details on the CDK AWS IoT reference architecture and deployment instructions, please visit my blog: https://chiwaichan.co.nz/blog/2024/02/02/feedmyfurbabies-i-am-switching-to-aws-cdk/

FeedMyFurBabies – I am switching to AWS CDK

· 7 min read
Chiwai Chan
Tinkerer

I have been a bit slack on this Cat Feeder IoT project for the last 12 months or so; there have been many challenges I've faced during that time that prevented me from materialising the ideas I had - many of them sounded a little crazy if you've had a conversation with me in passing, but they are not crazy to me in my crazy mind as I know what I ramble about is technically doable.

Examples of the technical related challenges I had were:

  • CloudFormation: the initial version of this project was implemented using CloudFormation for the IaC; here is the repository containing both the code and deployment instructions. If you read the deployment instructions, you will notice there are a lot of manual steps required - e.g. creating 2 sets of certificates in AWS IoT Core in the AWS Console, and copying and pasting values to and from the CloudFormation Parameters and Outputs - even though at the time I made my best efforts to minimise the manual effort required while coding them. It was not a good example to get up and running, especially if you are new to AWS, Arduino or IoT; I myself struggled at times to deploy my own example.

  • Terraform: I ported the CloudFormation IaC code to Terraform some time last year; you can find it here. Nothing is wrong with Terraform itself; I just keep forgetting to save, or keep misplacing, my Terraform state files every time I resume this project. In reality I might leverage both Terraform and CDK for the projects/micro-services I create in the future, but it all really depends on what I am trying to achieve at the end of the day.

Deploying the AWS CDK version of this Cat Feeder IoT project

So, the commands below are the deployment instructions taken from the AWS CDK version of this project, you can find it here: https://github.com/chiwaichan/feedmyfurbabies-cdk-iot

git clone git@github.com:chiwaichan/feedmyfurbabies-cdk-iot.git
cd feedmyfurbabies-cdk-iot
cdk deploy

git remote rm origin
git remote add origin https://git-codecommit.us-east-1.amazonaws.com/v1/repos/feedmyfurbabies-cdk-iot-FeedMyFurBabiesCodeCommitRepo
git push --set-upstream origin main

The commands above are all you need to execute in order to deploy the Cat Feeder project in CDK - assuming you have AWS CDK and your AWS credentials configured on the machine you are running these commands on. The first group of commands checks out the CDK code and deploys an AWS CodeCommit repository and a CodePipeline pipeline - creating the 1st CloudFormation Stack from a CloudFormation template; the second group of commands pushes the CDK code into the CodeCommit repository created by the first group, which in turn triggers an execution in CodePipeline, and the pipeline deploys the resources for this Cat Feeder IoT project - creating the 2nd CloudFormation Stack from a different CloudFormation template.

The two groups of commands create the 2 CloudFormation Stacks shown in the screenshot below: the stack "feedmyfurbabies-cdk-iot" provisions the CodeCommit repository and CodePipeline - using the 1st CloudFormation template; and the stack "Deploy-feedmyfurbabies-cdk-iot-deployed-service" provisions the resources for this Cat Feeder IoT project - using the 2nd CloudFormation template.

CloudFormation Stacks

FYI, I did not come up with the pattern I just described above, which deploys the two CloudFormation Stacks: one for the pipeline and the other for the AWS resources of this Cat Feeder IoT project. I came across it during one of the AWS online workshops I was using to learn CDK, found the pattern useful, and pretty much decided to adopt it for my projects going forward; a rough sketch of the pattern follows.
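
For reference, a minimal sketch of that pattern using CDK Pipelines might look like the following; the construct names and the stage contents are assumptions rather than the exact code in feedmyfurbabies-cdk-iot.

from aws_cdk import Stack, Stage
from aws_cdk import pipelines
from aws_cdk import aws_codecommit as codecommit
from constructs import Construct

# Hypothetical sketch of the "pipeline stack deploys an application stage" pattern;
# names are assumptions, not the exact constructs used in feedmyfurbabies-cdk-iot.
class DeployedServiceStage(Stage):
    def __init__(self, scope: Construct, construct_id: str, **kwargs):
        super().__init__(scope, construct_id, **kwargs)
        # The application stack (IoT Things, certificates, rules, etc.) would be instantiated here.

class PipelineStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs):
        super().__init__(scope, construct_id, **kwargs)

        # 1st Stack: the CodeCommit repository and the pipeline itself
        repo = codecommit.Repository(
            self, "FeedMyFurBabiesCodeCommitRepo",
            repository_name="feedmyfurbabies-cdk-iot",
        )

        pipeline = pipelines.CodePipeline(
            self, "Pipeline",
            synth=pipelines.ShellStep(
                "Synth",
                input=pipelines.CodePipelineSource.code_commit(repo, "main"),
                commands=["pip install -r requirements.txt", "cdk synth"],
            ),
        )

        # 2nd Stack: the pipeline deploys the application resources as a separate CloudFormation Stack
        pipeline.add_stage(DeployedServiceStage(self, "Deploy"))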

Test out the deployed solution

The resources that are relevant to the architecture of this AWS IoT solution are shown in the diagram below.

Deployed resources

There are 2 sets of certificates and 2 sets of AWS IoT Things and policies deployed by the "Deploy-feedmyfurbabies-cdk-iot-deployed-service" Stack:

IoT Certificates

The 1st set of certificates and its IoT Thing is hooked up to the AWS Lambda function (Lambda Thing) shown in the diagram. This Lambda function acts as an AWS IoT Thing (it uses the certificates saved in Systems Manager Parameters prefixed with "/feedmyfurbabies-cdk-iot-deployed-service/CatFeederThingLambda") and is fully configured as one, along with all the necessary certificates and permissions to send an MQTT message to the "cat-feeder/action" topic in AWS IoT Core; this is a very convenient way to see in action how one could send MQTT messages to AWS IoT Core using Python, as well as a good way to confirm the deployment was successful by testing it out!

Before we invoke the Lambda Thing/function, we need to subscribe to the "cat-feeder/action" topic so that we can see the incoming messages sent by the Lambda function.

Subscribe to IoT Topic

Then we invoke the Lambda function in the AWS Console:

Lambda Result

Make sure you get a green box confirming the MQTT message was sent.

The code in the Lambda is written in Python and it sends a JSON payload (the dictionary variable shown in the code below) to the IoT Topic "cat-feeder/action"

Lambda Code
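
The screenshot shows the actual function code; to give a rough idea of the shape of such a "Lambda Thing", here is a hypothetical sketch that loads the certificates from Parameter Store and publishes over MQTT with the AWS IoT Device SDK v2. The endpoint value, client ID and payload are assumptions, and the parameter names simply follow the prefix mentioned above.

import json
import boto3
from awscrt import mqtt
from awsiot import mqtt_connection_builder

# Hypothetical sketch of a Lambda acting as an IoT Thing; parameter names follow the
# prefix described above, while the endpoint, client ID and payload are assumptions.
ssm = boto3.client("ssm")
prefix = "/feedmyfurbabies-cdk-iot-deployed-service/CatFeederThingLambda"

cert = ssm.get_parameter(Name=f"{prefix}/certificate_pem")["Parameter"]["Value"]
key = ssm.get_parameter(Name=f"{prefix}/private_key", WithDecryption=True)["Parameter"]["Value"]
root_ca = ssm.get_parameter(Name=f"{prefix}/amazon_root_ca")["Parameter"]["Value"]


def handler(event, context):
    connection = mqtt_connection_builder.mtls_from_bytes(
        endpoint="your-ats-endpoint.iot.us-east-1.amazonaws.com",  # assumed; use your iot:Data-ATS endpoint
        client_id="CatFeederThingLambda",
        cert_bytes=cert.encode(),
        pri_key_bytes=key.encode(),
        ca_bytes=root_ca.encode(),
    )
    connection.connect().result()

    publish_future, _ = connection.publish(
        topic="cat-feeder/action",
        payload=json.dumps({"action": "feed", "amount": 2}),  # example payload, not the repo's exact dictionary
        qos=mqtt.QoS.AT_LEAST_ONCE,
    )
    publish_future.result()

    connection.disconnect().result()
    return {"status": "published"}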

Now let's go back to AWS IoT Core to confirm we have received the message:

AWS IoT Core MQTT received

We can see the message received in IoT Core is the dictionary object sent by the Lambda code

Conclusion

Using CDK does not eliminate all the issues you might encounter when using CloudFormation - I have a future blog on creating and using CloudFormation Custom Resources lined up - because at the end of the day CDK just generates a CloudFormation template and handles the deployment of the CloudFormation Stack for you, without you having to manage the CloudFormation Stacks or templates yourself. The intent of this blog is to demonstrate how little effort is required to deploy an AWS IoT solution using CDK, compared with the same architecture I shared in my GitHub repo 2 years ago, where the CloudFormation template deployment instructions were long and tedious with manual steps.

The ultimate aim of this change in IaC is to focus on just building and iterating!

I do often talk too much in my blogs, but in this instance the instructions to deploy this solution for yourself are very minimal, with the majority of the content focused on the resources deployed, what each resource is for, and how they interact with each other.

Extra

You may have noticed that there are 2 sets of certificates and 2 IoT Things deployed in IoT Core in this reference architecture; this is because the 2nd set of certificates (prefixed with "/feedmyfurbabies-cdk-iot-deployed-service/CatFeederThingESP32") and its Thing are provisioned purely for you to send MQTT messages to AWS IoT Core from your own IoT hardware devices / micro-controllers.

Your own Thing

If you want to try it out, you will need to use the IoT Core Endpoint specific to your AWS Account and Region; you can either find it in the AWS IoT Core Console, or copy it from the CloudFormation Stack's Output:

IoT Core Endpoint

The Lambda Thing we tested above can be used to send MQTT messages to your own IoT device/micro-controller, as the 2nd set of certificates is configured with the necessary IoT Core Policies to receive the MQTT messages sent to the Topic "cat-feeder/action"; the certificates are also configured with the policies to send MQTT messages to a second IoT Topic called "cat-feeder/states". A small sketch of retrieving those certificates from Parameter Store follows the diagram below.

Your own Thing Architecture
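
If you go down that path, the certificate material for your own device can be pulled straight out of Parameter Store; here is a small sketch with boto3, where the parameter names follow the prefix above and the local file names are my own choice.

import boto3

# Sketch: download the 2nd set of certificates for your own device from Parameter Store.
# Parameter names follow the prefix mentioned above; local file names are arbitrary.
ssm = boto3.client("ssm")
prefix = "/feedmyfurbabies-cdk-iot-deployed-service/CatFeederThingESP32"

files = {
    "certificate.pem.crt": f"{prefix}/certificate_pem",
    "private.pem.key": f"{prefix}/private_key",
    "AmazonRootCA1.pem": f"{prefix}/amazon_root_ca",
}

for filename, parameter_name in files.items():
    value = ssm.get_parameter(Name=parameter_name, WithDecryption=True)["Parameter"]["Value"]
    with open(filename, "w") as f:
        f.write(value)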

I have a future blog that will demonstrate how to do this using MicroPython and a Seeed Studio XIAO ESP32C3 - so watch this space.