
Building a Stream Deck plugin to invoke a Lambda function
Interacting with cloud services is rarely a tactile experience. You write some code, run a command, or click a button on a screen, and things happen. Today we're going to change that: we'll write a plugin for the Elgato Stream Deck that triggers an AWS Lambda function on demand with a customizable event. In case you haven't heard of it, the Stream Deck is basically a set of buttons with tiny screens behind them that you can customize to do your bidding through plugins. Our small plugin will invoke a Lambda function of our choosing, thereby allowing us to do pretty much anything in AWS. ...
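At its core, the plugin boils down to one asynchronous Lambda invocation. A minimal sketch of that call, assuming a hypothetical function name and event payload:

```python
import json


def build_invoke_args(function_name: str, event: dict) -> dict:
    """Build the keyword arguments for a Lambda invocation.

    InvocationType "Event" makes the call asynchronous, which suits a
    fire-and-forget button press on the Stream Deck.
    """
    return {
        "FunctionName": function_name,
        "InvocationType": "Event",
        "Payload": json.dumps(event).encode("utf-8"),
    }


# Hypothetical function name and customizable event payload
args = build_invoke_args("my-demo-function", {"button": "top-left"})
```

With credentials configured, the plugin would then call `boto3.client("lambda").invoke(**args)`.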

The thing I keep relearning about the S3-SNS/SQS integration
Occasionally, you'll do something that you think you've done dozens of times before and are then surprised it no longer works. While setting up a log delivery mechanism for Splunk, I had one of these experiences again. (Feel free to replace relearning with forgetting in the headline.) Splunk's preferred method of ingesting log data from AWS is the SQS-based S3 input. In a nutshell, you ensure that all logs end up in an S3 bucket. That bucket is configured to send all object-create events to an SNS topic (so that multiple systems can subscribe), to which an SQS queue is subscribed. Splunk subsequently consumes the object-create events from the queue and ingests the corresponding objects from S3. ...
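The fan-out described above can be sketched as a bucket notification configuration pointing at SNS (topic ARN is hypothetical):

```python
def s3_to_sns_notification(topic_arn: str) -> dict:
    """Bucket notification configuration that sends all object-create
    events to an SNS topic.

    Fanning out via SNS (instead of pointing the bucket directly at an
    SQS queue) lets multiple consumers subscribe to the same events.
    """
    return {
        "TopicConfigurations": [
            {
                "TopicArn": topic_arn,
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    }


# Hypothetical topic ARN; the topic policy must allow s3.amazonaws.com to publish
config = s3_to_sns_notification("arn:aws:sns:eu-central-1:111122223333:log-events")
```

This dictionary would then be applied with `boto3.client("s3").put_bucket_notification_configuration(Bucket=..., NotificationConfiguration=config)`.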

Fixing the Touchpad on my former Chromebook
At some point, the touchpad on my converted Chromebook failed. In this post, I explain how I fixed it.

Which AWS Regions are Lambda@Edge functions executed in?
Lambda@Edge functions are executed in multiple regions around the world, but which ones exactly? In this post, we'll find out together, which enables us to pre-create log groups and other resources for them.
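Knowing the regions matters because Lambda@Edge writes its logs to a log group in whichever region served the request, named after the deployment region and function. A minimal sketch of that naming convention, with a hypothetical function name:

```python
def edge_log_group_name(function_name: str, deploy_region: str = "us-east-1") -> str:
    """Log group used by Lambda@Edge in the edge region that ran the function.

    Lambda@Edge functions are deployed in us-east-1, and their logs land
    in /aws/lambda/<deploy-region>.<function-name> in the executing region.
    """
    return f"/aws/lambda/{deploy_region}.{function_name}"


# Hypothetical function name
group = edge_log_group_name("my-edge-function")
```

To pre-create the group in each edge region, one would call `boto3.client("logs", region_name=region).create_log_group(logGroupName=group)` per region.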

Integrating HubSpot with AWS Lambda
Like many organizations, tecRacer uses HubSpot as a CRM. Integrating HubSpot with other (internal) systems enables smooth workflows for everyone involved. Since I recently built a custom integration, I thought it might be helpful to explain how to set up a secure interface with AWS.

How I spent a few hours using advanced technology to save $2
Opportunity cost is an important economic concept, but sometimes we need to ignore it to learn something. Join me in using a variety of services and tools to figure out what is using my KMS keys and whether I can safely delete them.

AWS Community Day 2024: Finding and using undocumented AWS APIs
Links and references for my talk at the AWS Community Day 2024 in Sofia. As soon as the recording is available, it will appear here. Contact: if you want to get in touch, send me an e-mail at hi [ät] mauricebrg.com. Presentation: the slides are available as a PDF here. Links: blog post "Using undocumented AWS APIs with Python"; a Python package that implements some undocumented APIs (GitHub). If you're interested in (un)documented or non-production APIs for security research, Datadog has an interesting article here. ...

AWS-Blog: Building Data Aggregation Pipelines using Apache Airflow and Athena
Business insights are frequently generated from aggregated data, like daily sales per market segment over time. In this blog post we'll use Apache Airflow to build a data aggregation pipeline that utilizes Amazon Athena for the heavy lifting. We'll cover best practices that you should follow to build a production-ready system.

AWS-Blog: How to accidentally create read-only DynamoDB items
In a recent Developing on AWS course, I was faced with an interesting question about DynamoDB: what happens if you create an item whose attributes belong to a global secondary index but have a data type that doesn't match the index definition? My intuition was wrong; let's check out what actually happens.

AWS-Blog: Making the TPC-H dataset available in Athena using Airflow
The TPC-H dataset is commonly used to benchmark data warehouses or, more generally, decision support systems. It describes a typical e-commerce workload and includes benchmark queries to enable performance comparison between different data warehouses. I think the dataset is also useful for teaching how to build different kinds of ETL or analytics workflows, so I decided to explore ways of making it available in Amazon Athena.