I Created a Twitter Bot Powered by AWS Lambda to Tweet Quotes Each Morning

As the new manager of the Toronto freeCodeCamp Twitter account, I wanted an easy way to automate tweets without needing to post updates manually every day.

I decided to build a simple bot that tweets out motivational coding quotes each morning. This provides some consistently interesting content that followers can look forward to.

Rather than maintaining a full server, I built this bot with AWS Lambda, using CloudWatch scheduled events to run the code each morning. Lambda is a highly cost-effective and easy way to run small jobs like this.

In this post, I’ll share how I built this serverless quote-tweeting bot architecture on AWS.

Why I Took Over the freeCodeCamp Toronto Twitter Account

I recently became more involved with the Toronto freeCodeCamp chapter as we started up in-person meetings again post-pandemic. FreeCodeCamp is an awesome coding education community with active local groups around the world.

Our chapter leader Caleb asked if anyone could help post more frequently to the Twitter account and engage with followers. I enjoy trying new projects like bots, so I happily volunteered!

Caleb granted me access to @freeCodeCampTO where I now help manage updates and interactions. My goal is to build our audience and share relevant content about code, tech, events and opportunities in the Toronto area.

I have some experience running my own Twitter bot that auto-tweets my blog posts. So I figured something similar that tweets motivational quotes daily would make a fun new project!

The Benefits of Tweeting Quotes and Sayings

Posting quotes serves a few different purposes:

  • Adds value for followers – Quotes are quick bits of wisdom that brighten someone's day. The programming- and tech-focused ones are fun and engaging for our coder audience.

  • Provides diverse content – Mixing quotes in with other updates keeps things interesting. The variety stands out better in followers' feeds.

  • Requires minimal effort – Finding or writing longer articles takes more time. Quotes deliver value quickly.

The auto-scheduled tweets also help keep the account active when no one is available to post manually. This shows followers the community is still alive and kicking.

I spent some time compiling a database of ~100 inspirational coding sayings to randomly select from each day. The bot tweets these out to entertain and motivate freeCodeCamp TO's audience.
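For the first version, the quotes simply live inside the function package. A minimal sketch of how the selection could work (the sample quotes shown here are illustrative, not my full list):

package main

import (
    "errors"
    "fmt"
    "math/rand"
)

// quotes is a small sample of the ~100 sayings the bot picks from.
var quotes = []string{
    "First, solve the problem. Then, write the code.",
    "Simplicity is the soul of efficiency.",
    "Talk is cheap. Show me the code.",
}

// getRandomQuote returns one saying at random for today's tweet.
func getRandomQuote() (string, error) {
    if len(quotes) == 0 {
        return "", errors.New("no quotes available")
    }
    return quotes[rand.Intn(len(quotes))], nil
}

func main() {
    q, err := getRandomQuote()
    if err != nil {
        panic(err)
    }
    fmt.Println(q)
}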

Comparing Different Open Source Twitter Bot Options

There are a variety of frameworks available for coding Twitter bots in languages like Python, Node.js and PHP. Some popular options include:

Tweepy – Python bot framework with many handy features. Good for data science use cases.

Twit – NPM package built on the Twitter API. Simple JavaScript integration.

TwitterAPIExchange – PHP wrapper offering an easy interface to the REST and streaming APIs.

However, my own background is primarily in Go and AWS. I wanted to build this project using Lambda functions triggered on a CloudWatch schedule.

Some advantages of the serverless approach:

  • No backend servers to manage
  • Handles traffic spikes easily
  • Very cost-effective at low scale
  • Integrates nicely with my AWS experience

Now let's take a look under the hood at the technical design…

Overview of the Lambda Architecture

The deployment architecture consists of a few simple components:

[Diagram: serverless Twitter bot architecture]

AWS Lambda – The Lambda function runs the Go code to post the tweet. Executed on a schedule via CloudWatch rule.

CloudWatch Events – Cron job runs daily and triggers the Lambda function.

API Gateway (optional) – For public interaction, we can front the Lambda with API Gateway.

DynamoDB (optional) – An alternative to bundling quotes with the function; the bot can pull them from an external table instead.
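To tie the pieces together, here is a minimal sketch of the Go entry point that CloudWatch invokes on its schedule (assuming the aws-lambda-go runtime library; the tweet-posting logic itself is elided):

package main

import (
    "context"
    "log"

    "github.com/aws/aws-lambda-go/lambda"
)

// handler runs once per scheduled CloudWatch event.
func handler(ctx context.Context) error {
    log.Println("Scheduled invocation received, posting today's quote...")
    // quote selection and tweet posting go here
    return nil
}

func main() {
    // lambda.Start registers the handler with the Lambda runtime.
    lambda.Start(handler)
}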

Let's dig into the Lambda function itself next…

Configuring Lambda Permissions and Twitter API Credentials

The Lambda function needs some permissions configured before it can do its job, but it's worth being precise about what those cover. IAM controls access to AWS services, not to Twitter: outbound HTTPS calls to Twitter's API work from Lambda by default, and authentication with Twitter happens through the API credentials I supply, not through IAM.

By default, functions get a basic execution role that only allows writing logs. I created a slightly broader custom role so the function can also publish to SNS (handy for failure notifications).

In the Lambda console, under Permissions, I attached that custom role (something like TwitterPostRole). Its policy document grants CloudWatch Logs access plus sns:Publish:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "sns:Publish"
      ],
      "Resource": "*"
    }
  ]
}

I pass the Twitter credentials (API key and secret, plus the access token and secret) in via environment variables so they are available at runtime inside the function.
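Reading them inside the handler is straightforward. Here is a minimal sketch, assuming the environment variable names below (my own naming convention, nothing Lambda requires):

package main

import (
    "fmt"
    "os"
)

// twitterCreds holds the four values Twitter's OAuth 1.0a user context needs.
type twitterCreds struct {
    APIKey, APISecret, AccessToken, AccessSecret string
}

// credsFromEnv pulls the credentials from the Lambda environment variables.
func credsFromEnv() (twitterCreds, error) {
    c := twitterCreds{
        APIKey:       os.Getenv("TWITTER_API_KEY"),
        APISecret:    os.Getenv("TWITTER_API_SECRET"),
        AccessToken:  os.Getenv("TWITTER_ACCESS_TOKEN"),
        AccessSecret: os.Getenv("TWITTER_ACCESS_SECRET"),
    }
    if c.APIKey == "" || c.APISecret == "" || c.AccessToken == "" || c.AccessSecret == "" {
        return twitterCreds{}, fmt.Errorf("missing one or more Twitter credential environment variables")
    }
    return c, nil
}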

Now when my handler code connects to Twitter's API to post a tweet, it authenticates using those credentials. The IAM role, meanwhile, lets the function write its logs (and publish to SNS) on the AWS side; it isn't what permits the outbound call to Twitter.

Evaluating Different Programming Languages for Writing Lambda Functions

AWS Lambda supports a bunch of different languages for writing functions – Go, Python, JavaScript, Java, C# and more.

I went with Go since that is my most comfortable coding language right now. However, some other great options include:

Python

  • Very popular general purpose language
  • Data analysis and ML applications
  • Large ecosystem of libraries
  • Slightly slower performance than Go

JavaScript (Node.js)

  • Frontend web development backgrounds
  • Quick prototypes and MVPs
  • Easy to get started writing code fast

Java

  • Great for enterprise developers
  • Lots of pre-existing Java talent and codebases
  • Statically typed and higher performance

Each language has its own pros and cons. The awesome thing about Lambda is you can switch between them fairly easily if needed since the surrounding integration and triggers remain the same.

Now let's break down the actual cost of running this on Lambda vs traditional servers!

Pricing Comparison – Lambda vs EC2 Cost Analysis

One of the main benefits of Lambda over provisioning always-on EC2 instances is the cost savings. Let's analyze the hard numbers.

First, the EC2 pricing if I were to run my Twitter bot on a server continuously:

  • Instance Type: t3.nano (1 vCPU, 0.5 GB RAM) = $3.78/month on-demand
  • Storage: 10 GB EBS = $1/month
  • Data transfer: ~1 GB = $0.09
  • Total: Around $4.87 per month

Whereas with AWS Lambda, the pricing is as follows:

  • Requests: 1M free tier, then $0.20 per 1M
  • Duration: 400,000 GB-seconds free per month, then $0.0000166667 per GB-second
  • Total: under 500 ms per run and far fewer than 1M requests = $0 per month

For my use case of tweeting once per day, Lambda easily fits within the free tier forever. No paying for idle EC2 capacity I don't use.

Even if the free tier didn't exist, my estimated monthly compute cost would be tiny:

  • 1 tweet = ~100 ms of execution at the 128 MB minimum memory size
  • 30 tweets per month = ~3 seconds of total duration, or about 0.375 GB-seconds (3 s × 0.125 GB)
  • 0.375 GB-seconds × $0.0000166667 per GB-second ≈ $0.000006 per month

As you can see, Lambda is extremely cheap for small workloads. Code run costs tend to be negligible compared to external services like databases.

The auto-scaling capabilities also ensure I don‘t need to worry about traffic spikes on viral tweets. Performance remains fast and consistent without any effort on my part.

Key Factors That Influence Lambda Pricing & Optimization Opportunities

The two primary drivers of Lambda costs are compute time and requests. A few ways to optimize based on these pricing factors:

Reduce duration

  • Code optimizations – reuse initialization across invocations, leverage caching, etc (see the sketch after this list)
  • Faster CPU instance types
  • Streamline dependencies and payloads

Lower requests

  • Increase batch sizes rather than individual requests
  • Implement caching and data warming
  • Route non-function traffic to APIs and containers
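The first of those points deserves a concrete example: anything created outside the handler function survives between warm invocations, so expensive setup only happens once per execution environment. A sketch (the shared HTTP client is illustrative; the same idea applies to the Twitter client):

package main

import (
    "log"
    "net/http"
    "time"

    "github.com/aws/aws-lambda-go/lambda"
)

// httpClient is built once when the execution environment starts and reused
// by every warm invocation, so we avoid paying the setup cost on each run.
var httpClient = &http.Client{Timeout: 10 * time.Second}

func handler() error {
    // Use the shared client here instead of constructing a new one per invoke.
    log.Println("reusing shared HTTP client with timeout", httpClient.Timeout)
    return nil
}

func main() {
    lambda.Start(handler)
}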

Beyond just tweeting quotes, I could add more advanced features like analyzing clicks and building a full audience dashboard.

Let's explore some ways to enhance this project over time…

Potential Feature Enhancements for the Twitter Bot

Here are some possibilities to take my simple quote-tweeting bot to the next level:

Expanding the Database of Quote Sources

I manually compiled ~100 quotes to pick from initially. But it would be nice to have more diversity over longer periods.

Some options to expand sources:

  • Scraping coding blogs and newsletters
  • Adding famous quotes about technology
  • Curating student and member submissions
  • Dynamic generation using GPT-3

With a larger database I can go 6+ months before repeating any quotes.

Randomizing Tweet Times

Currently CloudWatch triggers Lambda every day at the same time. But tweets may have better visibility if sent at random times throughout the day.

One way to achieve this is to have the function pick a random hour each day and rewrite the CloudWatch event rule's schedule expression so the next run fires at that time. That little bit of unpredictability makes the account feel less robotic to followers.
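A sketch of that rule update, assuming the aws-sdk-go CloudWatch Events client (the rule name and hour range are illustrative, and the execution role would also need events:PutRule permission):

package main

import (
    "fmt"
    "math/rand"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/cloudwatchevents"
)

// rescheduleTweet picks a random hour and rewrites the rule's cron
// expression so the next day's tweet fires at a different time.
func rescheduleTweet(ruleName string) error {
    hour := 12 + rand.Intn(12) // 12:00-23:00 UTC, roughly daytime in Toronto
    expr := fmt.Sprintf("cron(0 %d * * ? *)", hour)

    svc := cloudwatchevents.New(session.Must(session.NewSession()))
    _, err := svc.PutRule(&cloudwatchevents.PutRuleInput{
        Name:               aws.String(ruleName),
        ScheduleExpression: aws.String(expr),
    })
    return err
}

func main() {
    if err := rescheduleTweet("daily-quote-tweet"); err != nil {
        fmt.Println("failed to reschedule:", err)
    }
}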

Analyzing Tweet Engagement and Bot Performance

It would be useful to track metrics on how well received the quotes are – clicks, retweets, mentions, followers gained, etc.

I can print analytics data to CloudWatch Logs from Lambda, then visualize trends in tools like Datadog or Grafana. Debugging and improving the bot through data analysis would provide some nice insights.
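As a starting point, the function could emit one structured log line per tweet that those tools can parse later (a sketch; the fields are just what I might track, not an existing schema):

package main

import (
    "encoding/json"
    "log"
    "time"
)

// tweetMetrics is one structured record per posted quote, written to
// CloudWatch Logs where it can be queried or forwarded to a dashboard.
type tweetMetrics struct {
    Timestamp time.Time `json:"timestamp"`
    QuoteID   int       `json:"quote_id"`
    TweetID   string    `json:"tweet_id"`
    Success   bool      `json:"success"`
}

// logMetrics writes the record as a single JSON line.
func logMetrics(m tweetMetrics) {
    b, err := json.Marshal(m)
    if err != nil {
        log.Printf("could not marshal metrics: %v", err)
        return
    }
    log.Println(string(b))
}

func main() {
    logMetrics(tweetMetrics{Timestamp: time.Now(), QuoteID: 42, TweetID: "1234567890", Success: true})
}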

Error Handling, Logging and Troubleshooting

No serverless application is immune from issues. My bot code needed robust logic to handle errors from the Twitter API or CloudWatch events failing to trigger the function.

In my Go handler, I wrap the external calls so I can catch panics and non-2xx HTTP responses:


func handler() {

    // Recover from any unexpected panic so the invocation exits cleanly
    // instead of crashing the function.
    defer func() {
        if r := recover(); r != nil {
            log.Println("Unexpected panic occurred:", r)
        }
    }()

    tweet, err := getRandomQuote()
    if err != nil {
        log.Printf("Unable to get quote: %v", err)
        return
    }

    resp, err := twitterClient.PostTweet(tweet)
    if err != nil {
        log.Printf("Failed to tweet: %v", err)
        return
    }
    if resp.StatusCode > 299 {
        log.Printf("Twitter API returned status %d", resp.StatusCode)
    }
}

This ensures the function doesn't crash if quote retrieval or tweeting fails, and it returns early instead of trying to post a tweet it never got. I can handle cases like the API being unreachable.

All logs stream to CloudWatch Logs which provides timestamps, filtering, and analysis. This proved invaluable for troubleshooting runtime issues during testing. Serverless applications shift traditional monitoring to logs-based observation.

I also implemented retry logic with exponential backoff that attempts the tweet up to 3 times if Twitter rate-limits the request. This helps smooth out transient API errors.
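A simplified sketch of that retry loop (the postTweet callback and delay values are illustrative; the real code also distinguishes rate-limit responses from other failures):

package main

import (
    "errors"
    "fmt"
    "time"
)

// postWithRetry tries to post the tweet up to maxAttempts times, doubling
// the wait between attempts (1s, 2s, 4s, ...) whenever an attempt fails.
func postWithRetry(postTweet func(string) error, tweet string, maxAttempts int) error {
    delay := time.Second
    var lastErr error
    for attempt := 1; attempt <= maxAttempts; attempt++ {
        lastErr = postTweet(tweet)
        if lastErr == nil {
            return nil
        }
        if attempt < maxAttempts {
            fmt.Printf("attempt %d failed: %v; retrying in %s\n", attempt, lastErr, delay)
            time.Sleep(delay)
            delay *= 2
        }
    }
    return fmt.Errorf("giving up after %d attempts: %w", maxAttempts, lastErr)
}

func main() {
    // Example usage with a stand-in postTweet that always fails.
    err := postWithRetry(func(t string) error { return errors.New("rate limited") }, "hello world", 3)
    fmt.Println(err)
}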

Security Considerations for Serverless Bots

Any publicly exposed application needs to be developed defensively with security in mind. Here are a few best practices I followed:

Rotate API Tokens

Storing secrets in environment variables leaves the potential for leaks or exploits. I implemented a 90-day rotation policy on the Twitter tokens; new credentials are generated without any downtime.

Operate Under Least Privilege

The IAM role granted to Lambda should follow the principle of least privilege. Only the specific permissions required for operation are allowed. This limits the blast radius if the function were to ever get compromised.

Utilize VPC Isolation (advanced)

Placing Lambda functions inside an Amazon VPC gives them private IP addresses in an isolated network environment, adding extra logical separation between components. (Note that a function inside a VPC then needs a NAT gateway or similar route to keep reaching the public Twitter API.)

Of course, over-engineering security too early can stall product iteration. The level of protection should match the data sensitivity – a quote-tweeting bot likely doesn't warrant extremely strict controls from day one.

Final Thoughts

In closing, I've enjoyed setting up this serverless cron-based Twitter bot with AWS Lambda. The loose coupling of functions and triggers provided a resilient way to run scheduled jobs.

Lambda keeps costs down because I only pay for compute duration, and it eliminates infrastructure management overhead. The bot's tweets can also scale seamlessly if any happen to go viral one day!

You can view the source code for running this bot in this GitHub gist. I welcome you to reuse it for your own awesome Twitter projects.

I plan to keep enhancing the @freeCodeCampTO bot over time with more quotes and intelligence. Let me know if you have any other feature requests!
