Saturday, March 12, 2022

The Cloud Resume Challenge

I haven't posted in many years, but I'm still very much involved in tech. A lot has changed over the years, especially tech roles. When I first started this blog, Cisco and network engineering were the hottest things around (hence the name). But now many IT roles are merging into one, with most requiring some form of automation, cloud, coding, or a combination of the three.

I'm still continuing my education, including deep-diving into coding, cloud, and automation. After hundreds of hours and dollars spent, I knew I needed a way to showcase my latest skill set. A quick search turned up a challenge that would do just that: the Cloud Resume Challenge.


Completed design for shawnmooreresume.com



To preface, the Cloud Resume Challenge touches on many skills a cloud or DevOps engineer will come across, but some areas of the challenge dig deeper into the skill set than others. For example, building the front end takes far less time than some of the back-end steps. So even after the challenge is complete, you'll have seen only the tip of the iceberg for some of the tech compared to others. It's a great way to showcase your cloud skills alongside your resume of work.

Designing the Front End

The first part of the challenge is essentially to map out how you're going to tackle it and in which order, including which cloud provider you're going to target; I chose AWS. Since I work with Azure daily, I typically go out of my way to experiment with AWS to keep my multi-cloud toolset sharp. There are 16 tasks, and after you read through them, it makes sense to complete certain tasks before others; other times it makes sense to start tasks simultaneously. I completed the majority of the challenge in order, in three multi-part sections.

The first set of tasks was to create the front end, which was more of a creative exercise than anything: buying a domain name (shawnmooreresume.com), uploading your HTML and CSS code to S3, and finally deploying the site through CloudFront for all to see.

The trickiest part for me was CloudFront, because it needs an SSL certificate installed in the right format. I scrapped my CloudFront distributions multiple times before realizing that, during my testing, my Route 53 DNS config was still pointing at my S3 bucket and not CloudFront. The second-hardest part was getting a visitor counter working with JavaScript; since I didn't know the language, it was trial and error.


Designing the Back End

It took me a week to knock out the front-end part of the challenge. Once finished, it was time to dive into the meat of the challenge: the back-end design. This involves getting four of the challenge tasks (tasks 7-10) to work together in harmony. You're essentially getting the front end to talk to the back-end database and vice versa, and the hardest part for me was gluing the two together.

I started with task 8, building the DynamoDB database, since it can be completed standalone. The database is a small two-item table that stores the visitor count. It took a few tries to get the partition key and the visitor item into the right format; rather than reading the table items left to right, it made more sense when I thought of the table top-down.
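As a rough sketch of what I mean by thinking top-down, here's one way the table's key schema and counter item could look. The names (the table name, "id", "hits") are my own placeholders, not necessarily what the challenge prescribes:

```python
# Hedged sketch of the visitor-counter table layout (names are placeholders).
# These kwargs/items are the shapes boto3's create_table and put_item expect.
table_definition = {
    "TableName": "resume-visitors",
    "KeySchema": [{"AttributeName": "id", "KeyType": "HASH"}],  # partition key only
    "AttributeDefinitions": [{"AttributeName": "id", "AttributeType": "S"}],
    "BillingMode": "PAY_PER_REQUEST",  # no capacity planning for a tiny table
}

# Viewed top-down, each item is one record keyed by its partition key value:
visitor_item = {
    "id": "visitor_count",  # partition key value identifying the counter item
    "hits": 0,              # numeric attribute the back end increments
}
```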

I then circled back to task 7 and tested incrementing the visitor counter using JavaScript. My approach was to get the counter working on its own, without any database: JavaScript used the visitor's local storage to keep track of their visits to the site. Once this was sorted, it was time to get the glue between the JavaScript and DynamoDB working.

I split the glue between two tasks. First I needed to configure Lambda to fetch a response back from DynamoDB. Most of my time was spent here coding Python. After many tests and failures, I learned that the code needed was simpler than I had initially made it out to be: all that's needed is an update that increments the count in DynamoDB on each invocation. With the Boto3 module, it was straightforward to get a response back that could be returned to JavaScript.
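As a minimal sketch of that idea (not my exact code; the table name, key names, and attribute names are assumptions), a Lambda handler can lean on DynamoDB's `ADD` update expression to increment and read back the count in one call:

```python
# Hedged sketch of a visitor-counter Lambda. "resume-visitors", "id", and
# "hits" are placeholder names, not necessarily what the challenge used.
import json


def increment_count(table):
    """Atomically bump the visitor count and return the new value."""
    response = table.update_item(
        Key={"id": "visitor_count"},           # partition key of the counter item
        UpdateExpression="ADD hits :inc",      # ADD creates the attribute if missing
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",            # hand back the post-update value
    )
    return int(response["Attributes"]["hits"])


def lambda_handler(event, context):
    import boto3  # imported here; it's preinstalled in the Lambda Python runtime

    table = boto3.resource("dynamodb").Table("resume-visitors")
    count = increment_count(table)
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},  # CORS for the site's JS
        "body": json.dumps({"count": count}),
    }
```

Keeping the increment in a small helper also makes the logic easy to exercise with a stubbed table, without touching AWS at all.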

Once Lambda to DynamoDB was sorted, the last piece was to connect the JavaScript to Lambda. An API accomplishes this, and AWS API Gateway is a wizard-like service that walks you through creating the one you need.

After testing API Gateway to Lambda, then Lambda to DynamoDB, the last step was to update the JavaScript to invoke the API, rather than the user's local storage, to get and update the visitor counter. I had the most trouble getting the counter display on my webpage to show the right key value. After many hours of trial and error, I nailed this down and my website was working as designed.

Completing the Final Steps

With a fully functioning serverless resume website, it can be very tempting to skip the last handful of steps, but there's a lot of value in at least investigating and researching the topics at hand. I'll be honest: code testing is where I have the most trouble. I understand the many methods of testing (unit, integration, etc.) and can even create sample tests, but I have difficulty applying them to my environment. At some point, I'll come back and clean up my smoke-test setup.
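To illustrate the kind of smoke test the challenge asks for (the endpoint URL and JSON shape here are assumptions, not my real config), one approach is to hit the live API twice and assert the counter moved forward, with the JSON parsing split out so it can be unit-tested without AWS:

```python
# Hedged smoke-test sketch. The URL is a placeholder, and the {"count": N}
# body shape is assumed to match whatever the Lambda returns.
import json
import urllib.request

API_URL = "https://example.execute-api.us-east-1.amazonaws.com/prod/count"  # placeholder


def parse_count(body: bytes) -> int:
    """Pull the hit count out of the API's JSON response body."""
    return int(json.loads(body)["count"])


def smoke_test(url: str = API_URL) -> None:
    """Call the live API twice and confirm the visitor counter incremented."""
    with urllib.request.urlopen(url) as resp:
        first = parse_count(resp.read())
    with urllib.request.urlopen(url) as resp:
        second = parse_count(resp.read())
    assert second > first, "visitor counter did not increment"
```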

Automation is the goal of the final few steps, including turning your architecture into repeatable, deployable code (IaC). I'm guilty of not finishing this part yet either. The IaC portion calls for AWS SAM templates; however, I've been picking up Terraform in my spare time and will go that route instead. If you check my GitHub, you'll see that I've built out an Apache web server front-end and back-end template with Terraform.

Setting up the GitHub repository and GitHub Actions was straightforward. There are many examples out there of how to set up a runner that deploys your config to AWS in fewer than 30 lines of code. The key idea is NOT to store your secrets or access keys directly in code; instead, use environment variables and take advantage of GitHub's ability to store these credentials securely.
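As a sketch of that pattern (the workflow name, site path, bucket, and region are placeholders, not my actual setup), a minimal workflow pulls the keys from GitHub's encrypted secrets and syncs the site to S3 on every push:

```yaml
# Hypothetical GitHub Actions workflow; names and paths are placeholders.
name: deploy-resume
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: aws-actions/configure-aws-credentials@v1
        with:
          # Credentials live in the repo's encrypted secrets, never in code.
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - run: aws s3 sync ./site s3://shawnmooreresume.com --delete
```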

Summary

Check out my resume website shawnmooreresume.com to see the final results. Note the visitor counter at the bottom of the web page. The visitor counter displays the latest hit count stored in the DynamoDB table.

The final step is what you're reading now: create a blog post! Blogging has always helped me recap what I learned and confirm my understanding of the material. I recommend you do the same if time permits; it's surprising how much you can recall once you start typing it out into a post.

I hope this overview was as helpful to you as it was to me. Let me know in the comments below if you're thinking about taking the Cloud Resume Challenge as well. Until next time.