Saturday, March 12, 2022

The Cloud Resume Challenge

I haven't posted in many years, but I'm still very much involved in tech. Lots of things have changed over the years, especially tech roles. When I first started this blog, Cisco and network engineering were the hottest things around (hence the name). But now many IT roles are merging into one, with most requiring some form of automation, cloud, coding, or a combination of the three.

I'm still continuing my education, including deep dives into coding, cloud, and automation. After hundreds of hours and dollars spent, I knew I needed a way to showcase my latest skill set. A quick search turned up a challenge that would do just that: the Cloud Resume Challenge.


Completed design for shawnmooreresume.com



To preface, the Cloud Resume Challenge touches on many of the skills a cloud or DevOps engineer will come across, though some areas of the challenge dig deeper than others. For example, building the front end takes far less time than some of the back-end steps. So even after the challenge is complete, you've only seen the tip of the iceberg for some of the tech you've touched. Still, it's a great way to showcase your cloud skills alongside your resume of work.

Designing the Front End

The first part of the challenge is essentially to map out how you're going to tackle it and in which order, including which cloud provider you're going to target; I chose AWS. Since I work with Azure daily, I typically go out of my way to experiment with AWS to keep my multi-cloud toolset sharp. There are 16 tasks, and after you read through them, it makes sense to complete certain tasks before others; other times it makes sense to start tasks simultaneously. I completed the majority of the tasks in order, grouped into three multi-part sections.

The first set of tasks was to create the front end, which was more of a creative exercise than anything. This consisted of buying a domain name (shawnmooreresume.com), uploading your HTML and CSS code to S3, and finally deploying the site through CloudFront for all to see.

The trickiest part for me was CloudFront, due to needing an SSL certificate installed in the right format. I had to scrap my CloudFront distributions multiple times before realizing that my Route 53 DNS config was still pointing at my S3 bucket during testing and not at CloudFront. The second hardest part was getting a visitor counter working with JavaScript; since I didn't know JavaScript, it was all trial and error.
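
For anyone who hits the same DNS snag, here's a minimal Boto3 sketch of the fix, not my exact commands; the hosted zone ID and CloudFront domain are placeholders, while Z2FDTNDATAQYW2 is the fixed hosted zone ID AWS uses for all CloudFront alias records:

    import boto3

    route53 = boto3.client("route53")

    # Placeholders -- substitute your own hosted zone and distribution.
    HOSTED_ZONE_ID = "Z0000000EXAMPLE"
    CLOUDFRONT_DOMAIN = "d1234abcd.cloudfront.net"

    # Repoint the apex record from the S3 website endpoint to CloudFront.
    route53.change_resource_record_sets(
        HostedZoneId=HOSTED_ZONE_ID,
        ChangeBatch={
            "Changes": [{
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "shawnmooreresume.com.",
                    "Type": "A",
                    "AliasTarget": {
                        # Fixed zone ID shared by every CloudFront alias
                        "HostedZoneId": "Z2FDTNDATAQYW2",
                        "DNSName": CLOUDFRONT_DOMAIN,
                        "EvaluateTargetHealth": False,
                    },
                },
            }]
        },
    )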


Designing the Back End

It took me a week to knock out the front-end part of the challenge. Once finished, it was time to dive into the meat of the challenge: the back-end design. This involves getting four of the challenge tasks (Tasks 7-10) to work together in harmony. You're essentially getting the front end to talk to the back-end database and vice versa. The hardest part for me was gluing the two together.

I started with Task 8, building the DynamoDB database, since it can be completed standalone. The database is a small table that stores the visitor counter. It took a few tries to get the partition key and visitor item into the right format. Rather than reading the table items left to right, it made more sense when I thought of the table top-down.
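
This isn't my actual table, but a minimal Boto3 sketch of one like it could look as follows; the table, key, and attribute names are illustrative choices, not anything the challenge prescribes:

    import boto3

    dynamodb = boto3.resource("dynamodb")

    # Hypothetical table: one partition key ("id") plus a numeric counter.
    table = dynamodb.create_table(
        TableName="visitor-counter",
        KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
        AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
        BillingMode="PAY_PER_REQUEST",
    )
    table.wait_until_exists()

    # Seed the single item that holds the count.
    table.put_item(Item={"id": "visits", "count": 0})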

I then circled back to Task 7 and tested incrementing the visitor counter using JavaScript. My method was to get the counter working on its own, without any database; instead, JavaScript used the visitor's local storage to keep track of their visits to the site. Once this was sorted, it was time to get the glue between JavaScript and DynamoDB working.

I split the glue between two tasks. First I needed to get Lambda configured to pull a response back from DynamoDB. Most of my time was spent here coding Python. After many tests and failures, I learned that the code needed was simpler than I initially made it out to be. All that's needed is a small function that increments the counter in DynamoDB on each invocation. With the Boto3 module, it was straightforward to get a response back that could be handed off to JavaScript.
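
This isn't my exact code, but a minimal sketch of a Lambda handler in that spirit, reusing the hypothetical visitor-counter table from above, might look like:

    import json
    import boto3

    table = boto3.resource("dynamodb").Table("visitor-counter")

    def lambda_handler(event, context):
        # Atomically increment the counter and read back the new value.
        response = table.update_item(
            Key={"id": "visits"},
            UpdateExpression="ADD #c :one",
            ExpressionAttributeNames={"#c": "count"},
            ExpressionAttributeValues={":one": 1},
            ReturnValues="UPDATED_NEW",
        )
        count = int(response["Attributes"]["count"])
        return {
            "statusCode": 200,
            # CORS header so the browser-side JavaScript can read this.
            "headers": {"Access-Control-Allow-Origin": "*"},
            "body": json.dumps({"count": count}),
        }

The nice part of update_item with an ADD expression is that the increment happens inside DynamoDB itself, so there's no read-then-write race if two visitors hit the site at the same moment.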

Once Lambda to DynamoDB was sorted, the last piece was connecting JavaScript to Lambda. An API accomplishes this request, and AWS API Gateway is a wizard-driven service that lets you create the API you need.

After testing API to Lambda, then Lambda to DynamoDB, the last step is to update the JavaScript to invoke the API, rather than the user's local device, to get and update the visitor counter. I had the most trouble getting the counter display on my webpage to show the right key value. After many hours of trial and error, I nailed this down and my website was working as designed.
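
Before touching the front end at all, it's worth poking the endpoint directly. A quick sanity check of the whole API-to-Lambda-to-DynamoDB chain might look like this, assuming the API is wired to accept a simple POST; the invoke URL is a placeholder for your own API Gateway endpoint:

    import requests

    # Placeholder -- substitute your own API Gateway invoke URL.
    API_URL = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/count"

    resp = requests.post(API_URL)
    resp.raise_for_status()

    # Each call should come back one higher than the last.
    print(resp.json()["count"])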

Completing the Final Steps

With a fully functioning serverless resume website, it can be very tempting to skip the last handful of steps. But there's a lot of value in at least investigating and researching the topics at hand. I'll be honest: code testing is where I have the most trouble. I understand the various methods of testing (unit, integration, etc.) and can even write sample tests, but I have difficulty applying them to my environment. At some point, I'll come back and focus on cleaning up the smoke-test setup.
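
As a starting point, a smoke test doesn't have to be fancy. Here's a minimal pytest sketch against the live endpoint, using the same placeholder URL as before:

    import requests

    API_URL = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/count"

    def test_counter_endpoint_returns_a_number():
        # Smoke test: the API answers and hands back an integer count.
        resp = requests.post(API_URL)
        assert resp.status_code == 200
        assert isinstance(resp.json()["count"], int)

    def test_counter_increments():
        # Two calls in a row should move the counter forward.
        first = requests.post(API_URL).json()["count"]
        second = requests.post(API_URL).json()["count"]
        assert second > first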

Automation is the goal of the final few steps, including turning your architecture into repeatable, deployable code (IaC). I'm guilty of not finishing this part yet either. The IaC portion calls for using AWS SAM templates; however, I've been picking up Terraform in my spare time and will go that route instead. If you check my GitHub, you'll see that I've built out an Apache web server front-end and back-end template with Terraform.

Setting up the GitHub repository and GitHub Actions was a straightforward process. There are many examples out there on how to set up a runner to deploy your config to AWS in fewer than 30 lines of code. The idea is to NOT store your secrets or access keys directly in code. Instead, you'll want to use environment variables and take advantage of GitHub's encrypted secrets to store these credentials.
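
The same principle carries into any deploy step you script yourself. As a sketch (not my actual pipeline), a small Python upload step can lean on Boto3 reading AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the environment, assuming the workflow maps the repo's stored secrets into those variables, so nothing sensitive ever lands in the code. The bucket name and site directory are placeholders:

    import mimetypes
    from pathlib import Path

    import boto3

    # Boto3 picks the AWS credentials up from environment variables,
    # so none of them appear anywhere in the repository.
    s3 = boto3.client("s3")
    BUCKET = "shawnmooreresume.com"  # placeholder bucket name

    # Upload every file in the site directory with a sensible Content-Type.
    for path in Path("site").rglob("*"):
        if path.is_file():
            content_type, _ = mimetypes.guess_type(str(path))
            s3.upload_file(
                str(path),
                BUCKET,
                str(path.relative_to("site")),
                ExtraArgs={"ContentType": content_type or "binary/octet-stream"},
            )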

Summary

Check out my resume website at shawnmooreresume.com to see the final results. Note the visitor counter at the bottom of the page, which displays the latest hit count stored in the DynamoDB table.

The final step is what you're reading now: create a blog post! Blogging has always helped me recap what I learned and confirm my understanding of the material. I recommend you do the same if time permits; it's surprising how much you can recall once you start typing it out into a post.

I hope this overview was as helpful to you as it was to me. Let me know in the comments below if you're thinking about taking the Cloud Resume Challenge as well. Until next time.





Tuesday, May 1, 2018

Cert Passed - AWS Solutions Architect Associate

I was able to get through my first AWS certification yesterday with a successful pass. The test experience was quite a bit different from a Cisco or even a Juniper certification. Typically when I sit for a test, the testing software has a bit of lag due to the low-spec computers test centers use. This wasn't the case with this exam; it was smooth and intuitive, with no scary moments where I thought hitting the Next button would cause the entire PC to melt down. Also, unlike Cisco exams, you have the ability to flag questions and go back to review them; Juniper provides this same feature as well.

Overall the exam was fair, but it is without a doubt a mile wide and an inch deep (maybe two). This exam is also a recently released new version that focuses more on scenario-based questions rather than true/false-style questions. The biggest difficulty for me was answering questions on topics my study process hadn't covered or technologies I wasn't comfortable enough with, such as API security features within AWS.

If I had one piece of advice, it would be to take a look at the A Cloud Guru course. It covered 80% of the material needed, even for this new test version. My plans now are to focus time on family and my career for the remainder of 2018. Next year I'm going to take a hard look at continuing my college education, more specifically a Bachelor's in Cyber Security of some form.

With that said, what are your certification/education goals for this year?

Monday, January 22, 2018

What is AWS Direct Connect?

As I study for the AWS Solutions Architect Associate certification, one service that stood out to me was AWS Direct Connect. I didn't completely understand how it differed from a VPN connection or what its use case was. Here are a few high points about this service:

  • Provides a direct connection between your internal network and your AWS environment.
  • The connection is made using either 1Gbps or 10Gbps Ethernet over fiber.
  • Uses both 802.1Q VLANs and the BGP routing protocol.
  • Supports IPv4 and IPv6 addressing. However, the maximum IP MTU supported is 1500 bytes, for a maximum frame size of 1522 bytes (14-byte Ethernet header + 4-byte VLAN tag + 1500-byte IP datagram + 4-byte FCS).
Interesting. It seems as if this direct connection is some type of VRF connection between the on-premises environment and AWS. You essentially have your router connect directly to an AWS router in a specific region via fiber. This comes with a lot of caveats, as you can probably see. How do you go about running fiber from your router to an AWS router? Well, one requirement is that your network is colocated with an AWS Direct Connect location. You can use this link for current Direct Connect locations:

AWS Direct Connect Geographic Locations


There's a good chance you're not colocated with AWS, so does that mean you're out of luck? Not at all; the easier solution is to use a third-party AWS Partner that offers this service. Partners can provide additional flexibility, such as cabling and location independence for Direct Connect, along with offering lower speeds at a lower cost, such as 100Mbps, 500Mbps, etc.


Direct Connect using AWS Partner



However, if you need to traverse a Partner just to use Direct Connect, it may make more sense to use one of the many VPN options AWS offers. Direct Connect is a great solution for real-time data such as video and voice, along with moving huge amounts of data between your network and AWS. It may be worth testing whether or not real-time data works sufficiently over AWS VPN, as internet bandwidth is cheap nowadays.

Monday, January 15, 2018

How to Pass an IT Certification on your First Try

I've studied for and passed multiple IT certifications from multiple vendors over the last 10 years. As time progressed, I've become more efficient at gathering the tools needed to pass an exam on the first try. This isn't to say that I haven't failed or come close to failing an exam a few times along the way; as a matter of fact, I seemed to learn the most when I failed an exam. In today's quick topic I'll discuss the process I use to study for entry-level through professional-level exams. I won't include Expert-level certifications in this group, as they're a whole different beast. Yes, I'm talking to you, CCIE and F5-CSE!


WHAT DO I NEED TO LEARN?
For starters, let's begin with what the exam objectives will cover. This is something I see quite a few colleagues skip over when they begin studying for a certification. Knowing the objectives will allow you to set goals you can follow along the way before exam date. For example, once you've touched on every objective and topic in the exam at least once, that's a milestone you can use to assess how ready you are for the test. Every vendor typically has a page that lays out the objectives and topics you need to know before even picking up your first book.

For example, Amazon has a PDF that lists what you need to know and the percentage each topic holds on the Solutions Architect test:



WHERE DO I FIND STUDY MATERIAL?
Now that you know what you need to study, next we'll need to find study resources for the exam. This is where forums such as www.techexams.net or Reddit come in handy. I usually spend a few days researching methods others used, along with the tools available to me. I'm lucky enough to have CBT Nuggets for videos and SafariBooks for cert books at my disposal, but even then that's sometimes not enough. You'll want to add labs and flashcards (Anki) to the mix as well. Vendors oftentimes have free material too. For example, I was able to study for my Juniper certs using nothing but the free books and practice tests they offered; all in, it only cost me $50 to take their certification!

HOW DO I STUDY FOR THE EXAM?
Now that you have your blueprint and resources, the next and longest step is actually studying for the certification. This really comes down to personal preference, but for me this is where my secret sauce comes into play. I use the following 3-step method when I study for a certification:

1. Start with the most high-level material, usually videos such as CBT Nuggets. Take notes as you watch the videos; these will be placed into Anki flashcards once finished.

2. Once the videos are finished, create your initial flashcards, then pop your head into the reading material. Hopefully the vendor has some type of official certification book; otherwise, boring whitepapers it is! Again, take notes as you progress through the material. As you finish each chapter, this is where your labs come into play. Either attempt to recreate the examples listed in the chapter or, even better, come up with your own scenarios and get the environment working as expected.

Along with labs, add your notes to flashcards at the end of each chapter. I should also mention that you should be studying flashcards EVERY SINGLE DAY until your exam date. Reading is a slog, and rightfully so; that's where I really start to hone in and pick up most of my knowledge about the objectives at hand. I usually can't wait to take the exam because the repetition on the same topics starts to get old at about this point. I consider this a good sign that it's about time to sit for the certification.

3. While I'm still studying flashcards EVERY SINGLE DAY, I go back to any high-level material I can find. I'll look for YouTube how-to videos, exam caveats, and any additional lab examples I can find on the interwebs. At the same time, I'm also locking down and scheduling the exam date, usually 2-4 weeks away. During this crunch-time window I usually feel overly prepared and actually slow down my studying a few days before the exam. The only thing I'm doing at that point is studying flashcards EVERY SINGLE DAY!



There you have it, folks. For the very last step, I walk into my nearest Pearson VUE location on game day, ace the exam, and walk out calm and collected. As I head back home, I put any thoughts of certification out of my mind for at least two months (otherwise my family would kill me).

So that was my method; how do you typically study for a certification? Let me know in the comments below!







Monday, January 8, 2018

The New Age Old Question, Will the Cloud Replace IT Workers?

This is one of the more controversial topics. As businesses look more toward the cloud to get rid of aging IT infrastructure, what happens to the workers who used to manage this infrastructure? As shown in the graphic below, company budgets for cloud services are increasing each year:



Now for my personal opinion: I don't think we IT professionals will feel too much of a difference over the next 5-10 years. What I see happening is that we're no longer dealing with the heartaches of installing and racking physical hardware, which can literally take hours per device depending on what you're installing. Lower-skilled jobs such as rack-and-stack may shrink, but even the Amazons and Googles of the world still need to install hardware in their data centers for their software.

I see cloud as a great opportunity for just about everyone in the IT field, whether you're on the database team, the helpdesk, or a network engineer. What cloud is doing is making your job more efficient, freeing you to focus on higher-level tasks. Instead of spending three days spinning up switches, VLANs, etc., that time can go toward more important duties such as design and documentation. No longer do you need to deal with a late-night emergency change to increase bandwidth to a SQL server for the DB team. Spend a few hours scripting your cloud environment to scale the database infrastructure out and back in as needed. That's it, you're done; hand over the keys to the database manager!

Even as we start to automate server and network provisioning, whether in the cloud or on-premises, we still need techies to manage it all. Jumping ahead of the game and learning how powerful and affordable these new tools are will set you apart for years to come. How much longer can IT folk hide in their cubicles manually changing guest wireless passwords or downloading and installing OVAs on VM hosts? I bet at least half of the tasks we do every day could be eliminated or at the very least automated. It's up to us to keep pushing technology forward to make our day jobs easier, not to be scared of losing touch with the tech of the past.

Do you agree, disagree, not sure? Comment below; I would love to hear from my fellow IT folk and continue the discussion!

Sunday, December 31, 2017

2017 Recap

The year has come to a close; below is a recap of the goals I completed in 2017:

    • Completed the CCDP exam: This was a big one for me, as my CCNP was set to expire this year. The difficulty studying for this certification was at the same level as the CCNP, but for different reasons. While the CCNP requires hours of labs and deep dives into CLI configuration, the design exams are on the opposite end of the spectrum: they require hours of tedious reading of the most obscure white papers and videos. What got me through this exam by far had to be the note-taking and the Anki flashcards I practiced every day.

    • Implemented a Wireless Solution from Scratch: By happenstance, I've always been listed as the SME for the wireless solution at the few companies I've worked for, yet I never had the opportunity to build out a wireless controller from scratch, survey and deploy APs, and configure the wireless network. I had my chance a few months ago with Aruba, and it was a good experience. If nothing else, it gives me the confidence to deploy wireless on a larger scale, which is the plan for next year.

    • Created useful Python scripts: If you look at the right-hand menu, you'll see my meager attempts here. While these tools are far from optimized, they help get my job done when needed. After finishing up the AWS Solutions Architect cert, I plan on digging deeper into Python scripting.

    • Finally began Amazon Web Services learning: For the past year I felt like I was falling behind current technology trends. There's only so much OSPF and Spanning Tree you can learn and still provide value to a business. If you look at the latest IT business trends, the shift from on-premises to cloud is a huge one; I honestly don't know of a decent-sized business that's not at least looking at cloud infrastructure. To get ahead of the curve, I've jumped head first into Amazon's cloud services.

    • There's no better way to do this than by studying for a certification to help track your learning progress. I expect to have the AWS Solutions Architect certification completed in early 2018. I'm currently studying the official study guide, which is awesome.


    • Committed more time to blogging: Honestly, this wasn't on my initial list of 2017 goals. But earlier this month, I realized that I'd neglected a very useful tool that's helpful not only for my certification studying but also for my career. It's been a good December getting all of my thoughts and notes onto digital paper. I plan on keeping this momentum going throughout 2018.


Well, that's it, not too long of a list. I have even more goals for 2018; don't worry, I'll be sharing them in the near future. Let me know below what your 2017 goals were and whether you completed all of them!

Monday, December 25, 2017

AWS Certified Solutions Architect - Progress Report and Notes

At this time, I've finished the second part of my study plan for the AWS Certified Solutions Architect Associate exam. As you may remember, I wanted to knock out the CBT Nuggets videos before digging into SafariBooks to read the AWS Certified Solutions Architect official study guide.






Now it's time to collect my notes from CBT and move on to the reading portion. Below are some high level notes I've taken:

AWS Infrastructure:
  • Uses regions with availability zones; zones are redundant
  • Edge Locations are caching endpoints for content delivery (CDN/CloudFront)

Foundation Services:
  • Compute: EC2, Lambda, Auto-Scaling (Regions)
  • Networking: Load-balancing, Route53, VPC (Availability Zones)
  • Storage: S3, Block Storage, Glacier, EFS (Edge Locations)

Platform Services:
  • Databases: DynamoDB, RDS, Redshift
  • Analytics: Kinesis, EMR, Data Pipeline
  • Deployment: Elastic Beanstalk, CodeDeploy
  • Mobile: Cognito, SNS

Storage Options:
  • Instance Store Backed: Physical storage attached directly to the instance. Ephemeral, so data does not persist.
  • EBS Backed (recommended): Persistent storage using EBS.

Simple Storage Service (S3):
  • Account uses buckets (max 100 buckets)
  • Objects are files within buckets (virtually limitless storage)
  • Can host static web pages with S3
  • Buckets are globally unique names created in a region
  • Cannot nest buckets, they can only be Top-level containers
  • Objects can be up to 5TB in size
  • Bucket+Object+Version maps to unique URL
  • Access control can be done at bucket or object level
  • Not meant as primary storage for services (i.e. Instances)
  • Region specific & supports REST & SOAP
  • Server side encryption of data at rest
  • Three access control methods: IAM policies, bucket policies, and ACLs; all three can be combined
S3 Storage Classes:
  • Standard: most expensive
  • Infrequent Access
  • Glacier: least expensive
  • Reduced Redundancy
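
To make the bucket and object notes above concrete, here's a minimal Boto3 sketch; the bucket name is a placeholder (bucket names are globally unique, so pick your own), and the region is an arbitrary choice:

    import boto3

    # Region pinned so the location constraint below is valid.
    s3 = boto3.client("s3", region_name="us-west-2")
    bucket = "my-example-notes-bucket"  # placeholder, must be globally unique

    # Buckets are top-level containers created in a region (no nesting).
    s3.create_bucket(
        Bucket=bucket,
        CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
    )

    # Objects are files keyed within the bucket; a storage class can be
    # chosen per object, e.g. Infrequent Access.
    s3.put_object(
        Bucket=bucket,
        Key="notes/aws.txt",
        Body=b"Edge Locations cache content",
        StorageClass="STANDARD_IA",
    )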
Elastic Block Store (EBS):
  • Storage sizes from 1GB to 16TB (up to 1TB for magnetic)
  • Can take snapshots into S3 at any time
  • Use for DBs, applications, & root volumes
  • Backups are incremental
  • Persistent, unlike ephemeral instance store; a volume attaches to one instance at a time
  • Similar to a SAN
VPC:
  • Security groups police traffic at instance level
  • Network ACLs police traffic at subnet level
  • Route tables are similar to VRFs
  • Default VPC uses the 172.31.0.0/16 subnet with IPv6 disabled
  • Use NAT Gateway or NAT instance for private to public routing
Identity and Access Management (IAM):
  • Policies are not cumulative, entities give up old permissions when assuming a role
  • Three types of policies (Managed, Custom, & Inline)
Non-Relational DB:
  • Top-level organized into 'Tables'
  • Tables contain 'Items'
  • Items contain 'Attributes'
Auto-Scaling:
  • Involves Elastic LB, CloudWatch (provides info to AS), & Auto Scaling (manages the group)
  • Auto-Scaling includes the following:
  1. Launch Config: Config of the EC2 instances to be scaled
  2. Auto-Scaling Group: Defines how much to scale out and in
  3. Scaling Lifecycle: Defines when to scale out or in, along with lifecycle hooks
Elastic Load Balancing (ELB):
  • Can load balance across availability zones
  • Cross zone load balancing: Allows you to distribute traffic evenly across all zones
  • Can be internet facing or internal only
CloudWatch:
  • Has metrics for most AWS products and services
  • Can push metrics via REST or CLI
  • Can use SNS or Auto-Scaling
CloudFormation:
  • Method to create or manage a collection of resources
  • Built with JSON or CloudFormer
  • Infrastructure as code
  • Uses components called "Resources" and "Parameters"
  • Git is recommended for version control
  • Stack will rollback if there's a problem with its config
  • Resources are deleted when the stack is deleted
  • "WaitCondition" is used to ensure no 'order of operations' issues