Nebraska.Code() Sessions tagged aws

AWS Lambda Signal Corps: Zombie Apocalypse Workshop

Zombies! Zombies have taken over major metropolitan areas. The AWS Lambda Signal Corps has built a communications system to connect remaining survivors. Come learn how AWS Lambda provides a platform for building event-driven microservices, all without the need to provision, manage, and scale servers. In this workshop, we will introduce the basics of using AWS Lambda to run code in response to events from Amazon DynamoDB, S3, and API Gateway. You'll work within a team to build a secure, scalable, fault-tolerant chat service with global reach from scratch using blueprints provided by us. Unfortunately, the blueprints provided only describe a very rudimentary communications system (the engineers of the project got mysteriously ill). We are looking to you and your team to add additional real-time life saving features (e.g., food cache locations, zombie motion detectors, undead counters) to the chat platform using Lambda functions.

Overview of the Workshop Labs

The Zombie Microservices Workshop introduces the basics of building serverless applications using AWS Lambda, Amazon API Gateway, Amazon DynamoDB, Amazon Cognito, Amazon SNS, and other AWS services. In this workshop, as a new member of the AWS Lambda Signal Corps, you are tasked with completing the development of a serverless survivor communications system during the Zombie Apocalypse.
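As a flavor of what the workshop covers, here is a minimal sketch of a Lambda handler that responds to a chat-message POST routed through API Gateway (Lambda proxy integration). The function name and the `message` field are illustrative assumptions, not the workshop's actual blueprint code:

```python
import json

def handler(event, context):
    """Minimal Lambda handler for a chat-message POST arriving via
    API Gateway proxy integration: parse the JSON body, validate it,
    and return an API Gateway-shaped response."""
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400,
                "body": json.dumps({"error": "invalid JSON"})}

    message = body.get("message")
    if not message:
        return {"statusCode": 400,
                "body": json.dumps({"error": "message is required"})}

    # In the full workshop the message would be persisted to DynamoDB
    # here; this sketch just echoes it back to the survivor.
    return {
        "statusCode": 200,
        "body": json.dumps({"ok": True, "message": message}),
    }
```

The same handler shape works for events from S3 or DynamoDB streams; only the structure of `event` changes.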

Speaker

Darren Lichty


Chief Engineering Officer, Panology Tech Solutions

Automating Docker-based Tasks in the Cloud using AWS Batch

Batch processing is a common, powerful pattern for high-CPU background workloads. Companies often use it for advanced simulations, rendering, media transcoding and processing, deep learning, and more. At Hudl, we're using AWS Batch to manage a video processing pipeline that includes a GPU-based deep learning algorithm.

AWS Batch is a recent addition to Amazon's cloud platform that makes it very simple to define and execute tasks without worrying about the infrastructure needed to make it happen. Once you define a task by providing a Docker image and necessary parameters, you can create hundreds of thousands of jobs, and let Batch deal with scaling, parallelization, and managing dependencies.
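The define-once, submit-many model described above can be sketched with boto3's `batch.submit_job` call. The function below only builds the submission parameters (job names, the `VIDEO_ID` environment variable, and queue/definition names are illustrative assumptions, not Hudl's actual pipeline):

```python
def build_transcode_job(video_id, job_queue, job_definition, depends_on=None):
    """Build the keyword arguments for boto3's batch.submit_job call
    for a single video-processing job, optionally depending on
    previously submitted job IDs."""
    kwargs = {
        "jobName": f"transcode-{video_id}",
        "jobQueue": job_queue,
        "jobDefinition": job_definition,
        # Per-job parameters are passed to the Docker container as
        # environment-variable overrides.
        "containerOverrides": {
            "environment": [{"name": "VIDEO_ID", "value": str(video_id)}],
        },
    }
    if depends_on:
        # Batch holds this job until every listed job ID has succeeded,
        # which is how dependency chains are expressed.
        kwargs["dependsOn"] = [{"jobId": jid} for jid in depends_on]
    return kwargs

# Submitting would then be one call per job, given AWS credentials:
#   import boto3
#   batch = boto3.client("batch")
#   batch.submit_job(**build_transcode_job("v123", "gpu-queue", "transcode:1"))
```

Because each submission is just a small parameter dict, creating hundreds of thousands of jobs is a loop; scaling and parallelization stay on the Batch side.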

In this talk, I'll walk through setting up Batch jobs (including some basic Docker images and everything on the Batch side), explain how Batch handles scheduling and dependencies, describe scenarios where Batch excels, and touch on some pain points we've experienced so far.

Hudl is still in the early stages of using Batch, but so far it's proven easy to use and very adaptable to our needs. We're planning to move more of our workloads into Batch, including thumbnail generation, video transcoding and processing, PDF generation, and more.

Speaker

Ryan Versaw


Software Engineer, Hudl

A Serverless WebApp Deployed to AWS as Code

In this session, we will show you how to deploy a simple (yet functional) serverless webapp to Amazon Web Services (AWS) using the Everything-As-Code approach. We will begin with a short review of the architecture itself, highlighting the AWS products used. Next, we will deploy the code that builds it all, showing you the AWS Console approach first. The AWS Console is not code though, so we will also show you the code-centric approach (spoiler: we are using a script for this). The deployment takes some time to complete, so while we wait, we will review the CloudFormation template(s), resources, and tools used to build this solution. If all goes well, we will have a functioning webapp. We will close out the demonstration by showing some updates made using the Everything-As-Code approach.
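To illustrate the Everything-As-Code idea, here is a deliberately tiny CloudFormation template defined in code: a single S3 bucket for static assets. The resource, output, and stack names are illustrative assumptions, not the session's actual code base:

```python
import json

# A minimal CloudFormation template expressed as a Python dict.
TEMPLATE = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "WebAppBucket": {
            "Type": "AWS::S3::Bucket",
        }
    },
    "Outputs": {
        "BucketName": {"Value": {"Ref": "WebAppBucket"}},
    },
}

def render_template():
    """Serialize the template for CloudFormation's TemplateBody parameter."""
    return json.dumps(TEMPLATE, indent=2)

# Deployment then becomes one scripted call, given AWS credentials:
#   import boto3
#   cfn = boto3.client("cloudformation")
#   cfn.create_stack(StackName="serverless-webapp",
#                    TemplateBody=render_template())
```

Because the whole stack lives in version-controlled code, the "some updates" at the end of the demo amount to editing the template and re-running the script.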

Links will be provided for the entire code base used during this session (hosted on GitHub). We may even have some experts from AWS there to answer a few questions.

Speaker

Darren Lichty


Chief Engineering Officer, Panology Tech Solutions