Discover the Latest AWS Innovations – 5 New Services Unveiled

In many big cities here in America, social decay has reached the point where items like deodorant and toothpaste need to be kept behind lock and key, and Walmart needs to lock its steaks behind barbed wire. But luckily, I have some good news – shoplifting will soon be a thing of the past, thanks to a biometric-surveillance-driven, AI-controlled economy powered by Amazon Web Services. Today, we’ll take a glimpse into the future with five awesome new tools announced at AWS re:Invent earlier this week.

It is December 1st, 2023, and you’re watching the Code Report. I remember 5 years ago when I could walk into the grocery store with 20 bucks in my pocket and walk out of there with an entire cart full of groceries, and still have 20 bucks left over in my pocket. But you can’t do that nowadays, because they keep all the good stuff locked up.

Back in 2018, Amazon announced its first cashier-less store. You walk in and shoplift whatever you want, and it will automatically charge your Amazon account. It’s super convenient and cost-effective for stores, but it comes at the cost of ubiquitous personal surveillance. Now, people still broke into these stores, and Amazon shut a bunch of them down in 2023. But if you own a grocery store yourself, AWS sells the tools you need to build your own dystopian shopping experience.

That brings us to one of the craziest new AWS tools – Amazon One, a palm identity service. Instead of authenticating with a username and password, you create a palm signature, which is unique to every human. This can then be used for authentication to get you into a building instead of an ID badge, or to pay for goods in a store instead of a credit card. In the future, you won’t just be able to walk into a store anonymously – you’ll need to authenticate first. “Welcome to Walmart, please scan your palm to shop our everyday low prices.” And you may not get access if you have a history of shoplifting. Even if you do try to shoplift, the exit doors aren’t going to open unless you have enough credits in your account to pay for that lab-grown meat. And even if you manage to smash down the door and escape, Amazon’s loss prevention drones will be right there to track you, at which point these new law enforcement robots will be authorized to terminate you, or at the very least reduce your social credit score.

All this tech requires a lot of computing power. Announcement number two was the latest generation of AWS Trainium chips. There’s a race going on right now between all the big tech companies to build AI chips. Google, Amazon, Microsoft, and Nvidia have all announced their own specialized chips for training massive AI models. A cluster of 100,000 of these Trainium chips is capable of training a GPT-4-sized model in a matter of weeks, instead of months.

Announcement number three was Amazon SageMaker HyperPod, which lets you take advantage of Trainium chips or, optionally, Nvidia GPUs. You just take your data and your code, drop it into HyperPod, push the play button, and it will distribute that work across hundreds or thousands of chips. Come back in a few weeks, and you should have your own custom GPT-4-like foundation model, along with what I can only imagine would be a seven-figure bill.
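
Under the hood, a HyperPod deployment is just a managed cluster you create through the SageMaker API. Here’s a minimal sketch of what that might look like with boto3; the cluster name, instance count, lifecycle script location, and role ARN are placeholder assumptions for illustration, not anything from the announcement.

```python
import boto3

# Minimal sketch: create a SageMaker HyperPod cluster via the CreateCluster API.
# Names, counts, the S3 path, and the role ARN are placeholders for your own account.
sagemaker = boto3.client("sagemaker")

response = sagemaker.create_cluster(
    ClusterName="my-training-cluster",  # hypothetical cluster name
    InstanceGroups=[
        {
            "InstanceGroupName": "trainium-workers",
            "InstanceType": "ml.trn1.32xlarge",  # Trainium-backed instances
            "InstanceCount": 16,                 # scale this up toward thousands of chips
            "LifeCycleConfig": {
                "SourceS3Uri": "s3://my-bucket/hyperpod-lifecycle/",  # cluster setup scripts
                "OnCreate": "on_create.sh",
            },
            "ExecutionRole": "arn:aws:iam::123456789012:role/HyperPodExecutionRole",
        }
    ],
)
print(response["ClusterArn"])
```

From there, HyperPod keeps the nodes healthy and recovers from hardware failures while your distributed training job (typically orchestrated with Slurm) runs across the instance group.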

Announcement number four was Q, not to be confused with Q*, the rumored AGI from OpenAI. Q is a chatbot for AWS customers, which is actually an awesome feature. AWS is incredibly complex, and it’s pretty easy to shoot yourself in the foot, so an AI that understands the context of your AWS account will be incredibly useful. It can even analyze and write code just like GitHub Copilot, as well as generate reports and analyze data on your account.

And that brings us to our final tool, which is also the most fun to play with – Amazon Bedrock and its new image generator. Bedrock allows you to try out a bunch of different foundation models like Stable Diffusion, Claude, Llama, and so on. In addition, Amazon has trained its own family of foundation models called Titan. It has a playground where you can try out all these different models, which is really convenient if you don’t have a high-performance machine with a GPU where you can download and run these models locally. I used it to compare Stable Diffusion XL to the Titan Image Generator, and although Titan is not quite as good as Stable Diffusion, it did produce some interesting high-quality results.
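
You can hit those same models programmatically through the Bedrock runtime API. Below is a rough sketch of generating an image with Titan using boto3; the model ID and request body fields follow Bedrock’s documented Titan image format as I understand it, so treat the exact names as assumptions and check the current docs.

```python
import base64
import json

import boto3

# Rough sketch: generate an image with the Titan Image Generator through the
# Bedrock runtime. Model ID and body shape are assumptions; verify before use.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "taskType": "TEXT_IMAGE",
    "textToImageParams": {"text": "a loss prevention drone patrolling a grocery store"},
    "imageGenerationConfig": {"numberOfImages": 1, "height": 1024, "width": 1024},
})

response = bedrock.invoke_model(
    modelId="amazon.titan-image-generator-v1",  # assumed model ID
    body=body,
)

result = json.loads(response["body"].read())
image_bytes = base64.b64decode(result["images"][0])  # images come back base64-encoded
with open("titan_image.png", "wb") as f:
    f.write(image_bytes)
```

Swapping the modelId (and the matching request body format) is all it takes to run the same prompt against Stable Diffusion XL or any other model in the catalog.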

What’s really cool about Bedrock, though, is that you can fine-tune the models with your own data and then offer them as an API fully managed by AWS, which is a great idea for your failed AI software-as-a-service side hustle that’ll make Amazon a bunch of money.
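
That fine-tuning workflow runs as a Bedrock model customization job. Here’s a hedged sketch of what kicking one off might look like; the bucket paths, role ARN, base model ID, and hyperparameters are placeholders, and only certain Bedrock models support customization.

```python
import boto3

# Hedged sketch: start a Bedrock model customization (fine-tuning) job.
# All identifiers below are placeholders for your own account and data.
bedrock = boto3.client("bedrock", region_name="us-east-1")

job = bedrock.create_model_customization_job(
    jobName="my-finetune-job",                           # hypothetical job name
    customModelName="my-custom-titan",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="amazon.titan-text-express-v1",  # assumed base model ID
    trainingDataConfig={"s3Uri": "s3://my-bucket/train.jsonl"},  # JSONL training examples
    outputDataConfig={"s3Uri": "s3://my-bucket/finetune-output/"},
    hyperParameters={"epochCount": "2", "batchSize": "1"},
)
print(job["jobArn"])
```

Once the job finishes, the custom model lives in your account and can be served through Bedrock (typically after purchasing provisioned throughput for it), with AWS handling the hosting.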

This has been the Code Report. Thanks for watching, and I will see you in the next one.