September 11th, 2024

AWS AI Stack – Ready-to-Deploy Serverless AI App on AWS and Bedrock

The AWS AI Stack is a boilerplate for serverless AI applications, featuring backend services, a React frontend, AI chat functionality, and easy deployment with the Serverless Framework. A live demo is available.


The AWS AI Stack is a comprehensive boilerplate project designed for developing serverless AI applications on AWS. It features a full-stack architecture that includes backend services such as API Gateway, AWS Lambda, and DynamoDB, along with a frontend built using React. The project supports AI chat functionality with streaming responses through AWS Bedrock models and operates on a true serverless model, allowing users to pay only for the resources they consume. Key functionalities include built-in authentication, multi-environment support, and CI/CD integration via GitHub Actions.

To get started, users need to install the Serverless Framework, set up AWS credentials, request access to AWS Bedrock models, and deploy services using the Serverless command. The architecture employs various AWS services, implements JWT token-based authentication, and provides APIs for user registration, login, and chat interactions. A live demo of the application is available at awsaistack.com, making it an ideal choice for developers aiming to create scalable AI applications while ensuring data privacy and security.
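The getting-started steps described above can be sketched as shell commands. This is a minimal sketch based on typical Serverless Framework usage; the stage name and credential placeholders are assumptions, not taken from the project's README:

```shell
# Install the Serverless Framework CLI globally (requires Node.js)
npm install -g serverless

# Set up AWS credentials (alternatively, use an AWS profile or `aws configure`)
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...

# Request access to the desired AWS Bedrock models in the AWS Console
# (Bedrock -> Model access) before deploying.

# Clone the boilerplate and deploy its services
git clone https://github.com/serverless/aws-ai-stack
cd aws-ai-stack
serverless deploy
```

Because the stack is fully serverless, deploying to a personal stage costs nothing while idle; charges accrue only when the deployed Lambda functions and Bedrock models are actually invoked.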

- AWS AI Stack is a boilerplate for serverless AI applications on AWS.

- It includes backend services like API Gateway and AWS Lambda, and a React frontend.

- The project supports AI chat functionality using AWS Bedrock models.

- Users can deploy services easily with the Serverless Framework.

- A live demo is available for users to explore the application.

8 comments
By @ac360 - 5 months
Increasingly bullish on AWS Bedrock.

• Devs forever want choice.

• Open-source LLMs are getting better

• Anthropic ships fantastic models

• Doesn't expose your app’s data to multiple companies

• Consolidated security, billing, config in AWS

• Power of AWS ecosystem

By @agcat - 5 months
You can check out this technical deep dive on serverless GPU offerings and pay-as-you-go pricing. It includes benchmarks around cold starts, performance consistency, scalability, and cost-effectiveness for models like Llama 2 7B and Stable Diffusion across different providers: https://www.inferless.com/learn/the-state-of-serverless-gpus... It can save months of your time. Do give it a read.

P.S: I am from Inferless

By @rmbyrro - 5 months
Last time I checked, Bedrock was quite expensive to operate at a small scale.

By @ethagnawl - 5 months
I have not read too deeply into this, but do any of these serverless environments offer GPUs? I'm sure there are ... reasons, but the lack of GPU support in Lambda and Fargate remains a major pain point for AWS users.

It's been keeping me wrangling EC2 instances for ML teams, but I do wonder how much longer that will last.

By @fitzgera1d - 5 months
Introducing the AWS AI Stack

A serverless boilerplate for AI apps on trusted AWS infra.

• Full-Stack w/ Chat UI + Streaming

• Multiple LLM Models + Data Privacy

• 100% Serverless

• API + Event Architecture

• Auth, Multi-Env, GitHub Actions & more!

Github: https://github.com/serverless/aws-ai-stack

Demo: https://awsaistack.com

By @brap - 5 months
I don’t get it. How many people need to deploy their own custom AI chat apps over standard models?

By @justanotheratom - 5 months
They need to go one step further and do what Replit did: an AI engineer generates code that gets deployed to this AWS AI Stack.