Rust Cargo Lambda serverless capstone challenge

There are some really exciting things about being able to proxy local models through a provider like Ollama, which could then dispatch to DeepSeek-R1. Also, depending on the load on this CLI, you could burst out to the cloud. Calling Bedrock directly could be one way to do it. A more sophisticated way, if you want to scale to a global level, is to use globally scalable components. A good example in the AWS layer here is SQS. What's interesting about a service like SQS is that a standard queue supports a nearly unlimited number of messages and very high throughput, so it's effectively a fire hose with near-infinite input. Now what could you do with this? Well, a ton of traffic could start pumping into SQS, and those SQS payloads could, through an event-based mechanism, invoke Lambda. And what could Lambda do? Lambda could invoke Bedrock via the Bedrock API in Rust. We could also look at CloudWatch to measure the metrics and make…
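To make the SQS-to-Lambda-to-Bedrock leg concrete, here is a minimal sketch of a cargo-lambda style handler in Rust. It assumes the lambda_runtime, aws_lambda_events, aws-config, and aws-sdk-bedrockruntime crates, treats each SQS record body as a raw prompt, and uses a hypothetical Claude model ID and request shape; it's a starting point for the capstone challenge, not the official solution.

```rust
// Cargo.toml (assumed): lambda_runtime, aws_lambda_events, aws-config,
// aws-sdk-bedrockruntime, serde_json, tokio
use aws_lambda_events::event::sqs::SqsEvent;
use aws_sdk_bedrockruntime::primitives::Blob;
use lambda_runtime::{run, service_fn, Error, LambdaEvent};
use serde_json::json;

// Hypothetical model ID; swap in whichever Bedrock model you target.
const MODEL_ID: &str = "anthropic.claude-3-haiku-20240307-v1:0";

async fn handler(
    client: &aws_sdk_bedrockruntime::Client,
    event: LambdaEvent<SqsEvent>,
) -> Result<(), Error> {
    // Each SQS record body is treated as a raw prompt string.
    for record in event.payload.records {
        let prompt = record.body.unwrap_or_default();

        // Minimal Anthropic Messages-style request body (assumed prompt format).
        let body = json!({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 256,
            "messages": [{ "role": "user", "content": prompt }]
        });

        // Invoke Bedrock via the Bedrock Runtime API.
        let response = client
            .invoke_model()
            .model_id(MODEL_ID)
            .content_type("application/json")
            .body(Blob::new(serde_json::to_vec(&body)?))
            .send()
            .await?;

        // Log the raw model output; Lambda's stdout lands in CloudWatch Logs,
        // which is where the metrics story picks up.
        println!(
            "model output: {}",
            String::from_utf8_lossy(response.body().as_ref())
        );
    }
    Ok(())
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    // Build one Bedrock client at cold start and share it across invocations.
    let config = aws_config::load_defaults(aws_config::BehaviorVersion::latest()).await;
    let client = aws_sdk_bedrockruntime::Client::new(&config);
    run(service_fn(|event| handler(&client, event))).await
}
```

Building and deploying with cargo lambda (for example, `cargo lambda build --release` followed by `cargo lambda deploy`), then wiring the queue to the function as an event source mapping, gives you the fire-hose-to-Lambda path described above, with CloudWatch capturing the logs and invocation metrics.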
