From the course: AI Engineering Use Cases and Projects on AWS: Production-Grade LLM Systems
Rust Cargo Lambda serverless capstone challenge
There are some really exciting things about being able to proxy local models through a provider, like, for example, Ollama, and then Ollama could dispatch to DeepSeek-R1. Also, by looking at the load with this CLI, you could burst out to the cloud. You could directly call Bedrock; that could be one way to do it. A more sophisticated way to do this, one that could scale to a global level, is to use globally scalable components. A good example of this in the AWS layer here is SQS. Now, what's interesting about a service like SQS is that it has no limits on how much data you can send it, so it's effectively a fire hose with infinite inputs. What could you do with this? Well, a ton of traffic could start pumping into SQS, and then those SQS payloads could, through an event-based mechanism, invoke Lambda. Now, what could Lambda do? Well, Lambda could invoke Bedrock via the Bedrock API in Rust. We could also look at CloudWatch to measure the metrics and make…
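The burst-to-cloud idea above can be sketched as a small Rust routing function: stay on a local Ollama model while load is low, and dispatch to Bedrock once local capacity is exceeded. This is a minimal illustration only; the `Backend` type, model names, and capacity threshold are assumptions for the sketch, not the course's actual code or any real API.

```rust
// Hypothetical request router for the local-vs-cloud burst pattern.
// All names here (Backend, route, the model identifiers) are
// illustrative assumptions, not a real library API.

#[derive(Debug, PartialEq)]
enum Backend {
    Ollama(&'static str),  // local model served by Ollama
    Bedrock(&'static str), // model ID on AWS Bedrock
}

/// Choose a backend from the number of in-flight local requests:
/// serve locally while under capacity, burst to Bedrock otherwise.
fn route(in_flight: usize, local_capacity: usize) -> Backend {
    if in_flight < local_capacity {
        Backend::Ollama("deepseek-r1")
    } else {
        Backend::Bedrock("anthropic.claude-3-haiku")
    }
}

fn main() {
    // Light load: handled by the local Ollama model.
    assert_eq!(route(2, 4), Backend::Ollama("deepseek-r1"));
    // Over capacity: burst out to Bedrock.
    assert_eq!(route(8, 4), Backend::Bedrock("anthropic.claude-3-haiku"));
    println!("router ok");
}
```

In the SQS variant the speaker describes, this decision would instead be made implicitly by the architecture: producers enqueue to SQS, and an event source mapping invokes a Lambda that calls Bedrock, so the queue absorbs the burst rather than a router.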
Contents
- Rust LLM project extension (6m 50s)
- Ollama DeepSeek-R1 and Claude (12m 2s)
- Open-source strategy walkthrough (3m 8s)
- YAML prompts with Rust walkthrough (2m 52s)
- Multimodel workflow walkthrough (4m 29s)
- Rust-model proxy routing walkthrough (3m 27s)
- Rust Cargo Lambda serverless capstone challenge (8m 46s)
- AI-engineering capstone (4m 2s)