From the course: AI Engineering Use Cases and Projects on AWS: Production-Grade LLM Systems

Ollama DeepSeek-R1 and Claude

Here we have a capstone project that lets you proxy between two different providers. One is a local provider, Ollama, and the other is Bedrock, which lets you talk to multiple foundation models hosted on AWS. If we look at the structure here, first, we have a source directory in Rust containing several files. We also have a Cargo.toml file. The Cargo.toml file is pretty nice because, by default, every dependency is explicitly added to your project, and an editor like IntelliJ or some other editor can communicate directly with the libraries. You get a really good feedback loop: you can see the up-to-date versions, which updates are available, et cetera. So, for example, here we see, oh look, I could update if I wanted to, and it gives you a very clear picture of everything that's in your program. If we then move on to the rest of the project, we have the lib. And if we look at the lib…
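The core idea described above, choosing between a local Ollama provider and AWS Bedrock per request, can be sketched in Rust. This is a minimal, hypothetical illustration, not the course's actual code: the `Provider` enum, the `route` function, and the model-name prefixes are all assumptions.

```rust
// Hypothetical sketch of the proxy's routing decision; names and
// model-id prefixes are illustrative, not the course's actual code.

#[derive(Debug, PartialEq)]
enum Provider {
    Ollama,  // local model server (commonly on http://localhost:11434)
    Bedrock, // AWS-hosted foundation models
}

// Route requests for local DeepSeek-R1 models to Ollama; send
// everything else (e.g. Claude model ids) to Bedrock.
fn route(model: &str) -> Provider {
    if model.starts_with("deepseek-r1") {
        Provider::Ollama
    } else {
        Provider::Bedrock
    }
}

fn main() {
    assert_eq!(route("deepseek-r1:7b"), Provider::Ollama);
    assert_eq!(route("anthropic.claude-3-sonnet"), Provider::Bedrock);
    println!("routing ok");
}
```

A real proxy would wrap this decision in an HTTP handler and forward the request body to the chosen backend, but the routing step itself is just this kind of match on the requested model.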