Perplexity’s Post


Perplexity is the first to develop custom Mixture-of-Experts (MoE) kernels that make trillion-parameter models deployable with portability across cloud platforms. Our team has published this work on arXiv as Perplexity's first research paper. Read more: https://lnkd.in/gStC_SzJ

Non-technical 3-sentence summary: Perplexity developed new technology that makes extremely large AI models (with trillions of parameters) run faster and more affordably in the cloud. This breakthrough allows these massive models to work across different servers, not just ultra-specialized hardware, making them much more accessible. In short, it means more powerful AI can be deployed in real products and services, not just research labs.
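For readers curious what "Mixture-of-Experts" means in practice, here is a minimal, generic sketch of top-k expert routing in PyTorch. This is an illustration of the MoE concept only, not Perplexity's kernels; every name in it (SimpleMoE, n_experts, top_k) is hypothetical, and the paper's actual contribution is the optimized GPU kernels and communication that make this routing pattern fast at trillion-parameter scale across cloud providers.

```python
# Minimal, generic top-k Mixture-of-Experts routing sketch (PyTorch).
# Illustrative only: this is NOT Perplexity's kernel implementation,
# and all names below are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    def __init__(self, d_model: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Route each token to its top-k experts.
        logits = self.router(x)                          # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)   # per-token expert picks
        weights = F.softmax(weights, dim=-1)             # normalize chosen gates
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = (idx == e)                            # tokens routed to expert e
            token_ids, slot = mask.nonzero(as_tuple=True)
            if token_ids.numel():
                out[token_ids] += weights[token_ids, slot, None] * expert(x[token_ids])
        return out

x = torch.randn(16, 64)             # 16 tokens, d_model = 64
y = SimpleMoE(d_model=64)(x)
print(y.shape)                      # torch.Size([16, 64])
```

The point of the sketch is the sparsity: each token activates only top_k of n_experts expert networks, which is why a model with trillions of total parameters can be served at a fraction of the dense-model cost, provided the routing and expert dispatch are implemented efficiently, which is where custom kernels come in.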

Peter M.

Building @ Camera Search

2w

Perplexity: Mad at Amazon
Perplexity: Happy at Amazon

MyongHak J.

Building I.N.G: A real-time video platform where your curiosity becomes income. Let's connect. (Your curiosity deserves ROI.) | Real-time platform | Shortform | AI video tech

2w

Impressive milestone: MoE kernels at this scale redefine efficiency itself. Portability across cloud platforms is exactly what pushes trillion-parameter AI from theory to real-world impact.

Chris Brady

VP of Sales | Driving B2B Growth in Tech-Enabled Experiences | Hospitality • Fintech • Smart Energy • AI Retail | Faith Driven. Texas Built.

2w

Perplexity, keep crushing it!

Arwed Grön

Founder of GROENIE.com | AI solutions✨for spa & wellness | Chatbots & AI tools automate bookings & leads | Boost efficiency & strengthen the mind with Aloha🌺 | Also for other industries

2w

Aloha🌺 from Germany. I love working with your Assistant in Comet 😍. It's simply amazing!!! So helpful!!!

Sabri Deniz Martin

Head of Marketing @ TestSolutions GmbH | Test Manager | "Deniz"

2w

So basically... Perplexity has achieved a significant technical advance that enables running the largest AI models efficiently and with lower latency on AWS cloud infrastructure, right? Will this only be used internally?

Orest Andrusyshyn

Once a kid with a dream. Still here, building software that feels human in a world that forgot how.

2w

Wow, custom MoE kernels at this scale! Honestly curious how this will play out across different clouds. Been helping teams automate similar workflows lately; everyone seems to hit a different bottleneck first.

Zhirayr Gumruyan

CEO & Co-Founder, Elixion.ai — Human + AI collaboration redefined

2w

Perplexity is doing a great job!

Impressive milestone. Making trillion-parameter models portable across cloud platforms is a huge step toward scalable AI accessibility. Excited to see how this shapes the future of open research.
