SAN FRANCISCO, May 8, 2023 /PRNewswire/ -- Yurts AI has teamed with Lambda to create an air-gapped large language model (LLM) stack for the enterprise. Yurts' LLM platform runs on a Lambda workstation accelerated with NVIDIA GPUs. The solution is designed to provide advanced language processing capabilities to customers while ensuring maximum security for sensitive data.
"We have seen an explosion of interest in generative AI for enterprise use, but most C-suites have genuine and rightful concerns about security and privacy," said Ben Van Roo, CEO and Co-founder of Yurts AI. "Our platform can be embedded within an enterprise and give companies private and secure access to generative AI-based assistants for writing, chat, and search."
"This solution addresses the challenge of generative AI at the edge," adds Van Roo. "Large generative AI models can drive significant computing requirements, making them challenging for many use cases where cost, space, power, and noise are issues. By working with Lambda, we are enabling entirely new secure and edge-based use cases for LLMs."
The Yurts platform includes an application layer, a modeling layer, and a data management layer, which together make it easier for organizations to adopt AI technologies without extensive technical expertise. The platform allows enterprise customers to ingest and organize corporate knowledge and connect it to generative models, with full explainability and attribution back to source documentation. The application layer includes writing, chat, and search interfaces, along with extensions that plug into other applications and web interfaces for a seamless user experience.
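To illustrate what connecting corporate knowledge to a generative model with source attribution can look like in practice, the sketch below shows a minimal, hypothetical retrieval-and-prompt-assembly step in Python. It is not Yurts' implementation; the document store, retriever, and prompt format are all illustrative assumptions.

```python
# Illustrative sketch only: a toy retrieval step that pairs a user query with
# matching passages from an internal document store and keeps track of which
# source document each passage came from, so an answer can cite its sources.
# All names here (SOURCE_DOCS, retrieve, build_prompt) are hypothetical and do
# not reflect the actual Yurts AI implementation.

SOURCE_DOCS = {
    "hr_policy.pdf": "Employees accrue 20 days of paid leave per year.",
    "it_handbook.pdf": "All laptops must use full-disk encryption.",
    "security_memo.pdf": "Air-gapped systems may not connect to external networks.",
}


def retrieve(query: str, docs: dict[str, str], top_k: int = 2) -> list[tuple[str, str]]:
    """Rank documents by simple word overlap with the query (stand-in for a real retriever)."""
    query_words = set(query.lower().split())
    scored = sorted(
        docs.items(),
        key=lambda item: len(query_words & set(item[1].lower().split())),
        reverse=True,
    )
    return scored[:top_k]


def build_prompt(query: str, passages: list[tuple[str, str]]) -> str:
    """Assemble a prompt whose context block carries source attributions."""
    context = "\n".join(f"[{name}] {text}" for name, text in passages)
    return f"Answer using only the sources below and cite them.\n{context}\n\nQuestion: {query}"


if __name__ == "__main__":
    question = "How many days of paid leave do employees get?"
    hits = retrieve(question, SOURCE_DOCS)
    print(build_prompt(question, hits))
    # In an air-gapped deployment, the assembled prompt would be sent to a
    # locally hosted LLM on the workstation; no data leaves the environment.
```

Because the retrieved passages carry their document names into the prompt, the model's answer can be traced back to specific source files, which is the kind of attribution described above.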
The Yurts AI solution is available for purchase immediately, and customers can choose to have it installed on a Lambda workstation or their own hardware. The solution is backed by a team of experts from Yurts AI and Lambda who are available to provide support and assistance to customers. Yurts AI is a member of the NVIDIA Inception program for startups, which provides access to NVIDIA technology and expertise.
"Large language models are helping to transform productivity with intelligent-assistant, research and writing capabilities, but can be challenging to operate at secure edge locations," said Anthony Robbins, vice president of federal sales at NVIDIA. "Inception member Yurts AI is helping to broaden the usability of generative AI outside a traditional data center or cloud."
"Lambda is committed to providing access to high-performance AI/ML infrastructure wherever it resides. We are excited to see our collaboration with Yurts AI result in offering customers increased access to highly secure generative AI capabilities in their data centers or workstations." shared Lambda VP of Sales Robert Brooks.
For more information on the Yurts AI solution and how it can benefit your organization, visit www.yurts.ai.
Founded and led by deep learning engineers, Lambda provides deep learning infrastructure including a GPU cloud service, on-prem servers, GPU clusters, GPU workstations, and GPU laptops to customers such as Intel, Microsoft, Google, Amazon Research, Tencent, Kaiser Permanente, MIT, Stanford, Harvard, Caltech, and the Department of Defense. Find out more at www.lambdalabs.com.
SOURCE Yurts AI