- Solar Pro is the most intelligent LLM optimized to run on a single GPU, outperforming models from tech giants like Microsoft, Meta, and Google.
SAN JOSE, Calif., Sept. 11, 2024 /PRNewswire/ -- Upstage today announced the release of a preview version of its next-generation large language model (LLM), Solar Pro. The preview, available as an open-source model with free API access, gives developers and businesses the opportunity to test the model and provide feedback ahead of its official release in November.
As the flagship model of the Solar LLM series, Solar Pro features 22 billion parameters—more than double the size of its predecessor, Solar Mini. Despite its increased size, Solar Pro is optimized to run efficiently on a single GPU, thanks to Upstage's proprietary Depth-Up Scaling (DUS) method and advanced data recipe. This innovation enables Solar Pro to deliver state-of-the-art performance while significantly reducing model size—an essential advantage in the face of rising GPU costs and supply constraints in the AI landscape.
Solar Pro's advanced capabilities have translated into impressive gains on key LLM benchmarks, with an average improvement of 51% over Solar Mini. It achieved an accuracy score of 52.11 on the MMLU Pro benchmark, which measures multi-disciplinary language understanding and reasoning across 14 domains. Solar Pro also excelled on the IFEval benchmark with a score of 84.37, demonstrating strong ability to follow complex instructions.
These results surpass those of similarly sized models from leading tech companies, including Microsoft's Phi 3 Medium, Meta's Llama 3.1 8B, Mistral NeMo 12B, and Google's Gemma 2 27B. Solar Pro even competes with much larger models that require multiple GPUs, such as Llama 3.1 70B, which has more than three times the parameter count. By setting a new standard in both general and specialized tasks, Solar Pro positions itself as the most intelligent and efficient LLM available on the market today.
"Having already made a significant impact on the global AI market with our Solar LLM series, we are thrilled to push the boundaries further with Solar Pro, the most intelligent LLM that runs on a single GPU," said Sung Kim, CEO of Upstage. "We invite developers and businesses to explore the Solar Pro Preview, which raises the bar for small language model performance."
Solar Pro's advanced capabilities enable enterprises to automate and streamline a wide array of tasks across industries. In healthcare, it analyzes patient records, generates clinical summaries, and supports medical research. In finance, it performs financial analysis, drafts reports, and provides personalized investment advice. In legal services, Solar Pro reviews contracts, summarizes legal documents, and assists in legal research—making it an indispensable tool for boosting productivity and efficiency.
The Solar Pro Preview will be available for public use as an open-source model, including for commercial applications, with free API access until the official release in November. This preview version supports English-language input and offers a context window of 4,096 tokens.
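For readers who want to try the free preview API, a minimal sketch is shown below. It assumes an OpenAI-style chat-completions interface; the endpoint URL (`https://api.upstage.ai/v1/solar/chat/completions`) and model identifier (`solar-pro`) are illustrative assumptions, not details confirmed by this announcement, so check Upstage's developer documentation for the actual values. Prompts should be in English and fit within the preview's 4,096-token context window.

```python
import json
import urllib.request

API_KEY = "YOUR_UPSTAGE_API_KEY"  # placeholder; obtain a key from Upstage

def build_request(prompt: str, max_tokens: int = 512) -> urllib.request.Request:
    """Build a chat-completion request for the Solar Pro Preview.

    The payload follows the common OpenAI-style convention; endpoint and
    model name are assumptions for illustration only.
    """
    payload = {
        "model": "solar-pro",  # assumed model identifier
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return urllib.request.Request(
        "https://api.upstage.ai/v1/solar/chat/completions",  # assumed endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

# Build (but do not send) a sample request; sending it would require a
# valid API key and network access.
req = build_request("Summarize this contract clause in one sentence.")
print(req.full_url)
```

Because the API is OpenAI-compatible in this sketch, the same payload could be sent with any HTTP client or an OpenAI-style SDK by pointing it at the assumed base URL.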
About Upstage
Founded in October 2020, Upstage boosts work efficiency with industry-leading document processing engines and large language models (LLMs). Our flagship product, Solar LLM, delivers GPT-4-level performance with unparalleled speed and cost-efficiency. Available on-premises as well as via API integration through platforms like Amazon SageMaker JumpStart, Solar provides a versatile and accessible alternative to larger, more resource-intensive models developed by tech giants. Our Document AI solution leverages AI-powered optical character recognition (OCR) technology to automate workflows and process unstructured data, reducing operational costs and streamlining operations for our clients.
PR Contact
Sungbeom Bae: [email protected]
SOURCE Upstage