Microsoft introduces Orca, a new AI model that learns by imitating large language models
Orca, an AI model with 13 billion parameters, can imitate and learn from large language models.
Microsoft, in collaboration with OpenAI, has been steadily incorporating AI capabilities into its products and services while also developing smaller, task-specific models. Microsoft Research has announced Orca, a new AI model that learns by imitating large language models. According to the research paper, Orca is designed to overcome the limitations of smaller models by imitating the reasoning processes of large foundation models such as GPT-4.
Models like Orca can be trained using large language models such as GPT-4 and then optimized for specific needs. Because of its smaller size, Orca requires fewer computational resources to run. Researchers can tune such models to their own requirements and run them on their own hardware without relying on a massive data center. On benchmarks such as Big-Bench Hard (BBH), Orca is reported to perform on par with ChatGPT.
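As a rough illustration of this kind of teacher-student fine-tuning (a minimal sketch, not Microsoft's actual training code), a small open model could be fine-tuned on responses collected from a larger teacher model. The student model name, example data, and hyperparameters below are assumptions chosen for illustration only.

```python
# Minimal sketch: fine-tune a small "student" model on answers produced by a
# larger "teacher" model. Model name, data, and hyperparameters are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

student_name = "openlm-research/open_llama_3b"  # assumed student model, not Orca itself
tokenizer = AutoTokenizer.from_pretrained(student_name)
model = AutoModelForCausalLM.from_pretrained(student_name)

# Imitation data: prompts paired with detailed answers collected from a teacher model.
teacher_data = [
    ("Explain step by step why 17 is prime.",
     "A prime has no divisors other than 1 and itself. Checking 2, 3, and 4 "
     "(all candidates up to sqrt(17)) shows none divide 17, so 17 is prime."),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for prompt, teacher_answer in teacher_data:
    # Standard causal-LM objective on the concatenated prompt + teacher answer.
    batch = tokenizer(prompt + "\n" + teacher_answer, return_tensors="pt")
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```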
Orca, a 13-billion-parameter model built on Vicuna, can imitate and learn from large language models such as GPT-4. Guided by GPT-4, which is reported to have over one trillion parameters, it learns explanations, step-by-step thought processes, and other complex instructions. It also performs competitively on academic exams such as the SAT, LSAT, GRE, and GMAT.
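To make the idea of learning from explanations concrete, here is a hedged sketch of how explanation-style training records could be assembled; the record layout, helper name, and example text are assumptions for illustration, not the paper's actual pipeline. Each record pairs a system instruction requesting step-by-step reasoning with a user query and the teacher's detailed answer, and could feed the fine-tuning loop sketched above.

```python
# Illustrative assembly of explanation-style training text (layout is an assumption).
SYSTEM_INSTRUCTION = (
    "You are a helpful assistant. Think step by step and justify your answer."
)

def build_training_text(user_query: str, teacher_explanation: str) -> str:
    """Flatten one (instruction, query, explanation) triple into a single
    training string suitable for causal-LM fine-tuning."""
    return (
        f"### System:\n{SYSTEM_INSTRUCTION}\n\n"
        f"### User:\n{user_query}\n\n"
        f"### Assistant:\n{teacher_explanation}"
    )

example = build_training_text(
    "If a train travels 60 km in 45 minutes, what is its average speed in km/h?",
    "45 minutes is 0.75 hours, and 60 km / 0.75 h = 80 km/h, "
    "so the average speed is 80 km/h.",
)
print(example)
```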