Details for Orca Mini

Description: A general-purpose model available in sizes from 3 billion to 70 billion parameters, with the smaller variants suitable for entry-level hardware.

Additional Information: Orca Mini is a family of Llama and Llama 2 models trained on Orca-style datasets created using the approaches defined in the paper "Orca: Progressive Learning from Complex Explanation Traces of GPT-4". Two variations are available: the original Orca Mini, based on Llama, in 3, 7, and 13 billion parameter sizes, and v3, based on Llama 2, in 7, 13, and 70 billion parameter sizes.

Link: dub.sh/localaimodels-orca-mini