Model Catalog
Browse our selection of hand-picked, ready-to-deploy models!
Text Generation
A 70-billion parameter model from Meta, optimized for dialogue. Generates helpful, safe responses and outperforms other open-source chat LLMs.
gemma-7b-it
An instruction-tuned model fine-tuned from Gemma 7B, Google's first open LLM.
A powerful chat model fine-tuned from Mistral 7B on a large corpus of synthetic data. Capable of function-calling and has strong coding capabilities.
A fine-tuned version of OpenHermes 2.5 that was aligned with Direct Preference Optimization and AI preference examples from the SlimOrca dataset.
Open chat language model by UC Berkeley. Outperformed all 7B models at the time of its release. For non-commercial use only.
Open source large language model that targets high performance and commercial viability. Fine-tuned using C-RLFT, for results on par with ChatGPT.
A 7-billion parameter instruct model from Mistral AI, fine-tuned using a variety of publicly available conversation datasets.
zephyr-7b-beta
A chat model fine-tuned from Mistral 7B with synthetic data and Direct Preference Optimization.
A chat model from Intel fine-tuned from Mistral 7B on the SlimOrca dataset with Direct Preference Optimization.
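Several of the chat models above were aligned with Direct Preference Optimization (DPO). A toy sketch of the DPO objective, using made-up log-probabilities rather than output from any real model:

```python
import math

def dpo_loss(pi_chosen, pi_rejected, ref_chosen, ref_rejected, beta=0.1):
    """DPO loss for a single preference pair.

    Arguments are log-probabilities of the chosen/rejected responses
    under the policy being trained (pi_*) and a frozen reference model
    (ref_*). The loss is low when the policy prefers the chosen
    response more strongly than the reference does.
    """
    # Implicit reward margins: how much each model favors each response.
    chosen_margin = pi_chosen - ref_chosen
    rejected_margin = pi_rejected - ref_rejected
    logits = beta * (chosen_margin - rejected_margin)
    # Negative log-sigmoid of the margin difference.
    return -math.log(1.0 / (1.0 + math.exp(-logits)))

# Toy log-probs: policy favors the chosen response -> low loss.
low = dpo_loss(pi_chosen=-2.0, pi_rejected=-8.0, ref_chosen=-4.0, ref_rejected=-4.0)
# Policy favors the rejected response -> higher loss.
high = dpo_loss(pi_chosen=-8.0, pi_rejected=-2.0, ref_chosen=-4.0, ref_rejected=-4.0)
```

The preference data itself varies per model above (AI preference examples, SlimOrca pairs, synthetic data); the objective is the same.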
A 13-billion parameter model from Meta, optimized for dialogue. Generates helpful, safe responses and outperforms other open-source chat models.
A 180-billion parameter conversational AI model optimized for fast inference through an efficient architecture. Freely available under TII LICENSE.
Mixtral 8x7B is a sparse mixture-of-experts decoder-only model, fine-tuned for instruction following and released under a permissive license.
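In a sparse mixture-of-experts model like Mixtral, a router sends each token to only a few experts, so only a fraction of the total parameters is active per token. A minimal sketch of top-2 gating (the expert count matches Mixtral's 8, but the logits are illustrative):

```python
import math

def top2_route(gate_logits):
    """Pick the two highest-scoring experts and renormalize their weights.

    gate_logits: raw router scores for one token, one per expert.
    Returns (expert_indices, mixing_weights).
    """
    # Softmax over all experts.
    m = max(gate_logits)
    exps = [math.exp(g - m) for g in gate_logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Keep only the top-2 experts and renormalize so their weights sum to 1.
    top2 = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:2]
    denom = sum(probs[i] for i in top2)
    weights = [probs[i] / denom for i in top2]
    return top2, weights

# 8 experts, as in Mixtral 8x7B; only 2 of them process this token.
experts, weights = top2_route([0.1, 2.0, -1.0, 0.5, 1.5, -0.3, 0.0, 0.2])
```

The token's output is then the weighted sum of the two selected experts' outputs, which is what keeps inference cost well below that of a dense model of the same parameter count.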
Text-to-Image
Latent text-to-image diffusion model capable of generating photo-realistic images given any text input.
Advanced latent diffusion model designed to create high-resolution, detailed anime images. Fine-tuned from Stable Diffusion XL 1.0.
Latent Diffusion model from Stability AI for high-quality, diverse image generation based on short text prompts provided by the user.
Open source Stable Diffusion fine-tuned model on Midjourney images.
Fine-tuned model based on Stable Diffusion designed to generate anime characters.
Sentence Embeddings
all-MiniLM-L6-v2
This model maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.
This model maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
bge-base-en-v1.5
BGE models from BAAI are optimized for retrieval and search. The 'base' variant ranks 2nd on the MTEB English leaderboard.
This model is initialized from xlm-roberta-large and continually trained on a mixture of multilingual datasets. It supports 100 languages.
This model maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.
ember-v1
Model trained on an extensive corpus of text pairs belonging to domains such as finance, science, medicine, law, and various others.
BGE models from BAAI are optimized for retrieval and search. The 'large' variant ranks 1st on the MTEB English leaderboard.
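The embedding models above map text to dense vectors (384 dimensions for all-MiniLM-L6-v2, 768 for others), and semantic search then reduces to ranking documents by cosine similarity to the query vector. A sketch using toy 4-dimensional vectors standing in for real embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec, corpus):
    """Rank (doc_id, vector) pairs by similarity to the query vector."""
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked]

# Toy vectors standing in for sentence embeddings; in practice these
# would come from one of the embedding models listed above.
corpus = [
    ("doc_cats", [0.9, 0.1, 0.0, 0.1]),
    ("doc_cars", [0.0, 0.8, 0.6, 0.0]),
    ("doc_cooking", [0.1, 0.0, 0.1, 0.9]),
]
query = [0.8, 0.2, 0.0, 0.1]  # embedding of a cat-related query
results = search(query, corpus)
```

For clustering, the same vectors feed directly into k-means or similar algorithms; the model choice only changes how the vectors are produced.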
Sentence Ranking
Zero-Shot Classification
An English-only model specialized in zero-shot text classification, trained on 33 diverse datasets. At 0.43B parameters, it is smaller and more efficient than generative LLMs.
A multilingual model specialized in zero-shot text classification, trained on 33 diverse datasets. At 0.57B parameters, it is smaller and more efficient than generative LLMs.
A version of the bart-large model trained on the MultiNLI (MNLI) dataset.
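NLI-trained models like the ones above classify zero-shot by recasting each candidate label as a hypothesis ("This text is about {label}.") and normalizing the entailment scores into label probabilities. A sketch with a stubbed scorer in place of the real NLI model:

```python
import math

def zero_shot_classify(text, labels, entail_score):
    """Score each label via an entailment function, then softmax.

    entail_score(premise, hypothesis) -> raw entailment logit; stubbed
    below, but in practice the output of an NLI model such as one
    trained on MNLI.
    """
    template = "This text is about {}."
    logits = [entail_score(text, template.format(label)) for label in labels]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    return dict(zip(labels, (e / total for e in exps)))

def stub_entailment(premise, hypothesis):
    """Toy stand-in: high logit when a premise word hints at the label."""
    hints = {"sports": "scored", "politics": "election", "cooking": "recipe"}
    for label, hint in hints.items():
        if label in hypothesis and hint in premise:
            return 4.0
    return 0.0

scores = zero_shot_classify(
    "The striker scored twice in the final",
    ["sports", "politics", "cooking"],
    stub_entailment,
)
```

Because the labels are supplied at inference time, the same model classifies into arbitrary label sets without retraining, which is what makes these models cheaper than prompting a generative LLM for the same task.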
Automatic Speech Recognition
A distilled version of the Whisper model that is 6 times faster and 49% smaller, performing within 1% WER of the original on out-of-distribution evaluation sets.
A new version of the whisper-large model with improved performance across a wide variety of languages, reducing errors by 10% to 20% compared to Whisper large-v2.
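Both Whisper entries quote word error rate (WER): the word-level edit distance between the reference transcript and the model's hypothesis, divided by the number of reference words. A minimal implementation:

```python
def wer(reference, hypothesis):
    """Word error rate: (substitutions + insertions + deletions) / ref words."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Standard dynamic-programming edit distance, computed over words
    # instead of characters, keeping one row at a time.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i]
        for j, h in enumerate(hyp, start=1):
            cost = 0 if r == h else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution / match
        prev = curr
    return prev[-1] / len(ref)

# One substitution ("the" -> "a") over six reference words.
score = wer("the cat sat on the mat", "the cat sat on a mat")
```

A "within 1% WER" claim thus means the distilled model makes at most about one extra word error per hundred reference words relative to the original.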