Do you believe in a better tomorrow? We do. Our team of expert researchers live the dream and work to build it every day.

News

FalconMamba LLM

We are excited to announce the release of FalconMamba, our groundbreaking LLM with a pure SSM (state space model) architecture, setting a new benchmark by outperforming all previous SSM models and achieving performance on par with leading transformer-based models.

More details on the new models and their performance can also be found in our FalconMamba blogpost.

| Artefact | Link | Type | Details |
|---|---|---|---|
| 🐍 FalconMamba-7B | Here | pretrained model | 7B parameters, pure SSM, trained on ~5,500 billion tokens. |
| FalconMamba-7B-Instruct | Here | instruction/chat model | FalconMamba-7B finetuned using only SFT. |
| FalconMamba-7B-4bit | Here | pretrained model | 4-bit quantized version of FalconMamba-7B using GGUF. |
| FalconMamba-7B-Instruct-4bit | Here | instruction/chat model | 4-bit quantized version of FalconMamba-7B-Instruct using GGUF. |
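
For reference, here is a minimal sketch of running the instruct model with Hugging Face transformers. The repo id `tiiuae/falcon-mamba-7b-instruct` is an assumption based on the artefact names above; check the linked model cards for the exact identifiers.

```python
# Minimal sketch (assumed repo id): generate with FalconMamba-7B-Instruct.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-mamba-7b-instruct"  # assumption; see the link in the table

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the 7B model within one GPU
    device_map="auto",
)

# Build a chat-style prompt with the tokenizer's chat template.
messages = [{"role": "user", "content": "Explain state space models in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the 4-bit GGUF artefacts are packaged for llama.cpp-compatible runtimes rather than for transformers.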

Falcon2 LLM

Falcon2 LLM is TII's new flagship series of large language models. We focused on building smaller models with enhanced performance, enabling cheaper inference, encouraging the development of more downstream applications, and improving the general usability of our models.

Papers:

- Falcon2-11B Technical Report (Malartic et al., 2024)

More details on the new models and their performance can also be found in our Falcon2 blogpost.

See below for a detailed list of artefacts in the Falcon2 LLM family:

| Artefact | Link | Type | Details |
|---|---|---|---|
| 🦅🦅 Falcon-11B | Here | pretrained model | 11B parameters trained on over 5,000 billion tokens. |
| 🦅📸 Falcon-11B-vlm | Here | vision-adapted model | Integrates the pretrained CLIP ViT-L/14 vision encoder with our chat-finetuned Falcon2-11B model, trained on image-text data. |
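
The text model can be tried in a few lines of code; the sketch below uses the transformers pipeline API and assumes the repo id `tiiuae/falcon-11B`, so adjust it to whatever the link above points to.

```python
# Minimal sketch (assumed repo id): text generation with Falcon-11B.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="tiiuae/falcon-11B",  # assumption; see the link in the table
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

result = generator(
    "Smaller language models enable cheaper inference because",
    max_new_tokens=50,
)
print(result[0]["generated_text"])
```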

Falcon LLM

Falcon LLM is TII's flagship series of large language models, built from scratch using a custom data pipeline and distributed training library (Almazrouei et al., 2023).

Papers:

- The Falcon Series of Open Language Models (Almazrouei et al., 2023)
- The RefinedWeb Dataset for Falcon LLM: Outperforming Curated Corpora with Web Data, and Web Data Only (Penedo et al., 2023)

To promote collaboration and drive innovation, we have open-sourced a number of artefacts. See below for a detailed list of artefacts in the Falcon LLM family:

| Artefact | Link | Type | Details |
|---|---|---|---|
| 🥇 Falcon-180B | Here | pretrained model | 180B parameters trained on 3,500 billion tokens. |
| Falcon-180B-Chat | Here | chat model | Falcon-180B finetuned on a mixture of Ultrachat, Platypus, and Airoboros. |
| 🥈 Falcon-40B | Here | pretrained model | 40B parameters trained on 1,000 billion tokens. |
| Falcon-40B-Instruct | Here | instruction/chat model | Falcon-40B finetuned on the Baize dataset. |
| 🥉 Falcon-7B | Here | pretrained model | 6.7B parameters trained on 1,500 billion tokens. |
| Falcon-7B-Instruct | Here | instruction/chat model | Falcon-7B finetuned on the Baize, GPT4All, and GPTeacher datasets. |
| 📀 RefinedWeb | Here | pretraining web dataset | ~600 billion "high-quality" tokens. |
| Falcon-RW-1B | Here | pretrained model | 1.3B parameters trained on 350 billion tokens. |
| Falcon-RW-7B | Here | pretrained model | 7.5B parameters trained on 350 billion tokens. |
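
At ~600 billion tokens, RefinedWeb is far too large to download casually, but it can be inspected by streaming with the datasets library. The sketch below assumes the dataset id `tiiuae/falcon-refinedweb` and a `content` text field; verify both against the linked dataset card.

```python
# Minimal sketch (assumed dataset id): stream a few RefinedWeb documents.
from datasets import load_dataset

refinedweb = load_dataset("tiiuae/falcon-refinedweb", split="train", streaming=True)

for i, doc in enumerate(refinedweb):
    # "content" is the assumed text field; inspect doc.keys() to confirm.
    print(doc["content"][:200])
    if i >= 2:
        break
```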

About us

The Technology Innovation Institute (TII) is a leading global research center dedicated to pushing the frontiers of knowledge. Our teams of scientists, researchers and engineers work in an open, flexible and agile environment to deliver discovery science and transformative technologies. Our work means we will not only prepare for the future; we will create it. Working together, we are committed to inspiring innovation for a better tomorrow.

We are part of Abu Dhabi Government’s Advanced Technology Research Council, which oversees technology research in the emirate. As a disruptor in science, we are setting new standards and serving as a catalyst for change.

Faced with a future of limitless possibilities and supported by strategically funded investments, we are encouraging a culture of discovery. Our work reinforces Abu Dhabi and the UAE’s status as an R&D hub and a global leader in breakthrough technologies.