TII debuts multimodal Falcon 2 Series
New Falcon 2 11B compares favourably with Google's Gemma 7B model
#UAE #LLMs - The Advanced Technology Research Council’s applied research arm, the Technology Innovation Institute (TII), has announced the second generation of its open-source, high-performance large language model (LLM), Falcon, with the release of Falcon 2. The series launches with two versions: Falcon 2 11B, a more efficient and less resource-hungry 11-billion-parameter LLM trained on 5.5 trillion tokens; and Falcon 2 11B VLM, TII's first multimodal LLM, with new vision-to-language model (VLM) capabilities.
SO WHAT? - The Technology Innovation Institute (TII) released a series of high-quality, high-performing Falcon large language models last year, including Falcon 7B, Falcon 40B, and Falcon 180B. By opting to open-source the Falcon series, TII and its Falcon models became globally known and respected almost overnight, driving millions of model downloads and thousands of enquiries. To capitalise on this opportunity, the Advanced Technology Research Council (ATRC) formed AI71 in November to develop AI solutions on Falcon, and the Falcon Foundation in January of this year to curate the open-source models and help build a global ecosystem. The announcement of the Falcon 2 series signals that those plans are alive and well, whilst offering significant advantages to developers building on Falcon LLMs.
Here are key details of the announcement:
The Technology Innovation Institute (TII) has announced the first major upgrade to its open-source, multilingual Falcon large language models, with the arrival of the first Falcon 2 models, which include a multimodal model with vision-to-language capabilities.
TII unveiled the first two Falcon 2 models: Falcon 2 11B, a more efficient and less resource-hungry 11-billion-parameter LLM trained on 5.5 trillion tokens; and Falcon 2 11B VLM, TII's first multimodal LLM, with new vision-to-language model (VLM) capabilities.
Immediate development plans for the Falcon 2 series include adding 'Mixture of Experts' capabilities, which could make Falcon 2 models more accurate and capable by combining smaller networks with distinct specialisations and routing each input to the experts best suited to handle it (illustrated in the sketch after this list).
TII also plans to broaden the Falcon 2 series of large language models with a range of different parameter sizes.
Falcon 2 11B has been tested against several prominent pre-trained AI models in its class. According to TII, Falcon 2 11B surpasses the performance of Meta’s newly launched Llama 3 8B and performs on a par with the first-placed Google Gemma 7B (Falcon 2 11B: 64.28 vs Gemma 7B: 64.29), as independently verified on the Hugging Face open-source model leaderboard.
Falcon 2 11B VLM, a vision-to-language model, can identify and interpret images and visuals from the environment, enabling a wide range of applications, from document management, digital archiving, and context indexing to supporting individuals with visual impairments.
Importantly, these models can run efficiently on just one graphics processing unit (GPU), allowing them to be deployed on laptops and PCs (a loading sketch follows this list).
The two new Falcon 2 models are multilingual, supporting English, French, Spanish, German, Portuguese, and various other languages.
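To illustrate the 'Mixture of Experts' idea mentioned above, here is a minimal top-k routed layer in PyTorch. It is purely illustrative: the expert count, gating scheme, and layer sizes are assumptions for the sketch and are not based on any published Falcon 2 design.

```python
# Minimal, illustrative mixture-of-experts layer with top-k gating.
# All sizes and the routing scheme are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, n_experts: int = 4, k: int = 2):
        super().__init__()
        # Each "expert" is a small feed-forward network with its own weights.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )
        self.gate = nn.Linear(d_model, n_experts)  # router scores every expert per token
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        scores = self.gate(x)                        # (batch, seq, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)   # keep only the top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e           # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Example: route a batch of token embeddings through the sparse layer.
layer = TopKMoE(d_model=64, d_hidden=256)
tokens = torch.randn(2, 10, 64)
print(layer(tokens).shape)  # torch.Size([2, 10, 64])
```

The design point is that only a few small expert networks run for each token, so total model capacity can grow without a proportional increase in compute per token.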
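As for the single-GPU claim, the sketch below shows roughly what deployment might look like using the Hugging Face transformers library with 4-bit quantisation. The model ID tiiuae/falcon-11B, the prompt, and the quantisation settings are assumptions made for illustration; check the official model card for the recommended configuration.

```python
# A minimal sketch of single-GPU inference with Falcon 2 11B.
# The model ID and 4-bit settings below are assumptions; consult the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "tiiuae/falcon-11B"  # assumed Hugging Face repository name

# 4-bit quantisation keeps the 11B-parameter weights within a single GPU's memory budget.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place the model on the available GPU
)

prompt = "The Technology Innovation Institute is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```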
IMO - The two new AI models announced under the Falcon 2 series show that TII is focused on continuing to develop LLMs at the cutting edge, whilst endeavouring to tune in to market needs. The past 18 months of generative AI hype and the obsession with ever bigger, more powerful models may delight big tech and provide a spectacle, but being stuck with a choice of big, power-hungry models is a frustration for many developers and corporate IT departments.
If TII can strike a balance between the size, power, and capabilities of its Falcon 2 series portfolio, then the models will become more appealing to both developers and enterprise users. This ‘widening of the net’ will ultimately make Falcon models suitable for more use cases, and keep Falcon in the running to achieve the Falcon Foundation’s goal of establishing a future de facto LLM standard.
Read more about Falcon large language models:
Could Falcon become the Linux of AI? (Middle East AI News)
Abu Dhabi launches new global AI company (Middle East AI News)
Disrupt or be disrupted (Middle East AI News)
Can UAE-built Falcon rival global AI models? (Middle East AI News)