
MediaTek Research introduces Breeze-7B open-source Large Language Model

MediaTek Research, a cutting-edge research arm of MediaTek, has announced the release of a new open-source Large Language Model (LLM) called MediaTek Research Breeze-7B (MR Breeze-7B). The model is adept in both Traditional Chinese and English, and follows the company's pioneering Traditional Chinese language model released in early 2023. MR Breeze-7B, with 7 billion parameters, is built on the widely used Mistral model.

MR Breeze-7B was trained on roughly twenty times more knowledge than its predecessor, BLOOM-3B, enabling it to navigate the intricate linguistic and cultural nuances of Traditional Chinese with unprecedented precision. This advancement paves the way for more genuine and accurate bilingual interactions and content generation. Enhanced by MediaTek Research's optimizations, MR Breeze-7B outperforms counterparts such as the Mistral and Llama models in processing speed, cutting the time and memory needed for complex Traditional Chinese inference by half and providing users with a more seamless experience.

“With the rapid expansion of the AI industry, we’re finding that there is a strong emphasis on English-based language models,” said Dr. Da-shan Shiu, Managing Director of MediaTek Research. “As an open-source language model that’s optimized for Traditional Chinese, MR Breeze-7B will significantly advance both academic and industrial AI technology, and this is just the beginning as we prepare to launch more open-source multimodal models to encourage additional AI collaborations.”

Compared to other 7B Chinese-English language models, MR Breeze-7B swiftly delivers smoother, more accurate responses in both languages, with a keen ability to grasp context for relevant and coherent answers. This enhancement is crucial for scenarios demanding rapid bilingual interaction, such as live translation, business negotiations, and smart customer service. Furthermore, MR Breeze-7B's adeptness at parsing and producing tabular content is a game-changer for data-driven tasks like analytics, financial statements, and complex scheduling, making it indispensable for enterprises handling extensive structured data.

The release of the open-source MR Breeze-7B model marks a significant step, enabling researchers to further dissect and understand the intricacies of large language models, particularly in refining solutions for challenges such as hallucination and alignment in question-answering systems. MediaTek Research is also preparing to unveil a new 47B-parameter model, built upon the open-source Mixtral framework, for public testing in the near future.
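Since the model is released as open source, researchers would typically load it through a standard framework such as Hugging Face transformers. The sketch below shows that general pattern; the repository identifier, prompt template, and generation settings are assumptions for illustration, not details confirmed by this announcement.

```python
# Hedged sketch: querying an open-source 7B chat model with Hugging Face
# transformers. The repository id below is an ASSUMPTION -- check MediaTek
# Research's model page for the actual identifier and prompt format.
MODEL_ID = "MediaTek-Research/Breeze-7B-Instruct-v0_1"  # assumed repo id


def build_prompt(user_message: str) -> str:
    """Wrap a user message in a simple instruction template (illustrative only)."""
    return f"[INST] {user_message} [/INST]"


def generate(user_message: str, max_new_tokens: int = 128) -> str:
    # Imported lazily so the sketch can be read/tested without transformers
    # installed; generation itself requires the library and model weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the newly generated text is returned.
    return tokenizer.decode(
        output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


if __name__ == "__main__":
    # A Traditional Chinese query, the model's highlighted strength.
    print(generate("請用繁體中文介紹台北 101。"))
```

In practice, a bilingual model like this is prompted the same way in either language; only the instruction template and repository id would need to match the official release notes.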

©2023 – Cellit. All Rights Reserved.
