What you need to know
- MediaTek is working with Meta to integrate the Llama 2 large language model (LLM) into its products.
- The Taiwanese brand is aiming to offer on-device generative AI use cases that augment its existing feature set.
- The Llama 2 model will debut in the next-gen Dimensity platform launching later this year.
Generative AI is everywhere these days, and MediaTek has announced that it is teaming up with Meta to bring on-device generative AI to its next Dimensity platform. The upcoming chipset — likely the Dimensity 9300 — will integrate Meta’s Llama 2 large language model, allowing generative AI applications to run entirely on-device without routing through the cloud.
MediaTek touts several advantages to offering generative AI on-device, including “seamless performance, greater privacy, better security and reliability, lower latency, the ability to work in areas with little to no connectivity, and lower operation cost.” MediaTek’s existing Dimensity portfolio already includes APUs that offer generative AI features such as AI Noise Reduction and AI MEMC, with devices built on the Dimensity 9200 — like the Vivo X90 Pro and Oppo Find X6 — highlighting these features.
MediaTek is aiming to build on that foundation by rolling out a software stack optimized to run Llama 2, as well as an upgraded APU with Transformer backbone acceleration and reduced DRAM footprint and bandwidth usage, enabling better on-device generative AI use cases.
The Taiwanese brand doesn’t mention the Dimensity 9300 by name, only stating that these features will arrive in a next-gen flagship SoC that’s set to debut later in the year. If history is any indication, we’ll see the launch of the Dimensity 9300 sometime in November, and MediaTek notes that the initial wave of phones powered by the hardware will be available before the end of the year, so it isn’t unreasonable to assume that the brand has already secured a design win with one of the major Chinese manufacturers.