Raspberry Pi 5 gets LLM smarts with AI HAT+ 2
The AI HAT+ 2 offers 40 TOPS of INT4 inference performance and 8 GB of RAM to accelerate large language models and generative AI locally on the Raspberry Pi 5 platform.
- On launch, Raspberry Pi introduced the AI HAT+ 2 built around the Hailo-10H neural network accelerator to enable on-device inference for LLMs and generative AI.
- At launch, Raspberry Pi said DeepSeek-R1-Distill, Llama 3.2 and Qwen2 family models would be available, mostly at the 1.5-billion-parameter scale, framing the AI HAT+ 2 for local inference use.
- The board plugs into the Pi 5 via the GPIO connector and PCIe interface; it ships with an optional passive heatsink, plus spacers and screws for fitting the active cooler, and requires a fresh Raspberry Pi OS install with Docker and hailo-ollama, while rpicam-apps supports the hardware natively.
- The AI HAT+ 2 increases on-device inference performance relative to the earlier AI HAT+, but Raspberry Pi acknowledged 8 GB onboard RAM may limit memory-hungry AI workloads and thermal constraints persist.
- By comparison, cloud LLMs start at roughly 500 billion parameters, and Raspberry Pi noted that users can pair the board with a 16 GB Pi 5, though thermal trade-offs remain, raising questions about the intended audience.
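A quick back-of-envelope check clarifies why the board targets models in the 1.5-billion-parameter range. Assuming INT4 weights (0.5 bytes per parameter) and some headroom for activations and KV cache, a rough fit test looks like this (the 1.3x overhead factor is an illustrative assumption, not a vendor figure):

```python
def fits_in_ram(params_billion, ram_gb=8.0, bits_per_weight=4, overhead=1.3):
    """Rough check: do quantized weights (plus runtime overhead) fit in RAM?"""
    weight_gb = params_billion * 1e9 * (bits_per_weight / 8) / 1e9
    return weight_gb * overhead <= ram_gb

# 1.5B-parameter model at INT4: ~0.75 GB of weights -> fits comfortably
print(fits_in_ram(1.5))   # True
# 500B-parameter cloud-scale model: ~250 GB of weights -> far too large
print(fits_in_ram(500))   # False
```

By the same estimate, a 6-billion-parameter model (about 3 GB of INT4 weights) also fits in 8 GB, which matches the ~6B LLM/VLM ceiling cited in the coverage below.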
17 Articles
Raspberry Pi’s new add-on board has 8GB of RAM for running gen AI models
Raspberry Pi is launching a new add-on board capable of running generative AI models locally on the Raspberry Pi 5. Announced on Thursday, the $130 AI HAT+ 2 is an upgraded (and more expensive) version of the module launched last year, now offering 8GB of RAM and a Hailo 10H chip with 40 TOPS of AI performance. Once connected, the Raspberry Pi 5 will use the AI HAT+ 2 to handle AI-related workloads while leaving the main board's Arm CPU availa…
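The hailo-ollama service mentioned in the setup requirements is, going by its name, Ollama-compatible. Assuming it exposes the standard Ollama HTTP API on the usual port 11434 (an assumption, not confirmed by the coverage; the model tag is also illustrative), a minimal local query from the Pi could look like:

```python
import json
import urllib.request

# Assumed endpoint: hailo-ollama serving the standard Ollama API locally.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model, prompt):
    """Build a non-streaming Ollama /api/generate request body."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model, prompt):
    """POST the prompt to the local endpoint and return the model's reply."""
    body = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires the service to be running):
# ask("llama3.2:1b", "Summarize what a HAT+ board is.")
```

Setting `"stream": False` returns the full response in one JSON object rather than a stream of token chunks, which keeps a first test simple.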
Raspberry Pi has announced a new module for the Raspberry Pi 5 that brings generative artificial intelligence to the computer, with 40 TOPS of inference performance and the ability to run models locally.
For several years now, Raspberry Pi has established itself as an essential platform for developers and embedded-technology enthusiasts. However, with the explosion of generative artificial intelligence, a new need has emerged: running advanced AI models locally, without relying on the cloud. That is precisely the mission of the Raspberry Pi AI HAT+ 2, a new extension module designed to transform t…
Raspberry Pi reveals the AI HAT+ 2. The HAT keeps the same design and dimensions as the v1. The first version was somewhat frustrating because it relied on the Pi, and processing could not run on the AI card itself. The v2 has the advantage of integrating 8GB of RAM. A 16GB version would have been a better choice, but 8GB is still enough to support LLMs and VLMs of roughly 6 billion parameters. The NPU remains the excellent Hailo (see our AI …
Coverage Details
Bias Distribution
- 67% of the sources are Center