IBM Granite
IBM Granite is a family of AI models created by IBM Research that can generate text and code. It was announced on September 7, 2023, and an accompanying paper was published a few days later. Granite was originally designed for IBM's cloud AI platform Watsonx; IBM has since released some of the models as open source.
Granite models are foundation models: large AI models trained on broad data so they can be adapted to many tasks. Their training data is a mix of sources, including content from the internet, academic papers, code, and legal and financial documents. The first Granite models were Granite.13b.instruct and Granite.13b.chat, each with 13 billion parameters.
Later Granite versions range from 3 to 34 billion parameters. On May 6, 2024, IBM released four Granite Code model variants under the Apache 2.0 license and published them on Hugging Face for public use. IBM's tests show that Granite 8b outperforms similarly sized Llama 3 models on several coding tasks.