Dataiku broadens LLM Mesh

The new Dataiku LLM Registry fortifies the LLM Mesh with an added layer of governance to qualify, document, and frame LLM usage.

Dataiku is expanding its LLM Mesh ecosystem to facilitate secure access to thousands of large language models (LLMs) through supported gateways, empowering data and analytics teams to build and deploy GenAI-driven solutions at scale with a multi-LLM strategy. Dataiku is also closing a critical governance gap with the LLM Registry, which allows CIOs and their teams to qualify, document, and rationalize which LLMs should or should not be used across use cases, helping ensure regulatory readiness and effective management of LLM technologies across the organization.

In a highly competitive and volatile LLM ecosystem, Dataiku’s LLM Mesh enables organizations to take a multi-LLM approach, swapping out the underlying models that power GenAI-driven applications with ease. With the expansion, the LLM Mesh now supports connections to 15 major cloud and AI vendors, including Amazon Web Services (AWS), Databricks, Google Cloud, and Snowflake (Arctic), among others.
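To make the idea of "swapping out underlying models" concrete, the sketch below shows a minimal provider-agnostic abstraction in Python. It is an illustrative example only, not Dataiku's LLM Mesh API: the class names, model identifiers, and registry dictionary are hypothetical placeholders standing in for whatever connections an organization has approved.

```python
# Illustrative sketch only -- not Dataiku's LLM Mesh API.
# Shows how a provider-agnostic interface lets an application switch the
# underlying LLM via configuration instead of code changes.
from dataclasses import dataclass
from typing import Protocol


class ChatModel(Protocol):
    def complete(self, prompt: str) -> str: ...


@dataclass
class AWSHostedModel:  # hypothetical placeholder for an AWS-hosted model
    model_id: str = "aws-hosted-model"

    def complete(self, prompt: str) -> str:
        return f"[{self.model_id}] response to: {prompt}"


@dataclass
class SnowflakeArcticModel:  # hypothetical placeholder for Snowflake Arctic
    model_id: str = "snowflake-arctic-instruct"

    def complete(self, prompt: str) -> str:
        return f"[{self.model_id}] response to: {prompt}"


# Central mapping of approved providers; switching models is a one-line
# configuration change rather than an application rewrite.
PROVIDERS: dict[str, ChatModel] = {
    "aws": AWSHostedModel(),
    "snowflake": SnowflakeArcticModel(),
}


def answer(prompt: str, provider: str = "aws") -> str:
    """Route the prompt to whichever approved model is currently configured."""
    return PROVIDERS[provider].complete(prompt)


if __name__ == "__main__":
    print(answer("Summarize Q3 sales trends.", provider="snowflake"))
```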

“Our goal is to help our customers future-proof their GenAI strategies and avoid obsolescence — that said, we provide a balanced approach to developing AI applications, while removing the risk of anchoring a strategy to a single AI provider,” said Florian Douetteau, co-founder and CEO, Dataiku. “The LLM Mesh gives organizations secure access to literally thousands of diverse models for any GenAI use case they’re looking to implement today for a true multi-LLM strategy.”

LLMs are only one component of GenAI applications, and enterprise LLM use grows more complex as organizations scale to more sophisticated applications. A multi-LLM approach is essential for managing cost and performance, protecting privacy and security, and meeting regulatory requirements. Dataiku’s Universal AI Platform supports this comprehensive approach alongside traditional analytics and machine learning techniques, allowing enterprises to handle the complete development lifecycle of GenAI applications.
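The cost, privacy, and compliance trade-offs described above are often expressed as a routing policy over a catalogue of qualified models. The following is a minimal sketch of that idea under assumed rules (PII must stay on internally hosted models, and each request carries a budget); the model names, attributes, and policy fields are hypothetical and not Dataiku features.

```python
# Hedged sketch of a policy layer implied by a multi-LLM strategy:
# each request is routed to a model based on data-sensitivity and cost rules.
from dataclasses import dataclass


@dataclass
class Request:
    prompt: str
    contains_pii: bool       # privacy constraint
    max_cost_per_1k: float   # budget constraint, USD per 1K tokens


# Hypothetical catalogue of qualified models and their attributes.
MODELS = [
    {"name": "self-hosted-model", "hosted_internally": True, "cost_per_1k": 0.0002},
    {"name": "small-cloud-model", "hosted_internally": False, "cost_per_1k": 0.0005},
    {"name": "frontier-cloud-model", "hosted_internally": False, "cost_per_1k": 0.0100},
]


def select_model(req: Request) -> str:
    """Pick the cheapest qualified model that satisfies privacy and budget rules."""
    candidates = [
        m for m in MODELS
        if (m["hosted_internally"] or not req.contains_pii)
        and m["cost_per_1k"] <= req.max_cost_per_1k
    ]
    if not candidates:
        raise ValueError("No qualified model satisfies this request's constraints")
    return min(candidates, key=lambda m: m["cost_per_1k"])["name"]


if __name__ == "__main__":
    req = Request("Draft a contract clause", contains_pii=True, max_cost_per_1k=0.005)
    print(select_model(req))  # routes to the internally hosted model
```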

“IDC anticipates a future marked by a variety of model types, each suited to different tasks and scenarios,” said Nancy Gohring, IDC senior research director, AI. “Enterprises are likely to use many models of different sizes and modes, and should ensure they have the ability to quickly evaluate and swap models as new models come to market and use cases evolve.”
