eCommerceNews UK - Technology news for digital commerce decision-makers

Teradata enhances AI with new BYO-LLM feature & NVIDIA tech

Today

Teradata has announced new capabilities for its cloud-native data analytics and management platform, VantageCloud Lake, and its integrated suite of advanced analytics tools, ClearScape Analytics, designed to help enterprises implement generative AI use cases.

The company's approach is timely, as enterprises seek actionable AI strategies that deliver quick returns on investment. According to survey data, 84 percent of executives expect to see ROI from AI projects within a year. With recent advancements in large language models (LLMs) and the emergence of smaller models, Teradata is addressing the demand for more cost-effective and versatile AI solutions.

Through the introduction of the "bring-your-own LLM" (BYO-LLM) capability, Teradata enables its clients to deploy small or mid-sized open LLMs, including those tailored to specific domains. This feature allows organisations to process data where it resides, thereby reducing costs associated with data movement and enhancing security and privacy.

Additionally, Teradata provides the flexibility to use either GPUs or CPUs, depending on LLM complexity and size. This flexibility is supported by Teradata's collaboration with NVIDIA, which integrates NVIDIA's full-stack AI accelerated computing platform into the Vantage platform to enhance the performance of GenAI applications.
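The GPU-versus-CPU choice described above amounts to a routing decision based on model size. The helper below is a purely illustrative sketch, not Teradata's API; the function name, one-billion-parameter threshold, and return values are all assumptions:

```python
# Illustrative sketch: route an LLM to GPU or CPU by parameter count.
# The threshold and names are assumptions, not part of any vendor API.
def choose_device(num_params: int, gpu_available: bool = True,
                  gpu_threshold: int = 1_000_000_000) -> str:
    """Small open models often run acceptably on CPU; larger models
    generally need GPU acceleration for practical inference speed."""
    if gpu_available and num_params >= gpu_threshold:
        return "gpu"
    return "cpu"

print(choose_device(125_000_000))    # small domain model -> "cpu"
print(choose_device(7_000_000_000))  # 7B-parameter model -> "gpu"
```

In practice the decision would also weigh latency targets, batch size, and quantisation, but the size-based rule captures the cost trade-off the article describes.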

"Teradata customers want to swiftly move from exploration to meaningful application of generative AI," said Hillary Ashton, Chief Product Officer at Teradata. "ClearScape Analytics' new BYO-LLM capability, combined with VantageCloud Lake's integration with the full stack NVIDIA AI accelerated computing platform, means enterprises can harness the full potential of GenAI more effectively, affordably and in a trusted way. With Teradata, organisations can make the most of their AI investments and drive real, immediate business value."

With BYO-LLM, users can select the model best suited to specific business requirements. According to Forrester, approximately 46 percent of AI leaders plan to employ existing open-source LLMs in their AI strategies. Teradata's approach enables its customers to use open-source providers such as Hugging Face, which hosts over 350,000 LLMs, supporting a wide range of applications.

The application of smaller LLMs spans several sectors, including regulatory compliance, healthcare, product recommendations, and customer service. For example, in banking, specialised open LLMs can identify emails with regulatory implications efficiently. Meanwhile, in healthcare, they can process doctors' notes to facilitate better patient care without exposing sensitive patient information.

Teradata's dedication to an open ecosystem ensures that as new open LLMs emerge, its customers can incorporate them into their strategies, maintaining flexibility and minimising reliance on any single provider.

The addition of NVIDIA's accelerated computing to VantageCloud Lake will enable significant improvements in LLM inferencing and fine-tuning capabilities. This infrastructure is key in handling complex models efficiently, making it particularly valuable in applications like healthcare, where rapid data processing is crucial.

VantageCloud Lake's support for model fine-tuning allows customers to adapt pre-trained language models by incorporating vocabulary or context specific to their business. This can improve model accuracy and efficiency without retraining the model from scratch on a full dataset.
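As a toy illustration of the idea (not Teradata's or Hugging Face's API), incorporating domain-specific vocabulary before fine-tuning can be thought of as extending a tokenizer's vocabulary so that specialised terms get their own entries rather than being split into generic sub-words. All names and data below are hypothetical:

```python
# Toy sketch: extend a tokenizer vocabulary with domain terms prior to
# fine-tuning. Every identifier here is an illustrative assumption.
base_vocab = {"the": 0, "patient": 1, "report": 2}

def add_domain_terms(vocab: dict, terms: list) -> dict:
    """Assign a fresh id to each previously unseen domain term."""
    for term in terms:
        if term not in vocab:
            vocab[term] = len(vocab)  # next free token id
    return vocab

# e.g. clinical terms for a healthcare deployment
vocab = add_domain_terms(dict(base_vocab), ["tachycardia", "hba1c"])
print(vocab["tachycardia"])  # -> 3
```

In a real fine-tuning workflow the model's embedding table would also be resized to cover the new tokens before training on domain text.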
