
Nvidia’s AI Workbench brings model fine-tuning to workstations

Timed to coincide with SIGGRAPH, the annual AI academic conference, Nvidia this morning announced a new platform designed to let users create, test and customize generative AI models on a PC or workstation before scaling them to a data center and public cloud.

“In order to democratize this ability, we have to make it possible to run pretty much everywhere,” said Nvidia founder and CEO Jensen Huang during a keynote at the event.

Dubbed AI Workbench, the service can be accessed through a basic interface running on a local workstation. Using it, developers can fine-tune and test models from popular repositories like Hugging Face and GitHub using proprietary data, and they can access cloud computing resources when the need to scale arises.

Manuvir Das, VP of enterprise computing at Nvidia, says that the impetus for AI Workbench was the challenging — and time-consuming — nature of customizing large AI models. Enterprise-scale AI projects can require hunting through multiple repositories for the right frameworks and tools, a process further complicated when projects have to be moved from one infrastructure to another.

Certainly, the success rate for launching enterprise models into production is low. According to a poll from KDnuggets, the data science and machine learning platform, the majority of data scientists who responded say that 80% or more of their projects stall before a machine learning model is deployed. A separate estimate from Gartner suggests that close to 85% of big data projects fail, due in part to infrastructural roadblocks.

“Enterprises around the world are racing to find the right infrastructure and build generative AI models and applications,” Das said in a canned statement. “Nvidia AI Workbench provides a simplified path for cross-organizational teams to create the AI-based applications that are increasingly becoming essential in modern business.”

The jury's out on just how “simplified” the path is. But to Das' point, AI Workbench allows developers to pull together models, frameworks, SDKs and libraries, including libraries for data prep, from open source resources into a unified workspace.

As the demand for AI — particularly generative AI — grows, there's been an influx of tools focused on fine-tuning large, general models for specific use cases. Startups like Fixie, Reka and Together aim to make it easier for companies and individual developers to customize models to their needs without having to shell out for costly cloud compute.

With AI Workbench, Nvidia's pitching a more decentralized approach to fine-tuning — one that happens on a local machine as opposed to a cloud service. That makes sense, given that Nvidia and its portfolio of AI-accelerating GPUs stand to benefit; the company makes not-so-subtle mentions of its hardware lineup in the press release announcing the news. But Nvidia's commercial motivations aside, the pitch might appeal to developers who don't wish to be beholden to a single cloud or service for AI model experimentation.

AI-driven demand for GPUs has propelled Nvidia's earnings to new heights. In May, the company's market cap briefly reached $1 trillion after Nvidia reported $7.19 billion in revenue, up 19% from the previous fiscal quarter.

About Author

Kyle Wiggers
