
SUSE plans more support for gen AI workloads

News
Jun 18, 2024 | 4 mins
Generative AI | Hybrid Cloud | Linux

The Linux packager’s SUSE AI Early Access Program could interest companies wanting to run generative AI on premises.

Credit: Shutterstock

SUSE is preparing an “enterprise-grade generative AI Platform” that will run any vendor’s large language models (LLMs) on premises or in the cloud, it said Tuesday. But analysts said it is the on-prem scenario where SUSE could make a big difference, given the virtual absence of major vendors positioning their offerings for on-prem deployment.

Bill Weinberg, senior partner at consulting firm OpenSourceSense, said the dearth of on-prem AI suites is noteworthy. 

“I haven’t seen a lot of integrated AI offerings even talking about on-prem. There was an announcement last year by IBM and VMware regarding support for Watson AI on prem, but with the current trajectory of VMware, it’s hard to say how consequential that solution is today,” Weinberg said. “Today’s announcement is strongly positioned to substantiate the company as a supplier to enterprise of open source AI technology, without making specific assumptions of where the enterprise market really stands regarding on-prem AI. They are hedging their bets.”

SUSE plans to “offer enterprises a modular, secure, vendor- and LLM-agnostic gen-AI platform that will dissolve silos and reduce costs associated with enterprise generative AI implementations. These AI solutions, built on SUSE’s industry-leading open source, enterprise-grade SUSE Linux, Rancher Prime Kubernetes management and Rancher NeuVector security offerings, will enable enterprises to control data flows in a secure, private environment, reducing regulatory compliance risk and improving security,” it said in a statement.
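To make the “vendor- and LLM-agnostic” idea concrete, here is a minimal sketch of what such a setup can look like from the application side: the same client code talks to whichever inference endpoint the enterprise runs, on premises or in the cloud, and swapping models is a configuration change rather than a code change. The endpoint URL, model name, and OpenAI-style request shape below are illustrative assumptions only; they are not part of SUSE’s announcement.

```python
# Sketch of an LLM-agnostic client: application code stays the same whether
# the inference endpoint is an on-prem server inside the firewall or a cloud
# API. Endpoint, model name, and request shape are illustrative assumptions.
import os
import requests

# Point these at whichever inference server the enterprise operates.
ENDPOINT = os.environ.get(
    "LLM_ENDPOINT", "http://llm.internal.example:8000/v1/chat/completions"
)
MODEL = os.environ.get("LLM_MODEL", "any-vendor-model")


def ask(prompt: str) -> str:
    """Send one chat request; changing models means changing env vars, not code."""
    resp = requests.post(
        ENDPOINT,
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask("Summarize our data-retention policy in two sentences."))
```

Because the prompt and response never leave the enterprise’s own network in the on-prem case, this is the kind of data-flow control the SUSE statement is pointing at.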

Gartner Research VP Tony Iams said that SUSE’s silence on an alternative to VMware was “maybe a miss in our view.” The consequences of Broadcom’s acquisition of VMware are a concern for customers, he said: “The VMware alternative issue is a pressing, real problem.”

Both analysts saw SUSE’s move as a response to many enterprises struggling to craft an effective gen-AI deployment strategy, one that balances cybersecurity, compliance, privacy, scalability and cost-effectiveness against risks such as data leakage, shadow AI and inaccuracy (aka hallucinations).

Once IT managers figure out whether to run workloads in the cloud or on premises, they can explore the question of open-source versus proprietary operating systems.

Cost trade-offs

Regarding where generative AI workloads are run, on premises or in the cloud, “there are some cost considerations. The jury is out on the cost tradeoffs,” Iams said. 

For many enterprises, the on-prem vs. cloud debate is more about control than anything else. It is a common problem for CIOs and CISOs to work out precise settings and customizations tailored to their enterprise’s environment, only to find those decisions overwritten by a cloud staffer who changed settings universally for all of the vendor’s cloud tenants.

“The universal business model is that the CIO wants throats to choke,” Weinberg said, referring to the accountability that comes with employees and contractors the enterprise has hired directly, versus those working for the cloud vendor.

As for the software, Iams said that “open source is not always going to be cheaper than closed source. There is this perception that open source is cheap, but someone has to get all of it to work together.”

That is precisely part of SUSE’s argument: that it will deliver a suite of all the elements needed to support gen-AI deployments, with every element tested to work well together.

“SUSE approaches AI with a strong foundation in open source principles, a commitment to delivering security, and a belief that customer options, including privacy by design, is paramount,” the vendor’s statement said. “SUSE AI takes a responsible AI approach by which enterprises are empowered to choose the models and tools they prefer to get the most out of AI in a private, safe and secure environment.”

Weinberg also pointed to another IT fear, which is what happens when a gen-AI vendor either goes out of business or perhaps simply abandons that product line. He compared an open source AI strategy to traditional code escrow, referring to open source as “the escrow of last resort. If the vendor tanks, you have the source code licenses in an amenably open-sourced forum.”

Evan Schuman
Contributor

Evan Schuman has covered IT issues for a lot longer than he'll ever admit. The founding editor of retail technology site StorefrontBacktalk, he's been a columnist for CBSNews.com, RetailWeek, Computerworld and eWeek and his byline has appeared in titles ranging from BusinessWeek, VentureBeat and Fortune to The New York Times, USA Today, Reuters, The Philadelphia Inquirer, The Baltimore Sun, The Detroit News and The Atlanta Journal-Constitution. Evan can be reached at eschuman@thecontentfirm.com and he can be followed at twitter.com/eschuman. Look for his blog twice a week.

The opinions expressed in this blog are those of Evan Schuman and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.
