With businesses debating whether to host and run the latest wave of AI in the public cloud, in a private cloud, or on-premises, Cloudera is betting on a bit of all of the above.

The company, which offers a hybrid approach to data storage, recently collaborated with Nvidia to roll out a new service that makes it easier for enterprises to build and deploy AI models as the 16-year-old platform continues to remake itself for the AI age. The move, announced at a conference in New York this month, comes on the heels of Cloudera's June acquisition of Verta, a startup that helps customers manage machine learning models.

Cloudera is building up these tools as CEO Charles Sansbury claims a shift is taking place in how large companies think about their computing needs. Energy and compute costs are driving more global corporations to run generative AI applications on-premises rather than in the cloud, he said.

"A year ago, if you'd asked these large global customers, 'What does your endpoint computing architecture look like in five years?' they would have said, most of my workloads are moving to the cloud. Just in the past year, that tune has changed dramatically," Sansbury told Tech Brew. "Generative AI-based applications for large companies will run on on-premises hardware, not on-cloud, for purposes of control, security, but also cost."

Keep reading here.—PK