
How Important Is OpenAI Tech to Azure Cloud? Ask Nadella

Nothing defines a lucrative partnership like showing up at one another’s big tech events.

After reportedly investing some $13 billion in OpenAI, Microsoft is making back some of that money with higher-priced products and services infused with advanced generative AI tech from its partner. How important is that tech to Azure and Microsoft? None other than Microsoft CEO Satya Nadella made a surprise appearance today at the kick-off keynote presentation by OpenAI CEO Sam Altman for OpenAI DevDay, the company’s first developer conference.

The partnership cements an exclusive arrangement between the two companies, with Microsoft supplying cloud services to OpenAI and OpenAI supplying AI tech to Azure cloud services and other Microsoft offerings. That partnership left the other cloud giants, AWS and Google Cloud, scrambling to catch up by developing their own advanced AI tech or teaming up with other AI leaders such as Anthropic.

What Does Microsoft Think About the Partnership?
“We love you guys,” said Nadella as he joined Altman on stage and was asked how Microsoft is thinking about the partnership.


[Image: Altman and Nadella on stage]

“Look, it’s been fantastic for us,” Nadella continued. “In fact, I remember the first time I think you reached out and said, ‘Hey, do you have some Azure credits?’ We’ve come a long way from there. You guys have built something magical.

“There are two things for us when it comes to the partnership. The first is, these workloads and even when I was listening backstage to how you’re describing what’s coming even, it’s just so different and new. I’ve been in the infrastructure business for three decades. The workload, the pattern of the workload, these training jobs are so synchronous and so large and so data parallel.

“And so, the first thing we’ve been doing is building in partnership with you the system all the way from thinking from power to the DC to the rack, to the accelerators, to the network, and just really the shape of Azure is drastically changed. And it’s changing rapidly in support of these models that you’re building. And so our job number one is to build the best system so that you can build the best models, and then make that all available to developers.

“And so, the other thing is, we ourselves are developers, building products. My own conviction of this entire generation of foundation models has completely changed — the first time I saw GitHub Copilot on GPT. And so we want to build our Copilot, GitHub Copilot, all as developers on top of OpenAI APIs so we’re very, very committed to that. What does that mean to developers? I always think of Microsoft as a platform company, a developer company, and a partner company, and so we want to make — for example, we want to make GitHub available — GitHub Copilot available, the enterprise edition available to all the attendees here so they can try it out. We’re very excited about that. And you can count on us to build the best infrastructure in Azure with your API support, and bring it to all of you, and then even things like the Azure marketplace, for developers building out products here to get to market rapidly. That’s sort of really our intent here.”

The Future
While that answered Altman’s question about how Microsoft is thinking about the partnership, Altman’s second question to Nadella concerned the future, or “future of the partnership or future of AI or whatever.”

“You know, like, there are a couple of things for me that I think are gonna be very, very key for us,” Nadella replied. “One is, I just described how the systems that are needed as you aggressively push forward on your roadmap, requires us to be on the top of our game, and we intend fully to commit ourselves deeply to making sure you all, as builders of these foundation models, have not only the best systems for training and inference but the most compute so you can keep pushing forward on the frontiers because I think that’s the way we’re going to make progress.

“The second thing I think both of us care about, in fact, quite frankly, the thing that excited both sides to come together, is your mission and our mission. Our mission is to empower every person and every organization on the planet to achieve more, and to me, ultimately AI is only going to be useful if it truly does empower. I saw the video you played earlier, and that was fantastic to see those — hear those voices describe what AI meant for them and what they were able to achieve. So ultimately it’s about being able to get the benefits of AI broadly disseminated to everyone, I think is going to be the goal for us.

“And the last thing is of course we’re very grounded in the fact that safety matters and safety is not something that you care about later, but it’s something we do shift left on and we’re very, very focused on that with you all.”

Conference News
Announced at the start of the event were new models and products, including:

  • A new GPT-4 Turbo model that is more capable, costs less, and supports a 128K context window
  • A new Assistants API that makes it easier for developers to build their own assistive AI apps that have goals and can call models and tools (a minimal usage sketch appears below)
  • New multimodal capabilities in the platform, including vision, image creation (DALL·E 3), and text-to-speech (TTS)

You can read more about them in the Pure AI article, “OpenAI Unveils More Powerful GPT-4 ‘Turbo’ at First Devcon Event.”
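To give a concrete sense of how developers might call these new offerings, here is a minimal sketch against OpenAI’s Python SDK (v1.x) as the API looked around DevDay. The model names (“gpt-4-1106-preview,” “dall-e-3,” “tts-1”) and the beta Assistants endpoints are the launch-time identifiers and may have been renamed or revised since, so treat this as an illustrative sketch rather than official sample code:

    import time
    from openai import OpenAI

    # Assumes the openai Python SDK v1.x and an OPENAI_API_KEY environment variable.
    client = OpenAI()

    # 1) GPT-4 Turbo chat completion (the 128K-context model; "gpt-4-1106-preview"
    #    was the launch-time model name and may have changed since).
    chat = client.chat.completions.create(
        model="gpt-4-1106-preview",
        messages=[{"role": "user", "content": "Summarize OpenAI DevDay in one sentence."}],
    )
    print(chat.choices[0].message.content)

    # 2) Assistants API (beta at launch): create an assistant, open a thread,
    #    add a user message, and start a run that the assistant processes.
    assistant = client.beta.assistants.create(
        name="DevDay demo assistant",
        instructions="Answer questions about the Azure/OpenAI partnership.",
        model="gpt-4-1106-preview",
    )
    thread = client.beta.threads.create()
    client.beta.threads.messages.create(
        thread_id=thread.id,
        role="user",
        content="What did OpenAI announce at DevDay?",
    )
    run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)

    # Poll until the run finishes, then read the newest message in the thread.
    while run.status in ("queued", "in_progress"):
        time.sleep(1)
        run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    print(messages.data[0].content[0].text.value)

    # 3) Multimodal endpoints: DALL·E 3 image generation and text-to-speech.
    image = client.images.generate(model="dall-e-3", prompt="A data center at sunrise")
    print(image.data[0].url)

    speech = client.audio.speech.create(
        model="tts-1",
        voice="alloy",
        input="Welcome to OpenAI DevDay.",
    )
    speech.stream_to_file("devday.mp3")

Microsoft exposes largely the same models through its Azure OpenAI Service, with authentication and endpoint configuration differing from the direct OpenAI API shown above.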

OpenAI also introduced “GPTs,” custom versions of ChatGPT that combine instructions, extra knowledge, and any combination of skills. GPTs exist for cooking, creative writing, laundry, board/card games, tech advice and more.

About the Author



David Ramel is an editor and writer for Converge360.




