
Power of AI With Cloud Computing is “Stunning” to Microsoft’s Nadella 


Microsoft CEO Satya Nadella said at MIT’s AI and the Work of the Future Congress 2020 that the ability of cloud computing to harness massive computing power is ‘transformative.’ (Photo by Mohammad Rezaie on Unsplash.)

By AI Trends Staff  

Asked what in the march of technology most impresses him, Microsoft CEO Satya Nadella said at MIT’s AI and the Work of the Future Congress 2020, held virtually last week, that he is struck by the ability of cloud computing to provision massive computing power.

Satya Nadella, CEO, Microsoft

“The computing available to do AI is transformative,” Nadella said to David Autor, the Ford Professor of Economics at MIT, who conducted the Fireside Chat session.   

Nadella mentioned the GPT-3 general-purpose language model from OpenAI, an AI lab searching for a commercial business model. GPT-3 is an autoregressive language model with 175 billion parameters. OpenAI agreed to license GPT-3 to Microsoft for its own products and services, while continuing to offer OpenAI’s API to the market. Today the API is in a limited beta as OpenAI and academic partners test and assess its capabilities.
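For context on what the limited beta looked like to developers, here is a minimal, hypothetical sketch of requesting a completion from GPT-3 through the OpenAI API of that period; the engine name, prompt, and parameters are illustrative assumptions, not details drawn from the Microsoft agreement.

```python
# Minimal sketch of calling the GPT-3 completion API during the limited beta.
# Assumes the then-current `openai` Python client and a beta API key; the
# engine name, prompt, and parameters below are illustrative placeholders.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # key issued to beta participants

response = openai.Completion.create(
    engine="davinci",      # largest GPT-3 engine exposed in the beta
    prompt="Summarize the benefits of cloud computing for AI workloads:",
    max_tokens=100,        # cap the length of the generated continuation
    temperature=0.7,       # moderate sampling randomness
)

print(response.choices[0].text.strip())
```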

The Microsoft license is exclusive, however, meaning Microsoft’s cloud computing competitors cannot access it in the same way. The agreement was seen as important to helping OpenAI cover the expense of getting GPT-3 up and running and maintaining it, according to an account in TechTalks. Those costs include an estimated $10 million to research GPT-3 and train the model, tens of thousands of dollars in monthly cloud computing and electricity costs to run the models, an estimated one million dollars annually to retrain the model to prevent decay, and additional costs for customer support, marketing, IT, legal, and other requirements of putting a software product on the market.
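A rough back-of-the-envelope tally of those recurring figures is sketched below; the monthly cloud-and-electricity number is an assumed point estimate within the “tens of thousands of dollars” range reported, so the totals are illustrative only.

```python
# Back-of-the-envelope tally of the recurring GPT-3 operating costs cited above.
# The monthly cloud/electricity figure is an assumed placeholder within the
# reported "tens of thousands of dollars" range; all numbers are estimates.
monthly_cloud_and_power = 30_000              # assumed point estimate (USD per month)
annual_retraining = 1_000_000                 # estimated yearly retraining to prevent decay
one_time_research_and_training = 10_000_000   # estimated research and initial training

recurring_annual = monthly_cloud_and_power * 12 + annual_retraining
print(f"Estimated recurring annual cost: ${recurring_annual:,}")    # $1,360,000
print(f"Estimated first-year total: ${recurring_annual + one_time_research_and_training:,}")
```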

Earlier this year at its Build developers conference, Microsoft announced it worked with OpenAI to assemble what Microsoft said was “one of the top five publicly disclosed supercomputers in the world,” according to an account on the Microsoft AI blog. The infrastructure will be available in Azure, Microsoft’s cloud computing offering, to train “extremely large” AI models.   

The partnership between Microsoft and OpenAI aims to “jointly create new supercomputing technologies in Azure,” the blog post stated.  

“And it’s not just happening in the cloud, it’s happening on the edge,” Nadella said.

Applications for cloud and edge computing working together—such as natural language generation, image completion, or virtual simulations from wearable sensors that see the work—are very compute-intensive. “It’s stunning to see the capability” of the GPT-3 model applied to this work, Nadella said. “Something in the model architecture gives me confidence we will have more breakthroughs at an accelerating pace,” he said.

Potential Strategic Advantage in Search, Voice Assistants from GPT-3 Models  

Strategically, the GPT-3 models could give Microsoft a real advantage, the TechTalks article suggested. For example, in the search engine market, Microsoft’s Bing has just over a 6% share, behind Google’s 87%. Whether GPT-3 will enable Microsoft to roll out new features that redefine how search is used remains to be seen.

Microsoft is also likely to explore potential advantages GPT-3 could bring to the voice assistant market, where Microsoft’s Cortana holds a 22% share, behind Apple’s Siri at 35%.

Nadella does have concerns related to the power of AI and automation. “We need a set of design principles, from ethics to actual engineering and design, and a process to allow us to be accountable, so the models are fair and not biased. We need to ‘de-bias’ the models and that is hard engineering work,” he said. “Unintended consequences” and “bad use cases” are also challenges, he said, without elaborating. [Ed. Note: A “misuse case,” or bad use case, describes a function the system should not allow, according to Wikipedia.]

Moderator Autor asked Nadella how Microsoft decides which problems to work on using AI. Nadella mentioned “real world small AI” and the company’s Power Platform tools, which enable several products to work well together as part of a business application platform. This foundation is built on what had been called the Common Data Service for apps and, as of this month (November), is called “Dataverse.” Data is stored in tables that can reside in the cloud.

Using the tools, “People can take their domain expertise and turn it into automation using AI capabilities,” Nadella said. 
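Since Dataverse exposes those tables through an OData-style Web API, a minimal, hypothetical sketch of reading rows from a standard table might look like the following; the environment URL and access token are placeholder assumptions, supplied in practice by your own Power Platform environment and Azure AD app registration.

```python
# Minimal sketch of reading rows from a Dataverse table via its OData Web API.
# The environment URL and bearer token are placeholders -- obtain real values
# from your own Power Platform environment and Azure AD app registration.
import requests

ENV_URL = "https://yourorg.crm.dynamics.com"   # hypothetical environment URL
TOKEN = "<oauth2-access-token>"                # placeholder OAuth 2.0 token

response = requests.get(
    f"{ENV_URL}/api/data/v9.1/accounts",       # 'accounts' is a standard table
    params={"$select": "name", "$top": "5"},   # OData query options
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
    },
)
response.raise_for_status()

for row in response.json()["value"]:           # rows are returned under 'value'
    print(row["name"])
```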

Asked what new job opportunities he anticipates the use of AI will create, Nadella compared the transition going on today to the onset of computer spreadsheets and word processors. “The same thing is happening today,” as computing is getting embedded in manufacturing plants, retail settings, hospitals, and farms. “This will shape new jobs and change existing jobs,” he said.

‘Democratization of AI’ Seen as Having Potential to Lower Barriers  

The two discussed whether the opportunities from AI extend to workers without abstract skills like programming. Discussion ensued on the “democratization of AI,” which lowers barriers for individuals and organizations to gain experience with AI, allowing them, for example, to leverage publicly available data and algorithms to build AI models on cloud infrastructure.
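As a concrete illustration of that lowered barrier, the hypothetical sketch below builds a small classifier from a publicly available dataset with an open-source library; it stands in for the kind of model an individual or small team could train and then host on cloud infrastructure.

```python
# Illustrative sketch of "democratized" AI: a small model built from publicly
# available data and open-source algorithms, ready to be hosted in the cloud.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)          # public dataset bundled with scikit-learn
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000)  # off-the-shelf algorithm, no custom research needed
model.fit(X_train, y_train)

print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```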

Relating it to education, Autor wondered if access to education could be “democratized” more. Nadella said, “STEM is important, but we don’t need everyone to get a master’s in computer science. If you can democratize the expertise to help the productivity of the front line worker, that is the problem to solve.” 

Autor asked if technology has anything to do with the growing gap between low-wage and high-wage workers, and what could be done about it. Nadella said Microsoft is committed to making education that leads to credentials available. “We need a real-time feedback loop between the jobs of the future and the skills required,” Nadella said. “To credential those skills, we are seeing more companies invest in corporate training as part of their daily workflow. Microsoft is super focused on that.” 

 

A tax credit for corporations that invest in training would be a good idea, Nadella suggested. “We need an incentive mechanism,” he said, adding that a feedback loop would help training programs to be successful.  

Will “telepresence” remain after the pandemic is over? Autor asked. Nadella outlined four thoughts: first, the collaboration between front line workers and knowledge workers will continue, since it has proved to be more productive in some ways; second, meetings will change but collaboration will continue before, during, and after meetings; third, learning and the delivery of training will be better assisted with virtual tools; and fourth, “video fatigue” will be recognized as a real thing.

“We need to get people out of their square boxes and into a shared sense of presence, to reduce cognitive load,” Nadella said. “One of my worries is that we are burning the social capital that got built up. We need to learn new techniques for building social capital back.”  

Learn more about the AI and the Work of the Future Congress 2020, GPT-3 in TechTalks and on the Microsoft AI blog, the Power Platform, and Dataverse.

Source: https://www.aitrends.com/cloud-2/power-of-ai-with-cloud-computing-is-stunning-to-microsofts-nadella/
