
Microsoft Obtains Exclusive License for GPT-3 AI Model

Microsoft announced an agreement with OpenAI to license OpenAI's GPT-3 deep-learning model for natural-language processing (NLP). Although Microsoft's announcement says it has "exclusively" licensed the model, OpenAI will continue to offer access to the model via its own API.

Microsoft CTO Kevin Scott wrote about the agreement on Microsoft's blog. The deal builds on an existing relationship between the two organizations, which includes a partnership in building a supercomputer on Microsoft's Azure cloud platform. OpenAI recently used that supercomputer to train GPT-3, which at 175 billion parameters is one of the largest NLP deep-learning models trained to date. Scott said the licensing of GPT-3 will:

[Allow] us to leverage its technical innovations to develop and deliver advanced AI solutions for our customers, as well as create new solutions that harness the amazing power of advanced natural language generation.


GPT-3 is the third iteration of OpenAI's Generative Pre-trained Transformer model. The original GPT model was released in 2018 and contained 117 million parameters. For the next iteration, GPT-2, OpenAI scaled the model up more than 10x, to 1.5 billion parameters. Because the text generated by GPT-2 could often be as "credible" as text written by humans, OpenAI at first declined to release the full model, citing the potential for misuse in generating "deceptive, biased, or abusive language at scale." By November 2019, however, OpenAI had seen "no strong evidence of misuse" and decided to release the full model.

In July 2019, Microsoft and OpenAI announced a partnership, which included a $1 billion investment from Microsoft, to "jointly build new Azure AI supercomputing technologies." OpenAI also agreed to run its services on Azure and to make Microsoft its "preferred partner for commercializing new AI technologies." During its Build conference this May, Microsoft showcased the supercomputer built for OpenAI on its Azure cloud platform: "a single system with more than 285,000 CPU cores, 10,000 GPUs and 400 gigabits per second of network connectivity for each GPU server."

GPT-3, announced earlier this year, scaled GPT-2 up by more than 100x and set new state-of-the-art results on several NLP tasks. The training dataset contained nearly half a trillion words. Training the model on the Azure supercomputer consumed "several thousand petaflop/s-days of compute" and is estimated to have cost between $4.6 million and $12 million. As with GPT-2, OpenAI has not released the trained model; instead, it released a limited-access web API that lets developers call the model from their apps.
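OpenAI describes the API as a simple text-in, text-out HTTP interface. As a rough illustration, the sketch below calls a completions endpoint with Python's requests library; the URL path, the "davinci" engine name, and the request parameters are assumptions based on the public beta documentation of the time, not a definitive reference.

    import os
    import requests

    # Illustrative only: the endpoint path, engine name, and parameters are assumed
    # from the 2020 beta documentation and may not match the current API.
    api_key = os.environ["OPENAI_API_KEY"]  # key issued through the beta waitlist

    response = requests.post(
        "https://api.openai.com/v1/engines/davinci/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "prompt": "Summarize the Microsoft-OpenAI licensing deal in one sentence:",
            "max_tokens": 64,
            "temperature": 0.7,
        },
    )
    response.raise_for_status()
    print(response.json()["choices"][0]["text"])

Because access is gated by a waitlist, the key in the example is read from an environment variable rather than hard-coded.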

The licensing deal with Microsoft is the latest of several recent moves by OpenAI to monetize its technology. Originally founded as a non-profit, OpenAI launched a "capped-profit" entity, OpenAI LP, in March 2019, describing it as a "hybrid of a for-profit and nonprofit." The goal of the new company was to "raise investment capital and attract employees with startup-like equity." OpenAI's API page contains an FAQ section that defends its commercial products as "one of the ways to make sure we have enough funding to succeed." While the terms of the Microsoft license have not been disclosed, OpenAI claims the deal has "no impact" on users of its API, who can "continue building applications...as usual."

With the license agreement touted as "exclusive," and given OpenAI's past reluctance to release its trained models, many commenters have joked that the company should change its name to "ClosedAI." One Hacker News reader questioned the long-term commercial viability of GPT-3:

Anyone else feel like this idea of commercializing GPT-3 is bound to go nowhere as the research community figures out how to replicate the same capabilities in smaller cheaper open models within a few months or even a year?

The OpenAI API is currently in beta, with a waitlist for access. The 1.5-billion-parameter GPT-2 model is available on GitHub.
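For readers who want to experiment locally, the released GPT-2 weights are also mirrored on the Hugging Face model hub as "gpt2-xl" (the 1.5-billion-parameter checkpoint). The sketch below, which assumes the transformers library is installed, loads that mirror and samples a short continuation; it is an illustration, not part of OpenAI's own GitHub instructions.

    # Minimal sketch: load the 1.5B-parameter GPT-2 checkpoint via the Hugging Face
    # "gpt2-xl" mirror (assumed equivalent to the weights released on GitHub).
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2-xl")
    model = GPT2LMHeadModel.from_pretrained("gpt2-xl")

    prompt = "Microsoft has licensed the GPT-3 language model"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_length=60, do_sample=True, top_k=50)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))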
 
