
This new system, GPT-3, had spent months learning the ins and outs of natural language by analyzing thousands of digital books, the length and breadth of Wikipedia, and nearly a trillion words posted to blogs, social media and the rest of the internet.

These systems are known as universal language models, and they can help power a wide range of tools, like services that automatically summarize news articles and “chatbots” designed for online conversation.

But GPT-3 — which learned from a far larger collection of online text than previous systems — opens the door to a wide range of new possibilities, such as software that can speed the development of new smartphone apps, or chatbots that can converse in far more human ways than past technologies.

GPT-3 analyzed digital prose on an unprecedented scale, spending months looking for patterns in huge amounts of text posted to the internet.

During its months of training, GPT-3 identified more than 175 billion parameters — mathematical representations of patterns — in that sea of books, Wikipedia articles and other online texts.

These patterns amount to a map of human language: a mathematical description of the way we piece characters together, whether we are writing blogs or coding software programs.
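To make that idea concrete, here is a toy sketch of the same premise at a vastly smaller scale: learn the statistics of which character tends to follow which in a body of text, then generate new text by sampling from those learned probabilities. The tiny corpus and every name in this sketch are illustrative assumptions; GPT-3's actual machinery is far more sophisticated, but the premise — generation driven by learned next-symbol patterns — is the same.

    from collections import Counter, defaultdict
    import random

    # A tiny training corpus; GPT-3 trained on nearly a trillion words.
    corpus = "the cat sat on the mat. the dog sat on the log."

    # Count how often each character follows each other character --
    # a crude, two-character version of "patterns in text".
    transitions = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        transitions[prev][nxt] += 1

    def generate(start, length=40):
        # Repeatedly sample the next character in proportion to how
        # often it followed the current one during "training".
        out = [start]
        for _ in range(length):
            counts = transitions[out[-1]]
            if not counts:
                break
            chars, weights = zip(*counts.items())
            out.append(random.choices(chars, weights=weights)[0])
        return "".join(out)

    print(generate("t"))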

Before asking GPT-3 to generate new text, you can focus it on patterns it may have learned during its training, priming the system for certain tasks.
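As a hedged illustration of such priming, the sketch below seeds a prompt with a few example translations before asking the model to continue the pattern. It assumes the legacy (pre-1.0) openai Python client's Completion endpoint; the engine name, the example sentences and the parameter choices are assumptions made for illustration, not details from the article.

    import os
    import openai  # legacy (pre-1.0) openai-python interface

    openai.api_key = os.environ["OPENAI_API_KEY"]

    # Priming: the prompt shows a few input/output pairs for the task,
    # then leaves the final output blank for the model to complete.
    # The English-to-French task and sentences are illustrative only.
    prompt = (
        "English: Hello, how are you?\n"
        "French: Bonjour, comment allez-vous ?\n"
        "English: Where is the library?\n"
        "French: Où est la bibliothèque ?\n"
        "English: I would like a coffee.\n"
        "French:"
    )

    response = openai.Completion.create(
        engine="davinci",   # assumed engine name, for illustration
        prompt=prompt,
        max_tokens=30,
        temperature=0.0,    # keep the completion close to deterministic
        stop="\n",          # stop at the end of the generated line
    )
    print(response.choices[0].text.strip())

With a different set of example pairs, the same mechanism can prime the system for summarization, question answering or other tasks, with no retraining.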

But GPT-3 can do things that previous models could not, like write its own computer code.
