Original article published by Ashish Pal in Artificial Intelligence on Medium
Yes, I am talking about the latest and most hyped OpenAI algorithm, GPT-3, which has shown exceptional results in natural language processing and generation modelling. OpenAI has released a beta version of its most advanced GPT-3 (Generative Pre-trained Transformer 3) algorithm as an API that provides a general-purpose “text in, text out” interface.
What is GPT-3?
GPT-3 is the most advanced Generative Pre-trained Transformer algorithm, released by OpenAI as a beta API designed to be simple enough for anyone to use, yet flexible enough to make machine learning teams more productive.
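The “text in, text out” idea boils down to a single HTTP call: you send a text prompt and receive generated text back. A minimal Python sketch of that pattern is below; the endpoint URL and field names are illustrative assumptions for this article, not taken from the official API documentation.

```python
import json

# Assumed endpoint for illustration only -- check OpenAI's docs for the real one.
API_URL = "https://api.openai.com/v1/completions"

def build_completion_request(prompt: str, max_tokens: int = 64) -> str:
    """Build the JSON body for a hypothetical text-in, text-out completion call."""
    body = {
        "prompt": prompt,          # the text you feed in
        "max_tokens": max_tokens,  # cap on how much text comes back
    }
    return json.dumps(body)

# To actually call the API you would POST this body with your API key, e.g.:
# requests.post(API_URL,
#               data=build_completion_request("Translate to French: cheese"),
#               headers={"Authorization": "Bearer <YOUR_KEY>"})
print(build_completion_request("Q: What is GPT-3?\nA:"))
```

The point of the sketch is how small the interface is: one text field in, one text field out, with everything else (task framing, examples, instructions) expressed inside the prompt itself.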
The original GPT showed how a generative language model can acquire world knowledge and process long-range dependencies by pre-training on a diverse corpus with long stretches of contiguous text.
GPT-2 (Generative Pre-trained Transformer 2), announced in February 2019, is an unsupervised transformer language model and the successor to GPT. The corpus it was trained on, called WebText, contains slightly over 8 million documents totalling 40 GB of text from URLs shared in Reddit submissions. The full version of GPT-2 was not immediately released out of concern over potential misuse, including applications for writing fake news.
GPT-3 (Generative Pre-trained Transformer 3) was announced in May 2020. It contains 175 billion parameters, roughly 10x more than any other language model available at the time.