A Chat with Kurt Cobain using Artificial Intelligence

GPT-3

GPT-3 is a language model with billions of parameters that has already been trained on a huge amount of text from the Internet. What it does is simply extend a given text: it generates a "probable" continuation using the "experience" it gained through the "reading" it has done.
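As an illustration, here is a minimal sketch of that "extend the text" behaviour through the API. It assumes the openai Python client from the beta period and an API key in an OPENAI_API_KEY environment variable; the engine name and exact parameter names may differ in other versions of the library.

```python
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # assumed setup

# GPT-3 simply continues whatever text it is given.
response = openai.Completion.create(
    engine="davinci",                       # the base GPT-3 model in the beta
    prompt="Kurt Cobain was the frontman of",
    max_tokens=40,                          # how much continuation to generate
)

print(response.choices[0].text)             # a "probable" continuation of the prompt
```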

The Generation Process

I used the OpenAI API Playground (as of today, access is unfortunately limited to private beta invitees) with its predefined Chat settings. I just altered it a bit to introduce "Orhan:" and "Kurt:" as special keywords. Then I entered an information paragraph (I will tell you why below) and started to chat with Kurt. During the process, I generated ~10 different answers from Kurt for each of my lines and curated the conversation by handpicking the most interesting one among them.
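Under the hood, the Chat settings boil down to a completion call where "Orhan:" is used as a stop sequence, so the model stops before it starts writing my lines for me. Here is a rough sketch, again assuming the beta-era openai client; the short prompt is a stand-in, not my actual input.

```python
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# The transcript ends with "Kurt:" so the model's continuation becomes Kurt's reply.
chat_prompt = (
    "The following is a conversation between Orhan and Kurt Cobain.\n\n"
    "Orhan: Hi Kurt, how are you doing?\n"
    "Kurt:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=chat_prompt,
    max_tokens=60,
    stop=["Orhan:"],   # stop before the model starts speaking as me
)

print("Kurt:" + response.choices[0].text)
```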

Few-Shot Learning

GPT-3 is a pre-trained, general-purpose language model, and fine-tuning or further training the neural network behind it to adapt it to your task is not possible through the API, at least for now. Instead, it learns the task without updating its parameters, from some task-related content and examples included in the input, which is called few-shot learning.

Context and Conversation

That’s why I started my text input with a why-do-we-have-it-here information paragraph about Kurt Cobain before starting the chat. This allowed GPT-3 to understand the context even at the very beginning.
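Concretely, my input looked roughly like the layout below: a short information paragraph followed by the chat transcript, which together act as the few-shot conditioning. The paragraph wording here is only illustrative, not my exact text.

```python
# Sketch of the prompt layout: context paragraph first, then the chat.
chat_prompt = """Kurt Cobain was the lead singer, guitarist and primary songwriter of the rock band Nirvana. The following is a friendly conversation between Orhan and Kurt Cobain.

Orhan: Hi Kurt, how is it going?
Kurt: Pretty good, just hanging around. What do you want to talk about?
Orhan: Tell me how Nirvana got started.
Kurt:"""
```

Everything the model "knows" about the setting has to fit into this single block of text, since its weights are never updated during the chat.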

Human Curation

Also, I curated the chat just to make it more appealing to the audience: I mostly picked the interesting lines over boring ones like "Hi Orhan!", "Thank you bro", "Thanks, bye", etc. So keep in mind that GPT-3 is actually quite capable of automatically generating human-like answers, and it is possible to produce a meaningful conversation without any human curation.
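The curation itself needs no special tooling: the API can return several candidate completions for the same prompt in one call (the n parameter), and I simply read them and kept one. A sketch under the same assumptions as above:

```python
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

chat_prompt = "Orhan: Tell me how Nirvana got started.\nKurt:"  # transcript ending with "Kurt:"

# Ask for ~10 candidate replies and review them by hand.
response = openai.Completion.create(
    engine="davinci",
    prompt=chat_prompt,
    max_tokens=60,
    n=10,                 # number of candidate completions to return
    stop=["Orhan:"],
)

for i, choice in enumerate(response.choices):
    print(f"[{i}] Kurt:{choice.text}")

# The handpicked candidate is then appended to the transcript before the next turn.
```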

Accuracy

What it is not capable of is ensuring the accuracy or truthfulness of its answers. As I mentioned above, it only generates a reasonable continuation of a given text; it does not care about, nor fact-check, what it says. For example, you can see that even in this small chat there is already a contradiction about the origins of Dave and Krist: Olympia or Bellevue? Furthermore, neither answer is correct about Dave Grohl, who came to Seattle from Washington D.C. just to play with Nirvana [Wikipedia].

Parameters

Finally, the last detail I want to mention about the generation process is the parameters. Other than the input text, they are the only way to influence the responses of GPT-3.
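The Playground sliders correspond to parameters of the underlying API call. Below is a hedged sketch of the kind of values one might use for a chat like this; the numbers are illustrative guesses, not the exact settings I used.

```python
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    engine="davinci",
    prompt="Orhan: Hi Kurt!\nKurt:",
    max_tokens=60,          # upper bound on the length of each reply
    temperature=0.9,        # higher values give more varied, creative wording
    top_p=1.0,              # nucleus sampling; an alternative knob for randomness
    frequency_penalty=0.3,  # discourage repeating the same phrases
    presence_penalty=0.3,   # nudge the model toward bringing up new topics
    stop=["Orhan:"],
)
print(response.choices[0].text)
```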

Cost

Everything comes with a price, and GPT-3 is no exception: when using the API, you are charged in proportion to your input and output text size. To give an idea, it cost me around $5 to generate this chat, including the multiple answers requested for curation purposes. So it would have been around $0.50 if I had generated it without any curation.

Final Thoughts

To be honest, I am really impressed by how easy it is to generate human-like text using GPT-3, considering that I am totally new to this technology. My only concern is the accuracy of the generated text. Hence, any system that uses GPT-3 as the backbone and curates the final output, whether manually or automatically, would probably be more than acceptable.
