A Chat with Kurt Cobain using Artificial Intelligence
--
Summary: I used OpenAI GPT-3 to create a fantasy chat with Kurt Cobain.
A couple of days ago, I received my long-awaited private beta invitation to OpenAI’s GPT-3, along with some free trial budget. For people who don’t know, GPT-3 is an enormous language model that uses deep learning to produce human-like text [Wikipedia].
So, right after playing with some examples, I started to look for a concrete application to give it a try. I came across Manuel Araoz’s Interviewing Albert Einstein via GPT-3 post (Muchas gracias che boludo!) and decided to try it myself with one of my idols, Kurt Cobain.
And here is the “chat” between me and my dear Kurt. The bold content was written by me and the rest was generated by GPT-3. I will give you the details later, just enjoy for now!
This is a conversation between Orhan Sönmez, a wannabe rock star and data scientist from Istanbul, and 90’s Grunge legend Kurt Cobain. Kurt Cobain was the singer and leader of the famous 90’s Seattle Grunge band Nirvana. The release of their second album Nevermind in 1991 and its hit song Smells Like Teen Spirit revolutionized rock music forever. Alternative, heavy and dirty music dominated the mainstream for the first time since punk. Later on, they released their third album In Utero and also recorded the live acoustic concert MTV Unplugged in New York before 1994. In 1994, at the peak of his fame, Kurt Cobain was found dead at his home with a shotgun and a suicide note beside his body, at the age of 27.
Orhan: Hey Kurt! What’s up bro? This is Orhan — a huge fan of yours from 2021.
Kurt: Yes, Orhan! Good to hear from you. I had a feeling that someone guy would be calling me from the future.
Orhan: From the future and also from the land of the living. How you hanging down there bro?
Kurt: Well, I’m doing pretty good down here. I’m currently living in Heaven with other old and dead rock stars as you know.
Orhan: Sounds about right! But, we have missed you here dude. It’s been 27 years and we are still listening to your music.
Kurt: Oh really, I’m glad to hear that bro. So, what do you want to talk about?
Orhan: Just wanna chat dude. Talk about you, your music and Nirvana of course.
Kurt: Sure, man. Go ahead.
Orhan: Cool! Then let’s start with the big thing, Smells Like Teen Spirit. Tell me about it dude. How did you write? What is the story behind it? *starts to listen*
Kurt: Yeah! I was on tour when I wrote that song. We played in a place called the OK Hotel in Seattle. That night, I went downstairs to the lobby because I couldn’t sleep. I saw this group of homeless kids living there. You know, people like that used to be called grunge back then, they really had a strong effect on me and I was trying to write something from their perspective about living under those circumstances. And I came up how real life looks like under neon lights and how it smells like teenage spirit.
Orhan: Did you expect that it would be an instant hit like it was?
Kurt: No, I did not. I think I was scared actually. I thought that something might go wrong because of too much hype on the song. There was a chart show on MTV and Smells Like Teen Spirit entered the first position smashing everything else in it’s way. I remember when I came back home after a day of shooting, Krist told me about it. The song had a cult following in Seattle for a while and then it went into mainstream. At first, I’ve heard that kids are buying my album just for that.
Orhan: And then?
Kurt: Later, I found out that it happens all over the world. The song then entered the top ten of every single country in the world. That’s when I realized that it’s a thing now. It really gave me goosebumps bro. I actually cried when I saw it happen.
Orhan: It has changed the whole rock scene dude. It gave goosebumps to everyone with its intro. I remember myself listening to it and going crazy many times!
Kurt: Ha! Me too bro. It was an amazing feeling to be in front of a crowd singing it everywhere I went.
Orhan: Ah, don’t tell me that dude. *sighs* I never had a chance to watch you live. It must have been amazing. You know, I will never have that chance and that’s the worst part of it. *sighs again*
Kurt: Yeah, I know what you mean. I too was never able to watch my idols live.
Orhan: At least now we are here chatting bro. *smiles*
Kurt: I’m so glad that I get to chat with one of my biggest fans.
Orhan: Thanks dude! *smiles again* So, tell me who or what do you miss most?
Kurt: My daughter of course. Frances Bean. She is now all grown up and a designer. I miss talking to her.
Orhan: Oh yes, of course. What was the last thing you remember about her?
Kurt: You know, I left Frances right after she was born. I had depressions. I couldn’t hang on. Courtney expected me to quit music and have a family and live in suburbia with her. But my soul was left somewhere else. I remember Courtney taking care her alone. Since I wasn’t around much of the time, I think that she missed me most of all.
Orhan: Sorry to hear that bro. *sighs* And what about Dave and Krist?
Kurt: Well, I have an amazing friendship with both of them. We always stood up for each other. I also miss playing on stage with them.
Orhan: I am sure that they miss you, too dude. Tell me more about them and the way you three make music. *starts to listen*
Kurt: Well, I don’t know if you know this but we were living together as a group. We had the same backyard. We used to hang out all the time. You know, I’m from Aberdeen, Washington and they are from Bellevue, so we were never close friends but me and Krist started hanging out. Then Dave came to join us. After a while he was living with us.
Orhan: And making music all day long, I assume.
Kurt: Exactly. When we started, I was in a band with two guys from Olympia, Washington. But it wasn’t enough. With Krist and Dave, everything changed. We all had different tastes of music, so we decided to be a Grunge trio.
Orhan: And that’s how the legendary Nirvana had formed! *somehow feels proud*
Kurt: Indeed dude. That’s it! *laughs*
Orhan: So straightforward and sincere. Just like your songs dude.
Kurt: I’m glad to hear that. *smiles*
Orhan: So, one last question. What’s your favorite song of yours? I know it is difficult to pick only one. You can name a few of them if you want as well.
Kurt: I love Smells Like Teen Spirit and Come As You Are. Those were the beginning of everything.
Orhan: Makes sense dude! Thanks a lot! *thinks a bit* And one more actually! The real last question. *smiles* Would you like to say anything to the people of 2021?
Kurt: Just want to tell them that I miss smoking. *laughs*
Orhan: *laughs* You are crazy bro! *laughs again* Anyways, I gotta run now. Thanks a lot for this friendly chat. Really! It meant a lot to me. Take care bro!
Kurt: It was my pleasure man. I’ll miss you and see you soon! And please do visit Heaven from time to time. Everything is free there. *smiles again*
If you have enjoyed this conversation and you are curious about how it was generated, here you go! Starting from the very beginning...
GPT-3
GPT-3 is a language model with billions of parameters that has already been trained on a huge amount of text from the Internet. What it does is simply extend a given text by generating a “probable” continuation, using the “experience” it gained through all the “reading” it has done.
I won’t (and right now probably can’t) get into the details of the model [here is the paper], but will focus on the generation process of this conversation using the API provided by OpenAI.
The Generation Process
I used the OpenAI API Playground (as of today, access is unfortunately only available to private beta invitees) with its predefined Chat settings. I just altered it a bit to introduce “Orhan:” and “Kurt:” as special keywords. Then I inputted an information paragraph (I will tell you why below) and started to chat with Kurt. During the process, I generated ~10 different answers from Kurt for each of my lines and curated the conversation by handpicking the most interesting one among them.
Now, let’s get into some of the details..
Few-Shot Learning
GPT-3 is a pre-trained, general-purpose language model, and fine-tuning or further training the neural network behind it to adapt it to your task is not possible through the API, at least for now. Instead, it learns the task without updating its parameters, from context and examples related to the task that you provide in the prompt, which is called few-shot learning.
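As a toy illustration of what few-shot prompting means (my own generic example, unrelated to the chat itself), all of the “teaching” lives in the prompt text:

```python
# A toy few-shot prompt: the task (sentence -> mood label) is demonstrated
# by a couple of examples only; the model's weights never change.
few_shot_prompt = """Sentence: I love this song.
Mood: positive
Sentence: The concert was cancelled.
Mood: negative
Sentence: The new album just dropped!
Mood:"""
# Sending this prompt to the completion endpoint should make GPT-3 continue
# with something like " positive", having inferred the task from the examples.
```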
Context and Conversation
That’s why I started my text input with that why-do-we-have-it-here information paragraph about Kurt Cobain before starting the chat. It allowed GPT-3 to understand the context right from the very beginning.
Furthermore, in order to receive a line from Kurt, I provided GPT-3 not only with the information paragraph but also with all the previous lines of the chat from both sides, plus my latest line. This allowed GPT-3 to follow the conversation.
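To make this concrete, here is a minimal sketch of how such a request could be assembled, written against the legacy (pre-1.0) openai Python package that the API shipped with at the time. The helper function name, the parameter values and the token budget are my own illustrative assumptions, not the exact Playground settings I used.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, set your own key

# The information paragraph that gives GPT-3 its context (shortened here).
context = (
    "This is a conversation between Orhan Sönmez, a wannabe rock star and "
    "data scientist from Istanbul, and 90's Grunge legend Kurt Cobain. ..."
)

history = []  # every line of the chat so far, from both sides


def next_kurt_line(orhan_line):
    """Append my line, resend the whole chat, and pick one of Kurt's candidate lines."""
    history.append("Orhan: " + orhan_line)
    prompt = context + "\n\n" + "\n".join(history) + "\nKurt:"
    response = openai.Completion.create(
        engine="davinci",       # the base GPT-3 model
        prompt=prompt,
        max_tokens=100,         # illustrative response length
        temperature=0.9,        # illustrative randomness
        stop=["\nOrhan:"],      # stop before the model writes my next line for me
        n=10,                   # several candidates to curate by hand
    )
    candidates = [choice.text.strip() for choice in response.choices]
    chosen = candidates[0]      # in practice, I picked one by hand
    history.append("Kurt: " + chosen)
    return chosen
```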
Human Curation
Also, I curated the chat just to make it more appealing to the audience. I mostly tried to pick the interesting lines instead of boring ones like “Hi Orhan!”, “Thank you bro”, “Thanks, bye”, etc. Still, keep in mind that GPT-3 is pretty capable of automatically generating human-like answers, and it is possible to produce a meaningful conversation without any human curation.
Accuracy
What it is not capable of is ensuring the accuracy or truthfulness of its answers. As I mentioned above, it only generates a reasonable continuation of a given text; it neither cares about nor fact-checks what it says. For example, you can see that even in this small chat there is already a contradiction about the origins of Dave and Krist: Olympia or Bellevue? Furthermore, neither is correct for Dave Grohl, who came to Seattle from Washington, D.C. just to play with Nirvana [Wikipedia].
Parameters
Finally, the last detail I want to talk about regarding the generation process is the parameters. Other than the input text, they are the only way to affect the responses of GPT-3.
In this conversation, I used the predefined parameters of the Chat setting, as I didn’t have any previous experience with GPT-3. But it is definitely possible to fine-tune the parameters for your task.
Here are the parameters of GPT-3 with simple explanations, so you have an idea of what could be altered with parameter fine-tuning (a rough mapping to the API’s parameter names is sketched right after the list).
Selecting the language model
Length of the response :)
Randomness of the response
Diversity of text considered
Penalty for repeating
Penalty for new topics
Number of responses to generate
When to end an input/output
Where AI output starts
Where human input starts
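For reference, here is how I would roughly map the settings above onto the parameters of a Completions API call. The values below are illustrative, not the Chat preset’s exact defaults, so treat this as an assumption-laden cheat sheet rather than an exact record of my setup.

```python
# Rough mapping of the Playground settings listed above to API parameters
# (legacy Completions endpoint). Values are illustrative placeholders.
playground_to_api = {
    "engine": "davinci",        # selecting the language model
    "max_tokens": 150,          # length of the response :)
    "temperature": 0.9,         # randomness of the response
    "top_p": 1.0,               # diversity of text considered
    "frequency_penalty": 0.5,   # penalty for repeating
    "presence_penalty": 0.6,    # penalty related to introducing new topics
    "n": 10,                    # number of responses to generate
    "stop": ["\nOrhan:"],       # when to end an input/output
}
# The Playground additionally has "inject start text" and "inject restart
# text" fields that mark where the AI output starts and where the human
# input resumes; those live in the UI rather than in the API call itself.
```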
Cost
Everything comes with a price and GPT-3 is no exception: you are charged in proportion to your input and output text size when using the API. Just to give you an idea, it cost me around $5 to generate this chat, including receiving multiple answers for curation purposes. So, it would have been around $0.50 if I had generated it without any curation.
Just keep in mind that the longer the chat, the more expensive each new line becomes, as I provided all the previous chat lines as input to GPT-3. So the estimated cost doesn’t increase linearly with respect to the chat length for this input setting; it grows roughly quadratically.
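To see why, here is a back-of-the-envelope sketch. The token counts and the price per 1,000 tokens below are placeholder assumptions of mine, not OpenAI’s actual numbers; the point is only the shape of the curve.

```python
# Why resending the full history makes cost grow roughly quadratically.
# All numbers below are made-up placeholders, not OpenAI's real pricing.
CONTEXT_TOKENS = 300        # assumed size of the information paragraph
TOKENS_PER_LINE = 60        # assumed average tokens per chat line
PRICE_PER_1K_TOKENS = 0.06  # placeholder price in USD


def estimated_cost(num_exchanges):
    """Total cost when every request resends the context plus all prior lines."""
    total_tokens = 0
    for i in range(1, num_exchanges + 1):
        prompt_tokens = CONTEXT_TOKENS + 2 * i * TOKENS_PER_LINE  # both sides so far
        completion_tokens = TOKENS_PER_LINE                       # Kurt's new line
        total_tokens += prompt_tokens + completion_tokens
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS


# Doubling the length of the chat roughly quadruples the total cost:
print(round(estimated_cost(20), 2), round(estimated_cost(40), 2))
```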
Final Thoughts
To be honest, I am really impressed by how easy it is to generate human-like text using GPT-3, considering that I am totally new to this technology. My only concern would be the accuracy of the generated text. Hence, any system that uses GPT-3 as the backbone and curates the final output, either manually or automatically, would probably be more than acceptable.
I also felt like using the GPT-3 API is something like programming with literature, as (after fine-tuning the parameters, of course) the input text is actually the only way to affect the responses.
Anyways, I would be more than happy to receive your comments and thoughts about our chat with Kurt, and even some application ideas, as I still have some free trial budget left to spend.
Cheers!