A Reddit user tries using GPT-4 to play a game. Here's how it turned out

With its exceptional capabilities, OpenAI's GPT-4 has genuinely changed the artificial intelligence (AI) landscape. The chatbot can do far more than one might imagine: it has passed the bar exam, solved challenging logic puzzles, produced original recipes, and written poems in response to prompts. It can even play games with you. Yes, you read that right!

The screenshot, shared by user 'Secret-Aardvark-366' to the subreddit 'ChatGPT,' is captioned, "Tried to play a game with GPT-4." In the conversation, a person named Alex invites GPT-4 to play a game in which one player posts a series of emojis and the other spells out a word from the first letter of each emoji, adding, "I'll start off." The user then sent GPT-4 a set of emojis that they said spelled the word "Hello." The chatbot joined in and replied with its own set of emojis, claiming they represented a music-related term. Regrettably, the word the user spelled out from those emojis was "Casapoal."


The original poster then admitted in the comments that "it got even worse" and shared a link to their complete exchange with the chatbot. After the AI revealed its intended answer as "baseball," the user corrected it, pointing out that "baseball has nothing to do with music" and that "piano does not start with B." The chatbot acknowledged its error and amended the term to "Basap-oal." The pair then picked up where they left off, and GPT-4 resumed throwing out emojis.

Exasperated by GPT-4's replies, Alex asks, "Can you make a word that exists?" The chatbot apologizes for its earlier responses and makes several more unsuccessful attempts.

"B is a piano. Duh," one Reddit user joked. Another remarked, "By the way, this passed the bar exam." A third wrote, "You're close! It's trying so hard, bless this machine." A fourth laughed, "Oh, that's it, baseball, that popular musical style." A fifth said, "Lol, this is gold." What do you think about this? Have you ever tried playing a game with GPT-4? If so, how did it go?

