Why does ChatGPT-4 suck at Wordle?

It seems almost intentional, like it gets tired of repetitive tasks? It says things like a 5-letter word with n in the 4th position is "pinch". Should be a super easy task for a computer, so why say wrong things? It keeps apologizing and thanking me for my patience. #tech #ai #openai

Microsoft EIuj73 Jul 27, 2023

It's an LLM, not really ML. It's not solving problems, just trying to assemble sentences that look like sentences it's seen on the internet

Waymo UvpM20 Jul 27, 2023

What do you mean, not really ML? It's peak ML 😂

Chubb Dota2fan Jul 27, 2023

Then why is it so good at coding?

QuantumBlack sIRF37 Jul 27, 2023

The same reason it's bad at typing a word backwards. It thinks in 'tokens', not words or letters. A token is a relatively common group of letters, and it works out to roughly 4 tokens for every 3 words.
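
You can actually see this with OpenAI's tiktoken library — rough sketch, assuming you have tiktoken installed and that cl100k_base is the right encoding for gpt-4:

```python
# Show what the model actually "sees" for a few words.
# Assumes: pip install tiktoken; cl100k_base (the gpt-4 encoding).
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
for word in ["pinch", "wordle", "backwards"]:
    token_ids = enc.encode(word)
    pieces = [enc.decode([t]) for t in token_ids]
    print(word, "->", token_ids, pieces)
# Common short words may come out as a single token; longer ones get chopped
# into multi-letter chunks. Either way it never sees individual letters.
```

So "the 4th letter" just isn't something it ever looks at directly.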

Amazon bobbytabl Jul 27, 2023

Current-gen LLMs are next-token predictors. Wordle-style queries like the one you mentioned need infill prediction (filling in letters at fixed positions inside a word), which gpt-4 is not very good at.
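
For contrast, the actual constraint is trivial for ordinary code — minimal sketch, assuming a one-word-per-line list at /usr/share/dict/words (just the usual Linux/macOS location, nothing special about it):

```python
# Brute-force "5-letter word with n in the 4th position" against a word list.
with open("/usr/share/dict/words") as f:
    words = {line.strip().lower() for line in f}

matches = sorted(w for w in words if len(w) == 5 and w.isalpha() and w[3] == "n")
print(matches[:20])  # e.g. "crane" qualifies; "pinch" does not (its n is 3rd)
```

Same question, but as a search over a list instead of next-token prediction — which is why it feels like it "should be easy for a computer".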

Microsoft Brrrrrrre Jul 27, 2023

Maybe good prompt crafting would help?

Bloomberg Km20 Jul 27, 2023

Tokenization hides the underlying characters from the model — it's trained on token IDs, not letters. You can give it queries like "what is the nth character of this sentence: " and it will suck at that. It also has a loose grasp of whitespace, since tokenization mostly folds whitespace into the tokens rather than keeping it separate.

Bloomberg Km20 Jul 27, 2023

You would need to train it on content where shit is literally spelled out... like "zoo is spelled z-o-o"... then it might click. But you can imagine there is not a lot of content like that in its training data.
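
Mocking up that kind of data is trivial, to be clear — tiny sketch, word list made up for illustration:

```python
# Generate "spelled out" lines of the kind described above.
# The word list and phrasing are made up; real fine-tuning data would need
# far more coverage and variety.
words = ["zoo", "crane", "pinch"]
for w in words:
    print(f'{w} is spelled {"-".join(w)}')
# zoo is spelled z-o-o
# crane is spelled c-r-a-n-e
# pinch is spelled p-i-n-c-h
```

The hard part isn't generating it, it's that almost none of the organic training text looks like this.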

Bloomberg Km20 Jul 27, 2023

Btw Bing Chat in precise mode is better at this... they probably did some extra tuning for this use case