Tokenization

All discussions tagged with this topic

Found 2 discussions

Discover why advanced AI models like ChatGPT can't easily count to a million, how tokenization causes this limitation, and what it reveals about the difference between sophisticated pattern matching and true general intelligence.
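A minimal sketch of the tokenization issue behind this discussion. It uses the open-source `tiktoken` library and the `cl100k_base` encoding as an assumed stand-in (the discussion doesn't name a specific tokenizer): numbers are split into BPE chunks rather than seen digit by digit, so "counting" means predicting the next multi-digit token sequence, not incrementing an integer.

```python
# Sketch: how a BPE tokenizer splits numbers into tokens.
# Assumes the `tiktoken` library (pip install tiktoken) and the
# cl100k_base encoding; the discussion names neither specifically.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

for text in ["7", "42", "999999", "1000000", "count to a million"]:
    tokens = enc.encode(text)
    pieces = [enc.decode([t]) for t in tokens]
    print(f"{text!r} -> {len(tokens)} token(s): {pieces}")
```

Running this shows that large numbers decompose into several arbitrary-looking chunks, which is one concrete reason digit-level tasks are harder for these models than they look.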

Users are observing AI models like ChatGPT and Gemini displaying 'thoughts' in non-English languages. This discussion explores why that happens, linking it to multilingual training data, internal token efficiency, and research findings that suppressing the behavior can even reduce performance.
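To make the token-efficiency point concrete, here is a small sketch, again assuming `tiktoken` with the `cl100k_base` encoding (neither is named in the discussion): the same sentence costs a different number of tokens in different languages, one hypothesis for why a model might reason in whichever language encodes an idea most compactly. The sample sentences are illustrative.

```python
# Sketch: the same sentence uses different token counts per language,
# illustrating the "internal token efficiency" hypothesis.
# Assumes `tiktoken` (pip install tiktoken); sentences are examples only.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

samples = {
    "English": "The quick brown fox jumps over the lazy dog.",
    "German": "Der schnelle braune Fuchs springt über den faulen Hund.",
    "Chinese": "敏捷的棕色狐狸跳过了懒狗。",
}

for lang, sentence in samples.items():
    n = len(enc.encode(sentence))
    print(f"{lang}: {n} tokens for {len(sentence)} characters")
```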