Tokenizing and compressing, so humans and computers can solve large problems with finite memory – where it matters
Replying to @YiMaTweets
Yi Ma, it is not so much maximizing information as maximizing the chance that the information is remembered and applied correctly in critical situations, especially where human lives are on the line. Where it matters, a lot. Where communication is limited, tokenizing to global, open, verified identifiers greatly compresses global-scale systems, and that compression lets critical messages get through without error.
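To make the compression point concrete, here is a minimal sketch of tokenizing to shared identifiers. The registry, the token names, and the checksum scheme are hypothetical, not any particular standard; they only show how a short, shared token can stand in for a long description and still be checked for corruption before anyone acts on it.

```python
# Minimal sketch: a shared registry of short tokens standing in for long,
# verified descriptions. Names and checksum scheme are illustrative only.
import hashlib

# Globally published registry: short token -> full verified record.
REGISTRY = {
    "MED-0042": "Epinephrine auto-injector, 0.3 mg, intramuscular, adult dose",
    "LOC-7781": "Field hospital B, north entrance, grid 34T FJ 1234 5678",
}

def checksum(token: str) -> str:
    """Short integrity check so a garbled token is rejected, not misread."""
    return hashlib.sha256(token.encode()).hexdigest()[:4]

def encode(token: str) -> str:
    """Send the short token plus its checksum instead of the full description."""
    if token not in REGISTRY:
        raise KeyError(f"unknown token: {token}")
    return f"{token}:{checksum(token)}"

def decode(message: str) -> str:
    """Verify the checksum, then expand the token back to the full record."""
    token, received = message.split(":")
    if checksum(token) != received:
        raise ValueError("corrupted message, do not act on it")
    return REGISTRY[token]

full = REGISTRY["MED-0042"]
wire = encode("MED-0042")
print(len(full), "chars uncompressed ->", len(wire), "chars on the wire")
print(decode(wire))
```

The design point is that both sides already hold the registry, so only the short identifier travels over the limited channel, and the checksum makes a mangled token fail loudly rather than expand to the wrong record.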
We compress so that our finite human memories can hold more of a problem at once. For many problems, having the whole problem in mind at once is a critical part of solving it, so tokenizing (one way of compressing) is what makes the problem solvable at all by unaided humans. A global, open token is easier to remember and apply consistently, so there is no ambiguity about what the pieces mean. In a game or a process, knowing the precise steps, practicing them, and then executing them exactly can save lives, or get the job done where it really matters.