Tokenizing and compressing, so humans and computers can solve large problems with finite memory – where it matters
Yi Ma (@YiMaTweets): "Learning is all about maximizing information. We compress to learn and we learn to compress."

Replying to @YiMaTweets: Yi Ma, it is not maximizing information so much as maximizing the chance that …