{"id":16452,"date":"2024-08-16T12:26:26","date_gmt":"2024-08-16T12:26:26","guid":{"rendered":"\/?p=16452"},"modified":"2024-08-16T12:34:17","modified_gmt":"2024-08-16T12:34:17","slug":"individual-creativity-might-sell-degrees-and-papers-but-we-need-practical-people-who-work-hard-and-do-a-complete-job","status":"publish","type":"post","link":"\/?p=16452","title":{"rendered":"All humans and all AIs working together for 50 KiloYears"},"content":{"rendered":"<p>Mark Riedl @mark_riedl HIVE MIND: I saw an interesting-looking paper on Twitter within the last few weeks that was trying to quantify generalization in LLMs. I didn&#8217;t grab it at the time, and now I cannot track it down. Any thoughts on what it might have been? Thanks!<br \/>\nReplying to @mark_riedl<\/p>\n<hr \/>\n<p>A search for ( quantify generalization in LLMs ) returns 128,000 entries.<\/p>\n<p>One of them is &#8220;Position: Understanding LLMs Requires More Than Statistical Generalization&#8221; at https:\/\/arxiv.org\/abs\/2405.01964<\/p>\n<p>Humans can generalize, and so can algorithms. Usually it amounts to &#8220;compression&#8221;, because cache memory and permanent memory are both finite.<\/p>\n<p>Stop focusing on &#8220;one human and one AI&#8221; and look at &#8220;all humans and all AIs working together for KiloYears and MegaYears&#8221;.<\/p>\n<p>I have to deal with these issues every day. Go look at the whole world and think about the rest of your life and what you can do. Do not forget your family over the next 75 years. 
And put serious effort into working with &#8220;all human languages, all domain specific knowledge, and all devices&#8221;.<\/p>\n<p>Individual creativity might sell degrees and papers, but we need practical people who work hard and do a complete job.<\/p>\n<p>Filed as (All humans and all AIs working together for 50 KiloYears)<\/p>\n<p>Richard Collins, The Internet Foundation<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Mark Riedl @mark_riedl HIVE MIND: I saw an interesting-looking paper on Twitter within the last few weeks that was trying to quantify generalization in LLMs. I didn&#8217;t grab it at the time, and now I cannot track it down. Any thoughts on what it might have been? Thanks! Replying to @mark_riedl ( quantify generalization in <br \/><a class=\"read-more-button\" href=\"\/?p=16452\">Read More &raquo;<\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[77,73,72],"tags":[],"class_list":["post-16452","post","type-post","status-publish","format-standard","hentry","category-all-global-open-devices","category-all-knowledge","category-all-languages"],"_links":{"self":[{"href":"\/index.php?rest_route=\/wp\/v2\/posts\/16452","targetHints":{"allow":["GET"]}}],"collection":[{"href":"\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=16452"}],"version-history":[{"count":4,"href":"\/index.php?rest_route=\/wp\/v2\/posts\/16452\/revisions"}],"predecessor-version":[{"id":16456,"href":"\/index.php?rest_route=\/wp\/v2\/posts\/16452\/revisions\/16456"}],"wp:attachment":[{"href":"\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=16452"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=16452"},{"taxonomy":"post_tag","embeddable":true,"href":"\/index.php?rest_route=%2Fwp%2Fv
2%2Ftags&post=16452"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}