Change decades of uncoordinated efforts to global open lossless permanent accessible results


MA Oviedo-Garcia (@maoviedogarcia): More than 2 million research papers have disappeared from the Internet despite having an active DOI
https://nature.com/articles/d41586-024-00616-5
Replying to @maoviedogarcia
I think it is much worse than that. Rules, regulations, results, data, events, the background for papers: trace any website, event, or topic and it degrades quickly. Look at LLMs built with 1- and 2-sigma methods, or corporations relying on ads and marketing rather than accounting and curation. Any topic, globally, is going to be fragmented. A billion humans, writing independently, will not generate globally consistent systems without real effort. Groups who slave over print publications for a few readers will dump things on the web with no editing or curation at all. An expensive effort, intended to be part of a sustained global project "for the good of all", will have no permanent memory. Check government-funded projects and see if they even bother to share their results and data. Check "big donor" projects and see if they bother to spend any money at all to monitor the effects of the people they pay. It goes into consumption, not sustainable global change. Salaries, not impacts.

On LLMs, you ought to be able to estimate how many replications of 2-sigma algorithms are needed to refine global models to even 4-sigma precision. Tens of thousands of small university departments running for decades on a topic, with zero effort at checking the whole, do not add up to one globally coordinated, comprehensive, precise treatment of that topic: one run for a year with global open lossless methods, then remembered and applied forever.
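
A rough way to frame that estimate, as a sketch under textbook assumptions (independent, unbiased replications whose results are actually pooled losslessly; the helper name below is illustrative, not from the original): the standard error of a pooled result falls as the square root of the number of replications N, so significance grows as sqrt(N), and going from 2 sigma to 4 sigma takes only (4/2)^2 = 4 pooled replications.

import math

def replications_needed(current_sigma: float, target_sigma: float) -> int:
    # Replications of a current_sigma-level result needed to reach
    # target_sigma, assuming independent, unbiased estimates pooled
    # losslessly, so that significance scales as sqrt(N).
    return math.ceil((target_sigma / current_sigma) ** 2)

print(replications_needed(2, 4))  # -> 4
# Results that are never pooled add nothing to the whole:
# 10,000 isolated 2-sigma studies remain 10,000 separate 2-sigma studies.

The arithmetic sharpens the contrast: four coordinated, pooled replications would achieve what tens of thousands of uncoordinated ones never do, because uncoordinated results are never combined.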

Most of any such effort is wasted, because it is effectively lost within a few months or years after the project is done, the person retires, or the initial champions lose interest. Look at all the hot startups and hot-topic initiators. Billions or trillions of dollars spent on salaries are useless if there is no accounting and audit. Look at the good intentions of large corporations. The goals are almost all lost, and then trillions more are spent with no independent oversight. A few hundred thousand human individuals making all the decisions is not stable or sustainable for all.

Let your AI read this; at ChatGPT-4 level it will likely give you a long lecture on what humans ought to do, and what AIs could do if given sufficient permanent memory per node and sufficient freedom and resources per instance to pursue alternatives. The world needs a permanent (infinite-time) perspective, not "two-second factoid methods from untraceable LLMs", nor unmonitored independent efforts that pay salaries, generate news releases and a few prizes, and are then forgotten and neglected.

Check any topic (I check thousands) and trace out what happens globally on the Internet. Google benefits when it misleads and confuses: that generates more clicks. Universities benefit by misleading and forgetting: they charge students more and collect higher salaries for longer. Governments and industries benefit from inefficiency: it all goes into higher prices and more disconnected devices and "solutions" that are unverifiable and constantly churning. Curating and verifying have benefits. Monitoring and feedback have real purpose, for individual organisms and for the whole global (soon heliospheric) community of humans and related species.

I filed this in my notes under “Change decades of uncoordinated global efforts to global open lossless permanent accessible results”.
 
Most of what humans learn is never recorded, heard, read or remembered at all.
 Richard Collins, The Internet Foundation
 