Change decades of uncoordinated efforts into global, open, lossless, permanently accessible results
MA Oviedo-Garcia @maoviedogarcia More than 2 million research papers have disappeared from the Internet despite having an active DOI
https://nature.com/articles/d41586-024-00616-5
Replying to @maoviedogarcia
On LLMs, you ought to be able to estimate how many replications of 2-sigma results are needed to refine global models to even 4-sigma precision. Tens of thousands of small university departments running for decades on a topic, with zero effort at checking the whole, do not add up to one globally coordinated, comprehensive, precise treatment of that topic – one run for a year with global, open, lossless methods and then remembered and applied forever.
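A minimal sketch of that estimate, under the assumption that replications are independent and could actually be pooled losslessly (the point of the reply is that today they cannot be): the combined standard error scales as 1/sqrt(N), so only (4/2)^2 = 4 combinable 2-sigma results reach 4 sigma, while 10,000 of them pooled would reach roughly 200 sigma. The Python below is purely illustrative; the function name and example numbers are mine, not from the original posts.

import math

def replications_needed(sigma_now, sigma_target):
    # Independent, combinable replications needed to move a pooled result
    # from sigma_now to sigma_target significance, using the standard
    # 1/sqrt(N) scaling of the combined standard error.
    return math.ceil((sigma_target / sigma_now) ** 2)

print(replications_needed(2, 4))   # 4 combinable replications take 2 sigma to 4 sigma
print(2 * math.sqrt(10_000))       # ~200 sigma if 10,000 2-sigma results were truly pooled

The arithmetic is the whole argument: a handful of combinable replications would be enough, so tens of thousands of uncombinable ones represent almost pure waste.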
Most of any such effort is wasted, because it is effectively lost within a few months or years after the project ends, the lead researcher retires, or the initial champions lose interest. Look at all the hot startups and hot-topic initiators. Billions or trillions of dollars spent on salaries is useless if there is no accounting and audit. Look at the good intentions of large corporations. The goals are almost all lost, and then trillions more are spent with no independent oversight. A few hundred thousand human individuals making all the decisions is neither stable nor sustainable for all.
Check any topic (I check thousands) and trace out what happens globally on the Internet. Google benefits when it misleads and confuses – that generates more clicks. Universities benefit from misleading and forgetting – they charge students more and pay higher salaries for longer. Governments and industries benefit from inefficiency – it all goes into higher prices and more disconnected devices and “solutions” that are unverifiable and constantly churning. Curating and verifying have benefits. Monitoring and feedback have real purpose for individual organisms and for the whole global (soon heliospheric) human and related species.