Nuclear and Atomic Data, effect of order of magnitude increase in precision on opportunities in global machine learning

Donnie Mason,

I was just enjoying the graphs and data at NuDat 3.0.  I have been reading and using tables of isotopes for over 50 years.  It is finally getting to where it is easy to use.

I was looking at Q beta-, Q beta+ and Q EC for all the isotopes and the CSVs have no common index, so they take effort to merge and compare.  I do not see “the whole table” which has all your properties with columns for A, Z, N, Variable Name, Unit, Value.
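As a sketch of what that merge takes today, here is a small Javascript example that joins two such exports on (Z, N). The column names (z, n, q_beta_minus, q_ec) are made up for illustration, not the actual NuDat headers, and the Q values are just sample numbers in keV:

```javascript
// Join NuDat-style CSV exports that lack a common index by keying on (Z, N).
// Column names here are hypothetical, not the actual NuDat 3.0 headers.
function parseCsv(text) {
  const [header, ...rows] = text.trim().split("\n").map(line => line.split(","));
  return rows.map(r => Object.fromEntries(header.map((h, i) => [h.trim(), r[i].trim()])));
}

// Merge any number of row lists into one table keyed by "Z-N".
function mergeByZN(...tables) {
  const merged = new Map();
  for (const rows of tables) {
    for (const row of rows) {
      const key = `${row.z}-${row.n}`;
      merged.set(key, { ...(merged.get(key) || {}), ...row });
    }
  }
  return merged;
}

const qbm = parseCsv("z,n,q_beta_minus\n1,2,18.592"); // H-3, sample keV value
const qec = parseCsv("z,n,q_ec\n4,3,861.89");         // Be-7, sample keV value
const table = mergeByZN(qbm, qec);
console.log(table.get("1-2")); // one row per nuclide, all Q columns together
```

With a common (Z, N) or (Z, A) key in every CSV, "the whole table" is just this merge run over all the property files.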

I wish you had integrated the magnetic moments, because that is the easiest way to look for new reactions.  The magnetic dipole energy is a good first approximation before full multipole models or looking for experiments.  I read the CRC Table of Isotopes and worked it by hand countless times over the years.  When two protons combine, their Coulomb repulsion will be overcome at “nuclear” distances when the magnetic dipole 1/r^3 and 1/r^4 terms dominate over 1/r^2.

I wish I had been born now.  The neutron star modelers use full hydrodynamics and deal with proton pairs as proton superconductors.  I find it fascinating how one part of the Internet will be decades ahead of other parts. What is hard in one place is super easy elsewhere.  Combining all the knowledge in a standard form that is easy to combine and use has been the hope of The Internet Foundation.  July 2023 was the 25th Anniversary of the Internet Foundation.

Thanks for a nice web page.  Maybe a nuclear reaction calculator and simulator page would be good.  I have not seen one.  For the last few months I have been testing large language model AIs. I am sure they can be trained to interact with students and people interested in and working on nuclear data.  I started following the fusion groups when I was at UT Austin in the early 1970s.  I always felt they should have kept going. But, even then, I listened and their electromagnetic people did not truly talk to the “nuclear chemists” (chemistry at nuclear energies).

Sorry to ramble. I am just writing things down.  Maybe you should add “like” and “$Thanks” to all your pages.  I recommend that for all Internet sites now, and also to link the pages to their teams for collaboration, sharing and to see what all is happening.  Sites moving to “open” are built that way, but a lot needs to be done to make the data efficiently shareable with the 5 billion humans using the Internet, most of whom have only Javascript and are NOT programmers, but very smart people from every country in the world. I grew up when “atomic energy” was a good thing.

The only suggestion I would make is to have a unique isotope identifier in the CSV.  Perhaps add A as well. To make it sort, 0000-0001 would be a neutron.  My only worry is that it might need to allow for massive clusters now.  I call those “extended nuclear materials” or “nuclear polymers” or “magnetic bound states”.
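A minimal version of that identifier, zero-padded so plain string sorting gives nuclide order, might look like this. Four digits per field is my assumption, chosen to leave room for large clusters:

```javascript
// Sortable isotope identifier: zero-padded Z, then A.
// "0000-0001" is the neutron; four digits leaves room for massive clusters.
function isotopeId(z, a) {
  const pad = n => String(n).padStart(4, "0");
  return `${pad(z)}-${pad(a)}`;
}

console.log(isotopeId(0, 1));    // "0000-0001"  neutron
console.log(isotopeId(1, 3));    // "0001-0003"  tritium
console.log(isotopeId(92, 238)); // "0092-0238"  U-238
```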

On the Internet I have to allow for much experimentation with models and simulations now.  People can run simulations that explore many more alternatives quickly.  Setting standards and policies for Internet wide “open” web groups has been interesting. Sometimes I get to name areas that have grown, but the groups spring up separately and have not merged yet.

Thanks.

Richard Collins, The Internet Foundation


Subject: Some notes on where the magnetic moments fit into the binding energy

Donnie Mason,

You might want to jump to the end where I wrote about magnetic moments and nuclear properties. I wish I had more time.

Thank you. I try to write down suggestions and ideas as I visit sites.  I wrote this long note to organize my thoughts.  Maybe you will find some useful ideas.  I tried to summarize some important things going on.  I will check your data and put it in energy terms. I think I can handle the even-odd energy by the magnetic moment. Trying will help me sort out what is going on. Approximate models are very helpful. When I worked with Steve Klosko on the NASA gravitational potential models, I would find approximate analytic models that let me get starting values for the satellite orbits. If those were “close enough”, then he could run his higher resolution, more accurate, but slower models and get good convergence quickly. Trying to use the big models for screening, or to get started, was too expensive. The “engineering estimates” and “more accurate models” pairing occurs in all fields on the Internet.

I spent so many years gathering tiny bits of data here and there to try to understand nuclear and atomic processes. I am encouraging the space industries to look carefully at using nuclear and atomic energy sources for long range flights, and long term projects. It is just not economically feasible to use that much chemical fuel. And all disciplines now have changed fundamentally because of global collaboration and AIs capable of high quality language generation. I am recommending to them that they teach and certify their AIs as they would humans – to test and verify the AI knowledge.  I keep telling them they should not use AI for any situation where life and property are at risk, until their behavior and skills can be tested and verified.

I could not find the masses that were used to make your tables. I can re-engineer (calculate) them from the reaction data, but the Atomic Mass reports I am finding, like AME 2020, do not match at the milliElectronVolt level.  For gravitational sensors, that is too coarse.

For many years I have been recommending that all global sensor networks compare their data at a much finer level.  I think I started doing that more earnestly after calibrating the superconducting gravimeter network about 20 years ago.  There are just a few dozen of those sensors; they are vertical axis only and 1 sample per second.  But the vector signal is perfectly Newtonian, so it only requires an offset and multiplier to set the baseline on a global scale referenced to the Jet Propulsion Laboratory solar system ephemeris for the sun and moon. It is not perfect, but it puts strong bounds on the baseline.  Before, every station had to be individually calibrated by bringing in an expensive “absolute” gravitational sensor now and then.  This data – albeit now tied to JPL as a reference – sets the baseline and scale.  Before, the signal had to be set by earth-only local references.  For the MEMS accelerometers that drift, this means they can calibrate hourly.

I am a little tired, so I won’t try to tell a very long story.  Except to say that the gravitational time dilation equations used for GPS/GNSS, and now ground surveys and drone surveys of gravity on earth, depend on the local gravitational potential.  And though the forces are tiny, there are now many very precise measurements reaching nano and pico levels and smaller, where such things are possible.  The magnetic moment measurements are extremely sensitive.  I hope those groups will inter-calibrate with the gravitational and magnetic sensor arrays.

I have not checked recently to see if there is any new work. I remember checking with a Russian guy about the neutron statistics in a fission reactor.  Gravitational time dilation affects all clocks. When I was at UT Austin studying statistical mechanics and thermodynamics with Ilya Prigogine’s group in the early 1970s, we were all trying to solve those “chemical clocks”, “chemical oscillator” problems.  I convinced Dilip Kondepudi to try to write about the effect of gravity on chemical processes. That was about 1975.  Prigogine got his Nobel prize in 1977.  And by then I was studying gravitation at University of Maryland College Park with Misner and Weber and those guys. And working with Steve Klosko on the NASA geopotentials.  Sorry, I remember lots of things now by where I was and what I was doing.  Using satellites to measure the potential with precise models and calibrations taught me to use a wide range of data for constraining the solutions.  Adding one dataset could get an order of magnitude improvement in precision.  I remember someone asked me to solve a Kalman filter equation to use the C-band radar data in real time to estimate the location of the satellite and guide the laser detectors.  One little model took a wide target down to something solid.

I hope I am not boring you. Did you know that the next generation accelerators have to include earth tides in the modeling and control of the experiments?  At the Large Hadron Collider the operators manually correct for these things.  In the next generation those corrections can be included in the algorithms, probably AI ones.  For fast processes in heavy ion beam systems where people want to work with short lived excited states, those become possible when the daily (microHertz) and vibrational (milliHertz to MegaHertz) changes can be tracked.  I was reading about vibration isolation studies in semiconductor fabrication where the fabrication requires active isolation. All these technologies are growing in parallel, and many of them are duplicating work.  Publication cycles are so long and slow, and publication strips out the data needed to reuse the content, so industries that now change in months are hitting delays in updating reference data and global data.  I can see it from years of trying to sort out these connections.  But there are so many groups, I just try to hit the more important ones that affect the most industries and new technologies.

LHC throws away all the low energy data, but the machine intelligence algorithms now allow using that data to train models that give precise estimates. I tried to explain to them that there are about 20,000 colleges and universities and many more schools and organizations training the next generations of humans who are going to grow up with things like “solar system colonization”. So LHC should take its “junk data” and “we don’t care about that little stuff”, package it, and give it to all the training grounds (most of them online now in some form) so those kids and researchers can learn from real data, not fake.  Every day I find new groups using simulated data as a proxy for the real thing, because the groups gathering the data are only now getting around to the orders of magnitude increases in precision in most every field.

For atomic and nuclear data, the milli-kiloElectronVolt (that is, ElectronVolt) precision is too coarse grained for faster correlations. The magnetic and electric and gravitational field variations are down to nanoElectronVolts now (roughly) across many industries.

Mossbauer was the first method of direct gravitational potential detection.  Most people think of it as a time dilation experiment, but it now allows measuring slow, large, very precisely predictable variations in the local gravitational potential. I think Mossbauer taught that modeling the recoil energy and resonances precisely gives orders of magnitude improvement in certain types of measurement.  I hope I am not boring you.  I am just trying to go over which groups are starting to work together, or could work together if their core data were made comparable at another factor of 1000 finer.

I have been using the SI prefixes a lot, because much of the research is labeled and kept separate by Internet tags like nano and pico, Giga and Tera.  So groups tend to get into “nano” and work there for a while, and some of them will go from there into pico and femto, or Peta and Exa.  People do jump larger, but most diffusion of knowledge is linear in the prefix world.  It makes most groups and technologies very predictable.  I often can find updates and where people will be going next in a few minutes or a few hours.  Sometimes a few months, but that is a LOT faster than “a few decades”.

Mossbauer: “Searching for New Interactions at Sub-micron Scale Using the Mossbauer Effect” by Giorgio Gratta, David E. Kaplan, and Surjeet Rajendran, October 2020. They are talking about nanoMeters, picoElectronVolts, femtoElectronVolts.

The resonant conditions for magnetic dipole fusion reactions are in that range.

They talk about temperature effects, and the magnetic and Newtonian gravitational potential noise is on the same scale. One of my biggest projects is to get groups to use the “time of flight correlation” method with arrays of sensors so the sources of the large, slowly varying and episodic signals can be localized and imaged.  This is “passive seismic” with acoustic signals. When the camera companies started making time of flight cameras, and the lidar groups grew into new industries for 3D scanning and 3D laser ionization displays and process controls, that meant new low cost methods, tools and ideas were diffusing into more areas.  But it means that when fusion groups get to nanoElectronVolt resolution, there are already many global groups working. When they realize the connections, there are sudden spurts of growth of one or two SI prefixes (1000 or 1 Million) and transformative changes occur.

So I am just trying to give you some encouragement to merge the mass and magnetic moment data and then run the whole set of data to refine it all down to microElectronVolt levels.  It is just numbers. But it has huge consequences in the evolution of new global and heliospheric industries.

After I left UMD College Park, I got a job working at Georgetown University on global economic and social modeling of countries. But I also studied magnetic resonance in the chemistry department and continued work on gravitational energy density effects. The spin-spin interaction for chemistry is not some soft rule of thumb; rather it uses the classical magnetic dipole interaction and can give very precise models for everyday comparisons. I considered what happens when the magnetic moments get to “nuclear distances”, and it gives good starting values when looking for fusion reactions.  I term that “chemistry at KeV and MeV bond energies”. The reason it works is that the full multipole models, either by plane wave or Schrodinger, have the dipole and quadrupole interactions as the first terms.  They are dipole estimates of the full solution.

I went through the list of isotopes looking for potential fusion reactions. And all the “good” ones had magnetic dipoles. And, in a very simplistic but effective way, put two protons near enough and they will bond by magnetic dipole forces.  It turns out that the energy constraint is more fundamental than forces. So set the Coulomb energy of two protons equal to the magnetic dipole energy, and that depends on a distance which can be used as a reference value.  It does not have to be real, it only has to be the right order of magnitude and, with an open model, give consistent results.  The full Schrodinger model is not even enough if you are going to do fusion. But the neutron star modelers have a pretty good handle on it now when they deal with proton superconductivity (protons bind into magnetic pairs).  The whole of spin dependence can be made quantitative. Cooper pairs are made of Fermions; Fermions have magnetic moments, and when the pairs bind the magnetic field closes and they become bosons.  The bosons are superconducting in bulk as in Bose-Einstein condensates (it turns out there are Bose-Einstein gravimeters as well that can be extremely fast and sensitive; lots to do).

I wrote to Emilio Segre to ask him about the spectrum of positronium.  I have his letter here.  That was 1981.  Pretty much all the particle-antiparticle pairs can be approximated by magnetic binding, Coulomb repulsion, and rotational and vibrational energies.  The neutron should have a spectrum.  If you use magnetic gradients to accelerate and cool neutrons, you should be able to store them in closed orbits.  And solve for stable states – because inside the nucleus, even deuterium, the neutron is stable.  The neutron as an electron and proton bound by Coulomb and magnetic dipole forces (energies) and rotational and vibrational states should have stable states longer than the average.

Joe Weber at UMD told me to study the works of Robert Forward. Robert had helped Joe with his cylinder detectors, but the piezo sensors and electronics and data collection then were not sufficient to do what they needed.  Robert wrote “Detectors for Dynamic Gravitational Fields”, where he encouraged combining gravity and electromagnetism by establishing a common set of units. That is part of why I have been working on that for the last 45 years. Robert went on to help test LIGO technologies of photon interferometer detectors (now there are atom interferometers that are much smaller). But Robert wrote about the gravitational energy density.

If you look at “Gauss’s law for gravity” on Wikipedia you will find an expression for the “Lagrangian density for Newtonian gravity”.  The first term is just the gravitational potential field times the density field. And “field” just means “3D with data for every voxel”. The second term is g^2/(8*pi*G), which is the gravitational energy density in Joules/Meter^3.

For a field of 9.8 Meters/Second^2 at the earth’s surface, that is 5.72540968137E10 Joules/Meter^3 or 57.2540968137 GigaJoules/Meter^3.  The gravitational field has that much energy, fine grained and with sufficient power available to do its job.  I wrote an essay for the Gravity Research Foundation that proposed that all fusion experiments on earth would be difficult because of fluctuations in the earth's gravitational energy density field.
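That number is easy to reproduce in a few lines of Javascript, using the CODATA value of G:

```javascript
// Gravitational energy density u = g^2 / (8*pi*G) in Joules/Meter^3.
const G = 6.67430e-11; // CODATA 2018 gravitational constant, m^3 kg^-1 s^-2
const g = 9.8;         // Meters/Second^2
const u = g * g / (8 * Math.PI * G);
console.log(u); // ≈ 5.7254e10 Joules/Meter^3, about 57.254 GigaJoules/Meter^3
```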

To give you an idea of how large that is, use the expression for the magnetic energy density, set that equal to the gravitational energy density, and solve for B.

B^2/(2*mu0) = g^2/(8*pi*G)

B = g*sqrt(2*mu0/(8*pi*G)) = g*sqrt(mu0/(4*pi*G))

B = g*38.7076796657 (Tesla/(meter/second^2))

For a field of 9.8 Meters/Second^2, that is B = 379.335260724 Tesla using CODATA values.
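That conversion factor and field are also a one-line check, again with CODATA values for mu0 and G:

```javascript
// Equivalent magnetic field: B = g * sqrt(mu0 / (4*pi*G)).
const G   = 6.67430e-11;      // CODATA 2018 gravitational constant, m^3 kg^-1 s^-2
const mu0 = 1.25663706212e-6; // CODATA 2018 vacuum permeability, N/A^2
const factor = Math.sqrt(mu0 / (4 * Math.PI * G));
console.log(factor);       // ≈ 38.7077 Tesla per (Meter/Second^2)
console.log(9.8 * factor); // ≈ 379.335 Tesla
```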

Since 1981 I have been checking every experiment that gets to magnetic fields of this size.

On the earth and the sun, the gravitational energy density sets a bound on the size of natural events like gamma ray emission in lightning and, on the sun, magnetic reconnection.

But for things like laser generation of electron positron pairs from the vacuum, it is very likely that is the threshold for noise for experiments. The laser vacuum and any fusion experiments should see variations because of the gravitational potential and changes in the local gravitational energy density – both are measurable now to high precision.  And nearly perfectly Newtonian.

About 99% of the vector signal at a gravimeter station can be predicted from GM/r^2 for the sun at the station minus the sun at the center of the earth, plus the moon at the station minus the moon at the center of the earth, plus earth rotation centrifugal acceleration. Then a linear regression for each axis.  I spent a year (and then several more) to check the broadband seismometers that can be used as gravimeters. They show all three axes match the sun-moon signal. The residual is mostly atmospheric and ocean contributions to changes in the earth's gravitational potential field. With arrays of sensitive time of flight gravitational detectors, that allows imaging the sources – the atmosphere in 3D, ocean surf, seismic waves from earthquakes.
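The size of that sun and moon signal can be sketched from the leading tidal term: GM/r^2 at the station minus GM/r^2 at the earth's center, which for a body at distance d is approximately 2*G*M*R/d^3. The masses and mean distances below are rounded textbook values, just for scale:

```javascript
// Leading tidal term of the sun-moon signal at the earth's surface.
const G = 6.67430e-11; // gravitational constant, m^3 kg^-1 s^-2
const R = 6.371e6;     // mean earth radius, Meters
const tidal = (M, d) => 2 * G * M * R / d ** 3; // Meters/Second^2

console.log(tidal(7.342e22, 3.844e8));  // moon: ≈ 1.1e-6 m/s^2
console.log(tidal(1.989e30, 1.496e11)); // sun:  ≈ 5.1e-7 m/s^2
```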

This is a bit far from fusion, but it is the reason I studied the conditions for fusion down to picoElectronVolts levels. And why I have been trying to check if I can make a simpler model of nuclear processes.  A desktop fusion reaction does NOT have to generate energy, but it should be a sensitive gravitational detector. And the data from it should be a great training ground for AI modelers and global collaborations. Since I can in most cases interconvert gravitational and magnetic equations by using the energy density, I think of a combined field where what we call “gravity” can be traced to specific spatial and temporal frequencies of a common radiation field.

In August 2017, the electromagnetic and gravitational signals from the merger of two neutron stars arrived at earth detector arrays after about a 130 million year race.  They arrived at the same time, so that showed me the speeds of light and gravity are identical.  Not close, identical. And that means they most likely share the same underlying potential and are different aspects of one field. In these last 6 years I have checked how that can be, and have a fairly decent way to sort it out.  I used the superconducting gravimeters to measure the speed of diffusion of the gravitational potential. The timing is precise and many sensors all have to agree.  The speed of gravity and electromagnetism is the same. For many practical purposes the “gravitational” field is a low frequency magnetic process. And it can be synthesized with strong magnetic field gradients that can be pulsed.

I will take a short break and then start checking all your tables that come in CSV to solve for the masses.  I want to put it into energy terms because the table of isotopes is symmetric in the Coulomb energy.  All the even-odd is just the magnetic pairing. And some of the inconsistencies are likely related to mistakes in assignment.  I am confident the whole of masses, reaction energies, and electromagnetic constants can be made consistent in energy terms, and can be checked to picoElectronVolt resolution.

ALL the nuclear and atomic measurements should be checked for earth magnetic and gravitational variations. The sun-moon signal is a huge signal that is about 1 Angstrom/second^2 at 1 sample per second. That might seem small, but the gravitational signal is harder to damp than a magnetic field. When I worked at Phillips Petroleum I met a guy doing magnetic research, and his shielded room used plates from old battleships. But his magnetic shielding did not block gravity.  The gravimeters are precise position sensors, since the signal has to match precisely the timing and location of the station and, as a vector, has to match the orientation of the three axes in a vector detector or tensor detector. The Transportable Array of seismometers has broadband sensors that can track the sun and moon. Those can be solved for the orientation of the sensor (some of them had no data on which way the sensor was set). So the gravitational sensors can be used for a “gravitational GPS” (location) and a “gravitational compass” (orientation). And they are not affected by earth, ocean or ionosphere.

That is roughly what I have been doing. I will see if I can write some Javascript programs to process your CSVs.  If you get the electromagnetic data, I can check that.

e^2/(4*pi*e0*r) = mu0*mu1*mu2/(4*pi*r^3) gives the distance where the Coulomb and magnetic dipole energies are equal. When the particles have the same magnetic dipole moment mu, then using e0 = 1/(mu0*c^2) in SI units, this reduces to the line below.  Keep the magnetic moment in Joules/Tesla.

r = mu/(e*c)

Proton mu = 1.41060679736E-26 Joules/Tesla
Electron_mu = 9.2847647043E-24 Joules/Tesla
Neutron_mu = 9.6623651E-27 Joules/Tesla

ElectronMagneticRadius = 1.9330354E-13 Meters = 193.30354 femtoMeters for two electrons
ProtonMagneticRadius = 2.93680341E-16 Meters = 293.680341 attoMeters for two proton
NeutronMagneticRadius = 2.01164965E-16 Meters = 201.164965 attoMeters for two neutrons.
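Those radius values follow directly from r = mu/(e*c) and the CODATA constants:

```javascript
// Equal-energy radius r = mu / (e * c) from the CODATA magnetic moments.
const e = 1.602176634e-19; // elementary charge, Coulombs
const c = 2.99792458e8;    // speed of light, Meters/Second
const r = mu => mu / (e * c); // Meters, for mu in Joules/Tesla

console.log(r(9.2847647043e-24));  // electron: ≈ 1.93304e-13 m = 193.304 femtoMeters
console.log(r(1.41060679736e-26)); // proton:   ≈ 2.93680e-16 m = 293.680 attoMeters
console.log(r(9.6623651e-27));     // neutron:  ≈ 2.01165e-16 m = 201.165 attoMeters
```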

These are just the CODATA values, and I use the radius values for order of magnitude.  If you send collision debris through a magnetic separator, the single protons should have a magnetic moment and 1 charge.  The bound pairs will have two charges and no magnetic moment.  And there should be triples and longer chains in neutron stars.  The neutron pairs should have no charge, no magnetic moment, but maybe an octupole moment.  I have not worked that out.  Thinking about it, in fission there should be lots of bound pairs that are freed and remain stable. A neutron pair might make a good neutrino.

When you collide a proton and electron to form a neutron, that requires very precise orientation and timing.  Two protons are heavier and the Coulomb part is the same, so they should bind more easily than two electrons.  If you take a proton and antiproton, they will bind as real particles if they have rotational energy. They will have no external charge, no external magnetic field, but can be massive or massless according to the rotational state, which should be quantized with n*hbar. Bound particle-antiparticle pairs should occur naturally, but they would be electromagnetically invisible.

Binding energy is not gravitational?  I think not, but I have no easy way to check yet.

Richard Collins, The Internet Foundation
