One linear regression is all that is needed to use the JPL Ephemeris to separate the sun and moon from earth-based signals
I want to repeat the SG network speed of gravity measurement I did in 2003, using all the current stations. I will also use the best of the broadband seismometers from IRIS.edu, plus any absolute gravimeter, MEMS, or atom interferometer measurements I can find. I know most of the groups, and will try to find others. After that I can remove the external tidal field. The residual I can calibrate against the atmospheric models. There are many networks and datasets to put bounds on the cells of those models, and it is fairly easy to correct the SG and broadband (three-axis) seismometer records for use as gravimeters.
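Here is a minimal sketch of that tidal computation in Python, assuming the skyfield package and the DE421 ephemeris file as one convenient way to read the JPL positions. The station coordinates are made up, and the geocentric vertical stands in for the true plumb line; this is an outline, not a production routine.

```python
# Sketch: Newtonian vector tidal acceleration at a station, straight from
# the JPL ephemeris (read here with skyfield and DE421 as one option).
import numpy as np
from skyfield.api import load, wgs84

GM_SUN  = 1.32712440018e20   # m^3/s^2
GM_MOON = 4.9028e12          # m^3/s^2

ts  = load.timescale()
eph = load('de421.bsp')                               # JPL ephemeris file
site = wgs84.latlon(50.0, 14.0, elevation_m=300.0)    # hypothetical station

def tidal_up(t):
    """Vertical component of the Newtonian tidal acceleration, m/s^2."""
    earth = eph['earth'].at(t).position.m             # barycentric Earth, m
    r = (eph['earth'] + site).at(t).position.m - earth  # geocentric site
    up = r / np.linalg.norm(r)   # geocentric vertical (close enough here)
    a = np.zeros(3)
    for name, gm in (('sun', GM_SUN), ('moon', GM_MOON)):
        R = eph[name].at(t).position.m - earth        # geocentric body
        d = R - r
        a += gm * (d / np.linalg.norm(d)**3 - R / np.linalg.norm(R)**3)
    return a @ up

# One day of predicted tidal gravity at minute steps:
series = [tidal_up(ts.utc(2003, 6, 1, 0, m)) for m in range(1440)]
```

Evaluating this once per minute epoch gives the predicted series that the regression fits against the meter output.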
I should be able to get maybe five digits of accuracy on the speed of gravity. And then I hope to convince the big G groups to help. LIGO could help. There are many groups; I have been following and talking to a few in particular over the last few years. The LIGO follow-on devices have many good uses. There are some atomic clock, Mössbauer, and quantum detector groups who can connect. Not sure what will happen. Or I will just put bounds on the speed of gravity and file it away. I tried for years to get groups to build three-axis, high sampling rate (Msps, Gsps) devices for time-of-flight gravimeter imaging arrays.
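One way to phrase the time-of-flight part as code, hedged as an outline rather than the 2003 procedure itself: shift the predicted Newtonian series in time, refit the two parameters at each trial lag, and see how tightly the best lag is pinned to zero. The names g_obs and tide_at are assumed to come from a pipeline like the sketch above.

```python
import numpy as np

AU_M = 1.495978707e11        # astronomical unit, m
C    = 2.99792458e8          # speed of light, m/s

def rms_at_lag(g_obs, tide_at, lag_s):
    """Two-parameter (scale + offset) fit at one trial lag; RMS residual."""
    g_tide = tide_at(lag_s)               # predicted series shifted by lag_s
    scale, offset = np.polyfit(g_tide, g_obs, 1)
    return np.sqrt(np.mean((g_obs - (scale * g_tide + offset)) ** 2))

def best_lag(g_obs, tide_at, lags_s):
    """Scan trial lags (seconds); return the one with minimum residual."""
    rms = [rms_at_lag(g_obs, tide_at, s) for s in lags_s]
    return lags_s[int(np.argmin(rms))]

# A best lag indistinguishable from zero at resolution dt bounds the solar
# propagation delay: speed >= AU_M / dt, quoted as a multiple of C.
```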
I have some low-cost approaches that might work. But I thought that going back to my original research on the speed of gravity, checking the current state of the SG network, and finding the people and groups would give me a clearer picture of what to do next. I am getting tired and want to put these methods on GitHub and other locations so people can use them for routine calibration and imaging. I will index all the groups again and put my notes at GravityNotes.Org. I may rewrite Eterna in JavaScript, but use the JPL ephemeris directly. There are lots of groups using Python and Matlab; I will see if I can fit their models together.
Found most of the “global climate change” and solar weather groups. It is just easier to do them all than to try to cherry-pick.
Would you like to move to HTTPS and make your data accessible to the whole Internet? You could make some older data open, or just make it available to everyone. Or move some to IRIS.edu. There is a little SG data on IRIS, but they have never really used gravimeters, though the broadband seismometers can be used and are sometimes better because they have three axes.
I have minute data from 1994 to 2013 on my disk, for 24 stations. I am looking at the regression and calibration statistics from those and remembering why it was so much trouble. I did not store all the regression logs. The few samples I looked at just now have lots of seismic data that I need to remove. Not hard, just tedious.
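Below is a sketch of how that removal might look for one channel: a running-median despike with a MAD-based threshold. The window and threshold are illustrative, one common approach rather than the exact procedure from the old logs.

```python
import numpy as np

def despike(x, window=61, nsigma=6.0):
    """Flag seismic transients in minute data: replace samples far from a
    running median with NaN.  Slow but clear; tune window and nsigma."""
    half = window // 2
    med = np.array([np.median(x[max(0, i - half):i + half + 1])
                    for i in range(len(x))])
    dev = x - med
    mad = np.median(np.abs(dev - np.median(dev)))
    sigma = 1.4826 * mad            # MAD -> Gaussian-equivalent sigma
    y = x.astype(float).copy()
    y[np.abs(dev) > nsigma * sigma] = np.nan
    return y
```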
Looking at one of these again, I realize what using just the Newtonian vector tidal signal means for global coordination.
Months-long complex curves that require only two parameters and Newtonian vector calculations, tied to the JPL ephemeris and WGS84, so they are easily updated. Absolute, using JPL as the reference. They (JPL) said that if we can get the precision up to their standards, they can use the data to correct their GMs. The Cavendish and big G experiments can weight things locally. The speed of light tests can be a good reference test for any new time-of-flight detector.
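In code, the whole two-parameter fit is a few lines. This sketch, with illustrative variable names, regresses the meter output against the predicted Newtonian series and returns the offset and scale (the calibration factor) with formal standard errors from ordinary least squares:

```python
import numpy as np

def calibrate(g_obs, g_tide):
    """Two-parameter fit: g_obs ~ offset + scale * g_tide."""
    A = np.column_stack([np.ones_like(g_tide), g_tide])
    coef, _, _, _ = np.linalg.lstsq(A, g_obs, rcond=None)
    resid = g_obs - A @ coef
    n, p = A.shape
    cov = (resid @ resid) / (n - p) * np.linalg.inv(A.T @ A)
    offset, scale = coef
    return offset, scale, np.sqrt(np.diag(cov))   # formal standard errors
```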
It is easier to teach and standardize simple methods that every group can learn and use.
When I worked at the Central Intelligence Agency, and later on the NASA GEM models with Steve Klosko, I learned the value of keeping 16-digit values for restarting orbital calculations and regressions. They don’t have to be perfect, they just have to be traceable (auditable) and reproducible.
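For what that means in practice today: 17 significant decimal digits are enough to round-trip an IEEE-754 double exactly, so a restart file written that way reproduces the regression state bit for bit. A tiny Python illustration, with a made-up state value:

```python
# Write state so a restart reproduces the same doubles exactly.
x = 6.6743e-11                 # some regression state value (illustrative)
s = f"{x:.17g}"                # 17 significant digits round-trip any double
assert float(s) == x           # bit-for-bit reproducible restart
```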