
Re: CTI OSC5A2B02 OXCO module high precision frequency reference project


 

I'm gobsmacked! In the 10 years since I got into the topic, I've never run across anyone other than a couple of oil industry friends who knew anything about it. Foucart & Rauhut is rather heavy going with no one to talk to except the wall.

I evaluated the analytic 1D heat equation for something like 50,000 parameter choices, picked a few and summed them to generate y, solved Ax = y using the simplex algorithm in GLPK, and compared the results to what I had created. At 2x the period of my trial data I reached 1-2% error.
I was inverting flow rates from multiple cracks in porous media.
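The workflow described above, summing a few components from a large dictionary of candidate solutions and inverting Ax = y with the simplex method, is essentially basis pursuit: L1 minimization cast as a linear program. A minimal sketch of that idea, using a random matrix in place of the heat-equation dictionary and SciPy's LP solver in place of GLPK (both assumptions for illustration):

```python
import numpy as np
from scipy.optimize import linprog

# Sketch: recover a sparse coefficient vector x from y = A x by
# L1 minimization (basis pursuit) cast as a linear program.
# A random Gaussian A stands in for the dictionary of analytic solutions.
rng = np.random.default_rng(0)
m, n = 30, 100                            # far fewer measurements than unknowns
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[[5, 40, 77]] = [1.5, -2.0, 0.7]    # "pick a few and sum them"
y = A @ x_true

# min ||x||_1 s.t. Ax = y  -->  split x = u - v with u, v >= 0:
# minimize sum(u) + sum(v) subject to [A, -A][u; v] = y
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
x_hat = res.x[:n] - res.x[n:]

print(np.max(np.abs(x_hat - x_true)))     # recovery error: small
```

With 30 measurements and only 3 nonzero coefficients out of 100 unknowns, the LP typically recovers x exactly (up to solver tolerance), which matches the percent-level errors described above once noise and model mismatch enter.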

The machine I was using is not running at present, so I can't readily find my work, and after 10 years it would require a couple of days to figure out what I did where in 4-5 TB of disk space.

The main mathematical obstacle I see is starting with very old OCXOs and LM399s, where there is very little curvature left.

Please send me PDFs of your papers. After reading one paper (by Donoho, I'm pretty sure), I bought and read Ziegler's "Lectures on Polytopes" and Grünbaum's "Convex Polytopes". I only read the sections relevant to the paper I'd read, as the general topic is very far from my normal activities. Having taken a BA in English lit, I find my involvement with rather exotic mathematics a curious turn of events.

I just returned from hunting through the "stuff", aka the "Anvil of Conrad": 8000 sq ft, filled 4 feet deep at the time of his death. In the process I turned up a 60 VA Sola Constant Voltage transformer, 95 to 130 Vac in, 118 Vac out. I plan to set it up with a large (4-6") Ohmite rheostat as a load and see how it does, using a large Variac to swing across the 114-126 V service entrance spec. Stable line voltage will significantly simplify the PS for the LM399.

There certainly is no magic pixie dust. One of the things that makes seismic processing so expensive is all the rabbit holes you come across dealing with 10-12 TB of data. The other is the amount of machine time it requires. A friend had to rent a warehouse for the group of about 400 people he was managing at the time because they had exceeded the available power.

This is going to be a very interesting thread with you participating.

Have Fun!
Reg


On Tuesday, August 8, 2023 at 04:10:26 PM CDT, Daniel Marks <profdc9@...> wrote:


I worked in the field of compressed sensing and I taught a course that included this material. I also wrote several papers on compressed sensing instruments.

I am very familiar with the works of Donoho, Candès, and Tao, and with the various measures of sparsity, including mutual coherence, the restricted isometry property, and the L0 and L1 measures.
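Of the measures mentioned above, mutual coherence is the simplest to compute: it is the largest normalized inner product between distinct columns of the sensing matrix, and smaller coherence gives stronger sparse-recovery guarantees. A minimal illustrative sketch (the function name and test matrices are my own, for illustration):

```python
import numpy as np

def mutual_coherence(A):
    """Largest absolute normalized inner product between distinct columns."""
    G = A / np.linalg.norm(A, axis=0)   # normalize each column
    gram = np.abs(G.T @ G)              # absolute Gram matrix
    np.fill_diagonal(gram, 0.0)         # ignore each column with itself
    return gram.max()

# Orthonormal columns have coherence 0; correlated columns push it toward 1.
print(mutual_coherence(np.eye(4)))                      # 0.0
print(mutual_coherence(np.array([[1., 1.],
                                 [0., 1.]])))           # 1/sqrt(2) ~ 0.707
```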

I also build instruments. I built spectroscopic instruments and inverse scattering radar systems (which are better conditioned, being elliptic rather than parabolic problems, as with evanescent surface waves) that utilized compressed sensing for data inversion.

I also worked on trying to infer the parameters of long-memory, long-tailed diffusion processes from measurements of the distributions of their time series.

And there's one thing I know: no magic fairy dust turns bad data into good data. You cannot wave your sparsity magic wand over data and miraculously get usable data from noise. It doesn't matter if the government spends $100 billion to improve a radar signature or oil companies spend $10 billion to find an oil well.

These are the kinds of visions sold by people who want grant money and promise that they can miraculously tease out data that is somehow latent and overlooked. Success at this is so extraordinarily rare as to be practically unknown.

In the end, you have a physical model for a process. You have possible measurements of that process. You have some inference method, for example, maximum a posteriori. Your estimator can only be as good as your model. If your model is well enough behaved, you can get an idea of the accuracy of the estimator using Fisher information or minimum-variance estimation.
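The Fisher-information point above can be checked numerically in the simplest case: for N i.i.d. Gaussian samples with known sigma, the Fisher information for the mean is N/sigma^2, so the Cramér-Rao bound on any unbiased estimator's variance is sigma^2/N, and the sample mean attains it. A small Monte Carlo sketch (parameter values are arbitrary, chosen for illustration):

```python
import numpy as np

# Monte Carlo check: variance of the sample mean vs. the Cramer-Rao bound
# for N i.i.d. Gaussian samples with known sigma.
rng = np.random.default_rng(1)
N, sigma, mu = 50, 2.0, 3.0
trials = 20000

samples = rng.normal(mu, sigma, size=(trials, N))
est = samples.mean(axis=1)          # the maximum-likelihood estimator of mu
crb = sigma**2 / N                  # Cramer-Rao bound = 1 / Fisher information

print(est.var(), crb)               # empirical variance ~ CRB
```

The point of the exercise: once the model is fixed, the bound tells you the best accuracy any estimator can achieve, and no sparsity prior changes that unless the physics actually supplies the prior.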

I have spent a career solving inverse problems and have been quite successful at it. And I don't promise what I do not think I can deliver. And I would not promise that any compressed sensing or estimation would reliably provide an answer, unless there was some reason to believe that the problem was guaranteed by the physical situation to actually satisfy that sparsity constraint.

In reality, most just assume the sparsity constraint, get an answer, and don't bother to compare it to reality or do any sort of cross-validation of the results.

I attach a copy of a lecture for a course that briefly summarizes some basic results in compressed sensing theory as of the time the lecture was written.
