Megan Bedell (Flatiron) was at Yale yesterday; people there pointed out that some of the time-variable telluric lines we see in our wobble model of the HARPS data are not telluric at all; they are in fact interstellar-medium lines. That got her thinking: Could we measure our velocity with respect to the local ISM using HARPS? The answer is obviously yes, and this could have strong implications for the Milky Way rotation curve! The signal should be a dipolar pattern of RV shifts in the interstellar lines as you look around the sky in celestial coordinates. In the barycentric reference frame, of course.
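A back-of-the-envelope sketch of that dipole (the solar-velocity vector v_sun and the helper ism_rv below are purely illustrative assumptions, not anything we have measured): in the barycentric frame the ISM bulk motion is just minus the Sun's motion relative to the local ISM, so the expected RV toward a sightline is the projection of that motion onto the line of sight.

```python
import numpy as np

# Purely illustrative solar velocity relative to the local ISM (km/s);
# the real vector is what one would fit from the interstellar lines.
v_sun = np.array([10.0, 15.0, 7.0])

def ism_rv(l_deg, b_deg):
    """Expected barycentric RV (km/s) of interstellar absorption toward
    Galactic longitude l and latitude b (degrees): the projection of the
    ISM bulk motion (-v_sun in the barycentric frame) onto the sightline."""
    l, b = np.radians(l_deg), np.radians(b_deg)
    n_hat = np.array([np.cos(b) * np.cos(l),
                      np.cos(b) * np.sin(l),
                      np.sin(b)])
    return -np.dot(v_sun, n_hat)

# The signal swings between -|v_sun| (toward the apex) and +|v_sun| (behind):
print(ism_rv(45.0, 0.0), ism_rv(225.0, 0.0))
```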
I also got great news first thing this morning: The idea that Soledad Villar (NYU) and I discussed yesterday, of using a generative adversarial network trained on noisy data to de-noise noisy data, was a success: It works! Of course, being a mathematician, her reaction was “I think I can prove something!” Mine was: Let's start using it! Probably the mathematical reaction is the better one. If we move on this, it will be my first-ever real foray into deep learning.
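For concreteness, here is a minimal PyTorch sketch of the kind of thing I mean, on toy one-dimensional data; the architectures, noise model, and toy signals are illustrative assumptions, not the setup we are actually using. The key move (the same trick as the AmbientGAN mentioned in the comments below) is that the generator proposes clean signals, a known noise model corrupts them, and the discriminator only ever compares corrupted fakes to the real noisy data; a generator that wins that game has learned the clean distribution.

```python
import math
import torch
import torch.nn as nn

n_pix, latent_dim, sigma = 64, 8, 0.3

def corrupt(x):
    # Known noise model: additive Gaussian noise of known sigma.
    return x + sigma * torch.randn_like(x)

def true_clean(batch):
    # Toy "clean" signals that the noisy observations are drawn from.
    t = torch.linspace(0.0, 1.0, n_pix)
    freq = 2.0 + 3.0 * torch.rand(batch, 1)
    return torch.sin(2.0 * math.pi * freq * t)

G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, n_pix))
D = nn.Sequential(nn.Linear(n_pix, 128), nn.ReLU(), nn.Linear(128, 1))

opt_G = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    noisy_real = corrupt(true_clean(64))   # we only ever see noisy data
    z = torch.randn(64, latent_dim)
    noisy_fake = corrupt(G(z))             # corrupt the generator output too

    # Discriminator step: real noisy vs. corrupted fake.
    d_loss = bce(D(noisy_real), torch.ones(64, 1)) + \
             bce(D(noisy_fake.detach()), torch.zeros(64, 1))
    opt_D.zero_grad(); d_loss.backward(); opt_D.step()

    # Generator step: fool the discriminator; success means G(z) matches the
    # clean distribution (up to the identifiability of the noise model).
    g_loss = bce(D(corrupt(G(z))), torch.ones(64, 1))
    opt_G.zero_grad(); g_loss.backward(); opt_G.step()
```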
Surely a variational autoencoder makes much more sense here. With a VAE, not only can you denoise individual images; because you are directly maximizing p(X) of your dataset, you should also be able to get much tighter constraints.
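For reference, the objective a VAE actually optimizes is the evidence lower bound on $\log p(x)$,
\[
\log p(x) \;\ge\; \mathbb{E}_{q_\phi(z\mid x)}\big[\log p_\theta(x\mid z)\big] \;-\; \mathrm{KL}\big(q_\phi(z\mid x)\,\|\,p(z)\big),
\]
so it maximizes a bound on the data likelihood rather than $p(X)$ itself; the denoising would presumably come from encoding a noisy $x$ through $q_\phi(z\mid x)$ and decoding back through $p_\theta(x\mid z)$.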
@Unknown: I don't disagree!
If you do end up looking into variational autoencoders, I'd be grateful if you gave updates on the blog. I would be curious to see if and how much better they perform (my instinct says much better).
Apologies for what I feel is excessive commenting on my part, but I thought you would definitely be interested to know that your idea already exists in the machine-learning community (it goes by the name AmbientGAN). I just saw a post about it in the machinelearning subreddit.