2013-05-31

NIPS submitted

Lang was in Pittsburgh writing code and reading off answers by Skype. Schölkopf and I were writing in disjoint parts of the text. One of the principal results of the paper went in less than 120 seconds before the deadline. And yet we made it. We submitted to NIPS. This is a project Lang proposed only 11 days ago, and we finished it today. Schölkopf's verdict: Deadlines are good! My verdict: Deadlines are unhealthy!

We plan to write an astronomer-friendly version and post it to arXiv.

2013-05-30

Open-Source Sky Survey

I haven't thought about the Open-Source Sky Survey in a long time, but my NIPS submission with Lang and Schölkopf made me think about it again. We can combine all the images of the night sky ever taken into an all-sky map with an algorithm that is robust to calibration and transformation problems and that is linear in the total data size. That's awesome! Now I just hope we maintain momentum after the (unhealthy, insane) NIPS deadline.

[ps. This post violates the anonymity requirements of NIPS. That just shows that submitting to NIPS is not consistent with my ideals of open science.]

2013-05-29

saving Kepler, calibrating Kepler

Fergus, Schölkopf, Foreman-Mackey, Fadely, and I (yes, most of CampHogg) had a big free-for-all on how we might (first) make current Kepler lightcurve data more precise, and (second) develop calibration and photometry systems that would permit Kepler to operate in radical modes if it is stuck with only two reaction wheels going forward. On the second point, we didn't get very far, except to decide that if the things the Kepler team is currently trying fail, we will chime in with the suggestion that some observing modes they might be ruling out out of hand (for perfectly sensible reasons) could still be scientifically useful, if they are accompanied by a very hard-core calibration scheme.

On the first point, Fergus wrote down a full causal generative model for the Kepler data under weak assumptions, and then I linearized it (possible because pointing and temperature variations are small in relevant units). We found that every light curve ought to be expressible as a true light curve times a linear function of spacecraft attitude and temperature perturbations, plus another linear function of spacecraft attitude and temperature perturbations. This is a kind of matrix factorization; in the limit that the photon noise variance is small, this problem is solved by PCA, of all things. That ugly algorithm rears its ugly head again! But this also justifies things we have been hearing from the Kepler camp, namely that a low-dimensional PCA model is good at describing lightcurves. And I am pleased to note that the correct generalization as the noise variance gets large is my own HMF. The day ended with Foreman-Mackey doing some experiments.
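
For my own memory, here is a minimal sketch of the PCA flavor of that fit in the small-photon-noise limit: build a low-rank basis of common trends from the other stars' lightcurves and fit it linearly to the target star. The array names, shapes, and fake data below are made up purely for illustration; this is not the Fergus generative model itself, just the kind of factorization it reduces to.

```python
import numpy as np

# Hypothetical inputs: `lightcurves` is an (n_stars, n_times) array of
# median-normalized Kepler lightcurves, and `target` is the star we
# want to correct; the fake data here just makes the sketch runnable.
rng = np.random.default_rng(42)
n_stars, n_times = 200, 1000
lightcurves = 1.0 + 1e-3 * rng.standard_normal((n_stars, n_times))
target = 1.0 + 1e-3 * rng.standard_normal(n_times)

# Low-rank basis of common trends from the other stars, via SVD (PCA).
K = 8
resid = lightcurves - lightcurves.mean(axis=1, keepdims=True)
_, _, Vt = np.linalg.svd(resid, full_matrices=False)
basis = Vt[:K]                                  # (K, n_times) common trends

# In the small-noise limit the model is linear in these trends, so the
# fit is ordinary linear least squares.
A = np.vstack([np.ones(n_times), basis]).T      # design matrix
coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
systematics = A @ coeffs
corrected = target / systematics                # multiplicative correction
```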

2013-05-28

spacecraft issues

Foreman-Mackey and I plotted Kepler reaction-wheel angular-momentum adjustment events onto the residuals away from one of our fits to a lightcurve today, not to diagnose the spacecraft's reaction-wheel issues but rather to check whether we can make an effective model for the perturbations that these adjustment events (indirectly) imprint on the photometry. We decided that the effect was too small to worry about just yet; we tabled it until we find that we are forced to deal with it (because, say, it interferes with discovery or measurement). In related news, Fergus called a meeting for tomorrow to discuss crazy ideas for saving Kepler. Insane!

2013-05-24

Bill Freeman

Bill Freeman (MIT) came into town for the day today. In the morning he showed us work his group has been doing to infer the flow of turbulent air through a scene in which high-quality video has been shot. By using assumptions about the background scene, he can look for the motion of the (extremely tiny) optical distortion pattern in the scene and try to get the air movement. Applications include airport safety and astronomical imaging, where in principle observations of resolved objects (Freeman is interested in the Moon, as I have mentioned previously) could be used to build a model of the atmosphere and improve image quality (or adaptive optics control loops).

In the afternoon, Freeman showed (in a seminar) his work on motion amplification. He can take tiny, tiny motions in a video stream (think: your pulse, your breathing, the swaying of a rigid building) and amplify them, but importantly without building a model of the motion. His highest-performing systems work so simply: take a spatial Fourier transform of the images in the video stream, and amplify the temporal variations of the Fourier-component phases (within some window function, and with some filtering, and so on). This method amplifies the motion but doesn't amplify the noise in the image, because it doesn't amplify the intensity or amplitude of any mode or component. The results are astounding! Interesting to think of astronomical applications: one might be predicting the future appearance of time-variable nebulae like V838 Monocerotis and supernova 1987A and η Carinae.
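
To fix the idea for myself, here is a toy, whole-image Fourier version of the phase trick, with none of the windowing, pyramid decomposition, or temporal filtering of the real systems; the function name and the amplification factor `alpha` are my own hypothetical choices, not Freeman's implementation.

```python
import numpy as np

def amplify_motion(frames, alpha=10.0):
    """Crude phase-based motion amplification: amplify the temporal
    deviations of each spatial-Fourier-mode phase about its reference
    value, leaving the amplitudes (and hence the per-mode noise level)
    untouched. `frames` is an (n_frames, ny, nx) array."""
    F = np.fft.fft2(frames, axes=(1, 2))
    amp = np.abs(F)
    phase = np.angle(F)
    ref_phase = np.angle(F.mean(axis=0))                  # reference phase per mode
    dphase = np.angle(np.exp(1j * (phase - ref_phase)))   # wrapped phase deviation
    F_amp = amp * np.exp(1j * (ref_phase + alpha * dphase))
    return np.fft.ifft2(F_amp, axes=(1, 2)).real
```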

2013-05-23

data science; autonomous robots

The morning was spent with various NYU luminaries preparing for a proposal (or really a design process leading up to a proposal) to the Moore and Sloan Foundations relating to "data science". The most interesting aspects of the question are that no two people agree on what data science is, and even for any component of data science, no two people agree on what the big issues are that need addressing. So that is the beginning of an interesting set of conversations.

At lunch, Foreman-Mackey and I met up with Schiminovich to discuss a proposal that Schiminovich wants to put in to NASA about autonomous platforms that will make space missions and suborbital projects cheaper, faster, and easier to deploy. The idea is for the platform to be able to make scientific decisions and re-scopes in real time, in reaction to changing conditions. Could be fun! Of course I argued that we should take an economic model, with utilities specified in dollars. And I need more projects to be doing, because I just don't have enough irons in the fire! (Note ironic tone.)

2013-05-22

robust rank statistics

While Lang and I programmed like mad, Schölkopf read the literature on rank statistics (and galaxies with faint features). We realized that we need to do something much more robust in our combinations of rank information. We implemented a more robust method, with Schölkopf wondering if there is something much better we could be doing. Results will appear tomorrow (or late tonight).

2013-05-21

image pixel ranks, probability, provenance

In an argumentative session, we decided that everything we did and thought yesterday about combination of images was wrong, and re-started. The argument was long and complicated, but ended up delivering a very simple algorithm. The idea is to use the rank information in an input image to update or improve our beliefs about the rank information for pixels in a combined or reference image. The point of this is that we don't believe the intensity information in the images but we do believe that brighter parts are probably truly brighter. A lot of what made things complicated is that sometimes an input image covers only part of the reference image; in this case we only want to use it to reorder the pixels within its footprint.
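
One very literal (and surely too aggressive) reading of that update, just to have something concrete on the page: within the input image's footprint, reorder the existing reference-pixel values so that their ranking agrees with the input's ranking. The function below is a hypothetical sketch along those lines, not the algorithm we actually settled on, which updates beliefs rather than doing a hard reassignment.

```python
import numpy as np

def update_ranks(reference, image, footprint):
    """Reorder the reference pixels inside the boolean `footprint` so
    that their brightness ranking matches the ranking of the registered
    input `image`, keeping the multiset of reference values fixed."""
    ref = reference.copy()
    idx = np.flatnonzero(footprint)
    order = np.argsort(image.flat[idx])      # input's brightness ordering
    sorted_vals = np.sort(ref.flat[idx])     # reference values, sorted
    new_vals = np.empty_like(sorted_vals)
    new_vals[order] = sorted_vals            # k-th dimmest input pixel gets
    ref.flat[idx] = new_vals                 # the k-th dimmest reference value
    return ref
```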

In a not totally unrelated conversation we asked the following question: How can you combine the rolls of two six-sided dice such that you get a random integer uniformly distributed between 1 and 6? The constraint is: You must use the two dice symmetrically. One solution: Roll the two dice and then randomly choose one die and read it. We came up with a few others. You can't add the two dice rolls and divide by two, because then the result isn't uniformly distributed between 1 and 6. The central limit theorem is a hard thing to fight against. My favorite solution: Make a 6x6 table, in which the numbers from 1 through 6 each appear 6 times, but placed in the table randomly. Roll two dice; use the first to choose the row and the second to choose the column in the table. That's a hash, I think, mapping the two rolls (which jointly produce 36 different outcomes) onto 6 numbers.
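
That favorite solution is simple enough to write down; the table construction below is just one way to build the hash.

```python
import random

# A random 6x6 table in which each of the numbers 1..6 appears exactly
# 6 times; since all 36 cells are equally likely, the output is uniform.
entries = [n for n in range(1, 7) for _ in range(6)]
random.shuffle(entries)
table = [entries[6 * i:6 * i + 6] for i in range(6)]

def combine(die1, die2):
    """Map the two dice (36 equally likely outcomes) onto a uniform
    integer between 1 and 6: one die picks the row, the other the column."""
    return table[die1 - 1][die2 - 1]
```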

At the end of the day, Lang and I used pixel rankings to identify human-viewable images that were built from the same source data. The idea is that the ordering of the noisy pixel values in the sky is like a "digital fingerprint". It seems to work like magic.
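
A minimal version of that fingerprint test might look like the following, assuming the two images are registered cutouts of the same patch of sky; the rank-correlation threshold is an illustrative guess, not a tuned number.

```python
from scipy.stats import spearmanr

def same_source(image_a, image_b, threshold=0.99):
    """If two registered images share the ordering of their noisy pixel
    values (Spearman rank correlation near unity), they very likely
    derive from the same source data."""
    rho, _ = spearmanr(image_a.ravel(), image_b.ravel())
    return rho > threshold
```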

2013-05-20

combining bad images

Dustin Lang arrived for a few days of hacking in preparation for (we hope) putting in a NIPS paper by the deadline of next week. We are working with Schölkopf on a project to combine arbitrarily badly processed human-viewable images to find very faint features in extended astronomical objects (like galaxies and nebulae). We argued for ages about the methodology and started to implement. In the background, while Lang and I pair-coded something somewhat sensible, Schölkopf coded up the straight-up average of the registered images. It looked surprisingly good, causing us to wonder whether it is worth going to all the trouble to which we are going!

2013-05-17

Dr Lewis

George Lewis (NYU) gave an absolutely wonderful defense of his thesis today, on top-quark physics with ATLAS at the LHC. He ruled out a range of new physics in the top mass range, and measured the top pair-production cross section more accurately than it can even be predicted in the standard model. A very nice talk and a very well-deserved PhD.

2013-05-16

big planets, small planets, Earth-like planets

In a low-research day, Tom Barclay (Ames) gave a very nice talk about exoplanets. He made many interesting and novel points. The first was that big planets are still very interesting, because their large impact on the system means that many things can be measured precisely. In particular, he showed examples where you can measure the Doppler beaming of the stellar light resulting from the reflex velocity of the star induced by the planet! Another point was that it is possible to find very tiny planets; he showed some of the smallest planets discovered with Kepler; several are much smaller than Earth. He is personally responsible for the smallest ever. Another point was that there are a few planets now that are debatably and reasonably "habitable". The striking thing is that there aren't yet Earth-sized planets that have been found in year-ish orbits. All known planets are either on shorter orbits or else larger. Time to fix that!

2013-05-15

Kepler searching

Foreman-Mackey and I finished our NASA ADAP proposal. In the afternoon, we hatched a plan with Barclay (Ames) to search the Kepler photometry for very long-period planets, because the Kepler-team searches are weakest there.

2013-05-14

limb darkening

Tom Barclay (Ames), Foreman-Mackey, and I made our plan for hacking this week: We are going to take a multi-tranet (a "tranet" is a transiting planet) system from the Kepler data and infer the host-star limb-darkening profile. Multi-tranet is important, because for any individual tranet (especially at low signal-to-noise), the limb-darkening has substantial covariance with the transit geometry. Indeed, our goal for the week is to find out how constraints on limb-darkening improve as the system increases in the number of tranets; I predict that it will improve faster than the total signal-to-noise in the transits, because the different tranets will place (at least slightly) different constraints on the appearance of the star. But we will see. By the end of the day, some progress was evident, although not through any useful action of mine!
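
For concreteness, the kind of profile we mean is something like the standard quadratic limb-darkening law below; whether we fit this exact parameterization or something more flexible is part of what the week is for.

```python
def quadratic_limb_darkening(mu, u1, u2):
    """Quadratic limb-darkening law,
    I(mu) / I(1) = 1 - u1 * (1 - mu) - u2 * (1 - mu)**2,
    where mu is the cosine of the angle between the line of sight and
    the local stellar-surface normal, and (u1, u2) are the coefficients
    one would try to infer from the transits."""
    return 1.0 - u1 * (1.0 - mu) - u2 * (1.0 - mu) ** 2

# e.g. quadratic_limb_darkening(0.5, 0.4, 0.2) == 0.75
```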

2013-05-13

Kepler day

Today was Kepler Day at Camp Hogg, kicking off Kepler Week. Tom Barclay (Ames) came into town for a week to help Foreman-Mackey and me understand the Kepler data in more detail. We spent a lot of the day discussing the various physical effects contributing to the instrument-induced variations we see in the Kepler lightcurves. There are some crazy things, including stellar aberration variations, temperature and point-spread-function variations, CCD electronics cross-talk, cosmic rays and bad cosmic-ray removal, and thruster firings. For many of these things we might be able to build a model or help with modeling. The goal for tomorrow is to decide on week-scale goals and execute. At lunch-time, Foreman-Mackey gave a very nice blackboard talk on Kepler systematics and population modeling, which was pretty relevant to everything we did today.

2013-05-11

proposal writing

Foreman-Mackey and I are putting in a NASA proposal to support our Kepler work. I spent a lot of today hacking on it. It has got me excited about what we are doing, which is the best thing a funding proposal can do.

2013-05-09

data preservation, meta-analysis

I spent the day at Radcliffe, in a small meeting arranged by Alyssa Goodman (Harvard) and Xiao-Li Meng (Harvard) on how to curate and keep data for analysis and re-analysis. Most of the discussion in this (free-form, informal, small) workshop was around the idea of meta-analysis and re-use of the data by other users. Some of the interesting ideas that came up were the following: Different people coming from different backgrounds have very different meanings for the word "model" and also many other words, including "data" and "provenance". The goals of data preservation, meta-analysis, re-use, and scientific reproducibility are all related and overlapping. Archivists and curators do best when they get involved with the data as early as possible in the "life cycle", preferably right at the original taking of the data. The concerns that arise with reproducibility and the concerns that arise with privacy (think: health data and the like) are strongly at odds.

Meta-analysis can be described in terms of hierarchical modeling (duh) and we should probably think about it that way. Meng showed some nice results on the idea of sufficient statistics in hierarchical models; specifically, he is thinking about statistics that are sufficient for sub-branches of the full model: When are they also sufficient statistics for the whole model? The range of expertise in the room—from statistics to particle physics to library science—made for a lively conversation, and many (small) disagreements. The goal for tomorrow is to write a document summarizing various things learned.

2013-05-08

combining arbitrarily transformed images

Schölkopf and I worked over coffee to come up with a method for the Lang–Schölkopf idea of combining Web-scraped images using pixel rank information. The idea is that human-viewable images can be very strangely transformed, but if they have been transformed in a way that doesn't re-order pixel brightnesses (at least locally), there ought to be ways to combine them. We came up with several simple methods. It was an interesting conversation, because I like to think about problems as having a causal, generative, probabilistic model underlying them and justify all procedures as being approximations to the Right Thing To Do (tm) within that model framework. Schölkopf likes to think about fast, tractable, scalable procedures with good properties, and only then see if there is an understanding of that procedure in terms of inference. Fortunately, I think we have it all; more soon after we try it out. My job (as usual) is to start the document. While we were talking, Lang was scraping the Web and calibrating images with Astrometry.net.

2013-05-07

pixel brightness rankings

Lang and Schölkopf blew me away today by suggesting that we combine heterogeneous images not by co-adding them but rather by inferring a consistent brightness ranking for the pixels. There are lots of real-world issues (think registration and pixellization and bad data), but there are also lots of reasons that a brightness-ranking analysis might be far more robust than a co-adding procedure for finding very faint structure. We set the scope of a fast project to kick this off and Lang started on the dirty work, which involves scraping the web for images (recall our Comet Holmes project?) and running everything through Astrometry.net.

2013-05-06

take back the data!

At Computer-Vision-meets-Astronomy group meeting this morning, several extremely good ideas were hatched. One idea, originally (in part) from Lang, is to build a model of heterogeneous JPEG images of the sky grabbed from the Web, using not true brightness on a linear or magnitude scale but just brightness ranking. This would get us much of the information we seek about the sky without putting nearly such hard requirements on our PSF and photometric calibration.

Another idea, hatched by Schölkopf after an amazing image-recognition demonstration by Fergus, was to start a company (non-profit) that provides a browser plug-in or skin that delivers image-labeling content to a public database, rather than letting it just get sucked into the black hole that is Google Corporation. The idea is that whenever you do a Google image search, Google learns image labels by looking at which images you subsequently click, or so we hypothesize; that's valuable content, since all high-performing image recognition systems (except Astrometry.net) are data-driven. (A related idea was to start a class-action lawsuit to get everyone's image-labeling data back from Google!)

2013-05-03

black-hole binaries and inference

In a very low-research day, I had a short conversation with Mike Kesden (NYU) about how to distinguish models of black-hole–black-hole binary formation using an Advanced LIGO or eLISA data set. I gave the usual gospel of hierarchical probabilistic modeling with your causal knowledge baked in. The non-research part of the day was spent handicapping the Kentucky Derby.

[Note added later: The Kentucky Derby thing worked out well.]

2013-05-02

SDSS and WISE

I spent an hour or so on the phone with Lang discussing his ambitious project to build a model of all the WISE imaging using as a very strong prior the SDSS imaging catalog. The results are beautiful, and will help enormously in SDSS-IV eBOSS targeting. We talked about various issues with doing enormous least-squares fits like these: The idea is to believe exactly everything we know about the SDSS catalog, the SDSS PSF, and the WISE PSF, and then just find the set of amplitudes (brightnesses), one per catalog entry, that best explains all the WISE pixels. This method makes use of all the WISE epochs but without co-adding them. It also deals as correctly as possible with overlapping sources, since the fit to all amplitudes is simultaneous. It is very beautiful and is—under brutal assumptions—the Right Thing To Do (tm).
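
In the believe-everything limit, the whole thing is one big sparse linear least-squares problem: one column per SDSS catalog entry (its fixed, PSF-convolved profile evaluated at every WISE pixel, across all epochs, with no co-addition), one unknown amplitude per column. The shapes and random stand-in data below are invented purely to show the structure of the solve, not Lang's actual code.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import lsqr

# Stand-in sparse design matrix: each nonzero is the value of one
# catalog entry's unit-amplitude, PSF-convolved profile at one WISE pixel.
rng = np.random.default_rng(0)
n_pixels, n_sources, n_nonzero = 10_000, 500, 20_000
rows = rng.integers(0, n_pixels, size=n_nonzero)
cols = rng.integers(0, n_sources, size=n_nonzero)
vals = rng.random(n_nonzero)
A = csr_matrix((vals, (rows, cols)), shape=(n_pixels, n_sources))
wise_pixels = rng.random(n_pixels)        # stand-in for the WISE pixel data

# One amplitude (brightness) per catalog entry, fit simultaneously.
amplitudes = lsqr(A, wise_pixels)[0]
```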

2013-05-01

writing

Schiminovich and I got to our undisclosed location and then decided to write. We each wrote various things, but I mainly worked on my upcoming proposal to NASA to support my Kepler projects with Foreman-Mackey.