Wow, it's been a while since I've posted anything here, so I hope I'm not out of practice at typing really fast. Anyway, I'm here at the 2015 Winter AAS meeting, and will again be blogging my way through all of the sessions I attend. I would love to start with the Kavli Foundation Lecture on Earth's Van Allen Belts (I didn't know we were still learning new things about them!), but I have to go help set up the AstroBetter booth so I unfortunately have to miss out on what promises to be a really interesting talk. Also, as usual, don't expect anything particularly comprehensive, as it will be me taking really quick notes as the talk goes on.
The first talk I've been able to attend (while not running around doing stuff for AstroBetter) is the early afternoon plenary talk. So here we go! (Science policy... should be interesting).
Science Policy Plenary Talk: What Do We Expect of a Space Program? - John M. Logsdon
Early on, we did, in fact, have an answer to this question. The initial goal was for NASA to be better than anyone else exploring space.
Kennedy's challenge to the American people led to a golden age of science funding: over $40 billion (in today's dollars).
LBJ and Nixon started cutting NASA's budget. Johnson kept funding for the lunar program, but that's about it, and left the challenge to Nixon.
Speaker blames Nixon for NASA's current path of "building capabilities rather than seeking goals".
Nixon really began the process of trimming NASA's budget because he didn't see the benefit of NASA to himself. "space expenditures must take their proper place within a rigorous system of national priorities."
Wow those funding/time charts are always really depressing.
Space science funding is generally allotted to the field as a whole, and we're left on our own to figure out how best to use it. It's also surprisingly stable.
The George Bushes tried to set goals for space exploration. Only one actually put his money where his mouth was (surprise!).
We do finally have a goal (operational, not necessarily a broad vision): sending humans to Mars. We've got a lot of small steps to take first, and space exploration funding still isn't a major priority for most Americans.
The most likely outcome is that we keep muddling along as we have since 1971. Well shit.
122.01D: The Power of a Planet Population - Angie Wolfgang
I sadly came into this talk late, but at least it's a dissertation talk, so I haven't missed everything yet.
I will also be jumping ship to the Supernovae II session shortly, as there are a number of talks on subjects near and dear to my heart: nucleosynthesis and neutrinos.
We're looking at a statistical analysis of the Kepler population prior to K2.
Looking at Weiss & Marcy's mass-radius relationship for the smallest Kepler planets, there is a HUGE dispersion in the planet masses for a given planetary radius.
The dispersion seems intrinsic, and is therefore a physically meaningful parameter. (Also, Chrome doesn't recognize "dispersion" as a word.)
Also interested in the period distribution of Kepler planets and where it comes from. This work seems to be in her thesis rather than being discussed here.
122.02: Characterizing K2 Planet Discoveries - Andrew Vanderburg
As we all know, Kepler started with 4 reaction wheels to stabilize its pointing, and 2 of them broke. So now we have the K2 mission, which uses a really clever solution: pointing Kepler along the ecliptic plane away from the Sun.
In the test field, they observed HIP 116454.
The brightness of the star largely changes as a result of stellar drift across the cameras. Fortunately, it's predictable and, therefore, correctable! And voila! A transit appears!
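The correction works because the systematic flux changes track where the star falls on the detector. Here's a minimal sketch of that idea on toy data (this is my illustration, not the actual K2 pipeline; the function name and numbers are made up): fit the flux as a smooth function of centroid position, divide it out, and the transit that was buried in the drift signal pops out.

```python
import numpy as np

def detrend_flux(flux, centroid, deg=3):
    """Remove flux variations correlated with detector position.

    Fits a low-order polynomial of flux vs. centroid position and
    divides it out, leaving astrophysical signals (e.g. transits).
    """
    coeffs = np.polyfit(centroid, flux, deg)
    systematics = np.polyval(coeffs, centroid)
    return flux / systematics

# Toy data: a drift-induced trend hiding a 1% transit dip.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
centroid = np.sin(0.5 * t)            # slow pointing drift
flux = 1.0 + 0.05 * centroid          # flux tracks detector position
flux[240:260] -= 0.01                 # the buried transit
flux += rng.normal(0, 1e-4, t.size)   # photometric noise

clean = detrend_flux(flux, centroid)  # transit now visible near t ~ 5
```

The real pipeline is fancier (it decorrelates against position along the drift arc, not a single coordinate), but the divide-out-the-position-dependence trick is the core of it.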
HARPS followed up with radial velocity measurements.
HIP 116454b is a super-Earth with a radius 2.5 times that of Earth and a mass of around 11.8 Earth masses.
122.03: The Sixth Kepler Catalog of Planet Candidates: Fergal Mullally
Basically, we have almost all of Kepler's K1 data in the catalog now. 1493 new KOIs from the last catalog update. 554 of those are serious planet candidates.
They also re-vetted all KOIs with periods less than 50 days.
Caveat: beware the high S/N limit. Plenty of deep, V-shaped light curves are probably just eclipsing binaries, potentially with a missing secondary companion.
Also, while we're at it, beware the low S/N limit (because nothing can be simple). The reliability of shallow, long-period detections is decidedly lower than for the rest of the sample.
Also 9 new habitable-zone candidates (with a shout-out to Kopparapu et al. 2013!)
And now I jump ship for supernovae. Mostly so I have the time to find the right room. Also, yay getting a seat near an outlet so my laptop doesn't die during the next round of plenary talks!
121.05: Neutrino Emission from Core-Collapse Supernovae - Evan O'Connor
Supernovae emit the vast majority of their energy in neutrinos (~99%)
What we saw from SN1987A is that half of the detected neutrinos were detected in the first few seconds.
More compact progenitors released more neutrinos within the first few seconds. This is important because the neutrino flux can tell us about the progenitor's composition and structure.
We still have problems making 2D supernova simulations explode. Fortunately we can learn from these failures as well.
When we do get explosions, however, we can see the information about the progenitor's structure within the first 200 milliseconds. Once the explosions happen, we lose all structure information. Oh well.
121.07D: Nucleosynthesis in Axisymmetric Ab Initio Core-Collapse Supernova Simulations of 12-25 Solar Mass Stars - James Austin Harris
He has collaborators at NC State and Oak Ridge. I am so amazingly not surprised by any of this (past experience).
Description of the supernova code, and what codes they use for particular areas (neutrino transport, hydrodynamics, nuclear reaction network, etc.)
Shock is decidedly asymmetric, production of silicon and nickel dominates within the shock.
Unfortunately for me, most of the nucleosynthesis being discussed here is at relatively low mass number, way lower than I ever concerned myself with in my undergrad work.
Certain nuclear species are very sensitive to the conditions of your code. In this case, titanium-44 and chromium-48 seem to be the most sensitive, but the effect on the resulting nickel-56 is non-existent as far as the graph shows.
Ok, if you're going to reference graphs in a talk, please make them large enough that I can read them from the back of the room (though I suppose he didn't necessarily anticipate being in a gigantic freaking room).
In closing: supernova simulations are really fucking hard. Just going 1.4 seconds into the life of the explosion is farther than anyone else has gone so far, apparently. That should give you an idea of how ludicrously computationally intensive these codes are.
121.08: Impact of the Third Dimension on Simulations of Core-Collapse Supernovae - Eric Lentz
Why is modelling supernovae so damn hard? There is a LOT of physics that needs to go into a single simulation. And while 2D models are already ridiculously computationally intensive, the third dimension is even worse: roughly 100x more expensive.
Oh god, I just realized he's using Copperplate Gothic Bold as his title font. Ugh...
Anyway, there are also significant differences between the 1D, 2D, and 3D simulations. You just can't get the same structures developing when you don't have the full 3D hydrodynamical effects.
Holy shit, 3D shock simulations look really fucking cool. I could watch that short video for a long time and not get tired of it.
134 Plenary Talk: Back to the Beginning: The Rosetta Mission to Comet Churyumov-Gerasimenko - Paul Weissman
Comets are pretty generally awesome. Part of the reason is because they are remnants of solar system formation from beyond the snow line. No comment on how processed most comets are.
Previously, we've only had ~hour-long flybys of comets for photography. Before Rosetta, we'd visited 5.
3 of these 5 comets are actually bilobed (like P/Hartley 2), which suggests that they were formed in a collision of some sort where the two bodies were moving slowly enough to stick.
There are a lot of instruments on Rosetta. Same deal with Philae.
One thing we want to do is follow a comet throughout its whole cycle as it moves past the Sun.
Our comet looks like a rubber ducky.
The rotation period of the comet has sped up by 21 minutes between aphelion in 2007 and 2014. Probably due to outgassing providing an odd sort of thrust.
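For scale, that's only a few-percent change in the spin (the ~12.4-hour rotation period here is my addition, not a number from the talk; the 21 minutes is from the talk):

```python
# Rough scale of the comet's spin-up between the 2007 aphelion
# and 2014. The ~12.4 h rotation period is an assumed value;
# the 21-minute change is the figure quoted in the talk.
P_2014 = 12.4 * 60        # rotation period in 2014, minutes
P_2007 = P_2014 + 21      # slower rotation at the 2007 aphelion

fractional_change = (P_2007 - P_2014) / P_2007
print(f"spin-up: {100 * fractional_change:.1f}% over one orbit")
```

A few percent per orbit from outgassing torques alone adds up fast on solar-system timescales.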
Looking at some really cool close-up pictures now.
We see activity in the form of jets and dust grains surrounding the nucleus.
IR images show that the "neck" region is brighter in IR (meaning it's hotter).
Water absorption line seen near 557 GHz. The line is blueshifted because material is coming out of the comet at nearly 700 m/s.
Modeling the periodicity of the outgassing is hard; it doesn't track the rotation period exactly, so something more complicated is likely going on.
Stats all shown in image.
The deuterium-to-hydrogen ratio is more than 3 times that of Earth. We expected something near the Earth value, like Hartley 2, and were horribly wrong.
We've also detected a wide range of volatiles in the comet.
And now onto our favorite little lost lander, Philae.
Philae landed and bounced to a different location entirely. Summary of results included in image form. Possible to re-awaken it in Spring 2015!
135 Plenary Talk: High Energy Neutrino Astronomy: First Light, New Questions - Kara Hoffman
Kara's group uses IceCube, the giant neutrino detector buried in the ice of the South Pole. But first we're going to get an introduction to the history of neutrinos in astronomy. Yay history!
About 100 years ago, we discovered the first cosmic rays with balloon flights. On an interesting note, we still use balloon flights to get higher in the atmosphere to make cosmic ray measurements with less atmosphere in the way. Only now these balloons don't have people on them.
Then Pierre Auger observed particle showers, where a single incoming cosmic ray event creates a ton of other particles from all of the energy of the initial event. This generally occurs when the cosmic ray in question slams into a particle in our atmosphere.
The same thing that makes neutrinos such great cosmic messengers (they can travel very far without interacting with anything ever) makes them a right pain in the ass to detect. Normally we use gigantic tanks of water. IceCube uses the ice of the South Pole as a natural, even bigger reservoir.
IceCube is probably one of the coolest on-Earth projects ever. You can read more about it here.
And at this point I got distracted by talking on Twitter. The rest of what I said can be found on my Twitter feed, found here.
What we really need is sky monitoring in multiple wavelengths to correlate astrophysical neutrinos with crazy astronomical events like gamma ray bursts and supernovae.
IceCube is also constantly being upgraded. Now that we know there are, in fact, astrophysical neutrinos, we can justify it. The goal is to spread out IceCube over another 10 kilometers.