19 August 2012

NSF AST Portfolio Review

Recently, amid US government budget shortfalls, the National Science Foundation (NSF) convened a panel to review the research priorities set forth by the 2010 Decadal Survey, otherwise known as Astro2010. The NSF AST Portfolio Review reassessed the research priorities and objectives for US ground-based observatories, a reassessment that carries potentially large implications for the astronomical community if or when the recommendations are upheld. Included in this was the recommendation to "divest" from facilities such as the WIYN and 2.1-meter telescopes atop Kitt Peak. However, a number of facilities were recommended for continued support, most notably ALMA and the VLA. For those interested, the full document is freely available online.

Given that there has been some time for people to read the document, or various blog posts summarizing it, what are your thoughts? Do you feel the budget allocation recommendations were adequate, or were there certain projects recommended for divestment that you were hoping to see remain open? Finally, how do the recommended budget allocations affect your current research or your future research plans, if at all?

14 February 2012

Naked Stellar Core

Title: Discovery of a stripped red giant core in a bright eclipsing binary star
Authors: P.F.L. Maxted, D.R. Anderson, M.R. Burleigh, et al.

Before delving into the topic, it should be pointed out that the discovery paper was first published in the Monthly Notices of the Royal Astronomical Society in September 2011. The article can be found here.

Figure 1. Phase folded light curve for 1SWASP J024743.37-251549.2.

The title of the article is very effective at describing the system presented in this paper. Maxted et al. announced the discovery of an eclipsing binary system in which the primary star completely occults the secondary star. However, the secondary star is found to have a higher effective temperature than the primary. Photometric analysis suggests the primary is an A star and that it contributes approximately 90% of the total flux of the system.

Figure 1 neatly illustrates this. The deeper eclipse (at phase 0) occurs when the cooler star passes in front of the hotter star along our line of sight. The flat bottom of that eclipse implies the hotter star is being completely occulted, yet the total flux received from the system is only modestly reduced, since the occulted star contributes only about 10% of the light.
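
For readers less familiar with light curves like Figure 1: the curve is built by folding the photometry on the orbital period. Below is a minimal sketch in Python of how one might do that fold; the period, epoch, eclipse depth, and sampling are placeholders for illustration, not the values measured by Maxted et al.

```python
import numpy as np

def phase_fold(times, fluxes, period, t0):
    """Fold a time series on a known orbital period and sort by phase."""
    phase = ((times - t0) / period) % 1.0   # orbital phase in [0, 1)
    phase[phase > 0.5] -= 1.0               # center the deeper eclipse at phase 0
    order = np.argsort(phase)
    return phase[order], fluxes[order]

# Synthetic demonstration: a flat-bottomed ~10% eclipse at phase 0.
rng = np.random.default_rng(0)
period, t0 = 1.0, 0.0                                      # placeholder period/epoch
t = np.sort(rng.uniform(0, 30, 2000))                      # 30 nights of sampling
ph = ((t - t0) / period) % 1.0
flux = np.where(np.minimum(ph, 1 - ph) < 0.02, 0.90, 1.0)  # total eclipse of the hot star
flux = flux + rng.normal(0, 0.005, t.size)                 # photometric noise

folded_phase, folded_flux = phase_fold(t, flux, period, t0)
```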

To further complicate things, a rough age estimate can be obtained from the system's kinematics. Space motions indicate the system is part of the Galactic thick disk, meaning it has undergone significant disk heating and has acquired a larger vertical component to its motion than would be expected for a young system forming in the Galactic thin disk. The characteristic age of the Galactic thick disk is > ~7 Gyr, meaning the system is well older than the main-sequence lifetime of an A star!

So what is going on here? Numerical modeling shows that the secondary is consistent with a red giant that has had its outer layers stripped off, leaving behind a He core with a H envelope. Some H burning is still taking place in a shell around the inert He core. But how was that mass stripped off, and where did it end up?

Since the secondary was plausibly a red giant before it had mass stripped away, and since the primary is, as far as we can tell, a normal main sequence star, the secondary must originally have been the more massive star in order to reach a more advanced evolutionary stage first. As it puffed up on its way to becoming a red giant, it overflowed its Roche lobe and began funneling material onto its lower-mass companion. Eventually the mass transfer halted, leaving the system in the state we find it today.

Stellar evolution dictates that a star will begin its ascent up the red giant branch once it has exhausted most of the hydrogen in its core. However, the core is not yet hot enough to ignite helium burning (so the core contracts while the envelope inflates), leaving the star with an inert He core. Once the H shell burning halts, it will live out the rest of its life as a He white dwarf; for now it is a pre-He-WD. As for the star that received the additional mass, it is now living the life of a higher-mass star (hence the A spectral type). Such stars are known as blue stragglers.

The system is rare and exciting, but the WASP team has indicated they have more examples of blue straggler/pre-He-WD systems. Stay tuned and keep an eye out for more of these remarkable systems!

17 January 2012

Circumgalactic Media and Their Hosts

Title: The Large, Oxygen-Rich Halos of Star-Forming Galaxies Are a Major Reservoir of Galactic Metals
Authors: J. Tumlinson et al.

Galaxies grow and evolve by accreting gas from the intergalactic medium (IGM), forming stars with this material, and ejecting the often-enriched remnants through galactic-scale outflows.  At the convergence of these processes lies the circumgalactic medium (CGM), the gas surrounding galaxies out to 100 to 300 kpc.  This paper investigates the relationship between the properties of host galaxies and their CGM using absorption-line spectroscopy with the Cosmic Origins Spectrograph (COS) on the Hubble Space Telescope.

The method of measuring absorption lines through the CGM is as follows.  The study focuses on 42 sample galaxies that lie close to distant QSOs in projection on the sky.  As light from the QSOs passes through the CGM of these galaxies, absorption lines can be measured, specifically the ultraviolet O VI doublet at 1032 and 1038 Å.  The data are used to measure O VI column densities, line profiles, and radial velocities of the CGM with respect to the host galaxies.  Furthermore, the Keck Observatory Low-Resolution Imaging Spectrograph (LRIS) and the Las Campanas Observatory Magellan Echellette (MagE) spectrograph were used to measure the redshift, star formation rate, and metallicity of each galaxy.
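
As a rough illustration of how a column density follows from an absorption-line measurement (a simplified, optically thin approximation, not the authors' actual fitting procedure, with illustrative numbers rather than measurements from the paper):

```python
import math

def column_density_thin(ew_angstrom, wavelength_angstrom, f_osc):
    """Column density (cm^-2) from a rest-frame equivalent width, using the
    optically thin relation N = 1.13e20 * W / (f * lambda^2), with W and
    lambda in Angstroms.  Valid only for unsaturated lines."""
    return 1.13e20 * ew_angstrom / (f_osc * wavelength_angstrom**2)

# The O VI 1032 line has an oscillator strength of roughly f ~ 0.13.
N_OVI = column_density_thin(ew_angstrom=0.3, wavelength_angstrom=1031.9, f_osc=0.13)
print(f"log N(O VI) ~ {math.log10(N_OVI):.2f}")   # ~14.4, a typical CGM-scale column
```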

Not surprisingly, the study found that each CGM was close in radial velocity to its host galaxy, suggesting a close physical and gravitational relationship.  Furthermore, there is a correlation between O VI column density and specific star formation rate (sSFR): active, star-forming galaxies have much higher column densities than passive galaxies.  This reflects the bimodality of galaxies and suggests that the CGM either directly affects or is affected directly by the galaxy's star formation.

The CGM also contains a substantial fraction of the metals in the galaxy, and the ratio of CGM metals to ISM metals increases with decreasing galaxy mass.  These metals were most likely created in the galaxies and then transported into the CGM by outflows.  Taking into account the amount of oxygen returned to the ISM during star formation and the typical star formation rate, the authors estimate that the oxygen in the CGM could have been deposited there over several billion years of star formation and outflow.  Furthermore, the observed O VI outflows do not exceed the galaxies' escape velocities, suggesting that this enrichment could eventually fall back onto the galaxy to fuel further star formation.

23 December 2011

Does Not Compute

Title: How Will Astronomy Archives Survive the Data Tsunami?
Authors: G.B. Berriman & S.L. Groom

Astronomy abounds with data. The scope of currently archived data ranges from large telescope projects to the personal data that astronomers have accumulated over the years from long nights at ground-based telescopes. Storing and accessing these data has so far been handled fairly successfully, but the quantity of data is increasing rapidly (~0.5 PB of new data per year) and is set to explode in the near future (60 PB of total archived data by 2020). With this rapid increase comes the requirement for more storage space as well as more bandwidth to support large and numerous database queries and downloads.
Figure 1: Growth of data downloads and data queries at IRSA from 2005 to 2011

The primary focus of this paper is how we should address data archival issues from both the server-side and client-side perspectives. LSST, ALMA, and the SKA are projected to generate petabytes of data, and it is possible that our current infrastructure is insufficient to support the needs of those observatories. For instance, even if there is a database large enough to store and archive the data, searching it for a particular image or set of images would require parsing through all of the data. We must also consider the bandwidth strain if numerous people are downloading data remotely: for LSST, a projected 10 GB image file size could make remote downloads intractable, and data rates would suffer greatly.
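
To get a feel for the bandwidth problem, here is a back-of-the-envelope sketch; the link speeds and user counts are assumptions for illustration, not numbers from Berriman & Groom.

```python
def transfer_hours(file_size_gb, link_mbps, n_users=1):
    """Naive transfer time in hours, assuming the link is shared evenly
    among n_users and ignoring protocol overhead (deliberately optimistic)."""
    bits = file_size_gb * 8e9
    return bits / (link_mbps * 1e6 / n_users) / 3600.0

for mbps in (10, 100, 1000):
    print(f"{mbps:4d} Mbps: one 10 GB image in {transfer_hours(10, mbps):5.2f} h, "
          f"or {transfer_hours(10, mbps, n_users=50):6.1f} h shared by 50 users")
```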

Berriman and Groom study potential issues that are already cropping up in smaller archival data sets and attempt to provide paths forward. These paths include developing innovative means to memory-map the stored data in order to lessen the load on the server and allow for more rapid data discovery, providing server-side reduction and analysis procedures, utilizing cloud computing to outsource the data storage problem, and implementing GPU computing. The technology is not yet in place to make all of these paths immediately beneficial, meaning the astronomical community should be looking to promote and partner with cyber initiatives and also educate its own members so they may more effectively contribute to the overall efficiency of computational resources.
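
As a toy illustration of the memory-mapping idea (not the archives' actual infrastructure), numpy's memmap lets a program touch a small cutout of a huge on-disk array without ever reading the whole file:

```python
import numpy as np

# Real archives deal with FITS files and far more elaborate indexing; the
# file name and dimensions here are purely hypothetical.
shape = (8000, 8000)                                     # ~256 MB of float32 pixels
image = np.memmap("big_mosaic.dat", dtype=np.float32,
                  mode="w+", shape=shape)                # scratch file on disk
image[3990:4010, 3990:4010] = 1.0                        # only touched pages hit the disk

stamp = np.array(image[3900:4100, 3900:4100])            # load just a 200x200 cutout
print(stamp.shape, stamp.sum())                          # (200, 200) 400.0
```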

This last suggestion from the authors is what stirred up the most conversation. In particular, the authors recommend that all graduate students in astronomy be required to take a long list of computational courses (e.g., software engineering). A quick tally of their learning requirements means that a typical graduate student would need an additional 3-6 courses, or roughly an extra year of coursework. While adding an extra year to graduate school doesn't seem very attractive, it was suggested that summer workshops would be extremely helpful and advantageous. A 1-2 week program could provide an intensive introduction to many of the highlighted skills astronomers might soon be expected to have (parallel programming, scripting, code development, database technology). One comment even threw out the idea that Dartmouth host such a school - quite possible, so keep an eye out!

What do you think about the future of computing in astronomy? Do we need to up the computational coursework for students or just hire computer scientists? Are there any tools or technologies you believe might be beneficial for astronomers to implement?

14 November 2011

Tut-tut, it looks like rain.

Title: Measuring NIR Atmospheric Extinction Using a Global Positioning System Receiver
Authors: C. H. Blake & M. M. Shaw

Ground-based astronomical observing has one major obstacle that it must overcome in order to produce science-quality data: Earth's atmosphere. Methods to correct for atmospheric attenuation are familiar to anyone who has taken data at a ground-based telescope, or who has at least studied observational astronomy. As an example, observations of "standard" stars are required not only to calibrate the detector, but also to differentially correct for atmospheric effects on a given night.

Figure 1: Components of atmospheric absorption presented in Blake & Shaw (2011)

Atmospheric absorption and attenuation are particularly noticeable in the near infrared (NIR), where absorption bands due to molecular species in the atmosphere efficiently absorb much of the incoming flux. Typically, narrow-band filters with transmission peaks between these molecular bands are used to skirt the difficult procedure of correcting for molecular absorption.

In this paper, however, Blake and Shaw propose a novel and interesting method for correcting astronomical images affected by absorption due to water molecules. They propose using signals from the Global Positioning System (GPS) to infer the water content of the atmosphere, allowing for more accurate atmospheric transmission modeling. Because GPS signals must themselves be corrected for their passage through the atmosphere, the authors propose that the same information can be applied to correct for the attenuation of light from astrophysical sources.

Of greatest interest to the authors is the derivation of the precipitable water vapor (PWV) in the atmosphere. What is PWV? It is conceptually very simple - PWV is the column-integrated depth of water vapor if all of the water were to precipitate out of the atmosphere instantaneously, and as such it is measured in units of length (typically mm). Essentially, it is what your rain gauge would measure if all of the water vapor in the atmosphere directly above the gauge were to condense and fall to the ground. In the 1990s, it was shown that multi-wavelength GPS signals combined with a highly accurate barometer could yield a very accurate derivation of PWV.
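
To make the unit concrete (the numbers below are illustrative, not measurements from the paper): because liquid water has a density of 1000 kg/m^3, every kilogram of water vapor above a square meter of ground corresponds to 1 mm of PWV.

```python
# PWV is the column mass of water vapor divided by the density of liquid
# water, so 1 kg/m^2 of vapor corresponds to 1 mm of PWV.
column_mass = 10.0                      # kg of vapor above each m^2 (illustrative)
rho_water = 1000.0                      # kg/m^3
pwv_mm = column_mass / rho_water * 1e3  # convert meters of liquid water to mm
print(f"{column_mass:.0f} kg/m^2 of water vapor -> PWV = {pwv_mm:.0f} mm")
```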

Figure 2: An empirically derived fit for the correlation between PWV + airmass and the atmospheric optical depth of water.

With numerous GPS stations set up across the United States to measure PWV, the authors were easily able to obtain PWV measurements near their location (Apache Point Observatory in NM). An empirical relation was then derived relating PWV (and airmass) to the optical depth of water in the atmosphere - assuming the optical depth is related directly to the amount of water along the line of sight. As you can see on the left, the figure reveals a fairly obvious correlation (even in the absence of error bars). The empirically derived optical depth can then be fed into atmospheric transmission models, allowing for a more accurate estimate of the correction needed to remove the effects of water absorption.
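
Schematically, the correction might look like the sketch below; the linear coefficients are placeholders rather than Blake & Shaw's fitted values, which depend on the bandpass being observed.

```python
import numpy as np

def water_transmission(pwv_mm, airmass, a=0.0, b=0.02):
    """Schematic transmission correction for water absorption, assuming the
    optical depth scales linearly with PWV * airmass: tau = a + b * PWV * X.
    The coefficients a and b here are placeholders, not fitted values."""
    tau = a + b * pwv_mm * airmass
    return np.exp(-tau)

# Correct an observed flux for water absorption (illustrative numbers only).
f_obs = 1250.0          # measured counts
pwv, X = 4.0, 1.3       # 4 mm of precipitable water vapor at airmass 1.3
T = water_transmission(pwv, X)
print(f"transmission = {T:.3f}, corrected flux = {f_obs / T:.1f}")
```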

Figure 3: Differential colors of over 6,000 M stars are binned according to PWV at the time of observation.

An example correction is presented for numerous stars, and is most evident in the correction of M star colors. From a sample of 6,177 mid-M stars, the authors binned the data as a function of PWV, where the data are the deviations of each star's color from an assumed stellar color locus. Their corrections are then applied, and the result is illustrated to the right.
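
A minimal sketch of that binning step might look like the following, with fake data standing in for the real M-star sample.

```python
import numpy as np

def median_by_pwv_bin(pwv, color_residual, edges):
    """Median color residual in each PWV bin."""
    idx = np.digitize(pwv, edges)
    return np.array([np.median(color_residual[idx == i]) if np.any(idx == i)
                     else np.nan for i in range(1, len(edges))])

rng = np.random.default_rng(1)
pwv = rng.uniform(0, 15, 6177)                      # mm of PWV at each observation
resid = 0.01 * pwv + rng.normal(0, 0.03, pwv.size)  # made-up PWV-dependent color shift
edges = np.linspace(0, 15, 7)                       # six PWV bins
print(median_by_pwv_bin(pwv, resid, edges))
```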

Overall the technique is novel and holds a lot of promise, but it is strongly dependent on atmospheric transmission models. While these models provide a very good estimate of the atmospheric transmission, they still suffer from many uncertainties. Nonetheless, the authors have demonstrated that their corrections appear to do very well - assuming M stars not lying near the color locus are unaffected by other systematics (metallicity, etc.). The plan is to implement such corrections in large surveys, such as SDSS and LSST, to allow for more accurate characterization of M stars in the hunt for exoplanets - not to mention the corrections needed to accurately characterize the transmission spectra of exoplanets, supposing the exoplanets are in a position for this method to be possible (i.e., transiting).


10 November 2011

Chemical Evolution of TN J0924-2201 at z = 5.19

Title: Chemical properties in the most distant radio galaxy
Authors: Matsuoka, K. et al.

Measuring the chemical evolution of galaxies can give clues to their star formation histories. This is often done by measuring the metallicity of galaxies at various redshifts. However, as with all things astronomical, this becomes more difficult at high redshift. To alleviate this, studies have used active galactic nuclei (AGN) to measure the metallicities of high-redshift radio galaxies (HzRGs), taking advantage of their high luminosities. Specifically, gas clouds photoionized by the active nucleus emit lines in the ultraviolet (among other wavelengths) that, at these redshifts, are shifted into the optical. Quite convenient.
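
The "quite convenient" part is simply redshift at work: a line emitted at rest wavelength λ_rest is observed at λ_obs = (1 + z) λ_rest, which at z = 5.19 moves the rest-frame UV lines into the range of optical spectrographs. A quick check:

```python
# Rest-frame ultraviolet lines are redshifted into the optical at z = 5.19.
z = 5.19
rest_wavelengths = {"Ly-alpha": 1215.67, "C IV": 1549.1}   # Angstroms
for name, rest in rest_wavelengths.items():
    print(f"{name:8s}: {rest:7.1f} A rest  ->  {(1 + z) * rest:6.0f} A observed")
```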

The currently accepted model of a typical AGN is as follows. Material from an accretion disk falls onto the central black hole, which is encircled by a large torus of gas and dust. Thus, if we observe an AGN from a pole, we see all the way down into the center, where the large velocity dispersion of the gas creates broad emission lines (this is known as the broad line region, or BLR). However, if we instead view an AGN edge-on, the torus obscures the BLR, and we see only the emission lines from the slower-moving gas clouds farther from the black hole. This, naturally, is known as the narrow line region (NLR).

Figure 1: Line ratios showing possible metallicity evolution.

Studies using AGN have found no metallicity evolution up to z ~ 6. However, this may be because many of these studies focused on the BLR, which could have evolved faster than the rest of the galaxy. Matsuoka et al. (2011), then, concentrate on the NLR of the most distant radio galaxy, TN J0924-2201 at z = 5.19 (catchy name). Using the Faint Object Camera and Spectrograph (FOCAS), they detect Lyα and C IV lines, the first time C IV has been detected from a galaxy with z > 5. This indicates that a significant amount of carbon exists even in this high-z galaxy. Additionally, the Lyα/C IV ratio is slightly lower than that of lower-z HzRGs (Figure 1), suggesting possible metallicity evolution due to the higher amount of carbon. However, this could also be attributed to weaker star-formation activity or Lyα absorption. Upper limits on N V/C IV and C IV/He II were also measured, but these agree with lower-z HzRG measurements.

The authors also investigate the [C/O] abundance ratio by comparing the observational limits on N V/C IV and C IV/He II to photoionization models, the results of which are shown in Figure 2. Carbon enrichment in these galaxies is delayed relative to the α elements, because much of the carbon is produced by intermediate-mass stars (which live longer than the stars that create α elements). Thus, [C/O] serves as a useful clock for the star formation history. The analysis finds a lower limit on [C/O] of -0.5, suggesting that this galaxy has already experienced some chemical evolution. Comparing this limit to previous models suggests an age for TN J0924-2201 of a few hundred million years.

Figure 2: Lower limit of [C/O] abundance from photoionization models.

01 November 2011

JWST Passes Senate

Title: JWST Bill
Authors: CJS Subcommittee Chairwoman Mikulski

The James Webb Space Telescope (JWST), the planned successor to Hubble, has been in the news a lot over the past year. As the budget for the project increased once again, the will of the US Government to actually finish the project was called into question. Well, after a long battle, a bill that explicitly outlines support and funding for a 2018 launch of JWST was passed by the Senate. If you are interested in reading the Appropriations Committee's press release, see the title link above. The bill must still be approved by the House of Representatives, which might prove to be the more difficult task.

An artist's conception of JWST. Credit: NASA/ESA

However, it is very interesting to note that not everyone - politicians, astronomers, scientists, the public - is in favor of JWST. There seems to be a divide, since the money to fund JWST must come from somewhere, possibly other NASA projects (e.g., smaller satellites, research grants). Others are adamantly in favor of it because the scientific questions it will answer (or plausibly answer) are important.

This topic came up at astro lunch this afternoon, before we knew about the bill passing. Now I pose the question to the readers: is JWST a worthwhile investment? I hope to hear all of your comments and opinions.