Thursday, 30 June 2011

Astro-Ph: The scatter about the "Universal" dwarf spheroidal mass profile

PhD student, Michelle L. M. Collins, working with Scott Chapman (IoA, Cambridge) and members of the Pan-Andromeda Archaeological Survey (PAndAS) (including me), has had her paper on the kinematics and masses of dwarf spheroidal galaxies accepted for publication in MNRAS. This paper presents further evidence that the dwarf galaxies around the Milky Way are different from those around its sister, the Andromeda Galaxy, and suggests that there must have been some difference in the way these two similar galaxies formed and evolved. Well done Michelle!!!

The scatter about the "Universal" dwarf spheroidal mass profile: A kinematic study of the M31 satellites, And V and And VI

M. L. M. Collins, S. C. Chapman, R. M. Rich, R. A. Ibata, M. J. Irwin, J. Peñarrubia, N. Arimoto, A. M. Brooks, G. F. Lewis, A. W. McConnachie, K. Venn

While the satellites of the Milky Way (MW) have been shown to be largely consistent in terms of their mass contained within one half-light radius (M_{half}) with a "Universal" mass profile, a number of M31 satellites are found to be inconsistent with such relations, and seem kinematically colder in their central regions than their MW cousins. In this work, we present new kinematic and updated structural properties for two M31 dSphs, And V and And VI using data from the Keck Low Resolution Imaging Spectrograph (LRIS) and the DEep Imaging Multi-Object Spectrograph (DEIMOS) instruments and the Subaru Suprime-Cam imager. We measure systemic velocities of v_r = -393.1 +/- 4.2 km/s and -344.8 +/- 2.5 km/s, and dispersions of sigma_v = 11.5 (+5.3/-4.4) km/s and sigma_v = 9.4 (+3.2/-2.4) km/s for And V and And VI respectively, meaning these two objects are consistent with the trends in sigma_v and r_{half} set by their MW counterparts. We also investigate the nature of this scatter about the MW dSph mass profiles for the "Classical" (i.e. M_V < -8) MW and M31 dSphs. When comparing both the "classical" MW and M31 dSphs to the best-fit mass profiles in the size-velocity dispersion plane, we find general scatter in both the positive (i.e. hotter) and negative (i.e. colder) directions from these profiles. However, barring one exception (CVnI) only the M31 dSphs are found to scatter towards a colder regime, and, excepting the And I dSph, only MW objects scatter to hotter dispersions. We also note that the scatter for the combined population is greater than expected from measurement errors alone. We assess this divide in the context of the differing disc-to-halo mass (i.e. stars and baryons to total virial mass) ratios of the two hosts and argue that the underlying mass profiles for dSphs differ from galaxy to galaxy, and are modified by the baryonic component of the host.

Most Distant Quasar Found

A quasar has been found at a redshift of 7.1. While this is not the most distant object discovered (there are gamma-ray bursts and galaxies that have been found at a higher redshift), it is considerably brighter than those objects, has a lovely spectrum and comes with some very pretty graphics;



The lead author on the paper is Dan Mortlock. I've known Dan for years; he was a PhD student at Melbourne when I visited in 1995 (I was finishing up my thesis at the time). Excellent result.

Saturday, 25 June 2011

Astro-ph: Stellar Streams as Probes of Dark Halo Mass

PhD student, Anjali Varghese, working with me and my close colleague, Rodrigo Ibata, has had her paper on probing dark matter halos with stellar streams accepted for publication. Congratulations Anjali!!

Stellar Streams as Probes of Dark Halo Mass and Morphology: A Bayesian Reconstruction

Anjali Varghese, Rodrigo A. Ibata, Geraint F. Lewis

Tidal streams provide a powerful tool by means of which the matter distribution of the dark matter halos of their host galaxies can be studied. However, the analysis is not straightforward because streams do not delineate orbits, and for most streams, especially those in external galaxies, kinematic information is absent. We present a method wherein streams are fit with simple corrections made to possible orbits of the progenitor, using a Bayesian technique known as Parallel Tempering to efficiently explore the parameter space. We show that it is possible to constrain the shape of the host halo potential or its density distribution using only the projection of tidal streams on the sky, if the host halo is considered to be axisymmetric. By adding kinematic data or the circular velocity curve of the host to the fitting data, we are able to recover other parameters of the matter distribution such as its mass and profile. We test our method on several simulated low mass stellar streams and also explore the cases for which additional data are required. 
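For those who haven't met it, parallel tempering is a neat trick for exploring awkward, multi-peaked parameter spaces: you run several MCMC chains against progressively "flattened" versions of the posterior and occasionally let neighbouring chains swap positions, so the hot chains roam freely and hand promising regions down to the cold chain. Here is a toy sketch in Python on a made-up bimodal target of my own; none of this is Anjali's actual code, just an illustration of the machinery.

import numpy as np

def log_post(x):
    # a deliberately nasty, bimodal toy "posterior" (unnormalised, one-dimensional)
    return np.logaddexp(-0.5 * ((x - 4.0) / 0.5) ** 2,
                        -0.5 * ((x + 4.0) / 0.5) ** 2)

temps = [1.0, 2.0, 4.0, 8.0]          # temperature ladder; chain 0 is the "cold" chain
rng = np.random.default_rng(42)
x = np.zeros(len(temps))              # current position of each chain
cold_samples = []

for step in range(20000):
    # ordinary Metropolis update within each tempered chain
    for i, T in enumerate(temps):
        prop = x[i] + rng.normal(0.0, 1.0)
        if np.log(rng.random()) < (log_post(prop) - log_post(x[i])) / T:
            x[i] = prop
    # propose a swap between a random pair of neighbouring temperatures
    i = rng.integers(len(temps) - 1)
    dlog = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (log_post(x[i + 1]) - log_post(x[i]))
    if np.log(rng.random()) < dlog:
        x[i], x[i + 1] = x[i + 1], x[i]
    cold_samples.append(x[0])

# if the swaps are doing their job, the cold chain spends time in both modes
print("fraction of cold-chain samples in the right-hand mode:", np.mean(np.array(cold_samples) > 0.0))

In the real problem the parameters describe the host halo and the progenitor's orbit, and the likelihood compares the model stream with the data, but the tempering machinery is the same.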

Monday, 20 June 2011

Fish

My kids are interested in making a stop-motion movie, so I grabbed Framebyframe and took it for a whirl, and it seems to work. The boys have made a couple of Lego movies, but here's my first attempt;

[video]

I'm quite proud of it.

Sunday, 19 June 2011

Time to fight back!

Oh no!! There was a zombie outbreak in Leicester and it was clear that the city council was completely unprepared. Maybe they had not mathematically modeled the outbreak!!

Let's take the next step in the modeling and add a fight-back factor. This means that there is a chance that, during a Healthy-Zombie interaction, the Zombies themselves can be truly killed and moved into the Dead population.

The dynamical equations become


where we now have a new parameter, δ, which controls the probability of killing Zombies.
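Schematically, in the same notation as the previous post (my shorthand, with ' denoting a time derivative; the new δ H Z term removes truly killed Zombies from Z and adds them to D):

H' = - α H Z - β H Z
I' = β H Z - γ I
Z' = γ I - δ H Z
D' = α H Z + δ H Z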

OK, some runs. Let's try δ=0.003. This means that there is still a good chance of being bitten and becoming infected, but you have a better chance of killing a Zombie than a Zombie killing you. What you get is


This is quite interesting. Clearly, the timescale for things to happen has been extended; it now takes a couple of weeks until the Zombies take over, but the population of Healthy humans still crashes very quickly.

OK - one more test (as breakfast is ready), but let's try δ=0.009, so there is almost as much chance of you killing a Zombie as you getting bitten. What happens now?


Notice that the timescale is much longer now, and we have half-a-year where things are OK, but the outcome is the same; the population of Healthy humans crashes and Zombies take over, but with a lot fewer Zombies in the end.

Right, that's enough for now. Next we will consider hardening of the population, where people become better at killing Zombies and avoiding infection. Then we will add a cure. But that's next.

Saturday, 18 June 2011

When there's no room left in hell.....

It's been a very productive week, and so I am going to take a little time to write up some Zombie Apocalypse code. As I noted previously, this is based on an earlier piece of work, but I am going to extend the model (and modify it to more accurately match zombies in movies).

The starting point is the variables we will be following. These are H, the healthy humans, I, those bitten by a zombie and so on their way to zombiedom, Z, the star of the show, and D, the well-and-truly, non-zombie, dead.

Unlike the previous work by Robert Smith?, I don't assume that any old dead are revived. Rather, those that are bitten become infected, die and become zombies. Other dead remain dead, and, as we know, a knock to the head moves a zombie into the permanently dead camp.

So, my starting dynamical equations are;


where the ' are derivatives with respect to time. The H Z term is the number of Healthy humans multiplied by the number of Zombies, which is a measure of the number of interactions between the two populations.

For the parameters, α controls the rate that an encounter between a healthy human and a zombie results in a dead human (D), whereas β controls the rate that the outcome will be an infected human (I). The remaining parameter, γ, controls the rate that infected humans become zombies.
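Putting that together, the system is, schematically (my shorthand, with ' denoting the time derivative):

H' = - α H Z - β H Z
I' = β H Z - γ I
Z' = γ I
D' = α H Z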

So, let's do a test run. We'll start with 500,000 Healthy humans and a solitary zombie. We are actually going to be working in units of 1000 individuals, and the parameters to start with are α = 0.001, β = 0.01 and γ = 0.5. What do we get?

 
Excellent. We have about a week where nothing really happens (the zombie population is growing very slowly), and then BAM!, the population of healthy humans crashes in a week as the number of infected rapidly rises and then decays away as zombies grow in strength.

This is exactly what we should expect to happen. Everything is flowing out of H and into Z and D, and no matter how we muck about with the parameters (other than having no zombies, or no harm in interacting with them), this is going to be the outcome.
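For anyone who wants to play along at home, here is a minimal sketch of the sort of script that produces these curves. It uses SciPy's odeint with the parameters and starting populations quoted above; the time axis and plotting choices are just for illustration, and my actual code differs in the details.

import numpy as np
from scipy.integrate import odeint
import matplotlib.pyplot as plt

alpha, beta, gamma = 0.001, 0.01, 0.5      # death, infection and conversion rates

def zombie_model(y, t):
    # right-hand side of the H, I, Z, D system described above
    H, I, Z, D = y
    dH = -alpha * H * Z - beta * H * Z     # healthy humans killed or bitten
    dI = beta * H * Z - gamma * I          # the bitten, on their way to zombiedom
    dZ = gamma * I                         # the infected rise as zombies
    dD = alpha * H * Z                     # the well and truly dead
    return [dH, dI, dZ, dD]

y0 = [500.0, 0.0, 0.001, 0.0]              # units of 1000 people: 500,000 humans, 1 zombie
t = np.linspace(0.0, 30.0, 1000)           # time axis; units depend on how the rates are scaled
H, I, Z, D = odeint(zombie_model, y0, t).T

for pop, label in zip((H, I, Z, D), ("Healthy", "Infected", "Zombies", "Dead")):
    plt.plot(t, pop, label=label)
plt.xlabel("time")
plt.ylabel("population (thousands)")
plt.legend()
plt.show()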

What's next? Well, there are two possibilities: finding a cure to stop those who are infected becoming zombies (and this may include a severe case of lead poisoning), and having a population of healthy humans who are good at protecting themselves from zombies.

I'll investigate this later.

Friday, 17 June 2011

God, the Big Bang … next please …

Another article of mine has been published on The Conversation. Titled God, the Big Bang … next please …, it looks at whether the Big Bang really was the birth of the Universe. The reason for writing it is the catchcry from those wanting to undermine modern cosmology, declaring that a Universe born from nothing is illogical.

Because we have no theory of everything, no single picture encompassing gravity and the other forces (and be careful of what you hear from the hype known as superstrings), we cannot see "through" the Big Bang, and don't know what came before it.

But that does not mean we think there was nothing, literally nothing, no time and no space, before the Big Bang, and plenty of ideas are out there on what possibly came before. We could be a daughter universe, born during the formation of a black hole in a previous universe, or just one in a continuing cycle of universes, or something even weirder that we haven't thought of yet.

As I say in the article, I don't think any cosmologist really thinks that the Big Bang was the Beginning.

Anyway, I didn't originally have God in the title. I'm sure it will raise a few eyebrows.

Sunday, 12 June 2011

Talking of Hubble!

A quick post, as it is the Queen's Birthday Long Weekend here.

Hubble Space Telescope Observations Awarded to Sydney Astronomers

9 June 2011

Two astronomers in the School of Physics, Professor Geraint Lewis and Professor Joss Bland-Hawthorn, have each been awarded individual observation time on the Hubble Space Telescope.

"Hubble observation time is internationally sought after and very competitive," says Professor Lewis who is known for his work on galactic clusters.

Professor Lewis' project, targeting old stellar systems known as globular clusters, is part of a large international collaboration led by Dougal Mackey at ANU, which includes astronomers based in North America, the UK and Europe.

"These clusters are orbiting our nearest cosmic neighbour, the Andromeda Galaxy at very large distance. While Andromeda seems to have a number of these, our own galaxy, the Milky Way, does not. We really want to know why there is this difference."

Professor Lewis says that, using Hubble, the astronomers will be able to accurately see individual stars in these globular clusters, something that is impossible from the ground.

"We will be able to chart out the history of the globulars, and work out just where they came from, providing important clues to the formation and evolution of galaxies. So to get observing time on telescope as great as Hubble is fantastic for Australian science."

Professor Bland-Hawthorn, an ARC Federation Fellow, is part of a team comprising five astronomers from the USA and one Australian scientist who have been granted "20 orbits" to use the COS ultraviolet spectrograph.

"We are studying the Magellanic Stream, which is a stream of gas discovered by Australian radio astronomers in the 1970s that wraps right around the Galaxy."

Earlier work by Professor Bland-Hawthorn showed that what can be seen with radio telescopes is only a fraction of the gas.

"Much of it is warm, or even hot, but you can only see this with optical and, especially, UV telescopes," he explains, "We want to confirm this claim from my work and my modelling, and demonstrate that much of the gas falling into the Galaxy is in the form of a warm rain."

Thursday, 9 June 2011

The Hubble Law; or is it?

One of the greatest discoveries of the 20th Century was that the Universe is expanding. Of course, what this means is that we see the light from galaxies redshifted, with more distant galaxies having a larger redshift. Importantly, this is precisely what you expect from a Universe described by general relativity, in terms of the expanding space-time metric given by the Friedmann-Robertson-Walker metric, and governed by the Friedmann equations.

Converting a galaxy's redshift into a velocity (by treating it as a Doppler shift), Hubble's law is v = Ho d, where v is the velocity, d is the distance, and Ho is the (in)famous Hubble constant. It is Ho that tells us how fast the Universe is expanding at the moment.
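To put some rough numbers on it: with Ho of about 70 km/s/Mpc, a galaxy 100 Mpc away is receding at roughly 7,000 km/s, corresponding to a redshift of a little over 0.02; plot velocity against distance for a sample of such galaxies and the slope of the line is Ho.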

Measuring Ho was a big preoccupation of 20th Century astronomy, with the value finally settling at around 72 km/s/Mpc. But who was the first to measure Ho, essentially by plotting distance against redshift for galaxies and measuring the slope? Credit is typically given to Hubble. But the situation is not so clear.

As mentioned in Letters to Nature, a recent paper on astro-ph suggests that Lemaitre beat Hubble to the "linear velocity–distance relationship" (i.e. Hubble's law) by two years.

Today's astro-ph paper by Sidney van den Bergh muddies the water even more. Directly quoting his abstract;

The 1927 discovery of the expansion of the Universe by Lemaitre was published in French in a low-impact journal. In the 1931 high-impact English translation of this article a critical equation was changed by omitting reference to what is now known as the Hubble constant. That the section of the text of this paper dealing with the expansion of the Universe was also deleted from that English translation suggests a deliberate omission by the unknown translator. 

However, a recent paper by Jean-Pierre Luminet directly names the translator as the famous astronomer Arthur Eddington, leader of the expeditions to verify Einstein's general theory of relativity by examining the deflection of starlight by the Sun. As explained by Luminet;

Next, Eddington carried out an English translation of the 1927 Lemaitre article for publication in the Monthly Notices of the Royal Astronomical Society. Here took place a curious episode: for an unexplained reason, Eddington replaced the important paragraph quoted above (where Lemaitre gave the relation of proportionality between the recession velocity and the distance) by a single sentence: "From a discussion of available data, we adopt R'/R = 0.68x10^-27 cm^-1 (Eq. 24)". Thus, due to Eddington's (deliberate?) blunder, Lemaitre will never be recognized on the same footing as Edwin Hubble for being the discoverer of the expansion of the universe.

So, we are left with the fact that, at some level, Hubble's law should probably be known as Lemaitre's law. History is never as simple as the textbooks make out! At some point I'll write something about the strange case of Olin Eggen and the vanishing Greenwich Observatory documents.

Tuesday, 7 June 2011

Zombies (and Differential Equations)

A couple of years ago, Robert Smith? and collaborators published a paper on numerically modeling a zombie outbreak. The article got lots of press mileage, but I think that an important message did not shine through.

While I am sure that the authors are not getting ready for the coming zombie apocalypse (although others clearly are), the story is about how more realistic hazards, such as diseases, can be computationally modeled as they flow through a population, and this, as we all know, is governed by differential equations.

Why computational? Because (and this is not a fact we really make apparent to our undergraduate students) the vast majority of differential equations do not have analytic solutions, and we need to turn to the computer to model complex interactions.
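As a trivial illustration of what I mean by turning to the computer, here is the simplest numerical recipe there is, Euler's method, applied to a toy equation we can check against the analytic answer (nothing to do with zombies yet, and the numbers are just for illustration):

# step dy/dt = f(t, y) forward in time with Euler's method
def euler(f, y, t0, t1, n):
    dt = (t1 - t0) / n
    t = t0
    for _ in range(n):
        y = y + dt * f(t, y)   # follow the local slope for a small step dt
        t = t + dt
    return y

# dy/dt = -y has the analytic solution y(t) = exp(-t); most systems are not so kind
print(euler(lambda t, y: -y, 1.0, 0.0, 5.0, 10000))   # about 0.0067, close to exp(-5)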

And if I had the opportunity to study zombie outbreaks as an undergraduate, I am sure that learning about differential equations and computational approaches would have been a lot more fun.

Anyway, having watched a few zombie movies in my time, I felt there were a couple of problems with Smith?'s original model for the life-cycle (if that's the word) of zombies, and I coded up some models of my own, but as ever, time got the better of me.

However, others, such as here and here, clearly were thinking along the same lines. So, I'm going to use this blog to go through the model and check out the results. The goal is to see if we can come up with a scenario in which we will get some survivors (although, so far, this is not looking very likely).

The first real post will be coming soon, but for now, here's a picture;

Monday, 6 June 2011

Dark energy is "real"

The "Dark Energy is Real" kerfuffle is continuing with interesting comment over at Universe Today on the issue. As Nerlich points out;

I mean how the heck did ‘dark energy’ ever become shorthand for ‘the universe is expanding with a uniform acceleration’?

I have to agree. It seems that, with the recent Missing Mass Hysteria, there are some issues to do with astronomical press releases, namely that the truth is somewhat stretched to make the story interesting to the media. 

This is, in my opinion, a problem. Some feedback I have received on this is that any press is good press. But I don't think this is necessarily the case. What happens when dark energy is possibly shown not to be real? Or that the missing mass was not discovered by an undergraduate student during a vacation project? 

Perhaps the assumption is that the general public has a short memory, or that telling the complete truth doesn't really matter. Continuing down this road, we can expect this to come back to bite us. We should not forget what happens to "sexed up" stories, and our colleagues in climate science are actively accused of lying in various areas of the press.

My feeling, therefore, is that astronomers have to pull their heads in a little and tell it like it is, perhaps tempering their press releases to reflect the actual science more realistically, rather than pandering to the hyperbolic to get the press interested.

The press, on the other hand, has to realise that science is not all Nobel prize winning results, and that incremental science is still newsworthy. Media coverage that actually reflects the doing of science is not a bad thing (and will potentially reassure those entering science that you don't need to be a Nobel prize winner to make a real contribution - something we all learn in the end).

Closing with Nerlich, his final comment hits the mark;

Not saying it’s impossible, but no way has anyone confirmed that dark energy is real. Our flat universe is expanding with a uniform acceleration. For now, that is the news story.

Sunday, 5 June 2011

Should we even have a "Cosmology" prize?

As part of the discussion on the Gruber Prize being held over at Peter Coles's blog, the question arose of whether a prize in one small part of astrophysics, namely cosmology, is a good idea. Not that I have checked in detail, but I doubt there are similar prizes in symbiotic stars, dwarf galaxies, intra-cluster gas etc. So why cosmology?

A quick squizz at Wikipedia shows that the Gruber Prize in cosmology is one of five international prizes awarded by the Peter and Patricia Gruber Foundation, the others being in Women's Rights, Genetics, Neuroscience and Justice, as well as another prize for Young Scientists. These are all worthy prize areas, and my feeling is that this is effectively private money and they can give it to whomever they want.

Looking a little deeper (i.e. reading Wikipedia a little more) reveals that the cosmology prize is in fact given to "a leading cosmologist, astronomer, astrophysicist or scientific philosopher for theoretical, analytical or conceptual discoveries leading to fundamental advances in the field", which is a broader definition of cosmology than would spring into many astronomers' minds.

In fact, looking at the list of recipients, we can see that the 2010 winner was (the very worthy) Chuck Steidel. Chuck's work focuses upon faint, blue star-forming galaxies at high redshift, which would clearly fall into the astrophysics camp, rather than what a lot of people call cosmology.

Rather embarrassingly, I seem to have missed the announcement of the 2010 prize, have a vague recollection of the 2009 prize to Freedman and Mould, and didn't know that Dick Bond got the 2008 prize. The last I really remember is the Supernovae teams getting it in 2007; perhaps this is because a local was the winner. I should really keep my eyes a little more open.

As George Efstathiou points out on Peter's blog, there are other prizes out there, such as the Kavli Prize for Astrophysics, that have a very broad coverage.

I guess we didn't get into this game to win prizes, and so I don't personally begrudge the Gruber Foundation for setting up a prize for Cosmology. It gives us, and all of astronomy, press coverage and hence visibility in the general population, which can only be a good thing (I have heard grumbles from other groups of physicists about what media mongrels astronomers are).

So good on the Gruber Foundation. I have a few of my own cosmological papers they can have a look at, if they are interested :)

Friday, 3 June 2011

2011 Gruber Prize for Cosmology

The 2011 Gruber Prize for Cosmology has been awarded to the "Gang of Four": Marc Davis, George Efstathiou, Carlos Frenk and Simon White. The award is for "their pioneering use of numerical simulations to model and interpret the large-scale distribution of matter in the Universe".

As Peter Coles points out in his "blog", this group was undertaking simulations of cosmological structure in the mid-1980s, something we do routinely now. The startling thing is that they could only use 32,768 particles. These days, students will not get out of bed for less than a million particles, and often many more. This is because life has become easier, with prepackaged code, especially the wonderful "GADGET" by Volker Springel, and access to supercomputers, which have become ten-a-penny since we learnt how to cluster linux machines together. It's now hard to keep track of all of the high resolution simulations, such as "Millennium" and "Via Lactea", and the torrent of papers they produce.

And as computers get faster, with the advent of GPGPU-based supercomputers, it's only going to get worse (in a good way).

It should be remembered, however, that back in the (good-old) days, you often had to write the simulation code yourself, whereas now such coding skills are in the hands of a relative few. I'll save my grumpiness about the coding skills of many students for another day, and just finish by congratulating the Gang of Four.

Thursday, 2 June 2011

Adventures in the dark side of cosmology

And just to round off this first day of posting, here's a link to my other article in The Conversation, namely "Adventures in the dark side of cosmology".

This was in response to a press release titled "Dark Energy is Real", which recently came out of the WiggleZ survey, and is a bit of a comment on what "real" means in cosmology.

By the way, WiggleZ is not pronounced "Wiggle-Zee" or "Wiggle-Zed"; it is named after an Australian Cultural Icon.

Wednesday, 1 June 2011