Sauron’s singularity: Sucking in light but lighting up the universe

This composite image shows the central region of the spiral galaxy NGC 4151, dubbed the ‘Eye of Sauron’ by astronomers for its similarity to the eye of the malevolent character in ‘The Lord of the Rings’. Image: X-ray: NASA/CXC/CfA/J.Wang et al.; Optical: Isaac Newton Group of Telescopes, La Palma/Jacobus Kapteyn Telescope; Radio: NSF/NRAO/VLA

When heavier stars run out of hydrogen to fuse into helium, the fusion reactions that keep them from imploding under their own gravity become progressively harder to sustain (they must fuse helium into ever heavier elements) and eventually stop. At this stage, such stars blow away their outermost layers of gas and collapse into neutron stars (if the parent star is heavy enough, the neutron star collapses further into a black hole).

The neutron star is an extremely dense, rapidly rotating body composed mostly of neutrons and ridden with powerful magnetic fields. These magnetic fields accelerate particles on and around some neutron stars and eject them in beams from the poles. Because the star is spinning, these beams periodically point toward and away from Earth, making them look like flashing points of light in the night sky.

For this reason, such neutron stars are called pulsars, and pulsars serve as ‘cosmic candlesticks’: relatively fixed points of light that astronomers use to gauge distances in the universe. Pulsars can remain stable for 10-100 million years, making their timekeeping reliable on par with atomic clocks as well.

The keys to their usefulness are the stability and distinctness of their beams. Astronomers would happily use any other natural object the same way if only it emitted radiation that was long-lasting and distinguishable from the rest of the light in the universe. Now, they might have a new kind of candidate, starting with the ‘Eye of Sauron’.

That’s the common name of the galaxy NGC 4151, located, by earlier estimates, about 40 million light-years from Earth. A group of Danish astrophysicists has measured the distance between Earth and the supermassive black hole at the heart of this galaxy by studying how it heats up gas clouds and a ring of dust that surround it.

The clouds are heated as they’re compressed by the black hole’s intense gravitational pull. In the process, they emit ultraviolet radiation. The UV radiation then heats up a ring of dust orbiting the black hole at a large distance, and in turn the ring emits infrared radiation. Effectively, thanks to the thermal cascade, there are two concentric ‘zones’ of radiation around the singularity.

Astronomers from the Niels Bohr Institute at the University of Copenhagen used the twin Keck Telescopes in Hawaii, and this thermal cascade, to measure how far the black hole is from Earth. Darach Watson, an associate professor at the institute, explained,

Using telescopes on Earth, we [measured] the time delay between the ultraviolet light from the black hole and the subsequent infrared radiation emitted from the dust cloud.

Keeping in mind the speed of light, Watson’s team measured the delay to be 30 days, which corresponds to a light-travel distance of about 777 billion km between the cloud of irradiated gas and the ring of dust.
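The delay-to-distance conversion is straightforward light-travel arithmetic; a quick sketch, using only the 30-day delay quoted above:

```python
# Convert the 30-day delay between the UV flare and its infrared echo
# into a light-travel distance: distance = speed of light x travel time.
C_KM_PER_S = 299_792.458   # speed of light in km/s
SECONDS_PER_DAY = 86_400

delay_days = 30
distance_km = C_KM_PER_S * SECONDS_PER_DAY * delay_days
print(f"{distance_km:.3e} km")  # ~7.77e11 km, i.e. about 777 billion km
```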

If this weren’t cool enough, the astronomers then used a technique from 19th-century (a.k.a. high school) optics to measure the distance between the black hole itself and Earth.

The most powerful astronomical telescopes are not built to observe electromagnetic radiation at all wavelengths because their resolution depends on the wavelength being observed. Specifically, a telescope with a fixed aperture resolves finer detail – a smaller angular resolution, which is better – at shorter wavelengths. So each of the 10-meter-wide Keck Telescopes has an angular resolution of 8.75 arc-seconds when observing infrared emissions but 1.6 arc-seconds when observing UV light – an improvement of about 5.5 times.

But what makes Keck much better is a technique called interferometry. The two telescopes are separated by 85 meters, which makes their effective aperture 85 meters across. Computers correct for the difference in the time at which light from the object reaches each telescope and combine the two signals, yielding an image as sharp as one from a single 85-meter-wide telescope.
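The gain from a longer baseline can be estimated with the Rayleigh criterion; a sketch assuming an illustrative observing wavelength of 2.2 microns (the wavelength is an assumption, not a figure from the article):

```python
import math

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    # Rayleigh criterion: theta = 1.22 * lambda / D (radians), in arc-seconds
    return math.degrees(1.22 * wavelength_m / aperture_m) * 3600

WAVELENGTH = 2.2e-6  # assumed: 2.2-micron infrared light

single = diffraction_limit_arcsec(WAVELENGTH, 10)   # one 10-m Keck mirror
paired = diffraction_limit_arcsec(WAVELENGTH, 85)   # 85-m interferometric baseline
print(f"{single / paired:.1f}x sharper")            # 8.5x: the 85/10 baseline ratio
```

Whatever wavelength is chosen, the improvement factor is just the ratio of baseline to single-mirror aperture.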

Related: What is Very Long Baseline Interferometry?

Using interferometry, Watson and his colleagues were able to measure the apparent diameter of the entire dust ring. As a result, they had two handles on the system: the physical distance between the ring and the cloud of gas, and the ring’s apparent width in the sky. All that was left to find the black hole’s distance from Earth was simple trigonometry, and one calculation later, the astronomers had their answer: 62 million light-years.
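The trigonometry rests on the small-angle relation: angular size equals physical size divided by distance. The article doesn’t quote the measured angular size of the ring, so this back-of-envelope instead backs out the value implied by the published numbers (an illustration, not the paper’s actual calculation):

```python
import math

LIGHT_YEAR_M = 9.4607e15
LIGHT_DAY_M = LIGHT_YEAR_M / 365.25

ring_radius_m = 30 * LIGHT_DAY_M   # physical scale from the 30-day light echo
distance_m = 62e6 * LIGHT_YEAR_M   # the team's published distance

# Small-angle trigonometry: angular size ~ physical size / distance
theta_rad = ring_radius_m / distance_m
theta_mas = math.degrees(theta_rad) * 3600 * 1000
print(f"implied angular radius: {theta_mas:.2f} milli-arcseconds")
```

Inverting the same relation – distance = physical size / measured angle – is what gives the 62-million-light-year answer.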

Clouds of gas and rings of dust are common around supermassive black holes, which often reside at the center of large galaxies (the one at the Milky Way’s center is called Sagittarius A*). This means the ‘Eye of Sauron’ needn’t be an uncommon occurrence and could instead join pulsars in holding up candles in space’s dark for astronomers.

And coolness wasn’t the only outcome of the Niels Bohr Institute group’s experiment. Their work brings a long-sought element of precision, missing until now, to measurements of the masses of black holes. As Watson explained again,

The calculations of the mass of the supermassive black holes at the heart of galaxies depends on two main factors: the rotational speed of the stars in the galaxy and how far it is from the black hole to the stars. The rotational speed can be observed and the distance from the black hole out to the rotating disc of stars can now be calculated precisely using the new method.

Watson & co. found that the black hole at the heart of the ‘Eye of Sauron’ was 40% heavier than expected.

So, not just coolness…


… but also awesomeness.

Hardy DNA could mean we’re aliens

A TEXUS mission sounding rocket taking off in March 2011 from Kiruna, Sweden. Image: Adrian Mettauer

A team of European scientists have shown that DNA molecules can withstand the rough temperatures and pressures that rockets experience when they reenter Earth’s atmosphere from space. Their finding is important from the perspective of meteorites and other space rocks that crash on Earth. Many scientists think such objects could once have seeded our planet with the first molecules of life, billions of years ago.

The scientists attached bits of plasmid DNA – small DNA molecules physically separate from chromosomal DNA in biological cells and capable of replicating independently – to 15 different parts of the outer shell of a TEXUS mission sounding rocket (powered by the Brazilian VSB-30 motor). On March 29, 2011, the rocket took off from the European Space and Sounding Rocket Range near Kiruna, Sweden, for a suborbital flight that exposed the DNA to the vacuum and low temperatures of space before shooting back toward Earth, exposing the samples to friction against the atmosphere.

The entire flight lasted 780 seconds and reached a height of 268 km. The acceleration maxed out at 13.5 g on the way up and 17.6 g on the way down. Outside Earth’s atmosphere, the rocket and samples also experienced about 378 seconds of microgravity. The maximum temperature during atmospheric reentry was just below 130 degrees Celsius on the surface of the rocket; the gases in the air around the samples attached to the sides of the rocket could have reached 1,000 degrees Celsius.

A schematic showing the design of the TEXUS-49 payload and the various positions at which the DNA samples were attached. For full caption, see footnote. Image: Screenshot from paper

Promising results

In all, a maximum of 53% of the DNA could be recovered intact and 35% was fully biologically functional. Analysis also showed that “DNA applied to the bottom side of the payload had the highest degree of integrity followed by the samples applied in the grooves of the screw heads”, according to the study paper. It was published in PLOS ONE on November 26.

The ability of the recovered DNA to function biologically was then quantified by observing how many bacterial colonies each of the 15 samples could engender per nanogram. Transformation efficiency of 100% was set at 1,955 colonies per nanogram – what an unexposed bit of plasmid DNA achieves.

Curiously, sample #1, which was attached on the side of the rocket where there was minimal shielding, especially during atmospheric reentry, still yielded 69 colonies per nanogram. The highest density of colonies was for sample #10, which was attached in the grooves of screw heads on the rocket: 1,368 per nanogram.
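Those counts translate into relative transformation efficiencies by a simple normalization against the 1,955 colonies/nanogram reference:

```python
# Normalize colony counts against untreated plasmid DNA,
# whose 1,955 colonies per nanogram defines 100% efficiency.
REFERENCE = 1955

sample_1 = 69     # exposed side of the rocket, minimal shielding
sample_10 = 1368  # tucked into a screw-head groove

eff_1 = 100 * sample_1 / REFERENCE
eff_10 = 100 * sample_10 / REFERENCE
print(f"sample #1: {eff_1:.1f}%, sample #10: {eff_10:.1f}%")  # ~3.5% and ~70%
```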

“We were totally surprised,” said Cora Thiel and Oliver Ullrich, coauthors of the study and biologists at the University of Zurich, in a statement. “Originally, we designed this experiment as a technology test for biomarker stability during spaceflight and reentry. We never expected to recover so many intact and functional active DNA.”

Last molecule standing

It’s clear that the damage inflicted on the DNA samples by the harsh conditions of acceleration, microgravity, temperature fluctuations, solar radiation and cosmic rays was not enough to stop the molecules from retaining their biological functions. In fact, this study raises the known limits of the survivability of life: it may not be as fragile as we like to think it is.

Scientists have long known temperature to be the most effective destroyer of DNA double strands. Past studies have shown that the molecules can’t withstand more than 95 degrees Celsius for more than five minutes without denaturing. During the TEXUS-49 mission, bacterial plasmid DNA temporarily withstood up to 130 degrees Celsius, maybe more.

By extension, it is not inconceivable that a fragment of a comet could have afforded any organic molecules on-board the same kind of physical shielding that a TEXUS-49 sounding rocket did. Studies dating from the mid-1970s have also shown that adding magnesium chloride or potassium chloride to the DNA further enhances its ability to withstand high temperatures without breaking down.

How big a hurdle is that out of the way? Pretty big. If DNA can put itself through as much torture and live to tell the tale, there’s no need for it to have been confined to Earth, trapped under the blanket of its atmosphere. In fact, in 2013, scientists from the Indian Centre for Space Physics were able to show, through computer simulations, that biomolecules like DNA bases and amino acids are capable of being cooked up in the interstellar medium – the space between stars – where they could latch on to passing comets or asteroids and hitch a ride into the Solar System.

According to the study, published in New Astronomy in April 2013, cosmic rays from stars can heat up particles in the interstellar medium and promote the formation of so-called precursor molecules – such as methyl isocyanate, cyanamide and cyanocarbene – which then go on to form amino acids. The only conditions the team presupposed were a particle density of 10,000-100,000 per cubic centimeter and an ambient temperature of 10 kelvin, from which they estimated that about 1 gram of amino acids could be present in every 10¹⁴ kg of matter.
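For scale, the abundance the study implies works out to a mass fraction of one part in a hundred quadrillion:

```python
# The study's estimate: ~1 gram of amino acids per 1e14 kg of
# interstellar matter, i.e. a mass fraction of one part in 1e17.
amino_mass_kg = 1e-3
host_mass_kg = 1e14

fraction = amino_mass_kg / host_mass_kg
print(fraction)  # 1e-17: tiny, yet non-zero across vast volumes of gas
```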

Compared to the mass density of the observable universe (9.9 × 10⁻²⁷ kg/m³), that predicted density of amino acids, if true, is quite high. So, the question arises: Could we be aliens?

The first experiments

The first studies to entertain this possibility and send hapless living things to space and back began as far back as 1966, in the early days of the Space Age, alongside the Gemini IX and XII missions. Prominent missions since then include the Spacelab 1 launch (1983), the Foton 9, 11 and 12 rockets (1994-1999), the Foton M2 and M3 missions (2005-2007) and ISS EXPOSE-R mission (2009-2011). The Foton launches hosted the STONE and BIOPAN missions, which investigated if microbial lifeforms such as bacteria and fungi could survive conditions in space, such as a low temperature, solar radiation and microgravity.

Through most of these missions, scientists found that the damage to lifeforms often extended down to the DNA level. Now, we’re one step closer to understanding exactly what kind of damage is inflicted, and whether there are simple ways to fend it off, such as the addition of salts.

The STONE-5 mission (2005) was particularly interesting because it also tested how rocks would behave during atmospheric reentry, as a proxy for meteorites. The surface of one rock was found to reach temperatures of more than 1,800 degrees Celsius. However, mission scientists concluded that if the rock layer had been thick enough to provide insulation (at least more than the 5 mm used during that test, or the 2 cm used during STONE-6), the innards could survive.

Fragment of the Murchison meteorite (at right) and isolated individual particles (shown in the test tube). Image: Wikimedia Commons

In the same vein, the ultimate experiments – though not performed by humans – could have been the Murchison meteorite, which crashed near a town of the same name in Australia in 1969, and Black Beauty, a rock of Martian origin that splintered over the Sahara a thousand years ago. The Murchison meteorite was found to contain more than 70 different amino acids, only 19 of which are found on Earth. Black Beauty was found to be 4.4 billion years old and made of sediments, signalling that a young Mars did have water.

Their arrivals’ prime contribution to humankind was that they turned our eyes skyward in the search of our origins. The experiments conducted with the TEXUS-49 mission keep them there.

Full caption for second image: (a) Scheme of the TEXUS-49 payload with DNA sample 1-12 application sites. (b) Plasmid DNA samples 1-12 were applied on the outside of the TEM (TEXUS Experiment Module) EML 4. (c-I) DNA samples 1-4 were applied circularly at 0, 90, 180 and 270 degrees directly on the surface of the payload; DNA samples 5-12 were also applied, 90 degrees apart, in the screw heads of the payload. (c-II) DNA samples 13-15 were applied directly on the payload surface at the bottom side. (d) DNA samples 1-4 were pipetted directly on the surface and locations were marked with a pen. (e) DNA samples 5-12 were applied in the grooves of the screw heads. (f) DNA samples 13-15 were applied directly on the payload surface on the bottom side and locations were marked with a pen.

How Venus could harbor life: supercritical carbon dioxide

The dark spot of Venus crossed our parent star in 2012. Pictured above during the occultation, the Sun was imaged in three colors of ultraviolet light by the Earth-orbiting Solar Dynamics Observatory. Image: NASA/SDO & the AIA, EVE, and HMI teams

A new study published in the online journal Life says a hotter, pressurized form of carbon dioxide could harbor life much the way water does on Earth. This is an interesting find, theoretical though it is, because it might obviate the need for water for life to exist on other planets. In fact, of the more than 2,700 known exoplanet candidates, more than 2,000 are massive enough to have such carbon dioxide present on their surfaces.

At about 305 kelvin and 73 times Earth’s atmospheric pressure, carbon dioxide becomes supercritical: a form of matter that exhibits the physical properties of both liquids and gases. In this state its properties are very different from those of the common gas – in the same way highly pressurized water is acidic but normal water isn’t. Supercritical carbon dioxide is often used as a sterilization agent because it can deactivate microorganisms quickly at low temperatures.
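The supercritical threshold can be expressed as a simple check against CO2’s critical point – textbook values of roughly 304.1 K and 72.8 atm; the helper function here is illustrative, not from the study:

```python
# CO2's critical point (standard textbook values, not from the study):
T_CRIT_K = 304.13      # ~31 degrees Celsius
P_CRIT_ATM = 72.8      # ~7.38 MPa

def is_supercritical(temp_k, pressure_atm):
    # Beyond both critical values there is no liquid/gas distinction
    return temp_k > T_CRIT_K and pressure_atm > P_CRIT_ATM

print(is_supercritical(305, 73))  # the conditions quoted above -> True
print(is_supercritical(288, 1))   # Earth surface conditions -> False
```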

As the study’s authors found, some enzymes were more stable in supercritical carbon dioxide because it contains no water. This anhydrous property also enables a “molecular memory” in the enzymes, whereby they ‘remember’ their acidity from previous reactions and use it to guide the construction of organic molecules more easily. Moreover, as stated in the paper,

… the surface tension in carbon dioxide is much lower than that of water, whereas the diffusivity of solutes in scCO2 is markedly higher [because of lower viscosity]. Thus, scCO2 can much easier penetrate [cell membranes] than subcritical fluids can.

The easiest way – no matter that it’s still difficult – to check if life could exist in supercritical carbon dioxide naturally is to check the oceans at about a kilometer’s depth, where pressures are sufficient to sustain pockets of supercritical fluids. As the authors write in their paper, supercritical carbon dioxide is less dense than water, so it could be trapped under rocky formations, which in turn could be probed for signs of life.

A similarly accessible place to investigate would be at shallow depths below the surface of Venus. Carbon dioxide is abundant on Venus and the planet has the hottest surface in the Solar System. Its subsurface pressures could then harbor supercritical carbon dioxide. Dirk Schulze-Makuch, a coauthor of the paper and an astrobiologist at Washington State University, notes,

An interesting twist is that Venus was located in the habitable zone of our Solar System in its early history. [He and his coworkers] suggested the presence of an early biosphere on the surface of this planet, before a run-away greenhouse effect made all life near the Venusian surface all but impossible.

The probability that Venus could once have harbored life is as strange as it is fascinating. In fact, if further studies indicate that supercritical carbon dioxide can play the role of a viable bio-organic solvent, the implications will stretch to anywhere a super-Earth or gas giant is found. Because its reactions with complex organic molecules such as amines will not be the same as water’s, the life-forms supercritical carbon dioxide could harbor will be different – perhaps more primitive and/or shorter-lived. We don’t know yet.

This study continues a persistent trend among astrobiologists since the 1980s to imagine, and then rationalize, if and how life could take root in environments considered extreme on Earth. After the NASA Kepler space telescope launched in 2009 and, in only four years of observation, yielded almost 4,100 exoplanet candidates (more than a thousand confirmed as of now), astrobiologists began to acquire a better picture of the natural laboratories their hypotheses had at their disposal, as well as which hypotheses seemed more viable.

In August this year, Schulze-Makuch himself had another paper, in Science, that discussed how a lake of asphalt in Trinidad harbored life despite a very low water content (13.5%), and what this said about the possibilities of life on Saturn’s moon Titan, which exhibits a similar chemistry on its surface. The Science paper had cited another study from 2004. Titled ‘Is there a common chemical model for life in the universe?’, it contained a pertinent paragraph about why the search for alien life is important as well as likely endless:

The universe of chemical possibilities is huge. For example, the number of different proteins 100 amino acids long, built from combinations of the natural 20 amino acids, is larger than the number of atoms in the cosmos. Life on Earth certainly did not have time to sample all possible sequences to find the best. What exists in modern Terran [i.e. Earth-bound] life must therefore reflect some contingencies, chance events in history that led to one choice over another, whether or not the choice was optimal.


SpaceShipTwo crash brings down Richard Branson with it

Virgin Galactic’s commercial spaceflight program was pushed back by more than a year after its SpaceShipTwo rocket-plane broke apart mid-air during a test flight over the California desert on October 31, killing one of its two pilots. Before the incident, Virgin Galactic had planned to start operating suborbital flights as soon as 2015. The National Transportation Safety Board (NTSB) later said its investigation into the details of the accident would take until 2016 to conclude.

That date may slip further as far as Virgin Galactic is concerned, because new details have emerged that SpaceShipTwo had been plagued by numerous technical issues – which almost no one outside its engineering and contracting teams knew about – even before the test flight. The Wall Street Journal reports,

Engineers and subcontractors working on SpaceShipTwo spent years wrestling with difficulties, ranging from inadequate rocket-motor thrust to problems in the flight-control system to structural deficiencies affecting the wings of the rocket’s carrier plane.

… Fixes were devised, flight tests were delayed and the result, [Virgin Galactic employees] said, was that some important elements of the project remained in flux for several years. It isn’t unusual for complex vehicles such as spacecraft and airliners to face repeated pitfalls and delays during development. Yet throughout the process, Virgin Galactic founder Richard Branson repeatedly announced timetables that were more aggressive than technical advances warranted, the people said.

Peter Siebold, the surviving pilot, told NTSB officials while in hospital that Mike Alsbury, who was killed, had prematurely unlocked the feathering mechanism and that he (Siebold) didn’t know about it. This violated the protocol that such unlocking be announced verbally, although it remains to be seen if Siebold simply didn’t hear it. SpaceShipTwo has two ‘feathers’, one on each wing: hinged booms that can be rotated perpendicular to the wings to provide extra drag during atmospheric reentry (obviating the need for heat shields).

SpaceShipTwo schematic showing the raised feathers. Image: Virgin Galactic

Related: Feathering malfunction, not hyped motor, suspected in SpaceShipTwo crash

It is hard to see why Branson would make such vaunted promises except for the money Virgin Galactic stood to make as one of the first commercial spaceflight operators. Tickets were going to be priced at $250,000 apiece, with each flight ferrying six passengers and two crew. In fact, soon after the October 31 accident, the company’s CEO George Whitesides said in an interview with the Financial Times that another SpaceShipTwo model being built in New Mexico was 65% complete. Virgin Galactic would then have had at least two vehicles to operate in 2015 (it had announced in 2013 that it would eventually operate five). However, it is unknown whether the second model is as problem-prone as the first one was.

Branson’s impatience to get started on a profitable note is brought out by another example. Earlier, media concerns had focused on the new hybrid engine that SpaceShipTwo used in the ill-fated test flight. Designed by the Sierra Nevada Corporation, it used solid HTPB fuel – a plastic – and liquid nitrous oxide as the oxidizer. A previous version of the engine using a different fuel couldn’t provide enough bang to power the 60-foot vehicle, and Sierra Nevada had asked Branson to reduce the passenger limit. According to the WSJ, however, Branson declined, saying Virgin Galactic wouldn’t make money unless it flew six people per flight, prompting the switch to the plastic fuel.

The NTSB investigation in the week following the accident managed to recover all the parts of the disintegrated SpaceShipTwo, which had broken up at an altitude of 50,000 feet (15.2 km). The lack of commensurate burn marks exonerated the engine, but the fragmentation of components meant the investigation would take at least a year to complete. Moreover, with news emerging that Virgin Galactic’s business practices could have severely conflicted with technical needs during development, Richard Branson’s dreams of commercial spaceflight even by 2016 could be pixie dust.

ALMA telescope catches live planet-forming action for the first time

The ALMA telescope in Chile has, for the first time, observed a star system that might be in the early stages of planet formation. The picture has astronomers drooling over it because the study of the origins of planets has until now been limited to simulated computer models and observations of planets made after they formed.

ALMA image of the protoplanetary disc around HL Tauri. Image: ALMA (ESO/NAOJ/NRAO)

According to a statement put out by the European Southern Observatory (ESO), the observation was among the first made during ALMA’s ‘Long Baseline Campaign’, which began in September 2014 (ESO is the institution through which European countries fund the telescope). ALMA uses interferometry over long baselines to achieve high resolutions that let it observe objects hundreds of light-years away in fine detail. It makes these observations in the millimeter/sub-millimeter range of wavelengths; hence its name: the Atacama Large Millimeter/sub-millimeter Array.

The image shows a disc of gas, dust and other debris orbiting the star HL Tauri, located about 450 light-years from Earth. A system like this starts out as a large cloud of gas and dust. At some point, the cloud collapses under its own gravity and starts to form a star, which accrues further matter from the cloud and grows. Over millions of years, the remaining matter settles into a disc around the young star.

In the disc, the gas and dust continue to clump, this time into rocky lumps like planets and asteroids. This is why the disc is called a protoplanetary disc. As a planet forms and its gravitational pull gets stronger, it starts to clear a space in the disc by either accreting the surrounding matter or flinging it out. The gaps that form as a result are good indicators of planet formation.

According to the ESO statement, “HL Tauri’s disc appears much more developed than would be expected from the age of the system [less than 100,000 years]. Thus, the ALMA image also suggests that the planet-formation process may be faster than previously thought.”

An annotated image showing the protoplanetary disc surrounding the young star HL Tauri.

An annotated image showing the protoplanetary disc surrounding the young star HL Tauri. Image: ALMA (ESO/NAOJ/NRAO)

In the Solar System, similar gaps exist called Kirkwood gaps. They represent matter cleared by Jupiter, whose prodigious gravitational pull has been pushing and pulling the orbits of asteroids around the Sun into certain locations. In fact, Jupiter’s movement within the Solar System – first moving away, then toward, and then away once more from the Sun – has been used to explain why the material composition of some asteroids between Mars and Jupiter is similar to those of Kuiper Belt objects situated beyond the present orbit of Neptune. Jupiter’s migration mixed them up.

Similarly, the gaps forming around HL Tauri, though they may harbor planetesimals, may not result in planets in the exact same orbits: the bodies could move around under subsequent gravitational disruptions. They could acquire unexpectedly eccentric orbits if their star system strays too close to another, as was found in the nearby binary star system HK Tauri in July 2014. Or some gaps could be being emptied by the gravitational effects of an object in another gap.

However, astronomers think the presence of multiple gaps is likely evidence of planet formation more than anything else.

At the same time, the resolution of the image is 7 AU (a little more than Jupiter’s distance from the Sun), which means the gaps that show up are very large and represent stronger gravitational effects.
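For reference, 7 AU of linear detail at HL Tauri’s distance corresponds to an angular resolution of a few hundredths of an arc-second; a quick conversion:

```python
import math

AU_M = 1.496e11
LIGHT_YEAR_M = 9.4607e15

# 7 AU of linear detail seen from 450 light-years away:
# angular size ~ linear size / distance (small-angle approximation)
theta_rad = (7 * AU_M) / (450 * LIGHT_YEAR_M)
theta_mas = math.degrees(theta_rad) * 3600 * 1000  # milli-arcseconds
print(f"{theta_mas:.0f} milli-arcseconds")         # roughly 50 mas
```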

Astronomers will use this and other details as they continue their investigation into the HL Tauri system and how planets – at least planets in this system – form. The Long Baseline Campaign, which corresponds to the long-baseline configuration of the ALMA telescope that enabled this observation, will continue into December.

Fabiola Gianotti, the first woman Director-General of CERN

The CERN Council has elected a new Director-General to succeed the incumbent Rolf-Dieter Heuer. Fabiola Gianotti, who served as the ATLAS collaboration’s spokesperson from 2009 to 2013 – a period that included the discovery of the long-sought Higgs boson by the ATLAS and CMS experiments – will be the first woman to hold the position. Her mandate begins in January 2016.

A CERN press release announcing the appointment said the “Council converged rapidly in favor of Dr. Gianotti”, implying it was a quick and unanimous decision.

The Large Hadron Collider (LHC), the mammoth particle smasher that produces the collisions that ATLAS, CMS and two other similar collaborations study, is set to restart in January 2015 after a series of upgrades to increase its energy and luminosity. Dr. Gianotti’s term will thus coincide with a distinct phase of science, one eager for evidence to help answer deeper questions in particle physics – about the Higgs boson’s mass, the strong force’s strength and dark matter.

Dr. Gianotti will succeed 15 men who, as Directors-General, have been responsible not simply for coordinating the scientific efforts stemming from CERN but also for guiding research priorities and practices. They have effectively set the various agendas that the world’s preeminent nuclear physics lab has chosen to pursue since its establishment in 1954.

In fact, the title of ‘spokesperson’, which Dr. Gianotti held for the ATLAS collaboration for four years until 2013, is itself deceptively uncomplicated. The spokesperson not only speaks for the collaboration but is also the effective project manager who plays an important role when decisions are made about what measurements to focus on and what questions to answer. When the discovery of a Higgs-boson-like particle was announced on July 4, 2012, results from the ATLAS particle detector – and therefore Dr. Gianotti’s affable leadership – were instrumental in getting that far, and in getting Peter Higgs and Francois Englert their 2013 Nobel Prize in physics.

Earlier this year, she likened her job to “a great scientific adventure” but “also a great human adventure” in an interview with CNN. To guide the aspirations and creativity of 3,000 engineers and physicists without attenuation1 of productivity or will must indeed have been so.

That she will be the first woman to become the DG of CERN can’t escape attention either, especially at a time when women’s participation in STEM research seems to be on the decline and sexism in science is being recognized as a prevalent issue. Dr. Gianotti will no doubt make a strong role model for a field that is only 25% women. There will also be much to learn from her past, from the time she chose to become a physicist after learning how Albert Einstein used the idea of quanta to explain the photoelectric effect. She joined CERN in 1987, at age 25, while working toward her PhD from the University of Milan; the W and Z bosons had recently been discovered at the facility’s UA1 and UA2 experiments. Dr. Gianotti would join the latter.

It was an exciting, as well as exacting, time to be a physicist. Planning for the LHC would begin that decade, launching one of the world’s largest scientific collaborations with it. Success as a scientist would come to demand not just research excellence but also a flair for public relations, bureaucratic diplomacy and the acuity necessary to manage billions in public funds from different countries. Dr. Gianotti would go on to wear all these hats even as she started work on calorimetry for the LHC in 1990, on the ATLAS detector in 1992, and on the search for supersymmetric (‘new physics’) particles in 1996.

Her admiration for the humanities has been known to play its part in shaping her thoughts about the universe at its most granular. She has a professional music diploma from the Milan Conservatory and often unwinds at the end of a long day with a session on the piano. Her fifth-floor home in Geneva sometimes affords her a view of Mont Blanc, and she often enjoys long walks in the mountains. In an interview given to the Financial Times in 2013, she said,

There are many links between physics and art. For me, physics and nature have very nice foundations from an aesthetic point of view, and at the same time art is based on physics and mathematical principle. If you build a nice building, you have to build it with some criteria because otherwise it collapses.2

Her success in leading the ATLAS collaboration, and in becoming the veritable face of the hunt for the Higgs boson, has catapulted her to being the next DG of CERN. At the same time, it must feel reassuring3 that as physicists embark on a new era of research that requires just as much ingenuity in formulating new ideas as in testing them, an era “where logic based on past theories does not guide us”4, Fabiola Gianotti’s research excellence, administrative astuteness and creative intuition are now there to guide them.

Good luck, Dr. Gianotti!

1Recommended read: Who really found the Higgs boson? The real genius in the Nobel Prize-winning discovery is not who you think it is. Nautilus, Issue 18.

2I must mention that it’s weird that someone with such strong aesthetic foundations used Comic Sans MS as the font of choice for her presentation at the 2012 CERN seminar announcing the discovery of a Higgs-like boson. It was probably the beginning of Comic Sans’s comeback.

3Though I am no physicist.

4In the words of Academy Award-winning film editor Walter S. Murch.

Featured image credit: Claudia Marcelloni/CERN

Feathering malfunction, not hyped motor, suspected in SpaceShipTwo crash

On October 31, a manned suborbital test vehicle broke up mid-air and crashed into the California desert after a 50,000-foot plunge. The co-pilot was killed and the pilot was seriously injured. The vehicle was SpaceShipTwo (SS2), owned by British businessman Richard Branson’s Virgin Galactic enterprise, which wants to debut commercial spaceflight in 2015. With SS2’s crash, that’s not going to happen before 2016; the investigation of the crash alone is expected to take a year.

In a press conference on November 3 (IST), the National Transportation Safety Board (NTSB), an autonomous federal agency in the United States tasked with investigating civil aviation accidents, revealed more information about SS2’s behavior before it tumbled to the ground. The agency didn’t blame any specific incident or action, addressing only some anomalous events that could have contributed to the tragedy. Specifically, the hyped ‘feathering’ mechanism appeared to have been deployed abnormally.

On the SS2, a portion of the vehicle’s wings is capable of folding upward until almost perpendicular to the fuselage (see image below; more here). This is called feathering: it controls the pitch, roll and yaw motions of the vehicle and is used to achieve aerodynamic stability during reentry. It is feasible only when reentry happens at speeds much lower than orbital velocity (around 25,000 km/hr), which is why the now-retired NASA Space Shuttle couldn’t use feathering to stabilize itself.

SpaceShipTwo schematic.


Feathering increases the aerodynamic drag generated by the vehicle while ensuring its surface doesn’t heat up excessively, eliminating the need for heat shields.

According to the NTSB, the engine first fired for nine seconds on October 31. Then, a pilot prematurely unlocked the feathering mechanism (at Mach 1 instead of at Mach 1.4). He didn’t push the lever that moves the feathers to their perpendicular position – but the feathers moved anyway. This caused SS2 to brake suddenly, inducing significant structural loads in its frame and leading to the mid-air disintegration.

Before the presser, there was speculation that SS2’s motor, the predictably named RocketMotorTwo, might have been responsible for the accident. It is derived from an older variant, built by the Sierra Nevada Corporation, that was first tested in April 2013. On the day of the crash, SS2 was debuting a new version of the motor that burned a polyamide plastic fuel. The NTSB said that, after ascertaining they’d found all the parts of the crashed vehicle, they found no burn marks that might’ve implicated RocketMotorTwo.

According to an article in the Financial Times on November 2,

A second spacecraft under construction for the last three years in New Mexico is “65 per cent complete”, Mr [George] Whitesides [CEO, Virgin Galactic] said, adding that it could be ready to fly next year, once the cause of last week’s accident has been resolved. “The second spaceship is getting close to readiness,” he said.

All eyes on the investigation now…