Connecting animals to the cloud could help predict earthquakes

Did you feel that? Brian Collins/USFWS/flickr, CC BY

The recent earthquake in Nepal demonstrated yet again how difficult it is to reliably predict natural disasters. While we have a good knowledge of the various earthquake zones on the planet, we have no way of knowing exactly when a big quake like the 7.8-magnitude event in Nepal will happen.

But we know that many animals seem able to sense the onset of such events. We could use powerful computers to monitor herds of animals and make use of their natural instincts to provide forewarning of natural disasters.

Immediately before an earthquake, herds of animals often start to behave strangely – for example suddenly leaving their homes to seek shelter. This could be because they detect small, fast-travelling waves or because they sense chemical changes in ground water from an impending earthquake.

Although there are possibilities here, we certainly need more studies – it’s difficult to find statistically significant links between unusual animal behaviour and impending disasters, because natural disasters occur relatively rarely and it’s hard to reliably interpret animal behaviour after the fact. Indeed, this uncertainty was cited by the Chinese government after reports that zoo animals behaved strangely before the Wenchuan earthquake a few years ago.

Animal whispering software is needed

There are areas where we know beyond doubt that animals have accurate detection abilities, for example the way dogs can spot signs of cancer that we otherwise have difficulty recognising. We also know that by giving them animal-centred interfaces we can provide them with the means to express what they detect, for example by hitting the right buttons according to their judgement.

This is an example of providing animals with accessible technology that supports their natural behaviour, while also translating their behaviour into something we can understand.

Of course, a key difference between a dog detecting cancer and a flock of birds responding to the early signs of an imminent quake lies in the numbers involved. We would expect an upcoming earthquake to affect many individuals at the same time, which would amplify the effect.

Collecting data in large quantities – while at the same time being able to recognise and filter background noise – requires efficient and elastic cloud computation. However, we already have technology that can do this, something we’ve previously suggested could be used to track the course of large numbers of aircraft.

Don’t you dare put a microchip on me! Dave Huth, CC BY

So the bigger question is how to record data from large groups of animals, capitalising on advances in the Internet of Things, without affecting the welfare of the animals and without interfering with their natural behaviour.

Research has shown that putting sensors such as biotelemetric devices on animals can have seriously detrimental effects on their welfare, change their behaviour and, by doing so, invalidate whatever data is collected. Of course, trying to fit sensors to large numbers of animals for generation after generation would be highly impractical.

A better option would be to monitor changes in the animals' behaviour around their habitats via ambient sensors such as motion detectors. The data could be used to automatically detect any deviation from normal behavioural patterns.
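
To make this concrete, here is a minimal sketch of how such deviation detection might work. It assumes hourly activity counts from a single motion detector; the data, threshold and function names are hypothetical illustrations, not part of any deployed system.

```python
import numpy as np

def is_anomalous(history, current, z_threshold=3.0):
    """Flag a reading that sits far outside the herd's normal
    activity range, using a simple z-score test."""
    mean, std = np.mean(history), np.std(history)
    if std == 0:
        return False  # no variation recorded yet; cannot judge
    return abs(current - mean) / std > z_threshold

# Hypothetical hourly motion-detector counts for one habitat zone
normal_activity = [42, 38, 45, 40, 44, 39, 41, 43]
print(is_anomalous(normal_activity, 41))   # False: nothing unusual
print(is_anomalous(normal_activity, 120))  # True: sudden mass movement
```

In practice the baseline would have to account for daily and seasonal rhythms, but the principle is the same: flag readings that fall far outside the herd’s normal range.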

Herdsourcing

The “wisdom of crowds” has been put to use through the practice of crowdsourcing, where the internet is used to bring together a large, diverse range of users to undertake a certain task – for example, analysing Wikipedia documents, conducting citizen science projects, or raising money through crowdfunding.

This is exactly the kind of concept we need to extend to animals in order to watch for collective changes in their behaviour. The technology of cloud computing, which can elastically scale to the amount of computation needed for such a project, is already commercially available.

The groundwork for the kind of system we need has been carried out as part of an ongoing security research programme. This project designs cloud-based software systems to recognise and adapt to changes that may have safety and security consequences.

Applied to the task of monitoring collective animal behaviour, the system could use sensors to detect big groups of animals in specific areas, monitor the speed and shape of their movement, or detect variations in their calls or cries. Of course, a major consideration would be to ensure the data is secure, so that it couldn’t be used to cause the animals harm, for example through poaching.

We could apply approaches typically used for human-computer interfaces to animals. Designing such interfaces for animals might shed light on how to predict earthquakes – and it could also show that there are plenty of other things we can learn from animals, if only we can work out how to ask.


Russian spacecraft falling to Earth poses no danger – we have survived bigger objects

The relatively light spacecraft that is now spinning out of control. Roscosmos Press Service/EPA

A Russian spacecraft is spinning uncontrollably around Earth after breaking down while travelling to the International Space Station with food and fuel supplies. The vessel, Progress M-27M, will burn up when re-entering the Earth’s atmosphere in a week or two, although there is a small chance that parts of it could crash down on the planet. The risk to us humans is minute – we have survived far greater objects falling back towards Earth before.

The six people currently at the ISS are not in any great danger of running out of food either, as the next supply vessel is scheduled to dock at the ISS on June 19 – long before the station’s current food supply is due to run out in September.

The problem with Progress

Progress M-27M launched on April 28 but soon developed some serious problems. It was not inserted into the correct orbit and ended up in what was described as an “emergency state” when detaching from the rocket used to bring it into orbit. Ground controllers have only been able to establish limited communication with, and control of, the vessel.

Progress breaks down and starts tumbling. NASA.

At the moment the module is in a low, safe orbit that does not endanger any other satellites or the ISS. The module was designed to burn up in the upper atmosphere after supplying the ISS and being filled with rubbish from the station.

As long as the engineers in Russia manage to control its path there should be no danger of the vessel reaching Earth’s surface. The problem is that, with very limited control of the vessel at the moment, there is a slim chance that some parts of the craft will not fully burn up. Because the majority of our planet is covered by oceans and sparsely populated areas, the probability of parts actually hitting people is tiny. However, the risk cannot be reliably calculated at the moment since the trajectory and time of re-entry are not yet clear. But if the conditions are the same as they were for NASA’s Upper Atmosphere Research Satellite, which fell back to Earth in September 2011, the probability of the debris hitting any human would be about one in 3,200.
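
To put that figure in perspective: one in 3,200 is the probability of the debris hitting anyone at all. If, as a crude simplification, that risk were spread evenly across the world’s population, the odds for any particular individual would be vastly smaller. A rough back-of-envelope calculation (the population figure is approximate):

```python
# Rough scale of the individual risk, using the UARS-style figure above.
p_any_person_hit = 1 / 3200      # chance the debris strikes anyone at all
world_population = 7e9           # approximate 2015 world population
p_individual = p_any_person_hit / world_population
print(f"Odds for any given person: about 1 in {1 / p_individual:,.0f}")
# -> about 1 in 22,400,000,000,000 (roughly 22 trillion)
```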

A more pressing matter might be that the six-person ISS crew could run out of food. However, not only is the next supply vessel, SpaceX’s Dragon, scheduled to dock at the ISS on June 19, there will also be a larger Japanese supply vessel arriving in mid-August. In the worst-case scenario of no supplies reaching the station, the ISS astronauts would run out of food by September 5, with a system of rationing in place from July 24. Water would run out by mid-September. Handily, the ISS has a permanently docked module that would allow the crew to return safely to Earth. So overall the crew only has to put up with a delay in the delivery of equipment and supplies.

Lessons from the past

Looking back into history, there have been far greater objects re-entering Earth’s atmosphere and we have survived such events without tidal waves or nuclear winters. The 150-tonne Mir space station burnt up in 2001. The result? A few tourists on the remote Fiji islands captured some nice holiday snaps of fragments and heard sonic booms. The 77-tonne Skylab re-entered in 1979 over the Indian Ocean without much ado. The Progress module weighs in at only around ten tonnes and none of its material is classed as dangerous, so problems like those caused by Cosmos 954, which scattered radioactive debris over Canada in 1978, are not expected either.

We survived Mir crashing – and it was way bigger! NASA

The main lesson learnt from this incident is that when it comes to space flight, even unmanned supply missions, you always have to plan for the unexpected. The next step now would be a detailed analysis of why Progress failed, especially since the rocket and module are similar to those used for Russian manned missions. If it takes too long to establish the cause, the next scheduled Russian manned mission on May 26 might be delayed.

The problem also shows why international collaboration is essential for ensuring the safety of the ISS crew and the successful performance of a space station – and why these ties should be strengthened for future space ventures.


Making ISPs enforce age checks for porn puts responsibility where it might actually count

Too hot to handle, for children. hot stuff by Smolina Marianna/shutterstock.com

David Cameron’s desire to see ISPs tackle the availability of pornography to children has prompted some to ask at what point restrictions become an infringement of liberty.

The Conservative Party introduced legislation in 2014 that compelled UK ISPs to provide parental controls, presenting the bill-payer with a choice to block or allow pornographic content. Major ISPs such as BT, Virgin, TalkTalk and Sky have introduced these features, although according to Ofcom figures few households take up the option to use them.

The culture secretary, Sajid Javid, has declared that, if re-elected, the Conservative Party would take steps to regulate overseas ISPs as well as those based in the UK, requiring strong age-restriction controls based on credit card checks or on sharing passport or driving licence details through special software.

There seem to be three objections to this stronger approach to age verification. First, that responsibility for preventing children’s access to pornography lies with parents, not the state. Second, that using such identification would create a direct link between an identifiable person and their viewing habits. And third, that the requirement for identification would be likely to affect the industry: for example, free-to-view sites would suffer if viewers are put off by the need for credit card identification more than pay sites (which require a credit card anyway).

There is also the fact that most, if not all, restrictions can be circumvented by anyone with sufficient technological knowledge – which often includes children. This is held up as a reason not to take steps that would erode our freedoms and privacy without even being effective.

I’d argue that the relationship between new technologies, sexuality and human rights is a serious issue for advanced liberal democracies. The balancing act required between rights and responsibilities needs our collective attention. The question I am concerned with is: whose sexual freedoms and rights does the neo-liberal approach embodied by these objections set out to advance?

Social control vs sexual freedom

Perhaps the major ethical issue that lies behind attempts to restrict access to pornography is the sexualisation of children through pornography. Much internet pornography is extreme, violent and profoundly degrading of women. Research has found that both young boys' and young girls' exposure to it is linked to beliefs that women are sex objects and to negative – and even fearful – attitudes towards sex.

‘Raunch culture’ is everywhere. x1brett

The question that should be asked, in my view, is not whether regulation of access to pornography is technologically viable, but whether leaving that responsibility to parents is viable. Current filtering technology is almost ineffective in preventing children from accessing pornography, however responsible and rigorous the parent: even if parents lock down their own computers, their child may access it through a mobile phone or through devices belonging to other children whose parents have not opted to impose filters. So to suggest that the problem of children’s exposure to hardcore internet pornography can be overcome by individual parents and families is a red herring.

The “free” pornography sites that might be affected are not, of course, free, since they depend for their revenue on inducing consumers to pay for services. To whom would the damage occur if free pornography sites are affected as a result of the proposed regulation? Should we extend our sympathy to the multi-billion dollar pornography industry for a potential loss of income?

In any case, would the new regulation increase surveillance of our private habits any more than the myriad current online surveillance practices to which we are already subject, whether voluntarily or involuntarily? If adult use of pornography is harm-free, as its proponents claim, what has the user got to fear with regard to his own sexual predilections? Does the fear of surveillance say more about the guilt attached to consuming pornography than it does the chances of illegitimate government interference and control?

If we are adults, then let us be adults

I’d argue that, in weighing up the costs and benefits of social regulation which any liberal democratic government on the left or right is compelled to consider, regulating ISPs with regard to age restriction would, on the whole, increase freedom rather than restrict it.

It will help stay the pornography industry’s influence on children’s and young people’s sexual imaginations, identities and practices – and if adults then feel some shame because they can no longer be completely anonymous in their pornography-viewing habits then at least it is they as adults – and not the children – who will have to process their conflicted sexual emotions.

As well as the right to “adult” interests and a sexual life, isn’t learning to handle complicated feelings part of what being an adult means?


After years of talk, a regulator is willing to take on Google

In Monopol-e-Commerce, who plays the hat, and who gets the boot? danielbroche, CC BY

The European Commission’s decision to charge Google with abuse of its dominant market position in the search business in order to favour its own services has been criticised as too narrow in focus, too superficial for not dealing with the bigger problem of digital competition, ill-conceived for messing with the market, or not focused on the real problem of who owns our personal data.

While these are valid criticisms in their own way, they miss the most important point – that legal action has been taken at all. Whatever the result, this is a seismic and seminal move.

The US Federal Trade Commission (FTC) flirted with legal action in 2012 but withdrew, despite the conclusions of a leaked internal investigation that found that Google had “unlawfully maintained its monopoly over general search and search advertising”.

The European Commission worked closely with the FTC on its investigation and, like the FTC, had decided against launching action by 2013. Joaquin Almunia, the European commissioner for competition between 2010 and 2014, tried and failed to reach acceptable negotiated settlements with Google on three occasions. But his successor, Margrethe Vestager, has chosen action over discussion.

When the US Department of Justice launched an antitrust case against Microsoft in 1998 it dragged on for years, cost huge amounts of money and effort, and arguably opened up the space for Google to expand and eat much of Microsoft’s lunch. As journalist Charles Arthur writes in his book Digital Wars, the action had a devastating impact on Microsoft’s self-esteem and “reached into the company’s soul”.

The case against Microsoft also shows why the FTC and the commission were reluctant to launch a case against Google. It was legally and technologically complex, with courts struggling to apply 19th-century antitrust law to the digital 21st century. Many people ended up dissatisfied with the result.

Hurdles could trip up either side

The case against Google has the potential to be even more complex and legally challenging. To demonstrate Google has abused its dominance the commission may need to call upon economists, engineers, investigative journalists and perhaps even sociologists.

It will need to define the markets in which Google acts. General search may be a relatively established market, but what about vertical search, or social search? It will need to translate competition law to a digital environment, to understand how algorithms work, and the extent to which Google’s algorithms favour the company, and to show evidence of abuse. It will also need to establish whether Google’s actions have damaged “consumer welfare”.

The European Commission will need to do all this while being intensively lobbied by some of the world’s largest and most powerful corporations, for example through the Microsoft-sponsored Initiative for a Competitive Online Marketplace (ICOMP).

It’s not a great surprise, therefore, that the commission is charging Google on narrow grounds, in this case on favouring its own comparison shopping product. Shopping ought to be relatively low-hanging fruit: a reasonably well-defined market that Google has tried (unsuccessfully) to enter on more than one occasion with previous products Froogle, Google Product Search, and Google Shopping. There are a number of vocal, disgruntled competitors such as Yelp, Expedia and TripAdvisor. And there is evidence upon which to build a case, compiled by the commission and the FTC since 2010.

The commission hopes that by narrowly focusing its action in the first instance it can create a precedent from which to build. It has already signalled where it may go next, having announced a formal investigation into Android, Google’s mobile operating system, on the same day. Concerns over Google’s web content scraping and its exclusivity agreements with advertising partners have also been highlighted as potential areas of inquiry.

Legal ramifications

Whichever way the result falls, the repercussions will be pivotal. If the commission wins it will create a precedent with which the commission may choose to take on the dominance of other digital giants such as Amazon and Facebook. It may also trigger action by other governments and private actions. For Google it could lead to a crisis of confidence and loss of market lead similar to that experienced by Microsoft.

The consequences could be even more significant if the commission loses. Some will see it as evidence of the unchallengeable power of the global tech titans. Some will see it as confirmation that the legal action was merely European anger at US tech success. Few other democratic governments will be likely to take up cudgels and follow the commission’s lead.

However, the most likely result is that Google will settle. Though, as has been pointed out in reference to previous attempts to negotiate with the firm, settlements could create a precedent too – one that could make it difficult in the future to pursue Google for anti-competitive behaviour in one field having settled for the same in another.

In his landmark book The Master Switch, Tim Wu outlined the stages of each information cycle. First a period of openness characterised by innovation, entrepreneurship and relative confusion. Then consolidation, in which a small number of organisations grow dominant. And finally monopolisation of markets – and often subsequent government intervention. For the web, the commission’s antitrust action against Google may well signify the start of the final stage of the cycle.


Reducing science to sensational headlines too often misses the bigger picture

"This theory complex but important and -- hey look, it's Kim Kardashian!" Ed Schipul , CC BY

We are all being lied to, but it’s okay because we sort of know it. Exaggeration, sensationalism and hype are in the newspaper headlines and on the magazine covers we read and in the films we watch. Even the conversations we have with each other are exaggerated to make things sound that little bit more interesting. But what happens when you try to sensationalise science, and put little lies into something that revolves around truth?

The role of science in society is changing. Science is now in the mainstream, with “science editors” commonplace. But the little lies are creeping into science, designed to sensationalise, to entertain, to generate clicks online, to sell newspapers, and to make science sexy.

Missing the point

The best way to illustrate this problem is with an example. There is an ambitious idea called the Skylon project, essentially a rocket plane. Rocket planes are an excellent way to get to space, but building a suitable engine is difficult. However, the project recently passed a major milestone with its SABRE engine. This was widely reported with headlines tending toward the likes of “Now Possible to Get to Australia in Four Hours”.

In order to understand why that is important, a little context: rockets are a terrible way of getting to space. Large rockets weigh close to 1,000 tonnes, yet can only carry around ten tonnes of payload into orbit. Worse, most of that rocket gets crashed into the ocean in the process, and those rockets aren’t cheap.

Imagine if every time you took a flight you had to pay for the entire cost of the aircraft – ticket prices would go up, and the number of aircraft available would go down. The Skylon rocket plane would be the first completely reusable way of getting into space. Compared with non-reusable rockets, rocket planes such as Skylon would hugely increase the capacity and availability of flights and lower the cost of flying people and cargo into space. Everyone who could afford to buy a sports car could then afford to go to space.
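
A toy calculation shows why reusability changes the economics so sharply. The figures below are hypothetical round numbers for illustration only, not actual Skylon costings:

```python
# Illustrative comparison of expendable vs reusable launch costs.
# All numbers are invented round figures, not real estimates.
vehicle_cost = 1_000_000_000   # build cost of the launcher, in dollars
flight_cost = 10_000_000       # fuel and operations per flight
payload_kg = 10_000            # ~10 tonnes to orbit, as quoted above

expendable = (vehicle_cost + flight_cost) / payload_kg
reusable = (vehicle_cost / 200 + flight_cost) / payload_kg  # 200 flights

print(f"Expendable: ${expendable:,.0f} per kg")  # $101,000 per kg
print(f"Reusable:   ${reusable:,.0f} per kg")    # $1,500 per kg
```

Amortising the vehicle over many flights, as airlines do with aircraft, collapses the per-kilogram cost by orders of magnitude.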

The Wright brothers' first flight. CC BY

Getting lost in the headline

Yet all the reporting of Skylon’s new engine chose to focus on the idea that it would be possible to fly from one side of the world to another in hours. Sure, it’s attention-grabbing, but it misses the point. It’s like being alive in 1903. Until that point, the only way to fly was with a balloon or glider. Then the Wright brothers invented powered flight. With the ambitions of many from Icarus to Leonardo da Vinci finally realised and mankind able to take to the skies, it would be absurd to report it with the headline: “Now Possible to Get to the Shops in 30 Seconds”.

We all know that powered flight changed the world. A century after the Wright brothers’ breakthrough, two billion people and 40m tonnes of cargo were transported by air in a single year. Just 110 years later, Voyager 1 became the first man-made object to leave the solar system entirely.

Think about that. Only a century after we worked out how to take off from the ground, we managed to leave the solar system. And you’re telling me that the most interesting part of the SABRE engine is that you can get to Australia in four hours? No. Not even close. We are potentially ushering in a whole new era of human existence.

Yet somehow this message gets lost in the sensationalising of the world around us. Modern society has become obsessed with short-term gains and creating the illusion of progress and achievement. That is why popular media is full of these little lies and it is why we are trying to make science sexy.

Tracy Caldwell Dyson viewing Earth from the ISS Cupola, 2010. WikiCommons

Beauty is in the bigger picture

But when you make science sexy you lose the beauty, and there is tremendous beauty in science. That beauty is hope. Science is the hope of a future. Because if we just sit here on this planet and do the things we already do – getting places just a little bit faster, living just a little bit longer, happy to simply survive as we are – then we know how humanity’s journey ends: it will end here, on this planet.

But if we do more than survive; if we discover and explore and expand, then our future is uncertain. Science is a demonstration that humanity need not exist only on some tiny rock in the outer spiral arm of a single galaxy. To me, it means that humanity refused to go gently into that good night. Will it make it? Who knows – but it’s important that we try.


Bridge may be a sport but the brain definitely isn't a muscle

A judge. European Bridge League/flickr, CC BY-SA

This week a High Court judge opened the way for the card game bridge to be classified as a sport under English law. Recalling his own bridge-playing experience, Justice Mostyn acknowledged claims that the game could be recognised as a “mind sport” that exercises the “brain muscle”. He also stated that the game involves more physical activity than rifle shooting.

The case was brought by the English Bridge Union, which wants bridge to be classified as a sport in the hope that it would qualify for Sport England lottery funding, and a full judicial review has been granted.

Given that chess is recognised by the International Olympic Committee as a sport, the union’s claim may not be as unlikely as it appears. But the judge’s views were misguided.

The brain is not a muscle, it is an organ. It does not contain any muscle cells (which can be smooth, striated or cardiac) and it is incapable of contraction and dilation from central nervous system signals.

Increased cerebral activity will elevate glucose metabolism in the brain, but this will have negligible effects on the body’s overall energy balance, with no consequent physical health benefits. Any benefits of this kind are often negated by the typical consumption of snacks and beverages during the activity.

Brain plasticity research has demonstrated the capacity of the brain to develop new neurons throughout life when exposed to mentally challenging tasks. This means improvement of cognition is possible at any age. This activity can even reverse cognitive decline and delay the onset of dementia. In other words, the idea that you should “use it or lose it” is true.

Unlike bridge, rifle shooting has demonstrated physical benefits. North Carolina National Guard/flickr, CC BY-ND

In its defence, Sport England referred to the Council of Europe’s definition of sport: “Sport means all forms of physical activity which, through casual or organised participation, aim at expressing or improving physical fitness and mental well-being, forming social relationships or obtaining results in competition at all levels”.

It is not difficult to see that bridge fails to tick the boxes referring to physical activity and fitness. Yet the judge insisted that the dealing and playing of bridge requires greater physical activity than that of rifle shooting. From this I assume he meant the dynamic arm movement of selecting and placing cards onto a table (and possibly toilet breaks?), versus the static contraction of holding a rifle to take the desired shots.

Maybe Justice Mostyn has a point, but as yet there is no evidence to prove the physical benefits of bridge, unlike rifle shooting, which has demonstrated increased skeletal muscle activity. Critically, the physical activity required to take the rifle shot will determine its accuracy, whereas the speed and accuracy of placing cards on a table bears no relationship to the performance outcome.

However, we are now into the pedantic world of legal definitions. Jaffa Cakes succeeded in being classified as a cake as opposed to a biscuit for the sake of lower taxation, so maybe bridge could be classified as a sport.

One thing is for certain: this topic will always make for an interesting debate at the bar, even during the sport of darts. I’ve heard an alternative definition: “A sport should only be considered such when it necessitates the changing of shoes.” Maybe the Council of Europe should adopt this instead.


Discovery of microbe-rich groundwater in Antarctica could guide our search for life in space

It's the inside that counts. Eli Duke/flickr, CC BY-SA

Scientists have discovered salty groundwater underneath the dry valleys of Antarctica that is buzzing with microbial life. As the valleys are geologically similar to what Mars was like in the past, the discovery could help us understand what life on the red planet could have looked like. It could also help us search for life elsewhere in the solar system, such as on the icy moons surrounding Jupiter and Saturn.

The dry valleys west of McMurdo Sound are some of the most geologically intriguing regions of Antarctica. While we know a lot about their geology and surface hydrology, we have little understanding about what is happening beneath the glaciers and lakes. However, the new study, published in Nature Communications, has revealed that a large, inter-connected series of flowing ground-water streams lurks underneath the glaciers, lakes and permafrost.

Moreover, this ground-water system is home to a variety of microbial life feeding off the rich mineralogy of the environment. This could be micro-organisms that produce energy by chemical reactions involving iron and sulphur, for instance sulphur-reducing bacteria.

Salty surprise

The groundwater is very salty, containing a number of chemicals dissolved from sediments that were once many metres below sea level. The levels of salt (sodium chloride) are indeed very similar to those of ocean water. The salty water also explains the distinct red colour of the wonderfully named Blood Falls, which lies at the foot of the Taylor Glacier to the west of the valleys: the iron-rich brine turns red where it mixes with oxygen.

The creepy, iron-rich Blood Falls. National Science Foundation/Peter Rejcek

The discovery was made using an airborne sensor flown over the valleys, which measures the electrical resistivity of the material beneath. This allowed the team to distinguish between salt-containing sediments and frozen, ice-bound layers. Strikingly, they discovered that the salty ground-water system runs throughout large parts of the valleys, extending from the coast to at least 7.5 miles inland.

A guide to extra-terrestrial life?

Reasonably, the authors make the connection between the microbial habitat and similar geological environments that may once have existed on Mars. This link has been made many times before, most recently by NASA’s Curiosity Rover at Gale Crater.

Liquid water is essential for life and it is therefore a major driver for astrobiology. It can be found in many different types of geological environment, locked within microscopic cavities of minerals or encased in ice. Surprisingly, life can actually exist in such extreme environments, within certain limits. Moreover, metals – such as the iron and magnesium found in this study – are also crucial to driving the chemical processes of life, as they sit at the heart of many key enzymes.

While it is not a “smoking gun” for the existence of life elsewhere in the universe, the present study reminds us what we should be looking for.

Could microbial life similar to that found in Antarctica be found on Mimas? NASA/JPL-Caltech/Space Science Institute

Cosmic candidates

It is possible that liquid, salty water underneath the vast ice sheets of other solar system bodies – such as Saturn’s moons Enceladus and Mimas, Jupiter’s moon Ganymede, or even Neptune’s moon Triton – could hold similar microbial life. Underneath thick layers of ice, such water would be protected from potentially damaging cosmic rays and other harmful radiation, meaning life there would be relatively safe.

Identified as part of Scott’s Discovery expedition of 1901-04, the McMurdo Dry Valleys were explored in greater detail during Scott’s second, fateful Terra Nova expedition of 1910-13. Little did the explorers know back then that their discoveries had the potential to one day inform missions to space.


Space debris: what can we do with unwanted satellites?

It's crowded up there - the many objects tracked in low Earth orbit. ESA

There are thousands of satellites in Earth orbit, of varying age and usefulness. At some point they reach the end of their lives, at which point they become floating junk. What do we do with them then?

Most satellites are not designed with the end of their life in mind. But some are designed to be serviced, such as the Hubble Space Telescope, which as part of its final service was modified to include a soft capture mechanism. This is an interface designed to allow a future robotic spacecraft to attach itself and guide the telescope to safe disposal through burn-up in the Earth’s atmosphere once its operational life has ended.

Thinking about methods to retire satellites is important, because without proper disposal they become another source of space debris – fragments of old spacecraft, satellites and rockets now orbiting Earth at thousands of miles per hour. These fragments travel so fast that even a piece the size of a coin has enough energy to disable a whole satellite. There are well over 100,000 pieces this size or larger already orbiting Earth, never mind much larger items – for example the Progress unmanned cargo module, which Russian Space Agency mission controllers have lost control of and which will orbit progressively lower until it burns up in Earth’s atmosphere.

A hole punched in the side of the SMM satellite by flying orbital debris. NASA

We don’t know exactly how many or where they are. Only the largest – about 10% of those fragments substantial enough to disable a satellite – can be tracked from the ground. In fact damage to satellites is not unknown, with Hubble and the Solar Maximum Mission (SMM) satellites among those to have coin-sized holes punched into them by flying debris. There is a risk that over the next few years there will be other, perhaps more damaging, collisions.

The soft capture mechanism was installed to prevent more space debris. Engineers worldwide are devising ingenious ways to try to limit the amount of debris orbiting the planet – for good reason. Predictions show that if we don’t tackle the problem of space debris then many of our most useful orbits will become too choked with flying fragments for satellites to safely occupy them.

At some point, there may be enough debris in a given orbit for debris-satellite collisions and debris-debris collisions to cascade out of control. This is known as the Kessler syndrome, as shown (in somewhat exaggerated fashion) in the film Gravity.
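
The runaway character of the Kessler syndrome can be illustrated with a toy model in which the collision rate grows with the square of the fragment count, since every pair of fragments is a potential collision. The rates below are invented for illustration and not calibrated to any real orbit:

```python
# Toy Kessler cascade: collisions scale with the square of the
# fragment count, so growth eventually runs away.
fragments = 100_000            # fragments big enough to disable a satellite
pair_collision_rate = 1e-9     # chance per year that a given pair collides
fragments_per_collision = 500  # new fragments created by each collision

for year in range(1, 101):
    pairs = fragments * (fragments - 1) / 2
    fragments += pairs * pair_collision_rate * fragments_per_collision
    if fragments > 10_000_000:
        print(f"Year {year}: cascade - fragment count passes 10 million")
        break
    if year % 10 == 0:
        print(f"Year {year}: ~{fragments:,.0f} fragments")
```

For decades the count creeps up slowly, then within a few years it explodes – which is why acting before a cascade begins matters so much.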

Given the degree to which we rely on satellites these days – for communication, GPS and time synchronisation, upon which in turn many vital services such as international banking rely – it’s crucial we prevent near-Earth space from reaching this point. And like it or not, one of the important steps required is to remove large defunct satellites that could become the source of many more chunks of debris.

Designed for disposal

Satellites such as the UK’s TechDemoSat-1 (TDS-1), which launched in 2014, are designed for end-of-life disposal. TDS-1 carries a small drag sail designed and built at Cranfield University that can be deployed once the satellite’s useful science life is over. This acts like a parachute, dragging the satellite’s orbit lower until it re-enters and burns up high in the atmosphere.

TDS-1 is small enough to burn up – larger or higher satellites will require other ways of moving them away from the most important, valuable, and busy orbits. It’s possible, with enough fuel on-board (and all systems functioning after perhaps decades in space), for satellites to de-orbit themselves. Other, more exotic solutions include tug satellites using nets, tethers, and even high power lasers.

Bag it and bin it - ESA’s e.Deorbit project may use nets to collect debris and drag it into the atmosphere to burn up. ESA

However, space debris isn’t just an engineering problem. Suppose Europe develops a tug satellite and tries to de-orbit old Russian satellites, or passes close to an active US spy satellite. Clearly this could get political. Simply put, we haven’t yet found a way to use space sustainably, and the problem is almost as complex as finding ways to ensure sustainable development on Earth. What we need are practical solutions – and soon.

One that got through: part of the Delta rocket fuel tank that came back to Earth in 1997. NASA

So what will happen to Hubble, perhaps the most well-known case of a satellite that requires a retirement plan? One day, perhaps in the early 2020s, a small spacecraft will be launched to rendezvous with the space telescope. It will attach using the soft capture mechanism and then fire its engines to guide Hubble toward re-entry over the South Pacific. For a satellite as large as Hubble, it’s likely that some parts will survive re-entry so a large uninhabited region over the ocean is best suited to avoid risk of damage or casualties.

The re-entry can be tracked carefully from other satellites, aircraft, and ships – all will capture the moment that Hubble itself, having spent decades watching the heavens, will become a bright shooting star for other telescopes to capture. It somehow seems fitting that a mission as remarkable and long-lived as Hubble should itself end in a blaze of glory.


How earthquake safety measures could have saved thousands of lives in Nepal

Poorly built houses were destroyed in the earthquake. Domenico/flickr, CC BY-SA

Earthquake engineers often say earthquakes don’t kill people, collapsing buildings do. The tragic loss of life that followed the huge earthquake in Nepal on April 25 occurred despite the fact that the country is among the world’s leaders in community-based efforts to reduce disaster risk. But poverty, corruption, and poor governance have all led to a failure to enforce building codes – as has a shortage of skilled engineers, planners and architects.

Sadly the country was on its way to deploying knowledge and skills to tackle its long-term vulnerability just as the ground shook.

So why aren’t more buildings designed to withstand shaking – even extreme shaking?

To keep buildings standing, it is essential to have adequate building and planning codes, as well as proper training and certification for professionals such as engineers, architects, and planners. But having certification and codes on paper does not ensure implementation or compliance. Nepal does, after all, have some of these things. Laws and regulations must also be monitored and enforced. That is not easy in a country such as Nepal, which has isolated villages, a history of conflict and many governance difficulties.

Vast vulnerability

Financial as well as social resources are needed to construct earthquake-resistant buildings. Governments at all levels need to be functioning and competent in order to engage with processes such as urban planning and earthquake-resistant construction. Citizens must trust and have the opportunity to work with their governments, including the law enforcement and judicial sectors.

It’s not just about buildings. Many non-structural measures are needed to ensure survivability in earthquakes. Appliances such as televisions, microwaves, hot water boilers, and refrigerators (which do not always exist in Nepalese homes) must be securely fastened to the floors and the walls. Otherwise, they move and topple, killing as readily as building collapse. Even in affluent earthquake-prone locations such as New Zealand and California, we see shockingly low rates of households enacting these basic measures.

Students at a Nepalese school practice earthquake preparedness. Australian Department of Foreign Affairs and Trade/flickr, CC BY-SA

But Nepal is not New Zealand or California. It has been wracked by conflict and troubled by unstable governments, not to mention the governance issues caused by being sandwiched between China and India. It has long had high poverty and low formal education rates.

Despite recent improvements, Nepal still lags behind other countries when it comes to human development and it is still seen as highly corrupt. It also scores badly on child health and gender equality measurements.

When families struggle daily for enough food to keep their children healthy, they are not likely to spend time thinking about making their home earthquake resistant.

And when children are malnourished and stunted, they perform worse in school. That leads to long-term education inadequacies that prevent them from developing into adults with the skills to lobby for adequate and enforced building codes. What’s more, when women lack the same opportunities as men, half the population is excluded from demanding and enacting good governance.

All these factors contribute to the country’s vulnerability. All these factors have led to housing and infrastructure prone to collapse in an earthquake.

Rebuilding a nation

None of these things can be solved overnight. Tackling vulnerability is a long-term process, yet earthquakes strike and bring down buildings in seconds and minutes.

As the earthquake struck, Nepalese people were working hard to overcome these vulnerability conditions. My friends and colleagues from the country have taught me plenty about retrofitting buildings and constructing earthquake-resistant homes.

There is hope. Jean-Marie Hullot/flickr, CC BY

They travelled to communities with small shake tables, which are used to simulate earthquakes by shaking model houses or building components, showing the difference between an earthquake-resistant house and a non-earthquake resistant house. They made many schools safe. They taught school children and their parents about earthquake-safe behaviour.

A shake table demonstration for Earthquake Safety Day 2007 in Nepal. NSET, Nepal., Author provided

These efforts saved hundreds of lives, if not more, during the recent tremors. With a few more decades – a mere instant in geological time – they could have made Nepal comparatively safe from earthquake disasters. In that time, many more buildings would have been retrofitted, building codes might have been adequately enforced and, most importantly, an earthquake-educated and vulnerability-aware generation would have started to take power.

Nepal must now continue these efforts in order to avoid similar devastation in the future. We can be optimistic. Education is happening – for boys and girls. Women are increasingly being given the same opportunities as men. This means the Nepalese people are taking charge of their own health, their own environment, and their own sustainability. That is long-term vulnerability reduction.


Forget the James Webb, a future high-definition telescope could probe life on exoplanets

Bigger but not better than Hubble. The James Webb's primary mirror. NASA/wikimedia

The James Webb Space Telescope will be Earth’s premier space observatory for the next decade, serving thousands of astronomers worldwide. However, its scientific mission will be limited: unlike Hubble, which is nearing the end of its scheduled life, the James Webb will cover a much smaller part of the electromagnetic spectrum. A proposed high-definition space telescope is the only way to image Earth-like planets orbiting other stars and study them in detail.

While such a project is being studied by a consortium of scientists in response to a NASA call for ideas for large future space missions, it has so far not been formally approved. But it is urgent that we start working on this project now, because the planning timescales for large missions of this kind are long. Even if we started to build it right now, it would still not be ready until 2030 at the earliest.

The limits of James Webb

Hubble, in low Earth orbit since 1990, has been a great success and has demonstrated the many advantages that space telescopes have over ground-based telescopes. Its successor, the James Webb, which is due for launch in 2018, is an even larger instrument, with a 6.5m diameter mirror compared to Hubble’s 2.4m. Just like Hubble, it will be able to avoid the disturbing effects of the Earth’s atmosphere.

Hubble exceeded our expectations. NASA

Its scientific mission includes searching for light from the first stars and galaxies and studying the formation and evolution of galaxies. This is more easily achieved by measurements in the near-infrared, which is why it will not measure visible or ultraviolet light as Hubble does. While the James Webb will be able to deliver some amazing science – it will collect much more light and will be able to look deeper and farther back in time in the universe – the lack of ultraviolet measurements is a major drawback. Ultraviolet can only be observed by space telescopes; it cannot be picked up from the ground as it is blocked by the Earth’s atmosphere. Astronomers will therefore completely lose access to UV when Hubble dies.

High definition is the way forward

ET phone home. Artist’s impression of an exoplanet. NASA/JPL-Caltech/flickr, CC BY-SA

The proposed High Definition Space Telescope, which would have a 10-12m aperture, would be tuned to work in the UV and visible, as well as the infrared.

The aim of the NASA project is to understand the technical challenges now so that they can be solved before any construction begins.

The photon-counting detectors in the proposed HDST would have a higher count rate per pixel and lower noise than James Webb and Hubble. AURA/NASA presentation

Such a facility would be a general, all-purpose observatory that would deliver amazing and often unexpected science. However, the most compelling case for this telescope – and one of the most exciting pieces of science that can be conceived, I believe – is the ability to image tens of Earth-like planets orbiting other stars and study them in detail. By looking at the chemical signatures in their atmospheres, it will be possible to work out whether life exists there and how common it is in our galaxy.

To do this, the telescope would be fitted with a disk to block out the bright surface of stars, which would allow direct imaging of exoplanets. The group studying the telescope says most of the technologies needed for the mission are already being developed as part of other NASA programmes. The telescope could therefore credibly be put forward to NASA’s Decadal Survey in 2020, which will identify and prioritise scientific questions and observations.

In the meantime, while we prepare for this over the next couple of decades, we should consider going back to Hubble with the next generation of human-carrying space vehicles, such as NASA’s Orion capsule, and service it at least one more time.


Computers are knocking on the door of the company boardroom

What's your golf handicap, old chap? Mopic

While women sitting on company boards remains a much-discussed topic, there is something new waiting to take a seat at the table: artificial intelligence, computers with company voting rights.

Deep Knowledge Ventures has appointed an algorithm called VITAL (Validating Investment Tool for Advancing Life Sciences) as a member of its board. It uses state-of-the-art analytics to assist in the process of making investment decisions in life science companies.

Of course, companies have long used computer-assisted analysis of investment opportunities, but is the vision of a computer with voting rights equal to those of human board members a bit far-fetched?

Defining artificial intelligence

Alan Turing Wikimedia Commons

What does the future hold with regard to the influence of computers on business decisions – and can they ever be used in place of a human board member? The Turing Test, formulated by Alan Turing in the 1950s, provides a strict interpretation of machine intelligence. A human participant must be unable to tell whether they are communicating (through a typed, text medium) with a computer or a human. If the human participant cannot reliably tell whether their conversation partner is a computer, then Turing would argue the computer has demonstrated intelligence.

Numberphile: The Turing Test

Not everybody agrees that passing the Turing Test is enough for a computer to exhibit intelligence. In his Chinese Room argument, the Berkeley philosopher John Searle described a closed room, into which a sentence written in Chinese is fed. A response emerges from the room, written in Chinese, that correctly answers the questions or conversational cues in the sentence submitted. The assumption could be made that inside the room is someone who can speak Chinese.

Instead, inside the room is a human who cannot speak Chinese but is equipped with manuals that exhaustively provide the appropriate Chinese characters to produce in response to those received. The argument holds that an appropriately programmed computer (the person in the room) could pass the Turing Test (by producing convincing Chinese) but would still not have an intelligent mind that we would regard as human intelligence (by understanding Chinese).

The Chinese Room

A computer in the boardroom

If we want computers to make business decisions and even have equal voting rights on a company board, what would a computer have to do in order for the other board members to have confidence in its decisions?

Part of the challenge of the Turing Test is syntax versus semantics. Compare the sentences “Fruit flies like bananas” and “Time flies like an arrow”. The sentence structure looks similar but the meaning is entirely different: in the first, “flies” is a noun and “like” is a verb, while in the second, “flies” is the verb. Disentangling such sentences is a real linguistic challenge.

Even a very simple conversation relies upon a substantial amount of linguistic knowledge and understanding. Consider the following questions:

  • What was the result of the big match last night?

  • I have K at my K1, and no other pieces. You have only K at K6 and R at R1. It is your move. What do you play? (these chess moves are from Turing’s original paper)

  • What book do you think of if I say 42?

These might seem easy for humans to understand, but they are challenging for a computer. Thankfully, a computer making business decisions is not faced with a task as general as the Turing Test. But if we are serious about having a computer as a full member of a company board, what are the hurdles that need to be addressed? Here is an (almost certainly incomplete) list.

  1. Access to LOTS of data: An automated approach to decision-making will require the use of big data. Company reports and accounts, economic data such as share prices, interest rates and exchange rates, and government statistics such as employment rates and house prices would all be obvious inputs. More subjective data such as newspapers, social media feeds and blogs might also be useful. Peer-reviewed scientific papers might also provide insight. Of course, as always, the challenge with big data is to process large quantities of data that will be of different types (figures, text, charts), stored in different ways, and have missing elements – see the sketch after this list.

  2. Cost: Much of the data required is likely to come at significant cost. Social media feeds may be free (but not always), while stock market information, company accounts, government data, scientific papers and so on are generally commercial products that must be paid for. In addition, there is the cost of developing and maintaining the system. The algorithm is likely to require continual development by highly skilled analysts and programmers.

  3. Complexity: Big data algorithms will be central to the boardroom decision-support system, but they will be underpinned by advanced analytics, many of which we are only just starting to understand and develop. To have a real impact, some research is likely to be needed, which would require staff with the relevant skills.
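
As a flavour of the data-wrangling challenge in item 1, here is a minimal sketch that merges two hypothetical inputs – numeric share prices with a gap, and a sparse text-derived sentiment signal – into one table a decision algorithm could consume. The sources and figures are invented for illustration:

```python
import pandas as pd

# Hypothetical heterogeneous inputs, both with missing elements.
share_prices = pd.DataFrame({
    "date": pd.to_datetime(["2015-01-05", "2015-01-06", "2015-01-07"]),
    "price": [102.5, None, 104.1],          # a gap in the numeric series
})
news_sentiment = pd.DataFrame({
    "date": pd.to_datetime(["2015-01-05", "2015-01-07"]),
    "sentiment": [0.3, -0.6],               # sparse text-derived signal
})

merged = share_prices.merge(news_sentiment, on="date", how="left")
merged["price"] = merged["price"].interpolate()        # fill the price gap
merged["sentiment"] = merged["sentiment"].fillna(0.0)  # neutral where absent
print(merged)
```

Real systems face the same problems at vastly larger scale, across thousands of sources, which is where the cost and complexity in items 2 and 3 come from.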

So, are we really at a point where a computer could take its place on the board? Technically it’s possible, but the costs to develop and maintain such a system, as well as to subscribe to the data it requires, probably put it beyond the reach of most companies – and I suspect that the money would be better spent on a human decision-maker, at least for now.


Why it is so hard to predict where and when earthquakes will strike

There is currently no technique that could have helped Nepal predict when the recent earthquake would strike. AP/PA/Niranjan Shrestha

Can earthquakes ever be predicted? This question is timely after the magnitude 7.8 earthquake that struck Nepal recently. If authorities had more warning that the earthquake was coming, they may have been able to save more lives.

While Nepal is a documented area of previous seismic activity, at the moment there is no technique that provides predictions of sufficient clarity to allow for evacuations at short notice. So if we cannot predict these events now, are there avenues of research to provide useful predictions in the future?

The key word here is “useful”. It is possible to make long-term forecasts about future earthquake activity, partly by using the past record of earthquakes as a guide. There is no reason to believe that a region of the Earth is going to behave differently over the next few thousand years from the way it has behaved over a similar span in the past. In the short term, seismologists can draw on data from recording stations, with records going back roughly 40 years on a global scale.

Within hours of a major earthquake there are estimates of its epicentre, magnitude (the amount of energy released), the depth at which it originated, the orientation of the geological fault that caused it and the direction in which it moved. The event in Nepal was a thrust fault, meaning that the upper part of the Earth was shortened by a few metres, with the rock lying above the fault plane moving southwards over the rock lying beneath it.

Gathering the data

Information about past earthquakes comes from a number of sources, not least historical records. But such records are incomplete, even in earthquake-prone countries with long traditions of documenting natural disasters, such as China and Iran. Other lines of evidence are available, including measuring and dating the offsets (movements caused by earthquakes) of man-made or natural features that can be accurately dated, such as the walls of a castle or a city. Faults cutting the Great Wall of China have been documented in this way.

Seismologists also dig trenches across faults known or suspected to be active, and can recover rocks and sediments affected by earthquakes. These events can be dated, for example by radiocarbon analysis of plant remains disturbed by the faulting.

Seismologists can assess earthquakes by measuring how much they move geological features. flickr/US Geological Survey, CC BY

By combining the earthquake ages with the size of the damaged areas, it is possible to understand earthquake patterns over hundreds or even thousands of years. Scientists use this information as a guideline for future behaviour, but it is clear that the faults do not slip after the same period of time between earthquakes (the recurrence interval).

Nor does a fault necessarily rupture in the same place in successive earthquakes. An earthquake releasing stress along one fault segment may place more stress on an adjacent region, thereby increasing the earthquake likelihood in that area. This may occur soon after the original event, which explains the phenomenon of aftershocks. Nepal has already seen aftershocks of a magnitude greater than six, and is likely to see more.

Global hotspots

Instrumental and historical records combine to make a global picture of earthquake activity. There are, unfortunately, many danger areas. Eurasia bears the brunt, because of the collision of the Indian and Arabian plates with the rest of Eurasia. Therefore China, Iran, Pakistan and India all share Nepal’s susceptibility to large earthquakes. Other danger areas lie along the margins of the Pacific and Indian oceans, where one plate slides under another in a process called subduction. Earthquakes at such plate boundaries can cause devastating tsunamis, as in Japan in 2011.

Areas where tectonic plates slide under one another are earthquake hotspots.

Newer lines of research include precise measurements of the movement of a fault during earthquakes and of the motion of the Earth’s surface between earthquakes. Across the Himalayas there is around 20mm of convergence (shortening) every year, roughly half of the overall convergence between the Indian and Eurasian plates. The remainder is accommodated further north, in ranges such as the Tian Shan and the Tibetan Plateau. In other words, every year a person in Siberia moves roughly 40mm closer to a person in central India, as the Earth’s crust deforms across the broad region between them.
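
These rates also make sense of the few metres of slip mentioned earlier. As a back-of-the-envelope illustration – assuming, simplistically, that all the Himalayan convergence is stored and released on a single fault – the time needed to build up one large earthquake’s worth of slip is:

$$t \sim \frac{\text{slip per event}}{\text{loading rate}} = \frac{2\text{–}4\ \text{m}}{20\ \text{mm/yr}} \approx 100\text{–}200\ \text{years}$$

This is only an order-of-magnitude estimate, but it shows why great Himalayan earthquakes are separated by decades to centuries on any one segment – and why, as noted above, the actual interval varies from one event to the next.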

This strain builds up over time and is released in an earthquake like the snapping of an elastic band. Faster strain, longer faults and greater strength in the upper part of the Earth in a particular region can all lead to larger earthquakes. The Himalayas feature a deadly combination of these factors, leading to very large events of the kind experienced on April 25.

It is not sensible to be naively optimistic about improvements in earthquake prediction, but all research on the past and present behaviour of active faults is to be welcomed. It is timely that the UK’s Natural Environment Research Council has just announced funding for research into earthquakes and resilience to earthquakes.

How we identified weird and wonderful 'Jurassic platypus' dinosaur

Calm down, I'm a vegetarian. Gabriel Lio, Author provided

When the platypus was discovered in the very late 18th century, its bizarre features – which appeared to be a mash-up of other animals – perplexed naturalists. Now a creature from the past that would have looked like a strange mix of unrelated dinosaurs has been discovered. Our research suggests that it belonged to a hitherto unknown lineage of herbivores that lived around 145m years ago, in the Jurassic period.

I was part of the international team that identified this strange creature by analysing bones enclosed in ancient rocks. Our research, published in the journal Nature, reveals that Chilesaurus was relatively small – a fully grown adult would have measured about 3.2 metres. We discovered this by investigating four whole skeletons and several other bones – a task that was not particularly difficult, as the bones were well preserved. In fact, only a few skull bones and the end of the tail remain undiscovered.

Chilesaurus' teeth suggest it was a vegetarian. Fernando Novas, Author provided

The creature had leaf-shaped teeth, which means it was most likely a plant eater. Other signs were its robust legs, which resemble those of other herbivorous dinosaur groups, and the shape of its pelvis, which allowed for a larger gut capable of processing plant material. Chilesaurus was the most common species in the braided river system where it lived, alongside primitive crocodiles and large long-necked dinosaurs.

A genealogical puzzle

Identifying what the dinosaur looked like was not the most challenging part of the research; figuring out which dinosaur group it belonged to was far harder – an issue we spent many late nights discussing. We were completely astonished to find that each part of the skeleton, as it was cleaned of the surrounding rock, resembled a different group of dinosaurs.

The well-preserved skeleton. Gabriel Lio, Author provided

Its skull and neck look like those of primitive long-necked dinosaurs such as Plateosaurus; the vertebrae resemble those of primitive meat-eating theropods such as Dilophosaurus; the pelvis is very similar to that of ornithischian dinosaurs such as Iguanodon; and the hand has only two well-developed fingers, as in Tyrannosaurus rex, but on a longer arm.

However, there is no possibility that Chilesaurus is simply a jumble of bones from different dinosaurs, because we found four partial skeletons. Working partly in Buenos Aires, Argentina, and partly in Birmingham, our team compared the bones with those of other dinosaur groups. Through a series of analyses we eventually concluded that Chilesaurus belongs to a completely unknown lineage of dinosaurs that acquired herbivorous habits from carnivorous ancestors. It is the first herbivorous theropod (a lineage that otherwise consists mainly of predatory dinosaurs) known from the southern hemisphere.

We believe that the new dinosaur is a primitive tetanuran – a group of theropods that includes Megalosaurus, Allosaurus, Tyrannosaurus and birds, but not Carnotaurus and other more primitive theropods.

The first bones were found by geologist Manuel Suarez and his seven-year-old son. The study took four years, and the analyses were conducted during the second half of last year.

A Chilesaurus of our times

Who are you calling weird? daniel.baker/flickr, CC BY-ND

A bizarre combination of features like that seen in Chilesaurus can also be found in living species such as the platypus, which looks like a mix of duck, beaver and otter – some naturalists even considered it a hoax. Animals such as Chilesaurus and the platypus can be explained by a process called convergent evolution, in which two unrelated species or groups acquire similar characteristics because they live in similar environments or behave in similar ways.

Similarly, the bizarre anatomy of Chilesaurus will probably open a heated discussion about its relationships. Ultimately, the discovery reveals how much about dinosaurs remains unknown, and how much is still waiting to be discovered in the rocks that tell the story of our planet in deep time.

Telescopes on the ground may be cheaper, but Hubble shows why they are not enough

Bye, Earth telescopes! You will never reach my level. ESA, CC BY-SA

Observatories on Earth are cheaper than telescopes in space. They are also improving rapidly – when the European Extremely Large Telescope starts its observations in nine years, it will be able to provide images 16 times sharper than those taken by the Hubble space telescope. But while it may seem hard to justify investment in space telescopes, the ground-breaking discoveries made by Hubble have shown just how valuable they are.

Hubble, which was the world’s first space-based optical observatory, has made amazing discoveries in all aspects of astronomy, from flashes of aurora on planets and moons in our solar system to the evolution of galaxies billions of light years away.

Observations by Hubble helped determine the rate of expansion of the universe in a Nobel prize-winning study. We have witnessed stars being born in nurseries like the Eagle nebula and exploding as supernovae. Hubble has also captured a powerful jet emerging from a black hole at the centre of another galaxy.

Picture of the globular cluster Messier 2, taken by Hubble. ESA/Hubble & NASA, CC BY

These discoveries come at a price. The Hubble mission cost $1.5 billion at its launch in 1990, and maintenance costs have also been sky-high. The eagerly anticipated first pictures taken by Hubble were disappointingly blurry: the 2.4-metre mirror inside the telescope was slightly flawed, so the light was not focusing correctly. Installing a corrective optics system was the target of the first Hubble servicing mission, carried out by space shuttle astronauts over five days of spacewalks in 1993. Four further servicing missions were carried out from 1997 to 2009 to upgrade and replace scientific instruments, power and guidance systems, each with its associated risks and expense. Since the end of NASA’s Space Shuttle programme there has been no way to carry out further servicing.

Space telescopes are not getting any cheaper. The successor to Hubble, the James Webb telescope, has been plagued by delays and rising costs. By the time it launches in 2018, it will have cost about $8bn to build, launch and commission.

Earth v space

One significant advantage of building on the ground is that the size of the telescopes can be much larger than can be carried into space. Telescopes on our own planet have also made amazing discoveries, such as the Gemini telescope observing Jupiter’s two giant red spots brushing past one another in the planet’s southern hemisphere. The Keck observatory has detected water vapour in the atmosphere of a planet orbiting another star. The European Southern Observatory telescopes tracked stars orbiting the black hole at the centre of our galaxy to understand the formation of the stars and their interaction with the black hole.

However, ground-based telescopes aren’t cheap either. Work has already begun on the European Extremely Large Telescope, sited in Chile’s Atacama desert, with a cost estimated to be over €1 billion and with annual operating costs of €50m. But this is still less than Hubble and James Webb.

Artist’s impression of the European Extremely Large Telescope. European Southern Observatory/flickr, CC BY-SA

When E-ELT observations start in 2024, state-of-the-art correction for atmospheric distortion will allow it to provide images 16 times sharper than those taken by Hubble. With technological advances like this, it may seem hard to justify the expense and risk of future space-based telescopes.
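
That factor of 16 is no accident: it matches the ratio of the two telescopes’ mirror diameters, because the sharpest image a telescope can form (its diffraction limit) scales inversely with the size of its aperture – assuming the E-ELT’s adaptive optics let it reach that limit:

$$\theta \approx 1.22\,\frac{\lambda}{D}, \qquad \frac{\theta_{\text{Hubble}}}{\theta_{\text{E-ELT}}} = \frac{D_{\text{E-ELT}}}{D_{\text{Hubble}}} \approx \frac{39\ \text{m}}{2.4\ \text{m}} \approx 16$$

where θ is the smallest resolvable angle, λ the wavelength of the light and D the mirror diameter.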

However, the simple fact is that if we choose to only observe from the ground we will make ourselves blind to a wide variety of astronomical phenomena and potential discoveries. These include some of the universe’s most energetic events, such as gamma ray bursts.

The main reason for this is the Earth’s atmosphere, which holds ground-based telescopes back in several ways. While the atmosphere lets through visible light, to which our eyes are sensitive, it absorbs light at some other wavelengths, so those wavelengths can never be observed from the ground. In addition, turbulent motion in the atmosphere blurs the light travelling through it, causing objects to twinkle and appear fuzzy. Ground-based telescopes are also subject to local weather conditions, and high clouds can ruin the chance of making any useful observations.

The Very Large Telescope in Chile is about to get competition from the E-ELT. ESO/G. Lombardi (glphoto.it), CC BY-SA

From its vantage point above the atmosphere, Hubble avoids these effects and can produce high-resolution images over a broad spectrum. The scientific value of these observations is evident in that applications by scientists for observing time on Hubble last year were oversubscribed by a factor of five. It has also been an important source of scientific papers. According to a survey by the European Southern Observatory last year, Hubble has produced between 650 and 850 papers per year since 2005 – which is far more than any of ESO’s ground-based telescopes.

Complementary contributions

The investment in astronomical telescopes, whether in space or on the ground, has to be justified by the scientific return – and in selecting new facilities it is fundamentally the science which drives the decision. Having worked with telescopes both on the ground and in space, I feel that science ultimately needs both. But in a world of limited funds we can’t have it all. International co-operation is therefore the key, whether it is about placing a new telescope in another country or providing an instrument for a mission led by another space agency.

The value of the observations made by telescopes based both on the ground and in space can be measured not just by the scientific results in understanding the near and far universe, but also in the inspiration that these images and discoveries provide.
