Australasian Science: Australia's authority on science since 1938

Exclusive subscriber news

By Stephen Luntz


Lunar Gravity Map Created
The creation of a detailed map of lunar gravity has revealed the internal workings of our nearest neighbour. “Our new lunar gravity map now shows, for the first time, how the pull of gravity changes from location to location over the rugged surface of the Moon,” says Dr Christian Hirt of Curtin University.

Space probes, particularly the 2007 Japanese SELENE mission, have provided large-scale maps of anomalies in lunar gravity. However, the altitude at which these missions flew prevented them from detecting smaller variations, such as those caused by mountains or craters.

“Recently the United States Lunar Reconnaissance Orbiter has provided a useful data source on lunar topography, containing rich information on expected gravity field signatures such as of small craters,” Hirt notes.

Hirt and Prof Will Featherstone combined information from these maps with far more detailed topographic data to estimate the gravity on a finer scale. The same technique has been verified on Earth, with comparison against local data.

Apollo 17 astronauts took detailed gravity measurements at around a dozen locations, but the spread represents a tiny portion of the Moon’s surface. Consequently Hirt acknowledges that the map assumes that the density of the surface materials is consistent over small scales.

Since 1968 it has been known that gravity is higher on the lunar “seas”. Hirt says this is thought to be because these plains formed when “the Moon was impacted by very large rocks, and liquid from the interior came out and filled out these parts”.

This interior material is denser than the crust, producing higher gravitational fields. These mass concentrations, detected by the SELENE mission, show up in the new map, along with other large anomalies that cannot be explained by topography alone.

Hirt says it is not yet clear what applications the map may serve, but “when designing a satellite mission it is always a good idea to have a basic understanding of what structures are to be expected”.

He adds that he is waiting on the reaction from the scientific community as “sometimes when you create a new product it is found useful in ways you hadn’t expected.” Possibilities include providing more accurate comparisons against selenoid height (the lunar equivalent of sea level) and in the study of historic rock falls or lava movements.

The Furthest Cluster
The most distant galaxy cluster ever identified has been observed 10.5 billion light years away. Co-discoverer Dr Lee Spitler of Swinburne University says that individual galaxies have been seen at this distance, but never a collection of 30 galaxies such as this one.

“Our galaxy cluster is observed when the universe was only three billion years old,” Spitler says. “This means it is still young and should continue to grow into an extremely dense structure containing many more galaxies.”

Clusters such as this one are believed to capture individual galaxies slowly over time, with some clusters now containing several thousand member galaxies.

The discovery was published in Astrophysical Journal Letters.

Co-author Prof Kim Vy Tran of Texas A&M says: “In the same way that it’s important for humans to search for the oldest known cities to understand civilisations today, it’s important to search for the cosmological equivalent of the most ancient cities to understand why galaxies like our Milky Way look the way they do”.

The cluster was observed in an area of the sky that has been the focus of much deep observation, including more than a month of the Hubble Space Telescope’s time. However, while the galaxies were observed, Spitler says no one recognised their nature.

“A fundamental problem in observational astronomy is to classify the points of light we see in the night sky. You need to determine if the point of light you are looking at is a star in our Milky Way, a nearby galaxy or one very far away.”

The galaxies are moving away from us so rapidly that their light has been dramatically shifted towards the red end of the spectrum. Spitler and his colleagues used infrared cameras with multiple filters to break the spectrum into narrower windows than has been done before, providing an indication of the distance over which the light has travelled.
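As a rough illustration of why infrared cameras were needed (the redshift value and emission line below are illustrative assumptions, not figures from the study), light emitted when the universe was about three billion years old is stretched by a factor of roughly three, pushing optical emission well into the infrared:

```python
# Toy illustration of cosmological redshift: the observed wavelength
# is the emitted (rest-frame) wavelength stretched by a factor (1 + z).
def observed_wavelength_nm(rest_nm: float, z: float) -> float:
    """Return the observed wavelength for light emitted at rest_nm."""
    return rest_nm * (1 + z)

# H-alpha, a strong optical emission line, sits at 656.3 nm in the rest frame.
H_ALPHA_NM = 656.3

# Assuming a redshift of z = 2.2 (illustrative), the line lands near
# 2100 nm -- infrared, invisible to ordinary optical cameras.
shifted = observed_wavelength_nm(H_ALPHA_NM, 2.2)
print(f"{shifted:.0f} nm")  # ~2100 nm
```

Comparing where known emission lines fall across many narrow filters is, in essence, how a photometric survey estimates the distance the light has travelled.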

These findings were confirmed using spectrographs, a more established way of measuring red shift, but one that only allows observations on a small number of objects at a time.

Quicker Assessment for Peanuts
A combination of two blood tests for peanut allergies could save the health care system from expensive or unreliable testing, according to researchers from the Murdoch Children’s Research Institute.

Peanut allergies can be identified in a number of ways, with A/Prof Katie Allen describing oral food challenges as “the gold standard”. However, these involve patients spending half a day with health care professionals and placing themselves at risk of severe reactions. On the other hand, skin prick tests have proved unreliable and led to overdiagnosis in the past.

Blood tests are currently conducted using samples of the whole peanut, but a range of tests is emerging that look for antibodies to specific peanut proteins. Of these, Allen says the Arah2 test is the most reliable. However, she says that on its own Arah2 still does not meet the standards required.

Instead, Allen and PhD student Thanh Dang found more success using blood tests against whole peanuts as a first stage. Individuals with a high response were reliably allergic to peanuts, while those with little or no response were not.

The blood of those in between was then tested using the Arah2 test. After this, only one-quarter as many people were still in the uncertain zone and required further testing.

“By reducing the number of oral food challenges, this helps prevent many peanut allergics undertaking the unnecessary risks involved,” Allen says.

Arah2 is just one of “literally hundreds” of protein allergy tests for various conditions about to come onto the market, according to Allen. “If we don’t know how to use them they will be used indiscriminately or not used at all.” Allen and Dang’s work may provide a pointer to ways these tests can be used appropriately.

Allen hopes that if the simple cases can be handled by general health care professionals, the backlog of demand for allergists’ services will be reduced so they can focus on those in greatest need.

The research was done on 1-year-old children. Allen is now conducting a study of 10,000 older children, but says she anticipates similar results.

Random Numbers Generate Some Noise
A potent random number generator has literally been produced from nothing, and may have applications in many fields. “Random number generation has many uses in information technology,” says Prof Ping Koy Lam of the Australian National University. “Global climate prediction, air traffic control, electronic gaming, encryption, and various types of computer modelling all rely on the availability of unbiased, truly random numbers.”

Lam explains that since complex systems, such as the weather, involve a number of random factors, computer models need a source of unbiased random numbers to accurately represent them.

While computers produce what are described as random numbers, these actually build on previous inputs. “If you can guess the initial number you can predict the rest,” Lam explains, and this has been raised as a problem for cryptographic coding.
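Lam’s point is easy to demonstrate: a software “random” number generator seeded with the same starting value will reproduce exactly the same sequence every time, which is why such output is called pseudorandom.

```python
import random

# Two pseudorandom generators seeded with the same value produce
# identical "random" sequences -- the output is fully determined
# by the seed, not by chance.
a = random.Random(42)
b = random.Random(42)

seq_a = [a.randint(0, 99) for _ in range(5)]
seq_b = [b.randint(0, 99) for _ in range(5)]

print(seq_a == seq_b)  # True: guess the seed and you can predict the rest
```

A physical source such as Lam’s vacuum noise has no seed to guess, which is what makes it attractive for cryptography.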

Since only physical processes can produce true randomness, radioactive decay has become a popular source of data. However, Lam notes that “to get numbers large enough for some purposes the radioactivity can be lethal”. Moreover, any radioactive source declines with time, creating a bias in the numbers.

Lam’s solution involves the quantum fluctuations observed in true vacuums, where photons spontaneously appear and disappear without pattern. This vacuum noise imposes an ultimate limit on the speed of fibre optic communication. “While it has always been an annoyance that engineers and scientists would like to circumvent, we instead exploited this vacuum noise and used it to generate random numbers,” Lam says.

Lam’s solution uses a standard laser and a beam splitter. The beams were brought back together in such a way that one was subtracted from the other. “This might be expected to produce a zero result, but the photons appearing on the path produce a random deviation from zero.”

The numbers generated have been put on the internet. No two people downloading random numbers will get the same set, and the university hopes to commercialise the technology, selling the data to those in need of noise.

Nitrous Oxide Record Revealed
A record of nitrous oxide levels in the southern atmosphere will help track the sources of the heat-trapping and ozone-depleting gas.

Nitrous oxide is a greenhouse gas, and one of the few naturally produced gases to remove ozone from the stratosphere. Emissions have increased, particularly from microbes feeding on fertilisers, and Dr David Etheridge of CSIRO Marine and Atmospheric Research says concentrations are now 20% higher than before 1750.

Different sources of nitrous oxide have different isotopic signatures. In a paper published in Nature Geoscience, Etheridge and his co-authors note: “When fertilizer is plentiful, the enzyme kinetics of the microbial nitrous oxide production processes favour 14N.”

However, nitrous oxide exists in such small quantities in the atmosphere that determining the isotopic ratios from ice cores represents a major challenge. “As well as the long-term trend there are very subtle seasonal variations,” Etheridge says.

To construct a record of the changes – both in concentration and the ratio of 14N to heavier isotopes – the researchers started with air samples collected at Cape Grim in Tasmania. “People realised there were things we needed to measure in the atmosphere that we couldn’t at the time,” Etheridge says, “and this is a perfect example of that”.

In order to extend the record back before Cape Grim’s foundation, the researchers removed air from the upper layers of Antarctic ice. Etheridge explains that these layers contain stagnant air in open pores. This air is easier to access in the quantities required for this research than older air bubbles, and has allowed the team to extend the record back to 1940.

Etheridge says changes to soil management and the timing of fertiliser application can reduce the production of nitrous oxide, and this can have commercial, as well as environmental, benefits. “No one wants to waste fertiliser,” he says.

The record he has helped to produce will assist in establishing the scale of the challenge.

Carbon Capture Prototype Demonstrated
A CSIRO program to capture carbon dioxide from flue gas at coal-fired power stations has achieved absorption rates of 85%, while also capturing other pollutants such as sulfur dioxide. While less than 200 kg of CO2 was captured per hour, the plants have demonstrated their technical viability – although large cost reductions are required for commercial uptake.

Post-combustion carbon capture (PCC) plants were established at Munmorah in NSW and Tarong in Queensland. While carbon dioxide is already captured on some sites in North America and re-injected both to prevent it from reaching the atmosphere and to force more fuel from the ground, there are no large-scale integrated PCC plants in operation.

Dr Paul Feron of CSIRO Energy Technology says the Australian program brought flue gas into contact with a liquid absorbent spread over a packing material. The absorbent captured the carbon dioxide, which was released upon heating to produce a mixture of water vapour and carbon dioxide. When cooled, the absorbent can be reused.

“Given the small amounts of CO2 we were capturing it was released back into the flue gas duct,” Feron says. “I’m involved in another project in China where the carbon dioxide is purified and sold for beverages. On this small scale it is not economic to store it.”

CSIRO’s laboratory research included the design of liquid absorbents that combine the two most needed features: high absorption rates and low energy requirements to release the captured gas. The researchers designed the amines they thought would best suit this role but, as these were not commercially available at the time, set about synthesising them.

An alternative approach to carbon capture uses membranes that will let through smaller gases while leaving larger molecules behind. “We’re keen to have as many technologies as possible developed and available to maximise results,” Feron says. “Currently, membranes need to be scaled up maybe 1000 times to be viable, whereas liquid absorbents need to increase 10–20 times.”

The energy required to recycle absorbents would currently reduce a power station’s efficiency by 30% to capture 90% of the gas. Part of this energy is required to compress the gas to the point where it can be transported and stored.

“Once the technology is established, the costs of installing and operating a PCC system will fall substantially,” Feron says. “We need to capture 100,000 tonnes to store underground to demonstrate the total CCS chain and to streamline the process of scaling up.”

See-Through Solar Cells
Flat screen TVs may be huge energy guzzlers, but they could usher in a world where the windows in office buildings generate electricity.

Dr Mark Bissett’s PhD at Flinders University was devoted to the idea of producing solar cells from carbon nanotubes. Besides being potentially much cheaper than silicon solar panels, Bissett’s products let through the light they do not turn into electricity, making them suitable for use as windows, particularly in glass office towers.

“It’s basically like tinting the windows except they’re able to produce electricity, and considering office buildings don’t have a lot of roof space for solar panels it makes sense to utilise the many windows they do have instead,” Bissett says.

“A solar cell is created by taking two sheets of electrically conductive glass and sandwiching a layer of functionalised single-walled carbon nanotubes between the glass sheets. When light shines on the cell, electrons are generated within the carbon nanotubes and these can be used to power electrical devices.”

Previously the cost of the electrically conductive glass sheets alone would have been a major obstacle to the viability of this idea, but Bissett says the use of this glass in LCD computer and TV screens has created “massive infrastructure to produce them”. Moreover, such a large industry has driven the search for cheaper alternatives, with Bissett making use of one possibility: fluorine-doped tin oxide.

Wires can be connected to each glass sheet to form a circuit, although Bissett warns: “For up-scaling to commercial application it would require further optimisation of these contacts to fit into a window frame”.

The idea of electricity-producing windows is not new, but Bissett says that a layer of nanotubes need only be “a few hundred nanometers thick” while other materials must be more than a micron thick.

Nanotubes are also well-studied for other purposes, which Bissett says has led to modifications that can increase their efficiency. “We were the first in the world to try it so it’s pretty exciting that we’ve proved the concept, and hopefully it will be commercially available in a few years’ time.”

Nanotubes naturally absorb ultraviolet light, so the cells would provide UV protection as a bonus. “We have shown through our research that the efficiency of the devices can be greatly improved over carbon nanotubes by themselves by functionalising the nanotubes with other organic light-absorbing molecules such as dyes or porphyrins, analogous to artificial photosynthesis,” Bissett says.

Despite these improvements Bissett’s cells collect less than 1% of the energy falling on them. However, he predicts this will soon reach 10%, which is comparable to what is currently achieved by dye-sensitised solar cells in the laboratory (AS, March 2006, pp.10–11), and close behind commercial silicon solar panels.

Cancer Connection to Fatty Acids Becomes Clearer
Some new pieces have been added to the puzzle connecting cancer with diet, particularly the consumption of fatty acids.

“There are some reports from places where people eat a lot of fish that say that at autopsy there is not much difference in the incidence of cancer, but they have a much lower rate of dissemination,” says Prof Michael Murray of the University of Sydney.

Murray hastens to add that some studies do not find this but says: “Whenever there is a study that finds a positive correlation I think that should be studied in other ways”.

Murray investigated the spread of cancer cells in tissue culture and found that these seem to be blocked by omega-3 fatty acids, which are found in high quantities in fish. On the other hand, omega-6 fatty acids, which are more common in vegetable oils and poultry, appear to promote the spread of cancer.

“The Western diet has an omega-6 to omega-3 ratio of something like 15:1 when it should be 1:1,” Murray says, suggesting that if his laboratory results apply in vivo then modern diets make it far easier for primary cancers to metastasise and spread to other parts of the body.

Murray has observed that the promotion of metastasis appeared driven not by the omega-6 acids themselves but by unique epoxides, which are smaller molecules produced within cells from the omega-6 chain.

“I wondered if the omega-3 epoxides do the same thing in reverse,” Murray says. “I tried to capture the epoxides produced by omega-3s.”

Indeed it appears that omega-3 epoxides prevent metastasis in malignant breast tumours. “There is an extra double bond that, when you have something forming at the site, it stops metastasis in its tracks.”

Murray notes that attempts to shift diets have proven hard work, and instead is seeking a pill that, by using the epoxides rather than the original omega-3s, will prove potent enough to fight the spread of cancer at manageable doses.

The imbalance of omega-6 to omega-3 in the modern diet is causing widespread problems, and in some areas there is debate as to whether a similar imbalance in the other direction would be equally problematic. “I can’t answer whether an omega-3 excess could be an issue,” Murray says, “but I see no evidence that omega-3s on their own are pro-metastatic the way omega-6s are”.

Pregnant Women Not Getting the Listeria Message
Many women who are pregnant or seeking to become pregnant continue to eat foods with a high risk of Listeria poisoning. Those who avoid such foods, however, often have nutritional deficiencies that could be particularly problematic during pregnancy. Study author Prof Clare Collins of the Hunter Medical Research Institute believes this may reflect a failure to communicate the message of what is a healthy diet for pregnant women.

“This is quite a dilemma,” says Collins. “It is important for pregnant women to achieve a balance between an adequate intake of nutrients such as folate, iron, zinc and protein, and reducing their risk of listeriosis.”

Listeria monocytogenes can thrive on raw meat and vegetables, soft cheeses and salads that are improperly prepared or stored too long. While the symptoms of infection can be unpleasant for anyone, pregnant women and the elderly are particularly vulnerable, with a high risk of miscarriage among the former.

Collins looked at a survey of 9000 women of childbearing age and calculated a theoretical listeriosis exposure score based on what they were eating. A follow-up survey revealed pregnancy outcomes 3 years later, and Collins found that the 20% with the highest scores had a greater rate of miscarriage.
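A theoretical exposure score of this kind can be sketched as a weighted sum over reported food intakes. The food categories and risk weights below are purely illustrative assumptions, not figures from Collins’ study:

```python
# Hypothetical risk weights per weekly serving of each food category.
# These categories and weights are illustrative only, not from the study.
RISK_WEIGHTS = {
    "soft_cheese": 3.0,
    "deli_meat": 2.5,
    "pre_packaged_salad": 2.0,
    "raw_seafood": 2.0,
    "home_cooked_meal": 0.0,  # low-risk foods contribute nothing
}

def exposure_score(weekly_servings: dict) -> float:
    """Sum risk-weighted servings; foods not in the table score zero."""
    return sum(RISK_WEIGHTS.get(food, 0.0) * n
               for food, n in weekly_servings.items())

print(exposure_score({"soft_cheese": 2, "deli_meat": 1}))  # 8.5
```

Ranking the survey respondents by such a score is what lets the top 20% be compared against the rest for pregnancy outcomes.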

On the other hand, women with low listeriosis exposure scores were often not getting enough folate, zinc, iron and calcium.

Collins notes that women in the study who were pregnant, trying to get pregnant or had recently had children had the same listeriosis exposure as other women their age, suggesting that the message about its dangers is not getting through.

Collins believes this may be a by-product of confusing messages that emphasise both what should be eaten for nutritional value and what should be avoided. She suggests that distributing a single document may help, one that suggests healthy replacements for risky foods, such as home-cooked meat or seafood rather than pre-packaged or deli alternatives.

“Antenatal programs at hospitals provide access to accredited dieticians,” Collins says, but she also suggests the need for increased public health campaigns about appropriate foods and their preparation. “Pregnancy is not the time to be making a batch of something on Sunday and still eating it on Friday, unless it can be frozen and thoroughly cooked,” she says.

Old Poles Spark Bushfire Risk
Ageing wooden power poles are a fire hazard but Dr Sachin Pathak, formerly of RMIT’s School of Electrical and Computer Engineering, is helping power companies to select the most vulnerable examples for replacement.

The build-up of salt or chemical pollution can allow current to leak from powerlines in the presence of moisture. The leaking current can make its way around an insulating cap and into the pole, in some cases setting the wood on fire.

Heavy rain will wash the electrolytes away, so Pathak says periods of extended dry weather followed by light rain or high humidity are particularly dangerous. These often coincide with conditions in which, should a pole catch fire, surrounding bushland is especially likely to ignite.

“My study proved conclusively that leakage current performance of wooden structures deteriorates with age,” Pathak says. “Given that 70% of the 8.5 million wooden poles in service as part of the electricity distribution infrastructure in Australia are over 35 years old, these findings are significant.”

The poles become more dangerous, Pathak says, not because the electrical current increases but because the more porous old timber is more likely to catch alight.

South Australians have famously adopted the concrete and metal Stobie pole in response to their shortage of appropriate timber, but Pathak says that while this may eliminate pole fires, the extra cost makes it uncompetitive in most environments.

Instead, he advocates modelling where the danger is highest, based on a combination of the ages of the poles, climatic conditions and the presence of salt or pollution. Using this information, power companies can identify the poles in the most urgent need of replacement.

“I hope my recommendations not only reduce the number of wooden pole fires, but also help to save lives and millions of dollars in the process,” Pathak says.

Homes Approved Despite Climate Risk
New houses are being built in areas under threat from sea level rise as councils struggle to come to grips with the challenges that climate change will create. Moreover, coastal councils are trapped between the dangers of liability if they let homes be built in the wrong place, and the threat of lawsuits if they refuse.

“Of the estimated 711,000 homes located in the coastal zone, up to 35% are at risk of inundation under a sea level rise scenario of 1.1 metres,” says A/Prof Nicole Gurran, acting head of Urban and Regional Planning at the University of Sydney. Gurran is the lead author of a report commissioned by the National Sea Change Task Force, a collaboration of non-metropolitan coastal councils.

Many of the fastest growing areas of the country are particularly vulnerable to sea level rise, including Victoria’s Bass Coast, much of South-East and Far North Queensland and the Central Coast of NSW.

Gurran says that the quality of information available to councils has improved significantly in recent years, with national maps showing the areas under threat from small oceanic rises. Nevertheless, councils are often lacking more detailed charts, although some have been funded to produce their own by federal or state programs or with support from local government associations.

However, having this information does not eliminate the problems. “Council staff also reported local community ‘pushback’ and scepticism towards climate change and potential impacts for coastal regions,” Gurran says. “In some cases, landholders and developers have become more vocal in opposing development restrictions associated with climate risk management.”

For small councils the threat of legal action, particularly from large developers, can be a serious worry. “Even if they are found to have made the right decision it can be expensive,” Gurran says. While no legal challenges have been launched questioning the climate science, she is aware of approximately 20 cases where councils were sued over the way they applied the research.

Allowing householders to build at their own risk is not an option, Gurran says, because when a house is in danger of flooding, people will take defensive action that can cause damaging effects on neighbouring land.