
Saturday, November 28, 2009

Arctic Heats Up More Than Other Places: High Sea Level Rise Predicted


A new comprehensive scientific synthesis of past Arctic climates demonstrates for the first time the pervasive nature of Arctic climate amplification: the Arctic warms more strongly than the rest of the globe. As a result, glacier and ice-sheet melting, sea-ice retreat, coastal erosion and sea level rise can be expected to continue.
The U.S. Geological Survey led the new assessment, a synthesis of published scientific literature authored by a team of climate scientists from academia and government. The U.S. Climate Change Science Program commissioned the report, which has contributions from 37 scientists from the United States, Germany, Canada, the United Kingdom and Denmark.
The new report also makes several conclusions about the Arctic:
Taken together, the size and speed of the summer sea-ice loss over the last few decades are highly unusual compared with events of the past several thousand years, especially considering that changes in Earth's orbit over this time have made sea-ice melting less, not more, likely.
Sustained warming of at least a few degrees (more than approximately 4° to 13°F above average 20th century values) is likely to be sufficient to cause the nearly complete, eventual disappearance of the Greenland ice sheet, which would raise sea level by several meters.
The current rate of human-influenced Arctic warming is comparable to peak natural rates documented by reconstructions of past climates. However, some projections of future human-induced change exceed documented natural variability.
The past tells us that when thresholds in the climate system are crossed, climate change can be very large and very fast. We cannot rule out that human-induced climate change will trigger such events in the future.
"By integrating research on the past 65 million years of climate change in the entire circum-Arctic, we have a better understanding on how climate change affects the Arctic and how those effects may impact the whole globe," said USGS Director Mark Myers. "This report provides the first comprehensive analysis of the real data we have on past climate conditions in the Arctic, with measurements from ice cores, sediments and other Earth materials that record temperature and other conditions."

Carbon Emissions Linked To Global Warming In Simple Linear Relationship

These findings will be published in the next edition of Nature, to be released on June 11, 2009.
Until now, it has been difficult to estimate how much climate will warm in response to a given carbon dioxide emissions scenario because of the complex interactions between human emissions, carbon sinks, atmospheric concentrations and temperature change. Matthews and colleagues show that despite these uncertainties, each emission of carbon dioxide results in the same global temperature increase, regardless of when or over what period of time the emission occurs.
These findings mean that we can now say: if you emit a tonne of carbon (as carbon dioxide), it will lead to 0.0000000000015 degrees of global temperature change. If we want to restrict global warming to no more than 2 degrees, we must restrict total carbon emissions – from now until forever – to little more than half a trillion tonnes of carbon, or about as much again as we have emitted since the beginning of the industrial revolution.
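A minimal sketch of this proportionality (assuming the roughly 1.5 degrees per trillion tonnes of carbon implied by the figures above; the function and values are illustrative, not taken from the paper):

    # Linear carbon-climate response: warming scales with cumulative
    # carbon emissions, regardless of when they occur.
    CCR = 1.5e-12  # deg C per tonne of carbon (ratio implied above; assumed)

    def warming_deg_c(cumulative_tonnes_carbon):
        """Global temperature change for a given cumulative emission."""
        return CCR * cumulative_tonnes_carbon

    print(warming_deg_c(0.5e12))  # ~0.75 deg C from the ~half-trillion tonnes emitted so far
    print(warming_deg_c(1.0e12))  # ~1.5 deg C after emitting "as much again"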
"Most people understand that carbon dioxide emissions lead to global warming," says Matthews, "but it is much harder to grasp the complexities of what goes on in between these two end points. Our findings allow people to make a robust estimate of their contribution to global warming based simply on total carbon dioxide emissions."
In light of this study and other recent research, Matthews and a group of international climate scientists have written an open letter calling on participants of December's Conference of the Parties to the U.N. Framework Convention on Climate Change to acknowledge the need to limit cumulative emissions of carbon dioxide so as to avoid dangerous climate change.

Abrupt Climate Change: Will It Happen This Century?

"Abrupt" changes can occur over decades or less, persist for decades more, and cause substantial disruptions to human and natural systems.
A new report, based on an assessment of published science literature, makes the following conclusions about the potential for abrupt climate changes from global warming during this century.
Climate model simulations and observations suggest that rapid and sustained September Arctic sea-ice loss is likely in the 21st century.
The southwestern United States may be beginning an abrupt period of increased drought.
It is very likely that the northward flow of warm water in the upper layers of the Atlantic Ocean, which has an important impact on the global climate system, will decrease by approximately 25-30 percent. However, it is very unlikely that this circulation will collapse or that the weakening will occur abruptly during the 21st century and beyond.
An abrupt change in sea level is possible, but predictions are highly uncertain due to shortcomings in existing climate models.
There is unlikely to be an abrupt release of methane, a powerful greenhouse gas, to the atmosphere from deposits in the earth. However, it is very likely that the pace of methane emissions will increase.
The U.S. Geological Survey led the new assessment, which was authored by a team of climate scientists from the federal government and academia. The report was commissioned by the U.S. Climate Change Science Program with contributions from the National Oceanic and Atmospheric Administration and the National Science Foundation.
"This report was truly a collaborative effort between world renowned scientists who provided objective, unbiased information that is necessary to develop effective adaptation and mitigation strategies that protect our livelihood," said USGS Director Mark Myers. "It summarizes the scientific community's growing understanding regarding the potential for abrupt climate changes and identifies areas for additional research to further improve climate models."
Further research is needed to improve our understanding of the potential for abrupt changes in climate. For example, the report's scientists found that processes such as interaction of warm ocean waters with the periphery of ice sheets and ice shelves have a greater impact than previously known on the destabilization of ice sheets that might accelerate sea-level rise.
http://www.climatescience.gov/default.php

In The Warming West, Climate Most Significant Factor In Fanning Wildfires' Flames


"We found that what matters most in accounting for large wildfires in the Western United States is how climate influences the build up—or production—and drying of fuels," said Jeremy Littell, a research scientist with the University of Washington's Climate Impacts Group and lead investigator of the study. "Climate affects fuels in different ecosystems differently, meaning that future wildfire size and, likely, severity depends on interactions between climate and fuel availability and production."
To explore climate-fire relationships, the scientists used fire data from 1916 to 2003 for 19 ecosystem types in 11 Western States to construct models of total wildfire area burned. They then compared these fire models with monthly state divisional climate data.
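A minimal sketch of the kind of statistical model described here, with invented data standing in for the 1916-2003 fire and climate records (the coefficients and noise are illustrative assumptions, not the study's results):

    # Sketch: regress log(annual area burned) on climate covariates.
    import numpy as np

    rng = np.random.default_rng(0)
    n_years = 88                                   # 1916-2003
    temp = rng.normal(0.0, 1.0, n_years)           # standardized temperature
    precip = rng.normal(0.0, 1.0, n_years)         # standardized precipitation
    log_area = 0.5 * temp - 0.7 * precip + rng.normal(0.0, 0.3, n_years)

    # Ordinary least squares: warm, dry years should burn more area.
    X = np.column_stack([np.ones(n_years), temp, precip])
    coef, *_ = np.linalg.lstsq(X, log_area, rcond=None)
    print(coef)  # intercept ~0, positive temp and negative precip effects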
The study confirmed what scientists have long observed: that low precipitation and high temperatures dry out fuels and result in significant fire years, a pattern that dominates the northern and mountainous portions of the West. But it also provided new insight on the relationship between climate and fire, such as Western shrublands' and grasslands' requirement for high precipitation one year followed by dry conditions the next to produce fuels sufficient to result in large wildfires.
The study revealed that climate influences the likelihood of large fires by controlling the drying of existing fuels in forests and the production of fuels in more arid ecosystems. The influence of climate leading up to a fire season depends on whether the ecosystem is more forested or more like a woodland or shrubland.
"These data tell us that the effectiveness of fuel reductions in reducing area burned may vary in different parts of the country," said David L. Peterson, a research biologist with the Forest Service's Pacific Northwest Research Station and one of the study's authors. "With this information, managers can design treatments appropriate for specific climate-fire relationships and prioritize efforts where they can realize the most benefit."
Findings from the study suggest that, as the climate continues to warm, more area can be expected to burn, at least in northern portions of the West, corroborating what researchers have projected in previous studies. In addition, cooler, wetter areas that are relatively fire-free today, such as the west side of the Cascade Range, may be more prone to fire by mid-century if climate projections hold and weather becomes more extreme.

Scientists argue for a new type of climate target

"The implications are that global emissions must peak around 2015 and be cut by roughly half between the peak and the year 2030," Steffen Kallbekken, scientist at CICERO, said.
In a new paper in Nature Reports Climate Change, Steffen Kallbekken, Nathan Rive, Glen P. Peters and Jan S. Fuglestvedt of the CICERO Center for International Climate and Environmental Research -- Oslo argue for a new type of climate target to be considered:
"Focusing climate policy on a long-term target, such as the EU 2-degree target, provides limited guidance for mitigation over the next few decades, and gives the impression that there is time to delay," said Steffen Kallbekken.
The researchers propose that, in addition to a long-term cumulative emissions budget, a maximum limit on the rate of warming should also be considered as an element in the design of climate policies.
Required mitigation rates are 4-8 percent per year, far exceeding anything achieved in history.
"A short-term target provides clearer guidance on mitigation in the near term, limits potentially dangerous rates of warming, and allows easier inclusion of potent and toxic short-lived climate components," Kallbekken said.
"A short-term cumulative emissions target, for example 190 GtC for the period 2010-2030, is a useful approach to limit the rate of warming, while at the same time keeping the focus on what matters in the long term: reducing CO2 emissions."

Including Environmental Data Improves Effectiveness Of Invasive Species Range Predictions

Inés Ibáñez of the University of Michigan and her colleagues examined not only historical and current climatic data, but also historical environmental information from both the native and invaded ranges of three New England invasive plants: Japanese barberry, bittersweet and winged euonymus (or burning bush). The models took into account human development, disturbances and agricultural land use; habitat measures of local ground cover, such as forest type and wetlands type, were also included.
The researchers found that although climate plays a large role in predicting invasive species distribution, the inclusion of land use and habitat data improves the explanatory power of their models. In some instances, the combination of an unfavorable climate with suitable landscape cover increased the probability of species establishment. Conversely, some areas with favorable climates became less suitable once their unfavorable habitat data were included.
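A minimal sketch of a distribution model of this kind, combining a climate score with a land-cover score in a logistic model (the coefficients are invented for illustration and are not the authors' fitted values):

    # Sketch: establishment probability from climate plus land cover.
    import math

    def p_establish(climate_score, landcover_score):
        """Toy logistic model of invasive establishment probability."""
        z = -1.0 + 1.5 * climate_score + 1.2 * landcover_score
        return 1.0 / (1.0 + math.exp(-z))

    # Unfavorable climate but suitable land cover, and vice versa:
    print(round(p_establish(-0.5, 1.0), 2))  # ~0.37: land cover partly compensates
    print(round(p_establish(1.0, -1.0), 2))  # ~0.33: poor habitat drags a good climate down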
Most importantly, the researchers write, their models can be modified and used in other systems to predict biological invasions anywhere in the world.

Paleoecologists Offer New Insight Into How Climate Change Will Affect Organisms

According to Booth and his colleagues, one of the biggest challenges facing ecologists today is trying to predict how climate change will affect the distribution of organisms in the future. Combining the environmental conditions that allow a particular species to exist with the output from climate models is a commonly used approach to determining where these conditions will exist in the future. However, according to the authors, there are some potential problems with the correlational approach that ecologists have traditionally used.
"This traditional prediction approach on its own is insufficient," said Booth. "It needs to be integrated with mechanistic and dynamic ecological modeling and systematic observations of past and present patterns and dynamics."
The paper uses examples from recent paleoecological studies to highlight how climate variability of the past has affected the distributions of tree species, and even how events that occurred many centuries ago still shape present-day distribution patterns. For example, the authors note that some populations of a Western US tree species owe their existence to brief periods of favorable climatic conditions that allowed colonization in the past, such as a particularly wet interval during the 14th century.
"The climate system varies at all ecologically relevant time scales," said Booth. "We see differences year to year, decade to decade, century to century and millennia to millennia. When trying to understand how species and populations will respond to changing climate, it's not just changes in the mean climate state that need to be considered, but also changes in variability "
The article was written by Stephen Jackson of the Department of Botany and Program in Ecology at the University of Wyoming, Julio Betancourt of the U.S. Geological Survey in Arizona, Robert Booth of the Department of Earth and Environmental Sciences at Lehigh University, and Stephen Gray of the Wyoming Water Resources Data System and Wyoming State Climate Office of the University of Wyoming. It was published on Sept. 23, 2009.

Plants Could Override Climate Change Effects On Wildfires


Philip Higuera of Montana State University and his colleagues show that although changing temperatures and moisture levels set the stage for changes in wildfire frequency, they can often be trumped by changes in the distribution and abundance of plants. Vegetation plays a major role in determining the flammability of an ecosystem, he says, potentially dampening or amplifying the impacts that climate change has on fire frequencies.
"Climate is only one control of fire regimes, and if you only considered climate when predicting fire under climate-change scenarios, you would have a good chance of being wrong," he says. "You wouldn't be wrong if vegetation didn't change, but the greater the probability that vegetation will change, the more important it becomes when predicting future fire regimes."
Higuera and his colleagues examined historical fire frequency in northern Alaska by analyzing sediments at the bottom of lakes. Using meter-long samples, called sediment cores, Higuera and his colleagues measured changes in the abundance of preserved plant parts, such as pollen, to determine the types of vegetation that dominated the landscape during different time periods in the past. Like rings in a tree, different layers of sediment represent different times in the past.
The researchers used radiocarbon dating to determine the sediment's age, which dates as far back as 15,000 years. They then measured charcoal deposits in the sediment to determine fire frequency during time periods dominated by different vegetation. Finally, they compared their findings to known historical climate changes.
In many cases, the authors discovered, changes in climate were less important than changes in vegetation in determining wildfire frequency. Despite a transition from a cool, dry climate to a warm, dry climate about 10,500 years ago, for example, the researchers found a sharp decline in the frequency of fires. Their sediment cores from that time period revealed a vegetation change from flammable shrubs to fire-resistant deciduous trees, a trend which Higuera thinks was enough to offset the direct effects of climate on fire frequencies.
"In this case, a warmer climate was likely more favorable for fire occurrence, but the development of deciduous trees on the landscape offset this direct climatic effect. Consequently, we see very little fire," Higuera says.
Similarly, during the development of the modern spruce-dominated forest about 5000 years ago, temperatures cooled and moisture levels increased, which – considered alone – would create unfavorable conditions for frequent fires. Despite this change, the authors observed an increase in fire frequency, a pattern they attribute to the high flammability of the dense coniferous forests.
Higuera thinks this research has implications for predictions of modern-day changes in fire regimes based on climate change. These findings, Higuera says, emphasize that predicting future wildfire frequency shouldn't hinge on the direct impacts of climate change alone.
"Climate affects vegetation, vegetation affects fire, and both fire and vegetation respond to climate change," he says. "Most importantly, our work emphasizes the need to consider the multiple drivers of fire regimes when anticipating their response to climate change."

Monday, November 23, 2009

New climate treaty could put species at risk, scientists argue

A team of eleven of the world's top tropical forest scientists, coordinated by the University of Leeds, warn that while cutting clearance of carbon-rich tropical forests will help reduce climate change and save species in those forests, governments could risk neglecting other forests that are home to large numbers of endangered species.
Under new UN Framework Convention on Climate Change (UNFCCC) proposals, the Reduced Emissions from Deforestation and Degradation (REDD) scheme would curb carbon emissions by financially rewarding tropical countries that reduce deforestation.
Governments implicitly assume that this is a win-win scheme, benefiting climate and species. Tropical forests contain half of all species and half of all carbon stored in terrestrial vegetation, and their destruction accounts for 18% of global carbon emissions.
However, in a paper published in the latest issue of Current Biology, the scientists warn that if REDD focuses solely on protecting forests with the greatest density of carbon, some biodiversity may be sacrificed.
"Concentrations of carbon density and biodiversity in tropical forests only partially overlap," said Dr Alan Grainger of the University of Leeds, joint leader of the international team. "We are concerned that governments will focus on cutting deforestation in the most carbon-rich forests, only for clearance pressures to shift to other high biodiversity forests which are not given priority for protection because they are low in carbon."
"If personnel and funds are switched from existing conservation areas they too could be at risk, and this would make matters even worse."
If REDD is linked to carbon markets then biodiversity hotspot areas -- home to endemic species most at risk of extinction as their habitats are shrinking rapidly -- could be at an additional disadvantage, because of the higher costs of protecting them.
According to early estimates, up to 50% of tropical biodiversity hotspot areas could be excluded from REDD for these reasons. Urgent research is being carried out across the world to refine these estimates.
Fortunately, the UN Framework Convention on Climate Change is still negotiating the design of REDD and how it is to be implemented.
The team is calling for rules to protect biodiversity to be included in the text of the Copenhagen Agreement. It also recommends that the Intergovernmental Panel on Climate Change give greater priority to studying this issue, and to producing a manual to demonstrate how to co-manage ecosystems for carbon and biodiversity services.
"Despite the best of intentions, mistakes can easily happen because of poor design" said Dr Grainger. "Clearing tropical forests to increase biofuel production to combat climate change is a good example of this. Governments still have time at Copenhagen to add rules to REDD to ensure that it does not make a similar mistake. A well designed REDD can save many species and in our paper we show how this can be done."

Blue Energy Seems Feasible And Offers Considerable Benefits

On 3 November, Jan Post presented this research in his doctoral dissertation defence at Wageningen University.
The principle of generating electricity by mixing salt and fresh water, taking advantage of the difference in charge that results, has been known for more than 100 years. It was first tested in practice in a laboratory in the 1950s. There are two methods for generating blue energy: pressure-retarded osmosis and reverse electrodialysis.
Post focused his research mainly on the latter, because it is the more attractive method of generating energy from sea and river water. With his research into the practical applicability, techniques and preconditions for large-scale energy generation from salinity gradients, he was the first to demonstrate that very high yields are possible. In the laboratory, more than 80% of the energy in salinity gradients can be recovered; the technically feasible yield would be 60-70%, with the economically feasible yield a little lower than that.
There are differences among continents: the technical potential in Australia (65%) or Africa (61%) is greater than in South America (47%). There are also considerable differences between rivers -- there are 5472 large rivers worldwide. These differences depend on the salt concentration in the rivers and seas, temperature, and environmental factors. The Rhine is one of the most 'energetic' rivers in Europe.
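For a sense of the energy involved, the theoretical yield from mixing one cubic metre of river water with seawater can be estimated from the osmotic pressure of seawater (a sketch using the van 't Hoff approximation; the concentration and temperature are typical assumed values, not figures from the dissertation):

    # Sketch: salinity-gradient energy per m^3 of fresh water.
    R = 8.314       # J/(mol K), gas constant
    T = 293.0       # K, about 20 deg C (assumed)
    c_nacl = 500.0  # mol/m^3, ~0.5 M NaCl seawater (assumed)
    i = 2           # van 't Hoff factor: NaCl dissociates into two ions

    osmotic_pressure_pa = i * c_nacl * R * T   # ~2.4e6 Pa
    energy_j_per_m3 = osmotic_pressure_pa      # J available per m^3 of river water
    print(round(energy_j_per_m3 / 3.6e6, 2), "kWh per m^3")  # ~0.68

Multiplied by a large river's discharge (the Rhine's mean flow is on the order of 2,000 m³/s, an assumed round figure), this gives a theoretical potential of several gigawatts, consistent in order of magnitude with the estimates below.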
Afsluitdijk
Post investigated the possibility of recovering energy from the Rhine and the Maas rivers. He estimated the technical potential of the two rivers together at 2.4 gigawatts. He believes it would be economically feasible to recover 1.5 gigawatts, enough to supply 4 million households in the Netherlands. A power station of around 200 megawatts -- comparable with a park of 200 wind turbines -- could be placed at the Afsluitdijk (the famous Closure Dike in the northern part of the Netherlands), which, according to Post, is a rather suitable place for the large-scale trials that need to be carried out. This test location on the Afsluitdijk could be combined with the redesign of the dike that is already being planned. Heavy investment is necessary, but this type of clean energy is extremely promising and, since it is essential to look for alternatives to fossil energy, the investment would be worthwhile in every respect. It will be at least ten years before the first commercial power stations are operational, Post says.
Technological developments
Post believes that in the next few years it will be necessary to work even more intensively on two technological developments that will bring down the present, rather high, price of generating blue electricity. An appropriate membrane technology should be developed and, furthermore, such membranes should become much cheaper by introducing mass production. The technique should also be robust enough to work both when the water is polluted and when living organisms accumulate on the membranes (biofouling). His research showed that both hindrances could be removed in the future.

Volatile gas could turn Rwandan lake into a freshwater time bomb


Scientists can't say for sure if the volatile mixture at the bottom of the lake will remain still for another 1,000 years or someday explode without warning. In a region prone to volcanic and seismic activity, the fragility of Lake Kivu is a serious matter. Compounding the precarious situation is the presence of approximately 2 million people, many of them refugees, living along the north end of the lake.
An international group of researchers will meet Jan. 13-15 in Gisenyi, Rwanda, to grapple with the problem of Lake Kivu. A grant from the National Science Foundation won by Rochester Institute of Technology will fund the travel and lodging for 18 scientists from the United States to attend the three-day workshop. Anthony Vodacek, conference organizer and associate professor at RIT's Chester F. Carlson Center for Imaging Science, is working closely with the Rwandan Ministry of Education to organize the meeting.
"Rwandan universities suffered greatly in the 1994 genocide and there are few Rwandan scientists performing significant work on the lake or within the rift system," Vodacek notes. "We will work with the government to identify interested researchers."
Vodacek is convening the workshop with Cindy Ebinger, an expert in East African Rift tectonics at the University of Rochester, and Robert Hecky, an expert in limnology -- the study of lake systems -- at University of Minnesota-Duluth. Core samples Hecky took in the 1970s initially brought the safety of Lake Kivu under question.
Addressing the lake as a whole system is a new concept for the workshop participants, who will bring their expertise in volcanology, tectonics and limnology to the problem. Vodacek's goal is to prioritize research activities and improve communication between the North American, European and African collaborators.
"Most scientists are fairly in agreement that the lake is pretty stable; it's not as if its going to come bursting out tomorrow," Vodacek says. "But in such a tectonically and volcanically active area, you can't tell what's going to happen."
One of the problems with Lake Kivu is that the 1,600-foot deep lake never breathes. The tropical climate helps stagnate the layers of the lake, which never mix or turn over. In contrast, fluctuating temperatures in colder climates help circulate lake water and prevent gas build up. Lake Kivu is different from both temperate and other tropical lakes because warm saline springs, arising from ground water percolating through the hot fractured lava and ash, further stabilize the lake. Scientists at the workshop will consider how these spring inputs may vary over time under changing climates and volcanic activity.
A number of catalysts could destabilize the gas resting at the bottom of Lake Kivu. It could be an earthquake, a volcanic explosion, a landslide or even the methane mining that has recently united Rwandan and Congolese interests.
Close calls occurred in 2008 when an earthquake occurred near the lake and in 2002 when a volcanic eruption destroyed parts of Goma in the Democratic Republic of Congo, only 11 miles north of Lake Kivu. Although scientists were alarmed, neither event sufficiently disturbed the gas.
Vodacek likens the contained pressure in the lake to a bottle of carbonated soda or champagne. "In the lake, you have the carbon dioxide on the bottom and 300 meters of water on top of that, which is the cap," he says. "That's the pressure that holds it. The gas is dissolved in water."
When the cap is removed, bubbles form and rise to the surface. More bubbles form and create a column that drags the water and the gas up to the surface in a chain reaction.
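The strength of that "cap" is easy to estimate with the hydrostatic pressure formula (a sketch; the depth and density are round assumed values):

    # Sketch: pressure exerted by ~300 m of water above the gas-rich layer.
    rho = 1000.0   # kg/m^3, water density (round value; Kivu's deep water is saline)
    g = 9.81       # m/s^2, gravitational acceleration
    depth_m = 300.0

    pressure_pa = rho * g * depth_m
    print(round(pressure_pa / 101325, 1), "atm")  # ~29 atm keeping the gas dissolved

By Henry's law, the amount of gas that can stay dissolved scales roughly with this pressure, which is why dragging deep water upward in a bubble column releases it so effectively.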
"The question is, and what's really unknown, is how explosive is that?" Vodacek says.
Through his own research Vodacek plans to simulate the circulation of Lake Kivu. Modeling the circulation patterns above the layers of carbon dioxide and methane will help determine the energy required to disrupt the gas and cause Lake Kivu to explode.

Controllable Rubber Trailing Edge Flap To Reduce Loads On Wind Turbine Blades



"By providing the blade with a movable trailing edge, it is possible to control the load on the blade and extend the lifetime of the wind turbine components. This is similar to the technique used on aircraft, where flaps regulate the lift during the most critical times, such as take-off and landing," explains Helge Aagaard Madsen, Research Specialist on the project.
However, there is a difference. Whereas on aircraft the movable flaps are non-deformable elements hinged to the trailing edge of the main wing, the new technique preserves a continuous profile surface on the wind turbine blade even when the trailing edge moves. The reason for this is that the trailing edge is made of an elastic material and constitutes an integrated part of the main blade.
Robust design of rubber
In 2004 Risø DTU applied for the first patent for this basic technique of designing a flexible, movable trailing edge for a wind turbine blade. Since then there has been a significant development with regard to the project. By means of so-called "Gap-funding" provided by the Ministry of Science, Technology and Innovation and by the local Region Zealand it has been possible to develop such ideas into a prototype stage.
Part of the research has been aimed at the design and development of a robust controllable trailing edge. This has now led to the manufacturing of a trailing edge of rubber with built-in cavities that are fibre-reinforced. The cavities in combination with the directional fibre reinforcement provide the desired movement of the trailing edge, when the cavities are being put under pressure by air or water.
"In this project a number of different prototypes have been manufactured, with a chord length of 15 cm and a length of 30 cm. The best version shows very promising results in terms of deflection and in terms of the speed of the deflection," says Helge Aagaard.
The size of the prototype fits a blade airfoil section with a chord of one metre, and such a blade section is now being produced and is going to be tested inside a wind tunnel.
The capability of the trailing edge to control the load on the blade section is going to be tested in a wind tunnel. This part of the development process is supported by GAP-funding from Region Zealand.
"If the results confirm our estimated performance, we will test the rubber trailing edge on a full-scale wind turbine within a few years," says Helge Aagaard.

Dutch Electricity System Can Cope With Large-scale Wind Power


Wind is variable and can only partially be predicted. The large-scale use of wind power in the electricity system is therefore tricky. PhD candidate Bart Ummels investigated the consequences of using a substantial amount of wind power within the Dutch electricity system. He used simulation models, such as those developed by transmission system operator TenneT, to pinpoint potential problems (and solutions).
His results indicate that wind power requires greater flexibility from existing power stations. Sometimes larger reserves are needed, but more frequently power stations will have to decrease production in order to make room for wind-generated power. It is therefore essential to continually recalculate the commitment of power stations using the latest wind forecasts. This reduces potential forecast errors and enables wind power to be integrated more efficiently.
Ummels looked at wind power up to 12 GW, 8 GW of which offshore, which is enough to meet about one third of the Netherlands' demand for electricity. Dutch power stations are able to cope at any time in the future with variations in demand for electricity and supply of wind power, as long as use is made of up-to-date, improved wind forecasts. It is TenneT's task to integrate large-scale wind power into the electricity grid. Lex Hartman, TenneT's Director of Corporate Development: "In a joint effort, TU Delft and TenneT further developed the simulation model that can be used to study the integration of large-scale wind power. The results show that in the Netherlands we can integrate between 4 GW and 10 GW into the grid without needing any additional measures."
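A toy illustration of why recommitting power stations against fresher forecasts matters (a sketch; the error magnitudes are invented for illustration and are not TenneT's figures):

    # Sketch: flexible capacity needed to cover wind forecast errors
    # shrinks as the forecast lead time gets shorter.
    wind_capacity_gw = 12.0
    error_sd_fraction = {"day-ahead (~36 h)": 0.15, "intraday (~4 h)": 0.06}  # assumed

    for horizon, sd in error_sd_fraction.items():
        reserve_gw = 3 * sd * wind_capacity_gw  # cover ~3 sigma of the error
        print(f"{horizon}: ~{reserve_gw:.1f} GW of flexibility")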
Surpluses
Ummels: ‘Instead of the common question ‘What do we do when the wind isn’t blowing?’, the more relevant question is ‘Where do we put all the electricity if it is very windy at night?’. This is because, for instance, a coal-fired power station cannot simply be turned off. One solution is provided by the international trade in electricity, because other countries often can use the surplus. Moreover, a broadening of the ‘opening hours’ of the international electricity market benefits wind power. At the moment, utilities determine one day ahead how much electricity they intend to purchase or sell abroad. Wind power can be better used if the time difference between the trade and the wind forecast is smaller.’
No energy storage
Ummels’ research also demonstrates that energy storage is not required. The results indicate that the international electricity market is a promising and cheaper solution for the use of wind power.
Making power stations more flexible is also better than storage. The use of heating boilers, for instance, means that combined heat and power plants operate more flexibly, which can consequently free up capacity for wind power at night.
The use of wind power in the Dutch electricity system could lead to a reduction in production costs of EUR 1.5 billion annually and a reduction in CO2 emissions of 19 million tons a year.

Robot Inspects Wind Energy Converters


It works reliably and on its own. Nimbly and quickly, it pulls itself up a rope meter by meter until it reaches a wind energy converter's giant rotor blades. Then it goes to work. It thoroughly inspects every centimeter of the rotor blades' surface. Nothing escapes it. It registers any crack and any delamination in the material and relays their exact positions. In this job, a robot is superior to humans.
The researchers at the Fraunhofer Institute for Factory Operation and Automation IFF are experts in robotics – whether for cleaning facades, inspecting sewer lines or assisting humans. Their latest helper is RIWEA, a robot that inspects the rotor blades of wind energy converters. Primarily made of glass fiber reinforced plastics, rotor blades have to withstand a great deal: wind, inertial forces, erosion, etc. Until now, humans have inspected wind energy converters at regular intervals – not an easy job. After all, the technicians must closely examine large surfaces – a rotor blade can be up to 60 meters long – at dizzying heights. "Our robot is not just a good climber," says Dr. Norbert Elkmann, project manager at the Fraunhofer IFF and coordinator of the joint project. "It is equipped with a number of advanced sensor systems. This enables it to inspect rotor blades closely." Are there cracks in the surface? Are the bonded joints and laminations in order? Is the bond with the central strut damaged?
The inspection system consists of three elements: an infrared radiator conducts heat to the surface of the rotor blades; a high-resolution thermal camera records the temperature pattern and thus registers flaws in the material. In addition, an ultrasonic system and a high-resolution camera are also on board, enabling the robot to detect damage that would remain hidden to the human eye. A specially developed carrier system ensures that the inspection robot is guided securely and precisely along the surface of a rotor blade. "It is a highly complex platform with sixteen degrees of freedom, which can autonomously pull itself up ropes," explains Elkmann. The advantage of this system: it can perform its job on any wind energy converter – regardless of whether it is large or small, on land or offshore. The robot always delivers an exact log of the rotor blades' condition, keeping humans safe and not missing any damage.

New Rechargeable Lithium Batteries Could Jump-start Hybrid Electric Car Efficiency

As concern grows about climate change, a range of 'green technologies' are being developed to help reduce carbon emissions.
Hybrid petrol/electric cars that use conventional metal-hydride batteries are already available but they are heavy and the cars have limited power.
Professor Saiful Islam, of the Department of Chemistry at the University of Bath, is researching new materials to use in rechargeable lithium batteries, similar to those that have helped to power the worldwide 'portable revolution' in mobile phones, laptops and MP3 players. For hybrid cars, new materials are crucial to make the batteries lighter, safer and more efficient in storing energy.
Professor Islam's research, which recently won the Fuel Cell Science & Technology Award from the Royal Society of Chemistry, will be presented at the Sustainable Energy and the Environment research showcase on September 17 at the University of Bath, alongside other cutting-edge research from across the region.
"Hybrid electric cars such as the Toyota Prius rely on petrol engines, with their batteries being charged by the waste energy from braking. These cars provide better fuel economy for urban driving than a conventional car," explained Professor Islam.
"Developing new materials holds the key to lighter and more efficient rechargeable batteries for hybrid electric cars, reducing our use of fossil fuels and cutting carbon emissions."
The showcase will be opened by David Willetts MP, Shadow Secretary of State for Innovation, Universities and Skills, and will be attended by key industrialists, research councils, local and national government officials and other key stakeholders from across the South West.
The exhibition also coincides with the launch of the new Institute for Sustainable Energy and the Environment (I-SEE) at the University of Bath. This will bring together experts from diverse fields of science, engineering, social policy and economics to tackle the problems posed by global warming.
Professor Islam added: "I-SEE reflects the growing focus on 'green technology' at the University, which is a major centre for sustainable energy and chemical research."
The showcase event on 17 September will also feature exhibitions from other researchers from the University on subjects such as affordable solar cells and hydrogen fuel production.

Sunday, November 22, 2009

Accelerated Melting Of Continental Icepacks Is Major Reason For Rise In Sea Level Between 2003 And 2008

This question was resolved thanks to data from the French-American Satellite Jason-1, from two satellites of the GRACE space gravimetry mission and from the buoys of the Argo system. These results have been published online on the website of the journal Global and Planetary Change.
Between 1993 and 2003, the global mean sea level, measured very accurately by the French-American Topex/Poséidon satellite and its successor Jason-1, showed a relatively constant rise of 3 mm/yr. The last IPCC report, published in 2007, showed that more than half of this rise (approximately 1.5 mm/yr) was due to the expansion of sea water as it warmed (the steric contribution), while 1.2 mm/yr resulted from the loss of mass of polar ice sheets and mountain glaciers. Since 2003, however, the situation has changed: a quite rapid rise (2.5 mm/yr) in sea level is still observed, but over the same period the warming of the oceans has reached a plateau, accounting for a rise of only 0.4 mm/yr.
Thermal expansion was calculated using two independent methods:
The Argo network of buoys transmits water temperature and salinity profiles across all of the world's oceans. Since 2003, the analysis of all relevant data in the topmost 900 meters of sea water resulted in a steric contribution of about 0.4 mm/yr.
This value was independently confirmed from space, by calculating the difference between the sea level observed by the altimeters on Topex/Poséidon and Jason-1 and the increase in ocean mass measured by GRACE. The satellites indicate a steric contribution of 0.3 mm/yr, which is very similar to the value from the Argo buoys.
Consequently, it is above all the increase in the mass of sea water rather than its heat content that is behind the rise in sea level that has been observed since 2003. The increase in the mass of the oceans is equivalent to a rise of 1.9 mm/yr of the mean sea level. What is the source of this extra water in the oceans? Melting continental ice sheets. Data from GRACE has made it possible to measure changes in the mass of the two polar ice sheets in Antarctica and Greenland. These were responsible for a 1 mm/yr increase in sea level (i.e. twice as much as in the previous decade). For mountain glaciers, the most recent estimates from glaciologists show a contribution of 1.1 mm/yr (also higher than during previous years).
Thus, losses from glacial masses can easily account for why the mass of sea water is increasing, and they are responsible for 80% of the average rise in sea level in recent years. Given the accelerated melting of glaciers and polar ice sheets, if the steric contribution returned to the values of the 1990s, a rise in sea level of around 4 mm/yr could not be excluded.
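The budget quoted here can be checked by simple addition (a sketch using the article's rounded numbers; the GRACE-based mass estimate of 1.9 mm/yr sits close to the 2.1 mm/yr residual):

    # Sketch: sea-level budget for 2003-2008, in mm/yr.
    observed_rise = 2.5
    steric = 0.4        # thermal expansion (Argo; altimetry minus GRACE)
    ice_sheets = 1.0    # Greenland + Antarctica (GRACE)
    glaciers = 1.1      # mountain glaciers (glaciological estimates)

    mass_residual = observed_rise - steric
    print(round(mass_residual, 1))                  # 2.1 mm/yr implied by the budget
    print(round(ice_sheets + glaciers, 1))          # 2.1 mm/yr from direct ice estimates
    print(round(mass_residual / observed_rise, 2))  # 0.84 -> ~80% of the rise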
Notes:
LEGOS
Collecte Localisation Satellite

Mediterranean Sea Dried Up Five Million Years Ago

Much like a mattress springs back into shape after you get off it, the Earth’s crust moves upwards when sea levels fall. Known as isostasy, this phenomenon explains how the Mediterranean Sea was sealed off from the Atlantic Ocean five million years ago. This ‘dam’ would remain in place for 170,000 years. Much like today, the rate of evaporation in the Mediterranean Sea five million years ago greatly exceeded the incoming flow of water. As no more water was introduced via the Straits of Gibraltar, the water evaporated and the Mediterranean Sea dried up completely.
Restoration of the connection with the Atlantic Ocean
After being separated for 170,000 years, the Mediterranean Sea and the Atlantic Ocean were once again connected. Govers believes that the movement of the Earth’s crust played a crucial role. The African Plate subducts under the Eurasian Plate beneath Gibraltar and the weight of the subducting edge of the African Plate may have pulled the entire region downwards. Govers submits CT scans of the inner layers of the Earth’s crust and measurements of gravitational forces as evidence: both the scans and the measurements indicate the presence of a heavy mass up to 400 kilometres beneath the area.

Layers Of Bottom Sediment Reveal Secrets Of Environmental Changes In The Baltic Sea

“The area of research extends from the marine environment of Skagerrak to the almost fresh water of the Northern Baltic Sea. By studying the bottom sediment, we’re aiming to obtain information on the natural variations in the environmental conditions of the Baltic Sea and on the effect of human activity on environmental changes,” says Research Professor Aarno Kotilainen of the Geological Survey of Finland, who is coordinating the project.
Climatic conditions affect the temperature, salinity and changes of current in the Baltic Sea. They regulate such things as the salt water pulses that occasionally flow from the North Sea to the Baltic Sea. The eco-system and environmental conditions of the Baltic Sea are influenced both by local climate and that of the North-East Atlantic. This project coordinated by the Geological Survey of Finland is studying Baltic surface- and deep water conditions and their temporal variation, by looking at the layers of sediment on the seabed, using multivariate analysis.
By modelling, the project also aims to forecast the effects of climate change on the Baltic Sea. “A deeper understanding of the factors affecting the long-term changes in the Baltic Sea and of possible future changes is important. This knowledge is needed to support planning for the sustainable use of the marine regions and in preparation for the effects of climate change,” summarises Professor Kotilainen. In addition to the Geological Survey of Finland and the Department of Geology at the University of Helsinki, other participants in the research come from Russia, Germany, Denmark, Sweden, Poland and Norway.
Research funding organisations from the nine Baltic Sea nations are behind the BONUS programme, which was launched at the beginning of this year. The study is also being funded by the EU Commission. The Finnish funding organisation is the Academy of Finland. At the first stage of the research programme, decisions were made to fund 16 research projects with a total of 22 million euros, with more than 100 research institutes and universities from the Baltic Sea countries taking part. Finland is coordinating four of these projects. Total project funding will be approximately 60 million euros between 2010 and 2016.

Black Sea Pollution Could Be Harnessed As Renewable Future Energy Source

The waters of the Black Sea contain very little oxygen. As such, the rare forms of life that live in the depths of the inland sea, so-called extremophile bacteria, survive by metabolising sulfate in the water. The sulfate fulfils a similar biochemical role to oxygen in respiration for these microbes allowing them to release the energy they need to live and grow from the nutrients they absorb from the water.
With organic matter and waste pouring into the Black Sea from waterways running off 17 countries, the Black Sea has a serious environmental contamination problem. Mehmet Haklidir of the TUBITAK Marmara Research Center in Gebze-Kocaeli and Füsun Servin Tut Haklidir of COWI SNS Ltd in Gayrettepe-Istanbul, Turkey, suggest that with a little of the right chemistry this problem could be recast as an environmental solution.
The Black Sea has a layer some 50 metres thick that lies between the anaerobic and aerobic water at a depth of about 200 metres along its axis. As such it represents a vast untapped fuel reserve. The total hydrogen sulfide production in the sediments of the sea is estimated at about 10,000 tonnes per day and this figure is continually rising. That equates to potentially well over 500 tonnes of daily hydrogen gas production.
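The hydrogen figure follows from simple stoichiometry, assuming every molecule of hydrogen sulfide could be dissociated (a sketch):

    # Sketch: daily hydrogen yield from dissociating H2S (H2S -> H2 + S).
    h2s_tonnes_per_day = 10_000.0
    molar_mass_h2s = 34.08  # g/mol
    molar_mass_h2 = 2.016   # g/mol

    h2_tonnes_per_day = h2s_tonnes_per_day * molar_mass_h2 / molar_mass_h2s
    print(round(h2_tonnes_per_day))  # ~592 t/day, i.e. "well over 500 tonnes"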
The researchers explain that what is now required is the development of a safe and energy-efficient method for collecting the hydrogen sulfide from the Black Sea. In addition, there is a need to find effective catalysts and to build solar energy plants that could be used to quickly dissociate the hydrogen from the sulfide, leaving just residual sulfur, which has industrial applications in the rubber and pharmaceutical industries.

Sea stars bulk up to beat the heat

"Sea stars were assumed to be at the mercy of the sun during low tide," said the study's lead author, Sylvain Pincebourde of François Rabelais University in Tours, France. "This work shows that some sea stars have an unexpected back-up strategy."
The research is published in the December issue of The American Naturalist.
Sea stars need to endure rapid changes in temperature. During high tide, they are fully submerged in cool sea water. But when the tide recedes, the stars are often left on rocky shorelines, baking in the sun.
Clearly the stars had some way of beating the heat, but scientists were unsure how they did it. Pincebourde and his team thought it might have something to do with fluid-filled cavities found in the arms of sea stars. So he set up an experiment to test it.
The researchers placed sea stars in aquariums and varied the water level to simulate tidal patterns. Heat lamps were used to control temperature, with some stars experiencing hotter temperatures than others. The researchers found that stars exposed to higher temperatures at low tide had higher body mass after the high tide that followed. Since the stars were not allowed to eat, the increased mass must be from soaking up water.
"This reservoir of cool water keeps the sea star from overheating when the tide recedes again the next day, a process called 'thermal inertia,'" Pincebourde said.
What appears to be happening, the researchers say, is that a hot low tide serves as a cue telling the star to soak up more water during the next high tide. And the amount of water the stars can hold is remarkable.
"It would be as if humans were able to look at a weather forecast, decide it was going to be hot tomorrow, and then in preparation suck up 15 or more pounds of water into our bodies," said co-author Brian Helmuth of the University of South Carolina in Columbia.
The researchers are concerned, however, that climate change may put this novel cooling strategy in peril.
"This strategy only works when the sea water is colder than the air," said co-author Eric Sanford of the University if California, Davis. "Ocean warming might therefore break down this buffering mechanism, making this sea star susceptible to global warming. There are likely limits to how much this mechanism can buffer this animal against global change."

Hidden Costs Of Energy Production And Use

Requested by Congress, the report assesses what economists call external effects caused by various energy sources over their entire life cycle -- for example, not only the pollution generated when gasoline is used to run a car but also the pollution created by extracting and refining oil and transporting fuel to gas stations. Because these effects are not reflected in energy prices, government, businesses and consumers may not realize the full impact of their choices. When such market failures occur, a case can be made for government interventions -- such as regulations, taxes or tradable permits -- to address these external costs, the report says.
The committee that wrote the report focused on monetizing the damage of major air pollutants -- sulfur dioxide, nitrogen oxides, ozone, and particulate matter -- on human health, grain crops and timber yields, buildings, and recreation. When possible, it estimated both what the damages were in 2005 (the latest year for which data were available) and what they are likely to be in 2030, assuming current policies continue and new policies already slated for implementation are put in place.
The committee also separately derived a range of values for damages from climate change; the wide range of possibilities for these damages made it impossible to develop precise estimates of cost. However, all model results available to the committee indicate that climate-related damages caused by each ton of CO2 emissions will be far worse in 2030 than now; even if the total amount of annual emissions remains steady, the damages caused by each ton would increase 50 percent to 80 percent.
Damages From Electricity Generation
Coal accounts for about half the electricity produced in the U.S. In 2005 the total annual external damages from sulfur dioxide, nitrogen oxides, and particulate matter created by burning coal at 406 coal-fired power plants, which produce 95 percent of the nation's coal-generated electricity, were about $62 billion; these nonclimate damages average about 3.2 cents for every kilowatt-hour (kwh) of energy produced. A relatively small number of plants -- 10 percent of the total number -- accounted for 43 percent of the damages. By 2030, nonclimate damages are estimated to fall to 1.7 cents per kwh.
Coal-fired power plants are the single largest source of greenhouse gases in the U.S., emitting on average about a ton of CO2 per megawatt-hour of electricity produced, the report says. Climate-related monetary damages range from 0.1 cents to 10 cents per kilowatt-hour, based on previous modeling studies.
Burning natural gas generated far less damage than coal, both overall and per kilowatt-hour of electricity generated. A sample of 498 natural gas fueled plants, which accounted for 71 percent of gas-generated electricity, produced $740 million in total nonclimate damages in 2005, an average of 0.16 cents per kwh. As with coal, there was a vast difference among plants; half the plants account for only 4 percent of the total nonclimate damages from air pollution, while 10 percent produce 65 percent of the damages. By 2030, nonclimate damages are estimated to fall to 0.11 cents per kwh. Estimated climate damages from natural gas were half that of coal, ranging from 0.05 cents to 5 cents per kilowatt-hour.
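The per-kilowatt-hour figures imply the underlying generation totals, which makes for a quick consistency check (a sketch using the report's rounded numbers):

    # Sketch: back out implied 2005 generation from the damage totals.
    coal_damages_usd = 62e9      # nonclimate damages, 406 coal plants
    coal_cents_per_kwh = 3.2
    gas_damages_usd = 740e6      # nonclimate damages, 498 gas plants
    gas_cents_per_kwh = 0.16

    coal_twh = coal_damages_usd / (coal_cents_per_kwh / 100) / 1e9
    gas_twh = gas_damages_usd / (gas_cents_per_kwh / 100) / 1e9
    print(coal_twh)  # 1937.5 -> ~1.9 trillion kWh of coal generation covered
    print(gas_twh)   # 462.5 -> ~0.46 trillion kWh of gas generation covered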
The life-cycle damages of wind power, which produces just over 1 percent of U.S. electricity but has large growth potential, are small compared with those from coal and natural gas. So are the damages associated with normal operation of the nation's 104 nuclear reactors, which provide almost 20 percent of the country's electricity. But the life cycle of nuclear power does pose some risks; if uranium mining activities contaminate ground or surface water, for example, people could potentially be exposed to radon or other radionuclides; this risk is borne mostly by other nations, the report says, because the U.S. mines only 5 percent of the world's uranium. The potential risks from a proposed long-term facility for storing high-level radioactive waste need further evaluation before they can be quantified. Life-cycle CO2 emissions from nuclear, wind, biomass, and solar power appear to be negligible when compared with fossil fuels.
Damages From Heating
The production of heat for buildings or industrial processes accounts for about 30 percent of American energy demand. Most of this heat energy comes from natural gas or, to a lesser extent, the use of electricity; the total damages from burning natural gas for heat were about $1.4 billion in 2005. The median damages in residential and commercial buildings were about 11 cents per thousand cubic feet, and the proportional harm did not vary much across regions. Damages from heat in 2030 are likely to be about the same, assuming the effects of additional sources to meet demand are offset by lower-emitting sources.
Damages From Motor Vehicles And Fuels
Transportation, which today relies almost exclusively on oil, accounts for nearly 30 percent of U.S. energy demand. In 2005 motor vehicles produced $56 billion in health and other nonclimate-related damages, says the report. The committee evaluated damages for a variety of types of vehicles and fuels over their full life cycles, from extracting and transporting the fuel to manufacturing and operating the vehicle. In most cases, operating the vehicle accounted for less than one-third of the quantifiable nonclimate damages, the report found.
Damages per vehicle mile traveled were remarkably similar among various combinations of fuels and technologies -- the range was 1.2 cents to about 1.7 cents per mile traveled -- and it is important to be cautious in interpreting small differences, the report says. Nonclimate-related damages for corn grain ethanol were similar to or slightly worse than gasoline, because of the energy needed to produce the corn and convert it to fuel. In contrast, ethanol made from herbaceous plants or corn stover -- which are not yet commercially available -- had lower damages than most other options.
Electric vehicles and grid-dependent (plug-in) hybrid vehicles showed somewhat higher nonclimate damages than many other technologies for both 2005 and 2030. Operating these vehicles produces few or no emissions, but producing the electricity to power them currently relies heavily on fossil fuels; also, energy used in creating the battery and electric motor adds up to 20 percent to the manufacturing part of life-cycle damages.
Most vehicle and fuel combinations had similar levels of greenhouse gas emissions in 2005. There are not substantial changes estimated for those emissions in 2030; while population and income growth are expected to drive up the damages caused by each ton of emissions, implementation of new fuel efficiency standards of 35.5 miles per gallon will lower emissions and damages for every vehicle mile traveled. Achieving significant reductions in greenhouse gas emissions by 2030 will likely also require breakthrough technologies, such as cost-effective carbon capture and storage or conversion of advanced biofuels, the report says.
Both for 2005 and 2030, vehicles using gasoline made from oil extracted from tar sands and those using diesel derived from the Fischer-Tropsch process -- which converts coal, methane, or biomass to liquid fuel -- had the highest life-cycle greenhouse gas emissions. Vehicles using ethanol made from corn stover or herbaceous feedstock such as switchgrass had some of the lowest greenhouse gas emissions, as did those powered by compressed natural gas.
Fully implementing federal rules on diesel fuel emissions, which require vehicles beginning with model year 2007 to use low-sulfur diesel, is expected to substantially decrease nonclimate damages from diesel by 2030 -- an indication of how regulatory actions can significantly affect energy-related damages, the committee said. Major initiatives to further lower other emissions, improve energy efficiency, or shift to a cleaner mix of energy sources could reduce other damages as well, for example by substantially lowering the damages attributable to electric vehicles.
The report was sponsored by the U.S. Department of the Treasury. The National Academy of Sciences, National Academy of Engineering, Institute of Medicine, and National Research Council make up the National Academies.
Hidden Costs Of Energy Production And Use

Is The Dead Sea Dying? Levels Dropping At Alarming Rate

The study shows that the drop in the Dead Sea's water level is not the result of climate change; rather, it is due to ever-increasing human water consumption in the area. Any projected Dead Sea-Red Sea or Mediterranean-Dead Sea channel would therefore need a significant carrying capacity to refill the Dead Sea to its former level, so that it can sustainably generate electricity and produce freshwater by desalinization.
Normally, the water levels of closed lakes such as the Dead Sea reflect climatic conditions: they are the result of the balance between water running into the lake from the tributary area plus direct precipitation, minus evaporation. In the case of the Dead Sea, the change in water level is instead due to intensive human water consumption from the Jordan and Yarmouk Rivers for irrigation, as well as the use of Dead Sea water for the potash industry by both Israel and Jordan. Over the last 30 years, this water consumption has caused an accelerated decrease in water level (0.7 m per year), volume (0.47 km³ per year) and surface area (4 km² per year), according to the study.
Abu Ghazleh and colleagues developed a model of the surface area and water volume of the Dead Sea and found that the lake has lost 14 km³ of water over the last 30 years. The receding water has left leveled sections on the lake's sides -- erosional terraces -- which the authors recorded precisely for the first time using Differential Global Positioning System (DGPS) field surveys. They were able to date the terraces to specific years.
The authors point out that this rapid drop in the level of the Dead Sea has a number of detrimental consequences: higher pumping costs for the factories using the Dead Sea to extract potash, salt and magnesium; an accelerated outflow of fresh water from surrounding aquifers; receding shorelines that make it difficult for tourists to access the water for medicinal purposes; and the creation of a treacherous landscape of sinkholes and mud, caused by the dissolution of buried salt, which severely damages roads and other civil engineering structures.
To address the mounting stress on water resources in the Dead Sea basin and the environmental hazards caused by its lowering, the authors suggest replacing the diversion of Jordan River water with desalinization of seawater on the Mediterranean coast. This would considerably slow the recession of the Dead Sea and buy time to consider long-term alternatives such as the Red Sea-Dead Sea Channel or the Mediterranean-Dead Sea Channel.
The authors conclude that either of these channels would require a carrying capacity of more than 0.9 km³ per year to slowly refill the lake to its level of 30 years ago and to ensure its long-term sustainability for energy production and desalinization. Such a channel would also sustain the tourism and potash industries on both sides of the Dead Sea.
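As a rough cross-check of these figures (our own arithmetic, not the study's), the reported loss rate reproduces the roughly 14 km³ deficit, and treating the net refill rate as channel capacity minus the current loss rate gives a sense of how slowly the lake would recover:

```python
# Cross-check of the Dead Sea figures quoted above. Assumes, for simplicity,
# that net refill = channel inflow - current loss rate; the study's actual
# water balance is more detailed.

LOSS_RATE = 0.47        # km³ per year, from the study
YEARS = 30
CHANNEL_CAPACITY = 0.9  # km³ per year, the study's suggested minimum

deficit = LOSS_RATE * YEARS
print(f"Cumulative loss: {deficit:.1f} km³")   # ~14.1 km³, matching the ~14 km³ reported

net_refill = CHANNEL_CAPACITY - LOSS_RATE      # simplifying assumption
print(f"Refill time: {deficit / net_refill:.0f} years")  # ~33 years -- hence 'slowly refill'
```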
Is The Dead Sea Dying? Levels Dropping At Alarming Rate

Plugging Into An Electric Vehicle Revolution


CSIRO engineers have modified the PHEVs to carry a 30 Ah NiMH battery capable of holding about 6 kWh of charge, together with a battery charger, allowing the cars to plug into and charge from the grid or from on-site renewable energy sources.
CSIRO Energy Transformed Flagship scientist Dr Phillip Paevere said the road trial is collecting extensive information on how the existing PHEV technology could be used for a new application: using the car as a large mobile battery which can be integrated and used in the home.
"The PHEVs have been fitted with instruments which will monitor the travel patterns of different users, and the residual battery power left in the car at the end of the day, which could be available for other uses," Dr Paevere said.
"When not needed, the parked car in the driveway could potentially become a large battery store and energy source for the house, running appliances or storing off-peak or surplus electricity generated from on-site renewable generators, such as solar panels."
SP AusNet spokesperson Sean Sampson said the trial will also allow thorough analysis of the likely electricity demands when PHEVs are connected to the network for charging.
"The introduction of electric vehicles into the mainstream market could have a significant impact on the electricity network," Mr Sampson said.
"They may also dramatically affect the output at residential and retail outlets and the forecasted growth of peak and base demands."
The transport sector accounts for 14 per cent of Australia's total greenhouse gas emissions.
PHEVs have the potential to reduce our emissions and may also provide a way to manage peak demand on the electricity grid.
By controlling when PHEVs are recharging from the electricity network the burden of demand can be shifted.
Furthermore, the car battery can be drawn upon to provide power during peak periods of demand, prevent blackouts when there is a network supply interruption and assist in maintaining the overall stability of the network.
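A minimal sketch of that load-shifting idea follows. The demand profile, thresholds and charge rate are invented for illustration; they are not CSIRO trial parameters:

```python
# Toy load-shifting rule: charge the parked PHEV when household grid demand
# is low, discharge it back to the house during the evening peak.

HOURLY_DEMAND_KW = [0.4]*6 + [1.2]*4 + [0.8]*7 + [2.5]*4 + [0.6]*3  # toy 24h profile
CHARGE_BELOW_KW = 0.7     # assumed off-peak threshold
DISCHARGE_ABOVE_KW = 2.0  # assumed peak threshold

battery_kwh, capacity_kwh, rate_kw = 0.0, 6.0, 2.0  # assumed ~6 kWh usable pack

for hour, demand in enumerate(HOURLY_DEMAND_KW):
    if demand < CHARGE_BELOW_KW and battery_kwh < capacity_kwh:
        battery_kwh = min(capacity_kwh, battery_kwh + rate_kw)   # soak up off-peak power
        action = "charge"
    elif demand > DISCHARGE_ABOVE_KW and battery_kwh > 0:
        battery_kwh = max(0.0, battery_kwh - rate_kw)            # shave the evening peak
        action = "discharge"
    else:
        action = "idle"
    print(f"{hour:02d}:00 demand={demand:.1f} kW -> {action}, stored={battery_kwh:.1f} kWh")
```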
The road trial is the first phase in understanding the potential for using PHEVs in Australian homes.
The PHEV technology will also be used in the home energy system of CSIRO's Zero Emission House (AusZEH) project.
Plugging Into An Electric Vehicle Revolution

Key Step Made Towards Turning Methane Gas Into Liquid Fuel

Methane, the primary component of natural gas, is plentiful and is an attractive fuel and raw material for chemicals: it burns more efficiently than oil, produces less pollution and could serve as a practical substitute for petroleum-based fuels until renewable fuels are widely available.
However, methane is difficult and costly to transport because it remains a gas at temperatures and pressures typical on the Earth's surface.
Now UNC and UW scientists have moved closer to devising a way to convert methane to methanol or other liquids that can easily be transported, especially from the remote sites where methane is often found. The finding is published in the Oct. 23 issue of the journal Science.
Methane is valued for its high-energy carbon-hydrogen bonds, which consist of a carbon atom bound to four hydrogen atoms. The gas does not react easily with other materials and so it is most often simply burned as fuel. Burning breaks all four hydrogen-carbon bonds and produces carbon dioxide and water, said Karen Goldberg, a UW chemistry professor.
Converting methane into useful chemicals, including readily transported liquids, currently requires high temperatures and a lot of energy. Catalysts that turn methane into other chemicals at lower temperatures have been discovered, but they have proven to be too slow, too inefficient or too expensive for industrial applications, Goldberg said.
Binding methane to a metal catalyst is the first step required to selectively break just one of the carbon-hydrogen bonds in the process of converting the gas to methanol or another liquid. In their paper, the researchers describe the first observation of a metal complex (a compound consisting of a central metal atom connected to surrounding atoms or molecules) that binds methane in solution. This compound serves as a model for other possible methane complexes. In the complex, the methane's carbon-hydrogen bonds remained intact as they bound to a rare metal called rhodium.
The work should spur further advances in developing catalysts to transform methane into methanol or other liquids, Goldberg said, although she noted that actually developing a process and being able to convert the gas into a liquid chemical at reasonable temperatures still is likely some distance in the future.
"The idea is to turn methane into a liquid in which you preserve most of the carbon-hydrogen bonds so that you can still have all that energy," she said. "This gives us a clue as to what the first interaction between methane and metal must look like."
Maurice Brookhart, a UNC chemistry professor, said carbon-hydrogen bonds are very strong and hard to break, but in methane complexes breaking the carbon-hydrogen bond becomes easier.
"The next step is to use knowledge gained from this discovery to formulate other complexes and conditions that will allow us to catalytically replace one hydrogen atom on methane with other atoms and produce liquid chemicals such as methanol," Brookhart said.
The lead author of the paper is Wesley Bernskoetter of Brown University, who did the work while at UNC. Goldberg, Brookhart and Cynthia Schauer, associate chemistry professor at UNC, are co-authors.
The work comes out of a major National Science Foundation-funded collaboration, the UW-based Center for Enabling New Technologies Through Catalysis, which involves 13 universities and research centers in the United States and Canada, including UNC. Additional funding came from the National Institutes of Health.
The center, directed by Goldberg, is aimed at finding efficient, inexpensive and environmentally friendly ways to produce chemicals and fuels.
Key Step Made Towards Turning Methane Gas Into Liquid Fuel

Climate Scientists Uncover Major Accounting Flaw In Kyoto Protocol And Other Climate Legislation

Current carbon accounting, used in the Kyoto Protocol and other climate legislation including the European Union's cap-and-trade law and the American Clean Energy and Security Act, does not count CO2 released from tailpipes and smokestacks that burn bioenergy, nor does it count emissions resulting from land-use changes when biomass is harvested or grown. This, the scientists say, erroneously treats all uses of bioenergy as carbon neutral, regardless of the source of the biomass, and could create strong economic incentives for large-scale land conversion as countries around the world tighten carbon caps.
"The error is serious, but readily fixable," said Timothy Searchinger, a research scholar and lecturer in public and international affairs at Princeton University's Woodrow Wilson School and at the Princeton Environmental Initiative. He also is a fellow with the German Marshall Fund of the United States.
"As we approach the most important climate treaty negotiations in history, it is vital that technologies, such as biofuels, that are proposed as solutions to global warming, are properly evaluated," said team member Daniel Kammen, a University of California, Berkeley, professor of energy and resources and of public policy, who directs the campus's? Renewable and Appropriate Energy Laboratory and the Transportation Sustainability Research Center. "Our paper builds on recent work on the direct and indirect land use impacts of biofuels, and clarifies how the accounting should be done."
The burning of bioenergy and fossil energy releases comparable amounts of carbon dioxide from tailpipes or smokestacks, but bioenergy use may reduce emissions overall if the biomass results from additional plant growth. This is because plants grown specifically for bioenergy absorb carbon dioxide from the atmosphere, and this offsets the emissions from the eventual burning of the biomass for energy.
On the other hand, burning forests releases stored carbon into the atmosphere in the same way as burning oil releases carbon stored for millions of years underground. For these reasons, the greenhouse gas consequences of using bioenergy vary greatly with the source of the biomass.
Unfortunately, Kammen said, the accounting rules used in the Kyoto Protocol, the European Union's Emissions Trading System, and in the climate bill that recently passed the U.S. House of Representatives, exempt the carbon dioxide emitted by bioenergy, regardless of the source of the biomass. That legally makes bioenergy from any source, even that generated by clearing the world's forests, a potentially cheap, yet false, way to reduce greenhouse gas emissions by oil companies, power plants and industry as they face tighter pollution limits.
According to a number of studies, including one by a U.S. Department of Energy lab, applying this incentive globally could lead to the loss of most of the world's natural forests as carbon caps tighten.
The Science article, co-authored by Searchinger, Kammen and 11 others, explains that the error stems from a misapplication of guidelines established by the Intergovernmental Panel on Climate Change (IPCC) at the time of the Kyoto Protocol.
According to the IPCC, exempting carbon dioxide from bioenergy use is appropriate only if an accounting system also counts emissions from clearing land and other land use activities. In that way, if biomass for energy use results in deforestation, the emissions are counted as land use emissions. The exemption of carbon dioxide from bioenergy use is inappropriate, however, for laws and treaties that do not legally limit emissions from deforestation and other land use activities. Neither the protocol nor the existing or proposed climate legislation in Europe and the U.S. applies limits to emissions from land use. Because these laws nevertheless exempt all emissions from bioenergy use, the authors warn, they can create large, perverse incentives to clear land.
This error in the system for administering carbon caps is distinct from other laws that require minimum quantities of biofuels. Many of these other laws do account for at least some of the emissions from land use activities.
According to the authors, the solution is to count all emissions from energy use, whether from fossil fuels or bioenergy, and then to develop a system to credit bioenergy to the extent it uses biomass derived from "additional" carbon sources, and thereby offsets energy emissions.
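In code form, the proposed fix amounts to something like the following. This is a schematic sketch with invented numbers, not the authors' formal accounting rules:

```python
# Schematic of the accounting fix: count all stack and tailpipe CO2
# (fossil and bioenergy alike), then credit only biomass that came from
# "additional" plant growth. All quantities below are illustrative.

def flawed_total(fossil_co2, bioenergy_co2, land_use_co2):
    # Current rule: bioenergy CO2 exempt, land-use change not capped.
    return fossil_co2

def corrected_total(fossil_co2, bioenergy_co2, additional_biomass_credit):
    # Proposed rule: count everything, credit only additional carbon uptake.
    return fossil_co2 + bioenergy_co2 - additional_biomass_credit

# Example: 100 Mt fossil CO2 plus 50 Mt from burning cleared-forest biomass
# (no additional growth, so no credit).
print(flawed_total(100, 50, 50))     # 100 -- the 50 Mt simply vanishes
print(corrected_total(100, 50, 0))   # 150 -- deforestation-derived biomass counted
# If the biomass had come entirely from additional growth, the credit would
# cancel it: corrected_total(100, 50, 50) == 100.
```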
Climate Scientists Uncover Major Accounting Flaw In Kyoto Protocol And Other Climate Legislation

Shifting The World To 100 Percent Clean, Renewable Energy As Early As 2030: Here Are The Numbers


To make clear the extent of those hurdles – and how they could be overcome – they have written an article in Scientific American. In it, they present new research mapping out and evaluating a quantitative plan for powering the entire world on wind, water and solar energy, including an assessment of the materials needed and costs. And it will ultimately be cheaper than sticking with fossil fuel or going nuclear, they say.
The key is turning to wind, water and solar energy to generate electrical power – making a massive commitment to them – and eliminating combustion as a way to generate power for vehicles as well as for normal electricity use.
The problem lies in the use of fossil fuel and biomass combustion, which is notoriously inefficient at producing usable energy. For example, when gasoline is used to power a vehicle, at least 80 percent of the energy in the fuel is wasted as heat.
With vehicles that run on electricity, it's the opposite. Roughly 80 percent of the energy supplied to the vehicle is converted into motion, with only 20 percent lost as heat. Other combustion devices can similarly be replaced with electricity or with hydrogen produced by electricity.
Jacobson and Delucchi used data from the U.S. Energy Information Administration to project that if the world's current mix of energy sources is maintained, global energy demand at any given moment in 2030 would be 16.9 terawatts, or 16.9 million megawatts.
They then calculated that if no combustion of fossil fuel or biomass were used to generate energy, and virtually everything was powered by electricity – either for direct use or hydrogen production – the demand would be only 11.5 terawatts. That's only two-thirds of the energy that would be needed if fossil fuels were still in the mix.
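The headline arithmetic is simple to check (the terawatt figures are from the article; the percentage is just their ratio):

```python
# Ratio behind the "two-thirds" claim. The gap is driven by the efficiency
# argument above: combustion wastes ~80% of fuel energy, while electric
# drive converts ~80% of supplied energy into motion.

BUSINESS_AS_USUAL_TW = 16.9   # projected 2030 demand, current energy mix
ALL_ELECTRIC_TW = 11.5        # projected 2030 demand, wind/water/solar + electrification

print(f"{ALL_ELECTRIC_TW / BUSINESS_AS_USUAL_TW:.0%} of business-as-usual demand")
# -> 68%, i.e. roughly the "two-thirds" quoted in the article
```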
In order to convert to wind, water and solar, the world would have to build wind turbines; solar photovoltaic and concentrated solar arrays; and geothermal, tidal, wave and hydroelectric power sources to generate the electricity, as well as transmission lines to carry it to the users, but the long-run net savings would more than equal the costs, according to Jacobson and Delucchi's analysis.
"If you make this transition to renewables and electricity, then you eliminate the need for 13,000 new or existing coal plants," Jacobson said. "Just by changing our infrastructure we have less power demand."
Jacobson and Delucchi chose to use wind, water and solar energy options based on a quantitative evaluation Jacobson did last year of about a dozen of the different alternative energy options that were getting the most attention in public and political discussions and in the media. He compared their potential for producing energy, how secure an energy source each was, and their impacts on human health and the environment.
He determined that the best overall energy sources were wind, water and solar options. His results were published in Energy and Environmental Science.
The Scientific American article provides a quantification of global solar and wind resources based on new research by Jacobson and Delucchi.
Analyzing only on-land locations with a high potential for producing power, they found that even if wind were the only method used to generate power, the potential for wind energy production is 5 to 15 times greater than what is needed to power the entire world. For solar energy, the comparable calculation found that solar could produce about 30 times the amount needed.
If the world built just enough wind and solar installations to meet the projected demand for the scenario outlined in the article, an area smaller than the borough of Manhattan would be sufficient for the wind turbines themselves. Allowing for the required amount of space between the turbines boosts the needed acreage up to 1 percent of Earth's land area, but the spaces between could be used for crops or grazing. The various non-rooftop solar power installations would need about a third of 1 percent of the world's land, so altogether about 1.3 percent of the land surface would suffice.
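To put those percentages in absolute terms, one can multiply them by Earth's land surface, roughly 149 million km² (a standard figure, not taken from the article):

```python
# Converting the article's land-area percentages into absolute areas.
EARTH_LAND_KM2 = 149e6                   # assumed total land surface, for scale

wind_spacing = 0.01 * EARTH_LAND_KM2     # 1% for turbine spacing (usable for crops/grazing)
solar = (1/3) * 0.01 * EARTH_LAND_KM2    # about a third of 1% for non-rooftop solar

print(f"Wind spacing: ~{wind_spacing/1e6:.1f} million km²")
print(f"Solar: ~{solar/1e6:.1f} million km²")
print(f"Total: ~{(wind_spacing + solar)/EARTH_LAND_KM2:.1%} of land")  # ~1.3%
```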
The study further provides examples of how a combination of renewable energy sources could be used to meet hour-by-hour power demand, addressing a commonly asked question: given the inherent variability of wind speed and sunshine, can these sources consistently produce enough power? The answer, the authors find, is yes.
Expanding the transmission grid would be critical for the shift to the sustainable energy sources that Jacobson and Delucchi propose. New transmission lines would have to be laid to carry power from new wind farms and solar power plants to users, and more transmission lines will be needed to handle the overall increase in the quantity of electric power being generated.
The researchers also determined that the availability of certain materials needed for some current technologies, such as lithium for lithium-ion batteries or platinum for fuel cells, is not currently a barrier to building a large-scale renewable infrastructure. But efforts will be needed to ensure that such materials are recycled and that potential alternative materials are explored.
Finally, they conclude that perhaps the most significant barrier to the implementation of their plan is the competing energy industries that currently dominate political lobbying for available financial resources. But the technologies being promoted by the dominant energy industries are not renewable and even the cleanest of them emit significantly more carbon and air pollution than wind, water and sun resources, say Jacobson and Delucchi.
If the world allows carbon- and air pollution-emitting energy sources to play a substantial role in the future energy mix, Jacobson said, global temperatures and health problems will only continue to increase.
Shifting The World To 100 Percent Clean, Renewable Energy As Early As 2030: Here Are The Numbers

Customizing electric cars for cost-effective urban commuting

The vehicle is part of a new research project, ChargeCar, headed by Illah Nourbakhsh, associate professor of robotics. The project is exploring how electric vehicles can be customized to cost-effectively meet an individual's specific commuting needs and how an electric vehicle's efficiency can be boosted and its battery life extended by using artificial intelligence to manage power.
"Most electric cars today are being designed with top-down engineering to match the performance of gas-powered cars," Nourbakhsh said. "Our goal is to revolutionize urban commuting by taking a different approach -- by first analyzing the needs, conditions and habits of the daily commutes of actual people and then using this 'commute ecology' to develop electric vehicles suited to each unique commute." The researchers calculate that a typical Pittsburgh commuter might save 80 percent of energy costs by switching from a gas car to an electric car.
ChargeCar isn't developing new vehicles, but rather a knowledge base that can be used to convert gas-powered vehicles using existing technology. The researchers are working with Pittsburgh mechanics to develop community-level expertise in vehicle conversion, as well as a set of conversion "recipes."
Key to the project is a vehicle architecture called smart power management, which uses artificial intelligence to manage the flow of power between conventional electric car batteries and a device called a supercapacitor. Supercapacitors are electrochemical capacitors with unusually high energy density and have typically been used to start locomotives, tanks and diesel trucks. Because it can store and rapidly release large amounts of electrical power, a supercapacitor can serve as a buffer between the battery pack and the vehicle's electric motors, improving the vehicle's responsiveness while reducing the charge/discharge cycling that shortens battery life.
"Many people have talked about using supercapacitors as buffers on a battery, but we also will use artificial intelligence to manage how power is discharged and stored," Nourbakhsh said. "Based on a driver's route and habits, the smart power management system will decide whether to draw power for the electric motors from the batteries or the supercapacitor and decide where to store electricity produced by the regenerative braking system as the car slows down or goes down a hill."
Determining the optimal means of managing power will be one of ChargeCar's primary goals. The researchers calculate that an intelligent electric car controller could recapture 48 percent of the energy during braking, and that a supercapacitor could reduce the load on the batteries by 56 percent and cut battery heating -- which shortens battery life -- by 53 percent.
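As an illustration of the kind of decision rule involved, here is a toy power-split controller: spikes of demand go to the supercapacitor, steady loads to the battery, and regenerative braking is stored in the supercapacitor first. All thresholds and capacities are invented; this is a sketch of the idea, not ChargeCar's actual controller:

```python
# Toy power-split rule for a battery + supercapacitor architecture.
# Positive demand = motor draw (kW); negative demand = regenerative braking.

def route_power(demand_kw, supercap_kwh, battery_kwh, cap_max=0.1):
    """Return (source_or_sink, updated supercap kWh, updated battery kWh)."""
    if demand_kw > 20 and supercap_kwh > 0:          # hard acceleration: buffer absorbs the spike
        return "supercap", supercap_kwh - min(supercap_kwh, demand_kw / 3600), battery_kwh
    if demand_kw > 0:                                 # steady cruising load: battery supplies it
        return "battery", supercap_kwh, battery_kwh - demand_kw / 3600
    regen = -demand_kw / 3600                         # braking: energy flows back in
    if supercap_kwh < cap_max:                        # store regen in the supercap first
        return "supercap", min(cap_max, supercap_kwh + regen), battery_kwh
    return "battery", supercap_kwh, battery_kwh + regen

state = (0.05, 10.0)  # 0.05 kWh in the supercap, 10 kWh in the battery (assumed)
for demand in [5, 30, -25, 8]:   # kW, one-second steps: cruise, accelerate, brake, cruise
    where, *state = route_power(demand, *state)
    print(f"{demand:+} kW handled by {where}")
```

A route-aware version would tune the thresholds from the driver's logged commute, which is exactly the "commute ecology" data the project is collecting.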
"The number one cost of electric vehicle ownership is the batteries," Nourbakhsh said. "Smart power management will save money initially because it pairs a low-cost battery pack with a small supercapacitor. And it will continue to save money by increasing efficiency and extending battery life." By customizing each vehicle to the owner's specific commute, ChargeCar will save money for some owners by allowing them to purchase the minimum number of batteries necessary.
The converted Scion xB will serve as a test bed for developing smart power management techniques, measuring battery lifetimes and refining conversion techniques.
The ChargeCar project has created a national clearinghouse for commuter data at http://chargecar.org; people across the country are invited to record their commute data via GPS and upload it to the site. The site can then use the data to show individuals the energy cost of gasoline vs. electricity for their commute, and how much wear and tear on batteries a supercapacitor could save on that commute. The researchers will use the database to help them tailor solutions to individual commutes, and they will make it available to all electric car researchers and enthusiasts.
The ChargeCar team includes Gregg Podnar, co-principal investigator with Nourbakhsh; research engineer Josh Schapiro; senior research programmer Chris Bartley; project scientist Ben Brown; Intel Labs Pittsburgh senior researcher Jason Campbell; and students Vibhav Sreekanti, Paul Dille and Matt Duescher.
Customizing electric cars for cost-effective urban commuting

Emissions increase despite financial crisis

A new study from Norwegian and New Zealand scientists provides updated numbers for CO2 emissions from fossil fuels. While the global financial crisis may have slowed emissions growth, it has not stopped it: from 2007 to 2008, global emissions from fossil fuels increased by 2.2 percent. From 2003 to 2007, fossil emissions grew by an average of 3.7 percent a year.
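Applied as compound growth from a 2003 baseline (normalized to 1.0, since this summary gives no absolute level), those rates imply:

```python
# Compound-growth check of the rates quoted above.
emissions = 1.0
for year in range(2003, 2007):
    emissions *= 1.037          # 3.7% average annual growth, 2003-2007
emissions *= 1.022              # 2.2% growth, 2007-2008
print(f"2008 emissions: {emissions:.3f}x the 2003 level")  # ~1.18x
```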

"The financial crisis started in the latter part of 2008, so the full effect of the financial crisis of CO2 emissions will most likely be on the emissions in 2009," scientist Gunnar Myhre at CICERO Center for International Climate and Environmental Research, Oslo, said.
Coal most important
According to the study, published in Environmental Research Letters, coal in 2006 surpassed oil as the largest source of CO2 emissions. Emissions from gas and oil have grown at a fairly constant rate since 1990; for coal, however, the picture is different.
"Emissions from coal have had a strong increase since 2000 and coal is now the driver of the strong fossil fuel CO2 emission growth. The main reason is increased use of coal in China, largely due to export production," Myhre said.
India coming up
For the first time, India's emissions are now increasing faster than China's.
"The growth rate of the emissions has been slightly higher in India the last two years. Still, China is by far the leading world polluter, but we can expect Indian emissions to play an increasingly important role in the future," Myhre said.
Fossil energy's role
According to the Intergovernmental Panel on Climate Change (IPCC), a large reduction in emissions from fossil sources is needed to limit global warming. The concentration of CO2 in the atmosphere has increased from 280 ppm in 1750 to 383 ppm in 2007. Around 75 percent of that increase is due to CO2 emissions from fossil energy; the remaining 25 percent is due to changes in land use.
Whereas the trend in CO2 emissions from land use over the last few decades has been relatively constant, an increasing trend in fossil fuel CO2 emissions has been reported. This increasing trend is driven by enhanced economic growth and also an increase in carbon intensity.
All main IPCC scenarios of fossil fuel CO2 emissions show an increase over the next few decades with a large spread in emissions estimates up to 2100. Future atmospheric CO2 concentrations not only depend on the emissions, but also on the net uptake of CO2 by land and ocean.
The study was conducted by Gunnar Myhre and Kari Alterskjær at Center for International Climate and Environmental Research -- Oslo (CICERO) and Dave Lowe at the National Institute of Water and Atmospheric Research in New Zealand.
Emissions increase despite financial crisis

Sweet solution to energy production

Sugarcane biomass, a significant waste product from sugar production, could be a renewable energy source for electricity production, according to research published in the international journal Progress in Industrial Ecology.

Engineer Vikram Seebaluck of the University of Mauritius and energy technologist Dipeeka Seeruttun of the Royal Institute of Technology in Stockholm, Sweden, have demonstrated that an optimal blend of 30% sugarcane agricultural residues (SARs) with 70% sugarcane bagasse (the fibrous residue left after sugar extraction) can be used to generate electricity at a cost of just 0.06 US dollars per kilowatt hour. That figure is on a par with the costs of other renewable energies, including wind power at $0.05/kWh.
Sugarcane is a giant perennial grass of the genus Saccharum found in wet and dry tropical and partially subtropical regions. It consists of an above-ground bamboo-like stalk with trash, cane tops and leaves, plus underground rhizomes and roots. Around 30 tonnes per hectare of fibre and sugarcane juice are sent to factories for sugar production, while about 24 tonnes per hectare of residues are left in the fields as waste biomass. Currently, sugarcane bagasse is burnt for onsite heat and electricity production at sugar factories, and surplus electricity is exported to the grid; the field residues, however, go unused.
This waste has a similar energy content to bagasse, Seeruttun says, which could make it technically viable to use this material together with bagasse in a more effective way for electricity production. The 30:70 mixture of waste and bagasse reduces the risk of fouling or slagging of the furnaces used to burn the material.

"The combustion of SARs for the production of electricity is technically and economically feasible and creates opportunities for increasing the renewable energy share in sugarcane-producing countries," the researchers explain.
The researchers' analysis of the economics and technology required to exploit sugarcane waste effectively suggests that bioenergy expansion from cane biomass would create rural jobs, reduce costly energy imports, and cut greenhouse gas emissions overall. Each tonne of cane biomass used in electricity generation displaces the equivalent of 230 kg of coal and avoids about 560 kg of carbon dioxide emissions.
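Scaled up to the roughly 24 tonnes per hectare of field residues mentioned earlier (our extrapolation, not a figure stated by the researchers), that works out to:

```python
# Per-hectare extrapolation of the per-tonne displacement figures above.
RESIDUE_T_PER_HA = 24   # field residues left after harvest, tonnes/hectare
COAL_KG_PER_T = 230     # coal displaced per tonne of cane biomass
CO2_KG_PER_T = 560      # CO2 avoided per tonne of cane biomass

print(f"Coal displaced: ~{RESIDUE_T_PER_HA * COAL_KG_PER_T / 1000:.1f} t/ha")  # ~5.5 t/ha
print(f"CO2 avoided:   ~{RESIDUE_T_PER_HA * CO2_KG_PER_T / 1000:.1f} t/ha")    # ~13.4 t/ha
```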
They caution that harnessing this bioenergy and biomass potential will require significant increases in investment, technology transfer and international cooperation. Nevertheless, given its high efficiency and its concentration mostly in the developing world, sugarcane biomass should be viewed as a global resource for sustainable development.
Sweet solution to energy production

Renewable Energies Will Benefit US Workers' Health, Expert Predicts

Steven Sumner, M.D., who completed the work while a medical student, along with Peter Layde, M.D., professor of population health and co-director of the Injury Research Center at the Medical College, examined occupational health risks to workers in renewable energy industries compared with fossil fuel industries. The risk of workplace injury and death among energy workers is a hidden cost of energy production, known as an externality of energy. Externalities of energy production span a whole host of problems, from damage to the general environment, to adverse effects on human health caused by pollution, to injury and death among workers in the energy sector.
Dr. Sumner, currently an internal medicine resident at Duke University, and Dr. Layde examined the human health risks associated with traditional fossil fuels, such as coal, oil, and natural gas, relative to renewable energy sources such as wind, solar, and biomass. Wind and solar energy appeared to pose less risk of workplace injury and death than traditional fossil fuel industries, as the dangerous energy extraction phase is minimized or eliminated in wind and solar energy production. Biomass, comprising biofuels, organic waste, and wood-derived fuels, currently accounts for more than half of US renewable energy consumption and does not appear to offer a significant safety benefit to US workers relative to fossil fuels.
“The energy sector remains one of the most dangerous industries for US workers. A transition to renewable energy generation utilizing sources such as wind and solar could potentially eliminate 1300 worker deaths over the coming decade,” says Dr. Sumner.
According to Dr. Layde, “Previous research on the health effects of a transition from fossil fuels to renewable energy has focused on the environmental benefits of renewable energy on air quality and global warming. The benefits of reduced workplace injury and fatality have not been sufficiently emphasized in the debate to move to renewable energies. This will be an added benefit to US energy workers with the passage of the American Recovery and Reinvestment Act of 2009.”
The researchers reviewed the occupational cost of energy production in the traditional and new energies and noted that while fossil fuel energies have historically been priced lower than renewable energies, the additional hidden costs, or externalities, of energy production -- especially adverse effects on human health -- have often not been taken into account.
The dangers to energy workers were examined at various stages of energy production: extraction, generation and distribution. The entire fuel life cycle includes fuel extraction, other raw materials extraction, structure construction, equipment manufacturing, material transport, energy generation, power distribution and by-product disposal.
Extraction
Mining, which includes coal, gas, and oil extraction from underground or underwater stores, is the second most hazardous occupation in the US, with 27.5 deaths per 100,000 workers annually, compared with an average fatality rate of 3.4 deaths per 100,000 across all US industries. Only agriculture is more dangerous, at 28.7 deaths per 100,000. Additionally, fossil fuel workers risk unintended injuries during extraction and are exposed to hazardous particles, gases and radiation.
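Expressed as relative risks (simple division of the rates quoted above):

```python
# Relative-risk arithmetic for the annual fatality rates per 100,000 workers.
RATES = {"mining (incl. oil & gas extraction)": 27.5,
         "agriculture": 28.7,
         "all US industries (average)": 3.4}

baseline = RATES["all US industries (average)"]
for sector, rate in RATES.items():
    print(f"{sector}: {rate/baseline:.1f}x the all-industry average")
# Mining comes out at ~8.1x the average -- the scale of the occupational
# externality discussed in this article.
```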
Renewable energies that eliminate the recurring extraction phase pose far less hazard, though a one-time extraction of raw materials is required to manufacture wind turbines and photovoltaic modules for wind and solar energy, respectively. Biomass, on the other hand, which includes corn farming for ethanol production, is unlikely to offer a reduction in extraction-related occupational fatalities.
Generation
The combustion required to generate energy from fossil fuels not only produces greenhouse gases and respiratory pollutants but also carries the risk of catastrophic explosions. The same holds true for biomass energy generation. In developed countries, fossil fuels are associated with more accident-related fatalities per unit of energy generated than either nuclear or hydroelectric power.
With wind and solar the possibility of a large unintentional catastrophe is limited.
Distribution
There are several ways of distributing fossil fuel and renewable energies. Highway crashes account for the greatest proportion of fatalities among oil and gas extraction workers, who are not subject to the work-hour restrictions imposed on other transportation industries. Biomass energies also rely on vehicular transportation. Both fossil fuel and wind and solar energies share a common pathway, and risk, in the transmission of electrical current via utility power lines.
The researchers concluded that available studies on the occupational health risks of energy generation have significant limitations and that more precise nationwide data for renewable energy occupations are needed. Nonetheless, the potential occupational health benefits of transitioning to renewable energies are considerable, and the safety benefits should be immediate, obvious and sizeable.
The study was partially supported by a grant from the Centers for Disease Control and Prevention.
Renewable Energies Will Benefit US Workers' Health, Expert Predicts