Carbon nanotubes roughly a hundred thousand times thinner than a human hair may have the potential to transport electricity faster and over greater distances with minimal loss of energy, according to new research to be published Friday in the October 2nd edition of the journal Science. The research was led by Honda Research Institute USA, Inc., in conjunction with researchers at Purdue University and the University of Louisville.
The findings open new possibilities for miniaturization and energy efficiency, including much more powerful and compact computers, electrodes for supercapacitors, electrical cables, batteries, solar cells, fuel cells, artificial muscles, composite material for automobiles and planes, energy storage materials and electronics for hybrid vehicles.
Carbon nanotubes are grown on the surface of metal nanoparticles, taking the cylindrical form of rolled honeycomb sheets of carbon atoms. When these tiny nanotubes exhibit metallic conductivity, they are stronger than steel, conduct electricity better than copper, conduct heat as efficiently as diamond and are as light as cotton.
“Our goal is not only the creation of new and better technologies and products, but to fulfill Honda’s commitment to environmental sustainability,” said Dr. Hideaki Tsuru, project director from Honda Research Institute USA.
Past research efforts to control the structural formation of carbon nanotubes with metallic conductivity through conventional methods achieved success rates of approximately 25–50%. Honda, which has worked in the field of carbon nanotube synthesis for almost a decade, has achieved a 91% success rate in producing nanotubes with metallic conductivity.
“This is the first report that shows we can control fairly systematically whether carbon nanotubes achieve a metallic state. Further research is in progress with the ultimate goal to take complete control over grown nanotube configurations to support their real world application,” said Dr. Avetik Harutyunyan, principal scientist from Honda Research Institute USA, and the leader of the project.
“Our finding shows that the nanotube configuration which defines its conductivity depends not only on the size of the metal nanocatalyst used to nucleate the tube as was previously believed, but importantly also is based on its shape and crystallographic structure, and we learned to control it,” said Dr. Harutyunyan, whose team of Honda scientists included Dr. Gugang Chen and Dr. Elena Pigos.
“We are excited about our teamwork and collaborations with researchers at Purdue and Louisville, who helped achieve this advance,” he said. Researchers at Purdue, led by Professor Eric Stach, used a transmission electron microscope to observe nanotube formation, revealing that changes in the gaseous environment can vary the shape of the metal catalyst nanoparticles from very sharp faceted to completely round. Researchers at Louisville, led by Professor Gamini Sumanasekera, produced the nanotubes in thin films – a few layers of carbon nanotubes – and made careful measurements to determine whether the nanotubes achieve a metallic state.
Honda’s innovative research and development efforts during the past decade have yielded such diverse outcomes as humanoid robotics, walking assist devices, HondaJet, fuel cell technology, increased rice crop yields and thin film solar cells, in addition to the design and development of automobiles, motorcycles and power equipment products. Honda has conducted consumer product related R&D in the United States since 1975 at Honda R&D Americas, Inc. Honda Research Institute USA, Inc. (HRI-US) was founded in January 2003, along with HRI-EU (Europe) and HRI-JP (Japan), to research future technologies. U.S. offices are located in California, Ohio and Massachusetts and include a computer science research division focused on human intelligence technologies and a materials science research division focused on functional nano-materials.
Honda’s Nanotube Research Opens New Design Opportunities
Ireland’s Energy Minister Eamon Ryan recently announced a major move in the electrification of Irish motoring. The Memorandum of Understanding, signed by Minister Ryan on behalf of the Government and by Padraig McManus for ESB, will create favourable conditions for the distribution of electric vehicles to the Irish market by Renault-Nissan.
In a hugely significant new collaboration between the Government, the semi-state electricity supplier ESB and car manufacturers Renault-Nissan, these electric vehicles will be on Irish roads within 2 years.
“This historic agreement”, said Minister Ryan, “is proof of Government’s firm intention to act on the electrification of transport. Some months ago, I announced the Government target of 10% electric vehicles by 2020. Today’s Memorandum of Understanding will help us not only realise, but surpass this target.
We are well on our way and our streets will see the change very shortly”.
“In November, we sent a call to the market that Ireland was ‘open for business’ on electric cars. Our call has been answered by Renault-Nissan and I’d like to welcome them to the Irish market with this new product. Today we sign what I hope will be the first of many agreements with interested companies”. The Irish Government’s intentions are not product-exclusive.
“Today’s initiative will transform our streets, will cut carbon emissions and change the face of transport in Ireland,” he said. “Again we see the ESB stepping up to the plate to secure Ireland’s future and I commend them for their vision and work in this regard”.
“This collaboration will provide the world with a model for how electric vehicles can be achieved globally. We will continue to press ahead”.
Renault-Nissan Denki concept
ESB Chief Executive Padraig McManus described today’s development as “an opportunity for Ireland to demonstrate its leadership in the green revolution, including in electric transport”.
“ESB has set out its plans to become carbon-neutral by 2035 and carbon-neutral electricity will power an emissions-free transport system. ESB will roll out a charging network to support the development. We will guarantee open access to all electricity suppliers and car manufacturers and can ensure adherence to the strictest safety standards for the recharging points”, he said.
“The roll-out of electric vehicles will provide major employment opportunities in a number of areas”, he said.
Speaking at the announcement, Andrew Palmer, Senior Vice President, Nissan Motor Company, said the Renault-Nissan Alliance looks forward to a successful partnership with Ireland.
“We regard Ireland as a leader in the EV project. Demography and political support make Ireland one of the most suitable locations for a large scale roll out of electric vehicles. Renault and Nissan are particularly pleased to be working with the Irish Government and ESB in putting in place the correct conditions to support electric transport”.
About ESB
Founded in 1927, ESB is Ireland’s leading electricity company. It is a vertically integrated utility that generates, distributes and supplies electricity in a regulated energy market. ESB Group employs approximately 6,500 people and its subsidiary, ESB International, employs 1,200 on its overseas business, which has spanned more than 100 countries. One of Ireland’s most successful companies with an annual turnover of €3.5 billion, ESB has grown in value from €2.5 billion in 2002 to approximately €6.5 billion today.
Ireland and Renault-Nissan Sign Agreement to Develop Electric Vehicles
Smith Electric Vehicles, the leading manufacturer of commercial electric vehicles, recently announced an electric vehicle development collaboration with Ford Motor Company. Smith Electric Vehicles, a trading division of The Tanfield Group Plc, will work with Ford to introduce a battery-electric light van, the first vehicle in the company’s broad electrification strategy for the North American market, which was announced at this year’s Detroit Auto Show. This vehicle will be based on the European-designed Ford Transit Connect, which goes on sale in North America this year.
The vehicle, which Smith will assemble in North America, will have a range of up to 100 miles on a full charge, without compromising the Transit Connect’s superior driving experience. It will operate very similarly to a conventional light van, but with smoother acceleration, less noise and zero emissions. The vehicle will be fully branded as a Ford product and will go on sale through selected Ford dealerships in North America in 2010.
Smith Electric Vehicles Collaborates with Ford Motor Company to Build Electric Vans
Researchers studying the reactions of trees to rising CO2 concentration in the atmosphere have it easy. Since trees store the carbon they absorb in wood, all they need to do is take core samples from tree trunks. A centenarian oak will reveal how it coped with the incipient climate change over a period of a hundred years in its annual rings. "However, the grassland vegetation we work with is grazed or dies off in a matter of months and decomposes," explains Prof. Hans Schnyder, who is doing research in the field of grasslands at the Center for Life and Food Sciences Weihenstephan at the TUM. The Swiss scientist nonetheless wanted to find out how economically grasslands deal with water when temperatures rise and the carbon dioxide concentration in the air increases.
Important in this context is that all plants absorb CO2 from the atmosphere. At the same time they transpire water vapor to cool their sunlit leaves. Both processes run via the stomata, tiny pores in the leaves, the opening size of which plants can regulate. During longer periods of drought plants close the stomata to curb water loss, albeit at the expense of CO2 absorption. Laboratory experiments show that, for a given stoma aperture, an artificial increase of ambient CO2 leads to a temporary increase in the absorption capacity for the gas. However, to ascertain the actual change of water use efficiency in grassland vegetation over the course of the last century, Prof. Schnyder had to find grassland time series comparable in length to those of trees.
This is where the team turned their sights to the Alpine ibex horn collection at the Museum of Natural History in Bern. Ibex store isotopic information in their horns that reflects the water use of the vegetation they consume. The TUM researchers went at the museum collection, which covers the years 1938 to 2006, with a carving knife, to remove tiny samples from the horns. Since ibex horns also have annual rings, the grassland researchers were able to use the samples to draw conclusions about temporal changes in the grassland vegetation of the Bernese Alps where the ibex had grazed.
A unique specimen archive at the research station Rothamsted in England eventually enabled a comparison with a second grassland region. The "Park Grass Experiment" -- the longest running ecological grassland experiment worldwide -- was initiated in Rothamsted over 150 years ago. Since 1857 specimens have been archived there to allow future generations of scientists to gain long-term insights into the local ecosystem using modern research methods. And indeed, the TUM scientists were able to benefit from the hay specimens dating as far back as 150 years. Once again analyzing the isotope signature, they could infer how the English grassland vegetation had utilized the water over the years.
The Weihenstephan researchers thus determined the individual isotope composition of the grassland vegetation in both the Bernese Alps and in the British lowlands over extended periods of time: more than 69 years based on the horns, and as far back as 150 years using the hay specimens. In a second step this data was lined up with climate data, e.g. air temperature and aridity, of the respective region.
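The isotope logic behind these inferences can be sketched with standard plant-physiology relations; these are textbook formulas, not equations given in the article. Intrinsic water-use efficiency \(W_i\) is the ratio of CO2 assimilation \(A\) to stomatal conductance \(g_s\), and carbon isotope discrimination ties the isotope signature of archived material to the same stomatal behavior:

```latex
% Intrinsic water-use efficiency: CO2 assimilation per unit stomatal
% conductance; c_a and c_i are ambient and leaf-internal CO2 concentrations,
% and 1.6 is the diffusivity ratio of water vapor to CO2.
W_i = \frac{A}{g_s} = \frac{c_a \left( 1 - c_i / c_a \right)}{1.6}

% Carbon isotope discrimination recorded in plant tissue (and in the horns
% of animals that ate it) depends on the same c_i / c_a ratio, with
% a \approx 4.4\text{\textperthousand} (diffusion) and
% b \approx 27\text{\textperthousand} (carboxylation):
\Delta^{13}\mathrm{C} \approx a + (b - a)\, \frac{c_i}{c_a}
```

Measured discrimination thus yields \(c_i/c_a\), and with \(c_a\) known from atmospheric records, changes in \(W_i\) over the decades can be reconstructed; this is presumably the chain of reasoning behind the horn and hay analyses.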
The result: In both locations the intrinsic water-use efficiency of the grassland vegetation rose over the years. This implies that the plants improved their water storage potential as temperatures rose and the level of CO2 in the atmosphere increased. Based on these results the TUM scientists have now, for the first time ever, managed to demonstrate the long-term effects of anthropogenic climate change on the water-use efficiency of grasslands.
There were, however, also differences between the two locations. In Switzerland the effective water-use efficiency of the Alpine meadows remained unchanged in spite of the increased intrinsic water-use efficiency of the grassland. This was because, overall, the air had become drier and warmer as a result of the climate change. In England the scientists found evidence for this effect only during the fall. In the spring though -- which in Rothamsted is no drier today than it was 150 years ago -- the water storage potential of grassland vegetation had a real effect. This insight will help to further improve climate simulations. In the past, complex simulation models that included vegetation had to rely on estimates where grassland was concerned. The scientists at the TU Muenchen have now succeeded in prying open this climate research black box.
Old hay and Alpine ibex horns reveal how grasslands respond to climate change
The Coca-Cola Company, in partnership with Greenpeace, has committed to make all its new vending machines and coolers hydrofluorocarbon-free by 2015.
By using hydrofluorocarbon-free refrigeration, Coca-Cola would be able to reduce its equipment’s direct greenhouse gas emissions by 99 percent. Eliminating hydrofluorocarbons across the commercial refrigeration industry would be equivalent to eliminating the annual emissions of Germany or Japan.
Coca-Cola’s new green commitment will help influence the commercial refrigeration market to shy away from using hydrofluorocarbons. The company has already invested more than $50 million in research and development to accelerate the use of climate-friendly cooling technologies.
Next year, Coca-Cola and its bottling partners will purchase a minimum of 150,000 units of hydrofluorocarbon-free equipment, doubling the current rate of purchase to meet its goal to buy 50 percent of all new coolers and vending machines without hydrofluorocarbons by 2012.
The company has approximately 10 million coolers and vending machines worldwide, making up the largest element of Coca-Cola’s total climate impact.
Hydrofluorocarbon-free equipment
As a result of the company’s commitment to use hydrofluorocarbon-free equipment, carbon emission reductions will exceed 52.5 million metric tons over the lifetime of the equipment – the equivalent of removing 11 million cars from the road for one year.
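For scale, the equivalence quoted above implies a per-car annual emissions figure the article does not state; a quick back-of-envelope check (the per-car number is derived here, not sourced):

```python
# Back-of-envelope check of the quoted equivalence: 52.5 million metric tons
# of CO2 over the equipment's lifetime vs. 11 million cars off the road for
# one year. Both inputs are from the article; the per-car rate is derived.
total_reduction_t = 52.5e6   # metric tons CO2 avoided
cars_equivalent = 11e6       # cars removed from the road for one year
tons_per_car_year = total_reduction_t / cars_equivalent
print(round(tons_per_car_year, 2))  # 4.77 t CO2 per car per year
```

That roughly 4.8 t of CO2 per car per year is in the ballpark of typical passenger-car emission factors, which suggests the comparison is internally consistent.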
Because of Coca-Cola’s supply chain engagement, a major supplier already intends to build a dedicated carbon dioxide compressor production facility to help meet the increasing demand for hydrofluorocarbon-free cooling options across the industry.
“At Coca-Cola, we are deploying our scale and working with suppliers to deliver cost effective alternatives to HFC, for us and for others,” said Rick Frazier, vice president of supply chain for the Coca-Cola Company.
The new green initiative is a direct result of the collaboration with Greenpeace that began in 2000. Greenpeace has urged Coca-Cola to use hydrofluorocarbon-free equipment for the Olympics. By the Torino Games in 2006 and the Beijing Games in 2008, the company was using all hydrofluorocarbon-free technology at Olympic venues.
“Large enterprises have both an opportunity and responsibility to change the game and Coca-Cola’s action leaves no excuse for other companies not to follow,” remarked Kumi Naidoo, executive director of Greenpeace International.
In addition to its refrigeration gas commitment, Coca-Cola also developed a proprietary energy management system that provides energy savings of up to 35 percent.
The beverage bottler is listed on the New York Stock Exchange.
Researchers from the Massachusetts Institute of Technology have developed a new type of natural gas electric power plant with zero carbon dioxide emissions.
Postdoctoral associate Thomas Adams and Lammot du Pont Professor of Chemical Engineering Paul I. Barton are proposing a system that uses solid-oxide fuel cells to produce power from fuel without burning it.
The system would not require any new technologies but would combine existing components in a new configuration for which they have applied for a patent.
The system would run on natural gas, which is considered more environmentally friendly than coal or oil. Presently, natural gas power plants produce an average of 1,235 pounds of carbon dioxide for every megawatt-hour of electricity produced, one third the emissions of coal power plants.
The system proposed by Mr. Adams and Mr. Barton would not emit any carbon dioxide into the air or other gases believed responsible for global warming, but would instead produce a stream of mostly pure carbon dioxide.
This stream could be harnessed and stored underground relatively easily, a process known as carbon capture and sequestration (C.C.S.). One additional advantage of the proposed system is that, unlike a conventional natural gas plant with C.C.S. that would consume significant amounts of water, the fuel-cell based system actually produces clean water that could easily be treated to provide potable water as a side benefit, Mr. Adams says.
Carbon pricing needed
The new system could produce power at a cost comparable to, or less than, that of conventional natural-gas plants and even coal-burning plants. However, the researchers said this can only come about if and when a price is set on the emission of carbon dioxide and other greenhouse gases.
Carbon pricing attempts to take into account the true price exacted on the environment by greenhouse gas emission.
Mr. Adams explained that without costs imposed on carbon emissions, the cheapest fuel will always be pulverized coal. But as soon as there is some form of carbon pricing, their system will be the lowest-price option as long as the price is more than about $15 per metric ton of emitted carbon dioxide.
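A carbon price works as a per-ton surcharge proportional to a plant's emissions intensity, so a zero-emission plant's cost stays flat while coal's rises. The sketch below illustrates that break-even logic; the ~$15/t threshold and the 1,235 lb/MWh gas figure come from the article, while the base generation costs and the coal intensity are hypothetical placeholders, not values from the researchers.

```python
# Sketch of the break-even comparison under a carbon price.
# NOTE: base costs and the coal emissions intensity are illustrative
# assumptions, not figures from the MIT study.

LB_PER_METRIC_TON = 2204.62  # pounds per metric ton

def cost_with_carbon_price(base_cost_per_mwh, co2_lb_per_mwh, price_per_ton):
    """Effective $/MWh once each metric ton of emitted CO2 is priced."""
    tons_per_mwh = co2_lb_per_mwh / LB_PER_METRIC_TON
    return base_cost_per_mwh + tons_per_mwh * price_per_ton

# Hypothetical plants: pulverized coal at roughly three times the article's
# 1,235 lb/MWh natural-gas figure, vs. the zero-emission fuel-cell system.
coal = cost_with_carbon_price(60.0, 3 * 1235, 15.0)   # rises with the price
fuel_cell = cost_with_carbon_price(80.0, 0.0, 15.0)   # unaffected by it
print(coal > fuel_cell)  # True: coal loses its cost edge above the threshold
```

Under these placeholder numbers the crossover lands near the $15/t mark, which is the shape of the argument the researchers describe.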
Natural gas already accounts for 22 percent of all United States electricity production, and that percentage is likely to rise in coming years if carbon prices are put into effect.
For these and other reasons, a system that can produce electricity from natural gas at a competitive price with zero greenhouse gas emissions could prove to be an attractive alternative to conventional power plants that use fossil fuels.
M.I.T. develops cleaner natural gas power
CANBERRA, Dec. 1 (Reuters) - Australia's plans to cut carbon emissions were set for defeat in a hostile Senate after the election of a new opposition leader opposed to carbon-trade laws, setting the stage for a possible early 2010 election.
New Liberal opposition leader Tony Abbott, elected Tuesday, said conservative Senators, many of them climate change skeptics, would reject the government's carbon emissions trading laws if they were not deferred until early 2010.
Mr. Abbott said he believed in climate change, but told reporters he was opposed to the government's emissions trading scheme model, and was not afraid of fighting an election on the issue.
"As leader, I am not frightened of an election on this issue. This is going to be a tough fight. But it will be a fight. You cannot win an election without a fight," said Mr. Abbott, a boxer in his university days who once studied for the priesthood.
Greg Combet, assistant climate change minister, said the government would still push for its carbon trade laws to be passed this week. He hoped some opposition lawmakers would side with the government and defy Mr. Abbott.
"The extremists have gained control of the Liberal Party. They are opposed to taking action on climate change, they dispute the science," Mr. Combet told reporters.
Prime Minister Kevin Rudd has struggled to have his climate change legislation passed in the Senate before parliament adjourns until February.
Mr. Rudd, who was in Washington on Monday meeting President Barack Obama, was keen to take a lead role at next week's Copenhagen summit on climate change by enacting a "cap-and-trade" scheme requiring polluters to buy permits for their emissions.
Mr. Rudd wants emissions trading to start in Australia in July 2011, covering 75 percent of emissions in the developed world's biggest per capita emitter. The planned carbon-trade scheme would be the biggest outside Europe.
The United States is closely watching Australia's debate and a political agreement on carbon trading in Australia would help garner support for action from other countries.
If the Senate ends up rejecting the carbon scheme for a second time, Mr. Rudd will have a trigger to call an early 2010 election on climate change, most likely in March or April, with polls suggesting his government would win an increased majority.
Business, election uncertainty
Mr. Abbott said while the opposition rejected the emissions trading scheme, it still backed the government's emissions reduction target of at least 5 percent from 2000 levels by 2020, with a 25 percent target if nations agree on an ambitious climate pact in Copenhagen.
The prolonged political debate has caused some dismay among companies, coal and power firms in particular, who see some sort of scheme as inevitable and are looking for pricing certainty.
Banks and fund managers see the emissions trading scheme, the biggest economic policy change in modern Australian history, as a boon for traders, investors and new green technologies, while major polluters generally oppose it as a tax on heavy industry.
International Power Australia, a unit of International Power and Australia's largest private-sector generator, says the uncertainty has impinged on talks with its lenders.
Major miners such as BHP Billiton and oil and gas firms such as Woodside Petroleum have criticized the scheme, though have been mollified somewhat by government pledges last month to raise state compensation.
"This is the worst possible outcome for us, as this guy (Mr. Abbott) is the figurehead for the climate change skeptics within the conservative party," said Tim Hanlin, managing director of Australian Climate Exchange Limited.
"Australian industry is now thrown into total uncertainty regarding a price on carbon and therefore cannot make any informed investment decisions," said Mr. Hanlin.
"This is going to put Kevin Rudd under enormous pressure to call an election on this issue."
But Monash University analyst Nick Economou said Rudd would now miss his Copenhagen deadline and be in no rush for an election, putting the chances of an early poll at 20 percent.
"They may as well play the long game, the patient game," Mr. Economou said, saying Mr. Rudd would prefer a normal election later in 2010, giving him time to build up an attack against Mr. Abbott's Liberals, with the carbon laws waiting to be passed.
Mr. Rudd has repeatedly said he does not want an early poll and would prefer elections to be held on time in late 2010.
If Mr. Rudd wins a double dissolution election of both houses of parliament, he can then push his climate policy through a special joint sitting of the houses.
Australia carbon laws in doubt
VANCOUVER, British Columbia, Nov. 24 (Reuters) - The Canadian province of Quebec said on Monday it aims to cut its greenhouse gas emissions by 20 percent below 1990 levels by 2020, the same target as that set by the European Union.
"It is a very ambitious target for the government, given that 48 percent of Quebec's total energy currently comes from renewable energy sources," Quebec Premier Jean Charest said in a statement.
Much of Quebec's power comes from massive hydroelectric projects.
Quebecers emit approximately 11 tons per capita of greenhouse gases, which are blamed for climate change. That is half the Canadian average, Mr. Charest said.
The mostly French-speaking province is a member of the Western Climate Initiative, a group of four Canadian provinces and seven western American states, which is working on implementing a carbon cap and trade system in North America by 2012.
Canada's federal government has pledged to cut carbon emissions by 20 percent from 2006 levels by 2020. However, Ottawa is waiting for the United States to finalize its cap-and-trade program before proceeding with its own.
British Columbia pledged in 2007 to cut its emissions of greenhouse gases by 33 percent by 2020, which would put them 10 percent under 1990 levels.
Quebec sets 2020 greenhouse gas emission targets
The new species belongs to a larger group of extinct mammal relatives, called anomodonts, which were widespread and represented the dominant plant eaters of their time.
"Members of the group burrowed in the ground, walked the surface and lived in trees," said Fröbisch, the lead author of the study. "However, Kombuisia antarctica, about the size of a small house cat, was considerably different from today's mammals -- it likely laid eggs, didn't nurse its young and didn't have fur, and it is uncertain whether it was warm blooded," said Angielczyk, Assistant Curator of Paleomammalogy at The Field Museum. Kombuisia antarctica was not a direct ancestor of living mammals, but it was among the few lineages of animals that survived at a time when a majority of life forms perished.
Scientists are still debating what caused the end-Permian extinction, but it was likely associated with massive volcanic activity in Siberia that could have triggered global warming. When it served as refuge, Antarctica was located some distance north of its present location, was warmer and wasn't covered with permanent glaciers, said the researchers. The refuge of Kombuisia in Antarctica probably wasn't the result of a seasonal migration but rather a longer-term change that saw the animal's habitat shift southward. Fossil evidence suggests that small and medium sized animals were more successful at surviving the mass extinction than larger animals. They may have engaged in "sleep-or-hide" behaviors like hibernation, torpor and burrowing to survive in a difficult environment.
Earlier work by Fröbisch predicted that animals like Kombuisia antarctica should have existed at this time, based on fossils found in South Africa later in the Triassic Period that were relatives of the animals that lived in Antarctica. "The new discovery fills a gap in the fossil record and contributes to a better understanding of vertebrate survival during the end-Permian mass extinction from a geographic as well as an ecological point of view," Fröbisch said.
The team found the fossils of the new species among specimens collected more than three decades ago from Antarctica that are part of a collection at the American Museum of Natural History. "At the time those fossils were collected, paleontologists working in Antarctica focused on seeking evidence for the existence of a supercontinent, Pangaea, that later split apart to become separate land masses," said Angielczyk. The fossils collected in Antarctica provided some of the first evidence of Pangaea's existence, and further analysis of the fossils can refine our understanding of events that unfolded 250 million years ago.
"Finding fossils in the current harsh conditions of Antarctica is difficult, but worthwhile," said Angielczyk. "The recent establishment of the Robert A. Pritzker Center for Meteoritics and Polar Studies at The Field Museum recognizes the growing importance of the region," he said.
This research is part of a collaborative study of Dr. Jörg Fröbisch (Department of Geology, Field Museum, Chicago), Dr. Kenneth D. Angielczyk (Department of Geology, Field Museum, Chicago), and Dr. Christian A. Sidor (Burke Museum and Department of Biology, University of Washington), which will be published online December 3, 2009 in Naturwissenschaften.
Funding for this research was provided through a Postdoctoral Research Fellowship of the German Research Foundation (Deutsche Forschungsgemeinschaft) to J. Fröbisch and grants of the National Science Foundation to C. A. Sidor.
Antarctica served as climatic refuge in Earth's greatest extinction event
A new comprehensive scientific synthesis of past Arctic climates demonstrates for the first time the pervasive nature of Arctic climate amplification. As a result, glacier and ice-sheet melting, sea-ice retreat, coastal erosion and sea level rise can be expected to continue.
The U.S. Geological Survey led this new assessment, which is a synthesis of published science literature and authored by a team of climate scientists from academia and government. The U.S. Climate Change Science Program commissioned the report, which has contributions from 37 scientists from the United States, Germany, Canada, the United Kingdom and Denmark.
The new report also makes several conclusions about the Arctic:
Taken together, the size and speed of the summer sea-ice loss over the last few decades is highly unusual compared to events from previous thousands of years, especially considering that changes in Earth's orbit over this time have made sea-ice melting less, not more, likely.
Sustained warming of at least a few degrees (more than approximately 4° to 13°F above average 20th century values) is likely to be sufficient to cause the nearly complete, eventual disappearance of the Greenland ice sheet, which would raise sea level by several meters.
The current rate of human-influenced Arctic warming is comparable to peak natural rates documented by reconstructions of past climates. However, some projections of future human-induced change exceed documented natural variability.
The past tells us that when thresholds in the climate system are crossed, climate change can be very large and very fast. We cannot rule out that human-induced climate change will trigger such events in the future.
"By integrating research on the past 65 million years of climate change in the entire circum-Arctic, we have a better understanding on how climate change affects the Arctic and how those effects may impact the whole globe," said USGS Director Mark Myers. "This report provides the first comprehensive analysis of the real data we have on past climate conditions in the Arctic, with measurements from ice cores, sediments and other Earth materials that record temperature and other conditions."
Arctic Heats Up More Than Other Places: High Sea Level Rise Predicted
"We found that what matters most in accounting for large wildfires in the Western United States is how climate influences the build up—or production—and drying of fuels," said Jeremy Littell, a research scientist with the University of Washington's Climate Impacts Group and lead investigator of the study. "Climate affects fuels in different ecosystems differently, meaning that future wildfire size and, likely, severity depends on interactions between climate and fuel availability and production."
To explore climate-fire relationships, the scientists used fire data from 1916 to 2003 for 19 ecosystem types in 11 Western States to construct models of total wildfire area burned. They then compared these fire models with monthly state divisional climate data.
The study confirmed what scientists have long observed: that low precipitation and high temperatures dry out fuels and result in significant fire years, a pattern that dominates the northern and mountainous portions of the West. But it also provided new insight on the relationship between climate and fire, such as Western shrublands' and grasslands' requirement for high precipitation one year followed by dry conditions the next to produce fuels sufficient to result in large wildfires.
The study revealed that climate influences the likelihood of large fires by controlling the drying of existing fuels in forests and the production of fuels in more arid ecosystems. The influence of climate leading up to a fire season depends on whether the ecosystem is more forested or more like a woodland or shrubland.
"These data tell us that the effectiveness of fuel reductions in reducing area burned may vary in different parts of the country," said David L. Peterson, a research biologist with the Forest Service's Pacific Northwest Research Station and one of the study's authors. "With this information, managers can design treatments appropriate for specific climate-fire relationships and prioritize efforts where they can realize the most benefit."
Findings from the study suggest that, as the climate continues to warm, more area can be expected to burn, at least in northern portions of the West, corroborating what researchers have projected in previous studies. In addition, cooler, wetter areas that are relatively fire-free today, such as the west side of the Cascade Range, may be more prone to fire by mid-century if climate projections hold and weather becomes more extreme.
In The Warming West, Climate Most Significant Factor In Fanning Wildfires' Flames
Philip Higuera of Montana State University and his colleagues show that although changing temperatures and moisture levels set the stage for changes in wildfire frequency, they can often be trumped by changes in the distribution and abundance of plants. Vegetation plays a major role in determining the flammability of an ecosystem, he says, potentially dampening or amplifying the impacts that climate change has on fire frequencies.
"Climate is only one control of fire regimes, and if you only considered climate when predicting fire under climate-change scenarios, you would have a good chance of being wrong," he says. "You wouldn't be wrong if vegetation didn't change, but the greater the probability that vegetation will change, the more important it becomes when predicting future fire regimes."
Higuera and his colleagues examined historical fire frequency in northern Alaska by analyzing sediments at the bottom of lakes. Using meter-long samples, called sediment cores, Higuera and his colleagues measured changes in the abundance of preserved plant parts, such as pollen, to determine the types of vegetation that dominated the landscape during different time periods in the past. Like rings in a tree, different layers of sediment represent different times in the past.
The researchers used radiocarbon dating to determine the sediment's age, which dates as far back as 15,000 years. They then measured charcoal deposits in the sediment to determine fire frequency during time periods dominated by different vegetation. Finally, they compared their findings to known historical climate changes.
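The charcoal-counting step lends itself to a toy calculation. The ages, peak positions and vegetation-zone boundaries below are invented for illustration; only the method (counting dated charcoal peaks per vegetation period and normalizing by the period's length) follows the approach described in the article.

```python
# Hypothetical charcoal peaks from a sediment core, each treated as one
# fire event; ages in calibrated years before present (synthetic data).
charcoal_peak_ages = [300, 900, 1400, 2200, 6100, 9800, 11200, 12000, 12700, 13900]

# Vegetation zones inferred from pollen (boundaries are made up here):
zones = {
    "spruce forest (0-5,000 yr BP)": (0, 5000),
    "deciduous woodland (5,000-10,500 yr BP)": (5000, 10500),
    "shrub tundra (10,500-15,000 yr BP)": (10500, 15000),
}

for name, (start, end) in zones.items():
    fires = sum(start <= age < end for age in charcoal_peak_ages)
    freq = fires / ((end - start) / 1000.0)  # fires per 1,000 years
    print(f"{name}: {fires} fires, {freq:.2f} per 1,000 yr")
```

In this made-up example the fire frequency drops sharply in the deciduous period despite its being sandwiched between more fire-prone vegetation types, mirroring the pattern the authors report.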
In many cases, the authors discovered, changes in climate were less important than changes in vegetation in determining wildfire frequency. Despite a transition from a cool, dry climate to a warm, dry climate about 10,500 years ago, for example, the researchers found a sharp decline in the frequency of fires. Their sediment cores from that time period revealed a vegetation change from flammable shrubs to fire-resistant deciduous trees, a trend which Higuera thinks was enough to offset the direct effects of climate on fire frequencies.
"In this case, a warmer climate was likely more favorable for fire occurrence, but the development of deciduous trees on the landscape offset this direct climatic effect. Consequently, we see very little fire," Higuera says.
Similarly, during the development of the modern spruce-dominated forest about 5000 years ago, temperatures cooled and moisture levels increased, which – considered alone – would create unfavorable conditions for frequent fires. Despite this change, the authors observed an increase in fire frequency, a pattern they attribute to the high flammability of the dense coniferous forests.
Higuera thinks this research has implications for predictions of modern-day changes in fire regimes based on climate change. These findings, Higuera says, emphasize that predicting future wildfire frequency shouldn't hinge on the direct impacts of climate change alone.
"Climate affects vegetation, vegetation affects fire, and both fire and vegetation respond to climate change," he says. "Most importantly, our work emphasizes the need to consider the multiple drivers of fire regimes when anticipating their response to climate change."
Plants Could Override Climate Change Effects On Wildfires
Scientists can't say for sure if the volatile mixture at the bottom of the lake will remain still for another 1,000 years or someday explode without warning. In a region prone to volcanic and seismic activity, the fragility of Lake Kivu is a serious matter. Compounding the precarious situation is the presence of approximately 2 million people, many of them refugees, living along the north end of the lake.
An international group of researchers will meet Jan. 13-15 in Gisenyi, Rwanda, to grapple with the problem of Lake Kivu. A grant from the National Science Foundation won by Rochester Institute of Technology will fund the travel and lodging for 18 scientists from the United States to attend the three-day workshop. Anthony Vodacek, conference organizer and associate professor at RIT's Chester F. Carlson Center for Imaging Science, is working closely with the Rwandan Ministry of Education to organize the meeting.
"Rwandan universities suffered greatly in the 1994 genocide and there are few Rwandan scientists performing significant work on the lake or within the rift system," Vodacek notes. "We will work with the government to identify interested researchers."
Vodacek is convening the workshop with Cindy Ebinger, an expert in East African Rift tectonics at the University of Rochester, and Robert Hecky, an expert in limnology -- the study of lake systems -- at University of Minnesota-Duluth. Core samples Hecky took in the 1970s initially brought the safety of Lake Kivu under question.
Addressing the lake as a whole system is a new concept for the workshop participants, who will bring their expertise in volcanology, tectonics and limnology to the problem. Vodacek's goal is to prioritize research activities and improve communication between the North American, European and African collaborators.
"Most scientists are fairly in agreement that the lake is pretty stable; it's not as if it's going to come bursting out tomorrow," Vodacek says. "But in such a tectonically and volcanically active area, you can't tell what's going to happen."
One of the problems with Lake Kivu is that the 1,600-foot deep lake never breathes. The tropical climate helps stagnate the layers of the lake, which never mix or turn over. In contrast, fluctuating temperatures in colder climates help circulate lake water and prevent gas build up. Lake Kivu is different from both temperate and other tropical lakes because warm saline springs, arising from ground water percolating through the hot fractured lava and ash, further stabilize the lake. Scientists at the workshop will consider how these spring inputs may vary over time under changing climates and volcanic activity.
A number of catalysts could destabilize the gas resting at the bottom of Lake Kivu. It could be an earthquake, a volcanic explosion, a landslide or even the methane mining that has recently united Rwandan and Congolese interests.
Close calls came in 2008, when an earthquake struck near the lake, and in 2002, when a volcanic eruption destroyed parts of Goma in the Democratic Republic of Congo, only 11 miles north of Lake Kivu. Although scientists were alarmed, neither event sufficiently disturbed the gas.
Vodacek likens the contained pressure in the lake to a bottle of carbonated soda or champagne. "In the lake, you have the carbon dioxide on the bottom and 300 meters of water on top of that, which is the cap," he says. "That's the pressure that holds it. The gas is dissolved in water."
When the cap is removed, bubbles form and rise to the surface. More bubbles form and create a column that drags the water and the gas up to the surface in a chain reaction.
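The "cap" in Vodacek's analogy can be put in rough numbers. This back-of-the-envelope check (using fresh-water density, a lower bound since Kivu's deep water is saline) computes the hydrostatic pressure of the 300 m water column he describes:

```python
# Rough check of the "soda bottle" analogy: the pressure exerted by the
# ~300 m water column that caps the dissolved gas.
rho = 1000.0   # kg/m^3, fresh water (Kivu's saline deep water is denser)
g = 9.81       # m/s^2, gravitational acceleration
h = 300.0      # m, depth of the water "cap" quoted in the article

p_column = rho * g * h          # Pa, pressure from the water column alone
p_total = p_column + 101_325    # Pa, plus atmospheric pressure at the surface
print(f"cap pressure: {p_column/1e6:.1f} MPa (~{p_total/101_325:.0f} atm total)")
```

That works out to roughly 2.9 MPa, about 30 atmospheres, which is what keeps the carbon dioxide and methane in solution; any disturbance that lifts gas-rich water above its saturation depth lets bubbles nucleate and the chain reaction begin.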
"The question is, and what's really unknown, is how explosive is that?" Vodacek says.
Through his own research Vodacek plans to simulate the circulation of Lake Kivu. Modeling the circulation patterns above the layers of carbon dioxide and methane will help determine the energy required to disrupt the gas and cause Lake Kivu to explode.
Volatile gas could turn Rwandan lake into a freshwater time bomb
“By providing the blade with a movable trailing edge, it is possible to control the load on the blade and extend the lifetime of the wind turbine components. This is similar to the technique used on aircraft, where flaps regulate the lift during the most critical times, such as take-off and landing,” explains Helge Aagaard Madsen, Research Specialist on the project.
However, there is a difference. Whereas on aircraft the movable flaps are non-deformable elements hinged to the trailing edge of the main wing, this new technique preserves a continuous profile surface on the wind turbine blade even when the trailing edge moves. This is because the trailing edge is made of an elastic material and forms an integrated part of the main blade.
Robust design of rubber
In 2004 Risø DTU applied for the first patent for this basic technique of designing a flexible, movable trailing edge for a wind turbine blade. Since then there has been a significant development with regard to the project. By means of so-called "Gap-funding" provided by the Ministry of Science, Technology and Innovation and by the local Region Zealand it has been possible to develop such ideas into a prototype stage.
Part of the research has been aimed at the design and development of a robust controllable trailing edge. This has now led to the manufacturing of a trailing edge of rubber with built-in cavities that are fibre-reinforced. The cavities in combination with the directional fibre reinforcement provide the desired movement of the trailing edge, when the cavities are being put under pressure by air or water.
“In this project a number of different prototypes have been manufactured, with a chord length of 15 cm and a length of 30 cm. The best version shows very promising results, both in the size of the deflection and in the speed of the deflection,” says Helge Aagaard.
The size of the prototype fits a blade airfoil section with a chord of one metre, and such a blade section is now being produced. The capability of the trailing edge to control the load on the blade section will then be tested in a wind tunnel, a part of the development process supported by GAP funding from Region Zealand.
“If the results confirm our estimated performance, we will test the rubber trailing edge on a full-scale wind turbine within a few years,” says Helge Aagaard.
Controllable Rubber Trailing Edge Flap To Reduce Loads On Wind Turbine Blades
Wind is variable and can only partially be predicted. The large-scale use of wind power in the electricity system is therefore tricky. PhD candidate Bart Ummels MSc. investigated the consequences of using a substantial amount of wind power within the Dutch electricity system. He used simulation models, such as those developed by transmission system operator TenneT, to pinpoint potential problems (and solutions).
His results indicate that wind power requires greater flexibility from existing power stations. Sometimes larger reserves are needed, but more frequently power stations will have to decrease production in order to make room for wind-generated power. It is therefore essential to continually recalculate the commitment of power stations using the latest wind forecasts. This reduces potential forecast errors and enables wind power to be integrated more efficiently.
Ummels looked at wind power capacities of up to 12 GW, 8 GW of which offshore, which is enough to meet about one third of the Netherlands’ demand for electricity. Dutch power stations are able to cope at any time in the future with variations in demand for electricity and supply of wind power, as long as use is made of up-to-date, improved wind forecasts. It is TenneT’s task to integrate large-scale wind power into the electricity grid. Lex Hartman, TenneT’s Director of Corporate Development: “In a joint effort, TU Delft and TenneT further developed the simulation model that can be used to study the integration of large-scale wind power. The results show that in the Netherlands we can integrate between 4 GW and 10 GW into the grid without needing any additional measures.”
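The core scheduling problem Ummels describes can be illustrated with a toy hourly dispatch. This is not TenneT's simulation model; it is a minimal sketch with invented numbers, showing how must-run capacity (plants that cannot simply be switched off) creates wind surpluses on windy nights:

```python
# Toy hourly dispatch: conventional plants follow demand net of wind, but
# cannot go below a must-run minimum, so surplus wind must be exported or
# curtailed. All figures are synthetic and illustrative.
demand   = [11, 10, 9, 9, 10, 13, 16, 18]   # GW, night into morning
wind     = [ 7,  8, 8, 7,  6,  5,  4,  3]   # GW, forecast wind output
must_run = 4.0                              # GW, e.g. CHP and coal that cannot switch off

for d, w in zip(demand, wind):
    conventional = max(d - w, must_run)          # plants cannot dip below must-run
    surplus = max(w - (d - must_run), 0.0)       # wind exceeding the room left for it
    print(f"demand {d:>4.1f} GW  wind {w:>4.1f} GW  "
          f"conventional {conventional:>4.1f} GW  surplus {surplus:>4.1f} GW")
```

In the low-demand night hours the toy system produces several gigawatts of surplus, which is exactly the "Where do we put all the electricity if it is very windy at night?" problem Ummels raises below; international trade and more flexible plants absorb it.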
Surpluses
Ummels: ‘Instead of the common question ‘What do we do when the wind isn’t blowing?’, the more relevant question is ‘Where do we put all the electricity if it is very windy at night?’. This is because, for instance, a coal-fired power station cannot simply be turned off. One solution is provided by the international trade in electricity, because other countries often can use the surplus. Moreover, a broadening of the ‘opening hours’ of the international electricity market benefits wind power. At the moment, utilities determine one day ahead how much electricity they intend to purchase or sell abroad. Wind power can be better used if the time difference between the trade and the wind forecast is smaller.’
No energy storage
Ummels’ research also demonstrates that energy storage is not required. The results indicate that the international electricity market is a promising and cheaper solution for the use of wind power.
Making power stations more flexible is also better than storage. The use of heating boilers, for instance, means that combined heat and power plants operate more flexibly, which can consequently free up capacity for wind power at night.
The use of wind power in the Dutch electricity system could lead to a reduction in production costs of EUR1.5 billion annually and a reduction in CO2 emissions of 19 million tons a year.
Dutch Electricity System Can Cope With Large-scale Wind Power
Saturday, December 12, 2009
GM to Begin to Repay US, Canadian and German Government Loans
In light of improving global economic conditions, stabilizing industry sales and its healthier cash position, GM announced today that it plans to accelerate repayment of its outstanding $6.7 billion in UST loans as well as the C$1.5 billion (US$1.4 billion) in EDC loans ahead of the scheduled maturity date of July 2015.
GM plans to repay the United States, Canadian and Ontario government loans in quarterly installments from escrowed funds, beginning next month with an initial $1.2 billion payment to be made in December ($1.0 billion to the UST and $192 million to the EDC), followed by quarterly payments. Any escrowed funds available as of June 30, 2010 would be used to repay the UST and EDC loans unless the escrowed funds were extended one year by the UST. Any balance of funds would be released to GM after the repayment of the UST and EDC loans.
In addition, the company has begun to repay the German government loans which were extended to support Opel, and which had a balance of €900 million (~US$1.3 billion) as of September 30, 2009. Opel has already repaid €500 million (~US$0.7 billion) of that in November, and will repay the remaining €400 million (~US$0.6 billion) balance by the end of the month. The cash balance in Europe as of September 30, 2009 was US$2.9 billion.
GM’s total debt as of September 30, 2009 was $17 billion, including $6.7 billion in U.S. government loans, $1.4 billion in Canadian government loans, $1.3 billion in German government loans and $7.6 billion in other debt globally. The $17 billion debt level does not include the UAW or CAW VEBA notes or preferred stock, which are $2.5 billion, $0.7 billion and $9 billion, respectively. While GM has reached settlements for the UAW and CAW VEBAs, the debt associated with the agreements will not be recognized until all preconditions are met and they become effective, which will be December 31, 2009 or later. Prior to the start of the new GM, total debt of Old GM was $94.7 billion as of July 9, 2009.
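The figures in the announcement are internally consistent, which a quick arithmetic check confirms (all values in US$ billions, taken from the text above):

```python
# Cross-checking the debt figures quoted in the announcement (US$ billions).
ust_loans = 6.7        # U.S. Treasury loans
edc_loans = 1.4        # Canadian (EDC) loans
german_loans = 1.3     # German government loans supporting Opel
other_debt = 7.6       # other debt globally

total = ust_loans + edc_loans + german_loans + other_debt
print(f"total debt: ${total:.1f} billion")           # matches the $17 billion stated

# December installment: $1.0 billion to the UST plus $192 million to the EDC.
december_payment = 1.0 + 0.192
print(f"initial payment: ${december_payment:.3f} billion (~$1.2 billion)")
```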
The Netherlands Places Order for 75 Natural Gas Powered Daimler Buses
Daimler Buses has once again won two contracts, and as a result it will deliver a total of 425 buses to the Netherlands by December. The biggest contract from the Netherlands to date is for 350 Mercedes-Benz urban and intercity buses, which were ordered by Qbuzz, a private bus operator with headquarters in Amersfoort. The vehicles will be employed in the provinces of Drenthe and Groningen. The second major contract is for 75 natural gas-powered Mercedes-Benz urban buses, which were ordered by Connexxion, the Netherlands’ largest public transport company, with headquarters in Hilversum. The environmentally friendly low-emission natural gas buses will be employed in the Arnhem-Nijmegen region.
“We’re very happy about the two big orders from the Netherlands,” says Hartmut Schick, head of Daimler Buses. “They show that our high-quality and environmentally friendly buses are also successful in economically challenging times. As a result, we will be able to fully utilize the production capacity of our European plants until the end of the year.”
Daimler Buses will manufacture Mercedes-Benz brand urban and intercity buses for the two customers. Qbuzz has ordered Citaro LE and Citaro NF solo and articulated urban buses with a low-floor design. The Citaro LE and NF solo vehicles are 12 meters long, while the Citaro NF articulated vehicles are 18 meters long, offering seating for up to 53 passengers. The economical low-emission buses are equipped with the pioneering Mercedes-Benz BlueTec SCR diesel technology and meet the Euro 5 and EEV (Enhanced Environmentally Friendly Vehicle) emissions limits. Qbuzz has also ordered Mercedes-Benz Integro intercity buses, which are 12 meters long and offer seating for 47 passengers. The Mercedes-Benz Integro is an appealing and profitable solution for the bus operator, thanks to the low operating costs, high quality standards, great versatility, and long service life of the vehicle and its components. The whole contract is being financed through Daimler Financial Services Netherlands, while the EvoBus Service partner Wensink will repair and maintain the vehicles.
The other customer, Connexxion, is also focusing on environmental friendliness with its order of 75 natural gas-powered Mercedes-Benz Citaro CNG urban buses. Daimler Buses will be delivering two variants of the Citaro CNG: a solo bus measuring 12 meters and an articulated bus measuring 18 meters. The solo vehicle’s engine has an output of 185 kW (252 hp), while the articulated bus is equipped with a drive generating 240 kW (326 hp). The contract for 75 vehicles consists of 69 solo buses and six articulated buses, all of which generate very low emissions and meet the EEV standard. In total, more than 1,200 Mercedes-Benz Citaro CNG buses are in use by customers.
Ireland and Renault-Nissan Sign Agreement to Develop Electric Vehicles
Ireland’s Energy Minister Eamon Ryan recently announced a major move in the electrification of Irish motoring. The Memorandum of Understanding, signed by Minister Ryan on behalf of the Government and by Padraig McManus for ESB, will create favourable conditions for the distribution of electric vehicles to the Irish market by Renault-Nissan.
In a hugely significant new collaboration between the Government, the semi-state electricity supplier ESB and car manufacturer Renault-Nissan, these electric vehicles will be on Irish roads within two years.
“This historic agreement,” said Minister Ryan, “is proof of the Government’s firm intention to act on the electrification of transport. Some months ago, I announced the Government target of 10% electric vehicles by 2020. Today’s Memorandum of Understanding will help us not only realise, but surpass, this target. We are well on our way, and our streets will see the change very shortly.”
“In November, we sent a call to the market that Ireland was ‘open for business’ on electric cars. Our call has been answered by Renault-Nissan, and I’d like to welcome them to the Irish market with this new product. Today we sign what I hope will be the first of many agreements with interested companies.” The Irish Government’s intentions are not product-exclusive.
“Today’s initiative will transform our streets, will cut carbon emissions and change the face of transport in Ireland,” he said. “Again we see the ESB stepping up to the plate to secure Ireland’s future and I commend them for their vision and work in this regard”.
“This collaboration will provide the world with a model for how electric vehicles can be achieved globally. We will continue to press ahead”.
Renault-Nissan Denki concept
ESB Chief Executive Padraig McManus described today’s development as “an opportunity for Ireland to demonstrate its leadership in the green revolution, including in electric transport”.
“ESB has set out its plans to become carbon-neutral by 2035 and carbon-neutral electricity will power an emissions-free transport system. ESB will roll out a charging network to support the development. We will guarantee open access to all electricity suppliers and car manufacturers and can ensure adherence to the strictest safety standards for the recharging points”, he said.
“The roll-out of electric vehicles will provide major employment opportunities in a number of areas”, he said.
Speaking at the announcement, Andrew Palmer, Senior Vice President, Nissan Motor Company, said the Renault-Nissan Alliance looks forward to a successful partnership with Ireland.
“We regard Ireland as a leader in the EV project. Demography and political support make Ireland one of the most suitable locations for a large-scale roll-out of electric vehicles. Renault and Nissan are particularly pleased to be working with the Irish Government and ESB in putting in place the correct conditions to support electric transport.”
About ESB
Founded in 1927, ESB is Ireland’s leading electricity company. It is a vertically integrated utility that generates, distributes and supplies electricity in a regulated energy market. ESB Group employs approximately 6,500 people, and its subsidiary ESB International employs 1,200 in an overseas business that has spanned more than 100 countries. One of Ireland’s most successful companies, with an annual turnover of €3.5 billion, ESB has grown in value from €2.5 billion in 2002 to approximately €6.5 billion today.
Smith Electric Vehicles Collaborates with Ford Motor Company to Build Electric Vans
Smith Electric Vehicles, the leading manufacturer of commercial electric vehicles, recently announced an electric vehicle development collaboration with Ford Motor Company. Smith Electric Vehicles, a trading division of The Tanfield Group Plc, will work with Ford to introduce a battery-electric light van, the first vehicle in the company’s broad electrification strategy for the North American market which was announced at this year’s Detroit Auto Show. This vehicle will be based on the European-designed Ford Transit Connect which goes on sale in North America this year.
The vehicle, which Smith will assemble in North America, will have a range of up to 100 miles on a full charge, without compromising the Transit Connect’s superior driving experience. It will operate very similarly to a conventional light van, but with smoother acceleration, less noise and zero emissions. The vehicle will be fully branded as a Ford product and will go on sale through selected Ford dealerships in North America in 2010.
Diesel Hybrid Electric Drivetrain Developed for Military Applications
Quantum Fuel Systems Technologies Worldwide, Inc. has introduced a fuel efficient, high performance diesel hybrid electric powertrain, “Q-Force,” after six years of development. This advanced, proprietary 4-wheel drivetrain can be configured for specialized military as well as commercial applications.
The first application of Quantum Q-Force is in a JP-8 fuel compatible diesel engine-based, battery dominant, series-hybrid electric military Alternative Mobility Vehicle (AMV). A number of pre-production prototypes that incorporate Q-Force have been successfully developed and built for testing and evaluation by selected commands to assess mission suitability, supportability, performance objectives, and guidance on final vehicle configuration.
In one configuration, the diesel engine produces 75 horsepower, the electric motor, 133 horsepower, and the powertrain yields 5,463 foot-pounds of torque after gear reduction. Features of Q-Force include:
Sophisticated System Control and Data Acquisition (SCADA) system, incorporating Quantum’s proprietary algorithms
Hybrid control system that minimizes battery size through optimized charge controls and regenerative braking
Optimized engine operation/calibration and generator performance
Advanced traction motor/transmission system
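The quoted performance figures can be restated in SI units with the standard conversion factors (the power and torque numbers themselves are from the article):

```python
# Converting the quoted Q-Force figures to SI units.
HP_TO_KW = 0.7457        # mechanical horsepower -> kilowatts
FTLB_TO_NM = 1.3558      # foot-pounds -> newton-metres

diesel_hp, motor_hp = 75, 133   # quoted engine and motor outputs
torque_ftlb = 5463              # quoted torque after gear reduction

print(f"diesel engine:  {diesel_hp * HP_TO_KW:.0f} kW")
print(f"electric motor: {motor_hp * HP_TO_KW:.0f} kW")
print(f"torque after gear reduction: {torque_ftlb * FTLB_TO_NM:.0f} N·m")
```

That is roughly 56 kW from the diesel engine, 99 kW from the electric motor, and about 7,400 N·m of torque at the wheels after gear reduction.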
Alan P. Niedzwiecki, President and CEO of Quantum said: “Our innovative diesel hybrid electric all-wheel drive system provides high performance, acceleration and extended range, resulting in significant advantages for the U.S. Army in communications, surveillance, targeting, and reconnaissance missions. We believe that Quantum’s Q-Force drivetrain is also very well-suited for commercial applications including homeland security, border patrol, park service operations, and light-duty automobiles.”
In 2008 Quantum introduced a gasoline plug-in hybrid electric drive known as “Q-Drive” in the Fisker KARMA 4-door sports sedan. Fisker Automotive, a “Green American” car company co-founded in 2007 by Quantum and Henrik Fisker, was recently selected by the U.S. Department of Energy for a low-interest loan of $528.7 million under the Advanced Technology Vehicle Manufacturing Loan Program.
About Quantum:
Quantum Fuel Systems Technologies Worldwide, Inc., a fully integrated alternative energy company, is a leader in the development and production of advanced propulsion systems, energy storage technologies, and alternative fuel vehicles.
Thursday, December 10, 2009
Old hay and Alpine ibex horns reveal how grasslands respond to climate change
Researchers studying the reactions of trees to rising CO2 concentration in the atmosphere have it easy. Since trees store the carbon they absorb in wood, all they need to do is take core samples from tree trunks. A centenarian oak will reveal in its annual rings how it coped with the incipient climate change over a period of a hundred years. "However, the grassland vegetation we work with is grazed or dies off in a matter of months and decomposes," explains Prof. Hans Schnyder, who does research on grasslands at the Center for Life and Food Sciences Weihenstephan at the TUM. The Swiss scientist nonetheless wanted to find out how economically grasslands deal with water when temperatures rise and the carbon dioxide concentration in the air increases.
Important in this context is that all plants absorb CO2 from the atmosphere. At the same time they transpire water vapor to cool their sunlit leaves. Both processes run via the stomata, tiny pores in the leaves, the opening size of which plants can regulate. During longer periods of drought plants close the stomata to curb water loss, albeit at the expense of CO2 absorption. Laboratory experiments show that, for a given stoma aperture, an artificial increase of ambient CO2 leads to a temporary increase in the absorption capacity for the gas. However, to ascertain the actual change of water use efficiency in grassland vegetation over the course of the last century, Prof. Schnyder had to find grassland time series comparable in length to those of trees.
This is where the team turned their sights to the Alpine ibex horn collection at the Museum of Natural History in Bern. Ibex store isotopic information in their horns that reflects the water use of the vegetation they consume. The TUM researchers went at the museum collection, which covers the years 1938 to 2006, with a carving knife, to remove tiny samples from the horns. Since ibex horns also have annual rings, the grassland researchers were able to use the samples to draw conclusions about temporal changes in the grassland vegetation of the Bernese Alps where the ibex had grazed.
A unique specimen archive at the research station Rothamsted in England eventually enabled a comparison with a second grassland region. The "Park Grass Experiment" -- the longest running ecological grassland experiment worldwide -- was initiated in Rothamsted over 150 years ago. Since 1857 specimens have been archived there to allow future generations of scientists to gain long-term insights into the local ecosystem using modern research methods. And indeed, the TUM scientists were able to benefit from the hay specimens dating as far back as 150 years. Once again analyzing the isotope signature, they could infer how the English grassland vegetation had utilized the water over the years.
The Weihenstephan researchers thus determined the individual isotope composition of the grassland vegetation in both the Bernese Alps and in the British lowlands over extended periods of time: more than 69 years based on the horns, and as far back as 150 years using the hay specimens. In a second step this data was lined up with climate data, e.g. air temperature and aridity, of the respective region.
The result: In both locations the intrinsic water-use efficiency of the grassland vegetation rose over the years. This implies that the plants improved their water storage potential as temperatures rose and the level of CO2 in the atmosphere increased. Based on these results the TUM scientists have now, for the first time ever, managed to demonstrate the long-term effects of anthropogenic climate change on the water-use efficiency of grasslands.
There were, however, also differences between the two locations. In Switzerland the effective water-use efficiency of the Alpine meadows remained unchanged in spite of the increased intrinsic water-use efficiency of the grassland. This was because, overall, the air had become drier and warmer as a result of the climate change. In England the scientists found evidence for this effect only during the fall. In the spring though -- which in Rothamsted is no drier today than it was 150 years ago -- the water storage potential of grassland vegetation had a real effect. This insight will help to further improve climate simulations. In the past, complex simulation models that included vegetation had to rely on estimates where grassland was concerned. The scientists at the TU Muenchen have now succeeded in prying open this climate research black box.
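The isotope analyses behind these findings rest on a well-established relationship between a plant's carbon isotope signature and its intrinsic water-use efficiency. As a rough sketch of the underlying calculation (the article does not state the exact model or constants the TUM team used; the fractionation values below are standard textbook figures, shown purely for illustration):

```python
def intrinsic_wue(delta13c_plant, delta13c_air=-8.0, ca_ppm=380.0):
    """Intrinsic water-use efficiency A/gs (umol CO2 per mol H2O) from the
    carbon isotope composition (per mil) of plant tissue."""
    a, b = 4.4, 27.0  # diffusion / carboxylation fractionation, per mil
    # Discrimination of the plant against 13C relative to source air
    delta = (delta13c_air - delta13c_plant) / (1.0 + delta13c_plant / 1000.0)
    ci_over_ca = (delta - a) / (b - a)   # simple linear (Farquhar) model
    # 1.6 = ratio of diffusivities of water vapour and CO2 in air
    return ca_ppm * (1.0 - ci_over_ca) / 1.6

# Same leaf isotope signature, rising atmospheric CO2 -> higher intrinsic
# water-use efficiency: the long-term trend the hay and horn series revealed.
print(round(intrinsic_wue(-27.0, ca_ppm=300.0), 1))  # 62.0
print(round(intrinsic_wue(-27.0, ca_ppm=380.0), 1))  # 78.5
```

This is why archived tissue suffices: the isotope ratio fixes ci/ca at the time of growth, and combining it with the known historical CO2 concentration yields the water-use efficiency for that year.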
First phase of pan-tropical forest mapping debuts
The Woods Hole Research Center (WHRC) has initiated a three-year project focused on producing spatially consistent pan-tropical data sets to support the monitoring of forest cover and associated carbon stocks stored in above-ground forest biomass. A circa-2007 high-resolution, cloud-free radar data set from the Japan Aerospace Exploration Agency's (JAXA) ALOS/PALSAR sensor is the cornerstone of the pan-tropical forest cover mapping effort. A circa-2005 500-meter biomass product is being produced through the fusion of optical (MODIS) and lidar (ICESat/GLAS) data provided by the National Aeronautics and Space Administration (NASA). Pantropical mosaics of ALOS/PALSAR and MODIS data are now complete and can be viewed for the first time on Google Earth.
According to Josef Kellndorfer, an associate scientist at the Woods Hole Research Center who is leading the radar-mapping portion of the project, "Japan's cloud-penetrating ALOS/PALSAR sensor has greatly advanced satellite-based forest observation. JAXA's Kyoto and Carbon Initiative has been instrumental in pushing a global radar-based data acquisition and observation strategy since 2006. Given the strong sensitivity of the PALSAR sensor to forest classification, pan-tropical remote sensing of forests has been significantly improved, aiding countries in their ability to build robust national carbon accounting systems."
NASA's MODIS and IceSAT sensors are used in the project to build consistent pantropical maps of forest carbon stocks. Alessandro Baccini, an assistant scientist at WHRC involved with this initiative, explains, "The unprecedented quality and high temporal resolution of MODIS data allow us to produce cloud-free mosaics with many optical information channels, overcoming cloud-cover which affects optical remote sensing in the tropics. Coupled with lidar measurements that provide information on the vertical structure of the vegetation, as well as targeted field observations, it is possible to generate increasingly accurate maps of above-ground carbon stored in woody vegetation across the pan-tropical belt."
According to Craig Dobson, NASA Program Scientist, "NASA is pleased to support this effort through provision of the MODIS and IceSAT data and the processing of JAXA's ALOS PALSAR data. The combination of frequent MODIS coverage with the forest structural information provided by the IceSAT/GLAS lidar and the PALSAR radar data provides a rich data set for monitoring forests. The public release of the pantropical products by WHRC is both timely and a testament to the value of multi-agency and multi-institutional collaboration with unrestricted access to data. This effort is very much in tune with the spirit of the Group on Earth Observations's (GEO) Global Earth Observing System of Systems (GEOSS)."
The Woods Hole Research Center is a partner in this GEO Task on Forest Carbon Tracking. "Satellite observations will be key to measure trends in forest carbon. We are impressed by the Woods Hole Research Center's work, in partnership with JAXA and NASA. Systematically generated data sets such as the present pan-tropical compilation will significantly strengthen national forest tracking capabilities," said José Achache, Executive Director of GEO, the Group on Earth Observations.
The WHRC mapping initiatives are accompanied by on-the-ground work. Through workshops, a visiting scholars program, and other related activities, new maps will be produced, assessed, disseminated and discussed with various stakeholders within countries -- including representatives from government, civil society, indigenous and traditional forest communities, and the private sector. An integral part of the project is to transfer knowledge and skills of forest and carbon mapping to those countries that are increasingly engaged in international efforts to slow deforestation and enable these countries to evaluate alternative options for management of their forest resources. Nadine Laporte, an associate scientist at WHRC who coordinates in-country activities states "field measurements coupled with the satellite observations are an essential part of producing accurate maps of tropical forests, and building capacity with stakeholders in the region."
Luis Solorzano of the Gordon and Betty Moore Foundation (GBMF) states, "As funders, we are proud of the work advanced by WHRC. Our support to WHRC is part of a growing partnership between GBMF, Google.org and the David and Lucile Packard Foundation. Our unusual partnership brings together scientists, private sector, governments and philanthropies, all motivated by a shared commitment to halt the most severe threats to our global environment. Our common goal is to create the capacity to broadly deploy data, information, and analysis tools for the global community to transparently monitor and improve the management of global environment. Most essential, we seek to initiate open spaces for the collaborative creation of knowledge, and to freely share and use it in broad and transparent ways. This work by WHRC is uniquely important for its contribution to our primary purposes and principles."
This work is a partnership project of the Woods Hole Research Center, JAXA's Kyoto and Carbon Initiative, NASA, the Alaska Satellite Facility, SARMAP, Boston University, and SpotImage PlanetAction. Funding for this work has been provided by the Gordon and Betty Moore Foundation, the David & Lucile Packard Foundation, Google.org, and NASA.
Digital avalanche rescue dog: Geolocation system can locate victims to within centimeters
For many skiers and snowboarders, there is nothing quite like being the first to make tracks in the virgin snow, off the regular piste. But this can be a fateful decision, because the risk of avalanche is many times greater here. Once buried under a mass of snow, a person's only hope of survival is if their location can be pinpointed swiftly. If not rescued within half an hour, their chances of being found alive diminish rapidly. Victims stand the best chance of being saved if the uninjured members of their group start searching for them immediately -- but for that the buried victim needs to be wearing an avalanche beacon.
"In the experience of rescue teams not everyone actually carries beacons," says Wolfgang Inninger of the Fraunhofer Institute for Material Flow and Logistics IML. "However, nearly everyone has a cellphone. This is why we decided to enhance our automatic geolocation system that works with Galileo, the future European satellite navigation system."
To do so, two new components have been added to the 'avalanche rescue navigator' ARN: a cellphone location function and software that calculates the position of the buried victim on the basis of local measurements. Starting from the approximate place where the victim is thought to be lying under the snow, the rescuers measure the field strength of the signal transmitted by the cellphone or beacon at three to five reference points. The system then uses a highly precise calculation algorithm to pinpoint the source of the signal, indicating with high probability the location of the buried victim. In this kind of situation, the position relative to the rescue team's starting point is more important than the absolute position relative to global coordinates, which may be subject to measurement inaccuracies. This gives the rescuers immediate information on the direction and distance from their present location at which the victim can be found.
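The ARN's actual algorithm is not published, but the procedure described (field-strength readings at three to five reference points, with the result expressed relative to the rescuers' starting point) resembles textbook range-based localization. A minimal illustrative sketch, assuming a free-space log-distance path-loss model and a known transmit power (both assumptions, not details from the article):

```python
import math

def estimate_position(readings, p0=-30.0, span=50.0, iters=12):
    """Estimate a buried transmitter's position (relative to the rescuers'
    starting point) from field-strength readings at 3-5 reference points.

    readings: list of ((x, y), rssi_dbm) pairs.
    p0: assumed signal strength in dBm at 1 m from the transmitter.
    """
    # Invert the path-loss model rssi = p0 - 20*log10(d) to get a range
    # estimate from each reference point.
    ranges = [(ref, 10.0 ** ((p0 - rssi) / 20.0)) for ref, rssi in readings]

    def cost(pt):
        # Sum of squared range errors at a candidate position
        return sum((math.dist(pt, ref) - d) ** 2 for ref, d in ranges)

    # Coarse-to-fine pattern search, starting at the reference centroid
    guess = (sum(ref[0] for ref, _ in readings) / len(readings),
             sum(ref[1] for ref, _ in readings) / len(readings))
    step = span
    for _ in range(iters):
        guess = min(((guess[0] + dx * step, guess[1] + dy * step)
                     for dx in (-1.0, -0.5, 0.0, 0.5, 1.0)
                     for dy in (-1.0, -0.5, 0.0, 0.5, 1.0)),
                    key=cost)
        step /= 2.0
    return guess
```

With exact ranges and three or more non-collinear reference points, the squared-range cost generically has a single minimum at the transmitter, which is consistent with the three-to-five-point procedure the researchers describe.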
For their development work on the system, the researchers are using the GATE Galileo test and development environment in Berchtesgaden, where transmitter antennas installed on six mountain peaks simulate the Galileo signals. The researchers intend to combine these signals -- and the real ones, after 2012 -- with signals from existing satellite navigation systems such as the American GPS and the Russian Glonass, and to add signals for error estimation and correction. The project is being implemented by a consortium of regional companies, institutes and universities in collaboration with the Berchtesgaden mountain rescue service and the police, and is being sponsored by the German Aerospace Center DLR.
Tuesday, December 8, 2009
U.S. carbon capture projects reap $3.18 billion
Three advanced coal technology projects with carbon capture and commercial-scale storage capabilities will receive $3.18 billion in combined funding from the United States Department of Energy and private investments.
About $979 million in stimulus funds will be released by the department to the awardees and will be leveraged with $2.2 billion in private cost-share capital. The department's investment is the third round of funding under its clean coal power initiative, created in 2005.
“This investment is part of our commitment to advancing carbon capture and storage technologies to the point that widespread, affordable deployment can begin in eight to ten years,” Steven Chu, energy secretary, said.
The projects explore various carbon capture and storage methods. Through $350 million from the Energy Department which represents the biggest federal allocation, Summit Texas Clean Energy L.L.C. of Washington will combine Siemens gasification and power generation with carbon capture technologies. The project will capture 90 percent of carbon dioxide, or 2.7 million metric tons annually, of a proposed 400-megawatt Midland-Odessa, Texas plant.
American Electric Power Company Inc. of Ohio will design, construct and operate a chilled ammonia process for $334 million from the energy department. The process can effectively capture at least 90 percent of the carbon dioxide, about 1.5 million metric tons annually, in a 235-megawatt flue gas stream at the existing 1,300 megawatt Appalachian Power Company Mountaineer power plant near New Haven, West Virginia.
Lastly, the Southern Company Services Inc. of Alabama, for $295 million of the same federal grant, will retrofit a carbon dioxide capture plant on a 160-megawatt flue gas stream at an existing coal-fired power plant north of Mobile, Alabama. The captured carbon dioxide will be compressed and transported through a pipeline. Up to 1 million metric tons annually of the carbon dioxide will be sequestered in deep saline formations.
All in all, the projects target carbon dioxide capture efficiency of 90 percent.
“Clean coal solutions are possible and attainable - and that is evident by all of the supporters behind this project. This is so crucial to move this state and nation forward,” said West Virginia Governor Joe Manchin.
“Coal has and always will be an enormous part of our West Virginia soul and when we invest in new technologies that make it better and cleaner, we are taking control [of] our future – and that is the key,” said Senator Jay Rockefeller.
Coca-Cola bottles cool in emissions-free machines
The Coca-Cola Company, in partnership with Greenpeace, has committed to make all its new vending machines and coolers hydrofluorocarbon-free by 2015.
By using hydrofluorocarbon-free refrigeration, Coca-Cola would be able to reduce its equipment's direct greenhouse gas emissions by 99 percent. Eliminating hydrofluorocarbons across the commercial refrigeration industry would be equivalent to eliminating the annual emissions of Germany or Japan.
Coca-Cola’s new green commitment will help influence the commercial refrigeration market to shy away from using hydrofluorocarbons. The company has already invested more than $50 million in research and development to accelerate the use of climate-friendly cooling technologies.
Next year, Coca-Cola and its bottling partners will purchase a minimum of 150,000 units of hydrofluorocarbon-free equipment, doubling the current rate of purchase to meet its goal to buy 50 percent of all new coolers and vending machines without hydrofluorocarbons by 2012.
The company has approximately 10 million coolers and vending machines worldwide, making up the largest element of Coca-Cola’s total climate impact.
Hydrofluorocarbon-free equipment
As a result of the company’s commitment to use hydrofluorocarbon-free equipment, carbon emission reductions will exceed 52.5 million metric tons over the lifetime of the equipment – the equivalent of removing 11 million cars from the road for one year.
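The article's car equivalence is easy to sanity-check: dividing the lifetime savings by the number of cars implies roughly 4.8 metric tons of CO2 per car per year, in line with commonly cited per-vehicle annual emissions figures:

```python
# Sanity check of the article's car-equivalence figure
lifetime_savings_t = 52.5e6  # metric tons CO2 saved over equipment lifetime
cars_removed = 11e6          # "removing 11 million cars ... for one year"

per_car_t = lifetime_savings_t / cars_removed
print(round(per_car_t, 2))  # 4.77 t CO2 per car-year
```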
Because of Coca-Cola’s supply chain engagement, a major supplier already intends to build a dedicated carbon dioxide compressor production facility to help meet the increasing demand for hydrofluorocarbon-free cooling options across the industry.
“At Coca-Cola, we are deploying our scale and working with suppliers to deliver cost effective alternatives to HFC, for us and for others,” said Rick Frazier, vice president of supply chain for the Coca-Cola Company.
The new green initiative is a direct result of the collaboration with Greenpeace that began in 2000. Greenpeace has urged Coca-Cola to use hydrofluorocarbon-free equipment for the Olympics. By the Torino Games in 2006 and the Beijing Games in 2008, the company was using all hydrofluorocarbon-free technology at Olympic venues.
“Large enterprises have both an opportunity and responsibility to change the game and Coca-Cola’s action leaves no excuse for other companies not to follow,” remarked Kumi Naidoo, executive director of Greenpeace International.
In addition to its refrigeration gas commitment, Coca-Cola also developed a proprietary energy management system that provides energy savings of up to 35 percent.
The beverage bottler is listed on the New York Stock Exchange.
M.I.T. develops cleaner natural gas power
Researchers from the Massachusetts Institute of Technology have developed a new type of natural gas electric power plant with zero carbon dioxide emissions.
Postdoctoral associate Thomas Adams and Lammot du Pont Professor of Chemical Engineering Paul I. Barton are proposing a system that uses solid-oxide fuel cells to produce power from fuel without burning it.
The system would not require any new technologies but would combine existing components in a new configuration for which they have applied for a patent.
The system would run on natural gas, which is considered more environmentally friendly than coal or oil. Presently, natural gas power plants produce an average of 1,235 pounds of carbon dioxide for every megawatt-hour of electricity produced, about one third the emissions of coal power plants.
The system proposed by Mr. Adams and Mr. Barton would not emit into the air any carbon dioxide or other gases believed responsible for global warming; instead, it would produce a stream of mostly pure carbon dioxide.
This stream could be harnessed and stored underground relatively easily, a process known as carbon capture and sequestration (C.C.S.). One additional advantage of the proposed system is that, unlike a conventional natural gas plant with C.C.S. that would consume significant amounts of water, the fuel-cell based system actually produces clean water that could easily be treated to provide potable water as a side benefit, Mr. Adams says.
Carbon pricing needed
The new system could produce power at cost comparable to or less than conventional natural-gas plants and even to coal-burning plants. However, the researchers said that this can only come about if and when a price is set on the emission of carbon dioxide and other greenhouse gases.
Carbon pricing attempts to take into account the true price exacted on the environment by greenhouse gas emission.
Mr. Adams explained that without costs imposed on carbon emissions, the cheapest fuel will always be pulverized coal. But as soon as there is some form of carbon pricing, their system will be the lowest-priced option as long as the price is more than about $15 per metric ton of emitted carbon dioxide.
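The breakeven argument is simple arithmetic: each technology's total cost per megawatt-hour is its generation cost plus the carbon price times its emissions rate, and the threshold is the price at which the two totals cross. A sketch with hypothetical cost figures (the study's actual numbers are not given in the article), chosen here only to reproduce the quoted ~$15-per-ton threshold:

```python
def breakeven_carbon_price(cost_a, emis_a, cost_b, emis_b):
    """Carbon price ($/t CO2) at which technology A's total cost per MWh
    equals technology B's: cost_a + p*emis_a == cost_b + p*emis_b."""
    return (cost_a - cost_b) / (emis_b - emis_a)

# Hypothetical illustration (not the MIT study's cost figures): the
# near-zero-emission fuel-cell gas plant at $75/MWh and ~0 t CO2/MWh
# vs. pulverized coal at $60/MWh and 1.0 t CO2/MWh.
print(breakeven_carbon_price(75.0, 0.0, 60.0, 1.0))  # 15.0 -> $15/t
```

Above that price, coal's emissions penalty outweighs its fuel-cost advantage, which is the sense in which carbon pricing decides the comparison.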
Natural gas already accounts for 22 percent of all United States electricity production, and that percentage is likely to rise in coming years if carbon prices are put into effect.
For these and other reasons, a system that can produce electricity from natural gas at a competitive price with zero greenhouse gas emissions could prove to be an attractive alternative to conventional power plants that use fossil fuels.
Australia carbon laws in doubt
CANBERRA, Dec. 1 (Reuters) - Australia's plans to cut carbon emissions were set for defeat in a hostile Senate after the election of a new opposition leader opposed to carbon-trade laws, setting the stage for a possible early 2010 election.
New Liberal opposition leader Tony Abbott, elected Tuesday, said conservative Senators, many climate change skeptics, would reject the government's carbon emissions trading laws if they are not deferred until early 2010.
Mr. Abbott said he believed in climate change, but told reporters he was opposed to the government's emissions trading scheme model, and was not afraid of fighting an election on the issue.
"As leader, I am not frightened of an election on this issue. This is going to be a tough fight. But it will be a fight. You cannot win an election without a fight," said Mr. Abbott, a boxer in his university days who once studied for the priesthood.
Greg Combet, assistant climate change minister, said the government would still push for its carbon trade laws to be passed this week. He hoped some opposition lawmakers would side with the government and defy Mr. Abbott.
"The extremists have gained control of the Liberal Party. They are opposed to taking action on climate change, they dispute the science," Mr. Combet told reporters.
Prime Minister Kevin Rudd has struggled to have his climate change legislation passed in the Senate before parliament adjourns until February.
Mr. Rudd, who was in Washington on Monday meeting President Barack Obama, was keen to take a lead role at next week's Copenhagen summit on climate change by enacting a "cap-and-trade" scheme requiring polluters to buy permits for their emissions.
Mr. Rudd wants emissions trading to start in Australia in July 2011, covering 75 percent of emissions in the developed world's biggest per capita emitter. The planned carbon-trade scheme would be the biggest outside Europe.
The United States is closely watching Australia's debate and a political agreement on carbon trading in Australia would help garner support for action from other countries.
If the Senate ends up rejecting the carbon scheme for a second time, Mr. Rudd will have a trigger to call an early 2010 election on climate change, most likely in March or April, with polls suggesting his government would win an increased majority.
Business, election uncertainty
Mr. Abbott said while the opposition rejected the emissions trading scheme, it still backed the government's emissions reduction target of at least 5 percent from 2000 levels by 2020, with a 25 percent target if nations agree on an ambitious climate pact in Copenhagen.
The prolonged political debate has caused some dismay among companies, coal and power firms in particular, who see some sort of scheme as inevitable and are looking for pricing certainty.
Banks and fund managers see the emissions trading scheme, the biggest economic policy change in modern Australian history, as a boon for traders, investors and new green technologies, while major polluters generally oppose it as a tax on heavy industry.
International Power Australia, a unit of International Power and Australia's largest private-sector generator, says the uncertainty has impinged on talks with its lenders.
Major miners such as BHP Billiton and oil and gas firms such as Woodside Petroleum have criticized the scheme, though have been mollified somewhat by government pledges last month to raise state compensation.
"This is the worst possible outcome for us, as this guy (Mr. Abbott) is the figurehead for the climate change skeptics within the conservative party," said Tim Hanlin, managing director of Australian Climate Exchange Limited.
"Australian industry is now thrown into total uncertainty regarding a price on carbon and therefore cannot make any informed investment decisions," said Mr. Hanlin.
"This is going to put Kevin Rudd under enormous pressure to call an election on this issue."
But Monash University analyst Nick Economou said Rudd would now miss his Copenhagen deadline and be in no rush for an election, putting the chances of an early poll at 20 percent.
"They may as well play the long game, the patient game," Mr. Economou said, adding that Mr. Rudd would prefer a normal election later in 2010, giving him time to build up an attack against Mr. Abbott's Liberals, with the carbon laws waiting to be passed.
Mr. Rudd has repeatedly said he does not want an early poll and would prefer elections to be held on time in late 2010.
If Mr. Rudd wins a double dissolution election of both houses of parliament, he can then push his climate policy through a special joint sitting of the houses.
Australia carbon laws in doubt
California unveils draft cap-and-trade rules
SAN FRANCISCO, Nov. 25 (Reuters) - California released on Tuesday draft rules for its landmark greenhouse gas cap-and-trade plan that will be the most ambitious United States effort to use the market to address global warming.
State law requires California to cut its carbon dioxide and other greenhouse gas emissions to 1990 levels by 2020. Measures will range from clean vehicle and building rules to the cap-and-trade system that lets factories and power companies trade credits to emit gases that heat up the earth.
Federal rules under debate by Congress could eclipse and pre-empt regional plans, but California and other local governments see themselves as the vanguard of addressing climate change, especially in light of slow national action and setbacks for international talks scheduled in Copenhagen next month.
The draft released on Tuesday shows California, seen as an environmental trend-setter, may take on even more than expected in its first round of cap-and-trade, which will start in 2012.
Gasoline and residential heating fuel suppliers could be included in the first cap-and-trade phase, which had been expected to focus on big pollution sources like power plants and refineries.
"California is the first out of the box," Mary Nichols, state Air Resources Board chair, told reporters on a conference call. The draft rules kick off a comment period that will lead to final regulation next fall.
A less comprehensive Northeastern United States regional trading system is already under way, focusing on carbon dioxide emissions by big emitters. California by contrast plans to include nearly every source of emissions to reach its goal.
California businesses regularly criticize the plan as going too far too fast – and costing too much. Whether the net effect of the plan will be a new green economy or disaster for overburdened businesses is still hotly debated.
Outsize attention
New estimates of plan costs, including suggestions on how much support to give industry, won't be available until an independent advisory group issues a report next year.
The draft avoids what may be the toughest issue – how much to rely on auctions of credits, which would require power companies and the like to buy permission to pollute. The emitters want allowances given to them, especially early on.
But Ms. Nichols said California had shown a strong preference for moving to auction as quickly as possible and that its 2006 global warming law provided clear guidance while politicians in the United States Congress were still building support for a bill.
"Congress started this, you know, as a political exercise to see how many allowances you had to give out to which groups to get them to buy into the program. They didn't have a climate bill," she said.
"We know how many emissions we have to reduce. The question is how do we do it in a way that costs less," added Ms. Nichols, whose Air Resources Board was appointed by state law as the main regulator deciding on how to cut greenhouse gases.
The cost of a ton of carbon dioxide could initially be around $10, based on how other programs have operated, she said. That is about half the current European price. The average American accounts for about 20 tons of carbon emissions per year, according to the Union of Concerned Scientists.
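A quick back-of-the-envelope check puts these figures in per-person terms. The prices and the 20-ton per-capita figure come from the paragraph above; the arithmetic itself is purely illustrative, not part of the draft rules:

```python
# Back-of-the-envelope cost check using the figures quoted above.
price_per_ton = 10.0                 # initial California estimate, $/ton CO2
european_price = 2 * price_per_ton   # "about half the current European price"
per_capita_tons = 20.0               # average American, per Union of Concerned Scientists

annual_cost = per_capita_tons * price_per_ton       # cost at the California price
annual_cost_eu = per_capita_tons * european_price   # cost at the European price
print(f"At ${price_per_ton:.0f}/ton: ${annual_cost:.0f} per person per year")
print(f"At the European price: ${annual_cost_eu:.0f} per person per year")
```

At the initial California price this works out to about $200 per person per year, and roughly double that at the European price, if every ton were priced.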
The cap-and-trade system will account for only about a fifth of California reductions but it draws outside attention, in part because the state, with the largest United States economy and population, is part of the 11-member Western Climate Initiative, which includes American states and Canadian provinces.
China, too, will watch California's action, partly by virtue of the state's partnerships with Chinese provinces, said Derek Walker, climate change director of the Environmental Defense Fund California.
"In many ways this is similar to what you are hearing from international circles now. Everybody is coming to the table with their opening bets," he said. But unlike most, California has committed to cuts and now is working out the details.
Quebec sets 2020 greenhouse gas emission targets
VANCOUVER, British Columbia, Nov. 24 (Reuters) - The Canadian province of Quebec said on Monday it aims to cut its greenhouse gas emissions by 20 percent below 1990 levels by 2020, the same target as that set by the European Union.
"It is a very ambitious target for the government, given that 48 percent of Quebec's total energy currently comes from renewable energy sources," Quebec Premier Jean Charest said in a statement.
Much of Quebec's power comes from massive hydroelectric projects.
Quebecers emit approximately 11 tons per capita of greenhouse gases, which are blamed for climate change. That is half the Canadian average, Mr. Charest said.
The mostly French-speaking province is a member of the Western Climate Initiative, a group of four Canadian provinces and seven western American states, which is working on implementing a carbon cap and trade system in North America by 2012.
Canada's federal government has pledged to cut carbon emissions by 20 percent from 2006 levels by 2020. However, Ottawa is waiting for the United States to finalize its cap-and-trade program before proceeding with its own.
British Columbia pledged in 2007 to cut its emissions of greenhouse gases by 33 percent by 2020, which would put them 10 percent under 1990 levels.
Monday, December 7, 2009
Elevated carbon dioxide levels may mitigate losses of biodiversity from nitrogen pollution
The study, published December 4 in the journal Science, involved a 10-year open-air experiment in which 48 plots planted with 16 different species of plants were tested under ambient and elevated levels of nitrogen and carbon dioxide. Researchers measured the number of species observed in each plot and the plant biomass both above and below ground, as well as factors related to soil, water and light that might affect plant growth.
Over time, the diversity of plants growing in the research plots changed significantly, depending on the combinations of plants and the way added CO2 and nitrogen affected the health of different species. One of the study's key findings is that while the combination of ambient carbon dioxide and nitrogen pollution reduces species richness by 16 percent, adding more CO2 to the mix reduces that change by half.
"From a biodiversity perspective, there was no evidence to support the worst-case scenario, in which impacts of rising CO2 and nitrogen deposition combine to suppress diversity by 30 percent, 40 percent or even 50 percent or more," Reich said. "Instead, their interaction ameliorated the diversity loss due to nitrogen enrichment that occurs under ambient CO2. Given the importance of biodiversity to the effective health and function of our ecosystems, this is good news, or perhaps better labeled as 'not quite as bad' news."
Reich, a Regents professor in the department of forest resources, notes that "while it is a relief to find out that rising CO2 and nitrogen may not directly cause enormous losses of diversity, this finding does not detract from the urgent need for us to curb CO2 emissions given the other critical CO2 effects, such as overheating the planet and threatening marine life through ocean acidification."
Antarctica served as climatic refuge in Earth's greatest extinction event
The new species belongs to a larger group of extinct mammal relatives, called anomodonts, which were widespread and represented the dominant plant eaters of their time.
"Members of the group burrowed in the ground, walked the surface and lived in trees," said Fröbisch, the lead author of the study. "However, Kombuisia antarctica, about the size of a small house cat, was considerably different from today's mammals -- it likely laid eggs, didn't nurse its young and didn't have fur, and it is uncertain whether it was warm blooded," said Angielczyk, Assistant Curator of Paleomammalogy at The Field Museum. Kombuisia antarctica was not a direct ancestor of living mammals, but it was among the few lineages of animals that survived at a time when a majority of life forms perished.
Scientists are still debating what caused the end-Permian extinction, but it was likely associated with massive volcanic activity in Siberia that could have triggered global warming. When it served as refuge, Antarctica was located some distance north of its present location, was warmer and wasn't covered with permanent glaciers, said the researchers. The refuge of Kombuisia in Antarctica probably wasn't the result of a seasonal migration but rather a longer-term change that saw the animal's habitat shift southward. Fossil evidence suggests that small and medium sized animals were more successful at surviving the mass extinction than larger animals. They may have engaged in "sleep-or-hide" behaviors like hibernation, torpor and burrowing to survive in a difficult environment.
Earlier work by Fröbisch predicted that animals like Kombuisia antarctica should have existed at this time, based on fossils found in South Africa later in the Triassic Period that were relatives of the animals that lived in Antarctica. "The new discovery fills a gap in the fossil record and contributes to a better understanding of vertebrate survival during the end-Permian mass extinction from a geographic as well as an ecological point of view," Fröbisch said.
The team found the fossils of the new species among specimens collected more than three decades ago from Antarctica that are part of a collection at the American Museum of Natural History. "At the time those fossils were collected, paleontologists working in Antarctica focused on seeking evidence for the existence of a supercontinent, Pangaea, that later split apart to become separate land masses," said Angielczyk. The fossils collected in Antarctica provided some of the first evidence of Pangaea's existence, and further analysis of the fossils can refine our understanding of events that unfolded 250 million years ago.
"Finding fossils in the current harsh conditions of Antarctica is difficult, but worthwhile," said Angielczyk. "The recent establishment of the Robert A. Pritzker Center for Meteoritics and Polar Studies at The Field Museum recognizes the growing importance of the region," he said.
This research is part of a collaborative study by Dr. Jörg Fröbisch (Department of Geology, Field Museum, Chicago), Dr. Kenneth D. Angielczyk (Department of Geology, Field Museum, Chicago), and Dr. Christian A. Sidor (Burke Museum and Department of Biology, University of Washington), which will be published online December 3, 2009, in Naturwissenschaften.
Funding for this research was provided through a Postdoctoral Research Fellowship of the German Research Foundation (Deutsche Forschungsgemeinschaft) to J. Fröbisch and grants of the National Science Foundation to C. A. Sidor.
Saturday, November 28, 2009
Arctic Heats Up More Than Other Places: High Sea Level Rise Predicted
As a result, glacier and ice-sheet melting, sea-ice retreat, coastal erosion and sea level rise can be expected to continue.
A new comprehensive scientific synthesis of past Arctic climates demonstrates for the first time the pervasive nature of Arctic climate amplification.
The U.S. Geological Survey led this new assessment, a synthesis of the published scientific literature authored by a team of climate scientists from academia and government. The U.S. Climate Change Science Program commissioned the report, which has contributions from 37 scientists from the United States, Germany, Canada, the United Kingdom and Denmark.
The new report also makes several conclusions about the Arctic:
Taken together, the size and speed of the summer sea-ice loss over the last few decades are highly unusual compared with events over the previous few thousand years, especially considering that changes in Earth's orbit over this time have made sea-ice melting less, not more, likely.
Sustained warming of at least a few degrees (more than approximately 4° to 13°F above average 20th century values) is likely to be sufficient to cause the nearly complete, eventual disappearance of the Greenland ice sheet, which would raise sea level by several meters.
The current rate of human-influenced Arctic warming is comparable to peak natural rates documented by reconstructions of past climates. However, some projections of future human-induced change exceed documented natural variability.
The past tells us that when thresholds in the climate system are crossed, climate change can be very large and very fast. We cannot rule out that human-induced climate change will trigger such events in the future.
"By integrating research on the past 65 million years of climate change in the entire circum-Arctic, we have a better understanding on how climate change affects the Arctic and how those effects may impact the whole globe," said USGS Director Mark Myers. "This report provides the first comprehensive analysis of the real data we have on past climate conditions in the Arctic, with measurements from ice cores, sediments and other Earth materials that record temperature and other conditions."
Carbon Emissions Linked To Global Warming In Simple Linear Relationship
These findings will be published in the next edition of Nature, to be released on June 11, 2009.
Until now, it has been difficult to estimate how much climate will warm in response to a given carbon dioxide emissions scenario because of the complex interactions between human emissions, carbon sinks, atmospheric concentrations and temperature change. Matthews and colleagues show that despite these uncertainties, each emission of carbon dioxide results in the same global temperature increase, regardless of when or over what period of time the emission occurs.
These findings mean that we can now say: if you emit that tonne of carbon dioxide, it will lead to 0.0000000000015 degrees of global temperature change. If we want to restrict global warming to no more than 2 degrees, we must restrict total carbon emissions – from now until forever – to little more than half a trillion tonnes of carbon, or about as much again as we have emitted since the beginning of the industrial revolution.
"Most people understand that carbon dioxide emissions lead to global warming," says Matthews, "but it is much harder to grasp the complexities of what goes on in between these two end points. Our findings allow people to make a robust estimate of their contribution to global warming based simply on total carbon dioxide emissions."
In light of this study and other recent research, Matthews and a group of international climate scientists have written an open letter calling on participants of December's Conference of the Parties to the U.N. Framework Convention on Climate Change to acknowledge the need to limit cumulative emissions of carbon dioxide so as to avoid dangerous climate change.
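The linear relationship Matthews describes lends itself to a very simple model. The sketch below uses the article's per-tonne figure as a single constant; treating that constant as uniform across units and scenarios is an illustrative simplification, not the paper's full analysis:

```python
# Illustration of the linear carbon-climate relationship described above.
WARMING_PER_TONNE = 1.5e-12  # degrees C of warming per tonne emitted (article's figure)

def warming_from_emissions(cumulative_tonnes):
    """Temperature change implied by total cumulative emissions (linear model)."""
    return WARMING_PER_TONNE * cumulative_tonnes

# Because the relationship is linear, a 2-degree limit implies a fixed
# all-time emissions budget, independent of when the emissions occur:
budget_tonnes = 2.0 / WARMING_PER_TONNE
print(f"Warming from 1 trillion tonnes: {warming_from_emissions(1e12):.1f} C")
print(f"All-time budget for 2 C: {budget_tonnes:.2e} tonnes")
```

The key point the linearity buys is the last line: the budget is a single number, so timing does not matter, only the cumulative total.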
Abrupt Climate Change: Will It Happen This Century?
"Abrupt" changes can occur over decades or less, persist for decades more, and cause substantial disruptions to human and natural systems.
A new report, based on an assessment of published science literature, makes the following conclusions about the potential for abrupt climate changes from global warming during this century.
Climate model simulations and observations suggest that rapid and sustained September arctic sea ice loss is likely in the 21st century.
The southwestern United States may be beginning an abrupt period of increased drought.
It is very likely that the northward flow of warm water in the upper layers of the Atlantic Ocean, which has an important impact on the global climate system, will decrease by approximately 25-30 percent. However, it is very unlikely that this circulation will collapse or that the weakening will occur abruptly during the 21st century and beyond.
An abrupt change in sea level is possible, but predictions are highly uncertain due to shortcomings in existing climate models.
There is unlikely to be an abrupt release of methane, a powerful greenhouse gas, to the atmosphere from deposits in the earth. However, it is very likely that the pace of methane emissions will increase.
The U.S. Geological Survey led the new assessment, which was authored by a team of climate scientists from the federal government and academia. The report was commissioned by the U.S. Climate Change Science Program with contributions from the National Oceanic and Atmospheric Administration and the National Science Foundation.
"This report was truly a collaborative effort between world renowned scientists who provided objective, unbiased information that is necessary to develop effective adaptation and mitigation strategies that protect our livelihood," said USGS Director Mark Myers. "It summarizes the scientific community's growing understanding regarding the potential for abrupt climate changes and identifies areas for additional research to further improve climate models."
Further research is needed to improve our understanding of the potential for abrupt changes in climate. For example, the report's scientists found that processes such as interaction of warm ocean waters with the periphery of ice sheets and ice shelves have a greater impact than previously known on the destabilization of ice sheets that might accelerate sea-level rise.
http://www.climatescience.gov/default.php
In The Warming West, Climate Most Significant Factor In Fanning Wildfires' Flames
"We found that what matters most in accounting for large wildfires in the Western United States is how climate influences the build up—or production—and drying of fuels," said Jeremy Littell, a research scientist with the University of Washington's Climate Impacts Group and lead investigator of the study. "Climate affects fuels in different ecosystems differently, meaning that future wildfire size and, likely, severity depends on interactions between climate and fuel availability and production."
To explore climate-fire relationships, the scientists used fire data from 1916 to 2003 for 19 ecosystem types in 11 Western States to construct models of total wildfire area burned. They then compared these fire models with monthly state divisional climate data.
The study confirmed what scientists have long observed: that low precipitation and high temperatures dry out fuels and result in significant fire years, a pattern that dominates the northern and mountainous portions of the West. But it also provided new insight on the relationship between climate and fire, such as Western shrublands' and grasslands' requirement for high precipitation one year followed by dry conditions the next to produce fuels sufficient to result in large wildfires.
The study revealed that climate influences the likelihood of large fires by controlling the drying of existing fuels in forests and the production of fuels in more arid ecosystems. The influence of climate leading up to a fire season depends on whether the ecosystem is more forested or more like a woodland or shrubland.
"These data tell us that the effectiveness of fuel reductions in reducing area burned may vary in different parts of the country," said David L. Peterson, a research biologist with the Forest Service's Pacific Northwest Research Station and one of the study's authors. "With this information, managers can design treatments appropriate for specific climate-fire relationships and prioritize efforts where they can realize the most benefit."
Findings from the study suggest that, as the climate continues to warm, more area can be expected to burn, at least in northern portions of the West, corroborating what researchers have projected in previous studies. In addition, cooler, wetter areas that are relatively fire-free today, such as the west side of the Cascade Range, may be more prone to fire by mid-century if climate projections hold and weather becomes more extreme.
In The Warming West, Climate Most Significant Factor In Fanning Wildfires' Flames
Scientists argue for a new type of climate target
"The implications are that global emissions must peak around 2015 and be cut by roughly half between the peak and the year 2030," Steffen Kallbekken, scientist at CICERO, said.
In a new paper in Nature Reports Climate Change Steffen Kallbekken, Nathan Rive, Glen P. Peters and Jan S. Fuglestvedt from CICERO Center for International Climate and Environmental Research -- Oslo, argue for a new type of climate target to be considered:
"Focusing climate policy on a long-term target, such as the EU 2-degree target, provides limited guidance for mitigation over the next few decades, and gives the impression that there is time to delay," said Steffen Kallbekken.
The researchers propose that, in addition to a long-term cumulative emissions budget, a maximum limit on the rate of warming should also be considered as an element in the design of climate policies.
Required mitigation rates are 4-8 percent per year, far exceeding anything achieved in history.
"A short-term target provides clearer guidance on mitigation in the near term, limits potentially dangerous rates of warming, and allows easier inclusion of potent and toxic short-lived climate components," Kallbekken said.
"A short-term cumulative emissions target, for example 190 GtC for the period 2010-2030, is a useful approach to limit the rate of warming, while at the same time keeping the focus on what matters in the long term: reducing CO2 emissions."
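The arithmetic behind these figures is straightforward to check. Assuming a 2015 peak and a halving by 2030, as quoted above, a constant annual cut of roughly 4.5 percent is required, consistent with the 4-8 percent range:

```python
# Back-of-envelope check (not from the paper): if emissions peak in 2015
# and must be halved by 2030, what constant annual cut achieves that?
peak_year, target_year = 2015, 2030
fraction_remaining = 0.5

years = target_year - peak_year                      # 15 years
annual_rate = 1 - fraction_remaining ** (1 / years)  # constant fractional cut

print(f"{annual_rate:.1%} per year")  # ~4.5%, within the quoted 4-8% range
```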
Including Environmental Data Improves Effectiveness Of Invasive Species Range Predictions
Inés Ibáñez of the University of Michigan and her colleagues examined not only historical and current climatic data, but also historical environmental information from both the native and invaded ranges of three New England invasive plants: Japanese barberry, bittersweet and winged euonymus (or burning bush). The models took into account human development, disturbances and agricultural land use; habitat measures of local ground cover, such as forest type and wetlands type, were also included.
The researchers found that although climate plays a large role in predicting invasive species distribution, the inclusion of land use and habitat data improves the explanatory power of their models. In some instances, the combination of an unfavorable climate with suitable landscape cover increased the probability of species establishment. Conversely, some areas with favorable climates became less suitable when unfavorable habitat data were included.
Most importantly, the researchers write, their models can be modified and used in other systems to predict biological invasions anywhere in the world.
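The kind of model the researchers describe can be sketched as a logistic regression in which climate and land-use covariates both contribute to establishment probability. The coefficients below are hypothetical, chosen only to illustrate how suitable habitat can offset an unfavorable climate:

```python
import math

# Toy sketch of the modeling idea (hypothetical coefficients, not the
# study's fitted model): presence probability from climate plus land use.
def establishment_prob(climate_suitability, landuse_suitability,
                       b0=-4.0, b_climate=3.0, b_landuse=3.0):
    """Logistic model in which climate and habitat covariates both contribute."""
    z = b0 + b_climate * climate_suitability + b_landuse * landuse_suitability
    return 1 / (1 + math.exp(-z))

# Climate alone looks unfavorable (0.3), but suitable land cover (0.9)
# raises establishment probability, as the study observed in some areas.
print(establishment_prob(0.3, 0.9))  # climate-poor, habitat-good
print(establishment_prob(0.3, 0.1))  # climate-poor, habitat-poor
```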
Paleoecologists Offer New Insight Into How Climate Change Will Affect Organisms
According to Booth and his colleagues, one of the biggest challenges facing ecologists today is predicting how climate change will affect the distribution of organisms in the future. Combining the environmental conditions that allow a particular species to exist with the output of climate models is a commonly used approach to determining where those conditions will occur in the future. However, according to the authors, there are some potential problems with this correlational approach that ecologists have traditionally used.
"This traditional prediction approach on its own is insufficient," said Booth. "It needs to be integrated with mechanistic and dynamic ecological modeling and systematic observations of past and present patterns and dynamics."
The paper uses examples from recent paleoecological studies to highlight how past climate variability has affected the distributions of tree species, and even how events that occurred many centuries ago still shape present-day distribution patterns. For example, the authors note that some populations of a Western US tree species owe their existence to brief periods of favorable climatic conditions that allowed colonization in the past, such as a particularly wet interval during the 14th century.
"The climate system varies at all ecologically relevant time scales," said Booth. "We see differences year to year, decade to decade, century to century and millennium to millennium. When trying to understand how species and populations will respond to changing climate, it's not just changes in the mean climate state that need to be considered, but also changes in variability."
The article was written by Stephen Jackson of the Department of Botany and Program in Ecology at the University of Wyoming, Julio Betancourt of the U.S. Geological Survey in Arizona, Robert Booth of the Department of Earth and Environmental Sciences at Lehigh University, and Stephen Gray of the Wyoming Water Resources Data System and Wyoming State Climate Office of the University of Wyoming. It was published on Sept. 23, 2009.
Plants Could Override Climate Change Effects On Wildfires
Philip Higuera of Montana State University and his colleagues show that although changing temperatures and moisture levels set the stage for changes in wildfire frequency, they can often be trumped by changes in the distribution and abundance of plants. Vegetation plays a major role in determining the flammability of an ecosystem, he says, potentially dampening or amplifying the impacts that climate change has on fire frequencies.
"Climate is only one control of fire regimes, and if you only considered climate when predicting fire under climate-change scenarios, you would have a good chance of being wrong," he says. "You wouldn't be wrong if vegetation didn't change, but the greater the probability that vegetation will change, the more important it becomes when predicting future fire regimes."
Higuera and his colleagues examined historical fire frequency in northern Alaska by analyzing sediments at the bottom of lakes. Using meter-long samples, called sediment cores, Higuera and his colleagues measured changes in the abundance of preserved plant parts, such as pollen, to determine the types of vegetation that dominated the landscape during different time periods in the past. Like rings in a tree, different layers of sediment represent different times in the past.
The researchers used radiocarbon dating to determine the sediment's age, which dates as far back as 15,000 years. They then measured charcoal deposits in the sediment to determine fire frequency during time periods dominated by different vegetation. Finally, they compared their findings to known historical climate changes.
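The final step, deriving fire frequency for each vegetation period from dated charcoal peaks, can be illustrated with hypothetical numbers. The ages and zone boundaries below are invented for illustration, not the study's data:

```python
# Hypothetical illustration of the analysis step: given radiocarbon-dated
# charcoal peaks (fire events, in years before present) and the vegetation
# zone boundaries inferred from pollen, compute fires per 1000 years.
charcoal_peak_ages = [300, 900, 1400, 2100, 2600, 3400, 6200, 9100, 12000]

vegetation_zones = {            # (start, end) in years before present
    "spruce forest": (0, 5000),
    "deciduous woodland": (5000, 10500),
    "shrub tundra": (10500, 15000),
}

for zone, (start, end) in vegetation_zones.items():
    fires = [a for a in charcoal_peak_ages if start <= a < end]
    rate = len(fires) / (end - start) * 1000  # fires per 1000 years
    print(f"{zone}: {rate:.2f} fires per 1000 yr")
```

With these invented numbers, the flammable spruce period shows a higher fire frequency than the deciduous period despite its cooler climate, mirroring the pattern the authors report.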
In many cases, the authors discovered, changes in climate were less important than changes in vegetation in determining wildfire frequency. Despite a transition from a cool, dry climate to a warm, dry climate about 10,500 years ago, for example, the researchers found a sharp decline in the frequency of fires. Their sediment cores from that time period revealed a vegetation change from flammable shrubs to fire-resistant deciduous trees, a trend which Higuera thinks was enough to offset the direct effects of climate on fire frequencies.
"In this case, a warmer climate was likely more favorable for fire occurrence, but the development of deciduous trees on the landscape offset this direct climatic effect. Consequently, we see very little fire," Higuera says.
Similarly, during the development of the modern spruce-dominated forest about 5000 years ago, temperatures cooled and moisture levels increased, which – considered alone – would create unfavorable conditions for frequent fires. Despite this change, the authors observed an increase in fire frequency, a pattern they attribute to the high flammability of the dense coniferous forests.
Higuera thinks this research has implications for predictions of modern-day changes in fire regimes based on climate change. These findings, Higuera says, emphasize that predicting future wildfire frequency shouldn't hinge on the direct impacts of climate change alone.
"Climate affects vegetation, vegetation affects fire, and both fire and vegetation respond to climate change," he says. "Most importantly, our work emphasizes the need to consider the multiple drivers of fire regimes when anticipating their response to climate change."
Monday, November 23, 2009
New climate treaty could put species at risk, scientists argue
A team of eleven of the world's top tropical forest scientists, coordinated by the University of Leeds, warn that while cutting clearance of carbon-rich tropical forests will help reduce climate change and save species in those forests, governments could risk neglecting other forests that are home to large numbers of endangered species.
Under new UN Framework Convention on Climate Change (UNFCCC) proposals, the Reduced Emissions from Deforestation and Degradation (REDD) scheme would curb carbon emissions by financially rewarding tropical countries that reduce deforestation.
Governments implicitly assume that this is a win-win scheme, benefiting climate and species. Tropical forests contain half of all species and half of all carbon stored in terrestrial vegetation, and their destruction accounts for 18% of global carbon emissions.
However, in a paper published in the latest issue of Current Biology, the scientists warn that if REDD focuses solely on protecting forests with the greatest density of carbon, some biodiversity may be sacrificed.
"Concentrations of carbon density and biodiversity in tropical forests only partially overlap," said Dr Alan Grainger of the University of Leeds, joint leader of the international team. "We are concerned that governments will focus on cutting deforestation in the most carbon-rich forests, only for clearance pressures to shift to other high biodiversity forests which are not given priority for protection because they are low in carbon."
"If personnel and funds are switched from existing conservation areas they too could be at risk, and this would make matters even worse."
If REDD is linked to carbon markets then biodiversity hotspot areas -- home to endemic species most at risk of extinction as their habitats are shrinking rapidly -- could be at an additional disadvantage, because of the higher costs of protecting them.
According to early estimates up to 50% of tropical biodiversity hotspot areas could be excluded from REDD for these reasons. Urgent research is being carried out across the world to refine these estimates.
Fortunately, the UN Framework Convention on Climate Change is still negotiating the design of REDD and how it is to be implemented.
The team is calling for rules to protect biodiversity to be included in the text of the Copenhagen Agreement. It also recommends that the Intergovernmental Panel on Climate Change give greater priority to studying this issue, and to producing a manual to demonstrate how to co-manage ecosystems for carbon and biodiversity services.
"Despite the best of intentions, mistakes can easily happen because of poor design" said Dr Grainger. "Clearing tropical forests to increase biofuel production to combat climate change is a good example of this. Governments still have time at Copenhagen to add rules to REDD to ensure that it does not make a similar mistake. A well designed REDD can save many species and in our paper we show how this can be done."
Blue Energy Seems Feasible And Offers Considerable Benefits
On 3 November, Jan Post is presenting his research as he defends his doctoral dissertation on this subject at Wageningen University.
The principle of generating electricity by mixing salt and fresh water, exploiting the difference in salt concentration between them, has been known for more than 100 years. It was first tested in practice in a laboratory in the 1950s. There are two methods for generating blue energy: pressure-retarded osmosis and reverse electrodialysis.
Post, in his research, has focused mainly on the latter because it is the more attractive method of generating energy from sea and river water. With his research into the practical applicability, techniques and preconditions for large-scale energy generation from salinity gradients, he was the first to demonstrate that very high yields are possible. In the laboratory, it is possible to recover more than 80% of the energy from salinity gradients; the technical feasibility would be 60-70% and the economic feasibility a little lower than that.
There are differences among continents: the technical potential in Australia (65%) or Africa (61%) is greater than in South America (47%). There are also considerable differences between rivers -- there are 5472 large rivers worldwide. These differences depend on the salt concentration in the rivers and seas, temperature, and environmental factors. The Rhine is one of the most 'energetic' rivers in Europe.
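A rough upper bound on the energy involved can be computed from the osmotic pressure of seawater via the van 't Hoff relation. This is a textbook simplification, not Post's own calculation, and it assumes river water mixed into a large excess of seawater:

```python
# Idealized upper bound (a simplification, not Post's calculation): the
# energy released per m^3 of river water mixed into a large excess of
# seawater equals the seawater's osmotic pressure (van 't Hoff relation).
R = 8.314          # J/(mol K), gas constant
T = 293.0          # K, roughly 20 degC
c_salt = 500.0     # mol/m^3 NaCl in seawater (~0.5 M, an assumed value)
i = 2              # van 't Hoff factor: NaCl dissociates into two ions

osmotic_pressure = i * R * T * c_salt  # Pa; note 1 Pa = 1 J/m^3
energy_per_m3 = osmotic_pressure       # J per m^3 of river water
print(f"{energy_per_m3 / 1e6:.1f} MJ per m^3 theoretical maximum")
print(f"{0.8 * energy_per_m3 / 1e6:.1f} MJ at the ~80% laboratory yield")
```

The theoretical figure of roughly 2.4 MJ per cubic metre of river water gives a sense of scale for the percentages quoted above.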
Afsluitdijk
Post investigated the possibility of recovering energy from the Rhine and the Maas rivers. He estimated the combined technical potential of both rivers to be 2.4 gigawatts, of which he believes 1.5 gigawatts would be economically feasible to recover -- enough to supply 4 million households in the Netherlands. A power station of around 200 megawatts -- comparable with a park containing 200 wind turbines -- could be placed at the Afsluitdijk (the famous Closure Dike in the northern part of the Netherlands), which, according to Post, is a suitable place for the large-scale trials that need to be carried out. This test location could be combined with the redesign of the dike that is already being planned. Heavy investment is necessary, but this type of clean energy is extremely promising and, since alternatives to fossil energy are essential, the investment would be worthwhile in every respect. It will be at least ten years before the first commercial power stations are operational, Post says.
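A quick plausibility check on these figures (reader's arithmetic, not Post's): 1.5 GW of continuous power shared among 4 million households works out to an annual consumption in the range of a typical household:

```python
# Sanity-check the quoted numbers: 1.5 GW continuous across 4 million homes.
power_w = 1.5e9      # W, the economically feasible potential quoted
households = 4e6     # households supplied

per_household_w = power_w / households           # average continuous draw
annual_kwh = per_household_w * 24 * 365 / 1000   # per-household energy/year
print(f"{per_household_w:.0f} W average, {annual_kwh:.0f} kWh per year")
```

About 375 W on average, or roughly 3300 kWh per year per household, which is a plausible figure for domestic electricity use, so the claim is internally consistent.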
Technological developments
Post believes that in the next few years it will be necessary to work even more intensively on two technological developments that will bring down the present, rather high, price of generating blue electricity. An appropriate membrane technology should be developed and, furthermore, such membranes should become much cheaper by introducing mass production. The technique should also be robust enough to work both when the water is polluted and when living organisms accumulate on the membranes (biofouling). His research showed that both hindrances could be removed in the future.
Volatile gas could turn Rwandan lake into a freshwater time bomb
Scientists can't say for sure if the volatile mixture at the bottom of the lake will remain still for another 1,000 years or someday explode without warning. In a region prone to volcanic and seismic activity, the fragility of Lake Kivu is a serious matter. Compounding the precarious situation is the presence of approximately 2 million people, many of them refugees, living along the north end of the lake.
An international group of researchers will meet Jan. 13-15 in Gisenyi, Rwanda, to grapple with the problem of Lake Kivu. A grant from the National Science Foundation won by Rochester Institute of Technology will fund the travel and lodging for 18 scientists from the United States to attend the three-day workshop. Anthony Vodacek, conference organizer and associate professor at RIT's Chester F. Carlson Center for Imaging Science, is working closely with the Rwandan Ministry of Education to organize the meeting.
"Rwandan universities suffered greatly in the 1994 genocide and there are few Rwandan scientists performing significant work on the lake or within the rift system," Vodacek notes. "We will work with the government to identify interested researchers."
Vodacek is convening the workshop with Cindy Ebinger, an expert in East African Rift tectonics at the University of Rochester, and Robert Hecky, an expert in limnology -- the study of lake systems -- at University of Minnesota-Duluth. Core samples Hecky took in the 1970s initially brought the safety of Lake Kivu under question.
Addressing the lake as a whole system is a new concept for the workshop participants, who will bring their expertise in volcanology, tectonics and limnology to the problem. Vodacek's goal is to prioritize research activities and improve communication between the North American, European and African collaborators.
"Most scientists are fairly in agreement that the lake is pretty stable; it's not as if it's going to come bursting out tomorrow," Vodacek says. "But in such a tectonically and volcanically active area, you can't tell what's going to happen."
One of the problems with Lake Kivu is that the 1,600-foot deep lake never breathes. The tropical climate helps stagnate the layers of the lake, which never mix or turn over. In contrast, fluctuating temperatures in colder climates help circulate lake water and prevent gas build up. Lake Kivu is different from both temperate and other tropical lakes because warm saline springs, arising from ground water percolating through the hot fractured lava and ash, further stabilize the lake. Scientists at the workshop will consider how these spring inputs may vary over time under changing climates and volcanic activity.
A number of catalysts could destabilize the gas resting at the bottom of Lake Kivu. It could be an earthquake, a volcanic explosion, a landslide or even the methane mining that has recently united Rwandan and Congolese interests.
Close calls occurred in 2008 when an earthquake occurred near the lake and in 2002 when a volcanic eruption destroyed parts of Goma in the Democratic Republic of Congo, only 11 miles north of Lake Kivu. Although scientists were alarmed, neither event sufficiently disturbed the gas.
Vodacek likens the contained pressure in the lake to a bottle of carbonated soda or champagne. "In the lake, you have the carbon dioxide on the bottom and 300 meters of water on top of that, which is the cap," he says. "That's the pressure that holds it. The gas is dissolved in water."
When the cap is removed, bubbles form and rise to the surface. More bubbles form and create a column that drags the water and the gas up to the surface in a chain reaction.
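The strength of that "cap" can be estimated from hydrostatic pressure alone. The sketch below is a simplification that ignores salinity and temperature; it shows why water saturated with gas at depth becomes heavily supersaturated if it rises:

```python
# Rough sketch of the "cap" (a simplification): the hydrostatic pressure of
# the overlying water keeps the gas in solution. By Henry's law, dissolved-gas
# capacity scales with pressure, so water carried to the surface is suddenly
# supersaturated and bubbles out, like an opened soda bottle.
rho = 1000.0    # kg/m^3, water density (salinity ignored)
g = 9.81        # m/s^2, gravitational acceleration
depth = 300.0   # m of water above the gas-rich layer, as quoted
atm = 101325.0  # Pa, surface atmospheric pressure

pressure = atm + rho * g * depth
print(f"total pressure at depth: {pressure / atm:.0f} atm")
# Roughly 30x surface pressure, so the deep water can hold on the order of
# 30x more dissolved gas than it could at the surface.
```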
"The question is, and what's really unknown, is how explosive is that?" Vodacek says.
Through his own research Vodacek plans to simulate the circulation of Lake Kivu. Modeling the circulation patterns above the layers of carbon dioxide and methane will help determine the energy required to disrupt the gas and cause Lake Kivu to explode.
Controllable Rubber Trailing Edge Flap To Reduce Loads On Wind Turbine Blades
“By providing the blade with a movable trailing edge, it is possible to control the load on the blade and extend the lifetime of the wind turbine components. This is similar to the technique used on aircraft, where flaps regulate the lift during the most critical phases, such as take-off and landing,” explains Helge Aagaard Madsen, Research Specialist on the project.
However, there is a difference. Whereas on aircraft the movable flaps are non-deformable elements hinged to the trailing edge of the main wing, this new technique gives the wind turbine blade a continuous profile surface even when the trailing edge moves. This is possible because the trailing edge is made of an elastic material and forms an integrated part of the main blade.
Robust design of rubber
In 2004 Risø DTU applied for the first patent for this basic technique of designing a flexible, movable trailing edge for a wind turbine blade. Since then there has been a significant development with regard to the project. By means of so-called "Gap-funding" provided by the Ministry of Science, Technology and Innovation and by the local Region Zealand it has been possible to develop such ideas into a prototype stage.
Part of the research has been aimed at the design and development of a robust controllable trailing edge. This has now led to the manufacturing of a trailing edge of rubber with built-in cavities that are fibre-reinforced. The cavities in combination with the directional fibre reinforcement provide the desired movement of the trailing edge, when the cavities are being put under pressure by air or water.
“In this project a number of different prototypes have been manufactured with a chord length of 15 cm and a length of 30 cm. The best version shows very promising results in terms of deflection and in terms of the speed of the deflection” says Helge Aagaard.
The size of the prototype fits a blade airfoil section with a chord of one metre, and such a blade section is now being produced for testing in a wind tunnel.
The capability of the trailing edge to control the load on the blade section is going to be tested in a wind tunnel. This part of the development process is supported by GAP-funding from Region Zealand.
“If the results confirm our estimated performance, we will test the rubber trailing edge on a full-scale wind turbine within a few years,” says Helge Aagaard.
Dutch Electricity System Can Cope With Large-scale Wind Power
Wind is variable and can only partly be predicted, which makes the large-scale use of wind power in the electricity system tricky. PhD candidate Bart Ummels investigated the consequences of using a substantial amount of wind power within the Dutch electricity system. He used simulation models, such as those developed by transmission system operator TenneT, to pinpoint potential problems and solutions.
His results indicate that wind power requires greater flexibility from existing power stations. Sometimes larger reserves are needed, but more frequently power stations will have to decrease production in order to make room for wind-generated power. It is therefore essential to continually recalculate the commitment of power stations using the latest wind forecasts. This reduces potential forecast errors and enables wind power to be integrated more efficiently.
Ummels looked at wind power capacities up to 12 GW, of which 8 GW at sea, enough to meet about one third of the Netherlands’ demand for electricity. Dutch power stations are able to cope at any time with variations in electricity demand and wind power supply, as long as use is made of up-to-date, improved wind forecasts. It is TenneT’s task to integrate large-scale wind power into the electricity grid. Lex Hartman, TenneT’s Director of Corporate Development: “In a joint effort, TU Delft and TenneT further developed the simulation model that can be used to study the integration of large-scale wind power. The results show that in the Netherlands we can integrate between 4 GW and 10 GW into the grid without needing any additional measures.”
Surpluses
Ummels: “Instead of the common question ‘What do we do when the wind isn’t blowing?’, the more relevant question is ‘Where do we put all the electricity if it is very windy at night?’ This is because a coal-fired power station, for instance, cannot simply be switched off. One solution is provided by the international trade in electricity, because other countries can often use the surplus. Moreover, broadening the ‘opening hours’ of the international electricity market benefits wind power. At the moment, utilities determine one day ahead how much electricity they intend to purchase or sell abroad. Wind power can be used better if the time between the trade and the wind forecast is shorter.”
No energy storage
Ummels’ research also demonstrates that energy storage is not required. The results indicate that the international electricity market is a promising and cheaper solution for the use of wind power.
Making power stations more flexible is also better than storage. Using heat boilers, for instance, lets combined heat and power plants operate more flexibly, which can free up capacity for wind power at night.
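The heat-boiler point can be made concrete with a minimal night-time balance sketch. Every number here (heat demand, the CHP power-to-heat ratio, night-time electricity demand) is a hypothetical chosen for illustration: when a boiler covers part of the heat demand, the CHP plant's must-run electricity output drops, leaving more room for wind under the night-time load.

```python
# Illustrative night-time balance for a district served by a CHP plant.
HEAT_DEMAND_MW = 100      # heat the district needs overnight (assumed)
CHP_POWER_TO_HEAT = 0.8   # MW electricity per MW heat from CHP (assumed)
NIGHT_ELEC_DEMAND_MW = 60 # assumed night-time electricity demand

def room_for_wind(boiler_heat_mw):
    """Heat covered by the boiler no longer forces CHP electricity
    onto the grid, so more wind fits under the night-time demand."""
    chp_heat = HEAT_DEMAND_MW - boiler_heat_mw
    chp_electricity = chp_heat * CHP_POWER_TO_HEAT  # must-run output
    return max(0.0, NIGHT_ELEC_DEMAND_MW - chp_electricity)

print(room_for_wind(0))   # boiler off: CHP output exceeds demand
print(room_for_wind(50))  # boiler covers half the heat: room opens up
```

With the boiler off, the CHP plant's must-run output alone exceeds night-time demand and wind would have to be curtailed; shifting heat production to the boiler opens that room without any storage.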
The use of wind power in the Dutch electricity system could reduce production costs by EUR 1.5 billion annually and CO2 emissions by 19 million tons a year.