Global installed wind power capacity continued to grow in 2011, albeit at a slightly lower rate than in 2009 and 2010, according to new research conducted by the Worldwatch Institute for its Vital Signs Online service. The world now has approximately four times the installed wind capacity that it did in 2005, reflecting the combined effects of falling prices, improved technology, global investment, and various incentive programs. China led the way with a 43 percent share of global capacity additions in 2011, followed by the United States at 17 percent, India with almost 7 percent, and Germany at 5 percent, writes report author and Worldwatch’s Climate and Energy Program Manager Mark Konold.
“China continues to lead the world in wind capacity additions, having increased its capacity a remarkable 40 percent since 2010,” said Konold. “But a gap remains between this installed capacity and the amount of wind power that is actually available for use in the country. Because of grid connection challenges and other issues, China is struggling to use all of the electricity generated by its turbines.”
Despite large increases in installed wind power capacity, several Chinese provinces, including Inner Mongolia and Gansu, have seen a significant portion of their generation capacity idled by technical problems. Over the next five years, China plans to invest more than US$400 billion in improvements to its electrical grid that will enable it to fully integrate its total installed wind capacity by 2015.
In 2011, the United States accounted for approximately 17 percent of global wind power capacity additions. Although the country generated 27 percent more electricity from wind in 2011 than in 2010, wind power still accounts for less than 3 percent of total U.S. power generation, according to the report. Konold credits much of the growth in U.S. wind power capacity to the federal Production Tax Credit (PTC), which helped to finance approximately 4,000 megawatts of new capacity by reducing corporate income tax by 2.2 cents for every kilowatt-hour produced. But if the PTC is not extended beyond its scheduled expiration date at the end of this year, he cautions, the industry could be negatively affected.
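To put the PTC figures in perspective, a back-of-the-envelope sketch can estimate the annual value of the credit for the roughly 4,000 megawatts of new capacity the article mentions. The 30 percent capacity factor used below is an illustrative assumption, not a figure from the report:

```python
# Rough estimate of the annual PTC value for 2011's new U.S. wind capacity.
# Figures from the article: 4,000 MW of new capacity, 2.2 cents per kWh.
# The 30% capacity factor is an assumed typical onshore value.

new_capacity_mw = 4_000
credit_per_kwh = 0.022        # dollars per kilowatt-hour
capacity_factor = 0.30        # assumption, not from the report
hours_per_year = 8_760

annual_mwh = new_capacity_mw * hours_per_year * capacity_factor
annual_credit = annual_mwh * 1_000 * credit_per_kwh  # convert MWh to kWh

print(f"Estimated annual generation: {annual_mwh:,.0f} MWh")
print(f"Estimated annual PTC value: ${annual_credit:,.0f}")
```

Under these assumptions the new capacity would generate roughly 10.5 million megawatt-hours a year, worth on the order of $230 million in annual tax credits, which suggests why the industry watches the PTC's expiration so closely.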
The report also discusses wind power developments in the European Union, where Germany regained its position as regional leader for installed capacity. Currently, wind accounts for almost 8 percent of the country’s electricity consumption. Although Spain has added only about a third of total EU capacity additions since 2008, wind power accounts for almost 16 percent of the country’s electricity consumption. Economic instability has had some negative impacts on European wind power, however, pushing future growth projections down and potentially hampering investment.
Worldwide, wind power prices fell to $1.2 million per megawatt in the first half of 2011, mainly because of improvements in supply chain efficiency and economies of scale. Competition from Chinese manufacturers, whose excess manufacturing capacity has flooded the market, also played a role. In addition, the capacity factor of wind turbines (the ratio of actual output to nameplate capacity) continues to rise as better technologies enter the market, further driving down the cost per kilowatt-hour. Combined, these factors are expected to bring down the cost of wind energy by 12 percent by 2016, making onshore wind cost-competitive with coal, gas, and nuclear power.
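The capacity factor mentioned above is a simple ratio, and a short sketch makes the definition concrete. The turbine size and annual output below are illustrative assumptions, not figures from the report:

```python
# Capacity factor: actual annual energy output divided by the output the
# turbine would produce running at nameplate capacity all year.
# Numbers here are illustrative, not from the Worldwatch report.

nameplate_mw = 2.5            # hypothetical modern onshore turbine
hours_per_year = 8_760
actual_output_mwh = 7_000     # assumed annual generation

max_possible_mwh = nameplate_mw * hours_per_year   # output at full power all year
capacity_factor = actual_output_mwh / max_possible_mwh

print(f"Capacity factor: {capacity_factor:.1%}")
```

A turbine with these assumed numbers would have a capacity factor of about 32 percent; as the article notes, raising this ratio through better technology lowers the effective cost of each kilowatt-hour.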
“Global wind power growth looks very strong and is on a continued rise, largely because of China’s incredible level of investment,” said Konold. “Take that away, and the picture looks more muddled. Developed economies are not reaching their fullest potential due to financial and policy uncertainty, and many developing economies are running into technical problems, despite slightly stronger growth in wind power capacity. Although continued growth in wind power won’t be as strong as it could be, as the supply increases and prices fall, wind energy is quite likely to continue its upward trend.”
Global warming may initially make the grass greener, but not for long, according to new research results.
The findings, published this week in the journal Nature Climate Change, show that plants may thrive in the early stages of a warming environment but then begin to deteriorate quickly.
“We were really surprised by the pattern, where the initial boost in growth just went away,” said scientist Zhuoting Wu of Northern Arizona University (NAU), a lead author of the study. “As ecosystems adjusted, the responses changed.”
Ecologists subjected four grassland ecosystems to simulated climate change during a decade-long study.
Plants grew more the first year in the global warming treatment, but this effect progressively diminished over the next nine years and finally disappeared.
The research shows the long-term effects of global warming on plant growth, on the plant species that make up a community, and on changes in how plants use or retain essential resources like nitrogen.
“The plants and animals around us repeatedly serve up surprises,” said Saran Twombly, program director in the National Science Foundation (NSF)’s Division of Environmental Biology, which funded the research.
“These results show that we miss these surprises because we don’t study natural communities over the right time scales. For plant communities in Arizona, it took researchers 10 years to find that responses of native plant communities to warmer temperatures were the opposite of those predicted.”
The team transplanted four grassland ecosystems from a higher to lower elevation to simulate a future warmer environment, and coupled the warming with the range of predicted changes in precipitation–more, the same, or less.
The grasslands studied were typical of those found in northern Arizona along elevation gradients from the San Francisco Peaks down to the Great Basin Desert.
The researchers found that long-term warming resulted in loss of native species and encroachment of species typical of warmer environments, ultimately pushing the plant community toward less productive species.
The warmed grasslands also cycled nitrogen more rapidly. Scientists expected this to make more nitrogen available to plants, helping them grow more. But instead much of the nitrogen was lost, converted to nitrogen gases in the atmosphere or leached out by rainfall washing through the soil.
Bruce Hungate, senior author of the paper and an ecologist at NAU, said the study challenges the expectation that warming will increase nitrogen availability and cause a sustained increase in plant productivity.
“Faster nitrogen turnover stimulated nitrogen losses, likely reducing the effect of warming on plant growth,” Hungate said. “More generally, changes in species, changes in element cycles–these really make a difference. It’s classic systems ecology: the initial responses elicit knock-on effects, which here came back to bite the plants. These ecosystem feedbacks are critical–you can’t figure this out with plants grown in a greenhouse.”
The findings caution against extrapolating from short-term results, or from experiments with plants grown under artificial conditions, where researchers can’t measure the feedbacks from changes in the plant community and from nutrient cycles.
“The long-term perspective is key,” said Hungate. “We were surprised, and I’m guessing there are more such surprises in store.”
Co-authors of the paper include George Koch and Paul Dijkstra, both at NAU.
Marine researchers have definitively linked the collapse of oyster seed production at a commercial oyster hatchery in Oregon to an increase in ocean acidification.
Larval growth at the hatchery declined to a level considered by the owners to be “non-economically viable.”
A study by the scientists found that increased seawater carbon dioxide (CO2) levels, resulting in more corrosive ocean water, inhibited the larval oysters from developing their shells and growing at a pace that would make commercial production cost-effective.
As atmospheric CO2 levels continue to rise, this may serve as the proverbial canary in the coal mine for other ocean acidification impacts on shellfish.
Results of the research are published this week in the journal Limnology and Oceanography, published by the Association for the Sciences of Limnology and Oceanography (ASLO).
The research was funded by a grant from the National Science Foundation (NSF)’s Science, Engineering and Education for Sustainability (SEES) Ocean Acidification solicitation.
“Studies funded by NSF’s SEES Ocean Acidification solicitation are well-positioned to determine the specific mechanisms responsible for larval mortality in Pacific Northwest oyster hatcheries,” said David Garrison, program director in NSF’s Division of Ocean Sciences.
“This is one of the first times that we have been able to show how ocean acidification affects oyster larval development at a critical life stage,” said Burke Hales, an Oregon State University (OSU) chemical oceanographer and co-author of the paper.
“The predicted rise of atmospheric CO2 in the next two to three decades may push oyster larval growth past the break-even point in terms of production.”
The owners of Whiskey Creek Shellfish Hatchery at Oregon’s Netarts Bay experienced a decline in oyster seed production several years ago and looked at potential causes, including low oxygen and pathogenic bacteria.
Alan Barton, who works at the hatchery and is a co-author of the journal article, was able to eliminate those potential causes and shifted his focus to ocean acidification.
Barton sent samples to OSU and to the National Oceanic and Atmospheric Administration’s Pacific Marine Environmental Laboratory for analysis.
The results clearly linked the production failures to the CO2 levels in the water in which the larval oysters were spawned and spent the first 24 hours of their lives. That first day is a critical time when the oysters develop from fertilized eggs to swimming larvae and build their initial shells.
“The early growth stage for oysters is particularly sensitive to the carbonate chemistry of the water,” said George Waldbusser, a benthic ecologist at OSU.
“As the water becomes more acidified, it affects the formation of calcium carbonate, the mineral in shells. As the CO2 goes up, the mineral stability goes down, ultimately leading to reduced growth or to mortality.”
Commercial oyster production on the West Coast of North America is a $273-million-a-year industry. It has depended since the 1970s on oyster hatcheries for a steady supply of the seed used by growers.
In recent years, the hatcheries that provide most of the seed for West Coast growers have suffered persistent production problems.
At the same time, non-hatchery wild stocks of these oysters also have shown low recruitment, putting additional strain on a limited seed supply.
Hales said that Netarts Bay, where the Whiskey Creek hatchery is located, experiences a wide range of chemistry fluctuations.
The researchers believe that hatchery operators may be able to adapt to take advantage of periods when water quality is at its highest.
“In addition to the impact of seasonal upwelling, the water chemistry changes with the tidal cycle and with the time of day,” Hales said. “Afternoon sunlight, for example, promotes photosynthesis in the bay. That production can absorb some of the carbon dioxide and lower the corrosiveness of the water.”
The researchers also found that larval oysters showed a delayed response to the water chemistry, which may cast new light on other experiments looking at the impacts of ocean acidification on shellfish.
In the study, they found that larval oysters raised in water that was acidic, but non-lethal, had significantly less growth in later stages of their life.
“The takeaway message here is that the response to poor water quality isn’t always immediate,” said Waldbusser.
“In some cases, it took until three weeks after fertilization for effects from the acidic water to become apparent. Short-term experiments of just a few days may not detect the damage.”
The research was also supported by NOAA and the Pacific Coast Shellfish Growers Association.
Other authors of the journal article include Chris Langdon of OSU’s Hatfield Marine Science Center and Richard Feely of NOAA’s Pacific Marine Environmental Laboratory.
Analysis of data from the National Science Foundation (NSF)-funded 10-meter South Pole Telescope (SPT) in Antarctica provides new support for the most widely accepted explanation of dark energy, the source of the mysterious force that is responsible for the accelerating expansion of the universe.
The results begin to home in on the tiny mass of neutrinos, among the most abundant particles in the universe, which until recently were thought to be massless.
The SPT data strongly support Albert Einstein’s cosmological constant–the leading model for dark energy–even though researchers base the analysis on only a fraction of the SPT data collected and only 100 of the over 500 galaxy clusters detected so far.
“With the full SPT data set we will be able to place extremely tight constraints on dark energy and possibly determine the mass of the neutrinos,” said Bradford Benson, an NSF-funded postdoctoral scientist at the University of Chicago’s Kavli Institute for Cosmological Physics.
Benson presented the SPT collaboration’s latest findings, Sunday, April 1, at the American Physical Society meeting in Atlanta.
These most recent SPT findings are only the latest scientifically significant results produced by NSF-funded researchers using the telescope in the five years since it became active, noted Vladimir Papitashvili, Antarctic Astrophysics and Geospace Sciences program director in NSF’s Office of Polar Programs.
“The South Pole Telescope has proven to be a crown jewel of astrophysical research carried out by NSF in the Antarctic,” he said. “It has produced about two dozen peer-reviewed science publications since the telescope received its ‘first light’ on Feb. 17, 2007. SPT is a very focused, well-managed and amazing project.”
The 280-ton SPT stands 75 feet tall and is the largest astronomical telescope ever built in the clear and dry air of Antarctica. Sited at NSF’s Amundsen-Scott South Pole Station at the geographic South Pole, it sits at an elevation of 9,300 feet on the polar plateau. Because it is located at Earth’s axis of rotation, the telescope can observe the same patch of sky continuously, enabling long-term observations.
NSF manages the U.S. Antarctic Program through which it coordinates all U.S. scientific research on the southernmost continent and aboard ships in the Southern Ocean as well as providing the necessary related logistics support.
An international research collaboration led by the University of Chicago manages the South Pole Telescope. The collaboration includes research groups at Argonne National Laboratory; Cardiff University in Wales; Case Western Reserve University; Harvard University; Ludwig-Maximilians-Universität in Germany; the Smithsonian Astrophysical Observatory; McGill University in Canada; the University of California, Berkeley; the University of California, Davis; the University of Colorado Boulder; and the University of Michigan, as well as individual scientists at several other institutions.
SPT specifically was designed to tackle the dark-energy mystery. The 10-meter telescope operates at millimeter wavelengths to make high-resolution images of Cosmic Microwave Background (CMB) radiation, the light left over from the big bang.
Scientists use the CMB to search for distant, massive galaxy clusters that can be used to pinpoint the properties of dark energy and also help define the mass of the neutrino.
“The CMB is literally an image of the universe when it was only 400,000 years old, from a time before the first planets, stars and galaxies formed in the universe,” Benson said. “The CMB has travelled across the entire observable universe, for almost 14 billion years, and during its journey is imprinted with information regarding both the content and evolution of the universe.”
The new SPT results are based on a new method that combines measurements taken by the telescope and by NASA and European Space Agency X-ray satellites, and extends these measurements to larger distances than previously achieved.
The most widely accepted property of dark energy is that it leads to a pervasive force acting everywhere and at all times in the universe. This force could be the manifestation of Einstein’s cosmological constant that assigns energy to space, even when it is free of matter and radiation.
Einstein considered the cosmological constant to be one of his greatest blunders after learning that the universe is not static, but expanding.
In the late 1990s, astronomers discovered the universe’s expansion appears to be accelerating according to cosmic distance measurements based on the relatively uniform luminosity of exploding stars. The finding was a surprise because gravity should have been slowing the expansion, which followed the big bang.
Einstein introduced the cosmological constant into his theory of general relativity to accommodate a stationary universe, the dominant idea of his day. But his constant fits nicely into the context of an accelerating universe, now supported by countless astronomical observations.
Others hypothesize that gravity could operate differently on the largest scales of the universe. In either case, the astronomical measurements point to new physics that has yet to be understood.
As the CMB passes through galaxy clusters, the clusters effectively leave “shadows” that allow astronomers to identify the most massive clusters in the universe, nearly independent of their distance.
“Clusters of galaxies are the most massive, rare objects in the universe, and therefore they can be effective probes to study physics on the largest scales of the universe,” said John Carlstrom, the S. Chandrasekhar Distinguished Service Professor in Astronomy & Astrophysics, who heads the SPT collaboration.
“The unsurpassed sensitivity and resolution of the CMB maps produced with the South Pole Telescope provides the most detailed view of the young universe and allows us to find all the massive clusters in the distant universe,” said Christian Reichardt, a postdoctoral researcher at the University of California, Berkeley and lead author of the new SPT cluster catalog paper.
The number of clusters that formed over the history of the universe is sensitive to the mass of the neutrinos and the influence of dark energy on the growth of cosmic structures.
“Neutrinos are amongst the most abundant particles in the universe,” Benson said. “About one trillion neutrinos pass through us each second, though you would hardly notice them because they rarely interact with ‘normal’ matter.”
The existence of neutrinos was proposed in 1930. They were first detected 25 years later, but their exact mass remains unknown. If they are too massive they would significantly affect the formation of galaxies and galaxy clusters, Benson said.
The SPT team has been able to improve estimates of neutrino masses, yielding a value that approaches predictions stemming from particle physics measurements.
“It is astounding how SPT measurements of the largest structures in the universe lead to new insights on the evasive neutrinos,” said Lloyd Knox, professor of physics at the University of California at Davis and member of the SPT collaboration. Knox will also highlight the neutrino results in his presentation on Neutrinos in Cosmology at a special session of the APS on Tuesday, April 3.
NSF’s Office of Polar Programs primarily funds the SPT. The NSF-funded Physics Frontier Center of the Kavli Institute for Cosmological Physics, the Kavli Foundation and the Gordon and Betty Moore Foundation provide partial support.
Scientists are reporting new evidence that the Deepwater Horizon oil spill has affected marine life in the Gulf of Mexico, this time species that live in dark ocean depths–deepwater corals.
The research used a range of underwater vehicles, including the submarine Alvin, to investigate the corals. The findings are published this week in the journal Proceedings of the National Academy of Sciences (PNAS).
The scientists used a method known as comprehensive two-dimensional gas chromatography to determine the source of the petroleum hydrocarbons found.
The lead author of the paper, chemist Helen White of Haverford College in Pennsylvania, is part of a team of researchers led by Charles Fisher of Penn State University (PSU).
The group includes Erik Cordes from Temple University, and Timothy Shank and Christopher German from the Woods Hole Oceanographic Institution (WHOI), which operates the submersible Alvin.
Fisher, Cordes, Shank and German are co-authors of the paper, along with other scientists from WHOI, Penn State, Temple and the U.S. Geological Survey.
“The biological communities in the deep Gulf of Mexico are separated from human activity at the surface by 4,000 feet of water,” says White.
“We would not expect deep-water corals to be affected by a typical oil spill. But the sheer magnitude of the Deepwater Horizon oil spill makes it very different from a tanker running aground and spilling its contents.
Because of the unprecedented nature of the spill, its effects are more far-reaching than those from smaller spills on the surface.”
The study grew out of a research cruise in October 2010 that was part of a Bureau of Ocean Energy Management and National Oceanic and Atmospheric Administration project.
Using the remotely operated vehicle (ROV) Jason II, the team initially looked at nine sites more than 20 kilometers from the Macondo well.
The researchers found deep-water coral communities unharmed there.
But when the ROV explored another area 11 kilometers to the southwest of the spill site, the team was surprised to find coral communities covered in a brown flocculent material and showing signs of tissue damage.
“We discovered the site during the last dive of the three-week cruise,” says Fisher.
“As soon as the ROV got close enough to the community for the corals to come into clear view, it was obvious that something was wrong. There was too much white and brown, and not enough color on the corals and brittle stars.”
Once the scientists were close enough to zoom in on a few coral colonies, “there was no doubt that this was something we had not seen anywhere else in the Gulf,” says Fisher. “This is what we had been on the lookout for, but were hoping not to see.”
The coral communities were at a depth of 4,300 feet in close proximity to the Macondo well, which had been capped three months earlier after spilling an estimated 160 million gallons of oil into the Gulf.
At the time the damaged corals were spotted, the effects could not be directly linked to the Deepwater Horizon oil spill.
Then in December 2010, the scientists set out on a second research cruise to the Gulf.
A National Science Foundation (NSF) RAPID grant funded their return. NSF RAPID awards allow scientists to respond quickly to issues such as natural disasters–in this case, the oil spill.
“Through the RAPID award,” says Rodey Batiza of NSF’s Division of Ocean Sciences, “the researchers were able to analyze the oil spill’s effect on the area’s deep-sea corals, and compare changes in the corals’ condition over a relatively short time-period.”
It’s easy to see the effect of oil in surface waters, “but this was the first time we were diving to the seafloor to look at the effects on deep-sea ecosystems,” says White.
The team used the autonomous underwater vehicle Sentry to map and photograph the ocean floor, and the submersible Alvin to get a better look at the distressed corals.
Alvin holds a pilot and two passengers, and is equipped with viewports and cameras.
Alvin also has robotic arms that can manipulate instruments to collect samples. During six dives in Alvin, whose manipulator claws were modified with a cutting blade, the team collected sediments and samples of the corals and filtered material from the corals for analysis.
“Collecting samples from the deep ocean is incredibly challenging, and Alvin is crucial to this kind of work,” says White.
“The primary aim of the research was to determine the composition of the brown flocculent material covering the corals, and the source of any petroleum hydrocarbons present,” says White.
Because oil can naturally seep from cracks in the floor of the Gulf, pinpointing the source of petroleum hydrocarbons in Gulf samples can be challenging, especially since oil is made up of a complex mixture of chemical compounds.
However, there are slight differences in oils that can be used to trace their origin.
To identify the oil found in the coral communities, White worked with Christopher Reddy and Robert Nelson at WHOI using an advanced technique called comprehensive two-dimensional gas chromatography, pioneered by Reddy and Nelson for use in oil spill research.
The method, which separates oil compounds by molecular weight, allows scientists to “fingerprint” oil and determine its source.
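The idea behind fingerprint matching can be illustrated with a simplified sketch: compare the relative abundances of marker compounds in a sample against candidate sources. This is a conceptual toy, not the Reddy and Nelson GC×GC method, and the compound names and abundance values below are invented for the illustration:

```python
# Conceptual illustration of oil "fingerprinting": a sample's profile of
# compound abundances is compared against candidate source profiles.
# Real GCxGC analysis is far more detailed; these compounds and numbers
# are invented for the sketch.

import math

def similarity(a: dict, b: dict) -> float:
    """Cosine similarity between two compound-abundance profiles."""
    keys = sorted(set(a) | set(b))
    va = [a.get(k, 0.0) for k in keys]
    vb = [b.get(k, 0.0) for k in keys]
    dot = sum(x * y for x, y in zip(va, vb))
    norm = math.sqrt(sum(x * x for x in va)) * math.sqrt(sum(y * y for y in vb))
    return dot / norm if norm else 0.0

sample = {"hopane": 0.9, "pristane": 0.4, "phytane": 0.3}   # from the corals
macondo = {"hopane": 0.85, "pristane": 0.45, "phytane": 0.25}  # candidate source
seep = {"hopane": 0.2, "pristane": 0.9, "phytane": 0.8}        # natural seep

print(f"vs. Macondo: {similarity(sample, macondo):.3f}")  # near 1: likely match
print(f"vs. seep:    {similarity(sample, seep):.3f}")     # lower: poorer match
```

The closer a sample's profile matches a known source, the higher the similarity score; the actual study relied on far more compounds and expert interpretation to attribute the oil to the Macondo well.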
This petroleum analysis, coupled with a review of 69 images from 43 corals at the site performed by Pen-Yuan Hsing of PSU, yielded evidence that the coral communities were affected by oil from the Macondo spill.
“These findings will have a significant effect on deep-water drilling, and on the monitoring of oil spills in the future,” White says.
“Ongoing research in the Gulf will improve our understanding of the resilience of these isolated coral communities and the extent to which they are affected by human activity.
“Oil had a visible effect on the corals, and it’s important to determine whether they can rebound.”