
99% of world’s population breathes poor-quality air

The U.N. health agency says nearly everybody in the world breathes air that doesn’t meet its standards for air quality, calling for more action to reduce fossil-fuel use, which generates pollutants that cause respiratory and blood-flow problems and lead to millions of preventable deaths each year.

The World Health Organization, about six months after tightening its guidelines on air quality, on Monday issued an update to its database on air quality that draws on information from a growing number of cities, towns and villages across the globe—now over 6,000 municipalities.

WHO said 99% of the global population breathes air that exceeds its air-quality limits and is often rife with particles that can penetrate deep into the lungs, enter the veins and arteries and cause disease. Air quality is poorest in WHO’s eastern Mediterranean and Southeast Asia regions, followed by Africa, it said.

“After surviving a pandemic, it is unacceptable to still have 7 million preventable deaths and countless preventable lost years of good health due to air pollution,” said Dr. Maria Neira, head of WHO’s department of environment, climate change and health. “Yet too many investments are still being sunk into a polluted environment rather than in clean, healthy air.”

The database, which has traditionally considered two types of particulate matter known as PM2.5 and PM10, for the first time has included ground measurements of nitrogen dioxide. The last version of the database was issued in 2018.

Emissions rise from the smokestacks at the Jeffrey Energy Center coal power plant as the sun sets near Emmett, Kansas, Saturday, Sept. 18, 2021. Credit: AP Photo/Charlie Riedel, File

Nitrogen dioxide originates mainly from human-generated burning of fuel, such as automobile traffic, and is most common in urban areas. Exposure can cause respiratory diseases such as asthma, symptoms such as coughing, wheezing and difficulty breathing, and increased hospital and emergency-room admissions, WHO said. The highest concentrations were found in the eastern Mediterranean region.

On Monday, the east Mediterranean island of Cyprus suffered through high concentrations of atmospheric dust for the third straight day, with some cities experiencing three to nearly four times the 50 micrograms per cubic meter that authorities consider normal. Officials said the microscopic particles could be especially harmful to young children, the elderly and the ill.

Particulate matter has many sources, such as transportation, power plants, agriculture, the burning of waste and industry, as well as natural sources like desert dust. The developing world is particularly hard hit: India had high levels of PM10, while China showed high levels of PM2.5, the database showed.

“Particulate matter, especially PM2.5, is capable of penetrating deep into the lungs and entering the bloodstream, causing cardiovascular, cerebrovascular (stroke) and respiratory impacts,” WHO said. “There is emerging evidence that particulate matter impacts other organs and causes other diseases as well.”

The findings highlight the sheer scale of the changes needed to combat air pollution, said Anumita Roychowdhury, an air pollution expert at Center for Science and Environment, a research organization in New Delhi.

India and the world need to brace for major changes to try to curb air pollution, including using electric vehicles, shifting away from fossil fuels, embracing a massive scaling-up of green energy and separating out types of waste, she said.

The Council on Energy, Environment and Water, a New Delhi-based think tank, found that more than 60% of India’s PM2.5 loads are from households and industries. Tanushree Ganguly, who heads the council’s program on air quality, called for action toward reducing emissions from industries, automobiles, biomass burning and domestic energy.

“We need to prioritize clean energy access for households that need it the most, and take active measures to clean up our industrial sector,” she said.

Simulating Earth’s changing climate: Why some models exaggerate future warming

The latest report by the Intergovernmental Panel on Climate Change (IPCC), released overnight, shows a viable path to cutting global emissions by half by the end of this decade.

It follows earlier reports in the IPCC’s Sixth Assessment round, which reiterate that climate change is unequivocal and ubiquitous, humans are to blame and warming will surpass the Paris Agreement target of keeping warming well below 2℃ unless we make deep cuts to emissions.

For its projections of future warming, the IPCC relies heavily on climate models—computer simulations that help us understand how the climate has changed in the past and how it is likely to change in the future under various emissions scenarios.

These models are continuously updated but some new-generation models are “running hot,” showing a notably higher climate sensitivity than previous ones.

According to the IPCC, our planet’s actual climate sensitivity is unlikely to be as large as these models suggest, which raises the question of why we would use them if their climate sensitivities are likely unrealistic.

Estimating climate sensitivity

Climate sensitivity describes how much global temperatures will rise in response to human-caused greenhouse gas emissions. The best estimate is 3℃ of warming for a doubling of pre-industrial carbon dioxide levels, with a likely range of 2.5 to 4℃, but ongoing research aims to narrow this range.
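The relationship described above is logarithmic: each doubling of atmospheric CO₂ adds roughly the same amount of warming. A minimal sketch of that arithmetic, using the article's figures (the 280 ppm pre-industrial baseline and the specific function name are illustrative assumptions, not from the report itself):

```python
import math

# Equilibrium warming is roughly proportional to the log of the CO2 ratio;
# the constant of proportionality is the climate sensitivity per doubling.
def warming(co2_ppm, sensitivity_per_doubling=3.0, preindustrial_ppm=280.0):
    """Approximate equilibrium warming (deg C) for a given CO2 level."""
    return sensitivity_per_doubling * math.log2(co2_ppm / preindustrial_ppm)

# A doubling (280 -> 560 ppm) yields exactly the sensitivity value:
print(warming(560.0))            # 3.0
# The IPCC "likely" range of 2.5-4 C at doubling:
print(warming(560.0, 2.5), warming(560.0, 4.0))
```

This is why the sensitivity number matters so much: the same emissions pathway implies very different end-of-century warming depending on which value a model effectively encodes.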

Several new models, contributed by renowned modeling centers, display climate sensitivities outside this likely range and larger than any models used for the IPCC’s last assessment in 2013. As a consequence, they simulate anomalously large and fast warming during the 21st century.

Critics see climate models generally as flawed attempts at capturing the complexities of the climate system, not good enough as scientific evidence to guide climate policies.

Yes, all climate models have flaws because they are models, not reality. But they are spectacularly successful at capturing past climate change, including the steady march of global warming and the intensification and increasing frequency of floods and droughts that now regularly make headlines. Nevertheless, the large sensitivities of some models are a cause of concern.

The story starts in the early 2000s, when various satellite measurements were combined to better describe the Earth’s radiation budget—the balance between incoming solar radiation and reflected outgoing visible light and invisible infrared radiation.

Based on this, the earlier IPCC report concluded clouds over the Southern Ocean were poorly represented in models, with insufficient sunlight reflected back into space and too much reaching the surface where it warmed the ocean. Later research found many models simulated ice clouds when in fact they should have been liquid clouds.

Simulating water and clouds

This may sound like an elementary problem, but it isn’t. If water comes in very small droplets—as it does in clouds—it can remain liquid down to about -35℃. We call such droplets supercooled.


If the water contains impurities, its freezing temperature can be anywhere between 0℃ and -35℃. Simulating clouds under all conditions is therefore far from trivial.

Modeling groups generally succeeded in introducing more supercooled liquid clouds into their latest models and at least partly solved this Southern Ocean cloud problem. But this change weakened an important climate feedback: as the climate warms, liquid clouds become more prevalent at the expense of ice clouds.

Liquid clouds are brighter and more reflective than ice clouds, and under progressive global warming more and more incoming sunlight is reflected back into space, counteracting the warming effect. However, by replacing ice with supercooled liquid clouds, newer models weaken this cooling effect. This is the leading explanation for the larger climate sensitivity of many new-generation climate models.

The IPCC’s response

The latest IPCC report didn’t raise the estimate of the planet’s actual climate sensitivity. It cites observational evidence to make the case, including “historical” warming, which is very well understood for the past several decades.

Models with a middle-of-the-road climate sensitivity near 3℃ often better reproduce the temperature variations of this historical period than those with a large climate sensitivity.

Further evidence comes from simulations of the Earth’s geological past (thousands to millions of years ago), which saw both much colder and much warmer climates than at present. Geological evidence shows high-sensitivity models exaggerate the temperature swings of this distant past. By the same token, a few very low-sensitivity models are also unlikely to be correct.

The latest report concludes climate sensitivity is now better understood, but it doesn’t go as far as dismissing high-sensitivity models altogether. Instead, it says such models simulate “high risk, low likelihood” futures that cannot be ruled out.

Refining climate models

What does the future hold for climate models? Climate sensitivity is the result of a model’s “tuning,” whereby parameters are varied systematically until the model produces an acceptable representation of the well-observed climate of the past few decades.

Clearly this process requires refinement. Low, medium, and high-sensitivity models have all passed this test, yet these models project quite different magnitudes of warming for this century.

There is scope for increasing cooperation between institutions, scientific disciplines and countries to rise to this challenge. The latest IPCC report did an excellent job at dealing with this, but clearly better constraining the climate sensitivity in models would further raise confidence in climate projections.

The stakes are high. Climate projections inform expensive and disruptive adaptation and mitigation decisions around the world, including which coastal properties should be abandoned due to rising seas, how quickly we need to wean ourselves off fossil fuels, or how to make agriculture climate resilient and climate neutral while still feeding a growing human population.

Seen against this backdrop, a seemingly innocuous, technical problem in climate modeling takes on outsized importance.

California signs $2.6 billion ecological pact

It’s a major source of California’s water supply and a vital habitat for fish, migratory birds and other species.

But the Sacramento-San Joaquin River Delta watershed is also a fragile ecosystem in decline, with human demands for water taking a harsh toll on the environment.

With a third year of severe drought straining water resources and pushing endangered salmon and other fish closer to extinction, California officials have announced a controversial $2.6 billion deal with the federal government and major water suppliers that they say will bolster the ecosystem.

The new pact, called a memorandum of understanding, reflects a realization that with climate change, “the system is collapsing quicker than the laws and regulations that exist can manage or heal that system,” said Jared Blumenfeld, California’s environmental protection secretary.

The proposed agreement lays out plans over the next eight years whereby agencies that supply cities and farms would give up water or secure additional supplies to help threatened species, while state, federal and local agencies would fund projects to improve habitat in the watershed.

State officials called the deal an important milestone in their efforts to balance the delta’s ecological needs with the water needs of Californians, and a key step toward larger “voluntary agreements” that can help ensure substantial flows for the health of the estuary.

Gov. Gavin Newsom declared the plan a historic rejection of “old binaries” in favor of new solutions, while Blumenfeld said it would “move us away from ‘water wars’ of yesteryear.”

Those claims drew strong criticism, however.

Immediately after the plan’s announcement this week, environmental advocates and salmon conservationists condemned it as a set of backroom deals negotiated out of the public eye that wouldn’t provide nearly enough water for threatened fish or the overall health of the watershed.

“Nothing has been achieved through backroom negotiations with water districts,” said Jon Rosenfield, senior scientist with the group San Francisco Baykeeper. “The state’s latest scheme promises only a tiny fraction of the relief our rivers, fisheries and delta communities need, according to a wealth of research—and it leaves all the hard questions unanswered.”

The San Francisco Bay Delta is the largest estuary on the West Coast. Formed by the convergence of California’s two largest rivers, the Sacramento-San Joaquin Delta lies at the heart of the state’s water system.

Two huge government-run pumping plants draw water from the delta’s southern edge and send it flowing through the canals of the State Water Project and the Central Valley Project, supplying vast farmlands and cities to the south.

The delta’s ecosystem has been ailing for decades, and the export of large quantities of freshwater has been a major reason. Climate change has added to the stresses on the ecosystem by intensifying droughts.

Fish have suffered. The delta smelt is now on the brink of extinction. And endangered winter-run Chinook salmon have struggled to reproduce in the Sacramento River when the water flowing from Shasta Dam has warmed up so much that many eggs fail to hatch.

State officials said the agreement aims to meet water-quality objectives in the Delta through additional flows for the environment, projects that restore and improve thousands of acres of aquatic habitat, and funding to purchase water and carry out habitat projects. They said these projects would include creating more spawning habitat for salmon and smelt, restoring floodplains and side channels, and removing barriers that hinder fish, among other things.

Wade Crowfoot, California’s natural resources secretary, said the steps toward voluntary agreements among water agencies “hold promise to improve environmental conditions more quickly and holistically than regulatory requirements.”

But the plan still needs to be endorsed by the State Water Resources Control Board, which is required to update its water-quality plan for the delta. Heavy criticism of the announcement also suggests Newsom and his administration will face opposition as they continue pushing for voluntary water deals.

The agreement’s signatories included more than a dozen water agencies, among them the Metropolitan Water District of Southern California and Westlands Water District, some of the nation’s largest water suppliers. Water agencies have agreed to provide varying flows, depending on whether conditions are wet, above average, below average, dry or critically dry.

Amounts of water contributed annually by the signatories could range from 150,000 acre-feet to 825,000 acre-feet. The largest amount of water, if spread out across the city of Los Angeles, would cover the area more than 2 feet deep.
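The Los Angeles comparison is easy to verify: an acre-foot covers one acre to a depth of one foot. A back-of-envelope check, where the city-area figure (roughly 469 square miles) is an outside assumption used only for illustration:

```python
# One acre-foot covers one acre to a depth of one foot.
ACRES_PER_SQ_MILE = 640
la_area_acres = 469 * ACRES_PER_SQ_MILE   # ~300,160 acres (assumed city area)

# Spread the largest annual contribution (825,000 acre-feet) over the city:
depth_ft = 825_000 / la_area_acres
print(round(depth_ft, 2))                  # ~2.75 feet, i.e. "more than 2 feet"
```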

Baykeeper’s Rosenfield pointed out that would be much less than the average of approximately 1.5 million to 1.6 million acre-feet that the state water board had contemplated in a 2018 document, and far less than what the board had indicated would be needed to protect imperiled fish in the delta watershed as well as the state’s commercial and recreational fisheries.

Rosenfield and other critics noted that the agreement uses a water baseline laid down in 2019 by the Trump administration, so much of the additional water made available under the proposal would simply restore the flows that had been called for under federal biological opinions a decade earlier.

Rosenfield also criticized provisions of the deal that involve purchasing water for environmental purposes, essentially using taxpayer money to “subsidize” the water districts’ obligations.

“We don’t need to pay water districts for water that belongs to the people of California,” Rosenfield said.

However, state officials stressed that the agreement would send a significant amount of water flowing through the delta that otherwise wouldn’t be helping the ecosystem. And they said the collaborative approach, worked out through years of meetings and negotiations, can avoid protracted fights.

“You can get a lot more done at a bigger scale when you’re trying to do it collaboratively, because you step around what tends to be decades’ worth of litigation, when people don’t want to voluntarily talk about leaving water in the rivers,” said Chuck Bonham, director of the California Department of Fish and Wildlife.

“Instead of fighting about what to do, we now have a commitment for one of the largest habitat restoration efforts conceivable,” Bonham said.

He said large-scale habitat restoration efforts in the watershed can make a big difference in recovering fish and other species that are at risk.

The state’s traditional approach has been to adopt regulations and then deal with lawsuits, and the proposed agreement aims to circumvent that approach to reduce uncertainty, said Jeffrey Mount, a senior fellow with the Public Policy Institute of California.

“For the water user community, it meets one of their great needs, and that is regulatory certainty,” Mount said. “It’s so that there is not an annual, difficult regulatory battle.”

Mount said he supports the approach generally and has been calling for something like this for years as a more effective strategy.

“But it would have been better if they could have actually brought in the environmental community and had them as part of these negotiations,” Mount said.

What will ultimately come out of the plan is uncertain, he said, because some agencies haven’t signed on to the terms and the deal will need to undergo a lengthy review.

The plan laid out in the agreement calls for environmental monitoring, and if key indicators aren’t met through the voluntary agreement by the sixth year, Blumenfeld said, the state could change course and instead work toward those goals through regulation. State regulators could determine if the voluntary agreements should be continued, modified or ended.

“So there’s a backstop,” Blumenfeld said. “There’s a lot at stake to make it work. But if it doesn’t, we get to implement the more traditional regulatory pathway.”

Water districts that do not agree to the voluntary approach will be required to comply with requirements set by the state water board. The agencies that haven’t signed on include those that draw water from the lower San Joaquin River and its tributaries, among them the Merced Irrigation District, Modesto Irrigation District, Friant Water Authority and San Francisco Public Utilities Commission.

State officials have told managers of these agencies that their proposals fall short of what’s needed, and that the door is open for them to participate if they agree to enough additional water and support for habitat projects.

Mount said the state may be taking a “divide and conquer strategy,” but it most certainly won’t end the conflicts.

“The water wars will continue because we’re talking about tradeoffs in use in a zero-sum game,” Mount said.

The water agencies that have joined the deal have committed to restoring or creating 20,000 acres of floodplain habitat, and nearly 3,300 acres of additional habitat where fish can spawn.

A breakdown of the implementation costs under the agreement lists $858 million for habitat restoration and construction in the watershed, plus additional amounts for scientific monitoring, water purchases and payments for some growers to leave farmlands dry and fallow.

Managers of water districts that signed the agreement this week have committed to taking the terms to their boards for endorsement.

Adel Hagekhalil, general manager of the Metropolitan Water District, said the agreement represents a milestone first step in a joint effort to develop a watershed-wide approach to address the challenges in the delta.

“We need to work collaboratively with all of our state, federal, environmental and water agency partners to ensure we have a comprehensive action plan that improves water reliability and delivers real results for the environment,” Hagekhalil said in a statement.

The Newsom administration is pushing for the voluntary deals while also pursuing a controversial plan for rerouting the state’s water system by building a huge water tunnel beneath the delta.

Environmental advocates said they’re concerned about the estimated $2.6 billion that would be spent on implementation, with funds coming from water suppliers and the state and federal governments. They also said that there is no enforcement mechanism if the expected funding doesn’t come through, and that the document outlining the deal counts water supplies that have yet to be secured.

“Of course, we support floodplain restoration,” said Regina Chichizola, executive director of the group Save California Salmon. But she said research has shown that the health of the delta ecosystem demands much more water than this agreement would provide.

“This to me doesn’t seem like it’s dealing with drought or climate change or what the actual needs of [the] delta are. So I’m disappointed,” Chichizola said.

She said it’s also concerning that instead of having an open, democratic process guided by science, “it’s just the most elite water users” that were in the room to negotiate.

John McManus, president of the Golden State Salmon Assn., said that nobody from the salmon fishing industry was invited to participate in the talks.

“I think many in California will wonder why taxpayers have to pay to gain basic environmental protections for our fish and wildlife,” McManus said. “Don’t we already have regulations that should ensure the protection of our fish and wildlife?”

No obituary for Earth: Scientists fight climate doom talk

It’s not the end of the world. It only seems that way.

Climate change is going to get worse, but as gloomy as the latest scientific reports are, including today’s from the United Nations, scientist after scientist stresses that curbing global warming is not hopeless. The science says it is not game over for planet Earth or humanity. Action can prevent some of the worst if done soon, they say.

After decades of trying to get the public’s attention, spur action by governments and fight against organized movements denying the science, climate researchers say they have a new fight on their hands: doomism. It’s the feeling that nothing can be done, so why bother. It’s young people publicly swearing off having children because of climate change.

University of Maine climate scientist Jacquelyn Gill noticed in 2018 that fewer people were telling her climate change isn’t real, and more were “doomers,” as she calls them: people who believe nothing can be done. Gill says that is just not true.

“I refuse to write off or write an obituary for something that’s still alive,” Gill told The Associated Press, referring to the Earth. “We are not through a threshold or past the threshold. There’s no such thing as pass-fail when it comes to the climate crisis.”

“It’s really, really, really hard to walk people back from that ledge,” Gill said.

Doomism “is definitely a thing,” said Wooster College psychology professor Susan Clayton, who studies climate change anxiety and spoke at a conference in Norway last week that addressed the issue. “It’s a way of saying ‘I don’t have to go to the effort of making changes because there’s nothing I can do anyway.'”

Gill and six other scientists who talked with The Associated Press about doomism aren’t sugarcoating the escalating harm to the climate from accumulating emissions. But that doesn’t make it hopeless, they said.

“Everybody knows it’s going to get worse,” said Woodwell Climate Research Center scientist Jennifer Francis. “We can do a lot to make it less bad than the worst case scenario.”

The United Nations’ Intergovernmental Panel on Climate Change just issued its third report in six months. The first two covered how bad warming is and how it will hurt people and ecosystems; today’s report focuses on how the extent of disruption depends on how much fossil fuel is burned. It shows the world is still heading in the wrong direction in its fight to curb climate change, with new investments in fossil-fuel infrastructure and forests being cleared to make way for agriculture.

“It’s not that they’re saying you are condemned to a future of destruction and increasing misery,” said Christiana Figueres, the former U.N. climate secretary who helped forge the 2015 Paris climate agreement and now runs an organization called Global Optimism. “What they’re saying is ‘the business-as-usual path … is an atlas of misery’ or a future of increasing destruction. But we don’t have to choose that. And that’s the piece, the second piece, that sort of always gets dropped out of the conversation.”

United Nations Environment Program Director Inger Andersen said with reports like these, officials are walking a tightrope. They are trying to spur the world to action because scientists are calling this a crisis. But they also don’t want to send people spiraling into paralysis because it is too gloomy.

“We are not doomed, but rapid action is absolutely essential,” Andersen said. “With every month or year that we delay action, climate change becomes more complex, expensive and difficult to overcome.”

“The big message we’ve got (is that) human activities got us into this problem and human agency can actually get us out of it again,” James Skea, co-chair of Monday’s report, said. “It’s not all lost. We really have the chance to do something.”

Monday’s report details that it is unlikely, without immediate and drastic carbon pollution cuts, that the world will limit warming to 1.5 degrees Celsius (2.7 degrees Fahrenheit) since pre-industrial times, which is the world’s agreed-upon goal. The world has already warmed 1.1 degrees Celsius (2 degrees Fahrenheit). And earlier IPCC reports have shown that after 1.5 degrees, more people die, more ecosystems are in trouble and climate change worsens rapidly.

“We don’t fall over the cliff at 1.5 degrees,” Skea said, “Even if we were to go beyond 1.5 it doesn’t mean we throw up our hands in despair.”

IPCC reports showed that depending on how much coal, oil, and natural gas is burned, warming by 2100 could be anywhere from 1.4 to 4.4 degrees Celsius (2.5 to 7.2 degrees Fahrenheit) above pre-industrial times, which can mean large differences in sickness, death and weather disasters.

While he sees the increase in doom talk as inevitable, NASA climate scientist Gavin Schmidt said he knows first-hand that people are wrong when they say nothing can be done: “I work with people and I’m watching other people and I’m seeing the administration. And people are doing things and they’re doing the right things for the most part as best they can. So I’m seeing people do things.”

Pennsylvania State University climate scientist Michael Mann said scientists used to think Earth would be committed to decades of future warming even after people stopped pumping more carbon dioxide into the air than nature takes out. But newer analyses in recent years show it will only take a few years after net zero emissions for carbon levels in the air to start to go down because of carbon being sucked up by the oceans and forests, Mann said.

Scientists’ legitimate worries get repeated and amplified like in the kids’ game of telephone and “by the time you’re done, it’s ‘we’re doomed’ when what the scientist actually said was we need to reduce our carbon emissions 50% within this decade to avoid 1.5 (degrees of) warming, which would be really bad. Two degrees of warming would be far worse than 1.5 warming, but not the end of civilization,” Mann said.

Mann said doomism has become far more of a threat than denialism and he believes that some of the same people, trade associations and companies that denied climate change are encouraging people who say it is too late. Mann is battling publicly with a retired University of Arizona ecologist, Guy McPherson, an intellectual leader of the doom movement.

McPherson said he’s not part of the monetary system, hasn’t had a paycheck in 13 years, doesn’t vote and lived off the grid for a decade. He said all species go extinct and humans are no exception. He publicly predicted humanity will go extinct in 2026, but in an interview with The Associated Press said, “I’m not nearly as stuck on 2026,” and mentioned 2030 and changes to human habitat from the loss of Arctic summer sea ice.

Woodwell’s Francis, a pioneer in the study of Arctic sea ice whom McPherson said he admires, said that while the Arctic will likely be ice-free in summer by 2050, McPherson exaggerates the bad effects. Local Arctic residents will be hit hard, but “the rest of us will experience accelerated warming and sea-level rise, disrupted weather patterns and more frequent extreme weather. Most communities will adapt to varying degrees,” Francis said. “There’s no way in hell humans will go extinct by 2026.”

Humans probably can no longer prevent Arctic sea ice from disappearing in the summer, but with new technology and emissions cuts, Francis said, “we stand a real chance of preventing those (other) catastrophic scenarios out there.”

Psychology professor Clayton said “no matter how bad things are, they can always be worse. You can make a difference between bad and worse… That’s very powerful, very self-affirming.”

Strange ‘reverse shock wave’ supernova is exploding in the wrong direction

Part of the shock wave is shrinking rather than expanding.

A powerful shock wave traveling through a cloud of gas left behind by the explosive death of a star has a bizarre quirk: Part of it is traveling in the wrong direction, a new study reveals. 

In the study, researchers found that the shock wave is accelerating at different rates, with one section collapsing back toward the origin of the stellar explosion, or supernova, in what the study authors call a “reverse shock.”  

Cassiopeia A is a nebula, or gas cloud, left behind by a supernova in the constellation Cassiopeia, around 11,000 light-years from Earth, making it one of the closest supernova remnants. The nebula, which is around 16 light-years wide, is made of gas (mainly hydrogen) that was expelled both before and during the explosion that ripped apart the original star. A shock wave from that explosion is still rippling through the gas, and theoretical models show that this shock wave should be expanding evenly, like a perfectly rounded balloon that’s constantly being inflated. 

But the researchers found that this wasn’t the case.

“For a long time, we suspected something weird was going on inside Cassiopeia A,” lead author Jacco Vink, an astronomer at the University of Amsterdam in the Netherlands, told Live Science. Previous studies had shown that the internal motions within the nebula were “rather chaotic” and highlighted that the western region of the shock wave moving through the gas cloud might even be going in the wrong direction, he added.

In the new study, the researchers analyzed the movement of the shock wave, using X-ray images collected by NASA’s Chandra X-ray Observatory, a telescope that orbits Earth. The data, collected over 19 years, confirmed that part of the western region of the shock wave was, in fact, retreating in the opposite direction in a reverse shock.

But they also discovered something even more surprising: Parts of the same region were still accelerating away from the supernova’s epicenter, like the rest of the shock wave.

Uneven expansion 

The current average speed of the expanding gas in Cassiopeia A is around 13.4 million mph (21.6 million km/h), which makes it one of the fastest shock waves ever seen in a supernova remnant, Vink said. This is mainly because the remnant is so young; light from the Cassiopeia A supernova reached Earth only around 340 years ago, in the late 17th century. But over time, shock waves lose their momentum to their surroundings and slow down. 
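That speed can be cross-checked from the kind of measurement Chandra actually provides: the angular drift of shock features between X-ray images taken years apart, combined with the remnant's distance. A minimal sketch, assuming the ~11,000-light-year distance quoted above (the ~0.38 arcsec/yr proper motion here is an illustrative value implied by the quoted speed and distance, not a figure from the study):

```python
import math

# Assumed inputs (illustrative): angular drift of a shock filament between
# X-ray images, and the distance to Cassiopeia A quoted in the article.
proper_motion_arcsec_per_yr = 0.38
distance_ly = 11_000

KM_PER_LY = 9.461e12
ARCSEC_PER_RAD = 206_265
SECONDS_PER_YR = 3.156e7

# Convert the angular drift per year into a transverse velocity.
distance_km = distance_ly * KM_PER_LY
drift_km_per_yr = (proper_motion_arcsec_per_yr / ARCSEC_PER_RAD) * distance_km
speed_km_s = drift_km_per_yr / SECONDS_PER_YR

print(f"shock speed ≈ {speed_km_s:,.0f} km/s")  # ~6,000 km/s ≈ 21.6 million km/h
```

A fraction of an arcsecond per year is tiny, which is why nearly two decades of Chandra imaging were needed to pin down the motions.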

Cassiopeia A consists of two main expanding bands of gas: an inner shell and an outer shell. These two shells are two halves of the same shock wave, and across most of the nebula, the inner and outer shells are traveling at the same speed and in the same direction. But in the western region, the two shells are going in opposite directions: The outer shell is still expanding outward, but the inner shell is moving back toward where the exploding star would have been. 

An image of Cassiopeia A showing the shock wave move through the inner and outer shells of gas. The blue arrows show the western section of the inner shell moving back towards the center of the nebula. (Image credit: J.Vink/astronomie.nl)

The reverse shock is retreating at around 4.3 million mph (6.9 million km/h), which is about a third of the average expansion speed of the rest of the nebula. However, what really puzzled the researchers was how fast the outer shell was expanding compared with the retreating inner shell in this region. The researchers had expected the outer shell to be expanding at a decreased rate compared with the rest of the shock wave, but they found that it was actually accelerating faster than some other regions of the shock wave. “That was a total surprise,” Vink said.

Cosmic collision 

The unusual expansion within Cassiopeia A’s western region does not match up with theoretical supernova models and suggests that something happened to the shock wave in the aftermath of the stellar explosion, Vink said. 

The researchers said the most likely explanation is that the shock wave collided with another shell of gas that was likely ejected by the star before it exploded. As the shock wave hit this gas, it may have slowed down and created a pressure buildup that pushed the inner shell back toward the center. However, the outer shell still may have been forced through this blockage and begun to accelerate again on the other side, Vink said. “This explains both the inward movement of the inner shell but also predicts that the outer shell should be accelerating, as indeed we measured,” he added. 

The researchers also think the unique way the original star died could explain the uneven shock wave. Cassiopeia A is the result of a Type IIb supernova, in which a massive star exploded after it had almost completely shed its outer layers, Vink said.

An image of Cassiopeia A combining X-ray data collected by NASA’s Imaging X-ray Polarimetry Explorer, shown in magenta, and NASA’s Chandra X-Ray Observatory, in blue. (Image credit: NASA/CXC/SAO/IXPE)

“X-ray estimates suggest that the star was around four to six times the mass of the sun during the explosion,” Vink said, but the star most likely had a mass of around 18 times that of the sun when it was born. This means the star lost around two-thirds of its mass, most of which would have been hydrogen, before it exploded. The shock wave may have later collided with this gas, Vink said.
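The mass-loss arithmetic is straightforward to check. Taking the quoted birth mass of about 18 solar masses and the X-ray-derived explosion-time mass of 4 to 6 solar masses gives a lost fraction in the two-thirds-to-three-quarters range:

```python
birth_mass = 18.0            # solar masses, quoted estimate at formation
explosion_mass = (4.0, 6.0)  # solar masses, X-ray estimate at the explosion

# Fraction of the original mass shed before the supernova, for each bound.
fractions = [(birth_mass - m) / birth_mass for m in explosion_mass]
print([f"{f:.0%}" for f in fractions])  # roughly 67% to 78% of the star's mass lost
```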

There are several theories as to why Cassiopeia A lost so much of its mass before it exploded. In September 2020, another team of researchers proposed that the original star was part of a binary star system, where two stars orbit each other. That research team said this companion star also could have gone supernova before Cassiopeia A and blasted off the star’s hydrogen “skin” in the process, Live Science previously reported.

However, the authors of the new study are unconvinced by this theory. “The only problem is that we have not yet found the remains of the other star,” Vink said. “So, at this stage, it remains speculative.”

Originally published on Live Science.

Ukraine invasion’s impacts on the world of science

The Russian invasion of Ukraine is being felt far and wide, from risks to nuclear power plants to impacts on science experiments to fear of a nuclear war.

Russia launched a war against Ukraine on Feb. 24, 2022, targeting more than a dozen cities and the Chernobyl nuclear site within the first day of the invasion.

The ongoing war not only threatens Ukraine’s continued existence as an independent country, but the conflict will likely have wide-reaching ramifications for science-related industries and organizations the world over. In addition, the potential for nuclear war and damage to Ukraine’s various nuclear sites pose a threat to public health and the environment, on a global scale.

As the war continues, Live Science will be sharing live updates on how the conflict is impacting various scientific fields, the energy sector and the space industry. We’ll also be covering developments related to nuclear weapons and power plants, as well as relevant health news, such as the state of medical supply chains in Ukraine and updates on how the COVID-19 pandemic is unfolding in the region.

Russia threatened to pull out of the International Space Station (ISS) program until sanctions from the West are lifted, according to news reports. This isn’t the first time the Russian space agency, Roscosmos, has aired such threats.

Head of Roscosmos Dmitry Rogozin said that Moscow would restore cooperation with ISS partners only after sanctions were lifted. (Other ISS partners include the United States, Japan, Canada and the European Union.)

“The purpose of the sanctions is to kill the Russian economy, plunge our people into despair and hunger and bring our country to its knees,” Rogozin wrote on Twitter on Saturday (April 2), as translated from Russian using Google Translate.

This and other similar tweets from Rogozin do not mean Russia will necessarily walk out on the ISS. The Russian space chief is known for his hyperbolic statements, according to Live Science sister site Space.com. Despite these past threats, the ISS has been operating normally, with NASA astronaut Mark Vande Hei returning to Earth in a Russian Soyuz spacecraft on March 30.

Rare primordial gas may be leaking out of Earth’s core

This gas was formed in the aftermath of the Big Bang.

An extremely rare type of helium that was created soon after the Big Bang is leaking out of Earth’s metallic core, a new modeling study suggests.

The vast majority of this gas in the universe, called helium-3, is primordial and was created just after the Big Bang occurred about 13.8 billion years ago. Some of this helium-3 would have joined other gas and dust particles in the solar nebula — the vast, spinning and collapsed cloud that is thought to have led to the creation of the solar system.

The discovery that Earth’s core likely contains a vast reservoir of helium-3 is further evidence to support the idea that Earth formed inside a thriving solar nebula, not on its periphery or during its waning phase, the researchers said.

Helium-3 is “a wonder of nature, and a clue for the history of the Earth, that there’s still a significant amount of this isotope in the interior of the Earth,” study lead author Peter Olson, a geophysicist at the University of New Mexico, said in a statement.

Helium-3 is an isotope, or variant, of helium that has one neutron instead of the usual two in its nucleus. It’s a rare gas, making up just 0.0001% of helium on Earth. It comes from various processes, such as the radioactive decay of tritium, a rare radioactive isotope of hydrogen. But because helium is one of the earliest elements to exist in the universe, most helium-3 likely came from the Big Bang.

Scientists already knew that about 4.4 pounds (2 kilograms) of helium-3 escapes from Earth’s interior annually, mostly along the mid-ocean ridge system where tectonic plates meet, the researchers wrote in the study, published online March 28 in the journal Geochemistry, Geophysics, Geosystems.

This is “about enough to fill a balloon the size of your desk,” Olson said.

But scientists weren’t sure exactly how much of the helium-3 came from the core versus the mantle, and how much helium-3 was in Earth’s reservoirs.

This image taken by Hubble Telescope shows Lagoon Nebula. After the Big Bang, large quantities of the rare gas helium-3 were made, and these gas particles became part of nebulas, one of which later gave rise to our solar system. The amount of helium-3 leaking from Earth’s metallic core indicates that our planet formed inside a nebula with high helium-3 concentrations. (Image credit: NASA/ESA)

To investigate, the research team modeled helium abundance during two important phases of Earth’s history: the planet’s early formation, when it was still accumulating helium, and after the formation of the moon, when our planet lost a lot of this gas. Scientists think that the moon formed when a colossal object about the size of Mars collided with Earth about 4.5 billion years ago.

This event would have melted Earth’s crust and enabled much of the helium inside our planet to escape.

However, Earth didn’t lose all of its helium-3 at that time. It still retains some of the rare gas, which continues to seep out of Earth’s innards. The core would be a good place for such a reservoir, “because it is less vulnerable to large impacts compared to other parts of the Earth system,” the researchers wrote in the study, and it is not involved in tectonic plate cycling, which also releases helium gas.

The researchers coupled the modern helium-3 leak rate with models of helium isotope behavior. These calculations revealed that between 22 billion pounds (10 teragrams) and 2 trillion pounds (1 petagram) of helium-3 are hanging out in Earth’s core — an enormous amount, indicating that Earth formed in a solar nebula with high concentrations of the gas.

Their models of gas “exchange during Earth’s formation and evolution implicate the metallic core as a leaky reservoir that supplies the rest of the Earth with helium-3,” the researchers wrote in the study.

However, because these results are based on modeling, the results aren’t ironclad. The team had to make a number of assumptions — for example that Earth took on helium-3 as it formed in the solar nebula, that helium entered into core-forming metals and that some helium left the core for the mantle. These assumptions, in addition to other uncertainties, including how long the solar nebula lasted relative to the rate at which Earth formed, mean that there may be less helium-3 in the core than they calculated, the scientists said.

But the researchers hope to find more clues that support their findings. For instance, finding other nebula-created gases, such as hydrogen, that are leaking from Earth from similar spots and at similar rates as helium-3, could be a “smoking gun” showing that the core is the source, Olson said. “There are many more mysteries than certainties.”

Why isn’t Earth perfectly round?

It’s similar to why your arms feel pulled outward when you spin.

If you had an enormous measuring tape that started at Earth’s center and went to our planet’s highest peak, you wouldn’t be looking at Mount Everest. Rather, the tallest mountain would be on the other side of the world: Ecuador’s Chimborazo.

Chimborazo wins in this case because Earth is actually a little squished at the poles, like a person pressing both hands on the top and bottom of a ball. As a result, the equator — where Ecuador sits — juts out. Rather than a perfect sphere, Earth is “oblate,” meaning it’s shaped like a slightly flattened sphere.
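The Chimborazo-versus-Everest claim can be checked with the standard ellipsoid formula for sea-level distance from Earth's center as a function of latitude, plus each summit's elevation. A rough sketch using WGS84 radii and approximate summit data (it ignores the small geodetic-versus-geocentric latitude correction, which doesn't change the outcome):

```python
import math

# WGS84 ellipsoid radii in km (equatorial and polar).
A, B = 6378.137, 6356.752
summits = {
    "Everest":    (27.99, 8.849),  # latitude (deg), summit elevation (km)
    "Chimborazo": (-1.47, 6.263),
}

def geocentric_radius(lat_deg):
    """Sea-level distance from Earth's center at a given latitude."""
    lat = math.radians(lat_deg)
    num = (A**2 * math.cos(lat))**2 + (B**2 * math.sin(lat))**2
    den = (A * math.cos(lat))**2 + (B * math.sin(lat))**2
    return math.sqrt(num / den)

dist = {name: geocentric_radius(lat) + h for name, (lat, h) in summits.items()}
print(dist)  # Chimborazo's summit comes out roughly 2 km farther from the center
```

Even though Everest stands about 2.6 km taller above sea level, the equatorial bulge more than makes up the difference.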

In fact, “most planets and moons are not true spheres; they are usually squished in some way or another,” said James Tuttle Keane, a planetary scientist at NASA’s Jet Propulsion Laboratory in Pasadena, California. So why aren’t Earth and other planets and moons perfectly round?

The obstacle is something called the centrifugal force, Keane told Live Science, or the apparent outward force experienced by an object that’s spinning.

A rotating planet experiences the centrifugal force. You can see it in action, too: If you spin around in a chair or on your feet, you should feel a pull away from your center. Maybe your arms or legs will flail. Or, if you sit on a merry-go-round, “there’s a little bit of extra force acting on you on that merry-go-round, and so you feel tugged off to the side,” Keane said.

Because planets and moons spin, the centrifugal force causes them to bulge at their equators. The effect can be very subtle, but good examples of this are Jupiter and Saturn. If you look at a global image of either gas giant, you’ll notice that they’re slightly squished and their middle bulges. These planets’ squished shape is more noticeable because they are the fastest spinning planets in the solar system, Keane said. The faster something spins, the more the centrifugal force acts on it.
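The "faster spin, bigger bulge" relationship follows from the centrifugal acceleration at the equator, a = ω²r. A quick comparison, using approximate rotation periods and equatorial radii for Earth and Saturn (round-number assumptions, not figures from the article):

```python
import math

def centrifugal_accel(radius_m, period_s):
    """Centrifugal acceleration at the equator: a = omega^2 * r."""
    omega = 2 * math.pi / period_s
    return omega**2 * radius_m

# Earth: sidereal day ~86,164 s; Saturn: ~10.7-hour rotation.
earth = centrifugal_accel(6.378e6, 86_164)
saturn = centrifugal_accel(6.027e7, 10.7 * 3600)

print(f"Earth:  {earth:.3f} m/s^2 (a fraction of a percent of surface gravity)")
print(f"Saturn: {saturn:.2f} m/s^2 (a far larger share of its gravity)")
```

Saturn's equatorial centrifugal acceleration comes out dozens of times larger than Earth's, which is why its flattening is visible at a glance while Earth's is not.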

An extreme example of the centrifugal force acting on a body can be seen in the dwarf planet Haumea, which is almost egg-shaped. (Image credit: Instituto de Astrofísica de Andalucía/NASA)

An extreme example of the centrifugal force acting on a body is the dwarf planet Haumea, Keane said. The dwarf planet resides in the Kuiper Belt, a region of icy objects outside the orbit of Neptune. Haumea is about the size of Pluto, but it’s spinning so fast (one complete rotation every four hours) that it’s “almost egg-shaped,” Keane said.

Originally published on Live Science.

The sun let out another flare and the photos are stunning

NASA’s Solar Dynamics Observatory photographs the sun with 10 times the resolution of high-definition television.

NASA’s orbiting Solar Dynamics Observatory captured yet another solar flare blasting from the same overactive sunspot that triggered radio blackouts and stunning aurora displays on Earth earlier this week.

The spacecraft, which watches Earth’s parent star from 22,000 miles (36,000 kilometers) above the planet’s surface, captured the flare, classified as a medium-strength type M, on Thursday (March 31) at 2:35 p.m. EDT (1835 GMT).

The Solar Dynamics Observatory images the sun’s entire disk across a range of wavelengths every ten seconds, providing pictures with a resolution 10 times higher than that of high-definition television, according to NASA. This colorized image in particular shows the flare in the extreme ultraviolet part of the spectrum that highlights its high temperature.

An M-class flare is a fairly powerful flare, a sudden release of electromagnetic radiation from the sun that travels at the speed of light. The U.S. National Oceanic and Atmospheric Administration (NOAA) ranked the Thursday flare as M9.6, meaning it was not too far from becoming the strongest type, X-class. The flare caused a moderate radio blackout as it hit Earth, NOAA said in the statement.
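The letter-and-number ranking comes from the peak soft X-ray flux measured by NOAA's GOES satellites: each letter covers one decade of flux (A from 10⁻⁸ W/m², B from 10⁻⁷, C from 10⁻⁶, M from 10⁻⁵, X from 10⁻⁴), and the number multiplies the letter's base value. A minimal sketch of that scheme:

```python
def goes_class(flux_w_m2):
    """Return the GOES flare class (e.g. 'M9.6') for a peak X-ray flux in W/m^2."""
    for letter, base in [("X", 1e-4), ("M", 1e-5), ("C", 1e-6), ("B", 1e-7), ("A", 1e-8)]:
        if flux_w_m2 >= base:
            return f"{letter}{flux_w_m2 / base:.1f}"
    return "sub-A"

print(goes_class(9.6e-5))  # M9.6 -- the March 31 flare, just shy of X class
print(goes_class(1.0e-4))  # X1.0
```

This makes clear why an M9.6 is "not too far" from X class: one more small step in flux crosses the 10⁻⁴ W/m² threshold.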

Solar flares can disrupt high frequency radio communications as the X-ray and extreme ultraviolet radiation they emit ionize the upper part of Earth’s atmosphere, the ionosphere. The ionosphere extends from 30 miles (48 kilometers) to 600 miles (965 km) above the planet’s surface and includes the outermost atmospheric layers: the exosphere, the thermosphere and parts of the mesosphere.

Under normal circumstances, high frequency radio waves, which transmit communication signals across long distances, bounce off particles in the upper ionosphere back to Earth. But when a solar flare charges up the lower ionosphere, the radio waves lose energy as they pass through it, and the energized layer degrades or even absorbs the signals.

A moderate blackout can disrupt communications for tens of minutes, according to the U.K.’s national weather forecaster Met Office. This type of blackout affects mostly aviation and marine communication, but also radio amateurs and shortwave broadcasting stations. The ionization can also disrupt transmission of signals from navigation satellites, such as the U.S. GPS network.

NASA’s Solar Dynamics Observatory captured this image of a solar flare — as seen in the bright flash in the upper right portion of the image — on March 31, 2022. (Image credit: NASA)

The “magnetically complex” sunspot, called 2975, that gave birth to the latest flare has spurted out about 20 solar flares over the past week including an X-class flare that blasted from the sun on Wednesday (March 30).

A few of these flares have been accompanied by coronal mass ejections (CMEs), which are expulsions of magnetized plasma from the sun’s upper atmosphere, the corona. CMEs travel much slower than flares and usually reach Earth within a few days of forming. When a CME arrives, it may disrupt the planet’s magnetic field, triggering beautiful aurora displays.

On Wednesday night and Thursday early morning, skywatchers reported stunning auroras all over Canada, in the northern parts of the U.S. and in New Zealand.

According to the Met Office, another CME, this one associated with Wednesday’s X-class flare, is due to hit Earth Saturday (April 2) and will likely give the polar lights another boost. Because Earth’s magnetic field lines converge and funnel down into the atmosphere near the poles, the charged solar plasma penetrates deepest into the atmosphere in those regions. The interaction of the charged solar particles with the particles in Earth’s atmosphere results in the glowing spectacles. The stronger the CME, the farther from the poles auroras can be observed.

So, if you can, head poleward and away from city lights over the weekend and check our ‘Where and how to photograph the aurora’ guide.

Aurora-viewing conditions are likely to remain solid next week; additional M-class flares are likely since sunspot 2975 shows no signs of fading just yet, the Met Office said.

First-of-its-kind detection of reduced human carbon dioxide emissions

For the first time, researchers have spotted short-term, regional fluctuations in atmospheric carbon dioxide (CO2) around the globe due to emissions from human activities.

Using a combination of NASA satellites and atmospheric modeling, the scientists performed a first-of-its-kind detection of human CO2 emissions changes. The new study uses data from NASA’s Orbiting Carbon Observatory-2 (OCO-2) to measure drops in CO2 emissions during the COVID-19 pandemic from space. With daily and monthly data products now available to the public, this opens new possibilities for tracking the collective effects of human activities on CO2 concentrations in near real-time.

Previous studies investigated the effects of lockdowns early in the pandemic and found that global CO2 levels dropped slightly in 2020. However, by combining OCO-2’s high-resolution data with modeling and data analysis tools from NASA’s Goddard Earth Observing System (GEOS), the team was able to narrow down which monthly changes were due to human activity and which were due to natural causes at a regional scale. This confirms previous estimates based on economic and human activity data.

The team’s measurements showed that in the Northern Hemisphere, human-generated growth in CO2 concentrations dropped from February through May 2020 and rebounded during the summer, consistent with a global emissions decrease of 3% to 13% for the year.

The results represent a leap forward for researchers studying regional effects of climate change and tracking results of mitigation strategies, the team said. The method allows detection of changes in atmospheric CO2 just a month or two after they happen, providing fast, actionable information about how human and natural emissions are evolving.

Discerning subtle changes in Earth’s atmosphere

Carbon dioxide (CO2) is a greenhouse gas present in the atmosphere, and its concentration changes due to natural processes, like respiration from plants and exchange with the world’s oceans, and human activities, like fossil fuel combustion and deforestation. Since the Industrial Revolution, the concentration of CO2 in the atmosphere has increased nearly 49%, passing 400 parts per million for the first time in human history in 2013.
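The "nearly 49%" figure follows from the commonly cited pre-industrial baseline of about 280 ppm and a modern concentration of roughly 417 ppm; those two round numbers are assumptions here, not values from the article:

```python
preindustrial_ppm = 280  # widely used late-1700s baseline concentration
modern_ppm = 417         # approximate concentration around the time of the study

increase = (modern_ppm - preindustrial_ppm) / preindustrial_ppm
print(f"increase since the Industrial Revolution: {increase:.0%}")
```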

When governments asked citizens to stay home early in the COVID-19 pandemic, fewer cars on the road meant steep drops in the amount of greenhouse gases and pollutants released into the atmosphere. But with CO2, a “steep drop” needs to be put in context, said Lesley Ott, a research meteorologist at NASA’s Global Modeling and Assimilation Office at Goddard Space Flight Center in Greenbelt, Maryland. This gas can last in the atmosphere for up to a century after it is released, which is why short-term changes could get lost in the overall global carbon cycle—a sequence of absorption and release that involves natural processes as well as human ones. The lockdowns of early 2020 are one small part of the total CO2 picture for the year.
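One way to see why the lockdown signal is so subtle is to convert an emissions drop into a concentration change. Using round numbers (roughly 36 gigatons of CO2 emitted per year, and about 7.8 Gt of CO2 corresponding to 1 ppm of atmospheric concentration; both are illustrative assumptions, not figures from the study):

```python
annual_emissions_gt = 36.0  # approximate recent global fossil CO2 emissions per year
gt_per_ppm = 7.8            # CO2 mass corresponding to 1 ppm in the atmosphere
drop_fraction = 0.07        # a mid-range lockdown-era emissions decrease

avoided_gt = annual_emissions_gt * drop_fraction
avoided_ppm = avoided_gt / gt_per_ppm
print(f"~{avoided_ppm:.2f} ppm avoided, versus a typical annual rise of ~2.5 ppm")
```

A few tenths of a ppm is small against both the seasonal swings of the carbon cycle and the ongoing multi-ppm annual rise, which is why disentangling it required high-precision satellite data and modeling.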

“Early in 2020, we saw fires in Australia that released CO2, we saw more uptake from plants over India, and we saw all these different influences mixed up,” Ott said. “The challenge is to try to disentangle that and understand what all the different components were.”

Until recently, measuring these kinds of changes wasn’t possible with satellite technology. NASA’s OCO-2 satellite carries high-precision spectrometers designed to pick up even smaller fluctuations in CO2; combined with the comprehensive GEOS Earth system model, they were a perfect fit to spot the pandemic-related changes.

The COVID-19-related lockdowns granted scientists an unexpected and detailed glimpse as to how human activities impact atmospheric composition. Two recent studies, one focusing on nitrogen oxide and the other examining CO2 concentrations, were able to detect the atmospheric ‘fingerprint’ of the lockdowns in unprecedented detail. Credit: NASA / Katie Jepson

“OCO-2 wasn’t designed for monitoring emissions, but it is designed to see even smaller signals than what we saw with COVID,” said lead author Brad Weir, a research scientist at Goddard and Morgan State University. Weir explained that one of the OCO-2 mission research goals was to track how human emissions shifted in response to climate policies, which are expected to produce small, gradual changes in CO2. “We hoped that this measurement system would be able to detect a huge disruption like COVID.”

The team compared the measured changes in atmospheric CO2 with independent estimates of emissions changes due to lockdowns. In addition to confirming those other estimates, the agreement between emissions models and atmospheric CO2 measurements provides strong evidence that the reductions were due to human activities.

GEOS contributed important information on wind patterns and other natural weather fluctuations affecting CO2 emission and transport. “This study really is bringing everything together to attack an enormously difficult problem,” Ott said.

Taking a closer look at greenhouse gases

The team’s results showed that growth in CO2 concentrations dropped in the Northern Hemisphere from February through May 2020 (corresponding to a global emissions decrease of 3% to 13%), which agreed with computer simulations of how activity restrictions and natural influences should affect the atmosphere.

The signal wasn’t as clear in the Southern Hemisphere, thanks to another record-breaking climate anomaly: The Indian Ocean Dipole, or IOD. The IOD is a cyclical pattern of cooler-than-normal oceans in Southeast Asia and warmer-than-normal oceans in the eastern Indian Ocean (“positive” phase) or the reverse (“negative” phase). In late 2019 and early 2020, the IOD experienced an intense positive phase, yielding a plentiful harvest season in sub-Saharan Africa and contributing to the record-setting Australian fire season. Both events strongly affected the carbon cycle and made detecting the signal of COVID lockdowns difficult, the team said—but also demonstrated GEOS/OCO-2’s potential for tracking natural CO2 fluctuations in the future.

GEOS/OCO-2 data power one of the indicators in the COVID-19 Earth Observing Dashboard, a partnership between NASA, the European Space Agency, and the Japan Aerospace Exploration Agency. The dashboard compiles global data and indicators to track how lockdowns, dramatic reductions in transportation, and other COVID-related actions are affecting Earth’s ecosystems.

The GEOS-OCO-2 assimilated product is available for free download, making it accessible to researchers and students who want to investigate further.

“Scientists can go to this dashboard and say, ‘I see something interesting in the CO2 signal; what could that be?’” said Ott. “There’s all kinds of things we haven’t gotten into in these data sets, and I think it helps people explore in a new way.”

In the future, the new assimilation and analysis method could also be used to help monitor results of climate mitigation programs and policies, especially at the community or regional level, the team said.

“Having the capability to monitor how our climate is changing, knowing this technology is ready to go, is something we’re really proud of,” Ott said.

More information: Brad Weir et al, Regional impacts of COVID-19 on carbon dioxide detected worldwide from space, Science Advances (2021). DOI: 10.1126/sciadv.abf9415

Journal information: Science Advances