Disaster management – Artifex.News

Fighting every wildfire makes bigger fires more extreme, study says
Artifex.News | Tue, 26 Mar 2024 | https://artifex.news/article67993566-ece/


A soldier works to contain wildfires in Nogales, in the High Mountains area of Veracruz state, Mexico, Monday, March 25, 2024.
| Photo Credit: AP

In the U.S., wildland firefighters are able to stop about 98% of all wildfires before the fires have burned even 100 acres. That may seem comforting, but decades of quickly suppressing fires have had unintended consequences.

Fires are a natural part of many landscapes globally. When forests aren’t allowed to burn, they become more dense, and dead branches, leaves and other biomass accumulate, leaving more fuel for the next fire. This buildup leads to more extreme fires that are even harder to put out. That’s why land managers set controlled burns and thin forests to clear out the undergrowth.

However, fuel accumulation isn’t the only consequence of fire suppression.

Fire suppression also disproportionately reduces certain types of fire. In a new study, my colleagues and I show how this effect, known as the suppression bias, compounds the impacts of fuel accumulation and climate change.

What happened to all the low-intensity fires?

Most wildfires are low-intensity. They ignite when conditions aren’t too dry or windy, and they can often be quickly extinguished.

The 2% of fires that escape suppression are those that are more extreme and much harder to fight. They account for about 98% of the burned area in a typical year.

In other words, trying to put out all wildfires doesn’t reduce the total amount of fire equally – instead, it limits low-intensity fires while extreme fires still burn. This effect is worsened by climate change.

Too much suppression makes fires more severe

In our study, we used a fire modeling simulation to explore the effects of the suppression bias and compare them to the effects of global warming and fuel accumulation alone.

Fuel accumulation and global warming both inherently make fires more severe. But over thousands of simulated fires, we found that allowing forests to burn only under the very worst conditions increased fire severity by the same amount as more than a century’s worth of fuel accumulation or 21st-century climate change.

The suppression bias also changes the way plants and animals interact with fire.

By removing low-intensity fires, humans may be changing the course of evolution. Without exposure to low-intensity fires, species can lose traits crucial for surviving and recovering from such events.

After extreme fires, landscapes have fewer seed sources and less shade. New seedlings have a harder time becoming established, and for those that do, the hotter and drier conditions reduce their chance of survival.

In contrast, low-intensity fires free up space and resources for new growth, while still retaining living trees and other biological legacies that support seedlings in their vulnerable initial years.

By quickly putting out low-intensity fires and allowing only extreme fires to burn, conventional suppression reduces the opportunities for climate-adapted plants to establish and help ecosystems adjust to changes like global warming.

Suppression makes burned area increase faster

As the climate becomes hotter and drier, more area is burning in wildfires. If suppression removes fire, it should help slow this increase, right?

In fact, we found it does just the opposite.

We found that while conventional suppression led to less total area burning, the yearly burned area increased more than three times faster under conventional suppression than under less aggressive suppression efforts. The amount of area burned doubled every 14 years with conventional fire suppression under simulated climate change, instead of every 44 years when low- and moderate-intensity fires were allowed to burn. That raises concerns for how quickly people and ecosystems will have to adapt to extreme fires in the future.
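The doubling times above imply very different annual growth rates. As a quick back-of-the-envelope check (mine, not a calculation from the study), converting each doubling time to an annual rate shows why 14-year doubling is roughly three times faster than 44-year doubling:

```python
import math

def annual_growth_rate(doubling_years: float) -> float:
    """Annual fractional growth implied by a doubling time:
    (1 + r) ** T = 2  =>  r = 2 ** (1 / T) - 1."""
    return 2 ** (1 / doubling_years) - 1

r_conventional = annual_growth_rate(14)  # ~5.1% more area burned per year
r_relaxed = annual_growth_rate(44)       # ~1.6% more area burned per year

print(round(r_conventional, 3))                 # 0.051
print(round(r_relaxed, 3))                      # 0.016
print(round(r_conventional / r_relaxed, 1))     # 3.2
```

The ratio of about 3.2 between the two rates matches the study's "more than three times faster" framing.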

The increase in the amount of area burned is undoubtedly driven by climate change. But our study shows that the rate of this increase may also be a result of conventional fire management.

The near total suppression of fires over the last century means that even a little additional fire in a more fire-prone future can create big changes. As climate change continues to fuel more fires, the relative increase in area burned will be much bigger.

This puts more stress on communities as they adapt to increased extreme wildfires, from dealing with more wildfire smoke to even changing where people can live.

A way forward

To address the wildfire crisis, fire managers can be less aggressive in suppressing low- and moderate-intensity fires when it is safe to do so. They can also increase the use of prescribed fire and cultural burning to clear away brush and other fuel for fires.

These low-intensity fires will not only reduce the risk of future extreme fires, but they also will create conditions that favor the establishment of species better suited to the changing climate, thereby helping ecosystems adapt to global warming.

Coexisting with wildfire requires developing technologies and approaches that enable the safe management of wildfires under moderate burning conditions. Our study shows that this may be just as necessary as other interventions, such as reducing the number of fires unintentionally started by human activities and mitigating climate change.

The Conversation

Mark Kreider, Ph.D. Candidate in Forest and Conservation Science, University of Montana

This article is republished from The Conversation under a Creative Commons license. Read the original article.



India set to transition to hyperlocal extreme weather forecasting
Artifex.News | Sat, 10 Feb 2024 | https://artifex.news/article67829459-ece/


Weather forecasting plays a pivotal role in the country’s functioning. Accurately predicting rain, cyclones, heatwaves and droughts is critical to informing decision-making on disaster management. In India, the India Meteorological Department (IMD) is the principal government agency for all matters relating to meteorology, and it specialises in the incredibly complex science of predicting weather patterns by observing, modelling and interpreting a multitude of variables.

However, in tropical countries like India, weather variability is inherently higher. IMD’s forecasts have improved vastly in the last few years as it has upgraded to technologies similar to the ones used by the U.S., the U.K. and Japan, which are known to produce accurate forecasts. Yet there are still many days and geographies for which Indian forecasts go wrong, especially during winter and the summer monsoon.

One of the major hurdles is the lack of weather-monitoring ground stations. Currently, IMD operates around 800 automatic weather stations (AWS), 1,500 automatic rain gauges (ARG) and 37 Doppler weather radars (DWR), against a total requirement of more than 300,000 ground stations (AWS/ARG) and around 70 DWRs. Notably, several Indian State governments and private companies operate a significant network of ground stations (more than 20,000), many of which are not currently used by IMD, for reasons ranging from the inaccessibility of the data to doubts about their reliability.

Currently, most of the prediction software used in forecasting is based on the Global Forecast System (GFS) and the Weather Research and Forecasting (WRF) model, neither of which is the most modern. In contrast, with the government’s focus on Make in India, its support for start-ups, and the large public and private investments in the agriculture and climate sectors, many new-age companies are switching to artificial intelligence/machine learning (AI/ML) for better predictions. At the same time, these technologies are only as good and verifiable as the data they can access. There is thus an urgent need for an integrated system to fill these data gaps: new ground stations will have to be installed, and the available data shared seamlessly.

A promising step forward came recently when the Department of Agriculture & Farmers Welfare, under the Ministry of Agriculture & Farmers Welfare, initiated the weather information network and data system (WINDS) to generate long-term, hyperlocal weather data. The system will also promote the data for wider applications in agriculture and other sectors, help create a national-level database, and assist in establishing the protocols required for various public and private entities to access country-wide data. Under this programme, more than 200,000 ground stations (AWS and ARG) will be installed, which can help tremendously in enhancing weather data utilisation and thus in improving weather predictions and decision-making.

Meanwhile, air pollution continues to be a challenge. Last month, large, dense fog blanketed the National Capital Region (NCR), leading to near-zero visibility, mainly because of the high levels of particulate matter and smog in the atmosphere. Fog can trap pollutants close to the ground, leading to an increase in respiratory and other health-related issues. Another cause for worry is that in foggy conditions, some pollutants like nitrogen oxides can react with other compounds to form secondary pollutants, posing severe health risks, particularly to children and the elderly.

Air quality monitoring systems are currently very expensive and tend to be imported. Thanks to the Make in India initiatives, however, many Indian companies have started to manufacture low-cost and highly reliable sensor-based air quality monitoring systems. These are also easy to install and have low maintenance costs, so it is now possible to deploy a large number of such instruments quickly, especially in urban areas. The IITs are helping as well: they have started centres of excellence in this domain, with a mission to establish a nation-wide network of affordable air quality sensors. An integrated AI/ML-based model fed with data from the new air quality and weather sensors would be a major step towards accurately predicting fog, helping in timely decision-making around transportation and the health-related impacts of air pollution.
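As one illustration of how such an integrated model might combine sensor readings into a fog forecast, the sketch below scores fog risk with a simple logistic function. Every feature, weight and threshold here is a hypothetical placeholder of my own; a real system would learn its parameters from historical IMD and air quality data rather than use hand-set values:

```python
import math

# Hypothetical weights: higher PM2.5 and humidity raise fog risk,
# stronger wind lowers it. A real model would fit these from data.
WEIGHTS = {"pm25": 0.02, "humidity": 0.08, "wind": -0.5}
BIAS = -8.0

def fog_risk(pm25: float, humidity: float, wind: float) -> float:
    """Logistic score in (0, 1): a probability-like fog risk
    from PM2.5 (ug/m3), relative humidity (%) and wind speed (m/s)."""
    z = (BIAS
         + WEIGHTS["pm25"] * pm25
         + WEIGHTS["humidity"] * humidity
         + WEIGHTS["wind"] * wind)
    return 1 / (1 + math.exp(-z))

# High particulates, near-saturated air, calm winds -> high risk
print(round(fog_risk(pm25=350, humidity=98, wind=0.5), 2))
# Cleaner, drier, breezier conditions -> low risk
print(round(fog_risk(pm25=40, humidity=50, wind=4.0), 2))
```

The design choice being illustrated is data fusion: once the WINDS weather stations and the low-cost air quality sensors feed one database, features from both networks can enter a single model instead of being forecast in isolation.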

With these recent advances, India is poised to establish a world-class, robust air quality and weather information network. What will bring everything together is a time-bound, inclusive approach by the various stakeholders to create this most important piece of national infrastructure. Once seamless data sharing and systems integration are achieved, India will have access to a new information gateway, one that is critical to addressing our climate and environmental challenges.

(Sachchida Nand Tripathi is Professor, Department of Civil Engineering, IIT Kanpur, and Ashish Agarwal is Founder & Chief Technology Officer, Ingen Technologies)


