

Poster

A Markov Decision Process Model for Socio-Economic Systems Impacted by Climate Change

Salman Sadiq Shuvo · Yasin Yilmaz · Alan Bush · Mark Hafen

Virtual

Keywords: [ Computational Social Sciences ] [ Deep Reinforcement Learning ] [ Sustainability, Climate and Environment ] [ Social Good Applications ] [ Applications - Other ]


Abstract:

Coastal communities are at high risk from natural hazards due to unremitting global warming and sea level rise. Both the catastrophic impacts, e.g., tidal flooding and storm surges, and the long-term impacts, e.g., beach erosion, inundation of low-lying areas, and saltwater intrusion into aquifers, cause economic, social, and ecological losses. Creating policies through appropriate modeling of the responses of stakeholders, such as government, businesses, and residents, to climate change and sea level rise scenarios can help reduce these losses. In this work, we propose a Markov decision process (MDP) formulation for an agent (government) which interacts with the environment (nature and residents) to deal with the impacts of climate change, in particular sea level rise. Through theoretical analysis we show that a reasonable government's policy on infrastructure development ought to be proactive and based on detected sea levels in order to minimize the expected total cost, in contrast to a straightforward government that merely reacts to observed costs from nature. We also provide a deep reinforcement learning-based scenario planning tool that considers different government and resident types in terms of cooperation, as well as different sea level rise projections by the National Oceanic and Atmospheric Administration (NOAA).
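The MDP framing and the proactive-versus-reactive comparison can be illustrated with a minimal, self-contained Python sketch. Everything below is a hypothetical toy instantiation for illustration only: the state variables (detected sea level, protection level, time), the two actions (do nothing, invest in infrastructure), and all cost and sea-level-rise parameters are assumptions, not the paper's calibrated model or the NOAA projections.

```python
import numpy as np

class SeaLevelRiseMDP:
    """Toy MDP sketch: a government agent chooses infrastructure actions while
    sea level rises stochastically; costs accrue from flood damage and construction.
    All dynamics and cost values are illustrative placeholders."""

    def __init__(self, horizon=50, rise_rate=0.03, rise_std=0.01,
                 build_cost=1.0, damage_scale=5.0, protection_step=0.05, seed=0):
        self.horizon = horizon                   # planning horizon in years
        self.rise_rate = rise_rate               # mean annual sea level rise (m)
        self.rise_std = rise_std                 # std of annual rise (m)
        self.build_cost = build_cost             # cost of one infrastructure action
        self.damage_scale = damage_scale         # cost per meter of unprotected rise
        self.protection_step = protection_step   # protection gained per action (m)
        self.rng = np.random.default_rng(seed)
        self.reset()

    def reset(self):
        self.t = 0
        self.sea_level = 0.0    # detected sea level rise so far (m)
        self.protection = 0.0   # cumulative protection height (m)
        return self._state()

    def _state(self):
        return np.array([self.sea_level, self.protection, self.t], dtype=np.float32)

    def step(self, action):
        """action: 0 = do nothing, 1 = invest in protective infrastructure."""
        cost = 0.0
        if action == 1:
            self.protection += self.protection_step
            cost += self.build_cost
        # Nature: stochastic annual sea level rise.
        self.sea_level += max(0.0, self.rng.normal(self.rise_rate, self.rise_std))
        # Residents/economy: damage grows with unprotected exposure.
        exposure = max(0.0, self.sea_level - self.protection)
        cost += self.damage_scale * exposure
        self.t += 1
        done = self.t >= self.horizon
        return self._state(), -cost, done   # reward = negative cost

def rollout(env, policy, episodes=200):
    """Average total cost of a policy over random rollouts."""
    totals = []
    for _ in range(episodes):
        state, done, total, prev_cost = env.reset(), False, 0.0, 0.0
        while not done:
            action = policy(state, prev_cost)
            state, reward, done = env.step(action)
            prev_cost = -reward
            total += -reward
        totals.append(total)
    return np.mean(totals)

# Proactive: build whenever detected sea level exceeds current protection.
proactive = lambda s, _: int(s[0] > s[1])
# Reactive: build only after a large cost has already been observed.
reactive = lambda s, prev_cost: int(prev_cost > 2.0)

env = SeaLevelRiseMDP()
print("proactive avg total cost:", rollout(env, proactive))
print("reactive  avg total cost:", rollout(env, reactive))
```

In this sketch the two fixed rules stand in for the government types compared in the abstract; the deep reinforcement learning planning tool would instead learn the action choice from the state, but the state, action, and cost structure would play the same roles.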
