Shelter from the storm
To better understand and protect citizens before, during and after natural disasters, state and local governments are using a combination of smart sensors and emergency response devices to identify areas susceptible to intense damage and to reach those who may be caught in harm’s way. From satellite images to crowd-sourced mapping tools, agencies are adopting data analytics to better predict and prepare for disasters, bolster early warning systems and aid relief efforts in the aftermath of a disastrous event.
A survey conducted by The Economist Intelligence Unit and sponsored by SAS investigated the use of data and analytics in supporting disaster management. The sample included data scientists and professionals across the public, private and non-governmental sectors. The survey found that one-third of respondents consider data analytics very effective in advancing disaster management, and half of those surveyed expected such tools to significantly strengthen early warning systems. Unsurprisingly, 46 percent expected data analytics to significantly improve the speed of aid and relief delivery.
Government agencies are facing a deluge of data
Thanks to advances in technology and measurement tools, agencies are now able to blend big data and weather monitoring to reduce risk and minimize impact – something that would not have been possible just a few short years ago. The availability of new data sources has created new opportunities to mitigate risk, reduce exposure and create behavior-based products.
Agencies gather insights not just from weather events, but also from a diverse range of sources such as sensors, geolocation events, photographs and social media. By modeling previous weather patterns, state and local governments can recognize which weather tracks are developing and put evacuation plans in motion earlier. For example, using data captured during a major flooding event, localities can identify areas prone to flooding and build a new dike where it is needed most.
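As an illustration of the kind of model described above – a minimal sketch, not any agency's actual system, using a hypothetical set of historical flood records with district and peak-water-level fields – flood-prone areas could be flagged like this:

```python
from collections import defaultdict

# Hypothetical historical flood records: (district, peak water level in meters).
# In practice these would come from sensor feeds or archived event data.
flood_records = [
    ("riverside", 2.4), ("riverside", 3.1), ("hillcrest", 0.3),
    ("riverside", 2.9), ("downtown", 1.2), ("downtown", 1.8),
]

FLOOD_THRESHOLD = 1.5  # assumed level (m) above which damage is likely


def flood_prone_districts(records, threshold):
    """Rank districts by how often past peaks exceeded the threshold."""
    counts = defaultdict(int)
    for district, level in records:
        if level >= threshold:
            counts[district] += 1
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)


print(flood_prone_districts(flood_records, FLOOD_THRESHOLD))
# → [('riverside', 3), ('downtown', 1)]
```

The same exceedance-counting idea scales up on a distributed platform; only the data volume and the storage layer change, not the logic.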
With such a gargantuan amount of data being created, are agencies capable of analyzing it, assessing it and combining it with existing data? And despite all the steps agencies take to prepare, Mother Nature has a tendency to throw curveballs. How do you respond in real-time to a change you did not predict?
Agencies need to have a platform that allows teams to respond to changing natural conditions and provide decisions based on those conditions.
Enterprise open source: the calm before, during and after the storm
Enterprise open source solutions can collect, organize and store data more efficiently and accurately than traditional methods. Instead of managing disasters reactively, agencies can now shift to predictive, proactive risk reduction with the help of open source enabled technologies.
Through open source platforms, cities can take advantage of their data and obtain solutions that can slice through massive amounts of data to deliver the right intelligence to the right people in real-time. Enterprise open source solutions have strong enough processing capabilities to sift through unstructured data quickly and even evaluate archived data with predictive analytics. And because seconds count during times of disaster, the real-time processing power of open source could spell the difference between life and death.
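The real-time sifting described above can be sketched in miniature – assuming a hypothetical stream of river-gauge readings rather than any specific product's API – as a filter that emits an alert the moment a reading crosses a danger level:

```python
def alert_stream(readings, alert_level):
    """Yield an alert as soon as a reading crosses the alert level.

    `readings` is any iterable of (station, level) pairs -- in a real
    deployment this would be a message queue or live sensor feed, so
    alerts are produced as data arrives rather than after a batch job.
    """
    for station, level in readings:
        if level >= alert_level:
            yield f"ALERT: {station} at {level:.1f} m"


# Hypothetical incoming readings
incoming = [("gauge-1", 0.9), ("gauge-2", 2.1), ("gauge-1", 2.6)]
for alert in alert_stream(incoming, alert_level=2.0):
    print(alert)
# → ALERT: gauge-2 at 2.1 m
# → ALERT: gauge-1 at 2.6 m
```

Because the function is a generator, nothing waits for the full dataset – each alert is available the instant its reading arrives, which is the property that matters when seconds count.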
The beauty of an open source data platform solution is that it can provide value to government agencies throughout the entirety of the disaster management lifecycle. From the early planning stages to long-term recovery, an enterprise open source data platform makes it possible for agencies to make the most informed decisions to mitigate risk and save lives. It makes it possible to prepare for, respond to and recover from disasters in a variety of ways, including:
- Using historical data to develop more effective evacuation strategies and avoid staffing shortages
- Identifying efficient routes for evacuation during disasters based on traffic data from previous disasters
- Tracking weather events in real-time to be ahead of any unanticipated changes in the storm’s pattern
- Planning and predicting the impact during the event to allocate resources early to decrease the fallout
- Using statistical analysis to allocate budgets and deploy the appropriate number of emergency services in the wake of a storm
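To make the route-planning item concrete – a minimal sketch, assuming a hypothetical road network whose edge weights are travel times estimated from traffic observed in previous disasters – the most efficient evacuation route can be found with a standard shortest-path search:

```python
import heapq

# Hypothetical road network: travel times (minutes) estimated from
# traffic data captured during previous disasters.
roads = {
    "neighborhood": {"bridge": 12, "highway": 25},
    "bridge": {"shelter": 30},  # bridge corridor congests badly in floods
    "highway": {"shelter": 10},
    "shelter": {},
}


def fastest_route(graph, start, goal):
    """Dijkstra's algorithm: return (total minutes, route) or None."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, minutes in graph[node].items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + minutes, nxt, path + [nxt]))
    return None


print(fastest_route(roads, "neighborhood", "shelter"))
# → (35, ['neighborhood', 'highway', 'shelter'])
```

Here the historically congested bridge route (42 minutes) loses to the highway route (35 minutes) – exactly the kind of answer an agency would want before directing an evacuation.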
Improving disaster management through data science
At the end of the day, state and local government agencies are challenged with establishing and managing an IT infrastructure on tight budgets, which oftentimes inhibits their ability to keep pace with technology advancements while sustaining their legacy systems. An enterprise open source platform not only solves the problems agencies are trying to address today, but also addresses the challenges tomorrow’s disasters will bring. Enterprise-ready solutions reduce integration cost and risk while improving the operational effectiveness and efficiency of government infrastructure.
We are seeing the adoption of data science across the disaster management community make a direct impact on how the private and public sectors address disaster situations. Falling storage costs and the widespread availability of Hadoop platforms are putting control of data directly into the hands of agencies. Through the use of open source, agencies are making more informed decisions and, in turn, getting more accurate answers to a wide range of disaster management questions.
Enterprise open source solutions are becoming more widely leveraged to provide valuable insights and actionable intelligence for government agencies facing, preparing for or responding to a natural disaster. By managing data in a central platform, agencies can collect, curate, analyze and deliver real-time data to those in need. In short, enterprise open source software offers a fast, functional and future-oriented IT infrastructure that lets agencies understand disaster data in real-time, while also improving the storage of and access to data for the historical insights and predictive analytics needed to prepare for the next disaster.
Shaun Bierweiler is the vice president of U.S. public sector at Hortonworks.