How data can help improve air quality control
By Matt Scholl
Air quality monitoring is facing a dilemma.
The amount of air quality data at our fingertips is set to grow exponentially in the coming years. A number of manufacturers are producing high-quality sensors that will significantly increase both the quantity and quality of data available, while simultaneously reducing the cost of monitoring. There is also a growing number of data aggregators opening their platforms to any community or enthusiast who wants to buy their own sensors and forward data to them in exchange for online access. Companies like Weather Underground, owned by IBM, started this with weather stations, and they plan to roll out something similar for air quality in the coming months.
Despite these developments, government agencies continue to support more robust, well-established monitoring methods, and some have begun to ask whether current reference-equivalent monitoring methods can be supplemented with small sensor technology. But as the volume of data grows, another question deserves attention: which methods of analyzing that data will be most appropriate and beneficial for deriving real, valuable outcomes.
One aspect that agencies must consider is big data analytics software. Already used extensively in other fields, such software could readily be adapted to collect and analyze the massive quantity of data that new monitoring technologies and multiple contextual data streams will produce. Data analysis using predictive models, statistical algorithms, and automated methods allows scientists and policy makers to quickly review huge volumes of data.
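To make the idea of automated statistical review concrete, here is a minimal sketch of one such method: flagging readings that deviate sharply from a recent baseline. The function name, window size, and sample values are all hypothetical, chosen only for illustration.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=24, threshold=2.0):
    """Flag readings more than `threshold` standard deviations
    above the mean of the preceding `window` readings."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and readings[i] > mu + threshold * sigma:
            flagged.append(i)
    return flagged

# Hypothetical hourly PM2.5 values (ug/m3) with one spike at the end
hourly_pm25 = [12.0, 11.5, 12.3, 11.8] * 6 + [55.0]
print(flag_anomalies(hourly_pm25))  # -> [24]
```

A production system would of course use far richer models, but even this simple screen shows how an algorithm can sift thousands of readings and surface only the handful that warrant a human analyst's attention.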
Another important consideration for data processing and management is how the results will be presented to the many stakeholders with an interest in them. Vast quantities of scientific data can be adapted and combined with data from other sources – such as digital mapping – to provide a clear and highly effective way to determine not only the source of air quality issues, but also the areas most likely to be impacted and the potential measures to reduce those impacts. Data could also be combined with feeds from weather agencies so that sudden wind shifts or changes in atmospheric conditions can be factored into the equation.
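Combining streams like these is, at its core, a time-alignment problem: each sensor reading needs to be paired with the weather observation in effect at that moment. The sketch below shows one simple way to do that with a nearest-preceding-time join; the field names and sample timestamps are illustrative assumptions, not a real feed format.

```python
from bisect import bisect_right

def attach_weather(sensor_readings, weather_obs):
    """For each sensor reading (timestamp, pm25), attach the most
    recent weather observation (timestamp, wind_dir_deg) at or
    before that time. Timestamps here are minutes since start."""
    w_times = [t for t, _ in weather_obs]
    merged = []
    for t, pm25 in sensor_readings:
        i = bisect_right(w_times, t) - 1
        wind = weather_obs[i][1] if i >= 0 else None
        merged.append((t, pm25, wind))
    return merged

# Hypothetical streams: PM2.5 every 10 min, wind direction every 30 min
sensors = [(0, 14.2), (10, 15.0), (30, 22.8), (40, 31.5)]
weather = [(0, 270), (30, 90)]  # wind shifts from west to east at t=30
print(attach_weather(sensors, weather))
```

In this toy data, the jump in PM2.5 coincides with the wind shift at t=30 – exactly the kind of relationship that only becomes visible once the streams are joined.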
An investment in approaches that combine air quality data from numerous monitoring sources with data processing and analytics that let agencies make sense of it all would significantly improve the management of air quality issues and incident response. Visual analytical methods enable agencies to identify the likely source of problems and respond quickly with a plan supported by measured data, rather than anecdotal evidence and best guesses – often delivered well after the event.
Such tools would also allow analysts to construct forecasting models which show where problems are most likely to occur, allowing agencies to take proactive measures designed to minimize the potential effects on residents.
The availability of more data and better analytics will ultimately help air quality analysts focus on actions that improve the situation, by offering a holistic view rather than forcing them to comb individual subsets of data for a link to a viable solution. Deeper insights can be achieved by combining extensive air quality monitoring and processed or modeled datasets with local data, such as geography, traffic and weather.
Investing in data analytics and visualization tools that can process the growing volume of data generated by sensor networks will undoubtedly improve monitoring agencies' ability to deal with future air quality issues. To be truly effective, though, the air quality community will need to embrace both the volume of data now available to them and the technologies that enable them to make sense of it.
Matt Scholl is Vice President & General Manager – Americas for Envirosuite Limited, a global provider of environmental management technology.