Infrastructure & Cities

Mapping a flood of new data

March 22, 2016

Global

Becca Lipman

Editor, EMEA

Becca is currently a supporting editor and writer for The Economist Intelligence Unit's thought leadership division in the Americas and EMEA. Her primary focus is on healthcare policy and financial market trends. She has also recently developed research programmes that analyse themes in infrastructure and smart cities, as well as C-suite perspectives on talent strategy, small business and IT development. 
 
Before joining the EIU in New York, and later in London, Becca worked as senior editor at Wall Street & Technology where she reported on IT advances in capital markets. She previously held posts as lead editor for a US stock brokerage. Becca earned her bachelor’s degree in both economics and environmental studies from New York University.

One city tweets to stay dry

From drones to old-fashioned phone calls, data come from many unlikely sources. In a disaster, such as a flood or earthquake, responders will take whatever information they can get to visualise the crisis and direct their resources where they are most needed. Increasingly, cities prone to natural disasters are learning to better aid their citizens by equipping local agencies and responders with sophisticated tools that cut through the sheer volume and velocity of disaster-related data and synthesise actionable information.

Consider the plight of the metro area of Jakarta, Indonesia, home to some 28m people, 13 rivers and 1,100 km of canals. With 40% of the city below sea level (and sinking), and with extreme weather events such as torrential monsoon downpours a regular occurrence, Jakarta’s residents face far-too-frequent, life-threatening floods. Despite the unpredictability of flooding conditions, citizens have long taken a passive approach that depended on government entities to manage the response. But the information Jakarta’s responders had on flooding conditions was patchy at best. So in the last few years the government began to turn to the local population for help, and it has paid off.

Today, Jakarta’s municipal government relies on the web-based PetaJakarta.org project and a handful of other crowdsourcing mobile apps, such as Qlue and CROP, to collect data and respond to floods and other disasters. Through these programmes, crowdsourced, time-sensitive data derived from citizens’ social-media inputs have made it possible for city agencies to map the locations of rising floods more precisely and to help the residents at risk. In January 2015, for example, Peta Jakarta received 5,209 flood reports via tweets with detailed text and photos. Whenever there is a flood, the data from these tweets are mapped and updated every minute, and often cross-checked by Jakarta Disaster Management Agency (BPBD) officials through calls with community leaders to assess the information and guide responders.
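
The core of such a pipeline is simple to sketch. The snippet below is a minimal illustration, not PetaJakarta’s actual code: it keeps only geo-tagged, flood-related tweets and groups them into one-minute buckets ready to be plotted on a map. The field names, keywords and sample data are assumptions made for the example.

    # A minimal sketch, not PetaJakarta's actual code: keep only geo-tagged,
    # flood-related tweets and group them into one-minute buckets for map updates.
    # Field names, keywords and the sample data are assumptions for illustration.
    from collections import defaultdict
    from datetime import datetime

    FLOOD_KEYWORDS = {"banjir", "flood"}  # "banjir" is Indonesian for flood

    def is_usable_report(tweet):
        """A report is usable only if it mentions flooding and carries coordinates."""
        text = tweet["text"].lower()
        return tweet.get("coordinates") is not None and any(
            kw in text for kw in FLOOD_KEYWORDS
        )

    def bucket_by_minute(tweets):
        """Group usable reports into one-minute buckets, ready to plot on a map."""
        buckets = defaultdict(list)
        for tw in tweets:
            if is_usable_report(tw):
                minute = tw["created_at"].replace(second=0, microsecond=0)
                buckets[minute].append(tw["coordinates"])
        return dict(buckets)

    # Example: two usable reports and one discarded for missing coordinates.
    sample = [
        {"text": "Banjir di Kampung Melayu", "coordinates": (-6.22, 106.87),
         "created_at": datetime(2015, 1, 12, 8, 3, 10)},
        {"text": "Flood rising near the canal", "coordinates": (-6.19, 106.83),
         "created_at": datetime(2015, 1, 12, 8, 3, 45)},
        {"text": "banjir parah", "coordinates": None,
         "created_at": datetime(2015, 1, 12, 8, 4, 2)},
    ]
    print(bucket_by_minute(sample))

In a real deployment the reports would stream in continuously and feed the live map refresh, but the filter-and-group structure stays the same.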

But in any city, Twitter is only one piece of a very large puzzle. Patrick Meier, author of the book Digital Humanitarians and a proponent of the Peta Jakarta project, says that for social-media data to have an impact on response, they cannot be used in isolation. “The whole point of using additional data sources from social media is to complement existing data sources that may not be available as quickly or in real time, but [help with] triangulation, cross-checking and augmenting.” He explains that a basic map of an immediate situation generated from social media is most effective when it’s overlaid with government-produced population-distribution, socio-economic, weather and topographical data to produce a detailed visual understanding of the area and how it has been affected by the disaster.
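
As a rough illustration of that layering idea, the sketch below joins crowdsourced report locations with one hypothetical government layer, population per grid cell, so that cells with both many reports and many residents rise to the top. The grid size, coordinates and population figures are invented for the example; a real system would work with proper GIS layers.

    # A rough sketch of overlaying crowdsourced reports on a government data layer.
    # Grid size, coordinates and population figures are invented for illustration.
    import math
    from collections import Counter

    CELL = 0.01  # grid cell size in degrees, roughly 1 km at the equator

    def cell_of(lat, lon):
        """Snap a coordinate to the index of its grid cell."""
        return (math.floor(lat / CELL), math.floor(lon / CELL))

    def rank_cells(report_coords, population_by_cell):
        """Rank grid cells by (number of reports) x (people living there)."""
        reports = Counter(cell_of(lat, lon) for lat, lon in report_coords)
        scored = {
            cell: count * population_by_cell.get(cell, 0)
            for cell, count in reports.items()
        }
        return sorted(scored.items(), key=lambda item: item[1], reverse=True)

    # Three tweet locations overlaid on a toy population layer.
    reports = [(-6.22, 106.87), (-6.221, 106.872), (-6.19, 106.83)]
    population = {cell_of(-6.22, 106.87): 12_000, cell_of(-6.19, 106.83): 4_000}
    print(rank_cells(reports, population))

The principle is the same with real layers: each crowdsourced point is joined to the official data underneath it, so responders see not just where reports cluster but how many people those clusters affect.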

Dr Meier adds, “You can never pay 100,000 government officials to be at every street corner.” But with the help of smartphones and citizen engagement, “you can have 100,000 pairs of eyes and ears on the ground.” What makes the Jakarta project especially valuable, he notes, is the city’s unusually large number of Twitter users per capita; indeed, Indonesia is among the top five countries with the most Twitter accounts. Additionally, the Peta Jakarta programme ensures it can extract usable information by sending users specific instructions on how to report flood levels and activate geo-tagging. Typically, he says, only 3% of tweets are geo-located, limiting their utility.

Buy-in from governments

Not all governments are as engaged with their citizens as Jakarta’s. Although Dr Meier has focused for more than a decade on the role of computing and emerging technologies in humanitarian disaster relief, he recalls deep resistance to the idea of crowdsourced data as recently as 2009 in meetings with United Nations and other officials. “It was considered frankly laughable. You would not be taken seriously,” he says. “You would discredit yourself in the process.”

Dr Meier’s actions during the 2010 earthquake in Haiti, which nearly levelled Port-au-Prince, changed many officials’ opinions. Touched personally because his wife was on the ground there and he could not reach her, Dr Meier found a productive way to channel his anxiety. He put together a growing band of digital volunteers, who created a street-by-street crisis map derived from social media such as Twitter, Facebook and YouTube, as well as newspaper, TV and radio reports, later adding e-mails and text messages to the mix. The map was widely used by local responders, the Federal Emergency Management Agency (FEMA) and the US Marine Corps. It is estimated to have saved hundreds of lives and is credited with getting needed supplies to many more trapped victims.

Even with such life-and-death examples, government agencies remain deeply protective of data because of concerns over security, data ownership and citizen privacy. They also worry about liability if incorrect data lead to a response that fails. These concerns hamper the integration of crowdsourced data with operational systems of record and impede the rapid action that disaster situations demand.

“In the US, the National Guard is often called to be a first responder,” explains Jonathan Sury, project director for the National Center for Disaster Preparedness at Columbia University’s Earth Institute. “But since they aren’t a part of any local governmental organisation, they do not have access to information behind the local firewall, leaving them at a disadvantage.” Making matters worse, when multiple response agencies show up without access to data, they may start to collect their own information with no central body to co-ordinate that response for smarter decision-making. “Some cities, like Jakarta, have been making the ability to collect data and openly share access much more available, but we need to get out of this historic need to keep data in silos within government,” Mr Sury adds.

Bringing it all together

For data to be most useful in disaster mitigation, Mr Sury says it’s important to bring regional partners together to explain what systems and processes are available to them, how those systems can be useful in a disaster, what information gaps remain, and how they might help fill them. “Creating awareness of the passively and actively crowdsourced data that can be gathered from social media is an important part of that information process,” he says. “There has to be a human element to the data, as inevitably you’re talking about people and their environment at a community level.”

Dr Meier adds that for all the utility of the Haiti map, his team also struggled with an overflow of information and data: “This can be as paralysing as an absence of it.” Neither crowdsourcing alone nor simply combining those data with other sources is enough: the goal is always to create a more integrated picture. Automated systems powered by artificial intelligence that can cut through the noise and analyse the range of relevant data are important to building usable maps. “No single track of information is perfect on its own,” he says. “You need to be able to mix it all together and say, ‘OK, this is what the situation looks like.’”
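
As a toy illustration of that noise-cutting step, and not of the machine-learning systems digital humanitarians actually deploy, the sketch below scores incoming messages against a small disaster vocabulary and surfaces only the most relevant for human review. The terms, weights and threshold are assumptions.

    # A toy stand-in for the automated triage described above: score messages
    # against a small disaster vocabulary and keep only the most relevant ones.
    # The vocabulary, weights and threshold are assumptions for illustration.
    RELEVANT_TERMS = {"flood": 2, "trapped": 3, "collapsed": 3, "help": 1, "water": 1}

    def relevance(message):
        """Crude keyword score standing in for a trained classifier."""
        return sum(RELEVANT_TERMS.get(word, 0) for word in message.lower().split())

    def triage(messages, threshold=3):
        """Return messages scoring at or above the threshold, most relevant first."""
        scored = sorted(((relevance(m), m) for m in messages), reverse=True)
        return [m for score, m in scored if score >= threshold]

    print(triage([
        "Family trapped on roof, water still rising, please help",
        "Nice weather today",
        "Bridge collapsed on the main road out of town",
    ]))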

New data into the fold

Clearly, “crisis mappers” have learned a great deal from, and since, the Haiti earthquake. In the aftermath of the Nepal earthquake that killed more than 9,000 people in April 2015, teams of digital humanitarians around the world drew on high-resolution satellite imagery and an online global mapping platform used by the UN, the Red Cross and the Nepal Army to identify buildings, roads and workable routes into villages hard-hit by the quake. As in Haiti, they worked quickly to assess how to get needed food, water and medical care to survivors. In addition to drawing on useful texts, tweets and photos, Dr Meier was involved in employing low-flying drones to provide the kind of three-dimensional information that satellite imagery cannot, adding one more tool to the data-gathering toolkit for creating a contextually rich picture when time and resources are in short supply.

The analytic tools for this kind of integration are still experimental and underused, but disaster-prone areas have an opportunity to give themselves, and their citizens, a better chance of success. Dr Meier summarises it succinctly: “User-generated content can be leveraged, filtered, analysed and turned into actionable intelligence to help increase situational awareness and make more informed decisions.” And that can save lives.

