
PERSPECTIVE FROM A SOCIAL SCIENTIST

Duncan Watts, a principal researcher with Microsoft Research, parsed the issues discussed at the workshop into three categories. In the first category is what he called the representation of ground truth, in which information is gathered and processed to yield a representation of what is happening. (Box 2-1 presents an example of such a representation.) What happens with that information can vary from good to bad, depending on who is using it.

The second category involves the ability to interpret a signal about what is happening to anticipate or predict what will happen. Technologically this is no more difficult than the representation problem. But theoretically it is more difficult because it raises questions about what signals are informative.


The third category involves the facilitation of communication, resolution, and reconciliation. The technological problem in this category is comparatively simple, but the theoretical problem is immense. Giving people cell phones does not, by itself, determine whether things will change for the better or for the worse.

Watts also classified the issues discussed at the workshop according to the audience to whom information is directed or the users of particular tools. External actors may be agencies, NGOs, and self-organizing communities focused on an issue or problem; internal actors include the local communities and people directly affected. The use of the information generated in any of the three categories above—representation, early warning, or communication and facilitation—is very different depending on which set of actors receives it. For example, early warning information bumps up against the problem of political will. Even if information indicates that something is going to happen, external actors may do nothing, or they may communicate the information to a trusted network of internal actors. In the latter case, internal actors need to consider what to do with the information and what the likely consequences of that action might be. If a natural disaster is predicted, will a local population be better off or worse off? Computer scientists refer to this kind of situation as the price of anarchy: many distributed decisions, each reasonable on its own, do not necessarily add up to the best collective outcome. “Simply giving people more information doesn’t necessarily lead to a better outcome, although sometimes it does.”

Technical problems, such as building better real-time awareness tools, can attract an infusion of resources to produce better tools. But political and social problems, such as convincing a policymaker to take a particular action, tend to be harder to solve. Other such problems concern the coordination of responders who converge on a conflict zone to help and the best ways to encourage local communities to resolve their conflicting agendas.

An experimentalist approach to political and social problems, noted  Watts, might be to instrument the world, conduct field experiments to gauge the impacts of different interventions, and measure the results. Such an approach, however, would be insufficient. The technology challenges may be seen as low-hanging fruit for the near term, while agendas for research could be laid out in other areas to work toward long-term solutions.

This way of looking at the issues prompts several questions, Watts noted.  Are human analysts the best way to combine and analyze information, or can this sense making be better handled by machines? How can that capability be tested? If human analysts are used, how should they be organized? What kinds of people are needed? How can their division of labor be established?  “These are standard questions in industrial organization and organizational sociology,” said Watts, “and I think we have good answers to them, but this is certainly an interesting context in which to think about it.”

The most important question is what to do with information once it has been gathered. The answer is associated with a spectrum of social dynamics issues. Communities and nation-states are complex organizations with multiple scales and many things happening simultaneously. Even if someone has a good picture of what is happening at the moment, the ways to improve a situation are not necessarily obvious. Decisions will also depend on whether actions are to be taken by an external or internal actor.

“I don’t have any answers to any of these questions,” said Watts. “But I wanted to emphasize that the technology is extremely exciting.” Many things are possible today that were not possible ten years ago. But it is an illusion, he said, to think that gathering more data and applying more processing power is going to lead inevitably to better outcomes without understanding how systems work.

BIG DATA FOR CONFLICT PREVENTION

The world’s population is generating and processing an immense quantity of digital information, observed Emmanuel Letouzé, a consultant for the United Nations and other international organizations and the author of UN  Global Pulse’s white paper “Big Data for Development: Opportunities and Challenges.”2 He quoted a figure from the University of California that the world’s computers process about 10 zettabytes of information in a single year, the equivalent of 10 million million gigabytes. Furthermore, the number is increasing—“the growth is really ahead.”

“Big data” is not well defined, but it is often characterized in terms of three Vs: volume, variety, and velocity. The volume ranges from kilobytes to petabytes, the variety from ephemeral texts to archived records, and the velocity from real time to batch processing, but all three dimensions are relative and contextual, said Letouzé. Intent and capacity are the central factors affecting the application of technology, but how these play out exactly depends on the technology and the context in which it is applied.

_____________________

2 The paper is available at www.unglobalpulse.org/sites/default/files/BigDataforDevelopment-UNGlobalPulseJune2012.pdf (May 14, 2013).

 

 

Global Pulse has defined four kinds of big data in its work on development. Data exhaust refers to the “passively collected transactional data from people’s use of digital services like mobile phones, purchases, web searches, etc.,” which create networked sensors of human behavior. Online information is “web content such as news media and social media interactions (e.g., blogs, Twitter), news articles, obituaries, e-commerce, job postings”; these data treat Web usage and content as sensors of human intent, sentiments, perceptions, and wants. Data from physical sensors include “satellite or infrared imagery of changing landscapes, traffic patterns, light emissions, urban development and topographic changes, etc.”—information derived from remote sensing of changes in human activity. And citizen-reported or crowdsourced data refers to “information actively produced or submitted by citizens through mobile phone–based surveys, hotlines, user-generated maps, etc.”; this information is critical for verification and feedback.

Global Pulse also has delineated three applications of big data. Early warning is “early detection of anomalies in how populations use digital devices and services,” which can enable faster response in times of crisis. Real-time awareness is the use of big data to produce “a fine-grained and current representation of reality,” which can inform the design and targeting of programs and policies. Real-time feedback is “the ability to monitor a population in real time,” making it possible to understand where policies and programs are failing and make necessary adjustments.
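The early warning application turns on detecting anomalies in streams of usage data. As a minimal, hypothetical sketch of the underlying idea, not a description of any Global Pulse tool, one could flag days on which a volume series, such as daily text message counts, departs sharply from its recent rolling baseline:

```python
# Hypothetical illustration: flag anomalies in a daily volume series
# (e.g., text message counts) with a rolling z-score. The window and
# threshold are arbitrary choices for this sketch, not recommendations.
from statistics import mean, stdev

def flag_anomalies(daily_counts, window=14, threshold=3.0):
    """Return indices of days whose count deviates from the trailing
    `window`-day mean by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(daily_counts[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A sudden spike on the last day is flagged.
counts = [100, 103, 98, 110, 105, 99, 102, 101, 97, 104,
          100, 106, 103, 99, 480]
print(flag_anomalies(counts))  # -> [14]
```

In practice any such rule would need to be tuned to local usage patterns and combined with human review, which is part of the contextualization Letouzé emphasizes below.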

For the use of big data in conflict prevention, Letouzé distinguished between structural and operational efforts. The goal of the former is to understand the ecosystem while identifying the structural drivers of conflict.  The goal of operational prevention is to detect and respond to anomalies through, for example, early warning and response systems. Big data can contribute to both forms of prevention, especially as data become more people centered, bottom up, and decentralized, said Letouzé.

Global Pulse, in partnership with several other organizations, has  analyzed situations analogous to conflict prevention to get a sense of the potential for big data to serve peacebuilding. For example, it has looked at the sociopsychological effects of a spike in unemployment, as measured by online discussions, to seek proxy indicators of upcoming changes, just as the food price index has been a predictor of food riots. And the ability of tweets to anticipate the official influenza rate in the United States similarly demonstrates how big data might provide early warning of emerging events.
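The influenza example rests on a simple idea: a proxy series drawn from online activity should move with, and ideally slightly ahead of, the official series it is meant to anticipate. A hedged sketch of how that relationship could be checked follows; the data and the function are invented for illustration and do not come from the studies described above.

```python
# Hypothetical check of whether a weekly proxy series (e.g., counts of
# flu-related tweets) leads an official series (e.g., reported case rates).
# All values are invented for illustration.
import numpy as np

def lagged_correlation(proxy, official, max_lag=4):
    """Pearson correlation between proxy[t] and official[t + lag]
    for each lead time of 0..max_lag weeks."""
    results = {}
    for lag in range(max_lag + 1):
        x = np.array(proxy[:len(proxy) - lag], dtype=float)
        y = np.array(official[lag:], dtype=float)
        results[lag] = float(np.corrcoef(x, y)[0, 1])
    return results

proxy = [12, 15, 22, 35, 60, 90, 80, 55, 30, 18]               # tweet counts
official = [0.9, 1.0, 1.1, 1.4, 2.0, 3.2, 5.0, 4.5, 3.0, 1.8]  # case rates
print(lagged_correlation(proxy, official))  # highest correlation at lag 1
```

A consistently high correlation at a positive lag is what would make the proxy useful for early warning; it says nothing about why the two series move together.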

Mapping unstructured data generated by politically active users is further evidence of the potential of big data in conflict prevention, Letouzé said. For example, mining the social web during Iran’s postelection crisis in 2009 revealed some evidence of a shift from awareness and advocacy toward organization and mobilization and eventually action and reaction. Similarly, data visualization of the Iranian blogosphere has identified a dramatic increase in religiously oriented users, while a study of tweets associated with the Arab Spring found that in 2010 socioeconomic terms (e.g., income, housing, and minimum wage) largely prevailed, whereas in 2011, 88 percent of tweets mentioned “revolution,” “corruption,” “freedom,” and related terms.
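The Arab Spring comparison amounts to computing, period by period, the share of tweets that mention a given set of terms and tracking how that share shifts. A minimal sketch of that computation is below; the sample tweets and term list are placeholders, not data from the studies Letouzé cited.

```python
# Hypothetical term-share comparison across two periods. The sample
# tweets and the term list are placeholders, not the studies' data.
def share_mentioning(tweets, terms):
    """Fraction of tweets containing at least one of the given terms."""
    terms = [t.lower() for t in terms]
    hits = sum(1 for tweet in tweets
               if any(term in tweet.lower() for term in terms))
    return hits / len(tweets) if tweets else 0.0

tweets_2010 = ["rent and housing costs keep rising",
               "no jobs and my income is not enough",
               "another minimum wage debate today"]
tweets_2011 = ["the revolution will not stop",
               "end corruption now",
               "freedom for all of us"]

political_terms = ["revolution", "corruption", "freedom"]
print(share_mentioning(tweets_2010, political_terms))  # 0.0
print(share_mentioning(tweets_2011, political_terms))  # 1.0
```

At the scale of the actual studies, simple substring matching would give way to proper tokenization and language handling; the point here is only the shape of the computation.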

The evidence, Letouzé explained, indicates that big data could help by providing digital “signatures” that can enhance understanding of human systems, along with digital “smoke signals” of anomalies for early warning and prevention.

However, big data also pose risks and challenges in conflict settings.  (Chapter 4 discusses in detail the misuse of technology in conflict settings.) As Patrick Meier and Jennifer Leaning pointed out in 2009, information and communications technologies, including the use of big data, raise serious concerns about access and security because of the lack of economic development, the prevalence of oppressive regimes, and the increasingly hostile environment for humanitarian aid workers throughout the developing  world.3 In addition, the use of big data for conflict prevention faces many of the same challenges as its use for development, such as digital divides, lack of infrastructure and other resources, and political constraints.

A related challenge concerns the balance between access to data and protection of data producers. Reliability in conflict settings is another issue, especially when people have an incentive to “play the system” or to suppress signals (e.g., by destroying cell towers). Though many people think that data are easy to access, in fact not all data are produced in easily accessible and storable forms, said Letouzé. Furthermore, in a conflict setting, the privacy challenge can become a security challenge.

But the biggest problem Letouzé identified is what he called arrogance or overconfidence. People have a tendency to believe that data mining invariably yields the truth. They may see patterns where none exist, confuse correlation with causation, misunderstand sampling techniques, be misled by sample bias, or lack sufficient computing capacity to interpret the data appropriately. Data scientists and econometricians often do not know enough about the context in which data are generated to distinguish a joke or an offhand comment from a real threat.

___________________

3 Patrick Meier and Jennifer Leaning. 2009. Applying Technology to Crisis Mapping and Early Warning in Humanitarian Settings. Cambridge, MA: Harvard Humanitarian Initiative.

 

Big data can jeopardize the security and privacy of individuals and communities, and this risk may be greater in conflict zones, where big data can also create a new digital divide between or within communities and regions. At worst, big data could function as a sort of Big Brother for a world that is atheoretical, acontextual, and entirely automated, according to Letouzé.

Contextualization is key, especially when lives are on the line, Letouzé concluded. Big data should build on existing systems and knowledge and should be applied incrementally, iteratively, and over the long term, as a tool rather than a driver of change. Nevertheless, big data will continue to grow and develop and will likely eventually play a significant role in conflict prevention.

TECHNOLOGICAL CHALLENGES FOR PEACEBUILDING

Shortly before the workshop, USAID and Humanity United issued the Tech Challenge for Atrocity Prevention. Five key challenges in peacebuilding were presented at the workshop by Patrick Vinck, a research scientist at the Harvard School of Public Health and associate faculty with the Harvard Humanitarian Initiative (HHI).4 These challenges were:

1. Identification of uses of technology to deter enablers of violence— third parties such as multinational corporations and institutions  that finance, arm, coordinate, or otherwise support perpetrators of  violence.

2. Collection of evidence of sufficient quality to be used in court against the perpetrators.

3. Development of methodologies and indicators to assess vulnerability to inter- or intragroup violence.

4. Ability to communicate with and between conflict-affected communities and also the ability of affected communities to communicate  with responders.

5. Development of simple, affordable, trainable, and scalable technologies to enable NGOs and human rights activists to gather or verify  information from hard-to-access areas.

___________________

4 More information is available at www.thetechchallenge.org/#!enablers (May 14, 2013).

 

The collection of information is a central component of these challenges, said Vinck. Significant progress has been made in mining information from new technological sources such as the Internet and social media. In a more active system, individuals in a community, whether volunteers or recruited for the task, would send information to a monitoring system. In particular, smartphones can be used to gather data more quickly, more accurately, and with better controls on where the information has been collected and when.
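Part of what makes smartphone-based collection attractive is that each report can carry its own provenance. The sketch below is hypothetical, not the schema of any project mentioned here; it simply illustrates how a submitted report might bundle the observation with a device timestamp and location so that the entry can later be verified.

```python
# Hypothetical schema for a crowdsourced field report, illustrating the
# provenance (who, where, when) a smartphone can attach automatically.
# Field names and values are invented for this sketch.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class FieldReport:
    reporter_id: str   # pseudonymous ID of a trained community reporter
    category: str      # e.g., "displacement", "armed presence"
    description: str   # free-text observation
    latitude: float    # device GPS fix
    longitude: float
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    verified: bool = False  # set once cross-checked against other sources

report = FieldReport(
    reporter_id="reporter-017",
    category="displacement",
    description="Several families leaving the village toward the river.",
    latitude=-2.51,
    longitude=28.86,
)
print(report)
```

Keeping the reporter identifier pseudonymous and storing such records securely matters as much as the data model itself, given the security and privacy risks raised earlier in the chapter.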

An example from Eastern Congo is a project called Voix des Kivus, in which individuals were selected and trained to report information as it happened in the field. Another example of the adoption of a new technology is the use of satellite images to document the preparation of attacks, a step that has helped to democratize tools previously limited to military use.

These new technological tools hold promise, but there has been very little evaluation of their application, Vinck noted. And the evaluation that has been undertaken reveals a problem of linking information with responses. In the Central African Republic, for example, a system was set up to improve communication between affected communities in the Lord’s Resistance Army area and humanitarian groups. After six months, hundreds of messages had been received from the community, but no humanitarians indicated having responded directly to any of these messages, even though the system was supposed to support two-way communication. “They were gathering and collecting the information but they were not using it,” said Vinck. The same thing happened with the Voix des Kivus project: it was a success in collecting information, but no humanitarians indicated having responded directly to that information.

Vinck also pointed to a disconnect between the technologies discussed at the workshop and what is actually happening on the ground. In some places, less than a third of the population has access to a cell phone, and of that group only a fraction may use text messaging. Text messaging may be common among the most educated people in the community but not, for example, among poor women, so the resulting information may be biased. Access to technology may also vary by geography within a country, which can further distort the information provided. In some places even simple technologies like radios may not work because of a lack of electricity, equipment, or local capacity to fix equipment. Technology has great potential, Vinck said, but biased results may be detrimental to the situation on the ground.

The information collected by communities through technologies is also typically available to those communities, which therefore have a responsibility to respond to that information, according to Vinck. Responses are no longer solely in the hands of international organizations or governments. With satellite imagery, for example, if credible evidence shows troops massing outside a village, the people of that village can respond; they may flee, or they may respond with violence.

Whoever compiles and provides information to a community has a responsibility for what happens with that information, which raises a host of ethical questions. What does the information mean? How should it be interpreted? How should it be shared, and with whom?

Finally, technology can bear witness to what has happened. Sensitive data need to be archived and protected, said Vinck. Many groups in the public and private sectors have collected large amounts of data, but there is no clear responsibility for storing the data.

DISCUSSION

Melanie Greenberg, president and CEO of the Alliance for Peacebuilding, called attention to issues associated with the sharing of data gathered using technologies (the subject of a previous NAE-USIP workshop that she cochaired5). There are particular ethical considerations associated with the sharing of data with the military, for example, as such sharing can affect the security of NGO personnel and their local partners.

Matt Levinger, director of the National Security Studies Program at George Washington University, said his experience as an early warning analyst made him a skeptic about early warnings in general. “It’s hard to predict the future…any number of potential futures are possible.” A better approach, he said, is early detection and adaptive response. In his work on conflict analysis, he thinks of actors as either dividers (potential sources of polarization and conflict) or connectors (potential sources of cohesion). Generally speaking, peacebuilding involves trying to identify and mitigate the effects of the dividers and trying to identify and bolster the connectors.

A key question, then, is: Where do technologies have the potential to make new kinds of connections and boost resilience? “If we start thinking about what information do we need and go from there, we will be in a much better place than if we ask what information the technology allows us to obtain.”

Robert Loftis, a consultant and former State Department official  responsible for conflict stabilization, discussed the need to separate sensing and shaping. Sensing essentially involves reacting to something that is happening. But most conflicts are not surprises, even though their timing may not be known for sure. Sensing technologies can direct humanitarian aid, but, unlike shaping, they do not necessarily change the conflict. (Chapter 5  addresses the path from sensing to shaping.) The question, then, is whether the use of technologies can, in fact, prevent a conflict. Can they be used to help resolve land tenure disputes or differences over water rights before these become violent conflicts? This more anticipatory and active approach involves the dissemination and use of information to reduce differences among people and groups.

___________________

5 The report, Using Data Sharing to Improve Coordination in Peacebuilding, was released in December 2012 and is available on the NAE website (www.nae.edu/66866.aspx) (May 14, 2013).

 

Joseph Bock, director of global health training for the Eck Institute for Global Health at the University of Notre Dame, wondered whether some aspects of big data might be overly hyped. Flashpoints are often single precipitating events, not related to complex pattern analysis, and understanding them may be more important than analyzing big data. Still, he said, big data could be immensely useful in tracking sentiment through media and communications, which today is a labor-intensive task. Combined with the use of sensors to detect conversations, big data could be “incredibly powerful,” though there is also a risk of being massively intrusive.

Fred Tipson called attention to the opportunities provided by technologies that promote collaboration. Peacebuilding is built on interactions among individuals and groups, and technology platforms can facilitate these interactions and broaden the range and effectiveness of the actors involved.