Tech Leaders and Obama Find Shared Problem: Fading Public Trust


The title of this blog post is a headline from the NYTimes on 18 Dec. 2014, B1. Here is a reaction written back in January 2014.

Trust is earned, but government has made little effort to earn it. Indeed, government today is failing to deliver the quality of information security that it routinely managed in the past, before the current big data age. Hence, there exists a certain nostalgia for an America that served its people without endangering their rights. We can recover that America if we make an effort and match our technology to the new challenge.

Once, permission to access information routinely required two factors: “right to know” and “need to know”. Now, only the first factor is required to roam freely through sensitive information archives of unprecedented scale. That failure to enforce sensible access requirements led directly to the Manning and Snowden fiascos.

Trust? It must be earned. The first step has to be the discovery of a way around Executive Order 13587 of October 7, 2011. When President Obama signed that order, he endorsed a rather cogent analysis of the information access and sharing crisis, but the order mandated an administrative solution. I am sorry, but the Mannings and Snowdens don’t follow administrative regulations. Real change is necessary. Change will not be forthcoming, however, without a swift kick to the wrenches that jam the gears of effective governance.

The right direction is this: restore the “need to know” requirement that sensible governments have always imposed on individuals who want to poke around in sensitive files. Our government information systems deleted that requirement when government embraced electronic databases. A modern database very easily implements a “right to know” policy but – being a relatively passive entity – it lacks a perspective on “need to know”. Also, answering a query that is broad enough to detect and stop terrorism does, frankly, require a synoptic view across all data resources. The current system is probably the best we can do with antiquated technology.

With modern technology, on the other hand, data can be kept in small, well protected locations. Algorithms that operate on encrypted data can detect relationships between individual facts and events recorded in the separated locations. Those detected relationships serve as a “need to know” justification to share. That justification alerts people with the “right to know” that they should share and combine just the related facts, without exposing the entire data collection. Such a modern system is proactive. It performs limited sharing based on need and leaves the bulk of sensitive data well protected. It does not wait to be prompted by a showing of probable cause; it draws attention to critical problems and can compel attention. Thus, with modern technology, the public is safer in two ways: data are better protected, and essential facts are pushed into the hands of the individuals who serve to protect the public.
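To make the idea concrete, here is a minimal sketch of how separated data holders might let a passive broker detect a relationship without ever reading the data. The scheme shown here (keyed hashing of record identifiers under a session key the broker never holds) and every name in it are illustrative assumptions, not a description of any deployed system.

```python
import hmac
import hashlib

def blind(record_id: str, shared_key: bytes) -> str:
    # Keyed hash: the broker sees only this token, never the identifier.
    return hmac.new(shared_key, record_id.encode(), hashlib.sha256).hexdigest()

def broker_match(tokens_a, tokens_b):
    # The broker detects overlap between blinded sets without learning content.
    return set(tokens_a) & set(tokens_b)

# Session key shared by the two data holders, withheld from the broker.
key = b"session key known only to the two agencies"
site_a = {blind(r, key): r for r in ["alice", "bob", "carol"]}
site_b = {blind(r, key): r for r in ["carol", "dave"]}

hits = broker_match(site_a.keys(), site_b.keys())
# Each site maps matched tokens back to its own plaintext record.
print([site_a[t] for t in hits])  # -> ['carol']
```

Only the match itself is revealed to the parties; the broker learns that two opaque tokens coincide, which is exactly the “need to know” signal described above.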

Cooperation is the topic of this blog. Security is connected to the evolution of cooperation by the technology that reduces the risk of transactions. People will share and cooperate when there is less risk in the sharing process. Thus technology will enable better cooperation. We – as a society – will cooperate better to advance our causes and overcome our challenges.

Prisoners and Dictators

The following was originally posted 11/1/2008 on etcn.typepad.com and may refer to OpenBEDM under the project name Pygar.

Recently, I attended a lecture by Simon A. Levin (Princeton U.) on Cooperation and Collective Behavior, from Bacteria to the Global Commons.  I wandered into the talk expecting to hear confirmation of themes familiar to me from the pioneering work of R. Axelrod in his 1984 book The Evolution of Cooperation (hereafter EC1984). Certainly Levin said nothing that contradicted the principles uncovered by Axelrod. Indeed, much of the totally new work is foreshadowed by Axelrod’s discussion of the potlatch tradition. But, to be blunt, EC1984 – which frames my discussion on this blog – did not set the agenda for Levin’s talk.

The game has changed; literally, it is about a different game. The research in EC1984 focused on the game of Prisoner’s Dilemma and interpreted cooperation as a way to maximize long-term economic gain. The new research focuses on games with names like Ultimatum Game and Dictator Game in which only short-term gains and losses occur. From the standpoint of EC1984, cooperation would seem unlikely to play a role in either game. Contrary to expectations however, empirical evidence shows that people exhibit “other-regarding” behavior governed by non-economic considerations. They help others without maximizing profit. The lecturer found the new empirical evidence hopeful because problems of the global commons might be solvable if the current population of citizens manifests a strong sense of altruism towards their future descendants.

I wish Dr. Levin well and hope he succeeds in encouraging environmentalism and a responsible society. But counterexamples abound. Today’s headlines (Oct. 29, 2008) are occupied by news about the public funds that were given to bankers to enable them to loosen credit and benefit the general economy. Instead, the bankers are using public funds to award bonuses and finance acquisitions in order to eliminate competitors. Evidently, other-regarding behavior is not found in all social circles; indeed, Levin reminded his audience that other-regarding behavior requires an underpinning from normalizing influences like fear of punishment. Thus, the same banker who would help an acquaintance at the country club, where social reputation is important for the banker, will not help the nation, because no enforceable norms apply to corporate decisions impacting the common good.

Over on the software side (see the Pygar blog), I am developing network-enabled cooperation mechanisms that support the principles of EC1984. The players will operate on the Internet under conditions of the iterated Prisoner’s Dilemma. Nothing on the software side was originally conceived to enable “other-regarding” behavior in anything like the Ultimatum Game.

Normative Mechanisms

The following item was originally published on ectn.typepad.com 11/1/2008 when the OpenBEDM project went under the name Pygar.

A Situation to Motivate the Discussion.

When for example might the community need a normative procedure for negotiations conducted through BEDM? Let’s consider a hypothetical, but plausible, situation.

Suppose that the Department of Homeland Security (DHS), an agency of the U.S. Federal Government, gives the New York Police Department (NYPD), a local police force, a sum of money to purchase and use a New Technology (NT) that gathers information indicative of future terrorist activities. Suppose further that the DHS requires the NYPD to share all the data gathered by the NT with the Federal Bureau of Investigation (FBI) in order to maximize the probability of detecting terrorist activity. The NYPD is naturally grateful for the grant but the last condition is troubling. The NYPD has the following worries:

  1. A flow of data from NYPD to the FBI turns a local police force into a surveillance team for a federal agency – a role that is neither in the charter of the NYPD nor consistent with its culture.
  2. Detection of actual terrorist activity will require fusing local data from the NT with information in the national FBI database; indeed, such data fusion is the goal of the required information flow from NYPD to FBI. However, the NYPD has no confidence that the FBI will respond in a timely fashion to a threat in New York, especially when threats are coming in from all over the country. The NYPD wants the option to act immediately to prevent damage in its jurisdiction.
  3. The NYPD likes the flow of money from DHS and plans to request more. It needs to show effective use of the NT in order to justify a request in the next year’s budget. However, the FBI controls the fused data and the NYPD cannot assess and report the effectiveness of its own efforts.

Consequently, the NYPD proposes a two-way data flow. It will provide data to the FBI if the FBI opens its database to the NYPD. Now the NYPD can perform its own data fusion for fast response to events and an independent assessment of the whole program. This proposal worries the FBI. The FBI does not believe it can trust each and every member of the NYPD. The FBI has some legitimate fears about misuse of FBI data. Among these risks are:

  1. an NYPD insider might discover a planned FBI operation, reveal its details and thwart the operation.
  2. a leak in the NYPD might expose the identity of an FBI informant.
  3. an insider might obtain information from the FBI database and use it for blackmail.

Given their mutual suspicion, can the FBI and the NYPD ever cooperate to exploit the New Technology? With BEDM, yes they can.

Standard BEDM Procedure.

Let us now suppose the DHS engages a neutral, disinterested, third party, say the Audubon Society, to serve as the blind agent/broker for a BEDM operation that fuses the NT data from New York City with the data in the FBI’s central database. Now, both the NYPD and the FBI keep control of their data and jealously guard access to it. The server operated by the Audubon Society fuses encrypted data from both. When the Audubon’s server succeeds, it forwards the still encrypted but fused data records to both the NYPD and the FBI. Now, both police agencies have an opportunity to act on the new data and hopefully prevent an incident.
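A toy sketch of that broker step may help, using the same hypothetical parties and a session key withheld from the broker. Each party submits blinded join keys alongside opaque ciphertexts; the broker fuses only the overlapping records and never decrypts anything. All names and data here are illustrative, not the real BEDM interface.

```python
import hmac
import hashlib

def token(join_key: str, session_key: bytes) -> str:
    # Blinded join key: computable only by parties holding the session key.
    return hmac.new(session_key, join_key.encode(), hashlib.sha256).hexdigest()

def broker_fuse(records_a, records_b):
    # records_*: dict of token -> opaque ciphertext. The broker joins on
    # tokens it cannot invert and returns only the fused pairs.
    common = records_a.keys() & records_b.keys()
    return {t: (records_a[t], records_b[t]) for t in common}

k = b"session key shared by NYPD and FBI, not the broker"
nypd = {token("suspect-17", k): b"<NYPD ciphertext>"}
fbi = {token("suspect-17", k): b"<FBI ciphertext A>",
       token("informant-3", k): b"<FBI ciphertext B>"}

fused = broker_fuse(nypd, fbi)
print(len(fused))  # -> 1: only the overlapping record is fused and forwarded
```

The broker then forwards the fused, still-encrypted pairs to both parties, each of which can decrypt its own view and act on the combination.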

Ideally, this outcome is a win-win situation for both agencies. In the long run, both agencies benefit, and cooperation can only become stronger when both parties benefit. Nevertheless, the temptation to defect and betray trust is always present, and cost-benefit analysis may fail to persuade the emotional side of a human agent. Let us consider what such an agent might do.

A Defection in the Cooperative Enterprise.

Suppose the Audubon’s server fuses data records in a combination that clearly pinpoints a terrorist cell. Nobody at the Audubon Society knows this; the records are encrypted and, besides that, there are so many birds to count. The fused records go to both the FBI and the NYPD. At the FBI, the first person to realize the significance of the data is a low-level official stuck in the same grade for 8 years. Every individual has a rational element and an emotional one. The emotional side of this agent sees a clear path to getting ahead. The FBI agent acts swiftly. An FBI team closes down the terrorist cell and gathers up evidence including explosives. The story is splashed over the evening papers: “A clear and present danger was averted due to swift and competent FBI operations”. The FBI agent formerly stuck in grade is now on a fast track to a promotion.

Meanwhile the NYPD is not happy. To keep the FBI press release short and on-message, the FBI agent gave no credit to the NYPD. Thus, the NYPD has nothing to show for its use of the NT to gather information or its cooperation with the BEDM operation. According to Axelrod’s theory of cooperation, the NYPD is likely to switch to “tit-for-tat” mode. From now on, its reporting of NT data to the BEDM service will be lackadaisical and incomplete. Such a reaction punishes the FBI by reducing its chances for future successes. That is the proper rational response to what was originally an emotional choice. In the long run, it may bring the FBI around to better cooperation with the NYPD. But it is a suboptimal outcome: good work on the part of the NYPD goes unrewarded, and future police work is undermined by a loss of cooperation.

Were it not for the strict encryption of all the information, the NYPD would be able to prove its contribution to the FBI’s achievement and demand proper credit. The encryption is strong, but we can provide an appeal mechanism by which the party who is wronged by a defection can obtain redress from a judge or arbitrator. Here is how it might work.

Here Comes the Judge.

Let us suppose that the DHS appoints a trustworthy arbitrator to adjudicate disputes arising in the use of BEDM. Furthermore, suppose we configure the Pygar software so that the BEDM broker keeps a historical record of the fused, encrypted data records while each party keeps a historical record of the keys that were used for each data fusion transaction. Now the following scenario is possible.

The NYPD first demands credit from the FBI but the FBI denies that it used NYPD data to break up the terrorist cell. Consequently, the NYPD appeals to the DHS appointed arbitrator. The NYPD has decoded the fused data records and could provide them as evidence to the arbitrator but the FBI could claim these records were fabricated. However, encrypted and date-stamped data records supporting the NYPD position are stored with the Audubon Society, a neutral party. The arbitrator requests a copy of the encrypted records from the Audubon Society. Meanwhile, the NYPD gives the arbitrator the transaction key necessary to decode the records. The arbitrator can then decode the fused data records, assess the merits of the NYPD claim, and rule on the validity of the claim.
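The appeal mechanism can be sketched in a few lines. Everything here is illustrative: a toy XOR stream cipher stands in for the real encryption, and the broker’s archive is a simple in-memory log of timestamped ciphertexts that the arbitrator can decrypt once the claimant surrenders its transaction key.

```python
import hashlib
from datetime import datetime, timezone

def keystream(key: bytes, n: int) -> bytes:
    # Derive n bytes of keystream from the key via a hash counter.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    # XOR stream cipher: applying it twice with the same key decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The broker archives only (timestamp, ciphertext); it cannot read the payload.
tx_key = b"transaction key retained by the NYPD"
plaintext = b"fused record: NT sensor hit matched an FBI case file"
broker_log = [(datetime.now(timezone.utc).isoformat(),
               xor_encrypt(tx_key, plaintext))]

# Arbitration: the NYPD hands the arbitrator tx_key; the arbitrator fetches
# the date-stamped ciphertext from the broker and decrypts it independently.
stamp, ciphertext = broker_log[0]
recovered = xor_encrypt(tx_key, ciphertext)
assert recovered == plaintext
```

Because the ciphertext comes from the neutral broker and the key from the claimant, neither party alone can fabricate the evidence the arbitrator examines.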

We call this a normative process because it punishes bad behavior. In the long run, that helps develop a culture where people behave the right way because they must do so to avoid censure or penalty and to keep their standing in the community. In contrast, a culture in which members can get away easily with hurting or betraying each other will quickly become uncooperative and unproductive. This seems intuitively correct; moreover, the recent research on the evolution of cooperation supports the importance of normative mechanisms in developing effective cooperation between competing parties.

Consequently it is likely that the final version of the Pygar software will include the features necessary to set up a judge or arbitrator for disputes.

Postscript.

At the beginning, I said the scenario is plausible. Someone sufficiently high in the DHS would argue that the scenario is impossible because rules and regulations mandate cooperation. But that is not the real world agents and analysts live within. For a description of the real world, I heartily recommend the article “Open Source Spying” by Clive Thompson in the Dec. 6, 2006 edition of the New York Times (or see Thompson’s blog: http://www.collisiondetection.net/mt/archives/2006/12/yesterday_the_n.php ).


Evolving Cooperation on the Network

How Does Cooperation Evolve from Competition?

The following essay was published 11/3/2008 to state a larger purpose for OpenBEDM, which was then called the “Pygar Project”.

Motivation in our society comes from the competitive instinct: get ahead, get rich, get recognition. However, success always depends on effective cooperation. The rich, the famous, the well-known belong to networks of cooperating individuals. The top competitor is never a lone individual but always a member of a clique or network.  

To understand how society and the economy actually work, we need to explain how success is driven by competition but accomplished through cooperation. I believe this will be a matter of national survival. Our society must compete against other societies with different values and political systems. If our networks fail us by undercutting the cooperation within our group, our values and uniqueness will be lost. My thesis is that the Internet is failing us because it is impeding important modes of cooperation. A new Internet method is essential.

True, the Internet offers endless on-line information resources. But where vital interests are concerned – national security, medical situations and treatments, financial dealings – the data are locked up in secure off-line databases. We are plugging into the web in great numbers. So are our doctors, our government and our security forces. But, too much of the critical information is locked up in secure databases and unusable. Data security is necessary; but it prevents essential cooperation between problem-solvers who want to increase our society’s health, security, and well-being.

Forward-looking technology experts lament the fact that people still rely on traditional face-to-face communication when the topic is really vital. Such experts believe ordinary users are just backward. Actually, ordinary people are exhibiting a wisdom based on millions of years of social interaction: they realize we should distrust the Internet! Evolution taught our species to avoid a bad deal even before we invented writing. There is something wrong with the Internet that is obvious at the gut level. Forget the technology; we need to address the gut issue of risk to life and property.

To see what needs fixing on the Internet, we need to return to the question: why does cooperation emerge as the successful strategy in a world based on individual competition? That question was answered brilliantly by Dr. Robert Axelrod in his work on the evolution of cooperation. Briefly, the answer is that cooperation is a highly effective competitive strategy. People who adopt a pattern of cooperation succeed against people who adopt alternatives like pure competition and hierarchical control. Under the proper conditions, cooperation evolves in a competitive landscape, drives out unproductive patterns of competition, and everybody wins. Under poor conditions, competition reduces a society to a subsistence-level economy and political subservience to a foreign power with a better model for social interaction.

In the Pygar Software Project, we will take Axelrod’s ideas, apply them to the Internet, and create conditions that foster cooperation and ensure a prosperous, secure society. To explain the idea takes several steps.

  1. First, we need to discuss the research on cooperation;
  2. then look at the successful strategies;
  3. next, look at how the current Internet undermines and prevents those successful strategies;
  4. and finally, explain how Pygar fixes the problems with the Internet.

Axelrod’s Research on Cooperation

The study of cooperation predates the Internet revolution. University of Michigan Political Science Professor Robert Axelrod propelled the subject into public discussion with the publication of his book The Evolution of Cooperation in 1984. This work provides the social engineering context for the Pygar Project.

In his study, Axelrod and his collaborators showed that humans, and indeed most social animals, exhibit an ability to cooperate with others as a means to maximize self-interest. The study explained how this cooperation arises spontaneously when conditions are right. For many people, the most surprising observation in the study was the volume of evidence showing that voluntary, elective cooperation is essential within government, corporations, and even military units, even though those organizations are theoretically based on formal, legal obligations. Organizations whose structure and culture foster natural social cooperation tend to succeed, while those that stifle it in favor of hierarchical authority eventually fail.

Axelrod’s notion of the Evolution of Cooperation is the product of both experiments and historical analysis. Reducing the historical patterns to idealized game-theoretical strategies, Axelrod showed that successful strategies follow closely a simple, idealized strategy called tit-for-tat. Tit-for-tat advises competitors to cooperate at first and then take the cue for their next action from how the other player acted on the previous move. The strategy is cooperative because cooperation is the default decision, but the strategy is also principled because it looks out for self-interest. A player following tit-for-tat will not continue cooperation if the other player fails to respond in kind. Lastly, the strategy is forgiving because it reciprocates cooperation in the present moment and forgets past transgressions.
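Tit-for-tat is simple enough to state as code. The following sketch plays an iterated Prisoner’s Dilemma under Axelrod’s conventional payoffs (T=5, R=3, P=1, S=0); it is a minimal illustration of the strategy, not part of the Pygar software.

```python
# Payoff table: (my_score, their_score) for moves (mine, theirs);
# "C" = cooperate, "D" = defect.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_history, their_history):
    # Cooperate first; thereafter mirror the opponent's last move.
    return "C" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    ha, hb = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(ha, hb)
        b = strategy_b(hb, ha)
        pa, pb = PAYOFF[(a, b)]
        ha.append(a)
        hb.append(b)
        score_a += pa
        score_b += pb
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # -> (9, 14)
print(play(tit_for_tat, tit_for_tat))    # -> (30, 30)
```

Against a pure defector, tit-for-tat is exploited exactly once and then settles into mutual defection; against itself, it sustains full cooperation, which is why it thrives when interactions are repeated.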

Prof. Axelrod’s argument considers a multitude of issues and draws on many lines of evidence. Only the broad conclusions can be given here. Successful patterns of cooperation evolve during a competitive game when several assumptions are true:

  1. the players interact often and expect to remain players indefinitely,
  2. the players know each other or at least can attribute actions reliably to the responsible agent, and 
  3. each interaction is brief and less consequential than the sum of repeated actions.

Axelrod’s analysis showed that not only are these conditions necessary for tit-for-tat to succeed, they are also essential conditions for all voluntary cooperation. For a detailed analysis, the reader should turn to Axelrod’s book or any of the subsequent studies. The three assumptions are enough to explain the limitations of established technology and the value of the new approach applied in Pygar.

Critique of Integrated Information Networks

Let us consider today’s attempts at cooperative activity on the Internet in the context of the three assumptions required for principled cooperation.

Current procedures leave servers accessible to individuals with no confirmed identity. It is difficult to attribute information, requests, or orders to a fixed identity; thus, the second assumption is violated.

The Pygar Project addresses the identity problem with digital IDs (not a novel idea, of course), but Pygar adds another important change: interactions take place with the aid of a broker. Over time, people can develop trust with an intermediary, for example Google or eBay, and that trust may alleviate some of the fears of dealing with less well-known parties.

The largest obstacle by far to successful widespread integration of information via the Internet is the violation of the third principle of cooperation: the risk from each interaction should be bounded and small. In contrast, a large, open database is exposed to serious exploitation with potentially grievous consequences. Consequently, the most important databases are closed and private. The people who guard such data do not cooperate; in fact, it may be illegal for them to do so.

The Advantages of Pygar’s Approach

The Pygar Project’s goal is to protect most of an agent’s information resources from misuse. Sharing is based on small quantities of information that are exchanged for a specific reason. Although the limited amount of shared data represents a risk, the bulk of the data are protected. The parties involved in the exchange have verifiable identities so that future cooperation can be extended or withheld based on the past behavior of the other party. The significance of the small quantity of shared data is ensured by a blind broker. The blind broker is prevented from reading any data at all, but the methods allow a blind broker to determine when hidden data items meet the search requirements of the parties who are engaged in the private cooperation. 

Thus, with Pygar, the interaction between parties consists of an extended series of small, discretionary two-way interactions as opposed to large-scale, broadly inclusive sharing agreements. The Pygar Internet method helps develop a long-term mutual interest in the relationship by preventing major betrayals of trust and rewarding long-term, mutually beneficial cooperation.