Author Topic: Smile, you're on Guantanamo Camera


Lanya

Smile, you're on Guantanamo Camera
« on: August 13, 2007, 12:26:09 AM »
Homeland Security tests automated "Hostile Intent" detector

By John Timmer | Published: August 12, 2007 - 09:31PM CT

An uncomfortable fact of modern security is that too many people go through transit hubs and to public events for all of them to be screened both efficiently and thoroughly. As a result, there has been a lot of attention focused on producing automated systems that screen crowds without the need for human intervention. Automated biometric scans have serious limits, however, in that they can only identify people who have already been classified as threats. The Department of Homeland Security is hoping to overcome that limitation by automating the identification of individuals whose behavior suggests they pose a threat via a program dubbed "Hostile Intent."

The program has a foundation in both real science and prior experience. The human brain is constantly balancing input from both conscious decisions and reflexive actions. When the two conflict, this debate can produce subtle alterations in the timing and appearance of conscious actions, as well as in physiological responses. Trained individuals can read those cues to identify aberrant behavior; a report on the Hostile Intent program in the New Scientist indicates that the DHS has already deployed such individual examiners in airports, where they've apprehended a number of drug smugglers and money launderers.

The big leap will be shifting that sort of expertise to an automated system. At least in the case of facial expressions, a significant amount of work has already been done. Researchers in the Netherlands have demonstrated a system that recognizes and rates facial expressions, and it appears capable of working in real time, as shown by videos the researchers made of a mood-based Pong game.

The DHS doesn't plan on relying on facial expressions alone, however. In information provided to Ars, the DHS said that it would incorporate a wide variety of "hints" that are already in use by people trained in this area: "hostile intent indicators are composed of behavioral, speech, and physiological cues that are derived from operational and laboratory experiments and reflect the screening and interviewing mission objectives of DHS' operational community," we were told. The DHS hopes that these cues can identify not only those who pose immediate risks but also individuals who are likely to pose risks in the indefinite future.

At least in controlled settings, the system they have developed appears to work reasonably well. According to the DHS, "Preliminary results from the independent verification and validation process yield a multi-modal set of indicators with a detection accuracy of 87 percent for both primary and secondary screening within a research protocol validated by the DHS operational customers." Those operational customers include the defense and intelligence communities as well as regional port authorities. We requested a copy of the experimental protocols used but have not received a response from the DHS at the time of this report.

From a purely scientific standpoint, a success rate near 90 percent is quite impressive for a complex system such as this. It's worth considering, however, just what a 13 percent failure rate means relative to the number of people who go through a major metropolitan airport in a single day. Kennedy Airport in New York, for example, handles over 100,000 passengers on 1,000 flights a day. Applied naively to that volume, a 13 percent failure rate works out to thousands of misclassified travelers every day; still, the screening device would be only one tool among many used to flag travelers for additional screening.
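The back-of-the-envelope arithmetic above can be made explicit. This is only a naive estimate (it applies the reported accuracy figure directly to daily traffic, ignoring base rates and how the two screening stages interact), with the passenger count and accuracy taken from the figures cited in the article:

```python
# Naive estimate, not the DHS protocol: apply the reported 87%
# accuracy directly to JFK's cited daily passenger volume.
passengers_per_day = 100_000   # Kennedy Airport figure from the article
accuracy = 0.87                # DHS-reported detection accuracy

# Passengers the system would get wrong on an average day
misclassified = round(passengers_per_day * (1 - accuracy))
print(f"Roughly {misclassified:,} misclassified passengers per day")
# With these inputs, that is about 13,000 travelers a day.
```

Even if only a fraction of those errors are false positives rather than misses, the absolute numbers at a major hub would be substantial.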

There are more challenges, too. Outside of a lab setting, the system would have to be designed to handle missing information, as it's unlikely that all aspects of the data (voice, behavior, and physiological readings) will be available or clear for every passenger, especially during periods of heavy traffic. The system will either need to function without this missing data or, at the very least, indicate that its evaluations are tentative in the absence of a full reading.
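One way such a design could work is sketched below. This is purely hypothetical (the DHS has not described its fusion method); the function names and the simple averaging scheme are assumptions chosen to illustrate the idea of scoring from whatever modalities are present and flagging the result as tentative when readings are missing:

```python
from typing import Optional

# Hypothetical sketch, not the DHS system: fuse per-modality threat
# scores, tolerating missing readings by averaging only the cues that
# are present and marking the evaluation tentative when any are absent.
def fuse_scores(face: Optional[float],
                voice: Optional[float],
                physiology: Optional[float]) -> tuple[float, bool]:
    available = [s for s in (face, voice, physiology) if s is not None]
    if not available:
        return 0.0, True          # no data at all: neutral and tentative
    score = sum(available) / len(available)
    tentative = len(available) < 3
    return score, tentative

# A passenger whose voice reading was lost in a noisy terminal:
score, tentative = fuse_scores(face=0.2, voice=None, physiology=0.4)
# score averages the two available cues; tentative is True
```

The tentative flag matters operationally: a downstream screener could weigh a partial reading differently from a full one rather than treating both as equally reliable.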

Despite these difficulties, Hostile Intent appears to have progressed far enough that I expect there will be test deployments to see how well it performs under real-world conditions. It seems equally likely that any such plans will trigger a lawsuit, given the civil-liberties implications of having the government surreptitiously track our emotional states. There is a certain Minority Report feel to any system that attempts to predict your criminal future.

If the program does ever make it into use, I would love to see what the population of false-positives looks like. The kinds of emotional states that might be considered indicative of a threat (angry, conflicted, nervous) might just as easily characterize anyone going through a divorce or to a job interview.

http://arstechnica.com/news.ars/post/20070812-homeland-security-tests-automated-hostile-intent-detector.html
Planned Parenthood is America’s most trusted provider of reproductive health care.