Future Attribute Screening Technology (FAST)[1] is a program created by the United States Department of Homeland Security (DHS). It was originally titled Project Hostile Intent. Its purpose is to detect "Mal Intent" by screening people for "psychological and physiological indicators",[2] carried out in a "Mobile Screening Laboratory".[3]
The program was run under the Homeland Security Advanced Research Projects Agency and the DHS Science & Technology Directorate's Human Factors Behavioral Sciences Division.[4] In a meeting held on July 24, 2008, DHS Under Secretary Jay Cohen stated that the goal is to create a new technology that would work in real time, rather than after a crime has already been committed.[5]
DHS science spokesman John Verrico stated in September 2008 that preliminary testing had demonstrated 78% accuracy in detecting mal-intent and 80% in detecting deception.[6] However, this was not a controlled, double-blind study, and researchers from Lancaster University and the Federation of American Scientists have questioned its validity in the absence of further evidence.[7]
The system measures pulse rate, skin temperature, breathing, facial expression, body movement, pupil dilation, and other "psychophysiological/behavioral patterns" with the aim of stopping "unknown terrorists". The technology would be used mostly at airports, borders, and special events.[8] Fox News reported that the mobile units transmit data to analysts, who use "a system to recognize, define and measure seven primary emotions and emotional cues that are reflected in contractions of facial muscles." The system is named MALINTENT, and results are transmitted back to screeners.[4][9]
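DHS has not published how MALINTENT combines these signals into a decision. The following Python sketch is a purely hypothetical illustration of the kind of multi-sensor fusion the public descriptions imply: every feature name, weight, and threshold in it is invented, not drawn from the program.

```python
# Purely hypothetical sketch of the kind of multi-sensor scoring the
# public descriptions of MALINTENT imply. DHS has not published the
# algorithm; every feature name, weight, and threshold below is invented.
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    pulse_rate_bpm: float            # remote cardiovascular sensor
    skin_temp_c: float               # thermal imaging
    breathing_rate_bpm: float        # respiration sensor
    pupil_dilation_mm: float         # eye tracker
    fidget_score: float              # body-movement analysis, 0..1
    facial_emotion_scores: dict = field(default_factory=dict)  # e.g. {"fear": 0.8}

def malintent_score(reading: SensorReading, baseline: SensorReading) -> float:
    """Weighted sum of normalized deviations from a per-subject baseline,
    plus the strongest hostile facial-emotion cue. Naive by design: it
    shows the structure of sensor fusion, not a validated model."""
    deviations = {
        "pulse":  (reading.pulse_rate_bpm - baseline.pulse_rate_bpm) / 40.0,
        "temp":   (reading.skin_temp_c - baseline.skin_temp_c) / 2.0,
        "breath": (reading.breathing_rate_bpm - baseline.breathing_rate_bpm) / 10.0,
        "pupil":  (reading.pupil_dilation_mm - baseline.pupil_dilation_mm) / 2.0,
        "fidget": reading.fidget_score - baseline.fidget_score,
    }
    weights = {"pulse": 0.25, "temp": 0.15, "breath": 0.2, "pupil": 0.2, "fidget": 0.2}
    arousal = sum(weights[k] * max(0.0, d) for k, d in deviations.items())
    # "Seven primary emotions" per the Fox News description; only cues
    # plausibly linked to hostility feed the score in this sketch.
    hostile_cue = max(
        (reading.facial_emotion_scores.get(e, 0.0) for e in ("fear", "anger", "contempt")),
        default=0.0,
    )
    return 0.7 * arousal + 0.3 * hostile_cue

FLAG_THRESHOLD = 0.5  # invented; a real system would calibrate this in trials

baseline = SensorReading(70.0, 34.5, 14.0, 3.5, 0.1)
reading = SensorReading(95.0, 35.2, 22.0, 5.0, 0.6, {"fear": 0.8})
print(malintent_score(reading, baseline) > FLAG_THRESHOLD)  # True -> refer to screener
```

Even in this toy form, the design choice is visible: the score rewards any arousal above baseline, which is exactly the property critics below argue makes ordinary travel anxiety indistinguishable from hostile intent.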
DHS produced a 'privacy impact assessment' in 2008 that described the sensors comprising the system.[10]
DHS plans to use cameras and sensors to measure and track changes in a person's body language, the tone of their voice, and the rhythm of their speech. Civil liberties groups raised privacy concerns about the project, but Burns of the DHS claimed that "the technology would erase data after each screening, and no personal information would be used to identify subjects, create files, or make lists", and reassured the public that regulations would be put in place to protect privacy if and when the technology is deployed.
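The claim attributed to Burns amounts to a retention policy: readings exist only for the duration of a single screening and are tied to no identity. A minimal sketch of such a policy follows; all names and structure here are hypothetical, as DHS has not published an implementation.

```python
# Minimal sketch of the retention policy Burns describes: readings are
# keyed to a random, non-identifying session ID and erased when the
# screening ends. All names and structure here are hypothetical.
import uuid
from contextlib import contextmanager

@contextmanager
def screening_session():
    """Yield a transient record that cannot outlive one screening."""
    record = {"session_id": uuid.uuid4().hex, "readings": []}
    try:
        yield record       # sensors append raw readings during the screening
    finally:
        record.clear()     # erase everything; nothing is written to disk

# Any flag/no-flag decision must be made inside the block, because no
# data persists afterwards and no personal identifiers are ever stored.
with screening_session() as session:
    session["readings"].append({"pulse_rate_bpm": 88.0})
    flagged = len(session["readings"]) > 0  # placeholder decision logic
```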
Other researchers, such as Tom Ormerod of the Investigative Expertise Unit at the UK's Lancaster University, argue that ordinary travel anxieties could cause false positives—Ormerod told Nature "even having an iris scan or fingerprint read at immigration is enough to raise the heart rate of most legitimate travellers".[7] Others noted that the basic premise may be flawed. Steven Aftergood, a senior research analyst at the Federation of American Scientists, stated "I believe that the premise of this approach—that there is an identifiable physiological signature uniquely associated with malicious intent—is mistaken. To my knowledge, it has not been demonstrated." The Nature article in which he was quoted went on to note that Aftergood is concerned that the technology "will produce a large proportion of false positives, frequently tagging innocent people as potential terrorists and making the system unworkable in a busy airport."[7]
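Aftergood's false-positive concern can be made concrete with a base-rate calculation. Taking the reported 78% detection rate as the system's sensitivity, and assuming (since DHS published no false-positive rate) a matching 80% specificity and one traveller with mal-intent per million screened:

```python
# Back-of-envelope Bayes calculation illustrating Aftergood's concern.
# The 78% sensitivity is the reported preliminary figure; the 80%
# specificity and the base rate are assumptions made for illustration.
sensitivity = 0.78   # P(flagged | mal-intent), reported
specificity = 0.80   # P(not flagged | innocent), assumed
base_rate   = 1e-6   # P(mal-intent) per traveller, assumed

p_flag = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)
p_malintent_given_flag = sensitivity * base_rate / p_flag

print(f"P(flagged)              = {p_flag:.4f}")                 # ~0.2000
print(f"P(mal-intent | flagged) = {p_malintent_given_flag:.1e}") # ~3.9e-06
# Under these assumptions roughly one traveller in five is flagged, and
# virtually every flag is a false positive -- the "unworkable in a busy
# airport" outcome Aftergood describes.
```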
Because the system is, in effect, designed to 'read people's thoughts', it potentially conflicts with privacy protections such as the Fourth and Fifth Amendments to the United States Constitution. A summary of the scientific and legal issues with the program was presented at DEF CON in 2011 by independent security researchers.[14]