17.4. Will Lie-Detecting Software Make Us More Trustworthy?
For several years now you have been able to purchase lie-detecting software for your personal computer. Not only you, but businesses and government agencies. An Israeli firm, Trustech, introduced PC-based lie-detecting software several years ago, and now there is a thriving little industry pursuing this business. Various versions of the software are designed for use by insurance companies (is a claimant falsifying his claim?), credit card companies (are you telling the truth when you say your credit card was stolen, accounting for the astronomical charges to a 900 number?), call centers (is a customer or a call center agent under great stress? better notify a supervisor), employers (to screen job applicants), law enforcement agencies (which have long employed lie detectors), and you and me when we're worried that a spouse or girlfriend or co-worker is doing us dirty.
An early Trustech product was called "Truster, a personal truth verifier." It adapted to your phone, allowing you to monitor all callers for their veracity (doubtless helping you build more trusting relationships) or, rather, for a statistical indication of veracity. One advertised use was for agents at traveler checkpoints and airports:
A microphone worn on the officer's shirt would pick up the traveler's voice for analysis on a tiny computer attached to the officer's belt, with results being relayed to the officer by a discreet earphone.
Trustech's CEO, Tamir Segal, running into the expected controversy over the use of such software, gave the usual, bulletproof response:
This is the computer. This is the society that we've decided to live with. The technology is here. It's up to everyone to decide how to use it. I use it as a decision-support tool, not as a decision tool.
Yes, the point is a vital one: "It's up to us to decide how to use it." But this includes, to begin with, the responsibility to decide whether to use it at all. For the truth is that the minute you and I pick up Segal's invention with the intent to use it, we have already made a crucial choice. After all, merely to decide to monitor your conversational partners in this way is already to enter into an altogether different relationship with them. And that underlying difference in quality is likely to transform society far more than any particular decisions you make about "good" and "bad" uses in Segal's sense.
Trustech's software analyzes supposed stress levels and other psychological indices in the human voice, focusing on measurable features such as "microtremors." But the notion that you can gain a basis for trust by such methods is hardly persuasive. The more significant and likely outcome is that, as we improve our analyses of the external and physical aspects of speech, and as we rely more fully on these externalities when making judgments, the less practiced we will become at hearing and understanding the speaking self behind the sound waves. And the only enduring basis for trust lies in this inner, intimate, delicate wedding of hearing and response: the meeting of persons. By shifting the search for trust onto technical ground, we all too easily subvert the deeply social and humane consciousness upon which all trust finally depends. Just as you cannot read the meaning of a text while attending to its alphabetic characters, so, too, you cannot understand what is being said and who is saying it, certainly not in any deep way, while focusing on the physical characteristics of the voice.
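To make concrete what "focusing on the physical characteristics of the voice" means in practice, here is a minimal sketch of the general kind of measurement such software performs. The specific method (tracking energy in a low-frequency band of the loudness envelope) and the 8 to 12 Hz band are illustrative assumptions on my part, not Trustech's actual, proprietary algorithm; the point is only that the analysis operates entirely on sound waves, never on meaning.

```python
import numpy as np

def microtremor_ratio(signal, sr, band=(8.0, 12.0)):
    """Fraction of the loudness-envelope's spectral energy falling in a
    low-frequency 'micro-tremor' band (8-12 Hz here, an illustrative
    choice, not a vendor's published specification)."""
    # Amplitude envelope: rectify, then smooth with a 50 ms moving average.
    win = max(1, int(0.05 * sr))
    envelope = np.convolve(np.abs(signal), np.ones(win) / win, mode="same")
    envelope = envelope - envelope.mean()          # drop the DC component
    spectrum = np.abs(np.fft.rfft(envelope)) ** 2  # envelope power spectrum
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / sr)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum[1:].sum()                     # skip the 0 Hz bin
    return float(spectrum[in_band].sum() / total) if total > 0 else 0.0

# Synthetic demo: a 150 Hz "voice" tone whose loudness wobbles at 10 Hz,
# compared against the same tone held steady.
sr = 8000
t = np.arange(0, 2.0, 1.0 / sr)
steady = np.sin(2 * np.pi * 150 * t)
tremulous = (1.0 + 0.3 * np.sin(2 * np.pi * 10 * t)) * steady
print(microtremor_ratio(steady, sr) < microtremor_ratio(tremulous, sr))
```

Notice what the sketch does and does not do: it distinguishes a wobbling tone from a steady one, and nothing more. The leap from "more energy in this band" to "confusion" or "falsehood" is precisely the correlation the essay calls presumptuous.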
There's also this to consider: Truster can be used not only as a putative lie detector, but also as a reliable biofeedback device. Employing it, we can train ourselves to project and manipulate the physical sound features that Truster presumptuously correlates with such things as "confusion," "excitement," "exaggeration," "sarcasm," and "falsehood." Before the advent of the new software, the general public had no convenient access to such training tools. So, to the extent voice-analysis software enters into the normal give-and-take of society, becoming a factor all mischief-makers must reckon with, we can look forward to an endless technological arms race between those who would detect technical features of the voice and those who would camouflage them. There can be no end or final resolution of such an arms race.