7.4 Open Policy Issues

When the first edition of this book was published in 1996, many people believed that a working public key infrastructure was a prerequisite for commerce on the World Wide Web. We disagreed. At that time, there was already substantial commerce occurring on the Internet based on old-style, easily forged credit cards, rather than high-tech digital signatures. We argued that the additional security offered by digital signatures might not be necessary if there was money to be made.

Today, the need for a widespread PKI is even more compelling, yet it seems more remote than ever. There are growing incidents of fraud on the Internet, and there is an increasing need to use digital signatures to do business. Yet despite the passage of digital signature legislation in the United States that makes a digital signature as legally binding as a written signature, widespread PKI seems further away today than it was in 1996.

It is not clear that the current vision of a public key infrastructure can even be built. Today's vision calls for a system with multiple CAs and with thousands or millions of different users, each obtaining, invalidating, and discarding certificates and public keys as needed. For the past 30 years, this type of technology has really not been tested outside the lab except in very controlled environments.[14]

[14] Although smart cards have been used widely in Europe and are beginning to be used in the United States, these cards contain anonymous certificates that are not bound to individual identities and do not need to be invalidated if the cards are lost.

In the following sections, we'll look at a few of the problems that must be faced in building a true PKI.

7.4.1 Private Keys Are Not People

Digital signatures facilitate proofs of identity, but they are not proofs of identity by themselves. All they prove is that a person (or a program) signing the digital signature has access to a particular private key that happens to match a particular public key that happens to be signed by a particular CA. Unless the private key is randomly generated and stored in such a way that it can only be used by one individual, the entire process may be suspect.

Unfortunately, both key generation and storage depend on the security of the end user's computer. But the majority of the computers used to run Netscape Navigator or Internet Explorer are insecure. Many of these computers run software that is downloaded from the Internet without knowledge of its source. Some of these computers are infected by viruses. Some of the programs downloaded have Trojan horses pre-installed. And the most common operating system and browser are terribly buggy, with hundreds of security patches issued over the past few years, so it is possible that any arbitrary system in use on the network has been compromised in the recent past by parties unknown.

The companies issuing digital certificates don't have a solution to this problem yet. The closest that VeriSign comes to addressing the issue is a phrase in its certification practices statement that says:

[E]ach certificate applicant shall securely generate his, her, or its own private key, using a trustworthy system, and take necessary precautions to prevent its compromise, loss, disclosure, modification, or unauthorized use.

But this is system engineering by license agreement! It simply doesn't solve the underlying computer security problems inherent in today's computer systems. Computers aren't trustworthy, because they can't prevent the intentional modification of programs by other programs or intruders. A computer virus or other rogue program could search its victim's computer for a copy of Netscape Navigator and modify the random number generator so that it always returned one of a million possible values. Public keys would still appear uncrackable, but anybody who knew about the virus would be able to forge your digital signature in no time.
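To see why a sabotaged random number generator is so devastating, consider the following sketch. It is a toy model, not real key generation: the hash-derived "keys" stand in for RSA or DSA math, and the million-seed limit is hypothetical. The point it illustrates is that an attacker who knows how the generator was weakened can simply enumerate every key the victim's software could possibly have produced.

```python
import hashlib

SEED_SPACE = 1_000_000   # hypothetical: the tampered generator yields only a million seeds

def keypair_from_seed(seed):
    """Toy stand-in for key generation: both halves of the key pair are a pure
    function of the seed. Real RSA or DSA math is deliberately omitted."""
    priv = hashlib.sha256(b"priv" + seed.to_bytes(8, "big")).hexdigest()
    pub = hashlib.sha256(b"pub" + priv.encode()).hexdigest()
    return priv, pub

# The victim's sabotaged browser picks one of only a million possible seeds.
victim_priv, victim_pub = keypair_from_seed(123_456)

def recover_private_key(target_pub):
    """The attacker, knowing how the generator was weakened, tries every seed."""
    for seed in range(SEED_SPACE):
        priv, pub = keypair_from_seed(seed)
        if pub == target_pub:
            return priv          # the "uncrackable" key falls out in seconds
    return None

assert recover_private_key(victim_pub) == victim_priv
```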

Today's PCs are no better at storing private keys once they have been generated. Although both Netscape Navigator and Internet Explorer can store keys encrypted, the keys have to be decrypted before they can be used. All an attacker has to do is write a program that manages to get itself run on the user's computer,[15] wait for the key to be decrypted, and then send the key out over the network.

[15] For example, by using Netscape's plug-in or Microsoft's ActiveX technology.

VeriSign knows this is a problem. "We do not, and cannot, control or monitor the end users' computer systems," says VeriSign's president Stratton Sclavos. "In the absence of implementing high-end PC cards for all subscribers, or controlling or participating in key generation, the storage of end user keys is fully within the control of end users."

Unfortunately, this means that users, and not VeriSign, are ultimately responsible for the fraudulent uses of keys, which leaves one wondering about the ultimate worth of VeriSign's per-key liability policies.

The advent of new technology may solve this problem. The widespread use of smart cards and smart card readers, for example, may make it much more difficult to steal somebody's private key. But it won't be impossible to do so.

7.4.2 Distinguished Names Are Not People

Protecting private keys is not enough to establish the trustworthiness of the public key infrastructure. Merely possessing a private key and an X.509 v3 certificate for the matching public key signed by a CA doesn't prove that you are the person whose name appears in the Distinguished Name field of the certificate. All it proves is that somebody managed to get the CA to sign the corresponding public key.

Ideally, a distinguished name means what a CA says it means. Ideally, a CA has established a regimen of practices and assurances, and that CA is consistent in the application of its own policies. But how do you determine if the name in the Distinguished Name field is really correct? How do you evaluate the trustworthiness of a CA? Should private companies be CAs, or should that task be reserved for nations? Would a CA ever break its rules and issue fraudulent digital identification documents? After all, governments, including the United States, have been known to issue fraudulent passports when their interests have demanded that they do so.

How do you compare one CA with another CA? Some CAs voluntarily subscribe to audit methodologies such as SAS 70 or WebTrust for CAs; others do not. The American Bar Association Information Security Committee has published a book, PKI Assessment Guidelines, but few users have the skill or the access needed to assess the CAs that they might employ. Each CA promises that it will follow its own certification rules when it signs its digital signature. How do you know that a CA's rules will assure that a distinguished name on the certificate really belongs to the person they think it does?

If a CA offers several different products, then how do you tell them apart? A CA might offer several different signature products, some with rules like "We sign whatever key we see,"[16] and others with more involved certification regimens. How can you recognize which is which in an automated way? Once you've taken the time to understand the CA's rules, how do you know that the CA has really followed them? The case of VeriSign issuing certificates with Microsoft's name on them is an example of the fact that accidents can happen.

[16] This is currently the case for VeriSign's Class 1 digital IDs.

In theory, many of these questions can be resolved through the creation of standards, audits, and formal systems of accreditation. Legislation can also be used to create standards. But in practice, efforts to date are not encouraging.

7.4.3 There Are Too Many Robert Smiths

For now, let's ignore the inherent difficulties in running a certification authority, such as the difficulty of guarding keys and fighting fraud, and let's say that CAs are upstanding and honest corporate citizens and that they never make mistakes.

There are still some inherent problems with certificates themselves. If you get a certificate from a CA with the distinguished name "Simson L. Garfinkel", then there's an excellent chance that certificate belongs to him. That's because there is only one Simson L. Garfinkel in the United States and probably only one in the world as well.[17]

[17] This is either a tragedy or a blessing. Many things, including really knowing who someone is, are relative to the observer, but that's the point of this section.

At least, we think that there is only one Simson L. Garfinkel. We've certainly never met another one. And Simson has searched the nation's credit data banks and checked with Internet search services, and so far it seems there is only one Simson L. Garfinkel in evidence. So it's probably Simson's certificate you've got there.

But what do you do with a certificate that says "Robert Smith" on it? How do you tell which Robert Smith it belongs to? The answer is that a certificate must contain more information than simply a person's name: it must contain enough information to uniquely and legally identify an individual. Unfortunately, you (somebody trying to use Robert Smith's certificate) might not know this additional information, so there are still too many Robert Smiths for you.

Are There Better Alternatives to Public Key Digital Signatures?

Should a technology that requires the use of private keys be used in cases where there is a high incentive to commit fraud and a history of fraud, as well as illegal activities by the intended keyholder?

While there is wide agreement that some form of digital timestamping or digital notary service is necessary, it is not clear that this is an ideal application for public key technology. The reason is that the signed timestamp is completely under the control of the service that is signing the timestamp. If the service's private key were compromised, either accidentally or intentionally, the service could issue fraudulent timestamps with different dates.

Bogus signatures and certificates might be issued because of a bribe, by a particular clerk acting on a grudge, or for political purposes.

Other technologies for timestamping exist that do not require the use of private keys. These technologies have the advantage that there is no way to compromise the system because there is no secret to be divulged. One such system is the digital notary marketed by Surety Technologies, Inc. Instead of treating each signature process as a distinct operation, the Surety system builds a hash-tree based, in part, on the contents of every document that is presented for digital timestamping. The root of the tree can be published once a week in The New York Times so that anyone may verify any signature. Tampering with Surety signatures is extremely difficult: the only way to do it is either to find a document with the same message digest (Surety uses a combination of MD5 and SHA-1, which are described in Chapter 3), or to change the root of the tree after it has been published. For more information about Surety and its digital notary system, consult its web site at http://www.surety.com.
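The following sketch illustrates the hash-tree idea behind such a service. It is not Surety's actual implementation: SHA-256 stands in for the MD5/SHA-1 combination described above, and the function names and sample documents are invented for illustration. The key property is that a single published root value commits the service to every document submitted, and anyone holding the short "audit path" for a document can recheck it against that root.

```python
import hashlib

def h(data):
    # Surety is described as combining MD5 and SHA-1; SHA-256 stands in here.
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Fold the digests of every submitted document into a single root hash.
    Publishing that root (say, in a newspaper) fixes every leaf in time."""
    level = [h(doc) for doc in leaves]
    while len(level) > 1:
        if len(level) % 2:                        # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def audit_path(leaves, index):
    """Sibling hashes a submitter keeps so anyone can later recheck one document."""
    level, path = [h(doc) for doc in leaves], []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        path.append((level[sibling], sibling < index))   # (hash, sibling is on the left?)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(document, path, published_root):
    node = h(document)
    for sibling, sibling_is_left in path:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == published_root

docs = [b"contract with Bob", b"lab notebook page 17", b"will dated 2001-06-01"]
root = merkle_root(docs)                           # the value that gets published
assert verify(docs[1], audit_path(docs, 1), root)
assert not verify(b"a forged will", audit_path(docs, 1), root)
```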

For large communities, identifying a person by name alone is of little value; there are simply too many chances for a name collision.

7.4.4 Today's Digital Certificates Don't Tell Enough

Another problem with the digital certificates currently being distributed on the Internet is that they don't have enough information in them to be truly useful. Sites that distribute pornography might want to use digital IDs to see if their customers are over 21, but they can't because, unlike driver's licenses, the digital certificates being issued by companies like VeriSign, Thawte, and GTE don't specify age. Sites that would like to have "women-only space" on the Net can't, because VeriSign's digital IDs don't specify gender. The certificates don't even carry your photograph or fingerprint, which makes it almost impossible to do business with somebody over the Internet and then have that person show up at your office and prove that he or she is the same person.

Of course, if these digital certificates did have fields for a person's age, gender, or photograph, users on the Internet would say that these IDs violated their privacy if they disclosed that information without the user's consent. And they would be right. That's the whole point of an identification card: to remove privacy and anonymity, producing identity and accountability as a result.

Clearly, there is nothing fundamentally wrong with CAs disclosing information about subscribers, as long as they do so with the subscriber's consent. However, if all certificates disclose personal information, this choice may be illusory: it may be a choice between disclosing information and not using the system.

7.4.5 X.509 v3 Does Not Allow Selective Disclosure

When a student from Stanford University flashes her state-issued California driver's license to gain entrance to a bar on Townsend Street, she is forced to show her true name, her address, and even her Social Security number to the person who is standing guard. The student trusts that the guard will not copy down or memorize any information that is not relevant to the task at hand: verifying that she is over 21.

As we discussed in Section 7.2.3.1, Stefan Brands has developed a system of certificates that allow selective disclosure. Although the cryptography is promising, the certificates are not compatible with X.509, and the system is not currently being deployed.

Today, the only workable way to allow selective disclosure of personal information using X.509 digital certificates is to use multiple certificates, each one with a different digitally signed piece of personal information. If you want to prove that you are a woman, you provide the organization with your "XX" digital certificate. If you want to prove that you're over 21, you provide an organization with your "XXX" digital certificate. These certificates wouldn't even need to have your legal name on them. The certification authority would probably keep your name on file, however, should some problem arise with the certificate's use.

The IETF's Simple Public Key Infrastructure (SPKI) project is experimenting with small digital certificates that carry a single assertion. For more information on SPKI, see http://world.std.com/~cme/html/spki.html.
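A rough sketch of what such a single-assertion credential might look like appears below. It is not the SPKI wire format or any CA's actual product: the JSON encoding, the Ed25519 key type, and the "age-over-21" attribute are illustrative choices, and the example relies on the third-party Python cryptography package. The idea it shows is that the CA signs one attribute bound to the holder's public key, with no legal name anywhere in the credential.

```python
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

ca_key = Ed25519PrivateKey.generate()        # the certification authority's key
holder_key = Ed25519PrivateKey.generate()    # the (otherwise anonymous) holder's key

def raw_public_bytes(key):
    return key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)

# The CA signs a single attribute bound to the holder's public key -- no name at all.
assertion = {
    "holder_key": raw_public_bytes(holder_key).hex(),
    "attribute": "age-over-21",
    "value": True,
    "expires": "2002-12-31",
}
signed_blob = json.dumps(assertion, sort_keys=True).encode()
credential = {"assertion": assertion, "ca_signature": ca_key.sign(signed_blob).hex()}

def relying_party_check(credential, ca_public_key):
    """A web site checks only that the CA vouched for this single attribute."""
    blob = json.dumps(credential["assertion"], sort_keys=True).encode()
    try:
        ca_public_key.verify(bytes.fromhex(credential["ca_signature"]), blob)
    except InvalidSignature:
        return None
    return credential["assertion"]["attribute"], credential["assertion"]["value"]

print(relying_party_check(credential, ca_key.public_key()))   # ('age-over-21', True)
```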

7.4.6 Digital Certificates Allow for Easy Data Aggregation

Over the past two decades, universal identifiers such as the U.S. Social Security number have become tools for systematically violating people's privacy. Universal identifiers can be used to aggregate information from many different sources to create comprehensive data profiles of individuals.

Digital certificates issued from a central location have the potential to become a far better tool for aggregating information than the Social Security number ever was. That's because digital signatures overcome the biggest problem faced by people who match records using Social Security numbers: poor data. People sometimes lie about their Social Security numbers; other times, these numbers are mistyped.

Today, when two businesses attempt to match individually identified records, the process is often difficult because the numbers don't match. By design, digital certificates will simplify this process by providing for verified electronic entry of the numbers. As a result, the practice of building large data banks of personal information aggregated from multiple sources is likely to increase.

7.4.7 How Many CAs Does Society Need?

Would the world be a better place if there were only one CA, and everybody trusted it? How about if there were two? What about two thousand? Is it better to have many CAs or a few? If you have only one or two, then everybody sort of knows the rules, but it puts that CA in a tremendous position of power. If the world has just one CA, then that CA can deny your existence in cyberspace by simply withholding its signature from your public key.

Do we really need CAs for certifying identity in all cases? Carl Ellison doesn't think so. In his paper on generalized certificates, Ellison writes:

When I communicate with my lover, I don't need to know what city she's in and I certainly don't need to know that she's prosecutable. We aren't signing a contract. All I need is assurance that it's her key, and for that I use my own signature on her key. No CA ever needs to know she has a key, even though this is clearly an identity certificate case.

7.4.8 How Do You Loan a Key?

Here's another question asked, but not answered, by Carl Ellison: How do you handle people loaning out their private keys?

Suppose Carl is sick in the hospital and he wants you to go into his office and bring back his mail. To do this, he needs to give you his private key. Should he be able to do that? Should he revoke his key after you bring it back?

Suppose he's having a problem with a piece of software. It crashes when he uses private key A, but not when he uses private key B. Should he be legally allowed to give a copy of private key A to the software developer so she can figure out what's wrong with the program? Or is he jeopardizing the integrity of the entire public key infrastructure by doing this?

Suppose a private key isn't associated with a person, but is instead associated with a role that person plays within a company. Say it's the private key that's used for signing purchase orders. Is it okay for two people to have that private key? Or should the company create two private keys, one for each person who needs to sign purchase orders?

7.4.9 Why Do These Questions Matter?

People who are talking today about using a public key infrastructure seem to want a system that grants mathematical certainty to the establishment of identity. They want to be able to sign digital contracts and pass cryptographic tokens and know for sure that the person at the other end of the wire is who that person says he is. And they want to be able to seek legal recourse in the event that they are cheated.

The people who are actually setting up these systems seem to be a little wiser. They don't want a system that is perfect, simply one that is better than today's paper-based identification systems.

Unfortunately, it's not clear whether public key technology even gives that kind of assurance about identity. It's an unproven matter of faith among computer security specialists that private keys and digital certificates can be used to establish identity. But these same specialists will pick up the phone and call one another when the digital signature at the bottom of an email message doesn't verify. That's because it is very, very easy for the technology to screw up.

Probably the biggest single problem with digital signatures is the fact that they are so brittle. Change one bit in a document and the digital signature at the bottom becomes invalid. Computer security specialists make this out to be an impressive feature of the technology, but the fact is that paper, for all of its problems, is a superior medium for detecting alteration. That's because paper doesn't simply reveal that a change has been made to a document: it reveals where the change was made as well. And while the digital technologies will detect a single period changed to a comma, they will also frequently detect changes that simply don't matter (e.g., a space being changed to two spaces), which teaches people to expect that signatures won't verify. Meanwhile, although it is possible to create better and better copies of documents, advances in watermarking, holography, and microprinting are allowing us to create new kinds of paper that cannot be readily copied or changed without leaving a detectable trace.
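The brittleness is easy to demonstrate. In the sketch below (the contract text is invented), a single extra space changes the document's digest completely, so any signature computed over that digest fails to verify, yet nothing in the failed verification says where the change was made or how trivial it was.

```python
import hashlib

original = b"Payment is due within 30 days. A 1.5% late fee applies thereafter."
altered  = b"Payment is due within 30 days.  A 1.5% late fee applies thereafter."  # one extra space

# The digests (and so any signatures computed over them) differ completely, but the
# mismatch says nothing about where the change is or whether it matters.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())
```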

Society does need the ability to create unforgeable electronic documents and records. But while illegal aliens, underage high school students, and escaped convicts may be interested in creating forged credentials, few other people are. Society will have to discover ways of working with the problems inherent in these new digital identification technologies so that they can be used in a fair and equitable manner.

7.4.10 Brad Biddle on Digital Signatures and E-SIGN

This section was contributed by attorney Brad Biddle. In the first edition of this book, we reprinted Mr. Biddle's "Ten Questions on Digital Signatures." Since the time of that book's publication, many states and the federal government have adopted legislation giving legal status to digital signatures. In response to that legislation, Mr. Biddle has written this "short history of digital signature and electronic signature legislation," which we include, with his permission.

Beginning in 1995 there was a flurry of legislative attention related to digital signatures. The state of Utah enacted its Digital Signature Act, which was based on work done by the Information Security Committee of the American Bar Association's Section of Science and Technology. The Utah legislation, which became a model for other legislative bodies, envisioned a public key infrastructure supported by state-licensed certification authorities.

Through 1996 and 1997, the Utah model increasingly came under fire. Critics levied a number of arguments against the Utah approach:

  • Its premise was based in part on an assumption that certification authorities faced an uncertain and potentially huge liability risk, and therefore wouldn't enter the marketplace. Critics argued that this assumption was wrong, and that any "excessive" CA liability risk was due to flawed CA business models, not a flawed legal environment.

  • It provided one particular technology (PKI) and business model (the authentication model championed at the time primarily by VeriSign) with legal advantages over other authentication technologies and business models; this would potentially stifle innovation.

  • It enshrined a speculative vision for how authentication would work in the commercial marketplace, rather than building on and then clarifying a market-developed model.

  • It didn't solve the immediate, pressing problem facing online businesses: could contracts be formed electronically (e.g., via "clickthrough" agreements or via email)?

  • It contained draconian liability rules for consumers. This became known as the "Grandma loses her house" problem: under the Utah Act, if a consumer lost control of her private key, she could face unlimited liability for the resulting damages; that is, Grandma could fail to secure her computer from a malicious hacker and end up losing everything she owned.

  • Under this approach, government-licensed CAs would collect vast amounts of sensitive transactional data, but the associated privacy issues weren't addressed in the legislation (leaving the question to privacy-unfriendly default rules).

States began considering alternative approaches to digital and "electronic" (non-public key) signatures. Massachusetts emerged at the opposite end of the spectrum from Utah, with a spare, minimalist approach designed to remove barriers to e-commerce posed by existing law (e.g., unnecessary "writing" requirements) but otherwise letting the marketplace evolve unfettered.

Other states tried to occupy ground between the Utah and Massachusetts extremes. California, for example, enacted a law that permitted electronic signatures in transactions involving state government, if certain security criteria were met. The California Secretary of State was tasked to determine which technologies met these criteria, and enacted regulations that permitted use of public key digital signatures and "signature dynamics," a biometric technology pushed by a company called PenOp.

7.4.10.1 E-SIGN and UETA

In 1998, in response to the wide variety of state laws, the National Conference of Commissioners on Uniform State Laws (NCCUSL) commissioned the drafting of the Uniform Electronic Transactions Act (UETA). After significant debate, the drafters adopted a "technology-neutral" approach; that is, the law does not endorse PKI in any way, and largely follows the Massachusetts model. NCCUSL adopted UETA in July 1999. A number of different states subsequently enacted UETA, although some enacted it with significant variances from the "official" NCCUSL version.

Also in response to chaos at the state level, in 2000 Congress enacted E-SIGN, the Electronic Signatures in Global and National Commerce Act. E-SIGN is substantively very similar to UETA, although it added some consumer protection elements not found in UETA. Importantly, E-SIGN preempted (superseded) all state laws except state laws that conform to the official text of UETA. So, Utah-style digital signature laws, which had been enacted in several U.S. states, are now dead and completely replaced by E-SIGN (or a state enactment of UETA, if applicable). In the U.S., the law that applies to all e-signatures, including PKI digital signatures, is either E-SIGN (in states that have not enacted UETA), or UETA (in states that have enacted conforming versions of UETA).

The basic rules of both E-SIGN and UETA are quite simple: if a law requires a signature, an "electronic signature" will suffice, with "electronic signature" defined very broadly to include things like a plaintext typed name in an email or an electronic click on an "I agree" button (importantly, an electronic signature must be applied by a user with the intent to be bound to a contract). Similarly, if a law requires a "writing," an "electronic record" (a digital copy that meets certain very minimal security criteria) will do. This approach solves the "signed writing" problem discussed in more detail later, but avoids the pitfalls associated with the Utah model.

The debate that has occurred in the U.S. has been echoed at the international level. The Utah approach initially found favor in many countries. More recently, however, it appears that the tide may have turned. After much debate, the European Union enacted an "Electronic Signature Directive" that requires E.U. member states to enact legislation that is substantively similar to the E-SIGN/UETA approach (although the directive also contains some special rules applicable to certification authorities). Some Asian countries have enacted Utah-style laws, but because this has proven challenging for global businesses that want to engage in e-contracting in those regions, these countries have recently begun considering alternatives. Latin America, the Middle East, and Africa have largely been silent on the issue of electronic signatures.

7.4.10.2 Electronic contracting: it's more than just "signatures"!

Because the issues associated with digital and electronic signatures have received so much attention from the legal community, it is easy to miss the fact that signatures are only a small and often irrelevant element of electronic contracting.

It may be helpful to first make one point perfectly clear: under U.S. law (and under the law of most countries worldwide, although we won't attempt a detailed international analysis here), it is absolutely possible to form a contract electronically. E-SIGN and UETA have helped cement this conclusion, but really this wasn't a hard question even before these enactments.

Electronic contracting is, fundamentally, contracting. Contract law fundamentals apply. Any contract, electronic or not, requires (a) an "offer," (b) "acceptance," and (c) "consideration," some promised exchange of value. A contract, electronic or not, will not be enforced if a successful defense can be raised: for example, if an element of the contract is "unconscionable" (violates public policy), if one of the contracting parties was too young to create a contract, and so on.

Two recent cases where courts declined to enforce electronic contracts provide interesting demonstrations of these principles. In one case, a judge refused to enforce a license agreement that was presented as a link on a page where users could download Netscape's SmartDownload software. Users were not forced or even asked to read the terms prior to downloading the software, nor was any sort of "I agree" button presented to users. The judge, focusing on what he called "the timeless issue of assent," found that there was no contract, not because of the electronic nature of the circumstances, but because there simply was no "acceptance" of the license terms by users.

In another case, a California judge refused to enforce a "forum selection" clause in AOL's electronic user agreement, because doing so would deprive a California litigant of consumer protection rules available under California law that would not be available under Virginia law. To the court, the question wasn't whether the contract was valid due to its electronic nature, but rather whether the forum selection clause violated public policy.

For the record, many U.S. courts have enforced electronic contracts. All this being said, there are two areas where electronic contracting raises some unique issues: (1) "signed writing" requirements, and (2) proof; that is, proving contract formation, proving what the substantive terms of a contract are, and proving party identity.

7.4.10.3 "Signed writing" requirements

"Signed writing" requirements have caused a great deal of confusion in connection with electronic contracting probably unnecessarily.

Most contracts require neither a "signature" nor a "writing" to be valid. There are a small number of exceptions to this general rule, usually based on a policy of requiring more proof in connection with contracts where there is a higher degree of fraud risk or of high-stakes misunderstanding. Some examples of contracts that require a "signed writing" are contracts for:

  • Sales of goods priced over $500

  • Transfers of land

  • Obligations that cannot be performed in less than one year

  • Assignments (but not licenses) of some intellectual property

Courts have tended to construe the signed writing requirement very broadly, allowing, for example, fax headers or preprinted letterhead to serve as a "signature." It is likely that the courts would have treated email headers or plaintext email signatures in a similar manner. E-SIGN and UETA have made this question moot, however: as described earlier, under E-SIGN and UETA "signature" and "writing" requirements are very easily met electronically.

The bottom line is that despite the conventional wisdom to the contrary, when doing electronic contracting under U.S. law, meeting legal "signature" or "writing" requirements is not a significant issue, particularly in light of E-SIGN and UETA. (One caveat: E-SIGN has some special rules about "written notice" requirements in connection with certain legally-required consumer disclosures, applicable, for example, to the insurance and banking industries.)

7.4.10.4 Proof

"Proof" issues associated with electronic contracting present some challenging questions. Imagine the following scenario:

Alice sends Bob a plaintext email that says "Bob, would you like to buy my car for $5000? Your friend, Alice." Bob replies with a plaintext email: "Yes. Regards, Bob."

Alice and Bob have formed a contract. There is an offer (Alice's email), acceptance (Bob's email), and consideration (the promised exchange of car and money). This is a sale of goods valued over $500, so a signed writing is required. Per E-SIGN and UETA, Alice's plaintext "Alice" and Bob's plaintext "Bob" (or even their email headers) will meet the signature requirement, and the email will serve as a writing. Let's assume that there are no applicable defenses (both of the parties were capable of contracting, etc.). The contract law analysis is easy: there is a valid, enforceable contract.

But what if Bob claimed that he never sent the message at all? Once Bob denies that he sent the message, Alice will have the burden of proving to a court that, in fact, it was Bob who contracted with her and what the substance of their agreement was. In this scenario, such a burden would be difficult, but not necessarily impossible, to meet. Alice could, for example, subpoena server logs and determine that the message in fact came from Bob's computer; she could get testimony from, say, Bob's coworkers that he was sitting at his desk at the time that the message was sent. As a practical matter, under this scenario Alice would probably not be inclined to go to such lengths to enforce the agreement.

If Alice thought the risk of being unable to enforce the contract was too high, she could demand a more robust form of authentication from Bob. For example, she could require that Bob sign his email with a digital signature created with a key pair certified by a commercial CA. Note that use of a digital signature would not change the contract law analysis: digital signature or not, Alice and Bob have a contract. But use of a digital signature may make Alice's job of proving the contract easier, and make Bob's denial less credible.
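As a sketch of what that might look like in practice, the fragment below has Bob sign his one-line acceptance and has Alice verify it against Bob's public key. The key pair and message are illustrative, the example uses the third-party Python cryptography package, and in a real transaction Bob's public key would be bound to him by a CA-issued certificate rather than exchanged directly.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

bob_key = Ed25519PrivateKey.generate()     # generated and held only by Bob
acceptance = b"Yes. Regards, Bob."

signature = bob_key.sign(acceptance)       # Bob attaches this to his reply

# Later, Alice (or a court) verifies the reply against Bob's certified public key.
try:
    bob_key.public_key().verify(signature, acceptance)
    print("Signature verifies: Bob's denial that he sent the message is less credible.")
except InvalidSignature:
    print("Signature does not verify: the message or the signature was altered.")
```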

Anyone engaging in electronic contracting will need to make a careful risk/benefit determination around questions of proof. A party to an electronic contract may have to go before a judge and show (a) that there was, in fact, a contract formed (that is, there was an offer, an acceptance, and consideration); (b) what the substance of the contract was; and (c) who the contracting parties are. Some methods of electronic contracting, such as the use of CA-authenticated digital signatures, may make proving these points relatively easy. In some cases, however, the cost and hassle of employing robust authentication techniques simply won't be worthwhile. For example, plenty of online businesses rely on "clickthrough" contracts where users self-report their identity. These businesses can still make good proof arguments: by keeping careful records, they can show how they formed contracts with users and show the substance of the contracts, and they will have some evidence of user identity. But these businesses presumably have made a decision to forgo the proof benefits accorded by more robust authentication techniques after weighing these benefits, the associated costs, and the risks and consequences associated with potential unenforceability of their electronic contracts.

The bottom line is that it is easy to form a contract electronically, and electronic contracts may be formed in a variety of ways. But it may be difficult to prove an electronic contract in the event of a dispute. It is important to take this fact into account when engaging in electronic contracting, and to scale authentication techniques in accordance with the risk of unenforceability and the consequences if the contract were to be unenforceable.
