
A Project On CRYPTOGRAPHY At STAR HOLIDAYS

Submitted in partial fulfillment of the requirement for the award of the degree of
MASTER OF BUSINESS ADMINISTRATION

Prepared by:

PINKY VISHNANI (G-V002)
and
SALVI GUPTA (G-G034)

Submitted to MR. SNEHAL MISTRY
METAS OF SEVENTH-DAY ADVENTIST COLLEGE
ATHWALINES, SURAT
2008-2009


ACKNOWLEDGEMENT

It is rightly said that the success of one is not the reward of a single hand but of everyone. I would like to heartily thank my college, METAS ADVENTIST COLLEGE, for designing the MASTER OF BUSINESS ADMINISTRATION program in a way that allows me to gain practical knowledge of the subject along with its theoretical aspects. I would like to express my gratitude to all those who made it possible for me to complete this thesis. I want to thank the faculty of SEVENTH DAY ADVENTIST COLLEGE, SURAT for giving me permission to commence this thesis.

I am deeply indebted to my supervisor, Prof. Mistry, whose help, stimulating suggestions, and encouragement aided me throughout the research for and writing of this thesis.

I would also like to thank my friends, whose help enabled me to complete this work.

THANKING YOU,
SALVI GUPTA (G-G034)
PINKY VISHNANI (G-V002)


CONTENTS

1. TITLE PAGE
2. ACKNOWLEDGEMENT
3. CRYPTOGRAPHY
4. HISTORY OF CRYPTOGRAPHY AND CRYPTANALYSIS
5. TYPES OF CRYPTOGRAPHIC ALGORITHMS
6. Why Three Encryption Techniques?
7. The Significance of Key Length
8. Cryptographic Solution Discovery
9. Cryptographic Solution Integration
10. Prevention is better than remedy!
11. SUMMARY
12. REFERENCES


CRYPTOGRAPHY

Cryptography (or cryptology, from the Greek kryptós, "hidden, secret") is the practice and study of hiding information. Modern cryptography intersects the disciplines of mathematics, computer science, and engineering. Applications of cryptography include ATM cards, computer passwords, and electronic commerce.

Cryptography is the science of writing in secret code and is an ancient art; the first documented use of cryptography in writing dates back to circa 1900 B.C., when an Egyptian scribe used non-standard hieroglyphs in an inscription. Some experts argue that cryptography appeared spontaneously sometime after writing was invented, with applications ranging from diplomatic missives to war-time battle plans. It is no surprise, then, that new forms of cryptography came soon after the widespread development of computer communications. In data and telecommunications, cryptography is necessary when communicating over any untrusted medium, which includes just about any network, particularly the Internet.

Within the context of any application-to-application communication, there are some specific security requirements, including:

Authentication: The process of proving one's identity. (The primary forms of host-to-host authentication on the Internet today are name-based or address-based, both of which are notoriously weak.)

Privacy/confidentiality: Ensuring that no one can read the message except the intended receiver.


Integrity: Assuring the receiver that the received message has not been altered in any way from the original.

Non-repudiation: A mechanism to prove that the sender really sent this message.

Cryptography, then, not only protects data from theft or alteration, but can also be used for user authentication. There are, in general, three types of cryptographic schemes typically used to accomplish these goals: secret key (or symmetric) cryptography, public-key (or asymmetric) cryptography, and hash functions, each of which is described below. In all cases, the initial unencrypted data is referred to as plaintext. It is encrypted into ciphertext, which will in turn (usually) be decrypted into usable plaintext.

In many of the descriptions below, two communicating parties will be referred to as Alice and Bob; this is the common nomenclature in the crypto field and literature, used to make it easier to identify the communicating parties. If there is a third or fourth party to the communication, they will be referred to as Carol and Dave. Mallory is a malicious party, Eve is an eavesdropper, and Trent is a trusted third party.


HISTORY OF CRYPTOGRAPHY AND CRYPTANALYSIS


Before the modern era, cryptography was concerned solely with message confidentiality (i.e., encryption): the conversion of messages from a comprehensible form into an incomprehensible one and back again at the other end, rendering them unreadable by interceptors or eavesdroppers without secret knowledge (namely, the key needed for decryption of that message). Encryption was used to (attempt to) ensure secrecy in communications, such as those of spies, military leaders, and diplomats. In recent decades, the field has expanded beyond confidentiality concerns to include techniques for message integrity checking, sender/receiver identity authentication, digital signatures, interactive proofs, and secure computation, among others.

CLASSIC CRYPTOGRAPHY

Reconstructed ancient Greek scytale (rhymes with "Italy"), an early cipher device.

The earliest forms of secret writing required little more than local pen and paper analogs, as most people could not read. More literacy, or literate opponents, required actual cryptography. The main classical cipher types are transposition ciphers, which rearrange the order of letters in a message (e.g., 'hello world' becomes 'ehlol owrdl' in a trivially simple rearrangement scheme), and substitution ciphers, which systematically replace letters or groups of letters with other letters or groups of letters (e.g., 'fly at once' becomes 'gmz bu podf' by replacing each letter with the one following it in the Latin alphabet). Simple versions of either offered little confidentiality from enterprising opponents, and still do.

An early substitution cipher was the Caesar cipher, in which each letter in the plaintext was replaced by a letter some fixed number of positions further down the alphabet. It was named after Julius Caesar, who is reported to have used it, with a shift of 3, to communicate with his generals during his military campaigns, much like the excess-3 code in Boolean algebra. There is record of several early Hebrew ciphers as well. The earliest known use of cryptography is some carved ciphertext on stone in Egypt (ca. 1900 BC), but this may have been done for the amusement of literate observers. The next oldest is bakery recipes from Mesopotamia.

Ciphertexts produced by a classical cipher (and some modern ciphers) always reveal statistical information about the plaintext, which can often be used to break them. After the discovery of frequency analysis, perhaps by the Arab mathematician and polymath Al-Kindi (also known as Alkindus) in the 9th century, nearly all such ciphers became more or less readily breakable by any informed attacker. Such classical ciphers still enjoy popularity today, though mostly as puzzles (see cryptogram). Al-Kindi wrote a book on cryptography entitled Risalah fi Istikhraj al-Mu'amma (Manuscript for the Deciphering of Cryptographic Messages), in which he described the first cryptanalysis techniques, including some for polyalphabetic ciphers.
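The Caesar shift described above is easy to sketch in a few lines of Python; the function name and messages here are our own, chosen for illustration only, and the shift of 1 reproduces the 'fly at once' example given earlier.

```python
# Toy Caesar cipher: shift each letter a fixed number of positions down
# the alphabet.  Decryption is simply encryption with the opposite shift.
def caesar(text, shift):
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return "".join(result)

ciphertext = caesar("fly at once", 1)
print(ciphertext)              # gmz bu podf -- the example from the text
print(caesar(ciphertext, -1))  # fly at once
```

With a shift of 3 this is exactly the cipher attributed to Julius Caesar; the weakness, of course, is that there are only 25 possible shifts to try.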


Essentially all ciphers remained vulnerable to cryptanalysis using the frequency analysis technique until the development of the polyalphabetic cipher, most clearly by Leon Battista Alberti around the year 1467, though there is some indication that it was already known to Al-Kindi.[9] Alberti's innovation was to use different ciphers (i.e., substitution alphabets) for various parts of a message (perhaps for each successive plaintext letter, at the limit). He also invented what was probably the first automatic cipher device, a wheel which implemented a partial realization of his invention. In the polyalphabetic Vigenère cipher, encryption uses a key word, which controls letter substitution depending on which letter of the key word is used. In the mid-19th century Charles Babbage showed that polyalphabetic ciphers of this type remained partially vulnerable to extended frequency analysis techniques.

Although frequency analysis is a powerful and general technique against many ciphers, encryption has still often been effective in practice; many a would-be cryptanalyst was unaware of the technique. Breaking a message without using frequency analysis essentially required knowledge of the cipher used and perhaps of the key involved, thus making espionage, bribery, burglary, defection, etc., more attractive approaches to the cryptanalytically uninformed.

It was finally explicitly recognized in the 19th century that secrecy of a cipher's algorithm is neither a sensible nor a practical safeguard of message security; in fact, it was further realized that any adequate cryptographic scheme (including ciphers) should remain secure even if the adversary fully understands the cipher algorithm itself. Security of the key used should alone be sufficient for a good cipher to maintain confidentiality under an attack. This fundamental principle was first explicitly stated in 1883 by Auguste Kerckhoffs and is generally called Kerckhoffs' principle; alternatively and more bluntly, it was restated by Claude Shannon, the inventor of information theory and the fundamentals of theoretical cryptography, as Shannon's Maxim: 'the enemy knows the system'.

Different physical devices and aids have been used to assist with ciphers. One of the earliest may have been the scytale of ancient Greece, a rod supposedly used by the Spartans as an aid for a transposition cipher (see image above). In medieval times, other aids were invented, such as the cipher grille, which was also used for a kind of steganography. With the invention of polyalphabetic ciphers came more sophisticated aids such as Alberti's own cipher disk, Johannes Trithemius' tabula recta scheme, and Thomas Jefferson's multi-cylinder (not publicly known, and reinvented independently by Bazeries around 1900). Many mechanical encryption/decryption devices were invented early in the 20th century, and several were patented, among them rotor machines, famously including the Enigma machine used by the German government and military from the late 1920s and during World War II. The ciphers implemented by better-quality examples of these machine designs brought about a substantial increase in cryptanalytic difficulty after WWI.
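The frequency analysis technique that defeated these early ciphers can be sketched briefly. This toy assumes the plaintext is ordinary English, in which 'e' is by far the most common letter; real frequency analysis compares whole letter distributions, and the sample message here is our own.

```python
# Frequency analysis against a Caesar-shifted ciphertext: guess the shift
# by assuming the most common ciphertext letter stands for plaintext 'e'.
from collections import Counter

def shift_text(text, shift):
    return "".join(chr((ord(c) - 97 + shift) % 26 + 97) if c.islower() else c
                   for c in text)

def guess_shift(ciphertext):
    letters = Counter(c for c in ciphertext if c.isalpha())
    most_common = letters.most_common(1)[0][0]
    return (ord(most_common) - ord('e')) % 26

plaintext = "the enemy keeps their messages secret but we see the letters"
ciphertext = shift_text(plaintext, 11)
recovered = shift_text(ciphertext, -guess_shift(ciphertext))
print(recovered)
```

The attacker never needs the key: the statistics of the ciphertext alone betray it, which is exactly the statistical leakage the polyalphabetic ciphers of Alberti and Vigenère were designed to suppress.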


The computer era


The development of digital computers and electronics after WWII made possible much more complex ciphers. Furthermore, computers allowed for the encryption of any kind of data representable in any binary format, unlike classical ciphers, which only encrypted written language texts; this was new and significant. Computer use has thus supplanted linguistic cryptography, both for cipher design and cryptanalysis. Many computer ciphers can be characterized by their operation on binary bit sequences (sometimes in groups or blocks), unlike classical and mechanical schemes, which generally manipulate traditional characters (i.e., letters and digits) directly. However, computers have also assisted cryptanalysis, which has compensated to some extent for increased cipher complexity. Nonetheless, good modern ciphers have stayed ahead of cryptanalysis; it is typically the case that use of a quality cipher is very efficient (i.e., fast and requiring few resources, such as memory or CPU capability), while breaking it requires an effort many orders of magnitude larger, and vastly larger than that required for any classical cipher, making cryptanalysis so inefficient and impractical as to be effectively impossible. Alternate methods of attack (bribery, burglary, threat, torture, ...) have become more attractive in consequence.

Credit card with smart-card capabilities. The 3-by-5-mm chip embedded in the card is shown, enlarged. Smart cards combine low cost and portability with the power to compute cryptographic algorithms.

Extensive open academic research into cryptography is relatively recent; it began only in the mid-1970s. In recent times, IBM personnel designed the algorithm that became the Federal (i.e., US) Data Encryption Standard; Whitfield Diffie and Martin Hellman published their key agreement algorithm;[12] and the RSA algorithm was published in Martin Gardner's Scientific American column. Since then, cryptography has become a widely used tool in communications, computer networks, and computer security generally. Some modern cryptographic techniques can only keep their keys secret if certain mathematical problems are intractable, such as the integer factorization or the discrete logarithm problems, so there are deep connections with abstract mathematics. There are no absolute proofs that a cryptographic technique is secure (but see one-time pad); at best, there are proofs that some techniques are secure if some computational problem is difficult to solve, or this or that assumption about implementation or practical use is met.

As well as being aware of cryptographic history, cryptographic algorithm and system designers must also sensibly consider probable future developments while working on their designs. For instance, continuous improvements in computer processing power have increased the scope of brute-force attacks; thus, the required key lengths are similarly advancing. The potential effects of quantum computing are already being considered by some cryptographic system designers; the announced imminence of small implementations of these machines may be making the need for this preemptive caution rather more than merely speculative.

Essentially, prior to the early 20th century, cryptography was chiefly concerned with linguistic and lexicographic patterns. Since then the emphasis has shifted, and cryptography now makes extensive use of mathematics, including aspects of information theory, computational complexity, statistics, combinatorics, abstract algebra, number theory, and finite mathematics generally. Cryptography is also a branch of engineering, but an unusual one, as it deals with active, intelligent, and malevolent opposition (see cryptographic engineering and security engineering); other kinds of engineering (e.g., civil or chemical engineering) need deal only with neutral natural forces. There is also active research examining the relationship between cryptographic problems and quantum physics (see quantum cryptography and quantum computing).


TYPES OF CRYPTOGRAPHIC ALGORITHMS


There are several ways of classifying cryptographic algorithms. For purposes of this paper, they will be categorized based on the number of keys that are employed for encryption and decryption, and further defined by their application and use. The three types of algorithms that will be discussed are:

Secret Key Cryptography (SKC): Uses a single key for both encryption and decryption

Public Key Cryptography (PKC): Uses one key for encryption and another for decryption

Hash Functions: Uses a mathematical transformation to irreversibly "encrypt" information

Symmetric-key cryptography
Symmetric-key cryptography refers to encryption methods in which both the sender and receiver share the same key (or, less commonly, in which their keys are different, but related in an easily computable way). This was the only kind of encryption publicly known until June 1976.
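The defining property of symmetric-key encryption, that one shared key both encrypts and decrypts, can be illustrated with a toy XOR cipher. This is a sketch only: XOR with a repeating key is not a secure cipher, and the key and message below are invented; real symmetric schemes such as DES or AES are far more involved.

```python
# Toy symmetric cipher: the SAME key both encrypts and decrypts, because
# XOR is its own inverse.  Illustrates the shared-key idea only.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"our shared secret"
ciphertext = xor_cipher(b"attack at dawn", key)
plaintext = xor_cipher(ciphertext, key)  # applying the same key again undoes it
print(plaintext)  # b'attack at dawn'
```

The practical consequence shown here is also the scheme's weakness: sender and receiver must somehow agree on the key in advance, over a channel an eavesdropper cannot read.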


The modern study of symmetric-key ciphers relates mainly to the study of block ciphers and stream ciphers and to their applications. A block cipher is, in a sense, a modern embodiment of Alberti's polyalphabetic cipher: block ciphers take as input a block of plaintext and a key, and output a block of ciphertext of the same size. Since messages are almost always longer than a single block, some method of knitting together successive blocks is required. Several have been developed, some with better security in one aspect or another than others. These are the modes of operation and must be carefully considered when using a block cipher in a cryptosystem.

The Data Encryption Standard (DES) and the Advanced Encryption Standard (AES) are block cipher designs which have been designated cryptography standards by the US government (though DES's designation was finally withdrawn after the AES was adopted).[14] Despite its deprecation as an official standard, DES (especially its still-approved and much more secure triple-DES variant) remains quite popular; it is used across a wide range of applications, from ATM encryption[15] to e-mail privacy and secure remote access.[17] Many other block ciphers have been designed and released, with considerable variation in quality. Many have been thoroughly broken; see Category:Block ciphers.[13][18]

Stream ciphers, in contrast to the 'block' type, create an arbitrarily long stream of key material, which is combined with the plaintext bit-by-bit or character-by-character, somewhat like the one-time pad. In a stream cipher, the output stream is created based on a hidden internal state which changes as the cipher operates. That internal state is initially set up using the secret key material. RC4 is a widely used stream cipher; see Category:Stream ciphers.[13] Block ciphers can be used as stream ciphers; see Block cipher modes of operation.

Cryptographic hash functions are a third type of cryptographic algorithm. They take a message of any length as input, and output a short, fixed-length hash which can be used in (for example) a digital signature. For good hash functions, an attacker cannot find two messages that produce the same hash. MD4 is a long-used hash function which is now broken; MD5, a strengthened variant of MD4, is also widely used but broken in practice. The U.S. National Security Agency developed the Secure Hash Algorithm series of MD5-like hash functions: SHA-0 was a flawed algorithm that the agency withdrew; SHA-1 is widely deployed and more secure than MD5, but cryptanalysts have identified attacks against it; the SHA-2 family improves on SHA-1, but it isn't yet widely deployed, and the U.S. standards authority thought it "prudent" from a security perspective to develop a new standard to "significantly improve the robustness of NIST's overall hash algorithm toolkit." Thus a hash function design competition is underway, meant to select a new U.S. national standard, to be called SHA-3, by 2012. Message authentication codes (MACs) are much like cryptographic hash functions, except that a secret key can be used to authenticate the hash value upon receipt.

Public-key cryptography

Symmetric-key cryptosystems use the same key for encryption and decryption of a message, though a message or group of messages may have a different key than others. A significant disadvantage of symmetric ciphers is the key management necessary to use them securely. Each distinct pair of communicating parties must, ideally, share a different key, and perhaps each ciphertext exchanged as well. The number of keys required increases as the square of the number of network members, which very quickly requires complex key management schemes to keep them all straight and secret. The difficulty of securely establishing a secret key between two communicating parties, when a secure channel does not already exist between them, also presents a chicken-and-egg problem which is a considerable practical obstacle for cryptography users in the real world.

Whitfield Diffie and Martin Hellman, authors of the first published paper on public-key cryptography.

In a groundbreaking 1976 paper, Whitfield Diffie and Martin Hellman proposed the notion of public-key (also, more generally, called asymmetric-key) cryptography, in which two different but mathematically related keys are used: a public key and a private key. A public key system is so constructed that calculation of one key (the 'private key') is computationally infeasible from the other (the 'public key'), even though they are necessarily related. Instead, both keys are generated secretly, as an interrelated pair. The historian David Kahn described public-key cryptography as "the most revolutionary new concept in the field since polyalphabetic substitution emerged in the Renaissance".

In public-key cryptosystems, the public key may be freely distributed, while its paired private key must remain secret. The public key is typically used for encryption, while the private or secret key is used for decryption. Diffie and Hellman showed that public-key cryptography was possible by presenting the Diffie-Hellman key exchange protocol. In 1978, Ronald Rivest, Adi Shamir, and Len Adleman invented RSA, another public-key system. In 1997, it finally became publicly known that asymmetric-key cryptography had been invented by James H. Ellis at GCHQ, a British intelligence organization, and that, in the early 1970s, both the Diffie-Hellman and RSA algorithms had been previously developed (by Malcolm J. Williamson and Clifford Cocks, respectively). The Diffie-Hellman and RSA algorithms, in addition to being the first publicly known examples of high-quality public-key algorithms, have been among the most widely used. Others include the Cramer-Shoup cryptosystem, ElGamal encryption, and various elliptic curve techniques. See Category:Asymmetric-key cryptosystems.

Padlock icon from the Firefox Web browser, meant to indicate a page has been sent in SSL- or TLS-encrypted protected form. However, such an icon is not a guarantee of security; any subverted browser might mislead a user by displaying such an icon when a transmission is not actually being protected by SSL or TLS.

In addition to encryption, public-key cryptography can be used to implement digital signature schemes. A digital signature is reminiscent of an ordinary signature; they both have the characteristic that they are easy for a user to produce, but difficult for anyone else to forge. Digital signatures can also be permanently tied to the content of the message being signed; they cannot then be 'moved' from one document to another, for any attempt will be detectable. In digital signature schemes, there are two algorithms: one for signing, in which a secret key is used to process the message (or a hash of the message, or both), and one for verification, in which the matching public key is used with the message to check the validity of the signature. RSA and DSA are two of the most popular digital signature schemes. Digital signatures are central to the operation of public key infrastructures and many network security schemes (e.g., SSL/TLS, many VPNs, etc.).

Public-key algorithms are most often based on the computational complexity of "hard" problems, often from number theory. For example, the hardness of RSA is related to the integer factorization problem, while Diffie-Hellman and DSA are related to the discrete logarithm problem. More recently, elliptic curve cryptography has developed, in which security is based on number-theoretic problems involving elliptic curves. Because of the difficulty of the underlying problems, most public-key algorithms involve operations such as modular multiplication and exponentiation, which are much more computationally expensive than the techniques used in most block ciphers, especially with typical key sizes. As a result, public-key cryptosystems are commonly hybrid cryptosystems, in which a fast, high-quality symmetric-key encryption algorithm is used for the message itself, while the relevant symmetric key is sent with the message, but encrypted using a public-key algorithm. Similarly, hybrid signature schemes are often used, in which a cryptographic hash function is computed, and only the resulting hash is digitally signed.

Cryptanalysis


Variants of the Enigma machine, used by Germany's military and civil authorities from the late 1920s through World War II, implemented a complex electro-mechanical polyalphabetic cipher. Breaking and reading of the Enigma cipher at Poland's Cipher Bureau, for 7 years before the war, and subsequent decryption at Bletchley Park, was important to Allied victory.

The goal of cryptanalysis is to find some weakness or insecurity in a cryptographic scheme, thus permitting its subversion or evasion. It is a common misconception that every encryption method can be broken. In connection with his WWII work at Bell Labs, Claude Shannon proved that the one-time pad cipher is unbreakable, provided the key material is truly random, never reused, kept secret from all possible attackers, and of equal or greater length than the message. Most ciphers, apart from the one-time pad, can be broken with enough computational effort by brute force attack, but the amount of effort needed may be exponentially dependent on the key size, as compared to the effort needed to use the cipher. In such cases, effective security can be achieved if it is proven that the effort required (i.e., "work factor", in Shannon's terms) is beyond the ability of any adversary. This means it must be shown that no efficient method (as opposed to the time-consuming brute force method) can be found to break the cipher. Since no such showing can currently be made, the one-time pad remains the only theoretically unbreakable cipher.

There are a wide variety of cryptanalytic attacks, and they can be classified in any of several ways. A common distinction turns on what an attacker knows and what capabilities are available. In a ciphertext-only attack, the cryptanalyst has access only to the ciphertext (good modern cryptosystems are usually effectively immune to ciphertext-only attacks). In a known-plaintext attack, the cryptanalyst has access to a ciphertext and its corresponding plaintext (or to many such pairs). In a chosen-plaintext attack, the cryptanalyst may choose a plaintext and learn its corresponding ciphertext (perhaps many times); an example is gardening, used by the British during WWII. Finally, in a chosen-ciphertext attack, the cryptanalyst may be able to choose ciphertexts and learn their corresponding plaintexts. Also important, often overwhelmingly so, are mistakes (generally in the design or use of one of the protocols involved; see Cryptanalysis of the Enigma for some historical examples of this).
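Shannon's conditions for the unbreakable one-time pad, a truly random key as long as the message and used only once, can be sketched with Python's standard library. Here `secrets` stands in for a source of true randomness, and the message is our own example.

```python
# One-time pad sketch: XOR the message with a random pad of equal length.
# Provided the pad is truly random, never reused, and kept secret, the
# ciphertext reveals nothing about the plaintext -- Shannon's conditions.
import secrets

message = b"attack at dawn"
pad = secrets.token_bytes(len(message))           # key as long as the message
ciphertext = bytes(m ^ p for m, p in zip(message, pad))
recovered = bytes(c ^ p for c, p in zip(ciphertext, pad))
assert recovered == message
```

Without the pad, every same-length plaintext is an equally plausible decryption of the ciphertext, which is why no amount of computation helps the attacker; the price is a key as long as all traffic ever sent, which is what makes the scheme impractical for most uses.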


Poznań monument (center) to Polish cryptologists whose breaking of Germany's Enigma machine ciphers, beginning in 1932, altered the course of World War II.

Cryptanalysis of symmetric-key ciphers typically involves looking for attacks against the block ciphers or stream ciphers that are more efficient than any attack that could be mounted against a perfect cipher. For example, a simple brute force attack against DES requires one known plaintext and 2^55 decryptions, trying approximately half of the possible keys, to reach a point at which chances are better than even that the key sought will have been found. But this may not be enough assurance; a linear cryptanalysis attack against DES requires 2^43 known plaintexts and approximately 2^43 DES operations. This is a considerable improvement on brute force attacks.

Public-key algorithms are based on the computational difficulty of various problems. The most famous of these is integer factorization (e.g., the RSA algorithm is based on a problem related to integer factoring), but the discrete logarithm problem is also important. Much public-key cryptanalysis concerns numerical algorithms for solving these computational problems, or some of them, efficiently (i.e., in a practical time). For instance, the best known algorithms for solving the elliptic-curve-based version of discrete logarithm are much more time-consuming than the best known algorithms for factoring, at least for problems of more or less equivalent size. Thus, other things being equal, to achieve an equivalent strength of attack resistance, factoring-based encryption techniques must use larger keys than elliptic curve techniques. For this reason, public-key cryptosystems based on elliptic curves have become popular since the mid-1990s.

While pure cryptanalysis uses weaknesses in the algorithms themselves, other attacks on cryptosystems are based on actual use of the algorithms in real devices, and are called side-channel attacks. If a cryptanalyst has access to, for example, the amount of time the device took to encrypt a number of plaintexts or report an error in a password or PIN character, he may be able to use a timing attack to break a cipher that is otherwise resistant to analysis. An attacker might also study the pattern and length of messages to derive valuable information; this is known as traffic analysis,[27] and can be quite useful to an alert adversary. Poor administration of a cryptosystem, such as permitting too-short keys, will make any system vulnerable, regardless of other virtues. And, of course, social engineering, and other attacks against the personnel who work with cryptosystems or the messages they handle (e.g., bribery, extortion, blackmail, espionage, torture, ...) may be the most productive attacks of all.


FIGURE 1: Three types of cryptography: secret-key, public-key, and hash function.


Why Three Encryption Techniques?


So, why are there so many different types of cryptographic schemes? Why can't we do everything we need with just one? The answer is that each scheme is optimized for some specific application(s).

Hash functions, for example, are well-suited for ensuring data integrity because any change made to the contents of a message will result in the receiver calculating a different hash value than the one placed in the transmission by the sender. Since it is highly unlikely that two different messages will yield the same hash value, data integrity is ensured to a high degree of confidence.

Secret key cryptography, on the other hand, is ideally suited to encrypting messages, thus providing privacy and confidentiality. The sender can generate a session key on a per-message basis to encrypt the message; the receiver, of course, needs the same session key to decrypt the message.

Key exchange, of course, is a key application of public-key cryptography (no pun intended). Asymmetric schemes can also be used for non-repudiation and user authentication; if the receiver can obtain the session key encrypted with the sender's private key, then only this sender could have sent the message. Public-key cryptography could, theoretically, also be used to encrypt messages, although this is rarely done because secret-key cryptography operates about 1000 times faster than public-key cryptography.
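The integrity property just described can be seen with Python's standard hashlib: even a one-character change to a message yields a completely different hash value, so a mismatched hash tells the receiver the message was altered. The messages below are our own examples.

```python
# Any change to a message changes its hash, which is how the receiver
# detects tampering: recompute the hash and compare it with the one sent.
import hashlib

original = b"Transfer $100 to Bob"
tampered = b"Transfer $900 to Bob"   # a one-character alteration

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(tampered).hexdigest()
print(h1)
print(h2)
print(h1 == h2)   # False: the alteration is detected
```

SHA-256 (from the SHA-2 family discussed earlier) is used here simply because it ships with Python; the same check works with any cryptographic hash.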


Figure 2 puts all of this together and shows how a hybrid cryptographic scheme combines all of these functions to form a secure transmission comprising a digital signature and a digital envelope. In this example, the sender of the message is Alice and the receiver is Bob.

A digital envelope comprises an encrypted message and an encrypted session key. Alice uses secret key cryptography to encrypt her message using the session key, which she generates at random with each session. Alice then encrypts the session key using Bob's public key. The encrypted message and encrypted session key together form the digital envelope. Upon receipt, Bob recovers the secret session key using his private key and then decrypts the encrypted message.

The digital signature is formed in two steps. First, Alice computes the hash value of her message; next, she encrypts the hash value with her private key. Upon receipt of the digital signature, Bob recovers the hash value calculated by Alice by decrypting the digital signature with Alice's public key. Bob can then apply the hash function to Alice's original message, which he has already decrypted (see previous paragraph). If the resultant hash value is not the same as the value supplied by Alice, then Bob knows that the message has been altered; if the hash values are the same, Bob should believe that the message he received is identical to the one that Alice sent.

This scheme also provides non-repudiation since it proves that Alice sent the message; if the hash value recovered by Bob using Alice's public key proves that the message has not been altered, then only Alice could have created the digital signature. Bob also has proof that he is the intended receiver; if he can correctly decrypt the message, then he must have correctly decrypted the session key, meaning that he holds the correct private key.
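The digital envelope steps above can be sketched end to end in Python. Everything here is illustrative only: the tiny "textbook RSA" primes stand in for Bob's real key pair, a toy XOR stream stands in for a real symmetric cipher, and the hash comparison stands in for the full signature step (a real signature would encrypt the digest with Alice's private key). None of this offers actual security.

```python
# Toy digital envelope: a random session key encrypts the message, and
# Bob's (tiny, insecure, textbook-RSA) public key encrypts the session key.
import hashlib
import secrets
from itertools import cycle

# Bob's toy RSA key pair: n = p*q; e is public, d is private (Python 3.8+).
p, q, e = 61, 53, 17
n = p * q                            # 3233
d = pow(e, -1, (p - 1) * (q - 1))    # modular inverse of e

def xor_stream(data, key):
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

# --- Alice's side: build the envelope ---
message = b"meet at noon"
session_key = secrets.token_bytes(8)                 # fresh per message
encrypted_message = xor_stream(message, session_key)
encrypted_key = [pow(b, e, n) for b in session_key]  # byte-wise, toy n is tiny
digest = hashlib.sha256(message).hexdigest()         # stands in for signing

# --- Bob's side: open the envelope ---
recovered_key = bytes(pow(c, d, n) for c in encrypted_key)
recovered_message = xor_stream(encrypted_message, recovered_key)
assert recovered_message == message
assert hashlib.sha256(recovered_message).hexdigest() == digest
```

The structure mirrors the text: the fast symmetric step carries the bulk of the data, the expensive public-key step protects only the short session key, and the hash check lets Bob detect any alteration.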

The Significance of Key Length


In an article in the industry literature (circa 9/98), a writer claimed that 56-bit keys no longer provide sufficient protection for DES because computers are 1000 times faster today than they were in 1975. Therefore, the writer went on, we should be using 56,000-bit keys today instead of 56-bit keys. The conclusion was then drawn that because 56,000-bit keys are infeasible (true), we should accept that we have to live with weak cryptography (false!). The major error here is that the writer did not take into account that the number of possible key values doubles whenever a single bit is added to the key length; a 57-bit key has twice as many values as a 56-bit key (because 2^57 is two times 2^56). In fact, a 66-bit key would have 1,024 times as many possible values as a 56-bit key.

In cryptography, size does matter: the larger the key, the harder it is to crack a block of encrypted data. The reason that large keys offer more protection is almost obvious; computers have made it easier to attack ciphertext by brute force rather than by attacking the mathematics (which is generally well known anyway). With a brute-force attack, the attacker merely generates every possible key and applies it to the ciphertext; any resulting plaintext that makes sense offers a candidate for the legitimate key. This was the basis, of course, of the EFF's attack on DES. Until the mid-1990s or so, brute-force attacks were beyond the capabilities of computers within the budget of the attacker community. Today, however, significant compute power is commonly available and accessible. General-purpose computers such as PCs are already being used for brute-force attacks. For serious attackers with money to spend, such as some large companies or governments, Field Programmable Gate Array (FPGA) or Application-Specific Integrated Circuit (ASIC) technology offers the ability to build specialized chips that provide even faster and cheaper solutions than a PC. Consider that an AT&T ORCA chip (an FPGA) costs $200 and can test 30 million DES keys per second, while a $10 ASIC chip can test 200 million DES keys per second, compared to a PC that might test 40,000 keys per second.

The table below shows what DES key sizes are needed to protect data from attackers with different time and financial resources. This information is not merely academic; one of the basic tenets of any security system is to have an idea of what you are protecting and from whom you are protecting it. The table clearly shows that a 40-bit key is essentially worthless today against even the most unsophisticated attacker. On the other hand, 56-bit keys are fairly strong unless you might be subject to some fairly serious corporate or government espionage. But note that even 56-bit keys are declining in value and that the times in the table (1995 data) are worst cases.

So, how big is big enough? DES, invented in 1975, was still in use nearly 25 years later. If we take that to be a design criterion (i.e., a 20-plus-year lifetime) and we believe Moore's Law ("computing power doubles every 18 months"), then a key-size extension of 14 bits (i.e., a factor of more than 16,000) should be adequate. The 1975 DES proposal suggested 56-bit keys; by 1995, a 70-bit key would have been required to offer equal protection, and an 85-bit key will be necessary by 2015.

The discussion above suggests that a 128- or 256-bit key for SKC will suffice for some time because that key length keeps us ahead of the brute-force capabilities of attackers. While a large key is good, a huge key may not always be better. Many public-key cryptosystems use 1024- or 2048-bit keys; expanding the key to 4096 bits probably does not add any protection at this time, but it does add significantly to processing time. The most effective large-number factoring methods today use a mathematical Number Field Sieve to find a certain number of relationships and then a matrix operation to solve a linear equation that produces the two prime factors. The sieve step involves a large number of operations that can be performed in parallel; solving the linear equation, however, requires a supercomputer. Indeed, finding the solution to the RSA-140 challenge in February 1999 (factoring a 140-digit, 465-bit number) required about 4 weeks on 200 computers across the Internet for the first step, and 100 hours and 810 MB of memory on a Cray computer for the second step. In early 1999, Shamir (of RSA fame) described a new machine that could increase factorization speed by 2-3 orders of magnitude. Although no detailed plans were provided, nor is one known to have been built, the concepts of TWINKLE (The Weizmann Institute Key Locating Engine) could result in a specialized piece of hardware that would cost about $5,000 and have the processing power of 100-1000 PCs. There still appear to be

many engineering details that would have to be worked out before such a machine could be built. Furthermore, the hardware improves the sieve step only; the matrix operation is not optimized at all by this design, and the complexity of that step grows rapidly with key length, both in processing time and in memory requirements. Nevertheless, this plan conceptually puts 512-bit keys within reach of being factored. Although most PKC schemes allow keys of 1024 bits and longer, Shamir claims that 512-bit RSA keys "protect 95% of today's E-commerce on the Internet." (See Bruce Schneier's Crypto-Gram of May 15, 1999 for more information, as well as the comments from RSA Labs.)

It is also interesting to note that while cryptography is good and strong cryptography is better, long keys may disrupt the apparent randomness of data files. Shamir and van Someren ("Playing hide and seek with stored keys") have noted that a new generation of viruses could be written to locate files encrypted with long keys, making such files easier for intruders to find and, therefore, more prone to attack.

Finally, U.S. government policy has tightly controlled the export of crypto products since World War II. Until recently, export outside of North America of cryptographic products using keys longer than 40 bits was prohibited, which made those products essentially worthless in the marketplace, particularly for electronic commerce. More recently, the U.S. Commerce Department relaxed the regulations, allowing the general export of 56-bit SKC and 1024-bit PKC products (certain sectors, such as health care and financial services, may export products with even larger keys).

The Commerce Department's Bureau of Export Administration maintains a Commercial Encryption Export Controls web page with more information. The secret keys exchanged via such a system must offer at least the same level of attack resistance as the system itself. Thus, the three parameters of such a system (system strength, secret-key strength, and public-key strength) must be matched.
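The keyspace arithmetic behind these claims is easy to check. The sketch below verifies the doubling-per-bit point and uses the 200-million-keys-per-second ASIC figure quoted earlier to estimate a worst-case exhaustive search of the 56-bit DES keyspace on a single chip (real attacks run many chips in parallel, which is how the EFF machine finished in days):

```python
# each added bit doubles the number of possible keys
assert 2**57 == 2 * 2**56
assert 2**66 == 1024 * 2**56   # ten extra bits: 2^10 = 1024 times the keyspace

keys_per_second = 200_000_000       # one ASIC chip, figure quoted above
seconds = 2**56 / keys_per_second   # worst case: try every key
years = seconds / (86400 * 365)
print(f"{years:.1f} years")         # roughly 11.4 years for a single chip
```

Halving for the average case, or adding more chips, scales the estimate linearly; adding one bit to the key doubles it.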

Abstract

There are many misconceptions about the role cryptography plays in the world of information security. The purpose of this paper is to help information technology professionals make informed decisions about using cryptographic solutions to secure electronic business transactions. The guidance contained in this paper is organized into the three main areas of the cryptographic solution life cycle: solution discovery, solution integration, and solution maintenance. The following provides a consistent approach to cost-justify the use of cryptographic security solutions in business applications. Some basic understanding of encryption and cryptography is assumed.

See Schneier, Bruce, "Risk, Complexity, and Network Security."


Cryptographic Solution Discovery


In order to define what cryptographic solution discovery is, think about why anyone would want a cryptographic solution in the first place. Cryptographic solutions are defined as any technology that provides business applications with one or more of the four main cryptographic principles: data confidentiality, data integrity, client accountability, and client authentication. These four principles have been sold over the past few years as the means to a secure computing end. Thus, discovery takes place when an IT professional takes action to solve business partners' security problems with the promise of cryptographic principles.

The first step an IT professional can take to make a business partner's transactions more secure is to understand that cryptography is only one part of security threat mitigation. Cryptography does not equal security; in fact, that is exactly the myth that must not be propagated. When many security professionals are asked "How can we protect this data?" they will provide the simple response "Encrypt it!" The problem is that these security professionals are ignoring the two most difficult parts of discovering the cryptographic solution. First, a solid business case should be documented based on the risk of existing threats and vulnerabilities, since those threats and vulnerabilities could lead to negative business impact. Second, potential cryptographic solutions need to be compared with non-cryptographic alternatives. This comparison must take into account the actual security value the solution provides in a production environment.
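One conventional way to put such a business case in monetary terms is annualized loss expectancy (ALE): the expected yearly loss is the cost of a single compromise times its yearly likelihood. The figures below are purely hypothetical, chosen only to show how a cryptographic control can be cost-justified:

```python
# Annualized Loss Expectancy: a common way to express risk in dollars
def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
    return single_loss_expectancy * annual_rate_of_occurrence

# hypothetical figures for one vulnerability
risk_before = ale(500_000, 0.10)   # $50,000/year expected loss, no control
risk_after = ale(500_000, 0.01)    # control cuts likelihood to 1%: $5,000/year
annual_solution_cost = 20_000      # implementation + maintenance, annualized

net_benefit = (risk_before - risk_after) - annual_solution_cost
assert net_benefit == 25_000       # positive: the control pays for itself
```

If the net benefit came out negative, the same arithmetic would argue for a cheaper non-cryptographic control instead.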

Business Case via Risk Assessment

Creating a valid business case for a cryptographic solution is not complex in theory. What can make business case creation painful is the complexity of the application environment to be analyzed (see Schneier, Bruce, "Risk, Complexity, and Network Security"). In the end, all business cases come down to dollars and cents, so intangible things like risk have to be defined in monetary terms. Risk assessment is not elementary, but it becomes a surmountable task with assistance from proven tools. The Information Security Forum is one such organization that has produced risk assessment models (see The Standard of Good Practice for Information Security). Do not invent your own risk assessment methodology from scratch; at the very least, take input from professionals who are veterans in this space and adapt their ideas to fit your needs (see Information Security Risk Assessment Practices of Leading Organizations).

A valid business case will include the documented risk assessment along with the direct business value a specific cryptographic solution provides. The risk assessment describes potential vulnerabilities in a business application and the likelihood that one of these vulnerabilities could result in a compromise. The risk assessment will also document the possible negative impact to the business if a compromise does occur. The true business value of a security solution is the amount of risk mitigation provided compared to the cost of solution implementation and maintenance. In other words, how well does the cryptographic solution protect the business from costs related to a
compromise of confidentiality, integrity, accountability, or authentication? These questions can only be answered through direct conversation with the actual business partners.

Data Confidentiality

Confidentiality seems to be one of the most common benefits that cryptographic solutions provide via encryption. The need for cryptographic confidentiality usually stems from data classification and the risk of unauthorized access to certain classes of data. Data classification can be a complex task, and many organizations underestimate the resources and effort that go into it. The simplest case is the small business scenario, where business ownership of data is usually well defined. Data classification is also much simpler if the number of data types is relatively small. Unfortunately, most large enterprise data classification tasks are a huge undertaking filled with political red tape.

First, it is necessary to define what classifications exist. The easy route is simply public versus private: all public data is unprotected, and all private data is access-controlled to a specific group of individuals. This bipolar classification scheme is oversimplified and almost never applicable. In the real world there are always more than two data classifications, and they are organized to be gradually more restrictive in access. An example of graduated data classifications might start with private data as a grouping; everything outside of private data would be publicly accessible. A subset of the private data may be considered proprietary. A subset of the proprietary

data could be confidential, with the confidential classification being the most restrictive. The next logical step is to classify every bit of data used by business applications. These data elements and their classifications can be maintained in a catalog for easy reference. One of the road bumps that will slow down this process is the large number of data types that need classification; when databases reach into the terabytes of storage, the number of elements can be overwhelming. An even bigger roadblock is determining which business partner actually owns the data, since many business applications may share it. Here is where the politics and red tape come into play. Suddenly, the area tagged with the job of classifying data realizes it will cost them 5,000 person-hours, and on top of that, the task was never planned for. This type of resource expense will most likely result in finger pointing and passing the buck.

Calculating the return on investment of data classification is not any easier than the data classification itself. One of the best ways to sell the data classification task to management is the concept of reuse: multiple efforts will not waste valuable time re-classifying data if there is an enterprise standard. Think of how much time and money is saved by having a consistent answer to the classification of certain data types. Once an organization has a consistent way to classify its business data, it has a significant head start on answering the data confidentiality question. Risk assessment can happen much faster when there is an enterprise data classification to reference. Still, even when a solid data classification model is defined, risk assessments should not always conclude that cryptography is the best answer.
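A graduated classification scheme and catalog like the one described can be modeled very simply. The element names and the encryption threshold below are hypothetical, chosen only to illustrate the idea of a reusable enterprise catalog:

```python
from enum import IntEnum

class Classification(IntEnum):
    # graduated levels, each more restrictive than the last
    PUBLIC = 0
    PRIVATE = 1
    PROPRIETARY = 2
    CONFIDENTIAL = 3

# a small catalog mapping data elements to their classification
catalog = {
    "press_release": Classification.PUBLIC,
    "employee_email": Classification.PRIVATE,
    "product_roadmap": Classification.PROPRIETARY,
    "merger_terms": Classification.CONFIDENTIAL,
}

def requires_encryption(element: str,
                        threshold: Classification = Classification.PROPRIETARY) -> bool:
    # policy decision: encrypt anything at or above the threshold
    return catalog[element] >= threshold

assert not requires_encryption("employee_email")
assert requires_encryption("merger_terms")
```

The point of the enum ordering is exactly the "gradually more restrictive" structure from the text: policy questions reduce to simple comparisons against a threshold.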

What security does the encryption really provide? Encryption provides access control based on key management. Are there other ways to provide a similar level of access control without the key management overhead? Perhaps the data exists on a platform that has native access control built in. Perhaps the business partners that need access to the data already have authentication credentials on the platform, and a simple access control list is the answer. The moral of the story: do not get distracted by the latest hot security technology with cryptography written all over it, and do not overlook a simple, cost-effective solution that has been available since the implementation of the original application. Always be on the lookout for the best security control, and do not be fooled by the used-car encryption salesman.

Data Integrity

Integrity is a valuable cryptographic principle that is often overlooked because it is overshadowed by the marketing of encryption and confidentiality. Security professionals who truly understand cryptography realize that data integrity is just as important as data confidentiality. They also realize that encryption is not a security silver bullet, precisely because it does not by itself provide data integrity. Integrity checks can even be used to monitor how well data confidentiality solutions are doing their job: they augment confidentiality solutions by alerting applications when business data has been tampered with due to a possible key compromise. This paper will not dive into the

mathematical proofs related to hashing functions, randomness, and checksum collisions. What is important to know is that integrity checks are critical to prevent malicious modification of business partner data. Major negative business impacts from unauthorized data modification may be illustrated through risk assessment documentation; in that case, an integrity solution should be deployed as part of the application security controls. This is another example of how important formal risk assessment is in discovering the correct cryptographic security control.
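A keyed integrity check (HMAC) is one standard way to implement such a control: unlike a bare hash, an attacker who modifies the data cannot recompute the check without the shared key. A minimal Python sketch, with a hypothetical key and message:

```python
import hmac
import hashlib

key = b"shared-integrity-key"  # hypothetical shared secret, never transmitted

def tag(message: bytes) -> bytes:
    # keyed integrity check: cannot be forged without the key
    return hmac.new(key, message, hashlib.sha256).digest()

message = b'{"amount": 100, "to": "acct-42"}'
mac = tag(message)  # stored or sent alongside the message

# intact message verifies; a tampered one does not
assert hmac.compare_digest(tag(message), mac)
assert not hmac.compare_digest(tag(b'{"amount": 900, "to": "acct-42"}'), mac)
```

`hmac.compare_digest` is used instead of `==` to avoid leaking information through comparison timing.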

Client Accountability

Accountability can be the most confusing of the principles. The confusion is due to the abuse or misunderstanding of trust. The concept of using cryptography to provide user or transactional accountability is nothing new; business applications have been digitally signing documents and transactions for years now. The problem is that a digital signature without the proper supporting trust model is not worth the ones and zeros it consists of. Defining a valid trust model will probably lead you into philosophical and legal debates with your coworkers and legal department. This is the underlying issue in any successful deployment of a public key infrastructure. The mathematical details of asymmetric cryptography will not be explored in this paper, but suffice it to say that public-key cryptography is the heart of cryptographic accountability. There is a lot more to a public key infrastructure than the million dollars' worth of technology; there is another million in paperwork. Significant effort goes into the political and legal

documentation that is necessary to support a solid public key infrastructure (PKI). It takes a very solid PKI to be the cornerstone of cryptographic accountability. Again, granular risk assessment and accurate cost-benefit analysis are the only way to justify such a cryptographic solution. As with confidentiality and integrity, accountability is another area where business requirements can be misinterpreted. Take enterprise internal secure email as an example business requirement. At first glance, a group of security professionals may diagnose the problem as a need for enterprise user messaging accountability via public key infrastructure. After all, there are plenty of industry examples of secure email through public-key technology. Vendors of these public key infrastructures can spend entire days mesmerizing potential clients with the wonders of public-key technology, telling customers how PKI products will secure enterprise email along with several other money-saving features. Many of these claims may very well be true.

Client Authentication

Authentication is a practical extension of the cryptographic accountability principle. Once accountability is established, it can be leveraged to prove identity to a system or application. This principle also relies heavily on a strong trust model. Authentication is dependent on the policies and procedures surrounding the public key infrastructure that issues the certificates. A system that uses cryptographic authentication must trust the

certificate authority registration and distribution process. The system has to assume that proper process has been implemented to ensure appropriate certificate management. If the certificates are managed correctly, then the system can be confident that the client presenting a certificate is represented by the information in that certificate. Probably the most important part of managing the client certificate is controlling authorized access to it. Client certificate access is what makes cryptographic authentication more secure than simple ID and password solutions. The owner of the client certificate should have to present something they know in order to gain access to their certificate. Thus, the client has used something they know (a passphrase or PIN) to gain access to something they have. The goal is to require more than one factor of authentication. Keep in mind that cryptography is not providing magic security dust here either; it is used to verify the authenticity of the certificate and its client information. Overall authentication depends on the strength of the trust model and the diligence of access control over private data.

Cryptographic solution discovery is a cost-benefit decision procedure. Cryptography can bring the benefits of confidentiality, integrity, accountability, and/or authentication. The cost of these benefits needs to be compared with the cost of alternative solutions, and the cost of any security solution, including cryptographic solutions, must be weighed against the cost of a potential security compromise.
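The "something you know unlocks something you have" idea can be sketched with a passphrase-derived key protecting a stored credential. The XOR wrapping below is a toy stand-in for real key wrapping, and the passphrase and parameters are illustrative only:

```python
import hashlib
import os

def wrap(credential: bytes, passphrase: str, salt: bytes) -> bytes:
    # derive a key-encrypting key from "something you know"
    kek = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt,
                              100_000, dklen=len(credential))
    # toy XOR wrap; applying it twice with the same passphrase unwraps
    return bytes(a ^ b for a, b in zip(credential, kek))

salt = os.urandom(16)
credential = os.urandom(32)  # "something you have": a stored private credential
stored = wrap(credential, "correct horse battery staple", salt)

# the right passphrase recovers the credential; a wrong one does not
assert wrap(stored, "correct horse battery staple", salt) == credential
assert wrap(stored, "password123", salt) != credential
```

The stored blob is useless on its own: stealing the "something you have" without the "something you know" yields only noise, which is the point of requiring two factors.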


Cryptographic Solution Integration


Once a cryptographic solution has been discovered through documented risk assessment and justified through cost-benefit analysis, that solution can be integrated into business applications. The documentation produced through cryptographic solution discovery should remove any question about what solution will be integrated; during integration, the effort should concentrate on the logistical details of solution deployment. It may seem like most of the hard work has been accomplished once the security requirements have been documented and a specific cryptographic solution chosen. Unfortunately, there is considerable work required to make the cryptographic solution a seamless part of the business transaction. Solution integration is the return on the time and money invested in the solution discovery phase. Security professionals must follow through on the promises made to business partners during the solution discovery phase by efficiently implementing the selected cryptographic technology. One of the most effective ways to ensure smooth solution integration is to maintain a group of approved cryptographic solutions that can be reused. Business partners that share an enterprise will reuse much of the same infrastructure as they deploy business applications. Once cryptographic solutions are proven, they can be leveraged by future efforts in the form of reusable services or patterns.


Reusable Cryptographic Services

Reusable services are the cornerstone of quick-hitting cryptographic solutions that can be integrated into business applications with low overhead. These reusable services should exist and interoperate on multiple platforms, and they should be accessible through different application development models (see, for example, the Java Cryptography Extension and the .NET System.Security.Cryptography namespace). The reusable services should be based on industry-standard algorithms and technologies. Cryptographic reusable services provide a thin layer of abstraction to business applications by hiding the cryptographic implementation details. This allows the services to stay current with the strongest cryptographic techniques without major changes to business application functionality.

Reusable Cryptographic Patterns

Reuse of cryptographic solutions is not limited to application-layer technologies; the same concept of reusability can be applied at other layers of the infrastructure. Consider data in transit: three examples are web browser requests, FTP transfers, and web services. The data associated with these transactions may require protection based on risk assessment. Perhaps changes to the business application will cost more than the benefit provided by existing reusable cryptographic services. Another option to provide seamless, low-overhead solution integration is to leverage cryptographic solution integration patterns.
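The thin abstraction layer described above amounts to a service interface that hides the cipher behind a swappable backend. A sketch, with a deliberately toy XOR backend standing in for a real algorithm such as AES; the class and function names are hypothetical:

```python
import hashlib

class CryptoService:
    """Thin abstraction: applications call seal/open without knowing the cipher."""
    def __init__(self, backend):
        self._backend = backend  # swappable when algorithms age out

    def seal(self, key: bytes, plaintext: bytes) -> bytes:
        return self._backend(key, plaintext)

    def open(self, key: bytes, ciphertext: bytes) -> bytes:
        return self._backend(key, ciphertext)  # toy backend is its own inverse

def xor_backend(key: bytes, data: bytes) -> bytes:
    # placeholder cipher; a real service would delegate to AES-GCM or similar
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

svc = CryptoService(xor_backend)
ct = svc.seal(b"k1", b"order details")
assert svc.open(b"k1", ct) == b"order details"
assert ct != b"order details"
```

When the backend algorithm is deprecated, only the injected backend changes; every application built against `seal`/`open` is untouched, which is exactly the maintenance benefit the text claims for reusable services.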


The patterns may not consist of reusable APIs. Solution patterns can document how to implement reusable cryptographic technologies like Secure Sockets Layer, IP Security, and Secure FTP. As new efforts prove out certain cryptographic solutions in certain parts of the enterprise, those implementations should be documented as new cryptographic solution patterns. Integration patterns should call out any implementation issues; road bumps and mistakes made in past deployments can be prevented in future implementations when detailed descriptions are documented.

Cryptographic Solution Maintenance

The last phase of the cryptographic solution lifecycle is solution maintenance. This stage of the lifecycle is as important as discovery and integration combined. Maintenance of a cryptographic solution ensures that the time and money invested continue to produce business value. Solution maintenance is a significant factor in the cost-benefit analysis that occurs when comparing security solutions, and its cost can be easily underestimated. Key management is one of the reasons that the cost of solution maintenance is consistently underestimated. Many security professionals and business partners do not understand the resource requirements for strong key management. Whether dealing with symmetric shared keys or public-key certificates, there is a set of core management functions (see Digital Certificate Management: Navigating Your Success). Examples of these


management functions include key registration, generation, distribution, storage, revocation, expiration, and recovery. Key management is not the only aspect of a cryptographic solution that can cause the solution to weaken over time. The cryptographic algorithms themselves will lose their security value as technology advances: older algorithms will be broken, and key lengths will become obsolete. At some point cryptographic solutions must migrate to keep up with the latest cryptographic technology. Performance and availability also factor into the end of the solution lifecycle. Cryptographic solutions may become inefficient compared to newer technology; the speed of cryptographic operations as well as ciphertext size should be considered.
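A few of these management functions (generation, storage, expiration, revocation) can be sketched as a small registry; registration, distribution, and recovery are omitted for brevity, and the class design is purely illustrative:

```python
from datetime import date, timedelta
import os

class KeyRegistry:
    """Sketch of core key-management functions: generation, storage,
    expiration, and revocation."""
    def __init__(self, lifetime_days: int = 365):
        self._keys = {}
        self._lifetime = timedelta(days=lifetime_days)

    def generate(self, key_id: str, today: date) -> bytes:
        key = os.urandom(32)  # 256-bit symmetric key
        self._keys[key_id] = {"key": key,
                              "expires": today + self._lifetime,
                              "revoked": False}
        return key

    def revoke(self, key_id: str) -> None:
        self._keys[key_id]["revoked"] = True

    def is_usable(self, key_id: str, today: date) -> bool:
        entry = self._keys[key_id]
        return not entry["revoked"] and today < entry["expires"]

reg = KeyRegistry()
reg.generate("app-1", date(2024, 1, 1))
assert reg.is_usable("app-1", date(2024, 6, 1))
assert not reg.is_usable("app-1", date(2025, 6, 1))  # expired
reg.revoke("app-1")
assert not reg.is_usable("app-1", date(2024, 6, 1))  # revoked
```

Even this toy shows why maintenance is underestimated: every key needs lifecycle state tracked forever, and a production registry must add secure storage, audited distribution, and recovery on top.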


Prevention is better than remedy!


We provide a wide range of Information Security technologies, solutions and services for all businesses that deal with electronic data, communication and databases, but most importantly for the financial, healthcare and legal service sectors, which deal with sensitive information. Cryptographic Solutions Inc. is extending its security services through partnership programs with outstanding security technology providers in order to provide a broad range of services to its clients. Our core services are based on Information Security technology such as secure communication, encryption mechanisms, authentication mechanisms, database security, and intrusion detection systems, to safeguard your digital assets against unauthorized access and data loss due to natural disasters.

Financial and Healthcare
1. Authentication solutions
2. Web App security Audit
3. PEN test
4. Secure Storage solutions
5. Hardware Solutions
6. Secure Database
7. High Availability solutions
8. Encryption

Small to Medium businesses
1. Secure wireless design
2. Vulnerability assessment
3. Threat Risk Assessment
4. Firewall systems
5. Cryptography solutions
6. Site to Site VPN
7. Security training Awareness and IDS

Legal Sector
1. Secure Storage
2. Forensic Auditing
3. Incident Analysis
4. Secure Database
5. PIPEDA Compliancy
6. SSL/VPN solutions
7. Secure Email
8. Secure web portal

SUMMARY

Secrecy is the heart of cryptography, and encryption is a practical means to achieve information secrecy. Modern encryption techniques are mathematical transformations (algorithms) that treat messages as numbers or algebraic elements in a space and transform them between a region of meaningful messages, or cleartext, and a region of unintelligible messages, or ciphertext. In order to restore information, an encryption transformation must be reversible, and the reversing transformation is called decryption. Conventionally, encryption and decryption algorithms are parameterized by cryptographic keys. An encryption algorithm and a decryption algorithm, plus a description of the format of messages and keys, form a cryptographic system, or cryptosystem.

The rapid growth of information technology has led to astounding advances in cryptography to protect the integrity and confidentiality of data. In the modern information-oriented society, various devices are connected to the Internet as terminals, which necessitates technology for information security. Today, the world continues to witness an explosion of technology designed to help people communicate faster and more easily. We carry powerful digital computers in our pockets, exchange digital information in addition to voice data with our mobile phones, and surf the Web with high-end PDAs. In the near future, especially with the coming of age of 3G wireless devices, every type of electronic data channel will be used to exchange every type of electronic information. One of the great challenges of the ability to communicate digitally is securing the increased amount of electronic information now exchanged over the network. To make matters worse, today everyone wants to be everywhere and anywhere and be reached via


his tech-mobile system, and that makes mobile security a top priority for many businesses that want to offer high-end mobile customer applications. Over the last three decades, traditional cryptosystems like DES, AES, RSA, DSA, the one-time pad, DLP-based schemes, ElGamal and, of late, ECC have been the answer to the wide range of issues that impact modern secure communication and mobile data protection, including the assurance of privacy, the certainty of the transmitter's or receiver's identity, and the integrity of the communication. Of late, centralized enterprise key management is also playing a role in HR provisioning via people, process and technology, as well as in enterprise mission-critical data encryption and network access control.

Digital Signature

A digital signature is an electronic signature that authenticates the identity of the sender of a message. It can also be used to ensure that the content of a sent message is unchanged, i.e., that data integrity is preserved. If a digital signature is used, it is still possible for the recipient to see the message in plain text; the main idea is no longer to disguise what a message says, but rather to prove that it originates with a particular sender.

Digital Certificate

A digital certificate is an electronic document issued by a certification authority (CA) and usually contains your name, a serial number, an expiration date, a copy of your public key (which anyone can use to encrypt messages to send to you; you then open the messages with your private key) and the digital


signature of the CA. Use of a CA when doing business on-line allows anyone to check that you are who you say you are.

Public Key Infrastructure (PKI)

A PKI can be used by a company to securely and privately exchange data and money. It involves digital certificates being issued that can identify an individual or company, but it also offers directory services that can store, allocate and revoke certificates as and when necessary. There are several third-party vendors of business PKI solutions, e.g. RSA, Baltimore, VeriSign, or Thawte, that have gained public confidence as CAs.

Thus, cryptography is not a magical security serum. The need for investment in cryptographic solutions must be backed by formal risk assessment, and risk assessments are only valid if an enterprise standard for data classification exists. Once the security analysis is complete, business partners must determine how deep they want to dig into their pockets to protect their data. Security professionals can ease some of the cost by having strategic cryptographic solutions readily available for integration. Finally, cryptographic solutions are a living technology; each one will come to the end of its lifecycle, at which time it will be replaced with a new and more secure cryptographic technology.
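The reversibility requirement from the summary, that decryption must undo encryption under the same key, can be made concrete with a toy Caesar-style cipher. It is useless for real security, but it demonstrates the D_k(E_k(m)) = m relationship directly:

```python
def encrypt(key: int, plaintext: str) -> str:
    # toy Caesar shift: a reversible transformation parameterized by a key
    return "".join(chr((ord(c) + key) % 256) for c in plaintext)

def decrypt(key: int, ciphertext: str) -> str:
    # the reversing transformation: subtract the same key
    return "".join(chr((ord(c) - key) % 256) for c in ciphertext)

msg = "attack at dawn"
assert decrypt(3, encrypt(3, msg)) == msg  # D_k(E_k(m)) = m
assert encrypt(3, msg) != msg              # ciphertext is unintelligible
```

The pair of algorithms plus the key and message format is exactly the "cryptosystem" of the summary; modern systems differ in the strength of the transformation, not in this basic structure.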


REFERENCES:

Liddell and Scott's Greek-English Lexicon. Oxford University Press, 1984.

David Kahn, The Codebreakers, 1967, ISBN 0-684-83130-9.

Oded Goldreich, Foundations of Cryptography, Volume 1: Basic Tools, Cambridge University Press, 2001, ISBN 0-521-79172-3.

"Cryptology (definition)". Merriam-Webster's Collegiate Dictionary (11th ed.). Merriam-Webster. http://www.merriam-webster.com/dictionary/cryptology. Retrieved 2008-02-01.

Hakim, Joy (1995). A History of Us: War, Peace and All That Jazz. New York: Oxford University Press. ISBN 0-19-509514-6.


James Gannon, Stealing Secrets, Telling Lies: How Spies and Codebreakers Helped Shape the Twentieth Century, Washington, D.C., Brassey's, 2001, ISBN 1-57488-367-4.

Whitfield Diffie and Martin Hellman, "New Directions in Cryptography", IEEE Transactions on Information Theory, vol. IT-22, Nov. 1976, pp. 644-654.

AJ Menezes, PC van Oorschot, and SA Vanstone, Handbook of Applied Cryptography, ISBN 0-8493-8523-7.

FIPS PUB 197: The official Advanced Encryption Standard.

http://www.sans.org/reading_room/whitepapers/vpns/cryptography-business-myth_1236
http://www.cryptographicsolutions.com/
http://www.encyclopedia.com/topic/cryptography.aspx
http://www.garykessler.net/library/crypto.html#fig01
