
Appendix A Elements of Information Theory

Entropy
Redundancy of language
Key Equivocation & Unicity Distance
Equivocation of a simple cryptographic system

Elements of Information Theory

It is difficult to discuss matters of cryptography without first introducing the
elementary concepts of information theory. Claude Shannon, the father of the discipline, published in
1948 the seminal work in which the principles of reliable transmission over a
noisy channel were formulated. Later, this approach to secrecy systems supplied a major part of the
theoretical basis for modern cryptography. The standard tools for secure transmission
are codes and ciphers. A code is a fixed, predetermined dictionary in which every valid
message is encoded. Coding theory addresses the noisy-channel problem: by choosing a suitable code, if a
message M is garbled into M', the error can be detected and corrected to a message close to M.

Entropy

An information source is one of the essential components of a transmission and secrecy


system. It generates messages that are then transmitted over a communication channel. In most
cases, probabilistic models of the information source prove adequate. So the source is represented
by a random variable S, where the set of source messages is S = {s1, s2, ..., sk} with
associated probabilities P(S = si) = p(si) for each i = 1..k.

The entropy of a discrete message source is defined as:

H(S) = Σ_{i=1}^{k} p(si) log2(1/p(si))

Each term log2(1/p(si)) signifies the number of bits needed to encode the message si optimally.
When the messages are equally likely, p(s1) = p(s2) = ... = p(sk) = 1/k, then H(S) = log2 k. If k = 2^n,
then n bits are needed to encode each message. The value of H(S) varies between its maximum log2 k
and its minimum of zero, which occurs when there is a single message with probability 1.
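
As a quick illustration, here is a minimal Python sketch of this definition (the function name and example distributions are our own):

```python
import math

def entropy(probs):
    """Shannon entropy H(S) = sum of p * log2(1/p), in bits per message."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Uniform source with k = 8 = 2^3 messages: H(S) = log2(8) = 3 bits.
print(entropy([1.0 / 8] * 8))          # 3.0
# A biased source carries less information per message.
print(entropy([0.5, 0.25, 0.25]))      # 1.5
# A single certain message: minimum entropy.
print(entropy([1.0]))                  # 0.0
```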

Language Redundancy and Cryptanalysis


1. Language Redundancy and Cryptanalysis

o human languages are redundant

o e.g. "th lrd s m shphrd shll nt wnt"


o letters are not equally commonly used

o in English, E is by far the most common letter

o followed by T, R, N, I, O, A, S

o other letters are fairly rare

o e.g. Z, J, K, Q, X

As the example shows, not all letters are needed to understand English text; here the vowels
have been erased. Similarly, at a party one can follow a single person's conversation against the
hubbub of many others, again because of the redundancy of spoken language. This
redundancy is also what makes compression possible: software can encode text more compactly without losing
any information. Fundamentally, if we count the relative frequencies of letters, the
resulting pattern is the one shown in Fig. A.1.

2. The frequencies of letters in the English language

Fig. A.1 English letter relative frequencies {source:
http://sjsu.rudyrucker.com/~haile.eyob/paper/frequency.jpg }

This chart is based on counts made at ADFA in the late 1980s, and was used to develop the
table given in Seberry & Pieprzyk.
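
A minimal Python sketch of such a letter count might look as follows (the sample text and function name are our own):

```python
from collections import Counter

def letter_frequencies(text):
    """Relative frequency of each letter A-Z occurring in the text."""
    letters = [c for c in text.upper() if c.isalpha()]
    total = len(letters)
    return {letter: count / total for letter, count in sorted(Counter(letters).items())}

sample = "the lord is my shepherd i shall not want"
for letter, freq in letter_frequencies(sample).items():
    print(f"{letter}: {freq:.3f}")
```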

3. Other languages

o natural languages have varying letter frequencies


o some languages have different alphabets (cf. Norwegian)

o sample some text and count the letter frequencies

4. Use in Cryptanalysis

o compute the letter frequencies for the ciphertext being examined

o compare counts/plots against the known values

o look for the common peaks & troughs

o peaks at: the A-E-I spaced triple, the NO pair, the RST triple

o troughs at: JK, X-Z

o key idea: monoalphabetic substitution cannot alter relative letter frequencies (see the sketch below)
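
As a hedged sketch of such a comparison for shift ciphers, the following correlates the observed letter frequencies against typical English values over all 26 candidate shifts; the frequency table is approximate and the function name is our own:

```python
# Approximate relative frequencies of A-Z in English text (indicative values only).
ENGLISH_FREQ = [
    0.082, 0.015, 0.028, 0.043, 0.127, 0.022, 0.020, 0.061, 0.070,
    0.002, 0.008, 0.040, 0.024, 0.067, 0.075, 0.019, 0.001, 0.060,
    0.063, 0.091, 0.028, 0.010, 0.024, 0.002, 0.020, 0.001,
]

def best_caesar_shift(ciphertext):
    """Return the encryption shift whose frequency profile best matches English.

    For the correct shift s, the ciphertext frequency of letter (i + s) mod 26
    should line up with the English frequency of letter i, so each candidate
    shift is scored by the dot product of the two profiles.
    """
    letters = [c for c in ciphertext.upper() if c.isalpha()]
    counts = [0] * 26
    for c in letters:
        counts[ord(c) - ord('A')] += 1
    observed = [n / len(letters) for n in counts]
    def score(shift):
        return sum(ENGLISH_FREQ[i] * observed[(i + shift) % 26] for i in range(26))
    return max(range(26), key=score)
```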

5. Table of common English single, double and triple letters

Table A.1: Common English single letters, digrams and trigrams

Single Letter   Double Letter   Triple Letter

E               TH              THE
T               HE              AND
R               IN              TIO
N               ER              ATI
I               RE              FOR
O               ON              THA
A               AN              TER
S               EN              RES
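
A short sketch of how such digram/trigram counts can be tallied (the sample text and function name are our own):

```python
from collections import Counter

def ngram_counts(text, n):
    """Count n-letter sequences in the text, ignoring non-letters."""
    letters = "".join(c for c in text.upper() if c.isalpha())
    return Counter(letters[i:i + n] for i in range(len(letters) - n + 1))

sample = "the theory of information and the theory of coding"
print(ngram_counts(sample, 2).most_common(3))   # TH and HE top the digram counts
print(ngram_counts(sample, 3).most_common(3))   # THE is the most common trigram
```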

6. Cryptanalysis of a Caesar Cipher

A Caesar cipher can also be analyzed by frequency counts. E.g., given the ciphertext "JXU WHUQJUIJ
TYISELUHO EV CO WUDUHQJYED YI JXQJ Q XKCQD RUYDW SQD QBJUH XYI BYVU
RO QBJUHYDW XYI QJJYJKTUI", we can count the letters and plot them as in Fig. A.2:

Fig. A.2 The distribution of letters in a typical sample of English language


text has a distinctive and predictable shape. A Caesar shift "rotates" this distribution, and
it is possible to determine the shift by examining the resulting frequency graph. {source:
http://en.wikipedia.org/wiki/Caesar_cipher#/media/File:English-slf.png}

Looking at the graph, the A-E-I triple is quite obvious at Q-U-Y;

the HIJ triple would likewise fit RST, and DE then fits NO, though these are less apparent.

This suggests a key of Q (A maps to Q, etc.).

The graph above is a direct plot from a krypto program. Given the key, we decrypt the
message and recover: "THE GREATEST DISCOVERY OF MY GENERATION IS
THAT A HUMAN BEING CAN ALTER HIS LIFE BY ALTERING HIS ATTITUDES"
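
Reusing the hypothetical best_caesar_shift function from the sketch above, the recovery could be scripted as:

```python
ciphertext = ("JXU WHUQJUIJ TYISELUHO EV CO WUDUHQJYED YI JXQJ Q XKCQD "
              "RUYDW SQD QBJUH XYI BYVU RO QBJUHYDW XYI QJJYJKTUI")
shift = best_caesar_shift(ciphertext)            # expected: 16, i.e. key 'Q'
plaintext = "".join(
    chr((ord(c) - ord('A') - shift) % 26 + ord('A')) if c.isalpha() else c
    for c in ciphertext
)
print(chr(ord('A') + shift), "->", plaintext)
```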

Key Equivocation and Unicity Distance

Consider the encryption scheme in Fig. A.3 below. The cryptosystem consists of three
essential components:

Message source

Key generator
Encryption block

The message source is characterized by a random variable M and describes the


statistical properties of the language generated by the source. M is also the set of characters in the alphabet. The
key generator selects the key at random, with uniform probability over the whole set K.
Once selected, the key stays fixed. The encryption block applies a publicly known
algorithm to encode messages into cryptograms under the control of the secret key. The
set of cryptograms is denoted by C.

For n successive messages, the recipient of the cryptosystem obtains the n matching cryptograms
(ciphertexts). An opponent cryptanalyst, who does not know the secret
key but can read the cryptograms, may attempt to:

recover the messages from the cryptograms

recover the secret key

The attacker is assumed to know the statistical properties of the message source, and can
therefore compute the message and key equivocation to discover the set of most
probable messages and keys. Once the attacker has observed n cryptograms, the message
equivocation can be computed as follows:

H(M^n | C^n) = Σ_{c} p(c) Σ_{m} p(m|c) log2(1/p(m|c))

where p(m|c) is the conditional probability that the message sequence m was sent given that the
cryptogram sequence c has been observed, and the sums run over all cryptogram sequences c and
message sequences m of length n.
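
A minimal sketch of this definition for a finite joint distribution, with names and example values of our own; the example is the one-bit version of the system analysed in the next subsection:

```python
import math
from collections import defaultdict

def message_equivocation(joint):
    """H(M|C) from a joint distribution given as {(m, c): p(m, c)}."""
    p_c = defaultdict(float)
    for (_, c), p in joint.items():
        p_c[c] += p
    h = 0.0
    for (_, c), p in joint.items():
        if p > 0:
            p_m_given_c = p / p_c[c]
            h += p * math.log2(1.0 / p_m_given_c)   # p = p(c) * p(m|c)
    return h

# One-bit system: c = m XOR k with a uniform key bit, so p(m, c) = p(m) * 0.5.
# Here C is independent of M, hence H(M|C) = H(M): the cipher hides m completely.
v = 0.75
joint = {(m, c): (v if m == 0 else 1 - v) * 0.5 for m in (0, 1) for c in (0, 1)}
print(message_equivocation(joint))   # ~0.811 = H(M) for v = 0.75
```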

Equivocation of a simple cryptographic system

Consider a cryptographic system as in Fig. A.3 which encrypts a binary message using a
binary key according to the formula:

c = m + k (addition mod 2)

where c ∈ C, m ∈ M and k ∈ K are a ciphertext, a message and a key, respectively, and C = M = K = {0, 1}.


The message source is known to generate message bits with probabilities

p(M=0) = v ; p(M=1) = 1 - v, where 0 ≤ v ≤ 1.

For each communication session, a cryptographic key k is chosen uniformly at random from the
binary elements:

p(k=0) = p(k=1) = 0.5
This allows us to compute the cipher equivocation and estimate the unicity distance.

[Block diagram: the message source M feeds the encryption block E, which outputs the cryptogram C under a key supplied by the key generator.]

Figure A.3 Relationship of message source, key generator and encryption block in a binary
cryptosystem with C = M + K (addition modulo 2)

Suppose this cryptosystem has produced n binary cryptograms, and let O be the event that the
observed cryptogram sequence contains i 0s and n-i 1s. The probability p(O) is given by:

p(O) = p(O, (k=0 or k=1)) = p(O, k=0) + p(O, k=1) = p(O|k=0)p(k=0) + p(O|k=1)p(k=1)

The conditional probability p(O|k=0) is the probability that the corresponding message sequence consists
of i 0s and n-i 1s; conversely, p(O|k=1) is the probability that the message sequence has n-i 0s and i
1s. Consequently,

p(O|k=0) = v^i (1-v)^(n-i)

p(O|k=1) = v^(n-i) (1-v)^i

Hence we conclude:

p(O) = 0.5 (v^i (1-v)^(n-i) + v^(n-i) (1-v)^i)
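
These formulas can be checked numerically; the following sketch (names and parameters our own) also computes the key equivocation H(K|O) implied by Bayes' rule, illustrating how a biased source lets the key leak as n grows, which is the idea behind the unicity distance:

```python
import math

def p_O(v, n, i):
    """p(O) for a specific observed cryptogram sequence with i 0s and n-i 1s."""
    return 0.5 * (v**i * (1 - v)**(n - i) + v**(n - i) * (1 - v)**i)

def key_equivocation(v, n, i):
    """H(K|O): remaining uncertainty about the key bit after observing O."""
    p_k0 = 0.5 * v**i * (1 - v)**(n - i)     # p(O, k=0)
    p_k1 = 0.5 * v**(n - i) * (1 - v)**i     # p(O, k=1)
    total = p_k0 + p_k1
    return sum(p / total * math.log2(total / p) for p in (p_k0, p_k1) if p > 0)

# A biased source (v = 0.9) leaks the key as n grows: for an all-zeros
# cryptogram, H(K|O) falls from ~0.47 bits at n=1 towards 0 at n=20.
for n in (1, 5, 20):
    print(n, round(key_equivocation(0.9, n, n), 4))
```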

Try:

1. Does the cipher become stronger under cipher multiplication?

2. Is M x S = S x M?
3. Show that the unicity distance of the Hill cipher (with a p x p encryption matrix) is < m/RL.

References

1. C. E. Shannon, "Communication Theory of Secrecy Systems," Bell System Technical
Journal, 28 (1949), 656-715.

2. Douglas Stinson, Cryptography: Theory and Practice, 2nd Edition, Chapman &
Hall/CRC.

3. T. M. Cover and J. A. Thomas, Elements of Information Theory, 1st Edition, New York:
Wiley-Interscience, 1991, ISBN 0-471-06259-6; 2nd Edition, New York: Wiley-
Interscience, 2006, ISBN 0-471-24195-4.

4. R. J. McEliece, The Theory of Information and Coding, Addison-Wesley, Reading,
MA, 1977.

5. B. McMillan, "The basic theorems of information theory," Ann. Math. Stat., 24:196-219, 1953.

6. N. Merhav and M. Feder, "A strong version of the redundancy-capacity theorem of
universal coding," IEEE Trans. Inf. Theory, pages 714-722, May 1995.
