
Rylan Schaeffer

ECS153 Spring 2015


Homework 2
1.
a.

            alicerc        bobrc          cyndyrc
Alice       own, execute   read           -
Bob         read           own, execute
Cyndy       read           read, write    own, read, write, execute

b.

            alicerc        bobrc          cyndyrc
Alice       own, execute   read           read
Bob                        own, execute
Cyndy       read           read, write    own, read, write, execute

2.
TODO
The proof of Theorem 1 states that we can omit the delete and destroy commands as they do not affect the ability of a right to leak when no command can test for the absence of rights. Justify this statement. If such tests were allowed, would delete and destroy commands affect the ability of a right to leak?

The key part of Theorem 1 is that the sequence of commands is MINIMAL. This means that after c_k (the final command) is executed, right r has been leaked from the system with initial state s_0. Suppose there is a delete or destroy command in the sequence. Since commands can only test for the presence of a right, if a command fails to activate because the right has previously been deleted, then the command and the delete are unnecessary. This contradicts the claim that the sequence of commands is minimal. Hence, all deletes or destroys can be omitted.

If a test for the absence of rights existed, delete and destroy commands would affect the ability of a right to leak because a subsequent command could check for the absence of a right and then call some other command or operation. Hence, omitting the delete or destroy commands would have an impact.
3.
a. Discretionary. In UNIX systems, users set permissions for which users can access which files.
b. Originator, because this system prohibits memoranda from being distributed without the creator's consent.
c. Mandatory. The policy will be enforced regardless of who approves or disapproves.
d. Discretionary. The faculty member is given permission to see the student's grades, but since the student did not create that information, this is not an originator access policy.
4.
a. Paul cannot read because {A, C} is not a subset of {B, C}. Paul cannot write because TOP SECRET > SECRET.
b. Anna cannot read because {C} is not a subset of {B}. Anna cannot write because {B} is not a subset of {C}.
c. Jesse can read because SECRET > CONFIDENTIAL and {C} is a subset of {C}. Jesse cannot write since SECRET > CONFIDENTIAL.
d. Sammi can read because TOP SECRET > CONFIDENTIAL and because {A} is a subset of {A, C}. Sammi cannot write because TOP SECRET > CONFIDENTIAL.
e. Robin cannot read since UNCLASSIFIED < CONFIDENTIAL. Robin can write since UNCLASSIFIED < CONFIDENTIAL and because {} is a subset of {B}.
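As an added example (not part of the original homework), the five cases above run through a small Bell-LaPadula checker: a subject may read an object only if the subject's label dominates the object's (simple security property), and may write only if the object's label dominates the subject's (*-property). The clearance/classification pairs are reconstructed from the answers; Anna's two levels are assumed equal, since the answer cites only the categories.

```python
# Bell-LaPadula read/write checks for the problem 4 cases.
# Added example, not part of the original homework. Labels are
# reconstructed from the answers above; Anna's levels are assumed equal.
from dataclasses import dataclass

LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}


@dataclass(frozen=True)
class Label:
    level: str
    categories: frozenset

    def dominates(self, other: "Label") -> bool:
        """(L1, C1) dom (L2, C2) iff L1 >= L2 and C2 is a subset of C1."""
        return (LEVELS[self.level] >= LEVELS[other.level]
                and other.categories <= self.categories)


def can_read(subject: Label, obj: Label) -> bool:
    # Simple security property ("no read up"): subject dominates object.
    return subject.dominates(obj)


def can_write(subject: Label, obj: Label) -> bool:
    # *-property ("no write down"): object dominates subject.
    return obj.dominates(subject)


def access(subject: Label, obj: Label) -> str:
    r, w = can_read(subject, obj), can_write(subject, obj)
    return {(True, True): "both", (True, False): "read",
            (False, True): "write", (False, False): "neither"}[(r, w)]


def L(level, *cats):
    return Label(level, frozenset(cats))


# Constructed (subject clearance, document classification) pairs.
CASES = {
    "Paul":  (L("TOP SECRET", "A", "C"), L("SECRET", "B", "C")),
    "Anna":  (L("CONFIDENTIAL", "C"), L("CONFIDENTIAL", "B")),  # assumed levels
    "Jesse": (L("SECRET", "C"), L("CONFIDENTIAL", "C")),
    "Sammi": (L("TOP SECRET", "A", "C"), L("CONFIDENTIAL", "A")),
    "Robin": (L("UNCLASSIFIED"), L("CONFIDENTIAL", "B")),
}


def check_all():
    return {name: access(s, o) for name, (s, o) in CASES.items()}


if __name__ == "__main__":
    for name, result in check_all().items():
        print(f"{name}: {result}")
```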
5.
a. The Principle of Least Common Mechanism states that mechanisms used to access resources should not be shared. Ware's scheme satisfies this principle by prohibiting processes from sharing access to other processes' resources, by requiring that processes use a neutral information-exchange area.
b. Yes. With piping, neither process reaches into the other process, and so the use of pipes satisfies Ware. That said, I think this misunderstands half of Ware's argument. His point isn't solely that information exchange should occur in a neutral place, but also that the two processes should agree to share information. Whether or not Linux OSes do this, I do not know.
Extra Credit
6.
To prove that the set of unsafe systems is recursively enumerable, let S be the set of all protection systems and s_i be the ith system in S. Let M_i be a Turing Machine that simulates s_i, complete with the transitions of system s_i. Now, consider the following matrix:

M_1 M_2 M_3 M_4 M_5 ....
M_1 M_2 M_3 M_4
M_1 M_2 M_3
M_1 M_2
M_1
.
.
.

Starting in the top left corner, and proceeding along the diagonals from the leftmost column to the topmost row, advance each M_i by one transition. As each M_i advances, we can determine if the corresponding s_i is unsafe by whether a right leaks on that transition. If a right does not leak, we cannot correspondingly say that s_i is safe, because it may be the case that we simply have not yet reached the transition that will leak a right. Hence, by this mechanism, we run each machine one transition at a time. When a machine, and therefore a system, is declared unsafe, we add it to our list of unsafe systems. By this method, we may recursively enumerate which protection systems are unsafe.