
Traceless Biometric Technology

Enabling Secure Transactions without Storage of Unique


Biometric Information
BY: Michael (Micha) Shafir, Cofounder and CTO, Innovya R&D

"The computer, with its insatiable appetite for information, its image of infallibility, its
inability to forget anything that has been put into it, may become the heart of a
surveillance system that will turn society into a transparent world in which our home,
our finances, our associations, our mental and physical condition are laid bare to the
most casual observer." (Prof. Arthur Miller. "Statement to Sub-Committee of US Senate on Administrative Practice
and Procedure" March 14th, 1967)

As with many rapidly expanding technologies that affect social life, biometrics has
justifiably come under attack by civil libertarians. Privacy advocates
argue that biometrics will lead to an even deeper erosion of personal privacy in both
the real world and cyber-space. In this paper we study the many privacy concerns
which have emerged following the increase in use and the popularity of biometric
systems for identification and authentication purposes in digital and physical
environments. We will argue that contrary to critics' arguments, Innovya’s traceless
biometrics solution is in fact completely traceless and noninvasive with regard to
personal privacy. Further, we hold that if these new traceless biometric systems are
used in conjunction with existing security mechanisms (such as public-key
algorithms), they can provide almost foolproof protection for electronic transactions
and other operations in smart environments. The key element, however, is that
government intervention, in the form of a set of standards for how the new traceless
biometric solution will be adopted, is an absolute necessity for complete privacy
protection.

Our goal is to demonstrate how traceless non-unique biometric systems can themselves be advocates of privacy. We do so by answering the following questions:
1) How can traceless biometric systems be designed so as not to intrude into
personal data sets? 2) How can government intervention through legislation
guarantee privacy protection of users by adopting and enforcing the new traceless
biometric authentication and identification systems? 3) In the absence of
government regulation, how much reliance can users of biometric systems have on
self-regulation for privacy protection? We start off by examining the authentication
and identification requirements of networked digital environments, as well as the
privacy requirements of such environments. This is followed by a review of how
traceless biometric systems are compatible with privacy requirements. We will close
by looking at how possible regulation of the biometrics industry, both by
government and by the technical community, may affect today's digital
world.

Innovya Research & Development, an Israeli startup, has developed a new Traceless Biometrics Solution that clearly authenticates users’ identity without
requiring the storage of any unique biometric information. Furthermore, the solution
does not need to link, write, or bind any unique information to an external device,
smart card, or network of any kind. The solution’s method is able to positively
recognize and identify biometric identity in real-time without violating the user’s
privacy and without leaving any intrinsic traces. The company was founded in 2006
by Michael (Micha) Shafir and Ronen Blecher, both experienced entrepreneurs from
the network security devices industry. The company owns a revolutionary patented
platform and method for Traceless Biometric Identification.

Innovya is in the process of developing and providing a method for identifying an individual through a biometric identifier that is designed to be non-unique.
Innovya has created an amorphous biometric identifier agent, or ‘BIdToken’ (Non-
unique Biometric Identifier Token), which is designed to be biometrically
traceless, so that an exact image or copy of the biometric information is not
maintained. Instead, the one-directional BIdToken refers to an incomplete identifier
obtained from biometric information, which is non-unique. By ‘incomplete’ we
mean that the biometric information itself cannot be reconstructed from the
BIdToken even with the device that originally allocated the biometric token
identifier. Using this method, the individual has to be present during the
identification process since the (secret) token identifier itself has no true value
except in a particular biometric identification transaction. This is important in order
to avoid an association with recorded values or any other unique characteristic.
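
The following minimal sketch is our own illustration only, not Innovya's patented method: it shows how a deliberately lossy, non-unique token could be derived one-directionally from biometric features and then bound to a single transaction. Every function name and parameter here is hypothetical.

# Illustrative sketch only (assumed design, not Innovya's method): derive a
# deliberately lossy, non-unique token from biometric features, then bind it
# to a single transaction so the bare token has no standalone value.
import hashlib
import os

def non_unique_token(feature_vector: list[float], keep_bits: int = 16) -> int:
    """One-directional and incomplete: the features are coarsely quantized and
    hashed, and most of the hash is discarded, so the original biometric cannot
    be reconstructed and many individuals may map to the same token."""
    quantized = bytes(int(round(x)) & 0xFF for x in feature_vector)  # quantization discards detail
    digest = hashlib.sha256(quantized).digest()                      # one-way transform
    return int.from_bytes(digest, "big") >> (256 - keep_bits)        # keep only a small, non-unique slice

def transaction_proof(feature_vector: list[float], transaction_nonce: bytes) -> bytes:
    """The token is only ever used bound to a fresh per-transaction nonce, so a
    captured value is worthless outside that particular transaction."""
    token = non_unique_token(feature_vector).to_bytes(2, "big")
    return hashlib.sha256(token + transaction_nonce).digest()

# The live measurement must be present for every transaction.
nonce = os.urandom(16)
proof = transaction_proof([12.3, 4.7, 9.1, 0.8], nonce)

Because only a coarse slice of a one-way digest survives, the sketch preserves the two properties emphasized above: the biometric cannot be reconstructed, and the token by itself does not uniquely identify anyone.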

ATM/debit card fraud in the U.S. generated losses of $2.75 billion:


Gartner, Inc., the leading provider of research and analysis on the global information
technology industry, estimates that in the 12 months ending May 2005, ATM/debit
card fraud in the U.S. generated losses of $2.75 billion, with an average loss of
more than $900. Criminals secretly obtain consumer banking account and password
information by online ‘phishing’ and keystroke logging attacks, and armed with this
information, hack into consumers' ATM accounts.
Gartner also claims that "Most of the losses were covered by banks and other
financial institutions that issued the specific ATM/debit cards exploited by thieves."

Systems cannot determine the identity of the actual user:


News stories of Internet privacy threats are commonplace these days. The Internet
was designed as an inherently insecure communications vehicle.
• Hackers have easily penetrated the most secure facilities of the military and
financial institutions.
• Internet companies have designed numerous ways to track Web users as they
travel and shop throughout cyberspace. ‘Cookie’ is no longer a word associated
solely with sweets. It is now associated with cyber-snooping.
• Identity thieves are able to shop online anonymously using the credit-identities
of others.
• Web-based ‘information brokers’ sell sensitive personal data, including Social
Security numbers, relatively cheaply.

A long-time goal of computer scientists, specifically those specializing in Artificial Intelligence, has been to create computer systems that are able to simulate human
intelligence. At the same time, researchers have continually been concerned with
improving the identification and authentication methods used for access to computer
systems and networks. Biometric authentication systems are a natural extension (to
computers) of the recognition methods that humans have used since the beginning
of time. In these systems, physical or behavioral characteristics of the person to be
authenticated determine whether he is indeed who he declared himself to be - this is
analogous to how people recognize each other (i.e. how they identify others and
verify that the person is who he appears to be) by examining physical features that
are essentially unique to the other person, like his face.

Security is a fundamental requirement of any digital environment:


One key security principle that must be included in any security policy of a system
in such an environment is accountability - someone must be responsible for each
action that takes place in the digital space. Accountability therefore, necessitates
identification. Furthermore, the system must be able to verify a user's claim to
Identity X. In other words, identification necessitates authentication.

Knowledge-based authentication is the most commonly used method for verifying a user's identity to a computer system. Indeed, authentication by knowledge has
several advantages: it is easy to implement, users can protect their knowledge -
typically a password - easily, the knowledge is portable, and it can be simply
changed if it is compromised. At the same time however, authentication based on
knowledge of a password is often insufficient in preventing unauthorized access to
computer systems. Password-based authentication systems are vulnerable to offline
dictionary attacks, and exhaustive-search attacks. In an offline dictionary attack, the
attacker will steal a password file which stores a number of encrypted passwords,
and then encrypt each word in a dictionary to see if any of them match the
encrypted password(s) on the file. In an exhaustive-search attack, all possible
passwords of the minimum length are encrypted and compared against the
encrypted password in the system. Another problem with password-based
authentication schemes is that it is difficult for users to come up with strong
passwords. "A good password is easy to remember and hard to guess ... Something
is easy to remember if it is meaningfully related to other things one knows. These
same relationships make it easy to guess." The prevailing techniques of user
authentication, which involve the use of either passwords and user IDs (identifiers),
or identification cards and PINs (Personal Identification Numbers), have several
limitations. Passwords and PINs can be illicitly acquired by direct covert
observation. Once an intruder acquires a user ID and password, the intruder has
total access to the user's resources. In addition, there is no way to positively link the
usage of the system or service to the actual user; that is, there is no protection
against repudiation by the real ID owner. For example, when a user ID and
password is shared with another individual such as a friend, family member or
colleague, the system cannot determine the identity of the actual user, which can be
particularly problematic in case of fraud or other criminal acts, or when payment
may be made.
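
As a concrete illustration of the offline dictionary and exhaustive-search attacks described above, the following sketch uses hashing to stand in for whatever one-way protection the stolen password file employs; all data and names are illustrative.

# Illustrative sketch of the two attacks on a stolen password file.
import hashlib
import itertools
import string

def protect(password: str) -> str:
    return hashlib.sha256(password.encode()).hexdigest()

stolen_file = {"alice": protect("sunshine")}  # attacker's copy of the password file

# Offline dictionary attack: protect every word in a word list and compare.
dictionary = ["password", "letmein", "sunshine", "qwerty"]
for user, stored in stolen_file.items():
    for word in dictionary:
        if protect(word) == stored:
            print(f"dictionary hit: {user} -> {word}")

# Exhaustive-search attack: try every candidate of the minimum length.
def exhaustive_search(stored: str, length: int = 4, alphabet: str = string.ascii_lowercase):
    for candidate in itertools.product(alphabet, repeat=length):
        guess = "".join(candidate)
        if protect(guess) == stored:
            return guess
    return None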

Credit card fraud:

A similar situation arises when a transaction involving a credit card number is
conducted on the Web. Even though the data are sent over the Web using secure
encryption methods, current systems are not capable of assuring that the transaction
was initiated by the rightful owner of the credit card, since both the real owner and
the counterfeiter are using the same transaction initiation process, which is the entry
of a credit card number and expiration date into the payment system. Indeed, for such
transactions even the card itself does not need to be physically present, further
increasing the potential scope of fraud and deceptive use of credit card information.

Biometrics contradictions:
Fortunately, automated biometrics in general and fingerprint technology in
particular, can provide a much more accurate and reliable user authentication
method. There are three classic bases for authentication: (1) something the user
knows (a password), (2) something the user has (a key, a smartcard), (3) something
the user is or does (biometrics). Biometrics is a rapidly advancing field that is
concerned with identifying a person based on his or her physiological or behavioral
characteristics. Examples of automated biometrics include fingerprint, face, iris, and
speech recognition.

However, deploying biometric systems without sufficient attention to their dangers makes them likely to be used in a way that is dangerous to civil liberties because of
the inherent property of biometric data, which is that it forms part of the person.
There are two main phases in biometric authentication. In the enrollment phase, the
user's intrinsic characteristic is measured. This may be a physical characteristic such
as his fingerprint, hand geometry, retina vein configuration, iris pattern, face, or
DNA, or a behavioral characteristic like his voice or signature dynamics. The main
problem is that the data collected in the enrollment phase is then analyzed
to build a unique template. To authenticate a person with identity X, the
characteristic must be measured again in the same manner, and then compared with
the so-called ‘trusted’ stored template. The person is then authenticated depending
on how closely the freshly measured characteristic compares with the retrieved
template. Turning the human body into the ultimate identification card is extremely
dangerous. A fingerprint, a retinal or iris print, a face or other physical information
used for the biometric data are part of the individual. They cannot be changed at all
or can only be changed somewhat. Therefore, if the biometric information is used
abusively and/or is distributed to third parties such as law enforcement agencies for
example, the individual has little or no recourse, and cannot change the situation.
The problem with the biometric enrollment scheme is not merely the collection of
biometrics; it is that the scheme is conceptualized to act as a means of collating
all government data and indexing all significant civil transactions through a central
database. Who will be responsible, or who will compensate, for lost, stolen, and reconstructed
unique biometric characteristics collections? People want to be able to draw a
boundary circle around information about themselves and how they behave. They
feel entitled to the ability to control all that falls inside this circle and they want to
be able to regulate how, to whom, and for what reasons the information within the
circle is disseminated. A life less monitored and less searchable is a life more
private. Many countries are dependent on electronic data storage mechanisms. As
this reliance continues to increase, the question becomes one of safeguarding
electronic information against misuse. There are thousands of databases of less
permanent information about people on computers, often servers connected to the
Internet. Names, addresses, credit card and bank account numbers are just some of
the personally identifying information that is being stored by independent
information traders, including state and federal governments. We all must be aware
that recovering from biometric exposure may take a long period of time (in fact, a
lifetime). Can anybody implant ‘new’ biometrics in case of exposure? Anything
can be faked, but if someone owns your biometrics he practically owns your
identity. Biometric property that is exposed or lost is lost for life.
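
For contrast with the traceless approach, here is a minimal sketch of the conventional template-based enroll-and-match flow criticized above; the feature extractor and threshold are placeholders, not any vendor's algorithm.

# Conventional biometric authentication: a unique template IS stored, which is
# exactly the permanent-record risk discussed in this section.
import math

template_store: dict[str, list[float]] = {}  # persistent store of unique templates

def extract_features(raw_sample: bytes) -> list[float]:
    # Placeholder for real fingerprint/iris/face feature extraction.
    padded = raw_sample.ljust(8, b"\0")[:8]
    return [b / 255 for b in padded]

def enroll(user_id: str, raw_sample: bytes) -> None:
    template_store[user_id] = extract_features(raw_sample)  # enrollment phase

def authenticate(user_id: str, raw_sample: bytes, max_distance: float = 0.1) -> bool:
    # Verification phase: a fresh measurement is compared with the stored template.
    stored = template_store.get(user_id)
    if stored is None:
        return False
    return math.dist(stored, extract_features(raw_sample)) <= max_distance

The decisive difference from the sketch given earlier is the persistent template_store: once populated, it holds unique, lifelong identifiers that cannot be revoked.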

Biometric Technology Not Popular In US ATMs


"In the case of ATMs, the PIN will be here for a long time," said Jim Block,
Diebold's director of global advanced technology. "Part of the reason stems from
the PIN's popularity. Consumers can access ATMs nearly anywhere in the world by
simply inserting their card and punching in a secret four-digit number. Biometrics
are not universally used because there is no standard for storing the data."
Source: ‘Payment News’ Posted: May 6, 2006 at 06:08 AM Pacific

Other forms of identification are less permanent:


Other forms of identification are much less permanent. For example, many if not
most individuals in the modern world have a UserID (such as a user name), one or
more passwords and one or more Personal Identification Numbers (PINs), which are
all different types of information. Since they do not form a permanent part of the
individual, if this information is stolen, it can be changed. Most individuals in the
modern world also have cards, badges and keys, which may be combined with the
above information for accessing one or more resources that require identification
and authentication. For example, an individual typically has an ATM card and knows
an associated PIN. Only the combination of the two items, owning the card
and knowing the PIN, permits the individual to conclude successful
transactions with ATMs. When a PIN, or a PIN plus card, is shared with
another individual such as a friend, family member or colleague, there is no way for
the system to ‘know’ who the actual card owner is. It means that currently there is
no way for the system to know if the previously described items that are defined as
'knowing' and 'having' have been shared willingly, duplicated, lost or stolen. As
described previously, biometrics can be used to overcome these problems but with
severe potential privacy drawbacks.

Privacy and security are not the same:


Roger Clarke of the Faculty of Engineering and Information Technology at the
Australian National University explains privacy as "the interest that individuals
have in sustaining a 'personal space', free from interference by other people and
organizations." Clarke defines several dimensions to this interest. The two that are
most relevant to this White Paper are: 1) Privacy of personal communications.
"Individuals claim an interest in being able to communicate among themselves
using various media without routine monitoring of their communications by other
persons or organizations." 2) Privacy of personal data. "Individuals claim that data
about themselves should not be automatically available to other individuals and
organizations, and that, even where data is possessed by another party, the
individual must be able to exercise a substantial degree of control over that data
and its use." In other words, users of computer systems (especially those in
networked environments) expect that those who store their personal information
will not abuse it. They also expect that wherever their personal information is
stored, it is safe, so that even if a hacker were to succeed in breaking into the computer
or server on which this data is stored, it would be protected. Users also expect to
be able to communicate anonymously. This is especially important for those who
want to criticize the government or an employer without having to worry about
victimization.

Biometrics violates privacy and is harmfully traceable:


In the context of biometrics, privacy is a central issue because any biometric
information about a person necessarily falls within the boundary of the privacy-
circle. Hence, individuals are concerned about how any biometrically identifying
information about them is controlled. Biometric properties from the perspective of
traces or permanent storage can lead to undesired identification and tracing of the
activities of an individual. Even if the biometric data is stored in an altered form
that requires a complex algorithm to decipher, the uniqueness of the biometric
specimen, combined with the speed and computational power available today, makes
any such protection scheme irrelevant.
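
A small illustration of this point (our own example, not drawn from the paper's sources): even when only a hash of the template is stored, that hash is a stable, unique value that can be matched across databases, and a small feature space can be recovered by brute force.

# A hashed template still behaves as a stable unique identifier.
import hashlib
import itertools

def hashed_template(features: tuple[int, ...]) -> str:
    return hashlib.sha256(bytes(features)).hexdigest()

bank_db = {"acct-17": hashed_template((3, 141, 59, 26))}
clinic_db = {"patient-9": hashed_template((3, 141, 59, 26))}

# Linkage: identical hashes reveal that acct-17 and patient-9 are the same person.
same_person = bank_db["acct-17"] == clinic_db["patient-9"]

# Brute force: with enough computing power a small feature space is searchable.
def invert(target: str, value_range: int = 256, length: int = 4):
    for candidate in itertools.product(range(value_range), repeat=length):
        if hashed_template(candidate) == target:
            return candidate
    return None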

Biometrics must not depend on third-party trust:


If unique biometric properties are stored somewhere, for example on a smart card or
on a computer system, even if it is stored in an encoded, scrambled or ciphered
form, it is still a unique biometric identifier. Once a unique biometric identifier has
been stored anywhere, at any time, on any external media (including media that is
associated with the boundaries of the individual, such as a smartcard held by the
individual), the privacy of that biometric property owner is violated. As noted
previously, exposing or losing biometric property is a permanent problem for
the life of the individual, since there is no way to change the physiological or
behavioral characteristics of the individual. Biometric technology is inherently
individuating and interfaces easily to database technology, making privacy
violations easier and more damaging.

Who can you trust?


It may seem that one of the issues that plagues card-based ID systems (the security
or integrity of the card itself) does not apply to biometric systems, because ‘you
are your ID.’ But the question of the reliability of the card is really a question about
trust. In an ID card system, the question is whether the system can trust the card. In
a biometric system, the question is whether the individual can trust the system. If
someone else captures an individual’s physiological signature, fingerprint, or voice
print for instance, abuse by others is difficult to prevent. Any use of biometrics with
a scanner run by someone else involves trusting someone's claim about what the
scanner does and how the captured information will be used.

Vendors and scanner operators may say that they protect privacy in some way,
perhaps by hashing the biometric data or designing the database to enforce a privacy
policy. But the end user typically has no way to verify whether such technical
protections are effective or implemented properly. End users should be able to
verify any such claims, and to leave the system completely if they are not satisfied.
Exiting the system should at least include expunging the end user's biometric data
and records.
Despite these concerns, political pressure for more deployment of biometrics is
increasing. Much U.S. federal attention is devoted to deploying biometrics for
border security. This is an easy sell, because immigrants and foreigners are,
politically speaking, easy targets. But once a system is created, new uses are usually
found for it, and those uses are not likely to stop at the border.

Existing legal framework for privacy protection of personal information:


The U.S. Constitution does not explicitly guarantee a right to privacy. Privacy of
personal data has traditionally been protected in two ways: through self-regulatory
codes and through laws. If one biometrics system were widely adopted, say
fingerprinting, the many databases containing the digitized versions of the prints
could be combined. While such a system is most likely to be developed by the
commercial sector for use in financial transactions, government and law
enforcement authorities would likely want to take advantage of these massive
databases for other purposes, especially if we were to enter a time of social unrest.
Indeed, government agencies and law enforcement are the top subscribers to the
many databases compiled by private sector ‘information brokers’. Privacy laws and
policy in the United States were derived from a code of fair information practices
developed in 1973 by the U.S. Department of Health, Education, and Welfare. This
Code is ‘an organized set of values and standards about personal information
defining the rights of record subjects and the responsibilities of record keepers.’ The
Code highlights five principles of fair information practices:
• There must be no secret personal data record-keeping system.
• There must be a way for individuals to discover what personal information is
recorded about them and how it is used.
• There must be a way for individuals to prevent personal information obtained
for one purpose from being used or made available for other purposes without
their consent.
• There must be a way for individuals to correct or amend information about
themselves.
• Any organization creating, maintaining, using, or disseminating records of
identifiable personal data must assure the reliability of the data for its intended
use and must take precautions to prevent misuse of the data.

Privacy Protection Through Law


1. The Privacy Act of 1974
The first response by the U.S. federal government to the many concerns about
its power to use and misuse personal information was the Privacy Act of
1974. This Act covers federal databases and is based on the Code of Fair
Information Practices defined above. In 1977, a Privacy Protection Study
Commission rejected the idea of having a similar privacy law for the private
sector. This means that individuals' privacy with respect to databases of
information stored and maintained by private organizations is not protected. In
the private sector, total reliance is on the fair information practice codes. This
is a serious problem.

2. Constitutional Provisions
Though there is no clearly defined right to privacy in the U.S. Constitution,
privacy rights are implied in several of the amendments. The right to privacy is
rooted in the 4th Amendment, which protects individuals from unreasonable
search and seizure; the 5th Amendment, which protects individuals from self-
incrimination; and the 14th Amendment, which gives the individual control over
his personal information.

What remains to be determined is the following:


1. Can the biometric information be collected, stored, or retrieved?
2. Can the biometric information collected be used both for criminal and non-
criminal searches and suspicionless searches?
3. Can the system give the individual full control over his abandoned personal
intrinsic information?
The following fact remains: there are no legal restrictions on biometrically
identifying information or biometric authentication systems. However, there are
severe restrictions on collecting, creating, maintaining, using, or disseminating
records of identifiable personal data. One immediate conclusion that we should
draw is that biometrics authentication must be traceless.

The Case against Biometrics


Critics argue that biometric authentication methods present a serious threat to
privacy rights. These arguments have been broken down into three categories:
1) anonymity, 2) tracking and surveillance, 3) data matching and profiling.
Privacy advocates argue that individuals lose their anonymity in any system or
digital environment that uses biometric authentication methods. Many people claim
the option of anonymity in the marketplace (for electronic purchases) and in the
political arena (for voting) as part of their expectation of privacy. Critics of
biometrics feel that if this traceable technology were to gain widespread acceptance
and proliferate further into daily life, then much of our anonymity as we use
different services and move from place to place will fade.
Privacy advocates envision traceable biometrics as being able to foster ‘Big-
Brother’ monitoring of citizens by the State. This idea stems from the fact that
traceable biometric measures can be used as universal identifiers for individuals
because each biometric measure is unique. Consider having a driver's license
with a magnetic strip that stores your fingerprint. One could imagine being pulled
over by a traffic policeman for a trivial traffic violation and being subject to harsh
treatment because, after scanning your fingerprint, the police officer has
access to your entire criminal record and knows all of your past offenses.
Governments have used technology to intrude into the interior of individuals'
privacy-circle. Critics of traceable biometrics argue that there is no reason to expect
that the State will use traceable biometric technologies any differently.
Isolated identifying and non-identifying information in different databases can be
used to create extensive records that profile peoples’ shopping and spending habits.
The biggest danger of traceable biometrics according to privacy advocates, is that
traceable biometric identifiers can be linked to databases of other information that
people do not want dispersed. The threat to privacy arises from “the ability of third
parties to access this data in identifiable form and link it to other information,
resulting in secondary uses of the information, without the consent of the data
subject.” This would be a violation of the Code of Fair Information Practices, since
the individual would no longer have control over the dissemination of his personal
information.
People have generally frowned on biometrics, in particular fingerprints, because of
the long association with criminal identification, and more recently because of its
use in State welfare schemes to prevent recipients from making double claims on
their benefits. The argument is that people are reduced to mere codes and are subject
to inanimate, unjust treatment. A similar argument against the use of biometrics is
that traceable biometric identifiers are an "example of the state's using technology to
reduce individuality." This type of identification corrupts the relationship between
citizen and state because it empowers the state with control over its citizens.
Religious groups argue that traceable biometric authentication methods are “the
mechanism foretold in religious prophecy” (e.g. the Mark of the Beast). Further
religious objections are based on the premise that individuals must give up
themselves, or part of themselves, to a symbol of authority which has no spiritual
significance.
Though there are no documented cases of biometric technologies causing actual
physical harm to users, certain methods are considered invasive. For example,
retina scanning requires the user to place his eye as close as three inches away from
the scanner so that it can capture an image of his retina pattern. Fingerprint
recognition devices, too, are deemed invasive because they require the user to
actually touch a pad.

The Case for Biometrics


Biometrics by itself cannot be blamed for anonymity loss. In today's world the
problem is in the data collection and the intrinsic traces. There are larger social and
technological forces that have caused this. If a single advancement had to be blamed
for the erosion of anonymity, it would have to be the computer. The computer and
computer networks like the Internet make it incredibly easy to collect and store
information about people, and to disperse this information to a large number of
people. The Internet hosts a vast wealth of resources about many people, and the
search capabilities that exist make it relatively simple for adversaries to get personal
information about anyone. The Internet provides many resources for identity theft
(e.g. search engines, genealogy databases). In the physical world, people have
access to others' credit reports, and for a small fee, employers can perform checks
on their employees through services provided by companies like Informus
(http://www.informus.com/) and Infoseekers (http://www.infoseekers.net/). There is
no need for a universal identifier in order to link identifying and non-identifying
information from separate databases. Similarly, there is no need for biometrics
in order for ‘Big-Brother’ surveillance to take place. There are satellites which can
track a person's movements with extreme detail. Video surveillance cameras in
department stores and on the streets, online electronic transactions, and email
sniffing are just three means by which others can keep track of one's digital identity.
John Woodward poses three arguments that establish biometrics as a friend of
privacy. Woodward's first argument is that biometrics protects privacy by
safeguarding identity and integrity. Biometric authentication systems provide very
secure protection against impersonators. Criminals in the real world and cyberspace
commonly exploit weaknesses in token-based and knowledge-based authentication
systems in order to break into an individual's bank account. Using a biometric
identifier for access to systems makes it much more difficult for such compromises
to occur. Second, Woodward argues that biometrics is a friend to privacy because it
can be used to limit access to information. Finally, he proposes that biometrics is a
privacy-enhancing technology. Innovya's traceless biometric algorithms use
biometric characteristics to construct non-unique biometrics with a unique
identifier code that can be reconstructed only with a particular identifier. This means
the person's actual physical characteristics are not stored by the system. These types
of biometric systems can be used to create PINs for users, thus providing a form of
anonymous verification.
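
A hedged sketch of this 'anonymous PIN' idea follows; the names and salts are our own illustration, not Innovya's API. Each service derives a different short code from the same non-unique token, so records held by different services cannot be linked to one another.

# Derive a short, service-specific code from a non-unique token so the verifier
# never sees the biometric itself and different verifiers see different values.
import hashlib

def service_pin(bidtoken: bytes, service_salt: bytes, digits: int = 6) -> str:
    digest = hashlib.sha256(bidtoken + service_salt).digest()
    return f"{int.from_bytes(digest[:4], 'big') % 10**digits:0{digits}d}"

pin_for_bank = service_pin(b"\x1a\x2b", b"bank-salt")
pin_for_shop = service_pin(b"\x1a\x2b", b"shop-salt")  # differs from the bank's PIN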

HOW DO WE MAKE BIOMETRIC SYSTEMS COMPATIBLE WITH PRIVACY CONCERNS?
There are many different forces acting on biometrics, including industry and law.
The only way biometric systems can address privacy concerns is if the two forces
propose and implement a mechanism that simultaneously accomplishes the
following:
• Eliminates intrusiveness into personal data sets.
• Establishes obligations about how biometrics by itself without any harmful
traces can be used and disseminated both in the public and the private sector and
does not stifle traceless technology.
Separate efforts by each of these forces will not work because they conflict. In
industry, engineers want to design biometric systems with lower and lower false
rejection rates. Policy makers are concerned with wider public interests. It would
not be surprising if they were to lay down laws that would rightfully ban the use of
traceable biometric systems (at least in the private sector).
In March 1999, the International Biometric Industry Association (IBIA) announced
a set of principles to protect personal information collected by biometric
authentication systems. In this announcement, the IBIA stressed that it is very
concerned with the issues of privacy and use of personal information. The principles
they propose as guidelines to manufacturers, integrators, customers and users are:
Traceable biometric data is electronic code that is separate and distinct from
personal information, and provides an effective, secure barrier against unauthorized
access to personal information. Beyond this inherent protection, the IBIA
recommends safeguards to ensure that biometric data is not misused to compromise
any information, or released without personal consent or the authority of law.
Traceless biometrics will put an end to this concern.
In the private sector, the IBIA advocates the development of policies that clearly set
forth how biometric data will be collected, stored, accessed and used, to preserve
the rights of individuals and to limit the distribution of data beyond the stated
purposes.
In the public sector, the IBIA believes that clear legal standards should be developed
to carefully define and limit the conditions under which national security and law
enforcement agencies may acquire, access, store, and use biometric data.
In both the private and public sectors, the IBIA advocates the adoption of
appropriate managerial and technical controls to protect the confidentiality and
integrity of databases containing biometric data.

All content copyright © 2006 Innovya R&D Ltd. All rights reserved.
Highly Confidential – Limited circulation only
This is a first step toward privacy protection for users of biometric systems, but it is
lacking. First, it suggests only self-regulation for the private sector. This means
there would be no legal way to punish corporations for misuse of biometric
information. This would leave the current state of affairs as is. It is imperative that
database managers be accountable for how they handle people's information.
Second, it is hard to keep track of who is adhering to these principles and who is
not. There are many companies that do not audit how information is used and
disclosed. Businesses commonly sell information to each other in order to use data
mining algorithms to discover consumer trends, and send them targeted advertising
material. Third, it makes no mention of what sorts of technological solutions can be
used to deal with the privacy problem. Engineers need to come up with different
methods so individuals can have more control over their personal information.
Innovya's Traceless Biometric technology, which uses a non-unique biometric
identifier to create a PIN, is an example of how industry can design safer and more
secure systems.
Government policy-makers and industry need to collaborate to ensure:
• That there are legal prohibitions against the selling, collecting, or exchanging of
biometric identification databases to third parties.
• That there is legislation to ensure that electronic storage of biometric identifiers
will not be carried out in the same manner as companies' other information. It
must not be there in the first place.
• That there are legal prohibitions against the use of peoples’ biometric
characteristics for identification purposes without their consent.
• That there are legal prohibitions against using traceable biometric identifiers for
discriminatory purposes either by law enforcement agencies or the private
sector.
Industry and governments need to set up and fund a research organization (or extend
the research scope of the government-funded Biometric Consortium) to design
traceless biometric authentication systems that fall in the realm of privacy-
enhancing technology. The implications of such collaboration could eliminate the
privacy problems created by security solutions that use biometric identifiers. This
would also provide a model for how to approach the wider privacy issue which is a
consequence of the ubiquitous presence of computers and the wealth of information
available on the Internet.

Conclusion:
The digital evolution that we are witnessing today is leaning ever more strongly
toward smart environments where humans and computers are in symbiosis.

Unless governments establish strict oversight of traceable systems, many innocent individuals are likely to be apprehended. There must be limits on the kinds of uses
that can be made of traceable biometric technologies by government and law
enforcement authorities, as well as clear-cut and expeditious procedures to handle
cases of erroneous identification. Traceless biometric identification and
authentication schemes are the first step toward this, since they cloud the line
between an individual's claim to an identity and his means of verifying this claim.
Privacy advocates worry that sensitive traceable biometric information used for
authentication will provide yet another opportunity for both private and public

All content copyright © 2006 Innovya R&D Ltd. All rights reserved.
Highly Confidential – Limited circulation only
sector information traders to exploit individuals. Their stance is that biometrics will
lead to an even deeper erosion of personal privacy in both the real world and
cyberspace; that it will foster Big-Brother monitoring of citizens by the government;
and that individuals will lose their anonymity whenever they use traceable biometric
devices to authenticate themselves. In the absence of adequate legislation to regulate
how such information is deployed and used, some of the predictions of the critics of
biometrics may well materialize. What is needed is for policy makers (who
represent the ethical interests of individuals) and engineers of biometric systems
(who represent the technological interests of individuals) to collaborate so that a
well-defined legal framework is established within which traceless biometric
technologies can safely operate and advance. Innovya Research and Development has
already begun designing and implementing traceless biometric systems tailored
toward giving the user as much control as possible over his information. It is now
time for policy makers to look more closely into what contributions they can make
to accommodate the privacy interests of individuals.

Innovya’s solution:
When designing a security system, it is best not to make it too powerful. If an
intruder manages to gain access, he has more power over you. If, however, the
security system is simpler, the intruder’s success is more limited. Innovya’s
technology overcomes these disadvantages by using its patented traceless
biometrics for identifying an individual with a biometric identifier that is designed
to be non-unique. Innovya uses an amorphous and non-unique biometric identifier
agent called ‘BIdToken’ (Biometric Identifier Token) that is designed to be
biometrically traceless, so that an image or copy of the biometric information does
not need to be maintained. Instead, the BIdToken refers to an incomplete and non-
unique identifier obtained from the biometric information. By ‘incomplete’ we
mean that the biometric information itself cannot be reconstructed from the
BIdToken, because the necessary information is discarded during processing of the
biometric information.

Representative application – an example:


Innovya’s BIdToken is not stored on any database such as a bank, government, or
any other system. Instead, the user securely provides the BIdToken and thus
maintains control over it. For example, a secured BIdToken can replace the PIN
associated with an ATM card. Only the combination of physically possessing the
ATM card and providing Innovya's Biometric Identifier (BIdToken) permits the
individual to complete a transaction at the ATM. In this situation, when a PIN, or a
PIN and card, is shared with another individual or is stolen, the identity of the
individual using the card can still be determined, allowing only the true owner to use
the card. The method for deriving the BIdToken is kept secure, and therefore an
unauthorized party cannot reverse-engineer the non-unique BIdToken, or the way it
is generated, from the fingerprint or any other unique biometric identifier. However,
a BIdToken can be replaced by another one that is still associated with the real
biometric owner. Innovya's solution removes the need to place trust in third parties
and significantly reduces identity theft.
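
The ATM example above can be sketched as follows; the interfaces are hypothetical placeholders, not Innovya's actual protocol. The transaction requires both physical possession of the card and a live BIdToken supplied by the user, while the bank stores neither the biometric nor the token.

# Hypothetical two-factor check: card possession plus a live, user-supplied BIdToken.
import hashlib
import hmac
import os

def atm_transaction(card_present: bool, live_bidtoken: bytes, session_nonce: bytes):
    """Return a one-time transaction proof, or None if either factor is missing."""
    if not card_present or not live_bidtoken:
        return None
    # The token is combined only with a fresh nonce, so nothing reusable or
    # uniquely identifying needs to be written to the bank's systems.
    return hmac.new(live_bidtoken, session_nonce, hashlib.sha256).digest()

nonce = os.urandom(16)
proof = atm_transaction(card_present=True, live_bidtoken=b"\x07\x3f", session_nonce=nonce)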

Author:
Michael (Micha) Shafir – Cofounder and seasoned entrepreneur (RadWare, MagniFire,
CrossID)
Email: micha@Innovya.com
Direct: +972 54 4837900

All content copyright © 2006 Innovya R&D Ltd. All rights reserved.