
Secure Distributed Deduplication Systems with

Improved Reliability
ABSTRACT
• Data deduplication is a technique for eliminating duplicate
copies of data.
• It is used in cloud storage to reduce storage space and upload
bandwidth.
• Only one copy of each file is stored in the cloud.
• We propose new distributed deduplication systems with higher
reliability, in which the data chunks are distributed across
multiple cloud servers.
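The chunk-distribution idea above can be illustrated with a minimal sketch. Content tags, fixed-size chunking, and round-robin placement are illustrative assumptions here, not the paper's exact scheme (which uses secret sharing across servers); the function names are invented for this example.

```python
import hashlib

def chunk_file(data: bytes, chunk_size: int = 4) -> list[bytes]:
    # Split the file into fixed-size chunks (chunk_size is tiny for illustration).
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def distribute(chunks: list[bytes], n_servers: int) -> dict[int, list[tuple[str, bytes]]]:
    # Assign each chunk, keyed by its SHA-256 content tag, to a server round-robin.
    servers: dict[int, list[tuple[str, bytes]]] = {i: [] for i in range(n_servers)}
    for i, chunk in enumerate(chunks):
        tag = hashlib.sha256(chunk).hexdigest()
        servers[i % n_servers].append((tag, chunk))
    return servers
```

Spreading tagged chunks over several servers is what allows the system to survive the failure of any single storage node.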
EXISTING SYSTEM

• Users keep multiple data copies with the same content, which
leads to data redundancy.
• In existing work, a duplicate file is eliminated based on its file
name; the content of the file is not verified.
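The weakness of name-based duplicate elimination can be shown with a short sketch (the function names and sample files are hypothetical): two files with different names but identical bytes are missed by name comparison but caught by content hashing.

```python
import hashlib

def dedup_by_name(files: dict[str, bytes]) -> int:
    # Filename-based dedup: counts distinct names, so same-content files slip through.
    return len(set(files.keys()))

def dedup_by_content(files: dict[str, bytes]) -> int:
    # Content-based dedup: counts distinct SHA-256 digests of the file bytes.
    return len({hashlib.sha256(data).hexdigest() for data in files.values()})

files = {"report.doc": b"hello", "copy_of_report.doc": b"hello"}
# name-based keeps 2 copies; content-based keeps only 1
```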
DISADVANTAGES OF EXISTING SYSTEM

• No existing work on secure deduplication properly addresses the
reliability and tag consistency problems in distributed storage
systems.
• It leads to data redundancy and data loss.
• Limited storage space.
PROPOSED WORK

• Instead of keeping multiple data copies with the same content,
deduplication eliminates redundant data by keeping only one
physical copy and referring all other redundant copies to it.
• It discovers redundancies between different files and removes
them to reduce capacity demands.
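The "one physical copy plus references" idea above can be sketched as a tiny content-addressed store (the class and attribute names are illustrative assumptions, not the paper's design):

```python
import hashlib

class DedupStore:
    """Minimal sketch: one physical copy per unique content,
    plus a per-file reference pointing at that copy."""

    def __init__(self) -> None:
        self.blocks: dict[str, bytes] = {}  # content tag -> single physical copy
        self.refs: dict[str, str] = {}      # logical file name -> content tag

    def put(self, name: str, data: bytes) -> bool:
        # Store data; return True only if a new physical copy was written.
        tag = hashlib.sha256(data).hexdigest()
        is_new = tag not in self.blocks
        if is_new:
            self.blocks[tag] = data
        self.refs[name] = tag
        return is_new

    def get(self, name: str) -> bytes:
        # Resolve the reference to the single shared physical copy.
        return self.blocks[self.refs[name]]
```

Uploading the same content under two names writes only one physical copy but keeps two references, which is exactly the capacity saving the proposal targets.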
ADVANTAGES OF PROPOSED SYSTEM

• It increases the available storage space in the cloud environment.
• Many users can upload files with efficient bandwidth utilization.
• Confidentiality, reliability, and integrity are achieved in the
proposed system.
SYSTEM SPECIFICATIONS
HARDWARE DESCRIPTION

• Processor : Intel Pentium 4
• RAM : 2 GB
• Hard Disk Drive : 500 GB
• Keyboard : Standard 128 Keys
• Monitor : 19.5" TFT Monitor
• Mouse : Logitech Serial Mouse
SOFTWARE DESCRIPTION

• Operating System : Windows 7
• Front End : Visual Studio 2012
• Back End : SQL Server 2005
MODULES
USER REGISTRATION AND USER LOGIN

• A user who wants to access data stored in the cloud must
register his/her details first.
• These details are maintained in a database.
CLOUD SERVICE PROVIDER

• This is an entity that provides a data storage service in a public
cloud.
• The S-CSP provides the data outsourcing service and stores
data on behalf of the users.
• To reduce the storage cost, the S-CSP eliminates redundant
data via deduplication and keeps only unique data.
FILE UPLOAD/DOWNLOAD

• In this module, the owner uploads the file (along with its
metadata) into the database.
• Using this metadata, the end user can locate and download the
file.
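The upload/download flow above can be sketched as follows. The in-memory dictionaries standing in for the database and the cloud store, and the metadata fields, are illustrative assumptions for this example:

```python
import hashlib

metadata_db: dict[str, dict] = {}   # file name -> metadata record (stand-in for the DB)
cloud_store: dict[str, bytes] = {}  # content tag -> file bytes (stand-in for the S-CSP)

def upload(owner: str, filename: str, data: bytes) -> None:
    # The owner uploads a file; its metadata (owner, size, content tag) goes to the DB.
    tag = hashlib.sha256(data).hexdigest()
    cloud_store.setdefault(tag, data)  # dedup: only unique content is stored
    metadata_db[filename] = {"owner": owner, "size": len(data), "tag": tag}

def download(filename: str) -> bytes:
    # The end user resolves the file via its metadata record, then fetches the content.
    record = metadata_db[filename]
    return cloud_store[record["tag"]]
```

Note that two uploads with identical content share one stored copy, while each keeps its own metadata record.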
