BIG DATA AND THE SP THEORY OF INTELLIGENCE
Presented By:
ATHIRA M RAJ
Roll No: 22
Semester/Branch: S7/CSE
CONTENTS
Introduction
SP Theory of Intelligence
Problems of Big Data
Volume
Efficiency
Transmission
Variety
Veracity
Visualization
A Road Map
Conclusion
References
INTRODUCTION
The SP theory of intelligence can be applied to the management and
analysis of big data:
Overcomes the problem of variety in big data.
Analysis of streaming data addresses velocity.
Economies in the transmission of data.
Veracity in big data.
Visualization of knowledge structures and inferential processes.
SP THEORY OF INTELLIGENCE
The SP theory is conceived as a brain-like system that receives
New information and compresses it to create Old information.
BENEFITS OF SP THEORY
Conceptual simplicity combined with descriptive and
explanatory power across several aspects of intelligence.
Simplification of computing systems, including software.
Deeper insights and better solutions in several areas of
application.
Seamless integration of structures and functions within and
between different areas of application
MULTIPLE ALIGNMENT
The system aims to find multiple alignments that enable a New
pattern to be encoded economically in terms of one or more Old
patterns.
Multiple alignment provides the key to:
Versatility in representing different kinds of knowledge.
Versatility in different kinds of processing in AI and
mainstream computing.
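As a loose illustration (an assumed word-level store and greedy lookup, not the SP machine's actual multiple-alignment search), encoding a New pattern economically in terms of Old patterns can be sketched as replacing each matched element with a short reference into the store:

```python
# Toy sketch of economical encoding against stored Old patterns.
OLD = ["the", "cat", "sat", "on", "mat"]  # assumed store of Old patterns

def encode_new(new_words):
    """Replace each word of the New pattern with the (shorter)
    index of the matching Old pattern."""
    return [OLD.index(w) for w in new_words]

code = encode_new("the cat sat on the mat".split())
print(code)  # [0, 1, 2, 3, 0, 4]
```

Each index is far smaller than the word it stands for, which is the sense in which the New pattern is "encoded economically".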
AN SP MULTIPLE ALIGNMENT
Compression difference:
CD = BN - BE
BN: the total number of bits in those symbols in the New pattern
that are aligned with Old symbols in the alignment.
BE: the total number of bits in the symbols in the code pattern.
Compression ratio:
CR = BN / BE
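With assumed bit counts (the values below are illustrative, not from a real alignment), the two measures can be computed directly:

```python
def compression_metrics(bn, be):
    """CD = BN - BE and CR = BN / BE, where BN is the total bits of
    New symbols aligned with Old symbols, and BE is the total bits
    in the code pattern."""
    return bn - be, bn / be

cd, cr = compression_metrics(bn=120.0, be=30.0)
print(cd, cr)  # 90.0 4.0
```

A positive CD (equivalently, CR > 1) means the alignment achieves real compression of the New pattern.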
BIG DATA
Efficiency
Via reduction in volume
Reducing the size of big data and the size of search terms.
Via probabilities
Cutting out unnecessary searching.
Via a synergy with data-centric computing
Close integration of data and processing.
Transmission Of Information
Since so much of the energy in computing is consumed in moving
data around, we need ways to move data as little as possible.
Veracity
For any body of data, I, the principles of minimum-length
encoding provide the key.
The aim is to minimize the overall size of G and E, where:
G is a distillation or essence of I that excludes most errors
and generalizes beyond I.
E + G is a lossless compression of I, including typos etc.,
but without generalizations.
Systematic distortions remain a problem.
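A minimal sketch of the idea (an assumed toy scheme, not the SP machine's actual encoding): treat a recurring pattern as the grammar G, and the residue with short code symbols as the encoding E, so that E + G reconstructs I while their combined size shrinks.

```python
def encode(I, pattern, code="#"):
    """Treat `pattern` as the grammar G; replacing its occurrences
    in I with a short code symbol yields the encoding E."""
    return pattern, I.replace(pattern, code)

def decode(G, E, code="#"):
    """E + G losslessly reconstructs I, rare 'typo' fragments included."""
    return E.replace(code, G)

I = "the cat sat on the mat the cat ran"
G, E = encode(I, "the cat ")
assert decode(G, E) == I            # lossless: E + G reconstructs I
assert len(G) + len(E) < len(I)     # the overall size of G and E shrinks
```

Fragments that occur only once (such as typos) stay verbatim in E, which is why G alone "excludes most errors" while E + G remains lossless.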
Interpretation of Data
Processing I in conjunction with a pre-established grammar
(G) to create a relatively compact encoding (E) of I
Depending on the nature of I and G, the process of
interpretation may be seen to achieve:
Pattern recognition
Information retrieval
Parsing and production of natural language
Translation from one representation to another
Planning
Problem solving
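The common thread in the processes above is compact encoding against a grammar. As a hedged sketch (the dictionary grammar and word-level lookup below are assumptions, not SP notation), pattern recognition and parsing can both be pictured as looking parts of I up in G and emitting coded forms E:

```python
# Assumed toy grammar G mapping words to syntactic categories.
GRAMMAR = {
    "john": "N", "mary": "N",
    "loves": "V", "sees": "V",
}

def interpret(I):
    """Encode sentence I against the grammar as a sequence of
    (category, word) codes E - a crude stand-in for parsing."""
    return [(GRAMMAR[w], w) for w in I.split()]

E = interpret("john loves mary")
print(E)  # [('N', 'john'), ('V', 'loves'), ('N', 'mary')]
```

Running the same machinery "in reverse" (from categories back to words) would correspond to production rather than parsing.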
Visualizations
The SP system is well suited to visualization for these
reasons:
Transparency in the representation of knowledge.
Transparency in processing.
The system is designed to discover natural structures in
data.
There is clear potential to integrate visualization with the
statistical techniques that lie at the heart of how the SP
system works.
A ROAD MAP
Develop a highly parallel, open-source version of the SP
machine.
CONCLUSION
The SP system, designed to simplify and integrate concepts
across artificial intelligence, mainstream computing, and human
perception and cognition, has potential in the management and
analysis of big data.
It has potential as a universal framework for the representation
and processing of diverse kinds of knowledge (UFK), helping to
reduce the problem of variety in big data: the great diversity
of formalisms and formats for knowledge, and how they are
processed.
REFERENCES
www.cognitionresearch.org/sp.htm
J. G. Wolff, "Big data and the SP theory of intelligence,"
IEEE Access, vol. 2, pp. 301-315, 2014.
International Journal of Computer Engineering and Technology
(IJCET), ISSN 0976-6367, vol. 5, issue 12, December 2014,
pp. 207-213, IAEME.