
BUSINESS INTELLIGENCE: "Business Intelligence is a set of methodologies, processes, architectures, and technologies that transform raw data into meaningful and useful information used to enable more effective strategic, tactical, and operational insights and decision-making." The term business intelligence was probably first used in a 1958 research paper by IBM researcher Hans Peter Luhn. He defined intelligence as "the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal." In 1989, Howard Dresner proposed business intelligence, or BI, as an umbrella term used to describe "concepts and methods to improve business decision-making by using fact-based support systems." In short, business intelligence is about taking all of the data in your organization and presenting it in a way that connects various facts to one another, so that you can drill down through those facts to guide decisions.

TYPES OF KNOWLEDGE: 1. TACIT KNOWLEDGE: The unwritten, unspoken, and hidden vast storehouse of knowledge held by practically every normal human being, based on his or her emotions, experiences, insights, intuition, observations and internalized information. Tacit knowledge is integral to the entirety of a person's consciousness, is acquired largely through association with other people, and requires joint or shared activities to be imparted from one person to another. Like the submerged part of an iceberg, it constitutes the bulk of what one knows, and forms the underlying framework that makes explicit knowledge possible. The concept of tacit knowledge was introduced by the Hungarian philosopher-chemist Michael Polanyi (1891-1976) in his 1966 book 'The Tacit Dimension'. Also called informal knowledge. 2. IMPLICIT KNOWLEDGE: Knowledge that is kept in a person's mind without necessarily being expressed in words and is often acted on instinctively. Implicit knowledge has not yet been codified, but it likely can be. 3. EXPLICIT KNOWLEDGE: Articulated knowledge, expressed and recorded as words, numbers, codes, mathematical and scientific formulae, and musical notations. Explicit knowledge is easy to communicate, store, and distribute, and is the knowledge found in books, on the web, and in other visual and oral media. It is the opposite of tacit knowledge.

MACHINE INTELLIGENCE / ARTIFICIAL INTELLIGENCE: A system that makes it possible for a machine to perform functions similar to those performed by human intelligence, such as learning, reasoning, self-correcting, and adapting. Computer technology produces many instruments and systems that mimic and surpass some human capabilities, such as speed of calculation, correlating, sensing, and deducing. AI has also been defined as "the study and design of intelligent agents", where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success. John McCarthy, who coined the term in 1956, defines it as "the science and engineering of making intelligent machines." INTELLIGENT SYSTEMS: Intelligent systems (IS) provide a standardized methodological approach to solve important and fairly complex problems and obtain consistent and reliable results over time. Here is a useful definition of an Intelligent System: it is a system that learns during its existence (in other words, it senses its environment and learns, for each situation, which action permits it to reach its objectives); it continually acts, mentally and externally, and by acting reaches its objectives more often than pure chance would indicate (normally much more often); and it consumes energy and uses it for its internal processes and in order to act.

What does this definition imply? The system has to exist, and an environment must exist with which the system can interact. It must be able to receive communications from the environment, from which it elaborates the present situation: an abstracted summary of the communications received by the senses. By communications, in turn, we mean an interchange of matter or energy; if this communication is for the purpose of transmitting information, it is a variation in the flow of energy or a specific structuring of matter that the system perceives. The IS has to have an objective, and it has to be able to check whether its last action was favorable, that is, whether it brought the system nearer to its objective or not. To reach its objective it has to select its response; a simple way to select a response is to choose one that was favorable in a similar previous situation. It must be able to learn: since the same response is sometimes favorable and sometimes fails, it has to be able to recall in which situations the response was favorable and in which it was not, and therefore it stores situations, responses, and results. Finally, it must be able to act, i.e. to carry out the selected response (a minimal sketch of this loop is given below).

META KNOWLEDGE: Meta-knowledge is knowledge about a pre-selected body of knowledge. It is a relative concept: if K is knowledge about a domain D, and K' is knowledge about a domain D', then K' is meta-knowledge whenever D' = K. K can be analysed and used from different points of view, so we may have different meta-knowledge related to the same domain. From the systemic perspective, generalized meta-knowledge is domain-independent knowledge that performs or enables operations on any more specific, domain-dependent knowledge in different domains or areas of human activity.
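The Intelligent System loop described above (sense the situation, select the response that was previously favorable, act, and record the result) can be illustrated with a short sketch. The situations, actions, and the test for a "favorable" outcome below are invented purely for illustration; this is not a reference implementation of any particular system.

```python
import random
from collections import defaultdict

class IntelligentSystem:
    """Minimal sketch of the sense -> select -> act -> learn loop described above."""

    def __init__(self, actions):
        self.actions = actions
        # Stored experience: (situation, action) -> [favorable outcomes, attempts]
        self.memory = defaultdict(lambda: [0, 0])

    def select_action(self, situation):
        # Explore occasionally so untried actions still get sampled;
        # otherwise prefer the action that was most often favorable here.
        if random.random() < 0.1:
            return random.choice(self.actions)
        def success_rate(action):
            wins, tries = self.memory[(situation, action)]
            return wins / tries if tries else 0.0
        return max(self.actions, key=success_rate)

    def learn(self, situation, action, favorable):
        wins, tries = self.memory[(situation, action)]
        self.memory[(situation, action)] = [wins + int(favorable), tries + 1]

# Hypothetical environment: the objective is to advance on opportunities and retreat from threats.
system = IntelligentSystem(actions=["advance", "retreat"])
for _ in range(200):
    situation = random.choice(["threat", "opportunity"])
    action = system.select_action(situation)
    favorable = (situation == "opportunity") == (action == "advance")
    system.learn(situation, action, favorable)
```

After a few hundred iterations the stored results make the favorable response for each situation far more likely than chance, which is exactly what the definition above requires.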

KNOWLEDGE MODELLING: STEP 1: FIND AND GET THE RIGHT KNOWLEDGE RESOURCES. STEP 2: ACQUIRE THE KNOWLEDGE (KNOWLEDGE ACQUISITION). STEP 3: REPRESENT AND STORE THE KNOWLEDGE (KNOWLEDGE REPRESENTATION). STEP 4: ACCESS, USE OR REUSE AND APPLY THE KNOWLEDGE (KNOWLEDGE APPLICATION). DSS: DECISION SUPPORT SYSTEM: A decision support system (DSS) is a computer-based information system that supports business or organizational decision-making activities. DSSs serve the management, operations, and planning levels of an organization and help to make decisions, which may be rapidly changing and not easily specified in advance. DSSs include knowledge-based systems. A properly designed DSS is an interactive software-based system intended to help decision makers compile useful information from a combination of raw data, documents, personal knowledge, or business models to identify and solve problems and make decisions. Typical information that a decision support application might gather and present includes: 1. inventories of information assets (including legacy and relational data sources, cubes, data warehouses, and data marts), 2. comparative sales figures between one period and the next, 3. projected revenue figures based on product sales assumptions.

Three fundamental components of a DSS architecture are: 1. the database (or knowledge base), 2. the model (i.e., the decision context and user criteria), and 3. the user interface. The users themselves are also important components of the architecture. DSS components may be classified as:

1. Inputs: Factors, numbers, and characteristics to analyse 2. User Knowledge and Expertise: Inputs requiring manual analysis by the user 3. Outputs: Transformed data from which DSS "decisions" are generated 4. Decisions: Results generated by the DSS based on user criteria EXAMPLE: One example is the clinical decision support system for medical diagnosis. Other examples include a bank loan officer verifying the credit of a loan applicant or an engineering firm that has bids on several projects and wants to know if they can be competitive with their costs.
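The bank-loan example can be made concrete with a toy sketch showing how the four component types above fit together: raw inputs about the applicant, a user-supplied criterion, a transformed output, and a resulting recommendation. The field names, thresholds, and repayment assumption are invented for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    income: float            # inputs: factors and numbers to analyse
    credit_score: int
    requested_amount: float

def loan_decision(applicant: Applicant, max_debt_ratio: float = 0.35) -> dict:
    """Toy DSS: combines raw inputs with a user-supplied criterion (max_debt_ratio)
    and returns an output on which the decision maker can act."""
    annual_repayment = applicant.requested_amount / 10           # assume a 10-year term
    debt_ratio = annual_repayment / applicant.income             # output: transformed data
    approve = applicant.credit_score >= 650 and debt_ratio <= max_debt_ratio
    return {"debt_ratio": round(debt_ratio, 2),
            "recommendation": "approve" if approve else "refer for manual review"}

print(loan_decision(Applicant(income=48_000, credit_score=700, requested_amount=120_000)))
```

Note that the final decision still rests with the loan officer; the system only compiles the inputs into a recommendation, which is the defining characteristic of a DSS rather than an automated decision maker.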

EXPERT SYSTEMS: In artificial intelligence, an expert system is a computer system that emulates the decision-making ability of a human expert. Expert systems are designed to solve complex problems by reasoning about knowledge, like an expert, and not by following the procedure of a developer as is the case in conventional programming. The first expert systems were created in the 1970s and then proliferated in the 1980s. Expert systems were among the first truly successful forms of AI software. An expert system has a unique structure, different from traditional programs. It is divided into two parts, one fixed and independent of the expert system: the inference engine, and one variable: the knowledge base. To run an expert system, the engine reasons about the knowledge base like a human. In the 1980s a third part appeared: a dialog interface to communicate with users. This ability to conduct a conversation with users was later called "conversational". Example: Expert systems are designed to facilitate tasks in the fields of accounting, medicine, process control, financial services, production, and human resources, among others. Typically, the problem area is complex enough that a simpler traditional algorithm cannot provide a proper solution. The foundation of a successful expert system depends on a series of technical procedures and development that may be designed by technicians and related experts. As such, expert systems do not typically provide a definitive answer, but provide probabilistic recommendations.
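The split between a fixed inference engine and a variable knowledge base can be shown with a minimal forward-chaining sketch. The rules and facts below are invented for illustration; a real expert system would hold many expert-authored rules and usually attach certainty factors to them.

```python
# Knowledge base (the variable part): IF-conditions -> THEN-conclusion rules.
rules = [
    ({"fever", "cough"}, "flu_suspected"),
    ({"flu_suspected", "short_of_breath"}, "refer_to_doctor"),
]

def infer(facts, rules):
    """Inference engine (the fixed part): repeatedly fires any rule whose
    conditions are all satisfied, until no new conclusions can be drawn."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"fever", "cough", "short_of_breath"}, rules))
# The derived facts include 'flu_suspected' and 'refer_to_doctor'.
```

Because the engine never changes, domain experts can extend or replace the rule set without touching the reasoning code, which is what distinguishes an expert system from a conventionally programmed solution.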

DATA WAREHOUSING: A Data Warehouse (DW) is simply a consolidation of data from a variety of sources that is designed to support strategic and tactical decision making. Its main purpose is to provide a coherent picture of the business at a point in time. Using various data warehousing toolsets, users are able to run online queries and "mine" their data. A data warehouse is a collection of data designed to support management decision making. Data warehouses contain a wide variety of data that present a coherent picture of business conditions at a single point in time. Development of a data warehouse includes development of systems to extract data from operational systems, plus installation of a warehouse database system that provides managers flexible access to the data.

The term data warehousing generally refers to the combination of many different databases across an entire enterprise. Contrast with data mart.

DATA MINING: Data mining (sometimes called data or knowledge discovery) is the process of analyzing data from different perspectives and summarizing it into useful information - information that can be used to increase revenue, cut costs, or both. Data mining software is one of a number of analytical tools for analyzing data. It allows users to analyze data from many different dimensions or angles, categorize it, and summarize the relationships identified. Technically, data mining is the process of finding correlations or patterns among dozens of fields in large relational databases.

E.g., one Midwest grocery chain used the data mining capacity of Oracle software to analyze local buying patterns. They discovered that when men bought diapers on Thursdays and Saturdays, they also tended to buy beer. Further analysis showed that these shoppers typically did their weekly grocery shopping on Saturdays. On Thursdays, however, they only bought a few items. The retailer concluded that they purchased the beer to have it available for the upcoming weekend. The grocery chain could use this newly discovered information in various ways to increase revenue. For example, they could move the beer display closer to the diaper display. And, they could make sure beer and diapers were sold at full price on Thursdays.
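The diapers-and-beer finding is a classic association-rule pattern. A minimal sketch of how the underlying support and confidence measures would be computed is shown below; the transactions are made-up toy data, not figures from the grocery chain.

```python
# Toy point-of-sale transactions (invented data for illustration).
transactions = [
    {"diapers", "beer", "bread"},
    {"diapers", "beer"},
    {"diapers", "milk"},
    {"beer", "chips"},
    {"bread", "milk"},
]

def support(itemset, transactions):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """How often the consequent appears, given that the antecedent appears."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

print(support({"diapers", "beer"}, transactions))        # 0.4
print(confidence({"diapers"}, {"beer"}, transactions))    # ~0.67: diaper buyers often buy beer
```

A data mining tool essentially runs this kind of calculation over every promising combination of fields or items and reports the rules whose support and confidence exceed the analyst's thresholds.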

Data mining is primarily used today by companies with a strong consumer focus - retail, financial, communication, and marketing organizations. It enables these companies to determine relationships among "internal" factors such as price, product positioning, or staff skills, and "external" factors such as economic indicators, competition, and customer demographics. And, it enables them to determine the impact on sales, customer satisfaction, and corporate profits. Finally, it enables them to "drill down" into summary information to view detail transactional data. With data mining, a retailer could use point-of-sale records of customer purchases to send targeted promotions based on an individual's purchase history. By mining demographic data from comment or warranty cards, the retailer could develop products and promotions to appeal to specific customer segments. For example, Blockbuster Entertainment mines its video rental history database to recommend rentals to individual customers. American Express can suggest products to its cardholders based on analysis of their monthly expenditures. WalMart is pioneering massive data mining to transform its supplier relationships. WalMart captures point-of-sale transactions from over 2,900 stores in 6 countries and

continuously transmits this data to its massive 7.5 terabyte Teradata data warehouse. WalMart allows more than 3,500 suppliers to access data on their products and perform data analyses. These suppliers use this data to identify customer buying patterns at the store display level. They use this information to manage local store inventory and identify new merchandising opportunities. In 1995, WalMart computers processed over 1 million complex data queries. The National Basketball Association (NBA) is exploring a data mining application that can be used in conjunction with image recordings of basketball games. The Advanced Scout software analyses the movements of players to help coaches orchestrate plays and strategies. For example, an analysis of the play-by-play sheet of the game played between the New York Knicks and the Cleveland Cavaliers on January 6, 1995 reveals that when Mark Price played the Guard position, John Williams attempted four jump shots and made each one! Advanced Scout not only finds this pattern, but explains that it is interesting because it differs considerably from the average shooting percentage of 49.30% for the Cavaliers during that game. By using the NBA universal clock, a coach can automatically bring up the video clips showing each of the jump shots attempted by Williams with Price on the floor, without needing to comb through hours of video footage. Those clips show a very successful pick-and-roll play in which Price draws the Knicks' defense and then finds Williams for an open jump shot.

NEURAL NETWORKS: An Artificial Neural Network (ANN) is an information processing paradigm that is inspired by the way biological nervous systems, such as the brain, process information. The key element of this paradigm is the novel structure of the information processing system. It is composed of a large number of highly interconnected processing elements (neurones) working in unison to solve specific problems. ANNs, like people, learn by example. An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process. Learning in biological systems involves adjustments to the synaptic connections that exist between the neurones. This is true of ANNs as well (a single-neuron sketch follows the list of advantages below). Advantages of ANNs include: 1. Adaptive learning: An ability to learn how to do tasks based on the data given for training or initial experience.

2. Self-Organisation: An ANN can create its own organisation or representation of the information it receives during learning time. 3. Real Time Operation: ANN computations may be carried out in parallel, and special hardware devices are being designed and manufactured which take advantage of this capability. 4. Fault Tolerance via Redundant Information Coding: Partial destruction of a network leads to the corresponding degradation of performance. However, some network capabilities may be retained even with major network damage.
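The idea of learning by adjusting connections can be illustrated with a single artificial neuron (a perceptron): each input has a weight, and the weights are nudged whenever the output is wrong. Training it on the logical AND function is only an illustrative choice, not anything specific to the networks described above.

```python
# Minimal single-neuron (perceptron) sketch: the weights play the role of
# synaptic connections and are adjusted from training examples.
training_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # logical AND

weights = [0.0, 0.0]
bias = 0.0
learning_rate = 0.1

for _ in range(20):                                   # sweep over the examples repeatedly
    for inputs, target in training_data:
        activation = sum(w * x for w, x in zip(weights, inputs)) + bias
        output = 1 if activation > 0 else 0
        error = target - output                       # adjust connections only when wrong
        weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]
        bias += learning_rate * error

print(weights, bias)                                  # learned connection strengths
```

Practical networks stack many such neurons in interconnected layers and use more sophisticated training rules, but the principle of iteratively adjusting connection weights from examples is the same.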


WEB 2.0: Web 2.0 is a loosely defined intersection of web application features that facilitate participatory information sharing, interoperability, user-centered design,[1] and collaboration on the World Wide Web. A Web 2.0 site allows users to interact and collaborate with each other in a social media dialogue as creators (prosumers) of user-generated content in a virtual community, in contrast to websites where users (consumers) are limited to the passive viewing of content that was created for them. Examples of Web 2.0 include social networking sites, blogs, wikis, video sharing sites, hosted services, web applications, mashups and folksonomies. The client-side/web browser technologies used in Web 2.0 development are Asynchronous JavaScript and XML (Ajax), Adobe Flash and the Adobe Flex framework, and JavaScript/Ajax frameworks such as YUI Library, Dojo Toolkit, MooTools, jQuery and Prototype JavaScript Framework. Ajax programming uses JavaScript to upload and download new data from the web server without undergoing a full page reload. Web 2.0 can be described in three parts, which are as follows: 1. Rich Internet application (RIA) defines the experience brought from desktop to browser, whether from a graphical or a usability point of view. Some buzzwords related to RIA are Ajax and Flash. 2. Web-oriented architecture (WOA) is a key piece in Web 2.0, which defines how Web 2.0 applications expose their functionality so that other applications can leverage and integrate that functionality, providing a set of much richer applications (examples are: feeds, RSS, web services, mash-ups).

3. Social Web defines how Web 2.0 tends to interact much more with the end user and make the end user an integral part. Web 2.0 websites include the following features and techniques; Andrew McAfee used the acronym SLATES to refer to them: Search: finding information through keyword search. Links: connecting information together into a meaningful information ecosystem using the model of the Web, and providing low-barrier social tools. Authoring: the ability to create and update content leads to the collaborative work of many rather than just a few web authors; in wikis, users may extend, undo and redo each other's work, while in blogs, posts and the comments of individuals build up over time. Tags: categorization of content by users adding "tags" (short, usually one-word descriptions) to facilitate searching, without dependence on pre-made categories; collections of tags created by many users within a single system may be referred to as "folksonomies" (i.e., folk taxonomies). Extensions: software that makes the Web an application platform as well as a document server, such as Adobe Reader, Adobe Flash Player, Microsoft Silverlight, ActiveX, Oracle Java, QuickTime, Windows Media, etc.

Signals: the use of syndication technology such as RSS to notify users of content changes. While SLATES forms the basic framework of Enterprise 2.0, it does not contradict all of the higher level Web 2.0 design patterns and business models. In this way, a new Web 2.0 report from O'Reilly is quite effective and diligent in interweaving the story of Web 2.0 with the specific aspects of Enterprise 2.0. It includes discussions of self-service IT, the long tail of enterprise IT demand, and many other consequences of the Web 2.0 era in the enterprise. The report also makes many sensible recommendations around starting small with pilot projects and measuring results, among a fairly long list.

SEMANTIC WEB: The Semantic Web is a collaborative movement led by the World Wide Web Consortium (W3C) that promotes common formats for data on the World Wide Web. By encouraging the inclusion of semantic content in web pages, the Semantic Web aims at converting the current web of unstructured documents into a "web of data". It builds on the W3C's Resource Description Framework (RDF). According to the W3C, "The Semantic Web provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries." The term was coined by Tim Berners-Lee, the inventor of the World Wide Web and director of the World Wide Web Consortium ("W3C"), which oversees the development of proposed Semantic Web standards. He defines the Semantic Web as "a web of data that can be processed directly and indirectly by machines." While its critics have questioned its feasibility, proponents argue that applications in industry, biology and human sciences research have already proven the validity of the original concept.
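The "web of data" rests on statements expressed as subject-predicate-object triples, the model defined by RDF. A minimal sketch using plain Python tuples rather than a real RDF library follows; the resources and properties are made up purely for illustration.

```python
# RDF-style statements as (subject, predicate, object) triples; the URIs are illustrative.
triples = [
    ("http://example.org/alice", "worksFor",  "http://example.org/acme"),
    ("http://example.org/acme",  "locatedIn", "Pune"),
    ("http://example.org/alice", "knows",     "http://example.org/bob"),
]

def match(triples, subject=None, predicate=None, obj=None):
    """Return every triple matching the pattern; None acts as a wildcard."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Where is the organisation that Alice works for located?
for _, _, org in match(triples, subject="http://example.org/alice", predicate="worksFor"):
    print(match(triples, subject=org, predicate="locatedIn"))
```

Because the data, and not just the documents, is machine-readable, queries like this can be chained across sources that share the same identifiers, which is the sharing and reuse across application, enterprise, and community boundaries that the W3C definition describes.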

What is a Semantic App? First, let's define "Semantic App". A key element is that the apps below all try to determine the meaning of text and other data, and then create connections for users. Another of the founders mentioned below, Nova Spivack of Twine, noted at the Summit that data portability and connectibility are keys to these new semantic apps - i.e. using the Web as platform. There are two main approaches to Semantic Apps: 1) Bottom up: embedding semantic annotations (metadata) directly into the data. 2) Top down: relying on analyzing existing information; the ultimate top-down solution would be a full-blown natural language processor, able to understand text the way people do.
