
Statistics is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data.[1][2] It deals with all aspects of data, including the planning of data collection in terms of the design of surveys and experiments.

Interval measurements have meaningful distances between measurements defined, but the zero value is arbitrary (as in the case with longitude and temperature measurements in Celsius or Fahrenheit). Ordinal measurements have imprecise differences between consecutive values, but have a meaningful order to those values. Nominal measurements have no meaningful rank order among values. The word statistics should not be confused with the word statistic, referring to a quantity (such as a mean or median) calculated from a set of data,[4] whose plural is statistics.
Synopsis: Basic probability concepts and models that are useful for solving engineering problems are introduced. Interpretation of probability, probability distributions, conditional probability, independence, expectation and other fundamental issues are covered with the focus on their applications in the study of industrial systems. Stochastic models that can be used to solve practical problems will be discussed. An introduction to simulation techniques such as the Monte Carlo method together with stochastic modeling is also included.

Statistics rarely give a simple Yes/No type answer to the question asked of them. Interpretation often comes down to the level of statistical significance applied to the numbers, and often refers to the probability of a value accurately rejecting the null hypothesis (sometimes referred to as the p-value).
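To make the idea concrete, here is a minimal simulation sketch in Python (assuming NumPy and SciPy are available; the effect size and sample size are invented for illustration). It shows how a very large study can produce a tiny p-value for a practically negligible effect, a point taken up below:

```python
# Sketch: statistical vs. practical significance under an invented scenario.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1_000_000                                      # very large study (assumed)
control = rng.normal(loc=0.00, scale=1.0, size=n)
treated = rng.normal(loc=0.01, scale=1.0, size=n)  # tiny true effect: 0.01 SD

t_stat, p_value = stats.ttest_ind(treated, control)
print(f"mean difference = {treated.mean() - control.mean():.4f}")
print(f"p-value = {p_value:.2e}")  # far below 0.05, yet the effect is trivial
```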

Referring to statistical significance does not necessarily mean that the overall result is significant in real-world terms. For example, in a large study of a drug it may be shown that the drug has a statistically significant but very small beneficial effect, such that the drug is unlikely to help the patient noticeably. Criticisms arise because the hypothesis-testing approach forces one hypothesis (the null hypothesis) to be "favored," and can also seem to exaggerate the importance of minor differences in large studies. A difference that is highly statistically significant can still be of no practical significance.

Christiaan Huygens (1657) gave the earliest known scientific treatment of the subject.[12] Jakob Bernoulli's Ars Conjectandi (posthumous, 1713) and Abraham de Moivre's Doctrine of Chances (1718) treated the subject as a branch of mathematics.[13] See Ian Hacking's The Emergence of Probability[8] and James Franklin's The Science of Conjecture[full citation needed] for histories of the early development of the very concept of mathematical probability.

The theory of errors may be traced back to Roger Cotes's Opera Miscellanea (posthumous, 1722), but a memoir prepared by Thomas Simpson in 1755 (printed 1756) first applied the theory to the discussion of errors of observation.[citation needed] The reprint (1757) of this memoir lays down the axioms that positive and negative errors are equally probable, and that certain assignable limits define the range of all errors. Simpson also discusses continuous errors and describes a probability curve.

The first two laws of error that were proposed both originated with Pierre-Simon Laplace. The first law was published in 1774 and stated that the frequency of an error could be expressed as an exponential function of the numerical magnitude of the error, disregarding sign. The second law of error was proposed in 1778 by Laplace and stated that the frequency of the error is an exponential function of the square of the error.[14] The second law of error is called the normal distribution or the Gauss law. "It is difficult historically to attribute that law to Gauss, who in spite of his well-known precocity had probably not made this discovery before he was two years old."[14] Daniel Bernoulli (1778) introduced the principle of the maximum product of the probabilities of a system of concurrent errors.
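In modern notation, the two laws can be sketched as follows ($m$ and $h$ are assumed precision constants used here for illustration, not Laplace's original symbols):

$$\varphi(x) = \frac{m}{2}\, e^{-m\lvert x\rvert} \qquad \text{(first law, 1774)}$$

$$\varphi(x) = \frac{h}{\sqrt{\pi}}\, e^{-h^{2} x^{2}} \qquad \text{(second law, 1778: the normal, or Gauss, law)}$$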

Conditional probability is written $P(A \mid B)$,[21] and is read "the probability of A, given B". It is defined by

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}.$$

If $P(B) = 0$, then $P(A \mid B)$ is formally undefined by this expression. However, it is possible to define a conditional probability for some zero-probability events using a $\sigma$-algebra of such events (such as those arising from a continuous random variable).[citation needed] For example, in a bag of 2 red balls and 2 blue balls (4 balls in total), the probability of taking a red ball is $1/2$; however, when taking a second ball, the probability of it being either a red ball or a blue ball depends on the ball previously taken. For example, if a red ball was taken, the probability of picking a red ball again would be $1/3$, since only 1 red and 2 blue balls would have been remaining.
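As a quick check of the $1/3$ figure, here is a minimal Python simulation (standard library only; the function name and trial count are invented for illustration):

```python
# Sketch: estimate P(second red | first red) when drawing without
# replacement from a bag of 2 red and 2 blue balls.
import random

def second_red_given_first_red(trials=100_000):
    hits = total = 0
    for _ in range(trials):
        bag = ["red", "red", "blue", "blue"]
        random.shuffle(bag)                # a random draw order
        first, second = bag[0], bag[1]
        if first == "red":                 # condition on the first draw
            total += 1
            hits += second == "red"
    return hits / total

print(second_red_given_first_red())        # close to 1/3
```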
Inverse probability

In probability theory and applications, Bayes' rule relates the odds of event $A_1$ to event $A_2$, before (prior to) and after (posterior to) conditioning on another event $B$. The odds on $A_1$ to event $A_2$ is simply the ratio of the probabilities of the two events. The rule can be rephrased as posterior is proportional to prior times likelihood,

$$P(A \mid B) \propto P(A)\, P(B \mid A),$$

where the proportionality symbol means that the left-hand side is proportional to (i.e., equals a constant times) the right-hand side as $A$ varies, for fixed or given $B$.
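A minimal numeric sketch of the rule in odds form (all probabilities below are invented for illustration, and $A_1$, $A_2$ are assumed to be exhaustive alternatives):

```python
# Sketch: posterior odds = prior odds * likelihood ratio.
p_a1, p_a2 = 0.01, 0.99                  # prior probabilities (assumed)
p_b_given_a1, p_b_given_a2 = 0.9, 0.05   # likelihoods of evidence B (assumed)

prior_odds = p_a1 / p_a2
likelihood_ratio = p_b_given_a1 / p_b_given_a2
posterior_odds = prior_odds * likelihood_ratio

# Valid because A1 and A2 are assumed to partition the possibilities:
posterior_p_a1 = posterior_odds / (1 + posterior_odds)
print(f"posterior odds on A1 against A2: {posterior_odds:.3f}")
print(f"P(A1 | B) = {posterior_p_a1:.3f}")
```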

[Portrait: Carl Friedrich Gauss]

Adrien-Marie Legendre (1805) developed the method of least squares, and introduced it in his Nouvelles méthodes pour la détermination des orbites des comètes (New Methods for Determining the Orbits of Comets).[citation needed] In ignorance of Legendre's contribution, an Irish-American writer, Robert Adrain, editor of "The Analyst" (1808), first deduced the law of facility of error,

$$\varphi(x) = c\, e^{-h^{2} x^{2}},$$

where $h$ is a constant depending on precision of observation, and $c$ is a scale factor ensuring that the area under the curve equals 1. He gave two proofs, the second being essentially the same as John Herschel's (1850).[citation needed] Gauss gave the first proof that seems to have been known in Europe (the third after Adrain's) in 1809. Further proofs were given by Laplace (1810, 1812), Gauss (1823), James Ivory (1825, 1826), Hagen (1837), Friedrich Bessel (1838), W. F. Donkin (1844, 1856), and Morgan Crofton (1870). Other contributors were Ellis (1844), De Morgan (1864), Glaisher (1872), and Giovanni Schiaparelli (1875). Peters's (1856) formula[clarification needed] for r, the probable error of a single observation, is well known.[to whom?]

In the nineteenth century, authors on the general theory included Laplace, Sylvestre Lacroix (1816), Littrow (1833), Adolphe Quetelet (1853), Richard Dedekind (1860), Helmert (1872), Hermann Laurent (1873), Liagre, Didion, and Karl Pearson. Augustus De Morgan and George Boole improved the exposition of the theory. Andrey Markov introduced[citation needed] the notion of Markov chains (1906), which played an important role in stochastic processes theory and its applications. The modern theory of probability based on measure theory was developed by Andrey Kolmogorov (1931).[citation needed]
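As a short aside (a standard Gaussian-integral computation, not from the source), the unit-area requirement ties the two constants together:

$$\int_{-\infty}^{\infty} c\, e^{-h^{2}x^{2}}\, dx = \frac{c\sqrt{\pi}}{h} = 1 \quad\Longrightarrow\quad c = \frac{h}{\sqrt{\pi}}.$$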

On the geometric side (see integral geometry) contributors to The Educational Times were influential (Miller, Crofton, McColl, Wolstenholme, Watson, and Artemas Martin).[citation needed]
Further information: History of statistics
