
International Journal of Trend in Scientific Research and Development (IJTSRD)
International Open Access Journal
ISSN No: 2456-6470 | www.ijtsrd.com | Volume - 2 | Issue - 5

Statistical Methods: Quantitative Techniques in Business and Industry
Bhavika Madnani
B.B.A., LL.B (Hons), Indore Institute of Law
Indore, Madhya Pradesh, India

ABSTRACT
The researcher has done the research on Statistical Methods - Quantitative Techniques in Business and Industry.

Statistical methods mean the analyzing, collecting, overall summarization and interpretation of numerical data. Statistical techniques can be used in various fields such as economics, the physical sciences and the cultural sciences, and for random phenomena such as radioactivity or meteorological events. This research covers the importance of statistical methods in Business and Industry as well as the Classification of Quantitative Techniques, and Statistical Quantitative Techniques such as:
- Collection of Data
- Measures of Central Tendency, Dispersion, Skewness and Kurtosis
- Correlation and Regression Analysis
- Index Numbers
- Time Series Analysis
- Interpolation and Extrapolation
- Statistical Quality Control
- Ratio Analysis
- Probability Theory
- Testing of Hypothesis

INTRODUCTION
Statistical methods are the mathematical formulas, models, and techniques that are used in analyzing the statistics of raw research data. The application of statistical methods extracts information from research data and provides different ways to assess the robustness of research outputs.

Statistical methods cover the collecting, analyzing, summarization and interpretation of numerical data. Statistical techniques contrast with deterministic techniques, which give exact results under fixed assumptions. Statistical methods can be used in various fields such as economics, the physical sciences, agricultural science and the life sciences; they also have an important role in the physical sciences in the study of measurement errors, of random phenomena such as radioactivity or meteorological events, and in obtaining approximate results where deterministic solutions are hard to apply.

Statistical analysis relates observed statistical data to theoretical models, such as probability distributions or models used in regression analysis. By estimating parameters in the proposed model and testing hypotheses about rival models, one can assess the value of the information collected and the extent to which the information can be applied to similar situations. Statistical prediction is the application of the model thought to be most appropriate, using the estimated values of the parameters.

Coming now to quantitative techniques: these are the techniques which help us in taking decisions in relation to business and industry, and they make use of models, symbols and expressions. Statistics is the science of designing studies or experiments, collecting data and modelling/analyzing data for the purpose of decision making and scientific discovery when the available information is both limited and variable. That is,

@ IJTSRD | Available Online @ www.ijtsrd.com | Volume - 2 | Issue - 5 | Jul-Aug 2018 | Page: 1099
statistics is the science of Learning from Data. Almost everyone—including corporate presidents, marketing representatives, social scientists, engineers, medical researchers, and consumers—deals with data. These data could be in the form of quarterly sales figures, percent increase in juvenile crime, contamination levels in water samples, survival rates for patients undergoing medical therapy, census figures, or information that helps determine which brand of car to purchase. In this text, we approach the study of statistics by considering the four-step process in Learning from Data: (1) defining the problem, (2) collecting the data, (3) summarizing the data, and (4) analyzing the data, interpreting the analyses, and communicating results. Through the use of these four steps in Learning from Data, our study of statistics closely parallels the Scientific Method, which is a set of principles and procedures used by successful scientists in their pursuit of knowledge. The method involves the formulation of research goals, the design of observational studies and/or experiments, the collection of data, the modeling/analyzing of the data in the context of research goals, and the testing of hypotheses. The conclusion of these steps is often the formulation of new research goals for another study. You must remember that for each data set requiring analysis, someone has defined the problem to be examined (Step 1), developed a plan for collecting data to address the problem (Step 2), and summarized the data and prepared the data for analysis (Step 3). Then, following the analysis of the data, the results of the analysis must be interpreted and communicated either verbally or in written form to the intended audience (Step 4).

We can think of many reasons for taking an introductory course in statistics. One reason is that you need to know how to evaluate published numerical facts. Every person is exposed to manufacturers' claims for products; to the results of sociological, consumer, and political polls; and to the published results of scientific research. Many of these results are inferences based on sampling. Some inferences are valid; others are invalid. Some are based on samples of adequate size; others are not. Yet all these published results bear the ring of truth. Some people (particularly statisticians) say that statistics can be made to support almost anything. Others say it is easy to lie with statistics. Both statements are true. It is easy, purposely or unwittingly, to distort the truth by using statistics when presenting the results of sampling to the uninformed. It is thus crucial that you become an informed and critical reader of data-based reports and articles.

A second reason for studying statistics is that your profession or employment may require you to interpret the results of sampling (surveys or experimentation) or to employ statistical methods of analysis to make inferences in your work. For example, practicing physicians receive large amounts of advertising describing the benefits of new drugs. These advertisements frequently display the numerical results of experiments that compare a new drug with an older one. Do such data really imply that the new drug is more effective, or is the observed difference in results due simply to random variation in the experimental measurements? Recent trends in the conduct of court trials indicate an increasing use of probability and statistical inference in evaluating the quality of evidence. The use of statistics in the social, biological, and physical sciences is essential because all these sciences make use of observations of natural phenomena, through sample surveys or experimentation, to develop and test new theories. Statistical methods are employed in business when sample data are used to forecast sales and profit. In addition, they are used in engineering and manufacturing to monitor product quality. The sampling of accounts is a useful tool to assist accountants in conducting audits. Thus, statistics plays an important role in almost all areas of science, business, and industry; persons employed in these areas need to know the basic concepts, strengths, and limitations of statistics.

The development and testing of the Salk vaccine for protection against poliomyelitis (polio) provide an excellent example of how statistics can be used in solving practical problems. Most parents and children growing up before 1954 can recall the panic brought on by the outbreak of polio cases during the summer months. Although relatively few children fell victim to the disease each year, the pattern of outbreak of polio was unpredictable and caused great concern because of the possibility of paralysis or death. The fact that very few of today's youth have even heard of polio demonstrates the great success of the vaccine and the testing program that preceded its release on the market. It is standard practice in establishing the effectiveness of a particular drug product to conduct an experiment (often called a clinical trial) with human participants. For some clinical trials, assignments of participants are made at random, with half receiving the drug product and the other half receiving a solution or tablet that does not contain the medication (called a placebo). One statistical problem concerns the determination of the total number of participants to be included in the clinical trial. This

problem was particularly important in the testing of the Salk vaccine because data from previous years suggested that the incidence rate for polio might be less than 50 cases for every 100,000 children. Hence, a large number of participants had to be included in the clinical trial in order to detect a difference in the incidence rates for those treated with the vaccine and those receiving the placebo.

Libel suits related to consumer products have touched each one of us; you may have been involved as a plaintiff or defendant in a suit, or you may know of someone who was involved in such litigation. Certainly we all help to fund the costs of this litigation indirectly through increased insurance premiums and increased costs of goods. The testimony in libel suits concerning a particular product (automobile, drug product, and so on) frequently leans heavily on the interpretation of data from one or more scientific studies involving the product. This is how and why statistics and statisticians have been pulled into the courtroom. For example, epidemiologists have used statistical concepts applied to data to determine whether there is a statistical "association" between a specific characteristic, such as the leakage in silicone breast implants, and a disease condition, such as an autoimmune disease. An epidemiologist who finds an association should try to determine whether the observed statistical association from the study is due to random variation or whether it reflects an actual association between the characteristic and the disease. Courtroom arguments about the interpretations of these types of associations involve data analyses using statistical concepts as well as a clinical interpretation of the data. Many other examples exist in which statistical models are used in court cases. In salary discrimination cases, a lawsuit is filed claiming that an employer underpays employees on the basis of age, ethnicity, or sex. Statistical models are developed to explain salary differences based on many factors, such as work experience, years of education, and work performance.

These techniques can be grouped in two categories:
- Statistical Techniques; and
- Operations Research (or programming techniques)

In this research I am going to give a brief description mainly of the Statistical Techniques, which include:
- Collection of Data
- Measures of Central Tendency, Dispersion, Skewness and Kurtosis
- Correlation and Regression Analysis
- Index Numbers
- Time Series Analysis
- Interpolation and Extrapolation
- Statistical Quality Control
- Ratio Analysis
- Probability Theory
- Testing of Hypothesis

IMPORTANCE OF STATISTICAL METHODS IN BUSINESS AND INDUSTRY

Role of statistics in BUSINESS and INDUSTRY
Statistics plays a vital and essential role in every businessman's life. A businessman frames his strategies and policies and decides about his programmes keeping in view the graphical readings of people's tastes and preferences, changing culture and much more; the quality of products can also be checked by using statistical methods.

MANAGEMENT
- Marketing:
  - Product selection
  - Analysis of marketing research information
  - Statistical records for building and maintaining an extensive market
  - Competitive strategies
  - Sales forecasting
  - Advertising strategy

- Production:
  - Production planning, control and analysis
  - Allocation of resources
  - Evaluation of machine performance
  - Location of factories and their sizes
  - Quality control requirements
  - Equipment replacement and maintenance
  - Inventory control measures

- Finance, Accounting and Investment:
  - Financial forecasts and budget preparation
  - Financial investment decisions
  - Selection of securities
  - Auditing function
  - Credit policies, credit risk and delinquent accounts
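The "Sales forecasting" use listed above can be given a minimal sketch: a simple moving average smooths recent sales and supplies a naive forecast for the next period. The monthly sales figures and the three-month window below are invented for illustration, not taken from the paper.

```python
# Naive sales forecasting with a simple moving average.
# All figures are hypothetical.

def moving_average_forecast(sales, window=3):
    """Forecast the next period as the mean of the last `window` observations."""
    recent = sales[-window:]
    return sum(recent) / len(recent)

monthly_sales = [120, 130, 125, 140, 150, 145]
forecast = moving_average_forecast(monthly_sales)  # (140 + 150 + 145) / 3 = 145.0
```

A longer window gives a smoother but less responsive forecast; choosing it is a judgment call about how quickly the underlying sales level is believed to change.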

CLASSIFICATION OF QUANTITATIVE TECHNIQUES
Quantitative techniques can be classified into Statistical Techniques (or statistical methods and measures) and Programming Techniques (or Operations Research).

Statistical Techniques:
I. Methods of collecting data
II. Classification and tabulation of collected data
III. Probability theory and Sampling Analysis
IV. Correlation and Regression Analysis
V. Index Numbers
VI. Time Series Analysis
VII. Interpolation and Extrapolation
VIII. Survey Techniques and Methodology
IX. Ratio Analysis
X. Statistical Quality Control
XI. Analysis of Variance
XII. Statistical Inference and Interpretation
XIII. Theory of Attributes
XIV. Curve Fitting and Method of Least Squares

Programming Techniques:
I. Linear Programming
II. Decision Theory
III. Theory of Games
IV. Simulation
V. Monte Carlo Technique
VI. System Simulation
VII. Queuing Theory
VIII. Inventory Planning
IX. Network Analysis / PERT
X. Integrated Production Models

STATISTICAL QUANTITATIVE TECHNIQUES

1. COLLECTION OF DATA
One of the most important steps is the collection of data. Data collection is the process of gathering data, information and any variables of interest in a standardized and well-established manner that enables the researcher to answer the questions posed and evaluate the outcomes of the particular study.

Data collection is concerned with the accurate acquisition of data; although methods may differ depending on the field, the emphasis on ensuring accuracy remains the same. The primary goal of any data collection endeavour is to capture quality data or evidence that easily translates to rich data analysis that may lead to credible and conclusive answers to questions that have been posed.

Accurate data collection is essential to ensure the integrity of the research, regardless of the field of study or data preference (quantitative or qualitative). The selection of appropriate data collection tools and instruments, which may be existing, modified or totally new, and with clearly defined instructions for their proper use, reduces the chances of errors occurring during collection. This is the most integral and essential part of collecting any kind of information in a research prospectus, in any field such as industry, business, the humanities, the social sciences or the physical sciences.

There are basically two methods of collection of data, namely PRIMARY DATA and SECONDARY DATA.

PRIMARY DATA
When we obtain data directly from individuals, objects or processes, we refer to it as primary data. Quantitative or qualitative data can be collected using this approach. Such data is usually collected solely for the research problem you will study. Primary data has several advantages. First, we tailor it to our specific research question, so there are no customizations needed to make the data usable. Second, primary data is reliable because you control how the data is collected and can monitor its quality. Third, by collecting primary data, you spend your resources on collecting only the required data. Finally, primary data is proprietary, so you enjoy advantages over those who cannot access the data.

SECONDARY DATA
When you collect data after another researcher or agency that initially gathered it makes it available, you are gathering secondary data. Examples of secondary data are census data published by the US Census Bureau, stock prices data published by CNN and salaries data published by the Bureau of Labor Statistics.

One advantage of using secondary data is that it will save you time and money, although some data sets require you to pay for access. A second advantage is the relative ease with which you can obtain it. You can easily access secondary data from publications, government agencies, data aggregation websites and blogs. A third advantage is that it eliminates effort

duplication, since you can identify existing data that matches your needs instead of gathering new data.

MEASURES OF CENTRAL TENDENCY, DISPERSION, SKEWNESS AND KURTOSIS
Measures of central tendency are methods used for finding the average of a series, while measures of dispersion are used for finding out the variability in a series. Measures of skewness measure the asymmetry of a distribution, while measures of kurtosis measure the flatness or peakedness of a distribution.

CORRELATION AND REGRESSION ANALYSIS
In this method we will first discuss correlation analysis, which is used to quantify the association between two continuous variables, that is, either between one dependent and one independent variable or between two independent variables.

Two or more variables are said to be interlinked or correlated if a change in one variable results in a change in another variable.

According to Simpson and Kafka, "Correlation analysis attempts to determine the degree of relationship between variables".

Boddingtons states that "Whenever some definite connection exists between two or more groups or classes of series of data, there is said to be correlation."

Regression analysis is a related technique used to assess the relationship between an outcome variable and one or more risk factors or confounding variables. The outcome variable is also called the response or dependent variable, and the risk factors and confounders are called the predictors, or explanatory or independent variables. In regression analysis, the dependent variable is denoted "y" and the independent variables are denoted by "x".

INDEX NUMBERS
Index numbers measure the fluctuations in various phenomena, like price and production, over a period of time; they are described as economic barometers.

An index number is an economic data figure reflecting price or quantity compared with a standard or base value. The base usually equals 100, and the index number is usually expressed as 100 times the ratio to the base value. Index numbers are thus values expressed as a percentage of a single base figure. For example, if annual production of a particular chemical rose by 35%, output in the second year was 135% of that in the first year. In index terms, output in the two years was 100 and 135 respectively.

TIME SERIES ANALYSIS
Analysis of time series helps us to know the effect of the factors which are responsible for changes.

A time series is a series of data points indexed (or listed or graphed) in time order. Most commonly, a time series is a sequence taken at successive equally spaced points in time. Thus it is a sequence of discrete-time data. Examples of time series are heights of ocean tides, counts of sunspots, and the daily closing value of the Dow Jones Industrial Average.

Time series are very frequently plotted via line charts. Time series are used in statistics, signal processing, pattern recognition, econometrics, mathematical finance, weather forecasting, earthquake prediction, electroencephalography, control engineering, astronomy, communications engineering, and largely in any domain of applied science and engineering which involves temporal measurements.

INTERPOLATION AND EXTRAPOLATION
Interpolation is the statistical technique of estimating, under certain assumptions, missing figures which may fall within the range of the given figures. Extrapolation provides estimated figures outside the range of the given data.

Interpolation is an estimation of a value within two known values in a sequence of values. Polynomial interpolation is a method of estimating values between known data points. When graphical data contains a gap, but data is available on either side of the gap or at a few specific points within the gap, interpolation allows us to estimate the values within the gap.
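The gap-estimation idea described above can be sketched with the simplest case, a straight line through two known points; the production figures below are invented for illustration. The same formula interpolates inside the known range and extrapolates outside it.

```python
# Linear interpolation and extrapolation from two known points.
# The data points are hypothetical.

def linear_estimate(x0, y0, x1, y1, x):
    """Estimate y at x from the line through (x0, y0) and (x1, y1).

    If x lies between x0 and x1 this is interpolation; outside that
    range the same formula extrapolates.
    """
    slope = (y1 - y0) / (x1 - x0)
    return y0 + slope * (x - x0)

# Known output for years 1 and 3; year 2 is missing, year 4 lies ahead.
inside = linear_estimate(1, 100, 3, 140, 2)   # interpolation -> 120.0
outside = linear_estimate(1, 100, 3, 140, 4)  # extrapolation -> 160.0
```

As the text notes, extrapolation rests on the stronger assumption that the known trend continues beyond the observed range, so its estimates deserve more caution than interpolated ones.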

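The four kinds of summary measure named in the section above can be computed from the moments of a series. The small data set below is made up for illustration; the skewness and kurtosis formulas are the standard population moment ratios.

```python
# Central tendency, dispersion, skewness and kurtosis via moments.
# The series is hypothetical.

def moments(xs):
    n = len(xs)
    mean = sum(xs) / n                                        # central tendency
    var = sum((x - mean) ** 2 for x in xs) / n                # dispersion
    sd = var ** 0.5
    skew = sum((x - mean) ** 3 for x in xs) / (n * sd ** 3)   # asymmetry
    kurt = sum((x - mean) ** 4 for x in xs) / (n * var ** 2)  # peakedness
    return mean, sd, skew, kurt

mean, sd, skew, kurt = moments([2, 4, 4, 4, 5, 5, 7, 9])
# mean == 5.0 and sd == 2.0 here; a symmetric distribution would give
# skewness near 0, and a normal-shaped one kurtosis near 3.
```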
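The correlation and regression ideas discussed above can be sketched on invented data: Pearson's r measures the degree of relationship, and least squares fits the regression line y = a + b*x. The perfectly linear x/y pairs below are illustrative only.

```python
# Pearson correlation and simple least-squares regression.
# The data are hypothetical (e.g. advertising spend vs. sales).

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def least_squares(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx  # the fitted line passes through the point of means
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]          # exactly y = 1 + 2x, so the fit is perfect
r = pearson_r(xs, ys)          # 1.0 for a perfect positive relation
a, b = least_squares(xs, ys)   # intercept 1.0, slope 2.0
```

On real data r would fall strictly between -1 and 1, and the fitted line would only approximate the observations.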
Extrapolation is an estimation of a value based on extending a known sequence of values or facts beyond the area that is certainly known. In a general sense, to extrapolate is to infer something that is not explicitly stated from existing information.

STATISTICAL QUALITY CONTROL
Statistical quality control is used for ensuring the quality of manufactured items. The variations in quality caused by assignable causes and chance causes can be identified with the help of this tool. Different control charts are used in controlling the quality of products.

Statistical quality control refers to the use of statistical methods in the monitoring and maintaining of the quality of products and services. One method, referred to as acceptance sampling, can be used when a decision must be made to accept or reject a group of parts or items based on the quality found in a sample. A second method, referred to as statistical process control, uses graphical displays known as control charts to determine whether a process should be continued or should be adjusted to achieve the desired quality.

TOOLS USED FOR STATISTICAL QUALITY CONTROL
Every process depends on the gathering and analysis of data, which are profuse in any organization involved in a process problem. The basic tools are:
- Data Collection
- Data Display

Data Collection
A check sheet is useful in assembling and compiling data concerning a problem. It is used in data collection to watch for any unwanted element. The functions of a check sheet are:
I. Production process distribution checks
II. Defective item checks
III. Defect location checks
IV. Defect cause checks
V. Checkup confirmation checks

Data Display
When a company collects data, it is converted into various types of charts and forms for the purpose of display and analysis. The different types of charts include:
I. A bar graph (presents simple data, easily understood)
II. A scatter diagram (relationship between two types of data)
III. A histogram (distribution of data in terms of frequency)
IV. A Pareto diagram (a statistical tool in problem analysis)

RATIO ANALYSIS
Ratio analysis is used for analyzing the financial statements of any business or industrial concern, which helps in taking appropriate decisions.

A ratio analysis is a quantitative analysis of information contained in a company's financial statements. Ratio analysis is used to evaluate various aspects of a company's operating and financial performance such as its efficiency, liquidity, profitability and solvency.

Ratio Analysis is a form of Financial Statement Analysis that is used to obtain a quick indication of a firm's financial performance in several key areas. The ratios are categorized as Short-term Solvency Ratios, Debt Management Ratios, Asset Management Ratios, Profitability Ratios, and Market Value Ratios.

Ratio Analysis as a tool possesses several important features. The data, which are provided by financial statements, are readily available. The computation of ratios facilitates the comparison of firms which differ in size. Ratios can be used to compare a firm's financial performance with industry averages. In addition, ratios can be used in a form of trend analysis to identify areas where performance has improved or deteriorated over time.

Because Ratio Analysis is based upon accounting information, its effectiveness is limited by the distortions which arise in financial statements due to such things as Historical Cost Accounting and inflation. Therefore, Ratio Analysis should only be used as a first step in financial analysis, to obtain a quick indication of a firm's performance and to identify areas which need to be investigated further.

PROBABILITY THEORY
The theory of probability provides numerical values of the likelihood of the occurrence of events. It is a technique used by risk managers for forecasting future events, such as accidental and business losses. This process involves a review of historical loss data to

calculate a probability distribution that can be used to predict future losses. The probability analyst views past losses as a range of outcomes of what might be expected for the future and assumes that the environment will remain fairly stable. This technique is particularly effective for companies that have a large amount of data on past losses and that have experienced stable operations. This type of analysis is contrasted with trend analysis.

Probability bounds analysis (PBA) is a collection of methods of uncertainty propagation for making qualitative and quantitative calculations in the face of uncertainties of various kinds. It is used to project partial information about random variables and other quantities through mathematical expressions.

In the analysis of algorithms, probabilistic analysis of algorithms is an approach to estimating the computational complexity of an algorithm or a computational problem. It starts from an assumption about a probabilistic distribution over the set of all possible inputs. This assumption is then used to design an efficient algorithm or to derive the complexity of a known algorithm.

TESTING OF HYPOTHESIS
Testing of hypothesis is an important statistical tool for judging the reliability of inferences drawn on the basis of sample studies.

A statistical hypothesis, sometimes called confirmatory data analysis, is a hypothesis that is testable on the basis of observing a process that is modelled via a set of random variables. A statistical hypothesis test is a method of statistical inference. Commonly, two statistical data sets are compared, or a data set obtained by sampling is compared against a synthetic data set from an idealized model. A hypothesis is proposed for the statistical relationship between the two data sets, and this is compared as an alternative to an idealized null hypothesis that proposes no relationship between the two data sets.

CONCLUSION
In this research we have outlined the broad areas in which statistical methods take part; their applications extend over the entire realm of science and culture.

Today, in the information age, an immense amount of data is being accumulated in every aspect of society. These data will be useful only when we can pick out effective information from them. In the near future, statistical methods, which aim at putting data and information to practical use, will become increasingly valuable not only in the areas of science and industry, but also in public administration.

Statistical analysis relates observed statistical data to theoretical models, such as probability distributions or models used in regression analysis. By estimating parameters in the proposed model and testing hypotheses about rival models, one can assess the value of the information collected and the extent to which the information can be applied to similar situations. Statistical prediction is the application of the model thought to be most appropriate, using the estimated values of the parameters.
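As a concrete sketch of the hypothesis-testing idea described above, consider comparing recovery proportions in a drug-versus-placebo trial like the one discussed in the introduction. The counts below are invented for illustration; the statistic is the standard two-sample z-test for proportions under the null hypothesis of no difference.

```python
# Two-sample z-test on proportions; all counts are hypothetical.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for H0: both groups share one success probability."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Suppose 60 of 100 participants recover on the new drug vs 45 of 100 on placebo.
z = two_proportion_z(60, 100, 45, 100)
# |z| > 1.96 rejects the null hypothesis at the 5% level (two-sided),
# i.e. the observed difference is unlikely to be random variation alone.
significant = abs(z) > 1.96
```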
