
BY

BENSON NGARE MURIUKI


MIS-3-2621-2/2015

INTRODUCTION
PURPOSE
STATEMENT OF PROBLEM
LITERATURE REVIEW
RESEARCH DESIGN, METHODOLOGY AND ANALYSIS

Cybercrime is defined as unlawful acts in which computers are the objects of the crimes or are used as tools to commit offences.
It refers to criminal activities or crimes that involve the Internet, computer systems, or computer technologies, and is often characterized by identity theft, phishing, and other forms of cybercrime.
It differs from computer crime in the sense that computer crimes encompass crimes committed against the computer, the materials in it such as software and data, and its applications, while cybercrime refers to any criminal activity committed using electronic communication media.

Cybercrime Framework
This is a real or conceptual structure intended to serve as a support or guide for the building of cyber-security that expands the structure into a dependable system (Rouse, 2012).
It is a layered structure indicating the kinds of programs that can or should be built and how they interrelate.
Hence, for the cybercrime framework to be successful, it must adhere to certain standards so as not to infringe the rights of citizens and to be effective (Brenner, 2006).

The research aims to contribute to cybersecurity by enhancing data security within local organizations, both private and public.

There are increasing risks of cybercrime, with greater focus on developing economies like Kenya.
The available frameworks have failed to be effective for a number of reasons.

Various shortcomings:
Signature-based frameworks
Inadequate scalability
Inadequate support for highly distributed environments
Forensic rather than preemptive
Focused on compliance
Intrusion-centric
Difficult to deploy and manage

Cybercrime has been cited in most literature as a major problem of modern times (Bezunartea, 2016; Devi & Rather, 2015).
The dynamics of cybercrime and its effects can have far-reaching consequences for security, businesses and other sectors of life.
A great deal of research has focused on its causes (Mwai, 2015) and on how to prevent cybercrime (Nyawanga, 2015), while other work is concerned with its effects and challenges (Ikiao, 2015; Wekundah, 2015).
Researchers recommend a clear framework for understanding and addressing it.
Some of the frameworks include:
Control Objectives for Information and related Technology (COBIT),
National Institute of Standards and Technology (NIST)
Enhanced Telecom Operations Map (eTOM)
Information Technology Infrastructure Library (ITIL),
ISO/IEC 20000, among others

These theories include:


Cyber-security Information Sharing Theory by D. Inserra and
P. Rosenzweig.
Cyber Terrorism and IR Theory: Realism, Liberalism, and
Constructivism in the New Security Threat by Constantine J.
Petallides
The Willie Sutton Theory of Cyber Security by Alan Cohen

This theory involves:
Administrators explaining what information sharing is and how it works, in order to address real privacy concerns and overcome the lack of trust.
Organizing information sharing so that it flows rapidly, and in both directions, between the government and the private sector.
Providing the private sector with legal, freedom-of-information, and regulatory protections for sharing information.
Broad information sharing to ensure government agencies have the information they need in order to prevent cybercrime and attacks.

This theory involves:
Understanding that the internet is an ungoverned environment, consistent with the realist security model.
How to safeguard networks in an environment where allies cannot be fully trusted.
Finding better methods of storing important data by concentrating on mitigation of data breaches and cyber-attacks.

This involves a three-step process for better segmentation of high-value assets:
Step 1. Comprehensively understand your computer environment
Step 2. Create a segmentation model that ring-fences high-value assets
Step 3. Create a zero-trust model for high-value assets
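As a rough illustration of Steps 2 and 3 only, the sketch below (Python, with hypothetical asset names, zones and policy entries that are not part of the cited theory) ring-fences high-value assets into a restricted segment and applies a default-deny, zero-trust access check.

```python
# Toy sketch of ring-fencing high-value assets and a zero-trust access check.
# Asset names, zones and policy entries are hypothetical examples.

HIGH_VALUE_ASSETS = {"payroll-db", "customer-records"}

# Step 2: segmentation model -- high-value assets are ring-fenced into a
# restricted zone; everything else falls into a general zone.
def assign_zone(asset: str) -> str:
    return "restricted" if asset in HIGH_VALUE_ASSETS else "general"

# Step 3: zero-trust model -- access is denied by default and allowed only for
# an authenticated caller with an explicit policy entry for that exact asset.
ACCESS_POLICY = {
    ("hr-app", "payroll-db"): {"read"},
    ("support-portal", "customer-records"): {"read"},
}

def is_allowed(caller: str, asset: str, action: str, authenticated: bool) -> bool:
    if not authenticated:          # never trust by network location alone
        return False
    return action in ACCESS_POLICY.get((caller, asset), set())   # default deny

print(assign_zone("payroll-db"))                          # restricted
print(is_allowed("hr-app", "payroll-db", "read", True))   # True
print(is_allowed("hr-app", "payroll-db", "write", True))  # False (default deny)
```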

The NIST Framework

The COBIT 5 Framework

The NIST framework is based on US information security law.
It is used to provide documentation that describes the minimum level of requirements for IT security.
One disadvantage is that the NIST framework should be used in conjunction with an in-depth information security program (Ford, 2015).
Another weakness is that the NIST framework lacks focus on financial aspects, which makes it less useful across many organizations.

COBIT 5 segregates IT into four main domains of plan, build, run and monitor, as well as 34 processes.
The strength of the COBIT 5 framework is its heavy linkage to business objectives in conjunction with the IT framework of the organization.
The COBIT 5 framework, however, lacks focus on how to achieve the necessary goals.
It leaves these endeavours to the management team, which can be an opportunity for cybercrime to occur.
It is difficult to implement due to the necessity for all stakeholders to be involved in its creation and management.

A questionnaire survey research design will be used because questionnaires are standardized and can obtain large amounts of data from respondents. They are also cost-effective to administer and yield empirical first-hand data.
The target population for the study will be IT experts in organizations in Kenya.
A purposive sample will be used, composed of 100 IT expert respondents drawn from 25 organizations in Kenya, because purposive sampling targets specific information from expert respondents who are not easy to reach through random sampling.
Both primary and secondary data will be collected in the study. Primary data will be obtained from questionnaire responses, while secondary data will be obtained from sources such as organizational publications, internet sources and reports.
A 5-point Likert scale rating will be used to transform responses into scalable quantities to ease analysis.
Data will be both quantitative and qualitative. Quantitative methods are used because they are standardized, making them effective for comparing research findings with similar studies, and because they allow researchers to summarize vast sources of data. Qualitative methods are reliable and valid because they employ prescribed procedures.
Quantitative data will be analysed using descriptive statistics in SPSS to obtain means, standard deviations, frequencies and so on (a rough illustration of this step is sketched below).
Qualitative data will be analysed using inferential statistics as well as content analysis approaches.
Ethical guidelines of research will be observed. Data will be used solely for research purposes and the necessary consents will be obtained from NACOSTI and other authorities.
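The proposal specifies SPSS for the descriptive analysis; purely as an assumed illustration of the kind of output intended (means, standard deviations and frequency counts for 5-point Likert items), the sketch below uses Python with pandas and entirely hypothetical item names and responses.

```python
# Illustration only: descriptive statistics for 5-point Likert responses.
# The study itself uses SPSS; item names and values here are hypothetical.
import pandas as pd

responses = pd.DataFrame({
    "framework_awareness":   [5, 4, 3, 4, 5, 2, 4],
    "incident_frequency":    [3, 3, 4, 2, 5, 4, 3],
    "control_effectiveness": [2, 3, 3, 4, 2, 3, 4],
})

# Mean and standard deviation for each Likert item
print(responses.agg(["mean", "std"]).round(2))

# Frequency distribution (counts per rating, 1-5) for one item
print(responses["framework_awareness"].value_counts().sort_index())
```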

THANK YOU
