
This article was downloaded by: [HEAL-Link Consortium] On: 8 December 2010. Access details: [subscription number 786636649]. Publisher: Routledge. Informa Ltd, registered in England and Wales, registered number 1072954; registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Educational Media International

Publication details, including instructions for authors and subscription information: http://www.informaworld.com/smpp/title~content=t713698864

Building a tool to help teachers analyse learners' interactions in a networked learning environment

O. Petropoulou (a); I. Altanis (a); S. Retalis (a); C. A. Nicolaou (b); C. Kannas (b); M. Vasiliadou (c); Ireneos Pattis (c)
(a) University of Piraeus, Piraeus, Greece; (b) Noesis Chemoinformatics, Nicosia, Cyprus; (c) INNOVADE LI Ltd, Nicosia, Cyprus
Online publication date: 05 November 2010

To cite this Article: Petropoulou, O., Altanis, I., Retalis, S., Nicolaou, C. A., Kannas, C., Vasiliadou, M. and Pattis, Ireneos (2010) 'Building a tool to help teachers analyse learners' interactions in a networked learning environment', Educational Media International, 47: 3, 231-246.
To link to this Article: DOI: 10.1080/09523987.2010.518815
URL: http://dx.doi.org/10.1080/09523987.2010.518815




Educational Media International, Vol. 47, No. 3, September 2010, 231-246

Building a tool to help teachers analyse learners' interactions in a networked learning environment
O. Petropoulou (a), I. Altanis (a), S. Retalis (a)*, C.A. Nicolaou (b), C. Kannas (b), M. Vasiliadou (c) and Ireneos Pattis (c)

(a) University of Piraeus, Piraeus, Greece; (b) Noesis Chemoinformatics, Nicosia, Cyprus; (c) INNOVADE LI Ltd, Nicosia, Cyprus

(Received 12 March 2010; final version received 12 August 2010)




Educators participating in networked learning communities have very little support from integrated tools in evaluating students' learning activity flow and examining learners' online behaviour. There is a need for non-intrusive and automated ways to monitor learners' progress in order better to follow their learning process and appraise the online course's effectiveness. This paper presents a conceptual framework and an innovative tool, called LMSAnalytics, that allows teachers and evaluators easily to track learners' online behaviour, make judgements about learners' activity flow and gain a better insight into the knowledge constructed and skills acquired in a networked learning environment.

*Corresponding author: retal@unipi.gr
ISSN 0952-3987 print/ISSN 1469-5790 online © 2010 International Council for Educational Media
DOI: 10.1080/09523987.2010.518815
http://www.informaworld.com

Keywords: learner interaction analysis; evaluation of learning process; networked learning environment


Evaluating learners' activities in a networked learning environment

According to modern pedagogical theories, learning occurs not only as a result of learners' direct participation in learning tasks, but also through legitimate peripheral participation in communities (Goodyear, 2005), in which implicit and explicit knowledge is acquired from the community (Brown & Duguid, 2000). In the era of networked learning, the network component plays an important role in the formation of learning communities, since it promotes and facilitates collaborative and cooperative connections: between one learner and other learners; between learners and tutors; and between learners and learning resources, "so that learners and tutors can extend and develop their understanding and capabilities in ways that are important to them, and over which they have significant control" (Steeples, Jones, & Goodyear, 2002). Networked learning environments (NLEs) provide socially situated learner support through the active processes of dialogue, collaboration and shared knowledge construction that drive learning in social settings. Creating NLEs leads to an array of benefits, such as:

- opportunities for participants to share their knowledge and expertise;
- opportunities for participants to discuss, plan, reflect on and explore learning issues;
- increased inspiration, innovation and motivation amongst participants;
- increased social contact between individuals from differing backgrounds;
- a reduction in feelings of isolation (both geographically and emotionally);
- increased access to shared resources.

In an NLE, where individual and collective actions take place, educators face great difficulties in evaluating the broad spectrum of interactions among the interacting


participants, such as learner–learner, learner–teacher, learner–content and learner–technology (Vrasidas, 2002). It becomes difficult and time consuming for educators to thoroughly capture, track and assess the various interactive learning activities performed by all learners. There is a need to design specific tools, based on well-grounded conceptual frameworks, for analysing the grid of all these interactions, since evaluation is a multifaceted and complex process for educators (Dimitracopoulou et al., 2006b; Marcos, Martínez, & Dimitriadis, 2005). Our paper discusses the theoretical framework, and findings from pilot implementations, of an interaction analysis tool called LMSAnalytics. LMSAnalytics identifies specific indicators that can help educators analyse and evaluate the multiple dimensions of interaction that develop in an asynchronous NLE. The tool was built for the automatic analysis and visualization of data collected during the networked collaborative learning process. LMSAnalytics is expected to be useful for educators who need to assess the performance of both individuals and groups in an NLE. One of the innovative features of LMSAnalytics is that it inter-exchanges data with the Moodle learning management system (LMS), the most popular open source application of this type. Thus, it can be used by teachers to analyse data on asynchronous networked learning interactions that occur within Moodle. The analysed data can also be exported in appropriate formats so that it can be used as input to Microsoft Excel, the SPSS statistical package, or other tools such as the NetDraw software for further social network analysis and WEKA for advanced data mining. Another innovative feature of the LMSAnalytics tool is that it guides the teacher in performing specific analyses of the collected data according to the teaching strategy followed in his/her course.
For example, if a teacher designs the learning activities using the Think–Pair–Share (TPS) learning strategy (Palloff & Pratt, 1999), LMSAnalytics will automatically create the most appropriate statistical tables and diagrams based on specific indicators that fit this strategy. In this way, a teacher can save time and resources and be guided on how to collect quantitative and qualitative data about the participation, interactions and collaboration among learners. These data can be fed into a rubric for assessing learners' performance. In this paper, we first discuss the current state of the art in networked learning analytics. We also discuss the interaction analysis conceptual framework on which the LMSAnalytics tool is based. Then we present LMSAnalytics via an example of its application in a learning scenario of collaborative problem solving according to the TPS strategy. Using this example, 28 teachers were asked to play the role of the evaluator using the LMSAnalytics tool and evaluate its usability. The results from this evaluation case study are presented. Finally, concluding remarks and future research plans are outlined.

Towards networked learning analytics

Networked learning is much more ambitious than previous approaches to using technology in education (Goodyear, 2005). The added value of networked technology is that it enables the enrichment of the learning paradigm in order to:


- support open, flexible and learner-centred patterns of study;
- provide new ways for learners to work collaboratively;


- facilitate various forms of interaction: learner–learner, learner–content, learner–instructor;
- promote authentic learning and the acquisition of higher-order thinking skills, problem-solving abilities, and the like.

The evaluation process, which is based on the analysis of students' interactions and online behaviour, is a difficult task in an NLE. Actually, it is an open research issue that has attracted the attention of many research and development groups (Mazza & Dimitrova, 2005; Padilha, Almeida, & Alves, 2004; Ryberg & Larsen, 2006; TELL, 2005; Vrasidas & McIsaac, 2000). Bates and Hardy (2004) emphasized that information from student monitoring could be a valuable input for evaluating the effectiveness and quality of e-learning materials and the instructional model. The driving force for our research is the fact that the evaluators and teachers of a networked learning course have very little support from integrated tools to evaluate learners' activities and identify learners' online behaviour and interactions (Mazza & Botturi, 2007; Mazza, 2009). As a consequence, they are in need of non-intrusive and automated ways to monitor learners' progress in order to better understand their learning process and appraise the online course's effectiveness. In other words, they need educational data mining tools or networked learning analytics tools (Baker & Yacef, 2009). Input for designing these services can benefit from research in the area of the so-called web analytics tools (Bradley, 2007). These software tools were originally developed for the purpose of web site traffic analysis. Web analytics processes a variety of data and sources (mainly the web server log file and historical data of visits to the web server) in order to evaluate web site performance and popularity, and visitors' behaviour and interaction patterns, at both an individual and an aggregate level (Pierrakos, Paliouras, Papatheodorou, & Spyropoulos, 2003). Processing of the available information is performed using statistical analysis and data mining methods in order to extract knowledge in the form of patterns, associations and correlations from the raw data.
However, Baker (in press) mentions that educational data mining methods and tools differ from web analytics methods, since issues such as student knowledge level and context play important roles in the analysis of educational data. Thus, it is widely believed appropriate to develop custom tools that employ both appropriate pre-processing of learning management system educational data, with encoding into meaningful descriptors, and selected data mining methods targeting the discovery of educationally meaningful knowledge. Networked learning analytics is a specific aspect of the educational data mining area, and can be defined as the process of studying and analysing in depth students' behaviour within an NLE and extracting meaningful interaction patterns, i.e., patterns that concern learner–content interactions, learner–instructor interactions and learner–learner interactions, as well as identifying trends in students' online behaviours (Petropoulou, Lazakidou, Retalis, & Vrasidas, 2007). In order to analyse and interpret the data about students' interactions in an NLE, several research groups have developed coding schemes and analysis techniques that categorize interactions according to models of knowledge construction and skills acquisition, in an effort to perform interaction analysis more efficiently. Gunawardena, Lowe and Anderson (1997) developed a model and coding scheme for online interaction among peers with five phases of knowledge construction:



(1) sharing/comparing of information;
(2) discovery and exploration of dissonance or inconsistency among ideas, concepts, or statements;
(3) negotiation of meaning/co-construction of knowledge;
(4) testing and modification of proposed synthesis or co-construction; and
(5) agreement statement(s)/applications of newly constructed meaning.

Vrasidas (2002) developed a working typology of intentions driving interaction in online and blended learning environments. Examples of intentions include collaboration, discussion, evaluation, gaining status, providing support, sharing information and socializing. Shute and Glaser (1990) propose a technique that enables the evaluator to derive global learner differences on the basis of learner interaction measures. The approach by Shute and Glaser can be summarized as involving: (1) counting frequencies of actions, (2) categorizing them into meaningful units, and (3) making comparisons across groups. Gaßner, Jansen, Harrer, Herrmann, and Hoppe (2003) illustrate how log files can be captured, codified and analysed to provide statistics of activity patterns, such as cooperation and turn taking, in a synchronous CSCL environment. The MatchMaker TNG tool offers a framework for analysing activities that occur via a synchronous shared-desktop collaborative learning system. For example, when a student adds a node in a graph and another student connects this node to another one with an edge, this might indicate collaboration between these two students. An analysis method like the one that appears in Mühlenbrock (2004) can help evaluators identify collaborative activity patterns. Another type of interaction analysis is related to the calculation of the number of messages read, postings to a discussion board, file uploads, annotations to the uploaded files, etc.
For example, the DIAS tool (Bratitsis & Dimitracopoulou, 2005) offers 64 indicators, which can be used for the examination of quantitative data about the interactions in an asynchronous collaborative NLE. Apart from quantitative measurements, analysis of participants' postings (content analysis) should also be performed, in order to reveal many of the behaviours associated with collaborative learning situations (Curtis & Lawson, 2001). Nowadays, LMSs are widely used by educational and training organizations. However, very few interaction analysis techniques and tools have been developed for them. LMSs allow authorized users (teachers and administrators) to view some data about the online actions performed by students. This data is stored in the tables of the LMS's database. However, this data is very poor and cannot easily be combined with interaction analysis techniques, as happens in the field of computer supported collaborative learning (CSCL) environments. For example, in synchronous CSCL environments, interaction analysis can be performed using action-based approaches/frameworks such as the activity recognition approach (Barros & Verdejo, 1999; Mühlenbrock, 2004) and the OCAF framework (Avouris, Dimitracopoulou, & Komis, 2003), to name a few. These approaches/frameworks have been supported by specialized analysis tools, like CoLAT, that collect the actions of users in a collaborative learning environment and show different indexes of collaboration (Koutri, Avouris, & Daskalaki, 2004). The main advantage of these tools is that they are efficient in providing feedback to their users (learners, teachers or evaluators). These tools are very valuable and their philosophy needs to be adopted by LMSs, since teachers need rich information that will come out of the analysis of interactions using sophisticated interaction analysis techniques combined with conceptual modelling approaches similar to the ones that have started to appear in the area of CSCL.
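The counting-style indicators discussed above (messages read, postings, file uploads and the like) amount to tallying raw event records per learner and then comparing across categories or groups. A minimal sketch, with invented learners, events and category names:

```python
from collections import Counter

# Hypothetical raw event log: (learner, action) pairs such as an LMS
# might record. The data and category names are illustrative only.
events = [
    ("alice", "post"), ("alice", "read"), ("bob", "read"),
    ("alice", "upload"), ("bob", "post"), ("bob", "read"),
]

# Count the frequency of each action category per learner.
per_learner = {}
for learner, action in events:
    per_learner.setdefault(learner, Counter())[action] += 1

print(per_learner["alice"]["read"])  # 1
print(per_learner["bob"]["read"])    # 2
```

Real tools such as DIAS derive many more indicators, but most of the quantitative ones reduce to frequency counts of this kind, sliced by learner, group, or time period.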
An overview of interaction analysis techniques and tools utilized in




CSCL can be found on the web site of the Kaleidoscope project (http://www.noe-kaleidoscope.org). However, these techniques are mainly focused on the use of CSCL tools for knowledge building and they do not embrace holistically the whole interaction process within an NLE. Our thinking about networked learning analytics concerns a more holistic view of learners' interactions within an NLE, focusing not only on social and personal interactions among learners and tutors but also on the flow of learning activities that concern learners' interaction with online learning resources (Moore, 1989). With the proposed approach, we try to give a good insight into the three design components of an NLE identified by Goodyear (2005): (1) the tasks set for students, which influence their actual learning activity; (2) the (social) organizational forms put in place for them, out of which they develop more or less convivial learning relationships; and (3) the digital resources, tools, artefacts, etc. that we make available to students, which are used by them to customise or fit out their individual learnplaces. Thus, the purpose of our project was to develop and test the LMSAnalytics tool, which is based on a specific conceptual framework for discovering information concerning learners' navigational behaviour and extracting meaningful patterns that can be used for assessment purposes using data-driven analytical methods. Our overall aim is to make teachers better able to understand what learners are doing, alone or in groups, and to classify learners according to their flow of activities in order to facilitate the evaluation of the lesson taught.

A conceptual framework for assessing learners' behaviour in an NLE

According to Dillenbourg (1999), the key to understanding collaborative learning is to gain an understanding of the wealth of interactions among the individuals.
This is why various indicators, and specific tools that can analyse the grid of all these interactions, have recently been proposed. Interaction analysis indicators deal with: (1) the process of the activity (individual, group or community), (2) the interaction product, (3) the quality of collaboration, and (4) the formed social context (Dimitracopoulou et al., 2006a; Vrasidas & Glass, 2002). The associated interaction analysis tools either inform learners about their learning progress (for self-regulation purposes) or help instructors/researchers evaluate and assess the collaborative learning process and products. Various techniques have appeared in the literature for evaluating the collaborative learning process and products, and several overviews of such techniques can be found there (e.g., Daradoumis, Martínez, & Xhafa, 2006; Dimitracopoulou et al., 2006b; Vrasidas & Glass, 2002). The evaluation of collaborative learning has to be performed on at least two levels, separating the process (or group functioning) from the product (or task performance) of collaboration (Collazos, Guerrero, Pino, & Ochoa, 2003; Daradoumis, Xhafa, & Marques, 2003; Häkkinen, Järvelä, & Mäkitalo, 2003; MacDonald, 2003). Based on this trend, we have developed a multi-faceted framework to study learners' behaviour in an NLE by making use of descriptive statistics, social network analysis (SNA), and content and context analysis (through coding teaching and learning activities), as a way to find out what learners are talking about, and why they are talking as they do. These methods are used to triangulate and contextualize our findings (De Laat, Lally, Lipponen, & Simons, 2006). Our proposed conceptual framework consists of two axes that measure student performance in an NLE: (1) the quality of learning products and (2) the quality of the


collaboration, related to the volume and quality of interactions in an NLE. The proposed framework tries to analyse interaction holistically, thus covering the four types of interaction: learner–content, instructor–learner, learner–technology and learner–learner (Vrasidas & Glass, 2002). The first axis concerns all deliverables of individual or group action (e.g., learners' assignments). Both quantitative and qualitative indicators about the quality of learning products should be accounted for, such as:


- grading of learners' ongoing and final learning products (e.g., final reports, tests, exercises, quizzes);
- individuals' and groups' overall performance in specific tasks (e.g., a group's average score);
- number of steps performed in a multi-step exercise (e.g., number of correct, wrong, or incomplete steps);
- ratio of correct to incorrect steps per session, correlated with task difficulty.
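As an illustration of the last two indicators in the list, the step counts and their ratio fall out of a single pass over one session's logged step outcomes. The outcome labels below are invented sample data, not the tool's actual encoding:

```python
# One session's step outcomes for a multi-step exercise (invented data).
steps = ["correct", "correct", "wrong", "correct", "incomplete", "wrong"]

correct = steps.count("correct")
wrong = steps.count("wrong")
# Guard against division by zero for sessions with no wrong steps.
ratio = correct / wrong if wrong else float("inf")

print(correct, wrong, round(ratio, 2))  # 3 2 1.5
```

Correlating such per-session ratios with a task-difficulty rating would then be a separate step over many sessions.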

The second axis refers to the necessity of specifying the effects of particular categories of interactions in an NLE (Dillenbourg, 1999) on the accomplishment of learning products. These interactions refer to the grid of interactions developed between peers, learner–tutor and learner–content. Thus, we not only measure what students deliver in an NLE but also how they produced their deliverables. We propose that this entire spectrum of interactions should be captured and analysed accordingly. For example, for the learner–learner interactions, we propose evaluation of the following elements:

- number and nature of contributions (e.g., questions, additions, replies, social remarks) to the task (per learner);
- learner behaviour compared with that of other group members;
- direction of information flow (different kinds of communication among participants);
- total number of follow-up postings.

Concerning the learner–content interactions, the following indicators could be measured:


- amount of time a learner spends with the system (per session);
- percentage of available material read;
- percentage of available exercises tackled;
- amount of time spent per concept/skill/method/competency;
- sequential learning paths per session (e.g., theory, example, exercise).
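Two of the learner–content indicators above (percentage of available material read, and time spent per session) reduce to simple set and interval arithmetic over the logged data. A toy sketch, with invented page names and session timestamps:

```python
# Invented course material and one learner's reading record.
available_pages = {"intro", "theory", "example", "exercise", "quiz"}
pages_read = {"intro", "theory", "example"}

# Percentage of available material read (intersection guards against
# log entries for pages no longer in the course).
pct_read = 100 * len(pages_read & available_pages) / len(available_pages)

# Time per session from (login_minute, logout_minute) pairs.
sessions = [(0, 35), (100, 160)]
time_per_session = [end - start for start, end in sessions]

print(pct_read, time_per_session)  # 60.0 [35, 60]
```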

Consequently, a set of available indicators has been identified which can be used or reused by teachers, in order to analyse students' interactions in an NLE, or by appropriately designed software tools to discover useful knowledge. These indicators have been integrated into the LMSAnalytics tool, which offers information visualization of the data gathered from the Moodle LMS for each indicator. In addition, special care has been taken to implement the interoperation of LMSAnalytics with a set of applications for social network analysis and data mining.
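Interoperation with WEKA rests on serializing indicator tables into WEKA's ARFF input format. A minimal sketch of such an export; the indicator names and values are illustrative, not the tool's actual schema:

```python
def to_arff(relation, attributes, rows):
    """Serialize rows of numeric values into a minimal ARFF document."""
    lines = [f"@relation {relation}", ""]
    for name in attributes:
        lines.append(f"@attribute {name} numeric")
    lines += ["", "@data"]
    for row in rows:
        lines.append(",".join(str(v) for v in row))
    return "\n".join(lines)

# Hypothetical per-learner indicator table.
indicators = ["postings", "replies", "files_uploaded", "time_online_min"]
learners = [[12, 5, 3, 240], [4, 1, 0, 55], [20, 11, 6, 310]]

arff = to_arff("lms_indicators", indicators, learners)
print(arff.splitlines()[0])  # @relation lms_indicators
```

The resulting file can be opened directly in the WEKA Explorer; comma-separated exports of the same table serve SPSS and Excel equally well.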


Data mining in LMSAnalytics

Data mining enables the application to the data of interest of interpretative methods, which aim to provide insights into the data, and of predictive methods, which construct models for estimating future performance based on past experience (Han & Kamber, 2006). The inclusion of data mining technology in LMSAnalytics takes place primarily through the interoperation of the tool with the WEKA data mining application. Users of LMSAnalytics have the option to export a selection of the indicators calculated by the tool in a format appropriate as input to WEKA. Following that, WEKA can be used to import the dataset of interest and process it using any of the data mining techniques available in it. The results of the analysis can then be evaluated through the WEKA interface or through exporting to text files and manual inspection. The current implementation of LMSAnalytics has placed emphasis on the preparation and export of datasets suitable for two types of data mining analysis available in WEKA: clustering and classification. Clustering refers to a set of techniques that aim to identify natural groups in a dataset based on the similarities of the patterns the dataset contains. The method takes as input a set of patterns (in our case a set of students), each represented by a set of descriptors (in our case a set of indicators), typically arranged in tabular form. The method produces a set of groups, each of which contains patterns with similarities at the descriptor level. Data analysts, and users in general, can use the results of clustering to understand the nature of the data and the overall relations among patterns. WEKA offers several clustering methods; LMSAnalytics has been tested successfully with the simple K-Means method (Han & Kamber, 2006). Classification methods aim at discovering features of the patterns examined that discriminate between distinct classes of patterns.
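The clustering step just described uses WEKA's SimpleKMeans; purely to show the idea behind it, here is a hand-rolled one-dimensional K-Means over a single invented indicator (total postings per student), alternating assignment and centroid-update steps:

```python
def kmeans_1d(values, centroids, iterations=10):
    """Toy 1-D K-Means: assign each value to its nearest centroid,
    then move each centroid to the mean of its group, and repeat."""
    for _ in range(iterations):
        groups = {c: [] for c in centroids}
        for v in values:
            nearest = min(centroids, key=lambda c: abs(c - v))
            groups[nearest].append(v)
        # Recompute centroids; keep a centroid in place if its group is empty.
        centroids = [sum(g) / len(g) if g else c for c, g in groups.items()]
    return sorted(centroids)

# Invented indicator: total forum postings per student. Two natural
# activity groups (low and high participation) emerge.
postings = [2, 3, 4, 18, 20, 22]
print(kmeans_1d(postings, centroids=[0.0, 10.0]))  # [3.0, 20.0]
```

WEKA's implementation works the same way over the full multi-indicator vectors, using Euclidean distance instead of absolute difference.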
A second goal of classification is to formalize rules useful for predicting the class of a pattern based on its description. In the LMSAnalytics problem case, this amounts to discovering indicators/descriptors of students' online behaviour which are crucial in discriminating between, for example, students with high grades and those with low grades, and to using those descriptors to build predictive models for the likely performance of a student based on her online behaviour. LMSAnalytics has been tested using decision tree classification methods from WEKA. Decision trees, like all classification methods, need to be supplied with a set of patterns represented by descriptors, as well as a special descriptor defining the class of each pattern (e.g., grade). Decision trees are then used to divide the patterns of the entire dataset into exactly two groups according to whether the patterns have a particular best descriptor in common. The best descriptor is the one that results in the highest possible ratio of patterns in the same class between those patterns containing the descriptor and those not containing it. The method continues iteratively with respect to each subdivided group, dividing each group into two based on the next best descriptor selected from the group of descriptors. The result of this process is a tree structure in which terminal nodes contain a majority of patterns in one of the classes. Tracing the lineage that defines each terminal node can reveal descriptors that may be related to an increased or decreased likelihood of the presence of a specific class of patterns. Moreover, new patterns can be filtered through the tree structure generated by a decision tree, and a prediction for the class of each filtered pattern can be made by simple examination of the characteristics of the tree nodes in which it is placed. In an educational setting, teachers could utilize classification methods to



develop models for specific courses. These models could reveal indicators that contribute to increased or decreased performance and, thus, support educators in redesigning their lessons to accommodate this finding. Similarly, the models could be used during the course period to predict the likely performance of students based on their online behaviour. Teachers could use this information to assess overall class progress and take appropriate action to support students predicted to perform poorly.

The LMSAnalytics tool

LMSAnalytics is an interaction analysis tool for the automated collection, analysis and visualization of data that concern the behaviour of participants in an asynchronous NLE based on the Moodle LMS. The tool has been developed on the basis of the measurable analysis indicators mentioned above. Its basic operations are portrayed in Figure 1. LMSAnalytics is an open source tool developed using PHP and MySQL. It interoperates with the Moodle LMS. Moodle stores data about students' interactions in specific tables of its database server. LMSAnalytics connects to the database and retrieves the necessary data from those tables (a teacher may request data about a specific course and/or a specific period). The data retrieved from Moodle is stored in a relational database for analysis. The results of the analysis are shown in graphs and/or tables. Data can also be exported in a suitable coded form, so that it can be processed further with the help of more specialized tools such as the NetDraw software for social network analysis, SPSS for more in-depth statistical analysis, or WEKA for additional data mining. Figure 2 contains a high-level architectural diagram of the LMSAnalytics tool. This is the only stand-alone interaction analysis tool that inter-exchanges data with the Moodle LMS. The only quite similar tool is the GISMO graphical interactive student monitoring tool (Mazza &
Figure 1. Use case diagram of LMSAnalytics.




Figure 2. A high-level architectural diagram of the LMSAnalytics tool.

Botturi, 2007), which also interoperates with Moodle but acts as a Moodle block rather than as a separate application. Moreover, if the teacher has structured the learning tasks in an NLE according to a pedagogical strategy such as Jigsaw, TPS, etc. (Palloff & Pratt, 1999), LMSAnalytics can propose to him/her a series of diagrams and tables showing the results of an analysis of the indicators that best fit the strategy followed. An example of this feature is shown in the next section.
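The retrieval step just described (a teacher requesting interaction data for a specific course and period) could be sketched roughly as follows. LMSAnalytics itself is written in PHP against Moodle's MySQL server; this Python sketch substitutes an in-memory SQLite database, and the `mdl_log` table layout (userid, course, time, action) is an assumption modelled on Moodle's logging schema of that era, not the tool's actual code.

```python
import sqlite3

# Stand-in for Moodle's database; the real tool connects to MySQL.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE mdl_log (userid INTEGER, course INTEGER, time INTEGER, action TEXT)"
)
rows = [
    (1, 10, 1000, "forum post"),
    (1, 10, 1500, "forum post"),
    (2, 10, 1200, "forum view"),
    (2, 11, 1300, "forum post"),  # different course, excluded below
]
conn.executemany("INSERT INTO mdl_log VALUES (?, ?, ?, ?)", rows)

def actions_per_user(conn, course_id, t_from, t_to):
    """Count logged actions per user for one course within a time window."""
    cur = conn.execute(
        "SELECT userid, COUNT(*) FROM mdl_log "
        "WHERE course = ? AND time BETWEEN ? AND ? "
        "GROUP BY userid ORDER BY userid",
        (course_id, t_from, t_to),
    )
    return dict(cur.fetchall())

print(actions_per_user(conn, 10, 0, 2000))  # {1: 2, 2: 1}
```

The course/period filter in the `WHERE` clause mirrors the teacher's "specific course and/or specific period" request described above.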

Example of LMSAnalytics utilization: the case of the TPS strategy
According to the TPS strategy, the teacher gives the students a problem/question. As shown in Figure 3, each student first has to reflect upon the given problem and submit his/her answers to a forum (Think phase). Of course, questions and remarks about the problem can be exchanged among students via the forum during the problem-solving process. Often students share resources that could help their peers


Figure 3. Graphical representation of the Think-Pair-Share (TPS) strategy.


find the solution to the problem. Once the students have thought about the problem and reported their solutions (the first deliverable), they form groups (Pair phase). During this phase, the members of each group exchange their deliverables, give explanations and negotiate their thoughts in order to jointly create a new deliverable, an elaborated version of the problem solution. Finally, all the deliverables are shared (Share phase) so that the learners can peer-review them and ask for clarifications, explanations and so on. Normally, the Share phase ends with an electronic vote on the given solutions. The TPS strategy encourages students' active participation, collaboration, investigation of a given problem from various angles, critical thinking and the group attainment of knowledge. LMSAnalytics can help the teacher perform the interaction analysis for this specific collaborative strategy. More specifically, the tool can produce reports on indicators such as:

A3 - Actor's degree centrality (SNA);
B1 - Work amount (quantification of the amount of work; message dimension per user);
B2 - Argumentation (measure of the initiative work that has been done in the team; message annotation);
B5 - Collaboration (interaction-based message characterization);
D1 - Average number of contributions (participation percentage per team in a certain course, and team argumentation in a certain time period);
E3 - Participation count (number of messages a user has posted in a certain course and period);
F3 - Number of messages per participant (number of messages a user has posted in a certain course, per forum and period).
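For illustration, indicator A3 (actor's degree centrality, the standard SNA measure) could be computed from forum reply data along these lines. This is a minimal pure-Python sketch, not the tool's actual implementation, and the reply pairs are invented:

```python
# Degree centrality on a "who replied to whom" graph (SNA indicator A3).
# Normalised by (n - 1), the usual definition for a graph with n actors.
replies = [("anna", "bob"), ("carol", "anna"), ("dave", "anna")]

actors = sorted({a for pair in replies for a in pair})
n = len(actors)

# Treat the graph as undirected: each distinct pair counts once per endpoint.
edges = {frozenset(pair) for pair in replies}
degree = {a: sum(a in e for e in edges) for a in actors}
centrality = {a: d / (n - 1) for a, d in degree.items()}

print(centrality)  # anna is the hub: centrality 1.0; the others 1/3
```

A learner who has exchanged messages with every other participant scores 1.0; peripheral learners score close to 0, which is exactly the at-a-glance signal the teacher is after.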

Moreover, the results of the analysis of these indicators can be shown per phase. For example:

Think phase: the tool produces statistical tables and bar charts for the B1 indicator. At a glance, the teacher can see the total number of messages sent per learner and the total time each learner spent on this activity (Figure 4).
Pair phase: the tool produces statistical tables and diagrams for indicators B2 and D1, which concern the degree and the quality of students' participation in the


Figure 4. Total number of messages per learner.

same group, the type and the quality of collaboration and communication among groups, as well as the total time the students spent solving the given problem. For example, LMSAnalytics identifies the most active student of this phase and the number/type of messages that this student sent (Figure 5).
Share phase: the tool produces graphical representations for indicator B1, as in the Think phase, per student and for the forum of the Share phase.
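A per-phase breakdown like the one just described could be produced by partitioning posts against the time boundaries of each phase. A hedged sketch, with invented posts and phase windows (the tool's real phase boundaries would come from the course design, not from hard-coded numbers):

```python
from collections import defaultdict

# Invented posts: (learner, timestamp); phase windows are assumptions.
posts = [("anna", 5), ("bob", 7), ("anna", 12), ("bob", 18), ("anna", 25)]
phases = {"Think": (0, 10), "Pair": (10, 20), "Share": (20, 30)}

def messages_per_phase(posts, phases):
    """Tally message counts per learner within each phase's time window."""
    counts = {name: defaultdict(int) for name in phases}
    for learner, t in posts:
        for name, (start, end) in phases.items():
            if start <= t < end:
                counts[name][learner] += 1
    return {name: dict(c) for name, c in counts.items()}

print(messages_per_phase(posts, phases))
# {'Think': {'anna': 1, 'bob': 1}, 'Pair': {'anna': 1, 'bob': 1}, 'Share': {'anna': 1}}
```

Each inner dictionary corresponds to one of the per-phase tables (e.g. the Think-phase message counts of Figure 4).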


LMSAnalytics evaluation
The usability of the LMSAnalytics tool has been evaluated very positively. More specifically, 28 teachers from different schools, all with a strong interest in the use of networked technologies in their schools, were asked to use the LMSAnalytics tool and answer a questionnaire. The teachers were given a learning scenario in which six students learn about nuclear power using the Moodle LMS and decide whether they are in favour of or against it. At each phase the teacher gave the students a set of online resources and posed a set of questions that the students had to answer, individually at

Figure 5. Semantic annotation of messages per learner.


first (Think phase) and then in groups of two (Pair phase). During the last phase (Share phase), the students voted on the usefulness of nuclear power. During the Think phase, students used a common asynchronous web forum for exchanging resources (mostly links) and discussing the given set of questions. During the Pair phase, each group used an asynchronous web forum to share their deliverables and discuss their answers in order to reach a consensus, which was portrayed in a joint report. Thus, each teacher had to analyse the quality of the given deliverables as well as the data from the discussions that occurred during this learning scenario.

Each teacher was given an assessment rubric which they had to complete in order to grade each student. A rubric is an authentic assessment tool which acts as a scoring guide: it evaluates a student's performance against a full range of criteria/indicators, giving a final composite numerical score. Each teacher had to use LMSAnalytics to check the reports per indicator and fill in the assessment rubric. It is important to note that the final grades that the teachers gave to the various students did not differ much (maximum variance 0.6).

After having used the LMSAnalytics tool, the teachers expressed their opinions about it using a typical usability evaluation questionnaire. The teachers rated the tool highly (over 80%), considering it highly efficient and effective. They also liked the following aspects of the tool:

content: the visualization and the reporting of the information produced by the tool;
structure: the organization of the content and functions;
appearance: the aesthetics of the graphical user interface;
learnability: the ease of learning to use the tool, as well as of using it.
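The rubric-based grading and the grade-variance check described in the evaluation can be illustrated with a small sketch. The criterion names and scores below are invented for illustration; the paper does not publish the actual rubric:

```python
def composite_score(rubric_scores):
    """Sum per-criterion scores into one composite grade, as a rubric does."""
    return sum(rubric_scores.values())

def variance(grades):
    """Population variance of the grades one student received from teachers."""
    mean = sum(grades) / len(grades)
    return sum((g - mean) ** 2 for g in grades) / len(grades)

# Hypothetical rubric filled in by one teacher for one student.
scores = {"participation": 4, "argumentation": 3, "collaboration": 5}
print(composite_score(scores))  # 12

# Hypothetical grades the same student received from several teachers;
# a small variance (well under the 0.6 maximum reported) indicates agreement.
print(round(variance([12, 12.5, 11.8, 12.2]), 3))  # 0.067
```

The reported maximum variance of 0.6 across 28 graders is the study's evidence that the indicator reports led different teachers to similar grades.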

Concluding remarks
A teacher who organizes collaborative learning tasks for students needs frameworks and tools that enable him/her to evaluate their behaviour quickly and accurately, and to offer timely scaffolding when needed. The LMSAnalytics tool, presented in this paper, tries to address this need. It also goes one step further by guiding the teacher in analysing and visualizing data on the students' behaviour. An innovative aspect of the tool is that it interoperates with the Moodle LMS. Thus, although it contains fewer indicators than the DIAS tool, it is very valuable, since it effectively and efficiently aids teachers in assessing the students' learning performance. The LMSAnalytics tool is comparable to the GISMO tool with respect to the features offered (though not the technical details). We plan to extend LMSAnalytics' functionality by building a web service to make it interoperable with the WEKA data mining tool. In this way, the tool could:

exploit sequential patterns in the flow of learning activities by drawing the exact paths followed by each student, individually or in groups;
show deviations of individual students from the typical learning activities flow performed by their peers;
perform path analysis through more complex queries that reveal interesting correlations and association rules among students' learning paths.
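Making the interaction data consumable by WEKA would typically mean exporting it as an ARFF file, WEKA's standard input format. A minimal sketch of such an export; the relation and attribute names are invented, not the planned web service's actual schema:

```python
def to_arff(relation, attributes, rows):
    """Serialise tabular interaction data in WEKA's ARFF text format."""
    lines = [f"@RELATION {relation}", ""]
    for name, atype in attributes:
        lines.append(f"@ATTRIBUTE {name} {atype}")
    lines += ["", "@DATA"]
    lines += [",".join(str(v) for v in row) for row in rows]
    return "\n".join(lines)

# Hypothetical per-phase posting counts exported for mining in WEKA.
attributes = [
    ("userid", "NUMERIC"),
    ("posts", "NUMERIC"),
    ("phase", "{Think,Pair,Share}"),
]
rows = [(1, 4, "Think"), (1, 2, "Pair"), (2, 3, "Think")]
arff = to_arff("lms_interactions", attributes, rows)
print(arff.splitlines()[0])  # @RELATION lms_interactions
```

Once the data is in this form, WEKA's association-rule and sequence-mining algorithms can be applied to it directly, which is what the planned web service would automate.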


Acknowledgements
This work has been partially supported by the eLAT project: eLearning Analytics Tool: Analyzing Student Behavior in Learning Management Systems, supported by the Cyprus Research Promotion Foundation and partially funded by the European Structural Funds and the Republic of Cyprus.

References
Avouris, N.M., Dimitracopoulou, A., & Komis, V. (2003). On analysis of collaborative problem solving: An object-oriented approach. Computers in Human Behavior, 19(2), 147–167.
Baker, R.S.J.d. (in press). Data mining for education. In B. McGaw, P. Peterson, & E. Baker (Eds.), International encyclopedia of education (3rd ed.). Oxford: Elsevier. Retrieved January 2010 from http://www.cs.cmu.edu/rsbaker/Encyclopedia%20Chapter%20Draft%20v10%20-fw.pdf
Baker, R.S., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future visions. Journal of Educational Data Mining (JEDM), 1, 3–17.
Barros, B., & Verdejo, M.F. (1999). An approach to analyse collaboration when shared structured workspaces are used for carrying out group learning processes. In S.P. Lajoie & M. Vivet (Eds.), Artificial intelligence in education: Open learning environments (pp. 449–456). Amsterdam: IOS Press.
Bates, S.P., & Hardy, J. (2004). An evaluation of an e-learning strategy: Watching the e-learners learn. In D.S. Preston & T.H. Nguyen (Eds.), Virtuality and education: A reader (pp. 77–81). Oxford: Inter-Disciplinary Press.
Bradley, N. (2007). Marketing research: Tools and techniques. Oxford: Oxford University Press.
Bratitsis, T., & Dimitrakopoulou, A. (2005). Data recording and usage interaction analysis in asynchronous discussions: The DIAS system. Proceedings of the 12th International Conference on Artificial Intelligence in Education (AIED), Workshop on Usage Analysis in Learning Systems, Amsterdam.
Brown, J.S., & Duguid, P. (2000). The social life of information. Boston, MA: Harvard Business School Press.
Collazos, C., Guerrero, L., Pino, J., & Ochoa, S. (2003). Collaborative scenarios to promote positive interdependence among group members. In J. Favela & D. Decouchant (Eds.), Proceedings of the Ninth International Workshop on Groupware (CRIWG 2003), Grenoble-Autrans, France, LNCS 2806 (pp. 247–260). Berlin: Springer.
Curtis, D.D., & Lawson, M.L. (2001). Exploring collaborative online learning. Journal of Asynchronous Learning Networks, 5(1), 21–34.
Daradoumis, T., Xhafa, F., & Marques, J.M. (2003). Exploring interaction behaviour and performance of online collaborative learning teams. In J. Favela & D. Decouchant (Eds.), Proceedings of the Ninth International Workshop on Groupware (CRIWG 2003), Grenoble-Autrans, France, LNCS 2806 (pp. 126–134). Berlin: Springer.
Daradoumis, T., Martínez, A., & Xhafa, F. (2006). A layered framework for evaluating on-line collaborative learning interactions. International Journal of Man-Machine Studies, 64(7), 622–635.
De Laat, M.F., Lally, V., Lipponen, L., & Simons, P.R.J. (2006). Analysing student engagement with learning and tutoring activities in networked learning communities: A multi-method approach. International Journal of Web-based Communities, 2(4).
Dillenbourg, P. (1999). What do you mean by collaborative learning? In P. Dillenbourg (Ed.), Collaborative learning: Cognitive and computational approaches (pp. 1–20). Advances in Learning and Instruction series. Oxford: Pergamon Elsevier.
Dimitrakopoulou, A., Petrou, A., Martinez, A., Marcos, J., Kollias, V., Jermann, P., Harrer, A., Dimitriadis, Y., & Bollen, L. (2006a). State of the art of interaction analysis for metacognitive support & diagnosis. Deliverable D31.1.1 of the EU Sixth Framework Programme, Priority 2: Information Society Technology, Network of Excellence Kaleidoscope project (contract NoE IST-507838).
Dimitracopoulou, A., Vosniadou, S., Gregoriadou, M., Avouris, N., Kollias, V., Gogoulou, L., Fessakis, G., & Bratitsis, Th. (2006b). The field of computer-based interaction analysis for the support of participants' regulation in social technology based learning environments: State of the art and perspectives. In D. Psillos & V. Dagdidelis (Eds.), 5th Hellenic


Congress with International Participation: Information and Communication Technologies in Education (pp. 997–1000). HICTE, Thessaloniki, October 2006.
Gaßner, K., Jansen, M., Harrer, A., Herrmann, K., & Hoppe, U. (2003). Analysis methods for collaborative models and activities. In U. Hoppe (Ed.), Computer support for collaborative learning: Designing for change in networked learning environments, CSCL 2003 Congress, 14–18 June 2003, Bergen, Norway.
Goodyear, P. (2005). Educational design and networked learning: Patterns, pattern languages and design practice. Australasian Journal of Educational Technology, 21(1), 82–101.
Gunawardena, C.N., Lowe, C.A., & Anderson, T. (1997). Analysis of a global online debate and the development of an interaction analysis model for examining social construction of knowledge in computer conferencing. Journal of Educational Computing Research, 17(4), 397–431.
Häkkinen, P., Järvelä, S., & Mäkitalo, K. (2003). Sharing perspectives in virtual interaction: Review of methods of analysis. In B. Wasson, S. Ludvigsen, & U. Hoppe (Eds.), Designing for Change in Networked Learning: Proceedings of the International Conference on Computer Support for Collaborative Learning (pp. 395–404). Dordrecht: Kluwer.
Han, J., & Kamber, M. (2006). Data mining: Concepts and techniques (2nd ed.). San Diego, CA: Morgan Kaufmann.
Koutri, M., Avouris, N., & Daskalaki, S. (2004). A survey on web usage mining techniques for web-based adaptive hypermedia systems. In S.Y. Chen & G.D. Magoulas (Eds.), Adaptable and adaptive hypermedia systems. Hershey, PA: Idea Publishing Inc.
MacDonald, J. (2003). Assessing online collaborative learning: Process and product. International Journal of Computers and Education, 40, 377–391.
Marcos, A., Martinez, A., & Dimitriadis, Y. (2005). Towards adaptable interaction analysis in CSCL. Proceedings of the 12th International Conference on Artificial Intelligence in Education, Amsterdam.
Mazza, R. (2009). Introduction to information visualization. London: Springer-Verlag.
Mazza, R., & Botturi, L. (2007). Monitoring an online course with the GISMO tool: A case study. Journal of Interactive Learning Research, 18(2), 251–265.
Mazza, R., & Dimitrova, V. (2005). Generation of graphical representations of student tracking data in course management systems. In IV '05: Proceedings of the Ninth International Conference on Information Visualisation (pp. 253–258). Washington, DC.
Moore, G. (1989). Three types of interaction. The American Journal of Distance Education, 3(2), 1–6.
Mühlenbrock, M. (2004). Shared workspaces: Analyzing user activity and group interaction. In H.U. Hoppe, M. Ikeda, H. Ogata, & F. Hesse (Eds.), New technologies for collaborative learning. Computer-Supported Collaborative Learning Series. Dordrecht: Kluwer.
Padilha, T.P.P., Almeida, L.M., & Alves, J.B.M. (2004). Mining techniques for models of collaborative learning. In J. Mostow & P. Tedesco (Eds.), Designing Computational Models of Collaborative Learning Interaction, Workshop at ITS 2004 (pp. 89–94). Maceió, Brazil.
Palloff, R., & Pratt, K. (1999). Building learning communities in cyberspace: Effective strategies for the online classroom. San Francisco, CA: Jossey-Bass.
Petropoulou, O., Lazakidou, G., Retalis, S., & Vrasidas, C. (2007). Analysing interaction behaviour in network supported collaborative learning environments: A holistic approach. International Journal of Knowledge and Learning, 3(4/5), 450–464.
Pierrakos, D., Paliouras, G., Papatheodorou, C., & Spyropoulos, C.D. (2003). Web usage mining as a tool for personalization: A survey. User Modelling and User-Adapted Interaction, 13(4), 311–372.
Ryberg, T., & Larsen, M. (2006). Networked identities: Understanding different types of social organisation and movements between strong and weak ties in networked environments. In Proceedings of the Fifth International Conference on Networked Learning, 10–12 April 2006, Lancaster.
Shute, V.J., & Glaser, R. (1990). A large-scale evaluation of an intelligent discovery world: Smithtown. Interactive Learning Environments, 1, 51–77.
Steeples, C., Jones, C., & Goodyear, P. (2002). Beyond e-learning: A future for networked learning. In C. Steeples & C. Jones (Eds.), Networked learning: Perspectives and issues (pp. 323–342). London: Springer.


TELL. (2005). Introducing a framework for the evaluation of network supported collaborative learning. TELL Project, Deliverable of WorkPackage 1. Retrieved January 2010 from the TELL project website, http://cosy.ted.unipi.gr/tell/
Vrasidas, C. (2002). A working typology of intentions driving face-to-face and online interaction in a graduate teacher education course. Journal of Technology and Teacher Education, 10(2), 273–296.
Vrasidas, C., & McIsaac, M. (2000). Principles of pedagogy and evaluation of Web-based learning. Educational Media International, 37(2), 105–111.
Vrasidas, C., & Glass, C.V. (Eds.). (2002). Distance education and distributed learning. Charlotte, NC: Information Age Publishing, Inc.

