
LAK'14: 2014 International Conference on Learning Analytics and Knowledge

Fullname: Proceedings of the Fourth International Conference on Learning Analytics and Knowledge
Editors: Matthew Pistilli; James Willis; Drew Koch; Kimberly Arnold; Stephanie Teasley; Abelardo Pardo
Location: Indianapolis, Indiana
Dates: 2014-Mar-24 to 2014-Mar-28
Standard No: ISBN: 978-1-4503-2664-3; ACM DL: Table of Contents; hcibib: LAK14
Links: Conference Website
  1. Process mining
  2. Predictive models and recommendations
  3. Alternative analytics
  4. Learning mathematics
  5. MOOCs
  6. Learning analytics for "at risk" students
  7. Text analytics and collaborative environments
  8. Institutional perspectives
  9. Analysis of resource use in LMS
  10. Learning analytics and learning design
  11. Discourse and argumentation
  12. Who we are and who we want to be
  13. Panels
  14. Posters
  15. Workshops

Process mining

Formative assessment method of real-world learning by integrating heterogeneous elements of behavior, knowledge, and the environment BIBAFull-Text 1-10
  Masaya Okada; Masahiro Tada
Real-world learning in a field is an important educational area for experience-based activities. Formative assessment by constant monitoring of the intellectual achievement of real-world learners is essential for adaptive learning support, but no assessment methodology has yet been developed. We consider a method to systematically integrate heterogeneous factors of real-world learning: learners' internal situations, their external situations, and their learning field. Then, we propose a method for formatively assessing the situation of real-world learning. The method enables us to recognize the sequence of characteristic stay behavior and the associated body posture of a learner, and to estimate the 3D location of his/her interest. The method enables the estimation of not only the learning topic that a learner is currently examining in a field but also the prospective topics that he/she should learn. Our assessment method is the basis for context-aware support to promote the emergence of new knowledge from intellectual collaboration in the world.
Clustering for improving educational process mining BIBAFull-Text 11-15
  Alejandro Bogarín; Cristóbal Romero; Rebeca Cerezo; Miguel Sánchez-Santillán
In this paper, we propose using clustering to improve educational process mining, with the aim of improving both the performance and the comprehensibility of the models obtained. We used data from 84 undergraduate students who followed an online course using Moodle 2.0. We first group students using data from Moodle's usage summary and/or the students' final marks in the course. Then, we mine Moodle's logs for each cluster of students separately in order to obtain more specific and accurate models of student behaviour. The results show that the fitness of the cluster-specific models is greater than that of the general model obtained using all the data, and that the comprehensibility of the models can also be improved in some cases.
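The two-stage idea (cluster students first, then mine each cluster's logs separately) can be sketched as follows; the hand-rolled k-means, the (total_events, final_mark) features, and the numbers are illustrative stand-ins for the study's actual Moodle summary data:

```python
import math

def kmeans(points, centroids, iters=20):
    """Plain k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            dists = [math.dist(p, c) for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        centroids = [
            tuple(sum(coord) / len(pts) for coord in zip(*pts)) if pts else c
            for pts, c in zip(clusters, centroids)
        ]
    return centroids, clusters

# Hypothetical (total Moodle events, final mark) summaries per student.
students = [(120, 8.5), (130, 9.0), (30, 4.0), (25, 3.5), (110, 7.5)]
centroids, clusters = kmeans(students, centroids=[(30.0, 4.0), (120.0, 8.0)])
# Each cluster's event logs would then be process-mined separately.
```

Running a process-discovery algorithm (e.g. Heuristics Miner) on each cluster's logs would follow; that step is beyond a short sketch.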

Predictive models and recommendations

Customized course advising: investigating engineering student success with incoming profiles and patterns of concurrent course enrollment BIBAFull-Text 16-25
  SungJin Nam; Steven Lonn; Thomas Brown; Cinda-Sue Davis; Darryl Koch
Every college student registers for courses from a catalog of numerous offerings each term. Selecting the courses in which to enroll, and in what combinations, can dramatically impact each student's chances for academic success. Taking inspiration from the STEM Academy, we wanted to identify the characteristics of engineering students who graduate with a 3.0 or above grade point average. The overall goal of the Customized Course Advising project is to determine the optimal term-by-term course selections for all engineering students based on their incoming characteristics and previous course history and performance, paying particular attention to concurrent enrollment. We found that ACT Math, SAT Math, and Advanced Placement exam scores can be effective measures of students' academic preparation level. We also found that some concurrent course-enrollment patterns are highly predictive of first-term and overall academic success.
Explaining predictive models to learning specialists using personas BIBAFull-Text 26-30
  Christopher Brooks; Jim Greer
This paper describes a method we have developed to convert statistical predictive models into visual narratives which explain student classifications. Building off of the work done within the user experience community, we apply the concept of personas to predictive models. These personas provide familiar and memorable descriptions of the learners identified by data mining activities, and bridge the gap between the data scientist and the learning specialist.
Temporal learning analytics for computer based testing BIBAFull-Text 31-35
  Zacharoula K. Papamitsiou; Vasileios Terzis; Anastasios A. Economides
Predicting student's performance is a challenging, yet complicated task for institutions, instructors and learners. Accurate predictions of performance could lead to improved learning outcomes and increased goal achievement. In this paper we explore the predictive capabilities of student's time-spent on answering (in-)correctly each question of a multiple-choice assessment quiz, along with student's final quiz-score, in the context of computer-based testing. We also explore the correlation between the time-spent factor (as defined here) and goal-expectancy. We present a case study and investigate the value of using this parameter as a learning analytics factor for improving prediction of performance during computer-based testing. Our initial results are encouraging and indicate that the temporal dimension of learning analytics should be further explored.
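As a minimal illustration of the time-spent factor, one can aggregate, per student, the time spent on correctly and incorrectly answered questions alongside the quiz score. The record format and numbers below are hypothetical, and the paper's exact definition of the factor may differ:

```python
def temporal_features(responses):
    """Aggregate time spent answering correctly vs incorrectly, plus the
    quiz score, from (is_correct, seconds) records -- a sketch of the
    time-spent factor, not the paper's exact definition."""
    t_correct = sum(s for ok, s in responses if ok)
    t_wrong = sum(s for ok, s in responses if not ok)
    score = sum(1 for ok, _ in responses if ok) / len(responses)
    return {"time_correct": t_correct, "time_wrong": t_wrong, "score": score}

# Hypothetical per-question log for one student's quiz attempt.
quiz = [(True, 12.0), (False, 40.0), (True, 9.0), (True, 15.0)]
features = temporal_features(quiz)
```

Such features could then feed any standard classifier of performance alongside goal-expectancy survey items.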

Alternative analytics

Sleepers' lag -- study on motion and attention BIBAFull-Text 36-43
  Mirko Raca; Roland Tormey; Pierre Dillenbourg
Human body language is one of the richest and most obscure sources of information in interpersonal communication, and we aim to re-introduce it into the classroom's ecosystem. In this paper we present our observations and measurements of student-to-student influence. We show parallels with previous theories and formulate a new concept for measuring the level of attention based on the synchronization of student actions. We observed that students with lower levels of attention are slower to react than focused students, a phenomenon we named "sleepers' lag".
Clustering of design decisions in classroom visual displays BIBAFull-Text 44-48
  Ma. Victoria Almeda; Peter Scupelli; Ryan S. Baker; Mimi Weber; Anna Fisher
In this paper, we investigate the patterns of design choices made by classroom teachers for decorating their classroom walls, using cluster analysis to see which design decisions go together. Classroom visual design has been previously studied, but not in terms of the systematic patterns adopted by teachers in selecting what materials to place on classroom walls, or in terms of the actual semantic content of what is placed on walls. This is potentially important, as classroom walls are continuously seen by students, and form a continual off-task behavior option, available to students at all times. Using the k-means clustering algorithm, we find four types of visual classroom environments (one of them an outlier within our data set), representing teachers' strategies in classroom decoration. Our results indicate that the degree to which teachers place content-related decorations on the walls is a feature of particular importance for distinguishing which approach teachers are using. Similarly, the type of school (e.g. whether private or charter) appeared to be another significant factor in determining teachers' design choices for classroom walls. The present findings lay the groundwork for better understanding the impact of teacher decisions in classroom design on engagement and learning, and, ultimately, for developing classroom designs that are more effective and engaging for learners.
Data wranglers: human interpreters to help close the feedback loop BIBAFull-Text 49-53
  Doug Clow
Closing the feedback loop to improve learning is at the heart of good learning analytics practice. However, the quantity of data, and the range of different data sources, can make it difficult to take systematic action on that data. Previous work in the literature has emphasised the need for and value of human meaning-making in the process of interpreting data to transform it into actionable intelligence.
   This paper describes a programme of human Data Wranglers deployed at the Open University, UK, charged with making sense of a range of data sources related to learning, analysing that data in the light of their understanding of practice in individual faculties/departments, and producing reports that summarise the key points and make actionable recommendations.
   The evaluation of and experience in this programme of work strongly supports the value of human meaning-makers in the learning analytics process, and suggests that barriers to organisational change in this area can be mitigated by embedding learning analytics work within strategic contexts, and working at an appropriate level and granularity of analysis.
Toward unobtrusive measurement of reading comprehension using low-cost EEG BIBAFull-Text 54-58
  Yueran Yuan; Kai-min Chang; Jessica Nelson Taylor; Jack Mostow
Assessment of reading comprehension can be costly and obtrusive. In this paper, we use inexpensive EEG to detect reading comprehension of readers in a school environment. We use EEG signals to produce above-chance predictors of student performance on end-of-sentence cloze questions. We also attempt (unsuccessfully) to distinguish among student mental states evoked by distracters that violate either syntactic, semantic, or contextual constraints. In total, this work investigates the practicality of classroom use of inexpensive EEG devices as an unobtrusive measure of reading comprehension.

Learning mathematics

Learning analytics in CSCL with a focus on assessment: an exploratory study of activity theory-informed cluster analysis BIBAFull-Text 59-67
  Wanli Xing; Bob Wadholm; Sean Goggins
In this paper we propose an automated strategy to assess participation in a multi-mode math discourse environment called Virtual Math Teams with GeoGebra (VMTwG). A holistic participation clustering algorithm is applied through the lens of activity theory. Our activity theory-informed algorithm is a step toward accelerating heuristic approaches to assessing collaborative work in synchronous technology-mediated environments like VMTwG. Our exploratory findings provide an example of a novel, time-efficient, valid, and reliable participatory learning assessment tool for teachers in computer-mediated learning environments. Scaling online learning with a combination of computation and theory is the overall goal of the work in which this paper is situated.
On using Markov chain to evidence the learning structures and difficulty levels of one digit multiplication BIBAFull-Text 68-72
  Behnam Taraghi; Martin Ebner; Anna Saranti; Martin Schön
Understanding the behavior of learners within learning applications and analyzing the factors that may influence the learning process play a key role in designing and optimizing learning applications. In this work we focus on a specific application named "1x1 trainer" that has been designed for primary school children to learn one-digit multiplications. We investigate the database of learners' answers to the asked questions (N > 440000) by applying Markov chains. We want to understand whether learners' answers to previously asked questions affect the way they answer subsequent questions and, if so, to what extent. Through our analysis we first identify the most difficult and easiest multiplications for the target learners by observing the probabilities of the different answer types. Next we identify influential structures in the history of learners' answers by considering Markov chains of different orders. The results are used to identify pupils who have difficulties with multiplications very early (after a couple of steps) and to optimize the way questions are asked for each pupil individually.
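The first-order case of such an analysis (how the previous answer type conditions the next) amounts to estimating a Markov transition matrix from each pupil's answer history. A minimal sketch with a made-up two-state history (the paper distinguishes more answer types):

```python
from collections import Counter, defaultdict

def transition_probs(sequence):
    """Estimate a first-order Markov chain from a sequence of answer
    types: P(next state | current state), by counting adjacent pairs."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(sequence, sequence[1:]):
        counts[cur][nxt] += 1
    return {
        state: {t: n / sum(c.values()) for t, n in c.items()}
        for state, c in counts.items()
    }

# Hypothetical answer history of one pupil (C = correct, W = wrong).
history = ["C", "C", "W", "C", "W", "W", "C", "C"]
P = transition_probs(history)
```

Higher-order chains follow the same counting idea with tuples of the last k states as the conditioning context.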
Context personalization, preferences, and performance in an intelligent tutoring system for middle school mathematics BIBAFull-Text 73-77
  Stephen E. Fancsali; Steven Ritter
Learners often think math is unrelated to their own interests. Instructional software has the potential to provide personalized instruction that responds to individuals' interests. Carnegie Learning's MATHia™ software for middle school mathematics asks learners to specify domains of their interest (e.g., sports & fitness, arts & music), as well as names of friends/classmates, and uses this information to both choose and personalize word problems for individual learners. Our analysis of MATHia's relatively coarse-grained personalization contrasts with more fine-grained analysis in previous research on word problems in the Cognitive Tutor (e.g., finding effects on performance in parts of problems that depend on more difficult skills), and we explore associations of aggregate preference "honoring" with learner performance. To do so, we define a notion of "strong" learner interest-area preferences and find that honoring such preferences has a small negative association with performance. However, learners who merely express preferences (either interest-area preferences or names of friends/classmates), and those who express strong preferences, tend to perform in ways that are associated with better learning compared to learners who do not express such preferences. We consider several explanations of these findings and suggest important topics for future research.
Interaction design for improved analytics BIBAFull-Text 78-82
  Maria Mendiburo; Brian Sulcer; Ted Hasselbring
In this paper, we explain a portion of the design research process that we used to develop the learning analytics for a manipulative-based fractions intervention program. In particular, we highlight a set of qualitative interviews that we conducted with individual students after a short study in which students in three classes at the same school learned to use virtual manipulatives to compare pairs of proper fractions and order groups of 3 proper fractions. These qualitative interviews provided us with considerable information that helped us improve the interactions students have with the virtual manipulatives and produce more sophisticated and informative analytics. We emphasize the importance of using mixed-methods during the iterative cycles of development that define design research.

MOOCs

Visualizing patterns of student engagement and performance in MOOCs BIBAFull-Text 83-92
  Carleton Coffrin; Linda Corrin; Paula de Barba; Gregor Kennedy
In the last five years, the world has seen a remarkable level of interest in Massive Open Online Courses, or MOOCs. A consistent message from universities participating in MOOC delivery is their eagerness to understand students' online learning processes. This paper reports on an exploratory investigation of students' learning processes in two MOOCs which have different curriculum and assessment designs. When viewed through the lens of common MOOC learning analytics, the high level of initial student interest and, ultimately, the high level of attrition, makes these two courses appear very similar to each other, and to MOOCs in general. With the goal of developing a greater understanding of students' patterns of learning behavior in these courses, we investigated alternative learning analytic approaches and visual representations of the output of these analyses. Using these approaches we were able to meaningfully classify student types and visualize patterns of student engagement which were previously unclear. The findings from this research contribute to the educational community's understanding of students' engagement and performance in MOOCs, and also provide the broader learning analytics community with suggestions of new ways to approach learning analytic data analysis and visualization.
Small to big before massive: scaling up participatory learning analytics BIBAFull-Text 93-97
  Daniel T. Hickey; Tara Alana Kelley; Xinyi Shen
This case study describes how course features and individual & social learning analytics were scaled up to support "participatory" learning. An existing online course was turned into a "big open online course" (BOOC) offered to hundreds. Compared to typical open courses, relatively high levels of persistence, individual & social engagement, and achievement were obtained. These results suggest that innovative learning analytics might best be scaled (a) incrementally, (b) using design-based research methods, (c) focusing on engagement in consequential & contextual knowledge, (d) using emerging situative assessment theories.
Success, activity and drop-outs in MOOCs: an exploratory study on the UNED COMA courses BIBAFull-Text 98-102
  Jose Luis Santos; Joris Klerkx; Erik Duval; David Gago; Luis Rodríguez
This paper presents an exploratory study of two language learning MOOCs deployed on the UNED COMA platform. The study addresses three research questions: a) How does activity evolve in these MOOCs? b) Are all learning activities relevant? and c) Does use of the target language influence success? We conclude that MOOC activity drops not only because of drop-outs: when students skip around 10% of the proposed activities, the rate of passing the course decreases by 25%. Forum activity is a useful indicator of success; however, participation in active threads is not. Finally, use of the course's target language is not an indicator that predicts success.

Learning analytics for "at risk" students

Engagement vs performance: using electronic portfolios to predict first semester engineering student retention BIBAFull-Text 103-112
  Everaldo Aguiar; Nitesh V. Chawla; Jay Brockman; G. Alex Ambrose; Victoria Goodrich
As providers of higher education begin to harness the power of big data analytics, one very fitting application for these new techniques is that of predicting student attrition. The ability to pinpoint students who might soon decide to drop out of a given academic program allows those in charge to not only understand the causes for this undesired outcome, but it also provides room for the development of early intervention systems. While making such inferences based on academic performance data alone is certainly possible, we claim that in many cases there is no substantial correlation between how well a student performs and his or her decision to withdraw. This is especially true when the overall set of students has relatively similar academic performance. To address this issue, we derive measurements of engagement from students' electronic portfolios and show how these features can be effectively used to augment the quality of predictions.
Perceptions and use of an early warning system during a higher education transition program BIBAFull-Text 113-117
  Stephen Aguilar; Steven Lonn; Stephanie D. Teasley
This paper reports findings from the implementation of a learning analytics-powered Early Warning System (EWS) by academic advisors who were novice users of data-driven learning analytics tools. The information collected from these users sheds new light on how student analytic data might be incorporated into the work practices of advisors working with university students. Our results indicate that advisors predominantly used the EWS during their meetings with students -- despite it being designed as a tool to provide information to prepare for meetings and identify students who are struggling academically. This introduction of an unintended audience brings significant design implications to bear that are relevant for learning analytics innovations.
Modest analytics: using the index method to identify students at risk of failure BIBAFull-Text 118-122
  Tim Rogers; Cassandra Colvin; Belinda Chiera
Regression is the tool of choice for developing predictive models of student risk of failure. However, the forecasting literature has demonstrated the predictive equivalence of much simpler methods. We directly compare one simple tabulation technique, the index method, to a linear multiple regression approach for identifying students at risk. The broader purpose is to explore the plausibility of a flexible method that is conducive to adoption and diffusion. In this respect this paper fits within the ambit of the modest computing agenda, and suggests the possibility of a modest analytics. We built both regression and index method models on 2011 student data and applied these to 2012 student data. The index method was comparable in terms of predictive accuracy of student risk. We suggest that the context specificity of learning environments makes the index method a promising tool for educators who want a situated risk algorithm that is flexible and adaptable.
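The index method itself is simple enough to state in a few lines: each risk indicator is dichotomized by a cutoff, and the unweighted count of tripped indicators is the risk score. The indicators and cutoffs below are invented for illustration, not taken from the paper's models:

```python
def index_score(student, cutoffs):
    """Index method: count how many risk indicators a student trips.
    Each indicator contributes 0 or 1; no fitted weights, unlike
    regression, which is what makes the method easy to adapt locally."""
    return sum(1 for key, at_risk in cutoffs.items() if at_risk(student[key]))

# Hypothetical indicators and cutoffs for illustration only.
cutoffs = {
    "entry_gpa": lambda v: v < 4.5,     # weak prior academic record
    "logins_week1": lambda v: v < 3,    # little early LMS activity
    "submitted_first": lambda v: not v, # missed the first assessment
}
student = {"entry_gpa": 4.0, "logins_week1": 5, "submitted_first": False}
risk = index_score(student, cutoffs)
```

Because the cutoffs are transparent, educators can tune them to a specific course context without refitting a statistical model.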

Text analytics and collaborative environments

Analytics of the effects of video use and instruction to support reflective learning BIBAFull-Text 123-132
  Dragan Gaševic; Negin Mirriahi; Shane Dawson
Although video annotation software is no longer considered a new innovation, its application in promoting student self-regulated learning and reflection skills has only begun to emerge in the research literature. Advances in text and video analytics provide the capability of investigating students' use of the tool and the psychological and linguistic processes evident in their written annotations. This paper reports on a study exploring students' use of a video annotation tool under two different instructional approaches -- graded and non-graded self-reflection annotations -- within two courses in the performing arts. In addition to counts and temporal locations of self-reflections, the Linguistic Inquiry and Word Count (LIWC) framework was used for the extraction of variables indicative of the linguistic and psychological processes associated with self-reflection annotations of videos. The results indicate that students in the course with graded self-reflections adopted more linguistic and psychological processes in comparison to the course with non-graded self-reflections. In general, the effect size of the graded reflections was lower for students who took both courses in parallel. Consistent with prior research, the study identified that students tend to make the majority of their self-reflection annotations early in the video timeline. The paper also provides several suggestions for future research to better understand the application of video annotations in facilitating student learning.
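A minimal dictionary-based word count in the spirit of LIWC looks like this; the two-category lexicon is a made-up stand-in, since the real LIWC dictionaries are proprietary and contain many more categories:

```python
def category_rates(text, lexicon):
    """Rate of words in each category per 100 words, in the spirit of
    LIWC-style counts. The lexicon passed in here is a toy stand-in,
    not the actual LIWC dictionaries."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    return {
        cat: 100 * sum(w in vocab for w in words) / len(words)
        for cat, vocab in lexicon.items()
    }

# Invented mini-lexicon for two illustrative categories.
lexicon = {
    "insight": {"think", "realize", "understand"},
    "affect": {"happy", "worried", "proud"},
}
rates = category_rates("I think I finally understand this passage", lexicon)
```

Each annotation's rates become variables that can be compared between the graded and non-graded conditions.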
Peer assessment based on ratings in a social media course BIBAFull-Text 133-137
  Andrii Vozniuk; Adrian Holzer; Denis Gillet
Peer assessment is seen as a powerful supporting tool to achieve scalability in the evaluation of complex assignments in large courses, possibly virtual ones, as in the context of massive open online courses (MOOCs). However, the adoption of peer assessment is slow, due in part to the lack of ready-to-use systems. Furthermore, the validity of peer assessment is still under discussion. In this paper, in order to tackle some of these issues, we present a proof-of-concept of a novel extension of Graasp, a social media platform, to set up a peer assessment activity. We then report a case study of peer assessment using Graasp in a Social Media course with 60 master's level university students and analyze the level of agreement between students and instructors in the evaluation of short individual reports. Finally, to see whether instructor and student evaluations were based on the appearance of project reports rather than on their content, we conducted a study with 40 kids who rated reports solely on their look. Our results show that, unlike the kids' evaluations, which had a low level of agreement with the instructors', student assessment is reliable: the level of agreement between instructors and students was high.
Collaborative spatial classification BIBAFull-Text 138-142
  Eric Coopey; R. Benjamin Shapiro; Ethan Danahy
Interactive technologies have become an important part of teaching and learning. However, the data that these systems generate is increasingly unstructured, complex, and therefore difficult to make sense of. Current computationally driven methods for classifying student contributions (e.g., latent semantic analysis or learning-based image classifiers) cannot handle the multimodal artifacts (e.g., sketches, videos, or annotated images) that new technologies enable. We have developed and implemented a classification algorithm based on learners' interactions with the artifacts they create. This new form of semi-automated concept classification, coined Collaborative Spatial Classification, leverages the spatial arrangement of artifacts to provide a visualization that generates summary-level data about idea distribution. This approach has two benefits. First, students learn to identify and articulate patterns and connections among classmates' ideas. Second, the teacher receives a high-level view of the distribution of ideas, enabling them to decide how to shift their instructional practices in real time.
Assessing elementary students' science competency with text analytics BIBAFull-Text 143-147
  Samuel P. Leeman-Munk; Eric N. Wiebe; James C. Lester
Real-time formative assessment of student learning has become the subject of increasing attention. Students' textual responses to short answer questions offer a rich source of data for formative assessment. However, automatically analyzing textual constructed responses poses significant computational challenges, and the difficulty of generating accurate assessments is exacerbated by the disfluencies that occur prominently in elementary students' writing. With robust text analytics, there is the potential to accurately analyze students' text responses and predict students' future success. In this paper, we present WriteEval, a hybrid text analytics method for analyzing student-composed text written in response to constructed response questions. Based on a model integrating a text similarity technique with a semantic analysis technique, WriteEval performs well on responses written by fourth graders in response to short-text science questions. Further, it was found that WriteEval's assessments correlate with summative analyses of student performance.
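The text-similarity half of such a hybrid scorer can be illustrated with bag-of-words cosine similarity between a student answer and a reference answer; this is a generic stand-in, not WriteEval's actual similarity technique:

```python
import math
from collections import Counter

def cosine_similarity(a, b):
    """Bag-of-words cosine similarity between two short texts: a common,
    generic text-similarity measure used here only as illustration."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(v * v for v in va.values()))
            * math.sqrt(sum(v * v for v in vb.values())))
    return dot / norm if norm else 0.0

# Hypothetical reference answer and student response to a science item.
reference = "the magnet attracts the iron filings"
answer = "the magnet attracts metal"
sim = cosine_similarity(reference, answer)
```

A hybrid scorer would combine such a surface-similarity score with a semantic-analysis score before predicting the rubric grade.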

Institutional perspectives

Techniques for data-driven curriculum analysis BIBAFull-Text 148-157
  Gonzalo Méndez; Xavier Ochoa; Katherine Chiluiza
One of the key promises of Learning Analytics research is to create tools that could help educational institutions gain better insight into the inner workings of their programs, in order to tune or correct them. This work presents a set of simple techniques that, applied to readily available historical academic data, could provide such insights. The techniques described are real course difficulty estimation, dependence estimation, curriculum coherence, dropout paths, and load/performance graphs. The description of these techniques is accompanied by their application to real academic data from a Computer Science program. The results of the analysis are used to obtain recommendations for curriculum re-design.
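As an illustration of what a "real course difficulty" estimate from historical records might look like, one simple heuristic scores a course by how far students' grades in it fall below their own overall GPA; this is an assumed formulation for illustration, not necessarily the paper's estimator:

```python
def course_difficulty(records):
    """Estimate per-course difficulty as the mean shortfall of each
    student's course grade relative to their own GPA. Positive values
    mean students underperform their usual level in that course.
    An illustrative heuristic, not the paper's actual technique."""
    by_course = {}
    for course, grade, gpa in records:
        by_course.setdefault(course, []).append(gpa - grade)
    return {c: sum(d) / len(d) for c, d in by_course.items()}

# Hypothetical (course, grade, student_gpa) rows from academic records.
rows = [("Calculus", 6.0, 8.0), ("Calculus", 7.0, 8.0), ("Intro CS", 9.0, 8.0)]
difficulty = course_difficulty(rows)
```

The appeal of such estimates is that they control for student ability using data every registrar already holds.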
The impact of learning analytics on the Dutch education system BIBAFull-Text 158-162
  Hendrik Drachsler; Slavi Stoyanov; Marcus Specht
The article reports the findings of a Group Concept Mapping study that was conducted within the framework of the Learning Analytics Summer Institute (LASI) in the Netherlands. Learning Analytics are expected to be beneficial for student and teacher empowerment, personalization, research on learning design, and feedback on performance. The study surfaced some management and economics issues and identified some possible threats. No differences were found between novices and experts on how important and feasible the changes in education triggered by Learning Analytics are.
An exercise in institutional reflection: the learning analytics readiness instrument (LARI) BIBAFull-Text 163-167
  Kimberly E. Arnold; Steven Lonn; Matthew D. Pistilli
While the landscape of learning analytics is relatively well defined, the extent to which institutions are ready to embark on an analytics implementation is less known. Further, while work has been done on measuring the maturity of an institution's implementation, this work fails to investigate how an institution that has not yet implemented analytics might become mature over time. To that end, the authors developed and piloted a survey, the Learning Analytics Readiness Instrument (LARI), in an attempt to help institutions successfully prepare for an analytics implementation. The LARI comprises 90 items encompassing five factors related to a learning analytics implementation: (1) Ability, (2) Data, (3) Culture and Process, (4) Governance and Infrastructure, and (5) Overall Readiness Perception. Each of the five factors has a high internal consistency, as does the overall tool. This paper discusses the need for a survey such as the LARI, the tool's psychometric properties, the authors' broad interpretations of the findings, and next steps for the LARI and research in this field.
Competency map: visualizing student learning to promote student success BIBAFull-Text 168-172
  Jeff Grann; Deborah Bushway
Adult students often struggle to appreciate the relevance of their higher educational experiences to their careers. Capella University's competency map is a dashboard that visually indicates each student's status relative to specific assessed competencies. MBA students who utilize their competency map demonstrate competencies at slightly higher levels and persist in their program at greater rates, even after statistically controlling for powerful covariates, such as course engagement.

Analysis of resource use in LMS

Analysis of dynamic resource access patterns in a blended learning course BIBAFull-Text 173-182
  Tobias Hecking; Sabrina Ziebarth; H. Ulrich Hoppe
This paper presents an analysis of resource access patterns in a recently conducted master-level university course. A special feature of the course was its new teaching approach, which provided additional learning resources such as wikis, self-tests, and videos. To gain deeper insights into the usage of the provided learning material, we built dynamic bipartite student-resource networks based on event logs of resource access. These networks are analysed using methods adapted from social network analysis. In particular, we uncover bipartite clusters of students and resources in these networks and propose a method to identify patterns and traces of their evolution over time.
Analyzing the log patterns of adult learners in LMS using learning analytics BIBAFull-Text 183-187
  Il-Hyun Jo; Dongho Kim; Meehyun Yoon
In this paper, we describe a process of constructing proxy variables that represent adult learners' time management strategies in an online course. Based upon previous research, three values were selected from a data set. According to the result of empirical validation, an (ir)regularity of the learning interval was proven to be correlative with and predict learning performance. As indicated in previous research, regularity of learning is a strong indicator to explain learners' consistent endeavors. This study demonstrates the possibility of using learning analytics to address a learner's specific competence on the basis of a theoretical background. Implications for the learning analytics field seeking a pedagogical theory-driven approach are discussed.
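The (ir)regularity proxy can be illustrated as the dispersion of gaps between consecutive logins: the steadier the study rhythm, the lower the value. The variable definition below is a sketch, not the paper's exact construction:

```python
import statistics

def interval_irregularity(login_times):
    """Irregularity of study timing as the population standard deviation
    of gaps between consecutive logins (here in hours). Lower values
    mean a more regular study rhythm. A sketch of the (ir)regularity
    idea, not the paper's exact proxy variable."""
    gaps = [b - a for a, b in zip(login_times, login_times[1:])]
    return statistics.pstdev(gaps)

# Hypothetical login timestamps (hours since course start).
regular = [0, 24, 48, 72, 96]    # logs in once a day, like clockwork
irregular = [0, 5, 60, 61, 96]   # bursts followed by long silences
```

Per the abstract's claim, lower irregularity would be expected to correlate with better learning performance.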
Practice exams make perfect: incorporating course resource use into an early warning system BIBAFull-Text 188-192
  Richard Joseph Waddington; SungJin Nam
Early Warning Systems (EWSs) are being developed and used more frequently to aggregate multiple sources of data and provide timely information to stakeholders about students in need of academic support. As these systems grow more complex, there is an increasing need to incorporate relevant and real-time course-related information that could be predictive of a student's success or failure. This paper presents an investigation of how to incorporate students' use of course resources from a Learning Management System (LMS) into an existing EWS. Specifically, we focus our efforts on understanding the relationship between course resource use and a student's final course grade. Using ten semesters of LMS data from a required Chemistry course, we categorized course resources into four categories. We used a multinomial logistic regression model with semester fixed-effects to estimate the relationship between course resource use and the likelihood that a student receives an "A" or "B" in the course versus a "C." Results suggest that students who use Exam Preparation or Lecture resources to a greater degree than their peers are more likely to receive an "A" or "B" as a final grade. We discuss the implications of our results for the further development of this EWS and EWSs in general.
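As a minimal sketch of the kind of model the abstract names, multinomial logistic regression with semester fixed effects, here is a numpy implementation run on synthetic data. The feature names and data-generating process are invented for illustration, not taken from the study:

```python
import numpy as np

def fit_softmax(X, y, n_classes, lr=0.5, steps=2000):
    """Multinomial logistic regression fit by batch gradient descent.

    X: (n, d) design matrix; fixed effects (e.g. semester dummies)
    enter simply as extra columns of X. y: integer labels in
    [0, n_classes).
    """
    n, d = X.shape
    W = np.zeros((d, n_classes))
    Y = np.eye(n_classes)[y]                  # one-hot targets
    for _ in range(steps):
        Z = X @ W
        Z -= Z.max(axis=1, keepdims=True)     # numerical stability
        P = np.exp(Z)
        P /= P.sum(axis=1, keepdims=True)     # softmax probabilities
        W -= lr * X.T @ (P - Y) / n           # gradient step
    return W

# Synthetic data: grade class (0="C", 1="B", 2="A") driven by a
# hypothetical "exam-prep use" feature, plus a semester dummy.
rng = np.random.default_rng(0)
n = 300
prep = rng.normal(size=n)
semester = rng.integers(0, 2, size=n)
grade = (prep > -0.5).astype(int) + (prep > 0.5).astype(int)
X = np.column_stack([np.ones(n), prep, semester])
W = fit_softmax(X, grade, 3)
pred = np.argmax(X @ W, axis=1)
```

With enough gradient steps the learned decision boundaries approach the two thresholds in the synthetic data, so predicted grades track the true classes closely.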

Learning analytics and learning design

Educational data sciences: framing emergent practices for analytics of learning, organizations, and systems BIBAFull-Text 193-202
  Philip J. Piety; Daniel T. Hickey; M. J. Bishop
In this paper, we develop a conceptual framework for organizing emerging analytic activities involving educational data that can fall under broad and often loosely defined categories, including Academic/Institutional Analytics, Learning Analytics/Educational Data Mining, Learner Analytics/Personalization, and Systemic Instructional Improvement. While our approach is substantially informed by both higher education and K-12 settings, this framework is developed to apply across all educational contexts where digital data are used to inform learners and the management of learning. Although we can identify movements that are relatively independent of each other today, we believe they will in all cases expand from their current margins to encompass larger domains and increasingly overlap. The growth in these analytic activities leads to the need to find ways to synthesize understandings, find common language, and develop frames of reference to help these movements develop into a field.
Designing pedagogical interventions to support student use of learning analytics BIBAFull-Text 203-211
  Alyssa Friend Wise
This article addresses a relatively unexplored area in the emerging field of learning analytics, the design of learning analytics interventions. A learning analytics intervention is defined as the surrounding frame of activity through which analytic tools, data, and reports are taken up and used. It is a soft technology that involves the orchestration of the human process of engaging with the analytics as part of the larger teaching and learning activity. This paper first makes the case for the overall importance of intervention design, situating it within the larger landscape of the learning analytics field, and then considers the specific issues of intervention design for student use of learning analytics. Four principles of pedagogical learning analytics intervention design that can be used by teachers and course developers to support the productive use of learning analytics by students are introduced: Integration, Agency, Reference Frame and Dialogue. In addition, three core processes in which to engage students are described: Grounding, Goal-Setting and Reflection. These principles and processes are united in a preliminary model of pedagogical learning analytics intervention design for students, presented as a starting point for further inquiry.
A cognitive processing framework for learning analytics BIBAFull-Text 212-216
  Andrew Gibson; Kirsty Kitto; Jill Willis
Incorporating a learner's level of cognitive processing into Learning Analytics presents opportunities for obtaining rich data on the learning process. We propose a framework called COPA that provides a basis for mapping levels of cognitive operation into a learning analytics system. We utilise Bloom's taxonomy, a theoretically respected conceptualisation of cognitive processing, and apply it in a flexible structure that can be implemented incrementally and with varying degrees of complexity within an educational organisation. We outline how the framework is applied, along with its key benefits and limitations. Finally, we apply COPA to a University undergraduate unit, and demonstrate its utility in identifying key missing elements in the structure of the course.

Discourse and argumentation

Statistical discourse analysis of online discussions: informal cognition, social metacognition and knowledge creation BIBAFull-Text 217-225
  Ming Ming Chiu; Nobuko Fujita
To statistically model large data sets of knowledge processes during asynchronous, online forums, we must address analytic difficulties involving the whole data set (missing data, nested data and the tree structure of online messages), dependent variables (multiple, infrequent, discrete outcomes and similar adjacent messages), and explanatory variables (sequences, indirect effects, false positives, and robustness). Statistical discourse analysis (SDA) addresses all of these issues, as shown in an analysis of 1,330 asynchronous messages written and self-coded by 17 students during a 13-week online educational technology course. The results showed how attributes at multiple levels (individual and message) affected knowledge creation processes. Men were more likely than women to theorize. Asynchronous messages created a micro-sequence context; opinions and asking about purpose preceded new information; anecdotes, opinions, different opinions, elaborating ideas, and asking about purpose or information preceded theorizing. These results show how informal thinking precedes formal thinking and how social metacognition affects knowledge creation.
Uncovering what matters: analyzing transitional relations among contribution types in knowledge-building discourse BIBAFull-Text 226-230
  Bodong Chen; Monica Resendes
Temporality matters for analysis of collaborative learning. The present study attempts to uncover temporal patterns that distinguish "productive" threads of knowledge building inquiry. Using a rich knowledge building discourse dataset, in which notes' contribution types and threads' productivity have been coded, a secondary temporal analysis was conducted. In particular, Lag-sequential Analysis was conducted to identify transitional patterns among different contribution types that distinguish productive threads from "improvable" ones. Results indicated that productive inquiry threads involved significantly more transitions among questioning, theorizing, obtaining information, and working with information; in contrast, responding to questions and theories by merely giving opinions was not sufficient to achieve knowledge progress. This study highlights the importance of investigating temporality in collaborative learning and calls for attention to developing and testing temporal analysis methods in learning analytics research.
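Lag-sequential analysis of the kind described starts from lag-1 transition frequencies between coded contribution types. The sketch below (the codes and the coded thread are invented examples, not the study's data) computes transition counts and conditional transition probabilities:

```python
from collections import Counter

def transition_counts(sequence):
    """Count lag-1 transitions between coded contribution types."""
    return Counter(zip(sequence, sequence[1:]))

def transition_probs(sequence):
    """Estimate P(next code | current code) from one coded thread."""
    counts = transition_counts(sequence)
    totals = Counter()
    for (a, _b), c in counts.items():
        totals[a] += c
    return {(a, b): c / totals[a] for (a, b), c in counts.items()}

# Hypothetical coded thread of contribution types:
thread = ["question", "theorize", "obtain-info", "work-with-info",
          "question", "opinion"]
probs = transition_probs(thread)
```

A full lag-sequential analysis would additionally test each transition frequency against chance (e.g. via adjusted residuals), which is omitted here.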

Who we are and who we want to be

Current state and future trends: a citation network analysis of the learning analytics field BIBAFull-Text 231-240
  Shane Dawson; Dragan Gaševic; George Siemens; Srecko Joksimovic
This paper provides an evaluation of the current state of the field of learning analytics through analysis of articles and citations occurring in the LAK conferences and identified special issue journals. The emerging field of learning analytics lies at the intersection of numerous academic disciplines, and therefore draws on a diversity of methodologies, theories and underpinning scientific assumptions. Through citation analysis and structured mapping we aimed to identify emerging trends and disciplinary hierarchies that are influencing the development of the field to date. The results suggest that there is some fragmentation in the major disciplines (computer science and education) regarding conference and journal representation. The analyses also indicate that the commonly cited papers are of a more conceptual nature than empirical research, reflecting the need for authors to define the learning analytics space. An evaluation of the current state of learning analytics provides numerous benefits for the development of the field, such as guiding attention to under-represented areas of research and identifying disciplines that may require more strategic and targeted support and funding opportunities.
Teaching the unteachable: on the compatibility of learning analytics and humane education BIBAFull-Text 241-245
  Timothy D. Harfield
This paper is an exploratory effort to find a place for learning analytics in humane education. After distinguishing humane education from training on the basis of the Aristotelian model of intellectual capabilities, and arguing that humane education is distinct by virtue of its interest in cultivating prudence, which is unteachable, an account of three key characteristics of humane education is provided. Appealing to thinkers of the Italian Renaissance, it is argued that ingenium, eloquence, and self-knowledge constitute the what, how, and why of humane education. Lastly, looking to several examples from recent learning analytics literature, it is demonstrated that learning analytics is not only helpful as a set of aids for ensuring success in scientific and technical disciplines, but in the humanities as well. In order to function effectively as an aid to humane education, however, learning analytics must be embedded within a context that encourages continuous reflection, responsiveness, and personal responsibility for learning.
Establishing an ethical literacy for learning analytics BIBAFull-Text 246-250
  Jenni Swenson
This paper borrows multiple frameworks from the field of technical communication in order to review theory, research, practice, and ethics of the Learning Analytics and Knowledge (LAK) discipline. These frameworks also guide discussion on the ethics of learning analytics "artifacts" (data visualizations, dashboards, and methodology), and the ethical consequences of using learning analytics (classification, social power moves, and absence of voice). Finally, the author suggests a literacy for learning analytics that includes an ethical viewpoint.

Panels

Setting learning analytics in context: overcoming the barriers to large-scale adoption BIBAFull-Text 251-253
  Rebecca Ferguson; Doug Clow; Leah Macfadyen; Alfred Essa; Shane Dawson; Shirley Alexander
Once learning analytics have been successfully developed and tested, the next step is to implement them at a larger scale -- across a faculty, an institution or an educational system. This introduces a new set of challenges, because education is a stable system, resistant to change. Implementing learning analytics at scale involves working with the entire technological complex that exists around technology-enhanced learning (TEL). This includes the different groups of people involved -- learners, educators, administrators and support staff -- the practices of those groups, their understandings of how teaching and learning take place, the technologies they use and the specific environments within which they operate. Each element of the TEL Complex requires explicit and careful consideration during the process of implementation, in order to avoid failure and maximise the chances of success. In order for learning analytics to be implemented successfully at scale, it is crucial to provide not only the analytics and their associated tools but also appropriate forms of support, training and community building.
Learning analytics for the social media age BIBAFull-Text 254-256
  Anatoliy Gruzd; Caroline Haythornthwaite; Drew Paulin; Rafa Absar; Michael Huggett
In just a short period of time, social media have altered many aspects of our daily lives, from how we form and maintain social relationships to how we discover, access and share information online. Now social media are also beginning to affect how we teach and learn in this increasingly interconnected and information-rich world. The panelists will discuss their ongoing work that seeks to understand the affordances and potential roles of social media in learning, as well as to determine and provide methods that can help researchers and educators evaluate the use of social media for teaching and learning based on automated analyses of social media texts and networks. The panel will focus on the first phase of this five-year research initiative "Learning Analytics for the Social Media Age" funded by the Social Science and Humanities Research Council of Canada (2013-2018).
Building institutional capacities and competencies for systemic learning analytics initiatives BIBAFull-Text 257-260
  Kimberly E. Arnold; Grace Lynch; Daniel Huston; Lorna Wong; Linda Jorn; Christopher W. Olsen
The last five years have brought an explosion of research in the learning analytics field. However, much of what has emerged has been small scale or tool-centric. While these efforts are vitally important to the development of the field, in order to truly transform education, learning analytics must scale and become institutionalized at multiple levels throughout an educational system. Many institutions are currently undertaking this grand challenge and this panel will highlight cases from: the University of Wisconsin System, the Society for Learning Analytics Research, the University of New England, and Rio Salado College.

Posters

Hanzi handwriting acquisition with automatic feedback BIBAFull-Text 261-262
  Chin-Hwa Kuo; Jian-Wen Peng; Wen-Chen Chang
One of the most crucial distinctions between Chinese and Western languages is that the former is based on ideograms, whereas the latter are based on phonograms. Because of this distinction, Western learners of Chinese often have more difficulty than native speakers in grasping correct character stroke sequence and/or stroke direction. In this paper, we designed a HanZi writing environment with automatic feedback to address this issue. Before collecting HanZi characters on a massive scale, we conducted a pilot study to collect handwritten Chinese samples from 160 college students in the U.S. The findings from this study enabled us to further refine the learning environment and to design optimal learning and teaching strategies for learners and teachers.
Analyzing student notes and questions to create personalized study guides BIBAFull-Text 263-264
  Perry J. Samson
In the foreseeable future it will be technically possible for instructors, advisors and other delegated representatives of a college or university to access student participation and performance data in near-real time. One potential benefit of this increased data flow could include an improved ability to identify students at risk of academic failure or withdrawal. The availability of these data could also lead to creation of new adaptive learning measures that can automatically provide students personalized guidance.
   This demonstration will describe how the student notes and questions are being mined to provide student study guides that automatically link to outside resources. The demonstration will also report on how these new study guides have been received by the students and how they are at least partially responsible for a significant increase in student outcomes.
Visual analytics of academic writing BIBAFull-Text 265-266
  Duygu Simsek; Simon Buckingham Shum; Anna De Liddo; Rebecca Ferguson; Ágnes Sándor
This paper describes a novel analytics dashboard which visualises the key features of scholarly documents. The Dashboard aggregates the salient sentences of scholarly papers, their rhetorical types and the key concepts mentioned within these sentences. These features are extracted from papers through a Natural Language Processing (NLP) technology, called Xerox Incremental Parser (XIP). The XIP Dashboard is a set of visual analytics modules based on the XIP output. In this paper, we briefly introduce the XIP technology and demonstrate an example visualisation of the XIP Dashboard.
Open academic early alert system: technical demonstration BIBAFull-Text 267-268
  Sandeep M. Jayaprakash; Eitel J. M. Lauría
This paper synthesizes some of the technical decisions, design strategies and concepts developed during the execution of the Open Academic Analytics Initiative (OAAI), a research program aimed at improving student retention rates in colleges by deploying an open-source academic early alert system to identify students at academic risk. The paper describes a prototype demonstration of the system, detailing several dimensions of data mining and analysis: data integration, predictive modelling, and scoring with reporting. The paper should be relevant to practitioners and academics who want to better understand the implementation of an OAAI academic early-alert system.
Educational technology approach toward learning analytics: relationship between student online behavior and learning performance in higher education BIBAFull-Text 269-270
  Taeho Yu; Il-Hyun Jo
The aim of this study is to suggest more meaningful components for learning analytics, taking an educational technology approach, in order to help learners continuously improve their learning achievement. Multiple linear regression analysis was conducted to determine which factors influence students' academic achievement. Eighty-four undergraduate students at a women's university in South Korea participated in this study. The six-predictor model accounted for 33.5% of the variance in final grade, F(6, 77) = 6.457, p < .001, R² = .335. Total studying time in the LMS, interaction with peers, regularity of the learning interval in the LMS, and number of downloads were significant factors in students' academic achievement in the online learning environment. These four controllable variables not only predict learning outcomes significantly but can also be changed if learners put more effort into improving their academic performance. The results provide a rationale for interventions targeting students' time management efforts.
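The reported model is a six-predictor multiple linear regression; the mechanics of fitting such a model and computing R² can be sketched with numpy. The predictor names below echo the abstract, but the data are simulated for illustration, not the study's:

```python
import numpy as np

def ols_r2(X, y):
    """Fit y = X @ b by ordinary least squares; return (b, R^2)."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    return b, 1.0 - ss_res / ss_tot

# Simulated stand-ins for predictors such as total studying time
# and regularity of the learning interval.
rng = np.random.default_rng(1)
n = 84
study_time = rng.normal(size=n)
regularity = rng.normal(size=n)
downloads = rng.normal(size=n)
grade = 0.4 * study_time + 0.3 * regularity + rng.normal(size=n)
X = np.column_stack([np.ones(n), study_time, regularity, downloads])
b, r2 = ols_r2(X, grade)
```

With an intercept column included, R² is guaranteed to fall between 0 and 1, matching the interpretation of the study's reported .335.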
Visualizing semantic space of online discourse: the knowledge forum case BIBAFull-Text 271-272
  Bodong Chen
This poster presents an early experiment in applying topic modeling and visualization techniques to analyze online discourse. In particular, Latent Dirichlet Allocation was used to convert discourse into a high-dimensional semantic space. To explore meaningful visualizations of the space, Locally Linear Embedding was performed to reduce it to two dimensions. Further, time series analysis was applied to track the evolution of topics in the space. This work will lead to new analytic tools for collaborative learning.
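The pipeline the poster outlines, LDA into a topic space followed by Locally Linear Embedding down to two dimensions, can be sketched with scikit-learn. The toy term-count matrix and all parameter choices here are assumptions for illustration:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.manifold import LocallyLinearEmbedding

# Toy term-count matrix: 20 short "notes" over a 6-word vocabulary.
rng = np.random.default_rng(2)
counts = rng.integers(1, 5, size=(20, 6))

# Step 1: LDA maps each note into a topic (semantic) space.
lda = LatentDirichletAllocation(n_components=4, random_state=0)
topic_space = lda.fit_transform(counts)        # shape (20, 4)

# Step 2: LLE reduces the topic space to 2-D for plotting.
lle = LocallyLinearEmbedding(n_components=2, n_neighbors=5,
                             random_state=0)
coords = lle.fit_transform(topic_space)        # shape (20, 2)
```

In practice the input would be a notes-by-terms count matrix built from the discourse corpus, and the 2-D coordinates would feed the visualization.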
eGraph tool: graphing the learning process in LMSs BIBAFull-Text 273-274
  Rebeca Cerezo; Natalia Suarez; J. Carlos Núñez; Miguel Sánchez-Santillán
eGraph is a virtual tool developed to make it easier to track students' learning processes in Learning Management Systems such as Moodle. It is based on the log files that the learning platform records while students interact with it, and allows teachers, students, and researchers to track the learning route that learners have followed during a particular time span.
Effects of image-based and text-based activities on student learning outcomes BIBAFull-Text 275-276
  Anne K. Greenberg; Melissa Gross; Mary C. Wright
Research on the benefits of visual learning has relied primarily on lecture-based pedagogy, not accounting for the processing time students need to make sense of both visual and verbal material [8]. In this study, we investigate the potential differential effects of text-based and image-based student learning activities on student learning outcomes in a functional anatomy course. When controlling for demographics and prior GPA, participation in in-class image-based activities is significantly correlated with performance on associated exam questions, while text-based engagement is not. Additionally, students rated image-based activities as helpful for seeing images of key ideas and as significantly less mentally taxing than text-based activities.
Peer evaluation of student generated content BIBAFull-Text 277-278
  Jared Tritz; Nicole Michelotti; Ginger Shultz; Tim McKay; Barsaa Mohapatra
We will present three similar studies that examine online peer evaluation of student-generated explanations for missed exam problems in introductory physics. In the first study, students created video solutions using YouTube; in the second and third, they created written solutions using Google documents. All peer evaluations were performed using a tournament module that is part of E2Coach [4], an interactive online coaching system at the University of Michigan. With the theme of LAK 2014 being the "intersection of learning analytics research, theory and practice", we think this poster will provide an accessible example that combines a classroom experiment with rigorous analysis to understand outcomes.
Patterns of persistence: what engages students in a remedial English writing MOOC? BIBAFull-Text 279-280
  John Whitmer; Eva Schiorring; Pat James
MOOCs have the potential to help institutions and students needing remedial English language instruction in two ways. First, with their capacity to use a wide range of instructional approaches and to emphasize contextualized and visual learning, MOOCs can offer potentially more effective pedagogical approaches for remedial students. Second, if students are more successful in meeting college-level English competencies, MOOCs can help institutions and students conserve their limited resources. Similarly, MOOCs offer domestic and international employers opportunities to provide professional development to workers in ways that are flexible, affordable, and interactive.
National differences in an international classroom BIBAFull-Text 281-282
  Jennifer DeBoer; Glenda S. Stump
The virtual classrooms of open online courses include students from a vast array of individual, social, economic, and educational contexts. Detailed data were collected for the first course MIT ran on the edX platform, including student behavior, performance, and background information. In this paper, we estimate the systematic differences in average performance, distribution of performance, and performance conditional on behaviors for countries with different characteristics (e.g., language, income).

Workshops

DCLA14: second international workshop on discourse-centric learning analytics BIBAFull-Text 283-284
  Rebecca Ferguson; Anna De Liddo; Denise Whitelock; Maarten de Laat; Simon Buckingham Shum
The first international workshop on discourse-centric learning analytics (DCLA) took place at LAK13 in Leuven, Belgium. That workshop succeeded in its aim of catalysing ideas and building community connections between those working in this field of social learning analytics. It also proposed a mission statement for DCLA: to devise and validate analytics that look beyond surface measures in order to quantify linguistic proxies for deeper learning. This year, the focus of the second international DCLA workshop, like that of LAK14, is on the intersection of learning analytics research, theory and practice. Once researchers have developed and validated discourse-centric analytics, how can these be successfully deployed at scale to support learning?
Computational approaches to connecting levels of analysis in networked learning communities BIBAFull-Text 285-286
  H. Ulrich Hoppe; Daniel D. Suthers
The focus of this workshop is on the potential benefits and challenges of using specific computational methods to analyze interactions in networked learning environments, particularly with respect to integrating multiple analytic approaches towards understanding learning at multiple levels of agency, from individual to collective. The workshop is designed for researchers interested in analytical studies of collaborative and networked learning in socio-technical networks, using data-intensive computational methods of analysis (including social-network analysis, log-file analysis, information extraction and data mining). The workshop may also be of interest to pedagogical professionals and educational decision makers who want to evaluate the potential of learning analytics techniques to better inform their decisions regarding learning in technology-rich environments.
Learning analytics and machine learning BIBAFull-Text 287-288
  Dragan Gasevic; Carolyn Rose; George Siemens; Annika Wolff; Zdenek Zdrahal
Learning analytics (LA) as a field remains in its infancy. Many of the techniques now prominent among practitioners have been drawn from various fields, including HCI, statistics, computer science, and learning sciences. In order for LA to grow and advance as a discipline, two significant challenges must be met: 1) the development of analytics methods and techniques that are native to the LA discipline, and 2) the development by LA practitioners of algorithms and models that reflect the social and computational dimensions of analytics. This workshop introduces researchers in learning analytics to machine learning (ML) and the opportunities that ML can provide in building next-generation analysis models.
The learning analytics & knowledge (LAK) data challenge 2014 BIBAFull-Text 289-290
  Hendrik Drachsler; Stefan Dietze; Eelco Herder; Mathieu d'Aquin; Davide Taibi
The LAK Data Challenge 2014 continues the research efforts of the second edition by stimulating research on the evolving fields of Learning Analytics (LA) and Educational Data Mining (EDM). Building on a series of activities of the LinkedUp project, the challenge aims to generate new insights into and analyses of the LA & EDM disciplines, and is supported by the LAK Dataset -- a unique corpus of LA & EDM literature, exposed in structured and machine-readable formats.