Learning Analytics: Implications for Higher Education



Table of contents

Preface

Editorial: Learning Analytics: Implications for Higher Education

Wolfgang Greller, Ulrich Hoppe

Adding dispositions to create pedagogy-based Learning Analytics

Dirk Tempelaar, Bart Rienties, Quan Nguyen

Using Learning Analytics to Investigate Student Performance in Blended Learning Courses

Wolfgang Greller, Mohammad Issack Santally, Ravindra Boojhawon, Yousra Rajabalee, Roopesh Kevin Sungkur

Learning Analytics and Survey Data Integration in Workload Research

Evgenia Samoilova, Florian Keusch, Tobias Wolbring

Predicting learning success in online learning environments: Self-regulated learning, prior knowledge and repetition

Karl Ledermüller, Irmgard Fallmann

Driving Student Motivation in MOOCs through a Conceptual Activity-Motivation Framework

Mohammad Khalil, Martin Ebner

Free Articles

Actors of Digitalisation in the Higher Education System: Modernisation or Profile-Building?

Barbara Getto, Michael Kerres

Transforming a Course into a Blended-Learning Format: Feasible and Effective for Learning

Lukas Lochner, Heike Wieser, Simone Waldboth, Maria Mischo-Kelling

“Investigating One’s Own Teaching” – a Success Factor?

Monika Wyss, Wolfgang Beywl, Kathrin Pirani, Donat Knecht

Deep Learning in the Practical Semester: Relationships with Emotion Regulation

Robert Kordts-Freudinger, Thomas Große Honebrink, Dagmar Festner

Preface

As the scientific publication organ of the Forum neue Medien in der Lehre Austria, the Zeitschrift für Hochschulentwicklung (Journal for Higher Education Development) is of particular importance. It addresses current topics in higher education development in the areas of both studying and teaching, and, as a German-language (and particularly Austrian) medium, it provides a platform for academics, practitioners, higher education developers and didactic experts to exchange ideas. Furthermore, since the ZFHE is designed as an open-access journal, it is available to anyone free of charge as an electronic publication.

In 2014 and 2015, the journal attracted more than 30,000 visitors per year. Monthly visits peaked at more than 3,500, which corresponds to an average of over 100 visitors per day. In addition, Google Scholar Metrics now rank the journal among the fifty best German-language scientific journals.

This success can be attributed to the efforts of the international editorial board and the rotating issue editors, who are committed to producing at least four editions annually. Moreover, continuing subsidies from the Austrian Federal Ministry of Science, Research and Economy guarantee the long-term existence of the journal. As the journal would not exist without this support, we would like to express our gratitude to the Ministry.

Since last year, the ZFHE has published at least one English-language edition per year on a topic of international interest. This special issue gathers recent experiences and research examples concerning the use of Learning Analytics in higher education contexts of online and blended learning. All featured articles span technically enabled data collection and processing/analysis, on the one hand, and pedagogically motivated decision making by learners, teachers and other stakeholders, on the other.

Since the 9/3 edition, the ZFHE has also been available in printed form, which can be purchased anywhere. The association Forum neue Medien in der Lehre Austria is pleased that this valuable supplement to the electronic publication anchors the topic of “higher education development” within a much broader scientific community.

In this spirit, we hope you will enjoy reading the present edition!

Martin Ebner and Hans-Peter Steinbacher

Chairmen of the association Forum neue Medien in der Lehre Austria

Wolfgang GRELLER1 (Vienna) & Ulrich HOPPE (Duisburg)

Editorial: Learning Analytics: Implications for Higher Education

1 Motivation

The increased use of digital systems to support learning and teaching in higher education goes hand in hand with increased possibilities to collect data on students’ behaviour in those systems. It is now possible to gather data unobtrusively on when and what students contribute to a discussion forum, when they open a webcast lecture, and when and how they take an assignment or test.

The term “Learning Analytics” refers to the use of such data, and of the results of processing these data, for educational purposes. A commonly used definition (SIEMENS, 2011) is: “Learning Analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.” The abundance of learning-related data has resulted in a diverse set of research questions and research approaches (for an overview of pertinent issues: GRELLER & DRACHSLER, 2012). Some researchers advocate a bottom-up approach that encourages the use of data mining techniques, while others argue that educational data sets and technical analyses only become meaningful when guided by a learning theory. We believe that a good combination of both ingredients is most desirable and promising. At least two questions emerge in this respect: what information to provide, and how to present it (e.g. ALI, HATALA, GASEVIC & JOVANOVIC, 2012).

For higher education institutions, ownership of learner data originating from delivery systems and learning platforms offers new ways to evaluate their learning services, success rates and student support requirements, and may help reduce drop-out and failure. Learning Analytics thus has the claimed potential to support “evidence-based” decision making.

Current discussions about Learning Analytics also pertain to ethical issues (SLADE & PRINSLOO, 2013). These concern the sharing and exploitation of educational datasets for wider use within and outside educational institutions, for purposes including research as well as commercialisation. How to make learner data more accessible without jeopardising personal privacy or learners’ independent control over their own learning is a question still awaiting viable solutions (DRACHSLER & GRELLER, 2016). Steps in this direction have been undertaken by policy makers at the Open University (2014) and through the JISC Code of Practice (SCLATER & BAILEY, 2015).

In addition to the ongoing scientific and technical developments in the domain, there is increased activity at the institutional level to put Learning Analytics into everyday teaching and learning practice. These efforts are beginning to show signs of impact on learners, teachers, and the organisation as a whole. The current volume explores various examples and real-life experiments with student data that demonstrate this potential to support student learning in various ways.

2 Contributions

This Special Issue gathers recent experiences and research examples concerning the use of Learning Analytics in higher education contexts of online and blended learning. All featured articles span across technically enabled data collection and processing/analysis, on the one hand, and, pedagogically motivated decision making by learners, teachers and other stakeholders on the other.

Focusing on gaining actionable results from data and the ensuing analyses, Tempelaar, Rienties and Nguyen report on results from an introductory blended learning course on mathematics and statistics using worked-out examples as part of the SOWISO learning environment. They adopt the perspective of “Dispositional Learning Analytics” (DLA) and argue that this allows data to be linked to concrete pedagogical decisions as actionable results. The variable they focus on is the frequency and timing of students’ calls for worked-out examples. As a main finding, early usage appears to go along with better overall performance. They also study how this usage interacts with learning-emotion dispositions such as enjoyment, curiosity, frustration, or anxiety. These data are made available to both learners and tutors with the aim of informing and influencing self-regulation processes. This work makes a case for making learning data actionable rather than just using them for predictive purposes.

From the perspective of optimising higher education based on course data, the article by Greller, Santally, Boojhawon, Rajabalee, and Sungkur reports on findings from a freshman course in a “Web and Multimedia Programme” at the University of Mauritius. This course is offered in blended learning mode with a dominance of online modules (three out of five). The study is based on two complete cohorts from two consecutive years. At its core, the analysis rests on correlations between six normalised variables, including gender, High School Certificate scores (prior to entering the programme) and scores within the course, with specific dependencies corroborated through ANOVAs and regression analyses (a sketch of this style of analysis follows below). Interestingly, based on the respective marks, online modules appear to align more with continuous assessment, whereas face-to-face modules appear to favour summative examinations. There is an overall gender effect indicating a slightly better performance of female students. The dependency between regularity of participation and marks was stronger in the online modules than in the face-to-face modules. Based on these findings, the authors discuss strategies for further improving the delivery quality of the specific course and possible generalisations. This article thus exemplifies the value of summative analytics on higher education course data for quality improvement and course design.
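As a rough illustration of this style of analysis, the sketch below runs pairwise correlations and a one-way ANOVA over such course variables. The file and column names are assumptions made for the sketch, not the authors’ actual dataset.

    import pandas as pd
    from scipy import stats

    # Hypothetical export of the normalised course variables; the column
    # names below are illustrative assumptions, not the study's data.
    df = pd.read_csv("course_marks.csv")

    cols = ["hsc_score", "continuous_mark", "exam_mark", "participation"]
    print(df[cols].corr())  # pairwise Pearson correlations

    # One-way ANOVA: do exam marks differ between gender groups?
    groups = [g["exam_mark"].dropna() for _, g in df.groupby("gender")]
    f_stat, p_value = stats.f_oneway(*groups)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")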

The verification of student workload is the topic of an analysis by Samoilova, Keusch and Wolbring. It uses a contrastive method, comparing students’ self-assessed work effort with system data from video interaction logs, including the number of videos watched, how much of each video was watched, how long videos were played, and how much video material was covered. The student estimates were collected via surveys. Rather unsurprisingly, self-reports and system data diverged substantially, with students generally overestimating the time they spent. The authors conclude, among other things, that over-reporting may be due to social desirability, but that refined measurement would also be required. In any case, the article proposes to expand Learning Analytics into the new area of workload research, which is important for curriculum development and the comparability of degree courses.
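A minimal sketch of such a contrast between self-reports and logs might look as follows; the file layout and column names are assumptions for illustration only.

    import pandas as pd

    # Hypothetical inputs, assumed for the sketch.
    survey = pd.read_csv("survey.csv")    # columns: student, reported_hours
    logs = pd.read_csv("video_logs.csv")  # columns: student, logged_hours

    merged = survey.merge(logs, on="student")
    merged["over_report"] = merged["reported_hours"] - merged["logged_hours"]
    # A positive mean indicates systematic overestimation of workload.
    print(merged["over_report"].describe())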

A look at classical methodologies forms the basis of the argument by Ledermüller and Fallmann. They aim to show that classical factors remain valuable for explaining learning effectiveness when examined with system-generated data. Among the classical factors they validate for effective learning are prior knowledge and invested learning time. To validate their argument, they used a Learning Analytics approach to confirm the hypothesis for self-regulated learning scenarios with system data.

Khalil and Ebner look at massive open online courses (MOOCs), comparing student activities and engagement with success rates. In their view, the high drop-out rates associated with MOOCs can be tackled with activity-motivation analytics that use gamified feedback to stimulate students’ persistence, motivation to learn, and progress towards course completion. In their approach, the authors look at student interaction patterns on discussion forums, quizzes, and video lectures over the duration of a course. They find that certified users invested distinctly more time in these activities. Following from these conclusions, Khalil and Ebner develop a motivation prototype that visualises the variety of activities as the charging levels of a battery. A full charge can only be achieved through performance in all four defined activity areas. They express the hope that this type of visualisation helps students sustain their engagement and avoid dropping out of the course early.
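A minimal sketch of how such a battery-style indicator might be computed is given below. The choice of four activity areas, the equal weighting, and the target values are assumptions for illustration; the authors’ actual framework may aggregate differently.

    def battery_charge(forum, quiz, video, general, targets=(10, 5, 20, 15)):
        """Map four activity counts to a battery charge in percent.
        Each area contributes at most 25 points, so a full charge (100)
        requires engagement in all four areas, as the framework demands."""
        levels = (forum, quiz, video, general)
        return sum(min(x / t, 1.0) * 25 for x, t in zip(levels, targets))

    print(battery_charge(forum=10, quiz=5, video=20, general=15))  # 100.0
    print(battery_charge(forum=0, quiz=5, video=20, general=15))   # 75.0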

1 E-mail: [email protected]

3 Future Directions

As originally predicted by the Horizon Report (New Media Consortium, 2011), Learning Analytics is no longer a theoretical concept, but is in the process of entering teaching and learning practice. Early experimentation, as demonstrated in the presented contributions, widens the perceived potential of analytics to several additional and innovative ways of managing and organising teaching and learning in higher education. Among the ones highlighted here are:

Making data-based information actionable

Student motivation research

Workload measurement

Curriculum and course design to suit student needs and backgrounds

Confirmation of classical theories and empirical evidence

Interestingly, these topics follow a different line of investigation from the growing body of work in predictive analytics, which aims to anticipate student success or failure. In the areas mentioned here, there is instead a clear focus on everyday delivery and on a better understanding of student needs.

References

Ali, L., Hatala, M., Gasevic, D., & Jovanovic, J. (2012). A qualitative evaluation of evolution of a learning analytics tool. Computers & Education, 58(1), 470-489.

Drachsler, H., & Greller, W. (2016). Privacy and analytics: it’s a DELICATE issue – a checklist for trusted learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 89-98). ACM.

Greller, W., & Drachsler, H. (2012). Translating Learning into Numbers: A Generic Framework for Learning Analytics. Educational Technology & Society, 15(3), 42-57.

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Educational Technology & Society, 15(3), 149-163.

New Media Consortium (2011). The 2011 Horizon Report.

Open University (2014). Policy on Ethical use of Student Data for Learning Analytics. http://www.open.ac.uk/students/charter/essential-documents/ethical-use-student-data-learning-analytics-policy

Sclater, N., & Bailey, P. (2015). Code of Practice for Learning Analytics (JISC). https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics

Siemens, G. (2011). 1st International Conference on Learning Analytics and Knowledge, Banff, Alberta, February 27–March 1, 2011. https://tekri.athabascau.ca/analytics/

Slade, S., & Prinsloo, P. (2013). Learning analytics: ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1509-1528.

Editors

Univ.-Doz. Dr. Wolfgang GRELLER || Vienna University of Education || Grenzackerstraße 18, A-1100 Vienna

www.phwien.ac.at

[email protected]

Prof. Dr. H. Ulrich HOPPE || Research Group COLLIDE / Department INKO, University of Duisburg-Essen || Lotharstr. 63, D-47057 Duisburg

http://www.collide.info

[email protected]

Dirk TEMPELAAR2 (Maastricht), Bart RIENTIES & Quan NGUYEN (Walton Hall Milton Keynes)

Adding dispositions to create pedagogy-based Learning Analytics

Abstract

This empirical study aims to demonstrate how Dispositional Learning Analytics (DLA) can provide a strong connection between Learning Analytics (LA) and pedagogy. Where LA-based models typically do well in predicting course performance or student drop-out, they lack the actionable data needed to easily connect model predictions with educational interventions. Using a showcase based on the learning processes of 1080 students in a blended introductory quantitative course, we analysed students’ use of worked-out examples. Our method combines demographic and trace data from learning-management systems with self-reports based on several contemporary social-cognitive theories. Students differ not only in the intensity with which they use worked-out examples but also in where they position that usage in their learning cycle. These differences could be described both in terms of LA trace variables and in terms of students’ learning dispositions. We conjecture that combining learning dispositions with trace data has significant advantages for understanding students’ learning behaviours. Rather than focusing on low user engagement, lessons learned from LA applications should focus on potential causes of suboptimal learning, such as the application of ineffective learning strategies.

Keywords

Dispositional Learning Analytics, actionable data

2 email: [email protected]

1 Dispositional Learning Analytics

In order to use Learning Analytics (LA) “to evaluate different pedagogical strategies and their effects on learning and teaching through the analysis of learner data” (GRELLER & DRACHSLER, 2012, p. 48), we need to move beyond the mere collection of traces of student activity in Learning Management Systems (LMS). Beyond the issue of the low predictive power of some of these logged activity data (TEMPELAAR, RIENTIES & GIESBERS, 2015), the more important issue is the lack of ‘actionable data’ for educators (GASEVIC, DAWSON & SIEMENS, 2015). To design effective pedagogy-based learning interventions when predictions signal the need to intervene, it must be possible to link LA data to pedagogical theory.

In this contribution, we conjecture and provide first evidence that Dispositional LA (DLA; see BUCKINGHAM SHUM & DEAKIN CRICK, 2012; BUCKINGHAM SHUM & FERGUSON, 2012) has the potential to provide a pedagogy-based LA framework. Elsewhere (TEMPELAAR, RIENTIES & NGUYEN, 2016) we argued that a DLA infrastructure that combines learning data (i.e. data generated in learning activities through traces of an LMS) with learner data (e.g. student dispositions, values, and attitudes measured through self-report surveys) may generate data that are actionable. DLA applications not only provide prediction models that help identify students at risk, but do so using pedagogical descriptors, such as students high in deactivating negative learning emotions, or students using the suboptimal cognitive processing strategy of step-wise learning. Such pedagogical descriptors are easily linked to pedagogical theories and hence enable concrete actions, such as counselling directed at discovering where the negative learning emotions stem from, or practising the use of deep learning processing strategies.

In this showcase study, we focus on one specific trace variable: students requesting fully worked-out solutions. Which different pedagogical scenarios apply to fully worked-out solutions? And which learning dispositions act as antecedents of these scenarios? In answering these questions, we intend to demonstrate the pedagogical advantage of extending LA into DLA.

2 Use of fully worked-out solutions

The manner in which students seek feedback in their self-regulated learning activities constitutes one aspect of pedagogic behaviour (GRELLER & DRACHSLER, 2012). Worked-out examples represent one of several feedback formats in computer-enhanced environments (DUFFY & AZEVEDO, 2015), formats that differ, amongst other things, in the amount of guidance or assistance provided to students. Pedagogical research has identified four main instructional approaches for assisting learners in problem-solving (MCLAREN, VAN GOG, GANOE, KARABINOS & YARON, 2016), with varying degrees of learner support. First, the problem-solving approach is positioned at the low-guidance end of the continuum, offering little or no feedback to learners. Second, tutored problem solving provides learners with feedback and hints to solve the problem or construct the schema when a learner is stuck. This approach intervenes in the learning process only when help is needed; hence, it ensures that learners actively attempt to solve the problems. Third, erroneous examples present learners with flawed examples and instruct them to find, explain, and fix the errors. Finally, at the high end of learner support, MCLAREN et al. (2016) position the use of worked-out examples. The use of worked-out solutions in multimedia-based learning environments stimulates deep understanding (RENKL, 2014). Compared to the use of erroneous examples, tutored problem solving, and problem-solving in computer-based environments, the use of worked-out examples may be more efficient, as it reaches similar learning outcomes in less time and with less learning effort (MCLAREN et al., 2016).

Most of the above-cited studies are nested in laboratory settings, with students assigned to one of several experimental conditions, each representing one unique pedagogical feedback scenario. In authentic settings, students mix and match diverse pedagogical feedback scenarios, and do so in different orders. For example, some students will avoid using worked-out examples; others use worked-out examples at the start of a new learning cycle, whereas still others use them at the very end of their learning cycle.

Beyond detecting individual differences in preferences for pedagogical feedback scenarios, a next step is to explain these differences on the basis of learning dispositions. For example, studies of gender differences in learning mathematics suggest that female students profit more from having worked-out examples available at the very start of learning new mathematical concepts (BOLTJENS, 2004). As suggested by KOEDINGER, MCLAUGHLIN, ZHUXIN JIA & BIER (2016), LA-based models that encompass traces of all relevant pedagogical scenarios may lead not only to knowledge of preferred pedagogical scenarios and their relationship to learning dispositions, but also to knowledge of their efficiency.

Any attempt to solve an exercise can have three different outcomes: the student successfully solves the exercise, provides an incorrect answer, or provides no answer and instead calls for a worked-out solution. In each of these cases, the student can also call for a supportive Hint. These functionalities are examples of the Knowledge of the Correct Response (KCR) and Knowledge of Result/response (KR) types of learning feedback; see NARCISS (2008). As indicated before, individual differences exist both in the intensity of using worked-out examples and in their timing. For example, in our context students undertake on average 1.35 attempts per exercise, use one hint per eight exercises, and request on average 0.37 worked-out solutions per exercise. As an approximation of the specific stage in the learning cycle at which students use the fully worked-out solution feedback mode, we constructed a SolutionOrder variable indicating the position of the solution call within the series of attempts at an exercise. The variable ranges from zero to one, with lower values indicating that the call takes place in the initial learning phase, and higher values indicating that the call is positioned at the end of the learning process, such as in a last attempt preparing for a quiz.
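To make the construction concrete, a minimal sketch of how such an order indicator could be computed from attempt logs is given below; the event representation and the function are illustrative assumptions, not the actual SOWISO implementation.

    def solution_order(events):
        """Relative position (0..1) of worked-out-solution calls within
        an exercise's attempt sequence; None if no solution was called.
        'events' is an ordered list of 'attempt'/'solution' markers
        (an assumed representation, not the SOWISO log format)."""
        n = len(events)
        positions = [i for i, e in enumerate(events) if e == "solution"]
        if not positions:
            return None
        if n == 1:
            return 0.0
        # 0 = call at the very first step, 1 = call at the last step;
        # multiple calls are averaged into one indicator per exercise.
        return sum(p / (n - 1) for p in positions) / len(positions)

    print(solution_order(["attempt", "attempt", "solution"]))  # 1.0
    print(solution_order(["solution", "attempt", "attempt"]))  # 0.0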

3 Methods

3.1 Context of the empirical study

This empirical study is based on a large-scale introductory course in mathematics and statistics, using an educational system best described as ‘blended’ or ‘hybrid’. The main component is face-to-face: problem-based learning (PBL) in small groups (14 students), coached by a content-expert tutor (SCHMIDT, VAN DER MOLEN, TE WINKEL & WIJNEN, 2009). Participation in these tutorial groups is required. The online component of the blend is optional: the use of the two e-tutorials, SOWISO for mathematics and MyStatLab for statistics (TEMPELAAR et al., 2015). This choice is based on the philosophy of student-centred education, which places the responsibility for making educational choices primarily on the student. The use of the e-tutorials, and the achievement of good scores in the practice modes of the digital environments, is encouraged by making bonus points available for good performance in the quizzes. Quizzes are taken every two weeks and consist of items drawn from the same item pools applied in the practice mode. We chose this particular constellation because it stimulates students with less prior knowledge to make intensive use of the digital platforms. The bonus is capped at 20% of what one can score in the exam; with an exam worth 100 points, for example, quiz performance can add at most 20 bonus points.

The subject of this study is the 2015/2016 cohort of first-year students who participated in some way in learning activities in the SOWISO digital tool: 1080 students. We restrict this study to learning activities in the SOWISO tool because of the richness of the trace data it generates. The student population is highly diverse: only 23.8% were educated in the Dutch high school system, while 45.7% were educated according to the German Abitur system. In the investigated course, students worked an average of 9.7 hours in SOWISO, 12% of the 80 hours available for learning both topics.

The study profits from the circumstance that students conduct a required statistical project in which they analyse a personal data set built from their own disposition data. That is, answering all surveys honestly is crucial for fulfilling the course requirements. Nonresponse is therefore limited to dropout from the course (the last survey counted 1021 responses).

3.2 Instruments and procedure

Our study combines two different data sources: trace data from the SOWISO learning environment, and self-report survey data measuring learning dispositions. Trace data are of both product and process type (AZEVEDO et al., 2013). SOWISO’s reporting options for trace data are very broad, which requires making selections from the data. First, all dynamic trace data were aggregated over time to arrive at static, full-course-period accounts of trace data. Second, from the large array of trace variables, a selection was made by focusing on the process variables most strongly connected to alternative pedagogical behaviours of students, including the alternative feedback modes preferred by students. In total, six trace variables were selected (a sketch of this aggregation step follows the list):

Mastery in the tool: the proportion of exercises successfully solved, as product indicator;

Time in the tool: total connect time;

#Attempts: the total number of attempts at individual exercises;

#Solutions: the total number of worked-out solutions called;

SolutionOrder: indicator of the phase in the learning process in which the worked-out solution is called for;

#Hints: the total number of Hints called for.
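The aggregation step described above could look roughly as follows; the event layout and field names are assumptions for illustration, not SOWISO’s actual export format.

    from collections import defaultdict

    def aggregate_traces(events):
        """Aggregate event-level logs into static per-student variables.
        Assumed event layout (illustrative, not the SOWISO schema):
        {'student': id, 'type': 'attempt'|'solution'|'hint',
         'correct': bool, 'seconds': float}."""
        stats = defaultdict(lambda: {"attempts": 0, "solved": 0,
                                     "solutions": 0, "hints": 0, "time": 0.0})
        for e in events:
            s = stats[e["student"]]
            s["time"] += e.get("seconds", 0.0)
            if e["type"] == "attempt":
                s["attempts"] += 1
                s["solved"] += int(e.get("correct", False))
            elif e["type"] == "solution":
                s["solutions"] += 1
            elif e["type"] == "hint":
                s["hints"] += 1
        # Simplification: mastery per attempt; the study defines it as
        # the proportion of exercises successfully solved.
        for s in stats.values():
            s["mastery"] = s["solved"] / s["attempts"] if s["attempts"] else 0.0
        return dict(stats)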

In this study, we make an additional selection with regard to the self-report surveys measuring student learning dispositions. More than a dozen were administered, ranging from epistemological conceptions about the role of intelligence in learning to academic buoyancy in learning itself. We focus here on a selection of six instruments measuring aspects of self-regulated learning (SRL), feedback seeking, achievement goal setting, and learning emotions, since these dispositions have been investigated in recent LA studies (see AZEVEDO et al., 2012; DUFFY & AZEVEDO, 2015, and references therein). All disposition surveys are measured on seven-point Likert scales; no transformations of variables were required.

Applications of achievement goal theory in LA studies typically employ the 2×2 framework of goals, distinguishing two goal definitions, mastery goals versus performance goals, and two goal valences, approach goals versus avoidance goals (see e.g. DUFFY & AZEVEDO, 2015, and references therein). In this study, we apply an extended version of this framework using four different goal definitions: Task, Self, Other, and Potential goal types (ELLIOT, MURAYAMA, KOBEISY & LICHTENFELD, 2015). Task, Self, and Potential goals use as the basic standard for defining competence the task itself, oneself in the past, and one’s own future potential, respectively. Other goals are normative in character, using a standard based on comparison with others.
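As a plain illustration of the resulting goal space (not an implementation from the paper), the extended framework crosses four goal definitions with two valences, yielding eight goal types:

    from itertools import product

    # Four goal definitions crossed with two valences yields eight goal
    # types; the labels follow the framework described above.
    DEFINITIONS = ["Task", "Self", "Other", "Potential"]
    VALENCES = ["approach", "avoidance"]

    goal_types = [f"{d}-{v}" for d, v in product(DEFINITIONS, VALENCES)]
    print(goal_types)
    # ['Task-approach', 'Task-avoidance', 'Self-approach', ...]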