Program 2021

Workshop Program September 13, 2021

Please access the workshop via the Webex link or by entering the following details in your Webex application:
Meeting code: 2731 712 0943
Password: j3Nv99CEiyb

Please note that the Webex client for Linux offers only limited functionality. We therefore suggest joining the meeting from another operating system (e.g., Windows, macOS, Android).

To facilitate the workshop and to communicate any necessary ad hoc changes, we will provide all relevant details and links in an Etherpad that you can access here.

Time Program
9:00 am to 9:15 am Workshop introduction
9:15 am to 10:25 am Paper presentations I
9:15 am Investigating the Impact of Outliers on Dropout Prediction in Higher Education

Daria Novoseltseva, Kerstin Wagner, Agathe Merceron, Petra Sauer, Nadine Jessel and Florence Sedes
9:50 am Teacher vs. Algorithm: Learners’ Fairness Perception of Learning Analytics Algorithms

Linda Mai, Alina Köchling, Lynn Schmodde and Marius Wehner
10:25 am to 10:40 am Coffee break
10:40 am to 11:30 am Group work and discussion
11:30 am to 12:40 pm Paper presentations II
11:30 am Supporting Students’ Privacy: How Does Learner Control over Their Data Affect the Dataset for Learning Analytics?

Philipp Krieter, Michael Viertel and Andreas Breiter
12:05 pm Indicators of Group Learning in Collaborative Software Development Teams

Benjamin Weiher, Niels Seidel, Marc Burchart and Dirk Veiel
12:40 pm Teasers of short papers
12:45 pm to 2:00 pm Lunch break and short paper presentations in break out rooms
Using GitHub data to analyse student’s teamwork in a programming course to prevent discrimination

Maximilian Karl and Niels Pinkwart
FAIR Research Data Management for Learning Analytics

Ian Wolff, David Broneske, and Veit Köppen
2:00 pm to 3:00 pm Keynote by Ryan Baker
Algorithmic Bias in Education
3:15 pm to 3:30 pm Coffee break
3:30 pm to 4:30 pm Group work 2 and discussion
4:30 pm Closing and members meeting
Keynote on Algorithmic Bias in Education by Ryan Baker

Abstract The advanced algorithms of learning analytics and educational data mining underpin modern adaptive learning technologies, both for assessment and for supporting learning. However, insufficient research has gone into validating whether these algorithms are biased against historically underrepresented learners. In this talk, I briefly discuss the literature on algorithmic bias in education, reviewing the evidence for how algorithmic bias impacts specific groups of learners, and the gaps in that literature – both in terms of "known unknowns" and "unknown unknowns". I conclude with potential directions to move the research community towards better understanding how bias impacts educational algorithms, and how to address these problems so that learning systems better promote fairness and equity.

Ryan Baker is Associate Professor at the University of Pennsylvania, and Director of the Penn Center for Learning Analytics. His lab conducts research on engagement and robust learning within online and blended learning, seeking to find actionable indicators that can be used today but which predict future student outcomes. Baker has developed models that can automatically detect student engagement in over a dozen online learning environments, and has led the development of an observational protocol and app for field observation of student engagement that has been used by over 150 researchers in 7 countries. Predictive analytics models he helped develop have been used to benefit over a million students, over a hundred thousand people have taken MOOCs he ran, and he has coordinated longitudinal studies that spanned over a decade. He was the founding president of the International Educational Data Mining Society, is currently serving as Editor of the journal Computer-Based Learning in Context, is Associate Editor of the Journal of Educational Data Mining, was the first technical director of the Pittsburgh Science of Learning Center DataShop, and currently serves as Co-Director of the MOOC Replication Framework (MORF). Baker has co-authored published papers with over 400 colleagues.