Statement on #massiveteaching (part I)

(For a more extended statement, see here)

My name is Paul-Olivier Dehaye; I am a mathematics professor at the University of Zurich. On June 23rd, 2014, I started teaching a MOOC on Coursera called Teaching goes massive: New skills required, which was intended to last three weeks and to include a component on the business practices of MOOC providers. During the course delivery, my perception of the ethical issues surrounding the course changed, which led me to alter its delivery methods. This prompted a reaction from Coursera itself, and I was censored [1]. In the fallout, I was also insulted and vilified online. Coursera used intentionally misleading information, along with deeply intrusive data it held about the course and about me, to try to get my employer to launch disciplinary proceedings against me. Legal threats were made as well.

Throughout my course, I acted with the interest of the European and Swiss public in mind. The ethical concerns that led me to change the course so abruptly hinged on strong similarities between the Coursera and Facebook Terms of Use, and on the lack of transparency and accountability in the data-collection and "research" practices at those companies.

There seems to be a very different perception of privacy across the Atlantic. As higher education moves online globally, it is important that the rest of the world does not adopt by default a Silicon Valley narrative on privacy issues in the educational domain. In my view, it is of paramount importance that European universities do not contribute to an erosion of the privacy values held by their local taxpayers [2]. There are fundamentally different regulatory frameworks on the two sides of the Atlantic: while privacy is seen in Europe as a human rights issue, in the US it tends to be seen as a tradable commodity, akin to a property right. For developing countries, we are currently at a critical juncture on this issue: while it is no doubt beneficial to offer access to educational material for their next generation, it is questionable whether this requires exporting business practices that are under increasingly intense scrutiny at home.

Privacy is only one of my concerns associated with Coursera and the current shaping of the MOOC market.

By muddling the thick legitimacy of our universities with the thin legitimacy of companies backed by venture capital, we risk devaluing our core: academic freedom. This dynamic has played out in other industries, such as journalism, where it has led to a complete reversal of power structures and a struggle by journalists to remain relevant [3].

Beyond delivering educational materials worldwide, I am optimistic about the potential of MOOCs to create rich and meaningful experiences for students, for instance through citizen science and open democracy activities. There is a rapid push to apply similar combinations of educational and crowdsourcing techniques, at a high cognitive level, to the labor market [4]. The utopian view is that this can solve labor issues on a mass scale, enabling students to extract value from their degrees right away. More ambiguously, though, through turking [5] and the large-scale privatisation of higher education certification, this perniciously opens the door to a seamless integration of these two markets into a more exploitative arrangement.

Finally, we should not forget what we lose by automating any component of an educational experience. Algorithmic culture comes with its own problems (especially when those algorithms are opaque [6]), and big data is by definition discriminatory [7], and hence a threat to any attempt at fostering equality in education.

[1] My profile is missing from here, for instance.
[2] As an example, Coursera refuses to comply with the EU and/or Swiss Data Protection Authorities.
[3] As masterfully explained by Jay Rosen, journalism professor at NYU, in two articles written in the wake of the Facebook Emotion experiment. I detail some of my concerns here.
[4] See the two workshops at HCOMP 2014 for these trends in human computation.
[5] "A practice that enables individuals or businesses to coordinate the use of human intelligence to perform tasks that computers are currently unable to do."
[6] See Show-and-Tell: Algorithmic Culture and Corrupt Personalisation.
[7] See Big Data's Disparate Impact.