Last week I signed up to the E-Learning and Digital Cultures
course on Coursera. Run by tutors on the University of Edinburgh’s MSc in
Digital Education, the course is an example of a MOOC (massive open online
course) – a relatively new development in distance education.
As an Information Officer working in the e-learning field, I
was keen to participate in the course, not only to learn more about the wider
context of the field in which I work but also to experience e-learning from the
‘other side’, which will hopefully help me in assisting the students and staff
at my institution with their questions.
The five-week course has been designed to look at two
themes, followed by a final assessment. The first theme is Utopias &
Dystopias, exploring how and why e-learning and digital culture have been
portrayed as both utopian and dystopian in popular culture and academic
literature.
Week 1: Looking to the Past
The first week explored past utopian and dystopian
writing on e-learning. We also had to watch a few short films about technology
and its impact on humanity. During this time, we were encouraged to discuss the
ideas on Twitter, Facebook, Google+ and the course forums, but the course has
so many participants that the flood of information rather daunted me – I ended
up attending a study group instead, meeting a couple of other London-based
participants on Sunday to discuss the ideas raised.
I found the readings and discussion interesting, as they made
me look at technology and e-learning in a whole new light. Some of the videos
related more closely to technology in general rather than e-learning in
particular, but I still found them useful from a theoretical point of view.
The core reading [1] was a useful introduction to
the topic of technological determinism, the idea that technology is the
principal driver of social change. Further articles explored the concept
of e-learning and the role of technology in learning in general, often with
very contrasting viewpoints. I couldn’t help feeling that the opposing
viewpoints of utopian/dystopian scenarios were rather simplistic – surely the
actuality is more complex?
My initial thoughts, as a history graduate, flew to the
invention of the printing press, which helped to facilitate religious change
within Europe, and the Industrial Revolution, which changed the fabric of the UK
and Europe in later centuries. Of course, it is simplistic to state that
technology was the only thing that led to these changes, but it did play a large
part.
Insofar as I ever thought about it, I always subscribed to
the idea that technology itself was neutral – that it is what you do with it
that counts. On a simplistic level, for example, librarians seek to use
technology to make information more widely available, while totalitarian states
can use it to surveil and spy on their citizens. One of the ideas
explored in the reading and media was that this is not true – that technology
can be built for an inherent purpose or can have repercussions beyond what was
originally intended – the ‘Frankenstein syndrome’ defined by Neil Postman
(1983). In our group discussion on Sunday, we talked about this concept in
relation to the guillotine and the atom bomb, among other things. To me, the growth
in popularity of tablet computers, produced without a built-in keyboard,
suggests the development of a passive mode of consumption different to the
communicative nature of the computer keyboard, which lets you record your
thoughts and ideas (someone less obsessed with the written word and more highly
disposed to telephone or video communication may see this entirely
differently!).
Linked to the ‘Frankenstein syndrome’ was the concept of
‘resistentialism’, a tongue-in-cheek theory invented by Paul Jennings in
1948, which holds that the more crucial a piece of technology is
at any given moment, the more likely it is to refuse to work. I am
sure I am not the only one to feel that this has the ring of truth about it! On
a more serious note, it is an example of the easy way in which we can invest
inanimate technological equipment with human attributes, such as stubbornness,
awkwardness and petulance (and that’s just the printer…).
I was intrigued by two pieces, one offering a utopian and
the other a largely dystopian view of online education. J. Daniel’s speech for
UNESCO [2] argues that e-learning can help to solve the problems of
quality and cost that currently plague universities, while D. Noble’s article [3]
claims that online education will lead to a decline in teaching quality and the
commercialisation of education. Both made valid points, but I
still feel that both positions are too simplistic. The digital divide means that
many people still lack access to the Internet and online learning, which
undercuts the utopian promise; at the same time, most if not all universities
nowadays have at least some course content online, without the wholesale decline
that the dystopian view predicts.
Prensky’s famous paper [4] espoused the ‘digital
natives’ versus ‘digital immigrants’ theory, arguing that older people who did
not grow up with technology – the ‘digital immigrants’ – need to learn to adapt
to new ways of working in order to keep pace with the younger ‘digital
natives’. This theory is very simplistic, and ignores other factors relating to
how and when people adopt technology, such as the digital divide, poverty and
other reasons why a younger person might not feel comfortable with technology
(not to mention older ones who might be completely at home with it). In my own
experience, working on a helpdesk dealing with queries about a VLE (Virtual
Learning Environment), many of the students asking questions – several of whom
are in their early twenties – don’t show a high level of technological
understanding.
After initially wondering what I had let myself in for, I
managed to navigate the first week of the course successfully. For the second
week I will try to engage more with some of the other course participants on
the forums – I might even dip a toe into Google+…
[1] Chandler, D. (2002). Technological Determinism. Web essay, Media and Communications Studies, University of Aberystwyth. https://spark-public.s3.amazonaws.com/edc/readings/chandler2002_PDF_full.pdf. [Accessed 1 February 2013]
[2] Daniel, J. (2002). Technology is the Answer: What was the Question? Speech from Higher Education in the Middle East and North Africa, Paris, Institut du Monde Arabe, 27-29 May 2002. http://portal.unesco.org/education/en/ev.php-URL_ID=5909&URL_DO=DO_TOPIC&URL_SECTION=201.html. [Accessed 1 February 2013]
[3] Noble, D. (1998). Digital Diploma Mills: The Automation of Higher Education. First Monday, 3/1. http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/viewArticle/569/490. [Accessed 1 February 2013]
[4] Prensky, M. (2001). Digital Natives, Digital Immigrants. On the Horizon, 9/5. http://www.marcprensky.com/writing/prensky%20-%20digital%20natives,%20digital%20immigrants%20-%20part1.pdf. [Accessed 1 February 2013]