does learning design + TEL = the future?

typeset image from pixabay.com

Language matters. Whether it’s ‘training’ instead of ‘teaching’, or ‘lecture capture’ instead of ‘recording resources’, the words we use and the ways we interpret them are full of unconscious bias. When designing learning, one of the first steps is to bust the jargon. Ask the questions. What are we saying here and what does it mean?

This week I attended a workshop on Marking and Feedback with Prof Lin Norton. Lin spoke about final vocabulary, a term used by the philosopher Richard Rorty for words that carry deeply held beliefs and assumptions without the necessary explanations. For example, feedback comments like ‘good’, ‘excellent’ or ‘exactly what I’m looking for’. The marker knows what they mean but it isn’t clear to the recipient. Lin says final vocabulary leaves students no room to manoeuvre. Markers need to make comments which open up conversations rather than close them down. Like active listening, or going back to Socratic questioning. Those ancient Greeks really knew their stuff.

question mark from pixabay

The tendency to make uncritical use of language is common. We’re often more subjective than we realise. I think I’m a critical reflector but there’s always something new to learn. I don’t have a data-driven approach to practice. A bit dyscalculic as well as suspicious of quantitative data sets. No matter how the figures are presented, I want to know the stories behind them. But – I’m also an action researcher and promoter of experiential learning. I like critical reflection loops which take you on a journey of change.

data image from pixabay

Recently I’ve come to realise I do have a data-driven approach; it’s my interpretation of what data represents which is skewed. Phrases like Big Data or Learning Analytics made me think of randomised controlled trials, NSS scores and VLE dashboards. I knew data didn’t have to be numbers – I’m doing qualitative research, for heaven’s sake (Doh!) – but my subjective interpretation was linking the two together. It’s only by developing a learning design approach to TEL with an expert data-king colleague that I’ve uncovered a bias I wasn’t consciously aware of.

scrabble tiles from pixabay.com

How often do we act without questioning that we do? Last week I blogged about the impact of research on TEL and the literature TEL people use to inform their practice. I’m still searching for answers. Let’s broaden it out. Where’s the evidence base for learning and teaching? Is there a contemporary equivalent to Chickering and Gamson’s Seven Principles for Good Practice in Undergraduate Education (1987)?

  • encourages contact between students and faculty,
  • develops reciprocity and cooperation among students,
  • encourages active learning,
  • gives prompt feedback,
  • emphasizes time on task,
  • communicates high expectations, and
  • respects diverse talents and ways of learning.

The authors claim these support ‘six powerful forces in education’:

  • activity,
  • expectations,
  • cooperation,
  • interaction,
  • diversity,
  • responsibility.

Spot the gaps. It would make a useful online activity. I’d add the need for critical thinking, reflection and creativity, as well as having an evidence base. Let’s put scholarship in there. Being research informed and engaged. This week my colleague and I have scoured the UK literature around L&T in HE (e.g. Knight, Biggs, Prosser, Trigwell, Trowler, Race, Boud, Nicol, Moon, Brookfield etc.) but can’t find anything so succinct or contemporary.

Maybe the subject is too complex to be reduced to bullet points. Maybe it reflects its late arrival. In many ways pedagogic research in HE is still the new kid on the block. It’s not a happy partner to the REF, and HE staff having an ‘appropriate teaching qualification’ is a relatively recent requirement. The HESA return for data on academic teaching qualifications was only introduced in 2012/13, with many institutions still returning a percentage of ‘not known’.

Opening slide from Lin Norton’s assessment workshop

Events like Lin Norton’s are welcome opportunities to ask questions and discuss answers, as in the slide image above. I think they’re useful for TEL people. Marking and feedback are foundation elements of the student experience. Sometimes it can help to separate them out from the technology – which in itself risks becoming a distraction – in order to examine more closely the fundamental principles of assessment practice. Not all TEL people come from a teaching background so it helps to make TEL about learning as well as technology. The problem is the language. Again, language matters. Too often when you say you work with TEL or in a TEL Team you’re instantly categorised into a techie box. This is one of the reasons I believe TEL needs to be reversed. Less of the T and more L please Bob.

There’s a phrase associated with the early days. RTFM stood for read the ******** manual. All computers came packed with a doorstop of an instruction book. RTFM soon came to mean don’t ask me how the bloody thing works – go and look it up yourself.

Today the technology has (allegedly) changed to a more intuitive click-and-play approach – as well as being introduced almost from birth – and the internet has replaced the manual. Today we know how it works. We need to be asking where it’s being used and why. What do we know about how people learn? What is the equivalent of Chickering and Gamson’s principles for 21st century TEL? If we’re promoting digital feedback then let’s look at Lin Norton’s research, have a TEL Team discussion around the HEA’s Marked Improvement, or visit outputs from the Oxford Brookes ASKe project or REAP.

I believe the design of learning is an essential part of TEL and we should adopt a scholarly approach to our practice by being more research informed and engaged. In which case maybe RTFM is not redundant but needs updating to RTFL. Read the ******** literature.

Now the HEA Subject Centres have closed and the HEFCE-funded CETLs have come to an end, who is promoting research into learning and teaching practice? Students are paying huge amounts of money for their time at universities where traditional teaching methods are still evident and VLEs resemble repositories. Let’s take a fresh look at what we TEL people do, because it looks a lot like learning design + TEL = the future.

TEL-ling tales – where is the evidence of impact?

open laptop with the word learning on the screen

Research is complex. It can be a messy business, but it matters. Higher education revolves around research and student degrees, yet when it comes to the REF, pedagogical research in HE has a poor showing. A recent HEA funded investigation found critiques of submission quality* while back in 2002, Jenkins described it as having Cinderella status. A paper by the HEA researchers (Cotton, Miller and Kneale, 2017) suggests pedagogical research in HE remains the Cinderella of academia.

If pedagogical research in HE is struggling for recognition where does this leave the field of education technology or Technology Enhanced Learning (TEL)? The critiques are plentiful** so where is the evidence of impact?


I have great respect for the expertise of TEL colleagues so, wearing my curiosity hat, I headed off to a closed learning technology mailing list. Citing Surowiecki’s ‘wisdom of crowds’, I invited members to point me to evidence of enhancement via technology.

I don’t know what I expected. Maybe references to the OLDS MOOC project, an NMC Horizon Report, the OU Innovating Pedagogy series or anything from the Jisc elearning projects. Maybe the application of a model like Laurillard’s conversational framework or her work on teaching as a design science, how Salmon’s Five Stage Model of e-moderating was used, or Garrison and Anderson’s Community of Inquiry. At home I have an old fashioned plastic box full of printed papers from my TEL research literature reviews, some by well known names and others less so, but all with a variety of methodologies and results. Admittedly, much is aspirational – revealing potential for scaling up rather than the results of broader adoption – but they’re evidence of intervention. They represent hope. My plan was to scope the most popular ‘go to’ pieces and collate them for sharing.

red question mark on a keyboard

The response was not quite what I expected. Maybe I asked the wrong question. Maybe my view is different and maybe this is a Hull issue – in the nicest possible way! As Philip Larkin said, we’re on the edge of things rather than the centre, and being on the edge can give you a different perspective. Whatever the reason, there were lots of ensuing discussions, some tweets and a couple of blogs – all showing a variety of reactions – Show me the Evidence by James Clay and In Defence of Technology by Kerry Pinny – but no links. There was also an #LTHEchat invitation to host ‘Establishing an evidence base for TEL’, which will take place on Twitter, 3rd May, 8.00-9.00 (diary date!). If the questions were wrong, at least they generated some positive consequences.

#LTHEchat tweet

I think Kerry was closest to my position when she described asking questions as scholarly practice. If we’re not research-informed and engaged, how do we know if we’re having impact? Familiarity with the literature and taking time for critical reflection is about thinking academically, and we work within academic environments where TEL is promoted as an enabler and enhancer of student learning. Pedagogical research might not be scoring 10 out of 10 with the REF but it’s our daily bread, and that’s no reason to ignore what’s out there or to avoid a scholarly approach to evidencing our own practice – in particular with TEL matters. Institutions are investing huge amounts of money in digital platforms supporting learning and teaching, but less in supporting staff to develop the digital capabilities and confidence to use them.


It’s now twenty years since the Dearing Report into the future of higher education, which preceded the arrival of the VLE. Since those early days we’ve shifted from a read-only environment to user-generated content, file sharing, mobile devices, social media, apps, virtual reality and more, yet there’s still disparity of adoption and a widening divide between the innovators and those yet to climb aboard the TEL train.

What came out of the discussions (and what I see every working day) was how high resistance to TEL remains. It’s also clear that what’s missing includes time, space, reward and recognition for staff engagement. We’re grappling with this at Hull, and making our case to SMTs requires evidence of impact on student learning and staff well-being. To find the evidence we need the research.

So where is it?

What do other TEL people use as their rationale for TEL matters?

magnifying glass


footnotes

* critiques of pedagogical research in higher education include small sample sizes, localised research not capable of wider dissemination and limited contribution to theory. This is similar to the examples of critiques of TEL shown below.

** examples of TEL critique

 ‘Our analysis of articles published in two leading journals [these were the Australasian Journal of Educational Technology and ALT-J (since renamed Research in Learning Technology)] found…poorly conceived or poorly applied methodologies, limited reference to theory, weak results, incomplete descriptions, uneven presentation of data and overblown and unsupported claims of impact and importance.’ (Gunn and Steel, 2012:11)

‘….where the potential of technology to transform teaching and learning practices does not appear to have achieved substantial uptake, this is because ‘the majority of studies focused on reproducing or reinforcing existing practices.’ (Kirkwood and Price 2012: 24)

‘The majority of papers published in BJET and the other educational technology journals are in the form of small-scale, unconnected trials and applications which can have little influence on policy making.’ (Latchem, 2014: 2)


images from pixabay except tweet from #lthechat



The problem is not ignorance, it’s preconceived ideas


Data is never neutral. This is my social science background talking. It’s made me suspicious! Or should that be critical? Not everyone agrees, but I’ve always distrusted the ability of stats to tell the full story.

This week it was announced that Hans Rosling has died. A sign of the internet age is the videos we leave behind. This link to a TED Talk (2006), The best stats you’ve ever seen, begins with his trademark introduction: ‘I’m a statistician – No – don’t switch off!’

Rosling set out to show the changing world through the visualisation of data. The concept was simple. Most good ideas are. Publicly funded statistics exist but are not presented in ways which are educational and accessible. Rosling founded the Gapminder organisation to create software linking data with presentation tools, thereby making it visible and searchable or, in his own words, liberated. Helped more than a little by a narration owing more to sports commentary than traditional academia, graphs have never been so entertaining or eye-opening. Mission accomplished.

Hans Rosling presenting on a stepladder

Over the years Rosling moved from the overhead projector, with his trademark stepladder for reaching the high parts, to more sophisticated forms of digital touch screen presentation. The technology was whizzy but somehow wasn’t the same.


I saw Rosling present a couple of times, mostly in the international health and social care arena, where he spoke about the world and what really matters: fertility rates, child mortality, family planning, distribution of income and the power of social change. There were always a number of key messages. Data is better than you think; there may be an uncertainty margin but the differences revealed are larger than any weakness. Data can be structured, e.g. revealing the importance of context and highlighting diversity, sometimes within single countries. Most relevant to educationalists, Rosling maintained problems are not caused by ignorance but by preconceived ideas.

USB plugs

Data is big business and higher education has not escaped the lure of using stats to review and refine the student experience. Within institutions the VLE dashboard and NSS (National Student Survey) have been used for some time to wave red flags. Now the TEF is bringing data analytics to the forefront. The relationship between NSS scores, figures from HESA (Higher Education Statistics Agency) and DLHE (Destination of Leavers from Higher Education), and teaching excellence is still open for debate, but there’s no denying how ‘Learning Analytics’ is now positioned centre-stage.

All my initial reservations about statistical data have come back. It’s one thing to collect and group figures into charts and tables but useful interpretation depends on wider issues such as identifying what you want to know and why you want to know it. Counting the times a student logs onto a VLE or walks into the library tells us little about the nature of their activity or quality of engagement.
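The gap between counting and understanding can be shown with a toy sketch. The log records, student names and field names below are entirely invented for illustration – not drawn from any real VLE – but they show how two students with identical login counts can represent very different kinds of engagement:

```python
from collections import Counter

# Invented VLE log records: (student, minutes_active, activity_type)
logs = [
    ("alice", 2, "download"), ("alice", 2, "download"),
    ("alice", 1, "download"), ("alice", 2, "download"),
    ("bob", 45, "forum_post"), ("bob", 30, "quiz"),
    ("bob", 60, "assignment"), ("bob", 20, "forum_post"),
]

# A raw login count treats both students as equally 'engaged'...
logins = Counter(student for student, _, _ in logs)
print(logins)  # both students logged in 4 times

# ...but total active minutes and the mix of activities tell a different story.
minutes = Counter()
activities = {}
for student, mins, activity in logs:
    minutes[student] += mins
    activities.setdefault(student, set()).add(activity)

print(minutes)     # alice: 7 minutes, bob: 155 minutes
print(activities)  # alice only downloads; bob posts, quizzes and submits
```

Any real analysis would need richer data still – what was downloaded, when and to what end – which is exactly the point about wanting the stories behind the figures.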

digital number display

The biggest concern is the rhetoric. The From Bricks to Clicks report tells us data has “enormous potential to improve the student experience at university”, while the Jisc report Learning Analytics in Higher Education offers analytics as a tool with many functions. These include quality assurance and quality improvement, boosting retention rates, and assessing and acting upon differential student outcomes – to mention a few.

We’ve been here before in the early days of education technology which promised much with regard to enhancement but with little evidence of improvement. Deterministic approaches see technology as the agent of change rather than focusing on the cultural context in which it’s positioned. Today it seems there’s an increasing risk of data being seen through a similar determinist lens.

magnifying glass

Education developers and researchers want teaching interventions which produce the most effective learning environments. As it stands, I’m not convinced the collection, measurement and interpretation of all this data for the TEF will produce any meaningful information about what we really want to know. The Learning Analytics movement needs someone like Hans Rosling to challenge preconceived ideas and find ways to interpret data which are innovative, useful and accessible.

It would also be worth asking if the data we have is from the most appropriate sources in the first place.