Some blog posts bubble and brew for months.
Others burst out of nowhere – like this one.
It started earlier today with a tweet…
HEPI (the Higher Education Policy Institute) had posted a blog piece on admissions: ‘How to land a jumbo jet on a postage stamp’.
Good title, great hook.
HEPI is a think tank; a research institute with a remit to underpin policy with evidence. Some think tanks are funded by government bodies and are clearly positioned left or right of centre. HEPI claims to be the UK’s only ‘independent think tank devoted to higher education’.
The jumbo jet on the postage stamp was about admissions. Author Nick Hillman recently explored this in a Guardian piece which referred to ‘an explosion in unconditional offers, where a university wants a student so much it doesn’t mind what A-level results they achieve’. These days, HEIs set admissions criteria and places can now be offered on the basis of predicted grades rather than actual ones (despite a 2016 report by UCL and UCU suggesting only 16% of predictions were accurate).
It’s the student fees wot dun it.
Admissions has become a marketplace. There, I’ve used the language of commodification, of students as consumers, or even worse, customers. Well, I believe, I really believe, there are enough people working in HE who still see it as more, so much more, than a product to be bought and sold.
What doesn’t help is uncritical use of language, for example the HEPI piece referring to institutions and prospective students as buyers and sellers.
So I tweeted to say I felt disappointed at what appeared to be an uncritical use of language.
The phrase in my head was ‘public good’. What happened to the discourse of ‘higher education for the public good’?
Public good refers to services which benefit society without citizens necessarily having to pay for them. A university for the public good is an institution charged with developing the citizens of the future in a socially democratic society, one which upholds the principles of social justice and equality.
There was a time when going to university was free. Sounds crazy now, but I took my first degree just as student loans began. It was 1990 and I was one of the first to take advantage. It made all the difference. I’d become a single parent; relationship breakdown being a side-effect of higher education which no one talks about. The student loan meant I could finish my degree and still feed the kids. So in a way I paid for my education, but it was nothing compared to the debt students put themselves in today, and the debts my own children and their partners are paying off.
Commonly quoted examples of public good include municipal gardens, national parks and lighthouses. They exist to make our lives better, safer, more fulfilling. A university for the public good is about equipping graduates to take up public office and care about a fair and just society, one with equal rights and opportunities.
HEPI replied saying thanks for the feedback. But wouldn’t it be wrong, this close to results day, to pretend we have anything other than the system we do when people need help making choices?
I struggle to accept the reduction of higher education to a buyer and seller’s market.
There are a number of ways to look at 21st-century society. They include the fictional lenses of 1984 by George Orwell (1949) and Brave New World by Aldous Huxley (1932). There are many versions of the aphorism ‘fiction is the lie which tells a truth’, and both these novels resonate today.
In Orwell’s dystopian vision, media messages were readjusted at regular intervals to suit the power structures of the day, i.e. the construction of fake news and false truths, while wherever you were, whatever you did – Big Brother was watching you.
Huxley’s Brave New World offered hypnopaedia, sleep conditioning aimed at persuading the population to remain in soma-induced highs; soma being a drug freely provided by the government to induce semi-permanent states of bliss in a society where drugs and sex were the only sources of entertainment.
Which would you prefer?
HEPI says it’s an independent think tank, but referring to universities as sellers and students as buyers sounds more like buy-in than remaining neutral. Systems are constructed to support dominant mechanisms of power and control, in this instance capitalism and a free market economy. I don’t deny higher education is being commodified and HEIs have to adapt to survive, but language is a powerful reinforcer of ideology and people in positions of influence should take care over their choice of words.
I still believe in the power of higher education to change not only individual lives for the better but as a proactive voice calling for a fairer more equal society.
Thanks for the reply, I tweeted back. My worry is that the risk of accepting ‘the system’ is to construct the degree as an ‘off the shelf’ product for purchase, when knowledge acquisition can be complex and challenging as well as a potentially transformative life experience.
HEPI ‘liked’ my reply and the conversation stopped there, but it’s still going on inside my head.
The stamp image at the top of this blog, the Inverted Jenny, was mistakenly printed with the aeroplane upside down; an error which became worth a fortune, showing how in the midst of darkness, there may be light ahead.
What is lurking anyway?
I call it consuming without contribution and we are all great digital consumers.
Truly, here and now in 2018, we risk Amusing Ourselves to Death.
When Nicholas Carr (2008) asked Is Google Making Us Stupid?, interest in cognitive data overload was high. What happened to the CIBER research? The collaboration between Jisc and the British Library studied information-searching behaviours in young people. Findings included short attention spans and reliance on surface browsing, with clear implications for universities in the future. Ten years on, those young people are likely to be our students. Today, I can’t even find the report online.
Show me embedded critical digital literacies and I’ll show you a dozen examples of uncritical acceptance.
Tell me why the digital skills and confidence of staff who teach and support learning are absent from the ed-tech literature. We know how students learn as e-learners, but staff who teach as e-teachers? Where’s that?
…and what’s all this got to do with lurking?
It’s scene setting. Part of the wider picture which starts and ends with our digital codependency and online habits.
Return to Lurking began Friday 13th July, 2018. The 24 hour #HEdigID discussion facilitated by @SuzanKoseoglu was still going strong on Saturday, Sunday, Monday, Tuesday…
The hashtag #OEP (Open Educational Practice) seemed a good opportunity to bring in digital shyness and the politics of participation persuasion. I introduced the concepts and before long lurking emerged as a theme.
I lurk. You lurk. We all lurk.
Lurking has intention and purpose.
Lurking as Learning is a path well-trodden. On 17th April this year, following the Digital Researcher run by my colleagues Mike Ewen and Lee Fallin, I wrote a post titled Sounds of Silence which addressed some of the emerging issues.
To lurk is to loiter, with or without intent, and not post.
We simply don’t understand enough about non-participation. We don’t know what’s going on behind closed screens.
Most of the time it simply doesn’t matter. We’re not expected to comment on every news article or blog post. The facility is available but there’s no pressure to use it.
It’s lurking in online courses which bothers me. In blended and distance learning courses, students consume without contributing. You can see content has been accessed, but discussion and other collaborative activity fail to materialise.
Social constructivism is where it’s at these days. There’s Siemens’ connectivism and Cormier’s rhizomatic learning, but the majority of academic practice assumes a Vygotskian approach to how students learn, one which supports knowledge construction through collaborative activity rather than didactic transmission.
Sometimes this takes place online and this is where digital silence worries me. Maybe it shouldn’t. But if students don’t talk, how can active learning progress?
So what next?
Well, maybe we’ve got it wrong.
The assumption (to borrow from Orwell’s Animal Farm) is participation good – non participation bad.
Yet we know from discussions, like those reported in Sounds of Silence and elsewhere on Twitter, there are lots of positives to lurkish practice.
Some were highlighted during the #HEdigID discussions.
However, lurking as negative remains a common perception, as shown in the tweet below.
while a 2018 paper by Sarah Honeychurch et al., Learners on the Periphery: Lurkers as Invisible Learners, explores the lurking research literature and makes some interesting suggestions. For example, the dominant model remains the one suggested by Nielsen in 2006, namely the 90-9-1 rule.
This rule posits that approximately 90% of group members consume content, 9% participate by contributing from time to time, leaving 1% to contribute a lot on a regular basis (Nielsen, 2006).
Then there’s the Pareto Principle, known as the 80/20 rule. Applied to online participation, this translates as 20% of participants creating content which 80% consume.
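As a rough, back-of-the-envelope illustration (the cohort size of 200 is hypothetical, not taken from Nielsen or the Honeychurch paper), the two rules split a group like this:

```python
# Illustrative sketch of the 90-9-1 rule and the 80/20 rule applied
# to a hypothetical online cohort. Figures are for illustration only.

def ninety_nine_one(cohort_size):
    """Split a cohort using Nielsen's 90-9-1 participation rule."""
    lurkers = round(cohort_size * 0.90)            # consume content only
    occasional = round(cohort_size * 0.09)         # contribute from time to time
    regulars = cohort_size - lurkers - occasional  # contribute regularly
    return lurkers, occasional, regulars

def pareto(cohort_size):
    """Split a cohort using the 80/20 rule: 20% create, 80% consume."""
    creators = round(cohort_size * 0.20)
    consumers = cohort_size - creators
    return creators, consumers

print(ninety_nine_one(200))  # (180, 18, 2)
print(pareto(200))           # (40, 160)
```

On either estimate, the silent majority dwarfs the contributors; in a module of 200 students, Nielsen’s rule predicts only two regular voices.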
It seems likely that to lurk is to inhabit safe space: places of safety, silent participation without risk. If so, then constructing lurking as a wrong to be righted is inappropriate. It may cause guilt and exacerbate fear of contribution rather than encouraging it.
The majority of Lurk-Lit focuses on change. The use of language like ‘converted’ and ‘persuaded’ suggests students need transforming from no-shows to show-offs, from passive to active.
But is this correct?
If 90% don’t contribute, or 80% consume, maybe we should look at non-contribution and consumption more closely.
Learning online is fundamentally isolated and lonely, but rather than stressing digital participation as a solution, maybe we should celebrate digital singledom instead.
When Philip Larkin wrote about the ‘unique distance from isolation’ he was referring to a couple lying next to each other in bed. The context is a difficult relationship, something Larkin is so painfully good at.
If people can be so physically close, yet so far apart, maybe assumptions that distance means separation can also be challenged. Perhaps the isolated learner is more closely linked to a holistic experience of the module or programme, through the medium of digital resources, than we might think. It comes back to my introductory tweet to the #HEdigID community.
We need a better understanding of digital shyness, and to stop demonising those who choose not to express themselves, be it in the digital public sphere or on a password-protected university network. We need to look at lurking from the other side.
There is more in The Other Side of Lurking Part Two: dabbling with digital imposter syndrome, which delves further into understanding lurking as a pedagogic strategy needing to be addressed in learning design.
So lurking’s not a problem, right?
…but if it’s your virtual environment and you’re dealing with silence, it can’t be ignored. Lurking flies in the face of everything we’re told 21st century education should be, namely active. We’re well versed in communities of practice and inquiry, zones of proximal development, social, cognitive and teaching presences, and so on – and they all require interaction. Networks need people, don’t they?
Images from #HEdigID discussion on Twitter or pixabay.com
I have a new colleague whose PhD examines Imposter Syndrome in teachers. My Twitter feed has been linking me to Imposter Syndrome resources. 2018 seems to have begun on a wave of Imposter Syndrome awareness raising.
So what is it?
Imposter Syndrome is the constant feeling that wherever you are and whatever you do – you’re inadequate. Not good enough, not clever enough, you don’t deserve to be there and sooner or later someone’s going to expose you as the fundamental fake you really are.
Imposter Syndrome is a voice in your head constantly putting you down.
It’s particularly prevalent in higher education research where expectations of expertise don’t always match how you’re feeling inside.
Too easy to feel you’re a fraud and it’s only a matter of time before others find out too. Sound familiar?
Imposter Syndrome is a mentally destructive condition. If instances are increasing, what’s triggering this explosion of self-doubt and self-hatred? Why have we fallen out of love with ourselves?
The web is full of suggestions and tools for coping. The affordances of a self-help Internet are one of its benefits, but sometimes it feels there’s more bad than good, and it’s Internet-fuelled social media which is making Imposter Syndrome worse.
The social in social media has become all about the image. The social user creates an online presence which shows how they want to be seen rather than the reality. Photographs are no longer about the person. Instead, crafted images have become representations of desire, used to project something socially constructed as perfection.
It’s a simulation where the ‘like-ing’ game of hearts and arrows takes on a significance far beyond red lines and circles. The likes, like the images they’re attached to, have become what Baudrillard would have recognised as empty signs. The meaning has shifted from the appearance of the sign to what the sign has come to represent.
The idea of presenting ourselves as how we want to be seen is not new. Over 50 years ago Goffman wrote about people as performers. In The Presentation of Self in Everyday Life he likened us to actors on a stage, dressing up in whatever costumes are appropriate for the different roles we play. Althusser claimed we all have a set of identities which feel comfortable. When we find them it’s like someone hailing us in a busy street: a familiar face and voice which stands out from the crowd and is comfortable because we know it.
Social media has become the perfect psychological storm.
There are too many stories about young people bullied and suicidal over online behaviour. For those living in a heightened state of awareness, mobile devices have become carriers of extreme joy when digital popularity soars, or the depths of despair when they’re unliked, arrowed down, or subject to unpleasant status text which spreads like wildfire, until it seems the whole world – people you know and those you’ve never met – is against you.
Or the image of you.
Who are you anyway?
Which brings us back to Imposter Syndrome and the feeling you’re not good enough.
In a world of digital image and false representation, we should rename imposter syndrome as Instagram Symptom.
Social media creates loops where signs are no longer symbolic of the real. Instead, they are exchanged for other signs which are empty and self-referential. The social media image shows an untruth, a falsity. It’s a simulation which has moved from being a copy to being a replacement. When Baudrillard wrote about representation in a postmodern world, he claimed simulations are dangerous.
The danger lies here. An obvious falsity such as a famous face dressing up or acting a role still contains a truth. We know it’s pretend. The intention to deceive is apparent. A simulacrum, as Baudrillard described the postmodern world of media simulations, was more than a deception, it signified the destruction of the original which it replaced. The risk we face with digital images is when they become more real than the person arranging, adapting and adjusting them.
Baudrillard died in 2007. Facebook was new (2004) and Twitter still a baby (2006). Many of his ideas were controversial (the Gulf War, the Twin Towers etc.) but his conception of hyperreality, where fiction is indistinguishable from fact, is scarily true for the phone-talking-while-walking millions for whom social media is the first thing in the morning, the last thing at night and most of the hours in between. Hyper has become the reality of choice.
Just as education doesn’t teach critical digital literacies in the way it teaches text and numbers, we don’t teach visual digital literacy – but we should. Either Imposter Syndrome is increasing or more people are talking about it. Either way, it seems symptomatic of 21st century desires for digital perfection.
We need to remind ourselves we are real people and the real matters more than the fantasy. No matter how beguiling it might appear – it’s a lie!
If you’re suffering from Imposter Syndrome these links might help.
Sakulku, J. and Alexander, J. (2011) ‘The Impostor Phenomenon’, International Journal of Behavioral Science, Vol. 6, No. 1, pp. 73–92. http://bsris.swu.ac.th/journal/i6/6-6_Jaruwan_73-92.pdf
- Imposter Syndrome
- 7 Coping Strategies to Overcome Impostor Syndrome
- Overcome Impostor Syndrome: What to Do When You Feel Like a Fraud
- Feeling like an impostor? You can escape this confidence-sapping syndrome
Baby Tweet from http://365icon.com/icon-styles/social/blue-bird-twitter-icon/
#followfriday is a hashtag which passed me by. Like #fossilfriday. Seemed like a great idea but life’s busy, the web vast, you can procrastinate all day and still find stuff to amaze you. I know. I’ve been there.
Why #followfriday? Isn’t every day a follow day on Twitter?
Come to that, what is it about the tweeting bug which bites some and not others?
It’s an odd idea. Imagine pitching it. You want to do what? Limit posts. Well, maybe a thousand words isn’t such a bad idea. What? 140 characters! That’s like a single sentence dude. It’ll never catch on.
Twitter was unique. It lost some magic when the character count increased. Now there’s talk of stringing tweets together.
Are the Twitter-leaders losing it?
It seems Twitter is following the path of other good ideas which lacked the faith to hang on to the one quality which made them different. In this case – 140 characters or fewer.
What a brilliant challenge it was. Condensing your message, contracting your thinking, being concise and precise with words. Twitter made you re-examine your use of language. Learn the art of attention grabbing headlines. Appreciate meaningful puns. Appropriation of idioms. Clever metaphors with a twist. For logophiles and other lovers of text the world was our twitterverse and we liked it. Just the way it was.
Soon there will be nothing to distinguish Twitter from other social media platforms where users post a status, like, repost, link, share, add graphics.
The world is moving towards conformity.
Don’t do it Twitter. Stay unique.
In the meantime, James Clay started something on Twitter this week.
Amy Pearlman @AmyPearlman posted a request:
I know it’s not Friday but who are your best follows for Women IT, Higher Ed issues, Tech, Just plain cool stuff.
James replied with a list of 21. It was good to see ex-colleague Kerry Pinny there – I would have expected to see Chrissi Nerantzi and Sue Becks while thumbs up for Audrey, Bonnie and Donna – education needs their criticality. Then there’s Jane Secker copyright queen and Theresa Mackinnon, cunningly disguised as @WarwickLanguage along with Maren Deepwell from ALT and Sheila McNeill… Hey, I know nearly all these names. What great company to keep. These are the people who understand it’s not the technology, it’s what you do with it that counts.
After this, my Twitter feed went a little crazy. I haven’t counted the subsequent suggestions for Amy to follow. James should have put a hashtag on it!
The buzz is fading. Soon something else will burst into Twitter-life before it also passes by. This is the nature of social media. Transient. Temporary. Of the moment. But for a short while it was good to think the words you drop into the void of hyperspace might sometimes have an impact. So thanks James for including me. It means a lot.
In the meantime, Christmas is coming. The only time when email stops and professional use of Twitter goes quiet. It’s another year end. Those working in HE have two year ends – academic and seasonal. This is our second round of closures and new beginnings. One more blog post before January and I think I know what it’s going to be…
Have a good week.
Procrastination is something I’m good at. Very good.
I wander through the world wide web like it’s my second home – or maybe even my first, I spend so much time there!
Hyperlinks are my downfall. There’s still excitement attached to them. My brain is a sponge. It doesn’t always retain what it finds but it loves a good soaking.
Hyperlinks as we know them were the brainchild of Tim Berners-Lee. The internet already existed and the WWW was a way of linking individual pages and sites. In the early days you knew you were online. Dial-up connections beeped and whirred like some giant machine coming to life and, the internet being what it is, you can remind yourself exactly what this sounded like.
My first computer was a second-hand Tandy. I was married and living in the country. My first internet-connected computer was a Gateway 486. I was divorced and a city dweller. A degree does this to you. Just as passing your driving test gives you independence, taking your first degree opens your mind and, like Pandora’s Box, once opened it can’t be closed again.
Today the internet/www is integral to our lives and, for some, the boundaries between the real and unreal are getting confused. During the US election there was much debate around social media and fake news/false truths. US voters told the world how they relied on Facebook and Twitter as sources of truth because they followed so many people and the majority view had to be the right one, didn’t it?
There are generations who have been born into digital life and know no other, unlike my peers who have analogue feet and roots. We were there at the beginning. My Tandy computer ran DOS and the word processor used commands like <b>strong</b>. I still have a 5¼ inch floppy disk and sometimes use it in presentations where, as the years pass, fewer people know what it is.
After DOS came the Microsoft GUI and mouse. We learned to point and click, double click, drag. Now it’s touch screen and a thousand smudgy fingerprints as we tap, double tap, swipe while speech to text and text to speech alternatives continue to get more accurate every year and films like Ex Machina and Her take us to the edge of what is real and unreal – or so we think.
Should we be concerned over the line between real and unreal? Is this what we should be discussing with students? With the aptly named Second Life there were many stories of people becoming emotionally attached to online avatars and we see this today with online dating where digital identity takes on real meaning for real-world users.
Baudrillard gained notoriety for saying the Gulf War hadn’t happened. He didn’t mean it didn’t take place, but that for most of us it was a second-hand experience, mediated by a digital reality which wasn’t real. It was hyperreal.
Hyperreality, as in Guy Debord’s Society of the Spectacle (1967), is about the confusion between the real and its representation; in Debord’s case this was caused by a proliferation of images. It isn’t hard to rethink this using virtual reality, or even the animated posters on the London Underground. Like something out of Harry Potter, they move and speak to you as though they were real people.
In Simulacra and Simulation (1981) Baudrillard described confusion between real and unreal, claiming we’re mistaking digital reality for the real thing, so whoever controls digital media has increasing influence over attitudes and behaviours. We are living in a state of hyperreality; hyper from the ancient Greek meaning over or above, as in hypersonic (faster than the speed of sound) or hyperspace, a different dimension where science fiction characters can travel at hypersonic speeds. The internet/www is known as hyperspace. Online we communicate instantly regardless of time or distance. Online we’re digital space travellers and in 2017, with instant wifi for our mobile devices, we’re increasingly taking this immediate access for granted.
What matters is having the critical digital literacies to be aware this is a construction. Documentary maker Adam Curtis describes HyperNormalisation as a politically influenced state of knowing your reality is wrong but accepting it as right because there’s no alternative.
Charlie Brooker’s Black Mirror series also explores these boundaries through prescient scenarios involving digital shifts and realities. The digital isn’t real yet it must be if we’re all using digital communication to collaborate and make sense of the world.
This blog came about because I read this review from the Research Student Conference at the University of Northampton and was struck by how it reflected the writer’s own perception rather than what I was saying.
PhD student Sue Watling’s timely paper focused on staff attitudes towards technology-enhanced learning and discussed what this can mean from the instructor’s perspective and the processes to standardise training of such skills for teaching staff. (my emphasis)
I was talking about digital shifts, yet mention technology enhanced learning and it’s interpreted as ‘standardise training and skills’. I come across this a lot. With regard to the digital, there’s a mismatch between what I’m saying and what you’re hearing and interpreting. This is something which needs addressing.
So this post was going to be you say training, I say teaching, you say skills, I say capabilities, or something along those lines, but I couldn’t even get from there to here without procrastinating a whole blog post away. Like I said, it’s something I’m very good at.
This week’s #lthechat (no. 87 – what will no. 100 be?) was about CPD or, to be more precise, Professional Development Challenges in Learning and Teaching in Higher Education, led by Prof Sally Brown.
Q1 What professional development challenges do you plan to set yourself in the next academic year?
Er um – I’m not sure.
As the #lthe-chatters listed plans, I sidetracked, taking note of those involving technology out of interest… but what about the question? What were my own ‘professional challenges’? Then I remembered the PhD. Of course! So why didn’t I initially think of it as CPD? The second question held a clue.
Q2 How can you best engage with students in planning and achieving your CPD?
One chatter posted ‘Not entirely sure what you mean? CPD for me or CPD I deliver for others?’ The reply was ‘for me!’
Another posted ‘Stunning question Hadn’t thought it was something I could do … but it obviously is.’
So not only me! I wonder if there’s a wider tendency to think of CPD in terms of what we provide for others rather than what we do for ourselves?
If so, is the belief related to areas like Academic Practice or Learning Development, which are about supporting others to achieve? Could it even be a gender issue? Traditional social conditioning, as in being taught to look out for others, be the carer, the mender, the one who keeps it all together. Does cultural construction make it more likely some will interpret CPD as ‘do unto others’ rather than ‘unto yourself’?
…do we do CPD without being aware of it? Like students not recognising feedback.
The accompanying #lthechat post listed seven CPD challenges from ‘Professionalism in practice: key directions in higher education learning, teaching and assessment’. These are about ‘translating action into transformative change’. If you saw CPD as doing a mooc or reading a book, take a look at this. CPD can involve any – or all – of the following …
- Stepping out of your comfort zone
- Making an effort
- Talking more to students
- Checking out inclusive practice
- Reviewing internationality
- Becoming more scholarly
- Taking up mentoring or coaching
As if my head wasn’t already thinking enough, question 4 arrived: Which are your key communities of practice, what do you give to them and what do you gain from them? Physical/Virtual.
It woz the binary wot did it! Physical/Virtual. For some time I’ve been brooding about how my online life is isolated from my real one. The social media I use isn’t shared by most of my working colleagues (or home peeps come to that, but we’re talking CPD so family/friends is different).
My online professional network is supportive, informative and sometimes game-changing. Take the PhD. Transferring from Lincoln to Hull hadn’t gone well. I was upset at how three years of research into the attitudes and practices of academics online, and how they conceptualised teaching and learning in a digital age, had been rejected. Then a by-the-by comment on Twitter led me to the University of Northampton and Ale Armellini who is now my PhD supervisor. It couldn’t be better. Thank you internet and Chrissi Nerantzi.
We all have similar stories of digital synchronicity. Like the time I found an elusive book of poetry via Twitter in under half an hour! Regular events like #lthechat can also lead to unexpected connections and insights. Yet when I look around, it feels like those of us with virtual lives are still in the minority. The dominance of the 3Ps – pen, pencil and paper – may be greater than we realise.
Don’t get me wrong! I’m not demanding colleagues be online, or become part of my online life, but I’m aware of their absence. It’s like the ‘Did you watch….’ conversation in the kitchen. I don’t have a TV so am immediately excluded. I’m more likely to ask ‘Did you see…. on Twitter’ or ‘Have you read the latest post on ….. blog’ but I don’t, because no one has.
My tweet-answer summed it up. great support/sharing via @twitter but digitally shy colleagues excluded – feel I’m digital/analogue hybrid.
I juggle two worlds – the virtual and the real – which feels like I don’t fully fit in either. Like the Roman god Janus, I look both ways. I have dual identities, maybe triple if you include my social use of the internet. Either way I’m an analogue/digital hybrid.
Hybridity is an interesting concept. It’s been around for some time, long before the digital, more complex than a binary, and seemingly well suited to an internet age.
As so often happens, a blog post on one topic is ending on another.
More on hybridity another day.
In the meantime, back to CPD, or in this case – the CPhD.
Storify of #lthechat 14/06/17 available here: https://storify.com/LTHEchat/lthechat-no-87-professional-development-challenges
blog images from #lthechat or https://pixabay.com