Eavesdropping on student thinking

Professor Carl Wieman presented at Imperial College yesterday on Taking a Scientific Approach to Science and Engineering Education. One of the techniques he discussed was using clicker questions in class, and he emphasised the importance of the lecturer listening in to the student discussions to hear how the students are thinking. This allows the lecturer to pick up on misconceptions and address them with the class.

In the subsequent Q&A session, a question was raised on how student thinking can be made visible in an online course. Here’s an example I’ve come across of an online teaching method that promoted extensive student discussion and learning, while allowing staff to see what and how the students were thinking.

I followed an edX course, The Analytics Edge, in 2014:

https://www.edx.org/course/analytics-edge-mitx-15-071x-2

(Side note – I see the course will run again from 6th June 2017. I would recommend it very highly for anyone who wants a practical introduction to data analytics and/or to see an excellent example of an xMOOC in action.)

I thought this course was excellent, with very engaging real-world examples. Initially the format was quite traditional, with video lectures and online MCQ assessment, but then, half-way through the course, there was a Kaggle competition where participants worked individually but with lots of peer discussion – sharing approaches and helping each other to understand different aspects. This was loads of fun and really helped to consolidate the previous learning.

Here are links to three of these competitions from different iterations of the course:

[Image: competition front pages]

For me, as a learner, the competition ticked all the boxes – the learning was flexible, active and social; the element of competition was motivating; and the task consolidated previous more theoretical study and provided a bridge between theory and practice.

Teaching staff must also have benefited greatly from being able to see how students were thinking – what they had understood well and what they had not fully grasped during the prior, more formal teaching – and this could be fed into future iterations of the course.

There is some more discussion about this competition and how useful it was in terms of learning with peers here: https://octel.alt.ac.uk/2014/forums/topic/autonomous-and-social/#post-13293


Using Blackboard to support team-based learning – delivery and rollout

I will be giving a presentation on using Blackboard to support team-based learning at the Blackboard Teaching and Learning Conference in Liverpool (15-17th April 2015). The session is in the Central Teaching Hub, Lecture theatre C on Wednesday 15th April at 16:25–17:10.

Abstract
Team-based learning (TBL) is a teaching and learning technique that is being used increasingly in professional and higher education. The approach is structured, involving pre-study by students, quizzing to assess understanding, and peer-to-peer learning via team and class discussion. It is also scalable – a single member of teaching staff can run a highly interactive learning session for a large class – and usually forms part of the summative assessment for the course.

This session will provide a brief introduction to the TBL process and describe how delivery of TBL can be supported by the use of learning technology, in particular Blackboard Learn tools. In fact, the presentation could be subtitled “How many Blackboard tools can be used in a single learning activity?”!

Finally, I will describe how a generic course package can be prepared, including all elements of the learning design of a TBL session, for easy redeployment in other Blackboard courses. The course package not only captures the technical elements of the session, but also elements of pedagogy and best practice.

For more information on TBL see http://informahealthcare.com/doi/abs/10.3109/0142159X.2012.651179 and http://www.utexas.edu/academic/ctl/largeclasses/#tbl.

Learning outcomes
By the end of the session attendees should be able to:

  • Describe the key features of team-based learning (TBL)
  • Explain how learning technology, in particular Blackboard Learn tools, can be used to support the delivery of TBL
  • Identify which tools and technologies would be most appropriate to support TBL delivery in their own organisation
  • Set up a reusable course package in Blackboard that captures all the elements of learning design for a particular learning activity

Slides are here: Team-based learning with Blackboard presentation.

Please let me know if you have any questions.

Good behaviour

This is an old data visualisation, which is still used in our Study Skills lecture for first year students…

[Image: Behaviour]

It is based on tracking data from our VLE and shows the number of ‘hits’ over time.

Terms 1 and 2 are 10 weeks long and Term 3 is 6 weeks of teaching, followed by a revision period and then exams.

This graph provides the answer.

[Image: BehaviourA]

How much students access online learning material is not the only factor; when they access it also matters.

(Also, what they do with the material is important, but that’s not shown in this image.)
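For anyone curious how a graph like this is produced: it is just an aggregation of VLE access logs into weekly hit counts. Here is a minimal sketch – the log format and the `hits_per_week` function are my own illustration, not the actual script behind the image.

```python
from collections import Counter
from datetime import date

def hits_per_week(timestamps, term_start):
    """Count VLE hits per week relative to the start of term.

    timestamps: iterable of datetime.date objects, one per hit
    term_start: datetime.date of the first day of term
    Returns a dict mapping 1-based week number to hit count.
    """
    counts = Counter((ts - term_start).days // 7 + 1 for ts in timestamps)
    return dict(counts)

# Toy example: three hits in week 1, one in week 2
hits = [date(2013, 10, 1), date(2013, 10, 2), date(2013, 10, 3), date(2013, 10, 9)]
print(hits_per_week(hits, term_start=date(2013, 9, 30)))  # prints {1: 3, 2: 1}
```

Plotting those weekly counts over the three terms gives the shape shown in the image above.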

 

Navigating #DALMOOC – all at sea

In the discussion forums on the #DALMOOC course, George Siemens (one of the course organisers) urges learners to create artifacts, draw or create “a concept map of the course technologies in order to make sense of (wayfind through) the course.”

So, here is my word picture of trying to wayfind and sensemake in #DALMOOC.

On the blue layer I am sitting alone in a small rowing boat with no paddle, while automated-Carolyn shouts through a loud hailer – “Time to move on”.

On the red layer I am again sitting alone in a small rowing boat, but this time overwhelmed and weighed down by many fantastical Heath-Robinson-style tools of unclear purpose.

I understand the value of finding my own way through the course content, making my own connections, building my own learning space, but I fail to see the benefit of an obscure course structure that impacts on my time available for learning.

Thankfully the course team are now beginning to provide a few more signposts. (Signposts at sea? – I may be getting my metaphors crossed.)

My big question for #DALMOOC

I’m a learning technologist in the UK, where this field has now been renamed Technology Enhanced Learning. One of my favourite questions at the moment is “What is enhanced and how do we know?” (addressed by Kirkwood and Price in this paper – http://www.tandfonline.com/doi/abs/10.1080/17439884.2013.770404#.VEy_QldA2xc). So, I would like to use analytics/big data to help find answers to questions in this area.

As a specific example, my institution is rolling out lecture capture to improve “the quality of the learning and teaching provision for students”. But can the data show us that there are improvements in learning?

Usage data shows us that the recorded lectures are used, but how are they used? Do the lecture recordings encourage students to focus too much on the lecture content rather than reading more widely on the topic? Does the ability to revisit the content of a lecture prevent students building up personal support networks among peers or raising questions with the lecturer? Are the recordings of greater value to those for whom English is not their first language?

To answer these questions requires data from many sources. Is it all available? At the right level of detail? Identifiable to individual students? How can it be gathered and processed? And what are the ethical and legal issues that must be considered?
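Even the simplest version of "gathered and processed" means joining extracts from separate systems on a shared student identifier. A toy sketch of that join step – all of the field names and data here are invented for illustration, and a real join would raise exactly the ethical and identifiability questions above:

```python
import csv
from io import StringIO

# Hypothetical extracts: lecture-capture views and module results.
views_csv = "student_id,recording_views\ns1,12\ns2,0\ns3,5\n"
grades_csv = "student_id,grade\ns1,72\ns2,58\ns3,65\n"

def load(csv_text):
    """Index a CSV extract by student_id."""
    return {row["student_id"]: row for row in csv.DictReader(StringIO(csv_text))}

views, grades = load(views_csv), load(grades_csv)

# Join on student_id -- only possible if both sources share an identifier
# at the same level of detail.
joined = {
    sid: {"views": int(views[sid]["recording_views"]),
          "grade": int(grades[sid]["grade"])}
    for sid in views.keys() & grades.keys()
}
```

If the sources cannot be linked at student level, only aggregate comparisons are possible – which limits the questions that can be answered.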

These are the types of questions that I hope will be answered by the #DALMOOC course.

What question would you like to answer using learning analytics?

Week 1 of #DALMOOC disappoints

I just started on a new MOOC – LINK5.10x Data, Analytics, and Learning or #DALMOOC.

This is a subject that I am interested to know more about and I am also intrigued by the way the course has been designed – with an innovative dual-layer structure, combining aspects of xMOOC and cMOOC.

I was full of anticipation before starting, but so far I have found #DALMOOC very disappointing.

This week has been all about the plethora of new tools that will be used on the course, but why will I use these tools?

First catch your learner
One week in and I’m not hooked. I haven’t been inspired by examples of what analytics will do for me or my learners.

The content to date is minimal, which might be ok, but the structure and direction is minimal too. I can construct my own learning, but I was hoping for some foundation.

The learning tools and design seem to have been given more attention than the actual content, focusing on how the students will learn rather than what they will learn. I have never felt more like a guinea pig.

Is my role in this course simply to provide data for the organisers’ next research paper?

My thought for the day – #altc Day 1

Don’t ignore the late adopters

As a learning technologist, I love the keen academics who are always ready to try out something new.

I engage in coffee-time discussions with these early adopters about teaching and learning and how to incorporate new methods into their courses. And then, after a teaching innovation is deployed, I’m on hand with evaluations and debriefings and pats on the back.

By the time a new technique has been included in multiple courses, it’s old hat for me – boring, more of the same.

But each academic changing their course is taking a significant step. And later adopters deserve the same feedback and support that was lavished on the forerunners.

So, when I make roll-out plans for embedding new technology, I also need to include time for talking and thinking things through with those who adopt later.

Project Stream

In 2009, because of accidental over-recruitment, we had to put in place a system to allow streaming of material from the main biology lecture theatre to an overflow room. There were two phases to the project: first, developing and setting up the solution and then delivering the solution over the course of the academic year.

The stakeholders were the students who attended the lectures and the staff who delivered the lectures. We had to ensure that the lecture experience was acceptable for all students, both those in the lecture theatre and those in the overflow room. For staff we had to make sure that they were able to focus on delivering their lectures, minimising the disruption caused by the need for streaming.

The technical solution used LiveMeeting (a predecessor of Microsoft Lync) to stream voice and data from the lecture theatre to the overflow room. All video content which appeared on the PC was streamed, including PowerPoint, web pages, etc. eBeam software was used to display material written on the whiteboard, and ELMO software routed the visualiser image via the PC. Dedicated admin sign-ins were needed for the transmitting and receiving PCs. The local ICT support team was very helpful in setting up the systems and testing the functionality, and I delivered training and prepared all the documentation.

The most troublesome part of resourcing the project was the requirement for ongoing support on a daily basis. My time was already fully committed, and I had to make it clear that I was unable to provide support myself. This is hard to do, especially in the ‘matrix management’ situation which often applies for learning technologists – i.e. everyone thinks they are your boss and most important customer! 

In the end we recruited a part-time Streaming Assistant, who set up the equipment before each lecture and monitored the streaming in the overflow room, and an Undergraduate IT Assistant (a volunteer student from the class), who acted as a back-up and provided assistance to the lecturer if required.  I felt it was vital to have this level of support in situ because of the serious consequences if there were problems with the streaming; we would have had to cancel and rearrange the lecture, causing serious disruption to all the students and the lecturer, and a logistical and timetabling headache.

The project plan was clear and achievable, but it was very important to secure adequate resources to deliver a reliable service.

The evaluation of the project was straightforward: was the lecture available and acceptable for the students in the overflow room? It was a reliable solution and the students were happy with the quality of the presentation. In fact, some students preferred to watch the lecture there because there was more space (especially for students using a laptop for notetaking), better sound and video quality than at the back of the main lecture theatre, and less chance of being asked a question by the lecturer.

The results of the project were disseminated at a college Education Day. In my view, the technical details of the project were less significant than the importance of:

  • thinking about the consequences of failure and putting appropriate backup plans in place
  • making sure that staffing resources were adequate

I didn’t use any particular tools for this project, but I can see the value of tools such as a risk register or timesheets. Following a proper project planning process, with appropriate tools and techniques, would make sure that nothing was missed, and would provide documentation of the thinking behind the project plan.

YouTube Automatic Captioning

Did you know about the automatic captioning facility on YouTube?

This is described on the YouTube help pages as follows:

Even if you haven’t added captions to your video, YouTube may use speech recognition technology to automatically make captions available.

Sounds good!

However…

Since these are automatically generated, the quality of the captions may vary from video to video. As the video owner, you can always edit the captions to improve accuracy, or remove them from your video if you do not want them to be available for your viewers.

Hmm…

Here’s an example from the recent ocTEL webinar on assessment and feedback.

[Image: YouTubeAutoCaptioning]

Assessment: Poor.
Feedback: The automatic speech recognition requires more work.

Learning Behaviour

Are you a Viewer, Listener, Optimizer or Completionist?

These terms were coined by researchers at MIT and Harvard to describe the behaviour of students who participated in the first wave of edX MOOCs. Links to the research and a de-identified dataset are available in this news report.

The image below shows grades attained and chapters viewed by 53,340 participants on one of these courses, Human Health and Global Environmental Change.

[Image: Learning Behaviour scatter plot]

From Reich, J., Nesterko, S., Seaton, D. T., Mullaney, T., Waldo, J., Chuang, I., & Ho, A.D. (2014). Health in Numbers and Human Health and Global Environmental Change: 2012-2013 Harvard School of Public Health course reports (HarvardX Working Paper Series No. 2).

The categories of students identified are:

  • Completionists (at the top right of the graph) – students who viewed most of the material and achieved a high grade.
  • Optimizers (top left) – students who did enough to pass the course, but viewed only a limited amount of course material.
  • Viewers (bottom left) – students who viewed a limited amount of material and did not achieve a passing grade. These students may still have achieved all that they wanted from the course.
  • Listeners (bottom right) – students who viewed more than half of the chapters, but did not achieve a passing grade on the course. Some of these students did not engage with the formal assessment at all.
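The four categories boil down to a simple rule over two numbers per student: grade and fraction of chapters viewed. A sketch of that rule – the thresholds (a passing grade and half the chapters) are my reading of the report's descriptions, not values taken from it:

```python
def classify_learner(grade, chapters_viewed_frac, passing_grade=0.5):
    """Assign a student to one of the four behaviour categories.

    grade: normalised course grade in [0, 1]
    chapters_viewed_frac: fraction of chapters viewed, in [0, 1]
    The 0.5 thresholds are illustrative assumptions.
    """
    passed = grade >= passing_grade
    viewed_most = chapters_viewed_frac >= 0.5
    if passed and viewed_most:
        return "Completionist"   # top right: viewed most, high grade
    if passed:
        return "Optimizer"       # top left: passed with limited viewing
    if viewed_most:
        return "Listener"        # bottom right: viewed much, no pass
    return "Viewer"              # bottom left: limited viewing, no pass
```

Of course, as the quote below makes clear, real students fall everywhere in this space, so hard category boundaries are a simplification.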

As the report states:

One of the signature features of these plots is that students can be found at nearly every possible location in the possibility space. Some students focused on earning a certificate by targeting assessment questions; some students viewed all parts of the course, eschewing all assessment; some students dabbled in various dimensions; and some students successfully completed all parts of the course.

On future courses, students will be asked about their motivation for taking the course.