Wednesday 17 June 2015

Open Badges: Trying to navigate through the melting pot of ideas

A short literature review on open badges in Higher Education generates a large number of conflicting emotions and views around their potential. I’d sum these up as follows:
  1. uncertainty
  2. excitement
  3. potential
  4. irrelevance
  5. anxieties
  6. disruptive
  7. game changer
  8. questionable
  9. learner centred
This short discussion explores what underpins these emotions and opinions, and how an HE institution might progress with open badges in the short term.

Open badges are relatively new. “There is currently little in the formal literature relating to their use or impact” (Glover and Latif, 2013: 1402). Their importance as a means of accrediting learning was highlighted in the Innovating Pedagogy Report 2013, and it is clear that open badges play a role in enabling many of the ideas discussed within the Innovating Pedagogy Report 2014.

A digital badge is a validated indicator of accomplishment, skill, quality or interest (http://www.hastac.org). From the learner’s perspective, this means their learning environment is no longer tied to a structured classroom or online space; it widens to encompass both formal and informal learning opportunities. Badges enable learners to navigate “multiple pathways to gain competencies and refine skills through open, remixable and transparent tools, resources and process” (Mozilla Foundation, 2012: 4). Within this context, open badges are the means of demonstrating the outcomes of this evolving learning journey.
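To make this a little more concrete, the sketch below is a minimal illustration in Python, loosely following the Open Badges 1.x hosted-assertion format; all names, URLs and identities are hypothetical. It shows roughly the metadata an issued badge carries: who earned it, what it represents, the evidence behind it, and a URL through which it can be verified.

```python
# Illustrative only: a hypothetical badge class and assertion, loosely
# following the Open Badges 1.x hosted-verification format.
# Every URL, name and identity below is made up for the example.

import hashlib
import json

badge_class = {
    "name": "Reflective Conference Attendee",   # hypothetical badge
    "description": "Awarded for publishing a structured reflection "
                   "on a conference or workshop.",
    "image": "https://badges.example.ac.uk/img/reflect.png",
    "criteria": "https://badges.example.ac.uk/criteria/reflect",
    "issuer": "https://badges.example.ac.uk/issuer.json",
}

# The recipient email is salted and hashed so the assertion can be public
# without exposing the learner's address.
salt = "deadsea"
identity = "sha256$" + hashlib.sha256(
    ("learner@example.ac.uk" + salt).encode()).hexdigest()

assertion = {
    "uid": "2015-0042",
    "recipient": {"type": "email", "hashed": True, "salt": salt,
                  "identity": identity},
    "badge": "https://badges.example.ac.uk/classes/reflect.json",
    "verify": {"type": "hosted",
               "url": "https://badges.example.ac.uk/assertions/2015-0042.json"},
    "issuedOn": "2015-06-17",
    "evidence": "https://blog.example.ac.uk/reflection-post",
}

print(json.dumps(assertion, indent=2))
```

It is this verify block, pointing back at the issuer, that underpins the trust and rigour questions discussed later in this post.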

Given open badges are a recent innovation, this discussion is grounded in the knowledge that they will undergo significant changes. Carey (2012) states, “many of the first badge systems will fail … they won’t be designed well enough or properly connected to communities of interest”.

The question is, what are the current discussions around open badges?

Glover and Latif’s (2013) study indicates that the use of badges in formal education is a conceptual struggle; however, students and staff are engaged in creative discussions. For instance, staff wish to explore badges in the context of writing references, while students wish to differentiate badges, with some only available to high achievers. However, both groups felt employer engagement was important.

Gibson et al.’s (2015) review of badges in HE sees their uses as incentivising learners to engage in positive learning behaviours, identifying learner progress, and providing credit for engagement, learning and achievement. They explore the positive potential for a learner’s motivation through status recognition and evidence of achievement.

Carey (2012) explores the impact on traditional assessment and recognition methods in HE. His discussion highlights the disruptive nature of badges and their potential as a game changer. Carey (2012) notes that open badges make the standard college transcript look like a sad and archaic thing, where the information within the transcript is limited in both quantity and usefulness to employers. On one level this challenges the current locus of control within the assessment process. For instance, the wider use and acceptance of badges would encourage the learner to develop their own assessment framework.

The previous point hinges on the acceptance of open badges across society. Therefore, the role of employers becomes critical. Open badges enable a learner to present a rich picture of themselves; however, adoption depends on them being credible to employers and professional bodies. Therefore, they will succeed or fail on how desirable people find the issued badges (Glover, 2013).

Davies et al. (2015) identify a significant opportunity to use open badges to certify practitioners, compared with the current degree-and-examination route. They outline a top-down design framework and conclude that “a carefully prepared and efficiently implemented certification system based on open badges could potentially provide a transparent, flexible, efficient, rigorous and credible way of certifying evaluators” (Davies et al., 2015: 161).

Some of the emerging concerns within the literature are around the rigour of the badge and the badging system. Currently, trust in the rigour of the assessment criteria rests with whoever authorised the badge. Therefore, questions must be asked about what mechanisms are in place and how quality is assured. Goligoski’s (2012) review summarises public concerns about open badges, and provides a longer-term perspective on badges becoming just a commodity which, as digital assets, may exclude some individuals, resulting in an uneven implementation. If this occurs, open badges can be expected to remain on the periphery.
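On the mechanics side, one partial answer is that hosted open badges are at least technically checkable: anyone can fetch the assertion from the issuer and confirm it is still being vouched for. The sketch below is a rough, hypothetical illustration in Python of that idea; real backpack and verification services do considerably more (signed assertions, revocation checks, issuer validation), and none of this addresses the harder question of whether the assessment criteria themselves are rigorous.

```python
# A rough sketch of hosted-badge verification: fetch the assertion from the
# URL embedded in the badge and check the issuer is still serving a matching
# copy. Real verification services do much more than this.

import json
from urllib.request import urlopen

def verify_hosted_assertion(assertion: dict) -> bool:
    """Return True if the issuer's hosted copy matches the assertion presented."""
    verify = assertion.get("verify", {})
    if verify.get("type") != "hosted":
        raise ValueError("This sketch only handles hosted verification")
    with urlopen(verify["url"]) as response:      # fetch the issuer's copy
        hosted = json.loads(response.read().decode())
    # The hosted copy is authoritative; a mismatch (or a missing page)
    # means the badge should not be trusted.
    return hosted == assertion
```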

Within the above context, the next question is, how might an institution progress with its open badge initiatives?

The starting point is that institutions need to pilot the use of open badges within their formal and informal learning provision.

The less contentious route is to focus on the informal learning mechanisms within an institution as a means of piloting a number of initiatives. The pilot needs to help the institution articulate the questions to ask within its context. For instance, what do students and staff currently perceive as the potential of open badges? What value do local employers place on open badges? How should we manage the open badge infrastructure? It also needs to address the trickier questions at the heart of the debate: what value does a badge have? Should we be concerned about comparing learning effort across badges?

At UCS the first stage has been to develop the infrastructure and pilot some uses around informal learning opportunities, including reflections on conferences and workshops, and the completion of study skills workshops. This has developed a broader understanding of the answers to some of the earlier questions, while building up confidence and capacity in a robust and resilient infrastructure. On reflection, this is the prevalent approach within the literature: a toe in the water to help contextualise what open badges mean to us.

However, for successful technological adoption, we’ll need to ensure three key elements are aligned if these small-scale, unconnected pilots are to morph into an institution-wide impact.
  1. Leadership from the top. At the moment there is no senior / academic lead endorsing and driving the change programme and pilot. However, such a lead is required to articulate how the technology supports the institutional strategic vision.
  2. Institutional commitment and investment. The provision of ring-fenced resources to deliver the change. A successful pilot project will require academic staff input; therefore, involvement must be recognised as a valid activity for staff and reflected within workload models.
  3. Engagement with local employers to discuss their opinion of badges and the potential role they could play.
I would suggest that an institution wishing to progress with open badges does not need to go through the early-adopter, proof-of-concept stage, as there are enough lessons learnt within the public domain. It should progress straight to a large-scale pilot with top-level institutional support.

Further Reading

  • Carey, K. (2012) A Future Full of Badges. Available at: http://chronicle.com/article/A-Future-Full-of-Badges/131455/ (Accessed: 17 June 2015)
  • Davies, R., Randall, D. and West, R. E. (2015) ‘Using Open Badges to Certify Practicing Evaluators’, American Journal of Evaluation, 36(2), pp. 151–163. doi: 10.1177/1098214014565505
  • Gibson, D., Ostashewski, N., Flintoff, K., Grant, S. and Knight, E. (2015) ‘Digital badges in education’, Education and Information Technologies, 20(2), pp. 403–410. doi: 10.1007/s10639-013-9291-7
  • Glover, I. (2013) Open badges: a visual method of recognising achievement and increasing learner motivation. Available at: http://shura.shu.ac.uk/7612 (Accessed: 17 June 2015)
  • Glover, I. and Latif, F. (2013) Investigating perceptions and potential of open badges in formal higher education. Available at: http://shura.shu.ac.uk/7173 (Accessed: 17 June 2015)
  • Goligoski, E. (2012) ‘Motivating the Learner: Mozilla’s Open Badges Program’, Access to Knowledge, 4(1), pp. 1–8.
  • Humanities, Arts, Science and Technology Alliance and Collaboratory (ed.) (no date) Digital Badges. Available at: http://www.hastac.org/collections/digital-badges (Accessed: 17 June 2015)
  • Jovanovic, J. and Devedzic, V. (2014) ‘Open Badges: Novel Means to Motivate, Scaffold and Recognize Learning’, Technology, Knowledge and Learning, 20(1), pp. 115–122. doi: 10.1007/s10758-014-9232-6
  • Open University (ed.) (no date) Innovating Pedagogy 2014 | Open University Innovations Report #3. Available at: http://www.open.ac.uk/blogs/innovating/ (Accessed: 17 June 2015)
  • The Mozilla Foundation (ed.) (2012) Open Badges for Lifelong Learning. Available at: https://wiki.mozilla.org/images/5/59/OpenBadges-Working-Paper_012312.pdf (Accessed: 17 June 2015)

Wednesday 10 June 2015

Learning Services Online Open Courses: Review 2014-15

The Learning Services Team have been piloting a number of online open courses during 2014-15. The aims of the pilot included exploring how we could design, develop and maintain online courses (equivalent to one-hour workshops), and exploring the use of open badges for motivation and to feed into discussions around gamification.

Our design and development process is based around short development sprints (each completed within a week). We also ensure the teams are composed of a wide range of roles from across Learning Services. Each design team is allocated an e-Learning Developer and a Critical Friend (another member of Learning Services). Their responsibilities are to provide advice around learning design within the context of the available tools, and to be a sounding board for ideas. The Critical Friend needs to be constructively critical. The process is:

  • Stage 1: The Design Team create the detailed plan (based on the workshop template), and this is discussed and agreed (signed off) with the Critical Friend. This looks at the learning design and activities. 
  • Stage 2: Timetable a short time frame to create the multimedia (talking heads, screencasts, etc.) with the e-Learning Developer
  • Stage 3: Completion of template docs for interactive quizzes
  • Stage 4: Review by Course Team and Critical Friend
  • Stage 5: Migrate and build in CourseSites
  • Stage 6: Internal checking by wider Learning Services Team
  • Stage 7: Release and promote

A recommendation for the 2015-16 academic year is to create a new role (Student Reviewer). This person will have an input at Stage 1, Stage 4 and Stage 6. A second recommendation would be to use critical friends from outside Learning Services, including lecturers or academic developers.

During the year, we released three courses (one of which was not in CourseSites). The learning design for these courses was based around a linear, self-paced, individual-learner model. Once they had been developed and released, there was no ongoing work requirement for the design team.

On successful completion of the course the learner was awarded an open badge and certificate. The criteria were that they needed to have reviewed certain material (read or watched it) and completed any required tests.

We have also designed (but not yet implemented) two other courses with different learning designs. These allowed us to challenge (and understand) the issues around maintaining online courses that have a real-time communication dimension or are designed around collaborative group learning.

Given the primary aim of the 2014-15 period was for Learning Services to learn from the design and development process, we did not actively promote the courses; promotion was limited to an announcement on the intranet and an area on our Learning Services web page.

The following usage statistics cover the period 12th Dec 2014 to 6th June 2015, and the two courses in CourseSites (Essay Structure and Information Sources).

Enrollments
  • Total 61 people, of which
    • Learning Services Staff: 4
    • UCS Staff: 8
    • UCS Students: 33
    • Unknown (no recognizable name, username or email): 16
Achievements (Open Badge)
  • Total 14 claimed, of which
    • Essay Structure: 10
    • Information Sources: 4
These badges have been achieved throughout the period:
  • Dec 14: 3 (of which Essay (1), Info Sources (2))
  • Jan 15: 3 (of which Essay (2), Info Sources (1))
  • Feb 15: 5 (of which Essay (4), Info Sources (1))
  • Mar 15: 0 
  • Apr 15: 1 (of which Essay (1))
  • May 15: 1 (of which Essay (1))
Course Access: Activity (clicks) inside the Content Areas

It is clear from the data below that the courses have been accessed, and the peak period was around the end of Semester 1 (a short sketch recomputing these figures follows the list). This activity data excludes members of Learning Services.

  • Dec 2014 (12th onwards): 160
  • Jan 15: 196
  • Feb 15: 201
  • Mar 15: 48
  • Apr 15: 54
  • May 15: 74
  • June 15 (before 6th): 15
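For anyone who wants to recreate the summary above, a small illustrative sketch (using only the monthly numbers quoted in this post) is shown below.

```python
# Monthly activity (clicks) inside the content areas, copied from the
# figures above; Learning Services staff are already excluded.
clicks = {
    "Dec 14": 160, "Jan 15": 196, "Feb 15": 201,
    "Mar 15": 48, "Apr 15": 54, "May 15": 74, "Jun 15": 15,
}

total = sum(clicks.values())
peak_month, peak_clicks = max(clicks.items(), key=lambda item: item[1])
print(f"Total clicks: {total}")                      # 748
print(f"Peak month: {peak_month} ({peak_clicks})")   # Feb 15 (201)
```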
Learning Design (Learning Analytics)

As mentioned earlier, to award the open badge / certificate the learner needed to have stated that they had reviewed the material and/or completed a quiz. This meant we could track user data throughout the activity. It highlighted some interesting points. Firstly, completion rates were relatively high for Essay Structure (10 out of 18), but much lower for Information Sources (4 out of 21). Secondly, it answered a question (and exposed a design flaw): why had so few people achieved the open badge for Information Sources? The answer was that 15 people (out of 21) had got to the last two activities; however, these were deployed at the same time. The bulk of people (10 out of 15) completed the end-of-session test, but fewer also completed the Summon Search Task (5 out of 15). You need to complete both to be awarded the badge. So the design tweak will be to ensure these are released individually.
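To illustrate the design flaw, here is a small, hypothetical sketch of the award rule (not the real CourseSites configuration): because the badge requires every item to be complete, releasing the end-of-session test and the Summon Search Task together meant a learner who completed only one of them silently fell short.

```python
# Hypothetical sketch of the badge-award rule for Information Sources.
# A learner must complete *all* required items; completing just the
# end-of-session test (as 10 of the 15 learners did) is not enough.

REQUIRED_ITEMS = ("reviewed_material", "end_of_session_test", "summon_search_task")

def badge_awarded(completions: dict) -> bool:
    """Return True only when every required item has been completed."""
    return all(completions.get(item, False) for item in REQUIRED_ITEMS)

# Example: completed the test but not the Summon Search Task -> no badge.
learner = {"reviewed_material": True,
           "end_of_session_test": True,
           "summon_search_task": False}
print(badge_awarded(learner))  # False
```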

Why don't you have a try?

The courses are being taught using CourseSites by Blackboard, an online platform for organizing and securely sharing course materials, online lectures, discussion and other learning activities. To request enrollment into the courses, follow the steps below:
  • Launch a browser and enter the following URL to the course home page: https://www.coursesites.com/s/_lsucssto1
  • Once at the course home page, click the Request Enrollment button.
  • Enter a valid email address and your full name in the corresponding fields.
  • Optionally, edit the Subject.
  • Optionally, edit the message. The name you enter in the Full Name field will be automatically entered into the signature of the message.
  • Click Submit to send your request.
Shortly after, you will be sent a course invitation. Follow the link to confirm and register. When signing up, note that you can register using existing account information from popular web services such as Facebook, Twitter, LinkedIn, Gmail, Yahoo and Windows Live to make it easier to log in.
 
With Thanks - Image - http://www.virtualeduc.com/images/online_icon.png


Monday 8 June 2015

Multimedia Management Service (Elevate Team): 2015 [Latest Update]

This is an update post; the original was published on 19th March 2015. The update is needed to bring it in line with our reporting schedule.

The Elevate Team allow staff to upload videos to the Elevate Team YouTube channel for use in teaching, learning and assessment. On occasion, the Elevate Team will be involved in capturing and editing the videos; otherwise, we simply enable the distribution. This service has been growing over time, and provides an alternative to course teams needing to support their own approach.

The following outlines the use of this service from 1st January 2015 to 31st May 2015 (a short sketch recomputing the category percentages follows the breakdown).

Total number of videos uploaded: 745
  • Student Presentations: 530 (71%)
    • Student Vivas: 3 (0.4%)
  • Student Role Plays: 94 (13%)
  • Recordings of online events (Hangouts): 5 (0.6%)
    • Screencasts (talk-over PowerPoints & how-to videos): 31 (4%)
    • Staff-focussed Learning Objects: 53 (7%)
  • Marketing (external audience): 12 (1.6%)
  • Other: 17 (2.4%)
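As a quick cross-check, each category’s share of the 745 uploads can be recomputed from the raw counts above; a small sketch is below.

```python
# Recalculate each category's share of the total uploads from the raw counts.
uploads = {
    "Student Presentations": 530,
    "Student Vivas": 3,
    "Student Role Plays": 94,
    "Recordings of online events (Hangouts)": 5,
    "Screencasts": 31,
    "Staff-focussed Learning Objects": 53,
    "Marketing (external audience)": 12,
    "Other": 17,
}

total = sum(uploads.values())                       # 745
for category, count in uploads.items():
    print(f"{category}: {count} ({count / total:.1%})")
```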
The Elevate Team are starting a review of how people are using multimedia in their teaching and learning models, to ensure the service we provide is fit for staff needs. For more information, see:

The power of the MCQ exam: Optical Mark Reading (OMR) Report 2014-15

The following report is the OMR in figures. This year we have not undertaken the survey, as we intended to collect staff stories as a means of promoting the service.
The key changes this year included:
  • Purchasing a license so the software can be run from within Learning Services. This improves turnaround speeds; we were able to deliver the processed forms within 2 hours for the Social Work Admissions Test
  • Extending the use of the software and service to support student surveys. This was to support an initiative from Admissions & Marketing
During 2014/15
  • The total number of course teams using the service was 12 (up from 5)
  • The total number of papers set was 20 (up from 17)
  • The total number of answer sheets scanned was 679 (up from 516)
  • Covered course teams in the Dept of Science and Technology, the Dept of Applied Social Science, and the Suffolk Business School
Recommendations for 2015/16
  • Be more proactive in promoting the service to course teams across UCS, including the collection of user stories and testimonials

Friday 5 June 2015

LearnUCS Review 2015: Findings (Draft)

Learning Services (Elevate Team) have undertaken a review of LearnUCS to ensure the virtual learning environment (VLE) supports UCS’s 2020 Vision. This was done by answering two distinct questions:
  1. How are staff currently using the VLE within their teaching, learning or assessment?
  2. How would staff like to use the VLE to enhance their teaching, learning or assessment?
UCS’s current Blackboard licence is for the external hosting of the Learn platform. The Blackboard Learn software sits at the centre of a set of e-learning tools used at UCS to provide a broad virtual learning environment.

The usage of LearnUCS is relatively high, with an average of over 78,000 logins per month (web and mobile app - Oct 14-April 15).

Methodology

The data collection involved a number of 1-2-1 semi-structured interviews and group tasks within departmental meetings. The data was collected between February and April 2015.

Findings

The VLE (LearnUCS), within the broader set of e-learning platforms at UCS, is capable of meeting staff’s general requirements for the design and delivery of rich digital learning experiences. Therefore, it is the author’s opinion that the existing e-learning tool landscape would only need a few additions to enable UCS to “further develop flexible modes of supporting learning that enable students to flourish as independent learners with the capacity for analytical, critical and creative thinking” (UCS 2020 Vision, pg 12). Based on staff feedback, these additions would include UCS exploring a specialist peer assessment tool, getting more from its current web conferencing tools, and seeking opportunities for collaborative learning through student-owned cloud solutions, in particular Microsoft 365 and Google Drive.
The review identified:
  • Staff currently use the VLE on a “content plus” model, where the core learning activities and support are delivered face to face, while administrative information (announcements, calendar), readings, learning material, assignment submission and some support are provided online.
  • The majority of the “it can’t do” issues are actually possible, and in many cases have been for a while. This implies the need to revisit our communication plan to improve its effectiveness; if it is more effective, the potential barriers to adoption will be reduced.
  • There is a need for staff to have more effective access to support people (the Elevate Team). Individuals felt many of their questions could be effectively addressed through just-in-time support, rather than an over-reliance on online materials.
  • The barriers to adoption do not indicate there are issues with the VLE software itself. The barriers tend to relate to processes, the provision of wider UCS hardware and, most commonly, the time to explore the opportunities the software offers.
  • The required enhancements include known issues on Blackboard’s roadmap, in particular enhancements around the electronic management of assessment, the Learn mobile app and the discussion board tools. Therefore, Learning Services need to discuss these regularly with our Blackboard account manager.

Recommendations

The following recommendations are intended to encourage the adoption of the VLE, widen its use towards more e-learning activities, improve UCS’s awareness of viable alternatives to Blackboard, and identify the resource requirements for UCS to self-host Blackboard.

The granular enhancements identified through data analysis (see further reading) will be accommodated in the annual LearnUCS Service Plan.

Getting more out of the VLE

  • R1: Design and implement a more effective staff support and development model. Owner: Learning Services (Elevate Team). Timeframe: implement in Sept 2015.
  • R2: Publish an advice guide on using the following to enhance teaching, learning and assessment: webinar tools (visimeet & Google Hangouts), the LearnUCS quiz tool, and the LearnUCS collaborative group tools (wikis & blogs). Owner: Learning Services (Elevate Team). Timeframe: Sept 2015.
  • R3: Promote the LearnUCS Top Tips and minimum expectations as one way of enhancing effective use. Owner: Learning Services (Elevate Team). Timeframe: Sept 2015.
  • R4: Work closely with IT Services and Estates to ensure the required hardware and software is available for staff to use the e-learning tools. Owner: Learning Services (Elevate Team). Timeframe: ongoing.

Being aware of viable alternatives to Blackboard

  • R5: Compare the feature set of alternative VLEs, in particular Moodle, Desire2Learn and Google Educational Apps. Owner: Learning Services (Elevate Team). Timeframe: Dec 2015 & Dec 2016.

Seeking ways to reduce the license costs

  • R6: Identify the resource requirements for UCS to self-host Blackboard. Owner: IT Services. Timeframe: May 2016.

Further Reading

Image - With Thanks - https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTnpQat72XatcaIdmZ81ujmhXeuHsMkpQjxR-vQjKdAvqtF5heTEWvlpXfzZaCN7r7uDCMfKntBfB63EH0-cQjOpZHfymJdqbvCe7dLS53U8LPKYKYmbFrXQcmWneCJNiTVf9X14NGdVA/s1600/draft.JPG