Friday 30 May 2014

Open Badges Generator Update - Source Code

Following on from our previous post on the Open Badges Generator, we have updated our system to include both badge creation and badge issuing components.

We have now written up a small guide to help anyone who would like to implement this elsewhere. The system we have built and documented is still in its initial pilot/proof-of-concept phase, so expect some inconsistencies in the source code.

Here is the link to the guide:

And here is a link to the source code:

This documentation assumes the reader has some knowledge of JavaScript syntax, basic Google Apps Script and basic HTML/CSS.

Over the coming months we will be piloting the issuing of badges using this system for events here at UCS, such as the Annual Court, Teaching & Learning Day and the Research Colloquium. These events will allow us to test both the system and the processes behind issuing the badges, from the initial team talk with the event host to the claiming of the badges by the attendees.

We are keen for recipients to participate in the event in order to be eligible for a badge, not simply to turn up. So we are asking participants to fill in a form (Name, Email) that includes a small reflective piece (150 words max), perhaps about a key message they took from the event, or some feedback.

Currently we have a small list of developments we will be looking at for the next iteration of this system; these include, but are not limited to:

  • Unique ID for each badge issued, to allow for row deletion in the DB.
  • Option of an expiry date for issued badges; this will help reinforce a potential 'renewal' process, for example a continuing digital literacies programme.
  • A statistics component to record and show how many of each badge have been issued.
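As an illustrative sketch of the first item, a unique ID per issued badge makes the corresponding spreadsheet row easy to find and delete later. The helper names below are hypothetical (in Google Apps Script, the built-in Utilities.getUuid() would serve the same purpose):

```javascript
// Hypothetical helper: generate a unique ID for each issued badge so the
// corresponding row in the spreadsheet DB can be located and deleted later.
function newBadgeId() {
  var timestamp = Date.now().toString(36);              // when the badge was issued
  var random = Math.random().toString(36).slice(2, 10); // collision-avoidance suffix
  return 'badge-' + timestamp + '-' + random;
}

// rows: array of [id, name, email, badgeName] as read from the sheet.
// Returns the zero-based row index, or -1 if the ID is not present.
function findRowById(rows, badgeId) {
  for (var i = 0; i < rows.length; i++) {
    if (rows[i][0] === badgeId) return i;
  }
  return -1;
}
```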

Finally, if you are a member of staff at UCS and would like to discuss how you could use Open Badges please contact us :)

Friday 23 May 2014

Thoughts from JISC RSC Eastern E-Learning Forum (16th May)

I attended the JISC RSC Eastern E-Learning Forum meeting on Friday (16th May). A couple of applications I took from the sessions to bring back home include:
  • Padlet seems to be gaining traction across the region. The use seems quite varied: one example was within the event setting itself, i.e., audio responses to questions asked, or observations, while other institutions are using it to evidence the impact of certain teams and services.
  • Videoscribe, which looks like a simple and effective way to generate interesting and engaging multimedia-based learning objects. I'll need to pop up to West Suffolk College to get a few pointers, as Nikki seems to be moving ahead in this area.
  • JISC RSC Case Studies (

Wednesday 21 May 2014

Elevate Team: e-Learning Tool Landscape 2014-15

We have just released our proposed e-Learning Tool Landscape for 2014/15. We publish this every year to give you a sense of the tools we support, and the level to which they are supported.

The document is available below, and we'd encourage you to leave comments and thoughts. We'll be finalising the e-Learning Tool Landscape by the 10th June.

The original document is available from:

If you have any questions, please email Andy Ramsden (

Monday 19 May 2014

Open Badges Generator

We have been looking recently at how we could issue Open Badges, an initiative backed by the Mozilla Foundation for standardising the issuing of badges for 'verifying learning'.

There are many systems available for issuing badges to recipients; however, all of the systems we have come across (including Blackboard) need the recipient to be a registered user of that specific system. This became a stumbling block for us, as we would like to issue badges for the more frequently attended face-to-face elements the Elevate Team offers. Workshops, Masterclasses and group sessions all take place away from systems such as Blackboard, so how could we issue badges?

Creating a new module in Blackboard's CourseSites was one option; however, this would be cumbersome and yet another task a user would have to go through simply to claim a badge.

Whilst searching various user groups and communities, we stumbled across an article by +Martin Hawksey on how to use Google Sites as a badge issuer using Mozilla's Open Badges Issuer API. After testing what Martin had done, including using Google Forms to process user information, we decided to try to modify the system to suit our needs.

The 'proof of concept' we ended up with is quite different from what was detailed in the article. Martin's project issued two static badges; we modified the code to allow us to issue any number of custom badges.

The process would be:

  1. User signs up, for example for a workshop session, leaving their Name and Email
  2. User attends the workshop
  3. Afterwards, the Elevate Team uses the generator to select the appropriate badge for the workshop, entering the Name and Email in the form
  4. User gets an email to claim their badge
What's nice about this process is that it introduces only one step for the end user outside of normal operations (signup form etc.): simply clicking a link in an email.
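Step 4 above can be sketched as a small helper that composes the claim email; the claim URL format and function name here are illustrative only (in Google Apps Script, MailApp.sendEmail would actually send the message):

```javascript
// Illustrative sketch: build the claim email for an issued badge.
// Returns a plain message object; in Apps Script this would be passed
// to MailApp.sendEmail to actually deliver it.
function buildClaimEmail(name, email, badgeName, claimUrl) {
  return {
    to: email,
    subject: 'Claim your "' + badgeName + '" Open Badge',
    body: 'Hi ' + name + ',\n\n' +
          'Thank you for attending. You can claim your ' + badgeName +
          ' badge by clicking the link below:\n\n' + claimUrl + '\n'
  };
}
```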

We created a dynamic Open Badge generator where only our team can select which badge to issue and to whom. The script behind the form dynamically pulls the relevant information (Criteria, Evidence) for each badge based on which badge has been selected, rather than this being a manual process. We intend to extend this by adding another form which will allow us to 'create' a new badge that can then be referenced within this system.
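A rough sketch of that dynamic lookup, assuming an in-memory table in place of the Google Sheet our system actually reads from (badge names and URLs below are placeholders):

```javascript
// Placeholder badge table: [name, criteria URL, evidence URL].
// In the real system these rows live in a Google Sheet.
var BADGES = [
  ['Novice', 'https://example.com/criteria/novice', 'https://example.com/evidence/novice'],
  ['Expert', 'https://example.com/criteria/expert', 'https://example.com/evidence/expert']
];

// Given the badge selected in the form, pull its criteria and evidence
// so the issuing script never needs them entered manually.
function badgeDetails(badgeName) {
  for (var i = 0; i < BADGES.length; i++) {
    if (BADGES[i][0] === badgeName) {
      return { name: BADGES[i][0], criteria: BADGES[i][1], evidence: BADGES[i][2] };
    }
  }
  return null; // unknown badge: the generator should refuse to issue
}
```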

If you wish to test the generator in its development state and issue yourself either a 'Novice' or 'Expert' badge, please visit the Google Form below.

We have integrated a faux-authentication function to prevent the spamming of badges by unauthorised users. This was done simply by using a required field in the Google Form which had to match some text exactly for the form to submit.

For this proof of concept, the Team Code is "pleaseletmehavethebadge" sans quotes.
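The check itself amounts to a single exact string comparison, which could be sketched as:

```javascript
// Sketch of the faux-authentication check: a submission is only processed
// when the Team Code field matches the expected string exactly, mirroring
// the required text-validation rule on the Google Form.
var TEAM_CODE = 'pleaseletmehavethebadge';

function isAuthorised(submittedCode) {
  return submittedCode === TEAM_CODE; // exact, case-sensitive match
}
```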

If you have any feedback on using the system or if you wish to discuss with the Elevate Team about Open Badges, please contact us at

Please note that by its very nature, this system is likely to change and possibly stop working altogether.

The Elevate Open Badge Generator will be for the private issuing of badges by our team; however, the underlying code and technical functionality will be posted shortly under a Creative Commons Attribution 3.0 Unported (CC BY) licence.

With thanks to +Martin Hawksey 's original code and article -

Thursday 15 May 2014

The power of the MCQ exam: Optical Mark Reading (OMR) Service at UCS: Annual Report 2013-14

This blog post is the annual report for the OMR Service which is run by the Elevate Team for course teams at UCS Ipswich. For more information on how the OMR service can help your teaching and assessment model, please contact

Executive Summary

During 2013/14
  • one additional course team has started using the OMR Service for a Level 5 summative exam.
  • The total number of course teams using the service was 5
  • The total number of papers set was 17
  • The total number of answer sheets scanned was 516
The recommendations for 2014/15 are;
  • Be more proactive in promoting the service to course teams across UCS
  • Reduce potential points of failure within the service, including the current risk that the software is installed on a single desktop, which is off site, and only one member of the team knows how to use it

The Elevate Team have been running an OMR Service since 2011-12. The aim of the service is to provide an opportunity for course teams to use Objective Testing (low and high stake) in their assessment and feedback models at UCS. The service complements the use of the Quiz (Test) Engine in LearnUCS. Please note, we do not recommend or support the use of LearnUCS in UCS Computer Labs or alternative learning spaces for summative, high stake assessment.

The Elevate Team use the FormReturn Software / Service (, with a local install on one of the Elevate Team’s iMacs.


The workflow for the creation, distribution and return of OMR exams is outlined in Figure 1. This illustrates that the process is a cross-team effort.


The workflow is also available from;
An illustration of an uncompleted answer sheet is included in Appendix 1. This follows current good practice within design, which has been discussed with the Registry / Examination Office to ensure it accommodates UCS requirements.

Usage and feedback on the OMR Service: 2013-14

This section aims to answer three questions:
  • is the service being used?
  • are staff happy with the service?
  • how might the service be enhanced?
The methodology involves reviewing the log data to identify if the service is being used, and undertaking a survey to capture staff views.

The log data indicates the service has been used between 1st September, 2013 and 9th May, 2014.

The programmes using it for summative exams are:
  • BA Business Management
  • Foundations of Biological and Cognitive Psychology
  • Child and Social Policy
  • Radiation Physics
  • The total number of course teams using the service was 5
  • The total number of papers set was 17
  • The total number of answer sheets scanned was 516
Admissions process

An ongoing commitment is to support the admissions process for the Radiography Team. They run a Literacy Test (20 questions) and a Numeracy Test (24 questions). During 2013-14 the Elevate Team administered 12 tests (6 instances), with 232 scans.

A survey was administered to gather staff opinions around the value of, and potential to enhance, the service. The response rate was 4 out of 5 (80%). All four respondents are planning to use the service in 2014/15, and two strongly agreed with the statement “I’d strongly recommend the use of OMR in teaching, learning and assessment to my work colleagues”.

We use the OMR as the first assessment in a Level 4 module. We have found that students engage very well with the specific learning materials in preparation for the exam. In contrast to the previous assessment, which was an essay, students focus on the provided revision material, which is directly linked to the learning outcomes for the module. They receive feedback within a matter of days of the exam, when previously they had to wait much longer for up to 120 essay scripts to be marked by two tutors.

Feedback from the students is overwhelmingly positive and summative results have improved.
With respect to potential issues, one response focussed on the initial setting up and refreshing of questions, given the potential for three exams a year per cohort; course teams therefore need to develop a large bank of questions. Another response raised the need for a quicker turnaround, in particular results being available within the working day of the assessment.

  • Be more proactive in promoting the service to course teams across UCS
  • Reduce potential points of failure within the service, including the current risk that the software is installed on a single desktop, which is off site, and only one member of the team knows how to use it

Wednesday 14 May 2014

Online Submission, Grading & Return UCS Review

Members of the Elevate Team and Academic Services are undertaking a review of online submission, online marking and online return of student assignments. This builds on the work we undertook a few years ago which led to the creation of a good practice workflow.

The broad aims of the review are to engage with various stakeholders across the institution to assess if the current workflow is appropriate, and where it can be enhanced.

The intended outputs are;
  1. a suggested workflow for managing the e-submission, e-grading and e-return 
  2. creation of appropriate staff and student support and development models
I will endeavour to regularly post about this review as we progress.

Members of the Task & Finish Group, are Andy Ramsden, Aaron Burrell, Faith Hicks and Matthew Hirst.

The intention is for the Task & Finish Group to feed back to AMC at the start of June 2014. If you would like any more information, please email Andy Ramsden (

Image Source:

Tuesday 13 May 2014

Mahara currently unavailable.

UCS' Mahara service is now back up and running, however it is advised that the next 24 hours are classed as an 'at risk' period with possible intermittent access. Our host ULCC will be monitoring the situation.

Mahara ( is currently unavailable. We have contacted our host's support team regarding this matter.

We are currently awaiting a response and will update as soon as we know more.

Apologies for any inconvenience.

UPDATE - It has come to our attention that our host ULCC has experienced a power outage in London and is working hard to rectify the problem.

Read more here ->

Friday 9 May 2014

SafeAssign Originality Report Updates

To coincide with a new release of Blackboard (we will be upgrading over the summer) the SafeAssign Originality Reports have had a change of style, to make them easier on the eye.

An example of the new look and feel is below:

Along with the new layout come some improvements: now when you click on some highlighted text, you are presented with the text from the document and the original source, as shown below:

A SafeAssign originality report provides detailed information about the matches found between a submitted paper and existing sources. The report identifies all matching blocks of text. You and your students need to determine if the matching text is properly referenced. Investigating each match prevents detection errors due to differences in citing standards. 
The originality report displays the list of potential sources and identifies which sources were searched to return the available results. If students cited any of the sources correctly, you can remove them from the overall score as they are not instances of plagiarism.
The "Send" feature, which allowed you to send a report to another user, has been removed; now you need to "Print" and choose to print as PDF, which allows you to then forward the report.

Below is a guide showing how to interpret the scores generated by the report, but as always, this is just a guide.

Sentence matching scores represent the percentage probability that two phrases have the same meaning. This number reflects the reciprocal of the probability that these two phrases are similar by chance. For example, a score of 90 percent means that there is a 90 percent probability that these two phrases are the same. There is a 10 percent probability that they are similar by chance and not because the submitted paper includes content from the existing source, whether appropriately attributed or not.

The overall SafeAssign score indicates the probability that the submitted paper contains matches to existing sources. This score is a warning indicator only. Review papers to see if the matches are properly attributed.
  • Scores below 15 percent: These papers typically include some quotes and few common phrases or blocks of text that match other documents. Typically, these papers do not require further analysis as there is no evidence of plagiarism.
  • Scores between 15 percent and 40 percent: These papers include extensive quoted or paraphrased material, or they include plagiarism. Review these papers to determine if the matching text is properly referenced.
  • Scores over 40 percent: A very high probability exists that text in these papers was copied from other sources. These papers include quoted or paraphrased text in excess, and need to be reviewed for plagiarism.
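The bands above can be summarised in a small helper; the thresholds mirror the guide and, as noted, are indicative only:

```javascript
// Map an overall SafeAssign score to the guidance bands described in the
// guide above. These are advisory thresholds, not a verdict on plagiarism.
function scoreGuidance(percent) {
  if (percent < 15) return 'low: typically no further analysis needed';
  if (percent <= 40) return 'medium: review for proper referencing';
  return 'high: review for possible plagiarism';
}
```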

Thursday 8 May 2014

What Students Really Want - £5K Funding to Improve Student Life

The JISC are offering the following student competition, with £5K up for grabs for successful students.
The JISC believe that students should have a prime role in developing novel uses of technology to improve their experience at college or university.

That’s why last year we ran a competition – the Summer of Student Innovation - designed with RLUK, RUGIT, SCONUL and UCISA as part of a co-design approach to innovation within UK education. 21 projects were successfully funded and supported to develop their ideas. We continue to work with six projects to help develop them and explore the possibility of offering them as a shared service for the whole sector.

The Summer of Student Innovation 2014 competition is running again this year and £5k is available for each successful student idea. Students have until the end of May to submit their entries and with our help attract votes. For more information, see
If you'd like to talk through your ideas (UCS students only) before submitting them, please email Andy Ramsden (e-Learning Development Manager) -

Thursday 1 May 2014

Online courses available to enhance your skills for living in this digital world

The following was announced at a recent Digital Literacies event we ran at UCS with the JISC RSC Eastern.

We are running a number of online courses during May, 2014. The titles are listed below. We've arranged them by expected audience, although you may think differently :-)

If you have any questions about if the course is for you, please email

  • Using e-Portfolios (Mahara) as a reflective learner
Lecturers / Teachers
  • Getting started with Augmented Reality as a Learning Technology
  • Why use the flipped classroom in your teaching?
  • Getting started with the Blackboard Quiz Engine in your teaching and assessment model
University Campus Suffolk Members Only
  • Using RefWorks for resource management (University Campus Suffolk Only)
These online courses will use the Blackboard CourseSites Learning Platform (, they are designed to take approximately 5 hours, and when you successfully complete a course you'll receive a Mozilla Badge (and Certificate) of Attendance.

The intended outcomes of the sessions are listed below.

All the courses will start on the 12th May, 2014, and run for two weeks.

The first step is for you to register your interest through the online form ( and we'll contact you a week before the course starts to clarify how and where to access the courses.

In the mean time, if you have any questions, please email

Intended Learning Outcomes

Using e-Portfolios as a reflective learner

By the end of this course participants should be able to;
  • LO1: Have a better understanding of what a reflective learner is and be aware of other types of learners
  • LO2: Create an e-portfolio page which has been used for personal reflection in the Mahara e-portfolio tool
  • LO3: Be aware of the different tools available and their effective use in supporting reflective learning
Getting started with Augmented Reality as a Learning Technology

By the end of this online course participants should be able to answer the following questions;
  • LO1: What is meant by the term augmented reality?
  • LO2: How is it being used in UK HE & FE?
  • LO3: How might you use it in your Teaching, Learning and Assessment Models?
  • LO4: How do I design and develop an augmented reality learning activity using Aurasma?
Why use the flipped classroom in your teaching?

By the end of this online course participants should be able to answer the following questions;
  • LO1: What is meant by the term flipped classroom?
  • LO2: What are some of the underpinning educational models?
  • LO3: What evidence is there to support using a flipped classroom from UK HE & FE?
  • LO4: What concerns are there around flipping the HE classroom?
  • LO5: What technology enhanced learning designs suit a flipped classroom?
  • LO6: How might you use the flipped classroom teaching model?
Getting started with the Blackboard Quiz Engine in your teaching and assessment models

By the end of this course participants should be able to;
  • LO1: Explain what we mean by the term objective test
  • LO2: Review a number of ways objective tests have been used in UK HE to enhance the learning experience
  • LO3: Design, develop and deploy an objective test in a module on LearnUCS
  • LO4: Access the results through the Grade Centre
Using RefWorks for resource management

By the end of this course participants should be able to;
  • LO1: Answer the question: “What is a reference management tool (i.e. RefWorks)?”
  • LO2: Create accounts with customised settings to reflect use of the UCS Harvard referencing style.
  • LO3: Import/Export content from a variety of sources, including databases and other online resources
  • LO4: Install and know how to use the Ref-Grab-It Bookmarklet tool
  • LO5: Set up folders and store references in them
  • LO6: Create a bibliography of their references
  • LO7: Know the importance of checking that their reference lists meet UCS referencing standards
  • LO8: Share references within their RefWorks account with others.
Image Source: