Showing posts with label Statistics. Show all posts

Wednesday, 10 June 2015

Learning Services Online Open Courses: Review 2014-15

The Learning Services Team have been piloting a number of online open courses during 2014-15. The aims of the pilot included exploring how we could design, develop and maintain online courses (each the equivalent of a one-hour workshop), and exploring open badges both as motivation and as a way to feed into discussions around gamification.

Our design and development process is based around short development sprints, each to be completed within a week. We also ensure the teams are composed of a wide range of roles from across Learning Services. Each design team is allocated an e-Learning Developer and a Critical Friend (another member of Learning Services). Their responsibilities are to provide advice on learning design within the context of the available tools, and to act as a sounding board for ideas. The critical friend needs to be constructively critical. The process is:

  • Stage 1: The Design Team create the detailed plan (based on the workshop template), and this is discussed and agreed (signed off) with the Critical Friend. This looks at the learning design and activities. 
  • Stage 2: Timetable a short time frame to create the multimedia (talking heads, screencasts etc.) with the e-Learning Developer
  • Stage 3: Completion of template docs for interactive quizzes
  • Stage 4: Review by Course Team and Critical Friend
  • Stage 5: Migrate and build in CourseSites
  • Stage 6: Internal checking by wider Learning Services Team
  • Stage 7: Release and promote

A recommendation for the 2015-16 academic year is to create a new role (Student Reviewer). This person will have an input at Stage 1, Stage 4 and Stage 6. A second recommendation would be to use critical friends from outside Learning Services, including lecturers or academic developers.

During the year, we released three courses (one of which was not in CourseSites). The learning design for these courses was based around a linear, self-paced, individual learner model. Once they had been developed and released, there was no ongoing work requirement for the design team.

On successful completion of the course the learner was awarded an open badge and certificate. The criteria were that they had reviewed (read or watched) the specified material and completed any required tests.

We have also designed (but not yet implemented) two other courses with different learning designs. These allowed us to challenge (and understand) the issues around maintaining online courses which have a real-time communication dimension or are designed around collaborative group learning.

Given that the primary aim of the 2014-15 period was for Learning Services to learn from the design and development process, we deliberately kept promotion minimal: an announcement on the intranet, and an area on our Learning Services web page.

The following usage statistics cover the period 12th Dec 2014 to 6th June 2015, and the two courses in CourseSites (Essay Structure and Information Sources).

Enrollments
  • Total 61 people, of which
    • Learning Services Staff: 4
    • UCS Staff: 8
    • UCS Students: 33
    • Unknown (no recognizable name, username or email): 16
Achievements (Open Badge)
  • Total 14 claimed, of which
    • Essay Structure: 10
    • Information Sources: 4
These badges have been achieved throughout the period.
  • Dec 14: 3 (of which Essay (1), Info Sources (2))
  • Jan 15: 3 (of which Essay (2), Info Sources (1))
  • Feb 15: 5 (of which Essay (4), Info Sources (1))
  • Mar 15: 0 
  • Apr 15: 1 (of which Essay (1))
  • May 15: 1 (of which Essay (1))
Course Access: Activity (clicks) inside the Content Areas

It is clear from the data below that the courses have been accessed, and the peak period was around the end of Semester 1. This activity data excludes members of Learning Services.

  • Dec 2014 (12th onwards): 160
  • Jan 15: 196
  • Feb 15: 201
  • Mar 15: 48
  • Apr 15: 54
  • May 15: 74
  • June 15 (before 6th): 15
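Using the monthly figures above, the peak month and overall activity can be pulled out with a quick sketch (hypothetical Python, written for this post rather than part of any CourseSites reporting tooling):

```python
# Monthly content-area clicks, 12 Dec 2014 to 6 June 2015 (figures from the report).
monthly_clicks = {
    "Dec 14": 160, "Jan 15": 196, "Feb 15": 201, "Mar 15": 48,
    "Apr 15": 54, "May 15": 74, "Jun 15": 15,
}

# Busiest month and total activity for the period.
peak_month = max(monthly_clicks, key=monthly_clicks.get)
total_clicks = sum(monthly_clicks.values())

print(peak_month, monthly_clicks[peak_month])  # Feb 15 201
print(total_clicks)                            # 748
```

February just edges out January, which is consistent with the "end of Semester 1" peak noted above.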
Learning Design (Learning Analytics)

As mentioned earlier, to award the Open Badge / Certificate the learner needed to have stated they had reviewed the material and/or completed a quiz. This meant we could track user data throughout the activity. It highlighted some interesting findings. Firstly, the completion rate was relatively high for Essay Structure (10 out of 18) but much lower for Information Sources (4 out of 21). Secondly, it answered a question (and exposed a design flaw): why had so few people achieved the open badge for Information Sources? The answer was that 15 people (out of 21) had got to the last two activities; however, these were deployed at the same time. The bulk of people (10 out of 15) completed the end of session test, but fewer also completed the Summon Search Task (5 out of 15). You need to complete both to be awarded the badge. So the design tweak will be to ensure these are released individually.
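The badge rule that caused the drop-off is a simple conjunction: both final activities must be complete, so a learner who finished the test but skipped the search task fell short. A minimal sketch (hypothetical Python, not the actual CourseSites rule engine):

```python
def info_sources_badge(end_test_done: bool, summon_task_done: bool) -> bool:
    """The badge requires BOTH final activities (an AND, not an OR)."""
    return end_test_done and summon_task_done

# Completing only the end-of-session test is not enough:
print(info_sources_badge(True, False))  # False
print(info_sources_badge(True, True))   # True
```

Releasing the two activities one after the other, rather than together, should make it clearer to learners that both are needed.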

Why don't you have a try?

The courses are being taught using CourseSites by Blackboard, an online platform for organizing and securely sharing course materials, online lectures, discussion and other learning activities. To request enrollment into the courses, follow the steps below:
  • Launch a browser and enter the following URL to the course home page: https://www.coursesites.com/s/_lsucssto1
  • Once at the course home page, click the Request Enrollment button.
  • Enter a valid email address and your full name in the corresponding fields.
  • Optionally, edit the Subject.
  • Optionally, edit the message. The name you enter in the Full Name field will be automatically entered into the signature of the message.
  • Click Submit to send your request.
Shortly after, you will be sent a course invitation. Follow the link to confirm and register. When signing up, note that you can register using existing account information from popular web services like Facebook, Twitter, LinkedIn, Gmail, Yahoo and Windows Live to make it easier to log in.
 
With Thanks - Image - http://www.virtualeduc.com/images/online_icon.png


Monday, 8 June 2015

The power of the MCQ exam: Optical Mark Reading (OMR) Report 2014-15

The following report is the OMR service in figures. This year we have not undertaken the survey, as we intend to collect staff stories instead as a means of promoting the service.
The key changes this year included:
  • Purchasing a license to run from within Learning Services. This improves turnaround speeds; we were able to deliver the processed forms within 2 hours for the Social Work Admissions Test
  • Extending the use of the software and service to support student surveys. This was to support an initiative from Admissions & Marketing
During 2014/15
  • The total number of course teams using the service was 12 (up from 5)
  • The total number of papers set was 20 (up from 17)
  • The total number of answer sheets scanned was 679 (up from 516)
  • Covered course teams in the Dept of Science and Technology, Dept of Applied Social Science, and Suffolk Business School
Recommendations for 2015/16
  • Be more proactive in promoting the service to course teams across UCS, including the collection of user stories and testimonials

Wednesday, 6 February 2013

Mahara in Numbers

This inaugural stats post kicks off a monthly look at how our Mahara service is doing from a numbers viewpoint, and at how we can change the flow of traffic and, probably more importantly, where it goes.


Since our soft rollout to the institution in mid-2011, the statistics haven't been awe-inspiring, to say the least. However, there is some rationale behind this: mostly, we were still figuring out what we wanted from the service. It can offer so much, but adopted wrongly it can cause issues, which I'll undoubtedly blog about soon.


So let's take a looksie..



As you can see from the above graph, the gap between the number of users on the system and the number of pages created is quite drastic. However, I'd personally rather have a low percentage of students using the service effectively than a gold-star stats report where very few students understand the service.



Well, looky what we have here! Two very clearly distinguishable peaks in our lifetime stats. Care to hazard a guess at when these peaks, err, peak? You'd be correct in thinking that they fall in September/November each year, right slap bang when we are doing our student inductions. We push the Mahara service in our student inductions, explaining its effective use and benefits, and it clearly shows students are taking a gander. The problem is, why aren't they staying?


This is something we need to focus on: the interest is there from students. The next step is to get them to stay and continue using the service.




  • Is there enough follow-up support for students?

  • Are the FAQs enough?

  • Is it quick and easy enough?


Obviously this is an area I need to work on, and I shall report back any progress in, if not before, next month's stats post.


To briefly summarise: Mahara is being used, not greatly, but it is being used nonetheless. Its effective use, reflected through case studies, will come next, but that's a different blog post!


:)