At the recent Australian Moot, a hot, emerging topic was “analytics”. Analytics have the potential to help students, teachers and institutions make better choices that lead to improved learning outcomes. Analytics were not possible in traditional education – they are, however, something a learning management system (LMS) can easily provide, if we can just figure out what they are…
What are analytics?
Analytics is a relatively new term; so new, in fact, that my spell-checker doesn’t recognise the word. It’s a real word, though; it must be, as there is a Wikipedia entry. The term has been popularised in the online world by Google, which offers analytics about your website so you can see details about your site visitors.
But what does the term analytics mean for education in the context of an LMS? At a panel session during the aforementioned moot, the topic was raised and this was the first question asked – what are analytics? One of the panel members, Alan Arnold from the University of Canberra, bravely offered a definition: “Analytics are any piece of information that can help an LMS user do their job.”
Thinking more deeply about the subject, I propose that LMS analytics are useful, quantifiable collections of information, gathered from the activities of users within or around the LMS, presented in forms that allow decisions leading to improved learning outcomes. (Quite a mouthful.) There’s lots of information about a user that can be collected in the LMS. The trick is to tease out and combine the useful bits and present them simply.
So the question is not so much “what are analytics?” but instead “what analytics do you need?” and perhaps “how can you get them?”.
What analytics do you need?
Not all analytics are relevant to all users. If you are a teacher, you’re probably thinking about getting information that can allow you to teach better. If you’re a policy maker at an institution, you’re probably wanting to know how successful your teachers are with their students. But let’s not forget the students as well; there is information in the LMS that can help them too.
On the plane back from the Moot I decided it would be worth starting a catalogue of all the different analytics that could be captured in Moodle. At lunch I threw these at Martin and we cranked out a few more.
Analytics useful to Students
- Progress
With an LMS, it is possible to achieve regular assessment within a course based on a rich set of finely chunked multi-modal activities, and while this can lead to deep learning, it can also be overwhelming for students. It is, therefore, useful for a student to know where they are up to in a course and what they have to do next. Students who use short-term planning tend to be more successful; they just need a quick snapshot of their progress.
- Relative success
- Deep learners are more successful, and deep learners are characterised by meta-cognition about their learning. Providing analytics about their relative success can allow students to know whether they are on track or if they need further exposure to a topic. Relative success can also be used to introduce a competitive element into a cohort, which some educationalists recommend.
- Opportunities to interact
- If students are studying in isolation, it may not always be apparent when there are chances for them to interact with peers or teachers. Determining the level at which a student is interacting could be seen as an analytic that can be used to direct them to opportunities for communication and collaboration.
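To make the first two of these concrete, here is a minimal sketch of how a progress snapshot and a relative-success figure might be calculated. The function names and sample data are hypothetical; a real implementation would draw on Moodle’s completion and grade tables.

```python
# Hypothetical sketch of the "progress" and "relative success" analytics.
# The activity names and scores below are invented sample data.

def progress(completed, total):
    """Fraction of course activities a student has completed."""
    return len(completed) / total if total else 0.0

def relative_success(student_score, cohort_scores):
    """Student's score relative to the cohort average (> 1.0 means ahead)."""
    avg = sum(cohort_scores) / len(cohort_scores)
    return student_score / avg if avg else 0.0

cohort = {"alice": 78, "bob": 62, "carol": 90}
print(progress({"quiz1", "forum1"}, 8))                        # 0.25
print(round(relative_success(cohort["bob"], list(cohort.values())), 2))  # 0.81
```

Presented as a bar or a single figure, numbers like these give students the quick snapshot that short-term planners need, without exposing individual peers’ results.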
Analytics useful to Teachers
- Student participation
- In an online world, it is more difficult for a teacher to know which students are participating and which need a push. Students can fail to participate for numerous reasons, usually valid ones. Sometimes a student may need to be encouraged to withdraw from a course and re-enrol later. Where analytics can help is in determining when such decisions need to be made. That’s not to say that such information needs to be complex; it could be as simple as “traffic light” coloured icons next to a list of students’ names, ordered by risk.
- Student success
- Assuming a student is involved, a teacher also wants to know how successful they are. This could be the product of assessment and views of resources. If students are progressing through the course with unsuccessful results, then they may need to be encouraged to re-expose themselves to a topic within the course before progressing further.
- Student exposures
- Moving away from a course modality where “one size fits all”, it is useful to know how many times a student was exposed to a topic before they were successful. This is a differentiating factor among students in a cohort. If students are progressing with few exposures, perhaps they are finding the course too easy, perhaps even boring, and may need to be challenged further. If students require numerous exposures before they are successful, then perhaps alternative presentations of a topic need to be created to suit the learning preferences of particular learners. Such an analytical tool can assist a teacher to deliver learning at an individual level.
- Student difficulty in understanding
- Through an analysis of exposures and assessment results, it may be possible to determine which topics, or areas within a topic, students are finding difficult. This may indicate areas that need to be revisited in the current delivery or enhanced in a future delivery of the course.
- Student difficulty in technical tasks
- When students are undertaking learning, the last thing they want is to be stifled by an inability to express their understanding because of the way a course is set up within the LMS. Students’ patterns of use within the LMS may indicate they are having such difficulties, and a teacher can be alerted to take action.
- Feedback attention
- Teachers spend time and effort creating feedback for students as a reflection of their understanding. It is useful to know which students have paid attention to such feedback, and which students may need to be encouraged to do so. Going beyond this, it may be possible to deliver information to a teacher about the effectiveness of their feedback on students’ understanding, as reflected in subsequent assessment.
- Course quality
- In several institutions that I know of, part of the measurement of a teacher’s effectiveness is a judgement of the quality of the courses they are producing within the LMS, based on a set of metrics. Such measurements can be used for promotions and to drive the development of PD activities. If such metrics can be automated, then analytics can be produced for teachers that encourage them to improve their courses by increasing the richness of their resources, improving the quality of their activities, including more activities of different kinds, and providing more opportunities for students to interact or collaborate.
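As a rough sketch of the “traffic light” participation indicator mentioned above, the following fragment scores students by recent logins and orders them by risk. The thresholds and login counts are illustrative assumptions only, not values Moodle itself uses.

```python
# Hypothetical traffic-light participation indicator. Thresholds (0 logins
# = red, fewer than 3 = amber) and the sample counts are invented.

def traffic_light(logins_last_fortnight):
    if logins_last_fortnight == 0:
        return "red"
    if logins_last_fortnight < 3:
        return "amber"
    return "green"

activity = {"dana": 0, "erik": 5, "fay": 2}
by_risk = sorted(activity, key=activity.get)  # most at-risk first
for name in by_risk:
    print(name, traffic_light(activity[name]))
```

The output lists dana (red) before fay (amber) and erik (green), giving the teacher the at-a-glance, risk-ordered view described above.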
Analytics useful to Institutions
- Student retention
- Analytics can provide more information about students than simple pass/fail rates. Analytics can help determine when students may be at risk of failing and in which courses this is more likely to happen. Such analytics can help an institution to send resources to where they are needed most and to plan resources for the future.
- Teacher involvement
- There may be ethical implications in monitoring teacher involvement in a course, as it is akin to workplace surveillance. However, there is information in an LMS that can be presented in a useful way in relation to training and promotions. It might also be useful to anonymously tie a teacher involvement analytic in with other analytics to find correlations.
- Teacher success
- As well as looking at success in terms of pass and fail, it may also be possible to determine where teacher interventions have encouraged students to achieve beyond their expected outcomes.
- Relative course quality
- Clearly not all courses are equal, but how do you determine which is better? There have been a number of attempts to manually measure aspects of a course such as accessibility, organisation, goals and objectives, content, opportunities for practice and transfer, and evaluation mechanisms (Criteria for Evaluating the Quality of Online Courses, Clayton R. Wright). If such metrics can be automated, then analytics can be created that reflect the quality of courses. Such metrics could also be fed back to teachers as an incentive to improve their courses.
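An automated course-quality metric of the kind suggested above might, as a first approximation, score a course on the variety of its activity types and the proportion that are interactive. The categories, weights and sample course below are illustrative assumptions, not an established metric:

```python
# Hypothetical course-quality score: equal weighting of activity variety
# and interactivity. The set of "interactive" types is an assumption.

INTERACTIVE = {"forum", "quiz", "workshop", "chat"}

def course_quality(activities):
    """Score in [0, 1] combining activity variety and interactivity."""
    if not activities:
        return 0.0
    variety = len(set(activities)) / len(activities)
    interactive = sum(a in INTERACTIVE for a in activities) / len(activities)
    return round(0.5 * variety + 0.5 * interactive, 2)

print(course_quality(["page", "page", "quiz", "forum", "assignment"]))  # 0.6
```

A genuine instrument would weight the dimensions from the literature (such as Wright’s criteria) rather than equally, but even a crude score like this can be trended over time and compared across courses.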
How can you get them?
So, you want these analytics, but how can you get them? Some of them may already be accessible via various mechanisms, however I think we still need to work out how best to draw this information together in a simple way for specific users.
Moodle currently logs most actions that take place within the LMS. It is possible to view log reports, but they are limited to interaction in terms of activities within a course.
There are a number of existing plugins and extensions to Moodle that attempt to provide analytics to users. Among these is a batch of report generators, many of which are quite configurable.
- The Configurable reports block plugin is a configurable system that allows reports to be created and used by various roles. It may be a good model for starting a set of standard analytics reports within an institution.
- The Custom SQL queries report plugin allows an administrator to run any query against the database used by Moodle. It’s clearly flexible, but not something you can put into the hands of all users.
- The Totara LMS is a rebranded, extended version of Moodle. One of the aspects built onto the standard Moodle install is a reporting facility that provides customisable reports to users of different roles.
There are also a number of blocks, available and in the works, that attempt to display analytical information to users.
- My own Progress Bar block shows a simple view of course progress to students and an overview of student progress to a teacher.
- The Engagement analytics block is now available. The block allows a teacher to specify expected targets for students, then presents simple traffic-light icons next to the names of students at risk.
- The Course Dedication block estimates the time each student has spent online in a course.
- The Graph Stats block shows overall student activity in a course over time.
Simple queries
A lot of these analytics can already be queried or calculated from the data already stored in the Moodle database. The Report plugin type is available for presenting pages of information to users and is applicable for analytics. The Block plugin type is available for simple, compact presentation of information. Both of these APIs can present different displays to users with differing roles.
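As an illustration of the kind of simple query involved, the sketch below builds a toy log table in SQLite and counts hits per user in a course. The schema is a simplified stand-in; Moodle’s real log table has different columns and would be queried through its own database layer.

```python
import sqlite3

# Toy stand-in for Moodle's log table; column names here are illustrative
# assumptions, not the real mdl_log schema.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE log (userid INTEGER, course INTEGER, action TEXT)")
con.executemany("INSERT INTO log VALUES (?, ?, ?)", [
    (1, 101, "view"), (1, 101, "submit"), (2, 101, "view"), (3, 102, "view"),
])

# Hits per user in course 101, most active first.
rows = con.execute(
    "SELECT userid, COUNT(*) AS hits FROM log "
    "WHERE course = 101 GROUP BY userid ORDER BY hits DESC"
).fetchall()
print(rows)  # [(1, 2), (2, 1)]
```

A Report or Block plugin would run a query of this shape server-side and render the result differently depending on the viewer’s role.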
New logging
Currently, most of the logging that takes place in Moodle ends up in a single table. For simple lookups, this is not a problem, but for complex conjunctive queries, working with a large log table can hog the resources of a server. The current system of logging is likely to be a target of future work at Moodle HQ so that both the recording and retrieval of information can be achieved efficiently.
Measurement of a number of the interactions required for the analytics above is not possible using the current log and module tables. Viewing the recording of user interactions from an analytical perspective may lead to new information being captured for later analysis and presentation.
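For example, if view and success events were captured per topic, the “student exposures” analytic described earlier could be derived along these lines (the event stream and field names are hypothetical):

```python
# Hypothetical derivation of "exposures before first success" from an
# ordered event stream of (student, topic, kind) tuples.

def exposures_before_success(events):
    """Count views of each (student, topic) pair up to its first 'pass'."""
    counts = {}
    done = set()
    for student, topic, kind in events:
        key = (student, topic)
        if key in done:
            continue  # ignore activity after the first success
        if kind == "view":
            counts[key] = counts.get(key, 0) + 1
        elif kind == "pass":
            done.add(key)
    return counts

events = [
    ("gia", "loops", "view"), ("gia", "loops", "view"), ("gia", "loops", "pass"),
    ("hal", "loops", "view"), ("hal", "loops", "pass"),
]
print(exposures_before_success(events))
```

Here gia needed two exposures to the “loops” topic and hal only one; aggregated across a cohort, such counts could flag topics needing alternative presentations.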
AI or user optimised queries
When you have a wealth of user interaction information available, why stop at the limits of human developers?
- Genetic algorithms, neural networks and other heuristic approaches may reveal newly refined analytics or even new analytics altogether.
- Crowd sourced optimisation of analytics reports may allow a collective intelligence to refine analytics so that they are even more valuable and reliable.
Analysing analytics
Providing analytics allows us to study the impact that analytics can have on the users who view them. This allows general research questions such as “what analytics promote better learning, teaching or retention?” Also, specific questions can be asked about individual analytics, such as “does feedback attention have an impact on learning outcomes?” Cue the savvy educationalists…
In my opinion, it is quite difficult to find an LMS solution at a reasonable price that provides the analytics you need. I have reviewed several e-learning solutions from the point of view of reporting and analytics tools (Moodle, Totara, JoomlaLMS). I find the JoomlaLMS analytics most effective for teachers (in fact, the analytics in this system are only intended for teachers and supervisors). They are all grouped in the same section and are easy to reach. Among the most important: analytics on document downloads, learning paths, latest course activities, quiz answers, access to tools, most active users…
Thank you, Michael, for this wonderful post on analytics. As an academic who is extremely interested in this area, I would like to open discussion on some points in your post in detail:
1. Progress bar: it is a fantastic visual tool to offer students a snapshot of their progress.
2. Relative success: it would also be good to enable a student to compare their individual progress with the average progress of the entire class at any point during the semester. An end-of-semester comparison for individual students might also be useful, calculating correlations between average access and/or completion of tasks in a subject and the grades achieved.
3. Opportunities to interact:
Moodle already has a chat function, but for this function to be fully utilised, the habit of forming and working within a community of practice needs to be promoted and encouraged. Is it possible for location services, such as finding classmates who are nearby, to be built into Moodle? The forum function is also a tool to promote community building, but students prefer to use Facebook to do so.
4. Student participation: at the university level, once students enrol and the census date (usually at the end of week four) has passed, it is difficult for students to withdraw. It would help, however, to make analytic data available that tells the course adviser which factors in a unit are important for student success. This could be obtained by creating an automated system that uses a statistical tool, such as a decision tree, to analyse historical gradebook data and yield the factors that are critical to success in a unit.
5. How to encourage re-exposure to learning materials: how can this be done in Moodle currently? Setting quiz items to allow multiple attempts is a good way of re-exposing students to learning materials, especially if reading materials can be linked to quizzes, as in Hot Potatoes.
6. Using restricted access again allows a teacher to determine a particular pathway for student learning. However, this can force the whole class to follow the same path. Can the Moodle quiz be adaptive on a personal level for each individual student? How can this be achieved?
7. Student difficulty, in terms of how many attempts were made and how long students spent on each attempt, can be used to illustrate difficulty with a quiz item. Analytics on the average time it takes a group of students to complete a quiz can reflect the difficulty of the quiz, which can prompt a teacher to look at the questions and modify the quiz.
8. Course quality is a bit more controversial as we need to choose metrics for measuring a teacher’s teaching performance which are solidly based on sound educational principles. For instance, do we know what constitutes good Moodle site designs which can warrant a label of excellent performance? Many would argue that teaching performance is a combination of many things and the design of a Moodle site is only one of the metrics which can measure relative course quality.
I am excited about the forthcoming developments on your blog. I hope this long reply might spark further discussion on what analytics we might need to guide learning.
Hi, Zhang.
Thanks for such a thoughtful response. There were a few questions in your reply so I will try to respond.
You commented on “opportunities to interact” and I think this is an important but underdeveloped area. My thoughts were about allowing students to interact online, but of course, as you pointed out, it can also include offline interactions. There are a couple of tools I can think of for locating other students physically. Perhaps the best is the Online Users Map block (http://moodle.org/plugins/pluginversions.php?plugin=block_online_users_map). Personally I don’t like Facebook, but I don’t know how to fight it. Is that me being backward, because I like to think of myself as progressive…?
I liked your idea for gauging student participation. It would be good to see a tool like that implemented as a plugin. Has anyone got ideas for this?
For repeat exposures, Moodle can be “shoehorned” into forcing students to repeat their exposure to materials until they achieve a satisfactory result. You can already use the mechanism of activity completion and activity requirements to achieve this. The challenge, I think, is to make this the focus of a course. One current problem is that exposing students to the same content repeatedly will not necessarily yield great improvements; students must be given the opportunity to be exposed in differing modalities until they “get it”. Knowing which modalities are preferred by students is helpful too. I like the ideas being employed by the School of One (http://schoolofone.org/concept.html) and I’m trying to think of how these ideas could be incorporated in an LMS. How can we do that?
Can Moodle be adaptive to different students? At this stage, I don’t think so. The closest you can come is relying on randomness to offer a different experience to students. There are some adaptive mechanisms in the Quiz module, but they have a different focus to what we are talking about here.
Course quality is an area of research that has had some work done in it. I am not an expert there and I’m not too keen to become one, but I know that there are already established metrics for online course quality, so I don’t think that applying them would be that controversial. I agree that course quality is not the only measure of a good teacher, but I think it’s one that we can get right and easily advise teachers on. Imagine a Progress Bar style block that has specific milestones for a teacher to follow when setting up a course. That would be sweet! What do you think?
Thanks again for responding. I hope more people can join in this conversation.
Dear Salvetore, this is Felicia, also known as Zhang Zhen online. I am a language teacher, so naturally I expose students to the same material in spoken, written and reading formats. I also know that for students to ‘get it’, sometimes almost 20 exposures to new vocabulary are required for retention purposes. But my colleagues in other areas of education might be less familiar with such research. Therefore, there is a need to build the use of different modalities into Moodle. At the moment I use Hot Potatoes exercises to offer materials in different modalities. I checked out the School of One site. It is great to see it working in Mathematics; I can imagine it might work with science too. Do you know whether Moodle or Blackboard have features to enable this? However, since personalised learning depends on a lot of student data, privacy is again a concern. I also wonder whether the technology behind this will be too expensive for institutions.
I have been investigating the use of the Progress Bar in the last few days and realised that the dates displayed on the bar (if turned on) depend a lot on the design of the site. If you put too much in topic zero, the items will all have the same due dates. So your idea of having a Progress Bar block or wizard that offers staff just-in-time guidance when creating a site is a crucial step. For instance, if a staff member puts a whole heap of stuff in topic zero (over 200 characters), then a warning or reminder would appear, reminding them of the function of topic zero and the need to avoid the scroll of death. Guidance in site construction will enable the Progress Bar to measure things accurately.
The quiz adaptivity is still being investigated and I have a feeling it is too hard for a mere mortal to use, but stay tuned.
I have also investigated GISMO; it tracks everything, including access to the site, and it does not require setting up by the teacher. You can also isolate one resource and see who has read it. But it does not have a student view.
So my thinking is that, for all these tools to have traction in Moodle, it would be good to combine the capabilities of the Progress Bar (especially its visualisation) and GISMO into one tool in the Moodle core, so that such a useful tool can be maintained with every upgrade. I wonder whether this is doable at all and what resource implications there might be.
I will be keen to hear from you and others on this point.
Dear Michael, dear Felicia
Thank you for this blog post and the comments.
I am a project member of the MOCLog (Monitoring Online Courses with Logfiles) project http://www.switch.ch/aaa/projects/detail/SUPSI.5 here at the Swiss Distance University of Applied Sciences (www.ffhs.ch) in Switzerland.
During 2011-2012 the MOCLog-team developed two plugins (MOCLog and GISMO) based on a didactically oriented monitoring model. The first one is for administrators and the second one for teachers and students. With these plugins we want to help different stakeholders (students, teachers and the institutions) to analyze their Moodle log data.
Good news, Felicia: we have already modified the GISMO plugin so that students can also use it for their learning analytics. The newest version is available online: http://sourceforge.net/projects/gismo/files/
Concerning the relative success that you, Michael, mentioned: with GISMO, students know whether they are on track and which activities/resources they have already done or read. What is not possible at the moment is what Felicia mentioned: for students to compare their individual progress with the progress of the entire class.
For the teachers GISMO is also helpful because they can track the participation of the students by analyzing the activities they used and the resources they looked at.
The MOCLog plugin is useful for institutions and their administrators. They can see and analyze the course activity (selected courses with their number of hits on each Moodle tool) and the tool activity (each tool with the total number of hits across all courses).
The MOCLog plugin is available here: http://sourceforge.net/projects/moclog/files/.
We want to publish the final release of the plugins in August/September on moodle.org. At the moment we are implementing the tutorials/manuals for the plugins on our website moclog.ch.
With our project, we hope to contribute two helpful plugins for learning analytics to the community. For more information: http://www.moclog.ch
Thank you, Jetmire, for the wonderful work. We will download the tools and have a look. By the way, have you had GISMO reviewed by your Moodle partner? Is the present GISMO compatible with 2.3? In my communication with the people who created GISMO, they had not yet tested it with Moodle 2.3. Another thing: GISMO, the Progress Bar and activity completion in Moodle do not have a push notification facility to send alerts to lecturers or students. Would this functionality be part of the thinking in building future analytics tools? I would love to communicate with you in private on these matters; could you send me an email to felicia.zhang@canberra.edu.au? Thanks, and have a nice weekend.
No, the present GISMO version is not yet compatible with 2.3. I would also like to communicate with you concerning this functionality. Have a nice day.
Dear Jetmire, to install your extended version, do you need to uninstall the beta version without the student view and then install your version? Thanks.
Dear Felicia, I am not sure what the best way is. I think you can just install the extended version, then Moodle will recognise that this is an update.
If you like, I can ask the developers. They can surely help you.
Dear Michael, please ask for me as I am yet to hear from them. Thanks.
felicia
Dr Felicia Zhang
Project leader for the Learning analytics project,
Senior Lecturer in Chinese and Applied Linguistics,
Faculty of Arts and Design,
University of Canberra.
Building 20, room: 20C14
Tel: 61262012496
Email: Felicia.zhang@canberra.edu.au
2003 Winner of an Australian Award for University Teaching in the category of Humanities and The Arts
Thank you for a beautiful and very wide review of the tools that are available to teachers and system administrative staff when trying to figure out what is happening in a course or on their entire system.
Just wanted to share another perspective and a set of tools for a wider view…
I am currently venturing outside of Moodle to include data from other systems (e.g. SIS and CMS), which led me to check out Jasper Reports, Pentaho, BIRT ( http://www.innoventsolutions.com/open-source-reporting-review-birt-jasper-pentaho.html ) and, lately, ART ( http://art.sourceforge.net/ )
I found them all very useful, but with varying degrees of difficulty to use, all depending on the administrative staff that needs to use them.
Analytics are useful to teachers and students, as well as to institutions. They can provide more information to each of these groups and even ease their jobs in different ways.