Learning Analytics

Introduction

"There is a burgeoning use of analytics in institutions with a range of different audiences and purposes across the educational system. However, careful thought needs to be given as to what the purpose of analytics is; in other words what organisational business objectives are the analytics being applied to which could be a specific issue of concern or a broader strategic aim. Possibilities include:

  1. for individual learners to reflect on their achievements and patterns of behaviour in relation to others;
  2. as predictors of students requiring extra support and attention;
  3. to help teachers and support staff plan supporting interventions with individuals and groups;
  4. for functional groups such as course teams seeking to improve current courses or develop new curriculum offerings; and
  5. for institutional administrators taking decisions on matters such as marketing and recruitment or efficiency and effectiveness measures.

It is worth a note of caution that this kind of work is in its early stages and is attractive to stakeholders who may have very different motivations for undertaking analytics based projects and it is a good idea to surface and articulate these motivations early on. There is a moral dimension to education, and the needs of the individual learner may come into conflict with that of other stakeholder groups, such as managers and administrators." [1]

Where has learning analytics come from?

In “The State of Learning Analytics in 2012: A Review and Future Challenges” Rebecca Ferguson [2] tracks the progress of analytics for learning, with a broad progression moving through:

  1. An increasing emphasis and interest in ‘big data’ in business intelligence. Targeted advertising, ‘customers who bought x also bought y’ and personalised promotional offers are often given as examples of this sort of business intelligence.
  2. The rise of Virtual Learning Environments (VLEs), Content Management Systems (CMSs), and Management Information Systems (MIS) for education saw an increase in digital data regarding student background (often held in the MIS) and learning log data (from VLEs) – this afforded opportunity for learning from business intelligence in education.
  3. However, this raised the question of how to optimise these systems to support learning, particularly when visual signals are absent in online education – how do we know a student is engaged/understanding if we can’t see them?
  4. An increasing pressure to evidence ‘progress’, show good professional practice, and evidence rational pedagogic decision making also placed pressure on analytics for use by management in both internal and external accountability systems.
  5. Thus, analytics became not only a tool for the top levels of a stakeholder hierarchy (governments or institutions) but something in which teachers hold an important stake, given that they may be associated with accountability structures.
  6. Alongside the natural development of the field and research, this placed pressure to explore the pedagogic affordances of learning analytics, for example with social constructivist learning models exploring the use of Social Network Analysis (SNA) for supporting learning.
  7. An increasing economic pressure further emphasises the need for engagement from all stakeholders in learning analytics – for a variety of purposes (see below).

Learning Analytics and Educational Data Mining – What’s the Difference?

Educational institutions have a wealth of information at their fingertips. Universities in particular are exposed to a deluge of “big data”, and are increasingly turning to techniques from business intelligence and analytics to cope with this information. In these cases, managers in institutions have taken decisions to track particular metrics and to crunch numbers towards basic notions of success such as course completion, attendance, or grades. Learning data collected using tools such as Google Analytics, VLE log data, and other Management Information System (MIS) metrics are analysed to explore patterns. Such approaches have had some success in improving completion rates, and they may be useful for governmental, institutional and course-level users. Often, when we talk about “academic analytics” and “educational data mining” it is these sorts of analytics we refer to.

In contrast, often (but not always!) when people refer to “learning analytics” they are referring to the application of data mining techniques for specifically pedagogical purposes. Many researchers in both EDM and LA do work which crosses between the two, and Ferguson [2] summarises the distinction in similar terms.

Siemens and Baker [3] highlight some key differences between LA and EDM, summarised in the table below.


Discovery
  LA:  The end goal is to facilitate human judgement; automated models play a part in helping humans make judgements.
  EDM: The end goal is automated models; humans play a part in setting these up.

Reduction & Holism
  LA:  “Stronger emphasis on understanding systems as wholes, in their full complexity”
  EDM: “Stronger emphasis on reducing to components and analysing individual components and relationships between them”

Origins
  LA:  Stronger origins in understanding learning materials and learners using computational methods.
  EDM: Stronger origins in modelling outcomes, and software to track this.

Adaptation & Personalization
  LA:  “Greater focus on informing and empowering instructors and learners”
  EDM: “Greater focus on automated adaption (e.g. by the computer with no human in the loop)”

Techniques & Models
  LA:  Social network analysis, sentiment analysis, influence analytics, discourse analysis, learner success prediction, concept analysis, sensemaking models.
  EDM: Classification, clustering, Bayesian modelling, relationship mining, discovery models, visualization.

Table adapted from [3]

Their paper also calls for further collaboration between EDM and LA to address the analytic needs of all stakeholders at the macro (governments), meso (institutions) and micro (individuals) levels [4].

Integrating Data Sources

Challenges exist in the development of tools which facilitate the shared use of data at these various levels of analytics, to support integrated services and approaches to learning [5].

It has been suggested that an Open Learning Analytics platform be developed to integrate these various data sources [6].

At the schools level, the Shared Learning Collaborative (SLC) is working to integrate such data and create a platform which can be used across US states, saving money and providing improved analytics by using open standards that allow data sources to be integrated via APIs.

There’s a dual aim in this project: a) to give teachers a better picture of their students, and b) to help teachers and learners find resources and tools for their personalised learning (see e.g. [7] for more on the SLC in education).
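As a purely illustrative sketch (in Python), this kind of integration might look like pulling a student record from a shared platform over a documented REST API. The endpoint, fields and token below are invented for illustration; they are not the SLC's actual API.

    import requests

    # Hypothetical endpoint and fields; a real shared-platform API will differ.
    response = requests.get(
        "https://api.example-slc.org/v1/students/12345/attendance",
        headers={"Authorization": "Bearer <access-token>"},
        timeout=10,
    )
    response.raise_for_status()

    # Print each attendance record returned by the (imaginary) service.
    for record in response.json().get("records", []):
        print(record.get("date"), record.get("status"))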

Risks and Ethics

If understanding learning processes better depends on having detailed learner-produced data, then serious issues arise around privacy. Because analytics derives from data, LA inevitably raises significant concerns, particularly about the ethics of using that data. At the 1st LAK conference in 2011, presenters and attendees agreed that LA raises privacy-related concerns. Because LA mechanisms examine what learners are doing, some people may perceive them as an invasion of privacy. Monitoring learners’ daily activities behind the scenes can be seen as intrusive, and has even been called a ‘Digital Big Brother’.

"The legal and ethical considerations relating to analytics are focused on personal data processed by or on behalf of the institution. Whilst other corporate data, in areas such as financials or estates, present their own technical, operational and interpretative challenges, they do not raise such immediate legal and ethical issues relating to individuals. Such personal data of analytical value may range from formal transactions to involuntary data exhaust (such as building access, system logins, keystrokes and click streams). The data can be derived from a range of systems: *Recorded activity; student records, attendance, assignments, researcher information (CRIS).


"Given the education mission and associated governance responsibilities, broad ethical considerations are crucial regardless of legal obligation. This impacts broad considerations and concerns about the use of personal data, as well as the specific uses involved in analytics.*Variety of data - principles for collection, retention and exploitation. *Education mission - underlying issues of learning management, including social and performance engineering.*Motivation for development of analytics – mutuality, a combination of corporate, individual and general good.*Customer expectation – effective business practice, social data expectations, cultural considerations of a globalcustomer base.*Obligation to act – duty of care arising from knowledge and the consequent challenges of student and employee performance management." [8]

See also: Kay, David, Naomi Korn, and Charles Oppenheim. Legal, Risk and Ethical Aspects of Analytics in Higher Education. CETIS Analytics Series. http://publications.cetis.ac.uk/wp-content/uploads/2012/10/Legal-Risk-and-Ethical-Aspects-of-Analytics-in-Higher-Education-Vol1-No6.pdf [8]; the Wikipedia page on Learning Analytics [9]; and Section 5 of Analytics for Learning and Teaching [10].

How can we deal with these problems?

Learners have rights over their own data; there is therefore a need for learners themselves to control which data they share (Siemens, 2012). Allowing students to opt out can be one way to deal with privacy concerns. However, letting some students hide their activities might decrease the accuracy of LA, because analysis of incomplete data from only a subset of the students on a course cannot give realistic results.

LA applications should be responsible for transparent notification of their scope and for clear communication about the role of the system itself (Siemens, 2012). If the only way forward is to track all student activity, then transparency is a key principle in doing so: it should always be obvious to the learner that they are being tracked. There might even be a tool that enables users to see how they are being tracked on the system.
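As a minimal sketch of the two principles above (the data and field names are hypothetical, not drawn from any specific LA system), an analysis pipeline might exclude opted-out learners before aggregation and keep a human-readable record of exactly what was tracked, so it can be shown back to learners:

    # Hypothetical student records with an explicit opt-out flag.
    students = [
        {"id": "alice", "opted_out": False, "logins": 14},
        {"id": "bob", "opted_out": True, "logins": 3},
        {"id": "carol", "opted_out": False, "logins": 8},
    ]

    # Respect opt-outs before any analysis takes place.
    included = [s for s in students if not s["opted_out"]]
    print(f"Analysing {len(included)} of {len(students)} students (opt-outs excluded)")

    # Transparency: record what was tracked for whom, in a form learners could inspect.
    for student in included:
        print(f"Tracked for {student['id']}: logins={student['logins']}")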

However, there are still questions which need to be answered and carefully worked through, such as ‘Should learners be told that their activities are being tracked?’, ‘How much information should be provided to learners, educators, and administrators?’, ‘Should students have a chance to opt out?’, and so on (Ferguson, 2012).

Benefits

Example uses

The 3rd CETIS Analytics briefing paper gives a number of examples of analytics in use [10].

What are learning analytics? Some different types of LA

The nature of learning analytics, and their uses in pedagogic practice, means the toolset is likely to be an evolving one, and perhaps difficult to capture in its entirety. Furthermore, some tools which might be characterised as 'learning analytic' in nature are not labelled as such. However, it is possible to describe a broad overview, and indeed Simon Buckingham Shum has done so in his UNESCO Policy Briefing [12], summarised below.

LMS/VLE data analytics and visualisation

As discussed above, the rise of VLE use may be one of the drivers of increased interest in LA. However, while VLEs have long been capable of capturing log data, such data has not been in a usable format for most educators. New data dashboards allow visual access to data from VLEs and other student data. They often provide basic statistics automatically, for example comparing a student's achievement to the cohort average. More complex visual tools also exist, for example displaying the social network graph of a student cohort derived from VLE interactions.
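As a minimal sketch of the kind of statistic such a dashboard might surface (assuming a simplified, made-up log format rather than any particular VLE's export), raw log records can be summarised into per-student counts and compared with the cohort average:

    from collections import defaultdict
    from statistics import mean

    # Hypothetical VLE log records: (student_id, action, resource_id).
    logs = [
        ("alice", "view", "week1-lecture"),
        ("alice", "post", "forum-3"),
        ("bob", "view", "week1-lecture"),
        ("bob", "view", "week2-lecture"),
        ("carol", "view", "week1-lecture"),
    ]

    # Count resource views per student.
    views = defaultdict(int)
    for student, action, _resource in logs:
        if action == "view":
            views[student] += 1

    # Compare each student against the cohort average, as a dashboard panel might.
    cohort_average = mean(views.values())
    for student, count in sorted(views.items()):
        print(f"{student}: {count} views (cohort average {cohort_average:.1f})")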

Social Network Analysis

Social Network Analysis (SNA) aims to visualise the relations between learners, experts, and resources. Such analyses have been used to compare the networks of experts (higher-grade students) with those of novices, to explore the sorts of activity that might facilitate learning and the creation of effective communities of practice.

Tools include SNAPP and NAT.
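A small sketch of the underlying idea, using the networkx library and invented forum data rather than SNAPP or NAT themselves: who-replied-to-whom pairs form a directed graph, and a simple centrality measure hints at which learners sit at the centre of the discussion.

    import networkx as nx

    # Hypothetical (replier, original_poster) pairs extracted from forum logs.
    replies = [("alice", "bob"), ("carol", "bob"), ("bob", "alice"),
               ("dave", "alice"), ("carol", "alice")]

    graph = nx.DiGraph()
    graph.add_edges_from(replies)

    # Degree centrality: a crude indicator of how connected each learner is.
    for student, score in sorted(nx.degree_centrality(graph).items(),
                                 key=lambda item: item[1], reverse=True):
        print(f"{student}: degree centrality {score:.2f}")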

Discourse Analytics

While SNA tends to focus on interactions such as forum posts, discourse analytics takes such simple metrics a step further to explore the content of those posts. It builds on extensive work on the discursive properties of higher-quality discourse for learning, such as learners constructing arguments, working together successfully, building on each other's arguments, and problem solving. Discourse analytics could be run on forum interactions, live chat, or written texts including traditional essays.
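A deliberately crude sketch of the idea (real discourse analytics uses much richer linguistic models than this): rather than counting posts, score the content of each post against a small, hypothetical list of markers loosely associated with exploratory, reasoned talk.

    # Hypothetical markers; a stand-in for proper linguistic analysis.
    EXPLORATORY_MARKERS = ["because", "i think", "what if", "evidence", "however"]

    def exploratory_score(post: str) -> int:
        """Count occurrences of the marker phrases in a post (very rough proxy)."""
        text = post.lower()
        return sum(text.count(marker) for marker in EXPLORATORY_MARKERS)

    posts = [
        "I think this works because the evidence supports it; however, what if it doesn't scale?",
        "Agreed.",
    ]
    for post in posts:
        print(exploratory_score(post), "-", post)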

Predictive Analytics

Predictive analytics aims to bring together static data (e.g. demographic information) and dynamic data (forum logins, lecture views, etc.) to predict success. This is often exam-targeted (at present one of the best predictors of exam success at the end of a course is grades at the start of the course), but predictive analytics could also be used to model 'soft skills' such as problem-solving abilities.
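As an illustrative sketch only (a tiny invented dataset, scikit-learn's off-the-shelf logistic regression, and no validation), static and dynamic features can be combined into a single model that estimates a probability of passing:

    from sklearn.linear_model import LogisticRegression

    # Each row: [prior_grade, forum_logins, lecture_views] -- hypothetical data.
    X = [[72, 14, 20], [48, 2, 5], [65, 9, 12], [55, 1, 3], [80, 20, 25], [40, 0, 2]]
    y = [1, 0, 1, 0, 1, 0]  # 1 = passed the end-of-course exam

    model = LogisticRegression().fit(X, y)

    # Estimated probability of passing for a new student's features.
    print(model.predict_proba([[60, 4, 6]])[0][1])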

Adaptive learning

Adaptive learning analytics is designed around systems that give fine-grained feedback on specific elements of understanding within a particular subject. It aims to build a model of a learner's understanding of a topic, concept or skill, and of their areas of weakness, in order to give specific guidance for students to improve their learning. Such systems are resource intensive in that they require conceptualisation of the key elements of the area being assessed, and fine-grained 'discrimination' assessments, in order to give feedback on particular misunderstandings. However, they have the potential to be a very powerful tool, particularly in the context of assessment for learning.

Examples - Grockit, Knewton, Open Learning Initiative
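A minimal sketch of the general idea, not how Grockit, Knewton or the Open Learning Initiative actually work: tag each question with a concept, track per-concept accuracy, and trigger targeted guidance where accuracy is low (the concepts and threshold here are invented).

    from collections import defaultdict

    # Hypothetical (concept, answered_correctly) pairs for one student.
    attempts = [
        ("fractions", True), ("fractions", False), ("fractions", False),
        ("decimals", True), ("decimals", True), ("percentages", False),
    ]

    totals, correct = defaultdict(int), defaultdict(int)
    for concept, was_correct in attempts:
        totals[concept] += 1
        correct[concept] += int(was_correct)

    # Flag concepts where accuracy falls below an (arbitrary) mastery threshold.
    for concept in totals:
        accuracy = correct[concept] / totals[concept]
        if accuracy < 0.5:
            print(f"Suggest revision material for '{concept}' (accuracy {accuracy:.0%})")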

The role of dashboards and visualisations

In the learning process it is very beneficial for learners, educators and administrators to have a visual representation of learner-produced data which shows relations and compares it with other learners' data (Duval, 2011), because the activities that have taken place, and their effect on learning, can then be interpreted clearly by the intended audience. These visual representations are collected in one place, called a dashboard. Dashboards are the sense-making component of LA applications, in which data analysis is presented as bars, charts, and diagrams to the intended audience in order to assist them in making decisions about teaching and learning (Siemens et al., 2011).
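A sketch of a single dashboard panel, assuming matplotlib and made-up activity counts: one learner's weekly VLE activity plotted next to the cohort average for comparison.

    import matplotlib.pyplot as plt

    weeks = ["Week 1", "Week 2", "Week 3", "Week 4"]
    learner = [12, 9, 4, 7]          # hypothetical activity counts for one learner
    cohort_average = [10, 10, 8, 9]  # hypothetical cohort averages

    # Side-by-side bars: learner versus cohort for each week.
    positions = range(len(weeks))
    plt.bar([p - 0.2 for p in positions], learner, width=0.4, label="Learner")
    plt.bar([p + 0.2 for p in positions], cohort_average, width=0.4, label="Cohort average")
    plt.xticks(list(positions), weeks)
    plt.ylabel("VLE actions")
    plt.legend()
    plt.savefig("dashboard_panel.png")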

Social Learning Analytics

SLA aims to capture meaningful data regarding the role of social interaction in learning, including discourse and the structure of social networks [13].

Disposition Analytics

Disposition Analytics [14][15] aims to capture meaningful data regarding students' dispositions towards their own learning; for example, learners who could be characterised as "curious" might ask more questions, a disposition which could be captured and analysed in meaningful ways [15].
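Taking the passage's own example, a very small sketch (invented posts, with question marks used as a crude stand-in for question-asking) of how a "curiosity" signal might be counted per learner:

    # Hypothetical forum posts grouped by learner.
    posts_by_learner = {
        "alice": ["Why does this happen?", "What would change if we doubled n?"],
        "bob": ["Here is my answer.", "Done."],
    }

    # Count question marks as a rough proxy for question-asking behaviour.
    for learner, posts in posts_by_learner.items():
        questions = sum(post.count("?") for post in posts)
        print(f"{learner}: {questions} questions across {len(posts)} posts")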

I want to use LA/EDM – how do I set it up?

See also section 4 of Analytics for Learning and Teaching [10].

Data Ecosystem

Most VLEs now offer some basic data on content use and user interactions (although these functions may need to be turned on). In addition, there are an increasing number of tools to analyse this data, combine and analyse various sources of data on the open web (e.g. twitter), and visualise the data for sensemaking. See below for a discussion of some of these.

It should also be borne in mind that, if we are interested in explicitly learning-focussed analytics, then we should think about how and why the analytics is deployed.

We should also think about whether LA is a disruptive or an evolutionary technology: does it reinforce, prop up, support and develop current teaching, assessment, and learning regimes, or does it offer an opportunity to completely redefine what education looks like? See the section below on this topic.

Tools for Analytics

“Commercial applications include Mixpanel analytics, which offers real-time data visualization documenting how users are engaging with material on a website. Similarly, Userfly, designed for usability testing, provides the ability to record the behavior of visitors to websites, and then play it back for analysis. Moving in a different direction, Gephi is a free, open source interactive visualization and exploration platform described as “Photoshop but for data.” It is connected to exploratory data analysis.

Among the tools developed specifically for learning analytics is Socrato, an online learning analytics service that generates diagnostic and performance reports. SNAPP (Social Networks Adapting Pedagogical Practice), developed by the University of Wollongong in Australia, is a tool designed to expand on the basic information gathered within learning management systems; this information tends to center on how often and for how long students interact with posted material. SNAPP instead visualizes how students interact with discussion forum posts, giving significance to the socio-constructivist activities of students.” [11]

LA – Revolution or Evolution? Disruption or business as usual…

How do LA relate to other levers for change and technologies?

  1. Assessment [17]
  2. OER and the way they’re used (paradata +) – this matters for the flipped classroom
  3. Exploring how to create effective online learning
  4. Use of digital technologies, contextual information, and usage information.
  5. Many of these things are easier to do in the cloud
  6. But there are ethical issues to all of this
  7. Games based learning
  8. It may be possible to explore CPD (continuing professional development) in the context of LA, providing support for how LA are used with students through the use of analytics on the contact points with those students.

See also

Resources (readings, courses, etc.).

References

  1. Powell, Stephen, and Sheila MacNeill. Institutional Readiness for Analytics: A Briefing Paper. CETIS Analytics Series. JISC CETIS, December 2012. http://publications.cetis.ac.uk/wp-content/uploads/2012/12/Institutional-Readiness-for-Analytics-Vol1-No8.pdf
  2. Ferguson, Rebecca. The State of Learning Analytics in 2012: A Review and Future Challenges. Technical Report. Knowledge Media Institute, The Open University, UK, 2012. http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf
  3. Siemens, G., and R. S. J. Baker. “Learning Analytics and Educational Data Mining: Towards Communication and Collaboration.” In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, 252–254, 2012. http://dl.acm.org/citation.cfm?id=2330661
  4. Buckingham Shum, Simon. “Our Learning Analytics Are Our Pedagogy.” Macquarie University, 2012. http://www.slideshare.net/sbs/our-learning-analytics-are-our-pedagogy
  5. Knight, Simon, Simon Buckingham Shum, and Karen Littleton. “Collaborative Sensemaking in Learning Analytics.” In CSCW and Education Workshop, San Antonio, Texas, USA, 2013.
  6. Siemens, George, Dragan Gasevic, Caroline Haythornthwaite, Shane Dawson, Simon Buckingham Shum, Rebecca Ferguson, Erik Duval, Katrien Verbert, and Ryan S.J.d. Baker. “Open Learning Analytics: An Integrated & Modularized Platform.” Society for Learning Analytics Research (SoLAR), July 2011. http://solaresearch.org/OpenLearningAnalytics.pdf
  7. Knight, Simon. “Linking Existing Data - See the Connections.” Nominet Trust Knowledge Centre, October 25, 2012. http://www.nominettrust.org.uk/knowledge-centre/blogs/linking-existing-data-see-connections
  8. Kay, David, Naomi Korn, and Charles Oppenheim. Legal, Risk and Ethical Aspects of Analytics in Higher Education. CETIS Analytics Series. JISC CETIS, 2012. http://publications.cetis.ac.uk/wp-content/uploads/2012/10/Legal-Risk-and-Ethical-Aspects-of-Analytics-in-Higher-Education-Vol1-No6.pdf
  9. Wikipedia. “Learning Analytics.” http://en.wikipedia.org/wiki/Learning_analytics
  10. Van Harmelen, Mark, and David Workman. Analytics for Learning and Teaching: A Briefing Paper. CETIS Analytics Series. JISC CETIS, November 2012. http://publications.cetis.ac.uk/wp-content/uploads/2012/11/Analytics-for-Learning-and-Teaching-Vol1-No3.pdf
  11. Johnson, L., R. Smith, H. Willis, A. Levine, and K. Haywood. The 2011 Horizon Report. Austin, Texas: The New Media Consortium, 2011. http://www.nmc.org/pdf/2011-Horizon-Report.pdf (learning analytics section: http://wp.nmc.org/horizon2011/sections/learning-analytics/#0)
  12. Buckingham Shum, Simon. Learning Analytics Policy Briefing. UNESCO, 2012. http://iite.unesco.org/pics/publications/en/files/3214711.pdf
  13. Buckingham Shum, S., and R. Ferguson. “Social Learning Analytics.” Educational Technology & Society (Special Issue on Learning & Knowledge Analytics, eds. G. Siemens & D. Gašević) 15, no. 3 (2012): 3–26. http://www.ifets.info Open Access Eprint: http://oro.open.ac.uk/34092
  14. Brown, M. Learning Analytics: Moving from Concept to Practice. EDUCAUSE Learning Initiative Briefing, 2012. http://www.educause.edu/library/resources/learning-analytics-moving-concept-practice
  15. Buckingham Shum, S., and R. Deakin Crick. “Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and Learning Analytics.” In Proceedings of the 2nd International Conference on Learning Analytics & Knowledge (Vancouver, 29 Apr–2 May 2012), 92–101. New York: ACM. DOI: http://dx.doi.org/10.1145/2330601.2330629 Eprint: http://oro.open.ac.uk/32823
  16. Siemens, George. “Sensemaking: Beyond Analytics as a Technical Activity.” April 11, 2012. http://www.slideshare.net/gsiemens/eli-2012-sensemaking-analytics
  17. Booth, Melanie. “Learning Analytics: The New Black.” EDUCAUSE Review, August 2012. http://www.educause.edu/ero/article/learning-analytics-new-black

