What Is Learning Analytics and Why Does It Matter for Indonesian Education?

Introduction

Digital learning has transformed access to education, offering flexible and scalable learning opportunities to diverse learners worldwide.

Fig. 1. AI-generated image of digital learning. “Create image that represent digital learning” prompt. Gemini.

In Indonesia, this momentum has been strengthened through national programs such as Kampus Merdeka, Kominfo Digital Talent, Badan Erekraf Developer Training, and government–industry collaborations like Google Bangkit Academy and Coding Camp powered by DBS Foundation.

Alongside these initiatives, Indonesia’s edtech ecosystem continues to grow. Ruangguru serves around forty million learners across the country, including students and teachers. Dicoding Indonesia delivers industry-aligned programming and technology courses, impacting more than one million learners, while RevoU supports over 2.5 million learners through professional upskilling programs. Many universities are also developing their own online learning platforms, collectively contributing to a rapidly expanding pool of digital learning data.

These developments mean that learning increasingly takes place on digital platforms, creating more detailed and granular data about how learning happens. The key question for educators, institutions, and platforms is how to use these data to understand and improve learning at scale. Learning analytics offers one path toward addressing this challenge.

What Is Learning Analytics

The foundational definition of learning analytics comes from the first Learning Analytics and Knowledge conference in 2011, which describes learning analytics as the measurement, collection, analysis, and reporting of data about learners and their contexts for the purpose of understanding and optimizing learning and the environments in which it occurs. This definition remains a widely accepted reference point across research and practice.

As learning analytics has developed, scholars have encouraged a broader, more theory-informed perspective. Gašević, Dawson, and Siemens (2015) argue that learning analytics must remain centered on learning, meaning it should be grounded in learning theory, aligned with the intentional design of learning experiences, and focused on insights that educators can apply in practice. Discussions within the Society for Learning Analytics Research similarly highlight that analytics requires interpretation and communication, not merely computation, to support decisions within classrooms, programs, and institutions (SoLAR, n.d.).

In practice, learning analytics involves collecting data about learning, making sense of that data using appropriate theories and methods, and translating insights into actions that support educators and learners. It is both a research field and a growing professional practice across education systems.

What Counts as Learning Data

When people hear the term “learning analytics,” they often think of test scores, pass rates, and completion statistics. In reality, learning analytics draws from far richer and more diverse forms of data. Activity data from digital platforms can show which modules students open, how long they spend on a page, how many attempts they make on quizzes, and the sequences they follow through a course. Assessment data include scores, rubric ratings, automated code checker results, project evaluations, and peer assessments. Interaction data capture discussion forum posts, peer feedback, and collaborative work logs. Perception and self-report data reflect learner motivation, self-regulation, confidence, and feedback experiences (Bienkowski, Feng, and Means, 2012).
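To make this concrete, a single trace of learner activity on a digital platform can be represented as a simple record. The sketch below is purely illustrative: the field names, event types, and values are hypothetical and do not come from any specific platform.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ActivityEvent:
    """One trace of learner activity, as a platform log might store it."""
    learner_id: str
    module_id: str
    event_type: str          # e.g. "page_view", "quiz_attempt", "forum_post"
    timestamp: datetime
    duration_seconds: float  # time spent, where applicable
    score: Optional[float] = None  # present only for assessment events

# A short, invented event sequence for one learner
events = [
    ActivityEvent("s001", "mod-3", "page_view", datetime(2025, 1, 10, 9, 0), 420.0),
    ActivityEvent("s001", "mod-3", "quiz_attempt", datetime(2025, 1, 10, 9, 8), 300.0, score=0.6),
    ActivityEvent("s001", "mod-3", "quiz_attempt", datetime(2025, 1, 10, 9, 15), 240.0, score=0.85),
]

# Even this tiny log supports process-level questions,
# not just final outcomes:
attempts = [e for e in events if e.event_type == "quiz_attempt"]
print(len(attempts))                   # how many quiz attempts were made
print(max(e.score for e in attempts))  # best score after retrying
```

The point is not the code itself but what it captures: sequence, timing, and repeated attempts are visible in activity data, whereas a gradebook would record only the final score.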

In Indonesia, these kinds of data are already collected in many contexts. Learning Management Systems store activity and assessment data. Coding and cloud platforms record detailed performance traces, such as submission histories and error patterns. National programs gather demographic, registration, and completion data across large cohorts.

Together, these diverse data sources enable a multidimensional view of learning. Rather than capturing only final outcomes, learning analytics helps educators and institutions observe learning processes as they unfold, opening opportunities for more responsive, evidence-based support.

From Data to Insight: Approaches in Learning Analytics

Learning analytics consists of several complementary approaches. Descriptive analytics helps us understand what has happened, such as how many learners completed a module or how performance varies across cohorts. Diagnostic analytics explores why specific patterns occur, for example, by identifying which quiz items commonly cause difficulty. Predictive analytics looks ahead, estimating what might happen next. Early warning models can identify learners who may struggle based on their engagement patterns. Prescriptive or actionable analytics considers what educators can do in response, such as recommending specific resources to learners or suggesting which students may benefit from timely outreach (Siemens, 2013; Bienkowski, Feng, and Means, 2012).
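The contrast between these approaches can be sketched in a few lines of Python. The data and the risk threshold below are entirely invented for illustration: a real early-warning model would be grounded in learning theory and validated against historical cohort data, not a hand-picked cutoff.

```python
# Hypothetical per-learner engagement summaries (invented numbers).
learners = [
    {"id": "s001", "modules_completed": 8, "total_modules": 10, "logins_last_14d": 9},
    {"id": "s002", "modules_completed": 3, "total_modules": 10, "logins_last_14d": 1},
    {"id": "s003", "modules_completed": 10, "total_modules": 10, "logins_last_14d": 6},
]

# Descriptive analytics: what has happened?
# Average module completion across the cohort.
avg_completion = sum(
    l["modules_completed"] / l["total_modules"] for l in learners
) / len(learners)
print(f"Average completion: {avg_completion:.0%}")

# Predictive analytics (toy rule): who might struggle next?
# Flag learners with low recent engagement AND low completion.
# The thresholds here are illustrative, not evidence-based cutoffs.
at_risk = [
    l["id"]
    for l in learners
    if l["logins_last_14d"] < 3
    and l["modules_completed"] / l["total_modules"] < 0.5
]
print("Flagged for outreach:", at_risk)
```

Prescriptive analytics would then act on such a flag, for example by prompting a mentor to reach out, which is a pedagogical decision rather than a computational one.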

Across these approaches, researchers emphasize that analytics become meaningful only when they are aligned with pedagogical goals and used to inform actionable decisions about course design, feedback strategies, mentoring, or institutional planning (Gašević, Dawson, and Siemens, 2015).

Learning Analytics, Learning Theory, and Learning Design

A key insight in learning analytics is that it sits at the intersection of data science, learning theory, and learning design. Data science provides the analytical methods to collect, process, and model data. Learning theory offers the conceptual lenses needed to interpret patterns in relation to motivation, cognition, self-regulation, and collaboration. Learning design shapes the environment in which learning happens, including the activities, interactions, and assessments that generate the learning data.

Studies in the field show that learning analytics must be grounded in theory and aligned with instructional design intentions to produce meaningful insights. When analytics is disconnected from theory, there is a risk of oversimplifying or misrepresenting complex learning processes (Gašević, Dawson, and Siemens, 2015). For example, making sense of indicators related to self-regulation requires an understanding of how learners plan, monitor, and reflect on their learning. Interpreting data about group work similarly requires theories that explain collaboration and social interaction, rather than relying on participation counts alone (Kovanović et al., 2017).

In Indonesia, this means learning analytics cannot be treated as a purely technical initiative. Effective implementation requires collaboration among educators, instructional designers, technologists, and institutional leaders who understand how data connects to real teaching and learning practices.

Why Learning Analytics Matters for Indonesian Education

Learning analytics offers significant opportunities across Indonesia’s education system. At the system level, national programs generate data that can help policymakers understand learner participation, regional disparities, and the effectiveness of different learning models. At the institutional level, analytics can enhance curriculum evaluation, academic support, and quality assurance by providing evidence about learners’ actual experiences.

At the classroom level, educators can use analytics to identify learners who may need additional support, refine materials that learners find confusing, and create more responsive learning environments. In industry-aligned training, analytics can help ensure that learners acquire skills that match workforce needs and help organizations understand the long-term impact of training.

When implemented responsibly, learning analytics can support Indonesia’s broader goal of providing equitable, high-quality education that prepares learners for a rapidly evolving digital landscape.

How LAI Lab Fits In and What Comes Next

Learning Analytics Indonesia (LAI Lab) was created to help connect global learning analytics research with the everyday realities of Indonesian education. We focus on transforming learning data into insights that are understandable, theory-informed, and usable for educators, EdTech teams, and institutions. Our work brings together perspectives from data science, learning theory, and learning design, with a consistent emphasis on ethics, equity, and the lived experiences of learners in Indonesia.

This article is the first in our Foundations of Learning Analytics series. In the following pieces, we will introduce practical ways to interpret learning data, share case studies from Indonesian institutions and programs, and explore how analytics can support feedback, self-regulated learning, and learner support in a human-centered manner. 

If you are an educator, developer, researcher, or student who is curious about learning analytics, we invite you to follow our work and grow with our community as we build a more data-informed and human-centered future for learning in Indonesia.

References

Bienkowski, M., Feng, M., and Means, B. 2012. Enhancing teaching and learning through educational data mining and learning analytics. US Department of Education.

Gašević, D., Dawson, S., and Siemens, G. 2015. Let’s not forget learning analytics are about learning. TechTrends, 59, 64–71. https://doi.org/10.1007/s11528-014-0822-x 

Kovanović, V., Joksimović, S., Hatala, M., Gašević, D., and Siemens, G. 2017. Content analytics: The definition, scope, and an overview of published research. In C. Lang, G. Siemens, A. F. Wise, and D. Gašević (Eds.), Handbook of learning analytics. Society for Learning Analytics Research. https://doi.org/10.18608/hla17

SoLAR (Society for Learning Analytics Research). n.d. What is learning analytics? Retrieved from https://www.solaresearch.org

Image generated by Google Gemini, December 1, 2025, in response to the prompt “create image that represent digital learning,” https://gemini.google.com.
