In This Article: Data Collection in Educational Research

  • Introduction
  • General Overviews
  • General Quantitative Overviews
  • Questionnaires
  • Quantitative Interviewing
  • Quantitative Observation
  • Technical Properties
  • General Qualitative Overviews
  • In-Depth Interviewing
  • Focus Groups
  • Qualitative Observation
  • Qualitative Document Analysis
  • Visual Analysis

Related Articles


  • Case Study in Education Research
  • Grounded Theory
  • Methodologies for Conducting Education Research
  • Mixed Methods Research
  • Qualitative Research Design
  • Statistical Assumptions
  • Using Ethnography in Educational Research


Data Collection in Educational Research

by James H. McMillan and Laura P. Gogia

Last reviewed: 30 June 2014 | Last modified: 30 June 2014 | DOI: 10.1093/obo/9780199756810-0087

Introduction

Data collection methods in educational research are used to gather information that is then analyzed and interpreted. As such, data collection is a very important step in conducting research and can influence results significantly. Once the research question and sources of data are identified, appropriate methods of data collection are determined. Data collection includes a broad range of more specific techniques. Historically, much of the data collection performed in educational research depended on methods developed for studies in the field of psychology, a discipline that took what is termed a "quantitative" approach. This involves using instruments, scales, tests, and structured observation and interviewing. By the mid- to late twentieth century, other disciplines, such as anthropology and sociology, began to influence educational researchers. Forms of data collection broadened to include what are now called "qualitative" methods, with an emphasis on narratives, participant perspectives, and less structured observation and interviewing. As contemporary educational researchers also draw from fields such as business, political science, and medicine, data collection in education has become a multidisciplinary phenomenon. Because data collection is such a broad topic, general overviews that attempt to cover all or most techniques tend to offer introductory treatments. Few texts, however, provide comprehensive coverage of every data collection technique. Instead, some cover techniques appropriate for either quantitative or qualitative research approaches, and still others focus on one or two data collection methods within those two research contexts. Consequently, after presenting general overviews, this entry is organized into sections on quantitative and qualitative data collection, which are in turn subdivided into the major types of quantitative and qualitative data collection techniques. While there are some data collection techniques specific to mixed methods research design, which implies a combination of qualitative and quantitative research methodologies, these procedures are not emphasized in the present article; readers are referred to the Oxford Bibliographies article Mixed Methods Research by Nancy Leech for a comprehensive treatment of mixed methods data collection techniques. To locate sources for this article, extensive searches were performed using general-use Internet search engines and educational, psychological, and social science research databases. These searches included keywords around data collection and research methods, as well as specific data collection techniques such as surveys, tests, focus groups, and observation. Frequently cited texts and articles, the most recent editions available at the time, and sources specific to educational research were given priority. Once these sources were identified, their suggested readings and reference lists were mined for other potential sources, and works or scholars found in multiple reference lists were investigated. When applicable, book reviews in peer-reviewed journals were located and taken into account when curating sources. Sources that demonstrated a high level of impact or offered unique coverage of the topic were included.

General Overviews

General educational research overviews typically include several chapters on data collection, organized into qualitative and quantitative approaches. As a rule they are updated frequently, so they offer timely discussions of methodological trends. Most of them are introductory in nature, written for student researchers. Because of the influence of psychology and other social sciences on the development of data collection in educational research, representative works from psychology (Trochim 2006) and the general social sciences (Robson 2011) are included. Available online, Trochim 2006 is a reader-friendly introduction that provides succinct explanations of most quantitative and qualitative approaches. Olsen 2012 is helpful in showing how data collection techniques used in other disciplines have implications for educational studies. Specific to education, Gall, et al. 2007 is a frequently cited text that covers most educational data collection techniques, although it tends to emphasize more traditional quantitative approaches. Johnson and Christensen 2014 offers a more balanced treatment meant for novice researchers and educational research consumers. Cohen, et al. 2011 also provides a balanced approach, but from a British perspective. Fielding, et al. 2008 offers practical advice on recently developed forms of online data collection, with special attention given to the ethical ramifications of Internet-based data collection. Finally, Arthur, et al. 2012 is unique in this section in that it is an edited work offering short overviews of data collection techniques authored by contemporary leading experts.

Arthur, James, Michael Waring, Robert Coe, and Larry Hedges, eds. 2012. Research methods and methodologies in education. London: SAGE.

A diverse edited text discussing trends in study designs, data collection, and data analysis. It includes twelve chapters devoted to different forms of data collection, written by authors who have recently published extensively on the topic. Annotated bibliographies found at the end of each chapter provide guidance for further reading.

Cohen, Louis, Lawrence Manion, and Keith Morrison. 2011. Research methods in education. 7th ed. London: Routledge.

This long-running, bestselling, comprehensive source offers practical advice with clear theoretical foundations. The newest edition has undergone significant revision. Specific to data collection, revisions include new chapters devoted to data collection via the Internet and visual media. Slides highlighting main points are available on a supplementary website.

Fielding, Nigel, Raymond Lee, and Grant Blank. 2008. The SAGE handbook of online research methods. Thousand Oaks, CA: SAGE.

This extensive handbook presents chapters on Internet research design and data collection written by leading scholars in the field. It discusses using the Internet as an archival resource and a research tool, focusing on the most recent trends in multidisciplinary Internet research.

Gall, Meredith, Joyce Gall, and Walter Borg. 2007. Educational research: An introduction. 8th ed. White Plains, NY: Pearson.

A long-standing, well-respected, nuts-and-bolts perspective on data collection meant to prepare students for conducting original research. Although it tends to emphasize quantitative research methodologies, it has a uniquely rich chapter on historical document analysis.

Johnson, Burke, and Larry Christensen. 2014. Educational research: Quantitative, qualitative, and mixed approaches. 5th ed. Thousand Oaks, CA: SAGE.

A comprehensive introductory text for the consumer and the would-be researcher, with extensive lists of additional resources for gathering all types of data. It discusses quantitative and qualitative research methodologies and data collection evenly but provides extended coverage of questionnaire construction.

Olsen, Wendy. 2012. Data collection: Key debates and methods in social research. London: SAGE.

This recently published toolkit of quantitative, qualitative, and mixed method approaches to data collection provides a more contemporary introduction for both students and research professionals. It offers a helpful overview of data collection as an integral part of research in several different fields of study.

Robson, Colin. 2011. Real world research: A resource for users of social research methods in applied settings. West Sussex, UK: Wiley.

This introductory text is intended for researchers across the social sciences. There is an applied, integrated emphasis on contemporary quantitative and qualitative data collection techniques in a separate section of the book, covering individual and focus group observations, surveys, unstructured and structured interviewing, and tests.

Trochim, William. 2006. Research methods knowledge base.

A free online hypertext textbook on applied social research methods. Data collection techniques associated with qualitative and quantitative research are covered comprehensively. Foundational information appropriate for undergraduates and early graduate students is presented through a series of easy-to-navigate and intuitively ordered webpages. A printed edition, written with James Donnelly, is available for purchase (Atomic Dog/Cengage Learning, 2008).


Review Article | Open access | Published: 22 June 2020

Teaching analytics, value and tools for teacher data literacy: a systematic and tripartite approach

Ifeanyi Glory Ndukwe & Ben Kei Daniel

International Journal of Educational Technology in Higher Education, volume 17, Article number 22 (2020)


Abstract

Teaching Analytics (TA) is a new theoretical approach that combines teaching expertise, visual analytics and design-based research to support teachers' diagnostic pedagogical ability to use data and evidence to improve the quality of teaching. TA is now gaining prominence because it offers enormous opportunities to teachers. It also identifies optimal ways in which teaching performance can be enhanced. Further, TA provides a platform for teachers to use data to reflect on teaching outcomes. The outcomes of TA can be used to engage teachers in a meaningful dialogue to improve the quality of teaching. Arguably, teachers need to develop their data literacy and data inquiry skills to learn about teaching challenges. These skills depend on understanding the connection between TA, Learning Analytics (LA) and Learning Design (LD). Additionally, teachers need to understand how choices of particular pedagogies and the LD can enhance their teaching experience. In other words, teachers need to equip themselves with the knowledge necessary to understand the complexity of teaching and the learning environment. Providing teachers with access to analytics associated with their teaching practice and learning outcomes can improve the quality of teaching practice. This research aims to explore current TA-related discussions in the literature, to provide a generic conception of the meaning and value of TA. The review was intended to inform the establishment of a framework describing the various aspects of TA and to develop a model that can enable us to gain more insights into how TA can help teachers improve teaching practices and learning outcomes. The Tripartite model was adopted to carry out a comprehensive, systematic and critical analysis of the TA literature. To understand the current state of the art relating to TA, and its implications for the future, we reviewed articles published from 2012 to 2019. The results of this review have led to the development of a conceptual framework for TA and established the boundaries between TA and LA. From the analysis of the literature, we propose a Teaching Outcome Model (TOM) as a theoretical lens to guide teachers and researchers in engaging with data relating to teaching activities, in order to improve the quality of teaching.

Introduction

Educational institutions today operate in an information era in which data are generated automatically by machines rather than manually; hence the emergence of big data in education (Daniel 2015). Analytics seeks to acquire insightful information from data that would not ordinarily be visible to the naked eye, except through the application of state-of-the-art models and methods that reveal hidden patterns and relationships in data. Analytics plays a vital role in reforming the educational sector to keep up with the fast pace at which data are generated, and the extent to which such data can be used to transform our institutions effectively. For example, with the extensive use of online and blended learning platforms, the application of analytics will enable educators at all levels to gain new insights into how people learn and how teachers can teach better. However, the current discourses on the use of analytics in Higher Education (HE) are focused on the enormous opportunities analytics offer to various stakeholders, including learners, teachers, researchers and administrators.

In the last decade, an extensive literature has proposed two waves of analytics to support learning and improve educational outcomes, operations and processes. The first form of Business Intelligence introduced in the educational industry was Academic Analytics (AA), which describes data collected on the performance of academic programmes to inform policy. Learning Analytics (LA) then emerged as the second wave of analytics, and it is one of the fastest-growing areas of research within the broader use of analytics in the context of education. LA is defined as the "measurement, collection, analysis and reporting of data about the learner and their learning contexts for understanding and optimising learning and the environments in which it occurs" (Elias 2011). LA was introduced to attend to teaching performance and learning outcomes (Anderson 2003; Macfadyen and Dawson 2012). Typical research areas in LA include student retention, predicting at-risk students and personalised learning, which in turn are highly student-driven (Beer et al. 2009; Leitner et al. 2017; Pascual-Miguel et al. 2011; Ramos and Yudko 2008). For instance, Griffiths (2017) employed LA to monitor students' engagement and behavioural patterns in a computer-supported collaborative learning environment to predict at-risk students. Similarly, Rienties et al. (2016) looked at LA approaches in their capacity to enhance learner retention, engagement and satisfaction. However, in the last decade, LA research has focused mostly on the learner and on data collection based on digital data traces from Learning Management Systems (LMS) (Ferguson 2012), not the physical classroom.

Teaching Analytics (TA) is a new theoretical approach that combines teaching expertise, visual analytics and design-based research to support the teacher with the diagnostic and analytic pedagogical ability to improve the quality of teaching. Though it is a new phenomenon, TA is gaining prominence because it offers enormous opportunities to teachers.

Research on TA pays special attention to teacher professional practice, offering data literacy and visual analytics tools and methods (Sergis et al. 2017). Hence, TA is the collection and use of data related to teaching and learning activities and environments to inform teaching practice and to attain specific learning outcomes. Some authors have combined the LA and TA approaches into Teaching and Learning Analytics (TLA) (Sergis and Sampson 2017; Sergis and Sampson 2016). All of this demonstrates the rising interest in collecting evidence from educational settings for awareness, reflection, or decision making, among other purposes. However, the most frequent data collected and analysed for TA focus on the students (e.g., different discussion and learning activities and some sensor data such as eye-tracking, position or physical actions) (Sergis and Sampson 2017), rather than on monitoring teacher activities. Providing teachers with access to analytics of their teaching, and helping them use such analytics effectively to improve their teaching process, is a critical endeavour. Other human-mediated data gathering, in the form of student feedback, self and peer observations or teacher diaries, can also be employed to enrich TA further. For instance, visual representations such as dashboards can be used to present teaching data to help teachers reflect and make appropriate decisions to improve the quality of teaching. In other words, TA can be regarded as a reconceptualisation of LA for teachers, aimed at improving teaching performance and learning outcomes. The concept of TA is central to the growing data-rich, technology-enhanced learning and teaching environment (Flavin 2017; Saye and Brush 2007). Further, it provides teachers with the opportunity to engage in data-informed pedagogical improvement.

While LA is undeniably an essential area of research in educational technology and the learning sciences, automatically extracted data from an educational platform mainly provide an overview of student activities and participation. Such data hardly indicate the role of the teacher in these activities, and may not otherwise be relevant to teachers' individual needs (for Teaching Professional Development (TPD) or the improvement of their classroom practice). Many teachers generally lack adequate data literacy skills (Sun et al. 2016). Teacher data literacy and teacher inquiry using data are the foundational skills underpinning TA (Kaser and Halbert 2014). The development of these two skills depends on understanding the connection between TA, LA and Learning Design (LD). In other words, teachers need to equip themselves with knowledge through interaction with sophisticated data structures and analytics. Hence, TA is critical to improving teachers' low efficacy towards educational data.

Additionally, technology has expanded the horizon of analytics to various forms of educational settings. As such, the educational research landscape needs efficient tools for collecting and analysing data, which in turn requires explicit guidance on how to use the findings to inform teaching and learning (McKenney and Mor 2015). Increasing the possibilities for teachers to engage with data to assess what works for the students and courses they teach is instrumental to quality (Van Harmelen and Workman 2012). TA provides optimal ways of analysing data obtained from teaching activities and the environment in which instruction occurs. Hence, more research is required to explore how teachers can engage with data associated with teaching to encourage teacher reflection, improve the quality of teaching, and provide useful insights into ways teachers could be supported to interact with teaching data effectively. However, it is also essential to be aware that there are critical challenges associated with data collection. Moreover, designing the information flow that facilitates evidence-based decision-making requires addressing issues such as the potential risk of bias, ethical and privacy concerns, and inadequate knowledge of how to engage with analytics effectively.

To ensure that instructional design and learning support are evidence-based, it is essential to empower teachers with the necessary knowledge of analytics and data literacy. The lack of such knowledge can lead to poor interpretation of analytics, which in turn can lead to ill-informed decisions that significantly affect students, creating more inequalities in access to learning opportunities and support regimes. Teacher data literacy refers to a teacher's ability to engage effectively with data and analytics to make better pedagogical decisions.

The primary outcome of TA is to guide educational researchers to develop better strategies to support the development of teachers' data literacy skills and knowledge. However, for teachers to embrace data-driven approaches to learning design, there is a need to implement bottom-up approaches that include teachers as main stakeholders of a data literacy project, rather than as end-users of data.

The purpose of this research is to explore the current discussions in the literature relating to TA. A vital goal of the review was to extend our understanding of the conceptions and value of TA. Secondly, we wanted to contextualise the notion of TA and develop various concepts around TA to establish a framework that describes multiple aspects of TA. Thirdly, we examined the different data collections/sources, machine learning algorithms, visualisations and actions associated with TA. The intended outcome is a model that provides a guide for teachers to improve teaching practice and ultimately enhance learning outcomes.

The research employed a systematic and critical analysis of articles published from 2012 to 2019. A total of 58 publications were initially identified and compiled from the Scopus database. After analysing the search results, 31 papers were selected for review. This review examined research relating to the utilisation of analytics associated with teaching and teacher activities and provided conceptual clarity on TA. We found that the literature relating to the conception and optimisation of TA is sporadic and scarce; as such, the notion of TA is theoretically underdeveloped.

Methods and procedures

This research used the Tripartite model (Daniel and Harland 2017), illustrated in Fig. 1, to guide the systematic literature review. The Tripartite model draws from systematic review approaches, such as the Cochrane approach, widely used in the analysis of rigorous studies to provide the best evidence. Moreover, the Tripartite model offers a comprehensive view and presentation of the reports. The model comprises three fundamental components: descriptive (providing a summary of the literature), synthesis (logically categorising the research based on related ideas, connections and rationales), and critique (critically appraising the literature, providing evidence to support, discard or offer new ideas about it). Each of these phases is detailed fully in the following sections.

Figure 1. The Tripartite Model: A Systematic Literature Review Process (Daniel and Harland 2017)

To provide clarity, the review first focused on describing how TA is conceptualised and utilised. This was followed by a synthesis of the literature on the various tools used to harvest, analyse and present teaching-related data to teachers, and then by a critique of the research, which led to the development of a conceptual framework describing various aspects of TA. Finally, this paper proposes a Teaching Outcome Model (TOM), intended to help teachers engage with and reflect on teaching data.

TOM describes a TA life cycle that starts with the data collection stage, where the focus is on teaching data. Next is the data analysis stage, in which different Machine Learning (ML) techniques are applied to the data to discover hidden patterns. This is followed by the data visualisation stage, where the results are presented to the teacher, typically in the form of a Teaching Analytics Dashboard (TAD); this is the phase in which insight generation, critical thinking and teacher reflection take place. Finally, in the action phase, teachers implement actions to improve teaching practice. Some of these actions include improving the LD, changing the teaching method, providing appropriate feedback and assessment, or carrying out further research. This research aims to inform future work in the advancement of the TA research field.
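To make the four TOM stages concrete, the following is a minimal, illustrative Python sketch, not taken from the paper itself, that chains hypothetical collect, analyse, visualise and act steps over a few invented teaching-session records; every function name, data field and threshold is an assumption made purely for illustration.

```python
# Illustrative sketch of the TOM life cycle: collect -> analyse -> visualise -> act.
# All data fields, values and thresholds are hypothetical.
from statistics import mean

def collect_teaching_data():
    # Stage 1: data collection (e.g., talk time and questions asked per session).
    return [
        {"session": 1, "teacher_talk_min": 42, "questions_asked": 3},
        {"session": 2, "teacher_talk_min": 35, "questions_asked": 9},
        {"session": 3, "teacher_talk_min": 48, "questions_asked": 2},
    ]

def analyse(sessions):
    # Stage 2: data analysis (simple aggregates stand in for ML models here).
    return {
        "avg_talk_min": mean(s["teacher_talk_min"] for s in sessions),
        "avg_questions": mean(s["questions_asked"] for s in sessions),
    }

def visualise(summary):
    # Stage 3: visualisation (a text printout stands in for a TAD).
    for key, value in summary.items():
        print(f"{key:>15}: {value:.1f}")

def act(summary, talk_threshold=45):
    # Stage 4: action -- a simple rule a teacher might reflect and act on.
    if summary["avg_talk_min"] > talk_threshold:
        return "Consider reducing lecture time and adding interactive activities."
    return "Teacher talk time is within the target range."

sessions = collect_teaching_data()
summary = analyse(sessions)
visualise(summary)
print(act(summary))
```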

Framing research area for review

As stated in the introduction, understanding current research on TA can provide teachers with strategies that help them utilise various forms of data to optimise teaching performance and outcomes. The framing of the review was guided by a set of questions and proposed answers to those questions (see Table 1).

Inclusion and exclusion criteria

The current review started with a search of the Scopus database using the SciVal visualisation and analytical tool. The rationale for choosing the Scopus database is that it is the largest abstract and citation database of peer-reviewed research literature, with diverse titles from publishers worldwide; it is therefore reasonable to expect it to yield a meaningful balance of the published content in the area of TA. The review included peer-reviewed journals and conference proceedings. We excluded other document and source types, such as book series, books, editorials and trade publications, on the understanding that such sources might lack research on TA. This review also excluded articles published in languages other than English.

Search strategy

This review used several keywords, and combinations of them, to search for terms related to TA. For instance: ’Teaching Analytics’ AND ’Learning Analytics’ OR ’Teacher Inquiry’ OR ’Data Literacy’ OR ’Learning Design’ OR ’Computer-Supported Collaborative Learning’ OR ’Open Learner Model’ OR ’Visualisation’ OR ’Learning Management System’ OR ’Intelligent Tutoring System’ OR ’Student Evaluation on Teaching’ OR ’Student Ratings’.

This review searched for articles published between 2012 and 2019. The initial stage of the literature search yielded 58 papers. After screening these works and removing duplicates and titles that did not relate to the area of research, 47 articles remained, of which 36 studies continued to full-text review. Figure 2 shows the process of selecting the previous studies for this review.
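To illustrate how the screening criteria described above might look when applied programmatically, here is a small sketch that filters a list of exported bibliographic records by year, document type and language. The record structure and field names are assumptions for illustration, not the authors' actual workflow.

```python
# Hypothetical screening of exported bibliographic records against the review's
# inclusion/exclusion criteria (2012-2019, journal articles or conference
# papers, English only). Field names and records are invented.
records = [
    {"title": "Teaching analytics dashboards", "year": 2016,
     "type": "Article", "language": "English"},
    {"title": "Analytics in education", "year": 2015,
     "type": "Book", "language": "English"},
    {"title": "Lernanalytik im Klassenzimmer", "year": 2018,
     "type": "Conference Paper", "language": "German"},
]

INCLUDED_TYPES = {"Article", "Conference Paper"}

def meets_criteria(record):
    return (2012 <= record["year"] <= 2019
            and record["type"] in INCLUDED_TYPES
            and record["language"] == "English")

screened = [r for r in records if meets_criteria(r)]
print(f"{len(screened)} of {len(records)} records retained for further screening")
```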

Figure 2. Inclusion and exclusion criteria flowchart: the selection of previous studies

Compiling the abstracts and the full articles

The review included both empirical and conceptual papers. The relevance of each article was affirmed by requiring that chosen papers contained key phrases in the title, abstract and keywords and, subsequently, throughout the entire text. Articles were then reviewed with particular attention to the section(s) that expressly related to the field of TA, in order to extract essential perspectives on definitions, data sources, tools and technologies associated with analytics for teachers. Papers that did not in any way relate to analytics in the context of teaching were disregarded. Finally, 31 articles were retained for this review.

Systematic review: descriptive

Several studies have demonstrated that TA is an important area of inquiry (Flanders 1970; Gorham 1988; Pennings et al. 2014; Schempp et al. 2004) that enables researchers to explore analytics associated with the teaching process systematically. Such analytics focus on data related to teachers, students, the subjects taught and teaching outcomes. The ultimate goal of TA is to improve professional teaching practice (Huang 2001; Sergis et al. 2017). However, there is no consensus on what constitutes TA. Several studies suggest that TA is an approach used to analyse teaching activities (Barmaki and Hughes 2015; Gauthier 2013; KU et al. 2018; Saar et al. 2017), including how teachers deliver lectures to students, tool usage patterns, or dialogue. Various other studies regard TA as the ability to apply analytical methods to improve teacher awareness of student activities for appropriate intervention (Ginon et al. 2016; Michos and Hernández Leo 2016; Pantazos et al. 2013; Taniguchi et al. 2017; Vatrapu et al. 2013). A handful of others treat TA as analytics that combines both teacher and student activities (Chounta et al. 2016; Pantazos and Vatrapu 2016; Prieto et al. 2016; Suehiro et al. 2017). Hence, it is particularly problematic and challenging to carry out a systematic study in the area of analytics for teachers to improve teaching practice, since there is no shared understanding of what constitutes analytics and how best to approach TA.

Researchers have used various tools to automatically harvest important episodes of interactive teacher and student behaviour during teaching, for teacher reflection. For instance, KU et al. (2018) utilised instruments such as an Interactive Whiteboard (IWB), a Document Camera (DC), and an Interactive Response System (IRS) to collect classroom instructional data during instruction. Similarly, Vatrapu et al. (2013) employed eye-tracking tools to capture eye-gaze data on various visual representations. Thomas (2018) also extracted multimodal features from both the speaker's and the students' audio-video data, using digital devices such as cameras and high-definition cameras. Data collected from some of these tools not only provide academics with real-time data but also capture more detail about teaching and learning than the teacher may realise. However, the cost of using such digital tools for large-scale verification is high, and cheaper alternatives are sought after. For instance, Suehiro et al. (2017) proposed a novel approach of using e-books to efficiently extract teaching activity logs in a face-to-face class.

Vatrapu (2012) considers TA a subset of LA dedicated to supporting teachers in understanding the learning and teaching process. However, this definition does not recognise that the learning and teaching processes are intertwined. Also, most of the research in LA collects data about student learning or behaviour to provide feedback to the teacher (Vatrapu et al. 2013; Ginon et al. 2016; Goggins et al. 2016; Shen et al. 2018; Suehiro et al. 2017); see, for example, the iKlassroom conceptual proposal by Vatrapu et al. (2013), which highlights a map of the classroom to help contextualise real-time data about the learners in a lecture. Only a few studies draw attention to the analysis of teacher-gathered data and teaching practice artefacts, such as lesson plans. Xu and Recker (2012) examined teachers' tool usage patterns. Similarly, Gauthier (2013) analysed the reasoning of expert teachers and used such data to improve the quality of teaching.

Multimodal analytics is an emergent trend used to complement available digital traces with data captured from the physical world (Prieto et al. 2017). Isolated examples include the smart-school multimodal dataset conceptual proposal by Prieto et al. (2017), which features a plan for implementing a smart classroom to help contextualise real-time data about both the teachers and learners in a lecture. In another example, Prieto et al. (2016) explored the automatic extraction of orchestration graphs from a multimodal dataset gathered from only one teacher, one classroom space, and a single instructional design. Results showed that ML techniques could achieve reasonable accuracy in the automated characterisation of teaching activities. Furthermore, Prieto et al. (2018) applied more advanced ML techniques to an extended version of the previous dataset to explore the different relationships that exist between datasets captured by multiple sources.

Previous studies have shown that teachers want to address common issues such as improving their TPD and helping students learn effectively (Charleer et al. 2013; Dana and Yendol-Hoppey 2019; Pennings et al. 2014). Reflection on teaching practice plays an essential role in helping teachers address these issues during the process of TPD (Saric and Steh 2017; Verbert et al. 2013). More specifically, reflecting on personal teaching practice provides opportunities for teachers to re-examine what they have done in their classes (Loughran 2002; Mansfield 2019; Osterman and Kottkamp 1993), which, in turn, helps them gain an in-depth understanding of their teaching practice and thus improve their TPD. For instance, Gauthier (2013) used a visual teach-aloud method to help teaching practitioners reflect and gain insight into their teaching practices. Similarly, Saar et al. (2017) discussed self-reflection as a way to improve teaching practice: lecturers can record and observe their classroom activities, analyse their teaching and make informed decisions about any necessary changes in their teaching method.

The network analysis approach is another promising field of teacher inquiry, especially if combined with systematic, effective qualitative research methods (Goggins et al. 2016). However, researchers and teachers who wish to utilise social network analysis must be specific about what inquiry they want to achieve. Such queries must then be checked and validated against a particular ontology for analytics (Goggins 2012). Goggins et al. (2016), for example, aimed at developing an awareness of the types of analytics that could help teachers in Massive Open Online Courses (MOOCs) participate and collaborate with student groups, by making more informed decisions about which groups need help and which do not. Network theory offers a particularly useful framework for understanding how individuals and groups respond to each other as they evolve. Social Network Analysis (SNA) is the approach used by researchers to direct analytical studies informed by network theory. SNA has many specific forms, each informed by graph theory, probability theory, and algebraic modelling to various degrees. There are gaps in our understanding of the link between analytics and pedagogy. For example, which approaches to incorporating qualitative and network analysis research methods would produce useful information for teachers in MOOCs? A host of previous work suggests that a reasonable path to scaling analytics for MOOCs will involve providing helpful TA perspectives (Goggins 2012; Goggins et al. 2016; Vatrapu et al. 2012).
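As a concrete illustration of the kind of network analysis described above, the sketch below uses the networkx library on invented forum interactions to rank student groups by how connected they are, the sort of signal that could help a MOOC teacher decide which groups may need support. It is a toy example under stated assumptions, not the method used in the cited studies.

```python
# Sketch: flag weakly connected discussion groups in a (made-up) MOOC forum.
import networkx as nx

# Edges represent "replied to" interactions between students; groups are invented.
interactions = [
    ("ann", "bob"), ("bob", "cat"), ("cat", "ann"),   # group A: well connected
    ("dan", "eve"),                                   # group B: sparse
    ("fay", "gus"), ("gus", "hal"),                   # group C: a simple chain
]
group_of = {"ann": "A", "bob": "A", "cat": "A",
            "dan": "B", "eve": "B",
            "fay": "C", "gus": "C", "hal": "C"}

G = nx.Graph()
G.add_edges_from(interactions)

# Average degree per group as a rough proxy for collaboration intensity.
group_degrees = {}
for student, group in group_of.items():
    group_degrees.setdefault(group, []).append(G.degree(student))

for group, degrees in sorted(group_degrees.items()):
    avg = sum(degrees) / len(degrees)
    flag = "  <- may need teacher support" if avg < 1.5 else ""
    print(f"group {group}: average degree {avg:.2f}{flag}")
```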

Teacher facilitation is considered a challenging and critical aspect of active learning (Fischer et al. 2014). Both educational researchers and practitioners have paid particular attention to this process, using different data gathering and visualisation methods, such as classroom observation, student feedback, audio and video recordings, or teacher self-reflection. TA enables teachers to perform analytics through visual representations, enhancing the teacher's experience (Vatrapu et al. 2011). In a pedagogical environment, professionals have to monitor several kinds of data, such as questions, mood, ratings, or progress; hence, dashboards have become an essential factor in improving and conducting successful teaching. Dashboards are visualisation tools that enable teachers to monitor and observe teaching practice and enhance teacher self-reflection (Yigitbasioglu and Velcu 2012). A TAD is a category of dashboard meant for teachers and holds a unique role and value [62]. First, a TAD can allow teachers to assess student learning in an almost real-time and scalable manner (Mor et al. 2015), consequently enabling teachers to improve their self-knowledge by monitoring and observing students' activities. A TAD assists teachers in obtaining an overview of the whole classroom as well as drilling down into details about individual students and groups of students to identify student competencies, strengths and weaknesses. For instance, Pantazos and Vatrapu (2016) described a TAD for repertory grid data that enables teachers to conduct systematic visual analytics of classroom learning data for formative assessment purposes. Second, a TAD also allows for tracking of teachers' own activities (van Leeuwen et al. 2019), as well as students' feedback about their teaching practice. For example, Barmaki and Hughes (2015) explored a TAD that provides automated real-time feedback based on the speaker's posture, to support teachers in practising classroom management and content delivery skills. Pedagogically, dashboards can motivate teachers to reflect on teaching activities and help them improve teaching practice and learning outcomes (2016). The literature has described many different teaching dashboards. For instance, Dix and Leavesley (2015) broadly discussed the idea of the TAD and how such dashboards can serve as visual tools for academics to interface with learning analytics and other aspects of academic life. These aspects may include schedules, such as preparing for class or updating materials, and meetings with individual students or groups of students. Similarly, Vatrapu et al. (2013) explored a TAD that uses visual analytics techniques to allow teachers to conduct a joint analysis of students' personal constructs and ratings of domain concepts from repertory grids for formative assessment applications.

Systematic review: synthesis

In this second part of the review process, we extracted selected ideas from previous studies and then grouped them based on data sources, analytical methods used, types of visualisations performed, and actions.

Data sources and tools

Several studies have used custom software and online applications, such as LMS and MOOC platforms, to collect online classroom activities (Goggins et al. 2016; KU et al. 2018; Libbrecht et al. 2013; Müller et al. 2016; Shen et al. 2018; Suehiro et al. 2017; Vatrapu et al. 2013; Xu and Recker 2012). Others have used modern devices, including eye-trackers, portable electroencephalogram (EEG) devices, gyroscopes, accelerometers and smartphones (Prieto et al. 2016; Prieto et al. 2018; Saar et al. 2017; Saar et al. 2018; Vatrapu et al. 2013), and conventional instruments such as video and voice recorders (Barmaki and Hughes 2015; Gauthier 2013; Thomas 2018), to record classroom activities. However, some authors have pointed out several issues with modern devices, such as expensive equipment, high human resource demands and ethical concerns (KU et al. 2018; Prieto et al. 2017; Prieto et al. 2016; Suehiro et al. 2017).

In particular, one study by Chounta et al. (2016) recorded classroom activities using humans to code tutor-student dialogue manually. However, the authors acknowledged that manual coding of lecture activities is complicated and cumbersome. Other authors subscribe to this view and have attempted to address the issue by applying Artificial Intelligence (AI) techniques to automate and scale the coding process and ensure quality across all platforms (Prieto et al. 2018; Saar et al. 2017; Thomas 2018). Others have proposed re-designing the TA process to automate data collection and make teachers autonomous in collecting data about their teaching (Saar et al. 2018; Shen et al. 2018), including using technology that is easy to set up, effortless to use, requires little preparation and, at the same time, does not interrupt the flow of the class. In this way, teachers would not require researcher assistance or outside human observers. Table 2 summarises the various data sources and tools used to harvest teaching data with regard to TA.

The collection of evidence from both online and real classroom practice is significant for both educational research and TPD. LA deals mostly with data captured from online and blended learning platforms (e.g., log data, social network data and text data). Hence, LA provides teachers with data to monitor and observe students' online class activities (e.g., discussion boards, assignment submission, email communications, wiki activities and progress). However, LA neglects to capture the physical occurrences of the classroom and does not always address individual teachers' needs. TA requires more adaptable forms of classroom data collection (e.g., through video recordings, sensor recordings or human observers), which are tedious, human-capital intensive and costly. Other methods have been explored to balance the trade-off between data collected online and data gathered from physical classroom settings by implementing alternative design approaches (Saar et al. 2018; Suehiro et al. 2017).
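As a small illustration of the online trace data mentioned above, the following hypothetical sketch splits LMS event logs into teacher-related and student-related activity counts, the kind of minimal pre-processing that separates TA-relevant events from the student-centred data LA usually emphasises. The log format and action names are invented.

```python
# Hypothetical LMS event log: each record notes who performed which action.
from collections import Counter

events = [
    {"actor_role": "teacher", "action": "post_announcement"},
    {"actor_role": "student", "action": "submit_assignment"},
    {"actor_role": "teacher", "action": "grade_assignment"},
    {"actor_role": "student", "action": "view_resource"},
    {"actor_role": "teacher", "action": "reply_forum"},
    {"actor_role": "student", "action": "reply_forum"},
]

teacher_actions = Counter(e["action"] for e in events if e["actor_role"] == "teacher")
student_actions = Counter(e["action"] for e in events if e["actor_role"] == "student")

print("Teacher activity (TA focus):", dict(teacher_actions))
print("Student activity (LA focus):", dict(student_actions))
```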

Analysis methods

Multimodal analytics is the emergent trend that will complement readily available digital traces with data captured from the physical world. Several articles in the literature have used multimodal approaches to analyse teaching processes in the physical world (Prieto et al. 2016; Prieto et al. 2017; Prieto et al. 2018; Saar et al. 2017; Thomas 2018). In university settings, unobtrusive computer vision approaches have been applied to assess student attention from facial features and other behavioural signs (Thomas 2018). Most of the studies that have ventured into multimodal analytics applied ML algorithms to their captured datasets to build models of the phenomena under investigation (Prieto et al. 2016; Prieto et al. 2018). Beyond multimodal analytics, other areas of TA research have also applied ML techniques, for example to teachers' tool usage patterns (Xu and Recker 2012), online e-books (Suehiro et al. 2017) and students' written notes (Taniguchi et al. 2017). Table 3 outlines some of the ML techniques applied in the previous TA literature.
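To illustrate what applying ML to multimodal teaching data can look like, the sketch below trains a simple classifier to label classroom segments as lecturing or group work from two invented sensor features (teacher speech ratio and teacher movement). It is a toy example under stated assumptions, not a reproduction of the models used in the cited studies.

```python
# Toy example: classify teaching-activity segments from two synthetic multimodal
# features (fraction of time the teacher is speaking, accelerometer movement).
from sklearn.linear_model import LogisticRegression

# Each row is [speech_ratio, movement]; label 1 = lecturing, 0 = group work.
X = [[0.90, 0.1], [0.85, 0.2], [0.80, 0.1], [0.75, 0.3],
     [0.30, 0.7], [0.25, 0.8], [0.20, 0.6], [0.35, 0.9]]
y = [1, 1, 1, 1, 0, 0, 0, 0]

model = LogisticRegression().fit(X, y)

# Predict the activity for two new, unseen segments.
new_segments = [[0.88, 0.15], [0.28, 0.75]]
for features, label in zip(new_segments, model.predict(new_segments)):
    name = "lecturing" if label == 1 else "group work"
    print(f"segment {features} -> predicted activity: {name}")
```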

Visualisation methods

TA allows teachers to apply visual analytics and visualisation techniques to improve TPD. The most commonly used visualisation techniques in TA are statistical graphs such as line charts, bar charts, box plots, and scatter plots. Other visualisation techniques include SNA, spatial, timeline, static and real-time visualisations. An essential factor for TA visualisations is the number of users represented in a visualisation. Serving a single or individual user allows the analyst to inspect the viewing behaviour of one participant, while visualising multiple users or groups at the same time can reveal group strategies. However, these representations might suffer from visual clutter if too much data is displayed at once. Here, optimisation strategies such as averaging or bundling of lines might be used to achieve better results. Table 4 summarises the visualisation techniques most commonly used in TA.
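The sketch below shows, with invented data, two of the statistical graphs named above (a line chart of class engagement over time and a box plot of score distributions by group) of the kind a TAD might display; it uses matplotlib and is purely illustrative.

```python
# Minimal sketch of two common TA visualisations using invented data.
import matplotlib.pyplot as plt

weeks = [1, 2, 3, 4, 5, 6]
avg_engagement = [3.1, 3.4, 3.2, 3.8, 4.0, 4.2]            # weekly class average
group_scores = [[3.0, 3.5, 4.0, 3.2],                      # per-group distributions
                [2.1, 2.4, 2.8],
                [4.1, 4.4, 3.9, 4.6]]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3))

ax1.plot(weeks, avg_engagement, marker="o")
ax1.set_xlabel("Week")
ax1.set_ylabel("Average engagement rating")
ax1.set_title("Class engagement over time")

ax2.boxplot(group_scores)
ax2.set_xticks([1, 2, 3])
ax2.set_xticklabels(["Group A", "Group B", "Group C"])
ax2.set_ylabel("Score")
ax2.set_title("Score distribution by group")

plt.tight_layout()
plt.show()
```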

Systematic review: critique

Student Evaluation of Teaching (SET) data

Although the literature has extensively reported various data sources used for TA, this study also draws attention to student feedback on teaching as another form of data that originates from the classroom. The analytics of student feedback on teaching could support teacher reflection on teaching practice and add value to TA. Student feedback on teaching, also known as student ratings or SET, is a form of textual data. It can be described as a combination of quantitative and qualitative data that express students' opinions about particular areas of teaching performance. It has existed since the 1920s (Marsh 1987; Remmers and Brandenburg 1927) and has been used as a form of teacher feedback. In addition to serving as a source of input for academic improvement (Linse 2017), many universities also rely heavily on SET for hiring, promoting and firing instructors (Boring et al. 2016; Harland and Wald 2018).

Technological advancement has enabled institutions of Higher Education (HE) to administer course evaluations online, forgoing the traditional paper-and-pencil format (Adams and Umbach 2012). There has been much research around online teaching evaluations; Asare and Daniel (2017), for example, investigated the factors influencing the rate at which students respond to online SET. While there is a variety of opinions on the validity of SET as a measure of teaching performance, many teaching academics and administrators perceive SET as the primary measure that fills this gap (Ducheva et al. 2013; Marlin Jr and Niss 1980). After all, who experiences teaching more directly than students? These evaluations generally consist of questions addressing the instructor's teaching, the content and activities of the course, and the students' own learning experience, including assessment. However, most schemes gather evaluation data and pass the raw data on to instructors and administrators, stopping short of deriving value from the data to facilitate improvements in instruction and the learning experience. This shortcoming is especially critical because not all teachers have the data literacy skills needed to interpret and use such data.

Further, there are ongoing debates over the validity of SET data (Benton and Cashin 2014; MacNell et al. 2015). These debates have highlighted shortcomings of student ratings as a measure of the quality of instruction (Boring 2015; Braga et al. 2014). For Edström (2008), what matters is how the individual teacher perceives an evaluation: it can be enough to undermine TPD, especially if teachers feel they are the subjects of an audit. Nevertheless, SET is today an integral part of universities' evaluation processes (Ducheva et al. 2013). Research has also shown that there is substantial room for using student ratings to improve teaching practice, including the quality of instruction, learning outcomes, and the teaching and learning experience (Linse 2017; Subramanya 2014). This review aligns with the side of the argument that supports using SET for instructional improvement and the enhancement of the teaching experience.

Systematically applied, analytics of SET could provide valuable insights that lead to improved teaching performance. For instance, visualising SET gives teachers a way to benchmark their performance over time. SET also supports the case for some level of data fusion in TA, as argued in the conceptualisation subsection of TA.

Transformational TA

The growing research into big data in education has led to renewed interest in the use of various forms of analytics (Borgman et al. 2008; Butson and Daniel 2017; Choudhury et al. 2002). Analytics seeks to extract insight from hidden patterns and relationships in data that would not ordinarily be visible to the naked eye without the application of state-of-the-art models and methods. Big data analytics in HE provides lenses on students, teachers, administrators, programmes, curricula, procedures and budgets (Daniel 2015). Figure 3 illustrates the types of analytics that apply to TA to transform HE.

Figure 3: Types of analytics in higher education (Daniel 2019)

Descriptive Analytics Descriptive analytics aims to interpret historical data to better understand organisational changes that have occurred. It answers the question "What happened?" about a process, such as the failure rate in a particular programme (Olson and Lauhoff 2019). It applies simple statistical techniques such as the mean, median, mode, standard deviation, variance and frequency to model past behaviour (Assunção et al. 2015; ur Rehman et al. 2016). Barmaki and Hughes (2015) carried out descriptive analytics to determine mean view time, mean emotional activation and areas of interest on data generated from 27 stimulus images, investigating the notational, informational and emotional aspects of TA. Similarly, Michos and Hernández-Leo (2016) demonstrated how descriptive analytics could support teachers in reflecting on and redesigning their learning scenarios. A minimal sketch of such descriptive statistics is given below.
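A minimal Python sketch of descriptive analytics on synthetic course results, computing the simple statistics named above plus a failure-rate frequency table; the column names and pass mark are illustrative assumptions.

# Descriptive analytics: summary statistics and failure rates per programme.
import pandas as pd

grades = pd.DataFrame({
    "programme": ["BSc CS", "BSc CS", "BA Edu", "BA Edu", "BSc CS", "BA Edu"],
    "mark": [72, 38, 55, 41, 68, 49],
})
grades["failed"] = grades["mark"] < 50   # assumed pass mark of 50

print(grades["mark"].describe())                      # mean, std, quartiles ...
print(grades.groupby("programme")["failed"].mean())   # failure rate per programme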

Diagnostic Analytics Diagnostic analytics is a higher level of analytics that digs deeper into the results of descriptive analytics (Olson and Lauhoff 2019). It answers the question "Why did it happen?". For example, a teacher may carry out diagnostic analytics to understand why there is a high failure rate in a particular programme, or why students rated a course lower in one year than in the previous year. Diagnostic analytics uses data mining techniques such as data discovery, drill-down and correlation to further explore trends, patterns and behaviours (Banerjee et al. 2013). Previous research has applied the repertory grid technique as a pedagogical method to support teachers in diagnosing students' knowledge of a specific topic of study (Pantazos and Vatrapu 2016; Vatrapu et al. 2013). The sketch below illustrates a simple drill-down and correlation check.
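The following hedged Python sketch illustrates the drill-down-plus-correlation idea for the "why did ratings drop?" question above; the table and its column names are invented for illustration.

# Diagnostic analytics: drill down by topic and year, then correlate a
# candidate explanation (attendance) with the outcome (SET rating).
import pandas as pd

df = pd.DataFrame({
    "year":       [2018] * 4 + [2019] * 4,
    "topic":      ["intro", "stats", "ml", "ethics"] * 2,
    "set_rating": [4.3, 4.1, 3.9, 4.2, 4.0, 3.2, 3.1, 4.1],
    "attendance": [0.92, 0.88, 0.85, 0.90, 0.86, 0.61, 0.58, 0.87],
})

# Drill down: which topics drove the year-on-year change?
print(df.pivot_table(values="set_rating", index="topic", columns="year"))

# Correlate a candidate explanation with the outcome.
print("attendance vs rating:", df["attendance"].corr(df["set_rating"]))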

Relational Analytics Relational analytics measures the relationships that exist between two or more variables. Correlation analysis is a typical example, measuring the linear relationship between two variables (Rayward-Smith 2007). For instance, Thomas (2018) applied correlation analysis to select the best features from speaker and audience measurements. Other researchers have used related forms of relational analytics, such as co-occurrence analysis, to reveal students' hidden abstract impressions from their written notes (Taniguchi et al. 2017). Others have used relational analytics to identify the formative assessment features of individual students that most affect performance, assisting teachers in understanding the primary components of student performance (Pantazos et al. 2013; Michos and Hernández-Leo 2016), or to distinguish the elements or terms used to express similarities or differences within their contexts (Vatrapu et al. 2013). Insights generated from this kind of analysis can help improve teaching in future lectures and compare different teaching styles. Sequential pattern mining is another type of relational analytics, used to determine the relationship between subsequent events (Romero and Ventura 2010). In multimodal analytics it can be used to relate physical aspects of the learning and teaching process, such as the relationship between ambient factors and learning, or to investigate robust multimodal indicators of learning that support teacher decision-making (Prieto et al. 2017). A correlation-based feature-selection sketch follows.
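A small illustrative sketch of correlation-based feature selection in Python (not the actual pipeline of Thomas 2018): candidate speaker/audience features are ranked by the strength of their linear relationship with an outcome and the strongest are kept. All names and data are synthetic.

# Relational analytics: rank features by absolute correlation with an outcome.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
features = pd.DataFrame({
    "speech_rate": rng.normal(size=100),
    "gesture_freq": rng.normal(size=100),
    "audience_gaze": rng.normal(size=100),
})
outcome = 0.6 * features["audience_gaze"] + rng.normal(scale=0.5, size=100)

corr = features.corrwith(outcome).abs().sort_values(ascending=False)
selected = corr.head(2).index.tolist()   # keep the two most strongly related features
print(corr)
print("selected features:", selected)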

Predictive Analytics Predictive analytics aims to predict future outcomes based on historical and current data (Gandomi and Haider 2015). As the name implies, it attempts to forecast future occurrences, patterns and trends under varying conditions (Joseph and Johnson 2013), using techniques such as regression analysis, forecasting, pattern matching, predictive modelling and multivariate statistics (Gandomi and Haider 2015; Waller and Fawcett 2013). In education, the goal is to predict student and teacher activities and generate information that supports decision-making by the teacher (Chatti et al. 2013). Predictive analytics answers the question "What will happen?"; for instance, what interventions and preventive measures can a teacher take to minimise the failure rate? Herodotou et al. (2019) provided evidence on how predictive analytics can be used by teachers to support active learning. An extensive body of literature suggests that predictive analytics can help teachers improve teaching practice (Barmaki and Hughes 2015; Prieto et al. 2016; Prieto et al. 2018; Suehiro et al. 2017) and identify groups of students that might need extra support to reach desired learning outcomes (Goggins et al. 2016; Thomas 2018). A minimal at-risk prediction sketch is shown below.
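A minimal sketch of the at-risk prediction use case in Python with scikit-learn; the engagement features, threshold and data are synthetic assumptions, not a validated model.

# Predictive analytics: flag students at risk of failing from early engagement data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 500
logins = rng.poisson(15, n)                       # LMS logins so far
quiz_avg = rng.uniform(0, 100, n)                 # early quiz average
fail = (0.03 * (50 - quiz_avg) + 0.05 * (10 - logins) + rng.normal(0, 1, n)) > 0

X = np.column_stack([logins, quiz_avg])
X_tr, X_te, y_tr, y_te = train_test_split(X, fail, test_size=0.3, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)

at_risk = clf.predict_proba(X_te)[:, 1] > 0.5     # estimated probability of failing
print("students flagged for early intervention:", int(at_risk.sum()))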

Prescriptive Analytics Prescriptive analytics provides recommendations, or can automate actions in a feedback loop, that modify, optimise or pre-empt outcomes (Williamson 2016). It answers the question "How can it best happen?"; for instance, how can teachers make the right interventions for students perceived to be at risk in order to minimise dropout, and what kinds of resources are needed to support students who need them to succeed? It determines the optimal action that enhances a process by establishing cause-effect relationships and applying techniques such as graph analysis, recommendation engines, heuristics, neural networks, machine learning and Markov processes (Bihani and Patil 2014; ur Rehman et al. 2016). One example is applying curriculum knowledge graphs and learning-path recommendation to support teaching and learners' learning processes (Shen et al. 2018), as sketched below.
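The following hedged sketch illustrates the learning-path idea in Python with networkx (it is not Shen et al.'s method): prerequisite relations form a weighted graph and the recommended path is the cheapest route from a topic the student already knows to a target topic. Topics and weights are invented.

# Prescriptive analytics: recommend a study sequence over a prerequisite graph.
import networkx as nx

curriculum = nx.DiGraph()
curriculum.add_weighted_edges_from([
    ("algebra", "probability", 1), ("probability", "statistics", 1),
    ("statistics", "machine learning", 2), ("algebra", "calculus", 1),
    ("calculus", "machine learning", 3),
])

path = nx.shortest_path(curriculum, source="algebra",
                        target="machine learning", weight="weight")
print("recommended study sequence:", path)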

Actionable Analytics Actionable analytics refers to analytics that prompt action (Gudivada et al. 2016; Gudivada et al. 2018; Winkler and Söllner 2018). Norris et al. (2008) used the term action analytics to describe "the emergence of a new generation of tools, solutions, and behaviours that are giving rise to more powerful and effective utilities through which colleges and universities can measure performance and provoke pervasive actions to improve it". The educational sector can leverage innovative, cutting-edge technologies and techniques such as Natural Language Processing (NLP) (Sergis and Sampson 2016; Taniguchi et al. 2017), big data analytics (Goggins et al. 2016) and deep learning (Prieto et al. 2018) to support teachers in both the teaching and learning processes.

Institutional Transformation Data in themselves are not useful; they only become valuable when used to generate insight. In other words, analytics can be applied to institutional data to optimise the productivity and performance of institutional operations, thereby providing value that can transform institutional practices. In education, analytics serves various purposes, ranging from giving institutions an overview or a microscopic view of individual students, faculty, curricula, programmes, operations and budgets, to predicting future trends. Unveiling the value of TA empowers teachers to identify issues and transform difficulties into opportunities, which can be used to optimise institutional processes, enhance learner experiences and improve teaching performance. TA and LA both play a vital role in reforming and transforming the educational sector to keep pace with the rate at which data are generated. For example, with the extensive use of online and blended learning platforms, the application of analytics will enable institutional stakeholders at all levels to gain new insights into educational data. Today, the HE sector is at a crossroads, where synergies between learning research and data analytics are needed to transform the way teaching and learning are fundamentally carried out.

The link between TA, LA and LD

Primarily, TA aims to take the centrepiece of LA and remodel it to address teaching challenges. More specifically, TA argues that connecting and analysing insights generated from LA methods and tools with those generated from in-class methods and tools, through TA tools, could support teacher reflection and improve TPD based on evidence. This concept is presented further in the next subsection.

Conceptual framework of TA

Based on the different perceptions of TA described in previous reviews, this study proposes a conceptual framework for TA that models the complex interactions surrounding TA. Three nodes (LA, TA and LD) are interconnected, forming a triadic network with the teacher at the centre, performing value-added interactions to make informed decisions. Each part of this interconnection forms a triangle, giving three triangles in total (A, B and C) (see Fig. 4).

Figure 4: Conceptualisation of TA — the triadic TA conceptual framework

The proposed framework is not bound to any particular implementation of learning or design technology. Instead, it describes the elements of analytics and the data sources that are key to each domain, to guide the use of analytical methods, tools and technology in successfully supporting the multiple dimensions of learning design.

This triangle illustrates the interaction between the teacher, LA and LD to inform TPD. Hernández-Leo et al. (2019) argued that LD could contribute to structuring and orchestrating the design intent with learners' digital trace patterns, advancing the knowledge and interpretation of LA. In turn, LA tailored to fit the design intent can be seen by teachers as contributing to the enhancement of the LD in subsequent design iterations. For example, LA could serve as an information tool to inform tutors' or designers' pedagogical decision-making (Persico and Pozzi 2015). Hence, a teacher may want to use LA to make just-in-time pedagogical decisions, such as grouping students based on their performance.

Similarly, a teacher may want to investigate whether the estimated time for students to carry out learning tasks is reasonable or whether adjustments need to be made to the course design (Hernández-Leo et al. 2019; Pozzi and Persico 2013). This domain can also provide teachers with analytics regarding the challenges and difficulties students face in the problem-solving phase of a task, returning information to the teacher in the form of a TAD that summarises the various challenges students encountered with that activity and may also suggest how to address them. An example is an early-alert system that instantiates a dashboard for instructors using metrics such as login counts and page views (Thille and Zimmaro 2017). The data sources in the LA node can improve teachers' awareness, which can also lead to improvement of the LD and help distinguish design elements that could shape future designs. Data collection in this domain is mostly automatic, through virtual learning environments (e.g., LMSs, MOOCs). Other forms of data collection may include social media platforms (e.g., Facebook, Twitter), wearable sensors (e.g., eye-trackers, EEG), and software tools that support and collect data related to specific student activities and attendance (Bakharia et al. 2016; Bos and Brand-Gruwel 2016).

This triangle represents the relationship between the teacher, LD and TA. While experiencing the LD, TA endeavours to track teachers' continued engagement, progression and achievement, as well as learner satisfaction (Bakharia et al. 2016; Sergis and Sampson 2017). An example is exploring the impact of video production choices on instructor performance and student learning: using MOOC A/B testing, teachers could examine whether a difference in video production setting has any impact on the instructor's presentation performance, or whether changes in format and instructor performance result in detectable differences in student viewing behaviour (Chen et al. 2016).

Further, data sources in TA can assist teacher reflection on the impact of their LD. Data collection can also be automated through wearable sensors worn by teachers while performing teaching activities, also known as in-class analytics. Several institutions now record videos of their face-to-face classes, and some go a step further by collecting teachers' physiological data. These datasets can exemplify and illustrate things that a book of pedagogy ordinarily cannot convey, providing systematic feedback for teachers. In-class analytics involves capturing data during traditional face-to-face, teacher-centric instruction or teacher-student interaction (where students learn by directly or indirectly interacting with instructors in a lab or lecture hall) and analysing the data to identify areas of possible improvement. The kinds of data usually captured in this setting include audio, video, body movement and brain or cortex activity, to mention just a few. For example, a teacher can perform diagnostic analysis on class-recorded videos to expose what happens during a lecture; this kind of analysis can help teachers understand more about their teaching and discover areas for further improvement. SET is another form of data about teachers; it is collected via institutional application platforms (Hernández-Leo et al. 2019) and can be visualised to improve teaching performance.

Analytics in the LD node involves visualisation of the teaching design to facilitate teacher reflection on the lesson plan, visualisation of the extent to which the lesson plan aligns with the educational objectives, and validation of the lesson plan to highlight potential inconsistencies in the teaching design. For example, a teacher can visualise the number of assessment activities in the lesson plan or the various types of educational resources used, to determine whether they are still valid or obsolete. Similarly, a teacher could analyse the time allocated to each lesson activity to find out whether it is adequate, or visualise inconsistencies, time misappropriations and imbalances between the overall lesson plan and the individual lesson activities.

This triangle represents the communication between the teacher, LA and TA. Thomas (2018) explored the correlation between student ratings of teaching and student physiological data. Similarly, Schmidlin et al. (2015) established how to analyse and cross-reference data without decrypting the data sources. Hence, we argue that SET could be linked with LA, such as student digital traces from the LMS (Stier et al. 2019) and other forms of data (such as attendance data), without compromising privacy. Such data fusion could support teachers in making informed decisions in new ways. For example, analytics performed on linked datasets could quickly reveal which student opinions should carry less weight in end-of-semester course evaluations.

Visualisations could quickly identify students with low participation rates and link this to their opinions without revealing any identities. Additionally, teachers may be interested in comparing the views of students with low participation rates with those of students with high participation rates. This kind of information can lead teachers towards making explicit, evidence-based judgements. A tutor may, for example, choose to give less weight to the opinions of students who participated in less than 20 per cent of in-class activities and assignments and had a low attendance rate, concentrating instead on the opinions of students who did participate when improving teaching practice.

However, considering ethical concerns, data fusion at the individual level still requires explicit and informed consent from the students whose data are collected (Menchen-Trevino 2016). There are also privacy concerns: data fusion can be problematic because it usually requires that teachers know student identities. From a programmatic perspective, however, extra measures can be put in place to address this. Algorithms can mask student identities by mapping them to other unique identifiers, making records anonymous but still linkable (Schmidlin et al. 2015) and thereby providing a richer set of data for the teacher to make informed decisions, as sketched below.
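A minimal sketch of this masking idea in Python: student identifiers are replaced by keyed hashes so that SET, attendance and LMS records can be linked without exposing names. The key name and workflow are assumptions; in practice the key would be held by an institutional data steward, and keyed hashing alone is not a complete privacy solution.

# Pseudonymisation sketch: anonymous but linkable student identifiers.
import hmac
import hashlib

SECRET_KEY = b"held-by-institutional-data-steward"   # assumption for this sketch

def pseudonymise(student_id: str) -> str:
    # Keyed hash so identifiers cannot be reversed without the key.
    return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

lms_row = {"student": pseudonymise("s1234567"), "logins": 42}
set_row = {"student": pseudonymise("s1234567"), "rating": 4}
assert lms_row["student"] == set_row["student"]   # records link without revealing identity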

Teachers can get a better picture of how to improve the context in which learning happens only if they are informed about both how they teach and how students learn. Hence, this framework aims to continually provide teachers with relevant information and intelligent feedback, based on data generated from users and the learning context, to continuously improve their learning design and teaching outcomes.

Teaching Outcome Model (TOM)

Design-based research advances instructional design work, theory and implementation as iterative, participatory and situated processes, rather than processes "owned and operated" by instructional designers (Wang and Hannafin 2005). TOM is an iterative process that follows a design-based research approach to guide teachers, researchers, faculty and administrators in using data to improve the quality of teaching and learning outcomes. The model enables teachers to investigate and evaluate their work using data, thereby improving teachers' use of data to inform teaching practice. To build awareness around teaching data, TOM models TA through iterative cycles of data collection, data analysis, data visualisation and action, stages which are interdependent (see Fig. 5). Design-based research, as a pragmatic methodology, can guide TOM while generating insights that support teacher reflection on teaching and student learning. Conversely, TOM ensures that design-based research methodologies can be operationalised and systematised. Following the stages outlined in the model, teachers can regularly identify, match and adjust teaching practice and learning design to learners' needs.

Figure 5: Teaching Outcome Model (TA life cycle)

Data collection stage

In the data collection stage, a constant stream of data accumulates from the digital traces of daily teaching activities and engagements, including structured and unstructured data, visual and non-visual data, and historical and real-time data. The rate at which diverse data accumulate in the educational system will keep growing. According to Voithofer and Golan (2018), there are several ways to mine teaching and learning data that do not require professional knowledge beyond the necessary teacher training in data literacy, learning design and class orchestration. Subscribing to this school of thought, adopting big data infrastructure in institutions will guarantee easy access to data by the various stakeholders and mitigate the bottleneck of disparate data points in the educational sector, enabling educators to focus more attention on instruction, setting up interactive class activities, and participating in discussions that create more data for evidence-based decision-making. The misuse of data, however, is a primary concern (Roberts et al. 2017). One critical matter is identifying the types of data that can be collected, analysed and visualised, to ensure that the right people have access to the data for the right purpose. As such, implementing data governance policies around institutional data, such as an 'open definition of purpose, scope and boundaries, even if that is broad and in some respects open-ended', is critical (Kay et al. 2012, p. 6). Such measures introduce clarity and address questions of who controls what data, as well as security and privacy issues around data.

Analysis stage

This stage involves the different ways of working with data to ensure data quality. Professionals such as data scientists, programmers, engineers and researchers need to work together with teachers at this level. They can apply data mining techniques, statistical methods, complex algorithms and AI techniques (such as NLP, ML and deep learning) to transform data through the analytical process. Analytics in the education space takes diverse forms, including descriptive, diagnostic, predictive and prescriptive analytics. These different forms can offer a high-level or fine-grained view of individual learners, teachers and faculty and their various activities, engagements and behaviours. Unravelling the value of data analytics empowers teachers and researchers to identify problems and transform challenges into opportunities that can be used to support teacher reflection and enrich teachers' data-literacy experiences. For example, teachers can apply NLP to text data to extract topics from discussion posts, the contributions participants have made within collaborative projects, and their sentiments; a minimal sketch follows.
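A hedged sketch of the NLP step in Python with scikit-learn: discussion-forum posts are turned into a bag-of-words matrix and a small LDA model surfaces candidate topics. The posts are invented examples, and a real pipeline would add cleaning, stop-word tuning and a sentiment step.

# Topic extraction from forum posts with bag-of-words + LDA.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = [
    "The assignment deadline is too close to the exam",
    "Great lecture on neural networks and backpropagation",
    "Can we get more examples of backpropagation in the lab?",
    "Exam revision session request for next week",
]
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(posts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top_terms = [terms[j] for j in topic.argsort()[-4:]]   # four strongest terms per topic
    print(f"topic {i}:", top_terms)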

Furthermore, ML techniques could be combined with TA to enhance teaching outcomes. For instance, chatbots could support the teacher by acting as a teaching assistant in large classes. An essential consideration in analytics, however, is that de-identified data can often be re-identified (Roberts et al. 2017; Cumbley and Church 2013), especially when datasets increase in size and scope and are combined to generate big data. To address these concerns, one university introduced a two-stage method of data de-identification coupled with data governance to restrict data access (De Freitas et al. 2015).

Visualisation stage

This stage ensures that data are presented to teachers in useful and meaningful ways, empowering them with interactive visual interfaces and dashboards that support cognition and promote reflection on pre-processed, fine-grained teaching and learning activities. Through a TAD, real-time and historical information from different data sources, which might not necessarily be interoperable, can be projected and summarised (Moore 2018). However, visualisation is "what you see is what you get": the way information is presented may affect its interpretation and, consequently, may influence decision-making. Hence, it is necessary to address visualisation in its diverse forms, such as visual analytics and exploratory data analysis, to create room for visual interactivity and exploratory visualisation that reveals trends, patterns, relationships and behaviours. For example, a teacher can use a TAD to monitor student engagement; when engagement is poor, it may prompt the teacher to take action, such as changing the teaching material or making it more interactive. Additionally, there are questions around privacy, such as who has access to the visualisations relevant to an instructor: other faculty members participating in the course directly or indirectly, administrators, researchers, or potential employers at other institutions.

Action stage

At this stage, informed decisions lead to action, and actions inevitably reshape the environment, which in turn generates new data. There is also a need to create tools that help teachers understand and make meaning of data quickly. Actions taken by teachers can be used to improve course design and assessment (value-added formative assessment). Predictive analytics, however, prompts an epistemological question: how can we ensure effective action by the teacher based on potentially flawed predictions, such that the system does not collapse?

Discussion and conclusion

This article presents the results of a systematic literature review aimed at describing the conception and synthesis of current research on the notion of TA, to provide insight into how TA can be used to improve the quality of teaching. The first part of the article described what is meant by TA, to consolidate the divergent discourse on TA. The review showed that TA covers analytics on teaching activities as well as methods for improving teachers' awareness of students' activities, including supporting teachers in understanding student learning behaviours so as to provide adequate feedback. In essence, the primary goal of TA is to improve teaching performance. The literature also revealed that several tools and methods are available for extracting digital traces associated with teaching, in addition to traditional student evaluation tools. However, one of the main challenges recognised was the cost associated with some of the devices used to capture in-class activities; ML techniques have been proposed to minimise this challenge.

The literature has also recognised teacher inquiry as a promising area of research in TA and reached a consensus that methods such as multimodal analytics and SNA could help promote teacher inquiry and teacher reflection. Visualisation and visual analytics techniques are very significant in TA and also encourage teacher inquiry. Visualisation dashboards and TADs are essential tools that modern teachers require to carry out continuous and efficient reflection on teaching practice.

The emphasis of the synthesis of TA was clearly on data collection, analysis and visualisation, as illustrated in Fig. 6. The literature reports various kinds of data collected and used to improve teaching practice, including:

Digital trace data: "records of activity (trace data) undertaken through an online information system (thus, digital)" [119]. They incorporate various activities generated from custom applications and learning environments that leave digital footprints.

Image data: photographic or trace objects representing the underlying pixel data of an area of an image element.

Physiological data: body measurements from body-mounted sensors (Lazar et al. 2017), used to collect data from teachers while they perform classroom teaching activities.

Audio-video stream data: recorded lecture data capturing physical teaching activities and student learning activities, obtainable with mounted cameras, computer or mobile cameras connected to applications such as Zoom and Skype, eye trackers with recording capabilities, and digital cameras connected to learning environments such as Eco365.

Social data: data from online social activities, including students' assessment data collected from social media sites using the repertory grid technique.

Text data: quantitative and qualitative data generated from text documents such as discussion forums, student essays or articles, emails and chat messages.

Figure 6: Dimensions of TA, based on the literature

Analysis in this context refers to the application of Educational Data Mining (EDM) and deep learning techniques to process data. EDM is a complicated process that requires the interweaving of various kinds of specialised knowledge and ML algorithms, especially to improve teaching and learning (Chen 2019). NLP and classification are the two main EDM techniques applied in TA, although the review also recognised the use of other methods, such as clustering and deep learning techniques, to support teachers.

As is commonly said, a picture is worth a thousand words; visualisation can effectively communicate and reveal structures, patterns and trends in variables and their interconnections. Research in TA has applied several visualisation techniques, including network, timeline, spatial, table and statistical graphs. For instance, SNA is a form of visual analytics used to support teachers in determining how different groups interact and engage with course resources. Identifying differences in interaction patterns between groups of students can expose differences in learning outcomes, such as how the access patterns of successful groups of students differ from those of unsuccessful students. Applying visualisation techniques can support teachers in areas such as advising underperforming students about effective ways to approach study. Visualisation can also enable teachers to identify groups of students that might need assistance and to discover new and efficient ways of using collaborative systems to achieve group work, which can then be taught explicitly to students. A small SNA sketch follows.
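A small illustrative SNA sketch in Python with networkx for the use case above: a forum-reply network is built and a simple centrality measure is used to spot students on the periphery of the discussion. The interaction data are invented.

# SNA sketch: who is least connected in the discussion network?
import networkx as nx

replies = [("ana", "ben"), ("ben", "ana"), ("ana", "chen"),
           ("dina", "ana"), ("chen", "ben")]
G = nx.DiGraph()
G.add_edges_from(replies)

centrality = nx.degree_centrality(G)
peripheral = [s for s, c in sorted(centrality.items(), key=lambda kv: kv[1]) if c < 0.5]
print("least connected students:", peripheral)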

However, while acknowledging the incomplete nature of data and the complexities associated with data collection, analysis and use, teachers should take care to avoid bias. Data collected in one context may not be directly applicable to another, and may have both benefits and costs for the individuals or groups from whom the data were harvested. Therefore, key stakeholders, including teachers, course directors, unit coordinators and researchers, must pay proper attention to predictive models and algorithms and take extra care to ensure that the contexts of the data analysed are carefully considered. There are also privacy concerns, such as who has access to view analytics relating to a teacher, including other faculty members directly or indirectly involved in the course, administrators, researchers, and future employers at other institutions. It would be useful for institutions to have clear guidelines as to who has access to what and who views what. Another open issue is how long data should remain accessible (Siemens 2013); with big data technology and infrastructure, data can in principle be kept for as long as they exist. Pardo and Siemens (2014) acknowledged that the use of analytics in higher education research has no clear interpretation of the right to privacy. They seem opposed to the need for absolute privacy, on the basis that the use of historical data enhances research, with potential rewards for the future of teacher professional development and student outcomes.

The review provided in the current article highlighted significant limitations in the existing literature on teaching analytics. The TAD is proposed to guide teachers, developers and researchers in understanding and optimising teaching and the learning environment. A critical aspect of this review is establishing the link between LA, TA and LD and describing its value in informing teachers' inquiry processes. Finally, the article proposes TOM, which draws on a research-based approach to guide teachers on how to utilise data to improve teaching. The outcome of this model is a TAD that provides actionable insights for teacher reflection and informed decision-making, demonstrating the value that TA brings to pedagogic interventions and teacher reflection.

Theoretical implications

The analysis of data collected from the interaction of teachers with technology and students is a promising approach for advancing our understanding of the teaching process and how it can be supported. Teachers can use data obtained from their teaching to reflect on their pedagogical design and optimise the learning environment to meet students’ diverse needs and expectations.

Teacher-centric learning design can improve the utility of new technologies and the subsequent acceptance of these technologies to improve the quality of teaching and enhance students' learning experience. The TAD is one class of tool that can be designed in ways that improve teaching practice.

Research on learning analytics has revealed useful insights about students' learning and the contexts in which they learn. While the ability to track, harvest and analyse various forms of learning analytics can reveal useful insights about learners' engagement with learning environments, our review suggests that there is limited focus on analytics relating to the teacher, their teaching approaches and activities. There have also been increasing advances in the design of learner and teaching dashboards; however, many teachers still struggle to understand and interpret dashboards, partly because they lack data literacy skills, and mostly because the design of many of these tools does not include teachers as partners.

Although TADs enable teachers to inspect and understand the processes and progress relating to their teaching, current implementations of TADs generally do not provide teachers with the details they need or want in a readily usable format. Educational technology developers can use our proposed model to design better tools for improving teaching practice. For example, a TAD can be designed to perform text analytics on students' qualitative comments about a course and present the results to the teacher as themes, sentiments and classifications, in a way that supports the instructor's needs and preferences for insight generation and reflection.

Teachers monitor, observe and track both teaching and learning activities to make appropriate decisions. It is also important to note that visualisations can be misrepresented, misinterpreted or misused by the viewer [122]. Hence, perception and cognition remain significant challenges in TAD design. Consequently, it becomes necessary to design algorithms and information visualisations in ways that allow adequate understanding by teachers. It is also crucial for dashboards to integrate multiple sources, such as combining both learning and teaching activities into a TAD, so that teachers can comprehend, reflect on and act upon the presented information quickly.

Also, the current state of technology shows little progress in taking TA forward, raising concerns about the validity and scalability of innovations such as predictive analytics and TADs. Furthermore, the ethical issues of data use have not been considered sufficiently to establish institutional policies that incorporate TA as part of quality education models.

Finally, consideration of the framework's three layers as a whole raises new questions and opportunities. For example, linking educational performance and satisfaction to a specific learning design involves elements of all three layers. This review has shown that TA is a new and essential area of analytics in education, and that the conceptualisation of teaching analytics is still in its infancy. The practical and successful use of teaching analytics, however, depends heavily on the further development of its conceptual and theoretical foundations.

Implications for practice

This review has uncovered the value of TA and its role in fostering data literacy skills in teachers to support evidence-based teaching. The purpose of TOM is to guide the development of teaching dashboards and to help researchers develop strategies for presenting data to teachers in meaningful ways. Teaching dashboards can empower teachers with tools that create new opportunities to make data-informed strategic decisions, using the power of analytics and visualisation techniques, and consequently increase the efficiency and effectiveness of the institution, including improving teaching practice, curriculum development, active learning engagement and student success. TOM also presents a platform for teaching academics, who may have the best understanding of their course contexts, to make a significant contribution to a culture of data-informed teaching practice within an institution.

The responsibility for managing the systems that provide analytics usually falls under the control and supervision of the institution's information technology (IT) department, which often has little or no knowledge of the pedagogical application of these systems to teaching and learning. Likewise, academics and learning support staff often lack IT skills and have little professional understanding of how the software systems work. TOM provides opportunities for teachers to be involved in the design of TA by fostering significant interaction and collaboration between IT and the other parties that interpret and act upon the information flow.

Additionally, institutions need to provide teaching staff with training that fosters the development of data literacy skills and the use of data and analytical or visualisation dashboards to monitor their teaching practice. Based on some of the challenges identified in the present review, it is imperative that institutions ensure data are collected transparently, with the awareness of all the stakeholders involved and the informed consent of individuals where appropriate. With advancements in computing technology, data collection, analysis and use have increased significantly; large amounts of data can be continually pulled from different sources and processed at high speed. Big data offers institutions the opportunity to implement big data infrastructures and utilise the full potential of data analytics and visualisation. However, institutions also need to consider implementing a data governance framework to guide the implementation and practice of analytics.

The conceptual framework of TA was established to demonstrate the relationship between LA, TA and LD, which can be useful knowledge to various institutional stakeholders, including learners, teachers, researchers and administrators. However, there are also issues around data ownership, intellectual property rights and licensing for data re-use (by the students, the instructor, the researcher or the institution). For instance, the same data sources can be shared among the various stakeholders but with different levels of access, so data sharing agreements would be needed to govern sharing without infringing rights, violating privacy or disadvantaging individuals. Implementing data sharing agreements would require building institutional, group and individual trust, and would include guidelines on sharing data within the institution and with third parties such as external organisations and other institutions. In general, stricter data management policies that guide data collection, analysis and use are essential for every institution.

Limitations and future research

Teaching analytics is an emergent phenomenon in the learning analytics and data science literature, with a limited body of published work in the area; as such, conclusions drawn from the review are limited to the databases interrogated and the articles reviewed. Further, the findings of the review are likely to be influenced by our interpretation of the literature and by untestable assumptions. For example, the linking of LA, TA and LD and its underlying assumptions is not grounded in empirical work. The review serves as an advocacy for teacher data literacy and the ability to work with various forms of data; however, some individual data points may not be publicly accessible to teachers.

Moreover, combining analytics across several data points may lead to some level of identification, which would require navigating issues around access, protecting privacy and obtaining appropriate consent. It is therefore almost impossible for individual teachers to comprehend not only the scope of the data collected, analysed and used, but also the consequences of the different layers of collection, analysis and use. This makes it challenging for teachers to exploit the full potential of data when making informed choices in learning design. No matter how straightforward or transparent institutional policies around data are, the sheer complexity of collection, analysis and use poses a fundamental issue for stakeholders trying to use analytics to enhance teaching practice and learning outcomes across an institution.

In future research, we hope to carry out more extensive empirical research on how TOM could be applied to address ethical and privacy concerns about the use of TA. We are currently exploring how teaching analytics dashboards can be used to support teacher data literacy and the use of analytics to improve teaching practice and learning outcomes.

Availability of data and materials

Not applicable.

Abbreviations

AA: Academic analytics

AI: Artificial intelligence

EDM: Educational data mining

HE: Higher education

IWB: Interactive whiteboard

LA: Learning analytics

LD: Learning design

LMS: Learning management system

ML: Machine learning

MOOC: Massive open online course

NLP: Natural language processing

OLM: Open learner model

SET: Student evaluation of teaching

SNA: Social network analysis

TA: Teaching analytics

TAD: Teaching analytics dashboard

TF-IDF: Term frequency-inverse document frequency

TLA: Teaching and learning analytics

TOM: Teaching outcome model

TPACK: Technology, pedagogy, and content knowledge

TPD: Teacher professional development

Adams, M.J., & Umbach, P.D. (2012). Nonresponse and online student evaluations of teaching: Understanding the influence of salience, fatigue, and academic environments. Research in Higher Education , 53 (5), 576–591.


Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for interaction. The International Review of Research in Open Distributed Learning , 4 (2).

Asare, S., & Daniel, B.K. (2017). Factors influencing response rates in online student evaluation systems: A systematic review approach. In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education . Association for the Advancement of Computing in Education (AACE), (pp. 537–541).

Assunção, M.D., Calheiros, R.N., Bianchi, S., Netto, M.A., Buyya, R. (2015). Big data computing and clouds: Trends and future directions. Journal of Parallel and Distributed Computing , 79 , 3–15.

Bakharia, A., Corrin, L., De Barba, P., Kennedy, G., Gašević, D., Mulder, R., Williams, D., Dawson, S., Lockyer, L. (2016). A conceptual framework linking learning design with learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge . ACM, (pp. 329–338).

Banerjee, A., Bandyopadhyay, T., Acharya, P. (2013). Data analytics: Hyped up aspirations or true potential? Vikalpa , 38 (4), 1–12.

Barmaki, R., & Hughes, C.E. (2015). Providing real-time feedback for student teachers in a virtual rehearsal environment. In Proceedings of the 2015 ACM on International Conference on Multimodal Interaction . ACM, (pp. 531–537).

Beer, C., Jones, D., Clark, K. (2009). The indicators project identifying effective learning: Adoption, activity, grades and external factors. In Ascilite . Citeseer.

Benton, S.L., & Cashin, W.E. (2014). Student ratings of instruction in college and university courses , (pp. 279–326): Springer.

Bihani, P., & Patil, S. (2014). A comparative study of data analysis techniques. International Journal of Emerging Trends & Technology in Computer Science , 3 (2), 95–101.


Borgman, C.L., Abelson, H., Dirks, L., Johnson, R., Koedinger, K.R., Linn, M.C., Lynch, C.A., Oblinger, D.G., Pea, R.D., Salen, K. (2008). Fostering learning in the networked world: The cyberlearning opportunity and challenge. a 21st century agenda for the national science foundation. https://doi.org/10.1037/e532532011-001 .

Boring, A. (2015). Gender biases in student evaluation of teachers . Paris. https://doi.org/10.1016/j.jpubeco.2016.11.006 .

Boring, A., Ottoboni, K., Stark, P.B. (2016). Student evaluations of teaching are not only unreliable, they are significantly biased against female instructors. Impact of Social Sciences Blog . The London School of Economics and Political Science.

Bos, N., & Brand-Gruwel, S. (2016). Student differences in regulation strategies and their use of learning resources: implications for educational design. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge . ACM, (pp. 344–353).

Braga, M., Paccagnella, M., Pellizzari, M. (2014). Evaluating students’ evaluations of professors. Economics of Education Review , 41 , 71–88.

Butson, R., & Daniel, B. (2017). The Rise of Big Data and Analytics in Higher Education , (pp. 127–140): Auerbach Publications.

Charleer, S., Klerkx, J., Odriozola, S., Luis, J., Duval, E. (2013). Improving awareness and reflection through collaborative, interctive visualizations of badges. In ARTEL13: Proceedings of the 3rd Workshop on Awareness and Reflection in Technology-enhanced Learning, vol. 1103 . CEUR Workshop Proceedings, (pp. 69–81).

Chatti, M.A., Dyckhoff, A.L., Schroeder, U., Thüs, H. (2013). A reference model for learning analytics. International Journal of Technology Enhanced Learning, 4(5/6), 318–331.

Chen, L.-L. (2019). Enhancing teaching with effective data mining protocols. Journal of Educational Technology Systems , 47 (4), 500–512.

Chen, Z., Chudzicki, C., Palumbo, D., Alexandron, G., Choi, Y.-J., Zhou, Q., Pritchard, D.E. (2016). Researching for better instructional methods using ab experiments in moocs: results and challenges. Research and Practice in Technology Enhanced Learning , 11 (1), 9.

Choudhury, S., Hobbs, B., Lorie, M., Flores, N. (2002). A framework for evaluating digital library services. D-Lib magazine , 8 (7/8), 1082–9873.

Chounta, I.-A., McLaren, B.M., Albacete, P.L., Jordan, P.W., Katz, S. (2016). Analysis of human-to-human tutorial dialogues: Insights for teaching analytics. In IWTA@ EC-TEL , (pp. 9–17).

Cumbley, R., & Church, P. (2013). Is “big data” creepy? Computer Law & Security Review , 29 (5), 601–609.

Dana, N.F., & Yendol-Hoppey, D. (2019). The Reflective Educator’s Guide to Classroom Research: Learning to Teach and Teaching to Learn Through Practitioner Inquiry : Corwin.

Daniel, B. (2015). Big data and analytics in higher education: Opportunities and challenges. British Journal of Educational Technology , 46 (5), 904–920. https://doi.org/10.1111/bjet.12230 .

Daniel, B., & Harland, T. (2017). Higher Education Research Methodology: A Step-by-Step Guide to the Research Process . Routledge London. https://doi.org/10.4324/9781315149783 .

Daniel, B.K. (2019). Artificial reality: The practice of analytics and big data in educational research. In: Pedersen, J.S., & Wilkinson, A. (Eds.) In Big data: Promise, application and pitfalls . https://doi.org/10.4337/9781788112352.00018 . Edward Elgar, Cheltenham, (pp. 287–300).


De Freitas, S., Gibson, D., Du Plessis, C., Halloran, P., Williams, E., Ambrose, M., Dunwell, I., Arnab, S. (2015). Foundations of dynamic learning analytics: Using university student data to increase retention. British Journal of Educational Technology , 46 (6), 1175–1188.

Dix, A.J., & Leavesley, J. (2015). Learning analytics for the academic: An action perspective. J. UCS , 21 (1), 48–65.

Ducheva, Z., Pehlivanova, M., Dineva, S. (2013). Possibilities for students to evaluate and improve electronic courses. In The 8th International Conferemnce on Virtual Learning ICVL .

Edström, K. (2008). Doing course evaluation as if learning matters most. Higher Education Research & Development , 27 (2), 95–106.

Elias, T. (2011). Learning analytics. Learning , 1–22.

Ferguson, R. (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning , 4 (5/6), 304–317.

Fischer, F., Wild, F., Sutherland, R., Zirn, L. (2014). Grand Challenge Problems from the Alpine Rendez-Vous , (pp. 3–71): Springer.

Flanders, N.A. (1970). Analyzing Teacher Behavior . Addison-Wesley P.C.

Flavin, M. (2017). Disruptive Technology Enhanced Learning: The Use and Misuse of Digital Technologies in Higher Education : Springer. https://doi.org/10.1057/978-1-137-57284-4 .

Gandomi, A., & Haider, M. (2015). Beyond the hype: Big data concepts, methods, and analytics. International Journal of Information Management , 35 (2), 137–144.

Gauthier, G. (2013). Using teaching analytics to inform assessment practices in technology mediated problem solving tasks. In IWTA@ LAK .

Ginon, B., Johnson, M.D., Turker, A., Kickmeier-Rust, M. (2016). Helping Teachers to Help Students by Using an Open Learner Model. https://doi.org/10.1007/978-3-319-45153-4_69 .

Goggins, S.P. (2012). Group informatics: A multi-domain perspective on the development of teaching analytics. In Proceedings of the TaPTA Workshop at EC-TEL .

Goggins, S.P., Galyen, K., Petakovic, E., Laffey, J.M. (2016). Connecting performance to social structure and pedagogy as a pathway to scaling learning analytics in moocs: an exploratory study. Journal of Computer Assisted Learning , 32 (3), 244–266.

Gorham, J. (1988). The relationship between verbal teacher immediacy behaviors and student learning. Communication Education , 37 (1), 40–53.

Griffiths, D. (2017). The use of models in learning design and learning analytics. Interaction Design and Architecture(s) Journal , 33 , 113–133.

Gudivada, V.N., Irfan, M., Fathi, E., Rao, D. (2016). Cognitive analytics: Going beyond big data analytics and machine learning (Vol. 35, pp. 169–205).

Gudivada, V.N., Rao, D.L., Ding, J. (2018). 2. Evolution and Facets of Data Analytics for Educational Data Mining and Learning Analytics , (pp. 16–42). New York. https://doi.org/10.4324/9780203728703-3 .

Harland, T., & Wald, N. (2018). Vanilla teaching as a rational choice: the impact of research and compliance on teacher development. Teaching in Higher Education , 23 (4), 419–434.

Hernández-Leo, D., Martinez-Maldonado, R., Pardo, A., Muñoz-Cristóbal, J.A., Rodríguez-Triana, M.J. (2019). Analytics for learning design: A layered framework and tools. British Journal of Educational Technology , 50 (1), 139–152.

Herodotou, C., Hlosta, M., Boroowa, A., Rienties, B., Zdrahal, Z., Mangafa, C. (2019). Empowering online teachers through predictive learning analytics. British Journal of Educational Technology . https://doi.org/10.1111/bjet.12853 .

Huang, C.-W. (2001). Educlick: A computer-supported formative evaluation system with wireless devices in ordinary classroom. In Proceedings of Int. Conference on Computers in Education, 2010 , (pp. 1462–1469).

Joseph, R.C., & Johnson, N.A. (2013). Big data and transformational government. IT Professional , 15 (6), 43–48.

Kaser, L., & Halbert, J. (2014). Creating and sustaining inquiry spaces for teacher learning and system transformation. European Journal of Education , 49 (2), 206–217.

Kay, D., Korn, N., Oppenheim, C. (2012). Legal, risk and ethical aspects of analytics in higher education. Analytics Series . JISC Cetis (Centre for educational technology and interoperability standards).

Ku, O., Liang, J.-K., Chang, S.-B., Wu, M. (2018). Sokrates teaching analytics system (STAS): An automatic teaching behavior analysis system for facilitating teacher professional development. In Proceedings of the 26th International Conference on Computers in Education. Philippines: Asia-Pacific Society for Computers in Education.

Laney, D. (2001). 3d data management: Controlling data volume, velocity and variety. META Group Research Note. META group research note , 6 (70), 1.

Lazar, J., Feng, J.H., Hochheiser, H. (2017). Research Methods in Human-computer Interaction : Morgan Kaufmann.

Leitner, P., Khalil, M., Ebner, M. (2017). Learning analytics in higher education—a literature review , (pp. 1–23): Springer.

Libbrecht, P., Kortenkamp, U., Rebholz, S., Müller, W. (2013). Tales of a companion teacher analytics. In IWTA@ LAK .

Linse, A.R. (2017). Interpreting and using student ratings data: Guidance for faculty serving as administrators and on evaluation committees. Studies in Educational Evaluation , 54 , 94–106.

Loughran, J.J. (2002). Effective reflective practice: In search of meaning in learning about teaching. Journal of teacher education , 53 (1), 33–43.

Macfadyen, L.P., & Dawson, S. (2012). Numbers are not enough. why e-learning analytics failed to inform an institutional strategic plan. Journal of Educational Technology Society , 15 (3).

MacNell, L., Driscoll, A., Hunt, A.N. (2015). What’s in a name: Exposing gender bias in student ratings of teaching. Innovative Higher Education , 40 (4), 291–303.

Mansfield, J. (2019). Pedagogical Equilibrium: The Development of Teachers’ Professional Knowledge : Routledge.

Marlin Jr, J.W., & Niss, J.F. (1980). End-of-course evaluations as indicators of student learning and instructor effectiveness. The Journal of Economic Education , 11 (2), 16–27.

Marsh, H.W. (1987). Students’ evaluations of university teaching: Research findings, methodological issues, and directions for future research. International Journal of Educational Research , 11 (3), 253–388.

McKenney, S., & Mor, Y. (2015). Supporting teachers in data–informed educational design. British journal of educational technology , 46 (2), 265–279.

Menchen-Trevino, E. (2016). Web historian: Enabling multi-method and independent research with real-world web browsing history data. In IConference 2016 Proceedings . https://doi.org/10.9776/16611 .

Michos, K., & Hernández Leo, D. (2016). Towards understanding the potential of teaching analytics within educational communities. In: Vatrapu, R.G.B.B.S., & Kickmeier-Rust, M. (Eds.) In Vatrapu R, Kickmeier-Rust M, Ginon B, Bull S. IWTA 2016 International Workshop on Teaching Analytics. Proceedings of the Fourth International Workshop on Teaching Analytics, in Conjunction with EC-TEL 2016; 2016 Sept 16; Lyon, France.[place Unknown]: CEUR Workshop Proceedings , (pp. 1–8).

Moore, B.L. (2018). The role of data analytics in education: Possibilities and limitations, 1st edn. https://doi.org/10.4324/9780203728703-8.

Mor, Y., Ferguson, R., Wasson, B. (2015). Learning design, teacher inquiry into student learning and learning analytics: A call for action. British Journal of Educational Technology , 46 (2), 221–229.

Müller, W., Rebholz, S., Libbrecht, P. (2016). Automatic inspection of e-portfolios for improving formative and summative assessment. In International Symposium on Emerging Technologies for Education . Springer, (pp. 480–489).

Norris, D., Baer, L., Leonard, J., Pugliese, L., Lefrere, P. (2008). Action analytics: Measuring and improving performance that matters in higher education. EDUCAUSE Review , 43 (1), 42.

Olson, D.L., & Lauhoff, G. (2019). Descriptive data mining , (pp. 129–130): Springer.

Osterman, K.F., & Kottkamp, R.B. (1993). Reflective Practice for Educators: Improving Schooling Through Professional Development : ERIC.

Pantazos, K., & Vatrapu, R. (2016). Enhancing the professional vision of teachers: A physiological study of teaching analytics dashboards of students’ repertory grid exercises in business education. In System Sciences (HICSS), 2016 49th Hawaii International Conference On . IEEE, (pp. 41–50).

Pantazos, K., Vatrapu, R.K., Hussain, A. (2013). Visualizing repertory grid data for formative assessment. In IWTA@ LAK .

Papamitsiou, Z., & Economides, A.A. (2016). Learning analytics for smart learning environments: A meta-analysis of empirical research results from 2009 to 2015. Learning, Design, and Technology: An International Compendium of Theory, Research, Practice, and Policy , 1–23. https://doi.org/10.1007/978-3-319-17727-4_15-1 .

Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology , 45 (3), 438–450.

Pascual-Miguel, F., Chaparro-Pelaez, J., Hernandez-Garcia, A., Iglesias-Pradas, S. (2011). A characterisation of passive and active interactions and their influence on students’ achievement using moodle lms logs. International Journal of Technology Enhanced Learning , 3 (4), 403–414.

Pennings, H.J., Brekelmans, M., Wubbels, T., van der Want, A.C., Claessens, L.C., van Tartwijk, J. (2014). A nonlinear dynamical systems approach to real-time teacher behavior: Differences between teachers. Nonlinear Dynamics, Psychology, and Life Sciences , 18 (1), 23–45.

Persico, D., & Pozzi, F. (2015). Informing learning design with learning analytics to improve teacher inquiry. British Journal of Educational Technology , 46 (2), 230–248.

Pozzi, F., & Persico, D. (2013). Sustaining learning design and pedagogical planning in cscl. Research in Learning Technology , 21 . https://doi.org/10.3402/rlt.v21i0.17585 .

Prieto, L.P., Magnuson, P., Dillenbourg, P., Saar, M. (2017). Reflection for action: Designing tools to support teacher reflection on everyday evidence. https://doi.org/10.31219/osf.io/bj2rp .

Prieto, L.P., Rodriguez-Triana, M.J., Kusmin, M., Laanpere, M. (2017). Smart school multimodal dataset and challenges. In Joint Proceedings of the Sixth Multimodal Learning Analytics (MMLA) Workshop and the Second Cross-LAK Workshop Co-located with 7th International Learning Analytics and Knowledge Conference, vol. 1828 . CEUR, (pp. 53–59).

Prieto, L.P., Sharma, K., Dillenbourg, P., Jesús, M. (2016). Teaching analytics: towards automatic extraction of orchestration graphs using wearable sensors. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge . ACM, (pp. 148–157).

Prieto, L.P., Sharma, K., Kidzinski, Ł., Rodríguez-Triana, M.J., Dillenbourg, P. (2018). Multimodal teaching analytics: Automated extraction of orchestration graphs from wearable sensor data. Journal of Computer Assisted Learning , 34 (2), 193–203.

Ramos, C., & Yudko, E. (2008). “Hits” (not “discussion posts”) predict student success in online courses: A double cross-validation study. Computers & Education , 50 (4), 1174–1182.

Rayward-Smith, V.J. (2007). Statistics to measure correlation for data mining applications. Computational Statistics & Data Analysis , 51 (8), 3968–3982.


Remmers, H.H., & Brandenburg, G. (1927). Experimental data on the purdue rating scale for instructors. Educational Administration and Supervision , 13 (6), 399–406.

Rienties, B., Boroowa, A., Cross, S., Farrington-Flint, L., Herodotou, C., Prescott, L., Mayles, K., Olney, T., Toetenel, L., Woodthorpe, J. (2016). Reviewing three case-studies of learning analytics interventions at the open university uk. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge . ACM, (pp. 534–535).

Roberts, L.D., Chang, V., Gibson, D. (2017). Ethical considerations in adopting a university- and system-wide approach to data and learning analytics , (pp. 89–108): Springer. https://doi.org/10.1007/978-3-319-06520-5_7.

Romero, C., & Ventura, S. (2010). Educational data mining: a review of the state of the art. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews) , 40 (6), 601–618.

Saar, M., Kusmin, M., Laanpere, M., Prieto, L.P., Rüütmann, T. (2017). Work in progress–semantic annotations and teaching analytics on lecture videos in engineering education. In Global Engineering Education Conference (EDUCON), 2017 IEEE . IEEE, (pp. 1548–1551).

Saar, M., Prieto, L.P., Rodríguez-Triana, M.J., Kusmin, M. (2018). Personalized, teacher-driven in-action data collection: technology design principles. In 2018 IEEE 18th International Conference on Advanced Learning Technologies (ICALT) . IEEE, (pp. 58–62).

Saric, M., & Steh, B. (2017). Critical reflection in the professional development of teachers: Challenges and possibilities. CEPS Journal , 7 (3), 67–85.

Saye, J.W., & Brush, T. (2007). Using technology-enhanced learning environments to support problem-based historical inquiry in secondary school classrooms. Theory Research in Social Education , 35 (2), 196–230.

Schempp, P., McCullick, B., Pierre, P.S., Woorons, S., You, J., Clark, B. (2004). Expert golf instructors’ student-teacher interaction patterns. Research Quarterly for Exercise and Sport , 75 (1), 60–70.

Schmidlin, K., Clough-Gorr, K.M., Spoerri, A. (2015). Privacy preserving probabilistic record linkage (p3rl): a novel method for linking existing health-related data and maintaining participant confidentiality. BMC Medical Research Methodology , 15 (1), 46.

Sergis, S., & Sampson, D.G. (2016). Towards a teaching analytics tool for supporting reflective educational (re) design in inquiry-based stem education. In Advanced Learning Technologies (ICALT), 2016 IEEE 16th International Conference On . https://doi.org/10.1109/icalt.2016.134 . IEEE, (pp. 314–318).

Sergis, S., & Sampson, D.G. (2017). Teaching and learning analytics to support teacher inquiry: A systematic literature review , (pp. 25–63): Springer. https://doi.org/10.1007/978-3-319-52977-6_2 .

Sergis, S., Sampson, D.G., Rodríguez-Triana, M.J., Gillet, D., Pelliccione, L., de Jong, T. (2017). Using educational data from teaching and learning to inform teachers’ reflective educational design in inquiry-based stem education. Computers in Human Behavior . https://doi.org/10.1016/j.chb.2017.12.014 .

Shen, J., Chen, H., Jiang, J. (2018). A research on techniques for data fusion and analysis of cross-platform mooc data. In 2018 17th International Conference on Information Technology Based Higher Education and Training (ITHET) . IEEE, (pp. 1–8).

Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist , 57 (10), 1380–1400.

Stier, S., Breuer, J., Siegers, P., Thorson, K. (2019). Integrating Survey Data and Digital Trace Data: Key Issues in Developing an Emerging Field. https://doi.org/10.1177/0894439319843669 .

Subramanya, S. (2014). Toward a more effective and useful end-of-course evaluation scheme. Journal of Research in Innovative Teaching , 7 (1).

Suehiro, D., Taniguchi, Y., Shimada, A., Ogata, H. (2017). Face-to-face teaching analytics: Extracting teaching activities from e-book logs via time-series analysis. In Advanced Learning Technologies (ICALT), 2017 IEEE 17th International Conference On . IEEE, (pp. 267–268).

Sun, J., Przybylski, R., Johnson, B.J. (2016). A review of research on teachers’ use of student data: From the perspective of school leadership. Educational Assessment, Evaluation and Accountability , 28 (1), 5–33.

Taniguchi, Y., Suehiro, D., Shimada, A., Ogata, H. (2017). Revealing hidden impression topics in students’ journals based on nonnegative matrix factorization. In Advanced Learning Technologies (ICALT), 2017 IEEE 17th International Conference On . IEEE, (pp. 298–300).

Thille, C., & Zimmaro, D. (2017). Incorporating learning analytics in the classroom. New Directions for Higher Education , 2017 (179), 19–31.

Thomas, C. (2018). Multimodal teaching and learning analytics for classroom and online educational settings. In Proceedings of the 2018 on International Conference on Multimodal Interaction . ACM, (pp. 542–545).

ur Rehman, M.H., Chang, V., Batool, A., Wah, T.Y. (2016). Big data reduction framework for value creation in sustainable enterprises. International Journal of Information Management , 36 (6), 917–928.

Van Harmelen, M., & Workman, D. (2012). Analytics for learning and teaching. CETIS Analytics Series , 1 (3), 1–40.

van Leeuwen, A., Rummel, N., van Gog, T. (2019). What information should cscl teacher dashboards provide to help teachers interpret cscl situations? International Journal of Computer-Supported Collaborative Learning , 1–29. https://doi.org/10.1007/s11412-019-09299-x .

Vatrapu, R.K. (2012). Towards semiology of teaching analytics. In Workshop Towards Theory and Practice of Teaching Analytics, at the European Conference on Technology Enhanced Learning, TAPTA, vol. 12 .

Vatrapu, R.K., Kocherla, K., Pantazos, K. (2013). iklassroom: Real-time, real-place teaching analytics. In IWTA@ LAK .

Vatrapu, R., Reimann, P., Bull, S., Johnson, M. (2013). An eye-tracking study of notational, informational, and emotional aspects of learning analytics representations. In ACM International Conference Proceeding Series . https://doi.org/10.1145/2460296.2460321 . https://www.scopus.com/inward/record.uri?eid=2-s2.0-84876499638&doi=10.1145%2f2460296.2460321&partnerID=40&md5=e7b4d83a3e33e7a1c3c5b5f56d5ebe7d , (pp. 125–134).

Vatrapu, R., Reimann, P., Hussain, A., Kocherla, K. (2013). Towards teaching analytics: Repertory grids for formative assessment (rgfa). In CSCL 2013 Conference Proceedings, vol 2 , (pp. 422–426).

Vatrapu, R., Tanveer, U., Hussain, A. (2012). Towards teaching analytics: communication and negotiation tool (coneto). In Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design . ACM, (pp. 775–776).

Vatrapu, R., Teplovs, C., Fujita, N., Bull, S. (2011). Towards visual analytics for teachers’ dynamic diagnostic pedagogical decision-making. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge . ACM, (pp. 93–98).

Verbert, K., Duval, E., Klerkx, J., Govaerts, S., Santos, J.L. (2013). Learning analytics dashboard applications. American Behavioral Scientist , 57 (10), 1500–1509.

Voithofer, R., & Golan, A.M. (2018). Data sources for educators (1st edn., Chapter 5, p. 18). https://doi.org/10.4324/9780203728703-7.

Waller, M.A., & Fawcett, S.E. (2013). Data science, predictive analytics, and big data: a revolution that will transform supply chain design and management. Journal of Business Logistics , 34 (2), 77–84.

Wang, F., & Hannafin, M.J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development , 53 (4), 5–23.

Williamson, B. (2016). Digital education governance: data visualization, predictive analytics, and ’real-time’ policy instruments. Journal of Education Policy , 31 (2), 123–141.

Winkler, R., & Söllner, M. (2018). Unleashing the potential of chatbots in education: A state-of-the-art analysis. In Academy of Management Annual Meeting (AOM) . https://doi.org/10.5465/ambpp.2018.15903abstract .

Xu, B., & Recker, M. (2012). Teaching analytics: A clustering and triangulation study of digital library user data. Educational Technology & Society , 15 (3), 103–115.

Yigitbasioglu, O.M., & Velcu, O. (2012). A review of dashboards in performance management: Implications for design and research. International Journal of Accounting Information Systems , 13 (1), 41–59.


Acknowledgements

The research reported is part of an ongoing PhD research study in the area of Big Data Analytics in Higher Education. We also want to thank members of the Technology Enhanced Learning and Teaching (TELT) Committee of the University of Otago, New Zealand for support and for providing constructive feedback.

This research project was fully sponsored by Higher Education Development Centre, University of Otago, New Zealand.

Author information

Authors and affiliations.

Higher Education Development Centre, University of Otago, Dunedin, New Zealand

Ifeanyi Glory Ndukwe & Ben Kei Daniel


Contributions

IGN conceived and presented the Conceptualisation of Teaching Analytics and Teaching Outcome Model. BKD developed the Tripartite Approach that was utilised in this research. BKD encouraged IGN to perform a systematic review of teaching analytics that was guided by the Tripartite Approach. BKD supervised the findings of this work. IGN took the lead in writing the manuscript. All authors discussed the results, provided critical feedback and contributed to the final manuscript.

Corresponding author

Correspondence to Ifeanyi Glory Ndukwe .

Ethics declarations

Competing interests.

The authors declare that they have no competing interests. All authors have approved the manuscript and agree with its submission to the International Journal of Education Technology in Higher Education. This manuscript has not been published and is not under consideration for publication elsewhere.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Ndukwe, I.G., Daniel, B.K. Teaching analytics, value and tools for teacher data literacy: a systematic and tripartite approach. Int J Educ Technol High Educ 17 , 22 (2020). https://doi.org/10.1186/s41239-020-00201-6


Received : 29 October 2019

Accepted : 01 April 2020

Published : 22 June 2020

DOI : https://doi.org/10.1186/s41239-020-00201-6


Open Access | Published: 16 October 2023

Education big data and learning analytics: a bibliometric analysis

  • Shaza Arissa Samsul   ORCID: orcid.org/0000-0003-3417-1433 1 ,
  • Noraffandy Yahaya 1 &
  • Hassan Abuhassna   ORCID: orcid.org/0000-0002-5774-3652 1  

Humanities and Social Sciences Communications volume  10 , Article number:  709 ( 2023 ) Cite this article


Subjects: Science, technology and society

The contemporary era’s extensive use of data, particularly in education, has provided new insights and benefits. Such data is called ‘education big data’, and the process of learning from it is called ‘learning analytics’. Education big data and learning analytics are two important processes that produce impactful results and understanding, and it is crucial to take advantage of them to enhance the current education system. We conduct a bibliometric analysis based on the PRISMA statement template, covering publications from the years 2012–2021. We examine and analyze a total of 250 publications sourced from the Scopus database for insights regarding education big data and learning analytics. All publications were filtered according to specific inclusion and exclusion criteria. Based on the bibliometric analysis conducted, we identify the distribution of education big data and learning analytics publications across the years 2012–2021, the most relevant journals and authors, the most significant countries, the primary research keywords, and the most important subject areas involved. This study presents trends and recommendations in education big data and learning analytics, offers suggestions for improvement, and highlights the potential for enhancing the education system through the full utilization of education big data and learning analytics.


Introduction

Big data in education has become a trend in recent years (Wang, 2016 ). The current era involves the creation and use of an enormous volume of data across several industries, including banking, economics, and education. The transformation of operations into digital form creates a massive digital trove of data, especially in education (Michalik et al., 2014 ). Big data in education can be valuable and can be converted into insight using learning analytics; its possession can produce new knowledge and intuition in the education sector (Wang, 2016 ). The concept of big data is underpinned by the massive increase in the volume, structure, and speed with which data is generated (Daniel, 2017 ). Educators can analyze and improve the traditional educational system through the use of big data (Drigas and Leliopoulos, 2014 ). A key development for learning analytics in recent years has been the growth of digital learning, which has improved the quality and accessibility of educational data (Sghir et al., 2023 ).

Industry 4.0 demands that higher education be upgraded in terms of programs and courses to prepare students for a highly computerized learning environment (Mkrttchian et al., 2021 ). Moreover, technological advancements in big data are unquestionably accelerators for boosting analytics in higher education (Mkrttchian et al., 2021 ). The structure of learning environments may be changed and enhanced as a result of input from learning analytics data (Talan and Demirbilek, 2023 ). Hence, education big data and learning analytics are major facilitators in enhancing the structure of learning environments. The objective of this study is to analyze the trends and recommendations of education big data and learning analytics publications using the bibliometric analysis method. This study also presents a visualization of current trends in education big data and learning analytics across different topics. Data were taken from the Scopus database to answer the following research questions:

What is the distribution of education big data and learning analytics publications in the years 2012–2021?

What are the most relevant journals and authors in education big data and learning analytics research?

What are the most significant countries in the education big data and learning analytics research area?

What are the primary research keywords for education big data and learning analytics within the last decade?

What is the most important subject area involving education big data and learning analytics?

Materials and methods

The systematic literature review (SLR) in this study was conducted using bibliometric analysis and the meta-analysis method. The research process and procedure are based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement template. Through comprehensive reporting with the PRISMA template, readers may evaluate the applicability of the methodologies and, consequently, the veracity of the study’s conclusions (Page et al., 2021 ). Using the PRISMA template involves four phases: identification, screening, eligibility, and, finally, inclusion of studies in the review. The details of this process are explained further in Fig. 1 .

Figure 1: The systematic literature review process using the PRISMA statement template, based on four phases: identification, screening, eligibility, and inclusion.

The topics chosen for this SLR were education big data and learning analytics, and the Scopus database was used to source studies for review. As shown in Fig. 1 , the SLR process involved using the PRISMA statement template for data selection. The first step, in the identification phase, was to identify records in the database using the keywords “Education Big Data” and “Learning Analytics”. This search returned a total of 885 documents.

These results then underwent a screening process, which left 252 documents remaining. The screening excluded studies published in 2022 and some subject areas that were irrelevant to this study, such as business, management, and accounting. Conference papers, reviews, and editorial documents were also excluded from the analysis. The process continued with eligibility screening, which reduced the number of documents to 250 after removing full articles that were not in English. A software tool called VOSviewer was used for data analysis and visualization; as explained by Soegoto et al. ( 2022 ), VOSviewer can effectively analyze and visualize bibliometric data.
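To make the screening step concrete, the sketch below shows how comparable inclusion and exclusion filters could be applied to a Scopus CSV export with pandas. It is a minimal illustration, not the authors' actual pipeline: the file name and the column labels (`Year`, `Document Type`, `Language of Original Document`) are assumptions that depend on the export settings used.

```python
# A minimal sketch of PRISMA-style screening of a Scopus export (assumed file
# name and column names; adjust to match your own export settings).
import pandas as pd

records = pd.read_csv("scopus_export.csv")  # hypothetical export file

# Identification: records retrieved with the two search keywords.
print("Identified:", len(records))

# Screening: keep the 2012-2021 window and drop non-article document types
# (this removes conference papers, reviews, and editorials).
screened = records[
    records["Year"].between(2012, 2021)
    & records["Document Type"].eq("Article")
]

# Eligibility: keep only full texts written in English.
eligible = screened[screened["Language of Original Document"].eq("English")]

print("Included:", len(eligible))
eligible.to_csv("included_records.csv", index=False)
```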

Research question 1

This study sought to examine education big data and learning analytics. The first finding answers the first research question, which concerns the distribution of education big data and learning analytics publications between 2012 and 2021. As shown in Fig. 2 , the number of documents produced increased over this 10-year period. Just one document on this subject was published in 2012, whereas 54 documents were produced in 2021. There was a decrease between 2017 and 2018, from 31 documents to 26, but this was followed by a further increase from 2019 onwards.

Figure 2: Distribution of publications by year (y-axis: number of publications; x-axis: year).
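A year-by-year count such as the one in Fig. 2 can be derived directly from the screened records. The sketch below assumes the `eligible` DataFrame and `Year` column from the screening sketch above and uses matplotlib for the bar chart.

```python
# Count included publications per year and plot their distribution (cf. Fig. 2).
import matplotlib.pyplot as plt

per_year = eligible["Year"].value_counts().sort_index()
print(per_year)  # e.g. 2012 -> 1, ..., 2021 -> 54

ax = per_year.plot(kind="bar")
ax.set_xlabel("Year")
ax.set_ylabel("Number of publications")
plt.tight_layout()
plt.savefig("publications_per_year.png", dpi=150)
```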

Research question 2

The second research question sought to identify the most relevant journals and authors in education big data and learning analytics research. Figure 3 shows the most relevant journals in this research area in terms of total publications (TP). The Scopus database was then searched for the ten most frequently cited journals on this topic. Details about the journals, such as TP, Total Citations (TC), Citation Score, Most Cited Article, Times Cited, and Publisher Name, are shown in Table 1 .

Figure 3: Most relevant journals by total publications (each indicator shows the journal name and its number of publications).

As shown in Table 1 , the most relevant journal in education big data and learning analytics research, with a total of 8 publications and 63 citations, was “IEEE Access” published by IEEE. This journal was followed by “Lecture Notes in Educational Technology” published by Springer Nature, which had a total of 8 publications and 19 citations. The most cited article in this journal was “Big Data Learning Analytics: A new perspective”, which examined the significance of education big data and learning analytics. The “Educational Technology and Society” journal, with a total of 3 publications and 126 citations, was also highly relevant.
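A ranking like the one summarized in Table 1 can be approximated by aggregating the export by source title. The sketch below again assumes the `eligible` DataFrame and the standard Scopus columns `Source title` and `Cited by`; citation scores and most-cited articles would require additional fields not shown here.

```python
# Rank journals by total publications (TP) and total citations (TC), cf. Table 1.
journals = (
    eligible.groupby("Source title")
    .agg(TP=("Source title", "size"), TC=("Cited by", "sum"))
    .sort_values(["TP", "TC"], ascending=False)
)
print(journals.head(10))  # ten most relevant journals
```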

Research question two also determined the most productive authors in the area of education big data and learning analytics. The top fifteen authors were searched in the Scopus database. A list of data about the most productive authors in education big data and learning analytics research, based on TC, is presented in Fig. 4 . A summary of the authors, including Author Name, Year of First Publication, TP, h-Index, TC, Current Affiliation, and Country, is illustrated in Table 2 .

Figure 4: Most productive authors by total citations (y-axis: author name; x-axis: total citations).

Table 2 provides a summary of the most productive authors in education big data and learning analytics research. According to the Scopus database, the most productive author was Ben Williamson from the University of Edinburgh, UK, whose first publication in this area was in 2007 and who, at the time of investigation, had a total of 60 publications, 1700 citations, and an h-index of 25. The second most productive author was Hiroaki Ogata, from Japan, with a total of 371 publications, 3155 citations, and an h-index of 27, followed by Lynne D. Roberts from Australia, with 107 total publications, 1883 TC, and an h-index of 23. Of the top fifteen authors, Ryan Shaun Joazeiro de Baker from Columbia University, New York, United States, had the highest total number of citations, at 7752, and a TP of 278. A list of the most productive authors in education big data and learning analytics research is provided in Table 2 .

Research question 3

The third research question sought to identify the countries with the most significant contributions to research on education big data and learning analytics. Figure 5 is a map that visualizes the most significant countries in this regard in terms of TP, according to the Scopus database. Table 3 presents a summary of the most significant countries in the education big data and learning analytics research area. The criteria listed for the analysis are Rank, Country, TP, and Most Significant Academic Institution.

Figure 5: Map of the most significant countries in education big data and learning analytics research by total publications.

As shown in Table 3 , the most significant country in the education big data and learning analytics research area was the United States, with a total of 59 publications, with City College of New York being the most significant research institution in this area. This was followed by the United Kingdom, with a total of 35 publications, with the University of Aberdeen being the most significant institution. China was ranked third, with a total of 22 publications, and Capital University of Economics and Business was its most significant institution in this area. The other countries that were most productive in this research area are set out in Table 3 .

The study then measured the number of documents produced by each country, which are portrayed using a bar chart to provide a clearer view. As illustrated in Fig. 6 , the highest number of documents produced was in the United States, followed by the United Kingdom, China, and India. Malaysia is also included in the top ten countries, being ranked ninth.

Figure 6: Number of publications by country (y-axis: number of publications; x-axis: country).

Next, this study examined co-authorship relationships between countries in education big data and learning analytics research using the VOSviewer software. The highest total link strength for country co-authorship was found for the United States, with 14 links involving 59 documents and 1544 TC. As shown in Fig. 7 , the country with the second highest link strength was the United Kingdom, which also had 14 links with other countries, involving 35 documents and 752 TC. The map also shows the co-authorship relationships of other countries.

Figure 7: Co-authorship network by country (lines indicate co-authorship links between countries; circle size indicates the number of publications).
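The co-authorship map in Fig. 7 is produced by VOSviewer, but the underlying quantities (links and total link strength per country) can be reconstructed with a small graph computation. The sketch below assumes a hypothetical `countries` column in which the author countries of each record have already been extracted as a semicolon-separated string; parsing countries out of the raw Scopus affiliation field is left aside here.

```python
# Build a country co-authorship network and compute a VOSviewer-style
# "total link strength" (co-authored documents shared with all other countries).
from itertools import combinations
import networkx as nx

G = nx.Graph()
for cell in eligible["countries"].dropna():  # hypothetical pre-parsed column
    countries = sorted({c.strip() for c in cell.split(";") if c.strip()})
    for a, b in combinations(countries, 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1
        else:
            G.add_edge(a, b, weight=1)

total_link_strength = {
    country: sum(data["weight"] for _, _, data in G.edges(country, data=True))
    for country in G.nodes
}
for country, strength in sorted(total_link_strength.items(), key=lambda kv: -kv[1])[:10]:
    print(country, strength)
```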

Research question 4

The fourth research question was about the primary research keywords used in education big data and learning analytics research within the last decade. This study examined the co-occurrence of all keywords in the data associated with education big data and learning analytics. Figure 8 is a map based on the co-occurrence relationships of all keywords in the Scopus database. The keyword that had the highest co-occurrence (Oc) of 126 and link strength of 485 was “Big Data”, followed by “Learning Analytics” (Oc = 89). Other keywords with high co-occurrence included “Learning Systems” (Oc = 28), “Machine Learning” (Oc = 38), “Data Analytics” (Oc = 36), “Data Mining” (Oc = 30), and “Education” (Oc = 36).

Figure 8: Co-occurrence map of all keywords (lines indicate co-occurrence between keywords; circle size indicates the number of occurrences).

The co-occurrence of author keywords was also analyzed, as mapped in Fig. 9 . The keyword with the highest occurrence was “Big Data”, with 90 occurrences and 150 total links with other keywords, followed by “Learning Analytics” (Oc = 88). Other keywords, such as “Machine Learning” (Oc = 33), “Higher Education” (Oc = 32), “Data Analytics” (Oc = 14), and “Educational Data Mining” (Oc = 14), are also included in the map.

Figure 9: Co-occurrence map of author keywords (lines indicate co-occurrence between author keywords; circle size indicates the number of occurrences).
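Keyword occurrence and co-occurrence counts of the kind visualized in Figs. 8 and 9 can be tallied from the keyword columns of the export. The sketch below uses the `Author Keywords` column (semicolon-separated in Scopus exports); adding `Index Keywords` as well would approximate the "all keywords" analysis.

```python
# Tally keyword occurrences and pairwise co-occurrences (cf. Figs. 8 and 9).
from collections import Counter
from itertools import combinations

occurrences = Counter()
co_occurrences = Counter()
for cell in eligible["Author Keywords"].dropna():
    keywords = sorted({kw.strip().lower() for kw in cell.split(";") if kw.strip()})
    occurrences.update(keywords)
    co_occurrences.update(combinations(keywords, 2))

print(occurrences.most_common(10))     # e.g. 'big data', 'learning analytics', ...
print(co_occurrences.most_common(10))  # most frequent keyword pairs
```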

Research question 5

To answer the fifth research question, which sought to identify the most important subject area involving education big data and learning analytics, the data were analyzed according to the subject area. As demonstrated in Fig. 10 , the highest percentage (34.6%), comprising 152 of the documents published, was in the Computer Science area. Computer Science thus appears to be the most relevant subject area regarding education big data and learning analytics. This was followed by 145 (33%) documents from the Social Sciences area. The fewest publications, at three documents, were in the Chemical Engineering area. The remaining data from several publications regarding education big data and learning analytics according to the subject area are presented in Fig. 10 .

Figure 10: Pie chart of the number of publications by subject area.
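The counts in Fig. 10 sum to more than the 250 included documents because a publication can be indexed under several subject areas; the percentages therefore appear to be shares of subject-area assignments rather than of documents (152 assignments being roughly 34.6% of all assignments). The short sketch below makes that calculation explicit, using hypothetical placeholder counts in place of the full Scopus subject-area breakdown.

```python
# Subject-area shares computed over all subject-area assignments (cf. Fig. 10).
# The counts are illustrative placeholders; the full breakdown would come from
# the Scopus "Analyze results" view or a similar export.
subject_counts = {
    "Computer Science": 152,
    "Social Sciences": 145,
    "Chemical Engineering": 3,
    # ... remaining subject areas omitted here ...
}

total_assignments = sum(subject_counts.values())
for subject, count in sorted(subject_counts.items(), key=lambda kv: -kv[1]):
    print(f"{subject}: {count} ({100 * count / total_assignments:.1f}%)")
```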

Based on the result of an analysis of documents sourced from the Scopus database, all of the research questions have been answered in sequence. The distribution of publications on education big data and learning analytics between the years 2012 and 2021 shows an increasing pattern. This clearly shows that awareness about the importance of education big data and learning analytics is rising. Research by Şahin and Yurdugül ( 2020 ) supports this, indicating that education big data and learning analytics are two significant fields that can improve the e-learning environment. Interest in education big data and learning analytics has escalated because of the possibilities of advancement in many sectors. Yu and Couldry ( 2022 ) also stated that digital platforms and learning analytics are proliferating in the education industry. This is supported by the increasing pattern in the distribution of education big data and learning analytics publications in the last decade.

Furthermore, the bibliometric analysis shows that the most relevant journal in education big data and learning analytics research was published by IEEE. The most frequently cited article concerned the use of a data-driven approach for understanding learners’ behavior, and demonstrated that learning analytics offers the most well-organized analytical method for advancing learning strategies (Al-Shabandar et al., 2018 ). This study also found that decision-making and learning methods can be expanded using education big data and learning analytics. Big data implementations can fully realize the potential of personalized learning and improve teaching (Lutfiani and Meria, 2022 ). The article from the most prolific author in this analysis, Ben Williamson from the University of Edinburgh in the UK, discussed two important learning advancements that can be made using big data, namely instructing machines and the use of computerized choice systems to influence human judgments (Knox et al., 2020 ).

In addition, the analysis found that the most significant country in the area of education big data and learning analytics research was the United States, which had the highest number of publications; City College of New York was its most significant academic institution. One study from the United States agrees that major impacts on educational practice can be achieved by fully utilizing educational data mining and learning analytics (Baker and Inventado, 2014 ). Another study suggests that collaboration between the education system, industry players, and government entities in processing data analytics can facilitate the transition of technology to Industry 4.0 (Qin and Chiang, 2019 ). According to one of the most cited articles from the United States, employment and admittance screening, financial management, sponsorship tracking, and academic achievement evaluation are just a few of the administrative and educational applications that might benefit from big data principles and data analytics (Picciano, 2012 ). Hence, numerous advancements and improvements can be achieved with education big data and learning analytics, given collaboration from many sectors.

Moreover, the primary research keywords for education big data and learning analytics within the last decade were also analyzed in this study. Based on the results, “Big Data” and “Learning Analytics” were the most frequently used keywords in the education big data and learning analytics research areas, and both had high co-occurrence with all other keywords. Research about potential ways to optimize e-learning agrees that big data and learning analytics play a crucial role in the future of higher education (García and Secades, 2013 ). Learning analytics has made an important contribution to the education field in producing reliable projections of academic achievement based on assessments of the educational process (Tempelaar et al., 2021 ). Over the last five years, educational big data and learning analytics research has increasingly focused on classifying and identifying students’ behavior (Lemay et al., 2021 ).

This study also identified the most important subject area involving education big data and learning analytics, namely computer science. Computer science, education, statistics, and other social sciences fields are brought together by educational data science to investigate and comprehend theoretical and practical phenomena (Daniel, 2016 ). Educational data mining also involves the combination of computer science, education, and statistics to better comprehend learning, administration processes, and research issues in higher education (Ray and Saeed, 2018 ).

In summary, based on the present study’s analysis, the trends and recommendations for education big data and learning analytics include acting as an early detection system that recognizes students at risk of academic failure or dropout, helping to provide learning analytics dashboards, enabling amalgamation with Artificial Intelligence (AI) and machine learning, and providing future orientation in education. The increasing awareness of the importance of education big data and learning analytics is beneficial in enabling early detection of declining student performance based on the availability of big data sources. Dashboards and data visualization using learning analytics can greatly help to analyze complex data and produce insights that inform prevention and other measures for specific problems. Simplifying data visualization from learning analytics is a future trend in the process of enabling amalgamation with AI and machine learning. The integration of AI and machine learning into the education system could have a huge impact; for example, AI and machine learning can create new opportunities for automated evaluation and personalized critiques, and enable smart teaching systems. All of these trends in fully utilizing big data and learning analytics are likely to continue and escalate over time.

Conclusion and implications

The new era involves enormous amounts of data, which should be fully utilized for the advancement and enhancement of traditional systems in many sectors, especially education. This study involved a bibliometric analysis of 250 publications regarding education big data and learning analytics. The increasing number of publications on education big data and learning analytics during the past 10 years, from 2012 until 2021, implies that the importance of both of these topics has been acknowledged. Most of the relevant journals were published by IEEE, which has made a significant contribution to the field of education big data and learning analytics. Moreover, the most productive author to date is Ben Williamson, currently affiliated with the University of Edinburgh, UK, with a total of 1700 citations since his first publication in this area in 2007. This study also found that the United States, with a total of 59 publications, is the most significant country in the area of education big data and learning analytics research, with its most prolific institution being City College of New York. The United States also had the highest total link strength of co-authorship on education big data and learning analytics; given the scale of its research output, its role in promoting the importance of education big data and learning analytics worldwide is highly significant. “Big Data” and “Learning Analytics” were the keywords with the highest occurrence in the publications analyzed in this study, and most of the journals used these two keywords to explain details of education big data and learning analytics research. Computer Science and Social Sciences appear to be the most important subject areas regarding education big data and learning analytics, and both play important parts in making full use of their benefits. Other areas were also involved, such as engineering, mathematics, arts and humanities, and psychology.

The main conclusion that can be drawn from this study is that big data and learning analytics are currently important capabilities to master. Both could bring many significant benefits, such as improving the e-learning environment, understanding learners’ behavior, and advancing learning strategies. Learning techniques can be enhanced to achieve the best-structured analytical strategy by employing learning analytics. The trend of leveraging big data and learning analytics is certainly growing, especially in the field of education. In the future, new teaching and learning impacts gained from education big data and learning analytics could escalate the trends of personalized learning, predictive analytics, and adaptive learning, and enable data-driven decision-making. Throughout the upcoming years, education big data and learning analytics will remain significant in the field of education.

Limitations

One limitation of this study concerns information access, as only the Scopus database was used to identify publications for the bibliometric analysis. Other databases, such as Springer Link, IEEE Xplore Digital Library, or Web of Science, might have provided different insights and produced different results. In addition, the results could have been narrowed down, and thus made more precise, if more specific keywords had been used in the analysis. The keywords used were “Education Big Data” and “Learning Analytics”. The depth of analysis could also have been increased if additional keywords regarding education big data and learning analytics had been used, such as “Big Data Analytics”, “Educational Data Mining”, “Deep Learning”, and many more.

Data availability

All data sets are available upon request.

Al-Shabandar R, Hussain AJ, Liatsis P, Keight R (2018) Analyzing learners behavior in MOOCs: An examination of performance and motivation using a data-driven approach. IEEE Access 6:73669–73685


Baker RS, Inventado PS (2014) Educational data mining and learning analytics. Springer, New York, NY, pp 61–75. https://doi.org/10.1007/978-1-4614-3305-7_4

Daniel BK (2016) Big data and learning analytics in higher education: current theory and practice. Springer International Publishing, Switzerland

Daniel BK (2017) Big data in higher education: the big picture. In Big data and learning analytics in higher education, Springer, Cham. p 19–28

Drigas AS, Leliopoulos P (2014) The use of big data in education. Int J Comput Sci Issues 11(5):58


García OA, Secades VA (2013) Big data & learning analytics: a potential way to optimize e-learning technological tools. In: International Association for Development of the Information Society International Conference e-Learning

Knox J, Williamson B, Bayne S (2020) Machine behaviourism: future visions of “learnification” and “datafication” across humans and digital technologies. Learn Media Technol 45(1):31–45

Lemay DJ, Baek C, Doleck T (2021) Comparison of learning analytics and educational data mining: a topic modeling approach. Comput Educ: Artif Intell 2:100016

Lutfiani N, Meria L (2022) Utilization of big data in educational technology research. Int Trans Educ Technol 1(1):73–83

Michalik P, Štofa J, Zolotova I (2014) Concept definition for Big Data architecture in the education system. Paper presented at the 12th International Symposium on Applied Machine Intelligence and Informatics (SAMI), IEEE, pp 331–334

Mkrttchian V, Gamidullaeva L, Finogeev A, Chernyshenko S, Chernyshenko V, Amirov D et al. (2021) Big data and internet of things (IoT) technologies’ influence on higher education: current state and future prospects. Int J Web-Based Learn Teach Technol 16(5):137–157

Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD et al. (2021) The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Int J Surg 88:105906


Picciano AG (2012) The evolution of big data and learning analytics in American higher education. J Asynchronous Learn Netw 16(3):9–20

Qin SJ, Chiang LH (2019) Advances and opportunities in machine learning for process data analytics. Comput Chem Engin 126:465–473


Ray S, Saeed M (2018) Applications of educational data mining and learning analytics tools in handling big data in higher education. In Alani M, Tawfik H, Saeed M, Anya O (eds) Applications of big data analytics. Springer, Cham

Şahin M, Yurdugül H (2020) Educational data mining and learning analytics: past, present and future. Bartın Univ J Fac Educ 9(1):121–131

Sghir N, Adadi A, Lahmer M (2023) Recent advances in predictive learning analytics: a decade systematic review (2012–2022). Educ Inform Technol 28(7):8299–8333

Soegoto H, Soegoto ES, Luckyardi S, Rafdhi AA (2022) A bibliometric analysis of management bioenergy research using Vosviewer application. Indones J Sci Technol 7(1):89–104

Talan T, Demirbilek M (2023) Bibliometric analysis of research on learning analytics based on web of science database. Inform Educ 22(1):161–181

Tempelaar D, Rienties B, Nguyen Q (2021) The contribution of dispositional learning analytics to precision education. Educ Technol Soc 24(1):109–122

Wang Y (2016) Big opportunities and big concerns of big data in education. TechTrends 60(4):381–384

Yu J, Couldry N (2022) Education as a domain of natural data extraction: analysing corporate discourse about educational tracking. Inform Commun Soc 25(1):127–144


Acknowledgements

This work was funded by the Ministry of Higher Education Malaysia under the Fundamental Research Grant Scheme (FRGS/1/2020/SSI0/UTM/02/8).

Author information

Authors and affiliations.

School of Education, Faculty of Social Sciences and Humanities, Universiti Teknologi Malaysia, 81310 UTM, Johor Bahru, Johor, Malaysia

Shaza Arissa Samsul, Noraffandy Yahaya & Hassan Abuhassna


Contributions

All authors contributed to the research conception and design. The introduction and methodology were done by NY. The methodology and analysis were also planned and performed by HA. The full draft of the manuscript, including all parts was analyzed and written by SAS. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Shaza Arissa Samsul .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Additional information.

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Overall data from Scopus database

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Samsul, S.A., Yahaya, N. & Abuhassna, H. Education big data and learning analytics: a bibliometric analysis. Humanit Soc Sci Commun 10 , 709 (2023). https://doi.org/10.1057/s41599-023-02176-x

Download citation

Received : 10 October 2022

Accepted : 21 September 2023

Published : 16 October 2023

DOI : https://doi.org/10.1057/s41599-023-02176-x




A practical, iterative framework for secondary data analysis in educational research

  • Published: 18 May 2019
  • Volume 47 , pages 129–148, ( 2020 )


  • Tracy Logan   ORCID: orcid.org/0000-0001-5283-286X 1  


Secondary data analysis in educational research has been an established research method for many years. Yet, few publications outline the “how to” of undertaking the process. This paper presents an analysis framework suitable for undertaking secondary data analysis within the field of education. The framework is a modification and an application of a pre-existing data mining research process known as Knowledge Discovery in Databases (KDD). The KDD process is interactive and generative and involves a series of sequential steps and decision-making processes. The modified KDD process is described to show how it supports secondary data analysis and provides an example of how the modified KDD process was applied across a secondary analysis in mathematics education. This paper provides educational researchers with a practical and iterative framework through which to undertake secondary analysis that enhances flexibility and encourages depth and saturation.
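As only the abstract is reproduced here, the sketch below is a generic illustration of the kind of sequential KDD pipeline it describes, with stage names taken from the classic Fayyad et al. (1996) process rather than from Logan's modified framework; every function, column name, and the dataset path are hypothetical.

```python
# A minimal sketch of a KDD-style pipeline applied to an existing educational
# dataset. Stage names follow the classic KDD process; function bodies, column
# names, and the dataset path are illustrative placeholders only.
import pandas as pd

def select(path: str) -> pd.DataFrame:
    """Selection: load the slice of the existing dataset relevant to the new question."""
    return pd.read_csv(path, usecols=["student_id", "item_id", "response", "year"])

def preprocess(data: pd.DataFrame) -> pd.DataFrame:
    """Preprocessing: remove missing responses and duplicate records."""
    return data.dropna(subset=["response"]).drop_duplicates()

def transform(data: pd.DataFrame) -> pd.DataFrame:
    """Transformation: reshape to the unit of analysis (here, per-item success rates)."""
    rates = data.groupby("item_id")["response"].mean()
    return rates.rename("success_rate").reset_index()

def mine(items: pd.DataFrame) -> pd.DataFrame:
    """Data mining: a simple pattern-finding step that flags unusually difficult items."""
    threshold = items["success_rate"].mean() - items["success_rate"].std()
    return items[items["success_rate"] < threshold]

def interpret(patterns: pd.DataFrame) -> None:
    """Interpretation/evaluation: report patterns back for substantive judgement."""
    print(patterns.sort_values("success_rate").to_string(index=False))

if __name__ == "__main__":
    interpret(mine(transform(preprocess(select("assessment_items.csv")))))
```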


Figure: The Knowledge Discovery in Databases (KDD) process (adapted from Fayyad et al. 1996 , p. 41).


Attwood, G., & Croll, P. (2011). Attitudes to school and intentions for educational participation: An analysis of data from the Longitudinal Survey of Young People in England. International Journal of Research and Method in Education, 34 (3), 269–287.


Australian Government. (2013). Australian Research Council Discovery Projects—Instructions to applicants for funding commencing in 2015. Resource document. Australian Research Council. Retreived from http://www.arc.gov.au/pdf/DP15/DP15_ITA.pdf .

Bishop, L., & Kuula-Luumi, A. (2016). Revisiting qualitative data reuse: A decade on. SAGE Open . https://doi.org/10.1177/2158244016685136 .

Bobis, J., Shore, S., Bennett, D., Bennett, S., Chan, P., Harrison, N., et al. (2013). Educational research in Australia: Where is it conducted? Australian Educational Researcher, 40 (4), 453–471.

Borgman, C. L. (2012). The conundrum of sharing research data. Journal of the American Society for Information Science and Technology, 63 (6), 1059–1078.

Brewer, E. W. (2007). Secondary data analysis. In N. J. Salkind & K. Rasmussen (Eds.), Encyclopedia of measurement and statistics (pp. 870–877). Thousand Oaks: SAGE Publications Inc.


Burstein, L. (1978). Secondary analysis: An important resource for educational research and evaluation. Educational Researcher, 7 (5), 9–12.

Cabena, P., Hadjinian, P., Stadler, R., Verhees, J., & Zanasi, A. (1998). Discovering data mining: From concept to implementation . Englewood Cliffs, NJ: Prentice Hall.

Carter, J., Noble, S., Russell, A., & Swanson, E. (2011). Developing statistical literacy using real-world data: Investigating socioeconomic secondary data resources used in research and teaching. International Journal of Research and Method in Education, 34 (3), 223–240.

Chen, M. S., Han, J., & Yu, P. S. (1996). Data mining: An overview from a database perspective. IEEE Transactions on Knowledge and Data Engineering , 8 (6), 866–883.

Cheng, H. G., & Phillips, M. R. (2014). Secondary analysis of existing data: Opportunities and implementation. Shanghai Archives of Psychiatry, 26 (6), 371–375.

Cheshire, L. (2009). Archiving qualitative data: Prospects and challenges of data preservation and sharing among Australian qualitative researchers . Discussion Paper. http://www.assda.edu.au/forms/AQuAQualitativeArchiving_DiscussionPaper_FinalNov09.pdf .

Colonna, L. (2013). Taxonomy and classification of data mining. SMU Science and Technology Law Review, 16, 309–369.

Corti, L., & Fielding, N. (2017). Opportunities from the digital revolution: Implications for researching, publishing and consuming qualitative research. SAGE Open . https://doi.org/10.1177/2158244016678912 .

Devine, P. (2003). Secondary data analysis. In R. L. Miller & J. D. Brewer (Eds.), The A-Z of social research (pp. 286–289). London: SAGE Publications Ltd.

Economic and Social Research Council (ESRC). (2010). ESRC Research Data Policy (updated 2013). Resource document. Retrieved from http://www.esrc.ac.uk/_images/Research_Data_Policy_2010_tcm8-4595.pdf .

Fayyad, U. M. (1996). Data mining and knowledge discovery: Making sense out of data. IEEE Intelligent Systems, 11 (5), 20–25.

Fayyad, U. M., Piatetsky-Shapiro, G., & Smyth, P. (1996). From data mining to knowledge discovery in databases. AI Magazine, 17 (3), 37–54.

Goodwin, J. (2012). Editor’s introduction: Secondary sources and secondary analysis. In J. Goodwin (Ed.), SAGE secondary data analysis (Vol. 1, pp. xix–xxxvii). London: SAGE.

Hakim, C. (1982). Secondary analysis and the relationship between official and academic social research. Sociology, 16 (1), 12–28.

Hammersley, M. (2010). Can we re-use qualitative data via secondary analysis? Notes on some terminology and substantive issues. Sociological Research Online . https://doi.org/10.5153/sro.2076 .

Hand, D. J. (1998). Data mining: Statistics and more? The American Statistician, 52 (2), 112–118.

Hofferth, S. L. (2005). Secondary data analysis in family research. Journal of Marriage and Family, 67 (4), 891–907.

Houghton, J., & Gruen, N. (2014). Open research data: Report to the Australian National Data Service (ANDS) . Resource document. Retrieved from http://ands.org.au/resource/open-research-data-report.pdf .

Irwin, S., Bornat, J., & Winterton, M. (2012). Timescapes secondary analysis: Comparison, context and working across data sets. Qualitative Research, 12 (1), 66–80.

Kolb, B. (2008). Marketing research: A practical approach . London: SAGE Publishing.


Logan, T. (2015). Starting a conversation about open data in mathematics education research. In M. Marshman, V. Geiger & A. Bennison (Eds.), Mathematics education in the margins. Proceedings of the 38th annual conference of the mathematics education research group of Australasia (pp. 373–380). Sunshine Coast, Australia: MERGA Inc.

Logan, T., & Lowrie, T. (2017). Gender perspectives on spatial tasks in a national assessment: A secondary data analysis. Research in Mathematics Education , 19 (2), 199–216. https://doi.org/10.1080/14794802.2017.1334577 .

Logan, T., & Woodland, K. (2015). Digital games and mathematics learning: The state of play. In T. Lowrie & R. Jorgensen (Eds.), Digital games and mathematics learning: Potential, promises and pitfalls (pp. 277–304). The Netherlands: Springer.


Logan, T., Lowrie, T., & Diezmann, C. M. (2014). Co-thought gestures: Supporting students to successfully navigate map tasks. Educational Studies in Mathematics , 87 , 87–102. https://doi.org/10.1007/s10649-014-9546-2 .

Lowrie, T. (2014). An educational practice framework: The empowerment of the teaching profession. Journal of Education for Teaching, 40 (1), 34–46.

Mariscal, G., Marbán, O., & Fernández, C. (2010). A survey of data mining and knowledge discovery process models and methodologies. The Knowledge Engineering Review, 25 (2), 137–166.

McNeill, D. (1992). Hand and mind: What gestures reveal about thought . Chicago, IL: University of Chicago Press.

Miles, M. B., & Huberman, A. M. (1984). Drawing meaning from qualitative data: Toward a shared craft. Educational Researcher, 13 (5), 20–30.

Morris, D., & Ecclesfield, N. (2011). A new computer-aided technique for qualitative document analysis. International Journal of Research and Method in Education, 34 (3), 241–254.

National Health and Medical Research Council. (2018). Australian code for responsible conduct of research . Canberra, Australia: Author. Retrieved from https://nhmrc.gov.au/about-us/publications/australian-code-responsible-conduct-research-2018#block-views-block-file-attachments-content-block-1 .

National Science Foundation (NSF). (2010). NSF Data Sharing Policy . Resource document. Retrieved from http://www.nsf.gov/bfa/dias/policy/dmp.jsp .

Olesen, S. (2014). Publishing and sharing sensitive data. Australian National Data Service Guides . Resource document. Retrieved from http://ands.org.au/guides/sensitivedata.html .

Onwuegbuzie, A. J., & Johnson, R. B. (2006). The validity issue in mixed research. Research in the Schools, 13 (1), 48–63.

Organisation for Economic Co-operation and Development (OECD). (2007). OECD principles and guidelines for access to research data from public funding . Paris, OECD. Resource document. Retrieved from http://www.oecd.org/sti/sci-tech/38500813.pdf .

Payne, G., & Payne, J. (2004). Key concepts in social research . London: SAGE Publications.

Pyle, D. (1999). Data preparation for data mining . San Francisco, CA: Morgan Kaufmann.

Queensland School Curriculum Council. (2001). 2001 Queensland year 5 test: Aspects of numeracy (p. 2). Brisbane, Australia: Queensland School Curriculum Council.

Raven, J., Raven, J. C., & Court, J. H. (1998). Manual for Raven’s standard progressive matrices . Oxford: Oxford Psychologists Press.

Rew, L., Koniak-Griffin, D., Lewis, M. A., Miles, M., & O’Sullivan, A. (2000). Secondary data analysis: New perspectives for adolescent research. Nursing Outlook, 48 (5), 223–229.

Romero, C., & Ventura, S. (2013). Data mining in education. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 3 (1), 12–27.

Rowan, M., & Ramsay, E. (2018). Educational inequality in Tasmania: Evidence and explanations. Australian Educational Researcher, 45 (3), 277–295.

Smith, E. (2008a). Pitfalls and promises: The use of secondary data analysis in educational research. British Journal of Educational Studies, 56 (3), 323–339.

Smith, E. (2008b). Using secondary data in educational and social research . Maidenhead: McGraw Hill/Open University Press.

Smith, E. (2011). Special issue on using secondary data analysis in educational research. International Journal of Research and Method in Education, 34 (3), 219–221.

Smith, A. K., Ayanian, J. Z., Cobinsky, K. E., Landon, B. E., McCarthy, E. P., Wee, C. C., et al. (2011). Conducting high-value secondary dataset analysis: An introductory guide and resources. Journal of General Internal Medicine, 26 (8), 920–929.

Thompson, P. (2000). Re-using qualitative research data: A personal account. Forum: Qualitative Social Research . https://doi.org/10.17169/fqs-1.3.1044 .

Wellcome Trust. (2010). Policy on data management and sharing . Resource document. Retrieved from http://www.wellcome.ac.uk/About-us/Policy/Policy-and-position-statements/WTX035043.htm .

Zhang, S., Zhang, C., & Yang, Q. (2003). Data preparation of data mining. Applied Artificial Intelligence, 17, 375–381.


Types of data analysis

The means by which you analyse your data are largely determined by the nature of your research question, the approach and paradigm within which your research operates, the methods used, and consequently the type of data elicited. In turn, the language and terms you use in both conducting and reporting your data analysis should reflect these.

The list below includes some of the more commonly used means of qualitative data analysis in educational research – although this is by no means exhaustive. It is also important to point out that each of the terms given below generally encompasses a range of possible methods or options, and there can be overlap between them. In all cases, further reading is essential to ensure that the process of data analysis is valid, transparent and appropriately systematic, and we have provided some recommendations for this below (as well as in our further resources and our tools and resources for qualitative data analysis sections). A small illustrative sketch of simple content analysis is given after the list below.

If your research is likely to involve quantitative analysis, we recommend the books listed below; a brief illustrative sketch of basic quantitative description also follows at the end of this section.

Types of qualitative data analysis

  • Thematic analysis
  • Coding and/or content analysis
  • Concept map analysis
  • Discourse or narrative analysis
  • Grounded theory
  • Phenomenological analysis or interpretative phenomenological analysis (IPA)
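
To give a flavour of what coding and content analysis can look like in practice, here is a minimal, illustrative sketch in Python. It simply counts how often a handful of analyst-chosen terms appear across some invented interview fragments; real content analysis involves far more careful definition of codes and categories, and the transcripts and terms below are made up purely for illustration.

```python
from collections import Counter
import re

# Invented interview fragments, purely for illustration.
transcripts = [
    "I felt supported by my peers, but the workload was overwhelming.",
    "Feedback from the teacher helped, although deadlines caused stress.",
    "Peers shared notes with me; the workload still felt unmanageable.",
]

# Analyst-chosen terms standing in for a very simple coding frame.
terms_of_interest = ["peers", "workload", "feedback", "stress", "deadlines"]

# Tokenise crudely and count every word across all fragments.
tokens = Counter(
    word
    for text in transcripts
    for word in re.findall(r"[a-z']+", text.lower())
)

# Report how often each term of interest occurs.
term_counts = {term: tokens[term] for term in terms_of_interest}
print(term_counts)
```

Even a toy count like this can help you decide which transcripts deserve closer reading; it is not a substitute for interpretive analysis.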

Further reading and resources

As a starting point for most of these, we would recommend the relevant chapter from Part 5 of Cohen, Manion and Morrison (2018), Research Methods in Education. You may also find the following helpful:

For qualitative approaches

Savin-Baden, M. & Howell Major, C. (2013) Data analysis. In Qualitative Research: The essential guide to theory and practice (Abingdon: Routledge), pp. 434-450.

For quantitative approaches

Bors, D. (2018) Data analysis for the social sciences (Sage, London).
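
For contrast, the sketch below (again in Python, using only the standard library) shows the kind of descriptive summary that usually comes before any formal quantitative analysis. The scores and group labels are invented for illustration, and a real analysis would go on to use an appropriate statistical test.

```python
import statistics

# Invented test scores for two hypothetical teaching groups.
scores = {
    "group_a": [62, 71, 68, 75, 80, 66],
    "group_b": [70, 78, 74, 82, 85, 77],
}

# Descriptive statistics are the usual first step in quantitative analysis.
for group, values in scores.items():
    print(
        f"{group}: n={len(values)}, "
        f"mean={statistics.mean(values):.1f}, "
        f"sd={statistics.stdev(values):.1f}"
    )

# A simple difference in means; an inferential test (e.g. a t-test)
# would normally follow, using a dedicated statistics package.
diff = statistics.mean(scores["group_b"]) - statistics.mean(scores["group_a"])
print(f"difference in means: {diff:.1f}")
```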

Data Analysis for Educational Research in R

Joshua M. Rosenberg

Chapter 1: Introduction

1.1 Background: Data analysis in educational research

Educational research is hard to do (Berliner, 2002). This is because many educational phenomena are part of a complex system, with multiple, nested levels, and, well, people, many of them developing. Data analysis in education reflects some of the challenges of educational research writ large. In short, both educational research and the analysis of educational data are hard. The goal of this book is to show how R, the free, open-source programming language and software environment, can make these difficulties less daunting.

1.2 Why a book on data analysis in R for educational research

There are at least three reasons why data analysis in educational research is hard:

Educational researchers use distinctive methods, with an emphasis on multi-level models, networks, and measurement, to name just a few examples.

Educational researchers also face distinctive challenges: they come from myriad backgrounds and work in fields that place greater or lesser emphasis on different aspects of data analysis.

Finally, there are training challenges. Educational research features some excellent methodologists, and many advances in the fields mentioned above have been made by people working primarily in educational research. Nevertheless, few quantitative methods classes teach data analysis itself.

Education Policy Analysis Archives

Perspectives and discourses on diversity and inclusion in the university community

Research on diversity and inclusion has increased markedly in recent years. However, few studies integrate the perspectives of different agents, and fewer still are developed within the epistemological and methodological framework of critical pedagogy. The aim of this research was therefore to analyse the perspectives and discourses on diversity and inclusion in the Faculty of Education of the Complutense University of Madrid through the voices of the university community, within three discursive lines or paradigms: integrative, inclusive-canonical, and inclusive-transformative. Forty-seven people from different groups in the university community took part in the study, which used interviews as the data production technique. The results, obtained through a combination of open and axial coding, are presented under three codes: concept of diversity, attitudes towards inclusion, and institutional concept of diversity and inclusion. The conclusions point to the persistence of a hegemonic, banking-style model, despite evidence of a positive evolution in perspectives and discourses on diversity and inclusion.

Author Biography

María Naranjo-Crespo, Universidad Complutense de Madrid

She holds a doctorate in Education from the Universidad Complutense de Madrid (UCM). Her research focuses on social and educational inclusion, adolescents and young people in situations of social difficulty, and education policy. She is currently a lecturer (profesora doctora) in the Education Area of the Universidad Internacional de la Empresa and at CES Don Bosco, and a member of the Configuraciones Pedagógicas y Cultura Cívico-social research group at the Universidad Complutense de Madrid.


Education Policy Analysis Archives/Archivos Analíticos de Políticas Educativas/Arquivos Analíticos de Políticas Educativas (EPAA/AAPE; ISSN 1068-2341) is a peer-reviewed, open-access, international, multilingual, and multidisciplinary journal designed for researchers, practitioners, policy makers, and development analysts concerned with education policies.



Qualitative Research: Definition, Methodology, Limitation & Examples

Qualitative research is a method focused on understanding human behavior and experiences through non-numerical data. Examples of qualitative research include:

  • One-on-one interviews,
  • Focus groups,
  • Ethnographic research,
  • Case studies,
  • Record keeping,
  • Qualitative observations

In this article, we'll provide tips and tricks on how to use qualitative research to better understand your audience and improve your ROI, illustrated with real-world examples. We'll also look at the difference between qualitative and quantitative data.


Marketers often seek to understand their customers deeply. Qualitative research methods such as face-to-face interviews, focus groups, and qualitative observations can provide valuable insights into your products, your market, and your customers’ opinions and motivations. Understanding these nuances can significantly enhance marketing strategies and overall customer satisfaction.

What is Qualitative Research

Qualitative research is a market research method that focuses on obtaining data through open-ended and conversational communication. It is concerned with the “why” behind what people think about you rather than just the “what”. Thus, qualitative research seeks to uncover the underlying motivations, attitudes, and beliefs that drive people's actions.

Let’s say you have an online shop catering to a general audience. You do a demographic analysis and you find out that most of your customers are male. Naturally, you will want to find out why women are not buying from you. And that’s what qualitative research will help you find out.

In the case of your online shop, qualitative research would involve reaching out to female non-customers through methods such as in-depth interviews or focus groups. These interactions provide a platform for women to express their thoughts, feelings, and concerns regarding your products or brand. Through qualitative analysis, you can uncover valuable insights into factors such as product preferences, user experience, brand perception, and barriers to purchase.

Types of Qualitative Research Methods

Qualitative research methods are designed in a manner that helps reveal the behavior and perception of a target audience regarding a particular topic.

The most frequently used qualitative analysis methods are one-on-one interviews, focus groups, ethnographic research, case study research, record keeping, and qualitative observation.

1. One-on-one interviews

Conducting one-on-one interviews is one of the most common qualitative research methods. One of the advantages of this method is that it provides a great opportunity to gather precise data about what people think and their motivations.

Spending time talking to customers not only helps marketers understand who their clients are, but also helps with customer care: clients love hearing from brands. This strengthens the relationship between a brand and its clients and paves the way for customer testimonials.

  • A company might conduct interviews to understand why a product failed to meet sales expectations.
  • A researcher might use interviews to gather personal stories about experiences with healthcare.

These interviews can be performed face-to-face or over the phone and usually last from half an hour to over two hours.

When a one-on-one interview is conducted face-to-face, it also gives the marketer the opportunity to read the respondent's body language and weigh it against their verbal responses.

2. Focus groups

Focus groups gather a small number of people to discuss and provide feedback on a particular subject. A focus group typically has five to eight participants, though the right size depends on the topic and on how familiar participants are with it: for less critical topics, or when participants have little experience, a group of around ten can work well, while for more critical topics, or when participants are more knowledgeable, a smaller group of five or six allows deeper discussion.

The main goal of a focus group is to find answers to the “why”, “what”, and “how” questions. This method is highly effective in exploring people’s feelings and ideas in a social setting, where group dynamics can bring out insights that might not emerge in one-on-one situations.

  • A focus group could be used to test reactions to a new product concept.
  • Marketers might use focus groups to see how different demographic groups react to an advertising campaign.

One advantage of focus groups is that the marketer doesn't necessarily have to interact with the group in person: nowadays focus groups can also be run online, with questions delivered to participants on a range of devices.

Focus groups are a relatively expensive option compared to the other qualitative research methods, which is why they are typically reserved for exploring complex processes.

3. Ethnographic research

Ethnographic research is the most in-depth observational method that studies individuals in their naturally occurring environment.

This method aims to understand the cultures, challenges, motivations, and settings in which people operate.

  • A study of workplace culture within a tech startup.
  • Observational research in a remote village to understand local traditions.

Ethnographic research requires the marketer to adapt to the target audiences’ environments (a different organization, a different city, or even a remote location), which is why geographical constraints can be an issue while collecting data.

This type of research can last from a few days to a few years. It's challenging and time-consuming, and it depends heavily on the marketer's ability to observe, analyze, and interpret the data.

4. Case study research

The case study has grown into a valuable qualitative research method, most often used in education and the social sciences. It involves a comprehensive examination of a single instance or event, providing detailed insights into complex issues in real-life contexts.

  • Analyzing a single school’s innovative teaching method.
  • A detailed study of a patient’s medical treatment over several years.

Case study research may seem difficult to carry out, but it is actually one of the more straightforward ways of conducting research: it involves a deep dive into a single case, a thorough understanding of the data collection methods used, and careful interpretation of the data.

5. Record keeping

Record keeping is similar to going to the library: you go over books or any other reference material to collect relevant data. This method uses already existing reliable documents and similar sources of information as a data source.

  • Historical research using old newspapers and letters.
  • A study on policy changes over the years by examining government records.

This method is useful for constructing a historical context around a research topic or verifying other findings with documented evidence.

6. Qualitative observation

Qualitative observation is a method that gathers information systematically through the researcher's own subjective impressions. It draws on the five senses: sight, smell, touch, taste, and hearing.

  • Sight: Observing the way customers visually interact with product displays in a store to understand their browsing behaviors and preferences.
  • Smell: Noting reactions of consumers to different scents in a fragrance shop to study the impact of olfactory elements on product preference.
  • Touch: Watching how individuals interact with different materials in a clothing store to assess the importance of texture in fabric selection.
  • Taste: Evaluating reactions of participants in a taste test to identify flavor profiles that appeal to different demographic groups.
  • Hearing: Documenting responses to changes in background music within a retail environment to determine its effect on shopping behavior and mood.

Below are some real-life examples of qualitative research that demonstrate practical applications across various contexts:

Qualitative Research Real World Examples

Let’s explore some examples of how qualitative research can be applied in different contexts.

1. Online shop with a predominantly male audience

Method used: one-on-one interviews.

Let's go back to one of the previous examples. You have an online shop that, by nature, addresses a general audience, but after you run a demographic analysis you find out that most of your customers are male.

One good method to determine why women are not buying from you is to hold one-on-one interviews with potential customers in the category.

Interviewing a sample of potential female customers should reveal why they don't find your store appealing. The reasons could range from not stocking enough products aimed at women to the store's emphasis on heavy-duty tools and automotive products. These insights can guide adjustments in inventory and marketing strategies.

2. Software company launching a new product

Method used: focus groups.

Focus groups are great for establishing product-market fit.

Let’s assume you are a software company that wants to launch a new product and you hold a focus group with 12 people. Although getting their feedback regarding users’ experience with the product is a good thing, this sample is too small to define how the entire market will react to your product.

So what you can do instead is hold multiple focus groups across 20 different geographic regions, with a group of 12 for each market segment in each region; you can even segment your audience by age. This is a better way to establish confidence in the feedback you receive.

3. Alan Peshkin's “God's Choice: The Total World of a Fundamentalist Christian School”

Method used: ethnographic research.

Moving from a fictional example to a real-life one, let’s analyze Alan Peshkin’s 1986 book “God’s Choice: The Total World of a Fundamentalist Christian School”.

Peshkin studied the culture of Bethany Baptist Academy by interviewing the students, parents, teachers, and members of the community alike, and spending eighteen months observing them to provide a comprehensive and in-depth analysis of Christian schooling as an alternative to public education.

The study highlights the school’s unified purpose, rigorous academic environment, and strong community support while also pointing out its lack of cultural diversity and openness to differing viewpoints. These insights are crucial for understanding how such educational settings operate and what they offer to students.

Even after discovering all this, Peshkin still presented the school in a positive light and stated that public schools have much to learn from such schools.

Peshkin’s in-depth research represents a qualitative study that uses observations and unstructured interviews, without any assumptions or hypotheses. He utilizes descriptive or non-quantifiable data on Bethany Baptist Academy specifically, without attempting to generalize the findings to other Christian schools.

4. Understanding buyers’ trends

Method used: record keeping.

Another way marketers can use qualitative research is to understand buyers' trends. To do this, marketers need to look at historical data for both their company and their industry and identify when buyers purchase in higher volumes.

For example, electronics distributors know that the holiday season is a peak market for sales while life insurance agents find that spring and summer wedding months are good seasons for targeting new clients.

5. Determining products/services missing from the market

Conducting your own research isn’t always necessary. If there are significant breakthroughs in your industry, you can use industry data and adapt it to your marketing needs.

The recent rise in hacking and hijacking of cloud-based information has made internet security the topic of many industry reports. A software company could use these reports to better understand the problems its clients are facing.

As a result, the company can provide solutions prospects already know they need.


Qualitative Research Approaches

Once the marketer has decided that their research questions will provide data that is qualitative in nature, the next step is to choose the appropriate qualitative approach.

The approach chosen will take into account the purpose of the research, the role of the researcher, the data collected, the method of data analysis, and how the results will be presented. The most common approaches include:

  • Narrative: This method focuses on individual life stories to understand personal experiences and journeys. It examines how people structure their stories and the themes within them to explore human existence. For example, a narrative study might look at cancer survivors to understand their resilience and coping strategies.
  • Phenomenology: Attempts to understand or explain life experiences or phenomena. It aims to reveal the depth of human consciousness and perception, such as by studying the daily lives of those with chronic illnesses.
  • Grounded theory: Investigates a process, action, or interaction with the goal of developing a theory “grounded” in observations and empirical data.
  • Ethnography: Describes and interprets an ethnic, cultural, or social group.
  • Case study: Examines episodic events in a definable framework, develops in-depth analyses of single or multiple cases, and generally explains “how”. An example might be studying a community health program to evaluate its success and impact.

How to Analyze Qualitative Data

Analyzing qualitative data involves interpreting non-numerical data to uncover patterns, themes, and deeper insights. This process is typically more subjective and requires a systematic approach to ensure reliability and validity. 

1. Data Collection

Ensure that your data collection methods (e.g., interviews, focus groups, observations) are well-documented and comprehensive. This step is crucial because the quality and depth of the data collected will significantly influence the analysis.

2. Data Preparation

Once collected, the data needs to be organized. Transcribe audio and video recordings, and gather all notes and documents. Ensure that all data is anonymized to protect participant confidentiality where necessary.
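
One small piece of data preparation that lends itself to automation is pseudonymisation. The sketch below, in Python, swaps known participant names for codes in a transcript line. The names, codes, and sentence are invented, and this is only a rough illustration rather than a complete de-identification procedure (indirect identifiers such as places or job titles also need attention).

```python
import re

# Invented mapping from participant names to pseudonym codes.
pseudonyms = {"Alice": "P01", "Bob": "P02"}

def pseudonymise(text: str) -> str:
    """Replace each known name with its participant code (whole words only)."""
    for name, code in pseudonyms.items():
        text = re.sub(rf"\b{re.escape(name)}\b", code, text)
    return text

line = "Alice said she studies with Bob every Tuesday."
print(pseudonymise(line))  # -> "P01 said she studies with P02 every Tuesday."
```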

3. Familiarization

Immerse yourself in the data by reading through the materials multiple times. This helps you get a general sense of the information and begin identifying patterns or recurring themes.

4. Coding

Develop a coding system to tag data with labels that summarize and account for each piece of information. Codes can be words, phrases, or acronyms that capture how these segments relate to your research questions. Common approaches include:

  • Descriptive Coding: Summarize the primary topic of the data.
  • In Vivo Coding: Use language and terms used by the participants themselves.
  • Process Coding: Use gerunds (“-ing” words) to label the processes at play.
  • Emotion Coding: Identify and record the emotions conveyed or experienced.

5. Thematic Development

Group codes into themes that represent larger patterns in the data. These themes should relate directly to the research questions and form a coherent narrative about the findings.
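
As a rough illustration of how coded segments roll up into themes, here is a minimal Python sketch. The codes, theme groupings, and counts are invented, and the real work of defining and interpreting them remains a human, interpretive task; the tally only helps you see how strongly each theme is represented.

```python
from collections import Counter

# Invented codes applied to interview segments during the coding step.
applied_codes = [
    "price concern", "trust in brand", "price concern",
    "delivery speed", "trust in brand", "price concern",
]

# Invented grouping of codes into broader themes.
themes = {
    "perceived value": ["price concern", "delivery speed"],
    "brand relationship": ["trust in brand"],
}

code_counts = Counter(applied_codes)

# Aggregate code frequencies up to theme level.
theme_counts = {
    theme: sum(code_counts[code] for code in codes)
    for theme, codes in themes.items()
}

print(code_counts)   # how often each code was applied
print(theme_counts)  # how strongly each theme is represented
```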

6. Interpreting the Data

Interpret the data by constructing a logical narrative. This involves piecing together the themes to explain larger insights about the data. Link the results back to your research objectives and existing literature to bolster your interpretations.

7. Validation

Check the reliability and validity of your findings by reviewing if the interpretations are supported by the data. This may involve revisiting the data multiple times or discussing the findings with colleagues or participants for validation.
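
One common reliability check, not spelled out in the step above and added here purely as an illustration, is intercoder agreement: a second analyst codes the same segments and you compare the results. The sketch below computes simple percent agreement and Cohen's kappa for two hypothetical coders and ten invented segments.

```python
from collections import Counter

# Invented codes assigned to the same ten segments by two coders.
coder_1 = ["A", "A", "B", "B", "A", "C", "B", "A", "C", "B"]
coder_2 = ["A", "B", "B", "B", "A", "C", "A", "A", "C", "B"]

n = len(coder_1)
observed = sum(a == b for a, b in zip(coder_1, coder_2)) / n

# Agreement expected by chance, from each coder's label distribution.
dist_1, dist_2 = Counter(coder_1), Counter(coder_2)
expected = sum(
    (dist_1[label] / n) * (dist_2[label] / n)
    for label in set(coder_1) | set(coder_2)
)

kappa = (observed - expected) / (1 - expected)
print(f"percent agreement: {observed:.2f}")  # 0.80 for this invented example
print(f"Cohen's kappa: {kappa:.2f}")         # about 0.69
```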

8. Reporting

Finally, present the findings in a clear and organized manner. Use direct quotes and detailed descriptions to illustrate the themes and insights. The report should communicate the narrative you’ve built from your data, clearly linking your findings to your research questions.

Limitations of qualitative research

Qualitative research has its own distinctive drawbacks: the data collector's techniques and personal observations can alter the information in subtle ways. With that in mind, these are the main limitations of qualitative research:

1. It’s a time-consuming process

The main drawback of qualitative research is that the process is time-consuming. Another problem is that the interpretations are limited: personal experience and knowledge inevitably influence observations and conclusions.

A qualitative study might therefore take several weeks or months. Also, because this process relies on personal interaction for data collection, discussions often tend to drift away from the main issue being studied.

2. You can’t verify the results of qualitative research

Because qualitative research is open-ended, participants have more control over the content of the data collected. So the marketer is not able to verify the results objectively against the scenarios stated by the respondents. For example, in a focus group discussing a new product, participants might express their feelings about the design and functionality. However, these opinions are influenced by individual tastes and experiences, making it difficult to ascertain a universally applicable conclusion from these discussions.

3. It’s a labor-intensive approach

Qualitative research requires a labor-intensive analysis process, including categorization and recording. It also requires experienced marketers to obtain the needed data from a group of respondents.

4. It’s difficult to investigate causality

Qualitative research requires thoughtful planning to ensure the results are accurate. Qualitative data cannot be analyzed mathematically in the way quantitative data can, so this type of research relies more on opinion and judgment than on measurable results. And because every qualitative study is unique, it is difficult to replicate.

5. Qualitative research is not statistically representative

Because qualitative research is a perspective-based method of research, the responses given are not measured.

Comparisons can be made, and patterns may recur across respondents, but circumstances that need statistical representation require quantitative data, and that is not part of the qualitative research process.

While doing a qualitative study, it's important to cross-reference the data obtained with quantitative data. By continuously surveying prospects and customers, marketers can build a stronger database of useful information.

Quantitative vs. Qualitative Research


Quantitative and qualitative research are two distinct methodologies used in the field of market research, each offering unique insights and approaches to understanding consumer behavior and preferences.

As we already defined, qualitative analysis seeks to explore the deeper meanings, perceptions, and motivations behind human behavior through non-numerical data. On the other hand, quantitative research focuses on collecting and analyzing numerical data to identify patterns, trends, and statistical relationships.  

Let’s explore their key differences: 

Nature of Data:

  • Quantitative research: Involves numerical data that can be measured and analyzed statistically.
  • Qualitative research: Focuses on non-numerical data, such as words, images, and observations, to capture subjective experiences and meanings.

Research Questions:

  • Quantitative research: Typically addresses questions related to “how many,” “how much,” or “to what extent,” aiming to quantify relationships and patterns.
  • Qualitative research: Explores questions related to “why” and “how,” aiming to understand the underlying motivations, beliefs, and perceptions of individuals.

Data Collection Methods:

  • Quantitative research: Relies on structured surveys, experiments, or observations with predefined variables and measures.
  • Qualitative research: Utilizes open-ended interviews, focus groups, participant observations, and textual analysis to gather rich, contextually nuanced data.

Analysis Techniques:

  • Quantitative research: Involves statistical analysis to identify correlations, associations, or differences between variables.
  • Qualitative research: Employs thematic analysis, coding, and interpretation to uncover patterns, themes, and insights within qualitative data.
