Aisha Naz Ansari[1], Sohail Ahmad[2], Sadia Muzaffar Bhutta[3], Sajid Ali[4]
Introduction
The growing emphasis on evidence-informed decisions in the education sector has, arguably, shifted focus from small-scale studies to large-scale research. This shift is driven by the perception that large-scale educational research is instrumental in shaping policy discussions. In the Pakistani context, however, current debates tend to focus primarily on results and policy implications, while the methodological choices and decisions underlying such research remain largely invisible. For example, questions such as how, when, and why to gather and integrate various forms of data are seldom addressed. Unpacking these methodological choices and decisions is important for at least two reasons. First, the ongoing shift towards designing evidence-informed policies in Pakistan has boosted researchers’ motivation to plan and conduct large-scale studies. Second, methodological decisions are not made in a vacuum; they are influenced by political, cultural, and contextual factors that researchers must carefully navigate.
In this blog, we draw on our experience of conducting two nationwide studies funded by DARE-RC that use mixed-methods approaches to examine different dimensions of Pakistan’s education system. One study investigates teaching and learning practices in science and mathematics classrooms across six regions (Sindh, Punjab, Khyber Pakhtunkhwa, Balochistan, ICT, and Gilgit-Baltistan), while the other study evaluates the effectiveness of Public-Private Partnership (PPP) schools in two regions of Pakistan (Punjab and Sindh). Rather than describing mixed-methods typologies, we will discuss the lessons learned from implementation, with particular emphasis on planning, adaptability, analytical sequencing, and stakeholder engagement.
Lessons Learned
Planning for Integration from the Outset
One of our strongest reflections is that integration must be planned from the very beginning of a study, rather than added at the end. In both research projects, considerations about how quantitative and qualitative methods would work together shaped the development of instruments, sampling strategies, and analytical questions. Early planning helped ensure coherence across tools, for example, by aligning survey constructs and observation rubrics with interview questions so that emerging patterns could be compared and interpreted more easily.
At the same time, we learned that methodological rigour requires flexibility in the field. Despite detailed planning, access conditions in one province changed midway through data collection and analysis when official permissions were delayed. Responding to this shift required rapid adjustments to sampling and timelines while maintaining analytical integrity. This experience showed that rigorous mixed-methods research at scale depends on both careful design and the ability to adapt to unexpected challenges.
Policy Sensitivity and Multi-layered Interpretation
In the policy arena, quantitative results can be powerful, but they can also be misunderstood or used selectively. When broad conclusions about system performance or school effectiveness are drawn from numbers alone, the contextual evidence behind those numbers is disregarded; stripped of context, figures can be misread, exaggerated, or attributed to the wrong causes, because the organisational and resource factors behind them are ignored (Biesta, 2007). For example, Steiner-Khamsi (2003) identified three common political reactions to international assessments such as TIMSS (Trends in International Mathematics and Science Study) and PISA (Programme for International Student Assessment): scandalisation, glorification, and indifference. In Japan, TIMSS results reinforced confidence in existing educational methods, while in Germany, PISA results triggered urgent calls for structural reform, demonstrating that the same data were processed through pre-existing national political agendas rather than interpreted neutrally. Given the alignment of our studies with ongoing policy reforms, we were mindful of the consequential use of quantitative findings, what Volante and Klinger (2023) describe as policy refraction: the tendency of policymakers to prioritise evidence that aligns with their prior political preferences. In our studies, qualitative data played a critical role in contextualising statistical patterns by grounding them in the lived experiences of teachers, school leaders, and communities (Creswell & Clark, 2017). For instance, numerical differences across school types became more meaningful when interpreted alongside narratives about resource constraints, governance practices, and classroom realities. This mixed-methods approach strengthened our interpretation of the findings and helped prevent oversimplified accounts that ignore context.
Beginning Qualitative Analysis Early
Another key lesson was about the timing of qualitative analysis. Rather than waiting until all data had been collected, we started transcribing and analysing qualitative data alongside fieldwork. This early engagement with the data enabled us to identify gaps, refine interview questions, and sharpen our analytical focus while the fieldwork phase was still active (Corbin & Strauss, 2014). For example, in the Public-Private Partnership study, early coding of teacher interviews in Sindh revealed that questions about instructional time were being understood differently across school types. This prompted us to revise a survey question and add a follow-up probe during subsequent school visits. Without this simultaneous analysis, the issue would have remained undetected. This iterative approach allowed us to integrate emerging qualitative insights with ongoing quantitative analysis, improving both the overall coherence and depth of the analysis. It also made our research instruments more responsive, in real time, to what we were learning in the field.
Stakeholder Engagement as a Methodological Strength
Our experience reaffirmed that stakeholder engagement is not just an ethical commitment but a methodological strength. In mixed‑methods research, sustained engagement refers to repeated interactions with participants and returning to field sites over time to support contextually grounded interpretation of data (Creswell & Clark, 2017; Teddlie & Tashakkori, 2011). In practice, this involved maintaining regular communication with school leaders and teachers and visiting schools multiple times. This approach improved response rates, eased logistical coordination, and encouraged more open discussions, particularly on issues that are difficult to capture through structured tools. At the same time, sustained engagement carries risks, such as over‑identification, blurred boundaries, or reduced analytic distance (Corbin & Strauss, 2014). We addressed these risks through several reflexive practices during the research process. First, the research team conducted regular debriefing sessions to critically examine emerging interpretations against the empirical data rather than relying on impressions shaped by relationships developed in the field. Second, the team made a concerted effort to include differing and even conflicting perspectives in the analysis, rather than settling on a single narrative.
These practices helped preserve analytical rigour and transparency while reflecting the complexity of participants’ experiences. They were particularly important in the Pakistani context, where hierarchical relationships between researchers and school leaders may shape interactions and influence how participants articulate their perspectives. At the same time, meaningful engagement is an ethical commitment: it respects participants’ time, encourages reciprocity, and supports the accurate representation of participants’ views and experiences (Tracy, 2010). Collectively, these relational dynamics enhanced the depth and credibility of our qualitative insights across both studies.
Conclusion
Adopting mixed‑methods approaches enabled us to develop a more holistic and credible understanding of the issues under study by combining statistical trends with contextually grounded explanations (Creswell & Clark, 2017; Teddlie & Tashakkori, 2011). The value of these approaches lies not only in drawing on multiple data sources but also in adapting methods to fit the realities of educational systems shaped by different actors, uneven resources, and shifting policy demands (Morrison & See, 2026). For researchers undertaking large-scale studies in contexts such as Pakistan, data integration should be seen as an ongoing methodological practice rather than something fixed at the design stage, because the most consequential integration decisions are made during fieldwork and analysis, not only at the planning phase. That said, we acknowledge that mixed-methods approaches do not resolve all interpretive challenges. We hope these reflections offer useful guidance for other researchers undertaking large-scale studies.
[1] PhD Student, School of Education, Durham University, United Kingdom
[2] PhD Candidate, Aga Khan University, Institute for Educational Development, Pakistan & Research Associate, REAL Centre, University of Cambridge, United Kingdom
[3] Associate Professor, Aga Khan University, Institute for Educational Development, Pakistan
[4] Professor, Aga Khan University, Institute for Educational Development, Pakistan
References
- Biesta, G. (2007). Why “what works” won’t work: Evidence‐based practice and the democratic deficit in educational research. Educational Theory, 57(1), 1-22.
- Corbin, J., & Strauss, A. (2014). Basics of qualitative research: Techniques and procedures for developing grounded theory. Sage Publications.
- Creswell, J. W., & Clark, V. L. P. (2017). Designing and conducting mixed methods research. Sage Publications.
- Morrison, K., & See, B. H. (2026). Introduction to the field of mixed methods research in education. In Handbook of Mixed Methods Research in Education (pp. 1-21). Edward Elgar Publishing.
- Steiner-Khamsi, G. (2003). The politics of league tables. Journal of Social Science Education, 1.
- Teddlie, C., & Tashakkori, A. (2011). Mixed methods research. In The Sage Handbook of Qualitative Research, 4(1), 285-300.
- Tracy, S. J. (2010). Qualitative quality: Eight “big-tent” criteria for excellent qualitative research. Qualitative Inquiry, 16(10), 837-851.
- Volante, L., & Klinger, D. A. (2023). PISA, global reference societies, and policy borrowing. Policy Futures in Education, 21(1).