TERMS OF REFERENCE FOR INSTITUTIONAL CONTRACT

Summary: UNICEF plans to engage a technical agency/firm to support the Directorate of Primary Education (DPE) in carrying out the National Student Assessment (NSA) 2026. The NSAs are nationally representative sample-based assessments administered to Grade 3 and Grade 5 students in Bangla and Mathematics. The Grade 3 assessment provides a mid-point estimate of student performance within the primary education cycle, while the Grade 5 assessment provides an estimate of the achievement of terminal competencies. The agency will provide additional technical capacity to the UNICEF education team so that the assignment is completed effectively and on time.

Title of the assignment: Provide technical assistance for the design, piloting, implementation, reporting and use of findings of the National Student Assessment (NSA) 2026 of the primary education subsector for the Directorate of Primary Education (DPE), Ministry of Primary and Mass Education (MoPME) in Bangladesh.

Purpose: To design, pilot, implement, and report on the National Student Assessment (NSA) 2026 for the primary education subsector, and to support DPE and MoPME in the effective use of the assessment findings for policy development, planning, and improvement of educational outcomes.

Location: The assignment will be based in Dhaka; the international firm is expected to plan at least eight visits to Dhaka to work closely with DPE and UNICEF.

Estimated duration: Around 30 months (July 2025 – December 2027)

Supervisor of the assignment: Education Officer

1.
Background and Context

Education is both a fundamental human right and a key instrument for propelling economic productivity, civic engagement, socio-cultural advancement, and sustainable development by developing knowledgeable, aware, resilient, and skilled human capital. Bangladesh, as part of its national development vision, has significantly expanded basic education provision over the past few decades, achieving almost universal enrolment and sizeable completion rates at the primary level. However, learning achievement has not improved at the same pace. The quality of education and learning levels remain significantly low, which limits children's achievement of foundational skills and thereby their effective participation in 21st-century societies and economies. Bangladesh's current national development plan, education sector plan and reform programmes all list improving the quality of education as a key priority. Improving learning is also at the core of Sustainable Development Goal 4 (SDG 4), which calls on countries to “ensure inclusive and equitable quality education and promote lifelong learning opportunities for all” by 2030. SDG Target 4.1 and the related indicator 4.1.1 particularly emphasize learning outcomes[1] by measuring the Minimum Proficiency Level (MPL) at lower primary, end of primary and end of lower secondary. Assessment of learning competencies, at both school and system levels, generates substantial information, evidence, and comparable data on levels of achievement and helps identify contextual drivers of learning inequalities at different levels. Large-scale assessments focus on overall system performance and how well sector-level interventions are working. Their results are intended to be used for policy formulation, system-wide decision-making and needs-based interventions.
The Directorate of Primary Education (DPE), under the guidance of the Ministry of Primary and Mass Education (MoPME), launched Bangladesh's first large-scale learning assessment of primary-level students in 2006, under the title 'National Student Assessment (NSA)'. The NSAs are nationally representative sample-based assessments administered to Grade 3 and Grade 5 students in Bangla and Mathematics. Assessment at Grade 3 provides a mid-point estimate of student performance within the primary education cycle, while assessment at Grade 5 provides an estimate of the achievement of terminal competencies. Performance in Bangla and Mathematics provides insights into how students are doing on foundational literacy and numeracy, the skills on which learning in other subjects and achievement of the other competencies specified in the curriculum depend. Since its inception in 2006, a total of seven rounds of NSAs have been conducted, the latest in 2022. Methodologically, the NSAs have grown stronger over the years, moving from a Classical Test Theory (CTT)-based approach to Item Response Theory (IRT) to ensure comparability of learning performance across years and grades.[2] Results are also compared across student subgroups by gender, geographical location and school type. Further methodological improvements were pursued in NSA 2017 and NSA 2022, where the reporting of achievement scores moved from proficiency bands (five bands: Band 1, 2, 3, 4 and 5) to performance standards (four levels: below basic, basic, proficient, advanced)[3] under a proficiency standard-setting approach. The next round of the NSA is scheduled for 2026. The primary curriculum was revised in 2022, which initiated the textbook development process. Typically, each NSA round includes a set of pilot items alongside the main items.
The main items are used to assess student performance for the current cycle, while the pilot items are reserved for future cycles. After the assessment, the pilot items are analysed, and those that are statistically valid and reliable are selected for the next round. However, no items were piloted during NSA 2022 due to the ongoing curriculum and textbook revisions. Therefore, for NSA 2026, new items need to be developed and piloted by November/December 2025. The NSA is mainly managed by DPE with overall guidance from MoPME. The technical unit/line division for conducting the NSA, including developing, piloting, and finalizing test items, is the National Assessment Cell (NAC) under the Monitoring and Evaluation (M&E) Division of DPE, in collaboration with the National Curriculum and Textbook Board (NCTB) and the National Academy for Primary Education (NAPE). The NAC works closely with the technical agency/firm recruited to carry out the NSA. UNICEF supported DPE in carrying out NSA 2022 and will do the same for NSA 2026. For this purpose, UNICEF plans to engage an international technical agency that will work closely with DPE and NAC on NSA 2026. It is important to note that although the main assessment will be administered in November/December 2026, the NSA process will start in July 2025, because the new items have to be piloted in December 2025. As per the technical process, both the piloting of new items and the main assessment are conducted at the end of the year to capture the learning of the whole academic year. Before new items are developed, an assessment framework needs to be prepared, which defines the scope of learning competencies and their weighting across cognitive domains. New items will then be developed for piloting. Therefore, the technical agency needs to be onboarded in July 2025.

2.
Rationale / Purpose of the evidence activity

Large-scale national student assessments play a crucial role in understanding the performance of an education system and offer significant advantages compared to public examinations. The purpose of public examinations is mainly to assign grades to students and promote them to the next grade or level of education; they are high-stakes assessments that mostly focus on students' content knowledge. At the same time, it is important to understand the effectiveness of an education system and how students are doing in terms of achieving the competencies set for their respective grades. NSAs enable the generation of substantial information, evidence, and comparable data regarding students' achievements. They also help identify contextual factors that contribute to learning inequalities across various levels, and enhance our understanding of the challenges and obstacles children face in achieving the expected learning outcomes. Such insights are crucial for the development of relevant and effective sector plans, overall improvement of education systems, identification of teacher performance or management issues, curriculum and teaching material revisions, promoting equity in learning opportunities and outcomes, and enhancing education management and information systems. Overall, the rationales for carrying out the NSA are as follows:

Measuring learning outcomes: NSAs provide a comprehensive measure of student learning and performance across the country. They offer insights into how well students are meeting the competencies for their respective grades.

Informing policy and decision-making: The data gathered from the NSA help policymakers and education authorities make informed decisions about curriculum development, resource allocation, and educational reforms. They identify strengths and weaknesses in the education system, guiding targeted interventions.
Identifying learning disparities: The NSAs highlight disparities in educational achievement among different regions, socioeconomic groups, and demographic segments. This information is vital for addressing inequalities and ensuring that all students have access to quality education.

Improving accountability: NSAs hold education authorities accountable for student performance. By providing an objective measure of educational outcomes, they shed light on how successful the education authorities are in delivering quality education services to students.

Guiding professional development: The results can inform teacher training and professional development programs by identifying areas where educators need additional support or resources. This helps improve teaching practices and, ultimately, student learning outcomes.

Tracking progress over time: By conducting NSAs regularly, education authorities can track progress and trends in learning achievement over time. This longitudinal data is essential for understanding the effectiveness of educational policies and initiatives.

Enhancing parental and community engagement: NSAs provide valuable information to parents, communities, and stakeholders about the state of education. This transparency can increase public engagement and support for educational initiatives.

3. Objectives

The objective of this assignment is to provide technical assistance to the Directorate of Primary Education (DPE) in the design, development, administration, analysis, and reporting of the National Student Assessment (NSA) 2026. The assignment will also focus on comparing the 2026 findings with the results of NSA 2022 and earlier rounds, while embedding strategies for planning future assessments. Specifically, the NSA 2026 aims to:

Assess whether, and to what extent, children are learning by measuring what they know and can do.

Analyse changes in student learning outcomes compared to previous NSA rounds.
Identify specific groups of students who are falling behind in learning achievement.

Examine critical barriers or issues hindering student learning, with an emphasis on variations across regions and geographical areas.

4. Scope

The agency will work directly with the National Assessment Cell (NAC), under the Monitoring and Evaluation Division, the other respective line divisions of DPE, and the UNICEF team throughout the study. Collaboration with and guidance from UNICEF and other Development Partners (DPs) will also be critical to carrying out a technically sound NSA and producing a comprehensive report. NAC, UNICEF and the agency will each have a distinct set of roles, with close collaboration among them in all aspects of the process. The agency will mainly support: i) developing the assessment framework; ii) sampling schools for piloting new items; iii) analysing the piloting results and selecting items for the main test; iv) updating/finalizing contextual questionnaires; v) finalizing the test booklets; vi) developing technical standards; vii) sampling for the main test; viii) providing training for monitoring test administration; ix) providing training for scoring filled-in test booklets; x) data entry; xi) data analysis; xii) reporting NSA results; xiii) preparing policy briefs and divisional reports; xiv) disseminating results and promoting the use of NSA data; and xv) capacity building of NAC through embedded and targeted capacity development activities.
The NAC will mainly be responsible for: i) developing new items by engaging experts in the respective areas; ii) sharing the list of all schools for sampling; iii) collecting (or verifying through IPEMIS[4]) the student lists from the sampled schools; iv) notifying selected schools about the NSA administration; v) sending the sampled list of students to the selected schools; vi) printing and distributing test booklets and survey questionnaires; vii) selecting school-wise invigilators and supervisors; viii) providing orientation to the invigilators and supervisors on their roles; ix) administering the test; x) collecting the filled-in test booklets and sending them to the agency; and xi) organizing review and dissemination workshops. UNICEF will act as the key coordinating point for the whole NSA process. UNICEF will work hand in hand with DPE and the agency to ensure the technical soundness of the process and the contextual alignment of each activity, and will take part in all the technical processes carried out by the agency and NAC. All documents prepared by the agency will first be reviewed by UNICEF before being sent to NAC. UNICEF will also duly share all documents with the respective Development Partners (DPs) for their review and comments. Although the parties have different sets of responsibilities, each activity will be carried out in a coordinated manner, and proper and adequate consultation among them is indispensable.

5. Details of the assignment and scope of work

The tasks and activities will cover the design, development, implementation, and management of NSA 2026. All activities are to be undertaken in close consultation with DPE, NAC, NCTB and NAPE to ensure national ownership, capacity, and long-term sustainability. The overall scope of work has been divided into the following areas to facilitate the process; however, these steps are not linear and should be understood as indicative of the actual activities.
Most of the activities will require in-country, face-to-face presence and hands-on training for optimal capacity development and to ensure high technical standards.

I. Developing the Assessment Framework

After inception, the first task is to develop a comprehensive assessment framework by September 2025. Based on the framework, the new items for the NSA will be developed, piloted and administered. The assessment framework needs to specify the outcomes to be measured (content, skills, and understandings), along with a contextual framework that defines the relevant contextual factors linked to those outcomes. The assessment framework should also outline what, why, how, and who is being assessed. It will guide the design of tools, sampling, implementation, analysis, and reporting, making it a key element in strengthening national assessment systems. However, it should remain flexible to adapt to changing conditions during assessment planning and implementation. The framework will include provisions for children with disabilities to ensure equitable participation in the test. It is important to note that the NSA typically assesses the achievement of grade-level competencies in Grades 3 and 5. However, Bangladesh has to report on SDG target 4.1.1, which requires data on the proportion of children achieving at least a minimum proficiency level in reading and mathematics in Grade 2/3 and at the end of primary (Grade 5 for Bangladesh). MoPME and DPE are planning to assess both grade-level competency and minimum proficiency during NSA 2026. Therefore, the assessment framework must include all technical criteria required for reporting on the SDG 4.1.1 indicator, as well as specific protocols and modalities for both grade-level competency and minimum proficiency. Accordingly, the items should be developed so that they can assess both.

II.
Piloting of new items

As mentioned, due to the changes in the curriculum and textbooks, items for NSA 2026 were not piloted during NSA 2022. For this reason, new items based on the existing curriculum and textbooks of Grades 3 and 5 will be developed and piloted in December 2025. As noted above, the piloting of new items is conducted at the end of the academic year to capture the learning of the whole academic year. DPE and UNICEF will engage a pool of experts and organize item-writing workshops for this purpose. The pool of experts will include subject specialists, curriculum specialists and assessment specialists. DPE and UNICEF, in consultation with other DPs, will prepare a list of experts based on previous experience in the respective areas. The experts will then be contacted and invited to the workshops to develop the items. The agency will support the development of guidelines for item development and review in accordance with the assessment framework. The guidelines should include instructions on how adjustments/reasonable accommodations will be ensured while developing the items. The agency is also expected to provide orientation to the item developers on the guidelines and modalities of item development. Once the items are reviewed and finalized, the agency will select the schools where the new items will be piloted. The number of sample schools/students will be determined in discussion with DPE. The administration of the pilot test will be carried out by DPE, with the agency providing guidance on the pilot administration process. Once the test is completed, the agency will collect the answer papers for scoring, data entry and analysis to determine the items for the main test in 2026. The number of items, the number of booklets and the order of items will be determined based on the final test blueprint/table of specifications included in the assessment framework.

III.
Design of NSA 2026

Once the pilot phase is completed, the design of NSA 2026 is expected to start. As the assessment framework will already have been developed before the pilot, the design phase should be informed by the framework in the relevant aspects of the process and by the results of the piloting exercise. The design phase will entail developing protocols and procedures for test design, the sampling frame and methods, the scoring and data entry plan, data cleaning and analysis, the plan for data use, and approaches for capacity building of NAC. As part of the design, the items for the main test will be finalized. The NSAs also include survey questionnaires for students, subject teachers and head teachers in order to understand how different factors are associated with students' learning. The questionnaires from the previous round will be reviewed, updated and finalized for NSA 2026. To maintain comparability with previous rounds, the design process needs to be informed by the approaches followed during those rounds, with necessary revisions and updates that do not compromise comparability. The whole design phase will be carried out by the agency in close coordination with NAC and UNICEF. As decided by MoPME and DPE, technical standards are to be developed as part of NSA 2026. Previously, some standards were mentioned sporadically in design documents and test administration guidelines, but there was no separate document that clearly outlined all standards for the NSA process in one place; such a document is expected to be developed in this round. Its purpose is to set benchmarks and best practices for sampling, data collection, analysis and presentation, test administration, and the management of the overall NSA process. The agency may build on existing good practice in technical standards developed for large-scale assessments in different countries (e.g., PISA).
This document is expected to detail each standard, its rationale, and the quality assurance data that need to be collected to demonstrate that the standard has been met. The agency will draft the technical standards document, which will be finalized based on discussion with NAC and UNICEF.

IV. Sampling

The National Student Assessments (NSAs) typically involve nationally representative samples. In the previous round, estimates were generated at both the divisional and district levels. For NSA 2026, the sampling design must be tailored to ensure comparability with the previous round while also producing statistically reliable and representative estimates at the national, divisional, and district levels. Based on the methodology agreed during the design phase, the selected agency will be responsible for drawing adequate and representative samples from all districts, covering the relevant school types and geographical locations. All samples must be drawn using appropriate statistical formulas to maintain methodological rigour. Given the cost and logistical challenges associated with test administration, the number of upazilas (sub-districts) to be sampled may be determined in advance, ensuring alignment with the agreed sampling methods and formulas. The sampling frame for NSA 2026 will include all types of primary schools: public, private, and NGO-run institutions. As part of the sampling process, a verification mechanism must be incorporated to ensure the completeness and accuracy of the sampling frame. Additionally, clear inclusion criteria should be established, such as including only schools that have been operational for a minimum number of years and meet a defined enrolment threshold. Once the schools are sampled, DPE will contact the selected schools to collect the list of all students enrolled in Grades 3 and 5. If the lists are available in IPEMIS, DPE will verify them with the schools and, if necessary, update them.
The final list will then be shared with the agency for student-level sampling. For NSA 2026, a representative/adequate number of children with disabilities will be included in the sample. To that end, when DPE shares the school list, it will indicate which schools have children with disabilities enrolled, including the number of such enrolments. The agency will use appropriate sampling methods for selecting schools with children with disabilities enrolled. More details on sampling can be found under Section 6 (Methodology).

V. Administration of test and survey

Administration of the test and survey is the next step in the process. Preparation for test administration is a critical task before the actual test takes place. It entails a list of activities, including developing the test booklets (different sets) based on the finalized test items, preparing design layouts for test booklets and survey questionnaires, developing/updating test administration guidelines, printing and distributing test items, survey questionnaires and other materials, selecting the invigilators and supervisors, and organizing orientation for invigilators and supervisors. A critical task during test administration is to monitor and observe the administration process. In this process, the agency will have two main roles. First, the agency will support the preparation of test booklets, mainly by distributing the final items across the different sets; designing, printing and distributing the test booklets and survey questionnaires will be done by DPE and UNICEF. Second, the agency will support the monitoring of the test administration process, including developing a checklist for the observers, creating a pool of observers and facilitating the monitoring process. The test administration process will also be monitored by DPE, UNICEF and other DPs working in the primary education sector.
Other tasks that are part of the test administration process will mainly be carried out by DPE and UNICEF. Once the test is completed, DPE will collect all the filled-in test booklets and survey questionnaires from the schools. Proper instructions for the supervisors and invigilators on how to package and send the booklets and questionnaires to DPE will be provided in the test administration manual. After all the documents reach DPE, the agency will verify and collect them.

VI. Data processing and analysis

Tasks will entail developing protocols, procedures, reports, and methodological standards related to data entry, data cleaning, variance estimation, scaling, equating, analysis, results computation, factor analyses and reporting. The agency will undertake these tasks in close collaboration with the NAC team, with the objective of building their capacity, including training on the software/systems to be used. The agency will develop a data analysis plan that includes a description of the methodology and examples of outputs for all potential analyses considered methodologically valid and applicable from the database and derived constructs, to feed the main national report and additional products such as policy briefs and research papers. The plan should take into consideration international experience from large-scale assessments and various analytical methods, instruments, and parameters. The agency will develop the outline of the main standardised national report based on the agreed topics selected from the analysis plan. Marking the filled-in test booklets or answer sheets is an important step before data entry. The agency will work collaboratively with DPE to develop the scoring guidelines and matrix. The agency will then engage a pool of scorers/markers to assess the answer sheets, and will organize in-person training for the scorers, ensuring that proper and adequate hands-on practice is embedded in the training program.
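As an illustration of the variance-estimation treatment mentioned under data processing and analysis: because students are sampled within school clusters, standard errors for proficiency proportions should account for clustering and design weights rather than assume simple random sampling. The sketch below is a simplified Taylor-linearization approach; the function name, data layout and formula simplifications are illustrative assumptions, not the NSA's prescribed estimation procedure.

```python
import math
from collections import defaultdict

def weighted_proportion_with_cluster_se(records):
    """Design-weighted proportion of students meeting a proficiency cut,
    with a cluster-robust (school-level, linearized) standard error.

    records: list of (school_id, weight, meets_standard) tuples, where
    meets_standard is 0/1. This is a simplified sketch that ignores
    stratification and finite population corrections.
    """
    total_w = sum(w for _, w, _ in records)
    p_hat = sum(w * y for _, w, y in records) / total_w

    # Aggregate weighted residuals by school (the sampling cluster).
    cluster_scores = defaultdict(float)
    for school, w, y in records:
        cluster_scores[school] += w * (y - p_hat)

    z = list(cluster_scores.values())
    k = len(z)  # number of sampled schools
    z_bar = sum(z) / k
    # Between-cluster variance of the weighted residual totals.
    var = k / (k - 1) * sum((zi - z_bar) ** 2 for zi in z) / total_w ** 2
    return p_hat, math.sqrt(var)
```

In practice the clustering of students within schools usually inflates standard errors relative to a simple-random-sampling formula, which is why the sampling design and the variance estimation method must be specified together.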
Once marking is completed, data entry will begin. As part of IPEMIS, DPE has developed an NSA module that allows data entry in the system. Before data entry, the agency will work with DPE to prepare the data entry template based on the data field requirements. The template will then be embedded in the module and data entry will be done in the system. The agency will engage a pool of data entry operators for this purpose. Upon completion of data entry, the system will generate Excel files with all the raw data, which the agency will collect from DPE for analysis. Data analysis should be conducted in a way that properly and adequately responds to the study questions. For NSA 2026, data analysis is expected to yield: the proportion of students achieving grade-level competencies; the proportion of students achieving minimum proficiency; and performance disaggregated by division/district, school type, geographical location and disability status. The analysis will also focus on identifying factors that affect or are associated with students' learning.

VII. Reporting for NSA 2026

Based on the analysis, the agency will start writing the NSA 2026 report. As per typical practice, there will be one public report and one technical report. The findings in the public report should be presented in a way that is easily readable and understandable by all, with only the necessary technical details included; most of the technical details are expected to appear in the technical report. The agency will also support the production of additional short reports or briefs tailored to specific audiences (e.g., MoPME/DPE, sub-national levels, NCTB, teachers, etc.). These reports will mainly focus on the respective findings and specific actions. In previous NSA reports, the reporting on the factors associated with learning has been weak, as identified by DPE and the DPs.
In most cases, it is extremely difficult for stakeholders to understand which factors contribute positively or negatively to learning. For this reason, during NSA 2026 the agency will make every effort to improve the reporting of the factors associated with learning. The typical practice has been univariate analysis of the factors, which does not give a comprehensive picture of the dynamics among them. Therefore, in NSA 2026 the agency will carry out multivariate/regression analysis to capture a comprehensive picture. Identifying these factors is critical, as it can inform policy changes and evidence-based decision-making. Once the NSA reports are drafted, DPE and UNICEF will facilitate the review process. This will include sharing the reports with all DPs for their review and organizing review workshops involving all government and DP officials. The agency is expected to attend the review workshops in person and present the findings. The agency will also take note of all comments and suggestions from the participants and address them to the extent possible in finalizing the reports.

VIII. Communication, policy guidance, advocacy and use of findings

This task includes all activities linked to packaging, developing a dissemination plan and advocacy strategies, communicating findings in ways that are meaningful and actionable for policymakers, teachers, and other stakeholders, and leveraging findings in policy discussions to create change. The agency will be expected to contribute to sharing findings and recommendations from NSA 2026 and other assessments with the intended target audiences, including policymakers, education officials, service providers, teachers, parents, and students. It also includes consideration of methods and formats for data sharing and feedback processes to ensure optimal uptake of the findings.
This may also include support for communication strategies and the provision of recommendations and follow-up actions based on the findings and recommendations from NSA 2022. These can include recommendations for policymakers, curriculum, teacher education, quality assurance, etc., for improved education sector plans, budgets, and reforms. Capacity development on the analysis of learning results to inform policy and classroom practice is considered an integral part of such efforts. The agency will also contribute to the official launch of the report and review the presentations for quality assurance. The agency will travel to participate in the launch activities and contribute to presenting the report and results.

IX. Capacity strengthening

Tasks will include: a capacity gap assessment of relevant government agencies, including NAC, on how to measure and track learning outcomes at the national level using common standards and approaches; developing short-, medium- and long-term capacity building plans, approaches, and guidelines in alignment with the current provisions for Continuous Professional Development (CPD) activities; and rolling out the plan and documenting the process for the institutionalization of the trainings and other capacity building activities.

X. Management and project coordination

Applicants will be expected to include overall coordination and management activities in their proposals. Such activities relate to coordinating team members, monitoring, recording, troubleshooting, and documenting processes, successes and lessons learned. The coordination function is critical given that numerous partners are involved at different levels. Below is a list of some examples of coordination and management tasks.
- Regular communication and liaison with UNICEF and DPE team members
- Supervision of the assessment's technical standards
- Remote technical support to the country team
- Travel to the country to provide technical support

All the work has to be undertaken in collaboration with, and with the involvement of, relevant officials of the DPE M&E Division, NAC, NCTB, NAPE and teachers, with hands-on support to NAC officials at all stages of the process. Guidance and support are to be taken from the steering and technical committees established for this purpose, as well as from the DPs working in the sector. These would become important in establishing communication with divisional levels and below for test administration, standards, and operations.

6. Methodology
Based on the descriptions in section 5, and to achieve the objectives outlined under section 3, the agency is expected to propose a technically sound methodology that includes: i) study approach; ii) sampling method; iii) data collection process; iv) data entry and analysis process; v) reporting approach; and vi) quality assurance mechanism. The study methodology and other details are to be included in the technical proposal and then in the inception report. The proposed methodology will be agreed and finalized during the inception phase through a participatory process and consultation with DPE, UNICEF and other Development Partners. The following is an indicative methodology for this study:

Study approach: The study will employ quantitative methods of data collection and analysis. Below is a summary of the proposed methods for data collection, including the respondents for each method.
Methods and respondents:
- Proficiency test in Bangla and Mathematics: students of Grade 3 and Grade 5
- Survey questionnaires: all students who participate in the proficiency test; subject teachers at the selected schools; head teachers at the selected schools

Proposed Sampling Strategy for NSA 2026
The sampling strategy for the National Student Assessment (NSA) 2026 will be carefully designed to ensure national representativeness and comparability with previous rounds. The sample will be stratified across key dimensions to capture essential variations in learning outcomes. Stratification will be based on geographical region (the eight administrative divisions of Bangladesh), urban versus rural location, school management type (public, private, and NGO-run schools), gender (to ensure balanced representation of boys and girls), and disability status (to ensure representation of children with disabilities through targeted oversampling or boost samples). The school will serve as the Primary Sampling Unit (PSU).

A multistage cluster sampling approach will be adopted. In the first stage, schools will be selected from the updated sampling frame using probability proportional to size (PPS) methods, stratified by division, location type, and school management type. In the second stage, a random sample of students will be selected within each chosen school, ensuring appropriate grade-level coverage and representation. Student lists maintained at the school level will be used for random selection, and additional measures will be taken to oversample or boost the representation of students with disabilities wherever applicable.

Sample size calculations will be conducted to ensure reliable and precise estimates. Key parameters include a 5% margin of error for learning outcomes at a 95% confidence level, with at least 80% statistical power (preferably 90% if resources allow).
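As a rough illustration, the following sketch combines the precision parameters above with the design effect, response-rate and ICC adjustments described in the next paragraph, together with a base sampling weight as the inverse of the overall selection probability. The parameter values are taken from the stated ranges; the school and student selection probabilities used for the weight are purely hypothetical.

```python
import math

# Hedged sketch of the sample-size arithmetic: not the agency's actual
# calculation, just the standard formulas with the TOR's stated parameters.
z = 1.96          # z-score for a 95% confidence level
e = 0.05          # 5% margin of error
p = 0.5           # conservative proportion assumption
deff = 2.0        # upper end of the stated design effect range (1.5-2.0)
response = 0.95   # minimum expected response rate
icc = 0.2         # within the stated ICC range (0.15-0.25)

# Simple random sample size, then inflated for clustering and non-response
n_srs = (z ** 2) * p * (1 - p) / e ** 2          # about 384
n_final = math.ceil(n_srs * deff / response)

# Cluster size implied by the design effect: DEFF = 1 + (b - 1) * ICC
cluster_size = (deff - 1) / icc + 1

# Base weight for one sampled student (hypothetical selection probabilities):
p_school = 0.10      # PPS inclusion probability of the school
p_student = 20 / 45  # 20 students drawn from a roster of 45
base_weight = 1 / (p_school * p_student)

print(round(n_srs, 1), n_final, cluster_size, base_weight)
```

In practice the base weights would then be adjusted for non-response and post-stratification (e.g. by gender and district enrolment), as the text above describes.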
A design effect between 1.5 and 2.0 will be applied to account for clustering effects, and the sample size will be adjusted to achieve a minimum 95% response rate. Intra-class correlation coefficients (ICC) based on previous NSA rounds or international experience (typically between 0.15 and 0.25) will also be taken into consideration to determine optimal cluster sizes. Sampling weights will be computed to reflect the selection probabilities at each stage of sampling and adjusted for non-response and post-stratification factors such as gender and district-level enrollment figures. Proper application of sampling weights in all analyses will ensure unbiased and representative estimates for the target population.

Special consideration will be given to the representation of children with disabilities. Schools known to enroll children with disabilities will be identified and targeted during the sampling frame verification. Oversampling or boost sampling strategies will be applied as needed to ensure sufficient sample sizes for meaningful subgroup analyses. Data from smaller subgroups will be analyzed using weighted analysis and, if necessary, pooling techniques across multiple years. Small area estimation (SAE) methods may also be employed to improve the robustness of findings for these subpopulations. The proposed sampling frame, in terms of numbers and representation, is given in Annex-F of the RFP document.

7. Risks and Mitigation Measures
See Annex-F in the RFP document.

8. Ethical Considerations
The agency contracted for the study is expected to follow the ethical principles and considerations outlined in the United Nations Evaluation Group (UNEG) Ethical Guidelines for Evaluation and the UNICEF Procedure for Ethical Standards in Research, Evaluation and Data Collection and Analysis. In addition, the UNEG norms and standards will be observed.
As per UNICEF standards for ethical research, the agency must give special attention to ethical considerations and should put in place adequate measures for ethical oversight throughout the study/evaluation period. All researchers and field investigators involved in primary data collection should have undergone basic ethics training, which at a minimum includes completing UNICEF's AGORA course on Ethics in Evidence Generation or its equivalent. In conducting the study, the agency must ensure informed consent, respecting people's right to provide information in confidence and making study participants aware of the scope and limits of confidentiality. Furthermore, the agency is responsible for ensuring that sensitive information cannot be traced to its source, so that the relevant individuals are protected from reprisals. Data storage and security must be ensured at all stages of the study. Only selected personnel from the research agency should have access to de-identified data, and only anonymised data should be shared externally and with UNICEF (unless stated otherwise).

9. Use of Findings
Since the main intention of the NSA is to assess system performance, especially the efficiency of interventions in primary education, its findings have uses at various levels. The following are the areas where the NSA findings are intended to be used:

Preparing an action plan based on NSA findings: An action plan will be prepared based on the NSA findings, with key actions and a timeline, focusing on targeted interventions that directly and indirectly contribute to the improvement of children's learning outcomes at the primary level.

Disseminating the NSA findings nationally and divisionally: The major findings from NSA 2026 will first be shared nationally with key relevant government agencies and development partners.
Then the divisional analysis of the NSA results will be conducted, and the findings will be shared with officials and teachers working at the division, district, upazila and school levels. These disseminations will also include discussions on possible ways forward at the national and sub-national levels, and agreement on actions that are implementable at different levels.

Preparing policy briefs: Based on the findings, DPE and UNICEF will work together to identify the most important factors associated with children's learning, which will lead to the development of several policy briefs. The policy briefs will then be used for organizing policy events and further advocacy on the most critical issues.

10. Publication Plan
A comprehensive NSA public report will be prepared and published, which will be accessible to all. Also, as mentioned, the findings will be disseminated nationally and sub-nationally with all related stakeholders.

11. Schedule of Deliverables, Timeline and Payment
See Annex-F in the RFP document.

12. Team composition, Qualifications & Experience required
This assignment requires both international and national expertise. The expectation is that the international agency will take the lead in submitting the proposal and assembling the required expertise, at both international and local levels, across the technical areas. The key considerations in assembling the team should include: the right balance of technical and operational skills; drawing on the right technical expertise as per international standards and the technical requirements of the tasks; drawing on the most suitable national expertise with a deep contextual understanding and tacit knowledge of ground realities in Bangladesh; and a high concern for ensuring value for money.

General requirements of the agency: Demonstrated capacity to assemble required expertise in: education development; educational assessment and evaluation; statistics/applied statistics/psychometrics.
At least 10 years of professional work experience with learning assessment programmes, including research and data analysis. Proven experience in establishing temporal comparability of learning assessments using readily available statistical software. Expertise in the use of SPSS or Stata, Microsoft Access or other database software, and IRT software (e.g., ConQuest, Iteman, specialized R libraries). Proven experience in quantitative research, especially in the analysis of large-scale assessment data. Demonstrated prior experience in conducting large-scale assessments, as well as in analysing and reporting on large-scale assessments, is required. Prior regional experience in the successful completion of such assignments will be given preference. Experience of large-scale learning assessment in Bangladesh will be considered an added advantage. Strong analytical and writing skills, including the ability to write in an engaging and informative manner and clearly synthesize information. Translation and interpretation capacity in Bangla. The agency should have implemented, or have ongoing, contracts of a similar nature with organizations/companies of similar magnitude and complexity.

The team composition would tentatively be:
- One (1) Team Leader: minimum Master's degree in a relevant field and minimum 15 years of experience in managing learning assessment programmes, preferably with experience in learning assessment in Bangladesh.
- One (1) Deputy Team Leader: Master's degree in education or another relevant field and minimum 10 years of practical experience in managing and leading learning assessment programmes in Bangladesh.
- One (1) Chief Psychometrician/Statistician: minimum Master's degree in psychometrics, statistics or a relevant field and minimum 8 years of experience in the field of learning assessment, preferably with experience in learning assessment in Bangladesh.
- One (1) Psychometrician/Statistician: minimum Master's degree in psychometrics, statistics or a relevant field and minimum 6 years of experience in the field of learning assessment, preferably with experience in learning assessment in Bangladesh.
- One (1) Literacy/reading test expert: minimum Master's degree in a relevant field and minimum 8 years of relevant experience, preferably with familiarity with the curriculum in Bangladesh.
- One (1) Math/numeracy test expert: minimum Master's degree in a relevant field and minimum 8 years of relevant experience, preferably with familiarity with the curriculum in Bangladesh.
- One (1) Field Operations Coordinator: to monitor the field-level test administration process and coordinate with government counterparts during the collection of test booklets, test marking and data entry.
- One (1) Data Manager/Coordinator: to coordinate with DPE for access to the NSA module, work on the data entry template, and supervise the data entry and cleaning process.
- Two (2) Report Writers with excellent English language proficiency and communication skills. One of the report writers should have communications experience and play the role of Communications Coordinator.
- A team of monitors who will observe and monitor the test administration process on the test day. This team will be engaged only for a certain duration based on need, not for the whole duration.
- A team of scorers who will mainly mark the filled-in test booklets and enter data into an agreed template. This team will be engaged only for a certain duration based on need, not for the whole duration.

13. Duty Station
The agency's work will be based mainly in Dhaka. However, the international agency can carry out most of the technical work remotely, with planned visits to Dhaka.
At least eight visits are expected, for the following purposes: i) finalizing the assessment framework; ii) sharing the findings of the new item piloting; iii) finalizing and developing items for the main test and contextual questionnaires; iv) carrying out the sampling process; v) conducting markers' training; vi) jointly setting the performance parameters (e.g. cut scores) after data analysis; vii) attending the workshop for sharing the results and reviewing the draft report; and viii) attending the national dissemination event. Based on the actual situation, the number of visits by the international team may vary from this estimate. During these visits, the agency will work very closely with DPE and UNICEF (and other DPs if and when required) to ensure all decisions are made in a collaborative manner.

14. Management and Supervision
DPE: As mentioned, UNICEF will be supporting DPE in carrying out the NSA 2026. Therefore, DPE will provide overall guidance and direction for conducting the assessment. All documents will be endorsed by DPE. In addition, the National Assessment Cell (NAC) under DPE will support the agency in accessing all documents, raw data from previous NSA cycles and the list of schools for sampling purposes. DPE will also lead the test administration process with technical support from UNICEF and the agency. A designated technical committee comprising officials of MoPME, DPE, NCTB and NAPE will review and facilitate the approval process during the conduct of the NSA.

UNICEF: UNICEF will serve as the technical lead for the NSA 2026. All deliverables will be reviewed by designated officials of the Education Section before being shared with DPE. UNICEF will also facilitate the discussions, correspondence and consultations between DPE and the agency. UNICEF will also ensure that all DPs are engaged in all decision-making and informed about all processes.
The UNICEF Supply Section will remain the focal point for all administrative, financial and commercial queries and correspondence, including contract amendments.