1. Introduction
In the execution and delivery of health services, timely access to quality, efficient and effective care is a critical component, particularly in pursuit of universal health care.
The implementation of the Enhancing Health Care Service Delivery (EHCSD) project aims to improve the throughput of critical services provided at public health facilities, thereby reducing waiting times for services such as admissions and reducing hospital length of stay. The Project focuses on three key areas negatively impacting the health sector's ability to respond in a timely manner to the health care needs of public patients: (i) diagnostic radiology services; (ii) elective surgeries; and (iii) the removal of social cases from hospitals.
In recognition of its resource constraints and inability to meet the growing demand for diagnostic services, the Ministry of Health and Wellness (MOHW) commenced the outsourcing of the following imaging services in September 2019: CT scan, MRI, ultrasound, angiography, endoscopy and histopathology. Patients are referred to an approved list of private imaging service providers, and all costs for outsourced services are borne by the Government of Jamaica (GOJ). Since September 2019, more than 18,000 tests have been completed nationally and the MOHW has paid out approximately one billion Jamaican dollars for the service. An existing framework agreement covers sixteen approved service providers under the project, who are contracted to provide services to 23 hospitals nationwide.
2. Objectives of Project (Component 1)
Component 1 of the EHCSD project is the first national outsourcing arrangement for the provision of diagnostic imaging studies. The overall objective of this component is to increase access to diagnostic imaging and radiology services for persons accessing treatment within the public healthcare system. The component is expected to contribute to the reduction of wait times for diagnosis, treatment and hospital admission for patients accessing services through accident and emergency departments and outpatient clinics. It should also contribute to reducing length of stay and the use of bed space by patients admitted to wards awaiting treatment.
3. Purpose of The Evaluation
It is recommended that a midterm evaluation (MTE) be completed for all projects, even if, as in this case, on a relatively small and internal basis. Such in-house evaluations are managed by staff members, including project management, technical specialists and backstoppers, but are conducted mainly by independent officials who have not been involved in the design, management or backstopping of the project.
The purpose of the MTE is to gather data that will enable the programme manager and staff to strengthen the programme; make modifications as needed; monitor progress towards programme goals; and judge the success of the programme in achieving its short-term, intermediate and long-term outcomes.
As this is an MOHW-initiated evaluation, it will include a goals-based evaluation technique. Goals-based evaluation measures the extent to which a programme reaches clear and specific targets and objectives. These targets are drawn from the programme documentation or rhetoric, which is usually, though not always, developed from the concept stage of the project.
As with many health interventions, the exigency of the programme resulted in the project being implemented without strict monitoring and evaluation guidelines in place. It therefore proceeded without any formative evaluations, which would otherwise have been used in an iterative process to make improvements before full implementation.
The approach will therefore utilise the statistical analyses best suited to instances where the project population was not randomly selected and where the goals of the project were not informed by ex-ante research analysis. As a result, the control and treatment groups are not definitively demarcated or readily identifiable; in such situations the literature supports quasi-experimental methods such as matching[1] and difference-in-differences analysis.
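As an illustration only, the sketch below shows one way a difference-in-differences estimate could be specified for an outcome such as length of stay. The dataset, column names (length_of_stay, treated, post, facility_id) and file name are hypothetical assumptions introduced for the example and do not come from the project documentation.

```python
# Hypothetical difference-in-differences sketch for a length-of-stay outcome.
# Column names (length_of_stay, treated, post, facility_id) and the file name
# are illustrative assumptions, not fields from any MOHW dataset.
import pandas as pd
import statsmodels.formula.api as smf

# One row per admission, with the outcome and the group/period indicators:
#   treated = 1 if the facility/patient group is covered by the outsourcing arrangement
#   post    = 1 if the admission occurred after programme initiation (September 2019)
df = pd.read_csv("admissions.csv")  # hypothetical extract from patient dockets/records

# The coefficient on treated:post is the difference-in-differences estimate,
# with standard errors clustered at the facility level.
model = smf.ols("length_of_stay ~ treated + post + treated:post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["facility_id"]}
)
print(model.summary())
```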
4. Evaluation Objectives
This is an outcome evaluation, concerned with determining whether, and by how much, programme activities or services achieved their intended outcomes. Whereas outcome monitoring is helpful and necessary for knowing whether outcomes were attained, outcome evaluation attempts to attribute observed change to the intervention tested, describe the extent or scope of programme outcomes, and indicate what might have happened in the absence of the programme. It is methodologically rigorous and requires that sound inferential techniques be employed.
This midterm evaluation aims to ensure that diverse viewpoints are taken into account and that results are as complete and unbiased as possible. It will seek to ensure that measurements are replicable and that the methods employed are as rigorous as circumstances allow. This consultancy will support the MOHW to assess:
- The optimal model for radiology services: public provision, private provision, or a mix thereof.
- Through utilising the appropriate inferential tools, estimate:
- i) the hospitalizations averted by the programme; and
- ii) whether length of stay changed following programme initiation.
5. Scope of Work
- Receive the remainder of the outstanding data requests for completion of the in-house analysis.
- Select probability or non-probability sampling for the quantitative and qualitative assessments respectively, and outline the recruitment strategy for each. The sampling technique must be based on the following (a sample-size sketch is provided after this list):
- the response distribution, assumed at 50% (the most conservative value)
- standard deviation
- margin of error
- With the assistance of the MOHW, the Consultant will recruit the respondents identified through the inclusion criteria adopted by the evaluation team.
- Collect data using a mix of quantitative and qualitative methods. Several data collection strategies are required owing to the multiplicity of indicators to be analysed: key informant interviews; focus group discussions; cost and value allocation (costs to be assessed include transportation, diagnostic services, inpatient days, and project savings or dissavings from the framework agreement with the private sector); and the expansion of radiology services in the public sector. Data collection will include a combination of primary and secondary sources. Create a data extraction tool to collect data from patient dockets/medical records.
- Develop and implement a focus group discussion (FGD) protocol. The focus groups will include participants representing key target professionals to be articulated by the Consultant. The team will also be responsible for implementing the focus group sessions, recording the meetings, organising the transcription, ensuring data quality and conducting an initial analysis.
- Provide a data analysis plan, which is critical in reducing potential biases and in relying on empirical findings to identify thematic nodes.
- Identify a balanced list of key informants to be interviewed by the team during the evaluation.
- Analyse qualitative data using content analysis, in which patterns and connections are identified thematically through constant comparison, using appropriate qualitative analysis software.
- An evaluation matrix developed by the team will articulate sub-evaluation questions aligned to the priority areas and objectives outlined in the project documentation. The matrix will also identify the existing indicators relevant to each area, existing data sources and key informants for each objective.
- Design and execute quantitative data analysis plan.
- Recommend Performance and Evaluation Indicators for the project.
- The Consultant is responsible for the systematic collection of information about the activities, characteristics and outcomes of the programme in order to make judgements about the programme, improve programme effectiveness, and/or inform decisions about future programme development.
- Where necessary, complete the in-house evaluation exercise.
- Conduct a value-for-money analysis for the project to facilitate assessment of the project's viability from the GOJ's perspective, bearing in mind market and non-market values (a cost-benefit sketch is provided after this list).
- Outline the strategic choice between public provision and private provision, which will include:
- Cost/Benefit Analysis
- Justification
- Anticipated Outcomes
- Assumptions and Critical Success Factors
- Cost estimates
- Risk Analysis
- Additionally, a cost-benefit analysis of the project is to be conducted through this business case, allowing quantitative and qualitative benefits to be documented and analysed.
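As a point of reference for the sampling bullet above, the sketch below applies Cochran's sample-size formula with a finite population correction. The confidence level, margin of error and sampling-frame size used are illustrative assumptions only; the Consultant would substitute the values agreed in the inception report.

```python
# Hypothetical sample-size sketch (Cochran's formula with finite population
# correction) using the inputs listed in the Scope of Work: a 50% response
# distribution and a chosen margin of error. The 95% confidence level and the
# population figure below are illustrative assumptions.
import math

def sample_size(population: int, margin_of_error: float = 0.05,
                z: float = 1.96, p: float = 0.5) -> int:
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                    # finite population correction
    return math.ceil(n)

# Example: if roughly 18,000 completed tests defined the sampling frame,
# a 5% margin of error at 95% confidence would imply a sample of about 377.
print(sample_size(population=18_000))
```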
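Similarly, for the value-for-money and cost-benefit items above, the sketch below shows a minimal net-present-value comparison of public versus private provision. All cash flows and the discount rate are placeholder assumptions; the real values would come from the costs identified in the Scope of Work (transportation, diagnostic services, inpatient days, savings or dissavings under the framework agreement, and so on).

```python
# Hypothetical cost-benefit sketch comparing public and private provision.
# Every figure below is a placeholder assumption, not an MOHW estimate.
def npv(cash_flows, discount_rate):
    """Net present value of yearly net cash flows (year 0 first)."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

# Illustrative yearly net benefits (benefits minus costs), e.g. in J$ millions:
public_option = [-500, 120, 150, 150, 150]   # up-front capital, then operating savings
private_option = [-50, 80, 80, 80, 80]       # framework-agreement fees offset by benefits

rate = 0.07  # assumed social discount rate
print("Public provision NPV:", round(npv(public_option, rate), 1))
print("Private provision NPV:", round(npv(private_option, rate), 1))
```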
6. Commencement Date & Period of Implementation
The commencement date will be the day of the signing of the contract and implementation will be over at least a three (3) month period.
7. Characteristics of The Consultancy
Type of consultancy: Firm
Place of Work: Face-to-face and Virtual
Duration: 3 months commencing on the date of the contract signing
- Qualifications: At least two team members with qualifications in a quantitative social science field (e.g., Psychology, Economics, Monitoring and Evaluation Science, Sociology, Applied Statistics, and other related areas) at the postgraduate degree level.
- Experience: The firm should possess at least 10 years’ experience in the fields of:
- Health systems, with demonstrated knowledge and understanding of Jamaica’s health sector.
- Conducting end-to-end scientific research in an applied setting.
- Experience with and demonstrable expertise on research design methodologies for experiments, quasi-experiments, and non-experiments;
- Experience with qualitative research methodologies, including focus groups, elite interviews, etc.
- Participatory approaches in conducting assessments and facilitating programme/project evaluation exercises;
- Excellent English writing skills.
- Competence:
- The ideal candidate will have effective communication skills (written, interpersonal, public), deep expertise across all facets of research fundamentals (research design, measurement, statistics), and the ability to adapt to ever-evolving virtual and face-to-face work environments;
- Preparation of project evaluation planning documents;
- Capacity to interface with professionals in a multi-disciplinary setting
- Familiarity with the Theory of Change approach and building organizational balanced scorecards;
8. Deliverables and Payment Schedule
Based on the execution of the Scope of Work, the Consultant will be required to provide the following deliverables:
- Inception report: containing the evaluation framework, detailed evaluation methodology, project/programme sample, work plan and logistical arrangements.
- Data collection field report outlining data collection insights, challenges or re-strategising.
- Workshop for Presentation and Validation of Findings and Recommendations: to present findings and tentative recommendations to the senior directorate of the Ministry.
- (Draft and Final) Evaluation report (including annexes) to be structured as follows:
− Executive Summary
− Summary evaluation report highlighting the cross-cutting key findings, lessons learned and recommendations, including:
- Description of the three projects under evaluation
- Evaluation purpose
- Evaluation methodology
- Main findings (presented in terms of achievements and challenges)
- Lessons Learned
- Conclusions and recommendations
| Deliverables and Schedule | Date | Payment Tranche |
|---|---|---|
| Inception Report* (including the refined Theory of Change or intervention logic, the evaluation methodology and detailed work plan) | Three weeks from signing of contract | 15% |
| Data collection field report | As per work plan | 15% |
| Presentation of main findings and tentative recommendations to the senior managers | As per work plan | 20% |
| Draft mid-term evaluation report | As per work plan | 30% |
| Final mid-term evaluation report | As per work plan | 20% |
9. Submission
In submitting a proposal, the Consultant must: (i) carefully review and comment on the Terms of Reference, recommending potential refinements where necessary, including making such recommendations as deemed appropriate to enhance the quality of the assignment and outputs/deliverables; (ii) review all other documentation that may be relevant to this assignment; and (iii) submit their professional status and biographic data, their qualifications and experience to carry out the assignment, and the corresponding financial proposal.
Verification of these qualifications will be based on the provided curriculum vitae. Moreover, references, web links or electronic copies of two or three examples of recently completed evaluation reports shall be provided together with the technical proposal. Candidates are also encouraged to submit other references, such as research papers or articles, that demonstrate their familiarity with the subject under review. Attention will be paid to establishing an evaluation team that is balanced as regards gender and geography (as applicable).
10. Reporting Relationships
The Ministry will identify a counterpart team to be led jointly by the Chief Medical Officer and the Director of the Policy, Planning and Development Division. This team will work hand in hand with the consulting team to ensure achievement of the consultancy goals and will provide technical support and guidance to the consultants on a daily basis. The team will assist with identifying and introducing officers, obtaining offline GOJ/MOHW documents, and providing access to MOHW/RHA databases and information sets.
11. Selection Framework
11.1 Evaluation Criteria
The criteria and weighting system to be used in evaluating proposals are as follows:
- Technical Approach and Methodology (45%) – Understanding of assignment and expected outputs; Technical soundness of framework to assess cause and effect; Appropriateness of techniques for the data collection and evaluation
- Work Plan (25%) – Specification and sequencing of data collection activities; timeline for completion of tasks; arrangements for coordination of activities and administrative support services.
- Qualification and Experience (30%) – the extent to which the qualifications, skills and experience of the corporate team match the competency requirements for the study.
11.2 Pass Mark
- Consultant(s) proposal must obtain a minimum mark of 70% of total marks
- Proposal(s) which do not obtain the pass mark will not be considered for further evaluation and their financials will be returned unopened
Annex 1: Technical Evaluation Rating Guide
| AREA | Maximum | Guide |
|---|---|---|
| Qualification and Experience | 30 | |
| At least two team members with qualification in a quantitative social science field (e.g., Psychology, Economics, Monitoring and Evaluation Science, Sociology, Applied Statistics, and other related areas) at the postgraduate degree level | 10 | Yes = 10; undergraduate degree = 5; M&E certification = 2; none or unrelated = 0 |
| At least 10 years’ experience working in monitoring and evaluation, with at least three years in programme evaluation | 10 | Less than 2 years = 0; thereafter 1 point for each year of experience |
| Must have successfully completed at least 2 programme evaluation studies in the health sector | 6 | Yes = 6; one health sector evaluation plus other non-health experience = 4; no health experience = 1 |
| Demonstrated experience designing quantitative and qualitative research studies | 3 | Very experienced = 3; fairly experienced = 2; very little experience = 0 |
| Good working knowledge of the public health sector | 1 | Yes = 1; No = 0 |
| Technical approach and methodology | 45 | |
| The report describes the methodology applied to the evaluation, clearly explaining how it was specifically designed to address the evaluation criteria, yield answers to the evaluation questions and achieve the evaluation purposes | 10 | Yes = 10; good attempt = 7; satisfactory = 5; poor = 0 |
| The report describes the data collection methods and analysis, the rationale for selecting them, and their potential limitations | 10 | Yes = 10; good attempt = 7; satisfactory = 5; poor = 0 |
| The report describes the data sources, the rationale for their selection, and their limitations, including discussion of a mix of data sources and any mechanism to overcome data limits | 10 | Yes = 10; good = 7; satisfactory = 5; poor = 0 |
| Expounds on how the evaluation will be designed to provide an appraisal of cause and effect and any supporting correlation or descriptives in the analysis | 10 | Yes = 10; good attempt = 7; satisfactory = 5; poor = 0 |
| Outlines possible approaches when the M&E system for a programme/project is inadequately structured, as in the current case | 5 | Yes = 5; good attempt = 3; satisfactory = 2.5; poor = 0 |
| Work Plan | 25 | |
| Delineates how members of the team will be involved and clarifies their respective roles in the assignment | 5 | Excellent = 5; satisfactory = 3; poor = 1 |
| Sufficient time allocated to conducting data collection, evaluation and preparing the various reports | 10 | Excellent = 10; good = 7; satisfactory = 5; poor = 0 |
| Specification and sequencing of the activities displays an appreciation of the required time and effort of the team members | 5 | Excellent = 5; satisfactory = 3; poor = 1 |
| Arrangements for coordination of activities and administrative support services | 5 | Excellent = 5; satisfactory = 3; poor = 1 |
- [1] Typically used when randomization, regression discontinuity (RD) or other quasi-experimental options are not possible (i.e. no baseline data).