Important disclaimer:
This guidance does not endorse any of the listed evaluation tools. In the case study section, users report their subjective experience, and some tools were self-evaluated by their developers or owners. These reviews do not constitute an assessment of the quality of the evaluation tools.
We suggest that you weight the themes given below according to your priorities and evaluation needs. To do so, use the sliders to attribute points out of 100 to each theme that is important to you. The weights must add up to 100, and themes you do not wish to consider may be left at 0.
The weight you attribute to a theme is multiplied by the share of a tool's questions that cover that theme, and the tool's degree of suitability is the sum of these products across themes. The relative suitability therefore indicates how strongly a tool focuses on your themes of interest in comparison to the other tools. Please note that suitability is based strictly on the number of relevant questions out of the total number of questions in each tool (i.e. a quantitative measure); tools with a lower score may still include questions relevant to your needs. In other words, this selection tool reports the quantitative coverage of your priorities among the available tools.
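The scoring described above can be sketched as follows. All weights and question shares in this sketch are invented for illustration; they are not taken from the actual tools.

```python
# Suitability of a tool = sum over themes of
# (user weight for the theme) x (share of the tool's questions covering it).
# All figures below are hypothetical illustrations, not real tool profiles.

def suitability(weights, shares):
    """weights: {theme: points out of 100, summing to 100};
    shares: {theme: fraction of a tool's questions attributed to the theme}."""
    assert sum(weights.values()) == 100, "theme weights must add up to 100"
    return sum(w * shares.get(theme, 0.0) for theme, w in weights.items())

# A user who prioritises only two of the seven themes:
weights = {"Collaboration": 60, "Resources": 40}

# Hypothetical question-share profiles of two tools:
profiles = {
    "Tool A": {"Collaboration": 0.30, "Resources": 0.10},
    "Tool B": {"Collaboration": 0.05, "Resources": 0.50},
}

# Tool A: 60*0.30 + 40*0.10 = 22; Tool B: 60*0.05 + 40*0.50 = 23,
# so Tool B covers this user's priorities slightly better.
ranked = sorted(profiles, key=lambda t: suitability(weights, profiles[t]),
                reverse=True)
```

A tool that devotes many questions to a theme you weighted at 0 gains nothing from them, which is why unconsidered themes can simply be left at 0.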
Selection tool
Theme | Weight (0 to 100) |
---|---|
Technical operations of surveillance | |
Resources | |
Output and use of information | |
Integration | |
Collaboration | |
Progress and adaptivity | |
Surveillance items specific to AMR/AMU | |
How was this selection tool developed?
For each tool, the questions and evaluation items constituting it were interpreted and attributed to different evaluation themes. These themes were built progressively throughout the analysis, and their definitions evolved iteratively to encompass all tools in a coherent, meaningful and comprehensive set. Some questions could be attributed to more than one theme. The themes are described in detail below. Finally, the share of questions in an evaluation tool devoted to each theme was used to profile the tools, and supports the weighting of the tools after users enter their preferences for the seven themes.
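The profiling step can be sketched as follows, under one plausible reading of "share": the number of questions touching a theme divided by the tool's total number of questions (so shares can sum to more than 1 when questions belong to several themes). The questions and attributions below are invented examples.

```python
from collections import Counter

# The seven evaluation themes used by the selection tool.
THEMES = [
    "Technical operations of surveillance",
    "Resources",
    "Output and use of information",
    "Integration",
    "Collaboration",
    "Progress and adaptivity",
    "Surveillance items specific to AMR/AMU",
]

def profile(question_themes):
    """question_themes: one set of themes per evaluation question
    (a question may be attributed to more than one theme).
    Returns the share of questions touching each theme."""
    n = len(question_themes)
    counts = Counter(t for themes in question_themes for t in themes)
    return {t: counts.get(t, 0) / n for t in THEMES}

# A hypothetical 4-question tool; the second question covers two themes:
shares = profile([
    {"Resources"},
    {"Collaboration", "Integration"},
    {"Surveillance items specific to AMR/AMU"},
    {"Resources"},
])
# shares["Resources"] is 0.5; shares["Integration"] is 0.25
```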
Please note that not all tools were included in this analysis, because for some the evaluation questions or items were not available at the time of the analysis. The 12 included tools are listed here:
- IHR: International Health Regulations core capacity monitoring framework
- ISSEP: Integrated surveillance system evaluation project
- JEE: Joint External Evaluation tool (2nd edition)
- ECoSur: Evaluation of collaboration for surveillance
- NEOH: Network for the Evaluation of One Health Framework
- OASIS: Outil d’Analyse des Systèmes de Surveillance
- PMP-AMR: The FAO Progressive Management Pathway for AMR
- PVS: OIE Tool for the Evaluation of Performance of Veterinary Services
- SurF: Surveillance Evaluation Framework
- OH-APP: One Health Assessment for Planning and Performance
- SERVAL: SuRveillance EVALuation framework
- ATLASS: The FAO Assessment Tool for Laboratory and AMR Surveillance Systems
Evaluation themes
Technical operations of surveillance
Description
Includes questions on technical features of surveillance operations (surveillance design, laboratory capacities, management of specimens, tests applied, data management and analysis…), their quality management (SOP, traceability…), and the assessment of their performance (sensitivity, specificity,…).
Examples of evaluation questions or items
How representative of the target population is the surveillance system? (Extracted from SERVAL)
The sensitivity of the case or threat definition. (Extracted from OASIS)
Have the sensitivity and specificity of the tests used been assessed (where relevant)? (Extracted from SurF)
Are mechanisms or procedures in place to ensure data quality to allow sharing, e.g. data completeness, error-checking and correction of errors, clear and accurate descriptions of variables and of aggregations/calculations, documentation available? (Extracted from NEOH)
Describe how data is validated. (Extracted from JEE)
Resources
Description
Includes questions quantitatively addressing human, physical and financial resources. Questions on the training level of human resources are also considered in this category.
Examples of evaluation questions or items
The appropriate level of staffing of the veterinary services to allow for veterinary paraprofessional (according to the OIE definition) functions to be undertaken efficiently and effectively (stage 1 to 4) (Extracted from PMP-AMR)
Are resources for rapid response during public health emergencies of national or international concern accessible? (Extracted from IHR)
Are there specific incentives for any workforce specialties (may include physicians, nurses, veterinarians, biostatisticians, laboratory assistants and specialists, or animal health professionals)? (Extracted from JEE)
Availability of all appropriate resources to support the collaborative mechanism(s) for coordinating the multisectoral surveillance system. (Extracted from ECoSur)
Adequacy of the central level’s material and financial resources (Extracted from OASIS)
Provision of adequate initial training and an ongoing programme of training for those implementing the surveillance system, particularly those collecting the data (Extracted from SurF)
Output and use of information
Description
Includes questions on surveillance outputs provided to public and private stakeholders, how the outputs are used to inform decision-making, and the impacts/benefits from this use of outputs (expected, perceived or measured).
Examples of evaluation questions or items
Consider how the benefits are distributed among stakeholders, including producers, consumers, the livestock industry or society. (Extracted from SurF)
Have infection control plans been implemented nationwide? (Extracted from IHR)
The capability of the VS to keep non-government stakeholders aware and informed, in a transparent, effective and timely manner, of VS activities and programmes, and of developments in animal health, animal welfare and veterinary public health. (Extracted from PVS)
Quality of the communication (both in terms of contents and means) of the information produced by the multisectoral surveillance system to surveillance actors and end-users. (Extracted from ECoSur)
How do OH outputs (OH team, information, and network) impact on decision-making? (Extracted from ISSEP)
What is the impact of the surveillance system? (Extracted from SERVAL)
Integration
Description
Includes questions considering three levels of integration:
- integration of data systems (within and between organizations and at national, regional or international level; interoperation of data systems; adherence to international testing and data standards)
- integration between sectors and disciplines (knowledge integration, shared decision-making and planning, formulation of common goals)
- integration in the national and international context motivating the need for surveillance (link to decision-making, shared decision-making and planning between countries).
Examples of evaluation questions or items
How is the interaction between people organised to foster collaboration across the initiative? (Extracted from NEOH)
What are the incentives (e.g. compensation payments) or barriers (e.g. consequences of reporting) for participation? (Extracted from SurF)
Are there official agreements with labs outside of the country for specialized testing, not available in-country? (Extracted from JEE)
The formalisation of roles and responsibilities of surveillance actors involved in collaborative modalities. (Extracted from ECoSur)
Does the multisectoral coordination mechanism have a current One Health Strategy developed in a participatory manner with its stakeholders? (Extracted from OH-APP)
Collaboration
Description
Includes questions on the framework of collaboration (organisation of roles and responsibilities) and the objective of collaboration (exchange of data, information and knowledge, sharing of capacities). This category also covers questions about the inclusive participation of stakeholders (e.g. considering gender).
Progress and adaptivity
Description
Includes questions on any design elements that allow the surveillance system to adapt and evolve. This may include tools, plans and agreements to evolve (e.g. continuous learning programmes, external evaluation), as well as features of management and governance allowing for regular evaluation and adaptation of operations (e.g. frequency of meetings, regularity of progress reports).
Examples of evaluation questions or items
The capability of the VS to maintain, update and improve the knowledge, attitudes and skills of their personnel, through an ongoing staff training and development programme assessed on a regular basis for relevance and targeted skills development. (Extracted from PVS)
Have multisectoral and multidisciplinary coordination and communication mechanisms been tested through exercises or through the occurrence of an actual event? (Extracted from IHR)
How flexible is the project design and timeline to respond to internal or external changes in the long term? (Extracted from NEOH)
Implementation of supervision by the intermediary level. (Extracted from OASIS)
Existence and relevance of specific performance indicators of collaboration routinely used. (Extracted from ECoSur)
Is periodic external evaluation used to assess the system outputs in relation to its objectives? (Extracted from SurF)
Surveillance items specific to AMR/AMU
Description
Includes questions that specifically address the case of AMR (occurrence, prevention, or response) or AMU (recording and management).
Examples of evaluation questions or items
Has responsibility been assigned for surveillance of antimicrobial resistance within the country? (Extracted from IHR)
Which structure is responsible for AMR data collection, analysis and interpretation? (Extracted from ATLASS)
Are data available on the magnitude and trends of antimicrobial resistance? (Extracted from IHR)
How does OH integration contribute to detect trends and correlations in AMR in animals and humans? (Extracted from ISSEP)
How many farms with livestock are in the country? Of these, how many are (will be) sentinel sites for surveillance of infections caused by AMR pathogens in livestock? (Extracted from JEE)
Data collection and reporting on the total quantity of AMs sold for/used in animals, with a possible breakdown by type of use (therapeutic, medicated feed, growth promotion). Details on animal species, age groups and method of administration are collected where possible. (Extracted from PMP-AMR)