Developing the NAF Platform's Commitment Registration Form
The SMARTness (Specific, Measurable, Achievable, Relevant and Timely) of nutrition commitments depends on the quality and comprehensiveness of the data provided by stakeholders (i.e., commitment-makers). To facilitate the formulation and registration of SMART nutrition commitments within the Nutrition Accountability Framework (NAF), we developed a Commitment Registration Form – effectively a questionnaire with a user sign-up component – with embedded guidance and detailed examples. The questions in the form were carefully designed to capture the SMART ingredients described in The SMARTness of nutrition commitments. We ran a pilot programme with participants selected to represent the key stakeholder groups expected to make commitments in the Nutrition Year of Action.
The primary objectives of the pilot were to, on a small scale, test and refine the clarity, quality, comprehensiveness, efficiency and, critically, the feasibility of the registration form before implementing it on a large scale and converting it to a web-based interface registration form. The pilot also provided us with key insights into the database requirements, as well as the resources and skills required to process and analyse the registered commitments. The NAF Platform's Commitment Registration Form is the outcome of the development process and is expected to be further refined as commitments continue to be registered and processed.
This document provides an overview of the development process of the registration form, with a focus on the pilot testing of the form, and details the included questions and the information they capture (see A guide to the NAF Platform's Commitment Registration Form).
The development of the registration form included the following three phases.
Phase 1: Analysis to provide an operational definition and description of the SMART concept, and of how best to capture it using clearly articulated ingredients (described in The SMARTness of nutrition commitments). This phase produced a precise operational definition with construct validity that accurately reflected the concept's theoretical base and facilitated the development of the tool.
Phase 2: Development of the quantitative tool (registration form) to facilitate the formulation and registration of SMART nutrition commitments. This phase included the following three steps.
- Design, develop and format the 44 questions included in the tool, based on literature reviews, expert input, informal discussions with potential end-users (i.e., commitment-makers) and processing of past Nutrition for Growth (N4G) commitments. The experts judged the appropriateness, meaningfulness, usefulness and effectiveness of each question to determine how accurately the tool captures the various aspects of the construct (SMARTness). An initial version of the tool, specific to N4G commitments, was made publicly available in December 2020, so commitments and feedback received through that form were also carefully assessed.
- Pilot test the content validity of the tool, clarity and relevance of the questions, format and instructions, as well as feasibility of the process (including participant acceptability, participant burden, ability to complete the questions and detail/accuracy of reported information).
- Refine the tool based on the results of the pilot testing and present the final included questions along with the information they capture (see A guide to the NAF Platform's Commitment Registration Form).
Phase 3: Development and dissemination of the web-based registration form using the pilot-tested tool and informed by user research, including embedded guidance and detailed examples.
Pilot testing of the tool
We used cognitive interviews to develop the registration form. Cognitive interviewing, or cognitive pretesting, is a key method for evaluating questions as part of the questionnaire development process, and is used in fields such as nutrition and health. During cognitive interviewing, respondents describe their perception and interpretation of a question before answering it, which gives the interviewers an opportunity to address potential ambiguities. Interviewers then observe respondents as they fill out the questionnaire and ask them about the clarity and comprehensiveness of each question.
Our use of cognitive interviewing ensured that the registration form was logically structured. It also helped to identify questions with ambiguous language that could lead to confusion or misinterpretation; evaluate the consistent comprehension of each question; identify sensitive content; and ensure no key information was missing.
Participant identification and selection
To select the sample for the pilot testing, we followed established guidelines. We used convenience sampling and snowballing techniques to identify participants who represented the key types of stakeholders likely to make commitments in the Nutrition Year of Action: country governments, donors, civil society and non-governmental organisations, private sector businesses, and multilateral organisations, including UN agencies.
For each stakeholder type, we aimed to include at least two representatives. In total, 12 participants were invited by email, and all agreed to participate in the pilot (Table 1). We included both native and non-native English speakers as participants. Participants were approached in June 2021 and the interviews took place in July 2021.
Table 1: Types of participating stakeholders

| Stakeholder type | Native vs. non-native English speakers |
| --- | --- |
| Country governments | English (1), non-English (2) |
| Donors | English (1), non-English (1) |
| Civil society organisations and non-governmental organisations | |
| Private sector food businesses | |
| Multilateral organisations | |
Pilot testing protocol
We developed a standardised protocol for the pilot testing of the registration form, informed by best practices. The pilot testing process comprised the stage before the interview, the 60-minute interview itself, and the assessment of feedback following the interviews to refine and finalise the form. In total, four Global Nutrition Report (GNR) investigators, all of whom were extensively trained to ensure inter-rater agreement, administered and processed the interviews. For each stage, we developed standardised materials to facilitate its administration and to maximise efficiency and consistency in our approach.
Before the interview
In addition to the user sign-up and commitment registration forms, we also developed a participant information sheet, which included information about the purpose of the pilot, the structure of the interview and the preparation required from the participants. One to two weeks before each scheduled interview, we shared the participant information sheet and the user sign-up form with the participants.
To maximise the efficiency of the interview and ensure it did not last for longer than an hour, participants were asked to complete the user sign-up form and send it back to us prior to their interview. They were also asked to send an example of a realistic commitment that could be used to complete the commitment registration form during the interview. This allowed us to check the appropriateness of the example for the pilot testing prior to the interview.
During the interview
The interviews took place online via web calls. Two investigators from the GNR team participated in each interview: one led the interview, while the other took notes, managed the time and intervened as needed. For the purposes of the pilot, we developed an online version of the commitment registration form using KoBoToolbox, a free, open-source data collection platform based on Open Data Kit.
The steps we followed for each 60-minute interview were as follows.
- Introductions – All attendees introduced themselves.
- Permission for recording – We asked for the participant’s consent before recording the interview, which we used as a backup to ensure all feedback was accurately and fully captured.
- Review of the user sign-up form – We reviewed the pre-filled user sign-up form to ensure completeness and clarity of questions and discussed with the participants whether they thought anything critical was missing.
- Completion of the commitment registration form – The stakeholders were asked to share their screen and complete the commitment registration form using the example shared with us prior to the interview. Participants were invited to ‘think aloud’ while filling in each question. The GNR team assessed the participants’ thought process through a direct and an indirect approach. To directly probe the participants’ understanding, the interviewers followed a three-step checklist for each question (Table 2) to assess the clarity of the wording, how the question was interpreted, and the overall structure of the question. To keep the interviews feeling natural and flowing smoothly, the steps were embedded in the format of a discussion. The GNR team also captured non-verbal cues that could indicate confusion or difficulty in understanding the questions (e.g., long silences).
- Collection of overall impressions – After completing the form, we collected feedback on overall impressions, such as on the order of the questions and their user-friendliness, and any other additional comments or suggestions from the stakeholders.
- Permission to use example commitment as an official example shared publicly – At the end of the interview, we asked stakeholders for their consent to use their example commitment as the basis for a fully completed and anonymised example that could be shared online with users of the NAF Platform.
Notes were taken throughout the interview using a standardised template that captured both question-specific and generic comments. For each question in the form, the GNR team recorded the participant’s responses and reactions across three dimensions: ‘Question well understood’, ‘Respondent looked confused/unsure’, and ‘Question not clear/respondent asked for clarification’. Detailed notes were added to expand on these dimensions as appropriate, and also captured feedback and suggestions from the participants. All personal identifiers collected were treated as confidential.
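The standardised note template described above can be thought of as a small record per question. The sketch below is purely illustrative: the class and field names are hypothetical, not the GNR's actual template, but they show how the three recorded dimensions could be structured and queried for questions needing revision.

```python
from dataclasses import dataclass, field

@dataclass
class QuestionNote:
    """Hypothetical sketch of the per-question note template (names are illustrative)."""
    question_id: str
    well_understood: bool = False       # 'Question well understood'
    looked_confused: bool = False       # 'Respondent looked confused/unsure'
    asked_clarification: bool = False   # 'Question not clear/respondent asked for clarification'
    detailed_notes: str = ""
    suggestions: list = field(default_factory=list)

def flag_for_review(notes):
    """Return IDs of questions where confusion or a clarification request was recorded."""
    return [n.question_id for n in notes
            if n.looked_confused or n.asked_clarification]

# Hypothetical notes from one interview
notes = [
    QuestionNote("Q1", well_understood=True),
    QuestionNote("Q2", looked_confused=True,
                 detailed_notes="Long silence before answering."),
    QuestionNote("Q3", asked_clarification=True,
                 suggestions=["Define the key term in the question text."]),
]
print(flag_for_review(notes))  # ['Q2', 'Q3']
```

Merging such records across interviews and grouping them by question, as described below, then becomes a straightforward aggregation step.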
Table 2: Examples of probing questions for the pilot testing of the registration form

| Step | What is assessed | Example of probing questions during the interview |
| --- | --- | --- |
| Step 1: Clear wording | The respondent has understood the key terms used in the question | What does the word “X” mean to you? How do you interpret the meaning of “X”? |
| Step 2: Correct interpretation | The respondent finds the meaning of the question as a whole clear | How did you interpret this question? In your own words, what do you think this question is trying to ask? |
| Step 3: Overall structure of the question | The respondent does not have to read the question multiple times and can answer it without doubting its meaning | Did you have doubts about the meaning of the question when you first read it? How easy or difficult was it to select an answer from the options provided? Why? Do you think any categories are missing from the options/levels provided or do they cover everything? |

Source: Adapted from Collins, 2015.
Following the interview
Following each interview, the two interviewers confirmed the notes taken and added to them, if needed, using the recorded interviews. To facilitate feedback assessment and highlight key issues that were identified, the notes from all interviews were merged and organised by question. All issues and suggestions were internally discussed, and the decisions made on how to address each of these were recorded. Differences in data extraction, interpretation and quality assessment between investigators were infrequent (concordance >95%) and resolved by consensus.
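The concordance figure above can be read as simple percent agreement between the two investigators. The sketch below is a minimal illustration with hypothetical codings (the pilot's actual coding data are not reproduced here):

```python
def percent_agreement(rater_a, rater_b):
    """Simple inter-rater concordance: share of items coded identically by two raters."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must code the same set of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100 * matches / len(rater_a)

# Hypothetical codings of 20 feedback items by two investigators:
# they disagree on one item, giving 95% concordance.
a = ["clear"] * 19 + ["unclear"]
b = ["clear"] * 20
print(f"{percent_agreement(a, b):.0f}%")  # 95%
```

Items falling below an agreed threshold would then be resolved by consensus, as described above.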
Data processing and revisions
The analysis of the data identified emerging themes and trends (e.g., whether certain questions or particular words were unclear for many of the respondents) across interviews. Insightful comments and suggestions were also reported, even if mentioned by only one or two respondents. The suggestions and comments of the respondents were used to revise, improve and finalise the registration tool.
Acceptability for individual questions, or ease of use, was judged by the number of respondents who understood the key terms used in the question. Overall, on average 89% of the participants understood the questions in the tool very well. The validity of a tool refers to the extent to which it provides data relative to commonly accepted meanings of the concept (i.e., whether it actually measures what it claims to measure). We found no other tools that measure SMARTness, making it impossible to apply the "gold standard" of validity testing, which is to compare the newly developed tool with one that has an established record of reliability and validity.
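An average acceptability figure of this kind can be computed by averaging, across questions, the share of participants who understood each question well. The sketch below uses hypothetical counts (not the pilot's data) to illustrate the calculation:

```python
def acceptability(understood_counts, n_participants):
    """Average share of participants who understood each question well, in percent."""
    rates = [c / n_participants for c in understood_counts]
    return 100 * sum(rates) / len(rates)

# Hypothetical counts: participants (out of 12) who understood
# each of five example questions very well.
counts = [12, 11, 10, 11, 12]
print(f"{acceptability(counts, 12):.0f}%")  # 93%
```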
Overall, we compiled 114 comments and suggestions from participants, of which 75 were addressed in the revised form. Over 75% of the questions required clarification for at least one participant. The majority of comments concerned the wording of the questions. Relevance and applicability issues were raised for only three questions, and only by stakeholders planning to make financial commitments. The revisions of the tool covered five main areas, outlined below.
- Reworded/rephrased questions. We reworded about half of the questions to improve clarity, and added explanatory text, with links to external and internal resources and/or examples, to clarify the information intended to be collected.
- Reduced required information. We changed the status of some questions from compulsory to optional. In the user sign-up part, two questions were dropped as not essential; one question was added for the organisation's website and two more for the position and telephone number of the contact person. In the commitment registration part, four questions were dropped as not essential; one question was added for the title of the commitment and another to assess alignment with the global nutrition targets.
- Improved options for answers. We expanded the available responses in a few closed questions to match the diversity of expected commitments.
- Improved clarity and flow. We split some questions with multiple fields (e.g., costs of the commitment) into separate questions to improve the flow of the questionnaire.
- Re-ordered questions. We slightly restructured the tool by changing the order of some questions to improve the flow.
Future versions of the tool will incorporate feedback that could not be addressed in the first iteration due to time and resource constraints (e.g., translation of the tool into other languages, advanced functionalities of the platform). In the end, from the initial 44 questions tested (13 in the user registration part and 33 in the commitment registration part), the revised tool included 50 questions (14 and 36, respectively). The higher number of questions results from splitting some questions with multiple fields into separate questions to improve the clarity and flow of the questionnaire (see A guide to the NAF Platform's Commitment Registration Form for the questions included).
First public iteration of the NAF Platform's Commitment Registration Form
Findings from user research and user experience design informed the design of the web-based registration form. User interviews provided insights into the behaviours and needs of potential users. Participants of the pilot testing of the form also provided feedback on the user experience of filling out the form (e.g. format and visualisation of specific questions). They also shared how commitment-makers collaborate with colleagues and external organisations when deciding how to provide details for their commitment. The NAF Platform's Commitment Registration Form was then created using a third-party survey tool and tested to resolve any final issues with rendering and flow.
The first iteration of the Commitment Registration Form was launched in September 2021, allowing all stakeholders to register their commitments for nutrition actions. The registration form will ultimately enable commitment-makers to submit, track and report progress made towards their nutrition commitments via the NAF.
The platform will be subject to further development as commitments are registered and processed and as resources are secured. One key expected development is interactive visualisation of registered commitments once they have been processed. The platform will also include a progress assessment survey (expected to launch in 2022), which commitment-makers can digitally complete and submit for each of their registered commitments. Progress assessment will primarily be based on tracking the progress of the commitment goal indicator(s) as specified by the stakeholders (see A guide to the NAF Platform's Commitment Registration Form for details on indicators).
The NAF Platform's Commitment Registration Form is the first of its kind to be released. The goal of ending poor diets and malnutrition in all its forms must be supported by a comprehensive framework for accountability through which all action on nutrition is recorded according to the same principles, methods and approach. Only by doing this will we be able to determine what is making an impact and encourage better commitments and investments that translate into scalable actions. Resources must be consistently shared, tracked and consolidated into usable insights that can drive better action and decision-making across sectors. Using the platform, the GNR will track and report on progress and collective impact every year to determine whether commitments are translating into action and pinpoint where more action is needed.
Last updated: 28 September 2021.
Willis GB, 2015. Analysis of the Cognitive Interview in Questionnaire Design. Oxford University Press.
Smith AF, 1993. Cognitive psychological issues of relevance to the validity of dietary reports. Eur J Clin Nutr, 47 Suppl 2:S6-18. http://europepmc.org/abstract/MED/8262022; Jobe JB, Mingay DJ, 1989. Cognitive Research Improves Questionnaires. Vol 79; Subar AF, Thompson FE, Smith AF, et al., 1995. Improving Food Frequency Questionnaires: A Qualitative Approach Using Cognitive Interviewing. J Am Diet Assoc, 95(7):781-788. doi:10.1016/S0002-8223(95)00217-0; Carbone ET, Campbell MK, Honess-Morreale L, 2002. Use of cognitive interview techniques in the development of nutrition surveys and interactive nutrition messages for low-income populations. J Am Diet Assoc, 102(5):690-696. doi:10.1016/S0002-8223(02)90156-2; Krall JS, Lohse B, 2010. Cognitive testing with female nutrition and education assistance program participants informs validity of the Satter Eating Competence Inventory. J Nutr Educ Behav, 42(4):277-283. doi:10.1016/j.jneb.2009.08.003
Willis GB, 2015. Analysis of the Cognitive Interview in Questionnaire Design. Oxford University Press; Ryan K, Gannon-Slater N, Culbertson MJ, 2012. Improving Survey Methods With Cognitive Interviews in Small- and Medium-Scale Evaluations. Am J Eval, 33(3):414-430. doi:10.1177/1098214012441499
Willis GB, 2015. Analysis of the Cognitive Interview in Questionnaire Design. Oxford University Press; Blair J, Conrad FG, 2011. Sample size for cognitive interview pretesting. Public Opin Q, 75(4):636-658. doi:10.1093/poq/nfr035
Willis GB, 2015. Analysis of the Cognitive Interview in Questionnaire Design. Oxford University Press; Carbone ET, Campbell MK, Honess-Morreale L, 2002. Use of cognitive interview techniques in the development of nutrition surveys and interactive nutrition messages for low-income populations. J Am Diet Assoc, 102(5):690-696. doi:10.1016/S0002-8223(02)90156-2; Goerman P, Quiroz R, Mcavinchey G, Reed L, Rodriguez S, 2013. Census American Community Survey Spanish CAPI/CATI Instrument Testing Phase I-Round 1 Final Report.
Collins D, 2015. Cognitive Interviewing Practice. SAGE Publications Ltd; doi:10.4135/9781473910102
Carbone ET, Campbell MK, Honess-Morreale L, 2002. Use of cognitive interview techniques in the development of nutrition surveys and interactive nutrition messages for low-income populations. J Am Diet Assoc, 102(5):690-696. doi:10.1016/S0002-8223(02)90156-2