FoodLogger_Usability_Test_Generic_Clearance

Generic Clearance for Survey Research Studies

OMB: 0536-0073

MEMORANDUM


March 18, 2021


To: Chris Marokov, Desk Officer

Office of Management and Budget


From: Jeffrey Gonzalez, Research Mathematical Statistician

United States Department of Agriculture, Economic Research Service


Via: Pheny Weidman, PRA Clearance Officer

United States Department of Agriculture, Economic Research Service


Subject: Request to conduct usability test of a smartphone application, FoodLogger, under generic clearance for survey research studies (OMB Control # 0536-0073)


The purpose of this memorandum is to obtain OMB clearance for up to three rounds of usability testing of a smartphone application designed to collect data on food purchases and acquisitions.


BACKGROUND AND PURPOSE


In 2012, the National Household Food Acquisition and Purchase Survey (FoodAPS1, OMB Control # 0536-0068) was conducted on behalf of the U.S. Department of Agriculture’s (USDA) Economic Research Service (ERS). It was the first nationally representative survey of American households to collect unique and comprehensive data about household food purchases and acquisitions, along with the factors that influence household food choices. FoodAPS1 fills a critical data gap and supports research that informs policymaking on key national priorities, including health and obesity, hunger, and nutrition assistance policy.


Though FoodAPS1 was successful, ERS hopes to take steps to improve the quality and efficiency of data collected in the subsequent fielding of FoodAPS2, in 2022 or 2023. ERS has done much to evaluate the procedures and protocols used in FoodAPS1 in order to suggest potential improvements for FoodAPS2; however, it was determined that a pilot study of the suggested changes would be necessary to evaluate their feasibility and impact.


ERS has contracted with Westat to develop a native mobile smartphone application, henceforth referred to as FoodLogger, as a data collection mode for collecting FoodAPS data. A pilot study to assess the viability of using FoodLogger in FoodAPS2 and to evaluate the overall quality and completeness of data collected via FoodLogger will be the focus of a Field Test conducted by Westat on behalf of ERS later this year. The findings from the Field Test will inform recommendations for viable improvements to the FoodLogger and will help solidify the overall data collection strategy and survey protocols for FoodAPS2.


To prepare for the Field Test, ERS has contracted with the Census Bureau to independently conduct a usability evaluation of the FoodLogger. The FoodLogger, as noted above, is a smartphone application that will serve as a mobile data collection mode for the forthcoming FoodAPS2 Field Test. The FoodLogger consists of three survey modules: (1) a Profile Questionnaire; (2) an Income “worksheet”; and, (3) a seven-day Food Log. The Profile Questionnaire collects basic demographic and other information from each respondent such as age, gender, and general health and work status. The Income “worksheet” collects detailed information on the amount and frequency of receipt of all income sources for each respondent. The seven-day Food Log is a diary designed to collect and capture detailed information on the food household members acquire, where those acquisitions occur, and the quantity and costs associated with each acquired food item. This study is limited only to the seven-day Food Log portion of the FoodLogger. The remaining components will be evaluated alongside the entirety of the data collection procedures in the forthcoming FoodAPS2 Field Test.


Additional features of the Food Log pertinent to this usability study include the following. The Food Log utilizes the smartphone's GPS location services to track the respondent's location throughout the seven-day reporting period and passively collects potential food stops/events. Respondents then have the option to confirm food stops/events and add any missed food stops/events. The Food Log also accesses the smartphone's built-in camera to allow the respondent to scan UPCs of acquired foods and beverages as well as to upload pictures of receipts from food acquisition events and of individual food items. Finally, the Food Log links to external databases containing detailed product information (e.g., package size, weight, and volume), nutrition information for food items, and geocodes for food places. The linkage to external databases, such as Nutritionix, Food Data Central, and the Google Places API, is intended to reduce respondent burden and improve data quality. For example, when a scanned UPC is successfully matched to an external database within the Food Log, the respondent does not have to report detailed information on food item characteristics, such as package size and weight. The linkage works in the background of the app as a seamless process occurring in real time.
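The background UPC-matching flow described above can be sketched as follows. This is a hypothetical illustration only: the external product databases are simulated with an in-memory dictionary, and the UPCs and product entries are invented for the example (the production app presumably issues real-time queries to services such as Nutritionix).

```python
# Hypothetical sketch of the Food Log's background UPC lookup.
# PRODUCT_DB stands in for an external product database; its keys
# and entries are invented for this example.
PRODUCT_DB = {
    "041196910184": {"name": "Vegetable soup", "package_size": "18.8 oz"},
    "028400090896": {"name": "Tortilla chips", "package_size": "13 oz"},
}

def lookup_upc(upc: str) -> dict:
    """Return pre-filled item details for a scanned UPC, or flag the
    item for manual entry when no match is found."""
    product = PRODUCT_DB.get(upc)
    if product is not None:
        # Match found: the respondent skips the detailed item questions.
        return {"upc": upc, "needs_manual_entry": False, **product}
    # No match: the respondent reports package size, weight, etc. by hand.
    return {"upc": upc, "needs_manual_entry": True}
```

A matched scan pre-fills the item record, while an unmatched scan simply falls back to the manual item questions, so a lookup failure degrades gracefully rather than blocking data entry.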


This usability test will be implemented in April and May 2021. The findings from the usability test will help us understand how users interact with and use the FoodLogger and help us improve the FoodLogger for the Field Test. The following section details the research objectives, study design, and analysis plan for the usability test.


RESEARCH PLANS FOR USABILITY TEST


Evaluation Objectives

Usability is defined as the extent to which a system, product, or service can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use (ISO, 2018). Effectiveness refers to the accuracy and completeness with which users achieve specified goals; efficiency refers to the resources used in relation to the results achieved; and satisfaction refers to the extent to which the user's physical, cognitive, and emotional responses resulting from use of a system, product, or service meet the user's needs and expectations (ISO, 2018).


The basic objective of the usability evaluation of the native mobile smartphone application, FoodLogger (see Appendix A for screenshots of the application and Appendix A-1 for the corresponding questionnaire instrument), is to ensure that respondents enter food acquisition data into the FoodLogger effectively, efficiently, and with satisfaction. The effectiveness of the FoodLogger will be measured by the success of data entry and the accuracy of entered data; efficiency will primarily be measured by the time taken to enter data; and satisfaction will be measured by respondent-reported satisfaction, which includes a user's perception of difficulty, the extent to which their expectations are met, and the user's emotional response to data entry. It is hypothesized that effective data entry will lead to less missing or erroneous data and fewer measurement errors, efficient data entry will reflect lower respondent burden, and satisfaction with the data entry experience will help sustain respondents' participation in the Second National Food Acquisition and Purchase Survey (FoodAPS2).


Evaluation Strategies

Usability testing will consist of the following strategies to complete the evaluation.


The evaluation will involve up to three rounds of usability testing, with the primary goal of identifying usability issues and discovering the causes of those issues. The first round will involve FoodLogger Version 1.0.4. The second round will be conducted on a revised version of FoodLogger, in which the usability issues discovered in the first round are expected to have been addressed. The aim of the second-round test is to verify usability improvements in the revised version and to identify any additional usability issues. It is hoped that the revised FoodLogger will have no major usability issues (i.e., issues that would introduce measurement and response errors). If major usability issues are identified in the second round of testing, the study team will assess the need for an additional round.


The basic methodology for FoodLogger usability testing is qualitative (e.g., observation and probing). In addition to qualitative data, we will also collect quantitative data on key performance measures (e.g., the time used for uploading a receipt) when possible. Such quantitative information will help the study team estimate respondent burden.


Using FoodLogger for FoodAPS-2 is a non-trivial operation requiring respondents to carry out three major activities:

  1. Installing FoodLogger on a smartphone;

  2. Completing training on how to use FoodLogger; and,

  3. Entering information about their food acquisitions and other data into FoodLogger over seven days.

Usability issues could arise during any of these three activities. Thus, the usability evaluation must cover all three activities. In addition, since data entry will take place over seven days on which a variety of food acquisition scenarios could occur, it is important to assess participants’ continued use of FoodLogger as well as the stability of FoodLogger’s functionality over the data collection period, and to investigate the causes of usability-related issues should they arise. Thus, we will conduct a laboratory-based usability testing session and an end-to-end seven-day field usability observation to comprehensively evaluate participants’ data-entry performance.


In the laboratory session, participants will perform a list of critical tasks in specifically designed food events that simulate real-life scenarios. A critical task is one whose failure to complete increases the likelihood of measurement error. For example, failure to scan a grocery item's UPC may result in inaccurate item-level information being entered into the data capture system. In the seven-day field usability observation, participants will enter data on the actual food acquisitions they make in their daily lives. Combining lab-based usability testing with field usability observation enables assessment of FoodLogger's usability for critical tasks as well as discovery of usability issues that may only occur outside a laboratory setting.


The lab-based usability testing will be conducted on the fourth day of the seven-day data collection period for the following reasons.

  1. Due to the complexity of food acquisition data entry, a learning curve might exist for operating FoodLogger. It is hypothesized that the learning curve might reach a plateau after two to three days of use.

  2. If the lab testing session is conducted right after training, participants might encounter difficulties due to a lack of practice rather than actual usability issues. In the worst case, a testing session might have to be aborted due to inadequate skills.

  3. Holding a lab testing session on Day Four could provide us with a better chance of discovering major and persistent usability issues with less noise, and consequently improve the effectiveness and efficiency of the usability testing.

Given the current COVID-19 pandemic, in-person usability testing poses a health risk to both test administrators (TAs) and participants. The lab-based usability testing will thus be carried out virtually. The details of virtual lab testing are described in the following section.


Proper training for participants is crucial to their successful use of FoodLogger as well as to the validity of usability testing. Ideally, participants would receive the same training as that planned for the Field Test. However, such training is not available for the present study because the Field Test training will only be fully developed after the final release of the FoodLogger. As a workaround, Westat, the contractor developing FoodLogger, has provided tutorials to the study team, who, in turn, will train the participants in the usability testing. A limitation of this approach is that a discrepancy exists between the study-team-conducted training and contractor-conducted training due to the different training curricula. Consequently, some usability issues observed in the present study might not have arisen had the training been conducted by the contractor, and other usability issues might be observed in the Field Test but not in the present study.


Round 1 Testing Protocol

As discussed in the preceding section, Evaluation Strategies, we will conduct a usability evaluation combining a laboratory-based usability testing session and an end-to-end seven-day field usability observation (Combined Test). The Combined Test will consist of the following four components, each of which is described below.

  1. Installing FoodLogger on the participant’s personal smartphone device;

  2. Training the participant to use FoodLogger;

  3. Collecting data for seven days in the field; and,

  4. Conducting virtual lab-based usability testing.


PARTICIPANTS AND RECRUITMENT

Six adults and four school-age children, from six households, will be recruited for Round 1 Testing.

All participants, except children under the age of 11, must meet the following inclusion criteria:

  1. Have an iOS or Android smartphone that they have used daily for at least one year.

  2. Have a data service on their smartphone.

  3. Agree to use their personal smartphone and data service throughout their participation in the study.

  4. Agree to install FoodLogger onto their smartphone.

  5. Agree to enable GPS location service on their smartphone to allow FoodLogger to track their location throughout the duration of the study.

  6. Have an internet service in their household.

  7. Have a laptop/desktop computer in their household which is connected to the internet.


Additional inclusion criteria for adult participants:

  1. All adult participants must be the main food shopper or meal planner in the household.

  2. Each of the six adults must come from a different household.


Additional inclusion criteria for child participants:

  1. All child participants must be enrolled in a K-12 school and, if possible, acquire meals from a school lunch program.

  2. Each child must come from a different participating household.


Expected distribution of socio-economic characteristics among participating households (while not mandatory, efforts will be made to attain a sample with such a distribution):

  1. At least three households are participating in a USDA nutrition assistance program, e.g., the Supplemental Nutrition Assistance Program (SNAP) or the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC).

  2. At least two adults are at least 60 years old.

  3. Two households have a child between the ages of 11 and 15 participating in the study.

  4. Two households have a child under the age of 11 participating in the study, with a parent reporting by proxy.

  5. Three households reside in a rural area, based on the Rural-Urban Commuting Area (RUCA) classification of their zip code (www.ers.usda.gov, https://depts.washington.edu/uwruca/ruca-taxonomies.php).


Recruiting only skilled smartphone users and individuals fluent in English may also contribute to potential limitations in this study. However, the rationale for including only skilled smartphone users is that this requirement allows us to better isolate usability issues specific to the FoodLogger itself from usability issues that pertain to the smartphone regardless of the application in use. It is worth noting that the FoodAPS2 Field Test and Full Survey will employ a multi-mode data collection strategy in which web-based diary and telephone modes will be offered to participants who are unable to use smartphone technology. Additionally, the justification for including only people fluent in English is that the Field Test will only be conducted in English. There are plans to pre-test Spanish-translated versions of new FoodAPS2 questions and concepts, as well as a Spanish-translated version of the FoodLogger questionnaire instrument, prior to conducting the Full Survey.


Participants will be recruited through advertisements on Craigslist, a recruitment database maintained at the Census Bureau, word-of-mouth, and similar channels. Appendix B contains the advertisement script to be posted on Craigslist, and Appendix C shows a flyer to be posted in public areas. A screener questionnaire (Appendix D) will be used to screen potential participants and determine their eligibility. It is assumed that four adults per round will complete the screener questionnaire but will not meet the eligibility/recruitment criteria. The estimated burden associated with successful and unsuccessful screening (i.e., the potential participant does not meet the recruitment criteria) is 15 minutes and 7.5 minutes per individual, respectively.


TIMELINE OF ROUND 1

Table 1 shows the timeline for the execution of major activities in the first round. Details of the activities are provided in the following paragraphs.


Table 1. Timeline of major first round activities

Component                              | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 | Day 7 | Day 8
Study introduction                     |   X   |       |       |       |       |       |       |
Receiving a Disclaimer                 |   X   |       |       |       |       |       |       |
Signing a Consent Form                 |   X   |       |       |       |       |       |       |
Completing a demographic questionnaire |   X   |       |       |       |       |       |       |
Installing FoodLogger                  |   X   |       |       |       |       |       |       |
Data entry training                    |   X   |       |       |       |       |       |       |
Field data entry                       |   X   |   X   |   X   |   X   |   X   |   X   |   X   |
Field data entry debriefing            |       |       |       |   X   |       |       |       |   X
Lab-based usability testing            |       |       |       |   X   |       |       |       |
Signing incentive voucher              |       |       |       |       |       |       |       |   X


DAY 1 ACTIVITIES

The participant will meet virtually with the study team and perform the following tasks in order:

  1. Study introduction (Appendix E);

  2. Receive Disclaimer (Appendix F);

  3. Sign a Consent Form (Appendix G);

  4. Complete a demographic questionnaire (Appendix H);

  5. Install Microsoft (MS) Teams (Appendix Z);

  6. Install the FoodLogger (see pages 2-5 of Appendix I for FoodLogger installation instructions for both iPhone and Android smartphones);

  7. Receive training on using the FoodLogger (see pages 6-15 of Appendix I for training on study concepts and FoodLogger usage); and,

  8. Begin the seven-day field data entry period.


STUDY INTRODUCTION (DAY 1, 0.25 HOURS)

This component will be executed at the participant's home. Only adults deemed eligible to participate based on their completed screener questionnaires (Appendix D), along with any eligible children in their respective households, will be introduced to the study. A TA will communicate with eligible participants through an audio connection and introduce the objectives of the study (Appendix E). The TA will guide the participants through the disclaimer (Appendix F), have them sign a consent form (Appendix G), and have them complete the demographic questionnaire (Appendix H).


INSTALLING MICROSOFT TEAMS AND FOODLOGGER (DAY 1, 0.25 HOURS)

This component will be executed at the participant's home. A TA will communicate with the participant through an audio-and-video connection (e.g., MS Teams). The instructions for installing Microsoft Teams (Appendix Z) and the FoodLogger (Appendix I) will be sent to the participant in advance, after they are deemed eligible to participate. The TA will virtually guide the participant to install Microsoft Teams onto his/her own computer and the FoodLogger onto his/her own smartphone, observe the participant's performance, and record usability issues. After the FoodLogger is installed, a debriefing will be conducted. The TA will use a debriefing guide covering the critical actions for installing FoodLogger (Appendix J). Usability issues will be summarized, and possible root causes will be explored.


DATA ENTRY TRAINING (DAY 1, 0.75 HOURS)

This component will be executed at the participant's home. A TA will communicate with the participant through an audio-and-video connection. The training materials (Appendix I) will be sent to the participant in advance. The TA will virtually train the participant on entering data into FoodLogger, observe the participant's performance, log usability issues, and conduct a debriefing at the end of the training session. The focus of the observation will be on the participant's learning process (e.g., concept comprehension, skill acquisition). The TA will use a debriefing guide covering the critical actions of interest (Appendix K). Usability issues will be summarized, and root causes will be explored.


FIELD DATA ENTRY (DAYS 1-7, 0.5 HOURS PER DAY)

This component will be executed in the participant's daily living setting on his/her own, without the TA's observation. The participant will be instructed (Appendix L) to enter into FoodLogger information on all foods acquired over the seven days, whether purchased or obtained for free, and to log all problems and difficulties encountered during data entry. A standard log form (Appendix M) will be provided to the participant. The TA will be available (7 am – 10 pm, Eastern Time) to provide virtual assistance as needed. The TA will log all interactions with participants.


DAY 4 ACTIVITIES (2 HOURS [1])

In addition to continuing field data entry, participants will meet the study team virtually and perform the following tasks in order:

  1. Participate in a virtual lab-based usability testing session;

  2. Complete a debriefing on the lab-based session; and,

  3. Complete a debriefing on their field data entry for the first three days.

The design of the lab-based usability session is described below. The field data entry debriefing will be conducted virtually. TAs will use a debriefing guide to cover all critical actions of interest (Appendix N). Usability issues will be summarized, and root causes will be explored.


DAY 5-7 ACTIVITIES

Participants will continue field data entry.


DAY 8 ACTIVITIES (1 HOUR)

Participants will virtually meet with the study team and perform the following tasks in order:

  1. Complete a debriefing on their field data entry for the last four days of data collection (Appendix O) and a FoodLogger satisfaction questionnaire (Appendix X); and,

  2. Sign an incentive voucher form (Appendix P).

The incentive will be mailed to the participant upon completion of all study activities and tasks.


LAB-BASED USABILITY TESTING DESIGN

Critical Tasks

A critical task refers to a task whose failure to complete may increase the likelihood of measurement errors. Assessing the participant's performance of critical tasks is the core of usability testing. Twenty-two critical tasks have been identified for successful data entry using FoodLogger and will be tested in the lab-based usability testing (see Appendix Q for a listing of all critical tasks).


Use Cases

The testing includes three use cases.

  1. Food-at-Home (FAH) event (Appendix R);

  2. Food-away-from-Home (FAFH) event (Appendix S); and,

  3. School-meal event (Appendix T).

The three use cases are designed such that each critical task will be performed at least once during testing. Adult participants will be tested on the FAH and FAFH events and, as proxy reporters for school-age children under 11 years old, on a school-meal event. School-age participants who are at least 11 years old will be tested on a school-meal event only.


Performance Measures

The following metrics will be used to assess participants’ data entry performance.

  1. Extent to which the entered data are correct (data entry accuracy);

  2. Duration between the start- and end-times of entering a datum (data entry time);

  3. Extent to which the participant’s actual navigation path deviates from the optimal path (navigation); and,

  4. Paradata and user-observed variables, if available (see Appendix U for a listing of data elements).
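The first two metrics above are straightforward to derive from timestamped paradata. The sketch below is a hypothetical illustration; the function names, timestamp format, and screen-path representation are invented for the example and do not reflect the actual FoodLogger paradata schema.

```python
# Hypothetical computation of two performance measures from paradata.
from datetime import datetime

TS_FMT = "%Y-%m-%d %H:%M:%S"  # illustrative timestamp format

def entry_time_seconds(start: str, end: str) -> float:
    """Data entry time: duration between the start and end timestamps
    of entering a datum."""
    delta = datetime.strptime(end, TS_FMT) - datetime.strptime(start, TS_FMT)
    return delta.total_seconds()

def navigation_deviation(actual_path: list, optimal_path: list) -> int:
    """Navigation metric: number of extra screens visited relative to
    the optimal path through the instrument."""
    return max(0, len(actual_path) - len(optimal_path))
```

For example, a respondent who re-visits two screens while logging an item would score a deviation of 2 against the optimal path.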


Data Collection During Lab-Based Usability Testing Session

A protocol will be followed to carry out the lab-based usability testing (see Appendix V for details on testing protocol). Both quantitative and qualitative data will be collected to assess task performance as well as to investigate root causes of performance deficiency. The following methods will be utilized to facilitate data collection:

  1. Passive observation;

  2. Think aloud; and,

  3. Retrospective debriefing focused on the critical design components, e.g., language and concept comprehension.

TAs will follow a debriefing guide to cover the critical design components of interest (Appendix W).


Data Analysis

Quantitative data will be analyzed with descriptive and/or inferential statistics as appropriate. Qualitative data will be summarized to identify common usability issues and their causes. Usability issues will be classified as high (H), medium (M), or low (L) priority. High-priority issues prevent a task from being completed; medium-priority issues prolong task completion; and low-priority issues do not impact the effectiveness or efficiency of task completion but may affect the user's satisfaction (e.g., dissatisfaction with a screen's layout or with imperfect text formatting).


COVID-19 PROTECTIONS FOR LAB-BASED USABILITY TESTING

Virtual Lab

Given the current COVID-19 pandemic, the lab-based testing session will be carried out virtually. Participants will take part in the testing from their homes. The TA will communicate with the participant through a secure video-and-audio link (e.g., Government-approved Microsoft Teams). Both audio and video will be transmitted in real time, and the participant's voice will be heard and recorded through the audio link.


Food Items for Testing

Food items for lab-based usability testing will be purchased from a local grocery store and delivered directly to a participant’s home prior to the lab-based session. The basic workflow detailing the process for acquiring these food items is described in Appendix Y.


INCENTIVE DISBURSEMENT

Cash in the amount of the incentive earned by the household will be mailed via insured USPS priority mail to the adult participant’s home address after the completion of all tasks and activities for the study period. The incentive structure for the usability test is detailed below.


Round 2 Testing Protocol

The protocol for the second round of testing will be developed based on the findings from Round 1. The Round 2 test design will be similar to Round 1 except in the following ways.

  1. Fewer than ten participants may be recruited;

  2. No Round 2 participant may have participated in Round 1; and,

  3. The laboratory-based testing session may be conducted on the first day to assess the participant’s skill and comprehension of required tasks immediately after training. This would mimic the planned protocol for the Field Test.


Round 3 Testing Protocol

The need for a third round of usability testing will be determined based on the findings from the second round. If needed, the testing protocol for the third round will be identical to that of the second round.


Use of FedRAMP-approved Qualtrics

The U.S. Census Bureau's FedRAMP-approved Qualtrics will be used to collect some administrative information and user experience data from the participants. Collection of CIPSEA-protected data is covered under the Census Bureau's Authorization-to-Operate (ATO) with Qualtrics. Specifically, the following activities will be carried out using Qualtrics:

  1. Signing consent form (Appendix G)

  2. Completing demographic questionnaire (Appendix H)

  3. Debriefing on downloading and installing FoodLogger (Appendix J)

  4. Debriefing on data entry training (Appendix K)

  5. Debriefing on lab-based usability testing (Appendix W)

  6. Debriefing on field data entry days 1-3 (Appendix N)

  7. Debriefing on field data entry days 4-7 (Appendix O)

  8. Signing incentive voucher (Appendix P)


The appendices listed above present the prototype of each data collection instrument’s design.


IT Security

The use of all the equipment for this study will be approved by the USDA security authority.


Confidentiality

Respondents will be provided the following statement when presented with the Disclaimer (Appendix F).


All information which would permit identification of an individual, a practice, or an establishment will be held confidential, will be used for statistical purposes only, will be used only by USDA staff, contractors, and agents authorized by USDA to perform statistical activities only when required and with necessary controls, and will not be disclosed or released to other persons without the consent of the individual or establishment in accordance with the Confidential Information Protection and Statistical Efficiency Act (PL-107-347). By law, every employee as well as every agent has taken an oath and is subject to a jail term of up to five years, a fine of up to $250,000, or both if he or she willfully discloses ANY identifiable information about you. Per the Federal Cybersecurity Enhancement Act of 2015, Federal information systems are protected from malicious activities through cybersecurity screening of transmitted data.


Incentives

The incentive structure is shown in Table 2. Given six adults and four children for each round, the maximum amount of total incentives disbursed per round will be $1,130.


Table 2. Incentive structure (Unit: USD)

Activity           | Day | Adult | Child
Setup and training |  1  |  15   |  15
Data entry         |  1  |   5   |   5
Data entry         |  2  |   5   |   5
Data entry         |  3  |   5   |   5
Laboratory session |  4  |  60   |  30
Data entry         |  4  |   5   |   5
Data entry         |  5  |   5   |   5
Data entry         |  6  |   5   |   5
Data entry         |  7  |   5   |   5
Debriefing         |  8  |  15   |  15
Total Incentive    |     | 125   |  95
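As an arithmetic check, the per-participant and per-round totals in Table 2 reproduce the $1,130 maximum quoted above:

```python
# Verify the Table 2 incentive totals: setup/training ($15), seven days
# of data entry at $5/day, the Day 4 laboratory session ($60 adult /
# $30 child), and the Day 8 debriefing ($15).
adult_total = 15 + 7 * 5 + 60 + 15   # $125 per adult
child_total = 15 + 7 * 5 + 30 + 15   # $95 per child

# Six adults and four children per round.
round_max = 6 * adult_total + 4 * child_total  # $1,130 per round
```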





Burden Estimate

The estimated total number of respondents for Rounds 1 and 2 of this study is 28 individuals. This number can be broken down into two groups: an estimated eight (8) individuals who go through the household recruitment screener but are determined to be ineligible for the study, or who are eligible but decline to participate, and twenty (20) individuals who are eligible and participate.


The total response burden associated with Rounds 1 and 2 of this study is estimated to be 159 hours. The estimated total response burden for the participant group is 158 hours, which averages to about 7.9 hours (or 474 minutes) per participant. The estimated total response burden for those who are screened but do not participate is one hour, which averages to about 7.5 minutes per individual. Table 3 provides a detailed breakdown of the response burden for this study.
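The totals above can be reproduced from Table 3's per-activity estimates:

```python
# Reproduce the Rounds 1-2 burden totals (20 participants across both
# rounds; 8 ineligible individuals who complete only the screener).
screener_hours = 12 * 0.25        # only the 12 adult participants are screened

# Per-participant hours: introductory session + installation tasks
# + training + 7 days of field data entry + lab session + study conclusion.
per_participant = 0.25 + 0.25 + 0.75 + 7 * 0.5 + 2.0 + 1.0  # 7.75 hours

participant_hours = screener_hours + 20 * per_participant   # 158.0 hours
nonrespondent_hours = 8 * 0.125                             # 1.0 hour
total_hours = participant_hours + nonrespondent_hours       # 159.0 hours
minutes_each = participant_hours * 60 / 20                  # 474.0 min/participant
```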


Table 3. Burden estimate for FoodLogger usability evaluation


Activity                    | Sample Size | Freq | Resp. Count | Freq x Count | Hrs per Resp. | Resp. Burden Hrs | Nonresp. Count | Freq x Count | Hrs per Nonresp. | Nonresp. Burden Hrs | Total Burden Hrs
Household Recruitment*      | 20 | 1 | 12 |  12 | 0.25 |   3.00 | 8 | 8 | 0.13 | 1.00 |   4.00
Introductory Session        | 20 | 1 | 20 |  20 | 0.25 |   5.00 |   |   |      |      |   5.00
  Study introduction        | 20 | 1 | 20 |  20 | 0.08 |   1.67 |   |   |      |      |   1.67
  Disclaimer                | 20 | 1 | 20 |  20 | 0.04 |   0.83 |   |   |      |      |   0.83
  Consent                   | 20 | 1 | 20 |  20 | 0.04 |   0.83 |   |   |      |      |   0.83
  Demographic questionnaire | 20 | 1 | 20 |  20 | 0.08 |   1.67 |   |   |      |      |   1.67
Installation Tasks          | 20 | 1 | 20 |  20 | 0.25 |   5.00 |   |   |      |      |   5.00
  Install MS Teams          | 20 | 1 | 20 |  20 | 0.08 |   1.67 |   |   |      |      |   1.67
  Install FoodLogger        | 20 | 1 | 20 |  20 | 0.08 |   1.67 |   |   |      |      |   1.67
  Debriefing                | 20 | 1 | 20 |  20 | 0.08 |   1.67 |   |   |      |      |   1.67
Training                    | 20 | 1 | 20 |  20 | 0.75 |  15.00 |   |   |      |      |  15.00
Data Entry                  | 20 | 7 | 20 | 140 | 0.50 |  70.00 |   |   |      |      |  70.00
Laboratory Session          | 20 | 1 | 20 |  20 | 2.00 |  40.00 |   |   |      |      |  40.00
  Use cases                 | 20 | 1 | 20 |  20 | 1.50 |  30.00 |   |   |      |      |  30.00
  Debriefing use cases      | 20 | 1 | 20 |  20 | 0.25 |   5.00 |   |   |      |      |   5.00
  Debriefing data entry     | 20 | 1 | 20 |  20 | 0.25 |   5.00 |   |   |      |      |   5.00
Study Conclusion            | 20 | 1 | 20 |  20 | 1.00 |  20.00 |   |   |      |      |  20.00
  Debriefing                | 20 | 1 | 20 |  20 | 0.75 |  15.00 |   |   |      |      |  15.00
  Incentive                 | 20 | 1 | 20 |  20 | 0.08 |   1.67 |   |   |      |      |   1.67
  Satisfaction questionnaire| 20 | 1 | 20 |  20 | 0.17 |   3.33 |   |   |      |      |   3.33
Total Burden                |    |   |    |     |      | 158.00 |   |   |      | 1.00 | 159.00

Note: Indented rows break down the main activity row above them; the Total Burden row sums the main activity rows only.


*Only adults will participate in household recruitment.






NOTIFICATION TO RESPONDENT OF ESTIMATED BURDEN

Public reporting burden for this collection of information is estimated to average 474 minutes per participant for each of Rounds 1 and 2. This includes the time for reviewing instructions, installing the required applications, gathering and maintaining the data needed, completing and reviewing the collection of information, and participating in the laboratory and debriefing sessions. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The valid OMB control number for this information collection is 0536-0073. Send comments regarding this burden estimate, or any other aspect of this collection of information, including suggestions for reducing this burden, to: ERS, 1400 Independence Avenue, SW, Mail Stop 1800, Washington, DC 20250-1800, ATTN: Jeffrey Gonzalez (202-694-5341). Do not return the completed form to this address.


References


  1. ISO 9241-11:2018. Ergonomics of human-system interaction — Part 11: Usability: Definitions and concepts. 2018.

  2. USDA, Economic Research Service. Rural-Urban Commuting Area Codes. https://www.ers.usda.gov/data-products/rural-urban-commuting-area-codes.aspx. Accessed January 26, 2021.

  3. University of Washington. RUCA Taxonomies. https://depts.washington.edu/uwruca/ruca-taxonomies.php. Accessed January 26, 2021.



CONTACT INFORMATION


The contact person for questions regarding this data collection is:


Jeffrey Gonzalez

(202) 694-5341

Jeffrey.Gonzalez@usda.gov


Appendixes

  • Appendix A: Screenshots of Smartphone Application, Food Log Portion of FoodLogger

  • Appendix A-1: Food Log Questionnaire for the FoodLogger

  • Appendix B: Participant Recruitment Advertisement Script

  • Appendix C: Participant Recruitment Flyer

  • Appendix D: Screener Questionnaire for Participant Recruitment

  • Appendix E: Introduction on Day 1

  • Appendix F: Disclaimer

  • Appendix G: Consent Form

  • Appendix H: Demographic Questionnaire

  • Appendix I: Courseware for FoodLogger Installation and Data Entry Training

  • Appendix J: Debriefing Guide for Installing FoodLogger

  • Appendix K: Debriefing Guide for Data Entry Training

  • Appendix L: Instructions for Field Data Entry

  • Appendix M: Usability Issues Log

  • Appendix N: Debriefing Guide for Field Data Entry (Days 1-3)

  • Appendix O: Debriefing Guide for Field Data Entry (Days 4-7)

  • Appendix P: Incentive Voucher

  • Appendix Q: Critical Tasks

  • Appendix R: Food-at-Home (FAH) Use Case

  • Appendix S: Food-away-from-Home (FAFH) Use Case

  • Appendix T: School-Meal Use Cases

  • Appendix U: Paradata and User-Observed Variables for Usability Evaluation

  • Appendix V: Protocol for Lab-Based Usability Testing

  • Appendix W: Debriefing Guide for Lab-Based Usability Testing

  • Appendix X: Satisfaction Questionnaire

  • Appendix Y: Workflow for Preparing Food Items for Lab-Based Usability Testing

  • Appendix Z: Instructions for Installing Microsoft Teams and Sharing Smartphone Screen

[1] Estimate does not include hours associated with field data entry, as those hours are accounted for under the preceding heading.



File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Aaron Maitland
File Created: 2021-09-08
