
NLSY79

Retention & Reasons for Noninterview

Retention vs. Response Rate

Retention rates for NLSY79 respondents from 1979 to 1993 exceeded 90 percent; rates from 1994 through 2000 exceeded 80 percent, and rates from 2002 through 2014 were in the 70-percent range. The retention rate is calculated by dividing the number of respondents interviewed by the number of respondents remaining eligible for interview. All 1979 (round 1) respondents, including those reported as deceased, are eligible for interview, with the exception of those who have been permanently dropped from the sample. In the round 29 (2020) survey, 6,535 of the 9,964 eligible civilian and military respondents were interviewed, for an overall retention rate of 65.6 percent. Retention rates for each survey are shown in Table 1. This table also shows which interviews were conducted with paper-and-pencil interviewing (PAPI) and which with computer-assisted personal interviewing (CAPI).

The number of respondents can also be expressed as a percentage of the number of base year respondents not known to be deceased. This is referred to as the response rate and is reported in Table 2 for each survey round. As of 2020, 1,185 main respondents had been reported as deceased. The response rate for those believed to be alive is 74.4 percent.
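The two rates defined above can be reproduced from the 2020 figures quoted in the text. A minimal sketch in Python (the variable names are ours, not NLSY79 terminology):

```python
# Illustrative check of the 2020 (round 29) figures quoted above.
interviewed = 6535   # respondents interviewed in 2020
eligible = 9964      # base-year respondents remaining eligible (includes deceased)
deceased = 1185      # eligible respondents reported deceased as of 2020

# Retention rate: interviewed divided by all eligible respondents,
# deceased included.
retention_rate = 100 * interviewed / eligible

# Response rate: interviewed divided by eligible respondents
# not known to be deceased.
response_rate = 100 * interviewed / (eligible - deceased)

print(f"Retention rate: {retention_rate:.1f}%")  # 65.6%
print(f"Response rate: {response_rate:.1f}%")    # 74.4%
```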


Table 1. Sample Sizes & Retention Rates by Sample Type: NLSY79
  Cross-Sectional Sample Supplemental Sample Military Sample Total Sample
Year Type & Mode of Interview ("Personal" mode can refer to either in person or by telephone) Total Retention Rate¹ Total Retention Rate¹ Total Retention Rate¹ Total Retention Rate¹
1979 Personal/PAPI 6111 - 5295 - 1280 - 12686 -
1980 Personal/PAPI 5873 96.1 5075 95.9 1193 93.2 12141 95.7
1981 Personal/PAPI 5892 96.4 5108 96.5 1195 93.4 12195 96.1
1982 Personal/PAPI 5876 96.2 5036 95.1 1211 94.6 12123 95.6
1983 Personal/PAPI 5902 96.6 5093 96.2 1226 95.8 12221 96.3
1984 Personal/PAPI 5814 95.1 5040 95.2 1215 94.9 12069 95.1
1985 Personal/PAPI 5751 94.1 4957 93.6 186² 92.5 10894³ 93.9
1986 Personal/PAPI 5633 92.2 4839 91.4 183 91.1 10655 91.8
1987 Telephone only/PAPI 5538 90.6 4768 90.1 179 89.1 10485 90.3
1988 Personal/PAPI 5513 90.2 4777 90.2 175 87.1 10465 90.2
1989 Personal/PAPI/CAPI 5571 91.2 4853 91.7 181 90.0 10605 91.4
1990 Personal/PAPI/CAPI 5498 90.0 4755 89.8 183 91.0 10436 89.9
1991 Personal/PAPI 5556 90.9 3281⁴ 89.9 181 90.0 9018⁵ 90.5
1992 Personal/PAPI 5553 90.9 3280 89.8 183 91.0 9016 90.5
1993 Personal/CAPI 5537 90.6 3293 90.2 181 90.0 9011 90.4
1994 Personal/CAPI 5457 89.3 3256 89.2 178 88.6 8891 89.2
1996 Personal/CAPI 5290 86.6 3171 86.8 175 87.1 8636 86.7
1998 Personal/CAPI 5159 84.4 3065 83.9 175 87.1 8399 84.3
2000 Personal/CAPI 4949 81.0 2921 80.0 163 81.1 8033 80.6
2002 Personal/CAPI 4775 78.1 2792 76.5 157 78.1 7724 77.5
2004 Personal/CAPI 4686 76.7 2818 77.2 157 78.1 7661 76.9
2006 Personal/CAPI 4629 75.7 2862 78.4 162 80.6 7653 76.8
2008 Personal/CAPI 4688 76.7 2908 79.6 161 80.1 7757 77.8
2010 Personal/CAPI 4602 75.3 2808 76.9 155 77.1 7565 75.9
2012 Personal/CAPI 4422 72.4 2731 74.8 147 73.1 7300 73.3
2014 Personal/CAPI 4263 69.8 2660 72.8 147 73.1 7070 71.0
2016 Personal/CAPI 4192 68.6 2581 70.7 139 69.2 6912 69.4
2018 Personal/CAPI 4147 67.9 2587 70.8 144 71.6 6878 69.0
2020 Personal/CAPI 3942 64.5 2458 67.3 135 67.2 6535 65.6
 
¹ Retention rate is defined as the percentage of base-year respondents within each sample type remaining eligible who were interviewed in a given survey year. The eligible sample includes deceased and difficult-to-field respondents whom NORC does not attempt to contact.
² A total of 201 military respondents were retained from the original sample of 1,280.
³ The total number of civilian and military respondents in the NLSY79 at the beginning of the 1985 survey was 11,607.
⁴ Economically disadvantaged, nonblack/non-Hispanic female and male members of the supplemental subsample have not been eligible for interview since the 1991 survey year. Remaining eligible for interview in post-1990 surveys are 3,652 black and Hispanic or Latino respondents of the supplemental sample, of whom 3,281 were interviewed in 1991.
⁵ The total number of civilian and military respondents in the NLSY79 at the beginning of the 1991 survey was 9,964.


Table 2. Response Rates (Excluding Deceased Sample Members) by Sample Type: NLSY79
  Cross-Sectional Sample Supplemental Sample Military Sample Total Sample
Year # Interviewed Total Deceased Response Rate # Interviewed Total Deceased Response Rate # Interviewed Total Deceased Response Rate # Interviewed Total Deceased Response Rate
1979 6,111 0 - 5,295 0 - 1,280 0 - 12,686 0 -
1980 5,873 4 96.2 5,075 5 95.9 1,193 0 93.2 12,141 9 95.8
1981 5,892 15 96.7 5,108 14 96.7 1,195 0 93.4 12,195 29 96.3
1982 5,876 24 96.5 5,036 19 95.5 1,211 1 94.7 12,123 44 95.9
1983 5,902 27 97.0 5,093 26 96.7 1,226 4 96.1 12,221 57 96.8
1984 5,814 30 95.6 5,040 33 95.8 1,215 4 95.2 12,069 67 95.6
1985 5,751 36 94.7 4,957 43 94.4 186¹ 0 92.5 10,894¹ 79 94.5
1986 5,633 43 92.8 4,839 51 92.3 183 1 91.5 10,655 95 92.6
1987 5,538 51 91.4 4,768 56 91.0 179 3 90.4 10,485 110 91.2
1988 5,513 56 91.0 4,777 68 91.4 175 3 88.4 10,465 127 91.2
1989 5,571 60 92.1 4,853 78 93.0 181 3 91.4 10,605 141 92.5
1990 5,498 67 91.0 4,755 82 91.2 183 3 92.4 10,436 152 91.1
1991 5,556 75 92.0 3,281² 65 91.5 181 4 91.9 9,018² 144 91.9
1992 5,553 81 92.1 3,280 71 91.6 182 4 92.4 9,015 156 91.9
1993 5,537 90 92.0 3,293 83 92.3 181 4 91.9 9,011 177 92.1
1994 5,457 104 90.8 3,256 96 91.6 178 4 90.4 8,891 204 91.1
1996 5,290 129 88.4 3,171 109 89.5 175 5 89.3 8,636 243 88.8
1998 5,159 152 86.6 3,065 118 86.7 175 5 89.3 8,399 275 86.7
2000 4,949 170 83.3 2,921 136 83.1 163 7 84.0 8,033 313 83.2
2002 4,775 188 80.6 2,792 151 79.7 157 7 80.9 7,724 346 80.3
2004 4,686 221 79.6 2,818 171 81.0 157 7 80.9 7,661 399 80.1
2006 4,629 252 79.0 2,862 197 82.8 162 7 83.5 7,653 456 80.5
2008 4,688 274 80.3 2,908 222 84.8 161 7 83.0 7,757 503 82.0
2010 4,602 309 79.3 2,808 255 82.7 155 9 80.7 7,565 573 80.6
2012 4,422 384 77.2 2,731 292 81.3 147 13 78.2 7,300 689 78.7
2014 4,263 444 75.2 2,660 331 80.1 147 15 79.0 7,070 790 77.1
2016 4,192 512 74.9 2,581 385 79.0 139 18 76.0 6,912 915 76.4
2018 4,147 576 74.9 2,587 435 80.4 144 22 80.4 6,878 1,033 77.0
2020 3,942 659 72.3 2,458 502 78.0 135 24 76.3 6,535 1,185 74.4
 
Note: Response rate is defined as the percentage of base-year respondents remaining eligible and not known to be deceased who were interviewed in a given survey year.
¹ A total of 201 military respondents were retained from the original sample of 1,280; 186 of the 201 participated in the 1985 interview. The total number of NLSY79 civilian and military respondents eligible for interview (including deceased respondents) beginning in 1985 was 11,607.
² The 1,643 economically disadvantaged nonblack/non-Hispanic male and female members of the supplemental subsample were not eligible for interview as of the 1991 survey year. The total number of NLSY79 civilian and military respondents eligible for interview (including deceased respondents) beginning in 1991 was 9,964.

Reasons for Noninterview

A 'Reason for Noninterview' variable is constructed for each survey year (excluding 1979) in the NLSY79 and provides an explanation of why an interview could not be conducted or completed with a respondent. The cause of noninterview is assigned by the NORC interviewer to each respondent designated as a member of the eligible sample for a given survey year. Typical coding categories have included reasons such as an interview being refused by the respondent or by the respondent's parent, the respondent or family unit not being located, or the respondent being reported as deceased.

Beginning in the 1980s, two administrative categories were added. The first reflected a decision by NORC not to attempt to interview certain sample members who were determined to be extremely difficult to interview. The second indicates that, due to funding cutbacks, interviews would not be attempted with certain members of one or more of the NLSY79 subsamples. Thus, beginning in 1985, interviews ceased for 1,079 members of the military subsample; each was permanently assigned a reason for noninterview of "military sample dropped." A second group of respondents, those belonging to the supplemental economically disadvantaged, nonblack/non-Hispanic sample, was similarly dropped from interviewing beginning with the 1991 survey. The target universe for each survey year--that is, the respondents whom NORC attempts to interview--thus includes all respondents interviewed in the initial survey year except those who were:

  1. reported deceased at an earlier interview
  2. dropped from the sample
  3. judged to be extremely difficult to interview

Important information about noninterview

Reasons for noninterview may change for a given respondent between noninterview years, even if those years are contiguous. Some codes, such as "parent refusal/break off," have become virtually obsolete with the aging of the cohort. Other codes, such as those applied to the reported death of a respondent, should be considered relatively permanent. (Users should be aware that some respondents have used false reports of death to avoid being interviewed. NORC attempts to verify such reports by obtaining death certificate information or newspaper obituaries.)

The coding of deceased members of the two subsamples dropped from interviewing in 1985 and 1991 has not been handled consistently. Those respondents of the military sample reported deceased during the 1980-84 surveys, that is, those with a code of "65 - Deceased" on a 'Reason for Noninterview' variable, have been recoded, beginning in 1985, to "68 - Military Sample Dropped"; this recode occurred for four cases. Thus, the count of 1,079 reflects all members of the military subsample, both living and deceased, who were dropped from interviewing; however, this means that the cumulative count of total deceased respondents on any post-1984 'Reason for Noninterview' will be understated.

The 22 members of the supplemental economically disadvantaged, nonblack/non-Hispanic sample who had died prior to the dropping of the sample in 1991 were not similarly reclassified as dropped. The count of 1,621 for the economically disadvantaged, nonblack/non-Hispanic sample in the 1991 'Reason for Noninterview' variable reflects only the living members of the total 1,643 who were dropped; the 22 deceased members of the supplemental economically disadvantaged, nonblack/non-Hispanic subsample remain coded as deceased.

Table 3 presents the number of respondents not interviewed across survey years by sample type.


Table 3. Reasons for Noninterview by Sample Type: NLSY79 1980-2020
Key: C = Cross-sectional, S = Supplemental, M = Military
Survey Year Total Not Interviewed Refusal Can't Locate Deceased Other Difficult Cases Dropped¹
(Each category lists C, S, and M counts, in that order.)
1980 238 220 87 153 91 9 60 101 56 4 5 0 21 23 22 0 0 0 0 0 0
1981 219 187 85 133 71 16 30 64 20 15 14 0 41 38 49 0 0 0 0 0 0
1982 235 259 69 86 73 18 56 123 30 24 19 1 7 25 18 62 19 2 0 0 0
1983 209 202 54 103 94 23 43 63 18 27 26 4 15 14 8 21 5 1 0 0 0
1984 297 255 65 204 138 32 54 73 24 30 33 4 9 11 5 0 0 0 0 0 0
1985 360 338 1094 180 146 5 51 94 7 36 43 0 10 14 2 83 41 1 0 0 1079
1986 478 456 1097 284 230 10 78 115 7 43 51 1 14 22 0 59 38 0 0 0 1079
1987 573 527 1101 286 217 5 118 165 10 51 56 3 28 39 1 90 50 3 0 0 1079
1988 598 518 1105 335 248 4 107 128 13 56 68 3 43 36 4 57 38 2 0 0 1079
1989 540 442 1099 316 202 7 90 93 5 60 78 3 19 25 2 55 44 3 0 0 1079
1990 613 540 1097 385 269 8 101 139 6 67 82 3 23 28 1 37 22 0 0 0 1079
1991 555 1992 1099 316 182 9 97 99 6 75 65 4 8 13 1 59 12 0 0 1621 1079
1992 558 1993 1097 323 196 7 82 70 6 81 71 4 11 16 1 61 19 0 0 1621 1079
1993 574 1980 1099 338 191 11 57 62 3 90 83 4 12 10 1 77 13 1 0 1621 1079
1994 654 2017 1102 398 196 9 78 59 9 104 96 4 11 14 0 63 31 1 0 1621 1079
1996 820 2102 1105 486 216 6 86 87 11 128 109 5 22 23 1 98 46 3 0 1621 1079
1998 952 2208 1105 490 233 8 117 146 9 152 118 5 74 59 1 119 31 3 0 1621 1079
2000 1162 2352 1117 689 333 15 162 180 11 170 136 7 58 61 4 83 21 1 0 1621 1079
2002 1336 2503 1123 684 394 14 201 195 14 188 151 7 65 66 4 198 54 5 0 1621 1079
2004 1425 2477 1123 790 328 16 240 197 15 221 171 7 119 80 3 55 58 3 0 1621 1079
2006 1482 2433 1118 969 403 20 165 138 10 252 197 7 38 19 0 58 33 2 0 1643 1079
2008 1423 2387 1119 814 311 14 177 149 10 274 222 7 103 31 7 55 31 2 0 1643 1079
2010 1509 2487 1125 795 334 22 194 162 14 309 255 9 98 42 1 113 51 0 0 1643 1079
2012 1689 2564 1133 649 238 16 255 198 13 384 292 13 317 152 12 84 41 0 0 1643 1079
2014 1848 2635 1133 775 328 18 346 213 16 444 331 15 111 54 3 172 66 2 0 1643 1079
2016 1919 2714 1141 722 318 20 482 295 23 512 385 18 31 19 0 172 54 1 0 1643 1079
2018 1964 2708 1136 633 281 19 496 271 13 576 435 22 22 9 0 237 69 3 0 1643 1079
2020 2169 2837 1145 1010 487 29 185 94 9 659 502 24 80 49 3 235 62 1 0 1643 1079
 
¹ Two groups of NLSY79 respondents have been dropped from interviewing: (1) 1,079 members of the 1,280-member military subsample were dropped after the 1984 survey, and (2) the 1,643 members of the supplemental economically disadvantaged, nonblack/non-Hispanic subsample were dropped after the 1990 interview.

Sample Representativeness & Attrition

This section reviews the number of respondents, by race, sex, and NLSY79 sample type, who have been interviewed in every survey round. It also takes a brief look at the racial composition of the cohort at the initial and latest survey points. Table 4 shows the number of respondents, excluding dropped respondents, who were interviewed at all survey points and illustrates the high degree of NLSY79 retention. From 1979 to 2020 the survey has been administered 29 times.

Table 4. Percentage of NLSY79 Respondents, Excluding Dropped Respondents, Who Answered Every Survey: 1979-2020
Year Percent Number   Year Percent Number
1979 100% 9964 1993 73.2% 7291
1980 96.0% 9571 1994 71.8% 7153
1981 94.3% 9395 1996 69.6% 6935
1982 92.7% 9234 1998 66.9% 6664
1983 91.6% 9125 2000 63.3% 6310
1984 89.7% 8942 2002 60.3% 6004
1985 87.5% 8721 2004 57.6% 5736
1986 85.0% 8472 2006 55.6% 5538
1987 82.3% 8203 2008 54.3% 5407
1988 79.9% 7957 2010 52.3% 5208
1989 78.5% 7819 2012 50.3% 5012
1990 76.7% 7642 2014 48.0% 4787
1991 75.5% 7521 2016 46.3% 4613
1992 74.2% 7396 2018 44.7% 4452
        2020 42.5% 4239

Table 5 shows the distribution of the number of interviews completed by respondents, broken down by sample type. The "# who completed" column shows how many respondents completed exactly that number of surveys. These numbers refer to any surveys completed since the NLSY79 began, not necessarily consecutive surveys or surveys completed in particular years. The cumulative percent column shows the cumulative percentage of respondents completing at least a given number of surveys, rather than the percentage completing an exact number. Readers should note that the attrition suggested in Table 3 greatly overstates the amount of lost information. The NLSY79 asks detailed questions about work history, education, training, marital status, and fertility since the date of the respondent's last interview. These retrospective questions capture information lost due to missed interviews. Hence, a perfect response record is not needed for researchers to understand how a respondent's life changes over time, unless he or she leaves the survey permanently.
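The relationship between the exact-count column and the cumulative percent column can be sketched as follows. The counts below are hypothetical, chosen only to illustrate the arithmetic; they are not taken from Table 5:

```python
# Hypothetical counts of respondents completing exactly 1, 2, 3, or 4 surveys.
exact_counts = [5, 10, 15, 70]
total = sum(exact_counts)  # 100 respondents

# Cumulative percent at k = share of respondents completing AT LEAST k
# surveys, i.e. everyone not accounted for by smaller exact counts.
cumulative_percent = []
remaining = total
for count in exact_counts:
    cumulative_percent.append(100 * remaining / total)
    remaining -= count

print(cumulative_percent)  # [100.0, 95.0, 85.0, 70.0]
```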


Table 5. Number of Interviews Respondents Completed out of 29 Surveys by Sample Type: NLSY79 1979-2020
  Total Sample Cross Sectional Sample Supplemental Sample Military Sample
# of Surveys¹ # who Completed Cumul. Percent # who Completed Cumul. Percent # who Completed Cumul. Percent # who Completed Cumul. Percent
1 39 0.4 25 0.4 14 0.4 0 -
2 38 0.4 22 0.4 15 0.4 1 0.5
3 29 0.3 19 0.3 10 0.3 0 -
4 38 0.4 17 0.3 19 0.5 2 1.0
5 48 0.5 35 0.6 12 0.3 1 0.5
6 61 0.6 36 0.6 25 0.7 0 -
7 70 0.7 50 0.8 18 0.5 2 1.0
8 65 0.7 42 0.7 22 0.6 1 0.5
9 80 0.8 49 0.8 31 0.8 0 -
10 70 0.7 47 0.8 22 0.6 1 0.5
11 71 0.7 36 0.6 35 1.0 0 -
12 79 0.8 55 0.9 22 0.6 2 1.0
13 96 0.9 62 1.0 34 0.9 0 -
14 105 1.1 69 1.1 35 1.0 1 0.5
15 108 1.1 69 1.1 37 1.0 2 1.0
16 163 1.6 101 1.7 57 1.6 5 2.5
17 161 1.6 116 1.9 42 1.2 3 1.5
18 206 2.1 134 2.2 65 1.8 7 3.5
19 205 2.1 131 2.1 70 1.9 4 2.0
20 235 2.4 147 2.4 82 2.2 6 3.0
21 227 2.3 134 2.2 87 2.4 6 3.0
22 255 2.6 147 2.4 100 2.7 8 4.0
23 310 3.1 177 2.9 122 3.3 11 5.5
24 340 3.4 196 3.2 137 3.8 7 3.5
25 416 4.2 261 4.3 150 4.1 5 2.5
26 498 5.0 267 4.4 213 5.8 18 9.0
27 625 6.3 333 5.4 275 7.5 17 8.5
28 1087 10.9 622 10.2 446 12.2 19 9.5
29 4239 42.5 2712 44.4 1455 40.0 72 35.8
Total 9964 100% 6111 100% 3652 100% 201 100%
 
Note: Universe excludes the 1,079 members of the military subsample and the 1,643 members of the economically disadvantaged, non-black/non-Hispanic oversample dropped from interviewing; it includes the remaining 9,964 eligible members.
¹ Surveys completed in any year, not necessarily consecutive survey years.

Confidentiality & Informed Consent

The NLS program has established set procedures for ensuring respondent confidentiality and obtaining informed consent. These procedures comply with Federal law and the policies and guidelines of the U.S. Office of Management and Budget (OMB) and the U.S. Bureau of Labor Statistics:

OMB Procedures and Federal Laws

OMB Procedures

The Office of Management and Budget (OMB) is responsible for setting overall statistical policy among Federal agencies. For example, OMB has established standards on collecting information about race and ethnicity, industry, occupation, and geographic location. OMB also has established standards on the manner and timing of data releases for such principal economic indicators as the gross domestic product, the national unemployment rate, and the Consumer Price Index. In addition, OMB sets standards on whether and how much respondents to Federal surveys can be paid for their participation, an issue of particular concern in the NLS program.

Another of OMB's responsibilities is to review the procedures and questionnaires that Federal agencies use in collecting information from 10 or more respondents. Federal data collections reviewed by OMB include administrative data, such as the tax forms that the Internal Revenue Service requires individuals and corporations to complete. OMB also reviews all censuses and surveys that Federal agencies conduct, either directly or through contracts.

Surveys that are funded through Federal grants to universities and other organizations generally do not have to undergo this OMB review process unless the grantee in turn contracts with a Federal statistical agency such as the Census Bureau to collect the data. In place of OMB review, surveys funded through grants typically must undergo a competitive peer-review process established by the agency administering the grant, and that review process examines the procedures for maintaining respondent confidentiality and obtaining the informed consent of the participants. In addition, such surveys also typically are scrutinized by an institutional review board established at the grantee's institution.

OMB examines a variety of issues during these reviews, such as the:

  • amount of time (and money, if any) that the agency collecting the information estimates respondents will spend to provide the requested information
  • agency's efforts to reduce the burden on respondents of providing the information
  • purpose and necessity of the data collection, including whether it duplicates the objectives of other Federal data collections
  • ways in which the agency obtains informed consent from potential respondents to participate in the data collection
  • policies and procedures that the agency has established to ensure respondent confidentiality
  • statistical methods used to select representative samples, maximize response rates, and account for nonresponse
  • payment of money or the giving of gifts to respondents
  • questionnaire itself, including the quality of its design and whether it includes questions that respondents may regard as sensitive

These OMB reviews are very thorough. From the time an agency prepares an OMB information collection request until the time OMB approves the data collection, the process typically takes 7 months or more and includes multiple layers of review within the agency and at OMB. These reviews are helpful in improving survey quality and ensuring that agencies treat respondents properly, both in terms of providing them with information about the data collection and its uses and protecting respondent confidentiality.

The review process also provides the general public with two opportunities to submit written comments about the proposed data collection. The agency conducting the data collection publishes a notice in the Federal Register describing the data collection and inviting the public to request copies of the information collection request, questionnaires, and other materials that the agency eventually will submit to OMB. The public is invited to submit written comments to the agency sponsoring the data collection within 60 days from the time the Federal Register notice is published. In the history of the NLS program, the public very rarely has submitted comments to BLS, but when comments are received, they are summarized in the information collection request that ultimately is submitted to OMB.

After the request has been submitted to OMB, the agency sponsoring the data collection publishes a second notice in the Federal Register and invites the public to submit comments directly to OMB within 30 days. Again, in the history of the NLS program, the public very rarely, if ever, has submitted comments to OMB. Once OMB has received the information collection request, it has 60 days to review the package, ask follow-up questions, suggest changes (or, occasionally, insist upon changes) to the survey questionnaire or procedures, and ultimately grant approval.

Respondents' Advance Letter. After OMB grants approval, the sponsoring agency can begin contacting potential respondents and collecting information from them. The process of contacting potential NLS respondents begins with sending them an advance letter several weeks before interviews are scheduled to begin. The advance letter serves several purposes. The obvious purpose is to inform respondents that an interviewer will be contacting them soon, but BLS and the organizations that conduct the surveys for BLS also use the letter to thank respondents for their previous participation and to encourage them to participate in the upcoming round. Another important objective of the advance letter is to remind respondents that their participation is voluntary and to tell them how much time the interview is expected to take. The letter also explains to respondents how the data will be used and how respondents' confidentiality will be protected by BLS and the organizations that conduct the surveys for BLS. An example of an advance letter, along with the confidentiality statement that appears on the back of the letter, is shown in Figure 1.

Figure 1. NLSY79 Round 29 Advance Letter

Dear [Respondent Name],
For more than 40 years, the NLSY79 has provided vital information about the lives of ordinary Americans. Few surveys can match the NLSY79 in helping us understand who we are as a nation. And for that, we thank you.

Your continued participation in this study has impacted how our country understands important economic, educational, and labor market issues. And as you near retirement age and potentially leave the paid labor force, the NLSY79 will permit researchers to study key questions about retirement and the causes and consequences of age-related health issues.

We follow the federal laws that govern the confidentiality of survey respondents, as well as additional policies and procedures that ensure your answers are safeguarded. Please see the back of this letter for more information about privacy and confidentiality.

The average interview lasts about 74 minutes and you can schedule your appointment online as well as get extra cash with our Early Bird program! (See enclosed card for details.) To receive your gift faster, we offer electronic payment options through online or mobile banking and PayPal.

We appreciate your time and willingness to thoughtfully answer our questions. Few people have the opportunity to make such a lasting contribution. Thank You!

Sincerely,

Keenan Dworak-Fisher
Director, National Longitudinal Surveys
U.S. Bureau of Labor Statistics 

WHY IS THIS STUDY IMPORTANT? Thanks to your help, policymakers and researchers will have a better understanding of the work experiences, family characteristics, health, financial status, and other important information about the lives of people in your generation. This is a voluntary study, and there are no penalties for not answering questions. However, missing responses make it more difficult to understand the issues that concern people in your community and across the country. Your answers represent the experiences of hundreds of other people your age. We hope we can count on your participation again this year.

WHO AUTHORIZES THIS STUDY? The sponsor of the study is the U.S. Department of Labor, Bureau of Labor Statistics. The study is authorized under Title 29, Section 2, of the United States Code. The Center for Human Resource Research at The Ohio State University and NORC at the University of Chicago conduct this study under a contract with the Department of Labor. The U.S. Office of Management and Budget (OMB) has approved the questionnaire and has assigned 1220-0109 as the study's control number. This control number expires on ##/##/20##. Without OMB approval and this number, we would not be able to conduct this study.
 

WHO SEES MY ANSWERS? We want to reassure you that your confidentiality is protected by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act, the Privacy Act, and other applicable Federal laws, the Bureau of Labor Statistics, its employees and agents, will, to the full extent permitted by law, use the information you provide for statistical purposes only, will hold your responses in confidence, and will not disclose them in identifiable form without your informed consent. All the employees who work on the survey at the Bureau of Labor Statistics and its contractors must sign a document agreeing to protect the confidentiality of your data. In fact, only a few people have access to information about your identity because they need that information to carry out their job duties.

Some of your answers will be made available to researchers at the Bureau of Labor Statistics and other government agencies, universities, and private research organizations through publicly available data files. These publicly available files contain no personal identifiers, such as names, addresses, Social Security numbers, and places of work, and exclude any information about the States, counties, metropolitan areas, and other, more detailed geographic locations in which survey participants live, making it much more difficult to figure out the identities of participants. Some researchers are granted special access to data files that include geographic information, but only after those researchers go through a thorough application process at the Bureau of Labor Statistics. Those authorized researchers must sign a written agreement making them official agents of the Bureau of Labor Statistics and requiring them to protect the confidentiality of survey participants. Those researchers are never provided with the personal identities of participants. The National Archives and Records Administration and the General Services Administration may receive copies of survey data and materials because those agencies are responsible for storing the Nation's historical documents.

HOW MUCH TIME WILL THE INTERVIEW TAKE? Based on preliminary tests, we expect the average interview to take about 74 minutes. Your interview may be somewhat shorter or longer depending on your circumstances. If you have any comments regarding this study or recommendations for reducing its length, send them to the Bureau of Labor Statistics, National Longitudinal Surveys, 2 Massachusetts Avenue, N.E., Washington, DC 20212.

WHERE CAN I FIND MORE INFORMATION? To learn more about the survey, visit www.bls.gov/nls. To search for articles, reports, and other research based on the National Longitudinal Surveys, visit www.nlsbibliography.org.

Institutional Review Boards

In addition to OMB review, the NLSY79 is reviewed and approved by an institutional review board (IRB) at each of the institutions that manage and conduct the surveys under contract with BLS: The Ohio State University and NORC at the University of Chicago. BLS and OMB do not require these reviews; rather, they are required under the policies of the universities. Obtaining approval from the IRBs involves completing a form signed by the Principal Investigator, providing a summary of the research project, and submitting a description of the consent procedures and forms used in the survey. Additional documentation includes a copy of any materials used to recruit respondents, a detailed summary of the survey questionnaire, and any other information regarding the risks to human subjects of participating in the survey. Regardless of IRB approval, OMB must review all data collections for the NLSY79.

The NLSY79 project staff at The Ohio State University Center for Human Resource Research (CHRR) and at NORC obtain approval from their respective IRBs prior to the start of each round of data collection. Because each survey includes only an interview and no invasive medical procedures, the IRBs typically focus on respondent compensation, consent procedures, and confidentiality protections for special populations, such as incarcerated or disabled respondents. Prisons, schools, and other institutions in which NLSY79 sample members may reside often request the IRB approval statement and application as evidence that appropriate procedures are being followed and to judge whether to permit NLSY79 interviewers to have access to individuals for whom the institutions are responsible.

Federal Laws

Two Federal laws govern policies and procedures for protecting respondent confidentiality and obtaining informed consent in the NLSY79 program: the Privacy Act of 1974 and the Confidential Information Protection and Statistical Efficiency Act (CIPSEA) of 2002. 

The Privacy Act and CIPSEA. These two acts protect the confidentiality of participants in the NLSY79 and its associated Child and Young Adult surveys. CIPSEA protects the confidentiality of participants by ensuring that individuals who provide information to BLS under a pledge of confidentiality for statistical purposes will not have that information disclosed in identifiable form to anyone not authorized to have it. 

In addition, CIPSEA ensures that the information respondents provide will be used only for statistical purposes. While it always has been the BLS policy to protect respondent data from disclosure through the Privacy Act and by claiming exemptions to the Freedom of Information Act, CIPSEA is important because it specifically protects data collected from respondents for statistical purposes under a pledge of confidentiality. 

This law strengthens the ability of BLS to assure respondents that, when they supply information to BLS, their information will be protected. In addition, CIPSEA includes fines and penalties for any knowing and willful disclosure of specific information to unauthorized persons by any officer, employee, or agent of BLS. Since the enactment of the Trade Secrets Act and the Privacy Act, BLS officers, employees, and agents have been subject to criminal penalties for the mishandling of confidential data, and the fines and penalties under CIPSEA are consistent with those prior laws. CIPSEA now makes such fines and penalties uniform across all Federal agencies that collect data for exclusively statistical purposes under a pledge of confidentiality.

Survey interviewers are trained how to answer questions from respondents about how their privacy will be protected. Interviewers explain to potential respondents that all the employees who work on the surveys at BLS, NORC, and CHRR are required to sign a document stating that they will not disclose the identities of survey respondents to anyone who does not work on the NLS program and is therefore not legally authorized to have such information. In fact, no one at BLS has access to information about respondents' identities, and only a few staff members at NORC and CHRR who need such information to carry out their job duties have access to information about respondents' identities.

Interviewers also explain that the answers respondents provide will be made available to researchers at BLS and other government agencies, universities, and private research organizations, but only after all personal identifiers--such as names, addresses, Social Security numbers, and places of work--have been removed. In addition, the publicly available data files exclude any information about the States, counties, metropolitan statistical areas, and other, more detailed geographic locations in which respondents live, making it much more difficult to infer the identities of respondents.

Respondents are told that some researchers are granted special access to data files that include geographic information, but only after those researchers undergo a thorough application process at BLS and sign a written agreement making them official agents of BLS and requiring them to protect the confidentiality of respondents. In no case are researchers provided with information on the personal identities of respondents.

Finally, the reference in the questions and answers to the National Archives and Records Administration and the General Services Administration may be confusing to some potential respondents, because those Federal agencies are not involved in the administration of the surveys. Interviewers explain to respondents that NLS data and materials will be made available to those agencies because they are responsible for storing the Nation's historical documents.  The information provided to those agencies does not include respondents' personal identities, however.

The organizations involved in the NLS program continuously monitor their security procedures and improve them when necessary. Protecting the privacy of NLS respondents entails considerable responsibilities for BLS, the organizations that conduct the surveys for BLS, and the researchers who use the data. Indeed, researchers in particular may become frustrated that they cannot obtain access to all the data that they want or that they must undergo a long review process at BLS to obtain some types of data. It is important to remember, however, that protecting respondent confidentiality must remain paramount. Any action that might jeopardize respondent confidentiality and erode the confidence of respondents could harm response rates in the NLS program and in other government or academic surveys. Thus, without the safeguards in place to protect respondent confidentiality, researchers would have far less data available to work with than they currently enjoy.

Contractors' Role in Maintaining Respondent Confidentiality. BLS, NORC, and CHRR are responsible for following the Federal requirements and maintaining their own security procedures. As mentioned earlier, all officers, employees, and agents of BLS are required to sign agreements stating that they will not disclose the identities of survey respondents to anyone who does not work on the NLS program and is therefore not legally authorized to have such information. Each contractor has in place procedures to ensure that the data are secure at each point in the survey process. (See the Data Handling section for more information.)

Survey Procedures

Like all contractor staff, field interviewers are agents of BLS and are required to sign the BLS agent agreement before working on the NLSY79. All interviewers also must undergo a background check when they are hired. Confidentiality is stressed during training and enforced at all times. Field interviewers receive specific instructions in their reference manuals to remind them of the appropriate procedures when locating or interacting with respondents or contacts.

At the end of each interview, interviewers ask respondents to provide information on family members, friends, or neighbors who can be contacted if the interviewers are unable to locate the sample member in a subsequent round of interviews. The interviewers then use those contacts to help in locating sample members who have moved. When contacting a sample member's relatives, friends, or neighbors about the sample member's whereabouts, interviewers never disclose the name of the survey they are conducting. They are instructed to maintain the confidentiality of any relative, friend, or neighbor who provides information about the sample member's whereabouts.

Answering machines can pose problems when interviewers are contacting sample members because it is difficult to confirm that the interviewer is calling a sample member's correct telephone number or that other household members will not hear the message. For those reasons, interviewers are instructed not to leave messages on answering machines.

When interviewers contact the appropriate household, they ask to speak with the sample member. Interviewers introduce themselves and state the purpose of the call by saying that they are from the National Opinion Research Center at the University of Chicago and are calling concerning a national survey. The name of the survey is not disclosed to anyone but the sample member.

Special Situations

The NLSY79 is a general population survey and includes a variety of sample members with special circumstances, such as incarcerated individuals, respondents in the military, other institutionalized persons, disabled persons, those with limited English proficiency, and so forth.

Incarcerated Respondents. Incarcerated respondents constitute the largest group requiring special accommodations. The first challenge with incarcerated respondents is contacting them to schedule an interview. NLS interviewers must contact the prison administration to arrange for an interview, but the interviewers cannot legally reveal to the prison administration that the prisoner previously had participated in the survey without first obtaining the written, informed consent of the prisoner to reveal that information (Note: Data were incomplete for 2004 due to confidentiality concerns regarding inmates' participation in the NLSY79. A protocol was established for round 22 of the NLSY79). 

The following steps are used for obtaining prisoners' consent:

  1. Prisoners are first sent a letter reminding them about their previous participation in a NORC survey, but, in case the mail is monitored by prison staff, the letter does not name the survey or BLS so as not to reveal the prisoner's participation. The letter encourages the prisoner to participate in the upcoming round of the survey. It explains that NORC staff need to set up an interview through the prison administration but that NORC cannot tell the prison administration about the prisoner's participation without the prisoner's informed consent. The letter then asks the prisoner to request a consent form by signing and dating an enclosed form letter and mailing it to NORC in a pre-addressed, postage-paid envelope. The letter reminds the prisoner that the mail at the institution may be monitored and explains that the consent form that NORC will send will state the prisoner's name and the name of the survey. The letter emphasizes that, if the prisoner returns the enclosed form letter, prison management or staff may learn that the prisoner is a participant in the survey.
  2. If the prisoner chooses to send the form letter to NORC, NORC then sends the prisoner a cover letter and a consent form that names the specific survey.  The prisoner is asked to sign the consent form and mail it to NORC in a pre-addressed, postage-paid envelope. Once NORC has received the signed consent form, NORC staff can contact the prison to request permission to interview the prisoner and learn about any restrictions that the prison administration may impose.
  3. If the prison administration permits an interview and a date and time have been scheduled, NORC mails another letter to the prisoner. This letter serves two purposes. First, it tells the prisoner when the interview will take place. Second, because the interview very likely will be monitored by prison staff, it informs the prisoner of that fact in writing.

Once all of these steps are complete, the prisoner finally can be interviewed, but the NLS program takes additional steps to minimize the risk that prisoners might reveal illegal or illicit behavior in the presence of prison staff during the course of the interview. 

As described later in this chapter, such sensitive questions are asked in the self-administered portions of the NLSY79. During these portions of the survey, the typical protocol for a respondent who is not incarcerated involves the interviewer turning the laptop computer around to enable the respondent to read the questions to him or herself and enter the answers directly into the laptop computer without the interviewer knowing the responses. (In fact, the interviewer does not even know which questions the respondent answered). In some relatively low-security correctional facilities, such as some county jails and halfway houses, this protocol still would be possible. In higher security facilities, the prison administrators would not permit the prisoner to touch the computer, so the questions either would have to be read to the respondent or skipped altogether.

NLS program staff have identified the questions that could be considered even moderately sensitive or risky for the prisoner to answer out loud. Based on that examination, the NLS program has adopted the following protocol for administering sensitive questions to prisoners:

  1. At the very beginning of the interview, the interviewer indicates in the survey instrument whether the respondent is in a correctional facility of any kind and, if so, whether the facility permits the prisoner to touch the laptop and enter responses to the self-administered questions. For Federal prisons, the interviewer assumes that the prisoner is not permitted to touch the laptop.
  2. If the facility permits the prisoner to enter responses to the self-administered questions directly into the laptop, the full set of questions, including all of the sensitive questions, is administered.
  3. If the facility does not permit the prisoner to enter responses directly into the laptop, or if the interview is conducted over the telephone rather than in person, all survey questions are read aloud by the interviewer, and the instrument is programmed to skip sensitive questions in which the prisoner might be asked about illegal or illicit behavior.
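The branching logic in the three steps above can be sketched as follows. The function name and flags are illustrative only; the actual CAPI instrument encodes this routing differently.

```python
# Illustrative sketch of the sensitive-question protocol for incarcerated
# respondents described above. Flag names are hypothetical; the real CAPI
# instrument implements this logic internally.

def question_mode(in_correctional_facility, may_touch_laptop, by_telephone):
    """Decide how the self-administered (sensitive) questions are handled."""
    if not in_correctional_facility:
        # Standard protocol: the respondent enters answers directly.
        return "self-administered"
    if may_touch_laptop and not by_telephone:
        # Some lower-security facilities permit direct entry.
        return "self-administered"
    # Otherwise questions are read aloud, and the instrument skips items
    # about illegal or illicit behavior.
    return "read-aloud, sensitive items skipped"

print(question_mode(True, False, False))  # prints "read-aloud, sensitive items skipped"
```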

Military Respondents. NLSY79 respondents who are in the military tend to be very cooperative and willing to participate in the surveys, but it sometimes can be difficult to locate and contact them, particularly if they are stationed outside the United States. It sometimes is necessary to seek the help of military or civilian staff in the Department of Defense to locate and contact military respondents, but NLS program staff first must obtain the military member's written, informed consent to reveal to Department of Defense staff that he or she previously had participated in the survey and is willing to be contacted to participate in future rounds of the survey.

Respondents with Limited English Proficiency. Some respondents lack fluency in English and are more comfortable using another language. It is not possible to accommodate all of the different languages other than English that respondents might speak, but the NLSY79 historically has made special arrangements for respondents and their parents who speak Spanish, the most commonly spoken language other than English among respondents. NORC staff members translate advance letters and other informational materials into Spanish to enable respondents and the parents of minor respondents to provide their informed consent based on information that is written in the language that they understand best. Survey questionnaires also have been translated into Spanish to ensure that the surveys are administered consistently, an alternative much preferable to having Spanish-speaking interviewers translate the English-language questionnaire during the interview. The first 20 rounds of the NLSY79 included a Spanish version of the questionnaire, but, because the number of respondents who speak only Spanish has continued to decline, it no longer is cost-effective to continue programming a computerized Spanish questionnaire. For that reason, Spanish questionnaires are not used starting with round 21 (2004) of the NLSY79. Advance letters and other informational materials still are available in Spanish, however.

Sensitive Topics. The NLSY79 has included questions on income and assets, religion, relationships with parents and other family members, sexual experiences, abortion, drug and alcohol use, criminal activities, homelessness, runaway episodes, and other topics that are potentially sensitive for respondents to discuss. Respondents are advised at the start of the interview that they can choose not to answer any questions that they prefer not to answer. During training, interviewers undergo exercises to teach them how to allay the concerns of respondents about answering sensitive questions and encourage them to respond.  Interviewers are instructed not to coerce respondents into answering questions that they prefer not to answer, however.

Most questions in the NLSY79 are read to the respondent by an interviewer; the respondent then provides an answer, and the interviewer records that answer on a laptop computer. For especially sensitive questions, however, some respondents might be reluctant to answer truthfully--or at all--if they have to tell an interviewer their answers, even though interviewers can face criminal and civil penalties if they disclose respondents' identities or answers to anyone not authorized to receive that information. For this reason, such questions are placed in self-administered sections of the interview, in which the respondent reads the questions and enters answers directly into the computer.

Guidelines for E-mailing Sample Members. At the end of each interview, respondents are asked to provide information that will help interviewers contact them during subsequent rounds of the surveys. In addition to the information collected about relatives, friends, or neighbors, interviewers also obtain the e-mail addresses of sample members who have them. During round 20 of the NLSY79 (conducted during 2002), the NLS contractors began using e-mail as a means to contact a small number of sample members who were hard to reach by other means. The following guidelines were enacted to ensure confidentiality:

  1. The name of the survey is not contained in the subject line or text of the e-mail message. Some respondents may share the use of an e-mail address with other household members, so the survey name is omitted from the message to prevent other household members from learning the specific name of the survey.
  2. E-mail is sent from one main address. Field interviewers are not permitted to use their individual e-mail accounts to contact respondents.

Respondents Knowing Respondents. One feature of the sample design in the NLSY79 is that there often are multiple respondents within the same original household, either siblings or, occasionally, spouses. It obviously is not possible in these cases to prevent family members from knowing that a relative is in the survey sample, but interviewers take steps to ensure that each respondent's answers remain private and are not revealed to other family members.

Consent from NLSY79 Respondents. Respondents are able to review the confidentiality and consent information presented in the advance letter. The respondent gives verbal consent to participate at the beginning of the interview.

Data Handling

An important part of maintaining respondent confidentiality is the careful handling and storage of data. Steps taken by BLS, CHRR, and NORC to ensure the confidentiality of all respondents to the NLSY79 include maintaining secure networks, restricting access to geographic variables, and topcoding income and asset values.

Network Security. Data stored and handled at each NLSY79 organization's site are protected by stringent security measures. During data collection, transmission, and storage, password protection and encryption secure the data. Standard network-security protocols are followed at each organization's site. Detailed information about these arrangements is not made public, to prevent anyone from circumventing the safeguards.

Restricting Access to Geographic Information. Geographic information about NLSY79 respondents is available only to researchers who are designated agents of BLS. These researchers must agree in writing to adhere to the BLS confidentiality policy, and their projects must further the mission of BLS and the NLSY79 program to conduct sound, legitimate research in the social sciences. Applicants must provide a clear statement of their research methodology and objectives and explain how the geographic variables are necessary to meet those objectives. More information about applying to use the restricted-use Geocode data is available on the BLS Restricted Data Access page.

Topcoding of Income and Asset Variables. Another step taken to ensure the confidentiality of NLSY79 respondents who have unusually high income and asset values is to "topcode" those values in NLSY79 data sets. Values that exceed a certain level are recoded so that they do not exceed the specified level. In each survey round, income and asset variables that include high values are identified for topcoding. For example, the wage and salary income variable usually is topcoded, but variables indicating the amount received from public assistance programs are not. Notes in the codebooks for topcoded income and asset variables provide more information about the exact calculations used to topcode each variable. For more information see the NLSY79 Documentation section.
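As a rough illustration, one common topcoding variant replaces every value above a cutoff with the mean of the values being masked, preserving the sample total while hiding individual extremes. The cutoff and calculation below are hypothetical; the exact method for each NLSY79 variable and round is documented in its codebook note.

```python
# Hypothetical topcoding sketch: values above a cutoff are replaced with the
# mean of the topcoded values. Actual NLSY79 cutoffs and calculations vary by
# variable and round; consult the codebook notes.

def topcode(values, cutoff):
    over = [v for v in values if v > cutoff]
    if not over:
        return list(values)
    replacement = sum(over) / len(over)   # mean of the values being masked
    return [replacement if v > cutoff else v for v in values]

incomes = [30_000, 45_000, 52_000, 150_000, 250_000]
print(topcode(incomes, 100_000))
# -> [30000, 45000, 52000, 200000.0, 200000.0]
```

A design note: averaging the masked values (rather than simply truncating them to the cutoff) keeps aggregate statistics such as mean income closer to their true values while still preventing identification of the highest earners.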

References

Baker, Paula C.; Mott, Frank L.; Keck, Canada K.; and Quinlan, Stephen V. NLSY79 Child Handbook: A Guide to the 1986-1990 NLSY79 Child Data. Columbus, OH: CHRR, The Ohio State University, 1993.

Center for Human Resource Research. "Technical Sampling Report Addendum: Standard Errors and Deft Factors for Rounds IV through XIV." Columbus, OH: CHRR, The Ohio State University, 1994.

Frankel, M.R.; Williams, H.A.; and Spencer, B.D. Technical Sampling Report, National Longitudinal Survey of Labor Force Behavior. Chicago: NORC, University of Chicago, 1983.

NORC. NLSY-National Longitudinal Survey of Labor Force Behavior Interviewer's Manual-Household Screening. Chicago: NORC, University of Chicago, 1978.

Olsen, Randall J. "The Effects of Computer Assisted Interviewing on Data Quality." Columbus, OH: CHRR, The Ohio State University, 1991.

Interview Methods

Interview Methods and Target Universe

During each survey round, NORC attempts to reach all youth within the active samples. No respondents have been routinely excluded from locating efforts with the exception of respondents who have died or, in a small number of cases, were judged to be extremely difficult. The permanent NLSY79 sample designated for interviewing during the 1979-84 interview years consisted of all civilian and military youth who were interviewed in the base year and who were alive at the survey date.

In 1985, when interviewing of the full military sample ceased, the total NLSY79 sample size dropped from 12,686 to 11,607. Retained for interviewing in post-1984 surveys were 201 military respondents randomly selected from the entire military sample of 1,280; the remaining 1,079 military respondents were eliminated from the sample. The 201 military members who were retained included (1) 51 cases that would have been selected as part of a random sample of youth including the military and (2) 150 additional cases selected to provide a sufficient number of original military sample members to avoid overly large sampling variability for the military sample. Beginning in 1991, the 1,643 members of the economically disadvantaged, nonblack/non-Hispanic supplemental sample were no longer interviewed. Unless otherwise noted, eligible sample sizes reported in NLS publications include deceased and difficult-to-field respondents but exclude those respondents dropped from the sample. Additional information on numbers and characteristics of noninterviewed respondents can be found in the Reasons for Noninterview section.
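Combining the permanent sample drops described above with the round 29 (2020) figures reported earlier in this document, the eligibility and rate arithmetic can be checked directly. This sketch assumes all 1,185 deceased respondents remain within the eligible pool, which reproduces the published rates.

```python
# Sample-size arithmetic from the text: base-year sample, permanent drops,
# and round 29 (2020) retention and response rates.
base_1979 = 12_686
dropped_military_1985 = 1_079        # 1,280 military minus the 201 retained
dropped_disadvantaged_1991 = 1_643   # econ. disadvantaged nonblack/non-Hispanic sample

eligible = base_1979 - dropped_military_1985 - dropped_disadvantaged_1991
print(eligible)                      # 9964, matching the reported eligible pool

interviewed_2020 = 6_535
deceased_2020 = 1_185
retention = 100 * interviewed_2020 / eligible
response = 100 * interviewed_2020 / (eligible - deceased_2020)
print(round(retention, 1), round(response, 1))   # 65.6 74.4
```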

NLSY79 respondents reside in each of the 50 States and the District of Columbia, in U.S. territories, and in other countries. Prior to fielding, respondents receive a short, informative advance letter reminding them of the upcoming interview and confirming their current address and phone number. As of the 2006 round, all main youth respondents are instructed to call and set up an appointment for their interview.

Field staff locating efforts begin when there is no contact from respondents. Using locator sheets, interviewers at the local level are responsible for contacting all respondents in their caseloads and for tapping additional local resources (post offices, departments of motor vehicles and vital statistics, and so forth) to locate those respondents who have moved. If an interviewer is unsuccessful in locating a respondent, the case is transferred to the field manager who undertakes additional locating strategies.

In the event that such local-level efforts fail, the case is forwarded to NORC's locating shop where the complete files on each respondent can be accessed and used for additional locating efforts. Respondents who cannot be located are only a small percentage of the total not interviewed in a given survey year. (For more information about noninterviews, refer to the Reasons for Noninterview section.)

In addition to its comprehensive locating efforts, NORC makes every effort to convert initial respondent refusals to completed interviews. Uncooperative respondents receive "refusal conversion letters" and a wide array of materials designed to encourage continued participation in the survey. Extensive locating methods and a strong conversion strategy, combined with close monitoring of response rates for each of the subsamples of the NLSY79, have resulted in relatively high retention rates for a longitudinal panel of this duration. 

In early rounds, telephone interviews occurred under certain circumstances: in 1987, when funding restrictions limited in-person interviews, or in any year when the respondent resided in a remote area or field staff determined that phone contact was the preferred method of interviewing a respondent. Over the years, respondents have become more dispersed or have expressed a preference for phone interviews. In response, the number of telephone interviews increased greatly beginning in 2002, and the telephone is now the main mode of interviewing. The percentage of surveys conducted by telephone in each survey year is shown in Table 1.

Table 1. Percent of NLSY79 Interviews Conducted by Telephone, 1979-2020
Year Number of Phone Interviews Percent of Total Interviews
1979 548 4.4
1980 648 5.3
1981 654 5.4
1982 1054 8.7
1983 324 2.6
1984 646 5.3
1985 953 8.7
1986 929 8.7
1987 8998 85.8
1988 920 8.8
1989 1518 14.3
1990 1317 12.6
1991 1241 13.8
1992 1164 12.9
1993¹ - -
1994¹ - -
1996 1042 12.1
1998 2069 24.6
2000 2613 32.5
2002 5407 70.0
2004 6497 82.8
2006 6542 85.5
2008 6875 88.8
2010 6799 90.1
2012 6697 91.8
2014 6861 97.1
2016 6646 96.3
2018 6592 95.8
2020 6407 98.0

¹ Questions identifying whether the interview was conducted by telephone were not included in the 1993 and 1994 surveys.

In rare cases, interviews are conducted in whole or in part with a proxy, a person other than the respondent (for example, four in 1991 and two in 1992). A variable entitled 'Interview Conducted with Proxy Respondent' identifies these interviews in the data. Before such an interview is conducted, individual approval must be obtained from the NORC central office and the circumstances documented.

A Spanish version of the NLSY79 is prepared and NORC employs bilingual, Spanish-speaking interviewers. During the 2014 interview, for example, 93 respondents requested a Spanish version of the questionnaire.

The average length of a personal interview is approximately one hour. The 1987 telephone interviews were completed within about 40 minutes, while the administration of the child assessments added approximately 45 minutes to the total survey administration time for each child. 

Until 1989, the NLSY79 was conducted using only paper-and-pencil interviews (PAPI). PAPI interviews were performed by interviewers filling in the relevant fields of large printed questionnaire booklets. While these booklets were inexpensive to produce, interviewers could make mistakes in following complicated skip patterns and filling in answers. Moreover, after all interviews were completed, additional office staff were needed to transcribe the information collected. Computer-assisted personal interviews (CAPI) were designed to eliminate many of these problems. With CAPI, interviewers take laptop computers into the field instead of questionnaire booklets. A computer program automatically selects the next question, prevents interviewers from entering illegal values, and warns interviewers about questionable answers. The computer also eliminates the need for data transcription except for specific items collected verbally and coded later.
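The automatic question selection and range checking described above can be sketched as follows. The question names, valid ranges, and skip rules here are invented for illustration and do not correspond to actual NLSY79 instrument items.

```python
# Minimal sketch of CAPI-style skip logic and range checks. The questions,
# ranges, and routing rules are invented for illustration only.

questions = {
    "HOURS_WORKED": {"range": (0, 168),
                     "next": lambda a: "JOB_SEARCH" if a == 0 else "WAGE"},
    "JOB_SEARCH":   {"range": (0, 1), "next": lambda a: None},
    "WAGE":         {"range": (0, 10_000), "next": lambda a: None},
}

def ask(qname, answer):
    """Validate an entered answer and return the next question to ask."""
    lo, hi = questions[qname]["range"]
    if not lo <= answer <= hi:
        # A real CAPI instrument would warn the interviewer and re-prompt.
        raise ValueError(f"{qname}: {answer} outside valid range {lo}-{hi}")
    return questions[qname]["next"](answer)   # the program selects the next item

print(ask("HOURS_WORKED", 40))   # routes to "WAGE"
print(ask("HOURS_WORKED", 0))    # skips to "JOB_SEARCH"
```

Encoding the skip pattern in the instrument itself is what removes the two main PAPI failure modes the text describes: interviewers can no longer follow the wrong branch, and out-of-range entries are caught at the moment of entry.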

While the majority of interviews in 1990 were collected using PAPI materials, about one fourth of respondents were administered the survey using CAPI methods in order to test the viability and reliability of CAPI administration. Because of the success of these experiments, the NLSY79 became fully CAPI administered beginning in 1993. Users interested in the results of these experiments should consult Olsen (1991).

Interview Schedule and Fielding Periods

The original interview schedule, which called for yearly personal interviews with NLSY79 respondents, was retained from 1979 through 1986. In 1987, budget constraints dictated a limited phone interview rather than a personal interview. Personal interviews resumed with the 1988 round and continued yearly through 1994. Since 1994, NLSY79 respondents have been interviewed biennially in even-numbered years.

The initial NLSY79 interviews were conducted between late January and mid-August 1979. The next several interviews were fielded in the first six months of the year; more recently surveys have typically begun in winter and ended the following winter. Table 2 provides information on the fielding periods for NLSY79 respondents.

Note: There is no timing requirement that interviews take place on the same day/month from round to round. Thus, a respondent interviewed at the beginning of one field period and then interviewed at the latter end of the next field period would have more weeks between interviews than someone who was interviewed the same time both survey years. Also, some respondents, for various reasons, may miss multiple survey rounds before they are interviewed again. This means that the number of weeks since respondents' last interview can vary greatly. The LINT-DATE variables give the date last interviewed.

Table 2. NLSY79 Fielding Periods
Survey Year(s) Fielding Period
1979-80 January-August
1981-82 January-July
1983-85 January-June
1986 February-July
1987 March-October
1988-91 June-December
1992 May-December
1993 June-November
1994 June-December
1996 April-October
1998 March-September
2000 April 2000-January 2001
2002 January-December
2004 January 2004-February 2005
2006 January 2006-March 2007
2008 January 2008-March 2009
2010 December 2009-February 2011
2012 September 2012-September 2013
2014 October 2014-October 2015
2016 October 2016-November 2017
2018  September 2018 - November 2019
2020 September 2020 - December 2021

From 1979 until 1986, timing of the fielding period was designed to allow all respondents still in school to be interviewed before they left to take temporary summer jobs. Detailed information was collected for jobs held by respondents while they were in school. Since the youngest respondents in the survey were 23 years old in 1988, the shift in fielding periods after 1987 had a relatively small impact on information on jobs held while in school. An attempt was made during the initial survey years to keep the fielding period for an individual respondent approximately the same from year to year in order to assure that the time between interviews was approximately twelve months.

Researchers conducting analyses on topics where time periods are critical should carefully examine the reference period of the questions, the actual interview date, and the duration since the preceding interview.
