Language Arts, Literacy and Learning

Please reference specific information from the attached readings to provide an evidence base for your reasoning and informed perspective.

1. Why do we need to pay better attention to secondary students’ language arts abilities and to revising our school curricula and instruction to include them?


2. How can we do a better job preparing students for success in the middle and high school grades in writing and language arts? 

Please ensure the work is original and of high quality.

NCES 2014-391  U.S. DEPARTMENT OF EDUCATION

Public High School Four-Year On-Time Graduation Rates and Event Dropout Rates: School Years 2010–11 and 2011–12

First Look

April 2014

Marie C. Stetser
Robert Stillwell
National Center for Education Statistics
U.S. Department of Education

NCES 2014-391  U.S. DEPARTMENT OF EDUCATION

U.S. Department of Education
Arne Duncan
Secretary

Institute of Education Sciences
John Q. Easton
Director

National Center for Education Statistics
John Q. Easton
Acting Commissioner

Administrative Data Division
Ross Santy
Associate Commissioner

The National Center for Education Statistics (NCES) is the primary federal entity for collecting, analyzing, and
reporting data related to education in the United States and other nations. It fulfills a congressional mandate to
collect, collate, analyze, and report full and complete statistics on the condition of education in the United States;
conduct and publish reports and specialized analyses of the meaning and significance of such statistics; assist state
and local education agencies in improving their statistical systems; and review and report on education activities in
foreign countries.

NCES activities are designed to address high-priority education data needs; provide consistent, reliable, complete,
and accurate indicators of education status and trends; and report timely, useful, and high-quality data to the U.S.
Department of Education, the Congress, the states, other education policymakers, practitioners, data users, and the
general public. Unless specifically noted, all information contained herein is in the public domain.

We strive to make our products available in a variety of formats and in language that is appropriate to a variety of
audiences. You, as our customer, are the best judge of our success in communicating information effectively. If you
have any comments or suggestions about this or any other NCES product or report, we would like to hear from you.
Please direct your comments to

NCES, IES, U.S. Department of Education
1990 K Street NW
Washington, DC 20006-5651

April 2014

The NCES Home Page address is http://nces.ed.gov.
The NCES Publications and Products address is http://nces.ed.gov/pubsearch.

This publication is only available online. To download, view, and print the report as a PDF file, go to the NCES
Publications and Products address shown above.

Mention of trade names, commercial products, or organizations does not imply endorsement by the U.S.
Government.

Suggested Citation

Stetser, M., and Stillwell, R. (2014). Public High School Four-Year On-Time Graduation Rates and Event Dropout Rates:
School Years 2010–11 and 2011–12. First Look (NCES 2014-391). U.S. Department of Education. Washington, DC:
National Center for Education Statistics. Retrieved [date] from http://nces.ed.gov/pubsearch.

Content Contact

Marie Stetser
(202) 502-7356

marie.stetser@ed.gov

Acknowledgments

The authors would like to thank all of the Consolidated State Performance Report, Common Core of Data, and EDFacts Coordinators for the 50 states, the District of Columbia, and the other jurisdictions that reported these data. The Department of Education is grateful for these Coordinators’ efforts and for the support of the state education agency or the jurisdictional agency in which they work.

Contents

Acknowledgments
List of Tables
Introduction
Selected Findings
References and Related Data Files
Tables
Appendix A: Collection Methodology and Sources of Error
Appendix B: Detailed Methodology for Calculation of Four-Year On-Time Graduation Rates and Event Dropout Rates

List of Tables

Table 1. Public high school 4-year adjusted cohort graduation rate (ACGR), by race/ethnicity and selected demographics for the United States, the 50 states, the District of Columbia, and other jurisdictions: School year 2010–11

Table 2. Public high school 4-year adjusted cohort graduation rate (ACGR), by race/ethnicity and selected demographics for the United States, the 50 states, the District of Columbia, and other jurisdictions: School year 2011–12

Table 3. Public high school averaged freshman graduation rate (AFGR), by gender and race/ethnicity for the United States, the 50 states, the District of Columbia, and other jurisdictions: School year 2010–11

Table 4. Public high school averaged freshman graduation rate (AFGR), by gender and race/ethnicity for the United States, the 50 states, the District of Columbia, and other jurisdictions: School year 2011–12

Table 5. Public high school event dropout rate for the United States, the 50 states, the District of Columbia, and other jurisdictions: School years 2010–11 and 2011–12

Introduction

This National Center for Education Statistics (NCES) First Look report introduces new data for two separate measures of 4-year on-time graduation rates as well as event dropout rates for school year (SY) 2010–11 and SY 2011–12. Specifically, this report provides the following:

• Four-year adjusted cohort graduation rate (ACGR)1 data reported by state or jurisdiction and, for the first time, a national estimated 4-year cohort graduation rate;

• Averaged freshman graduation rate (AFGR) data by state or jurisdiction and a national estimated AFGR; and

• High school event dropout rate data by state or jurisdiction and a national estimated event dropout rate.

Both the AFGR and ACGR are 4-year on-time graduation rates that measure the percentage of students who successfully complete high school in 4 years with a regular high school diploma.2 Event dropout rates measure the percentage of students who drop out in a single year. The tables
    in this report present descriptive information for the United States and for individual states and
    jurisdictions. The findings chosen for this report provide only a few examples of how the graduation
    and dropout data may be used. Compared to other measures of graduation rates, the ACGR is
    considered the most accurate measure available for reporting on-time graduation rates (Seastrom et al.
    2006b). A 4-year ACGR is defined as the number of students who graduate in 4 years with a regular
    high school diploma divided by the number of students who form the adjusted cohort for that
    graduating class. The term “adjusted cohort” means the students who enter grade 9 plus any students
    who transfer into the cohort in grades 9–12 minus any students who are removed from the cohort
    because they transferred out, moved out of the country, or were deceased (34 C.F.R. § 200.19). For a
    more detailed discussion of how ACGR is calculated for a specific school year, see appendix B.
    The AFGR is a proxy indicator for a cohort rate such as ACGR that utilizes aggregated counts of
    students by grade and the overall diploma count, as opposed to individual student-level data, to
    estimate an on-time graduation rate. The AFGR estimate is not as accurate as the ACGR; however, the
    AFGR can be estimated annually as far back as the 1960s using comparable aggregate data.
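For concreteness, the AFGR's aggregate construction can be sketched in a few lines. This follows the averaged-freshman-class formulation described in the Seastrom et al. user's guides (the estimated freshman class is the average of grade 8 enrollment five years before graduation, grade 9 four years before, and grade 10 three years before); the counts below are hypothetical, and appendix B gives the authoritative definition.

```python
def afgr(diplomas: int, grade8: int, grade9: int, grade10: int) -> float:
    """Averaged freshman graduation rate, in percent.

    diplomas: regular diplomas awarded in year t
    grade8:   grade 8 enrollment in year t-5
    grade9:   grade 9 enrollment in year t-4
    grade10:  grade 10 enrollment in year t-3
    """
    estimated_freshman_class = (grade8 + grade9 + grade10) / 3
    return 100 * diplomas / estimated_freshman_class

# Hypothetical counts: 800 diplomas against an averaged freshman class of 1,000.
print(round(afgr(800, 1020, 1010, 970)))  # 80
```

Because the denominator is built from aggregate enrollment counts rather than a tracked student-level cohort, the estimate absorbs migration and retention noise, which is why the report treats it as less accurate than the ACGR.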
Both graduation rates represent the percentage of students who successfully complete high school in 4 years with a regular high school diploma. They do not represent the percentage of all students who earn a high school credential. This distinction is important because a number of student groups are (1) not considered dropouts and (2) not considered on-time completers. For example:


• Some students may have been held back one or more grades in high school but do, in the end, successfully receive a regular high school diploma.

• Many students complete high school with an alternative credential. Sometimes a student with an Individualized Education Plan (IEP) may receive alternative credentials indicating the completion of their IEP and high school experience. Other students may leave high school having successfully achieved a high school equivalency diploma or other alternative credential.

• Other students, who are dually enrolled in both high school and postsecondary school, take more than 4 years to graduate due to the increased requirements. These students often receive both a regular high school diploma and an associate’s degree upon completion.

1 The ACGR is referred to in regulations, which amended 34 C.F.R. § 200.19, as the Four-Year Adjusted Cohort Graduation Rate.
2 Under 34 C.F.R. § 200.19(b)(1)(iv), a “regular high school diploma” means the standard high school diploma awarded to students in a state that is fully aligned with the state’s academic content standards and does not include a high school equivalency credential, certificate of attendance, or any alternative award. The term “regular high school diploma” also includes a “higher diploma” that is awarded to students who complete requirements above and beyond what is required for a regular diploma.

Because the definition of on-time graduation considered in this report is based on a 4-year high school experience resulting in the receipt of a regular high school diploma, the students described in the preceding bullets, while counted within the cohort or enrollment base, are neither dropouts nor on-time completers.
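The adjusted-cohort arithmetic defined above can be sketched as follows (a minimal illustration with hypothetical counts; held-back students and alternative-credential recipients stay in the denominator but never enter the numerator):

```python
def acgr(first_time_9th: int, transfers_in: int, transfers_out: int,
         emigrated: int, deceased: int, on_time_diplomas: int) -> float:
    """Four-year adjusted cohort graduation rate, in percent.

    Mirrors the 34 C.F.R. § 200.19 definition: the adjusted cohort is
    first-time 9th graders plus transfers in, minus students who
    transferred out, moved abroad, or are deceased.
    """
    adjusted_cohort = (first_time_9th + transfers_in
                       - transfers_out - emigrated - deceased)
    return 100 * on_time_diplomas / adjusted_cohort

# Hypothetical cohort: 500 first-time 9th graders, 40 transfer in,
# 30 transfer out, 8 move abroad, 2 are deceased; 395 earn a regular
# diploma within 4 years (adjusted cohort = 500).
print(round(acgr(500, 40, 30, 8, 2, 395)))  # 79
```

Note that only removals for the enumerated reasons shrink the cohort; a student who simply stops attending remains in the denominator, which is what makes the ACGR a strict on-time measure.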
    The 4-year on-time graduation rates presented in this report should not be confused with related rates
    intended to study different topics. For example, NCES also publishes completion rates calculated from
    household survey data collected by the Census Bureau. Completion rates indicate the percentage of the
    population, typically in a specified age range, holding high school credentials in general. They are not
    sensitive to how long a person might have taken to earn the credential, or to where the credential was
    earned. Some completion rates also include those earning alternative credentials that represent high
    school equivalency. Many students counted as “completers” for the calculation of a completion rate
    might not qualify as on-time graduates in the ACGR or AFGR. Additionally, the inverse of the ACGR
    or AFGR should not be confused with a dropout rate. Counts of students who have not graduated on
    time with a regular high school diploma do include dropouts, but also include those who will earn a
    regular diploma in more than 4 years and those who have or will earn alternative credentials. It is for
    this reason that NCES also calculates and reports on measures in addition to high school completion,
    such as the event dropout rate included in this report.
    The high school event dropout rate indicates the proportion of students who were enrolled at some time
    during the school year and were expected to be enrolled in grades 9–12 in the following school year
    but were not enrolled by October 1 of the following school year. Students who have graduated,
    transferred to another school, died, moved to another country, or who are out of school due to illness
    are not considered dropouts. The event dropout rate is not comparable to other dropout rates released
    by the Department or elsewhere. Status dropout rates, for example, measure the percentage of a
    population that did not complete high school (e.g., some percentage of young adults aged 18–24
    dropped out of high school).
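The single-year event rate reduces to a simple ratio. A minimal sketch, assuming the denominator is the grade 9–12 enrollment base (appendix B specifies the exact base; the counts here are hypothetical):

```python
def event_dropout_rate(dropouts: int, enrollment_9_12: int) -> float:
    """Single-year (event) dropout rate, in percent.

    dropouts excludes students who graduated, transferred to another
    school, died, moved to another country, or are out due to illness.
    """
    return 100 * dropouts / enrollment_9_12

# Hypothetical: 33 dropouts among 1,000 enrolled students, the same
# scale as the national rate reported in table 5.
print(event_dropout_rate(33, 1000))  # 3.3
```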
    The calculated totals in this report, identified as “United States” totals in tabulations and “national”
    estimates in text, include data for only the 50 states and the District of Columbia and exclude data for
    other jurisdictions.
This First Look provides users with an opportunity to access SY 2010–11 provisional data that have been fully reviewed and edited, and SY 2011–12 preliminary data that have been subjected to a limited data review and editing.3 Neither set of data has been available publicly prior to the release of this report. The data used in this report were collected as part of the U.S. Department of Education’s EDFacts Initiative. NCES uses these data to report, analyze, and disseminate statistical data that describe public elementary/secondary education. SEAs submit aggregate counts of students used to calculate the dropout and graduation rates, or actual rates in the case of the ACGR. The rates included in this report are reported in whole-number percentages or percentage-point ranges to prevent any potential disclosure of individual student data.

3 NCES has begun implementing a data release methodology based upon three stages of data review: Preliminary, Provisional, and Final. Preliminary release data may only include data initially reported by a state education agency (SEA), which have undergone cursory review and minimal editing. Preliminary data may be less complete due to late reporting or data quality concerns. Provisional release data have undergone a complete review and been subjected to NCES data quality control procedures. The preliminary SY 2011–12 data in this report will undergo further review, and a revised provisional file will be released later in 2014. Additionally, NCES expects to release final SY 2010–11 data that include any final updates reported by SEAs prior to the closing of the SY 2010–11 data collection.
    More detailed explanations of the definitions and methodology used to calculate these rates can be
    found in Appendix A: Collection Methodology and Sources of Error and Appendix B: Detailed
    Methodology for Calculation of Four-Year On-Time Graduation Rates and Event Dropout Rates.


    Selected Findings






    For SY 2010–11, the estimated national4 4-year ACGR for public high school students was 79
    percent (table 1), and for SY 2011–12 it was 80 percent (table 2). This indicates that nearly 4 out of
    5 students receive a regular high school diploma within 4 years of starting 9th grade for the first
    time.

For SY 2010–11, American Indian/Alaska Native, Black, and Hispanic students had 4-year ACGRs below the national average at 65, 67, and 71 percent, respectively.5 White students and Asian/Pacific Islander students had ACGRs above the national average at 84 and 87 percent, respectively. Economically disadvantaged students, students with limited English proficiency, and students with disabilities all had ACGRs below the national average for all students at 70, 57, and 59 percent, respectively (table 1).

For SY 2011–12, American Indian/Alaska Native, Black, and Hispanic students had 4-year ACGRs below the national average at 67, 69, and 73 percent, respectively. White students and Asian/Pacific Islander students had 4-year ACGRs above the national average at 86 and 88 percent, respectively. Economically disadvantaged students, students with limited English proficiency, and students with disabilities all had 4-year ACGRs below the national average for all students at 72, 59, and 61 percent, respectively (table 2).

    The national AFGR (a less precise estimate of an on-time graduation rate than the ACGR) tracked
    slightly above the ACGR estimates with a SY 2010–11 rate of 80 percent and a SY 2011–12 rate of
    81 percent (tables 3 and 4). Like the ACGR, AFGR estimates for American Indian/Alaska Native,
    Black, and Hispanic students were lower than the national average while White and Asian/Pacific
    Islander rates were higher in both SY 2010–11 and SY 2011–12.

    In both SY 2010–11 and SY 2011–12, the AFGR for female students exceeded the graduation rate
    for male students by 7 percentage points. That is, 84 percent for females vs. 77 percent for males in
    SY 2010–11 and 85 percent for females vs. 78 percent for males in SY 2011–12 (tables 3 and 4).6

    The public high school event dropout rate for the United States remained constant at 3.3 percent for
    both SY 2010–11 and SY 2011–12 (table 5). In SY 2010–11, twenty-four states, the District of
    Columbia, and the U.S. Virgin Islands had an event dropout rate that exceeded the national dropout
    rate. Twenty-four states and Puerto Rico had an event dropout rate that was below the national
    dropout rate. In SY 2011–12, twenty states, the District of Columbia, and the U.S. Virgin Islands
    had an event dropout rate that exceeded the national dropout rate. Thirty states and Puerto Rico had
    an event dropout rate that was below the national dropout rate.

    4 Estimates referenced as “national” include only the 50 U.S. states and the District of Columbia. For the purpose of comparison, Puerto
    Rico and the U.S. Virgin Islands are compared to the “national” dropout rate in bullet six but were not included in the calculation of that
    rate.
    5 Black includes African American, Hispanic includes Latino, Asian/Pacific Islander includes Native Hawaiian or Other Pacific Islander,
    and American Indian includes Alaska Native. Race categories exclude Hispanic origin unless specified.
    6 The ACGR is not collected by gender in the Consolidated State Performance Report.


    References and Related Data Files

    References
Final Guidance on Maintaining, Collecting, and Reporting Racial and Ethnic Data to the U.S. Department of Education, 72 Fed. Reg. 59266-59279 (October 19, 2007); http://www2.ed.gov/legislation/FedRegister/other/2007-4/101907c.html.

    Seastrom, M., Chapman, C., Stillwell, R., McGrath, D., Peltola, P., Dinkes, R., and Xu, Z. (2006a).
    User’s Guide to Computing High School Graduation Rates, Volume 1: Review of Current and
    Proposed Graduation Indicators (NCES 2006-604). National Center for Education Statistics,
    Institute of Education Sciences, U.S. Department of Education. Washington, DC. Retrieved
    January 27, 2014 from http://nces.ed.gov/pubs2006/2006604 .

    Seastrom, M., Chapman, C., Stillwell, R., McGrath, D., Peltola, P., Dinkes, R., and Xu, Z. (2006b).
    User’s Guide to Computing High School Graduation Rates, Volume 2: Technical Evaluation of
    Proxy Graduation Indicators (NCES 2006-605). National Center for Education Statistics,
    Institute of Education Sciences, U.S. Department of Education. Washington, DC. Retrieved
    January 27, 2014 from http://nces.ed.gov/pubs2006/2006605 .

    The EDFacts Initiatives, retrieved January 27, 2014, from
    http://www2.ed.gov/about/inits/ed/edfacts/index.html.

    Title I—Improving the Academic Achievement of the Disadvantaged; Final Rule, 34 C.F.R. § 200
    (2008); http://www2.ed.gov/legislation/FedRegister/finrule/2008-4/102908a .

    Title 34—Education, Other Academic Indicators, 34 C.F.R. § 200.19 (2009);
    http://www.gpo.gov/fdsys/granule/CFR-2009-title34-vol1/CFR-2009-title34-vol1-sec200-
    19/content-detail.html.

    U.S. Department of Education. (November 2012). Four-Year Regulatory Adjusted Cohort Graduation
    Rate School Year 2010–11. Provisional Release: Data Notes [Press release]. Retrieved January
    27, 2014 from http://www2.ed.gov/documents/press-releases/adjusted-cohort-graduation-
    rate .

    U.S. Department of Education (December 22, 2008). High School Graduation Rate: Non-Regulatory
    Guidance. Retrieved January 27, 2014 from
    http://www2.ed.gov/policy/elsec/guid/hsgrguidance .

    U.S. Department of Education. (November 2012). Provisional Data File: School Year 2010–11 Four-
    Year Regulatory Adjusted Cohort Graduation Rates [Press release]. Retrieved January 27, 2014
    from http://www2.ed.gov/documents/press-releases/state-2010-11-graduation-rate-data .

Winglee, M., Marker, D., Henderson, A., Aronstamm Young, B., and Hoffman, L. (2000). A Recommended Approach to Providing High School Dropout and Completion Rates at the State Level (NCES 2000-305). National Center for Education Statistics, U.S. Department of Education. Washington, DC. Retrieved January 27, 2014 from http://nces.ed.gov/pubs2000/2000305.



    Tables


    Table 1. Public high school 4-year adjusted cohort graduation rate (ACGR), by race/ethnicity and selected demographics
    for the United States, the 50 states, the District of Columbia, and other jurisdictions: School year 2010–11

Columns: State, then percent of students for Total, American Indian/Alaska Native, Asian/Pacific Islander, Hispanic, Black, White, Economically disadvantaged, Limited English proficiency, and Students with disabilities.

United States1 79 65 87 71 67 84 70 57 59

    Alabama 72 80 77 66 63 78 62 36 30
    Alaska 68 51 74 62 63 75 56 41 40
    Arizona 78 62 87 72 74 85 73 25 67
    Arkansas 81 85 75 77 73 84 75 76 75
    California 76 68 89 70 63 85 70 60 59
    Colorado 74 52 81 60 65 81 62 53 53
    Connecticut 83 72 92 64 71 89 63 59 62
    Delaware 78 77 90 71 73 82 71 65 56
    District of Columbia 59 <> <> 55 58 85 58 53 39
    Florida 71 70 86 70 59 76 60 53 44
    Georgia 67 68 79 58 60 76 59 32 30
    Hawaii 80 60 81 79 77 78 75 60 59
    Idaho2 — — — — — — — — —
    Illinois 84 78 92 77 74 89 75 68 66
    Indiana 86 76 88 81 75 88 79 73 65
    Iowa 88 79 88 75 73 90 78 70 70
    Kansas 83 72 88 73 72 86 73 70 73
    Kentucky2 — — — — — — — — —
    Louisiana 71 71 83 70 64 77 64 43 29
    Maine 84 82 90 87 77 84 73 78 66
    Maryland 83 74 93 72 76 89 74 54 57
    Massachusetts 83 76 88 62 71 89 70 56 66
    Michigan 74 62 85 63 57 80 63 61 52
    Minnesota 77 42 72 51 49 84 58 52 56
    Mississippi 75 71 90 79 69 82 70 54 32
    Missouri 81 78 87 75 67 86 75 62 69
    Montana 82 63 88 78 81 85 71 57 69
    Nebraska 86 64 83 74 70 90 78 52 70
    Nevada 62 52 74 53 43 71 53 29 23
    New Hampshire 86 78 87 73 73 87 72 73 69
    New Jersey 83 87 93 73 69 90 71 68 73
    New Mexico 63 56 77 59 60 73 56 56 47
    New York 77 64 86 63 64 86 69 46 48
    North Carolina 78 70 87 69 71 83 71 48 57
    North Dakota 86 62 88 76 74 90 76 61 67
    Ohio 80 71 88 66 59 85 65 53 67
    Oklahoma2 — — — — — — — — —
    Oregon 68 52 78 58 54 70 61 52 42
    Pennsylvania 83 77 88 65 65 88 71 63 71
    Rhode Island 77 66 75 67 67 82 66 68 58

    See notes at end of table.


    Table 1. Public high school 4-year adjusted cohort graduation rate (ACGR), by race/ethnicity and selected demographics
    for the United States, the 50 states, the District of Columbia, and other jurisdictions: School year 2010–11—
    Continued

Columns: State, then percent of students for Total, American Indian/Alaska Native, Asian/Pacific Islander, Hispanic, Black, White, Economically disadvantaged, Limited English proficiency, and Students with disabilities.

    South Carolina 74 67 84 69 70 77 67 62 39
    South Dakota 83 47 84 67 67 89 67 60 64
    Tennessee 86 88 91 79 78 89 80 71 67
    Texas 86 87 95 82 81 92 84 58 77
    Utah 76 57 70 57 61 80 65 45 59
    Vermont 87 ≥80 93 84 84 88 77 82 69
    Virginia 82 82 96 71 73 86 70 55 47
    Washington 76 57 81 63 65 79 66 51 56
    West Virginia 78 <> 88 81 73 78 69 84 60
    Wisconsin 87 75 89 72 64 91 74 66 67
    Wyoming 80 51 87 74 58 82 66 62 57

    Bureau of Indian Education and Puerto Rico

Bureau of Indian Education 61 61 † † † † 61 51 56
    Puerto Rico3 — — — — — — — — —
    — Not available.
    † Not applicable. No students reported for this category in the cohort.
    <> Data were suppressed to protect the confidentiality of individual student data.
    ≥ Greater than or equal to. The estimate has been top coded to protect the confidentiality of individual student data.
    1The United States 4-year ACGR was estimated using both the reported 4-year ACGR data from 47 states and the
    District of Columbia and using imputed data for Idaho, Kentucky, and Oklahoma. The Bureau of Indian Education and
    Puerto Rico were not included in the United States 4-year ACGR estimate.
    2The Department of Education’s Office of Elementary and Secondary Education approved a timeline extension for these
    states to begin reporting 4-year ACGR data, resulting in the 4-year ACGR not being available for these states in SY 2010–
    11.
    3The Department of Education’s Office of Elementary and Secondary Education approved an exception for Puerto Rico to
    report 3-year ACGR data instead of 4-year ACGR data for SY 2010–11.
    NOTE: Reported rates are presented rounded to the whole percentage point where the related population size is greater
    than 300. Estimates have been top coded to protect the confidentiality of individual student data. Top coding is a process
    where rates at or above a specific level are reported in a range, rather than a precise percentage, to protect the privacy of
    individuals represented either within the reported rate or its inverse. Based on the population size, top coded estimates are
    presented as being greater than or equal to a certain percent. For example, a rate of 94 percent may be presented as “≥90”
    percent for one population and “≥80” percent for another, dependent on total population size. Black includes African
    American, Hispanic includes Latino, Asian/Pacific Islander includes Native Hawaiian or Other Pacific Islander, and
    American Indian includes Alaska Native. Race categories exclude Hispanic origin unless specified.
    SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), “NCES
    Common Core of Data State Dropout and Graduation Rate Data file,” School Year 2010–11, Provisional Version 1a.
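The top-coding rule described in the table note can be illustrated with a toy sketch. The report does not publish the exact population-size thresholds NCES uses, so the cutoff and band floors below (and the function name) are hypothetical stand-ins:

```python
def top_code(rate: float, population: int, small_pop_cutoff: int = 300) -> str:
    """Report a high rate as a '≥' range instead of an exact percentage.

    Illustration only: the real NCES thresholds are not published in
    this report, so the 90/80 band floors and the population cutoff
    are hypothetical stand-ins.
    """
    if rate < 90:
        return str(round(rate))
    # Smaller populations get a wider (less disclosive) band.
    band_floor = 80 if population < small_pop_cutoff else 90
    return f"≥{band_floor}"

# A rate of 94 percent may appear as "≥90" for a large population and
# "≥80" for a small one, mirroring the example in the note.
print(top_code(94, 5000))  # ≥90
print(top_code(94, 150))   # ≥80
```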


    Table 2. Public high school 4-year adjusted cohort graduation rate (ACGR), by race/ethnicity and selected demographics
    for the United States, the 50 states, the District of Columbia, and other jurisdictions: School year 2011–12

Columns: State, then percent of students for Total, American Indian/Alaska Native, Asian/Pacific Islander, Hispanic, Black, White, Economically disadvantaged, Limited English proficiency, and Students with disabilities.

United States1 80 67 88 73 69 86 72 59 61

    Alabama 75 84 85 69 67 81 66 36 54
    Alaska 70 54 76 70 61 76 59 47 46
    Arizona 76 63 84 70 71 84 71 24 65
    Arkansas 84 78 84 78 78 87 79 77 79
    California 78 72 90 73 66 86 73 62 61
    Colorado 75 58 82 62 66 82 61 53 54
    Connecticut 85 84 92 69 73 91 71 63 64
    Delaware 80 71 93 74 74 83 72 71 57
    District of Columbia 59 <> 74 54 58 86 70 52 44
    Florida 75 70 89 73 64 80 65 57 48
    Georgia 70 67 82 60 62 78 61 44 35
    Hawaii 82 65 84 76 76 79 80 56 74
    Idaho2 — — — — — — — — —
    Illinois 82 79 93 76 68 89 73 66 69
    Indiana 86 78 89 80 73 89 85 78 71
    Iowa 89 73 89 77 74 91 80 74 73
    Kansas 85 78 86 77 75 88 76 74 77
    Kentucky2 — — — — — — — — —
    Louisiana 72 73 85 70 65 78 66 49 33
    Maine 85 72 89 80 72 86 76 74 70
    Maryland 84 79 93 73 77 90 75 55 57
    Massachusetts 85 70 89 66 73 90 72 61 69
    Michigan 76 66 87 64 60 82 64 63 54
    Minnesota 78 45 74 53 51 84 59 51 56
    Mississippi 75 71 90 79 69 82 70 54 32
    Missouri 86 87 90 80 73 89 79 67 73
    Montana 84 63 92 79 79 87 73 53 81
    Nebraska 88 67 83 78 74 91 80 64 72
    Nevada 63 54 74 54 48 72 58 23 24
    New Hampshire 86 73 86 74 76 87 73 68 70
    New Jersey 86 84 95 77 75 93 75 73 74
    New Mexico 70 65 84 68 69 77 65 66 56
    New York 77 63 86 63 63 87 68 44 48
    North Carolina 80 74 87 73 75 85 75 50 60
    North Dakota 87 63 86 73 76 90 74 68 68
    Ohio 81 65 90 68 61 86 68 62 68
    Oklahoma2 — — — — — — — — —
    Oregon 68 51 79 60 53 71 61 49 38
    Pennsylvania 84 74 89 68 68 89 74 64 70
    Rhode Island 77 58 79 67 67 82 66 69 59

    See notes at end of table.


    Table 2. Public high school 4-year adjusted cohort graduation rate (ACGR), by race/ethnicity and selected demographics
    for the United States, the 50 states, the District of Columbia, and other jurisdictions: School year 2011–12—
    Continued

Columns: State, then percent of students for Total, American Indian/Alaska Native, Asian/Pacific Islander, Hispanic, Black, White, Economically disadvantaged, Limited English proficiency, and Students with disabilities.

    South Carolina 75 71 85 69 71 78 68 64 40
    South Dakota 83 47 84 67 67 89 67 60 64
    Tennessee 87 88 91 80 79 91 82 72 73
    Texas 88 87 94 84 84 93 85 59 77
    Utah 80 64 78 66 64 83 70 51 64
    Vermont 88 ≥80 94 86 72 88 77 75 71
    Virginia 83 81 90 73 75 88 72 55 49
    Washington 77 59 82 67 67 80 66 54 58
    West Virginia 79 67 94 79 74 80 72 83 60
    Wisconsin 88 77 89 74 64 92 75 66 69
    Wyoming 79 50 86 67 66 82 65 56 59

    Bureau of Indian Education and Puerto Rico

Bureau of Indian Education 53 53 † † † † 53 56 48
    Puerto Rico3 — — — — — — — — —
    — Not available. Data were not reported and have not been imputed.
    † Not applicable. No students reported in the cohort.
    <> Data were suppressed to protect the confidentiality of individual student data.
    ≥ Greater than or equal to. The estimate has been top coded to protect the confidentiality of individual student data.
    1The United States 4-year ACGR was estimated using both the reported 4-year ACGR data from 47 states and the District
    of Columbia and using imputed data for Idaho, Kentucky, and Oklahoma. The Bureau of Indian Education and Puerto Rico
    were not included in the United States 4-year ACGR estimate.
    2The Department of Education’s Office of Elementary and Secondary Education approved a timeline extension for these
    states to begin reporting 4-year ACGR data, resulting in the 4-year ACGR not being available for these states in SY 2011–
    12.
    3The Department of Education’s Office of Elementary and Secondary Education approved an exception for Puerto Rico to
    report 3-year ACGR data instead of 4-year ACGR data for SY 2011–12.
    NOTE: Reported rates are presented rounded to the whole percentage point where the related population size is greater
    than 300. Estimates have been top coded to protect the confidentiality of individual student data. Top coding is a process
    where rates at or above a specific level are reported in a range, rather than a precise percentage, to protect the privacy of
    individuals represented either within the reported rate or its inverse. Based on the population size, top coded estimates are
    presented as being greater than or equal to a certain percent. For example, a rate of 94 percent may be presented as “≥90”
    percent for one population and “≥80” percent for another, dependent on total population size. Black includes African
    American, Hispanic includes Latino, Asian/Pacific Islander includes Native Hawaiian or Other Pacific Islander, and
    American Indian includes Alaska Native. Race categories exclude Hispanic origin unless specified.
    SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), “NCES
    Common Core of Data State Dropout and Graduation Rate Data file,” School Year 2011–12, Preliminary Version 1a.
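The top-coding process described in the note above can be illustrated with a short sketch. The population-size cutoffs used here are assumptions for illustration only; the note states that the threshold depends on population size but does not publish the actual disclosure rules.

```python
def top_code(rate, population):
    """Report a high rate as a '>=' range rather than an exact value.

    Sketch of the top coding described in the table note.  The population
    cutoffs below are illustrative assumptions, not NCES's actual rules.
    """
    threshold = 90 if population > 1000 else 80  # assumed size-dependent cutoffs
    if rate >= threshold:
        return f"≥{threshold}"  # e.g. a 94 percent rate becomes "≥90" or "≥80"
    return str(rate)
```

Under these assumed cutoffs, the same 94 percent rate would be reported as "≥90" for a large population and "≥80" for a small one, matching the example given in the note.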


    Table 3. Public high school averaged freshman graduation rate (AFGR), by gender and race/ethnicity for the United
    States, the 50 states, the District of Columbia, and other jurisdictions: School year 2010–11

State | Total | Female | Male | American Indian/Alaska Native | Asian/Pacific Islander | Hispanic | Black | White
(percent of students)
    United States 80 84 77 68 93 75 67 84

    Alabama 76 80 73 87 87 73 70 80
    Alaska 78 83 76 57 99 80 70 83
    Arizona 79 82 76 66 94 74 80 83
    Arkansas 77 80 74 77 98 79 70 78
    California 80 85 77 75 95 75 67 87
    Colorado 82 85 79 57 84 75 69 84
    Connecticut 85 88 82 ≥98 98 71 73 89
    Delaware 76 81 72 ≥90 94 73 69 79
    District of Columbia 61 65 56 <> <> 65 59 97
    Florida 72 78 70 94 92 75 63 75
    Georgia 70 76 67 ≥98 93 66 63 74
    Hawaii 74 76 71 48 73 56 80 53
    Idaho 83 86 80 75 93 77 67 83
    Illinois 80 84 80 97 97 74 63 87
    Indiana 80 86 78 87 ≥99 83 67 82
    Iowa 89 92 87 61 87 89 66 89
    Kansas 87 91 84 61 93 84 69 87
    Kentucky 81 85 79 63 ≥99 89 76 82
    Louisiana 71 77 65 66 98 90 64 76
    Maine 86 87 84 89 94 ≥98 78 85
    Maryland 84 89 79 76 96 85 75 86
    Massachusetts 85 89 84 68 97 68 77 89
    Michigan 75 80 71 59 92 51 57 80
    Minnesota 89 91 87 49 90 73 70 93
    Mississippi 69 74 63 55 93 68 65 72
    Missouri 85 88 82 88 98 88 73 86
    Montana 84 85 83 62 87 93 91 86
    Nebraska 90 93 87 58 97 88 58 91
    Nevada 59 64 54 40 68 50 40 64
    New Hampshire 87 89 84 79 ≥98 87 78 86
    New Jersey1 87 89 84 ≥98 97 78 73 91
    New Mexico 71 75 67 66 81 70 60 73
    New York 78 81 75 64 94 63 64 88
    North Carolina 77 82 74 71 84 74 68 81
    North Dakota 90 92 88 62 ≥95 78 ≥98 93
    Ohio 82 87 82 83 97 79 61 87
    Oklahoma 80 83 77 74 ≥99 74 67 82
    Oregon 78 85 77 62 82 80 65 78
    Pennsylvania 86 89 83 75 ≥99 73 70 90
    Rhode Island 77 82 72 56 70 70 66 79
    South Carolina 69 75 64 61 79 70 62 73
    South Dakota 82 84 79 43 ≥95 81 79 86
    Tennessee 81 85 78 ≥98 97 74 75 83



    Table 3. Public high school averaged freshman graduation rate (AFGR), by gender and race/ethnicity for the United
    States, the 50 states, the District of Columbia, and other jurisdictions: School year 2010–11—Continued

State | Total | Female | Male | American Indian/Alaska Native | Asian/Pacific Islander | Hispanic | Black | White
(percent of students)
    Texas 81 84 79 ≥99 95 80 71 83
    Utah 78 81 76 56 81 62 65 81
    Vermont 93 95 91 76 ≥98 ≥95 ≥98 91
    Virginia 83 89 80 87 99 91 70 84
    Washington 79 84 77 42 81 78 58 80
    West Virginia 78 80 76 55 97 81 74 78
    Wisconsin 92 95 90 70 ≥99 83 67 95
    Wyoming 80 83 78 47 86 82 53 81

    Department of Defense Education Activity (DoDEA), Bureau of Indian Education, and other jurisdictions

Bureau of Indian Education — — — — — — — —
    American Samoa — — — — — — — —
    DoDEA — — — — — — — —
    Guam — — — — — — — —
    Northern Marianas — — — — — — — —
    Puerto Rico 62 68 55 <> <> 61 <> ≥80
    Virgin Islands 68 78 58 <> <> 77 66 68
    — Not available. Data were not reported and have not been imputed.
    <> Data were suppressed to protect the confidentiality of individual student data.
    ≥ Greater than or equal to. The estimate has been top coded to protect the confidentiality of individual student data.
    1 Data are imputed. New Jersey did not report graduate data by gender.
    NOTE: Reported rates are presented rounded to the whole percentage point where the related population size is greater
    than 300. Estimates have been top coded to protect the confidentiality of individual student data. Top coding is a process
    where rates at or above a specific level are reported in a range, rather than a precise percentage, to protect the privacy of
    individuals represented either within the reported rate or its inverse. Based on the population size, top coded estimates are
    presented as being greater than or equal to a certain percent. For example, a rate of 94 percent may be presented as “≥90”
    percent for one population and “≥80” percent for another, dependent on total population size. United States total includes
    data from the 50 states and the District of Columbia. Black includes African American, Hispanic includes Latino,
    Asian/Pacific Islander includes Native Hawaiian or Other Pacific Islander, and American Indian includes Alaska Native.
    Race categories exclude Hispanic origin unless specified.
    SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), “NCES
    Common Core of Data State Dropout and Graduation Rate Data file,” School Year 2010–11, Provisional Version 1a.


    Table 4. Public high school averaged freshman graduation rate (AFGR), by gender and race/ethnicity, for the
    United States, the 50 states, the District of Columbia, and other jurisdictions: School year 2011–12

State | Total | Female | Male | American Indian/Alaska Native | Asian/Pacific Islander | Hispanic | Black | White
(percent of students)
    United States 81 85 78 68 93 76 68 85

    Alabama 75 79 72 87 89 67 68 80
    Alaska 79 82 78 62 98 84 75 83
    Arizona 77 81 73 67 89 72 73 82
    Arkansas 78 81 75 69 84 80 72 79
    California 82 86 78 77 96 77 70 88
    Colorado 82 86 79 57 87 76 65 84
    Connecticut 86 89 83 ≥98 95 74 73 90
    Delaware 77 82 72 89 96 70 69 81
District of Columbia 71 80 62 <> <> 59 70 98
    Florida 75 82 73 94 94 78 66 77
    Georgia 70 75 66 86 90 64 62 76
    Hawaii 78 81 75 65 76 68 77 56
    Idaho 84 86 82 67 96 83 78 83
    Illinois 82 85 83 91 98 79 64 89
    Indiana 80 87 78 80 ≥99 83 63 82
    Iowa 89 92 87 59 91 88 64 90
    Kansas 89 92 86 64 92 87 70 89
    Kentucky 82 86 80 72 ≥99 89 78 82
    Louisiana 72 78 66 68 98 87 65 76
    Maine 87 88 86 60 ≥98 97 83 86
    Maryland 84 89 81 70 96 85 74 87
    Massachusetts 86 89 85 70 98 69 82 90
    Michigan 77 82 74 66 92 51 60 83
    Minnesota 88 91 86 48 92 70 66 92
    Mississippi 68 74 61 44 85 68 63 72
    Missouri 86 89 83 98 98 92 73 87
    Montana 86 88 84 62 87 96 65 87
    Nebraska 93 95 91 68 97 93 65 93
    Nevada 60 65 55 37 71 50 41 64
    New Hampshire 87 90 84 65 ≥99 86 74 87
    New Jersey1 87 89 84 59 ≥99 78 74 91
    New Mexico 74 78 71 71 90 73 68 76
    New York 78 79 76 68 94 65 65 85
    North Carolina 79 83 76 74 88 78 68 82
    North Dakota 91 93 89 62 ≥95 82 ≥98 93
    Ohio 84 89 84 75 97 82 64 89
    Oklahoma 79 82 76 72 ≥99 78 66 80
    Oregon 78 85 77 58 87 78 65 78
    Pennsylvania 88 91 86 79 ≥99 76 75 92



    Table 4. Public high school averaged freshman graduation rate (AFGR), by gender and race/ethnicity, for the United
    States, the 50 states, the District of Columbia, and other jurisdictions: School year 2011–12—Continued

State | Total | Female | Male | American Indian/Alaska Native | Asian/Pacific Islander | Hispanic | Black | White
(percent of students)
    Rhode Island 76 80 72 52 74 72 66 76
    South Carolina 72 78 67 53 83 72 64 76
    South Dakota 83 85 82 42 ≥98 77 77 88
    Tennessee 83 86 81 94 94 ‡ 76 86
    Texas1 82 85 80 97 94 80 73 84
    Utah 78 80 76 58 87 65 60 80
    Vermont 93 95 91 ≥90 ≥98 ≥95 ≥98 91
    Virginia 84 90 81 82 96 92 71 85
    Washington 79 85 77 41 81 79 57 80
    West Virginia 80 82 78 69 ≥98 81 76 80
    Wisconsin 92 94 90 76 97 85 63 96
    Wyoming 80 82 78 44 79 77 58 82

    Department of Defense Education Activity (DoDEA), Bureau of Indian Education, and other jurisdictions

Bureau of Indian Education — — — — — — — —
    American Samoa — — — — — — — —
    DoDEA — — — — — — — —
    Guam — — — — — — — —
    Northern Marianas — — — — — — — —
    Puerto Rico 62 67 57 <> <> 62 <> ≥90
    Virgin Islands 72 81 63 <> <> 76 70 <>
    — Not available. Data were not reported and have not been imputed.
    ‡ Data were suppressed because the reported data did not meet NCES standards.
    <> Data were suppressed to protect the confidentiality of individual student data.
    ≥ Greater than or equal to. The estimate has been top coded to protect the confidentiality of individual student data.
    1 Data are imputed. New Jersey did not report graduate data by gender. Texas did not report any graduate data.
    NOTE: Reported rates are presented rounded to the whole percentage point where the related population size is greater
    than 300. Estimates have been top coded to protect the confidentiality of individual student data. Top coding is a process
    where rates at or above a specific level are reported in a range, rather than a precise percentage, to protect the privacy of
    individuals represented either within the reported rate or its inverse. Based on the population size, top coded estimates are
    presented as being greater than or equal to a certain percent. For example, a rate of 94 percent may be presented as “≥90”
    percent for one population and “≥80” percent for another, dependent on total population size. United States total includes
    only the 50 states and the District of Columbia. Black includes African American, Hispanic includes Latino, Asian/Pacific
    Islander includes Native Hawaiian or Other Pacific Islander, and American Indian includes Alaska Native. Race categories
    exclude Hispanic origin unless specified.
    SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), “NCES
    Common Core of Data State Dropout and Graduation Rate Data file,” School Year 2011–12, Preliminary Version 1a.


    Table 5. Public high school event dropout rate for the United States, the 50 states, the District of Columbia, and
    other jurisdictions: School years 2010–11 and 2011–12

State | 2010–11 | 2011–12
(percent of high school students)
    United States 3.3 3.3

Alabama 1.4 1.4¹
    Alaska 6.9 7.0
    Arizona 5.0 5.9
    Arkansas 3.5 3.2
    California 4.2 4.0
    Colorado 5.1 4.9
    Connecticut 1.9 2.1
    Delaware 3.6 3.5
    District of Columbia 6.1 5.8
    Florida 2.1 2.1
    Georgia 3.9 3.9
    Hawaii 5.1 4.7
    Idaho 1.6 1.9
    Illinois 2.9 2.4
    Indiana 1.8 2.1
    Iowa 3.4 3.2
    Kansas 2.3 2.1
    Kentucky 2.5 2.5
    Louisiana 3.9 5.7
    Maine 3.5 3.2
    Maryland 3.3 3.8
    Massachusetts 2.7 2.5
    Michigan 7.2 6.9
    Minnesota 1.8 1.9
    Mississippi 3.2 3.2
    Missouri 3.4 2.9
    Montana 4.3 4.1
    Nebraska 2.1 2.2
    Nevada 4.1 3.9
    New Hampshire 1.3 1.3
    New Jersey 1.4 1.4
    New Mexico 6.6 6.4
    New York 3.6 3.8
    North Carolina 3.9 3.1
    North Dakota 3.3 3.0
    Ohio 4.4 4.6


    Table 5. Public high school event dropout rate for the United States, the 50 states, the District of Columbia, and
    other jurisdictions: School years 2010–11 and 2011–12—Continued

State | 2010–11 | 2011–12
(percent of high school students)
    Oklahoma 2.5 2.5
    Oregon 3.2 3.4
    Pennsylvania 2.2 2.8
    Rhode Island 5.2 4.2
    South Carolina 2.8 2.5
    South Dakota 2.6 3.1
    Tennessee 3.6 3.7
    Texas 2.4 2.5
Utah 1.5 1.5¹
    Vermont 2.5 2.5
    Virginia 2.3 1.9
    Washington 4.0 3.8
    West Virginia 3.4 2.7
    Wisconsin 2.0 1.9
    Wyoming 5.4 4.3

    Department of Defense Education Activity (DoDEA), Bureau of Indian Education, and other jurisdictions

    Bureau of Indian Education — —

    American Samoa — —

    DoDEA — —

    Guam — —

    Northern Marianas — —

    Puerto Rico 1.2 1.8

    Virgin Islands 5.1 3.8
    — Not available. Data were not reported and have not been imputed.
¹ Data were imputed from prior year reported data.
    NOTE: Reported rates are presented rounded to the one decimal place to protect against the disclosure of individually
    identifiable information. United States total includes only the 50 states and the District of Columbia.
    SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), “NCES
    Common Core of Data State Dropout and Graduation Rate Data file,” School Year 2010–11, Provisional Version 1a and
    School Year 2011–12, Preliminary Version 1a.


    Appendix A: Collection Methodology and Sources of Error

    EDFacts Collection System
    EDFacts is a U.S. Department of Education initiative to centralize and coordinate the administrative
    data reported by state education agencies (SEAs) to the Department of Education for elementary and
    secondary public education. Program offices within the Department sponsor specific portions of the
    data reported in EDFacts to meet information requirements and support program monitoring and
    policy development. The purpose of EDFacts is to




• place the use of robust, timely performance data at the core of decision and policymaking in education;
• reduce state and district data burden and streamline data practices;
• improve state data capabilities by providing resources and technical assistance; and
• provide data for planning, policy, and management at the federal, state, and local levels.
    EDFacts provides the collection and processing systems that allow SEAs to report data annually for
    multiple elementary/secondary programs, such as the Common Core of Data and the Consolidated
    State Performance Report, through a series of data files that fall into different reporting schedules
    throughout each year. SEAs reported all the data elements used in this report for the adjusted cohort
    graduation rate (ACGR), averaged freshman graduation rate (AFGR), and the event dropout rate
    through the EDFacts Submission System.
    For more information on the EDFacts initiative, please visit the public website at
    http://www2.ed.gov/about/inits/ed/edfacts.

    Consolidated State Performance Report (CSPR)
    The CSPR collection is stewarded and monitored by the Department’s Office of Elementary and
    Secondary Education (OESE). The CSPR is the required annual reporting tool for each state, the
    District of Columbia, Puerto Rico, and the Bureau of Indian Education (BIE) as authorized under
    Section 9303 of the Elementary and Secondary Education Act (ESEA), as amended.
    Part I of the CSPR collects information required for the Annual State Report to the Secretary of
    Education, as described in section 1111(h)(4) of ESEA and data required under Homeless Collection
    (added in fiscal year 2005–06). Examples of data in Part I include: participation and performance on
    state assessments, participation and performance of English learners in language programs, highly
    qualified teachers, and homeless students served.
    Part II of the CSPR collects information related to state activities and outcomes of specific ESEA
    programs needed for the programs’ Government Performance and Results Act indicators or other
    assessment and reporting requirements. OESE uses these data in conjunction with data collected in
    Part I to monitor states’ progress in implementing ESEA and to identify technical assistance needs and
    program management and policy needs. Examples of data in Part II include: participation in Title I Part
    A, migrant students served, neglected or delinquent students served, adjusted cohort graduation rates,
    and lists of identified schools.
    The CSPR is considered OESE’s official report on state-level data for the specific programs included
for a given school year. Figures published in this report may differ from those in related CSPRs for a given state. State CSPR reports include data submitted as of the final CSPR deadline. SEAs may
    update data beyond the CSPR deadline, and data in this report may reflect those updates. For more
    information about the CSPR, please e-mail questions to CSPR@ed.gov.

    The Common Core of Data (CCD) Program
    The CCD is a program of the National Center for Education Statistics’ (NCES) Administrative Data
    Division, which is part of the U.S. Department of Education’s Institute of Education Sciences. CCD
    was established as part of the Cooperative Education Statistics System in section 157 of the Education
    Sciences Reform Act of 2002, part C. Each school year the CCD program collects fiscal and nonfiscal
    administrative data about all public schools, public local education agencies (LEAs), and SEAs in the
    United States. The State Nonfiscal Survey of Public Elementary/Secondary Education includes the
    data elements used to calculate the AFGRs and the event dropout rates in this report and is one of six
    annual surveys that comprise the CCD. The other five surveys are the Public Elementary/Secondary
    School Universe Survey, the Local Education Agency Universe Survey, the National Public Education
    Finance Survey, the School District Finance Survey, and the Teacher Compensation Survey.
The objectives of the CCD are twofold: first, to provide an official listing of public elementary and
secondary schools and LEAs in the nation, which can be used to select samples for other NCES
surveys; and second, to provide basic information and descriptive statistics on public elementary and
secondary schools and schooling in general that are comparable among states.
    SEAs report CCD nonfiscal survey elements as part of the annual EDFacts collection. SEAs report
    CCD fiscal data through separate surveys that are conducted in collaboration with the U.S. Census
    Bureau. CCD contains three categories of information: general descriptive information on schools and
    school districts, data on students and staff, and fiscal data on revenue and expenditures for public
education. CCD publishes statistical information annually by school year for approximately 100,000
public elementary and secondary schools and approximately 18,000 local education agencies (including
independent charter districts, supervisory unions, and regional education service agencies) in the 50
states, the District of Columbia, Puerto Rico, the U.S. Virgin Islands, Department of Defense
Education Activity (DoDEA), BIE, Guam, the Commonwealth of the Northern Mariana Islands, and
American Samoa.

    Data Collection and Review
    This report and the accompanying data files provide both ACGRs and AFGRs. For the ACGR data,
    SEAs track a cohort of high school students over 4 years and then submit to EDFacts aggregated
    counts of students remaining in that cohort at the end of 4 years, the counts of students from that cohort
    who received a diploma at the end of the fourth school year, and the calculated ACGRs from these
    counts. For the components used to calculate AFGRs, SEAs submit to EDFacts each school year the
    October 1 membership counts and aggregate counts of the total number of students who graduated at
the end of that school year. Both OESE and NCES run a series of validation checks against the
reported data, checking for internal, cross-level, and cross-year consistency. State coordinators are
asked to review any identified anomalies and, within a stipulated time period, either revise the reported
data or explain the anomaly. In most cases states were able to correct or explain their data.
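Once validated, the headline graduation rate is a simple ratio of these submitted aggregates. As a sketch of the 4-year ACGR calculation defined in Appendix B (the cohort adjustment itself is described there; the function name is illustrative):

```python
def acgr(cohort_graduates, adjusted_cohort_count):
    """4-year adjusted cohort graduation rate: students in the cohort who
    received a regular high school diploma within 4 years, divided by the
    adjusted cohort count, expressed as a percentage."""
    return 100.0 * cohort_graduates / adjusted_cohort_count
```

For example, a state reporting 850 cohort graduates out of an adjusted cohort of 1,000 students would have an ACGR of 85 percent.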

    Because data collected in the CSPR are used for program administration, OESE does not alter the data
reported by SEAs as part of the CSPR. Data errors and anomalies must be resolved by the state. In
some instances, OESE has not published state-submitted data due to data quality concerns. In cases
    where data are missing or suppressed, NCES worked with OESE to impute these data to improve the
    accuracy of national estimates and to make such estimates nationally representative.

NCES uses statistical editing procedures to identify inconsistencies or anomalies in the reported CCD
values that are used to compute the AFGR. Critical data errors include cases where counts at a lower
level of aggregation exceed counts at higher levels of aggregation, cases where a graduate count
exceeds the enrolled population count by more than 5 percent, and cases where the current year data
vary widely from data reported in prior years. One method for identifying inconsistencies involves
examining the data for an individual institution over a 5-year period and comparing the mean variation
across prior estimates to the variation between those prior data and data collected in the current year.
NCES provides the results of these edit checks to SEAs and requests that SEAs verify and explain the
flagged inconsistencies; if an SEA finds that the submitted data contain errors, NCES asks that the SEA
resubmit corrected data. If the data for an individual institution (school, LEA, or state) fail several
"critical" data checks and the state is unable to provide a detailed explanation of the anomaly, then the
data, as reported to EDFacts, are not reported to the public.
NCES does, on occasion, alter the data reported by SEAs for CCD, adjusting, suppressing, and/or
imputing the data in question to restore internal and cross-level consistency. Unexplained violations at
the school or LEA level result in the suppression of the identified data point; violations at the state
level can result in suppression and replacement of the identified data point with an imputed value.
Specific information about the imputation methods applied can be found later in this appendix under
the heading “Imputation Methodology.”
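The 5-year consistency check described above can be sketched as follows. The tolerance factor and the use of mean absolute deviation are assumptions for illustration; the appendix does not publish the exact test NCES applies.

```python
def flag_inconsistency(prior_years, current, factor=2.0):
    """Flag a current-year value whose deviation from prior years is large
    relative to the typical variation among those prior years.

    Sketch only: `factor` and the deviation measure are assumptions, not
    published NCES parameters.
    """
    mean_prior = sum(prior_years) / len(prior_years)
    # mean variation across the prior estimates
    typical = sum(abs(v - mean_prior) for v in prior_years) / len(prior_years)
    deviation = abs(current - mean_prior)
    return deviation > factor * max(typical, 1.0)  # floor avoids a zero tolerance
```

A value far outside the historical range would be flagged and referred back to the SEA for verification or correction.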

In some instances, the reported SEA totals are less than the aggregates of the LEA- or school-level
data. Where the SEA could not explain or correct this inconsistency, NCES applied a “raking”
procedure to the LEA and/or school-level data to ensure that the sum of the school or LEA data is
consistent with the state-level data. The raking process identifies the percentage by which students at
the school and/or LEA level exceed the state level and removes that percentage from each school
and/or LEA. The raking algorithm uses a statistical rounding technique that carries any resulting
remainders forward throughout the raking process in order to maintain whole-number counts of
students. This process makes slight changes to individual school/LEA records so that the aggregate
values are more statistically consistent.
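The raking step can be sketched as proportional scaling with remainder carry-forward. This is an illustrative implementation, not the exact NCES algorithm, whose rounding rule is not published in this detail.

```python
def rake(counts, state_total):
    """Proportionally scale school/LEA counts to match the state total,
    carrying fractional remainders forward so that every count stays a
    whole number and the raked counts sum exactly to the state total.
    Illustrative sketch only.
    """
    scale = state_total / sum(counts)
    raked, carry = [], 0.0
    for count in counts:
        exact = count * scale + carry  # scaled value plus carried remainder
        whole = int(round(exact))      # round to a whole student count
        carry = exact - whole          # pass the remainder to the next record
        raked.append(whole)
    return raked
```

For example, raking LEA counts of 30, 30, and 40 down to a state total of 90 yields 27, 27, and 36, which sum exactly to the state-level figure.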

    The EDFacts collection system accepted blank responses in SY 2010–11 and SY 2011–12 and did not
    require that states distinguish among missing, not applicable, and “zero” values. NCES used statistical
    editing procedures to change blank responses to missing, not applicable, or zero using information
    available from SEAs or from prior year reporting. However, it is possible that some blank responses
    may have been categorized incorrectly. For example, the number of graduates for a specific race group
    may have been categorized as missing when the actual count was zero.

    Response and Nonresponse
ACGR. For both SY 2010–11 and SY 2011–12, 47 states, the District of Columbia, and BIE reported
4-year ACGRs. Idaho, Kentucky, Oklahoma, and Puerto Rico received approved timeline extensions to
delay reporting of the ACGR. Kentucky and Oklahoma will first report the ACGR for SY 2012–13,
and Idaho will first report the ACGR for SY 2013–14; these extensions were granted to give their
internal data systems time to mature to the point where they have the requisite data to calculate the
ACGR components and rate. Puerto Rico reported a 3-year rate for SY 2011–12 and will continue to
report a 3-year rate, since Puerto Rico’s high school structure does not allow for reporting a 4-year
rate. The estimated national ACGR includes reported data from 47 states and the District of Columbia
and imputed data for Idaho, Kentucky, and Oklahoma. Detailed information on these imputations is
provided later in this appendix in the section marked “Imputation Methodology.” Although DoDEA,
the Virgin Islands, Guam, American Samoa, and the Commonwealth of the Northern Mariana Islands
are included in the CCD collection, these jurisdictions were not included in the collection of ACGR
data for SY 2010–11 and SY 2011–12. Texas did not report the ACGR components or rate for SY
2011–12 by the CSPR reporting deadline but did submit its data in time for inclusion in this report.
CCD Graduate Data. SEAs from 50 states, the District of Columbia, Puerto Rico, and the U.S. Virgin
Islands reported graduate counts in EDFacts for SY 2010–11. BIE, DoDEA, Guam, American Samoa,
and the Commonwealth of the Northern Mariana Islands did not report any graduate counts to
EDFacts for SY 2010–11. New Jersey did not report graduate counts disaggregated by gender for SY
2010–11. In order to produce national estimates of the AFGR by gender for SY 2010–11, NCES
imputed the graduate counts disaggregated by gender for New Jersey.
SEAs from 49 states, the District of Columbia, Puerto Rico, and the U.S. Virgin Islands reported
graduate counts in EDFacts for SY 2011–12. Texas, DoDEA, BIE, Guam, American Samoa, and the
Commonwealth of the Northern Mariana Islands did not report any graduate data for SY 2011–12.
New Jersey did not report graduate counts disaggregated by gender for SY 2011–12. In order to
produce national estimates of the AFGR for SY 2011–12, NCES imputed the graduate counts for
Texas and the graduate counts disaggregated by gender for New Jersey.
    CCD Dropout Data. SEAs from 50 states, the District of Columbia, Puerto Rico, and the U.S. Virgin
    Islands reported dropout counts in EDFacts for SY 2010–11. BIE, DoDEA, Guam, American Samoa,
    and the Commonwealth of the Northern Mariana Islands did not report dropout data to EDFacts for SY
    2010–11.
SEAs from 48 states, the District of Columbia, Puerto Rico, and the U.S. Virgin Islands reported
dropout counts in EDFacts for SY 2011–12. Utah, Alabama, BIE, DoDEA, Guam, American Samoa,
and the Commonwealth of the Northern Mariana Islands did not report any dropout data to EDFacts
for SY 2011–12. In order to produce a national estimate of the event dropout rate for SY 2011–12,
NCES imputed dropout data for Utah and Alabama.
    NCES makes every effort to work with SEAs to allow them opportunities to submit, correct, and/or
    explain data before imputation procedures are used in place of reported data. NCES only imputed
    missing data that were necessary to produce United States totals from the 50 states and the District of
    Columbia. Data for the other nonrespondent jurisdictions that are not included in the United States
    level estimates were not imputed and are shown as missing in the report tables. Detailed information
    on the imputation methodology is provided below.

    Imputation Methodology
Several imputation procedures are employed, each based on the specific circumstances of the
missing/suppressed data item.

    Method 1: Carrying forward prior year rates
If a school system was unable to report data for the current year but had reported the data for a
preceding year, the rates for the previous year were applied to the current year data to estimate the
missing item. The rates were carried forward at the lowest level of disaggregation, and totals were
derived from the imputed disaggregates.



• Missing Diploma Counts (numerator for the AFGR): Ratio of prior year diplomas to 12th-grade membership applied to current year grade 12 membership.
• Missing Cohort Graduates (numerator for the ACGR): Ratio of prior year cohort graduates to total graduates applied to current year total graduates.
• Missing Cohort Student Count (denominator for the ACGR): Ratio of prior year cohort student count to averaged freshman count (denominator for the AFGR) applied to current year averaged freshman count.
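Each Method 1 imputation reduces to applying a prior year ratio to a current year base count. A sketch, using a missing diploma count as the example (argument names are illustrative):

```python
def carry_forward(prior_numerator, prior_base, current_base):
    """Method 1 sketch: apply the prior year's ratio to the current year's
    reported base count.  For a missing diploma count, the inputs would be
    prior year diplomas, prior year grade 12 membership, and current year
    grade 12 membership.  Names are illustrative assumptions."""
    return round((prior_numerator / prior_base) * current_base)
```

For instance, a state that graduated 900 of 1,000 grade 12 students last year, with 1,100 grade 12 students this year, would receive an imputed diploma count of 990. Method 3, described below, applies the same ratio in the opposite direction.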

    Method 2: Apply average state ratio to missing SEA
    If an SEA did not report a component to the ACGR but did report the corresponding component used
    to calculate the AFGR, NCES computed a ratio of the ACGR to AFGR components from the SEAs
    that did report both these components. Then, for each state with a missing ACGR component, NCES
    applied that ratio to the AFGR component to derive the imputed value for the missing ACGR
    component. The ratios were derived at the lowest level of disaggregation, and totals were derived from
    the imputed disaggregates.



• Missing Cohort Graduates (numerator for the ACGR): Average (weighted) ratio of cohort graduates to total graduates across reporting states applied to total graduates in the target state.
• Missing Cohort Student Count (denominator for the ACGR): Average (weighted) ratio of the cohort student count to the averaged freshman count (denominator for the AFGR) across reporting states applied to the averaged freshman count in the target state.
• Missing Cohort Rate Components by Disability, English Language Learner, and Poverty Status: Average (weighted) ratio of the overall ACGR components to the disaggregated subgroups across reporting states applied to either the reported or imputed overall ACGR for the target state.
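Method 2 can be sketched as follows. Summing numerators and denominators before dividing weights each reporting state by its size, which is one plausible reading of "average (weighted) ratio"; the function and argument names are illustrative.

```python
def impute_from_state_ratio(reporting_pairs, target_afgr_component):
    """Method 2 sketch: compute a weighted ratio of ACGR to AFGR
    components across the states that reported both, then apply that
    ratio to the target state's AFGR component.

    `reporting_pairs` is a list of (acgr_component, afgr_component)
    tuples, one per reporting state.  Names are illustrative."""
    # summing before dividing weights each state by its size (assumption)
    ratio = sum(a for a, _ in reporting_pairs) / sum(b for _, b in reporting_pairs)
    return round(ratio * target_afgr_component)
```

For example, if reporting states show cohort counts running at 90 percent of their averaged freshman counts, a target state with an averaged freshman count of 200 would receive an imputed cohort count of 180.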

    Method 3: Carrying back current year rates
    If a school system was unable to report data for the prior year but was able to report the data for the
    current year, the rates for the current year were applied to the prior year data to estimate the missing
    item. The rates were carried back by the lowest level of disaggregation and totals were derived from
    the imputed disaggregates.


    Missing Cohort Graduates (numerator for the ACGR): Ratio of current year cohort graduates
    to total graduates applied to prior year total graduates.

    Missing Cohort Student Count (denominator for the ACGR): Ratio of current year cohort
    student count to averaged freshman count (denominator for the AFGR) applied to prior year
    averaged freshman count.
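    Method 3's carry-back is the same ratio idea applied across years rather than across states. A sketch with invented counts (not actual data):

    ```python
    # Illustrative sketch of Method 3: carry a current-year ratio back to
    # estimate a missing prior-year ACGR component. Counts are invented.

    current_cohort_graduates = 9_000   # reported for the current year
    current_total_graduates = 10_000   # reported for the current year
    prior_total_graduates = 9_600      # reported; prior cohort graduates missing

    carry_back_ratio = current_cohort_graduates / current_total_graduates
    imputed_prior_cohort_graduates = carry_back_ratio * prior_total_graduates

    print(imputed_prior_cohort_graduates)  # ≈ 8,640
    ```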

    Variability of Data Quality
    SEAs and LEAs vary widely in how they collect and manage student data and in the degree of rigor
    applied to verifying the accuracy of graduation and dropout data at all levels. Because of this, the


    graduation and dropout data reported by SEAs may have varying levels of quality. Those states that
    collect dropout or graduation data through student-level records systems are better able to verify
    students’ enrollment and graduation status than are those agencies that collect aggregate data from
    schools and districts in more traditional formats. Additionally, some SEAs take a more active role in
    cleaning and processing these data, while others rely more heavily on their LEAs to clean these data
    points.


    Appendix B: Detailed Methodology for Calculation of Four-Year On-
    Time Graduation Rates and Event Dropout Rates

    The Adjusted Cohort Graduation Rate (ACGR)
    Starting with the school year (SY) 2011–12 collection, the ACGR has been and will continue to be
    included as a required component of each state’s Consolidated State Performance Report (CSPR).1 The
    ACGR is calculated based on the number of students who graduate in 4 years or less with a regular
    high school diploma divided by the number of students who form the adjusted cohort for the
    graduating class. In order to calculate and report the 4-year ACGR states must follow the progress of
    each individual 9–12 grade student over time and maintain documentation of students who enter or
    leave schools or districts within their state. From the beginning of ninth grade (or the earliest high
    school grade), students who are entering that grade for the first time form a cohort that is “adjusted” by
    adding any students who subsequently transfer into the cohort from another state and subtracting any
    students who subsequently transfer out, emigrate to another country, or die.
    The following formula provides an example of how the 4-year adjusted cohort graduation rate would
    be calculated for the cohort entering 9th grade for the first time in SY 2008–09 and graduating by the
    end of SY 2011–12:

    Number of cohort members who earned a regular high school diploma
    by the end of SY 2011–12

    Number of first-time 9th-graders in fall 2008 (starting cohort) plus
    students who transferred in, minus students who transferred out,
    emigrated, or died during school years 2008–09, 2009–10, 2010–11,
    and 2011–12
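    Read as an equation, the ACGR is the number of on-time regular diploma earners divided by the adjusted cohort. A minimal sketch of the arithmetic in Python; every count below is invented for illustration, not actual state data:

    ```python
    # Illustrative ACGR for a hypothetical state cohort entering 9th grade
    # for the first time in SY 2008–09. All counts are invented.

    first_time_9th_graders = 100_000        # starting cohort, fall 2008
    transferred_in = 5_000                  # joined from out of state
    transferred_out_emigrated_died = 8_000  # removed with written documentation
    regular_diplomas_by_sy_2011_12 = 82_000

    adjusted_cohort = (first_time_9th_graders + transferred_in
                       - transferred_out_emigrated_died)
    acgr_percent = 100 * regular_diplomas_by_sy_2011_12 / adjusted_cohort

    print(adjusted_cohort)         # 97000
    print(round(acgr_percent, 1))  # 84.5
    ```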

    State education agencies (SEAs) report ACGR data for each school, LEA, and for the state total cohort
    rate. The methodology of the ACGR, as it was designed, allows for the movement or transfer of
    students from one school to another, while only counting each student once. A student may change
    schools and thus exit their prior school’s cohort and enter their new school’s cohort, but stay in the
    same district and state cohort. Similarly, a student who changes districts within a state will move to the
    new school and district for the ACGR, but will stay in the state’s cohort. In order to subtract or transfer
    a student out of a cohort, the school or LEA must have official written documentation that the student
    enrolled in another school or in an educational program that culminates in the award of a regular high
    school diploma.
    Unless specified, the ACGR data in this report and the associated data files reflect the data as reported
    by each SEA. The ACGRs required under the current Title I regulations are more comparable across
    states than were graduation rates submitted by SEAs under prior regulations. However, there has been
    some variation in the way that individual states have interpreted and understood the methodology
    specified in the statute. Examples of ways the calculated ACGR may vary among states include

    1 Under the Title I regulations, states also may calculate and use in accountability determinations an extended-year adjusted
    cohort graduation rate (e.g., a 5-year rate) to take into account students who graduate with a regular high school diploma in
    more than 4 years (see 34 C.F.R. § 200.19(b)(1)(v)). If a state uses an extended-year graduation rate in accountability
    determinations, it must calculate and report that rate separately from, and in addition to, the 4-year rate.





    • how students are identified for inclusion in certain subgroups;

    • how the beginning of the cohort is defined;

    • whether summer school students are included; and

    • the criteria for what constitutes a diploma that meets the regulatory definition of a regular high
    school diploma.2

    SEAs report the ACGR disaggregated by major reporting groups. The specific groups vary across
    states depending on the relative size of certain disaggregation populations of interest and the subgroups
    they have been approved to report as part of their CSPR Accountability Workbooks. For the purpose of
    this report, data have been aggregated to five race/ethnicity subgroups: American Indian/Alaska
    Native, non-Hispanic; Asian/Pacific Islander, non-Hispanic; Hispanic; Black, non-Hispanic; and
    White, non-Hispanic. Additional levels of disaggregation may be presented in other reports and/or data
    files associated with the ACGR.
    Detailed information on the ACGR can be found in the Department’s 2008 High School Graduation
    Rate Non-Regulatory Guidance: http://www2.ed.gov/policy/elsec/guid/hsgrguidance.
    Detailed information on the guidance provided to SEA coordinators for submitting data relating to the
    ACGR can be found in the EDFacts file specifications 150 and 151. Links to both file specification
    documents can be found on the EDFacts File Specification website:


    SY 2010–11: http://www2.ed.gov/about/inits/ed/edfacts/sy-10-11-xml.html

    SY 2011–12: http://www2.ed.gov/about/inits/ed/edfacts/sy-11-12-nonxml.html

    The Averaged Freshman Graduation Rate (AFGR)
    The AFGR provides an estimate of the percentage of high school students who graduate within 4 years
    of first starting 9th grade. The rate uses aggregate student enrollment data to estimate the size of an
    incoming freshman class and counts of the number of diplomas awarded 4 years later. The incoming
    freshman class size is estimated by summing the enrollment in 8th grade in year one, 9th grade for the
    next year, and 10th grade for the year after, and then dividing by three. The averaging has a smoothing
    effect that helps compensate for prior year retentions in the 8th-, 9th-, and 10th-grade enrollment
    counts. Although not as accurate as a 4-year graduation rate computed from a cohort of students using
    student record data like the ACGR, the AFGR can be computed with widely available cross-sectional
    data. Based on a technical review and analysis of several 4-year graduation rates, the AFGR was
    selected as the most accurate indicator, excepting only the ACGR, from a number of alternative
    estimates that can be calculated using available cross-sectional data (Seastrom et al. 2006a, 2006b).
    The following formula provides an example of how the AFGR would be calculated for the graduating
    class of 2011:3

    2 Under 34 C.F.R. § 200.19(b)(1)(iv) a regular high school diploma is defined as “the standard high school diploma that is awarded to
    students in the State and that is fully aligned with the State’s academic content standards or a higher diploma and does not include a high
    school equivalency credential, certificate of attendance, or any alternative award.”
    3 Eighth-, 9th-, and 10th-grade enrollment was adjusted to include a prorated number of ungraded students using the ratio of the specified
    grade enrollment to the total graded enrollment. The same ratio was used to prorate ungraded students for the disaggregated enrollment
    counts (race/ethnicity and gender).



    Number of regular high school diplomas awarded in SY 2010–11

    (the number of 8th-graders enrolled in the fall of 2006, plus
    the number of 9th-graders enrolled in the fall of 2007, plus
    the number of 10th-graders enrolled in the fall of 2008)
    divided by 3

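    The same arithmetic can be sketched in a few lines of Python; the enrollment and diploma counts below are invented for illustration, not actual state data:

    ```python
    # Illustrative AFGR for the class of SY 2010–11, using invented
    # enrollment and diploma counts.

    grade8_fall_2006 = 102_000
    grade9_fall_2007 = 110_000   # often inflated by 9th-grade retention
    grade10_fall_2008 = 100_000
    diplomas_sy_2010_11 = 80_000

    # Averaging three adjacent-grade counts smooths retention effects when
    # estimating the size of the incoming freshman class.
    estimated_freshman_class = (grade8_fall_2006 + grade9_fall_2007
                                + grade10_fall_2008) / 3
    afgr_percent = 100 * diplomas_sy_2010_11 / estimated_freshman_class

    print(estimated_freshman_class)  # 104000.0
    print(round(afgr_percent, 1))    # 76.9
    ```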
    The AFGR was intended to address a lack of regular information about the timeliness of graduation
    from public high schools. Precise measures of how long it takes a student to graduate require data
    sources that follow the progress of each individual student over time. Until recently, most states
    lacked data systems that captured individual public-school student-level data over time. The AFGR
    was developed to utilize data that were available across the 50 states on a regular basis to provide a
    general and comparable measure of the percentage of public high school students who graduate with a
    regular high school diploma within 4 years of first entering 9th grade. The AFGR is useful for
    longitudinal analysis of graduation rates since the data used to generate the AFGR are available going
    back in time to at least the 1960s.
    The levels of disaggregation for the AFGR components are complicated by the different years for
    which data are necessary in the computation of the rate. Prior to the SY 2008–09 collection, enrollment
    and graduation data were collected disaggregated by 5 major racial/ethnic groups: American
    Indian/Alaska Natives, Asian/Pacific Islanders, Hispanics, Blacks, and Whites. The reporting of
    Hawaiian/Pacific Islanders disaggregated from Asian/Pacific Islander students and the separate
    reporting of students of more than one race was phased in for SY 2008–09. For SY 2008–09 seven
    states reported students disaggregated into the seven race/ethnicity categories. For SY 2009–10 eight
    additional states reported at that level of disaggregation. By SY 2010–11 all states were required to
    report by the seven race/ethnicity categories rather than the traditional five.4 Because the SY 2010–11
    AFGR calculations require data going back to SY 2006–07, it is only possible to calculate the rate
    using the collapsed five race/ethnicity categories. Therefore, these five categories are the only levels of
    disaggregation available in this report and in the associated data files.
    Detailed information on the guidance provided to SEA coordinators for submitting data relating to the
    AFGR can be found in the EDFacts file specifications. Links to both file specification documents can
    be found on the EDFacts File Specification website:




    8th-grade enrollment for SY 2006–07 was collected under file specification 052:
    http://www2.ed.gov/about/inits/ed/edfacts/sy-06-07-nonxml.html

    9th-grade enrollment for SY 2007–08 was collected under file specification 052:
    http://www2.ed.gov/about/inits/ed/edfacts/sy-07-08-nonxml.html

    10th-grade enrollment for SY 2008–09 was collected under file specification 052:
    http://www2.ed.gov/about/inits/ed/edfacts/sy-08-09-nonxml.html

    Diploma data for SY 2010–11 were collected under file specification 040:
    http://www2.ed.gov/about/inits/ed/edfacts/sy-10-11-nonxml.html

    4 For more information on this change, please refer to the October 19, 2007 Federal Register notice, Final Guidance on Maintaining,
    Collecting, and Reporting Racial and Ethnic Data to the U.S. Department of Education, located at
    http://www2.ed.gov/legislation/FedRegister/other/2007-4/101907c.html.



    Differential Definitions for “Regular High School Diploma Recipient”
    State and local policies can affect the number of regular high school diploma recipients (REGDIP)
    reported. What a regular high school diploma represents differs across states. EDFacts
    file specifications for both annual and cohort REGDIP define a regular diploma as the high school
    completion credential awarded to students who meet or exceed coursework and performance standards
    set by the state or other approving authority. While this language provides a definition of common
    intent, the requirements to earn a regular high school diploma vary among states. States therefore
    have differing REGDIP requirements in terms of required attendance, coursework (Carnegie Units),
    and exit exams.

    High School Event Dropout Rate
    In calculating the event dropout rate, high school dropouts for a given school year include students
    who were

    • enrolled in school at some time during the school year;

    • expected to be in membership the following school year; and

    • not enrolled in grades 9–12 by October 1 of the following year.

    Dropouts do not include students who were

    • reported as a dropout in the year before;

    • among those who graduated high school by completing the state graduation requirements,
    receiving a high school equivalency credential without dropping out of school, or completing a
    state or district-approved educational program;

    • confirmed as having transferred to another public school district, private school, or state or
    district-approved educational program;

    • temporarily absent due to suspension or illness; or

    • deceased.

    The high school event dropout rate is the number of dropouts divided by the number of students
    enrolled in grades 9–12 at the beginning of that school year. In cases where LEAs or SEAs report
    students and dropouts in an ungraded category, the National Center for Education Statistics (NCES)
    prorates ungraded students and dropouts into grades in order to calculate an aggregated dropout rate
    for 9th- through 12th-grade students.
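    The rate calculation and the proration of ungraded students described above can be sketched as follows; all counts are invented for illustration, not actual CCD data:

    ```python
    # Illustrative event dropout rate with ungraded students prorated into
    # grades 9–12 by each grade's share of graded enrollment. All counts
    # are invented, not actual CCD data.

    graded_enrollment = {"9": 1_000, "10": 950, "11": 900, "12": 850}
    graded_dropouts = {"9": 40, "10": 38, "11": 45, "12": 30}
    ungraded_students = 74
    ungraded_dropouts = 8

    total_graded = sum(graded_enrollment.values())

    # Prorate ungraded students into each grade, then aggregate grades 9–12.
    enrollment_9_12 = sum(
        count + ungraded_students * count / total_graded
        for count in graded_enrollment.values()
    )
    dropouts_9_12 = sum(graded_dropouts.values()) + ungraded_dropouts

    event_dropout_rate = 100 * dropouts_9_12 / enrollment_9_12
    print(round(event_dropout_rate, 1))  # ≈ 4.3
    ```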

    Not all states follow a fall-to-fall school year. The Common Core of Data (CCD) dropout count is
    based on an October–September school year in which a student’s dropout status is determined at the
    beginning of the year. Some states follow a July–June calendar in which a student’s dropout status is
    determined at the end of the school year. Dropout rates in states that follow an alternative reporting
    calendar are comparable with rates for states that follow the October–September calendar (Winglee et
    al. 2000) and are included in the CCD data files.

    The CCD definition attributes dropouts to the grade and school year for which they do not meet their
    obligation. Students who complete 1 school year but fail to enroll in the next school year are counted
    as dropouts from the school year and grade for which they failed to return. For example, a student


    completing 10th grade in SY 2008–09 who does not enroll the next year would be reported as an 11th-
    grade dropout for SY 2009–10.

    Students who leave high school to enroll in high school equivalency preparation programs are reported
    as dropouts, unless the district tracks these students and reports as dropouts those who fail to complete
    the program. If a high school equivalency program is an accepted high school credential in the state,
    students who have received a high school equivalency by October 1 are not considered dropouts,
    regardless of where they prepared for the test.

    Data Usage and Availability
    These data are released to the public and to IES Restricted-Use Data Licensees. Public release files can
    be obtained through



    • the CCD website (ACGR and AFGR data);

    • the Office of Elementary and Secondary Education (OESE) website (ACGR state-level data); and

    • the DATA.GOV initiative (ACGR school- and LEA-level data).

    Public release data include graduation and dropout rates by race/ethnicity and other demographic
    characteristics at the school, district, and state levels.
    For more information on the public-use data files, please visit the CCD data file download page at
    http://nces.ed.gov/ccd/ccddata.asp, the OESE data tool at http://eddataexpress.ed.gov, and the
    government-wide DATA.GOV initiative at http://www.data.gov/.
    To learn more about restricted use data files or how to apply for an IES Restricted Use Data License,
    visit the Restricted Use Data License page at: http://nces.ed.gov/statprog/instruct.asp or contact the IES
    Restricted Use Data License Office at: IESData.Security@ed.gov.

    Ensuring Confidentiality of Individual Student Data
    The Department of Education is legally required to protect the confidentiality of data that could be
    used to identify individual students. Legal requirements for protecting the data are detailed in the
    Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. § 1232g; 34 CFR Part 99). FERPA
    requires that these data be protected to ensure they cannot be used to identify the attributes of any
    individual student. Beyond the natural barriers to precise identification, including differential time
    frames and net transfer effects, additional disclosure mitigation methods have been applied to these
    data to ensure confidentiality.
    Beginning with the SY 2010–11 graduation and dropout data, NCES’ public release files will only
    include rates or ranges of rates. Explicit counts of graduates and dropouts will only be available on the
    restricted-use files for each of the state, LEA, and school levels. The Department of Education’s
    Disclosure Review Board has established a set of mandatory procedures for these data to protect the
    confidentiality of individual student data. The procedures establish rate floors (minimums) and ceilings
    (maximums) based on the size of the population being examined. Small populations require more
    protection at the top and bottom of the distribution than do large populations to achieve the same level
    of confidentiality protection. For the public-use, LEA-level files, rate ranges within the distribution
    have also been established. These ranges are based on the same methodology as the floors and ceilings
    established for SEA-level data. Making the floors, ceilings, and ranges dynamic based on population
    size allows these rates to be published at a maximum level of precision allowable for a specific
    measured population.
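    The idea of population-dependent floors, ceilings, and ranges can be illustrated with a toy function. The cutoffs and bin widths below are invented for illustration only; they are not the Disclosure Review Board's actual parameters:

    ```python
    # Toy sketch of population-dependent disclosure avoidance: smaller
    # populations get wider published ranges and tighter floors/ceilings.
    # All thresholds below are hypothetical, NOT the actual NCES rules.

    def publish_rate(rate: float, population: int) -> str:
        """Map an exact rate (percent) to a publishable range string."""
        if population >= 1_000:
            floor, ceiling, width = 1, 99, 1     # large groups: near-exact
        elif population >= 100:
            floor, ceiling, width = 5, 95, 5     # medium groups: 5-point bins
        else:
            floor, ceiling, width = 20, 80, 10   # small groups: wide bins only
        clipped = min(max(rate, floor), ceiling)
        low = min(int(clipped // width) * width, ceiling - width)
        return f"{low}-{low + width}"

    print(publish_rate(97.3, 5_000))  # "97-98"
    print(publish_rate(97.3, 60))     # "70-80": small group, rate capped
    ```

    The point of making the bins dynamic, as the text notes, is that the published value carries as much precision as the population size safely allows.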



    NCES makes restricted-use data files available to researchers through IES Restricted-Use Data
    Licenses. Restricted use data files of administrative data have not undergone data coarsening.
    Licensees with access to these restricted use data must sign an agreement to ensure that any data
    disseminated outside of the authorized research team are protected using the Department of Education
    approved disclosure avoidance methodology for the licensed data set to ensure the confidentiality of
    individual student data. Any public releases of these data, including presentation materials, journal
    articles, website postings, etc. must be reviewed by IES to ensure that the Department of Education’s
    approved disclosure avoidance methodology has been employed prior to release. The researchers and
    the sponsoring organizations are held accountable for any and all failures to comply with the strict
    requirements agreed to within the IES Restricted Use Data License agreement. Data confidentiality
    violations by IES Restricted Use Data Licensees are subject to Class E felony charges, with a fine up
    to $250,000 and/or a prison term up to 5 years.



    Writing 2011
    NATIONAL ASSESSMENT OF EDUCATIONAL PROGRESS AT GRADES 8 AND 12

    U.S. Department of Education
    NCES 2012-470

    Contents

    Executive Summary
    Introduction
    Grade 8
    Grade 12
    Technical Notes
    What Is The Nation’s Report Card™?
    The Nation’s Report Card™ informs the public about the academic achievement of elementary and
    secondary students in the United States. Report cards communicate the findings of the National
    Assessment of Educational Progress (NAEP), a continuing and nationally representative measure of
    achievement in various subjects over time.

    Since 1969, NAEP assessments have been conducted periodically in reading, mathematics, science,
    writing, U.S. history, civics, geography, and other subjects. NAEP collects and reports information on
    student performance at the national and state levels, making the assessment an integral part of our
    nation’s evaluation of the condition and progress of education. Only academic achievement data and
    related background information are collected. The privacy of individual students and their families is
    protected.

    NAEP is a congressionally authorized project of the National Center for Education Statistics (NCES)
    within the Institute of Education Sciences of the U.S. Department of Education. The Commissioner of
    Education Statistics is responsible for carrying out the NAEP project. The National Assessment
    Governing Board oversees and sets policy for NAEP.

    Executive Summary
    New computer-based assessment of students’ writing skills
    Writing in the 21st century is defined by its frequency and its efficiency. It is clear that the
    ability to use written language to communicate with others—and the corresponding need for
    effective writing instruction and assessment—is more relevant than ever. Reflecting current
    practice and recognizing the impact of communication technologies on the way students
    compose their writing, the National Assessment of Educational Progress (NAEP) administered
    the first computer-based assessment in writing in 2011.

    In this new national writing assessment sample, 24,100 eighth-graders and 28,100 twelfth-
    graders engaged with writing tasks and composed their responses on computer. The
    assessment tasks reflected writing situations common to both academic and workplace
    settings and asked students to write for several purposes and communicate to different audi-
    ences. The results of the 2011 writing assessment offer a new opportunity to understand the
    ability of eighth- and twelfth-grade students to make effective choices in their writing and allow
    for insight into the role and impact of technology on writing education and performance.

    For the first year of this computer-based writing assessment, new scales and achievement
    levels were established. The scales for grades 8 and 12 were developed separately and range
    from 0 to 300 with a mean set at 150 for each grade. Additional results are reported based on
    students’ demographic characteristics, educational experiences, and the frequency of engaging
    in actions available to them in word-processing software.

    About one-quarter of students perform at the
    Proficient level in writing
    Twenty-four percent of students at both grades 8 and 12 performed at the Proficient level in
    writing in 2011 (figure A). The NAEP Proficient level represents solid academic performance
    for each grade assessed. Students performing at this level have clearly demonstrated the
    ability to accomplish the communicative purpose of their writing.

    Figure A. Achievement-level results in eighth- and twelfth-grade NAEP writing: 2011

    NOTE: Detail may not sum to totals
    because of rounding.


    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Assessment.

    Fifty-four percent of eighth-graders and 52 percent of twelfth-graders performed at the Basic
    level in writing in 2011. The Basic level denotes partial mastery of the prerequisite knowledge
    and skills that are fundamental for proficient work at each grade.

    Three percent of eighth- and twelfth-graders in 2011 performed at the Advanced level.
    This level represents superior performance.


    Students’ performance varies by race/ethnicity, gender, and school location

    At grade 8, average writing scores were

    • higher for Asian students than for other racial/ethnic groups (table A);

    • higher for female students than for male students; and

    • higher for students attending schools in suburban locations than for students in cities,
    towns, and rural locations.

    At grade 12, average writing scores were

    • higher for White students, Asian students, and students of two or more races than for
    Black, Hispanic, and American Indian/Alaska Native students (table B);

    • higher for female students than for male students; and

    • higher for students in suburban schools than for students in cities and rural locations.

    Table A. Average scores in eighth-grade NAEP writing, by selected student and school
    characteristics: 2011

    Characteristic                              Scale score
    Race/ethnicity
      White                                     158
      Black                                     132
      Hispanic                                  136
      Asian                                     165
      Native Hawaiian/Other Pacific Islander    141
      American Indian/Alaska Native             145
      Two or more races                         155
    Gender
      Male                                      140
      Female                                    160
    School location
      City                                      144
      Suburb                                    155
      Town                                      148
      Rural                                     150

    NOTE: Black includes African American, and Hispanic includes Latino. Race categories exclude
    Hispanic origin.

    Table B. Average scores in twelfth-grade NAEP writing, by selected student and school
    characteristics: 2011

    Characteristic                              Scale score
    Race/ethnicity
      White                                     159
      Black                                     130
      Hispanic                                  134
      Asian                                     158
      Native Hawaiian/Other Pacific Islander    144
      American Indian/Alaska Native             145
      Two or more races                         158
    Gender
      Male                                      143
      Female                                    157
    School location
      City                                      146
      Suburb                                    154
      Town                                      149
      Rural                                     149

    NOTE: Black includes African American, and Hispanic includes Latino. Race categories exclude
    Hispanic origin.

    Computer-based assessment provides information on
    students’ use of word-processing actions
    Data collected from the computer-based writing assessment provided information about the extent
    to which students engaged in certain actions on the computer as they responded to the writing tasks.
    Information is reported for 23 unique actions students performed as they either viewed the writing
    prompts or wrote and edited their responses.

    Results for the student actions are reported as the percentages of students engaging in the action with
    varying frequency, and the average writing score for those students. For example, at both grades 8 and
    12, students who used the thesaurus tool more frequently scored higher on average than students who
    engaged in this action less frequently. Twelve percent of eighth-graders and 15 percent of twelfth-graders
    used the thesaurus two or more times during the assessment.

    Writing Assessment Interface and Select Student Actions
    [Snapshot of the assessment interface not reproduced.] Selected data on the actions students
    engaged in while viewing the prompts or editing their responses:

    80% or more of twelfth-graders
    did not use the cut, copy, and
    paste features.

    74% of twelfth-graders
    right-clicked to access the
    spell-check option 1 or
    more times.


    29% of eighth-graders used
    the thesaurus 1 or more times.

    71% of eighth-graders
    used the text-to-speech
    function 1 or more times.

    Introduction
    The 2011 National Assessment of Educational Progress (NAEP) writing
    assessment was developed under a new framework that recognizes the
    significant role that computers play in the writing process, as well as the
    prevalence of computer technology in the lives of students and the
    increasing role of computers in learning activities. Assessment results
    provide information on what eighth- and twelfth-grade students can
    accomplish when writing for a specific communicative purpose and for
    a specified audience.

    The New Writing Framework
    The National Assessment Governing Board oversees the development of NAEP frameworks that
    describe the specific knowledge and skills to be assessed. The Writing Framework for the 2011 National
    Assessment of Educational Progress guided all aspects of assessment development. Major aspects of
    the assessment are anchored in the definition of writing provided by the 2011 framework:

    Writing is a complex, multifaceted, and purposeful act of communication that is
    accomplished in a variety of environments, under various constraints of time, and
    with a variety of language resources and technological tools.

    With this definition as the foundation, all assessment tasks specify both a definite purpose for
    the writing and a specific audience the writing should address. In addition, the computer-based
    tasks provided students with typical language resources such as a thesaurus and common
    computer tools such as spell-check, cut, copy, and paste.

    The movement to a computer-based assessment reflects the important social and educational
    changes in the use of technology since the former writing framework was developed for the
    1998, 2002, and 2007 assessments. The innovations in the new computer-based writing
    assessment prescribed by the new framework preclude the possibility of reporting trend
    results. Future NAEP writing assessment results will be compared to the 2011 results.

    Writing for different purposes and audiences
    Students participating in the 2011 NAEP writing assessment responded to tasks designed to
    measure one of three communicative purposes common to many typical writing situations:

    • To persuade, in order to change the reader’s point of view or affect the reader’s action.

    • To explain, in order to expand the reader’s understanding.

    • To convey experience (real or imagined), in order to communicate individual and imagined
    experience to others.

    4 THE NATION’S REPORT CARD

    The proportion of the assessment tasks devoted to each of the three purposes varies by
    grade, with more of an emphasis on writing to persuade and to explain at grade 12 than at
    grade 8 (table 1).

    Table 1. Target percentage distribution of NAEP writing tasks, by grade and communicative purpose: 2011

    Communicative purpose      Grade 8    Grade 12
    To persuade                   35          40
    To explain                    35          40
    To convey experience          30          20

    SOURCE: U.S. Department of Education, National Assessment Governing Board, Writing Framework for the 2011 National Assessment of Educational Progress (NAEP), 2010.


    Each task in the writing assessment clearly specifies or implies an audience that corresponds in
    some way to the purpose of the task. The kinds of audiences may vary by grade and broaden
    from grade 8 to grade 12. For example, eighth-graders may be asked to write to a principal,
    local newspaper editor, or online community, and twelfth-graders may be asked to write to a
    government official or community leader.

    Assessment Design
    The 2011 writing assessment included 22 writing tasks at grade 8 and 22 tasks at grade 12.
    Writing tasks were presented to students in a variety of ways, including text, audio, photographs,
    video, or animation on the computer. One example of a writing task and sample student responses
    from the assessment for each grade is presented in this report.

    Students were randomly assigned two writing tasks and had 30 minutes to complete each of
    the tasks. Before being presented with the first task, students were shown a tutorial to familiarize
    them with the way material is presented on the computer screen and show them how to use
    the custom-developed software program provided in the assessment. Students completed their
    writing tasks on laptop computers provided by NAEP, using software similar to common word-
    processing programs. They were able to use some standard tools for editing, formatting, and
    viewing text, but did not have access to potentially irrelevant or distracting tools such as clip
    art, font type and color, or the Internet.

    Survey questionnaires were completed by students, their teachers (at grade 8 only), and
    school administrators. The data obtained from these questionnaires help to provide additional
    information about students’ educational experiences and a context for understanding the
    assessment results.

    Scoring Students’ Writing
    Students’ written responses to each task were evaluated based on a holistic approach that considered
    the response in its entirety rather than focusing on its specific parts. Individual scoring guides for each
    of the communicative purposes were used to evaluate students’ writing. Grade-specific guides were
    used to rate students’ responses to the persuade and explain tasks; however, as students’ responses
    at grades 8 and 12 did not reflect strong differences for the convey experience task, the same scoring
    guide was used to rate this writing purpose. The scoring guides were used to train teams of human
    scorers to rate responses for each of the writing tasks. Due to the on-demand nature of writing tasks
    in the NAEP 2011 assessment, students’ responses were evaluated as first drafts and not as polished
    pieces of writing.

    Responses were scored on a 6-point scale (effective skill, competent skill, adequate skill, developing
    skill, marginal skill, and little or no skill) across three broad features of writing:


    Development of ideas

    Organization of ideas

    Language facility and conventions

    Scoring guides are available at http://nationsreportcard.gov/writing_2011/sample_quest.asp.

    Reporting NAEP Results
    The 2011 writing assessment results are based on nationally representative samples of 24,100
    eighth-graders from 950 schools, and 28,100 twelfth-graders from 1,220 schools. The sample
    design for the first computer-based writing assessment was not intended to report results for
    individual states or large urban districts.

    Scale scores
    The NAEP writing scale was developed in 2011 to facilitate the reporting of NAEP writing results
    and to establish the baseline for future writing assessment results. The scale at each grade
    ranged from 0 to 300 with a mean of 150 and a standard deviation of 35. That is, the average
    overall performance for each grade corresponds to a score of 150. Because NAEP scales are
    developed independently for each subject, scores cannot be compared across subjects. Similarly,
    although the scales are identical for grades 8 and 12, the scale scores were derived separately;
    therefore, scores cannot be compared across grades. More information about the NAEP writing
    scale can be found at http://nces.ed.gov/nationsreportcard/writing/scale.asp.
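    The linear relationship the paragraph describes can be made concrete with a short sketch. This is only an illustration of a scale with mean 150 and standard deviation 35, not NAEP's actual operational scaling procedure, and the function name is hypothetical:

    ```python
    # Illustrative only: NAEP's operational scaling is more involved, but the
    # reported scale fixes the 2011 mean at 150 and the standard deviation at
    # 35, so a standardized performance value z maps linearly onto 0-300.
    def to_writing_scale(z, mean=150.0, sd=35.0, lo=0.0, hi=300.0):
        """Map a standardized score z onto the reported 0-300 writing scale."""
        return min(hi, max(lo, mean + sd * z))

    average = to_writing_scale(0.0)    # an average performer scores 150
    one_sd_up = to_writing_scale(1.0)  # one standard deviation above: 185
    ```

    Because the grade 8 and grade 12 scales were derived separately, the same sketch applied to each grade yields numbers that still cannot be compared across grades.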



    Achievement levels
    Based on recommendations from policymakers, educators, and members of the general public,
    the Governing Board sets specific achievement levels for each subject area and grade assessed.
    Achievement levels are performance standards showing what students should know and be
    able to do. NAEP results are reported as percentages of students performing at the Basic,
    Proficient, and Advanced levels.

    Basic denotes partial mastery of prerequisite knowledge and skills that are fundamental
    for proficient work at each grade.

    Proficient represents solid academic performance. Students reaching this level have
    demonstrated competency over challenging subject matter.

    Advanced represents superior performance.

    More specific definitions of the achievement levels that outline what students should know and
    be able to do in writing at each level are presented in the Assessment Content sections of this
    report for grades 8 and 12. Each content section also includes information on the relationship
    between the writing scores on the NAEP scale and the achievement levels.

    As provided by law, the National Center for Education Statistics (NCES), upon review of
    congressionally mandated evaluations of NAEP, has determined that achievement levels are
    to be used on a trial basis and should be interpreted with caution. The NAEP achievement
    levels have been widely used by national and state officials.

    Student actions
    Administering the NAEP writing assessment on computer provided students access to many of
    the same word-processing tools that have become a common part of their writing experiences
    in and out of school. Also, computer delivery of the assessment allowed for universal design
    features that remove barriers and make the assessment accessible to a wider population of
    students.

    In addition to the number of key presses students made in the process of completing their
    responses to the writing tasks, data were collected on how often they engaged in 23 other
    actions related to editing, formatting, viewing, and reviewing text. While exploratory in nature,
    the collection of data on the frequency with which students used various word-processing tools
    during the assessment offers new insight into how students interact with technology during a
    timed assessment, and the possible relationship between that interaction and performance.


    Accommodations and Exclusions in NAEP
    It is important to assess all selected students from the population, including students with
    disabilities (SD) and English language learners (ELL). To accomplish this goal, many of the
    same accommodations that students use on other tests are provided for SD and ELL students
    participating in NAEP. Some of the testing accommodations that are provided to SD/ELL students
    in NAEP paper-and-pencil assessments are part of the universal design of the computer-based
    assessment, which seeks to make the assessment available to all students. For example, the
    font size adjustment feature available to all students taking the computer-based assessment is
    comparable to the large-print assessment book accommodation in the paper-and-pencil assessment,
    and the digital text-to-speech component takes the place of the read-aloud accommodation
    for paper-and-pencil assessments. However, there were still some accommodations available to SD
    and ELL students taking the computer-based writing assessment that were not available to other
    students, such as extended time and breaks.

    Even with the availability of accommodations, some students may be excluded. The national
    exclusion rates for the 2011 writing assessment were 2 percent at both grades 8 and 12.
    More information about NAEP’s policy on the inclusion of special-needs students is available at
    http://nces.ed.gov/nationsreportcard/about/inclusion.asp.

    Interpreting the Results
    The results from the 2011 writing assessment provide the public, policymakers, and educators
    with important new information about students’ writing achievement and the nature of their
    performance in different communicative situations. There are, however, limitations to the range
    and scope of skills that NAEP can assess because, like most standardized assessments, NAEP
    is an on-demand assessment with limited time and resources. Therefore, the assessment
    results should not be interpreted as a complete representation of student writing performance.

    NAEP reports results using widely accepted statistical standards; findings are reported
    based on a statistical significance level set at .05 with appropriate adjustments for multiple
    comparisons (see the Technical Notes for more information). Only those differences that are
    found to be statistically significant are discussed as higher or lower.
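    The report does not name the multiple-comparison procedure in this section (see the Technical Notes). As a hedged illustration of what such an adjustment looks like, the sketch below implements Benjamini-Hochberg false discovery rate control, one widely used adjustment; this is an assumption for illustration, not a statement of NAEP's exact method.

    ```python
    # Hedged illustration: one common multiple-comparison adjustment, not
    # necessarily the procedure NAEP applies.
    def benjamini_hochberg(p_values, alpha=0.05):
        """Return the indices of hypotheses rejected under BH FDR control."""
        m = len(p_values)
        order = sorted(range(m), key=lambda i: p_values[i])
        k = 0  # largest rank whose ordered p-value clears its BH threshold
        for rank, i in enumerate(order, start=1):
            if p_values[i] <= alpha * rank / m:
                k = rank
        return sorted(order[:k])  # reject the k smallest p-values
    ```

    With three comparisons yielding p-values of 0.01, 0.04, and 0.20, only the first survives this adjustment, whereas an unadjusted .05 cutoff would also flag the second. This is why adjusted findings are the only ones discussed as higher or lower.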

    Although students’ performance is compared across demographic characteristics and
    educational experiences, the results cannot be used to establish a cause-and-effect
    relationship between student characteristics and achievement. Many factors may
    influence student achievement, including educational policies and practices,
    available resources, and the demographic characteristics of the student body.
    These factors may vary among student groups.


    http://nces.ed.gov/nationsreportcard/about/inclusion.asp

    Learn more. See more. Do more online.
    Find writing assessment results, analyze data, view sample questions,
    and more with these helpful online resources.

    NAEP 2011 Writing Framework
    Learn how the NAEP writing assessment is designed to measure students’
    writing at grades 4, 8, and 12.
    http://www.nagb.org/publications/frameworks/writing-2011

    NAEP Writing Assessment Questions
    Access all released questions and scoring guides from the 2011 assessment.
    http://nces.ed.gov/nationsreportcard/itmrlsx/search.aspx?subject=writing

    Assessment Interface Examples
    View a sample of the computer-based environment.
    http://nationsreportcard.gov/writing_2011/writing_tools.asp

    Writing Highlights from NAEP
    Receive an in-depth overview of results and scores.
    http://nationsreportcard.gov/writing_2011/

    The NAEP Data Explorer
    Generate customized tables from the NAEP writing data.
    http://nces.ed.gov/nationsreportcard/naepdata/




    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center
    for Education Statistics, National Assessment of Educational Progress (NAEP),
    2011 Writing Assessment.

    GRADE 8

    Twenty-seven percent of eighth-graders perform at or above Proficient

    The average eighth-grade writing score of 150 on the 0–300 scale establishes
    the benchmark against which students’ performance on future NAEP writing
    assessments can be compared.

    The percentages of students performing at each of the three achievement levels
    in 2011 provide information on the proportion of students demonstrating varying
    levels of writing skills and knowledge. Fifty-four percent of eighth-graders
    performed at the Basic level and 80 percent1 performed at or above the Basic level
    in writing in 2011 (figure 1). Twenty-four percent performed at the Proficient level
    and 3 percent performed at the Advanced level.

    1 The percentage is based on the sum of the unrounded percentages as opposed to the rounded percentages shown in the figure.
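    The rounding caveat in the footnote can be illustrated numerically. The unrounded values below are hypothetical, chosen only to show how individually rounded percentages can sum to 81 while the unrounded total rounds to 80; they are not the actual NAEP figures.

    ```python
    # Hypothetical unrounded percentages (Basic, Proficient, Advanced) chosen
    # to illustrate the footnote: each component rounds to the reported 54, 24,
    # and 3, which sum to 81, yet the unrounded total (79.8) rounds to 80.
    components = [53.6, 23.6, 2.6]
    sum_of_rounded = sum(round(c) for c in components)  # 54 + 24 + 3 = 81
    rounded_sum = round(sum(components))                # round(79.8) = 80
    ```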

    Figure 1. Achievement-level results in eighth-grade NAEP writing: 2011

    NOTE: Detail may not sum to totals because of rounding.


    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP),
    2011 Writing Assessment.


    Asian students score higher than other racial/ethnic groups
    In 2011, NAEP results were available for seven racial/ethnic groups: White, Black, Hispanic, Asian,
    Native Hawaiian/Other Pacific Islander, American Indian/Alaska Native, and students categorized
    as two or more races (multiracial). The average writing score for Asian students was higher than
    the scores for the other six racial/ethnic groups (figure 2).

    A lower percentage of Asian students than White, Black, Hispanic, and multiracial students
    performed at the Basic level in 2011 (figure 3). The percentages of Asian students at Proficient
    and at Advanced were higher than the percentages of White, Black, Hispanic, and multiracial
    students. Higher percentages of White students than Black and Hispanic students performed at
    the Proficient and Advanced levels.

    Figure 2. Percentage of students and average scores in eighth-grade NAEP writing, by race/ethnicity: 2011

    # Rounds to zero.

    NOTE: Black includes African American, and Hispanic includes Latino. Race categories exclude Hispanic origin. Detail may not sum to totals because of rounding.

    Figure 3. Achievement-level results in eighth-grade NAEP writing, by race/ethnicity: 2011

    NOTE: Black includes African American, and Hispanic includes Latino. Race categories exclude Hispanic origin. Detail may not sum to totals because of rounding.


    Female students perform higher than male students
    Female students scored 19 points2 higher on average than male students in 2011 at grade 8
    (figure 4). Differences in the performance of male and female students were also reflected
    in the achievement-level results, with a lower percentage of female than male students
    performing at the Basic level and higher percentages of female than male students at Proficient
    and at Advanced (figure 5).
    2 The score-point difference is based on the difference between the unrounded scores as opposed to the rounded scores shown in the figure.

    Figure 4. Percentage of students and average scores in eighth-grade
    NAEP writing, by gender: 2011

    NOTE: Detail may not sum to totals because of rounding.

    Figure 5. Achievement-level results in eighth-grade NAEP writing, by gender: 2011

    NOTE: Detail may not sum to totals because of rounding.
    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP),
    2011 Writing Assessment.



    Public school students score lower than private
    school students
    Ninety-two percent of eighth-graders attended public schools in 2011 (figure 6). The average
    writing score for students attending public schools was 16 points3 lower than the score for
    students attending private schools overall, and 18 points lower than the score for
    students attending Catholic schools specifically.

    The percentages of students performing at the Basic level did not differ significantly by the type
    of school they attended (figure 7). Lower percentages of public school students than private
    school students performed at the Proficient level and at Advanced. While the percentage of
    public school students at Proficient was also lower than the percentage of Catholic school
    students at Proficient, there was no significant difference in the percentages of public and
    Catholic school students at Advanced.
    3 The score-point difference is based on the difference between the unrounded scores as opposed to the rounded scores shown in the figure.

    Figure 6. Percentage of students and average scores in eighth-grade
    NAEP writing, by type of school: 2011

    NOTE: Private schools include Catholic, other religious, and nonsectarian private schools. Detail may not sum to totals
    because of rounding.

    Figure 7. Achievement-level results in eighth-grade NAEP writing, by type of school: 2011

    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP),
    2011 Writing Assessment.


    NOTE: Private schools include Catholic, other religious, and nonsectarian private schools. Detail may not sum to totals because of rounding.


    Student performance varies by family income
    Students’ eligibility for the National School Lunch Program (NSLP) is used in NAEP as an
    indicator of family income. Students from lower-income families are eligible for either free
    or reduced-price school lunch, while students from higher-income families are not (see the
    Technical Notes for eligibility criteria).

    Forty-two percent of eighth-graders were eligible for NSLP in 2011 (figure 8). Eighth-graders
    who were not eligible for NSLP scored higher on average than those who were eligible. In
    comparison to students who were not eligible for NSLP, a larger percentage of eligible students
    performed at the Basic level in 2011, and smaller percentages of eligible students performed
    at Proficient and at Advanced (figure 9).

    Figure 8. Percentage of students and average scores in eighth-grade
    NAEP writing, by eligibility for National School Lunch
    Program: 2011

    NOTE: Detail may not sum to totals because results are not shown for the “Information not available” category.

    Figure 9. Achievement-level results in eighth-grade NAEP writing, by eligibility for
    National School Lunch Program: 2011

    NOTE: Detail may not sum to totals because of rounding.
    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP),
    2011 Writing Assessment.



    Students in suburban schools score higher than students in
    other locations
    Student performance on the 2011 writing assessment varied based on the location of the
    schools they attended (see the Technical Notes for more information on how school location
    categories were defined). Students attending schools in suburban locations had a higher average
    score than students attending schools in other locations (figure 10). Scores for students who
    attended schools in rural and town locations were not significantly different from each other,
    and students in both locations had higher scores than students in cities.

    The percentages of eighth-graders performing at the Basic level in city and suburban schools
    were smaller than the percentages at Basic in town and rural locations (figure 11). Students
    attending suburban schools had higher percentages at Proficient and at Advanced than
    students attending schools in the other three locations.

    Figure 10. Percentage of students and average scores in eighth-grade
    NAEP writing, by school location: 2011

    NOTE: Detail may not sum to totals because of rounding.

    Figure 11. Achievement-level results in eighth-grade NAEP writing, by school location: 2011

    NOTE: Detail may not sum to totals because of rounding.
    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP),
    2011 Writing Assessment.



    About two-thirds of eighth-graders spend more than
    15 minutes a day writing for English class
    Students were asked how much time they spent in a typical school day writing a paragraph or
    more for their English/language arts (ELA) class. The writing could be on paper or on computer.
    Forty percent of students reported spending between 15 and 30 minutes on writing for their
    ELA class, while 21 percent reported spending between 30 and 60 minutes, and 4 percent
    reported spending more than 60 minutes (figure 12).

    In 2011, eighth-graders who reported spending between 30 and 60 minutes on writing in their
    ELA class scored higher on average than students who wrote more or less frequently.

    Figure 12. Percentage of students and average scores in eighth-grade NAEP writing, by student-reported
    time spent on writing assignments of a paragraph or more during English/language arts class
    in a typical school day: 2011

    NOTE: Detail may not sum to totals because of rounding.

    Students who use computers more frequently to draft and revise
    their writing score higher
    As part of the 2011 eighth-grade writing assessment, questionnaires were completed by the teachers
    of participating students. One of the questions asked teachers to report on how often they asked students to
    draft and revise their writing. A total of 44 percent4 of students had teachers who reported that they asked students
    to use computers for drafting and revising very often or always.

    Students whose teachers more frequently asked them to use the computer to draft and revise their writing scored
    higher than those whose teachers did so less frequently (figure 13). Students whose teachers never asked them to
    draft and revise their writing on a computer scored the lowest.

    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Assessment.


    4 The percentage is based on the sum of the unrounded percentages as opposed to the rounded percentages shown in the figure.

    Figure 13. Percentage of students and average scores in eighth-grade NAEP writing, by teacher-reported frequency with
    which they ask their students to use a computer for drafting and revising writing assignments: 2011

    NOTE: Detail may not sum to totals because of rounding.

    The frequency with which students’ teachers had them use computers to draft and revise their writing differed by
    students’ eligibility for NSLP and whether they attended public or private schools (table 2). In 2011, larger
    percentages of students who were eligible for school lunch than those who were not eligible had teachers who reported
    ages of students who were eligible for school lunch than those who were not eligible had teachers who reported
    never or sometimes asking students to use computers to draft and revise their writing. On the other hand, higher
    percentages of students who were not eligible had teachers who reported always or very often asking them to use
    computers for drafting and revising. A larger percentage of students attending public schools than private schools
    had teachers who reported never or hardly ever asking students to use computers to draft or revise their writing.

    Table 2. Percentage of students assessed in eighth-grade NAEP writing, by teacher-reported frequency with which they ask
    their students to use a computer for drafting and revising writing assignments and selected characteristics: 2011

                                                  Frequency of computer use for writing assignments
    Characteristic                                Never or hardly ever   Sometimes   Very often   Always or almost always

    Eligibility for National School Lunch Program
      Eligible                                    25                     40          23           12
      Not eligible                                15                     34          30           21

    Type of school
      Public                                      20                     37          26           17
      Private                                     12                     29          32           28
      Catholic                                     6                     34          32           28

    NOTE: Private schools include Catholic, other religious, and nonsectarian private schools. Detail may not sum to totals because of rounding.


    New era of computer-based testing provides insight into
    how students use word-processing tools
    The computer-based delivery of the NAEP writing assessment provided the opportunity to
    collect data on the extent to which students engaged in specific actions available to them in
    the word-processing software. Results for the student actions are reported as the percentages
    of students engaging in the action with varying frequency, and the average writing score for
    those students. For example, students who used the text-to-speech tool more frequently scored
    lower on average than those who engaged in the action less frequently (figure 14). A total of
    71 percent of eighth-grade students used the text-to-speech tool for the writing prompt one or more
    times (the tool was not available to use for their responses), and 29 percent did not use it at all.
    Although not shown here, a higher percentage of students identified with a disability (39 percent)
    than those not identified with a disability (30 percent) used the text-to-speech tool three or
    more times. Further information about student actions is available at
    http://nationsreportcard.gov/writing_2011/writing_tools.asp.

    Figure 14. Percentage of students and average scores in eighth-grade NAEP writing, by
    number of times text-to-speech tool was used during assessment: 2011

    NOTE: Detail may not sum to totals because of rounding.

    Students also had the option of using an online thesaurus tool to enhance or improve their
    writing. Students who used the thesaurus scored higher, on average, than students who did
    not use it, and students who used it two or more times scored higher than students who used
    it only once (figure 15). Seventy-one percent of eighth-grade students did not use the
    thesaurus tool and 29 percent used the tool one or more times during the assessment. Although not
    shown here, a lower percentage of students identified as English language learners (6 percent)
    than non-English language learners (12 percent) used the thesaurus tool two or more times.

    Figure 15. Percentage of students and average scores in eighth-grade NAEP writing, by number of
    times thesaurus tool was used during assessment: 2011

    NOTE: Detail may not sum to totals because of rounding.
    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP),
    2011 Writing Assessment.




    In addition to the number of key presses students made in the process of completing their
    responses to the writing tasks, data were collected on how often they engaged in 23 other actions
    related to editing, formatting, viewing, and reviewing text. The extent to which students made
    use of each of the tools that were available to them varied. For example, almost all eighth-graders
    used the backspace key at least one time, while 20 percent or less used the copy, cut, or
    paste tools (figure 16).

    Figure 16. Percentage of eighth-graders who used various student actions at least once
    during the NAEP writing assessment: 2011

    Other results for student actions, including the percentages of students engaging
    in specific actions with varying frequency and the average writing scores for those
    students, are available on the NAEP website at http://nationsreportcard.gov/
    writing_2011/student_action_results.asp and in the NAEP Data Explorer at
    http://nces.ed.gov/nationsreportcard/naepdata/ within the Student Factors category.


    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP),
    2011 Writing Assessment.

    A closer look at lower- and higher-performing students
    The summary of results presented below shows how students performing at the lower end of the scale
    (below the 25th percentile) and those scoring at the higher end (above the 75th percentile) differed in
    terms of their demographics, educational experiences, and use of word-processing tools available to them.

    Among eighth-graders who scored below the 25th percentile (i.e., below a score of 127) in 2011:

    Student demographic characteristics
    • 39% were White, 25% were Black, 31% were Hispanic, and 3% were Asian.
    • 67% were eligible for free/reduced-price school lunch.
    • 86% had computers in the home.

    Student educational experiences and attitude toward writing
    • 15% reported always using the computer to make changes to something they wrote for school.
    • 24% never use a computer to write for school assignments.
    • 24% had teachers who always have them use word-processing tools.
    • 34% agreed or strongly agreed that writing is one of their favorite activities.

    Students’ use of word-processing tools
    • 5% used the backspace key more than 500 times.
    • 31% right-clicked to access spell-check 1–10 times.
    • 45% accessed the text-to-speech function 3 or more times.

    Among eighth-graders who scored above the 75th percentile (i.e., above a score of 175) in 2011:

    Student demographic characteristics
    • 73% were White, 6% were Black, 10% were Hispanic, and 8% were Asian.
    • 18% were eligible for free/reduced-price school lunch.
    • 99% had computers in the home.

    Student educational experiences and attitude toward writing
    • 49% reported always using the computer to make changes to something they wrote for school.
    • 5% never use a computer to write for school assignments.
    • 38% had teachers who always have them use word-processing tools.
    • 58% agreed or strongly agreed that writing is one of their favorite activities.

    Students’ use of word-processing tools
    • 41% used the backspace key more than 500 times.
    • 57% right-clicked to access spell-check 1–10 times.
    • 18% accessed the text-to-speech function 3 or more times.

Assessment Content
Additional insight into students’ performance on the NAEP writing assessment can be obtained by examining what eighth-graders are expected to know and be able to do in relation to how they performed on one of the assessment tasks designed to measure their writing skills. This section presents the achievement-level descriptions for writing at grade 8 and examples of student performance in response to a grade 8 task.

    Writing Achievement-Level Descriptions for Grade 8
    The specific descriptions of what eighth-graders should know and be able to do at the Basic,
    Proficient, and Advanced writing achievement levels are presented below. NAEP achievement
    levels are cumulative; therefore, student performance at the Proficient level includes the compe-
    tencies associated with the Basic level, and the Advanced level also includes the skills and
    knowledge associated with both the Basic and the Proficient levels. The cut score indicating
    the lower end of the score range for each level is noted in parentheses.

    Basic (120)
    Eighth-grade students writing at the Basic level should be able to address the tasks appropriately
    and mostly accomplish their communicative purposes. Their texts should be coherent and
    effectively structured. Many of the ideas in their texts should be developed effectively. Supporting
    details and examples should be relevant to the main ideas they support. Voice should align with
    the topic, purpose, and audience. Texts should include appropriately varied uses of simple,
    compound, and complex sentences. Words and phrases should be relevant to the topics,
    purposes, and audiences. Knowledge of spelling, grammar, usage, capitalization, and punctuation
    should be made evident; however, there may be some errors in the texts that impede meaning.

    Proficient (173)
    Eighth-grade students writing at the Proficient level should be able to develop
    responses that clearly accomplish their communicative purposes. Their texts
    should be coherent and well structured, and they should include appropriate
    connections and transitions. Most of the ideas in the texts should be devel-
    oped logically, coherently, and effectively. Supporting details and examples
    should be relevant to the main ideas they support, and contribute to overall
    communicative effectiveness. Voice should be relevant to the tasks
    and support communicative effectiveness. Texts should include a
    variety of simple, compound, and complex sentence types combined
    effectively. Words and phrases should be chosen
    thoughtfully and used in ways that contribute to
    communicative effectiveness. Solid knowledge of
    spelling, grammar, usage, capitalization, and
    punctuation should be evident throughout the
    texts. There may be some errors, but these errors
    should not impede meaning.

21 WRITING 2011


    Advanced (211)
    Eighth-grade students writing at the Advanced level should be able to construct skillful responses that
    accomplish their communicative purposes effectively. Their texts should be coherent and well structured
    throughout, and they should include effective connections and transitions. Ideas in the texts should be
    developed logically, coherently, and effectively. Supporting details and examples should skillfully and effec-
    tively support and extend the main ideas in the texts. Voice should be distinct and enhance communicative
    effectiveness. Texts should include a well-chosen variety of sentence types, and the sentence structure
    variations should enhance communicative effectiveness. Words and phrases should be chosen strategically,
    with precision, and in ways that enhance communicative effectiveness. An extensive knowledge of spelling,
    grammar, usage, capitalization, and punctuation should be evident throughout the texts. Appropriate use
    of these features should enhance communicative effectiveness. There may be a few errors, but these errors
    should not impede meaning.

    Sample Task: Writing to Convey Experience
    Whether writing to convey a real or an imaginary experience, the task of the writer is to wield the language in such
    a way that the experience becomes vivid to the reader. When writing to convey experience, the writer employs
    description, voice, and style to evoke a sense of an event or of emotions associated with the events described.

    One of the writing tasks from the eighth-grade assessment asked students to immerse themselves in an
    imaginary situation and to write about it as if from personal experience. In the Lost Island task, students
    listened to an audio recording of atmospheric sounds while reading a few sentences from an imaginary journal.
    The audio provided the sound of waves lapping on the shore, the squawking of birds, as well as the sound
    of footsteps in the sand to create a sense of the island world that the students were to imagine exploring.
    Students’ responses to this task included both journal-style narratives as well as stories told in the third
    person. Responses were rated with a six-level scoring guide ranging from “Little or no skill” to “Effective.”

LOST ISLAND TASK SCREENSHOT

Task includes an audio clip of a journal entry about people exploring an island.



    Range of eighth-grade skills when writing to convey experience
    The item map below illustrates the range of writing skills demonstrated by students when writing to the
Lost Island task. The responses for each of the credited score categories⁵ are mapped at different points on
    the NAEP writing scale and fall within the achievement level ranges Basic, Proficient, and Advanced, or in the
    range below Basic. The cut score at the lower end of the range for each achievement level is boxed. The criteria
    for specific score levels reflect the skills demonstrated by eighth-graders when writing to convey experience.
    In reading the map, it is important to remember that the score levels do not have a direct correspondence to
    the achievement level ranges, but indicate where performance mapped for this particular writing task. For
    example, for this task, students performing at the Basic level with a score of 155 were likely to compose a
    coherent story using some relevant details. Students performing within the Proficient range with a score of 195
were likely to produce mostly well-controlled stories using a greater variety of sentence structures and thoughtful
    word choices. Students performing at the Advanced level with a score of 225 were likely to develop a story that
    consistently displayed a skillful use of language and technique to fully accomplish the purpose of the task.

    However, student performance varies by task, and ratings for different tasks may map at different points of
    the writing scale. For other tasks, responses rated as “Adequate” may fall within the Proficient range, or a
    “Developing” response might fall within the Basic range.
⁵ The lowest rating, “Little or no skill,” receives 0 credit and is not scaled.

GRADE 8 NAEP WRITING ITEM MAP
(Ratings of responses to the Lost Island task mapped onto the NAEP 0–300 writing scale; achievement-level cut scores: Basic 120, Proficient 173, Advanced 211)

Scale score 225: “Effective” story about an experience on a remote island. Students writing at the Effective level consistently conveyed an experience using well-chosen detail, appropriate organizational structures, precise word choices, and varied sentence structure.

Scale score 195: “Competent” story about an experience on a remote island. Students writing at the Competent level conveyed an experience using some well-chosen detail, mostly appropriate organizational structures, some precise word choices, and some varied sentences.

Scale score 155: “Adequate” story about an experience on a remote island. Students writing at the Adequate level used some detail, but detail did not always convey an experience, while organization was somewhat loose, word choices clear rather than precise, and sentence structure relatively unvaried.

Scale score 116: “Developing” story about an experience on a remote island. Students’ responses at the Developing level showed deficits in development, organization, and/or language; the experience was thus unevenly conveyed.

Scale score 79: “Marginal” story about an experience on a remote island. Students’ responses at the Marginal level showed severe deficits in development, organization, and/or language; little experience was thus conveyed.

    NOTE: The sample grade 8 writing task in the 2011 writing assessment was mapped onto the NAEP 0–300 writing scale. The map shows, for each level on the scoring guide from “Marginal” through “Effective,” the scale
    score attained by students who had a 65 percent probability of attaining that level or higher for the selected task. Scale score ranges for writing achievement levels are referenced on the map.
    SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Assessment.
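The mapping convention in the note (a rating mapped at the scale score where students had a 65 percent probability of attaining it or higher) can be sketched numerically. The logistic response model and the item parameters below are illustrative assumptions only, not NAEP's actual procedure or estimates.

```python
import math

def p_at_or_above(theta, a, b):
    """Two-parameter logistic probability that a student at scale score
    theta attains the rating or higher (a = discrimination, b = location;
    both hypothetical here)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def mapped_score(a, b, rp=0.65, lo=0.0, hi=300.0, tol=1e-6):
    """Bisection solve on the 0-300 scale for the score where the
    probability of attaining the rating or higher equals rp."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if p_at_or_above(mid, a, b) < rp:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# With parameters chosen so the solution lands near 225, the mapped score
# is where the response-probability curve crosses 0.65.
theta = mapped_score(a=0.05, b=212.6)
```

Under this sketch, the "Effective" rating would map near a scale score of 225, as on the grade 8 item map; a more discriminating item (larger `a`) would map closer to its location parameter.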


    The table below shows the percentage of students within each achievement level whose responses to the
    Lost Island task were rated as “Adequate” or higher. For example, among eighth-graders performing at the
    Basic level, 53 percent provided responses that were rated “Adequate” or higher and among students at the
    Proficient level, 95 percent wrote responses to the task that were rated “Adequate” or higher.

    Five percent of eighth-graders provided responses to the Lost Island task that were rated “Effective” and
    14 percent provided responses rated “Competent.” Three percent of eighth-graders received a rating of
    “Little or no skill” on this task.

Percentage of eighth-grade students, by response rating for Lost Island task: 2011

Effective: 5    Competent: 14    Adequate: 37    Developing: 30    Marginal: 11    Little or no skill: 3

NOTE: Detail may not sum to totals because of rounding.

Percentage of eighth-grade student responses to Lost Island task rated “Adequate” or higher, by achievement level: 2011

Below Basic: 4    At Basic: 53    At Proficient: 95    At Advanced: 100

    Example of a “Developing” response
    The sample student response shown here was rated as demonstrating “Developing” skill in responding to the
    writing task. It begins with a rather flat explanation of what happened to bring them to the island and contin-
    ues the narrative with a list-like focus on action but no discussion of plot elements. Apart from the “ferocious
    monkeys” and the sudden “loud buzzing noise,” the response incorporates no descriptive vocabulary to evoke
    what it felt like to have the experience or what the island was like. The narrative voice is not controlled as it
    switches from first person to third person. Errors in mechanics and verb tenses are frequent enough to be
    distracting. This response demonstrates skills associated with performance below the Basic level. Eighth-grade
    students performing in the range below Basic respond to the topic but do not sufficiently develop ideas to
    convey the experience.

    The reason we are on this island is because our plane crashed offshore about one hundred feet. Then
    we drifted here. The plane has three days worth of food and 2 days worth of water. We could probably
    use some of the things on the plane to build shelter and a fire. There was a total of 10 people on the plane,
    nobody died when it crashed thinks to the pilot. We are college students on a vacation to Hawaii, but we
    hit a bad storm and drifted way off course. The pilot saw the island and crash landed near it. There is five
    guys and five girls, the girls will get to sleep inside the plane so they don’t get to cold or scared. The guys
    will sleep outside near the plane next to the fire because the plane is now not big enough to hold 10 people.

    The next day, the guys ventured further inland for some fruit to eat for breakfast. All of the sudden they
    were attacked by monkeys, so they ran back and squeezed in the plane until the ferocious monkeys left.
    They ate what fruit they were able to grab and then went to the ocean to catch some fish. While the guys
    were outside fishing, the girls built a sign out of logs that said “HELP”. If they saw a boat or airplane, they
    would light it on fire so that they could be rescued. Two days past, then suddenly a loud buzzing noise
    came from overhead, it was an airplane. They were saved! The plane had saw the smoke from their fire. It
    landed safely near the shore and picked them up.

    Example of an “Adequate” response
    The sample student response shown here was rated as demonstrating “Adequate” skill in responding to the
    writing task. It takes elements from the journal excerpt—the animal, the mountains—and incorporates them
    into a short narrative using both description and action. While simple sentences predominate, there is some
    variety in sentence structure. Ideas are presented clearly and in a logical progression, but without any stylistic
    transition between them. The use of such words as “exotic,” “apex,” and “magnificent” contribute to a sense of
    voice that aligns with the experience the student has been asked to convey. This response demonstrates skills
    associated with performance at the Basic level. Eighth-grade students at this level are able to mostly accom-
    plish the communicative purpose of a task clearly and coherently.

    The animal looked like a brontosaurus, but bigger and more muscular. There were many more animals
    on the island also. there were strange animals that looked like moneys, but they had two tails and four
    arms. Another animal looked like a rhinosaurus, but had four eyes and five horns on his head and back. As
    we watched the creatures, we decided to explore the rest of the island. We cut through fields where won-
    derful foods grew. We had never seen these types of foods before. They food was huge and exotic looking,
    but they were ripe and delicious. Then we made our way to the mountains, they were so big and beauti-
    ful. We thought it would be exciting to climb up these mountains and get a better view of the island. The
    mountain was steep and had a lot of the same strange trees and colorful flowers. When we made it to the
    apex of the mountain, the scenery was magnificent! There were many trees and animal life, the island had
    warm beaches and clear lakes and a clear blue sky. The island was a sight to see, it looked as if had been
    untouched by the pollution and the oil spill. The island was full of beauty.

    Example of a “Competent” response
    The following sample student response was rated as “Competent” in responding to the writing task. Despite
    some awkward moments, the response demonstrates a control of language in a sustained and descriptive
    narrative. A variety of sentence types are used effectively to create both scene and character. In parts of the re-
    sponse, sentence length is used to emphasize meaning, as in the fourth paragraph, where the second sentence
    effectively communicates a sense of suspense. Transitions between most of the sentences and paragraphs
    contribute to the narrative flow. Although the ending is somewhat abrupt, it does create a sense of conclusion.
    This response demonstrates skills associated with performance at the Proficient level. Eighth-grade students at
    this level are able to develop thoughtful responses that accomplish the communicative purpose.

    It was a creature none of us has ever seen before. We were awed by its beauty with its purple furry
    body and head like a cat. When this creature saw us it was not mean, it was not harsh, it simply bent
    down and purred. We were welcomed.

    As we explored the island on the back of the creature we call Bila, we saw more diverse creatures and
    we wanted to captivate this experience. We had a camera with us and with this camera we took pictures
    of Bila and her friend Shia, who looked just like her only orange. We took pictures of the trees, which were
    enormously tall—only not quite as tall as Bila—and blue! they had strange fruits such as these things that
    were yellow like lemons, but were in the shape of a box.

    That was just the first night though. We camped out where Bila stayed and set up a campfire. Bila
    and Shia’s reaction made it seemed as though they never saw a fire before. There were terrified and
    intrigued at the same time. As we were talking Bila said something about the Shigrutan, who didn’t like
    new creatures. They talked about them as though they were demons. They told us to never cross the line
    of invisibility, where the Shigrutan would be waiting.


    However, one day, about a week into our adventure, one of our followers, Mike, accidentally
    stepped over the line while getting picking some tias—a green fruit shaped like a triangle. As prom-
    ised, the Shigrutan were waiting. They started poking him with sticks and throwing some strange
    looking vegetables at him. Bila and Shia rushed over to get him and started towards the line again.
    They looked mad and afraid that he didn’t listen.

    As Bila and Shia were re-explaining the dangers of crossing the line, my curiosity got the better
    of me. I started towards the line and as Bila started to realize, I was already running. I got to the line
    and started hearing the brushing of bushes and leaves. I started to get nervous, but I wasn’t afraid. I
    crossed the line and everything happened in a rush. There were sticks and vegetables flying and then
    all of a sudden an ear-piercing scream.

    What I took to be the leader of the Shigrutan stepped out signaling his followers to cease fire. He
    wore a yellow wrap around his body, which was blue. He had long black hair that went to the middle
    of his back in a ponytail. His eyebrows looked like they were made out of gems. He was a beautiful
    creature and I felt like I was being pulled to him. I walked towards him and could hear the sharp intake
    of breath from behind me.

    As I got to him, he wrapped me up in a red cloth and threw me on his back. He carried me away
    to his dungeon as I could hear Bila and Shia yelling from behind me. I didn’t care though. Some how
    I felt safe with him. When we got to his dungeon he sat me down and told me about his life. He told
    me that his life was just like mine, except he was always an outcast with his blue skin. His mother told
    him about this place, and he always imagined that one day he would find it. His mother passed away
    when he was seventeen and he went on a mission to find the place his mother was from. He set off on
    the journey when he was nineteen.

    When he got here, he saw creatures, strangely unique creatures. He was pulled in and felt like he
    knew this place just from what his mother would tell him. He had found his home and quickly became
    the king of his loyal followers, the Shigrutan. When he was done telling the story that I became
    captivated in, he asked me about myself. I told him that I was from America and had come all the way
    down here for an adventure. To go where no other has gone before. Even now, after I’ve found what I
    was looking for, there was no way that I was leaving. He got down on one knee and proposed. I said
    even though I don’t know you, I feel strangely connected. I answered yes but with one condition. After
    he asked me what that one condition was, I told him that he had to let my friends come with me.

    We got married and I never had to leave the island. This was where I was staying whether or not
    my friends were.

Writing tasks available online

More NAEP writing tasks are available online at http://nces.ed.gov/nationsreportcard/itmrlsx/search.aspx?subject=writing and include:

- One released task for each writing purpose at grade 8, including audio and video
- A scoring rubric for each task
- Samples of responses at each scoring level
- Student performance data

Explore the interface for the writing tasks at http://nationsreportcard.gov/writing_2011/writing_tools.asp.


    Example of an “Effective” response
    The sample student response shown below was rated as showing “Effective” skill in responding to the writing
    task. Well-chosen details are used throughout the response, often in complex sentences to modify or elabo-
    rate on the idea. The consistently high level of word choice strategically enhances what the author wants to
    communicate; for example, “We slowly trudged through the dense sand back to our boat…” conveys the mood
    of the group that just witnessed the fight described in the first paragraph. Although it has a few confusing
    moments, the narrative coheres by virtue of a distinct voice and overall structure. This response demonstrates
    skills associated with performance at the Advanced level. Eighth-grade students at this level establish a distinct
    voice and employ an overall structure to effectively accomplish the communicative purpose.

    We were nervous. What a tall animal it was! It stood on four legs and slowly munched on the leaves
    contentedly. None of our group had ever seen an animal this large, so we sat down to try to think about
    what it could be. Just then, a smaller animal with two legs and tiny little arms came galloping upon the first
    animal and tore off one of its legs. The fight was bloody, but quickly ended with the second animal enjoying
    his victory and savoring the taste of the first animal. That’s when I knew. We were in an island with a bunch
    of dinosaurs.

    Our group started to panic. So I tried to calm them down by simply telling them that we could get far,
    far away with just the boat that we had traveled on. Most of them seemed to agree. We slowly trudged
    through the dense sand back to our boat, which was now in sight. But it seemed that time stopped and the
    next thing I saw was a gigantic foot on top of our ruined boat. A dinosaur’s face 20 feet above leered down
    at us and growled. I could tell by popular knowledge that it was a tyrannosaurus rex, one of the most fear-
    some creatures ever to walk the earth.

    It was a mean sight to see. It was terrifying, with its huge claws and teeth and the fact that it was much
    larger than us. The only thing that reassured me was its tiny little arms. They kind of ruin the whole bad guy
    look for him. But I still am terrified. It’s pandemonium, with everybody running and screaming and the t-rex
    behind us, picking us off one by one. I see many of my closest friends disappear behind the gigantic white
    teeth. And just as quickly as he appeared, he disappeared. Our group was stunned. Where could he have
    gone? We thought silently, worried looks behind our faces. When we couldn’t get anything out of thinking, we
    started to make camp, since it was getting dark by then. We didn’t have any materials and it was beautiful
    weather, with the temperature hanging around 70 degrees Fahrenheit. Se we just grabbed some leaves and
    branches off of what looked like a palm tree and tried to get to sleep, although I don’t think anyone slept that
    night. I kept thinking to myself over and over, “We’re doomed.”

    When we woke up in the morning we started to gather materials to make a new boat to get onto some
    nearby islands that looked small and didn’t have any dinosaurs on them. We made great time, even working
    from dawn to sunset. We made around fifteen rafts that could carry our whole group. We made plans to get
    off the island by tomorrow morning. That’s when I started to feel optimistic about our chances of survival.

    The next morning, we took the rafts one by one down to the beach. I started to lead the group into a good
    rowing formation, since we had made oars also. I could see some dinosaurs cawing in the distance. Seemed
    normal enough to me. Then I thought that only flying things cawed. And that’s when I saw an enormous
    pterodactyl swoop down onto me and lift me up into the air.

    Flying was terrifying. The pterodactyl seemed to have every intention of dropping me into the ocean. I
    looked up at its orange skin and hated it for all the hope it had taken away from my life. Then I felt it lower-
    ing. It flew lower and lower until we were five feet off one of those nearby islands I wrote about earlier. Then
    it dropped me. It was a hard landing on the island, but I shook it off and came up O.K. I looked up and the
    last thing I saw was the pterodactyl swooping low over the horizon. I thanked it in my mind for what it had
    given me.

    By the time the next morning came, the rest of my group was in the rafts on the shore of the island. I was
    so happy to see them all and to learn that one of the group mate’s cell phones had picked up a signal, and
    help was coming. I finally had some hope.


    Twenty-seven percent of
    twelfth-graders perform
    at or above Proficient
    The average twelfth-grade writing score of 150 on the 0–300 scale establishes the
    baseline for comparing students’ performance on future writing assessments.

    The percentages of students performing at each of the three achievement levels in
2011 provide information on the proportion of students demonstrating varying levels
    of writing skills and knowledge. Fifty-two percent of twelfth-graders performed at
    the Basic level and 79 percent performed at or above the Basic level in writing in 2011
    (figure 17). Twenty-four percent performed at the Proficient level and 3 percent
    performed at the Advanced level.

Figure 17. Achievement-level results in twelfth-grade NAEP writing: 2011

NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Assessment.

GRADE 12

    White, Asian, and multiracial students perform
    comparably at grade 12
    Average writing scores for White students, Asian students, and students of two or more races
    (multiracial) did not differ significantly from each other in 2011, and all three groups scored higher
    on average than Black, Hispanic, and American Indian/Alaska Native students (figure 18). There
    were no significant differences among the racial/ethnic groups in the percentages of twelfth-
    graders performing at the Basic level in 2011 (figure 19). The percentages of students at Proficient
    and at Advanced did not differ significantly for White, Asian, and multiracial students, and the
    percentages for all three groups were higher than the percentages of Black and Hispanic students
    at Proficient and at Advanced.

Figure 18. Percentage of students and average scores in twelfth-grade NAEP writing, by race/ethnicity: 2011

Figure 19. Achievement-level results in twelfth-grade NAEP writing, by race/ethnicity: 2011

# Rounds to zero.
NOTE: Black includes African American, and Hispanic includes Latino. Race categories exclude Hispanic origin. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Assessment.

    Female students score higher than male students
    In 2011, female students scored 14 points higher on average than male students at grade 12
    (figure 20). The percentages of students performing at the Basic level did not differ signifi-
    cantly by gender (figure 21). Higher percentages of female than male students performed at
    the Proficient level and at Advanced.

Figure 20. Percentage of students and average scores in twelfth-grade NAEP writing, by gender: 2011

Figure 21. Achievement-level results in twelfth-grade NAEP writing, by gender: 2011

NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2011 Writing Assessment.


    Students whose parents have higher levels of education
    score higher
    Twelfth-graders were asked to report the highest level of education completed by each parent.
    Students selected one of five response options: “did not finish high school,” “graduated from high
    school,” “some education after high school,” “graduated from college,” and “I don’t know.” Results
    are reported for the highest level of education for either parent.

    In 2011, twelfth-graders who reported higher levels of parental education had higher average
    writing scores than those who reported lower levels (figure 22). For example, students whose
    parents graduated from college scored higher on average than those whose parents had some
    education after high school, and they, in turn, scored higher than students whose parents’ highest
    level of education was high school.

    A similar pattern was observed in the achievement-level results—students who reported higher
    levels of parental education had higher percentages at Proficient and at Advanced in comparison to
    the percentages for students who reported lower levels of parental education (figure 23).

    Figure 22. Percentage of students and average scores in twelfth-grade NAEP
    writing, by highest level of parental education: 2011

    NOTE: Detail may not sum to totals because results are not shown for students who reported that they did not know the highest education
    level for either of their parents.

    Figure 23. Achievement-level results in twelfth-grade NAEP writing, by highest level of
    parental education: 2011

    # Rounds to zero.
    NOTE: Detail may not sum to totals because of rounding.




    Students in suburban schools score higher than students
    in cities and rural locations
    There were some differences in student performance on the 2011 writing assessment based
    on the location of the schools students attended. The average score for students who attended
    schools in suburban locations was higher than the scores for students who attended schools in
cities or rural locations, but did not differ significantly from the score for students in town
    locations (figure 24). Average scores for students attending schools in town, city, and rural locations
    did not differ significantly from each other.

    The percentages of twelfth-graders in city and suburban schools performing at the Basic level
    were smaller than the percentage of students at Basic in rural locations (figure 25). Students
    attending schools in suburban locations had a higher percentage at Proficient than students
    attending schools in the other three locations. The percentages of students at the Advanced
    level did not differ significantly by school location.

    Figure 24. Percentage of students and average scores in twelfth-grade NAEP
    writing, by school location: 2011

NOTE: Detail may not sum to totals because of rounding.

    Figure 25. Achievement-level results in twelfth-grade NAEP writing, by school location: 2011


    Students who write four to five pages a week for
    English/language arts homework score higher than
    those who write fewer pages
    Twelfth-graders were asked how many pages they wrote in a typical week for homework in their
    English/language arts (ELA) class. In 2011, the average writing score for twelfth-graders who
    reported writing between four and five pages for ELA homework did not differ significantly from
the score for students who wrote more than five pages, and was higher than the scores for
    students who wrote fewer than four pages (figure 26). Students who reported not writing any pages
    for homework scored lower than those selecting any of the other responses. Sixty-eight percent6
    of twelfth-graders reported writing up to three pages for their ELA homework in a typical week.



    6 The percentage is based on the sum of the unrounded percentages as opposed to the rounded percentages shown in the figure.
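The distinction the footnote draws can be seen in a toy calculation; the subgroup percentages below are invented for illustration and are not the NAEP figures:

```python
# Toy illustration of footnote 6: summing unrounded subgroup percentages
# can differ from summing the rounded values displayed in a figure.
# These percentages are invented, not NAEP data.
unrounded = [11.6, 24.6, 19.6, 12.4]

sum_then_round = round(sum(unrounded))             # 68.2 -> 68
round_then_sum = sum(round(p) for p in unrounded)  # 12 + 25 + 20 + 12 = 69

print(sum_then_round, round_then_sum)  # 68 69
```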

    Figure 26. Percentage of students and average scores in twelfth-grade NAEP writing,
    by student-reported number of pages written for English/language arts
    homework in a typical week: 2011

    NOTE: Detail may not sum to totals because of rounding.



    Students who use a computer more frequently to edit
    their writing score higher
    Twelfth-graders were asked how often during the school year they use a computer to make
changes to a paper or report (e.g., spell-check or cut and paste). Fifty-six percent of
    twelfth-graders reported always or almost always using a computer to make changes to their writing,
    and 4 percent reported never or hardly ever using one (figure 27). In 2011, twelfth-graders who
    reported more frequent use of a computer to edit their writing had higher average writing scores
    than those who reported less frequent use. For example, students who always or almost always
    used a computer to edit their writing scored higher on average than students who reported doing
    so very often, sometimes, or never or hardly ever.

    Figure 27. Percentage of students and average scores in twelfth-grade NAEP writing, by
    student-reported frequency with which they use a computer to make changes
    to writing assignments: 2011

    NOTE: Detail may not sum to totals because of rounding.

    The frequency with which students used a computer to make changes to a paper or report varied
    by level of parental education (table 3). In 2011, a higher percentage of students whose parents
    graduated from college than students whose parents had lower levels of education reported
    always or almost always using a computer to make changes.

    Table 3. Percentage of students assessed in twelfth-grade NAEP writing, by student-reported frequency
    with which they use a computer to make changes to writing assignments and highest level of
    parental education: 2011

Parental education level              Never or hardly ever   Sometimes   Very often   Always or almost always

    Did not finish high school                     7                24           29                  39
    Graduated from high school                     6                19           29                  46
    Some education after high school               4                14           27                  54
    Graduated from college                         3                10           22                  65
    NOTE: Detail may not sum to totals because of rounding.


    About 44 percent of students report writing
    is a favorite activity
    Twelfth-grade students were asked about the extent to which they agreed or disagreed with the
    statement, “writing is one of my favorite activities.” Thirty-three percent of students agreed that
    writing was one of their favorite activities in 2011 and 11 percent strongly agreed (figure 28).

    Students who strongly agreed with the statement scored higher on average than students who
    simply agreed, and scores for both groups were higher than the scores for students who disagreed
    or strongly disagreed.

    The proportion of students indicating that writing was one of their favorite activities varied by
    gender. Larger percentages of female than male students agreed or strongly agreed with the
    statement (table 4).

    Figure 28. Percentage of students and average scores in twelfth-grade NAEP writing, by
    student-reported level of agreement with the statement, “writing is one of my
    favorite activities”: 2011

    NOTE: Detail may not sum to totals because of rounding.


    Table 4. Percentage of students in twelfth-grade NAEP writing, by student-reported level of agreement with
    the statement, “writing is one of my favorite activities,” and gender: 2011

Gender     Strongly disagree   Disagree   Agree   Strongly agree

    Male               23            42       27          8
    Female             12            36       39         14
    NOTE: Detail may not sum to totals because of rounding.


    New era of computer-based testing provides insight into
    how students use word-processing tools
    The computer-based delivery of the NAEP writing assessment provided the opportunity to
    collect data on the extent to which students engaged in specific actions available to them in the
    word-processing software. Results for the student actions are reported as the percentages of
    students engaging in the action with varying frequency, and the average writing score for those
    students. For example, students who used the text-to-speech tool more frequently scored lower
    on average than those who engaged in the action less frequently (figure 29). A total of 52 percent
    of twelfth-grade students used the text-to-speech tool for the writing prompt one or more times (the
    tool was not available to use for their responses), and 48 percent did not use it at all. Although
    not shown here, a higher percentage of students identified with a disability (26 percent) than
    those not identified with a disability (16 percent) used the text-to-speech tool three or more
times. Further information about student actions is available at
    http://nationsreportcard.gov/writing_2011/writing_tools.asp.

    NOTE: Detail may not sum to totals because of rounding.

    Figure 29. Percentage of students and average scores in twelfth-grade NAEP writing, by
    number of times text-to-speech tool was used during assessment: 2011



    Students also had the option of using an online thesaurus tool to enhance or improve their
    writing. Students who used the thesaurus more frequently scored higher on average than those
who engaged in this action less frequently (figure 30). Sixty-nine percent of twelfth-grade
    students did not use the thesaurus tool at all and 15 percent used the tool two or more times during
    the assessment. Although not shown here, a smaller percentage of English language learners
    (6 percent) than non-English language learners (16 percent) used the thesaurus tool two or
    more times.

    NOTE: Detail may not sum to totals because of rounding.

    Figure 30. Percentage of students and average scores in twelfth-grade NAEP writing, by
    number of times thesaurus tool was used during assessment: 2011



    In addition to the number of key presses students made in the process of completing their
    responses to the writing tasks, data were collected on how often they engaged in 23 other actions
    related to editing, formatting, viewing, and reviewing text. The extent to which students made use
    of each of the tools that were available to them varied. For example, almost all twelfth-graders
    used the backspace key at least one time, while 20 percent or less used the copy, cut, or
    paste tools (figure 31).

    Figure 31. Percentage of twelfth-graders who used various student actions at least once
    during the NAEP writing assessment: 2011

    Other results for student actions, including the percentages of students engaging
    in specific actions with varying frequency and the average writing scores for those
students, are available on the NAEP website at
    http://nationsreportcard.gov/writing_2011/student_action_results.asp and in the NAEP Data
    Explorer at http://nces.ed.gov/nationsreportcard/naepdata/ within the Student Factors category.







    A closer look at lower- and higher-performing students
    The summary of results presented below shows how students performing at the lower end of the scale
    (below the 25th percentile) and those scoring at the higher end (above the 75th percentile) differed in
    terms of their demographics, educational experiences, and use of word-processing tools available to them.
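The percentile split described above can be sketched with a small calculation. NAEP computes percentiles from weighted score distributions; the sketch below uses the simpler nearest-rank definition of a percentile, and the scores are invented illustrative values, not NAEP data:

```python
import math

# Sketch of grouping examinees below the 25th and above the 75th percentile
# of a score distribution (nearest-rank percentile; invented scores).
scores = [96, 118, 125, 131, 140, 148, 152, 160, 168, 177, 183, 201]

def percentile(values, p):
    """Smallest value such that at least p percent of the data lie at or below it."""
    ordered = sorted(values)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

p25 = percentile(scores, 25)                  # lower cut score
p75 = percentile(scores, 75)                  # upper cut score
lower_group = [s for s in scores if s < p25]  # "below the 25th percentile"
upper_group = [s for s in scores if s > p75]  # "above the 75th percentile"

print(p25, p75, lower_group, upper_group)  # 125 168 [96, 118] [177, 183, 201]
```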

Among twelfth-graders who scored below the 25th percentile (i.e., below a score of 127) in 2011:

    Student demographic characteristics: 40% were White; 25% were Black; 29% were Hispanic;
    32% had at least one parent who graduated from college; 87% had computers in the home.

    Student educational experiences and attitude toward writing: 33% reported always using a
    computer to make changes to a paper or report (e.g., cut and paste, spell-check); 18% used a
    computer daily to write for school assignments; 34% agreed or strongly agreed that writing is
    one of their favorite activities.

    Students’ use of word-processing tools: 10% used the backspace key more than 500 times;
    49% right-clicked to access spell-check 1–10 times.

    Among twelfth-graders who scored above the 75th percentile (i.e., above a score of 175) in 2011:

    Student demographic characteristics: 78% were White; 5% were Black; 7% were Hispanic;
    69% had at least one parent who graduated from college; 99% had computers in the home.

    Student educational experiences and attitude toward writing: 77% reported always using a
    computer to make changes to a paper or report (e.g., cut and paste, spell-check); 39% used a
    computer daily to write for school assignments; 55% agreed or strongly agreed that writing is
    one of their favorite activities.

    Students’ use of word-processing tools: 67% used the backspace key more than 500 times;
    68% right-clicked to access spell-check 1–10 times.


    Writing Achievement-Level Descriptions for Grade 12
    The specific descriptions of what twelfth-graders are expected to be able to do at the Basic,
    Proficient, and Advanced writing achievement levels are presented below. As the NAEP achievement
    levels are cumulative, the expectations of student performance at the Proficient level include
    writing skills expected at the Basic level. Likewise, skills expected at the Advanced level of writing
    assume those specified for performance at the Basic and Proficient levels. The cut score indicating
    the lower end of the score range for each achievement level is noted in parentheses.

    Basic (122)
    Twelfth-grade students writing at the Basic level should be able to respond effectively to the
    tasks and accomplish their communicative purposes. Their texts should be coherent and well
    structured. Most of the ideas in their texts should be developed effectively. Relevant details and
examples should be used to support and extend the main ideas in the texts. Voice should
    support the communicative purposes of the texts. Texts should include appropriately varied simple,
    compound, and complex sentence types. Words and phrases should be suitable for the topics,
    purposes, and audiences. Substantial knowledge of spelling, grammar, usage, capitalization, and
    punctuation should be clearly evident. There may be some errors in the texts, but these errors
    should not generally impede meaning.

    Proficient (173)
    Twelfth-grade students writing at the Proficient level should address the tasks
    effectively and fully accomplish their communicative purposes. Their texts should
    be coherent and well structured with respect to these purposes, and they
    should include well-crafted and effective connections and transitions. Their
    ideas should be developed in a logical, clear, and effective manner. Relevant
details and examples should support and extend the main ideas of the texts
    and contribute to their overall communicative effectiveness. Voice should be
    relevant to the tasks and contribute to overall communicative effectiveness.
    Texts should include a variety of simple, compound, and complex sentence
    types that contribute to overall communicative effectiveness. Words
    and phrases should be chosen purposefully and used skillfully to
    enhance the effectiveness of the texts. A solid knowledge of spelling,
    grammar, usage, capitalization, and punctuation should be evident
    throughout the texts. There may be some errors in the texts, but
    they should not impede meaning.

Assessment Content
    Additional insight into students’ performance on the NAEP writing assessment
    can be obtained by examining what twelfth-graders are expected to know and
    be able to do in relation to how they performed on one of the assessment tasks
    designed to measure their writing skills. This section presents the achievement-
    level descriptions for writing at grade 12 and examples of student performance
    in response to a grade 12 task.



    Advanced (210)
    Twelfth-grade students writing at the Advanced level should be able to address the tasks strategically, fully
    accomplish their communicative purposes, and demonstrate a skillful and creative approach to constructing
and delivering their messages. Their texts should be coherent and well structured; they should include
    skillfully constructed and effective connections and transitions; and they should be rhetorically powerful. All of
    the ideas in their texts should be developed clearly, logically, effectively, and in focused and sophisticated
    ways. Supporting details and examples should be well crafted; they should skillfully support and extend the
    main ideas; and they should strengthen both communicative effectiveness and rhetorical power of the texts.
    A distinct voice that enhances the communicative effectiveness and rhetorical power of the texts should be
    evident. Texts should include a variety of sentence structures and types that are skillfully crafted and enhance
    communicative effectiveness and rhetorical power. Words and phrases should be chosen purposefully, with
    precision, and in ways that enhance communicative effectiveness and rhetorical power. A highly developed
    knowledge of spelling, grammar, usage, capitalization, and punctuation should be evident throughout the texts
    and function in ways that enhance communicative effectiveness and rhetorical power. There may be a few
    errors in the texts, but they should not impede meaning.

    Task includes an animated
    video presentation about young
    people’s use of technology.

VALUE OF TECHNOLOGY TASK SCREENSHOT

    Sample Task: Writing to explain
    When writing to explain, the task of the writer is to bring together relevant information and to present this
    information with focus and clarity so that the topic becomes understandable to a reader. The sequence of
    ideas, and how ideas are arranged, must cohere and contribute to the communicative purpose.

    One of the writing tasks from the twelfth-grade assessment asked students to write about a type of technology
    that they use in their lives and why they value that technology. The Value of Technology task began with a short
    video about young people’s use of technology. This video included animation and statistics about technology
    use. The written part of the task then specified an audience for students to address in explaining the value of a
    particular technology. Responses were rated using a scoring guide ranging from “Little or no skill” to “Effective.”



    Range of twelfth-grade skills when writing to explain
    The item map below illustrates the range of writing skills demonstrated by students when writing to the
    Value of Technology task. The responses for each of the credited score categories7 are mapped at different
    points on the NAEP writing scale and fall within the achievement level ranges Basic, Proficient, and Advanced,
    or in the range below Basic. The cut score at the lower end of the range for each achievement level is boxed.
    The criteria for specific score levels reflect the skills demonstrated by twelfth-graders when writing to this
    purpose (i.e., writing to explain). In reading the map, it is important to remember that the score levels do
    not have a direct correspondence to the achievement level ranges, but indicate where performance mapped
    for this particular writing task. For example, for this task, students performing at the Basic level with a score
    of 150 were likely to focus ideas clearly enough to provide some explanation. Students performing within
    the Proficient range with a score of 187 were likely to provide well-developed explanations with some ideas
    strengthening the clarity and progression. Students performing at the Advanced level with a score of 231 were
    likely to provide consistently controlled explanations enhanced by precise word choice and clearly focused
    ideas expressed in such a way that not only are the ideas clear, but also the relationships among them.

    However, student performance varies by task, and ratings for different tasks may map at different points of
    the writing scale. For other tasks, responses rated as “Adequate” may fall within the Proficient range, or a
    “Developing” response might fall within the Basic range.
    7 The lowest rating, “Little or no skill,” receives 0 credit and is not scaled.

GRADE 12 NAEP WRITING ITEM MAP

    Scale score   Rating of responses to Value of Technology task and rating criteria

    231   “Effective” explanation of the value of technology
          Students writing at the Effective level developed explanations with well-chosen details that
          enhance meaning, a clear progression of ideas, precise word choices, and well-controlled sentences.

    187   “Competent” explanation of the value of technology
          Students writing at the Competent level developed explanations with well-chosen details in parts
          of the response and an overall control of the progression of ideas and sentence structure.

    150   “Adequate” explanation of the value of technology
          Students writing at the Adequate level developed explanations using some details that do not
          enhance the clarity or progression of ideas, while organization was somewhat loose and sentence
          structure simple overall.

    115   “Developing” explanation of the value of technology
          Students’ responses at the Developing level showed deficits in development, organization, and/or
          language; their explanations were thus uneven in clarity and quality.

    78    “Marginal” explanation of the value of technology
          Students’ responses at the Marginal level showed severe deficits in development, organization,
          and/or language; their explanations were thus marginal in clarity and quality.

    Achievement-level cut scores on the 0–300 writing scale: Basic (122), Proficient (173), Advanced (210).
    NOTE: The sample grade 12 writing task in the 2011 writing assessment was mapped onto the NAEP 0–300 writing scale. The map shows, for each level on the scoring guide from “Marginal” through “Effective,” the scale score
    attained by students who had a 65 percent probability of attaining that level or higher for the selected task. Scale score ranges for writing achievement levels are referenced on the map.
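The “65 percent probability” mapping in the note can be sketched under a simplified response model. NAEP’s actual item mapping rests on IRT scaling; the logistic curve, location, and slope values below are invented for illustration only:

```python
import math

# Sketch of mapping a rating onto the score scale at a 65 percent
# attainment probability, assuming a logistic response curve.
# The location and slope parameters are invented, not NAEP estimates.
def p_attain(scale_score, location, slope=0.05):
    """Assumed probability of attaining the rating or higher at a given scale score."""
    return 1.0 / (1.0 + math.exp(-slope * (scale_score - location)))

def mapped_score(location, slope=0.05, target=0.65):
    """Scale score at which the assumed attainment probability equals the target."""
    # Invert the logistic: x = location + ln(target / (1 - target)) / slope
    return location + math.log(target / (1 - target)) / slope

x = mapped_score(location=150)
print(round(x, 1))  # 162.4 with these made-up parameters
```

Because the target probability (65 percent) is above one half, the mapped score always lands above the curve’s midpoint, which is why mapped ratings sit higher on the scale than the point where students have even odds of attaining them.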


    Five percent of twelfth-graders provided responses to the Value of Technology task that were rated “Effective”
    and 21 percent provided responses rated “Competent.” Three percent of twelfth-graders received a rating of
    “Little or no skill” on this task.

    Percentage of twelfth-grade students, by response rating for Value of Technology task: 2011

Effective   Competent   Adequate   Developing   Marginal   Little or no skill

        5          21         34          25         11                    3

    NOTE: Detail may not sum to totals because of rounding.

    The table below shows the percentage of students within each achievement level whose responses to
the Value of Technology task were rated as “Adequate” or higher. For example, among twelfth-graders
    performing at the Basic level, 61 percent provided responses that were rated “Adequate” or higher and among
    students at the Proficient level, 97 percent wrote responses to the task that were rated “Adequate” or higher.

    Percentage of twelfth-grade student responses to Value of Technology task rated
    “Adequate” or higher, by achievement level: 2011

    Below Basic At Basic At Proficient At Advanced

    6 61 97 100

Example of a “Developing” response
    The sample response shown here was rated as demonstrating “Developing” skill in responding to the writing
    task. While this response provides clear statements related to the topic, the presentation of ideas is list-like.
    With the exception of the one-sentence explanation for why the writer uses the computer to do research
    for homework, no ideas are developed. Statistics from the video are referred to but not integrated into the
    response. The response never progresses in its organization beyond a listing based on “I use,” “I like,” or “I
don’t use”; ideas are related to the overall topic more than to one another. This response demonstrates
    performance associated with the range below the Basic level. Twelfth-grade students performing in this range
    may provide ideas relevant to the topic, but do not control or develop the ideas to fully explain it.

    A type of technology I use in my daily life is my laptop.

    I use my laptop daily because it’s fast and simple to use. I use it to play games and use
    social networking sites such as facebook. I also use it to listen to music with iTunes. I can use it to sync
    music to put more songs on my ipod. Statistics state that 27% of young teens used video or music down-
    loads. Doing research for homework is also something I use my laptop for. Using my computer makes it
    faster and easier to look things up. My laptop is important to me because it helps me stay connected with
    friends and family through social networking sights such as facebook. The internet somehow keeps me
    entertained knowing that I have everything i need to know right at my fingertips even though it might
    not always be true. I like to stay updated on what my favorite celebrities are doing through twitter. I don’t
    use twitter as much as I use facebook because i feel that facebook has more things you can do on the site
    rather than just telling people what you are doing. The statistics in the video state that 55% of young
    people use social websites and probably for the same reasons I do. These are some of the many reasons
    why i like to use my laptop and why it is important to me.


    Example of an “Adequate” response
    The sample student response shown here was rated as demonstrating “Adequate” skill in responding to the
    writing task. In explaining the value of the technology, the response presents ideas clearly and simply. The
    three middle paragraphs explain different aspects of why the technology is valuable, with some examples
    that support the main idea. The connection between ideas in each paragraph is smooth and contributes to
    a sense of overall cohesion. While some phrasings are unclear, e.g., “fast engulfing” or “a reliant lifestyles,”
    this response demonstrates skills associated with performance at the Basic level. Twelfth-grade students at
    this level demonstrate a control of language to accomplish the communicative purpose.

    To College Admissions Committee:

    Technology is an ongoing factor that will continue to grow each year. I, along with others, can admit
    that without it today’s experiences would not be the same. Facebook, a popular blogging site, is a source
    that I rely on for communication, news, and entertainment.

    Facebook, created by Mark Zukerberg, was originally made as a site to communicate with fellow
    college students. The fast engulfing site has become just this for myself and many million other “friends.”
    Acquaintances are able to update me on what is new with their lives. I am able to communicate with long
    distance family. My school has even taken advantage of the popular site by posting assignments and study
    groups.

    News sweeps across Facebook faster than a hurricane. Although I do watch and read news elsewhere,
    the website makes it easy and convenient. Because most people own a computer, the news can be
    accessed as desired. Television, in comparison, plays news once and your absence can determine if you’re
    informed or not.

    Music, videos and chatting are available on Facebook. Although distracting, the entertainment that the
    website provides is popular. You can even join groups that have the same interests and receive updates
    on events. Whether I want to look up a new hit song, or watch a movie trailer that I know of, I can rely on
    Facebook to have it.

    The most modern form on keeping in touch with people, discovering news, and entertaining yourself is
    found on Facebook. It is easy to look down on such a reliant lifestyles, but earlier generations can’t ignore
    how convenient the website is. Regardless of how your life was yesterday, technology will change tomorrow
    and the day after. Inventions like Facebook make it hard to believe life without such advanced technologies.

Writing tasks available online
    More NAEP writing tasks are available online at
    http://nces.ed.gov/nationsreportcard/itmrlsx/search.aspx?subject=writing and include:

    One released task for each writing purpose at grade 12, including audio and video
    A scoring rubric for each task
    Samples of responses at each scoring level
    Student performance data

    Explore the interface for the writing tasks at http://nationsreportcard.gov/writing_2011/writing_tools.asp.

    WRITING 2011


    example of a “competent” response
    The following sample student response was rated as “Competent” in writing to explain. From the strong
    opening sentence and throughout, the response demonstrates a control of language in developing logical
    and clear ideas that contribute to explaining the value of a particular technology. Specific advantages are
    presented, and then the value of the technology is illustrated with a personal example. Overall cohesiveness
    is achieved by relatively smooth transitions. Logically cohesive complex sentences, such as the final one in
    the response, enhance the ideas being expressed. This response demonstrates skills associated with the
    Proficient level. Twelfth-grade students at this level are able to produce well-structured responses that fully
    accomplish the communicative purpose.

    In recent history, people had no means to communicate with each other instantaneously over long
    distances, but within the past ten years social networking has completely altered our ways of communication. Mark Zuckerburg’s creation of Facebook truly revolutionized how we interact with one another, for he created a relatively safe online environment where people can keep in close contact with friends without having to travel to visit them personally. People no longer need to wait days or weeks for the Pony Express
    to transport messages, because now at the mere click of a button we can write and send lengthy messages,
    pictures, memos, videos, and greetings to our peers. Facebook is the form of current technology that has
    most affected my life as a high school student.

    Facebook is a novel innovation that has completely altered how people, especially teenagers, communicate. Originally intended for college networking, Facebook lets you “friend” your acquaintances and keep in
    closer contact with them by automatically updating them about your life. Facebook has practically limitless
    entertainment and communications options, including instant messaging, status updating, game playing,
    and picture sharing. With this, one can choose to share as much about herself as she wishes, but someone
    like me can also choose to be more reserved and write to people in private inbox messages instead of public
    updates.

    Ever since I created a Facebook account in the beginning of my sophomore year in high school, it has
    allowed me to keep in contact with friends that I would otherwise never see again and share in what is
    happening in their lives. Going to a private school from a public middle school was a difficult decision for
    me because I did not want to leave the close friends I had made in middle school, but Facebook has
    allowed me to be as close to them as ever; for although I no longer see them on a regular basis, I can always
    write a quick “Hello” or “Happy Birthday” on their Facebook wall, and I am constantly updated on events in
    their lives. I can easily invite people to my plays and concerts at my school without spending great quanti-
    ties of time or having to hurt the environment by sending paper invitations. Facebook also helps me keep
    in contact with close friends that I make while in Winds of Change theater camp in Canada. Though I only
    visit Canada for this camp once a year, I can stay connected to my Canadian friends because of Facebook.
    One year I was unable to stay for the entire two weeks of the camp, but thanks to Facebook I was able to see
    pictures from the show I helped prepare that was performed at the end of the second week.

    Although my parents and grandparents kept in minimal contact with some of their high school and
    college friends through telephone and snail mail, I feel that I will stay much closer to friends all throughout
    my childhood because I can see recent pictures of them and read about what is occurring in their lives
    much more easily. I limit my Facebook friends to people that I care about and truly want to keep in contact
    with, and thus whenever I log into Facebook, I am receiving updates even about friends on exchange in
    foreign countries, and I feel much more connected to the world around me.

    THE NATION’S REPORT CARD

    example of an “effective” response
    The sample student response shown below was rated as “Effective” in responding to the task about the
    value of technology. After an opening paragraph that defines video games and introduces the ideas to be
    developed throughout, the writer constructs the explanation primarily through use of personal experience.
    This approach skillfully communicates the value of video games through the use of detailed descriptions
    of specific games and what the writer has learned from them. Ideas are fully developed, and the rich use of
    explanatory details establishes a distinct voice speaking intelligently from experience. This response demonstrates skills associated with performance at the Advanced level. Twelfth-grade students at this level are
    able to craft responses that strategically accomplish the communicative purpose.

    Videogames are a primary source of entertainment for people of all ages. Videogames are discs or cartridges
    that hold data; once a disc or cartridege is inserted into a gaming console, the data is read and displayed on the
    screen along with prompts that allow the game to be played. Games have many genres ranging from fighting to
    educational and can be used for than just mere entertainment. I personally have been experiencing what video-
    games have to offer for over five years now. Gaming is not just something that people do for fun, people can play
    videogames for many reasons. Videogames are an important factor in many peoples lives including mine and
    are a valuable type of technology.

    I have been playing videogames from a very young age. Mario was the first game I was ever introduced to
    and it was not through playing; through sheer coincidence my mother realized that the theme music to Mario
    put me to sleep as a baby. Once I was old enough to hold a controller I began playing the game. Ever since that
    moment I have been playing videogames. Games are multi-purposed; to some it is merely a form of entertainment, but to others it could be their job. Some people argue that games are a waste of time and that they are
    not product. I beg to differ; games are important to me because not only do they give me something to do to
    pass time but they are also educational. A prime example of this is a game I was introduced to by my cousins,
    Runescape. When I was about thirteen I had went to see my cousins up state and I saw them playing this
    browser game called Runescape (a browser game is a game that can be played within an internet browser
    without the need to download or upload information from a disc or cartridge). Me being the person I am, I was
    curious as to what it was so I began to ask questions. By the end of the day I learned two things about that
    game, two things that to some gamers, were their favorite word. It was a MMORPG (massive multiplayer online
    role playing game) that was free; in essence it was a free game that I didn’t have to download and I could do
    basically whatever I wanted that was allowed in the game. Within the game you could do any of the various
    skills offered, quests, and even fight against other players from around the world with you’re avatar. Once I got
    home, I of course signed up and began to play. Throughout the few years I played that game I realized it was set
    in Medieval times and I learned many things about that age. I learned the process it takes to turn ore into metal,
    what smelting is, how leather is crafted into clothing, how clay is used, and some of the politics of Medieval
    civilizations throughout the quests of the game. Although I would spend hours on this game and it seemed like I
    was doing nothing, I infact was actually learning.

    Another game my cousins introduced to me was Age of Mythology. The game was a PC game (which means
    it had to be bought and it contained disc which had to upload the game onto you’re computer or device and
    then the game could be played) and I had played it at my cousins and eventually went on to buy it. If mythology
    was a subject in school, this game could be the teacher. This game focuses around Greek, Egyptian and Norse
    mythology. You follow the antagonists (which you name) through all three civilizations chasing an evil minotaur that is attempting to end the world. You begin in a fictional Greek city and eventually move throughout the
    world. This game teaches any of it’s players not only how armies from all three civilazations worked but those
    civilazations major Gods, minor Gods, demigods and mythological creatures. Stories based on mythology or fact
    are also told and experienced throughout the game; such as the Trojan Horse and Ragnorak. I have never picked
    up a book based on mythology or ancient Gods but because of this game I have an extensive knowledge of the
    mythology of those three cultures. Games are important in society; they give people a hobby and peace of mind.
    They can also be used for educational purposes. Toddlers no longer read books to learn how to read, write, and
    spell, they are given toys and games to play. Games hold a high position in society and can be beneficial to those
    who use them if they wish to use them in that way.


    Technical Notes

    Sampling and Weighting
    The schools and students participating in NAEP assessments are selected to be representative
    of all schools nationally. The results from the assessed students are combined to provide accurate
    estimates of the overall performance of students in public, private, and other types of schools
    (i.e., Bureau of Indian Education schools and Department of Defense schools) in the nation. More
    information on sampling can be found at http://nces.ed.gov/nationsreportcard/about/nathow.asp.

    Because each school that participated in the assessment and each student assessed represents a
    portion of the population of interest, the results are weighted to account for the disproportionate
    representation of the selected sample. This includes oversampling of schools with high concentrations of students from certain racial/ethnic groups and the lower sampling rates of students
    who attend very small schools.
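    As a rough illustration of how such weighting works, a population estimate can be computed as a weighted mean in which each student's weight reflects how many students that respondent represents. This is a simplified sketch with hypothetical scores and weights, not NAEP's actual estimation procedure:

    ```python
    def weighted_mean(scores_and_weights):
        """Weighted population estimate: sum of weight * score divided by the
        sum of weights, so oversampled students (small weights) do not pull
        the estimate toward their group."""
        total_weight = sum(w for _, w in scores_and_weights)
        return sum(score * w for score, w in scores_and_weights) / total_weight

    # Hypothetical (score, weight) pairs: the first two students come from an
    # oversampled group, so each represents fewer students than the third.
    sample = [(150, 50.0), (160, 50.0), (140, 200.0)]
    print(weighted_mean(sample))          # 145.0
    print(sum(s for s, _ in sample) / 3)  # 150.0 -- the unweighted mean overstates
    ```

    The weighted estimate differs from the simple average precisely because the oversampled group would otherwise be counted too heavily.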

    School and Student Participation
    To ensure unbiased samples, NAEP statistical standards require that participation rates for the
    original school samples be 70 percent or higher to report national results separately for public
    and private schools. In instances where participation rates meet the 70 percent criterion but
    fall below 85 percent, a nonresponse bias analysis is conducted to determine whether the
    responding school sample is unrepresentative of the population, which would introduce the
    potential for nonresponse bias.

    The weighted national school participation rates for the 2011 writing assessment were 97 percent
    for grade 8 (100 percent for public schools, 71 percent for private schools, and 96 percent for
    Catholic schools only), and 94 percent for grade 12 (96 percent for public schools, 67 percent for
    private schools, and 77 percent for Catholic schools only). Weighted student participation rates
    were 94 percent at grade 8, and 87 percent at grade 12. Because the participation rate for private
    schools overall fell below 70 percent, results could not be reported for twelfth-graders attending
    private schools in 2011. Results are available for Catholic schools on the NAEP Data Explorer at
    http://nces.ed.gov/nationsreportcard/naepdata/.
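    The reporting rule described above can be expressed as a small decision function. This is an illustrative sketch of the stated thresholds, not NAEP's operational code; the example rates are the 2011 figures quoted in the preceding paragraph:

    ```python
    def reporting_decision(school_participation_rate):
        """NAEP reporting rule as described in the text (sketch): below
        70 percent, results are not reported separately; from 70 up to
        85 percent, results are reported but a nonresponse bias analysis
        is conducted; at 85 percent or above, results are reported as usual."""
        if school_participation_rate < 70:
            return "not reported"
        if school_participation_rate < 85:
            return "reported; nonresponse bias analysis conducted"
        return "reported"

    # Applied to the 2011 weighted school participation rates quoted above:
    print(reporting_decision(67))  # grade 12 private schools: not reported
    print(reporting_decision(71))  # grade 8 private schools: bias analysis conducted
    print(reporting_decision(96))  # grade 12 public schools: reported
    ```

    This matches the outcomes described below: grade 12 private school results could not be reported, while the grade 8 private school sample triggered a nonresponse bias analysis.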

    Nonresponse bias analyses were conducted for the private school samples at both grades 8 and
    12. For grade 8, the results of the nonresponse bias analyses showed no significant bias for any
    school characteristic after substitution and nonresponse adjustments. However, at grade 12,
    some variables examined in the analyses still indicated potential bias after nonresponse adjustments. Specifically, the potential for bias still existed for race after nonresponse adjustments.
    The Asian/Pacific Islander students were slightly underrepresented in the responding private
    school sample.

    Interpreting Statistical Significance
    Comparisons between groups are based on statistical tests that consider both the size of the
    differences and the standard errors of the two statistics being compared. Standard errors are margins of error, and estimates based on smaller groups are likely to have larger margins of error. The
    size of the standard errors may also be influenced by other factors such as how representative the
    assessed students are of the entire population.

    When an estimate has a large standard error, a numerical difference that seems large may not
    be statistically significant. Differences of the same magnitude may or may not be statistically
    significant depending upon the size of the standard errors of the estimates. For example, at grade 8,
    the 13-point difference in average writing scores for White and American Indian/Alaska Native
    students was statistically significant, while the 17-point difference between White and Native
    Hawaiian/Other Pacific Islander students was not. Standard errors for the estimates presented
    in this report are available at http://nces.ed.gov/nationsreportcard/naepdata/.
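    The comparison logic can be sketched as a simple two-sided z-test on two independent estimates. This is an illustrative simplification (NAEP's actual significance procedures are more involved), and the standard errors below are hypothetical:

    ```python
    import math

    def significantly_different(est1, se1, est2, se2, z_crit=1.96):
        """Two-sided z-test for the difference of two independent estimates:
        significant when |est1 - est2| exceeds z_crit times the standard
        error of the difference, sqrt(se1**2 + se2**2)."""
        se_diff = math.sqrt(se1 ** 2 + se2 ** 2)
        return abs(est1 - est2) > z_crit * se_diff

    # Hypothetical standard errors: a 13-point gap measured precisely can be
    # significant while a larger 17-point gap with a big standard error is not.
    print(significantly_different(160, 0.5, 147, 2.0))   # True
    print(significantly_different(160, 0.5, 143, 10.0))  # False
    ```

    The second call shows why the larger 17-point difference in the example above could still fail to reach significance: the standard error of the difference, not the raw gap, drives the test.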

    To ensure that significant differences in NAEP data reflect actual differences and not mere
    chance, error rates need to be controlled when making multiple simultaneous comparisons.
    The more comparisons that are made (e.g., comparing the performance of White, Black, Hispanic,
    Asian, Native Hawaiian/Other Pacific Islander, American Indian/Alaska Native, and multiracial
    students), the higher the probability of finding significant differences by chance. In NAEP, the
    Benjamini-Hochberg False Discovery Rate (FDR) procedure is used to control the expected proportion of falsely rejected hypotheses relative to the number of comparisons that are conducted.
    A detailed explanation of this procedure can be found at http://nces.ed.gov/nationsreportcard/tdw/analysis/infer.asp. NAEP employs a number of rules to determine the number of comparisons conducted, which in most cases is simply the number of possible statistical tests.
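    The Benjamini-Hochberg procedure itself is straightforward to sketch: sort the m p-values in ascending order, find the largest rank k whose p-value is at most (k/m)·q, and reject the null hypotheses for the k smallest p-values. The p-values below are hypothetical:

    ```python
    def benjamini_hochberg(p_values, q=0.05):
        """Benjamini-Hochberg FDR procedure: sort the m p-values, find the
        largest rank k with p_(k) <= (k / m) * q, and reject the null
        hypotheses for the k smallest p-values."""
        m = len(p_values)
        order = sorted(range(m), key=lambda i: p_values[i])
        largest_k = 0
        for rank, i in enumerate(order, start=1):
            if p_values[i] <= rank / m * q:
                largest_k = rank
        rejected = set(order[:largest_k])
        return [i in rejected for i in range(m)]

    # Six hypothetical comparisons: only the two smallest p-values survive
    # FDR control at q = 0.05.
    print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.74]))
    # [True, True, False, False, False, False]
    ```

    Note that 0.039 is below the conventional 0.05 cutoff but is still not rejected, because its BH threshold at rank 3 is only (3/6)·0.05 = 0.025; this is exactly how the procedure guards against chance findings when many comparisons are made.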

    Race/Ethnicity
    In compliance with new standards from the U.S. Office of Management and Budget for collecting
    and reporting data on race/ethnicity, information was collected in 2011 to report results for the
    following seven racial/ethnic categories:

    • White
    • Black
    • Hispanic
    • Asian
    • Native Hawaiian/Other Pacific Islander
    • American Indian/Alaska Native
    • Two or more races

    Students identified as Hispanic were classified as Hispanic even if they were also identified with
    another racial/ethnic group. Students identified with two or more of the other racial/ethnic groups
    (e.g., White and Black) were classified as “two or more races.”
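    A minimal sketch of this coding rule, with a hypothetical function name and inputs:

    ```python
    def naep_race_ethnicity(is_hispanic, racial_groups):
        """Classification rule described in the text: Hispanic identification
        takes precedence over any racial identification, and identification
        with two or more racial groups is coded as "two or more races"."""
        if is_hispanic:
            return "Hispanic"
        if len(racial_groups) >= 2:
            return "Two or more races"
        return racial_groups[0]

    print(naep_race_ethnicity(True, ["White", "Black"]))   # Hispanic
    print(naep_race_ethnicity(False, ["White", "Black"]))  # Two or more races
    print(naep_race_ethnicity(False, ["Asian"]))           # Asian
    ```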

    National School Lunch Program
    NAEP collects data on student eligibility for the National School Lunch Program (NSLP) as an
    indicator of family income. Under the guidelines of NSLP, children from families with incomes
    at or below 130 percent of the poverty level are eligible for free meals. Those from families with
    incomes between 130 and 185 percent of the poverty level are eligible for reduced-price meals.
    (For the period July 1, 2011 through June 30, 2012, for a family of four, 130 percent of the
    poverty level was $29,055, and 185 percent was $41,348.) Some schools provide free meals
    to all students regardless of individual eligibility, using their own funds to cover the costs of
    noneligible students. Under special provisions of the National School Lunch Act intended to
    reduce the administrative burden of determining student eligibility every year, schools can be
    reimbursed based on eligibility data for a single base year. For more information on NSLP, visit
    http://www.fns.usda.gov/cnd/lunch/.
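    A small sketch of the eligibility bands as quoted above (the $22,350 poverty guideline for a family of four is implied by the figures in the text; the example incomes are hypothetical):

    ```python
    # NSLP income-eligibility bands for a family of four, July 2011-June 2012,
    # using the thresholds quoted in the text.
    FREE_THRESHOLD = 29_055      # 130 percent of the poverty level
    REDUCED_THRESHOLD = 41_348   # 185 percent ($41,347.50, rounded)

    def nslp_eligibility(income):
        """Free meals at or below 130% of poverty; reduced-price meals
        between 130% and 185%; otherwise not income-eligible."""
        if income <= FREE_THRESHOLD:
            return "free"
        if income <= REDUCED_THRESHOLD:
            return "reduced-price"
        return "not eligible"

    # The quoted figures imply a poverty guideline of $22,350:
    # 130 percent of $22,350 is exactly $29,055.
    assert 22_350 * 130 // 100 == 29_055

    print(nslp_eligibility(25_000))  # free
    print(nslp_eligibility(30_000))  # reduced-price
    print(nslp_eligibility(50_000))  # not eligible
    ```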

    School Location
    NAEP results are reported for four mutually exclusive categories of school location: city, suburb,
    town, and rural. The categories are based on standard definitions established by the Federal
    Office of Management and Budget using population and geographic information from the
    U.S. Census Bureau. Schools are assigned to these categories in the NCES Common Core of Data
    (CCD) “locale codes” based on their physical address. The locale codes are based on a school’s
    proximity to an urbanized area (a densely settled core with densely settled surrounding areas).
    More detail on the locale codes is available at http://nces.ed.gov/ccd/rural_locales.asp.


    Photo Credits:

    © Bill Denison Photography; © Zhang Bo/iStockphoto #16029216; © Atanas Bezov/iStockphoto #8523449; © Andrew Rich/iStockphoto #15355670;
    © LuminaStock/iStockphoto #20119357; © MeshaPhoto/iStockphoto #20569944; © Benoit Chartron/iStockphoto #18681497; © Anatoly Vartanov/
    iStockphoto #14428786; © Christopher Futcher/iStockphoto #15350658; © Nina Shannon/iStockphoto #7168584; © Christopher Futcher/iStockphoto
    #18998338; © Tetra Images/Alamy #BMJ59R; © Alloy Photography/Veer #AYP1225397; © Baran Özdemir/iStockphoto #17218601; © Abel Mitja Varela/
    iStockphoto #17757341; © Christopher Futcher/iStockphoto #15653336; © Kemter/iStockphoto #15747755; © Chris Schmidt/iStockphoto #11294967;
    © Aldo Murillo/iStockphoto #19902167; © Steve Debenport/iStockphoto #20684546

    MORE INFORMATION
    The report release site is
    http://nationsreportcard.gov.
    The NCES Publications and Products
    address is http://nces.ed.gov/
    pubsearch.

    For ordering information, write to
    ED Pubs
    U.S. Department of Education
    P.O. Box 22207
    Alexandria, VA 22304

    or call toll free 1-877-4-ED-Pubs

    or order online at
    http://www.edpubs.gov.

    THE NATION’S REPORT CARD

    Writing 2011
    SEPTEMBER 2012

    SUGGESTED CITATION
    National Center for Education
    Statistics (2012).
    The Nation’s Report Card:
    Writing 2011
    (NCES 2012–470).
    Institute of Education Sciences,
    U.S. Department of Education,
    Washington, D.C.

    CONTENT CONTACT

    Angela Glymph
    202-219-7127
    angela.glymph@ed.gov

    This report was prepared for the National
    Center for Education Statistics under Contract
    No. ED-07-CO-0107 with Educational Testing
    Service. Mention of trade names, commercial
    products, or organizations does not imply
    endorsement by the U.S. Government.

    U.S. Department of Education
    The National Assessment of Educational Progress (NAEP) is a congressionally authorized project sponsored by the
    U.S. Department of Education. The National Center for Education Statistics, within the Institute of Education Sciences,
    administers NAEP. The Commissioner of Education Statistics is responsible by law for carrying out the NAEP project.

    Arne Duncan
    Secretary
    U.S. Department
    of Education

    John Q. Easton
    Director
    Institute of
    Education Sciences

    Jack Buckley
    Commissioner

    National Center for
    Education Statistics

    Peggy G. Carr
    Associate Commissioner
    for Assessment

    National Center for
    Education Statistics

    The National Assessment Governing Board
    In 1988, Congress created the National Assessment Governing Board to set policy for the National Assessment of
    Educational Progress, commonly known as The Nation’s Report Card™. The Governing Board is an independent,
    bipartisan group whose members include governors, state legislators, local and state school officials, educators,
    business representatives, and members of the general public.

    Honorable David P. Driscoll, Chair
    Former Commissioner of Education
    Melrose, Massachusetts

    Mary Frances Taymans,
     Vice Chair
    Nonpublic School Representative
    Bethesda, Maryland

    Andrés Alonso
    Chief Executive Officer
    Baltimore City Public Schools
    Baltimore, Maryland

    David J. Alukonis
    Former Chairman
    Hudson School Board
    Hudson, New Hampshire

    Louis M. Fabrizio
    Data, Research and Federal Policy Director
    North Carolina Department of Public
    Instruction

    Raleigh, North Carolina

    Honorable Anitere Flores
    Senator
    Florida State Senate
    Miami, Florida

    Alan J. Friedman
    Consultant
    Museum Development and Science
    Communication

    New York, New York

    Shannon Garrison
    Fourth-Grade Teacher
    Solano Avenue Elementary School
    Los Angeles, California

    Doris R. Hicks
    Principal and Chief Executive Officer
    Dr. Martin Luther King, Jr. Charter School
    for Science and Technology

    New Orleans, Louisiana

    Honorable Terry Holliday
    Commissioner of Education
    Kentucky Department of Education
    Lexington, Kentucky

    Richard Brent Houston
    Principal
    Shawnee Middle School
    Shawnee, Oklahoma

    Hector Ibarra
    Middle School Science Teacher
    Belin-Blank International Center
    and Talent Development

    Iowa City, Iowa

    Honorable Tom Luna
    Idaho Superintendent of Public
    Instruction

    Boise, Idaho

    Honorable Jack Markell
    Governor of Delaware
    Wilmington, Delaware

    Tonya Miles
    General Public Representative
    Mitchellville, Maryland

    Dale Nowlin
    Twelfth-Grade Teacher
    Columbus North High School
    Columbus, Indiana

    Susan Pimentel
    Educational Consultant
    Hanover, New Hampshire

    W. James Popham
    Professor Emeritus
    Graduate School of Education and
    Information Studies

    University of California, Los Angeles
    Wilsonville, Oregon

    Andrew C. Porter
    Dean
    Graduate School of Education
    University of Pennsylvania
    Philadelphia, Pennsylvania

    B. Fielding Rolston
    Chairman
    Tennessee State Board of Education
    Kingsport, Tennessee

    Cary Sneider
    Associate Research Professor
    Portland State University
    Portland, Oregon

    Blair Taylor
    Chief Community Officer
    Starbucks Coffee Company
    Seattle, Washington

    Honorable Leticia Van de Putte
    Senator
    Texas State Senate
    San Antonio, Texas

    Eileen L. Weiser
    General Public Representative
    Ann Arbor, Michigan

    John Q. Easton (Ex officio)
    Director
    Institute of Education Sciences
    U.S. Department of Education
    Washington, D.C.

    Cornelia S. Orr
    Executive Director
    National Assessment Governing Board
    Washington, D.C.

    “The Department of Education’s mission is to promote student achievement and preparation for global competitiveness by fostering educational excellence and ensuring equal access.”

    www.ed.gov


      Contents
      Executive Summary
      Introduction
      Grade 8
      Twenty-seven percent of eighth-graders perform at or above Proficient
      Asian students score higher than other racial/ethnic groups
      Female students perform higher than male students
      Public school students score lower than private school students
      Student performance varies by family income
      Students in suburban schools score higher than students in other locations
      About two-thirds of eighth-graders spend more than 15 minutes a day writing for English class
      Students who use computers more frequently to draft and revise their writing score higher
      New era of computer-based testing provides insight into how students use word-processing tools
      A closer look at lower- and higher-performing students
      Assessment Content
      Writing Achievement-Level Descriptions for Grade 8
      Sample Task: Writing to Convey Experience
      Grade 12
      Twenty-seven percent of twelfth-graders perform at or above Proficient
      White, Asian, and multiracial students perform comparably at grade 12
      Female students score higher than male students
      Students whose parents have higher levels of education score higher
      Students in suburban schools score higher than students in cities and rural locations
      Students who write four to five pages a week for English/language arts homework score higher than those who write fewer pages
      Students who use a computer more frequently to edit their writing score higher
      About 44 percent of students report writing is a favorite activity
      New era of computer-based testing provides insight into how students use word-processing tools
      A closer look at lower- and higher-performing students
      Assessment Content
      Writing Achievement-Level Descriptions for Grade 12
      Sample Task: Writing to Explain
      Technical Notes

    • More Information
