[Non-Regulatory Guidance] April 2006
This paper was produced by DTI Associates, A Haverstick Company, under a U.S. Department of Education contract.
• The National Center for Education Statistics (NCES) is charged with the responsibility of working with other components of the U.S. Department of Education and with State and local educational institutions to improve the quality of education data. NCES is responsible for a grant program that provides funding to States for the development of high-quality longitudinal student information systems needed to compute a true cohort graduation rate. At the elementary/secondary level, NCES recently released a Cooperative System Guide to Building a Culture of Data Quality, aimed at schools and school districts. At the postsecondary level, NCES has redesigned the Integrated Postsecondary Education Data System from a paper system to an online data collection, helping improve the quality of these data while at the same time increasing their utility.
• The Office of Vocational and Adult Education (OVAE) and States have significantly improved the quality of State adult education performance data over the last several years, as States have implemented the National Reporting System for adult education. OVAE has enhanced States’ capacity to provide high-quality assessment data by developing State data quality standards that identify the policies, processes, and materials that States and local programs should have in place to collect valid and reliable data.
• The Office of Special Education Programs (OSEP), within the Office of Special Education and Rehabilitative Services, has implemented a data-dependent accountability system, the Continuous Improvement and Focused Monitoring System (CIFMS), that has focused on State performance on a number of performance measures and regulation-based compliance requirements. In support of CIFMS, the office has provided ongoing technical assistance and data reviews to support States’ efforts to provide valid, reliable, and accurate data related to the implementation of the Individuals with Disabilities Education Act.
• The Office of Safe and Drug-Free Schools (OSDFS) is improving State data systems and linking those improvement activities to other U.S. Department of Education initiatives. The No Child Left Behind Act requires that each State collect and report to the public certain school crime and safety data elements, such as truancy and the incidence of violence and drug-related offenses. OSDFS is currently implementing two initiatives designed to support improvement in the quality of data related to youth drug and violence prevention programs. Grants have been awarded to 17 States to provide support for enhancing efforts to collect data required by the Uniform Management Information and Reporting Systems (UMIRS) provisions in Title IV of NCLB (Safe and Drug-Free Schools and Communities Act). A second initiative involves the development of a uniform data set that includes the UMIRS data elements.
Readers interested in learning more about Federal data quality initiatives can consult the U.S. Department of Education’s website at http://www.ed.gov.
The accountability provisions included in the No Child Left Behind Act of 2001 (NCLB) significantly increased the urgency for States, local educational agencies (LEAs), and local schools to produce accurate, reliable, high-quality educational data. With determinations of whether or not schools and LEAs make “adequate yearly progress” (AYP) dependent upon their student achievement data, it has never been more important for State and local data systems and reporting processes to produce accurate, reliable information. To assist in this effort, the U.S. Department of Education’s Office of Elementary and Secondary Education has developed this set of education data quality guidelines.
A number of high-profile efforts are currently underway to improve the quality of the data reported to the U.S. Department of Education. Initiatives such as ED Facts, the Schools Interoperability Framework (SIF), the Data Quality Campaign, and numerous other Federal and State education data reform projects have begun the process of implementing systemic, long-term change in the way data are collected, analyzed, and reported. These efforts to reshape the foundations of current data management structures will take a considerable amount of time and resources to achieve. Until these systemic changes are complete, it is vitally important for States and localities to implement the best enhanced management controls possible over the data that are being used to make key judgments about AYP, funding, NCLB accountability, and other State and local education policies.
These guidelines do not impose any additional legal requirements beyond what is in the law, but rather are intended to provide information regarding good practices in data collection. The guidelines are intended to provide shorter-term, relatively inexpensive, interim procedures that States and localities can use now to improve data quality while more systemic restructuring is in progress. In some cases, such as in developing infrastructure and training staff, “short-term” measures will have a long-term impact on data quality procedures. The guidelines are built around the basic data elements required for NCLB Report Card reporting, but are designed to be applicable to other K-12 data that States, LEAs, and schools collect as well. Most States have had accountability systems long before NCLB, and States, LEAs, and schools collect data for a wide variety of purposes beyond Federal NCLB Report Card reporting. What these guidelines term “NCLB data” (from the Federal perspective) are in many cases data elements that States, LEAs, and schools have collected and analyzed since long before NCLB.
Data Matters
The Government Accountability Office (GAO) cautioned, in a September 2004 report on NCLB, that “measuring achievement with faulty data can lead to inaccurate information on schools meeting proficiency goals.” Both GAO and the U.S. Department of Education’s Office of the Inspector General (OIG) have pointed out that the consequences of poor-quality NCLB data can be serious, including the possibility that schools and districts could be erroneously identified as being in need of improvement or corrective action.
GAO’s 2004 report, “Improvements Needed in Education’s Process for Tracking States’ Implementation of Key Provisions,” is available on the GAO website (www.gao.gov). Search for report GAO-04-734, or go to http://www.gao.gov/new.items/d04734.pdf.
It is important to note that the flexibility of NCLB allows States to require LEAs and schools to report on additional data elements beyond the Federal requirements, and any State or LEA may choose to report on as many optional data elements as it sees fit.
NCLB’s greatly enhanced focus on data-driven accountability has brought with it a number of challenges for States, LEAs, and schools. Federal NCLB reporting requires that States have the capability to transmit standard statewide information on demographics, achievement, and teacher quality for all public school students and all public school teachers they serve. This can be a daunting task in a data collection and reporting environment often characterized by aging, “stovepiped” systems that may not be able to share data within a single school, much less across schools, LEAs, or an entire State.
The State of the Data
In its September 2004 study on “Improvements Needed in Education’s Process for Tracking States’ Implementation of Key Provisions,” GAO found that “more than half of the state and school district officials we interviewed reported being hampered by poor and unreliable student data.” (p. 3) GAO’s report, GAO-04-734, is available on the GAO website (www.gao.gov) at http://www.gao.gov/new.items/d04734.pdf.
Among the key data quality problems associated with NCLB and other reporting are:
• System non-interoperability. Data collected in one system are not electronically transmittable to other systems. Re-entering the same data in multiple systems consumes resources and increases the potential for data entry errors.
• Non-standardized data definitions. Various data providers use different definitions for the same elements. Passed on to the district or State level, non-comparable data are aggregated inappropriately to produce inaccurate results.
• Unavailability of data. Data required do not exist or are not readily accessible. In some cases, data providers may take an approach of “just fill something in” to satisfy distant data collectors, thus creating errors.
• Inconsistent item response. Not all data providers report the same data elements. Idiosyncratic reporting of different types of information from different sources creates gaps and errors in macro-level data aggregation.
• Inconsistency over time. The same data element is calculated, defined, and/or reported differently from year to year. Longitudinal inconsistency creates the potential for inaccurate analysis of trends over time.
• Data entry errors. Inaccurate data are entered into a data collection instrument. Errors in reporting information can occur at any point in the process – from the student’s assessment answer sheet to the State’s report to the Federal government.
• Lack of timeliness. Data are reported too late. Late reporting can jeopardize the completeness of macro-level reporting and the thoroughness of review. Rushed reporting under tight NCLB deadlines can degrade data quality, while reporting delayed by months or even years limits the data’s utility and postpones program improvement efforts.
Sections 2 through 4 of this document will address each of these problems, expanding on the issues that they raise and providing guidelines for overcoming them.
This document focuses on the processes and mechanisms of data collection and reporting.
The target audience for the guidelines is two distinct but complementary groups:
• State and local accountability and assessment officers and staff
• State and local Management Information Systems (MIS) and data personnel
A key purpose of this document is to ensure that these two groups can work with and speak to each other using a common language and guided by a common set of understandings. The main body of the guidelines is written at a level of language and depth designed to be accessible to accountability and assessment professionals, while remaining credible to MIS professionals and data technicians. Throughout the document, vignettes and insets are used to provide specific technical information to the MIS audience and specific administrative applications and examples to program staff.
Following this introduction, the guidelines are organized into three main sections:
• Establishing a Solid Foundation
• Managing Consistent Data Collection
• Confirming Accurate Results
These sections are intended to track the data collection and reporting process through its basic phases, and capture the categories of management control structures that the U.S. Department of Education’s Office of the Inspector General (OIG) identified in its February 2004 Management Information Report: monitoring, receipt and control, scoring, data quality, analysis, and reporting. Each section includes a brief text discussion of the relevant data quality issues, followed by specific guidelines for addressing them.
In February 2002, the U.S. Office of Management and Budget (OMB) published a set of Federal Information Quality Guidelines. These Guidelines, developed in response to a Congressional mandate, established a basic definition of data quality that included three overarching elements: utility, objectivity, and integrity. OMB also directed each Federal agency to develop its own Department-specific standards. The U.S. Department of Education published its Information Quality Guidelines in February 2003. However, as the Department’s Inspector General noted, the Guidelines “addressed high level standards …and did not require management controls over scoring of State assessments” and other key NCLB data elements. This section will lay out the basic, underlying processes and systems that set a foundation for quality data.
Within the confines of this document, the definition of “data quality” encompasses two of the three components of OMB’s overarching definition: objectivity and integrity. These guidelines assume that the data elements required by NCLB and by States are, by definition, useful in measuring progress toward predefined Federal and State accountability standards.
The U.S. Department of Education’s Information Quality Guidelines describe data “integrity” as the security or protection of information from unauthorized access or revision. “Objectivity” is the presentation of information “in an accurate, clear, complete, and unbiased manner.” For statistical data, achieving this standard entails:
• Using clearly defined, broadly understood data definitions;
• Using clearly documented, well thought-out methodologies for data collection;
• Using reliable data sources;
• Processing data in a manner to ensure that data are “cleaned” and edited;
• Properly documenting and storing data collections and results;
• Producing data that can be reproduced or replicated;
• Conducting data collections and releasing data reports in a timely manner; and
• Establishing procedures to correct any identified errors.
Key Federal Information Quality Documents
OMB: Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of Information Disseminated by Federal Agencies, February 22, 2002 http://www.whitehouse.gov/omb/fedreg/reproducible2.pdf
U.S. Dept. of Ed.: “U.S. Department of Education Information Quality Guidelines,” February 2003 http://www.ed.gov/policy/gen/guid/infoqualguide.html
U.S. Dept. of Ed. OIG: Management Information Report, “Best Practices for Management Controls Over Scoring of the State Assessments Required Under the No Child Left Behind Act of 2001,” February 3, 2004
GAO: “Improvements Needed in Education’s Process for Tracking States’ Implementation of Key Provisions,” September 2004 http://www.gao.gov/new.items/d04734.pdf
Automated Systems. Having an adequate technical infrastructure in place is one key element in producing quality data. At a minimum, data collection, processing, and reporting should be automated and transmittable in an electronic format. Even in small States, LEAs, and schools, pen-and-paper systems for managing data will be overwhelmed by the emphasis that accountability systems such as those established under NCLB place on accurate, comprehensive, and timely data reporting. In addition to creating delays and consuming excessive resources, a system that relies on manual or outdated technology exacerbates all of the data quality problems discussed above in Section 1.3. Many of the data quality solutions included in these guidelines are difficult or impossible to implement without an automated data system.
Current Initiatives. The range of technology options available to States, LEAs, and schools in automating data collection processes is vast – from inexpensive desktop spreadsheets to fully integrated State data warehouses linked to every school. Driven largely by NCLB’s requirements, numerous ambitious Federal and State initiatives are currently underway to implement comprehensive, state-of-the-art data collection, storage, and reporting networks. These networks are typically being built around a system of unique statewide student identifiers and individual student records, and are potentially capable of delivering real-time educational data to individual teachers at the classroom level. These systems also integrate automated data quality checks.
Interim Processes. Of course, these systems are complicated to develop and take time to complete. However, data, assessment, and accountability professionals at the State and local levels should not postpone steps to improve data quality while they wait for a fully automated, fully integrated statewide data system to be implemented. Several technical infrastructure practices that would improve data quality should be possible under current conditions. The key, regardless of which technology is used, is to establish technical processes that allow data to be checked as they are entered into the system and transmitted to other users.
In the Field: Types of Data Systems
State Educational Agencies (SEAs) use three different types of automated student data systems to promote data quality.
West Virginia and Delaware host student information systems that are used by LEAs and schools on a day-to-day basis. When data are needed for reporting, the SEA can download what is needed from the real-time systems and receive up-to-date, comparable data.
North Carolina and South Carolina promote data quality by providing the same software to all LEAs. Extracts for reporting purposes can be written once and used by all to promote timely and complete data collections.
In Texas and Florida, two States that began collecting individual student records many years ago, data standards have been established that make it very clear what is expected to be reported by LEAs and in what format. In Texas, regional service centers check the LEAs’ data before the data are submitted to the SEA.
Data Dictionaries. A fundamental piece of any data quality infrastructure is a standardized set of precise data definitions that all providers use. A “data dictionary,” which identifies all data elements and describes their content, coding options, and format, is essential to establishing consistent collection and reporting. Adhering to a standard data dictionary improves data quality by fostering interoperability of different reporting systems and promoting the use of comparable data across the entire State. Staff who understand the definitions of the data they are collecting, entering, and reporting will be less likely to commit errors. Data dictionaries can be useful even where systems remain un-integrated and un-connected to a wider network. They should be the foundation for staff training (see Section 2.4) and a resource for staff to use during the data quality review process (see Section 4.2).
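Technical Focus: A Data Dictionary Entry in Code

To make the idea concrete, the sketch below shows one way a data dictionary entry might be represented in machine-readable form, here in Python. The element names, codes, and rules are hypothetical illustrations for this document, not an official NCLB or State schema; an actual dictionary would be far larger and would follow the State’s own definitions.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DataElement:
    code: str             # unique element code
    definition: str       # plain-language definition shared by all data providers
    data_type: str        # e.g., "integer", "string"
    format_rule: str      # technical business rule, e.g., "three-digit number"
    valid_values: List[str] = field(default_factory=list)  # permitted codes, if enumerated
    active: bool = True   # retired codes are deactivated, never deleted

# Hypothetical entries; real element names and codes would come from the State.
DATA_DICTIONARY = {
    "days_in_attendance": DataElement(
        code="ATT001",
        definition="Total days the student was present during the school year",
        data_type="integer",
        format_rule="whole number, 0-366",
    ),
    "grade_level": DataElement(
        code="GRD001",
        definition="Grade in which the student is enrolled on the count date",
        data_type="string",
        format_rule="two characters",
        valid_values=["KG"] + [f"{g:02d}" for g in range(1, 13)],
    ),
}

Because every provider reads definitions, formats, and valid values from the same structure, a dictionary of this kind can be posted on-line, loaded into applications, and used directly to drive the business rules discussed next.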
Business Rules. A collection and reporting system that is linked directly to a data dictionary can greatly improve data quality as it funnels – or, in some cases, forces – data into a pre-defined configuration. This integration is achieved through the creation of systematic “business rules” that define acceptable values, character formats, and options for handling missing or unavailable data. In the absence of an integrated statewide network, another option is a web portal-based collection system, in which the central portal enforces data dictionary business rules as data are entered.

In the Field: New Hampshire’s Data Dictionary

As part of the U.S. Department of Education’s Data Quality and Standards Project, New Hampshire has begun to implement the “i.4.see” system, an automated education information database. Working with the Center for Data Quality (C4DQ), New Hampshire has established an online data dictionary that lists the definitions, data rules, and validation requirements for every data element reported.

Schools are the linchpin of the “i.4.see” system. Because schools are the ultimate “owners” of student data, and because they know best when the data are accurate, they are responsible for submitting and revising NCLB data. Automated validation routines, based on customized business rules, allow data to be validated at multiple levels: first when the school submits its data, then when the LEA and State review the information for anomalies and final reporting to the Federal government. A key feature of the system is automatic, real-time feedback on the status of data for every submission. Based on the validation rules in the data dictionary, the system labels each piece of data “rejected” or “accepted” and flags rejected data for correction. Rejected data files are accompanied by error messages that refer automatically to the relevant data validation rules.

For further information on i.4.see, access the NH DOE website at http://www.ed.state.nh.us/education/datacollection/i4see.htm
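Technical Focus: Enforcing Business Rules at the Point of Entry

The following minimal sketch illustrates, in the spirit of the accepted/rejected flow described above, how business rules drawn from a data dictionary might be enforced when a record is submitted. The field names, rules, and values are hypothetical, not an actual State schema.

# Rules keyed by element name; each returns True when a value is acceptable.
BUSINESS_RULES = {
    "days_in_attendance": lambda v: isinstance(v, int) and 0 <= v <= 366,
    "grade_level": lambda v: v in {"KG"} | {f"{g:02d}" for g in range(1, 13)},
}

def validate_record(record):
    """Label a submitted record 'accepted' or 'rejected', attaching error
    messages that refer back to the violated business rule."""
    errors = []
    for element, rule in BUSINESS_RULES.items():
        if element not in record:
            errors.append(f"{element}: required element is missing")
        elif not rule(record[element]):
            errors.append(f"{element}: value {record[element]!r} violates the business rule")
    return ("accepted" if not errors else "rejected", errors)

# An out-of-range attendance count and an undefined grade code are both
# rejected, with messages pointing to the rules that failed.
status, errors = validate_record({"days_in_attendance": 400, "grade_level": "13"})
# status == "rejected"

Checks of this kind can run wherever data are entered – in a school-level application or behind a central web portal – so that errors are caught at the source rather than after aggregation.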
Data Definitions. In some cases, the U.S. Department of Education (through the National Center for Education Statistics), the U.S. Office of Management and Budget, or the No Child Left Behind Act maintains a definition of a required data element. Where Federal definitions do not exist, a standard definition should be used for all LEAs and schools in the State. For example, the U.S. Department of Education allows flexibility among States on the definition and parameters of a “full academic year.” Once States define data elements such as these, it is important that the definition be adopted uniformly across all data systems in all LEAs. This information should be maintained in an accountability workbook that is readily available to staff in schools and districts. Hardware and software should be configured around standard definitions, and the accountability guide should provide a clear description of how data collection, entry, and reporting processes work.
Data Granularity. To the maximum extent possible, all data elements should be collected and stored in their most “granular” form. In other words, each component of a calculated data element should be collected separately and stored separately in the database. For instance, when collecting graduation rate data, it is better to store the total number of students graduating and the total number of students eligible to graduate than to store only a computed percentage. To ensure that data reported by all LEAs and schools are comparable, percentages, ratios, and other derived values should not be computed until final calculations are made at the State level. If LEAs are completing forms (rather than sending in individual student or staff records), they should report the component parts of the formula and the SEA should compute the percentages.
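Technical Focus: Why Granular Counts Matter

The short calculation below illustrates the reasoning: averaging precomputed LEA percentages weights small and large LEAs equally, while summing the component counts yields the true State rate. The figures are hypothetical.

# Hypothetical LEA submissions: granular counts, not percentages.
lea_reports = [
    {"lea": "A", "graduates": 90,  "eligible": 100},   # local rate: 90.0%
    {"lea": "B", "graduates": 450, "eligible": 600},   # local rate: 75.0%
]

# Misleading: averaging precomputed LEA percentages treats a 100-student
# LEA and a 600-student LEA as equal -> (90.0% + 75.0%) / 2 = 82.5%
naive_rate = sum(r["graduates"] / r["eligible"] for r in lea_reports) / len(lea_reports)

# Correct: compute once at the State level from the component counts
# -> 540 / 700 = 77.1%
state_rate = sum(r["graduates"] for r in lea_reports) / sum(r["eligible"] for r in lea_reports)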
General Principles
Unique Identifiers: To the maximum extent possible, a unique statewide identifier should be assigned to every student and teacher for whom NCLB data are required.
Indivisibility: Every data element should be defined and collected in as “granular” a format as possible. For example, the data dictionary should separate total days in membership and total days in attendance and indicate how they can be used to compute an attendance rate.
Comprehensiveness: Data dictionaries should include all relevant information for each data element, including its definition, unique code, dates of collection, and technical business rules (e.g., “three-digit number” or “ten non-numerical characters, all caps”).
Accessibility: The data dictionary should be easily available to all staff at the State, LEA, and school levels. The dictionary should be posted on-line, available for download into databases and applications, and distributed in hard copy format.
Permanence: Never delete codes or definitions from the data dictionary. Codes or definitions that change or go out of date should be de-activated so that staff will not use them inadvertently, but they are important to maintain so that historical comparisons and longitudinal analysis can occur.
Validity: Business rules should not be the final arbiter of valid data. Data should be checked by a staff member who will know if an anomaly captured by a business rule is, in fact, an error. For instance, business rules may identify counts that are out of range based on previous years’ data, but are, in fact, accurate because a significant change has occurred in the reporting unit.
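Technical Focus: Flagging Anomalies for Human Review

As a brief illustration of the Validity principle above, the sketch below flags a year-over-year count that falls outside an expected range for staff review rather than rejecting it outright. The tolerance and counts are hypothetical.

def flag_for_review(current_count, prior_count, tolerance=0.20):
    """Return True when the year-over-year change exceeds the tolerance.
    The flag routes the value to a staff member who knows the reporting
    unit; it does not reject the value, because a large change may reflect
    a real event (e.g., a school closure) rather than an error."""
    if prior_count == 0:
        return current_count != 0
    return abs(current_count - prior_count) / prior_count > tolerance

# A 35% enrollment jump is flagged; a reviewer then either corrects it or
# confirms that the school really did absorb students from a closed campus.
needs_review = flag_for_review(current_count=540, prior_count=400)  # True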
The Data Quality Team. As important as a solid technical infrastructure and a data dictionary are to producing quality data, it is people who determine whether or not NCLB and other data reporting meets a high standard of accuracy. Automation, interoperability, and connectivity of information technology can provide a framework for producing good data, but such tools are only as powerful as their users make them. While creating staff time for training, implementation, and monitoring of sound collection and reporting practices can pose real challenges (particularly for smaller LEAs), investing in the creation of a data quality team can deliver large returns. Staff involvement at all levels – school, LEA, and State – is essential.
School-level Ownership. Ultimately, most of the required data elements in any education data report, including those on a State or LEA annual NCLB report card, “belong” to schools. Schools are where the students and teachers actually perform the activities that are being measured, and school-level personnel are the initial input point for much of the most important student outcome information. Because the most effective method of improving data quality is to prevent errors from occurring, staff at the school level are critical to producing reliable reports at the LEA, State, and Federal levels. School staff have a strong interest in producing accurate data, and should be given the responsibility – and time – to develop a proprietary interest in maintaining the quality of data they collect, report, and use. Through regular oversight, engagement, and feedback, LEAs and States can train school-level staff in the relevance not only of the micro-level student data with which they are most familiar, but also of macro-level information. Bad data at the school level that result in an erroneous NCLB report card can have significant consequences for a school, LEA, or State – not just in terms of resources, but also in terms of prestige, morale, and a host of other effects.
In the Field: Empowering Data Stewards
When Virginia’s Fairfax County Public School System began its push for improved data quality through the Education Decision Support Library, a key element in its approach was that the consumers of the data would drive the system, rather than the technology staff. Fairfax designated “data stewards” at every school, who assumed ownership over specific data elements. For example, one data steward oversees enrollment data, another monitors course grades, and a third tracks assessments. A school-district-level data workgroup, composed of a group of stewards and district data personnel, meets every month to discuss data issues proactively and do strategic planning.
Crucially, Fairfax County’s data stewards were given not only responsibility, but also strong authority to make decisions. While overarching data quality policies and technical systems are developed at the district level to ensure a standard data framework, data stewards work within those policies to develop the business rules, data definitions, and quality check processes that will be used for each element. Data stewards monitor and review all data collected in their domain, and when they identify a data anomaly, they are empowered to resolve it. Data stewards at the school level, who know their data best, have the final say on the “right” number that will be reported to the district and the State.
For further information on Fairfax Co., Virginia’s Education Decision Support Library (EDSL), visit http://www.fcps.edu/DIT/edsl/index.html.
State Educational Agency Leadership. The SEA plays an essential role in ensuring that data obtained from schools and LEAs will meet reporting requirements – both at the State and Federal level. Since nearly all required data originate in schools and districts, the SEA must provide leadership and guidance to ensure the highest quality data possible. SEAs must develop data systems that ensure all LEAs and schools can report data in a timely manner and with the least amount of burden, while giving the SEA the information and flexibility to meet State and Federal reporting requirements. Data systems in many States and LEAs are undergoing changes and enhancements, in part in response to Federal data needs, but also because there is a greater perceived need for useful and timely data for decision-making. This magnifies the importance to SEAs of developing the technical and operational skills of data personnel at the LEA and school levels. Because a shared understanding of key data systems and procedures is a major factor in creating effective feedback loops among SEAs, LEAs, and schools, improving these skills can improve efficiency and reduce tensions during the collection and review process.

This document contains staff development guidance for practices that are already in place in some organizations, but not in others. It is hoped that all States and LEAs may find these guidelines of use to evaluate the status of existing systems and plan for improving the procedures and systems used to collect data for both decision-making and reporting. The guidelines that follow focus on general principles for organizing and training staff to facilitate what the National Forum on Education Statistics calls “a culture of high quality data.” That culture should pervade all levels of the data organization – from schools to the State Departments of Education and the U.S. Department of Education.

In the Field: Meeting the Data Quality Challenge in a Small LEA

The process of training and organizing staff to establish an efficient, effective data quality team can be a daunting task for any LEA or school. For small and/or rural LEAs with limited resources and administrative staff of only a few people, the challenge is magnified. As a result, some small schools and LEAs take a “we just can’t do it” approach, relying on vendors, a State data office, or other outside agencies to process and validate their data. This hands-off method carries grave risks: potentially erroneous information, which could have serious consequences for funding and accountability, may not be caught by distant data personnel who are not familiar with the particular school in question. While the general guidelines in this document do not fully address the unique set of challenges that many small LEAs face, useful examples exist for implementing critical building-level data review despite limited staff and resources.

In Charles County, Maryland, for instance, data collection and processing are administered primarily from the LEA data office – with key data review and validation functions carried out by a designated data steward at the building level. The LEA, with a staff of two to three people working on data part-time, is responsible for assembling all of its data in its data warehouse. As data files are created for information such as student demographics or assessment results, district staff transmit individual school data to a test coordinator in each building for review and verification. Errors or other data questions are filtered back to the LEA level before the data are finalized for transmittal to the State and the Federal government. At least once per year, the LEA data staff provide training in data quality procedures for all test coordinators. Test coordinators, who may be principals, teachers, or other administrative staff, receive a small stipend for this extra duty – which generally occurs over a 24-hour turnaround period twice per year.
Education: In addition to training in data entry methods and procedures, educate staff on the larger context of the data collections. Where do NCLB and other data originate? Where do they end up? Why are they being collected? Staff who understand the purposes for a data collection and the possible consequences of errors are less likely to “just fill something in” to satisfy the government.
Technical Training: Hold regular standard training sessions for all personnel involved in the NCLB reporting process. These sessions should describe and demonstrate the procedures for NCLB data collection, entry, and reporting.
Documentation: Prepare a State-level data quality handbook, including information on coding, data entry, sample forms, and the larger context of reporting. The handbook should be available on-line, and a summary checklist should be posted prominently wherever data entry takes place.
Ongoing Assistance: Establish a data quality help desk at the LEA or State level, or designate a “go to” person to be available to answer data questions from the field. Having a convenient, dependable resource for authoritative answers can be the difference between “I’ll just fill something in” and getting the data right.
Guidelines for Specific NCLB Data Elements
NCLB Demographic Data
• Designate a single data steward in each school who is responsible for ensuring that data are entered according to standard definitions.
• Train all school and LEA data staff on the Federal definitions of each of the required NCLB subgroup categories.
• Train all data staff in the relationship between NCLB subgroup classifications and AYP determinations.
• Disseminate a list of “translations” between Federal NCLB demographic definitions and State-, LEA-, and school-level demographic definitions.
NCLB Assessment Data
• Designate a single data steward in each school who is responsible for monitoring the correctness of identity information on the assessment forms.
• Train all school and LEA data staff in the content and purpose of the NCLB assessment, and explain the difference between the assessment used to determine NCLB AYP and other Federal, State, and local assessments.
• Train teachers, assessment proctors, and assessment scorers in the specific scoring procedures related to the State’s or LEA’s NCLB AYP assessment.
NCLB Accountability Data
• Designate a single data steward in each school who is responsible for ensuring that the data submitted for accountability purposes are correct.
• Train all data staff in NCLB accountability measures, including definitions of “advanced,” “proficient,” and “basic” in the State accountability plan.
• Train all data staff in the “other indicators” being used for NCLB, including the State’s definitions of elements such as graduation rate and dropout rate.
• Educate all data staff in the uses of NCLB accountability data and the potential consequences of errors in reporting results.
NCLB Teacher Quality Data
• Designate a single data steward in each school who is responsible for ensuring that teacher assignment information is correctly submitted.
• Train all data staff in the reporting requirements for NCLB teacher quality data, including the State’s definition of “fully licensed/certified.”
Technical Focus: Training Staff
What should I include in a comprehensive session to train all data staff on procedures for producing high quality data?
✓ Discussion of the goals and objectives for good data quality
✓ Description of data quality policies and definitions
✓ Identification of key personnel involved in the collection/reporting process
✓ Dates, times, and durations of collection activities
✓ Required procedures for administering specific collections
✓ Limits on acceptable deviation from specified procedures
✓ Ethical and legal responsibilities related to security/privacy
✓ Hands-on practice with the data entry system and the data collection instrument to be used (administering the test or transcribing the record)
✓ Samples of reports that will be produced
✓ The help desk number, or whom to call with a question during collection
(Adapted from National Center for Education Statistics, Cooperative Education Data Collection and Reporting Standards Project Task Force, Standards for Education Data Collection and Reporting, 3-5, 3-6, 4-13.)