July 15, 2015
Here is the IBO’s backfill information, sorted by charter network
The new information illustrates the sharp divide among independent charter schools over how they filled seats in older grades.
February 22, 2011
Report: New York collects the data it needs but isn't using it yet
New York State collects all of the student data it needs to meet federal guidelines, but it has a long way to go before it uses that data in ways that will boost student achievement. That's the conclusion of a report released last week by the Data Quality Campaign, an initiative of a group of education and legislative organizations meant to help states build their data tracking systems.

Since 2005, the group has surveyed each state's education department to draw a picture of what kinds of student information school systems gather and how they use it to boost student achievement. The campaign compares each state's system against a list of ten elements it calls "essential" for robust data tracking and another ten "actions" it says states should take to make the best use of the data they collect. The initiative's list matches the federal government's criteria for student data tracking systems, which all 50 states committed to building as a condition of receiving stimulus dollars.

New York's data tracking methods now include all ten of the elements the campaign says make up a good system. (Four of those elements — including the capacity to follow K-12 students into college and track their progress, and a way to link teachers to their students' information — have been added in the past year.) But the state has so far taken only four of the ten steps the campaign says are needed to put that system to good use. Among the actions New York has yet to implement: it has not made data for all tracked students available to their teachers and parents through online portals, nor has it developed "progress reports" for individual students that track performance over time for parents and teachers.
January 13, 2011
Union formally appeals court's decision on teacher ratings
As expected, the city teachers union today formally appealed the State Supreme Court ruling that would allow the city to make public teachers' names and performance ratings. On Monday, Justice Cynthia Kern ruled that the city must respond to media outlets' Freedom of Information Law requests for Teacher Data Reports with individual teachers' names attached. The union had sued to stop the release, arguing that releasing individuals' performance ratings would illegally invade teachers' privacy. The union also argues that the ratings are too flawed to be made public.

In its brief appeal, the union argues that the lower court erred in its interpretation of the law, saying that the judge should have determined whether the city could legally deny the media organizations' request. In the original decision, Kern ruled that the city had not acted in an arbitrary or capricious manner when it decided to release the teacher rankings, but she explicitly did not make a judgment on whether the city could legally have chosen not to make the ratings public under the Freedom of Information Law.

In a statement released to reporters when the decision was handed down on Monday, city lawyer Jesse Levine said that the city would respect the union's appeal and would not release the ratings before a second court had ruled.
December 20, 2010
Union requests formal investigation of data reports' accuracy
The city teachers union today formally asked the comptroller and the special commissioner of investigation to examine the accuracy of the Department of Education's teacher ratings. The move comes after an ongoing back-and-forth between the union and the city over how city officials ensure the accuracy of the data that determine the ratings. Yesterday, the union called a press conference to share stories of teachers who discovered that their data reports rate their effectiveness based on students and subjects they had never taught.

The feud over the ratings began in October, when city officials announced that they intended to release the teacher rankings to reporters. Union officials began collecting examples of errors on the reports, and then sued to block the release, arguing that the reports were too riddled with inaccurate information to be released. Teachers union President Michael Mulgrew said Sunday that his staff has documented at least 200 cases in which teachers' reports include errors. In its court filings, the union gave nearly 20 examples of reports, with teachers' names redacted, that it claims reflect errors.

But city officials countered today in a letter to Mulgrew that because no names were attached to the examples the union cited, they have been unable to verify the claims. The letter, signed by Deputy Chancellors Shael Polakow-Suransky and John White, asked the union to share the details of those cases.
December 6, 2010
UFT: Value-added ratings don't accurately measure quality
Laying out its case for why the courts should stop the Bloomberg administration from releasing teacher effectiveness ratings, the city teachers union described the ratings as internal, incomplete, and riddled with flaws. The union is trying to block the city from releasing the names and ratings of nearly 12,000 teachers, arguing that releasing them would be an invasion of teachers' privacy.

The bulk of the materials filed today were prepared by United Federation of Teachers researcher Jackie Bennett and are intended to show that the data reports are inaccurate. "The UFT's review of the TDR's has revealed that a large portion of the reports received are materially flawed as they have been calculated based on errors in student lists," Bennett writes. "In addition, most of the flaws identified came from the most recent year's TDRs, for which information was slightly less opaque and memories were fresher," she continues. "Yet, the TDRs contain three more years of historical student lists and information, lumped in aggregate numbers. The UFT found it very difficult, if not impossible, to penetrate that information, even in a superficial manner."

The union has been encouraging teachers to report errors on their reports since city officials announced in October that they intended to release the reports publicly. To support Bennett's argument, the union filed nearly 20 examples of individual data reports that it says show errors.
October 26, 2010
Union mobilizes teachers to find and report errors in ratings
In the next stage of its effort to block the release of thousands of teacher data reports, the city teachers union is mobilizing educators to scrutinize their reports for errors — even setting up a dedicated phone line to monitor concerns. Last week, the city announced that it would release a list of teachers' names and their effectiveness ratings to reporters who had submitted Freedom of Information Law requests. The union has sued to stop the release, and the city agreed to postpone publicizing teachers' names until a hearing is held in court next month.

The union asserts that the ratings should not be made public in part because they are non-finalized, often error-prone internal documents. To make that case, the union is asking teachers to comb their reports for mistakes and report any they find. The union sent teachers a sample report showing how to look for mistakes, and it has set up a dedicated phone line and e-mail address for concerns about the accuracy of the ratings, according to a memo union President Michael Mulgrew sent teachers last week. A union spokesman said that, as of Friday, at least 200 teachers had called the union to report errors.

Department of Education spokesman Matthew Mittenthal said that the city had seen an increase in the number of calls since the union sent out its memo. But he said that the majority of calls were prompted by misunderstandings of the reports rather than by inaccuracies. Still, Mittenthal said, the city plans to check teachers' complaints and fix any problems it finds before releasing the reports publicly.
July 16, 2010
King outlines plans for $48M expansion of troubled data system
The state education department will spend nearly $48 million over the next three years completing a database that will track students' test scores, courses, and teachers from the beginning of their schooling to the end. The database system has been hailed by state education officials as a key tool in their reform efforts. It's intended to help the state use student test scores and grades to judge not only schools and teachers but also, for example, the programs that trained those teachers. Education officials also say the system will be instrumental in identifying students at risk of dropping out of school early on.

The state already tracks some information about students from kindergarten through twelfth grade. The data system launched in the 2006-07 school year with an expected cost of $39.4 million over six years, but it got off to a rocky start, plagued by delays in reporting data. In a memo to the Board of Regents in advance of their Monday meeting, State Education Department Deputy Commissioner John King argued that the current system, while improved, doesn't meet the needs of schools or help advance the Regents' policy goals. He continued:

Furthermore, the system was not user-friendly; school officials complained frequently about the infamous electronic "spinning cube" that caused long delays in reporting and verifying data. Data collection was therefore slow, and the Department missed federal deadlines in reporting school accountability and other results.
August 31, 2009
A teacher wishes ARIS had more data about her students
As teachers start gearing up for the first day of classes next week, many are logging in to ARIS, the city's online school data warehouse. But some are finding that despite all that ARIS offers, it still isn't in sync with what teachers really need. Miss Brave, a second-grade teacher, writes on her blog:

Now, I am all about the lists and charts and organizational tools, but I'm already frustrated by ARIS. Maybe it's because I've got second graders, so there's not exactly that much data to go on, but almost every single data field on my students was blank, and the ones that were there were cryptic. My new student from another school has an IEP, but I can't tell what's on it. Several of my students have "health alerts," but I don't know what they are. And a handful have "closed 407s," which (because I am a huge dork) I had to research to find out what exactly that meant. (As far as I can tell, it means they were absent a lot, and the DOE investigated.) This is my third year in the system, and I don't see how I'll ever keep pace with all the acronyms and numbered abbreviations. But all the tools we use at my school to measure student progress -- running records and Everyday Math assessments and checklists and such -- don't factor into ARIS.