opiniontoday.bsky.social

AP-NORC: Public opinion of the Israel-Hamas conflict nearly a year after the October 7th attacks. More, via Opinion Today: opiniontoday.substack.com/p/241002-top...

norc.org

NORC + the University of Arizona teamed up to find a home for Cosmic Explorer, a groundbreaking gravitational-wave observatory. Find out how we're using General Social Survey data to pick the ideal spot for this deep-space research hub and its community.

Building Cosmic Explorer: Where Science Meets Community | NORC at the University of Chicago

NORC is helping select the location of a gravitational-wave observatory so people of diverse backgrounds can live and thrive.

bakerdphd.bsky.social

From an email the Vanderbilt chancellor sent out where he touts spending money to be told water is wet

Screenshot from an email that reads: Vanderbilt’s position remains unchanged from last year, ranked 18th in the nation. However, as I wrote about last October, U.S. News’ flawed criteria conceal some of Vanderbilt’s greatest facets. I have been highly critical of this ranking system for its imprecise methodology, misaligned incentives and reliance on low-quality data, because these rankings are used to help students—and the families who support them—choose where to go to college. The importance of this decision is enormous: It powerfully and permanently affects students’ lives and careers.
Screenshot of article that reads: Vanderbilt is taking a leadership role in helping to change the way students and their families receive information to evaluate their options. Vanderbilt commissioned a study of five prominent university ranking systems by NORC—an independent nonpartisan and nonprofit research organization that is among the most highly respected in its field. The report confirmed what many university leaders have long suspected: that their “methodologies are unclear”; “rationale for the relative weights of various attributes included in rankings is unknown”; “data quality is inconsistent”; and “some factors assessed are highly subjective, but are critical components in the ranking process, which makes it difficult to establish definitive comparisons between institutions.”
Screenshot of email that reads: A major problem, according to the study, is that there is no shared definition of what “good” looks like for colleges, so each ranking creates its own target and then purports to hold colleges to that subjective standard. In many cases, “good” is not academic excellence or the provision of a transformative education—it is an aggregation of various weighted measures that cannot represent any individual student’s needs or desires for their future place of study.
nataliej.bsky.social

Ipsos, CNN, and AP-NORC all use this type of panel. They are difficult to do at the state level because of the number of people needed, and the congressional-district level is impossible. But for national polls, these are great. So don't just write off ALL online polls. Know which ones.
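
A rough, back-of-envelope sketch of the sample-size point above (this is illustrative arithmetic, not from the post, and the panel sizes are assumptions), using the standard 95% margin-of-error approximation for a simple random sample:

```python
# Why national online panels stretch thin at the state or district level.
# Illustrative only: assumes a simple random sample and the worst-case
# (p = 0.5) 95% margin of error, MOE = 1.96 * sqrt(0.25 / n).
import math

def moe_95(n: int) -> float:
    """Approximate 95% margin of error, worst case (p = 0.5)."""
    return 1.96 * math.sqrt(0.25 / n)

national_panel = 2000                     # assumed size of one national poll wave
print(f"National, n={national_panel}: ±{moe_95(national_panel):.1%}")   # about ±2.2%

# Split the same wave evenly across 50 states and each subsample is tiny.
per_state = national_panel // 50
print(f"Per state, n={per_state}: ±{moe_95(per_state):.1%}")             # about ±15.5%

# Respondents needed in EVERY state for a ±4% estimate:
target_moe = 0.04
n_needed = math.ceil(0.25 * (1.96 / target_moe) ** 2)
print(f"Per state for ±{target_moe:.0%}: {n_needed}")                    # 601
print(f"Across 50 states: {n_needed * 50}")                              # 30,050
```

Spreading a typical national wave across 50 states leaves subsamples with double-digit margins of error, which is why credible state-level (let alone congressional-district) estimates require purpose-built, far larger samples.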

bakerdphd.bsky.social

They paid for these findings!!! Now I gotta find my interview for Science because I feel like I'm taking crazy pills.

Screenshot from the article mentioned upthread that reads: For decades, college rankings have sought to distill many abstract qualities into a digestible, numbered list. For this analysis, researchers reviewed the methodologies of five popular rankings to assess whether they accurately capture the qualities that form an overall “ranking.”  

The analysis was independently conducted by researchers from NORC at the University of Chicago, a 501c3 organization. Funding for the analysis was provided by the Vanderbilt University Office of the Chancellor.
Screenshot of link from upthread: Key findings include:

Methodologies are unclear, and the rationale for the relative weights of various attributes included in rankings is unknown. The concepts captured by the rankings also are not clear. Researchers say this makes it impossible to know exactly what is being measured and how much it should “count” in a final assessment. 
Data quality is inconsistent, which hinders accurate assessments of various measures, even openly defined ones like graduation rates, student debt, and value-added earnings. 
Some factors assessed are highly subjective but are critical components in the ranking process, which makes it difficult to establish definitive comparisons between institutions.