
{"id":265,"date":"2012-06-15T11:59:55","date_gmt":"2012-06-15T11:59:55","guid":{"rendered":"http:\/\/pages.charlotte.edu\/gregory-starrett\/?p=265"},"modified":"2013-09-24T11:09:09","modified_gmt":"2013-09-24T11:09:09","slug":"best-practices-for-student-assessment-2","status":"publish","type":"post","link":"http:\/\/pages.charlotte.edu\/gregory-starrett\/2012\/06\/15\/best-practices-for-student-assessment-2\/","title":{"rendered":"Best Practices for Student Assessment"},"content":{"rendered":"<p>For well over a decade, universities, governments, and accrediting agencies\u00a0around the world have been discussing new ways to assess and interpret the basic functions of higher education, including faculty productivity and student learning. In 2006, at the request of the University of North Carolina system&#8217;s general administration,\u00a0the UNC\u00a0statewide Faculty Assembly put together an Assessment Task Force.\u00a0One part of the Assessment Task Force was charged with\u00a0thinking about how best to assess and report information about student learning outcomes.\u00a0Specifically, we were asked to evaluate two\u00a0particular assessment instruments, the\u00a0National\u00a0Survey of Student Engagement (NSSE),\u00a0and the Collegiate Learning Assessment (CLA).<\/p>\n<p>A commission appointed by U.S. Secretary of Education Margaret Spellings\u00a0recommended in September 2006 that:<\/p>\n<blockquote><p>The results of student learning assessments, including value-added measurements that indicate how much students\u2019 skills have improved over time, should be made available to students and reported in the aggregate publicly. Higher education institutions should make aggregate summary results of all postsecondary learning measures, e.g., test scores, certification and licensure attainment, time to degree, graduation rates, and other relevant measures, publicly available in a consumer-friendly form as a condition of accreditation. (<a href=\"http:\/\/www2.ed.gov\/about\/bdscomm\/list\/hiedfuture\/reports\/pre-pub-report.pdf\">http:\/\/www2.ed.gov\/about\/bdscomm\/list\/hiedfuture\/reports\/pre-pub-report.pdf<\/a>, p. 23)<\/p><\/blockquote>\n<p>What we found was that, although the idea of comparing student learning outcomes across universities sounds simple and straightforward, in reality such comparisons are difficult and potentially misleading.<\/p>\n<p>Follow this link to our final report to the UNC General Administration, which includes both our thinking about assessment more generally, and our evaluation of the utility of the NSSE and CLA instruments:\u00a0 <a href=\"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-content\/uploads\/sites\/15\/2012\/06\/Student_Assessment_Task_Force_Final_Report.pdf\">Best Practices for Student Assessment<\/a>. The full report is 22 pages long, but there&#8217;s a useful two-page executive summary up front.<\/p>\n<p><strong>UPDATE<\/strong> 1 September 2012:<\/p>\n<p>There is progress in the world. In late June, Chris Jackson of the Council for Aid to Education (<a href=\"http:\/\/www.cae.org\/content\/about.htm\">http:\/\/www.cae.org\/content\/about.htm<\/a>) wrote to me\u00a0about\u00a0changes in some of the technical features of CLA. The changes he outlines in the instrument are welcome, and many of them\u00a0answer the criticisms the UNC Faculty Assembly articulated in 2006 (sadly, this was not a\u00a0direct causal\u00a0relationship!) Here is what Mr. 
Greetings Dr. Starrett:

I wanted to bring you up to speed with information related to the CLA that was either unavailable at the time or has changed since you all wrote your report.

- At the writing of the report, the assembly indicated that longitudinal data was not yet available. Findings from our longitudinal project may now be found [here](http://www.collegiatelearningassessment.org/files/CLA_Lumina_Longitudinal_Study_Summary_Findings.pdf).
- Your assembly found that peer group comparisons were not available through the CLA, which, at the time, was true. Since then we have introduced peer group comparisons by institution size, minority-serving status, institution type (Doctoral, Masters, Bachelors), and sector (public vs. private). Please see this [sample 2011-2012 report](http://issuu.com/chriscla/docs/cla_10-11_report) for examples (specifically pages 12-13).
- In that same report (pages 9-10), you'll also note that we've introduced subscore reporting in the areas of Analytic Reasoning and Analysis, Writing Effectiveness, Writing Mechanics, and Problem Solving. The goal here is to provide information that will help participating institutions define specific areas for improvement. All reports that we provide to institutions are accompanied by an underlying data file, so that they have the opportunity to run local analyses to determine whether there are specific patterns (demographics, program participation, course-taking patterns, etc.) that lead to better attainment of the skills measured by the CLA.
- The report notes, correctly, that in 2006 the CLA value-added model did not control for institutional effects. We have since moved to a [hierarchical linear model](http://www.collegiatelearningassessment.org/files/Steedle_2010_Improving_the_Reliability_and_Interpretability_of_Value-Added_Scores_for_Post-Secondary_Institutional_Assessment_Programs.pdf), which does control for student characteristics that, to some extent, define an institution.
- Finally, it is true that, at the institutional level, CLA scores do correlate quite highly with the SAT and ACT. This, of course, does not mean that they measure the same thing, only that groups of students that tend to do well on one assessment also tend to do well on others, which may be a function of other factors, socioeconomic status not the least among them. Still, though raw cohort scores do correlate well, institutional *value-added* scores have no correlation with SAT or ACT, meaning that all institutions (highly selective and less-highly selective) have an equal opportunity to contribute to student growth in the skills assessed by the CLA.

Again, I realize that this response was unsolicited. However, we do want you (and others) to know that we listen to participating institutions and are constantly working to refine and better our offerings to ensure that they have genuine utility.

Best,
Chris

Chris Jackson
Director of Business Development, CLA & CWRA

It's worth noting that the increased sophistication of the data analysis and reporting may help universities make better sense of the CLA instrument if they choose to use it. But these changes do not affect the general recommendations in the UNC Faculty Assembly's Best Practices report regarding the kinds of issues institutions should take into account when developing assessment policies.

Nor can they address the issue of student motivation to do well on the test in the first place.
I&#8217;ve heard from some students that they regard it as just another hoop to jump through with as little effort as possible,\u00a0on the way to priority registration or a gift card, two of the many sorts of inducements institutions now use in order to get students to spend three hours of their time on\u00a0a test that doesn&#8217;t affect their individual academic record.<\/p>\n<p>Nor, of course, can even the best testing instrument control for the irresponsible media coverage and political use of findings based on test results, as evidenced by the narrow, sensationalistic and utterly misleading\u00a0coverage of the work of sociologists\u00a0Richard Arum and Josipa Roksa (<em>Academically Adrift: Limited Learning on College Campuses<\/em>, University of Chicago Press, 2011) last year.<\/p>\n<p>More on this later.<\/p>\n<p>UPDATE 9\/24\/2013:<\/p>\n<p>For an explanation of what Arum and Roksa actually said about their CLA data, see my subsequent post, &#8220;Academically Adrift with Molly Broad,&#8221; at <a href=\"http:\/\/pages.charlotte.edu\/gregory-starrett\/2013\/09\/23\/academically-adrift-with-molly-broad\/\">http:\/\/pages.charlotte.edu\/gregory-starrett\/2013\/09\/23\/academically-adrift-with-molly-broad\/<\/a><\/p>\n<div><\/div>\n","protected":false},"excerpt":{"rendered":"<p>For well over a decade, universities, governments, and accrediting agencies\u00a0around the world have been discussing new ways to assess and interpret the basic functions of higher education, including faculty productivity and student learning. In 2006, at the request of the University of North Carolina system&#8217;s general administration,\u00a0the UNC\u00a0statewide Faculty Assembly put together an Assessment Task [&hellip;]<\/p>\n","protected":false},"author":558,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[3],"tags":[6,7,9,8,5],"class_list":["post-265","post","type-post","status-publish","format-standard","hentry","category-news","tag-cla","tag-collegiate-learning-assessment","tag-national-survey-of-student-engagement","tag-nsse","tag-student-assessment"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_shortlink":"https:\/\/wp.me\/p69uQY-4h","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-json\/wp\/v2\/posts\/265","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-json\/wp\/v2\/users\/558"}],"replies":[{"embeddable":true,"href":"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-json\/wp\/v2\/comments?post=265"}],"version-history":[{"count":7,"href":"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-json\/wp\/v2\/posts\/265\/revisions"}],"predecessor-
version":[{"id":267,"href":"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-json\/wp\/v2\/posts\/265\/revisions\/267"}],"wp:attachment":[{"href":"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-json\/wp\/v2\/media?parent=265"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-json\/wp\/v2\/categories?post=265"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/pages.charlotte.edu\/gregory-starrett\/wp-json\/wp\/v2\/tags?post=265"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}