Following its three-year run with the Multi-State Collaborative and its own “All In” assessment project in 2017-2018, PCC ushered in a new era of college-wide assessment in 2018-2019. This was the year PCC’s General Education disciplines, known collectively as the DSACs (Discipline Subject Area Committees), first piloted their new rubrics and the first drafts of their signature assignments designed to measure attainment of the following outcomes:
| DSAC | Outcome (each has a self-titled rubric) |
| --- | --- |
| Arts & Letters | Integrative Learning |
| Cultural Literacy | Cultural Literacy |
| Science, Math, & Computer Science | Quantitative Literacy |
| Social Science | Social Inquiry & Analysis |
Academic Affairs asked the faculty who were piloting assignments to volunteer their students’ papers (aka ‘artifacts’) for a spring peer scoring project called College-wide Assessment. At the end of winter term and in early spring, several hundred artifacts were collected, coded, and redacted for anonymity. Cadres of faculty representing each DSAC were recruited to score the student artifacts that spring. Norming sessions were held, led by faculty who had collaborated on the rubrics from the beginning. The assessments that occurred in 2018-2019, and again the following year, were less about measuring student performance and more about refining the signature assignments. Some or all of the artifacts were scored by two people, and each scorer was assigned a caseload of at least 32 artifacts. After submitting their scores, each rater was asked to give feedback on a few of the assignments that had generated the papers they scored.
The scores were downloaded into data-crunching software and regrouped by artifact ID. After the data were entered into a spreadsheet, inter-rater reliability was checked by a member of the psychology faculty. The purpose of this check was to monitor and improve agreement between different faculty scorers rating the same artifacts. This matters because, while the dimensions and levels of our rubrics are carefully described, the descriptions must be interpreted and applied in the context of each student artifact. Instructors need to be able to agree in their ratings of student work to ensure the validity and reliability of our use of these rubrics here at PCC.
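The report does not say which agreement statistic was used, but a common choice for two raters scoring the same artifacts is Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. As a minimal sketch (the scores below are made-up rubric ratings, not PCC data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical scores
    to the same set of artifacts: (observed - expected) / (1 - expected)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of artifacts where the two scores match.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal score frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical example: two raters scoring ten artifacts on a 1-4 rubric scale.
a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
b = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]
print(round(cohens_kappa(a, b), 3))  # → 0.714
```

A kappa near 1.0 indicates strong agreement; values well below that are a signal to revisit norming or sharpen the rubric's level descriptions.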
Summary data from the 2018-2019 College-Wide Assessment Project are linked below. Note: We continue to develop our processes around assignment design and inter-rater reliability.
2019 College-Wide Assessment Summary Documents
Analyses of Totals:
Also see: Latest rubrics for all outcomes except Cultural Literacy. The Cultural Literacy rubric used in 2019 was found to be flawed and is under re-development in 2020-2021.
College-Wide Assessment Q&A for Faculty (2018)