Goal 1. Raise the quality of education for undergraduate and graduate students by addressing issues ranging from pedagogy to hiring to support

Objective 1.1 Provide a rigorous and accessible curriculum that prepares students for professional careers and lifelong learning

Performance Indicator: Student success metrics, including retention and graduation rates, post-graduate employment rates, and graduate school rates.
Baseline: Most recent data (Fall 2018): FTF retention 63% (2 yrs); FTF graduation 15% (4 yrs), 54% (6 yrs); UDT graduation 19% (2 yrs), 71% (4 yrs); NSF per-capita Ph.D. ranking #1 in the CSU and #8 in the nation (April 2017). Employment and graduate school rates have not been measured.
Action Item: Begin collecting data on employment rates and graduate school enrollment.

Performance Indicator: Course effectiveness metrics, including reduction of Bottleneck and Gateway courses, elimination of opportunity gaps, and an increase in the percent of students with a research experience.
Baseline: (2017-18) Identified Gateway courses: 74; identified Bottleneck courses: 42; courses with an opportunity gap > 10%: 56 (additional data in Appendix J). Percent of students with a research experience: 75% (see Appendix I).
Action Item: Provide professional development for faculty to reduce opportunity gaps and increase overall student success. Conduct a periodic scan of student research or related experiences.

Performance Indicator: Teaching capacity metrics, including Tenure-Track (TT) Density and the ratio of Tenure-Track faculty to majors.
Baseline: TT Density; TT/student majors; percent of students taught by TT faculty: 40% (all students), 43% (lower division), 36% (upper division), Fall 2011 – Spring 2016.
Action Item: Measure and track these metrics annually.

Performance Indicator: Alumni and employer satisfaction.
Baseline: N/A.
Action Item: Create and implement a satisfaction survey.

Objective 1.2 Provide comprehensive professional development for faculty, staff, and administrators that promotes inclusive pedagogy

Performance Indicator: Faculty participation rates in campus-led professional development events; faculty learning communities (e.g., PBLC, Math Pathways); and discipline-specific pedagogical conferences.
Baseline: From a 2015 survey: 28 self-identified pedagogical resources and 17 different strategies for engaging students. Sixteen faculty have participated in ESCALA institute training as of Fall 2018. Other performance indicators have not been collected.
Action Item: Begin annual collection of these and other performance indicators, in cooperation with HSU's Center for Teaching and Learning.

Performance Indicator: Dollars invested in professional development.
Baseline: CNRS has not tracked this data.
Action Item: Begin tracking the data annually.

Performance Indicator: Number of publications addressing teaching and learning in a STEM discipline.
Baseline: CNRS has not tracked this data.
Action Item: Begin tracking the data annually.

Performance Indicator: Inclusive learning opportunities experienced by students in labs and classrooms, as captured through observations and surveys.
Baseline: CNRS has not tracked this data.
Action Item: Design and implement an observational protocol through which meaningful data can be collected.

Objective 1.3 Provide co-curricular support to raise the probability of student success

Performance Indicator: Student participation rates in clubs, mentoring programs, advising, INRSEP, and other STEM events.
Baseline: CNRS has student clubs and advising available to all majors; CNRS has not centrally tracked data on student participation.
Action Item: Begin tracking data annually. Write external grants for mentoring and other STEM supports for students.

Performance Indicator: Direct student support: scholarships, internships, and campus employment.
Baseline: Student Business Services tracks some of this information (through Financial Aid). Approximately 200 CNRS students have been employed annually on CNRS-based grants and contracts.
Action Item: Use the statistical reports from Financial Aid to contribute to an annual report on direct student support.

Performance Indicator: Student sense of belonging in STEM, as measured through STEM-specific surveys and tools such as MapWorks.
Baseline: The HSI-STEM grant measured and compared sense of belonging for first-time students who participated in a PBLC and those who did not.
Action Item: Continue to refine and deploy the HSI-STEM survey on belonging for PBLC students annually.

Objective 1.4 All academic programs are fully engaged in a cycle of continuous improvement

Performance Indicator: Assessment plans are established for all programs.
Baseline: All departments have assessment plans through previous program review cycles.
Action Item: All departments will review and revise their assessment plans by the end of the 2018-19 academic year, as directed by the Center for Teaching and Learning. These plans will integrate assessment of the CORE competencies.

Performance Indicator: Assessment data are collected and reviewed annually by program faculty.
Baseline: The accredited programs (Engineering, Forestry, Soils, Rangeland, and Chemistry) have collected and reviewed student-learning data annually and reported this data formally.
Action Item: All departments will collect and review student-learning data annually and share it with the CNRS.

Performance Indicator: Program review processes (and/or accreditation) are current for all programs.
Baseline: All but one of the CNRS programs are current in their program review process.
Action Item: Implement the new (2019/2020) program review process beginning in Fall 2019.
[Photos: Humboldt Bus Fieldtrip; Forestry PCR]