275-22 Knowledge Surveys and Course-Based Learning Outcomes for Soil Science Courses.


Tuesday, November 5, 2013: 3:15 PM
Marriott Tampa Waterside, Florida Salon I-III

Susan B. Edinger-Marshall, Forestry and Wildland Resources, Humboldt State University, Arcata, CA and Edward B. Nuhfer, Academic Programs, Humboldt State University, Arcata, CA
Abstract:
How do we gauge the effectiveness of our teaching and student learning? To answer this question, we performed knowledge surveys (KS) in two upper division soil science courses at Humboldt State University. The KS paralleled course-based learning outcomes. We intended that knowledge surveys would serve as content learning guides for students and help them develop self-assessment skills; that they would provide a reliable instrument for recording students' estimates of their own learning gains; and that they would give instructors a way to present a well-organized course with full disclosure of learning expectations to students.

Students assessed their ability to answer each survey question on a 3-point scale during the first and last week of each course. Comments gleaned from anonymous class climate teaching evaluations indicated that students considered the knowledge surveys useful study guides. Knowledge surveys in both sections proved reliable (R = 0.93 and 0.98). Paired t-tests showed that pre-post self-reported gains were significant on nearly all items. The detailed disclosure provided by the instrument also appeared to facilitate a common understanding of course organization between instructor and students.
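For readers who want to reproduce this style of analysis, the minimal Python sketch below shows how the reliability coefficient (Cronbach's alpha) and the item-level paired t-tests described above can be computed, assuming the pre- and post-course self-ratings are stored as a students-by-items matrix. The data generated here are hypothetical placeholders, not the study data.

```python
# Minimal sketch: reliability and pre/post paired t-tests for knowledge survey ratings.
# All data below are simulated stand-ins for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_students, n_items = 30, 40                      # hypothetical class and survey sizes
ability = rng.normal(0, 1, size=(n_students, 1))  # per-student "ability" so items correlate
pre = np.clip(np.rint(2 + ability + rng.normal(0, 0.7, size=(n_students, n_items))), 1, 3)
post = np.clip(pre + rng.integers(0, 2, size=pre.shape), 1, 3)   # 3-point scale, 1 = low

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability of a students x items rating matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

print(f"Post-course reliability (Cronbach's alpha): {cronbach_alpha(post):.2f}")

# Item-by-item paired t-tests on pre vs. post self-ratings.
for i in range(n_items):
    t, p = stats.ttest_rel(post[:, i], pre[:, i])
    if p < 0.05:  # flag items with a significant self-reported gain
        print(f"Item {i + 1}: t = {t:.2f}, p = {p:.3f} (significant gain)")
```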

To discover the degree to which these HSU students could accurately self-assess, we paired measures of competency from a standardized 25-item Science Literacy Concept Inventory (SLCI; Cronbach's alpha R = 0.84) with self-assessment ratings of science literacy registered on a knowledge survey composed of identical items (Cronbach's alpha R = 0.93). Given our small student population (N = 31), the correlation between the two measures was a surprisingly high r = 0.53, close to the highly significant r = 0.6 yielded by a national sample of 420 participants. In general, when these students understood the challenge and said they could meet it, most could. These HSU students scored an average of 84%, in contrast to the national average of 72% for juniors/seniors (n = 3855).
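The pairing of measured and self-assessed competency reduces to a correlation of two per-student scores. The sketch below shows the calculation, again with illustrative placeholder values rather than the SLCI or knowledge survey results.

```python
# Minimal sketch: correlating measured competency with self-assessed competency.
# The arrays are hypothetical; real data would be per-student SLCI scores and
# mean self-ratings on the identical knowledge survey items.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_students = 31                                   # matches the small sample size above
slci_score = rng.uniform(50, 100, n_students)     # hypothetical % correct on the SLCI
ks_self = slci_score / 50 + rng.normal(0, 0.4, n_students)  # self-ratings loosely tracking scores

r, p = stats.pearsonr(slci_score, ks_self)
print(f"Measured vs. self-assessed science literacy: r = {r:.2f}, p = {p:.3f}")
```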

We next paired class content items from the course knowledge surveys with related items on the final exams that addressed the same content. This produced two subsets (one per soils class) of items from the final exams and knowledge surveys, with all four measures having reliability of R > 0.8. On these, we found no significant relationship between students' self-assessment of competency via the class knowledge surveys and their actual exam results on the matched items.
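The class-content comparison can be run the same way once knowledge survey items are matched to the corresponding exam items within each course subset; a short sketch with placeholder data follows.

```python
# Minimal sketch: per-subset test of the self-assessment vs. exam relationship.
# Subset names and all values are illustrative placeholders, not the course data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
subsets = {
    "soils course A": (rng.uniform(1, 3, 15), rng.uniform(40, 100, 15)),  # (KS self-rating, exam %)
    "soils course B": (rng.uniform(1, 3, 16), rng.uniform(40, 100, 16)),
}

for name, (ks_rating, exam_score) in subsets.items():
    r, p = stats.pearsonr(ks_rating, exam_score)
    # p >= 0.05 would indicate no significant self-assessment/exam relationship
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")
```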

Why this disparity? Both significant and insignificant relationships between course content knowledge surveys and exams have been reported in the literature. We believe the disparity may be explained by how closely any two paired instruments are constructed to address the same learning. If not closely aligned in both content and difficulty, such correlations measure the ability to transfer information rather than the ability to self-assess. It may be possible to improve transfer by constructing knowledge survey items at a high level of rigor and difficulty, thus driving students toward over-preparation, which may improve transferability across a wider spectrum of less difficult challenges. We recommend this as a next step in research.
