Report from “Future is Now 5.0” Conference (Salem State/May 14, 2008)

The theme of the conference was course-embedded assessment. Presenters covered a wide range of conceptual, policy, and practical issues related to implementing, reporting, and using results from embedded assessments, primarily of student course work.

The opening plenary session was led by Dr. Charles Blaich, Director of Inquiry for the Center of Inquiry in the Liberal Arts at Wabash College in Indiana. Wabash has a significant grant from the Lilly Endowment to coordinate a longitudinal study (involving not just Wabash, but 25 other institutions) of “student growth” over the course of a college career. “Student growth” encompasses not just the usual academic indicators (retention, degree completion, GPA), but also less tangible indicators such as engagement with the co-curriculum, leadership, critical thinking, and moral reasoning. Students are extensively tested and surveyed to establish baselines at initial enrollment; measures are then repeated throughout their undergraduate years and the results correlated with each other. NSSE is a particularly important data source.

Results to date seem to clearly support the thesis that student growth is best promoted by “quality” interactions with faculty members. “Quality” interactions include those around high expectations, clear and well-organized presentations of course materials, prompt feedback on assignments, and interactions in and out of class which affectively demonstrate interest in students and respect for their time. Another key factor in student growth is experiences with diversity. Experiences at institutions in the Wabash study group suggest that these need to be structured and to some extent enforced (e.g., workshops with mandatory attendance, rather than just a lecture open to the student body) if they are to overcome students’ tendency to sort themselves into comfortable, like-minded categories.

Dr. Blaich noted that faculty and administrators often fear the process of assessment, expecting it to yield negative data that will have to be explained or justified under uncomfortable conditions. He suggested that a more useful approach to engaging reflection on assessment data, regardless of the source, is to begin by asking “Who did well on this measure, and why?”

Attendees then transitioned to breakout sessions.

The session on Embedded Assessment in Practice was led by four faculty members from North Shore Community College. All described assignments which they had given to multiple sections of the same course, and what they learned from reflecting on their results. The first presenter, from Office Technologies, determined that students who had previously taken keyboarding performed significantly better on a business communication assignment than students who had not taken a formal keyboarding course. The department has since changed course sequencing.

The second presenter, from the Business Department, assigned a business case study to students and evaluated their responses on a critical thinking scale. He determined that students were much better at identifying problems and potential solutions than they were at assessing potential solutions to identify the best course of action, or at suggesting how a selected course of action might be evaluated after implementation. The instructor has changed his instruction to include more discussion and more practice of the later steps in problem resolution.

The third presenter, from Graphic Design, has a “capstone” logo-design assignment which requires students not only to develop a logo for a company, but also to present it to a “board” consisting of classmates. She found that students often struggle with pitching their ideas, some suffering such stage fright that they would rather fail the presentation than make it, and others unable to be concise. She, too, has changed her instruction so that students are better prepared by the time they complete the assignment.

The second session, Embedding Assessment in Online Course Management Systems, was led by Curtis Naser of Fairfield University in Connecticut. Dr. Naser has developed a system called Eidos, similar to True Outcomes or Tk20, which he eventually hopes to market. He demonstrated how the system compiles and presents results from student work, and how it selects sample assessments for validation studies.

One point of discussion was how to assess growth over time and experience, whether via the same rubric applied at different points in a student’s career, or retrospectively, by taking early and late samples of student work and assessing them at the end of the student’s career. Fairfield does the latter.

Two take-away points: first, any assessment-management system quickly (a) becomes enormously complex and therefore (b) presents a new user (institutional or individual) with a steep learning curve. Second, not every potential user will force themselves to climb that curve. In Fairfield’s five years of experience, faculty utilization of Eidos has ranged from 40% to 70%, depending on the department. He did not indicate the rate of student utilization. The University considers the present rate acceptable.

The final session, Demonstrating Critical Thinking Skills, involved two faculty members from the Community College of Rhode Island. Like the presenters in the first session, they mainly covered their own courses, assignments, and assessment results. Unlike the first session, they did discuss how course-level assessment data could “migrate up” to the department, campus, and university level, though that migration is at a very early and tentative stage at CCRI.