Joint Forces — Testing/Curriculum meeting

Joint Forces (Testing/Curriculum Committee) Meeting Minutes 9/9/16

In attendance: Nigel Caplan (Chair, Curriculum Committee), Walt Babich (Chair, Testing Committee), Ken Cranker, Mike Fields, Jo Gielow, Kathy Vodvarka, Mikie Sarmiento, Scott Partridge, Bridget Casterline, Sue Walton, Joe Matterer.

  1. The purpose of this Joint Forces Committee is to fulfill a CEA mandate to “regularly draw upon retention, test, and other data to inform our planning and make curricular, textbook, or length/structural changes, as a result.” So, that’s what we’re going to do.
  2. What data do we need to do this?
    1. Promotion/Retention data
    2. Course evaluation data (the middle set of questions on the learning outcomes, plus “do you recommend this course”)
    3. Placement test data
    4. Final test data
    5. Analysis of students’ writing and speaking
    6. Needs analysis data
  7. Alumni surveys (also a CEA commitment, by the way)
    8. Probation data (how many students get on/off probation)
    9. CAP student tracking: cumulative GPA, E110 grade, first-year GPA (first 2 semesters), major GPA
    10. Diagnostic data
    11. Research project: tracking students’ vocabulary growth
  3. What questions do we have that might be answered with data?
    1. Why is the retention rate high and climbing in Level IV? [Prioritized for the Curriculum Committee this year. Consider distribution of retention among new, continuing, and retained students, as well as student course evaluation data.]
    2. Do students with “split level” placements perform worse in the higher of the two courses? Do split-level students make equal progress in both skill areas? Does split-level placement affect some L1/national groups more than others? [Prioritized for the Testing Committee this year.]
    3. What gaps in our curriculum do different groups of alumni identify (e.g. undergraduates, graduates, [business] professionals, lawyers, study abroad groups, etc.)? [Prioritized as soon as the alumni needs analysis can be developed.]
    4. Do student outcomes suggest that the placement test is working well?
    5. Does the class swap procedure result in lower diversity in some classes/levels?
    6. What does students’ writing/speaking tell us about skills that need to be taught/reinforced/assessed better at particular levels? Do L1/country of origin differences correlate with teaching and assessment outcomes?
    7. Are there courses which are underperforming in terms of outcomes and satisfaction that should be redesigned or reevaluated?
    8. Do the S3 clusters work? Can we track the number of students who work their way out of academic difficulty and meet their goals (e.g., university matriculation)?
  4. Action points:
    1. Registrar’s office will liaise with UD Institutional Research to collect CAP (undergrad and grad) tracking data each summer, and share with the Joint Forces committee each Fall, with recommendations proposed at the Spring retreat.
    2. Both committees ask level supervisors to ensure that teachers are conducting diagnostics (especially in writing and speaking) at the start of the session. Ideally, each course will use the same diagnostics in the interest of consistency and to allow comparisons.
    3. Item for advisory committee: We would like teachers to be able to use diagnostic data to report students who appear to be misplaced (too high). Since we have noticed errors creeping into the promotion system (teachers accidentally hitting the wrong key, students being inadvertently allowed to move between levels, students entering EAP/Grad VI having failed Gen VI), there should be a mechanism early in the session to ask that a student's level be checked against prior records.
    4. Registrar’s Office is considering removing the Promotion/Retention field from teacher grade reports since there is no discretion here: the outcome is based solely on grades, which the computer can verify (if they are entered correctly!).
    5. Item for advisory committee: We would like clarity on the procedure for updating course evaluation forms. Currently, it appears to be level supervisors’ responsibility to ensure that the eval forms align with learning outcomes. The Curriculum Committee reviews and approves changes to learning outcomes but does not oversee evaluation forms. Since a project is also underway to move the evaluation forms online (thus removing the 28-question limit), this would be an opportune moment to review the entire form, perhaps at the Spring Retreat.
    6. Admissions Office: Can we collect students’ home language on the application or intake form? This would enable finer-grained analyses of data (e.g. giving us the option to cross-tabulate by either country of origin or L1) and collect useful data for US citizens/culturally and linguistically diverse learners in the future.
    7. Course evaluation task force: Please consider procedures for sharing requested course evaluation data with the curriculum committee.
    8. Administration, advisory, and other committees: please look at our list of questions and consider adopting one or two as projects.

Respectfully submitted,

Nigel Caplan

Facilitator, Joint Curriculum/Testing Committees

9/12/16