Dev Ed/ABE Toolkit: Chapter 5
For unfamiliar terms, agency names, or acronyms, please see the Glossary.
Documenting Results of the Dev Ed/ABE Partnership
There is much debate about how to define and measure student success in higher education. At first it may seem simple: a successful student is a first-year student who enrolls in college, persists through semesters, and advances to degree completion. However, the routes students take in college are seldom simple and straightforward. Since the millennium, there has been ample information about the common measures institutions of higher education use to gauge student success: 1) Persistence, 2) Retention, 3) Time to Completion, 4) Academic Performance, and 5) Graduation rates.
Chapter 1 stated:
“Regardless of size and scope [of the partnership level of integration], there is already anecdotal and measurable data demonstrating benefits to students. These include the shortening of time in developmental education, expeditious progression into their program of study, and saving of time and money. In addition, through the additional services offered with ABE as a partner in the redesign, students are taught literacy, technology, work, college, and career readiness skills so they can move through school and into living wage jobs faster.”
So which data might be collected, and how, to attribute these benefits to the partnership?
The Need for a Common Data Set
Data collection by existing Minnesota State Dev Ed/ABE partnerships is still somewhat spotty and uneven. Many partnerships report that they track student success in a single course (e.g., improved pass rates over time). However, longer-view measures (retention rates, length of time for students to enroll in credit-bearing courses, time to program completion, and career readiness) are not yet formally and consistently tracked. Nevertheless, there is a growing body of evidence that student success in a single course has increased with the integration of ABE expertise into developmental education classes.
Because partnerships are collecting data by varying means, a lofty goal of this work in Minnesota is to develop a common data set for all Dev Ed/ABE partnerships to use, which, if implemented, could track and document both single-course student achievement results and student progress over time (e.g., time to credit-bearing courses), and compare each partner program's desired and actual outcomes. This could provide a means of measuring partnership effectiveness across the state and inform partnership improvements for the benefit of the students being served. In this way, a case may well be made with comparative data from before and after implementation of a Dev Ed/ABE partnership. However, some data is being collected now; see the section below describing this.
A draft common data set was developed in 2021 and vetted across many segments of those engaged in Dev Ed/ABE partnerships. There was general agreement that such data would be valuable in the aggregate; however, actually implementing a common data set is a difficult problem to address. Here are a few of the current challenges:
- Data collection beyond what instructors are able to measure in the classroom requires both a financial and a personnel commitment at each institution with a Dev Ed/ABE partnership. The Dev Ed/ABE partnership projects are not owned or sponsored by the overall Minnesota State system of colleges, so there is no authority or protocol available to require each of the institutions involved in Dev Ed classroom partnerships to collect certain data. Each partnership's administrators and faculty would need to assess their own campus's ability to commit to such an undertaking.
- A recent obstacle to collecting comparative data has been the COVID-19 pandemic: partnerships were not able to teach in any way that approximated the success of their in-class methods.
- It is unknown exactly what the source of the data would be for the measures that go beyond a single course (see Draft Common Data Set).
- While some data could be pulled from Minnesota State's central research unit, it would take a collaboration of system (college) administrators to request this and to fund the project, both initially and over time.
- Dev Ed data collected at the system level from the individual system colleges do not specify whether a Dev Ed student took a class that utilized a partnership, or whether a partnership was available as an additional resource for the student. It is highly doubtful that the courses are coded as partnership courses.
Nevertheless, it is hoped that, at minimum, the first few rows of outcome measures listed in the Draft Common Data Set will be feasible to collect. As a fallback, an ongoing request will be made to all existing Dev Ed/ABE partnerships to send whatever data they do collect, so some attempt can be made to officially document known student improvements and partner program challenges. This will provide a data sampling across many of the colleges in the system.
How is Student Progress in Dev Ed Presently Being Assessed?
Presently, several of the Dev Ed/ABE partnerships in Minnesota document student outcomes in courses where classroom collaboration is taking place, mostly through pre- and post-tests that evaluate students' skills and educational functioning throughout their course of study. To date, these have included ACCUPLACER®, used to place students in courses that match their skills, and TABE (Test of Adult Basic Education) or CASAS (Comprehensive Adult Student Assessment System), National Reporting System (NRS)-approved assessments that Minnesota ABE uses to establish a baseline, measure a student's educational functioning level throughout the course of study, and determine measurable skill gains over time.
In addition to these tests, colleges may track AA (Associate of Arts) completion along with attendance. The Developmental Education committee at the college may also track student grades (the number of students who passed the course) in their reporting system to determine successful completion of classes. And counselors may gather data from previous semesters to look at retention, individual/cohort GPAs, and successful classes vs. classes where students had greater difficulty. This information is shared with college and ABE instructors, staff, and administration.
Minnesota ABE requires all students enrolled in ABE programming, including those dually enrolled in Dev Ed/ABE collaborative classes, to complete an initial assessment using either the TABE or CASAS tools within the first twelve hours of instruction, followed by post-assessments after every forty hours of instruction, at minimum. In addition to the NRS-approved tests, ABE has access to online courses that can be used to support instruction, many of which include diagnostic, formative, and summative assessments that can measure student progress between formal assessments. In much the same way, Developmental Education programs measure and document student progress throughout instruction through embedded benchmark testing with quizzes, tests, homework assignments, etc. Many colleges also allow students to re-take ACCUPLACER® tests at least once after initial assessment. Although ACCUPLACER® is a placement test and not a diagnostic test, re-takes can offer a snapshot of increased skills built over time as a result of instruction, and it is not uncommon for students in Developmental Education to make significant gains in ACCUPLACER® scores after receiving focused, remedial instruction and support.
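For programs automating their tracking, the assessment cadence described above (an initial TABE or CASAS assessment within the first twelve hours of instruction, then a post-assessment after every forty hours, at minimum) can be sketched as a simple check. This is an illustrative sketch only; the function and parameter names are hypothetical, not part of any official ABE reporting tool.

```python
from typing import Optional

def assessment_due(total_hours: float,
                   hours_at_last_assessment: Optional[float]) -> bool:
    """Return True if a TABE/CASAS assessment should be scheduled now.

    total_hours: instructional hours the student has accumulated.
    hours_at_last_assessment: hours accumulated at the last assessment,
        or None if the student has not yet taken the initial assessment.
    """
    if hours_at_last_assessment is None:
        # The initial assessment must occur within the first 12 hours
        # of instruction, so schedule it as soon as instruction begins.
        return True
    # A post-assessment is required after every 40 additional
    # instructional hours, at minimum.
    return total_hours - hours_at_last_assessment >= 40
```

A student with 45 hours of instruction who was last assessed at 10 hours is not yet due (35 additional hours), while one at 50 hours is.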
There are also several qualitative successes that can and should be documented to demonstrate the value a partnership adds to collaborative programming. Examples include lowered cost to students through embedded and integrated programming, and access to free or reduced-cost resources supported through the partnership. For example, students enrolled in Adult Basic Education programs in Minnesota have access to several highly vetted, content-aligned distance learning platforms that can enhance and support instruction and potentially eliminate the need for college textbooks altogether, especially at the developmental level. Just-in-time support for basic skills instruction that supports class content may result in better scores on performance measures such as homework, quizzes, and tests, as will increased access and wraparound support for at-risk students through tutoring, coaching, review, outreach, and online access. Multiple measures (see Glossary) for assessment (including ACCUPLACER®, Next Generation ACCUPLACER®, TABE, CASAS, and others) provide instructors with evidence of growth and achievement over time. Integrated classes also develop stronger relationships and trust among students and collegiality among staff.
How to Implement Your Partnership’s Data Collection Process
Identify what information is wanted prior to the start of the semester. It is much easier to gather data when you know the questions you will want answered. Some of this will be real-time data, used to guide decisions while the courses are underway and to make larger modifications between semesters. Some will be longitudinal data, which is much harder to gather if you haven't decided how to track students long-term. Each partnership will find different value in the various data, but some partnerships have found the following types of data useful:
Quantitative Data:
- Starting Enrollment
- Drops/Withdrawals
- Past classes (if applicable)
- Longitudinal data – future classes and retention
- Enrolled/Completed/Success – if possible
- Attendance
- Class
- Additional support (how often are they accessing resources outside of the classroom- office hours, tutors, study groups, advising, counseling, etc.)
- Overall grade or incomplete status
- Performance on pre- and post-testing (ABE facilitated)
- Diagnostics to identify common skill gaps (if possible)
- Distance learning lesson completion
- Communication – how often students are engaging with instructors outside of class
Qualitative Data:
- Student reflections
- Track responses to questions related to soft/technical skills
- Use Likert scales to capture ranges
- Changes in “Beliefs” about subject area (or other identified goals)
- Changes in soft and technology skills over the course of the class
- Regular student check-in surveys
- Anecdotal data
- Feedback from other departments
- Conversations with students
- Word of mouth referrals from students
- Class participation (what worked well or not so well)
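A partnership that records even a few of the measures above can compute simple semester summaries, such as a course pass rate or an average check-in rating. The following is a minimal sketch, assuming a per-section record with hypothetical field names (not part of any Minnesota State or ABE reporting system), that combines two of the quantitative measures (starting enrollment, drops/withdrawals, pass counts) with Likert-scale check-in survey responses.

```python
from dataclasses import dataclass, field

@dataclass
class CourseSection:
    """Hypothetical per-section record for a Dev Ed/ABE collaborative class."""
    starting_enrollment: int
    drops: int = 0                # drops/withdrawals during the semester
    passed: int = 0               # students who passed the course
    likert_responses: list[int] = field(default_factory=list)  # 1-5 scale

    def pass_rate(self) -> float:
        """Share of non-dropped students who passed the course."""
        completers = self.starting_enrollment - self.drops
        return self.passed / completers if completers else 0.0

    def mean_likert(self) -> float:
        """Average rating from student check-in surveys."""
        n = len(self.likert_responses)
        return sum(self.likert_responses) / n if n else 0.0

# Example: 25 enrolled, 3 withdrew, 18 passed, four survey responses.
section = CourseSection(starting_enrollment=25, drops=3, passed=18,
                        likert_responses=[4, 5, 3, 4])
# section.pass_rate() is about 0.82; section.mean_likert() is 4.0
```

Whether drops are excluded from the pass-rate denominator, as here, is a local reporting choice; partnerships comparing results should agree on one definition.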
Toolkit Navigation
View full Toolkit
You can view and download the full Dev Ed/ABE Toolkit.
General Toolkit inquiries and/or feedback should be directed to:
- Russ Fraenkel, Consultant, dba Leading-Edge Collaborations, Inc., [email protected]
- Lesley Blicker, Consultant, dba DesignWorks, [email protected]