Reinhardt University SACSCOC Compliance Certification

 

3.3.1.1 Institutional Effectiveness:
Educational Programs

The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes,
and provides evidence of improvement based on analysis of the results in each of the following areas:
(Institutional effectiveness)

3.3.1.1 Educational programs, to include student learning outcomes
Judgment: Compliance

Narrative

All educational programs at Reinhardt University have identified expected student learning outcomes, assessed the extent to which they achieve these outcomes, and provided evidence of improvement based on analysis of the results.  As documented in the 2014-15, 2015-16, and 2016-17 Academic Program Assessment Reports, each academic program uses a variety of methods to assess student learning outcomes within the major, which are linked to university-wide general education student learning outcomes. Within these reports, program coordinators propose curricular and other changes based on assessment results and determine their budgetary implications, which are prioritized by the Deans and considered during the budget process.   Assessment methods are continually being developed and refined as determined by assessment results.  In addition, each year program coordinators are asked to provide a status report on the improvements they recommended in the previous year.

 

Overview of Reinhardt's Academic Assessment, Planning, and Budget Process

As evidenced in CR 2.5, Reinhardt University has a comprehensive and continuous academic planning and evaluation process which calls for program review, annual evaluation of instruction by students, course reflections by the faculty, and assessment of student learning outcomes at the program and course levels. This multifaceted approach to institutional effectiveness flows from the University’s mission statement and its strategic plan and results in a comprehensive assessment of academic program learning outcomes.

At the end of each academic year, program coordinators are required to submit to their dean an Academic Program Assessment Report. See compilations of academic program assessment reports for the past three years. Working with program faculty, coordinators report their expected learning outcomes, assessment methods and benchmarks, and results for each method of assessment.  The current reporting guidelines [1] prompt faculty to analyze results for each assessment, identify needed changes and their budgetary impact, summarize strengths and weaknesses of the program, and follow up on the implementation of previously recommended improvements.  Program coordinators also submit a copy of the report to the Director of Institutional Research and Effectiveness, who uses a rubric [2] to evaluate the appropriateness of the assessment methods and other elements of the report and may suggest ways to improve the quality of assessment efforts for the next assessment cycle.  See examples of feedback provided to program coordinators. [3] [4] [5]

The improvements that are proposed in the Academic Program Assessment Report are evaluated and prioritized by each dean who also reviews their programs' strengths and weaknesses and any budgetary implications. The deans summarize program assessment results and proposed changes in the School Annual Report, which includes other measures of performance. An example of a school report is included in the supporting documentation. The summary of program assessments starts on page 13. [6] The School Annual Reports are submitted to the Provost, who reviews them and seeks advice and counsel from the Deans Assembly, a group comprised of school deans and academic unit directors, in order to prioritize academic initiatives for the coming year. A copy of the School Annual Report is sent to the Director of Institutional Research & Effectiveness.

The Provost submits selected academic initiatives and their budget implications [34] for further prioritization to the Leadership Team, comprised of the President, Provost, Vice Presidents, and other senior administrators.  Improvements and budget recommendations are considered by the President and Leadership Team when constructing the budget.  All resulting goals are aligned to the University's strategic goals.  A draft of the new annual goals and budget is shared by the President and Leadership Team with University constituencies including faculty and staff for comments and additional concerns. The budget is then finalized and submitted to the Board of Trustees for approval. The deans are responsible for implementing the identified academic initiatives and managing budget allocations for their programs.  The cycle is then repeated each year.  The following diagram describes Reinhardt’s Academic Assessment, Planning and Budget process.

 

Student Learning Outcomes

Reinhardt University has identified expected outcomes for educational programs, to include student learning outcomes. All degree programs have stated student learning outcomes, and they are reported in the annual Academic Program Assessment Report and published in the academic catalogs. [7] [8]

Furthermore, instructors are required to state course objectives in their syllabi and link them to program outcomes or the general education learning outcomes, where appropriate. Similarly, the University requires all proposals for new degree programs to include statements of student learning outcomes.  The MFA Proposal provides an example; see page 3. [9] New undergraduate programs are also required to link their student learning outcomes to the university student learning outcomes. New program proposals are reviewed on campus by the Academic Programs and Curriculum Committee, Faculty Executive Committee, Faculty Senate, Provost, and the President to ensure compliance with institutional requirements.

Through regular assessment workshops and one-on-one consultations, the Director of Institutional Research and Effectiveness works with existing programs and their coordinators to review and refine their student learning outcomes for each annual assessment cycle.  Programs are advised to state between three and five learning outcomes, which can address the knowledge, values, and skills students are expected to acquire by completion of the program. They have also been instructed to develop "SMART" outcomes, that is, specific, measurable, attainable, results-focused, and time-focused. In addition, all programs have developed curriculum maps indicating where program student learning outcomes are introduced, reinforced, and mastered in each required course, and which courses provide opportunities for assessment. [10] Faculty share the program learning outcomes with their students via the undergraduate and graduate catalogs, program websites, and course syllabi. [11] An example of a recent syllabus is attached as supporting documentation. [12]

A compilation of the most recent three cycles of Academic Program Assessment Reports, documenting the student learning outcomes across RU’s degree programs, including online programs, is provided as supporting documentation in Table 3.3.1.1-1. Programs that undergo a program review have the option of not completing an annual assessment report during the review year.

TABLE 3.3.1.1-1

Assessment Reports by Location and Delivery Modality  

Program Location/Modality 2014-15 2015-16 2016-17
Art: Studio Art and Graphic Design (B.F.A.) Main Report Report Report
Business Administration  (B.S.) Main Report Report Report
Business Administration  (B.B.A.) Online Program started in Summer 2016 Program started in Summer 2016 Report
Associate of Science in Criminal Justice Off-site   Report Report
Bachelor of Criminal Justice (B.C.J.) Online Report Report Report
Bachelor of Healthcare Administration (B.H.A.) Online Report Report Report
Biology (B.S.) Main Report Report Report
Biology Education (B.S.) Main No majors Report Report
Communication and Media Studies (B.A.) Main Report Report Program Review
Early Childhood Education (B.S.) Main Report Report Report
English and Language Arts Education (B.S.) Main Report No majors Report
English – Creative Writing (B.A.) Main Report Report Report
English – Literature (B.A.) Main Report Report Report
History (B.A.) Main Report Report Report
Interdisciplinary Studies (B.A.) Main Report Report Report
MAT in Early Childhood Education (M.A.T.) Main & Off-site Report Report Report
Mathematics (B.S.) Main Report Report Report
Mathematics Education (B.S.) Main Report Report Report
Master of Business Administration (M.B.A.) Off-site Report Report Report
Middle Grades Education (B.S.) Main Report Report Report
Master of Fine Arts in Creative Writing (M.F.A.) Main & Online Program started in Summer 2016 Program started in Summer 2016 Report
Master of Public Administration (M.P.A.) Off-site Report Report Report
Music: Performance and Sacred Music (B.M.) Main Report Report Report
Master in Music (M.M.) Main No Report* Report Program Terminated
Music Education (B.M.E.) Main Report Report No Report*
Organizational Management & Leadership (B.S.) Off-site Report Report Report
Political Science (B.S) Main Report Report Report
Pre-Nursing (A.S.) Main Report Report Report
Psychology (B.S.) Main Report Report Report
Religion (B.A.) Main Report Report Report
Sociology (B.S.) Main Report 1, Report 2, Report 3 Report Report
Sport Studies (B.S.) Main Report Report Report
Theater Studies (B.A.) Main Report Report Report
World Languages & Cultures  (B.A.) Main Report Report Report

*In both “No Report” cases, the program was in the midst of a leadership transition.

Assessment Measures

Academic programs at Reinhardt University report their assessment results annually using a Word template.  Instructions for completing the report are posted on the OIRE Site in EagleWeb, Reinhardt’s Intranet. [13] The Academic Program Assessment Report template requires programs to list their student learning outcomes, specify direct and indirect methods of assessment, identify the location of the assessment results, report and analyze findings, and note any changes planned for the next academic year.  Over the past three cycles, the Academic Program Assessment Report template has been revised to address needed improvements, as illustrated by Table 3.3.1.1-2 below.

TABLE 3.3.1.1-2

Improvements in assessment reporting

 Before 2014-15   2014-15 Cycle   2015-16 and 2016-17 Cycles

Program Mission Statement X X X
Program Mission linked to RU Mission X X
Program SLOs X X X
Program SLOs linked to RU’s SLOs X X X
Methods of Assessment X X X
Benchmark or Success Criteria X X
Results from Assessment X X X
Changes based on Results X X X
Direct links between SLO, method of assessment, results, and changes X X
Program Highlights (program successes and areas of improvement) X X
Prompts for identification of direct and indirect methods X X
Data Repository Location X X
Indication of SLO Met, Not Met, Partially Met X X
Follow-up on action items recommended in the previous year X

Although assessment reports compiled prior to 2014-15 included information on learning outcomes, methods of assessment, results, and proposed changes, the link between learning outcomes, assessment results, and changes was not always clear. With a few exceptions (e.g., Price School of Education programs), the reports did not specify the benchmark or criterion of success for each learning outcome and used few direct measures. Oftentimes the direct measures were expressed as course grades, and the location of the assessment evidence was not identified.  Only a few programs engaged in the analysis of program strengths and weaknesses based on the evidence collected during the assessment cycle.

The current reporting template was designed to address all these shortcomings in order to strengthen the assessment process and provide more robust evidence for academic planning and decision-making.  In August of 2014, the Provost conducted an assessment workshop [14] with all program coordinators and provided training on using the new reporting template, placing special emphasis on how to refine learning outcomes and assessment methods and how to establish reasonable benchmarks (or criteria of success) for each assessment.  Program coordinators were also asked to use at least one direct method for assessing each student learning outcome. (A direct method is one in which faculty members evaluate a student’s work; an indirect method is one in which a student evaluates his or her own learning.)  Additional workshops offered in each subsequent year provided training in rubric design, curriculum maps, and closing the assessment loop. [15]

A review of the reports submitted over the past three years indicates that all programs have made considerable progress in refining their learning outcomes, selecting valid assessment methods that are appropriate for each outcome, and establishing a benchmark for each assessment.  The majority of programs make ample use of rubrics for assessing papers and projects, embedded questions, capstone projects, portfolios, and locally developed tests. As an example, the documentation includes the rubrics used by the MBA faculty for their program’s direct assessments. [16]

The discussion below provides an overview of best practices related to assessment methods as evidenced in the Academic Program Assessment Reports found in the supporting documentation. Although we provide only a few examples here, the 2014-15, 2015-16, and 2016-17 Academic Program Assessment Reports offer a more comprehensive view of the various assessment methods used by Reinhardt's programs.

Direct Methods.  Assessment methods at Reinhardt consist of a wide array of direct measures of student learning, including comprehensive examinations, nationally based proficiency tests, locally developed tests, student portfolios, senior theses, research papers, case study analyses, embedded questions, internship supervisor evaluations, observations, ethics certifications, and pre-post tests. Examples of direct methods in selected programs are listed in Table 3.3.1.1-3 and discussed briefly in the text following the table.

TABLE 3.3.1.1-3

Examples of Direct Methods of Assessment

Direct Method Program

 

Portfolio Evaluation, Final Exam Art
Research Paper, Presentation, ETS Major Field Test, Embedded Questions (Final Exam), Lab Report Biology
Presentation, Research Paper, Homework Assignments, Internally Developed Examinations Business Administration
Internally-developed Examination, Scoring of Essay, Presentation, Research Paper, Case Study Analysis, Ethics Certification, ETS Major Field Test Sociology
Series of critical reading and writing assignments, Research Project, Web Essays, Mid-term Exam Communications
Lesson Planning Rubric, Differentiated Instruction and Assessment Lesson Summary (DIALS) Observation Form, Rubric for Analysis of Impact on Student Learning, Professionalism Assessment Rubric Early Childhood Education
Scoring of Essay, Capstone Project (Senior Thesis), Research Paper, Embedded Questions (Tests) English
Senior Portfolio, Scoring of Essay, Internally-developed Examination – Embedded Questions History
Senior Portfolio, Scoring of Essay, Internally-developed Examination – Embedded Questions, Research Paper Interdisciplinary Studies
Presentation, Research Project,  Internally Developed Examination-Embedded Questions Mathematics
Comprehensive Exams, Final Exam, Research Paper, Case Study Analysis MPA
Capstone Project, Research Paper, Presentation, Research Project Pre-Nursing
Scoring of Essay, Presentation, Research Paper,  Research Project, Ethics Certification Political Science
Internally-developed Examination – Embedded Questions, Scoring of Essay, Capstone Senior Project, Presentation, Research Paper, ETS Major Field Test Psychology
Internally-developed Examination – Embedded Questions, Scoring of Essay, Presentation, Research Paper, Graduation Portfolio Religion
Internship Supervisor Survey, Research Paper, Service Learning Project Sports Studies

At Reinhardt, students in many programs participate in special-topics capstone courses, comprehensive examinations, and special projects that ensure proficiency in the essential content of their programs. These experiences have provided programs the opportunity to develop meaningful, comprehensive, and cumulative assessments of the skills, predispositions, and knowledge acquired in the majors. The implementation of the upcoming Quality Enhancement Plan, which will ensure that each Reinhardt student completes a capstone experience before graduating, will likely increase the number of capstone assessments.

Several programs use a capstone course or evaluate student work in a required advanced course (or set of courses) as a direct assessment measure. Faculty members teaching these courses collect student work (or samples of student work) and dedicate a designated time at least once annually to reviewing these works and evaluating them in relation to student learning outcomes.

Most programs have developed rubrics that faculty can use to evaluate student work. The MPA program faculty, for instance, score the Comprehensive Examination using a rubric linked to the program's learning outcomes. In the Psychology program, students present seminar-style lectures on classic and contemporary theoretical research papers, which are scored with a rubric measuring four domains: Organization, Content Knowledge, Comprehensibility, and Overall Quality.

Also in the MPA program, students must submit a Professional Portfolio, a collection of artifacts generated from course work, internships, work experience, and production materials, prepared to demonstrate achievement of the Program SLOs, the Program Objectives, and the purposes of the Professional Portfolio. MPA faculty score the Professional Portfolio using a rubric that specifies what is expected in each response relative to content, Program SLOs, and Program Objectives.

Similarly, the MBA program developed a Portfolio evaluation rubric that is aligned to domains reflecting each of the MBA student learning outcomes. Students submit six papers in their portfolio, each addressing one of the six learning objectives, and a narrative must be written to show how each paper fits a particular learning objective.  In the Studio Art program, students prepare a visual portfolio and install an exhibition of their work in the Fincher Fine Art Gallery.  The professional portfolio submitted in Art 492 Thesis Exhibition and Portfolio is the second part of a two-part capstone course.  Likewise, the Senior ePortfolio in Communication & Media Studies is a multifaceted and comprehensive assessment project that requires students to demonstrate the ability to use critical organizational skills, to edit and revise their best college writing, and to create explanatory narratives about how they are fulfilling requirements. Senior portfolios also constitute the main method of assessment for History, Interdisciplinary Studies, and Religion, while students in the Music program are assessed through a Senior Recital.

Evaluations of practica and field experiences are a central component of the assessment plans for several programs. In the Price School of Education, for instance, the Director of Field Experiences ensures through systematic monitoring and placement that MAT candidates have field experiences at multiple levels. Each stage in the program (Initial Admission to the MAT Program, Admission to Candidacy, Admission to the SMART Block, and Admission to Candidate Teaching) is monitored developmentally by both the candidate's faculty advisor and the Director of Field Experiences through observations.  In the Bachelor of Healthcare Administration program, the academic coordinator who supervises the academic aspect of the internship has initiated refinement of the assessment used by the intern’s on-site supervisor to evaluate more critically the intern’s achievement of the stated course and program learning outcomes. The internship evaluation is becoming a more significant component of the student's assessment. Similarly, the MBA program uses a rubric to evaluate the Practicum presentation.

Several undergraduate programs use national tests, which allow the program coordinator to compare the extent to which student learning outcomes are achieved with peer institutions. Examples of national benchmarks are the ETS Major Field Tests for Biology, Business, Psychology, and Sociology.

Lastly, some programs use pre- and post-tests as direct measures. In Biology, for instance, a ten-question, multiple-choice pre-test is given to all students enrolled in Genetics (BIO 320) during the first week of class. The same test is given to all BIO 320 students still enrolled in the course during the last week of class. The gain in the percentage of questions answered correctly between the pre- and post-test is calculated as a direct assessment. A similar pre- and post-test assessment was introduced in the MBA program.

Indirect Methods.   In addition to direct methods, academic programs at Reinhardt employ a series of indirect methods to assess student learning outcomes. For instance, individual Faculty Course Reflections are collected each term from instructors to help them fine-tune their teaching and improve student learning outcomes. Other examples of indirect methods include course evaluations, program exit surveys, exit interviews, and success rates in placing students in graduate programs (see Table 3.3.1.1-4).

TABLE 3.3.1.1-4

Examples of Indirect Methods of Assessment

Indirect  Method Program

 

Exit Senior Survey, Course Evaluations Biology
Course Evaluations, Course Reflections Business  Administration
Course Reflections English
Copy of resume, Records from the Director of Career and Professional Services Development Interdisciplinary Studies
Program Exit Survey Sports Studies
Number of graduates being admitted into graduate programs and/or completing the graduate degree Organizational Management  & Leadership
Exit Survey, Exit Interviews Communications (2013-14 Report)
Exit Interviews Religion (2013-14 Report)
Exit surveys MBA

Since 2010, the University has been using an online course evaluation system (SmartEvals) managed by GAP Technologies. While the standard questions in this indirect assessment focus on students' perceptions of the quality of teaching, instructors can add supplemental questions that relate to the course learning outcomes. [17]  The system generates a MyFocus report, which is automatically emailed to the instructor once the results have been compiled. The report shows areas where the instructor scores significantly lower than the average rating for the University. An example of a MyFocus report is included in the supporting documentation. [18] Feedback is reviewed carefully by instructors and used to improve teaching.  In addition, feedback from course evaluations provides a basis for the deans' performance reviews of full-time and adjunct faculty.

In the Course Reflection form, which must be completed at the end of each term, instructors are asked to choose a course they taught that term and reflect on the effectiveness of the course’s learning outcomes.  The program coordinator reviews the Course Reflection forms and uses the information to assess more accurately the extent to which the program’s student learning outcomes are being met.  An example of a completed course reflection form has been included in the supporting documentation. [19]   While student evaluations and faculty course reflections serve primarily as methods for course-level assessment, they also inform assessment at the program level, particularly in those courses that house key program assessments, such as capstone courses and seminars.  In addition, course reflections for courses that are part of the General Education Core are used as an indirect method in assessing the General Education curriculum.

Some programs conduct their own exit surveys, which ask graduates to rate the extent to which they achieved the program's learning outcomes.  In the Communication program, Exit Interviews are conducted with students by the entire faculty prior to graduation. These interviews provide qualitative feedback on satisfaction with the program, student engagement in the larger community through service, and participation in community projects or student organizations.

Program exit surveys are conducted regularly by OIRE for traditional programs, online programs, and programs offered at other sites. [20] Exit surveys for the traditional students at the main campus rotate on an alternating schedule with the NSSE survey [21] being conducted in even years and the Noel-Levitz survey [22] in odd years.

Examples of Improvements Based on Assessment Results

The academic year 2014-15 was the first in which the University used a revised assessment template that prompted program coordinators to link proposed changes more tightly to assessment results. Changes were identified at the conclusion of the 2014-15 academic year and implemented beginning in 2015-16.  At the same time, it should be noted that programs had been reporting changes based on earlier assessment efforts, and these improvements were documented in the 2012-13 and 2013-14 assessment reports, although the reporting template used for those assessment cycles did not strictly enforce a direct link between assessment results and recommended improvements. Examples of program improvements made in 2012-13 and 2013-14 are presented in Table 3.3.1.1-5, and a few are discussed in greater detail following the table.  Examples of planned or implemented changes from the 2014-15, 2015-16, and 2016-17 assessment reports follow this discussion.

TABLE 3.3.1.1-5

Examples of improvements based on 2012-13 and 2013-14 assessment results

Program Improvements based on assessment results
Biology
  • Removed most of the non-majors from the General Biology course (BIO 107) in order to increase the rigor of that course for Biology majors. This should help better prepare students to succeed in BIO 320.
Business Administration
  • In BUS 371, ethics will be assessed sooner than on the date of the second test so that the entire course can be presented in the light of ethical conduct.  The next time the course is offered, the ethics question will be administered on the first exam instead of the second.
  • Also, International Financial Reporting Standards (IFRS) will be introduced in the class and needs to be assessed sooner than on the date of the second test so that the entire course can be presented in the light of potential IFRS changes and so the instructor can emphasize the difference between GAAP and IFRS as the class covers GAAP throughout the entire term.  The next time the course is offered, the other IFRS question will be administered on the first exam instead of the second.

TABLE 3.3.1.1-5 (CONTINUED)

Examples of improvements based on 2012-13 and 2013-14 assessment results

Program Improvements based on assessment results
Communication
  • In order to address the learning outcome "Students will demonstrate respect for individual and cultural differences," the program added a new seminar course in 2013-14: COM 398 (Special Topics in Global/Intercultural Communication). This course explores global or international issues of contemporary interest to the study of communication or advanced issues in intercultural communication.  (from 2013-14 report)
  • In 2012-13, the program also incorporated portfolio preparation into the COM 340 course, a change that made a significant difference in the preparedness of the seniors as compared to last year. (from 2012-13 report)
Early Childhood Education After a review of PSOE candidate teaching dispositions assessment data for Fall 2014, EPP faculty will provide more explicit, in-depth instruction and scaffolded assistance throughout all program coursework in the following areas:

  • Understanding the GaPSC Code of Ethics and PSOE Policies for Professionalism; specifically, demonstrating appropriate professional dispositions and support of a learning community in the university classroom (as well as field experience and clinical practice)
  • The use of effective spoken and written communication for PSOE candidates
  • Identification, creation, and capstone presentation of artifacts in the candidate teaching electronic portfolio that demonstrate competent proficiency in professional responsibilities in support of differentiated instruction and assessment
English
  • In ENG 335-Multicultural American Literature, all students were skilled at finding appropriate secondary sources, but some students had difficulty integrating ideas from these sources into their own arguments. The instructor will provide students with model papers to demonstrate successful integration of material from secondary sources.
  • The new gateway course, ENG 240-Introduction to Literary Analysis, will continue to focus on enhancing students’ ability to read, comprehend, and interpret both primary and secondary texts. The instructor for this course will work in conjunction with the instructor for ENG 341-Literary Genres and Critical Approaches to enhance students’ skills in this area. (from 2013-14 Report)
Healthcare Administration
  • HCA 410 Capstone Course has been converted from an eight-week course to a full semester sixteen-week schedule.  This allows more time for students and the instructor to communicate about the progressive stages of the project and is expected to produce a higher-quality final report.
  • The BHA program academic coordinator who supervises the academic aspect of the internship has initiated refinement of the assessment used by the intern’s on-site supervisor to more critically evaluate the intern’s achievement of the stated course and program learning outcomes.  This will become a more significant component of the student's assessment.

 

TABLE 3.3.1.1-5 (CONTINUED)

Examples of improvements based on assessment results

Program Improvements based on assessment results
Interdisciplinary Studies
  • In IDS 308 – Baroque World, fewer than 70% of students obtained a score of at least 70% when their knowledge of western and non-western society and history was tested. Since this is the first test of the semester, faculty will organize and facilitate study sessions prior to future tests.
Mathematics
  • In MAT 102 and MAT 099, the instructors adopted a learning system that checks “mastery” of concepts rather than merely computational skills. Also, faculty recommended that more emphasis be placed on the development of axiomatic systems in the senior seminar. (2013-14 Report)

 

MBA
  • The MBA practicum rubric was revised last year to give a larger percentage weight to the content. Additional rubrics were developed for specific content for the practicum topics and tie together the concepts from both classes as indicated on BUS 695 (CN) instructor course evaluation: “Utilize the concepts of two courses for each practicum rather than one.”
  • Student portfolios were also tied to each of the program learning objectives.
Pre-Nursing
  • Changed the rubric used to evaluate the presentations made by students in Anatomy and Physiology.  The new rubric has helped students better prepare for their presentations.
Psychology
  • Lectures were revised in the Intro. Psychology course to ensure emphasis on the interrelationship of topics, especially methodology. This change was prompted by the finding that fewer than 50% of the students correctly answered test questions related to Integration of Knowledge, especially regarding the integration of methodological issues across content topics. (2013-14 Report)

The following section presents selected examples of improvements made by programs in more recent years:

In 2014-15, the MBA faculty tightly connected the grading rubric used to assess Practicum experiences to student learning outcomes. The faculty examined the grading rubric for one Practicum experience, BUS 635, to make revisions that will ensure more direct measures of student learning. All MBA Practicum experiences culminate with a student presentation.  A Practicum presentation is graded on 1) Organization (20%), 2) Topic Knowledge (20%), 3) Creativity (10%), 4) Visual Aids (20%), 5) Summary (10%), and 6) Stage Presence (20%).   The MBA faculty decided to change the grade weight for “Topic Knowledge” from 20% to 50%.  More importantly, each Practicum that is connected to an MBA course will be graded according to a “Topic Knowledge” rubric to more accurately assess whether a student meets the stated learning outcome.

The new online programs also used the results of their first assessment reports to develop plans of action and implement program improvements.  In 2013-14, the BCJ program coordinator noted that students did not meet the benchmark for an assessment housed in the Juvenile Justice & Delinquency course, as they did not have sufficient time to prepare an end-of-course research paper explaining delinquent behavior and potential correctional/prevention strategies through the lens of criminological theories. To encourage submission, faculty recommended that students submit building-block assignments prior to the end of the course (e.g., an annotated bibliography, an outline, and a rough draft of the paper). As a result, this course was expanded to an eight-week session in 2014-2015, which may allow more time for students to prepare the paper.  Likewise, based on the first cycle of assessment results, the Capstone Course in the BHA program has been converted from an eight-week course to a full-semester, sixteen-week schedule.  This allows more time for students and the instructor to communicate about the progressive stages of the project and is expected to produce a higher-quality final report.

In 2015-16, the Biology program recommended improving the quality of the General Biology and General Chemistry courses in order to meet the benchmark set for the SLO, which will most likely happen once students are provided with a quality General Chemistry experience. The need for an improved chemistry program was also noted in the Biology Senior Exit Surveys completed in April 2015 and 2016. The majors' General Biology course, now called BIO 120, was updated in Fall 2016. A new textbook was selected that provides a more rigorous focus on Genetics and Molecular Biology; this book is geared toward majors in the biological sciences rather than the non-majors audience targeted by the previous text. The program also changed the laboratory format, introducing updated, more quantitative labs supported by undergraduate teaching assistants in addition to the faculty member.  A new Chemistry adjunct professor was also hired for Spring 2017. The impact of these changes will be assessed when students take the Genetics course in Fall 2017 and take the major field test as seniors.

In the Communication program, faculty improved the "Cultural Roots" research project, a direct assessment, by 1) revamping the requirements to include both oral history and family tree building and dedicating additional class periods to preparing students for the assignment (for example, a lecture and film on oral history methods), 2) linking the assignment to the required use of two online platforms, Ancestry.com (for research) and Geni.com (for family tree building), and 3) tutoring students on technological tools and archival research.

In the Psychology program, faculty teaching the Intro. Psychology course noted that fewer than 50% of the students correctly answered test questions related to integration of knowledge, especially regarding the integration of methodological issues across content topics. In response, the instructor revised lectures to ensure emphasis on interrelationship of topics, especially methodology.

In their 2014-15 assessment report, faculty in the Bachelor of Healthcare Administration program recommended increasing the requirements for students to use medical terminology in course writing assignments. This was achieved in 2015-16 and, as a result, the percentage of students meeting or exceeding the benchmark for SLO 3 (Graduates will master key clinical and managerial terminology and demonstrate the ability to communicate effectively about healthcare administrative issues in discussions with healthcare administration professionals, faculty, and other students) improved. In addition, two changes were implemented to improve learning for SLO 5 (Students will demonstrate the ability to recognize and analyze issues and problems associated with changes in the healthcare administration field and to subsequently propose or enact constructive responses): HCA 410 was expanded to a full semester, and HCA 490 was augmented with a host/preceptor evaluation of the intern. As a result, benchmarks for this learning outcome were fully met.

Follow-Up on Improvements

To improve monitoring of the implementation of the actions recommended in the previous year, in 2016 the Office of Institutional Research & Effectiveness added a new section to the reporting template. Program coordinators were asked to follow up on the improvements suggested in the previous year, report on their implementation, and assess the impact of the changes on the outcome being measured.  This last step was included to ensure that the closing of the assessment loop is achieved in each program. The following provides examples from the 2016-17 assessment reports documenting recent improvements and their impact on the outcome being measured.

In the 2015-16 MPA program assessment, the program coordinator noted that MPA faculty should consider placing greater emphasis on current management challenges within the public sector as compared to the private sector.  MPA faculty might also contemplate assigning and using a different textbook with a greater emphasis on leadership styles and applications, instead of the specified foci on management trends and reforms of public organizations. In Fall 2016, the MPA faculty implemented this recommendation.  They placed greater emphasis on current management challenges within the public sector as compared to the private sector.  Moreover, MPA faculty used different textbooks, Northouse’s Leadership: Theory and Practice, Seventh Edition, along with Miner’s Organizational Behavior I.  These books and course sessions emphasized leadership styles and applications, instead of the specified foci on management trends and reforms of public organizations.  In addition, the Leadership course had a practitioner’s focus: seven well-known public, private, and nonprofit leaders spoke to students and led specific, topical discussions.  As a result of these improvements, faculty saw a positive impact on students’ retention and use of leadership theories with the new readings and the new guest-led discussions.  Specifically, 100% of MPA students answered at least 80% of the questions correctly on the final examination in MPA 605: Leadership and Organizational Behavior.

In the B.S. in Business Administration program, the program coordinator recommended in 2015-16 an increased global emphasis and a stronger focus on incorporating technology into the classroom as areas for improvement.  This recommendation was related to SLO 5 (Awareness of Global and Multicultural Issues - demonstrate awareness of, and analyze, global and multicultural issues as they relate to business).  During the 2016-17 academic year, an International Accounting course was approved by the Faculty Senate and added to the 2017-2018 schedule. It is expected that this course will increase students’ global business awareness.

In the Mathematics program, faculty recommended in the 2015-16 report that more emphasis and time be spent on theoretical problems in the 2016-2017 academic year and that the measurement tools needed to be statistically significant. In 2016-17, a pre- and post-test measurement tool was used in high-enrollment courses such as MAT 200, which was made possible by using an external LMS.  All indicators showed an increase in performance from the 2015-2016 academic year, and the Math Program will consider using this tool in the Calculus cycle.

Program Accreditation and External Program Review

Each Reinhardt degree program undergoes a periodic external evaluation, which consists of either earning accreditation or maintaining reaccreditation status for those programs for which there is an accrediting body, or a program review for those programs that do not have accrediting agencies. Specialized accreditation reviews are conducted for all Education and Music programs by the Georgia Professional Standards Commission (PSC) and the National Association of Schools of Music (NASM), respectively. [23] [24] [25] [26] All other degree programs are part of a 7-year cycle of program reviews. Academic program reviews are designed to assess program viability, quality, and productivity, assess alignment with the University’s mission, and facilitate program improvement. The evaluations are based on guidelines [27] established by the Provost’s Office and, similar to accreditation reviews, include an evaluation of program strengths and weaknesses as well as of objectives and learning outcomes. Program reviews are conducted on a continuous review cycle. [28] See examples of program review self-studies and external reviewer reports. [29] [30] [31] [32]

Program accreditation reports and program review results are used to inform and improve academic programs. Evidence of improvement based on program accreditation lies in continuous accreditation status. Evidence of improvement based on program review resides in actions a program must identify and take related to the reviewer’s recommendations. For instance, as a result of the external review conducted in Fall 2016, the Communication and Media Studies faculty implemented the following curricular improvements [33]:

  1. Added a Research Methods course
  2. Discontinued concentrations in Global Communications and Media Writing, renamed the Public Relations concentration to “Strategic Communication,” and renamed the Digital Film and Video concentration to “Digital Storytelling”
  3. Streamlined curricular offerings

Conclusion

Reinhardt University has developed a sustainable assessment process at the academic program level, which includes online programs and programs offered at off-site locations.  Course-level assessment is also robust, as instructors examine their teaching effectiveness each term through Student Evaluations and Course Reflections.  Results from both student evaluations and faculty course reflections often feed into program-level assessments as indirect methods, particularly in those courses that house major program assessments, such as capstone courses and seminars.

The recent improvements to the Academic Program Assessment Report template, including a prompt to follow up on the status of recommended improvements, along with ongoing assessment workshops offered to program coordinators and faculty, are expected to close the loop in the assessment process and yield higher-quality assessment evidence and improvements in student learning.

 

Supporting Documents

[1] RU Guidelines for Academic Program Assessments

[2] Reinhardt University Rubric for Evaluating Academic Program Assessment Reports

[3] OIRE Director Feedback to Interdisciplinary Studies

[4] OIRE Director Feedback to Studio Art

[5] OIRE Director Feedback to WLC

[6] 2015-16 School of Arts & Humanities Annual Report

[7] 2016-17 Undergraduate Catalog SLOs Example

[8] 2016-17 Graduate Catalog SLOs Example

[9] MFA Program Proposal

[10] Program Curriculum Maps 2015-16

[11] RU Syllabus Guidelines

[12] Sample Syllabus – SP 17 POL 380 010

[13] OIRE Academic Program Assessment website 

[14] Reinhardt Faculty Assessment Workshop (August 2014)

[15] May 10 2016 Assessment Workshop

[16] MBA Academic Program Assessment Rubrics

[17] Sample of Completed Course Evaluation

[18] Example of MyFocus Report

[19] Sample of Completed Course Reflection (Communication)

[20] CMS Student Survey Results

[21] RU NSSE 2016 Report of Findings

[22] 2016-17 Noel Levitz SSI

[23] NASM Commission Action Report 2012

[24] Self-Study Submitted to NASM

[25] PSOE Institutional Report for Fall 2010 Initial Unit and Program Review to PSC October 2010

[26] PSC approval Letter 2010

[27] Academic Program Review Guidelines

[28] Academic Program Review and Specialized Accreditation Schedule

[29] CMS Program Review Self-Study

[30] CMS Program Review External Reviewer 1 Report

[31] OML Program Review Self-Study

[32] OML Program Review External Reviewer Report

[33] CMS Program Changes Spring 2017

[34] RU Academic Affairs Annual Report Executive Summary