Louisiana State University and A&M College

3.3.1.3 (Continued)

Narrative (Continued)

Student Advocacy & Accountability (SAA)

Services offered through the department are evaluated in several ways, including the following:

  • The Annual Assessment Plan [73] [74];
  • Use of the National Assessment of Student Conduct Adjudication Processes (NASCAP) assessment program to gauge departmental outcomes against the departmental mission statement; and
  • Tracking longitudinal data by creating regular reports through the Symplicity Advocate Conduct database.

SAA has utilized assessment for continuous improvement of its programs and services and implemented changes based on assessment results. Below are examples of evidence of improvement based on analysis of assessment results.

NASCAP Assessment

  • Student Sample: The student sample included all students who participated in the adjudication process at LSU in the academic years 2009-2010 and 2011-2012. (09-10: Sample = 828, Respondents = 149; 11-12: Sample = 377, Respondents = 55)
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • Data from the 2009-2010 NASCAP survey and comparison data across participating institutions reflected an increase in the number of academic cases referred to/adjudicated by SAA. 
    • As a result, the SAA website was enhanced to include a section geared toward faculty and academic integrity, and SAA increased educational outreach initiatives to academic departments and colleges.
    • Data from the 2011-2012 NASCAP survey and comparison data across participating institutions were reviewed and supported the assertion that caseloads continue to increase in both the number of referrals and the complexity of cases.
    • As a result, a half-time coordinator position was added within the accountability area to focus on student organization conduct and to assist with the adjudication of individual conduct matters during peak times [75] [76].

Residential Life

Services offered through the department are evaluated in several ways, including:

  • The Annual Assessment Plan [77] [78];
  • EBI survey;
  • internal program review;
  • performance indicators;
  • focus groups; and
  • Connections program (RA on Resident-Intentional Conversations).

Residential Life has utilized assessment for continuous improvement of its programs and services and implemented changes based on assessment results. Below are examples of evidence of improvement based on analysis of assessment results.

Residential College Assessment

  • Student Sample: The student sample included all students who lived in a residential college hall.
  • Key Finding(s) and/or Change(s) Resulting From Assessment:
    • In the past five years the percentage of residential college students has grown from 42% to 57%. The growth of these residential colleges is in support of the university’s goal to increase retention from the first to second year. 
    • The data indicate that, while the overall retention rate for LSU students from first to second year in 2011-2012 was 83%, the rate for those in residential colleges was 85.7%.  
    • As a result, in the past five years, additional residential colleges have been added, including  Mass Communication, Agriculture, Science, Business, Engineering, and IT [79] [80] [81].

RA Connections Program Assessment

  • Student Sample: The student sample included all students who lived in the residential halls during implementation of the Connections program.
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • Based on an examination of factors 1, 2, 11, and 12 from the 2011-12 EBI survey, Residential Life implemented the Connections program. The Connections initiative was developed to help resident assistants (RAs) and residents focus on staff satisfaction, programming, community building, and personal interactions. As part of the initiative, each RA meets with his or her residents within the first six weeks of school.
    • After the first year of implementation, the department's scores improved on two of the four factors. To ensure improvement on the remaining factors, the Connections program was revised and more quantitative questions were added. In addition, instead of holding the Connections meeting with residents only once, the RAs added a second meeting.
    • After the second year of the Connections program, the department's scores improved on all four factors. In addition, results were shared with each RA so that he or she could tailor programming and community building to his or her specific area of responsibility [82].

Contract Renewal Assessment

  • Student Sample: The student sample included all students who applied to renew their housing contract with LSU Residential Life.
  • Key Finding(s) and/or Change(s) Resulting From Assessment:
    • Residential Life made changes to the contract renewal process as a result of assessment. Previously, students wishing to renew their housing contract could do so without any additional payment. As a result, many students would renew with the on-campus apartments while shopping off campus—thus taking available space from those truly interested in living on campus. 
    • In 2009, the department implemented a new policy requiring each student renewing a contract to pay a nonrefundable advance rent payment of $250. The goal was to reduce the number of cancellations and free up space for those interested in living on campus.
    • In the annual resident assessment, satisfaction with the room assignment process rose from 4.8 to 5.2 on a 7.0 scale.
    • The department has continued to track cancellations as well as student satisfaction. As a result, a proposal to require all students to pay nonrefundable advance rent payments in order to secure their housing assignments has been submitted. The goal is to reduce the number of last-minute cancellations and thus reduce waitlist/standby anxiety for new students [82].

Undergraduate Admissions & Student Aid (now Enrollment Management)

Services offered through the department are evaluated in several ways, including the following:

  • The Annual Assessment Plan [83] [84] and
  • Annual process tracking.

Enrollment Management has utilized assessment for continuous improvement of its programs and services and implemented changes based on assessment results. Below are examples of evidence of improvement based on analysis of assessment results.

Admission Counselor Training Assessment

  • Sample: The sample included professional staff in the role of admission counselors who participated in the 2012 training.
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • In 2012, the recruitment team attended one week of training during the summer, with a focus on Round Table. As a result, the team was well educated on the university but lacked specific presentation training, a unified approach toward increasing applications to the university, and clear expectations concerning territory management. Likewise, no professional development was offered as part of admission counselor training except for Dry Run, a three-day workshop for new college admission professionals that provides an introduction to the essential elements of the admissions process.
    • The recruiting team unanimously highlighted the lack of professional development, a finding supported by the absence of training documentation for the previous year; no training document existed outside of the Round Table schedule.
    • Enrollment Management increased admission counselor training from one week to two, with a specific focus on professional development (DiSC training), presentation training, and focused instruction on both macro and micro territory management.
    • Despite the headwinds of a record-setting enrollment year and 50% turnover of the recruitment staff over the summer, the current team is, to date, ahead of last year in applications; in-state applications in particular are up 5.18% (402 more than at this time last year).
    • In addition, a tiered leadership system was created that gave experienced counselors an opportunity to grow professionally while providing more one-on-one training to new counselors [85].

University Recreation

Services offered through the department are evaluated in several ways, including the following:

  • The Annual Assessment Plan [86] [87];
  • Assessment of student learning outcomes, service, and satisfaction within program areas; and
  • Evaluation of positive end-of-year results based on pro forma expectations.

UREC has utilized assessment for continuous improvement of programs and services and implemented changes based on assessment results. Below are examples of evidence of improvement based on analysis of assessment results.

GroupX Classes Assessment

  • Student Sample: The student sample included all students who participated in GroupX classes. (Spring 2012: Sample = 1600, Respondents = 282; Fall 2012: Sample = 1466, Respondents = 274; Spring 2013: Sample = 1476, Respondents = 300)
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • UREC made changes to the GroupX classes following program assessment measuring the overall effectiveness of GroupX classes and instructor performance.  Instructor skills assessed included overall knowledge, easy-to-follow instruction, modifications offered, and creativity.  Future programming needs were also assessed in this survey.  
    • Data showed that 59% of participants rated their overall experience as excellent; 61% of participants preferred late-evening class times, from 6:00 to 8:00 p.m.; and assessment of instructor performance showed that 75% of participants rated instructors’ knowledge as excellent and 71% rated overall instruction as excellent.
    • To improve instructor performance, UREC added various style-specific trainings in May 2012. Based on participant suggestions, more class styles were provided, including aqua, strength training, and Pilates classes.
    • Additionally, with data showing that 43% of participants desired greater variety in classes, UREC added numerous class styles, such as Barre Tone, Tiger Pump, boxing, Hip Hop, Pilates, strength training, and advanced classes.
    • When asked about specialty course offerings, respondents’ top answers included CrossFit classes, Pilates reformer, and outdoor courses. UREC has added an outdoor class, “Insane Outdoors,” and a “PiYo” class offered two days a week [88] [89] [90].

Facility Reservations Assessment

  • Sample: The sample included entities that rented the UREC facility in the 2012-2013 academic year. (Spring 2012: Sample = 52, Respondents = 15)
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • In an effort to better serve facility rental patrons, UREC created a Facility Reservation Survey to target those areas in which customer service could be improved. This is a continuous survey that began at the end of FY 2011-2012, with all rentals for that time period.
    • UREC is now in the second cycle of this survey and has already started implementing changes as needed, including making sure that every event has a lead point of contact to improve the flow of information and holding multiple face-to-face meetings to assist the renter during the planning period.
    • Data showed that 15% of renters stated that better facilities, and 23% that better communication, would make their events better if they were held again at UREC. As a result, UREC transitioned the sole handling of facility rentals to the assistant director of Facility Operations, resulting in a dramatic decrease in miscommunication, due mainly to having one dedicated staff member assigned to these issues. The change has also allowed other staff members to continue handling their normal responsibilities.
    • Looking forward, UREC is also interested in gaining knowledge on how to best reach and communicate with a technologically involved generation of students. Some possible alternatives to achieve this reach include allowing students to fill out an online form/Google document or even allowing them to input their reservations directly into scheduling software [91].

Membership Policy Assessment

  • Sample: The sample included 45 peer institutions from the SEC, Big 10, Big 12, and other conferences.
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • In an effort to better serve external customers, UREC assessed its membership policy, specifically with regard to freezing memberships, through a benchmark study of peer institutions. Prior to 2012, UREC did not have a formal policy in place for freezing memberships.
    • The data revealed that 53% of schools freeze memberships and 63% of SEC schools freeze memberships; 66% of schools require some sort of documentation in order to freeze membership.
    • After reviewing the data, UREC implemented the membership freezing policy in fall 2012, allowing a non-student member to temporarily suspend or “freeze” his/her membership due to medical leave, military leave, or sabbatical leave, provided that the member produces documentation of the leave by means of a physician’s note, a letter from a commanding officer, or a letter on LSU letterhead from the member’s department. No administrative fee is charged for an “involuntary leave.”
    • In addition, an annual member traveling abroad may elect to freeze his/her membership, provided that the member provides documentation of the travel abroad. Memberships may be maintained as inactive for a $5 monthly fee (for “voluntary leave”), for a minimum of one month and a maximum of four months. Members traveling domestically may not elect to freeze their memberships but are encouraged to take part in the NIRSA passport program.
    • Since its implementation, UREC has assisted three members in freezing their memberships while on sabbatical [92] [93].
  • For a more complete description of the departments within Student Life & Enrollment and their relation to the mission of the university and students served, see the report for Principle 2.10.

Assessment within Other Support Units and Services

Though Student Life & Enrollment is a primary mechanism, units outside this umbrella share the mission of supporting student success in fulfilling the academic mission. These include Communication Across the Curriculum; Equity, Diversity & Community Outreach (Multicultural Affairs, Women’s Center, African American Cultural Center); the Honors College; Student Support Services (within University College); and Health Promotions (within Finance & Administrative Services).

LSU also provides several summer bridge and transition programs to facilitate successful transition into the college environment, preparing students for the rigor of academic life and educating them on available campus resources. Programs include Summer Scholars (University College); Bios, a biology boot camp; Encounter Engineering Boot Camp; Tiger Prep Math Camp; and S.T.R.I.P.E.S., a transition camp.

Communication Across the Curriculum (CxC)

Services offered through the department are evaluated in several ways, including the following:

  • Performance evaluations for classified and non-classified personnel, completed annually;
  • Student assessment surveys offered at the end of communication-intensive classes;
  • Faculty assessment surveys of the Summer Institute and Lunch and Learn sessions; and
  • Student assessment surveys of workshops.

A main component of CxC pedagogy is the feedback loop that encourages students to reflect on and revise work in progress based on comments from faculty and peers.  Thus, it is second nature for CxC staff and faculty to study the results of assessment, identify ways to improve the program and its services, and implement changes. Below are three examples of how assessment data have been used to improve selected program initiatives. 

Faculty Development Assessment and Actions

  • Sample: The sample included faculty who participated in the CxC Summer Institute.
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • Data collection: pre-Institute surveys of faculty interests; formative evaluations throughout the Institute asking participants how they might apply what they have learned and what questions remain; and summative surveys in which participants are asked to evaluate different features of the Summer Institute.
    • Assessment of Summer Institute:  All participants agreed that the Summer Institute was an important experience in helping them rethink ways to improve their teaching.
    • Changes Implemented:  Each year the agenda for the Institute is adapted to meet the needs of participants based on who is attending and on what participants said about the previous summer’s institute.  In particular, the topics for the break-out sessions are re-evaluated [94].

Communication Intensive Course Offerings Assessment and Actions

  • Sample: The sample included students and faculty participating in C-I courses.
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • Data Collection: Multiple assessment measures are in place to evaluate the impact of C-I courses on students’ communication skills, including formative and summative assessment of course communication projects, student and faculty surveys gauging the impact of C-I courses, and departmental assessments of student learning and of students’ communication skills. 
    • Assessment findings:  Since fall 2006, CxC has used written surveys to collect opinions of students enrolled in C-I courses.  Data (through 2010) indicate that 56% (n=4,601) consider that their C-I courses improved their communication skills, and 66% (n=4,596) believe they will continue to use the communication skills they have acquired.  Additionally, CxC has collected opinions of faculty who teach C-I courses.   Responses have remained consistent over time.  For example, the 2009 survey (response rate, 72%) revealed that, with regard to each mode, more than 90% (n=80) of C-I faculty believe their students’ communication skills improved noticeably by the end of the C-I course.  In addition, 86% (n=80) agree that their students gained a greater understanding of course content because of C-I activities.  The survey also showed that 93% (n=77) indicated that course content was not compromised to meet C-I requirements.
    • Individual departments also report differences between student performance in C-I and non-CI courses.  For example, in a recent outcomes assessment of students in six different 4000-level courses, the Communication Studies Department faculty randomly selected 28 student papers and rated them on a three-point scale (0=inadequate, 1=adequate, 2=superior) to gauge how well students mastered five specific learning outcomes.  The mean (M) for students in C-I courses was higher than that for non-C-I courses on every measure, including “demonstrates an understanding of concepts” (C-I M=1.77; non C-I M=1.32) and “demonstrates critical thinking and argumentation skills” (C-I M = 1.72; non C-I M= 1.26).  From the chemistry department comes another example. Typically three sections of Chemistry 1202 are offered each fall; one section has been periodically taught using C-I methodologies.  All students take an identical comprehensive final exam; those from the C-I section have consistently scored 10% higher.  The C-I section also has had a lower drop rate (10%) than the other sections (average, 35%). 
    • Changes implemented as a result of assessment: Because data indicate that students improve their communication skills in C-I courses, CxC staff makes a concerted effort to recruit faculty members to offer certified classes, a challenge made more difficult recently as budget shortfalls have forced increases in class size [95].  (C-I courses generally maintain a student-to-faculty ratio of 35:1 to allow for one-on-one feedback.)

Cox Center for Student Athletes

Services offered through the department are evaluated in several ways, including the following:

  • Surveys; and
  • Data tracking.

Below are examples of evidence of improvement based on analysis of assessment results.

Tutor Evaluation of Tutoring Processes, Procedures, and Training

  • Sample: Students and staff who serve as tutors for the Cox Center for Student Athletes, and student athletes using the tutorial services. (Completed surveys: 2010: 95; 2011: 84; 2012: 55; 2013: 82)
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • The Tutor Evaluation Assessment is given to tutors at the end of each semester to gauge the efficiency of the CCACSA Tutorial Program and tutor satisfaction.  Several changes were implemented based on answers to the fall 2012 survey’s open-ended feedback questions.
    • Consistent trends in the answers to the open-ended questions were complaints about too few training sessions, training that focused more on day-to-day procedural duties than on tutoring strategies, and a lack of access to information after training.
    • In response to the feedback, the following changes were made. In fall 2012, ACSA held compliance trainings three times a semester. The information is repetitive but critical to adhering to NCAA and SEC protocol with student athletes. After the survey, the unit changed the frequency to two required trainings and instituted a compliance checkpoint quiz midway through the semester as a post-test, to ensure that the integrity of the compliance training was maintained with fewer meetings.
    • ACSA reduced the number of CRLA training meetings (the number of meeting hours will remain the same). The unit will also utilize other multimedia training methods to make the training more interactive and more useful.
    • ACSA created a tutor shared drive and a tutor manual. The tutors have constant access to most of the information they need. The unit is in the process of creating “ACSA Tutorials,” which will outline some of the procedures and processes. As these tutorials are created, they are placed on the tutor shared drive where tutors can access them as needed [96] [97].

Summer Academic Success Program (SASP) Assessment

  • Sample: Students participating in the SASP program. (Completed surveys: 36)
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • The Summer Academic Success Program assists new student athletes in making the transition to Louisiana State University by supporting their intellectual, social, cultural, and emotional development.
    • SASP is an eight-week intensive student learning and development program designed to create a culture that reinforces for new student athletes a priority on academics by equipping them with the skills necessary to graduate from college. In addition, the program helps new student athletes understand the values, traditions, norms, and expectations of an educational experience at LSU. This is accomplished by providing them with the academic and life skills, as well as the resources, needed to make their experience within the collegiate environment a successful one, giving each new student athlete more self-confidence and fostering responsibility and initiative for his or her own growth and development.
    • At the conclusion of the 2012 summer program, the 36 SASP participants were surveyed and asked to self-assess and evaluate their own development regarding their summer learning and development experience. 
    • For the item “My study habits have improved because of SASP,” 54% of respondents agreed and an additional 45% strongly agreed. Nearly all participants responding to this item believed that they had taken a step in a positive direction as a result of participating in SASP.
    • SASP students are enrolled in two courses during the summer (ENGL 1001 and EDCI 1001). Students are required to meet with a SASP Strategy Tutor Monday through Friday, 12:00 to 2:00 p.m. Students are provided with academic support and assisted with the transition into the college environment. For summer 2013, additional information will be gathered that includes both the student’s and the tutor’s assessment of the student learning experience [98].

Equity, Diversity & Community Outreach

Services offered through the department are evaluated in several ways, including the following:

  • Performance evaluations for individual professional staff;
  • Annual program reviews during the unit summer retreat; and
  • Students’ qualitative and quantitative assessments following programs.

Examples of assessment and evidence are included in the 2006 Diversity Self-Assessment and in annual reports [99] [100] [101].

University College

Services offered through the department are evaluated in several ways, including the following:

  • Strategic Plan reviewed each year to ensure that counseling and programs, as well as other initiatives, are supporting the overall objectives and purpose of LSU;
  • Electronic student evaluation survey after each appointment with a counselor;
  • Performance evaluations for professional, classified, and non-classified personnel completed annually;
  • Implementation of a comprehensive assessment plan, which includes both quantitative and qualitative assessments of completed programs and obtains program data from the Office of Budget and Planning. End-of-the-Year Programming Reports include such metrics as the number of programs and services; the number of hours spent advising students, both in groups and individually; and budget reports.
  • Student evaluations and surveys at the end of workshops and presentations.

University College has utilized assessment for continuous improvement of programs and services, and implemented changes based on assessment results. Below are examples of evidence of improvement based on analysis of assessment results.

Counselor Evaluations

  • Sample: Students who attend one-on-one academic advising appointments in UCFY. (Sample = 8170; Respondents = 1559)
  • Key Finding(s) and/or Change(s) Resulting From Assessment:
    • For the academic year 2012-2013, 92% “strongly agreed” or “agreed” that “UCFY’s program assisted me in a professional and friendly manner;” 91% “strongly agreed” or “agreed” that “the counselor was able to assist me in resolving my situation or referred me to someone who may be able to assist me;” 90% “strongly agreed” or “agreed” that “the counselor was knowledgeable of all services offered by the center and of LSU’s policies;” 89% “strongly agreed” or “agreed” that “I would be inclined to visit the center for assistance in the future;” and 88% rated the center and the services provided as “excellent” or “good.”
    • For the academic year 2012-2013 (fall and spring semesters only), University College’s Center for Advising and Counseling (UCAC) advised 4,073 students in face-to-face contacts. UCAC student evaluations received 780 responses: 95% “strongly agreed” or “agreed” that “UCAC’s program assisted me in a professional and friendly manner;” 93% “strongly agreed” or “agreed” that “the counselor was able to assist me in resolving my situation or referred me to someone who may be able to assist me;” 94% “strongly agreed” or “agreed” that “the counselor was knowledgeable of all services offered by the center and of LSU’s policies;” 93% “strongly agreed” or “agreed” that “I would be inclined to visit the center for assistance in the future;” and 88% rated the center and the services provided as “excellent” or “good” [102].

Summer Scholars Program Assessment

  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • Quantitative and qualitative assessment of the program is performed. The quantitative assessment is based on retention, grade point average, and graduation rates. The qualitative assessment is derived from students’ perception of the value of the program, faculty and employers’ perceptions of the students and the program, and the number of former Summer Scholars who have become leaders at LSU. The assessment data show a 98% first-to-second-year retention rate, an LSU cumulative GPA of 2.84, a 55% graduation rate within four years, and a 69% graduation rate within six years. Based on the assessment reporting by LSU’s Office of Budget and Planning, University College’s Summer Scholars students have significantly higher first-to-second-year retention rates, cumulative GPAs, and four- and six-year graduation rates than other LSU African American students and all other LSU first-year students.
    • Summer Scholars cohorts excel academically, as shown by the significant differences in “End of First-Year LSU Cumulative GPA,” “Second-Year Retention,” and “Third-Year Retention” [103].

Student Health Center

Services offered through the department are evaluated in several ways, including the following:

  • Maintenance of a strategic and quality improvement plan, for which goals are assessed and updated annually;
  • Ongoing, comprehensive satisfaction assessment tool completed via Campus Labs [104] [105];
  • External and internal audits of care protocols and student education, utilizing the electronic clinic management system, the Sunbelt Survey, and the Student Health Services Listserv;
  • Benchmark studies with health centers at peer institutions are conducted on a regular basis;
  • Performance evaluations for classified and non-classified personnel completed annually;
  • Student assessment at end of workshops and presentations; and
  • Student feedback solicited during the online educational experience via My Student Body.

My Student Body (MSB) Automated Tracking Assessment

  • Key Finding(s) and/or Change(s) Resulting From Assessment:
    • The study was done to track the efficiency of the new automated implementation process compared with the previous manual MSB process. The new process relieved health promotion staff of the time-intensive labor of manually lifting registration holds. MSB is an educational, evidence-based online program that is available to students 24/7; it is an effective tool in the department’s comprehensive strategy to promote healthy behavioral change and risk management.
    • Compliance reports were obtained from University Info Systems as well as from the My Student Body Admin Portal online. Before the automated process, large groups of students were registering themselves incorrectly; this led to non-validation of their accounts, difficulty tracking/verifying students, and staff having to manually lift registration holds on the back end. Implementation issues reported by students, administrators, and staff were documented and rectified.
    • Empirical evidence of reduced time and labor was gathered from staff [106].

International Student Programs

Services offered through the department are evaluated in several ways, including the following:

  • Surveys and interviews; and
  • Data tracking.      

Below are examples of evidence of improvement based on analysis of assessment results.

International Orientation Survey (written and anonymous)

  • Sample: The sample included students participating in International Orientation.
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • International students are brought to the campus for their first semester at LSU as a group. The program educates the students on the basics of the rules governing their U.S. immigration status and documents, prepares them to matriculate into the university through registration and enrollment steps, and provides information about student groups, support services, and community resources that are available to them.
    • At each orientation, the students are asked to give open comments and to score each session of the program. The information is evaluated by IS staff and provided to the other units that are involved in the program and present sessions. This information drives plans and changes for the following orientation program.
    • Data showed that students complained that the program was too long and included information that did not pertain to them.
    • Information that did not pertain to the majority of the group was turned into handouts and take-away materials for students. The overall program was shortened, and the day allowed for a longer lunch period and more breaks. Since these changes, practically no complaints about the program’s content and/or length have been received [107].

Curricular Practical Training (CPT) Workshops Assessment

  • Sample: The sample included students participating in CPT Workshops.
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • IS holds an information session each semester providing information on Curricular Practical Training (CPT). CPT is work authorization granted to F-1 status holders who meet U.S. immigration and university regulations, allowing them to participate in an internship in direct relation to their academic program. These internships are usually off campus and can be full- or part-time employment.
    • Information collected from students who attended Optional Practical Training (OPT) sessions, by the office staff, and through advisor appointments indicated that a CPT informational group session was desired (much like the OPT sessions that IS provides). In 2012, CPT sessions began to be offered each semester and continue to be offered. These are the first CPT sessions provided by IS since 2004; the lapse resulted from staff changes, resource limitations, and evaluation of student need. Attendance at CPT sessions in the early 2000s was extremely low.
    • CPT session attendance has been good, with at least 10 students participating, and is growing. There has been an increase in student requests for CPT, reflecting interest, a fuller understanding of this option and opportunity, and a better economy [108].

Online Student Services

LSU offers instruction via multiple modalities. Online instruction involves (1) students in and around the Baton Rouge area taking some or all courses online and (2) students outside of the geographical area who are completely enrolled and access all services online. Students have access to a wide range of services regardless of the modality in which the course is delivered, including orientation materials and online tutorials regarding learning styles. To evaluate these services, a needs and satisfaction survey was implemented to allow students in the LSU Online program to provide feedback to the university. Below is a select summary of those results.

Online Student Experience Survey

  • Sample: The sample included all students enrolled in the LSU Online program. (Sample = 45; Respondents = 26)
  • Key Finding(s) and/or Change(s) Resulting from Assessment:
    • Students from the four programs included in the LSU Online degree programs completed an online survey regarding their current experiences with LSU and future needs related to services for student support.
    • Overall, students in the LSU Online program have had positive experiences with the university and its services: 85% of students agree, either strongly or somewhat, that “LSU is welcoming to online students.” Similarly, 81% agree that “the tuition I pay is worth the educational experience I am having at LSU.”
    • In regard to student issues, 85% of students agree that “I am able to resolve any problems I experience at LSU in a timely manner,” and 78% agree that “when I have questions, it is easy to get answers or the information that I need from LSU staff members.”  Approximately 67% of students agree that “there are appropriate channels at LSU for expressing student complaints and concerns.”
    • Students also communicated their satisfaction through this survey: 87% identified that they are very or somewhat satisfied with the “quality of academic courses in your major,” and 77% were satisfied with the “availability of faculty.”
    • On other topics, 63% of students said that LSU has met or exceeded their expectations, and 81% identified that they would recommend the LSU Online program to others.
    • Of the student services offered through Student Life & Enrollment, students identified the following services as not important or not needed for online students: Campus Life, Greek Life, Parent and Family Programs, Residential Life, and University Recreation. Students identified the following services as slightly, moderately, or very important to online students: Career Services, Center for Academic Success, Disability Services, Enrollment Management, Orientation, and the Student Financial Management Center [109].

NOTE: A full list of attachments is available in the first half of the narrative.

Author: Stephenie Franks
Last modified: 7/1/2015 8:33 AM (EDT)