Compliance Status
Louisiana State University and A&M College is in compliance with this principle.
Narrative
As the flagship institution in the state, Louisiana State University and A&M College (LSU) offers a comprehensive array of learning resources and services. The Division of Student Life & Enrollment provides academic and student support services, programs, and activities for LSU students. The division’s mission is to recruit, admit, engage, retain, and graduate a diverse student population for success at LSU and beyond. We enhance learning by fostering critical thinking and ethical responsibility to create a university experience that transforms lives. We support each student both inside and outside the classroom in becoming engaged on campus academically and personally. Our programs, services, and quality-of-life facilities are designed to maximize potential and help students succeed. Whether through campus involvement, academic support, service, or accountability opportunities, our collaborative teams help students connect with peers, faculty, staff, and alumni for a more meaningful LSU experience.
The LSU Division of Student Life & Enrollment supports the vision for students "to achieve the highest levels of intellectual and personal development" through its programs and services and measures its success in this regard. Academic and student support services, primarily housed within the Division of Student Life & Enrollment, have identified outcomes and developed assessment plans for each outcome, utilizing various assessment methods.
While each department within the division has a unique mission and may serve specific students, all departments support the missions of both the Division of Student Life & Enrollment and the university. Additionally, each department is working toward achieving departmental and divisional strategic goals. The division underwent a program review in 2011.
This narrative also addresses outcomes assessment in other departments that provide academic and support services.
Expected Outcomes of Academic and Student Support Services Defined
Expected outcomes are identified within the Division of Student Life & Enrollment in two different areas, focused both on student learning and on operational outcomes. Student learning outcomes for the division are encompassed in the Student Success Outcomes, and operational outcomes are connected to targets within the Strategic Plan Goals.
Student Success Outcomes (SSOs)
In summer 2010, the Division of Student Life & Enrollment embarked on creating student learning outcomes focused on our primary strategic goal: student success. Following an intensive retreat and a review of departments focused on student learning, the Student Success Outcomes (SSOs) were created. The SSOs were adapted from the Council for the Advancement of Standards in Higher Education (CAS) standards and the NASPA/ACPA publications Learning Reconsidered and Learning Reconsidered 2. The SSOs focus on student growth and learning in cognitive complexity, knowledge acquisition, intra/interpersonal competence, practical competence, persistence and academic achievement, and citizenship and social responsibility. The division makes a concentrated effort to provide educational opportunities that are in line with these outcomes. The SSOs are identified as learning outcomes and measurements within departmental assessment plans and select departmental strategic plans [1].
Each year Student Success Outcomes are mapped to ensure the Division of Student Life & Enrollment is making strides to advance student success in the six areas identified [2] [3].
Strategic Plan Goals
The Division of Student Life & Enrollment uses a dynamic process of planning, implementing, and assessing. With student success as the focus, the division’s mission and vision are supported by our values and goals. The goals represent long-term actions to advance the division in support of Flagship 2020. Each goal is supported by a collection of strategies that independently establish short-term actions designed to advance the goal. For each goal, a series of performance indicators provide critical measures for assessing the progress of the goal and ultimately the overall plan. The Division of Student Life & Enrollment’s Strategic Plan Goals include the following:
Student Success: Promote engagement, retention, graduation, and transition to a career.
Communication: Partner across the division and LSU community to develop effective strategies to inform, educate, and engage stakeholders.
Staff Development: Provide educational opportunities to foster an understanding of student development while advancing knowledge, experience, and implementation of best practices.
Operational Excellence: Use assessment and innovation to continuously improve processes, programs, facilities, and services.
Fundraising: Raise external funds to optimize resources for programs and services [4].
Program/Unit Review
The Division of Student Life & Enrollment underwent a program review in 2011 with both internal and external reviewers [5] [6] [7]. According to the LSU Office of Academic Affairs,
“Program review is a meaningful process that contributes to the overall quality of the department and the university. Thus, the review process is evaluative as well as descriptive, directed toward improvement, resulting in action, and based on primarily academic criteria. It consists of an internal, objective process, which will be coordinated, whenever possible, with other reviews. The information gathered provides critical internal data about size and stability of a program, current and future resource needs, market demand, equipment and space needs, strengths and weaknesses, and how the program contributes to the mission of the institution. From an external perspective, the assessment results provide a mechanism for demonstrating accountability and assist in efforts to garner financial, philosophical, and political support. The value of the program review rests on its process, its outcomes, and its usefulness. Because the process and outcomes are developed for purposes of improving educational opportunities, curriculum quality, and program relevance, it is essential that the university make appropriate use of the results.”
Additionally, Student Life & Enrollment submitted a follow-up memorandum in response to the Memorandum of Understanding outlining future plans [8] [9].
Assessment within the Division of Student Life & Enrollment
Assessment is a primary focus and valued function in the Division of Student Life & Enrollment. In summer 2009, the division made a concerted effort to organize assessment division-wide by adding a coordinator with 50% time devoted to assessment.
Assessment is identified in the Divisional Strategic Plan as one of the division's six values: the division values the use of "data for planning and continuous improvement to provide the best possible services" for LSU students [10]. Assessment is a major priority in the Division of Student Life & Enrollment; the expectation to use assessment data for continuous improvement is inculcated throughout the division and reinforced through professional development and through communication within the Assessment Contacts Committee. Each department has focused on assessment during the last four years.
Departmental Assessment Plans
Each department is responsible for crafting an assessment plan related to the missions of the division and the department and is tasked with measuring Student Success Outcomes and Strategic Plan Goals. Assessment plans identify the specifics of each project, including purpose, method, timeline, population, challenges, and plans to improve current practice and reporting. Assessment plans are peer reviewed by at least two members of the Assessment Contacts Committee and evaluated against a rubric to ensure quality assessment planning. In addition, the assessment coordinator reviews each assessment plan and provides feedback, based on the rubric, on any changes to be made. Written feedback from the coordinator and the peer-review team is given to each department. Departmental assessment plans with area-specific information are included here.
Assessment Contacts Committee
Assessment within each department is overseen by that department's assessment contact. The Assessment Contacts Committee is made up of staff members from each department within the division. Members, appointed by each department's director, meet monthly to discuss and plan division-wide assessment efforts. In addition, professional development opportunities are provided to the team through webinars, on-site training, and hands-on experience [11].
Assessment Methods
The division is a member campus of the Campus Labs Baseline assessment platform, which provides assessment tools and consultation. Direct and indirect methods of assessment are used, with optional survey tools, rubrics, and benchmarking capabilities. Departments within the Division of Student Life & Enrollment use various assessment methods, including needs-assessment surveys, rubrics for evaluating artifacts (such as responses to open-ended essay questions) and behavior, focus groups, and comparisons of institutional data. All data are tracked within the platform, providing multiple years of results.
Annual Reports and Priority Planning
Annual reports are drafted by each department to highlight accomplishments based on Strategic Plan Goals, identify objectives for priority planning based on Strategic Plan Goals, and provide a strategic plan update including updates on Performance Indicators [12] [13] [14] [15].
Assessment is continually stressed throughout the division in several ways, as noted above and in additional examples below:
- The Assessment Contacts Committee meets on an ongoing basis and emphasizes continuing training and education [16];
- All departments submit an assessment plan as described through the process above; assessment plans are then evaluated using the Assessment Plan Rubric [17];
- Assessment training and development are provided for staff through the Professional Development Committee and Assessment Contacts Committee on a regular basis, including mini-conferences, webinars, and guest speakers [18] [19].
Included below is a report for each department regarding departmental assessment and the ways in which results were used to inform and improve practice, broken out by academic year. A sample of departmental assessment initiatives, including surveys and other forms of assessment, can be viewed in the Assessment Projects Reports below.
Documentation of Assessment Activities for Departments within the Division of Student Life & Enrollment
Career Services [20]
Center for Academic Success [21]
Enrollment Management [22]
First Year Experience [23]
Parent & Family Programs [24]
Orientation [25]
Campus Life [26]
Disability Services [27]
Greek Life [28]
Student Advocacy and Accountability [29]
Residential Life [30]
University Recreation [31]
Examples of the Use of Assessment in the Division of Student Life & Enrollment
Career Services
Services offered through the department are evaluated in several ways, including the following:
- The Annual Assessment Plan [32] [33];
- The Five-Year Long-Range Assessment Plan, which thoroughly evaluates four primary services each year;
- An evaluation survey completed by students after each appointment and after each recruitment event; and
- Benchmark studies with Career Services offices at peer institutions, conducted on an annual basis.
Career Services has utilized assessment for continuous improvement of its programs and services and implemented changes based on assessment results. Below are examples of evidence of improvement based on analysis of assessment results.
Career Expo Longitudinal Assessment
- Student Sample: The student sample included all students (undergraduate and graduate) who attended the Career Expo in each given semester. (Spring 2010: Sample = unknown due to the survey distribution format, Respondents = 137; Spring 2011: Sample = 995, Respondents = 138; Spring 2012: Sample = 1807, Respondents = 256; Spring 2013: Sample = 1865, Respondents = 380)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - Prior to 2010, the Career Expo was a one-day event in which all majors and employers were represented in a single career fair. After the assessment, Career Services changed the Career Expo into a two-day event: the Engineering, Science, and Technology Expo on one day and the Business and Liberal Arts Expo on another.
  - According to the survey, "34% of students said that the current format of the Career Expo did not meet their individual needs having all the majors at one career fair." When asked whether the format should change from one large career fair for all majors and industries to multiple "boutique" career fairs broken down by industry or college, students were essentially split, 49% to 51%, between the options.
  - Since the change was made in 2010, students have reported an increase in feeling that the Career Expo was an effective use of their time, with the exception of Spring 2011. In Spring 2010, 66% reported the Career Expo was an effective use of their time; the results for Spring 2011, Spring 2012, and Spring 2013 were 63%, 79%, and 71%, respectively. Students have also reported strong satisfaction with the new Career Expo format: 80% preferred the expo broken into two days in Spring 2011, 86% in Spring 2012, and 85% in Spring 2013 [34] [35] [36] [37] [38] [39] [40].
Experiential Education Assessment
- Student Sample: The student sample included all students who attended one of the four workshops that were part of the Experiential Education Program in 2009-2010. (Sample = 118, Respondents = 42)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - In 2009-2010, the Experiential Education Program held four workshops designed to help students increase their skill and confidence in their internship searches and expand their knowledge of how to get the most from their internship experiences.
  - As a result of participating in the workshops, 59% of attendees said their confidence about how to obtain work experience increased a great deal or considerably, 54% said their understanding of how to get the most from their experience increased a great deal or considerably, 69% said their knowledge of Career Services resources had increased, and 87% said they would recommend the workshop they attended to others.
  - Following the assessment, the Experiential Education Program made Work Experience Week an annual event; it has been held every March since Spring 2010.
  - Workshops and topics have been evaluated each year to provide the most up-to-date and timely information available. Workshop content has included work experiences showcased by student panelists, employer perspectives on hiring interns, presentations by representatives from the Louisiana Department of State Civil Service, and tips on preparing for the world of work [41].
Center for Academic Success
Services offered through the department are evaluated in several ways, including the following:
- The Annual Assessment Plan [42] [43];
- Performance evaluations for classified and non-classified personnel, completed annually;
- Quantitative and qualitative assessments of programs, completed via Campus Labs and program data from the Office of Budget and Planning;
- Student assessment at the end of workshops and presentations; and
- Special projects in collaboration with other departments to create surveys and focus groups regarding the CAS image.
The Center for Academic Success has utilized assessment for continuous improvement of programs and services and implemented changes based on assessment results. Below are examples of evidence of improvement based on analysis of assessment results.
IMPACT Assessment
- Student Sample: The student sample included all students who participated in the IMPACT program. (2010 survey: Sample = 127, Respondents = 38; 2011-2013: quantitative data comparison)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - CAS made changes to the IMPACT program after administering a formative (teaching) assessment at the end of the workshop, a formative (learning) assessment two to four weeks later, and a summative (learning) assessment at the end of the semester.
  - Initial assessments of teaching were positive. Approximately 70% of students responding stated that the activities and quizzes included in the workshop were helpful; however, approximately 25% stated that they would like more "hands-on" opportunities and interactions during the workshop.
  - Self-reported academic confidence improved. On a scale from 1 to 10, students ranked their overall confidence between 4 and 5 before IMPACT and between 7 and 8 on the post-IMPACT learning assessment.
  - Fifty-one of 70 student users (73%) reported improved grades, better time management, and improved study habits (e.g., attending class, reading more, listening to lectures, studying in advance) on the post-IMPACT learning assessment. Students on academic warning or probation who attended IMPACT obtained at least a 0.5 GPA increase over those in the same group who chose not to attend.
  - The workshop continues to grow, with more sessions for students to attend and more "hands-on" activities to engage students in understanding how to transition more effectively. In 2010, 64 students were served; in 2011, CAS served 487 students (including all first-year students).
  - Results indicated the value of the IMPACT program, allowing CAS to make the program mandatory in Spring 2013 for all first-year students placed on academic probation or warning by the university (i.e., those who received less than a 2.0 GPA). CAS added workshop sessions to accommodate the additional students [44] [45] [46] [47].
Tutorial Center Assessment
- Student Sample: The student sample included all students who visited the Tutorial Center in 2012. (Sample = 1398, Respondents = 65)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - According to the 2012 Tutorial Services Center Survey, overall responses indicated that students found having tutoring available on an "as needed" basis highly beneficial.
  - Approximately 3% mentioned the need for more tutors or for coverage of specific subjects that were not offered.
  - Of the students responding to the survey, approximately 50% stated that more space and additional tutors were needed. Students commented that there was not enough room or that the center was too crowded and that they therefore could not obtain services.
  - CAS has addressed the shortage of tutors by moving from a decentralized model to a centralized center in Middleton Library. With the assistance of LSU's Student Government, the center is able to provide more tutors and utilize their skills across multiple subjects. Based on the impact of this continued assessment and a projected increase in use over a 10-year period, a Tutorial Center expansion is planned for summer 2013 to better serve LSU students [48].
First-Year Experience
Services offered through the department are evaluated in several ways, including the following:
- The Annual Assessment Plan [49] [50] [51] [52];
- Program Assessment Plans: All major programs have specific assessment plans that outline the purpose of the assessment, what the assessment is, and how the assessment information will be used;
- Strategic Plan: Each year, this plan is reviewed to ensure programs and other initiatives are supporting the overall objectives and purpose of FYE;
- FYE Checkpoints: Periodically each semester, FYE checks in with students around campus (e.g., the LSU Student Union and dining halls) to determine areas of concern, what has been helpful, etc.;
- Individual Program Surveys: Utilizing Campus Labs, surveys are distributed for all programs to determine the impact on student learning, what worked, what did not work, what information students are interested in, etc.
First-Year Experience has utilized assessment for continuous improvement of its programs and services and implemented changes based on assessment results.
S.T.R.I.P.E.S. Staff Training Assessment
- Student Sample: The student sample included all S.T.R.I.P.E.S. staff from 2012. (Sample = 59, Respondents = 29)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - Prior to 2012, S.T.R.I.P.E.S. staff training consisted of a one-day training in the spring and a three-day training in the summer. The training was a substantial time investment but did not seem to provide the support the students needed, especially in regard to facilitating small- and large-group activities. Thus, for 2012 the training was changed to include an online Moodle component and an in-person training, held immediately before the first session, that emphasized hands-on, experiential learning.
  - Following the completion of each S.T.R.I.P.E.S. summer training, a debriefing/evaluation session is conducted with the staff in addition to the program evaluation. Debriefing revealed that returning staff found the training ineffective and repetitive. New staff indicated not feeling prepared to facilitate activities for students or knowledgeable enough about the program, especially regarding terminology and traditions specific to S.T.R.I.P.E.S. Of the 54 students who completed the online evaluation, 10 said improvement was needed in explaining events and 10 said they wanted opportunities to do mock facilitations/presentations.
  - In 2012, a two-part training model was implemented to better accommodate both new and returning staff. The first part was delivered online through Moodle: staff took a pre-test and were then provided with access to different training modules, which they completed on their own prior to the start of the summer training. Seventy-one percent of respondents agreed or strongly agreed that the Moodle training site was helpful in understanding their responsibilities as S.T.R.I.P.E.S. staff members.
  - With the online training added in 2012, the summer training became more experiential, and the staff practiced facilitating activities in order to be better prepared for the sessions. Sixty-seven percent of participants agreed or strongly agreed that the August training prepared them for the responsibilities required of their student staff positions [53] [54].
Transfer Student Orientation Assessment
- Student Sample: The student sample included all students who participated in Transfer Student Orientation. (2011: Sample = 309, Respondents = 148; 2012: Sample = unknown, Respondents = 79)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - Orientation made changes to Transfer Student Orientation following a theme that arose from the open-ended responses on the previous program survey: transfer students asked for a transfer orientation agenda that mirrored some of the activities on the New Student Orientation agenda in order to feel more acclimated to campus life. As a result, a myLSU session, a Safety & Security session, and a Campus Tour were added to the Transfer Orientation student agendas.
  - After these changes were made, the transfer students' evaluation of the program went from 4.0 to 4.2 out of a possible 5.0 [55] [56].
Spring Invitational (SPIN) Assessment
- Student Sample: The student sample included all students who participated in SPIN. (2011: Sample = 1190, Respondents = 828; 2012: Sample = unknown, Respondents = 841)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - Orientation made changes to Spring Invitational (SPIN) Student Orientation after trends emerged in student comments asking for more step-by-step instruction on how to register for classes. The evening Orientation Leader Meeting was therefore redesigned to give students more detailed instructions.
  - After the agenda was changed, the program evaluation increased from 4.3 to 4.4 out of a possible 5.0 [57] [58].
Family Weekend Assessment
- Sample: The sample included all family members who participated in Family Weekend. (Sample = 516, Respondents = 212)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - Parent & Family Programs made changes to the Family Weekend program after program evaluation results included comments/suggestions to offer more Louisiana cuisine, make water service available at all meals, add milk to the Jazz Brunch, relocate the football ticket section, offer more activities on Friday, and include Saturday tailgate activities with more interaction among families.
  - Parent & Family Programs made the following suggested changes: featured more Louisiana cuisine, such as red beans and rice, jambalaya, and bread pudding; added water service at all meals; added white and chocolate milk to the Jazz Brunch menu; and collaborated with the Ticket Office to secure a limited number of lower-level seats.
  - Parent & Family Programs also moved check-in to start earlier on Friday and incorporated events already happening on campus, such as museums and other places to visit, as well as tailgate games that encouraged multiple families to participate [59].
Parent Orientation Assessment
- Sample: The sample included all family members who participated in Parent Orientation. (Sample = 2052, Respondents = 821)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - Parent & Family Programs made changes to Parent Orientation following program evaluation results in which parents' comments/suggestions included requests for more time with campus administrators, more information for out-of-state families during the Financial Aid Presentation, and more information about the ALEKS test. Parents also found it difficult to locate parking areas and to navigate campus.
  - Changes included inviting campus administrators to the Parent Dinner on Day 1 and hosting a Q&A period; expanding the Enrollment Management Financial Aid presentation to incorporate more information geared to out-of-state students and families; adding a discussion of the ALEKS test to the senior colleges' college meeting presentations and adding ALEKS information to the Parent Orientation Leader (POL) Hot Sheet to better respond to parents' questions and concerns about ALEKS; including in the reminder email a campus map that listed specific parking lots; and creating additional orientation directional signs to be placed around campus [60].
Office of the Dean of Students
Campus Life
Services offered through the department are evaluated in several ways, including the following:
- The Annual Assessment Plan, which includes (1) national benchmarking through the Multi-Institutional Study of Leadership, a national data set assessing the leadership development of LSU students with respect to the Social Change Model, which is the foundation of LSU's formal leadership development programs offered through Campus Life; (2) a needs assessment; and (3) a student leader rubric.
- End-of-the-Year Programming Reports, for which Campus Life tracks metrics for each initiative, including the number of programs, initiatives, and services; the number of hours spent advising both groups and individual students; and budget reports.
- All assessment is used to improve programming and refine strategic initiatives [61] [62].
Campus Life has utilized assessment for continuous improvement of the unit’s programs and services and implemented changes based on assessment results. Below are examples of evidence of improvement based on analysis of assessment results.
- NASPA Assessment and Knowledge Consortium – Civic Engagement. Student Sample: A random sample of undergraduate students, representative of all undergraduates. (Sample = 4000, Respondents = 710)
- Multi-Institutional Study of Leadership. Student Sample: A random selection of 4,000 undergraduate students was invited to participate, and an intentionally selected sample of approximately 1,000 student leaders was included. (Sample = 5000, Respondents = 1032)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - Data from the NASPA Assessment and Knowledge Consortium – Civic Engagement and the Multi-Institutional Study of Leadership demonstrated that LSU students differed significantly from their peers in the following ways: less likely to engage in critical conversations, less likely to realize their role in impacting their communities, less able to connect what they learn in the classroom to life outside the classroom, and less active in social advocacy issues.
  - Campus Life made changes to its volunteer/service programs following this assessment. Prior to 2011, very few moments for reflection, partnerships with outside departments or academic units, or critical conversations on societal issues were incorporated into most Campus Life volunteer/service programs.
  - Since 2011, Campus Life, working with student organizations such as Volunteer LSU, Geaux BIG Baton Rouge, and Kitchens on the Geaux, has increased academic partnerships and included outside departments. Campus Life has also implemented structured reflection questions at the end of each service opportunity (e.g., "How does the project impact the community?").
  - Also, in 2012, Campus Life added the "Critical Conversations" program to its offerings. This program encourages conversations related to difference and difficult societal issues.
  - Each of these additions and changes is having an impact on the students served and is also helping achieve the Division of Student Life & Enrollment SSOs, particularly those related to citizenship and civic engagement and to interpersonal/intrapersonal competence [63] [64].
Disability Services
Services offered through the department are evaluated in several ways, including the following:
- The Annual Assessment Plan [65] [66]; and
- A series of satisfaction surveys.
Disability Services has utilized assessment for continuous improvement of the unit’s programs and services and implemented changes based on assessment results. Below are examples of evidence of improvement based on analysis of assessment results.
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - In August 2009, Disability Services held its first new student orientation in an effort to increase the number of students using accommodations, educate new students and their parents about Disability Services and other resources on campus, increase the retention rate and GPAs of students with disabilities, and educate new students on how to navigate using their accommodations at LSU. In 2010, 26% of the assessments turned in noted that the orientation was too long for students with attention-related disabilities.
  - As a result, in 2011 the orientation was shortened from 3 hours to 1.5 hours.
  - In 2012, 30% of the assessments noted that the orientation time was not convenient for parents of students with disabilities to attend. Therefore, the orientation will be moved to late morning or afternoon in 2013 [67].
Greek Life
Services offered through the department are evaluated in several ways, including the following:
- The Annual Assessment Plan [68] [69]; and
- Use of the CAS Standards for Fraternity/Sorority Life.
Greek Life has utilized assessment for continuous improvement of its programs and services and implemented changes based on assessment results. Below are examples of evidence of improvement based on analysis of assessment results.
EMPOWER Weekend Assessment
- Student Sample: The student sample included all students who participated in EMPOWER Weekend 2010 and 2011. (2010: Sample = 112, Respondents = 65; 2011: Sample = 121, Respondents = 73)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - The Greek Life office assessed EMPOWER 2010 utilizing focus groups and a pre-/post-test. After confirming the findings represented in the pre- and post-test results for EMPOWER, focus group participants helped generate ideas for improving the program in the future.
  - Participants noted that it would have been helpful to meet their EMPOWER group prior to the weekend in order to build relationships with their group members before the start of the retreat. This preliminary meeting was termed Phase I and was developed into the curriculum by students and staff.
  - Based on the results, staff decided to develop a Phase I for the 2011 EMPOWER experience. This two-hour meeting, held a week before the EMPOWER weekend, introduced the retreat and gave attendees the opportunity to get to know their EMPOWER group leaders and group members in advance [70] [71].
NASPA Assessment and Knowledge Consortium: Greek Life
- Student Sample: The student sample included all students coded in the LSU system as members of Greek organizations. (2010: Sample = 111, Respondents = 65; 2011: Sample = 121, Respondents = 73)
- Key Finding(s) and/or Change(s) Resulting from Assessment:
  - Greek Life made changes following the Graduating Senior Survey, with the goal of identifying positive and negative experiences of Greek membership.
  - The survey was sent via the NASPA Consortium: Fraternity/Sorority Life Impact Student Survey, with Greek Life adding 10 questions to the survey, which was administered and available to members of any Greek organization. The total consortium project had 1,161 responses, with 931 completing the entire survey; Greek students answered the 10 additional open-ended questions at a lower rate.
  - The item pertaining to the Greek experience assisting in obtaining a job or internship scored significantly lower than the other items, suggesting that this may be something students are looking for from their fraternity or sorority experience that is not coming to fruition as often.
  - The results were shared with chapter presidents and advisors at the annual kickoff meeting in August 2011. Chapter presidents and advisors were encouraged to better coordinate efforts to connect members with support in obtaining an internship or job opportunity.
  - To provide additional positive incentives supporting career development for the chapters, the Greek Life office also added a career workshop to the annual Greek assessment process as one of the workshops chapters can host to obtain points for Greek assessment. This addition should increase the focus on career development for the chapters and provide additional resources for Greek students [72].
NOTE: The narrative continues in "3.3.1.3 (Continued)."