



COEHD Assessment System Description

       The College of Education and Human Development (COEHD) comprises three departments: Teaching and Learning, Educational Leadership and Technology, and Counseling and Human Development.  The Department of Counseling and Human Development is not addressed in this report because its programs are accredited by the Council for Accreditation of Counseling and Related Educational Programs (CACREP).  All full-time faculty in the unit participate on a Standard Committee.  The chairperson for each Standard Committee also serves on the NCATE Steering Committee.  The Standard 2 Committee, which is charged with assessment and program evaluation, comprises representatives from the three COEHD departments.  The Department of Counseling and Human Development is represented on Standard 2 and other committees to keep its administrators and faculty apprised of NCATE activities.

      The unit assessment system is an organized, multidimensional, integrated, and efficient structure and set of procedures for monitoring and measuring candidate performance and program effectiveness, for the purpose of substantiating the achievement of the unit’s goals and ensuring continuous program improvement.  The unit has long emphasized the preparation of effective professionals, but since the programs were redesigned, one substantial change in the unit assessment system is a shift in focus from viewing candidates as consumers to viewing them as producers.  This shift is demonstrated by the replacement of the comprehensive examination with an action research project in most advanced programs.  The unit assessment system is illustrated as a chart in Exhibit 2a.4.

       The principal entities forming the structure are the NCATE Standard 2 Committee, the NCATE Steering Committee, the Dean’s Advisory Council (DAC), the Council for Teacher Education (CTE), COEHD faculty, and other professional community members (representatives of partner colleges and PK-12 schools).  The Steering Committee comprises the Director of the COEHD Assessment System and Program Evaluation, the Dean and Department Heads, the chairpersons of each Standard Committee, and representatives of partner colleges.  The CTE is chaired by Dr. Rebecca Day, Director of Assessment and Program Evaluation.  Members are appointed by the dean to maintain a council that represents all departments offering degree/certification programs.

       The unit assessment system procedures begin with the Standard 2 Committee’s review of assessment system documents.  The chairperson of the Standard 2 Committee is responsible for reporting to the NCATE Steering Committee and for communicating information from the Steering Committee to the Standard 2 group.  The DAC disseminates information to, and requests feedback from, the CTE, faculty members, and the professional community.  The DAC presents documentation reflecting these communications to the Standard 2 Committee, which, in turn, considers the information and prepares a report that is submitted to the DAC for its approval.

       Inextricably linked to the unit assessment system are Specialized Professional Association (SPA) and Southern Association of Colleges and Schools (SACS) standards.  In conjunction with efforts pertinent to NCATE accreditation, faculty and administrators work together to align programs with SPA and SACS standards, which share the same purpose as the NCATE standards.


Data Use for Program Improvement

       Another salient process in the unit assessment system is the use of data for program improvement, represented as a chart in Exhibit 2b.1.  While the process shown in the chart is configured as a set of linear steps, it is also recursive in that a proposed change can be sent back to a previous step to elicit additional information or clarification.  When a change in the structure or procedures is proposed, it is presented to the appropriate committee (e.g., faculty teaching methods courses) for review and response, which can take the form of a decision, an action step, or a proposal for change.  The committee’s written response is submitted to the DAC for review with a request for approval.  If the DAC approves, the proposal then moves to the CTE and other committees for further review and comment.  The suggested change, with comments, is returned to the DAC, which considers the input and reports its final decision to approve or reject the proposal.  Changes approved by the DAC leave the COEHD for review and approval by the university’s Curriculum Council and administration.  This process and others relevant to the unit assessment system are facilitated, as needed, by the director of the Office of Assessment and Program Evaluation.

       Needs for program improvement can emanate from various sources, such as a periodic review of courses by a graduate committee or an ad hoc committee’s discussion of a concern.  Examples of program improvements based on assessments are provided in Figure 1 below.  All proposals, regardless of their source, proceed through the system.  Documentation on data and its uses is shared with stakeholders, such as supervising teachers and university supervisors of student teachers and interns.  To complete the process, improvements must be substantiated.  If a committee implements strategies to improve candidates’ PRAXIS scores, for example, the scores on subsequent PRAXIS tests can serve as one form of evidence.


Figure 1: Timeline of Program Improvements Based on Assessments

Study and Findings: Annual Performance Review (conducted each semester) identified needs for improvement of courses and for faculty improvement.
Resulting Change: Workshops presented by faculty.

Study and Findings: Statewide redesign of teacher preparation.  It was found that state programs needed to increase content-based courses.
Resulting Change: Southeastern redesigned its elementary and secondary undergraduate programs; content was increased in both.

Study and Findings: It was ascertained that candidates required distance education.
Resulting Change: Professors were encouraged to teach some online courses.


Electronic Data Reporting Systems

      Through the unit assessment system, data on candidate performance and program effectiveness are collected, aggregated (and disaggregated), analyzed, and reported by the director of the COEHD Office of Assessment and Program Evaluation.  Additional aspects of assessment, such as demographic data on candidates and Student Opinion of Teaching surveys (SOTs), are under the purview of the university’s Office of Institutional Research and Assessment (IRA).  For the purpose of evaluating and improving the unit and its operations, each office employs an electronic system for the manipulation of data.

·        PASS-PORT is the unit’s electronic portfolio system, and

·        PeopleSoft is the electronic system employed by the IRA.

       While the Director of the COEHD Office of Assessment and Program Evaluation and the Director of Southeastern’s Office of Institutional Research and Assessment are responsible for separate aspects of assessment and program evaluation, they cooperate to provide an institutionalized, comprehensive system of instruments and data sets that complement each other and foster continuous improvement.

        Ms. Flo Winstead, Director of the COEHD Office of Assessment and Program Evaluation, coordinates the unit assessment system, which includes the oversight and management of PASS-PORT.  In addition to collecting and aggregating data on key assessments, the Director also receives and fulfills other data requests from administrators and faculty.  In this role, she provides assistance to administrators, faculty, and candidates.  The assessment system utilizes PASS-PORT to document candidate progress through distinct levels, called portals, and to provide data to ensure that candidates are engaged in a broad range of experiences with diverse student populations.  One of the final key assessments in the initial program is the Evaluation of Student Teaching/Internship rubric, based on the Louisiana Components of Effective Teaching (LCET), which is completed by a candidate’s school mentor/supervisor with input from the student teacher and university supervisor.  In both initial and advanced programs, candidates proceed through the series of portals and, if requirements are met at each transition point, they gain access to the next portal.

       Dr. Michelle Hall, director of the university’s Office of Institutional Research and Assessment (IRA), works to fulfill the IRA mission, which is “to provide data, information, expertise, and leadership in support of the mission, vision, and strategic priorities of Southeastern Louisiana University.”  The IRA employs PeopleSoft to collect and manipulate all data; therefore, a wide array of information is available to faculty and administrators for research and for gauging program and institutional effectiveness.  Demographic data on the COEHD student body, such as gender and ethnicity, are collected and reported by the IRA.  The IRA provides enrollment funnels for each academic department’s programs.  The system collects data on applicants, those who are admitted, and students who are enrolled.  Each fall, retention rates are calculated and reviewed for various purposes.  For instance, data can be used each year to compare outcomes of redesigned programs with outcomes of previous programs.  The IRA also conducts a biannual Alumni Survey and Employer Survey.  Results of these surveys, evaluations, retention rates, and enrollment funnels are distributed to all deans and department heads, and the information is also available on the IRA website.  ACT scores and other data are collected for all students, and the data are also disaggregated and reported for each college and department.

       The IRA administers a variety of surveys to garner the opinions of the university’s partners, such as school principals and supervising teachers, which are used to inform program improvement and institutional effectiveness.  Employer surveys, which seek to learn how well graduates are prepared to perform work responsibilities, are conducted every other year.  Some surveys are administered annually, and others alternate over a four-year cycle.  Alumni surveys, some using random samples of recent graduates (initial and advanced), are administered every other year to solicit opinions regarding such things as satisfaction with their degree program, professional activities, employment, library services, and technology.  Results of the Exit Survey, which contains questions specific to each student’s major, are compiled and reported annually.  The National Survey of Student Engagement (NSSE), administered to all freshmen and seniors, elicits information that links research-based educational practices to student learning.  The NSSE was last conducted in 2002 and 2003.  Additionally, Southeastern was one of 143 schools that participated in a pilot project of the Faculty Survey of Student Engagement (FSSE) in 2003.  Faculty and staff are also surveyed on various topics, on a two- to three-year cycle, to further assess institutional effectiveness.  (Note: Complete survey data are not available for AY 2005-2006 due to Hurricanes Katrina and Rita.)

       In addition to the numerous student surveys, the IRA conducts assessments relative to faculty effectiveness, such as the Student Opinions of Teaching (SOTs).  Each semester, students (candidates in the COEHD) complete a survey and respond to open-ended questions, which provide a mechanism for voicing concerns and lodging complaints.  The SOT data are forwarded to the department head and instructor for review and for use in the instructor’s final evaluation each year.


Conceptual Framework and Candidate Knowledge, Skills, and Dispositions

       In addition to providing the means for collecting and manipulating data for program evaluation and improvement, the unit assessment system is invaluable for monitoring and measuring candidate performance to promote candidates’ academic success, effectiveness in their professional activities, and impact on PK-12 student learning.  The Conceptual Framework provides direction for the development of effective professionals.  It is a living document that continuously evolves but is always centered on best practices.  As indicated in the diagram below, the COEHD Conceptual Framework is represented in the unit assessment system, which provides the means for assessing candidates’ knowledge, skills, and dispositions.

Programs in all departments reflect the COEHD Conceptual Framework. The four components of the Conceptual Framework are:

·        Knowledge of Learner (KL),

·        Strategies and Methods (SM),

·        Content Knowledge (CK), and

·        Professional Standards (PS).

Diversity and Technology are themes integrated throughout all programs in the unit assessment system.


Selective Admission and Retention in Teacher Education

       An element of the unit assessment system relative to candidates’ entry into and progress through the program is the Selective Admission and Retention in Teacher Education (SARTE) process, which tracks candidates to ensure that they meet the criteria to enter the program and remain in the program.  If a candidate does not meet the criteria for satisfactory progress in SARTE, or if a faculty member believes a candidate needs additional help to meet expectations, procedures are in place to provide assistance and remediation.  Candidates may be referred to the Teacher Development Program, which offers individual and group services (e.g., workshops, conferences, and study materials).  Documentation on referrals and actions is maintained in a file by the Teacher Development Program coordinator, Dr. F. Wood.


PASS-PORT: Assessing Candidate Performance in Initial and Advanced Programs

       Candidates enrolled in initial and advanced programs complete a portfolio composed of distinct levels, called portals, whereby they substantiate their knowledge, skills, and dispositions and demonstrate the achievement of required competencies that are consistent with the COEHD Conceptual Framework.  The objectives included in the syllabi for all professional education courses indicate the Conceptual Framework components addressed by each objective; therefore, when any course objective is achieved, candidates’ performance is assessed.  For their portfolios, candidates are required to indicate the components and the knowledge, skills, and dispositions addressed in artifacts, reflections, professional development activities, and clinical experiences.

       PASS-PORT is the unit’s electronic, web-based portfolio system, selected by Southeastern’s College of Education and Human Development to monitor and measure candidates’ performance, to ensure that candidates are engaged in a broad range of experiences with diverse populations, and to collect data with the intention of fostering their continuous development as effective professionals in their field.  This professional accountability support system uses a portal approach to track candidates’ progress from their entry into an initial or advanced program through the post-completion portal.  Exhibit 2a.2 is Table 5, which shows the portals through which candidates progress in the initial and advanced programs.  PASS-PORT provides readily accessible documents and data reports that faculty and administrators can use to identify areas for program improvement and otherwise strengthen the unit.  All artifacts and other components of the portfolio (e.g., field experiences) are assessed by faculty.  Course assignments included as artifacts are assessed, as formative assessments, by the faculty who teach the courses.  Faculty also evaluate the portfolios of the advisees who are assigned to them.  The rubric completed by faculty at each transition point (portal) serves as a summative evaluation, which either grants or halts candidates’ access to the next portal.  When a candidate’s portfolio, or a component (e.g., a reflection), receives a “Does Not Meet Expectations” rating, the candidate is notified and receives comments from the evaluator.  The candidate has one week to address the deficiencies and resubmit the artifact or portfolio to the evaluator.  Until the portfolio meets or exceeds expectations, the candidate is not permitted to enroll in the next level of courses.  The portfolio may only be resubmitted once.
If the portfolio does not meet expectations a second time, the candidate is referred to the Teacher Development Center for remediation.  Upon completion of actions that address areas for improvement, the portfolio may be resubmitted the following semester.  If a candidate chooses to appeal any decision relative to the portfolio evaluation, he or she follows the university’s appeal and grievances procedures found in the student handbook.

       The initial teacher certification program comprises undergraduate and Master of Arts in Teaching (MAT) courses of study.  In the initial portfolio, candidates progress through three levels.  At the Introductory Level, candidates enroll in courses focusing on the foundations of education and basic content knowledge.  Candidates then move into the Developing Level, during which they are enrolled in education methods courses.  These courses include diverse field experiences and opportunities to utilize the knowledge, skills, and dispositions essential for effective teaching.  At the Competency Level, candidates participate in a semester of Student Teaching or Internship.  This level allows them to broaden their base of experiences even further by providing sustained opportunities in additional field settings.  Through the portfolio, candidates at the initial level demonstrate the achievement of competencies representing educational standards set forth by state and national agencies and organizations, such as:

·        Interstate New Teacher Assessment and Support Consortium (INTASC), a national group focused on new teacher development, has identified ten standards and supporting principles that represent competencies essential to teacher effectiveness.

·        Louisiana Department of Education Components of Effective Teaching (LCET) is a set of performance indicators, called benchmarks, that were modeled on the INTASC standards. 

·        SPAs are content-specific organizations (e.g., ACEI, CEC, NAEYC, ISTE, AASA).

       Advanced programs are offered by the three COEHD departments: Teaching and Learning, Educational Leadership and Technology, and Counseling and Human Development.  Programs in the Department of Counseling and Human Development are accredited by CACREP; therefore, information on those programs is not included in this document.  Candidates develop a portfolio that progresses through three levels: Emerging Level, Proficiency Level, and Capstone Level.  Candidates demonstrate the achievement of competencies set forth by the unit, state entities, and SPAs, such as the National Board for Professional Teaching Standards (NBPTS), ISTE, and the Educational Leadership Constituents Council (ELCC).

       For initial programs, candidates transition through five portals. The three portfolio levels are contained in Portals 2, 3, and 4.  For advanced programs, candidates transition through five portals.  The three portfolio levels are contained in Portals 7, 8, and 9.  Exhibit 2a.2 (Table 5) shows the transition point assessments in the Unit Assessment System Charts.  Exhibit 2a.3 is a description of the portals in PASS-PORT through which candidates progress.


Initial Program Transition Points/Portals

Portal 1

Portal 2: Admission to Teacher Education Program

Portal 3: Program Progress

Portal 4: Student Teaching/Internship

Portal 5: First Year Induction




Advanced Programs Transition Points/Portals

Portal 6: Admission to Graduate School

Portal 7: Admission to Program

Portal 8: Program Progress

Portal 9: Program Completion

Portal 10


Program Evaluation: Assessing Faculty and Administrator Effectiveness

       The unit assessment system provides the means for assessing the effectiveness of faculty and administrators, another requisite part of program evaluation for continuous improvement. The structure and procedures are developed and carried out according to university and state policies and the ethics of practice.

       Southeastern’s Faculty Handbook describes the systematic manner in which the performance of administrators, from the president and vice presidents to deans and department heads, is evaluated to foster institutional improvement and to satisfy Southern Association of Colleges and Schools (SACS) requirements.  Deans are evaluated by the Provost.  Each dean completes a self-evaluation and submits the names of three peers, subordinates, and customers who evaluate the dean’s performance in areas such as communication, decision-making, planning and organizing, and collegial relationships.  Faculty in each academic dean’s college also complete an anonymous survey on their perceptions of the dean’s performance.  The Provost draws on each of these instruments and his own observations in his evaluation of the deans.  He meets with each dean to discuss the evaluation and to give him or her a copy of the report.  Each spring, deans complete an annual evaluation of department heads.  The format and procedures for the evaluation are derived from recommendations of the Council of Department Heads and the Faculty Senate.  Full-time faculty in each department are asked to complete a form that includes a set of questions and also requests comments.  If the evaluation results indicate general dissatisfaction, the faculty can be polled on a vote of no confidence.  After a meeting with the department head at which the evaluation results are discussed, a report is submitted to the Provost.  Since department heads are employed on an annual appointment basis, a decision can be made not to renew the appointment for the following year or to renew it, with provisions.

       Faculty are evaluated annually in three areas: teaching/job effectiveness, professional activity, and service (university and community).  Department guidelines are presented to the respective dean and the Provost.  In the spring, a faculty member submits a report of his or her accomplishments since the previous evaluation.  The self-report and the department head’s evaluation are discussed in an end-of-the-year meeting.  In addition to documenting accomplishments, strengths and areas for improvement are identified.  To foster growth, a plan to help the faculty member meet or exceed expectations is developed by the faculty member and the department head.  If deficiencies persist, steps are taken, according to university criteria and procedures, to terminate employment.  If the faculty member disagrees with any aspect of the report, he or she may respond to the evaluation in writing, and the department head must then respond in writing within an additional five-day period.  By the end of the academic year, the dean verifies in writing to the Provost that the faculty member’s evaluation has been completed.





Southeastern Louisiana University
© 2007 Southeastern Louisiana University
All Rights Reserved