Newsletter 2

Volume 1, Number 2; Summer, 1998
IN THIS ISSUE:

 

Staff, Office & Web Site

Learning from Others: The Wharton Experience

Learning from Others: Best Practices Institutions

Reports from the Subcommittees

Subcommittee on Student Learning
Subcommittee on Faculty and Staff Learning
Subcommittee on the Environment for Learning

Resources and Data
Contact Us

Our Staff, Office and Web site

The WASC Self Study office is located in the office of the Vice President for Academic Affairs, MH 135-A. (Enter through MH 133). Our clerical assistant is Jennifer Robinson (Ph: 278-3227). Her hours are 8:00 a.m. to noon. Charlene Carr, our graduate assistant, is located in MH 111A. Our email is WASC@fullerton.edu and our Web site is http://WASC.fullerton.edu.

Traveling the Continent in Search of Models for Student-Centered Learning

Members of the WASC Self Study Committee participated in two national projects designed to help us define what a student-centered learning environment can be. The first was conducted by The Wharton School of the University of Pennsylvania, together with Penn’s Institute for Research on Higher Education and the Knight Collaborative (formerly the Pew Roundtable).

The Chancellor’s Office invited CSU campuses to participate in a week-long seminar on "Managing Higher Education" with a particular focus on student learning. All 22 campuses applied, and ten were chosen: Bakersfield, Cal Poly Pomona, Dominguez Hills, Fresno, Fullerton, Los Angeles, Northridge, Sacramento, San Bernardino and Sonoma. Harold Goldwhite (CSU Institute for Teaching and Learning) and Jim Highsmith (CSU Academic Senate) also attended. Most of the campuses, like us, were at some stage of their accreditation process with WASC. Our "team" included Ellen Junn, Director of the Faculty Development Center, and four members of the WASC self study committee: Dave Fromson, Tom Klammer, Bob Palmer and Sandra Sutphen.

We spent a long week, starting with breakfast meetings at 7:30 and ending at 9:00 in the evening. At Friday’s concluding meeting, the Wharton folks told us that of all the groups they had instructed—including stockbrokers, business executives and civic leaders—the CSU campuses were the most controlling and demanding: we refused to behave like students; we insisted on more information; we rejected their premises; we challenged them at every opportunity. We’re not accustomed to such flattery, and we don’t know what we did to deserve such high praise from Wharton, but we accepted their judgment with great modesty and came home better people.

What we learned

What we learned was that our brother and sister campuses are all working with similar problems of assessing student learning. Many campuses are doing just what we are doing: developing new learning goals for general education, spending more time on faculty development particularly as it relates to technology, and focusing seriously on what it means to be a student-centered learning institution.

There was a bit of a mismatch between Wharton and the CSU. We thought we were there to learn about student learning and assessment; Wharton wanted to teach us about negotiation, bargaining and strategy. We spent the first sessions adjusting to each other’s expectations (actually, we adjusted to theirs), using break-out sessions to formulate ideas to share with our own campuses.

We were critical of any number of assumptions that Wharton built into the seminar. Faculty in management and administration will recognize many of the concepts that Wharton faculty teach: "win/win" negotiation, "interest-based" bargaining, flattening hierarchies, team-building, participatory management, quality circles, empowering employees, shared governance. Drawing on "Total Quality Management," the same approach that led to "reinventing government," Wharton acknowledged that business and industry are moving toward management theories that, frankly, have characterized universities all along.

Despite our criticism, the final session (beginning at 7:30 in the morning!) was a great success. We were divided into groups of five, representing five different campuses. We "toured" poster displays of the campuses, and representatives from each campus presented their perception of what a student-centered learning environment means for their campus communities. We talked about "action plans" and provided feedback based on our own experiences and expectations about what "learning" really means. For the "team" from Fullerton, this feedback reaffirmed that the many different ways in which teaching and learning are assessed here are also strategies and tools employed throughout the CSU. In short, we left Wharton with the strong feeling that our WASC Self Study Team is moving in the right direction.

Visiting "Best Practices" Institutions

Our second experience involved visits to "best practices" campuses around the country, identified by the American Productivity and Quality Center (APQC), a "think tank" in Houston. The CSU is a sponsoring member of several projects that APQC is conducting, all centered on issues of assessment and learning. ("Assessing Learning Outcomes" is the title of the current project.) This particular APQC project primarily involves higher education but also includes private-sector sponsors such as Raytheon. The goal is to locate institutions (public and private; business, government and education) that have developed models, strategies, or practices that others may use.

"Best Practices" institutions were selected by APQC for site visits, and, as a sponsor, the CSU invited our campus to send one individual to each site for a one-day intensive seminar. Vice President Tetreault asked the WASC Self Study Committee members to volunteer, and though the notice was short and the timing bad ("April is the cruelest month. . ."), Dave Falconer went to the Tennessee Valley Authority’s University on April 6, Pat Szeszulski went to the University of Phoenix in Arizona on April 17, Sandra Sutphen went to Emporia State University in Kansas on April 20, and Dave DeVries went to Ball State University in Muncie, Indiana on April 30.

We’re still comparing notes on what we observed at these "best practices" institutions, but, again, one clear observation is that Cal State Fullerton’s focus on learning—and the many and varied ways in which we assess that learning—qualifies us for "best practices" standing. There are different models, of course. Smaller institutions, even though they have fewer faculty, are able to work more intimately with students in smaller classes and, frequently, with students who reside on campus. And the higher education laws of other states mandate courses of study different from those under California’s Title V.

We found one experience that was shared by nearly all institutions: when assessment of student learning "works," everyone—faculty, students, employers, and the state legislatures (!)—feels more positive about the university. What does it mean when assessment "works"? Students present prospective employers with a portfolio that demonstrates their acquired skills; outside evaluators rate a department’s assessment measures; students successfully complete basic competency exams; expectations for excellent course work are well understood by students and faculty alike. All of these represent the kind of "culture of evidence" that we will be documenting in our own WASC self study.

Reports from the Subcommittees

The Self Study Subcommittees are engaged in defining the scope of our themes and in collecting data and resources to present evidence about learning and assessment. There has been a lot of progress, as each of the subcommittee chairs reports below.

Subcommittee Report: Student Learning

Pat Szeszulski

As has been the case with all the subcommittees, we have engaged in a variety of activities in order to define the scope of our work. Members of the committee have read a number of philosophical papers on student learning, gathered and considered a great deal of evidence on issues related to student learning at CSUF, and met regularly to discuss that evidence. In the process, the committee decided to focus on a limited number of key issues related to educating a diverse student body for the 21st century, and to study each issue in great detail rather than covering all possible issues superficially.

To facilitate consensus on which issues to pursue, members of the committee participated in a three-hour brainstorming session using Ventana GroupSystems, a software program that allows participants to contribute their ideas anonymously and simultaneously while working at separate workstations. Our session in the Library Studio Classroom on April 28 comprised three phases. First, 82 ideas were generated in response to the prompt, "What questions about student learning should this committee examine in order to be able to address whether or not University practice is consistent with the goals of its mission?" Second, the responses were reviewed and redundant ideas were combined. Third, participants used a 5-point scale (strongly agree to strongly disagree) to vote on whether each of the remaining 76 ideas should be considered by the committee. (Several additional "consensus building" phases had been planned for the session but were abandoned due to the profound slowness of the system.)

Independent reviews of the resulting data yielded a "philosophical/definitional" category (e.g., What is learning? What is assessment?) as well as the following four broad categories of "evidence" (particular foci) related to educating a diverse student body:

 

  1. Who Are Our Student Learners? (demographics; student concerns and preparation)
  2. Factors That Influence Learning (academic and technological resources; student/faculty collaboration; community-based and co-curricular experiences)
  3. Learning Goals (marks of a CSUF graduate/education; GE and selected other programs that have developed learning goals)
  4. Assessment of Learning (graduates’/seniors’ opinions; selected programs, departments, and initiatives working on assessment, e.g., Fullerton First Year)

This evidence will be collected in order to address the following:

  1. How students experience the University
  2. How students come to understand their own positionality in relation to others
  3. The relationship between (1) and (2) and what we do in the curriculum and the classroom
  4. How assessment of (3) informs our planning for the future
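
For readers unfamiliar with electronic brainstorming tools, the third phase of the GroupSystems session described above amounts to a simple tally: each surviving idea accumulates a set of 5-point votes, and one straightforward way to turn those votes into priorities is to rank ideas by mean agreement. The sketch below is purely illustrative: the ideas and votes are invented placeholders, not the committee's data, and GroupSystems performs its own tallying internally.

```python
# Illustrative tally of 5-point agreement votes (5 = strongly agree,
# 1 = strongly disagree), ranking ideas by mean agreement.
# Ideas and votes here are hypothetical placeholders.
from statistics import mean

votes = {
    "What is learning?": [5, 5, 4, 4, 5],
    "How do students experience the University?": [4, 5, 5, 3, 4],
    "Do GE courses meet the new learning goals?": [3, 4, 2, 4, 3],
}

# Sort ideas by mean agreement, highest first.
for idea, vs in sorted(votes.items(), key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{mean(vs):.2f}  {idea}")
```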

Subcommittee Report: Faculty and Staff Learning

Dave DeVries

Our committee decided quickly that many of the traditional indicators of faculty learning were good measures that stood the test of objective assessment. An enumeration of peer-reviewed publications, exhibitions, performances and conference presentations is part of the "culture of evidence" that demonstrates continued professional involvement and, presumably, continued learning. These data are easily gathered from departmental year-end reports, Compendium announcements, and acknowledgements at the annual recognition day sponsored by the VPAA. We agreed that there were any number of other indicators, some easier to collect and organize than others. Among these are:

  • workshop attendance
  • grants received
  • school/departmental retreats
  • new courses developed
  • active membership in professional associations
  • use of new technology
  • classes taken

We anticipate that the newly created Faculty Development Center will work closely with our subcommittee, both in providing data from past efforts sponsored by the Institute for the Advancement of Teaching and Learning and in informing us of new directions for faculty learning. For example, we agreed that it would be useful to have comparative data from other institutions, especially those with a long history of formalized faculty development.

Arriving at measurements for staff learning was less straightforward. We found some obvious indicators:

  • Classes taken at CSUF and elsewhere ("fee waiver")
  • Degrees earned while working at CSUF
  • PSI awards and other recognitions
  • Workshops/conferences attended
  • Use of new technology

During our discussions about staff learning, we felt the need for what we called "anecdotal" material: "data" drawn from the responses of staff members who had taken advantage of opportunities for learning (and we have begun to get a bit of feedback from some staff members about their own experiences). Because the university has recently redirected resources to provide a staff development program, headed by Naomi Goodwin and Robin Innes, we expect to find not only more courses and workshops but also more data to support the "culture of evidence" about staff learning.

We know about some studies that have been done on campus—including some longitudinal data—about faculty involvement in learning, but we were surprised at how little material has been gathered about staff learning. We anticipate that correcting this deficiency will be a high priority for our work in the Fall.

Subcommittee Report: Campus Environment for Learning

Ray Young

The campus environment for learning is a deceptively straightforward construct. To many observers, the first environments that come to mind are the meso-scale "bricks and mortar" of the campus, such as particular buildings or their internal classrooms and offices, and the infrastructure necessary to make them function effectively. Yet our learning environments reach far beyond that, while also operating in more subtle, behavioral domains. A full assessment of the environment for learning must include macro-level components literally from "A" (the Arboretum) to "Z" (the Desert Studies Center at Zzyzx).

The subcommittee also noted the importance of service and business environments that can either facilitate learning or distract from it. These include such components as Admissions and Records, the Disabled Student Services office, campus food services, the computer "Roll Out," the Library, unforgettable parking, public safety, and the Titan Student Union. Moreover, the essence of Cal State Fullerton is expressed by its connections with the larger regional community, through which the public learns about our breadth and strengths. Those connectivity environments range from athletic events and fine arts programs to special recruitment or fundraising efforts and CLE, the Continuing Learning Experience.

We have identified more than 50 distinct components of the campus learning environment which may provide indicators of how well we are implementing our campus Mission and Goals. Space available here does not permit a full recitation of that listing. However, the subcommittee has arrived at a consensus about the particular components that deserve closer attention for the WASC accreditation process. Based on a priority ranking system and group discussions, those are (a "T" marks tied ranks; see the sketch at the end of this report):

      1. Classrooms
      2. General Campus Aura
      3. Landscaping & Pathways
      4. Parking
      5. Faculty Offices
      6. Building Appearance
      7. Safety Elements, including campus lighting
      8. Admissions & Records
      9. Mission Viejo Campus
      10T. Physical Plant and Support Services
      10T. Service Areas / Work Rooms
      12. Staff and Administrative Offices
      13. Student Services Units
      14T. Outdoor Gathering Places
      14T. Student Interactive Spaces
      16T. Residence Halls
      16T. Student Organizations
      18. Titan Student Union (aka University Center)

Presenting such a list quickly raises at least two interpretive questions: Do these components represent areas of concern, or are they components generally believed to be important attributes of a strong university . . . or both? And could some components of the campus environment for learning have been omitted from this list because they are now perceived as functioning quite well (such as the Library)?

While there are various sources of evidence to paint a clear picture of some of these components, we have a long way to go in understanding how users (various groups of learners) rate the importance of, and satisfaction with, other elements. The subcommittee plans to conduct further research, including focused surveys, during the coming months to expand our state of knowledge about many of these themes. A reexamination of existing evidence, coupled with new perspectives, will enable us to give a more thorough assessment to the WASC reviewers but, just as importantly, provide planning guidance to on-campus decision-makers long after the formal WASC process has concluded.
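
A note on notation: the "T" entries in the ranked list above mark tied ranks under the subcommittee's priority ranking. Purely as an illustration, the sketch below shows one standard way (competition ranking, where tied items share a rank and the next distinct item skips ahead) to derive such labels from a set of scores; the scores are hypothetical placeholders, not the subcommittee's actual data.

```python
# Competition ("1224") ranking with a "T" suffix for ties, as in the
# subcommittee's list above. Scores here are invented placeholders.

def competition_ranks(items):
    """items: list of (name, score). Returns (label, name) pairs, best first."""
    ordered = sorted(items, key=lambda kv: kv[1], reverse=True)
    labels = []
    i = 0
    while i < len(ordered):
        j = i
        while j < len(ordered) and ordered[j][1] == ordered[i][1]:
            j += 1                        # extend the run of tied scores
        rank = i + 1                      # rank = 1-based position of the run
        for name, _ in ordered[i:j]:
            labels.append((f"{rank}T" if j - i > 1 else str(rank), name))
        i = j                             # next distinct score skips the ties
    return labels

sample = [("Classrooms", 9.0), ("Parking", 8.2), ("Physical Plant", 6.5),
          ("Service Areas / Work Rooms", 6.5), ("Staff Offices", 6.1)]
for label, name in competition_ranks(sample):
    print(label, name)   # prints ranks 1, 2, 3T, 3T, 5
```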

Resources and Data

Dolores Vura compiled a list of the most readily available data (some of these have been distributed to the Task Force). Here’s where the campus community can help the Task Force enormously: what measures do you know about that we have neglected to mention here? Contact us by filling out the response form in this newsletter or through our Web site at http://WASC.fullerton.edu.

  • Profiles of New Undergraduate Students, Fall 1997
  • Undergraduate Student Focus Groups Report
  • Student Needs and Priorities Survey [SNAPS] Spring, 1994; Report Fall, 1994
  • Faculty Selected Statistics, Fall, 1980 to Fall, 1997 (includes age projections)
  • Educational Equity Retention Grant; Reports on survey results
  • Campus Climate Report, Student Affairs
  • Classroom Renovation Report

 

Other materials available:

  • Statistical Handbooks, Fall, 1986 through Fall, 1997
  • Guidelines for Annual Reports and Program Performance Reviews
  • Increasing Student Learning Proposals, Guidelines, and reports from nine funded projects
  • Strengths, Weaknesses, Opportunities, and Threats [SWOT] Analysis from the University Planning Committee
  • Graduation Rates Reports, compiled annually for the NCAA for the Fall, 1983 through Fall, 1990 cohorts.
  • Retention Grants reports, funded by President’s Office
  • Collaborative Learning Grants reports, funded under Robert and Louise Lee Collaborative Teaching Award
  • WASC Interim Report, filed December 1994
  • CSUF Mission, Goals and Strategies
  • CSU Cornerstones
  • Senate Forum articles
  • Institute for the Advancement of Teaching and Learning (IATL) documents and newsletter
  • GE Committee’s reports including new Learning Goals and curricular experiments
  • Fullerton First Year [FFY] assessment reports
  • UPS 210 revision committee
  • Social Science Research Center study on attrition.
  • SAS study of admitted students who do not enroll, Fall, 1996
  • Social Science Research Center study of faculty workload
  • Two uses of ACT outcomes survey: GE committee, and SSRC study of alumni
  • Career Development Center’s annual survey of recent graduates
  • Pat Wegner’s study of introductory chemistry students
  • Tania Marien’s MA thesis tracking learning of introductory biology majors
  • Tom Mayes’ Student Assessment Center: measuring performance outcomes of business majors
  • Lynne McVeigh’s study of MVC students, Fall, 1997
  • Senate surveys in conjunction with Spring elections
  • NSM’s NSF grant for Undergraduate Education
  • Chancellor’s Office Division of Analytic Studies Website, for comparative campus and system-wide statistics. http://www.co.calstate.edu/asd
  • Norm Page’s study of small group collaboration