One of the four participants from Brown, Rich Kogut, Director of Systems & Operations, left Brown in July to take a position with Georgetown University. George Loftus, Associate Director of Network Technology, is now responsible for the systems and networking groups. I don’t know yet whether George will be participating in the assessment project, but for now I’m assuming that we won’t be pursuing measures in this area. The remaining three participants are Florence Doksansky and Howard Pasternack of the Library and myself.
The areas where we are still committed to making some progress are:
- Library electronic resources
- Help desk
- Training
- Instructional support
- Applications support
Florence Doksansky and Howard Pasternack have worked with other Library staff to identify measures of interest to them: gathering statistics on the use of electronic resources. In addition to identifying the measures, they have carried out some of the assessments and can provide some post-assessment feedback. They are finding that the information they can gather varies depending on the vendor of the product, making comparisons difficult.
In the help desk area, we have identified six measures but have not yet carried out any of the assessments. In some cases we can get the information from existing data collection systems; in others we will have to gather it.
In the last three areas we have not yet developed assessment measures, but we intend to.
Overall, the purpose of the measures we are developing, or are likely to develop, is improvement rather than accountability. The types of measures are more likely to address extensiveness, efficiency, effectiveness, and service quality than impact or usefulness.
Some observations on the process: it has been far more difficult than I anticipated for Computing & Information Services staff to carve out the time to undertake and sustain effort on this project. The late spring, summer, and early fall have become extremely busy times, spent preparing for incoming students, supporting dorm networking, and providing the necessary support for a marked increase in instructional use of computing technology and of electronic resources. Like every other institution, we are also facing more staff turnover, so we are often barely able to keep up with the basics. Without a strong “institutional imperative” to develop assessment measures and assess on a regular basis, this type of effort gets done only as time permits.
- CNI Update 10/97
  - An update on the steps and progress being made in the various areas of assessment
- Brown University CNI Project: Assessing the Academic Network Environment – Summary of Measures
  - List of measures that are described in more detail in the following documents
- CNI Planning – Instructional Support
  - In the area of Instructional Support, describes: working name of measure; type of measure; brief description; how the data was gathered and analyzed; and the purpose and value of the measure
  - 4 measures:
    - Instructional Contacts – Requests
    - Instructional Contacts – Specialized Software
    - Instructional Uses – Services
    - Instructional Needs
- CNI Assessments – Software Distribution
  - In the area of Software Distribution, provides: type of measure; description of the measure; and the purpose of the measure
  - 3 measures:
    - Number of applications distributed
    - Annual software updates
    - Percent of central software usage
- CNI Planning
  - In the area of Help Desk, provides: working name of assessment; brief description; how the data is (or will be) gathered; how the data will be analyzed; and the purpose and value of the measure
  - 6 measures/data collection:
    - Types of Questions
    - Department Contacts
    - Telephone Trends
    - Turnaround Time
    - Customer Satisfaction Part I – Quality of Service
    - Customer Satisfaction Part II – Services Offered
- Networked Environment Training Assessment
  - Notes on evaluating computer skill classes