For the full evaluation report and study, please see the 2004 MIT OCW Program Evaluation Findings Report (March 2005) (PDF - 2.1 MB).
With 1,100 courses now available, MIT is delivering on the promise of MIT OpenCourseWare (MIT OCW). We have already heard from educators and learners around the world that they are benefiting from the materials offered freely and openly on the MIT OCW site.
In order to understand how well MIT OCW is fulfilling its mission -- as well as to establish a thorough and continuous feedback process that guarantees its improvement over time -- we have developed a substantial evaluation program. The evaluation is focused on understanding specifics in three areas of user behavior:
- Access: Who is accessing MIT OCW, what are their profiles (educator, student, self-learner, other), what are their disciplines (or other interests), and where are they located?
- Use: How do educators and learners use MIT OCW and is MIT OCW designed appropriately to facilitate that use? To what extent and in what ways are MIT course materials adopted or adapted for teaching purposes?
- Impact: What effects -- positive or negative, intended or unintended -- are being realized through the use of MIT OCW?
The evaluation was undertaken in October and November 2004. Data collection employed an integrated "portfolio approach," combining several methods to achieve both breadth and depth in the evaluation. Please note that MIT OCW received significant coverage on the CNN International television magazine program "Global Challenges" in September and October 2004, which generated unusually high levels of site access during that period, particularly an unusually high number of first-time visitors. The evaluation drew on the following data sources:
Web Analytics. Akamai, MIT OCW's Web hosting and content distribution network provider, captures aggregate usage data such as page views, object views, and user location. Akamai also offers a more sophisticated analytic tool called SiteWise, which MIT OCW has employed since November 1, 2003. Most Web usage statistics in this report have been drawn from the SiteWise tool, with the notable exception of geographic traffic information, which is drawn from Akamai's aggregate data due to its greater accuracy. Unless otherwise noted, Web statistics for this report cover the period of November 1, 2003 to October 31, 2004.
Online Intercept Surveys. Between October 25 and November 22, 2004, a survey tool invited (via pop-up window) 103,741 of the 253,597 OCW visitors for the period to complete an online survey. Of those prompted, 14,308 people began the survey, and 5,000 completed it fully, with a dropout rate of 60% and an overall completion rate of 4.8%. The sample provides a margin of error of not more than 1.5%. Self-learners -- as opposed to educators and students -- were more likely to complete the survey once started. Geographically, completion rates do not vary significantly from the overall distribution of MIT OCW traffic.
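As a rough check of the figures above, the margin of error and completion rate can be reproduced with a standard 95%-confidence calculation for a simple random sample at maximum variance (p = 0.5). This is a plausible reconstruction; the report does not state its exact method.

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Maximum 95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

prompted = 103741
completed = 5000

moe = margin_of_error(completed)        # ~1.39%, i.e. "not more than 1.5%"
completion_rate = completed / prompted  # ~4.8%
print(f"margin of error: {moe:.2%}")
print(f"completion rate: {completion_rate:.2%}")
```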
Interviews. To gather textured qualitative data about the use and impact of MIT OCW, members of the MIT OCW research team conducted 20 in-depth interviews with willing participants drawn from intercept and supplemental survey respondents whose responses sparked the curiosity of the evaluation team. Interviewees were distributed across several target regions (Latin America, China, Sub-Saharan Africa, and the Middle East and North Africa) and educational roles (educators, students, and self-learners). The interview questions and protocol are included in Appendix 4 of the MIT OCW Program Evaluation Findings Report. In addition, follow-up interviews with subjects from the 2003 MIT OCW program evaluation were conducted to learn how their use of MIT OCW and their attitudes about the impact of the MIT OpenCourseWare project have changed. Candidates were selected from the 2003 interview subjects based on geographic distribution, user role, and insightfulness of prior responses.
Site Feedback. We have implemented a database to support the processing and analysis of user email feedback. The system includes email feedback collected since October 1, 2003. The feedback system allows users to self-identify role, geographic region, and type of feedback; further, the system supports tagging of email feedback by topic, correlation of feedback to related course sites, and full-text searches of feedback messages. We have contacted users as appropriate to gather additional insight into access, use, and impact. This feedback provides anecdotal insight into the MIT OCW user experience. Unless otherwise noted, email feedback addressed in this evaluation is the 3,722 feedback messages collected from November 1, 2003 to October 31, 2004.
MIT Student Survey. In order to better understand the usefulness of MIT OCW to MIT students as a window into the sustainability of opencourseware projects, MIT OCW surveyed MIT undergraduate students using a Web survey and email invitation. On November 8, 2004, 3,900 upperclass undergraduates were invited to complete the survey; by November 19, 2004, 800 students had begun the survey and 709 had completed it fully, for a dropout rate of 11.1% and an overall completion rate of 18.1%. The margin of error for the results is calculated to be no greater than 3.33%. The text of the student survey is included in Appendix 5 of the MIT OCW Program Evaluation Findings Report.
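The reported 3.33% figure is consistent with a margin-of-error calculation that applies a finite population correction, since the 709 respondents were drawn from a known population of 3,900 invitees. This is a plausible reconstruction rather than the report's stated method:

```python
import math

def moe_with_fpc(n, N, z=1.96, p=0.5):
    """95% margin of error for sample size n, with finite population
    correction for a known population of size N."""
    base = z * math.sqrt(p * (1 - p) / n)       # ~3.68% for n = 709
    fpc = math.sqrt((N - n) / (N - 1))          # shrinks the interval
    return base * fpc

moe = moe_with_fpc(n=709, N=3900)
print(f"margin of error with FPC: {moe:.2%}")  # roughly 3.3%
```

Without the correction, 709 respondents would imply a margin of error near 3.7%; the correction accounts for the fact that the sample covers a sizable fraction of the surveyed population.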
For more information about these findings, please contact Steve Carson, MIT OCW Senior Strategist.