Evaluating Distance Education Across Twelve Time Zones

Melinda G. Cerny
Manager, External Relations
Center for Advanced Educational Services
Massachusetts Institute of Technology
Cambridge, MA  02139
Email:  cerny@mit.edu     Telephone:  617-253-5414

Jesse M. Heines
Associate Professor
Dept. of Computer Science
University of Massachusetts Lowell
Lowell, MA  01854
Email:  heines@cs.uml.edu      Telephone:  978-934-3634

This paper was published in the February 2001 issue of T.H.E. Journal, Volume 28, No. 7, on pages 18-25.



“As I stood in the foyer of our distance classroom suite, I looked in on the control room operating at maximum capacity. The monitors were a montage of student faces, faculty teaching and animated slides of exquisite clarity. Here was technology at its finest. MIT students receiving a state-of-the-art educational experience in a virtual classroom, sharing it with future colleagues who live on the other side of the world! Incredible!”

- an MIT staff member on the first day of the Singapore-MIT Alliance program 


On the evening of September 9, 1999 (EST), the Singapore-MIT Alliance (SMA) launched the first of nine new, highly collaborative engineering subjects for the fall semester. By the end of the first year, 17 new subjects would be offered to the first class of SMA students. Over the next three years, the Massachusetts Institute of Technology (MIT), the National University of Singapore (NUS), and Nanyang Technological University (NTU) plan to develop and offer five interdisciplinary graduate engineering degree programs. The goals for SMA are high: to create highly visible, world-class graduate education and research programs in areas of strategic importance to Singapore and the United States, and to form a new paradigm for distance collaboration in education, research, and “technopreneurship.”

SMA is a unique and ambitious program for many reasons, one of which is that it brings together graduate students from nearly opposite sides of the globe in one virtual classroom, crossing twelve time zones over an Internet2 connection. Through research and development by the MIT Center for Advanced Educational Services, the NUS Centre for Instructional Technology, and the NTU Centre for Educational Development, the SMA program blends state-of-the-art asynchronous and synchronous technology to create a dynamic virtual-learning environment.

SMA has been many years in the making. In the mid-1990s, the government of Singapore invited an MIT assessment team to review the engineering programs at their two universities. In November 1998, the Singapore-MIT Alliance was signed as the result of key recommendations from that team. The SMA program, co-directed by Prof. Merton C. Flemings (MIT) and Prof. Hang Chang Chieh (NUS) with Deputy Directors Prof. Anthony T. Patera (MIT) and Prof. Chua Soo Jin (NUS), includes MIT, NUS and NTU faculty from the disciplines of Mechanical Engineering, Electrical Engineering and Computer Science, Aeronautics and Astronautics, Chemical Engineering, Materials Science, and Management. The first two programs, Advanced Materials and High Performance Computation for Engineered Systems, began in July 1999. The third program, Innovation in Manufacturing Systems and Technology, began in July 2000, and the final two programs, Chemical Engineering and Computer Science, will begin in July 2001. 

Each SMA program has M.S. and Ph.D. tracks. Students accepted into the program receive degrees from NUS and NTU. Although courses are delivered asynchronously and synchronously over the Internet, an important feature of this program is “summer immersion.” In July, many MIT faculty travel to Singapore to meet their new students and, along with the Singaporean faculty, begin teaching an intensive pre-curriculum. SMA master’s students then accompany their SMA faculty advisors to MIT for two weeks in August to continue their instruction and work in their MIT faculty advisors’ labs. Ph.D. students follow in the fall and work in their MIT advisors’ labs for the entire semester. It is this combination of face time and Web-based content delivery that makes this distance-learning program a unique hybrid of brick-and-mortar and virtual classrooms.

Another unusual characteristic of the SMA program is the highly collaborative nature of the degree programs and the course work. Each course is team-taught by at least two faculty members and in some cases as many as six. Almost all courses have at least one instructor from Singapore and one instructor from MIT.


Driven by Technology

Although SMA faculty travel back and forth frequently between Singapore and the United States, most of the classroom experiences involve faculty and students on opposite sides of the planet using state-of-the-art technology. SMA faculty understand their students’ need for face-to-face interaction with their professors, making synchronous delivery of course content imperative. Classes therefore meet early in the morning and in the evening to accommodate the time zone differences.

The use of Internet2, a very high-speed adaptation of the Internet, is important to the success of the program because it allows for high quality video-conferencing. Faculty and students can move around without causing distortion to the receiving site, talk to one another with only a two-second time-delay, and view graphics that are more detailed and sophisticated than those possible using ISDN video-conferencing. 

Each live lecture is taped and then digitized and archived on the course Web site for later viewing. Other features of the Web site include a chat room, a calendar, homework assignments and reading material.


Evaluation of the Technical Delivery

Technical delivery of the SMA program has been a highly collaborative effort as well. Technical teams at the Center for Advanced Educational Services (MIT), the Centre for Instructional Technology (NUS) and the Centre for Educational Development (NTU) have faced many of the same challenges as the SMA faculty: working with different delivery systems, time zones, and physical distance. 

It was important, therefore, to conduct an assessment of the technical delivery (both synchronous and asynchronous) at a very early stage of the program. We decided to survey students on their experiences with the program during the Spring 2000 semester. Due to the formative nature of the program, the survey focused entirely on technical delivery and specifically excluded questions regarding faculty teaching in the distance-learning environment.


Data on Students

Of the 70 students enrolled in the program, 61 (87%) responded, the majority (87%) of whom are studying at NUS. About half (52%) are enrolled in the Advanced Materials program, with the other half in the High Performance Computation for Engineered Systems program. Most (92%) are in at least their second semester of the SMA program, and for almost all (97%) of the students, this is the only distance education course they have ever taken.


Students’ Use of the SMA Instructional Delivery System

Most students (89%) attend classes in person on campus in Singapore, and 87% stated that they “Always” or “Almost Always” attend the video conferencing lectures. Only 23%, however, stated that they “Always,” “Almost Always,” or “Usually” reviewed the videos of the lectures on the Web. Another 31% stated that they have reviewed at least half of the videos, while 39% stated that they have reviewed only a few. Only 5% stated that they never review the videos.

Students access the course Web site quite often, as 61% reported that they do so at least 3–5 times per week. Most (92%) access the site at high speed via a corporate or university LAN, and only 7% reported that their type of connection prevented them from using the Web site as they would otherwise. 93% access the course Web site from their university or workplace, so it does not appear that Web access is a barrier to our students.

Only 7% of students reported that they access the Discussion section of the course Web site at least once a week. 23% reported that they have not even looked at that section. Likewise, 15% reported that they access the Calendar section at least once a week, while 25% say they have not even looked at it. 

When asked to pick the three features of the on-line materials that they found most valuable, four features stood out prominently:

The next closest feature, Reading, was selected by only 13% of the students. It is interesting to note that these results are consistent with surveys that Jesse Heines has conducted among computer science students at the University of Massachusetts Lowell. [1]


Students’ Opinions on the Technical Aspects of the SMA Instructional Delivery System

Questions 16-25 asked students to rate various aspects of the instructional delivery system as “Unacceptable,” “Poor,” “OK,” “Good,” or “Excellent.” 

Aspects pertaining to the live instructional delivery system...

  1. your ability to hear the professor clearly over the audio system
  2. the professor’s ability to hear you clearly over the audio system
  3. your ability to hear questions and comments from the remote location
  4. your ability to see the lecturer at the remote site clearly through the video system
  5. your ability to see slides or projected displays through the projection system
  6. your ability to get the attention of the professor at the remote site to ask a question

Aspects pertaining to the archived instructional delivery system...

  1. your ability to hear the professor clearly
  2. your ability to hear questions and comments from others
  3. your ability to see the lecturer clearly
  4. your ability to see slides or projected displays

To analyze responses to these questions, we first converted students’ responses to a numeric scale as follows:

1 = Unacceptable     2 = Poor     3 = OK     4 = Good     5 = Excellent

We then computed the mean and standard deviation of these numeric representations. When the results are expressed as a box-and-whisker chart as shown in Figure 1, it can easily be seen that, in general, students rated the various aspects of the instructional delivery system between “OK” and “Good.” [2] These evaluation and data display techniques are useful for identifying those areas in which the development team needs to focus its efforts to improve student satisfaction with the system.


Figure 1. Student responses to survey questions 16-25. The cross mark represents the mean, and the vertical lines represent ±1.0 standard deviation. [2]
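
For readers who want to replicate this analysis, the short Python sketch below applies the same label-to-number conversion and computes the mean and standard deviation for a single question. The responses shown are hypothetical placeholders, not the actual survey data.

from statistics import mean, stdev

# Numeric scale used for the rating questions (16-25).
SCALE = {"Unacceptable": 1, "Poor": 2, "OK": 3, "Good": 4, "Excellent": 5}

# Hypothetical responses to a single question; these are illustrative only,
# not the actual SMA survey data.
responses = ["Good", "OK", "Good", "Excellent", "OK", "Poor", "Good", "Good"]

scores = [SCALE[r] for r in responses]   # convert labels to 1-5
m = mean(scores)
s = stdev(scores)                        # sample standard deviation

print(f"mean = {m:.2f}, standard deviation = {s:.2f}")
print(f"range plotted in Figure 1: {m - s:.2f} to {m + s:.2f}")  # mean +/- 1 SD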


Questions 26-30 asked students to further evaluate various aspects of the instructional delivery system by indicating how they felt about five carefully worded statements using a Likert Scale. For each question they were to indicate whether they “Strongly Agreed,” “Agreed,” had “No Opinion,” “Disagreed,” or “Strongly Disagreed” with the statement. These questions’ results are shown in Table 1.

Table 1. Responses to survey questions 26-30
(SA = Strongly Agree, A = Agree, NO = No Opinion, D = Disagree, SD = Strongly Disagree)

  26. I found that information presented using PowerPoint slides (or other professional visual techniques) was clearer than information presented using blackboards or whiteboards.
      SA: 9    A: 25    NO: 16    D: 9    SD: 1

  27. I found that I learned better when the presenter used a blackboard or whiteboard than when he or she used PowerPoint or other professional visual techniques.
      SA: 3    A: 25    NO: 17    D: 14    SD: 1

  28. It is important to me that the videos are available on-line within 1 working day of the live lecture.
      SA: 15    A: 24    NO: 15    D: 5    SD: 1

  29. I learned as much from this course as I would have if it were taught locally at my campus.
      SA: 8    A: 25    NO: 10    D: 15    SD: 2

  30. The scheduling of early morning and late afternoon classes was a problem for me.
      SA: 5    A: 28    NO: 14    D: 13    SD: 0

In four of the five questions, responses spanned the entire range from “Strongly Agree” to “Strongly Disagree,” and the range in the other question spanned four of the five possible responses. Thus, there is either a wide range of opinion about these statements or a wide range of misunderstandings about what they meant. We therefore caution against reading too much into these responses, but one can see that students tended to agree with each statement.
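
As a worked example, the counts in Table 1 can be summarized numerically in the same spirit as the rating questions. The Python sketch below uses an assumed mapping of 1 (“Strongly Disagree”) through 5 (“Strongly Agree”), which is our own convention rather than one stated in the survey, and computes a weighted mean for each statement; values above 3 lean toward agreement.

# Weighted means for the Likert statements in Table 1, using an assumed
# 1 (Strongly Disagree) through 5 (Strongly Agree) numeric mapping.
WEIGHTS = {"SA": 5, "A": 4, "NO": 3, "D": 2, "SD": 1}

# Response counts copied from Table 1 (questions 26-30).
counts = {
    26: {"SA": 9,  "A": 25, "NO": 16, "D": 9,  "SD": 1},
    27: {"SA": 3,  "A": 25, "NO": 17, "D": 14, "SD": 1},
    28: {"SA": 15, "A": 24, "NO": 15, "D": 5,  "SD": 1},
    29: {"SA": 8,  "A": 25, "NO": 10, "D": 15, "SD": 2},
    30: {"SA": 5,  "A": 28, "NO": 14, "D": 13, "SD": 0},
}

for question, row in counts.items():
    n = sum(row.values())
    weighted_sum = sum(WEIGHTS[label] * count for label, count in row.items())
    print(f"Question {question}: n = {n}, mean = {weighted_sum / n:.2f}")

With this mapping, every statement’s mean falls between about 3.2 and 3.8, which is consistent with the observation that students tended to agree with each statement.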


Students’ Impressions of the Technical Aspects of the SMA Instructional Delivery System

The next set of questions sought to capture students’ impressions of the SMA Instructional Delivery System using a Semantic Differential. As in the Likert Scale questions, student responses in this section spanned a wide range, but they tended to be grouped somewhat more tightly than the Likert Scale responses. We may be able to put a little more weight on these responses, but again we would caution against interpreting these results too strongly.

The survey presented six Semantic Differential items, each a pair of opposite terms with six possible positions between them, as shown below.

31. professional  O  O  O  O  O  O  amateurish

As before, we converted these responses to numbers, computed the means and standard deviations, and plotted them in Figure 2. It can be seen that these responses had smaller standard deviations and therefore tended to be grouped a little more strongly than responses for the Likert Scale questions. It can also be seen that, as hoped, students tended to characterize the system as “professional,” “polished,” “effective,” “easy,” “clear,” and “helpful” as opposed to their semantic opposites.


Figure 2. Student responses to survey questions 31-36. The cross mark represents the mean, and the horizontal lines represent ±1.0 standard deviation.
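
For those who wish to reproduce this kind of display, the sketch below plots per-item means with horizontal bars spanning ±1.0 standard deviation, in the general style of Figure 2, using matplotlib. The means and standard deviations are illustrative placeholders, not the actual survey results; only the six left-hand terms come from the survey itself.

import matplotlib.pyplot as plt

# Plot per-item means with horizontal bars spanning +/- 1.0 standard deviation,
# in the general style of Figure 2. The means and standard deviations below are
# illustrative placeholders, not the actual SMA survey results.
items = ["professional", "polished", "effective", "easy", "clear", "helpful"]
means = [2.1, 2.4, 2.3, 2.6, 2.2, 2.0]   # 1 = left-hand term, 6 = its opposite
stds  = [0.8, 0.9, 0.7, 1.0, 0.8, 0.7]

positions = list(range(len(items)))
plt.errorbar(means, positions, xerr=stds, fmt="x", capsize=4)
plt.yticks(positions, items)
plt.xlim(1, 6)
plt.xlabel("mean response (1 = left-hand term, 6 = its semantic opposite)")
plt.gca().invert_yaxis()   # show the first item at the top
plt.tight_layout()
plt.show()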


Free Form Comments on the SMA Instructional Delivery System

The final set of questions gave students a chance to express their feelings about the system in a free form manner. We asked two questions to stimulate their responses:

  37. How did taking this course via distance-learning affect the quality of the course?
  38. On the back of this sheet, please share with us any ideas, suggestions, or concerns you would like us to know about.

Thirty-eight students (62%) responded to Question 37, and 22 (36%) responded to Question 38. Following is a sampling of representative student responses (with spelling and grammatical errors corrected).

Representative Responses to Question 37

How did taking this course via distance-learning affect the quality of the course?

Representative Responses to Question 38

Please share with us any ideas, suggestions, or concerns you would like us to know about.


Conclusions

We have cautioned against placing too much emphasis on the analytical parts of this survey, but students’ free responses do seem to support the main points borne out by that analysis. With those reservations stated, we offer the following conclusions.

As we look ahead, we hope to use SMA students’ suggestions to strengthen our delivery. Faculty members are discussing the possibility of pre-recording their lectures and putting them on the Web to view prior to synchronous class time. Live time together can then be better spent in discussion and problem solving between the students and faculty. The joint technical team from MIT, NUS, and NTU is working to create new tools to facilitate more effective learning during the synchronous class time.

For more information about SMA and technology-enabled learning developments from MIT, NUS and NTU, please visit the following Web sites:

Special Note of Thanks: We gratefully thank our colleagues at NUS for their participation in conducting this survey. Through their diligence, we received an exceptionally high rate of completed surveys (87%). Without their efforts, this evaluation could not have happened. We also thank the SMA leadership for affording us the opportunity to survey their students.


Footnotes

  1. Heines, Jesse M. (2000). Evaluating the effect of a course Web site on student performance. Journal of Computing in Higher Education, 12(1):57-83. Available on-line at https://jesseheines.com/~heines/academic/papers/2000thejournal/
  2. The authors acknowledge the assistance of Dr. Shelley Rasmussen, Associate Professor of Mathematics at the University of Massachusetts Lowell, for her recommendation to use box-and-whisker charts to express these data graphically.

This work was done while Jesse Heines was a Visiting Scholar at the MIT Center for Educational Computing Initiatives, a subsection of the Center for Advanced Educational Services.