Editor’s Note: This study is based on a graduate research methods course taught online. It measured the concerns about technology integration of 23 K-12 teachers as they progressed through the course. Significant changes were measured in awareness, informal, personal, management, consequence, collaboration, and refocusing concerns as a result of this semester-long experience.

 

Experimental Effects of Online Instruction on Teachers’ Concerns About Technology Integration

Yuliang Liu, Peter Theodore, Ellen Lavelle

Abstract

This study investigates the experimental effects of online instruction in a graduate research methods course on K-12 teachers’ concerns about technology integration. The concerns of twenty-three K-12 teachers regarding technology integration were measured using the Stages of Concern Questionnaire (SoC) both before and after completing an online course. The concerns were measured along seven dimensions: awareness, informal, personal, management, consequence, collaboration, and refocusing. Significant changes in all seven dimensions were found after the teachers’ participation in a graduate online course. Discussion includes important implications for future K-12 teacher education programs and suggestions for further research in this area.

Introduction

In recent years, because of support from various funding sources, the percentage of public schools connected to the Internet has continued to increase in the United States: from 35% in 1994, to 95% in 1999, to 98% in 2000, regardless of grade level, poverty level, or metropolitan status. In addition, the ratio of students to instructional computers in public schools reached 5 to 1 (Cattagni & Farris, 2001; Education Week, 2001; Williams, 2000). This ratio is considered a reasonable level for the effective use of computers within schools (President’s Committee of Advisors on Science and Technology [PCAST], 1997). However, according to Education Week (2001), we should go beyond machines; that is, human factors are as important as hardware and software improvements. One human factor crucial to enhancing students’ use of technology is teachers’ engagement with computers.

Despite increases in resources and training opportunities, according to Rowand (2000), several factors still affect teachers’ use of computers and the Internet in classrooms. The first factor is years of teaching experience: newer teachers are more likely to use computers or the Internet to facilitate teaching activities than those with 20 or more years of teaching experience. The second factor is poverty level: teachers in wealthy school districts are more likely to use computers or the Internet in teaching than those in poor school districts. In addition, only about one third of the teachers surveyed reported feeling well prepared or very well prepared to use computers or technology in teaching. One of the most frequently cited reasons that experienced classroom teachers do not use technology in their teaching is that they find it difficult to implement in the regular classroom (Picciano, 1994; Sheingold & Hadley, 1990). Even experienced teachers who take the initiative to upgrade their technology skills through activities such as reading, hands-on practice, and K-12 instruction may require as long as five years to fully master computer-based technology integration. Moreover, many teachers attended college before computers were used in the classroom, so they did not benefit from exposure to models of effective technology integration in their content areas. This lack of experience, and a general lack of confidence regarding classroom applications of computers, may foster teacher attitudes that do not serve the full and useful integration of technological resources in the classroom (Sheingold & Hadley, 1990).

Another factor that affects teachers’ use of computers and the Internet in the classroom is teachers’ attitudes or concerns. According to recent research (Atkins & Vasu, 2000; Gbomita, 1997; Snider & Gershner, 1999), teachers’ attitudes or concerns, as one of several important human factors, have a significant influence on their adoption or implementation of computers in the classroom. According to Mills (1999), elementary school teachers’ concerns about and perceptions of an integrated learning system (ILS) affect the way they implement that ILS. It can be inferred that one’s attitude toward or concern about technology is a critical factor in how rapidly and successfully one integrates technology into one's teaching. Thus, for schools expecting to integrate computer technology into teaching, teachers’ concerns about technology integration must be considered. In addition, according to Norton and Sprague (1998), teachers’ concerns about technology integration can be changed in subtle ways by technology integration workshops, but not to the extent that they result in substantial changes in teaching practices. Finally, Liu, Lavelle, and Andris (2002) found that participation in online courses resulted in a modification of K-12 teachers’ attitudes as measured by locus of control. The purpose of this study is to investigate the effects of online instruction on teachers’ concerns about technology integration in schools.

Online education is increasing rapidly at all levels of education worldwide (Kearsley, 2000). This increase has positively influenced many aspects of education, such as learning and teaching styles, both directly and indirectly (CEO Forum, 2000). Much current research focuses on learners’ achievement and course evaluations as related to online education (Russell, 1999; Kearsley, 2000), and relatively little attention has been paid to the effects of online instruction on learners’ attitudes. As more and more K-12 teachers pursue advanced degrees in programs that utilize various computer and communication technologies, it is increasingly important to investigate the effects of online instruction on teachers’ attitudes or concerns about technology integration.

The research regarding the effects of technology on student learning and attitudes is somewhat mixed. On one hand, Clark (1983, 1994) maintained that media do not influence learning under any conditions. On the other hand, Kozma (1994) argued that technologies such as computers and video can influence learning by interacting with an individual’s cognitive and social processes in constructing knowledge. More recent studies have supported the effects of technological media on learners’ attitudinal dimensions, such as locus of control (Liu, Lavelle, & Andris, 2002; Swan, Mitrani, Guerrero, Cheung, & Schoener, 1990), learning styles (Ching, 1998), and concerns about the use of the media (Rudden & Mallery, 1996). In addition, the effects of participation in an online professional development course on school administrators’ ideas about technology integration and on methods to support teachers’ integration practices in K-12 schools have also been reported (Ertmer, Bai, Dong, Khalil, Park, & Wang, 2002).

For example, in Liu, Lavelle, and Andris’ (2002) study, 12 graduate instructional technology students participated in an online course (“Distance Education”) in spring 2001. Rotter’s (1966) locus of control (LOC) scale was used to measure LOC change at the beginning, middle, and end of the semester. At the completion of that online course, all students were found to have shifted their LOC from external toward internal. That is, the results indicated that online instruction can promote positive beliefs about one’s confidence in managing technology, as measured by locus of control. In addition, Rudden and Mallery (1996) studied the effects of short-term Internet instruction on preservice teachers’ concerns about technology integration. That study involved 53 college sophomores in elementary education. All participants were required to use the Internet to complete two directed academic tasks with a partner. One task was to find a Web site related to a special interest and integrate it into a literacy lesson. The other task was to develop an annotated bibliography of five Web sites useful for teachers. Participants were found to increase their concerns in four of the seven areas (awareness, information, consequence, and refocusing), as measured by Hall, George, and Rutherford’s (1977) Stages of Concern Questionnaire (SoC). This indicates that even short-term Internet instruction can promote some of preservice teachers’ concerns about technology integration. However, that study involved only preservice teachers completing two academic tasks during a short period of online participation.

The present study is aimed at investigating the effects of online instruction on K-12 inservice teachers’ concerns about technology integration in instruction during a semester-long graduate Research Methods in Education course. Specifically, the research hypothesis can be stated as follows:

Hypothesis: Students will have higher scores, as measured by Hall et al.’s (1977) Stages of Concern Questionnaire (SoC), at the completion of a graduate online course compared to the beginning of that online course.
 

Methods

Participants

The lead investigator in this study was the instructor of a graduate online course, Research Methods in Education, at a midwestern state university in fall 2001. The lead investigator had taught the same course in traditional classrooms in summer 2001 and had previous online teaching experience in other instructional technology courses. Participants were 28 graduate students enrolled in an online section of Research Methods in Education, a core course for the master’s degree in education. Students received extra points as an incentive for participation in the study. After all students were recruited and agreed to participate, they were asked to complete consent forms and demographic surveys. For the majority of the participants, this was their first online course. Five students dropped out during the semester for various reasons, such as technological problems or family issues. Thus, 23 participants were included in the final analysis.

The participants’ survey results indicate that all participants were majoring in one of three graduate areas: education, elementary education, or instructional technology. A majority of the participants were in the first or second year of their graduate study. Other demographic information on the participants (e.g., age, ethnicity, experience using technology, and Internet access) is shown in Table 1.

Table 1
Participants’ Demographic Information (N = 23)

Demographic Variable              Frequency    Percentage
Age
  25 or under                         3           13.0
  26-35                              13           56.5
  36-45                               6           26.1
  46-55                               1            4.3
Ethnicity
  Caucasian                          22           95.7
  Other                               1            4.3
Gender
  Male                                5           21.7
  Female                             18           78.3
Job Title
  School administrator                3           13.0
  School teacher                     18           78.3
  Other                               2            8.7
Computer Experience
  More than 5 years                  19           82.6
  2-5 years                           4           17.4
E-mail Experience
  More than 5 years                  11           47.8
  2-5 years                          10           43.5
  1-2 years                           2            8.7
Home Computer Access
  Yes                                22           95.7
  No                                  1            4.3
Home Internet Access
  Yes                                22           95.7
  No                                  1            4.3
Previous Online Class
  Yes                                 3           13.0
  No                                 20           87.0
Level of Computer Skills
  Beginning                           9           39.1
  Middle                             12           52.2
  Advanced                            2            8.7
Level of Internet Skills
  Beginning                           7           30.4
  Middle                             13           56.5
  Advanced                            3           13.0

Independent Variable

The independent variable in this study was online instruction, delivered completely online in WebCT. A hybrid of instructional techniques considered effective for online learning (Clark, 1999) was employed in this course. Several major features of WebCT were used throughout the semester. (1) An online objective chapter quiz was administered every week and graded automatically, so students received immediate feedback. (2) The bulletin board was used every week to answer and discuss each chapter’s essay questions and for mutual critiques among students. (3) The online synchronous chatroom was used to discuss course-related assignments and for other communication. (4) Students were required to complete a cooperative three-person group project through various communication methods, such as bulletin board discussion, the online chatroom, and private e-mail in WebCT, as well as telephone conversations. In addition, to reduce learners’ anxiety and maximize learning efficiency, two face-to-face technical orientations were conducted at the beginning of the fall 2001 semester.

Experimental Design

This study used a single-group pretest-posttest design. Specifically, the participants were pretested with the Stages of Concern Questionnaire (Hall et al., 1977) at the first-week face-to-face orientation meeting in fall 2001. The participants were then exposed to the online WebCT environment from the first week through the final week. Finally, the participants were posttested online with the same instrument in the final week. The mean differences on each of the seven scales of the SoC Questionnaire were tested statistically to determine whether there were any significant changes in concerns.

Instrument

Stages of Concern. The Stages of Concern Questionnaire is an established instrument that focuses on K-12 teachers’ concerns about an innovation. For this study, the innovation is defined as technology integration in teaching, such as using the Internet or computers to accomplish instructional objectives. The SoC Questionnaire developed by Hall et al. (1977) is widely used to assess concerns about technology. The advantage of the SoC instrument is that it can measure, over time, the continuum of concerns an individual may develop related to technology integration in teaching. The instrument assesses seven stages of concern. (1) Stage 0 is called awareness (e.g., “I am not concerned about the Internet.”). (2) Stage 1 is called informal (e.g., “I’d like to know more about the Internet.”). (3) Stage 2 is called personal (e.g., “How will the use of the Internet affect me?”). (4) Stage 3 is called management (e.g., “How much time do I need to get my materials ready when using the Internet?”). (5) Stage 4 is called consequence (e.g., “How will my use of the Internet affect my students’ learning?”). (6) Stage 5 is called collaboration (e.g., “I am concerned about relating my use of the Internet with other instructors.”). (7) Stage 6 is called refocusing (e.g., “I have some ideas about how something may work better.”). According to Hall et al., these seven stages of concern can be divided into internal and external concerns. The seven stages are distinctive but not necessarily mutually exclusive. Educators in the pre-teaching and early-teaching phases are likely to have concerns related to self (internal). In the late-teaching phase, there tends to be a shift toward concerns that focus on student learning and professional development (external). The SoC Questionnaire is appropriate for this study since most participants were K-12 teachers enrolled in a graduate education program whose jobs increasingly demand the integration of instructional technologies into their teaching.

The instrument consists of 35 items that participants rate on an eight-point Likert scale ranging from “not true of me now” (0) to “very true of me” (7). Participants choose the degree to which each concern is true of them. High numbers indicate high concern, low numbers indicate low concern, and 0 indicates an item of very low concern or one that is completely irrelevant. Five statements represent each of the seven stages, and all 35 items appear in the instrument in mixed order. The raw score for each scale is the simple sum of the responses to the five statements on that scale. Internal reliability, estimated with Cronbach’s alpha, ranged from .64 to .83 across the seven scales. The validity of the questionnaire was assessed using different strategies, such as intercorrelation matrices and interview judgments. Hall et al. (1977) also found that the correlations on the original 195-item questionnaire were higher near the diagonal, supporting the idea that each scale was more like the scales immediately surrounding it than those farther away. In addition, validity and reliability have subsequently been examined in other studies, and the original findings have been supported.
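To make the scoring procedure concrete, the sketch below sums five 0-7 item responses into a raw score for each of the seven scales. It is a minimal illustration only: the item-to-stage assignment shown here is hypothetical, since the actual SoCQ scoring key is published in Hall et al. (1977) and is not reproduced in this article.

```python
# Illustrative scoring sketch for a 35-item, 7-scale SoC-style questionnaire.
# The item-to-stage key below is hypothetical; the real key appears in the
# SoCQ manual (Hall, George, & Rutherford, 1977).
from typing import Dict, List

NUM_ITEMS = 35
STAGES = ["awareness", "informal", "personal", "management",
          "consequence", "collaboration", "refocusing"]

# Hypothetical key: items 1-35 assigned to stages in blocks of five.
# (In the real instrument the 35 items appear in mixed order.)
ITEM_KEY: Dict[str, List[int]] = {
    stage: list(range(i * 5 + 1, i * 5 + 6)) for i, stage in enumerate(STAGES)
}

def raw_scale_scores(responses: Dict[int, int]) -> Dict[str, int]:
    """Raw score for each scale = simple sum of its five 0-7 item responses."""
    return {stage: sum(responses[item] for item in items)
            for stage, items in ITEM_KEY.items()}

# Example: a respondent who rates every item 4 gets a raw score of 20 per scale.
example = {item: 4 for item in range(1, NUM_ITEMS + 1)}
print(raw_scale_scores(example))
```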

Dependent Variables

There were two dependent variables. The first was the concern scores on all seven stages: awareness, informal, personal, management, consequence, collaboration, and refocusing. The SoC Questionnaire was administered on two occasions, as a pretest and a posttest. Since most participants were not familiar with WebCT, the pretest was administered in paper-and-pencil format in the first face-to-face orientation meeting in fall 2001, measuring the initial state of the learners’ characteristics before online instruction. The posttest was administered online in the final week, measuring the state of those characteristics after a semester of online instruction.

The other dependent variable was participants’ academic performance, based on their final grades in the course. The final grades were based on the following components: (a) individual weekly essays and critiques (30%); (b) bulletin board discussion and group project (30%); (c) weekly online quizzes (30%); (d) an individual reflection statement on the group project and mutual evaluation of the group members (5%); and (e) participation and involvement in this research project (5%). Of the 23 participants, 21 received an “A” and 2 received a “B” for their final course grades.
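The grading scheme is a straightforward weighted sum. The short sketch below illustrates the weighting; only the weights come from the course description above, and the component scores used in the example are hypothetical.

```python
# Illustrative weighted-grade computation; weights are from the course
# description, component scores are hypothetical placeholders.
WEIGHTS = {
    "weekly_essays_and_critiques": 0.30,
    "bulletin_board_and_group_project": 0.30,
    "weekly_online_quizzes": 0.30,
    "reflection_and_peer_evaluation": 0.05,
    "research_participation": 0.05,
}

def final_grade(scores):
    """Weighted sum of component scores, each on a 0-100 scale."""
    return sum(WEIGHTS[name] * score for name, score in scores.items())

# Hypothetical student: 90 on everything except the quizzes (85).
example = {name: 90.0 for name in WEIGHTS}
example["weekly_online_quizzes"] = 85.0
print(final_grade(example))  # 88.5
```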

Results and Discussion

All data were coded and analyzed using SPSS 11, and paired-sample t tests were used to compare mean differences between pretest and posttest scores. The means and standard deviations of the scores for the two administrations (pretest and posttest) are shown in Table 2. In addition, a graphical representation of the paired mean differences in the concern scores for all seven stages between the pretest and posttest is shown in Figure 1.
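For readers who do not use SPSS, the sketch below shows an equivalent paired-sample t test in Python with SciPy; this is an illustrative alternative to the SPSS 11 procedure the authors actually used. The two score arrays are synthetic placeholders; in the study, each array would hold the 23 participants’ raw scores on one SoC stage at pretest and posttest.

```python
# Sketch of a paired-sample t test equivalent to the SPSS analysis reported
# above. The data below are synthetic placeholders, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pretest = rng.integers(0, 36, size=23)             # placeholder 0-35 raw scores
posttest = pretest + rng.integers(0, 10, size=23)  # placeholder posttest gains

t_stat, p_value = stats.ttest_rel(pretest, posttest)  # paired t, df = n - 1 = 22
print(f"t({len(pretest) - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```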

Table 2
Means and Standard Deviations of the Instrument Scores
in the Pretest and Posttest

Stage                    Test Type    N      Mean     Std. Deviation   Std. Error Mean
Stage 0 Awareness        Pretest     23     5.6087       4.15319           .86600
                         Posttest    23    12.5217       4.63062           .96555
Stage 1 Informal         Pretest     23    20.3043       5.98053          1.24703
                         Posttest    23    28.3478       4.59850           .95885
Stage 2 Personal         Pretest     23    21.6087       8.12258          1.69367
                         Posttest    23    29.6957       6.34203          1.32240
Stage 3 Management       Pretest     23    14.8696       8.44934          1.76181
                         Posttest    23    22.0870       7.15995          1.49295
Stage 4 Consequence      Pretest     23    22.2609       6.64843          1.38629
                         Posttest    23    27.6957       6.24088          1.30131
Stage 5 Collaboration    Pretest     23    22.1304       6.89733          1.43819
                         Posttest    23    27.5217       8.55926          1.78473
Stage 6 Refocusing       Pretest     23    23.3043       5.83434          1.21654
                         Posttest    23    30.6522       5.47398          1.14140

Figure 1. Mean differences in seven stages
between pretest and posttest scores

Interpreted according to Hall et al. (1977), Figure 1 indicates two peak stages in the concern scores: stage 2 (personal) and stage 6 (refocusing). That is, participants in this study not only had strong concerns about the possible effects of using the Internet on themselves, but also had ideas about how technology integration could work better in their teaching. In addition, there was a consistent increase in all seven stages at the end of the online course compared with the stage scores at the beginning of the course. The results of the paired t tests between pretest and posttest are shown in Table 3.
 

Table 3
Results of Paired t Tests of the Instrument Scores
between the Pretest and Posttest

                                   Paired Differences (Pretest - Posttest)
Stage                      Mean   Std. Deviation   Std. Error   95% CI Lower   95% CI Upper        t      df
Awareness (Stage 0)       -6.91        3.18            .66          -8.29          -5.54        -10.44**   22
Informal (Stage 1)        -8.04        5.94           1.24         -10.61          -5.47         -6.49**   22
Personal (Stage 2)        -8.09        7.72           1.61         -11.43          -4.75         -5.02**   22
Management (Stage 3)      -7.22        6.09           1.27          -9.85          -4.58         -5.68**   22
Consequence (Stage 4)     -5.44        6.39           1.33          -8.20          -2.67         -4.08**   22
Collaboration (Stage 5)   -5.39        7.94           1.66          -8.83          -1.96         -3.26*    22
Refocusing (Stage 6)      -7.35        4.01            .84          -9.08          -5.62         -8.80**   22

Note: * p < .01. ** p < .001.
 

Table 3 indicates that significant differences between pretest and posttest were found in all seven stages of the SoC instrument (p < .01): awareness (Stage 0), informal (Stage 1), personal (Stage 2), management (Stage 3), consequence (Stage 4), collaboration (Stage 5), and refocusing (Stage 6). Thus, the hypothesis was supported.

All the participants developed significantly higher concern scores about technology integration at the completion of the online course compared to its beginning. Specifically, significant pretest-posttest differences were found in Stage 0 scores (t = -10.44, df = 22, p < .001), Stage 1 scores (t = -6.49, df = 22, p < .001), Stage 2 scores (t = -5.02, df = 22, p < .001), Stage 3 scores (t = -5.68, df = 22, p < .001), Stage 4 scores (t = -4.08, df = 22, p < .001), Stage 5 scores (t = -3.26, df = 22, p < .01), and Stage 6 scores (t = -8.80, df = 22, p < .001).
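As an arithmetic check on the reported values, each t statistic in Table 3 is the mean paired difference divided by its standard error. For Stage 0 (awareness), using the values in Table 3:

t = \frac{\bar{d}}{s_d / \sqrt{n}} = \frac{-6.91}{3.18 / \sqrt{23}} \approx \frac{-6.91}{0.66} \approx -10.4, \qquad df = n - 1 = 22,

which agrees with the reported t = -10.44 up to rounding of the tabled mean and standard deviation.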

Thus, online instruction effectively changed the participants’ concerns about technology integration in the schools, including both internal concerns (related to self) and external concerns (related to student learning). The first four stages (awareness, informal, personal, and management) are internal; the last three (consequence, collaboration, and refocusing) are external. For educators, changes in both internal and external concerns are important for technology integration.

However, the above findings are not entirely consistent with previous research results. Rudden and Mallery’s (1996) study reported significant differences between pretest and posttest in only four concern areas: awareness, information, consequence, and refocusing. This inconsistency may be related to several factors. The first is experimental duration: Rudden and Mallery’s study involved only short-term online instruction, whereas this study involved a semester-long course. The second is the experimental tasks: Rudden and Mallery’s study involved only two academic tasks, whereas this study involved numerous course-related assignments and tasks. The third is the research participants: Rudden and Mallery’s study involved preservice teachers at the undergraduate level, whereas this study involved K-12 inservice teachers at the graduate level.

Important Applications for K–12 Education

Since few studies have investigated the effects of online instruction on K-12 teachers’ concerns about technology integration in the schools, this is an important exploratory study in this area. The study indicates that online instruction can effectively heighten K-12 teachers’ concerns about technology integration. This result has significant practical implications for K-12 teacher education, since all K-12 teachers are encouraged to use technology in classroom instruction to improve students’ learning, and it also promises contributions to the concerns literature in the area of technology integration. Based on the results of this study, more online instruction should be incorporated into educational programs; embedded online courses may be used in place of lengthier and more costly training. However, since this study used a single-group pretest-posttest design, care should be taken when generalizing to other environments. Further investigation of this topic using control-group designs is therefore needed.

References

Atkins, N. E., & Vasu, E. S. (2000). Measuring knowledge of technology usage and stages of concern about computing: a study of middle school teachers. Journal of Technology and Teacher Education, 8(4), 279-302.

Cattagni, A., & Farris, E. (2001). Internet access in U.S. public schools and classrooms: 1994-2000. Education at a Distance, 15(6). Retrieved December 7, 2002, from http://www.usdla.org/html/journal/JUN01_Issue/article04.html.

CEO Forum (2000). The CEO Forum: School technology and readiness report [Online]. Washington, DC: CEO Forum. Available: http://www.ceoforum.org/.

Ching, L. S. (1998). The influence of a distance-learning environment on students' field dependence/independence. Journal of Experimental Education, 66 (2), 149.

Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53, 445-459.

Clark, R. E. (1994). Media will never influence learning. Educational Technology, Research and Development, 42(2), 21-29.

Clark, R. E. (1999). Bloodletting, media, and learning. In T. L. Russell, The No Significant Difference Phenomenon (pp. vii – xi). Office of Instructional Telecommunications: North Carolina State University.

Education Week. (2001). Technology counts 2001: The new divides. Bethesda, MD. Editorial Project in Education, Inc. Retrieved June 27, 2002, from: http://www.edweek.org/sreports/tco1/.

Ertmer, P. A., Bai, H., Dong, C., Khalil, M., Park, S. H., & Wang, L. (2002). Online professional development: Building administrators' capacity for technology leadership. Journal of Computing in Teacher Education, 19(1), 5-11.

Gbomita, V. (1997). The adoption of microcomputers for instruction: Implications for emerging instructional media implementation. British Journal of Educational Technology, 28(2), 87-101.

Hall, G. E., George, A. A., & Rutherford, W. L. (1977). Measuring stages of concern about the innovation: A manual for use of the SoC questionnaire. Austin, TX: Southwest Educational Development Laboratory (SEDL).

Kearsley, G. (2000). Online education: Learning and teaching in cyberspace. Belmont, CA: Wadsworth.

Kozma, R. B. (1994). Will media influence learning? Reframing the debate. Educational Technology Research and Development, 42, 7-19.

Liu, Y., Lavelle, E., & Andris, J. (2002). Experimental effects of online instruction on locus of control. United States Distance Learning Association Journal, 16(6), Article 002. Retrieved June 27, 2002, from http://www.usdla.org/html/journal/JUN02_Issue/article02.html.

Mills, S. C. (1999). Integrating computer technology in classrooms: teacher concerns when implementing an integrated learning system. KS, US. (ERIC Document Reproduction Service No. ED 432 289).

Norton, P., & Sprague, D. (1998). Teachers teaching teachers: The Belen Goals 2000 professional development project. Virginia, US. (ERIC Document Reproduction Service No. ED 421109).

Picciano, A. (1994). Computers in the schools, a guide to planning and administration. New York: Merrill, MacMillan Publishing Company.

President's Committee of Advisors on Science and Technology, Panel on Educational Technology (1997). Report to the President on the use of technology to strengthen K-12 education in the United States. Available online at http://www.ostp.gov/PCAST/K-12ed.html.

Rotter, J. B. (1966). Generalized expectancies for internal versus external control of reinforcement. Psychological Monographs: General and Applied, 80 (1), 1-26.

Rowand, C. (2000). Teacher use of computers and the Internet in public schools (NCES 2000-090). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

Rudden, J. F., & Mallery, A. L. (1996). Effects of Internet instruction and computer experience on preservice teachers' concerns about its place in planning and teaching. Pennsylvania, US. (ERIC Document Reproduction Service No. ED 409592).

Russell, T. L. (1999). The No Significant Difference Phenomenon, Office of Instructional Telecommunications: North Carolina State University.

Santiago, R., & Okey, J. (1992). The effects of advisement and locus of control on achievement in learner-controlled instruction. Journal of Computer-Based Instruction, 19(2), 47-53.

Sheingold, K., & Hadley, M. (1990). Accomplished teachers: Integrating computers into classroom practices. New York: Bank Street College of Education, Center for Technology in Education.

Snider, S. L., & Gershner, V. T. (1999). Beginning the change process: Teacher stages of concern and levels of Internet use in curriculum design and delivery in one middle and high school setting. TX, US. (ERIC Document Reproduction Service No. ED 432 300).

Swan, K., Mitrani, M., Guerrero, F., Cheung, M., & Schoener, J. (1990). Perceived locus of control and computer-based instruction. Albany, NY. (ERIC Document Reproduction Service No. ED 327 140).

Wang, A. Y., & Newlin, M. H. (2000). Characteristics of students who enroll and succeed in psychology web-based classes. Journal of Educational Psychology, 92(1), 137-143.

Williams, C. (2000). Internet access in US public schools and classrooms: 1994-99 (NCES 2000-086). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

About the Authors

Dr. Yuliang Liu is assistant professor of instructional technology in the Department of Educational Leadership at Southern Illinois University Edwardsville. His major research interests are in the areas of distance education, online instruction, and research methodology. His full contact information is:

Yuliang Liu, Ph. D.
Department of Educational Leadership
Southern Illinois University Edwardsville
Edwardsville, Illinois 62026-1125 USA
Office Phone: (618) 650-3293
Fax: (618) 650-3808
E-mail: yliu@siue.edu
 

Dr. Peter Theodore is assistant professor of instructional technology in the Department of Educational Leadership at Southern Illinois University Edwardsville. His full contact information is:

Dr. Peter Theodore
Department of Educational Leadership
Southern Illinois University Edwardsville
Edwardsville, Illinois 62026-1125, USA
Office Phone: (618) 650-3291
Fax: (618) 650-3808
E-mail: ptheodo@siue.edu
 

Dr. Ellen Lavelle is associate professor of educational psychology in the Department of Educational Leadership at Southern Illinois University Edwardsville. Her full contact information is:

Ellen Lavelle, Ph. D.
Department of Educational Leadership
Southern Illinois University Edwardsville
Edwardsville, Illinois 62026-1125, USA
Office Phone: (618) 650-3945
Fax: (618) 650-3808
E-mail: elavell@siue.edu

 