January 2008 Index
 

Editor’s Note: This excellent and compelling study contradicts some popular beliefs about the significance of instructional design and organizers to facilitate student performance. It reminds us that human beings are not lab animals, and that human needs for communication and participation are sometimes more significant than theory and practice.

Assessing the Impact of Instructional Design and Organization on Student Achievement in Online Courses

Lori Kupczynski, Rebecca Davis, Philip Ice, David Callejo
United States

Abstract

The Community of Inquiry Framework posits teaching, social and cognitive presence interact to create the learning experience in online environments (Anderson, Rourke, Garrison & Archer, 2001). Though a great deal of research has been conducted to empirically validate this construct, it has done so largely from the perspective of student satisfaction and perceived learning. Using a mixed methods design, this study examined the relationship between instructional design and organization (one of the components of teaching presence) and student performance. The results suggest that much more inquiry is needed in this area as triangulation of data raised serious questions related to the perceived value of instructional design elements among certain socio-economic groups of learners.

Keywords: online learning, teaching presence, instructional design, organization, student achievement, online courses, Hispanic students, Community College.

Introduction

As enrollment of online learners continues to grow at double digit rates (Allen & Seaman, 2006), it is imperative that faculty understand those elements that redefine what it means to be a teacher within this environment (Bennett & Lockyer, 2004). While the most obvious aspect of this paradigm shift is developing an understanding of related technologies (Brown, 2003; Pittinsky, 2002), it is essential that faculty understand the move from tool usage to application of such tools in a fashion that is informed by evaluation of their impact on pedagogy/andragogy (Epper & Bates, 2001; Hiltz & Goldman, 2005).

Though several models have been proposed to explain the learning process in online environments, the one gaining the most attention is the Community of Inquiry Framework (CoI) (Garrison, 2007). Grounded in the constructivist school of thought, the CoI consists of three overlapping elements – teaching, social and cognitive presence – that coalesce to create the educational experience (Garrison, Anderson & Archer, 2000; Garrison & Archer, 2002). With a search of Google Scholar revealing more than 160 citations (Arbaugh, 2007) and confirmation through factor analysis (Arbaugh, 2007; Arbaugh & Hwang, 2006; Garrison, Cleveland-Innes & Fung, 2005), the CoI is considered a baseline for the establishment of grounded theory in online teaching and learning dynamics.

Several studies have examined the three presences (Arbaugh & Hwang, 2006; Richardson & Swan, 2003; Shea, Li, Swan & Pickett, 2005; Swan & Shih, 2005); however, research has largely assessed each in terms of its impact on student satisfaction, with few studies assessing the impact of the presences on learning effectiveness (Wise, Chang, Duffy & del Valle, 2004). This study moves in this direction by examining the relationship between one facet of teaching presence – instructional design and organization – and learning effectiveness.

Following a review of the related literature, the institutional setting is contextualized and a description of the convergent triangulation research design is presented. Interpretation of data uses a comparative construct to explain the complexity of assessing the impact of practice on performance. Finally, conclusions and directions for further research are presented in hopes of expanding on this exploratory study.

Literature Review

To promote learner satisfaction and success in the online environment, educators must examine emerging teaching methodologies and engage in critical self-reflection of their instructional practices (Bennett & Lockyer, 2004; Conrad, 2004; Long, 2002; Merriam & Caffarella, 1999; Palloff & Pratt, 1999). Among the many considerations in fostering positive outcomes for the learner are the preparation and facilitation of courses in this medium. Instructors must be willing to rethink how they will guide learners to understand material and concepts that are essential for the transfer of learning (Olgren, 2000). Thus, the perspective of the instructor regarding learning via online instruction is a large factor in the success or failure of a distance learning venture. Instructor attitudes toward the online forum tend to range from enthusiasm to skepticism. Nevertheless, the online approach to teaching is here to stay, and measures for excellence in this endeavor must be cultivated, for this is a continually expanding educational opportunity (Dziuban, Shea & Arbaugh, 2005; Palloff & Pratt, 2003).

Teaching Presence and the Community of Inquiry Framework

Viewed in a larger context, the performance of the aforementioned instructor-related tasks falls within the teaching presence construct of the Community of Inquiry Framework (CoI) (Anderson, Rourke, Garrison & Archer, 2001; Garrison, Anderson & Archer, 2000). Consisting of three overlapping presences (teaching, social, and cognitive) which coalesce in asynchronous learning communities, the CoI is considered a leading theoretical framework for understanding the co-construction of knowledge in online learning environments (Garrison, 2007; Garrison & Arbaugh, 2007). For purposes of this study, teaching presence is considered the most important; however, brief synopses of social and cognitive presence are provided to allow for contextualization.

Social presence, in the context of online learning, is described as the ability to project one's self through media and establish personal and meaningful relationships. The three main factors that allow for the effective projection and establishment of social presence are affective expression, open communication and group cohesion (Richardson & Swan, 2003; Swan & Shih, 2005).

Grounded in the work of Dewey (1933), cognitive presence is defined as the exploration, construction, resolution and confirmation of understanding through collaboration and reflection (Garrison, 2007). Garrison and Archer (2003) describe this process as consisting of four phases, beginning with creating a sense of puzzlement or posing a problem that piques learners' curiosity. As a community, course participants exchange information and integrate understandings to answer the initial problem, culminating in the resolution phase where learners are able to apply the knowledge to both course and non-course related issues.

Teaching presence, the third component of the CoI, is described by Anderson and colleagues (2001) as a three-part structure consisting of: facilitation of discourse, direct instruction, and instructional design and organization. The first element, facilitation of discourse, is necessary to maintain focus and engagement in course discussions. It also allows the instructor to set the appropriate climate for academic exchanges (Anderson et al., 2001). The authors include the following as indicators of facilitation of discourse:

  • identifying areas of agreement and disagreement

  • seeking to reach consensus and understanding

  • encouraging, acknowledging, and reinforcing student contributions

  • setting the climate for learning

  • drawing in participants and prompting discussion

  • assessing the efficacy of the process

With respect to direct instruction, Anderson et al. (2001) describe the following indicators:

  • presenting content and questions

  • focusing the discussion on specific issues

  • summarizing discussion

  • confirming understanding

  • diagnosing misperceptions

  • injecting knowledge from diverse sources

  • responding to technical concerns

Recent work by Shea and colleagues (2005) indicates that students may not perceive a difference between facilitation of discourse and direct instruction. In their research, factor analysis indicated that perhaps these first two elements should be collapsed into one category and termed directed facilitation.

With respect to instructional design and organization, the element most important to this study, Anderson et al. (2001) include the following indicators:

  • setting curriculum

  • designing methods

  • establishing time parameters

  • utilizing the medium effectively

  • establishing netiquette

Although social and content-related interactions (social and cognitive presence respectively) are necessary to facilitate learning in online environments, Garrison and colleagues (2000) contended that by themselves they are not sufficient to ensure maximization of outcomes. Interactions need to have clearly defined parameters and be focused toward established goals and objectives – in other words, application of the tenets of teaching presence (Garrison & Arbaugh, 2007). Reinforcing this assertion are a number of studies underscoring the importance of teaching presence in online learning environments (Dixon, Kuhlhorst & Reiff, 2006; Finegold & Cooke, 2006; Garrison & Cleveland-Innes, 2005; Murphy, 2004; Swan, 2003; Richardson & Swan, 2003; Swan & Shih, 2005; Wu & Hiltz, 2004). However, the vast majority of teaching presence research has focused on facilitation of discourse and direct instruction, with little attention given to instructional design and organization. Further, a review of the available instructional design and organization literature revealed that the few studies that do exist address the relationship between this element and student satisfaction, not performance.

Instructional Design and Organization

Traditionally, instructional design has been thought of as a systematic process that addresses desired goals and outcomes and then, working backwards, strives to develop assessments, strategies and materials that will achieve these objectives (Davidson-Shivers & Rasmussen, 2006; Gagne, Wager, Golas & Keller, 2004; Morrison, Ross & Kemp, 2006; Wiggins & McTighe, 2005). Applying this general definition to online learning and refining it to apply to the CoI, Anderson and colleagues (2001) described the design and organization element of teaching presence as the planning, design and development of those structures and processes that serve as catalysts for interaction in online courses.

Because online learning environments are low in paralinguistic cues (Liu, Bonk, Magjuka, Lee & Su, 2005) and generally lack the transparency associated with the traditional classroom (Coppola, Hiltz & Rotter, 2002), socially mediated practice (Vygotsky, 1978) can be negatively impacted. Therefore, instructors must be more explicit with respect to providing directions and establishing expectations (Anderson, Rourke, Garrison & Archer, 2001).

Method

This study utilized a convergent triangulation design to answer the following research questions:

RQ 1: Is there a relationship between student satisfaction with instructional design and organization and student performance in online courses?

RQ 2: What facets of instructional design and organization do students associate with success in online courses?

Instructional Setting

The study is based on a population of students residing in the Rio Grande Valley of south Texas and attending class at South Texas College. At this institution, students can complete programs ranging from certificates through the Bachelor of Applied Technology degree. The participants for the study were enrolled in online courses in a variety of subjects, ranging from developmental education through senior level class work in all areas of instruction. Courses are delivered through WebCT, with synchronous or asynchronous instruction determined by the instructor's preference. During the Fall 2005 semester, one or more sections of 75 different courses were offered.

Participants

South Texas College’s Institutional Review Board approved the protocol for this study to ensure ethical treatment of all participants. The survey instrument was administered to 2,157 students enrolled in one or more online courses with no incentive for participation offered. A total of 362 participants (response rate = 16.8%) chose to complete the survey. The majority of respondents (69.3%) were between the ages of 18 and 29, with 28.2% age 18-21. Females comprised 79.9% of participants. With respect to ethnicity, 91.4% described themselves as Hispanic, 4.3% Anglo, 1.4% African American, 0.6% Native American and 2.3% as Other. In terms of previous online course experience, 65% had previously taken at least one online course. With respect to technical preparation, 51.1% of participants had completed a pre-course tutorial offered to students taking online courses and 95.4% believed that they were adequately prepared.

Design

A mixed methods approach utilizing a convergent triangulation design with both concurrent and sequential components was implemented (Cresswell & Plano-Clark, 2006). In the analysis and interpretation phase, equal weighting was given to both the quantitative and qualitative components to enrich the description of the value participants placed on instructional design and organization (Morse, 1991). Three separate sets of data were utilized in the triangulation process: end of course Likert-type items, end of course open-ended qualitative items and autoethnographic reporting (Patton, 2002).

A mixed methods research design was selected for the work and guided by a “pragmatic approach” or paradigm (Morgan, 2007). The focus was to capitalize on the strengths of both quantitative and qualitative approaches to data collection. This required following established criteria for generating high quality quantitative and qualitative data. While criteria for judging the quality of quantitative studies are well established, there is less agreement regarding what quality criteria are applicable to qualitative research (Denzin & Lincoln, 2003; Marshall & Rossman, 1989). Jick (1979) argued that triangulation of data sources aimed at enriching understanding through multiple perspectives should be the central criterion by which mixed methods research is judged.

End of Course Survey – Quantitative Data

At the end of the semester, students were asked to complete a survey to assess satisfaction and perceived learning. The survey consisted of 48 items. Of these, 13 asked for demographic information and four were open-ended qualitative items. The remaining 31 were Likert-type items assessing student satisfaction with course design, navigation, the instructor and perceived learning, including an item that asked for self-reporting of final grades. Four of these Likert-type items were related to instructional design and organization (Appendix A) and comprise the quantitative portion of the study.

End of Course Survey – Qualitative Data

Of the four open-ended end-of-course survey items, two were used in the study:

  1. Please list one thing the instructor did that helped you to succeed in this class.

  2. Please list one thing the instructor did that hindered your success in this class.

Responses were analyzed following suggestions by both Strauss (1987) and Tesch (1990) using an interpretive, iterative approach with emphasis placed on drawing out thematic strands. Because of the data richness, both within-case and cross-case analyses were utilized to more fully represent what occurred at both the individual level and as part of a group dynamic. Data were then transformed and quantified by theme within the teaching presence construct of the CoI. Thirty replies were related to personal issues and therefore did not fall within the CoI Framework. These replies were coded and categorized as Other.
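The transformation step described above – converting coded open-ended replies into counts per theme – can be sketched as follows. This is an illustrative sketch only: the theme labels are hypothetical stand-ins for the instructional design and organization indicators and the collapsed categories used in this study, and the coded replies are invented.

```python
from collections import Counter

# Hypothetical coded responses: each reply has already been assigned a theme
# label during the interpretive pass. Labels mimic the CoI teaching presence
# indicators used in this study; "other" holds personal, non-CoI replies.
coded_replies = [
    "facilitation_of_discourse", "clear_design", "facilitation_of_discourse",
    "syllabus_presented", "other", "requirements_explained",
    "facilitation_of_discourse", "tentative_schedule", "other",
]

# Quantify the qualitative data: tally replies per theme
theme_counts = Counter(coded_replies)
for theme, count in sorted(theme_counts.items(), key=lambda kv: -kv[1]):
    print(f"{theme}: {count}")
```

The resulting tallies are what populate frequency tables such as Tables 2 through 4.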

Autoethnographic Reporting

In accordance with suggestions offered by Patton (2002), this paper’s lead author utilized a self-interview format for ongoing journaling of her perceptions of student response to instructional organization and design techniques during the semester in which data collection occurred. Though this technique is arguably subject to bias on the part of the reporter, it was deemed a valuable tool for cross-checking the interpretation of the end of course survey qualitative data.

Triangulation

After analyzing each qualitative data set in the manner described above, the end-of-course survey qualitative data were cross-checked with the autoethnographic reporting to assess commonality and accuracy of independent interpretations. This process included the use of negative case analysis to explore consistency across data sources (Ryan & Bernard, 2003). Quantitative data were then analyzed using descriptive statistics and regression analysis.

The qualitative findings were then converged on the quantitative data to fully explore the implications of the statistical findings. As there was a significant difference between quantitative and qualitative data, the qualitative points were used to offer an explanation of these differences, using suggestions made by Cresswell and Plano-Clark (2006). The interpretive conclusions from triangulation analyses were then compared to what is known about the corresponding element, instructional design and organization, to develop conclusions and directions for future research.

Results

Reliability

As previously noted, the wording of the instructional organization and design subscale was revised to accommodate potential interpretability issues that may impact community college learners. Therefore, reliability of the scale was a primary concern in this study. Reliability analysis produced a Cronbach’s Alpha of .91, thus alleviating concerns related to reliability.
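As a point of reference, Cronbach's alpha for a multi-item Likert subscale such as the one used here can be computed as shown below. This is a minimal sketch with hypothetical response data, not the study's actual instrument or analysis.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert responses to a four-item subscale: each respondent's
# items are built from a shared base score, mimicking a reliable scale.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(100, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(100, 4)), 1, 5)
alpha = cronbach_alpha(scores)
print(round(alpha, 2))
```

Perfectly correlated items drive alpha to 1.0; values of about .70 and above are conventionally taken as acceptable internal consistency, so the reported .91 indicates a highly reliable subscale.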

Multiple Regression Analysis

A multiple regression analysis was applied to examine the relationship between the instructional design and organization measures and student reported performance. In the regression analysis, the criterion variable was the final grade in the course as reported by students. The predictor variables were four measures of instructional design and organization (Appendix A). No violations were found in the assumptions of normality, linearity, and homoscedasticity of residuals. Thirteen outliers were found based on the criteria of beyond +/-3 standard deviations; these were removed, and 349 cases were used in the present analysis.

Presented in Table 1 are the unstandardized betas (B), standard errors (SE B) and standardized betas (Beta) of the predictor variables. The regression model was not significant, F(4, 342) = 2.194, p > .05. The multiple correlation coefficient was .177, indicating that 3.1% of the total variance in student performance could be accounted for by instructional design and organization.

Table 1
Unstandardized Betas, Standard Error and Standardized Betas

                                                   B       SE B     Beta
  (Constant)                                     1.638*    0.615
  Class is clearly designed                      0.288     0.149    0.15
  Syllabus is clearly presented                 -0.252     0.171   -0.11
  Syllabus offers a tentative schedule           0.109     0.113    0.057
  Instructor's requirements clearly explained    0.093     0.17     0.044

*p < .05
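To make the reported quantities concrete: the multiple correlation coefficient R relates to variance explained as R², and the overall F statistic tests whether the predictors jointly explain any variance. The sketch below, using hypothetical data rather than the study's, shows how these values fall out of an ordinary least squares fit.

```python
import numpy as np

def ols_summary(X: np.ndarray, y: np.ndarray):
    """OLS fit; returns coefficients, multiple R, and the overall F statistic."""
    n, k = X.shape
    Xd = np.column_stack([np.ones(n), X])          # prepend intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    r_squared = 1 - (resid ** 2).sum() / ((y - y.mean()) ** 2).sum()
    f_stat = (r_squared / k) / ((1 - r_squared) / (n - k - 1))
    return beta, np.sqrt(r_squared), f_stat

# The reported multiple correlation of .177 implies the variance explained:
print(round(0.177 ** 2, 3))  # → 0.031, i.e. roughly 3.1%

# Hypothetical demonstration: four Likert-type predictors, weak relationship
rng = np.random.default_rng(1)
X = rng.integers(1, 6, size=(349, 4)).astype(float)
y = np.clip(2.5 + 0.1 * X[:, 0] + rng.normal(0, 1, size=349), 0, 4)
beta, mult_r, f_stat = ols_summary(X, y)
```

With 4 predictors and 349 retained cases, a model F near 2.2 falls short of the critical value at the .05 level, consistent with the non-significant result reported above.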

Qualitative Data

Of the 362 students completing the survey, 227 chose to provide feedback relating to both their success and lack of success in the course. An additional 74 students chose to provide feedback related only to their success in the course and 29 chose to provide feedback related to their lack of success in the course.

Through an iterative, interpretive review of the qualitative data, it was possible to group all but 30 responses into one of the preconceived teaching presence categories. As only the instructional design and organization component of teaching presence was explored using quantitative analysis, the qualitative data were divided into components only for this category. All other themes that related to teaching presence were grouped under facilitation of discourse.

Presented in Tables 2 through 4, below, is the categorical prevalence of themes associated with success and lack of success. Data are segregated according to the qualitative items to which students responded.

Table 2
Categorical prevalence of themes associated with success
and lack of success by those who responded to both qualitative items.

                                                Success   Lack of Success
  Class is clearly designed                        3            19
  Syllabus is clearly presented                    8            25
  Syllabus offers a tentative schedule            11             8
  Instructor's requirements clearly explained     17            24
  Facilitation of discourse                      180           134
  Other                                            8            17

Table 3
Categorical prevalence of themes associated with success
by those who chose to respond only to this item.

                                                Success
  Class is clearly designed                        3
  Syllabus is clearly presented                    1
  Syllabus offers a tentative schedule            14
  Instructor's requirements clearly explained      6
  Facilitation of discourse                       48
  Other                                            2

Table 4
Categorical prevalence of themes associated with lack of success
by those who chose to respond only to this item.

                                                Lack of Success
  Class is clearly designed                        5
  Syllabus is clearly presented                    3
  Syllabus offers a tentative schedule             2
  Instructor's requirements clearly explained      9
  Facilitation of discourse                        7
  Other                                            3


Of students who cited instructional design and organization issues as contributing to their lack of success, 31% received As in the course, 39% received Bs, 24% received Cs and 6% received Ds or Fs. In contrast, of students who cited facilitation of discourse issues as contributing to their lack of success, 8% received As, 43% received Bs, 45% received Cs and 4% received Ds or Fs.

Autoethnographic Reporting

Throughout the semester during which the data were collected, the researcher utilized a journal format to record personal observations of student perceptions and reactions regarding instructional organization and design techniques. Observations were recorded at least weekly, and more often as the need arose. Based on a review of these observations, it was determined that there were three areas in which students consistently requested additional information, either through e-mail or the discussion tool in WebCT. The three areas were assignment dates, assignment directions, and submission guidelines.

Predominantly, student e-mails or phone calls were for clarification of assignment due dates. While these were listed both under Assignments in the shell's course menu and in the course calendar, students requested confirmation of a due date or asked what the due date was. The response included a direct answer and a casual comment about where the information could be found. Invariably, when the next assignment was due, the same students would e-mail again asking for deadline information.

Students often contacted the instructor for specific directions for assignments. Again, this material was presented in general terms in the syllabus and in more specific terms under each assignment heading. Often, the student questions regarding assignment requirements were submitted via the discussion forum, where students were encouraged to post and answer questions. Often, the instructor would leave a question unanswered for a brief period of time to observe whether another student would offer the answer. This was a rare occurrence.

The most illuminating area noted through observation was the area of submission guidelines. This referred to the technical aspect of uploading the information to the appropriate area. For this instance, instructions were provided in the course for each assignment and general instructions were also provided by the course management system. Still, students often e-mailed, usually very near the deadline for the assignment, that they were unable to submit or did not understand how to submit the assignment. Many times, resolution required a telephone call to walk the student through the submission process step-by-step to ensure that the submission was correctly handled.

Triangulation & Discussion

Regression analysis revealed no significant relationship between instructional design and organization and student achievement in online courses. In addition to the lack of significance, the multiple correlation coefficient indicated that only 3.1% of the variance in student achievement was accounted for by the predictor variables. Though this study was exploratory in nature, it was believed that a relationship was likely to exist and would account for a larger degree of variance. Therefore, the results of the regression analysis were quite surprising, as they contradicted the assumptions upon which this study was founded.

However, the transformed qualitative data supported the quantitative findings. Because open-ended questions do not impose a preconceived bias on respondents' replies, the alignment of these data with the quantitative findings carries significant weight. Of the factors students cited as responsible for their success in the course, only 20.9% were related to instructional design and organization. Of the factors cited for lack of success, 37.1% were related to instructional design and organization. On the surface, these data appear to contradict the quantitative findings to some extent. However, of the students citing instructional organization and design issues as a reason for their lack of success, 31% received A's in the course and 39% received B's. It is interpreted that a total of 70% believed they were not completely successful in mastering course content and objectives, yet performed at levels deemed excellent or good by conventional standards.

In understanding why a significant relationship was not found to exist between student achievement and instructional design and organization, the autoethnographic reporting is informative. Despite directions for participation, due dates and course content being presented in a clear, and often redundant, manner, students frequently contacted the instructor via e-mail or phone to seek additional information. A review of the associated course website indicated that this information was clearly available and presented in an easily interpretable fashion.

Whether students were simply not reading the online materials or wishing to make additional contact with the instructor for reasons related to the establishment of social presence remains unclear. However, the noted lack of student-to-student communication in seeking clarification suggests that establishing a relationship with the instructor at a more personal level or as an authority figure may have been a motivating force in the frequent level of contact.

The autoethnographic data also suggest that, for these students, the relationship with the instructor was far more important in the learning process than was the relationship between students and the content. Supporting this hypothesis is the qualitative data, which revealed that 75.2% of students attributed their success in the course to directed facilitation on the part of the instructor.

Findings of this triangulation are presented in a guarded manner as participant demographics limit generalizability in two ways. First, this study consisted of community college learners who may have learning needs that differ significantly from learners in other post-secondary programs. Second, the population was overwhelmingly (91.4%) Hispanic, raising the prospect that ethnicity may be a confounding factor in interpreting the relationship between instructional design / organization and achievement.

Over the past decade, four-year college completion rates have been declining across all racial and ethnic groups as more students take longer to receive their bachelor's degrees (Astin & Oseguera, 2005; Cabrera et al., 1993; Longerbeam et al., 2004). Compounding the problem is the tendency of prospective students from low socio-economic areas, where poorly maintained and funded public schools are the norm, to doubt their academic abilities, question the value of their scholarly contributions, and reconsider their decision to pursue a degree (Cuádraz, 1997; Gándara, 1995; Solórzano, 1998). In response, many students from this demographic elect to begin their coursework in a community college setting where the curriculum is perceived to be less rigorous and the risk of failure lower. However, the reality is that too often these institutions provide instruction grounded in cultural practices that remain alien to many attendee clusters, thereby failing to address socially derived structural inequalities (Garcia, 2003; Garcia & Gopal, 2003; Valencia & Bernal, 2000) and curbing the knowledge transference function of community colleges (Ornelas & Solórzano, 2004).

For Hispanic students, the situation is even more dire. In comparison to other ethnic groups, research shows that they take longer to enroll in college and to eventually graduate (Kennen & Lopez, 2005; Swail, Cabrera, Lee and Williams, 2005). Delayed enrollment and longer time to degree completion for Hispanic students has been attributed to several factors, such as working full-time while also taking courses part or full-time, having to tend to familial responsibilities, or having to take developmental courses which may not be credited towards degree attainment (Nora, 2004). In turn, these experiences serve to amplify self-doubt and lead to a need for external reinforcement or precipitate the decision to withdraw from programs (Ponjuan, 2005).

Conclusions

The original intent of this study was to determine if a relationship existed between instructional design / organization and student performance in online courses. None of the three methods revealed a relationship between the predictor and criterion variables. However, we believe that the study is meaningful in that it illuminated the possibility that among certain demographics little value may be placed on instructional design related elements. Rather, there is a strong indication that the students in this study were highly dependent upon interpersonal student / instructor interactions for both direction and reinforcement.

From the literature, reviewed in the triangulation and discussion section, this hypothesis appears to be consistent with the needs and expectations of both low socio-economic status students and Hispanics that enroll in community college courses. However, as both groups were inexorably intertwined in this study, more analysis is needed to determine if both demographics present the same set of needs or if the phenomenon is more tightly focused.

Likewise, future inquiry should address multiple institutions, racial groups, geographic clusters and degree levels. A mixed methods study with a quantitative component that utilizes hierarchical linear modeling would be ideal for this purpose, as various group attributes could be defined as nested data sets and regressed against the criterion variable of performance.

Regardless of the approach taken, we believe that future studies are imperative. If, in fact, contemporary practices related to the design of online courses and subsequent pedagogical strategies are repressive to any socio-economic group, then we are creating a secondary digital divide just as the cost-driven digital divide is starting to be mitigated by market forces. As such, this issue should be viewed as one of promoting equity through technology mediated praxis.

References

Allen, I. E., & Seaman, J. (2006). Making the grade: Online education in the United States. Needham, MA: Sloan Consortium.

Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1-17.

Arbaugh, J. B. (2007). An empirical verification of the community of inquiry framework. Journal of Asynchronous Learning Networks 11(1), 73-85.

Arbaugh, J. B. & Hwang, A. (2006). Does "teaching presence" exist in online MBA courses? The Internet and Higher Education 9(1), 9-21.

Astin, A.W. & Oseguera, L. (2005). Degree attainment rates at American colleges and universities, revised edition. Los Angeles: Higher Education Research Institute, UCLA.

Bennett, S., & Lockyer, L. (2004). Becoming an online teacher: Adapting to a changed environment for teaching and learning in higher education. Educational Media International, 41(3), 231-244.

Brown, D. (Ed.). (2003). Developing faculty to use technology: Programs and strategies to enhance teaching. Bolton, MA: Anker.

Cabrera, A. F., La Nasa, S. M., & Castaneda, M. B. (1993). College persistence: Structural equation modeling test of an integrated model of student retention. Journal of Higher Education, 64(2), 123-137.

Conrad, D. (2004). University instructors’ reflections on their first online teaching experiences. Journal of Asynchronous Learning Networks. 8(2), 31-44.

Coppola, N. W., Hiltz, S. R., & Rotter, N. G. (2002). Becoming a virtual professor: Pedagogical roles and asynchronous learning networks. Journal of Management Information Systems, 18(4), 169-189.

Creswell, J. W., & Clark, V. L. P. (2006). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.

Cuádraz, G. (1997). Chicana academic persistence: Creating a university-based community. Education and Urban Society, 30, 107–121.

Davidson-Shivers, G. V., & Rasmussen, K. L. (2006). Web-based learning: Design, implementation, and evaluation. Upper Saddle River, NJ: Pearson Prentice Hall

Denzin, N. K., & Lincoln, Y. S. (Eds.). (2003). Collecting and interpreting qualitative materials (2nd ed.). Thousand Oaks, CA: Sage.

Dewey, J. (1933). How we think (Rev. Ed.). Boston: D.C. Heath.

Dixson, M., Kuhlhorst, M., & Reiff, A. (2006). Creating effective online discussions: Optimal instructor and student roles. Journal of Asynchronous Learning Networks, 10(4), 15-28.

Dziuban, C., Shea, P., & Arbaugh, J. (2005). Faculty roles and satisfaction in asynchronous learning networks. In S. Hiltz & R. Goldman (Eds.), Learning together online: Research on asynchronous learning networks (pp. 19- 37). Mahwah, NJ: Lawrence Erlbaum.

Epper, R. M., & Bates, A. W. (Eds.). (2001). Teaching faculty how to use technology: Best practices from leading institutions. Westport, CT: Oryx.

Finegold, A. R. D., & Cooke, L. (2006). Exploring the attitudes, experiences and dynamics of interaction in online groups. Internet and Higher Education, 9(3), 201-215.

Gagne, R. M., Wager, W. W., Golas, K., & Keller, J. M. (2004). Principles of instructional design (5th ed.). Belmont, CA: Wadsworth/Thomson Learning.

Gándara, P. (1995). Over the ivy walls: The educational mobility of low-income Chicanos. Albany, NY: State University of New York Press.

Garcia, P. (2003). The use of high school exit examinations in four southwestern states. Bilingual Research Journal, 27 (3), 431–450.

García, P. A., & Gopal, M. (2003). The relationship to achievement on the California high school exit exam for language minority students. NABE Journal of Research and Practice, 1 (1), 123–137.

Garrison, D. R. (2007). Online community of inquiry review: Social, cognitive, and teaching presence issues. Journal of Asynchronous Learning Networks 11(1), 61-72.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education 2(2-3), 87-105.

Garrison, D. R., & Archer, W. (2003). A community of inquiry framework for online learning. In M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education. Mahwah, NJ: Erlbaum.

Garrison, D.R., & Arbaugh, J.B. (2007). Researching the community of inquiry framework: Review, issues and future directions. The Internet and Higher Education 10(3), 157-172.

Garrison, D. R., Cleveland-Innes, M., & Fung, T. (2004). Student role adjustment in online communities of inquiry: Model and instrument validation. Journal of Asynchronous Learning Networks, 8(2), 61-74.

Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. American Journal of Distance Education, 19(3), 133-148.

Hiltz, S. & Goldman, R. (Eds.). (2005). Learning together online: Research on asynchronous learning networks. Mahwah, NJ: Lawrence Erlbaum.

Jick, T. D. (1979, December). Mixing qualitative and quantitative methods: Triangulation in action. Administrative Science Quarterly, 24(4), 602-611.

Kennen, E., & Lopez, E. (2005). Finding alternate degree paths for non-traditional, now-traditional students. The Hispanic Outlook in Higher Education, 15, 21-22.

Liu, X., Bonk, C. J., Magjuka, R. J., Lee, S., & Su, B. (2005). Exploring four dimensions of online instructor roles: A program level case study. Journal of Asynchronous Learning Networks, 9(4), 29-48.

Long, H. B. (2002). Teaching for learning. Malabar, FL: Krieger Publishing.

Longerbeam, S.D., Sedlacek, W.E., & Alatorre, H.M. (2004). In their own voices: Latino student retention. NASPA Journal, 41(3), 538-550.

Marshall, C., & Rossman, G. B. (1989). Designing qualitative research. Thousand Oaks, CA: Sage.

Merriam, S. B., & Caffarella, R. S. (1999). Learning in adulthood (2nd ed.). San Francisco: Jossey-Bass.

Morgan, D. L. (2007). Paradigms lost and pragmatism regained: Methodological implications of combining qualitative and quantitative methods. Journal of Mixed Methods Research, 1(1), 48-76.

Morrison, G. R., Ross, S. M., & Kemp, J. E. with Kalman, H. K. (2006). Designing effective instruction (5th ed.). San Francisco: John Wiley and Sons.

Morse, J. M. (1991). Approaches to qualitative-quantitative methodological triangulation. Nursing Research 40, 120-123.

Murphy, E. (2004a). Recognizing and promoting collaboration in an online asynchronous discussion. British Journal of Educational Technology, 35(4), 421-431.

Nora, A. (2004). The role of habitus and cultural capital in choosing a college, transitioning from high school to higher education, and persisting in college among minority and non-minority students. Journal of Hispanic Higher Education, 3(2), 180- 208.

Olgren, C. H. (2000). Learning strategies for learning technologies. In E. J. Burge (Ed.), The strategic use of learning technologies (New Directions for Adult and Continuing Education, No. 87, pp. 7-16). San Francisco: Jossey-Bass.

Ornelas, A., & Solórzano, D. G. (2004). Transfer conditions of Latina/o community college students: A single institution case study. Community College Journal of Research and Practice, 28, 233-248.

Palloff, R. M., & Pratt, K. (1999). Building learning communities in cyberspace: Effective strategies for the online classroom. San Francisco: Jossey-Bass.

Patton, M. (2002). Qualitative research and educational methods (3rd ed.). Thousand Oaks, CA: Sage.

Pittinsky, M. S. (Ed.). (2003). The wired tower: Perspectives on the impact of the internet on higher education. Upper Saddle River, NJ: Pearson Education.

Richardson, J. C., & Swan, K. (2003, February). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68-88.

Ryan, G. W., & Bernard, H. R. (2000). Data management and analysis methods. In N. K. Denzin & Y. S. Lincoln (Eds.), Collecting and interpreting qualitative materials (2nd ed., pp. 760-802). Thousand Oaks, CA: Sage.

Shea, P., Li, C. S., Swan, K., & Pickett, A. (2005, December). Developing learning community in online asynchronous college courses: The role of teaching presence. Journal of Asynchronous Learning Networks, 9(4), 59-82.

Solórzano, D. G. (1998). Critical race theory, racial and gender microaggressions, and the experiences of Chicana and Chicano scholars. International Journal of Qualitative Studies in Education, 11, 121–136.

Strauss, A. L. (1987). Qualitative analysis for social scientists. New York: Cambridge University Press.

Swail, W. S., Cabrera, A. F., Lee, C., & Williams, A. (2005). Latino students and the educational pipeline. Washington, DC: Educational Policy Institute.

Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course discussions. Journal of Asynchronous Learning Networks 9(3), 115-136.

Tesch, R. (1990). Qualitative research: Analysis types and software tools. New York: Falmer.

Valencia, R. R. (Ed.). (1997). The evolution of deficit thinking: Educational thought and practice. Bristol, PA: Taylor and Francis.

Valencia, R. R., & Bernal, E. (Eds.). (2000). The Texas assessment of academic skills (TAAS) case: Perspectives of plaintiffs’ experts [Special issue]. Hispanic Journal of Behavioral Sciences, 22(4).

Vygotsky, L. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Wiggins, G. P., & McTighe, J. (2005). Understanding by design (expanded 2nd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.

Wise, A., Chang, J., Duffy, T., & del Valle, R. (2004). The effects of teacher social presence on student satisfaction, engagement, and learning. Journal of Educational Computing Research, 31, 247-271.

Wu, D., & Hiltz, S. R. (2004). Predicting learning from asynchronous online discussions. Journal of Asynchronous Learning Networks 8(2), 139-152.


About the Authors

Lori Kupczynski, Ed.D. is an instructional designer at the University of Texas-Pan American in Edinburg, Texas. She serves as the faculty liaison for distance education and trainer for all faculty who wish to teach online. Her doctorate focused on Educational Leadership with an emphasis on adult education, and her research interest centers upon Internet-based instruction and the role of the adult learner in this medium.
Contact: loriski@utpa.edu

Rebecca Davis, Ph.D. is an Assistant Professor of Adult Education at Texas A&M University-Kingsville. Rebecca completed her Ph.D. at Texas A&M University in Educational Human Resource Development and her M.Ed. in Adult Education at Texas A&M University-Kingsville. She has over 16 years of experience in adult education. Rebecca worked in the field of Continuing Education at the University of New Hampshire and was Professional Development Coordinator for Texas A&M University-Kingsville. In addition to her teaching, Dr. Davis is the Director for the grant funded South Region GREAT Center which provides professional development for literacy teachers in South Texas.
Contact: rebecca.davis@tamuk.edu

Philip Ice, Ed.D. teaches courses in instructional design and school curriculum in the Department of Middle, Secondary and K-12 Education at the University of North Carolina Charlotte. His research is focused in two interrelated areas. The first is the use of audio feedback in online environments. The second is exploring how the projection of teaching presence impacts the emergence of cognitive presence in online courses.
Contact: pice@uncc.edu

David M. Callejo Pérez, Ed.D. currently teaches curriculum studies and coordinates the doctoral program in Curriculum and Instruction at West Virginia University. He co-edited Pedagogy of Place (2004) and Educating for Democracy in a Changing World (2007) and wrote Southern Hospitality (2001) and Life of a School (2007). He has written dozens of articles and book chapters focusing on identity and schools, civil rights, teacher education, qualitative research, and transmigration.

Contact: david.callejo@mail.wvu.edu
