Editor’s Note: Learning styles continue to challenge researchers and those who design instruction. Learning is more efficient and effective when presented in the preferred style of the learner; however, in the real world, learners must be capable of working with all learning styles in order to be successful.

Informing the Design of Personalized Learning Environments through Iterative Analysis of Learner Interaction and Feedback

Mohammad Issack Santally
Mauritius

Abstract

Personalization in web-based learning environments is receiving growing attention from practitioners, educational researchers and technologists worldwide. One of the aspects that has been heavily studied, with varying levels of success and inconsistent results, is personalization through adaptation to learning/cognitive styles and preferences. Educators and researchers have mixed opinions on the impact of learning styles/cognitive styles on the learning experience, with particular reference to adult learners and web-based learning environments. The validity of learning styles questionnaires has often been put in question. This article describes a method for minimizing inconsistencies and possible subjectivity in learner responses to questionnaires, thereby increasing the validity of learning style surveys. The paper further describes a 4-phase iterative life-cycle model for the process of adapting to individual learning preferences and learning styles.

Keywords: Personalized instruction, learning styles, cognitive styles, instructional and pedagogical design, web-based learning.

Introduction

Personalization in web-based learning environments is receiving growing attention from practitioners, educational researchers and technologists worldwide. The current belief is that the ‘one size fits all’ philosophy advocated by early web 1.0 approaches is limited in pedagogical effectiveness. The web is no longer seen only as an innovative delivery medium but as a platform where rich pedagogical scenarios, adapted to the specific learning needs of learners, can be delivered on a ‘just-in-time’ basis. While the concept of personalization inevitably results in adaptation to specific learner preferences, there are a variety of ways in which online educators can provide such personalization. The first aspect relates to the way the learner wants to ‘see’ the environment. In this case, he/she can select layout templates, colours, fonts and other display attributes. He/she can also select the level of educational material he/she wants to access depending on his/her educational needs, previous education, and experience. This kind of personalization is achieved through features to “customize” the learning environment.

Another possibility is to have the system provide the learner with a customized learning scenario. The system bases this on prior recorded parameters such as performance assessment, prior skills, and the pre-requisites of the course. Personalization in this context is achieved in a somewhat hegemonic way, where the system (the teacher in a traditional environment) provides individually tailored content to the learner. In the past, researchers considered a number of features that can be used in the personalization of instruction in web-based learning environments. One aspect that has been heavily studied, with varying levels of success, is personalization that adapts to cognitive styles and learning style preferences.

Educators and researchers have mixed opinions on the impact of learning styles/cognitive styles on the learning experience, with particular reference to adult learners and web-based learning environments. The validity of learning styles questionnaires has often been put in question. This article describes a method that can minimize inconsistencies and learner subjectivity in filling in the questionnaires, so as to improve the reliability and validity of learning style surveys.

The paper further describes a 4-phase iterative life-cycle model for the process of adapting to the individual learning preferences of learners according to their learning styles. The aim is to build a learner profile that is as accurate as possible and that evolves over time. It is based on the learner’s interaction with the learning environment and the teacher, the student’s perception of the learning material, and learner achievement in terms of performance and understanding of the content. Furthermore, by capturing learner feedback and perceptions, learning paths can be established. This will help determine whether adaptation to learning styles contributes to learning enhancement and improved learning outcomes in web-based learning environments.

Personalization, Learning Styles and Web-based Learning Environments

Terrell and Dringus (2000) investigated the effects of learning styles on student success in an online learning environment. They tracked 98 Masters-level students in an information science programme using the Kolb Learning Style Inventory. While they found that a majority of students can succeed in an online learning environment regardless of learning style, they also found that there were more drop-outs among students whose learning style fell in the accommodator category.

A study by Ross and Schulz (1999) based on the Gregorc Style Delineator revealed that learning styles significantly affected learning outcomes and that abstract random learners may do poorly with some forms of computer-aided instruction (CAI). By definition within the Gregorc Style Delineator, abstract random learners tend to be “non-linear, multidimensional, and prefer active, free and colorful environments. They thrive on building relationships with others and dislike extremely structured assignments.”

Underperformance in computer-aided instruction environments can be explained by the fact that these environments lack the collaborative, collective features of web-based learning environments. Furthermore, computer-aided instruction often focuses on highly structured assignments such as fill-in-the-blanks and multiple-choice questions, while web-based learning environments can provide less-structured learning activities and scenarios to fit learner preferences.

Butler and Pinto-Zipp (2005) conducted an experiment similar to Ross and Schulz (1999) with mature learners in an online learning environment (rather than a traditional CAI setting). The feedback from learners suggested that, for mature students who are practicing professionals focused on their career goals, the real effect of learning styles cannot be established in a cause-and-effect way. This is because their drive to complete the course is influenced by intrinsic and/or extrinsic motivation factors. The study also revealed that a significant number of online learners developed a dual learning style. Butler and Pinto-Zipp (2005) further argue that today’s learners are more flexible, stretch their learning styles to accommodate a variety of instructional methods, or simply move beyond their preferred methods.

Luk (1998) studied differences in academic achievement with respect to the cognitive styles of nursing students. He found that field-independent students performed better than their field-dependent peers in both traditional education and distance learning settings. The weaker performance in distance education was attributed to the highly impersonal nature of the modality, given that the teacher and learner are not in a face-to-face relationship on a continuous basis. However, this argument does not hold up well for two reasons. The first is that even in face-to-face relationships, the interaction is not on a continuous basis. The second is that new and advanced communication technologies help to recreate the face-to-face setting in an improved way, for example by promoting a greater number of one-on-one interactions with the teacher.

One interesting argument provided by Luk (1998) is the concept of structure in self-directed instructional packages. It is argued that field-dependent and field-independent learners would perform equally well if the instructional package is highly structured while field-dependent learners would have difficulty in semi-structured or unstructured learning contexts.

Ford (2000) argues that virtual environments enable a given information space to be traversed in different ways by different individuals using different routes and tools. He argues that cognitive styles are useful factors that can help in the personalisation of instruction in virtual environments. He suggests the need for more robust student models to achieve better learning systems design, and proposes virtual environments that enable differential patterns and sequences of access to information to suit different types of students. Such access could be prescribed, autonomous or recommended.

Furthermore, Hall and Moseley (2005) argue that translating specific ideas about learning styles into teaching and learning strategies is critically dependent on the extent to which these learning styles have been reliably and validly measured, rigorously tested in authentic situations, given accurate labels and integrated into everyday practices of information gathering, understanding, productive learning and strategic and reflective thinking.

Critiques of learning/cognitive styles instruments

The model of Kolb (1984) has been criticized by different authors and there is a need for a more reliable and valid instrument for the measurement of learning styles (Kinshuk, 1996). The construct of the Learning Style Inventory (LSI) was found to be unsatisfactory by different authors (Freedman & Stumpf, 1978; Wilson, 1986), while face validity, an important aspect of the LSI, was not well accepted by managers (Kinshuk, 1996).

From a pedagogical perspective, Atherton (2002) writes that one of the strengths of the model is that it “provides one of the most useful descriptive models of the adult learning process available. The most direct application of the model is to use it to ensure that teaching and tutoring activities give full value to each stage of the process."

From a lifelong learning perspective, Reijo (2000) states that Kolb's learning cycle does not illustrate the fact that empirical (i.e. experiential) thinking based on action has limitations such as:

  • It may result in false conclusions.
  • It may not help us understand and explain change and new experiences.
  • It may cause mental laziness and dogmatic thinking.

Reijo (2000) also suggests that in Kolb's model, experience and reflection occur in isolation, and that it is necessary for the individual to interact with other humans and the environment in order to enhance the reasoning and conclusions drawn. This critique comes mainly from a constructivist point of view. However, from an e-learning perspective, web-based learning does favor the design of constructivist learning environments, which can be combined with the Kolb learning model to address the critique above.

Garner (2000) points to the poor theoretical foundations of Kolb’s theory and questions the reliability of the instrument. Despite acknowledging a number of positive works based on Kolb’s model, Garner argues that a “misunderstanding is generated by the confusion around whether Kolb is arguing for learning styles as traits (and so stable) or states (and so flexible)—a clear answer to this is needed”.

In addition, Zwanenberg et al. (2000) investigated the psychometric properties of different learning style instruments such as the ILS (Index of Learning Styles) and the LSQ (Learning Style Questionnaire). They highlighted the poor psychometric properties of the instruments and questioned their reliability for obtaining satisfactory results in experiments.

Veenman et al. (2003) demonstrated the limitations of self-assessment reports in the determination of learning styles and proposed think-aloud techniques as a more reliable approach. However, it must be pointed out that the think-aloud technique can pose practical limitations when the number of learners is large and they are at dispersed geographic locations.

Improving reliability of data gathered from learning style instruments for personalized web-based environments

This section presents a method to address the criticisms and issues raised by previous studies concerning the validity of self-assessment instruments for determining preferred learning styles, as well as criticisms that some learning styles are merely a determination of personality preferences. In learning style questionnaires, students can answer up to 90 questions and still be classified in one category only. Educational designers will then derive a set of guidelines to tell teachers about the learning activities that will suit that particular ‘category’ of student. This means that at some stage the learners are grouped at a particular granularity level which cannot be broken down further. Analyzing individual answers will help to further understand learners’ preferences even if their profile is categorized by the different ‘creators’ of learning style instruments.

An analysis of the Honey and Mumford (1986) LSQ, the Kolb LSQ and the V-A-K questionnaire (Barbe & Milone, 1980) reveals many similar questions to determine learner preferences. Furthermore, with respect to the questionnaires studied, while there may be some questions that are not relevant to an educational context, there are some statements which are direct determinants of a learner’s preferences.

Furthermore, it has been noted that some questions do in fact repeat themselves in a masked way. While some writers have criticized this approach, it is believed there can be a logical rationale behind the inclusion of some redundancy: it minimizes the subjectivity and inconsistencies of learners while filling in the questionnaire. Such similar questions can be grouped together after the learner has filled in the questionnaire to check whether the learner was consistent in his choices. It might be assumed, in the first instance, that a learner who gives contradictory answers to redundant questions is either unsure of his preferences or is simply filling in the questionnaire on an ad-hoc basis. The researcher can therefore be informed of this inconsistency and decide to remedy the situation as he/she deems appropriate. For such cases, an interview may be conducted with the learner to probe into his preferences and update his profile accordingly.
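
By way of illustration, the following Python sketch groups hypothetical redundant items and flags a learner whose answers within the same group diverge. The item identifiers, the five-point response scale and the flagging threshold are assumptions made for illustration only; they are not taken from any published instrument.

# A minimal sketch of the redundancy-based consistency check described above.
# Item groups, the 1..5 agree/disagree scale and the flagging threshold are
# illustrative assumptions, not taken from any published instrument.

# Groups of questionnaire items assumed to probe the same preference.
REDUNDANT_GROUPS = {
    "visual": ["q3", "q17", "q41"],
    "reflective": ["q8", "q25"],
}

def find_inconsistencies(answers, groups=REDUNDANT_GROUPS):
    """Return the groups in which a learner's answers disagree.

    `answers` maps item ids to responses on a 1 (disagree) .. 5 (agree) scale.
    A group is flagged when its responses spread over more than 2 scale points,
    suggesting the learner was unsure or answered in an ad-hoc way.
    """
    flagged = {}
    for label, items in groups.items():
        responses = [answers[i] for i in items if i in answers]
        if len(responses) >= 2 and max(responses) - min(responses) > 2:
            flagged[label] = responses
    return flagged

learner = {"q3": 5, "q17": 1, "q41": 4, "q8": 2, "q25": 2}
print(find_inconsistencies(learner))
# {'visual': [5, 1, 4]} -> a follow-up interview is warranted before fixing
# the profile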

An alternative to having learners fill out a questionnaire is to analyze their navigational and online behavioural patterns with a feed-forward neural network (Lo & Shu, 2005). The network’s output then predicts the student’s ‘learning style’ with some accuracy. The experiment, though limited to the V-A-K instrument, produced satisfactory outputs after training the network. However, this network only predicts one learning style for the student based on the neuron that ‘fires’. In this case, when the neuron ‘fires’ Visual Learning Style for a student, the system will have to look for content matching ‘visual preferences’ only. The technique can therefore be seen as an alternative automated technique for determining the learning preferences of a student. However, it does not address the issue of subjective answers by the learners, as the training set used is based on the answers the students gave originally. A neural net trained at this stage, even if it accurately predicts the learner’s style, might simply be predicting an inconsistent outcome.
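
For readers unfamiliar with the approach, the toy sketch below illustrates the general idea of a feed-forward classifier that maps browsing-behaviour features to a V-A-K label. The features (shares of time spent on video, audio and text pages), the training data and the network size are invented for illustration; the sketch does not reproduce Lo and Shu's (2005) network or data.

# A toy feed-forward network, in the spirit of Lo & Shu (2005), mapping
# browsing-behaviour features to a V-A-K label. Features, data and network
# size are invented for illustration; this is not their implementation.
import numpy as np

rng = np.random.default_rng(0)

# Features: [share of time on video pages, on audio pages, on text pages]
X = np.array([[0.7, 0.1, 0.2], [0.1, 0.8, 0.1], [0.2, 0.1, 0.7],
              [0.6, 0.2, 0.2], [0.1, 0.7, 0.2], [0.1, 0.2, 0.7]])
# Labels: 0 = visual, 1 = auditory, 2 = kinaesthetic (illustrative)
y = np.array([0, 1, 2, 0, 1, 2])
Y = np.eye(3)[y]                                   # one-hot targets

W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)  # one hidden layer
W2 = rng.normal(0, 0.5, (8, 3)); b2 = np.zeros(3)

def forward(inputs):
    h = np.tanh(inputs @ W1 + b1)                  # hidden activations
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, p / p.sum(axis=1, keepdims=True)     # softmax output

for _ in range(2000):                              # plain gradient descent
    h, p = forward(X)
    d_logits = (p - Y) / len(X)
    dW2 = h.T @ d_logits; db2 = d_logits.sum(0)
    d_h = d_logits @ W2.T * (1 - h ** 2)
    dW1 = X.T @ d_h; db1 = d_h.sum(0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.5 * grad

_, probs = forward(np.array([[0.65, 0.15, 0.20]]))
print("predicted style index (0 = visual):", int(probs.argmax()))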

A more conventional approach can help reveal the inconsistencies and subjectivity of the learner while filling in the original questionnaire. The approach used here can be both adaptive and adaptable. If the adaptable approach is selected, then the learner will not be given material preselected by the system, but will in fact be given the freedom to choose his own set of learning materials. His/her navigation history will then be kept and, depending on the learning path he/she undertook, his/her preferred learning preferences will be revealed. This can also be seen as a useful method to adjust the learner profile with respect to his/her original selection. However, for every page navigated by the learner, feedback needs to be obtained on his/her perception of the learning content. This is because a learner might ‘accidentally’ or ‘out of curiosity’ click on a link which he/she does not really perceive as valuable, and this link would still be counted as forming part of his/her learning path. Relying purely on the concept of ‘visited links’ versus ‘unvisited links’ can therefore be misleading.
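
The sketch below illustrates, under assumed record formats and a 1..5 perceived-value scale, how visits without feedback could be excluded from the recorded learning path and how the remaining feedback could reveal modality preferences. The field names and values are illustrative assumptions rather than part of an implemented system.

# Sketch of the adaptable configuration discussed above: only pages on which
# the learner gave feedback count towards the recorded learning path.
# The record layout and the perceived-value scale are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PageVisit:
    content_id: str
    modality: str                   # e.g. "video", "text", "simulation"
    seconds_spent: int
    feedback: Optional[int] = None  # learner's perceived value, 1..5, None if absent

def learning_path(visits):
    """Keep only visits with feedback; bare clicks are treated as accidental."""
    return [v for v in visits if v.feedback is not None]

def revealed_preferences(visits):
    """Average perceived value per modality, used to adjust the learner profile."""
    totals, counts = {}, {}
    for v in learning_path(visits):
        totals[v.modality] = totals.get(v.modality, 0) + v.feedback
        counts[v.modality] = counts.get(v.modality, 0) + 1
    return {m: totals[m] / counts[m] for m in totals}

history = [
    PageVisit("unit1-video", "video", 540, feedback=5),
    PageVisit("unit1-text", "text", 12),                 # clicked, no feedback
    PageVisit("unit1-sim", "simulation", 300, feedback=3),
]
print(revealed_preferences(history))   # {'video': 5.0, 'simulation': 3.0}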

In the case of the adaptive approach, the learner will be given the most appropriate content as chosen by the system based on his initial profile setup. Depending on the learner’s preferences, the system applies the appropriate algorithm (Santally & Senteni, 2005) to select the most appropriate content. However, what the system deems most appropriate after applying the algorithm, or what a teacher deems most appropriate after taking cognizance of a learner’s preferences, might not be what the learner actually feels is most appropriate. Therefore learner feedback is important in every cycle of learning activity. Learner feedback will not only help to update the learner’s profile; grouped evaluation of learner feedback on the same content will also help to point out where the pedagogical designers’ perception of a particular piece of content was inaccurate. Learner feedback will also reveal any factors other than learning/cognitive styles that might be influencing the learning experience.
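
The selection algorithm itself is the one referred to above (Santally & Senteni, 2005); the short sketch that follows only illustrates the feedback loop, using an assumed weighted-average update of stored preference scores. The 0..1 preference scale, the weighting factor and the modality names are illustrative assumptions.

# Sketch of the feedback loop in the adaptive configuration: each rating nudges
# the stored preference score for the content's modality towards the learner's
# perceived value. The weighting factor and the 1..5 scale are assumptions; the
# actual selection algorithm is the one referenced in Santally & Senteni (2005).

def update_profile(profile, modality, rating, weight=0.3):
    """Blend a new rating (1..5, rescaled to 0..1) into the stored preference."""
    observed = (rating - 1) / 4
    old = profile.get(modality, 0.5)          # 0.5 = no prior evidence
    profile[modality] = (1 - weight) * old + weight * observed
    return profile

profile = {"video": 0.8, "text": 0.4}         # initial profile from Phase 1
update_profile(profile, "video", 2)           # system chose video, rated low
update_profile(profile, "text", 5)            # a text page was rated highly
print(profile)   # video drifts down, text drifts up over successive cycles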

The Lifecycle of the Process: An Expansive and Iterative Approach

The above discussion leads to the development of a structured approach that can be seen as a lifecycle model for the development of personalized learning environments with respect to the learning preferences of learners. An iterative approach is proposed, in phases that evolve expansively in cycles to adjust to learners’ changing preferences over time. This model assumes the pre-existence and classification of learning content by the pedagogical designers.

Phase [1]: Data collection and analysis from learning/cognitive styles instruments

This phase involves the learners filling out the respective instruments that will classify them into the different learning/cognitive styles that currently exist. This can be carried out through web-based questionnaires administered as a pre-requisite before starting the course. The information collected is stored in a central learner profile repository. Contradictory answers to redundant questions are identified and appropriate action (such as a follow-up interview) is taken to improve the accuracy of the information collected.

Phase [2]: Logging students’ navigation history and feedback

Once student profiling has been carried out, students can start navigating through the content (in either the adaptable or the adaptive configuration), and each link (content item) they navigate to and provide feedback on is recorded to form a documented learning path for each student. In the adaptable configuration, content on which the learner does not provide feedback is discarded from the student’s learning path; it is assumed that the learner visited that content either “accidentally” or “out of curiosity”. Even if a learner is engaged in a learning process where the configuration mode is set to adaptable, an ‘expected’ learning path will be generated using the adaptive configuration, which runs as a hidden layer.

In the adaptive configuration, the learner has to give his/her feedback before moving on to the next learning content. The feedback he/she gives will be based on questions devised after careful analysis of the answers in the questionnaire he/she filled in during Phase 1.

Phase [3]: Analysis of navigation history and student feedback

Once the learner completes a learning unit/course, his/her navigational patterns are displayed. If he/she was in the adaptable configuration, the actual learning path is compared with the one predicted by the adaptive mode. His/her feedback on each piece of learning content is analyzed and compared with his/her original answers. In addition, group feedback is used to determine whether the student profiles need to be adjusted or whether the learning content metadata needs to be reviewed. This is where analysis of individual answers helps to refine the initial classification granularity, in which learners are grouped into categories such as ‘reflectors’ and ‘pragmatists’, and to provide a more accurate profile of the learner.
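
As a sketch of this phase, the code below compares an actual learning path with the path the adaptive mode would have proposed, and uses group feedback to flag content whose metadata may need review. The overlap measure, the rating scale and the review threshold are assumptions for illustration only.

# Sketch of the Phase 3 comparison: how far did the learner's actual path
# deviate from the path the adaptive configuration would have proposed, and
# does group feedback point at the content metadata rather than the learner?
# The overlap measure and the review threshold are illustrative assumptions.

def path_overlap(actual, expected):
    """Share of the expected path that the learner actually followed."""
    if not expected:
        return 1.0
    return len(set(actual) & set(expected)) / len(set(expected))

def flag_content_for_review(group_feedback, threshold=2.5):
    """Content rated poorly by the whole group suggests a metadata problem,
    not a profiling problem."""
    return [cid for cid, ratings in group_feedback.items()
            if ratings and sum(ratings) / len(ratings) < threshold]

actual = ["intro-video", "case-study", "quiz"]
expected = ["intro-video", "concept-text", "quiz"]
print(path_overlap(actual, expected))       # ~0.67 -> profile may need adjusting

group = {"concept-text": [2, 1, 3, 2], "intro-video": [5, 4, 4]}
print(flag_content_for_review(group))       # ['concept-text'] -> review metadata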

Phase [4]: Evaluation of learners’ performance and learning outcomes

A variety of techniques can be used to evaluate learner performance in the learning units/activities and to check whether the learning objectives have been met. Viva-voce examinations, written assignments and examinations, as well as the level of understanding attained, are all indicators of the degree of successful adaptation to the learners’ preferences. After this phase, the cycle starts again, with either the same batch of learners moving on to a higher level of education or new batches of learners joining the same programme.

Discussion

The issue of incorporating learning styles into the design of instruction arose initially in traditional classroom settings. With the emergence of web-based instruction, a number of researchers have focused on the possibility of extrapolating the concept to learners engaged in online learning. Most of the experiments focused on one particular style and were based on rigid ‘if-then-else’ statements, limiting the flexibility of the system and conflicting with the finding, established by previous researchers, that learning styles do change over time.

The real issue when it comes to the design of personalized learning environments is not about measuring the student’s preferred learning style on a perfectly accurate scale. The important thing is to obtain an initial profile of the learner that is as accurate as possible. The other important aspect is to design and deliver content in a format that is appropriate to each learner’s preferences. There cannot be a combination of content and methodology that perfectly matches every learner preference, and there will not be a student with only one learning preference who fails to achieve the intended learning outcomes solely because the learning content or teaching method did not meet his preferences. The aim, therefore, is to enhance students’ learning experience by giving them, as far as possible, content in a form that matches their learning preferences.

The “iterative analysis of learner interaction and feedback” helps to address the issue of changing learning styles as well as that of the subjective completion of self-report instruments. This can help improve learner profiling in online learning environments as well as provide adequate grounds to determine any adjustments needed in learning content profiling. The method presented is an important layer of an instructional specification for the design of personalized learning environments. Such an instructional specification is a multi-layered structure consisting of:

  • A pedagogical framework for the design of learning activities and scenarios. It also provides information with respect to actors involved in the learning environment. Such a framework follows an activity-theoretical approach for the learning activities and constitutes level 1 of the learning design framework.
  • A process model, as elaborated in this paper, consisting of appropriate mechanisms for student and learning content profiling. It also consists of records of student navigation history, interactions with learning content and learner perceptions of the content. This constitutes level 2 of the learning design framework.
  • An adaptation mechanism consisting of a fuzzy algorithm to match the most appropriate content and methods based on the learner profile stored in the system. It decides learning object selection and sequencing, and determines the proposed learning path for the learner for a particular learning activity (a minimal sketch of this idea follows the list). This is level 3 of the learning design framework.
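
The sketch below ranks candidate learning objects by how well their modality metadata matches a graded learner profile instead of applying crisp if-then-else rules. The metadata fields, the scores and the scoring rule are illustrative assumptions and do not reproduce the fuzzy algorithm of Santally and Senteni (2005).

# Sketch of the level-3 idea: rank candidate learning objects by how well their
# modality metadata matches the learner's graded (fuzzy) preference scores,
# instead of using crisp if-then-else rules. Metadata fields and the scoring
# rule are illustrative assumptions, not the published fuzzy algorithm.

def match_score(profile, lo_metadata):
    """Agreement between preference scores (0..1) and the degrees to which a
    learning object exhibits each modality (also 0..1)."""
    keys = set(profile) & set(lo_metadata)
    if not keys:
        return 0.0
    return sum(min(profile[k], lo_metadata[k]) for k in keys) / len(keys)

def propose_path(profile, candidates, length=3):
    """Select and sequence the best-matching objects for one learning activity."""
    ranked = sorted(candidates.items(),
                    key=lambda kv: match_score(profile, kv[1]), reverse=True)
    return [cid for cid, _ in ranked[:length]]

profile = {"visual": 0.9, "text": 0.3, "interactive": 0.6}
candidates = {
    "lo-video-demo":  {"visual": 0.8, "text": 0.1, "interactive": 0.2},
    "lo-reading":     {"visual": 0.1, "text": 0.9, "interactive": 0.0},
    "lo-simulation":  {"visual": 0.5, "text": 0.2, "interactive": 0.9},
    "lo-worked-text": {"visual": 0.2, "text": 0.7, "interactive": 0.3},
}
print(propose_path(profile, candidates))
# ['lo-simulation', 'lo-video-demo', 'lo-worked-text'] for the assumed scores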

Personalization in web-based learning environments using the learning styles approach also implies that a variety of content targeting similar learning outcomes should be available in a variety of formats. With the increasing availability of web-based resources such as open educational resources and learning object repositories, there is a real possibility of having content in multiple modalities. However, the lack of conformance to learning metadata standards and the high granularity level of some content can be a constraint on the reuse, interoperability and compatibility of the learning content.

While the learner’s profile can be obtained through methods such as self-assessment, think-aloud protocols, iterative interaction and feedback, or through automated techniques like feed-forward neural networks, it is a much more difficult process (in practical terms) for pedagogical experts to spend a significant amount of their time initially evaluating content and filling in content metadata related to personalization factors. This also implies, de facto, that learning object design must be looked at (or re-looked at) very carefully to match a variety of learners. Learning activities must be re-engineered to fit, for instance, field-dependent learners who, as previously stated, often suffer from the ‘impersonal’ nature of some types of learning environments (Luk, 1998) and are sensitive to the extent of structure in the instructional package.

Finally, as correctly argued by Hall and Moseley (2005), the outcome of engaging with style should be strategy and that the “goal of ‘personalized education’ or ‘learning to learn’, whether couched as learner agency or learner autonomy, is simply freedom, and descriptions of learning style should be tools to break chains of habit and limitation”.

Styles, as described in this paper, can therefore be the starting point of an iterative process that results in the identification of relevant teaching and learning strategies to focus and motivate the learner to achieve established performance objectives. Furthermore, the concept of selecting the ‘most appropriate’ learning material and strategies tallies with the idea that students should be encouraged, especially in online learning environments, to develop and adapt to other styles of learning engagement.

Conclusion

It is clear that while a number of critiques have been levelled at the validity of learning style instruments, there are no alternative constructs that address these issues. On the other hand, trends in research suggest that learning and cognitive styles are attracting the attention of researchers, with particular reference to online learning environments. This paper presented a process model that allows the learner, in an iterative way, to improve his/her profile and preferences through his/her own interaction with the learning content and perception of the learning experience. The process model fits into the wider context of an instructional design specification for personalized web-based learning.

References

Atherton, J. S. (2002). Learning and Teaching: Learning from experience [On-line]: UK: Available: http://www.dmu.ac.uk/~jamesa/learning/ Retrieved 20 June 2004

Barbe, W., & Milone, M. (1980). Modality. Instructor, 89(6), 44-46.

Butler, T., & Pinto-Zipp, G. (2005). Students’ learning styles and their preferences for online instructional methods. Journal of Educational Technology Systems, 34(2), 199-221

Ford, N. (2000). Cognitive Styles and Virtual Environments. Journal of the American Society for Information Science, 51(6), 543-557.

Freedman, R. D., & Stumpf, S. A. (1978). What can one learn from the Learning Style Inventory? Cited in Kinshuk (1996). Computer-Aided Learning for Entry-Level Accountancy Students. PhD Thesis. De Montfort University, United Kingdom.

Garner, I. (2000). Problems and Inconsistencies with Kolb’s Learning Styles. Educational Psychology. 20 (3). 341-348.

Hall, E., & Moseley, D. (2005). Is there a role for learning styles in personalised education and training? International Journal of Lifelong Education, 24 (3), 243–255

Honey, P. & Mumford, A. (1986). Using your learning styles. Maidenhead. Honey Publications

Kinshuk (1996). Computer-Aided Learning for Entry-Level Accountancy Students. PhD Thesis. De Montfort University, United Kingdom.

Kolb, D. (1984). Experiential Learning. Prentice-Hall, Englewood Cliffs, NJ.

Lo, J., & Shu, P. (2005). Identification of learning styles online by observing learners’ browsing behaviour through a neural network. British Journal of Educational Technology. 36 (1). 43-55.

Luk, S. C. (1998). The relationship between cognitive style and academic achievement. British Journal of Educational Technology. 29 (2). 137-147

Reijo, M. (2000). The concept of experiential learning and John Dewey's theory of reflective thought and action, International Journal of Lifelong Education, 19 (1), January-February, 54-72

Ross, J., & Schulz, R. (1999). Can computer-aided instruction accommodate all learners equally? British Journal of Educational Technology. 30 (1). 5-24

Santally, M., & Senteni, A. (2005). A Learning Object Approach to Personalised Web-Based Instruction. European Journal of Open and Distance Learning. [Online] Available: http://www.eurodl.org/materials/contrib/2005/Santally.htm

Terrell, S., & Dringus, L. (2000). An investigation of the effect of learning style on student success in an online learning environment. Journal of Educational Technology Systems. 28(3). 231-238.

Veenman, M., Prins, F., & Verheij, J. (2003). Learning styles: Self-reports versus thinking-aloud measures. Educational Psychology. 24(4). 532-548.

Wilson, D. K. (1986). An investigation of the properties of Kolb's learning style inventory. Cited in Kinshuk (1996). Computer-Aided Learning for Entry-Level Accountancy Students. PhD Thesis. De Montfort University, United Kingdom.

Zwanenberg, V. N., Wilkinson, L. J., & Anderson, A. (2000). Felder and Silverman’s Index of Learning Styles and Honey and Mumford’s Learning Styles Questionnaire: how do they compare and do they predict academic performance? Educational Psychology, 20(3), 365-380.

About the Author

Mohammad Issack Santally (m.santally@uom.ac.mu) is a senior lecturer in Educational Technology at the University of Mauritius and is currently in charge of the Virtual Centre for Innovative Learning Technologies, which he joined 8 years ago as an instructional designer. He is also involved in the SIDECAP research project on the concept of distributed education, funded by the EU-ACP consortium and led by the Open University of the UK. His research interests are personalised learning environments and the use of open educational resources in online education and innovative pedagogical design.


 