Editor’s Note: The computer is frequently used in laboratory instruction to guide students step-by-step as they learn a complex activity. The computer has many advantages: it can combine text with color images and sound; it can be paced by the student on a step-by-step basis, and steps are easily repeated; it can simulate the experiment so that the student can practice prior to conducting it. It can even be used as a replacement for an activity that would otherwise be too time-consuming, costly, or dangerous for the student to perform without expert assistance. Add to this the advantage that the computer can serve as a management system to monitor, even measure, student performance. This paper discusses options for the design and development of such software by a professor in India.

Development of Computer Software Support for Undergraduate Electronics Science Laboratory
Practical Learning

Yogendra Babarao Gandole

Abstract

The use of computer-based resources in supporting the teaching of an electronics laboratory course is described, where the course has been enhanced to develop skills in experimental design and data analysis and to strengthen links to the theoretical parts of the subject. In particular, a computer software package comprising laboratory demonstrations, background theory, worked examples and sample data was created using a “mixed” design methodology. The “Electronics experiments” package is described in detail. The package contains a course management element, which allows students to be assessed on their understanding of the background theory and their competence to carry out experiments. It is concluded that interactive computer software packages can be routinely customised to meet the individual needs of teaching and learning situations.

Introduction

Electronics is a laboratory-based course designed to expose students to basic analog and digital devices and circuits. Students are encouraged to discover concepts through laboratory experiences and apply their knowledge to problems and projects. 

Laboratory work should aim to encourage students to gain:

  • Manipulative skills

  • Observational skills

  • Ability to interpret experimental data

  • Ability to plan experiments

  • Interest in the subject

  • Enjoyment of the subject

  • A feeling for the reality of the phenomena discussed in theory

The role of the laboratory is central in undergraduate electronics courses, since students must construct their own understanding of electronics ideas. This knowledge cannot simply be transmitted by the teacher; it must be developed by students in interactions with nature and the teacher. Meaningful learning will occur where laboratory activities are a well-integrated part of a learning sequence.

It is not clear that our existing courses provide opportunities for students to develop all these skills. Indeed, Johnstone (1997) suggests that “it is possible to reach the end of a laboratory period having learned nothing with the exception of some hand skills.” Masson and Lawrenson (1999) used computer testing to highlight problem areas in first-year laboratory classes and concluded that the problems relate to poor understanding of background theory and general scientific concepts, and to difficulty in dealing with experimental results. This conclusion is consistent with our own observations of first-year students working in the electronics laboratory.

Students at the beginning of a university course face a number of problems in the electronics laboratory. In the first place, they lack experience with the procedures concerned, and the accompanying lack of manipulative skills often means that the data they collect are of poor quality.

Furthermore, the amount of time available is too short for them to collect sufficient data for interesting analysis or to repeat observations in an attempt to improve their technique through practice. Tutors often respond to these problems by providing a detailed protocol for the students to follow; this is designed to minimize faults in technique and to ensure that data are collected under optimal conditions. One consequence is that students carry out the manipulations mechanically and without thought; this lack of engagement means that students gain little inspiration from laboratory work and lose faith in the theory which underpins it. Various suggestions have been made to deal with these problems. For example, Mishra (2000) reported some success following the introduction of a number of changes into first-year electronics practicals, including laboratory exercises involving audio-video learning and data analysis. Nicholls (1999) designed computer software for use as pre-lab exercises in the inorganic chemistry laboratory. Garratt et al. (1997) prepared computer simulations which they claimed to be useful both as pre-lab exercises and as a means for students to learn about experimental design and data interpretation.

This paper reports on research related to two issues. The first involves the extent to which properly designed computer software can serve as a cognitive bridge that helps electronics students in the development of the target instructional model. The second examines the mutual roles that the laboratory and computer software tools can play in the learning process.

Design Methodology

The first step in designing an application is to decide which design methodology to use. For the vast majority of computer applications it is hard to say that one correct implementation is better than another. There is always a certain amount of artistry involved, and choosing between two correct implementations usually involves subjective arguments based on the preferences of the evaluator. In the investigator's belief, such artistry extends beyond the choice of implementation into the choice of design methodology: one developer may prefer one method for developing a certain type of application while another prefers a different one. In a project such as this, where only one programmer is involved, the design methodology to use is the one with which the programmer can achieve results most confidently. It does not have to agree exactly with any well-known methodology.

The investigator felt most confident with a “mixed” design methodology, selectively using elements of top-down and bottom-up design according to the task at hand. Overall, the development structure could be called evolutionary, with some rapid prototyping. Most professional programmers known to the investigator use some variation of this technique when designing small to medium-sized applications; it allows the flexibility of matching a particular style to the task in hand.

Perhaps the best way to illustrate the method is to go through the design process of the project in roughly chronological order.

It has to be said that the design did not occur wholly before the implementation; rather, it was an ongoing process throughout the implementation. The separation of design from implementation is inherent to the Waterfall method, a method widely regarded as a nice idea, perhaps even the ideal, but not something that accurately reflects programming reality. A design is not properly assessed until it is implemented, by which time, according to the Waterfall method, it is too late to change it. This is even more apparent in experimental projects such as this one, where there is little readily available knowledge as to which design is correct because nobody has implemented such a project before. The investigator's technique for designing this project, believed to be the best one for experimental, time-constrained projects, was to design a part of the project, implement it, assess the design, redesign if necessary, and then move on to the next part.

Steps in Software Development

  • Clear understanding of the problem: problem specification.

  • Careful solution design, paying attention to all the constraints: algorithm design.

  • Transformation of the algorithm into program code: abstraction and coding.

  • Complete debugging: error removal.

  • Thorough testing: verification.

  • Maintenance dictated by environmental changes.

Problem specification

The essential function of laboratory instruction is to teach the theory and process of experimentation. Computer software may be found suitable for communicating the process of laboratory work effectively to students. The aims of the computer software support are to:

  1. Communicate the basic knowledge (theory) related to practical work in electronics.

  2. Assist the students in selecting the measuring instruments and electronic components required to perform an experiment in the laboratory.

  3. Develop the competency of assembling practical electronic circuits.

  4. Communicate the procedure of an experiment.

  5. Communicate a demonstration of an experiment.

  6. Reduce the labor of calculation and obtain accuracy in design, results, etc. (see the sketch below).
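To make aim 6 concrete, the following minimal sketch shows how a calculation module might wrap standard design formulas so that the student is spared manual arithmetic. Python is used here purely for illustration (the references suggest the original package was built with Visual Basic 6); the function names and component values are this illustration's assumptions, not code from the package.

    # Hypothetical sketch of a calculation helper such a package might include.
    # The formulas are standard electronics; the names are illustrative only.

    def voltage_divider(v_in: float, r1: float, r2: float) -> float:
        """Output voltage of a two-resistor divider: Vout = Vin * R2 / (R1 + R2)."""
        return v_in * r2 / (r1 + r2)

    def parallel_resistance(*resistors: float) -> float:
        """Equivalent resistance of resistors in parallel: 1/R = sum(1/Ri)."""
        return 1.0 / sum(1.0 / r for r in resistors)

    if __name__ == "__main__":
        # A 9 V supply divided by 10 kOhm and 4.7 kOhm gives about 2.88 V.
        print(round(voltage_divider(9.0, 10e3, 4.7e3), 2))
        # Two 1 kOhm resistors in parallel give 500 Ohm.
        print(parallel_resistance(1e3, 1e3))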

Design

A) Output Design

Output generally refers to the results and information generated by the application. For many end-users, output is the main reason for developing the system and the basis on which they evaluate the usefulness of the application. When designing output, the investigator:

  1. Determined what information to present.

  2. Decided whether to display, print, or “speak” the information and selected the output medium.

  3. Arranged the presentation of information in an acceptable format.

  4. Decided how to distribute the output to intended recipients.

B) Design of Input

The design of input includes the following details:

1.  What data to input.

2.  What medium to use.

3.  How the data should be arranged or coded.

4.  The dialog to guide users in providing input.

5.  Data items and transactions needing validation to detect errors.

6.  Methods for performing input validation and steps to follow when errors occur.

The design decisions for handling input specify how data are accepted for computer processing.
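As a hedged sketch of items 5 and 6 above, the routine below accepts a typed measurement, validates it against a plausible range, and guides the user when the entry is invalid. Python is used purely for illustration; the prompt, field, and range are invented for this example and are not taken from the author's package.

    # Hypothetical sketch of the input-validation design described above.

    def read_positive_float(prompt: str, low: float, high: float) -> float:
        """Keep prompting until the user enters a number within [low, high]."""
        while True:
            raw = input(prompt).strip()
            try:
                value = float(raw)
            except ValueError:
                # Error-handling step: tell the user what went wrong, then retry.
                print(f"'{raw}' is not a number; please try again.")
                continue
            if not (low <= value <= high):
                print(f"Value must be between {low} and {high}.")
                continue
            return value

    if __name__ == "__main__":
        # Illustrative field: a resistor value between 1 Ohm and 10 MOhm.
        r = read_positive_float("Resistor value (ohms): ", 1.0, 10e6)
        print(f"Accepted: {r} ohms")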

C) Design of Procedure

Procedures specify what tasks must be performed in using the application. The important procedures include:

  • Data entry procedures: Methods for capturing input data and entering it into the system.

  • Run-time procedures: Steps and actions taken by end-users interacting with the application to achieve the desired results.

  • Error-handling procedures: Actions to take when unexpected results occur.

 D) Design of Program Specification

Designing computer software is important to ensure that:

  • The actual programs produced perform all tasks and do so in the manner intended.

  • Structuring the software into modules permits suitable testing and validation to be sure procedures are correct.

  • Future modifications can be made in an efficient manner.

E) Stages in the development of the simulation option

  • The object, system or phenomenon was analyzed and ample data collected about it.

  • A model was designed based on the data and the purpose of the simulation

  • A software model was created

  • The model was tested against the real life equivalent. This process is called validation.

  • The model was improved and further testing carried out until it behaved as much like the real thing as possible.

  • Then the model was used to examine unknown situations.
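The cycle above can be illustrated with a small, self-contained sketch: an RC charging circuit is modelled, and the model is validated against sample measurements. The component values and the "measured" data are invented for this illustration; a real validation would compare the model against data collected in the laboratory.

    import math

    # Hypothetical sketch of the model-validate cycle, using RC charging as the
    # modelled phenomenon. All values here are invented for illustration.

    def rc_charge_voltage(v_supply: float, r: float, c: float, t: float) -> float:
        """Capacitor voltage at time t: V(t) = Vs * (1 - exp(-t / RC))."""
        return v_supply * (1.0 - math.exp(-t / (r * c)))

    def validate(model, measured, tolerance=0.1):
        """Compare model output with measured (t, v) pairs; report worst error."""
        worst = max(abs(model(t) - v) for t, v in measured)
        return worst, worst <= tolerance

    if __name__ == "__main__":
        vs, r, c = 5.0, 10e3, 100e-6                        # 5 V, 10 kOhm, 100 uF
        measured = [(0.5, 1.95), (1.0, 3.15), (2.0, 4.35)]  # invented lab data
        worst, ok = validate(lambda t: rc_charge_voltage(vs, r, c, t), measured)
        print(f"worst deviation {worst:.3f} V -> {'valid' if ok else 'refine model'}")

If the worst deviation exceeds the tolerance, the model is refined and the test repeated, exactly as the stages above describe.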

Coding

After designing the user interface, the next stage was to decide how each control reacts to user actions such as mouse clicks and keystrokes. The application does not determine the flow of execution; instead, the events caused by the user determine the flow of the application. Coding is therefore essential to react to the various external conditions (events), and the user's actions determine the application's flow.
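A minimal sketch of this event-driven structure, assuming a Tkinter front end (the references suggest the original package was built with Visual Basic, whose form-and-event model is analogous): the program defines handlers and then surrenders control to the event loop, so the user, not the program, drives the flow.

    import tkinter as tk

    # Hypothetical sketch only; the widget names and labels are invented.

    def on_run_clicked():
        """Handler invoked only when the user clicks the Run button."""
        status.config(text="Experiment demonstration started...")

    root = tk.Tk()
    root.title("Electronics experiments (sketch)")

    tk.Button(root, text="Run experiment", command=on_run_clicked).pack(padx=20, pady=10)
    status = tk.Label(root, text="Waiting for user action")
    status.pack(padx=20, pady=10)

    # mainloop() hands control to the event loop: nothing happens until an event
    # (mouse click, keystroke) fires, which is the point made in the text above.
    root.mainloop()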

Testing

Testing is the major quality-control measure employed during software development. Its basic function is to detect errors in the software. During requirements analysis and design, the output is usually textual and non-executable; after the coding phase, computer programs are available that can be executed for testing purposes. This implies that testing has to uncover not only errors introduced during coding but also errors introduced during the previous phases. Thus, the goal of testing is to uncover requirements, design, or coding errors in the programs.

The starting point of the testing was unit testing. Here each module was tested separately, often by the coder simultaneously with the coding of the module. The purpose was to exercise the different parts of the module code to detect coding errors. After this, the modules were gradually integrated into subsystems, which were then integrated to eventually form the entire application. After the system was put together, application testing was performed: the application was tested against the requirements to see whether all requirements were met and the application performed as specified.
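A hedged sketch of the unit-testing stage: a single module (here an illustrative calculation routine, not the author's actual code) is exercised in isolation with Python's standard unittest framework before being integrated with other modules.

    import unittest

    # Hypothetical function under test; invented for this illustration.

    def series_resistance(*resistors: float) -> float:
        """Equivalent resistance of resistors in series."""
        if any(r < 0 for r in resistors):
            raise ValueError("resistance cannot be negative")
        return sum(resistors)

    class TestSeriesResistance(unittest.TestCase):
        def test_two_resistors(self):
            # 1 kOhm + 2.2 kOhm in series should give 3.2 kOhm.
            self.assertEqual(series_resistance(1e3, 2.2e3), 3.2e3)

        def test_rejects_negative_values(self):
            with self.assertRaises(ValueError):
                series_resistance(-5.0)

    if __name__ == "__main__":
        unittest.main()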

Software Tryout (Initial Stages)

A questionnaire was designed by the investigator to evaluate the quality of the computer software program. Fourteen statements (items 1-14 in Table 1) refer to the technical aspects of the software; their aim was to evaluate its technical adequacy to the learning objectives of the program, and they deal mainly with the general structure of the product, navigation, interactivity, design, and other aspects that can favour or hinder the learning process. The remaining 27 statements (items 15-41) refer to curricular design aspects and usefulness, and are intended to evaluate how well the program can be integrated into the learning process of an electronics practical.

Students responded on a five-point scale. Forty-five students (15 from each of the FY, SY and TY B.Sc. classes) offering the electronics subject at B.Sc. level were selected randomly for this piloting and divided into three batches. The investigator first demonstrated each module of the software over the LAN to each batch, and the students were then asked to operate each module freely. Finally, the questionnaire was given to every student, with the instruction to write '1' for very bad or no, '2' for bad or sometimes, '3' for acceptable or average, '4' for good or almost always, and '5' for very good or yes. Students who responded 4 or 5 were grouped together, indicating good quality of the computer software; students who responded 3, 2 or 1 were grouped together, indicating poor quality requiring modification. The counts were then converted into percentages. The consolidated percentages for every statement are given in Table 1.
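The scoring rule just described is simple to mechanise. The sketch below counts responses of 4 or 5 as positive and everything else as negative, then converts the counts to percentages; the sample responses are invented, chosen so that the 38-of-45 split reproduces the 84.44% / 15.56% figures reported for item 1 of Table 1.

    # Hypothetical sketch of the tally described above; not the author's code.

    def positive_negative_percent(responses):
        """Return (% positive, % negative) for a list of 1-5 scale responses."""
        positive = sum(1 for r in responses if r >= 4)
        pct_pos = 100.0 * positive / len(responses)
        return round(pct_pos, 2), round(100.0 - pct_pos, 2)

    if __name__ == "__main__":
        # 38 of 45 invented responses are 4 or 5 -> (84.44, 15.56).
        sample = [5] * 20 + [4] * 18 + [3] * 4 + [2] * 2 + [1] * 1
        print(positive_negative_percent(sample))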

From the analysis of this piloting it was found that the computer software programme developed for laboratory communication was of very good quality.

Finally, acceptance testing was performed by demonstrating the software, on real practical data, to students, teachers, and experts in the electronics and computer fields.

Implementation

Even well-designed and technically elegant software can succeed or fail depending on how it is operated and used. Hence, those who will be associated with or affected by the software must know in detail what their role will be, how they can use the software, and what the software will or will not do. To this end, the investigator visited the selected colleges and provided detailed training in the operation of the software to the concerned teachers.

Table 1
Analysis of computer software tryout (initial stage)

Item No. | Statement | % positive response | % negative response

A. Technical-Instructive Adaptation: Interface Design (Screen design)

1 | The quantity of colour on screen is adequate for the sort of information contained | 84.44% | 15.56%
2 | The quantity of the images is adequate for the sort of information contained | 77.78% | 22.22%
3 | The sound quality level is adequate for the sort of information transmitted | 71.11% | 28.89%
4 | The quantity of graphics and images is adequate for the sort of information transmitted | 82.22% | 17.78%
5 | The resolution of graphics and images is adequate for the sort of information transmitted | 82.22% | 17.78%
6 | The text presentation on screen is adequate for the information transmitted | 97.78% | 2.22%

Access and control of the information

7 | The student has control over different parameters of presentation (colour, sound level, etc.) | 82.22% | 17.78%
8 | The program facilitates the paper printing of selected information by the student | 95.56% | 4.44%
9 | The program facilitates the navigation through the contents | 88.89% | 11.11%
10 | The program gives the student the possibility of modifying the information contained | 84.44% | 15.56%
11 | The interaction tools (buttons, menus, commands) facilitate the learning process | 86.67% | 13.33%
12 | The program, in general, is easy to use | 88.89% | 11.11%
13 | It is easy for the student to learn how to use the program | 88.89% | 11.11%
14 | The running of the program is adequate (there are no bugs which block it) | 86.67% | 8.89%

B. Didactic or Curricular Adaptation

B1. Learning contents

15 | Are clearly presented | 93.33% | 6.67%
16 | Emphasize the most important things | 77.78% | 22.22%
17 | Are sequenced | 93.33% | 6.67%
18 | The information is updated | 88.89% | 11.11%
19 | Are enough to achieve the objectives | 80.00% | 20.00%
20 | Are beneficial to the improvement of attitudes | 91.11% | 8.89%
21 | Are extra-laboratory activities | 91.11% | 8.89%
22 | Are free of grammar or spelling errors | 95.56% | 4.44%

B2. Learning activities

23 | Require different levels of mastery | 8.89% | 91.11%
24 | Follow a logical sequence in relation to the objectives | 84.44% | 15.56%
25 | The number of different activities is enough | 80.00% | 20.00%
26 | Allow different tries for answering | 91.11% | 8.89%
27 | Examples of the activities to be done are shown | 93.33% | 6.67%
28 | Examples are clear and adequate | 93.33% | 6.67%

B3. Evaluation

29 | The program is constantly evaluating the student's output | 84.44% | 15.56%
30 | Shows the student the errors he/she has made | 82.22% | 17.78%
31 | Provides specific help for the student's errors | 91.11% | 8.89%
32 | The feedback is immediate | 88.89% | 11.11%
33 | The feedback is motivating for the student | 86.67% | 13.33%
34 | The feedback provides clear and significant information | 88.89% | 11.11%
35 | Facilitates self-correction | 86.67% | 13.33%
36 | Constantly informs the student about his/her output | 82.22% | 17.78%

B4. Motivation

37 | The program increases the active involvement of the student in the laboratory task | 91.11% | 8.89%
38 | Students show a better interest in learning practicals | 93.33% | 6.67%

C. Usefulness

39 | Use it as self-instruction material | 88.89% | 11.11%
40 | Use it as complementary laboratory material | 68.89% | 31.11%
41 | The program makes it possible for the students to work in groups of two or three | 71.11% | 28.89%

 
Table 2
Opinion Scale (each of items 1-39 is answered Yes or No)

1. Was the software material relevant to the objectives of the electronics experiment?
2. Was the software interested in your progress in electronics?
3. Were the experiments presented in an interesting manner?
4. The laboratory demonstration contained instructions that were easy to follow.
5. What is required in the write-up of an experiment is clear.
6. The theory behind the experiments was clearly presented.
7. The simulation module made me feel I have the ability to continue in electronics science.
8. The laboratory demonstration, experimental techniques and write-up were all interlinked.
9. The experiments were interesting.
10. Time in practicals was spent effectively.
11. I felt free to use the software.
12. The software stimulated my interest in the subject area.
13. The software did its share in helping us to learn electronics experiments.
14. The software is user friendly.
15. The content of the software is fully self-instructional.
16. The interactive nature of the software made the experiment more interesting.
17. Does the software enhance your enjoyment of learning about the electronics subject?
18. Did the software help make the experiment concepts easy to understand?
19. Did the content of the software assume too much prior knowledge?
20. When needed, I found the written instructions and simulation to be helpful.
21. Did you have any problems gaining access to the software in the laboratory?
22. Were the instructions provided with the software adequate?
23. Does the software give sensible results?
24. Does the software help electronics practical learning?
25. Does the software add value over conventional practical methods?
26. Did the software save you any practical time?
27. Did you get a feeling of personal satisfaction from using the software?
28. Did the software meet the needs of your electronics practical?
29. Does the software make the student think about the subject matter?
30. Is the software relevant to the practical learning objectives?
31. Is the demonstration well laid out and of practical use?
32. Did you have any difficulty using any part of the software?
33. Would the software help you teach [the electronics experiment]?
34. Does the software support activities that are otherwise difficult to learn?
35. Does the software have the potential to add anything new to the students' learning experience that the traditional practical method would not provide?
36. Will students learn by using the software?
37. Would you recommend the software for teaching students about electronics practicals?
38. Do the tasks in the software engage the students?
39. Can the learner test out their ideas and receive feedback using the software?

Please use the following scale for item 40:

    5. Outstanding (among the top 10%)
    4. Excellent (among the top 30%)
    3. About average (middle 40%)
    2. Fair (in the lowest 30%)
    1. Poor (in the lowest 10%)
    0. Not applicable / don't know / there were none

40. Overall, I would rate this software: 5 / 4 / 3 / 2 / 1 / 0

 

Table 3 (a)
Students' opinion towards computer software (analysis)

Item No. | Positive responses (FY / SY / TY B.Sc. / Total) | Negative responses (FY / SY / TY B.Sc. / Total)

1 | 39 / 35 / 37 / 111 | 11 / 15 / 13 / 39
2 | 44 / 40 / 45 / 129 | 6 / 10 / 5 / 21
3 | 48 / 40 / 40 / 128 | 2 / 10 / 10 / 22
4 | 47 / 42 / 45 / 134 | 3 / 8 / 5 / 16
5 | 50 / 40 / 45 / 135 | 0 / 10 / 5 / 15
6 | 50 / 50 / 50 / 150 | 0 / 0 / 0 / 0
7 | 45 / 46 / 45 / 136 | 5 / 4 / 5 / 14
8 | 48 / 45 / 45 / 138 | 2 / 5 / 5 / 12
9 | 45 / 40 / 40 / 125 | 5 / 10 / 10 / 25
10 | 40 / 40 / 45 / 125 | 10 / 10 / 5 / 25
11 | 41 / 40 / 45 / 126 | 9 / 10 / 5 / 24
12 | 46 / 46 / 48 / 140 | 4 / 4 / 2 / 10
13 | 48 / 45 / 45 / 138 | 2 / 5 / 5 / 12
14 | 42 / 45 / 45 / 132 | 8 / 5 / 5 / 18
15 | 40 / 45 / 45 / 130 | 10 / 5 / 5 / 20
16 | 47 / 47 / 45 / 139 | 3 / 3 / 5 / 11
17 | 48 / 49 / 45 / 142 | 2 / 1 / 5 / 8
18 | 40 / 40 / 40 / 120 | 10 / 10 / 10 / 30
19 | 10 / 10 / 10 / 30 | 40 / 40 / 40 / 120
20 | 48 / 48 / 48 / 144 | 2 / 2 / 2 / 6
21 | 12 / 10 / 10 / 32 | 38 / 40 / 40 / 118
22 | 40 / 41 / 40 / 121 | 10 / 9 / 10 / 29
23 | 41 / 40 / 40 / 121 | 9 / 10 / 10 / 29
24 | 42 / 40 / 40 / 122 | 8 / 10 / 10 / 28
25 | 48 / 49 / 45 / 142 | 2 / 1 / 5 / 8
26 | 41 / 40 / 40 / 121 | 9 / 10 / 10 / 29
27 | 48 / 49 / 50 / 147 | 2 / 1 / 0 / 3
28 | 40 / 39 / 40 / 119 | 10 / 11 / 10 / 31
29 | 41 / 45 / 45 / 131 | 9 / 5 / 5 / 19
30 | 39 / 35 / 35 / 109 | 11 / 15 / 15 / 41
31 | 40 / 45 / 45 / 130 | 10 / 5 / 5 / 20
32 | 12 / 15 / 15 / 42 | 38 / 35 / 35 / 108
33 | 40 / 35 / 40 / 115 | 10 / 15 / 10 / 35
34 | 45 / 48 / 48 / 141 | 5 / 2 / 2 / 9
35 | 45 / 47 / 48 / 140 | 5 / 3 / 2 / 10
36 | 38 / 35 / 35 / 108 | 12 / 15 / 15 / 42
37 | 44 / 45 / 45 / 134 | 6 / 5 / 5 / 16
38 | 46 / 48 / 48 / 142 | 4 / 2 / 2 / 8
39 | 46 / 46 / 45 / 137 | 4 / 4 / 5 / 13

Item 40: Overall, I would rate this software

Rating | FY B.Sc. | SY B.Sc. | TY B.Sc.
5. Outstanding (among the top 10%) | 30% | 40% | 36%
4. Excellent (among the top 30%) | 50% | 40% | 44%
3. About average (middle 40%) | 20% | 20% | 20%
2. Fair (in the lowest 30%) | 0% | 0% | 0%
1. Poor (in the lowest 10%) | 0% | 0% | 0%
0. Not applicable / don't know / there were none | 0% | 0% | 0%

 

Table 3 (b)
Students' opinion towards computer software (percentage analysis)

Item No. | FY.B.Sc. (Positive / Negative) | SY.B.Sc. (Positive / Negative) | TY.B.Sc. (Positive / Negative)

1 | 78% / 22% | 70% / 30% | 74% / 26%
2 | 88% / 12% | 80% / 20% | 90% / 10%
3 | 96% / 4% | 80% / 20% | 80% / 20%
4 | 94% / 6% | 84% / 16% | 90% / 10%
5 | 100% / 0% | 80% / 20% | 90% / 10%
6 | 100% / 0% | 100% / 0% | 100% / 0%
7 | 90% / 10% | 92% / 8% | 90% / 10%
8 | 96% / 4% | 90% / 10% | 90% / 10%
9 | 90% / 10% | 80% / 20% | 80% / 20%
10 | 80% / 20% | 80% / 20% | 90% / 10%
11 | 82% / 18% | 80% / 20% | 90% / 10%
12 | 92% / 8% | 92% / 8% | 96% / 4%
13 | 96% / 4% | 90% / 10% | 90% / 10%
14 | 84% / 16% | 90% / 10% | 90% / 10%
15 | 80% / 20% | 90% / 10% | 90% / 10%
16 | 94% / 6% | 94% / 6% | 90% / 10%
17 | 96% / 4% | 98% / 2% | 90% / 10%
18 | 80% / 20% | 80% / 20% | 80% / 20%
19 | 20% / 80% | 20% / 80% | 20% / 80%
20 | 96% / 4% | 96% / 4% | 96% / 4%
21 | 24% / 76% | 20% / 80% | 20% / 80%
22 | 80% / 20% | 82% / 18% | 80% / 20%
23 | 82% / 18% | 80% / 20% | 80% / 20%
24 | 84% / 16% | 80% / 20% | 80% / 20%
25 | 96% / 4% | 98% / 2% | 90% / 10%
26 | 82% / 18% | 80% / 20% | 80% / 20%
27 | 96% / 4% | 98% / 2% | 100% / 0%
28 | 80% / 20% | 78% / 22% | 80% / 20%
29 | 82% / 18% | 90% / 10% | 90% / 10%
30 | 78% / 22% | 70% / 30% | 70% / 30%
31 | 80% / 20% | 90% / 10% | 90% / 10%
32 | 24% / 76% | 30% / 70% | 30% / 70%
33 | 80% / 20% | 70% / 30% | 80% / 20%
34 | 90% / 10% | 96% / 4% | 96% / 4%
35 | 90% / 10% | 94% / 6% | 96% / 4%
36 | 76% / 24% | 70% / 30% | 70% / 30%
37 | 88% / 12% | 90% / 10% | 90% / 10%
38 | 92% / 8% | 96% / 4% | 96% / 4%
39 | 92% / 8% | 92% / 8% | 90% / 10%

The training (referred to under Implementation above) included the following points:

  1. Computer literacy (fundamentals of computers and software).

  2. Installation of the software from CD (compact disc) with the help of the installation guide document.

  3. How to start the software, and familiarization with the run procedure, which involves working through the sequence of activities.

  4. A troubleshooting list that identifies possible problems and remedies for them.

  5. How to use the software for laboratory practical learning.

  6. Observation and collection of the data.

Finally, the effect of the computer software on the development of competency in performing an experiment was studied in this experimental research. The investigator used an opinionnaire (given as Table 2) to collect learners' opinions of the computer software support for laboratory communication. The opinionnaire contained 39 Yes/No items on various educational aspects, plus an overall rating item; the analysis of the data collected with it is given in Table 3.

Conclusion

The investigator developed the computer software support program for the experiments selected for study. The software and the associated instruments were pilot-tested and validated for use in collecting data for the study.

Most of the students in the experimental group were of the opinion that a demonstration of the laboratory experiment must be part of laboratory communication. The investigator therefore holds that an animation approach, including a demonstration of each experiment, must be part of the communication tool. When communicating a science subject through the distance education mode, it is necessary to arrange a contact programme or a laboratory workshop where the demonstration of an experiment is given by a teacher; only then can students perform the experiments.

The effectiveness of laboratory communication may increase further if animated demonstrations of experiments are included in the computer software support. Thus, while computer software media for laboratory communication undoubtedly support learners in their laboratory activities, they can never replace the role of the teacher in the laboratory.

The positive responses by students and staff to the material tested at the four colleges affiliated to Amravati University, Amravati (India), suggest that the difficulties of producing high-quality video on a PC with meaningful interactivity (self-learning and self-assessment) have been overcome. The challenge now shifts to universities: to facilitate a student culture in which all students have a CD-ROM PC on day one of their degree course, a teaching and learning culture in which students can proceed at their own pace, laboratory experiences in which students develop both practical and presentational skills, and staff development facilities whereby staff can produce the new IT materials for the 21st century.

References

Senn, James A. (1989). Analysis and Design of Information Systems (2nd edition). McGraw-Hill, New York.

Petroutsos, Evangelos (1998). Mastering Visual Basic 6. BPB Publications, New Delhi.

Brown, J. and Clement, J. (1989). Overcoming misconceptions via analogical reasoning: abstract transfer versus explanatory model construction. Instructional Science, 18, 237-261.

De Jong, T. and van Joolingen, W.R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Review of Educational Research, 68(2), 179-201.

Duit, R. and Treagust, D. (1998). Learning in science: from behaviorism towards social constructivism and beyond. In B. Fraser and K. Tobin (eds.), International Handbook of Science Education. Dordrecht, Netherlands: Kluwer, 3-25.

Goldberg, F. (1997). Constructing physics understanding in a computer-supported learning environment. In Rigden, J. (ed.), Proceedings of the International Conference on Undergraduate Physics Education, Volume II. American Institute of Physics.

Hutchins, E. (1995). Cognition in the Wild. Cambridge, MA: MIT Press.

Otero, V. (2001). The Process of Learning about Static Electricity and the Role of the Computer Simulator. Unpublished doctoral dissertation, San Diego State University.

Mishra, R.A. (2000). Development and Tryout of Audio/Video Support for Electronics Experiments at UG Level (p. 40). Research article, Y.C.M.O.U., Nashik.

Best, John W. (1978). Research in Education (3rd edition). Prentice-Hall of India Private Ltd., New Delhi.

Campbell, J.O. and Lison, C.A. (1995). Using computer and video technologies to develop interpersonal skills. Computers in Human Behaviour, 11(2), 223-239.

Capper, J. and Copple, C. (1985). Computer Use in Education: Research Review and Instructional Implications. Washington, DC: Center for Research into Practice.

Casey, C. (1996). Incorporating cognitive apprenticeship in multi-media. Educational Technology Research and Development, 44(1), 71-84.

Casey, P. (1997). Computer programming: a medium for teaching problem solving. Computers in the Schools, 13(1-2), 41-51.

Dewhurst, D.G. and Meehan, A.S. (1993). Evaluation of the use of computer simulations of experiments in teaching undergraduate students. British Journal of Pharmacology, Proceedings Supplement, 108, 238P.

Dodge, B. (1991). Computers and creativity: tools, tasks, and possibilities. Communicator: The Journal of the California Association for the Gifted, 21(1), 5-8.

Garcia, J.R. (1995). Use of technology in developing problem-solving/critical thinking skills. Journal of Industrial Technology, 11(1), 14-17.

Gokhale, A.A. (1996). Effectiveness of computer simulation for enhancing higher order thinking. Journal of Industrial Teacher Education, 33(4), 36-46.

Vedanayagam, E.G. (1988). Teaching Technology for College Teachers. Sterling Publishers Private Ltd., New Delhi.

Woolf, B. and Hall, W. (1995). Multimedia pedagogues: interactive systems for teaching and learning. IEEE Multimedia, 74-80.
 

About the Author

Yogendra Babarao Gandole

 

Yogendra Babarao Gandole is a Lecturer in the Department of Electronics, Adarsha Science, J.B. Arts and Birla Commerce Mahavidyalaya, Dhamangaon Rly. - 444 709, India.

Mr. Gandole has an M.Sc. in Applied Electronics from Amravati University, an M.Sc. in Electronics from Y.C.M.O.U., Nashik, and an A.D.C.S.S.A.A. from the Bombay Technical Board, Mumbai.

In 2000, he presented “Information Technology for Masses and Globalisation” at the national seminar on Information Technology - Current Trends at the Department of Computer Science and Energy, Amravati University.

E-mail: ygandole@indiatimes.com 
Office Phone: (07222)  -  237045
