Internet-based formative prior knowledge assessment

Rob Martens & Henry Hermans
Otec, Open University of the Netherlands, The Netherlands

© 1999


Abstract
Introduction
Instruments
    Prior knowledge tests
    Research instruments
    Research procedure
Subjects
Results
Discussion and conclusion
References


Abstract

Many investigations have shown that the prior knowledge state is one of the most influential factors and an important predictor of subsequent study outcomes. This research focuses on the evaluation of a set of electronic instruments designed to make potential students more aware of their prior knowledge. This might lead to better-underpinned decisions about study plans at the OUNL.

The electronic intake instruments considered in this study are related to:

  • acquaintance with the level of the content and the way of studying (excursions within content domains)
  • the prior knowledge state in mathematics
  • the prior knowledge state in mastery of Dutch
  • the prior knowledge state in mastery of English

For a period of five months, prospective students who visited the public Internet site of the Open University of the Netherlands (www.ou.nl/info-alg-index/index.htm) were given the possibility, by means of an electronic form, to express their opinion about the intake instruments provided. We received 151 valid responses.

In this research we mainly focussed on student use and appreciation. The respondents' profiles did not differ from those of regular subscribers at the OUNL. It appears that within this relatively highly educated target group, the traditional bias of Internet users (e.g., young and male) has disappeared.

Both use and appreciation can be considered remarkably high. When compared to other ways of getting information about studying at the OUNL, the use was high. The appreciation was positive: ninety percent of the respondents considered the instruments to be useful.


Introduction

This article is about the use of various prior knowledge tests to support persons interested in a study at the Open University of the Netherlands (OUNL). To foster an independent learning process for its adult students, the OUNL develops sets of self-study learning materials for distance education. Its education is increasingly directed towards competence-based education, putting emphasis on collaborative and problem-based learning via the use of ICT. De Wolf (1994, p. 1557): "Distance education is described by a variety of labels such as ‘correspondence education’, ‘home study’, ‘independent study’, ‘external studies’, ‘distance teaching’ and ‘open learning’." We define open learning systems as "flexible learning systems which support a learning process in which there is a delicate balance between a maximal student freedom of choices and the optimal adaptation to characteristics of the learner, in order to guarantee a maximal degree of study success" (Valcke, Dochy & Daal, 1991, p. 3). These forms of learning are increasingly influencing traditional education (e.g., Sorensen, 1995; Sewart, 1995). Former ‘traditional’ universities tend to use more and more ICT, and thereby the difference between ‘traditional’ universities and ‘distance’ universities seems to decrease.

Alongside its advantages, distance education obviously has disadvantages as well. Some of the most important of these are the relative lack of support, guidance and interactivity, the fact that course material is often static and is not tailored to meet the needs of the users, and the lack of interim adjustments to take account of what students actually do. It can be argued that these are critical problems, which are at least partly to blame for the difficulties students encounter in the process of self-study (Martens, 1998a, b). Such problems may express themselves in the form of students falling behind in their studies or dropping out. Distance education attempts to solve these problems by making use of embedded support devices in (written and electronic) course material (Rowntree, 1990; Martens & Valcke, 1995; Martens, Valcke, Poelmans & Daal, 1996). However, errors, incorrect assumptions, unrealistic planning and ineffective study methods are still hard to detect and correct.

A problem often encountered is that students tend to have unrealistic beliefs about their possibilities to study at a distance. Boon, Janssen, Joosten, Liefhebber, & Poelmans (1995) report the following: 99% of students enrolling for a course at the OUNL indicate that they want to take the examination. Many of them (46%) wish to do so because they are working towards a degree. After a year, we see the following: 28% have received the course certificate (meaning that they took and passed the examination), 12% took the examination but failed, 11% did not even start the course and 49% started the course but quit before the first examination. Although the latter group might eventually take the examination, these figures indicate a striking difference between what students believe they can do at the start of a course and what they are ultimately able to achieve after a considerable period of time.

In line with the view that assessment can be used as a tool for learning (e.g., Dochy & McDowell, 1997; Askham, 1997; Moerkerke, 1996), a way to solve these problems is to make students more aware of the knowledge and skills required for their studies (Taylor, 1998). The most important aspect of this is making students aware of their prior knowledge, which might lead to better-underpinned decisions about study plans at the OUNL. Many investigations have shown that the prior knowledge state is one of the most influential factors and an important predictor of subsequent study outcomes (e.g., House, 1995; Land & Hannafin, 1996). Vosniadou (1996, p. 102): ‘In the context of cognitive psychology, the construct that seems capable of providing an explanation of phenomena such as inert knowledge and misconceptions is that of prior knowledge.’ Research in the field of prior knowledge, often based on constructivism, supports the point of view that a detailed analysis of prior knowledge components provides an essential contribution to an effective diagnosis and support of a student's learning process (Dochy, 1992; Dochy & Alexander, 1995; Dochy, Moerkerke & Martens, 1996).

Defining prior knowledge is not an easy job, and researchers reporting about prior knowledge often avoid it (Dochy, Moerkerke & Martens, 1996). Prior knowledge can be defined as: a knowledge state at a certain time, that encompasses both declarative and procedural knowledge, that is present before execution of a learning task, that is directly available or can be retrieved, that is relevant for the objectives of the learning task, that is hierarchically structured, that is applicable to other learning tasks (inside and outside of the knowledge domain), and that has a dynamic nature (cf. Dochy, 1992; Dochy & Alexander, 1995). In line with this definition we consider prior knowledge to be more than factual knowledge: a complex of knowledge and skills.

In close collaboration with study mentors and study counselors and based on student interviews, several common problems were detected with regard to prior knowledge amongst students from the OUNL. These problems are:

  • Students, especially in the technical and social sciences, lack important prior knowledge and skills in the field of mathematics. This often becomes a bottleneck and a cause of study problems and delay later on.
  • In the Netherlands about 6% of the population is non-indigenous. The first category consists of migrants from former Dutch colonies, such as Surinam. The second category consists of migrant workers, mainly from Mediterranean countries, and their families; finally, there are refugees (Driessen & Van der Grinten, 1994). Especially the last two categories of non-indigenous people often have problems with the Dutch language. Most courses from the OUNL are in Dutch.
  • Some of the courses of the OUNL, especially the more advanced ones, are in English. Some students lack a thorough command of the English language.
  • Many students don’t know what it is to study in a distance education context. They lack experience with this type of education, in which they are to a high degree made responsible for their own study.
  • Students have an incomplete or even wrong impression of the subject matter of the study they want to begin with.

Assessment can be used as a tool to help students, on a voluntary basis, to make better-founded decisions. Assessment is to be seen as something much broader than formal final testing, and it makes students more responsible (cf. Thorpe, 1998). This is in line with a constructivist view on learning and with a model for the integration of assessment and learning (Dochy, Moerkerke & Martens, 1996). According to many authors (e.g., Glaser, 1990; Glaser & De Corte, 1992; Dochy, 1992) assessment should not only be used on a formal basis; it should also be used as a tool to help students decide. In the near future, assessment should make it possible to adapt education by means of an electronic, flexible learning environment (Valcke & Martens, 1997). A tool in this respect is currently being developed at the OUNL.

Earlier we reported upon the use of prior knowledge tests with semi-automatic feedback at the OUNL (Martens & Dochy, 1997; see also Moerkerke & Dochy, 1998). The results can be summarized as follows: assessments with delayed feedback are considered valuable and are often used by students, but they do not seem to change the learning process dramatically. These assessments change neither the order in which a student studies nor the study method.

Besides the appreciation of progress assessment and prior knowledge assessment, the research described above was mainly aimed at establishing the effects on the learning process of students who had already started their study.

An interesting extension to these studies is to aim at students who are looking around for a study at the OUNL. Based on the experiences mentioned, it was decided to develop a more sophisticated way to present various kinds of prior knowledge tests to possible new students who are orienting themselves towards a study at the OUNL. In the experiments mentioned above, tests were filled in at home and then sent by post to the OUNL, where they were analyzed automatically; feedback was sent back, again by post. Of course this is a time-consuming way to proceed. It is also rather expensive, and therefore not the best-suited method for students who have not yet enrolled, but who are only orienting themselves towards a study.

Because of the fast-growing use and availability of the Internet in the Netherlands and the Dutch-speaking part of Belgium, it was decided to use the World Wide Web to deliver these tools to those who are interested in a study at the OUNL. It is estimated that in the near future up to 80% of our students will have access to the Internet. Distribution of, in this case, self-assessment packages is much easier and more cost-effective than delivery and distribution on CD-ROM or disc (e.g., Taylor, 1998).

As stated earlier, we analyzed five common problems related to prior knowledge amongst our students, which we seek to solve by means of prior knowledge tests. In close collaboration with domain experts of our faculties, five tests were constructed. In the forthcoming sections these tests will be described in more detail. The term ‘tests’ may be misleading, because the aim was not to construct highly validated and reliable instruments that lead to a summative ‘go/no go’ score for students. Our aim was rather to construct tools (embedded support devices) to support students in making a decision whether or not to enroll for a study at the OUNL. In this respect the tools were mainly designed to make potential new students aware of their own level of comprehension (both knowledge and skills) on the one hand, and the required levels on the other hand.

There is little experience with this kind of prior knowledge assessment. Taylor (1998) used a program called Self Test in a distance education context. Self Test supports self-assessment and was tested in groups of engineering students studying mathematics. The appreciation was positive and the self-assessment seemed to lead to valid conclusions. But it can be questioned whether potential students are interested in taking tests to assess their readiness for a particular topic, and whether they find it useful to fill in a prior knowledge test before even having subscribed to a course.

Another research question is related to the type of users of the Internet: do the same types of students use these tools as those who actually subscribe? Not all of our students may have equal access to the Internet. In particular, male students and students with paid jobs may make more use of intake instruments via the Internet. Also, Internet users tend to be relatively young.

An investigation was set up in order to give an indication of the usefulness of this approach. The focus here is on the student perspective. Based upon previous experiments (Martens & Dochy, 1997), we mainly expected effects on student appreciation, and the measures are aimed at this. So, in the next sections we will seek answers to the following research questions: what are the use, appreciation and effects of prior knowledge tests presented via the Internet, and who uses these tools?

The use of the instruments will be compared to the total number of new subscribers to the OUNL; at least 5% of the average annual number of new students should make use of the instruments. The research context is the Infonet of the Open University of the Netherlands (http://www.ou.nl/info-alg-intake/index.htm). The Infonet is one of the four Internet sites of the OUNL. This site provides interested parties with information and tools to support the choice whether or not to start a study at the OUNL.

Instruments

Prior knowledge tests

In order to tackle the five common problems mentioned before, four types of prior knowledge tests were developed. This resulted in a total of seven instruments:

  • Three ‘domain excursions’ were developed in various content domains. In a domain excursion, one ‘chapter’ of a starters’ course is presented to possible students. These realistic learning contents are alternated with questions with feedback. The domain excursions take about four hours of study each and are presented as ‘Studying one evening at the OUNL’. These prior knowledge tests or exercises were intended to deal with the problems that students don’t realize what it is to follow distance education courses and that they have an unrealistic picture of the content matter. The subject domains were cultural sciences, jurisprudence and psychology.
  • One prior knowledge test Dutch was designed to give students an idea of the mastery of the Dutch language that is required. The test contains a text with nine accompanying multiple-choice questions. After filling in the test, students can compare their own answers with the correct answers, and an explanation of the scores is provided. If they make too many mistakes, they are advised to keep in mind that problems related to the level of Dutch language mastery are to be expected, and some recommendations are made.
  • The prior knowledge test English contains short text fragments (each about two standard paragraphs) with ten accompanying multiple-choice questions. Two versions were constructed, each suited to certain content domains that students are interested in.
  • Finally, an existing set of four prior knowledge tests on mathematics was converted into HTML format. These tests consist of multiple-choice test items, which were stored in a database. Where we speak in this article about the ‘prior knowledge state test mathematics’, this set of four tests is meant.

After completion of a test, a score was given, accompanied by the correct answers and feedback per item. Furthermore, a general advice, similar to that of the prior knowledge test Dutch, was provided. At the end, suggestions for additional learning material were presented.
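
To give an impression of the processing behind such a test, the sketch below scores a multiple-choice test, prints feedback per item and derives a general advice from a threshold. It is a minimal illustration in Python; the items, threshold and advice wording are hypothetical assumptions, not the actual OUNL implementation (which used HTML and ASP, see below).

```python
# Minimal sketch of scoring a multiple-choice prior knowledge test.
# Items, threshold and advice wording are hypothetical, not the OUNL's.

ITEMS = [
    # (item id, correct option, feedback shown together with the correct answer)
    ("q1", "b", "Option b follows from the definition given in the text."),
    ("q2", "a", "Re-read the second paragraph; it states this explicitly."),
    ("q3", "d", "The other options confuse cause and effect."),
]

ADVICE_THRESHOLD = 0.6  # hypothetical cut-off for the general advice


def score_test(answers: dict) -> None:
    """Print the score, feedback per item and a general advice."""
    correct = 0
    for item_id, key, feedback in ITEMS:
        given = answers.get(item_id, "-")
        if given == key:
            correct += 1
        # Feedback per item: the correct answer plus an explanation.
        print(f"{item_id}: your answer {given}, correct answer {key}. {feedback}")

    proportion = correct / len(ITEMS)
    print(f"\nScore: {correct} out of {len(ITEMS)} ({proportion:.0%})")
    if proportion < ADVICE_THRESHOLD:
        print("Advice: problems with this prerequisite are to be expected; "
              "see the suggestions for additional learning material.")
    else:
        print("Advice: your prior knowledge appears sufficient on this point.")


score_test({"q1": "b", "q2": "c", "q3": "d"})
```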

Most instruments also have printed counterparts, available at the OUNL study centers. In this article we will direct our attention to the electronic (HTML) versions. These electronic instruments have been constructed using Microsoft FrontPage 98™ and Active Server Pages (ASP).

An example of an item taken from the electronic prior knowledge test English is depicted in Figure 1.

Figure 1: Item in the prior knowledge test English

The electronic intake instruments are located at the OUNL website called Infonet (http://www.ou.nl/info-alg-intake/index.htm). At several locations on the Infonet, hyperlinks to these instruments or tests can be found.

Research instruments

The main research instrument was an electronic HTML form containing a list of questions. Potential students who used the instruments were given the possibility to give their opinion by clicking the hyperlink ‘evaluation’. This hyperlink was present on each page of the test.

The electronic questionnaire could be answered anonymously. It contained nineteen, mostly multiple-choice, questions about the use and appreciation of the instruments and about characteristics (e.g., age) of the respondents. After the form was filled in, the answers were sent in by e-mail. We also used automatic logging of page visits.

Research procedure

It was stressed that students who did not fill in the intake instruments completely, for instance because they considered them too difficult, were also asked to fill in the questionnaire. The electronic list of questions was combined with all seven prior knowledge assessment instruments. As mentioned earlier, the instruments were provided as part of the general information about studying at the OUNL, along with, for instance, an electronic course catalogue. The electronic questionnaire was online from 1 May until 1 October 1998. It was recorded (logged) how many persons visited the electronic intake instruments; recorded were those who visited more than one page of the instruments. All variables were checked for skewness and outliers. All significance tests were performed at a significance level of p<.05. In most cases we restricted the analysis to quantitative descriptions. SPSS one-sample t-tests were used to test the significance of differences between proportions.
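
As an illustration of this kind of analysis, the sketch below reproduces a one-sample t-test on a proportion in Python, using the sex proportions from Table 1 (dummy coding: 1 = female, 0 = male). The original analyses were done in SPSS; the library choice and variable names here are our own assumptions, not the original analysis.

```python
# One-sample t-test of a sample proportion against a population proportion,
# via a dummy-coded variable, mirroring the SPSS procedure described above.
import numpy as np
from scipy import stats

n_female, n_male = 62, 89      # electronic intake sample (Table 1, n = 151)
population_proportion = 0.44   # proportion of females among enrolled students

sample = np.array([1] * n_female + [0] * n_male)
t, p = stats.ttest_1samp(sample, population_proportion)
print(f"t = {t:.3f}, df = {len(sample) - 1}, p = {p:.3f}")
# t comes out close to the -0.732 with 150 degrees of freedom reported in Table 1.
```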

Subjects

After the expiration of our ‘pilot’ period, the electronic evaluation form and the hyperlinks to this form were removed from the web. By that time we had received 166 responses. Eight respondents had filled in the questionnaire because of a ‘scientific or professional’ interest in the intake instruments (e.g., educational scientists). Because they did not have the intention to become informed about their prior knowledge state, they were excluded from further analyses. Some invalidly filled-in forms were also put aside, resulting in 151 valid cases.

The logging information showed that at least 497 persons made use of the electronic intake instruments in the considered period of five months. So, the valid response rate is about 30% (151 out of 497).

Results

All the (usually standard) techniques used seem to be relatively stable and reliable, although certainly not perfect: 71% of the subjects reported no technical problems whatsoever, but 6.5% reported a system crash and 9.3% reported other error messages; 13.1% reported other (minor) problems.

Do the profiles of the respondents differ from those of ‘regular’ enrolled students?

To answer this question, the present data file was compared with the most recent figures about enrolled students (n=1342; Joosten, 1997). One-sample t-tests showed no significant differences between the proportions (sex, paid job, and age) in the experimental sample and the proportions in the sample of ‘regular’ enrolled students. See Table 1.

Table 1: Comparison of profiles of electronic intake students and 'regular' enrolled students

Profile          Enrolled students   Electronic intake          Difference:
                 Proportion          Number      Proportion     t / degrees of freedom
Female           .44                  62         .41            -0.732 / 150
Male             .56                  89         .59
Total            1                   151         1
With paid job    .79                 120         .79             0.058 / 150
No paid job      .21                  31         .21
Total            1                   151         1
Age < 30         .35                  38         .25             2.713 / 149
Age > 30         .65                 112         .75
Total            1                   150         1

What use is being made of the electronic intake instruments?

It was recorded (by logging) how many persons ‘visited’ the electronic intake instruments; recorded were those who at least ‘opened’ the first question or page (Table 2). No logging was done for the prior knowledge state tests English. A sketch of this kind of counting is given after Table 2.

Table 2: Logged use of the intake instruments

Instrument / month (1998)                                        05    06    07    08    09   Total
Domain excursion cultural sciences                               24    22    15    13    13      87
Domain excursion jurisprudence                                   22    19    15    12    10      78
Domain excursion psychology                                      31    30    16    15    19     111
Prior knowledge state test mathematics                           37    39    27    24    21     148
Prior knowledge state test on mastery Dutch                      20    21    10    11    11      73
Prior knowledge state test on mastery English                     -     -     -     -     -       -
Prior knowledge state test on mastery English for
  information science                                             -     -     -     -     -       -
Total                                                           134   131    83    75    74     497
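
A minimal sketch of the visitor counting behind Table 2 is given below: it counts distinct hosts that requested at least the first page of an instrument. The log format and the page paths are hypothetical assumptions; the actual Infonet logging may have worked differently.

```python
# Sketch: count distinct visitors who 'opened' at least the first page of an
# intake instrument, from web-server access log lines. The common-log-format
# field positions and the page paths below are hypothetical.
from collections import defaultdict

FIRST_PAGES = {
    "/info-alg-intake/math/page1.htm": "prior knowledge state test mathematics",
    "/info-alg-intake/psych/page1.htm": "domain excursion psychology",
}

def count_visitors(log_lines):
    visitors = defaultdict(set)  # instrument name -> set of distinct hosts
    for line in log_lines:
        fields = line.split()
        if len(fields) < 7:
            continue
        host, path = fields[0], fields[6]  # host and request path in common log format
        if path in FIRST_PAGES:
            visitors[FIRST_PAGES[path]].add(host)
    return {name: len(hosts) for name, hosts in visitors.items()}

log = [
    '130.37.1.1 - - [12/May/1998:20:15:01 +0200] "GET /info-alg-intake/math/page1.htm HTTP/1.0" 200 4321',
    '130.37.1.1 - - [12/May/1998:20:16:40 +0200] "GET /info-alg-intake/math/page2.htm HTTP/1.0" 200 3514',
    '192.87.5.2 - - [13/May/1998:09:02:11 +0200] "GET /info-alg-intake/psych/page1.htm HTTP/1.0" 200 5120',
]
print(count_visitors(log))  # one distinct visitor per instrument in this example
```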

In Table 3 the use of the electronic intake instruments is displayed. The figures in Table 3 are based on the list of questions. The figures are not absolute but serve as an indication of the distribution of the use. The domain excursion psychology and the prior knowledge state test mathematics are the most frequently used instruments.

Of the respondents, 69.3% indicate that they hardly know the OUNL, and 78.1% never subscribed to a course from the OUNL before.

Over the last 4 years there were 10445 new students on average per year; the equivalent for 5 months is 4352. This means that about 11% of the average number of subscribers made use of the intake instruments, which is more than the limiting value of 5%.
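
The arithmetic behind this 5% criterion is simple enough to restate; the short sketch below only repeats the figures from the preceding paragraph.

```python
# The 5% criterion: compare logged users with the five-month equivalent of
# the average annual number of new students (figures from the text above).
new_students_per_year = 10445   # average over the last 4 years
logged_users = 497              # from Table 2

five_month_equivalent = new_students_per_year * 5 / 12   # about 4352
usage_rate = logged_users / five_month_equivalent        # about 0.11

print(f"Five-month equivalent: {five_month_equivalent:.0f} new students")
print(f"Usage: {usage_rate:.1%} (criterion: at least 5%)")
```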

Table 3: Response to questionnaire on electronic intake instruments

Instrument type                                               Frequency   Percentage
Domain excursion cultural sciences                                   12         7.94
Domain excursion jurisprudence                                       14         9.27
Domain excursion psychology                                          32        21.19
Prior knowledge state test mathematics                               35        23.17
Prior knowledge state test on mastery Dutch                          23        15.23
Prior knowledge state test on mastery English                        17        11.26
Prior knowledge state test on mastery English for
  information science                                                18        11.92
Total                                                               151       100

Of the respondents who used an instrument, 71.6% used the instrument completely and 28.4% skipped some parts. With the statement about the intake instruments ‘It was always evident what to do’, 89.8% of the respondents agreed.

What are the appreciation and the perceived effects of the electronic intake instruments?

Table 4 depicts how many students find that the instrument influenced their study choice, that the instrument is useful, and that the objective of the instrument is evident. Analysis of an open question for general comments showed again, besides more technical and content-domain-related remarks, that students highly appreciate this form of prior knowledge assessment.

Table 4: Appreciation and perceived effects of the electronic intake instruments (n=151)

Statement                                       Agree   Disagree   Percentage agreeing
The instrument influenced my study choice          44         99                  30.8
The instrument is useful                          139          9                  89.8
The objective of the instrument is evident        139          8                  94.6

In the following tables the appreciation and the perceived effects will be presented per electronic intake instrument.

Table 5: Agreement about influence of the intake instrument (n=143)

                                                             Agree             Disagree
Instrument type                                         Row %    Count     Row %    Count
Domain excursion cultural sciences                      66.67        6     33.33        3
Domain excursion jurisprudence                          28.57        4     71.43       10
Domain excursion psychology                             43.75       14     56.25       18
Prior knowledge state test mathematics                  25.00        8     75.00       24
Prior knowledge state test on mastery Dutch             26.09        6     73.91       17
Prior knowledge state test on mastery English           12.50        2     87.50       14
Prior knowledge state test on mastery English
  information science                                   23.53        4     76.47       13

Table 5 shows that there is no general agreement about the influence of the intake instruments. The domain excursion 'cultural sciences' is considered to have more effect. Some figures in the cells are too small to perform a statistical significance analysis.

Table 6: Agreement about usefulness of the intake instrument (n=148)

                                                             Agree             Disagree
Instrument type                                         Row %    Count     Row %    Count
Domain excursion cultural sciences                     100.00       11
Domain excursion jurisprudence                         100.00       14
Domain excursion psychology                             87.50       28     12.50        4
Prior knowledge state test mathematics                  93.94       31      6.06        2
Prior knowledge state test on mastery Dutch             91.30       21      8.70        2
Prior knowledge state test on mastery English          100.00       17
Prior knowledge state test on mastery English
  information science                                   94.44       17      5.56        1

From Table 6 it can be concluded that there is general agreement about the usefulness of all the instruments.

Table 7: Agreement about objective of the intake instrument (n=147)

                                                             Agree             Disagree
Instrument type                                         Row %    Count     Row %    Count
Domain excursion cultural sciences                     100.00       11
Domain excursion jurisprudence                         100.00       14
Domain excursion psychology                             87.50       28     12.50        4
Prior knowledge state test mathematics                  87.88       29     12.12        4
Prior knowledge state test on mastery Dutch            100.00       23
Prior knowledge state test on mastery English          100.00       16
Prior knowledge state test on mastery English
  information science                                  100.00       18

Table 7 shows that there is general agreement that the objective of the intake instruments is evident.

Discussion and conclusion

In line with the view that assessment can be used as a tool for learning (e.g., Dochy & McDowell, 1997; Askham, 1997; Moerkerke, 1996), we used prior knowledge assessment on a formative basis. Earlier we reported on the use of prior knowledge tests with semi-automated feedback (Martens & Dochy, 1997). Although those tests were in paper format and the feedback took quite some time, students’ appreciation was very positive. Taylor (1998) also found that self-assessment tools are positively appreciated in distance learning contexts.

One could consider taking into account other effects besides appreciation, for example effects on study choice and achievement. Reviewing research findings on the effects of questions and feedback (e.g., Helgeson & Kumar, 1993; Rieber, 1994; Wiggins, 1993), it appears that those effects are not always consistent: sometimes the effects of assessment are positive, sometimes there are no effects, and effects can even be negative. In their overview of feedback in self-regulated learning, Butler & Winne (1995) criticize the too narrow focus on the effects of feedback on achievement, which neglects interacting factors in self-regulated learning processes. We concluded that the effects of prior knowledge assessment are strongly individually determined. On the basis of the same result, some students may conclude not to start a course, others may ignore the test results, and yet others may start with extra effort (cf. McDowell, 1996).

Therefore, in this research we mainly focussed on student use and appreciation. Both can be considered remarkably high in this study. When compared to the number of new subscriptions to the OUNL, the use was high. The appreciation was positive: ninety percent of the respondents considered the instruments to be useful.

One of the research questions was aimed at the profile of the users of the electronic intake instruments. These profiles did not differ from those of ‘regular’ subscribers at the OUNL. It appears that within our relatively highly educated target group, the traditional bias of Internet users (e.g., young and male) has disappeared.

The most important limitation of the present study is already indicated above: we only analyzed group results. Individual effects were not investigated. Since the questionnaire had to be short, not too many in-depth questions could be asked.

Although we did not find any bias with respect to the profiles of the users of the intake instruments, there may still be differences with other potential subscribers to courses, e.g., in attitude towards computers.

Future research will have to make clear what the long-term effects are of presenting prior knowledge assessment to persons interested in studying at the OUNL. Some considerations can be made. Of course there is the danger of putting people off, because they are discouraged. However, earlier research already showed that this is not likely to happen (Martens & Dochy, 1997). About 30% of the respondents indicated that ‘the instrument influenced my study choice’, so it seems that the majority of the respondents were only confirmed in their views.

But the information should be honest, and students who really lack important prior knowledge should be informed about this. Nevertheless, a goal of the domain excursions was to get students enthusiastic about a study at the OUNL. Future analysis of the developments in subscriptions will have to show whether students’ enthusiasm about the electronic intake instruments leads to higher enrolment figures.

References

Askham, Ph. (1997). An instrumental response to the instrumental student: assessment for learning. Studies in Educational Evaluation, 23, 299-317.

Boon, J., Janssen, J., Joosten, G., Liefhebber, J., & Poelmans, P. (1995). Looking back to face the future. In: M. Valcke & K. Schlusmans (Eds.), Inside out. An introduction to the Open Universiteit Nederland (pp. 13-28). Heerlen: Open University.

Butler, D.L., & Winne, P.H. (1995). Feedback and self-regulated learning: a theoretical synthesis. Review of Educational Research, 65, 245-281.

De Wolf, H.C. (1994). Distance education. The international encyclopaedia of education. Second edition. Oxford: Elsevier Science, 1557-1562.

Dochy F.J.R.C., & Alexander, P.A. (1995). Mapping prior knowledge: a framework for discussion among researchers. European Journal of Psychology of Education, 10, 225-242.

Dochy, F., Moerkerke, G., & Martens, R. (1996). Integrating assessment, learning and instruction: assessment of domain-specific and domain-transcending prior knowledge and progress. Studies in Educational Evaluation, 22, 309-339.

Dochy, F., Moerkerke, G., & Martens, R. (1996). An investigation into tools for study support in distance education: implementing alternative assessment procedures. Istruzione a Distanza, International Journal on Distance Education, 5, 27-38.

Dochy, F., & McDowell, L. (1997). Assessment as a tool for learning. Studies in Educational Evaluation, 23, 279-298.

Dochy, F.J.R.C. (1992). Assessment of prior knowledge as a determinant for future learning. Utrecht/London: LEMMA, Jessica Kingsley.

Driessen, G., & Van der Grinten, M. (1994). Home language proficiency in the Netherlands: the evaluation of Turkish and Moroccan bilingual programmes - a critical review. Studies in Educational Evaluation, 20, 365-386.

Glaser, R. (1990). Toward new models for assessment. International Journal of Educational Research, 5, 457-483.

Glaser, R., & De Corte, E. (1992). Preface. In F.J.R.C. Dochy, Assessment of prior knowledge as a determinant for future learning. Utrecht/London: LEMMA, Jessica Kingsley.

Helgeson, S.L., & Kumar, D.D. (1993). A review of educational technology in science assessment. Journal of Computers in Mathematics and Science Teaching, 12, 227-243.

House, J.D. (1995). Noncognitive predictors of achievement in introductory college chemistry. Research in Higher Education, 36, 473-490.

Joosten, G. (1997). De eerste kennismakingstrajecten. Een overzicht van deelnemende studenten [The first introduction routes. An overview of students participating]. Otec work document 97/w09. Heerlen: Open Universiteit Nederland.

Land, S.M., & Hannafin, M.J. (1996). A conceptual framework for the development of theories-in-action with open-ended learning environments. Educational Technology Research and Development, 44, 37-53.

Martens, R., & Dochy, F. (1997). Assessment and feedback as student support devices. Studies in Educational Evaluation, 23, 257-273.

Martens, R., Valcke, M., Poelmans, P., & Daal, M. (1996). Functions, use and effects of embedded support devices in printed distance learning materials. Learning and Instruction, 6, 77-93.

Martens, R.L. (1998a). The use and effects of embedded support devices in independent learning. Ph.D. thesis. Utrecht: Uitgeverij Lemma BV.

Martens, R.L. (1998b). Does embedding support devices have an effect in independent learning? European Journal of Open and Distance Learning.

Martens, R.L., & Valcke, M.M.A. (1995). Validation of a theory about functions and effects of embedded support devices in distance learning materials. European Journal for the Psychology of Education, 10 (2), 181-196.

McDowell, L. (1996). Encouraging students to learn from assessment: student perspectives. Newcastle: University of Northumbria.

Moerkerke, G. (1996). Assessment for flexible learning. Ph.D. thesis. Heerlen: Open University.

Moerkerke, G., & Dochy, F. (1998). Effects of prior knowledge state assessment and progress assessment on study results in independent learning. Studies in Educational Evaluation, 24, 179-203.

Rieber, L.P. (1994). Animation as real time feedback during a computer-based simulation. Paper presented at the annual conference of the American Educational Research Association, New Orleans, April 1994.

Rowntree, D. (1990). Teaching through self-instruction. London: Kogan Page.

Sewart, D. (Ed.) (1995). One world, many voices. ICDE June 1995, London: Eyre & Spottiswoode Ltd.

Sorensen, K. (1995). Evaluation of interactive television instruction: assessing attitudes of community college students. DEOSNEWS (Distance Education On-line Journal), 5(9), ACSDE@PSUVM.PSU.EDU.

Taylor, J.A. (1998). Self Test: a flexible self assessment package for distance and other learners. Computers & Education, 31, 319-328.

Thorpe, M. (1998). Assessment and ‘third generation’ distance education. Distance Education, 19, 265-286.

Valcke, M., & Martens, R. (1997). An interactive learning and course development environment: Context, theoretical and empirical considerations. Distance Education, 18, 7-23.

Valcke, M.M.A., Dochy, F.J.R.C., & Daal, M.M. (1991). Functions and effects of support in open learning systems. Outlining current empirical findings. Heerlen: Open University, OTIC.

Vosniadou, S. (1996). Towards a revisited cognitive psychology for new advances in learning and instruction. Learning and Instruction, 6, 95-109.

Wiggins, G.P. (1993). Assessing student performance. Exploring the purpose and limits of testing. Chapter 6, Feedback, pp. 182-205. San Francisco: Jossey-Bass Publishers.

 
