The Challenges of Developing Self Assessment Practices in Different Disciplines and Institutions: Educational development or educational research?

Judith Thomas, Project Manager, Graduate School of Education, University of Bristol, 35 Berkeley Square, Bristol, BS8 1JA, United Kingdom.

© 1999


Abstract
Introduction to the Self Assessment in Professional and Higher Education Project (Saphe).
Research methodology
Evaluation strategy within the context of self assessment
How Saphe Interprets 'Self Assessment'
Co-operative research
Level 1
Level 2
Level 3
Level 4
Some difficulties with this methodology
Conclusions
References
Appendix One


Abstract

This paper is based on the work of staff and students involved in the Self Assessment in Professional and Higher Education Project (Saphe). Over the last three years the project has been introducing and developing self assessment within the disciplines of law and social work in four different types of higher education institution. The strengths and problems of the action research methodology will be considered to highlight the overlaps and tensions between research, development and evaluation. The different levels of evaluation will be discussed to illustrate how this has supported the development of self assessment.


Introduction to the Self Assessment in Professional and Higher Education Project (Saphe)

Saphe is a collaborative project, based in four different types of institution on six sites, within the disciplines of Law and Social Work. It is one of over 70 projects funded by the Higher Education Funding Council for England (HEFCE) as part of its initiative to develop teaching and learning under the Fund for the Development of Teaching and Learning (see www.ncteam.ac.uk/fdtl.html). The setting up of this fund marks the council's increased commitment to giving teaching and learning greater emphasis, to balance the emphasis on research within universities. Bahram Bekhradnia, HEFCE director of policy, confirmed that this is set to continue:

"Support for high quality learning and teaching has always been central to our work. This has been given additional emphasis by the Dearing Report and by the Government. Through this strategy (including FDTL projects), we will be able to focus our efforts to raise the profile of teaching in higher education, and improve the learning experience of students." (HEFCE 1997:1)

In May 1996 the United Kingdom Government set up a committee to look at the future of higher education and what we could expect of graduates. The resulting report (Dearing 1997) identifies four key skills that graduates need to acquire:

  • numeracy
  • the use of information technology
  • learning to learn
  • communication

The latter two, learning to learn and communication, are central to self assessment and link closely with the aims of the Saphe project, which are to:

  • develop, pilot and evaluate a variety of self and peer assessment tools
  • explore the relationship between self assessment techniques and course content
  • develop staff and student skills of self reflection and self monitoring.

The Fund for the Development of Teaching and Learning (FDTL) was set up to:

  • stimulate developments in teaching and learning
  • encourage dissemination across the sector.

The focus of this initiative was slightly different from the traditional research initiatives that are more familiar to academic institutions. Put simply, institutions were more used to research aimed at studying, inventing, creating knowledge or 'proving' something. The emphasis from the funding body has been on creating and developing teaching and learning resources that other people can use. The FDTL initiative has been less concerned with academic products, such as articles in peer reviewed academic journals or research reports, and more concerned with developing, using and sharing materials through staff development, workshops, conferences and consultancy.

Research methodology

A recurring question asked of the Saphe team by colleagues and others has been 'Is this proper research?' The question is problematic because those who ask it often implicitly construct 'proper' research as lying within the positivist paradigm. Research within this paradigm can be characterised as believing there is a single tangible reality to be identified; it is seen as objective, i.e. there is separation between the researcher and the area of study, the knowledge produced will hold true everywhere at all times, and inquiry is value free (Lincoln and Guba 1985: 36-38). In contrast with the positivist view, a naturalistic approach considers that there are multiple constructed realities, that the researcher and the researched interact to influence one another, that inquiry aims at considering individual cases, and that research is value bound rather than value free (Lincoln and Guba 1985; Carr and Kemmis 1986).

The rationale for working within the paradigm of naturalistic research adopted by the Saphe project can be made on methodological, ethical and pragmatic grounds. When identifying an educational research method it is important to remember that research is 'only a way of investigating certain kinds of questions... Why should one adopt one research approach over another?... the method... ought to maintain a certain harmony with the deep interest that makes one an educator' (Manen 1990:1-2).

A cursory look at the diversity of the project illustrates the need to consider the multiple realities and the interaction between the researchers and the research:

The courses offered by the two disciplines involved, law and social work, are very different. Social work students undertake a combined professional and academic award, the Diploma in Social Work (Dip SW), usually a two-year combined degree and social work qualification (four years at one of the sites, Bath). In law the project has been working on the three-year academic degree that precedes the professional training for solicitors or barristers.

4 institutions and 6 sites (Social Work and Law):

  • University of Bristol - a traditional 'old' university
  • University of the West of England - a former polytechnic that converted to a university in the 1990s
  • University of Bath - established as a university in the 1960s
  • Southampton Institute - one of the largest providers of higher education courses in England that does not have the title of a university

The institutions all have very different histories, student profiles on intake, research interests, staffing levels and so on. A visit to their web sites shows something of their history, and the way in which they portray themselves conveys something of their culture (see www.bris.ac.uk, www.uwe.ac.uk, www.bath.ac.uk, www.solent.ac.uk).

Carr and Kemmis make the point that:
'educational acts are social acts which are reflexive, historically located, and embedded in particular intellectual and social contexts. So knowledge about education must change according to historical circumstances, local context and different participants understandings of what is happening in the educational encounter. And it is clear that the knowledge we have will, to a very great extent, be rooted in local historical and social contexts.' (1986:44)

The different types of educational setting have created challenges in that it has not been easy to transfer practices and self assessment tools directly. Each site has needed to develop self assessment within its own context, culture and history.

There are between 20 and 100 students involved on each site, ranging in age from 18 to mature students in their fifties. Some come with high Advanced Level results; two or three A grades, the top grade, are common for law entrants at UWE and Bristol. Entrants to the Diploma in Social Work range from graduates to those who left formal education early without any academic awards and who gain entry through access courses.

On each of the sites between 4 and 16 staff are involved in the developments. At least 16 key staff (2 or more from each site) are collaborating across the sites and disciplines. They are also involved in determining the direction of the project. Carr and Kemmis outline the different views individuals may hold on the purpose of educational research (1986:27 & 28) and some of these variations were present in the staff involved in Saphe work.

Boud argues: "Context is perhaps the single most important influence on reflection and learning. It can permit or inhibit working with learners’ experience" (Boud 1998). Therefore, to reduce the diversity in order to obtain more easily comparable data, or to construct a single reality, would risk undermining the impact of, and the learning from, undertaking self assessment within these different contexts.

Farran proposes that " 'Research' is rarely a single product. Different and appropriate versions of it are produced for different kinds of audiences" (1990:88). Among the audiences we had to satisfy were HEFCE, to whom we had to demonstrate that we were meeting the aims of FDTL, and the Saphe Steering (Management) Group, to whom we had to provide evidence that we were meeting the aims of the project. Torbert suggests that one purpose of educational research should be "to educate more successfully" (1981:143), and this is probably the most important underlying concern of the external funding body and of those within the project.

Some of the questions we needed to consider in developing our research strategy were:

  • What do we want to know?
  • What is the best way of finding out?
  • Are we trying to PROVE, DEVELOP or DISCOVER something?
  • What methods will best enable us to 'develop' and 'disseminate'?
  • How can we reflect the value dimension of the project?
  • What will be consistent with, and model, what we are trying to do with students?

We needed methods that would:

  1. Take account of difference.
  2. Enable us to use our evaluations and develop practice in the light of these.
  3. Enable others to 'see' the context in which the work was undertaken.
  4. Be credible.

The next section goes on to look at how these questions have been addressed, the practitioner action research approach we have taken and how this has developed.

Evaluation strategy within the context of self assessment

The original strategy for project evaluation (University of Bristol 1996 p7) was identified at four levels. These were:

  • Level 1 - monitoring and self evaluation as part of student and staff development.
  • Level 2 - regular evaluation/feedback meetings in Subject Executive Committee meetings.
  • Level 3 - Review meetings of the Steering Group.
  • Level 4 - 'audit targeting' one element of the design, implementation and progress of the project each year.

Within these different levels it was anticipated that 'Staff will be encouraged to assess their own development' and that 'Self-evaluation will secure the responses of the staff and students directly concerned' (University of Bristol 1996 p5). In the planning process self assessment was integrated within the evaluation strategy as well as being the topic for development. To appreciate the rationale for the strategy and to understand the connection with action research it is important to have some understanding of how we have interpreted self assessment.

How Saphe Interprets 'Self Assessment'

The discourse surrounding self-assessment is problematic. The term is used both to describe the process whereby students develop their appreciation of their effectiveness as learners (Ramsden 1992, Sadler 1989) and in discussions of students being involved in marking their own work (Cowan 1981, Heron 1981, Falchikov 1986, McConnell 1994). Boud (1986), by contrast, focusses on the development of students' capacity to assess themselves as they make judgements about their learning and perhaps set the criteria for evaluating what they learn.

In the Saphe project we have used the term self-assessment to convey a way of improving student learning by facilitating discussion and communication between learners and facilitators (students and staff). This is achieved by working towards making assessment criteria explicit and encouraging students to understand, discuss and use criteria to make qualitative judgements about their own work. Self-assessment does not mean students working on their own; they need structures and resources to support the process. Essentially, self-assessment is a two-part process: the setting and negotiation of criteria and standards by which judgements will be made, and the use of those standards in making, justifying and explaining judgements. Students are encouraged to think about what constitutes quality in relation to different content material and contexts. They can then begin to identify ‘gaps’ in their learning and take steps to ‘fill the gaps’, which enables them to develop their evaluative and critical skills.

"Self assessment requires students to think critically about what they are learning, to identify appropriate standards of performance and to apply them to their own work. Self assessment encourages students to look to themselves and to other sources to determine what criteria should be used in judging their work rather than being dependent solely on their teachers or other authorities" (Boud D, 1991:6)

In this way the term ‘self assessment’ covers ways of assessing or judging the product of learning and is also a learning process in itself. To our knowledge the term that best fits this relationship is ‘assessment-as-learning’ a concept pioneered by staff at Alverno College, Milwaukee (Schmitz, 1994).

We have been concentrating on the use of self and peer assessment to improve student learning rather than on students grading their own work or marking their own exams. This has been implemented in a variety of ways (see Hinett and Thomas, eds, 1999, forthcoming). For example, students have read drafts of each other's assignments and then given feedback on their own and each other's work. In some cases their own explicit evaluation of their learning has been assessed by the institution, for example through course work and assignments; the project includes examples of students' critical evaluation of their work being included as one of the marking criteria. In one instance students have submitted work for formal assessment where they propose a mark or grade. The work on portfolios on social work placements has involved students providing evidence of their competence against the assessment criteria; students' claims are verified or challenged by the practice teacher, but the onus is on the student to self assess in this way. Dialogue is an essential aspect of self assessment; it has also been a fundamental aspect of our work and connects with some of the principles of co-operative research.

Co-operative research

Co-operative research has been developed by researchers who see interaction between people as an essential element of the research process, that is, research "with and for" people rather than research aimed at studying them as objects (Reason 1988:1).

"Research changes the world in three ways: it makes a difference to the researcher; it makes a difference to those who come to know the research; and it makes a difference to whatever is studied" (Rowan 1981:96-97)

Reason suggests "co-operative inquiry can range from full collaboration through all stages of the inquiry, to genuine dialogue and consultation at the moments of PROJECT, ENCOUNTER and MAKING SENSE" (1988:9). He goes on to argue that co-operative research should include negotiating the involvement of the individuals, all contributing to creative thinking and aiming for authentic collaboration. One of the challenges of the Saphe project from its initial conception has been to do this.

The table below shows the various dialogues and different contexts in which the evaluation strategy has been implemented.

Table 1 Methods used to review, evaluate and develop practices and materials

Site specific. Internal: staff review meetings, student evaluations. External: project managers, external evaluator.

Discipline specific. Internal: regular (approximately three-monthly) meetings with lead staff on other sites and project managers. External: external evaluator, conference presentations and workshops.

Multi-disciplinary. Internal: project steering group and joint subject review and planning meetings. External: conference presentations and workshops.

This strategy has been implemented in a way that echoes the fundamental characteristics of action research, which "direct attention to the importance of empirical data as basis for reflectively improving practices" (Elliot 1991:51). It is also consistent with Meggison and Pedler's model of staff development (1992:8). Both these methods mirror self assessment practices in that they follow a continual process of taking action, gaining experience, reflecting on the experience in order to learn, and then creating new ideas and theories to inform future action.

One of the conceptual frameworks we have used to introduce staff and students to self assessment processes has been Kolb's (1984) work on the learning cycle.

[Figure: Kolb's learning cycle - Experiencing/Noticing → Interpreting/Reflecting → Generalising/Judging → Applying/Testing → Experiencing/Noticing]

This section now goes on to explore this model at the four levels of evaluation set out earlier.

Level 1

To illustrate this we can take the example of one of the exercises used at Southampton Institute. Here the 'IT Passport' was designed to enable students to self assess their IT skills (Appendix 1). They were required to start by considering their previous IT experience, that is, to notice it and then to reflect on their level of experience. In this context it is important to remember that "experience has within it judgement, thought and connectedness with other experience - it is not isolated sensing" (Boud 1993:6). They then had to form a judgement about, for example, whether they were able to use the 'Windows' environment. This had to be tested, initially for themselves by finding the evidence to support the judgement, and then through the process of evaluating their evidence with peers and staff. Following this they could either move on to another area or take action to remedy the gap they had identified. This led on to the next experience, which they analysed in a similar way.

Transferring Kolb's model to this example, we can identify that staff and students had the experience of using the IT passport, and opportunities were available to evaluate this using the following methods:

  • group discussions (students and staff/students)
  • completing evaluation questionnaires
  • in interviews with staff
  • by looking at learning outcomes.

Within these methods there were questions aimed at reflection ('what did you think of this?') and judgement ('what did you learn from it?', 'how could it be more effective?'). Elliot argues that 'the fundamental aim of action research is to improve practice' (1991:49). Thus the IT passport was initially developed by staff and students observing and reflecting on the concrete experience of using IT; it was tested and adjustments were made to arrive at the current model. It is an illustration of Elliot's proposal that "teaching is conceived as a form of research aimed at understanding how to translate educational values into concrete forms of practice... Evaluation is an integral component of action research... curriculum development is not a process which occurs prior to teaching... [it] occurs through the reflective practice of teaching" (1991:54). As the project has progressed we have gathered about 20 other specific examples of this type of learning and of the evaluation of self assessment tools and approaches.

Level 2

The next level of evaluation involves the three sites within each discipline coming together three or four times a year to examine their practices and the findings of their own site evaluations. The purpose of this is to learn from each other's experiences and to share materials (see Burgess et al 1999, Taylor et al 1999 and Hinett et al 1999).

Level 3

The evaluation at Level 3 involved each of the six sites preparing a brief report, based on an evaluation of their work, for the Steering Group that meets every six months. We are now in the process of drawing together and analysing the information in these reports. In these reports sites are asked to identify what is helping or hindering the project and what they have learnt about self assessment.

This work is not complete and the initial analysis will need to be considered by those involved in the project and by the Steering Group. Writing about co-operative research, Randall and Southgate argue that "it is essential that the analysis you as educator or researcher bring to the situation is adequate and that your understanding of it through dialogue is correct" (1981:351). At the time of writing the analysis is written from the perspective of the project manager (the author) and will need to be considered and developed by other members of the project before it can be a true representation of the project. However, in the initial interpretation of this data the author has identified the following as conditions that support self assessment:

The value of engaging in the process of monitoring and reflection, and of having the time and opportunity to reflect; on occasions this has included making an audit of current practices. Within this, although collaboration with other sites is mentioned (5), the interaction between the staff team on each site is mentioned much more frequently (20).

Having teaching and learning resources such as tools, structures, workbooks and guides (14). In later reports the emphasis moves more towards approaches such as staff openness to students and flexibility (24). Once integration (29) is possible this then supports further developments, but initially the resources are seen as providing much needed structure and support.

Engaging students in an appreciation and understanding of self assessment and, within this, involving them in the process of developing and evaluating the tools. Reports submitted in the first year of the project discuss the importance of staff understanding; later the emphasis moves more towards student understanding.

Level 4

Level 4 of the evaluation is undertaken by an external evaluator. This report is based on primary and secondary data gathered by a range of methods including; questionnaire, individual interview, focus group interview, observation, participant observation, and documentary sources. The main findings of this report for the 1997/8 academic year are:

  • All six sites have made significant progress in relation to the aims of the project, though there are differences, particularly with regard to the fourth aim (‘Develop strategies for facilitating institutional acceptance of self-assessment’).
  • There is clear evidence of a series of positive outcomes as regards the experience of many participating students, particularly in the area of understanding and engaging meaningfully with assessment criteria. As might be expected, these effects are clearest in the four sites with the longest association with the project, though they are also present at the two sites joining later.
  • The range of activities initiated and supported under the SAPHE umbrella represents a series of realistic and context-appropriate interpretations of the aims.
  • The year saw significant dissemination activity, particularly through the two national conferences.
  • Project management continued to be effective. It was characterised by responsiveness, sensitivity, efficiency and enthusiasm.
  • The project raised a number of important issues regarding the conceptual underpinning for innovation in self-assessment, not least of which is to do with its straddling of the formative/summative distinction. (James 1999)

Some difficulties with this methodology

In order to illustrate the approach we have taken and to link it with educational research, some of the difficulties have been understated in this paper. Perhaps the most significant of these is the problem of dealing with experience:

"Much experience, however, is multifaceted, mutli-layered and so inextricably connected with other experiences, that it is impossible to locate temporally or spatially. It almost defies analysis as the act of analysis inevitably alters the experience and the learning which flows from it." (Boud et al 1993:7)

It follows from this quotation that any attempt to make sense of and present such a rich diversity of experiences is inevitably likely to be incomplete. As with most research, there is the problem of words not adequately conveying a complex and moving picture that is in a constant state of change.

At the moment this analysis is one person's interpretation of other people's information and for greater reliability it will need to be shared and debated amongst the project team. This, however, is also problematic as it requires time, commitment and attention from others who are balancing many other competing demands. Team members vary in their teaching commitments and the time involved in travelling to meetings also determines their level of involvement. In bringing people together another problem is presented in terms of whose voice is strongest and how to present what can be widely differing perspectives. As Reason warns "The notion of participation carries strong positive connotations for many people; it is often seen as 'a good thing'. It is very easy to espouse participation and yet at times incredibly difficult to practice genuinely." (Reason 1994:3).

Conclusions

The subtitle of this paper, 'Educational development or educational research?', presents a false dichotomy between the two activities. They can be connected in a way that, our evaluation material indicates, enables and supports the introduction of innovative practice. The care, integrity and attention to detail needed from those involved are considerable, but they are essential to ensure that evaluation activities can be presented as valid 'research'. This is particularly the case with a sceptical audience, yet such an audience is also a crucial part of the dynamic that can support the work's credibility.

This paper represents what Oakeshott would describe as "a platform of conditional understanding" (1975:6). I see it as research in the making, nowhere near finished but with plenty to build on, and with the challenge of ensuring that the Saphe project is seen as research as well as educational development.

References

Bekhradnia, B. (1997) ‘New moves to promote learning and teaching’ in Council Briefing of the Higher Education Funding Council, November, Issue 12

Burgess, H., Baldwin, M., Dalrymple, J. and Thomas, J (1999) 'Developing Self-Assessment In Social Work Education' Social Work Education Vol 18(2), Carfax forthcoming

Boud, D (1981) Developing Student Autonomy in Learning. New York: Kogan Page.

Boud D (1991) Implementing Student Self Assessment, (2nd edition) Green Guide no.5 Higher Education Research and Development Society of Australia

Boud, D., Cohen, R., Walker, D. (1993) Using Experience for Learning. Buckingham, Open University Press.

Carr, W and Kemmis, S (1986) Becoming Critical: Education, Knowledge and Action Research. London: Falmer Press (1997 edition)

Elliot, J (1991) Action Research for Educational Change. Milton Keynes, Open University Press

Cowan, J (1981) 'Struggling with Self Assessment' in Boud (ed) 1981

Dearing, R (1997) Higher Education in the Learning Society: report of the National Committee. London, National Committee of Inquiry into Higher Education DfEE.

Falchikov N (1986) 'Product Comparisons and Process Benefits of Collaborative Peer Group and Self Assessments' in Assessment and Evaluation in Higher Education Vol.11 no.2 pp 146-166

Farran, D. (1990) 'Seeking Susan: producing statistical information on young people's leisure', Chapter 6 in Stanley, L. (ed) Feminist Praxis. London: Routledge

Manen (1990) Researching Lived Experience: Human Science for an Action Sensitive Pedagogy. New York: State University Press

Heron, John (1981) 'Assessment Revisited' in Boud (ed) op cit. pp 77 - 90.

Hinett K and Thomas, J (eds) (1999) Staff guide to Self and Peer Assessment. Oxford OCLSD - forthcoming

Hinett, K., Lee, B., Maughan, C and Stanton, K. (1999) 'Managing Assessment Change in Legal Education: A Tale of Two Cities' The Law Teacher. Sweet and Maxwell - forthcoming

James, D (1999) External Evaluation Report for Year Two of the Project (1997 - 1998 Academic Year) Unpublished Paper University of Bristol.

Kolb, D (1984) Experiential Learning. London, Prentice Hall

Lincoln, Y and Guba, E (1985) Naturalistic Inquiry. California, Sage

McConnell, David (1994) 'Managing Open Learning in Computer Supported Collaborative Learning Environments' in Studies in Higher Education Vol 19, No 3 pp 341 - 358.

Meggison, D. and Pedler, M. (1992) Self Development: A Facilitator's Guide. London, McGraw-Hill

Oakeshott, M. (1975) On Human Conduct. Oxford: Clarendon Press

Ramsden P (1992) Learning to Teach in Higher Education. London and NY. Routledge.

Randall, R. and Southgate, J. (1981) 'Doing dialogical research', Chapter 30 in Reason and Rowan (eds) op. cit., pp. 349-361

Reason, Peter (1994) Participation in Human Inquiry. London Sage

Reason, P. and Rowan, J. (eds) (1981) Human Inquiry. Bath: John Wiley and Sons.

Sadler R.D (1989) ‘Formative Assessment and the Design of Instructional Systems’ in Instructional Science Vol.18 pp119-144

Schmitz J (Ed) (1994) Student Assessment-as-Learning at Alverno College Milwaukee, US.

Taylor, I, Thomas, J., and Sage, H. (1999) 'Portfolios For Learning And Assessment: Laying the foundations for professional development' Social Work Education Vol 18(2), Carfax - forthcoming

Torbert, W. (1981) 'Why educational research has been so uneducational: the case for a new model of social science based on collaborative inquiry' in Reason and Rowan (eds) op. cit., pp. 141-153

University of Bristol (1996) Application for Funding under the HEFCE FDTL programme submission

Appendix One

Southampton Institute - The I.T. Passport - (for student use)

Once you get to know the library you will understand that legal research involves the use of IT resources. Many governmental reports, white papers and so on are now available via the Internet. The principal way of searching for appropriate cases and articles to back up an answer to an essay or problem is via CD Rom.

Within your cohort there will be a huge variety of IT competence, so it is very much left to you to sign up for the training that you need. To help you see what training you may need to do this year, the following pages detail a pathway to basic competence in the use of IT for legal research. Some of you will be able to move quickly through the tasks set as you have previous experience of using CD Rom and the Internet. In this case will you please produce, on a side of A4, details of a search strategy you have devised using legal databases, as evidence of your competence.

For those of you with less IT experience there are a variety of ways to build up your experience and meet the tasks set (and please note we expect it to take practice and time - this passport is for the year, not to be completed in induction week).

1. Margaret Feetham, the law librarian, provides instruction workshops on the use of law CD Roms and legal search engines on the Internet. Details appear on the library noticeboards and are given out in lectures.

2. The IT Resource Centres also provide instruction workshops on various IT packages and on Netscape. Details appear on noticeboards in the Resource Centres.

3. The Resource Centres all have help desks, so if you are trying to use a CD Rom, the Internet or the intranet and get stuck, they will be able to sort you out. They have more time to do this early in the semester; later the Resource Centres become crammed with students doing assessments. The evening is generally the quietest time, when help will be more freely available.

4. Many of the CD Roms you need to use are on the network and so can be accessed at any of the IT Resource Centres.

5. Information sheets and workbooks are available if you wish to work through the system on your own.

6. In the Legal research course in semester two there are lectures and a supporting seminar on search strategies.

This passport is for you to work through. The lecturers on the Legal Research course will ask to see it and expect you to have reached point 9 by the beginning of semester two.

Steps towards IT research competence (record Evidence and Date for each step):

1. Attended Institute IT induction
2. Competent user of the Windows environment
3. Attended library CD Rom instruction workshop (Law or general)
4. Use CD Rom by oneself: a. full text CD Rom; b. indexing CD Rom
5. Identify which useful CD Roms are on the student network
6. Attend workshop or use workbook to familiarise self with the Netscape environment
7. Use Internet to go to a specific Website address
8. Use the Intranet to find an exam paper
9. Attend Internet instruction workshop
10. Attend lecture (semester 2) on search strategies
11. Design and implement a successful search strategy to be used on relevant databases