Working Paper
© Dr. Terence Love & Ms. Trudi Cooper
Love Design & Research
www.love.com.au
This paper reports the main findings of an exploratory investigation into the key factors necessary for designing information systems for online portfolio-based assessment in tertiary, professional, secondary and primary education that maximise benefits for all stakeholders. A review of contemporary practice in designing online portfolio assessment systems showed widespread neglect of several key factors necessary for formulating designs that maximise benefits for all stakeholders. In addition, design processes were found to over-emphasise technical issues of implementation rather than the primary educational goals. The result is online portfolio systems that fall significantly short of their potential and, in many cases, are inferior to conventional portfolio assessment and other more traditional assessment approaches.
The paper identifies key design factors necessary for successful conceptual design of online portfolio information systems that maximise benefits for all stakeholders. Initial design heuristics are outlined for designing online portfolio assessment information systems that provide improvements over conventional portfolio assessment and other assessment modalities for all stakeholders.
Keywords: HCI, concepts, online portfolios, IS education
Internationally, there has been a significant increase in the use of online portfolios in tertiary, secondary, primary and professional education over the last three years (see, for example, AAHE, 2001; Barrett, 2000; Brooks & Madda, 1999; Curriculum Council, 1998; Education Department of Western Australia, 2000; Fried, n.d.; McCracken, 1997; Thompson & Farrow, 1999). The intention of most of these online portfolio assessment programs has been to combine the benefits of traditional portfolio-based assessment with the paper saving and other benefits of online environments (AAHE, 2001; Barrett, 2000; Cooper & Love, 2002; Oliver & Herrington, 2001). The benefits of conventional portfolios in assessment and learning are well established, and, in most cases, online portfolios have aimed to replicate conventional portfolio documents online (see, for example, AAHE, 2001; Barrett, 2000; Biggs & Tang, 1997; Bowie et al., 2000; Cooper, 1999; Cooper, Hutchins, & Sims, 1999; Education Department of Western Australia, 2000; Fried, n.d.; McCracken, 1997; Sewell, Marczac, & Horn, n.d.; Thompson & Farrow, 1999).
This paper reports recent research undertaken by Love and Cooper into online portfolio assessment. It focuses specifically on the general findings of the research about stakeholder issues and the design factors that affect the value distribution to different stakeholder groups. Detailed analysis of the data collected about individual online portfolio assessment systems will be reported elsewhere. The research was undertaken to explore how Cooper’s previous work in developing successful portfolio assessment system models for students, lecturers, teachers, course designers, nurses and midwives (Cooper, 1997, 1999; Cooper & Emden, 2000) could be extended to online environments. The research was exploratory in nature, using a mixture of broad scans across the web and literature and focused investigation into exemplar online portfolio assessment systems. The primary themes investigated were integrity of course design, and benefits for all stakeholders. This latter point is especially significant because one of the main benefits offered by online environments is the ability to efficiently automate many of the time-consuming routine administrative tasks associated with education and assessment. These improvements in efficiency through automation of tasks are of particular interest to stakeholders other than students.
Implementing conventional portfolios and portfolio-based assessment systems online is beset with problems that frequently reduce the value of this method of assessment. Using portfolios in an online context brings in additional considerations over and above those that need to be addressed in offline use of portfolios. The two main issues are:
The aim of using online technologies in portfolio assessment processes is to maximise benefit for all stakeholders. Online portfolio-based assessment requires a careful amalgam of several processes. The main aim in focusing on the design criteria and underlying concepts is to identify heuristics for designing online information systems that offer improvements in some or all of these processes. There are different process issues for each of the different stakeholders, and in online scenarios these are often different in type and magnitude from those in offline portfolio-based assessment processes. If not addressed well, they compromise the assessment process and forfeit the benefits associated with portfolios, reducing the value of the assessment mechanism.
Different forms of conventional portfolio-based assessment are appropriate to different circumstances (Cooper, 1999; Cooper & Love, 2002). Quality course design processes require careful matching of the characteristics of different forms of portfolio assessment to the aims, objectives, resources, stakeholders and other characteristics of the course in which portfolios are being used (AAHE, 2001; Brooks & Madda, 1999; Cooper, 1999; Education Department of Western Australia, 2000; McCracken, 1997; Sewell et al., n.d.). Review of existing conventional offline portfolio-based assessment shows these issues are frequently not adequately addressed. In many online courses, core pedagogical and assessment design issues are frequently either neglected or, more commonly, overshadowed by an over-emphasis on the analysis of relative technical advantages of particular proprietary software and hardware formats (see, for example, Herrington & Bunker, 2002; Oliver & Herrington, 2001). The implication is that overall educational outcomes are likely to be adverse in spite of the potential benefits from the online environment.
Online portfolios are likely to be of increasing significance in universities due to:
Central to the design research reported in this paper is the assumption that designing successful solutions first requires a detailed understanding of how direct and contextual factors shape the distribution of value between stakeholders – which, in turn, requires an understanding of stakeholder positions. For example, an important issue is the opportunity to automate administrative processes, which benefits particular stakeholder groups. From this perspective, this broad swath of mainly qualitative design issues must be addressed before focusing on technical decisions about the hardware, software, data structures and the database management systems needed to implement online portfolio assessment.
The paper has four sections including this Introduction. The next section provides definitions of portfolio and portfolio-building process, identifies the benefits of portfolio assessment and the main stakeholders, and, as an example of a contextual analysis, sets out details of the university design context for information systems for online portfolio assessment. The third section focuses more directly on the functions of an online portfolio assessment system. The final section provides a summary of the paper, and draws out the broader implications for designing online information systems for portfolio assessment.
This paper uses definitions of portfolio and the portfolio-building process that underpins assessment as developed by Cooper initially for evidence-based assessment in tertiary courses for professionals, and later extended to other applications (see, for example, Cooper, 1999; Cooper, 1997; Cooper & Emden, 2000; Cooper et al., 1999; Cooper & Love, 2000). Cooper’s definition of a portfolio is,
‘A collection of evidence that demonstrates skills, achievements, learning or competencies’.
Cooper (1999) defined the portfolio-building process in terms of six sub-processes:
The process ensures portfolio-based assessment is transparently and directly applied in a purposive way to predefined educational processes and outcomes.
The benefits of portfolio-based assessment over other assessment approaches are well established (see, for example, AAHE, 2001; Barrett, 2000; Biggs & Tang, 1997; Brooks & Madda, 1999; Cooper, 1999; Cooper et al., 1999; Curriculum Council, 1998; Education Department of Western Australia, 2000; Sewell et al., n.d.; Thompson & Farrow, 1999). Some of its main advantages are:
Designing online education courses that take advantage of the benefits of portfolio-based assessment requires careful consideration of stakeholders and the relative value distribution for different stakeholder groups of specific design instances of online portfolio assessment.
This can be undertaken at many levels of sophistication. A common but often unhelpful approach is for course designers to focus on students as a single stakeholder group. The value of the online portfolio assessment system then is interpreted only in terms of that group. For example:
A slightly broader perspective includes the teaching staff, and, perhaps, the education institution. This increased breadth allows the value, advantages and disadvantages, of a particular online portfolio assessment system to be perceived in relation to these additional stakeholder constituents.
The broadest and most sophisticated approach is to design and evaluate potential online portfolio assessment systems in terms of all the stakeholder constituents impacted by the designed outcomes. These include:
Differences in the designs for specific online portfolio assessment systems can offer greater or lesser benefits and value for each of these stakeholders.
Successfully designing and evaluating online portfolio assessment systems to distribute value and benefits to stakeholders requires more than identifying the stakeholders. It also requires designing to be undertaken from a particular standpoint that determines the relative importance of the value distribution to particular stakeholder groups. In most commercial design contexts, the standpoint from which designs are developed and evaluated is dictated by the client or sponsor and made explicit through design criteria described in a design brief. In educational contexts, although widely used, business concepts of sponsor, client, and customer are poorly suited to describing the institutionalised power and service relations. Instead it is necessary to look to the institutional structures and mores that govern what is valued in different educational sectors and situations. These institutional issues comprise the existing value structures (what the purposes of the institution are), and the trends and forces of change (what the institution is becoming). The example below explores some of the trends and change factors acting on the university sector. It exposes important information that is foundational to developing successful designs of information systems for online portfolio assessment in universities.
The design of future courses in universities is influenced by a variety of factors. For example:
‘Evidence-based’ assessment is increasingly found in university systems in Australia, the US and the UK (ANTA/AVCC, 2000; DETYA, 2001, 1999), in the school education system (Curriculum Council, 1998), and in the professions (see, for example, Australian Computer Society, 2001; Australian Nursing Council Inc., 2000; Engineering Council, 1997). ‘Evidence-based’ assessment requires demonstration of application of a body of knowledge rather than knowing about a body of knowledge. In fast changing disciplines, it is unsatisfactory to assess knowledge of content likely to be obsolete soon. Professional institutions such as the Australian Computer Society (2001) regard it as important that students and practitioners can demonstrate their understanding of fundamental concepts underpinning the discipline’s knowledge and can demonstrate their ability to continually update their skills. Traditional assessment methods do this badly, being originally developed for discipline areas with a low rate of change of knowledge, for societies with low levels of social change, and where professional skills are assessed by other means.
Employability: Students completing a course expect to be employable as a result (DETYA, 2001, 1999). This results in tension in course designs between:
Taken together they present a complex assessment problem that is relatively easily resolved by portfolio assessment because portfolios can contain different evidence that has been derived from learning experiences with different educational purposes (Cooper & Love, 2002).
Quality assurance is becoming increasingly critical to all stakeholders in Australia (Cooper, 1999; DETYA, 1999; Kemp, 1999). Key quality issues include transparency and high levels of integrity and coherency between institutional educational aims and objectives, assessment practices, and effective moderation processes between and across units, courses, programs and institutions.
Equity problems in assessment become increasingly important as student populations become culturally, educationally, and socially more diverse (Cooper, 1999; DETYA, 1999).
Plagiarism is increasingly problematic (Terrell & May, 2001), especially where students have high levels of Internet and electronic file manipulation skills (Kearns, 2000). Traditional modalities of assessment do not easily lend themselves to the triangulation that is a core aspect of identifying plagiarism because they do not offer the means for institutional staff to easily co-locate an individual student’s different assessments. Portfolios can assist in identifying plagiarism because their nature as a ‘container of a student’s work’ facilitates triangulation of assessment and offers examiners a ready means to correlate standards across an individual student’s multiple assessment items.
Increasingly, there are pressures on course designers to address issues of commercialisation, modularisation and globalisation. A common response is for courses and units to be provided by partner institutions, via flexible delivery, and, occasionally, with marking processes outsourced (Bradley, 2000; Technology and Industry Advisory Council, 2000).
Research into portfolio-based assessment indicates it can address and resolve many or all of the above issues (Cooper, 1999).
Portfolio assessment, especially in its online forms, differs in detail in many ways from other forms of assessment. Informatically, the portfolio is a container in which different people can place a wide variety of informatic entities relating to evidence, in different ways and for different purposes. Potential benefits of using online modalities to implement portfolios and portfolio-based assessment include:
This perspective shifts the emphasis onto gaining maximum qualitative benefits for stakeholders rather than creating ‘electronic’ facsimiles of hardcopy portfolios.
The processes by which information is put into and taken out of the portfolio assessment system are important to the designing of successful information systems to support online portfolio assessment because they impact on most stakeholders. The following areas emerged in the research as being particularly relevant to clarifying design issues:
The online medium offers the potential for designing portfolio administration systems that enable the different stakeholders to interact with a portfolio to gain value for themselves specific to their orientation. The research suggests that the value to stakeholders other than students is substantial and that most of these benefits accrue from the automation of administrative functions within the online portfolio system, which reduces the transaction costs of processing, storing and accessing student assessment information and facilitates quality assurance, evaluation, moderation and feedback on the related educational and management systems.
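To make this concrete, the sketch below shows one such administrative automation: checking a submitted portfolio for required elements and flagging omissions without manual effort from staff. The element names and the Portfolio structure are assumptions made for illustration, not a description of any existing system.

```python
from dataclasses import dataclass, field

# Illustrative element names only; a real system would derive these
# from the course's assessment design.
REQUIRED_ELEMENTS = [
    "table_of_contents",
    "evidence",
    "performance_criteria",
    "commentary",
]

@dataclass
class Portfolio:
    student_id: str
    elements: dict = field(default_factory=dict)  # element name -> content

def completeness_report(portfolio: Portfolio) -> dict:
    """Flag missing or empty elements so routine checking needs no manual effort."""
    missing = [name for name in REQUIRED_ELEMENTS
               if not portfolio.elements.get(name)]
    return {
        "student_id": portfolio.student_id,
        "complete": not missing,
        "missing_elements": missing,
    }

# Example: an incomplete draft is flagged automatically.
draft = Portfolio("s0421", {"evidence": ["essay.pdf"], "commentary": "links evidence to criteria"})
print(completeness_report(draft))
# {'student_id': 's0421', 'complete': False, 'missing_elements': ['table_of_contents', 'performance_criteria']}
```

Reports of this kind could be routed to students, lecturers or administrative staff, which is the sense in which automation redistributes value to stakeholders other than students.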
The online environment offers many means of automating tasks in order to standardise the structure of portfolios and identify omissions, and potentially reduce or remove many routine tasks undertaken by lecturers and admin staff including marking (see, for example, Fried, n.d.; Rudner, 2001). Automation features that can reduce human administrative work and realise benefits include:
Good alignment between users and online technologies is a potentially significant issue. For example, problems are likely if a portfolio system based on Linux workstations were implemented in an Art and Design studio environment, because the technical nature of the Linux environment conflicts with user expectations shaped by the Macintosh-based, low-technical-expertise, graphically supportive environment typical of Art and Design studios.
Typically, information storage is the area that has received most attention, often at the expense of education, assessment and informatic issues. In many cases, primary concerns have emphasised the technical attributes (see, for example, Barrett, 2000) of, for example, PDF, Word, MPEG and other proprietary and generic hardware and software specifications, whilst neglecting the important education, assessment and informatic issues on which such decisions must be based to ensure satisfactory educational and organisational outcomes and increased value for all stakeholders. Basing storage decisions on these issues moves the focus away from inappropriately prioritising the interests of those with technical responsibility for institutions’ hardware and software infrastructure, or of those who wish to gain academic kudos from involvement in a fashionable educational development.
There are several ways in which Quality Assurance can be understood. Sometimes it is understood only as demonstrating to external assessors the ‘goodness’ of a product or process. More useful from the point of view of assessment and education, however, is to view Quality Assurance as primarily concerned with ‘quality improvement’.
When Quality Assurance is defined in terms of Quality Improvement, the two key quality improvement issues for any education or assessment program are transparent moderation and sound processes of evaluation and feedback. Their application is not limited to the direct education and assessment activities (e.g. teaching performance and students’ evidence of learning). Where course design and online portfolio assessment programs are defined in terms of all stakeholders, processes of evaluation, moderation and feedback are central to quality improvement and to the success of all aspects of the course design and online portfolio assessment system. These moderation, evaluation and feedback processes focus on the way that portfolio-based assessment has been implemented online in a specific context: that is, whether it has achieved what it was intended to achieve, whether there were unintended consequences, and whether any aspects of the system need to be changed to improve it.
In an online environment aimed at creating and distributing value for all stakeholder groups, this implies that evaluation, moderation and feedback processes should be created as a parallel and integrated aspect of the online portfolio assessment system and, by implication, should also be implemented online.
In online environments, equity is strongly dependent on students’ technology skills and technology access. Equity is significantly reduced where access to technology is not ubiquitous or where technology skill indirectly influences the online presentation of evidence. In many cases, this implies standardising the hardware and software, and training students in that hardware and software. Other techniques include the use of online templates and restrictive modalities of input. The overall intention is to limit the scope for more affluent students to gain assessment advantage through access to more sophisticated software and hardware, or through more advanced technical skills that are not being assessed directly as part of the online portfolio assessment process.
Computer-aided plagiarism is a problem in paper-based and online formats (Kearns, 2000). It undermines confidence in assessment processes and can significantly damage the reputation of an educational institution (Terrell & May, 2001). Traditional approaches are not widely effective against computer-based forms of plagiarism, and this implies that new technical forms of addressing plagiarism are needed (Cooper & Love, 2002). Online portfolio assessment offers a technical basis for facilitating the detection of plagiarism through plagiarism testing software (see, for example, IntegriGuard.com, 2001; iParadigms.com, 2001; Plagiarism.com, 2001). It is straightforward to pipe students’ portfolio contents through such anti-plagiarism software, either automatically or under manual control, with feedback to appropriate stakeholders. The structured nature of online portfolios also facilitates this process.
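A minimal sketch of that piping follows. The submit_for_checking function is a hypothetical stand-in for whichever detection service an institution licenses; the real APIs of services such as Turnitin, IntegriGuard or Glatt are not reproduced here.

```python
from typing import Callable, Dict, Iterable, List, Tuple

def screen_portfolio(evidence_items: Iterable[Tuple[str, str]],
                     submit_for_checking: Callable[[str], float],
                     threshold: float = 0.25) -> List[Dict]:
    """Pass each piece of textual evidence to an external checking service
    and report items whose similarity score exceeds a threshold."""
    flagged = []
    for item_name, text in evidence_items:
        score = submit_for_checking(text)  # hypothetical call to the licensed service
        if score >= threshold:
            flagged.append({"item": item_name, "similarity": score})
    return flagged

# Example using a stand-in checker; a real deployment would call the
# institution's chosen plagiarism-detection service here instead.
fake_checker = lambda text: 0.40 if "copied" in text else 0.05
print(screen_portfolio([("essay1", "original work"),
                        ("essay2", "a copied passage")], fake_checker))
# [{'item': 'essay2', 'similarity': 0.4}]
```

Because the portfolio already co-locates all of a student's work, the same loop supports the triangulation across assessment items described above.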
The flexibility of online portfolio assessment offers advantages in providing many different technical ways to insert information into portfolio containers and to access it. This flexibility of access raises information security problems that need to be addressed informatically in technically different ways dependent on the information processes and pathways.
Fraud can be a temptation for students with access to the hardware, software and skills that enable it. Tampering with electronic evidence is serious and conceptually falls under standard university regulations relating to, for example, misrepresentation of qualifications. In many cases, however, institutional responses to online fraud are not yet adequately supported by well-developed university regulations and guidelines. Designers of quality courses should nevertheless expect to make provision for an appropriate response to successful or unsuccessful attempts at IT-enabled fraud or security breaches.
Online security technologies are now relatively mature due to their development in other industries such as online banking, retail and share trading. Online portfolio assessment systems can usefully adopt many of the online security certification and server-based security methods developed in these fields. The triangulation facilitated by portfolio assessment also offers some protection against fraud (and also plagiarism).
For example, external and associate examiners can enter marks or competency certification using secure web-based password protected forms directing secure server-side scripts to place an authenticated certificate in a student’s online portfolio container. This is especially relevant to courses that are competency-based, have practical components, or require professional accreditation. Online portfolio assessment in these circumstances would likely require a secure interface for external authorised and authenticated assessors to enter reports about individual students whose work they have observed. Potentially, such a secure interface would also offer the means for external and internal assessors to certify they have sighted originals of certificates from elsewhere.
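The sketch below illustrates how such a server-side script might record an assessor's certification so that later tampering is detectable, using an HMAC over each entry. The key handling, field names and container structure are assumptions made for illustration; a production system would sit behind the institution's authentication and secure-transport infrastructure.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

# Assumption: the key is generated and held server-side by the institution.
SERVER_SECRET = b"replace-with-institutionally-managed-key"

def sign_record(record: dict) -> str:
    """Return an HMAC tag over the canonical JSON form of the record."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(SERVER_SECRET, payload, hashlib.sha256).hexdigest()

def certify(portfolio_container: list, assessor_id: str, statement: str) -> dict:
    """Append an authenticated certification entry to a student's portfolio container."""
    record = {
        "assessor": assessor_id,   # identity established by the secure web login, not shown here
        "statement": statement,    # e.g. competency observed, or original certificate sighted
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    record["signature"] = sign_record(record)
    portfolio_container.append(record)
    return record

def is_untampered(record: dict) -> bool:
    """Recompute the tag to detect later modification of a stored record."""
    body = {k: v for k, v in record.items() if k != "signature"}
    return hmac.compare_digest(record.get("signature", ""), sign_record(body))
```

The same signing step can be applied to any evidence item placed in the container, which is one way of providing the tamperproof recording referred to in Table 1.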
Many universities claim that students enrolled in undergraduate courses develop ‘generic skills’ or ‘graduate attributes’. Assessing ‘graduate attributes’ is made difficult, however, in awards in which courses have been modularised. This difficulty arises because the process of developing ‘graduate attributes’ spans modules, which means that assessment of graduate attributes cannot be successfully contained within the normal module assessment processes. The use of student portfolios has the capacity to resolve this problem because portfolios can collate student work across modules and supplement formal assessments with evidence from a range of sources including practicum and work-based learning. Module-based education implies different stakeholders associated with each module. The use of online environments for portfolio-based assessment offers potential for improving value for module stakeholders through reducing the transaction costs of accessing student assessment information associated with graduate attributes. Online portfolio assessment systems can also facilitate the quality assurance compliance associated with graduate attribute assessment and hence offer value to stakeholder groups at higher institutional levels.
Online portfolios can offer an effective means for demonstrating graduate attributes where portfolio assessment is based on the combination of performance criteria and evidence. Performance criteria derived from generic graduate attributes can be included in portfolio assessment processes with other performance criteria derived from specific professional competencies and skills. Students can use evidence collected to satisfy unit-based assessment as evidence they possess particular graduate attributes. Where courses are intending to assess graduate attributes, portfolios provide an overview of each student’s educational development that transcends the unit boundaries that occur within modular courses. This offers a significant improvement in educational efficiency because it removes the need for a separate graduate assessment mechanism, and minimises students’ documentation effort.
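The collation role described above can be pictured as a mapping from unit-level evidence to performance criteria that are themselves tagged with graduate attributes. The structures below are illustrative assumptions only; they show how evidence gathered for unit assessment could be reused to report attribute coverage across module boundaries.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Criterion:
    code: str                        # e.g. a unit performance indicator
    graduate_attributes: List[str]   # attribute names the criterion evidences

@dataclass
class Evidence:
    unit: str                        # the module/unit in which the item was produced
    description: str
    criteria: List[Criterion]        # criteria the evidence addresses

def attribute_coverage(items: List[Evidence]) -> Dict[str, List[Tuple[str, str]]]:
    """Collate evidence across units against each graduate attribute."""
    coverage = defaultdict(list)
    for item in items:
        for criterion in item.criteria:
            for attribute in criterion.graduate_attributes:
                coverage[attribute].append((item.unit, item.description))
    return dict(coverage)

# Example: one piece of unit evidence also demonstrates two graduate attributes.
teamwork = Criterion("PC-3.1", ["teamwork", "communication"])
items = [Evidence("ISYS201", "group project report", [teamwork])]
print(attribute_coverage(items))
# {'teamwork': [('ISYS201', 'group project report')], 'communication': [('ISYS201', 'group project report')]}
```

Because the coverage report is derived from evidence students already submit for unit assessment, no separate graduate attribute assessment mechanism is needed.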
The approach taken in this paper is to focus on the roles of online portfolio assessment systems in terms of value generation and distribution to all stakeholders. Stakeholders interact with the online portfolio assessment system via interfaces. These interfaces are directly related to the underlying processes that provide stakeholders with the value and benefits generated by the online system. This implies that several sorts of interface are required and that these are closely related to data input and formatted output processes associated with stakeholder specific value creation and distribution. This perspective suggests that dedicated interfaces are required for:
There are two main approaches to developing interfaces and systems for students to create their portfolios:
1. Provide online access to a complete suite of software (e.g. word processing, graphic and multimedia software) for a student to complete all aspects of their portfolio and not allow the use of any other software (for equity reasons).
2. Provide online access to the means to build a coherent portfolio out of individual elements created elsewhere. Students create individual assessment submissions that form the body of evidence outside the online portfolio system using whatever conventional proprietary document preparation software they have access to.
In both situations, the student also creates a commentary, the online meta-documentation that explicitly links and explains the connections between different elements of evidence in the portfolio and the assessment criteria (performance indicators). The expected outcome would be a unified portfolio of work that contains all the necessary elements of document structure, such as:
· Table of contents
· Tables of evidence
· Evidence
· Performance indicators/criteria
· Commentaries on the relationships between specific elements of evidence and performance indicators
· Reports from approved assessors (e.g. practicum supervisors’ reports, and competency certification from approved assessors).
Each of the above may also require individual interfaces.
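To show how these document elements might hang together in an online container, the sketch below groups them into a single structure; the field names are assumptions made for illustration rather than a prescribed schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class CommentaryEntry:
    evidence_ref: str    # which evidence item the commentary points to
    criterion_ref: str   # which performance indicator it addresses
    text: str            # the student's explanation of the link

@dataclass
class PortfolioDocument:
    table_of_contents: List[str] = field(default_factory=list)
    evidence: Dict[str, str] = field(default_factory=dict)              # item name -> stored file or reference
    performance_criteria: Dict[str, str] = field(default_factory=dict)  # criterion code -> wording
    commentaries: List[CommentaryEntry] = field(default_factory=list)
    assessor_reports: List[dict] = field(default_factory=list)          # e.g. practicum supervisors' reports

    def table_of_evidence(self) -> List[Tuple[str, str]]:
        """Derive the table of evidence from the commentary links rather than
        asking the student to maintain it by hand (one small automation)."""
        return [(c.criterion_ref, c.evidence_ref) for c in self.commentaries]
```

Whichever of the two portfolio-building approaches is adopted, the interfaces for students, assessors and administrators would read from and write to the different fields of a structure of roughly this kind.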
The computerised and internetworked processes that facilitate online portfolio assessment informatic processes require appropriate choices of cognitive artifacts. At the simplest level these may relate to iconic representations on screen. At a more complex level, they are associated with the choice of metaphors through which the details of the complex of informatic processes and states and the navigation through them is rendered more comprehensible to users. The highest level of abstraction includes the models that map the relationships between the cognitive artifacts, as perceived by the users of the online environment, to the real world practical issues associated with teaching, learning and assessment involving real people.
The practical aspects of these design elements involve more general choices about hardware and software technologies. The issues of concern here are essentially practical: for example, the scope of software environments, the overheads and resource costs of designing and maintaining those environments, and the practical correspondences between different forms of hardware and software (i.e. will software A run on platform X alongside version Y of software Z?). These technical issues are addressed after other system design issues.
The main points of the above are listed in Table 1 below.
Table 1: Issues and responses in the designing of information systems for online portfolio assessment
Issue: Stakeholders
Design response: Identify all the stakeholders and stakeholder groups that potentially obtain value and benefit from an online portfolio assessment system. Pay particular attention to those stakeholder groups who gain benefit from automating their administrative processes to reduce transaction costs.

Issue: Automation of administrative functions
Design response: The main benefits offered by the use of the online environment are in reducing transaction costs for stakeholders through the automation of administrative processes. A primary aim of the design process is to identify where significant value can be provided to stakeholders through online automation of routine activities.

Issue: Quality Assurance
Design response: View in terms of quality improvement. Use appropriate methods for transparently moderating portfolios, evaluating portfolio administration and educational processes, and providing feedback to appropriate stakeholder groups.

Issue: Equity Issues
Design response: Use standardised and restricted hardware and software platforms and interfaces. Use standard input templates where possible. Train students in portfolio-building theory and in using the online system.

Issue: Plagiarism
Design response: Integrate automated and manual plagiarism detection into the online portfolio system, with reporting at appropriate levels to relevant stakeholders.

Issue: Fraud
Design response: Implement conventional security systems to enable tamperproof recording, information storage and data access processes.

Issue: Graduate Attributes
Design response: Implement a cross-module portfolio assessment system using performance indicators for generic attributes. Train students in skill recognition and in the use of documents as evidence of generic skills.

Issue: Interface Issues
Design response: Identify potential value creation and distribution opportunities for all stakeholders. Identify the online and offline activities associated with these value processes. Identify stakeholder-based preferences for accessing the value streams. Define appropriate interfaces for stakeholders to access value streams whilst minimising their transaction costs.

Issue: Information storage
Design response: Define information storage needs and processes in terms of creating value for stakeholders. Treat technical decisions relating to information storage methods as secondary to value creation.

Issue: Hardware and software technology decisions
Design response: Educational issues, gaining the benefits of automation, and creating stakeholder value are primary to the design brief and take precedence over decisions about the technical means of implementing the online portfolio system. Make hardware and software decisions after other system design decisions.

Issue: Discipline-related factors and technology choice
Design response: Ensure that the technology implementation aligns with the skills typical of users of the online portfolio assessment system.

Issue: Cognitive and informatic artefacts
Design response: Base the choice and definition of cognitive and informatic artefacts on a clear understanding of how value creation and distribution is undertaken in the online portfolio assessment system.
The paper has outlined the findings from recent research undertaken by the authors into designing online portfolio assessment systems that align with best practice in educational course design and that focus on the roles of online portfolio assessment systems in creating and distributing value to all stakeholders.
The research indicated that, at present, online portfolio assessment is widely undertaken without sufficient attention being given to course design issues and the potential benefits to all stakeholders from automating administrative processes specific to individual stakeholder groups. The majority of instances of online portfolio assessment systems found on the Internet consist of web-based electronic facsimiles of conventional documents. In many cases, the ‘portfolio’ consists of a single essay, project report or term paper. This approach does not gain the benefits of the online environment, does not address the disadvantages or equity issues, and does not ensure increased value distribution to all stakeholders. Experience from implementing conventional portfolio assessment indicates that this form of online portfolio assessment is likely to compare unfavourably with conventional portfolios and other conventional assessment modalities if subjected to critical review in educational and administrative terms.
The paper has put forward an alternative design approach that focuses on designing online portfolio assessment systems to increase value for all stakeholder groups. The proposed approach involves: identifying the details of the educational position from which the online system is designed and evaluated; identifying appropriate value distribution to stakeholders; and developing an online system design that fulfils the requirements of the course design criteria, uses best practice in course design, and utilises the extensive potential for increased value creation for stakeholders through process automation. In this approach, these educational analyses, the benefits of automation, and creating stakeholder value are regarded as primary aspects of the design brief, and take precedence over consideration of the technical means by which the online portfolio system is implemented.
AAHE. (2001). Electronic Portfolios: Emerging Practices for Students, Faculty and Institutions, [html document]. AAHE. Available: http://aahe.ital.utexas.edu/electronicportfolios/index.html.
ANTA/AVCC. (2000). Pathways to Partnerships. Australian Vice-Chancellors Committee. Available: www.avcc.edu.au/policies_activities/teaching_learning/credit_transfer/index.htm [2001, July].
Australian Computer Society. (2001). Accredited Tertiary Courses, [html file]. Australian Computer Society. Available: www.acs.org.au/national/accreditation/intro.htm [2001, June].
Australian Nursing Council Inc. (2000). National Competency Standards for the Registered Nurse. Adelaide: Australian Nursing Council Inc.
Australian Quality Council. (2000). Business Excellence An Overview. Canberra: Australian Quality Council.
Barrett, H. C. (2000). Electronic Portfolios, School Reform and Standards, [html file]. Barrett, Helen C. Available: http://transition.alaska.edu/www/portfolios/PBS2.html [2001, May].
Barrett, H. C. (2000). Create Your Own Electronic Portfolio, [html file]. Barrett, Helen C. Available: http://transition.alaska.edu/www/portfolios/iste2k.html [2001, May].
Barrett, H. C. (2000). The Electronic Portfolio Development Process, [html file]. Barrett, Helen C. Available: http://transition.alaska.edu/www/portfolios/aahe2000.html [2001, May].
Barrett, H. C. (2000). Collaborative Planning for Electronic Portfolios: Asking Strategic Questions, [html file]. Barrett, Helen C. Available: http://transition.alaska.edu/www/portfolios/planning.html [2001, May].
Biggs, J., & Tang, C. (1997). Assessment by portfolio: Constructing learning and designing teaching. Research and Development in Higher Education, 79-87.
Bowie, C., Taylor, P., Zimitat, C., & Young, B. (2000). Electronic Course portfolio in a new on-line Graduate certificate in Flexible Learning. HERDSA News(April).
Bradley, D. (2000). Distance education: An Open Question?, [html document]. University of South Australia. Available: www.com.unisa.edu.au/cccc/papers/keynote_address.htm [2001, July].
Brooks, B. A., & Madda, M. (1999). How to organize a professional portfolio for staff and career development. Journal for Nurses in Staff Development, 15(1), 5-10.
Cooper, T. (1997). Portfolio assessment: a guide for students. Perth: Praxis Education.
Cooper, T. (1999). Portfolio assessment: A guide for lecturers teachers and course designers. Perth: Praxis Education.
Cooper, T., Hutchins, T., & Sims, M. (1999). Developing a Portfolio which demonstrates Competencies. In M. Sims & T. Hutchins (Eds.), Learning materials: Certificate in Children’s Services; 0-6 years (bilingual support) (pp. 3-29). Perth: Ethnic Childcare Resource Inc. Western Australia.
Cooper, T., & Emden, C. (2000). Portfolio Assessment: A Guide for Nurses and Midwives. Perth: Praxis Education.
Cooper, T., & Love, T. (2000). Portfolios in university-based design education. In C. Swann & E. Young (Eds.), Re-inventing Design Education in the University (pp. 159-166). Perth: School of Design, Curtin University.
Cooper, T., & Love, T. (2001). Online Portfolio Assessment in Information Systems. In S. Stoney & J. Burn (Eds.), Working for Excellence in the E-conomy (pp. 417-426). Perth: We-B Research Centre, Edith Cowan University.
Cooper, T., & Love, T. (2002). Online portfolios: issues of assessment and pedagogy. In P. Jeffrey (Ed.), AARE 2001: Crossing Borders: New Frontiers of Educational Research. Coldstream, Victoria: AARE Inc.
Curriculum Council. (1998). Curriculum Framework for Kindergarten to Year 12 Education in Western Australia. Perth, WA: Curriculum Council.
DETYA. (1999). The Quality of Australian Higher Education: An Overview, [html document]. DETYA. Available: www.detya.gov.au/archive/highered/pubs/quality/overview.htm [2000, March].
DETYA. (2001). Higher Education Report for the 2001-2003 Triennium (6655HEPA01A). Canberra: Department of Education, Training and Youth Affairs.
Education Department of Western Australia. (2000, March 2000). Assessment - Portfolios, [html file]. Education Department of Western Australia. Available: www.eddept.wa.edu.au/centoff/outcomes/focus/fc429.htm [2001, May].
Education Department of Western Australia. (2000, March 2000). Review Classroom Approaches to Student Assessment, [html file]. Education Department of Western Australia. Available: www.eddept.wa.edu.au/centoff/outcomes/focus/fc42.htm [2001, May].
Education Department of Western Australia. (2000, March 2000). Review of School Assessment Policy, [html file]. Education Department of Western Australia. Available: www.eddept.wa.edu.au/centoff/outcomes/focus/fc43.htm [2001, May].
Engineering Council. (1997). Standards and Routes to Registration (SARTOR), [html]. Engineering Council. Available: www.soe.org.uk/soe.org/soe/engcoun.htm [2001, May 2001].
Fried, M. (n.d.). Portfolios at University of California, Irvine. Interactive E-mail; Enhanced Student Assessment, [html document]. American Association for Higher Education [2001, Aug].
Herrington, A., & Bunker, A. (2002). Quality Teaching Online: Putting Pedagogy First. In A. Goody & J. Herrington & M. Northcote (Eds.), Research and Development in Higher Education. Annual International HERDSA Conference 7-10 July 2002 Volume 25. Perth: HERDSA Inc.
IntegriGuard.com. (2001). IntegriGuard, [html file]. IntegriGuard.com. Available: www.integriguard.com/ [2001, May].
iParadigms.com. (2001). Turnitin, [html file]. iParadigms.com. Available: http://iparadigms.com/turnitin.html [2001, May].
Kearns, L. (2000). School assessment: does it pass the test?, Theage.com.au (Vol. 2001): Theage.com.au.
Kemp, D. (1999). Knowledge and Innovation: A policy statement on research and research training. Canberra: Legislative Services, AusInfo.
McCracken, W. M. (1997). Portfolio Assessment in Design Education. Atlanta: EduTech Institute and College of Computing, Georgia Institute of Technology.
Oliver, R., & Herrington, J. (2001). Teaching and Learning Online. Perth: Centre for Research in Information Technology and Communications, Edith Cowan University.
Plagiarism.com. (2001). Glatt Plagiarism Screening Program, [html file]. Plagiarism.com. Available: www.plagiarism.com/screen.id.htm [2001, May].
Rudner, L. M. (2001). Computer grading using Bayesian Networks - Overview, [html document]. Rudner, Lawrence M. Available: http://ericae.net/betsy/bayesian_ov.htm.
Sewell, M., Marczac, M., & Horn, M. (n.d.). The Use of Portfolio Assessment in Evaluation, [html document]. University of Arizona [2001, Aug].
Technology and Industry Advisory Council. (2000). Export of Western Australian Education & Training: Constraints & Opportunities, [pdf file]. Technology and Industry Advisory Council. Available: www.wa.gov.au/tiac/exportedu/index.html [2001, June].
Terrell, D., & May. (2001). The Deane Terrell Inquiry Report of the External Reviewer appointed to examine issues arising from a case of alleged plagiarism. A report commissioned by: Professor Lance Twomey Vice-Chancellor Curtin University of Technology Perth, Western Australia. Perth, Western Australia: Curtin University of Technology.
Thompson, R., & Farrow, T. (1999). The Workbook Portfolio: Facilitating undergraduate student learning in the mental health clinical area. Nursing Praxis in New Zealand, 14(2), 21-30.