Brian Kelly, UKOLN, University of Bath, Bath, UK <B.Kelly@ukoln.ac.uk>
Marieke Guy, UKOLN, University of Bath, Bath, UK <M.Guy@ukoln.ac.uk>
Hamish James, AHDS, Kings College London, UK <Hamish.James@ahds.ac.uk>
In this paper the authors describe approaches for the deployment of quality assurance (QA) procedures for a digital library programme. The authors argue that formal QA procedures are needed in order to ensure that deliverables from digital library programmes will be interoperable and can easily be deployed and repurposed. The adoption of open standards is acknowledged as essential in development programmes, but in a distributed development environment it can be difficult to ensure that programme deliverables actually implement appropriate standards and best practices. The authors describe the approaches to the development of a quality culture being taken by the JISC in the UK in its development of an Information Environment, which seeks to provide seamless access to quality scholarly resources.
Keywords: quality assurance, QA
The World-Wide Web is accepted as the key delivery platform for digital library services. The Web promises universal access to resources and provides flexibility, including platform- and application-independence, through use of open standards. In practice, however, it can be difficult to achieve this goal. Proprietary formats are often felt to be appealing and, as we learnt to our cost during the "browser wars", software vendors can promise open standards while deploying proprietary extensions, which can result in services which fail to be interoperable. Developers can also be unsure as to which standards may be applicable to their area of work: there is a danger that simple standards, such as HTML, are used when richer standards, such as XML, could provide greater interoperability.
The JISC (Joint Information Systems Committee) has funded a QA Focus post which aims to ensure that projects make use of QA (quality assurance) procedures which will help ensure interoperability through use of appropriate standards and best practices.
A summary of the work of QA Focus is provided in this paper. The paper describes the background to IT development in the UK's Higher Education community, the role of standards and the approaches taken by QA Focus. The paper concludes by outlining future work for QA Focus and the potential for use of similar approaches by other digital library programmes.
The UK's Higher Education community has a culture which is supportive of open standards in its IT development programmes. Within the eLib Programme, for example, the eLib Standards Guidelines [1] were released in 1996 which defined the standards funded projects were expected to implement.
Although the Standards Guidelines document was available shortly after the start of the programme, compliance was not enforced. There was recognition of the dangers of enforcing standards too rigidly in those early days of the Web: if the programme had started a few years earlier, use of Gopher could well have been chosen as the standard delivery mechanism! In addition the UK Higher Education community had previously attempted to standardise on Coloured Books networking protocols, which subsequently failed to be adopted widely and were eventually superseded by Internet protocols.
The eLib programme encouraged a certain amount of diversity: this approach of letting a "thousand flowers bloom" was probably sensible for the mid-1990s, before it was clear that the Web would be the killer application which, with hindsight, we recognise it to be. This approach also reflected the culture of software development in an HE environment, in which strict management practices are not the norm and there has been a tendency to allow software developers a fair amount of freedom.
Nowadays, however, there is increased recognition of the need for a more managed approach to development. The Web is now recognised as the killer application. Project deliverables, which are often Web-based, can no longer be treated as self-contained services - there is a need for them to interoperate. Stricter compliance with standards will also be needed: Web browsers have been tolerant of errors in HTML resources, but this will change in a world in which "Web Services" technologies rely on well-structured resources for machine processing. Finally, JISC has moved on from a research and experimental approach and is now funding programmes in which project deliverables are normally expected to be deployed in a service environment.
The JISC's Information Environment (IE, formerly DNER) [2] seeks to provide seamless access to scholarly resources which are distributed across a range of providers, including centrally-funded JISC services, commercial providers and the institutions themselves. The Standards and Guidelines To Build A National Resource document [3] was written to define the standards which form the basis for the IE. The standards document is supported by an IE Architecture [4] which describes the technical architecture of the IE.
The JISC has funded a number of programmes in order to develop the IE, including 5/99 [5] which was followed by the FAIR [6] and X4L [7] programmes.
JISC has recognised that there is a need for the JISC-funded programmes to be supported by a post which ensures that projects comply with standards and best practices. The QA Focus post has been funded for two years (from 1 January 2002) to support the JISC 5/99 programme. Initially the post was provided by UKOLN (University of Bath) and ILRT (University of Bristol), but, following a decision to refocus on other areas, in January 2003 ILRT were replaced by AHDS (Arts and Humanities Data Service).
QA Focus aims to provide a support service to 5/99 projects: the emphasis is on advice and support, based on close links with the projects, rather than a policing role. An important deliverable will be the development of a self-assessment toolkit which can be used by the projects themselves for validation of the project deliverables.
QA Focus is addressing a range of technical areas which include digitisation, Web (including accessibility), metadata, software development and service deployment.
The areas of work being carried out by QA Focus are outlined below.
Although QA Focus places an emphasis on its role in supporting projects in developing their own QA procedures, in cases of severe interoperability problems QA Focus will be expected to make contact with the project concerned and seek to ensure that concerns are addressed. If this does not result in a satisfactory solution, the issue will be passed on to the JISC.
A number of workshop sessions have been held with a selection of the projects. The first two workshops aimed to obtain feedback from the projects on (a) the Standards document, (b) implementation experiences and (c) deployment of project deliverables into a service environment.
The workshops provided valuable feedback which has proved useful for identifying key areas which need to be addressed. Useful information was obtained about the Standards document, including a lack of awareness of the document in some cases, concerns over its change control (since new standards may be developed and other standards may fail to gain acceptance), uncertainties as to the appropriateness of some of the standards, and deployment difficulties in other cases, especially for projects which were reliant on third-party software development or on existing systems which cannot easily be modified. The feedback on implementation experiences raised several predictable issues, including the poor support for Web standards in many widely-used browsers. The lack of a technical support infrastructure was highlighted by several projects, mainly those based in academic departments or in smaller institutions.
A meeting of 5/99 projects was held at the University of Nottingham on 30 October - 1 November 2002. Prior to the meeting QA Focus carried out a survey of various aspects of 5/99 project Web sites. The survey findings [8] were made available and formed the basis for discussions at the QA Focus workshop sessions.
The surveys made use of a number of freely available tools, all of which had a Web interface. This meant the methodology was open and the tools could be used by the projects themselves without the need to install software locally. The survey findings were published openly. This allowed examples of best practices to be seen, trends to be monitored and areas which projects found difficult to implement to be identified.
The surveys were complemented by a number of brief advisory documents. In addition a number of case studies have been commissioned which allow the projects themselves to describe their approaches to compliance with standards and best practices, any difficulties they have experienced and lessons they have learnt.
Survey | Tool | Information
HTML Compliance | W3C's HTML validator | Does home page comply with HTML standards? What DTDs are used?
CSS Compliance | W3C's CSS validator | Does home page use CSS? Does the CSS comply with standards?
Accessibility | Bobby | Does home page comply with W3C WAI guidelines?
404 page | Manual observation | Does the 404 page provide navigational facilities and help to users?
Internet Archive | Manual observation | Is the Web site available in the Internet Archive?
PDA Access | AvantGo | Can the Web site be accessed by a PDA?
XHTML Conversion | W3C's Tidy tool | Can the Web site be converted to XHTML without loss of functionality?
WML Conversion | Google WAP conversion service | Can the Web site be converted to WML without loss of functionality?
HTTP Headers | Dundee's HTTP analysis tool | Are correct HTTP headers sent? What Web server environment is used?
Metadata | W3C's Tidy and RDF validator & UKOLN's DC-dot tools | Is Dublin Core metadata used? Does it comply with standards?
Table 1: Initial QA Focus Surveys
The surveys aimed to establish how well project Web sites complied with standards and best practices. The surveys addressed several areas related to Web technologies, including compliance with HTML and CSS standards and compliance with W3C's Web Accessibility Initiative (WAI) guidelines for project entry points. The HTTP headers were analysed and details of the Web server platform recorded (together with details of invalid HTTP headers).
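The Dundee HTTP analysis tool used in the survey is a Web-based service; as a rough illustration of the kind of information recorded, the following sketch (our own illustration, not part of the QA Focus toolset, with an illustrative URL) retrieves the HTTP headers for an entry point using the Python standard library:

```python
# Minimal sketch: report the HTTP headers of interest to the survey
# (server platform, content type, last modification date).
from urllib.request import Request, urlopen

def report_headers(url):
    # A HEAD request is sufficient since only the headers are of interest
    response = urlopen(Request(url, method="HEAD"))
    headers = response.headers
    print("Server:       ", headers.get("Server", "(not disclosed)"))
    print("Content-Type: ", headers.get("Content-Type", "(missing)"))
    print("Last-Modified:", headers.get("Last-Modified", "(missing)"))

report_headers("http://www.ukoln.ac.uk/")
```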
As well as testing compliance with well-defined standards, the surveys also used a number of tools which helped establish whether the Web sites could be repurposed. This included checking the availability of project Web sites in the Internet Archive, using the AvantGo service to test access to project Web sites on a PDA, and converting the Web sites to WML and viewing them in the Opera browser, which provides a WAP emulator.
The survey also used a simple usability test by reporting on the approach taken to the Web site 404 error page: whether the 404 error page was branded, provided helpful information and appropriate links, etc.
Metadata embedded in project Web site entry points was tested and any Dublin Core metadata found was validated using a Dublin Core validation tool developed at UKOLN. In addition the Dublin Core metadata was converted to RDF format and then visualised, allowing an alternative display of the metadata to be viewed.
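As an illustration of how embedded Dublin Core metadata can be located before it is passed to a validation tool such as DC-dot, the following sketch (ours, not the tool itself) collects DC.* <meta> elements from an HTML page using the Python standard library:

```python
# Illustrative sketch: gather Dublin Core elements embedded in HTML
# <meta> tags, using the common "DC." name prefix convention.
from html.parser import HTMLParser

class DCMetadataParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.dc_elements = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = attrs.get("name", "")
        if name.lower().startswith("dc."):
            self.dc_elements.append((name, attrs.get("content", "")))

page = '<meta name="DC.Title" content="QA Focus"><meta name="DC.Creator" content="UKOLN">'
parser = DCMetadataParser()
parser.feed(page)
print(parser.dc_elements)   # [('DC.Title', 'QA Focus'), ('DC.Creator', 'UKOLN')]
```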
There is a danger that the publication of the findings can be perceived as threatening to projects. Where the findings indicate a lack of compliance with standards or a failure to implement best practices, projects may point out particular features of their project which the surveys fail to acknowledge, limitations of the tools used, the timing of the survey and the available resources.
There is an element of truth in such concerns. The projects are addressing a diverse set of areas, including digitising content, enhancing existing services and software development. The project Web sites will also have a diverse set of objectives, including providing communications with project partners, providing information about the project and providing access to project deliverables. The projects will have different levels of funding, start and completion dates and technical expertise.
Despite these reservations it is felt that significant benefits can be gained from the QA Focus approach. The openness seeks to facilitate dialogue with projects and the sharing of best practices. The approach also takes what can be perceived as a dry standards document and places it more centrally in the activities of the projects. It also helps to provide feedback on the standards: if a particular standard has not been adopted this may indicate that the standard is too esoteric or that there is a lack of tools or expertise. Such considerations can be fed back to the authors of the Standards document.
An important role of QA Focus is to ensure that appropriate documentation is provided for the projects. The approach that has been taken is to produce short advisory documents which address specific problems. This approach has the advantage that documents can be written more quickly and can be easily updated.
A summary of the documents published to date is given in Table 2. The documents can be accessed at [9].
Document | Description
Checking Compliance With HTML and CSS Standards | Summarises a number of approaches for checking that HTML resources comply with HTML and CSS standards
Use Of Automated Tools For Testing Web Site Accessibility | Describes tools such as Bobby and summarises the implications of common problem areas
Use Of Proprietary Formats On Web Sites | Provides suggestions for techniques when using proprietary formats such as MS Word and PowerPoint
404 Error Pages On Web Sites | Describes best practices for providing user-friendly 404 error pages
Accessing Your Web Site On A PDA | Describes an approach for making a Web site available on a PDA
Approaches To Link Checking | Describes approaches for link-checking, including links to CSS & JavaScript files
Search Facilities For Your Web Site | Describes different approaches for providing search facilities on project Web sites
Enhancing Web Site Navigation Using The LINK Element | Provides advice on use of the HTML <link> element to provide enhanced Web site navigation
Image QA In The Digitisation Workflow | Provides advice on QA for images
What Are Open Standards? | Gives an explanation of open standards
Mothballing Your Web Site | Provides advice on "mothballing" a Web site, when funding ceases and the Web site will no longer be maintained
How To Evaluate A Web Site's Accessibility Level | Describes approaches for checking Web accessibility
Table 2: QA Focus Advisory Documents
The advisory documents are complemented by case studies which are normally written by the project developers themselves. The case studies provide a response to the common request of "Can you tell me exactly what approaches I should be using?". It is not possible to provide a single answer to this question as there are many projects, addressing a range of areas and with their own background and culture. It is also not desirable to impose a particular solution from the centre. The case studies allow projects to describe the solution which they adopted, the approaches they took, any problems or difficulties they experienced and lessons learnt.
A number of case studies have been published to date. Access to these documents is available at [10].
Once the QA Focus work in the Web area has been finalised, work will move on to a number of other areas including digitisation, multimedia, metadata, software development and deployment into service.
The initial work carried out by QA Focus made use of automated tools to monitor compliance with standards and best practices. In the areas listed above there will be a need to address the use of manual QA processes as well as the use of automated tools. For example, the use of correct syntax for storing metadata can be checked using software, but ensuring that textual information is correct cannot be done using only automated processes.
As well as providing advice and support for the projects, QA Focus will also provide advice to JISC on best practices for the termination of the programme and for setting up new programmes. This will include development of FAQs (along the lines of those which have been developed by UKOLN to support its role in providing the Technical Advisory Service for the nof-digitise programme [11]).
Digitisation is the first stage in the creation of a resource, and it represents the link between the analogue and digital worlds. The consequences of poor quality digitisation will flow through the entire project, reducing the value of all later work.
QA for digitisation is therefore very important but, with the exception of the digitisation of bitmap images [12], there is relatively little advice and support that is accessible to the non-specialist. QA Focus will provide QA guidance for image digitisation, but will also deal with other types of material including text, audio and moving images. We will also link the process of capturing data to the next step, organising the data once it is in digital form, by providing QA for databases and XML applications in particular.
Digitisation typically has some of the qualities of a production line, with analogue originals being retrieved, digitised and returned while digital files are created, edited and stored. Rigorous procedures can make sure that this process goes smoothly, ensuring that originals are not missed or mislaid and that the status of digital files (particularly what post-processing has been applied to them and which original or originals they relate to) is tracked. This type of quality assurance for the digitisation workflow is well established for images, and we aim to provide analogous advice for projects digitising other types of material, including checklists and model procedures to follow.
Ensuring consistent quality, and keeping records that demonstrate this, is a vital part of digitisation that indirectly affects interoperability by ensuring that, however the final resource is accessed, users can make informed use of it.
Structured metadata provides a useful mechanism for recording aspects of the digitisation process. QA Focus will review relevant existing and emerging standards. We will also investigate tools for the semi-automatic or automatic creation of technical metadata about digitised material.
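As an indication of what semi-automatic creation of technical metadata might involve, the sketch below records basic characteristics of a digitised image. It assumes the Pillow imaging library and a hypothetical file path; these are our illustrative choices rather than QA Focus recommendations:

```python
# Minimal sketch: capture basic technical metadata for a digitised image.
from PIL import Image

def technical_metadata(path):
    with Image.open(path) as img:
        return {
            "filename": path,
            "format": img.format,      # e.g. "TIFF" or "JPEG"
            "width": img.size[0],      # pixels
            "height": img.size[1],
            "colour_mode": img.mode,   # e.g. "RGB", "L" (greyscale)
        }

print(technical_metadata("scans/page-001.tif"))  # hypothetical file path
```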
Before any material is digitised, projects need to define their requirements for the digitised material. QA Focus advocates that projects take active responsibility for these decisions rather than allowing the capabilities of available technology to dictate them.
A key part of QA for digitisation is the development of objective, measurable criteria for judging if the digitised material is 'fit for purpose'. Determining what is fit for purpose involves consideration of the acceptable level of accuracy in digitisation in relation to the intended purpose of the digitised material. For example, a low resolution image may be suitable for a Web page, but a product also available on CD-ROM could include higher resolution images. Very similar situations occur with the digitisation of audio and moving images, but we will also address less obviously similar situations, such as rules for the standardisation of place names or the transliteration of text during transcription.
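To illustrate how such criteria can be made measurable, the following back-of-the-envelope calculation (illustrative figures only, not drawn from any project) compares an archival master with a low-resolution Web surrogate for an A4 original in 24-bit colour:

```python
# Illustrative arithmetic: pixel dimensions and approximate uncompressed
# size for an A4 original (8.27 x 11.69 inches) at two capture resolutions.
def scan_size(width_in, height_in, dpi, bytes_per_pixel=3):
    w_px = round(width_in * dpi)
    h_px = round(height_in * dpi)
    size_mb = w_px * h_px * bytes_per_pixel / (1024 * 1024)
    return w_px, h_px, size_mb

for label, dpi in [("Archival master", 300), ("Web surrogate", 72)]:
    w, h, mb = scan_size(8.27, 11.69, dpi)
    print(f"{label}: {w} x {h} pixels, approx. {mb:.1f} MB uncompressed")
```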
Digital files are easily copied and distributed, so it is important for projects to ensure that they have obtained any necessary rights to use the originals. Projects may also want to protect their own rights in the digitised material.
Intellectual Property law is a complex area and QA Focus will not be able to provide definitive answers, but we hope to produce a series of case studies that demonstrate how a project can best minimise the risk of infringing copyright. We will liaise with JISC's Legal Information Service [13], which has expertise in this area.
Metadata has a key role to play in ensuring that project deliverables are interoperable. However, unless QA procedures are deployed which ensure that the metadata content is correct, that it is represented in an appropriate format, that it complies with appropriate standards and that it can be processed unambiguously, difficulties in service deployment are likely.
While resource discovery metadata is central to interoperability, we will also investigate requirements for workflow, technical and rights metadata that support the digitisation process and deployment into service.
We are currently planning focus group sessions in which we will obtain feedback from groups with experience in metadata activities. This should provide us with examples of the type of approaches which can be recommended in order to ensure that metadata is interoperable.
A number of approaches are currently under consideration.
The QA procedures will be applied to metadata which is used in various ways including metadata embedded in HTML and XML resources, OAI metadata, educational metadata, RSS newsfeeds, etc.
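As a simple illustration of the split between automated and manual checks, the sketch below (our own example, not a QA Focus deliverable) confirms that an XML-based metadata record, such as an RSS item or OAI-PMH response, is at least well-formed; checking that its textual content is accurate remains a manual task:

```python
# Minimal sketch: a mechanical well-formedness check for XML metadata.
import xml.etree.ElementTree as ET

def check_well_formed(xml_text):
    try:
        ET.fromstring(xml_text)
        return True, None
    except ET.ParseError as error:
        return False, str(error)

sample = "<rss version='2.0'><channel><title>QA Focus News</title></channel></rss>"
ok, error = check_well_formed(sample)
print("Well formed" if ok else f"Parse error: {error}")
```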
A case study which describes the use of metadata in an e-journal, including details of the metadata elements used, the purpose of the metadata, the architecture for managing the metadata and the limitations of the approach has been published [16].
QA is crucial in the development of quality software. It is fundamental to the entire software development process from the initial systems analysis and agreement on standards through to problem handling and testing and software deployment. Once established, QA processes form a thread through the software development lifecycle and help developers focus on possible problem areas and their prevention.
Before the start of a software development project the project team should produce a detailed set of specifications documenting exactly what the software will do. Questions need to be asked about the purpose of the software and whether this purpose reflects the requirements of the user.
QA Focus will be providing case studies and briefing papers on these areas. Consideration of one possible design process for recording specific software development requirements, Unified Modelling Language (UML), is given in a case study provided by the Subject Portals Project [17].
QA Focus will be providing advice on standards for software documentation, both public and internal. Having clear documentation is especially important in a digital library programme in which short term contracts and high staff turnover are the norm [18]. In the long term good documentation can improve usability, reduce support costs, improve reliability and increase ease of maintenance. Throughout a project's lifetime information should be recorded on the software environment a package has been developed in, the language systems used and the libraries accessed.
Project teams will need to agree on standards used when writing software code. This should be done prior to development. QA Focus have produced a briefing paper which provides advice on how projects do this [19].
In the later stages of development work user documentation may be required. Writing documentation is a useful process that can show up bugs which have been missed in testing. Ideally the documentation writers are a different team of people from the developers and provide a different perspective on the software.
A software product should only be released after it has gone through a proper process of development, testing and bug fixing. Testing looks at areas such as performance, stability and error handling by setting up test scenarios under controlled conditions and assessing the results.
Before commencing testing it is useful to have a test plan which gives the scope of testing, details of the testing environment (hardware/software) and the test tools to be used. Testers will also have to decide on answers to specific questions for each test case, such as: what is being tested? How are results documented? How are fixes implemented? How are problems tracked? QA Focus will be looking mainly at automated testing, which allows testers to reuse code and scripts and to standardise the testing process. We will also be considering the documentation that is useful for this type of testing, such as logs, bug tracking reports, weekly status reports and test scripts. We recognise that there are limits to testing: no program can be tested completely. However the key is to test for what is important. We will be providing documentation on testing methodologies which projects should consider using.
As part of the testing procedure it is desirable to provide a range of inputs to the software, in order to ensure that the software can handle unusual input data correctly. It will also be necessary to check the outputs of the software. This is particularly important if the software outputs should comply with an open standard. It will be necessary not only to ensure that the output template complies with standards, but also that data included in the output template complies with standards (for example special characters such as & will need to be escaped if the output format is HTML).
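The following minimal sketch (hypothetical code, not taken from any 5/99 project) shows the kind of check involved: data is escaped before insertion into an HTML output template, and an automated test confirms that an unusual input, here an unescaped ampersand, is handled correctly:

```python
# Minimal sketch: escape data destined for an HTML template and verify the
# behaviour with an automated test.
import html
import unittest

def render_title(title):
    # Escape special characters such as & < > so the output remains valid HTML
    return f"<h1>{html.escape(title)}</h1>"

class RenderTitleTest(unittest.TestCase):
    def test_ampersand_is_escaped(self):
        self.assertEqual(render_title("Arts & Humanities"),
                         "<h1>Arts &amp; Humanities</h1>")

if __name__ == "__main__":
    unittest.main()
```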
The final area QA Focus will be looking at is the deployment of project deliverables in a service environment. It is unlikely that projects will migrate into a service directly - the intention is that many of the project deliverables will be transferred to a JISC service which will be responsible for deploying the deliverables into a service environment. In addition to the deployment into a service environment for use by end users, project resources may also need to be preserved. This is another area in which we will provide appropriate advice. Other work will address the issues involved in deploying software deliverables, digitised resources, Web sites, etc. into a service environment.
There are a number of scenarios for the deployment of project deliverables: the deliverables may be hosted by a national service, within an institution or on the user's desktop.
It may be necessary to consider any special requirements for the user's desktop PC. For example, will the service require a minimum browser version, will it require use of browser plugin technologies, are there any security issues (e.g. use of JavaScript), could institutional firewalls prevent use of the service, etc.?
Inevitably there are resource implications for the deployment of project deliverables into a service environment: consideration needs to be given to the time taken for deployment and the possible impact on other services (such as security, performance and compatibility issues). As well as these technical and resource issues there will be human aspects, including potential resistance to change or reluctance to make use of work carried out by others.
An interesting approach which sought to provide a simple syndication tool has been carried out by the RDN. The RDN-include tool provides access to subject gateways and allows the institution to control the look-and-feel of the gateway. However, as this tool is implemented as a CGI script it requires System Administration privileges in order to be deployed. It was felt that System Administrators may be reluctant to deploy the tool, due to concerns over potential security problems. In order to address such concerns RDNi-Lite was developed, which provides similar functionality but, as it is implemented using JavaScript, can be used by an HTML author: no special System Administration privileges are required. This example illustrates an approach which acknowledges potential deployment difficulties and provides an alternative solution. Further information on this approach is available [20].
An important aspect of this work will be to ensure that projects describe the development environment at an early stage, in order to ensure that services are aware of potential difficulties in deploying deliverables in a service environment. One could envisage, for example, a project which made use of innovative technologies, open source tools, etc. which the service had no expertise in. This could potentially make service deployment a costly exercise, even if open standards and open source products are used.
In addition to considerations of the deployment technologies, there is also a need to address the licence conditions of digitised resources. Again it would be possible to envisage a scenario in which large numbers of resources were digitised, some with licences which permitted use by all and some which limited use to the project's organisation. In this scenario it is essential that the rights metadata allows the resources which can be used freely to be identified and made available to the service, and that the production service can be deployed without making use of resources with licence restrictions.
Even if a project has a clear idea of its final service deployment environment, there may be additional requirements during the project's development. Within the context of the JISC 5/99 programme there is now an expectation that learning objects funded by the programme will be stored in a learning object repository. The Jorum+ project [21] has been set up to provide repositories of the learning objects.
There is also discussion of the need to provide a records management service to ensure that project documentation, such as project reports, is not lost after the end of the programme.
In both of these areas QA Focus is well-positioned to advise JISC and the projects on appropriate strategies, based on its work in advising on technical interoperability.
An important QA Focus deliverable will be a QA Self Assessment Toolkit which will allow projects to check their project QA procedures for themselves.
A pilot version of the toolkit is currently being tested. The pilot covers the QA requirements when mothballing a project Web site and other project deliverables once the project has finished and funding ceases [22].
The toolkit consists of a number of checklists with pointers to appropriate advice or examples of best practices. The toolkit is illustrated below.
Figure 1: Toolkit For Mothballing Web Sites
The toolkit aims to document the importance of standards in a readable manner, which can be understood by project managers as well as technical developers. The toolkit will make use of case studies which have been commissioned and appropriate advisory documents. Most importantly the toolkit will provide a checklist and, in a number of cases, a set of tools which will allow projects to assess project deliverables for themselves.
The structure of the toolkit is illustrated below.
QA Self-Assessment Toolkit
Area: Access (e.g. Web resources, accessibility, ...).
Importance: Will describe the importance of standards and best practices, including examples of things that can go wrong.
Standards: Will describe the relevant standards.
Best Practices: Will describe examples of best practices.
Tools: Will describe tools which can be used to measure compliance with standards and best practices.
Responsibility: Person responsible for policy and compliance.
Exceptions: Description of allowable exceptions.
Compliance: Description of approaches for ensuring compliance.
Figure 2: QA Self-Assessment Toolkit Structure
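Purely for illustration, the same structure could be captured as a simple data record which a project might complete for each area; the field values below are hypothetical:

```python
# Illustrative sketch: one toolkit entry recorded as structured data,
# mirroring the headings in Figure 2.
from dataclasses import dataclass

@dataclass
class ToolkitEntry:
    area: str
    importance: str
    standards: str
    best_practices: str
    tools: str
    responsibility: str
    exceptions: str
    compliance: str

entry = ToolkitEntry(
    area="Access (Web resources)",
    importance="Non-compliant pages may be unusable in some browsers",
    standards="XHTML 1.0, CSS",
    best_practices="Validate pages before publication",
    tools="W3C HTML and CSS validators",
    responsibility="Project Web editor",
    exceptions="Automatically generated legacy documents",
    compliance="Monthly batch validation of the Web site",
)
print(entry.area)
```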
In the area of standards compliance for Web resources software tools can be used to check for compliance with standards. An article on "Interfaces To Web Testing Tools" describes the use of "bookmarklets" and a server-based interface to testing tools [23].
In a number of areas the use of software tools will be documented. The documentation will include a summary of the limitations of the tools, and ways in which the tools can be used for large-scale deliverables. This may include testing of significant deliverables, sampling techniques, etc.
We are using the methodologies described in this paper for in-house QA for the QA Focus Web site. This is being done in order to ensure that the Web site fulfils its role, to test our own procedures and guidelines and to gain experience of potential difficulties.
The approach used is to provide a series of policy documents [24]. The policies follow a standard template, which describes the area covered, the reason for the policy, approaches to checking compliance, allowable exceptions and audit trails, as illustrated below.
Policy On Standards For QA Focus Web Site
Area: Web
Policy: The Web site will be based on XHTML 1.0 and CSS 2.0.
Justification: Compliance with appropriate standards should ensure that access to Web resources is maximised and that resources can be repurposed using tools such as XSLT.
Responsibilities: The QA Focus project manager is responsible for this policy. The Web editor is responsible for ensuring that appropriate procedures are deployed.
Exceptions: Resources which are derived automatically from other formats (such as MS PowerPoint) need not comply with standards. In cases where compliance with this policy is felt to be difficult to implement the policy may be broken. However in such cases the project manager must give agreement and the reasons for the decision must be documented.
Compliance measures: When new resources are added to the Web site, or existing resources are updated, the ,validate tool will be used to check compliance. A batch compliance audit will be carried out monthly.
Audit trail: Reports from the monthly audit will be published on the Web site. The QA Focus Blog will be used to link to the audit.
Further information: Links to appropriate QA Focus documents.
Figure 3: QA Policy For QA Focus Web Site
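A monthly batch audit of the kind described in the compliance measures could be scripted along the following lines; the page list is illustrative and the use of the W3C Nu HTML Checker's JSON interface is an assumption for the purposes of the sketch rather than the tool actually used by QA Focus:

```python
# Hedged sketch of a batch compliance audit: validate a list of pages and
# report the number of markup errors per page.
import json
from datetime import date
from urllib.parse import urlencode
from urllib.request import urlopen

PAGES = [
    "http://www.ukoln.ac.uk/qa-focus/",           # illustrative URLs only
    "http://www.ukoln.ac.uk/qa-focus/documents/",
]

def validate(url):
    query = urlencode({"doc": url, "out": "json"})
    with urlopen(f"https://validator.w3.org/nu/?{query}") as response:
        report = json.load(response)
    errors = [m for m in report.get("messages", []) if m.get("type") == "error"]
    return len(errors)

print(f"Compliance audit, {date.today()}")
for page in PAGES:
    print(f"{page}: {validate(page)} error(s)")
```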
Although the approach to QA described in this paper is meant to be developmental, it is likely that projects will, to some extent, feel obligated to deploy the methodologies described. Use of the methodology by projects which are not funded under the JISC 5/99 programme will help to establish the effectiveness of the approach and should provide valuable feedback.
A presentation on the QA Focus work was given to staff from the Centre For Digital Library Research (CDLR) based at the University of Strathclyde in April 2003 [25]. Shortly afterwards CDLR staff felt sufficiently motivated to investigate the potential of the methodology for two digital library projects: a digitisation project funded by the NOF-digitise programme which is currently under development and a regional digital library project which has been completed with no funding available for additional work.
The following conclusions were drawn:
"CDLR staff attempted to follow QA Focus guidelines retrospectively and to implement appropriate recommendations. This exercise showed that the extent of compliance with guidelines could be categorised into four areas: (1) areas of full compliance, where the project had already made decisions in accordance with QA guidelines; (2) areas in which compliance could be achieved with little extra work or with minor changes to workflow procedures; (3) areas in which QA guidelines were considered desirable but impracticable or too expensive and (4) areas where QA guidelines were not considered appropriate for the project.
The conclusion from the project managers involved was that consideration of the QA guidelines improved the value, flexibility and accessibility of the digital library deliverables, provided they were interpreted as guidelines and not rules. Rather than the QA process imposing additional constraints, the exercise validated decisions that had been made to vary from recommended standards, provided the issues had been considered and the decisions documented. What had been seen as a potentially burdensome exercise was regarded in retrospect as beneficial for the user service, for accessibility, interoperability, future flexibility and even for content management. It was felt that there are a number of areas in which simple developments to scripts or use of tools can provide a significant development to interoperability." [26].
The JISC promotes the use of open standards in its development programmes. However feedback from projects indicates that there is not necessarily a clear understanding of what is meant by open standards.
QA Focus has produced a briefing document which seeks to clarify the term 'open standards' [27]. However there is still an unresolved issue as to the role that proprietary standards have in development programmes and the processes needed to evaluate open and proprietary standards and perhaps, in certain circumstances, choose a proprietary standard rather than an open one due to issues such as resource implications, maturity of the standard, etc.
On reflection it would appear that an approach based simply on advocating the use of open standards is not necessarily desirable. It is felt that there are several factors which need to be addressed.
It is felt that use of a matrix approach when choosing the standards for use in a development programme is well suited to the developmental culture prevalent in many digital library programmes and is preferable to a strict requirement that only open standards may be used.
The approach will, of course, require documentation outlining the decisions made and justification of deviation from use of accepted open standards and best practices.
QA Focus is provided by UKOLN and the AHDS, which are located in Bath and London respectively. In order to support working by a distributed team and minimise unnecessary travel, team members make use of a number of collaborative tools, including My.Yahoo as a shared repository of resources, YahooGroups for managing the team mailing list and MSN instant messenger for real-time communications. We are also making use of a 'Blog' to provide news on QA Focus activities.
This approach appears to be working well. In order to share the experiences with other projects and to highlight potential problems (e.g. reliance on an unfunded third party) a case study has been produced [28].
Although QA Focus funding is due to finish on 31st December 2003, we will be seeking additional funding to continue our work. We feel that QA for JISC's development programmes will be an ongoing activity and, indeed, will grow in importance as "Web Service" technologies are developed which will require more rigorous compliance with standards.
We would hope to maintain the resources on the QA Focus Web site and produce new ones in appropriate areas. Additional activities we could engage in could include the deployment, development or purchase of testing tools and services. One possibility would be hosting a JISC compliance service, along the lines of the UK Government's eGIF Compliance Service [29].
As well as providing advice to projects, QA Focus will also advise JISC on approaches to future programmes. We will be well-placed to provide advice prior to the start of project work, which will help to ensure that best practices are deployed from the start. We will recommend that, in addition to providing training on project management when new programmes begin, training is provided on best practices for ensuring that project deliverables are interoperable in a broad sense. We will also advise on contractual issues, including advice on the persistency of Web sites once project funding has finished. Advice will also be provided for evaluators of project proposals to ensure that consideration is given to issues such as QA procedures as well as technical feasibility.
The paper has described the work of the QA Focus project, which supports JISC development activities by providing advice and support for projects in ensuring that project deliverables are widely accessible, interoperable and can be deployed into a service environment with the minimum of effort.
JISC is not alone in giving a higher profile to quality assurance and compliance with standards and best practices for its development programmes. Within the UK two examples of standards-based programmes should be mentioned: (1) the e-government interoperability framework (e-GIF) defines the "internet and World Wide Web standards for all government systems" [30]; (2) the New Opportunities Fund's NOF-digitise programme provides funding to digitise cultural heritage resources [11].
We will be exploring the possibilities of shared approaches to QA with these bodies. The authors welcome feedback from those involved in similar activities in the international digital library community.
Brian Kelly holds the UK Web Focus post, a JISC-funded post which provides advice and support to the UK Higher and Further Education communities on Web issues. He is also the QA Focus project manager.
Brian has been involved in Web activities since 1993, when he helped establish one of the first Web sites in the UK Higher Education community. Since then he has participated in several of the International World Wide Web Conferences as a member of the programme committee, author of several short papers and delegate.
Brian is based in UKOLN - a national centre of excellence in digital information management, based at the University of Bath.
Marieke Guy is a member of the QA Focus team which supports JISC's Information Environment programme by ensuring that funded projects comply with standards and recommendations and make use of appropriate best practices.
Marieke has previously worked as a NOF-digitise Advisor and co-ordinated technical support and advice services to the NOF national digitisation programme. Prior to this she was the editor of Exploit Interactive and Cultivate Interactive Web magazines, both funded by the European Commission and the deputy editor of Ariadne Web magazine.
Marieke is based in UKOLN - a national centre of excellence in digital information management, based at the University of Bath.
Hamish James is the Collections Manager for the Arts and Humanities Data Service and is the AHDS member of the QA Focus team.
From 1999 to 2002 Hamish was User Support and Collections Management Officer for the History Data Service, a Service Provider for the AHDS. Hamish leads the AHDS's work in developing policies and practices for digital preservation, and contributes to workshops and other advice services offered by the AHDS.
Hamish is based in the AHDS Executive, at Kings College London - the AHDS is a distributed national service that aids the discovery, creation and preservation of digital collections in the arts and humanities.