ANNEX 1: PROGRAMME
ANNEX 2: DELEGATES
ANNEX 3: GLOSSARY OF ACRONYMS AND TERMS
The Coalition for Networked Information was founded in March 1990 to promote the creation and use of networked information resources and services that advance scholarship and intellectual productivity. The Coalition is a joint project of the Association of Research Libraries (ARL), CAUSE (see below), and Educom. A task force of over 200 institutions and organisations provides the Coalition with insights, initiatives, and resources to pursue its mission. Members of the task force include higher education institutions, publishers, network service providers, computer companies, library networks and organisations, and public and state libraries. It is a truly diverse partnership of institutions and organisations with a common interest in realising the promise of networked information resources and services.
CAUSE is the association for managing and using information resources in higher education. An international non-profit association, CAUSE's mission is to enable the transformational changes occurring in higher education through the effective management and use of information resources - technology, services, and information. The CAUSE membership includes more than 1,300 campuses and other educational organisations from all regions of the United States, Canada, Mexico, and several other countries - as well as 73 corporate members. Nearly 3,700 individuals participate in CAUSE as member representatives from their institutions.
The British Library is the national library of the United Kingdom, and contains over one hundred and fifty million items representing every age of written civilisation. The British Library exists to serve scholarship, research and innovation. It is the national archive of monographs and serials received by legal deposit. It provides Reading Room and enquiry services, as well as a range of document supply services for remote users. The Library's Initiatives for Access programme is looking at how new digital and networking technologies can expand the use of its rich collections.
The British Library Research and Development Department is the main UK funding agency for research in the library and information field. It supports and disseminates the results of a wide range of projects. The Department has a number of other activities, including administering grants for cataloguing and preservation, research in the book world and special initiatives for the Department of National Heritage.
The Joint Information Systems Committee (JISC) was established on 1 April 1993 by the Higher Education Funding Councils for England, Scotland and Wales, and is now also supported by the Department of Education for Northern Ireland (DENI). The mission for JISC is "to stimulate and enable the cost effective exploitation of information systems and to provide a high quality national infrastructure for the UK higher education and research councils' communities".
The main objectives of JISC are:
The UK Office for Library and Information Networking (UKOLN) and its antecedent organisations have been based at the University of Bath for the last 18 years. It supports the UK library and information communities through research, co-ordination and awareness, and information services in the area of network information management. It has recently expanded and developed a work programme around distributed library and information systems, resource discovery and metadata, bibliographic management and public library networking. At the same time it is expanding its network information and event organisation services. UKOLN is funded by the British Library Research and Development Department and by JISC. Further information about UKOLN and its activities can be found at URL: http://ukoln.bath.ac.uk
As Sir Brian Follett reminds us in his opening presentation, the bits and bytes which constitute networked information are no respecters of international boundaries. Any issues concerning the effective use of network-based information or resources truly are best dealt with internationally; and there is no shortage of issues. This conference confirmed the high level of interest in issues to do with the production, distribution, management and preservation of digital resources, and showed a strong desire to identify, face and overcome them. Fortunately, the wish to resolve such problems is as international as the problems themselves. As Lynne Brindley pointed out in her welcoming address, the significance of the event was marked symbolically in two ways: first by the international connotations of the venue, Heathrow; and second by the integrative nature of the programme, represented in the sponsorship of the event by leading institutions: The British Library, CNI, CAUSE and JISC.
The main part of the conference consisted of 18 formal presentations (see Annex 1 for the complete programme). Four were on general topics, giving perspectives of some important developments in the UK and the complex and changing cost modelling issues which concern us all. The other presentations described a varied sample of specific projects. The projects cover a range of subjects; they are grouped under the following five headings:
The presentations, and the conference as a whole, were marked by impressively high levels of energy and participation, as Paul Evan Peters remarked. I hope that this report conveys the sense of excitement and progress which was evident during the conference.
The remainder of this report consists primarily of accounts of the conference presentations. The accounts were prepared by consultants. They are not formal papers written by the speakers, though in many cases notes, slides and other materials were provided by the speakers. In all cases, speakers were offered the opportunity to review the account of their presentation.
In keeping with its subject matter, the organisers have decided to publish this report primarily on the Internet rather than on paper. This has the benefit of allowing us to represent speakers' references by means of hyperlinks rather than by mere static footnotes on paper. As some of the references will evolve during the period in which this report is of interest, this adds value by ensuring that the most up to date material is available. Of course, there is an accompanying potential disadvantage, namely that some of the referenced material will be taken off the network or will be moved to a different URL; an apposite illustration of one of the significant issues facing users of networked information!
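As an aside on maintenance: link rot of the kind just described can be checked for mechanically. The sketch below is illustrative only - the two URLs are examples drawn from this report, not its full reference list - and simply reports whether each cited address still responds.

    # Minimal link-rot check for a report's cited URLs (illustrative sketch).
    # The URL list is an example, not the report's full set of references.
    import urllib.error
    import urllib.request

    cited_urls = [
        "http://www.nw.com/",
        "http://ukoln.bath.ac.uk/",
    ]

    for url in cited_urls:
        try:
            # Any 2xx/3xx response counts as "alive"; 4xx/5xx raise HTTPError,
            # which is a subclass of URLError and so is caught below.
            with urllib.request.urlopen(url, timeout=10) as response:
                print(f"OK   {url} ({response.status})")
        except (urllib.error.URLError, OSError) as exc:
            print(f"DEAD {url} ({exc})")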
The use of networks, and the amount of information on the Internet, is growing very rapidly; by some measures at an exponential rate [1]. There is general agreement that scholarly resources are used internationally. However, it is notable that although many projects and initiatives presented at the conference could have world-wide relevance, interest and impact, the majority are essentially national in their organisation. This underlines the importance of an international event such as this conference; it provided a valuable opportunity for practitioners from several countries to share experiences. Paul Evan Peters expressed, in the closing session, the hope that the discussions which took place might give rise to joint initiatives building on the strengths of the existing national projects; this conference was one way in which such synergy can usefully be encouraged, and others should actively be sought in order to maximise and distribute the benefits of work in this field.
Finally, I want to express thanks to the speakers who all tolerantly supplied notes, slides and other materials to help generate this report. I hope that they find their ideas are expressed accurately, and that they will forgive any omissions by abbreviation. Thanks are also due to my colleague Mike Arblaster, who assisted in the preparation of several of the accounts; to Isobel Stark, who applied the HTML tags; and to Hazel Gott of UKOLN who, as chief organiser of this conference, patiently contended with a stream of requests for details and clarifications during the preparation of this report.
[1] See, for example, URL http://www.nw.com/
This account was prepared for this report by The Marc Fresko Consultancy. It is an edited version of a paper supplied by the speaker.
This presentation starts by covering some of the background to the University Libraries Review. Next it examines some of the follow-up activities resulting from the Review, focusing especially on the Electronic Libraries Programme (eLib) and the Anderson Report. It concludes with a brief look at future hopes for the programme.
This paper is presented in the context of an international conference. Development of the Electronic Library (or the Digital Library) must be an international activity. Individuals need passports and visas to get into or out of Britain. Our goods and chattels need customs clearances. But our bits and bytes respect no formalities, going where we send them, and sometimes where we do not. Our laws and our institutions are territorial, bits and bytes are not. We must work together to ensure we can understand and use them.
Ariadne, a new electronic journal for libraries, was launched earlier this year in the UK. The original Ariadne, daughter of a king of Crete, features in a mythological tale which involves Theseus. In this tale, Theseus gratefully uses information supplied by Ariadne to find his way through the Minotaur's labyrinth, but he later is unwilling to pay the price, and abandons Ariadne. Vice-Chancellors and Funding Council members might sometimes be compared with Theseus, having often been viewed as unwilling to pay the price to maintain university libraries. The launch of Ariadne, however, signifies that they are prepared to pay their share, as explained below.
In 1992 the newly formed Higher Education Funding Councils asked the speaker to chair an enquiry into libraries in higher education. At that time universities were in a time of tremendous expansion. The number of students in UK universities increased by 57% between 1988 and 1992. This was very welcome, but it exacerbated the pressures for university libraries, for during this period, library provision increased very little. How could we support these students? At the same time, the ex-polytechnics became universities. How could they meet their research aspirations without research libraries? Meanwhile, the infamous journals price spiral was threatening research collections even in well-established universities.
The review group decided to report quickly and pragmatically, rather than going back to basics. After 12 months' work by the review group and its three sub-groups, it reported in November 1993 with what has become known as the Follett Report. It came up with 46 recommendations to the Funding Councils, covering virtually all parts of the library from space issues to the electronic future.
The four higher education funding councils accepted virtually all the recommendations, turning down only one: funds for inter-library co-operation. In all they set aside close to £100 million for implementation, and we have been very busy over the last two years. HEFCE "believes the report was a very successful document, suggesting pragmatic solutions for some of the major issues facing UK HE libraries". The speaker suggests it worked because the report not only identified an area of worrying neglect but also offered some practical solutions. The outcomes of the report are described below.
Perhaps the most visible and enduring result of the report will be the buildings programme. Across the country about £200 million was spent on 70 projects (about £150M from universities plus about £50M from the funding councils). This will produce spaces for about 250,000 readers, many IT-equipped, in 70 institutions. The first of these, in Southampton, is to open officially in March 1996.
The Review intentionally concentrated on Arts and Humanities subjects; the focus was kept narrow to ensure that the results would be valid. Many libraries contain special collections of great importance to researchers in the humanities, but which are not widely known. In this part of the programme, we are funding projects to conserve, catalogue and preserve some of these collections. We will also do work on making information on archives accessible over the network.
The total cost of this over 5 years comes to £32 million. There is a provision to review the programme after 3 years, since many important collections are still not being funded.
As mentioned above, the journals price spiral was one of the motivators for the Review. One of the options explored was for the licensing of copyright material under more favourable conditions. This would include unlimited copying on site, including for course packs, as well as electronic access.
The funding councils agreed to set up a UK-wide Pilot Site Licence initiative for three years starting in January 1996. The publishers selected for the pilot include Academic Press, Blackwell Publishers, Blackwell Science and Institute of Physics Publishing. In exchange for a central payment by the funding bodies, all the journals will be made available in both print and electronic form at a substantial discount. This initiative has a particularly democratic aspect: all universities, old and new, will have the same access.
This has been one of the most controversial proposals. The funding councils were rather sceptical. Librarians have been extremely suspicious. Publishers not involved are furious about it.
Final negotiations have proved very difficult. Undoubtedly there will be problems, and it may well not be the right way to go. But it is worth a try. Let this pilot be used to find out what the pros and cons really are.
Problems of copyright raise difficult and controversial issues.
We were extremely encouraged by the reports from the AAU/ARL working parties on intellectual property that came out last year. We hoped for a while that if we in the UK also took a similar line, we might be able to do something about the copyright paradox: universities pay to create the work, give it away, and then pay publishers again to buy it back for their libraries.
The speaker visited the AAU in May last year with two colleagues. It was clear from that visit that both sides of the Atlantic are finding these issues intractable. It is difficult for university Presidents or Vice-Chancellors to take concerted action in this area. We need a continuing dialogue with our American colleagues. I strongly doubt that the CVCP (Committee of Vice-Chancellors and Principals) can take effective action on its own, so we need some sort of forum for international action.
Given this lack of progress, the publishers still have us over a barrel. It is crucial that arrangements are made that recognise the delicate economic balance that exists between players in the HE community. If this is not recognised, universities and academics are very likely to by-pass traditional scholarly communications methods altogether.
Recently, things seem to have got worse, with the publication in the US of the White Paper on Copyright and the NII. This report appears highly oriented towards the rights holders and away from the traditional balance of copyright. We are very concerned that the proposed legislation in the US should not form a precedent for other parts of the world.
We have recently published a collection of papers on copyright, on the network as well as in print. As usual, this collection provides more questions than answers.
Most of these IT-related activities took place under the aegis of the Joint Information Systems Committee, otherwise known as JISC.
JISC's main role is to organise JANET/SuperJANET, our academic network, but it already had a substantial programme funding datasets and datacentres. An early endeavour was BIDS, which started with the national site licence for the ISI database several years ago. Next came MIDAS, hosting statistical and similar datasets, and more recently EDINA, a second site for bibliographic and other datasets.
JISC recommended that funds should be provided to establish an Arts & Humanities Data Service (AHDS). This will be a distributed service, with an executive based at King's College, London and headed by Dan Greenstein. The resources provided will be hosted by the universities where they originated, and where the expertise to maintain them lies. This corresponds to a "classical" digital libraries format, where materials remain in universities and are accessed by Internet.
We expect that a significant component of these resources will be images, which will be linked to other work relating to images in the Electronic Libraries Programme.
The Consortium of University Research Libraries (CURL, a group of a dozen or so of our larger university libraries) has for years been bringing together the bibliographic databases from its members. However, the resulting database needed a lot of work. We persuaded the funding councils to provide funds for the development of this database as a national resource, with an associated document delivery service. This work is underway at Manchester.
We wanted to know if a major national retrospective catalogue conversion programme was justified. The report we commissioned is quite convincing on the benefits of that conversion. We have about 11 million records in electronic form in our university libraries. We need to convert over 25 million more. In theory this means that more than two thirds of the material in our libraries can be discovered only by inspecting card or other hard copy catalogues.
The problem is the cost. This is estimated at £50 million, made up of £25 million from central sources plus matching funding from the universities. This is going to be difficult to fund given the funding pressures and we are not sure how to cope with this, beyond spending a small amount each year attacking key parts of the problem. One thing is certain: decisions of this nature need to be linked to a national strategy. Initiatives in this area will give shape to prioritising collections for conversion and help to emphasise collections of national significance.
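A quick back-of-envelope check of the figures above (a sketch: the record counts and cost estimates are as quoted in the text; the per-record figure is merely the implied average, not an official estimate):

    # Back-of-envelope check of the retrospective conversion figures quoted above.
    records_electronic = 11_000_000   # records already in electronic form
    records_remaining = 25_000_000    # records still requiring conversion
    total_records = records_electronic + records_remaining

    card_only_share = records_remaining / total_records
    print(f"Discoverable only via card catalogues: {card_only_share:.0%}")  # ~69%, over two thirds

    total_cost = 50_000_000           # £50M: £25M central plus £25M matching funds
    print(f"Implied average cost per record: £{total_cost / records_remaining:.2f}")  # ~£2.00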
Now we come to one of the most significant developments, the Electronic Libraries Programme, eLib (a digital libraries programme as it might be called in the United States).
eLib is managed by the Follett Implementation Group on Information Technology (FIGIT), chaired by Lynne Brindley. FIGIT is a sub-committee of the JISC. The programme is directed by Chris Rusbridge, who has chosen to base himself at Warwick, where he is supported by Kelly Russell.
FIGIT has released two calls for proposals: one in July 1994 and one in November 1995. The initial call was divided into seven programme areas, and the second call into four areas.
eLib projects are quite different from the NSF/NASA/ARPA digital library projects, which comprise six huge, integrated projects, each looking at many different aspects of the digital library. In contrast eLib, about the same size in cash terms, has funded over 50 projects to date. These are mostly small, aiming for deliverables over the next three years or so. The projects involve more than 85 different HE institutions. Overall it is a pragmatic programme, with relatively short-term projects. With this scope, eLib could represent an important step toward broad-based cultural change.
The programme areas of the first call are examined below.
Having concentrated in part on the arts and humanities, we also initiated document delivery projects, some of which are particularly relevant to science and technology.
In document delivery, we aimed first of all to test different models in a networked, distributed environment. Most document delivery in the UK is sourced from the British Library's Document Supply Centre at Boston Spa. We wanted to make more use of our own resources. LAMDA is a project using RLG's ARIEL software to undertake document delivery between universities in the London area and a group in Manchester.
We are also funding two major systems development projects for paper-based documents. One of these is dual language: English and Welsh. We have also agreed to join with Australian and New Zealand partners to commission enhancements to ARIEL. Finally in this area is the InfoBike project, which will provide document delivery from electronic-sourced documents.
In the electronic journals area, we are funding twelve projects. Two of these, CLIC and Internet Archaeology, are described in other papers in this report. Others range from support for learned society publishers moving to electronic formats, to the second SuperJournal project. This has a consortium of 20 publishers, in various disciplines. The project plans to test a selection of off-the-shelf interfaces, carrying out a series of user behaviour studies. These studies will then be evaluated to determine the way users interact with clusters of journals from these publishers.
Digitisation has proved to be a difficult area, and it has taken time to clarify and develop thinking in this area. We thought we might be able to release some space by digitising long runs of out-of-copyright material. Despite getting a good number of proposals, we have only funded two digitisation projects to date, and only one of these (from Oxford and Leeds) deals with early journals. The second deals with recent journals in the area of design. The motivation here is not space but conservation. Students in this area apparently steal not just pages or volumes, but entire runs of journals! We can only hope that they will not resort to stealing PCs or workstations instead...
These modest test projects have been funded to provide experience and insight into the technological and economic issues of digitisation. It seems clear now that the most appropriate model is a central digitisation facility (possibly in co-operation with the private sector) which would negotiate copyright clearance and provide off-site expert evaluation services. We would like to fund such a centre, though it is not yet certain that all necessary technologies are mature enough.
On Demand publishing is the practice of printing short runs of publications - sometimes extremely short runs - when they are needed, rather than the more traditional practice of printing large runs and keeping stocks awaiting demand. An examination of this area shows how conservative and book-oriented teaching currently is.
On Demand Publishing is one programme area where the major emphasis throughout has been on the teaching and learning benefits. We have funded seven projects. Most have a print-on-paper emphasis, but a few, notably the ERIMS project in management studies and the Liverpool John Moores project in the humanities, have an electronic basis. These projects are generally all having difficulties with getting publishers' rights cleared at a reasonable price. Publishers seem determined to kill off a promising market.
We have always believed that training and awareness is a vital area. We have funded six projects here, generally quite different from one another. Netskills is our main skills improvement project, based on the group at Newcastle who provided training through the Mailbase project. We also have EduLib, aiming to upgrade the educational competencies of librarians. There are three projects which are practically based, but essentially studies.
Finally there is the recently-launched Ariadne. This is a print and network-based newsletter which was initiated earlier in 1996. It aims to stimulate discussion in the library community, and should achieve that from its first issue.
It is notoriously difficult to find resources on the Internet. One approach we are taking is the creation of subject-based gateways. Another paper in this report describes the SOSIG gateway project; there are a further six gateways and one technology project in this area.
We always expected there would be useful work which was difficult to categorise. The supporting studies area brings these together. We have three projects, looking at the economics of document delivery, at cultural change, and at problems of resource discovery (refer to the paper on ROADS herein). There have also been shorter studies on existing digitisation projects, on the need for an images data service, and on technologies for copyright management.
Images have been mentioned more than once above. FIGIT received quite a number of image-based proposals in its first call but chose to wait for an images scoping study it had funded. That document is now out for consultation, and FIGIT has taken it as a framework for awarding three significant projects, covering very different areas: digital maps, medical images and photographic images of historical interest. These are very new.
The second call asked for proposals in pre-prints and grey literature, in quality assurance and in electronic reserve. These areas were addressing gaps where the initial response was not felt adequate. Pre-prints or grey literature lend themselves particularly well to an electronic environment. None of their timeliness is lost in processing or postal delays. We have agreed four projects in this area, and this is the first public announcement of support for a pre-print service in the cognitive sciences, directed by Stevan Harnad. This is an area in which the UK may take a world-wide lead.
Quality assurance projects will work to develop working models for refereeing in an electronic environment. The successful project is concerned more with streamlining peer review than with new models of quality assurance. It could prove very complementary to work in electronic journals and pre-prints. One can perhaps envisage further work on a system that moves materials directly from a pre-print via a refereeing environment into an e-journal.
The electronic reserve projects take our on-demand publishing work one stage further. There are some very interesting projects here, including some vital software for tracking access to copyright materials. There is also a project on delivering access to music and video as electronic reserve materials for dance.
Naturally, there are numerous sources of information about eLib:
A famous British writer, Terry Pratchett, has written a whole series of science fiction stories about his imaginary Discworld. This world has many extraordinary parallels with our own. In these stories, the Librarian at the Unseen University has been changed, through some unfortunate magical accident, into an orang-utan. He could be changed back, but the magical books he looks after are so dangerous that he chooses to remain as an orang-utan. It may be hard to communicate when your entire vocabulary is Ook, but I have heard of librarians in our universities who feel that very long arms and orange fur would help in dealing with their clients!
In one story (Guards! Guards!) the Librarian uses a thread, like Theseus, to find his way back from an expedition deep into his library. The density of knowledge is so great that it distorts space-time. He manages to find his way to a point a week ago, before a critical book had been stolen. After reading it he carefully replaces it and retraces his steps to the present, following the thread. Presumably some of our librarians wish they could do that, too!
In another book (Small Gods), he makes the suggestion that this distortion of space-time is so great that all Libraries everywhere are connected, in "L-space". The Librarian is thus able to rescue some books from the centre of a burning library in another city.
This idea that all Libraries are - or should be - connected, is one of the central ideas of the Anderson Report. Michael Anderson looked at the problem of support for research, and essentially decided that we absolutely have to co-operate and collaborate. We have highly competitive institutions, but we must find ways to develop some sort of national strategy for co-ordinated support for research - a strategy that includes the national libraries. This is especially important in the UK with our numerous smallish universities and libraries. This idea underpins our thinking on the national distributed collection (print and electronic). We know also that we must go far beyond connecting OPACs in providing simple means to find our information resources.
The Anderson Report also raised the issue of preservation of digital information. We are all familiar with the fate of the fabled Alexandria Library, and probably also the medieval monastic library in Umberto Eco's book The Name of the Rose (this library was also a labyrinth, this time to protect the monks from the knowledge in some books). It was not clear until a few years ago that digital information might be similarly vulnerable. Now we realise that the move to digital information - an inexorable move, it seems - brings new and unique problems in preservation. In the US, the Commission for Preservation and Access has been considering this for some time. They wrote an excellent draft report with the RLG [1]. We have now taken our first steps to play our part in resolving these difficulties, with a workshop at Warwick last year [2]. Unless we continue to support this work vigorously, these preservation problems will inhibit us from taking advantage of the new methods of scholarly communication.
So where do we go from here? We have a significant programme of projects under way. The projects are sometimes described as a set of experiments. We now have to start looking for ways to make these experiments practical realities in our libraries. We must look wider than the UK, and try to integrate our work with overseas projects - whatever we can find that is useful.
So the first plank of our future strategy is to select some projects, then to embark on scaling up, integration and implementation, with wide dissemination and real efforts at cultural change. We should aim to get the results implemented across many or all libraries. In order to maintain the flow of funding, we shall have to "deliver results".
The second plank is to build on the ideas from the Anderson report. We must try to increase library collaboration. We want to be able to find documents wherever they are.
The third and final plank of the strategy is to try to make some real progress - backed by real money - in the emerging area of digital preservation.
We have not got all the funds we need to implement this strategy.
As a result of the Libraries Review, we have some very substantial programmes funded. This may not be everything we could wish for, but there is plenty to get on with. The eLib programme is the key development. It will have a major effect on cultural change in universities. We need to go further with this if we can get the funding to do so.
The organisation of HE in the UK means we can have a national strategy. Electronic library developments will only happen in this way.
We want to see eLib and other national programmes such as TLTP, etc. developing an holistic approach to teaching and learning. At the same time, we have to try to collaborate more in support of research.
It is important to keep relationships with the USA strong - particular areas include digital preservation, licensing and copyright.
We could view our world as a multi-dimensional labyrinth. In different ways as funders, Vice-Chancellors, Librarians and as users, we are all faced with a different maze of difficult choices. How do we find our way? We need whatever guides we can find. Let us hope we treat our Ariadne better than Theseus did.
[1] Draft report available at URL http://www-rlg.stanford.edu/ArchTF/ and by FTP from lyra.stanford.edu/pub/ArchTF/
[2] Report available at URL http://ukoln.bath.ac.uk/fresko/
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation and notes supplied by the speaker.
For many years the main support agency for Information Technology applications in library and information services, The British Library Research and Development Department, has had a long-term interest in networking. The founding of UKOLN, by the addition of networking to the existing bibliographic management research centre at Bath, set up a powerful facility for awareness, advice, research and standards. UKOLN initially concentrated on the academic sector, but, with encouragement from The British Library, has since extended its operations to all types of library. The Research and Development Department itself has undertaken a number of projects in non-academic libraries and its plans for the future give a high priority to this area.
The British Library is the main United Kingdom funding agency for research in the Library and Information field. Even so, funds are limited - the 1995/6 budget being only £1.6 million. They are also spread over a wide range of topics, such as information policy, user studies, education, training and awareness, and across all communities, not just the academic. In addition, technology is applicable to information handling in all information services, not just libraries. The British Library has had to live with dwindling resources in real terms (the equivalent to the 1979 level of funding would be £3.6 million today) and the trend looks set to continue. Much tighter focusing in the future is therefore inevitable.
In view of the funding position, collaborations and partnerships with other funding sources are always very welcome. The Library has also become involved in functions which are tantamount to management of research and development funds for other bodies - in particular, various public library development schemes for the Department of National Heritage.
The Follett initiative to put resources into university library research and development is very much appreciated, especially at a time when the smaller scale projects of the type generally supported by The British Library in this area were no longer sufficient to address needs.
Nevertheless a good number of library research projects have been supported which, in retrospect, can be seen as precursors to the present eLib programme. Some of these are described below.
BLEND, the Birmingham and Loughborough Electronic Networking Development, was undertaken in the first half of the 1980s and constituted a very early attempt to set up an electronic journal. Although the technology was crude by today's standards, many new ideas were tested. The basis of the project was an electronic conferencing system, which set up a community linked together by communications (with all of the attributes of a present-day networked community), including e-mail, and the journal arose as a by-product.
University College London and the University of Hertfordshire joined them in a subsequent project, the former bringing expertise in integrating text, sound and graphics in electronic documents, the latter bringing extensive research on optical media. The quartet of universities combined to investigate electronic document handling, with a view to developing practical applications from the research. One of these was a researcher's workstation, but in the event this was superseded by external developments resulting from the rapid progress of technology, currently rendering products obsolescent every two years or so.
More recently the University of Loughborough, with Institute of Physics Publishing, has delivered an electronic physics journal, and University College London has networked the American Chemical Society Journals.
De Montfort University set up an electronic library on the Milton Keynes campus, with support from The British Library, eLib and the European Commission. It also collaborated with the Nara Institute in Japan and with NACSIS, which provides an academic network similar to JANET.
A call for proposals in 1987 led to a range of research projects investigating retrieval from image databases.
Many information technology projects have a networking aspect, and by 1989 its importance had been recognised by university libraries and by The British Library. At that time a considerable amount of early work concerned with applications, surveys, awareness and publicity was undertaken.
In 1992, UKOLN, the United Kingdom Office for Library and Information Networking, was fully established at the University of Bath, with half of its funding from The British Library Research and Development Department and half from JISC, the Joint Information Systems Committee. It is now a major vehicle for dissemination of information on electronic libraries, supporting the Follett Lectures, seminars, conferences and publishing. In particular, UKOLN has demonstrated networking power by implementing and offering a wide range of Web services.
ARIADNE, the Internet magazine for librarians and information specialists, is the latest outcome of collaboration with eLib and in January of this year it attracted over one hundred thousand accesses.
The Conference on Long Term Preservation of Electronic Materials, held at Warwick University in November 1995, engendered various possible study areas, including:
Sectors outside the university environment - where research is a main raison d'être - have been less fortunate, with no Follett grants and consequently less awareness of information technology and networking matters, and a lower level of research in general. The British Library Research and Development Department is therefore concentrating efforts on the Public Library and other relevant non-academic sectors.
After long neglect, Public Libraries have a window of opportunity which must not be missed. Initiatives such as EARL, Electronic Access to Resources in Libraries, and Millennium funding must be promoted. A strong movement is developing to "empower the people" through public access to information superhighways. Political statements, such as the following issued recently by the Labour Party are very encouraging:
"Access: We wish to ensure that participation in the information revolution is available to all and not just to the privileged few. There must be equality of access through an integrated national network which covers all parts of the country, reaching as extensively and as affordably as possible, in which each network system links with others. We seek to empower individual citizens as participants and consumers and also to ensure equal access to the providers of services."
Currently, rather than Public Libraries being the focus of Governmental networking efforts, priority is being given to wiring up schools and hospitals. Thus, to support the Secretary of State for National Heritage's statement concerning uptake of the Internet in Public Libraries, UKOLN commissioned a rapid review of this subject, using telephone survey techniques to contact all Public Libraries. The results, with a 100% response, revealed that:
The survey shows a low take-up and a need for investment. Leading-edge projects (for example CLIP, the Croydon Library Internet Project, and IT-Point, the Solihull Library Access Project) and consortia such as EARL cannot by themselves transform Public Libraries. They can, however, point the way to larger scale endeavours, such as the Library Association's Millennium bid, and influence political strategies backed by hard cash - be it Government financing, Local Authority sources or public/private sector partnerships.
The British Library Research and Development Department has a new Director, who takes up his appointment at a time of immense change.
The Secretary of State has appointed a new Library and Information Commission to co-ordinate information issues between sectors, to review the United Kingdom's international role, especially in regard to the European Union, and, in particular, to develop research policy in Library and Information Systems. The new Director will be involved with the Research Sub-Committee of that Commission.
Meanwhile, the cuts in funding are expected to continue, and neither The British Library nor its Research and Development Department will be exempt. Accordingly, The British Library is re-structuring and the Research arm will follow suit. There will inevitably be a re-focus of effort, with perhaps less directed at higher education, though liaison with that area will remain of paramount importance (universities are still the logical contacts for high technology). Interest will grow in private sector input into research, probably via partnerships and joint funding.
It should be emphasised in this context that it will be the hand of friendship that The British Library Research and Development Department will be extending and not the begging bowl. Research will continue at The British Library.
For further information, contact the speaker at terry.cannon@bl.uk
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation and slides used.
As one of the world's great research libraries, the British Library has served scholarship, research and innovation for over four hundred and fifty years. Now, tremendous changes are taking place at a rapid, and increasing, rate. Some of these transformations arise from the demands of users who are themselves influenced by new phenomena, such as the Internet; some are due to the sweeping changes affecting equipment, particularly computers and communications. The British Library is preparing for this new environment, not only by initiating programmes to exploit information technology, but by forging new partnerships to meet the challenges ahead.
As well as being the national library of the United Kingdom, the British Library is one of the world's great research libraries. With an annual budget of £110 million, it employs a staff of 2,300 people and houses over one hundred and fifty million items.
Its main purpose is to serve scholarship, research and innovation. In this respect the Library is not primarily providing resources for undergraduate teaching, but acts rather as a research material provider. Its strategic objectives are exemplified in the following quotations from the Library's Strategic Objectives for the Year 2000 report:
Computer literacy is becoming more and more commonplace, from the schoolroom to the workplace (a library example of how much this aspect has changed is in the OPAC, the On-line Public Access Catalogue, which was originally feared to be too complex for the public user and is now considered to be too simple). This process has been encouraged by the exponentially increasing use of the Internet.
In its own services and collections, The British Library is sure that digital materials will not replace all traditional library materials and that people will continue to want to use traditional materials. In parallel, there is an inexorable rise in the demand for digital materials.
There is also a definite trend for users to want desktop access, whenever they want it and wherever they are.
Information Technology (IT) is polarising in respect to its equipment and activities. Convergence on a global scale is manifest in computers and communications, media and publishing (where many organisations are merging) and education and entertainment. The same technology trends are catalysing great individuality, creativity and variety.
Access to electronically held information is distance-independent and in the near future the cost of using powerful communications over long distances will be negligible. At present, one hour of such use would cost about five cents. Although this might be perceived as relatively low and although it will undoubtedly fall, the level of charge is currently some two orders of magnitude above cost - of the order of a few dollars per hour - and significant changes are perhaps unlikely over the next five years. Nevertheless communications is now viewed as a global commodity, though the infrastructure is not yet in place in many parts of the world.
Software tools are becoming increasingly inexpensive, easy-to-use, powerful and sophisticated. This, combined with the development, price reductions and availability of hardware advances, such as massively parallel processing, will allow desktop computers to interact very quickly and a great deal more naturalistically.
The British Library is preparing for the new environment, where access will be a primary factor. It has initiated the Initiatives for Access programme of pilots and demonstrators to exploit IT, test new services, examine organisational implications and provide a vehicle for collaboration and Public Relations - including PR on exactly what a great Research and Development library is capable of.
The change of technology has been viewed by some as a change of purpose, but this is not so. That remains the same as it has been for the last four hundred and fifty years.
The vision of the British Library is of integrated access to its digital collections and those of other organisations. Its collections will, therefore, be organised and indexed for such access, and the Library will work towards widening access, in terms of both people and materials, while maintaining the availability of digital archives.
Other aspects of that vision will include staff having "digital competencies" (all staff will have need of these, not just a few "digital librarians"), establishing a balance between the requirements of Intellectual Property Rights and fair dealing with information and the commitment of substantial investment, from both the Library and its partners, be they government or private sector.
The priorities will be:
Collaboration has always been part of library provision and this has been encouraged by lack of resources, especially on a national level. The global digital library clearly requires co-operation, not least on standards, protocols and on services based on access to digital collections. Different kinds of partnerships will encompass agreements on standards, collaboration with other digital collection owners and joint ventures with the private sector. Different partners could be derived from the academic community, from industry and commerce and from public libraries. Partnership will provide opportunities for developing improved, more comprehensive services and a means of sharing development costs. In addition to nationally-based partnerships, the British Library will be seeking to extend collaboration with relevant organisations in both the European Union and the United States.
It will also seek to utilise the United Kingdom government's Private Finance Initiative, acknowledging its key points (transfer of risk to the private sector, value for money for the taxpayer and open competition in the selection of partners) in respect to its three main models (financing public investment, private initiative and joint venture).
Overall, the great changes now taking place world-wide and at an accelerating rate will transform the nature of all libraries and the British Library will not be exempt from that process - far from it. That is the challenge over the next few years, a challenge for which the British Library is preparing and which it is looking forward to meeting.
Information is available from john.mahoney@bl.uk and at URL: http://portico.bl.uk/access/
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation and slides used.
The costs of scholarly information are based on long experience with a print-based creation and distribution system. The global Internet, however, is already changing the economics of the information distribution system. Understanding networked information and its potential effect on universities' costs for acquiring, storing and delivering information is essential in today's information and technology-oriented world. New approaches to allocating institutional funds to recognise networked information are required if scholars and students are to benefit from new information technologies. Examining the functions of the scholarly information distribution process will indicate where changes in role and investment are required by universities if networked information is to be used successfully.
We can describe the knowledge infrastructure in terms of collection(s) of materials, methods of access, means of storage, channels of distribution and technologies for printing and display. In functional terms, it can be considered as equivalent to a traditional library. By contrast, however, the Knowledge Infrastructure is radically different from conventional libraries in procedural and conceptual terms. In effect, we have two completely different models of information management co-existing in parallel.
For any meaningful analysis of the Knowledge Infrastructure, we shall have to assume the existence of the Internet (or something like it). We will also find that the Knowledge Infrastructure demands close collaboration between traditionally separate constituencies such as campus organisations, technology groups and librarians.
The US has seen a major transition in management of the network over the last 18 months or so. The change has involved the emergence of the Internet as an entity in its own right. The models for its overall management and technical approach are being changed as it moves away from Federal Government control (under the NSF) towards a private market approach. Some academics feel some sense of loss as this privatisation proceeds; but the full financial and technological impacts of this evolutionary change - and of the rapid growth which triggered it - are still far from complete.
Today, the Knowledge Infrastructure is a priority for the current administration and Congress. The national technological infrastructure effectively extends down to the workstation level, but policy fails to recognise the demands of integrated functionality. The level of regulation is being debated; some want a competitive environment, though a regulated environment has real advantages. The Telecommunications Reform Act - a major piece of legislation - is an attempt to reconcile some of these conflicting demands; but competition will be central to the policy.
The current network operates on data packets, and is moving towards a market-managed technological environment. There are regulated local voice network services and also regulated local video distribution services. The trend is towards integration of these in the near future.
This change and debate inescapably raises the question of what the higher education community's interest should be in the evolving national networking infrastructure.
Any model of scholarly communications must extend from the creation of information to its use, and must take in all steps in between. This can be done by expressing a model in terms of functions and performance attributes.
The functions can be summarised in the following list:
Note that a digital Knowledge Infrastructure has to provide equivalents to all of these functions if it is to be successful. Along with these key functions, we have to consider the following performance attributes:
Analysis with this model suggests that new roles and changed institutional behaviour will be required. It is also clear that there is a desperate need for investment in the technological infrastructure.
The success and survival of the Knowledge Infrastructure must be examined in the light of this model. We can, for example, analyse electronic journals, comparing how they perform along the entire chain from creator to user with the conventional paper scheme. One of our challenges is to make sure that all functions are performed well.
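One way to picture such an analysis is as a simple scoring matrix of models against functions. The sketch below is purely illustrative: the function names and scores are hypothetical stand-ins, not the speaker's actual model.

    # Illustrative scoring of distribution models against functions of the
    # scholarly communication chain. Functions and scores are hypothetical.
    functions = ["creation", "quality control", "distribution", "discovery", "archiving"]

    scores = {
        "print journal":      {"creation": 4, "quality control": 5, "distribution": 3,
                               "discovery": 2, "archiving": 5},
        "electronic journal": {"creation": 4, "quality control": 4, "distribution": 5,
                               "discovery": 4, "archiving": 2},
    }

    for model, per_function in scores.items():
        weak = [f for f in functions if per_function[f] <= 2]
        print(f"{model}: total {sum(per_function.values())}, weak functions: {weak or 'none'}")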
The marketplace for scholarly information is evolving rapidly.
The concept of intellectual property rights (IPR) is central; and the issues it raises are controversial. Fundamentally, we can identify three kinds of information:
Note that the paper model combines these three kinds of information successfully. There are two ways in which information is used, namely:
Fierce debate rages around what usage should be compensated and what should be uncompensated. The mechanisms for dealing with IPR for electronic materials are not yet fully developed, and where international transactions are concerned the level of development is even lower. The concept of "fair use" is creating a controversial dynamic; if anything, there is a trend towards a more conservative, more restrictive interpretation of fair use.
While issues and mechanisms are immature, we can be confident of one thing: there is no free ride in this electronic environment; we will get what we pay for!
Librarians have long been great sharers of information. Possibly because of this, perceptions about resource sharing are based on concepts allied to the paper model, and these perceptions are changing only slowly. Accordingly there is a tendency towards the premise that any savings will result from an extension of the inter-library loan model, and equally a tendency to view the acquisitions budget as likely to show savings. In fact, however, due to the continued strength of private property rights in scholarly literature, library acquisitions budgets are likely to increase rather than decrease.
To use the language of classical economics, the marketplace for journals is imperfect. The imperfection is due to the fact that there are no substitutable goods for many journals, as evidenced by the fact that librarians tend to reduce the number of subscriptions rather than subscribing to alternative titles when faced with price rises which exceed budget increases. The last few years have seen increases of 10%-15% every year in the cost of journals. We can guess that such increases cannot continue indefinitely; something is bound to change.
Electronic media allow for a diverse range of pricing methods and policies. The question of how we will pay for information is now considerably more significant than it was with paper publications. The pricing schemes include:
There may be other variables too; for instance, costs for the same information may vary according to how new or old the information is when it is used.
It is important for us to recognise that the different approaches can lead to considerably different total costs. We can have some influence on costs, at least in some cases, by co-operating with other users to reap economies of scale. And it is clear that the general trend is towards licensing rather than acquisition. However, other issues are less clear cut: for example, if a licence payment is arranged, what happens when the licence expires? What happens if users are by that time "hooked" on using the information? Libraries should beware of low initial costs for licences.
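To see how sharply the choice of pricing model can affect total cost, consider a toy comparison of a flat site licence against per-use charging (all figures below are invented for illustration; real licence terms vary widely):

    # Toy comparison of a flat site licence against per-use charging.
    # All prices are invented for illustration; real terms vary widely.
    site_licence_per_year = 5_000.0   # hypothetical flat annual fee
    price_per_article = 10.0          # hypothetical pay-per-view charge

    break_even = site_licence_per_year / price_per_article
    print(f"The licence pays for itself beyond {break_even:.0f} article uses per year")

    for annual_uses in (200, 500, 1_000):
        per_use_total = annual_uses * price_per_article
        cheaper = "site licence" if per_use_total > site_licence_per_year else "per-use"
        print(f"{annual_uses:>5} uses: per-use total {per_use_total:>7,.0f} -> cheaper option: {cheaper}")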
Electronic information requires less storage space than information on paper (though realistically, at best we can only hope that it will reduce the rate of growth of library storage space). Furthermore, the cost of storing information electronically is less than the cost of storing the equivalent on paper (at least in the short term). So electronic materials can lead to savings in costs for storage, access and circulation of scientific and technical materials. We should look to these cost centres to provide additional funds for content acquisition.
In the US, the concept of needing to pay for archiving and preservation is still relatively young (because the history of the US is itself relatively short). But archiving and preservation must be recognised as being of paramount importance, and funded accordingly.
Two conceptual changes will be enabled by emerging electronic technologies:
Most so-called "electronic" strategies today rely on the use of paper at some level. A good example is the SPIRES High Energy Physics pre-prints initiative, which is operating successfully on a large scale. Although it allows the production, circulation and storage of papers in electronic form, the authors still demand that the papers be published in paper form. The real savings of these technologies will only start to be realised when there is no paper stage.
All of these factors will lead us to revise our thinking about the relative importance and sizes of traditional library cost centres. Making investment decisions for electronic information rights in a networked world is complex.
Content costs will almost inevitably increase, and there is little that librarians alone can do about it. We therefore will need to find ways to accommodate the increased costs of information content. Possibilities may include:
Finally, the cost structures may affect collection policies, as there will be pressures to let these policies be dictated by usage more than before.
The above has outlined several of the issues and scenarios which will shape the future. Just when new electronic models will rule is not known; certainly we do not expect any single model to dominate the scene for the next twenty years. Over that period the differing models will exist in parallel. This will lead to "unnecessary" duplication of some activities - and hence higher costs. Consequently we will need funding processes to support all the models at once, and recognition of this increasingly expensive and complex environment is an essential element of any new strategy.
To close, there are some actions we can take to advance the situation positively. The actions are:
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation, slides used, and text adapted from a WWW presentation on CLIC which is referred to at the end of this account.
Molecular science is both a very visual and "three dimensional" subject, and one that is very rich in precise semantic content and standard definitions. The CLIC electronic journal project has as its objectives the parallel printed and electronic publication of a flagship chemistry journal, that will use some very recent publishing technologies to deliver to the reader the three dimensional visual element along with textual information.
Chemistry is one of the most visual and "three dimensional" of sciences. For many generations, communication of the subject has been rooted in the printed pages of chemical journals, with even colour a rare event. Partially because of such limitations, the subject has evolved a complex and arcane symbolism for its written representation. The complexities of this "chemical nomenclature" in turn result in substantial risk of the propagation of errors and misinterpretation of results. A refereeing system exists to catch both errors of science and transcription errors, but the reality is that referees have few "tools" to assist them to catch errors on the printed page other than their own eyes and minds.
For the first time, electronic tools to allow the cost-effective dissemination of three-dimensional information are now at hand. We can envisage distributing electronic documents which represent three-dimensional objects (molecules), and which allow readers to "manipulate" them to examine the objects from all angles. Electronic publications also allow other features, such as linking and access statistics. This combination of features has created the opportunity which the CLIC project is exploring.
The main objective of this eLib project is to develop parallel electronic and printed forms of the established journal Chemical Communications. An electronic version will provide such information to the reader, with what might be called "semantic integrity" and accuracy of the information. We even envisage providing mechanisms for readers to comment on the individual articles, and thus to interact with the original authors. To this extent, this aim differs from some other electronic journals, where the paramount objective is to achieve what is called "page integrity" with the original printed version. Whilst semantic and page integrity are not necessarily exclusive, to achieve both requires significant extra effort in storing the basic content of the journal, and its presentation to the user. Thus the CLIC project will concentrate on developing standards for storing, transmitting, displaying and applying molecular information.
This is being achieved in three stages:
From March 1997, a number of complete issues of the journal will be available electronically, with some enhancement.
Not the least task is educating the audience to actively participate in this method of information retrieval, and indeed persuading authors to contribute information in the appropriate form in the first place. The CLIC project thus aims to increase awareness in the chemical community of the possibilities and advantages of electronic publications. This is being achieved by:
The project team is particularly pleased to note that a number of chemistry software vendors are now producing freely distributable software for use with the CLIC journal. In general, this is in the form of "cut-down" versions of commercial products, made available for network use by the vendors. One product specifically designed for such an e-journal has recently been announced (Chemscape Chime from MDLI). Another notable success is the popularity of the e-conference: over 15,000 different people have connected to the conference in the last ten months (note that this is comparable to the number of attendees at a major ACS conference, but at a fraction of the cost!)
For more information on these activities, refer to the URL at the end of this account.
The project also has objectives which are specific to the nature of the discipline of chemistry.
Primarily, it seeks to achieve "future-proof" electronic delivery mechanisms. This is problematic, as many necessary standards simply do not exist yet. One approach being investigated is the use of SGML to HTML conversion with chemical DTDs. Another hopeful prospect lies in the results of the Hyper-G project, namely its distributed servers and index engines. We are also monitoring the progress of the PURL (Persistent URL) initiative.
Another domain-specific requirement is the preservation of chemical semantics. The team is pursuing a number of alternatives including chemical MIME (for the multimedia delivery of molecular content), virtual reality techniques (using VRML), Java and CML (Chemical Markup Language).
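As a rough sketch of what chemical MIME involves in practice, the Python fragment below registers chemical media types and shows how a client could dispatch molecular files to helper applications. The file names and viewer mapping are invented for the example; the chemical/x-* type names follow the pattern used in the chemical MIME proposals.

    import mimetypes

    # Register chemical MIME types of the kind proposed by the chemical
    # MIME initiative (the exact registry is theirs; this is illustrative).
    mimetypes.add_type("chemical/x-pdb", ".pdb")   # protein structure files
    mimetypes.add_type("chemical/x-mol", ".mol")   # molecule connection tables

    # A browser would dispatch each type to a configured helper application.
    viewers = {"chemical/x-pdb": "rasmol", "chemical/x-mol": "molviewer"}

    for name in ("caffeine.pdb", "benzene.mol"):
        mtype, _ = mimetypes.guess_type(name)
        print(name, "->", mtype, "-> opened with", viewers.get(mtype, "save to disk"))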
The CLIC consortium comprises groups in three university chemistry departments (Imperial College, Leeds and Cambridge Universities) and a learned society (The Royal Society of Chemistry).
At present, the team approaches authors to request electronic copies of their papers; they then convert them and apply the electronic enhancements. The process of preparing and publishing electronic papers is faster than paper publishing, but at the moment both electronic and paper issues are published at about the same time.
One part of the publication process which is easier and faster is refereeing. In one case, a paper was submitted and refereed electronically within eight hours! We are considering "commentable" or "discussable" papers too, but this raises questions about the nature of moderation; the best answers are not yet clear.
The electronic medium is ideally suited to the gathering of access statistics; this can act as a valuable form of peer reviewing.
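A minimal sketch of how such access statistics might be gathered from a web server log follows; the NCSA common log format and the /clic/papers/ path layout are assumptions made for illustration, not the project's actual configuration.

    import re
    from collections import Counter

    # Tally accesses per article from a web server log in NCSA common format.
    # The path layout (/clic/papers/<id>.html) is assumed for illustration.
    request = re.compile(r'"GET (/clic/papers/[^" ]+) HTTP')

    counts = Counter()
    with open("access_log") as log:
        for line in log:
            match = request.search(line)
            if match:
                counts[match.group(1)] += 1

    # The most-read papers: a crude but immediate indicator of interest.
    for path, hits in counts.most_common(10):
        print(f"{hits:6d}  {path}")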
This presentation, plus more background information including demonstration of some of the special
features such as viewing three dimensional molecules, is available at URL:
http://www.ch.ic.ac.uk/clic/talk_1.html
Project contacts are:
Leeds University: Ben Whitaker benw@chem.leeds.ac.uk and Chris Hildyard chrish@chem.leeds.ac.uk
Imperial College of Science, Technology and Medicine: Henry Rzepa (Project Director) rzepa@ic.ac.uk and Omer Casher hoc@ic.ac.uk
Cambridge University: Jonathan Goodman jmg11@cus.cam.ac.uk and Dave Riddick dar25@cus.cam.ac.uk
CML: Peter Murray-Rust p.murray-rust@mail.cryst.bbk.ac.uk
Cambridge Site of the Royal Society of Chemistry: David James (Project Manager) jamesd@rsc.org
INTERNET ARCHAEOLOGY: OVERCOMING THE OBSTACLES AND USING THE OPPORTUNITIES
SEAMUS ROSS, Assistant Secretary (Information Technology), The British Academy
This account was drafted for this report by The Marc Fresko Consultancy.
It is based on notes taken during the presentation and slides used.
ABSTRACT
Archaeology is a particularly appropriate subject to promote the use of electronic media. Much archaeological
work is by its nature destructive; archaeologists therefore need to preserve access to primary data in order to
repeat and test conclusions. Traditional publishing methods have not provided the functionality that archaeologists
require to manipulate the data types involved. The Internet Archaeology project aims to establish both a new
definitive electronic publication and a model for subsequent developments. Key issues are touched on, with
proposed approaches.
THE PROJECT
It is clear that electronic publications can be more flexible and more effective than paper publications
could ever be. One brief example illustrates this: during the 1920s, an archaeologist painstakingly
reconstructed the design of a theatre which had been excavated. His reconstruction was widely accepted
until it was re-examined during this decade using solid modelling tools. Because of the facilities offered by
solid modelling, this re-examination proved that the earlier work was not completely feasible, and ideas about
the form of the theatre were revised. This probably would not have happened without the use of powerful
computer-assisted techniques.
Objectives
Accordingly, the main objective of this project is to develop a fully electronic, regular, online only,
refereed journal. Internet Archaeology aims to become one of the world's archaeological journals of record,
by publishing refereed papers of high academic standing which also use the potential of electronic publication
to the full.
Subsidiary objectives are to:
The Consortium
The project consortium is led by the University of York. The other members are:
The project is constituted as a charitable trust, with a formal management structure including a steering committee,
an editorial board and a technical panel. The chair of the committee and Honorary Editor is Professor Barry Cunliffe,
from Oxford.
WHY ARCHAEOLOGY?
Archaeology is well suited to the application of advanced electronic techniques. It is a multidisciplinary subject, which calls on many different skills and methods of analysis. One instance of this is its use of many different data types - text, images, numerical data, GIS modelling etc. Practitioners already make use of electronic tools, and a body of experience in multimedia publication is growing.
WHY AN ELECTRONIC JOURNAL?
An electronic journal will be an ideal vehicle to convey archaeological information. It will provide new tools to
allow archaeologists to say things about the past in ways which were not previously possible. It will allow access
to primary research data, enriching it with additional functionality so that readers can manipulate this data,
allowing readers to make use of, and do justice to, the rich diversity of information.
Finally, an electronic journal has some special logistical benefits which make it well suited to the dissemination
of academic works. Its network orientation makes distribution both easy and inexpensive; and the distribution can
include unusual or bespoke programs which add value to the data.
Unfortunately, some drawbacks accompany these benefits. In particular, the costs of preparing papers have been
higher than anticipated, partly because of the many different data types with which archaeologists work. The team
has also found that the complexities and difficulties of running an electronic journal are greater than expected,
but remains convinced that it offers a valuable method of publication for archaeological work.
CONTENTS
The journal will feature:
ISSUES
Revenue Generation
The annual running cost of Internet Archaeology is projected to be about £60,000. Clearly, an equivalent
revenue stream is needed. This may arise from diverse sources including subscriptions and access charges.
Various models are being considered; they include personal subscriptions, multiple access subscriptions and
so on. One option will be to offer personal subscriptions which allow greater functionality than site licences.
The final model is not certain; but it has been agreed that the first issue will be free for a period of one year.
We also intend that keyword searching and contents information will remain free of charge.
Intellectual Property
As in any publication, protection of intellectual property is a concern. In this domain, there is the added
complication that some of the data is commercially valuable (eg to companies which offer services connected
with environmental archaeological impact assessments).
Refereeing
Refereeing is essential for quality control and academic credibility. It has proved difficult because not
all archaeologists are sufficiently familiar with the relevant multimedia formats. A two-stage refereeing process
is being developed. The stages are:
In practice, this turns out to take place over the network as an iterative process.
Licensing
Careful construction of the reader licences is essential so that the right balance is struck between
usability and fair use on one hand and excessive sharing on the other. Rules for citations are envisaged.
Long Term Preservation
Somehow, the long term survival of the journal contents must be assured. How this will happen is as
yet unclear. Evidently, standards will have a part to play here; and the Arts and Humanities Data
Service may also be involved. This is a complex issue which is not yet resolved.
FURTHER INFORMATION
More information about the project and a sample electronic paper which displays many of the unique
features of the electronic medium are available at URL:
http://intarch.york.ac.uk
Members of the project team include:
Dr Seamus Ross, The British Academy seamusr@britac.ac.uk
Dr Julian Richards, University of York jdr1@york.ac.uk
Dr Michael Heyworth, Council for British Archaeology m.heyworth@bbcnc.org.uk
The editor of Internet Archaeology is Dr Alan Vince, University of York editor@intarch.york.ac.uk
FROM EPHEMERAL TO INTEGRAL: COLLABORATIVE MANAGEMENT OF ELECTRONIC JOURNALS
BARBARA McFADDEN ALLEN, Director, CIC Center for Library Initiatives
This account was prepared for this report by The Marc Fresko Consultancy.
It is based on an edited version of a paper supplied by the speaker.
ABSTRACT
The members of the CIC (Committee on Institutional Co-operation) are presently engaged in the collaborative
development of the largest fully managed collection of electronic journals available on the Internet. The project
directly addresses the growing need to develop, test, and implement networked information tools and resources
through collaborative, multi-institutional efforts. This paper outlines the development of this resource,
and concludes with a discussion about the future environment which will be necessary if projects such as this
are to become commonplace throughout the higher education community. To place the project in context,
descriptions of the Committee on Institutional Co-operation (CIC), the CIC universities, the CIC Center for
Library Initiatives, and CICNet, Inc., are given.
THE CIC UNIVERSITIES
As noted above, "CIC" stands for the Committee on Institutional Co-operation, a 35 year-old collaboration
among the following universities:
Today, there are over 75 separate and unique co-operative activities operating under the aegis of the CIC.
Collectively the CIC universities account for more than 17% of the doctorates awarded annually, more than
$2.5 billion in externally funded research annually, and over 17% of the holdings of the Association of Research
Libraries. They represent an aggregate total of over 500,000 students, 33,000 faculty, and 57 million volumes
within their libraries.
Like all of the nation's major institutions of higher education, the CIC universities depend on the availability
of reliable, high quality resources of all kinds, ranging from those available through their libraries and
faculties to the most advanced technologies in their laboratories, computing environments, and related teaching
and research facilities. What is unique about the CIC universities, however, are the many initiatives within
and among them which depend on a reliable and advanced networked infrastructure and on staff, facility, and
financial investments focused on true programmatic co-operation. Collaborative initiatives have evolved which
require stable inter-institutional technical standards and support mechanisms and, increasingly, the availability
of shared, reliable information resources and services. Indeed, there are now some initiatives which might not be
possible without such infrastructures.
THE COMMITTEE ON INSTITUTIONAL CO-OPERATION
The Committee on Institutional Co-operation is made up of the Provosts of the above universities; it meets
four times annually. The office of the CIC is located at Champaign/Urbana, Illinois. It is professionally
staffed, with a director and nine FTEs. Its current operating budget is slightly over $4 million. To understand
the context of this paper, it is important to recognise that the philosophical concept of the CIC enjoys core
financial backing. Additional funding, from member dues, research grants, and the usual host of other sources,
exists for most of the initiatives which have evolved under the umbrella of the office of the CIC.
The CIC has always been governed by its three founding principles:
These principles are critical to the successes of our programs. Among academic consortia, the
nature of the CIC institutions' collaboration is outstanding: individually these are some of the
greatest institutions of higher learning in this - or any - country. They are driven by different missions,
governed by separate boards and obtain their funding from a variety of separate sources. They are all
clearly autonomous organisations with no central funding or administrative body. Further, each of these
universities has unparalleled academic and research programs in a variety of fields; they sometimes compete
fiercely for funding, students and stature. Yet, co-operatively, they have been able to become a formidable
force in higher education.
CIC OFFICES: CO-ORDINATION AND LEADERSHIP
Two offices which have evolved under the umbrella of the CIC are the CIC Center for Library Initiatives
and CICNet, Incorporated.
CICNet, the regional TCP/IP network founded by the CIC in 1988, serves the internetworking needs of the
CIC universities, other academic institutions, not-for-profit organisations, and businesses. CICNet has had a
strong interest in the design and deployment of networked information services. CICNet is currently involved
in three major National Science Foundation-funded projects to bring Internet access to under-served communities,
namely:
These projects will enable constituencies that can most benefit from Internet access for communication to
take advantage of it. Funding for CICNet is primarily from three sources: member dues from the CIC universities,
specific projects funded by the National Science Foundation, and the sale of Internet connectivity and services to
for-profit and not-for-profit sectors.
The CIC Center for Library Initiatives (CLI) was established in September 1994, to support collaborative
efforts specifically among the CIC libraries. Co-operative resource sharing has long been a practice among
the CIC libraries, and the CIC has an enviable record of successful, funded R&D projects. Most notably, the
CIC libraries are now engaged in a Virtual Electronic Library (VEL) project, funded through a United States
Department of Education grant. The VEL project will develop the technical infrastructure required to provide
seamless interconnections among a range of OPAC systems within the CIC libraries, and demonstrate its
applications through user-initiated interlibrary loans and document delivery throughout the CIC. The VEL
will enable more than half a million faculty, staff, and students to explore and take advantage of vast
resources within the CIC. Inherent to the VEL project is the addition of an expanded set of electronic
sources to the VEL pool of information resources. The Center for Library Initiatives is funded by dues
paid by members of the CIC.
Thus the general framework for collaboration lies in the strong support at the highest levels of member
university administration as evidenced by the existence of the funded office of the Committee on Institutional
Cooperation and its subsidiaries. Relevant initiatives in support of the missions of the member universities
flow from within this framework. One such initiative, in this case involving the Center for Library Initiatives
and CICNet, Inc., is the Wide Area Information Resources Management (SEIRM) project, informally referred to as
the CIC Electronic Journals Collection or CIC EJC.
THE PROBLEM: UNMANAGED SCHOLARLY MATERIALS
Increasingly ubiquitous Internet access within the research university, combined with the popularisation of the
World Wide Web, has made on-line academic publication and research more desirable. The expansion of Internet-based
publishing provides opportunities and challenges for libraries interested in the shared development and management
of electronic collections. The low cost per increment of user access compared with print journals, regardless
of user location, argues strongly for building multi-institutional electronic collections. Internet tools and
communications methods also make it possible to distribute the collection development and management tasks across
multiple institutions. We are faced with two challenges:
PREDECESSOR TO THE PROTOTYPE: THE CICNET E-SERIALS ARCHIVE
The development of the CIC Electronic Journals Collection (CIC EJC) grows out of collaborative efforts between
CICNet and the CIC library community, combined with CICNet's early work with WAIS and Gopher. In 1991, the CIC
library collection development officers asked CICNet to create an archive of the public domain electronic
journals which many of them had begun to collect locally but had no long-term means of archiving. CICNet went
beyond this original charge: archiving e-journals available at CIC member institutions, and sweeping the Internet
to collect all e-journals that could be obtained through an automated FTP process. The result, an undifferentiated
collection of some 700 titles in varying depths of retrospective coverage and completeness, was made accessible on a
CICNet server, and is now accessed approximately 35,000 times a day by users of the Internet.
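The kind of automated FTP sweep described above can be sketched in a few lines of Python using the standard ftplib module; the host and directory names below are hypothetical, not CICNet's actual sources.

    import os
    from ftplib import FTP

    # Sketch of an automated FTP sweep: mirror the issue files of one
    # e-journal from an (invented) archive host, fetching only what is
    # not already held locally.
    def mirror_journal(host, remote_dir, local_dir):
        os.makedirs(local_dir, exist_ok=True)
        ftp = FTP(host)
        ftp.login()                          # anonymous login
        ftp.cwd(remote_dir)
        for name in ftp.nlst():              # list the remote issue files
            target = os.path.join(local_dir, name)
            if not os.path.exists(target):
                with open(target, "wb") as out:
                    ftp.retrbinary("RETR " + name, out.write)
        ftp.quit()

    mirror_journal("ftp.example.org", "/pub/e-journals/some-title", "archive/some-title")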
The collection process was valuable in illustrating the range of materials available, but it became clear that
no entirely automated process could produce a collection that would satisfy the needs of most scholars. The
process of automatically sweeping the net could be used only to gather those items available through FTP. Many
new electronic journals are made available via Gopher or the Web. The current collection is best thought of as
a snapshot of materials available at a particular time in the history of the Internet rather than a comprehensive
resource, but the frequency of its use clearly indicates the need for a reliable managed collection of e-journals.
This collection is available at URL:
gopher://gopher.cic.net:2000/11/e-serials/managed
A PROTOTYPE COLLECTION: THE CIC-EJC
In response to the need for a comprehensive collection of scholarly and research electronic journals,
the CIC Task Force on the Electronic Collection developed a complete plan for building a managed electronic
journal collection (CIC-EJC) based on the CICNet gopher server. The Task Force, which includes representation
both from CICNet and the CIC libraries, planned the collection from selection to maintenance. CICNet staff
then developed a prototype system based on the recommendations and input of the Task Force. The prototype
system is available on the World Wide Web at URL http://ejournals.cic.net and is intended to serve as an
illustration of the work which can be achieved by the librarians of the CIC universities and the staff of
CICNet. It includes some 50 electronic journals, with current bibliographic records, complete and current
holdings of all titles in the collection, a helpful World Wide Web interface which links the bibliographic
records to the e-journals (the journals remain on publishers' sites; the text is archived by CIC-EJC), and
consistent archiving of the materials in the collection. The prototype supports browsing by title and by
subject, and searching the bibliographic records. The titles are mainly in the areas of IT and science
subjects, but we hope to increase the Arts and Humanities coverage.
We have five broad objectives for this project:
The work of the project has been divided and organised as follows:
Center for Library Initiatives:
CICNet, Incorporated:
CIC Member Librarians:
DEVELOPMENT PLANS
We hope to begin aggressive development of this resource within the next six months including:
Along the way, we hope to obtain insights into the answers for relatively fundamental questions, such as:
We will also consider natural functional enhancements, such as full text searching.
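As a hint of what full text searching would add, the toy Python sketch below builds an inverted index over two invented issue records; a real implementation would index the complete archived texts.

    from collections import defaultdict

    # Toy inverted index: map each word to the set of issues containing it.
    # The two sample records are invented for the example.
    documents = {
        "journal-v1n1": "electronic journals and virtual culture",
        "journal-v1n2": "collection management for electronic serials",
    }

    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)

    def search(word):
        """Return the issues whose text contains the given word."""
        return sorted(index.get(word.lower(), set()))

    print(search("electronic"))   # ['journal-v1n1', 'journal-v1n2']
    print(search("serials"))      # ['journal-v1n2']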
As we move forward with development of a viable virtual electronic library, the CIC EJC will serve both as
a testbed of inter-university collaboration and as an integral information resource for our users. The CIC
Center for Library Initiatives, in concert with CICNet, will continue the work of tying together discrete
projects and developments with an aim of providing flexible, desktop access and delivery of information
resources for the 500,000 students and 35,000 faculty and researchers of the CIC universities.
In the short term, we have submitted a proposal to the National Science Foundation for support of the
CIC EJC. We will continue to refine specifications for interlibrary lending systems and concomitant
delivery mechanisms, and we will move aggressively into the production and dissemination of digital
resources. Our aim is to deploy a seamless access tool to allow users to navigate the wealth of CIC
library and information resources. Through such action, we will continue the tradition of excellence
in higher education upon which the CIC was founded - providing our students with the finest education
available, and providing our researchers with unparalleled access to information. And by providing the
CIC EJC to the Internet community we continue our great tradition of serving a vital role as contributors
of research results and products to the broader international education community.
FURTHER INFORMATION
Further information is available at URL:
http://www.cic.net/cic/cic.html
The speaker can be contacted at bmallen@uiuc.edu
The prototype electronic journal collection (CIC-EJC) is available at URL:
http://ejournals.cic.net
NEW LEARNING COMMUNITIES IN THE NETWORKED ENVIRONMENT
JANA BRADLEY, Assistant Professor, Indiana University - Purdue University Indianapolis
At the request of the speaker, this presentation has not been reported here in full.
Interested readers are referred to the URLs shown at the end of the paper.
ABSTRACT
This presentation described the innovative instructional projects that combine networked information
technology and resources, interdisciplinary teams for design and delivery of instruction and interests in
student collaborative learning. The descriptions were based on data from the ten project teams who
participated in the Coalition for Networked Information's second invitational Conference on New Learning
Communities, held at Indianapolis in November 1995.
NEW LEARNING COMMUNITIES
The first New Learning Communities conference was held in July 1994 in Phoenix and was followed by a
second conference on November 17-19 1995 at Indianapolis. Both were organised under the auspices of the
Coalition for Networked Information, CNI. The programme seeks to promote cross-fertilisation of the different
types of professionals in higher education who use networks to enrich their curricula and broaden students'
learning experiences.
To achieve this aim, the programme brings together institutional, or inter-institutional, teams of
varying compositions and roles, comprising faculty, librarians, information technologists, instructional
technologists and students. The exchange of ideas and exposure to different viewpoints will lead to a greater
understanding of the total perspective in terms of campus utilisation of networks and the development of a set
of "best practices" for the benefit of the wider educational community.
THE PROJECTS
From the propositions put forward in response to a call for projects issued by CNI early in 1995,
ten teams were selected to participate in the Indianapolis conference. The projects are:
Further information can be obtained from the following URLs:
http://www.cni.org/projects/nlc/www/nlc.html
ftp://ftp.cni.org/CNI/projects/nlc
gopher://gopher.cni.org:70/11/cniftp/projects/nlc
LANGUAGE LEARNING: A CONSORTIUM APPROACH
GRAHAM CHESTERS, TELL Consortium, University of Hull
This account was drafted for this report by The Marc Fresko Consultancy.
It is based on notes taken during the presentation and slides used.
ABSTRACT
Over 30 UK universities have joined together to produce and evaluate language learning courseware
in five European languages. The project is nearing completion, with over 40 packages coming on stream
in 1996. Issues such as consortium management, pedagogic design, formative evaluation procedures,
training and dissemination all suggest lessons to be learned.
THE TELL PROJECT
The focus of the Technology Enhanced Language Learning project (TELL) was the production of
courseware for language learning. The project was initiated in 1992, and was funded by the
Teaching and Learning Technology Programme (TLTP).
The objective was to produce courseware for French, German, Spanish, Italian and Portuguese.
All levels of learner were to be catered for, from beginner to final-year expert. The courseware
was developed to run on current desktop systems; as the project started in 1993, the basic
requirement is a PC with an 80386 processor or an equivalent Macintosh.
The project started in January 1993 and was formally completed in December 1995, with a total budget
of £1.35 million. Being funded ultimately by top-slicing from the funding councils, it was undertaken
in order to determine the value of computer-aided learning, and the place it should have, in this field.
PROJECT DELIVERABLES
The products to be delivered by the project were divided into three strands, namely:
For example, there is a translation environment, which provides help and hints etc; and sets of
grammar exercises to get students to apply their new knowledge.
In the end, the project has produced 43 packages. These include seven CD-ROMs and 36 networkable packages.
They are at present being commercialised, but are available to higher education institutions at cost.
PROJECT ORGANISATION
TELL was performed by a consortium led from the University of Hull, which is also the location of the
Computers in Teaching Initiative (CTI) Centre for Modern Languages. The project has thus benefited from
CTI expertise in requirements, its knowledge of appropriate developers, and experience in project management.
There were 15 development sites and 21 affiliate (evaluation) sites.
Project management was performed by the speaker (about 25% of time), a project manager (50%), and one
other (100%), with secretarial support. It turned out that project management was difficult and demanding,
for several reasons. The obvious reason is that tight production deadlines and academia do not necessarily
sit well together. Less obvious perhaps is the complexity of managing and distributing funds to so many other
institutions through the University of Hull. Finally, liaison with the funding councils was problematic in the
early days of the project, specifically in the area of co-ordination between projects (for example over the issue
of copyright).
Some of the motivators, or "adhesives", which helped the consortium work together effectively included payment
deadlines, and the requirement to complete tasks in order to get paid; reputation, which impelled contributors to
produce good quality work; and a degree of camaraderie with co-workers engaged in similar tasks.
COMMON ACTIVITIES
Co-ordination of work and standardisation of approaches was of course important with a consortium as
large as TELL. Within about six months of the start of work, we had agreed common guidelines (down to the
level of fonts and colours). As an evaluation of the value of tools of this type is an important part of the
project, we also agreed common evaluation procedures, including a particularly extensive exercise to monitor
students using software packages.
We undertook and planned widespread dissemination activities, using common materials. This included
workshops, site visits, and prospective user group meetings.
The project materials are now being promoted (copies of promotional literature are available on request),
and the materials themselves will be available from April 1996 onwards. Some funding has been obtained for
ongoing maintenance and development.
SUCCESSES
We are optimistic of success, but obviously it remains to be proven. Signs of success will be a sense
of ownership of the products by the HE community; development of a "materials bank" consisting of additional
compatible resources, and development of new packages. The amount of success will become apparent in a
timescale of some two or three years.
On another level, we already know we can claim some successes. We successfully steered a consortium of
over 30 universities to work together and to produce results. Teams have been formed, some of which are
continuing to work together. Many skills have been acquired and enhanced during the course of the project.
And we have created a seed bed into which the nuclei of new ideas can fruitfully be planted.
FURTHER INFORMATION
Further information is available by e-mail from
g.chesters@french.hull.ac.uk
IT POINT - NETWORKING IN THE COMMUNITY
GULSHAN KAYAM, IT POINT Manager & SUE TURNER, Library Manager, Chelmsley Wood Library
This account was prepared for this report by The Marc Fresko Consultancy. It is based on edited and merged versions of papers and slides supplied by the speakers.
ABSTRACT
IT POINT is about bringing the benefits of IT to the community, and about creating a vision of the public branch
library of the 21st century. The project involved the installation of PCs with a wide range of facilities for
public use into a branch library. This paper describes the aims of the project, the implications for the public
library service, and community involvement and awareness. Statistical details of usage are provided, and the paper
concludes with a glimpse into the future for IT POINT and information networking in public libraries.
INTRODUCTION
IT POINT is within Chelmsley Wood Library, a large branch library situated in the North of Solihull, to the
East of Birmingham. It is the second largest library in the borough, serving a community of about 24,000 with a
collection of some 60,000 items.
IT POINT
IT POINT is a part of the public library. It contains six PCs which members of the public can book and pay
to use, gaining access to a wide range of facilities. The facilities include:
Customers can use training packages to learn how to use software packages at their own pace.
IT POINT is not only a fully functioning service but also a research project funded by The British Library
Research and Development Department. The research nature is illustrated by the fact that the service was originally
free when opened in August 1994. Charges were introduced in October 1995, to test ways of sustaining the project,
and are likely to stay. The project has been an opportunity to provide public library managers with a well-tested
model which can be adapted to their own services. Its present funding from the BL R&DD ends in March 1996; the
future of the service and how it can be sustained are examined below.
The project has created an opportunity to redress the imbalance of IT development in the library community where
academic libraries have benefited from the development of JANET and special libraries can often call upon the finances
of a large parent organisation.
Aims of the project
The aims are to:
Results
Early findings are described below; the final report is planned to be available in May 1996.
LIBRARY IMPLICATIONS
These IT services are managed not by library staff but by an IT expert (Gulshan Kayam). The library's role has
been to support the project and ensure it complements the library service offered to the community. The advantages
of having such a facility within a branch library have been enormous. Access to the Internet and CD-ROMs is an
excellent boost to any library service.
Staff training and awareness have been fundamental to the acceptance and utilisation of these new formats. But
there remain a number of challenges to using IT for information provision:
Statistics for the use of IT POINT by staff for public enquiries have not been collected, but on average there
are some two or three such enquiries per week. The volume would probably increase if there were greater integration into the
library service, for example if there were CD-ROM and Internet access at the Information Desk. This will come with
a change in culture and with younger, more IT-literate, staff.
COMMUNITY INVOLVEMENT AND AWARENESS
In addition to using IT for public enquiries, we have also used more direct methods to introduce our customers to
the advantages of IT.
The greatest success has been in schools. Not all local schools have PCs and CD-ROMs, so IT POINT has provided a
real advantage. We have organised an annual contest between primary schools, called the CD-ROM challenge. In this
competition, groups of students race against each other to answer questions using CD-ROM encyclopaedias. We have
also organised a Print vs IT challenge to encourage awareness of the different formats of information.
We use CD-ROMs with younger customers as well, in story time sessions for under-fives. This encourages parents
to recognise that they may need to understand IT in order to support their children's educational needs.
Using the library's links with the careers service has also ensured the high profile of IT POINT, as it becomes a
showcase once a year when the library hosts the annual Careers Convention for the locality. Therefore, young adults
and their parents have an opportunity to view and use the facilities.
Informal coffee mornings and introductory tours have been an opportunity to encourage our customers to recognise
the advantages of IT and the role it has to play within the library.
All these activities have been ways of changing people's perceptions of libraries by ensuring the use of IT is
customary within libraries.
EARLY FINDINGS
Geographical distribution
IT POINT currently has 1,100 members. 50% of members are local, from Chelmsley Wood. A further 24% of members
come from neighbouring areas in North Solihull, 5% come from South Solihull and the rest come from nearby
cities such as Birmingham and Coventry. See chart 1 below.
Age and Sex
IT POINT's membership ranges from 5 years of age to 75. Chart 2 shows the division of men, women, boys and
girls using the services.
Usage
Usage increases noticeably during the Summer months, when school children are on holiday and make greater use
of the service. For example, in August 1995 Internet usage reached 279 hours. Chart 3 shows the change in usage
by month, for selected applications over a four month period in Summer 1995.
Usage of the service dropped in October 1995, which is when usage charging was introduced. The decrease can be
seen by comparing chart 3 with chart 4, which shows usage for the following four months. Usage is now steadily
progressing, and with further marketing of IT POINT's services we anticipate that usage will increase anew. Internet
usage here includes Electronic mail, Web browsers, Telnet, Gopher and FTP.
KIND OF USAGE
The usage is shown above in strictly statistical terms. What impact IT POINT has had on the community, we do not
know yet. This is the subject of research work in progress; it is being carried out by researchers from the University
of Central England. However, we already see a wide diversity of kinds of usage:
We also see fascinated individuals who are totally taken by what the Internet has to offer, and over the past
seven months we have been providing Internet awareness sessions to the community.
TRAINING
People attending training sessions are not necessarily members of IT POINT. Some are from the private sector.
As well as aiming to raise awareness within the community, IT POINT has been used to train internal staff to use
the Internet and CD-ROMs - not just staff from Libraries but also from the IT, Arts & Tourism, Careers
and Housing Departments. We also have close relationships with the local Job Centre and the Citizens' Advice Bureau.
We have provided awareness sessions for their staff, enabling them to refer clients with confidence having seen what
we have to offer.
EVOLUTION OF IT POINT
We see three main themes emerging:
FUNDING
Chelmsley Wood is eligible for European Objective 2 funding, the aims of which are to help alleviate problems
arising from the decline in traditional industry. One of the reasons for locating IT POINT in Chelmsley Wood was
this funding: it offers a means of maintaining the service when the British Library Research and Development grant ends.
We have applied for European funding, namely the European Social Fund and the European Regional Development Fund.
We have worked with the local college, Solihull College, and together we are planning to implement a Telecollege,
which forms part of a successful Single Regeneration Budget (SRB) bid. It will run for five years starting this
year. The total funding from the SRB, European money and Further Education would be £350,000.
All the courses offered will be accredited courses leading to qualifications. The Telecollege project will involve
local firms and will build IT-based courses relevant to their skill needs. We hope they begin to make a financial
contribution, in order to ensure maintenance of the service when the SRB funding comes to an end.
THE FUTURE
Taking a look at the future of IT POINT, three main issues emerge:
Immediate Future
The immediate future will involve completing the original project, specifically the research dimension, in terms
of the British Library funding which comes to an end at the end of March 1996. We will also continue with the
training work and the Single Regeneration Budget bid and Telecollege already mentioned.
Value of IT POINT to the Community and Corporate Objectives of Solihull Council
In terms of value to the community and the corporate objectives of the Council, it is already evident that there
is a need for:
The purpose of Project LISTED is to develop a dynamic cataloguing system of distance learning materials accessible
from libraries. Solihull is the co-ordinating partner of Project LISTED, and the project is due to start in
mid-March. Chelmsley Wood has been chosen as one of eight test sites; the testing is due to take place
approximately nine months after the start of the project.
A number of important questions arise. Where does this leave the library service? How do libraries ensure they
will have a lead role? How does this fit in with the digital information that is being delivered by cable to the
home? And will there be as many public kiosks as there are telephones? We look to this sort of research to try to
answer these questions.
Community Networking
There is now a plethora of networks. We see community networking, libraries networking, schools and higher
education networking... but how can this all come together? How do we bridge the gap? The answer may be government
funding through the LA millennium bid, public access kiosks and networking with cable companies and BT, which is
already in progress. This is suggested by the document produced by the Local Government Management Board, called
Tomorrow's Town Hall. A key passage of this report states:
"...Councils were better placed to deal with "information imbalance" in society than any other institution [...]
and now an Internet service is at the heart of all its libraries' services, heavily subsidised for all and free to
those on benefits or low incomes."
We fervently hope that this will happen before the year 2000.
CONCLUSION
Higher education depends on the community for its intake and context, and not just on the students currently
"in the mill". All public libraries need to benefit from the Electronic Libraries Programme. They have no benevolent
central funding. To conclude, what we need is co-operation and project co-ordination.
After 20 months' experience with IT POINT, it is obvious that Information Technology has a vital role to play
within public library provision. This should be not only for the purpose of accessing greater amounts of information
with greater ease, but also to support lifelong learning skills in the community. However, public libraries are not
going to achieve this in isolation.
FURTHER INFORMATION
Further information can be obtained at URL: http://www.itpoint.org.uk
Electronic mail can be sent to the project team at info@itpoint.org.uk
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation, slides used, and a paper supplied by the speaker.
In many institutions, librarians and computer professionals, faculty and researchers are developing collaborative relationships. Many of these relationships have evolved in the context of networked information resources and services. The characteristics of true collaborations or partnerships are examined and their elements defined - taking into account the views of expert authors such as Schrage, Kanter, and Katzenbach and Smith - and using examples of library/computing partnerships in higher education, in areas such as development of campus information systems, teaching and learning programmes, facilities and services. Factors that promote collaboration, and those that impede it, are explored in the light of the experience of participants in the Coalition for Networked Information's Working Together programme, which brings together teams of librarians and information technologists.
Many of the collaborative relationships which are now developing in many institutions, between librarians, computing professionals, teaching faculty and researchers, have evolved in the context of networked information resources and services.
This paper describes the characteristics of successful relationships, based on the work of writers in the fields of communications and organisational behaviour. The range of existing collaborations in the networked information environment will be categorised and factors which motivate collaboration, and those that hinder it, will be examined. The implications for library administrators, computing centre directors and higher education institutions, both now and in the future, will be considered.
The roles and job functions of librarians and information professionals have been subject to significant change over the last five to ten years - firstly as a result of automation and, latterly, as a consequence of the advent of networks.
In some higher educational establishments this has led to the administrative merging of libraries and computing centres; in others it has resulted in increased collaboration between two administratively separate units and there are yet others where the consequence has been duplication of some functions.
In the development of local library systems, libraries often contracted with computing centres to run library applications on mainframe computers (sometimes physically located in the computing centre), update and amend software and take responsibility for the reliability and integrity of the system. The library was responsible for the system conceptualisation and content. Reciprocal arrangements, wherein computing centres contracted with libraries for informational needs, have not been much in evidence.
As the information environment has become increasingly networked, librarians and information technologists have begun to move from contractual arrangements to new patterns of relationships characterised by a move towards shared responsibility in:
Some researchers describe these new relationships as collaborations or partnerships. In Shared Minds in 1990, Michael Schrage defined collaboration as "the process of shared creation: two or more individuals with complementary skills interacting to create a shared understanding that none had previously possessed or could have come to on their own. Collaboration creates a shared meaning about a process, a product, or an event." Schrage's concept of collaboration focuses on the characteristics of the process of interaction among the partners, with emphasis on an intangible - shared meaning - and a quality of interaction between the participants that can best be described as mutual respect. His view of collaboration, therefore, encompasses:
Rosabeth Moss Kanter in 1990 described productive partnerships as those which evolve and continue to yield benefits, create new value and work through interpersonal connections and internal infrastructures which enhance learning. She identifies eight characteristics of the best partnerships:
Kanter's characteristics are similar to Schrage's in emphasising the mutual reliance of partners, the need for each to contribute skills and/or knowledge and the development of trust, but she adds a key element in the realisation by participants that they must address a common mission or strategic objective.
Ten years ago, neither the library nor the computing centre would have nominated a Campus-Wide Information System (CWIS) as one of their strategic goals. Now most campuses expect that a CWIS is either available or is being developed by one or more units - and both the library and the computing centre would have a valid claim to its being within their strategic province. By combining resources, pooling professional talents and learning from each other, the two units can implement a collaborative project which will enhance the information environment of the campus and yield shared meaning.
In a true collaboration, each unit considers the endeavour to be mission critical, while in a contractual relationship only one party would have that view, the other having provision of support services as the primary goal. Also, in a collaboration, risks and benefits are shared, whereas in a contractual arrangement, the contracting party reaps most of the benefit if the project is successful and is the major loser if the project fails. Success, as far as the party providing services is concerned, may still provide a limited share of some of the benefits, but failure will not prejudice the agreed remuneration.
In their widely read 1993 book, The Wisdom of Teams, Katzenbach and Smith focused on the dynamics of team performance - defining a team as "a small number of people with complementary skills who are committed to a common purpose, performance goals and approach, for which they hold themselves mutually accountable." They also suggest that while teams can and should take direction from upper management levels, they must shape their own purpose, and they stipulate five factors typical of teams:
A particularly useful aspect of the work of Katzenbach and Smith was in their delineation of the differences between teams and working groups. In the latter, a common area of work is identified and group members are assigned work based on their specific areas of expertise and their performance is judged individually. There is no attempt to develop mutual accountability or shared meaning. By contrast, the team approach encompasses both individual and mutual accountability and shared meaning is fundamental to it. The different quality of interaction results in an end-product for a team-based project which is more than the sum of the individual team member contributions.
CNI sponsored a workshop/retreat to assist new and existing teams in learning better ways of working together. One of the techniques to help them develop shared meaning was an exercise in force field analysis, from which the teams identified several factors which tended to motivate collaboration between librarians and information technologists:
It is perhaps much harder for people to discuss problems. Nevertheless factors which hinder collaboration have been identified as:
Administrative structure and personality differences can exacerbate any one or all of these factors. Collaboration initially on a smaller scale may help counter problems, especially in the case of the last two.
Only a few years ago, the provision of information services to the campus community had a very different meaning for both libraries and computing centres. Advances in the networked information environment in, for example, library catalogues on the Internet and universal access to information via gopher and the World Wide Web, have brought library and computing centre services closer together. The traditional division of complementary skills into "content" for librarians and "conduit" for computing professionals, which has served as a good starting point for collaborations, has become increasingly blurred.
Collaborations on service projects with a concrete outcome, such as the provision of reference facilities, or the establishment of a help desk, or the development of an Internet training programme, offer opportunities for staff from library and the computing centre to begin to discuss common mission, can provide mechanisms for sharing skills and will encourage the concept of shared meaning.
The development of a campus information environment, including information policies, creation and distribution of electronic journals and the development of Web sites and pages, provides a fertile ground for the exchange of ideas on the mission of the library and of the computing centre, promotes convergence of the principles and practical experience of professionals in both disciplines and encourages the adoption of the concept of shared meaning.
Top administrators of libraries and computing centres have a critical role to play in the successful implementation of collaborative projects between their respective units. Firstly, they must share their vision of a project and the way in which it addresses the mission of their unit, with all staff involved; secondly, they must demonstrate a demeanour which discourages power struggles between the units; finally, they must make evident a mutual respect and help to create a climate in which others in their units can develop shared understandings.
Apart from the methods described above, there are many creative and forward-looking ways to foster partnerships between librarians and information technologists. Most important, however, is to begin with an open mind and an attitude that each group has much to gain from the other.
Successful partnerships between libraries and other professionals, such as the teaching faculty, publishers, media designers and instructional specialists, are also becoming increasingly common in the environment of the World Wide Web. There is a strategic advantage for libraries and librarians in securing collaborative relationships both within the academic arena and out of it. No one individual or profession has all of the skills now needed to create an information infrastructure for a community of users.
In conclusion, the establishment of a collaborative relationship should provide each partner with a stake which relates to its mission. A shared understanding of the way in which a project contributes to each partner's mission will have a beneficial effect on all team members, and the individuals concerned must also pay as much attention to the relationship among the partners as to the outcome of the project. A fundamental appreciation of the complementary skills of each team member - and of the viewpoints associated with them - is crucial and must be accompanied by open and easy communication, opportunities for mutual teaching and learning and the building of mutual trust and respect. Shared meaning should be an explicit goal of the project.
Finally, partners need to focus on the ultimate benefit of collaborative relationships - improved products and services for the community of users in an increasingly complex and sophisticated information environment.
Further information is available from: joan@cni.org
Billings, Harold. Supping with the Devil: New Library Alliances in the Information Age. Wilson Library Bulletin 68 (October 1993) 33-7.
Creth, Sheila D. The Information Arcade: Playground for the Mind. Journal of Academic Librarianship 20 (March 1994) 22-3.
Creth, Sheila D, and Lipow, Anne G, eds. Building Partnerships: Computing and Library Professionals. Berkeley: Library Solutions Press, 1995.
Flowers, Kay, and Martin, Andrea. Enhancing User Services through Collaboration at Rice University. CAUSE/EFFECT 17 (Fall, 1994) 19-24.
Graves, William H, Jenkins, Carol G, and Parker, Anne S. Development of an Electronic Information Policy Framework. CAUSE/EFFECT 18 (Summer 1995) 15-23.
Gusack, Nancy, and Lynch, Clifford A, eds. Special Theme: The TULIP Project. Library Hi Tech 13 (No. 4 1995) 7-74.
Hess, Charlotte, and Bernbom, Gerald. INforum: A Library/IT Collaboration in Professional Development at Indiana University. CAUSE/EFFECT 17 (Fall 1994) 13-18.
Kanter, Rosabeth Moss. Collaborative Advantage: The Art of Alliances. Harvard Business Review 72 (July-August 1994) 96-108.
Katzenbach, Jon R, and Smith, Douglas K. The Wisdom of Teams. New York: HarperBusiness, 1993.
Koopman, Ann. Library Web Implementation: A Tale of Two Sites. CAUSE/EFFECT 18 (Winter 1995) 15-21.
Lowry, Anita. The Information Arcade at the University of Iowa. CAUSE/EFFECT 17 (Fall 1994) 38-44.
McMillan, Marilyn, and Anderson, Gregory. The Prototyping Tank at MIT: Come on in, the Water's Fine. CAUSE/EFFECT 17 (Fall 1994) 51-2.
Molholt, Pat. On Converging Paths: The Computing Center and the Library. Journal of Academic Librarianship 11 (November 1985) 284-288.
Neff, Raymond K. Merging Libraries and Computer Centers: Manifest Destiny or Manifestly Deranged? EDUCOM Bulletin 20 (Winter 1985) 8-12.
Schiller, Nancy. Internet Training and Support: Academic Libraries and Computer Centers: Who's Doing What? Internet Research 4 (Summer 1994) 35-47.
Schrage, Michael. Shared Minds: The New Technologies of Collaboration. New York: Random House, 1990.
Shapiro, Beth J, and Brook Long, Kevin. Just Say Yes: Re-engineering Library User Services for the 21st Century. Journal of Academic Librarianship 20 (November 1994) 285-90.
Woodsworth, Anne. Computer Centers and Libraries as Cohorts: Exploiting Mutual Strengths. Journal of Library Administration 9 (No. 4 1988) 21-34.
Young, Arthur P. Information Technology and Libraries: A Virtual Convergence. CAUSE/EFFECT 17 (Fall 1994) 5-6.
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation, and a paper supplied by the speaker.
Information Services Directors in United Kingdom universities are now faced with a new threat: submergence in paper and electronic data. The introduction of electronic aids, from publishing to e-mail, has actually increased the amount of information to be handled by staff and students in higher education. This presents a very real risk that depth of scholarship will be subordinated to volume of activity. Information Services Directors may actually be adding to the problem by their quest for improved information handling. But is it in fact a problem? And if so, what should be done about it?
It should be emphasised that the views which follow are those of the speaker, and are not made with reference to the views or activities of any other person or institution.
Information is one of those words of obvious meaning: "telling", or "what is told", or "knowledge", or "items of knowledge", or "news". Information services in higher education usually refers to the collection, cataloguing and promulgation of information, the media involved and the personnel who effect these processes. Within this context, information is probably best defined as "items of knowledge" or "news".
In the area of information services, two questions need to be addressed:
The sometimes unseemly scramble at many institutions to force convergence of Library and Computing Services is a partial recognition of the possibilities now emerging for a management strategy for information and its means of distribution. The recent JISC document on Information Strategies has focused attention on this area and on the need to include all sources of information in such a strategy. Thus, Library and Computing Services, though the main players, should be joined by MIS, Public Relations and AVS in the formulation of any information strategy, since they are all services in which knowledge and news play an essential role. It is worth noting that although these services are already linked strategically, this does not necessarily mean that their management is linked. This may, however, be an outcome of an information strategy.
The advent of the Internet and the World Wide Web has changed customers' perceptions and requirements to such an extent that they now see many information providers (and their staff) as a hindrance rather than as a help. Enquiries for guidance to knowledge are being superseded by requests for connection only, and this, understandably, is leading to a severe loss of morale among dedicated library staff.
In addition, information service providers are themselves being inundated by an increasing influx of information, both paper-based and electronic (e-mail, the Web - and even the telephone and fax). Information professionals are increasingly under pressure with respect to their own information handling circumstances at a time when the need to understand and respond to customer needs is greater than ever. The result is maximum activity and minimum thinking. Sir Brian Follett remarked upon the influence of the "newspaper mentality" upon scholarly publications and voiced fears for their future. In the very near future, crucial questions to consider will be:
There is a perceived move by academic professionals to contain the increased workload (the number of students, for example, has increased by over 50% in the last two years) by setting courses which require little or no supervision. This involves greater use of resource centres, or establishing them in libraries where they have not yet been installed.
This is expected to be achieved, however, in a climate of rapidly declining financial support for capital projects and ever-tightening budgets - i.e. with little prospect of additional staff or other guidance resources, no new accommodation, or new equipment.
Collaborative working by students, in project or work groups, addresses only the fringe of the problem. Leaving students to find their own way through information gathering, without guidance from either faculty or information professionals, carries a great risk of lack of depth in scholarship. For a university with a record of high achievement in research, this would be particularly galling. To some extent this could be mitigated by Quality Assurance techniques, but unless a structured approach to group learning, with guidance, is devised, there is a very real danger that significant scholarship potential will be lost.
Any set of problems like this invites initiatives - a chance to turn threats into opportunities. The question, however, is what initiatives would be applicable?
It has been suggested by many people that a way of controlling access to the Web, and of insertion of information into the Web, needs to be found and applied. This would have an immediate effect on the volume issue (and calm senior administrators' minds regarding outsiders taking legal action against universities), but given the nature of the Web, and despite legislation being considered in the United Kingdom and the United States, it is not a practical proposition.
Collaboration via network access can involve many types of organisation, including collaboration with public libraries, museums, television and communications companies and other educational establishments. It can also take many forms - the development of Metropolitan Area Networks in Scotland being a good example to emulate, and in particular, SCRAN, the Scottish Cultural Resources Access Network.
The increasing collaboration of librarians and computer professionals ("computarians") has complementary elements, in that librarians like to categorise while computarians like to facilitate and to control physical resources. On the one hand, the free and easy access to networked information may be compromising categorisation and, on the other, the demise of the mainframe, coupled with on-line access and updating, may be infringing the territory of the computarian. Co-operation may counter these trends and reverse the loss of control which might so easily equate to loss of scholarship.
Opportunities need to be taken to review the situation in depth (as an outcome, perhaps, of TLTP and of eLib, some of whose projects deal with this theme). There is also the prospect that digital library resources might change attitudes to network surfing. At least a number of projects are pointing in the right direction.
What must be prevented at all costs is centuries of scholarship becoming a "fashion boutique" occupying "niches" on the super highway to nowhere.
Further information is available from r.field@ed.ac.uk
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation, slides used, and a paper supplied by the speaker.
For over one hundred and fifty years public libraries have been providing information to citizens in a friendly, open door environment. The new superhighways have accelerated technological change and development of new and exciting information resources. To help citizens to take advantage of the consequent opportunities, public libraries need investment from the private and public sectors to create an infrastructure of a Millennium Citizen Information Network. As a first step, the Library Association has put forward a proposal to the United Kingdom Millennium Commission for funding for the initial development of a nation-wide infrastructure so that public/private partnerships can support national resource sharing and content creation. Project EARL, a consortium of public libraries developing Electronic Access to Resources in Libraries, is a key partner in the bid. With the aim of supporting lifelong learning, business development and community, cultural and government services, the Library Association and the EARL projects seek to empower the Millennium Citizen to participate fully and actively in the emerging technological democracies in Great Britain, Europe and beyond.
The Millennium Citizen Network will be a virtual, rather than physical, monument to the Millennium - but in providing equality of access to information superhighways for every citizen in the twenty-first century, it will have a very tangible and lasting impact.
The project will be initiated by the development of resource sharing and network services between public library authorities.
Public libraries are used by more people more of the time than any other comparable institution. It is estimated that over 60% of the population of the United Kingdom are regular library users. As Ken Worpole put it in Towns for People (OUP 1992):
"... the central library remained the key cultural institution ... where one could find a genuine cross-section of people by age, class and ethnic origin."
This, backed by a hundred and fifty year tradition of providing free access to information services - with, now, publicly available print and electronic resources - puts public libraries in an unparalleled position to give universal access to knowledge.
That universality will necessarily involve access to the resources of all libraries. The major obstacle to this goal is the level of electronic interconnection. Survey findings in the Public Libraries Review revealed that:
"53% of all public libraries have some kind of Internet connection, but this is a very limited form of connection, both in terms of penetration and type of access" (Library and Information Commission Public Library Internet Survey, UKOLN, 1995).
The position is exacerbated by local bias. Again, the Public Libraries Review reported:
"... many library authorities drew attention to innovations they had pioneered or developed as a "flagship" service, (but) there was an apparent reluctance to take up initiatives they had not initiated."
Millennium project funding would be directly channelled to counter this low level of interconnection and low use of network services.
The Millennium Commission intends to support at least a dozen very major projects across the United Kingdom. Up to £50 million per project will be made available, but this will have to be matched pound-for-pound by funds from public and private sectors or from other funding bodies, such as
Another potential source would be the generation of revenue, though in its Final Report, the Review of the Public Library Service in England and Wales, undertaken in 1995 for the Department of National Heritage, notes that:
"About two in five users tend to favour charging, but it is worth noting that the combined totals for council tax, a special library fund and income tax or VAT exceed the proportion favouring charging. In other words, public sector sources taken together attract more support than charging."
Other possible sources within the libraries sphere, although they can be given a direct value, are not strictly allowable as matched funding. They include the creation of assets with national content and statutory provision of sources. Qualification for funding will also require evidence of long term viability of the project.
The deadline for the completion of the Proposal Form was 8 December 1995, and the Application Form has to be submitted by 12 February 1996. An appraisal from Technical, Financial and Commissioner viewpoints by the end of April 1996 will lead to a long-listing. This gives a strong indication that, subject to technical adjustments, the project will be accepted, but it obviously does not guarantee success.
The next step is a detailed appraisal, followed by shortlisting. Final acceptance will lead to the payment of the requested grant.
The initiative is an umbrella project (using SUSTRANS as a model) controlling many projects for the national co-ordination of public libraries networking.
The infrastructure has to take into account:
It has, as a guide, the observation by the Public Libraries Review that:
"Infrastructure investment should be made to link central, branch and mobile public libraries and new access points throughout England and Wales to the information superhighways through broadband cable on Integrated Services Digital Network ISDN connections. The links should provide rapid access to on-line databases and regionally based CD-ROM collections."
Accordingly the infrastructure will have to support:
A typical interconnection would be the British Telecom-connected rural/urban model linking the County of Norfolk and the City of Birmingham, providing information on data, traffic, applications, addresses, telephone numbers and opening hours.
It is based on SMDS (Switched Multimegabit Data Service) core technology to establish seamless automatic connection, on a "virtual" network, at the speed of the slowest link, so that any library in any authority will have access to all of the resources of any other library in the designated areas.
The backbone of the network consists of SMDS access points at 0.5, 2.0, 4.0 and 10.0 Mbit/s. Links from the libraries will be via ISDN2 (Integrated Services Digital Network) or Kilostream; links within libraries will use an Ethernet LAN.
The principal participants are:
EARL, Electronic Access to Resources in Libraries, was the first public libraries networking initiative, demonstrating a bottom-up approach to infrastructure. It involves 41 public library authorities and is funded by subscription and by the British Library Research and Development Department. It encompasses:
Further information is available from http://www.earl.org.uk/
The project will support Public Library Authority development. Its EARL- and UKOLN-derived framework and linking initiatives will promote:
Essentially this covers the short term, February to April 1996, ensuring that the project is prepared with:
Finally, the Public Libraries Review once more encapsulates the way forward that the project represents: "Library authorities should retain the option to levy charges for new added value services. However, we believe that there is a case for allowing uncharged access to Internet or World Wide Web sources that are essentially free - that is, available at no more than the cost of local calls to telecommunications nodes."
The importance of universal access to knowledge for the citizen of the twenty-first century cannot be overstated. Nor can there be a better monument to the Millennium than the means by which it can be created.
Further information is available by e-mail from Philippa@EARL.org.uk
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation and slides used.
Despite the growth of the WWW, despite its ease of use and despite the proliferation of search tools, there is still a need for human intervention to guide the search process by adding judgmental information. SOSIG is the first of a series of subject-based gateways under construction in the UK. SOSIG has been running for 18 months and is piloting an approach to classification and description which allows the information to reside at appropriate centres of expertise. This expertise will be made available to users in a transparent and distributed way.
Until about two years ago, we had to put up with a variety of complex interfaces for Internet services. Not only were interfaces inconsistent, but addresses were constantly changing. We had Gophers galore - frequently highly nested - and the UK academic community was using X.25 communications, incompatible with direct Internet use.
Typically, users went through three phases of network awareness. First, before network training, came "machine-phobia", when they were sceptical of the use of the Internet. Then, after their initial training, came a period of "network-o-philia" as they became excited by the possibilities of Internet research. Sadly, the third phase, "getting lost", followed soon after, as the difficulty of locating resources became apparent and the enthusiasm was lost in proportion.
This gave rise to the concept of the "subject gateway": a meta-resource which provides links, in an organised manner, to data resources in a particular domain.
Today's hyper-linked WWW network cuts through the hierarchical nests which proved so confusing. WWW is pervasive, as is its use. It is increasingly populated with a variety of search engines and "robots" which make searching easier. And for those who are still struggling, there are many "idiot's guides" available to help. It would therefore be understandable to question the need for further subject gateways.
Despite the advances of the WWW, searching the Internet is still problematic. The search tools which exist are still primitive; for example, they often return large numbers of "hits" in response to search enquiries. In practice, serendipity and browsing are insufficient, even augmented by today's tools. There is still a need for human selection and expertise to assist the search process; and that is precisely what a subject gateway offers.
SOSIG - the Social Science Information Gateway - (pronounced "sausage") is designed to be the first stop for Social Scientists on the Internet. Its key attributes are to provide a consistent interface, and to deal with information issues which include:
The resources linked by the gateway include mailing lists, newsgroups, printed guides and catalogues, and networked search tools. The resources included have been "filtered" in accordance with a developed policy. This policy is qualitative, emphasising the value added by human involvement in the maintenance of the gateway. The aim is to make the gateway comprehensive for UK resources.
Resources are classified according to their UDC (Universal Decimal Classification) code, with some cross-referencing. The lists can be viewed in alphabetical order or by UDC (users almost always choose the former, which is the default).
Each resource identified in SOSIG is described by a single record. The records, which are based on IAFA templates, include:
The descriptions clearly are central to the success of the enterprise. Some are contributed by owners of the resources, others are developed by the SOSIG team.
These records underlie the search facility, and automatically create the browsable classified collection.
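As an illustration of how a single set of template-based records can drive both facilities, the following minimal sketch (in Python) builds a tiny record collection, a free-text search, and the alphabetical and UDC browse views. The field names and example resources are invented for illustration; they are not the actual SOSIG template definition.

records = [
    {"title": "Social Research Update",
     "udc": "303",
     "uri": "http://example.ac.uk/sru/",
     "description": "Quarterly guides to social research methods."},
    {"title": "Economics Working Papers",
     "udc": "330",
     "uri": "http://example.ac.uk/econwp/",
     "description": "Preprint archive for economics papers."},
]

def search(term):
    # Free-text search over the descriptive fields (the search facility).
    term = term.lower()
    return [r for r in records
            if term in r["title"].lower() or term in r["description"].lower()]

def browse(by_udc=False):
    # The browsable collection: alphabetical by default, or classified by UDC.
    key = (lambda r: r["udc"]) if by_udc else (lambda r: r["title"].lower())
    return sorted(records, key=key)

for r in browse():                       # the default, alphabetical view
    print(r["title"], "-", r["uri"])
print([r["title"] for r in search("economics")])

The point of the sketch is that search and browse are simply two views over the same records, so maintaining the templates maintains both services at once.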
SOSIG has been used in practice to answer an impressive range of questions and enquiries. Some real life examples include:
The main ongoing effort is to make SOSIG a test site for the ROADS system [1]. There are also efforts under way to encourage the creation and recording of new Social Sciences resources, including non-networked resources.
The project team strongly believes in remembering that "It's the data, stupid" - in other words, we must remember that the information is prime, not the technology. It is also essential to avoid being tied to particular standards, protocols or technologies - they change too often.
The amount of effort expended on maintaining the gateway is large, especially so given the breadth of the subject covered. This is giving rise to some concerns, and we are proposing to devolve some of the work to subject experts. Finally, there is the inevitable question of funding. Present funding is provided by the eLib programme; but this is finite. The source of funds for continued operation will have to be determined.
SOSIG can be accessed at URL: http://sosig.ac.uk or use telnet to sosig.ac.uk with login sosig
[1] See the presentation on ROADS by L. Dempsey in this volume.
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation and on the slides used.
This paper describes the genesis, objectives, and status of the ROADS - Resource Organisation and Discovery in Subject-based Services - project. Key issues raised by the project are examined, and possible future actions are sketched out.
The project emerged from a recognition of the need for an easy way of locating Internet resources. The idea hinges on the use of simple Internet resource descriptions, which will allow users easily to retrieve information from distributed autonomously managed resources. The idea was incorporated into a proposal for eLib funding; ISSC has funded the project for two years, to develop a suite of services which support resource discovery.
The University of Bristol is acting as project co-ordinator. This involves looking after the liaison, the documentation, and other project management tasks. Key personnel are Nicky Ferguson, Chris Osborne and Phil Hobbes.
UKOLN is providing metadata research, and work on user requirements, WWW issues and the interoperability of MARC and Z39.50. Key personnel are Rachel Heery and John Kirriemuir.
Technical development is taking place at Loughborough University, where the key contacts are Jon Knight and Martin Hamilton.
The first objective is to make a meaningful contribution to the development of a sharable, distributed systems platform for resource discovery services. In fact, it will achieve more, as some services are actually being developed.
The project is based on the avoidance of idiosyncratic components, and so a number of pragmatic choices have been made. The work will build on:
Several eLib projects are developing new subject-based services. The second objective is to support these projects with tools, advice and guidelines. Examples of the projects are:
The third objective is to work with subject-based services to involve information providers in the description of their own resources, in order to make them as useful and accessible as possible. Early on, a decision was taken that there would be a need for tools to allow information providers to document their resources. The "trusted information provider" and ALIWEB models are both under consideration; quality of documentation remains a concern, and is a constant issue.
There is a perception that the UK does not participate enough in international standards-setting work. Partly in response, the fourth objective of ROADS is to implement and test emerging standards, and to increase the level of UK participation in standards work. Accordingly, Loughborough University is working on the development of WHOIS++ and the uniform resource initiatives (UR*); UKOLN is working on metadata research, and is organising (jointly with OCLC) a metadata workshop which will follow on from the recent meeting in Dublin, Ohio, which resulted in the "Dublin Core". The workshop will be held in Warwick in April 1996.
The project started in August 1995.
The current version of ROADS software is referred to as v.0. It includes an IAFA-based search engine, and has given rise to much discussion on the ideal template contents.
The next step is ROADS v.1, which is expected in the late summer of 1996. It will be a standalone WHOIS++ implementation.
By the end of 1996, ROADS v.2 should be available. This will implement a fully distributed environment, with Centroids. It will allow entry at a single server, which refers the enquirer to the specialist server most likely to be useful.
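The referral mechanism can be pictured with a minimal sketch which assumes - purely for illustration - that a centroid is no more than the set of index terms a server holds; the server names are invented.

centroids = {
    "whois://sosig.example.ac.uk": {"sociology", "economics", "politics"},
    "whois://medgate.example.ac.uk": {"medicine", "nursing", "biosciences"},
}

def refer(term):
    # Return the specialist servers most likely to hold records for a term.
    return [server for server, terms in centroids.items() if term in terms]

print(refer("medicine"))    # -> ['whois://medgate.example.ac.uk']

The entry-point server never answers the query itself; it consults the centroids and passes the enquirer on to the server best placed to respond.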
The ROADS team recognises that a plurality of protocols will be used for the foreseeable future. Therefore, as part of the project, it is planned to develop a small demonstrator which will show interoperability between services based on Z39.50 and MARC.
The project team sincerely hopes that the experience gained so far, and experience to be gathered in the remainder of the ROADS project, will feed back into ongoing discussion and development in this area. One of the debating points which has already arisen is the need for a Simple Internet Resource Description.
A key question concerns the practicality of providing this prototype for a distributed directory service. The use of Centroids will need to be tested, as will questions associated with the inclusion of data from domains other than the limited selection currently incorporated.
Finally, there is the possibility of extending the ROADS scenario to take in other, non-Internet resources, such as business data.
See the ROADS project home page at URL: http://ukoln.bath.ac.uk/roads
Details of related eLib projects can be found at the eLib Information Centre, http://ukoln.bath.ac.uk/elib/
Further descriptions are in ARIADNE at http://ukoln.bath.ac.uk/ariadne/
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation.
In this paper, some of the preliminary findings of the CNI working party on networked information discovery and retrieval (NIDR) are discussed. They include consideration of cost and usage, of cataloguing and metadata. A number of significant gaps requiring further work are highlighted.
The thinking in this paper is shaped by research conducted for the CNI white paper on information discovery and retrieval [1]. This work was initiated when the CNI was approached by Avra Michelson of the MITRE Corporation, who crystallised the need for research in this area. In late 1994 CNI formed a team, including the author, Craig Summerhill from CNI and Cecilia Preston; the work was co-sponsored by the MITRE Corporation. This team has held meetings and discussions, and currently is drafting its paper.
It is gratifying to note that some of the team's early predictions are already beginning to become reality.
A major question at the outset is how to impose some sort of structure on the discovery and retrieval "problem". Predicting how it is likely to evolve is also a fruitful source of debate. Because of the scope and size of the issues, the team consciously avoided current detailed or technical points; it tried instead to determine where work will be needed strategically in the future, on concerns such as protocols, metadata taxonomies, etc.
The first generation of networked information discovery and retrieval (NIDR) tools were predicated on the concept that "content is free". For example, tools such as Web Crawlers would search and index network servers systematically, assuming that they would find information which is freely available. However, this will change (in fact, it has already begun to change). The content of the Internet of the future will not all be free; it will contain a complex mix of proprietary and non-proprietary information. As the proportion of information which is charged for increases, so the impact on NIDR tools will increase too.
We are already starting to see the evolution of complex and information-rich organisational information spaces. These can be described as Intranets - that is, parts of the Internet which are internal to an organisation and segregated from the bulk of the Internet by a firewall. As this trend continues, organisations will store more and more of their internal data in their Intranets while simultaneously relying on other Internet resources for external data. The impact will be that owners of Intranets will demand consistent tools for Intranet and Internet use. This will attract major software vendors to the NIDR marketplace. We are already seeing some software houses trying to squeeze revenue from the Intranet phenomenon.
Discovery is conceptually straightforward - it consists of finding an appropriate network location, then identifying the required information. Similarly, retrieval is simple - it only requires a collection of bits to be moved to the desired location. However, a complication arises because retrieval is not equivalent to use. For example, in today's software environment, it is easy enough to retrieve a resource only to find that the "helper application" needed to use the resource is not available. We anticipate that this gap between retrieval and use will grow and will become more and more significant. Accordingly, the concept of "fitness for use" will become more important.
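The gap between retrieval and use can be illustrated with a minimal sketch: before committing to a retrieval, a client checks whether a helper application is available for the resource's format. The helper table and format names are invented for illustration, not any particular browser's configuration.

helpers = {
    "text/html": "built-in",
    "application/pdf": "external viewer",
    # no helper registered for "application/x-proprietary-model"
}

def fitness_for_use(content_type):
    # Retrieval only needs the bits to arrive; use also needs a helper.
    return content_type in helpers

for ctype in ("application/pdf", "application/x-proprietary-model"):
    status = "usable" if fitness_for_use(ctype) else "retrievable but not usable here"
    print(ctype, "->", status)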
Today, information needs in the NIDR environment are typically formulated in terms of intellectual content description. In the future environment, other factors will become important: access time, service levels, price, and formats. Future tools will need to take these more mechanical factors into account.
From the outset, the team felt that the subject of discovery in a network context would be crucial. It remains very significant, but its importance is decreasing because of the growth of digital libraries. As a matter of definition, digital libraries are established by individuals or bodies who care enough about their contents to build, maintain and - in some cases - describe them. It follows that for many users, the task of network discovery will be reduced to finding the best digital library, then searching within it. This will make discovery much easier than it is at present.
Of course, network discovery technologies will be needed for use within the digital libraries, but the libraries will impose constraints and simplifications which will facilitate the process.
Digital libraries will not meet all users' needs. There will remain a core of users who will need to search the entire Internet routinely. In the main, these will be users who deal in "raw" information, for example in law enforcement, research, and corporate research. Generally they will be well-informed and sophisticated users of network tools.
Most searching approaches and activities are heavily influenced by the nature of online cataloguing. However, online catalogues have several weaknesses as information systems, including a lack of memory of previous searches and a lack of personalisation features. Unfortunately, these weaknesses are being carried forward into NIDR systems. Clearly we need to move beyond the current model, towards a more active one which involves the "push" distribution of information by systems which learn and become personalised through use.
The cataloguing tradition is non-evaluative (which is superficially attractive as it seems to suggest cataloguing by unskilled personnel, something which does not work well in practice); but a purely descriptive approach has very real limitations which prevent the catalogue from answering evaluative questions which users would like to be able to ask. For example, users would like to ask questions such as "what is the best reference?" or "which is more relevant?" Because of these limitations, NIDR suffers from a perception that "catalogues don't do this!"
Potentially, there are fortunes to be made by entrepreneurs who successfully depart from the conventional catalogue model to meet users' needs and desires more fully.
The architectural model used to conceptualise the retrieval component of NIDR today is still dominated by the view of the world imposed by the file transfer protocol (FTP), which is twenty years old. This model views retrieval as the process of moving a file - a collection of bits - from one place to another. It lacks the concept of content at an intellectual level. This concept recognises that content can be represented through a variety of digital representations - formats, resolutions, etc - which might be selected based on the capabilities of the client and server involved in the retrieval and the network linking the two, as well as the ultimate needs of the person who will use the intellectual content. Protocols more modern than FTP, such as Z39.50, allow much more negotiation about format as part of retrieval; indeed, retrieval becomes a process rather than a simple action.
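The contrast can be sketched as a negotiation between the client's stated preferences and the representations the server can offer; the object identifier and formats below are invented for illustration, and this is not the Z39.50 negotiation machinery itself.

# One intellectual object, several digital representations (invented).
available = {
    "report-42": ["image/tiff;600dpi", "image/tiff;100dpi", "text/plain"],
}

def negotiate(object_id, client_accepts):
    # Agree the first representation, in the client's order of preference,
    # that the server can actually supply.
    offered = available.get(object_id, [])
    for preferred in client_accepts:
        if preferred in offered:
            return preferred
    return None

# A client on a slow link with no image viewer prefers plain text:
print(negotiate("report-42", ["text/plain", "image/tiff;100dpi"]))

Under the FTP world-view there is nothing to negotiate: the file is the object. Here the intellectual content is the object, and the format chosen is an outcome of the retrieval process.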
The very simplistic view of retrieval today has also confused a number of issues related to metadata. There is a tendency to consider information related to a digital object (price, format etc) as static information that can be viewed as metadata associated with the object rather than as more transitory and situational parameters that are established in the context of a specific retrieval of the object.
Today, retrieval normally assumes acquisition of digital information, in the sense of physically copying the information to the user's machine. The legal framework for networked information use has largely shifted from purchase of copies to licenses to use copies. Retrieval will need to be extended to allow the negotiation and signing of licenses, not simply the transfer of funds to pay for an object. In the near future we are likely to see a much richer and more complex set of use options for information than simply making local copies; this will include:
The structures used for the collection of metadata will necessarily grow more complex. For the last few years, we have seen Web Crawlers trawling the network to derive descriptive metadata, following the model that was established by the archie system for indexing FTP archives. As alluded to earlier, this is not well suited to a network which contains valuable proprietary information; but, paradoxically, the owners of that information will have a vested interest in publicising the existence and nature of their information while still retaining control over it. We conclude that a new series of interface points will emerge where suppliers will interact with search engines. Quite possibly this will require new levels of abstraction. Users will not be satisfied solely with metadata provided by the information suppliers, which often is basically advertising; the more neutral information extracted from indexing and cataloguing processes will still be needed.
Metadata presents a complex set of ill-defined issues. The team spent considerable time looking at metadata issues; our conclusion was that to a great extent the view of information as metadata is highly contextual, depending on specific usage scenarios. Also, the linkage between metadata and the base object that metadata describes is elusive and becoming more blurred as technologies for automatic indexing, abstracting, translation, image recognition and similar processes improve.
With only limited research, we found no fewer than ten metadata schemes. Mapping metadata between these schemes is imperfect at best; this represents a gap which should be filled.
One current trend is to try to define "core sets" of metadata, on the premise that agreeing the bottom layer of a conceptual hierarchy of metadata sets will permit some form of interoperability between models. Unfortunately this premise overlooks the critical weakness that it is impossible to rebuild a full metadata set starting from a core set; this weakness diminishes the value of the concept.
The compromise approach of lowest common denominator core sets of metadata also gives rise to another problem. While it is possible, through approaches such as the "Dublin Core" work, to develop reasonable core sets and gain consensus around them, they are not particularly interpretable or interoperable at the machine level. Typically, the data elements are not highly structured and are going to be used with free text searching and presented to people for decision making.
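The weakness can be illustrated with a minimal sketch of a lossy mapping from a richer record down to a Dublin-Core-like core set; all field names are invented for illustration.

# A richer record with role-distinct fields (invented names).
full_record = {
    "personal-author": "Smith, J.",
    "corporate-author": "Dept. of Sociology, Univ. of Anytown",
    "series-editor": "Jones, K.",
}

def to_core(record):
    # Collapse three role-distinct fields into a single core "creator".
    return {"creator": "; ".join(record.values())}

core = to_core(full_record)
print(core)
# The core no longer records which name played which role: the mapping is
# many-to-one, so the full record cannot be rebuilt from the core set.

Because the core field is also just free text, a machine receiving it can do little more than present it to a person or feed it to a free-text search, which is exactly the limitation described above.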
There is currently much talk about "intelligent agents", about sophisticated semantic interoperability. The current view of NIDR presumes the active, continual involvement of a human to control the process and to make complex decisions involving relatively unstructured information. A world of intelligent agents would have to rely on highly structured data and data interchange standards. In this sense, current NIDR technology is largely disconnected from the development of intelligent agents and focused on solving a different problem.
[1] The current draft or final version of the white paper is available at http://www.cni.org/ and ftp://ftp.cni.org/
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation, slides used, and notes supplied by the speaker.
IGPL, the Interest Group on Pure and Applied Logic, is an international association of almost one thousand logicians from almost all the countries of the world. The Interest Group is an excellent testbed for developing an "electronic community", i.e. a community of researchers communicating in various ways over the Internet. As part of the services it provides for its members, IGPL publishes an electronic journal and is developing an electronic dictionary of logic - a distributed, refereed, dynamically updated and highly interactive database of information concerning scientific knowledge on logic. In contrast to current searching on the World Wide Web, where locating particular information can often be more a result of chance than choice, the logic database will facilitate easy and accurate access to the area of interest. A database of this scope cannot be set up by a single author, nor even by a small group, and consequently a scheme is being devised to encourage participation by the whole of the IGPL community. Such a scheme need not be confined to logic, but could be extended to a wider scientific domain, provided that support from the research community is forthcoming.
Logic is an interdisciplinary subject, with a long tradition, from Aristotle onwards, of providing foundations for examining problems in Philosophy, Mathematics, Language, Psychology and, latterly, in Computer Science and Artificial Intelligence. In the late 1970s it became clear to many members of the research community that there were many similarities in the way logic is used and applied by otherwise very diverse factions. It was apparent that a great deal of mutual benefit would result from bringing these disparate groups together and establishing means by which they could communicate and collaborate.
A series of handbooks was published in the 1980s covering, in a thematic and co-operative manner, many of the research areas of logic. Currently, over twelve volumes have been published and this number is already planned to rise to forty volumes, with over twenty thousand pages, by the year 2001. A consequence of the intensive collaborative work of many scientists on these handbooks was the formation of a community of like-minded individuals. It was a natural step, therefore, to found an electronic network and an electronic journal to serve this community - a community which, in the early 1990s became the Interest Group on Pure and Applied Logic, IGPL.
The IGPL is sponsored by FoLLI, the European Association for Logic, Language and Information and currently has a membership of almost one thousand researchers in various aspects of logic from about forty different countries. Acting mainly as a research and information clearing house, its activities include:
Until the advent of electronic communications, scientific knowledge was generally imparted via journals, textbooks, conference proceedings, dictionaries, encyclopaedias and handbooks. The problem with this approach was that it tended to be chaotic: if a particular piece of information was required, searching was very time-consuming, and the data was static, with no updates and no animations.
On the publishing side, there were long delays - two years was not uncommon, and this often fatally compromised the currency of the subject matter - and large backlogs of papers for publication. Economic factors also limited the size of papers to less than one hundred and fifty pages. Even when people independently put items on to an electronic network, the quality was extremely variable, in a discipline where consistently high standards were paramount.
By 1993, however, IGPL had developed the infrastructure which would enable it to eliminate these problems by publishing its own electronic journal.
IGPL's Journal is published several times per year. It was the first scientific journal on logic to be electronic in all departments (a hardcopy version is produced, but only in response to specific demand). Submission, refereeing, revising, typesetting, checking proofs and distribution are all accomplished via electronic mailing and publishing.
Each issue currently has about a thousand pages. The Journal invites papers dealing with all areas of pure and applied logic, including pure logical systems, proof theory, model theory, recursion theory, type theory, non-classical logics, non-monotonic logic, numerical and uncertainty reasoning, artificial intelligence, foundations of logic programming, computation, language and logic engineering.
Although there are some residual problems in the realm of costs and charging policy, copyright, rate and type of expansion and technical development, most have been dealt with (in regard to copyright, for example, no restriction is placed on authors who wish to publish the same work elsewhere). Two major objectives have been realised: the average speed of publication has been reduced from two years to two months and the Journal provides encouragement for logicians to communicate with each other - an aspect which has been furthered by another IGPL activity: the logic dictionary.
The dictionary is intended as a distributed dynamic Internet archive, residing on the World Wide Web and continuously maintained by the IGPL community. Its goals will include:
Communication will be via HTML forms; the submission of contributions, editing (both at a general and a topic level) and the refereeing process will be open, simple to access and easy to manipulate. Services will include search facilities, automatic notification by e-mail of new items and a referees database.
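A minimal sketch of how a form-based contribution might be handled follows; the field names and workflow are invented for illustration and are not the IGPL implementation.

import urllib.parse

def handle_submission(query_string):
    # Parse the form fields and queue the entry for open refereeing.
    form = urllib.parse.parse_qs(query_string)
    entry = {field: form.get(field, [""])[0]
             for field in ("term", "definition", "contributor")}
    # A real service would store the entry and notify referees by e-mail;
    # here we simply return the acknowledgement page.
    return "<p>Entry on <b>%s</b> queued for refereeing.</p>" % entry["term"]

print(handle_submission("term=modal+logic&definition=...&contributor=dg"))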
IGPL is tasked with establishing the dictionary and its procedures, means and tools for maintenance, operation and evolution, both in electronic and social terms. There are two primary objectives:
The IGPL has indeed developed an electronic community, effectively eliminating distance as a barrier to access to information, ideas, expertise and people with similar interests. By the year 2001 that community will directly link the student at home with the world of logic, with the dictionary of logic knowledge, with the twenty thousand pages of the logic handbooks and with a society with interests centred around logic. It is estimated that there are perhaps only ten thousand people world-wide directly working in the logic domain. But logic is a junction which impinges on and influences many more subject areas. In that respect the IGPL will reach out to and provide services for a vast audience for whom electronic communications will be the predominant means of knowledge access in the twenty-first century.
Further information is available from any of the following:
igpl-request@doc.ic.ac.uk
dg@doc.ic.ac.uk
ohlbach@mpi-sb.mpg.de
and on the World Wide Web at URLs:
http://www.mpi-sb.mpg.de/igpl/Bulletin
http://www.mpi-sb.mpg.de/igpl/Journal
This account was drafted for this report by The Marc Fresko Consultancy. It is based on notes taken during the presentation and slides used.
The introduction of the Mosaic browser ignited a fire of interest that has changed the face of the Internet and the way we deal with networked information. The scramble for commercial success on the Internet has brought many technology vendors into the Web trade, resulting in the proliferation of new tools and methods. As these advances define the role of commerce on the Internet, they will also change the way in which routine business is conducted on networked campuses.
On the campus, administrative computing has traditionally occupied a prominent central position with a strong operational focus. It has a mainframe orientation, supporting on-line terminals, and is characterised by superlative levels of security and reliability - the latter in excess of 99%.
An alternative opinion, however, is that it has become boring and outdated, is invariably weighed down by huge backlogs of work, takes too long to respond and has poor accessibility, since it has to have an intermediary between its operation and those who want results from it.
Unfavourable comparisons between its nature and performance and that of the dinosaurs, just after their 350 million year reign, have been offered.
The latest phenomenon is Client/Server architecture, which has swept in with the kind of speed expected of new technology and is busy replacing all administration systems, to a greater or lesser degree. Its characteristics are that it is extremely effective and highly efficient.
Effectiveness is epitomised by the fact that it provides information for decision makers immediately, is adept at supporting new users and can accommodate many thousands of users, rather than the couple of hundred maximum on-line to a mainframe. Its efficiency is derived from its ability to break down organisational layers and put power and accountability at local levels, to provide integrated solutions, with one view of the world and one place for base information, and to reduce redundancy by its distributed system approach.
In investigating Client/Server applicability the prospective user should consider:
The World Wide Web, however, has introduced a new dimension to information technology. It is very open, very public, ubiquitous, extremely easy to access, carries an unbelievable amount of information, eliminates distance and is, frankly, fun to use.
The applications which it can accommodate are practically infinite. Examples in academia, especially the university library environment, include:
A merger of information services and administration will see a combination of Client/Server and Web benefits. It will bring multi-platform access, with, for example, compatibility between Apple Macintoshes and Unix PCs. Common Graphical User Interfaces, using HTML, will help in this respect. Using the Web to deliver documents provides an efficient and low-cost option; indeed, the Web represents low-cost client software in this form of Client/Server hybrid, and its interactive nature tends to make the whole system self-documenting and self-training.
The mainframe operating environment still predominates, with investment locked in and legacy systems which change very little, if at all, over very long periods. In regard to performance, most mainframes have problems coping with more than a very few hundred on-line users and the prospect of thousands of users simultaneously logged on would not appear to be practical. The problems of security, with open access, are obvious and somewhat formidable.
Solutions are at hand, however. The mainframe can be adapted as a server in the Client/Server environment. Rather than attempt the daunting task of conversion, the legacy data can be filtered and interpreted when necessary. With respect to performance, it is expectations rather than actual response time which are important and these are currently being set by the Internet, where delays of five or six seconds are tolerated.
Security can be resolved, though perhaps not to a highly secure standard, by, for example, positioning a Secure Socket Layer (SSL) between the browser and a Common Gateway Interface (CGI) and applying encryption technology at the SSL level.
One of the first advantages of utilising the Web in a Client/Server environment is that the database technology does not have to be changed, yet users have real-time access to real, operational data. Interpretive servers also make users feel comfortable by providing familiarity. The retraining associated with conventional implementation of new Client/Server systems is not necessary on the Web, where ease of use and a philosophy of "just do it" predominate. Updating of user-entered information is so simple and direct that information such as student addresses tends to be more up-to-date and accurate, yet requires no administrative resource.
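The idea of an interpretive server can be pictured with a minimal sketch: the legacy record stays in its original fixed-width layout, and is filtered and interpreted into HTML only at the moment a Web user asks for it. The record layout and data are invented for illustration.

# Legacy records stay in their original fixed-width layout (invented here).
LEGACY = {
    "9501234": "SMITH     JANE      HISTORY   YR2",
}

def interpret(raw):
    # Filter and interpret a legacy record on the fly, field by field.
    return {"surname": raw[0:10].strip(),
            "forename": raw[10:20].strip(),
            "course": raw[20:30].strip(),
            "year": raw[30:].strip()}

def render(student_id):
    # Translate to HTML only when a Web user requests the record.
    rec = interpret(LEGACY[student_id])
    return "<h1>%(forename)s %(surname)s</h1><p>%(course)s, %(year)s</p>" % rec

print(render("9501234"))

Nothing in the underlying store changes; the interpretation layer is the only new component, which is why this approach avoids the daunting task of conversion.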
Whether the Web is the ultimate destination of information technology or not remains to be seen, but taking advantage of it now in administrative computing is at the very least an astute interim strategy. It may well be - as happened at Princeton in December 1995 - that the biggest problem will be in selecting the application with which to start the process. Once commenced, however, it can be expected that implementation will be fast, investment will be encouraged and thousands of new users can be connected, with virtually no training costs. Perhaps one of the most important effects of the Web is its ability to act as a smile generator. What may seem a trivial characteristic can prove very useful at times of great change - change which the Web will bring, and is already bringing, to campuses world-wide.
Further information is available from dwk@princeton.edu
The article "Internet Tools Access Administrative Data at the University of Delaware", originally published in CAUSE/EFFECT in late 1995, will provide additional insights. It is available at URL: http://cause-www.colorado.edu/cause-effect/cem95/cem953.htm
Delaware's administrative Web site is at http://www.mis.udel.edu/admin.html
13.15 - 15.00 Lunch
16.00 - 16.30 Tea
17.30 - 19.00 Free period
19.00 - 20.00 Reception
20.00 Dinner
10.30 - 11.00 Coffee
12.00 - 13.00 Lunch
15.15 Tea and Departure
British Library R&D Report 6250
© The British Library Board 1996
© Joint Information Systems Committee of the Higher Education Funding Bodies 1996
The opinions expressed in this report are those of the contributors and not necessarily those of the sponsoring organisations.
RDD/C/187
The primary publication medium for this report is via the Internet at URL
http://www.ukoln.ac.uk/services/papers/bl/rdr6250/
It may also be purchased as photocopies or microfiche from
the British Thesis Service, British Library Document Supply Centre,
Boston Spa, Wetherby, West Yorkshire, LS23 7BQ.
This report of the conference was prepared by The Marc Fresko Consultancy Telephone +44 181 645 0080 E-mail marc@easynet.co.uk
Converted to HTML by Isobel Stark of UKOLN, July 1996