C&ITC Star gazing meeting
New College, 9th April 2003


There were at least 50 people at this meeting, mostly Computing Officers from throughout the University, and several people from each of MIS, EUCS, Library and MALTS. In the first half of the meeting there were presentations from the University's main 'service providers', and in the second half, a more futuristic look at where 'society' might be carrying the University.

Scott Currie opened the proceedings by noting that the vision foreseen in the EUCS strategy document for 1999-2004 was still largely a vision now, when another strategy document is almost due. This vision imagined both student and lecturer finding all their resources online so that both could do their work more efficiently, though there was no thought then of eLearning, or of the University becoming any less campus-based. Our main challenges in the coming five years include whether and how to support a growing list of versions of Windows, MacOS and Unix, and how to deliver increased network bandwidth to everyone without allowing it to be dominated by eScience, medical and other demanding applications. The Science Research Infrastructure Fund is pouring money into new resources which are expected to come into mainstream use within three years, for example to promote 'grid' computing. eLearning is clearly a major priority, for which both technical and user support will have to be provided 24 hours a day if we are offering a world-wide service. These new resources will have to be provided without any degradation of the computing and network environment on which everyone will depend for their everyday work, so there will have to be Quality of Service agreements. Peering cautiously just five years ahead, Scott anticipated that the EdLAN upgrade would be complete, that Netware would have disappeared, that general network bandwidth would be approaching 100 gigabits/second, and that most homes would have broadband connections. The next generation of SuperJANET would be in place, there would be more legislation controlling us, and security would be an even bigger problem than today (apparently a new computer plugged into the Internet is probed for security loopholes within just 15 minutes, on average); users, he hoped, would need only a single username and password for all their services. 
A major challenge for the University as a whole is to implement an information structure to control who gets access to what.

Simon Marsden, Director of MIS, didn't try to see far into the future, but described the 'Enterprise Portal' which MIS is currently developing, and which he hopes many University staff will choose to have for their home page on the Web. This would be the staff equivalent of the Edinburgh Student Portal, giving a single point of access to all relevant services. There is a huge range of University resources already available for exploitation in this way, each with its own login and style of presentation. The vision is to have all these different resources accessible in a seamless, personalised Web environment. The issues to be dealt with in delivering this include breadth of coverage, presentation, accessibility, availability, content, and maintenance. The result, using technology to do as much of the work as possible behind the scenes, should increase efficiency, productivity and satisfaction for its users. The intention is that every user will be able to personalise their home page on the Portal, selecting a subset of all the information available to them and designing how it should appear on the screen. This depends on all the components being designed to work together as modules, and on using XML, XSLT and other standards. MIS is currently investigating two products, the open-source 'uPortal' (which arose in US higher education), and 'Oracle Portal', and hopes to begin implementing Edinburgh's Enterprise Portal in the Summer for trial release some time in the coming academic session. To begin with this is seen as a way of improving the efficiency with which staff can perform University business, but in the longer term it should become possible to link to channels provided by other agencies, such as University suppliers, a weather bureau, the BBC news-feed, or the user's bank.
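The channel model behind such a portal can be sketched in outline. In this toy Python example, every channel name and item of content is invented, and a real deployment would of course be built on one of the portal products under investigation rather than hand-rolled code; the point is only to show how modular XML channels can be assembled into one personalised page:

```python
# Minimal sketch of a portal's channel model (all names and content invented):
# each service exposes its content as an XML fragment, and the portal
# renders only the subset a user has selected, in the user's chosen order.
import xml.etree.ElementTree as ET

# Stand-ins for the feeds that real channels (library, payroll, news...) would supply.
CHANNELS = {
    "library": "<channel><title>Library</title><item>2 books due Friday</item></channel>",
    "news":    "<channel><title>News</title><item>Term dates announced</item></channel>",
    "payroll": "<channel><title>Payroll</title><item>Payslip available</item></channel>",
}

def render_page(user_channels):
    """Aggregate the user's chosen channels into a single HTML page."""
    parts = ["<html><body>"]
    for name in user_channels:               # the user's personal selection
        root = ET.fromstring(CHANNELS[name])
        parts.append(f"<h2>{root.findtext('title')}</h2>")
        for item in root.iter("item"):
            parts.append(f"<p>{item.text}</p>")
    parts.append("</body></html>")
    return "\n".join(parts)

# One user's personalised home page: library and news, but no payroll channel.
print(render_page(["library", "news"]))
```

Because each channel is an independent module behind a common XML interface, adding an external feed (a weather bureau, a news service) is just one more entry in the table, which is the attraction of the standards-based approach.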

John MacColl had a go at predicting what the University Library might be like in the year 2010, when it would be likely to have three main areas of responsibility: to manage the information architecture for the University, to provide a supportive study environment for students, and to look after printed materials. Most librarians are still guided by "the five laws of library science" published by Ranganathan in 1931: books are for use, every book has its reader, every reader has her/his book, save the reader's time, and, a library is a growing organism. These principles will still be true in 2010, though "books" are more likely to be "digital resources". (The Principal told last month's eLearning meeting that "saying 'digital library' these days is like saying 'horseless carriage' ".) Books are still likely to have a place in time-limited Reserve Collections for some time to come, but the expectation is that the Library will eventually be more like the National Library, with books kept behind the scenes and only available when specifically asked for, not for general browsing. Throughout the world, librarians are evolving to become information environment designers, metadata consultants, resource buyers, consultants on eLearning and research materials, and agents for resource preservation. This activity in Edinburgh will continue to be centred on the Main Library and the major branch libraries, with pervasive wireless networking already becoming a reality here. There are many initiatives and studies into how best to meet the new demands, XML and XSLT again being prominent. The Library is concerned not to lose sight of its traditional role while developing its new one, and is therefore keen to explore all these issues. It is concentrating in particular on how to deliver protected study space which matches the way modern students like to work (the SELLIC building was designed for this from the outset, and its cancellation is greatly regretted). 
The Library is likely to consider becoming a publisher of academic information, to ensure its widespread dissemination at low cost.

Nora Mogey of MALTS gave a spirited promotion of eLearning, urging the University to make the effort to embrace this in a major way. She emphasised that this should not be interpreted as "distance learning" in Edinburgh, but as a component, along with face-to-face teaching, of "blended learning" to give students the best of both worlds. There is an eLearning Web site at http://www.elearn.malts.ed.ac.uk/, on which comments are welcomed. The full results from the eLearning survey carried out last month are not yet available (they should be on http://www.ipc.isg.ed.ac.uk/e-learning_front.htm in due course), but some conclusions are already clear: most academics believe that the use of eLearning will increase, seeing it as improving efficiency and enhancing learning quality for students. Students themselves were worried about inequalities in computer ownership and access from home, and were adamant that this should not be a replacement for face-to-face teaching. The University is currently using two Virtual Learning Environments, WebCT (with over 15,000 users of over 100 courses at various levels of development), and IVLE (to be phased out this year because of accessibility problems). MALTS is further developing its services to integrate better with MIS (for the Edinburgh Student Portal), EUCS (for authentication), the Library (for information resources), and online assessment tools. It is piloting a plagiarism detection service (an interesting point was made that gnutella potentially makes plagiarism impossible to detect), exploiting streaming video more, and developing e-reserve and e-CPD. Having said that the University should be embracing eLearning, and that it must be driven by teachers, not technologists, Nora admitted that many teachers are too busy to do the large amount of work needed, and that there is a shortage of people who know how to design online courses. 
It was pointed out that teachers are anyway "on a hiding to nothing" as long as this is a research-based University which doesn't reward teaching effort; the Principal's strategy group is said to be considering this aspect.

Colin Higgs is a Computing Officer in Physics, and a member of the star-gazing group which organised this meeting. He described the work and aims of the group, and its remit to get away from the everyday 'firefighting' problems to think strategically about how the University should best be exploiting IT. The group meets throughout the year, and its members include Peter Burnhill, who chaired the second session very eloquently. The two aspects of their work are 'tasks' and 'technology', and to keep discussion manageable each meeting focuses on just one task and how one technology can assist it. For example, a recent meeting had focused on mobile computing and all its ramifications. Everyone is agreed that the number of people who use personal computing devices is steadily increasing, and that they need wireless networks to support new facilities and new ways of working. There are major security issues with wireless networks still to be sorted out, but the result, it was suggested, would be more people-focused computing; it might even one day be economically sensible for the University to give all its staff a PDA and do away with desktop computers. It was even suggested that the University could take a lead in this if it wanted, and provide a ubiquitous, always-connected computing environment for its staff, along the lines of what the eScience Centre in South College Street already takes for granted.

Paul Anderson is a Computing Officer in Informatics who has an academic background and therefore sees "both sides". He examined three developing areas in search of common aspects of the future IT environment: Web services, grid computing and peer-to-peer applications. In conventional Web browsing, HTML and HTTP are used to deliver information from a computer to a person, but with *Web Services* the aim is to get computers to collaborate to do more work and deliver a more refined result to the user. This is based on adherence to new interfacing standards, and on more sophisticated techniques involving XML and SOAP, for example. Thus the user's computer might automatically detect the need for an exchange rate or a postcode and go and look it up for itself before presenting the final result. (One place to find out more is http://www.xmethods.com/). *Grid computing* is surrounded by hype at the moment which is obscuring its significance. It is about the automatic harnessing and interlinking of resources over the Internet to achieve a result which might not otherwise be possible: for example a medical scanner might produce real-time images by farming out data to a lot of idle desktop computers, instead of having to have its own in-built computing power -- like renting a house for a fortnight's holiday rather than buying one. The key issue is "federation" of resources owned by different people, and getting everything to work together. The Globus toolkit developed at Argonne National Laboratory near Chicago was an early implementation, with a Web-based system based on the 'Open Grid Service Architecture' now emerging. The most famous example of *peer-to-peer* computing is probably Napster, which wasn't genuinely peer-to-peer because it kept an index on a single central server, which meant that it could be closed down easily. Gnutella and other more recent implementations do not have this 'single point of failure', and are very robust. 
This is what's needed for distributed file servers, for example, and it was suggested that the spare power and disk capacity in the University's computing labs is currently being wasted (while the University is probably about to spend half a million pounds on a Storage Area Network for its backups). The challenge to the University is not the technology but 'closed mindsets', and the development of the Quality of Service agreements which would have to underpin new services. Such systems would not, and must not, depend on key experts, but would be built from simpler modules each performing its specific task to a standard interface. (There were lots of references to all this, including the books "Peer-to-peer: harnessing the power of disruptive technologies" edited by Andy Oram (O'Reilly) and "The Grid: Blueprint for a new computing infrastructure" by Ian Foster et al (Morgan Kaufmann), and the Web sites http://www.globus.org/research/papers/anatomy.pdf and http://www.w3.org/2002/ws/.)
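The robustness of Gnutella-style networks is easy to illustrate: because every peer knows only its neighbours, a query is simply flooded outwards with a time-to-live, and there is no central index anywhere to shut down. A minimal Python sketch (the node names and file names here are invented, and real Gnutella also handles routing replies back, which this omits):

```python
# Sketch of Gnutella-style query flooding: no central index, so no single
# point of failure. Peers forward a query to their neighbours until the
# time-to-live (TTL) runs out; 'seen' stops the query looping forever.
class Peer:
    def __init__(self, name, files):
        self.name = name
        self.files = set(files)
        self.neighbours = []

    def query(self, filename, ttl=3, seen=None):
        """Return the names of peers holding 'filename', by flooding."""
        seen = seen if seen is not None else set()
        if self.name in seen or ttl < 0:
            return []
        seen.add(self.name)
        hits = [self.name] if filename in self.files else []
        for n in self.neighbours:
            hits += n.query(filename, ttl - 1, seen)
        return hits

# A small overlay network: a - b, and b connected to c and d.
a, b = Peer("a", []), Peer("b", [])
c, d = Peer("c", ["song.mp3"]), Peer("d", ["song.mp3"])
a.neighbours = [b]
b.neighbours = [a, c, d]
print(a.query("song.mp3"))   # finds the copies on both c and d
```

Removing any single peer here still leaves the others able to answer each other's queries, which is exactly the 'no single point of failure' property that made Gnutella hard to close down, and which a distributed file service built from lab machines would want.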

The final talk was by John Butler, on "pervasive computing". John is a Computing Officer, Course Organiser and lecturer in Informatics and sees at first hand how students are changing the way they work. Nearly all Informatics students have their own computers, regardless of parental income, and 80% of all University students are likely to own a computer before they leave. Quite a few implement their own home networks, some even using wireless, and most take it for granted that they will have ready access to the Internet. Not only is a networked computer increasingly pervasive, but so is its use -- for teaching, study, social life, employment, and access to the world in general: young people use C&IT instinctively in everything they do, in ways most of the older generation never will, assuming on-request access to people and resources. 'Old-format' lectures are still accepted and valued for the face-to-face contact, but otherwise the University must plan how best to exploit the new fact of life that students will want to use computers for everything else -- even for the face-to-face group work that is part of the Informatics degree. It was suggested that 12-22 year-olds now take C&IT as much for granted as electricity and water, and that the University will have to follow their lead and integrate these new attitudes into its teaching process. One particular problem which needs to be addressed is that students are not very discerning in their use of information from the Web, and need to learn the difference between primary and secondary sources. Some people are worried about the cost of providing laptop computers for all students, but it was pointed out that over the four years of a degree course this works out to about the same cost as text books, and might to some extent replace those text books -- and if we went so far as to insist that all students have a computer, we would abolish a current have/have-not situation. 
Informatics is currently trying to decide on such a policy, and the University as a whole is likely to take its lead from this decision. John finished with several quotes to reinforce his argument that the University must adapt to cater for its students, of which his own was "The move to mobile computing is not about computing but about education". The gist of John's talk can be found at http://www.dcs.ed.ac.uk/home/jhb/cs/Pervasive1.pdf.

The meeting finished with a very brief discussion of various issues. No one seemed to argue with the way we are going, and all agreed that the use of standards will enable everything to work together. The major challenge was seen to be with authentication and authorisation, the first of which can be solved technically (EUCS is working hard on this at the moment), but authorisation is much harder, and is likely to present a significant challenge for some time to come.
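The distinction the discussion ended on can be made concrete: authentication asks "who are you?" and can be answered once, centrally, with a single credential; authorisation asks "what may this person access?" and has to be answered per resource from the University's information structure. A minimal Python sketch, in which all usernames, passwords and permissions are invented:

```python
# Hypothetical sketch separating authentication ("who are you?") from
# authorisation ("what may you do?"). All names and rules are invented.
import hashlib

# One central credential per user -- the "single username and password" goal.
# Only a hash of the password is stored, never the password itself.
USERS = {"jbloggs": hashlib.sha256(b"s3cret").hexdigest()}

# Authorisation is a separate, per-resource question: this table encodes
# who may reach what, and it is the part that mirrors the whole University.
PERMISSIONS = {"jbloggs": {"library-catalogue", "email"}}

def authenticate(username, password):
    """True if the offered password matches the stored credential."""
    stored = USERS.get(username)
    return stored is not None and stored == hashlib.sha256(password.encode()).hexdigest()

def authorise(username, resource):
    """True if this (already authenticated) user may access the resource."""
    return resource in PERMISSIONS.get(username, set())

assert authenticate("jbloggs", "s3cret")          # identity established once
assert authorise("jbloggs", "email")              # allowed
assert not authorise("jbloggs", "payroll-admin")  # same identity, different answer
```

The authentication table stays small and purely technical; the permissions table has to track roles, courses and departments as they change, which is why authorisation is the harder, longer-lived problem.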

More information from the meeting will be available through the Star-gazing Group Web site, at http://www.ucs.ed.ac.uk/ucsinfo/cttees/citc/sgg/