\centerline{\bf Idle by the Thames}
\medskip
\noindent On April 5th--7th I had the pleasure of attending a workshop on \sgml\ organised by the AGOCG (Advisory Group on Computer Graphics). The reason for the workshop ran thus: now that \sgml\ has adopted CGM (Computer Graphics Metafile) as its method of incorporating graphics, is there a case for promoting \sgml\ as the medium of document interchange among the graphics community? Keen readers will notice a flaw here straightaway: CGM is in no way a part of \sgml; on the other hand, CALS (Computer-aided Acquisition and Logistic Support) does embrace both CGM and \sgml\ -- and group 4 fax, and IGES. True, it is possible to include CGM as a `Notation'. A `Notation' is a bit like a |\special|. You can include anything you like, but whether anything sensible happens to it is another matter altogether.

As I've suggested elsewhere, in the \TeX\ world we've been electronically interchanging documents for quite some time. And since I'm quite convinced that \TeX\ is a whole lot better at graphics than we admit, I thought that it might be helpful to contribute something about \TeX\ and\slash or graphics. (`Share experiences', as someone said at the workshop: something to do with changing a light bulb, I think.)

The workshop was attended by three main groups: those presenting papers, who notionally had some grasp of \sgml\ (this included me, so the grasp could be tenuous); a clutch of University Computer Centre people who mostly didn't know a document type from its declaration (probably there for the experience); and the Rutherford boys and girls. This last group requires a bit of explanation: the workshop was held at the delightful Cosener's Hall in Abingdon. Cosener's is owned\slash used by Rutherford Labs as a sort of residential\slash conference place. It is really delightful, with an excellent conference area, and electronic gadgetry so advanced no-one could work it (but no email access: I was quite twitchy after three days -- just like caffeine deprivation). Essentially it is an outpost of Rutherford.

The papers were, on the whole, good. Easily the best was Tim Niblett's from the Turing Institute in Glasgow (`\sgml\ and Programmed Documents'). I also found Angela Scheller's paper (`Experience with \sgml\ in the real World') very useful, since she was discussing the {\sc Daphne} project. {\sc Daphne} (Document Application Processing in a Heterogeneous Network Environment) uses \TeX\ as its formatter. Hers was the only paper which demonstrated that you could do anything with \sgml\ and CGM graphics (on her own very particular installation). That's not quite fair, since David Duce and Ruth Kidd of Rutherford did something similar (`The {\sc Daphne} Document Types and AGOCG'), but had to email it to Angela to be processed.

Obviously there was a very heavy diet of \sgml. Paul Ellison waded us through the murky depths of what was happening with standards, and all the associated standards like DSSSL, SPDL, etc, etc (`\sgml\ and related Information Standards'). It was a rich diet of acronyms. Martin Bryan exposed us to `Using the full power of \sgml', an awesome prospect.

Two things surprised me from the beginning. First was the general low level of understanding of what was in the various standards -- what they were for. Although all the papers were available at least a few days in advance, they had almost all been written for \sgml-literate audiences. First mistake.
The second surprise was the readiness with which standards were accepted as `a good thing'. There was a conviction that only ISO standards could be considered. This left \PS\ in an interesting position, although we were assured that SPDL (Standard Page Description Language) would look a lot like \PS, though without the programmability or the fonts (what's left?). People seemed happy to accept assurances that there would one day be suitable software to allow \sgml\ to be input painlessly across a wide range of platforms; that one day there would be suitable formatters linked to \sgml\ to allow you to see what had been input; and that one day there would be enough DTDs (Document Type Definitions: broadly analogous to \LaTeX\ styles) to suit a reasonable range of needs.

The workshop convenor, Anne Mumford, made it clear from the start that we were supposed to work for our living and make some recommendations about electronic document interchange. Most of the papers and discussion addressed this in a general way (although I would argue that only three of the dozen or so invited papers even {\it considered\/} graphics at all -- mine, and the two {\sc Daphne} papers). I had understood the `community' to be the academic\slash research community (hence the inclusion, for example, of Lou Burnard and TEI -- the Text Encoding Initiative). However, it never did become very clear exactly who the `community' were supposed to be. To focus our minds, Anne had distributed a list of `discussion topics':
\item{\rtr}what requirements does the community have for document exchange?
\item{\rtr}what are the target systems?
\item{\rtr}what document types need to be defined for the community?
\item{\rtr}what software can we use today?
\item{\rtr}what software do we need to have available if \sgml\ is to be a successful form of document interchange?
\item{\rtr}what support is needed in the community if we are to move to use \sgml?
\item{\rtr}writing funding proposals for the community for work to get \sgml\ in use.

\noindent It was fairly clear from these topics that at best we could bring in a verdict of `not proven' for \sgml's suitability, but there was no way we could reject it entirely. The whole jury was biased towards one viewpoint. To ensure `evenhandedness', an extra speaker (Ian Campbell-Smith of ICL) was shipped in at the last minute to tell us something about ODA (note that the `O' now implies `Open', not `Office' as it once did).

But eventually, when the work came to be done, these topics were laid aside and the real reason for the workshop appeared. The participants were broken into three groups. I should have sensed something was wrong when two of the groups were to be chaired by Rutherford people (the other by the convenor): I definitely realised that we had been railroaded when a revised sheet of topics appeared to be considered by each group. True, these revised topics could be considered to have arisen during the course of the workshop; however, they were not logical outgrowths of either the explicit purpose of the workshop or the main direction of discussion. So replace those discussion topics given above with:

\noindent ``AGOCG wishes to distribute:
\item{\rtr}viewgraphs
\item{\rtr}teaching material
\item{\rtr}manuals

\noindent to University sites in a form where they can incorporate it into local teaching material and documentation.
\item{\rtr}is \sgml\ the right protocol to use?
\item{\rtr}if so, are the {\sc Daphne} DTDs a good starting point?
\item{\rtr}if so, what changes are needed?
\item{\rtr}are there commercial offerings we should consider?
\item{\rtr}what utilities related to the \sgml\ system are needed?
\item{\rtr}should the UK academic community develop its own software?
\item{\rtr}how should what AGOCG does be influenced by other requirements?
\item{\rtr}if we target for {\tt troff} and \TeX, is that sufficient?''

\noindent If the conclusions had been pre-empted before, they were even more constrained now! As Bob Hopgood of Rutherford explained, the question {\it he\/} wanted answered could be summarised as `how can Rutherford best distribute the GKS manual electronically'. That was why {\it he\/} had brought us together, and why {\it he\/} had had the workshop organised (the myth of AGOCG as an entity with any existence outside Hopgood was eliminated). Plain and simple. Had that been an explicit question from the outset we could have given him an answer on day one and then got down to something more interesting. Given Hopgood's position at Rutherford, the weighting of the workshop and its work groups towards Rutherford, and the master stroke of a piece of paper with specific questions, it was all but impossible to restore the workshop to its ostensible purpose.

An interesting by-product of the workshop was the prepared papers. They illustrated neatly the problem with systems which emphasise structure and virtually ignore formatting: they may have been logically structured, but in terms of document design they were almost uniformly appalling, and difficult to read. The typewriter conventions of underlining, no indentation on the first line of a paragraph, but an extra `line' between paragraphs, and so on, were much in evidence. Even `standard' \LaTeX\ documents were starting to look well-designed by comparison.

That does not mean the workshop was a waste of time. From my own point of view it helped me put straight my view of \TeX\ and graphics; it allowed me to raise the perennial question of the character corruption which afflicts file transfers passing through the Rutherford Gateway (and to be told that it was because I was trying to pass `non-mail' characters through -- whatever that mumbo-jumbo means); and it revealed the immense ignorance that persists about \sgml, ODA, \TeX, and practically everything else we've been doing for the last five to ten years. Yes, it depressed me -- especially the ready adherence to tomorrow's software in preference to today's tested and available software. But the food was plentiful, the majority of the participants stimulating, and the surroundings extremely pleasant. If only it had been a bit more rigorous. Beware Rutherford Appleton Labs bearing gifts!
\rightline{\sl \mwc}

\bar\centerline{\bf Echoes}
\smallskip
\noindent The Displays Group of the BCS held a `State of the Art Seminar' on Systems Integration and Data Exchange on February 28th. This meeting examined some of the current range of `documentation' standards like \sgml, ODA, CGM, \PS\ and CALS. It was curious that this came so hot on the heels of the BCS Electronic Publishing Group's similar one-day meeting. Much could be gained by a little more cooperation here. Nevertheless, outside the subject matter there was surprisingly little overlap. The speakers were different, and the audience was not the same as the EP one (except for me, I think). One of the things I found interesting was the re-usability of presentations.
One of the major selling points for \sgml\ is that it allows the same information to be re-used in a number of forms. In fact, if you are not going to re-use the information, it becomes difficult to justify the added inconvenience of \sgml. Paul Ellison's talk was re-used at the AGOCG Workshop the following week; Alan Francis' presentation was a reworking of his Electronic Publishing talk at Durham last year (and at the AGOCG meeting I heard Lou Burnard re-present the paper he had given at the BCS ep meeting). It often helps to hear the same material again -- the army principle of `tell them what you're going to tell them; tell them it; and then tell them what it is they've just heard'. Even Heather Brown's talk reminded me of something I had once heard at another Displays Group meeting at Rutherford Labs. All a case of d\'ej\`a \'ecout\'e.

Anne Mumford (Integration and Exchange -- Restating the Case for Standards) introduced the day by making a case for standards. Here standards tend to be taken to mean `Standards, as agreed and ratified by national or international bodies'.

Paul Ellison led us through the many-threaded path of `\sgml\ and Related Information Standards', and even treated us to his version of how Adobe was led to the sacrificial altar to place \PS\ `in the public domain'. I'd heard a rather different version, so it will be interesting to find out just what went on. There must be room for a book on Adobe, just like the clutch of books which have come out recently on Apple. It is difficult to get excited about the many standards and what seems like their interminably slow path to acceptance. One of Paul's claims was that math coding through \sgml\ would mean that the resultant formulas could be input to algebraic manipulation systems. This seemed such a very useful attribute that I contacted Barbara Beeton to see if she knew of any cases where this was done. She was unable to uncover anything. On the other hand, several systems, Mathematica included, output formulas in \TeX\ form. This tendency to attribute to \sgml\ capabilities which only exist in theory does worry me. I would like to see something substantive.

Heather Brown's talk (Structured Multimedia Documents and the Office ({\it sic\/}!) Document Architecture) was a very good overview of ODA -- easily the best I've heard so far. It is quite intriguing how both \sgml\ and ODA appear to be embracing `multimedia'. ODA almost offers something which I find quite interesting (something that \TeX\slash \LaTeX\ offers too, though we have failed to point it out to the world as a positive feature). An ODA document can be revisable or not: depending on what you ship, the recipient can change it and reformat it, or merely print it out (in \TeX\ terms you can send the marked-up text, which is revisable, or the \dvi, which isn't). There are many documents which you do not wish to be changed. The ultimate non-revisable document must be the fax, but it's usually also unreadable. It seems to me that ODA is a little more realistic than \sgml\ in its world view. It at least acknowledges that there is a layout structure, as well as a logical one.

Alan Francis discussed the many differences between CGM and \PS\ (CGM versus \PS\ -- Horses for Courses). It should come as no real surprise that they are trying to address slightly different issues, and that in different circumstances one is more applicable than the other.
Since CGM isn't a programming language and doesn't address itself to font questions, it is a great deal simpler and more compact. Converting from CGM to \PS\ seems no great feat. CGM is undoubtedly an ideal way of encoding graphics for interchange. I note that there are some \dvi\ drivers which can accept included CGM, and that Wilcox's Metaplot may also convert CGM to \MF. As usual, we're there, but we aren't jumping up and down about it. It seemed to me less a case of horses for courses than of trying to compare bicycles and fish.

Jon Owen (Standards for Product Data Exchange and Conformance Testing) illustrated that exchanging CAD-CAM graphics and diagrams was certainly possible, but that you had to be very careful to establish just what it was you thought you were exchanging. The drawings didn't always contain all the information you expected: the old adage about what you see not being all there is, far less what you want, was brought home very clearly.

In a multi-authored paper from Rutherford Labs (Integration of Graphics and Communications in {\sc Argosi}: J Gollop, R Day, R Maybury \& D A Duce), Duce looked at {\sc Argosi}, a `European' project to transmit continuously updated information to selected points. {\sc Argosi} (Applications Related Graphics and OSI Standards Integration) is an Esprit project to advance the state of the art of communicating graphical information over international networks. The specific demonstrator application chosen is a prototype road freight scheduling system, calling on databases in nations represented in the project. The databases contain causes of delay which a freight scheduler needs to take into account when planning a Europe-wide journey.

Lastly, Norman Harris of Procad described the CALS project (CALS -- the US Initiative). With the backing of the DoD (and now a number of other US Government agencies), CALS has lent massive legitimacy to \sgml, having `adopted' it as one of its many `standards'. I have a sneaky suspicion that CALS has managed to rescue \sgml\ from the doldrums. Harris described some of the original motivation and history of the initiative, covering TIMS and ATOS. As I have commented elsewhere, one of the curious by-products of CALS may be to save trees, since one objective is to reduce the paper flows between and within the contractors and the military. CALS will have a very wide effect: besides the US Armed Forces, other Armed Forces may adopt it; non-US aerospace and `defence' contractors will have to comply to tender for US contracts; and some CALS specifications have been proposed as FIPS (Federal Information Processing Standards) for use throughout the US federal government. Some state governments (especially those like California and Washington with a large number of arms contractors) are likely to adopt it. CALS is not static, and the next stages include examination of other `interchange' formats and standards, like ODA/ODIF, SQL (Structured Query Language), and PDES (Product Data Exchange Specification).

The absence of the last speaker, Shiela Lewis, meant that there was extra time for discussion. The printed version of her paper (Testing, Testing, One, Two, Three) raised several interesting issues about conformance testing, chiefly in the context of the CALS Test Network. At a time when many vendors claim to adhere to various standards, it is valuable to see what mechanisms are being invoked to clarify what adherence means.
She notes that `Conformance testing does not in any way prove the usability of the product, simply its ability to process code in the manner prescribed by the Standard'.

It was an interesting programme which would have been a worthy BCS ep meeting -- it seems a great pity that there is not greater coordination between the groups. Calling it a `State of the Art Seminar' was very astute! It seemed a bit expensive to me, especially as there were no foreign speakers jetted in at enormous cost, and no lunch. BCS ep provides speakers of equivalent standard (sometimes the same ones!), and lunch. The only advantage I would say this meeting had was the provision of the papers to all the participants. I have mixed feelings about this: on the one hand it is a real boon to the audience (and to a reviewer). On the other hand, it puts enough extra work on the speakers that it might dissuade some of them from speaking at all.
\rightline{\sl Malcolm Clark}