22 February 2008
ICOM cordially invites all members of the global museum community to participate in IMD on 18 May 2008 with activities in their museums based on our theme “Museums as agents of social change and development”.
Sounds good so far. IMD is International Museum Day. They invite us all to participate in both the real and virtual worlds with activities consistent with their theme. And here is where they lose the plot completely:
The highlight of the suggested online activities on http://icom.museum is hosted by The Tech Museum of Innovation on 18 May in the replica of its Silicon Valley museum of technology on SECOND LIFE, the virtual 3-D platform created by Linden Lab. From real-world museums, museum professionals and the public will be able to communicate with colleagues, artists and “residents” in the virtual world. They will therefore be able to participate in the collective development of exhibits in The Tech in SECOND LIFE.
I am not at all sorry to say that this is simply one of the worst ideas I’ve heard of recently.
18 February 2008
As a former economist, I have long been interested in the new economic or commercial models that are emerging on the web. Many of us will be familiar with Chris Anderson’s “Long Tail” description of the niche marketing of online stores like Amazon. Well, here is another theory. It comes from another respected web pioneer: Kevin Kelly (who helped launch Wired Magazine and is still a board member of the Long Now Foundation).
Kelly has recently written a post called Better Than Free in which he offers us “eight generatives” that people will still be willing to pay for in the new web environment (“a copy machine”) where so many copies of everything are now available somewhere for free (eg. peer-to-peer networks, not that I’d have any idea what they are for!).
What he says is that even when some product or service is available for free, we are probably still willing to pay for it elsewhere when it is surrounded by or within an environment characterised by these qualities (which can’t be copied, cloned, faked, replicated, counterfeited, or reproduced). Here is a quick and dirty summary of Kelly’s article with my comments about how each one might apply in the museum world, in italics:
- Immediacy – Eventually you will be able to find a free version of just about everything somewhere, but it could take some time. People still pay a premium for air-delivered import magazines, so in much the same way we value getting a copy delivered to our inbox immediately, as soon as it is released, requested or created. Digital downloads by subscription where applicable and possible.
- Personalization — Generic versions may well be free, but getting something bespoke will always be something that some people want. Offering products like hand-crafted digital prints, very high resolution objects, or rare special copies/facsimile editions may be well received.
- Interpretation — "As the old joke goes: software, free. The manual, $10,000. But it’s no joke." I'm not sure how this applies to us because for many museums, particularly in Australia, although we have bucket loads of interpretation, the general expectation is that we provide it, as well as most quick reference services, for free online. Perhaps we need to look at paid subscriptions for well-written online publications?
- Authenticity — "You’ll pay for authenticity." Again, we can offer very authentic material and already have this advantage. More, better branding?
- Accessibility – "Ownership often sucks. You have to keep your things tidy, up-to-date, and in the case of digital material, backed up. Many people, me included, will be happy to have others tend our “possessions” by subscribing to them." I'm still not so sure about this generative: in some ways, we can get web services like del.icio.us and Google Reader to do such things for us, like looking after our bookmarks/favourites and blogs (respectively) for free. It also doesn't seem to be named that well.
- Embodiment — "At its core the digital copy is without a body. . . . The music is free; the bodily performance expensive." For museums I think this is about what else we can offer in terms of paid programs or experiences. Generally, major museums and galleries in Oz charge only for special/imported exhibitions or "blockbusters" (except us). Perhaps it means charging for curatorial talks on the talks circuit. I do a few of those in relation to our Lawrence exhibition and a few other things, and so far they are all free!
- Patronage — Audiences probably want or at least don't mind paying creators. "But they will only pay if it is very easy to do, a reasonable amount, and they feel certain the money will directly benefit the creators." This applies universally for creators, including us, but we probably need to pay more attention to making payment easier and reasonable.
- Findability — "No matter what its price, a work has no value unless it is seen; unfound masterpieces are worthless." I like this one a lot and it is probably one of the most relevant to our cultural world where such a large percentage of our collections is not on permanent display. It should not be too hard to highlight, find and get our products and services - not too many gates or complicated registration.
Some of these eight qualities apply to us more than others and a few could be better described or have a different descriptor applied to them (like accessibility?). To the list I’d probably add trust and, like one of his comments says, usability. Most cultural institutions are trusted and we can take advantage of that, but usability isn't really a major focus - think of most of our unfriendly catalogues and systems.
OK, so I might occasionally use a peer-to-peer network for some music and films that are either impossibly hard to get or far too expensive in Oz, but I also download a lot of material for a fee from iTunes, and I’d agree that the reasons I do this are pretty well mapped out above. If we are to come up with a decent model to make money, or even recover costs, for certain products and services on our museum websites, we need to look very carefully at this article.
12 February 2008
Shift happens: how the network effect, two-sided markets, and the wisdom of crowds are impacting libraries and scholarly communication
Abstract: This session will discuss the changing nature of library services and scholarly research in the networked world. Our affiliated group of not-for-profit digital initiatives - JSTOR, ARTstor, Portico, and Aluka - has a unique perspective on this shifting environment. There is ongoing discussion about the evolving Web (or Web 2.0): the migration of the Internet from a platform to a service; the network effect that encourages (and values) contributions and collaborations; and a shift in software and services to a participatory model. This evolution is changing libraries, publishing, and scholarship. In particular, it is fundamentally changing the paradigm of scholarly communication, and this presentation will examine this change.
I thought this was yet another good paper from the final day. Bruce knew his stuff and was an engaging and stimulating speaker. Fabulously, you can download the slides he used from this link: http://www.jstor.org/about/forum/ShiftHappens.pdf (1.1 MB PDF file)
Bruce opened up by quoting Neil Postman: "Technology doesn't add or subtract something. It changes everything." It does, however, have a short half-life. He then argued that Apple's introduction of the iPod (bringing us portable media) in 2001 was as important an advance as Tim Berners-Lee's World Wide Web in 1989.
Next he told us of John Seely Brown's "Four exponentials" (regarding the pace of change as it applies to working together):
- Moore's Law: the power of computing doubles every 18 months.
- The Law of Fibre: the capacity of the bandwidth of fibre doubles every 9 months.
- The Law of Storage: digital storage doubles for the same cost every 12 months.
- The Law of Community (Metcalfe's Law): the power of the network increases with the square of the number of people interacting with it (more people = more power).
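Just to make the compounding concrete, here is a quick sketch of my own (not from Bruce's slides) comparing the four exponentials over a three-year horizon, with Metcalfe's Law modelled as network value proportional to the square of the number of people:

```python
# A rough illustration (my own sketch, not from the talk) of John Seely
# Brown's "four exponentials", plus Metcalfe's Law.

def growth_factor(months: float, doubling_period_months: float) -> float:
    """How many times over a quantity grows if it doubles every
    `doubling_period_months` months."""
    return 2 ** (months / doubling_period_months)

horizon = 36  # three years

computing = growth_factor(horizon, 18)  # Moore's Law: doubles every 18 months
fibre = growth_factor(horizon, 9)       # Law of Fibre: doubles every 9 months
storage = growth_factor(horizon, 12)    # Law of Storage: doubles every 12 months

print(f"Over {horizon} months: computing x{computing:.0f}, "
      f"fibre x{fibre:.0f}, storage x{storage:.0f}")

# Metcalfe's Law: network "power" ~ n^2, so doubling the number of
# people interacting quadruples the power of the network.
def metcalfe_value(n_people: int) -> int:
    return n_people ** 2

print(metcalfe_value(2000) / metcalfe_value(1000))
```

The point of the arithmetic: over the same three years, fibre capacity grows four times as fast as raw computing, and any network that doubles its participants quadruples in value.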
The Transition from the Information Age to the Age of Participation
- Active, not passive
- Multilateral, not unilateral (If your federated search has a problem, who do you call? It could be with any one of 12 repositories.)
- Communities, not silos
- Contribution as well as consumption.
- The network effect. It increases in value the more people use it, eg. Open Source software (Linux, Open Office), Communication (email, SMS), Social Networking software (MySpace, Facebook), Scholarly Resources (arXiv.org, JSTOR). Its growth can be extraordinarily fast ("viral") and without control. Eventually the power of the network moves down.
- Two-sided markets. In Web 2.0 people can contribute as easily as they consume. These new networks have two groups that provide benefits to each other and enjoy intermediary platforms that balance their interests, eg. Flickr, eBay and OCLC's WorldCat.
- The "Wisdom of Crowds". In the right circumstances groups are often smarter than the best people in them. Their decisions work best when the crowd is: diverse, decentralized, has a mechanism for summarising the answer and acts independently, eg. Wikipedia (this applies particularly to our situation and our Encyclopedia!), Google's page ranking algorithm.
- Libraries (and we may read here "museums" or "cultural institutions" I think) have to manage access and preservation for system wide and local resources (wikis, blogs, repositories).
- We need to take advantage of economies of scale (OMG, I think I've said this meself before and nobody believed me!) so that we can reduce costs by sharing core services.
- We must reconfigure our services for the networked environment (which means they aren't actually configured that way now).
- We need to learn how to engage proactively with our constituents - see the OCLC report Sharing, Privacy and Trust in Our Networked World.
- Free-standing publishers will need to share the commodity layers of their activities, eg. HighWire Press. There is tremendous pressure to move from print to electronic publishing.
- Publishers that harness the network effects and which are able to build self-sustaining communities will grow faster than others, eg. arXiv.org
- (There are also implications for the academic world, but I'm not going into those here. Sorry, call me selfish and self-centred.)
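The "wisdom of crowds" point above is easy to demonstrate numerically. A toy simulation of my own (the numbers are made up, not Bruce's): many diverse, independent, noisy guesses at an unknown quantity, aggregated by a simple mean, beat the typical individual guess:

```python
# A toy "wisdom of crowds" demonstration (my own numbers): diverse,
# decentralised, independent guesses plus a mechanism for summarising
# the answer (here, the mean).
import random

random.seed(42)  # deterministic for the example

true_value = 100.0
# 1,000 independent guessers, each individually quite noisy
guesses = [random.gauss(true_value, 20.0) for _ in range(1000)]

crowd_estimate = sum(guesses) / len(guesses)   # the aggregation mechanism
crowd_error = abs(crowd_estimate - true_value)

individual_errors = sorted(abs(g - true_value) for g in guesses)
typical_individual_error = individual_errors[len(individual_errors) // 2]  # median

print(f"crowd error: {crowd_error:.2f}, "
      f"typical individual error: {typical_individual_error:.2f}")
# The crowd's error comes out far smaller than the typical individual's.
```

Break any of the conditions (make the guessers copy each other, say) and the advantage disappears, which is exactly the diversity/independence caveat in the bullet above.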
Libraries (and other cultural institutions) are small systems in a much larger one and we must learn to move with it! Bruce then briefly touched on the "Gorbachev Syndrome" in which change agents are swept aside by the tide of change they initiated because of their continued commitment to legacy systems/products/services. And I'm afraid that in my view, most libraries and archives that I know about are still well anchored in their old ways and processes. The world has changed around us and we need to move on. Some of our much loved standards and ways need to be left behind, not continually patched up and brought with us.
11 February 2008
Some people don't have the time to plough through all this text, so I've been asked to put together some of the main messages that I picked up at VALA. I reserve the right to adjust these as I complete posting all of my notes. So, to date, I think the key messages that come to mind are as follows:
- The importance of pro-active engagement and interaction with the relatively new social networks that have emerged on Web 2.0. That is where the future will evolve from (very rapidly) and we need to be aware and involved to keep up. It is relatively risk- and cost-free. We should start making more use of engagement/interactive tools like wikis to develop and grow our own community (utilising the wisdom of crowds).
- Systems (on the web) need to be engaging and intuitive (not "must do") or they'll be avoided by users.
- We need to look at the ways we catalogue and who we are cataloguing for (ourselves). If you think of the Collection-Cataloguing continuum, it is something like Acquisition > Arrangement > Store > Keep – we are good at all of that, but we are not so good when it comes to the "providing public access via the web" (assisting our users to find and get) part. Our catalogues need to be fully optimised for search engines like Google, Yahoo and MSN. If we are maintaining systems that do not get the data out to the web because of some facility or capability that only we need, we should consider using a mash-up to account for those needs, and simpler, more open web standards for the essential needs of public users. The use of persistent identifiers (particularly "canonical" URLs, which in many ways are brief catalogue entries themselves) was a plenary topic that attracted much interest.
- Web services are increasingly being used and can provide almost anything. Slideshare is a good online example of an online repository in the Web 2.0 world. Much of the useful cataloguing (or tagging) is done by the extended community or network. Library Thing for Libraries was also mentioned quite a bit as being used by many libraries around our size (mainly to augment their conventional cataloguing systems).
- We must stay in touch with developments in Copyright and we should consider making use of the Exceptions in the Act to ensure they stay with us (this will be relevant to the WW1 non-OR digitisation project that we are just beginning - many orphaned and unpublished works).
- Regarding digital repositories – much of the experience so far has been in universities storing research material. From them we learn that a one-size-fits-all approach (from the outset) should not be applied too rigorously. Needs and requirements will evolve as the repositories are used, and certain assets may have vastly different needs to others (eg. storage, metadata, etc.). Otherwise, the ECM itself may become a victim of "Gorbachev Syndrome" – swept away by a tide of change that it started and could not keep up with, through its own inflexibility and resistance to changing with the times and new technological trends.
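On the cataloguing point above, a minimal sketch of what "canonical URLs plus simple open web standards" might look like in practice: stable, human-readable URLs for catalogue records, exposed to search engines through a standard XML sitemap. The domain, record IDs and titles below are all hypothetical:

```python
# Hypothetical sketch: canonical catalogue-record URLs plus a minimal
# Sitemaps-protocol XML file so search engines can discover them.
# The domain and records are made up for illustration.
import re

BASE = "https://museum.example.org"  # hypothetical institution domain

def slugify(title: str) -> str:
    """Turn a record title into a short, speakable URL segment."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def canonical_url(record_id: str, title: str) -> str:
    # The ID keeps the URL persistent; the slug makes it meaningful,
    # so the URL itself reads like a brief catalogue entry.
    return f"{BASE}/collection/{record_id}/{slugify(title)}"

def sitemap(urls: list[str]) -> str:
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")

records = [("rec001", "Lawrence of Arabia's Robes"),
           ("rec002", "WW1 Diary, Private Smith")]
urls = [canonical_url(rid, title) for rid, title in records]
print(sitemap(urls))
```

The internal-only facilities can stay behind the scenes; this thin, open layer is what Google, Yahoo and MSN actually need.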
10 February 2008
Here is a link to LukeW's good summary of Stuart Weibel's plenary that closed VALA 2008. I liked what Stuart had to say and my notes probably differ a tad from Luke's notes, but maybe I just misunderstood what Stuart had to say?
Stuart comes from OCLC and presented really well, leaving most of us with a new perspective on what could have been a dull and dry topic. I found his message easy to follow and quite inspiring.
Branding & Web 2.0
OCLC have released a report on the perceptions of libraries and information resources.
Libraries and search engines are trusted about the same.
People care about the quantity and quality of information.
They do not view paid information or free information differently.
Branding can be achieved by building on trust and by making things look free. Scale is represented by libraries and their presence everywhere. They have global scope and reach (via networks?). BUT people need more awareness. We must be part of the new online environments that dominate our lives.
Social networking software. Only the technical manifestation is new (we've always networked). Motivate people to tag and participate. Wired said 40% of those they interviewed contributed in one way or another. (Higher than Yahoo's figures.)
Re social consumer environments. Facebook, etc. are not just for games. But they are probably not the right models. There are lessons to learn though (OCLC has just put an application into Facebook). They are flawed – closed gardens, rudimentary features, but offer an experience as well as a service.
Libraries must compete and compare favourably with popular models (in Seattle books with coffee is the law!). But can we compete and should we? What can we do to fit in and how to distinguish between the trends and the trendy?
Catalogues – how can they change, morph and grow? Networking. Collections linked to people, organisations, concepts, context, metadata, etc. (So I've just started an account on Library Thing to learn how this works.)
Do we need a web or scaffolding – do we want more – coherence, durability, etc.
Mentioned FRBR – work, expression, manifestation and item. But with other dimensions.
He said that for discovery on the web, a book review is more useful than a MARC record (I agree - how many people truly understand MARC records?). They are a social bibliography. He also cited: lists, services, commentary, etc.
Infusing bibliographic ideas into the web & vice versa?
First class objects need: persistent identifiers; access to all; stand-alone status (identification & clear IP); and to be curated (not left lying around untended – bit-rot!). Allow users to enter and traverse the catalog from any point!
Establishing a canonical identity on the web is very important.
See WorldCat identities. This should have been done ages ago. Tag cloud into popular Ids. Has stuff by/on author, works, links, encourages serendipitous discovery, associated subjects. All from bibliographic data.
Identities on the web
What characteristics are best in identifiers? There are no hard/fast rules – just suggestions. He thinks URLs need to reflect something about what they are. Make them meaningful.
Design criteria for identifiers
- persistence (function of organisational commitment);
- universal accessibility & global scoping (work everywhere, open to all, WorldCat provides architecture for library assets mapping global surrogate to the local);
- optimised for search engines and canonical (raises search engine ranking);
- branding via URIs (mini-billboards);
- usability by people and machines – speakable, short, predictable (hackable).
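Stuart's design criteria lend themselves to a simple sanity check. Here is a hypothetical sketch of my own (the rules and thresholds are my reading of his list, not an OCLC specification), including the "hackable" property that trimming path segments should land somewhere sensible:

```python
# Hypothetical checker for the identifier design criteria above: short,
# canonical, meaningful, and predictable/"hackable". Thresholds are my
# own interpretation, not an OCLC rule.
from urllib.parse import urlparse

def check_identifier(url: str) -> dict[str, bool]:
    parts = urlparse(url)
    segments = [s for s in parts.path.split("/") if s]
    return {
        "accessible": parts.scheme in ("http", "https"),  # works everywhere
        "short": len(url) <= 60,              # speakable; a mini-billboard
        "canonical": not parts.query,         # no query strings; SEO-friendly
        "meaningful": any(not s.isdigit() for s in segments),  # says what it is
    }

def hackable_parents(url: str) -> list[str]:
    """A predictable URL can be 'hacked': chopping off segments should
    yield sensible parent resources."""
    parts = urlparse(url)
    segments = [s for s in parts.path.split("/") if s]
    base = f"{parts.scheme}://{parts.netloc}"
    return [base + "/" + "/".join(segments[:i])
            for i in range(len(segments) - 1, 0, -1)]

print(check_identifier("https://example.org/identities/austen-jane"))
print(hackable_parents("https://example.org/people/austen-jane/works"))
```

Persistence, of course, is the one criterion no code can check – as Stuart says, it is a function of organisational commitment.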
WorldCat identifiers – are they good enough?
Unique, free, citable, resolvable, linked, canonical (no, not really). Some functional duplicates (more records pointing to same thing).
A glimir of the future?
A global manifestation identifier. Global, business neutral, canonical, provides URL equity, fits with FRBR model.
There are other identifier schemes. So, OCLC is cautiously exploring this territory.
IDs are the key; they are needed for the mission, to compete, to brand, to bring bibliographic values to the web, and to provide services and access to the digital tribe. Books are not done yet.
See particularly his blog posts on related subjects (persistent identifiers).
Andy Powell asked whether he was talking about the semantic web. Stu supports it, but is skeptical about the technologies involved. He spoke of middleware as the plywood of the internet. It needs to become the plywood of our arena. He said the abstract model has fundamental importance on the web.
Luke Wroblewski, Senior Principal of Product Ideation & Design, Yahoo! Inc. and Principal of LukeW Interface Designs, USA
I really enjoyed this plenary and got a lot out of it. He may have initially been a bit biased towards Yahoo and anti-Google, but eventually he got over that and had excellent points to make. Again these notes are pretty rough. It was a great start to the final day and really got us in the mood and opened our minds.
It isn't that people don't read – they'll read when they find something they want to read. So, there are three key considerations in designing websites: Presentation – voice, where interaction happens; Interaction - responding to users; & Organisation - structure.
Luke strives for usefulness, usability and desirability (why do I care, why should I use it?)
He referred to videoegg – as a good example.
What is different about today's web? What are the recent shifts?
A. From locomotion to services
We interact through locomotion, conversation and manipulation. It is all now much easier and more widely evident on the web. He showed us the huge use of Yahoo Answers (5 mins to an answer in the US; 0 to 90 million users in 1.5 years) and the spread and use of word processors online. It is all part of the web transition – from digital representations of physical entities, to digital manipulation of physical goods (e-commerce), and now to purely digital services (no physical presence, just display services: aggregation, Flickr, MySpace, blogging tools, video editing online, entertainment sites). Many of these services have popped up in the last few years with hosting for around US$6 per month. Instantly you can reach a huge audience; there are few barriers to entry; and you can use free open source platforms. BUT, you have about 1.6 secs per month per person to convince them your site is interesting, unique, worthwhile. Therefore you need to know your core – define, focus & build outwards. Some examples of those who do:
- eBay – a global economic democracy. The 30th largest economy in the world. US$1,800 sold per second. 520,000 stores hosted worldwide. (Sorry, no link to eBay, I was the one who shouted out "greed" when Luke asked what makes it work so well.) Luke said it was held together by democratic feedback. Any search or browse is defined by democratic comments & feedback; they are not sorted. Interaction on a level playing field is the core underpinning element that makes it tick.
- Also digg with 3 million people online filtering the news. Interaction on all news items. All can express opinions. One click interaction made the site go - the core element.
- flickr – builds outwards, can be shared, embedded, favourited, etc., but core is a picture of a subject.
- "Meaningful shouting" through: differentiation (distinct and appropriate), attraction, and embodying the brand. He looked at three well known wiki tools – how are they distinguished meaningfully? Is it coherent story-wise?
- “Back of pack” – supporting the story & outlining benefits/features. The new Yahoo home page calls up elements and gives you benefits in three bullets on a pop-up window. It helps people use the product. Yahoo also provided a 2 min video on how to use Yahoo Bookmarks.
- The unpacking experience – eg. the Apple experience, culminating with the personal photo taken from your new laptop. Google Video just gives you a form. It is an interrogation room, and that happens with 90% of web services. Jumpcut first asks you to make a movie, so the first thing seen is the movie and the online editor; only after that are you asked for an email address. He also showed pbwiki – it does the email messaging thing and then three more steps, re passwords, access, terms of service, more services, etc., and then you need to get through a barrage of marketing material, all before you can even start! Compare that to geni (creates a family tree) – it starts with a name to make a tree. You jump straight in right out of the gate. It is what people want to do! They got 5 million profiles in 5 months.
It is all about design considerations. Ajax interface design: pages become more dynamic, updated and rich (although these same pages become more difficult for those who are print handicapped). Examples of inline micro-actions (within previously flat web pages): ratings, online indication, fade, transition, status, etc. You manage three spaces by design: the invitation (to vote/drag); the transition (when voting/dragging); and the feedback (when the vote/drag is done). All are then encapsulated in repeatable design patterns (which catalogue different states and interfaces – eg. Yahoo's pattern library) – for example, search assistance layers that appear after some user hesitation and deliver more meaningful results and more conversational information retrieval.
C. From sites to content experiences
Sites used to be structured in hierarchies – closed and rather negative. The emerging networks, like clouds, etc., are not as accurate, but maybe they are accurate enough? Content is not treated as part of a structure; it is treated as part of an experience. See his article on primary and secondary actions in web forms. The new experiences are delivered in the form of: content creation tools (eg. search, blogs like ajaxian, wikis), aggregators (like digg, del.icio.us), display surfaces (eg. Facebook, MySpace), and entertainment services (eg. YouTube).
Design considerations again. When readers come to his page it delivers primary content, related content and a bit of context. How much of a site is dedicated to overhead? People really just want what they came to find, and maybe a bit of related stuff and context. Why hamper the user experience with what they don't want? Do you need to get everything onto the whole page? (The long tail phenomenon again – for most websites, only a few pages get most of the attention/use.)
Eg. personalised search like ROLLYO and a party planner/arranger like RENKOO
If expectations are met . . . people will look around and may take up relevant invitations.
Distributed or re-mixed content. These experiences are not just about distribution, but bringing content in context (and core design still matters!). Eg. blog posts with rich metadata in it, say from Yahoo Shortcuts (interestingly, I found it easier to find this page online via a Google search than a Yahoo search). Things can be added in with a single click on your blog. Context can be king.
D. From webmaster creation to everyone creates
Community on the web comes from features like tags, ratings, reviews, trackbacks, blogs, wikis, RSS subscriptions, etc. Unity comes through shared interests and goals – something gets them all there around a common focus. Social behaviours – reputation & identity; communications; sequences, etc. Implications: GOOD – filtering, content creation, increased engagement (Yahoo Answers), invested consumers and collaborative innovation. BAD – blurred focus, spam and poor quality, power laws (abuse), factions and tribes, privacy and exposure issues.
- Enable identity for communities: welcome, anonymity can be a death sentence, profiles.
- Provide for creators, synthesizers and consumers, not just one or two of those groups. Who creates? Only 1% create, 10% are synthesizers and 100% are consumers – they read, engage and benefit from content. Value comes from the reaction of people with each other.
- It all depends on the tools. How do you get people to contribute and how do you encourage quality? MySpace is kinda ugly, but it is possible to create good stuff there: it is just hard to create good profiles on MySpace and easy to create ugly ones.
- Quality content is based on the level of effort needed to put something in. Burying the submit button encourages fewer but better posts! So some barriers to entry can help QA. The best check on bad behaviour is identity (Facebook founder, Mark Z.). This has implications for comments on our blogs.
Luke's responses to questions:
Re government websites – any trends? Enormous opportunity for us to build on these principles. Many different ways to engage. Eg. initial (adverse) reactions from digital media, film and music and their attitude now.
Getting around crappy content – don't just go for the quick fix, quick dollar, think about the long term (what Liz usually urges us to do!). There are ways to make it good.
Redeveloping sites from scratch – all about knowing your core. Start at that. Not with the amalgam that you have that hides the great original idea. What is really working and what is the core essence?
06 February 2008
Professor Geist holds the Canada Research Chair in Internet and E-commerce Law at the University of Ottawa.
He blogs on the net and IP. See also the Fair Copyright for Canada Facebook Group.
[No paper on CD or the web just yet, so for now you'll need to rely on my rough notes. It was a good keynote!]
History. Initially there was a push for governments to be hands-off, but they never were. They always wanted to have a hand in regulating the net, at least on a domestic level and in some international agreements. Canadians used the Australian anti-spam legislation as their model. (I didn't know we had such a law – it certainly isn't effective.) There always was a role for public policy and government.
Internet 2008.
- The blogosphere (>100 million blogs, but incl. some subject matter experts in some fields).
- The power of social networks – Facebook/MySpace (eg. his Canada Fair Copyright group had many thousands of members within a week or so, indicating opposition to new legislation – now 40,000 members).
- The role of podcasts (he usually uses his iPhone to record talks and then podcasts the MP3 file – people don't want to read, but will listen or even re-listen; wider audience).
- PostSecret – posts secrets to the world in an artistic/creative way (within a veil of anonymity) – many sad, but now >250k, and many shown in galleries and museums in the US.
- Online video sharing (eg. YouTube; Star Wreck – a free download, incl. English subtitles – people could download it and still they bought the DVD and it was licensed for broadcast; Elephants Dream – an open movie made using free tools).
- Public broadcasting (like our ABC).
- Flickr & other photo sharing sites like Facebook (many using CC licenses).
- The rise of Creative Commons (some rights reserved).
- Free online publications (that can also be purchased, eg. In the Public Interest).
- Collaborative internet growth (eg. Wikipedia.org – it doesn't have a monopoly on making mistakes, but has a remarkable panel of expertise; EOL – the Encyclopedia of Life).
- The rise of citizen journalism (eg. OhmyNews – written by everyone).
- Project Gutenberg (public domain digitised books, like SPW).
- LibriVox (audio versions of books).
- Educational content online, like MIT OpenCourseWare (a decade-long time-frame finished within four years, or four years early, assisted by advances in technology and the rise of support for such initiatives).
- The move towards Open Access, eg. PLoS, the Public Library of Science – some people have gone on to win a Nobel Prize from that online journal (see also Open Medicine).
- The Internet Archive (Wayback Machine) – public domain material hosted for free, forever.
- Digitisation projects like Google Books (whole works and snippets), bringing books to life; Canada has a National Digitisation Strategy, including photos freely available.
- Open Source software – browsers, web services, etc.
BoingBoing (originally a zine) has larger readership than any newspaper in Canada. Lots of concern re copyright in Canada, all starting from that Facebook group. Many others being used to voice opposition to public policies.
Internet 2018. (Not a prediction, but public policies and potential.)
- Connectivity. Broadband for all (or you cannot participate - so there is a public sector policy role there); muni wifi; net neutrality (a notion of a two tiered internet – fast for the rich and slower for the rest – treating all content in an equal fashion); spam; spyware.
- Enhancing participation. Intermediary liability issues (eg. things posted on your blog by others & not taken down fast enough); domain names; privacy (still struggling with issues, eg. Facebook issues & their privacy settings – many don't use, 70-80%); trust; transparency.
- Copyright. Anti-circumvention legislation; fair use (Canada has no exceptions like time shifting, three-step test, loss of gift if not used, etc.); term extension (70 years+?); orphaned works; WIPO (an agenda that has moved much further than anyone expected).
- Content. Open Access; digitisation; Crown copyright (could affect us – people asking for permission to copy the Copyright Act!; military denying screenshots of equipment if it thought it critical!); public broadcasting.
I liked this good round-up of issues relevant to public policy and the net. It wasn't too heavy and highlighted many possibly obscure and not obvious connections.
Responses to questions:
The interests of public institutions are sometimes undermined by meeting the lowest common denominator, and by strategies limited to baby steps so conservative they are aimed mostly at not offending! Too many stakeholders in the room making decisions. They need to take a stronger line with what they are doing and with how they use restrictions. Too much time is spent telling people what they cannot do, not what they can do.
Social networks may be skewed towards the younger demographics.
Expectations of privacy on things like Facebook – people do not expect that everyone can see their content. See Danah Boyd's work on the reaction of youth to parents looking at their profiles. Some governments ban the use of Facebook by employees, yet all of their potential hires are on Facebook. The content posted on Facebook needs a re-think.
He was against the introduction of filtering systems, as they are highly problematic and of unknown scope and duration.
Exceptions under s 200AB of the Copyright Amendment Act 2006
The new Flexible Exceptions for Cultural Institutions are intended to be open-ended and more flexible than previous exceptions. They enable us to make use of copyright material where that use doesn't infringe the copyright holder's interests.
“Fair Use” - under US law. Libraries and Archives exceptions exist but don't cover museums and galleries. (Our law has “Fair Dealing”.) Emily said flexibility and uncertainty move along together.
Must be by/on behalf of a library or archive.
For the purpose of maintaining or operating the library or archive (onsite or online!); but
Can't be for commercial advantage (cost recovery is OK) – any kind of profit is NOT OK; and
A work is not infringed by a use where the use amounts to a special case; doesn't conflict with a normal exploitation of the work/subject matter; and doesn't unreasonably prejudice the legitimate interests of the owner. (Some of these terms have the same meaning as in TRIPS Art. 13 – international law.)
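The conditions above are cumulative: fail any one of them and the exception falls away. For those of us working out risk management workflows, the structure of the test can be sketched as a simple checklist. (This is my own illustration only – the predicate names are invented, and the real legal tests require judgement, not booleans.)

```python
# Hypothetical sketch of the cumulative s 200AB conditions as a checklist.
# The predicate names are my own inventions for illustration; each one
# stands in for a legal question that needs real analysis.

def use_permitted_under_s200AB(
    by_or_for_library_or_archive: bool,
    for_maintaining_or_operating: bool,
    for_commercial_advantage: bool,
    is_special_case: bool,
    conflicts_with_normal_exploitation: bool,
    unreasonably_prejudices_owner: bool,
) -> bool:
    """Every condition must be satisfied; fail any one and the defence fails."""
    return (
        by_or_for_library_or_archive
        and for_maintaining_or_operating
        and not for_commercial_advantage        # cost recovery OK, profit is not
        and is_special_case                     # TRIPS Art. 13 language
        and not conflicts_with_normal_exploitation
        and not unreasonably_prejudices_owner
    )

# e.g. a non-profit online digitisation program run by the archive itself:
print(use_permitted_under_s200AB(True, True, False, True, False, False))  # True
```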
Whole of sector behaviour could have an effect on how the law is applied.
There are no fixed answers, so a risk management strategy is wise for us.
Emily says there is some exciting potential for the sector to act collectively and in unison.
[Her papers are available online at IPRIA.]
Responses to questions:
Institutions are not used to uncertainty and flexibility, so are inherently conservative, whereas the US environment has seen more activity.
External legal advice tends to be more risk averse because they don't really understand our circumstances or the law as it applies to us. What is the worst that will happen? Taking online works down usually works. If someone does go off and use litigation, the remedy will likely be rather limited against a cultural digitisation program. S 200AB gives a potentially powerful defence.
People seem to be looking to us to test the law on behalf of others.
ALCC is currently drafting guidelines for the use of s 200AB under a use-it-or-lose-it principle. Laura Symes is supposed to be talking to us now – Sophie??? We should let them know what we are about to do.
(I have to say that Emily delivered an amazing paper, because I know she'll probably be reading this blog soon. Hopefully she'll correct the errors I no doubt made above!)
Schubert Foo, Professor, Division of Information Studies, Wee Kim Wee School of Communication & Information, Nanyang Technological University
Amidst changing lifestyles, Internet-savvy users, and the availability of large amounts of information on the Web, libraries face the central challenge of remaining relevant and continuing to develop innovative products and services to serve the needs of users. This paper proposes a number of roles that libraries can play in such a future: as info-concierges; as a network of inter-connected info-concierges; and as a network of true collaborations. Using a case study of the National Library Singapore (NLS), a number of initiatives currently undertaken by the library to move in such a direction are outlined. These include the introduction of an SMS reference service, enhanced accessibility of NLS’s content through deliberate availability in users’ search and social networking spaces, and the development and use of a platform that uses the principles of the “wiki” to support the formation and use of a collaborative reference network for reference enquiries.
This paper is directly relevant to our references services in the Research Centre. I'll make the full paper available to all RC staff (and anyone else interested) when I get back. My notes here are really pretty rough, but give you a taste of the content.
He was impressed by what he had seen and heard at the conference and encouraged us to spread our wings and not always expect the US to be the leaders in information management and technology.
He said that his students in Singapore have been very keen on using Wikipedia as a reference source for nearly everything! Many public reference enquiries are received from parents on behalf of their children for study purposes. They use an acceptable complaint-to-compliment ratio of 1:24. Collaboration in teams is big in Singapore.
Libraries: bricks v. clicks; collect-organise-store-access; mediator (source-user); authoritative, trusted content. Most library users don't come to the library; they are net users. They use search engines and sometimes believe that that is the only place to find information. Instant gratification is expected and content must be downloadable; they are not interested in browsing. They also like exceptional user experiences (memorable, unique, exceptional were the words he used), but are not interested in help files. Only 1% of users go to an OPAC – in Singapore they prefer Google (55%), Yahoo (21%) and then MSN (9.6%). (I expect that the % in favour of Google is higher in Australia.)
So what do we do as librarians? (Well, not me as I'm not a librarian.) We delve into their net world. Singapore has high saturation of broadband, PCs and mobile phone use. SMS is very highly used too. SMS plans are much cheaper there. (I think cost has a lot to do with the usage rates of new IT services and the web.) Users want to connect anywhere, anytime from any device.
The Info-Concierge – information as a commodity. Each object is self-contained, but must be connected and work across multiple platforms. Let users continue on the pathway of discovery – “what is next?”. Connectivity through links, different platforms and by pushing/suggesting further exploration (like Amazon does). They use push for simple alerts, but it could be applied at a much finer level of granularity. The concern is spamming users or intruding on their private spaces. They want to deliver information to users, not bother them. Basic encouragement ideas: taxonomies (browsing); formats; relational search; events; share & join in.
Promotion of discovery is very important. A good example is bookjetty.com, where formal MARC records are augmented by user tags/comments, like LibraryThing for Libraries. Bookjetty recognises where you are and presents you with options relevant to you. It draws users back to the library.
Libraries need to harvest, select & authenticate, meta-tag, create/maintain/grow taxonomies (they must be download-able!), and organise information content.
He encouraged connections (facilitated by libraries using Web 2.0): content-content; content-people; and people-people. Using tools like wikis, blogs & social spaces.
They also curate exhibitions relevant to topical and current events and to highlight their collections. All are eventually moved online in a virtual sense.
Reference services are provided within reach of everyone – wherever, whenever. SMS service as well as email and mobile phone. SMS request constrained to 160 characters. Answers are usually sent back as a URL within a template. If they provide a book's catalogue entry, they have a comment field for value-adding “Librarian's notes”. It finishes with a feedback sheet that attempts to get to know the user better by three key questions – like usefulness, finished, other comments (I could not read them on the screen). He said users are overwhelmingly positive in feedback.
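The mechanics of that templated SMS reply – a short answer plus a URL, squeezed into the 160-character limit – can be sketched roughly as below. To be clear, the template wording and function are my own guesses at how such a service might work, not NLS's actual system.

```python
# A minimal sketch of fitting a templated reference answer into one SMS.
# The template text and function names are hypothetical, not NLS's system.

SMS_LIMIT = 160

def build_sms_reply(short_answer: str, url: str) -> str:
    """Compose 'answer + URL' and truncate the answer so the whole
    message (with the URL intact) stays within the 160-character limit."""
    template = "{answer} More: {url}"
    overhead = len(template.format(answer="", url=url))
    room = SMS_LIMIT - overhead
    if len(short_answer) > room:
        short_answer = short_answer[: room - 3] + "..."
    return template.format(answer=short_answer, url=url)

msg = build_sms_reply(
    "The book you asked about is held at the Central Lending Library, level 8.",
    "http://example.org/cat/12345",  # hypothetical catalogue URL
)
print(len(msg) <= SMS_LIMIT)  # True
```

A real service would also have to worry about character encodings (GSM vs Unicode counts differ) and multi-part messages, which this sketch ignores.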
They have an infopedia like our Encyclopedia, that was once buried in their website and now can be accessed by Google, Yahoo and MSN (he calls them the "GYM space"). They've used a microsite to expose it to Google. Content can be found more easily on Google Maps, Google Earth, a Yahoo Search, etc. Content usage has increased exponentially (160 fold). I wasn't sure how they managed to push the content to these search engines – may be in the paper.
Collaborative research responses. Making wider use of librarians and even other users. There are multiple entry points and a network of specialists (a community) that powers and moderates it. Based on a wiki. The community is alerted by SMS/email and can come in and assist in assembling the full answer. Multi-user collaboration.
He urges support for librarians to initiate new projects, but says that we should not push too hard and allow for some experiments to fail. We also need to get to know users better and encourage information literacy. The basics are still needed.
He referred to the recent JISC/British Library report The Google generation is a myth – they are not that information (web) literate. One-stop shops don't work. He said that we need to be much more e-consumer friendly and connect via Facebook, etc. [The significance of this for research libraries is threefold: (1) they need to make their sites more highly visible in cyberspace by opening them up to search engines; (2) they should abandon any hope of being a one-stop shop; and (3) they should accept that much content will seldom or never be used, other than perhaps as a place from which to bounce.]
He was asked about Second Life and said that he thought it was something that a lot of users went into once and came out, then never returned. He suspects users are not serious about using it. (Apparently it has a huge “churn rate”.)
05 February 2008
It was patchy, but you get that. Don't expect as many posts from Day #2 as I am going out Wednesday night and in any case the program content is not as strong, as far as we are concerned.
I had dinner after the close with Carmel and Paulie and they basically agreed with me about the papers I heard.
One thing I did notice on my travels on foot around town is that Melbourne city is now full of public works of art. They are everywhere and wonderful!
I'll try and take some more pics tomorrow.
Peter Johan Lor, Secretary General, International Federation of Library Associations and Institutions and Extraordinary Professor, University of Pretoria, South Africa
The World-Wide Web is evolving into an interactive, multipolar social space, referred to as Web 2.0. Libraries are urged to follow suit, as implied by the term Library 2.0. A brief exploration of the evolving environment precedes a discussion of a number of trends which affect the library profession and which require attention at the international level. They include the commodification and dematerialisation of information, globalisation, and disintermediation. Their effects are diverse and affect freedom of information, equity of access, and inclusion in the information society – three themes that are addressed as part of IFLA’s international advocacy programme.
Well, as a keynote, this one didn't measure up. It was initially entertaining but pedestrian in content. Sorry, but I expected more from a keynote than this and struggled to stay awake. I wasn't expecting him to put a Reliant Robin into orbit, just some stimulating new ideas or a different perspective. Maybe there is more in his written paper.
He talked about the early days of library automation and how it has accelerated. Now we are in a "disruptive innovation phase".
He mentioned the importance of enjoying the journey (search) & the Long Tail of obscure/esoteric trivia encountered along the way. Problems encountered with amount of information to digest and limited bandwidth. Web: interactive; collaborative; and private/personal.
He links the Web & Library 2.0 to Info Economy. He also noted the ephemeral nature of new “dematerialised” documents. How do we preserve them? The place of publication is now irrelevant/obsolete.
Virtual content must reside somewhere and he briefly mentioned “trusted repositories” in this context. Maybe they make it easier to pull the plug and censor material.
See his discussion of the relationship between commoditisation/IP and the Long Tail (in his paper – I have the pdf file for those interested).
He also talked about issues like orphaned works (i.e. covering just about anything produced after 1860-70!) which confront mass digitisation projects. IFLA/IPA have a joint statement on this: a library should conduct a diligent search, and if it can't find the owner it may go ahead and (mass) digitise; if the owner later appears, there should not be a sanction against the library. [This may be relevant to the RC WW1 digitisation Project.]
Data management and the curation continuum: how the Monash experience is informing repository relationships
Cathrine Harboe-Ree, University Librarian, Monash University
Repositories are evolving in response to a growing understanding of institutional and research community data and object management needs. This paper (building on work already published in DLib, September, 2007) explores how one institution has responded to the need to provide management solutions that accommodate different object types, uses and users. It introduces three key concepts. The first is the curation continuum, which identifies a number of characteristics of data objects and the repositories that contain them. The second divides the overall repository environment based on these characteristics into three domains (research, collaboration and public), each with associated repository/ data store environments. The third is the curation boundary, which separates each of the three domain types.
This one was really aimed at the academic environment, but I hoped that there would be something for us to learn here too. I think it was beneficial. I have the pdf file for those more interested.
The core of Andrew's presentation was his slide on the Data Curation continua identified so far:
Less Metadata <-> More Metadata
More Items <-> Fewer Items
Larger Objects <-> Smaller Objects (different requirements)
Objects continually updated <-> Objects static
Researcher Manages <-> Organisation Manages
Less Preservation <-> More Preservation
(eg. no commitment to those presentations being around forever on Slideshare)
Closed Access <-> Open Access
Less Exposure <-> More Exposure
(His paper also stresses the importance of going well beyond access into exposure and discoverability using a range of techniques such as OAI-PMH, RSS feeds, search engine spidering and federated search.)
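Of the exposure techniques he lists, OAI-PMH is the one most directly aimed at getting repository metadata out to harvesters. A rough sketch of what that side looks like is below – the endpoint URL and the canned response fragment are made up for illustration, but the verb, parameters and oai_dc payload follow the OAI-PMH 2.0 protocol.

```python
# A hedged sketch of OAI-PMH exposure: the repository URL and the sample
# response below are fabricated, but the request shape follows OAI-PMH 2.0.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

def list_records_url(base_url: str, metadata_prefix: str = "oai_dc") -> str:
    """Build a ListRecords harvesting request for a repository endpoint."""
    return base_url + "?" + urlencode(
        {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    )

print(list_records_url("http://repository.example.edu/oai"))
# http://repository.example.edu/oai?verb=ListRecords&metadataPrefix=oai_dc

# Pulling titles out of a (fabricated) fragment of the XML a harvester gets back:
SAMPLE = """<metadata>
  <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
             xmlns:dc="http://purl.org/dc/elements/1.1/">
    <dc:title>Data management and the curation continuum</dc:title>
  </oai_dc:dc>
</metadata>"""

DC_TITLE = "{http://purl.org/dc/elements/1.1/}title"
titles = [t.text for t in ET.fromstring(SAMPLE).iter(DC_TITLE)]
print(titles)  # ['Data management and the curation continuum']
```

The point is how little is needed on the repository side: answer a handful of simple HTTP requests like this and any harvester or search engine gateway can pick the content up.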
From the paper's conclusion (as this is of some relevance to our DAMS/Mediabin):
When the ARROW philosophy was initially conceived it was thought that a single institutional repository that was integrated, interoperable and flexible would provide the best platform to support teaching and research at Monash. The single common repository approach, while initially attractive, has been found to suffer from a range of implementation challenges and fails to provide adequate management solutions for data generated by researchers over the entire research lifecycle. These challenges can be best addressed when considered in terms of the data curation continua. The ARROW, DART and ARCHER projects have seen the evolution of this concept into a more nuanced understanding of the different types of content that would need to be managed, and the different audiences and uses for that content. This has led to an acceptance that multiple, albeit interoperable, repositories would be better. One set of decisions about what to do for each of the continua leads to three different sorts of repository domains. Monash University is calling these research (DART), collaboration (ARCHER) and public repositories (ARROW) respectively. A further management concept, the curation boundary, provides a mechanism for determining when and how objects can be moved between the domains.
We may not always need to use something like these three stages and currently we just use two – private (museum staff only) and public (web). It could be, however, that we will soon require a medium stage where we are more open to collaborative ventures and cooperative creation of our digital collections. Perhaps that also comes in via tagging of public assets?
As knowledge about institutional and data management repositories evolves over the next few years, these ideas will be further explored by Monash and many other institutions. I guess what he was saying is: why apply one set of rules to everything if not everything is to be kept/preserved forever? Perhaps as objects cross the curation boundaries, different rules can be applied by workflow – a good example would be the generation and attachment of metadata.
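That workflow idea can be made concrete with a toy sketch. The domain names follow the paper (research, collaboration, public), but the rule values and function are entirely my own speculation about how such a boundary-crossing step might be coded.

```python
# A speculative sketch (rule values and names are mine, not Monash's) of
# per-domain rules applied as an object crosses a curation boundary.

DOMAIN_RULES = {
    "research":      {"metadata": "minimal", "preservation": False, "access": "closed"},
    "collaboration": {"metadata": "richer",  "preservation": False, "access": "group"},
    "public":        {"metadata": "full",    "preservation": True,  "access": "open"},
}

def cross_boundary(obj: dict, target_domain: str) -> dict:
    """Return a copy of the object with the target domain's rules applied,
    leaving the original record untouched."""
    moved = dict(obj, domain=target_domain)
    moved.update(DOMAIN_RULES[target_domain])
    return moved

item = {"id": "dataset-42", "domain": "research"}
published = cross_boundary(item, "public")
print(published["access"], published["preservation"])  # open True
```

The attraction of expressing it this way is exactly the point above: preservation and rich metadata become rules attached to a domain, triggered at the boundary, rather than a single policy imposed on everything.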
Andrew is now setting up the ANDS.
During questions, both Cathrine and Andrew talked about developing the new people needed to take such projects forward. It is a growth area for librarians, but there are not many around who have the full complement of both IT and IM skills. Data management and other curatorial skills will be required for us (for the ECM system).
Queensland Stories: community, collections and digital technology at the State Library of Queensland
In the vast state of Queensland, the ability to create and share stories about people, places, landscapes and ecology using digital technology and the World Wide Web bridges distance and difference. The sharing of stories is the key concept around which the Queensland Stories Program has been built. The Program strongly aligns with the State Library’s new strategic priorities and positions it as a leading institution in the field of digital technology. It promotes the State Library as a centre of creativity and learning, and provides opportunities for community engagement projects as well as the creation of user generated content for the collection.
Deb showed some images of the new SLQ building and I really, really need to see it! SLQ has many other partners that they work with. Over 1 million people visit it each year now! (I think they might be interested in a small touring version of our T.E. Lawrence collection.)
Digital story-telling started at Berkeley; other examples include the VHP at the LoC. ACMI has also had a program since 2004. It builds multi-media collections (i.e. development!). The Queensland Stories Project does just that. This kind of model is something we could follow for recent conflicts.
The Queensland Stories website, launched in June 2005, is a rich storehouse of Queensland digital stories. Digital stories can be viewed over a dialup and broadband connection on both Apple Macintosh and PC platforms. The digital stories are available in both Real and Windows Media Player formats to enhance the viewing experience.
IP issues: layers & elements; advice given to creators accordingly (esp. re music and film – use whatever is free). Looking at Creative Commons licenses as well.
Uses local champions and trainers similar to the VHP in the US. Staff and volunteers at community libraries. Stories held on local networks?
Yen Wong, Learning & Technology Librarian, State Library of Victoria
Christopher Alexander is a controversial architect who believes that those who build physical spaces must address the question of human feeling. When combined with some ideas on metamedia literacy, there are implications in his work for the building of social online spaces such as Inside-a-Dog, a new site being developed by the State Library of Victoria for young readers.
Many were keen to go to this presentation, but in the end it didn't deliver. Too much theory and light on content.
Relationships between parts and whole. Emergent things.
Neglect of the importance of the human feeling and the beauty of shape.
There wasn't much in this for us. Eventually I fell asleep.
Caught up with Shirley Foster from Altarama. They are very grateful for all the promotional support the AWM has given them and the company is now going strongly, especially in the ACT. There was keen interest in RefTracker at the conference, including the NLA.
Discussed RFID with two providers, including 3M. They'll ask me to go and see them for a demo in Sydney in April around the time I am giving a digitisation master-class.
Over lunch I caught up with Paulie and Carmel from NLA and they agreed that Andy Powell's plenary address was the most stimulating thing from the morning session. The rest of the Library 2.0 stuff that I missed was useless, at least for us.
(a written paper was provided on CD - let me know if you want the pdf file)
She related 2.0 developments and initiatives to other physical improvements such as wifi and RFID – aiming at 100% self-serve.
Referred to Helene Blowers' Learning 2.0 program. Self-paced and online to encourage play and exploration. 12 week program – 23 things. Exercises set out on blogs. A good model for us and ECM learning exercises.
YPRL are also using a wiki internally as a training resource for staff.
LibraryThing for Libraries is also used through their catalogue - tagging.
Andy Powell, from Eduserv Foundation (an educational charity based in Bath)
This presentation had a lot of useful views for us as we approach the new philosophy of our whole ECM environment.
So far (and at this conference) digital repositories seem mostly on the academic agenda in university libraries. Not much seems to be recognised regarding the challenges facing cultural institutions, so maybe we can learn something from the academic experience?
Powell has some cynical views re repositories.
He started by giving us his background with Dublin Core – he has been involved from early days, esp re web based metadata generation tool development. (The Abstract Model was discussed zzzzzzzzz.)
Then he moved on to the JISC Info Environment – again aimed largely at the tertiary and further education environment. Most UK digital repositories are based in this environment. They use all the expected standards, but the environment has missed or ignored the Web – web architecture in particular is missing from most digital library spaces.
Eduserv has worked with the UK Science Museum; they started by modelling the infrastructure behind the repository (similar to what we have done with ECM). They built (i.e. developed) a repository for them that they called a Web Content Management (WCM) system.
Serving stuff on the web is still missing from the JISC Road Maps. He was very positive about open access and what it will do to scholarly publishing – it isn't an "if", but a "when" – it will happen!
Repositories (to date) are mostly focussed on deposit, not servicing the web. WCMs are essential if they are to be used. Concepts such as search engine optimization are essential (not just having federated search within the environment).
He briefly touched on the "REST" architectural style – it focusses on resources and global identifiers.
Is the focus just to be on the institutional repositories or a global environment?
Web 2.0 means: the new "prosumer", remote applications, social-ness & exposed APIs; plus diffusion (eg. blogs, etc.) and "concentration" (via Lorcan Dempsey's recent writings at OCLC) – hosting services that are global in scale, eg. Flickr, Technorati and maybe del.icio.us? He mentioned those using Amazon S3 hosting. Social networks are critical, particularly for research purposes, and this needs global services.
Future – what would a Web 2.0 repository look like? He said it would look like Slideshare. Not many in the audience seemed to be using it. You can share, embed, tag, favourite, etc. Other attributes he suggested: a high-quality web-based document viewer; tagging; visible to Google; RSS; Amazon S3 (infrastructural services); the ability to form social groups; global in scale. BUT – it doesn't support preservation or complex workflows and doesn't expose rich metadata – so what? Are those really needed (in this system)? How can these needs be met without destroying everything? I think that is the problem we have made for ourselves with some of our CMS – wanting them to be all things to all users and forgetting their most critical tasks.
One way forward may well be using SWAP – the scholarly works application profile, used to describe eprints (scholarly works/publications held in repositories). It builds on FRBR – functional requirements for bibliographic records. (See also this Dempsey blog post.) Simple Dublin Core (the metadata standard/protocol) doesn't do this – SWAP is all about relationships, not just a flat structural description. But it may all be too complex in the end. Can we just encourage users to tag, instead of deep formal cataloguing that nobody ever sees and few outside the institution ever use? This has effectively distorted much of our work (in the AWM), which seems lost to any users, even on our own site. Rich cataloguing records are locked away inside some of our CMS and NEVER exposed on the web. THIS IS FUNDAMENTALLY WRONG!
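The flat-versus-relational point is easier to see side by side. The toy records below are my own illustration (not SWAP itself, and the field values are invented): a simple Dublin Core description is one flat bag of properties, while a FRBR-style description links separate entities together.

```python
# A toy contrast (my own illustration, not SWAP) between a flat simple
# Dublin Core record and a FRBR-style set of related entities.

flat_dc = {  # simple DC: one flat bag of properties, no relationships
    "title": "Repositories and the Web",
    "creator": "A. Powell",
    "format": "application/pdf",
}

frbr_style = {  # FRBR layers the description over entity relationships
    "work":          {"title": "Repositories and the Web", "creator": "A. Powell"},
    "expression":    {"realises": "work", "language": "en"},
    "manifestation": {"embodies": "expression", "format": "application/pdf"},
}

# The flat record cannot say that the PDF is one manifestation of the same
# underlying work as, say, an XHTML copy; the relational one can, by adding
# a second manifestation that embodies the same expression.
print("realises" in frbr_style["expression"])  # True
```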
We need to learn more from Web 2.0 about what works on the web as most repositories are not working (particularly re sharing on the web). They are not marrying up with the social networks that researchers actually use. Slideshare gets by with almost no formal metadata, just by using tags and links between resources.
Open access is important – making content available on the web. Policy needs to reflect this. We still focus on deposit, not putting resources on the web.
Andy's closing message was for us to think about resource orientation, not services – digital libraries ignore this at their peril.
Questions & further discussions:
Warwick Cathro (NLA) suggested that institutional repositories can account for needs such as preservation and richer identification (which is what we are aiming at to some extent), but Powell said that that world has not yet been built, at least not in the UK. Building the social layer is beyond that model.
Physicists seem to be sharing their knowledge and research in arXiv.org and they maintain their affiliations.
Why do we still publish as PDF? It is like still working on paper. Why not XHTML, with embedded links and micro-formats? PDF runs counter to the mainstream web. Citation is another huge area that is still at odds with how things work on the web, yet even WordPress allows for this with an app.
Stuart Weibel from OCLC suggested that researchers are too lazy and won't do what is needed regarding deposit and identification of resources. But Powell remains optimistic: the low cost of sharing a presentation on Slideshare brings massive benefits in terms of knowledge sharing. It is intuitive and obvious and can work with little encouragement. Saying “you must do this” is second best – systems must be more intuitive than that! (I think this is a key message for us and the new practices and protocols we will be setting up and using within our ECM.)
Re future of scholarly publishing, Powell said that Open Access is just an inevitable change. We see it in the music industry already. Researchers can make their stuff free on the web. Yes, people will still want to buy and want to publish in journals. Maybe they'll be different, but something will change. National funding bodies seem unable to fund global networks for researchers, so publishers are starting to step into that space, building “Facebooks” for researchers. What impact will blogging have on this – probably an increasing one.
[My apologies for this long post, but this is the one paper not yet provided to us online or via the CD we received at Rego. I had to take these rough notes during his presentation. I thought it was pretty relevant to us as we approach ECM implementation.]
04 February 2008
After looking over their digitisation facilities, I had a look at the Victorians on Vacation temporary exhibition. It is an interesting exhibition with some very evocative images, and I found their audio guide to be a good model for us. There is clear signage about it as you enter the exhibition and I asked the staff about their free-to-hire MP3 players. They are iPod Nanos (about $199 each, I think) and visitors can borrow them in exchange for photo ID. They've had only one stolen, and in that case no photo ID was left – just stolen credit cards. They're pretty well used, especially on weekends, but they only had a few to lend out. The staff say the use of the guides varies depending on the exhibition content, but they are expecting them to be popular for their next exhibition, The Medieval Imagination, opening in late March.
The people who visited us to look at our digitisation facilities last week returned the favour and showed me theirs this week. They have three (yes 3!) Bookeye scanners. They are not perfect for everything, but I think they are needed for many bound formats. The SLV are just scanning in grey scale for access at this stage and have not bothered with OCR for printed text.
Above you can also see an image of their map scanner with decent layout tables each side of it.
The last image is a light box that they fashioned themselves for their own glass plate negative program.
I went to the NGV, but didn't have the time to see everything, so just looked in a couple of exhibitions and galleries. I like the way they clearly notify visitors of all talks being done during the life of an exhibition (including who is giving them), on the walls as you enter. I think we should do this with all of our special exhibitions.
Role Play was small enough to view with limited time and there were a couple of great photos that interested me.
It was pretty uneventful, even a bit dull, though less dull in NSW where they were doing roadworks every 10 km. And for those who haven't done this trip in a while, the new Albury bypass is now open, so you just breeze through the border towns at 110 km/h. This later caused the only interesting part of the second half of the trip, because I missed the usual Caltex petrol station in Albury and there are no big service centres on the bypass yet.
So, I charged on to Melbourne confident that the range remaining indicator which said over 340 km to go in Albury would get me to the outskirts of Melbourne where I knew there was another Caltex. Of course I could have just bought some petrol at another station (but who wants to use money when they have a Caltex card?) and I could also have turned off and gone to a side town or even turned around and crossed the highway for a Caltex on the other side, but really?
Even though the Victorian Police can pick you up for merely thinking about going over the speed limit (whilst you can gun down a few former friends in Carlton and freely get away with it), backing off the throttle and using the cruise control religiously didn't seem to extend my range to the point at which I could relax. In fact, as I got closer to Melbourne I found myself doing things like reducing my cruise speed even more (which enabled old Kelvinators on wheels to overtake) and shouting out "It's touch and go! Touch and go!" (because I found that scene in the recent movie Death at a Funeral particularly amusing). I began to regret some minor throttle adjustments that I had made between the endless NSW roadworks to keep up my average speed.
Soon the range indicator just told me that the range was limited, and then the petrol gauge started to flash regular warnings about the same matter, as if I'd been ignoring it!
The needle barely registered any life at all and I decided to pull into any establishment selling petrol as soon as I saw one, but as I cruised down the hill into somewhere near Wallan, less than 40 km from Melbourne city, a huge Caltex station magically appeared and I was able to drive all the way to the pump without phoning either the RACV or Alfa Road Assist. The 60 litre tank took just on 58 litres, so obviously I'd been panicking for nothing and needn't have worried.
I filled up, consulted the Melways and charged onwards towards the maze of tollways, freeways and bridges that surround the city. I skillfully managed to take the correct turn off from the Westgate Freeway after the Bolte bridge, but then paid the ultimate price for making one unplanned turn in Southbank (within a block of my accommodation) and immediately found myself heading north east, out of the city, on something called the Monash Freeway. The first exit was somewhere near Burnley and then I had to circumnavigate Richmond and South Yarra to get back to Southbank.
It didn't matter, I was now in Melbourne: the most civilised city in Oz.