Semantic, linked and smart data – predictions for 2014

See the full article on Scoop.it: Data & Informatics


Quite a lot to digest here, though the overall sentiment is positive for development and innovation around open and linked data: actual products, as opposed to proofs, pilots and concepts.

There is also renewed optimism that the Semantic Web can deliver on its original vision – Semantic Web 2.0 (my term) – utilising ‘cognition-as-a-service’ (CaaS) and building bridges between ‘Big Data’ and the Semantic Web in order to turn unstructured chaos into higher-level insights.

The following abstract caught my eye:

One less obvious problem is one of information retrieval. Keyword search is now fundamentally broken. The more information is out there, the worse keyword search performs. Advanced query systems like Facebook’s Graph Search or Wolfram Alpha are only marginally better than keyword search. Even conversation engines like Siri have a fundamental problem. No one knows what questions to ask. We need a web in which information (both questions and answers) finds you based on how your attention, emotions and thinking interconnects with the rest of the world.

Sounds good, if a little utopian.

Overall, some useful insights in this piece.

Original source: semanticweb.com


Social Media: It’s not the Wild West after all!

If there is one good thing to come out of the Newsnight fiasco, which resulted in the irresponsible and inaccurate smearing of Lord McAlpine on social networks, it’s the challenge to the long-held assumption that tweeting or blogging defamatory or libellous material cannot be policed, and that those who propagate and repeat such misinformation cannot be held to account. As Lord McAlpine’s lawyers progress their legal case against the BBC and those considered to have been responsible for incorrectly identifying him on social networks, it may cause quite a few people to reflect on their behaviour.

This is the moment when it can be clearly shown and understood that people cannot be libelled or harassed with impunity just because the defamation is published online by individuals, rather than on paper by large organisations. The idea that the world wide web is the Wild West and immune from the law is – at long last – being seriously challenged.

I think Sally Bercow should be particularly worried, since her tweet on 4th November – “Why is Lord McAlpine trending? *Innocent face*” – was deliberately intended to start a feeding frenzy, which it did. I hope she’s got some good legal insurance (actually, on reflection, I hope she hasn’t!).

What many people seem to forget is that having a social media account, e.g. on Facebook or Twitter, brings with it a certain responsibility. After all, these social networks bring incredible reach and potential access to an audience of billions. Other than age restrictions imposed by some vendors, you need no special skills, no training, no licence, and you don’t have to demonstrate any competence before you can establish an account and begin pumping out your message to the world. This is all well and good, and reinforces the democratisation of voice and freedom of speech. But it doesn’t mean you can say (write) anything you want.

Yes, there will always be trolls, but that doesn’t mean we have to accept them, or decline to pursue them through the courts if they harass, incite hatred or libel people. A few short, sharp shocks, as promised in the pending legal action brought by Lord McAlpine, will perhaps remind social media users that they have moral and legal obligations, and cannot sit behind a computer screen detached and immune from the consequences of their actions.

Think twice before you post that blog, or tweet/re-tweet that message! Do you trust your sources?

[Image: Mona Lisa]


15 Website Good Practice Principles



Where would we be without serendipity? I was looking for some examples of agile development techniques and stumbled across this excellent post from Tom Loosemore (the man behind Gov.uk at the Government Digital Service). I thought I’d repeat the post here, but the original is on Tom Loosemore’s blog, published 7th February 2007. It’s as relevant today as it was then; proof, maybe, that sound advice built on practical experience doesn’t diminish over time.

  1. Build web products that meet audience needs: anticipate needs not yet fully articulated by audiences, then meet them with products that set new standards. (Sourced from Google)
  2. The very best websites do one thing really, really well: do less, but execute perfectly. (Sourced from Google, with a tip of the hat to Jason Fried)
  3. Do not attempt to do everything yourselves: link to other high-quality sites instead. Your users will thank you. Use other people’s content and tools to enhance your site, and vice versa.
  4. Fall forward, fast: make many small bets, iterate wildly, back successes, kill failures, fast.
  5. Treat the entire web as a creative canvas: don’t restrict your creativity to your own site.
  6. The web is a conversation. Join in: Adopt a relaxed, conversational tone. Admit your mistakes.
  7. Any website is only as good as its worst page: Ensure best practice editorial processes are adopted and adhered to.
  8. Make sure all your content can be linked to, forever (see the sketch after this list).
  9. Remember your granny won’t ever use “Second Life”: She may come online soon, with very different needs from early-adopters.
  10. Maximise routes to content: Develop as many aggregations of content about people, places, topics, channels, networks & time as possible. Optimise your site to rank high in Google.
  11. Consistent design and navigation needn’t mean one-size-fits-all: Users should always know they’re on one of your websites, even if they all look very different. Most importantly of all, they should know they won’t ever get lost.
  12. Accessibility is not an optional extra: Sites designed that way from the ground up work better for all users.
  13. Let people paste your content on the walls of their virtual homes: Encourage users to take nuggets of content away with them, with links back to your site.
  14. Link to discussions on the web, don’t host them: Only host web-based discussions where there is a clear rationale.
  15. Personalisation should be unobtrusive, elegant and transparent: After all, it’s your users’ data. Best respect it.
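
Principle 8 lends itself to a concrete illustration. Here is a minimal sketch, using Python with Flask and invented URL paths, of one way to honour it: when pages move, keep the old URLs alive with permanent redirects rather than letting inbound links rot.

```python
# A sketch of principle 8: when content moves, keep every old URL alive
# with a permanent redirect so inbound links never break.
# Assumes Flask; the paths in LEGACY_URLS are invented examples.
from flask import Flask, redirect

app = Flask(__name__)

# Map of retired URLs to their current homes.
LEGACY_URLS = {
    "/news/2007/02/web-principles.html": "/blog/15-website-good-practice-principles",
    "/about/team.php": "/about",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    target = LEGACY_URLS.get("/" + old_path)
    if target:
        # 301 tells browsers and search engines the move is permanent.
        return redirect(target, code=301)
    return "Not found", 404

if __name__ == "__main__":
    app.run()
```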



Why are Government and Local Councils still using IE6?

It’s insecure, it’s flaky… it’s government IT policy!

I picked up on this article in The Register a couple of days ago, in which Tom Watson MP had asked UK government departments when they intended to upgrade their browsers from Internet Explorer 6 (IE6). It didn’t really surprise me that Tom Watson had raised this issue: I know he’s an advocate for modernising government through better use of technology, he’s a prolific blogger in his own right (http://www.tom-watson.co.uk/), and he was the primary driver in setting up the Power of Information Taskforce.

Tom Watson told the Reg:

“I’ve asked the questions because I feel sorry for the thousands of civil servants using the Austin Allegro of web browsers when they can have newer, faster alternatives. I want government CIOs to pull their fingers out.”

You can read the full article for yourselves, but I’ve abstracted the key points below:

  • The Ministry of Justice and the Foreign Office are in the process of upgrading.
  • The Department for Culture, Media and Sport expects to complete its move to IE7 by the end of August 2009.
  • The Home Office quoted February 2010.
  • The Department of Health has no plans to upgrade.
  • The MOD currently has no plans to upgrade.

A pretty mixed bag then, and pretty depressing reading for a couple of departments. It also made me wonder why there isn’t an overarching strategy for web browsers across government. After all, isn’t this a key and fundamental component for doing ANY work on the intranet or the internet? And aren’t most staff in these departments dependent on being online as part of their daily routines?

I’d like to see the same question asked of local government, where I suspect a similar pattern of complacency would emerge. In a strange paradox, government (central and local) puts a high premium on security and accessibility for any new web services or technology procurement, but once vendors have jumped through all the appropriate hoops, a coach and horses can be driven through the whole process by insisting that the product or service must work with IE6. This places a huge burden on vendors, who have to ensure all features are backwards compatible with a browser that doesn’t comply with W3C standards and is full of security holes.

I know for a fact (given I am the business lead for the product) that a significant part of the development budget for the local government community of practice platform goes into ensuring that all the features work with IE6. I estimate that savings of at least 20% could be made if backward compatibility extended only as far as IE7 – which does at least comply with most of the W3C standards.

The latest figures put IE6’s market share at just over 12%. I’m betting that a good proportion of this 12% is public sector workers, who continue to be poorly served by IT departments and CIOs who don’t see the browser as an important component in improving user productivity.
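
Rather than relying on global figures, it’s easy to measure your own site’s IE6 share from a web server access log. A rough Python sketch, assuming the common ‘combined’ log format (user agent as the final quoted field) and an invented log path – adjust both for your own setup:

```python
# Rough measure of IE6 share from an Apache/nginx access log.
# Assumptions: 'combined' log format, user agent as the last quoted
# field, and the log path below.
import re

QUOTED_FIELD = re.compile(r'"([^"]*)"')

total = ie6 = 0
with open("/var/log/apache2/access.log") as log:
    for line in log:
        fields = QUOTED_FIELD.findall(line)
        if not fields:
            continue
        total += 1
        if "MSIE 6." in fields[-1]:  # IE6 identifies itself as MSIE 6.0
            ie6 += 1

if total:
    print(f"IE6 share: {100 * ie6 / total:.1f}% ({ie6} of {total} requests)")
```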

I’d like to see a campaign similar to the one started by Mash the State (Twitter: http://twitter.com/mashthestate), which aims to get more councils to use RSS feeds, but this time to get central and local government to kick the IE6 habit – and quickly. My preference would be to give users some choice over the browsers they use (I use Firefox because of the huge number of productivity plug-ins available), but I suspect this may be too ambitious. Let’s at least provide civil servants with a standards-compatible browser that is more secure than IE6, offers some productivity enhancements and requires less development effort to make it work with standards-compliant web services.

Anyone up for getting a campaign started?


Records Management in a Web 2.0 World


My colleague James Lappin (Thinking Records Ltd) recently hosted and facilitated a podcast with me; Steve Bailey, senior adviser on records management issues for JISC infoNet and author of the hugely successful and thought-provoking book ‘Managing the Crowd: Rethinking Records Management for the Web 2.0 World’; and Elizabeth Lomas, PhD researcher at Northumbria University. I was indeed in esteemed company!

For me this was an opportunity to air the views I had previously blogged about, regarding a perceived indifference towards, or lack of understanding of, how the Web 2.0 world is making traditional records management policies and procedures largely redundant – and in some cases completely unworkable – in the public sector (and possibly elsewhere).

The podcast (Episode 4) lasts 46 minutes and covers the following points:

  • What impact is Web 2.0 having on the way organisations are keeping their records?
  • Are current records management practices and standards still suited to the web 2.0 world, with its increased volume and pace of information exchange, increased diversity of systems and increased pace of technological change?
  • What kind of record-keeping would be suited to the web 2.0 world? Will it result in organisations keeping records in a completely different way?

This may be a dry subject for some people, but whichever side of the coin you’re on – the dynamic and relatively undisciplined world of Web 2.0, or the highly disciplined and structured world of records management – I think you’ll soon be affected one way or another, since many of the issues go to the heart of how information is created, used, published and destroyed (or not, as the case may be).

It’s certainly a polarising topic, with few opportunities for sitting on the fence. Either we accept that information creation is increasingly user-centric, and adapt policies, procedures and technology to cope with this, or we continue to throw money and resources at ECM and EDRM systems based on increasingly redundant policies and procedures that assume centralised control and management of information. Whatever you believe, there is a tension in the system that is going to lead to something breaking somewhere, and soon.

If you have an opinion – let’s hear it!


Taxonomies vs. Folksonomies

I wanted to get myself up to date on contemporary ideas around use of taxonomies vs. folksonomies and was drawn to a course being run by the UKeiG (part of CILIP). The course was led by a renowned and respected information management professional and Fellow of CILIP.

It was like stepping back in time 10 or 15 years, to when metadata standards, structured lists, taxonomies, thesauri and controlled vocabularies were paramount in the discipline of effective information management. Discussion of folksonomies and social bookmarking (the original reason for my attendance) was sadly limited to a brief 10-minute slot at the end of the day. This led me to wonder whether professional bodies such as CILIP had truly grasped the magnitude of the change now taking place in the social computing space, and indeed whether the social element of information management was recognised at all.

I was reminded of the unnecessarily over-complex government metadata schema e-GMS (a superset of Dublin Core) and the even more complex government subject encoding scheme, the Integrated Public Sector Vocabulary (IPSV) – now over 8,000 terms, I’m reliably informed. I appeared to be the only one present who understood the connection between ‘over-complex’ and ‘poorly implemented’. I thought it was common knowledge that many (most?) departments and organisations in the public sector arbitrarily picked a convenient high-level term from IPSV to classify all their web pages, just so that they could tick the box for being IPSV compliant. I wonder how long it’s going to be before this ludicrous standard is consigned to the ‘good idea at the time but impractical to implement’ bin by the folk over at the Cabinet Office.
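
For anyone who hasn’t seen the box-ticking in action, here’s an illustrative Python sketch. The meta-tag attribute names only approximate e-GMS/Dublin Core conventions (treat them as placeholders, not verbatim from the standard), and the page URLs are invented; the point is that one blanket term stamped on every page satisfies a compliance check while telling a search engine nothing.

```python
# Illustration of IPSV box-ticking: one broad term stamped on every page.
# The attribute names approximate e-GMS/Dublin Core usage and the URLs
# are invented -- placeholders, not verbatim from the standard.
PAGES = ["/council-tax", "/bin-collections", "/planning-applications"]

BLANKET_TERM = "Local government"  # 'compliant', but useless for retrieval

def egms_subject_tag(term: str) -> str:
    # Emit a subject meta tag in the e-GMS/Dublin Core style.
    return f'<meta name="DC.subject" scheme="eGMS.IPSV" content="{term}"/>'

for page in PAGES:
    # Every page gets the same high-level term, so the metadata adds
    # nothing that distinguishes one page from another.
    print(f"{page}: {egms_subject_tag(BLANKET_TERM)}")
```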

Furthermore, I was not convinced by the argument put forward by the course leader as to the benefits of accurate and consistent use of IPSV terms for ensuring good search results. Searching on the term ‘wellington’ could, we were told, return results about the Wellington boot, Wellington in New Zealand, or the Duke of Wellington.

Right. But if users are foolish enough to use a single search term without giving any context, then they deserve to get mixed and irrelevant results. One of the good things about Google is that it has conditioned most people in how to construct reasonably good search queries. I wonder how many users in the public sector, surveying their mixed bag of results from the ‘wellington’ query, would think to themselves “mmm, I must contact the webmaster about ensuring IPSV terms are more accurately applied”, and how many would simply refine their search to something like ‘Wellington boot’, ‘Wellington NZ’ or ‘Duke of Wellington’? Indeed, as far as a Google web search is concerned, the complete absence of the IPSV meta-tag makes not a jot of difference to the results, because Google knows it can’t rely on subject metadata in its search algorithms.

Then, at last, we got to discuss taxonomies and folksonomies. It was clear that folksonomies were not favoured by the course leader, who quickly demonstrated a tag cloud on Flickr, but without explaining why some tags were in a larger font than others (indicating their frequency of use) or the social networking aspect of how the tags got created in the first place. The sole reason put forward as to why folksonomies were not as good as taxonomies for information retrieval was the cost of tagging (?) – conveniently forgetting, it seems, that taxonomies also require tags to be applied.
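
For the record, the mechanics the course skipped over are trivial. A minimal Python sketch (with invented tag data) showing all a tag cloud really does: count how often users apply each free-form tag, then scale the display size by frequency.

```python
# All a tag cloud does: count free-form user tags, then scale font size
# by frequency. The tags below are invented sample data.
from collections import Counter

user_tags = ["london", "london", "bigben", "london", "thames",
             "bigben", "uk", "london", "thames", "uk", "westminster"]

counts = Counter(user_tags)
min_n, max_n = min(counts.values()), max(counts.values())

def font_size(n, smallest=10, largest=28):
    # Map a tag's frequency linearly onto a font-size range (in px).
    if max_n == min_n:
        return smallest
    return smallest + (largest - smallest) * (n - min_n) / (max_n - min_n)

for tag, n in counts.most_common():
    print(f"{tag}: used {n}x -> display at {font_size(n):.0f}px")
```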

So, for the benefit of librarians and other information professionals, and particularly the ones on the course I attended, here is my slightly more detailed analysis of the relative merits of taxonomies and folksonomies:

TAXONOMIES              FOLKSONOMIES
Central control         Democratic creation
Top-down                Bottom-up
Meaning to the author   Meaning to the reader
Process to add new      Just do it
Accurate                Good enough
Navigation              Discovery
Restrictive             Expansive
Defined vocabulary      Personal vocabulary

As always, I am open to other views and opinions from my peers and experts in this field, and in particular I want to be reassured that information management professionals do understand that there is a quiet revolution happening in the world of social computing that threatens some long established standards and practices for effective management of information, and that there are now some new tools in the toolbox.


Web 2.0 vs Accessibility

I attended an “Enterprise 2.0” event last week where Ian Lloyd gave a very thought-provoking presentation on the impact of Web 2.0 on accessibility. Ian is a web developer working for the Nationwide Building Society, and clearly knows his stuff when it comes to designing websites that will accommodate assistive technologies – such as screen readers, voice-to-text and screen magnifiers.

This was particularly relevant to the work I’m presently doing in building on-line environments to support Communities of Practice in the public sector, where accessibility standards and guidelines for websites are far more rigorously enforced than in the private sector. Conforming to standards such as the W3C Web Accessibility Initiative guidelines is a given, but websites must also conform to guidance such as ‘Delivering Inclusive Websites’, issued by the COI.

Personally, I have some sympathy with developers of ‘social media-rich’ websites (which I’ll categorise as ‘Web 2.0’) in that it’s quite difficult to find the right balance between accessibility and making the site appealing to a mass audience. Clearly Facebook comes to mind here. However, I’m not sure that vendors and developers do enough to ensure they have catered for the disabled minority. For example, the Captcha processes used on a growing number of websites are fairly difficult to negotiate even for someone with 20/20 vision.

I don’t necessarily think that social media has to mean poor accessibility, yet there seems to be a tacit acceptance that this is the case. I’m now far more aware of my obligations in striving to make the CoP platform available to a more diverse audience, and will be taking steps in the next development phase to ensure we’re meeting the required guidelines and best practice.
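
Some of those steps can be automated. As a first pass only – no substitute for a proper WCAG audit – a short Python script using just the standard library can flag obvious failures, such as images with no alt text:

```python
# First-pass accessibility check: flag <img> elements with no alt text.
# Standard library only; a real audit covers far more than this.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.problems.append(attributes.get("src", "<no src>"))

checker = MissingAltChecker()
checker.feed('<img src="logo.png"><img src="chart.png" alt="Monthly visits">')
for src in checker.problems:
    print(f"Image missing alt text: {src}")
```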

Two very useful resources for anyone interested in issues around accessibility and diversity are Abilitynet and the Shaw Trust.


New Gov website fails accessibility standards

Just picked up from Public Sector Forums (PSF): the Cabinet Office have launched a new ‘Customer Service Excellence’ website, which apparently fails the government’s own standards for web accessibility, breaches the guidelines for government websites and contravenes the COI’s ‘Inclusive Websites’ guidance. The website claims to be ‘AA’ compliant, but accessibility expert Dan Champion described the site as “shockingly bad… a catalogue of serious failings. Every page on this site fails WCAG level A on multiple checkpoints and requires significant attention to be made accessible”.

Clearly this needs to be brought to the attention of the government department responsible for defining and policing these standards… which is, oh dear, the Cabinet Office. Another case of ‘do as I say, not as I do’, perhaps?


Social Media User’s Bill of Rights


Came across this today, which seems to be gathering a body of support. I like the sentiments; pity it’s not enforceable!

Joseph Smarr, Marc Canter, Robert Scoble, and Michael Arrington have authored a bill of rights for users of the social web. The bill states:

We publicly assert that all users of the social web are entitled to certain fundamental rights, specifically:

  • Ownership of their own personal information, including:
    • their own profile data
    • the list of people they are connected to
    • the activity stream of content they create;
  • Control of whether and how such personal information is shared with others; and
  • Freedom to grant persistent access to their personal information to trusted external sites.

Sites supporting these rights shall:

  • Allow their users to syndicate their own profile data, their friends list, and the data that’s shared with them via the service, using a persistent URL or API token and open data formats (a sketch of what this might look like follows below);
  • Allow their users to syndicate their own stream of activity outside the site;
  • Allow their users to link from their profile pages to external identifiers in a public way; and
  • Allow their users to discover who else they know is also on their site, using the same external identifiers made available for lookup within the service.
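
Out of interest, here is a rough sketch of what the first of those obligations might look like in practice: a persistent URL serving a user’s own profile and friends list in an open format (JSON). It uses Python with Flask; the endpoint, token handling and data are all invented for illustration.

```python
# Sketch of a data-portability endpoint: a persistent URL from which a
# user can pull their own profile and friends list as JSON.
# Flask-based; usernames, token scheme and data are invented.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Stand-in datastore; a real site would verify tokens properly.
USERS = {
    "alice": {
        "profile": {"name": "Alice", "location": "London"},
        "friends": ["bob", "carol"],
        "api_token": "secret-token",
    }
}

@app.route("/users/<username>/export")
def export_user(username):
    user = USERS.get(username)
    if user is None or request.args.get("token") != user["api_token"]:
        abort(403)  # only the account holder gets their own data
    # Open, machine-readable format, so the data can move elsewhere.
    return jsonify({"profile": user["profile"], "friends": user["friends"]})

if __name__ == "__main__":
    app.run()
```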

Blogging Policy


I’ve recently seen a number of conversations in the blogosphere from people asking about corporate blogging policies; I assume their companies are getting nervous about what their employees might be saying online. Sun met this issue head-on about three years ago, and actively encouraged their employees to blog by providing them with dedicated server space. Their blogging policy is as good as any I’ve seen. It’s a pity all companies don’t encourage this level of transparency (I’m not a Sun employee, by the way!).
