Tuesday, December 04, 2007

Web 2.0 changing the organisation

One thought I had attending the Enterprise 2.0 conference was how Web 2.0 would change the nature of organisations. A lot of the presentations at the conference were about how web-based social networking could be used to promote a company, recruit staff, or market products and services. But there was little about how this will change the nature of work itself. If staff are used to using web-based tools to communicate with each other, will that change the way they work? Will this put an end to meetings?

In some respects the Web may have already changed retail stores. I went into the new Telstra T.Life store in Sydney to look at wireless broadband. The new store looks like something from the TV comedy "Ab Fab". The store seems to be entirely white, with almost no furniture, no products visible and no information about products. Staff at a counter, which could be the reception desk of an upmarket hotel, took my inquiry and walked me over to a wall-mounted touch screen. They entered my query, almost as if they were part of a voice-operated web page. I was then able to read the results on the screen, which they explained to me. Unfortunately the store did not have any of the actual products in stock, and Telstra did not have a marketable product anyway (Telstra sell their entry-level wireless broadband as a per-hour service, which makes it non-viable).

But how will Web 2.0 affect normal offices? If staff can communicate and coordinate their work online, will the need for people to be co-located be removed? Will staff be able to work from a serviced office or from home? Previously even independent workers, such as lawyers and stockbrokers, tended to cluster in particular places so they could share gossip. If web-based services provide the equivalent of water-cooler and coffee-shop gossip, then the need for this clustering is removed.

But as well as removing the need for members of one profession to cluster in one place, it may remove the need for people in one organisation to be co-located, or even to work for the organisation. The extreme example of this is Amazon Mechanical Turk, where a pool of piece workers bid to do tasks. It is conceivable that 21st-century organisations, including private companies, government agencies and not-for-profits, could consist of a web site with links activating the Mechanical Turk to employ staff to do the work. There would be no staff employed by the organisation at all.
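The "organisation as a web site" idea can be sketched in a few lines of Python. The workers, tasks and rates here are all invented for illustration, and the real Mechanical Turk service works quite differently (tasks are posted at a fixed price rather than auctioned):

```python
# Hypothetical sketch: an "organisation" reduced to a task list,
# with piece workers bidding and the cheapest bid winning each task.

def assign_tasks(tasks, bids):
    """For each task, pick the lowest bid; return task -> (worker, rate)."""
    assignments = {}
    for task in tasks:
        offers = bids.get(task, [])
        if offers:
            # min() by rate selects the cheapest offer
            worker, rate = min(offers, key=lambda offer: offer[1])
            assignments[task] = (worker, rate)
    return assignments

tasks = ["transcribe audio", "tag images"]
bids = {
    "transcribe audio": [("worker_a", 0.10), ("worker_b", 0.08)],
    "tag images": [("worker_c", 0.02)],
}
print(assign_tasks(tasks, bids))
```

The point of the sketch is that nothing in it refers to employees or premises: the "organisation" is just the dispatch logic.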

Labels: ,

Leveraging the Web for the New Australian Government

Brian Stonebridge, Director of Service Improvement and Interoperability Projects, AGIMO, at the new Department of Finance and Deregulation (created yesterday) talked about "Leveraging the Web to Collaborate in Government: GovDex Case Study" at the Enterprise 2.0 conference. Unlike many of the previous speakers, who used PowerPoint slides, Brian actually demonstrated the GovDex tool live, showing online collaboration with the French government. He used learning to dance the salsa as a metaphor for interdepartmental and international collaboration on IT systems.

One question with this is to what extent the nature of bureaucracy will inhibit the use of Web 2.0. Web-based social networking assumes personal, informal and spontaneous communication. Government agencies usually work through impersonal, formal and planned communication. Will the benefits of Web 2.0 be possible in the government environment, and in large companies with similar structures?

Collaborative Government Suite

"A suite of tools to support interagency and cross jurisdictional projects"

The newly elected Australian Government has indicated a stronger focus on working across jurisdictions to deliver better results for Australian citizens and firms. The Australian Government Information Management Office, a Group within the Department of Finance, is building a suite of tools to support this agenda.

National Service Improvement Framework (NSIF)

The NSIF is a suite of re-usable documents and tools that aim to deliver enhanced collaborative service delivery arrangements across government departments and agencies. The NSIF is a structured approach to collaborative service delivery across traditional boundaries. The Framework provides a tiered approach for government agencies to follow when seeking to collaborate, defining a process where potential partners can build on agreements in incremental tiers. A key component of the NSIF is the provision of a Collaborative Head Agreement, which provides an "off the shelf" MOU to fast-track the legal treatment of risks associated with inter-agency collaboration.

For more information please contact Liz Marchant on Liz.Marchant@finance.gov.au, or go to www.nsif.gov.au

Australian Government Architecture (AGA)

The Australian Government Architecture (AGA) aims to assist in the delivery of more consistent and cohesive service to citizens and support the more cost-effective delivery of ICT services by government, providing a framework that:

Provides a common language for agencies involved in the delivery of cross-agency services;
Supports the identification of duplicate, re-usable and sharable services;
Provides a basis for the objective review of ICT investment by government; and
Enables more cost-effective and timely delivery of ICT services through a repository of standards, principles and templates that assist in the design and delivery of ICT capability and, in turn, business services to citizens.

For more information please contact Peter Leach on peter.leach@finance.gov.au

Government Data Exchange (GovDex)

GovDex is an Australian Government initiative to facilitate business process collaboration across portfolios, administrative jurisdictions and agencies. It promotes effective and efficient information sharing, governance structures, tools, methods and re-usable technical components.
GovDex comprises three components:

  • a collaborative workspace;
  • a registry/repository; and
  • tools and methods.

GovDex is currently being used across the Federal Government as well as the NSW, QLD, VIC, SA and WA State governments.

Standards Governance

Government policy objectives in the 21st century require greater agility by agencies. This often requires collaboration across portfolios and jurisdictions. Complex policy objectives related to water management, carbon trading, standard business reporting, national security, tax fraud etc all require the participation of a wide range of agencies.
Standardisation of information and processes can provide significant benefits to Australian governments through reduction in risk, increase in reuse and a higher level of interoperability (and hence efficiency) within and across jurisdictions. These standards support the effective delivery of services to citizens and business.
In this context it is important that agencies have a mechanism for agreeing on a "standardised" way of exchanging data to help achieve those business outcomes.

For more information please contact Brian Stonebridge on Brian.Stonebridge@finance.gov.au

Government Information Exchange Methodology (GIEM)

If you are working on a project that includes a requirement for XML message exchange between agencies (G2G) or between business and government (B2G), then GIEM provides the hands-on tools and methods to help you achieve your goals in accordance with whole-of-government standards. GIEM is a model-driven approach to interoperability that is based on international standards and best practices. GIEM provides a top-down architectural approach that ensures the final technical components are aligned with government standards and meet business goals and requirements.
GIEM has been submitted to the UN/CEFACT as part of the review of UMM.
It has been successfully used in a number of projects.
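To make the idea of agency-to-agency XML message exchange concrete, here is a minimal sketch using Python's standard library. The message structure, element names and namespace are invented for illustration; they are not the actual GIEM or whole-of-government schemas:

```python
# Illustrative only: a minimal G2G-style XML message parsed with the
# standard library. The schema and namespace below are made up.
import xml.etree.ElementTree as ET

MESSAGE = """<?xml version="1.0"?>
<Message xmlns="urn:example:agency-exchange">
  <Header><Sender>AgencyA</Sender><Receiver>AgencyB</Receiver></Header>
  <Body><CaseId>12345</CaseId></Body>
</Message>"""

NS = {"m": "urn:example:agency-exchange"}
root = ET.fromstring(MESSAGE)
# findtext() with a namespace map resolves the "m:" prefixes
sender = root.findtext("m:Header/m:Sender", namespaces=NS)
case_id = root.findtext("m:Body/m:CaseId", namespaces=NS)
print(sender, case_id)  # AgencyA 12345
```

In practice a methodology like GIEM is about agreeing on the schema in the first place; once that is done, the parsing itself is the easy part.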

To see more go to GovDex Tools

From: Collaborative Government Suite, 2007

Labels: , ,

Automated Folksonomy Creation

At the Enterprise 2.0 conference in Sydney, David Hawking, Funnelback founder and Chief Scientist at CSIRO ICT Centre, talked about how a web search system could add user-generated tags automatically. This does not involve the user entering the tags, or even knowing they are doing it. Instead the search interface notes the words the user searches for and the documents they then select.
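The approach can be sketched roughly: record the words users searched for before selecting each document, then take the most frequent of those words as that document's tags. This toy Python version, with invented log data, is my guess at the general idea, not Funnelback's actual algorithm:

```python
# Sketch: infer tags for documents from the words users searched for
# before clicking through to them. Log data below is invented.
from collections import Counter, defaultdict

search_log = [
    ("water quality report", "doc1"),
    ("water conservation", "doc1"),
    ("annual report", "doc2"),
]

def infer_tags(log, top_n=2):
    """Count query words per clicked document; top words become tags."""
    counts = defaultdict(Counter)
    for query, doc in log:
        counts[doc].update(query.split())
    return {doc: [word for word, _ in c.most_common(top_n)]
            for doc, c in counts.items()}

print(infer_tags(search_log))
```

A real system would also need stemming, stop-word removal and some defence against noisy or adversarial queries, but the core of an automated folksonomy is just this kind of counting.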

It would be interesting to see how well the Funnelback search system works on Web 2.0 content, rather than neat government reports and scientific research reports. Perhaps the search system could participate in discussions as an intelligent information source.

The conference ends today. I am on a discussion panel in the afternoon at the conference having given my presentation yesterday on "Enterprise 2.0 Providing Solutions to Wider Business Needs".

Labels: , ,

Monday, December 03, 2007

Enterprise 2.0 with Web 2.0 for organisations

Greetings from the conference "Enterprise 2.0: Collaborative Web 2.0 into your organisation", in Sydney 3 and 4 December. I am on after lunch with "Enterprise 2.0 Providing Solutions to Wider Business Needs". This is about applying web social stuff to business. My take is to say this will be part of the replacement of the desktop PC in 2008 with low power thin client Linux computers.

The Wikipedia prefers the term "Enterprise social software" to Enterprise 2.0:
Enterprise social software is a term describing social software used in "enterprise" (business) contexts. It includes social and networked modifications to company intranets and other classic software platforms used by large companies to organize their communication. In contrast to traditional enterprise software, which imposes structure prior to use, this generation of software tends to encourage use prior to providing structure. ...

The term 'enterprise social software' is a general term for describing this class of tools. As of 2007, Enterprise 2.0 is a catchier term sometimes used to describe social and networked changes to enterprise, which often includes social software (but is not limited to it, nor to either social collaboration or software); and Enterprise Web 2.0 sometimes describes the introduction and implementation of Web 2.0 technologies within the enterprise including those rich internet applications, providing software as a service, and using the web as a general platform.

Labels: ,

Wednesday, October 24, 2007

National ICT Center Opening in Canberra

NICTA will be opening their new national ICT centre building in Canberra next month, with a free seminar on "Semantic Technologies for Business and Government", 13 November. Unfortunately NICTA prepared the invitation as a PDF file which is 9 times larger than it need be, so here is a text version of the details:
Semantic Technologies for Business and Government
A half-day seminar in Canberra co-ordinated by NICTA’s e-Government Project


We live in an age of information overload, surrounded by masses of digital data, but lacking the tools to process it based on its meaning or semantics. Consequently, a lot of human time and effort is spent manually transforming and processing data when porting it from one application or data store to another, or when aggregating it into a form suited for analysis and execution. Semantic Technologies represent a new wave in computing which aims to make the meaning of data and services explicit and machine processible for improved interoperability, searching and querying.
This seminar will focus on the core concepts and issues of semantic technologies, covering these topics:
  • Overview of semantic technologies — Semantic Web, Web2.0, Ontologies, Metamodels and Metadata creation, Modelling Languages, OWL, Knowledge Sharing and Utilization;
  • Overview of current tools, languages, and notations;
  • Applicability to government services, processes, and infrastructure;
  • Case studies of the use of semantic technologies in government;
  • Survey of industry opinion in Australia on the future of semantic technologies.
Rather than comparing vendor technologies or detailing specific languages and notations, the seminar will focus on presenting the core technical ideas and approach of semantic technologies, providing attendees with a firm basis for further investigation and evaluation.

Intended Audience

This seminar is designed for senior technical staff and business managers in government, involved in business transformation, digital preservation and record keeping, knowledge management, enterprise planning and enterprise architecting,
inter-agency interoperability, and organisational process improvement. It will also be of interest to representatives of the ICT industry involved in enabling these activities.

Format of the Seminar

9:00 Registration
9:15–10:30 Session 1:
Overview of Semantic Technologies (Anne Cregan and Paul Brebner, NICTA).
Government case study (Don Bartley, ABS).
10:30–11:00 Morning Tea
11:00–12:00 Session 2:
Case Study 2 Industry Survey — Towards the Semantic Web: Standards and Interoperability across Document Management and Publishing Supply Chains (Anni Rowland-Campbell, Fuji-Xerox and RMIT).
12:00–12:30 Panel Session.
Includes the above presenters plus representatives from AGIMO.
12:30–13:30 Light Lunch and Networking

Bookings for this free event are essential

Please RSVP by no later than 6 November, 2007:
Phone: (08) 8302 3928
Fax: (08) 8302 3115
Email: industryeducation@nicta.com.au

Seminar Room NICTA Building 7 London Circuit Canberra ACT

Date: Tuesday, 13th November 2007 9:00am--1:30pm

From: Semantic Technologies for Business and Government, NICTA, 2007
ps: While NICTA's PDF files are too big, they are not as bad as National Archives of Australia invitations. The invitation I was emailed for their National Speaker's Corner was 5.2 Mbytes, which is 200 times larger than it need be. By doing so the NAA is wasting public money and contributing to the greenhouse effect.

To see how to do electronic documents, and therefore government, more efficiently, there are still some places available on my course.

Labels: , , ,

Friday, October 19, 2007

How to Create On-line University Courses in Electronic Archiving: Part 11 - First Draft

In Part 10 I looked at requirements. Now I have incorporated that into a first draft of the course web site in Moodle. All are welcome to read and review the course, using "guest" access. The students expect a printed set of notes, so I copied and pasted the 12 units into OpenOffice.org to make 70 pages of notes (not including the exercises or reading material).

Electronic Document Management

By Tom Worthington FACS HLM

Module 2 of Systems Approach to the Management of Government Information, ANU, 2007

The Electronic Document Management course introduces two topics: metadata and data management (digital library, electronic document management). Use of the technology for practical e-commerce and e-publishing applications is emphasized using case studies and anecdotes drawing on the lecturer's experience.

This course identifies steps that can be taken to accelerate the uptake of e-commerce by Australian small and medium-sized enterprises, and enables participants to learn practical skills for incorporating e-commerce into their businesses.

This course is based on Tom Worthington's lectures on Metadata and Electronic Document Management for IT in e-Commerce (COMP3410/COMP6341) 2007.

Structure: The course is 12 hours in total spread over 3 days with 4 teaching hours per day. The course consists of 6 hours of lectures, 2 hours of practical classes, 2 hours of tutorials and 2 hours of assessment exercises.

Labels: , , , ,

Friday, October 05, 2007

Enterprise 2.0 Providing Solutions to Wider Business Needs

Wednesday, August 29, 2007

W3C Australia Standards Symposium

W3C Australia held a one-day Standards Symposium in Canberra on 28 August 2007, looking at where web standards are going. These are my informal notes from the event, not official minutes. The symposium was organized with NICTA, with OASIS, OGC and AGIMO also presenting.

World Wide Web Consortium Australia

The World Wide Web Consortium's Australian office (W3C Aus) is run by CSIRO in Canberra (on the other side of my office wall in the ANU Computer Science and Information Technology Building).

W3C issue what they call "recommendations", but which are really standards, for HTML, XML, CSS and other key web technologies. W3C was founded by Tim Berners-Lee, inventor of the web, in 1994. As with any standards work, there is a rich mix of political, technological and commercial forces at work.

A recent area of tension touched on in the introduction was the schism in the web community between HTML and XHTML. Those working on the next version of HTML (HTML 5) have clearly stated they want to go a different direction from the work on the next XHTML (version 2).

Other tensions are with intellectual property issues with web recommendations. W3C aims to produce technology which can be freely used, without payment of royalties.

W3C wants to expand the web beyond desktop computers, to devices such as mobile phones. That is probably more a matter of commerce than technology, but the advent of new consumer smart phones may make a difference.

Typically the W3C process is to first hold a "workshop" in an area of interest; then a working group is formed (if justified) and publishes drafts for comment, implementations are produced to show the technology works, and after several more drafts a recommendation is released. Perhaps more importantly, W3C releases revisions and new versions of recommendations. Implementation guides and web tools are also provided to help with implementation.

As well as the more technical standards for HTML and CSS, W3C also produces guidelines, such as those for web accessibility. There are dozens of working groups working on interrelated recommendations who need to coordinate their work. W3C membership costs money and working group members contribute their time for free.

W3C Australia head, Ross Ackland, claimed the future of the web lay in the semantic web, the mobile web, and the sensor web. He suggested we were in the middle of a ten-year adoption of the mobile web, with the semantic web further in the future and the sensor web a newly emerging technology CSIRO would like to foster.

The semantic web tries to make a web which machines can understand. Ross argued that Web 2.0 and mashups were a "grass roots", ad-hoc approach to what the semantic web was attempting. My view is that Web 2.0 and mashups are providing useful services, while the semantic web is a failure which should be abandoned.

The W3C Mobile Web Initiative in 2005 got the attention of the mobile phone industry. But the industry has had several attempts at turning the mobile phone into a viable mobile web device. The industry's attempt with WAP was a failure costing billions of dollars. W3C's own attempt with XHTML Basic, has had limited success. About the only one to be successful was Japan's iMode, which uses a version of HTML which the W3C rejected.

The Sensor Web will provide some standards for sensor access in the future:
The Sensor Web is a type of sensor network or geographic information system (GIS) that is especially well suited for environmental monitoring and control. The term describes a specific type of sensor network: an amorphous network of spatially distributed sensor platforms (pods) that wirelessly communicate with each other. This amorphous architecture is unique since it is both synchronous and router-free, making it distinct from the more typical TCP/IP-like network schemes. The architecture allows every pod to know what is going on with every other pod throughout the Sensor Web at each measurement cycle.

From: Sensor Web, Wikipedia, 21:20, 26 July 2007
CSIRO have a sensor web in Brisbane which can be accessed via web services:

This server contains test deployments of the Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) services. ... getCapabilities ... data from the sensors deployed by the Autonomous Systems Laboratory in Brisbane, Australia. The sensor measure temperature, soil moisture and onboard diagnostics at three locations, qcat, belmont and gatton. There are roughly 125 stations with two or three sensors each. This yields over 250 data sources of which about 150 appear to be active. Each source reports every few minutes with data coming in every few seconds. ...

From: CSIRO ICT Centre SWE Web Services, CSIRO ICT Centre, 20 April 2007
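A client of such a service would typically start by parsing the capabilities response to discover what stations and sensors exist. The sketch below uses an invented, cut-down XML structure to show the kind of parsing involved; the real OGC SWE capabilities schema is far more elaborate:

```python
# Invented stand-in for a sensor-web capabilities response; the station
# names echo those mentioned above, but the XML format is made up.
import xml.etree.ElementTree as ET

CAPABILITIES = """<Capabilities>
  <Station id="qcat"><Sensor>temperature</Sensor><Sensor>soil-moisture</Sensor></Station>
  <Station id="belmont"><Sensor>temperature</Sensor></Station>
</Capabilities>"""

root = ET.fromstring(CAPABILITIES)
# Build a station-id -> list-of-sensors map for the client to work from
stations = {s.get("id"): [x.text for x in s.findall("Sensor")]
            for s in root.findall("Station")}
print(stations)
```

With 250-odd data sources reporting every few minutes, the interesting engineering is less the parsing than what you do with the resulting stream.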

Ross ended by asking what Australia could do for web standards. He pointed out that successful standards also need market adoption. Standards take about five years to develop, and the benefits are global. How does Australia contribute? An example is standards for water data, to help with conservation in Australia and worldwide.


Open Geospatial Consortium

OGC develops "specifications" for digital maps. The aim is to be able to knit together different online mapping services to produce a coherent view for the user. OGC works with W3C groups, ISO (the ISO 191xx series, including ISO 19115 for metadata), OASIS (such as the Common Alert Protocol (CAP) for emergency messages) and the IEEE (Sensor Model Language: SensorML).

OGC sponsors scenarios to test implementation of standards (much like the Coalition Warrior Interoperability Demonstration [CWID] for military IT). OWS 4 in December 2006 worked on Sensor Web Enablement (SWE), geo-processing workflow (GPN) and geo-decision support. OWS 5 for 2007 is being planned.

One thing which got my attention was mention of "Social Change On-line".

At question time there was a philosophical discussion of what a standard was, their benefits, disadvantages and processes. This was entertaining but not very enlightening. Perhaps there is a need for some courses on what standards are and how they are created.

Organisation for the Advancement of Structured Information Standards

Organisation for the Advancement of Structured Information Standards (OASIS) was founded in 1993 for SGML-related standards (more recently XML standards). It has more than 60 technical committees. Individuals and organisations can join. A well-known OASIS standard is ODF, based on the OpenOffice.org office document format. OASIS produces horizontal standards (general-purpose technology) and vertical standards (for a particular business function). Other standards are the Universal Business Language (UBL), Customer Information Quality (CIQ) for identifying locations, organisations and people, and the Common Alert Protocol (CAP) for emergency messages.

Semantic Web

W3C's Semantic Web is about being able to process information. Current work is on an English-like version of the Web Ontology Language (OWL). This reminds me of the attempt with COBOL to create an English-like computer programming language which could be understood by non-technical business people. The result was a verbose language which was still unintelligible to business people and cumbersome for trained computer programmers.

SPARQL is the semantic query language; POWDER is the Protocol for Web Description Resources; and GRDDL is Gleaning Resource Descriptions from Dialects of Languages.
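The core idea of SPARQL, matching patterns containing variables against a store of RDF subject-predicate-object triples, can be illustrated with a toy Python triple store. The data is invented, and this is not real SPARQL syntax, just the pattern-matching idea behind it:

```python
# Toy triple store: each fact is a (subject, predicate, object) tuple.
triples = [
    ("GovDex", "type", "Service"),
    ("GovDex", "operatedBy", "AGIMO"),
    ("AGIMO", "partOf", "Finance"),
]

def query(pattern, store):
    """Match a (s, p, o) pattern; None acts like a SPARQL variable."""
    s, p, o = pattern
    return [(ts, tp, to) for ts, tp, to in store
            if (s is None or s == ts)
            and (p is None or p == tp)
            and (o is None or o == to)]

# Analogous to: SELECT * WHERE { ?s operatedBy ?o }
print(query((None, "operatedBy", None), triples))
```

Real SPARQL adds joins across multiple patterns, filters and inference, which is where both its power and its complexity come from.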

This was the least useful session of the day. The Semantic Web may well turn out to be very useful one day, but so far all that appears to have been produced is a bewildering array of unintelligible standards. About the only prospect of any of this work ever being of use would be to apply the process Tim Berners-Lee used to create the web, where he took a large and complex standard (SGML) and trimmed it down to the essentials to make HTML.


Geoscience Australia

Chris Body presented on standards at Geoscience Australia. GA seem to have suddenly become more visible, with work on geospatial standards and tsunami warnings.
The Special Minister of State, Gary Nairn, announced an Australian Spatial Consortium (ASC) on 14 August 2007, but it was not clear to me what this is.

ANZLIC (the Spatial Information Council) has provided the ANZLIC Metadata Profile (December 2006), based on the ISO TC211 framework. GeoNetwork is a metadata entry tool endorsed by Australian agencies in August 2007.

Geoscience people have a preference for formal international standards. However, GA is aiming to have any Australian contributions to be available free for public use under a Creative Commons licence.

Australian Government Information Management Office

Brian Stonebridge from AGIMO is working on a standards governance framework. Brian argued that standards are boring to end users; there has to be some value to the user to get them interested. Brian's presentation was the most impressive of the day, because he was talking about how the standards could be used for the benefit of the community, and he actually used the technology he was talking about to make the presentation, via AGIMO's GovDex:
GovDex is a resource developed by government agencies to facilitate business process collaboration across policy portfolios (eg. Taxation, Human Services etc.) and administrative jurisdictions i.e. federal, state or local government levels. ...

From: Welcome to GovDex, Australian Government Information Management Office, 2007
Brian mentioned that some of the work is being done online, via the system with the French government.

Brian estimated that development of standards for government use will cost about $2M a year to administer. This is not the development of new technical standards from scratch, but selecting and profiling standards for a particular application (such as selecting e-document formats for an electronic application for building a house).

AGIMO have developed a plugin for Enterprise Architect for government standards.

AGIMO will use underlying international and national standards, with methods and tools, governance and reference models layered over these. The business case is that this will reduce costs over time.

Unfortunately Brian then lost me in an assortment of acronyms, including:
  • GIEM, Government Information Exchange Methodology (UMM v2.0 and CCTS v2.0). This extends the Canadian GSRM and is similar to the upper layers of AGA.
  • AGOSP: Australian Government Online Services Portal.
Also NICTA launched a three-year research initiative in eGovernment in January 2007, but it is not clear what this is intended to achieve.

Overview of the day

Ross Ackland argued that we were now "moving up the stack": the low-level standards for digital communications using the Internet are set and largely working. The web provides a digital publishing overlay for this. Now more semantic content is being added to the web, with standards in areas such as geoscience and more general areas such as the Semantic Web. This is a useful way to think about the work, but the reality I see is not such a clear or systematic path.

Ross asked what W3C and other bodies should do to further standards in Australia. W3C has only a few full members in Australia, due to the small size of the IT industry.

I suggested that NICTA, CSIRO and other interested parties could create a one hour presentation explaining how standards development works in Australia. This could be placed on the web and offered to ACS and other IT groups to explain where standards come from and how they could get involved. This may help avoid some of the controversy and confusion surrounding issues such as the proposed adoption of Microsoft's OOXML format as an ISO standard.

One way to look at this, which Ross pointed out, is that the way systems are built will change: instead of building an application for an organisation and then trying to interface it to other organisations, we will build the interfaces first. From a wider perspective, I suggested that the web standards effort could be seen as building a global computer system for processing information, much as the Internet is a global system for communicating information.

Some Overall Issues on the Day

* WHERE IS ASIA?: Several speakers talked of how the standards committees were heavily influenced by US government agencies (particularly the military and security) and less so by European organisations. There appears to be little involvement by Asian organisations. There appeared to be a lack of interest in why this is so, the problems it will cause and what to do about it. Australia is culturally close to the USA and Europe and so can ride on the coat tails of the current standards process. However, at some point Asian countries and industries may decide their interests are not being served by the current standards process and decide to set up a new process for standards. Perhaps Australia can play a part in bridging the gap. This could address cultural and geopolitical issues using the web technology itself.

* USING THE STANDARDS: Many groups are producing advanced web standards. Some Internet and web tools are being used by committees. But the output of the standards committees is PDF documents or web pages. It might be useful for the web standards groups to apply some of the technology they are proposing to the standards process itself.

* USING STANDARDS: Perhaps one area in which Australia can contribute is to helping test and implement standards. This will provide useful feedback to the standards developers and also provide potential useful products.

* AUSTRALIAN DEVELOPMENT THROUGH STANDARDS: The most productive part of the day was meeting David Peterson from Boab Interactive. This Australian IT company is the latest member of W3C Australia. They are based in Townsville, North Queensland, doing web work, mostly with tropical environment research projects. Some years ago the Australian government funded me to see how to get regional ICT happening.

Labels: , , , , , , , ,

Tuesday, July 31, 2007

Researching Web 2.0

On Monday, Roger Clarke argued in an Australian National University seminar that Web 2.0 is a valid area for formal research.
Roger made a good case that something in Web 2.0 was worth researching, even if it was just working out if Web 2.0 is actually anything. ;-)
An earlier version of his notes is available, as are the slides, but they are 5 Mbytes.

Subscribers to the Link mailing list will have seen this work evolve, with a number of requests for input and comment. Early on I commented that Web 2.0 was the same as AJAX and the talk was useful in correcting that misconception.

It would be easy to dismiss Web 2.0 as just a marketing gimmick, but even if so it is a very effective marketing gimmick. Therefore those involved in delivering and researching systems need to be able to talk intelligently about it (even if just to say it isn't anything). One aspect of this is that Web 2.0 is very much about commercial use of the web, and this colours all discussion of it.

Roger's search showed few genuine academic citations exist about Web 2.0. This would therefore seem a fruitful area for research proposals. He first summarized Web 1 as an aggregation of technology for e-commerce and the like, without a formal architecture.

Web 2 is a marketing-driven push for something, but it is not clear what (more of a feeling than a strict distinction). One aspect is addressing the "long tail": exploiting low-volume business with low-cost online services.

Some aspects: syndication (as in RSS), advertising syndication (I suspect per-click models might have had their day), participation (as in Wikipedia), collaboration (as in Wikipedia), and tagging. One interesting aspect is that companies can induce customers to provide some of their own customer support, in the form of product reviews, support and FAQs. It occurred to me that this was the equivalent of the telephone support line putting you into a conference call with the other customers, with a recorded message saying "sort it out yourselves". ;-)
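Syndication via RSS is simple enough to show directly: a feed is just XML that any client can fetch and parse. A minimal example using Python's standard library, with made-up feed content:

```python
# Parse a tiny RSS 2.0 feed and list the item titles.
# Feed content below is invented for illustration.
import xml.etree.ElementTree as ET

FEED = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>Post one</title><link>http://example.com/1</link></item>
  <item><title>Post two</title><link>http://example.com/2</link></item>
</channel></rss>"""

root = ET.fromstring(FEED)
# iter("item") walks the tree and finds every <item> element
titles = [item.findtext("title") for item in root.iter("item")]
print(titles)  # ['Post one', 'Post two']
```

That low barrier to entry is much of why syndication spread: a few lines of code turn any site's content into something other sites and readers can aggregate.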

The new trendy area of the web is social networking and its application to business. As I found out only last week, the trendy new buzzword for this is "Enterprise 2". To find out exactly what that is, if anything, we will need to wait for another seminar.

Roger's next seminar at the ANU is "Big Brother Google?", 27 August 2007.


Friday, July 27, 2007

Social computing for government and business

The Web Standards Group meeting in Canberra on 26 July 2007 was devoted to applying social computing to business:

Collaboration, innovation, distribution: social computing adoption benefits for government and business, by Stephen Collins, acidlabs.

Stephen argued that social computing can be used for government and business. He confused me at the beginning by putting up a photo of someone and saying they had popularized "Enterprise 2". Apparently this is the term for Web 2.0 applied to business. Social networking makes relationships between people visible and explicit, and Stephen argues this would help in business. However, it is not clear to me this will translate to all business or social cultures. Web 2.0 social networks seem to imply a very naive view of how social and business relationships work. Stephen argues that organisations can build up the trust needed to make social networking work in government. This seems to have elements of the matrix organisation about it. Stephen suggests that social networking tools can be used, with appropriate security and some short guidelines. It occurred to me that military personnel are trained to use social networks and so are more likely than other organisational staff to cope with the online equivalent.

However, this assumes that there will be appropriate reward mechanisms (such as pay) for those who contribute to the social network, and some way to detect and moderate the behavior of those who are unable or unwilling to play the game by the rules. Real world organisations have complex, overlapping, fluid groups. Even formal political parties have factions and, as when there is a conscience vote, someone can be in several different groups with conflicting aims simultaneously. Much the same behavior occurs at technical standards meetings. Online systems for running organisations need to take this into account.

Examples: NLA Wiki, AGIMO GovDev, Network of Public Sector Communications NZ.

Goldilocks and the three bears: a story about social computing in government by Matthew Hodgson, SMS Management & Technology

Matthew argued that folk taxonomies could be used by government agencies to better communicate with their clients. Tagging could be used as a bridge between the wording used by clients, via topic maps, to strictly structured taxonomies. He argued that systems used for records management, such as Tower Software's TRIM, are too rigid for many work purposes. Tagging examples he used were Technorati, flickr and Blogger. He argued a tag cloud could be used for reporting what client-relevant activities the organisation had undertaken.
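
The tag cloud idea can be sketched as simple frequency weighting; the tags here are invented for illustration:

```python
from collections import Counter

# Tags attached to (hypothetical) client-facing activities over a period.
tags = ["payments", "payments", "forms", "payments", "appeals", "forms"]

counts = Counter(tags)
max_count = max(counts.values())

# Weight each tag by relative frequency; a renderer would map the weight
# (which ranges up to 2.0 for the most common tag) to a font size.
cloud = {tag: 1.0 + n / max_count for tag, n in counts.items()}
print(cloud)
```

The most frequently used tag ends up displayed largest, giving a quick visual report of where the organisation's client activity was concentrated.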

At question time I asked if semi-automatically added tags could be used, with the same technology as used by search engines for understanding documents. Matthew replied this can be done, but the organisation has to have suitable tools. In one project the technology is being used to reformat information.

What I found most useful was an example web page which showed the formal taxonomic term at the top, a definition of the term, and the folksonomy tags at the bottom. In this way there could be a translation between the formal bureaucratic language and what is used in the real world.
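
Such a translation page could be backed by a simple mapping from folksonomy tags to formal taxonomy terms; the terms and tags below are hypothetical examples, not from any real agency's taxonomy:

```python
# Hypothetical formal taxonomy entries, each carrying a definition and the
# informal folksonomy tags that clients actually use for the concept.
FORMAL_TERMS = {
    "Income support payments": {
        "definition": "Regular payments made to eligible clients.",
        "tags": {"dole", "benefits", "welfare"},
    },
    "Taxation rulings": {
        "definition": "Formal interpretations of tax law.",
        "tags": {"tax", "ato", "rulings"},
    },
}

def find_formal_terms(user_tag):
    """Translate an informal client tag into matching formal taxonomy terms."""
    return [term for term, entry in FORMAL_TERMS.items()
            if user_tag.lower() in entry["tags"]]

print(find_formal_terms("Dole"))
```

A lookup on the informal tag "dole" would surface the formal term "Income support payments" along with its definition, which is essentially what the example page demonstrated.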

Web 2.0 Research

Also on Monday, Roger Clarke will argue at the ANU that Web 2.0 is a valid area for formal research. Given that the ANU is, in effect, the university for training the Australian Government, perhaps that research can include how to apply Web 2.0 social computing to government. This might be a way to extend government to more remote areas and make it relevant.


Monday, February 05, 2007

Corporate social networking with web 2.0?

The IT business media seem to be taking Web 2 seriously, so perhaps it is time to look at it. But there seem to be several concepts mixed up together (or perhaps "mashed up"?). Sorting this out may solve some problems in corporate document management and academic publishing.

One is the use of AJAX and similar technology to provide a more interactive interface via the web. Another is traditional office applications provided via the AJAX interface (such as word processors and spreadsheets). The third is on-line meeting places, such as MySpace.

There is also YouTube, a video sharing web site, which usually gets mentioned in the same articles but does not seem to have anything to do with social networking or corporate applications; it seems to be included just because it is popular.

Capitalizing on Interactivity, Mobility and Personalization by Donna Bogatin, January 22nd, 2007:

Is MySpace coming to the enterprise? According to Business Week it is.

On what does Steve Hamm base his assertion? IBM's announcement today of “Lotus Connections.”

IBM describes its offering as “the industry's first platform for business-grade social computing”:

Lotus Connections facilitates the gathering and exchange of information through professional networks, provides a dashboard-like view of current projects and connects users to like-minded communities. In addition, Lotus Connections removes the need for multiple social software applications, providing businesses with a single destination for building professional communities. ...

Corporate social networking is name of game with Lotus Connections, by Stan Beer, 24 January 2007:
While Microsoft has been trying to win Web 2.0 corporate hearts and minds with Sharepoint Server, IBM threatens to steal the show with a new corporate tested offering called Lotus Connections. Web 2.0 in the consumer space is all about social networking as exemplified by sites such as MySpace, YouTube and FaceBook. Users of these sites with common interests can network, share ideas and provide each other with information that builds upon their mutual knowledge base.

The idea of using more interactive web applications makes sense in the corporate environment, provided you have the bandwidth and processing power to do it and accept its limitations. In some ways this is a step back to centralized mainframe computing, with the web application running on the server. If the central application stops, no one can do any work. This would be a good way to go if you have a new application to introduce across a wide network.

The extreme case seems to be to run your corporate service on someone else's web server. Google have a service called "Google Apps for Your Domain" which provides online tools for email, instant messaging and shared calendar. The idea is that the same tools used for Google's Gmail and others are available for use by companies, educational institutions and other organisations. They use the Google system in place of their own in-house software.

Google are not charging for these services, but presumably are doing it to make people more familiar with Google's services which have advertising on them:

Google Apps for Your Domain lets you offer private-labeled email, instant messaging and calendar accounts to all of your users, so they can share ideas and work more effectively. These services are all unified by the start page, a unique, dynamic page where your users can preview their inboxes and calendars, browse content and links that you choose, search the web, and further customize the page to their liking. You can also design and publish web pages for your domain.

I remain a bit skeptical of online meeting places as a business tool. Any form of collaboration requires skills from the participants. Not everyone has these skills, and corporations will need to invest in training and staff to make them work. As well as cooperation, workplaces involve competition. Perhaps rather than a social network, an information market would be a better model for the online workplace. Also, much social networking takes place outside the organisation.

Are companies prepared to formalize and document online the process by which their staff trade information with other organisations? In many cases these contacts take place verbally and informally, while tacitly endorsed by superiors. If the contacts took place via a computer system, all transactions would be recorded and could be used in evidence in court. Many of these contacts would be considered unethical or illegal, limiting the scope for using a formal system.

What has this to do with corporate document management or academic publishing? Organisations, particularly governments, are having difficulty with staff filing electronic documents properly. Academia is having difficulty over the role of academic publishing. In both these cases the problem is that the records manager or librarian sees the document or publication as an end in itself.

But the office worker or academic author sees them just as part of a process; a by-product of doing some work or some research. By incorporating the social networking process in the system used to produce the document, keeping good records or publications becomes a natural by-product of the work. This is more than just an automated workflow which prompts you for some keywords before you can save a document.
