Wednesday, October 14, 2009

Filtering of Web Content by Internet Service Providers

The ACS has released the report "Technical Observations of ISP Based Filtering of the Internet" (12 October 2009). This discusses how well filtering of web content works, rather than the political or ethical question of whether it should be filtered. Some of those issues are discussed in my ABC Radio talk "Filtering Porn on the Internet: Imperfect by Necessity". The ACS report is 22 pages (300 kbytes of PDF), but the findings are summarised in the media release "ACS ISP Filtering Report: No Silver Bullet".
The ISP Filtering Report highlights the current challenges associated with filtering or blocking of internet content, which include:
  • Lack of a clear definition of the types of content that are subject to filtering
  • Limitations of automated techniques for analysing video, pictorial and audio content
  • Need for clear and consistent criteria behind labelling and rating of content
  • Where filters are placed within the network architecture, there is an impact on network performance (efficiency, speed etc)
  • Avoiding ‘over blocking’ and ‘under blocking’ and achieving consistency in blocking of material
  • The rate at which new Internet-accessible content is being generated makes it difficult to maintain up-to-date black lists, white lists, keywords and phrases etc used by analysis algorithms
  • Effectively managing user-generated material, which is created ‘on the fly.’ The labelling/rating of these sites and content is practically impossible; and
  • How to deal with encrypted traffic and secure channels, as encryption impedes filtering (see the sketch below). ...
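As an aside on that last point, here is a minimal sketch (mine, not the report's) of why encryption impedes URL filtering: an on-path filter can read the full URL of a plain HTTP request, but for HTTPS it typically sees only the server name from the TLS handshake, so a per-page blacklist degrades into per-site blocking. The URLs are hypothetical.

    # Sketch: what an on-path filter can observe for HTTP versus HTTPS.
    # Illustrative only; not a real interception tool.
    from urllib.parse import urlparse

    def visible_to_filter(url: str) -> str:
        parts = urlparse(url)
        if parts.scheme == "http":
            return url                   # full URL readable in cleartext
        if parts.scheme == "https":
            return parts.hostname or ""  # roughly what the TLS SNI field exposes
        return ""

    print(visible_to_filter("http://example.org/banned/page.html"))
    # -> http://example.org/banned/page.html
    print(visible_to_filter("https://example.org/banned/page.html"))
    # -> example.org  (the page path is encrypted)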
The Taskforce makes the following recommendations for progressing the debate and some issues surrounding ISP filtering. These recommendations aim to reduce the likelihood of inadvertent exposure to illegal content on the Internet:
  1. Multi-faceted approach using filtering technologies to address the distribution of illegal material - A multi-faceted approach is needed to address filtering out or blocking of illegal material on the Internet using filtering technologies at the ISP, user and enterprise levels. This includes increased professionalism and tighter controls around domain name registration, education at all levels of society and oversight by parents.
  2. Education and oversight are the best methods to ensure online safety for children - There is no technological substitute for appropriate education and parental supervision of young people who are using the Internet. Education and oversight remains the best method of ensuring that children (and other end users) are aware of online safety and are not deliberately viewing inappropriate material or engaging in inappropriate behaviour online.
  3. Objectives of any ISP filtering program should be clearly defined. Based on recommendations 1 and 2, the Taskforce believes the policy objective for filtering should be clearly articulated; for instance, whether it is:
    • to avoid inadvertent or unintended viewing of Refused Classification (RC) or illegal content while surfing the web;
    • to prevent, detect, block and prosecute delivery, access, publication or circulation of RC or illegal content;
    • to deter both inadvertent and deliberate interaction with a wider ambit of RC, illegal or prohibited material using any method of Internet access.

    In addition to clear objective(s), this program should also include: performance standards, clarity around the definition of material to be filtered, reporting processes, type of traffic and filtering mechanisms to be used.
  4. Development of minimum standards to measure filtering efficiency - Different filtering processes achieve varying results in terms of impacts on speeds, resource usage and accuracy of filtering (over blocking and under blocking). In mandating or regulating for ISP level filtering, the Federal Government should develop a set of minimum standards to be achieved, against which the efficacy of filtering can be measured (a small worked example follows the citation below).
  5. Planning for location of content filters - The Taskforce believes considerable thought needs to be given to location of filters within the ISP architecture (depending on the size, speed and level of redundancy) to avoid multiple filtering of feeds, filter failure which causes service disruptions and significant performance reduction due to filter operations.
  6. Implementing a national, voluntary content rating system - As part of any ISP filtering program, a national, content rating system could allow content providers to rate the material on their sites. Any rating scheme used should be standardised and easy to use so content developers can self-rate their content.
  7. Transparent guidelines and auditing process - The Government should establish clear, unambiguous guidelines on sites and material that will be included on the ACMA black list. In addition, there should be an independent and transparent auditing process for the black list and an ability for complaints about those sites included on the black list to be lodged and assessed in a timely manner.
  8. Ability to customise filtering levels - The Government should strongly encourage ISPs to provide products that allow users to select/customise their preferred level of filtering (above that which is mandatory).
  9. Education on protection and threats – As filtering is only one level of protection, the community needs to better understand the threats, computer and network vulnerabilities, how countermeasures work, and what individuals can do to protect their identities and their activities online. ...
From: ACS ISP Filtering Report: No Silver Bullet, Media Release, Australian Computer Society, 12 October 2009
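Recommendation 4's "over blocking" and "under blocking" are easy to make concrete. Below is a minimal sketch of how those rates might be measured, assuming a hand-labelled test sample; the blacklist and URLs are invented for illustration.

    # Sketch: measuring over-blocking and under-blocking of a naive
    # URL blacklist filter against a hand-labelled test set.
    BLACKLIST = {
        "http://banned.example.com/page",
        "http://illegal.example.net/",
        "http://mixedsite.example.org/",   # legitimate, but wrongly listed
    }

    def is_blocked(url: str) -> bool:
        """Naive exact-match blacklist lookup."""
        return url in BLACKLIST

    # (url, should_be_blocked) pairs from a hypothetical labelled sample.
    test_set = [
        ("http://banned.example.com/page", True),    # correctly listed
        ("http://illegal.example.net/", True),       # correctly listed
        ("http://innocent.example.org/", False),     # correctly left alone
        ("http://mixedsite.example.org/", False),    # over-blocked
        ("http://banned.example.com/other", True),   # same site, new page: missed
    ]

    over = sum(1 for url, bad in test_set if is_blocked(url) and not bad)
    under = sum(1 for url, bad in test_set if not is_blocked(url) and bad)
    legitimate = sum(1 for _, bad in test_set if not bad)
    banned = sum(1 for _, bad in test_set if bad)

    print(f"over-blocking rate:  {over / legitimate:.0%}")   # 1 of 2 -> 50%
    print(f"under-blocking rate: {under / banned:.0%}")      # 1 of 3 -> 33%

The exact-match lookup also illustrates why up-to-date lists are so hard to maintain: a new page on an already-listed site slips straight through.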
The table of contents from the report:
1 Introduction 1
2 Background 2
3 Key Issues Addressed by the Task Force 3
3.1 What Are the Goals/Objectives of ISP Filtering? 3
3.2 What Type of Content Should Be Filtered? 3
3.3 Where Should Filtering Occur in the Network Architecture? 4
3.4 What Type of Internet Services Should Be Filtered? 4
3.5 Nature of Internet Filtering 4
3.6 How Is Illegal Material Distributed? 5
3.7 What Are the Criteria Behind the Black List? 5
4 Technical Issues and Filtering Techniques 6
4.1 IP Blocking Using IP Packet Filtering/Blocking 6
4.2 Domain Name Server Poisoning 7
4.3 URL Blocking Using Proxies 8
4.4 Hybrid System 8
4.5 Content and Site Labelling Based Filtering 9
4.6 Other Methods of Content Control 9
5 Issues and Design Choices for Filtering 10
5.1 Content Classification Issues 10
5.2 Criteria Enforcement 10
5.3 What Traffic to Filter 11
5.4 Encrypted Traffic 11
5.5 Filtering and Network Architecture 12
5.6 Implementation Issues 13
5.7 Addressing P2P and BitTorrent 13
5.8 Circumventing Filters 13
5.9 Over Blocking and Under Blocking 14
6 Other Issues 15
6.1 Improved Control Over Domain Name Registration 15
6.2 ISP Filtering Trial 16
7 Awareness and Education of Users 16
8 The Way Forward 17
Task Force Members 18 ...

From: Technical Observations of ISP Based Filtering of the Internet, Australian Computer Society, 12 October 2009


Wednesday, October 08, 2008

Mobile Internet taking off with Younger Australians

Last night Scott Ewing from the ARC Centre of Excellence for Creative Industries and Innovation reported on a survey of Australians' use of the Internet. This is part of the World Internet Project (WIP), looking at Internet use over time and across countries. Some of the more interesting results are that 19% of Australians don't use the Internet, 94% of 18 to 24 year olds do, and of those 20% use the Internet on their mobile phones. The published report is available: CCi Digital Futures Report: The Internet in Australia 2008. Scott and his colleagues will be talking in Melbourne, Perth, Bunbury, Hobart, Adelaide, Sydney and other locations.


Thursday, September 18, 2008

What do Australians do online?

Scott Ewing from the World Internet Project (WIP) will speak on the social, cultural, political and economic impact of the Internet and other new technologies at free ACS talks around Australia in October and November 2008:
  • Canberra: 7 October 2008
  • Melbourne: 15 October 2008
  • Perth: 21 October 2008
  • Bunbury: 22 October 2008
  • Hobart: 28 October 2008
  • Adelaide: 29 October 2008
  • Sydney: 24 November 2008

ACS Branch Forum (Final EDxN for 2008)
The World Internet Project
What do Australians do online?

CCi Digital Futures is the Australian component of the World Internet Project (WIP), a collaborative, survey-based project looking at the social, cultural, political and economic impact of the Internet and other new technologies. Founded by the UCLA Center for the Digital Future in the United States in 1999 (now based at the USC Annenberg Center), the WIP is now approaching 25 partners in countries and regions all over the world.

The Internet is everywhere, at work, at home and on the move. If the Prime Minister's plans come to anything, it will soon be in every school. The underlying technologies are scarcely three decades old, and some of the most popular sites, such as YouTube and Facebook, are only a few years old, but this new world of information and communication is now, for many of us, an utterly everyday experience. What is equally remarkable is how little we really know about how the net is used, where and by whom.

Researchers are tackling these and other questions on several fronts. The answers will tell us a great deal about what sort of people Australians are becoming in the new era of networks. They will also tell us something about the real prospects for turning Australia into one of those new, desirable 'knowledge economies', based on innovation and creativity. What is the point of this sort of research? A global, long-run study of the net is useful for many people: for policy makers, for consumers, businesses and innovators. This kind of knowledge has another possible benefit, if it can help make what now seems strange a bit less scary. We could then spend a little less time worrying about what the net might do to us or our children, and some more time figuring out what it can achieve for us all.

Biography:

Scott Ewing


A Senior Research Fellow at Swinburne University of Technology's Institute for Social Research and at the ARC Centre of Excellence in Creative Industries and Innovation, Scott Ewing has fifteen years' experience as a social researcher, both at Swinburne and in the private sector. He currently manages the Australian component of the World Internet Project, a global survey of internet use and non-use, and his research interests include the social impact of new technologies and the role of economic evaluation in social policy. He has taught at both the undergraduate and postgraduate level and his research output includes a book, a book chapter, numerous monographs and reports, ten journal articles and many conference papers (both published and unpublished).


Wednesday, September 12, 2007

Government services via the web in regional Australia

At the 4th Annual Web Content Management for Government, Hyatt Hotel Canberra, on 17 September 2007 I will be talking about how to deliver government services via the web in regional Australia. This is one of a series on ICT for a Civil Society.

Several new wireless technologies are being introduced to regional areas of Australia. With a few small changes to their web sites, government agencies can optimise their service delivery over these new delivery chains. Smart phones are now readily available in agencies and companies, but are being used for little more than reading email. They can be effective tools for addressing rapidly emerging situations, such as an influenza pandemic. Australian governments are addressing critical issues in remote indigenous communities. Provision of government services, information and education via the web can supplement and support other delivery mechanisms.

  1. Optimising web sites for new wireless regional networks
  2. Smart phones for managing pandemics
  3. Services for remote indigenous communities online
  4. Using the web to reduce regional carbon emissions


Wednesday, August 29, 2007

W3C Australia Standards Symposium

W3C Australia held a one day Standards Symposium in Canberra on 28 August 2007, looking at where web standards are going. These are my informal notes from the event, not official minutes. The symposium was organised with NICTA, with OASIS, OGC and AGIMO also presenting.

World Wide Web Consortium Australia

The World Wide Web Consortium's Australian office (W3C Aus) is run by CSIRO in Canberra (on the other side of my office wall in the ANU Computer Science and Information Technology Building).

W3C issue what they call "recommendations", but which are really standards, for HTML, XML, CSS and other key web technologies. W3C was founded by Tim Berners-Lee, inventor of the web, in 1994. As with any standards work, there is a rich mix of political, technological and commercial forces at work.

A recent area of tension touched on in the introduction was the schism in the web community between HTML and XHTML. Those working on the next version of HTML (HTML 5) have clearly stated they want to go in a different direction from the work on the next XHTML (version 2).

There are also tensions over intellectual property in web recommendations. W3C aims to produce technology which can be freely used, without payment of royalties.

W3C wants to expand the web beyond desktop computers, to devices such as mobile phones. That is probably more a matter of commerce than technology, but the advent of new consumer smart phones may make a difference.

Typically the W3C process is to first hold a "workshop" in an area of interest; then, if justified, a working group is formed and publishes drafts for comment, implementations are produced to show the technology works, and after several more drafts a recommendation is released. Perhaps more importantly, W3C releases revisions and new versions of recommendations. Implementation guides and web tools are also provided to help with implementation.

As well as the more technical standards for HTML and CSS, W3C also produces guidelines, such as those for web accessibility. There are dozens of working groups working on interrelated recommendations who need to coordinate their work. W3C membership costs money and working group members contribute their time for free.

W3C Australia head, Ross Ackland, claimed the future of the web lay in three areas: the semantic web, the mobile web and the sensor web. He suggested we were in the middle of a ten year adoption of the mobile web, with the semantic web further in the future and the sensor web a newly emerging technology CSIRO would like to foster.

The semantic web tries to make a web which machines can understand. Ross argued that Web 2.0 and mashups were a "grass roots" ad-hoc approach to what the semantic web was attempting. My view is that Web 2.0 and mashups are providing useful services, while the semantic web is a failure which should be abandoned.

The W3C Mobile Web Initiative in 2005 got the attention of the mobile phone industry. But the industry has made several attempts at turning the mobile phone into a viable mobile web device. Its attempt with WAP was a failure costing billions of dollars. W3C's own attempt, XHTML Basic, has had limited success. About the only successful one was Japan's iMode, which uses a version of HTML which the W3C rejected.

The Sensor Web will provide some standards for sensor access in the future:
The Sensor Web is a type of sensor network or geographic information system (GIS) that is especially well suited for environmental monitoring and control. The term describes a specific type of sensor network: an amorphous network of spatially distributed sensor platforms (pods) that wirelessly communicate with each other. This amorphous architecture is unique since it is both synchronous and router-free, making it distinct from the more typical TCP/IP-like network schemes. The architecture allows every pod to know what is going on with every other pod throughout the Sensor Web at each measurement cycle.

From: Sensor Web, Wikipedia, 21:20, 26 July 2007
CSIRO have a sensor web in Brisbane which can be accessed via web services:

This server contains test deployments of the Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) services. ... getCapabilities ... data from the sensors deployed by the Autonomous Systems Laboratory in Brisbane, Australia. The sensors measure temperature, soil moisture and onboard diagnostics at three locations, qcat, belmont and gatton. There are roughly 125 stations with two or three sensors each. This yields over 250 data sources of which about 150 appear to be active. Each source reports every few minutes with data coming in every few seconds. ...

From: CSIRO ICT Centre SWE Web Services, CSIRO ICT Centre, 20 April 2007
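To make "getCapabilities" concrete: an SWE client starts by fetching the service's capabilities document, which lists the available observation offerings, over plain HTTP. A minimal Python sketch, with a placeholder endpoint rather than the actual CSIRO server address:

    # Sketch: fetching the capabilities document from an OGC Sensor
    # Observation Service. The endpoint URL is a hypothetical placeholder.
    from urllib.request import urlopen
    from urllib.parse import urlencode

    SOS_ENDPOINT = "http://sensorweb.example.org/sos"  # not the real server

    params = urlencode({"service": "SOS", "request": "GetCapabilities"})
    with urlopen(f"{SOS_ENDPOINT}?{params}") as response:
        capabilities_xml = response.read().decode("utf-8")

    # The response is an XML document describing the service's observation
    # offerings, e.g. temperature and soil moisture sensors at each station.
    print(capabilities_xml[:200])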

Ross ended by asking what Australia could do for web standards. He pointed out that successful standards also need market adoption. Standards take about five years to develop and the benefits are global, so how does Australia contribute? An example is standards for water data, to help with conservation in Australia and worldwide.

Open Geospatial Consortium

OGC develops "specifications" for digital maps. The aim is to be able to knit together different online mapping services to produce a coherent view for the user. OGC works with W3C groups, ISO (the ISO 191xx series, including ISO 19115 for metadata), OASIS (such as the Common Alerting Protocol (CAP) for emergency messages) and IEEE (the Sensor Model Language, SensorML).

OGC sponsors scenarios to test implementation of standards (much like the Coalition Warrior Interoperability Demonstration (CWID) for military IT). OWS 4 in December 2006 worked on Sensor Web Enablement (SWE), geo-processing workflow (GPN) and geo-decision support. OWS 5 for 2007 is being planned.

One thing which got my attention was mention of "Social Change On-line".

At question time there was a philosophical discussion of what a standard is, its benefits, disadvantages and processes. This was entertaining but not very enlightening. Perhaps there is a need for some courses on what standards are and how they are created.

Organisation for the Advancement of Structured Information Standards

Organisation for the Advancement of Structured Information Standards (OASIS) was founded in 1993 for SGML related standards (more recently XML standards). It has more than 60 technical committees. Individuals and organisations can join. A well known OASIS standard is ODF, based on the OpenOffice.org office document format. OASIS produces horizontal standards (general purpose technology) and vertical standards (for a particular business function). Other standards are the Universal Business Language (UBL), Customer Information Quality (CIQ) for identifying locations, organisations and people, and the Common Alerting Protocol (CAP) for emergency messages.

Semantic Web

W3C's Semantic Web is about making information on the web processable by machines. Current work is on an English-like version of the Web Ontology Language (OWL). This reminds me of the attempt with COBOL to create an English-like computer programming language which could be understood by non technical business people. The result was a verbose language which was still unintelligible to business people and cumbersome for trained computer programmers.

SPARQL is the semantic web query language, POWDER is the Protocol for Web Description Resources, and GRDDL is Gleaning Resource Descriptions from Dialects of Languages.
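For all the forbidding acronyms, a SPARQL query is fairly plain once you see one. A minimal sketch using the third-party Python rdflib library, over a few triples invented for illustration:

    # Sketch: a SPARQL query over a toy RDF graph, using rdflib.
    # The data and URIs are invented for illustration.
    from rdflib import Graph

    g = Graph()
    g.parse(data="""
        @prefix ex: <http://example.org/> .
        ex:report ex:publishedBy ex:ACS ;
                  ex:topic       "ISP filtering" .
        ex:talk   ex:publishedBy ex:ABC ;
                  ex:topic       "Internet content" .
    """, format="turtle")

    # Find everything published by the (hypothetical) ACS, and its topic.
    results = g.query("""
        PREFIX ex: <http://example.org/>
        SELECT ?thing ?topic WHERE {
            ?thing ex:publishedBy ex:ACS ;
                   ex:topic       ?topic .
        }
    """)
    for thing, topic in results:
        print(thing, topic)  # -> http://example.org/report ISP filtering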

This was the least useful session of the day. The Semantic Web may well turn out to be very useful one day, but so far all that appears to have been produced is a bewildering array of unintelligible standards. About the only prospect of any of this work ever being of use would be to apply the process Tim Berners-Lee used to create the web, where he took a large and complex standard (SGML) and trimmed it down to the essentials to make HTML.

Geoscience Australia

Chris Body presented on standards at Geoscience Australia. GA seem to have suddenly become more visible, with work on geospatial standards and tsunami warnings. The Special Minister of State, Gary Nairn, announced an Australian Spatial Consortium (ASC) on 14 August 2007, but it was not clear to me what this is.

ANZLIC (the Spatial Information Council) has provided the ANZLIC Metadata Profile (December 2006), based on the ISO TC211 framework. GeoNetwork is a metadata entry tool endorsed by Australian agencies in August 2007.

Geoscience people have a preference for formal international standards. However, GA is aiming to have any Australian contributions available free for public use under a Creative Commons licence.

Australian Government Information Management Office

Brian Stonebridge from AGIMO is working on a standards governance framework. Brian argued that standards are boring to end users: there has to be some value to the user to get them interested. Brian's presentation was the most impressive of the day, because he was talking about how the standards could be used for the benefit of the community, and he actually used the technology he was talking about to make the presentation, via AGIMO's GovDex:
GovDex is a resource developed by government agencies to facilitate business process collaboration across policy portfolios (eg. Taxation, Human Services etc.) and administrative jurisdictions i.e. federal, state or local government levels. ...

From: Welcome to GovDex, Australian Government Information Management Office, 2007
Brian mentioned that some of the work is being done online via the system, with the French government.

Brian estimated that development of standards for government use will cost about $2M a year to administer. This is not the development of new technical standards from scratch, but selecting and profiling standards for a particular application (such as selecting e-document formats for an electronic application for building a house).

AGIMO have developed a plugin for Enterprise Architect for government standards.

AGIMO will use underlying international and national standards, and over these will layer methods and tools, governance and reference models. The business case is that this will reduce costs over time.

Unfortunately Brian then lost me in an assortment of acronyms, including:
  • GIEM, Government Information Exchange Methodology (UMM v2.0 and CCTS v2.0). This extends the Canadian GSRM and is similar to the upper layers of AGA.
  • AGOSP: Australian Government Online Services Portal.
Also NICTA launched a three-year research initiative in eGovernment in January 2007, but it is not clear what this is intended to achieve.

Overview of the day


Ross Ackland argued that we were now "moving up the stack": the low level standards for digital communications using the Internet are set and largely working. The web provides a digital publishing overlay for this. Now more semantic content is being added to the web, with standards in specific areas such as geoscience and more general areas such as the Semantic Web. This is a useful way to think about the work, but the reality I see is not such a clear or systematic path.

Ross asked what W3C and other bodies should do to further standards in Australia. W3C has only a few full members in Australia, due to the small size of the IT industry.

I suggested that NICTA, CSIRO and other interested parties could create a one hour presentation explaining how standards development works in Australia. This could be placed on the web and offered to ACS and other IT groups to explain where standards come from and how they could get involved. This may help avoid some of the controversy and confusion surrounding issues such as the proposed adoption of Microsoft's OOXML format as an ISO standard.

One way to look at this, which Ross pointed out, is that the point of view from which systems are built will change: instead of building an application for an organisation and then trying to interface it to other organisations, we will build the interfaces first. From a wider perspective, I suggested that the web standards effort could be seen as building a global computer system for processing information, much as the Internet is a global system for communicating information.

Some Overall Issues on the Day

* WHERE IS ASIA?: Several speakers talked of how the standards committees were heavily influenced by US government agencies (particularly the military and security) and less so by European organisations. There appears to be little involvement by Asian organisations. There appeared to be a lack of interest in why this is so, the problems it will cause and what to do about it. Australia is culturally close to the USA and Europe and so can ride on the coat tails of the current standards process. However, at some point Asian countries and industries may decide their interests are not being served by the current standards process and decide to set up a new process for standards. Perhaps Australia can play a part in bridging the gap. This could address cultural and geopolitical issues using the web technology itself.

* USING THE STANDARDS: Many groups are producing advanced web standards, and some Internet and web tools are being used by committees. But the output of the standards committees is PDF documents or web pages. It might be useful for the web standards groups to apply some of the technology they are proposing to the standards process itself.

* USING STANDARDS: Perhaps one area in which Australia can contribute is helping to test and implement standards. This would provide useful feedback to the standards developers and also produce potentially useful products.

* AUSTRALIAN DEVELOPMENT THROUGH STANDARDS: The most productive part of the day was meeting David Peterson from Boab Interactive. This Australian IT company is the latest member of W3C Australia. They are based in Townsville, North Queensland, doing web work mostly with tropical environment research projects. Some years ago the Australian government funded me to see how to get regional ICT happening.


Monday, February 05, 2007

Corporate social networking with web 2.0?

The IT business media seem to be taking Web 2.0 seriously, so perhaps it is time to look at it. But there seem to be several concepts mixed up together (or perhaps "mashed up"?). Sorting this out may solve some problems in corporate document management and academic publishing.

One is the use of AJAX and similar technology to provide a more interactive interface via the web. Another is traditional office applications provided via the AJAX interface (such as word processors and spreadsheets). The third is on-line meeting places, such as MySpace.

There is also YouTube, a video sharing web site, which usually gets mentioned in the same articles but does not seem to have anything to do with social networking or corporate applications; it seems to get included just because it is popular.

Capitalizing on Interactivity, Mobility and Personalization by Donna Bogatin, January 22nd, 2007:

Is MySpace coming to the enterprise? According to Business Week it is.

On what does Steve Hamm base his assertion? IBM's announcement today of “Lotus Connections.”

IBM describes its offering as “the industry's first platform for business-grade social computing”:

Lotus Connections facilitates the gathering and exchange of information through professional networks, provides a dashboard-like view of current projects and connects users to like-minded communities. In addition, Lotus Connections removes the need for multiple social software applications, providing businesses with a single destination for building professional communities. ...
Corporate social networking is name of game with Lotus Connections, by Stan Beer, 24 January 2007:
While Microsoft has been trying to win Web 2.0 corporate hearts and minds with Sharepoint Server, IBM threatens to steal the show with a new corporate tested offering called Lotus Connections. Web 2.0 in the consumer space is all about social networking as exemplified by sites such as MySpace, YouTube and FaceBook. Users of these sites with common interests can network, share ideas and provide each other with information that builds upon their mutual knowledge base.
The idea of using more interactive web applications makes sense in the corporate environment, provided you have the bandwidth and processing power to do it and accept its limitations. In some ways this is a step back to centralized mainframe computing, with the web application running on the server. If the central application stops, no one can do any work. This would be a good way to go if you have a new application to introduce across a wide network.

The extreme case seems to be to run your corporate service on someone else's web server. Google have a service called "Google Apps for Your Domain" which provides online tools for email, instant messaging and shared calendar. The idea is that the same tools used for Google's Gmail and others are available for use by companies, educational institutions and other organisations. They use the Google system in place of their own in-house software.

Google are not charging for these services, but presumably are doing it to make people more familiar with Google's services which have advertising on them:

Google Apps for Your Domain lets you offer private-labeled email, instant messaging and calendar accounts to all of your users, so they can share ideas and work more effectively. These services are all unified by the start page, a unique, dynamic page where your users can preview their inboxes and calendars, browse content and links that you choose, search the web, and further customize the page to their liking. You can also design and publish web pages for your domain.
I remain a bit sceptical of online meeting places as a business tool. Any form of collaboration requires skills from the participants. Not everyone has these skills and corporations will need to invest in training and staff to make them work. As well as cooperation, workplaces involve competition. Perhaps rather than a social network, an information market would be a better model for the online workplace. Also, much social networking takes place outside the organisation.

Are companies prepared to formalise and document online the process by which their staff trade information with other organisations? In many cases these contacts take place verbally and informally, while tacitly endorsed by superiors. If the contacts took place via a computer system, all transactions would be recorded and could be used in evidence in court. Many of these contacts would be considered unethical or illegal, limiting the scope for using a formal system.

What has this to do with corporate document management or academic publishing? Organisations, particularly governments, are having difficulty getting staff to file electronic documents properly. Academia is having difficulty over the role of academic publishing. In both cases the problem is that the records manager or librarian sees the document or publication as an end in itself.

But the office worker or academic author sees them just as part of a process: a byproduct of doing some work or some research. By incorporating the social network process in the system used to produce the document, keeping good records or publications becomes a natural byproduct of the work. This is more than just an automated workflow which prompts you for some keywords before you can save a document.
