Tuesday, February 23, 2010

Transportable Data Centre for Broadcaster and Bulldozer Company

Broadcaster Seven Network Limited has proposed merging with the WesTrac machinery company to form Seven Group Holdings Limited. A TV broadcaster might not seem to have much in common with a company which sells and repairs Caterpillar brand bulldozers, but late last year IBM announced it was building a "Portable Modular Data Center" for WesTrac. This is housed in two modified shipping containers, has its own generator, and could be very useful for a broadcaster.


Friday, December 18, 2009

IBM Portable Modular Data Center for WA

IBM have announced in "WesTrac Selects IBM's Portable Modular Data Center" that West Australian bulldozer company WesTrac is buying an IBM Portable Modular Data Center (PMDC). This will be made up of two modified shipping containers, with IBM "Rear Door Heat Exchanger", uninterruptible power supply (UPS), batteries, chiller unit and a 400kVA generator.

The IBM Rear Door Heat Exchanger replaces the usual perforated metal door of a standard equipment rack with a water cooled unit. The hot air from the back of the equipment is cooled as it leaves the rack. This contrasts with the approach of APC and other vendors, who cool the hot air in the aisle behind the rack. The IBM approach would increase the complexity of the installation, with plumbing full of cold water on the moving door of each equipment rack.
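As a rough check on what such a water cooled door has to do, the standard heat balance Q = m·cp·ΔT gives the coolant flow needed for a given rack load. A minimal sketch, with assumed illustrative figures (a 30 kW rack and a 10 K water temperature rise, not IBM specifications):

```python
# Back-of-envelope check: water flow needed to remove a rack's heat load
# via a rear door heat exchanger. The rack load and temperature rise are
# illustrative assumptions, not vendor specifications.

CP_WATER = 4186.0  # specific heat of water, J/(kg.K)

def water_flow_l_per_min(heat_load_w, delta_t_k):
    """Flow (litres/min, since 1 kg of water is ~1 litre) to absorb
    heat_load_w with a delta_t_k rise in coolant temperature: m = Q / (cp * dT)."""
    kg_per_s = heat_load_w / (CP_WATER * delta_t_k)
    return kg_per_s * 60.0

# A 30 kW rack with a 10 K water temperature rise across the door:
print(round(water_flow_l_per_min(30_000, 10), 1), "L/min")  # ~43.0
```

About 43 litres a minute per rack, which is why cold water plumbing on a moving door is a non-trivial piece of engineering.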

The data centre is claimed to be "portable", but IBM also talks of a concrete slab on which the containers, generator and chiller will be installed. It is not clear how the system could be easily portable if it needs a concrete slab to be laid. A system using screw piles with twistlocks, which attach to the standard ISO container connectors, would seem to make more sense.

One point not made clear is why WesTrac would need such a large portable data centre. All of the data processing for a modern medium sized company would fit in a couple of equipment racks about the size of a filing cabinet. If the equipment is intended to support customers online, then there is no need for it to be portable, or to be located in a remote area, as it could just as easily be located anywhere in the world with Internet access. It is difficult to see the need for this much data centre capacity in an isolated location not connected to the Internet.

SYDNEY, Australia - 17 Dec 2009: IBM (NYSE: IBM) today announced that WesTrac Pty Ltd, an industrial machinery supplier headquartered in Perth, has selected IBM to design and implement a Portable Modular Data Center (PMDC) solution to provide the company with a flexible, cost-effective data centre to meet its immediate business needs as well as support future IT growth.

Faced with the need for additional data centre capacity fuelled by a major IT project and unable to secure more space in its own data centre or through traditional co-location with data centre operators in Perth, WesTrac turned to IBM. With tight project deadlines, WesTrac selected IBM's PMDC as the right solution offering a compact, fully functional, high-density and highly protected data centre, housed within two 6.1 metre customised shipping containers. The IBM solution, due for completion in February, will allow WesTrac to avoid the cost, time and space associated with building a new facility.

Further:

"After assessing solutions from other vendors, WesTrac is pleased to select IBM to implement a scalable, flexible and portable data centre facility," said Mark Curtis, Communications Infrastructure Manager, WesTrac.

"This agreement provides us with a complete solution and, most importantly, enables all IT equipment to be easily serviced and maintained from within a closed, physically secure and environmentally tight container. All managed and delivered by IBM, WesTrac will benefit from temporary hosting during transitioning stages, project financing, and ultimately, permanent IT accommodation."

"IBM is delighted to work with WesTrac to design and deliver a PMDC solution to provide them with a quickly delivered, cost-effective and flexible data centre alternative," said David Yip, Site and Facilities Services Business Executive, IBM Australia. "The PMDC offering, part of the IBM Data Center Family of modular solutions, is designed as a flexible option for companies requiring remote or temporary data centre capacity to support their business growth."

WesTrac's PMDC solution will consist of two containers: one purpose built for IT equipment, using IBM Rear Door Heat Exchanger cooling doors and overhead cooling for the most efficient cooling solution; and the other for services infrastructure, including uninterruptible power supply (UPS) and batteries, chiller unit, cooling fan coils, electrical and mechanical distribution gear and a configured 400kVA engine generator.

Further, IBM will also purpose-build a concrete slab on which the PMDC containers, generator and second chiller unit will be installed. An early warning fire detection system, fire suppression system, fingerprint access system and video surveillance provide the required security for the solution.

The agreement was signed in December 2009.

About WesTrac

WesTrac is one of the largest Caterpillar dealerships in the world, servicing the territories of Western Australia, New South Wales, The Australian Capital Territory and Northern China. Established in 1989, WesTrac® is a wholly owned subsidiary company of Australian Capital Equity, which is owned by Kerry Stokes. WesTrac offers total support for customers at every stage of their Equipment Management Cycle. The comprehensive solution offers a wide choice of equipment options, parts, servicing and maintenance support, that is amongst the best in the industry. ...

From: "WesTrac Selects IBM's Portable Modular Data Center", Media Release, IBM, 17 Dec 2009


Wednesday, August 19, 2009

Datapod Shipping Container Sized Modular Data Center Components

Canberra based company Datapod are offering a system of shipping container sized modules for quickly assembling a data center. This differs from the IBM and Sun Microsystems containerised data centres and has advantages over them. By allowing new, more efficient data centres to be built quickly, this technology could help reduce computer energy use.

Two significant differences in the Datapod system are that it does not use a custom cooling system, nor standard shipping containers. Datapod use APC's hot aisle technology (as used by Canberra Data Centres), with two rows of back to back racks in a module. This allows for the easy installation of equipment, with readily available components. Local technicians will be familiar with this system and be able to support and expand it.

Instead of standard welded steel shipping containers, lightweight insulated removable panels are used for the walls of the Datapod modules. This allows the sides to be opened up for access. It does limit the shipping and placement options compared to the systems from other vendors, but this should not be a problem in practice.

In theory, the IBM and Sun shipping container data centres could be stacked with other cargo and transported on the deck of a container ship. They could also be installed outdoors, relying on the weatherproof container to provide protection. Such systems are favoured for use by the military in harsh conditions.

However, it is unlikely a container full of millions of dollars of computer equipment is going to receive rough handling during transport, or be operated outdoors. Even the military are likely to transport the containers within a ship, such as the new Joint High Speed Vessel (JHSV), not on the deck.

It is very unlikely that a company or government agency is going to simply dump data center containers in their car park and wire them up. Instead a building will be built to house the equipment. The building need be little more than a shed, to protect the equipment from the elements and provide physical security. Standard modularised building components can be used to erect such a building quickly and cheaply.


Wednesday, May 13, 2009

Green Data Centre In Canberra

Yesterday I had a tour of Canberra Data Centres' new facility in Canberra. This is a commercial data center set up to host government and non-government computer systems. It is located in a converted warehouse in an industrial park. Apart from the emphasis on green computing and security, what is most impressive about this facility is that it is open for business and servicing clients already, unlike proposals such as that for Canberra Technology City (CTC).

The facility uses APC's "Hot Aisle" system. Two rows of computer racks are placed back to back, with a polycarbonate roof and doors at either end enclosing the hot air. Coolers are placed at intervals in the racks, drawing in the hot air, cooling it and supplying it to the front of the racks. The coolers are supplied with chilled water from a central plant. The result is that the cooling is supplied to where it is needed, making the system more efficient and more flexible.

The APC system has all power, data and the cooling supplied from above. There is no need for a false floor. Pods can be devoted to a particular client and even isolated with a wall where security requires. Smaller clients rent racks in a shared pod.

The central chiller plant has multiple units and an insulated tank to hold a supply of cold water. This allows the load on the chillers to be balanced and a backup supply of chilled water if the units have to be shut down (or mains power is lost).

The APC pods have battery backup, to keep servers running until the multiple diesel generators start to supply power. But this full backup power is expensive. I suggested to CDC that there would be scope for using the resiliency features of the web and the power saving of modern servers to provide clients with a lower cost option. Web servers could be programmed to reduce their power use during a mains failure, by lowering their serving rate. The customers using this option could be charged a lower rental rate, as they would be making less use of the backup power. Those customers with an alternate server at another location could rely on that server taking the load. Otherwise, a well designed web application would automatically provide the essential information (such as the text) and delay delivering non-essential information (such as graphics).
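As a sketch of how such a lower cost option might work, here is a minimal web server that checks a placeholder UPS status and, when on battery, throttles its responses and serves the essential text only. The on_battery() check and the page contents are assumptions for illustration, not anything CDC actually runs:

```python
# Sketch of the "degrade during a mains failure" idea: serve full pages
# normally, but drop graphics and lower the serving rate on backup power.
# on_battery() is a placeholder; a real deployment might poll the UPS
# over SNMP or a vendor interface.
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

def on_battery():
    return False  # placeholder: query the UPS status here

class PowerAwareHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if on_battery():
            time.sleep(0.5)  # lower the serving rate to cut CPU load
            body = b"<html><body><p>Essential text only.</p></body></html>"
        else:
            body = b"<html><body><p>Full page</p><img src='/logo.png'></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), PowerAwareHandler).serve_forever()
```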

The CDC facility provides an alternative to companies and government agencies building their own facilities. All but the largest agencies would have difficulty meeting the stringent requirements for such facilities and the increasingly stringent additional environmental requirements. The power to the CDC's pods is separately metered, allowing each customer to be charged for the electricity used (as well as for cooling). This would assist with carbon emission reporting and also show the power savings.
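A minimal sketch of the carbon reporting that per-pod metering enables. The emissions factor and cooling overhead are assumptions (substitute the figures your reporting scheme mandates):

```python
# Illustrative carbon reporting from per-pod metering. The emissions
# factor and cooling overhead below are assumptions, not CDC figures.
EMISSIONS_FACTOR = 1.0  # kg CO2-e per kWh, assumed grid factor

def monthly_emissions(kwh_it, cooling_overhead=0.8):
    """Metered IT energy plus an assumed cooling overhead, as CO2-e."""
    total_kwh = kwh_it * (1 + cooling_overhead)
    return total_kwh * EMISSIONS_FACTOR

print(monthly_emissions(10_000))  # 10 MWh of IT load -> 18,000 kg CO2-e
```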

Government agencies can rationalise their computing equipment by placing it in such a facility. But it would be unfortunate if they simply took a lot of old inefficient equipment and put it in a new centre. Agencies need to look at rationalising the number of servers they use and the efficiency of their existing equipment.

A more difficult task would then be for the Government to rationalise IT use between agencies. There seems no good reason why dozens of agencies run their own web servers, records management, financial and human resource systems. As these systems become server applications with web user interfaces, there is increased scope for rationalisation. If government systems were rationalised in this way, only a few data centres the size of the CDC facility would be needed to service all of the federal government's requirements.


Monday, April 13, 2009

Google battery backed shipping container server

According to "Google uncloaks once-secret server" (Stephen Shankland, CNET, 1 April 2009), Google uses a 12 Volt battery to back up each of its servers. Each server has two disk drives and two CPUs in a 3.5 inch high unit, and 1,160 of them are put in a shipping container. What is not clear is whether Google actually puts these in industry standard racks, or in some lower cost mounts.

The batteries used seem to be gel sealed lead acid units. Presumably these are designed to last the life of the server and not be replaced individually.

The Google server looks logically designed. My only slight worry is that the unit pictured looks like a DIY prototype, not a finished product of which tens of thousands are made. The article is dated 1 April and Google have previously produced April Fools' Day jokes. But there seem to be other independent postings reporting on the design as well, from a data centre efficiency summit.


Friday, July 18, 2008

Palletized Computer Data Warehouse

In 2003 I suggested to the Chinese government they could build a palletized data warehouse. This had started out as a joke. But given that major computer vendors have come out with containerized data centers, it might be time to revive the idea.

Palletized Data Warehouse (PDW)

The PDW would combine the space saving features of rack mounted computers with the low cost of industrial pallet equipment. Rack mounted equipment would be fixed to standard ISO pallets. These pallets would then be stacked in a warehouse, using a fork lift truck.

Webbing straps, as used in deployable military command centres would be used to fix the equipment to pallets. This would allow standard racks to be used and provide some flexibility, to allow for vibration during transport.

The modules would be assembled and tested, before being shipped to the site and plugged in. The modules would be sized to be compatible with standard industrial pallet handling equipment for ease of transport. Small vans could be used for transport, along with aircraft. The pallets could be loaded into standard ISO shipping containers for long distance transport. Individual pallets could be moved by one person with a simple hand cart and fit through a standard door and into a passenger lift.

A low cost industrial pallet rack warehouse could be used as a data centre. Equipment modules would be tested at ground level, then stacked 15m high into standard pallet racks, using fork lift trucks. Lighting and air conditioning would be hung from the ceiling, with cabling snaking down the racks, using standard industrial fittings. There would be no expensive false floor, or office quality fittings, just a sealed concrete floor. Heavy air and power conditioning equipment would be pallet mounted at ground level for fast installation and maintenance.

Staff would wear overalls and hard hats, and be trained to use safety harnesses when servicing the elevated equipment. The open design would allow for easy re-cabling and service. For any major service work, a module would be removed from the pallet rack using a fork lift truck and returned to the ground level maintenance area.

The temperature in the building would be allowed to fluctuate more than in a traditional data centre, to reduce air conditioning costs. The open design of the building would allow good air circulation for cooling. In many locations the ambient temperature would be sufficient to cool the building most of the year, with just fans needed, not air conditioning, nor complex fluid based cooling systems.
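To make the "most of the year" claim testable, a site's free cooling potential can be estimated by counting the hours when outside air is below a threshold. A minimal sketch with a toy temperature series (real hourly observations for the site would be used in practice):

```python
# Rough free cooling estimate: the fraction of hours in a year when
# outside air alone (fans only) would do. The sinusoidal toy year and
# the 22 C threshold are assumptions for illustration.
import math

def free_cooling_fraction(hourly_temps_c, threshold_c=22.0):
    """Fraction of hours at or below the threshold temperature."""
    ok = sum(1 for t in hourly_temps_c if t <= threshold_c)
    return ok / len(hourly_temps_c)

# Toy year: 15 C mean with a 10 C sinusoidal swing (5 C to 25 C).
temps = [15 + 10 * math.sin(2 * math.pi * h / 8760) for h in range(8760)]
print(f"{free_cooling_fraction(temps):.0%}")  # ~75% of hours in this toy data
```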

The palletized data warehouse would use much less floor space than a conventional data center and be quick to build using standardized prefabricated warehouse building modules. The data center could be finished on the outside to blend in with office buildings, or with inexpensive steel cladding in an industrial park. It would also be easier to service and take less space than an ISO containerized data center.


Data centre in a shipping container from Sun, IBM and HP

Sun, IBM and now HP offer a data centre in a shipping container. But these are mostly marketing gimmicks. The companies offer to install rack mounted servers, disk drives and cooling in a standard steel 40 foot ISO shipping container. The idea is that this makes it easy for a company to add computing power: just take delivery of the shipping container and plug it in. But apart from the military, who are used to containerized equipment, it is not easy to integrate a truck sized box of electronics into your organization.
  1. HP Performance-Optimized Datacenter (Pod)
  2. IBM Portable Modular Data Center
  3. Sun Blackbox
The computer maker can configure the hardware, connect all the cables, close the doors and ship the box to the customer. The customer then just needs to open the doors, plug the box in and switch it on. But in reality, it is not quite this simple:
  1. Cooling: Densely packed rack mounted equipment is difficult to keep cool, and placing it in a cramped metal box makes this worse. Rack mounted equipment is usually designed to draw in cool air at the front and exhaust hot air out the back. This assumes there is an aisle at the front and back for the air to circulate, a false floor underneath for the cool air to be delivered, and space above the cabinets to carry the hot air away. An ISO shipping container is too small for this and most of the designs use only one aisle down the middle, with racks up against the sides of the container. Photos of the Sun system show what appear to be very large cooling air ducts coming out of the front, which have to be ducted somewhere (see the airflow sketch after this list). Other units show doors in non standard places and lots of cables coming out of holes in the containers.
  2. Maintenance: The aisles at the front and back of racks not only allow air to circulate, they also provide space for maintenance workers to exchange equipment and run cables (there are a lot of cables in a data center). The width of an ISO container only allows for one narrow aisle, making maintenance difficult.
  3. Delivery: Rack mounted cabinets are designed to fit in the back of a small truck or plane. There are trucks with special suspension designed to carry sensitive computer equipment. Only a few specialist cargo aircraft are large enough to carry an ISO container, so the boxes would have to travel long distances by sea, road or rail. The sea, road and rail transport systems designed to handle ISO shipping containers are not intended for delicate equipment and do not protect containers from the elements. The data center would need to be very well sealed for transport to prevent water damage, and be sturdy enough to prevent damage from vibration, knocks and being tilted. The containers need to have enough room in them for staff to install and maintain the equipment, so about one third to one half of each container is empty, resulting in increased shipping costs.
  4. Installation: Rack mounted cabinets are designed to fit through a space about the size and shape of a standing person, so they can be pushed through a normal doorway and into a passenger lift, using a simple handcart. The equipment is therefore compatible with office buildings. In contrast, shipping containers require a very large fork lift truck to move them and will not fit in an ordinary office building. They would need a specially designed warehouse-like building or annex to a building. ISO shipping containers are designed to be weatherproof, but setting up a datacenter outdoors would require all of the conduits to be carefully sealed and would make maintenance very difficult, as contaminants would enter every time a door was opened. There have been many modular building systems based on ISO containers which have failed due to leaks. Having a container crammed with sensitive electrical equipment in a leaky steel box would be disastrous. Also, the average corporation does not want something which looks like a container wharf or an electricity substation next to their office building. The plan for a major data center in Canberra is in jeopardy due to opposition to the collocated power station. A containerized data center is likely to draw planning objections.
  5. Safety: Data center equipment is designed to be maintained with the power switched on. Staff need to be able to replace one computer in a rack while the rest of the equipment keeps working. Working in a cramped metal box will be far less safe than in a traditional data center. There will be less room for the staff to work and the walls will form one sealed, electrically conductive box. Noise from the equipment is likely to be higher than in a normal room. As the box is designed to be sealed, it will need to have vents added to allow for fire fighting. If inert gas fire fighting is used, it will be deadlier than in a conventional room and there will be fewer escape exits. Staff may have less than a minute to escape before being killed by the fire suppression system.
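To see why cooling dominates these designs, the heat balance Q = ρ·V̇·cp·ΔT gives the sheer volume of air a container has to move. A minimal sketch with assumed illustrative figures (200 kW of IT load and a 12 K front to back temperature rise, not any vendor's numbers):

```python
# Why cooling dominates the container design: the volume of air needed
# to carry away the IT heat load. The load and temperature rise are
# illustrative assumptions.
RHO_AIR = 1.2    # density of air, kg/m^3
CP_AIR = 1005.0  # specific heat of air, J/(kg.K)

def airflow_m3_per_s(heat_load_w, delta_t_k):
    """Volumetric airflow to absorb heat_load_w with a delta_t_k air
    temperature rise: V = Q / (rho * cp * dT)."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

# 200 kW of IT load with a 12 K front-to-back air temperature rise:
print(round(airflow_m3_per_s(200_000, 12), 1), "m^3/s")  # ~13.8 m^3/s
```

Nearly 14 cubic metres of air a second through a box with one narrow aisle, which is consistent with the very large ducts visible on the Sun unit.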
Alternative: Pallet Mounted Computers

An alternative to the shipping container data center, which I suggested some time ago, is a pallet data warehouse. With this the computers would be mounted on standard shipping pallets. The pallets would be simply placed on the floor. For very high density installations they could be stacked in a warehouse-like building using small fork lift trucks. Pallets are designed to fit in small trucks and aircraft. Smaller ISO pallets are designed to fit through a doorway and in a passenger lift. ISO pallets are designed to fit in ISO shipping containers and so these could be used for transport, with additional protective packing around the pallets. If needed, a shipping container data center could be built by wiring up the palletized equipment in a container.


Wednesday, December 19, 2007

Datacenter in a shipping container

Sun Microsystems have designed a prototype earthquake proof data center to be delivered in a standard 20 foot shipping container. Their claim that this is the world's first virtualized datacenter built into a shipping container is hard to credit, as the military have been putting computers in transportable buildings for years. But this might be the first attempt to produce a commercial off the shelf product.

One innovation Sun claims is that the system uses water cooling instead of air conditioning. However, the opened door of the container shows ten very large fans. It is not clear how heat is transferred to the outside. Normally an air conditioner would be used so that just three small pipes need to pass through the wall of the data centre, for coolant and condensed water.

It should be noted that while shipping containers are designed to be robust enough to survive transshipping, they are not necessarily suitable for use as permanent freestanding buildings. Something like the Sun Blackbox would normally be built into a building with a roof over it and walls surrounding it. There are numerous systems for incorporating containers into buildings which could be used. Standard ISO shipping containers have 3 "twistlock" connectors on each of their eight corners. There are an assortment of devices designed to connect multiple containers together using the twistlock connectors, to attach a containerized building to its foundations and to add a roof.

For a secure freestanding structure, it might be better to use one of the modular concrete buildings designed for railway trackside electrical equipment. One of these from Garard was displayed at the Australian Rail Conference Exhibition 2007. These buildings are about the size of a shipping container made from one continuous piece of reinforced concrete. They have the advantage of having been designed to meet government security standards. The buildings can be made on site, or delivered on a truck (or train) pre-wired with the equipment installed. Because they are made of one piece of concrete, they are very secure and less likely to leak. It may also be possible to design one which would fit a shipping container inside. In that case the concrete building could be built on site or delivered empty, and then the shipping container full of computers simply slid inside.

It should be noted that shipping container data centers will not necessarily be a good use of space. The containers are narrow and will only have room for two rows of rack mounted cabinets, with a walkway between. There will only be access to the front of the cabinet, with no access to the back, making maintenance difficult. In most cases it will be better to use a larger room which can provide better access. If space is at a premium and a large data center is needed, then a pallet warehouse could be used (I suggested this to the Chinese government in 2003).

Also, before investing in a new data center, an organization should conduct an inventory of its current data and processing requirements. In most cases it will be found that more efficient use of applications can reduce the data and processing requirements, so that a smaller data center can be used, reducing the cost, space and energy use. Use of efficient XML based data storage and Web 2.0 applications can greatly reduce the organization's needs for storage and processing.

Instead of virtualizing inefficient PC desktop applications, they can be replaced with properly engineered efficient applications designed to run remotely over a data link. This could reduce the processing requirements by between ten and one hundred times. As an example, an organization which would have needed one of Sun's shipping container data centers could instead downsize to one rack mounted computer the size of a four drawer filing cabinet. Apart from being one hundredth the size and using one hundredth the power, this would cost about one hundredth as much to buy.
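The downsizing arithmetic, as an assumed worked example (the container load and electricity tariff are illustrative figures, not quotes):

```python
# Assumed worked example of the hundred-fold downsizing: a container
# data centre versus a single rack at one hundredth the load.
container_kw = 200       # assumed container IT load
rack_kw = container_kw / 100
tariff = 0.15            # assumed $/kWh
hours = 24 * 365

for label, kw in [("container", container_kw), ("single rack", rack_kw)]:
    print(f"{label}: {kw:.0f} kW, ~${kw * hours * tariff:,.0f}/year in electricity")
# container: 200 kW, ~$262,800/year; single rack: 2 kW, ~$2,628/year
```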

Outsourcing the data storage or processing to a location with more space and power can also be considered, but not necessarily as far away as Iceland. The Canberra Technology City (CTC) is a proposed large data center for government and company use in Canberra, with its own power station.

Of course, alongside the shipping container data center will be needed a shipping container cafe, for the workers. ;-)


Wednesday, December 12, 2007

Australian Rail Conference Exhibition 2007

Last week, in between computer conferences, I attended AusRAIL PLUS 2007, the Australian Rail Conference Exhibition. This is held in conjunction with a conference, which costs money. But like many such events, the exhibition is open to anyone from business for free. There were a number of computer and telecommunications exhibits to justify my attendance, but it was really just an excuse to look at train stuff. ;-)

Some items of interest:

Thales Australia are expanding out from Defence equipment into transport, particularly rail systems. As an example, they are supplying the Communications and Surveillance Subsystem (CSS) and performing the Information and Communications Technology (ICT) System Integration (SI) for the Sydney Suburban Passenger Vehicle Public Private Partnership (PPP) Project (ie: computers and telecommunications for Sydney trains).

Ultimate Australia Transportation Equipment Pty Ltd have designed an aircraft style reclining seat for long distance trains. The SLP-1 Sleeper Seat (prototype) is 60 kg (production mass 50 kg), has a single seat width of 600 mm (double 1200 mm), with a pitch of 1800 mm. They hope to sell this for the Cairns Tilt Train. The seat has the same entertainment system LCD video display as fitted to the tilt train, retracting into one arm of the seat. I suggested to Gary Ullmann, the designer, that they replace this with a larger 10 inch LCD display and keyboard, as used on the Airbus A380. This could then be used as a computer for business, as well as entertainment. Unfortunately Ultimate do not seem to have an Australian web site, but you can get an idea from their China one.

China South Locomotive and Rolling Stock Industry (Group) Corporation were one of several companies from China with cumbersome names selling locomotives and other railway products. They each seemed to have some form of high speed passenger train on offer as well as freight locomotives. I was unable to get their web site to work in English.

Garard were offering monolithic concrete shelters for equipment. These are buildings about the size of a shipping container made from one continuous piece of reinforced concrete. They are used to hold electrical equipment for railways, but could make very secure computer rooms. The buildings can be made on site, or delivered on a truck (or train) pre-wired with the equipment installed. Because they are made of one piece of concrete, they are very secure and less likely to leak.

Open Access displayed their Wireless Announcer. This is the wireless Emergency Warning and Intercommunication System (EWIS) alert system installed in the Sydney CBD for the APEC meeting. Units with antennas, digital radio, amplifier, loudspeakers and battery backup are mounted on poles around the city to provide warnings in an emergency. Some units also have alphanumeric displays.

CRC for Rail Innovation, is an industry academic research collaboration. They are looking at:
  1. Economic Social and Environmental Sustainability
  2. Operations and Safety
  3. Engineering and Safety
  4. Education and Training


Monday, August 20, 2007

Reducing Carbon Emissions from the ICT Industry

Last Thursday the Australian Computer Society issued a study of carbon emissions from the ICT industry and a policy statement on how to reduce them. This included some suggestions I made. I will be expanding on them at the Green CIO conference in Sydney on Tuesday.

The ACS recommended:
  1. Extending the Energy Rating System to ICT equipment for domestic and commercial use
  2. Innovative technologies to reduce power consumption
  3. Carbon offsets to help offset the emissions being produced by ICT equipment used in the office
  4. Virtualisation to replace servers
  5. Disable screen savers and implement ‘sleep mode’ for inactive equipment.

Some themes I will be talking about at the Green CIO conference:

It will be interesting to see what the IT vendors have to offer. It will be a lot easier to get both commercial and home users to be greener if there is some new stuff they can buy, rather than just telling them to do more with less.

It is tempting to look for some sort of grand scheme to lower energy use in IT (such as when I suggested in 2003 that the Chinese government put solar cells on data centers). But there is a need for a change in attitude by clients so that suppliers can have a market for products. Also there are changes the clients can make themselves in how they use the products.

IT staff have not made energy efficiency a priority up to now because their bosses and clients have not seen it as one. You would not have got sacked if your data system used 10% more power than the industry average, but you would if it was 10% less reliable.

Because this has not been a priority for the customer, it has not been a priority for the IT industry, so we do not have standards and guidelines to help us reduce energy use and know what is good. That is now being addressed by the ACS, with the carbon audit of ICT, the policy and the Green ICT Group. IT professionals have an ethical obligation to act in the public interest, even if individual customers do not want us to.

There are opportunities for IT professionals to be part of the green message of their organization. This can involve industry, government and university. A good example is the Defence Department's recently announced "Defence Future Capability Technology Centre" (DFCTC) to work on defence research projects with industry and universities from mid 2008. Much of this research, such as that on explosives, electronic warfare and electromagnetics will be defence specific, but the other areas have commercial application.

As an example, technologies for battlespace management are similar to those used in business, particularly with the adoption of common Internet and web tools. Recently I suggested that containerized "smart rooms" could be tested as outback classrooms, before being deployed in 2012 on the new amphibious ships HMAS Canberra and HMAS Adelaide. Robotic systems for the ships are also applicable to industry applications as well as the military.

Energy saving is an important issue for the military: each extra computer and telecommunications gadget has to be powered by batteries on the soldier's back or from diesel brought in by truck or ship. Reducing energy use can make room for more ammunition. The US Army is now placing an emphasis on solar power for bases to reduce fuel use.

Discussion of green issues for business has been hampered by the use of complex environmental language and an appeal to ecological concerns. IT professionals can instead talk to their clients and employers about the financial benefits of saving energy. As an example, it is very easy to explain that with a bit of extra software we can get all those smart phones they have to link to corporate systems and deliver very concentrated, timely information. That will reduce the need for power hungry desktop computers.

But you can only do so much with ad hoc power saving methods. Sustainable development engineering strategies emphasize the need for an integrated approach to energy and materials saving. Business processes need to be redesigned. IT staff can work with specialists on an overall strategy. They can assist with a web site and other online facilities to get the message out to corporate employees, including some real time graphs showing company green performance. IT professionals can act preemptively by proposing to use environmental standards.

The major impediment to green IT is a lack of expertise in the IT profession. Staff and suppliers do not know what can be done, what to do, or even the language to use to describe what to do. As an example, thin client systems can save money and power, but staff have little expertise with these.

The CIO can start by implementing simple measures, such as setting screen savers to reduce power and switching off unused systems automatically after hours. They can propose to implement environmental standards and provide information to staff online. But reliable, economic operation must remain the priority; environmental efficiency must come after that.
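One concrete building block for switching off unused systems after hours is the standard Wake-on-LAN magic packet, so machines shut down overnight by a scheduled job can be woken before staff arrive. A minimal sketch (the MAC address is a placeholder):

```python
# Wake a machine that was shut down after hours, using a standard
# Wake-on-LAN magic packet: 6 bytes of 0xFF followed by the target's
# MAC address repeated 16 times. The MAC below is a placeholder.
import socket

def wake(mac="00:11:22:33:44:55", broadcast="255.255.255.255"):
    """Broadcast a WoL magic packet for the given MAC address."""
    mac_bytes = bytes.fromhex(mac.replace(":", ""))
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(packet, (broadcast, 9))

wake()  # call from a morning scheduled job
```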

Some of the low hanging fruit is to use XML web based technology to make your applications more efficient, so that a larger center is not needed. Consider outsourcing the data center to large specialists who have economies of scale.

IT professionals can help in educating other staff and providing online services which reduce travel, hardware use and the like.

It is unlikely the Government will regulate in the IT area. But the industry should get in first and implement its own guidelines and standards anyway.

Sustainable ICT can be incorporated into your strategic planning goals and targets. Strategies which provide financial savings to offset the costs are more likely to gain corporate approval.

Much of this will have to be a DIY effort for a few years until there are consultants and companies trained up to help. There are training materials being designed and these can be incorporated into online programs, such as the ACS Computer Professional Education Program.

The IT industry needs to look at their organisations and see if they practice what they preach, so they will credibly be able to offer advice, products and services. IT has had a clean and efficient image; in environmental terms it is in reality a dirty, wasteful business. That is a reality we have to change.
