By Barry Long, xAssets
In March of this year the United States CIO, Vivek Kundra, mandated that government agencies and departments begin developing plans to consolidate their data centers. With 1,200 data centers currently in operation, the task will not be an easy one, but it should bear real fruit. With software and file servers accounting for more than a quarter of the federal IT budget, and with server utilization estimated at under 15% of total capacity, consolidation could yield savings in the billions of dollars.
The project has an aggressive schedule, with a target date of April 30, 2010 for the creation of an initial inventory. Developing a comprehensive inventory of the IT assets that comprise a data center is the most fundamental step in making any substantive change, and it is the right first step. The real question is not the merit of performing the inventory, but how to complete it accurately in such a short time.
For data centers with a developed IT asset management (ITAM) solution in place, the job is probably already done. Automated inventory and discovery operations keep track of devices, configurations and installed software, providing detailed reports on an on-demand basis. For those without an ITAM solution in place, the task will be daunting to say the least. Consider some of the obstacles:
*Physically counting and recording the configuration of every device in each data center, including endpoints, network equipment and software, could take months, not weeks.
*Purchase records are likely decentralized and only reflect purchases, not moves, changes or disposals.
*A manual reconciliation of installed software and purchase records is typically out-of-date before it is finished.
*Fully deploying an installed ITAM solution can take several months before any usable data is retrieved.
Faced with these challenges, what options are available to the agency CIO or data center manager? The answer can actually be found in another of Mr. Kundra’s initiatives – the use of cloud computing applications in government operations. Cloud computing, or hosted applications, is still a relatively novel concept in government computing and is not the first place IT managers look for a solution. As Kundra observed in a recent speech, “What I would submit to you is … we’re focused on building datacenter after datacenter, procuring server after server, and we need to fundamentally shift our strategy on how we focus on technology across the federal government.” A report entitled “Saving Money Through Cloud Computing” (www.brookings.edu/papers/2010/0407_cloud_computing_west.aspx), released by the Brookings Institution on April 7 of this year, endorses the adoption of cloud computing and estimates that by making the shift federal data centers could save 25% to 50% of their operating budgets. The report cites various cloud-computing implementations across the country as practical case studies of how the cloud can be successfully deployed in government environments.
Returning to the need to create accurate data center inventories as mandated by Mr. Kundra, federal IT managers should look to the cloud for help. Cost effective, efficient and accurate cloud-based ITAM solutions are available that can create a comprehensive inventory of the data centers in the required timeframe and without disrupting any data center operations.
In fact, a cloud-based ITAM application may be exactly the right tool for that first step. Cloud-based ITAM solutions include a broad spectrum of benefits:
*They are provided as a service, obviating the need for lengthy installations on data center equipment or the acquisition of dedicated servers.
*The service can be up and running in days, with usable data almost immediately available.
*Services can be made available for a one time inventory, or for ongoing use, on a pay-as-you-go basis.
*No software license purchase is required, and some solutions are listed on the GSA schedule, eliminating the need for a lengthy RFP process.
*Agentless applications are available, eliminating the need to add any software to the end point devices.
*The software is always up-to-date. No patches, upgrades, bug fixes or annual maintenance contracts are required. Neither is a full-time system administrator.
*Overhead costs typically associated with operating an installed application, such as cooling, power, floor space, maintenance and back-up operations, are eliminated.
*In most instances, the need for professional services is nominal, further reducing the cost of the service.
*Being web-based, the IT asset information is available from any location with internet access.
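The agentless approach mentioned above typically works by probing devices over the network rather than installing software on each endpoint. As an illustrative sketch only (not any particular vendor's implementation, which would also use protocols such as SNMP and WMI), a minimal discovery pass might check which hosts respond on common management ports:

```python
import socket

# Illustrative subset of ports an agentless discovery tool might probe.
# A real tool would query SNMP, WMI, SSH, etc. for configuration detail;
# a plain TCP connect check keeps this sketch dependency-free.
PROBE_PORTS = {22: "ssh", 135: "msrpc", 80: "http", 443: "https"}

def probe_host(address, timeout=0.5):
    """Return the probe ports on which `address` accepts a TCP connection."""
    open_ports = []
    for port in PROBE_PORTS:
        try:
            with socket.create_connection((address, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass  # closed, filtered, or unreachable
    return open_ports

def discover(addresses):
    """Build a minimal inventory mapping each address to its responding services."""
    return {addr: [PROBE_PORTS[p] for p in probe_host(addr)] for addr in addresses}
```

Because nothing is installed on the scanned devices, a sweep like this can begin producing inventory data as soon as the collection point has network reachability.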
In today’s economic environment, cost is always an issue. Cloud-based ITAM solutions can cost a fraction of a traditional installed software solution. There is no up-front license fee, and the service can be contracted for a single inventory scan lasting several weeks or subscribed to on an annual basis. The number of users can be increased or decreased as needs dictate, and maintenance fees do not apply. As there is no local installation, the service is up and running within days of completing the acquisition, with usable data available shortly thereafter. In addition, the service never becomes an obsolete, outdated legacy system requiring ongoing and expensive support.
To be sure, cloud-based ITAM solutions are not a panacea for every government discovery and inventory project. In closed and highly-secure networks, it is unlikely that any outside access would be permitted or possible. In any government application, security is always a concern. Issues such as not having control of the information, not allowing outside access to the data center and relying on third party security provisions are commonly voiced. However, as cloud-computing has matured so has the security of the applications and the data centers hosting the applications. Vendors have taken great pains to address the common security related concerns.
Regarding control over the data, cloud-based ITAM vendors offer dedicated databases, and even dedicated servers, that are accessible only by the customer. The level of control is equal to that of a locally maintained database, with the added assurance that backups are rigorously scheduled and that a failover site is maintained to ensure constant availability of the information.
In addition, data flow is typically outbound only. In most cases a collection server, which can be an existing virtual machine and need not be a dedicated device, aggregates the discovered information behind the firewall and then communicates with the hosted application server. Access to the application and database is login- and password-protected and is further restricted by a customer-defined departmental and role/responsibility matrix. Users view and manipulate data on the hosted server and can print to authorized devices. The firewall will not be breached by incoming data.
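The outbound-only pattern described above can be sketched in a few lines. This is a hedged illustration, not a vendor's actual protocol: the upload URL and payload shape are assumptions, but the structure shows why no inbound firewall ports are needed.

```python
import json
import urllib.request

# Hypothetical upload endpoint -- a real URL would come from the ITAM vendor.
UPLOAD_URL = "https://itam.example.com/api/upload"

def aggregate(scan_results):
    """Merge per-device scan records into one payload, behind the firewall."""
    return {
        "device_count": len(scan_results),
        "devices": sorted(scan_results, key=lambda d: d["hostname"]),
    }

def upload(payload, url=UPLOAD_URL):
    """Push the aggregated payload to the hosted application server.

    All traffic is outbound: the collection server initiates the HTTPS
    connection, so no inbound ports need to be opened on the firewall.
    """
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(request)
```

Because the collection server initiates every connection, the hosted service never needs to reach into the customer's network.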
Most commercial hosting vendors maintain SAS-70 certification. An SAS-70 certification and the accompanying Service Auditor’s Report, with an unqualified opinion issued by an independent accounting firm, differentiates the hosting provider from other firms by demonstrating that it has established effectively designed control objectives and control activities. Beyond partnering with hosting providers who maintain SAS-70 certification, vendors of cloud-based ITAM solutions can employ Secure Sockets Layer (SSL) encryption. Each SSL-secured session uses a public/private key pair: the public key is used to encrypt information and the private key is used to decipher it. When a web browser points to a secured domain, an SSL handshake authenticates the server (web site) and the client (web browser), an encryption method is established with a unique session key, and secure transmission can begin. The combination of SAS-70 controls and SSL security ensures both data integrity and a secure hosting environment.
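The handshake described above is handled automatically by standard TLS/SSL libraries. A minimal sketch using Python's standard `ssl` module shows the pieces: certificate verification (the server proving its identity) and the negotiation of a session key during the handshake.

```python
import socket
import ssl

def client_context():
    """Build a TLS client context that authenticates the server's certificate."""
    context = ssl.create_default_context()
    # The defaults already require certificate verification against the
    # system trust store and check the hostname, which is what makes the
    # handshake trustworthy.
    return context

def open_secure_channel(hostname, port=443, timeout=5):
    """Connect to `hostname` and perform the SSL/TLS handshake.

    wrap_socket performs the handshake: the server proves its identity
    with its certificate (public key), a unique session key is negotiated,
    and the returned socket encrypts all further traffic.
    """
    raw = socket.create_connection((hostname, port), timeout=timeout)
    return client_context().wrap_socket(raw, server_hostname=hostname)
```

All of this happens transparently whenever a user's browser connects to the hosted ITAM application over HTTPS; no extra configuration is needed on the customer side.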
Clearly, a case can be made for using cloud-based ITAM applications to develop the IT asset inventories needed to begin the federal data center consolidation process. The software will add even more value as the program gets underway and decisions are made about exactly which areas can be consolidated. The data captured by the ITAM application will continue to pay dividends as the data centers become more efficient and server utilization increases. Cloud-based ITAM solutions will become one of the lasting tools that agency CIOs and data center managers can rely on to ensure that the data center consolidation process yields the intended benefits and efficiencies.
Barry Long is the US Business Development Director for xAssets. He has extensive experience dealing with agencies in the federal government, particularly with regard to IT asset management programs. He is a member of the SIIA SaaS/Government committee. Readers can contact Barry at firstname.lastname@example.org