3 Ways to Reduce the Costs of Data Center Audits


Auditing becomes more critical all the time. It’s always been important for a business: it prevents fraud, provides trustworthy financial reporting, and helps an organization pursue its objectives. Government regulations make auditing even more important. The consequences of sloppy business practices can be truly painful, and good auditing ensures things stay on track.

In the recording below, Daniel Tautges talks to Bruce Frank about his data center audit experiences and how he was able to make audits less costly, faster, easier, and more accurate.

Citi’s Vice President of Global Technical Operations, Bruce Frank, is a twenty-year veteran of technical operations. He was formerly a Director at Dendrite International and EDS.



Daniel Tautges is the President and Founder of Pinpoint Worldwide, a business acceleration company. Formerly he was the President of nlyte Software and Vice President of Visual Network Design (Rackwise), Micromuse (now IBM), and Lucent Technologies.





What does the Citi data center estate look like today?

Currently we have 14 strategic data centers spread out across four regions: North America, Latin America, Asia, and Europe/Middle East/Africa. Outside the strategic data centers, which manage our critical business applications, we have 285 tech rooms and satellite data centers. Some of these satellite locations house between 5,000 and 10,000 servers. We have 8,000 branches that we manage small pieces of infrastructure for, and 76,000 physical servers. At one point we were closer to 100,000 servers, but now we’ve scaled down from the physical server perspective. We have another 125,000 to 150,000 devices under management within the data center.

From the standpoint of auditing, what do you consider best practices for an audit and why is it so important to Citi?

We’re under scrutiny all the time, not only internally but externally, especially on the investment banking side. There’s the SEC and Sarbanes-Oxley. I’m mainly dealing with data quality: probably ten times a year we have to make sure that the information in the database represents a physical device. There are a lot of fines that could be assessed just based on the way we have our infrastructure set up. We’re involved in exercises where we have 20,000 or 30,000 pieces of equipment across the globe and we may need to know what applications are on the equipment and what the equipment connects to. Based on best practices, we strive for 95% data accuracy: what’s in the system vs. what’s on the rack. Once a year we officially reconcile, which involves a manager signing off that everything has been checked. This involves a walkthrough of every single data center from top to bottom and ensures that each device isn’t just in the rack but is also the right model with the correct barcode. We’ve had equipment that was decommissioned five years ago but was still in the rack, and we were paying for the maintenance and warranty. We depreciate our equipment over three or four years, and without a proper inventory that becomes very difficult to manage.
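The annual reconciliation described above amounts to a record-by-record comparison between the inventory system and what a rack walkthrough actually finds. Below is a minimal, hypothetical sketch of that check: the 95% accuracy target comes from the interview, but all records, field names, and data are illustrative, not Citi's actual systems.

```python
# Sketch of the annual reconciliation check: compare what the inventory
# system says is in the racks against what a walkthrough actually found.
# A record counts as accurate only if a device with the same barcode was
# scanned AND the model matches -- being "in the rack" is not enough.

def reconcile(system_records, rack_scan):
    """Return the fraction of system records confirmed at the rack."""
    scanned_by_barcode = {d["barcode"]: d for d in rack_scan}
    accurate = 0
    for rec in system_records:
        found = scanned_by_barcode.get(rec["barcode"])
        if found and found["model"] == rec["model"]:
            accurate += 1
    return accurate / len(system_records)

# Hypothetical data: one record is a decommissioned server still on the books.
system = [
    {"barcode": "B001", "model": "Dell R740"},
    {"barcode": "B002", "model": "Dell R640"},   # decommissioned, never removed
    {"barcode": "B003", "model": "HP DL380"},
]
scan = [
    {"barcode": "B001", "model": "Dell R740"},
    {"barcode": "B003", "model": "HP DL380"},
]

accuracy = reconcile(system, scan)
print(f"accuracy: {accuracy:.0%}, 95% target met: {accuracy >= 0.95}")
```

Stale records like the decommissioned server above are exactly what drags accuracy below the target and keeps unnecessary maintenance and warranty spend on the books.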

Last year a government agency came in for business-critical information. They couldn’t find the device with the data and they had to do a walkthrough of every single device in the data center. After about two weeks they found the device, wrong name on it, wrong label, but they found it and the data was there. The way they found it was by serial number and IP address.

What made you successful doing audits while reducing costs?

Data quality is my number one goal. For the last 12 months my team and I have been working on data quality. We’ve been scrubbing 250,000 pieces of equipment to ensure that the quality is there: that the make, model, platform, and connectivity are right. One of the things we do goes back to reconciliation. We brought in Trackit. The goal is to be able to update information while you’re at the rack. You don’t have to go back and forth. You have the tablet right in front of you, and you’re executing right there.

All you do is take it to your desktop, use the connector, and it pulls the data in. The data’s been entered and verified. Now we have a verifiable source, and when you go to the auditor you can show that it’s been verified. We have a pretty accurate inventory at this point.

In the past, when we were doing audits, it would take maybe half an hour to do a rack: you need to verify the model and the barcode. We probably have 7,000 or 8,000 pieces of equipment in each data center, so the amount of time to verify all of this is significant. With a scanner you can bring that time down to one minute per rack.
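The per-rack figures above (roughly 30 minutes manually versus one minute with a scanner) are easy to put into numbers. A back-of-the-envelope calculation follows; the rack count is a hypothetical assumption, since the interview only gives 7,000 to 8,000 devices per data center:

```python
# Back-of-the-envelope audit-time arithmetic from the figures quoted above:
# ~30 minutes per rack manually vs. ~1 minute per rack with a scanner.
# The rack count is an assumption, not a figure from the interview.

racks = 300                       # hypothetical rack count for one data center
manual_minutes = racks * 30       # clipboard walkthrough
scanner_minutes = racks * 1       # barcode scanner at the rack

print(f"manual:  {manual_minutes / 60:.0f} hours")   # 150 hours
print(f"scanner: {scanner_minutes / 60:.0f} hours")  # 5 hours
print(f"speedup: {manual_minutes // scanner_minutes}x")
```

Whatever the actual rack count, the speedup is a flat 30x, which is what turns a multi-month audit into one measured in weeks.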

The first year it may take a little longer. In the past it was taking us about three or four months. Now we’re going to get it done in a month, and the next year in three weeks. Everything is barcoded, and we have an accurate inventory. At the end of the year we should be able to open up each cabinet and in two weeks we should have our inventory done.

So when you were putting together your requirements for a tool, you wanted something that could be integrated to backend systems or DCIM platforms with connectors. Offline mode was very important because of your lack of connectivity. And you wanted something easy to use that didn’t require a lot of training. Were there other requirements you had in mind when you chose Trackit as your tool?

That’s probably a good summary of the rationale for the choice. A big part was the integration: the flexibility of the tool and its ability to talk to these other APIs. The integration with Aperture was major. The ability to pull down data and see it visually was important too. Obviously Citi has a large footprint, new tools don’t always scale, and we end up doing a lot of work ourselves. With Trackit, 80% of the installation went off without a hitch. Overall I think we’re in good shape now.


Maybe we could talk about where you were before and where you are now.

Yes, well a big one was the audit. Each audit used to require between five and seven full-time employees up front in the data center and five more in the back office; somebody has to get the information uploaded. At least ten people worked on an audit, and the audit took three months. Now an audit requires two employees and takes less than a month. We tried working with RFID. It worked, but it was very cumbersome, very difficult to manage. You can take the Trackit tablet, hand it to the next guy, and just by looking at the screen he can see what he needs to do next. You don’t have to fumble through pages on a clipboard. We used to have multiple data entry points, but with Trackit you have this tool sitting in front of a server: you’re verifying and updating it, you’re syncing your pad, and you’re ready to go with no human interaction necessary. It’s a largely automated process. Now we have increased accuracy for DCIM and ETM. We still have challenges. When moving data, we need to make sure that the data is properly represented. For audits, we need to make sure the data is where it says it is. Trackit supports the end-to-end lifecycle we want for our equipment.
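The staffing figures above translate directly into person-months of labor. This is just arithmetic on the numbers given in the interview; note it measures labor effort only, not total audit cost:

```python
# Person-effort implied by the staffing figures above: at least ten people
# for three months before, two people for one month now. Figures come from
# the interview; the person-month framing is plain arithmetic.

before = 10 * 3   # 10 staff x 3 months
after = 2 * 1     # 2 staff x 1 month

reduction = 1 - after / before
print(f"{before} person-months -> {after} person-months "
      f"({reduction:.0%} less labor)")
```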

So audits need to be done to ensure accuracy for DCIM planning and workflow, warranty and maintenance spend, software licensing costs, ledger depreciation, and ETM data theft. You said Trackit reduces audit costs by as much as 80%, which is significant. Can we get some idea what the future requirements are?

One is integration with the CA tool so we can track CA development. That’s a major one. We’re moving pretty well on that piece. We’ve already told the CA organization we’ve made an investment with Trackit for the next three to five years, probably longer, so we want to move forward with the investment we made with this tool. We want to take the tool and make it do more for us than we’re doing right now. The other big piece here is the ability to use the Trackit application to do one of two things. The first thing is building out a data center where we don’t have a full visualization. We just have big rooms there. It would be nice to say, “We need new cabinets,” and with the Trackit tool drag and drop cabinets, sync it up, and—boom—the new cabinets are on the floor there. It would give us a way to get information into our drawings.


The second thing we’d like to do is leverage the Trackit tool to supplement our installation process. Something we do now is hardware validation. A piece of equipment arrives at our loading dock; it needs to be scanned and assigned a barcode number, and then we need to go back to Aperture and put in the barcode number, the serial number, the make, and the model. We have most of that information already because we’ve already created a request. So now there’s a guy on the dock trying to figure out where this equipment comes from and what the purchase order number is. We want to leverage the Trackit application and basically download the information to a form on Trackit where it has that synchronization opportunity. We want to take that information out of the DCIM application that we’re using at this time and drop it in the Trackit application. As a delivery comes through the door, the scanner pulls in the serial number.
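The loading-dock validation flow described above is essentially a lookup of the scanned serial number against the request that was created when the hardware was ordered. Here is a minimal, hypothetical sketch of that matching step; all records, field names, and values are illustrative, not part of any actual Trackit or Aperture API:

```python
# Sketch of the dock validation flow: a delivery is scanned and the serial
# number is matched against an already-created hardware request, so the dock
# worker never has to hunt down the purchase order. All data is hypothetical.

requests = {
    # serial number -> details captured when the hardware request was created
    "SN-1001": {"po": "PO-778", "make": "Dell", "model": "R740",  "barcode": "B101"},
    "SN-1002": {"po": "PO-779", "make": "HP",   "model": "DL380", "barcode": "B102"},
}

def validate_delivery(scanned_serial):
    """Look up a scanned serial against open requests; None means no match."""
    return requests.get(scanned_serial)

match = validate_delivery("SN-1001")
if match:
    print(f"matched {match['po']}: {match['make']} {match['model']}, "
          f"apply barcode {match['barcode']}")
else:
    print("no open request for this serial -- hold at the dock")
```

Because the request data already exists upstream, the scan only confirms it; that is what removes the manual re-keying into the DCIM system at receiving time.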

We use Trackit to produce barcodes, and we can get that number on the box as it comes through the door. We’re validating and putting quality into every step. Trackit gives us the ability to track critical assets across our global estate from cradle to grave.


Get more information on how Trackit-Solutions can help you:

Contact Trackit-Solutions