When disaster strikes, the role of a company's top IT manager becomes extraordinarily complex. For Sony Electronics CIO Drew Martin, the October wildfires that threatened the company's San Diego-area headquarters quickly escalated into a critical balancing act. Martin found himself torn between attending to the welfare of the company's massive information infrastructure and considering the personal and professional needs of thousands of displaced workers—including himself and his family.
In the weeks following the fires, Martin found time to reflect on his experience with eWeek contributor Steve Kovsky.
How did the early hours of the fire unfold for you, and when did you first realize the magnitude of the threat it represented to your family, your co-workers and the company?
It all happened on a Sunday when the fires started. That was a good and a bad thing. On the plus side, we didn't have people in the office. But on the other hand, we started to realize it was headed directly for our corporate headquarters in Rancho Bernardo, and we knew it would be difficult to reach people over the weekend if we had to advise them that we were closing the office. It was pretty chaotic.
One of the big surprises was just how quickly the fire was moving.
Our neighborhood is located next to Poway High School, which on Sunday was already being used as an evacuation center for the town of Ramona. ... Then by midnight on Sunday, we were getting ready to evacuate ourselves.
In the end, we barely had time to grab some photo albums, a change of clothes, some hard drives and some teddy bears. I left with my wife, three kids and the dog. The goldfish, unfortunately, got left at home. Luckily, our home only suffered minor damage from the ash and strong winds.
Had you ever experienced a situation like this before?
In some ways, it was reminiscent of September 11, 2001. I was right in the middle of 9/11, too, because I was living in New York at the time. I grew up in New York and moved to San Diego when Sony moved its headquarters here from New Jersey in 2004.
Unlike during 9/11, we never lost communications in San Diego, even though everyone [government authorities and the media] was telling people to keep off their cell phones so the bandwidth would be available for emergency crews.
What were your first considerations from an IT standpoint?
When you're facing something like this, you get tactical fairly quickly. For instance, I realized that people's in-boxes were filling up so we had to make some unilateral "battlefield" calls to upgrade everybody's in-box size. We also had to ask people to stop attaching files to their e-mails because we had so many people on their BlackBerrys—it was starting to clog up communications.
One of the big take-aways for me was the importance of implementing a reverse-911 notification system for our employees. It's in our long-range plans, and we had already started evaluating products, but I think it's something we are going to have to accelerate. It will also be very useful in our New Jersey facility because of the severe weather that can occur in that part of the country.
A great example was with our school district, which had already implemented a reverse-911 system. When [the Poway Unified School District] issued its school closure orders on Monday morning [after the fires had started], I instantaneously got an e-mail, while my home phone got a voice message and my cell phone and my wife's phone both started to ring. The county of San Diego had also activated its reverse-911 system by that time, but we never received a call from it.
What steps did you take to notify Sony employees and inform them of the situation?
We put out voice mails and e-mails notifying people not to come into the office on Monday morning, along with posting a notice on our intranet site. But you can do all that and still miss somebody who's in the middle of grabbing their hard drives and wedding albums and figuring out where they're going to stay. They're not going to have time to log in to their voice mail or e-mail. It's not the same as getting an actual phone call.
At first we didn't think we would have to close the office on Monday, so it was fairly late on Sunday when we made that decision. The time difference between when you make the decision and when people get around to checking their e-mail or voice mail can create a lot of confusion. That's when the ability to trigger an automated outbound phone-notification system would be really useful.
The next thing that became critical was to assemble a crisis task force. One lesson I've learned is to know who that task force will consist of well ahead of time. You assume it will include senior executives and managers from information systems and operations, but you also need to consider human resources, finance and even legal departments. When you have employees who have suddenly lost their homes and all their possessions and records, you're faced with a need to solve unexpected problems, like how to pay them, etc. So you need all those different people on the crisis team.
It's also important to know ahead of time how that team will function, like knowing who's going to initiate and facilitate the actions that have to be taken. You have to be clear on the decision-making process, including what decisions the team is going to make and what's going to be run through senior management for evaluation. That team was a critical clearinghouse for decisions during the crisis period.
How close did the fire come to Sony's headquarters?
The Sony campus in Rancho Bernardo was very close to an area called Westwood, which got hit very badly. We were not allowed into the town or the building, so there was no way to know how close the flames were coming to our facility. At that point, our team in New Jersey came up with a very creative idea to find out the status of our buildings. They used the video-conferencing system to dial into various conference rooms, then used the cameras to look out the window and scout for flames.
One of our big areas of concern was the data center. We couldn't get in to change air filters, and we were worried about the dust becoming an issue. By that point, the smoke from the fires had made the air quality terrible throughout the area. Even though we didn't experience any system failures, we are continuing to monitor the machines because there may be internal damage that we can't detect. So we are going ahead and accelerating replacement of any machines that are nearing their end of life.
What other IT issues were you monitoring?
Power outages were intermittent throughout the area, but we did have backup power in place, and we never lost power for very long, anyway. We had no connectivity issues at any time.
Along with some basic things, such as dealing with BlackBerry limitations and in-boxes hitting their storage limits, we had to deal with all the people who were not used to being remote users and were suddenly faced with becoming remote users. It was an added support burden, but we were lucky to have staff in multiple locations who were available to provide support.
There were also some unexpected operations issues to deal with. For example, there was the cross-border shipping situation. The fires also were raging down along the Mexican border. We have manufacturing facilities in Mexico that we support, and we found ourselves trying to get our arms around how the fires were affecting shipping across the border. It was right at the end of the month, when shipments tend to increase. And we were entering our peak season around the holidays, which is when we do the bulk of the shipping to our retailers around the country, so it was a critical time for the company.
We were getting reports that the border crossings were closed, potentially affecting inbound shipments of Sony TVs from manufacturing facilities in Baja California. It was very hard to get good information. While the news networks were saying that borders were closed, we were trying to reach out to people on the ground to see if trucks were actually going back and forth. We found out that while there were significant delays and even some temporary closures in certain places, there were still some alternative routes available to keep the shipments moving.
It underscored the fact that you need to have a network in place beforehand to get the vital information you need. When you try to draw up your business continuity plans, you deal with scenarios that you think might occur. That's an important process, but the fact is, you really can't predict how an emergency situation is going to unfold. In some respects, your time is better spent figuring out how you're going to communicate and how you're going to get good information to keep the business running.