Last week, the director of Utah’s Department of Technology Services (DTS) resigned in the wake of a massive data breach that exposed the personal information of nearly 800,000 people to hackers believed to have been in Eastern Europe.
The breach was not the result of sophisticated malware, however. Instead, a series of configuration mistakes during an upgrade left the server wide open to attackers, who downloaded data from it on March 30.
The incident serves as a reminder of just how costly configuration errors can be for organizations. In the Utah case, interim DTS director Mark VanOrden told the Deseret News about a series of errors that had exposed the server as the state upgraded its Medicaid Management Information System. The server, he explained, was installed by an independent contractor and was not protected by a firewall during the upgrade. In addition, the server used factory-issued default passwords, a practice he said is not “routine.”
“Two, three or four mistakes were made,” VanOrden was quoted as saying. “Ninety-nine percent of the state’s data is behind two firewalls; this information was not. It was not encrypted and it did not have hardened passwords.”
Organizations seem to struggle with defining management security objectives, such as the change control policy for high-value assets; actually implementing those objectives in practice; monitoring the environment for compliance; and detecting deviations and responding effectively when something unusual occurs, said Scott Crawford, research director at Enterprise Management Associates.
“Lack of identifying high-value assets and prioritizing monitoring and control in those environments often contributes to exposures,” he said. “Finer control over access privileges, implicated directly in the Utah case, is one example where such control can and should be scrutinized more carefully and more consistently enforced.”
Poor controls over browsers are another example, he added, noting that many of today’s browsers enable sandboxing, validate code and provide other techniques to limit exposures.
“Many organizations find it difficult to keep such issues current, particularly with large numbers of widely distributed endpoints,” he said.
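Crawford’s point about monitoring for compliance and detecting deviations boils down to comparing a system’s actual configuration against an approved baseline and flagging anything that drifts. The following is a minimal, purely illustrative sketch of such a check; the snapshot file, setting names and expected values are hypothetical and not drawn from the Utah incident.

```python
# Illustrative configuration-drift check: compare a host's reported settings
# against an approved baseline and report deviations. All names are hypothetical.
import json

BASELINE = {
    "firewall_enabled": True,
    "default_credentials_removed": True,
    "data_at_rest_encrypted": True,
    "password_policy": "hardened",
}

def load_current_config(path: str) -> dict:
    """Load the host's reported configuration from a JSON snapshot."""
    with open(path) as f:
        return json.load(f)

def find_drift(current: dict, baseline: dict) -> list[str]:
    """Return the settings that deviate from the approved baseline."""
    deviations = []
    for key, expected in baseline.items():
        actual = current.get(key)
        if actual != expected:
            deviations.append(f"{key}: expected {expected!r}, found {actual!r}")
    return deviations

if __name__ == "__main__":
    current = load_current_config("server_snapshot.json")  # hypothetical snapshot
    for issue in find_drift(current, BASELINE):
        print("DRIFT:", issue)
```

A check like this only matters if someone acts on its output, which is Crawford’s larger point: defining the baseline, running the comparison and responding to deviations are separate disciplines, and organizations tend to fall down on at least one of them.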
Oftentimes, people are more interested in making things work than making them work right, said Andrew Storms, director of security operations at nCircle.
“One of the most common configuration errors I see is running services with too many permissions,” he said. “For example, in UNIX, the Apache process is run as user www to limit exposure. If Apache were compromised and the process was running as an admin, then the attacker would gain full administrative access to the server.”
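The Apache example Storms cites reflects a standard Unix pattern: a service that must start as root, typically to bind a privileged port, should shed those rights before it handles any traffic. A minimal sketch of that pattern, assuming a Unix host with an unprivileged “www” account (the port and user name are illustrative, not a specific product’s configuration):

```python
# Sketch of privilege dropping: bind the privileged port as root, then switch
# the process to an unprivileged account before serving. Unix-only.
import os
import pwd
import socket

def drop_privileges(username: str = "www") -> None:
    """Switch the current process from root to an unprivileged user."""
    if os.getuid() != 0:
        return  # already unprivileged; nothing to do
    pw = pwd.getpwnam(username)
    os.setgroups([])      # drop supplementary groups first
    os.setgid(pw.pw_gid)  # then the group...
    os.setuid(pw.pw_uid)  # ...then the user, so the drop cannot be reversed

if __name__ == "__main__":
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("0.0.0.0", 80))  # binding port 80 is what required root
    server.listen(5)
    drop_privileges("www")        # a compromise now yields only "www" access
    print("serving as uid", os.getuid())
```

The order matters: groups are dropped before the user ID, because once the process is no longer root it cannot give up anything else.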
“Another common configuration error is altering file system permissions in order to make an application run,” Storms continued. “This is the quick and easy way out of file/folder access problems, but it’s better to ask why the application needs access to those files in the first place.”
“Even if [people] understand the implications of configuration errors, they take the easy way out,” he said.
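The shortcut Storms describes, loosening permissions until the error goes away, also has a simple counterpart check. As a hedged sketch only, assuming a Unix host and a hypothetical application directory, a script can flag world-writable paths that suggest someone took that shortcut:

```python
# Illustrative permission audit: walk a directory tree and flag anything any
# user can write to. The directory path is hypothetical.
import os
import stat

def find_world_writable(root: str) -> list[str]:
    """Return paths under root whose world-writable bit is set."""
    risky = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.stat(path).st_mode
            except OSError:
                continue  # skip paths that vanish or cannot be read
            if mode & stat.S_IWOTH:  # world-writable bit
                risky.append(path)
    return risky

if __name__ == "__main__":
    for path in find_world_writable("/var/www"):  # hypothetical app directory
        print("world-writable:", path)
```

Auditing for over-broad permissions, rather than granting them reflexively, is the discipline Storms is pointing to: ask why the application needs the access before opening it up.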