When the federal government announced earlier this month that Chinese hackers had stolen sensitive personnel records of 4.2 million current and former government employees (myself included), the biggest surprise was that it had taken so long for this kind of breach to occur. The truth is that it was less an indicator of the Chinese government’s technical prowess than it was proof of the U.S. federal government’s lackadaisical approach to securing its computer systems.
Many of the security vulnerabilities that likely contributed to the data breach had already been uncovered by government auditors, to no avail. But rather than pointing fingers merely to score political points, policymakers should use this unprecedented breach to catalyze substantive change in the federal government’s approach to information security by creating a zero-tolerance policy that drives real change.
The most frustrating part of this whole affair is that it might have been prevented if the target of the breach, the Office of Personnel Management (OPM), had followed the federal rules for information security. The Federal Information Security Management Act outlines steps an agency must take to secure its systems. In 2014, the inspector general for OPM found many areas where it did not follow these baseline security practices. For example, it failed to routinely scan its servers for vulnerabilities, implement multi-factor authentication for remote access or maintain a comprehensive inventory of systems. Findings this substantial should have sent shockwaves through the government, but they instead elicited a collective shrug from officials who have grown accustomed to subpar security practices.
While OPM’s problems were more severe than those of other agencies, it is certainly not alone. For example, not counting the Department of Defense, only 41 percent of federal agencies have implemented the minimum authentication requirements for accessing federal networks. Federal agencies are routine targets for cyberattacks, so ignoring these vulnerabilities comes at great risk. The long-term solution to this problem is to build a culture in federal agencies that does not tolerate such poor performance.
Achieving this will require strong leadership from within agencies and vigorous oversight from Congress. When agencies fall short in meeting baseline standards, agency leaders should be held responsible. Agencies that fail to address these problems should face budget cuts and agency heads should be replaced. The purpose of these accountability measures is not to assign blame, but to drive structural change by creating a sense of urgency for improving federal information security practices.
In the short term, President Obama should issue an executive order to address one of the primary reasons this most recent attack was possible: improperly secured data. The president should require agencies to submit to Congress within 90 days a confidential, comprehensive and prioritized inventory of every system that stores sensitive information in an unencrypted format. In addition, federal chief information officers (CIOs) should be required to submit plans to secure these systems, including any additional funding they might need. Congress can then determine whether these deficiencies stem from a lack of resources or from the agencies’ own mismanagement; if the former, it should provide immediate funding to address the shortcomings. CIOs should provide Congress with an update every six months until the job is accomplished.
Given the scope and sensitivity of the personal information that the U.S. government collects, doing a job that is “good enough for government” is no longer acceptable when it comes to information security. Attacks on the government’s information systems are not going to stop. The question is whether we will be prepared.
Castro is the vice president for the Information Technology and Innovation Foundation (ITIF).