It’s becoming all too commonplace. A disgruntled systems staffer tampers with a server to reroute and reject the e-mail of the boss who fired him. An information technology manager plants a malicious code “time bomb” on his debt collection company’s network, triggered to corrupt thousands of company records. A contractor hacks into restricted parts of the network while performing a routine infrastructure upgrade.
Think these are provocative fictional accounts? Then Google the names “Roman Meydbray,” “William Carl Shea” and “Joseph Thomas Colon” to find a long list of misdeeds these IT workers committed against their former employers. Colon, for example, was a contractor working for the FBI when he decided to penetrate classified Witness Protection Program and counterespionage files. He used common decryption tools found on the Internet.
Technology is evolving faster than our capacity to rein it in, raising issues that are not easy to address: data access, access tracking and monitoring, systems auditing and benign hacking. As automation spreads through corporate systems and processes, and as the push for data readiness drives businesses to capture ever more information to compete, how can businesses ensure that the masters of the electronic domains don’t use their undisputed knowledge and power to wreak havoc?
Medicine has the Hippocratic Oath. The heavily regulated medical profession licenses its practitioners and tracks and restricts the prescription of drugs. But what about IT? Other than a few providers of professional certifications, there is no standards or regulatory body that governs IT or defines what qualifies as ethical behavior. No governing body performs or insists on audits of systems logs, demands systematic background checks for IT staffers in security roles or teaches ethics for systems use. So, would a code of ethics help? Advocates consider it more carrot than stick, but at least it’s a starting point. Detractors say its impact would be minimal.
Still, 43 percent of BizTech readers say IT staffers should be required to certify the security of the systems they supervise, and 10 percent report that their employers already require it. Although readers apparently like the idea of IT staffers certifying that their systems are secure, only 39 percent think this step will improve security. Another 36 percent are on the fence.
Currently, monitoring IT’s access to systems is, like most security breaches, an inside job. What happens if an IT worker violates his or her employer’s trust? How would the employer know the transgression even happened? Without some type of auditing or monitoring of systems logs, it may be impossible to discover a security breach. Luckily, disgruntled workers typically make their misdeeds known; Shea’s code bomb, for instance, went off not long after his termination. Just as the Sarbanes-Oxley Act tackled the lack of quality assurance for financial data at publicly traded businesses, a similar set of controls for systems security in IT organizations seems likely, especially if these ethical failures continue to surface. And they will.
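The log auditing this column calls for does not have to be elaborate. As a minimal sketch, assuming a hypothetical one-line-per-access log format and a hypothetical list of accounts that should have been disabled at termination, an audit can simply flag any activity recorded under those accounts:

```python
# Minimal sketch of a systems-log audit: flag any access logged under an
# account that should have been deactivated. The log format, the account
# names and the TERMINATED list are illustrative assumptions, not a real
# product's schema.

TERMINATED = {"rmeydbray", "wshea"}  # accounts disabled at termination (hypothetical)

def audit(log_lines):
    """Return log entries recorded under terminated accounts."""
    suspicious = []
    for line in log_lines:
        # Assumed format: "<timestamp> <user> <action>"
        timestamp, user, action = line.split(" ", 2)
        if user in TERMINATED:
            suspicious.append((timestamp, user, action))
    return suspicious

log = [
    "2006-07-01T09:14 jsmith read payroll.db",
    "2006-07-02T23:41 wshea write collections.db",
]
print(audit(log))  # only the wshea entry is flagged for review
```

The point of the sketch is the control, not the code: someone outside the IT group must actually run such a check, or the audit remains an inside job.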
Editor in Chief