Most SSL Sites Remain Insecure and Vulnerable
With sales of more than $1 trillion in 2012, e-commerce appears to be alive and well. But is it safe?
According to the latest SSL Pulse findings shared by the Trustworthy Internet Movement (TIM), the answer is no: More than 76 percent of websites using Secure Sockets Layer (SSL) encryption to secure traffic are effectively insecure. Fortunately, there are some relatively quick and easy fixes that organizations can implement to address their vulnerabilities.
The Trustworthy Internet Movement, a nonprofit supported by the global security community and dedicated to resolving major security issues on the Internet, has been producing SSL Pulse, an ongoing dashboard that monitors the quality of SSL support across the top 1 million websites, since April 2012. Over that time, the big picture has improved steadily, from 89 percent of websites rated insecure at launch to the most recent figure of 76 percent.
While those are scary numbers, they don’t equate to a post-apocalyptic Internet, where bands of marauding hackers run roughshod over cyberspace. “SSL Pulse shows you what you should be doing, ideally,” says Ivan Ristic, director of application security research for Qualys and a member of TIM. The key to understanding SSL Pulse is that it refers to effective security, meaning that the website’s SSL configuration must be up to date and must have addressed all known vulnerabilities related to SSL.
Jason Brvenik, VP of security strategy for security developer Sourcefire, says that although this is a looming issue, it is not a dire threat.
“The latest research around SSL doesn’t raise cryptographic attacks up to a level of major concern for most IT security staff. There’s a lot of lower-hanging fruit to take care of before getting to SSL attacks,” he says.
To Brvenik’s point, the BEAST attack, which targets the SSL protocol in browsers, was only proved to be a practical attack vector in 2011. But that doesn’t diminish the real concern of the SSL Pulse findings: widespread ignorance of threats to a traffic-securing function that’s essential to today’s e-commerce-driven world.
The difficulties that prevent organizations from maintaining effective SSL security are not technological hurdles; they are actually quite mundane. The primary issue is that many IT teams are running the vendors’ default SSL settings on their servers.
“Most vendors configure their servers with all security protocols and features enabled,” explains Ristic. “That means that the server will accept traffic using any security protocol out there. That’s bad because many of those protocols now have known vulnerabilities.”
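The fix for permissive defaults is usually a matter of switching off the weak protocol versions explicitly. As a minimal illustration of the principle (not any particular vendor’s configuration), here is how a server built on Python’s standard ssl module would do it:

    import ssl

    # Start from the permissive catch-all context that mirrors typical
    # vendor defaults (it will negotiate any available protocol version),
    # then explicitly disable the versions with known vulnerabilities.
    ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
    ctx.options |= ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3

    # A server would then wrap its listening socket with this context:
    # secure_sock = ctx.wrap_socket(plain_sock, server_side=True)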
The Legacy of SSL Technology
The SSL protocol was developed by web-browser vendor Netscape and made its debut back in 1995. SSL went through a couple of upgrades before being superseded in 1999 by the Transport Layer Security (TLS) protocol, which is not interoperable with SSL.
The fact that we are still talking about a legacy protocol that became obsolete 14 years ago speaks to the depth of this security problem. SSL Pulse reports that 27.9 percent of websites still support SSL 2.0 and 99.8 percent still support SSL 3.0, both of which have known vulnerabilities.
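Those numbers are easy to reproduce for any individual site. A minimal probe, sketched with Python’s standard ssl module (the hostname is a placeholder), attempts a handshake restricted to each protocol version in turn:

    import socket, ssl

    # Protocol constants to probe; some may be missing entirely if the
    # local OpenSSL build has already dropped them.
    CANDIDATES = ["PROTOCOL_SSLv2", "PROTOCOL_SSLv3",
                  "PROTOCOL_TLSv1", "PROTOCOL_TLSv1_2"]

    def probe(host, port=443):
        for name in CANDIDATES:
            proto = getattr(ssl, name, None)
            if proto is None:
                print("%-16s not available in local OpenSSL" % name)
                continue
            ctx = ssl.SSLContext(proto)  # handshake limited to one version
            try:
                with socket.create_connection((host, port), timeout=5) as sock:
                    with ctx.wrap_socket(sock, server_hostname=host):
                        print("%-16s accepted by server" % name)
            except (ssl.SSLError, OSError):
                print("%-16s rejected by server" % name)

    probe("www.example.com")  # placeholder hostname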
“SSL 2.0 is often the default setting, which today is too weak,” says Ristic. “Unfortunately, most server managers don’t recognize the need to change the settings, or they are crunched for time and don’t fiddle with them, because they need that server up and running immediately.”
Brvenik agrees. “It’s important for organizations to provide the highest level of support for encryption, but that requires awareness of the need for it. Not all organizations are aware that TLS is out there, and that it is updating and improving,” he says.
Both Ristic and Brvenik touch on a similar archetype: the overworked IT staffer. These data-center gladiators are being pulled in a million directions at once. Can they be expected to stay on top of an arguably obscure (though essential) server setting across several rows of servers while juggling several other mission-critical responsibilities? This, too, is part of the SSL problem.
“Setting up effective SSL encryption requires expert knowledge,” explains Ristic. “It needs to be easier. What we need is a way to automate the server security updating process. It is too complicated, and IT people are typically too stretched for time to properly address it.”
Even if the IT staff is really on the ball and has tweaked all of its servers to support TLS 1.2, it could all be for naught. Browsers typically have poor support for new protocol versions, which in turn gives server manufacturers little incentive to upgrade the protocols on their products. And, more directly, communication is a two-way street: both ends must support a common protocol version for traffic to flow at all.
As Brvenik states, “Businesses also need to look at who their servers are interacting with. Can their partners’ servers support the latest TLS iteration? If they can’t, then why should the business itself bother? Until there’s a business justification for it, many businesses won’t feel compelled to pursue it.”
Organizations also cite business-related reasons for not updating their traffic security: performance degradation and the costs involved.
Jesse Wiener, a principal consultant for CDW, explains, “With servers, there’s overhead involved when securing traffic. Turning on HTTPS has an impact on web server performance, slowing it down. Some of these issues can be addressed with newer equipment, or specialized devices such as a load balancer – and that’s an additional cost.”
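The overhead Wiener describes comes largely from the TLS handshake’s extra round trips and cryptographic work. A rough, back-of-the-envelope way to observe it, sketched in Python with a placeholder hostname, is to compare plain TCP connection setup with a full TLS handshake to the same host:

    import socket, ssl, time

    def connect_time(host, port, use_tls):
        # TLS adds handshake round trips and crypto on top of the TCP connect.
        start = time.time()
        sock = socket.create_connection((host, port), timeout=5)
        if use_tls:
            ctx = ssl.create_default_context()
            sock = ctx.wrap_socket(sock, server_hostname=host)
        sock.close()
        return time.time() - start

    host = "www.example.com"  # placeholder hostname
    print("plain TCP: %.3f s" % connect_time(host, 80, False))
    print("with TLS : %.3f s" % connect_time(host, 443, True))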
Lackluster Digital-Certificate Management Creates Security Gaps
It’s hard to talk about SSL encryption and traffic security without digital certificates coming into the conversation. An organization seeking to secure its website traffic requests a digital certificate from a certificate authority (CA) via an electronic document containing pertinent identity information. The CA “signs” this certificate, thus verifying the identity of the organization. These certificates are then installed on the organization’s servers. During web browsing, the certificate is served to any web browser that connects to the organization’s website, proving to the browser that the server really does belong to the organization it claims to represent. The same certificates authenticate servers during SSL/TLS handshakes.
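That initial request is known as a certificate signing request (CSR). A minimal sketch of generating one, assuming a recent version of the third-party cryptography package (the organization name and hostname are illustrative):

    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    # The private key stays with the organization; the CA never sees it.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # The CSR carries the identity information the CA verifies and signs.
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([
            x509.NameAttribute(NameOID.ORGANIZATION_NAME, u"Example Inc"),  # illustrative
            x509.NameAttribute(NameOID.COMMON_NAME, u"www.example.com"),    # illustrative
        ]))
        .sign(key, hashes.SHA256())
    )

    with open("server.csr", "wb") as f:
        f.write(csr.public_bytes(serialization.Encoding.PEM))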
“This chain of trust established by certificates is very important,” explains Wiener. “Certificates are being spoofed and hacked because they offer an attractive attack vector. Look at 2011’s Comodo attack. This CA was hacked and the integrity of its certificate chain was broken. Comodo was issuing certificates to entities that were not who the certificate said they were. This attack highlights how important it is to maintain that chain of trust with certificates.”
Unfortunately, maintaining certificates on servers is another task that slips down the to-do list of many server managers. Wiener continues, “Managing certificates is a concern for many of the organizations I interact with. ‘How do we manage them?’ It’s an ongoing process, which lends itself to offloading it to a services provider.”
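The ongoing part of that process is mostly bookkeeping: knowing which certificates are deployed where, and when each one expires. A minimal expiry check, sketched with Python’s standard ssl module (the host list is a placeholder for a real inventory):

    import socket, ssl, time

    def days_until_expiry(host, port=443):
        # Fetch the certificate the server actually presents, verified
        # against the system's CA roots, and report the days remaining.
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()
        expires = ssl.cert_time_to_seconds(cert["notAfter"])
        return (expires - time.time()) / 86400

    for host in ["www.example.com"]:  # placeholder inventory
        print("%-24s %4.0f days remaining" % (host, days_until_expiry(host)))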
So what’s an organization to do if it wants to address these server-security problems? Obviously, updating the default SSL settings on servers is the best place to start, taking into account, as Sourcefire’s Brvenik suggested, the encryption levels of traffic partners. “Make sure you offer up the latest TLS suites to users where you can,” Brvenik says. “And make sure your certificates are up to date. Stay on top of managing them.”
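In code terms, “offering up the latest suites” extends the earlier hardening sketch with an explicit cipher preference; the cipher string below is illustrative, not a universal recommendation:

    import ssl

    ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
    ctx.options |= ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3
    # Prefer forward-secret, authenticated-encryption suites and exclude
    # known-weak algorithms (illustrative OpenSSL cipher-list syntax).
    ctx.set_ciphers("ECDHE+AESGCM:ECDHE+AES:!aNULL:!MD5:!RC4")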
CDW’s Wiener says that organizations need to run tests on their servers prior to bringing them online, fine-tuning the SSL settings and making sure that certificates are operating properly. He also suggests organizations “look at management services, to help with ongoing management of servers and certificates. It really helps a lot toward keeping ongoing operations in check.”
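A pre-launch test along those lines can be as simple as two assertions: the certificate chain validates against trusted roots, and legacy protocols are refused. One sketch with Python’s standard ssl module, using a placeholder hostname:

    import socket, ssl

    def pre_launch_check(host, port=443):
        # 1. A fully verified handshake must succeed (valid, current chain).
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                print("verified handshake OK, negotiated", tls.version())

        # 2. SSLv3 must be refused (skip if the local build already dropped it).
        if hasattr(ssl, "PROTOCOL_SSLv3"):
            legacy = ssl.SSLContext(ssl.PROTOCOL_SSLv3)
            try:
                with socket.create_connection((host, port), timeout=5) as sock:
                    legacy.wrap_socket(sock, server_hostname=host)
                print("WARNING: server still accepts SSLv3")
            except (ssl.SSLError, OSError):
                print("SSLv3 refused, as it should be")

    pre_launch_check("www.example.com")  # placeholder hostname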
Ristic’s advice is concise and to the point: “Fixing all the existing servers out there is going to be a very slow and manual process. The best use of our (limited) time right now is to focus on getting all the servers and applications correctly configured before they go online.”