A misconfigured US Ski and Snowboard Association web server exposed the username and password of a source code management user, putting other USSA websites and data at risk. (UPDATED)
On Friday, the University of Utah Health (U of U Health) announced that a phishing attack led to a breach of medical patient information, among other data.
This breach is the latest in a series of medical breaches hitting Utah health providers. Since 2018, about 16% of Utah citizens have had their health information compromised in data breaches.
The UtahCyberCheck project found the US Ski and Snowboard Association (USSA) puts the data of tens of thousands of its members at risk and violates its terms of use and privacy policy by using software with critical vulnerabilities and running at least one hacked website. The association, despite multiple attempts to contact them, has ignored reports about these serious issues. (UPDATED to reflect new developments.)
A hack of the Odyssey Charter School website went unnoticed for possibly two-and-a-half years until the UtahCyberCheck project discovered the hack in December 2019.
As a security professional, I keep tabs on industry news, especially when it pertains to education and government organizations. These institutions regularly announce breaches, usually right after ransomware hits a county office or a university suddenly shuts down most of its servers to stop malware from spreading.
I have sympathy for small shops trying to do the best they can and commend them for doing remarkably well given their constraints. That’s one reason why I’m giving my time. But the IT and internet climate is changing, and these organizations need to adapt. It is no longer sufficient to rely on a small local staff to handle the cybersecurity challenges that even the largest companies and governments struggle with.
One dominant characteristic of municipalities and education institutions that shapes their security posture is their small size. A smaller organization has a smaller internet presence, which means a smaller attack surface and a smaller chance an attacker will be able to get a foothold. Think of it like shopping for a specialty grocery item such as soy sauce: a supermarket is practically guaranteed to carry it (hopefully in stock), but you could go to half a dozen small convenience stores and still not find it. If a hacker is looking for a particular system to attack, they could try dozens of cities or schools and still not find one that uses it.
On the other hand, the small size means there is no dedicated, professional security staff. There are a few IT employees who are good at keeping the computers running but don't have the time or expertise to adequately protect those systems. Security takes a back seat, if it is thought about at all. When someone does take advantage of a vulnerability (and they will), it is highly likely to go undetected unless the effects are visible, such as a defaced website or a ransomware attack.
It is my intention to raise awareness of the current situation in a responsible way that will lead to change and improvement. At the very least, I hope people will acknowledge there is a problem.
You can read the original announcement of my project to highlight deficiencies in local government and education.
I am announcing my small, voluntary effort called UtahCyberCheck (short for Utah cybersecurity check) to show there are deficiencies in the way Utah education institutions and governments currently defend against cyberattacks, stop abuse of their systems, and protect the data of students and citizens. Although my focus is specific to the state of Utah, I may include other geographic locations from time to time.
The information I present from this effort is an indicator, not a complete picture, of Utah institutions' cybersecurity posture. A comprehensive evaluation requires much more information than I readily have access to. Instead, I have chosen to find poor practices, vulnerabilities, and evidence of abuse or compromise in public-facing systems that are owned or managed by education institutions (including school districts and charter schools), the state, and municipalities. The only exceptions are Brigham Young University (BYU) and LDS Business College, because the former is my current employer and the latter is affiliated with my employer. This is a personal project and is not affiliated with or sponsored by BYU.
My actions are not penetration tests. All data is publicly available and legally obtained, and I intend to interpret and follow responsible disclosure guidelines to the best of my ability. The discovery techniques are non-intrusive and do not affect the confidentiality, integrity, or availability of systems or data.
This effort and the subsequent reports should not be taken as a sign of failure, but as a sympathetic push to encourage improvement. We are in an ongoing global cyber-conflict, and the setbacks we have suffered so far do not mean we have lost. If we are going to succeed in the end, we need to improve upon what we're doing now.
When I tweet about this project, I will use #UtahCyberCheck.
Be sure to read my next post detailing some motivation behind this project.
1/7/2020 Update to clarify techniques are benign.
2/6/2020 Included statement that BYU is not associated with UtahCyberCheck. Updated title.
Featured Photo by Nikolai Ulltang from Pexels
Network encryption is a game changer for security teams as it makes it more difficult to identify malicious traffic. It may even paralyze some people and cause others to dismiss network security monitoring altogether.
But does it have to be this way? During a recent SANS webcast entitled Alternative Network Visibility Strategies for an Encrypted World, which hosted Zeek/Bro experts, Matt Bromiley said, “(Encryption) just means I have to change my analysis techniques and change the way I approach these particular datasets as well.”
With the majority of web traffic now served over HTTPS, it is important to decrypt traffic to give visibility to network security monitoring (NSM) tools. The Palo Alto Networks next-generation firewall can decrypt inbound traffic quite effectively.
However, there is one gotcha when enabling this feature on production systems with live traffic. Beware of SSL session caching!
Identifying the SSL decryption transition issue
When I first tested SSL inbound inspection in my Palo Alto firewall, it was in a lab environment and it worked great! The URLs showed up in the logs, I did not get any SSL errors (decrypt-error, decrypt-unsupport-param, or decrypt-cert-validation), and everything seemed to work fine. I submitted a change request and was on my merry way.
Then I enabled the feature on a system with a fair amount of active traffic. The results were startling. There was a huge spike in “decrypt-error” logs I couldn’t explain. Enough users were complaining that I ended up reverting the change, puzzled at why what worked flawlessly in the lab didn’t work in production.
I had two leads on the cause. The first was Palo Alto's 8.0 and 8.1 documentation on the “decrypt-error” session end reason, which says:
“The session terminated because you configured the firewall to block SSL forward proxy decryption or SSL inbound inspection when firewall resources or the hardware security module (HSM) were unavailable. This session end reason is also displayed when you configured the firewall to block SSL traffic that has SSH errors or that produced any fatal error alert other than those listed for the decrypt-cert-validation and decrypt-unsupport-param end reasons. “
Palo Alto Networks Pan-OS® Administrator’s Guide 8.0 & 8.1
The second clue was the error that appeared in the browser windows of some clients that had an active connection to the server at the time. Google Chrome showed “ERR_SSL_VERSION_INTERFERENCE”, Samsung's browser on Android showed “ERR_SSL_PROTOCOL_ERROR”, and Firefox and Microsoft Edge gave similar messages.
This led me to believe that clients with cached SSL sessions were attempting to resume those sessions. The firewall treated each such connection as if it were new, then produced a fatal error when it did not receive the payloads expected of a fresh handshake.
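If you want to check this hypothesis from a client's point of view, you can attempt a resumption yourself and see whether the handshake survives. Below is a minimal sketch using Python's ssl module; the hostname is a placeholder and it pins the connection to TLS 1.2 (TLS 1.3 handles resumption differently), so treat it as an illustration rather than a polished diagnostic.

```python
# Sketch: attempt TLS 1.2 session resumption against a server and report
# whether it succeeds. Hostname is a placeholder, not a real target.
import socket
import ssl

HOST = "www.example.com"   # placeholder: substitute the server behind the firewall
PORT = 443

ctx = ssl.create_default_context()
# TLS 1.3 delivers session tickets after the handshake, so pin to TLS 1.2
# to mirror the classic resumption behavior described above.
ctx.maximum_version = ssl.TLSVersion.TLSv1_2

def handshake(session=None):
    sock = socket.create_connection((HOST, PORT))
    return ctx.wrap_socket(sock, server_hostname=HOST, session=session)

# First connection: full handshake; keep the session for reuse.
first = handshake()
cached_session = first.session
first.close()

# Second connection: offer the cached session back to the server.
try:
    second = handshake(cached_session)
    print("session reused:", second.session_reused)
    second.close()
except ssl.SSLError as err:
    # A fatal alert here matches the behavior I saw when the firewall
    # rejected resumed sessions.
    print("resumption failed:", err)
```

Running this once before enabling decryption and again immediately after should show the difference: a reused session on a healthy server, and a fatal SSL error when resumption is being rejected.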
Resolution to resumed SSL sessions
To resolve this, I tried changing decryption profile settings such as disabling “Unsupported Mode Checks,” but to no avail. On affected clients I tried clearing the SSL cache and even restarting the machines, but that did not correct the issue.
Finally, I reduced the SSL session cache timeout setting on the server itself to 60 seconds. When that happened, the issue disappeared!
I wouldn't recommend shutting off SSL session caching entirely, as that could have a huge performance impact on the server, but a 60-second timeout leading up to and immediately after enabling the policy should be adequate. If possible, keep the 60-second value in place for at least as long as the previous timeout; for example, if the cache timeout had been two hours, leave the reduced value for two hours so every session cached under the old setting has expired. I also found that decreasing the timeout after SSL inbound inspection was already enabled resolved the issue immediately.
You can find documentation for SSL session timeout settings for Nginx, F5, Apache, and IIS. (At the time of writing I have not tested the parameters on each of these.)
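For a quick sanity check that the reduced timeout is actually being advertised to clients, you can inspect the session a server hands back. The sketch below is my own illustration (placeholder hostname, TLS 1.2 assumed); whether the ticket lifetime hint tracks the configured cache timeout depends on the server software, so use it only as a rough indicator.

```python
# Sketch: print the session-ticket lifetime a server advertises.
# Hostname is a placeholder; results vary by server implementation.
import socket
import ssl

HOST = "www.example.com"  # placeholder
ctx = ssl.create_default_context()
ctx.maximum_version = ssl.TLSVersion.TLSv1_2  # classic session resumption

with ctx.wrap_socket(socket.create_connection((HOST, 443)),
                     server_hostname=HOST) as tls:
    sess = tls.session
    print("session ticket issued:", sess.has_ticket)
    print("ticket lifetime hint (seconds):", sess.ticket_lifetime_hint)
```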
Some forum posts suggest restarting the server(s) will also clear the server’s SSL session cache and force a new negotiation, although I did not test this. This isn’t always feasible if sessions are shared across multiple backend servers, there is a load balancer at play, or engineers are turning on decryption for a large number of servers.
When I contacted Palo Alto about this issue, they told me, “(T)here is feature request (FR ID: 5786) in addition to Jira PAN-80072,” that they did not have a work-around, and that “the only thing to be done now is to wait till further notice.” I am disappointed they do not publish the known issues surrounding decryption and did not have a work-around readily available as this would have saved me hours of troubleshooting, research, and some downtime.
If you found this write-up useful, I ask that you let people on Twitter and LinkedIn know. If you liked this post, check out Is Network Security Monitoring Dead in the Age of Encryption?
Over the last several years we have seen encryption become more pervasive. Does it now make sense for security teams to invest in network security monitoring solutions?
With the strong push for encryption on everything from websites to hard drives, encryption is becoming a standard practice for most organizations. Reviewing the graph below from Google’s Transparency Report, we see that a majority of web traffic is now HTTPS.
Encryption is permeating other protocols. In September 2018, CloudFlare announced a new protocol that hides the server name during the SSL handshake. RFC 7858 (DNS-over-TLS) and RFC 8484 (DNS-over-HTTPS) were both proposed this decade and are already implemented by some organizations. (Note that DNSSEC doesn't encrypt DNS queries; it authenticates the responses.) SMB and SNMP both include cryptographic capabilities in their third versions. Microsoft's Remote Desktop Protocol now incorporates SSL, and SSH has always been encrypted.
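To see how little a network sensor gets to work with, here is a short illustrative sketch of a DNS-over-HTTPS lookup against Cloudflare's public resolver using its JSON API (the endpoint and parameters are taken from Cloudflare's documentation and may change). On the wire this is just another HTTPS request, so the query name never crosses the network as cleartext DNS on port 53.

```python
# Sketch: resolve a name over DNS-over-HTTPS via Cloudflare's JSON API.
import json
import urllib.request

url = "https://cloudflare-dns.com/dns-query?name=example.com&type=A"
req = urllib.request.Request(url, headers={"accept": "application/dns-json"})

with urllib.request.urlopen(req) as resp:
    answer = json.load(resp)

# Print each answer record (name, record type, resolved value).
for record in answer.get("Answer", []):
    print(record["name"], record["type"], record["data"])
```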
It seems that just about all data transmitted over a network is either encrypted or moving in that direction. It is for these reasons that some vendors push to move security monitoring to the endpoint, where the machine decrypts the information anyway. Is network security monitoring dead in the coming age of encryption?
Every organization with an Internet presence is battling for survival. State-sponsored hackers and organized crime groups continue to gather power and are more dangerous than ever before. We have approached the day when organizations must combine forces and reallocate resources to effectively defend against these formidable adversaries.