Teri Radichel

Summary

The article emphasizes the importance of understanding and controlling data flow within an organization to prevent exposure and breaches, particularly in cloud environments.

Abstract

The article, part of a series on cybersecurity for executives, stresses the critical need for organizations to maintain a comprehensive inventory of their data assets and monitor data flow to prevent exposure and potential breaches. It highlights recent security issues, such as unsecured AWS S3 buckets, databases exposed to the Internet, and the use of third-party services like DropBox and Google Drive, which can lead to data leaks. The author, Teri Radichel, points out that the shift towards cloud services and the ease of changing network configurations have increased the risk of data breaches, often due to a lack of proper governance, inadequate training, and insufficient monitoring of data transfers. Radichel also warns against the dangers of integrating third-party code and services into websites without proper validation and security measures, as well as the risks associated with vendors and consultants who may have access to sensitive data. The article advocates for establishing strict controls on data connections, enhancing visibility into data flows, and providing comprehensive security training to both technical and non-technical staff to mitigate the risk of data exposure.

Opinions

  • The author believes that knowing where your data is and how it can flow within and outside the organization is as crucial as maintaining an inventory of hardware and software assets.
  • Radichel suggests that the ease of cloud services and the ability for developers to change network configurations without proper oversight contribute to the increased risk of data breaches.
  • The article conveys the opinion that organizations often overlook the risks associated with third-party services and storage solutions, which can lead to unauthorized data transfers and breaches.
  • The author emphasizes the importance of proper governance and monitoring, including the use of Cloud Access Security Brokers (CASBs) and Security Information and Event Management (SIEM) systems, to gain visibility into data flows and detect potential breaches.
  • Radichel criticizes the common practice of integrating third-party code and services into websites without adequate security checks, which can expose customer data to external sources and malware.
  • The author points out that reliance on external vendors and consultants introduces additional risks that need to be managed through careful monitoring and access controls to sensitive data.
  • Radichel advocates for regular security training for employees, including developers and non-technical staff, to raise awareness about data breach risks and preventive measures.

Data exposure

Protect your gold

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

⚙️ Part of my series on Cybersecurity for Executives

💻 Free Content on Jobs in Cybersecurity | ✉️ Sign up for the Email List

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Do you know where the data lives in your organization? If you don’t, how do you know if it is exposed or not? If you think you know where it is, do you understand all the ways it can flow in and out of systems that might inadvertently expose it to the Internet or third parties?

In my last post in my online book, Cybersecurity for Executives, I wrote about how open network ports can expose your organization to unnecessary risk. The concept of blocking network ports to keep things inside or outside your network presumes you have control over your network perimeter and can track what goes in and out through those ports. In this day and age of cloud services, that is sometimes easier said than done. This post looks at some of the changes in systems and architectures in recent years that make managing network access difficult. Some of it has to do with approved third-party access to systems that creates paths for data exfiltration (a fancy way to say data extracted from your network in a subversive manner).

Click here to purchase a full copy of the ebook or paperback on Amazon: Cybersecurity for Executives in the Age of Cloud

The Center for Internet Security publishes a list of the top controls businesses should implement to maintain a secure environment. The list comes from the most common causes of data breaches and what will stop them. The first two items on the list involve maintaining an inventory of hardware and software assets. However, I have often wondered — isn't it just as important to know where your data is, and where it can flow? When I give security presentations, I like to say, "Your data is your gold." Businesses derive value, in part, from the data they maintain. Having that data leak may decrease a company's competitive advantage and along with it the reason one company is worth more than another. Think about the impact of your competitors knowing your business plans, customers, and intellectual property, not to mention the potential $350 million price tag of mega breaches I wrote about in the first post.

What are some of the recent causes of data exposure? Of course, I have to mention the rash of AWS S3 buckets left open to the Internet. An AWS S3 bucket is a place to store data. Technically it's called "object storage," but to a business executive, it's a way to store files so applications can access and manage them in a cloud environment. Azure and Google Cloud Platform have similar constructs. Due to the way S3 was originally built, an organization could not keep S3 storage inside a private network. As I wrote about in a separate blog post, new features now exist that help lock down S3 buckets, but during the timeframe I was on the Capital One cloud team, the files had to traverse the Internet to go between applications and the buckets. A lot of companies probably never thought about this, but Capital One put in feature requests for some of the new controls that exist today that can keep those files off the Internet.

People who don't understand and are not thinking about networking won't consider these risks when they choose a particular service. Often it's a business owner focused on how something functions or a developer who is just trying to get something to work. The network and security teams may not even understand how an S3 bucket works or that anyone is using one. Buckets get exposed when developers who are not adequately trained in networking maintain the Internet access configuration: they change the configuration to open up the bucket without fully understanding the implications. In other cases, the bucket may have been securely configured to start, but someone changed the configuration after the fact — an ops person in production responding to a ticket, or malware that got into the cloud environment.
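To make the misconfiguration concrete, here is a minimal sketch of an audit check that flags public grants in a bucket access control list. The ACL dictionary shape mirrors what the AWS API returns for a bucket ACL, but the helper function and sample data are my own illustration, not any particular company's tooling.

```python
# Hypothetical audit helper: flag S3-style ACL grants that expose data publicly.
# The dict shape mirrors an AWS bucket ACL: a list of grants, each naming a
# grantee and a permission.

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(acl):
    """Return the permissions this ACL grants to the public at large."""
    exposed = []
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GRANTEES:
            exposed.append(grant.get("Permission"))
    return exposed

# A locked-down bucket grants access only to its owner:
private_acl = {"Grants": [{"Grantee": {"Type": "CanonicalUser", "ID": "owner"},
                           "Permission": "FULL_CONTROL"}]}
# A misconfigured bucket grants READ to everyone on the Internet:
open_acl = {"Grants": [{"Grantee": {"Type": "Group",
                                    "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
                        "Permission": "READ"}]}

print(public_grants(private_acl))  # []
print(public_grants(open_acl))     # ['READ']
```

A check like this catches both the developer who opened a bucket to "get something to work" and the configuration that drifted after the fact.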

Another common occurrence lately has to do with databases being exposed directly to the Internet. In the past, this was not as likely because a network and security team were involved in the design of most systems containing sensitive data. The network architecture and the server implementation put those critical database and data storage servers in a private network that required traversing additional network layers before reaching the data. Developers did not have a choice regarding which server would host the data or in which part of the network.

The recent shift is that developers are now often responsible for implementing cloud environments where networking is easy to change, and databases are sometimes set up with direct Internet access. Additionally, some database services from cloud providers only operate over the Internet and are exposed by virtue of how they work. This type of breach is especially surprising, given that it is an obvious flaw in system architecture and configuration, but it is happening much too often. Some of the most common culprits are Mongo databases and Elasticsearch. However, attackers have also breached relational databases and cloud storage for the same reason.
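A simple way to spot this class of flaw is to review firewall or security-group ingress rules for database ports reachable from the entire Internet. The sketch below, with made-up rule data, shows the idea: a rule admitting every source address (a /0 network) on a database port is a red flag.

```python
# Illustrative review check: flag ingress rules that expose common database
# ports to the whole Internet. Rule data here is invented for the example.
import ipaddress

DB_PORTS = {27017: "MongoDB", 9200: "Elasticsearch",
            3306: "MySQL", 5432: "PostgreSQL"}

def exposed_databases(rules):
    """rules: list of (source_cidr, port) ingress entries; return exposed DB names."""
    findings = []
    for cidr, port in rules:
        net = ipaddress.ip_network(cidr)
        # A prefix length of 0 (e.g. 0.0.0.0/0) admits every address on the Internet.
        if net.prefixlen == 0 and port in DB_PORTS:
            findings.append(DB_PORTS[port])
    return findings

rules = [("10.0.0.0/8", 27017),   # MongoDB reachable from the private network only: fine
         ("0.0.0.0/0", 9200),     # Elasticsearch open to the Internet: finding
         ("0.0.0.0/0", 443)]      # public HTTPS: expected for a web tier
print(exposed_databases(rules))   # ['Elasticsearch']
```

In an environment where developers change networking freely, running a check like this continuously matters more than reviewing the design once.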

Similar to S3 buckets, services like Dropbox, Box, and Google Drive facilitate storage, but often for non-technical users in a company. I was listening to a panel talk about trying to maintain governance when it comes to cloud services. A sales executive for a security vendor spoke up and said that he uses Dropbox even though his company does not permit it, because it is the only way he can do business and get files to clients. He works for a security company! Organizations need to understand what people need to do their jobs, provide systems that allow it, and find a way to monitor for unauthorized file transfers. Employees also need to be trained on the repercussions of too much data exposure. Companies need to be aware that insiders have used these types of storage services to steal incredibly sensitive data from organizations.

Services that facilitate data flows and transfers are particularly risky if they are not architected with proper visibility, so that a company can monitor where its data is flowing between integrated systems and services. I've reviewed the way companies set up cloud applications as a Director of SaaS Engineering and Cloud Architect. I also examined system architecture related to data breaches and potential acquisitions as Director of Security Strategy and Research for a security vendor. Additionally, I've had the opportunity to review system architecture while working on cloud audits through my company, 2nd Sight Lab. Due to the way some cloud systems work, it may be challenging for IT and security departments to obtain visibility into data flows between systems. Some companies also do not consider the risk related to these data flows when they are just trying to make things work.

For example, one particular type of application allows companies to easily transfer data to and from other companies and cloud services by quickly setting up new connections through a user-friendly console and a few button clicks. The first problem with this type of system is that implementing governance is challenging when people are simply clicking buttons. One application I reviewed had no evidence of change control, and no one was monitoring the logs for suspicious activity. They also were not aware of the steps they should take if a data breach occurred. A high risk exists in scenarios like this for an inadvertent or purposefully malicious change that diverts data to the wrong place.

Another problem in this scenario is that two data flows could exist — one that sends data into the service and another that sends data out to some other third party. This type of data flow would never traverse the IT and security monitoring systems the company has in place. Perhaps the IT and security teams don't even know these connections exist. No one would ever know the data exfiltration occurred. No data loss prevention system such as a CASB (Cloud Access Security Broker) would ever see it if the cloud provider is not configured to send those logs to a SIEM and does not integrate with any logging system via an API. In the case of an incident, the company would be dependent on the cloud provider to deliver logs for evidence — if they exist and are adequately handled to maintain chain of custody (the proper handling so they will stand up to scrutiny in a court case).
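Even when a transfer service's logs are available, someone has to look at them. One low-effort form of the monitoring described above is to compare observed transfer destinations against an approved allowlist so a new, unexpected connection stands out. The destination names and log shape below are invented for illustration.

```python
# Illustrative monitoring sketch: compare destinations seen in a transfer
# service's logs against an approved allowlist. All names here are made up.

APPROVED_DESTINATIONS = {"partner-a.example.com", "billing.example.net"}

def unapproved_transfers(log_entries):
    """log_entries: iterable of dicts with a 'destination' key.
    Return the distinct destinations not on the allowlist, sorted."""
    return sorted({entry["destination"] for entry in log_entries
                   if entry["destination"] not in APPROVED_DESTINATIONS})

logs = [{"destination": "partner-a.example.com", "bytes": 10240},
        {"destination": "exfil.attacker.example", "bytes": 734003200},
        {"destination": "billing.example.net", "bytes": 2048}]

print(unapproved_transfers(logs))  # ['exfil.attacker.example']
```

The allowlist itself doubles as documentation of intended data flows, which is exactly the change-control evidence the application I reviewed was missing.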

Another recent source of data breaches is websites that blindly send customers to third-party domains or code when a webpage loads. When a web site loads, it may include links to third-party domain names (like https://fonts.google.com/ to get a font to use on the web site). Sometimes I open a web page and find 20 or 30 different connections to domains other than the one I am trying to view! This seems excessive. News websites are some of the worst offenders in my experience.

When a customer opens your web page, if your site is connecting to all these other third-party sites, data that the customer downloads from your web site or enters into it could be accessible to those third parties if a vulnerability or misconfiguration exists. The third-party web site or code could also be serving up malware. The Magecart campaign has reportedly infected over 960 e-commerce stores by injecting malicious code. Note that the article suggests the problem has something to do with cloud providers, but more likely the attackers are simply running automated attacks against applications in those environments, rather than it being a problem with the cloud providers themselves. More details should emerge over time.

Have your developers download third-party code files and store them in your own source control system. Validate that the code does not contain software vulnerabilities, and serve it from your domain name. Make sure your site does not expose customer data to external sources that serve up code, advertisements, and even images. Alternatively, proxy requests through your web server in a secure manner to the site serving the ads, and validate the content on the way back through. Serve all the content from your own domain. One other way data is exposed is through misconfiguration of something called CORS (Cross-Origin Resource Sharing), which specifies which third-party web sites can access data from your web site.
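One standard browser mechanism that complements the advice above (not mentioned in the original, but widely used for this purpose) is Subresource Integrity: the `<script>` tag carries a hash of the expected file, and the browser refuses to run the script if the file changes, which is exactly the kind of silent swap Magecart-style attacks rely on. The hash can be computed with a few lines of standard-library code:

```python
# Sketch: compute a Subresource Integrity (SRI) value for a script file, so the
# browser refuses to execute it if the served bytes are later modified.
import base64
import hashlib

def sri_hash(script_bytes):
    """Return an integrity attribute value (sha384 form) for a <script> tag."""
    digest = hashlib.sha384(script_bytes).digest()
    return "sha384-" + base64.b64encode(digest).decode("ascii")

# In practice you would hash the vetted file from your source control system.
tag_value = sri_hash(b"console.log('hello');")
print(tag_value)
# Used in HTML as:
#   <script src="/js/lib.js" integrity="sha384-..." crossorigin="anonymous"></script>
```

If you serve the file from your own domain as recommended, the SRI hash additionally protects you against your own CDN or build pipeline being tampered with.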

As an executive, you don’t need to understand all the details of how these particular web technologies work, but you can ask how many different domains customers are exposed to when they open your web site or web application in their browser. Limit those connections to limit data exposure while customers are browsing your website. Additionally, make sure your development staff gets proper security training on top threats, how data breaches work, and potential vulnerabilities so they can design and implement more secure systems.
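The question "how many domains does our page expose customers to?" can be answered roughly with a few lines of standard-library code that scans a page's markup for external hosts. This is only a static sketch with invented sample HTML; a real audit would use browser developer tools or a proxy, since scripts can open further connections at runtime.

```python
# Rough static audit: count distinct third-party hosts referenced in a page's
# HTML. Sample page and hostnames below are invented for the example.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyAudit(HTMLParser):
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.third_party = set()

    def handle_starttag(self, tag, attrs):
        # Any src/href pointing at another host is a third-party connection.
        for name, value in attrs:
            if name in ("src", "href") and value:
                host = urlparse(value).hostname
                if host and host != self.own_host:
                    self.third_party.add(host)

page = """<html><head>
<link href="https://fonts.google.com/css" rel="stylesheet">
<script src="https://cdn.tracker.example/t.js"></script>
<script src="/js/app.js"></script>
</head></html>"""

audit = ThirdPartyAudit("shop.example.com")
audit.feed(page)
print(sorted(audit.third_party))  # ['cdn.tracker.example', 'fonts.google.com']
```

Even this crude count gives an executive a number to ask about, and a trend to watch release over release.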

One other point of data exposure exists when your organization hires external vendors to maintain systems and data. A recent rash of attacks on MSPs, MSSPs, and organizations that develop systems for other companies has exposed data of the customers they support. A Wall Street Journal report explains how Chinese hackers breached U.S. Navy contractors. A company named Attunity exposed data of Ford, TD Bank, and Netflix. I already mentioned the breach of Wipro, India's third-largest IT outsourcing company.

Think about how you monitor network access within your organization. If your IT or security team sees suspicious actions happening on your network, they will hopefully take appropriate action. If they see a connection by a vendor you have hired to assist with system maintenance or development, would they consider this activity suspicious? Likely the vendor is connecting over an encrypted channel from a trusted network. Your IT or security team has limited or no visibility into the vendor systems or networks. Companies need to be aware of how well their vendors maintain security within these external systems and networks. Organizations also need to be cognizant of what type of data is accessible to the consultants and vendors connecting to their network. Additionally, consider how the vendors are connecting and what can and cannot be monitored.

Companies need to govern how connections to data are established and monitor data transfers for potential loss or exfiltration. Reporting should exist that shows the types of data stored in systems, and the intended and possible data flows between systems to correctly evaluate the risk of loss, modification, or exposure. Organizations need to establish controls related to who can set up new databases, networking, and data connections and with what requirements to help prevent data breaches. Employees need to understand the risk of data exposure as well via adequate security training — including technical and non-technical personnel.

Follow for updates.

Teri Radichel | © 2nd Sight Lab 2019

About Teri Radichel:
~~~~~~~~~~~~~~~~~~~~
⭐️ Author: Cybersecurity Books
⭐️ Presentations: Presentations by Teri Radichel
⭐️ Recognition: SANS Award, AWS Security Hero, IANS Faculty
⭐️ Certifications: SANS ~ GSE 240
⭐️ Education: BA Business, Master of Software Engineering, Master of Infosec
⭐️ Company: Penetration Tests, Assessments, Phone Consulting ~ 2nd Sight Lab
Need Help With Cybersecurity, Cloud, or Application Security?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
🔒 Request a penetration test or security assessment
🔒 Schedule a consulting call
🔒 Cybersecurity Speaker for Presentation
Follow for more stories like this:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 
❤️ Sign up for my Medium Email List
❤️ Twitter: @teriradichel
❤️ LinkedIn: https://www.linkedin.com/in/teriradichel
❤️ Mastodon: @teriradichel@infosec.exchange
❤️ Facebook: 2nd Sight Lab
❤️ YouTube: @2ndsightlab