Earlier this month, the Financial Services Information Sharing and Analysis Center (FS-ISAC) held its annual summit in Miami. Attended by over 1,100 individuals, this was a highly concentrated gathering of information security and information technology (IT) professionals from the financial services industry. At the FS-ISAC Summit, I had the opportunity to attend some sessions, speak with a number of attendees, and get a sense of what’s top of mind. In this post, I’ll touch on a few topics that cropped up on multiple occasions during my travels through the summit.
At an Amazon Web Services (AWS) session, nearly half the audience raised their hands during an informal poll to see who was already using public cloud services. AWS stated that it has more than 1,000 customers from the financial services industry. In contrast, at a PricewaterhouseCoopers (PwC) session, only a handful put their hands up in response to a question about whether the public cloud is more secure. So clearly, the financial services industry remains cautious about adopting the public cloud, but at the same time realizes that the benefits are too great to ignore.
The flexibility, near-infinite scalability, and cost advantages of public cloud computing continue to resonate with CIOs as a means to provide IT services without the traditional delays and up-front capital investment required by private data centers. Ultimately, this translates into enabling the business to pursue competitive advantages in a timely fashion. This echoes my own experience in meeting with financial institutions. Many are conducting proofs of concept with public cloud service providers to better understand the security implications and to sort out the processes and technologies required to use these services safely. However, other institutions remain on the sidelines with a “wait and see” attitude. Ultimately, the path forward for the financial services industry will entail migrating less sensitive workloads to the public cloud first, but still with appropriate security controls in place.
At the end of the day, your business-critical data is the asset that needs to be protected. Consequently, you need to know where it resides, who has access to it, and how it travels through your network. Unfortunately, knowledge of one’s own traffic and network is generally limited. In most cases, applications and their associated data traffic simply appear on the network: there is generally no governance process for introducing new or modified data flows. Besides being a problem for network capacity planning, this lack of visibility into new application traffic limits the ability to secure the environment.
To protect data, encryption at rest has become the norm. However, that alone is not sufficient. Visibility into how and where data flows during the course of normal business is critical. Armed with this knowledge, deviations from the baseline can be detected and even stopped with appropriate network segmentation. Of course, a process to govern new or changed application traffic flows will then be necessary to put corresponding controls in place across the network. This approach enables the necessary business workflows while constraining the unexpected traffic that is the hallmark of malicious actors. By limiting lateral movement within the network, the attack lifecycle of advanced threats is severely hampered, and further attacks can be prevented.
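The baseline-and-deviation idea above can be illustrated with a short sketch. This is a hypothetical toy model, not a product API: flow records, zone names, and function names are all invented for illustration. It builds a whitelist of observed application flows, then flags anything outside it as a candidate for a segmentation policy to block.

```python
# Toy sketch (hypothetical flow model): baseline the application flows seen
# during normal business, then flag deviations for segmentation to block.
from typing import List, NamedTuple, Set

class Flow(NamedTuple):
    src_zone: str   # network segment the traffic originates from
    dst_zone: str   # segment it targets
    port: int       # destination port

def build_baseline(observed: List[Flow]) -> Set[Flow]:
    """Record the flows observed during normal business operation."""
    return set(observed)

def find_deviations(baseline: Set[Flow], current: List[Flow]) -> List[Flow]:
    """Return flows absent from the baseline -- candidates for blocking."""
    return [f for f in current if f not in baseline]

# Approved flows: web tier -> app tier, app tier -> database tier.
baseline = build_baseline([
    Flow("web", "app", 8443),
    Flow("app", "db", 5432),
])

# A workstation reaching the database tier directly is not in the
# baseline, so it is flagged as possible lateral movement.
deviations = find_deviations(baseline, [
    Flow("web", "app", 8443),
    Flow("workstation", "db", 5432),
])
print(deviations)
```

In practice the baseline would come from flow logs rather than hand-written records, and the governance process described above would control how new entries are added to it.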
Several sessions mentioned the value of pre-defined plans to maintain business operations during a cybersecurity crisis, to coordinate response and remediation efforts, and to ensure appropriate, timely communication with regulators, customers, employees, and the public. There is certainly an element of business continuity involved here, but the plans may also include having cyber breach attorneys and cyber forensic teams on retainer as supplemental resources. The post-breach plan would also need to be exercised periodically to ensure all parties understand their roles and any interdependencies between them. No one can argue against the wisdom of being prepared for the aftermath of a breach. However, taking measures up front to prevent a successful breach in the first place is at least as important. Although he surely wasn’t speaking about data breaches, Benjamin Franklin’s quote about an ounce of prevention readily applies here. By adopting a philosophy of prevention, institutions can improve their overall cybersecurity posture and reduce the likelihood of ever invoking their post-breach plans. A balanced approach to prevent, detect, and respond would best serve the organization.
Palo Alto Networks Next-Generation Security Platform can protect financial institutions by preventing both known and unknown attacks. To learn more about how we secure the public cloud, how to apply network segmentation, and how to prevent successful breaches, please visit the following resources: