
This week, VPN Haus continues its conversation with Branden Williams, a seasoned information security specialist, about PCI and the cloud.

VPN Haus: Because of PCI 2.0’s lack of clarity on the cloud, do you think most merchants will only move non-PCI related data to the cloud – until they get more guidance from the Council?

Branden Williams: Frankly, I don’t think the virtualization bit should have been added to PCI DSS 2.0. That’s a training issue. But since they did add it, I bet merchants and service providers will look to the Council to provide guidance on cloud. Companies should approach cloud from a security and data perspective. Regulated data should probably not be put into a public cloud, but catalogue or other public data certainly could be. It’s not an all-or-nothing approach. Savvy IT and IS managers will look at the spread of options and implement what makes the most sense for each type of service. Companies waiting for the Council to tell them what to do will miss out on one of the biggest economic shifts in IT services of our generation. Their competitors will pass them by.

VPN Haus: You’ve compared physical security with network security. What are some lessons learned from physical security that IT administrators can use? Obviously you can’t use someone’s body language to determine intent with network security…or can you?

Williams: Interesting concept, could you use body language to determine intent? I think it depends on the distance we are talking about. If you can physically observe the body language of the individual, you may be able to determine intent. But if you cannot see the individual, you can use analytics of their activities to determine intent. Most companies avoid this activity because they struggle with justifying the cost versus the risk. The cost gets a bit out of control when you have multiple entry points with multiple applications and business lines. It would be pretty easy to do this for a small company with only one corporate location and a website with a single function. Attackers get crafty and disguise their gentle testing of the environment, and without context or other types of fingerprinting, it’s difficult to track one individual over a period of time. If you assume people are already in the network or always knocking on your door, you create a layered approach to security just like you would in the physical world (supply closets, data centers, and other sensitive areas often require additional badge access).
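The “analytics of their activities” idea above can be made concrete with a small sketch. This is a hypothetical illustration, not any product’s implementation: count unusual events per source over a sliding time window, and flag a source whose quiet, repeated probing starts to look like intent rather than accident.

```python
from collections import defaultdict, deque
import time

class ProbeTracker:
    """Hypothetical sketch: track suspicious events per source over a
    sliding window and flag sources that keep knocking on the door."""

    def __init__(self, window_s=3600, threshold=5):
        self.window_s = window_s          # how far back we look, in seconds
        self.threshold = threshold       # probes-per-window that imply intent
        self.events = defaultdict(deque)  # source -> timestamps of probes

    def record(self, source, now=None):
        """Record one probe; return True when the source crosses the threshold."""
        now = time.time() if now is None else now
        q = self.events[source]
        q.append(now)
        # Drop events that have aged out of the window.
        while q and now - q[0] > self.window_s:
            q.popleft()
        return len(q) >= self.threshold

tracker = ProbeTracker(window_s=3600, threshold=3)
for t in (0, 600, 1200):
    flagged = tracker.record("203.0.113.9", now=t)
print(flagged)  # True: three probes inside one hour from the same source
```

As Williams notes, the hard part in practice is not this logic but the fingerprinting: attackers rotate addresses and disguise their testing, so correlating “one individual over a period of time” needs far richer context than a single counter.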

This week, VPN Haus catches up with Branden Williams, a seasoned information security specialist, about PCI and the cloud.

VPN Haus: You’ve blogged about the fact that cloud isn’t overtly mentioned in PCI 2.0. Can you provide some examples of common problems merchants/service providers considering cloud solutions might come up against when dealing with QSAs who don’t have cloud experience?

Branden Williams: Merchants and service providers considering cloud solutions should absolutely read and understand the impact that the fine print of their cloud provider contracts has on their security and compliance initiatives. In many cases, the most economical options are the least security- and compliance-friendly. Once a suitable contract that meets Requirement 12.8 (at a minimum) is executed, you may need to train your QSA on how the solution works. In many cases, the QSA will not understand how to assess a cloud environment, but it should not be assessed against any different requirements than a physical environment. QSAs must spend some time learning how your particular solution works before they can make a judgment call on compliance. This may extend the duration and increase the cost of your assessment.

VPN Haus: In the blog post, you recommend folks using the cloud map their data, yet many companies don’t do this. What’s the major challenge to mapping data?

Williams: Mapping data and data flows is an immense task. Most companies don’t have singular systems or flows, and data sprawls everywhere. Moreover, to truly discover and map this data, you need tools. Some of these tools can be pricey and can impact operations, which forces companies to reconsider their deployment. Add to that a heterogeneous IT deployment that includes various flavors of Windows, UNIX, mid-tier, and mainframe computing, and it’s easier to give up than to actually map things. Another challenge is keeping the map current once the initial process is done: by the time the first pass is finished, you can almost guarantee that something minor has changed. To do it well, you need tools to help you.
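The core trick the discovery tools Williams mentions rely on can be sketched in a few lines. This is an illustrative toy, not any vendor’s scanner: look for 16-digit sequences in text and keep only those that pass the Luhn check, which is how tools separate real card numbers from random digit strings.

```python
import re

# Matches 16 digits, optionally separated by spaces or dashes.
PAN_RE = re.compile(r"\b(?:\d[ -]?){15}\d\b")

def luhn_ok(digits: str) -> bool:
    """Luhn checksum: true for well-formed card numbers."""
    total = 0
    for i, d in enumerate(reversed(digits)):
        n = int(d)
        if i % 2 == 1:      # double every second digit from the right
            n *= 2
            if n > 9:
                n -= 9
        total += n
    return total % 10 == 0

def find_pans(text: str) -> list[str]:
    """Return candidate card numbers found in free text."""
    hits = []
    for m in PAN_RE.finditer(text):
        digits = re.sub(r"[ -]", "", m.group())
        if luhn_ok(digits):
            hits.append(digits)
    return hits

sample = "order notes: card 4111-1111-1111-1111, ref 1234567890123456"
print(find_pans(sample))  # ['4111111111111111'] — the ref fails the Luhn check
```

Scaling this from one string to mainframes, databases, and file shares — without hammering production systems — is exactly the cost-and-operations problem Williams describes.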

Stay tuned, next week VPN Haus talks to Williams about comparisons between physical security and the cloud.

By Anton Chuvakin

  • Mainstream security in the cloud: Yes, Qualys and a few others have been doing it since 1999, and a few cloud security providers have been absorbed into larger entities (latest, sort of). But I suspect that in 2011 we will see much more of “approach to security of … now in the cloud.” By the way, I mean REALLY using SaaS/PaaS/IaaS cloud options, not the “press-release cloud” many tout today.
  • “New” types of incidents: Going out on a limb, I predict a few large (and very damaging) breaches, NOT involving regulated PII, but good old secrets. Wikileaks mentality + cybercrime resources = a fun year!
  • SIEM for dummies: OK, this is another risky one. As you know, there is no leader in the SMB/SME SIEM market, and I am really looking for somebody to climb that hill. The world needs a definitive “SIEM for dummies.” As of today, SIEM decidedly is not for dummies.
  • Security vendors: Despite the silly 2007 predictions by the RSA CEO, there will still be hundreds of security companies around. However, some of the players will definitely feel like they have “overstayed the market’s welcome” (e.g., some legacy SIEM vendors) and will either die or be sold off in a fire sale.
  • Risk “management”: Every year in the past, I predicted that we would remain dazed and confused about how to apply risk to information security in an objective manner (objective, not necessarily quantitative). This year…. drumroll… I am laying these dark thoughts to rest – at least for a while. Maybe, just maybe, we are starting to see both data and approaches that will eventually give us something to work with, and not just whine about it.

For Part 1 of this series, click here.

VPN Haus continues its conversation with PCI compliance expert Anton Chuvakin about the latest updates to PCI DSS 2.0, issued late last month.

VPN Haus: Do the new standards leave too much open to merchants’ interpretation?

Anton Chuvakin: This is really the million-dollar question, and only practice will tell. I think the 2.0 version leaves less to interpretation than before. For example, virtualization was a big question mark in many merchants’ minds, and now it is resolved. Many other questionable and debatable points are clarified, but I am sure merchants will come up with more excuses as PCI DSS 2.0 is implemented in practice.

VPN Haus: Do you think pushing the DSS lifecycle from 24 months to 3 years will stagnate the rate of change? Or will it allow more time to investigate and build support around necessary changes?

Chuvakin: Well, I will side with [PCI General Manager Bob Russo] on this one: PCI DSS is getting mature enough not to need change that frequently. While some assault the standard as “not being dynamic,” in reality, doing what PCI DSS prescribes and doing it well – following the spirit and not only the letter – will equip organizations to deal with today’s and, in my opinion, tomorrow’s threats. For example, the recent Verizon PCI report showed that non-compliant organizations seem to fare worse, which indirectly confirms that PCI DSS in its current form helps reduce the risk of data theft.

See previous interviews VPN Haus did with Chuvakin on PCI compliance here and here.

We continue our conversation with Martin McKeay, a seasoned IT security professional dedicated to spreading awareness about security and privacy through his “Network Security Blog” and podcast series.

On whether PCI standards will strengthen:

I think the standards are going to change, but slowly. They’ll change faster than a federal mandate could, and I think that’s their strength. The PCI standards 2.0 should be released in October, but from what we’re seeing right now, there are no major changes.

I’m hoping some of the other special interest groups working behind the scenes will provide some clarity and some guidance on new technologies. But even that’s going to take a while, so I don’t see any major changes in the next few years.

On the technologies he’d like to see recommended in the PCI standards:

Hopefully, we’re going to see tokenization and end-to-end encryption as technologies the PCI Council recognizes and encourages more people to use and implement. Both are still nascent technologies that a lot of people are trying to figure out and implement. But neither has an accepted industry definition or an accepted industry implementation.

On his definition of end-to-end encryption and tokenization:

End-to-end encryption is when you encrypt a credit card from the moment you take the information to the end. I feel that unless you’re encrypting from the moment you take the credit card, it’s not end-to-end encryption. But that’s open to interpretation. There are technologies that call themselves end-to-end [but don’t do that]. But it’s a nascent technology that is still being defined, and it hasn’t been implemented fully, except in a few cases.
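McKeay’s “from the moment you take the card” point can be sketched as follows. This is an illustrative toy only – it uses a one-time-pad XOR so the example stays self-contained, whereas real deployments encrypt with vetted ciphers (typically AES) in hardware at the card reader:

```python
import secrets

# Illustrative sketch of the end-to-end idea: the reader encrypts at
# the moment of capture, and only the processor, which holds the key,
# can decrypt. Everything in between sees only ciphertext.
# NOTE: XOR with a random pad is for illustration, not production use.

def encrypt_at_capture(pan: bytes, key: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(pan, key))

decrypt_at_processor = encrypt_at_capture  # XOR is its own inverse

key = secrets.token_bytes(16)                       # shared reader/processor key
ciphertext = encrypt_at_capture(b"4111111111111111", key)
print(decrypt_at_processor(ciphertext, key))        # b'4111111111111111'
```

The point of the sketch is the trust boundary, not the cipher: if any hop between capture and processor sees the plaintext, it is not end-to-end by McKeay’s definition.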

Tokenization is a form of encryption. You have a tokenization server that encrypts the credit card number and gives you back a number you can use in your database. This number would look like a credit card number, but it would have no real relation to the actual credit card number. You have the credit card information available on your server, but in your more public-facing databases, you have a number with no direct relation.
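The tokenization flow McKeay describes can be sketched in a few lines. This is a hypothetical minimal vault, not any vendor’s product: the server hands back a random, card-shaped token for use in public-facing databases and keeps the real number only in its private mapping.

```python
import secrets

class TokenVault:
    """Minimal tokenization sketch: swap a card number for a random
    16-digit token with no mathematical relation to the original."""

    def __init__(self):
        self._vault = {}  # token -> real card number, held only on this server

    def tokenize(self, pan: str) -> str:
        # Random 16 digits so the token still "looks like" a card number.
        token = "".join(secrets.choice("0123456789") for _ in range(16))
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the tokenization server can map a token back.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(len(token), token.isdigit())  # 16 True — card-shaped, but unrelated
```

Because the token is random rather than derived from the card number, a breach of the public-facing database yields nothing usable without also breaching the vault.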

For Part 1 of this series, click here.