
PCI Compliance & Your Vendors: Can you rely on them?

I don’t want to dump on the vendors; after all, NCI is a vendor of a lot of gear as well as a QSA. But I’ve bumped into some vendors lately that are making some pretty sloppy claims about PCI compliance. And let’s not even talk about some of the claims from service providers, at least not yet; maybe we can cover them in a later post (PCI & Cloud, anyone?).

What’s that you say? “Vendors making outrageous claims? I’m shocked! The room is spinning; I have to sit down to clear my head.” Let’s try not to be flippant; this is rather serious, and here is why.

Your QSA comes in, whether for a scope assessment, gap analysis, your annual onsite assessment, or even to review your SAQ, and they find a problem. (We won’t be specific about what the problem is because it’s not really relevant to the discussion.)

The vendor said that their product would make you PCI compliant, or at the very least help you meet some of the requirements, so how could there be a problem? Let’s take a look at two broad ways that we see vendors making promises they really shouldn’t.

  1. A vendor of awesome security gear (i.e. they sold you a box that you put in a rack in your computer room) tells you that all you have to do is use their stuff and your PCI compliance woes (or is it whoas?) will be all gone.
  2. A vendor of your “important” software (could be a payment
    application, or your ERP perhaps) tells you that if you install it in
    such-and-such a way then, presto, PCI compliance is yours. 

Not so fast…

Are these QSAs? Are they even QIRs (Qualified Integrators and Resellers)? In my experience this is often not the case. So why are they offering authoritative opinions on PCI DSS compliance matters? That last question was rhetorical. They are doing it because that is what vendors do; they try to help their customers. Sometimes they don’t end up helping, though. They may even be making things worse. Simply put, they can’t offer a reasoned and considered opinion on PCI matters. That’s for your QSA to do.

Let’s give two examples:

  1. A vendor of log aggregation and analysis gear will put a box on your network that will collect all the logs (that you send it), grind them up, and produce a canned report. But… Are you logging the right events as per 10.2? Are you sending the right details as per 10.3? Are you restricting access to logs as per 10.5? Are you reviewing the right things as per 10.6? Are you keeping enough log history on hand as per 10.7? Selecting the “PCI Report” is not a guarantee that you are, not even close. (A rough sketch of checking even one of these, 10.7, follows this list.)
  2. The vendor of your payment application is performing an upgrade, and as part of that upgrade they’ve made changes to the underlying application architecture. They claim it is now possible to remove your workstations from your PCI scope. Wow! That would be nice. But they don’t tell you that you also have to change your business process so that you can’t take credit card details over the phone. Oops!
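
To make the logging example concrete, here is a minimal sketch of the kind of sanity check a merchant might script for just one of those bullets: requirement 10.7’s retention expectations (at least a year of audit history, with the last three months immediately available). The log directory, the flat-file layout, and the reliance on file modification times are assumptions for illustration only; it says nothing about 10.2/10.3 event coverage, 10.5 access restrictions, or 10.6 review, and it is no substitute for your QSA’s read.

```python
"""Rough retention check against PCI DSS requirement 10.7 (illustrative only).

Assumptions: archived logs live as flat files under /var/log/archive and the
file modification time roughly matches the date of the log data it holds.
"""
import time
from pathlib import Path

LOG_DIR = Path("/var/log/archive")   # hypothetical archive location
DAY = 24 * 3600
ONE_YEAR = 365 * DAY                 # 10.7: retain at least one year of history
THREE_MONTHS = 90 * DAY              # 10.7: three months immediately available


def retention_report(log_dir: Path) -> None:
    if not log_dir.is_dir():
        print(f"{log_dir} does not exist; nothing to check.")
        return

    now = time.time()
    ages = sorted(now - p.stat().st_mtime for p in log_dir.iterdir() if p.is_file())
    if not ages:
        print(f"No logs found in {log_dir}; that alone is a finding.")
        return

    oldest = ages[-1]                                   # age of the oldest file
    recent = sum(1 for a in ages if a <= THREE_MONTHS)  # files from the last 90 days

    if oldest < ONE_YEAR:
        print(f"Oldest log is only ~{oldest / DAY:.0f} days old; "
              "less than the one year of history 10.7 calls for.")
    if recent == 0:
        print("Nothing from the last three months is on hand; "
              "10.7 expects that window to be immediately available.")
    print(f"{len(ages)} files spanning roughly {oldest / DAY:.0f} days, "
          f"{recent} of them from the last 90 days.")


if __name__ == "__main__":
    retention_report(LOG_DIR)
```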

How could your QSA have helped?

  1. Your QSA is aware not just of what the PCI DSS has to say in requirement 10, but also what the intent of that requirement is, what evidence must be produced to validate it, and the opinions of other QSAs in the industry about it. If the way you or your vendor have configured the logging gear doesn’t pass muster with the QSA, then you will have gaps in your environment and may not pass your assessment. If you had involved your QSA, either when selecting your vendor/product or afterwards when specifying how it was to be deployed and configured, a lot of headaches could have been avoided.
  2. Your QSA understands the ins and outs of PCI scope determination and the interaction with other PCI standards (such as the PA-DSS), and can consult with other QSAs about what they’ve seen in the field for frequently used software. If you had involved your QSA earlier, they could have reviewed the modified architecture, read the Implementation Guide (required for PA-DSS applications), and, with knowledge of your credit card handling processes, validated whether the vendor’s claim was accurate. If it was not, they could have provided guidance on how to use the new version’s architecture to achieve the scope reduction.

Achieving and maintaining PCI compliance is tough enough without having vendors, QSAs, and your staff working at cross purposes. Everyone has a part to play, and everyone should be involved in the right capacity at the outset. Vendors should bring their expertise in their product to bear, QSAs their expertise in the PCI DSS, and customers their understanding of their business and technical needs. When we’re all playing on the same team, we can get to PCI compliance much more easily and quickly than when we’re at odds with one another.

At least that’s the philosophy at NCI. Give us a call to learn more.

Jason Murray, MEng, CISSP, QSA, CCSK

PCI in the Midst of a Sell-Off

We come across so many interesting business scenarios in our line of work. For clients having to deal with the Payment Card Industry Data Security Standard (PCI DSS), we see many faced with different challenges and scenarios. Whether they’re just getting their feet wet with understanding the requirements, remediating some of the gaps identified, or diligently embracing PCI as “a way of life,” NCI is here to help organizations better understand the standard and become PCI DSS-compliant.

We recently came across an interesting scenario where a client is currently PCI DSS-compliant, but is dealing with a potential sell-off of a piece of their business. 

What are the implications and responsibilities for the seller and the buyer?  Below are some points to consider. 

  • There is a subtle difference between compliance and validation. Compliance is 24x7x365; validation is a periodic, point-in-time exercise.
  • The new entity, whether still part of the SELLER or operating as its own legal entity, should be maintaining its compliance at all times. When it starts operating on its own, compliance should already be in place, even though it will not yet have a valid ROC in its own name.
  • Moving forward, the sold-off entity would not be in the SELLER’s scope, so it would not impact the SELLER’s current PCI efforts. If the business processes and technologies that go with the sale are already fully isolated, this becomes much easier.
  • To avoid complications and intricacies, it’s best to make the separation a clean one, both legally and technically. Try not to get into “interesting” relationships over who owns equipment or who provides staffing. Just sever the ties and see them on their way.
  • PCI-wise, this is a new legal/business entity, so it has its own PCI obligations. The new entity (or BUYER) will have to renegotiate its contracts with Visa, MasterCard, etc. With that will come direction from them as to whether the BUYER can keep or rely on the existing ROC or will have to produce a new one. Most likely it will be a new one, which means a new submission date.
  • In the meantime, the SELLER will need to continue meeting its existing obligation to submit by its renewal date. It will need to make sure that a) the scope is clearly described, and b) if the sale has not yet closed, the division being sold is listed in the exclusions section of the ROC.

Since 2007, NCI has helped clients reduce the cost and time required to achieve PCI compliance within their organization, as an Approved Scanning Vendor (ASV), Qualified Security Assessor (QSA), Payment Application Qualified Security Assessor (PA-QSA), and Point-to-Point Encryption (P2PE) assessor.

Posted by Anne Kwok, Sr Account Executive, BSc, MBA and Jason Murray, Sr Security Consultant, MEng, CISSP, QSA, CCSK

Will tokenization (t10n) make your PCI pain go away?

I just finished reading the tokenization guidelines from the PCI Council. It is a very good document, much more informative than the one on virtualization. However, it does not provide the simple connect-the-dots type of advice most would want, because t10n is complicated. It is complicated in its own right, let alone when it is being deployed as part of a PCI DSS compliance program.

Here are some of the issues that are raised:

  • Solution architecture,
  • Deployment,
  • Operational challenges,
  • Software development, and
  • Contractual terms and conditions.

So will tokenization make your PCI compliance pain go away? Will it even ease your pain? Just a little bit?

Let me cut to the chase: maybe, but don’t count on it. There are no silver bullets in the PCI compliance arena. At the end of the day, t10n is a *scope reduction* approach. As such, it can help reduce and minimize your PCI compliance effort, but it does not eliminate your need to comply. Also, because it is part of what defines your PCI DSS scope, it will need to be reviewed in detail each and every year when you undergo your PCI validation, whether by Self-Assessment Questionnaire or Report on Compliance.
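
To show what “scope reduction” means in practice, here is a deliberately tiny sketch of the core idea, not a description of any vendor’s product: downstream systems keep only a random surrogate value, while the vault that can map tokens back to PANs, and anything allowed to ask it to, remains fully in scope. The class name and the in-memory dict are illustrative assumptions only.

```python
"""Toy illustration of tokenization as scope *reduction*, not elimination.

This reflects no particular vendor's product; it only shows the core idea.
Systems that hold just the random token never see the PAN, but the vault
itself, and anything allowed to detokenize, stays squarely in PCI scope.
"""
import secrets


class TokenVault:
    """In-scope component holding the token-to-PAN mapping.

    A real solution would encrypt stored PANs, tightly control who may call
    detokenize, log every request, and so on; a plain dict stands in for all
    of that here.
    """

    def __init__(self) -> None:
        self._pan_by_token: dict[str, str] = {}

    def tokenize(self, pan: str) -> str:
        # The token is random, with no mathematical relationship to the PAN,
        # so a stolen token on its own is worthless to an attacker.
        token = secrets.token_urlsafe(16)
        self._pan_by_token[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Anything permitted to call this is back inside PCI scope.
        return self._pan_by_token[token]


if __name__ == "__main__":
    vault = TokenVault()                           # in scope
    token = vault.tokenize("4111111111111111")     # well-known test PAN
    order = {"order_id": 1001, "card_ref": token}  # downstream system sees only the token
    print(order)
```

The point of the example is the asymmetry: stealing the order records yields nothing usable, but the vault and everything that can call detokenize carry all the PCI weight.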

I highly recommend that merchants thinking about deploying t10n give it a read. I also highly recommend that any service providers looking to offer a t10n solution read it as well. It’s got good advice for both.

Ruminations on the recent PCI DSS Virtualization Guidelines

The long-awaited guidance on virtualization technologies was recently released by the PCI SSC. Having read it over, I did not find any big surprises, but a few things did stand out for me.

This is guidance only and does not supersede the PCI DSS. It doesn’t really add anything new that wasn’t already in the PCI DSS: basically, just because you use virtualization doesn’t mean the PCI DSS stops applying. Furthermore, when adding virtualization you are adding another layer of complexity: technical, administrative, and architectural. However, I think it tips the Council’s hand as to what we might see in an updated DSS in October 2013.

Under virtualization they include not just VMs, but virtual storage, virtual networking (think virtual switch, not VLAN), virtual desktops, and of course the hypervisor. They also throw in the cloud for good measure. I wish they hadn’t done that, because that’s a whole other kettle of fish.

Here are my top three takeaways from the guidance.

  1. Mixed-mode: They use the term mixed-mode for mixing VMs of different trust levels, or VMs in-scope and out-of-scope for PCI, on the same hypervisor/hardware. They strongly recommend that all VMs in such a scenario be in-scope. The reasoning is that the lower-security VMs represent a potential avenue of attack: a guest VM could be popped and then chained with a vulnerability to escape to the hypervisor (or host OS), at which point it is essentially game over. I see their point, but other controls in the PCI DSS are supposed to be in place to mitigate this.

    They also point out that it may not be possible to achieve appropriate levels of isolation between in-scope and out-of-scope guests with a particular virtualization technology. True, but in that case all the guest VMs would be in-scope due to inadequate segmentation.

    All in all, I think this is a very strong statement, and I suspect that merchants and service providers will protest strongly. As a QSA, my view is that if you can demonstrate your virtualization solution provides adequate isolation, you’ve configured it properly, and you have the processes in place to keep that isolation intact, you should be OK for now. But you might want to start planning for this to be added in a PCI DSS version 2.x in October 2013.

  2. VM Images: This one made me do a head slap (figuratively). VMs, of course, aren’t really hardware; they are just a bunch of bits in a VM image. That image contains the memory contents, disk contents, swap files, etc. So what about when a VM is dormant (off or suspended)? For PCI in-scope VMs it likely contains CHD, possibly in an unencrypted state depending on when it was suspended. Worse, it could contain sensitive authentication data in memory (verboten to store). What about moving images around? Some solutions do this to allow for increased availability. Maybe you are backing them up. Or moving them up to AWS to do some testing. We now have a new class of files that can contain both CHD and verboten sensitive authentication data. Proper care and handling will have to be taken. What policies, procedures, and technical controls are in place? (A rough sketch of sweeping such files for stray PANs follows this list.)
  3. Complexity: There’s a saying in computer science that every problem can be solved by adding another layer of abstraction, and that, in a nutshell, is what virtualization is. OS process isolation wasn’t sufficient, so the virtual machine monitor was invented in the 1960s, which essentially gave birth to today’s hypervisors. That additional level of abstraction has many implications. First off, we have just increased the attack surface. We also now need processes to control the lifecycle of VMs so we can keep a handle on them. And we have created a new class of administrators: the virtual machine admin. They have administrator-level access to the hypervisor, of course, but what about the underlying VMs? What about the virtual appliances that perform traditional network or security functions: the virtual switches, virtual firewalls, virtual AV? This has implications for separation of duties.
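
As a thought experiment on the VM image point above, here is a minimal sketch of sweeping dormant image files for candidate PANs using a Luhn check. The /vmstore path, the .vmdk glob, and the 16-digit-only pattern are illustrative assumptions; real data-discovery tooling deals with compressed and sparse disk formats, other card number lengths, and false-positive tuning, and finding nothing does not prove the images are clean.

```python
"""Sketch of sweeping dormant VM image files for unencrypted PANs (illustrative)."""
import re
from pathlib import Path

PAN_CANDIDATE = re.compile(rb"(?<!\d)\d{16}(?!\d)")  # naive 16-digit candidate


def luhn_ok(digits: str) -> bool:
    """Standard Luhn check to weed out random 16-digit strings."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0


def scan_image(path: Path, chunk_size: int = 1 << 20) -> int:
    """Count Luhn-valid 16-digit strings in a file, read in chunks.

    Note: a candidate split across a chunk boundary would be missed;
    acceptable for a sketch, not for a real discovery tool.
    """
    hits = 0
    with path.open("rb") as fh:
        while True:
            block = fh.read(chunk_size)
            if not block:
                break
            hits += sum(1 for m in PAN_CANDIDATE.finditer(block)
                        if luhn_ok(m.group().decode()))
    return hits


if __name__ == "__main__":
    # Hypothetical location of suspended-VM disk images, swap and memory files.
    for image in Path("/vmstore").glob("*.vmdk"):
        found = scan_image(image)
        if found:
            print(f"{image}: {found} possible PAN(s); treat this file as CHD.")
```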

There is nothing here to get too alarmed about; most of this is already in the PCI DSS on a careful reading. It does highlight that, in an effort to cut costs and leverage infrastructure, we’ve introduced a host of other issues that we’ll have to deal with. Perhaps the cost savings and leverage weren’t quite as large as originally thought, especially when you throw PCI DSS compliance into the mix.

Jason M.

If you have any questions regarding virtualization and how it will affect your PCI DSS compliance efforts, please leave a comment or feel free to contact us directly. Our team of experienced QSAs would be happy to have a discussion with you.

IT Professionals Remain Skeptical – Really?

I recently read an article in eWeek by Fahmida Y. Rashid, “PCI-DSS Compliance Helps Prevent Data Breaches Despite IT Doubts: Survey”.

So who are the people who remain skeptical about the security effectiveness of regulatory compliance? Probably the same people who believe the accounting, engineering, and medical professions should have no regulatory compliance either.

Let’s face it: information technology is a relatively new field, and it has begun to standardize itself through regulatory compliance in some key areas like privacy and the use of payment cards. Without that, how can an organization assure the public, its customers, and its business partners that it is not putting their information at risk?

Of course, there are those running their organizations’ IT security programs very well, most likely using a standard similar to ISO 27001 to baseline themselves. That is great, and they may very well be secure and doing a great job. Compliance helps to achieve this in the areas where it is needed.

I have not met one senior IT person (CIO, Director/Manager of IT) who doesn’t believe in securing their organization’s assets. Many look at regulatory compliance as a positive step forward. Many times, budgets will not materialize without something like regulatory compliance to drive IT security projects forward.

Standards like PCI and NERC are put in place to help ensure that we are being protected. Like any compliance program, they have their positives and negatives; however, I believe the positives outweigh the negatives.

Each year I see more and more demand from clients who want to reassure their senior management that they are following industry best practices, measured against some standard. IT security is a complicated and evolving field. With so many avenues for an intrusion to take place, it is my belief that standards and compliance are a healthy step forward for the IT security industry.

Danny T.