Tag Archives: cloud

Palo Alto Wildfire Malware Analysis

Network attacks are increasingly driven by sophisticated malware designed to evade traditional antivirus controls. WildFire extends the next-generation firewall to identify and block targeted and unknown malware by actively analyzing suspicious files in a safe, cloud-based virtual environment, where Palo Alto Networks can directly observe malicious behaviors.

WildFire automatically generates protections for newly discovered malware and delivers them globally, enabling all customers to benefit from the analysis. With version 5 of PAN-OS, WildFire is now also available as a paid subscription service.

Basic WildFire functionality is available to all Palo Alto Networks customers at no charge: you can automatically submit suspicious files to WildFire, and protections are delivered with the regular threat prevention content updates (a threat prevention license is required). The paid WildFire license adds protection within one hour of new malware being detected anywhere in the world; integrated logging and reporting; and access to the WildFire API for programmatic submission of up to 100 samples per day and up to 1,000 report queries by file hash per day.
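
For readers who want to script against the WildFire API, here is a minimal sketch of what sample submission and report retrieval could look like in Python. The endpoint paths, parameter names, and API key shown are assumptions based on the publicly documented WildFire cloud API; verify them against Palo Alto Networks' current documentation before relying on them.

```python
# Minimal sketch of WildFire API usage. Endpoint paths and parameter
# names are assumptions; verify against the current WildFire API docs.
import requests

WILDFIRE = "https://wildfire.paloaltonetworks.com/publicapi"
API_KEY = "your-wildfire-api-key"  # placeholder subscription key

def submit_sample(path):
    """Submit a suspicious file for cloud analysis (100 samples/day cap)."""
    with open(path, "rb") as handle:
        resp = requests.post(
            f"{WILDFIRE}/submit/file",
            data={"apikey": API_KEY},
            files={"file": handle},
        )
    resp.raise_for_status()
    return resp.text  # XML describing the queued sample

def get_report(sha256):
    """Fetch an analysis report by file hash (1,000 queries/day cap)."""
    resp = requests.post(
        f"{WILDFIRE}/get/report",
        data={"apikey": API_KEY, "hash": sha256, "format": "xml"},
    )
    resp.raise_for_status()
    return resp.text

if __name__ == "__main__":
    print(submit_sample("suspicious_sample.exe"))
```

In practice you would feed submit_sample() from a quarantine directory or mail gateway, then poll get_report() with the file's SHA-256 hash once analysis has had time to complete.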

About WildFire:

Turning the Power of the Cloud Against Malware

WildFire is built on a revolutionary architecture that marries the high throughput and full traffic visibility of the next-generation firewall with the scalability and flexibility of the cloud to safely analyze vast quantities of potentially malicious files. By performing analysis in the cloud, WildFire can give malware complete freedom to perform any action without putting your network at risk. Leveraging the power of the cloud also removes the need to install additional single-use hardware in your network, and as malware analysis demands grow, the WildFire cloud can simply add capacity as needed. Furthermore, as malware evolves, sandbox logic can easily be updated in the cloud without requiring any updates to your firewalls.

Automatically Protect Users and Stop Outbreaks

Detecting a threat is always the first step, but the real value lies in protecting users and the network itself. When WildFire identifies new malware, it automatically generates protections, which are delivered to all WildFire subscribers worldwide within one hour. This allows subscribers to share in the intelligence gathered from all WildFire users and to stop malware outbreaks before they spread. WildFire also analyzes command-and-control behaviors, URLs, and DNS patterns to identify and block traffic from any users who may already be infected. Furthermore, as a true inline firewall, the Palo Alto Networks platform always retains the ability to directly drop malicious traffic instead of relying solely on TCP resets, which can easily be filtered or ignored by malicious endpoints.

Correlation and Reporting

WildFire provides a wealth of analysis and forensics for all inspected files. The WildFire portal is available to all WildFire users and provides a window into malware behavior, including any malicious actions, domains the sample visited, files that were created, and registry entries that were affected. Customers with the WildFire subscription additionally gain access to fully integrated WildFire logs and reports via the standard Palo Alto Networks user interface or Panorama. This log integration makes it easy to quickly tie malware to users, applications, URLs, files, or other threats for fast incident response, and even to modify policies to reduce future attack vectors.
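
As an illustration of what that log integration could enable, the hypothetical sketch below pulls recent WildFire logs from a firewall through the PAN-OS XML API's job-based log retrieval. The "wildfire" log type, the entry field names, and the hostname are all assumptions; check the XML API reference for your PAN-OS release before using anything like this.

```python
# Hypothetical sketch: pulling WildFire logs through the PAN-OS XML API.
# The "wildfire" log-type, field names, and job flow are assumptions;
# verify against the XML API reference for your PAN-OS release.
import time
import xml.etree.ElementTree as ET

import requests

FIREWALL = "https://firewall.example.com"  # placeholder management address
API_KEY = "your-pan-os-api-key"            # placeholder

def fetch_wildfire_logs(max_logs=20):
    # Step 1: queue a log-retrieval job on the firewall.
    resp = requests.get(f"{FIREWALL}/api/", params={
        "type": "log", "log-type": "wildfire",
        "nlogs": max_logs, "key": API_KEY,
    }, verify=False)  # lab use only: skips checks on self-signed certs
    job_id = ET.fromstring(resp.text).findtext(".//job")

    # Step 2: poll until the job finishes, then return the log entries.
    while True:
        resp = requests.get(f"{FIREWALL}/api/", params={
            "type": "log", "action": "get",
            "job-id": job_id, "key": API_KEY,
        }, verify=False)
        root = ET.fromstring(resp.text)
        if root.findtext(".//job/status") == "FIN":
            return root.findall(".//entry")
        time.sleep(2)

for entry in fetch_wildfire_logs():
    # Field names such as "filedigest" are assumptions about the schema.
    print(entry.findtext("filedigest"), entry.findtext("misc"))
```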

If you have any questions regarding WildFire, or require assistance in activating or upgrading your PAN-OS appliance, please contact your NCI rep today.

New Year’s Resolution – Protect your public facing web site

You know it's on your to-do list: that static-content web site sitting unprotected out in the cloud somewhere. Isn't it time to protect your public image with more than a simple access control list?

Incapsula provides every website, regardless of its size, with enterprise-grade website security. Incapsula's security expertise is based on years of experience at Imperva (Incapsula's parent company), the leading provider of security solutions for world-class enterprise sites. This core security technology has been adapted to a new cloud platform and optimized to support websites of all sizes. Incapsula enhances security through real-time, centralized collaboration, does not require advanced security expertise in-house, and can be set up within minutes. Let NCI show you how easy it is to protect your public image.

And while we're at it: hopefully you already have vulnerability scans and penetration tests scheduled for your public-facing web site, but who is doing vulnerability management on those cloud-based applications? Let NCI handle the daily, weekly, and monthly analysis of whether your cloud provider is staying on top of the latest security threats by patching those web servers. NCI will perform daily vulnerability scans and keep track of whether patches and updates are completed on a timely basis.
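
To make that concrete, here is a minimal sketch of one small piece of such monitoring: recording the version banner a hosted web server advertises each day and flagging when it changes, a rough signal that a patch has (or has not) landed. The site name is a placeholder, and a real vulnerability management service goes far deeper than banner watching.

```python
# Minimal daily check: record a web server's advertised Server banner
# and flag changes over time as a rough signal that patching occurred.
import datetime
import json

import requests

SITE = "https://www.example.com"  # placeholder: your cloud-hosted site
HISTORY_FILE = "banner_history.json"

def todays_banner():
    resp = requests.head(SITE, timeout=10, allow_redirects=True)
    return resp.headers.get("Server", "unknown")

def record_and_compare():
    try:
        with open(HISTORY_FILE) as f:
            history = json.load(f)
    except FileNotFoundError:
        history = []

    banner = todays_banner()
    if history and history[-1]["banner"] != banner:
        print(f"Banner changed since {history[-1]['date']}: "
              f"{history[-1]['banner']} -> {banner}")
    history.append({"date": datetime.date.today().isoformat(),
                    "banner": banner})
    with open(HISTORY_FILE, "w") as f:
        json.dump(history, f, indent=2)

if __name__ == "__main__":
    record_and_compare()  # run daily from cron or a scheduler
```

Run daily from cron, the history file becomes a simple audit trail of when your provider's server software actually changed.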

For more information please contact your NCI rep.

(ISC)² Security Congress 2011

The congress was held September 19-22 at the Orange County Convention Center in Orlando. This was (ISC)²'s first annual Security Congress, hopefully not the last! It was co-located with ASIS International's 57th annual seminar and exhibits, a move that recognizes the convergence of physical and information security.

After attending this congress, I realized how big the physical security world is. To put it in numbers: there were 280 attendees from (ISC)² versus 20,000 from ASIS, and 700 exhibitors for that crowd to visit.

There were three hour-long educational sessions per day, with about 25 topics to choose from in each session.

What were they talking about?

The three topics that were raised, discussed, and debated in almost every session (among the ten or so (ISC)² sessions that I attended) were:

  1. Cloud Security
  2. Mobile Device Security
  3. Social Media

The trend and focus for the information security industry over the next couple of years will be on addressing these three topics with policies, regulations, products, and services. Below I'll expand a little on why each area is attractive and what the security risks are.

1. Cloud Security

Why cloud? – Flexibility and scalability, cost savings, availability and disaster recovery

Threats? – Data loss/leakage, abuse of cloud, account/service hijacking, shared technology

What to do? – Like any other technology, the cloud has risks associated with its benefits. All the classic principles of information security should be applied to it, with security in mind from the design/architecture phase. Have an incident response plan. Consider private/community/public/hybrid cloud options.

2. Mobile Device Security

Why mobile devices? – Business rewards (response time, availability, flexibility), employee experience (ubiquitous mobile devices, employee owned), executive adoption

Threats? – Data loss/leakage, employee privacy concerns, compromise of corporate network from mobile device

What to do? – Look into device ownership (= liability) issues, have a corporate and a personal mobile device use policy, provide training to go along with that policy, harden mobile devices 

3. Social Media

Why social media? – It’s ubiquitous and unavoidable, it is the basis for Web 2.0, it has great potential to be used as a marketing and customer communication tool for the enterprise

Threats? – Faster spread of malware through the 'trust' factor, phishing attacks, worms, shortened URLs (one way to unmask them is sketched below), Evil Twin attacks, session hijacking, and identity theft, all leading to information leakage and corporate liability issues

What to do? – Social media use policy (AUP), education and awareness, use of content filtering and DLP products to control traffic to and from social media sites
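
Shortened URLs are a good example of a social media risk that a little tooling can blunt. The sketch below walks a short link's redirect chain with HEAD requests, never fetching the final page body, so the real destination can be checked against your filtering policy before anyone clicks. The short link shown is a placeholder.

```python
# Sketch: reveal where a shortened URL really leads by walking its
# redirect chain with HEAD requests, without fetching the final page.
from urllib.parse import urljoin

import requests

def expand_short_url(url, max_hops=10):
    """Return the list of hops from the short URL to its final target."""
    hops = [url]
    for _ in range(max_hops):
        resp = requests.head(url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        if not location:
            break  # no further redirect: this is the real destination
        url = urljoin(url, location)  # handle relative Location headers
        hops.append(url)
    return hops

# Placeholder short link; substitute a real one to test.
for hop in expand_short_url("https://bit.ly/example"):
    print(hop)
```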

Some interesting notes:

  • Security is not about security, it’s about risk management
  • What is the perimeter of your network? It’s the end user!
  • A smartphone on your network should not be treated ANY differently from any other computer on your network
  • 1 out of 5 tweets names a product brand
  • Facebook mobile users are 50% more active than other users of the site
  • Sources of social media risk include: clients, employees, vendors, competitors, activists, and cyber criminals

Some interesting speakers:

  • Jeb Bush, Former Governor of Florida
  • Vicente Fox, former president of Mexico
  • Burt Rutan, designer of SpaceShipOne
  • Janet Napolitano, US DHS Secretary
  • Winn Schwartau, celebrity and power thinker on security/privacy/infowar/cyber-terrorism
  • Charlie Blanchard, Manager of Security & Privacy Services, Deloitte & Touche LLP
  • Simon Hunt, VP and CTO, Endpoint Security, McAfee
  • Shayne Bates, Director Security Cloud Strategy, Microsoft Global Security
  • James Hewitt, Director of Security Governance, CGI Federal

Vahid A.

The Cloud – Like a first date…

I've been asked numerous times over the past few months whether clients should be using the cloud. The original "cloud" providers were web hosting organizations. These providers offered redundant Internet paths, redundant hardware, networking infrastructure, power, cooling, and all the bells and whistles now touted by some of the larger cloud vendors. They simply "rented" space on their physical hardware for a low monthly price. Many customers chose to host their web content with external providers, assuming that a dedicated provider could patch and maintain a web server much more efficiently than their own staff. While that was true, much of the web content hosted ten years ago was static, contained little or no sensitive data, and was accessed by relatively few individuals.

Fast forward to 2011: the explosion of on-demand services and hardware, virtual desktops, hosted Microsoft SharePoint and Exchange, and hosted apps like Salesforce gives organizations a choice between in-house and in-the-cloud. These dynamic applications rely on a tremendous amount of stored information, and hence the security concern. We all understand that security is a trade-off between risk and cost. Each additional dollar spent on security should buy additional protection, but at a certain point the risk/reward just doesn't make sense. We should take the same approach to the cloud. Many clients today are conducting SoS (Statements of Sensitivity) on their applications. Depending on the level of risk an organization is willing to undertake, specific applications may be perfect candidates for the cloud. For example, an e-commerce site with a limited number of products and a hosted payment page may be a perfect place to try out the cloud. Completing a statement of sensitivity may make it clear that there isn't a tremendous amount of risk or exposed data. Why not use this as your cloud trial?

In 2010, Tiffany Bova from Gartner hosted a session and described the cloud as simply a different method of service delivery. Perhaps we should think of the cloud as we did of virtualization six or seven years ago: start with some lightweight, low-resource applications that aren't mission critical to get comfortable with the cloud infrastructure. Who knows? You just might like it. Just don't wait three days to call it back.

Eugene N.

This post deals primarily with the concept of ‘public cloud’. If you have questions or comments regarding this subject, or would like to talk to someone regarding the distinction between public, private, and hybrid cloud, please leave a comment or contact us via our contact page.

Data Delivery Within a Cloud Infrastructure

Today there is a growing trend worldwide among private and public organizations toward an open, non-private-data ideology, driven by a growing need to give citizens and employees access to non-private data. This is a result of the Information Age and its predominance in today's society. The question this raises is what impact routine publishing of non-private data would have on an organization's infrastructure. Cloud computing can be a clever and cost-effective means to this end.

Organizations have begun to create what is referred to as an "open data catalogue" hosted in the cloud, as this is the most time-efficient way to make non-private data easily accessible to its intended audience. It also allows for minimal infrastructure changes, flexibility, and convenience at low cost. It has proven to be a great way for organizations to be transparent with their employees and to drive interest from the public. It has also been fundamental in fostering a feeling of empowerment among employees, who feel more involved in shaping the organization they are a part of.

Moving forward, organizations are looking to establish standard processes for adding and maintaining data sets in the catalogue in order to create a true open data infrastructure. It is imperative that employees and citizens have a good understanding of the benefits and value of making this type of information available. Transparency is key to building and fostering trust. It will help generate employee interest in being more involved in, and excited about, where they work, resulting in strengthened unity. Unity within any organization is of paramount importance to its long-run health and viability.

Chris Medina

Do you have an open data catalogue project on your radar? What are the challenges and obstacles you foresee in the future?

Is the Web Evolving Faster Than Your Security Practices?

Recently, I took a few minutes using the Wayback Machine to look at how our website has evolved over the past decade. It was a very interesting exercise because it revealed a few things about our corporate culture and emphasized some dramatic shifts in the way we develop and use websites today.

A few things to consider when looking at the pictures (above):