Month: December 2019

Data classification strategies for 2020

Posted on December 31, 2019 (updated January 3, 2020) by Aman

Data is everywhere within the organization, but few people know the significance of their data to the business. In this post, I discuss strategies for performing data classification. What makes data classification complex is the sheer number of items that need classification. Overall, the best approach is to delegate the classification process to the document owners, and that strategy needs to be identified early. The information security team's role here is to identify each data repository's classification capability; not all repositories have an easy and intuitive classification method. Let's take a look at some possible approaches.

Identify current state

Data classification is a process by which information security and legal teams identify critical and sensitive organizational data. To succeed in this endeavor, you'll need to bite off small chunks and work with patience and determination. If you're late to the game, start by identifying your repositories' built-in capability to classify data. Investigate whether you can simply turn on the feature within the application. It might be offered in an enhanced version of the product that requires additional licensing or cost; however, data classification is important enough that the additional cost will be well worth it.

A data classification guide should be provided to all document owners, and it should be simple to follow and understand. A typical scheme has three classification levels, such as sensitive, secret, and unclassified, though additional classification types might be useful depending on your specific business needs. To begin with, classify your data directly in the document, using a marker that is searchable and readable by search and regular-expression engines. This allows a DLP solution to search the document and identify its classification level.
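To make the regular-expression point concrete, here is a minimal sketch in Python of a DLP-style sweep that looks for an in-document classification marker. The marker format, the levels, and the repository path are assumptions for illustration, not a prescribed standard.

    import re
    from pathlib import Path

    # Hypothetical in-document marker such as "Classification: Secret" in a header or footer.
    MARKER = re.compile(r"Classification:\s*(Sensitive|Secret|Unclassified)", re.IGNORECASE)

    def classify(path: Path) -> str:
        """Return the first classification marker found, or flag the document as unlabeled."""
        text = path.read_text(errors="ignore")
        match = MARKER.search(text)
        return match.group(1).title() if match else "UNLABELED"

    # Assumed export location of the document repository.
    for doc in Path("/srv/documents").rglob("*.txt"):
        print(f"{doc}: {classify(doc)}")

A real DLP product will do this across many file formats, but the principle is the same: the label lives inside the document and is trivially searchable.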

Fill in the gaps

Enterprise document storage solutions should have robust scripting and plugin capabilities. In-house development can help create data-classification add-ons and plugins for any document management solution that does not provide such capability. Some collaboration suites include a marketplace that readily offers such extras; however, investigate each add-on for security issues before implementing it.
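As a rough sketch of what such an in-house add-on could look like, assuming the platform can fire a webhook when a document is saved: the endpoint, payload, and field names below are hypothetical, and a real plugin would write the label back through the platform's own metadata API.

    from flask import Flask, request, jsonify

    app = Flask(__name__)
    VALID_LEVELS = {"sensitive", "secret", "unclassified"}

    @app.route("/classification-hook", methods=["POST"])
    def classification_hook():
        # Hypothetical payload sent by the document platform on save:
        # {"document_id": "123", "owner": "alice", "classification": "secret"}
        event = request.get_json(force=True)
        level = (event.get("classification") or "").lower()
        if level not in VALID_LEVELS:
            # Default to a conservative label and flag the document for owner review.
            level = "sensitive"
        # A real add-on would call the platform's metadata API here to apply the label.
        return jsonify({"document_id": event.get("document_id"), "applied": level})

    if __name__ == "__main__":
        app.run(port=8080)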

Crowdsource your classification efforts

Of course, this isn't always possible; in that case, you'll need to set up a meeting with the document owners and walk them through the importance of the data classification program. Users are more willing to participate in mundane yet important tasks if the process is engaging and rewarding. Identify ways in which you could gamify the process and provide rewards: publish a leaderboard and acknowledge those at the top during company meetings. Data classification is a critical process, so participation should be part of everyone's job responsibilities. Executive leadership needs to buy in and ensure that data classification is an essential part of everyone's routine rather than a burden on employee workload.
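If the repository's audit log can tell you who classified what, the leaderboard itself is just a few lines of counting; the input format below is an assumption.

    from collections import Counter

    # Assumed input: (owner, document) pairs exported from the repository's audit log.
    classified = [
        ("alice", "q4-forecast.xlsx"),
        ("bob", "vendor-contract.docx"),
        ("alice", "hr-policy.pdf"),
    ]

    leaderboard = Counter(owner for owner, _ in classified)
    for rank, (owner, count) in enumerate(leaderboard.most_common(), start=1):
        print(f"{rank}. {owner}: {count} documents classified")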

Improve your vuln management program in 2020

Posted on December 31, 2019 (updated January 3, 2020) by Aman

When it comes to vulnerability management, you can get the first 20% up and running without many issues: deploying a management panel, adding sensors, running discovery scans. Most of these are fairly intuitive tasks, and a few platforms have wizard-like interfaces that will get you to the 20% mark within a few hours. However, to get the most out of your VMS (vulnerability management system), you'll need to work within and outside the security team, with a mixture of stakeholders, to ensure your efforts are worthwhile and that the vulnerabilities that are detected are handled quickly and efficiently.

Whatever your ticketing and task management platform, it's important that vulnerability management solutions do not operate in a vacuum. It isn't necessarily true, or scalable, that your VMS operator should be the person who also fixes or patches the vulnerable system, or the one responsible for identifying who the vulnerable system belongs to. In large enterprises, or even small ones, identifying system owners can be challenging. Who owns the operating system? Are they the ones responsible for patching firmware? Who owns the web server? Are they the ones that own the application?

Each system can have varying levels of complexity as well as a myriad of owners responsible for different aspects of the affected system. Documenting who owns the system is important; however, scans should be set up so that scanned assets also include asset-identifying metadata that references ownership of the system or platform. It's a good practice to include this information in a summary or description field of the scan so it's handy for automation.
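One lightweight way to do this is to serialize the ownership metadata as JSON in the scan's description field, so downstream automation can parse it without a second lookup. The field names below are assumptions; most scanners expose some free-text description or tag field that can serve the same purpose.

    import json

    # Hypothetical asset record pulled from a CMDB or asset inventory.
    asset_metadata = {
        "asset_group": "payments-web",
        "os_owner": "platform-team@example.com",
        "app_owner": "payments-team@example.com",
        "business_unit": "Finance",
    }

    # Embed the ownership data in the scan description so ticketing automation
    # can recover it straight from the scan results.
    scan_definition = {
        "name": "weekly-authenticated-scan-payments",
        "targets": ["10.20.30.0/24"],
        "description": json.dumps(asset_metadata),
    }

    print(scan_definition["description"])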

Once ample metadata is provided, automation can be used to create and assign high-severity vulnerabilities to the appropriate owner. This ensures that they are handled in a timely manner and that the vulnerability is not stuck within the confines of the VMS itself. Whether the ticketing system is Zendesk, ServiceNow, or Jira, each platform can be configured via scripting and API services to create and assign tickets with the appropriate severity rating.
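With Jira, for example, the hand-off can be a single call to the standard create-issue endpoint. This is a minimal sketch; the instance URL, credentials, project key, and field choices are placeholders.

    import json
    import requests

    JIRA_URL = "https://example.atlassian.net"   # placeholder instance
    AUTH = ("vms-bot@example.com", "api-token")  # placeholder credentials

    def open_vuln_ticket(summary: str, description: str, owner_metadata: dict) -> str:
        """Create a high-priority Jira issue for a detected vulnerability."""
        payload = {
            "fields": {
                "project": {"key": "SEC"},  # assumed project key
                "summary": summary,
                "description": f"{description}\n\nOwnership: {json.dumps(owner_metadata)}",
                "issuetype": {"name": "Bug"},
                "priority": {"name": "Highest"},
            }
        }
        resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH, timeout=30)
        resp.raise_for_status()
        return resp.json()["key"]  # e.g. "SEC-1234"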

Of course, to get the highest-fidelity scans possible, the scanning engine needs accurate information about the system being scanned. The VMS should be configured with certificate-based or other forms of authentication so it can log in to the OS or application and accurately assess it for vulnerabilities. Unauthenticated scans are useful for asset management but not for vulnerability management. Authenticated scans increase the signal-to-noise ratio, help identify critical vulnerabilities, and let you assign the appropriate SLA.

Ensure that your VMS has only the minimum credential level required to perform the appropriate scanning. This can be accomplished with a sudoers file or by setting appropriate privilege levels for the commands required. LDAP and HashiCorp Vault are good examples of centralized authentication and secrets-management systems. It is a common and recommended practice to rotate credentials on a set interval and to log all activities performed by the system; this allows quick detection and prevention of unintended actions.
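As a sketch of how the scanner might pull a rotated credential at runtime rather than storing it in the scan configuration, here is a read from a HashiCorp Vault KV v2 secrets engine; the mount, path, and field names are assumptions.

    import os
    import requests

    VAULT_ADDR = os.environ.get("VAULT_ADDR", "https://vault.example.com:8200")
    VAULT_TOKEN = os.environ["VAULT_TOKEN"]  # token scoped to a read-only policy for the scan account

    def fetch_scan_credentials(secret_path: str = "secret/data/vms/scan-account") -> dict:
        """Read the scan account from a KV v2 secrets engine (assumed mount and path)."""
        resp = requests.get(
            f"{VAULT_ADDR}/v1/{secret_path}",
            headers={"X-Vault-Token": VAULT_TOKEN},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["data"]["data"]  # e.g. {"username": "...", "password": "..."}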

Last but not least, establish realistic SLAs and MTTRs (mean time to remediation) and track these metrics, giving power back to the system owners to incorporate their own patching strategies for their respective systems. The patching strategy should automatically alert the scan engineer that a remediation task has been performed, and the ticketing platform should kick off another scan task to validate the fix by calling the VMS via its API.
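To close the loop, the ticketing platform can fire a webhook when the remediation ticket is resolved, and a small handler asks the scanner to validate the fix. The VMS endpoint and payload here are hypothetical; substitute your scanner's actual launch or re-scan API.

    import requests
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    VMS_API = "https://vms.example.com/api"   # hypothetical scanner API
    VMS_TOKEN = "replace-with-service-token"  # placeholder

    @app.route("/ticket-resolved", methods=["POST"])
    def ticket_resolved():
        # Assumed webhook payload sent by the ticketing platform on ticket resolution.
        event = request.get_json(force=True)
        asset = event.get("asset_id")
        # Ask the scanner to re-scan the asset; endpoint and body are illustrative only.
        resp = requests.post(
            f"{VMS_API}/scans",
            headers={"Authorization": f"Bearer {VMS_TOKEN}"},
            json={"targets": [asset], "profile": "validation"},
            timeout=30,
        )
        return jsonify({"rescan_started": resp.ok, "asset": asset})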

New Team Members

Posted on December 30, 2019 by Aman

Something great
