
Using the past to predict and prevent future cyber attacks

Joe Gottlieb is CEO of Sensage, a KEYW company

Government agencies are facing an ever-increasing set of threats that range from typical data loss to highly sophisticated and malicious cyber assaults. To manage these risks effectively, government organizations must proactively monitor all security events across their IT landscape. Taking a page out of the intelligence community’s playbook, many government security teams are starting to apply a two-pronged strategy to their security practices -- understanding the here-and-now, and using history to inform the here-and-now.

The former, “real-time” monitoring, is where most vendors have focused their technologies and where most security practitioners spend their time. You get dozens of alerts a day based on events happening across networks, endpoints and systems. Making sense of the siloed security management solutions you are running is a manual process, and in many cases you either don’t have time to chase each alert down, or you do chase them down and find that most are not threats at all.

The latter, using historical analysis to predict and inform security management practices, requires a more analytical approach and a commitment to continuous process improvement. While logs have typically been used for investigating breaches after the fact, those same logs can be used to build more effective real-time alerts, policies or data loss prevention (DLP) settings, for example.

First, you must understand the past so you can set a security baseline. Next, build thresholds that, if exceeded, alert the security staff. This automated anomaly detection -- whether it is in user activities, network performance or system functions -- helps reduce the amount of ad hoc investigation and real-time alert chasing that is currently being done. Several enterprises and government agencies are already ahead of the game. Some have been collecting event data for more than five years and are therefore in a better position to identify what anomalies look like. While this seems like a very new concept, the building blocks for taking a predictive analytics approach are fairly straightforward:

Collect all event data…and we mean everything -- While you might not yet see the value in correlating data from seemingly unrelated systems (for example, IT ticketing systems versus firewall events, or customer record system access versus Web traffic), you will appreciate it once that data is available to analyze.

Store it all in one data warehouse -- Centralized storage is critical because you want the data to be uniformly gathered and accessible; these are often the biggest obstacles to contend with. It is also important to consider a data warehouse built for time-stamped data, which is quite different from operational data that does not offer the granularity event data allows you to capture.

Keep the data in its raw form -- You might need to normalize the data for your real-time processing, but maintain a stream that is untouched. Why? The results of your analysis will be more valuable if the fidelity of the data remains intact. 

Make analysis simple -- Find a way for even the most non-technical, non-security user to make sense of the data. Think about leveraging the same visualization and drill-down capabilities available in off-the-shelf, third-party business intelligence tools. Apply analytical best practices and techniques found in marketing, finance and sales organizations to slice and dice your data.

Start with basic correlations -- Even after a month, the data you collect will already support interesting analysis. This may seem unlikely, but start with something simple -- pick event data from a system that contains valuable information and chart employee download volumes versus time of day. Patterns will start to emerge -- employees who download a consistent number of records per day, during specific times. Within that, some unusual volumes or times of day may surface. Decide whether those are suspicious or justified, and then start establishing alerts based on deviations from those baselines, as sketched below.
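
For illustration only, here is a minimal sketch of that kind of download-volume baseline, written in Python with pandas. It assumes download events have already been collected centrally with a timestamp and a user field; the file name, the column names and the “three times the typical volume” cutoff are assumptions made for this example, not a prescribed configuration.

    import pandas as pd

    # Assumed input: one row per download event, with "timestamp" and "user" columns.
    events = pd.read_csv("download_events.csv", parse_dates=["timestamp"])
    events["hour"] = events["timestamp"].dt.hour

    # Baseline: downloads per employee per hour of day (the chart described above).
    profile = (events.groupby(["user", "hour"])
                     .size()
                     .rename("downloads")
                     .reset_index())

    # Surface unusual volumes for review -- here, anything well above the user's
    # typical hourly volume. Whether that is suspicious or justified is a human call.
    typical = profile.groupby("user")["downloads"].median().rename("typical").reset_index()
    profile = profile.merge(typical, on="user")
    print(profile[profile["downloads"] > 3 * profile["typical"]]
          .sort_values("downloads", ascending=False))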

Here are a few more examples of interesting activities that will immediately make real-time alerts and investigations more rational: 

Correlating downloads at unusual times with failed log-ins -- By reviewing data from various systems, you will be able to flag any employee who is entering a system at an unusual hour and then successfully or unsuccessfully logging into a system he or she does not normally access (a minimal sketch of this correlation follows these examples). From here, you can quickly determine whether an employee has either sold his or her log-in details or had them compromised.

Identifying multiple IP addresses for the same log-in -- By correlating log-in activities with IP addresses, you can determine whether multiple IPs are using the same credentials. This can reveal whether an employee is sharing passwords or whether there is malicious activity.

Tracking unusual badge swipe behavior -- If you understand typical employee enter/exit times, it is easier to identify unusual behavior. When correlated with other alerts, this can indicate if an employee’s badge is lost or compromised, which could lead to unauthorized access. 
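
As a sketch of the first correlation above, and assuming the download and authentication logs sit in the same warehouse, even a simple join on user and calendar day can surface the overlap. The file names, column names and the definition of “unusual hours” (10 p.m. to 6 a.m. here) are illustrative assumptions.

    import pandas as pd

    # Assumed inputs: download events with "timestamp" and "user"; authentication
    # events with "timestamp", "user" and "result" columns.
    downloads = pd.read_csv("download_events.csv", parse_dates=["timestamp"])
    logins = pd.read_csv("auth_events.csv", parse_dates=["timestamp"])

    # Downloads at unusual hours (assumed here to be 10 p.m. through 6 a.m.).
    hour = downloads["timestamp"].dt.hour
    off_hours = downloads[(hour >= 22) | (hour < 6)].copy()
    off_hours["day"] = off_hours["timestamp"].dt.date

    # Failed log-ins from a separate system's logs, keyed to the same calendar day.
    failed = logins[logins["result"] == "failure"].copy()
    failed["day"] = failed["timestamp"].dt.date

    # Employees with both off-hours downloads and failed log-ins on the same day.
    flagged = off_hours.merge(failed, on=["user", "day"], suffixes=("_dl", "_auth"))
    print(flagged[["user", "day"]].drop_duplicates())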

Over time, you can add more sophisticated filtering and automation to the processes you establish, making your security management approach very sensitive to any departures from previously acceptable behaviors. This “total view” of historical data enables the security staff to make timely, informed decisions about existing baselines and ongoing patterns. By combining log management, real-time oversight, incident response, forensic investigation and compliance into this now-required “total view,” government agencies can effectively improve security.
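
To illustrate what that automation might look like, here is a small sketch that checks each new batch of events against stored per-user baselines and returns the departures. The column names, the metric and the three-sigma sensitivity are assumptions for the example; tightening the multiplier over time makes the check more sensitive.

    import pandas as pd

    def departures(new_events: pd.DataFrame, baselines: pd.DataFrame,
                   sigmas: float = 3.0) -> pd.DataFrame:
        """Return rows whose metric departs from the user's established baseline.

        Assumes (for this sketch) that `baselines` holds one row per user with
        "mean" and "std" columns computed from historical event data, and that
        `new_events` holds "user", "day" and "metric" columns for the latest
        collection window.
        """
        joined = new_events.merge(baselines, on="user")
        limit = joined["mean"] + sigmas * joined["std"]
        return joined[joined["metric"] > limit][["user", "day", "metric"]]

    # Lowering `sigmas` tightens the thresholds, making the process more sensitive
    # to departures from previously acceptable behavior.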

Cyber threats are growing and evolving rapidly. Their omnipresence in civilian, government and critical infrastructure environments today poses a grave threat to national security. It is vital that vulnerable government agencies understand how to mitigate these threats. With the intelligence and in-depth insight derived from historical data, government organizations can produce accurate breakdowns of compromise attempts, both now and in the future.

 
