The worldwide big data security market, valued at USD 24 billion in 2021, is anticipated to grow at a compound annual growth rate (CAGR) of roughly 19% between 2021 and 2030, reaching USD 115 billion. The industry abounds with opportunities to innovate and excel. However, the sector faces a wide range of challenges that require redefined strategies. For example, the sheer abundance of security systems is an urgent concern.
Consequently, it remains a challenge for experts to deploy such enormous numbers of systems effectively and obtain maximum results. What worsens the scenario is a shortage of skilled personnel at every level. In addition, effective consolidation of systems is necessary for optimal results, but unfortunately the industry achieves this only some of the time. Yet another challenge is offloading, or transitioning from on-premises solutions to cloud-based ones. Both of these challenges are aggravated by the same shortage of resources at different levels.
We can all agree that computer software is a complicated construct composed of numerous diverse components. Open-source software is becoming ever more common as a building block in software. This phenomenon is accompanied by an increase in exploitable vulnerabilities, so it is little wonder that being able to tell quickly what your software (or that of your suppliers and vendors) is composed of is becoming increasingly important. Even though you may think the SBOM (software bill of materials) is a new idea, people have always needed to know what components were in their software. Initially, the process was done in Excel sheets, but as software became more complicated and interconnected, a more automated, machine-readable format was required.
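To make the idea concrete, here is a minimal sketch of what that machine-readable approach looks like in practice, assuming a CycloneDX-style JSON SBOM (the embedded document and component names below are illustrative examples, not drawn from any real product):

```python
import json

# A minimal, hypothetical CycloneDX-style SBOM document. Field names follow
# the public CycloneDX JSON schema; the components are made-up examples.
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "components": [
    {"type": "library", "name": "log4j-core", "version": "2.14.1"},
    {"type": "library", "name": "openssl", "version": "3.0.7"}
  ]
}
"""

def list_components(sbom_text: str) -> list[tuple[str, str]]:
    """Return (name, version) pairs for every component in the SBOM."""
    sbom = json.loads(sbom_text)
    return [(c["name"], c["version"]) for c in sbom.get("components", [])]

for name, version in list_components(sbom_json):
    print(f"{name} {version}")
```

Because the format is structured, the same inventory can be checked against vulnerability feeds automatically rather than by hand, which is exactly what a spreadsheet could not scale to do.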
Dvir is the CTO at CalCom and a veteran of the IT industry with over 25 years of experience in systems, security, and Microsoft infrastructures. Dvir has been managing complex hardening projects for over 15 years and is considered a thought leader in his field. He also leads CalCom’s product team, including development of the product strategy and roadmap.

Organizations have a set of configuration standards and industry best practices for hardening their digital configurations. These benchmarks are set by the Center for Internet Security (CIS) and can be used to measure the effectiveness and efficiency of cybersecurity best practices. Because IT infrastructure is so complex, multiple dependencies need to be configured with these best practices in mind. The technical task of hardening an organization’s servers to comply with comprehensive security policies is an involved and very time-consuming process.
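To see why this is so time-consuming, consider that a single benchmark check can be automated along the lines of the hypothetical sketch below, and a real hardening effort repeats this pattern across hundreds of settings and interdependent server roles. The file path is standard on Linux, but the 365-day threshold is an assumed example value, not an official benchmark citation:

```python
import re
from pathlib import Path

# Illustrative CIS-style check: verify the maximum password age configured
# in /etc/login.defs. The 365-day threshold is an assumed example value.
MAX_PASS_DAYS = 365

def check_pass_max_days(login_defs: str = "/etc/login.defs") -> bool:
    """Return True if PASS_MAX_DAYS is set to MAX_PASS_DAYS or less."""
    text = Path(login_defs).read_text()
    match = re.search(r"^\s*PASS_MAX_DAYS\s+(\d+)", text, re.MULTILINE)
    return bool(match) and int(match.group(1)) <= MAX_PASS_DAYS

if __name__ == "__main__":
    status = "PASS" if check_pass_max_days() else "FAIL"
    print(f"PASS_MAX_DAYS <= {MAX_PASS_DAYS}: {status}")
```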
Sixteen years ago I started my career in cybersecurity in the nascent IT GRC (governance, risk, and compliance) space. My company had developed a “manager of managers” platform to assess a company’s IT compliance by attaching to existing network security automation applications (SIEMs, VA scanners, CMDBs, etc.) and pulling all the regular configuration checks into a findings repository, what we would today call a data lake. We had coded in all the controls for the most critical compliance frameworks (HIPAA, PCI, ISO 27001) and would scrub the findings against those controls to determine an organization’s compliance standing against any of those frameworks.
We could also pivot the findings on IT risk using industry-standard algorithms. Pretty cool, right? I visited many CISOs to show them how they could synthesize the findings of all of their security automation apps into one view, spend far fewer man-hours on monitoring, show continuous status, and save their expertise for the critical remediations. Some of them understood, but too many (who ended up getting breached) said, “I’m the ‘IT GRC’ around here – we can monitor all of our apps just fine.”
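In spirit, the scrubbing step was little more than joining normalized findings to framework controls. Here is a hypothetical sketch of the idea; all check names and control IDs are made up for illustration, not real PCI control numbers:

```python
from collections import defaultdict

# Normalized findings pulled from SIEMs, VA scanners, CMDBs, etc.
# (hypothetical hosts and check names)
findings = [
    {"host": "web01", "check": "tls_weak_cipher", "passed": False},
    {"host": "web01", "check": "patch_current", "passed": True},
    {"host": "db01",  "check": "default_password", "passed": False},
]

# Each framework control maps to one or more technical checks
# (illustrative IDs, not real framework control numbers).
control_map = {
    "PCI-2.2": ["tls_weak_cipher", "default_password"],
    "PCI-6.2": ["patch_current"],
}

def compliance_standing(findings, control_map):
    """Return the percentage of checks passing per control, across all hosts."""
    results = defaultdict(list)
    for f in findings:
        for control, checks in control_map.items():
            if f["check"] in checks:
                results[control].append(f["passed"])
    return {c: 100 * sum(v) / len(v) for c, v in results.items()}

print(compliance_standing(findings, control_map))
# e.g. {'PCI-2.2': 0.0, 'PCI-6.2': 100.0}
```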
Most of them ended up spending millions of dollars on annual risk assessments by Big 5 consultants who sent newly minted MBAs with clipboards to do manual checks via questionnaire (really!) and feed the findings into Excel. That type of risk assessment typically took at least six months, so you could only do one or two per year. What happened if you were breached with a major PCI violation and had to both attest and show remediation within three months? You had a high likelihood of losing your ability to accept credit card payments electronically, is what. If you were, say, an online gaming (“gambling” if not in the UK) company, that meant out of business.
Few would dispute that GRC is a key component of the cybersecurity portfolio for any modern organization. Sound GRC ensures an organization is operating within legal and regulatory frameworks while minimizing risks and maximizing opportunities. Management of threats and vulnerabilities is an essential component of a successful GRC program. Threats to an organization’s information systems can come from a variety of sources, including nation-states, cybercriminals, insiders with malicious intent, accidents, and natural disasters.
Vulnerabilities can arise from outdated software, misconfigured systems, or weak passwords. The consequences of a successful cyberattack or data breach can be severe, including financial loss, reputational damage, and legal liability. In this article, we will explore some recommendations for implementing effective threat and vulnerability management as part of a broader GRC strategy.
GRC threat management is essential because of the constantly changing landscape of risks and threats. Risks can originate from internal and external sources and can cause significant harm to an organization. Internal risks include insider threats such as data breaches, fraud, and theft, while external risks include cyberattacks, natural disasters, supply chain disruptions, and regulatory compliance violations. Threat management helps organizations identify and prioritize risks, implement strategies to mitigate them, and minimize the impact of threats.
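Prioritization is often done with the classic likelihood-times-impact scoring model. Here is a minimal sketch, with both factors on an assumed 1-5 scale and purely illustrative example risks:

```python
# Risk prioritization via the classic likelihood x impact scoring model.
# Both factors are on a 1-5 scale; the risks and scores are illustrative
# examples, not real assessments.
risks = [
    {"name": "insider data theft",      "likelihood": 2, "impact": 5},
    {"name": "ransomware attack",       "likelihood": 4, "impact": 5},
    {"name": "supply chain disruption", "likelihood": 3, "impact": 4},
    {"name": "compliance violation",    "likelihood": 3, "impact": 3},
]

for risk in risks:
    risk["score"] = risk["likelihood"] * risk["impact"]

# Highest score first: these risks receive mitigation resources first.
for risk in sorted(risks, key=lambda r: r["score"], reverse=True):
    print(f'{risk["score"]:>2}  {risk["name"]}')
```

The ranking, not the absolute numbers, is what matters: it gives the organization a defensible order in which to apply limited mitigation resources.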