Emerging Themes in Cyber Security – 2021

By Renny McPherson and Dr. Josh Lospinoso


The cyber security landscape is evolving rapidly as the attack surface grows exponentially, driven by mega-trends in how people live today: more devices, more digital everything, more open source, more enterprises developing software, and everything digital being connected.

COVID-19’s work-from-home mandates have exacerbated the risk. As such, there are many opportunities for startups to have an impact, whether by addressing a new, modern theme or by taking a fresh approach to a long-standing cyber segment such as endpoint protection. Below, we outline eight themes of interest for First In this year. This list is far from exhaustive, as many segments within cyber security present opportunity.


The Long Tail

Small and medium-sized businesses (SMBs) are increasingly at risk of cyber attack, and enterprises are more and more vulnerable to supply chain risk from their vendors and partners. Solutions that have worked in the enterprise are now needed by SMBs, at a lower price point and with greater ease of use.

We will devote a follow-on post to the long tail of risk.


Data Security

Enterprises generate and retain massive amounts of data. It’s important to secure this data with a combination of filtering, blocking, and remediating techniques. Data security platforms will integrate directly with other data platforms to monitor activity, provide backups, and ensure compliance. There are many incumbents in this space, but we believe this is a growing segment.

We are keeping an eye on data encryption startups that are answering the call for quantum-resilient encryption techniques. While the technological problem clearly exists, companies are still working to find viable business models for their solutions.

We believe that data vaults are an investment opportunity in this space. If service providers host highly secure data and expose it as a service to customers, they can neatly solve several pain points at once. These so-called “data vaults” transfer risk to the service provider.
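A common mechanic behind these offerings is tokenization: the vault keeps the sensitive value, and the customer’s systems store only an opaque token. A minimal sketch of the pattern (the class and method names here are our own illustration, not any vendor’s API):

```python
# Minimal sketch of the "data vault" tokenization pattern: the vault stores
# the sensitive value and hands the customer an opaque token to use instead.
import secrets

class DataVault:
    def __init__(self):
        self._store = {}

    def tokenize(self, sensitive_value):
        """Store the sensitive value; return an opaque token in its place."""
        token = "tok_" + secrets.token_hex(8)
        self._store[token] = sensitive_value
        return token

    def detokenize(self, token):
        """Exchange a token for the original value (an access-controlled call)."""
        return self._store[token]

vault = DataVault()
token = vault.tokenize("4111-1111-1111-1111")
# The application persists only the token; the vault holds the real value.
assert vault.detokenize(token) == "4111-1111-1111-1111"
```

Because the customer’s database never holds the raw value, a breach of that database yields only useless tokens, which is exactly the risk transfer described above.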

Major players in this space include Very Good Security, Evervault, and Skyflow.


Application & Composition Analysis

COVID-19 exacerbated the pressure on technology organizations to integrate security into multiple phases of the software development lifecycle. Over the next several years, teams will increasingly integrate security into their build phases. Startups in this space will offer tools to detect vulnerabilities in software dependencies and perform software composition analysis.

Major players in this space include Sonatype, Snyk, Whitesource, and Micro Focus. Phylum is an upstart taking a next generation approach. Rather than match known vulnerabilities against open source package versions, Phylum ingests terabytes of open source code and performs analysis to find unknown vulnerabilities, identify dependency risk, and mine for malicious activity. Earlier this year, First In led Phylum’s seed stage financing.
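The traditional approach amounts to matching a project’s dependency list against a database of known-vulnerable versions. A minimal sketch of that matching step, with entirely hypothetical package names and advisory IDs:

```python
# Minimal sketch of software composition analysis (SCA): flag dependencies
# whose pinned versions appear in a known-vulnerability database.
# All package names, versions, and advisory IDs below are hypothetical.

known_vulns = {
    ("libfoo", "1.2.0"): "ADVISORY-0001: remote code execution",
    ("libbar", "0.9.1"): "ADVISORY-0002: path traversal",
}

def scan(dependencies):
    """Return (package, version, advisory) for each vulnerable dependency."""
    findings = []
    for name, version in dependencies:
        advisory = known_vulns.get((name, version))
        if advisory:
            findings.append((name, version, advisory))
    return findings

project_deps = [("libfoo", "1.2.0"), ("libbaz", "2.0.0")]
print(scan(project_deps))  # flags only libfoo 1.2.0
```

The limitation is visible in the sketch: it can only flag vulnerabilities someone has already found and catalogued, which is why next-generation approaches analyze the open source code itself.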


Application Security Orchestration and Correlation

While application security is a burgeoning industry, we believe there will be major growth in the number of tools available to enterprises. These tools will require integration and correlation. Because this market will likely be fragmented, startups will rise to integrate complementary solutions and improve end-user experiences. This market is poised to break out.

Emerging companies in this space include Code Dx and ZeroNorth.


Cyber Insurance 

Cyber insurance is still in its early days, with major insurance providers finding their footing in this essential market. In the race to maturity, look for more news such as the recently announced partnership between Google, Allianz, and Munich Re. With breaches and cybersecurity spending both rising every year, a risk transference mechanism is necessary. Large insurance companies cannot simply copy-paste life insurance actuarial tables onto the cyber risk paradigm. As a result, this is a ripe market for small, nimble companies with strong risk assessment chops to stand out.

We believe that a core problem in cyber insurance is information. Insurers simply have a difficult time quantifying risk. The insured, especially small and medium-sized businesses, want to mitigate what they can and transfer the rest without thinking about it too much. We believe there’s a large market opportunity for companies to address both issues at once. By pairing cybersecurity assessments with insurance, the same entity can perform a service for the SMB (a cybersecurity risk assessment) and more accurately understand what it is insuring. Finally, it becomes possible to price cybersecurity mitigations based on how they impact insurance premiums.
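As a toy illustration of that last point, with entirely made-up numbers: if a mitigation lowers the insurer’s assessed risk score, the resulting premium reduction puts a natural ceiling on what the mitigation is worth.

```python
# Toy illustration of pricing a mitigation by its premium impact.
# The base rate and risk scores below are invented for illustration only.

def annual_premium(base_rate, risk_score_pct):
    """Premium scales with assessed risk (risk_score_pct in 0..100)."""
    return base_rate * risk_score_pct // 100

base_rate = 50_000   # hypothetical SMB base rate, USD per year
risk_before = 80     # assessed risk score without the mitigation
risk_after = 55      # assessed risk score with, say, MFA deployed

savings = annual_premium(base_rate, risk_before) - annual_premium(base_rate, risk_after)
print(savings)  # 12500: annual premium savings, a ceiling on the mitigation's price
```

A real insurer’s model would be far richer than a single linear score, but the mechanism is the same: the entity that performs the assessment can quote both the premium and the value of each fix.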

Emerging companies include Coalition, Cowbell, and Trava.


Unifying Security in the Cloud: CSPM, CWPP and GRC

As containerization permeates everything, cloud workload protection platforms will become essential additions to cloud access security broker offerings. This is a hot space, with recent acquisitions by Palo Alto Networks, McAfee, Cisco, Check Point, and Fastly. As Kara Nortman of Upfront Ventures hypothesizes, the “Rise of the Multi-Cloud” will be a core driver of cybersecurity tool demand. While 93% of enterprises intend to use a multi-cloud strategy, most cybersecurity products aren’t built for a cloud-first world.

Caveonix created a single integrated platform for automated compliance, cloud security posture management (CSPM), cloud workload protection (CWPP), and governance in hybrid and multi-cloud environments. First In led Caveonix’s $7M Series A in Q4 2020.


Identity and Access Management

Identity and access management (IAM) governs permissions across an enterprise. It helps customers manage employee and customer identities and ensures that privacy preferences and access provisioning safeguard sensitive services and data. This is a large and growing market.

We believe that there’s a major opportunity for players to develop better rules management for IT and security teams. Currently this is an error-prone and labor intensive process.

There continues to be a major opportunity to evolve beyond passwords and multifactor authentication. Possible replacements, such as zero-factor authentication, rely on behavioral analytics and the device used for access.

Major players in this space are Beyond Identity, Forter, Mati, JumpCloud, and Alloy.


New Approaches to Endpoint Security

Endpoints are remote devices that provide services and process data. These devices, like computers, phones, network gear, and servers, remain critical. This is a large, well-established segment of the information security field, and we view it as very difficult for new players to penetrate because it is so crowded.

However, there are some subsegments that offer opportunity. Internet of Things and Operational Technology, for example, represent a new frontier of cybersecurity that we believe represents a huge opportunity.

We believe there’s opportunity in the Extended Detection and Response (XDR) space. XDR represents a potential next generation of endpoint security, where detection and response are automated. Startups with a superior product could challenge increasingly outdated antivirus solutions and labor-intensive security information and event management (SIEM) incumbents.


The Rise, Ubiquity and Vulnerability of Open Source Software

By Renny McPherson, managing partner of First In, and Matthew Dulaney, director of operations (summer associate) at First In.

Open source software is ubiquitous today. That’s a good thing. But it wasn’t always clear open source would win. Reviewing the history of proprietary and open source software development can help us understand how open source became so widely used, and how open source software came to be both incredibly valuable to the world and incredibly vulnerable as a threat vector for cyber attacks.

Let’s start with a brief history. In the early 1970s, universities and research organizations were the only institutions with the resources and the demand to purchase computers with sufficient functionality to be usable. MITS (Micro Instrumentation and Telemetry Systems) changed that with the Altair 8800 microcomputer, which began to bring computing mainstream. Bill Gates and Paul Allen designed a BASIC interpreter for the device, fueling both Altair’s sales and the success of their own company, Microsoft. By 1976, BASIC was used by most Altair owners; however, few owners actually paid for the software. Instead, many people loaded friends’ purchased copies onto their machines.

Bill Gates wrote an “Open Letter to Hobbyists” in 1976, accusing users of stealing Microsoft’s software. The letter was an assault on the software development community, which embodied what would now be considered open source values of decentralization and open information sharing, values stemming from the early days of computing. “Hobbyists” would riff off of Gates’s code and share their own versions for free; other developers would take those modified editions and further adjust the code, spreading a network of free software based on the original code written by Microsoft. Gates condemned code sharing, instead advocating professional, paid software development.

Proprietary software — which can be called “closed-source” software, as opposed to open source software — dominated the 1980s and much of the 1990s. Software was sold attached to hardware products, and users could not access or modify source code behind their products. Microsoft, after Gates’s “Open Letter to Hobbyists,” continued to criticize the principles behind open source. 

Meanwhile, Richard Stallman, an MIT computer programmer, was inspired to establish a free operating system, GNU, in 1983. Stallman had programmed printer software to notify users and pause printing when a printer was jammed. When a new printer arrived with closed-source software that prevented him from programming it the same way, he resolved to build a free operating system. He took it further: Stallman quit MIT to continue developing GNU, and his strict adherence to free software is codified in the GNU General Public License, or GPL. The GPL does not forbid charging for software; rather, it requires that anyone who distributes GPL-licensed software, modified or not, make the source code available under the same terms. This “copyleft” provision prevents users of GPL-licensed code from relicensing derivative works under proprietary, restrictive terms.

In 1991, Linus Torvalds created Linux with GNU’s tools. Linux — a portmanteau of Unix and his first name — is, strictly speaking, an operating system kernel, providing developers extensive flexibility to write programs that meet their needs. Licensed under GNU’s GPL, Linux is steeped in open source orthodoxy: users can freely use and alter the code, but if they distribute their modifications, they must publish them for others to access.

While Linux has had a massive impact in the history of open source software development, its early success was limited. An initial point of contention was rooted in doubt that a mass of amateur, part-time coders could effectively and consistently create usable software. Another roadblock was Linux’s complexity compared to meticulously developed alternatives.

Linux’s popularity exploded among those willing to spend time untangling its complexity in return for its power. Soon enough, companies like Red Hat began to develop Linux-based toolkits that allowed developers to harness Linux’s vast functionality. Red Hat built a careful business model to take advantage of open source without monetizing open source software per se. It would assemble open source code, polish it, and release two versions: a free version with just the source code, and a paid version that included the source code along with how-to guides, serial numbers, and customer support. Red Hat appeased hardcore open source developers, offered prices that were a fraction of the competition’s, and made money along the way. Its popularity surged.

Linux’s burgeoning success converted many previously hardline anti-open source software developers. One such convert was Eric Raymond, who published his experience with open source in The Cathedral and the Bazaar. Raymond initially believed proper software development necessitated careful design by small teams, with no early beta releases. After his conversion, he wrote that Linus Torvalds’s genius was to “release early [and] release often” (p. 29) and to treat users as co-developers (p. 27). He also debunked the claim that open source software is inherently inferior to proprietary alternatives: “Quality [is] maintained not by rigid standards or autocracy but by the naively simple strategy of releasing every week and getting feedback from hundreds of users within days, creating a sort of rapid Darwinian selection on the mutations introduced by developers. To the amazement of almost everyone, this work[s] quite well.”

Raymond’s essay caught the attention of executives at Netscape, maker of the popular Navigator web browser. Soon after the essay was published, the company decided to release the source code for Navigator 5.0, kicking off the Mozilla project and lending further legitimacy to open source. From the Mozilla project’s ashes, developers at AOL (which acquired Netscape) created a sleeker version of the browser, Mozilla Firefox, in 2004. Where Navigator 5.0 was a jumbled mess of features and code, Firefox was sleek and user-friendly. It challenged Internet Explorer’s dominance, reaching 100 million downloads within a year and a half and 1 billion downloads by 2009.

As Firefox grew in popularity, Linus Torvalds himself was advancing another key pillar of open source software development: Git. Git allowed developers to track revisions and easily implement source code changes, bringing transparency and elegance to previously clunky version control schemes. Git’s tools were consolidated in 2008 with the advent of GitHub, a free hosting service for Git repositories. The typical GitHub workflow begins with branching, where developers create a new environment in which they can tweak code without directly impacting the master branch. In the new branch, developers “commit” new code to the existing project, adding and testing new features separate from the core project. Commits also track development history, allowing project owners to understand from whom changes came and to reverse progress if bugs are discovered. Developers solicit community feedback with pull requests, then, once satisfied, deploy a branch by merging it with the master project.
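The branch, commit, and merge loop described above can be sketched end to end against a throwaway repository. Here we drive the git command-line tool from Python; the file contents and branch name are hypothetical.

```python
# Sketch of the branch/commit/merge workflow, driven through the git CLI.
# Creates a throwaway repository in a temporary directory.
import subprocess
import tempfile

repo = tempfile.mkdtemp()

def git(*args):
    """Run a git command inside the throwaway repository."""
    return subprocess.run(["git", *args], cwd=repo, check=True,
                          capture_output=True, text=True).stdout.strip()

git("init", "-q")
git("config", "user.email", "dev@example.com")
git("config", "user.name", "Dev")

# Commit an initial file on the default branch.
with open(f"{repo}/app.txt", "w") as f:
    f.write("core feature\n")
git("add", "app.txt")
git("commit", "-q", "-m", "initial commit")
default_branch = git("symbolic-ref", "--short", "HEAD")

# Branch: a separate environment for new work, isolated from the main line.
git("checkout", "-q", "-b", "feature/new-idea")
with open(f"{repo}/app.txt", "a") as f:
    f.write("experimental tweak\n")
git("commit", "-q", "-am", "add experimental tweak")

# Merge: once the change is reviewed, fold the branch back into the main line.
git("checkout", "-q", default_branch)
git("merge", "-q", "feature/new-idea")

count = git("rev-list", "--count", "HEAD")
print(count)  # 2: both commits now on the default branch
```

In a hosted setting like GitHub, the merge step would be preceded by a pull request, where the community feedback described above takes place.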

GitHub facilitates open source development by allowing developers to establish reputations for their contributions. The branching and merging process addresses version control, while profile tracking makes developer histories transparent and lets software owners better evaluate incoming changes to their code.

Open source completed its rise to ubiquity and became an official part of the mainstream when Microsoft purchased GitHub in 2018 for $7.5 billion in Microsoft stock. The acquisition marks a stark turnaround in sentiment from Bill Gates’s Open Letter, and from the early 2000s, when then-CEO Steve Ballmer called Linux “a cancer”. If Netscape’s embrace of open source in 1998 offered credibility and allowed corporations to consider similar adoption, Microsoft’s acquisition solidified open source as the dominant software development ethos. GitHub hosts major corporations’ source code, including Facebook’s, Amazon’s, and Google’s, and continues to be the default for software developers worldwide. Per CNBC, “The success of open source reveals that collaboration and knowledge sharing are more than just feel-good buzzwords, they’re an effective business strategy.”

Software developers of all kinds — from tech giants to amateurs — continue to rely on open source code, harnessing others’ ingenuity to create high quality programs. However, open source remains imperfect, as demonstrated by the growing volume of software bugs and cyber attacks reported in recent years. Notably, in 2014 Google disclosed the now-infamous Heartbleed bug in OpenSSL: over 500,000 websites used OpenSSL’s Heartbeat extension (the feature afflicted by Heartbleed) and were thus vulnerable to attack. Companies at risk ranged from Twitter to GitHub to the Commonwealth Bank of Australia. The incident highlighted a crucial vulnerability in open source: essential programs like OpenSSL are imperative to the success of major companies and projects, yet lack security oversight.

Experts agree: “Windows has a dev team. OpenSSL these days is two guys and a mangy dog,” says Matthew Green, assistant professor at Johns Hopkins. Writes Chris Duckett of ZDNet: “In years past, it was often the case that businesses took the view that all that was needed was to drop source code on a server, and the community will magically descend to contribute and clean up the code base. Similarly, users of open source software wrongly assume that because the code is open source, that an extensive review and testing of the package has occurred.” “The mystery is not that a few overworked volunteers missed the bug,” says former OpenSSL Foundation president Steve Marquess. “The mystery is why it hasn’t happened more often.”

Today, given how open source evolved, it is no one’s specific job to secure it. Code with open source dependencies can rely on thousands of upstream developers, exposing software makers to upstream attacks from bad actors who write malicious code into open source packages that are then included in other projects.

Humanity is reliant on open source software. It’s time to ensure security is built into how open source code is written and reviewed. Large enterprises are starting to pay attention to this burgeoning vulnerability and agree the problem needs to be solved. New companies are being created in real time to address this gap, so that open source can continue to provide enormous value – without today’s vulnerabilities – for everyone.