It’s hard to ignore the generative AI hype. One day machines couldn’t reliably generate text and images, and then suddenly they could.
One executive described it to me as AI’s “Cambrian moment,” a sudden explosion of computational possibility. Generative AI feels like magic because it seems like you’re getting something for nothing. Which raises the same fear new technology has always raised: What do you need all these people for?
It’s an awkward question that many of my colleagues at Codeword have spent a lot of time finding a good answer to. In short: We’ve found that AI is best for taking care of busywork, like summarizing meetings and creating templates, so we can focus on the actual creative work. AI is not magic. It’s a tool.
We’re seeing a similar approach in other fields as companies begin building generative AI features into their products. A great example is cybersecurity, which is currently dealing with its own special crisis: There just aren’t enough people.
Although the field of cybersecurity has never been bigger, ISC2, the world’s largest IT security organization, recently found that the industry faces a growing talent gap. The field employs an estimated 4.7 million professionals, but ISC2 calculates the industry would need to hire 3.4 million more to “secure assets effectively.” It’s no surprise, then, that among the cybersecurity professionals ISC2 surveyed, 70% reported feeling overworked. Can you see where this is going?
At the recent Google Cloud Security Summit, Google introduced its solution to the talent gap: the large language model Sec-PaLM 2. Teased earlier this year at Google I/O, Sec-PaLM 2 is a more specialized version of PaLM 2, the LLM powering Google’s new experimental generative AI products like Bard and Duet AI.
At first glance, it seems counterintuitive: How can an AI designed to produce natural language shore up the talent gap in such a highly technical field? Speaking at the Security Summit, Umesh Shankar, chief technologist for security at Google Cloud, explained how the new Security AI Workbench, powered by Sec-PaLM 2, will help overworked cybersecurity professionals. The actual applications of generative AI here are wildly different from how we might use it in a creative field, but the goal is the same: tools that save you time, so you can spend more of your day focusing on the tasks that need a human touch.
“Today, it can be an overwhelming challenge to keep up with the evolving threat landscape,” said Shankar. “Even if you know all the threats, knowing what to do and doing it isn’t easy.” An LLM like Sec-PaLM 2 is continually trained on new data and threats, so it can help professionals analyze, spot, and explain potentially malicious code snippets before they lead to a breach.
Like any job, cybersecurity has more than its fair share of busywork. “If we’re being honest, the practice of security can feel like trying to stay on a treadmill that goes faster and faster,” said Shankar. “This is a big cause of burnout in the industry.” So, instead of someone manually adding security policies with every new code deploy, Sec-PaLM 2 can automatically generate the right protections. This cuts down on a ton of repetitive tasks for cybersecurity professionals, giving them time to work on more complex problems.
“It’s important to emphasize that this is just the start of a long and exciting journey,” said Shankar. As Google sees it, AI can help close the talent gap in cybersecurity, but ultimately, you still need people to find new ways to put it to good use. To do that, they need more tools, not magic.