Can generative AI help address the cybersecurity resource gap?
Join us in Atlanta on April 10th to explore the landscape of the security workforce. We will discuss the vision, benefits and use cases of AI for security teams. Request an invite here.
At every cybersecurity event I attend, CISOs and other security professionals discuss the challenges of finding, hiring and retaining a team of cybersecurity professionals. Not surprisingly, ISC2 recently revealed that the industry faces a workforce shortage of nearly 4 million people — and that number is growing.
Anything we can do to reduce the burden on our security analysts and engineers means more time spent mitigating cyber risk, which is a huge win. Fortunately, generative AI can help address this skills shortage and have a positive impact on cybersecurity. Gen AI can enable the following:
Lowering the bar to entry
The cybersecurity industry often requires specialized training and certifications — requirements that can deter people from pursuing jobs in the field. Gen AI can be applied to technical documentation and other cybersecurity information to create more dynamic training that meets new hires where they are, rather than building training material for one assumed background and requiring candidates to do that work before joining an organization.
Creating more user-friendly documentation
Today, there are pages and pages of technical documentation for almost every cybersecurity tool on the market. Users often feel overwhelmed and must rely on vendors to train them on how to use their solutions. Gen AI can process and distill that same information into precise, meaningful guidance for the user.
Let’s say a customer needs to know how to run a query in a tool. Instead of the customer spending hours combing through technical documentation, security teams can use gen AI to quickly surface the three to five steps required to complete that action. Gen AI can help organizations create more user-friendly documentation so customers can access information faster, implement the solution sooner and reduce risk.
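The documentation-distillation idea above can be sketched in a few lines. This is a minimal, hypothetical example: it only assembles the prompt that would be sent to whatever gen AI API an organization has adopted — the model call itself is deliberately left out, and the function name and prompt wording are illustrative assumptions, not any vendor's actual interface.

```python
# Sketch: turn a "how do I do X?" question plus raw vendor documentation
# into a prompt that asks an LLM for a short, numbered procedure.
# No model is called here; this only builds the prompt text.

def build_distill_prompt(doc_text: str, task: str, max_steps: int = 5) -> str:
    """Assemble a prompt requesting a concise numbered procedure."""
    return (
        f"Using only the documentation below, list the {max_steps} or fewer "
        f"steps required to {task}. Number each step and keep each one to a "
        "single sentence.\n\n"
        f"--- DOCUMENTATION ---\n{doc_text}"
    )

# Illustrative usage: the documentation body is a placeholder.
prompt = build_distill_prompt(
    doc_text="(pages of vendor documentation would go here)",
    task="run a query in the tool",
)
```

The prompt constrains both the step count and the source material ("using only the documentation below"), which keeps the answer short and discourages the model from inventing steps the vendor never documented.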
Reducing risk of burnout
Security professionals often face burnout from tedious activities such as searching for documentation and logging their processes and findings. Large language models (LLMs) are purpose-built to analyze and synthesize data. Applied to an organization’s large body of internal and external documentation, they can reduce the time security analysts spend finding the information they need to do their jobs and communicate with the broader team. By cutting the ‘busy work’ that burdens teams, the focus can shift to remediating and reducing risk.
Staying up to date with the latest news and research
One area that I believe would benefit greatly from gen AI is continual education. As we know, cybersecurity threats, attack vectors and bad actors are constantly changing. Too often, however, security professionals are heads down handling incidents, writing policies and drafting architectures; they don’t have time to keep up with what’s happening outside their organization. Gen AI can gather and distill information pertinent to an industry vertical from an organization’s trusted sources, from favorite trade publications and industry associations to other preferred research and resource sites.
Improving cross-team organizational security communications
Organizational education is another ongoing challenge for cybersecurity teams. Gen AI and automation could reduce the time spent synthesizing phishing information. For example, if an organization sees persistent phishing attempts, gen AI could analyze the text and create custom messaging based on each department’s function to best equip it to mitigate the risk — giving time back to the security organization as well as reducing incident load.
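The department-tailored advisory workflow described above could look roughly like this. It is a hedged sketch: the department names, role descriptions and prompt phrasing are all illustrative assumptions, and as in any such pipeline the generated prompts would be sent to the organization's chosen LLM, with a human reviewing the output before anything is distributed.

```python
# Sketch: given the text of a phishing attempt, build one tailored
# LLM prompt per department so each team gets guidance framed around
# its own workflow. Department list and wording are hypothetical.

DEPARTMENTS = {
    "Finance": "handles invoices and wire transfers",
    "HR": "handles resumes and employee records",
    "Engineering": "handles source code and build systems",
}

def advisory_prompts(phishing_text: str) -> dict:
    """Return a per-department prompt asking an LLM for tailored guidance."""
    return {
        dept: (
            "The following phishing email was sent to our staff:\n"
            f"{phishing_text}\n\n"
            f"Write a short warning for the {dept} team, which {role}, "
            "explaining how this attempt could target their workflow "
            "and what to watch for."
        )
        for dept, role in DEPARTMENTS.items()
    }

# Illustrative usage with a made-up phishing snippet.
prompts = advisory_prompts("Urgent: update your payroll details at ...")
```

Keeping the department-to-function mapping in data (rather than hand-writing each advisory) is what gives the security team its time back: one analyst reviews a handful of generated drafts instead of writing each from scratch.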
Take time to build the right guardrails
These are just a few ways gen AI can be used to bring more qualified people into the cybersecurity field as well as amplify the output of those already part of it today. I am excited to see what other use cases we’ll benefit from in the future.
But as with all cutting-edge technology, we must carefully consider how it should be used and put the proper policies and other safeguards in place.
For example, here is a recommendation we make to all of our clients: once an organization chooses a gen AI platform, it should establish a paid, contractual relationship so the vendor can provide guidance on the tool and help troubleshoot any issues. Why? Because security teams shouldn’t be going to ChatGPT on their own and creating personal accounts where there is no visibility or control; we also need the ability to audit our vendors and ensure transparency in their processes.
Organizations also should train gen AI only on documentation, data and other information from trusted sources. Finally, always remember that while gen AI can do a lot of good, everything it outputs must be checked by a human before any action is taken.
Gen AI is already transforming the cybersecurity industry and will be instrumental in closing the cybersecurity resource gap.
Kyle Black is a cybersecurity architect with Symantec by Broadcom.