DevSecOps — like its fraternal twin, DevOps — has been at work in software shops for several years now, intended to enable more collaborative and intelligent workflows. Now, AI is poised to add more juice to these efforts — but many are still skeptical about its implications.
These are some of the takeaways from a recent survey out of the SANS Institute, involving 363 IT executives and managers, which finds rising interest in adding AI or machine learning capabilities to DevSecOps workflows. Over the past year alone, there has been a significant jump of 16 percentage points in the use of AI or data science to improve DevSecOps through investigation and experimentation — from 33% in 2022 to 49% in 2023.
While interest in applying AI to the software development lifecycle is on the rise, there is also healthy skepticism about going full-throttle when injecting AI into workflows. “A strong contingent of the respondents, approximately 30%, reported not using AI or data science capabilities at all,” note the SANS authors, Ben Allen and Chris Edmundson. “This may reflect issues such as the rising level of concern surrounding data privacy and ownership of intellectual property.”
DevSecOps, as defined in the report, “represents the intersection of software development (Dev), security (Sec), and operations (Ops) with the objective of automating, monitoring, and integrating security throughout all phases of the software development lifecycle.” In other words, the goal is to establish processes that build in security right at the start — the design phase — and see it through to deployment.
Ultimately, a well-functioning DevSecOps effort delivers “reduced time to fix security issues, less burdensome security processes, and increased ownership of application security,” Allen and Edmundson state.
There has been an increase in pilot projects integrating security operations into both the “AI and machine learning ops” (19% fully or partially integrated) and “data science operations” (24%) categories. This is a “possible indication that organizations are performing threat modeling and risk assessments prior to incorporating AI capabilities into products,” the authors state.
Many organizations feel an urgent need for more qualified DevSecOps personnel — 38% report skills gaps in this area. “Because demand continues to outweigh supply in this area, there is a real need to spark more interest in this ever-changing field,” the authors urge. “To cope with the scarcity of talent amid competitive pressures, organizations should further leverage proven DevSecOps practices and explore emerging technological capabilities.”
Platform engineering, intended to streamline the flow of software from idea to implementation, is also gaining ground — fully or partially adopted by 27% of respondents. “As the developer self-service features inherent in a platform engineering practice mature, it will be essential to leverage the orchestration used to build, package, test, and deploy an application to incorporate security testing and tooling at key points along the path that has been laid out,” Allen and Edmundson state. “A well-implemented software engineering platform, designed in close collaboration with security stakeholders, could likely meet an organization’s application security orchestration and correlation objectives.”
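The orchestration the authors describe can be sketched in miniature: a delivery pipeline that interleaves security gates with the build, package, test, and deploy stages, where a failed gate blocks anything downstream. This is a hypothetical illustration — the stage names, gates, and fail-fast policy here are assumptions for the sketch, not details from the SANS report.

```python
# Minimal sketch of a platform pipeline with security tooling wired in at
# key points along the path. Stage and tool names are illustrative only.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Stage:
    name: str
    run: Callable[[], bool]       # returns True on success
    security_gate: bool = False   # marks stages that enforce security policy

def run_pipeline(stages: List[Stage]) -> List[str]:
    """Run stages in order, halting at the first failure (fail fast)."""
    completed = []
    for stage in stages:
        if not stage.run():
            print(f"Pipeline halted at: {stage.name}")
            break
        completed.append(stage.name)
    return completed

# Delivery stages with security checks placed after build, package, and
# staging deployment — security is in the path, not bolted on at the end.
pipeline = [
    Stage("build",            lambda: True),
    Stage("sast-scan",        lambda: True, security_gate=True),   # static analysis
    Stage("package",          lambda: True),
    Stage("dependency-audit", lambda: True, security_gate=True),   # composition analysis
    Stage("deploy-staging",   lambda: True),
    Stage("dast-scan",        lambda: False, security_gate=True),  # dynamic test fails
    Stage("deploy-prod",      lambda: True),
]

result = run_pipeline(pipeline)
# deploy-prod never runs, because the dast-scan gate failed upstream
```

The design point is simply that security stages are first-class steps in the same orchestration as everything else, so a failed scan stops a release the same way a failed build does.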