Black Duck has announced the general availability of Black Duck Signal, an agentic AI application security solution designed from the ground up to address the security challenges created by AI-native software development. The launch comes as AI coding assistants move from novelty to norm across enterprise software teams. Industry analysts predict that 90% of enterprise developers will be using AI coding tools by 2028, a shift that is fundamentally altering the volume, velocity, and nature of the code hitting production systems. The problem, according to Black Duck, is that the security tools designed to protect that code have not kept pace.
"AI isn't just accelerating development, it's actively authoring software," said Jason Schmitt, CEO of Black Duck. "Signal unlocks AI-driven development by removing risk and bringing intelligence, determinism, and governance to that reality."
A Different Architecture for a Different Problem
Unlike traditional application security testing (AST) tools that rely on language-specific, rule-based scanning engines, Signal is built on an agentic AI architecture. Rather than a single model, it deploys a coordinated system of specialised AI security agents that work together to analyse code, assess the exploitability of vulnerabilities, prioritise risk, and recommend or automatically apply fixes, reasoning through issues with what Black Duck describes as human-like logic.
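To make the idea of coordinated agents concrete, the sketch below shows how such a pipeline might be wired together. It is purely illustrative: the agent names, interfaces, and toy scoring logic are our assumptions, not Black Duck's actual design.

```python
# Illustrative sketch of an agentic triage pipeline; NOT Black Duck's
# actual architecture. Each "agent" is a stage that enriches a finding.
from dataclasses import dataclass


@dataclass
class Finding:
    file: str
    line: int
    description: str
    exploitable: bool | None = None   # set by the exploitability agent
    priority: int | None = None       # set by the prioritisation agent
    suggested_fix: str | None = None  # set by the remediation agent


class AnalysisAgent:
    """Scans source text and emits candidate findings."""
    def run(self, path: str, source: str) -> list[Finding]:
        findings = []
        for i, line in enumerate(source.splitlines(), start=1):
            if "password" in line and "=" in line:  # toy heuristic only
                findings.append(Finding(path, i, "possible hard-coded credential"))
        return findings


class ExploitabilityAgent:
    """Judges whether a finding is actually reachable/exploitable."""
    def run(self, finding: Finding) -> Finding:
        finding.exploitable = "test" not in finding.file  # toy judgement
        return finding


class PrioritisationAgent:
    """Ranks findings so developers see the riskiest first."""
    def run(self, finding: Finding) -> Finding:
        finding.priority = 1 if finding.exploitable else 3
        return finding


class RemediationAgent:
    """Proposes (or could auto-apply) a fix."""
    def run(self, finding: Finding) -> Finding:
        finding.suggested_fix = "move the secret to a vault or environment variable"
        return finding


def triage(path: str, source: str) -> list[Finding]:
    """Run the agents in sequence, filtering out non-exploitable noise."""
    findings = AnalysisAgent().run(path, source)
    findings = [ExploitabilityAgent().run(f) for f in findings]
    findings = [f for f in findings if f.exploitable]
    return [RemediationAgent().run(PrioritisationAgent().run(f)) for f in findings]


if __name__ == "__main__":
    for f in triage("app/config.py", 'password = "hunter2"\n'):
        print(f)
```

The point of the structure, per Black Duck's framing, is that exploitability filtering sits between detection and reporting, so raw pattern hits never reach the developer unvetted.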
Central to that intelligence is ContextAI, Black Duck's purpose-built application security model, trained on petabytes of human-validated security data accumulated over more than 20 years. The company argues that this grounding in real-world security expertise is what separates Signal from general-purpose AI security tools: the agents aren't just pattern-matching against known signatures, they're drawing on deep contextual knowledge to make informed judgements about risk and remediation.
That distinction matters particularly for the kinds of vulnerabilities that are hardest to catch: complex, cross-file dataflow issues, business logic errors, and novel defects that don't match any existing rule or signature. Signal's multi-model approach means that different agents are applied at different stages of analysis, with Black Duck claiming each is optimised for the task at hand.
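As a hedged illustration of why such flaws evade signature-based scanners, consider the generic missing-authorisation bug below (invented for this article, unrelated to any specific Black Duck or Gitea finding, with `db.fetch` as a stand-in data-access helper). Every individual line looks idiomatic and safe; the defect exists only relative to the application's own business rules.

```python
# A generic business-logic flaw: the handler authenticates the caller but
# never checks that the invoice being fetched belongs to them. No single
# line matches a vulnerability signature; the bug is the *missing* logic.
def get_invoice(db, current_user, invoice_id):
    if current_user is None:
        raise PermissionError("login required")  # authentication: present
    invoice = db.fetch("SELECT * FROM invoices WHERE id = ?", (invoice_id,))
    # MISSING: if invoice["owner_id"] != current_user.id: raise PermissionError(...)
    return invoice                               # authorisation: absent
```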
Proof in the Wild
Black Duck has pointed to a concrete real-world example to demonstrate Signal's capabilities. The company's Cybersecurity Research Center used Signal to identify a previously undisclosed authentication bypass vulnerability in Gitea, the popular open source Git platform, before it was publicly known. The finding, Black Duck says, illustrates Signal's ability to surface high-impact logic flaws that conventional tools would miss entirely.
Built for Where Code Actually Gets Written
Signal integrates directly into the tools developers already use: AI coding assistants, IDEs, and automated pipelines, via the Model Context Protocol (MCP) and APIs. It analyses code continuously as it is written or generated, surfacing issues before they reach a commit rather than flagging them after the fact. Where traditional AST tools are known for high false positive rates that erode developer trust, Signal's built-in exploitability analysis is designed to filter out non-issues and surface only what genuinely matters.
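As a rough sketch of what "before they reach a commit" could look like in practice, the snippet below is a pre-commit hook that posts staged changes to a scanning service and blocks the commit on exploitable findings. The endpoint, payload, and response schema are entirely hypothetical; the real integration points (MCP server, IDE plugin, APIs) are documented by the vendor.

```python
# Hypothetical pre-commit hook: submits the staged diff to a code-scanning
# service and aborts the commit if exploitable findings come back. The URL
# and response schema are invented for illustration only.
import json
import subprocess
import sys
import urllib.request

SCAN_URL = "https://scanner.example.com/api/v1/scan"  # placeholder endpoint


def staged_diff() -> str:
    """Return the diff of changes staged for commit."""
    return subprocess.run(
        ["git", "diff", "--cached"], capture_output=True, text=True, check=True
    ).stdout


def main() -> int:
    req = urllib.request.Request(
        SCAN_URL,
        data=json.dumps({"diff": staged_diff()}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        findings = json.load(resp).get("exploitable_findings", [])
    for f in findings:
        print(f"blocked: {f['file']}:{f['line']} {f['title']}", file=sys.stderr)
    return 1 if findings else 0  # non-zero exit aborts the commit


if __name__ == "__main__":
    sys.exit(main())
```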
Because its intelligence is model-driven rather than rule-driven, Signal is also language- and framework-agnostic from day one. It requires no rule updates, no language packs, and no tuning, meaning organisations are not left waiting for vendor support to catch up with the latest language features or frameworks used in their AI-generated code.
Governance at AI Scale
Beyond detection, Black Duck frames Signal as an enterprise governance tool. As AI coding assistants increasingly design and ship production software autonomously, organisations face mounting challenges around security, compliance, and trust. Signal is positioned to give security and engineering leaders the visibility and control they need to govern AI-generated software at scale, without sacrificing the development velocity that AI tools are meant to deliver.
Black Duck Signal is now generally available.