SANTA CLARA, Calif., May 6. NextOp Software, Inc. formally introduced itself today as a functional verification provider focused on delivering Assertion-Based Verification solutions that leverage design and testbench information to uncover bugs, expose coverage holes and increase verification observability. NextOp also announced its flagship assertion synthesis product, BugScope, following four years of developing key technologies and successful engagements with several customers.
"Design complexity has adversely impacted our ability to have confidence that the RTL functional verification process is complete. The current engagement between design teams and verification teams is becoming increasingly inadequate; there is a critical need for tools that instill confidence in both teams that their chip will work as intended," stated Yunshan Zhu, President and CEO of NextOp. "For four years now, NextOp has been focused on building an assertion-based verification solution resulting in its BugScope assertion synthesis product, which is currently in production use by several customers."
Current Verification Methodology Limitation: Lack of Adequate Specifications
Today's verification methodologies include a combination of directed simulation, constrained random simulation, and formal and semi-formal methods.
-- Directed simulation utilizes 'blackbox' checkers, which test the input
   and output behavior of each feature interaction. This approach is
   fundamentally not scalable due to the number of complex interactions
   that must be covered.
-- Constrained random simulation utilizes external checkers to define the
   expected behaviors of the Design Under Test (DUT). During simulation,
   the output of the DUT is compared with the checker's predicted output,
   and mismatches are used to identify bugs. The checkers and the DUT are
   typically developed independently based on the architectural
   specification. It is difficult to write a checker that exactly matches
   the DUT, and as a result, features and interactions are often skipped
   by the external checkers, including performance-related features and
   exception and error handling.
-- Formal verification uses mathematical analysis to prove or disprove
certain properties for all possible legal input stimuli. Complete
verification using formal methods requires that users specify
sufficient properties to cover all features of the design.
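To give a concrete sense of the properties that formal (and simulation-based) verification relies on, the following SystemVerilog Assertions (SVA) fragment is an illustrative sketch only; the signal names (req, gnt, clk, rst_n) and the 4-cycle bound are hypothetical and not drawn from any specific design or from NextOp's product:

```systemverilog
// Hypothetical handshake rule: every request must be granted
// within 1 to 4 clock cycles after it is asserted.
property req_gets_gnt;
  @(posedge clk) disable iff (!rst_n)
    req |-> ##[1:4] gnt;
endproperty

// Checked in simulation and provable by a formal engine.
assert property (req_gets_gnt)
  else $error("req was not granted within 4 cycles");
```

Complete formal verification requires enough such properties to cover every feature of the design, which is why writing them by hand becomes a bottleneck as complexity grows.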
As design complexity grows, it is imperative that an understanding of the design's structure and intent be infused into the verification process. Regardless of the speed of the simulator or formal engine, the result of verification is only as good as the specification. Without an adequate specification, the debugging cycle will continue to lengthen, and design and verification teams will be unable to adequately reduce the risk of chip defects that can cause re-spin costs and schedule overruns.
Importance of Assertion-Based Verification
Assertion-based verification enhances directed simulation, constrained random simulation, formal verification and emulation approaches by driving more effective and targeted verification. An assertion-based verification approach utilizes assertions and functional coverage properties, which are logic statements that define the intended behavior of signals in the design.
-- Whitebox assertions specify the behaviors of the internal logic and
inject observability into the Register Transfer Level (RTL) code. In
contrast, traditional blackbox checkers specify the input and output
behaviors of the DUT.
-- Assertions ensure the correctness of the implementation logic, and the
number of assertions needed to verify the design scales linearly with
the complexity of the RTL.
-- Whitebox functional coverage properties expose corner-case behaviors
   created by the implementation and ensure such behaviors are targeted
   by simulation test vectors.
-- Assertions and functional coverage properties can be reused across all
verification platforms, including simulation, formal and emulation,
and allow cross checking between different test environments. They
also facilitate design reuse.
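The whitebox assertions and functional coverage properties described above can be sketched in SystemVerilog Assertions (SVA). Everything in this fragment is hypothetical and for illustration only; the FIFO signal names (fifo_full, fifo_count, push, pop, clk, rst_n) and the parameter DEPTH are invented, not taken from NextOp's product:

```systemverilog
// Whitebox assertion on internal logic: pushing into a full FIFO
// with no simultaneous pop is an implementation bug.
assert property (@(posedge clk) disable iff (!rst_n)
  !(fifo_full && push && !pop))
  else $error("push into a full FIFO");

// Whitebox functional coverage property: flag a corner case
// (simultaneous push and pop while the FIFO is almost full) so
// that simulation vectors can be steered to exercise it.
cover property (@(posedge clk) disable iff (!rst_n)
  push && pop && (fifo_count == DEPTH - 1));
```

Because both statements are ordinary SVA, the same properties can run unchanged in simulation, be proved by a formal engine, or be synthesized into an emulation platform, which is the cross-platform reuse the list above refers to.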