It's a Mixed-Signal World
Today it's impossible to separate the analog and digital domains without compromising essential system behaviour. Designers need to adapt their modeling style for mixed-signal verification.
It’s often said that most of today’s designs are mixed signal, and that mixed-signal verification remains one of the biggest design challenges. A typical chip design today is a complete system with millions of gates that make up large numbers of DSPs, memories and processors, all of which must interface with the real world through displays, antennas, sensors, and cables. This requires unprecedented integration of analog and digital content, without compromising performance or size, and on a technology scale that dramatically increases vulnerability to process and electrical variation.
These challenges to mixed-signal design are real. With process technologies moving to 65nm and below, the cost of design re-spins increases exponentially. Over 50 percent of design re-spins at 65nm and below are due to mixed-signal functionality, with an estimated additional cost of $5 million to $10 million and a six- to eight-week delay in product rollout, either of which can be financially disastrous. Practically speaking, re-spins are simply not an option anymore. Designers need new verification methodologies that will help to mitigate these potential problems.
In particular, verification technologies must help designers answer several key advanced SoC design questions, including:
- When I pack half a million transistors into a “square mm” die, how do I deal with the heat they produce?
- How do I maintain the low operating voltage for these transistors without worrying about power supply rejection? (Or, if they are strictly digital engineers: “Power supply what?”)
- How do I get enough current into that “square mm” to power all those transistors? And,
- When I get them all switching at hundreds of MHz, how do I keep them from randomly talking to each other?
Predictably, digital designers also ask themselves, “Do I have to do something about it? How can I continue my current methodology and still take care of all of these items? If I just characterize everything correctly ahead of time, why should I have to worry about it?”
Mixed-signal design environments are not an entirely new ballgame. In the broader context, from an analog perspective, engineers have been doing mixed-signal design for years; however, it seems today that neither analog nor digital engineers are completely prepared to enter each other's areas of expertise. Analog engineers may shy away from the complexity of SoC verification, and digital engineers may find the fuzziness of analog design disconcerting in the context of applications that must run on the chip.
However, there’s simply no way to avoid this interaction. Until recently, mixed-signal designs could be deconstructed into separate analog and digital functions, designed independently and then bolted together during system integration (Figure 1). Functional integration was the key problem.
Figure 1: The analog, AMS and digital
components of an SoC
Today, mixed-signal designs have multiple feedback loops through the analog and digital domains, so it's impossible to separate them without compromising essential system behaviour—thus the need for an integrated mixed-signal simulation and verification environment. Top-level verification becomes the major potential bottleneck in this component of the design flow. Functional complexity—in terms of modes of operation, extensive digital calibration, and architectural algorithms—is now overwhelming the traditional verification methodologies.
Clearly, the old way of using these tools simply will not work anymore. It will take additional planning to figure out design strategies and verification plans. The emphasis will be on finding good (meaning quick and accurate, in that order) ways to simulate and verify the design, rather than on creating the design.
Mixed-Signal Verification Approaches
Currently there are two approaches for SoC integrators to verify mixed-signal designs: use a mixed-signal simulator, or black-box the mixed-signal circuits.
Each approach has merits and drawbacks. SPICE-Verilog co-simulation, which uses SPICE for the analog blocks and Verilog for the digital blocks, provides SPICE-like accuracy while running faster than simulating the entire circuit at the transistor level in SPICE. However, it is still much slower than pure digital simulation. So while SPICE-Verilog co-simulation is a useful tool for analog designers, it is not practical for SoC verification.
As a result, black-box modeling of mixed-signal blocks has been the preferred approach for digital designers to use for verification. In this case, a Verilog timing model is created for the mixed-signal blocks. The advantage of this approach is that designers can use their existing tools and methodology to verify their designs. The disadvantage is that the analog nature of the block is ignored, so functional verification of the design is not possible. This approach worked for simple designs in the past, but will likely result in increasing numbers of design re-spins going forward.
Full-chip SoC simulations now require more accuracy than digital Verilog/VHDL alone can offer. Real number models are being used more often to discretely model the analog portions of the design, and they are easily portable between analog design and digital verification environments. Real value modeling (RVM) is a process by which users can verify their analog or mixed-signal designs using discretely simulated real values. This allows simulation using only the digital solver, avoiding the slower analog simulation and thus enabling intensive verification regression runs of a mixed-signal design in a short period of time.
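The core idea of real value modeling can be sketched outside any particular simulator. In a hedged Python illustration (real RVM would use Verilog-AMS `wreal` nets or SystemVerilog real-valued types, and the timestep values below are made up for the example), an analog block such as an RC low-pass filter is reduced to a real-valued update function that the digital event scheduler calls at discrete times—no analog solver involved:

```python
import math

def rc_lowpass_step(v_out, v_in, dt, tau):
    """One discrete update of a first-order RC low-pass filter.

    The continuous-time response dv_out/dt = (v_in - v_out) / tau is
    replaced by its exact solution over a timestep dt, so the model
    stays stable for any step size the digital scheduler chooses.
    """
    alpha = 1.0 - math.exp(-dt / tau)
    return v_out + alpha * (v_in - v_out)

# Drive the model with discrete events, as a digital simulator would:
# a 1 V input step, sampled every 1 us, with tau = 10 us (illustrative values).
v = 0.0
for _ in range(50):
    v = rc_lowpass_step(v, 1.0, dt=1e-6, tau=10e-6)
# After 50 us (five time constants) the modeled output has settled near 1 V.
```

Because each update is just arithmetic on a real number, thousands of such events cost far less than one SPICE transient step, which is what makes overnight mixed-signal regression runs feasible.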
RVM also opens the possibility of linkage with other advanced verification technologies, such as assertion-based verification, without the difficulty of interfacing to the analog engine or defining new semantics to deal with analog values. Users are expected to enable the RVM flow by migrating their analog models or transistor-level designs to a real value modeling style.
Cadence offers a complete set of mixed-signal verification solutions to address the analog-centric as well as the digital-centric users (Figure 2). Analog-centric users have successfully been using the AMS Designer solution to apply mixed-signal verification test benches to both transistor-level and AMS behavioral views of cells and subsystems. For the digital-centric users, Cadence has just introduced a solution enabling designers to use discrete real-valued models of analog blocks to allow ultra high-speed verification of mixed-signal systems. The key is for designers to recognize the need for their analog and digital teams to work together in both the modeling and verification arenas. It's the only way they can seamlessly verify the operation of their entire mixed-signal system.
Figure 2: The Cadence integrated mixed-signal verification environment
Model what you need, not what you can
In many companies, there seems to be a disconnect between analog and digital design and verification teams. Analog design teams may develop models in an analog-centric environment, and then throw their models over the wall to a separate digital team for use in verification in a totally different environment. This can create a huge disconnect in the verification flow when continuity is critical. Verification testing of subsystems needs to be driven from the top but used by the analog design team, and mixed-signal comprehension needs to fully extend into the digital/verification groups.
Behavioral models can also be differentiated by what they model. A performance-oriented model typically captures only the critical behavior necessary to efficiently explore the design space and make basic system-level trade-offs. Functional models capture the actual circuit behavior to the level of detail required to functionally verify the design. Both types of models are found in top-down and bottom-up modeling. However, functional models are far more common in bottom-up modeling, where they are used to perform the final design verification prior to tape-out. For example, a functional verification model can be used to study dynamic closed-loop functionality between the RF and digital baseband ICs, whereas a performance-oriented model of an RF block could be used for cascaded measurements such as the third-order intercept point (IP3) and for exploring system-level metrics such as the bit error rate (BER) and error vector magnitude (EVM).
A single behavioral model can capture both performance and functionality. However, there are practical considerations to take into account, including model complexity and simulation speed. Adding functionality to a performance-oriented model incurs a simulation speed penalty, and vice versa. Some types of functional behavioral models may make use of model constructs such as wreal data types that dramatically improve simulation speed. Additionally, performance and functional models are often created by different members of a larger design team. Whereas the performance-oriented models are created by the system and circuit designers, the functional behavioral models are created and used by the verification teams. The difference in goals and authors often results in two distinct sets of models rather than a single model combining both.
Designers need to adapt their modeling style for mixed-signal verification. These models are being used for functional verification, so they should describe the functional behavior of the block as simply as possible. Models should predict the behavior at the pins, and the transfer functions between the pins can be described mathematically. Cadence has addressed this mixed-signal verification challenge by developing a solution that leverages real value modeling techniques for model generation and, more importantly, for model validation, to provide a more tightly integrated mixed-signal verification solution.
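The "describe the pins, not the internals" principle can be sketched in a few lines. In this hedged Python illustration (a real flow would express this as a Verilog-AMS or SystemVerilog real-valued model; the gain and rail values are assumptions for the example), a differential amplifier is reduced to a single mathematical transfer function from its input pins to its output pin:

```python
def amp_pin_model(vinp, vinn, gain=100.0, vdd=1.2, vss=0.0):
    """Functional pin-level model of a differential amplifier.

    Only the input-to-output transfer function is described: the output
    pin carries the amplified input difference, clipped to the supply
    rails. No internal nodes, bias currents, or device physics appear.
    Gain and rail voltages are illustrative assumptions.
    """
    vout = gain * (vinp - vinn)
    return max(vss, min(vdd, vout))
```

Such a model answers the only questions top-level verification asks of the block—does the output pin respond correctly, and does it saturate at the rails—while simulating at purely digital speed.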
The practice has to start with a simulatable spec so that details do not fall through the cracks. The flow needs to cover everything from the behavioral to the transistor level, while recognizing that neither is more than a part of the solution. The solution needs to encourage, not require, the design team to carefully think out a custom plan for each project. The practice needs to partition the design into which sections and which features can be verified by which methodology. All of these areas have been addressed in the Cadence Mixed-Signal Verification solution to provide designers with an integrated mixed-signal verification methodology.
The benefits of adopting this solution are improved quality and improved time to market, using accelerated, overnight mixed-signal regression runs; reduced design re-spins using real number modeling and top-level mixed-signal verification; and improved productivity due to easier convergence and higher-level verification.
About the author:
Kishore Karnane is the Enterprise and Functional Verification Group product marketing director at Cadence Design Systems, Inc.
Cadence Design Systems Inc.
San Jose, CA