By Lauro Rizzatti
The world of integrated circuit (IC) design looks very different than it did 10 years ago, when EVE incorporated and started building its first hardware emulator. In 2000, the semiconductor industry was still reveling in the new millennium and the economy was going strong.
Back then, the process technology node was 180 nanometers (nm) and the average number of transistors in a design was 20 million. The average design size was one million application-specific integrated circuit (ASIC) gates, with large designs coming in at around 10 million ASIC gates and the largest designs at about 100 million ASIC gates. Only a small fraction of a design's functionality was derived from embedded software.
Verification took 70 percent of the project cycle, and emulation was used almost exclusively on large CPU and graphics chip designs. EVE's emulation system in 2000 could handle 600,000 ASIC gates and seemed impossibly cutting edge.
In 2010, the economy is in a slow recovery, and 32nm is the current process technology node. The average number of transistors has climbed to 200 million, while the average design size (not the largest) is about 10 million gates. Large designs are now about 100 million ASIC gates, with the largest reaching or exceeding one billion ASIC gates.
Software now accounts for two-thirds of the chip's functionality, and verification still takes up 70 percent of the project cycle. Emulation is now used on CPU, graphics, wireless, digital television, set-top box, digital still camera, camcorder, multifunction printer designs and many, many more. One emulator can handle one billion ASIC gates, beating Moore's Law, since the emulator's capacity has doubled each year rather than every 18 months or two years.
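A quick back-of-envelope check bears this out, using only the capacity figures cited above (600,000 ASIC gates in 2000 versus one billion in 2010); the sketch below is illustrative arithmetic, not vendor data:

```python
import math

# Capacity figures cited in this article.
gates_2000 = 600_000
gates_2010 = 1_000_000_000
years = 10

growth = gates_2010 / gates_2000          # about 1,667x over the decade
doublings = math.log2(growth)             # about 10.7 doublings
months_per_doubling = years * 12 / doublings

print(f"{growth:,.0f}x growth = {doublings:.1f} doublings, "
      f"one roughly every {months_per_doubling:.0f} months")
# -> one doubling roughly every 11 months, comfortably ahead of
#    Moore's Law's 18-to-24-month cadence
```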
Moving into this new decade, we see further trends in System-on-Chip (SoC) hardware designs that range from graphics and video to processors, networking and wireless. For the foreseeable future, verification will consume 70 percent of the project schedule.
In today's chip design environment, emulation must be usable for a variety of applications. For video processing, it must be able to process anywhere from one to 15 high-definition frames per second and verify digital image stabilization. Embedded CPU design demands the ability to boot Linux in a minute and to support pre-silicon validation.
Wireless and mobile applications have their own set of requirements. Emulation should enable a design team to create a virtual prototyping environment for early software development. And the peripheral/storage application requires the ability to print 1,200 dots-per-inch (dpi) images and to quickly verify intellectual property (IP) blocks with pseudo-random tests.
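To make the pseudo-random testing idea concrete, here is a minimal sketch of the pattern: generate seeded random stimulus, run it through both a golden reference model and the design under test, and compare. Everything here is hypothetical for illustration (the checksum spec, the `dut_checksum` stand-in, the stimulus constraints); it is not EVE's API, and in a real flow the DUT call would drive the emulated design rather than run in Python:

```python
import random

SEED = 2010  # fixed seed so any failure can be replayed exactly

def reference_model(words):
    """Golden model: what the IP block is supposed to compute.
    (A 16-bit checksum stands in for the real specification.)"""
    return sum(words) & 0xFFFF

def dut_checksum(words):
    """Stand-in for driving the IP block in the emulator and
    reading back its result."""
    return sum(words) & 0xFFFF

def run_pseudo_random_tests(num_tests=1000):
    rng = random.Random(SEED)
    for test in range(num_tests):
        # Constrained-random stimulus: 1 to 63 random 16-bit words.
        stimulus = [rng.randrange(1 << 16)
                    for _ in range(rng.randrange(1, 64))]
        expected = reference_model(stimulus)
        actual = dut_checksum(stimulus)
        assert actual == expected, f"test {test} failed on {stimulus}"
    print(f"{num_tests} pseudo-random tests passed")

run_pseudo_random_tests()
```

The fixed seed matters: when a random test exposes a bug, the same sequence can be regenerated to reproduce and debug it.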
Unmistakably, growing design sizes drive the need for very long verification sequences, spanning billions of clock cycles, to dig out deeply buried bugs. Added software content makes hardware/software co-verification a critical component of the verification process. Emulation is needed to trace the source of software bugs that show up as hardware misbehavior, and of hardware bugs that have harmful effects on the embedded software. Software validation has to be done well ahead of tapeout.
These trends and others will continue to drive the need for fast emulation that offers billions of verification cycles and helps move chip design ahead over the next 10 years.
Sunday, June 6, 2010