An Introduction to Advanced Functional Verification

      Electronic gadgets are an integral part of our day-to-day life. The lifeline of these gadgets/products is the semiconductor IC/SoC/ASIC/FPGA devices that are mounted on a PCB (Printed Circuit Board) & connected with each other to make the gadget operational. As per Moore's law, the number of transistors in an Integrated Circuit (IC) doubles roughly every 18 months, which translates into more functional features supported in the design, and hence designs are getting more & more complex. The core challenge is to successfully verify this exponential growth happening on the design side. To handle this design growth, there has been substantial improvement in VLSI functional verification technologies, approaches & methodologies over the last several years.

       This blog provides an overview of advanced functional verification challenges, the different technologies available today, the different approaches that can be used & the latest Constrained Random Verification (CRV) methodology, along with brief information on UVM (Universal Verification Methodology).

1. Introduction

      As hardware designs get more and more complex, the traditional approach of verifying them using manually written tests, i.e. the directed-test approach, is difficult to develop and maintain for bigger & more complex designs. There are corner cases that are either not humanly possible to imagine and code tests for, or that are simply missed during verification. Visual inspection of waveforms in order to trace a design bug is always a tedious task. The amount of time spent in verification nowadays exceeds the time spent in design, accounting for almost 70% of the total development effort.

The ultimate goal is: "Find Bugs Early & Fast"

We'll discuss the following topics in this blog:

  • Functional Verification Requirements
  • Functional Verification Technologies
  • Functional Verification Approaches
  • Functional Verification Methodologies

These topics provide a good insight into each of these technical areas & will help you comprehend the verification world!

 

2. Functional Verification Requirements

There are certain requirements which should be targeted in order to develop an effective and robust verification environment to verify the Design Under Test (DUT).

  • Verify all the features/use cases & find all the bugs
  • Re-usability
  • Progress Measurement
  • Automation
  • Easy to write and maintain

Automation is the key to enhancing the effectiveness of executing testcases and observing & analyzing results; it can range from a self-checking mechanism to functional coverage back-annotation. Verification metrics help to track & measure progress towards verification completion. They help us focus on the important functional scenarios before the corner cases are covered. By adopting techniques such as an executable verification plan, we can assign a priority & weightage to each feature to be tested, map each feature to the specification document, and even have coverage results automatically back-annotated to the verification plan to display the progress being made.

Re-usability is another concept that deserves strong focus, given its immense importance to functional verification. The more we develop verification components (e.g. agents, stimulus, scoreboards) that are re-usable vertically as well as horizontally across projects, the better our ability to develop a robust verification environment in less time. Re-usability encouraged the verification community to come up with concepts like "factory" and "configuration" in the latest verification methodologies, which help verification engineers in their different roles (developers, integrators & test writers) work in tandem & create different interesting stimulus scenarios to test the DUT.
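As a hedged illustration of the factory concept, the sketch below shows how a test can substitute an extended driver into an existing environment without touching the environment code. The class names (base_driver, burst_driver, burst_test) are invented for this example, not part of any particular library.

    // Illustrative sketch only (assumes `include "uvm_macros.svh" and import uvm_pkg::*)
    class base_driver extends uvm_driver #(uvm_sequence_item);
      `uvm_component_utils(base_driver)
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
    endclass

    class burst_driver extends base_driver;
      `uvm_component_utils(burst_driver)
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      // project-specific burst behaviour would be added here
    endclass

    class burst_test extends uvm_test;
      `uvm_component_utils(burst_test)
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        // Factory override: every base_driver the environment creates
        // is now constructed as a burst_driver instead.
        base_driver::type_id::set_type_override(burst_driver::get_type());
      endfunction
    endclass

The override only works because the environment creates its components through the factory (type_id::create) rather than calling new() directly.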

Modularity is another important concept to consider while developing verification components & environments. It helps to better manage the work products, which are typically OOP-based classes: child classes extended from parent classes, and so on. The inheritance & polymorphism supported by SystemVerilog facilitate exactly this kind of modularity. Coding guidelines are another important ingredient of an effective verification environment; among their numerous other benefits, they keep the code consistent & make debugging easier.
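A small, self-contained sketch of the inheritance & polymorphism that SystemVerilog provides (the packet classes below are invented purely for illustration):

    class packet;
      rand bit [7:0] addr;
      virtual function void print_kind();
        $display("generic packet, addr=%0h", addr);
      endfunction
    endclass

    // Child class extended from the parent class, overriding a virtual method
    class error_packet extends packet;
      virtual function void print_kind();
        $display("error packet, addr=%0h", addr);
      endfunction
    endclass

    module tb;
      initial begin
        packet p;
        error_packet ep = new();
        p = ep;          // parent-class handle pointing at a child object
        p.print_kind();  // polymorphism: the child implementation runs
      end
    endmodule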

3. Functional Verification Technologies

     In industry, there are two technologies used to perform functional verification, listed below. In Simulation Based Verification, the design inputs are stimulated with dynamic vectors injected by the verification engineer & the outputs are measured against the expected values, which can be termed golden values.

  • Simulation Based Verification
  • Formal Verification
    • Equivalence Checking
    • Formal Property (Assertion) Checking


Figure 1: Simulation Process (Source: Mentor Graphics)

        The other approach is Formal Verification, where the verification engineer starts out by stating what output behaviour is desirable and then lets the formal verification tool prove or disprove it. The verification engineer is not concerned with the input stimuli at all. This approach is usually called Formal Property Checking. In Equivalence Checking, the outputs of two different design stages are compared to confirm that they are functionally equivalent, which is entirely EDA-tool driven. Examples of such design stage pairs are the RTL design vs. the gate-level netlist after synthesis, or the pre-scan-inserted netlist vs. the post-scan-inserted netlist.
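As an example of "stating the desired output behaviour", the hedged sketch below expresses a simple arbiter requirement as a SystemVerilog Assertion property; the signal names and the requirement itself are assumptions made for illustration. A formal property checker would attempt to prove it for every reachable state and input sequence, with no input vectors supplied by the engineer.

    // Hypothetical requirement: a grant is never asserted without a pending request
    module arb_props (input logic clk, rst_n, req, gnt);
      property p_no_grant_without_request;
        @(posedge clk) disable iff (!rst_n)
          gnt |-> req;
      endproperty

      a_no_grant_without_request : assert property (p_no_grant_without_request)
        else $error("grant asserted without a request");
    endmodule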


Figure 2: Simulation Vs Formal Verification

     Simulating a vector can be conceptually viewed as verifying a point in the input space. With this view, simulation-based verification can be seen as verification through input space sampling. Unless all points are sampled, there exists a possibility that an error escapes verification. As opposed to working at the point level, formal verification works at the property level. Given a property, formal verification exhaustively searches all possible input and state conditions for failures. If viewed from the perspective of output, simulation-based verification checks one output point at a time; formal verification checks a group of output points at a time (a group of output points make up a property).

4. Functional Verification Approaches

     In a complex SoC design flow, functional verification is very important; any behavioural or functional bug escaping this phase will not be detected in the subsequent implementation phases and will surface only after the first silicon is integrated into the target system, resulting in costly design and silicon iterations. To handle this challenge, a number of academic and industrial research laboratories have been carrying out research on different approaches to functional verification. Five such dynamic-vector based approaches are listed below; no single one of them is sufficient & fully capable of ensuring bug-free verification. These approaches work in tandem & together make up the best possible solution.

  • Directed Verification
  • Constrained Random Verification
  • Coverage Driven Verification
  • Assertion Based Verification
  • Emulation Based Verification

Directed verification is the traditional approach of functionally verifying designs using manually created testcases based on the features to be tested. The directed-test approach may work well for smaller designs, but for the ever-growing size and functionality of SoCs/IPs, where thousands of testcases may be required, this approach runs into many missed functional issues and time-to-market challenges in fully verifying the design.
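To make the contrast concrete, the following hypothetical sketch shows the directed style: the helper tasks and the two cases are invented, and every scenario has to be hand-written one at a time.

    module directed_test;
      // Hypothetical helpers standing in for real bus-driving and checking code
      task drive_write(input bit [31:0] addr, input int len);
        $display("write: addr=%h len=%0d", addr, len);
      endtask
      task check_results();
        $display("comparing results against golden values");
      endtask

      initial begin
        drive_write(32'h0000_0000, 1);   // case 1: minimum-length write
        drive_write(32'hFFFF_FFF0, 16);  // case 2: longest write at top of range
        // ...one hand-written call per scenario the engineer thought of
        check_results();
      end
    endmodule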


Figure 3: Directed Vs Constrained Random Approach

     To overcome this limitation and improve verification productivity, the key is to reduce the time it takes to create working tests. Using constrained-random stimulus generation, scenarios can be generated in an automated fashion under the control of a set of rules, or constraints, specified by the user. SystemVerilog provides a vast array of language capabilities for describing complex verification environments, including constrained-random stimulus generation, object-oriented programming, multi-threaded capabilities, inter-process communication and synchronization, and functional coverage. These features allow users to develop testbenches that automate the generation of various scenarios for verification.
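A minimal sketch of constrained-random stimulus in SystemVerilog; the transaction fields, address range and constraints below are invented purely for illustration.

    // The solver picks legal random values automatically, subject to the constraints
    class bus_txn;
      rand bit [31:0] addr;
      rand bit [3:0]  len;
      rand bit        is_write;

      constraint c_addr { addr inside {[32'h0000_0000 : 32'h0000_FFFF]}; }
      constraint c_len  { len > 0; is_write -> len <= 8; }
    endclass

    module tb;
      initial begin
        bus_txn t = new();
        repeat (100) begin
          if (!t.randomize()) $error("randomization failed");
          // drive t onto the DUT interface here
        end
      end
    endmodule

One loop like this can reach many scenarios that would otherwise each need a separate directed test.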


Figure 4: Verification Cycle using Constrained Random Approach

     Figure 4 shows the verification cycle using the Constrained Random Verification approach: developing a layered testbench, testing the larger design space first (orange region), then testing very specific scenarios (corner cases) & finally hitting the remaining holes by writing directed testcases.

Coverage driven verification serves critical purposes throughout the verification process. One very important purpose is to identify holes in the process by pointing to areas of the design that have not yet been sufficiently verified. This helps direct the verification effort by answering the key question of what to do next: for example, which new directed test to write, or how to vary/control the parameters for constrained-random testing.

     Another, even more important, purpose is to act as an indicator of when verification is thorough enough to tape out. Coverage provides more than a simple yes/no answer: incremental improvement in coverage metrics helps to assess verification progress and thoroughness, leading to the point at which the development team has the confidence to tape out the design. In fact, coverage is so critical that most advanced, automated flows implement coverage-driven verification, in which coverage metrics guide each step of the process.

      Coverage is divided into two main categories: code coverage and functional coverage (Figure 5). Code coverage, in its many forms (line coverage, toggle coverage, expression coverage), is typically an automated process that tells whether all of the code in a particular RTL design description was exercised during a particular simulation run (or set of runs). Functional coverage, derived from a functional coverage model, indicates which functional points in the design have and have not been covered. It can be further classified into point coverage, transition coverage and cross coverage.
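A hedged covergroup sketch showing the three flavours; the signals (clk, len, is_write) and the bins are assumptions chosen only to illustrate point, transition and cross coverage.

    module cov_example (input logic clk, input logic [3:0] len, input logic is_write);
      covergroup bus_cov @(posedge clk);
        // Point coverage: which lengths and transfer kinds have been observed
        cp_len  : coverpoint len      { bins short_burst = {[1:4]};  bins long_burst = {[5:15]}; }
        cp_kind : coverpoint is_write { bins read = {0};             bins write = {1}; }
        // Transition coverage: a read immediately followed by a write, and vice versa
        cp_turnaround : coverpoint is_write { bins rd_then_wr = (0 => 1); bins wr_then_rd = (1 => 0); }
        // Cross coverage: every length bin combined with every kind bin
        cx_len_kind : cross cp_len, cp_kind;
      endgroup

      bus_cov cov = new();  // instantiate the covergroup so sampling occurs
    endmodule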


Figure 5: Coverage and Assertion Application

Assertions can enhance the capabilities and productivity of any verification environment. In simple terms, assertions are statements of the design intent. Beyond that, assertions are helpful in many different phases of the verification cycle, e.g. as protocol checkers and for assertion coverage, and the same set of assertions can be re-used during emulation.
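For instance, a protocol-checker style assertion might look like the hedged sketch below; the req/ack handshake and the 4-cycle bound are assumptions, and the cover statement feeds assertion coverage.

    module handshake_checker (input logic clk, rst_n, req, ack);
      // Protocol check: every request must be acknowledged within 4 cycles
      property p_req_gets_ack;
        @(posedge clk) disable iff (!rst_n)
          $rose(req) |-> ##[1:4] ack;
      endproperty

      a_req_gets_ack : assert property (p_req_gets_ack)
        else $error("req not acknowledged within 4 cycles");

      // Assertion coverage: records that the ack-then-immediate-req case was exercised
      c_back_to_back : cover property (@(posedge clk) disable iff (!rst_n) ack ##1 req);
    endmodule

Such a checker module can be bound to the RTL in simulation, and the same properties can be targeted by formal tools or carried into emulation.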

     During the verification process, engineers typically use a variety of tools. They use logic simulators for block-level verification, which traditionally simulate at 10–1000 clock cycles per second. However, the performance of logic simulators drops drastically with increasing design size, rendering them practically unusable for system-level integration testing. Simulation speed is also limited by the number of clock cycles required to run a design; for example, a full video frame in even a moderately sized design takes many, many clock cycles and thus a long time to run in pure simulation.

Emulators aim to fill this gap. By mimicking the actual hardware, an emulator can run the design at a few million clock cycles per second. In this approach, the RTL design running in the emulator interacts with the testbench running on a workstation, as shown in Figure 6.


Figure 6: Emulation Based Verification (Source: Synopsys)

5. Functional Verification Methodologies

Any methodology guides us in "how to do" things. Similarly, an advanced functional verification methodology provides the verification community with a framework which, if used properly & as suggested, results in reusable verification components (UVCs/OVCs/VIPs), better controllability via configuration support, a layered testbench framework & performance improvement.
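As one concrete illustration, the configuration support mentioned above is typically provided through uvm_config_db in UVM. The sketch below (the field name num_txns and the path env.agent0 are invented) shows a test publishing a knob that a component lower in the hierarchy picks up.

    // Sketch only (assumes `include "uvm_macros.svh" and import uvm_pkg::*)
    class cfg_test extends uvm_test;
      `uvm_component_utils(cfg_test)
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        // publish a configuration knob for everything under "env.agent0"
        uvm_config_db#(int)::set(this, "env.agent0*", "num_txns", 200);
      endfunction
    endclass

    // ...and inside the agent's build_phase, retrieve it with a default:
    //   if (!uvm_config_db#(int)::get(this, "", "num_txns", num_txns))
    //     num_txns = 50;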

5.1 Universal Verification Methodology (UVM)

In December 2009, a technical subcommittee of Accellera — a standards organization in the electronic design automation (EDA) industry — voted to establish the UVM (Universal Verification Methodology) and decided to base this new standard on the Open Verification Methodology (OVM-2.1.1), a verification methodology developed jointly in 2007 by Cadence Design Systems and Mentor Graphics.

On February 21, 2011, Accellera approved the 1.0 version of UVM. UVM 1.0 includes a Reference Guide, a Reference Implementation in the form of a SystemVerilog base class library, and a User Guide. UVM was not built from scratch. It is the culmination of many independent efforts in the verification methodology space. Its heritage includes AVM, URM, VMM, and OVM.


Figure 7: UVM Testbench Hierarchy
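A rough skeleton corresponding to the hierarchy in Figure 7 (the class names are illustrative, and the driver, monitor & scoreboard are omitted for brevity): a test creates an environment, which creates an agent, which in turn creates its sequencer.

    // Sketch only (assumes `include "uvm_macros.svh" and import uvm_pkg::*)
    class my_agent extends uvm_agent;
      `uvm_component_utils(my_agent)
      uvm_sequencer #(uvm_sequence_item) sqr;  // driver & monitor handles would sit alongside
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        sqr = uvm_sequencer#(uvm_sequence_item)::type_id::create("sqr", this);
      endfunction
    endclass

    class my_env extends uvm_env;
      `uvm_component_utils(my_env)
      my_agent agent0;
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        agent0 = my_agent::type_id::create("agent0", this);
      endfunction
    endclass

    class base_test extends uvm_test;
      `uvm_component_utils(base_test)
      my_env env;
      function new(string name, uvm_component parent);
        super.new(name, parent);
      endfunction
      function void build_phase(uvm_phase phase);
        super.build_phase(phase);
        env = my_env::type_id::create("env", this);
      endfunction
    endclass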


Figure 8: UVM/OVM Framework (Source: Mentor Graphics)

6. Summary

In this blog, we discussed different verification technologies, approaches & methodologies. As SoC sizes grow at a rapid pace, we discussed why & how the functional verification community moved from the directed approach to the constrained-random approach & is now moving towards emulation. To tackle productivity, performance & bug-free designs, UVM & SystemVerilog have become immensely popular and have been adopted by a large number of semiconductor organizations over the last several years. Another positive is that UVM and SystemVerilog have emerged as unanimously accepted standards among the EDA and verification user communities.
