Showing posts with label Verification. Show all posts

The Need for Speed: Understanding Design Factors that Make Multi-core Parallel Simulations Efficient

 

by Shobana Sudhakar & Rohit Jain, Mentor Graphics
Running a parallel simulation may be as easy as flipping on a switch with the progressive and maturing solutions available today, but do people really take full advantage of the technology? It is true that in some scenarios the overhead of communication and synchronization needed for parallel simulation can negate any substantial performance gains. However, there are scenarios where deploying the parallel simulation technology can provide tremendous benefit. A long running simulation that exercises large blocks of a design concurrently and independently is one good example.
Designers need to be aware of the factors that can inhibit the advantages of parallel simulation even in these best-case scenarios, the main factor being inflexibility in the way designs are modeled today. This article focuses on these factors and is an effort to educate on the design principles and practices that maximize the advantage of simulation with parallel computing. The discussion also covers the three fundamental features of parallel simulation: load balancing, concurrency and communication. Designers need to understand how their designs run in simulation with respect to these factors to ensure they get the maximum out of parallel simulations....
Read complete article from Mentor Graphics:  The Need for Speed: Understanding Design Factors that Make Multi-core Parallel Simulations Efficient

Monitors, Monitors Everywhere – Who Is Monitoring the Monitors?

 

by Rich Edelman and Raghu Ardeishar, Mentor Graphics
The reader of this article should be interested in predicting or monitoring the behavior of his hardware. This article will review phase-level monitoring, transaction-level monitoring, general monitoring, and in-order and out-of-order transaction-level monitors. A protocol-specific AXI monitor written at the transaction level of abstraction will be demonstrated. Under certain AXI usages, problems arise; for example, partially written data may be read by an overlapping READ. This kind of behavior cannot be modeled by a "complete transaction" kind of monitor; it must be modeled by a phase-level monitor. All of these monitoring and scoreboard discussions can be widely applied to many protocols and many monitoring situations.
The task of a monitor is to monitor activity on a set of DUT pins. This could be as simple as looking at READ/WRITE pins or as complex as a complete protocol bus, such as AXI or PCIe. In a very simple case, a monitor can look at a set of pins and generate an event every time there is a change in signal values. The event can trigger a scoreboard or coverage collector. Such a monitor is typically very slow and not very useful, as it generates a lot of irrelevant data.....
Read complete article from Mentor Graphics:  Monitors, Monitors Everywhere – Who Is Monitoring the Monitors?

Flexible UVM Components: Configuring Bus Functional Models

by Gunther Clasen, Ensilica

Modern object-oriented testbenches using SystemVerilog and OVM/UVM have been using SystemVerilog interface constructs in the testbench and virtual interfaces in the class based verification structure to connect the two worlds of static modules and dynamic classes. This has certain limitations, like the use of parameterized interfaces, which are overcome by using Bus Functional Models. BFMs are now increasingly adopted in UVM testbenches, but this causes other problems, particularly for complex BFMs: They cannot be configured from the test environment, thus significantly reducing code reuse.
This article shows a way to write BFMs in such a way that they can be configured like any other UVM component using uvm_config_db. This allows a uniform configuration approach and eases reuse. All code examples use UVM, but work equally with the set_config_*() functions in OVM......
Read complete article from Mentor Graphics: Flexible UVM Components: Configuring Bus Functional Models

NoC Generic Scoreboard VIP

by François Cerisier and Mathieu Maisonneuve, Test and Verification Solutions

The increase of SoC complexity with more cores, IPs and other subsystems has led SoC architects to demand more from the main interconnect or network-on-chip (NoC), which is thus becoming a key component of the system. Power management, multiple clocks, protocol conversions, security management, virtual address spaces and cache coherency are among the features that must be managed by the main interconnect and that demand proper verification.
In addition, IP reuse and NoC generation solutions have enabled the conception of new SoC architectures within months or even weeks. Simple point-to-point scoreboard methodology is taught in most good verification methodology books and tutorials. However, building a generic verification solution for an SoC interconnect that can quickly adapt to any bus protocols and SoC architectures, and can deal with SoC advanced features, requires much more than dealing with point-to point transaction matching.....
Read complete article from Mentor Graphics:  NoC Generic Scoreboard VIP

QVM: Enabling Organized, Predictable, and Faster Verification Closure

 

by Gaurav Jalan, SmartPlay Technologies, and Pradeep Salla, Mentor Graphics
Until recently, the semiconductor industry religiously followed Moore's Law by doubling the number of transistors on a given die approximately every two years. This predictable growth allowed ecosystem partners to plan and deal with rising demands on tools, flows and methodologies. Then came the mobile revolution, which opened up new markets and further shifted the industry's focus to consumers. The diversified nature of this market posed many, often conflicting development challenges: How to speed time to market while building products rich in functionality? How to boost performance while keeping both power consumption and cost at modest levels? Wading through these questions contributed to a multifold increase in verification complexity....
Read complete article from Mentor Graphics:  QVM: Enabling Organized, Predictable, and Faster Verification Closure

Making it Easy to Deploy the UVM

by Dr. Christoph Sühnel, frobas GmbH

The Universal Verification Methodology (UVM) is becoming the dominant approach for the verification of large digital designs. However, new users often express concern about the effort required to generate a complete and useful UVM testbench. The practical experience collected in numerous OVM and UVM projects over the last few years shows a different picture. The UVM is a very suitable methodology for any kind of design and implementation, i.e. ASIC and FPGA, due to the availability of the UVM library and the well-defined testbench structure. This allows the automation of essential steps in employing the UVM.
This article describes a UVM approach that reduces testbench implementation effort, guarantees early success and streamlines the processing of the test results. Depending on the number of functional interfaces and the design complexity, up to six weeks of implementation effort or even more can be saved. A runnable UVM testbench is handed over to the verification team at the very beginning of the project; the verification engineers have to implement only the corresponding drivers, monitors and functional coverage models. Later on, the scoreboards needed have to be architected and implemented.....
Read complete article from Mentor Graphics:  Making it Easy to Deploy the UVM

Confidence in the Face of the Unknown: X-state Verification

 

by Kaowen Liu, MediaTek Inc., and Roger Sabbagh, Mentor Graphics
Unknown signal values in simulation are represented as X-state logic levels, while the same X-states are interpreted as don't-care values by synthesis. This can result in the hazardous situation where silicon behaves differently from what was observed in simulation. Although general awareness of X-state issues among designers is good, gotchas remain a risk that traditional verification flows are not well equipped to guard against. The unknown simulation semantics of X has two oft-discussed pitfalls: X-pessimism and X-optimism.
X-optimism is most worrisome as it can mask true design behavior by blocking the propagation of X-states and instead propagating a deterministic value through the design in simulation, when in reality various possible values will be seen in the hardware. If the unexplored values cause the design to malfunction, then X-optimism has masked a design bug that will only be discovered in silicon.......
Read complete article from Mentor Graphics:  Confidence in the Face of the Unknown: X-state Verification
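The difference between X-optimism and X-pessimism can be illustrated with a toy three-valued evaluator. This is a hypothetical sketch (the function names and the single-character `'x'` encoding are made up for illustration); real simulators implement these semantics in far more detail:

```python
# 'x' stands for an unknown value. An RTL 'if' takes the then-branch only
# when the condition is a known 1, so an unknown condition silently falls
# into the else-branch and a deterministic value propagates (X-optimism),
# even though real hardware could resolve either way.

X = 'x'

def rtl_if(cond, then_val, else_val):
    # Simulation semantics of a plain 'if': non-1 (including X) acts as false.
    return then_val if cond == 1 else else_val

def pessimistic_if(cond, then_val, else_val):
    # X-pessimistic semantics: with an unknown condition, the result is
    # unknown unless both branches agree.
    if cond == X:
        return then_val if then_val == else_val else X
    return then_val if cond == 1 else else_val

# Select signal is unknown (e.g. an uninitialized register):
print(rtl_if(X, 1, 0))          # 0   -> X-optimism masks the unknown
print(pessimistic_if(X, 1, 0))  # x   -> the unknown is propagated
```

If the masked value happens to differ from what silicon produces, the bug only shows up in hardware, which is exactly the risk the article describes.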

Verifying High Speed Peripheral IPs

 

by Sreekanth Ravindran and Chakravarthi M.G., Mobiveil
High speed serial interconnect bus fabric is the SoC backbone, managing dataflow and keeping up with the dynamic bandwidth requirements of high speed applications. Verification of high speed interconnect IPs presents critical challenges not only in terms of complying with standards, but also in ensuring that the design is robust and flexible enough to handle and manage a large amount of time-critical data transfers. Acquiring such expertise requires years of verification experience. In this article, silicon IP and platform-enabled solution provider Mobiveil shares its story of verifying high speed bus protocol standards like PCI Express and Serial RapidIO, including what considerations are required when verifying high speed designs. In addition, Mobiveil highlights the benefits of using the Mentor Graphics Questa Verification Platform, including Questa Advanced Simulator, Questa CoverCheck, and Questa Clock-Domain Crossing (CDC) Verification, which together facilitate smart functional verification, debug and reporting of high speed designs....
Read complete article from Mentor Graphics: Verifying High Speed Peripheral IPs

Non-invasive Software Verification using Vista Virtual Platforms

 

by Alex Rozenman, Vladimir Pilko, and Nilay Mitash, Mentor Graphics
With SoCs now supporting multi-core processors, complex ASICs and combinations that include systems on a board, SoC implementations have become an ever-growing challenge for software development. Software development has to be supported not only by the inclusion of an RTOS; many SoC providers now also have to leverage the bare-metal concept to meet the demands of today's SoCs. However, there still exists a huge gap in the software development arena: the need to verify not only the SW/HW interactions but also the software itself in a hardware context. This has become almost a necessity in today's security-driven systems marketplace....
Read complete article from Mentor Graphics:  Non-invasive Software Verification using Vista Virtual Platforms

Power Up Hardware/Software Verification Productivity

 

by Matthew Ballance, Mentor Graphics
Today's complex designs increasingly include at least one, and often more, embedded processors. Given software's increasing role in the overall design functionality, it has become important to leverage the embedded processors in verifying hardware/software interactions during system-level verification. Comprehensively verifying low-level hardware/software interactions early in the verification process helps to uncover bugs that otherwise would not be found until operating system or application bring-up – potentially in the lab. Characterizing, debugging, and correcting this type of bug is easier, faster, and thus less expensive, early in the verification cycle....
 Read complete article from Mentor Graphics: Power Up Hardware/Software Verification Productivity

An Up-sized DAC Issue Takes the Stage

 

by Tom Fitzpatrick, Editor and Verification Technologist, Mentor Graphics Corporation
Building a theater set is not unlike what we do as verification engineers. It involves modeling the "real world," often at a higher level of abstraction, and it has hard deadlines. "The show must go on," after all. Productivity is also key because all of us building the set are volunteers. We reuse set pieces whenever we can, whether it's something we built for a past production or something we borrowed from a neighboring group. And we often have to deal with shifting requirements when the director changes his mind about something during rehearsals. Oh, and it also involves tools – lots of tools. However, there is one important way in which set construction is different from verification. Those of us building theater sets know our work only has to stay up for two weeks and that no one in the audience is going to see it from closer than thirty feet away. In contrast, all engineers know the chips they verify must work a lot longer under much closer scrutiny. .....
Read complete article from Mentor Graphics: An Up-sized DAC Issue Takes the Stage

Maximum Productivity with Verification IP

 

by Joe Rodriguez, Raghu Ardeishar, and Rich Edelman, Mentor Graphics
When beginning a new design, it's common to evaluate how to build a verification infrastructure in the shortest amount of time. Of course it's never just about quick deployment; verification also has to be complete enough to improve confidence in the design. Rapid bring-up and improving the quality of your design are excellent goals. However, you should not forget that your environment should be efficient to use during the verification process. This is where you will spend most of your time, slugging it out day after day. Arguably, debugging design bugs is one of the most time-consuming tasks of any project. Transaction Level Modeling (TLM) will change the way you think about debug productivity, especially if you have recently experienced the long and difficult task of deciphering PCIe's training sequences, data transfers and completion codes at the pin level......
Read complete article from Mentor Graphics: Maximum Productivity with Verification IP

In a scan chain, if some flip-flops are positive-edge triggered and the remaining flip-flops are negative-edge triggered, how does it behave?


Answer:


For designs with both positive- and negative-edge clocked flops, the scan insertion tool will always route the scan chain so that the negative-edge clocked flops come before the positive-edge flops in the chain. This avoids the need for a lockup latch.


Within the same clock domain, the posedge flops will always capture, on the rising edge of the clock, the data that was just captured into the negedge flops on the preceding falling edge.


For multiple clock domains, it all depends upon how the clock trees are balanced. If the clock domains are completely asynchronous, ATPG has to mask the receiving flops.
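The edge-ordering rule above can be illustrated with a toy shift simulation (a hypothetical sketch, not a real netlist): with the negedge flop first in the chain, each bit advances one flop per clock cycle, whereas with the posedge flop first the bit races through both flops within a single cycle, which is the hold problem a lockup latch would otherwise have to fix.

```python
# Each clock cycle consists of a rising edge followed by a falling edge.
# Two single-bit flops model a 2-flop scan chain; the chain output is the
# value of the last flop, observed at the end of each cycle.

def shift(order, bits):
    """order='neg_first': scan_in -> negFF -> posFF -> out (tool's choice)
       order='pos_first': scan_in -> posFF -> negFF -> out (problem case)"""
    neg = pos = 0
    out = []
    for b in bits:
        if order == 'neg_first':
            pos = neg          # rising edge: posFF loads negFF's old value
            neg = b            # falling edge: negFF loads scan_in
            out.append(pos)
        else:
            pos = b            # rising edge: posFF loads scan_in
            neg = pos          # falling edge: negFF loads the NEW posFF
                               # value -> the bit races through both flops
            out.append(neg)
    return out

print(shift('neg_first', [1, 0, 1, 1]))  # [0, 1, 0, 1]: one stage per cycle
print(shift('pos_first', [1, 0, 1, 1]))  # [1, 0, 1, 1]: a flop stage is lost
```

In the `pos_first` case the chain behaves as if it had one flop fewer, so shifted patterns would be corrupted; ordering negedge flops first avoids this without extra hardware.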

What do you mean by scan chain reordering?

Answer1:


Based on timing and congestion, the tool places standard cells optimally. While doing so, it can detach the scan chains, break the chain ordering (which was created by a scan insertion tool such as Synopsys DFT Compiler) and reorder the chain to optimize it; the number of flops in a chain is maintained.


Answer2:


During placement, the optimization may make the scan chain difficult to route due to congestion. Hence the tool will re-order the chain to reduce congestion.


This sometimes increases hold-time problems in the chain; to overcome these, buffers may have to be inserted into the scan path. The tool may not be able to maintain the scan chain length exactly, and it cannot swap cells from different clock domains.


Because of scan chain reordering, the patterns generated earlier are of no use. This is not a problem, however, as ATPG can be redone by reading the new netlist.

What is JTAG?

Answer1:


JTAG is an acronym for "Joint Test Action Group". It is also known as the IEEE 1149.1 standard for Standard Test Access Port and Boundary-Scan Architecture, and it is used as one of the DFT techniques.

Answer2:


JTAG (Joint Test Action Group) boundary scan is a method of testing ICs and their interconnections. It uses a shift register built into the chip so that inputs can be shifted in and the resulting outputs shifted out. JTAG requires four I/O pins: clock (TCK), input data (TDI), output data (TDO), and state machine mode control (TMS).


The uses of JTAG have expanded to debugging software for embedded microcontrollers; this eliminates the need for in-circuit emulators, which are more costly. JTAG is also used to download configuration bitstreams to FPGAs.


JTAG cells, also known as boundary scan cells, are small circuits placed just inside the I/O cells. Their purpose is to carry data to/from the I/Os through the boundary scan chain. The interface to these scan chains is called the TAP (Test Access Port), and the operation of the chains and the TAP is controlled by a JTAG controller inside the chip.
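The "state machine mode control" pin drives the standard 16-state TAP controller defined by IEEE 1149.1; the controller samples TMS on each rising edge of TCK and walks the state graph accordingly. A small sketch of that state machine (state names per the standard, helper function made up for illustration):

```python
# IEEE 1149.1 TAP controller: state -> (next if TMS=0, next if TMS=1)
TAP = {
    'Test-Logic-Reset': ('Run-Test/Idle',  'Test-Logic-Reset'),
    'Run-Test/Idle':    ('Run-Test/Idle',  'Select-DR-Scan'),
    'Select-DR-Scan':   ('Capture-DR',     'Select-IR-Scan'),
    'Capture-DR':       ('Shift-DR',       'Exit1-DR'),
    'Shift-DR':         ('Shift-DR',       'Exit1-DR'),
    'Exit1-DR':         ('Pause-DR',       'Update-DR'),
    'Pause-DR':         ('Pause-DR',       'Exit2-DR'),
    'Exit2-DR':         ('Shift-DR',       'Update-DR'),
    'Update-DR':        ('Run-Test/Idle',  'Select-DR-Scan'),
    'Select-IR-Scan':   ('Capture-IR',     'Test-Logic-Reset'),
    'Capture-IR':       ('Shift-IR',       'Exit1-IR'),
    'Shift-IR':         ('Shift-IR',       'Exit1-IR'),
    'Exit1-IR':         ('Pause-IR',       'Update-IR'),
    'Pause-IR':         ('Pause-IR',       'Exit2-IR'),
    'Exit2-IR':         ('Shift-IR',       'Update-IR'),
    'Update-IR':        ('Run-Test/Idle',  'Select-DR-Scan'),
}

def clock(state, tms_sequence):
    """Apply one TCK per TMS bit and return the resulting TAP state."""
    for tms in tms_sequence:
        state = TAP[state][tms]
    return state

# Five TCKs with TMS held high reach Test-Logic-Reset from ANY state,
# which is how tools synchronize with an unknown TAP:
assert all(clock(s, [1] * 5) == 'Test-Logic-Reset' for s in TAP)

# A typical data-register scan entry from idle:
print(clock('Run-Test/Idle', [1, 0, 0]))  # Shift-DR
```

The `Shift-DR`/`Shift-IR` states are where TDI/TDO shifting of the boundary scan chain (or instruction register) actually happens.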


For more info:


http://www.xess.com/faq/M0000297.HTM
http://www.cadreng.com/open_source/jtag/jtag_tutorial.php
http://www.ee.ic.ac.uk/pcheung/teaching/ee3_DSD/ti_jtag_seminar.pdf

Formal Verification Example

Formality (from Synopsys) is the tool used to formally verify the design. The design SAMM is verified in two ways: the gate-level netlist and the testable netlist are formally verified. The gate-level netlist in .db format is taken as the reference, and the testable netlist in .db format is considered the implementation. Matching these two netlists generates the log data below:


*********************************** Matching Results ***********************************
498 Compare points matched by name
0 Compare points matched by signature analysis
0 Compare points matched by topology
27 Matched primary inputs, black-box outputs
0(1) Unmatched reference(implementation) compare points
0(2) Unmatched reference(implementation) primary inputs, black-box outputs
----------------------------------------------------------------------------------------
Unmatched Objects REF IMPL
----------------------------------------------------------------------------------------
Input ports (Port) 0 2
Output ports (Port) 0 1
****************************************************************************************


There are two unmatched input ports in the implementation: test_se and test_si, which belong to the Design for Test (DFT) scan circuit. Since DFT circuits do not come into play under normal operation, the tool has to be instructed not to match these ports. This is performed from the setup mode by constraining those two signals to constant 0, as shown below.

Formality (match)> setup
1
Formality (setup)> set_constant -type port i:/WORK/sam3/test_se 0
Set 'i:/WORK/sam3/test_se' to constant 0
1
Formality (setup)> set_constant -type port i:/WORK/sam3/test_si 0
Set 'i:/WORK/sam3/test_si' to constant 0
1



After setting these options, the designs match and verification succeeds. The log data is shown below.


Status: Verifying...
********************************* Verification Results *********************************
Verification SUCCEEDED
----------------------
Reference design: r:/WORK/sam3
Implementation design: i:/WORK/sam3
480 Passing compare points
----------------------------------------------------------------------------------------
Matched Compare Points BBPin Loop BBNet Cut Port DFF LAT TOTAL
----------------------------------------------------------------------------------------
Passing (equivalent) 0 0 0 0 31 449 0 480
Failing (not equivalent) 0 0 0 0 0 0 0 0
Not Compared
Constant reg 18 0 18
****************************************************************************************
1


Similarly, the original Verilog code and the testable netlist are verified.

Related Articles

Introduction to Formal Verification
Why Formal Verification?
Concept of Formal Verification
Methodologies in Formal Verification
Binary Decision Diagram (BDD)
Equivalence Checking (EC)
Model Checking
Theorem Proving
Satisfiability (SAT) Solvers
Recent Advances in SAT Based Formal Verification
Hybrid Formal Verification Techniques

Hybrid Formal Verification Techniques

Traditional formal techniques like Model Checking and Theorem Proving are capable of analyzing a design and finding all bugs in it. However, they suffer from several drawbacks that restrict their applicability in practice. Model checking is limited by the well-known state explosion problem, and the effective use of theorem proving requires highly knowledgeable and skilled manpower. In this context, hybrid techniques, in which formal and informal techniques are used in conjunction with each other, are gaining popularity. The general goal of a hybrid technique is to enhance coverage of the traversed state space, and thereby address the verification bottleneck, by combining the strengths of the individual methods.

Hybrid Formal Verification approaches are classified into four categories.

They are:

1. Methods combining formal and informal techniques
2. Combination of two formal techniques
3. Combination of two informal techniques
4. Combination of multiple verification approaches

Symbolic Model Verification (SMV) is a powerful method for formal verification, but most realistic designs are too large and complex for pure symbolic simulation techniques. A method surveyed in [6] used a combination of synthesis and verification; this hybrid technique achieved better coverage and was quicker in finding bugs. Another hybrid technique, which combines model checking and test generation, used a general graph-theoretic model, a functional fault model and test generation procedures. This method is highly suitable for pipelined processors.

It is interesting to note that a number of research activities are going on in the hybridization of ATPG and formal techniques. ATPG avoids the state explosion problem, while formal methods are "complete"; hybridizing the two strengthens verification. In one of the methods surveyed in [6], formal specifications (or properties) are used as input to the test generator; this method was quite successful in dealing with commercial designs. Another method combined both ATPG and BDDs with SMV; as noted by [6], the results showed better coverage than either symbolic techniques or simulation in isolation. Several other hybrid techniques which combine ATPG, BDDs and SMV can be found in [6].

Effectively combining two formal methods like theorem proving and model checking is a great challenge to the research community. An experiment that combined Symbolic Trajectory Evaluation (STE) with a theorem proving method showed good results for the verification of a 64-bit integer multiplier. Some other theorem proving methods use external oracles for checking formulas as theorems during the proof search; they use the oracle interface to decide Boolean formulas through connections to BDD and SAT-solving libraries.

Another tool, MIST, surveyed in [6], hybridizes two model checking techniques: it provides an interface between STE and SMC with BDD- or SAT-based solvers. In this technique, once the initial set of states is obtained using STE, SMC is used to check the property on the original circuit model. MIST thus enhances both the capacity and performance of SMC, and it helps the debugging process by letting the verifier focus on critical and error-prone areas.

Various other methods combining informal techniques are also emerging. One of them employs computer learning and Bayesian networks, and a couple of others use model-based randomized functional verification.

The first proposed method combining multiple verification approaches used test pattern generation, non-reachability analysis, symbolic simulation, SAT-based BMC, symbolic fixpoint computation and automatic abstraction. These techniques helped to focus on coverage issues much earlier in the design cycle and to expose bugs in deep circuit blocks.

Hybrid formal verification techniques are employed in commercial tools such as Magellan and Formality from Synopsys, Incisive from Cadence, and FormalPro from Mentor Graphics.

Reference

[1] Sandeep K. Shukla, “An Introduction to Formal Verification”, http://www.ics.uci.edu/~rgupta/ics212/w2002/verification.pdf, 9/10/07

[2] Rolf Drechsler, “Towards Formal Verification on the System Level”, Institute of Computer Science, University of Bremen, Germany, 2005

[3] Prof. Eduard Cerny, Xiaoyu Song, “Formal Verification Methods”, University of Montreal, Canada, 1999. http://www.iro.umontreal.ca/~cerny/IFT6222/1-intro-99.0808.pdf, 9/10/07

[4] Jawahar Jain, Amit Narayan, M. Fujita, A. Sangiovanni-Vincentelli, “A Survey of Techniques for Formal Verification of Combinational Circuits”, Proceedings of the 1997 International Conference on Computer Design (ICCD '97), IEEE, 1997

[5] F. Wang, “Formal Verification of Timed Systems: A Survey and Perspective”, Proceedings of the IEEE, Vol. 92, No. 8, August 2004, pp. 1283-1305. http://www.cerc.utexas.edu/~jay/fv_surveys/, 9/10/07

[6] J. Bhadra, M. S. Abadir, L.-C. Wang, S. Ray, “Functional Verification through Hybrid Techniques: A Survey”, IEEE Design & Test of Computers, March-April 2007. http://www.cerc.utexas.edu/~jay/fv_surveys/, 9/10/07

[7] M. Prasad, A. Biere, A. Gupta, “A Survey of Recent Advances in SAT-Based Formal Verification”, Software Tools for Technology Transfer, Vol. 7, No. 2, 2005. pp. 156-173. http://www.cerc.utexas.edu/~jay/fv_surveys/, 9/10/07


Related Articles

Introduction to Formal Verification
Why Formal Verification?
Concept of Formal Verification
Methodologies in Formal Verification
Binary Decision Diagram (BDD)
Equivalence Checking (EC)
Model Checking
Theorem Proving
Satisfiability (SAT) Solvers
Recent Advances in SAT Based Formal Verification
Formal Verification Example

Recent Advances in SAT Based Formal Verification

SAT can be used for model checking; the method is known as Bounded Model Checking (BMC). BMC based on SAT methods was introduced by Biere et al. [7] and is rapidly gaining popularity as a complementary technique to BDD-based symbolic model checking. A more compact CNF for the BMC problem can be generated using structural processing, which makes the resulting SAT problem easier for the SAT solver to solve. Traditional model checking uses a method known as cone of influence (COI) reduction; an improvement on this method is bounded cone of influence (BCOI) reduction. Variable ordering has long been recognized as a key determinant of the performance of SAT solvers. BMC tools using circuit-based SAT solvers essentially use some variant of the J-frontier justification heuristic popularly used in sequential ATPG tools.

Several case studies mentioned in [7] which employed BMC as the formal verification methodology have successfully proven the capability of the method. BMC was found to significantly outperform a BDD-based symbolic model checker on several of the benchmarks. The case studies were carried out on processors such as the PowerPC and the Intel Pentium 4. It is thus generally accepted in the research and industrial community that SAT-BMC tools require minimal tuning effort and work particularly well on large designs where bugs need to be searched for at shallow to medium depths.
The paper [7] further surveys several other recent trends in model checking algorithms based on SAT solvers, a few of which are named here: SAT-based unbounded model checking, ATPG-based model checking, and Quantified Boolean Formulae (QBF).

SAT-based verification has emerged as a technology orthogonal to BDD-based model checking techniques (bounded and unbounded). SAT-based techniques require less user tuning and are less sensitive to problem size. The major challenge, however, is to develop a verification methodology that employs both SAT and BDD methods in a manner best suited to utilize their respective strengths.
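The core BMC loop, unrolling the transition relation up to a bound k and asking whether a property violation is reachable, can be sketched in a few lines. In this illustrative Python sketch an exhaustive search over input traces stands in for the SAT solver, and the counter design, its step function and the property are all made-up examples:

```python
from itertools import product

def bmc(init, step, inputs, bad, k):
    """Return a counterexample input trace of length <= k, or None.
    Shorter depths are searched first, mirroring BMC's strength at
    finding bugs at shallow depths."""
    for depth in range(1, k + 1):
        for trace in product(inputs, repeat=depth):
            state = init
            for inp in trace:
                state = step(state, inp)
                if bad(state):
                    return trace
    return None

# Toy design: a 2-bit counter that increments when enable=1.
# Safety property to check: the count never reaches 3.
init = 0
step = lambda s, en: (s + en) % 4
bad = lambda s: s == 3

print(bmc(init, step, [0, 1], bad, k=2))  # None: no violation within bound 2
print(bmc(init, step, [0, 1], bad, k=5))  # (1, 1, 1): violated at depth 3
```

A real BMC tool encodes the k-step unrolling as one CNF formula and hands it to a SAT solver instead of enumerating traces, but the bound semantics are the same: a `None` result only means no counterexample exists up to k.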



Related Articles

Introduction to Formal Verification
Why Formal Verification?
Concept of Formal Verification
Methodologies in Formal Verification
Binary Decision Diagram (BDD)
Equivalence Checking (EC)
Model Checking
Theorem Proving
Satisfiability (SAT) Solvers
Hybrid Formal Verification Techniques
Formal Verification Example

Satisfiability (SAT) Solvers

Most SAT solvers use a Conjunctive Normal Form (CNF) representation of the Boolean formula. In CNF, the formula is a conjunction of clauses, each clause is a disjunction of literals, and a literal is a variable or its negation. Every clause must evaluate to true to satisfy the formula. Polynomial algorithms exist to transform an arbitrary propositional formula into an equisatisfiable CNF formula, i.e., one that is satisfiable if and only if the original formula is. Alternatively, the SAT solver may be modified to work directly on a Boolean circuit representation. Most modern SAT solvers are based on the Davis-Putnam-Logemann-Loveland (DPLL) algorithm, which finds a solution if and only if the formula is satisfiable. In addition, modern SAT solvers use conflict analysis techniques to learn from the reasons for a conflict.
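A minimal DPLL loop, unit propagation plus branching with chronological backtracking, can be sketched as follows. This illustrative Python sketch deliberately omits the watched literals, conflict-driven clause learning and decision heuristics of modern solvers:

```python
# A formula is a list of clauses; a clause is a set of literals;
# a literal is +v (variable v) or -v (its negation).

def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})

    # Unit propagation: an unsatisfied clause with exactly one
    # unassigned literal forces that literal.
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue                     # clause already satisfied
            unassigned = [l for l in clause if abs(l) not in assignment]
            if not unassigned:
                return None                  # conflict: clause falsified
            if len(unassigned) == 1:
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True

    free = {abs(l) for c in clauses for l in c} - set(assignment)
    if not free:
        return assignment                    # all variables decided: SAT
    v = min(free)                            # branch: try both values of v
    for value in (True, False):
        result = dpll(clauses, {**assignment, v: value})
        if result is not None:
            return result
    return None

# (x1 or x2) and (~x1 or x3) and (~x2) and (~x3) is UNSAT;
# dropping the last clause makes it SAT.
cnf = [{1, 2}, {-1, 3}, {-2}, {-3}]
print(dpll(cnf))        # None (unsatisfiable)
print(dpll(cnf[:-1]))   # {2: False, 1: True, 3: True}
```

In the satisfiable case here, unit propagation alone decides every variable and no branching is needed, which illustrates why propagation is the workhorse of DPLL-style solvers.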



Related Articles

Introduction to Formal Verification
Why Formal Verification?
Concept of Formal Verification
Methodologies in Formal Verification
Binary Decision Diagram (BDD)
Equivalence Checking (EC)
Model Checking
Theorem Proving
Recent Advances in SAT Based Formal Verification
Hybrid Formal Verification Techniques
Formal Verification Example

Theorem proving

A theorem in logic is simply a relationship (logical equivalence or logical implication) between a specification and an implementation, and it has to be proven within the context of a proof calculus. The implementation should satisfy the specification provided, and this is proved by mathematical reasoning. Both the implementation and the specification are expressed as formulas in a formal logic; the implementation provides axioms and assumptions that the proof can draw upon, and the inference steps can be simplification, rewriting, induction, and so on.

Figure (1) Theorem proving [3]
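As a toy illustration of this specification-versus-implementation view, consider proving that a four-NAND-gate realization of exclusive-or satisfies its specification. In propositional logic, exhaustive case analysis over the inputs is itself a sound (if unscalable) proof procedure; the gate network below is a standard textbook XOR, and the function names are ours:

```python
from itertools import product

def spec(a, b):
    """Specification: exclusive-or of two Booleans."""
    return a != b

def impl(a, b):
    """Implementation: the classic four-NAND-gate XOR network."""
    nand = lambda x, y: not (x and y)
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# The "theorem" is the equivalence impl <-> spec, proved here by
# case analysis over all input combinations.
theorem_holds = all(impl(a, b) == spec(a, b)
                    for a, b in product([False, True], repeat=2))
```

A real theorem prover would instead manipulate the formulas symbolically, which is what makes the approach applicable to parameterized designs where exhaustive enumeration is impossible.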

Proof checking and generation
It is a purely syntactic matter to decide whether each line of a proof is an axiom or follows from previous lines by a rule of inference. Complete automation of theorem proving is generally not possible, owing to the theoretical limits of decidability. However, large parts of the activity (decidable subsets of the logic, specific classes of applications, and particular specification styles) can be automated. Even so, all theorem-proving techniques require some form of human guidance.
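The syntactic nature of proof checking can be made concrete with a small checker. In this hypothetical encoding (ours, not taken from any particular system), formulas are Python values, an implication P -> Q is the tuple ('->', P, Q), and the only inference rule is modus ponens:

```python
def follows_by_mp(formula, earlier):
    """Modus ponens: Q follows if some earlier P and earlier P -> Q both exist."""
    return any(('->', p, formula) in earlier for p in earlier)

def check_proof(axioms, steps):
    """Accept a proof iff every step is an axiom or follows from
    earlier steps by modus ponens -- a purely syntactic check."""
    earlier = []
    for f in steps:
        if f not in axioms and not follows_by_mp(f, earlier):
            return False
        earlier.append(f)
    return True
```

With axioms `{'a', ('->', 'a', 'b'), ('->', 'b', 'c')}`, the step sequence `['a', ('->', 'a', 'b'), 'b', ('->', 'b', 'c'), 'c']` checks, whereas jumping straight to `['c']` is rejected.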

Theorem Proving Systems
Theorem proving systems can be broadly classified into automated deduction systems (e.g., Prolog) and interactive theorem proving systems. Automated deduction systems are fully automatic, but only for a decidable subset of first-order logic; speed is emphasized over versatility, and because such systems are often developed in the context of application-specific research, the methodology is implemented with ad hoc decision procedures. Interactive theorem proving systems are semi-automatic but are not restricted to a decidable subset; versatility is emphasized over speed, so these systems are naturally slower. In theory, a complete proof can be generated for every theorem.

Some well-known theorem proving systems are the Boyer-Moore prover (first-order logic), HOL (higher-order logic), PVS (higher-order logic), and Lambda (higher-order logic).
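To give a flavor of what an interactive higher-order prover accepts, here is a small machine-checked theorem written in Lean, a modern prover in the same family as HOL and PVS (Lean is our choice of example system, not one mentioned above):

```lean
-- Conjunction is commutative: a complete, machine-checked proof term.
theorem and_swap (p q : Prop) : p ∧ q → q ∧ p :=
  fun ⟨hp, hq⟩ => ⟨hq, hp⟩
```

The prover verifies that the proof term on the right really has the type stated on the left; if it does not, the theorem is rejected.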

Advantages of Theorem Proving
* Theorem proving offers a high level of abstraction, an expressive logic, and powerful reasoning.
* The user is in full control of the proof, so the hierarchical structure and regularity of the design can be fully exploited.
* The methodology is well suited to verifying parameterized, datapath-dominated circuits, and it can be customized for specific applications: larger proofs can be built on top of basic ones.

Disadvantages of Theorem Proving

* User intervention and guidance are necessary throughout the verification process, which demands expertise in the field; even small theorems can require a large human investment.
* The methodology is automated only for narrow classes of designs, and the technique is still under development.
* As design complexity increases, theorem proving becomes increasingly difficult.



Related Articles

Introduction to Formal Verification