Building a collaborative verification environment based on an instruction-level simulator and a logic simulator

Author: Xing Qiang

1 Overview

The concept of hardware/software co-verification was proposed many years ago, but only in recent years, with the development of SOC technology, has it received wider attention and substantial development. Hardware/software co-verification is a technology for verifying whether the hardware and software of an SOC system can work together correctly before the hardware is taped out and packaged. Co-verification is also called virtual prototyping, because although the simulated hardware behaves essentially the same as the real hardware, the hardware simulation is actually carried out by a software program running on a workstation. The basic framework of co-verification is shown in Figure 1.

[Figure 1: Basic framework of hardware/software co-verification]

Compared with traditional verification methods, co-verification allows software engineers to start debugging in the early stages of the design and to integrate software and hardware earlier, shortening time-to-market. On the other hand, co-verification provides hardware engineers with a stimulus set that is very close to reality, which helps improve the quality of verification.

The co-verification system consists of a hardware execution environment and a software execution environment, which are coordinated through events and commands. The software execution environment generates a sequence of bus cycles. The co-verification tool converts each bus cycle into a set of signal events or commands, drives them into the hardware execution environment, then samples the hardware environment's response to the bus cycle and sends the response back to the software environment. At the same time, it keeps the hardware and software environments synchronized, so that errors caused by missed responses can be detected in either environment.

In hardware/software co-verification, both the software and the hardware parts are represented by models. The hardware can be modeled in several ways: an FPGA prototype or emulation system; HDL running on a logic simulator; or a behavioral model written in a high-level programming language (such as C/C++). The software can also be executed in several ways: running on an instruction set simulator (ISS) of the CPU, or compiled for and run directly on the host that runs the simulation. In a typical SOC design, the hardware is modeled in a hardware description language, and the software generally needs to be compiled into target code for the SOC's embedded core. Therefore, this article builds the co-verification environment with an instruction-level simulator plus a logic simulator (ISS with logic simulator).

2 Inter-process communication

Communication between the software and hardware simulators is a key technology in co-verification. Since software simulation and hardware simulation run as two independent processes, the inter-process communication (IPC) mechanisms under Unix can be used to exchange information between them. Commonly used Unix IPC mechanisms include unnamed pipes, named pipes (FIFOs) and Unix sockets. The unnamed pipe is the most common IPC method in Unix; its advantage is simplicity, but its weakness is that it can only be used between related processes that share a common ancestor. A named pipe (FIFO) differs from an unnamed pipe in that it is persistent and stable, and it allows unrelated processes to exchange data. A socket is an abstract data structure used to create a channel (connection point) for sending and receiving messages between unrelated processes; once the channel is established, the connected processes can communicate through ordinary file system access routines. Of the two communicating parties, one is called the client and the other the server; the establishment process is shown in Figure 2. Comparing the characteristics of these three communication methods, Unix sockets are used here to implement the communication between the two processes.

[Figure 2: Establishing the socket connection between client and server]
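
As a minimal illustration of the server/client establishment in Figure 2, the sketch below sets up a Unix domain (stream) socket between the two simulation processes. The socket path is a hypothetical choice, error handling is omitted, and the assignment of server to the software process and client to the PLI side follows the description in Section 4.

    /* Minimal sketch: establishing the Unix-socket channel of Figure 2.
     * The socket path is illustrative and error handling is omitted.   */
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <sys/un.h>

    #define COSIM_SOCK_PATH "/tmp/cosim.sock"   /* hypothetical rendezvous path */

    /* Server side (here: the software simulation process, see Section 4). */
    int open_server(void)
    {
        struct sockaddr_un addr;
        int listen_fd = socket(AF_UNIX, SOCK_STREAM, 0);

        memset(&addr, 0, sizeof(addr));
        addr.sun_family = AF_UNIX;
        strncpy(addr.sun_path, COSIM_SOCK_PATH, sizeof(addr.sun_path) - 1);

        unlink(COSIM_SOCK_PATH);                      /* remove a stale socket file          */
        bind(listen_fd, (struct sockaddr *)&addr, sizeof(addr));
        listen(listen_fd, 1);                         /* exactly one peer: the hardware side */
        return accept(listen_fd, NULL, NULL);         /* blocks until the client connects    */
    }

    /* Client side (here: the Verilog PLI interface of the hardware process). */
    int open_client(void)
    {
        struct sockaddr_un addr;
        int fd = socket(AF_UNIX, SOCK_STREAM, 0);

        memset(&addr, 0, sizeof(addr));
        addr.sun_family = AF_UNIX;
        strncpy(addr.sun_path, COSIM_SOCK_PATH, sizeof(addr.sun_path) - 1);

        connect(fd, (struct sockaddr *)&addr, sizeof(addr));
        return fd;                                    /* both sides now use read()/write()   */
    }

Once both calls return, the two processes exchange data over the returned descriptors with ordinary read()/write() calls, as noted above.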

3 Message passing

In this environment, the communication between software and hardware can be regarded as a message-passing process: data transfer functions are added to the software environment and the hardware environment, and information is exchanged through the interface, as shown in Figure 3.

[Figure 3: Message passing between the software and hardware environments]

In the message transfer between the two processes, messages fall into three categories: synchronous data transfer, asynchronous data transfer, and synchronization signals without data transfer. Synchronous data transfer is a mechanism that ensures the receiving process is in the correct state when the sending process sends data; if the sender initiates a transfer while the receiver is not ready, the sender blocks on this transfer until the receiver is ready. In asynchronous data transfer, the sender does not care whether this data or the previous data has been received; the data is not buffered, and each newly sent value overwrites the previous one, so in one transfer a value may be received several times or not at all. This kind of transfer is usually used to convey status information. A synchronization signal without data transfer is used to synchronize the state of the two processes without exchanging data; one process can use a synchronization signal to notify the other process to start a task, or to wait for it to complete a task.
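
On the software side, these three categories can be pictured as three small wrapper functions over the socket. This is only a sketch: the msg_t layout, the category codes and the blocking acknowledgment are assumptions about how such a transport could be organized, not the tool's actual protocol.

    /* Sketch of the three message categories, seen from the software side.
     * The message layout and category codes are illustrative assumptions. */
    #include <unistd.h>

    typedef struct {
        int kind;     /* SYNC_DATA, ASYNC_DATA or EVENT_SIGNAL            */
        int port;     /* target port in the hardware environment          */
        int data;     /* payload (unused for EVENT_SIGNAL messages)       */
    } msg_t;

    enum { SYNC_DATA, ASYNC_DATA, EVENT_SIGNAL };

    /* Synchronous transfer: block until the receiver acknowledges, so the
     * sender cannot run ahead of a receiver that is not yet ready.        */
    int sync_write(int fd, int port, int data)
    {
        msg_t m = { SYNC_DATA, port, data }, ack;
        write(fd, &m, sizeof(m));
        return (int)read(fd, &ack, sizeof(ack));   /* blocks until the peer replies */
    }

    /* Asynchronous transfer: fire and forget; the receiver keeps only the
     * most recent value, so a value may be read several times or never.   */
    void async_write(int fd, int port, int data)
    {
        msg_t m = { ASYNC_DATA, port, data };
        write(fd, &m, sizeof(m));
    }

    /* Synchronization signal without data: notify the peer, no payload.   */
    void send_event(int fd)
    {
        msg_t m = { EVENT_SIGNAL, 0, 0 };
        write(fd, &m, sizeof(m));
    }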

The data transfer functions and the interface communicate differently in the software environment and in the hardware environment. In the software environment, the data transfer functions consist of simple read and write operations, and read and write each come in a synchronous and an asynchronous mode. In the hardware environment, a number of ports need to be defined to control data transfer, and each port uses a different protocol. Synchronous data transfer uses the handshake ports in the hardware environment, in cooperation with the synchronous read/write functions in the software environment. There are four handshake ports: two SEND ports (data goes from the hardware environment to the interface) and two RECV ports (data goes from the interface to the hardware environment). The SEND and RECV ports are further divided into Q_SEND/Q_RECV and A_SEND/A_RECV, according to whether the hardware environment is the requester or the responder. Asynchronous data transfer uses two ports defined in the hardware environment, PORT_IN and PORT_OUT, in cooperation with the asynchronous read/write functions in the software environment. In addition, an EVENT port is defined to send synchronization signals without data, which are used to synchronize the simulation time of the software and hardware environments.

Table 1 summarizes the usage of the ports in the interface. In this co-verification environment, designers can ignore the underlying details and selectively use these ports to exchange information between software and hardware.

[Table 1: Port usage of the interface]
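
Since Table 1 itself is not reproduced here, the sketch below only records, as a C enumeration, the port names introduced above. The numeric values, the mapping of the Q_/A_ prefixes to requester/responder, and the directions noted for PORT_IN/PORT_OUT are assumptions inferred from the text and from the A_RECV example in Section 4.2.

    /* Port identifiers of the interface, as named in the text above.
     * Values and the direction comments are illustrative assumptions.     */
    typedef enum {
        Q_SEND   = 0,   /* handshake, data: hardware -> interface, hardware as requester */
        A_SEND   = 1,   /* handshake, data: hardware -> interface, hardware as responder */
        Q_RECV   = 2,   /* handshake, data: interface -> hardware, hardware as requester */
        A_RECV   = 3,   /* handshake, data: interface -> hardware, hardware as responder */
        PORT_IN  = 4,   /* asynchronous data, presumably into the hardware environment   */
        PORT_OUT = 5,   /* asynchronous data, presumably out of the hardware environment */
        EVENT    = 6    /* synchronization signal, no data                               */
    } port_id_t;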

4 Realization of the co-verification environment

4.1 Basic structure

The basic structure of the whole environment is shown in Figure 4. The left and right boxes represent the software simulation process and the hardware simulation process, respectively. They exchange information through Unix sockets, and the Verilog PLI is their interaction interface. The parts of the environment are introduced below.

In the software environment, the instruction-level simulator (ISS) is a program that simulates the behavior of the CPU. It runs on the host and executes the software program's executable image: after the SOC software is compiled into target code for the SOC's embedded core, it can be simulated on the instruction-level simulator. The C bus functional model (C BFM) converts instruction-level accesses into cycle-level bus transactions and implements the bus interface function that connects to the hardware environment; it also contains the server program of the Unix socket.
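
A C BFM of this kind might expose a single data-access hook to the ISS and decide, by address, whether an access is served by local memory or turned into a bus cycle for the hardware model. Everything below is a sketch under that assumption: the hook name, the address window, sync_read() and the choice of A_RECV/A_SEND for write/read cycles are illustrative, reusing the helpers and port identifiers from the Section 3 sketches.

    /* Sketch of the C BFM: ISS data accesses that hit the hardware model's
     * address window are converted into bus-cycle messages over the socket.
     * Names, the address window and the port choice are assumptions.       */
    #define HW_BASE 0x80000000u              /* hypothetical hardware address window         */
    #define HW_SIZE 0x00010000u

    enum { A_SEND = 1, A_RECV = 3 };         /* port identifiers, as in the Section 3 sketch */

    extern int      sock_fd;                                 /* from open_server()           */
    extern unsigned local_mem_read(unsigned addr);           /* ordinary memory model        */
    extern void     local_mem_write(unsigned addr, unsigned data);
    extern int      sync_write(int fd, int port, int data);  /* Section 3 sketch             */
    extern int      sync_read(int fd, int port, int *data);  /* hypothetical counterpart     */

    /* Called by the ISS for every data access (hook name is illustrative). */
    unsigned bfm_data_access(unsigned addr, unsigned data, int is_write)
    {
        if (addr - HW_BASE < HW_SIZE) {                      /* access hits the hardware model */
            if (is_write) {
                sync_write(sock_fd, A_RECV, (int)data);      /* bus write cycle (cf. Figure 5) */
                return 0;
            } else {
                int rd = 0;
                sync_read(sock_fd, A_SEND, &rd);             /* bus read cycle                 */
                return (unsigned)rd;
            }
        }
        if (is_write) { local_mem_write(addr, data); return 0; }
        return local_mem_read(addr);                         /* ordinary memory access         */
    }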

The hardware environment consists of two parts. One part is the Verilog PLI interface, which is the bridge between software and hardware: on the one hand, it exchanges information with the software environment through the client program of the Unix socket; on the other hand, based on the information passed through the socket, it sends request or response signals to the hardware through a set of task functions. The other part is the hardware model to be verified, described in Verilog. Around it is a Verilog control stub, which calls functions in the PLI at every simulation time step and exchanges information with the Verilog PLI interface through the ports.

[Figure 4: Basic structure of the co-verification environment]
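
On the hardware side, the PLI part can be organized around user-defined system tasks that the Verilog control stub calls each simulation step. The sketch below uses the standard VPI routines to register a hypothetical $sw_bridge(req, data) task that forwards data received from the socket onto the hardware ports; the task name, its argument list and the read_sw_message() helper are assumptions, and the full handshake of Section 4.2 is omitted here.

    /* Sketch of the Verilog PLI interface using the VPI routines: a
     * hypothetical $sw_bridge(req, data) system task, called by the control
     * stub, drives data received from the socket onto the hardware ports.  */
    #include <vpi_user.h>

    extern int sock_fd;                                  /* socket to the software process    */
    extern int read_sw_message(int fd, int *data);       /* hypothetical non-blocking receive */

    static PLI_INT32 sw_bridge_calltf(PLI_BYTE8 *user_data)
    {
        vpiHandle systf, args, req_h, data_h;
        s_vpi_value val;
        int data;

        (void)user_data;
        systf  = vpi_handle(vpiSysTfCall, NULL);         /* handle of this $sw_bridge call    */
        args   = vpi_iterate(vpiArgument, systf);
        req_h  = vpi_scan(args);                         /* 1st argument: request line        */
        data_h = vpi_scan(args);                         /* 2nd argument: data bus            */
        vpi_free_object(args);

        if (read_sw_message(sock_fd, &data) > 0) {
            val.format        = vpiIntVal;
            val.value.integer = data;                    /* drive the data bus                */
            vpi_put_value(data_h, &val, NULL, vpiNoDelay);

            val.value.integer = 1;                       /* raise the request line            */
            vpi_put_value(req_h, &val, NULL, vpiNoDelay);
        }
        return 0;
    }

    static void sw_bridge_register(void)
    {
        s_vpi_systf_data tf = {0};
        tf.type   = vpiSysTask;
        tf.tfname = "$sw_bridge";
        tf.calltf = sw_bridge_calltf;
        vpi_register_systf(&tf);
    }

    /* Standard VPI start-up table picked up by the logic simulator. */
    void (*vlog_startup_routines[])(void) = { sw_bridge_register, 0 };

On the Verilog side, the control stub would invoke this task from a clocked block at every simulation step, passing the request line and the data bus as arguments.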

4.2 Function realization

Figure 5 shows an example of communication between the software and hardware environments through the A_RECV port.

[Figure 5: Communication through the A_RECV port]

When the software needs to write a piece of data to the hardware, it first sends the data to the PLI through the socket. The PLI receives the data, presents it to the hardware, and raises the request line to send a request. When the hardware model is ready, it receives the data through the Verilog control stub, stores it into an internal register, and raises the acknowledge line to answer the PLI. After the PLI sees the acknowledge, it lowers the request line and sends a synchronization signal back through the socket. When the hardware model sees the request go low, it lowers the acknowledge line, and one transfer is finished. Before the software can perform the next write operation, it must first read the synchronization signal from the socket; only when the synchronization signal has arrived can the next operation start. The handshake on the other ports is similar.
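
The same handshake can be written, on the PLI side, as a small state machine polled once per simulation step by the control stub. This is only a sketch of the sequence just described; the helper functions (a non-blocking socket read plus get/set wrappers that would be built on vpi_get_value()/vpi_put_value()) are assumed rather than taken from the tool.

    /* Sketch of the A_RECV handshake on the PLI side, polled once per
     * simulation step. Helper functions are hypothetical wrappers around
     * the socket and around vpi_get_value()/vpi_put_value().              */
    extern int  poll_sw_data(int fd, int *data);  /* non-blocking read of a data message */
    extern void set_data(int value);              /* drive the data port                 */
    extern void set_req(int value);               /* drive the request line              */
    extern int  get_ack(void);                    /* sample the acknowledge line         */
    extern void send_event(int fd);               /* sync signal back to the software    */

    enum { ST_IDLE, ST_REQ, ST_WAIT_ACK_LOW };
    static int a_recv_state = ST_IDLE;

    void a_recv_poll(int fd)
    {
        int data;

        switch (a_recv_state) {
        case ST_IDLE:                             /* wait for data from the software     */
            if (poll_sw_data(fd, &data) > 0) {
                set_data(data);                   /* present the data to the hardware    */
                set_req(1);                       /* raise the request line              */
                a_recv_state = ST_REQ;
            }
            break;
        case ST_REQ:                              /* wait for the hardware acknowledge   */
            if (get_ack()) {
                set_req(0);                       /* drop the request                    */
                send_event(fd);                   /* tell the software the write is done */
                a_recv_state = ST_WAIT_ACK_LOW;
            }
            break;
        case ST_WAIT_ACK_LOW:                     /* hardware drops ack: transfer done   */
            if (!get_ack())
                a_recv_state = ST_IDLE;
            break;
        }
    }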

4.3 Co-simulation synchronization

The instruction set simulator simulates in units of instructions: its clock advances one instruction cycle at a time, and the length of an instruction cycle varies from instruction to instruction. Hardware simulators usually use an event-driven simulation algorithm with events as the scheduling objects; the hardware simulator advances the clock according to the time at which events occur, and the time unit is generally specified by the user's design, such as nanoseconds or microseconds. Therefore, a certain mechanism must be adopted to synchronize the two, and whether they can be synchronized directly affects the correctness of the co-simulation. The lock-step method is usually used to synchronize the instruction set simulator and the hardware simulator. To adopt this synchronization method, the synchronization points must first be determined, so that no software/hardware interaction event occurs in the interval between two synchronization points. As mentioned earlier, the Verilog PLI interface is the information exchange interface between the software and hardware simulators, so the synchronization points are determined by the points at which information is exchanged. The hardware simulator advances the clock in the order in which events occur; to ensure that its clock does not cross a synchronization point, a synchronization signal must be introduced, that is, a signal that marks the synchronization point in time. After the hardware simulator finishes the events up to that point, it sends a synchronization message back to the Verilog PLI interface, indicating that its clock has advanced to the synchronization point.
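
Seen from the software side, the lock-step scheme can be pictured as the loop below: after each instruction the ISS tells the hardware simulator how far to advance and blocks until the hardware reports that it has reached the synchronization point. The helper names and the nanosecond time unit are assumptions.

    /* Sketch of lock-step synchronization, software side. Helper names and
     * the time unit (ns) are illustrative assumptions.                     */
    extern int  iss_step(void);                /* run one instruction, return its cycle length in ns */
    extern void send_advance(int fd, int ns);  /* tell the hardware simulator to advance by ns       */
    extern void wait_sync(int fd);             /* block until it reports the synchronization point   */

    void cosim_run(int fd, long max_instructions)
    {
        for (long i = 0; i < max_instructions; i++) {
            int cycle_ns = iss_step();         /* instruction-level step                              */
            send_advance(fd, cycle_ns);        /* hardware runs its events up to the sync point       */
            wait_sync(fd);                     /* both clocks now agree at the synchronization point  */
        }
    }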

5 Conclusion

Design verification is one of the key technologies of SOC design and runs through the entire SOC flow. With the development of SOC technology, hardware/software co-verification has received increasing attention. Unlike previous approaches that treat the whole environment as a single process, this environment separates the software and hardware simulations into two processes and uses Unix sockets to implement the communication between them, which is closer to the real situation. At the same time, since the two simulation processes execute in parallel, simulation is faster and more efficient. In this environment, the software is written in a programming language and the hardware is modeled in a hardware description language, which matches the habits of software and hardware engineers and allows software and hardware to be integrated earlier. The throughput of the communication between software and hardware is the bottleneck that limits simulation speed; the next step is to study the synchronization and optimization of co-simulation.
