Control Your IC Design Flow

by John McGehee on August 8, 2009

Chip design is an iterative process.  The design flow is run repeatedly as it is extended and refined into a program that can automatically build the entire chip. Your flow needs software to make sure that only the necessary steps are executed, in the correct order, on the correct data. Different hierarchical blocks must be built with the same flow, yet each with its own block-specific variations.

This is one of a series of articles on simple, specific techniques that will make your chip design flow easy to use. In this installment, I will explain how to control your chip design flow, and give rules to keep it organized.

The Power of Names

First, I want to tell you the most effective way to make your flow easy to use and understand: whenever you name something, choose a consistent, informative name. How can something this simple be so important? Look at the value proposition:

  • The name will often be the only documentation.
  • At the very outset, you need to choose a huge number of names. A naming convention will make this easier.
  • Names are hard to change. Once chosen, you’re usually stuck with them.
  • Good names cost almost nothing. Just pay attention.

Here I am talking about variable and flow step names.  File names are closely related, but file naming is explained in another article.

Good step and variable names for your design flow are the same as good variable names in ordinary software. This means that there is a wealth of information about variable naming conventions on the Internet and in programming books. I like Code Complete, Chapter 11, The Power of Variable Names for its straightforward suggestions. I also like how it agrees with me.

I have one additional variable naming convention unique to IC design flow: use names that are legal Verilog identifiers. The rules are not at all restrictive:

  • Use only letters, numbers, and _
  • The first character must be a letter

Verilog is the most restrictive name space in chip design. Except for collisions with the reserved words of other languages (foreach, for example, is a legal Verilog identifier but a Tcl command), a legal Verilog identifier can be used as a file name in every file system, and as a variable name in every computer language.
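You can check a proposed name against the two rules above mechanically. Here is a quick sanity check in the shell (blockHead is just an example name):

name=blockHead
if echo "$name" | grep -Eq '^[A-Za-z][A-Za-z0-9_]*$'; then
    echo "$name is safe everywhere"
else
    echo "$name will cause trouble somewhere"
fi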

Your design flow will run a wide variety of tools. Use exactly the same variable name everywhere. Resist the temptation to change names to match the naming convention of a particular tool. For example, I once used designName for the name of the top design in Tcl scripts, and DESIGN_NAME in Makefiles. This was extremely inconvenient until I changed to designName everywhere, in violation of the usual Makefile custom of using capitals and underscores for variable names.
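As a minimal sketch, the same spelling now flows through every layer (designName and blockHead are the names used in the Makefile example later in this article):

# In the shell or in a GNU Makefile (the syntax happens to be identical):
export designName=blockHead

# In any Tcl tool script, the identical name comes back out:
#   set topCell $env(designName)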

Flow Steps

You might be tempted to create giant scripts to run EDA tools from start to finish. It is better to separate the flow into shorter steps that perform individual tasks.

  • Each step must be reentrant. This means that you can safely restart the flow at any step. The most basic requirement for reentrance is that the step reads its input from one database at the beginning, and saves its result in a database with a different name at the end (see the sketch after this list).
  • Give each step a short, yet descriptive name. You will type it a lot, and many file names will be based on the step name.
  • Where reasonable and efficient, the inputs and outputs of a step should be in a vendor independent format like OpenAccess, LEF, DEF or Verilog. Still, many steps are best saved as a proprietary database, particularly in place and route. For example, a global routing database will only be read by the tool that wrote it, so go ahead and save it as an Encounter, Volcano or Milkyway database.
  • End steps at points where you have a meaningful, stable result that will allow you to make an informed decision about what to do next. For example, produce a legal placement, create a timing report, and then end the step.
  • The more iterations required to get a step right, the shorter the step should be. An example of this is power routing, where you often tweak your power routing script, run, and observe the results several times before it works correctly.
  • However, starting and stopping CAD tools and reading and writing databases take time and disk space. For the sake of economy, you want fewer steps.
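To make the reentrance rule concrete, here is a sketch of a step script in IC Compiler-style Tcl, using the step names from the Makefile example later in this article. If you use a different tool, substitute its equivalent commands; the point is only that the input and output cells have different names:

# placement.tcl -- a reentrant step
# (assumes the design library is already open, e.g. in a setup script)
open_mw_cel setupTiming       ;# input: the previous step's result
place_opt                     ;# this step's actual work
save_mw_cel -as placement     ;# output: a database with a different name
exit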

When designing your flow steps and your flow in general, learn from others. Refer to existing flows like the TSMC Reference Flow, Cadence Encounter Foundation Flow, Synopsys Lynx Design System or the Voom Flow.

Chip Design Flow Control Architecture

The following diagram shows the main components that make up an IC design flow:

[Figure: IC Design Flow Tool Hierarchy]

  • The flow controller is the boss. Based on instructions from the user, it determines,
    • The steps to be executed based on the state of the design and the goal requested by the user
    • Variable values. It passes these values to the other tools and scripts.
  • The flow controller starts the correct EDA tool, which may be anything from an awk script to a place and route tool
  • If appropriate, the flow controller specifies a script to be executed by the EDA tool
  • If you have a server farm, you will need a job scheduler to distribute the computing load among the servers
  • Larger projects use design management software to manage data versions

Now let’s look at each one of these applications in the tool hierarchy.

Flow Controller

The flow controller manages the execution of flow steps like placement or timing analysis. It determines,

  • The steps that must be executed to bring the design from its current state to the state requested by the user
  • The EDA tool version to use
  • Variable values. The flow controller passes these values to the other tools and scripts.

To restate the first point, the job of the flow controller is to prevent file skew.  This pernicious problem occurs when some prerequisite file, like a library or a verification result, is out of date.  Once it gets into your data, file skew causes subtle errors that necessitate a tremendous amount of rework.

So that a given flow configuration will run in any user’s environment, avoid controlling your flow with user-defined shell environment variables, such as those set in .bashrc. This way, all users get the same result no matter what their personal environment contains.  The UNIX module utility can be used to control versions of EDA tools and other executables through the environment.
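For example, a tool setup script run by the flow might select versions like this (the module names are illustrative; they are defined by whoever administers your modulefiles):

# setupTools.sh -- run by the flow itself, not by each user's .bashrc
module purge                        # start from a known-empty tool environment
module load synopsys-icc/2009.06    # module names are site-specific
module load make/3.81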

There are several common flow controllers that vary in sophistication. As with most things, there is a trade-off between automation and the effort needed to manage the solution. The more engineers who run the flow, the more automatic your flow controller must be.

Simple Shell Script

Many chip designers manually maintain a shell script that executes every step in the flow. After a step completes successfully, the designer comments it out so that re-running the script executes only the remaining incomplete steps. For example, a script might look like this after data preparation is complete, and it is time to do placement:

#! /bin/sh
# Perform physical synthesis with Synopsys IC Compiler
# John McGehee, Voom, Inc. 8/1/09

export flowPath=../../flow

# icc_shell -f ${flowPath}/icc/scr/netlistIn.tcl | tee netlistIn.log
# icc_shell -f ${flowPath}/icc/scr/floorplan.tcl | tee floorplan.log
# icc_shell -f ${flowPath}/icc/scr/pgRoute.tcl | tee pgRoute.log
# icc_shell -f ${flowPath}/icc/scr/setupTiming.tcl | tee setupTiming.log
icc_shell -f ${flowPath}/icc/scr/placement.tcl | tee placement.log
icc_shell -f ${flowPath}/icc/scr/cts.tcl | tee cts.log
# and so on...

When it is time to re-run the entire design flow, the designer simply removes all the comments and runs the script again.

The shell script has the advantage of simplicity—these scripts rarely contain even a single if statement.  It is the user’s job to keep track of the state of the design and prevent file skew.  Since it is difficult to accurately share this state with other users, this flow controller cannot be recommended for multiple users.

Eventually you will outgrow your shell script’s inability to track the state of the design.  When this happens, transform your existing script into a Makefile and get GNU Make to move those comments around for you automatically.

GNU Make

Make has been my favorite flow controller since I first applied it to a design data converter.* Originally developed to help programmers compile and link programs, Make can control any process where a sequence of shell commands must be performed in an arbitrarily complex order. GNU Make is free, open source software that is already installed on virtually every computer used for designing chips.

Make syntax is somewhat arcane, but the GNU Make documentation is excellent. Most programmers (and many chip designers) are familiar with Make, so there is almost certainly someone at your company who can answer your questions. Finally, the ability to use Make is a useful, portable skill that you should have.

Example Makefile for IC Design

Make is controlled by a Makefile that describes the relationships among files and the commands for updating each file. When the data are source files like C++ and Verilog, you can use the actual input and output file names. When the data are design databases, the file names are unpredictable, so it is best to use a semaphore file instead of the actual database files. For example, the shell script flow represented above is described in the Makefile like this:

ICC_SHELL=/path/to/icc_shell
GREPERR=egrep -i '(error|warning|fail)'
# Log summarizer and mailer used in the placement target (see the Tips
# section below); adjust these for your site
LOGSUMMARY=/path/to/logSummary.pl
MAILER=mail
USER_EMAIL=$(USER)@example.com
flowPath=../../flow

designName=blockHead
netlistFileName=../../data/netlist/$(designName).vg
sdcFileName=../../data/timing/$(designName).sdc
floorplanScript=local/floorplan.tcl
pgRouteScript=local/pgRoute.tcl

# Export design parameters so tool scripts can read them from the
# environment (see Tips for Using GNU Make below)
export designName netlistFileName sdcFileName

.PHONY: all tidy clean realclean

# The first target is the default.
# Therefore, streamOut will be done when you type 'make' or 'make all'.
all: streamOut

# Note how the comments are inside the target, so they will be printed
# as the flow runs.
netlistIn: $(netlistFileName)
        ####################################
        # Read the netlist
        # Input file $(netlistFileName), output cell netlistIn.CEL
        $(ICC_SHELL) -f $(flowPath)/icc/scr/netlistIn.tcl | tee netlistIn.log
        -$(GREPERR) netlistIn.log > netlistIn
        touch netlistIn

floorplan: netlistIn $(floorplanScript)
        ####################################
        # Run the commands in $(floorplanScript) to floorplan the block
        # Input cell netlistIn.CEL, output cell floorplan.CEL
        $(ICC_SHELL) -f $(flowPath)/icc/scr/floorplan.tcl | tee floorplan.log
        -$(GREPERR) floorplan.log > floorplan
        touch floorplan

pgRoute: floorplan $(pgRouteScript)
        ####################################
        # Run the commands in $(pgRouteScript) to route power, ground, and other special nets
        # Input cell floorplan.CEL, output cell pgRoute.CEL
        $(ICC_SHELL) -f $(flowPath)/icc/scr/pgRoute.tcl | tee pgRoute.log
        -$(GREPERR) pgRoute.log > pgRoute
        touch pgRoute

setupTiming: pgRoute $(sdcFileName)
        ####################################
        # Read timing constraints
        # Input cell pgRoute.CEL, output cell setupTiming.CEL
        $(ICC_SHELL) -f $(flowPath)/icc/scr/setupTiming.tcl | tee setupTiming.log
        -$(GREPERR) setupTiming.log > setupTiming
        touch setupTiming

placement: setupTiming
        ####################################
        # Perform automatic placement
        # Input cell setupTiming.CEL, output cell placement.CEL
        $(ICC_SHELL) -f $(flowPath)/icc/scr/placement.tcl | tee placement.log
        -$(LOGSUMMARY) placement.log placement.tim | $(MAILER) -s "$(designName) Placement Done" $(USER_EMAIL)
        -$(GREPERR) placement.log > placement
        touch placement

cts: placement
        ####################################
        # Perform clock tree synthesis
        # Input cell placement.CEL, output cell cts.CEL
        $(ICC_SHELL) -f $(flowPath)/icc/scr/cts.tcl | tee cts.log
        -$(GREPERR) cts.log > cts
        touch cts

# And so on, for the complete flow ending with the streamOut target
# that creates Milkyway cell $(designName).CEL, $(designName).FRAM,
# and streams out GDSII file $(designName).gds2 

# Finally, some administrative targets

tidy:
        ####################################
        # Remove annoying junk files, but never any useful files, not even *.log
        rm -f  *.log.??_??_??_?? *.cmd.??_??_??_?? *.TXT core dbSwap* vfPage* vlog*

clean: tidy
        ####################################
        # Remove all the semaphore files to reset the flow in preparation
        # to re-run.  Delete no data, so you can recover if you change
        # your mind.
        rm -f netlistIn floorplan pgRoute setupTiming placement cts # ...

realclean: clean tidy
        ####################################
        # Remove all data created by this flow
        # $(designName) is the library containing all the cells
        rm -rf $(designName)
        rm  -f *.tim *.spf *.SDF *.log *.drc *.lvs \
            *.dump.* *.prsum *.gds2 *.text *.prv \
            *.skew* *.dc *.sdf

With this Makefile in your working directory, you can type,

make floorplan

and Make will execute the commands within the netlistIn and floorplan targets, updating the modification date of the netlistIn and floorplan semaphore files along the way. Then, if you give the command,

make placement

Make will see that the file floorplan exists and is newer than netlistIn, which is newer than $(netlistFileName), so those steps are up-to-date and need not be rerun. The pgRoute, setupTiming and placement targets will be executed and the modification time of the respective semaphore files updated.

Refer to the GNU Make documentation for a thorough explanation of Make and its theory of operation.

Make Automatic Variables

In the example Makefile above, you will notice that the same strings appear repeatedly. I wrote it this way for the sake of clarity, but it is better to use automatic variables. For example, while the commands in the placement target are running, automatic variable $@ is set to placement, and $< is set to setupTiming. Therefore, this target can be rewritten,

placement: setupTiming
        ####################################
        # Perform automatic placement
        # Input cell $<.CEL, output cell $@.CEL
        $(ICC_SHELL) -f $(flowPath)/icc/scr/$@.tcl | tee $@.log
        -$(LOGSUMMARY) $@.log $@.tim | $(MAILER) -s "$(designName) $@ Done" $(USER_EMAIL)
        -$(GREPERR) $@.log > $@
        touch $@

Now, when you make a new target by copying an existing one, there is less to change, and therefore fewer opportunities to make a mistake. It also enforces the rule that the target, script, log file, output cell, and output file should all share the same name.
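You can take this one step further with a GNU Make canned recipe, so the repeated command sequence itself is written only once. This is a sketch; steps with extra commands, like placement’s email notification, still need their own recipes:

# Define the common command sequence once...
define runStep
$(ICC_SHELL) -f $(flowPath)/icc/scr/$@.tcl | tee $@.log
-$(GREPERR) $@.log > $@
touch $@
endef

# ...then use it in each simple step
cts: placement
        $(runStep)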

Tips for Using GNU Make as an IC Design Flow Controller

Finally, let me share a few GNU Make tricks that I find useful:

  • Use touch and rm to restart the flow at an intermediate step. Which one you use depends on your situation:
    • To invalidate a step, remove the target (semaphore file) for it. For example, if you do not like your power/ground routing, fix the power routing script and then do,
      rm pgRoute
      make pgRoute

      to rerun the pgRoute step.

    • If you update the results of a step, touch its target to update its modification time to the present time. For example, suppose that you fix the placement by moving some cell instances. Save the placement database, and declare that this result is the most up-to-date using,
      touch placement
  • Like many UNIX commands, Make has a -n option that prints the commands it would run, without actually running them. Use this to check what will happen before you commit to starting a run.
  • Another handy GNU Make option is -t, which updates all dependencies of the target you specify, without executing the commands within the targets. For example, suppose you are using the Makefile shown above, and floorplan is up-to-date. The command,
    make -t placement

    will update the modification time of files pgRoute, setupTiming, and placement, so that the placement target is up-to-date. make -t can be tricky, so after you use it, check that the flow really is in the expected state using make -n.

  • Make variables that you mark with the export directive are placed in the environment of the shell in which each command runs. This means that you can define your design parameters once in the Makefile and use the same values in your scripts. For example, netlistIn.tcl can and should get the netlist file name from the Tcl env array, as in $env(netlistFileName). A sketch follows this list.
  • The article Debugging Makefiles explains an advanced technique that allows you to see more clearly what will be executed and inspect the value of Makefile variables.
  • You may have noticed the LOGSUMMARY command in the placement target. This is a Perl program that keeps me informed by extracting a summary of the placement result from the log file and emailing it to my mobile phone. Tuscany Design Automation sells a similar application.
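Here is a minimal sketch of the variable-export technique from the list above, showing both sides of the handoff (file and variable names follow the example Makefile):

# Makefile: define the value once and export it to child processes
netlistFileName=../../data/netlist/$(designName).vg
export netlistFileName

# netlistIn.tcl: read it back through the Tcl env array, e.g.
#   set netlistFile $env(netlistFileName)
#   read_verilog $netlistFile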

Other Flow Controllers

If you want commercial software, check out Runtime Design Automation. Their FlowTracer flow controller allows you to easily reorder your steps to optimize your flow. Make is less flexible in this respect because it was designed for a work flow that needs no rearrangement—programmers never want to try linking before compiling.

Some companies build their own proprietary flow controllers. With the solid free and commercial offerings that are available, a custom flow controller sounds like an expensive luxury to me. People have tried to add GUIs to flows, but that does not seem worth the effort, either.

Job Scheduler

Workload management software is used to balance the computing workload across multiple servers. Platform Computing LSF is the industry standard.

Runtime Design Automation offers a unique combination of flow controller and job scheduler that helps you to get the most out of your computing hardware and software licenses.

If you like, start out by submitting jobs interactively, and add job scheduling software later when you have a server farm big enough to take advantage of it. The change will cause little disruption.
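When you do add a scheduler, the change to the flow can be as small as prefixing each tool invocation with a submission command. A sketch using LSF (the queue name is illustrative):

# In the Makefile: route every tool invocation through the scheduler.
# bsub -I runs the job interactively, so the pipe to tee still works.
SUBMIT=bsub -I -q normal

placement: setupTiming
        $(SUBMIT) $(ICC_SHELL) -f $(flowPath)/icc/scr/placement.tcl | tee placement.log
        -$(GREPERR) placement.log > placement
        touch placement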

EDA Tool

This is the software that actually processes the data to construct or verify the chip. This may include simple scripts as well as traditional CAD tools. It is not for me to recommend EDA tools to you, as you probably have your own ideas.

It is the responsibility of the flow controller to launch the correct version of the tool. The UNIX module utility can help control versions of EDA tools and other executables.

Tool Scripts

Many EDA tools are controlled using a Tcl or Python language script. Specify the script on the command line when you launch the tool, as shown in the Flow Controller examples.

The tool hierarchy is getting pretty deep here: the user starts the flow controller, which launches the EDA tool, which executes the tool script. That’s enough. Adding more will only confuse the user. For example, do not do this:

  • In the flow controller, execute a Perl program
  • The Perl program creates an input file and launches an EDA tool

This is much easier to understand:

  • In the flow controller, execute a Perl program to create an input file
  • In the flow controller, launch the EDA tool
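In Makefile terms, both commands appear side by side in the flow controller’s recipe, where everyone can see them (prepNetlist.pl and rawNetlistFileName are hypothetical):

netlistIn: $(rawNetlistFileName)
        ####################################
        # Prepare the netlist, then read it: two visible commands, no wrapper
        perl $(flowPath)/bin/prepNetlist.pl $(rawNetlistFileName) > $(netlistFileName)
        $(ICC_SHELL) -f $(flowPath)/icc/scr/netlistIn.tcl | tee netlistIn.log
        touch netlistIn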

Similarly, avoid calling sub-scripts within scripts. Get the same effect without nesting scripts by performing initialization in the startup file. Organize your base routines into reusable packages or modules.

Any software works best when you use it in the plainest, most ordinary way. Whenever possible, do things the tool’s way rather than your way. User manuals tell you about individual commands, but proven flows like the TSMC Reference Flow, Cadence Encounter Foundation Flow, Synopsys Lynx Design System or the Voom Flow provide the most accurate documentation on how to assemble a working design system.

Perform operations related to a tool in that tool’s native scripting language, on the tool’s native database. Most scripting interfaces give you excellent access to the EDA tool database (beware of tools that do not). Use this powerful feature to,

  • Shorten your script, because the EDA tool contains all kinds of related functionality that will help you. For example, if you want to move a cell, do it and then run placement legalization, as sketched after this list. This sure beats munging DEF with a Perl program.
  • Increase your immunity to changes in new EDA tool versions. Database access APIs are remarkably stable.
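Here is a sketch of that cell-move example in tool Tcl. get_cells is common to Synopsys-style shells; the move and legalize commands go by different names in different tools, so treat all of these as illustrative:

# Fix one instance in the tool's native database, then let the tool
# clean up after us (instance name and coordinates are hypothetical)
set inst [get_cells u_core/u_buf42]
move_objects $inst -to {120.4 88.0}    ;# your tool's move command here
legalize_placement                     ;# let the tool clean up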

However, do not write extensive software that depends on a proprietary format, database or language. This ties your software to a specific tool or vendor that may become unavailable or unreliable. Develop larger CAD applications on the OpenAccess design database or public formats like LEF, DEF, or Verilog so your application will work with any tool.

Keep your scripts and configuration files in a source code version control system like Subversion or CVS. You may find it advantageous to use the same design management system for scripts as well as data.

Acknowledgements

Thanks to David Chinnery, author of Closing the Power Gap between ASIC & Custom: Tools and Techniques for Low Power Design and Closing the Gap Between ASIC & Custom: Tools and Techniques for High-Performance ASIC Design for reviewing this manuscript and offering invaluable suggestions.

* These were the Hitachi FBSL and HUSL data conversion programs. To everyone who brought them up with me at the Hitachi Musashi Works back then: those were good times. Please do get in touch!
