toolflow

A Python-based toolflow to build a Vivado project from a Simulink design, using the CASPER xps library.

A work in progress.

class toolflow.ISEBackend(plat=None, compile_dir='/tmp')[source]
__init__(plat=None, compile_dir='/tmp')[source]
Parameters:
  • plat
  • compile_dir
add_compile_cmds(cores=8, plat=None)[source]

Add the tcl commands for compiling the design, then launch the build tool in batch mode.

compile(cores, plat)[source]
static format_clock_const(c)[source]
static format_const(attribute, val, port, index=None)[source]

Generate a tcl syntax command from an attribute, value and port (with indexing if required)

gen_constraint_file(constraints)[source]

Pass this method a toolflow-standard list of constraints which have already had their physical parameters calculated and it will generate a constraint file and add it to the current project.

get_ucf_const(const)[source]

Pass a single toolflow-standard PortConstraint, and get back a UCF entry to add the constraint to an ISE project.
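As an illustration of the kind of constraint such a method emits, here is a hypothetical sketch of ISE's UCF pin-location syntax (not the shipped implementation; the function name, ports, and pin locations are made up):

```python
def ucf_loc_sketch(port, loc, index=None):
    # Hypothetical sketch: render a pin-location constraint as a UCF
    # line. ISE's UCF indexes vector nets with angle brackets, e.g. gpio<3>.
    net = '%s<%d>' % (port, index) if index is not None else port
    return 'NET "%s" LOC = %s;' % (net, loc)

print(ucf_loc_sketch('sys_clk_n', 'AB12'))    # NET "sys_clk_n" LOC = AB12;
print(ucf_loc_sketch('gpio', 'C7', index=3))  # NET "gpio<3>" LOC = C7;
```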

class toolflow.SimulinkFrontend(compile_dir='/tmp', target='/tmp/test.slx')[source]
__init__(compile_dir='/tmp', target='/tmp/test.slx')[source]
Parameters:
  • compile_dir
  • target
compile_user_ip(update=False)[source]

Compile the user's Simulink design. The resulting netlist should end up in the location already specified in the peripherals file.

Parameters:update (bool) – Update the Simulink model before running System Generator
gen_periph_file(fname='jasper.per')[source]

Generate the peripheral file, i.e., the list of yellow blocks and their parameters.

It also generates the design_info.tab file, which is used to populate the fpg file header.

Parameters:fname (str) – The full path and name to give the peripheral file.
write_git_info_file(fname='git_info.tab')[source]

Get the git info for mlib_devel and the model file and write it to a file.

Parameters:fname (str) – Name of the file to write the git information to.

class toolflow.Toolflow(frontend='simulink', compile_dir='/tmp', frontend_target='/tmp/test.slx', jobs=8)[source]

A class embodying the main functionality of the toolflow. This class is responsible for generating a complete top-level verilog description of a project from a ‘peripherals file’ which encodes information about which IP a user wants instantiated.

The toolflow class can parse such a file, and use it to generate verilog, a list of source files, and a list of constraints. These can be passed off to a toolflow backend to be turned into some vendor-specific platform and compiled. At least, that’s the plan…

__init__(frontend='simulink', compile_dir='/tmp', frontend_target='/tmp/test.slx', jobs=8)[source]

Initialize the toolflow.

Parameters:
  • frontend (str) – Name of the toolflow frontend to use. Currently only simulink is supported
  • compile_dir – Compile directory where build files and logs should go.
build_top()[source]

Copies the base top-level verilog file (which is platform dependent) to the compile directory. Constructs an associated VerilogModule instance ready to be modified.

check_attr_exists(thing, generator)[source]

Lots of methods in this class require that certain attributes have been set by other methods before proceeding. This is probably a symptom of the code being terribly structured. This method checks if an attribute exists and throws an error message if not. In principle it could automatically run the necessary missing steps, but that seems pretty suspect.

Parameters:
  • thing (str) – Attribute to check.
  • generator (str) – Method which can be used to set thing (used for error message only)
check_templates()[source]

Check for any yellow blocks marked with a non-None value of the template_project attribute.

a) Blocks must not have conflicting template_project values.
b) If any of the template_project values are non-None, the specified template_project should be a valid file.
constraints_rule_check()[source]

Check pin constraints against top level signals. Warn about missing constraints.

dump_castro(filename)[source]

Build a ‘standard’ Castro object, which is the interface between the toolflow and the backends.

exec_flow(gen_per=True, frontend_compile=True)[source]

Execute a compile.

Parameters:
  • gen_per (bool) – Have the toolflow frontend generate a fresh peripherals file
  • frontend_compile (bool) – Run the frontend compiler (eg. System Generator)
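The two flags gate the first stages of the compile. A hypothetical sketch of the ordering (the step names are methods documented on this page, but the real exec_flow does much more):

```python
def flow_steps(gen_per=True, frontend_compile=True):
    # Hypothetical ordering sketch; not the shipped implementation.
    steps = []
    if gen_per:
        steps.append('gen_periph_file')    # frontend writes jasper.per
    if frontend_compile:
        steps.append('compile_user_ip')    # e.g. a System Generator run
    steps += ['gen_periph_objs', 'generate_hdl', 'generate_consts']
    return steps

print(flow_steps(gen_per=False, frontend_compile=False))
# ['gen_periph_objs', 'generate_hdl', 'generate_consts']
```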
gen_periph_objs()[source]

Generate a list of yellow blocks from the current peripheral file.

Internally, calls:

  • _parse_periph_file: parses .per file
  • _extract_plat_info: instantiates platform instance

Then calls each yellow block’s constructor. Runs a system-wide drc before returning.

generate_consts()[source]

Compose a list of constraints from each yellow block. Use platform information to generate the appropriate physical realisation of each constraint.

generate_hdl()[source]

Generates a top file for the target platform based on the peripherals file.

Internally, calls:

  • instantiate_periphs: call each yellow block’s mod_top method
  • instantiate_user_ip: add ports to top module based on port entries in peripheral file
  • regenerate_top: rewrite top.v
generate_peripheral_hdl()[source]

Create each yellowblock’s custom hdl files and add them to the project’s sources

generate_xml_ic(memory_map)[source]

Generate the xml interconnect file that represents the top-level AXI4-Lite interconnect for Oxford’s xml2vhdl.

generate_xml_memory_map(memory_map)[source]

Generate xml memory map files that represent each AXI4-Lite interface for Oxford’s xml2vhdl.

regenerate_top()[source]

Generate the verilog for the modified top module. This involves computing the wishbone interconnect / addressing and generating new code for yellow block instances.

write_core_info()[source]
write_core_jam_info()[source]
xml2vhdl()[source]

Function to call Oxford’s python code to generate AXI4-Lite VHDL register interfaces from an XML memory map specification.

Obtained from: https://bitbucket.org/ricch/xml2vhdl/src/master/

class toolflow.ToolflowBackend(plat=None, compile_dir='/tmp')[source]
__init__(plat=None, compile_dir='/tmp')[source]
Parameters:
  • plat
  • compile_dir
add_const_file(constfile)[source]

Add a constraint file to the project via a tcl incantation. Note that in non-project mode, files are not copied; they are read from their source directory. Project mode copies files from their source directory into a new compile directory.

Parameters:constfile
add_source(source, plat)[source]

Add a source file to the project via a tcl incantation. Note that in non-project mode, files are not copied; they are read from their source directory. Project mode copies files from their source directory into a new compile directory.

static calculate_checksum_using_bitstream(bitstream, packet_size=8192)[source]

Sum all the words in the input bitstream and return a checksum, assuming the bitstream has not yet been padded.

Parameters:
  • bitstream – The actual bitstream of the file in question
  • packet_size – max size of image packets that we pad to
Returns:

checksum
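A minimal sketch of such a word-wise sum. The word width (16-bit, big-endian) and the 32-bit truncation are assumptions, not stated in the docs; the point is that zero-padding to a multiple of packet_size afterwards would not change the sum, which is why the checksum is taken before padding:

```python
import struct

def checksum_bitstream(bitstream, packet_size=8192):
    # Sketch only. Assumed: 16-bit big-endian words, sum truncated to
    # 32 bits. Zero-padding the bitstream later leaves this sum intact.
    if len(bitstream) % 2:
        bitstream += b'\x00'
    words = struct.unpack('>%dH' % (len(bitstream) // 2), bitstream)
    return sum(words) & 0xFFFFFFFF

print(checksum_bitstream(b'\x00\x01\x00\x02'))  # 3
```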

compile(core, plat)[source]
Parameters:
  • core
  • plat
gen_constraint_file(constraints)[source]

Pass this method a toolflow-standard list of constraints which have already had their physical parameters calculated and it will generate a constraint file and add it to the current project.

import_from_castro(filename)[source]
initialize()[source]
Parameters:plat
mkfpg(filename_bin, filename_fpg)[source]

This function makes the fpg file header and the final fpg file, which consists of the fpg file header (core_info.tab, design_info.tab and git_info.tab) and the compressed binary file. The fpg file is used to configure the ROACH, ROACH2, MKDIG and SKARAB boards.

Parameters:
  • filename_bin (str) – This is the path and binary file (top.bin) that contains the FPGA programming data.
  • filename_fpg (str) – This is the time-stamped output fpg file name
class toolflow.ToolflowFrontend(compile_dir='/tmp', target='/tmp/test.slx')[source]
__init__(compile_dir='/tmp', target='/tmp/test.slx')[source]
Parameters:
  • compile_dir
  • target
compile_user_ip()[source]

Compile the user IP to a single HDL module.

Return the name of this module.

Should be overridden by each FrontEnd subclass.
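The override contract looks like this (a schematic sketch with made-up class names, not the shipped classes):

```python
class FrontendBase(object):
    # Schematic base class: subclasses must implement the compile step.
    def compile_user_ip(self):
        # Must return the name of the single compiled HDL module.
        raise NotImplementedError('Override in a frontend subclass')

class SketchFrontend(FrontendBase):
    def compile_user_ip(self):
        # A real frontend would launch its compiler (e.g. System
        # Generator) here and return the resulting module name.
        return 'user_ip_top'

print(SketchFrontend().compile_user_ip())  # user_ip_top
```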

gen_periph_file(fname='jasper.per')[source]

Call upon the frontend to generate a jasper-standard file defining peripherals (yellow blocks) present in a model.

This method should be overridden by the specific frontend of choice, and should return the full path to the peripheral file.

Use skip = True to just return the name of the file, without bothering to regenerate it (useful for debugging, and future use cases where a user only wants to run certain steps of a compile)

write_git_info_file(fname='git_info.tab')[source]

Call upon the frontend to generate a git info file containing the git repository information used in the fpg file header. This method is overridden by the SimulinkFrontend class.

class toolflow.VitisBackend(xsa, plat=None, compile_dir='/tmp', periph_objs=None)[source]

Incantations of a Vitis flow

Uses the hardware platform (.xsa) exported from Vivado to generate a software platform. Here we start by building the device tree.

__init__(xsa, plat=None, compile_dir='/tmp', periph_objs=None)[source]
compile()[source]

This will break plain old FPGAs (non-Xilinx-Zynq SoCs). This is a temporary implementation placeholder; it will move to a separate backend class.

The general idea here is to run Vitis (xsct) with the platform hardware (.xsa) to generate the supporting software products needed to create a device tree overlay. Specifically, this method:

  1. makes a jdts dir in the compile_dir to hold the build products
  2. creates an xsct_gogogo.tcl (similar to the Vivado jasper flow) so that it can:
    • build the Xilinx device tree (requiring the xlnx-device-tree repo)
    • allow each peripheral device (yellow block) to add its own xsct commands to generate whatever it needs. In the case of the rfdc, the IP is not explicitly added to the MPSoC and there is no mmap path managed by Vivado, which excludes the rfdc from being auto-magically included in the drivers exported from the Xilinx device tree driver information. However, the IP and its configuration are still present within the .xsa hardware project, so we can instead manually build the device tree to match what the xrfdc driver expects.
  3. runs xsct against the generated xsct_gogogo.tcl file
  4. takes all of the generated products and builds one complete jasper.dtsi device tree overlay description that is later compiled with dtc to produce a compatible overlay.
mkdtbo(dtsi_file, dtbo_file)[source]
class toolflow.VivadoBackend(plat=None, compile_dir='/tmp', periph_objs=None)[source]
__init__(plat=None, compile_dir='/tmp', periph_objs=None)[source]
Parameters:
  • plat
  • compile_dir
  • periph_objs
add_compile_cmds(cores=8, plat=None, synth_strat=None, impl_strat=None, threads='multi')[source]

Add the tcl commands for compiling the design, and then launch vivado in batch mode

add_compile_cmds_pr(cores=8, plat=None, synth_strat=None, impl_strat=None)[source]

Add the tcl commands for compiling the design, and then launch vivado in batch mode

add_const_file(constfile)[source]

Add a constraint file to the project via a tcl incantation. Note that in non-project mode, files are not copied; they are read from their source directory. Project mode copies files from their source directory into a new compile directory.

Parameters:constfile
add_ip(ip)[source]

Add an IP core from a library

add_library(path)[source]

Add a library at <path>

add_source(source, plat)[source]

Add a source file to the project via a tcl incantation. Note that in non-project mode, files are not copied; they are read from their source directory. Project mode copies files from their source directory into a new compile directory.

add_tcl_cmd(cmd, stage='pre_synth')[source]

Add a command to the tcl command list with a trailing newline.
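The stage-keyed command list can be pictured like this (a simplified sketch; the stage names other than 'pre_synth' are illustrative, not taken from the source):

```python
class TclStages:
    # Simplified sketch of a backend's tcl command store.
    def __init__(self):
        self.tcl_cmds = {'pre_synth': [], 'post_synth': [], 'post_impl': []}

    def add_tcl_cmd(self, cmd, stage='pre_synth'):
        # Store the command with its trailing newline, grouped by stage.
        self.tcl_cmds[stage].append(cmd + '\n')

b = TclStages()
b.add_tcl_cmd('synth_design -top top')
b.add_tcl_cmd('write_bitstream top.bit', stage='post_impl')
print(b.tcl_cmds['pre_synth'])  # ['synth_design -top top\n']
```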

compile(cores, plat, synth_strat=None, impl_strat=None, threads='multi')[source]
Parameters:
  • cores
  • plat
  • impl_strat – Implementation Strategy to use when carrying out the implementation run ‘impl’
eval_tcl()[source]
static format_cfg_const(attribute, val)[source]

Generate a configuration tcl syntax command from an attribute and value

static format_clock_const(c)[source]
static format_clock_group_const(c)[source]
static format_const(attribute, val, port, index=None)[source]

Generate a tcl syntax command from an attribute, value and port (with indexing if required)
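For example, a PACKAGE_PIN attribute on one bit of a vector port might render as below. The helper is a hypothetical sketch of such formatting (the XDC `set_property` syntax is Vivado's; the function body is not the shipped one):

```python
def format_const_sketch(attribute, val, port, index=None):
    # Hypothetical sketch: Vivado's XDC indexes vector ports with
    # square brackets, e.g. gpio[3].
    target = '%s[%d]' % (port, index) if index is not None else port
    return 'set_property %s %s [get_ports {%s}]\n' % (attribute, val, target)

print(format_const_sketch('PACKAGE_PIN', 'AB12', 'gpio', index=3))
# set_property PACKAGE_PIN AB12 [get_ports {gpio[3]}]
```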

static format_false_path_const(c)[source]
static format_gen_clock_const(c)[source]
static format_input_delay_const(c)[source]
static format_max_delay_const(c)[source]
static format_min_delay_const(c)[source]
static format_multi_cycle_const(c)[source]
static format_output_delay_const(c)[source]
gen_bd_tcl_cmds()[source]

Allow each yellowblock to generate tcl commands specific to creating a block design

gen_constraint_file(constraints)[source]

Pass this method a toolflow-standard list of constraints which have already had their physical parameters calculated and it will generate a constraint file and add it to the current project.

gen_yellowblock_custom_hdl()[source]

Create each yellowblock’s custom hdl files and add them to the project’s sources

gen_yellowblock_tcl_cmds()[source]

Compose a list of tcl commands from each yellow block. To be added to the final tcl script.

get_tcl_const(const)[source]

Pass a single toolflow-standard PortConstraint, and get back a tcl command to add the constraint to a vivado project.

initialize()[source]
Parameters:plat