Moved install files from runtime back to common; modified makefile to reflect changes
diff --git a/common/README b/common/README
new file mode 100644
index 0000000..decfec1
--- /dev/null
+++ b/common/README
@@ -0,0 +1,584 @@
+open_pdks : A system for installing silicon foundry PDKs for open-source EDA tools
+(also maybe works for installing commercial tools)
+
+----------------------------------------------------------------------------------
+
+Written by Tim Edwards 2019 / 2020 for efabless (efabless.com)
+and Open Circuit Design (opencircuitdesign.com)
+
+----------------------------------------------------------------------------------
+
+Introduction:
+
+ Silicon foundry PDKs are notoriously non-standard, and files obtained
+    from the foundry may end up in any possible configuration of files and
+ folders. In addition, silicon foundries are notorious among open source
+ EDA tool enthusiasts for supplying user setups for commercial EDA tools
+ and all but ignoring open source EDA tools. Open_pdks aims to mitigate
+ the problem by defining a standard layout of files and directories for
+ known open standard formats (e.g., SPICE, verilog, liberty, LEF, etc.)
+ and for various open source EDA tools (e.g., magic, netgen, OpenROAD,
+ klayout) using a Makefile system and a number of conversion scripts to
+ ensure that for any process, all files needed by all EDA tools can be
+ found in predictable locations.
+
+ The scripts aim to be as general-purpose as possible to allow easy
+ adaptation to new tools, formats, and foundries. Where foundry data
+ is intractably unusable, custom install files can be added to overwrite
+ or annotate vendor data as needed.
+
+ Each foundry process is a subdirectory of the open_pdks top level and
+ has its own Makefile. The typical install process is to cd to the
+ foundry top level and run "make" (see below for details).
+
+ The general file structure created by open_pdks is as follows:
+
+ <foundry_root>/
+ <name_of_pdk_variant_1>/
+ <name_of_pdk_variant_2>/
+ ...
+ <name_of_pdk_variant_x>/
+ libs.tech/
+ <name_of_EDA_tool_1>/
+ <name_of_EDA_tool_2>/
+ ...
+ <name_of_EDA_tool_x>/
+ <EDA_tool_setup_files>
+            libs.ref/
+ <name_of_IP_library_1>/
+ <name_of_IP_library_2>/
+ ...
+ <name_of_IP_library_x>/
+ <name_of_file_format_1>
+ <name_of_file_format_2>
+ ...
+ <name_of_file_format_x>
+ <vendor_files>
+
+ Note that this format is very general and does not constrain the
+    supported EDA tools or file formats, so long as there
+ are scripts in the system to provide that support. It is intended
+ that open_pdks can be extended as needed to support new tools or
+ new file formats.
+
+ Current EDA tools supported in this version of open_pdks:
+ Tool Directory name
+ --------------------------
+ ngspice ngspice
+ magic magic
+ netgen netgen
+ klayout klayout
+ qflow qflow
+ openlane openlane
+
+ Current IP library file formats supported in this version of open_pdks*:
+ Format Directory name
+ --------------------------
+ CDL cdl
+ SPICE spice
+ magic mag, maglef
+ LEF lef
+ GDS gds
+ verilog verilog
+ liberty lib
+ PDF** doc
+
+ (* "Supported" meaning expected/handled by conversion scripts;
+ as noted, the install is very general purpose and any name
+ can be used as a target for any vendor or custom files.)
+ (** or HTML or any valid document format, plus supporting files.)
+
+How to use open_pdks:
+
+    There is a seriously limited number of open foundry PDKs.  Those that
+ are known (SkyWater, MOSIS SCMOS) are included in the repository. In
+ other cases (X-Fab XH035, XH018) it is possible to get an extension to
+ open_pdks from a known trusted source through NDA verification with
+ the foundry. In all other cases, foundries should be berated until
+ they agree to support the open_pdks format.
+
+    To the extent possible, open_pdks does not itself keep any foundry
+    data.  Instead, it adapts to the file structure available from
+ whatever system each foundry uses for downloads. Each foundry
+ directory should contain a README file that details how to obtain
+ downloads from the foundry, and what files need to be downloaded.
+ Since the download methods vary wildly, it is up to the user to obtain
+ the foundry data as instructed. The Makefile in the open_pdks foundry
+ directory then needs to be edited to set the correct path to the
+ foundry source data.
+
+ The installation is a bootstrapping process, so needs to be done in
+ stages. The first stage installs setup files for all the EDA tools.
+ The second stage installs IP libraries (e.g., standard cells, padframe
+ I/O, analog circuits) and depends heavily on the use of the open EDA
+ tools themselves to fill in any missing file formats. Therefore the
+ tool setup files need to be installed first, and then the IP libraries.
+ If using a distributed install (see below), then the tool setup files
+ need to be installed and distributed (relocated to the final run-time
+ location) before the IP libraries are installed.
+
+ There are two distinct install types supported by open_pdks:
+
+ (1) Local install: Use a local install when the EDA tools will be run
+ on a single host, and all the PDK data are on the same host.
+
+ The local install sequence is:
+
+ make
+ make install-local Install EDA tool setup
+ make install-vendor-local Install IP libraries
+
+ (2) Distributed install: Use the distributed install when the PDK
+ will be run from multiple hosts, but will be installed into a
+ different location such as a git repo which is then distributed to
+ all hosts, and may not itself reside in the same root directory tree.
+
+ The distributed install sequence is:
+
+ make
+ make install-dist Install EDA tool setup
+ make install-vendor-dist Install IP libraries
+
+ Note that local installs may opt to make symbolic links back to the
+ foundry sources, where possible (see options for foundry_install.py,
+ below). Distributed installs and local installs may also make
+ symbolic links from any PDK variant back to a "master" PDK variant,
+ where possible (that is, where the files are the same). For example,
+ a standard cell library will probably be compatible with all metal
+ back-end stacks, and so only one copy of all the library files is
+ needed in one of the PDK variants. For the other PDK variants, the
+ same files are all symbolic links to the files in the first PDK
+ variant. But an I/O library would have different layouts for different
+ metal back-end stacks, so layout-dependent files like GDS would be
+ different for each PDK, but layout-independent files like verilog
+ might be symbolic links to files in the first PDK.
+
+Prerequisites:
+
+ The following tools/software stacks are needed to run open_pdks:
+
+ python3
+
+ magic opencircuitdesign.com/magic or github.com/RTimothyEdwards
+
+ assumed to be installed and discoverable in the standard
+ search path as defined by the shell (version 8.2+ required)
+
+How to make or update an open PDK:
+
+ The backbone of the open_pdks system is a set of scripts found in the
+ common/ subdirectory. The two main scripts are "preproc.py" and
+ "foundry_install.py", with a host of supporting scripts.
+
+ Creating a new PDK starts with generating a Makefile, which can be
+ done by copying a Makefile from an existing project. The first thing
+ to do is to define the number of PDK variants (usually based on back-end
+ metal stacks available, but can also include front-end options, especially
+ if they are mutually exclusive rather than simply additional masks).
+ Then create the make and make-install targets for local and distributed
+ install, including install (plain), install-vendor, and install-custom.
+ Define the default source and target paths.
+
+ (Needed: A "make makefile" script that generates the "local" and "dist"
+ automatically, and potentially can also make all the different PDK
+ targets automatically, from a much shorter and simpler master Makefile.)
+
+ Create the basic scripts for tools. Since foundries do not support open
+ EDA tools, it is inevitable that these files need to be created by hand
+ unless there is an option to import other formats. Because Magic is used
+ heavily by open_pdks to create missing file formats from other existing
+ file formats, a Magic techfile is critical. Each of the basic scripts
+ will contain #ifdef ... #endif and similar conditionals to allow the
+ script to be parsed for each target PDK variant. Each of these scripts
+ is passed through common/preproc.py to handle the conditionals. Of course,
+ it is possible to make a separate file for each PDK variant as long as the
+ Makefile handles them properly, but use of the preproc.py script allows
+ all the PDK variants to be handled in the same way, simplifying the Makefile.
+
+ --------------------------------------------------------------------------
+ preproc.py Usage:
+
+ preproc.py input_file [output_file] [-D<variable> ...]
+
+ Where <variable> may be a keyword or a key=value pair
+
+ Syntax: Basically like cpp. However, this preprocessor handles
+ only a limited set of keywords, so it does not otherwise mangle
+ the file in the belief that it must be C code. Handling of boolean
+    relations is important, so these are thoroughly defined (see below).
+
+ #if defined(<variable>) [...]
+ #ifdef <variable>
+ #ifndef <variable>
+ #elseif <variable>
+ #else
+ #endif
+
+ #define <variable> [...]
+ #undef <variable>
+
+ #include <filename>
+
+ <variable> may be
+ <keyword>
+ <keyword>=<value>
+
+ <keyword> without '=' is effectively the same as <keyword>=1
+ Lack of a keyword is equivalent to <keyword>=0, in a conditional.
+
+ Boolean operators (in order of precedence):
+ ! NOT
+ && AND
+ || OR
+
+ Comments:
+ Most comments (C-like or Tcl-like) are output as-is. A
+ line beginning with "###" is treated as a preprocessor
+ comment and is not copied to the output.
+
+    Examples:
+ #if defined(X) || defined(Y)
+ #else
+ #if defined(Z)
+ #endif
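The conditional handling described above can be sketched in a few lines of Python. This is a toy reimplementation for illustration only, not the actual common/preproc.py (which also handles #define, #elseif, #include, and the boolean operators):

```python
# Toy sketch of preproc.py-style conditional handling: only #ifdef,
# #ifndef, #else, #endif, and "###" comments are handled here.
def preprocess(lines, defines):
    output = []
    stack = [True]                # stack of "currently emitting" flags
    for line in lines:
        tokens = line.split()
        if line.lstrip().startswith('###'):
            continue              # preprocessor comment: never copied out
        elif tokens[:1] == ['#ifdef']:
            stack.append(stack[-1] and tokens[1] in defines)
        elif tokens[:1] == ['#ifndef']:
            stack.append(stack[-1] and tokens[1] not in defines)
        elif tokens[:1] == ['#else']:
            stack[-1] = (not stack[-1]) and stack[-2]
        elif tokens[:1] == ['#endif']:
            stack.pop()
        elif stack[-1]:
            output.append(line)
    return output
```

For example, preprocess(['#ifdef X', 'a', '#else', 'b', '#endif'], {'X'}) keeps only 'a'.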
+
+ --------------------------------------------------------------------------
+
+ The script common/foundry_install.py handles all the IP library processing
+ and installation. It generates the local directory structure and populates
+ the directories with foundry vendor data, and filters or otherwise uses
+ open EDA tools to generate missing standard file formats or create file
+ formats needed by the open EDA tools.
+
+ foundry_install.py Usage:
+
+ foundry_install.py [option [option_arguments]] ...
+
+ All options begin with "-" and may be followed by one or more
+ arguments (that do not begin with "-"). The foundry_install.py
+ script may be called multiple times, although it is best to
+ group together all files for the installation of an IP library,
+ since the options given will be used to determine what files are
+ missing and need to be generated.
+
+ Global options:
+ -link_from <type>
+ Make symbolic links to vendor files from target
+ Types are: "none", "source", or a PDK name.
+ Default "none" (copy all files from source)
+ -source <path>
+ Path to source data top level directory
+ -target <path>
+ Path to target top level directory
+ -local <path>
+ For distributed installs, this is the local
+ path to target top level directory.
+
+ -library <type> <name>
+ The install target is an IP library with
+ name <name>.
+ -ef_format
+ Use the original efabless format for file
+ installs. This has several differences from
+            the non-efabless install.  The most important
+ is that the order of directories for IP libraries
+ is <file_format>/<library_name> instead of
+ <library_name>/<file_format>. As the efabless
+            platform migrates to the developing open_pdks
+ standard, this use should eventually be
+ deprecated. In open_pdks, the option is set
+ from the EF_FORMAT variable setting in the Makefile.
+
+ All other options represent installation into specific directories.
+ The primary rule is that if foundry_install.py is passed an option
+ "-library" (see syntax below), then all other non-global options
+ represent subdirectories of the IP library, given the same name as
+ the option word following the "-". If the foundry_install.py command
+ line does not have an option "-library", then all non-global options
+ represent per-EDA tool subdirectories, where the name of the subdirectory
+ is the same as the option word following the "-".
+
+ Each tool install option has the syntax:
+
+ -<tool_name> <path> [<option_arguments>]
+
+ Each IP library install option has the syntax:
+
+ -<file_format_name> <path> [<option_arguments>]
+
+ The <path> is a directory path that is relative to the path prefix
+ given by the -source option. The path may be wildcarded with the
+ character "*". The specific text "/*/" is always replaced by the
+ name of the IP library (if "-library" is an option). Otherwise,
+ "*" has the usual meaning of matching any characters in a name
+ (see python glob.glob() command for reference).
+
+ (Note that the INSTALL variable in the Makefile starts with "set -f"
+ to suppress the OS from doing wildcard substitution; otherwise the
+ wildcards in the install options will get expanded by the OS before
+ being passed to the install script.)
+
+ In some cases, it may be required to process an option like "compile"
+ (see below) on files already in the target path without adding any
+ source files. In that case, <path> may be any keyword that does not
+ point to a valid directory; "none" is a recommended choice.
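The path handling described above can be sketched as follows; this is an illustration of the documented behavior, not the actual foundry_install.py code:

```python
# Sketch of source-path resolution: the literal component "/*/" is
# replaced by the IP library name (when -library is given), and any
# remaining "*" wildcards are expanded with glob.
import glob
import os

def resolve_source_paths(source_root, path, libname=None):
    if libname:
        path = path.replace('/*/', '/' + libname + '/')
    return sorted(glob.glob(os.path.join(source_root, path)))
```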
+
+ Library option:
+
+ -library <type> <name> [<target>]
+
+ <type> may be one of the following:
+
+ digital Digital standard cells
+ primitive Primitive devices
+ general All others
+
+ Analog and I/O libraries fall under the category "general".
+
+ <name> is the vendor name of the library.
+
+ [<target>] is the (optional) local name of the library. If omitted,
+ then the vendor name is used for the target (there is no particular
+ reason to specify a different local name for a library).
+
+ Any number of libraries may be supported, and one "-library" option
+ may be provided for each supported library. The use of multiple
+ libraries for a single run of foundry_install.py only works if the
+ formats (gds, cdl, lef, etc.) happen to all work with the same wildcards.
+ But it is generally most common to declare only one library name per
+ call to foundry_install.py.
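For illustration, a foundry Makefile might invoke the script roughly as follows. All paths, all variable names other than INSTALL, and the library name are hypothetical:

```make
# Hypothetical fragment: install one standard cell library locally.
# "set -f" keeps the shell from expanding the wildcards itself, so
# they reach foundry_install.py intact (see the note above).
INSTALL = set -f ; ../common/foundry_install.py

install-vendor-local:
	$(INSTALL) -source $(SOURCE_PATH) -target $(TARGET_PATH) \
		-library digital example_stdcells \
		-gds gds/*/*.gds \
		-cdl cdl/*/*.cdl compile \
		-lef lef/*/*.lef
```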
+
+ Common foundry_install.py options when used with "-library":
+
+ -techlef <path> [option_arguments] Technology LEF file
+ -doc <path> [option_arguments] library documentation
+ -lef <path> [option_arguments] LEF file
+ -spice <path> [option_arguments] SPICE netlists
+ -cdl <path> [option_arguments] CDL netlists
+ -lib <path> [option_arguments] Liberty timing files
+ -gds <path> [option_arguments] GDS layout data
+ -verilog <path> [option_arguments] Verilog models
+
+ Any name can be used after the "-" and the installation of files
+ will be made into a directory of that name, which will be created
+ if it does not exist. The names used above are preferred, for
+ the sake of compatibility between EDA tools.
+
+ Of special note is "techlef", as technology LEF files are often
+ associated with a PDK and not an IP library. In this system,
+ the technology LEF file should be associated with each standard
+ cell library for which it is intended.
+
+ [option_arguments] may be one of the following:
+
+ up <number>
+ Any tool option can use this argument to indicate that
+ the source hierarchy should be copied entirely, starting
+ from <number> levels above the files indicated by <path>.
+ For example, if liberty files are kept in multiple
+ directories according to voltage level, then
+
+ -liberty x/y/z/PVT_*/*.lib
+
+ would install all .lib files directly into
+ libs.ref/<libname>/liberty/*.lib while
+
+ -liberty x/y/z/PVT_*/*.lib up 1
+
+ would install all .lib files into
+                libs.ref/<libname>/liberty/PVT_*/*.lib.
+
+ nospec
+ Remove timing specification before installing (used with
+ verilog files only; could be extended to liberty files).
+
+ compile
+ Create a single library from all components. Used when a
+ foundry library has inconveniently split an IP library
+ (LEF, CDL, verilog, etc.) into individual files.
+
+ compile-only
+ Same as argument "compile", except that the individual
+ files are not copied to the target; only the compiled
+ library is created.
+
+ stub
+ Remove contents of subcircuits from CDL and SPICE netlist,
+                or verilog files.  This lets LVS and other tools
+                know the order of pins in a circuit (for CDL or SPICE),
+ or simply to ignore the contents of the file (any format)
+ so that the circuit in question is treated as a "black box".
+
+ priv
+                Mark the contents being installed as privileged, and put
+ them in a separate root directory libs.priv where they
+ can be given additional read/write restrictions.
+
+ filter <script_file_path>
+ Process all files through the script <script_file_path>,
+ which is given as a relative path to the directory
+ containing the Makefile. The filter script traditionally
+ is put in local subdirectory custom/scripts/. The filter
+ script should be written to take a single argument, which
+ is the path to a file, and process that file, and overwrite
+ the file with the result. Commonly used filters are found
+ in the common/ directory. See common/fixspice.py for an
+ example.
+
+ noclobber
+ Mainly diagnostic. When specified, any temporary files
+ used during installation will be retained instead of
+ deleted after use. This includes, for example, scripts
+ passed to magic for running extraction or file format
+ generation. It is useful when debugging problems with
+ the install.
+
+ anno
+ Currently only supported for LEF files. This argument
+ indicates that the vendor LEF files should be used only
+ for annotating GDS input with port location information,
+ but the LEF files themselves should not be installed.
+
+ noconvert
+ Install files from source to target, but do not perform
+ any additional conversions (such as CDL to SPICE, or
+ GDS or LEF to magic).
+
+ ignore=<keyword>[,...]
+ Specifically for CDL and SPICE netlists, ignore any
+ parameter found matching <keyword>
+
+ rename=<new-name>
+ For single files copied from source to target, the
+ target file should be named <new-name> and not be
+ given the same name as the source file. When used
+ with the "compile" or "compile-only" options, then
+ the compiled file gets the name <new-name> rather
+ than taking the name of the library.
+
+ exclude=<file>[,...]
+ When using "compile" or "compile-only", exclude any
+ file in the target directory matching the name <file>.
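The effect of the "up <number>" argument on target paths can be sketched as follows (illustrative only, not the actual foundry_install.py logic):

```python
# Sketch of "up <number>": keep the last (up + 1) components of the
# source file path when forming the install target under libs.ref.
import os

def target_path(target_root, libname, fmtname, srcfile, up=0):
    parts = srcfile.split('/')
    kept = '/'.join(parts[-(up + 1):])
    return os.path.join(target_root, 'libs.ref', libname, fmtname, kept)
```

With up=1, a source file "x/y/z/PVT_tt/cells.lib" keeps its PVT_tt directory level in the target; with the default up=0, only the file name is kept.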
+
+ File conversions handled by foundry_install.py:
+
+ The following file format conversions can be done automatically by
+ foundry_install.py:
+
+ CDL to SPICE: A CDL netlist or library can be converted to a
+ general-purpose SPICE netlist that can be read
+ by any tool that can read Berkeley SPICE 3f5
+ syntax.
+
+ GDS to LEF: An abstract view can be generated from a full
+ layout view using Magic.
+
+ GDS to SPICE: In the absence of any netlist, Magic will
+ extract a SPICE netlist from a full layout.
+
+ SPICE (any) to SPICE (ngspice): The fixspice.py script will
+ attempt to convert any SPICE model file,
+ cell library, or netlist to a form that is
+ compatible with ngspice version 30.
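As a hedged illustration of the "filter" argument described earlier: a filter script takes one file path, transforms the contents, and overwrites the file in place. The substitution below (renaming a hypothetical vendor device model) is invented for the example; see common/fixspice.py for a real filter.

```python
#!/usr/bin/env python3
# Example custom filter script: called with one argument (a file path),
# it rewrites the file in place.  The renamed model is hypothetical.
import re
import sys

def filter_file(path):
    with open(path, 'r') as ifile:
        text = ifile.read()
    # Example edit: rename a vendor model to the name the tools expect.
    text = re.sub(r'\bnfet_vendor\b', 'nfet', text)
    with open(path, 'w') as ofile:
        ofile.write(text)

if __name__ == '__main__':
    filter_file(sys.argv[1])
```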
+
+ open_pdks additional Makefile notes:
+
+ The "make install-local" ("make install-dist") step is generally
+ broken into individual make sections, one for each tool (e.g.,
+ magic, netgen, klayout). There is an additional section called
+ "general" which installs a ".config" directory at the PDK top
+ level, containing a file "nodeinfo.json" which has general
+ information about the PDK that may be used by any tool that
+ understands the key:value pairs used in the JSON file. Keys used
+ are as follows:
+
+ foundry : Short name of the foundry, equal to the foundry
+ directory root, above the PDK variants.
+ foundry-name : Long name of the foundry.
+ node : The name of the PDK variant
+ feature-size : The foundry process feature size (e.g., 130nm)
+ status : "active" or "inactive". May be used by tools
+ to present or hide specific PDK variants.
+ description : Long text description of the process variant
+ (e.g., 6-metal stack + MiM caps)
+ options : List of options, corresponding to the definitions
+ used in the Makefile and passed to preproc.py.
+ stdcells : List of standard cell libraries available for this
+ PDK variant.
+ iocells : List of I/O pad cell libraries available for this
+ PDK variant.
+
+ Note that the JSON file is, like other EDA tool setup files, usually a
+ master file that is parsed by preproc.py; therefore when specifying
+ "options", use #undef before specifying each option name so that the
+ option name itself is ignored by the pre-processor.
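For illustration, a nodeinfo.json for a hypothetical single-variant PDK might look like this (all names and values invented):

```json
{
    "foundry": "XYZ",
    "foundry-name": "XYZ Semiconductor",
    "node": "xyzA",
    "feature-size": "130nm",
    "status": "active",
    "description": "6-metal stack + MiM caps",
    "options": ["METALS6", "MIM"],
    "stdcells": ["xyz_stdcells"],
    "iocells": ["xyz_io"]
}
```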
+
+
+Goals of the open_pdks project:
+
+ The intended goal of open_pdks is to be able to support as many open source
+ EDA tools as practical, and to be able to generate all needed files for
+ those tools from any sufficiently complete set of vendor files.
+
+    A number of file conversions are not available but would be useful to have:
+
+ SPICE to liberty: Create timing files by running simulations
+ on SPICE netlists using ngspice.
+
+ liberty to verilog: Use the function statements in liberty
+ format to create verilog primitives. Maybe
+                use liberty timing information to generate
+                verilog specify sections.
+
+ verilog to liberty: Reverse of the above. Use verilog logic
+ tables and specify sections to generate liberty
+ functions and timing tables.
+
+ File formats that need to be supported:
+
+ Schematic and symbol: There are few standards, so either everyone
+ needs to agree on a good format to use, or there
+ needs to be a lot of scripts to do conversions
+ between formats. Open EDA tools that could be
+ supported include:
+
+ electric, xcircuit, kicad, sue2
+
+ Other open source EDA tools that need to be supported:
+
+ OpenROAD
+ Coriolis2
+ (add more here. . .)
+
+ Commercial EDA tools can potentially be supported under this same system,
+ provided sufficient compatibility with the file system structure.
+
+ Other scripts needed:
+
+ Project setup script: It would be useful to define a "standard
+ project file structure" that is similar to the standard PDK file
+ structure defined in open_pdks. The preferred project setup
+ based on the efabless model is:
+
+ <project_name>
+ .config/
+ techdir (symbolic link to open_pdks PDK)
+ project.json (information file for tools)
+ <tool_name> (magic, qflow, ngspice, etc.) or
+ <format_name> (spice, gds, verilog, etc.)
+
+ In general, <tool_name> directories are intended to be workspaces
+ for specific EDA tools (and may have their own nested hierarchies;
+ e.g., qflow/<digital_block>/source,synthesis,layout) while
+ <format_name> is a place to keep (final) files of a specific format,
+ with the intention that any project can easily be made into an
+ IP library and folded into the open_pdks scheme with little effort.
+
+ The project.json file contains project information that can be used
+ by a script to build a setup for any EDA tool. One goal of the
+ project.json file is to define "datasheet" (documented elsewhere)
+ that can be used to drive characterization simulations and create
+ a datasheet for the project. Field "ip-name" of "datasheet" is
+ the canonical name of the project, which can be distinguished from
+ the project directory top-level name, such that the project can be
+ moved or copied without affecting the tool flows.
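A minimal illustrative project.json showing only the "ip-name" field described above (the project name is invented; the full "datasheet" schema is documented elsewhere):

```json
{
    "datasheet": {
        "ip-name": "example_project"
    }
}
```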
diff --git a/common/create_gds_library.py b/common/create_gds_library.py
new file mode 100755
index 0000000..b02535f
--- /dev/null
+++ b/common/create_gds_library.py
@@ -0,0 +1,218 @@
+#!/usr/bin/env python3
+#
+# create_gds_library.py
+#
+#----------------------------------------------------------------------------
+# Given a destination directory holding individual GDS files of a number
+# of cells, create a single GDL library file named <alllibname> and place
+# it in the same directory. This is done for the option "compile" if specified
+# for the "-gds" install.
+#----------------------------------------------------------------------------
+
+import os
+import sys
+import glob
+import fnmatch
+import subprocess
+
+#----------------------------------------------------------------------------
+
+def usage():
+ print('')
+ print('Usage:')
+ print(' create_gds_library <destlibdir> <destlib> <startup_script> ')
+ print(' [-compile-only] [-excludelist="file1,file2,..."] [-keep]')
+ print('')
+ print('Create a single GDS library from a set of individual GDS files.')
+ print('')
+ print('where:')
+ print(' <destlibdir> is the directory containing the individual GDS files')
+ print(' <destlib> is the root name of the library file')
+ print(' <startup_script> is the full path to a magic startup script')
+    print('    -compile-only removes the individual files if specified')
+ print(' -excludelist= is a comma-separated list of files to ignore')
+ print(' -keep keep the Tcl script used to generate the library')
+ print('')
+
+#----------------------------------------------------------------------------
+
+def create_gds_library(destlibdir, destlib, startup_script, do_compile_only=False, excludelist=[], keep=False):
+
+ # destlib should not have a file extension
+ destlibroot = os.path.splitext(destlib)[0]
+
+ alllibname = destlibdir + '/' + destlibroot + '.gds'
+ if os.path.isfile(alllibname):
+ os.remove(alllibname)
+
+ # If file "filelist.txt" exists in the directory, get the list of files from it
+ if os.path.exists(destlibdir + '/filelist.txt'):
+ with open(destlibdir + '/filelist.txt', 'r') as ifile:
+ rlist = ifile.read().splitlines()
+ glist = []
+ for rfile in rlist:
+ glist.append(destlibdir + '/' + rfile)
+ else:
+ glist = glob.glob(destlibdir + '/*.gds')
+ glist.extend(glob.glob(destlibdir + '/*.gdsii'))
+ glist.extend(glob.glob(destlibdir + '/*.gds2'))
+
+ if alllibname in glist:
+ glist.remove(alllibname)
+
+ # Create exclude list with glob-style matching using fnmatch
+ if len(glist) > 0:
+ glistnames = list(os.path.split(item)[1] for item in glist)
+ notglist = []
+ for exclude in excludelist:
+ notglist.extend(fnmatch.filter(glistnames, exclude))
+
+ # Apply exclude list
+ if len(notglist) > 0:
+ for file in glist[:]:
+ if os.path.split(file)[1] in notglist:
+ glist.remove(file)
+
+ if len(glist) > 1:
+ print('New file is: ' + alllibname)
+
+ if os.path.isfile(startup_script):
+ # If the symbolic link exists, remove it.
+ if os.path.isfile(destlibdir + '/.magicrc'):
+ os.remove(destlibdir + '/.magicrc')
+ os.symlink(startup_script, destlibdir + '/.magicrc')
+
+ # A GDS library is binary and requires handling in Magic
+ print('Creating magic generation script to generate GDS library.')
+ with open(destlibdir + '/generate_magic.tcl', 'w') as ofile:
+ print('#!/usr/bin/env wish', file=ofile)
+ print('#--------------------------------------------', file=ofile)
+ print('# Script to generate .gds library from files ', file=ofile)
+ print('#--------------------------------------------', file=ofile)
+ print('drc off', file=ofile)
+ print('gds readonly true', file=ofile)
+ print('gds flatten true', file=ofile)
+ print('gds rescale false', file=ofile)
+ print('tech unlock *', file=ofile)
+
+ for gdsfile in glist:
+ print('gds read ' + gdsfile, file=ofile)
+
+ print('puts stdout "Creating cell ' + destlibroot + '"', file=ofile)
+ print('load ' + destlibroot, file=ofile)
+ print('puts stdout "Adding cells to library"', file=ofile)
+ print('box values 0 0 0 0', file=ofile)
+ for gdsfile in glist:
+ gdsroot = os.path.split(gdsfile)[1]
+ gdsname = os.path.splitext(gdsroot)[0]
+ print('getcell ' + gdsname, file=ofile)
+ # Could properly make space for the cell here. . .
+ print('box move e 200', file=ofile)
+
+ print('puts stdout "Writing GDS library ' + destlibroot + '"', file=ofile)
+ print('gds library true', file=ofile)
+ print('gds write ' + destlibroot, file=ofile)
+ print('puts stdout "Done."', file=ofile)
+ print('quit -noprompt', file=ofile)
+
+ # Run magic to read in the individual GDS files and
+ # write out the consolidated GDS library
+
+ print('Running magic to create GDS library.')
+ sys.stdout.flush()
+
+ mproc = subprocess.run(['magic', '-dnull', '-noconsole',
+ destlibdir + '/generate_magic.tcl'],
+ stdin = subprocess.DEVNULL,
+ stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, cwd = destlibdir,
+ universal_newlines = True)
+
+ if mproc.stdout:
+ for line in mproc.stdout.splitlines():
+ print(line)
+ if mproc.stderr:
+ print('Error message output from magic:')
+ for line in mproc.stderr.splitlines():
+ print(line)
+ if mproc.returncode != 0:
+ print('ERROR: Magic exited with status ' + str(mproc.returncode))
+        if do_compile_only:
+ print('Compile-only: Removing individual GDS files')
+ for gfile in glist:
+ if os.path.isfile(gfile):
+ os.remove(gfile)
+ if not keep:
+ os.remove(destlibdir + '/generate_magic.tcl')
+ else:
+        print('Fewer than two files (' + str(glist) + '); ignoring "compile" option.')
+
+#----------------------------------------------------------------------------
+
+if __name__ == '__main__':
+
+ if len(sys.argv) == 1:
+ usage()
+ sys.exit(0)
+
+ argumentlist = []
+
+ # Defaults
+ do_compile_only = False
+ keep = False
+ excludelist = []
+
+ # Break arguments into groups where the first word begins with "-".
+ # All following words not beginning with "-" are appended to the
+ # same list (optionlist). Then each optionlist is processed.
+ # Note that the first entry in optionlist has the '-' removed.
+
+ for option in sys.argv[1:]:
+ if option.find('-', 0) == 0:
+ keyval = option[1:].split('=')
+            if keyval[0] == 'compile-only':
+                if len(keyval) > 1:
+                    if keyval[1].lower() == 'true' or keyval[1].lower() == 'yes' or keyval[1] == '1':
+                        do_compile_only = True
+                else:
+                    do_compile_only = True
+            elif keyval[0] == 'exclude' or keyval[0] == 'excludelist':
+                if len(keyval) > 1:
+                    excludelist = keyval[1].strip('"').split(',')
+                else:
+                    print("No items in exclude list (ignoring).")
+            elif keyval[0] == 'keep':
+                keep = True
+            else:
+                print("Unknown option '" + keyval[0] + "' (ignoring).")
+ else:
+ argumentlist.append(option)
+
+ if len(argumentlist) < 3:
+ print("Not enough arguments given to create_gds_library.py.")
+ usage()
+ sys.exit(1)
+
+ destlibdir = argumentlist[0]
+ destlib = argumentlist[1]
+ startup_script = argumentlist[2]
+
+ print('')
+ print('Create GDS library from files:')
+ print('')
+ print('Path to files: ' + destlibdir)
+ print('Name of compiled library: ' + destlib + '.gds')
+ print('Path to magic startup script: ' + startup_script)
+    print('Remove individual files: ' + ('Yes' if do_compile_only else 'No'))
+ if len(excludelist) > 0:
+ print('List of files to exclude: ')
+ for file in excludelist:
+ print(file)
+    print('Keep generating script: ' + ('Yes' if keep else 'No'))
+ print('')
+
+ create_gds_library(destlibdir, destlib, startup_script, do_compile_only, excludelist, keep)
+ print('Done.')
+ sys.exit(0)
+
+#----------------------------------------------------------------------------
diff --git a/common/create_lef_library.py b/common/create_lef_library.py
new file mode 100755
index 0000000..4a46e5f
--- /dev/null
+++ b/common/create_lef_library.py
@@ -0,0 +1,173 @@
+#!/usr/bin/env python3
+#
+# create_lef_library.py
+#
+#----------------------------------------------------------------------------
+# Given a destination directory holding individual LEF files of a number
+# of cells, create a single LEF library file named <alllibname> and place
+# it in the same directory. This is done for the option "compile" if specified
+# for the "-lef" install.
+#----------------------------------------------------------------------------
+
+import sys
+import os
+import glob
+import fnmatch
+
+#----------------------------------------------------------------------------
+
+def usage():
+ print('')
+ print('Usage:')
+ print(' create_lef_library <destlibdir> <destlib> [-compile-only]')
+ print(' [-excludelist="file1,file2,..."]')
+ print('')
+ print('Create a single LEF library from a set of individual LEF files.')
+ print('')
+ print('where:')
+ print(' <destlibdir> is the directory containing the individual LEF files')
+ print(' <destlib> is the root name of the library file')
+ print(' -compile-only remove the individual files if specified')
+ print(' -excludelist= is a comma-separated list of files to ignore')
+ print('')
+
+#----------------------------------------------------------------------------
+
+def create_lef_library(destlibdir, destlib, do_compile_only=False, excludelist=[]):
+
+ # destlib should not have a file extension
+ destlibroot = os.path.splitext(destlib)[0]
+
+ alllibname = destlibdir + '/' + destlibroot + '.lef'
+ if os.path.isfile(alllibname):
+ os.remove(alllibname)
+
+ print('Diagnostic: Creating consolidated LEF library ' + destlibroot + '.lef')
+
+ # If file "filelist.txt" exists in the directory, get the list of files from it
+ if os.path.exists(destlibdir + '/filelist.txt'):
+ with open(destlibdir + '/filelist.txt', 'r') as ifile:
+ rlist = ifile.read().splitlines()
+ llist = []
+ for rfile in rlist:
+ llist.append(destlibdir + '/' + rfile)
+ else:
+ llist = glob.glob(destlibdir + '/*.lef')
+
+ if alllibname in llist:
+ llist.remove(alllibname)
+
+ # Create exclude list with glob-style matching using fnmatch
+ if len(llist) > 0:
+ llistnames = list(os.path.split(item)[1] for item in llist)
+ notllist = []
+ for exclude in excludelist:
+ notllist.extend(fnmatch.filter(llistnames, exclude))
+
+ # Apply exclude list
+ if len(notllist) > 0:
+ for file in llist[:]:
+ if os.path.split(file)[1] in notllist:
+ llist.remove(file)
+
+ if len(llist) > 1:
+ print('New file is: ' + alllibname)
+ with open(alllibname, 'w') as ofile:
+ headerdone = False
+ for lfile in llist:
+ with open(lfile, 'r') as ifile:
+ # print('Adding ' + lfile + ' to library.')
+ ltext = ifile.read()
+ llines = ltext.splitlines()
+ headerseen = False
+ for lline in llines:
+ if headerdone:
+ if not headerseen:
+ if not lline.startswith('MACRO'):
+ continue
+ else:
+ headerseen = True
+ ltok = lline.split()
+ if len(ltok) > 1 and ltok[0] == 'END' and ltok[1] == 'LIBRARY':
+ # Remove "END LIBRARY" line from individual files
+ pass
+ else:
+ print(lline, file=ofile)
+ headerdone = True
+ print('#--------EOF---------\n', file=ofile)
+
+ # Add "END LIBRARY" to the end of the library file
+ print('', file=ofile)
+ print('END LIBRARY', file=ofile)
+
+ if do_compile_only == True:
+ print('Compile-only: Removing individual LEF files')
+ for lfile in llist:
+ if os.path.isfile(lfile):
+ os.remove(lfile)
+ else:
+ print('Only one file (' + str(llist) + '); ignoring "compile" option.')
+
+#----------------------------------------------------------------------------
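The concatenation rule implemented above (keep the header only from the first file, keep only MACRO...END blocks from later files, drop per-file "END LIBRARY" lines, and append a single "END LIBRARY" at the end) can be sketched in miniature. The helper below is an illustration only, not the script's actual code:

```python
def merge_lef(texts):
    # Sketch of the LEF concatenation rule used by create_lef_library():
    # header from the first file only, MACRO blocks from all files,
    # one trailing END LIBRARY.
    out = []
    for i, text in enumerate(texts):
        keep = (i == 0)              # keep the header only from the first file
        for line in text.splitlines():
            if line.startswith('MACRO'):
                keep = True          # macro definitions always pass through
            if line.split()[:2] == ['END', 'LIBRARY']:
                continue             # per-file END LIBRARY lines are dropped
            if keep:
                out.append(line)
    out.append('END LIBRARY')
    return '\n'.join(out)

a = 'VERSION 5.7 ;\nMACRO inv\nEND inv\nEND LIBRARY'
b = 'VERSION 5.7 ;\nMACRO buf\nEND buf\nEND LIBRARY'
print(merge_lef([a, b]))
```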
+
+if __name__ == '__main__':
+
+ if len(sys.argv) == 1:
+ usage()
+ sys.exit(0)
+
+ argumentlist = []
+
+ # Defaults
+ do_compile_only = False
+ excludelist = []
+
+ # Break arguments into groups where the first word begins with "-".
+ # All following words not beginning with "-" are appended to the
+ # same list (optionlist). Then each optionlist is processed.
+ # Note that the first entry in optionlist has the '-' removed.
+
+ for option in sys.argv[1:]:
+ if option.find('-', 0) == 0:
+ keyval = option[1:].split('=')
+ if keyval[0] == 'compile-only':
+ if len(keyval) > 1:
+ if keyval[1].lower() == 'true' or keyval[1].lower() == 'yes' or keyval[1] == '1':
+ do_compile_only = True
+ else:
+ do_compile_only = True
+ elif keyval[0] == 'exclude' or keyval[0] == 'excludelist':
+ if len(keyval) > 1:
+ excludelist = keyval[1].strip('"').split(',')
+ else:
+ print("No items in exclude list (ignoring).")
+ else:
+ print("Unknown option '" + keyval[0] + "' (ignoring).")
+ else:
+ argumentlist.append(option)
+
+ if len(argumentlist) < 2:
+ print("Not enough arguments given to create_lef_library.py.")
+ usage()
+ sys.exit(1)
+
+ destlibdir = argumentlist[0]
+ destlib = argumentlist[1]
+
+ print('')
+ print('Create LEF library from files:')
+ print('')
+ print('Path to files: ' + destlibdir)
+ print('Name of compiled library: ' + destlib + '.lef')
+ print('Remove individual files: ' + ('Yes' if do_compile_only else 'No'))
+ if len(excludelist) > 0:
+ print('List of files to exclude: ')
+ for file in excludelist:
+ print(file)
+ print('')
+
+ create_lef_library(destlibdir, destlib, do_compile_only, excludelist)
+ print('Done.')
+ sys.exit(0)
+
+#----------------------------------------------------------------------------
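The argument-grouping convention described in the `__main__` blocks above (options begin with "-" and are split on "="; everything else is positional) can be sketched on its own. This is an illustration of the convention, not the scripts' actual parser:

```python
def parse_options(args):
    # Split "-key=value" options from positional arguments, the way the
    # create_*_library.py scripts do. Value-less options are flagged True.
    opts = {}
    positional = []
    for arg in args:
        if arg.startswith('-'):
            keyval = arg[1:].split('=')
            opts[keyval[0]] = keyval[1] if len(keyval) > 1 else True
        else:
            positional.append(arg)
    return opts, positional

opts, args = parse_options(['libdir', 'mylib', '-compile-only', '-exclude=a.lef,b.lef'])
print(opts)
print(args)
```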
diff --git a/common/create_lib_library.py b/common/create_lib_library.py
new file mode 100755
index 0000000..1f058c7
--- /dev/null
+++ b/common/create_lib_library.py
@@ -0,0 +1,169 @@
+#!/usr/bin/env python3
+#
+# create_lib_library.py
+#
+#----------------------------------------------------------------------------
+# Given a destination directory holding individual liberty files of a number
+# of cells, create a single liberty library file named <alllibname> and place
+# it in the same directory. This is done for the option "compile" if specified
+# for the "-lib" install.
+#----------------------------------------------------------------------------
+
+import sys
+import os
+import glob
+import fnmatch
+
+#----------------------------------------------------------------------------
+
+def usage():
+ print('')
+ print('Usage:')
+ print(' create_lib_library <destlibdir> <destlib> [-compile-only] ')
+ print(' [-excludelist="file1,file2,..."]')
+ print('')
+ print('Create a single liberty library from a set of individual liberty files.')
+ print('')
+ print('where:')
+ print(' <destlibdir> is the directory containing the individual liberty files')
+ print(' <destlib> is the root name of the library file')
+ print(' -compile-only remove the individual files if specified')
+ print(' -excludelist= is a comma-separated list of files to ignore')
+ print('')
+
+#----------------------------------------------------------------------------
+# Warning: This script is unfinished. Needs to parse the library header
+# in each cell and generate a new library header combining the contents of
+# all cell headers. Also: The library name in the header needs to be
+# changed to the full library name. Also: There is no mechanism for
+# collecting all files belonging to a single process corner/temperature/
+# voltage.
+#----------------------------------------------------------------------------
+
+def create_lib_library(destlibdir, destlib, do_compile_only=False, excludelist=[]):
+
+ # destlib should not have a file extension
+ destlibroot = os.path.splitext(destlib)[0]
+
+ alllibname = destlibdir + '/' + destlibroot + '.lib'
+ if os.path.isfile(alllibname):
+ os.remove(alllibname)
+
+ print('Diagnostic: Creating consolidated liberty library ' + destlibroot + '.lib')
+
+ # If file "filelist.txt" exists in the directory, get the list of files from it
+ if os.path.exists(destlibdir + '/filelist.txt'):
+ with open(destlibdir + '/filelist.txt', 'r') as ifile:
+ rlist = ifile.read().splitlines()
+ llist = []
+ for rfile in rlist:
+ llist.append(destlibdir + '/' + rfile)
+ else:
+ llist = glob.glob(destlibdir + '/*.lib')
+
+ # Create exclude list with glob-style matching using fnmatch
+ if len(llist) > 0:
+ llistnames = list(os.path.split(item)[1] for item in llist)
+ notllist = []
+ for exclude in excludelist:
+ notllist.extend(fnmatch.filter(llistnames, exclude))
+
+ # Apply exclude list
+ if len(notllist) > 0:
+ for file in llist[:]:
+ if os.path.split(file)[1] in notllist:
+ llist.remove(file)
+
+ if len(llist) > 1:
+ print('New file is: ' + alllibname)
+ with open(alllibname, 'w') as ofile:
+ headerdone = False
+ for lfile in llist:
+ with open(lfile, 'r') as ifile:
+ # print('Adding ' + lfile + ' to library.')
+ ltext = ifile.read()
+ llines = ltext.splitlines()
+ headerseen = False
+ for lline in llines:
+ if headerdone:
+ if not headerseen:
+ ltok = lline.split()
+ if len(ltok) == 0 or ltok[0] != 'cell':
+ continue
+ else:
+ headerseen = True
+ print(lline, file=ofile)
+ headerdone = True
+ print('/*--------EOF---------*/\n', file=ofile)
+
+ if do_compile_only == True:
+ print('Compile-only: Removing individual liberty files')
+ for lfile in llist:
+ if os.path.isfile(lfile):
+ os.remove(lfile)
+ else:
+ print('Only one file (' + str(llist) + '); ignoring "compile" option.')
+
+#----------------------------------------------------------------------------
+
+if __name__ == '__main__':
+
+ if len(sys.argv) == 1:
+ usage()
+ sys.exit(0)
+
+ argumentlist = []
+
+ # Defaults
+ do_compile_only = False
+ excludelist = []
+
+ # Break arguments into groups where the first word begins with "-".
+ # All following words not beginning with "-" are appended to the
+ # same list (optionlist). Then each optionlist is processed.
+ # Note that the first entry in optionlist has the '-' removed.
+
+ for option in sys.argv[1:]:
+ if option.find('-', 0) == 0:
+ keyval = option[1:].split('=')
+ if keyval[0] == 'compile-only':
+ if len(keyval) > 1:
+ if keyval[1].lower() == 'true' or keyval[1].lower() == 'yes' or keyval[1] == '1':
+ do_compile_only = True
+ else:
+ do_compile_only = True
+ elif keyval[0] == 'exclude' or keyval[0] == 'excludelist':
+ if len(keyval) > 1:
+ excludelist = keyval[1].strip('"').split(',')
+ else:
+ print("No items in exclude list (ignoring).")
+ else:
+ print("Unknown option '" + keyval[0] + "' (ignoring).")
+ else:
+ argumentlist.append(option)
+
+ if len(argumentlist) < 2:
+ print("Not enough arguments given to create_lib_library.py.")
+ usage()
+ sys.exit(1)
+
+ destlibdir = argumentlist[0]
+ destlib = argumentlist[1]
+
+ print('')
+ print('Create liberty library from files:')
+ print('')
+ print('Path to files: ' + destlibdir)
+ print('Name of compiled library: ' + destlib + '.lib')
+ print('Remove individual files: ' + ('Yes' if do_compile_only else 'No'))
+ if len(excludelist) > 0:
+ print('List of files to exclude: ')
+ for file in excludelist:
+ print(file)
+ print('')
+
+ create_lib_library(destlibdir, destlib, do_compile_only, excludelist)
+ print('Done.')
+ sys.exit(0)
+
+#----------------------------------------------------------------------------
diff --git a/common/create_spice_library.py b/common/create_spice_library.py
new file mode 100755
index 0000000..90d4e65
--- /dev/null
+++ b/common/create_spice_library.py
@@ -0,0 +1,224 @@
+#!/usr/bin/env python3
+#
+# create_spice_library.py
+#
+#----------------------------------------------------------------------------
+# Given a destination directory holding individual SPICE netlists of a number
+# of cells, create a single SPICE library file named <alllibname> and place
+# it in the same directory. This is done for the option "compile" if specified
+# for the "-spice" install.
+#----------------------------------------------------------------------------
+
+import sys
+import os
+import re
+import glob
+import fnmatch
+
+#----------------------------------------------------------------------------
+
+def usage():
+ print('')
+ print('Usage:')
+ print(' create_spice_library <destlibdir> <destlib> <spiext>')
+ print(' [-compile-only] [-stub] [-excludelist="file1,file2,..."]')
+ print('')
+ print('Create a single SPICE or CDL library from a set of individual files.')
+ print('')
+ print('where:')
+ print(' <destlibdir> is the directory containing the individual files')
+ print(' <destlib> is the root name of the library file')
+ print(' <spiext> is the extension used (with ".") by the SPICE or CDL files')
+ print(' -compile-only remove the individual files if specified')
+ print(' -stub generate only .subckt ... .ends for each cell')
+ print(' -excludelist= is a comma-separated list of files to ignore')
+ print('')
+
+#----------------------------------------------------------------------------
+
+def create_spice_library(destlibdir, destlib, spiext, do_compile_only=False, do_stub=False, excludelist=[]):
+
+ # destlib should not have a file extension
+ destlibroot = os.path.splitext(destlib)[0]
+
+ fformat = 'CDL' if spiext == '.cdl' else 'SPICE'
+
+ allstubname = destlibdir + '/stub' + spiext
+ alllibname = destlibdir + '/' + destlibroot + spiext
+ if do_stub:
+ outputname = allstubname
+ else:
+ outputname = alllibname
+
+ print('Diagnostic: Creating consolidated ' + fformat + ' library ' + outputname)
+
+ if os.path.isfile(outputname):
+ os.remove(outputname)
+
+ # If file "filelist.txt" exists in the directory, get the list of files from it
+ if os.path.exists(destlibdir + '/filelist.txt'):
+ with open(destlibdir + '/filelist.txt', 'r') as ifile:
+ rlist = ifile.read().splitlines()
+ slist = []
+ for rfile in rlist:
+ slist.append(destlibdir + '/' + rfile)
+ else:
+ if fformat == 'CDL':
+ slist = glob.glob(destlibdir + '/*.cdl')
+ else:
+ # Sadly, there is no consensus on what a SPICE file extension should be.
+ slist = glob.glob(destlibdir + '/*.spc')
+ slist.extend(glob.glob(destlibdir + '/*.spice'))
+ slist.extend(glob.glob(destlibdir + '/*.spi'))
+ slist.extend(glob.glob(destlibdir + '/*.ckt'))
+ slist.extend(glob.glob(destlibdir + '/*.cir'))
+ slist.extend(glob.glob(destlibdir + '/*' + spiext))
+
+ if alllibname in slist:
+ slist.remove(alllibname)
+
+ if allstubname in slist:
+ slist.remove(allstubname)
+
+ # Create exclude list with glob-style matching using fnmatch
+ if len(slist) > 0:
+ slistnames = list(os.path.split(item)[1] for item in slist)
+ notslist = []
+ for exclude in excludelist:
+ notslist.extend(fnmatch.filter(slistnames, exclude))
+
+ # Apply exclude list
+ if len(notslist) > 0:
+ for file in slist[:]:
+ if os.path.split(file)[1] in notslist:
+ slist.remove(file)
+
+ if len(slist) > 1:
+ with open(outputname, 'w') as ofile:
+ allsubckts = []
+ for sfile in slist:
+ with open(sfile, 'r') as ifile:
+ # print('Adding ' + sfile + ' to library.')
+ stext = ifile.read()
+ subckts = re.findall(r'\.subckt[ \t]+([^ \t\n]+)', stext, flags=re.IGNORECASE)
+ sseen = list(item for item in subckts if item in allsubckts)
+ allsubckts.extend(list(item for item in subckts if item not in allsubckts))
+ sfilter = remove_redundant_subckts(stext, allsubckts, sseen)
+ print(sfilter, file=ofile)
+ print('\n******* EOF\n', file=ofile)
+
+ if do_compile_only == True:
+ print('Compile-only: Removing individual SPICE files')
+ for sfile in slist:
+ if os.path.isfile(sfile):
+ os.remove(sfile)
+ elif os.path.islink(sfile):
+ os.unlink(sfile)
+ else:
+ print('Only one file (' + str(slist) + '); ignoring "compile" option.')
+
+#----------------------------------------------------------------------------
+# Remove redundant subcircuit entries from a SPICE or CDL netlist file. "sseen"
+# is a list of subcircuit names gleaned from all previously read files using
+# re.findall(). "slist" is a list of subcircuits including those in "ntext".
+# If a subcircuit is defined outside of "ntext", then remove all occurrences in
+# "ntext". Otherwise, if a subcircuit is defined more than once in "ntext",
+# remove all but one copy. The reason for doing this is that some netlists will
+# include primitive device definitions used by all the standard cell subcircuits.
+#
+# It may be necessary to remove redundant .include statements and redundant .model
+# and/or .option statements as well.
+#----------------------------------------------------------------------------
+
+def remove_redundant_subckts(ntext, slist, sseen):
+ updated = ntext
+ for subckt in slist:
+ if subckt in sseen:
+ # Remove all occurrences of subckt
+ updated = re.sub(r'\n\.subckt[ \t]+' + subckt + '[ \t\n]+.*?\n\.ends[ \t\n]+', '\n', updated, flags=re.IGNORECASE | re.DOTALL)
+
+ else:
+ # Determine the number of times the subcircuit appears in the text
+ n = len(re.findall(r'\n\.subckt[ \t]+' + subckt + '[ \t\n]+.*?\n\.ends[ \t\n]+', updated, flags=re.IGNORECASE | re.DOTALL))
+ # Optimization: Just keep original text if n < 2
+ if n < 2:
+ continue
+
+ # Remove all but one
+ updated = re.sub(r'\n\.subckt[ \t]+' + subckt + '[ \t\n]+.*?\n\.ends[ \t\n]+', '\n', updated, count=n - 1, flags=re.IGNORECASE | re.DOTALL)
+ return updated
+
+#----------------------------------------------------------------------------
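The deduplication performed by remove_redundant_subckts() can be illustrated with a self-contained sketch using a pattern similar to the one above; the helper name and the tiny netlist are hypothetical, for illustration only:

```python
import re

def drop_seen_subckts(ntext, seen):
    # Remove every .subckt ... .ends block whose name was already defined
    # in a previously read file (sketch of the redundancy removal above).
    for name in seen:
        ntext = re.sub(r'\n\.subckt[ \t]+' + name + r'[ \t\n]+.*?\n\.ends[ \t\n]+',
                       '\n', ntext, flags=re.IGNORECASE | re.DOTALL)
    return ntext

netlist = '\n.subckt inv a y\nm1 y a 0 0 nfet\n.ends\n.subckt buf a y\nx1 a y inv\n.ends\n'
# "inv" was defined in an earlier file, so its local definition is dropped;
# the instance reference inside "buf" is untouched.
print(drop_seen_subckts(netlist, ['inv']))
```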
+
+if __name__ == '__main__':
+
+ if len(sys.argv) == 1:
+ usage()
+ sys.exit(0)
+
+ argumentlist = []
+
+ # Defaults
+ do_compile_only = False
+ do_stub = False
+ excludelist = []
+
+ # Break arguments into groups where the first word begins with "-".
+ # All following words not beginning with "-" are appended to the
+ # same list (optionlist). Then each optionlist is processed.
+ # Note that the first entry in optionlist has the '-' removed.
+
+ for option in sys.argv[1:]:
+ if option.find('-', 0) == 0:
+ keyval = option[1:].split('=')
+ if keyval[0] == 'compile-only':
+ if len(keyval) > 1:
+ if keyval[1].lower() == 'true' or keyval[1].lower() == 'yes' or keyval[1] == '1':
+ do_compile_only = True
+ else:
+ do_compile_only = True
+ elif keyval[0] == 'exclude' or keyval[0] == 'excludelist':
+ if len(keyval) > 1:
+ excludelist = keyval[1].strip('"').split(',')
+ else:
+ print("No items in exclude list (ignoring).")
+ elif keyval[0] == 'stub':
+ if len(keyval) > 1:
+ if keyval[1].lower() == 'true' or keyval[1].lower() == 'yes' or keyval[1] == '1':
+ do_stub = True
+ else:
+ do_stub = True
+ else:
+ print("Unknown option '" + keyval[0] + "' (ignoring).")
+ else:
+ argumentlist.append(option)
+
+ if len(argumentlist) < 3:
+ print("Not enough arguments given to create_spice_library.py.")
+ usage()
+ sys.exit(1)
+
+ destlibdir = argumentlist[0]
+ destlib = argumentlist[1]
+ spiext = argumentlist[2]
+
+ print('')
+ if spiext == '.cdl':
+ print('Create CDL library from files:')
+ else:
+ print('Create SPICE library from files:')
+ print('')
+ print('Path to files: ' + destlibdir)
+ print('Name of compiled library: ' + destlib + spiext)
+ print('Remove individual files: ' + ('Yes' if do_compile_only else 'No'))
+ if len(excludelist) > 0:
+ print('List of files to exclude: ')
+ for file in excludelist:
+ print(file)
+ print('')
+
+ create_spice_library(destlibdir, destlib, spiext, do_compile_only, do_stub, excludelist)
+ print('Done.')
+ sys.exit(0)
+
+#----------------------------------------------------------------------------
diff --git a/common/create_verilog_library.py b/common/create_verilog_library.py
new file mode 100755
index 0000000..55e1c76
--- /dev/null
+++ b/common/create_verilog_library.py
@@ -0,0 +1,204 @@
+#!/usr/bin/env python3
+#
+# create_verilog_library.py
+#
+#----------------------------------------------------------------------------
+# Given a destination directory holding individual verilog files of a number
+# of modules, create a single verilog library file named <alllibname> and place
+# it in the same directory. This is done for the option "compile" if specified
+# for the "-verilog" install.
+#----------------------------------------------------------------------------
+
+import sys
+import os
+import re
+import glob
+import fnmatch
+
+#----------------------------------------------------------------------------
+
+def usage():
+ print('')
+ print('Usage:')
+ print(' create_verilog_library <destlibdir> <destlib> [-compile-only]')
+ print(' [-stub] [-excludelist="file1,file2,..."]')
+ print('')
+ print('Create a single verilog library from a set of individual verilog files.')
+ print('')
+ print('where:')
+ print(' <destlibdir> is the directory containing the individual files')
+ print(' <destlib> is the root name of the library file')
+ print(' -compile-only remove the individual files if specified')
+ print(' -stub generate only the module headers for each cell')
+ print(' -excludelist= is a comma-separated list of files to ignore')
+ print('')
+
+#----------------------------------------------------------------------------
+
+def create_verilog_library(destlibdir, destlib, do_compile_only=False, do_stub=False, excludelist=[]):
+
+ # 'destlib' should not have an extension, because one will be generated.
+ destlibroot = os.path.splitext(destlib)[0]
+
+ alllibname = destlibdir + '/' + destlibroot + '.v'
+ if os.path.isfile(alllibname):
+ os.remove(alllibname)
+
+ print('Diagnostic: Creating consolidated verilog library ' + destlibroot + '.v')
+
+ # If file "filelist.txt" exists in the directory, get the list of files from it
+ if os.path.exists(destlibdir + '/filelist.txt'):
+ print('Diagnostic: Reading sorted verilog file list.')
+ with open(destlibdir + '/filelist.txt', 'r') as ifile:
+ rlist = ifile.read().splitlines()
+ vlist = []
+ for rfile in rlist:
+ vlist.append(destlibdir + '/' + rfile)
+ else:
+ vlist = glob.glob(destlibdir + '/*.v')
+
+ if alllibname in vlist:
+ vlist.remove(alllibname)
+
+ # Create exclude list with glob-style matching using fnmatch
+ if len(vlist) > 0:
+ vlistnames = list(os.path.split(item)[1] for item in vlist)
+ notvlist = []
+ for exclude in excludelist:
+ notvlist.extend(fnmatch.filter(vlistnames, exclude))
+
+ # Apply exclude list
+ if len(notvlist) > 0:
+ for file in vlist[:]:
+ if os.path.split(file)[1] in notvlist:
+ vlist.remove(file)
+
+ if len(vlist) > 1:
+ print('New file is: ' + alllibname)
+ with open(alllibname, 'w') as ofile:
+ allmodules = []
+ for vfile in vlist:
+ with open(vfile, 'r') as ifile:
+ # print('Adding ' + vfile + ' to library.')
+ vtext = ifile.read()
+ modules = re.findall(r'[ \t\n]module[ \t]+([^ \t\n\(]+)', vtext)
+ mseen = list(item for item in modules if item in allmodules)
+ allmodules.extend(list(item for item in modules if item not in allmodules))
+ vfilter = remove_redundant_modules(vtext, allmodules, mseen)
+ # NOTE: The following workaround resolves an issue with iverilog,
+ # which does not properly parse specify timing paths that are not in
+ # parentheses. The workaround wraps such paths in parentheses.
+ vlines = re.sub(r'\)[ \t]*=[ \t]*([01]:[01]:[01])[ \t]*;', r') = ( \1 ) ;', vfilter)
+ print(vlines, file=ofile)
+ print('\n//--------EOF---------\n', file=ofile)
+
+ if do_compile_only == True:
+ print('Compile-only: Removing individual verilog files')
+ for vfile in vlist:
+ if os.path.isfile(vfile):
+ os.remove(vfile)
+ elif os.path.islink(vfile):
+ os.unlink(vfile)
+ else:
+ print('Only one file (' + str(vlist) + '); ignoring "compile" option.')
+
+#----------------------------------------------------------------------------
+# Remove redundant module entries from a verilog file. "m2list" is a list of
+# module names gleaned from all previously read files using re.findall().
+# "mlist" is a list of all module names including those in "ntext".
+# The reason for doing this is that some verilog files may include modules used
+# by all the files, and if a module is defined more than once, iverilog complains.
+#----------------------------------------------------------------------------
+
+def remove_redundant_modules(ntext, mlist, m2list):
+ updated = ntext
+ for module in mlist:
+ # Determine the number of times the module appears in the text
+ if module in m2list:
+ # This module was seen before outside of ntext, so remove all occurrences in ntext
+ new = re.sub(r'[ \t\n]+module[ \t]+' + module + '[ \t\n\(]+.*?[ \t\n]endmodule', '\n', updated, flags=re.DOTALL)
+ updated = new
+
+ else:
+ n = len(re.findall(r'[ \t\n]module[ \t]+' + module + '[ \t\n\(]+.*?[ \t\n]endmodule', updated, flags=re.DOTALL))
+ # This module defined more than once inside ntext, so remove all but one
+ # Optimization: Just keep original text if n < 2
+ if n < 2:
+ continue
+
+ # Remove all but one
+ updated = re.sub(r'[ \t\n]+module[ \t]+' + module + '[ \t\n\(]+.*?[ \t\n]endmodule', '\n', updated, count=n - 1, flags=re.DOTALL)
+ return updated
+
+#----------------------------------------------------------------------------
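The iverilog specify-path workaround applied in create_verilog_library() can be seen in isolation; the substitution wraps a bare rise:fall:toggle timing triple in parentheses. The sample line is hypothetical:

```python
import re

# A specify-block path with an unparenthesized timing triple, which
# iverilog rejects:
line = '(A => X) = 0:0:0;'

# Same substitution as used in create_verilog_library() above.
fixed = re.sub(r'\)[ \t]*=[ \t]*([01]:[01]:[01])[ \t]*;', r') = ( \1 ) ;', line)
print(fixed)
```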
+
+if __name__ == '__main__':
+
+ if len(sys.argv) == 1:
+ usage()
+ sys.exit(0)
+
+ argumentlist = []
+
+ # Defaults
+ do_compile_only = False
+ do_stub = False
+ excludelist = []
+
+ # Break arguments into groups where the first word begins with "-".
+ # All following words not beginning with "-" are appended to the
+ # same list (optionlist). Then each optionlist is processed.
+ # Note that the first entry in optionlist has the '-' removed.
+
+ for option in sys.argv[1:]:
+ if option.find('-', 0) == 0:
+ keyval = option[1:].split('=')
+ if keyval[0] == 'compile-only':
+ if len(keyval) > 1:
+ if keyval[1].lower() == 'true' or keyval[1].lower() == 'yes' or keyval[1] == '1':
+ do_compile_only = True
+ else:
+ do_compile_only = True
+ elif keyval[0] == 'exclude' or keyval[0] == 'excludelist':
+ if len(keyval) > 1:
+ excludelist = keyval[1].strip('"').split(',')
+ else:
+ print("No items in exclude list (ignoring).")
+ elif keyval[0] == 'stub':
+ if len(keyval) > 1:
+ if keyval[1].lower() == 'true' or keyval[1].lower() == 'yes' or keyval[1] == '1':
+ do_stub = True
+ else:
+ do_stub = True
+ else:
+ print("Unknown option '" + keyval[0] + "' (ignoring).")
+ else:
+ argumentlist.append(option)
+
+ if len(argumentlist) < 2:
+ print("Not enough arguments given to create_verilog_library.py.")
+ usage()
+ sys.exit(1)
+
+ destlibdir = argumentlist[0]
+ destlib = argumentlist[1]
+
+ print('')
+ print('Create verilog library from files:')
+ print('')
+ print('Path to files: ' + destlibdir)
+ print('Name of compiled library: ' + destlib + '.v')
+ print('Remove individual files: ' + ('Yes' if do_compile_only else 'No'))
+ if len(excludelist) > 0:
+ print('List of files to exclude: ')
+ for file in excludelist:
+ print(file)
+ print('')
+
+ create_verilog_library(destlibdir, destlib, do_compile_only, do_stub, excludelist)
+ print('Done.')
+ sys.exit(0)
+
+#----------------------------------------------------------------------------
diff --git a/common/foundry_install.py b/common/foundry_install.py
new file mode 100755
index 0000000..9a2793a
--- /dev/null
+++ b/common/foundry_install.py
@@ -0,0 +1,2098 @@
+#!/usr/bin/env python3
+#
+# foundry_install.py
+#
+# This file generates the local directory structure and populates the
+# directories with foundry vendor data. The local directory (target)
+# should be a staging area, not a place where files are kept permanently.
+#
+# Options:
+# -ef_format Use efabless naming (libs.ref/techLEF),
+# otherwise use generic naming (libs.tech/lef)
+# -clean Clear out and remove target directory before starting
+# -source <path> Path to source data top level directory
+# -target <path> Path to target (staging) top level directory
+#
+# All other options represent paths to vendor files. They may all be
+# wildcarded with "*", or with specific escapes like "%l" for library
+# name or "%v" for version number (see below for a complete list of escape
+# sequences).
+#
+# Note only one of "-spice" or "-cdl" need be specified. Since the
+# open source tools use ngspice, CDL files are converted to ngspice
+# syntax when needed.
+#
+# -techlef <path> Path to technology LEF file
+# -doc <path> Path to technology documentation
+# -lef <path> Path to LEF file
+# -spice <path> Path to SPICE netlists
+# -cdl <path> Path to CDL netlists
+# -models <path> Path to SPICE (primitive device) models
+# -liberty <path> Path to Liberty timing files
+# -gds <path> Path to GDS data
+# -verilog <path> Path to verilog models
+#
+# -library <type> <name> [<target>] See below
+#
+# For the "-library" option, any number of libraries may be supported, and
+# one "-library" option should be provided for each supported library.
+# <type> is one of: "digital", "primitive", or "general". Analog and I/O
+# libraries fall under the category "general", as they are all treated the
+# same way. <name> is the vendor name of the library. [<target>] is the
+# (optional) local name of the library. If omitted, then the vendor name
+# is used for the target (there is no particular reason to specify a
+# different local name for a library).
+#
+# In special cases using options (see below), path may be "-", indicating
+# that there are no source files, but only to run compilations or conversions
+# on the files in the target directory.
+#
+# All options "-lef", "-spice", etc., can take the additional arguments
+# up <number>
+#
+# to indicate that the source hierarchy should be copied from <number>
+# levels above the files. For example, if liberty files are kept in
+# multiple directories according to voltage level, then
+#
+# -liberty x/y/z/PVT_*/*.lib
+#
+# would install all .lib files directly into libs.ref/<libname>/liberty/*.lib
+# (if "-ef_format" option specified, then: libs.ref/liberty/<libname>/*.lib)
+# while
+#
+# -liberty x/y/z/PVT_*/*.lib up 1
+#
+# would install all .lib files into libs.ref/<libname>/liberty/PVT_*/*.lib
+# (if "-ef_format" option specified, then: libs.ref/liberty/<libname>/PVT_*/*.lib)
+#
+# Please note that the INSTALL variable in the Makefile starts with "set -f"
+# to suppress the OS from doing wildcard substitution; otherwise the
+# wildcards in the install options will get expanded by the OS before
+# being passed to the install script.
+#
+# Other library-specific arguments are:
+#
+# nospec : Remove timing specification before installing
+# (used with verilog files; needs to be extended to
+# liberty files)
+# compile : Create a single library from all components. Used
+# when a foundry library has inconveniently split
+# an IP library (LEF, CDL, verilog, etc.) into
+# individual files.
+# compile-only: Like "compile" except that the individual
+# files are removed after the library file has been
+# created.
+#
+# stub : Remove contents of subcircuits from CDL or SPICE
+# netlist files.
+#
+# priv : Mark the contents being installed as privileged, and
+# put them in a separate root directory libs.priv
+# where they can be given additional read/write
+# restrictions.
+#
+# exclude : Followed by "=" and a comma-separated list of names.
+# exclude these files/modules/subcircuits. Names may
+# also be wildcarded in "glob" format.
+#
+# rename : Followed by "=" and an alternative name. For any
+# file that is a single entry, change the name of
+# the file in the target directory to this (To-do:
+# take regexps for multiple files). When used with
+# "compile" or "compile-only", this refers to the
+# name of the target compiled file.
+#
+# filter: Followed by "=" and the name of a script.
+# Each file is passed through the filter script
+# before writing into the staging area.
+#
+# sort: Followed by "=" and the name of a script.
+# The list of files to process (after applying items
+# from "exclude") will be written to a file
+# "filelist.txt", which will be used by the
+# library compile routines, if present. The sort
+# script will rewrite the file with the order in
+# which entries should appear in the compiled library.
+# Only useful when used with "compile" or "compile-only".
+# If not specified, files are sorted by "natural sort"
+# order.
+#
+# noconvert : Install only; do not attempt to convert to other
+# formats (applies only to GDS, CDL, and LEF).
+#
+# options: Followed by "=" and the name of a script. Behavior
+# is dependent on the mode; if applied to "-gds",
+# then the script is inserted before the GDS read
+# in the Tcl generate script passed to magic. If
+# what follows the "=" is not a file, then it is
+# Tcl code to be inserted verbatim.
+#
+# NOTE: This script can be called once for all libraries if all file
+# types (gds, cdl, lef, etc.) happen to all work with the same wildcards.
+# However, it is more likely that it will be called several times for the
+# same PDK, once to install I/O cells, once to install digital, and so
+# forth, as made possible by the wild-carding.
+
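The "up <number>" behavior described above can be sketched as follows. The helper name is hypothetical and the sketch only shows the path computation, not the script's actual copy logic:

```python
def staged_name(srcfile, up=0):
    # Keep 'up' extra levels of source hierarchy in the target path
    # (illustration of the "up <number>" option described above).
    parts = srcfile.split('/')
    return '/'.join(parts[-(up + 1):])

print(staged_name('x/y/z/PVT_1V8_25C/cells.lib'))        # cells.lib
print(staged_name('x/y/z/PVT_1V8_25C/cells.lib', up=1))  # PVT_1V8_25C/cells.lib
```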
+import re
+import os
+import sys
+import glob
+import stat
+import shutil
+import fnmatch
+import subprocess
+
+# Import local routines
+from create_gds_library import create_gds_library
+from create_spice_library import create_spice_library
+from create_lef_library import create_lef_library
+from create_lib_library import create_lib_library
+from create_verilog_library import create_verilog_library
+
+def usage():
+ print("foundry_install.py [options...]")
+ print(" -copy Copy files from source to target (default)")
+ print(" -ef_format Use efabless naming conventions for local directories")
+ print("")
+ print(" -source <path> Path to top of source directory tree")
+ print(" -target <path> Path to top of target directory tree")
+ print("")
+ print(" -techlef <path> Path to technology LEF file")
+ print(" -doc <path> Path to technology documentation")
+ print(" -lef <path> Path to LEF file")
+ print(" -spice <path> Path to SPICE netlists")
+ print(" -cdl <path> Path to CDL netlists")
+ print(" -models <path> Path to SPICE (primitive device) models")
+ print(" -lib <path> Path to Liberty timing files")
+ print(" -liberty <path> Path to Liberty timing files")
+ print(" -gds <path> Path to GDS data")
+ print(" -verilog <path> Path to verilog models")
+ print(" -library <type> <name> [<target>] See below")
+ print("")
+ print(" All <path> names may be wild-carded with '*' ('glob'-style wild-cards)")
+ print("")
+ print(" All options with <path> other than source and target may take the additional")
+ print(" arguments 'up <number>', where <number> indicates the number of levels of")
+ print(" hierarchy of the source path to include when copying to the target.")
+ print("")
+ print(" Library <type> may be one of:")
+ print(" digital Digital standard cell library")
+ print(" primitive Primitive device library")
+ print(" general All other library types (I/O, analog, etc.)")
+ print("")
+ print(" If <target> is unspecified then <name> is used for the target.")
+
+# Return a list of files after glob-style substituting into pathname. This
+# mostly relies on glob.glob(), but uses the additional substitutions with
+# escape strings:
+#
+# %v : Match a version number in the form "major[.minor[.rev]]"
+# %l : substitute the library name
+# %% : substitute the percent character verbatim
+
+from distutils.version import LooseVersion
+
+#----------------------------------------------------------------------------
+#----------------------------------------------------------------------------
+
+def makeuserwritable(filepath):
+ if os.path.exists(filepath):
+ st = os.stat(filepath)
+ os.chmod(filepath, st.st_mode | stat.S_IWUSR)
+
+#----------------------------------------------------------------------------
+#----------------------------------------------------------------------------
+
+def substitute(pathname, library):
+ if library:
+ # Do %l substitution
+ newpathname = re.sub('%l', library, pathname)
+ else:
+ newpathname = pathname
+
+ if '%v' in newpathname:
+ vglob = re.sub('%v.*', '*', newpathname)
+ vlibs = glob.glob(vglob)
+ try:
+ vstr = vlibs[0][len(vglob)-1:]
+ except IndexError:
+ pass
+ else:
+ for vlib in vlibs[1:]:
+ vtest = vlib[len(vglob)-1:]
+ if LooseVersion(vtest) > LooseVersion(vstr):
+ vstr = vtest
+ newpathname = re.sub('%v', vstr, newpathname)
+
+ if '%%' in newpathname:
+ newpathname = re.sub('%%', '%', newpathname)
+
+ return newpathname
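+
+# Example (hypothetical paths): substitute('/src/%l/v%v/%l.gds', 'mylib')
+# first gives '/src/mylib/v%v/mylib.gds'; if directories v1.0 and v1.2
+# both exist, '%v' is then replaced by the highest version found,
+# yielding '/src/mylib/v1.2/mylib.gds'.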
+
+#----------------------------------------------------------------------------
+#----------------------------------------------------------------------------
+
+def get_gds_properties(magfile):
+ proprex = re.compile('^[ \t]*string[ \t]+(GDS_[^ \t]+)[ \t]+([^ \t]+)$')
+ proplines = []
+ if os.path.isfile(magfile):
+ with open(magfile, 'r') as ifile:
+ magtext = ifile.read().splitlines()
+ for line in magtext:
+ lmatch = proprex.match(line)
+ if lmatch:
+ propline = lmatch.group(1) + ' ' + lmatch.group(2)
+ proplines.append(propline)
+ return proplines
+
+#----------------------------------------------------------------------------
+# Read subcircuit ports from a CDL file, given a subcircuit name that should
+# appear in the file as a subcircuit entry, and return a dictionary of ports
+# and their indexes in the subcircuit line.
+#----------------------------------------------------------------------------
+
+def get_subckt_ports(cdlfile, subname):
+ portdict = {}
+ pidx = 1
+    portrex = re.compile(r'^\.subckt[ \t]+([^ \t]+)[ \t]+(.*)$', flags=re.IGNORECASE)
+ with open(cdlfile, 'r') as ifile:
+ cdltext = ifile.read()
+ cdllines = cdltext.replace('\n+', ' ').splitlines()
+ for line in cdllines:
+ lmatch = portrex.match(line)
+ if lmatch:
+ if lmatch.group(1).lower() == subname.lower():
+ ports = lmatch.group(2).split()
+ for port in ports:
+ portdict[port.lower()] = pidx
+ pidx += 1
+ break
+ return portdict
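+
+# Example (hypothetical netlist): if the CDL file contains the line
+# ".subckt nand2 A B Y VDD VSS", then get_subckt_ports(cdlfile, 'nand2')
+# returns {'a': 1, 'b': 2, 'y': 3, 'vdd': 4, 'vss': 5}.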
+
+#----------------------------------------------------------------------------
+# Filter a verilog file to remove any backslash continuation lines, which
+# iverilog does not parse. If targetroot is a directory, then find and
+# process all files in the path of targetroot. If any file to be processed
+# is unmodified (has no backslash continuation lines), then ignore it. If
+# any file is a symbolic link and gets modified, then remove the symbolic
+# link before overwriting with the modified file.
+#----------------------------------------------------------------------------
+
+def vfilefilter(vfile):
+ modified = False
+ with open(vfile, 'r') as ifile:
+ vtext = ifile.read()
+
+    # Remove each backslash-followed-by-newline and absorb the leading
+    # whitespace of the continuation line.  What that leading whitespace
+    # should mean is unclear, but the use cases seen so far behave as if
+    # it is simply discarded.
+
+ vlines = re.sub('\\\\\n[ \t]*', '', vtext)
+
+ if vlines != vtext:
+ # File contents have been modified, so if this file was a symbolic
+ # link, then remove it. Otherwise, overwrite the file with the
+ # modified contents.
+ if os.path.islink(vfile):
+ os.unlink(vfile)
+ with open(vfile, 'w') as ofile:
+ ofile.write(vlines)
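+
+# Example (hypothetical input): the two lines "assign x = a &\" and
+# "        b;" are joined into the single line "assign x = a &b;".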
+
+#----------------------------------------------------------------------------
+# Run a filter on verilog files that cleans up known syntax issues.
+# This is embedded in the foundry_install script and is not a custom
+# filter largely because the issue is in the tool, not the PDK.
+#----------------------------------------------------------------------------
+
+def vfilter(targetroot):
+ if os.path.isfile(targetroot):
+ vfilefilter(targetroot)
+ else:
+ vlist = glob.glob(targetroot + '/*')
+ for vfile in vlist:
+ if os.path.isfile(vfile):
+ vfilefilter(vfile)
+
+#----------------------------------------------------------------------------
+# For issues that are PDK-specific, a script can be written and put in
+# the PDK's custom/scripts/ directory, and passed to the foundry_install
+# script using the "filter" option.
+#----------------------------------------------------------------------------
+
+def tfilter(targetroot, filterscript, outfile=[]):
+ filterroot = os.path.split(filterscript)[1]
+ if os.path.isfile(targetroot):
+ print(' Filtering file ' + targetroot + ' with ' + filterroot)
+ sys.stdout.flush()
+ if not outfile:
+ outfile = targetroot
+ else:
+ # Make sure this file is writable (as the original may not be)
+ makeuserwritable(outfile)
+
+ fproc = subprocess.run([filterscript, targetroot, outfile],
+ stdin = subprocess.DEVNULL, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, universal_newlines = True)
+ if fproc.stdout:
+ for line in fproc.stdout.splitlines():
+ print(line)
+ if fproc.stderr:
+ print('Error message output from filter script:')
+ for line in fproc.stderr.splitlines():
+ print(line)
+
+ else:
+ tlist = glob.glob(targetroot + '/*')
+ for tfile in tlist:
+ if os.path.isfile(tfile):
+ print(' Filtering file ' + tfile + ' with ' + filterroot)
+ sys.stdout.flush()
+ fproc = subprocess.run([filterscript, tfile, tfile],
+ stdin = subprocess.DEVNULL, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, universal_newlines = True)
+ if fproc.stdout:
+ for line in fproc.stdout.splitlines():
+ print(line)
+ if fproc.stderr:
+ print('Error message output from filter script:')
+ for line in fproc.stderr.splitlines():
+ print(line)
+
+#----------------------------------------------------------------------------
+# This is the main entry point for the foundry install script.
+#----------------------------------------------------------------------------
+
+if __name__ == '__main__':
+
+ if len(sys.argv) == 1:
+ print("No options given to foundry_install.py.")
+ usage()
+ sys.exit(0)
+
+ optionlist = []
+ newopt = []
+
+ sourcedir = None
+ targetdir = None
+
+ ef_format = False
+ do_clean = False
+
+ have_lef = False
+ have_techlef = False
+ have_lefanno = False
+ have_gds = False
+ have_spice = False
+ have_cdl = False
+ have_verilog = False
+ have_lib = False
+
+ # Break arguments into groups where the first word begins with "-".
+ # All following words not beginning with "-" are appended to the
+ # same list (optionlist). Then each optionlist is processed.
+ # Note that the first entry in optionlist has the '-' removed.
+
+ for option in sys.argv[1:]:
+        if option.startswith('-'):
+ if newopt != []:
+ optionlist.append(newopt)
+ newopt = []
+ newopt.append(option[1:])
+ else:
+ newopt.append(option)
+
+ if newopt != []:
+ optionlist.append(newopt)
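+
+    # For example, the arguments "-gds %l/gds/*.gds compile-only" become
+    # the optionlist entry ['gds', '%l/gds/*.gds', 'compile-only'].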
+
+ # Pull library names from optionlist
+ libraries = []
+ for option in optionlist[:]:
+ if option[0] == 'library':
+ optionlist.remove(option)
+ libraries.append(option[1:])
+
+ # Check for option "ef_format" or "std_format" or "clean"
+ for option in optionlist[:]:
+ if option[0] == 'ef_naming' or option[0] == 'ef_names' or option[0] == 'ef_format':
+ optionlist.remove(option)
+ ef_format = True
+ elif option[0] == 'std_naming' or option[0] == 'std_names' or option[0] == 'std_format':
+ optionlist.remove(option)
+ ef_format = False
+ elif option[0] == 'clean':
+ do_clean = True
+
+ # Check for options "source" and "target"
+ for option in optionlist[:]:
+ if option[0] == 'source':
+ optionlist.remove(option)
+ sourcedir = option[1]
+ elif option[0] == 'target':
+ optionlist.remove(option)
+ targetdir = option[1]
+
+ if not targetdir:
+ print("No target directory specified. Exiting.")
+ sys.exit(1)
+
+ # Take the target PDK name from the target path last component
+ pdkname = os.path.split(targetdir)[1]
+
+    # If targetdir (the staging area) exists, make sure it is writable,
+    # and remove it if "-clean" was specified.
+
+ if os.path.isdir(targetdir):
+ # Error if targetdir exists but is not writeable
+ if not os.access(targetdir, os.W_OK):
+ print("Target installation directory " + targetdir + " is not writable.")
+ sys.exit(1)
+
+ # Clear out the staging directory if specified
+ if do_clean:
+ shutil.rmtree(targetdir)
+ elif os.path.exists(targetdir):
+ print("Target installation directory " + targetdir + " is not a directory.")
+ sys.exit(1)
+
+    # Error if no source was specified, unless "-clean" was given
+ if not sourcedir:
+ if do_clean:
+ print("Done removing staging area.")
+ sys.exit(0)
+ else:
+ print("No source directory specified. Exiting.")
+ sys.exit(1)
+
+ # Create the target directory
+ os.makedirs(targetdir, exist_ok=True)
+
+ # Here's where common scripts are found:
+ scriptdir = os.path.split(os.getcwd())[0] + '/common'
+
+ #----------------------------------------------------------------
+ # Installation part 1: Install files into the staging directory
+ #----------------------------------------------------------------
+
+ # Diagnostic
+ print("Installing in target (staging) directory " + targetdir)
+
+ # Create the top-level directories
+
+ os.makedirs(targetdir + '/libs.tech', exist_ok=True)
+ os.makedirs(targetdir + '/libs.ref', exist_ok=True)
+
+ # Path to magic techfile depends on ef_format
+
+    if ef_format:
+ mag_current = '/libs.tech/magic/current/'
+ else:
+ mag_current = '/libs.tech/magic/'
+
+ # Check for magic version and set flag if it does not exist or if
+ # it has the wrong version.
+ have_mag_8_2 = False
+ try:
+ mproc = subprocess.run(['magic', '--version'],
+ stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE,
+ universal_newlines = True)
+ if mproc.stdout:
+ mag_version = mproc.stdout.splitlines()[0]
+ mag_version_info = mag_version.split('.')
+ try:
+ if int(mag_version_info[0]) > 8:
+ have_mag_8_2 = True
+ elif int(mag_version_info[0]) == 8:
+ if int(mag_version_info[1]) >= 2:
+ have_mag_8_2 = True
+ print('Magic version 8.2 (or better) available on the system.')
+ except ValueError:
+ print('Error: "magic --version" did not return valid version number.')
+ except FileNotFoundError:
+ print('Error: Failed to find executable for magic in standard search path.')
+
+ if not have_mag_8_2:
+        print('WARNING: Magic version 8.2 (or better) cannot be executed')
+ print('from the standard executable search path.')
+ print('Please install or correct the search path.')
+ print('Magic database files will not be created, and other missing file formats may not be generated.')
+
+ # Populate any targets that do not specify a library, or where the library is
+ # specified as "primitive".
+
+ # Populate the techLEF and SPICE models, if specified. Also, this section can add
+ # to any directory in libs.tech/ as given by the option; e.g., "-ngspice" will
+ # install into libs.tech/ngspice/.
+
+ if libraries == [] or 'primitive' in libraries[0]:
+
+ for option in optionlist[:]:
+
+ # Legacy behavior is to put libs.tech models and techLEF files in
+ # the same grouping as files for the primdev library (which go in
+ # libs.ref). Current behavior is to put all libs.tech files in
+ # a grouping with no library, with unrestricted ability to write
+ # into any subdirectory of libs.tech/. Therefore, need to restrict
+ # legacy use to just 'techlef' and 'models'.
+
+ if len(libraries) > 0 and 'primitive' in libraries[0]:
+ if option[0] != 'techlef' and option[0] != 'techLEF' and option[0] != 'models':
+ continue
+
+ # Normally technology LEF files are associated with IP libraries.
+ # However, if no library is specified or the library is 'primitive'
+ # (legacy behavior), then put in the techLEF directory with no subdirectory.
+
+ filter_scripts = []
+ if option[0] == 'techlef' or option[0] == 'techLEF':
+ for item in option:
+ if item.split('=')[0] == 'filter':
+ filter_scripts.append(item.split('=')[1])
+ break
+
+ if ef_format:
+ techlefdir = targetdir + '/libs.ref/' + 'techLEF'
+ else:
+ techlefdir = targetdir + '/libs.tech/lef'
+
+ os.makedirs(techlefdir, exist_ok=True)
+ # All techlef files should be copied, so use "glob" on the wildcards
+ techlist = glob.glob(substitute(sourcedir + '/' + option[1], None))
+
+ for lefname in techlist:
+ leffile = os.path.split(lefname)[1]
+ targname = techlefdir + '/' + leffile
+
+ if os.path.isfile(lefname):
+ shutil.copy(lefname, targname)
+ else:
+ shutil.copytree(lefname, targname)
+
+ for filter_script in filter_scripts:
+ # Apply filter script to all files in the target directory
+ tfilter(targname, filter_script)
+
+ optionlist.remove(option)
+
+ # All remaining options will refer to specific tools (e.g., -ngspice, -magic)
+ # although generic names (.e.g, -models) are acceptable if the tools know
+ # where to find the files. Currently, most tools have their own formats
+ # and standards for setup, and so generally each install directory will be
+ # unique to one EDA tool.
+
+ else:
+ filter_scripts = []
+ for item in option:
+ if item.split('=')[0] == 'filter':
+ filter_scripts.append(item.split('=')[1])
+ break
+
+ print('Diagnostic: installing to ' + option[0] + '.')
+ tooldir = targetdir + '/libs.tech/' + option[0]
+ os.makedirs(tooldir, exist_ok=True)
+
+ # All files should be linked or copied, so use "glob" on
+ # the wildcards. Copy each file and recursively copy each
+ # directory.
+ toollist = glob.glob(substitute(sourcedir + '/' + option[1], None))
+
+ for toolname in toollist:
+ toolfile = os.path.split(toolname)[1]
+ targname = tooldir + '/' + toolfile
+
+ print(' installing from ' + toolfile + ' to ' + targname)
+
+ if os.path.isdir(toolname):
+ # Remove any existing directory, and its contents
+ if os.path.isdir(targname):
+ shutil.rmtree(targname)
+ os.makedirs(targname)
+
+ # Recursively find and copy or link the whole directory
+ # tree from this point.
+
+ alltoollist = glob.glob(toolname + '/**', recursive=True)
+ commonpart = os.path.commonpath(alltoollist)
+ for subtoolname in alltoollist:
+ # Get the path part that is not common between toollist and
+ # alltoollist.
+ subpart = os.path.relpath(subtoolname, commonpart)
+ subtargname = targname + '/' + subpart
+
+ if os.path.isfile(subtoolname):
+ os.makedirs(os.path.split(subtargname)[0], exist_ok=True)
+ shutil.copy(subtoolname, subtargname)
+ else:
+ print(' copy tree from ' + subtoolname + ' to ' + subtargname)
+ # emulate Python3.8 dirs_exist_ok option
+ try:
+ shutil.copytree(subtoolname, subtargname)
+ except FileExistsError:
+ pass
+
+ for filter_script in filter_scripts:
+ # Apply filter script to all files in the target directory
+ tfilter(subtargname, filter_script)
+
+ else:
+ # Remove any existing file
+ if os.path.isfile(targname):
+ os.remove(targname)
+ elif os.path.isdir(targname):
+ shutil.rmtree(targname)
+
+ if os.path.isfile(toolname):
+ shutil.copy(toolname, targname)
+ else:
+ shutil.copytree(toolname, targname)
+
+ for filter_script in filter_scripts:
+ # Apply filter script to all files in the target directory
+ tfilter(targname, filter_script)
+
+ optionlist.remove(option)
+
+ # Do an initial pass through all of the options and determine what is being
+ # installed, so that we know in advance which file formats are missing and
+ # need to be generated.
+
+ for option in optionlist[:]:
+ if option[0] == 'lef':
+            have_lefanno = 'annotate' in option or 'anno' in option
+            have_lef = not have_lefanno
+ if option[0] == 'techlef' or option[0] == 'techLEF':
+ have_techlef = True
+ elif option[0] == 'gds':
+ have_gds = True
+ elif option[0] == 'spice' or option[0] == 'spi':
+ have_spice = True
+ elif option[0] == 'cdl':
+ have_cdl = True
+ elif option[0] == 'verilog':
+ have_verilog = True
+ elif option[0] == 'lib' or option[0] == 'liberty':
+ have_lib = True
+
+ # The remaining options in optionlist should all be types like 'lef' or 'liberty'
+ # and there should be a corresponding library list specified by '-library'
+
+ for option in optionlist[:]:
+
+ # Ignore if no library list---should have been taken care of above.
+ if libraries == []:
+ break
+
+ # Diagnostic
+ print("Install option: " + str(option[0]))
+
+ if option[0] == 'lef' and have_lefanno:
+ print("LEF files used for annotation only. Temporary install.")
+
+ # For ef_format: always make techlef -> techLEF and spice -> spi
+
+ if ef_format:
+ if option[0] == 'techlef':
+ option[0] = 'techLEF'
+ elif option[0] == 'spice':
+ option[0] = 'spi'
+
+ destdir = targetdir + '/libs.ref/' + option[0]
+ os.makedirs(destdir, exist_ok=True)
+
+        # If the option is followed by the keyword "up=<number>", then
+ # the source should be copied (or linked) from <number> levels up
+ # in the hierarchy (see below).
+
+ hier_up = 0
+ for item in option:
+ if item.split('=')[0] == 'up':
+ hier_up = int(item.split('=')[1])
+ break
+
+ filter_scripts = []
+ for item in option:
+ if item.split('=')[0] == 'filter':
+ filter_scripts.append(item.split('=')[1])
+ break
+
+ # Option 'stub' applies to netlists ('cdl' or 'spice') and generates
+ # a file with only stub entries.
+ do_stub = 'stub' in option
+
+ # Option 'compile' is a standalone keyword ('comp' may be used).
+ do_compile = 'compile' in option or 'comp' in option
+ do_compile_only = 'compile-only' in option or 'comp-only' in option
+
+ # Option 'nospecify' is a standalone keyword ('nospec' may be used).
+ do_remove_spec = 'nospecify' in option or 'nospec' in option
+
+ # Option 'exclude' has an argument
+ try:
+ excludelist = list(item.split('=')[1].split(',') for item in option if item.startswith('excl'))[0]
+ except IndexError:
+ excludelist = []
+ else:
+ print('Excluding files: ' + (',').join(excludelist))
+
+ # Option 'rename' has an argument
+ try:
+ newname = list(item.split('=')[1] for item in option if item.startswith('rename'))[0]
+ except IndexError:
+ newname = None
+ else:
+ print('Renaming file to: ' + newname)
+
+ # Option 'sort' has an argument. . .
+ try:
+ sortscript = list(item.split('=')[1] for item in option if item.startswith('sort'))[0]
+ except IndexError:
+ # If option 'sort' is not specified, then use the "natural sort" script
+ sortscript = scriptdir + '/sort_pdkfiles.py'
+ else:
+ print('Sorting files with script ' + sortscript)
+
+ # For each library, create the library subdirectory
+ for library in libraries:
+ if len(library) == 3:
+ destlib = library[2]
+ else:
+ destlib = library[1]
+
+ if ef_format:
+ destlibdir = destdir + '/' + destlib
+ else:
+ destdir = targetdir + '/libs.ref/' + destlib + '/' + option[0]
+ destlibdir = destdir
+
+ os.makedirs(destlibdir, exist_ok=True)
+
+ # Populate the library subdirectory
+ # Parse the option and replace each '/*/' with the library name,
+ # and check if it is a valid directory name. Then glob the
+ # resulting option name. Warning: This assumes that all
+            # occurrences of the text '/*/' match a library name. It should
+ # be possible to wild-card the directory name in such a way that
+ # this is always true.
+
+ testpath = substitute(sourcedir + '/' + option[1], library[1])
+ liblist = glob.glob(testpath)
+
+ # Create a file "sources.txt" (or append to it if it exists)
+ # and add the source directory name so that the staging install
+ # script can know where the files came from.
+
+ with open(destlibdir + '/sources.txt', 'a') as ofile:
+ print(testpath, file=ofile)
+
+ # Create exclude list with glob-style matching using fnmatch
+ if len(liblist) > 0:
+ liblistnames = list(os.path.split(item)[1] for item in liblist)
+ notliblist = []
+ for exclude in excludelist:
+ notliblist.extend(fnmatch.filter(liblistnames, exclude))
+
+ # Apply exclude list
+ if len(notliblist) > 0:
+ for file in liblist[:]:
+ if os.path.split(file)[1] in notliblist:
+ liblist.remove(file)
+
+ if len(excludelist) > 0 and len(notliblist) == 0:
+ print('Warning: Nothing from the exclude list found in sources.')
+ print('excludelist = ' + str(excludelist))
+ print('destlibdir = ' + destlibdir)
+
+ # Diagnostic
+ print('Collecting files from ' + testpath)
+ print('Files to install:')
+ if len(liblist) < 10:
+ for item in liblist:
+ print(' ' + item)
+ else:
+                for item in liblist[0:5]:
+ print(' ' + item)
+ print(' .')
+ print(' .')
+ print(' .')
+                for item in liblist[-5:]:
+ print(' ' + item)
+ print('(' + str(len(liblist)) + ' files total)')
+
+ destfilelist = []
+ for libname in liblist:
+ # Note that there may be a hierarchy to the files in option[1],
+ # say for liberty timing files under different conditions, so
+ # make sure directories have been created as needed.
+
+ libfile = os.path.split(libname)[1]
+ libfilepath = os.path.split(libname)[0]
+ destpathcomp = []
+ for i in range(hier_up):
+ destpathcomp.append('/' + os.path.split(libfilepath)[1])
+ libfilepath = os.path.split(libfilepath)[0]
+ destpathcomp.reverse()
+ destpath = ''.join(destpathcomp)
+
+ if option[0] == 'verilog':
+ fileext = '.v'
+ elif option[0] == 'liberty' or option[0] == 'lib':
+ fileext = '.lib'
+ elif option[0] == 'spice' or option[0] == 'spi':
+ fileext = '.spice' if not ef_format else '.spi'
+ elif option[0] == 'techlef':
+ fileext = '.lef'
+ else:
+ fileext = '.' + option[0]
+
+ if newname:
+ if os.path.splitext(newname)[1] == '':
+ newname = newname + fileext
+
+ if len(liblist) == 1:
+ destfile = newname
+                    else:
+                        if not do_compile and not do_compile_only:
+                            print('Error: rename specified but more than one file found!')
+                        destfile = libfile
+ else:
+ destfile = libfile
+
+ targname = destlibdir + destpath + '/' + destfile
+
+ # NOTE: When using "up" with link_from, could just make
+ # destpath itself a symbolic link; this way is more flexible
+ # but adds one symbolic link per file.
+
+ if destpath != '':
+ if not os.path.isdir(destlibdir + destpath):
+ os.makedirs(destlibdir + destpath, exist_ok=True)
+
+ # Remove any existing file
+ if os.path.isfile(targname):
+ os.remove(targname)
+ elif os.path.isdir(targname):
+ shutil.rmtree(targname)
+
+ # NOTE: Diagnostic, probably much too much output.
+ print(' Install:' + libname + ' to ' + targname)
+ if os.path.isfile(libname):
+ shutil.copy(libname, targname)
+ else:
+ shutil.copytree(libname, targname)
+
+ # File filtering options: Two options 'stub' and 'nospec' are
+ # handled by scripts in ../common/. Custom filters can also be
+ # specified.
+
+ local_filter_scripts = filter_scripts[:]
+
+ if option[0] == 'verilog':
+ # Internally handle syntactical issues with verilog and iverilog
+ vfilter(targname)
+
+ if do_remove_spec:
+ local_filter_scripts.append(scriptdir + '/remove_specify.py')
+
+ elif option[0] == 'cdl' or option[0] == 'spi' or option[0] == 'spice':
+ if do_stub:
+ local_filter_scripts.append(scriptdir + '/makestub.py')
+
+ for filter_script in local_filter_scripts:
+ # Apply filter script to all files in the target directory
+ tfilter(targname, filter_script)
+
+ destfilelist.append(os.path.split(targname)[1])
+
+ if sortscript:
+ with open(destlibdir + '/filelist.txt', 'w') as ofile:
+ for destfile in destfilelist:
+ print(destfile, file=ofile)
+ if os.path.isfile(sortscript):
+ print('Diagnostic: Sorting files with ' + sortscript)
+ subprocess.run([sortscript, destlibdir],
+ stdout = subprocess.DEVNULL,
+ stderr = subprocess.DEVNULL)
+
+            if do_compile or do_compile_only:
+ # NOTE: The purpose of "rename" is to put a destlib-named
+ # library elsewhere so that it can be merged with another
+ # library into a compiled <destlib>.<ext> on another pass.
+
+ compname = destlib
+
+ # To do: Make this compatible with linking from another PDK.
+
+ if option[0] == 'verilog':
+ # If there is not a single file with all verilog cells in it,
+ # then compile one, because one does not want to have to have
+ # an include line for every single cell used in a design.
+
+ create_verilog_library(destlibdir, compname, do_compile_only, do_stub, excludelist)
+
+ elif option[0] == 'gds' and have_mag_8_2:
+ # If there is not a single file with all GDS cells in it,
+ # then compile one.
+
+ # Link to the PDK magic startup file from the target directory
+ startup_script = targetdir + mag_current + pdkname + '-F.magicrc'
+ if not os.path.isfile(startup_script):
+ startup_script = targetdir + mag_current + pdkname + '.magicrc'
+ create_gds_library(destlibdir, compname, startup_script, do_compile_only, excludelist)
+
+ elif option[0] == 'liberty' or option[0] == 'lib':
+ # If there is not a single file with all liberty cells in it,
+ # then compile one, because one does not want to have to have
+ # an include line for every single cell used in a design.
+
+ create_lib_library(destlibdir, compname, do_compile_only, excludelist)
+
+ elif option[0] == 'spice' or option[0] == 'spi':
+ # If there is not a single file with all SPICE subcircuits in it,
+ # then compile one, because one does not want to have to have
+ # an include line for every single cell used in a design.
+
+ spiext = '.spice' if not ef_format else '.spi'
+ create_spice_library(destlibdir, compname, spiext, do_compile_only, do_stub, excludelist)
+
+ elif option[0] == 'cdl':
+ # If there is not a single file with all CDL subcircuits in it,
+ # then compile one, because one does not want to have to have
+ # an include line for every single cell used in a design.
+
+ create_spice_library(destlibdir, compname, '.cdl', do_compile_only, do_stub, excludelist)
+
+ elif option[0] == 'lef':
+ # If there is not a single file with all LEF cells in it,
+ # then compile one, because one does not want to have to have
+ # an include line for every single cell used in a design.
+
+ create_lef_library(destlibdir, compname, do_compile_only, excludelist)
+
+                if do_compile_only:
+ if newname and targname:
+ if os.path.isfile(targname):
+ os.remove(targname)
+
+ # "rename" with "compile" or "compile-only": Change the name
+ # of the compiled file.
+
+ if newname:
+ print(' Renaming ' + compname + fileext + ' to ' + newname)
+ origname = destlibdir + '/' + compname + fileext
+ targrename = destlibdir + destpath + '/' + newname
+ if os.path.isfile(origname):
+ os.rename(origname, targrename)
+
+ # If "filelist.txt" was created, remove it
+ if sortscript:
+ if os.path.isfile(destlibdir + '/filelist.txt'):
+ os.remove(destlibdir + '/filelist.txt')
+
+ # Find any libraries/options marked as "privileged" (or "private") and
+ # move the files from libs.tech or libs.ref to libs.priv, leaving a
+ # symbolic link in the original location. Do this during the initial
+ # install so that options following in the list can add files to the
+ # non-privileged equivalent directory path.
+
+ if 'priv' in option or 'privileged' in option or 'private' in option:
+
+ # Diagnostic
+ print("Install option: " + str(option[0]))
+
+        if ef_format:
+ os.makedirs(targetdir + '/libs.priv', exist_ok=True)
+
+ for library in libraries:
+ if len(library) == 3:
+ destlib = library[2]
+ else:
+ destlib = library[1]
+
+ if ef_format:
+ srclibdir = targetdir + '/libs.ref/' + option[0] + '/' + destlib
+ destlibdir = targetdir + '/libs.priv/' + option[0] + '/' + destlib
+ else:
+ srclibdir = targetdir + '/libs.ref/' + destlib + '/' + option[0]
+ destlibdir = targetdir + '/libs.priv/' + destlib + '/' + option[0]
+
+ if not os.path.exists(destlibdir):
+ os.makedirs(destlibdir)
+
+ print('Moving files in ' + srclibdir + ' to privileged space.')
+ filelist = os.listdir(srclibdir)
+ for file in filelist:
+ srcfile = srclibdir + '/' + file
+ destfile = destlibdir + '/' + file
+ if os.path.isfile(destfile):
+ os.remove(destfile)
+ elif os.path.isdir(destfile):
+ shutil.rmtree(destfile)
+
+ if os.path.isfile(srcfile):
+ shutil.copy(srcfile, destfile)
+ os.remove(srcfile)
+ else:
+ shutil.copytree(srcfile, destfile)
+ shutil.rmtree(srcfile)
+
+ print("Completed installation of vendor files.")
+
+ #----------------------------------------------------------------
+ # Installation part 2: Generate derived file formats
+ #----------------------------------------------------------------
+
+ # Now for the harder part. If GDS and/or LEF databases were specified,
+ # then migrate them to magic (.mag files in layout/ or abstract/).
+
+ ignorelist = []
+ tclscript = None
+ do_cdl_scaleu = False
+ no_cdl_convert = False
+ no_gds_convert = False
+ no_lef_convert = False
+ cdl_compile_only = False
+ lef_compile = False
+ lef_compile_only = False
+
+ cdl_exclude = []
+ lef_exclude = []
+ gds_exclude = []
+ spice_exclude = []
+ verilog_exclude = []
+
+ cdl_reflib = '/libs.ref/'
+ gds_reflib = '/libs.ref/'
+ lef_reflib = '/libs.ref/'
+
+ for option in optionlist[:]:
+ if option[0] == 'cdl':
+ # Option 'scaleu' is a standalone keyword
+ do_cdl_scaleu = 'scaleu' in option
+
+ # Option 'ignore' has arguments after '='
+ for item in option:
+ if item.split('=')[0] == 'ignore':
+ ignorelist = item.split('=')[1].split(',')
+
+ elif option[0] == 'gds':
+ for item in option:
+ if item.split('=')[0] == 'options':
+ tclscript = item.split('=')[1]
+ tcllines = []
+ print('Adding Tcl script options from file ' + tclscript)
+
+ # Option 'noconvert' is a standalone keyword.
+ if 'noconvert' in option:
+ if option[0] == 'cdl':
+ no_cdl_convert = True
+ elif option[0] == 'gds':
+ no_gds_convert = True
+ elif option[0] == 'lef':
+ no_lef_convert = True
+
+ # Option 'privileged' is a standalone keyword.
+ if 'priv' in option or 'privileged' in option or 'private' in option:
+ if option[0] == 'cdl':
+ cdl_reflib = '/libs.priv/'
+ elif option[0] == 'gds':
+ gds_reflib = '/libs.priv/'
+ elif option[0] == 'lef':
+ lef_reflib = '/libs.priv/'
+
+ # If CDL is marked 'compile-only' then CDL should only convert the
+ # compiled file to SPICE if conversion is needed. If LEF is marked
+ # 'compile' or 'compile-only' in annotate mode, then create a LEF
+ # library from magic LEF output.
+
+ if 'compile-only' in option:
+ if option[0] == 'cdl':
+ cdl_compile_only = True
+ elif option[0] == 'lef':
+ lef_compile_only = True
+ elif 'compile' in option:
+ if option[0] == 'lef':
+ lef_compile = True
+
+ # Find exclude list for any option
+ for item in option:
+ if item.split('=')[0] == 'exclude':
+ exclude_list = item.split('=')[1].split(',')
+ if option[0] == 'cdl':
+ cdl_exclude = exclude_list
+ elif option[0] == 'lef':
+ lef_exclude = exclude_list
+ elif option[0] == 'gds':
+ gds_exclude = exclude_list
+ elif option[0] == 'spi' or option[0] == 'spice':
+ spice_exclude = exclude_list
+ elif option[0] == 'verilog':
+ verilog_exclude = exclude_list
+
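The option handling above assumes each entry of "optionlist" is a list of tokens such as ['cdl', 'scaleu', 'exclude=a,b'], with 'key=value' items picked apart by split('='). A minimal standalone sketch of the exclude-list extraction (the function name and sample tokens are illustrative, not part of the installer):

```python
def parse_exclude(option):
    """Return the comma-separated exclude list from an option token list.

    Mirrors the 'exclude=' handling above: each token is checked for the
    'exclude' keyword before '=' and its value split on commas.
    """
    for item in option:
        if item.split('=')[0] == 'exclude':
            return item.split('=')[1].split(',')
    return []

print(parse_exclude(['cdl', 'scaleu', 'exclude=cell1,cell2']))
```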
+ devlist = []
+ pdklibrary = None
+
+ if tclscript:
+ # If tclscript is a file, then read it. Otherwise, assume
+ # that the option contents should be inserted verbatim.
+ if os.path.isfile(tclscript):
+ with open(tclscript, 'r') as ifile:
+ tcllines = ifile.read().splitlines()
+ else:
+ # Not a file: treat the option value as a single verbatim Tcl line.
+ # (list(tclscript) would split the string into characters.)
+ tcllines = [tclscript]
+
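The file-versus-verbatim handling of the 'options=' value can be sketched as a small helper (the function name and sample option string are illustrative):

```python
import os

def read_tcl_options(tclscript):
    """Return custom Tcl lines for a 'gds options=' argument.

    If the argument names a readable file, use its contents line by line;
    otherwise treat the argument itself as one verbatim Tcl command,
    mirroring the logic above.
    """
    if os.path.isfile(tclscript):
        with open(tclscript, 'r') as ifile:
            return ifile.read().splitlines()
    return [tclscript]

print(read_tcl_options('gds flatglob *_example_*'))
```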
+ if have_gds and not no_gds_convert:
+ print("Migrating GDS files to layout.")
+
+ if ef_format:
+ destdir = targetdir + gds_reflib + 'mag'
+ srcdir = targetdir + gds_reflib + 'gds'
+ vdir = targetdir + '/libs.ref/' + 'verilog'
+ cdir = targetdir + cdl_reflib + 'cdl'
+ sdir = targetdir + cdl_reflib + 'spi'
+
+ os.makedirs(destdir, exist_ok=True)
+
+ # For each library, create the library subdirectory
+ for library in libraries:
+ if len(library) == 3:
+ destlib = library[2]
+ else:
+ destlib = library[1]
+
+ if ef_format:
+ destlibdir = destdir + '/' + destlib
+ srclibdir = srcdir + '/' + destlib
+ vlibdir = vdir + '/' + destlib
+ clibdir = cdir + '/' + destlib
+ slibdir = sdir + '/' + destlib
+ else:
+ destdir = targetdir + gds_reflib + destlib + '/mag'
+ srcdir = targetdir + gds_reflib + destlib + '/gds'
+ vdir = targetdir + '/libs.ref/' + destlib + '/verilog'
+ cdir = targetdir + cdl_reflib + destlib + '/cdl'
+ sdir = targetdir + cdl_reflib + destlib + '/spice'
+ destlibdir = destdir
+ srclibdir = srcdir
+ vlibdir = vdir
+ clibdir = cdir
+ slibdir = sdir
+
+ os.makedirs(destlibdir, exist_ok=True)
+
+ # For primitive devices, check the PDK script and find the name
+ # of the library and get a list of supported devices.
+
+ if library[0] == 'primitive':
+ pdkscript = targetdir + mag_current + pdkname + '.tcl'
+ print('Searching for supported devices in PDK script ' + pdkscript + '.')
+
+ if os.path.isfile(pdkscript):
+ librex = re.compile(r'^[ \t]*set[ \t]+PDKNAMESPACE[ \t]+([^ \t]+)$')
+ devrex = re.compile(r'^[ \t]*proc[ \t]+([^ :\t]+)::([^ \t_]+)_defaults')
+ fixrex = re.compile(r'^[ \t]*return[ \t]+\[([^ :\t]+)::fixed_draw[ \t]+([^ \t]+)[ \t]+')
+ devlist = []
+ fixedlist = []
+ with open(pdkscript, 'r') as ifile:
+ scripttext = ifile.read().splitlines()
+ for line in scripttext:
+ lmatch = librex.match(line)
+ if lmatch:
+ pdklibrary = lmatch.group(1)
+ dmatch = devrex.match(line)
+ if dmatch:
+ if dmatch.group(1) == pdklibrary:
+ devlist.append(dmatch.group(2))
+ fmatch = fixrex.match(line)
+ if fmatch:
+ if fmatch.group(1) == pdklibrary:
+ fixedlist.append(fmatch.group(2))
+
+ # Diagnostic
+ print("PDK library is " + str(pdklibrary))
+
+ # Link to the PDK magic startup file from the target directory.
+ # Prefer the "-F" version; if it does not exist, fall back to the
+ # plain startup script (open source PDK).
+ startup_script = targetdir + mag_current + pdkname + '-F.magicrc'
+ if not os.path.isfile(startup_script):
+ startup_script = targetdir + mag_current + pdkname + '.magicrc'
+
+ if have_mag_8_2 and os.path.isfile(startup_script):
+ # If the symbolic link exists, remove it.
+ if os.path.isfile(destlibdir + '/.magicrc'):
+ os.remove(destlibdir + '/.magicrc')
+ os.symlink(startup_script, destlibdir + '/.magicrc')
+
+ # Find GDS file names in the source
+ print('Getting GDS file list from ' + srclibdir + '.')
+ gdsfilesraw = os.listdir(srclibdir)
+ gdsfiles = []
+ for gdsfile in gdsfilesraw:
+ gdsext = os.path.splitext(gdsfile)[1].lower()
+ if gdsext == '.gds' or gdsext == '.gdsii' or gdsext == '.gds2':
+ gdsfiles.append(gdsfile)
+
+ # Create exclude list with glob-style matching using fnmatch
+ if len(gdsfiles) > 0:
+ gdsnames = list(os.path.split(item)[1] for item in gdsfiles)
+ notgdsnames = []
+ for exclude in gds_exclude:
+ notgdsnames.extend(fnmatch.filter(gdsnames, exclude))
+
+ # Apply exclude list
+ if len(notgdsnames) > 0:
+ for file in gdsfiles[:]:
+ if os.path.split(file)[1] in notgdsnames:
+ gdsfiles.remove(file)
+
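The glob-style exclude filtering used above (and repeated for LEF, verilog, CDL, and SPICE below) can be condensed into one helper; this is a sketch with made-up file names, not part of the installer:

```python
import fnmatch

def apply_exclude(filenames, patterns):
    """Drop any file whose name matches a glob-style exclude pattern,
    mirroring the fnmatch.filter-based exclusion above."""
    excluded = set()
    for pattern in patterns:
        excluded.update(fnmatch.filter(filenames, pattern))
    return [f for f in filenames if f not in excluded]

gdsfiles = ['examplelib__inv.gds', 'examplelib__fill.gds', 'examplelib__buf.gds']
print(apply_exclude(gdsfiles, ['*fill*']))
```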
+ # Generate a script called "generate_magic.tcl" and leave it in
+ # the target directory. Use it as input to magic to create the
+ # .mag files from the database.
+
+ print('Creating magic generation script to generate magic database files.')
+ with open(destlibdir + '/generate_magic.tcl', 'w') as ofile:
+ print('#!/usr/bin/env wish', file=ofile)
+ print('#--------------------------------------------', file=ofile)
+ print('# Script to generate .mag files from .gds ', file=ofile)
+ print('#--------------------------------------------', file=ofile)
+ print('crashbackups stop', file=ofile)
+ print('drc off', file=ofile)
+ print('gds readonly true', file=ofile)
+ print('gds flatten true', file=ofile)
+ print('gds rescale false', file=ofile)
+ print('tech unlock *', file=ofile)
+
+ # Add custom Tcl script lines before "gds read".
+ if tclscript:
+ for line in tcllines:
+ print(line, file=ofile)
+
+ for gdsfile in gdsfiles:
+ # Note: DO NOT use a relative path here.
+ print('gds read ' + srclibdir + '/' + gdsfile, file=ofile)
+
+ # Make sure properties include the Tcl generated cell
+ # information from the PDK script
+
+ if pdklibrary:
+ tclfixedlist = '{' + ' '.join(fixedlist) + '}'
+ print('set devlist ' + tclfixedlist, file=ofile)
+ print('set topcell [lindex [cellname list top] 0]',
+ file=ofile)
+
+ print('foreach cellname $devlist {', file=ofile)
+ print(' load $cellname', file=ofile)
+ print(' property gencell $cellname', file=ofile)
+ print(' property parameter m=1', file=ofile)
+ print(' property library ' + pdklibrary, file=ofile)
+ print('}', file=ofile)
+ print('load $topcell', file=ofile)
+
+ else:
+ # Use LEF files to set the port properties
+ if have_lefanno or have_lef:
+ lefdirname = 'lef'
+
+ # Find LEF file names in the source
+ if ef_format:
+ lefsrcdir = targetdir + lef_reflib + lefdirname
+ lefsrclibdir = lefsrcdir + '/' + destlib
+ else:
+ lefsrcdir = targetdir + lef_reflib + destlib + '/' + lefdirname
+ lefsrclibdir = lefsrcdir
+
+ leffiles = os.listdir(lefsrclibdir)
+ leffiles = list(item for item in leffiles if os.path.splitext(item)[1] == '.lef')
+ if len(leffiles) > 0:
+ lefnames = list(os.path.split(item)[1] for item in leffiles)
+ notlefnames = []
+ for exclude in lef_exclude:
+ notlefnames.extend(fnmatch.filter(lefnames, exclude))
+
+ # Apply exclude list
+ if len(notlefnames) > 0:
+ for file in leffiles[:]:
+ if os.path.split(file)[1] in notlefnames:
+ leffiles.remove(file)
+
+ if len(leffiles) > 0:
+ print('puts stdout "Annotating cells from LEF"', file=ofile)
+ for leffile in leffiles:
+ print('lef read ' + lefsrclibdir + '/' + leffile, file=ofile)
+
+ # Use CDL or SPICE netlists to set the port order
+ if have_cdl or have_spice:
+ if have_cdl:
+ netdir = clibdir
+ else:
+ netdir = slibdir
+
+ # Find CDL/SPICE file names in the source
+ # Ignore "sources.txt" if it is in the list.
+ netfiles = os.listdir(netdir)
+ print('puts stdout "Annotating cells from CDL/SPICE"',
+ file=ofile)
+ for netfile in netfiles:
+ if os.path.split(netfile)[1] != 'sources.txt':
+ print('catch {readspice ' + netdir + '/' + netfile
+ + '}', file=ofile)
+
+ # print('cellname delete \(UNNAMED\)', file=ofile)
+ print('puts stdout "Writing all magic database files"', file=ofile)
+ print('writeall force', file=ofile)
+
+ leffiles = []
+ lefmacros = []
+ if have_lef:
+ # Nothing to do; LEF macros were already installed.
+ pass
+ elif have_lefanno:
+ # Find LEF file names in the source
+ if ef_format:
+ lefsrcdir = targetdir + lef_reflib + 'lef'
+ lefsrclibdir = lefsrcdir + '/' + destlib
+ else:
+ lefsrcdir = targetdir + lef_reflib + destlib + '/lef'
+ lefsrclibdir = lefsrcdir
+
+ leffiles = os.listdir(lefsrclibdir)
+ leffiles = list(item for item in leffiles if os.path.splitext(item)[1] == '.lef')
+ # Create exclude list with glob-style matching using fnmatch
+ if len(leffiles) > 0:
+ lefnames = list(os.path.split(item)[1] for item in leffiles)
+ notlefnames = []
+ for exclude in lef_exclude:
+ notlefnames.extend(fnmatch.filter(lefnames, exclude))
+
+ # Apply exclude list
+ if len(notlefnames) > 0:
+ for file in leffiles[:]:
+ if os.path.split(file)[1] in notlefnames:
+ leffiles.remove(file)
+
+ # Get list of abstract views to make from LEF macros
+ # (Note: exclude list can only contain the file being
+ # read, not individual macro names in the file; might
+ # need some additional feature to accommodate this.)
+ for leffile in leffiles:
+ with open(lefsrclibdir + '/' + leffile, 'r') as ifile:
+ ltext = ifile.read()
+ llines = ltext.splitlines()
+ for lline in llines:
+ ltok = re.split(r' |\t|\(', lline)
+ if ltok[0] == 'MACRO':
+ lefmacros.append(ltok[1])
+
+ elif have_verilog and os.path.isdir(vlibdir):
+ # Get list of abstract views to make from verilog modules
+ # (NOTE: no way to apply exclude list here!)
+ vfiles = os.listdir(vlibdir)
+ vfiles = list(item for item in vfiles if os.path.splitext(item)[1] == '.v')
+ # Create exclude list with glob-style matching using fnmatch
+ if len(vfiles) > 0:
+ vnames = list(os.path.split(item)[1] for item in vfiles)
+ notvnames = []
+ for exclude in verilog_exclude:
+ notvnames.extend(fnmatch.filter(vnames, exclude))
+
+ # Apply exclude list
+ if len(notvnames) > 0:
+ for file in vfiles[:]:
+ if os.path.split(file)[1] in notvnames:
+ vfiles.remove(file)
+
+ for vfile in vfiles:
+ with open(vlibdir + '/' + vfile, 'r') as ifile:
+ vtext = ifile.read()
+ vlines = vtext.splitlines()
+ for vline in vlines:
+ vtok = re.split(r' |\t|\(', vline)
+ try:
+ if vtok[0] == 'module':
+ if vtok[1] not in lefmacros:
+ lefmacros.append(vtok[1])
+ except IndexError:
+ pass
+
+ elif have_cdl and os.path.isdir(clibdir):
+ # Get list of abstract views to make from CDL subcircuits
+ cfiles = os.listdir(clibdir)
+ cfiles = list(item for item in cfiles if os.path.splitext(item)[1] == '.cdl')
+ # Create exclude list with glob-style matching using fnmatch
+ if len(cfiles) > 0:
+ cnames = list(os.path.split(item)[1] for item in cfiles)
+ notcnames = []
+ for exclude in cdl_exclude:
+ notcnames.extend(fnmatch.filter(cnames, exclude))
+
+ # Apply exclude list
+ if len(notcnames) > 0:
+ for file in cfiles[:]:
+ if os.path.split(file)[1] in notcnames:
+ cfiles.remove(file)
+
+ for cfile in cfiles:
+ with open(clibdir + '/' + cfile, 'r') as ifile:
+ ctext = ifile.read()
+ clines = ctext.splitlines()
+ for cline in clines:
+ ctok = cline.split()
+ try:
+ if ctok[0].lower() == '.subckt':
+ if ctok[1] not in lefmacros:
+ lefmacros.append(ctok[1])
+ except IndexError:
+ pass
+
+ elif have_spice and os.path.isdir(slibdir):
+ # Get list of abstract views to make from SPICE subcircuits
+ sfiles = os.listdir(slibdir)
+
+ # Create exclude list with glob-style matching using fnmatch
+ if len(sfiles) > 0:
+ snames = list(os.path.split(item)[1] for item in sfiles)
+ notsnames = []
+ for exclude in spice_exclude:
+ notsnames.extend(fnmatch.filter(snames, exclude))
+
+ # Apply exclude list
+ if len(notsnames) > 0:
+ for file in sfiles[:]:
+ if os.path.split(file)[1] in notsnames:
+ sfiles.remove(file)
+
+ for sfile in sfiles:
+ with open(slibdir + '/' + sfile, 'r') as ifile:
+ stext = ifile.read()
+ slines = stext.splitlines()
+ for sline in slines:
+ stok = sline.split()
+ try:
+ if stok[0].lower() == '.subckt':
+ if stok[1] not in lefmacros:
+ lefmacros.append(stok[1])
+ except IndexError:
+ pass
+
+ if not lefmacros:
+ print('No source for abstract views: Abstract views not made.')
+ elif not have_lef:
+ # This library has a GDS database but no LEF database. Use
+ # magic to create abstract views of the GDS cells. If
+ # option "annotate" is given, then read the LEF file after
+ # loading the database file to annotate the cell with
+ # information from the LEF file. This usually indicates
+ # that the LEF file has some weird definition of obstruction
+ # layers and we want to normalize them by using magic's LEF
+ # write procedure, but we still need the pin use and class
+ # information from the LEF file, and maybe the bounding box.
+
+
+ # For annotation, the LEF file output will overwrite the
+ # original source LEF file.
+ lefdest = lefsrclibdir + '/' if have_lefanno else ''
+
+ for leffile in leffiles:
+ if have_lefanno:
+ print('lef read ' + lefsrclibdir + '/' + leffile, file=ofile)
+ for lefmacro in lefmacros:
+ print('if {[cellname list exists ' + lefmacro + '] != 0} {', file=ofile)
+ print(' load ' + lefmacro, file=ofile)
+ print(' lef write ' + lefdest + lefmacro + ' -hide', file=ofile)
+ print('}', file=ofile)
+
+ print('puts stdout "Done."', file=ofile)
+ print('quit -noprompt', file=ofile)
+
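The "generate a script, feed it to a batch tool on stdin, capture stdout/stderr" pattern used here recurs throughout the installer. A self-contained sketch, using the Python interpreter itself as a stand-in for magic so the example runs anywhere:

```python
import os
import subprocess
import sys
import tempfile

# Write a generated script, then run a batch tool with the script on
# stdin, capturing both output streams (mirrors the magic invocation).
script = 'print("Done.")\n'
with tempfile.TemporaryDirectory() as workdir:
    scriptfile = os.path.join(workdir, 'generate.py')
    with open(scriptfile, 'w') as ofile:
        ofile.write(script)
    with open(scriptfile, 'r') as ifile:
        proc = subprocess.run([sys.executable, '-'],
                              stdin=ifile, stdout=subprocess.PIPE,
                              stderr=subprocess.PIPE,
                              universal_newlines=True)
    if proc.returncode != 0:
        print('ERROR: tool exited with status ' + str(proc.returncode))
print(proc.stdout.strip())
```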
+ print('Running magic to create magic database files.')
+ sys.stdout.flush()
+
+ # Run magic to read in the GDS file and write out magic databases.
+ with open(destlibdir + '/generate_magic.tcl', 'r') as ifile:
+ mproc = subprocess.run(['magic', '-dnull', '-noconsole'],
+ stdin = ifile, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, cwd = destlibdir,
+ universal_newlines = True)
+ if mproc.stdout:
+ for line in mproc.stdout.splitlines():
+ print(line)
+ if mproc.stderr:
+ print('Error message output from magic:')
+ for line in mproc.stderr.splitlines():
+ print(line)
+ if mproc.returncode != 0:
+ print('ERROR: Magic exited with status ' + str(mproc.returncode))
+
+ # Set have_lef now that LEF files were made, so they
+ # can be used to generate the maglef/ databases.
+ have_lef = True
+
+ elif not have_mag_8_2:
+ print('The installer is not able to run magic.')
+ else:
+ print("Master PDK magic startup file not found. Did you install")
+ print("PDK tech files before PDK vendor files?")
+
+ if have_lefanno:
+ # LEF files were used for annotation. If "compile" or "compile-only"
+ # was also passed as an option, then build the LEF library now from
+ # the LEF output from magic.
+ if lef_compile or lef_compile_only:
+ print("Compiling LEF library from magic output.")
+ create_lef_library(lefsrclibdir, destlib, lef_compile_only, lef_exclude)
+
+ if have_lef and not no_lef_convert:
+ print("Migrating LEF files to layout.")
+ if ef_format:
+ destdir = targetdir + '/libs.ref/' + 'maglef'
+ srcdir = targetdir + lef_reflib + 'lef'
+ magdir = targetdir + gds_reflib + 'mag'
+ cdldir = targetdir + cdl_reflib + 'cdl'
+ os.makedirs(destdir, exist_ok=True)
+
+ # For each library, create the library subdirectory
+ for library in libraries:
+ if len(library) == 3:
+ destlib = library[2]
+ else:
+ destlib = library[1]
+
+ if ef_format:
+ destlibdir = destdir + '/' + destlib
+ srclibdir = srcdir + '/' + destlib
+ maglibdir = magdir + '/' + destlib
+ cdllibdir = cdldir + '/' + destlib
+ clibdir = cdir + '/' + destlib
+ slibdir = sdir + '/' + destlib
+ else:
+ destdir = targetdir + '/libs.ref/' + destlib + '/maglef'
+ srcdir = targetdir + lef_reflib + destlib + '/lef'
+ magdir = targetdir + gds_reflib + destlib + '/mag'
+ cdldir = targetdir + cdl_reflib + destlib + '/cdl'
+ cdir = targetdir + cdl_reflib + destlib + '/cdl'
+ sdir = targetdir + cdl_reflib + destlib + '/spice'
+
+ destlibdir = destdir
+ srclibdir = srcdir
+ maglibdir = magdir
+ cdllibdir = cdldir
+ clibdir = cdir
+ slibdir = sdir
+
+ os.makedirs(destlibdir, exist_ok=True)
+
+ # Link to the PDK magic startup file from the target directory
+ startup_script = targetdir + mag_current + pdkname + '-F.magicrc'
+ if not os.path.isfile(startup_script):
+ startup_script = targetdir + mag_current + pdkname + '.magicrc'
+
+ if have_mag_8_2 and os.path.isfile(startup_script):
+ # If the symbolic link exists, remove it.
+ if os.path.isfile(destlibdir + '/.magicrc'):
+ os.remove(destlibdir + '/.magicrc')
+ os.symlink(startup_script, destlibdir + '/.magicrc')
+
+ # Find LEF file names in the source
+ leffiles = os.listdir(srclibdir)
+ leffiles = list(item for item in leffiles if os.path.splitext(item)[1].lower() == '.lef')
+
+ # Get list of abstract views to make from LEF macros
+ lefmacros = []
+ err_no_macros = False
+ for leffile in leffiles:
+ with open(srclibdir + '/' + leffile, 'r') as ifile:
+ ltext = ifile.read()
+ llines = ltext.splitlines()
+ for lline in llines:
+ ltok = re.split(r' |\t|\(', lline)
+ if ltok[0] == 'MACRO':
+ lefmacros.append(ltok[1])
+
+ # Create exclude list with glob-style matching using fnmatch
+ if len(lefmacros) > 0:
+ lefnames = list(os.path.split(item)[1] for item in lefmacros)
+ notlefnames = []
+ for exclude in lef_exclude:
+ notlefnames.extend(fnmatch.filter(lefnames, exclude))
+
+ # Apply exclude list
+ if len(notlefnames) > 0:
+ for file in lefmacros[:]:
+ if os.path.split(file)[1] in notlefnames:
+ lefmacros.remove(file)
+
+ if len(leffiles) == 0:
+ print('Warning: No LEF files found in ' + srclibdir)
+ continue
+
+ print('Generating conversion script to create magic databases from LEF')
+
+ # Generate a script called "generate_magic.tcl" and leave it in
+ # the target directory. Use it as input to magic to create the
+ # .mag files from the database.
+
+ with open(destlibdir + '/generate_magic.tcl', 'w') as ofile:
+ print('#!/usr/bin/env wish', file=ofile)
+ print('#--------------------------------------------', file=ofile)
+ print('# Script to generate .mag files from .lef ', file=ofile)
+ print('#--------------------------------------------', file=ofile)
+ print('tech unlock *', file=ofile)
+
+ # If there are devices in the LEF file that come from the
+ # PDK library, then copy this list into the script.
+
+ if pdklibrary:
+ shortdevlist = []
+ for macro in lefmacros:
+ if macro in devlist:
+ shortdevlist.append(macro)
+
+ tcldevlist = '{' + ' '.join(shortdevlist) + '}'
+ print('set devlist ' + tcldevlist, file=ofile)
+
+ for leffile in leffiles:
+ print('lef read ' + srclibdir + '/' + leffile, file=ofile)
+
+ # Use CDL or SPICE netlists to make sure that ports are
+ # present, and to set the port order
+
+ if have_cdl or have_spice:
+ if have_cdl:
+ netdir = clibdir
+ else:
+ netdir = slibdir
+
+ # Find CDL/SPICE file names in the source
+ # Ignore "sources.txt" if it is in the list.
+ netfiles = os.listdir(netdir)
+ print('puts stdout "Annotating cells from CDL/SPICE"',
+ file=ofile)
+ for netfile in netfiles:
+ if os.path.split(netfile)[1] != 'sources.txt':
+ print('catch {readspice ' + netdir + '/' + netfile
+ + '}', file=ofile)
+
+ for lefmacro in lefmacros:
+
+ if pdklibrary and lefmacro in shortdevlist:
+ print('set cellname ' + lefmacro, file=ofile)
+ print('if {[lsearch $devlist $cellname] >= 0} {',
+ file=ofile)
+ print(' load $cellname', file=ofile)
+ print(' property gencell $cellname', file=ofile)
+ print(' property parameter m=1', file=ofile)
+ print(' property library ' + pdklibrary, file=ofile)
+ print('}', file=ofile)
+
+ # Load one of the LEF files so that the default (UNNAMED) cell
+ # is not loaded, then delete (UNNAMED) so it doesn't generate
+ # an error message.
+ if len(lefmacros) > 0:
+ print('load ' + lefmacros[0], file=ofile)
+ # print('cellname delete \(UNNAMED\)', file=ofile)
+ else:
+ err_no_macros = True
+ print('writeall force', file=ofile)
+ print('puts stdout "Done."', file=ofile)
+ print('quit -noprompt', file=ofile)
+
+ if err_no_macros:
+ print('Warning: No LEF macros were defined.')
+
+ print('Running magic to create magic databases from LEF')
+ sys.stdout.flush()
+
+ # Run magic to read in the LEF file and write out magic databases.
+ with open(destlibdir + '/generate_magic.tcl', 'r') as ifile:
+ mproc = subprocess.run(['magic', '-dnull', '-noconsole'],
+ stdin = ifile, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, cwd = destlibdir,
+ universal_newlines = True)
+ if mproc.stdout:
+ for line in mproc.stdout.splitlines():
+ print(line)
+ if mproc.stderr:
+ print('Error message output from magic:')
+ for line in mproc.stderr.splitlines():
+ print(line)
+ if mproc.returncode != 0:
+ print('ERROR: Magic exited with status ' + str(mproc.returncode))
+
+
+ # Now list all the .mag files generated, and for each, read the
+ # corresponding file from the mag/ directory, pull the GDS file
+ # properties, and add those properties to the maglef view. Also
+ # read the CDL (or SPICE) netlist, read the ports, and rewrite
+ # the port order in the mag and maglef file accordingly.
+
+ # Diagnostic
+ print('Annotating files in ' + destlibdir)
+ sys.stdout.flush()
+ magfiles = os.listdir(destlibdir)
+ magfiles = list(item for item in magfiles if os.path.splitext(item)[1] == '.mag')
+ for magroot in magfiles:
+ magname = os.path.splitext(magroot)[0]
+ magfile = maglibdir + '/' + magroot
+ magleffile = destlibdir + '/' + magroot
+ prop_lines = get_gds_properties(magfile)
+
+ # Make sure properties include the Tcl generated cell
+ # information from the PDK script
+
+ prop_gencell = []
+ if pdklibrary:
+ if magname in fixedlist:
+ prop_gencell.append('gencell ' + magname)
+ prop_gencell.append('library ' + pdklibrary)
+ prop_gencell.append('parameter m=1')
+
+ nprops = len(prop_lines) + len(prop_gencell)
+
+ cdlfile = cdllibdir + '/' + magname + '.cdl'
+ if os.path.exists(cdlfile):
+ cdlfiles = [cdlfile]
+ else:
+ # Assume there is at least one file with all cell subcircuits
+ # in it.
+ cdlfiles = glob.glob(cdllibdir + '/*.cdl')
+ if len(cdlfiles) > 0:
+ for cdlfile in cdlfiles:
+ port_dict = get_subckt_ports(cdlfile, magname)
+ if port_dict != {}:
+ break
+ else:
+ port_dict = {}
+
+ if port_dict == {}:
+ print('No CDL file contains ' + destlib + ' device ' + magname)
+ cdlfile = None
+ # To be done: If destlib is 'primitive', then look in
+ # SPICE models for port order.
+ if destlib == 'primitive':
+ print('Fix me: Need to look in SPICE models!')
+
+ proprex = re.compile('<< properties >>')
+ endrex = re.compile('<< end >>')
+ rlabrex = re.compile(r'rlabel[ \t]+(?:[^ \t]+[ \t]+){6}([^ \t]+)')
+ flabrex = re.compile(r'flabel[ \t]+.*[ \t]+([^ \t]+)[ \t]*')
+ portrex = re.compile(r'port[ \t]+([^ \t]+)[ \t]+(.*)')
+ gcellrex = re.compile('string gencell')
+ portnum = -1
+
+ with open(magleffile, 'r') as ifile:
+ magtext = ifile.read().splitlines()
+
+ with open(magleffile, 'w') as ofile:
+ has_props = False
+ is_gencell = False
+ for line in magtext:
+ tmatch = portrex.match(line)
+ if tmatch:
+ if portnum >= 0:
+ line = 'port ' + str(portnum) + ' ' + tmatch.group(2)
+ else:
+ line = 'port ' + tmatch.group(1) + ' ' + tmatch.group(2)
+ ematch = endrex.match(line)
+ if ematch and nprops > 0:
+ if not has_props:
+ print('<< properties >>', file=ofile)
+ if not is_gencell:
+ for prop in prop_gencell:
+ print('string ' + prop, file=ofile)
+ for prop in prop_lines:
+ print('string ' + prop, file=ofile)
+
+ print(line, file=ofile)
+ pmatch = proprex.match(line)
+ if pmatch:
+ has_props = True
+
+ gmatch = gcellrex.match(line)
+ if gmatch:
+ is_gencell = True
+
+ lmatch = flabrex.match(line)
+ if not lmatch:
+ lmatch = rlabrex.match(line)
+ if lmatch:
+ labname = lmatch.group(1).lower()
+ try:
+ portnum = port_dict[labname]
+ except KeyError:
+ portnum = -1
+
+ if os.path.exists(magfile):
+ with open(magfile, 'r') as ifile:
+ magtext = ifile.read().splitlines()
+
+ with open(magfile, 'w') as ofile:
+ for line in magtext:
+ tmatch = portrex.match(line)
+ if tmatch:
+ if portnum >= 0:
+ line = 'port ' + str(portnum) + ' ' + tmatch.group(2)
+ else:
+ line = 'port ' + tmatch.group(1) + ' ' + tmatch.group(2)
+ ematch = endrex.match(line)
+ print(line, file=ofile)
+ lmatch = flabrex.match(line)
+ if not lmatch:
+ lmatch = rlabrex.match(line)
+ if lmatch:
+ labname = lmatch.group(1).lower()
+ try:
+ portnum = port_dict[labname]
+ except KeyError:
+ portnum = -1
+ elif os.path.splitext(magfile)[1] == '.mag':
+ # NOTE: Possibly this means the GDS cell has a different name.
+ print('Error: No file ' + magfile + '. Why is it in maglef???')
+
+ elif not have_mag_8_2:
+ print('The installer is not able to run magic.')
+ else:
+ print("Master PDK magic startup file not found. Did you install")
+ print("PDK tech files before PDK vendor files?")
+
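The port renumbering above rewrites each "port" line in the .mag file, keeping its flags (the second match group) but substituting the index found from the CDL/SPICE subcircuit pin order. A minimal sketch (the helper name and sample lines are illustrative):

```python
import re

# A magic 'port' line is 'port <index> <flags...>'; replace the index
# with the number looked up from the netlist port order, as above.
portrex = re.compile(r'port[ \t]+([^ \t]+)[ \t]+(.*)')

def renumber_port(line, portnum):
    tmatch = portrex.match(line)
    if tmatch and portnum >= 0:
        return 'port ' + str(portnum) + ' ' + tmatch.group(2)
    return line

print(renumber_port('port 3 nsew signal input', 0))
```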
+ # If SPICE or CDL databases were specified, then convert them to
+ # a form that can be used by ngspice, using the cdl2spi.py script
+
+ if have_spice:
+ if ef_format:
+ if not os.path.isdir(targetdir + cdl_reflib + 'spi'):
+ os.makedirs(targetdir + cdl_reflib + 'spi', exist_ok=True)
+
+ elif have_cdl and not no_cdl_convert:
+ if ef_format:
+ if not os.path.isdir(targetdir + cdl_reflib + 'spi'):
+ os.makedirs(targetdir + cdl_reflib + 'spi', exist_ok=True)
+
+ print("Migrating CDL netlists to SPICE.")
+ sys.stdout.flush()
+
+ if ef_format:
+ destdir = targetdir + cdl_reflib + 'spi'
+ srcdir = targetdir + cdl_reflib + 'cdl'
+ os.makedirs(destdir, exist_ok=True)
+
+ # For each library, create the library subdirectory
+ for library in libraries:
+ if len(library) == 3:
+ destlib = library[2]
+ else:
+ destlib = library[1]
+
+ if ef_format:
+ destlibdir = destdir + '/' + destlib
+ srclibdir = srcdir + '/' + destlib
+ else:
+ destdir = targetdir + cdl_reflib + destlib + '/spice'
+ srcdir = targetdir + cdl_reflib + destlib + '/cdl'
+
+ destlibdir = destdir
+ srclibdir = srcdir
+
+ os.makedirs(destlibdir, exist_ok=True)
+
+ # Find CDL file names in the source
+ # If CDL is marked compile-only then ONLY convert <destlib>.cdl
+ if cdl_compile_only:
+ alllibname = srclibdir + '/' + destlib + '.cdl'
+ if not os.path.exists(alllibname):
+ cdl_compile_only = False
+ else:
+ cdlfiles = [destlib + '.cdl']
+
+ if not cdl_compile_only:
+ cdlfiles = os.listdir(srclibdir)
+ cdlfiles = list(item for item in cdlfiles if os.path.splitext(item)[1].lower() == '.cdl')
+
+ # The directory with scripts should be in ../common with respect
+ # to the Makefile that determines the cwd.
+
+ # Run cdl2spi.py script to read in the CDL file and write out SPICE
+ for cdlfile in cdlfiles:
+ if ef_format:
+ spiname = os.path.splitext(cdlfile)[0] + '.spi'
+ else:
+ spiname = os.path.splitext(cdlfile)[0] + '.spice'
+ procopts = [scriptdir + '/cdl2spi.py', srclibdir + '/' + cdlfile, destlibdir + '/' + spiname]
+ if do_cdl_scaleu:
+ procopts.append('-dscale=u')
+ for item in ignorelist:
+ procopts.append('-ignore=' + item)
+
+ print('Running (in ' + destlibdir + '): ' + ' '.join(procopts))
+ pproc = subprocess.run(procopts,
+ stdin = subprocess.DEVNULL, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, cwd = destlibdir,
+ universal_newlines = True)
+ if pproc.stdout:
+ for line in pproc.stdout.splitlines():
+ print(line)
+ if pproc.stderr:
+ print('Error message output from cdl2spi.py:')
+ for line in pproc.stderr.splitlines():
+ print(line)
+
+ elif have_gds and not no_gds_convert:
+ # If neither SPICE nor CDL formats is available in the source, then
+ # read GDS; if the result has no ports, then read the corresponding
+ # LEF library to get port information. Then write out a SPICE netlist
+ # for the whole library. NOTE: If there is no CDL or SPICE source,
+ # then the port numbering is arbitrary, and becomes whatever the
+ # output of this script makes it.
+
+ if ef_format:
+ destdir = targetdir + cdl_reflib + 'spi'
+ srcdir = targetdir + gds_reflib + 'gds'
+ lefdir = targetdir + lef_reflib + 'lef'
+ os.makedirs(destdir, exist_ok=True)
+
+ # For each library, create the library subdirectory
+ for library in libraries:
+ if len(library) == 3:
+ destlib = library[2]
+ else:
+ destlib = library[1]
+
+ if ef_format:
+ destlibdir = destdir + '/' + destlib
+ srclibdir = srcdir + '/' + destlib
+ leflibdir = lefdir + '/' + destlib
+ else:
+ destdir = targetdir + cdl_reflib + destlib + '/spice'
+ srcdir = targetdir + gds_reflib + destlib + '/gds'
+ lefdir = targetdir + lef_reflib + destlib + '/lef'
+
+ destlibdir = destdir
+ srclibdir = srcdir
+ leflibdir = lefdir
+
+ os.makedirs(destlibdir, exist_ok=True)
+
+ # Link to the PDK magic startup file from the target directory
+ startup_script = targetdir + mag_current + pdkname + '-F.magicrc'
+ if not os.path.isfile(startup_script):
+ startup_script = targetdir + mag_current + pdkname + '.magicrc'
+ if os.path.isfile(startup_script):
+ # If the symbolic link exists, remove it.
+ if os.path.isfile(destlibdir + '/.magicrc'):
+ os.remove(destlibdir + '/.magicrc')
+ os.symlink(startup_script, destlibdir + '/.magicrc')
+
+ # Get the consolidated GDS library file, or a list of all GDS files
+ # if there is no single consolidated library
+
+ allgdslibname = srclibdir + '/' + destlib + '.gds'
+ if not os.path.isfile(allgdslibname):
+ glist = glob.glob(srclibdir + '/*.gds')
+ glist.extend(glob.glob(srclibdir + '/*.gdsii'))
+ glist.extend(glob.glob(srclibdir + '/*.gds2'))
+
+ allleflibname = leflibdir + '/' + destlib + '.lef'
+ if not os.path.isfile(allleflibname):
+ llist = glob.glob(leflibdir + '/*.lef')
+
+ print('Creating magic generation script to generate SPICE library.')
+ with open(destlibdir + '/generate_magic.tcl', 'w') as ofile:
+ print('#!/usr/bin/env wish', file=ofile)
+ print('#---------------------------------------------', file=ofile)
+ print('# Script to generate SPICE library from GDS ', file=ofile)
+ print('#---------------------------------------------', file=ofile)
+ print('drc off', file=ofile)
+ print('gds readonly true', file=ofile)
+ print('gds flatten true', file=ofile)
+ print('gds rescale false', file=ofile)
+ print('tech unlock *', file=ofile)
+
+ # Add custom Tcl script lines before "gds read".
+ if tclscript:
+ for line in tcllines:
+ print(line, file=ofile)
+
+ if not os.path.isfile(allgdslibname):
+ for gdsfile in glist:
+ print('gds read ' + gdsfile, file=ofile)
+ else:
+ print('gds read ' + allgdslibname, file=ofile)
+
+ if not os.path.isfile(allleflibname):
+ # Annotate the cells with information from the LEF files
+ for leffile in llist:
+ print('lef read ' + leffile, file=ofile)
+ else:
+ print('lef read ' + allleflibname, file=ofile)
+
+ # Load first file and remove the (UNNAMED) cell
+ if not os.path.isfile(allgdslibname):
+ print('load ' + os.path.splitext(glist[0])[0], file=ofile)
+ else:
+ gdslibroot = os.path.split(allgdslibname)[1]
+ print('load ' + os.path.splitext(gdslibroot)[0], file=ofile)
+ # print('cellname delete \(UNNAMED\)', file=ofile)
+
+ print('ext2spice lvs', file=ofile)
+
+ # NOTE: Leaving "subcircuit top" as "auto" (default) can cause
+ # cells like decap that have no I/O to be output without a subcircuit
+ # wrapper. Also note that if this happens, it is an indication that
+ # power supplies have not been labeled as ports, which is harder to
+ # handle and should be fixed in the source.
+ print('ext2spice subcircuit top on', file=ofile)
+
+ print('ext2spice cthresh 0.1', file=ofile)
+
+ if os.path.isfile(allgdslibname):
+ print('select top cell', file=ofile)
+ print('set glist [cellname list children]', file=ofile)
+ print('foreach cell $glist {', file=ofile)
+ else:
+ print('foreach cell [cellname list top] {', file=ofile)
+
+ print(' load $cell', file=ofile)
+ print(' puts stdout "Extracting cell $cell"', file=ofile)
+ print(' extract all', file=ofile)
+ print(' ext2spice', file=ofile)
+ print('}', file=ofile)
+ print('puts stdout "Done."', file=ofile)
+ print('quit -noprompt', file=ofile)
+
+ # Run magic to read in the individual GDS files and
+ # write out the consolidated GDS library
+
+ print('Running magic to create GDS library.')
+ sys.stdout.flush()
+
+ mproc = subprocess.run(['magic', '-dnull', '-noconsole',
+ destlibdir + '/generate_magic.tcl'],
+ stdin = subprocess.DEVNULL,
+ stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, cwd = destlibdir,
+ universal_newlines = True)
+ if mproc.stdout:
+ for line in mproc.stdout.splitlines():
+ print(line)
+ if mproc.stderr:
+ print('Error message output from magic:')
+ for line in mproc.stderr.splitlines():
+ print(line)
+ if mproc.returncode != 0:
+ print('ERROR: Magic exited with status ' + str(mproc.returncode))
+
+ # Remove intermediate extraction files
+ extfiles = glob.glob(destlibdir + '/*.ext')
+ for extfile in extfiles:
+ os.remove(extfile)
+
+ # If the GDS file was a consolidated file of all cells, then
+ # create a similar SPICE library of all cells.
+
+ if os.path.isfile(allgdslibname):
+ spiext = '.spice' if not ef_format else '.spi'
+ create_spice_library(destlibdir, destlib, spiext, do_compile_only, do_stub, excludelist)
+
+ sys.exit(0)
diff --git a/common/gate_list.txt b/common/gate_list.txt
new file mode 100644
index 0000000..b1097ec
--- /dev/null
+++ b/common/gate_list.txt
@@ -0,0 +1,124 @@
+#-----------------------------------------------------------------------
+# List of standard gate symbols with mapping to liberty file fields
+#
+# Y, X refer to logic output pins
+# C, S, ... refer specifically to adder carry and sum output pins
+# Q, QB refer specifically to flip-flop or latch output pins
+# IQ, IQB refer to function entries for flops. "I" is literal.
+# A, B, C, ... refer to input pins
+# D, R, S, ... refer specifically to flip-flop input pins
+# SD refers to scan data input pin for flip-flops
+# CI refers specifically to full adder carry-in input
+# Z refers to high-impedance state
+# & means AND
+# | means OR
+# ! means NOT
+# ( ) groups
+#
+# Symbol Liberty Liberty
+# primitive file field
+# name field value ...
+#-----------------------------------------------------------------------
+AND2 function Y=A&B
+AND3 function Y=A&B&C
+AND4 function Y=A&B&C&D
+AND5 function Y=A&B&C&D&E
+AND8 function Y=A&B&C&D&E&F&G&H
+
+AND2I function Y=!A&B
+
+AO21 function Y=(A&B)|C
+AO22 function Y=(A&B)|(C&D)
+
+AOI21 function Y=!((A&B)|C)
+AOI22 function Y=!((A&B)|(C&D))
+
+NAND2 function Y=!(A&B)
+NAND3 function Y=!(A&B&C)
+NAND4 function Y=!(A&B&C&D)
+NAND5 function Y=!(A&B&C&D&E)
+NAND8 function Y=!(A&B&C&D&E&F&G&H)
+
+NAND2I function Y=!(!A&B)
+
+OR2 function Y=A|B
+OR3 function Y=A|B|C
+OR4 function Y=A|B|C|D
+OR5 function Y=A|B|C|D|E
+OR8 function Y=A|B|C|D|E|F|G|H
+
+OR2I function Y=!A|B
+
+OA21 function Y=(A|B)&C
+OA22 function Y=(A|B)&(C|D)
+
+OAI21 function Y=!((A|B)&C)
+OAI22 function Y=!((A|B)&(C|D))
+
+NOR2 function Y=!(A|B)
+NOR3 function Y=!(A|B|C)
+NOR4 function Y=!(A|B|C|D)
+NOR5 function Y=!(A|B|C|D|E)
+NOR8 function Y=!(A|B|C|D|E|F|G|H)
+
+NOR2I function Y=!(!A|B)
+
+XOR2 function Y=(A&!B)|(!A&B)
+
+XNOR2 function Y=(A&B)|(!A&!B)
+
+INV function Y=!A
+
+BUF function Y=A
+
+TBUF function Y=A three_state E
+
+TBUFI function Y=A three_state !E
+
+MUX2 function Y=(A&C)|(B&!C)
+
+MUX2I function Y=!((A&!C)|(B&C))
+
+MUX4 function Y=(A&!E&!F)|(B&!E&F)|(C&E&!F)|(D&E&F)
+
+MUX4I function Y=!((A&!E&!F)|(B&!E&F)|(C&E&!F)|(D&E&F))
+
+HA function C=A&B function S=(A&!B)|(!A&B)
+FA function C=(A&B)|(A&CI)|(B&CI) function S=(A&!B&!CI)|(!A&B&!CI)|(!A&!B&CI)|(A&B&CI)
+
+LATCH function Q=IQ enable E data_in D
+LATCHI function Q=IQ enable !E data_in D
+LATCHR function Q=IQ enable E data_in D clear !R
+LATCHIR function Q=IQ enable !E data_in D clear !R
+LATCHSR function Q=IQ enable E data_in D preset !S clear !R
+LATCHISR function Q=IQ enable !E data_in D preset !S clear !R
+
+LATCHQ function Q=IQ QB=IQB enable E data_in D
+LATCHIQ function Q=IQ QB=IQB enable !E data_in D
+LATCHRQ function Q=IQ QB=IQB enable E data_in D clear !R
+LATCHIRQ function Q=IQ QB=IQB enable !E data_in D clear !R
+LATCHSRQ function Q=IQ QB=IQB enable E data_in D preset !S clear !R
+LATCHISRQ function Q=IQ QB=IQB enable !E data_in D preset !S clear !R
+
+DFF function Q=IQ clocked_on C next_state D
+DFFQ function Q=IQ function QB=IQB clocked_on C next_state D
+DFFS function Q=IQ clocked_on C next_state D preset !S
+DFFR function Q=IQ clocked_on C next_state D clear !R
+DFFSR function Q=IQ clocked_on C next_state D preset !S clear !R
+DFFSQ function Q=IQ function QB=IQB clocked_on C next_state D preset !S
+DFFRQ function Q=IQ function QB=IQB clocked_on C next_state D clear !R
+DFFSRQ function Q=IQ function QB=IQB clocked_on C next_state D clear !R preset !S
+DFFI function Q=IQ clocked_on !C next_state D
+DFFIQ function Q=IQ function QB=IQB clocked_on !C next_state D
+DFFIS function Q=IQ clocked_on !C next_state D preset !S
+DFFIR function Q=IQ clocked_on !C next_state D clear !R
+DFFISR function Q=IQ clocked_on !C next_state D preset !S clear !R
+DFFISQ function Q=IQ function QB=IQB clocked_on !C next_state D preset !S
+DFFIRQ function Q=IQ function QB=IQB clocked_on !C next_state D clear !R
+DFFISRQ function Q=IQ function QB=IQB clocked_on !C next_state D clear !R preset !S
+
+EDFF function Q=IQ clocked_on C next_state (D&E)|(IQ&!E)
+EDFFQ function Q=IQ QB=IQB clocked_on C next_state (D&E)|(IQ&!E)
+
+SDFF function Q=IQ clocked_on C next_state (D&!E)|(SD&E)
+SDFFQ function Q=IQ QB=IQB clocked_on C next_state (D&!E)|(SD&E)
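The field layout above can be read mechanically: each non-comment line is a symbol name followed by alternating liberty field names and values, and no value contains whitespace. A minimal sketch, assuming a hypothetical helper name (`parse_gate_line` is not part of open_pdks):

```python
# Hypothetical parser for gate_list.txt lines (illustration only):
# a line is a symbol name followed by alternating field/value pairs.
def parse_gate_line(line):
    tokens = line.split()
    name = tokens[0]
    # Field values contain no whitespace, so pairs alternate cleanly.
    fields = [(tokens[i], tokens[i + 1]) for i in range(1, len(tokens) - 1, 2)]
    return name, fields
```

A downstream script could use the resulting pairs directly as liberty cell attributes.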
diff --git a/common/insert_property.py b/common/insert_property.py
new file mode 100755
index 0000000..53fdba9
--- /dev/null
+++ b/common/insert_property.py
@@ -0,0 +1,140 @@
+#!/usr/bin/env python3
+#
+# insert_property.py: For the given install path, library name, and cellname,
+# find the Magic layout of the cell, and add the specified property string.
+# If the property exists and is the same as specified, then it remains the
+# same. If the property exists but has a different value, it is replaced.
+# The property is added to the layout in both the mag/ (full) and maglef/
+# (abstract) directories. Option "-maglef" or "-mag" will restrict the
+# use to only the view indicated by the option.
+#
+# e.g.:
+#
+# insert_property.py /path/to/sky130A \
+# sky130_fd_io sky130_fd_io__top_gpiov2 "MASKHINTS_HVI 0 607 15000 40200"
+
+import os
+import re
+import sys
+
+def addprop(filename, propstring, noupdate):
+ with open(filename, 'r') as ifile:
+ magtext = ifile.read().splitlines()
+
+ propname = propstring.split()[0]
+ proprex = re.compile('<< properties >>')
+ endrex = re.compile('<< end >>')
+
+ in_props = False
+ printed = False
+ done = False
+
+ with open(filename, 'w') as ofile:
+ for line in magtext:
+ pmatch = proprex.match(line)
+ if pmatch:
+ in_props = True
+ elif in_props:
+                linetok = line.split()
+                # Guard against blank lines inside the properties section.
+                if linetok and linetok[0] == 'string':
+                    testname = linetok[1]
+ if testname == propname:
+ if noupdate == False:
+ print('string ' + propstring, file=ofile)
+ printed = True
+ done = True
+
+ ematch = endrex.match(line)
+ if ematch:
+ if in_props == False:
+ print('<< properties >>', file=ofile)
+ if done == False:
+ print('string ' + propstring, file=ofile)
+
+ if not printed:
+ print(line, file=ofile)
+ printed = False
+
+def usage():
+ print("insert_property.py <path_to_pdk> <libname> <cellname> <prop_string> [option]")
+ print(" options:")
+ print(" -mag do only for the view in the mag/ directory")
+ print(" -maglef do only for the view in the maglef/ directory")
+ print(" -noupdate do not replace the property if it already exists in the file")
+ return 0
+
+if __name__ == '__main__':
+
+ options = []
+ arguments = []
+ for item in sys.argv[1:]:
+ if item.find('-', 0) == 0:
+ options.append(item)
+ else:
+ arguments.append(item)
+
+    if len(arguments) < 4:
+        print("Not enough arguments given to insert_property.py.")
+        usage()
+        sys.exit(1)
+
+ source = arguments[0]
+ libname = arguments[1]
+ cellname = arguments[2]
+ propstring = arguments[3]
+
+ # Diagnostic
+ print('insert_property.py:')
+ print(' source = ' + source)
+ print(' library = ' + libname)
+ print(' cell = ' + cellname)
+ print(' property = ' + propstring)
+
+ noupdate = True if '-noupdate' in options else False
+ fail = 0
+
+ efformat = True if '-ef_format' in options else False
+
+ domag = True
+ domaglef = True
+ if '-mag' in options and '-maglef' not in options:
+ domaglef = False
+ if '-maglef' in options and '-mag' not in options:
+ domag = False
+
+ if domag:
+ if efformat:
+ filename = source + '/libs.ref/mag/' + libname + '/' + cellname + '.mag'
+ else:
+ filename = source + '/libs.ref/' + libname + '/mag/' + cellname + '.mag'
+
+ if os.path.isfile(filename):
+ addprop(filename, propstring, noupdate)
+ else:
+ fail += 1
+ else:
+ fail += 1
+
+ if domaglef:
+ if efformat:
+ filename = source + '/libs.ref/maglef/' + libname + '/' + cellname + '.mag'
+ else:
+ filename = source + '/libs.ref/' + libname + '/maglef/' + cellname + '.mag'
+
+ if os.path.isfile(filename):
+ addprop(filename, propstring, noupdate)
+ else:
+ fail += 1
+ else:
+ fail += 1
+
+ if fail == 2:
+ print('Error: No layout file in either mag/ or maglef/', file=sys.stderr)
+ if efformat:
+ print('(' + source + '/libs.ref/mag[lef]/' + libname +
+ '/' + cellname + '.mag)', file=sys.stderr)
+ else:
+ print('(' + source + '/libs.ref/' + libname + '/mag[lef]/'
+ + cellname + '.mag)', file=sys.stderr)
+
diff --git a/common/orig/foundry_install.py b/common/orig/foundry_install.py
new file mode 100755
index 0000000..96caf14
--- /dev/null
+++ b/common/orig/foundry_install.py
@@ -0,0 +1,1175 @@
+#!/usr/bin/env python3
+#
+# foundry_install.py
+#
+# This file generates the local directory structure and populates the
+# directories with foundry vendor data.
+#
+# Options:
+# -link_from <type> Make symbolic links to vendor files from target
+# Types are: "none", "source", or a PDK name.
+# Default "none" (copy all files from source)
+# -ef_names Use efabless naming (libs.ref/techLEF),
+# otherwise use generic naming (libs.tech/lef)
+#
+# -source <path> Path to source data top level directory
+# -target <path> Path to target top level directory
+#
+#
+# All other options represent paths to vendor files. They may all be
+# wildcarded with "*" to represent, e.g., version number directories,
+# or names of supported libraries. Where wildcards exist, if there is
+# more than one directory in the path, the value represented by "*"
+# will first be checked against library names. If no library name is
+# found, then the wildcard value will be assumed to be numeric and
+# separated by either "." or "_" to represent major/minor/sub/...
+# revision numbers (alphanumeric).
+#
+# Note only one of "-spice" or "-cdl" need be specified. Since the
+# open source tools use ngspice, CDL files are converted to ngspice
+# syntax when needed.
+#
+# -techlef <path> Path to technology LEF file
+# -doc <path> Path to technology documentation
+# -lef <path> Path to LEF file
+# -lefanno <path> Path to LEF file (for annotation only)
+# -spice <path> Path to SPICE netlists
+# -cdl <path> Path to CDL netlists
+# -models <path> Path to SPICE (primitive device) models
+# -liberty <path> Path to Liberty timing files
+# -gds <path> Path to GDS data
+# -verilog <path> Path to verilog models
+#
+# -library <type> <name> [<target>] See below
+#
+# For the "-library" option, any number of libraries may be supported, and
+# one "-library" option should be provided for each supported library.
+# <type> is one of: "digital", "primitive", or "general". Analog and I/O
+# libraries fall under the category "general", as they are all treated the
+# same way. <name> is the vendor name of the library. [<target>] is the
+# (optional) local name of the library. If omitted, then the vendor name
+# is used for the target (there is no particular reason to specify a
+# different local name for a library).
+#
+# All options "-lef", "-spice", etc., can take the additional arguments
+# up <number>
+#
+# to indicate that the source hierarchy should be copied from <number>
+# levels above the files. For example, if liberty files are kept in
+# multiple directories according to voltage level, then
+#
+# -liberty x/y/z/PVT_*/*.lib
+#
+# would install all .lib files directly into libs.ref/liberty/<libname>/*.lib
+# while
+#
+# -liberty x/y/z/PVT_*/*.lib up 1
+#
+# would install all .lib files into libs.ref/liberty/<libname>/PVT_*/*.lib
+#
+# Other library-specific arguments are:
+#
+# nospec : Remove timing specification before installing
+# (used with verilog files; needs to be extended to
+# liberty files)
+# compile : Create a single library from all components. Used
+# when a foundry library has inconveniently split
+# an IP library (LEF, CDL, verilog, etc.) into
+# individual files.
+#
+# NOTE: This script can be called once for all libraries if all file
+# types (gds, cdl, lef, etc.) happen to all work with the same wildcards.
+# However, it is more likely that it will be called several times for the
+# same PDK, once to install I/O cells, once to install digital, and so
+# forth, as made possible by the wild-carding.
+
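The "up <number>" behavior described above can be sketched as follows; the helper name `dest_path` is illustrative, not part of this script. The last <number> directory components of the source file's path are preserved under the target library directory:

```python
import os

# Sketch (hypothetical helper): keep the last hier_up directory
# components of the source file's path when building the target path.
def dest_path(libname, hier_up):
    libfilepath = os.path.split(libname)[0]
    comps = []
    for _ in range(hier_up):
        comps.append('/' + os.path.split(libfilepath)[1])
        libfilepath = os.path.split(libfilepath)[0]
    comps.reverse()
    return ''.join(comps)
```

With `up 1`, a file `x/y/z/PVT_FF/cell.lib` keeps the `PVT_FF` level under the library directory; with `up 0` it is installed flat.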
+import re
+import os
+import sys
+import glob
+import shutil
+import subprocess
+
+def usage():
+ print("foundry_install.py [options...]")
+ print(" -link_from <name> Make symbolic links from target to <name>")
+ print(" where <name> can be 'source' or a PDK name.")
+ print(" Default behavior is to copy all files.")
+ print(" -copy Copy files from source to target (default)")
+ print(" -ef_names Use efabless naming conventions for local directories")
+ print("")
+ print(" -source <path> Path to top of source directory tree")
+ print(" -target <path> Path to top of target directory tree")
+ print("")
+ print(" -techlef <path> Path to technology LEF file")
+ print(" -doc <path> Path to technology documentation")
+ print(" -lef <path> Path to LEF file")
+ print(" -lefanno <path> Path to LEF file (for annotation only)")
+ print(" -spice <path> Path to SPICE netlists")
+ print(" -cdl <path> Path to CDL netlists")
+ print(" -models <path> Path to SPICE (primitive device) models")
+ print(" -lib <path> Path to Liberty timing files")
+ print(" -liberty <path> Path to Liberty timing files")
+ print(" -gds <path> Path to GDS data")
+ print(" -verilog <path> Path to verilog models")
+ print(" -library <type> <name> [<target>] See below")
+ print("")
+ print(" All <path> names may be wild-carded with '*' ('glob'-style wild-cards)")
+ print("")
+ print(" All options with <path> other than source and target may take the additional")
+ print(" arguments 'up <number>', where <number> indicates the number of levels of")
+ print(" hierarchy of the source path to include when copying to the target.")
+ print("")
+ print(" Library <type> may be one of:")
+ print(" digital Digital standard cell library")
+ print(" primitive Primitive device library")
+ print(" general All other library types (I/O, analog, etc.)")
+ print("")
+ print(" If <target> is unspecified then <name> is used for the target.")
+
+def get_gds_properties(magfile):
+ proprex = re.compile('^[ \t]*string[ \t]+(GDS_[^ \t]+)[ \t]+([^ \t]+)$')
+ proplines = []
+ if os.path.isfile(magfile):
+ with open(magfile, 'r') as ifile:
+ magtext = ifile.read().splitlines()
+ for line in magtext:
+ lmatch = proprex.match(line)
+ if lmatch:
+ propline = lmatch.group(1) + ' ' + lmatch.group(2)
+ proplines.append(propline)
+ return proplines
+
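The property scan above can be exercised on an in-memory fragment; this is a sketch with invented sample lines, using the same regular expression:

```python
import re

# Collect "string GDS_* value" lines from .mag text as "NAME value"
# strings, mirroring get_gds_properties (sample lines are invented).
proprex = re.compile(r'^[ \t]*string[ \t]+(GDS_[^ \t]+)[ \t]+([^ \t]+)$')
magtext = ['<< properties >>',
           'string GDS_FILE example.gds',
           'string GDS_START 1024']
proplines = [m.group(1) + ' ' + m.group(2)
             for m in (proprex.match(line) for line in magtext) if m]
```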
+# Read subcircuit ports from a CDL file, given a subcircuit name that should
+# appear in the file as a subcircuit entry, and return a dictionary of ports
+# and their indexes in the subcircuit line.
+
+def get_subckt_ports(cdlfile, subname):
+ portdict = {}
+ pidx = 1
+    portrex = re.compile(r'^\.subckt[ \t]+([^ \t]+)[ \t]+(.*)$', re.IGNORECASE)
+ with open(cdlfile, 'r') as ifile:
+ cdltext = ifile.read()
+ cdllines = cdltext.replace('\n+', ' ').splitlines()
+ for line in cdllines:
+ lmatch = portrex.match(line)
+ if lmatch:
+ if lmatch.group(1).lower() == subname.lower():
+ ports = lmatch.group(2).split()
+ for port in ports:
+ portdict[port.lower()] = pidx
+ pidx += 1
+ break
+ return portdict
+
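The same approach can be shown on an in-memory CDL fragment; this is a sketch (the function name `ports_from_text` is illustrative). A newline followed by `+` marks a SPICE continuation line and is joined before matching:

```python
import re

# Sketch: join SPICE '+' continuation lines, then index .subckt ports.
def ports_from_text(cdltext, subname):
    portdict = {}
    pidx = 1
    portrex = re.compile(r'^\.subckt[ \t]+([^ \t]+)[ \t]+(.*)$', re.IGNORECASE)
    # '\n+' is a continuation marker; replace it to join lines.
    for line in cdltext.replace('\n+', ' ').splitlines():
        lmatch = portrex.match(line)
        if lmatch and lmatch.group(1).lower() == subname.lower():
            for port in lmatch.group(2).split():
                portdict[port.lower()] = pidx
                pidx += 1
            break
    return portdict
```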
+# Filter a verilog file to remove any backslash continuation lines, which
+# iverilog does not parse. If targetroot is a directory, then find and
+# process all files in the path of targetroot. If any file to be processed
+# is unmodified (has no backslash continuation lines), then ignore it. If
+# any file is a symbolic link and gets modified, then remove the symbolic
+# link before overwriting with the modified file.
+#
+# If 'do_remove_spec' is True, then remove timing information from the file,
+# which is everything between the keywords "specify" and "endspecify".
+
+def vfilefilter(vfile, do_remove_spec):
+ modified = False
+ with open(vfile, 'r') as ifile:
+ vtext = ifile.read()
+
+    # Remove backslash-followed-by-newline and absorb the continuation
+    # line's leading whitespace. What that whitespace means here is
+    # unclear; the use case seen so far works on the assumption that it
+    # is insignificant up to the amount of the last indentation.
+
+ vlines = re.sub('\\\\\n[ \t]*', '', vtext)
+
+ if do_remove_spec:
+ specrex = re.compile('\n[ \t]*specify[ \t\n]+')
+ endspecrex = re.compile('\n[ \t]*endspecify')
+ smatch = specrex.search(vlines)
+ while smatch:
+ specstart = smatch.start()
+ specpos = smatch.end()
+            ematch = endspecrex.search(vlines[specpos:])
+            if not ematch:
+                # Unterminated specify block; leave the remainder as-is.
+                break
+            specend = ematch.end()
+ vtemp = vlines[0:specstart + 1] + vlines[specpos + specend + 1:]
+ vlines = vtemp
+ smatch = specrex.search(vlines)
+
+ if vlines != vtext:
+ # File contents have been modified, so if this file was a symbolic
+ # link, then remove it. Otherwise, overwrite the file with the
+ # modified contents.
+ if os.path.islink(vfile):
+ os.unlink(vfile)
+ with open(vfile, 'w') as ofile:
+ ofile.write(vlines)
+
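The specify-block removal can be isolated into a standalone sketch; the function name `strip_specify` is illustrative, and the guard against an unterminated block is an addition not present in the loop above:

```python
import re

# Sketch: cut everything between "specify" and "endspecify" keywords,
# using the same search-and-splice strategy as vfilefilter.
def strip_specify(vtext):
    specrex = re.compile('\n[ \t]*specify[ \t\n]+')
    endspecrex = re.compile('\n[ \t]*endspecify')
    smatch = specrex.search(vtext)
    while smatch:
        specpos = smatch.end()
        ematch = endspecrex.search(vtext[specpos:])
        if ematch is None:
            break  # unterminated block: stop rather than loop forever
        vtext = vtext[0:smatch.start() + 1] + vtext[specpos + ematch.end() + 1:]
        smatch = specrex.search(vtext)
    return vtext
```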
+# Run a filter on verilog files that cleans up known syntax issues.
+# This is embedded in the foundry_install script and is not a custom
+# filter largely because the issues are in the tool, not the PDK.
+
+def vfilter(targetroot, do_remove_spec):
+ if os.path.isfile(targetroot):
+ vfilefilter(targetroot, do_remove_spec)
+ else:
+ vlist = glob.glob(targetroot + '/*')
+ for vfile in vlist:
+ if os.path.isfile(vfile):
+ vfilefilter(vfile, do_remove_spec)
+
+# For issues that are PDK-specific, a script can be written and put in
+# the PDK's custom/scripts/ directory, and passed to the foundry_install
+# script using the "filter" option.
+
+def tfilter(targetroot, filterscript):
+ if os.path.isfile(targetroot):
+ print(' Filtering file ' + targetroot)
+ subprocess.run([filterscript, targetroot, targetroot],
+ stdin = subprocess.DEVNULL, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, universal_newlines = True)
+ else:
+ tlist = glob.glob(targetroot + '/*')
+ for tfile in tlist:
+ if os.path.isfile(tfile):
+ print(' Filtering file ' + tfile)
+ subprocess.run([filterscript, tfile, tfile],
+ stdin = subprocess.DEVNULL, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, universal_newlines = True)
+
+# This is the main entry point for the foundry install script.
+
+if __name__ == '__main__':
+
+ if len(sys.argv) == 1:
+ print("No options given to foundry_install.py.")
+ usage()
+        sys.exit(1)
+
+ optionlist = []
+ newopt = []
+
+ sourcedir = None
+ targetdir = None
+ link_from = None
+
+ ef_names = False
+
+ have_lef = False
+ have_lefanno = False
+ have_gds = False
+ have_spice = False
+ have_cdl = False
+ ignorelist = []
+
+ do_install = True
+
+ # Break arguments into groups where the first word begins with "-".
+ # All following words not beginning with "-" are appended to the
+ # same list (optionlist). Then each optionlist is processed.
+ # Note that the first entry in optionlist has the '-' removed.
+
+ for option in sys.argv[1:]:
+ if option.find('-', 0) == 0:
+ if newopt != []:
+ optionlist.append(newopt)
+ newopt = []
+ newopt.append(option[1:])
+ else:
+ newopt.append(option)
+
+ if newopt != []:
+ optionlist.append(newopt)
+
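The grouping loop above can be sketched as a standalone function (the name `group_options` is illustrative): `-foo a b -bar c` becomes `[['foo', 'a', 'b'], ['bar', 'c']]`, with the leading `-` stripped from each keyword:

```python
# Sketch of the argument grouping: each '-' word starts a new group,
# and following non-'-' words join that group.
def group_options(argv):
    optionlist, newopt = [], []
    for option in argv:
        if option.startswith('-'):
            if newopt:
                optionlist.append(newopt)
            newopt = [option[1:]]
        else:
            newopt.append(option)
    if newopt:
        optionlist.append(newopt)
    return optionlist
```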
+ # Pull library names from optionlist
+ libraries = []
+ for option in optionlist[:]:
+ if option[0] == 'library':
+ optionlist.remove(option)
+ libraries.append(option[1:])
+
+ # Check for option "ef_names" or "std_names"
+ for option in optionlist[:]:
+ if option[0] == 'ef_naming' or option[0] == 'ef_names':
+ optionlist.remove(option)
+ ef_names = True
+ elif option[0] == 'std_naming' or option[0] == 'std_names':
+ optionlist.remove(option)
+ ef_names = False
+ elif option[0] == 'uninstall':
+ optionlist.remove(option)
+ do_install = False
+
+ # Check for options "link_from", "source", and "target"
+ link_name = None
+ for option in optionlist[:]:
+ if option[0] == 'link_from':
+ optionlist.remove(option)
+ if option[1].lower() == 'none':
+ link_from = None
+ elif option[1].lower() == 'source':
+ link_from = 'source'
+ else:
+ link_from = option[1]
+ link_name = os.path.split(link_from)[1]
+ elif option[0] == 'source':
+ optionlist.remove(option)
+ sourcedir = option[1]
+ elif option[0] == 'target':
+ optionlist.remove(option)
+ targetdir = option[1]
+
+ # Error if no source or dest specified
+ if not sourcedir:
+ print("No source directory specified. Exiting.")
+ sys.exit(1)
+
+ if not targetdir:
+ print("No target directory specified. Exiting.")
+ sys.exit(1)
+
+    # If the link source is a PDK name and has no path, then pull the
+    # path from the target name.
+
+ if link_from:
+ if link_from != 'source':
+ if link_from.find('/', 0) < 0:
+ target_root = os.path.split(targetdir)[0]
+ link_from = target_root + '/' + link_from
+ link_name = link_from
+ else:
+ # If linking from source, convert the source path to an
+ # absolute pathname.
+ sourcedir = os.path.abspath(sourcedir)
+
+ # Take the target PDK name from the target path last component
+ pdkname = os.path.split(targetdir)[1]
+
+ # checkdir is the DIST target directory for the PDK pointed
+ # to by link_name. Files must be found there before creating
+ # symbolic links to the (not yet existing) final install location.
+
+ if link_name:
+ checkdir = os.path.split(targetdir)[0] + '/' + link_name
+ else:
+ checkdir = ''
+
+ # Diagnostic
+ if do_install:
+ print("Installing in target directory " + targetdir)
+
+ # Create the top-level directories
+
+ os.makedirs(targetdir, exist_ok=True)
+ os.makedirs(targetdir + '/libs.ref', exist_ok=True)
+ os.makedirs(targetdir + '/libs.tech', exist_ok=True)
+
+ # Path to magic techfile depends on ef_names
+
+ if ef_names == True:
+ mag_current = '/libs.tech/magic/current/'
+ else:
+ mag_current = '/libs.tech/magic/'
+
+ # Populate the techLEF and SPICE models, if specified.
+
+ for option in optionlist[:]:
+ if option[0] == 'techlef':
+ filter_script = None
+ for item in option:
+ if item.split('=')[0] == 'filter':
+ filter_script = item.split('=')[1]
+ break
+
+ if ef_names == True:
+ techlefdir = targetdir + '/libs.ref/techLEF'
+ checklefdir = checkdir + '/libs.ref/techLEF'
+ if link_from:
+ linklefdir = link_from + '/libs.ref/techLEF'
+ else:
+ linklefdir = ''
+ else:
+ techlefdir = targetdir + '/libs.tech/lef'
+ checklefdir = checkdir + '/libs.tech/lef'
+ if link_from:
+ linklefdir = link_from + '/libs.tech/lef'
+ else:
+ linklefdir = ''
+ os.makedirs(techlefdir, exist_ok=True)
+ # All techlef files should be linked or copied, so use "glob"
+ # on the wildcards
+ techlist = glob.glob(sourcedir + '/' + option[1])
+
+ for lefname in techlist:
+ leffile = os.path.split(lefname)[1]
+ targname = techlefdir + '/' + leffile
+ checklefname = checklefdir + '/' + leffile
+ linklefname = linklefdir + '/' + leffile
+ # Remove any existing file(s)
+ if os.path.isfile(targname):
+ os.remove(targname)
+ elif os.path.islink(targname):
+ os.unlink(targname)
+ elif os.path.isdir(targname):
+ shutil.rmtree(targname)
+
+ if do_install:
+ if not link_from:
+ if os.path.isfile(lefname):
+ shutil.copy(lefname, targname)
+ else:
+ shutil.copytree(lefname, targname)
+ elif link_from == 'source':
+ os.symlink(lefname, targname)
+ else:
+ if os.path.exists(checklefname):
+ os.symlink(linklefname, targname)
+ elif os.path.isfile(lefname):
+ shutil.copy(lefname, targname)
+ else:
+ shutil.copytree(lefname, targname)
+
+ if filter_script:
+ # Apply filter script to all files in the target directory
+ tfilter(targname, filter_script)
+ optionlist.remove(option)
+
+ elif option[0] == 'models':
+ filter_script = None
+ for item in option:
+ if item.split('=')[0] == 'filter':
+ filter_script = item.split('=')[1]
+ break
+
+ print('Diagnostic: installing models.')
+ modelsdir = targetdir + '/libs.tech/models'
+ checkmoddir = checkdir + '/libs.tech/models'
+ if link_from:
+ linkmoddir = link_from + '/libs.tech/models'
+ else:
+ linkmoddir = ''
+
+ os.makedirs(modelsdir, exist_ok=True)
+
+ # All model files should be linked or copied, so use "glob"
+ # on the wildcards. Copy each file and recursively copy each
+ # directory.
+ modellist = glob.glob(sourcedir + '/' + option[1])
+
+ for modname in modellist:
+ modfile = os.path.split(modname)[1]
+ targname = modelsdir + '/' + modfile
+ checkmodname = checkmoddir + '/' + modfile
+ linkmodname = linkmoddir + '/' + modfile
+
+ if os.path.isdir(modname):
+ # Remove any existing directory, and its contents
+ if os.path.isdir(targname):
+ shutil.rmtree(targname)
+ os.makedirs(targname)
+
+ # Recursively find and copy or link the whole directory
+ # tree from this point.
+
+ allmodlist = glob.glob(modname + '/**', recursive=True)
+ commonpart = os.path.commonpath(allmodlist)
+ for submodname in allmodlist:
+ if os.path.isdir(submodname):
+ continue
+ # Get the path part that is not common between modlist and
+ # allmodlist.
+ subpart = os.path.relpath(submodname, commonpart)
+ subtargname = targname + '/' + subpart
+ os.makedirs(os.path.split(subtargname)[0], exist_ok=True)
+ if do_install:
+ if not link_from:
+ if os.path.isfile(submodname):
+ shutil.copy(submodname, subtargname)
+ else:
+ shutil.copytree(submodname, subtargname)
+ elif link_from == 'source':
+ os.symlink(submodname, subtargname)
+ else:
+ if os.path.exists(checkmodname):
+ os.symlink(linkmodname, subtargname)
+ elif os.path.isfile(submodname):
+ shutil.copy(submodname, subtargname)
+ else:
+ shutil.copytree(submodname, subtargname)
+
+ if filter_script:
+ # Apply filter script to all files in the target directory
+ tfilter(targname, filter_script)
+
+ else:
+ # Remove any existing file
+ if os.path.isfile(targname):
+ os.remove(targname)
+ elif os.path.islink(targname):
+ os.unlink(targname)
+ elif os.path.isdir(targname):
+ shutil.rmtree(targname)
+
+ if do_install:
+ if not link_from:
+ if os.path.isfile(modname):
+ shutil.copy(modname, targname)
+ else:
+ shutil.copytree(modname, targname)
+ elif link_from == 'source':
+ os.symlink(modname, targname)
+ else:
+                        if os.path.exists(checkmodname):
+ os.symlink(linkmodname, targname)
+ elif os.path.isfile(modname):
+ shutil.copy(modname, targname)
+ else:
+ shutil.copytree(modname, targname)
+
+ if filter_script:
+ # Apply filter script to all files in the target directory
+ tfilter(targname, filter_script)
+
+ optionlist.remove(option)
+
+ # The remaining options in optionlist should all be types like 'lef' or 'liberty'
+ for option in optionlist[:]:
+ # Diagnostic
+ if do_install:
+ print("Installing option: " + str(option[0]))
+ destdir = targetdir + '/libs.ref/' + option[0]
+ checklibdir = checkdir + '/libs.ref/' + option[0]
+ if link_from:
+ destlinkdir = link_from + '/libs.ref/' + option[0]
+ else:
+ destlinkdir = ''
+ os.makedirs(destdir, exist_ok=True)
+
+ # If the option is followed by the keyword "up" and a number, then
+ # the source should be copied (or linked) from <number> levels up
+ # in the hierarchy (see below).
+
+ if 'up' in option:
+ uparg = option.index('up')
+            try:
+                hier_up = int(option[uparg + 1])
+            except (IndexError, ValueError):
+                print("Missing or non-numeric argument to 'up'; ignoring 'up' option.")
+                hier_up = 0
+ else:
+ hier_up = 0
+
+ filter_script = None
+ for item in option:
+ if item.split('=')[0] == 'filter':
+ filter_script = item.split('=')[1]
+ break
+
+ # Option 'compile' is a standalone keyword ('comp' may be used).
+ do_compile = 'compile' in option or 'comp' in option
+
+ # Option 'nospecify' is a standalone keyword ('nospec' may be used).
+ do_remove_spec = 'nospecify' in option or 'nospec' in option
+
+ # Check off things we need to do migration to magic database and
+        # abstract files.
+ if option[0] == 'lef':
+ have_lef = True
+ elif option[0] == 'gds':
+ have_gds = True
+ elif option[0] == 'lefanno':
+ have_lefanno = True
+ elif option[0] == 'spice':
+ have_spice = True
+ elif option[0] == 'cdl':
+ have_cdl = True
+
+ # For each library, create the library subdirectory
+ for library in libraries:
+ if len(library) == 3:
+ destlib = library[2]
+ else:
+ destlib = library[1]
+ destlibdir = destdir + '/' + destlib
+ destlinklibdir = destlinkdir + '/' + destlib
+ checksrclibdir = checklibdir + '/' + destlib
+ os.makedirs(destlibdir, exist_ok=True)
+
+ # Populate the library subdirectory
+ # Parse the option and replace each '/*/' with the library name,
+ # and check if it is a valid directory name. Then glob the
+ # resulting option name. Warning: This assumes that all
+            # occurrences of the text '/*/' match a library name. It should
+ # be possible to wild-card the directory name in such a way that
+ # this is always true.
+
+            testopt = re.sub(r'/\*/', '/' + library[1] + '/', option[1])
+
+ liblist = glob.glob(sourcedir + '/' + testopt)
+
+ # Diagnostic
+ print('Collecting files from ' + str(sourcedir + '/' + testopt))
+ print('Files to install:')
+ if len(liblist) < 10:
+ for item in liblist:
+ print(' ' + item)
+ else:
+ for item in liblist[0:4]:
+ print(' ' + item)
+ print(' .')
+ print(' .')
+ print(' .')
+                for item in liblist[-5:]:
+ print(' ' + item)
+ print('(' + str(len(liblist)) + ' files total)')
+
+ for libname in liblist:
+ # Note that there may be a hierarchy to the files in option[1],
+ # say for liberty timing files under different conditions, so
+ # make sure directories have been created as needed.
+
+ libfile = os.path.split(libname)[1]
+ libfilepath = os.path.split(libname)[0]
+ destpathcomp = []
+ for i in range(hier_up):
+ destpathcomp.append('/' + os.path.split(libfilepath)[1])
+ libfilepath = os.path.split(libfilepath)[0]
+ destpathcomp.reverse()
+ destpath = ''.join(destpathcomp)
+
+ targname = destlibdir + destpath + '/' + libfile
+
+ # NOTE: When using "up" with link_from, could just make
+ # destpath itself a symbolic link; this way is more flexible
+ # but adds one symbolic link per file.
+
+ if destpath != '':
+ if not os.path.isdir(destlibdir + destpath):
+ os.makedirs(destlibdir + destpath, exist_ok=True)
+
+ # Both linklibname and checklibname need to contain any hierarchy
+ # implied by the "up" option.
+
+ linklibname = destlinklibdir + destpath + '/' + libfile
+ checklibname = checksrclibdir + destpath + '/' + libfile
+
+ # Remove any existing file
+ if os.path.isfile(targname):
+ os.remove(targname)
+ elif os.path.islink(targname):
+ os.unlink(targname)
+ elif os.path.isdir(targname):
+ shutil.rmtree(targname)
+
+ if do_install:
+ if not link_from:
+ if os.path.isfile(libname):
+ shutil.copy(libname, targname)
+ else:
+ shutil.copytree(libname, targname)
+ elif link_from == 'source':
+ os.symlink(libname, targname)
+ else:
+ if os.path.exists(checklibname):
+ os.symlink(linklibname, targname)
+ elif os.path.isfile(libname):
+ shutil.copy(libname, targname)
+ else:
+ shutil.copytree(libname, targname)
+
+ if option[0] == 'verilog':
+ # Special handling of verilog files to make them
+ # syntactically acceptable to iverilog.
+ # NOTE: Perhaps this should be recast as a custom filter?
+ vfilter(targname, do_remove_spec)
+
+ if filter_script:
+ # Apply filter script to all files in the target directory
+ tfilter(targname, filter_script)
+
+ if do_compile == True:
+ # To do: Extend this option to include formats other than verilog.
+ # Also to do: Make this compatible with linking from another PDK.
+
+ if option[0] == 'verilog':
+ # If there is not a single file with all verilog cells in it,
+ # then compile one, because one does not want to have to have
+ # an include line for every single cell used in a design.
+
+ alllibname = destlibdir + '/' + destlib + '.v'
+
+ print('Diagnostic: Creating consolidated verilog library ' + destlib + '.v')
+ vlist = glob.glob(destlibdir + '/*.v')
+ if alllibname in vlist:
+ vlist.remove(alllibname)
+
+ if len(vlist) > 1:
+ print('New file is: ' + alllibname)
+ with open(alllibname, 'w') as ofile:
+ for vfile in vlist:
+ with open(vfile, 'r') as ifile:
+ # print('Adding ' + vfile + ' to library.')
+ vtext = ifile.read()
+ # NOTE: The following workaround resolves an
+ # issue with iverilog, which does not properly
+ # parse specify timing paths that are not in
+ # parentheses. Easy to work around
+ vlines = re.sub(r'\)[ \t]*=[ \t]*([01]:[01]:[01])[ \t]*;', r') = ( \1 ) ;', vtext)
+ print(vlines, file=ofile)
+ print('\n//--------EOF---------\n', file=ofile)
+ else:
+                    print('One file or none (' + str(vlist) + '); ignoring "compile" option.')
+
+ print("Completed installation of vendor files.")
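The iverilog specify-path workaround above can be exercised on its own. This sketch (pin names invented for illustration) applies the same substitution to a sample timing line:

```python
import re

# Wrap the bare rise:fall:turnoff triplet of a specify path delay in
# parentheses, as done when building the consolidated verilog library.
vtext = '(A => X) = 0:0:0 ;'
fixed = re.sub(r'\)[ \t]*=[ \t]*([01]:[01]:[01])[ \t]*;', r') = ( \1 ) ;', vtext)
print(fixed)   # (A => X) = ( 0:0:0 ) ;
```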
+
+ # Now for the harder part. If GDS and/or LEF databases were specified,
+ # then migrate them to magic (.mag files in layout/ or abstract/).
+
+    ignorelist = []
+ do_cdl_scaleu = False
+ for option in optionlist[:]:
+ if option[0] == 'cdl':
+ # Option 'scaleu' is a standalone keyword
+ do_cdl_scaleu = 'scaleu' in option
+
+ # Option 'ignore' has arguments after '='
+ for item in option:
+ if item.split('=')[0] == 'ignore':
+ ignorelist = item.split('=')[1].split(',')
+
+ devlist = []
+ pdklibrary = None
+
+ if have_gds:
+ print("Migrating GDS files to layout.")
+ destdir = targetdir + '/libs.ref/mag'
+ srcdir = targetdir + '/libs.ref/gds'
+ os.makedirs(destdir, exist_ok=True)
+
+ # For each library, create the library subdirectory
+ for library in libraries:
+ if len(library) == 3:
+ destlib = library[2]
+ else:
+ destlib = library[1]
+ destlibdir = destdir + '/' + destlib
+ srclibdir = srcdir + '/' + destlib
+ os.makedirs(destlibdir, exist_ok=True)
+
+ # For primitive devices, check the PDK script and find the name
+ # of the library and get a list of supported devices.
+
+ if library[0] == 'primitive':
+ pdkscript = targetdir + mag_current + pdkname + '.tcl'
+ print('Searching for supported devices in PDK script ' + pdkscript + '.')
+
+ if os.path.isfile(pdkscript):
+                librex = re.compile(r'^[ \t]*set[ \t]+PDKNAMESPACE[ \t]+([^ \t]+)$')
+                devrex = re.compile(r'^[ \t]*proc[ \t]+([^ :\t]+)::([^ \t_]+)_defaults')
+                fixrex = re.compile(r'^[ \t]*return[ \t]+\[([^ :\t]+)::fixed_draw[ \t]+([^ \t]+)[ \t]+')
+ devlist = []
+ fixedlist = []
+ with open(pdkscript, 'r') as ifile:
+ scripttext = ifile.read().splitlines()
+ for line in scripttext:
+ lmatch = librex.match(line)
+ if lmatch:
+ pdklibrary = lmatch.group(1)
+ dmatch = devrex.match(line)
+ if dmatch:
+ if dmatch.group(1) == pdklibrary:
+ devlist.append(dmatch.group(2))
+ fmatch = fixrex.match(line)
+ if fmatch:
+ if fmatch.group(1) == pdklibrary:
+ fixedlist.append(fmatch.group(2))
+
+ # Diagnostic
+ print("PDK library is " + str(pdklibrary))
+
+            # Link to the PDK magic startup file from the target directory.
+            # If there is no -F version, look for one without -F (open-source PDK).
+ startup_script = targetdir + mag_current + pdkname + '-F.magicrc'
+ if not os.path.isfile(startup_script):
+ startup_script = targetdir + mag_current + pdkname + '.magicrc'
+
+ if os.path.isfile(startup_script):
+                # If the symbolic link exists (even if stale), remove it.
+                if os.path.lexists(destlibdir + '/.magicrc'):
+                    os.remove(destlibdir + '/.magicrc')
+ os.symlink(startup_script, destlibdir + '/.magicrc')
+
+ # Find GDS file names in the source
+ print('Getting GDS file list from ' + srclibdir + '.')
+ gdsfiles = os.listdir(srclibdir)
+
+ # Generate a script called "generate_magic.tcl" and leave it in
+ # the target directory. Use it as input to magic to create the
+ # .mag files from the database.
+
+ print('Creating magic generation script.')
+ with open(destlibdir + '/generate_magic.tcl', 'w') as ofile:
+ print('#!/usr/bin/env wish', file=ofile)
+ print('#--------------------------------------------', file=ofile)
+ print('# Script to generate .mag files from .gds ', file=ofile)
+ print('#--------------------------------------------', file=ofile)
+ print('gds readonly true', file=ofile)
+ print('gds flatten true', file=ofile)
+ # print('gds rescale false', file=ofile)
+ print('tech unlock *', file=ofile)
+
+ for gdsfile in gdsfiles:
+ # Note: DO NOT use a relative path here.
+ # print('gds read ../../gds/' + destlib + '/' + gdsfile, file=ofile)
+ print('gds read ' + srclibdir + '/' + gdsfile, file=ofile)
+
+ # Make sure properties include the Tcl generated cell
+ # information from the PDK script
+
+ if pdklibrary:
+ tclfixedlist = '{' + ' '.join(fixedlist) + '}'
+ print('set devlist ' + tclfixedlist, file=ofile)
+ print('set topcell [lindex [cellname list top] 0]',
+ file=ofile)
+
+ print('foreach cellname $devlist {', file=ofile)
+ print(' load $cellname', file=ofile)
+ print(' property gencell $cellname', file=ofile)
+ print(' property parameter m=1', file=ofile)
+ print(' property library ' + pdklibrary, file=ofile)
+ print('}', file=ofile)
+ print('load $topcell', file=ofile)
+
+ print('writeall force', file=ofile)
+
+ if have_lefanno:
+ # Find LEF file names in the source
+ lefsrcdir = targetdir + '/libs.ref/lefanno'
+ lefsrclibdir = lefsrcdir + '/' + destlib
+ leffiles = list(item for item in os.listdir(lefsrclibdir) if os.path.splitext(item)[1] == '.lef')
+
+ if not have_lef:
+ # This library has a GDS database but no LEF database. Use
+ # magic to create abstract views of the GDS cells. If
+ # option "-lefanno" is given, then read the LEF file after
+ # loading the database file to annotate the cell with
+ # information from the LEF file. This usually indicates
+ # that the LEF file has some weird definition of obstruction
+ # layers and we want to normalize them by using magic's LEF
+ # write procedure, but we still need the pin use and class
+ # information from the LEF file, and maybe the bounding box.
+
+ print('set maglist [glob *.mag]', file=ofile)
+ print('foreach name $maglist {', file=ofile)
+ print(' load [file root $name]', file=ofile)
+ if have_lefanno:
+ print('}', file=ofile)
+ for leffile in leffiles:
+ print('lef read ' + lefsrclibdir + '/' + leffile, file=ofile)
+ print('foreach name $maglist {', file=ofile)
+ print(' load [file root $name]', file=ofile)
+ print(' lef write [file root $name]', file=ofile)
+ print('}', file=ofile)
+ print('quit -noprompt', file=ofile)
+
+ # Run magic to read in the GDS file and write out magic databases.
+ with open(destlibdir + '/generate_magic.tcl', 'r') as ifile:
+ subprocess.run(['magic', '-dnull', '-noconsole'],
+ stdin = ifile, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, cwd = destlibdir,
+ universal_newlines = True)
+
+ if not have_lef:
+ # Remove the lefanno/ target and its contents.
+ if have_lefanno:
+ lefannosrcdir = targetdir + '/libs.ref/lefanno'
+ if os.path.isdir(lefannosrcdir):
+ shutil.rmtree(lefannosrcdir)
+
+ destlefdir = targetdir + '/libs.ref/lef'
+ destleflibdir = destlefdir + '/' + destlib
+ os.makedirs(destleflibdir, exist_ok=True)
+ leflist = list(item for item in os.listdir(destlibdir) if os.path.splitext(item)[1] == '.lef')
+
+ # All macros will go into one file
+ destleflib = destleflibdir + '/' + destlib + '.lef'
+ # Remove any existing library file from the target directory
+ if os.path.isfile(destleflib):
+ os.remove(destleflib)
+
+ first = True
+ with open(destleflib, 'w') as ofile:
+ for leffile in leflist:
+ # Remove any existing single file from the target directory
+ if os.path.isfile(destleflibdir + '/' + leffile):
+ os.remove(destleflibdir + '/' + leffile)
+
+ # Append contents
+ sourcelef = destlibdir + '/' + leffile
+ with open(sourcelef, 'r') as ifile:
+ leflines = ifile.read().splitlines()
+ if not first:
+ # Remove header from all but the first file
+ leflines = leflines[8:]
+ else:
+ first = False
+
+ for line in leflines:
+ print(line, file=ofile)
+
+ # Remove file from the source directory
+ os.remove(sourcelef)
+
+ have_lef = True
+
+ # Remove the startup script and generation script
+ os.remove(destlibdir + '/.magicrc')
+ os.remove(destlibdir + '/generate_magic.tcl')
+ else:
+ print("Master PDK magic startup file not found. Did you install")
+ print("PDK tech files before PDK vendor files?")
+
+ if have_lef:
+ print("Migrating LEF files to layout.")
+ destdir = targetdir + '/libs.ref/maglef'
+ srcdir = targetdir + '/libs.ref/lef'
+ magdir = targetdir + '/libs.ref/mag'
+ cdldir = targetdir + '/libs.ref/cdl'
+ os.makedirs(destdir, exist_ok=True)
+
+ # For each library, create the library subdirectory
+ for library in libraries:
+ if len(library) == 3:
+ destlib = library[2]
+ else:
+ destlib = library[1]
+ destlibdir = destdir + '/' + destlib
+ srclibdir = srcdir + '/' + destlib
+ maglibdir = magdir + '/' + destlib
+ cdllibdir = cdldir + '/' + destlib
+ os.makedirs(destlibdir, exist_ok=True)
+
+ # Link to the PDK magic startup file from the target directory
+ startup_script = targetdir + mag_current + pdkname + '-F.magicrc'
+ if not os.path.isfile(startup_script):
+ startup_script = targetdir + mag_current + pdkname + '.magicrc'
+
+ if os.path.isfile(startup_script):
+                # If the symbolic link exists (even if stale), remove it.
+                if os.path.lexists(destlibdir + '/.magicrc'):
+                    os.remove(destlibdir + '/.magicrc')
+ os.symlink(startup_script, destlibdir + '/.magicrc')
+
+ # Find LEF file names in the source
+ leffiles = os.listdir(srclibdir)
+
+ # Generate a script called "generate_magic.tcl" and leave it in
+ # the target directory. Use it as input to magic to create the
+ # .mag files from the database.
+
+ with open(destlibdir + '/generate_magic.tcl', 'w') as ofile:
+ print('#!/usr/bin/env wish', file=ofile)
+ print('#--------------------------------------------', file=ofile)
+ print('# Script to generate .mag files from .lef ', file=ofile)
+ print('#--------------------------------------------', file=ofile)
+ print('tech unlock *', file=ofile)
+
+ if pdklibrary:
+ tcldevlist = '{' + ' '.join(devlist) + '}'
+ print('set devlist ' + tcldevlist, file=ofile)
+
+ for leffile in leffiles:
+
+ # Okay to use a relative path here.
+ # print('lef read ' + srclibdir + '/' + leffile', file=ofile)
+ print('lef read ../../lef/' + destlib + '/' + leffile, file=ofile)
+
+ # To be completed: Parse SPICE file for port order, make
+ # sure ports are present and ordered.
+
+ if pdklibrary:
+ print('set cellname [file root ' + leffile + ']', file=ofile)
+ print('if {[lsearch $devlist $cellname] >= 0} {',
+ file=ofile)
+ print(' load $cellname', file=ofile)
+ print(' property gencell $cellname', file=ofile)
+ print(' property parameter m=1', file=ofile)
+ print(' property library ' + pdklibrary, file=ofile)
+ print('}', file=ofile)
+
+ print('writeall force', file=ofile)
+ print('quit -noprompt', file=ofile)
+
+ # Run magic to read in the LEF file and write out magic databases.
+ with open(destlibdir + '/generate_magic.tcl', 'r') as ifile:
+ subprocess.run(['magic', '-dnull', '-noconsole'],
+ stdin = ifile, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, cwd = destlibdir,
+ universal_newlines = True)
+
+ # Now list all the .mag files generated, and for each, read the
+ # corresponding file from the mag/ directory, pull the GDS file
+ # properties, and add those properties to the maglef view. Also
+ # read the CDL (or SPICE) netlist, read the ports, and rewrite
+ # the port order in the mag and maglef file accordingly.
+
+ # Diagnostic
+ print('Annotating files in ' + destlibdir)
+ magfiles = os.listdir(destlibdir)
+ for magroot in magfiles:
+ magname = os.path.splitext(magroot)[0]
+ magfile = maglibdir + '/' + magroot
+ magleffile = destlibdir + '/' + magroot
+ prop_lines = get_gds_properties(magfile)
+
+ # Make sure properties include the Tcl generated cell
+ # information from the PDK script
+
+ if pdklibrary:
+ if magname in fixedlist:
+                    prop_lines.append('gencell ' + magname)
+                    prop_lines.append('library ' + pdklibrary)
+                    prop_lines.append('parameter m=1')
+
+ cdlfile = cdllibdir + '/' + magname + '.cdl'
+ if not os.path.exists(cdlfile):
+ # Assume there is one file with all cell subcircuits in it.
+ try:
+ cdlfile = glob.glob(cdllibdir + '/*.cdl')[0]
+                    except IndexError:
+ print('No CDL file for ' + destlib + ' device ' + magname)
+ cdlfile = None
+ # To be done: If destlib is 'primitive', then look in
+ # SPICE models for port order.
+ if destlib == 'primitive':
+ print('Fix me: Need to look in SPICE models!')
+ if cdlfile:
+ port_dict = get_subckt_ports(cdlfile, magname)
+ else:
+ port_dict = {}
+
+ proprex = re.compile('<< properties >>')
+ endrex = re.compile('<< end >>')
+ rlabrex = re.compile('rlabel[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+([^ \t]+)')
+ flabrex = re.compile('flabel[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+[^ \t]+[ \t]+([^ \t]+)')
+                portrex = re.compile(r'port[ \t]+([^ \t]+)[ \t]+(.*)')
+ portnum = -1
+
+ with open(magleffile, 'r') as ifile:
+ magtext = ifile.read().splitlines()
+
+ with open(magleffile, 'w') as ofile:
+ for line in magtext:
+ tmatch = portrex.match(line)
+ if tmatch:
+ if portnum >= 0:
+ line = 'port ' + str(portnum) + ' ' + tmatch.group(2)
+ else:
+ line = 'port ' + tmatch.group(1) + ' ' + tmatch.group(2)
+ ematch = endrex.match(line)
+ if ematch and len(prop_lines) > 0:
+ print('<< properties >>', file=ofile)
+ for prop in prop_lines:
+ print('string ' + prop, file=ofile)
+
+ print(line, file=ofile)
+ pmatch = proprex.match(line)
+ if pmatch:
+ for prop in prop_lines:
+ print('string ' + prop, file=ofile)
+ prop_lines = []
+
+ lmatch = flabrex.match(line)
+ if not lmatch:
+ lmatch = rlabrex.match(line)
+ if lmatch:
+ labname = lmatch.group(1).lower()
+ try:
+ portnum = port_dict[labname]
+                    except KeyError:
+ portnum = -1
+
+ if os.path.exists(magfile):
+ with open(magfile, 'r') as ifile:
+ magtext = ifile.read().splitlines()
+
+ with open(magfile, 'w') as ofile:
+ for line in magtext:
+ tmatch = portrex.match(line)
+ if tmatch:
+ if portnum >= 0:
+ line = 'port ' + str(portnum) + ' ' + tmatch.group(2)
+ else:
+ line = 'port ' + tmatch.group(1) + ' ' + tmatch.group(2)
+ ematch = endrex.match(line)
+ print(line, file=ofile)
+ lmatch = flabrex.match(line)
+ if not lmatch:
+ lmatch = rlabrex.match(line)
+ if lmatch:
+ labname = lmatch.group(1).lower()
+ try:
+ portnum = port_dict[labname]
+                            except KeyError:
+ portnum = -1
+ elif os.path.splitext(magfile)[1] == '.mag':
+ # NOTE: Probably this means the GDS cell has a different name.
+                print('Error: No file ' + magfile + ' corresponding to the maglef view.')
+
+ # Remove the startup script and generation script
+ os.remove(destlibdir + '/.magicrc')
+ os.remove(destlibdir + '/generate_magic.tcl')
+ else:
+ print("Master PDK magic startup file not found. Did you install")
+ print("PDK tech files before PDK vendor files?")
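The port-renumbering pass described above reduces to matching a "port" line and substituting the number recovered from the CDL subcircuit port order. A minimal sketch (the label fields and port number here are invented):

```python
import re

# Capture the port name/number and the rest of the line, then rewrite
# the line with the port number taken from the CDL port order.
portrex = re.compile(r'port[ \t]+([^ \t]+)[ \t]+(.*)')
line = 'port 3 nsew signal input'
tmatch = portrex.match(line)
portnum = 7   # position that would come from the CDL port dictionary
if tmatch and portnum >= 0:
    line = 'port ' + str(portnum) + ' ' + tmatch.group(2)
print(line)   # port 7 nsew signal input
```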
+
+ # If SPICE or CDL databases were specified, then convert them to
+ # a form that can be used by ngspice, using the cdl2spi.py script
+
+ if have_spice:
+ if not os.path.isdir(targetdir + '/libs.ref/spi'):
+ os.makedirs(targetdir + '/libs.ref/spi', exist_ok=True)
+
+ elif have_cdl:
+ if not os.path.isdir(targetdir + '/libs.ref/spi'):
+ os.makedirs(targetdir + '/libs.ref/spi', exist_ok=True)
+
+ print("Migrating CDL netlists to SPICE.")
+ destdir = targetdir + '/libs.ref/spi'
+ srcdir = targetdir + '/libs.ref/cdl'
+ os.makedirs(destdir, exist_ok=True)
+
+ # For each library, create the library subdirectory
+ for library in libraries:
+ if len(library) == 3:
+ destlib = library[2]
+ else:
+ destlib = library[1]
+ destlibdir = destdir + '/' + destlib
+ srclibdir = srcdir + '/' + destlib
+ os.makedirs(destlibdir, exist_ok=True)
+
+ # Find CDL file names in the source
+ cdlfiles = os.listdir(srclibdir)
+
+ # The directory with scripts should be in ../common with respect
+ # to the Makefile that determines the cwd.
+ scriptdir = os.path.split(os.getcwd())[0] + '/common/'
+
+ # Run cdl2spi.py script to read in the CDL file and write out SPICE
+ for cdlfile in cdlfiles:
+ spiname = os.path.splitext(cdlfile)[0] + '.spi'
+ procopts = [scriptdir + 'cdl2spi.py', srclibdir + '/' + cdlfile, destlibdir + '/' + spiname]
+ if do_cdl_scaleu:
+ procopts.append('-dscale=u')
+ for item in ignorelist:
+ procopts.append('-ignore=' + item)
+ print('Running (in ' + destlibdir + '): ' + ' '.join(procopts))
+ subprocess.run(procopts,
+ stdin = subprocess.DEVNULL, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE, cwd = destlibdir,
+ universal_newlines = True)
+
+ sys.exit(0)
diff --git a/common/pdk.bindkeys b/common/pdk.bindkeys
new file mode 100644
index 0000000..1b502db
--- /dev/null
+++ b/common/pdk.bindkeys
@@ -0,0 +1,186 @@
+#
+# Cadence-compatibility bindings except where marked.
+#
+macro f "view" ;# zoom to fit window
+macro ^z "zoom 0.5" ;# zoom in
+macro Z "zoom 2" ;# zoom out
+macro B "popstack" ;# up hierarchy
+macro X {pushstack [cellname list self]} ;# down hierarchy
+macro x "edit" ;# down hierarchy, edit-in-place
+macro b "select top cell ; edit" ;# up hierarchy from edit-in-place
+macro p "tool wire; magic::trackwire %W pick" ;# path
+macro ^r "redraw"
+macro ^f "unexpand"
+macro F "expand"
+macro ^a "select area"
+macro ^d "select clear"
+macro k "magic::measure"
+macro K "magic::unmeasure"
+macro i "magic::promptload getcell"
+macro l "magic::update_texthelper ; wm deiconify .texthelper ; raise .texthelper"
+macro O "magic::clock"
+macro <del> "magic::delete"
+
+# Toolkit parameter dialog
+macro q "magic::gencell {} ; raise .params"
+#
+# The following should be already implemented as existing Magic bindings
+#
+macro u "undo"
+macro U "redo"
+macro m "move"
+macro c "copy"
+#
+# Compatibility with Electric; Cadence bindings are on function keys and
+# do not work through the VNC.
+macro ^s "magic::promptsave magic" ;# save dialog menu
+
+#
+# Bertrand's bindings follow except where marked.
+#
+macro < sideways
+macro ^ upsidedown
+#
+# Set grid at 1 lambda
+#
+macro 0 "grid on ; grid 1l" ;# Grid at 0.5um (1 lambda)
+# macro ^f "feedback clear" ;# conflicts with Cadence binding
+#
+# Paint/Erase macros
+#
+macro 1 "paint m1"
+macro ! "erase m1"
+macro 2 "paint m2"
+macro @ "erase m2"
+macro 3 "paint m3"
+macro # "erase m3"
+#ifdef METAL4
+macro 4 "paint mtp"
+macro $ "erase mtp"
+#endif
+#ifdef METAL5
+macro 4 "paint m4"
+macro $ "erase m4"
+macro 5 "paint mtp"
+macro % "erase mtp"
+#endif
+#ifdef METAL6
+macro 4 "paint m4"
+macro $ "erase m4"
+macro 5 "paint m5"
+macro % "erase m5"
+macro 6 "paint mtp"
+macro ^ "erase mtp"
+#endif
+
+macro 7 "paint poly"
+# macro & "erase poly"
+# macro p "paint pdiff"
+macro n "paint ndiff"
+# macro l "erase labels"
+macro P "erase pdiff"
+macro N "erase ndiff"
+macro y "drc check; drc why"
+macro ? "select area; what"
+
+macro / "expand toggle"
+macro ^w "writeall force"
+macro ^e "edit"
+# macro ^x "quit"
+
+macro z "findbox zoom"
+# "f" conflicts with Cadence "full view", so use control-i to select cells.
+# macro f "select cell"
+macro ^i "select cell"
+
+# Leave keypad bindings as-is, further down. However, keypad
+# keys generally don't translate through the VNC session, so
+# use the following arrow key bindings:
+#
+# no shift shift
+# arrows only -> Pan 10% 100%
+# with alt -> Move 1 lambda 1 grid
+# with ctrl -> Stretch 1 lambda 1 grid
+#
+# Pan 10 percent of the window size with arrows
+# macro XK_Left "scroll l .1 w"
+# macro XK_Up "scroll u .1 w"
+# macro XK_Right "scroll r .1 w"
+# macro XK_Down "scroll d .1 w"
+
+# Pan 100 percent of the window size with arrows
+# macro Shift_XK_Left "scroll l 1 w"
+# macro Shift_XK_Up "scroll u 1 w"
+# macro Shift_XK_Right "scroll r 1 w"
+# macro Shift_XK_Down "scroll d 1 w"
+
+# move 0.05um with arrows
+# macro Alt_XK_Left "move l 1l"
+# macro Alt_XK_Right "move r 1l"
+# macro Alt_XK_Up "move u 1l"
+# macro Alt_XK_Down "move d 1l"
+
+# move 1 grid unit with arrows
+# macro Alt_Shift_XK_Left "move l 1g"
+# macro Alt_Shift_XK_Right "move r 1g"
+# macro Alt_Shift_XK_Up "move u 1g"
+# macro Alt_Shift_XK_Down "move d 1g"
+
+# stretch 0.05um with arrows
+# macro Control_XK_Left "stretch l 1l"
+# macro Control_XK_Right "stretch r 1l"
+# macro Control_XK_Up "stretch u 1l"
+# macro Control_XK_Down "stretch d 1l"
+
+# stretch 1 grid unit with arrows
+# macro Control_Shift_XK_Left "stretch l 1g"
+# macro Control_Shift_XK_Right "stretch r 1g"
+# macro Control_Shift_XK_Up "stretch u 1g"
+# macro Control_Shift_XK_Down "stretch d 1g"
+
+# shift mouse wheel bindings for right-left shift
+macro Shift_XK_Pointer_Button4 "scroll r .05 w"
+macro Shift_XK_Pointer_Button5 "scroll l .05 w"
+
+# control mouse wheel bindings for zoom in/out
+macro Control_XK_Pointer_Button4 "zoom 0.70711"
+macro Control_XK_Pointer_Button5 "zoom 1.41421"
+
+# Bertrand's original arrow macros
+# move 1 grid unit with arrows
+macro XK_Left "move l 1g"
+macro XK_Right "move r 1g"
+macro XK_Up "move u 1g"
+macro XK_Down "move d 1g"
+
+# move 0.05um with arrows
+macro Control_XK_Left "move l 1l"
+macro Control_XK_Right "move r 1l"
+macro Control_XK_Up "move u 1l"
+macro Control_XK_Down "move d 1l"
+
+# stretch 1 grid unit with arrows
+macro Shift_XK_Left "stretch l 1g"
+macro Shift_XK_Right "stretch r 1g"
+macro Shift_XK_Up "stretch u 1g"
+macro Shift_XK_Down "stretch d 1g"
+
+# stretch 0.05um with arrows
+macro Control_Shift_XK_Left "stretch l 1l"
+macro Control_Shift_XK_Right "stretch r 1l"
+macro Control_Shift_XK_Up "stretch u 1l"
+macro Control_Shift_XK_Down "stretch d 1l"
+
+# Restore pan function on Alt-key
+# Pan 10 percent of the window size with arrows
+macro Alt_XK_Left "scroll l .1 w"
+macro Alt_XK_Up "scroll u .1 w"
+macro Alt_XK_Right "scroll r .1 w"
+macro Alt_XK_Down "scroll d .1 w"
+
+# Pan 100 percent of the window size with arrows
+macro Alt_Shift_XK_Left "scroll l 1 w"
+macro Alt_Shift_XK_Up "scroll u 1 w"
+macro Alt_Shift_XK_Right "scroll r 1 w"
+macro Alt_Shift_XK_Down "scroll d 1 w"
+
diff --git a/common/pdk.prm b/common/pdk.prm
new file mode 100644
index 0000000..719eb74
--- /dev/null
+++ b/common/pdk.prm
@@ -0,0 +1,26 @@
+; TODO: make changes to this file for TECHNAME?
+; configuration file for TECHNAME (left same as osu035, 0.35um process)
+; Note that these values are totally bogus!
+;
+
+lambda 0.01 ; length scaling, microns (1 lambda = 1 centimicron)
+
+capga .0115 ; gate capacitance, pF/micron^2
+
+capda 0.0012
+capdp 0.0013
+cappda 0.00260
+cappdp 0.00090
+
+lowthresh 0.5 ; logic low threshold as a normalized voltage
+highthresh 0.5 ; logic high threshold as a normalized voltage
+
+cntpullup 0 ; irrelevant, cmos technology; no depletion transistors
+diffperim 0 ; don't include diffusion perimeters for sidewall cap.
+subparea 0 ; poly over transistor won't count as part of bulk-poly cap.
+diffext 0 ; diffusion extension for each transistor
+
+resistance n-channel dynamic-low 2 0.4 1844.70
+resistance p-channel dynamic-high 6.2 0.4 1489.10
+resistance n-channel static 2 0.4 2203.94
+resistance p-channel static 6.2 0.4 1693.37
diff --git a/common/pdk.tcl b/common/pdk.tcl
new file mode 100644
index 0000000..489c5f6
--- /dev/null
+++ b/common/pdk.tcl
@@ -0,0 +1,274 @@
+#-------------------------------------------------------------------
+# General-purpose routines for the PDK script in all technologies
+#-------------------------------------------------------------------
+#
+#----------------------------------------
+# Number Conversion Functions
+#----------------------------------------
+
+#---------------------
+# Microns to Lambda
+#---------------------
+proc magic::u2l {micron} {
+ set techlambda [magic::tech lambda]
+ set tech1 [lindex $techlambda 1]
+ set tech0 [lindex $techlambda 0]
+ set tscale [expr {$tech1 / $tech0}]
+ set lambdaout [expr {((round([magic::cif scale output] * 10000)) / 10000.0)}]
+ return [expr $micron / ($lambdaout*$tscale) ]
+}
+
+#---------------------
+# Lambda to Microns
+#---------------------
+proc magic::l2u {lambda} {
+ set techlambda [magic::tech lambda]
+ set tech1 [lindex $techlambda 1] ; set tech0 [lindex $techlambda 0]
+ set tscale [expr {$tech1 / $tech0}]
+ set lambdaout [expr {((round([magic::cif scale output] * 10000)) / 10000.0)}]
+ return [expr $lambda * $lambdaout * $tscale ]
+}
+
+#---------------------
+# Internal to Microns
+#---------------------
+proc magic::i2u { value } {
+ return [expr {((round([magic::cif scale output] * 10000)) / 10000.0) * $value}]
+}
+
+#---------------------
+# Microns to Internal
+#---------------------
+proc magic::u2i {value} {
+ return [expr {$value / ((round([magic::cif scale output] * 10000)) / 10000.0)}]
+}
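The micron/lambda conversions above are simple scalings by the output scale times the tech-lambda ratio. A rough Python stand-in (the scale factors are assumed, since they normally come from magic's `cif scale output` and `tech lambda` queries) shows the round trip:

```python
# Assumed stand-ins for magic's internal queries:
lambdaout = 0.25   # [cif scale output], microns per lambda (assumed)
tscale = 2         # tech lambda ratio tech1/tech0 (assumed)

def u2l(micron):
    # Microns to lambda, as in magic::u2l
    return micron / (lambdaout * tscale)

def l2u(lam):
    # Lambda to microns, as in magic::l2u
    return lam * lambdaout * tscale

print(l2u(u2l(0.7)))   # 0.7
```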
+
+#---------------------
+# Float to Spice
+#---------------------
+proc magic::float2spice {value} {
+ if {$value >= 1.0e+6} {
+ set exponent 1e+6
+ set unit "meg"
+ } elseif {$value >= 1.0e+3} {
+ set exponent 1e+3
+ set unit "k"
+ } elseif { $value >= 1} {
+ set exponent 1
+ set unit ""
+ } elseif {$value >= 1.0e-3} {
+ set exponent 1e-3
+ set unit "m"
+ } elseif {$value >= 1.0e-6} {
+ set exponent 1e-6
+ set unit "u"
+ } elseif {$value >= 1.0e-9} {
+ set exponent 1e-9
+ set unit "n"
+ } elseif {$value >= 1.0e-12} {
+ set exponent 1e-12
+ set unit "p"
+ } elseif {$value >= 1.0e-15} {
+ set exponent 1e-15
+ set unit "f"
+ } else {
+ set exponent 1e-18
+ set unit "a"
+ }
+ set val [expr $value / $exponent]
+ set val [expr int($val * 1000) / 1000.0]
+ if {$val == 0} {set unit ""}
+ return $val$unit
+}
+
+#---------------------
+# Spice to Float
+#---------------------
+proc magic::spice2float {value {faultval 0.0}} {
+ # Remove trailing units, at least for some common combinations
+ set value [string tolower $value]
+    set value [string map {um u nm n uf u nf n pf p af a} $value]
+    set value [string map {meg "* 1.0e6" k "* 1.0e3" m "* 1.0e-3" u "* 1.0e-6" \
+		n "* 1.0e-9" p "* 1.0e-12" f "* 1.0e-15" a "* 1.0e-18"} $value]
+ if {[catch {set rval [expr $value]}]} {
+ puts stderr "Value is not numeric!"
+ set rval $faultval
+ }
+ return $rval
+}
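For testing outside of magic, magic::float2spice can be mirrored in Python (a rough equivalent for checking values, not part of the PDK files):

```python
def float2spice(value):
    # Pick the largest SI prefix not exceeding the value, then
    # truncate to three digits past the decimal, as the Tcl proc does.
    for exponent, unit in [(1e6, 'meg'), (1e3, 'k'), (1, ''), (1e-3, 'm'),
                           (1e-6, 'u'), (1e-9, 'n'), (1e-12, 'p'),
                           (1e-15, 'f')]:
        if value >= exponent:
            break
    else:
        exponent, unit = 1e-18, 'a'
    val = int(value / exponent * 1000) / 1000.0
    if val == 0:
        unit = ''
    return str(val) + unit

print(float2spice(1500.0))   # 1.5k
```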
+
+#---------------------
+# Numeric Precision
+#---------------------
+proc magic::3digitpastdecimal {value} {
+ set new [expr int([expr $value * 1000 + 0.5 ]) / 1000.0]
+ return $new
+}
+
+#-------------------------------------------------------------------
+# File Access Functions
+#-------------------------------------------------------------------
+
+#-------------------------------------------------------------------
+# Ensures that a cell name does not already exist, either in
+# memory or on disk. Modifies the name until it does.
+#-------------------------------------------------------------------
+proc magic::cellnameunique {cellname} {
+ set i 0
+ set newname $cellname
+ while {[cellname list exists $newname] != 0 || [magic::searchcellondisk $newname] != 0} {
+ incr i
+ set newname ${cellname}_$i
+ }
+ return $newname
+}
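The renaming loop in magic::cellnameunique is easy to model; this hypothetical Python sketch uses a plain set in place of the in-memory and on-disk checks:

```python
def cellnameunique(cellname, existing):
    # Append _1, _2, ... until the name collides with nothing.
    i = 0
    newname = cellname
    while newname in existing:
        i += 1
        newname = '{}_{}'.format(cellname, i)
    return newname

print(cellnameunique('inv', {'inv', 'inv_1'}))   # inv_2
```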
+
+#-------------------------------------------------------------------
+# Looks to see if a cell exists on disk
+#-------------------------------------------------------------------
+proc magic::searchcellondisk {name} {
+ set rlist {}
+ foreach dir [path search] {
+ set ftry [file join $dir ${name}.mag]
+ if [file exists $ftry] {
+ return 1
+ }
+ }
+ return 0
+}
+
+#-------------------------------------------------------------------
+# Checks to see if a cell already exists on disk or in memory
+#-------------------------------------------------------------------
+proc magic::iscellnameunique {cellname} {
+ if {[cellname list exists $cellname] == 0 && [magic::searchcellondisk $cellname] == 0} {
+ return 1
+ } else {
+ return 0
+ }
+}
+
+#--------------------------------------------------------------
+# Procedure that checks the user's "ip" subdirectory on startup
+# and adds each one's maglef subdirectory to the path.
+#--------------------------------------------------------------
+
+proc magic::query_mylib_ip {} {
+ global TECHPATH
+ global env
+ if [catch {set home $env(SUDO_USER)}] {
+ set home $env(USER)
+ }
+ set homedir /home/${home}
+ set ip_dirs [glob -directory ${homedir}/design/ip *]
+ set proj_dir [pwd]
+ set config_dir .config
+ set info_dir ${proj_dir}/${config_dir}
+ if {![file exists ${info_dir}]} {
+ set config_dir .ef-config
+ set info_dir ${proj_dir}/${config_dir}
+ }
+
+    set info_file ${info_dir}/info
+    set depends [dict create]
+    set ipname ""
+    if {![catch {open $info_file r} ifd]} {
+ set depsec false
+ while {[gets $ifd line] >= 0} {
+ if {[string first dependencies: $line] >= 0} {
+ set depsec true
+ }
+ if {$depsec} {
+ if {[string first version: $line] >= 0} {
+ if {$ipname != ""} {
+ set ipvers [string trim [lindex [split $line] 1] ']
+ dict set depends $ipname $ipvers
+ set ipname ""
+ } else {
+ puts stderr "Badly formatted info file in ${config_dir}!"
+ }
+ } else {
+ set ipname [string trim $line :]
+ }
+ }
+ }
+ }
+
+ foreach dir $ip_dirs {
+ # Version handling: version dependencies are found in
+ # ${config_dir}/info. For all other IP, use the most recent
+ # version number.
+ set ipname [lindex [file split $dir] end]
+ if {![catch {set version [dict get $depends $ipname]}]} {
+ if {[file isdirectory ${dir}/${version}/maglef]} {
+ addpath ${dir}/${version}/maglef
+ continue
+ } else {
+ puts stderr "ERROR: Dependency ${ipname} version ${version} does not exist"
+ }
+ }
+
+ # Secondary directory is the version number. Use the highest
+ # version available.
+
+ set sub_dirs {}
+ catch {set sub_dirs [glob -directory $dir *]}
+ set maxver 0.0
+ foreach subdir $sub_dirs {
+ set vidx [string last / $subdir]
+ incr vidx
+ set version [string range $subdir $vidx end]
+ if {$version > $maxver} {
+ set maxver $version
+ }
+ }
+ if {[file exists ${dir}/${maxver}/maglef]} {
+ # Compatibility rule: foundry name must match.
+ # Get foundry name from ${config_dir}/techdir symbolic link reference
+ if {[file exists ${dir}/${maxver}/${config_dir}/techdir]} {
+ set technodedir [file link ${dir}/${maxver}/${config_dir}/techdir]
+ set nidx [string last / $technodedir]
+ set techdir [string range $technodedir 0 $nidx-1]
+ if {$techdir == $TECHPATH} {
+ addpath ${dir}/${maxver}/maglef
+ }
+ }
+ }
+ }
+}
+
+#--------------------------------------------------------------
+# Procedure that checks the user's design directory on startup
+# and adds each one's mag subdirectory to the path.
+#--------------------------------------------------------------
+
+proc magic::query_my_projects {} {
+ global TECHPATH
+ global env
+ if [catch {set home $env(SUDO_USER)}] {
+ set home $env(USER)
+ }
+ set homedir /home/${home}
+ set proj_dirs [glob -directory ${homedir}/design *]
+ foreach dir $proj_dirs {
+ # Compatibility rule: foundry name must match.
+ # Get foundry name from ${config_dir}/techdir symbolic link reference
+ if {[file exists ${dir}/mag]} {
+ set config_dir .config
+ set tech_dir ${dir}/${config_dir}
+ if {![file exists ${tech_dir}]} {
+ set config_dir .ef-config
+ set tech_dir ${dir}/${config_dir}
+ }
+ if {[file exists ${dir}/${config_dir}/techdir]} {
+ set technodedir [file link ${dir}/${config_dir}/techdir]
+ set nidx [string last / $technodedir]
+ set techdir [string range $technodedir 0 $nidx-1]
+ if {$techdir == $TECHPATH} {
+ addpath ${dir}/mag
+ }
+ }
+ }
+ }
+}
+
+#----------------------------------------------------------------
diff --git a/common/preproc.py b/common/preproc.py
new file mode 100755
index 0000000..9612921
--- /dev/null
+++ b/common/preproc.py
@@ -0,0 +1,594 @@
+#!/usr/bin/env python3
+#--------------------------------------------------------------------
+#
+# preproc.py
+#
+# General purpose macro preprocessor
+#
+#--------------------------------------------------------------------
+# Usage:
+#
+# preproc.py input_file [output_file] [-D<variable> ...]
+#
+# Where <variable> may be a keyword or a key=value pair
+#
+# Syntax: Basically like cpp. However, this preprocessor handles
+# only a limited set of keywords, so it does not otherwise mangle
+# the file in the belief that it must be C code. Handling of boolean
+# relations is important, so these are thoroughly defined (see below)
+#
+# #if defined(<variable>) [...]
+# #ifdef <variable>
+# #ifndef <variable>
+# #elseif <variable>
+# #else
+# #endif
+#
+# #define <variable> [...]
+# #define <variable>(<parameters>) [...]
+# #undef <variable>
+#
+# #include <filename>
+#
+# <variable> may be
+# <keyword>
+# <keyword>=<value>
+#
+# <keyword> without '=' is effectively the same as <keyword>=1
+# In a conditional, an undefined keyword is equivalent to <keyword>=0.
+#
+# Boolean operators (in order of precedence):
+# ! NOT
+# && AND
+# || OR
+#
+# Comments:
+# Most comments (C-like or Tcl-like) are output as-is. A
+# line beginning with "###" is treated as a preprocessor
+# comment and is not copied to the output.
+#
+# Examples:
+# #if defined(X) || defined(Y)
+# #else
+# #if defined(Z)
+# #endif
+#--------------------------------------------------------------------
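The `-D<variable>` semantics described above (a bare keyword defaults to 1, a key=value pair keeps its value) can be sketched as follows. This is an illustrative fragment, not the script's own option parser.

```python
# Sketch: collect -D style definitions into a dictionary. A bare
# keyword defaults to the value '1'; an undefined keyword reads as 0
# in conditionals. Names here are hypothetical.
def parse_defines(args):
    defines = {}
    for arg in args:
        if arg.startswith('-D'):
            key, _, value = arg[2:].partition('=')
            defines[key] = value if value else '1'
    return defines
```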
+
+import os
+import re
+import sys
+
+def solve_statement(condition):
+
+    defrex = re.compile(r'defined[ \t]*\(([^\)]+)\)')
+    orrex = re.compile(r'(.+)\|\|(.+)')
+    andrex = re.compile(r'(.+)&&(.+)')
+    notrex = re.compile(r'!([^&\|]+)')
+    parenrex = re.compile(r'\(([^\)]+)\)')
+    leadspacerex = re.compile(r'^[ \t]+(.*)')
+    endspacerex = re.compile(r'(.*)[ \t]+$')
+
+ matchfound = True
+ while matchfound:
+ matchfound = False
+
+ # Search for defined(K) (K must be a single keyword)
+ # If the keyword was defined, then it should have been replaced by 1
+ lmatch = defrex.search(condition)
+ if lmatch:
+ key = lmatch.group(1)
+            if key == '1':
+ repl = 1
+ else:
+ repl = 0
+
+ condition = defrex.sub(str(repl), condition)
+ matchfound = True
+
+ # Search for (X) recursively
+ lmatch = parenrex.search(condition)
+ if lmatch:
+ repl = solve_statement(lmatch.group(1))
+ condition = parenrex.sub(str(repl), condition)
+ matchfound = True
+
+ # Search for !X recursively
+ lmatch = notrex.search(condition)
+ if lmatch:
+ only = solve_statement(lmatch.group(1))
+ if only == '1':
+ repl = '0'
+ else:
+ repl = '1'
+ condition = notrex.sub(str(repl), condition)
+ matchfound = True
+
+ # Search for A&&B recursively
+ lmatch = andrex.search(condition)
+ if lmatch:
+ first = solve_statement(lmatch.group(1))
+ second = solve_statement(lmatch.group(2))
+ if first == '1' and second == '1':
+ repl = '1'
+ else:
+ repl = '0'
+ condition = andrex.sub(str(repl), condition)
+ matchfound = True
+
+ # Search for A||B recursively
+ lmatch = orrex.search(condition)
+ if lmatch:
+ first = solve_statement(lmatch.group(1))
+ second = solve_statement(lmatch.group(2))
+ if first == '1' or second == '1':
+ repl = '1'
+ else:
+ repl = '0'
+ condition = orrex.sub(str(repl), condition)
+ matchfound = True
+
+ # Remove whitespace
+ lmatch = leadspacerex.match(condition)
+ if lmatch:
+ condition = lmatch.group(1)
+ lmatch = endspacerex.match(condition)
+ if lmatch:
+ condition = lmatch.group(1)
+
+ return condition
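The reduction loop above can be distilled into a self-contained sketch: repeatedly rewrite the condition string until no operator pattern matches, so that precedence falls out of the rewrite order. This assumes the condition has already been reduced to '0'/'1' tokens; it is not the function above, just a minimal model of its technique.

```python
import re

# Minimal sketch of the iterative regex-reduction idea: rules are
# tried in precedence order (parens, !, &&, ||) until a fixed point.
def reduce_bool(cond):
    cond = cond.replace(' ', '')
    rules = (
        (r'\(([01])\)', lambda m: m.group(1)),                  # (X) -> X
        (r'!([01])', lambda m: '0' if m.group(1) == '1' else '1'),
        (r'([01])&&([01])', lambda m: '1' if m.group(1) == m.group(2) == '1' else '0'),
        (r'([01])\|\|([01])', lambda m: '1' if '1' in m.group(1, 2) else '0'),
    )
    changed = True
    while changed:
        changed = False
        for pattern, repl in rules:
            newcond = re.sub(pattern, repl, cond, count=1)
            if newcond != cond:
                cond = newcond
                changed = True
    return cond
```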
+
+def solve_condition(condition, keys, defines, keyrex):
+ # Do definition replacement on the conditional
+ for keyword in keys:
+ condition = keyrex[keyword].sub(defines[keyword], condition)
+
+ value = solve_statement(condition)
+ if value == '1':
+ return 1
+ else:
+ return 0
+
+def sortkeys(keys):
+ newkeys = []
+ for i in range(0, len(keys)):
+ keyword = keys[i]
+ found = False
+ for j in range(0, len(newkeys)):
+ inword = newkeys[j]
+ if inword in keyword:
+ # Insert keyword before inword
+ newkeys.insert(j, keyword)
+ found = True
+ break
+ if not found:
+ newkeys.append(keyword)
+ return newkeys
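The reason sortkeys() puts superstrings first is that substitution order matters when one macro name is a substring of another. A small hypothetical example (the macro names and values are invented for illustration):

```python
import re

# Hypothetical macros: 'PDK' is a substring of 'PDKPATH'. Substituting
# the shorter name first corrupts every occurrence of the longer one.
defines = {'PDKPATH': '/usr/share/pdk', 'PDK': 'sky130'}

def substitute(line, order):
    # Apply each macro substitution in the given key order
    for key in order:
        line = re.sub(key, defines[key], line)
    return line

wrong = substitute('cd PDKPATH/PDK', ['PDK', 'PDKPATH'])    # shorter first
right = substitute('cd PDKPATH/PDK', ['PDKPATH', 'PDK'])    # longest first
```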
+
+def runpp(keys, keyrex, defines, ccomm, incdirs, inputfile, ofile):
+
+    includerex = re.compile(r'^[ \t]*#include[ \t]+"*([^ \t\n\r"]+)')
+    definerex = re.compile(r'^[ \t]*#define[ \t]+([^ \t]+)[ \t]+(.+)')
+    paramrex = re.compile(r'^([^\(]+)\(([^\)]+)\)')
+    defrex = re.compile(r'^[ \t]*#define[ \t]+([^ \t\n\r]+)')
+    undefrex = re.compile(r'^[ \t]*#undef[ \t]+([^ \t\n\r]+)')
+    ifdefrex = re.compile(r'^[ \t]*#ifdef[ \t]+(.+)')
+    ifndefrex = re.compile(r'^[ \t]*#ifndef[ \t]+(.+)')
+    ifrex = re.compile(r'^[ \t]*#if[ \t]+(.+)')
+    elseifrex = re.compile(r'^[ \t]*#elseif[ \t]+(.+)')
+    elserex = re.compile(r'^[ \t]*#else')
+    endifrex = re.compile(r'^[ \t]*#endif')
+    commentrex = re.compile(r'^###[^#]*$')
+    ccstartrex = re.compile(r'/\*')		# C-style comment start
+    ccendrex = re.compile(r'\*/')		# C-style comment end
+    contrex = re.compile(r'.*\\$')		# Backslash continuation line
+
+    badifrex = re.compile(r'^[ \t]*#if[ \t]*.*')
+    badelserex = re.compile(r'^[ \t]*#else[ \t]*.*')
+
+ # This code is not designed to operate on huge files. Neither is it designed to be
+ # efficient.
+
+ # ifblock state:
+ # -1 : not in an if/else block
+ # 0 : no condition satisfied yet
+ # 1 : condition satisfied
+ # 2 : condition was handled, waiting for endif
+
+ ifile = False
+ try:
+ ifile = open(inputfile, 'r')
+ except FileNotFoundError:
+ for dir in incdirs:
+ try:
+ ifile = open(dir + '/' + inputfile, 'r')
+ except FileNotFoundError:
+ pass
+ else:
+ break
+
+ if not ifile:
+ print("Error: Cannot open file " + inputfile + " for reading.\n", file=sys.stderr)
+ return
+
+ ccblock = -1
+ ifblock = -1
+ ifstack = []
+ lineno = 0
+
+ filetext = ifile.readlines()
+ lastline = []
+
+ for line in filetext:
+ lineno += 1
+
+ # C-style comments override everything else
+ if ccomm:
+ if ccblock == -1:
+ pmatch = ccstartrex.search(line)
+ if pmatch:
+ ematch = ccendrex.search(line[pmatch.end(0):])
+ if ematch:
+ line = line[0:pmatch.start(0)] + line[pmatch.end(0) + ematch.end(0):]
+ else:
+ line = line[0:pmatch.start(0)]
+ ccblock = 1
+ elif ccblock == 1:
+ ematch = ccendrex.search(line)
+ if ematch:
+ line = line[ematch.end(0)+2:]
+ ccblock = -1
+ else:
+ continue
+
+ # Handle continuation detected in previous line
+ if lastline:
+            # The previous line ends in a backslash followed by a newline;
+            # strip both before prepending it to the current line.
+ line = lastline[0:-2] + line
+ lastline = []
+
+ # Continuation lines have the next highest priority. However, this
+ # script will attempt to keep continuation lines in the body of the
+ # text and only collapse lines where continuation lines occur in
+ # a preprocessor statement.
+
+ cmatch = contrex.match(line)
+
+ # Ignore lines beginning with "###"
+ pmatch = commentrex.match(line)
+ if pmatch:
+ continue
+
+ # Handle ifdef
+ pmatch = ifdefrex.match(line)
+ if pmatch:
+ if cmatch:
+ lastline = line
+ continue
+ if ifblock != -1:
+ ifstack.append(ifblock)
+
+ if ifblock == 1 or ifblock == -1:
+ condition = pmatch.group(1)
+ ifblock = solve_condition(condition, keys, defines, keyrex)
+ else:
+ ifblock = 2
+ continue
+
+ # Handle ifndef
+ pmatch = ifndefrex.match(line)
+ if pmatch:
+ if cmatch:
+ lastline = line
+ continue
+ if ifblock != -1:
+ ifstack.append(ifblock)
+
+ if ifblock == 1 or ifblock == -1:
+ condition = pmatch.group(1)
+ ifblock = solve_condition(condition, keys, defines, keyrex)
+ ifblock = 1 if ifblock == 0 else 0
+ else:
+ ifblock = 2
+ continue
+
+ # Handle if
+ pmatch = ifrex.match(line)
+ if pmatch:
+ if cmatch:
+ lastline = line
+ continue
+ if ifblock != -1:
+ ifstack.append(ifblock)
+
+ if ifblock == 1 or ifblock == -1:
+ condition = pmatch.group(1)
+ ifblock = solve_condition(condition, keys, defines, keyrex)
+ else:
+ ifblock = 2
+ continue
+
+ # Handle elseif
+ pmatch = elseifrex.match(line)
+ if pmatch:
+ if cmatch:
+ lastline = line
+ continue
+ if ifblock == -1:
+ print("Error: #elseif without preceding #if at line " + str(lineno) + ".", file=sys.stderr)
+ ifblock = 0
+
+ if ifblock == 1:
+ ifblock = 2
+ elif ifblock != 2:
+ condition = pmatch.group(1)
+ ifblock = solve_condition(condition, keys, defines, keyrex)
+ continue
+
+ # Handle else
+ pmatch = elserex.match(line)
+ if pmatch:
+ if cmatch:
+ lastline = line
+ continue
+ if ifblock == -1:
+ print("Error: #else without preceding #if at line " + str(lineno) + ".", file=sys.stderr)
+ ifblock = 0
+
+ if ifblock == 1:
+ ifblock = 2
+ elif ifblock == 0:
+ ifblock = 1
+ continue
+
+ # Handle endif
+ pmatch = endifrex.match(line)
+ if pmatch:
+ if cmatch:
+ lastline = line
+ continue
+ if ifblock == -1:
+ print("Error: #endif outside of #if block at line " + str(lineno) + " (ignored)", file=sys.stderr)
+ elif ifstack:
+ ifblock = ifstack.pop()
+ else:
+ ifblock = -1
+ continue
+
+ # Check for 'if' or 'else' that were not properly formed
+ pmatch = badifrex.match(line)
+ if pmatch:
+ print("Error: Badly formed #if statement at line " + str(lineno) + " (ignored)", file=sys.stderr)
+ if ifblock != -1:
+ ifstack.append(ifblock)
+
+ if ifblock == 1 or ifblock == -1:
+ ifblock = 0
+ else:
+ ifblock = 2
+ continue
+
+ pmatch = badelserex.match(line)
+ if pmatch:
+ print("Error: Badly formed #else statement at line " + str(lineno) + " (ignored)", file=sys.stderr)
+ ifblock = 2
+ continue
+
+ # Ignore all lines that are not satisfied by a conditional
+ if ifblock == 0 or ifblock == 2:
+ continue
+
+ # Handle include. Note that this code does not expect or
+ # handle 'if' blocks that cross file boundaries.
+ pmatch = includerex.match(line)
+ if pmatch:
+ if cmatch:
+ lastline = line
+ continue
+ inclfile = pmatch.group(1)
+ runpp(keys, keyrex, defines, ccomm, incdirs, inclfile, ofile)
+ continue
+
+ # Handle define (with value)
+ pmatch = definerex.match(line)
+ if pmatch:
+ if cmatch:
+ lastline = line
+ continue
+ condition = pmatch.group(1)
+
+            # Additional handling of definition w/parameters: #define X(a,b,c) ...
+ rmatch = paramrex.match(condition)
+ if rmatch:
+ # 'condition' as a key into keyrex only needs to be unique.
+ # Use the definition word without everything in parentheses
+ condition = rmatch.group(1)
+
+ # 'pcondition' is the actual search regexp and must capture all
+ # the parameters individually for substitution
+
+ parameters = rmatch.group(2).split(',')
+
+                # Generate the regexp string to match comma-separated values
+ # Note that this is based on the cpp preprocessor, which
+ # apparently allows commas in arguments if surrounded by
+ # parentheses; e.g., "def(a, b, (c1,c2))". This is NOT
+ # handled.
+
+                pcondition = condition + r'\('
+                for param in parameters[0:-1]:
+                    pcondition += '(.*),'
+                pcondition += r'(.*)\)'
+
+ # Generate the substitution string with group substitutions
+ pvalue = pmatch.group(2)
+ idx = 1
+ for param in parameters:
+                    pvalue = pvalue.replace(param, r'\g<' + str(idx) + '>')
+ idx = idx + 1
+
+ defines[condition] = pvalue
+ keyrex[condition] = re.compile(pcondition)
+ else:
+ parameters = []
+ value = pmatch.group(2)
+ # Note: Need to check for infinite recursion here, but it's tricky.
+ defines[condition] = value
+ keyrex[condition] = re.compile(condition)
+
+ if condition not in keys:
+ # Parameterized keys go to the front of the list
+ if parameters:
+ keys.insert(0, condition)
+ else:
+ keys.append(condition)
+ keys = sortkeys(keys)
+ continue
+
+ # Handle define (simple case, no value)
+ pmatch = defrex.match(line)
+ if pmatch:
+ if cmatch:
+ lastline = line
+ continue
+ condition = pmatch.group(1)
+ defines[condition] = '1'
+ keyrex[condition] = re.compile(condition)
+ if condition not in keys:
+ keys.append(condition)
+ keys = sortkeys(keys)
+ continue
+
+ # Handle undef
+ pmatch = undefrex.match(line)
+ if pmatch:
+ if cmatch:
+ lastline = line
+ continue
+ condition = pmatch.group(1)
+ if condition in keys:
+ defines.pop(condition)
+ keyrex.pop(condition)
+ keys.remove(condition)
+ continue
+
+ # Now do definition replacement on what's left (if anything)
+ # This must be done repeatedly from the top until there are no
+ # more substitutions to make.
+
+ while True:
+ origline = line
+ for keyword in keys:
+ newline = keyrex[keyword].sub(defines[keyword], line)
+ if newline != line:
+ line = newline
+ break
+
+ if line == origline:
+ break
+
+ # Output the line
+ print(line, file=ofile, end='')
+
+ if ifblock != -1 or ifstack != []:
+ print("Error: input file ended with an unterminated #if block.", file=sys.stderr)
+
+ if ifile != sys.stdin:
+ ifile.close()
+ return
+
+def printusage(progname):
+ print('Usage: ' + progname + ' input_file [output_file] [-options]')
+ print(' Options are:')
+ print(' -help Print this help text.')
+ print(' -quiet Stop without error if input file is not found.')
+ print(' -ccomm Remove C comments in /* ... */ delimiters.')
+ print(' -D<def> Define word <def> and set its value to 1.')
+ print(' -D<def>=<val> Define word <def> and set its value to <val>.')
+ print(' -I<dir> Add <dir> to search path for input files.')
+ return
+
+if __name__ == '__main__':
+
+ # Parse command line for options and arguments
+ options = []
+ arguments = []
+ for item in sys.argv[1:]:
+ if item.find('-', 0) == 0:
+ options.append(item)
+ else:
+ arguments.append(item)
+
+ if len(arguments) > 0:
+ inputfile = arguments[0]
+ if len(arguments) > 1:
+ outputfile = arguments[1]
+ else:
+ outputfile = []
+ else:
+ printusage(sys.argv[0])
+ sys.exit(0)
+
+ defines = {}
+ keyrex = {}
+ keys = []
+ incdirs = []
+ ccomm = False
+ quiet = False
+ for item in options:
+ result = item.split('=')
+ if result[0] == '-help':
+ printusage(sys.argv[0])
+ sys.exit(0)
+ elif result[0] == '-ccomm':
+ ccomm = True
+ elif result[0] == '-quiet':
+ quiet = True
+ elif result[0][0:2] == '-I':
+ incdirs.append(result[0][2:])
+ elif result[0][0:2] == '-D':
+ keyword = result[0][2:]
+            try:
+                value = result[1]
+            except IndexError:
+                value = '1'
+ defines[keyword] = value
+ keyrex[keyword] = re.compile(keyword)
+ keys.append(keyword)
+ keys = sortkeys(keys)
+ else:
+            print('Bad option ' + item + ', options are -help, -quiet, -ccomm, -D<def>, -I<dir>\n')
+ sys.exit(1)
+
+ if not os.path.isfile(inputfile):
+ if not quiet:
+ print("Error: No input file " + inputfile + " found.")
+ else:
+ sys.exit(0)
+
+ if outputfile:
+ ofile = open(outputfile, 'w')
+ else:
+ ofile = sys.stdout
+
+ if not ofile:
+ print("Error: Cannot open file " + outputfile + " for writing.")
+ sys.exit(1)
+
+ # Sort keys so that if any definition contains another definition, the
+ # subset word is handled last; otherwise the subset word will get
+ # substituted, screwing up the definition names in which it occurs.
+
+ keys = sortkeys(keys)
+
+ runpp(keys, keyrex, defines, ccomm, incdirs, inputfile, ofile)
+ if ofile != sys.stdout:
+ ofile.close()
+
+ # Set mode of outputfile to be equal to that of inputfile (if not stdout)
+ if outputfile:
+ statinfo = os.stat(inputfile)
+ mode = statinfo.st_mode
+ os.chmod(outputfile, mode)
+
+ sys.exit(0)
diff --git a/common/staging_install.py b/common/staging_install.py
new file mode 100755
index 0000000..fb67ead
--- /dev/null
+++ b/common/staging_install.py
@@ -0,0 +1,658 @@
+#!/usr/bin/env python3
+#
+# Copyright 2020 OpenCircuitDesign
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+staging_install.py [options...]
+
+This file copies the staging area created by foundry_install.py
+into the target directory area, changing paths to match the target,
+and creating symbolic links where requested and allowed.
+
+Options:
+ -staging <path> Path to staging top level directory that the files
+ will be installed from.
+
+ -target <path> Final install path in system file system.
+
+ Normally, '$(prefix)/pdks/<unique pdk name>'.
+
+ If -local is not given, this will be the top level
+                        directory location the files are installed to.
+
+    -local <path>       Actual file system location to write the files to.
+ The result can then be packaged and distributed.
+
+ For usage with things like package managers and other
+ administrator installation tooling. The resulting
+ files still need to be installed at '-target' on the
+ final system.
+
+ Think 'DESTDIR', see
+ https://www.gnu.org/prep/standards/html_node/DESTDIR.html
+
+ -source <path> Path to original source top level directory, if
+ link_from is "source". This option may be called
+ multiple times if there are multiple sources.
+
+ -variable <name> Specify a variable name that is used for the
+ target path. This variable name must be enforced
+ in setup scripts like .magicrc
+
+Less common options:
+ -link_from <type> Make symbolic links to vendor files from target.
+
+ Types are: "none", "source", or a PDK name.
+
+ Default "none" (copy all files from source)
+
+ -ef_format Use efabless naming (libs.ref/techLEF),
+ otherwise use generic naming (libs.tech/lef)
+
+If <target> is unspecified then <name> is used for the target.
+"""
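The `-target`/`-local` split described above follows the DESTDIR convention: files are physically written under the local directory, while every path recorded inside them refers to the final target location. A minimal sketch of that path logic, with hypothetical directory names:

```python
import os

# Sketch of the DESTDIR-style split: write under 'localdir' (if given),
# but record 'targetdir' paths inside the installed files. All names
# below are hypothetical.
def install_path(relpath, targetdir, localdir=None):
    writedir = localdir if localdir else targetdir
    return os.path.join(writedir, relpath), os.path.join(targetdir, relpath)

write_to, refer_to = install_path('libs.tech/magic/sky130A.magicrc',
                                  '/usr/share/pdk/sky130A',
                                  '/tmp/pkgroot/usr/share/pdk/sky130A')
```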
+
+import re
+import os
+import sys
+import glob
+import stat
+import shutil
+import filecmp
+import subprocess
+
+# NOTE: distutils' copy_tree is used because the equivalent behavior in
+# shutil.copytree() ("dirs_exist_ok=True") requires Python 3.8 and up.
+from distutils.dir_util import copy_tree
+
+def makeuserwritable(filepath):
+ if os.path.exists(filepath):
+ st = os.stat(filepath)
+ os.chmod(filepath, st.st_mode | stat.S_IWUSR)
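The chmod pattern above (OR the user-write bit into the existing mode rather than overwriting the whole mode) can be demonstrated on a scratch file:

```python
import os
import stat
import tempfile

# Demonstration of the makeuserwritable() pattern: drop the user write
# bit on a temporary file, then restore it with st_mode | stat.S_IWUSR.
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, stat.S_IRUSR)            # user read-only
before = bool(os.stat(path).st_mode & stat.S_IWUSR)
os.chmod(path, os.stat(path).st_mode | stat.S_IWUSR)
after = bool(os.stat(path).st_mode & stat.S_IWUSR)
os.remove(path)
```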
+
+# Filter files to replace all strings matching "stagingdir" with "localdir" for
+# every file in "tooldir". If "tooldir" contains subdirectories, then recursively
+# apply the replacement filter to all files in the subdirectories. Do not follow
+# symbolic links.
+
+def filter_recursive(tooldir, stagingdir, localdir):
+ # Add any non-ASCII file types here
+ bintypes = ['.gds', '.gds2', '.gdsii', '.png', '.swp']
+
+ if not os.path.exists(tooldir):
+ return 0
+ elif os.path.islink(tooldir):
+ return 0
+
+ toolfiles = os.listdir(tooldir)
+ total = 0
+
+ for file in toolfiles:
+ # Do not attempt to do text substitutions on a binary file!
+ if os.path.splitext(file)[1] in bintypes:
+ continue
+
+ filepath = tooldir + '/' + file
+ if os.path.islink(filepath):
+ continue
+ elif os.path.isdir(filepath):
+ total += filter_recursive(filepath, stagingdir, localdir)
+ else:
+ with open(filepath, 'r') as ifile:
+ try:
+ flines = ifile.read().splitlines()
+ except UnicodeDecodeError:
+ print('Failure to read file ' + filepath + '; non-ASCII content.')
+ continue
+
+ # Make sure this file is writable (as the original may not be)
+ makeuserwritable(filepath)
+
+ modified = False
+ with open(filepath, 'w') as ofile:
+ for line in flines:
+ newline = line.replace(stagingdir, localdir)
+ print(newline, file=ofile)
+ if newline != line:
+ modified = True
+
+ if modified:
+ total += 1
+ return total
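The per-file substitution step that filter_recursive() applies can be shown in isolation: read a setup file, replace the staging path with the install path on every line, and write the result back. The paths and file content here are hypothetical.

```python
import os
import tempfile

# Sketch of one file's pass through the substitution filter.
staging = '/tmp/staging/sky130A'
local = '/usr/share/pdk/sky130A'

with tempfile.TemporaryDirectory() as tmp:
    rcfile = os.path.join(tmp, 'magicrc')
    with open(rcfile, 'w') as ofile:
        ofile.write('set PDKPATH ' + staging + '\n')

    # Read, substitute line by line, write back (as the filter does)
    with open(rcfile, 'r') as ifile:
        flines = ifile.read().splitlines()
    with open(rcfile, 'w') as ofile:
        for line in flines:
            print(line.replace(staging, local), file=ofile)

    with open(rcfile, 'r') as ifile:
        result = ifile.read()
```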
+
+# To avoid problems with various library functions that copy hierarchical
+# directory trees, remove all the files from the target that are going to
+# be replaced by the contents of staging. This avoids problems with
+# symbolic links and such.
+
+def remove_target(stagingdir, targetdir):
+
+ slist = os.listdir(stagingdir)
+ tlist = os.listdir(targetdir)
+
+ for sfile in slist:
+ if sfile in tlist:
+ tpath = targetdir + '/' + sfile
+ if os.path.islink(tpath):
+ os.unlink(tpath)
+ elif os.path.isdir(tpath):
+ remove_target(stagingdir + '/' + sfile, targetdir + '/' + sfile)
+ else:
+ os.remove(tpath)
+
+# Create a list of source files/directories from the contents of source.txt
+
+def make_source_list(sources):
+ sourcelist = []
+ for source in sources:
+ sourcelist.extend(glob.glob(source))
+ return sourcelist
+
+# Replace all files in list "libfiles" with symbolic links to files in
+# "sourcelist", where the files are found to be the same. If the entry
+# in "libfiles" is a directory and the same directory is found in "sourcelist",
+# then repeat recursively on the subdirectory.
+#
+# Because the installation may be distributed, there may be a difference
+# between where the files to be linked to currently are (checklist)
+# and where they will eventually be located (sourcelist).
+
+def replace_with_symlinks(libfiles, sourcelist):
+ # List of files that never get installed
+ exclude = ['generate_magic.tcl', '.magicrc', 'sources.txt']
+ total = 0
+ for libfile in libfiles:
+ if os.path.islink(libfile):
+ continue
+ else:
+            try:
+                sourcefile = next(item for item in sourcelist if os.path.split(item)[1] == os.path.split(libfile)[1])
+            except StopIteration:
+                pass
+ else:
+ if os.path.isdir(libfile):
+ newlibfiles = glob.glob(libfile + '/*')
+ newsourcelist = glob.glob(sourcefile + '/*')
+ total += replace_with_symlinks(newlibfiles, newsourcelist)
+ elif filecmp.cmp(libfile, sourcefile):
+ if not os.path.split(libfile)[1] in exclude:
+ os.remove(libfile)
+ # Use absolute path for the source file
+ sourcepath = os.path.abspath(sourcefile)
+ os.symlink(sourcepath, libfile)
+ total += 1
+ return total
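The core of the symlink-deduplication step above is: when an installed copy is byte-identical to its source, delete the copy and link to the source's absolute path instead. A self-contained sketch on temporary files (file names and content are hypothetical):

```python
import filecmp
import os
import tempfile

# Sketch: replace an identical installed copy with a symbolic link.
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, 'source.spice')
    dst = os.path.join(tmp, 'installed.spice')
    with open(src, 'w') as ofile:
        ofile.write('* model data\n')
    with open(dst, 'w') as ofile:
        ofile.write('* model data\n')

    if not os.path.islink(dst) and filecmp.cmp(dst, src):
        os.remove(dst)
        os.symlink(os.path.abspath(src), dst)   # link to absolute source path

    is_link = os.path.islink(dst)
    with open(dst, 'r') as ifile:
        content = ifile.read()
```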
+
+# Similar to the routine above, replace files in "libdir" with symbolic
+# links to the files in "srclibdir", where the files are found to be the
+# same. The difference from the routine above is that "srclibdir" is
+# another installed PDK, and so the directory hierarchy is expected to
+# match that of "libdir" exactly, so the process of finding matches is
+# a bit more straightforward.
+#
+# Because the installation may be distributed, there may be a difference
+# between where the files to be linked to currently are (checklibdir)
+# and where they will eventually be located (srclibdir).
+
+def replace_all_with_symlinks(libdir, srclibdir, checklibdir):
+ total = 0
+ try:
+ libfiles = os.listdir(libdir)
+ except FileNotFoundError:
+ print('Cannot list directory ' + libdir)
+ print('Called: replace_all_with_symlinks(' + libdir + ', ' + srclibdir + ', ' + checklibdir + ')')
+ return total
+
+ try:
+ checkfiles = os.listdir(checklibdir)
+ except FileNotFoundError:
+ print('Cannot list check directory ' + checklibdir)
+ print('Called: replace_all_with_symlinks(' + libdir + ', ' + srclibdir + ', ' + checklibdir + ')')
+ return total
+
+ for libfile in libfiles:
+ if libfile in checkfiles:
+ libpath = libdir + '/' + libfile
+ checkpath = checklibdir + '/' + libfile
+ srcpath = srclibdir + '/' + libfile
+
+ if os.path.isdir(libpath):
+ if os.path.isdir(checkpath):
+ total += replace_all_with_symlinks(libpath, srcpath, checkpath)
+ else:
+ try:
+ if filecmp.cmp(libpath, checkpath):
+ os.remove(libpath)
+ os.symlink(srcpath, libpath)
+ total += 1
+ except FileNotFoundError:
+ print('Failed file compare with libpath=' + libpath + ', checkpath=' + checkpath)
+
+ return total
+
+#----------------------------------------------------------------
+# This is the main entry point for the staging install script.
+#----------------------------------------------------------------
+
+if __name__ == '__main__':
+
+ if len(sys.argv) == 1:
+ print("No options given to staging_install.py.")
+ print(__doc__)
+ sys.exit(0)
+
+ optionlist = []
+ newopt = []
+
+ stagingdir = None
+ targetdir = None
+ link_from = None
+ localdir = None
+ variable = None
+
+ ef_format = False
+ do_install = True
+
+ # Break arguments into groups where the first word begins with "-".
+ # All following words not beginning with "-" are appended to the
+ # same list (optionlist). Then each optionlist is processed.
+ # Note that the first entry in optionlist has the '-' removed.
+
+ for option in sys.argv[1:]:
+ if option.find('-', 0) == 0:
+ if newopt != []:
+ optionlist.append(newopt)
+ newopt = []
+ newopt.append(option[1:])
+ else:
+ newopt.append(option)
+
+ if newopt != []:
+ optionlist.append(newopt)
+
+ # Check for option "ef_format" or "std_format"
+ for option in optionlist[:]:
+ if option[0] == 'ef_naming' or option[0] == 'ef_names' or option[0] == 'ef_format':
+ optionlist.remove(option)
+ ef_format = True
+ elif option[0] == 'std_naming' or option[0] == 'std_names' or option[0] == 'std_format':
+ optionlist.remove(option)
+ ef_format = False
+ elif option[0] == 'uninstall':
+ optionlist.remove(option)
+ do_install = False
+
+ # Check for options "link_from", "staging", "target", and "local"
+
+ link_name = None
+ for option in optionlist[:]:
+ if option[0] == 'link_from':
+ optionlist.remove(option)
+ if option[1].lower() == 'none':
+ link_from = None
+ elif option[1].lower() == 'source':
+ link_from = 'source'
+ else:
+ link_from = option[1]
+ link_name = os.path.split(link_from)[1]
+ elif option[0] == 'staging' or option[0] == 'source':
+ optionlist.remove(option)
+ stagingdir = option[1]
+ elif option[0] == 'target':
+ optionlist.remove(option)
+ targetdir = option[1]
+ elif option[0] == 'local':
+ optionlist.remove(option)
+ localdir = option[1]
+ elif option[0] == 'variable':
+ optionlist.remove(option)
+ variable = option[1]
+
+ # Error if no staging or dest specified
+ if not stagingdir:
+ print("No staging directory specified. Exiting.")
+ sys.exit(1)
+
+ if not targetdir:
+ print("No target directory specified. Exiting.")
+ sys.exit(1)
+
+ # If localdir is not specified, then it is the same as the parent
+ # of the target (local installation assumed)
+ if not localdir:
+ localdir = targetdir
+
+ # Take the target PDK name from the target path last component
+ pdkname = os.path.split(targetdir)[1]
+
+ # If link source is a PDK name, if it has no path, then pull the
+ # path from the target name.
+
+ if link_from:
+ if link_from != 'source':
+ if link_from.find('/', 0) < 0:
+ link_name = link_from
+ link_from = os.path.split(localdir)[0] + '/' + link_name
+ else:
+ # If linking from source, convert the source path to an
+ # absolute pathname.
+ stagingdir = os.path.abspath(stagingdir)
+
+ # If link_from is the same as localdir, then set link_from to None
+ if link_from == localdir:
+ link_from = None
+
+ # checkdir is the DIST target directory for the PDK pointed
+ # to by link_name. Files must be found there before creating
+ # symbolic links to the (not yet existing) final install location.
+
+ if link_name:
+ checkdir = os.path.split(targetdir)[0] + '/' + link_name
+ else:
+ checkdir = ''
+
+ # Diagnostic
+ if do_install:
+ print("Installing in target directory " + targetdir)
+ else:
+ print("Uninstalling from target directory " + targetdir)
+ print("(Method not yet implemented)")
+
+ # Create the top-level directories
+
+ os.makedirs(targetdir, exist_ok=True)
+ os.makedirs(targetdir + '/libs.tech', exist_ok=True)
+ os.makedirs(targetdir + '/libs.ref', exist_ok=True)
+ if os.path.isdir(stagingdir + '/libs.priv'):
+ os.makedirs(targetdir + '/libs.priv', exist_ok=True)
+ has_priv = True
+ else:
+ has_priv = False
+
+ # Path to magic techfile depends on ef_format
+
+ if ef_format == True:
+ mag_current = '/libs.tech/magic/current/'
+ else:
+ mag_current = '/libs.tech/magic/'
+
+ # First install everything by direct copy. Keep the staging files
+ # as they will be used to reference the target area to know which
+ # files need to be checked and/or modified.
+
+ if not os.path.isdir(targetdir):
+ try:
+ os.makedirs(targetdir, exist_ok=True)
+ except:
+ print('Fatal error: Cannot make target directory ' + targetdir + '!')
+ exit(1)
+
+ # Remove any files from the target directory that are going to be replaced
+ print('Removing files from target')
+ remove_target(stagingdir, targetdir)
+
+ print('Copying staging files to target')
+ # print('Diagnostic: copy_tree ' + stagingdir + ' ' + targetdir)
+ copy_tree(stagingdir, targetdir, preserve_symlinks=True)
+ print('Done.')
+
+ # Magic and qflow setup files have references to the staging area that have
+ # been used by the vendor install; these need to be changed to the target
+ # directory.
+
+ print('Changing local path references from ' + stagingdir + ' to ' + localdir)
+ print('Part 1: Tools')
+
+ needcheck = ['ngspice']
+ techdirs = ['/libs.tech/']
+ if has_priv:
+ techdirs.append('/libs.priv/')
+
+ for techdir in techdirs:
+ tools = os.listdir(targetdir + techdir)
+ for tool in tools:
+ tooldir = targetdir + techdir + tool
+
+ # There are few enough tool setup files that they can just all be
+ # filtered directly. This code only looks in the directory 'tooldir'.
+            # If there are files in subdirectories of 'tooldir' that require
+ # substitution, then this code needs to be revisited.
+
+ # Note that due to the low overhead of tool setup files, there is
+ # no attempt to check for possible symlinks to link_from if link_from
+ # is a base PDK.
+
+ total = filter_recursive(tooldir, stagingdir, localdir)
+ if total > 0:
+ substr = 'substitutions' if total > 1 else 'substitution'
+ print(' ' + tool + ' (' + str(total) + ' ' + substr + ')')
+
+ # If "link_from" is another PDK, then check all files against the files in
+ # the other PDK, and replace the file with a symbolic link if the file contents
+ # match (Note: This is done only for ngspice model files; other tool files are
+ # generally small and deemed unnecessary to make symbolic links).
+
+ if link_from not in ['source', None]:
+ thispdk = os.path.split(targetdir)[1]
+
+ # Only create links for PDKs other than the one we are making links to.
+ if thispdk != link_from:
+ print('Replacing files with symbolic links to ' + link_from + ' where possible.')
+ for techdir in techdirs:
+ for tool in needcheck:
+ tooldir = targetdir + techdir + tool
+ srctooldir = link_from + techdir + tool
+ if checkdir != '':
+ checktooldir = checkdir + techdir + tool
+ else:
+ checktooldir = srctooldir
+ if os.path.exists(tooldir):
+ total = replace_all_with_symlinks(tooldir, srctooldir, checktooldir)
+ if total > 0:
+ symstr = 'symlinks' if total > 1 else 'symlink'
+ print(' ' + tool + ' (' + str(total) + ' ' + symstr + ')')
+
+ # In .mag files in mag/ and maglef/, also need to change the staging
+ # directory name to localdir. If "-variable" is specified in the options,
+    # then replace the staging path with the variable name, not localdir.
+
+ if variable:
+ localname = '$' + variable
+ else:
+ localname = localdir
+
+ needcheck = ['mag', 'maglef']
+ refdirs = ['/libs.ref/']
+ if has_priv:
+ refdirs.append('/libs.priv/')
+
+ if ef_format:
+ print('Part 2: Formats')
+ for refdir in refdirs:
+ for filetype in needcheck:
+ print(' ' + filetype)
+ filedir = targetdir + refdir + filetype
+ if os.path.isdir(filedir):
+ libraries = os.listdir(filedir)
+ for library in libraries:
+ libdir = filedir + '/' + library
+ total = filter_recursive(libdir, stagingdir, localname)
+ if total > 0:
+ substr = 'substitutions' if total > 1 else 'substitution'
+ print(' ' + library + ' (' + str(total) + ' ' + substr + ')')
+ else:
+ print('Part 2: Libraries')
+ for refdir in refdirs:
+ libraries = os.listdir(targetdir + refdir)
+ for library in libraries:
+ print(' ' + library)
+ for filetype in needcheck:
+ filedir = targetdir + refdir + library + '/' + filetype
+ total = filter_recursive(filedir, stagingdir, localname)
+ if total > 0:
+ substr = 'substitutions' if total > 1 else 'substitution'
+ print(' ' + filetype + ' (' + str(total) + ' ' + substr + ')')
+
+ # If "link_from" is "source", then check all files against the source
+ # directory, and replace the file with a symbolic link if the file
+ # contents match. The "foundry_install.py" script should have added a
+ # file "sources.txt" with the name of the source directories for each
+ # install directory.
+
+    if link_from == 'source':
+ print('Replacing files with symbolic links to source where possible.')
+ for refdir in refdirs:
+ if ef_format:
+ filedirs = os.listdir(targetdir + refdir)
+ for filedir in filedirs:
+ print(' ' + filedir)
+ dirpath = targetdir + refdir + filedir
+ if os.path.isdir(dirpath):
+ libraries = os.listdir(dirpath)
+ for library in libraries:
+ libdir = targetdir + refdir + filedir + '/' + library
+ libfiles = os.listdir(libdir)
+ if 'sources.txt' in libfiles:
+ libfiles = glob.glob(libdir + '/*')
+ libfiles.remove(libdir + '/sources.txt')
+ with open(libdir + '/sources.txt') as ifile:
+ sources = ifile.read().splitlines()
+ sourcelist = make_source_list(sources)
+ total = replace_with_symlinks(libfiles, sourcelist)
+ if total > 0:
+ symstr = 'symlinks' if total > 1 else 'symlink'
+ print(' ' + library + ' (' + str(total) + ' ' + symstr + ')')
+ else:
+ libraries = os.listdir(targetdir + refdir)
+ for library in libraries:
+ print(' ' + library)
+ filedirs = os.listdir(targetdir + refdir + library)
+ for filedir in filedirs:
+ libdir = targetdir + refdir + library + '/' + filedir
+ if os.path.isdir(libdir):
+ libfiles = os.listdir(libdir)
+ if 'sources.txt' in libfiles:
+ # List again, but with full paths.
+ libfiles = glob.glob(libdir + '/*')
+ libfiles.remove(libdir + '/sources.txt')
+ with open(libdir + '/sources.txt') as ifile:
+ sources = ifile.read().splitlines()
+ sourcelist = make_source_list(sources)
+ total = replace_with_symlinks(libfiles, sourcelist)
+ if total > 0:
+ symstr = 'symlinks' if total > 1 else 'symlink'
+ print(' ' + filedir + ' (' + str(total) + ' ' + symstr + ')')
+
+ # Otherwise, if "link_from" is another PDK, then check all files against
+ # the files in the other PDK, and replace the file with a symbolic link
+ # if the file contents match.
+
+ elif link_from:
+ thispdk = os.path.split(targetdir)[1]
+
+ # Only create links for PDKs other than the one we are making links to.
+ if thispdk != link_from:
+
+ print('Replacing files with symbolic links to ' + link_from + ' where possible.')
+
+ for refdir in refdirs:
+ if ef_format:
+ filedirs = os.listdir(targetdir + refdir)
+ for filedir in filedirs:
+ print(' ' + filedir)
+ dirpath = targetdir + refdir + filedir
+ if os.path.isdir(dirpath):
+ libraries = os.listdir(dirpath)
+ for library in libraries:
+ libdir = targetdir + refdir + filedir + '/' + library
+ srclibdir = link_from + refdir + filedir + '/' + library
+ if checkdir != '':
+ checklibdir = checkdir + refdir + filedir + '/' + library
+ else:
+ checklibdir = srclibdir
+ if os.path.exists(libdir):
+ total = replace_all_with_symlinks(libdir, srclibdir, checklibdir)
+ if total > 0:
+ symstr = 'symlinks' if total > 1 else 'symlink'
+ print(' ' + library + ' (' + str(total) + ' ' + symstr + ')')
+ else:
+ libraries = os.listdir(targetdir + refdir)
+ for library in libraries:
+ print(' ' + library)
+ filedirs = os.listdir(targetdir + refdir + library)
+ for filedir in filedirs:
+ libdir = targetdir + refdir + library + '/' + filedir
+ srclibdir = link_from + refdir + library + '/' + filedir
+ if checkdir != '':
+ checklibdir = checkdir + refdir + library + '/' + filedir
+ else:
+ checklibdir = srclibdir
+ if os.path.exists(libdir):
+ total = replace_all_with_symlinks(libdir, srclibdir, checklibdir)
+ if total > 0:
+ symstr = 'symlinks' if total > 1 else 'symlink'
+ print(' ' + filedir + ' (' + str(total) + ' ' + symstr + ')')
+
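(Both branches above delegate the actual substitution to `replace_with_symlinks()`, which is defined earlier in this script. A plausible minimal sketch of the assumed behavior, using a byte-for-byte comparison via `filecmp` and replacing each matching file with a link to its same-named source:)

```python
import os
import filecmp

def replace_with_symlinks(libfiles, sourcelist):
    """Replace each regular file in libfiles with a symbolic link to a
    same-named file in sourcelist, provided the contents match exactly.
    Returns the number of replacements made."""
    total = 0
    for libfile in libfiles:
        # Leave existing links and directories alone.
        if os.path.islink(libfile) or os.path.isdir(libfile):
            continue
        basename = os.path.basename(libfile)
        for source in sourcelist:
            if os.path.basename(source) == basename:
                if os.path.isfile(source) and filecmp.cmp(libfile, source, shallow=False):
                    os.remove(libfile)
                    os.symlink(source, libfile)
                    total += 1
                break
    return total
```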
+    # Remove temporary files:  Magic generation scripts, sources.txt
+    # files, editor swap files, and magic extract files.
+
+ print('Removing temporary files from destination.')
+
+ for refdir in refdirs:
+ if ef_format:
+ filedirs = os.listdir(targetdir + refdir)
+ for filedir in filedirs:
+                dirpath = targetdir + refdir + filedir
+                if os.path.islink(dirpath):
+                    continue
+                elif os.path.isdir(dirpath):
+ libraries = os.listdir(targetdir + refdir + filedir)
+ for library in libraries:
+ libdir = targetdir + refdir + filedir + '/' + library
+ libfiles = os.listdir(libdir)
+ for libfile in libfiles:
+ filepath = libdir + '/' + libfile
+ if os.path.islink(filepath):
+ continue
+ elif libfile == 'sources.txt':
+ os.remove(filepath)
+ elif libfile == 'generate_magic.tcl':
+ os.remove(filepath)
+ elif os.path.splitext(libfile)[1] == '.ext':
+ os.remove(filepath)
+ elif os.path.splitext(libfile)[1] == '.swp':
+ os.remove(filepath)
+ else:
+ libraries = os.listdir(targetdir + refdir)
+ for library in libraries:
+ filedirs = os.listdir(targetdir + refdir + library)
+ for filedir in filedirs:
+ filepath = targetdir + refdir + library + '/' + filedir
+ if os.path.islink(filepath):
+ continue
+ elif os.path.isdir(filepath):
+ libfiles = os.listdir(filepath)
+ for libfile in libfiles:
+ libfilepath = filepath + '/' + libfile
+ if os.path.islink(libfilepath):
+ continue
+ elif libfile == 'sources.txt':
+ os.remove(libfilepath)
+ elif libfile == 'generate_magic.tcl':
+ os.remove(libfilepath)
+ elif os.path.splitext(libfile)[1] == '.ext':
+ os.remove(libfilepath)
+                            elif os.path.splitext(libfile)[1] == '.swp':
+                                os.remove(libfilepath)
+
+ print('Done with PDK migration.')
+ sys.exit(0)