Lumped-element impedance calculator and frequency-domain plotter.


fastZ: Lumped-Element Impedance Calculator

fastZ is a small tool for calculating and visualizing electrical impedance in Python. Features include:

  • Support for lumped-parameter resistors, capacitors, and inductors.
  • Construction of series and parallel impedance networks with the + and // operators.
  • Element labels with subscript assignment using the subscript operator [].
  • Impedance calculation at a single frequency or over a numpy array using the call operator ().
  • Frequency-domain Bode magnitude plots with curve annotation.

You can also compute circuit transfer functions represented as the ratio of two impedance networks. See the PID compensator in the Examples section for more information.

from fastz import R, L, C
from fastz.plotting import bodez
import numpy as np

Zin = ( L(v=22e-6) + ( C(v=100e-6) // R(v=2.0) )['p'] )['in']
fig, ax = bodez(Zin, ff=np.logspace(2, 5, 1000), zlines='Zin Zp', refzlines='R L C')

(Bode magnitude plot of Zin with reference curves for R, L, and C)

Installation

Install the fastZ package with pip:

pip install fastz

Dependencies: numpy and matplotlib

Usage

Constructing Impedance Models

Many impedance networks can be represented by series and parallel combinations of RLC elements. The fastZ package provides the classes R, L, and C along with the series and parallel operators + and // for this purpose. For example, a resistor R1 with a value of 50Ω is constructed as:

R1 = R('1', 50)
str(R1)
'R1[50Ω]'

The first argument to the constructor is the resistor's subscript. It gets appended to the resistor's prefix 'R' to form the label 'R1'. The second argument is the resistor's value in Ohms. Both the subscript and value are optional, but keep the following rules in mind:

  • If you omit the subscript, you must pass the value using the keyword argument v.
  • If you omit the value, you must later provide it when evaluating or plotting the impedance (more about this below).

The L and C constructors are similar, except that L accepts a value in Henries (H) and C a value in Farads (F).
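
For instance, here's a short sketch of the L and C constructors alongside the two optional-argument cases described above (the printed labels follow the same pattern as R1):

L1 = L('1', 22e-6)
str(L1)
'L1[2.2e-05H]'
C3 = C(v=100e-6)
str(C3)
'C[0.0001F]'
R3 = R('3')
str(R3)
'R3'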

The addition operator + constructs series impedance networks. For example, we can build a series RC network using:

Zs = R(v=10.0) + C(v=1e-6)
str(Zs)
'(R[10.0Ω] + C[1e-06F])'

Similarly, the floor division operator // constructs parallel impedance networks. For example, a parallel RL network is constructed as:

Zp = R(v=100) // L(v=22e-6)
str(Zp)
'(R[100Ω] || L[2.2e-05H])'

Create more complex impedance networks by nesting series and parallel combinations:

Zc = (R('1') + C('1')) // (R('2') + L('2') + C('2')) + L('3') // C('3')
str(Zc)
'(((R1 + C1) || (R2 + L2 + C2)) + (L3 || C3))'

Evaluating Impedance Models

Call an impedance with a single frequency or a numpy array of frequencies using the call operator () to evaluate the impedance at those frequencies. For example, suppose we have the impedance:

Z = L(v=22e-6) + C(v=100e-6) // R(v=2.0)

You can evaluate its value at a frequency of 3kHz using:

Z(3e3)
(0.1314731604517404-0.0809519077494511j)

Or evaluate the impedance over multiple frequencies using:

Z(np.array([1, 1e3, 100e3]))
array([1.99999684e+00-2.37504008e-03j, 7.75453273e-01-8.36233246e-01j,
       1.26643460e-04+1.38070932e+01j])

If you omitted element values when constructing an impedance network, or want to temporarily override the values of some elements, you'll need to pass the element values as keyword arguments to the call operator:

Z(3e3, L=100e-6, R=100.0)
(0.0028143981128015963+1.3544540460266075j)
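
The same mechanism supplies values for elements that were left unvalued at construction. A minimal sketch, assuming the keyword names follow the element labels (R1 and C1 here) just as R and L do above:

Zs = R('1') + C('1')
Zs(1e3, R1=10.0, C1=1e-6)   # series RC at 1 kHz, roughly (10-159.2j)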

Plotting Impedance Models

The bodez function provided within the plotting module draws the Bode magnitude plot of an impedance given a numpy array of the frequencies at which to evaluate the impedance. Use the optional string argument zlines to specify the whitespace-separated labels of additional sub-impedances to draw on the plot. The optional string argument refzlines specifies the labels of sub-impedances to plot in the reference-line style (dashed gray by default). To change the horizontal position of an impedance curve's annotation, append a colon followed by the horizontal location in frequency units. For example:

Z = (R(v=1) // L(v=100e-6) // C(v=200e-6))['p'] + L('2', 10e-6)
fig, ax = bodez(Z, ff=np.logspace(2, 5, 1000), zlines='Z:30000 Zp:10000', refzlines='R:4000 L:100e3 C L2')

(Bode magnitude plot of Z and Zp with annotated reference curves)

If you omitted element values when constructing an impedance network, or want to temporarily override the values of some elements, you'll need to pass the element values as keyword arguments as well:

fig, ax = bodez(Z, ff=np.logspace(2, 5, 1000), zlines='Zp', refzlines='R L C L2', R=10, C=50e-6)

(Bode magnitude plot of Zp with the overridden R and C values)

Using Subscripts

Subscripts are string or integer suffix values that help identify resistors, inductors, capacitors, and composite impedances. To assign a subscript to an RLC element, pass it to the constructor:

La = L('a', 1e-6)
str(La)
'La[1e-06H]'

You can assign a subscript to a composite impedance using the subscript operator []:

Zin = (R(v=1.0) + La)['in']
str(Zin)
'Zin:(R[1.0Ω] + La[1e-06H])'

Bode plot annotations reflect the appropriate subscripts:

fig, ax = bodez(Zin, ff=np.logspace(4, 7, 1000), refzlines='R La')

(Bode magnitude plot of Zin with R and La reference curves)

Accessing Sub-Impedances

We might build an impedance network consisting of multiple labeled sub-impedances. For example:

Z1 = (C('1') + L('1'))['a'] // (R('2') + L('2'))['b'] // C('3')
str(Z1)
'(Za:(C1 + L1) || Zb:(R2 + L2) || C3)'

Sometimes it may be useful to access the sub-impedances Za and Zb, or the individual RLC elements. Use the subz method to do so:

Za = Z1.subz('Za')
str(Za)
'Za:(C1 + L1)'
C1 = Z1.subz('C1')
str(C1)
'C1'
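
Because subz returns ordinary impedance objects, you can evaluate (or plot) them independently. A brief sketch with arbitrary element values, passed by label as in the evaluation examples above:

Za(10e3, C1=1e-6, L1=100e-6)   # series C1 + L1 at 10 kHz, roughly -9.6j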

Internally, the bodez plotting function relies on the subz method to look up the additional impedances specified in the zlines and refzlines arguments.

Computing Break Frequencies

The breakfreq method computes RC, RL, and LC break frequencies. A break frequency is the frequency at which one element's impedance magnitude equals the other element's impedance magnitude. Suppose we have the following parallel RLC network:

Z1 = R(v=1) // L(v=100e-6) // C(v=22e-6)
str(Z1)
'(R[1Ω] || L[0.0001H] || C[2.2e-05F])'
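
Using the definition above, you can check the break frequencies by hand from the element values (plain algebra, independent of fastZ):

import numpy as np

Rv, Lv, Cv = 1.0, 100e-6, 22e-6
f_rl = Rv / (2 * np.pi * Lv)                 # |R| = |jwL|       -> about 1.6 kHz
f_lc = 1 / (2 * np.pi * np.sqrt(Lv * Cv))    # |jwL| = |1/(jwC)| -> about 3.4 kHz
f_rc = 1 / (2 * np.pi * Rv * Cv)             # |R| = |1/(jwC)|   -> about 7.2 kHz

These should agree with Z1.breakfreq('R L'), Z1.breakfreq('L C'), and Z1.breakfreq('R C'), respectively.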

The following draws vertical lines at the RC, RL, and LC break frequencies:

fig, ax = bodez(Z1, ff=np.logspace(2.5, 4.5, 1000), refzlines='R L:2200 C:5000')
for fb in [Z1.breakfreq('R L'), Z1.breakfreq('L C'), Z1.breakfreq('R C')]:
    ax.axvline(x=fb, ls=':', color='red')

ax.set_ylim((0.1, 3))
(0.1, 3)

(Bode magnitude plot of Z1 with the break frequencies marked)

Examples

SMPS Output Impedance

Here's a model of the small-signal output impedance of a Buck, Boost, or Buck-Boost converter (a switch-mode power supply):

(Schematic: SMPS small-signal output impedance model)

Le is the effective output inductance of the converter, C is the output capacitor, and Rload represents the load. To make things a bit more interesting, we've included the inductor's ohmic loss as RL and the capacitor's equivalent series inductance and resistance as Lesl and Resr, respectively. We construct and evaluate a fastZ model with some sample component values below:

from fastz import R, L, C
from fastz.plotting import bodez
import numpy as np
import matplotlib.pyplot as plt

Zout = ( R('load', 10) // (L('esl', 1e-6) + C(v=100e-6) + R('esr', 1))['cap'] // (L('e', 44e-6) + R('L', 3.0))['ind'] )['out']
bodez(Zout, ff=np.logspace(1, 7, 1000), 
      zlines='Zout:10e3 Zcap:10e3 Zind:10e3', 
      refzlines='Rload C:300 Lesl:120e3 Resr Le:7e3 RL')
plt.ylim((0.6, 12))
plt.show()

(Bode magnitude plot of the SMPS output impedance and its sub-impedances)

PID Compensator

This op amp circuit could appear in a feedback control loop as a PID (lead-lag) compensator:

(Schematic: op amp PID compensator)

VREF represents the setpoint of the feedback system (assumed constant in this case), vfb is the feedback voltage signal, and vc is the compensated output voltage signal. The transfer relationship is

Vc(s) = Gc(s)·Ve(s)

where ve = VREF - vfb is the error signal and Gc(s) = Z1(s)/Z2(s) is the compensator's transfer function. We can use fastZ to compute Gc since it is the ratio of two lumped-element impedance networks.

from fastz import R, L, C
from fastz.plotting import bodez
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.ticker import EngFormatter

# construct models of Z1 and Z2
Z1 = ( R('1', 30e3) + C('1', 20e-9) )['1']
Z2 = ( R('2', 10e3) // C('2', 5e-9) + R('3', 2e3) )['2']

# evaluate frequency response of Gc
ff = np.logspace(1, 6, 1000)
GGc = Z1(ff) / Z2(ff)

# plot the results
fig, (axm, axp) = plt.subplots(2, 1, figsize=(6, 8))
axz = axm.twinx()

bodez(Z1, ff, ax=axz, zlines='Z1:1e3', refzlines='R1 C1:5e3')
bodez(Z2, ff, ax=axz, zlines='Z2:1e3', refzlines='R2:800e3 R3 C2:200')
axm.loglog(ff, np.abs(GGc), color='purple')
axp.semilogx(ff, np.angle(GGc)*180/np.pi, color='purple')
axm.annotate('$|G_c|$', (ff[-1], abs(GGc[-1])), 
             ha='center', va='center', backgroundcolor='w')
axp.annotate('$\\angle G_c$', (ff[-10], np.angle(GGc[-10])*180/np.pi), 
             ha='center', va='center', backgroundcolor='w')

axm.xaxis.set_major_formatter(EngFormatter())
axp.xaxis.set_major_formatter(EngFormatter())
axm.set_ylabel('Compensator Gain (V/V)')
axp.set_ylabel('Compensator Phase Shift (°)')
axp.set_xlabel('Frequency (Hz)')
axm.set_ylim((1, 110))
axz.set_ylim((1e3, 1e6))
plt.show()

(Magnitude and phase plots of the compensator frequency response)

You can see that there's a phase boost of about 40° at 10kHz. An inverted zero appears at about 300Hz to boost the low-frequency gain.

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the MIT License. See LICENSE for more information.

Contact

Wesley Hileman - [email protected]
