The Python3 import playground

Overview


I have been confused about Python modules and packages; this text tries to clear the topic up a bit.

Sources:

https://chrisyeh96.github.io/2017/08/08/definitive-guide-python-imports.html
https://abarker.github.io/understanding_python_imports/ 
https://stackoverflow.com/questions/44834/can-someone-explain-all-in-python
https://docs.python.org/3/reference/import.html
https://stackoverflow.com/questions/43059267/how-to-do-from-module-import-using-importlib

Modules

Each file with the extension .py can be used as a module. If the module is in the source file module_source.py, then it is imported as import module_source. In this example, both the module source file and the importing file are in the same directory.
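For reference, the examples below use a module called module_foo. The following is only my sketch of what its source file module_foo.py might contain, reconstructed so that it is consistent with the output shown further down; the exact source is in the linked example repository.

# module_foo.py - a sketch of the example module (reconstructed, not the original source)
import datetime

print("module_foo is being imported")    # runs once, at import time


def _internal_print(*args):
    # leading underscore: a module-private helper, by convention
    print("[", datetime.datetime.now(), "]", *args)


def print_foo(*args):
    _internal_print(*args)


class Foo:
    def __init__(self, name):
        self.name = name

    def __str__(self):
        return "Foo(" + self.name + ")"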

Let's import the module (the module source includes a print statement print("module_foo is being imported")):

>>> print("before: import module_foo")
before: import module_foo
>>> import module_foo
module_foo is being imported
>>> print("after: import module_foo")
after: import module_foo

The imported module is parsed and run during the import statement. Python is a dynamic language, and there is no way to determine the interface of a module without running it. This can have its uses: you can add module-specific initialisations in the global scope of the module; these are run just once, at import time.

You will often see the following lines in Python code:

if __name__ == '__main__':
  run_main_function()

This means that the function run_main_function will be run only when the file is run as a script (that is, as python3 module_file.py). __name__ is a built-in variable that holds the name of the current module; it is set to "__main__" for the file that is run directly by the Python interpreter.
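For example, a small sketch (the file name name_demo.py is made up): a module that prints its own __name__ shows the difference between being imported and being run directly.

# name_demo.py - hypothetical file, just to illustrate __name__
print("my __name__ is:", __name__)

if __name__ == '__main__':
    print("running as a script")

Running python3 name_demo.py prints my __name__ is: __main__ followed by running as a script, while import name_demo prints my __name__ is: name_demo.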

What does the module look like on the importing side?

>>> import module_foo
>>>
>>> print(type(module_foo))
<class 'module'>

A variable with the same name as the imported module is defined implicitly by the Python runtime, and it is of type <class 'module'>.

  • Let's use the interface that is exported by the module; all functions defined in the module are accessed via the module name followed by a dot.
foo = module_foo.Foo("gadget")
print(foo)

module_foo.print_foo("some stuff: ", 42)
  • Let's take a look at the properties of the module_foo variable:
>>> print("module_foo.__dict__ keys: ", ", ".join(module_foo.__dict__.keys()))
module_foo.__dict__ keys:  __name__, __doc__, __package__, __loader__, __spec__, __file__, __cached__, __builtins__, datetime, Foo, print_foo, _internal_print

This print statement shows all keys of the __dict__ attribute of the imported module variable. The __dict__ attribute is a dictionary that maps the names of an object's attributes to their values.

A more cultured way of accessing this information is the built-in dir function; for a module object, the documentation says: "If the object is a module object, the list contains the names of the module's attributes."

>>> for key, value in module_foo.__dict__.items():
...     print("module_foo.__dict__ key: ", key, "value-type: ", type(value))
...
module_foo.__dict__ key:  datetime value-type:  <class 'module'>
module_foo.__dict__ key:  Foo value-type:  <class 'type'>
module_foo.__dict__ key:  print_foo value-type:  <class 'function'>
module_foo.__dict__ key:  _internal_print value-type:  <class 'function'>

(the built-in entries such as __name__, __doc__ and __loader__ have been left out here)

That makes sense: the call module_foo.print_foo("some stuff: ", 42) is just a short form for the dictionary lookup module_foo.__dict__['print_foo']("some stuff: ", 42). An imported module is just an instance of a module object, where each exported class or function is a member of that module object!
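As a small illustration (assuming the module_foo sketch from above), attribute access on the module object, a lookup in its __dict__ and the getattr built-in all reach the same function object:

import module_foo

# three equivalent ways to call the same function
module_foo.print_foo("some stuff: ", 42)
module_foo.__dict__["print_foo"]("some stuff: ", 42)
getattr(module_foo, "print_foo")("some stuff: ", 42)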

Interestingly, even names with a leading underscore are visible when importing a module (although pylint gives a warning if you use them, and doing so is regarded as very bad style). Importing from a package does not expose these symbols (unless they are defined in the __init__.py module).

Please note: in this case module_foo.__dict__ also lists all modules imported by the imported module (such as the datetime module).

Where do we put the module source file?

An imported module must live in a directory listed in sys.path; the current directory is always part of this list. You can add directories to sys.path by setting the PYTHONPATH environment variable before running the Python executable, or by explicitly adding your directory to sys.path before calling import. (See the example source importing the module and the source of the module.)
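For example, a sketch of the second option (the directory name is a made-up placeholder):

import sys

# make the directory that contains module_foo.py visible to the import machinery,
# before the import statement runs
sys.path.append("/some/other/dir")

import module_foo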

Import renames

There are other forms of import:

import module_foo as mfoo

Here the variable defined by the runtime is renamed to mfoo, and the code that uses the module looks as follows:

mfoo.print_foo("some stuff: ", 42)

You will sometimes see the following kind of imports in both modules and packages.

import math as _math
import os as _os

This turns the imported module name into a private symbol, so that these imports do not leak into the namespace of a caller that imports the module with from module_name import * - this form of import adds all symbols from the module to the current namespace. See example.
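A small sketch of the effect (my_module.py is a made-up name, not the linked example): with the underscore alias, the helper import does not leak into a caller that uses a star import.

# my_module.py (hypothetical): hide the os import behind a private alias
import os as _os

def list_entries(path):
    return _os.listdir(path)

# caller
from my_module import *

list_entries(".")    # works: list_entries is a public name of my_module
# _os is not pulled in by the star import; with a plain "import os" in my_module,
# the name os would have appeared in the caller's namespace as well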

Import renames with directories

The import-with-rename feature can be used to access Python files in subdirectories: here module_foo is in the subdirectory module_foo_src, relative to the importing file. See the module source and module usage.

import module_foo_src.module_foo as mfoo

Please note that you can only get one directory level beneath any directory that is listed in the Python import path. The import path includes the directory of the main module.

Importing symbols into the namespace of the caller

You can import symbols selectively into the calling program, as follows:

from  module_foo import print_foo, Foo

print_foo("some stuff: ", 42)

However, some say that this kind of import does not make the code more readable; the Google Python style guide does not recommend this approach.

You can also import all symbols from module_foo right into your own namespace

from module_foo import *

See example module and usage

Now this form of import has an interesting case: if the module source defines a list variable named __all__, then this variable lists all symbols exported by the module, and it limits the set of symbols that can be imported with the * import. However, this variable is only used for the from module_foo import * form. The example module does not list print_foo in its __all__ variable, yet it is still possible to import it by means of from module_foo import print_foo.

Also, the * import does not import symbols with leading underscores; these are treated as module-private symbols.
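A sketch of a module that uses __all__ (the names are made up, this is not the linked example):

# shapes.py (hypothetical)
__all__ = ["Circle"]        # only Circle is exported by the * import

class Circle:
    pass

class Square:               # public name, but not listed in __all__
    pass

def _helper():              # leading underscore: never exported by the * import
    pass

# caller
from shapes import *        # imports Circle only
from shapes import Square   # an explicit import still works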

Lots of details here...

Multiple imports

You can also import several modules in the same import statement; technically you can do

import os, sys as system, pathlib

However, pylint gives you a warning for multiple imports on the same line, so it is not a good thing to do.

Exceptions that occur during module import

You get an ImportError exception if the Python runtime runs into a problem during import. This can be used to choose between alternative versions of a library.

try:
    import re2 as re
except ImportError:
    import re

This example imports the re2 regular expression engine; if that is not installed, it falls back to the compatible standard regular expression module re.

However, when the imported module runs code in its global scope that raises a regular error such as ValueError, then you will get that ValueError exception, not an ImportError.
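A small sketch of this case (config_check.py is a made-up file name):

# config_check.py (hypothetical) - raises an error in its global scope
raise ValueError("bad configuration")

# the importer sees the original exception, not an ImportError
try:
    import config_check
except ValueError as err:
    print("import of config_check failed:", err)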

Packages

Here again is an example package: the source of package_foo and an example using package_foo.

A directory with an __init__.py file is a Python package. This directory can include more than one Python file; the idea of a package is to treat all the Python files in this directory as a whole.

When a package is imported, its __init__.py is implicitly run, in order to determine the interface of that package. The namespace of this module is made available to the importer of the package. Technically, importing a package is the same as importing the __init__.py module of that package. It's the same as:

import package_name.__init__  as package_name

Most of the following information will be very familiar from the previous explanation of modules:

An imported package foo must be a subdirectory directly under one of the directories listed in sys.path; the current directory is always part of that list.

>>> import sys
>>> print(sys.path)
['', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python39.zip', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/lib-dynload', '/Users/michaelmo/Library/Python/3.9/lib/python/site-packages', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages']

The first entry is '', meaning the directory that the script file is in.

You can add directories to sys.path by setting the PYTHONPATH environment variable before running the Python executable, or by explicitly adding your directory to sys.path before calling import.

>>> import sys
>>> print(type(sys))
<class 'module'>

On the importing side, an imported package is represented by a variable of type <class 'module'>; the namespace of that package (including built-in names as well as the package's classes and functions) is part of package_name.__dict__.

>>> import sys as system
>>> print(system.path)
['', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python39.zip', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/lib-dynload', '/Users/michaelmo/Library/Python/3.9/lib/python/site-packages', '/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages']

import sys as system - this construct renames the variable of type <class 'module'>: system acts as an alias for the imported name.

sys.modules is a global variable: a dictionary that maps the import name to the corresponding variable of type <class 'module'>. It holds all currently imported modules and packages; import first checks whether a module is already in this dictionary, to avoid loading the same module twice.

>>> import sys
>>> print(sys.modules.keys())
dict_keys(['sys', 'builtins', '_frozen_importlib', '_imp', '_thread', '_warnings', '_weakref', '_io', 'marshal', 'posix', '_frozen_importlib_external', 'time', 'zipimport', '_codecs', 'codecs', 'encodings.aliases', 'encodings', 'encodings.utf_8', '_signal', 'encodings.latin_1', '_abc', 'abc', 'io', '__main__', '_stat', 'stat', '_collections_abc', 'genericpath', 'posixpath', 'os.path', 'os', '_sitebuiltins', '_locale', '_bootlocale', 'site', 'readline', 'atexit', 'rlcompleter'])

This map also lists all of the built-in modules.

>>> import _frozen_importlib
>>> print(_frozen_importlib.__doc__)
Core implementation of import.

This module is NOT meant to be directly imported! It has been designed such
that it can be bootstrapped into Python as the implementation of import. As
such it requires the injection of specific modules and attributes in order to
work. One should use importlib as the public-facing version of this module.

The __doc__ member of the module variable is the docstring defined for the module. Function objects also have such a member variable.

Writing the __init__.py file

The tricky part of writing a package is the __init__.py file; this file has to import all other files as modules, as follows:

from  .file1 import  *

This is a relative import: it imports the module file1 from file1.py in the current directory, and adds all of its symbols to the namespace of the __init__.py file (except for names with a leading underscore, which are treated as package-private names). Having these symbols as part of the __init__.py namespace is what makes them available upon importing the package.

A generic __init__.py file

I sometimes forget to include a module from the __init__.py file, so let's make a generic __init__.py file. See the result of this effort in this example: _import_all is a function that imports all modules in the same directory as __init__.py, except for modules with a leading underscore in their name and the __init__.py file itself. First it enumerates all files with the extension .py in that directory. Each relevant module is then loaded explicitly via importlib.import_module; this function returns the module variable for the imported module.

Next, the namespace of that module is merged into the current namespace: the function enumerates all entries of the module variable's __dict__ member and adds them to the global namespace returned by the globals() built-in function.

The function also builds the __all__ member of the package: if an __all__ global variable has been defined in a module, then its contents are appended to the __all__ list of the __init__.py file.

The _import_all function from this example is a nice generic function; it buys you some convenience at the expense of module load time, but this kind of trade-off is very common in computing...
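The real _import_all lives in the linked example repository; the following is only my rough sketch of such a function, based on the description above (names and details are assumptions):

# __init__.py - a generic version that imports all sibling modules (sketch)
import importlib
import os

__all__ = []

def _import_all():
    package_dir = os.path.dirname(__file__)
    for file_name in sorted(os.listdir(package_dir)):
        # only consider python source files
        if not file_name.endswith(".py"):
            continue
        module_name = file_name[:-3]
        # skip private modules and the __init__.py file itself
        if module_name.startswith("_"):
            continue
        # load the module relative to this package
        module = importlib.import_module("." + module_name, __package__)
        # merge the public names of the module into the package namespace
        for name, value in module.__dict__.items():
            if not name.startswith("_"):
                globals()[name] = value
        # if the module defines __all__, append it to the package __all__ list
        if hasattr(module, "__all__"):
            __all__.extend(module.__all__)

_import_all()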

Packages with sub packages

An example of a package with sub-packages: package source and package usage.

├── package_foo
│   ├── __init__.py
│   ├── sub_package_one
│   │   ├── __init__.py
│   │   └── file1.py
│   └── sub_package_two
│       ├── __init__.py
│       └── file2.py
├── use_foo.py
└── use_module_import.py

Here the __init__.py file of the main package needs to import the sub-packages into its namespace. It is not possible to import a sub-package selectively from the outside; you can only import package directories that are directly under one of the directories in the module search path (which includes the current directory).

from  .sub_package_one import  *
from  .sub_package_two import  *

The curious case of the empty __init__.py file

Sometimes there is an empty __init__.py file in the package_foo directory. That enables us to directly import the sub-package files:

import package_foo.sub_package_one as sub_package_one

foo = sub_package_one.Foo("gadget")

The idea is that package_foo needs an __init__.py file in order to count as a package; without such a file, a package import would fail prior to Python version 3.3, and you could not resolve the import path package_foo.sub_package_one for this reason. This problem is solved with an empty __init__.py file in the package_foo directory. However, this changed with Python 3.3: for later versions you no longer need the empty __init__.py file, and an import of a directory without __init__.py does not fail.

Conclusion

I hope that this text has cleared up the topic of the Python import system. Python is a relatively simple language; however, there are a lot of usage patterns that one has to get used to, and these are not always obvious from the Python documentation.
