Progressbar 2 - A progress bar for Python 2 and Python 3 - "pip install progressbar2"

Overview

Text progress bar library for Python.

Travis status:

https://travis-ci.org/WoLpH/python-progressbar.svg?branch=master

Coverage:

https://coveralls.io/repos/WoLpH/python-progressbar/badge.svg?branch=master

Install

The package can be installed through pip (this is the recommended method):

pip install progressbar2

Or if pip is not available, easy_install should work as well:

easy_install progressbar2

Or download the latest release from PyPI (https://pypi.python.org/pypi/progressbar2) or GitHub.

Note that the releases on PyPI are signed with my GPG key (https://pgp.mit.edu/pks/lookup?op=vindex&search=0xE81444E9CE1F695D) and can be checked using GPG:

gpg --verify progressbar2-<version>.tar.gz.asc progressbar2-<version>.tar.gz

Introduction

A text progress bar is typically used to display the progress of a long running operation, providing a visual cue that processing is underway.

The ProgressBar class manages the current progress, and the format of the line is given by a number of widgets. A widget is an object that may display differently depending on the state of the progress bar. There are many types of widgets: timers, counters, ETAs, the bar itself, and more.

The progressbar module is very easy to use, yet very powerful. It will also automatically enable features like auto-resizing when the system supports it.
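
For example, a bar can be assembled from a handful of widgets like this (a minimal sketch; Percentage, Bar and ETA are just a few of the available widgets):

import time
import progressbar

# Each widget renders one part of the line: the percentage, the bar itself
# and an estimated time of arrival.
widgets = [progressbar.Percentage(), ' ', progressbar.Bar(), ' ', progressbar.ETA()]
bar = progressbar.ProgressBar(max_value=100, widgets=widgets)
for i in range(100):
    time.sleep(0.01)
    bar.update(i + 1)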

Known issues

Due to limitations in both the IDLE shell and the JetBrains (PyCharm) console, this progress bar cannot function properly within them.

Usage

There are many ways to use Python Progressbar; you can see a few basic examples here, but there are many more in the examples file.

Wrapping an iterable

import time
import progressbar

for i in progressbar.progressbar(range(100)):
    time.sleep(0.02)

Progressbars with logging

Progressbars with logging require stderr redirection _before_ the StreamHandler is initialized. To make sure the stderr stream is redirected in time, call progressbar.streams.wrap_stderr() before you initialize the logger.

One option to force early initialization is the WRAP_STDERR environment variable; on Linux/Unix systems this can be done as follows:

WRAP_STDERR=true python your_script.py

If you need to flush manually while wrapping, you can do so using:

import progressbar

progressbar.streams.flush()

In most cases the following will work as well, as long as you initialize the StreamHandler after the wrapping has taken place.

import time
import logging
import progressbar

progressbar.streams.wrap_stderr()
logging.basicConfig()

for i in progressbar.progressbar(range(10)):
    logging.error('Got %d', i)
    time.sleep(0.2)

Context wrapper

import time
import progressbar

with progressbar.ProgressBar(max_value=10) as bar:
    for i in range(10):
        time.sleep(0.1)
        bar.update(i)

Combining progressbars with print output

import time
import progressbar

for i in progressbar.progressbar(range(100), redirect_stdout=True):
    print('Some text', i)
    time.sleep(0.1)

Progressbar with unknown length

import time
import progressbar

bar = progressbar.ProgressBar(max_value=progressbar.UnknownLength)
for i in range(20):
    time.sleep(0.1)
    bar.update(i)

Bar with custom widgets

import time
import progressbar

widgets = [
    ' [', progressbar.Timer(), '] ',
    progressbar.Bar(),
    ' (', progressbar.ETA(), ') ',
]
for i in progressbar.progressbar(range(20), widgets=widgets):
    time.sleep(0.1)

Bar with wide Chinese (or other multibyte) characters

# vim: fileencoding=utf-8
import time
import progressbar


def custom_len(value):
    # These characters take up more space
    characters = {
        '进': 2,
        '度': 2,
    }

    total = 0
    for c in value:
        total += characters.get(c, 1)

    return total


bar = progressbar.ProgressBar(
    widgets=[
        '进度: ',
        progressbar.Bar(),
        ' ',
        progressbar.Counter(format='%(value)02d/%(max_value)d'),
    ],
    len_func=custom_len,
)
for i in bar(range(10)):
    time.sleep(0.1)

Comments
  • Multiple progress bars appear at once in IntelliJ IDEA

    Description

    This obviously doesn't seem to be intended behaviour, but I'm not sure whether it's a problem with progressbar, with IntelliJ, or with my configuration.

    Here's the full console output from running examples.py.

    Code

    Really any code that outputs a progress bar to the console, e.g.

    import time
    import progressbar
    
    with progressbar.ProgressBar(max_value=10) as progress:
        for i in range(10):
            time.sleep(0.1)
            progress.update(i)
    

    Versions

    • Python version: 3.6.0 (v3.6.0:41df79263a11, Dec 23 2016, 07:18:10) [MSC v.1900 32 bit (Intel)]
    • Python distribution/environment: IDLE
    • Operating System: Windows 10
    • Package version: 3.18.1

    Edit: I just tested the following simple function and the issue still occurs. Guess that means it's probably an issue with IntelliJ IDEA.

    import sys
    from time import sleep
    
    def update_progress(i, total, length=10, fg="#", bg=" ", decimals=0):
        progress = 100 * (i / float(total))
        blocks = int(length * i // total)
        bar = fg * blocks + bg * (length - blocks)
        sys.stderr.write(f"\r[{bar}] {progress:.{decimals}f}%")
        sys.stderr.flush()
    
    for i in range(0, 100):
        update_progress(i, 99)
        sleep(0.1)
    
    unfixable upstream bug 
    opened by HatScripts 25
  • I have a present for you

    I give to you the gift of detecting ANSI support

    import sys
    import os
    import time
    
    sample_ansi = '\x1b[31mRED ' + '\x1b[33mYELLOW ' + '\x1b[32mGREEN ' + '\x1b[35mPINK ' + '\x1b[0m' + '\n'
    
    handle = sys.stdout
    
    if (
        # This works for newer versions of pycharm only. older versions there is no way to check.
        ('PYCHARM_HOSTED' in os.environ and os.environ['PYCHARM_HOSTED'] == '1') or
        (
            # check if we are writing to a terminal or not. typically a file object is going to return False
            # if the instance has been overridden and isatty has not been defined we have no way of knowing
            # so we will not use ansi.
            (hasattr(handle, "isatty") and handle.isatty()) and
            (
                # ansi terminals will typically define one of the 2 environment variables.
                ('TERM' in os.environ and os.environ['TERM'] == 'ANSI') or
                'ANSICON' in os.environ
            )
        )
    ):
        handle.write("ANSI output enabled.\n")
        handle.write(sample_ansi)
    
    elif sys.platform.startswith('win'):
        handle.write("Windows console, no ANSI support.\n")
    
    else:
        handle.write('ANSI output disabled.\n')
    
    handle.write("\n\n")
    handle.flush()
    

    This will detect whether a third-party terminal with ANSI support is being used on Windows. It also detects whether PyCharm is being used (the latest version of PyCharm has been tested and does work).

    opened by kdschlosser 20
  • Something I wrote that you might be interested in.

    I was poking about in your commit history to see what changes you have made in the past few months. I didn't know if you had added the multi-threaded progress bar or not, but I noticed a bunch of PRs for Sphinx and thought this may be of use to you.

    https://github.com/kdschlosser/sphinx-distutils-extension

    It allows the setup program to handle building the documentation, and it builds the config file as well. All options for the config file are provided, and all of the Sphinx command line arguments are extended into the build class. It is very simple to use, it is cross platform, and it does not require anything else to be installed to run it (make, for example). It works out of the box with all of the various CI utilities, and you can also use it in a chained command...

    python setup.py install build_docs
    

    This will install your program as well as build the documentation. There are code examples of how to install any of the extensions, as well as Sphinx itself, using setup_requires. This is a nice way to go about it, so that any modules needed by the doc builder do not get installed into the user's site-packages but get placed in a temporary ".eggs" directory next to the setup.py file. It is nicer not to install one-time-use modules into the user's Python interpreter.
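
    As a rough sketch of that setup_requires approach (illustrative only; your_package is a hypothetical name, and how sphinx-distutils-extension registers its build_docs command is not shown here):

    # setup.py -- sketch of build-time-only requirements
    from setuptools import setup

    setup(
        name='your_package',  # hypothetical package name
        version='0.1.0',
        packages=['your_package'],
        # Build-time-only requirements: setuptools fetches these into a local
        # .eggs directory next to setup.py instead of the user's site-packages.
        setup_requires=[
            'sphinx',
        ],
    )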

    in-progress 
    opened by kdschlosser 19
  • Logging module ignored when progressbar not used

    #!/usr/bin/env python3
    
    import progressbar
    import logging
    
    progressbar.streams.wrap_stderr()
    
    logger = logging.getLogger(__name__)
    logging.basicConfig(level=logging.DEBUG)
    logger.info('test')
    logger.info('test')
    logger.info('test')
    
    # count = 50000
    # with progressbar.ProgressBar(max_value=count) as bar:
    #     for i in range(count):
    #         #logger.info('test')
    #         bar.update(i)
    

    Ref #129. If you do not use the progressbar module, all logger traces are ignored when used with progressbar.streams.wrap_stderr().

    Using 0.32.0

    opened by NicoHood 17
  • Auto-flushing when wrapping streams?

    Description

    Have you considered some kind of auto-flushing when wrapping the streams? In the cases where each progress step takes a while, it would be convenient to get log outputs more frequently without manually calling progressbar.streams.flush(). Please run the code below for a frustrating example of too long waiting times.

    Code

    import time
    import progressbar
    
    
    with progressbar.ProgressBar(redirect_stdout=True) as progress:
        for i in progress(range(3)):
            for j in range(10):
                time.sleep(0.5)
                print(i, j)
                # progressbar.streams.flush()
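
    For reference, here is a manual-flushing sketch of the kind of workaround this issue hopes to make unnecessary; it calls progressbar.streams.flush() on a timer so the buffered print output shows up between bar updates:

    import time
    import progressbar

    FLUSH_INTERVAL = 1.0  # seconds between forced flushes; tune to taste
    last_flush = time.monotonic()

    with progressbar.ProgressBar(redirect_stdout=True) as progress:
        for i in progress(range(3)):
            for j in range(10):
                time.sleep(0.5)
                print(i, j)
                # Flush the wrapped streams periodically so the output above
                # is not held back until the bar finishes.
                if time.monotonic() - last_flush >= FLUSH_INTERVAL:
                    progressbar.streams.flush()
                    last_flush = time.monotonic()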
    
    opened by davidparsson 16
  • GPG signatures for source validation

    As we all know, today more than ever before, it is crucial to be able to trust our computing environments. One of the main difficulties that package maintainers of Linux distributions face is verifying the authenticity and the integrity of the source code.

    The Arch Linux team would appreciate it if you would provide GPG signatures so that we can easily and quickly verify your source code releases.

    Overview of the required tasks:

    • Create and/or use a 4096-bit RSA keypair for the file signing.
    • Keep your key secret, use a strong unique passphrase for the key.
    • Upload the public key to a key server and publish the full fingerprint.
    • Sign every new git commit and tag.
    • Create signed compressed (xz --best) release archives
    • Upload a strong message digest (sha512) of the archive
    • Configure https for your download server

    GPGit is meant to bring GPG to the masses. It is not only a shell script that automates the process of creating new signed git releases with GPG but also comes with this step-by-step readme guide for learning how to use GPG.

    Additional Information:

    • https://github.com/NicoHood/gpgit
    • https://help.github.com/categories/gpg/
    • https://wiki.archlinux.org/index.php/GnuPG
    • https://git-scm.com/book/en/v2/Git-Tools-Signing-Your-Work
    • https://www.qubes-os.org/doc/verifying-signatures/
    • https://lkml.org/lkml/2016/8/15/445
    • https://developers.google.com/web/fundamentals/security/encrypt-in-transit/why-https
    • https://www.enigmail.net/index.php/en/

    Thanks in advance.

    opened by NicoHood 15
  • Width of progressbar is wrong

     10% (1 of 10) |##                          | Elapsed Time: 0:00:00 ETA: 0:00:0
     20% (2 of 10) |#####                       | Elapsed Time: 0:00:00 ETA: 0:00:0
     30% (3 of 10) |########                    | Elapsed Time: 0:00:00 ETA: 0:00:0
     40% (4 of 10) |###########                 | Elapsed Time: 0:00:00 ETA: 0:00:0
     50% (5 of 10) |##############              | Elapsed Time: 0:00:00 ETA: 0:00:0  
     60% (6 of 10) |################            | Elapsed Time: 0:00:00 ETA: 0:00:0 
     70% (7 of 10) |###################         | Elapsed Time: 0:00:00 ETA: 0:00:0
     80% (8 of 10) |######################      | Elapsed Time: 0:00:00 ETA: 0:00:0
     90% (9 of 10) |#########################   | Elapsed Time: 0:00:00 ETA: 0:00:0
    100% (10 of 10) |#############################| Elapsed Time: 0:00:01 Time: 1.00
    

    Arch Linux, i3wm, rxvt-unicode-256color. Also on Windows in MSYS2.

    opened by maxnoe 14
  • signatures for github tarball releases

    Description

    It would be nice to create and upload signatures for the GitHub tarballs as well. Compared to the PyPI packages, the GitHub tarballs include tests and, more importantly, docs. This would help distro packagers distribute man/text versions of the docs alongside the Python modules, plus make it possible to test for regressions with other modules (like python-utils).

    Alternative solution: add tests and docs to pypi package.

    opened by anthraxx 13
  • Progressbar printed 2 times

    Description

    Hello, two progress bars are printed in my Jupyter notebook. If I use a simple example I get this:

    Code

    import time
    import progressbar

    bar = progressbar.ProgressBar()
    for i in bar(range(100)):
        time.sleep(0.02)
    
    100% (100 of 100) |##########################################################################################################################| Elapsed Time: 0:00:02 Time: 0:00:02  1% (1 of 100) |#                                                                                                                           | Elapsed Time: 0:00:00 ETA:  0:00:00
    

    Versions

    I'm using Ubuntu 14.04.5 LTS and Python 2.7.12 |Continuum Analytics, Inc.| (default, Jul 2 2016, 17:42:40) [GCC 4.4.7 20120313 (Red Hat 4.4.7-1)] with:

    • progressbar2==3.12.0
    • ipykernel==4.5.2
    • ipython==5.1.0
    • ipython-genutils==0.1.0
    • ipywidgets==5.2.2
    • jupyter==1.0.0
    • jupyter-client==4.4.0
    • jupyter-console==5.0.0
    • jupyter-core==4.2.1

    inactive 
    opened by tboquet 13
  • Fix unwrapping stdout/stderr

    Hi @WoLpH,

    Looks like you've got a couple of off-by-one errors here, which lead to stdout/stderr not being restored appropriately, causing some bugs downstream :).

    opened by matthewwardrop 12
  • Check for Jupyter Notebook before allocating other terminal sizes.

    Currently, the number of columns and rows used by progressbar2 in Jupyter notebooks is inherited from the shell from which it was run... which leads to some pretty weird behaviour. Instead, I think we should check for Jupyter notebooks first.
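
    For illustration, one common heuristic for that kind of check (a sketch only, not necessarily how progressbar2 would implement it):

    import sys

    def running_in_jupyter_kernel():
        # ipykernel is loaded by the notebook kernel itself, so its presence
        # in sys.modules is a reasonable (if imperfect) signal that we are
        # running inside a Jupyter notebook rather than a plain terminal.
        return 'ipykernel' in sys.modules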

    Is there ever a case where this doesn't make sense?

    @WoLpH

    opened by matthewwardrop 12
  • Not sure how to go about doing this

    OK I want to create a progress bar that looks like the following

    SomeFileName.cpp [13/45] |####################                               | 54% [FT 00:01:00][TT: 00:02:30][ETR: 00:05:40]
    

    I need to be able to change SomeFileName.cpp, and I would also like the ability to have the TT (Total Time) and FT (File Time) update even if I am not incrementing the value at all. The FT time would need to be reset back to zero when SomeFileName.cpp changes.
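
    One possible starting point is the Variable widget, assuming it and the keyword arguments to update() behave as in recent progressbar2 releases (a sketch only; it does not cover resetting a per-file timer):

    import time
    import progressbar

    widgets = [
        progressbar.Variable('filename'),
        ' [', progressbar.SimpleProgress(), '] ',
        progressbar.Bar(),
        ' ', progressbar.Percentage(),
        ' [', progressbar.Timer(), ']',
    ]
    with progressbar.ProgressBar(max_value=45, widgets=widgets) as bar:
        for i in range(45):
            time.sleep(0.1)
            # Keyword arguments passed to update() are picked up by the
            # matching Variable widgets on each refresh.
            bar.update(i + 1, filename='SomeFileName.cpp')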

    opened by kdschlosser 4
  • Is there a way to have multiple concurrent and independent progress bars?

    Description

    I have 2 concurrent processes (threads really) performing operations that are independent. I would like to have 2 progress bars (as 2 lines) to show the progress of each. The first one is bounded (number of tasks known), the second one is not (number of tasks unknown).

    Is that possible? I didn't see anything obvious in the examples

    opened by wabiloo 4
  • Question: multiprocessing, logging and progressbar

    Description

    I am trying to work out a way of combining multiprocessing (with several processes), logging, and your progress bar. The idea is that there will be multiple processes each working on separate tasks, and I'd like a progress bar that shows overall progress across all of them, whilst not preventing logs (and stdout) from being shown on screen.

    However, I am not managing to find a way to get a nice progress bar at the bottom of the console while all the processes send their logging information. I was hoping that using a QueueHandler with a specific listener process, and a specific process to also receive "status" information and manage the progress bar, would work, but it does not...

    I started from the example given at https://docs.python.org/3/howto/logging-cookbook.html#logging-to-a-single-file-from-multiple-processes and modified it into the following code.

    It's fair to say the question is probably not strictly related to the use of your progressbar, but I'm wondering if you'd have any suggestion...

    import logging
    import logging.handlers
    import multiprocessing
    
    from random import choice, random
    import time
    
    import progressbar
    progressbar.streams.wrap_stderr()
    progressbar.streams.wrap_stdout()
    
    def listener_configurer():
        root = logging.getLogger()
        # h = logging.handlers.RotatingFileHandler('mptest.log', 'a', 300, 3)
        # f = logging.Formatter('%(asctime)s %(processName)-10s %(name)s %(levelname)-8s %(message)s')
        # h.setFormatter(f)
        # root.addHandler(h)
    
    def listener_process(queue, configurer):
        configurer()
        while True:
            try:
                record = queue.get()
                if record is None:  # We send this as a sentinel to tell the listener to quit.
                    break
                logger = logging.getLogger(record.name)
                logger.handle(record)  # No level or filter logic applied - just do it!
            except Exception:
                import sys, traceback
                print('Whoops! Problem:', file=sys.stderr)
                traceback.print_exc(file=sys.stderr)
    
    # Arrays used for random selections in this demo
    
    LEVELS = [logging.DEBUG, logging.INFO, logging.WARNING,
              logging.ERROR, logging.CRITICAL]
    
    LOGGERS = ['a.b.c', 'd.e.f']
    
    MESSAGES = [
        'Random message #1',
        'Random message #2',
        'Random message #3',
    ]
    
    def worker_configurer(queue):
        h = logging.handlers.QueueHandler(queue)  # Just the one handler needed
        root = logging.getLogger()
        root.addHandler(h)
        # send all messages, for demo; no other level or filter logic applied.
        root.setLevel(logging.DEBUG)
    
    def worker_process(queue, configurer, status_queue):
        configurer(queue)
        name = multiprocessing.current_process().name
        # print('Worker started: %s' % name)
    
        for i in range(10):
            time.sleep(random())
            logger = logging.getLogger(choice(LOGGERS))
            level = choice(LEVELS)
            message = choice(MESSAGES)
            logger.log(level, message)
    
            status_queue.put(i)
        # print('Worker finished: %s' % name)
    
    def status_updater_process(queue):
        cpt = 0
        with progressbar.ProgressBar(max_value=progressbar.UnknownLength) as bar:
    
            while True:
                next = queue.get()
                if next is None:  # We send this as a sentinel to tell the listener to quit.
                    break
    
                cpt += 1
                bar.update(cpt)
    
    def main():
        status_queue = multiprocessing.Queue()
        status_worker = multiprocessing.Process(target=status_updater_process,
                                                args=(status_queue, ))
        status_worker.start()
    
        queue = multiprocessing.Queue(-1)
        listener = multiprocessing.Process(target=listener_process,
                                           args=(queue, listener_configurer))
        listener.start()
        workers = []
        for i in range(10):
            worker = multiprocessing.Process(target=worker_process,
                                             args=(queue, worker_configurer, status_queue))
            workers.append(worker)
            worker.start()
        for w in workers:
            w.join()
        queue.put_nowait(None)
        status_queue.put_nowait(None)
        listener.join()
        status_worker.join()
    
    if __name__ == '__main__':
        main()
    
    
    opened by wabiloo 3
  • Multi {threading/processing} support

    Description

    Given multiple tasks processed in parallel, a progress bar for each one may bring some enlightenment to the developer.

    Code

    I found this option here, but its interface makes it too slow to process heavy tasks. Your ASCII-based solution suits me well.

    Versions

    • Python version: 3.8.10 (default, Nov 26 2021, 20:14:08) [GCC 9.3.0]
    • Python distribution/environment: CPython/Anaconda/IPython/IDLE
    • Operating System: Ubuntu Linux
    opened by brunolnetto 22
  • Conda feedstock out of date

    Description

    Hi folks, is there any reason the conda progressbar2-feedstock isn't maintained? The latest version available there is 3.53.1, which is way behind PyPI version 3.55.0.

    help-wanted 
    opened by kukushking 1