The command line interface for Gradient - Gradient is an end-to-end MLOps platform

Overview


Gradient CLI



Get started: Create Account | Install CLI | Tutorials | Docs

Resources: Website | Blog | Support | Contact Sales


Gradient is an end-to-end MLOps platform that enables individuals and organizations to quickly develop, train, and deploy Deep Learning models. The Gradient software stack runs on any infrastructure, e.g. AWS, GCP, on-premise, and low-cost Paperspace GPUs. Leverage automatic versioning, distributed training, built-in graphs & metrics, hyperparameter search, GradientCI, 1-click Jupyter Notebooks, our Python SDK, and more.

Key components:

  • Notebooks: 1-click Jupyter Notebooks.
  • Workflows: Train models at scale with composable actions.
  • Inference: Deploy models as API endpoints.

Gradient supports any ML/DL framework (TensorFlow, PyTorch, XGBoost, etc.).


See releasenotes.md for details on the current release, as well as release history.


Getting Started

  1. Make sure you have a Paperspace account set up. Go to http://paperspace.com to register and generate an API key.

  2. Use pip, pipenv, or conda to install the gradient package, e.g.:

    pip install -U gradient

    To install or update a prerelease (alpha/beta) version of gradient, use:

    pip install -U --pre gradient

  3. Set your API key by executing the following, replacing the Xs with your actual key:

    gradient apiKey XXXXXXXXXXXXXXXXXXX

    Note: your API key is cached in ~/.paperspace/config.json (a quick way to inspect this cache is sketched just after this list)

    You can remove your cached API key by executing:

    gradient logout
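
If you want to verify what the CLI has cached, the config file noted above is plain JSON. A minimal sketch in Python (the apiKey field name is an assumption; check the contents of your own config.json):

    import json
    from pathlib import Path

    # Cache location noted in step 3 above; written by `gradient apiKey`
    config_path = Path.home() / ".paperspace" / "config.json"

    if config_path.exists():
        config = json.loads(config_path.read_text())
        # List which fields are cached without echoing the secret itself
        print("cached fields:", sorted(config))
        print("api key set:", bool(config.get("apiKey")))  # field name assumed
    else:
        print("no cached config found; run `gradient apiKey` first")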

Executing tasks on Gradient

The Gradient CLI follows a standard [command] [--options] syntax.

For example, to create a new Workflow, use:

gradient workflows create [type] [--options]

For a full list of available commands, run gradient workflows --help. You can also find more information about Workflows in the docs.

Contributing

Want to contribute? Contact us at [email protected]

Pre-Release Testing

Have a Paperspace QA tester install your change directly from the branch to test it. They can do it with pip install git+https://github.com/Paperspace/[email protected].

Comments
  • Some parameters in job_client.create() don't work


    I'm following the sample docs at gradient.api_sdk.clients.job_client. I can't get working_directory or job_env to work.

    working_directory: auto-set to /paperspace. When I pass in /app, it's ignored and /paperspace is still used (listed in Console > Job > Environment, and on the Job model fetched from job_client.list()). I just caved in and moved all my Docker stuff from /app to /paperspace and it worked. Ideally I wouldn't even want to pass /app; it would just be picked up from WORKDIR in the Dockerfile (which is /app).

    job_env: I can't find a workaround (save putting a config.json in /storage). The parameter gets ignored; of note, the docs have a typo:

    job = job_client.create(
        ...
        job_env={
            'CUSTOM_ENV'='Some value that will be set as system environment',
        }
    )
    

    'CUSTOM_ENV'='..' should be 'CUSTOM_ENV': '..'. Not nit-picking; it just had me second-guessing my approach, since using a dict gets ignored ([j.job_env for j in job_client.list()] => [None, None, ..]), so I tried json.dumps(my_dict), still ignored. Any suggestions on getting job_env passed in?

    Overall, it seems like some params are respected and others are not, and it's hard to know which is which. Maybe a mismatch between the docs, GitHub, and PyPI?
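
    For reference, a minimal sketch of the corrected call shape (job_client as referenced above; relative to the quoted docs snippet, only the dict syntax changes):

    job = job_client.create(
        ...
        job_env={
            'CUSTOM_ENV': 'Some value that will be set as system environment',  # ':' rather than '='
        }
    )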

    opened by lefnire 8
  • Unable to install due to numpy==1.19.4


    Hi, I am unable to use gradient on Python 3.7.9 due to a RuntimeError caused by numpy==1.19.4:

    RuntimeError: The current Numpy installation ('c:\users\XXXX\envs\gradient\lib\site-packages\numpy\__init__.py') fails to pass a sanity check due to a bug in the windows runtime. See this issue for more information: https://tinyurl.com/y3dm3h86

    I tried downgrading numpy, but it appears to be incompatible and I got the following error: ModuleNotFoundError: No module named '_curses'

    opened by thompsonalecbgo 7
  • Update Readme.md


    I found two outdated or currently non-working commands while trying to execute my container:

    1. The --project option was ignored, but providing --projectId worked.
    2. The example for the --command option produced the error message Error: Missing argument "SCRIPT...". Upon providing a script, the originally provided command was ignored. Instead, I had to use the --shell option to execute the correct call.
    opened by nziermann 6
  • Keep artifacts for canceled jobs


    I am aware this is probably not the place to open this issue, but it seems like artifacts are not kept when a job is canceled. But why? I just finished a job that had been running for 2 days and wanted access to my best model, but the artifacts are not showing up, potentially because I canceled the job. If that is the case, it is very frustrating.

    opened by MichelML 5
  • feat(notebooks): enable basic notebook lifecycle commands PS-13680


    Newly Supported Commands

    • notebooks stop
    • notebooks start
    • notebooks create
    • notebooks fork
    • notebooks artifacts list

    Related tickets

    https://paperspace.atlassian.net/browse/PS-12219 https://paperspace.atlassian.net/browse/PS-11559 https://paperspace.atlassian.net/browse/PS-13681 https://paperspace.atlassian.net/browse/PS-13682 https://paperspace.atlassian.net/browse/PS-13683 https://paperspace.atlassian.net/browse/PS-13684

    This PR follows up on the PR for PS-12219 (https://github.com/Paperspace/PS_API/pull/1526). It enables the createNotebook, startNotebook, artifactsList, and forkNotebook commands with the v2 endpoint. It adds the ability to create a notebook with vm_type_id or vm_type_label.

    opened by kevin-kabore 4
  • Failing to start notebook


    I'm trying to start an instance of an existing notebook: gradient notebooks start --id [id] --machineType Free-P5000

    and get in return: Failed to create resource: Cluster null not found

    Specifying any cluster ID doesn't change anything; the output remains the same.

    A bug?

    bug 
    opened by macsunmood 3
  • CR2-22 CR2-49 CR2-48 add metrics list to sdk


    Adds list-custom-metrics functionality to the SDK and CLI for jobs, experiments, deployments, and notebooks.

    QA Test Plan:

    1. In the CLI, check that the new list command shows up in the help text: gradient deployments metrics --help
    2. Create a deployment (GUI or CLI, it doesn't matter) and wait for it to finish provisioning.
    3. Try the new list command: gradient deployments metrics list --id XYZ; it should return custom metrics (a list of words).
    4. Repeat 1 and 2 for jobs, experiments, and notebooks. These might return null, but that's okay for now; we're just checking that the new command is available. Note anything that returns null below, if any, and I'll solve that in a separate ticket.
    released 
    opened by robghchen 3
  • Fix: Serialize jobenv to envVars PS-15020


    Fixes serialization of the job environment.

    Test Plan: Create a job and specify the jobEnv parameter. Use the env command to see if the job environment is affected.

    released 
    opened by paperspace-philip 3
  • When providing --ignoreFiles comma-separated, the code returns an AttributeError.


    The command below with the --ignoreFiles field set

    gradient experiments run single node \
    --name test \
    --projectid 12345 \
    --container paperspace/tensorflow-python \
    --machineType V100 \
    --command 'python main.py' \
    --ignoreFiles "file1,file2,file3,folder1,folder2,folder3"
    

    Returns the following error

    file_paths = self._retrieve_file_paths(workspace_path, ignore_files)
    File "/lib/python3.7/site-packages/gradient/workspace.py", line 59, in _retrieve_file_paths
        exclude += ignored_files.split(',')
    AttributeError: 'list' object has no attribute 'split'
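
    A minimal sketch of a defensive fix (illustrative only, not the actual patch; the normalize_ignored_files helper is hypothetical): accept the ignored files as either a comma-separated string or an already-split list before extending the exclude list.

    def normalize_ignored_files(ignored_files):
        """Accept either 'a,b,c' or ['a', 'b', 'c'] and return a list of names."""
        if ignored_files is None:
            return []
        if isinstance(ignored_files, str):
            return ignored_files.split(',')
        return list(ignored_files)

    assert normalize_ignored_files("file1,file2") == ["file1", "file2"]
    assert normalize_ignored_files(["file1", "file2"]) == ["file1", "file2"]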
    
    opened by EvDuijnhoven 3
  • Notebook example with SDK concepts


    Example of using new SDK functionality to create projects and experiments, analyze a model, and create an inference deployment.

    Tested on a TensorFlow 2.0 & Python 3 container image.

    opened by dte 3
  • Replace faulty chunk counting logic


    PR #384, which allowed multipart uploads, introduced a bug that prevents uploading datasets containing files whose size is an exact multiple of 500 MB. This became more of an issue with #389, which changes the chunk size to 15 MB, meaning that any file that is a multiple of 15 MB (75 MB in my case) will cause the dataset upload to fail.

    What I believe is happening is that the faulty logic instructs us to read an additional block of data from the filesystem that doesn't exist. In my experience this shows up as the CLI hanging indefinitely, or, in a Workflow, as a crash.
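
    Illustrating the arithmetic (a sketch, not the actual upload code): ceil division gives the right chunk count for files that are exact multiples of the chunk size, whereas floor-plus-one over-counts by one and schedules a read for a chunk that doesn't exist.

    import math

    CHUNK_SIZE = 15 * 1024 * 1024  # 15 MB, per the PRs referenced above

    def chunk_count(file_size, chunk_size=CHUNK_SIZE):
        # Number of chunks needed to cover file_size bytes
        return math.ceil(file_size / chunk_size)

    # A 75 MB file is exactly 5 chunks; `file_size // chunk_size + 1` would say 6
    assert chunk_count(75 * 1024 * 1024) == 5
    assert (75 * 1024 * 1024) // CHUNK_SIZE + 1 == 6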

    released 
    opened by fmorlock-tt 2
  • Failed to execute request against storage provider when uploading a dataset


    Dear everyone,

    I am trying to upload a new version of my dataset, but I keep getting broken pipe errors. Here is what I am doing:

    gradient datasets versions create --id ...
    gradient datasets files put --id ...:... --source-path "."
    
    Failed to execute request against storage provider: ('Connection aborted.', BrokenPipeError(32, 'Broken pipe'))
    

    Uploading to other cloud providers is working, but not here.

    By the way, is gradient-cli uploading everything again at each new version? Or is it comparing hashes to avoid unnecessary file transfers?

    Thanks, Clément

    opened by clementpoiret 0
Latest release: v2.0.6