Overview

dcargs

dcargs is a tool for generating portable, reusable, and strongly typed CLI interfaces from dataclass definitions.

We expose one function, parse(Type[T]) -> T, which takes a dataclass type and instantiates it via an argparse-style CLI interface. If we create a script called simple.py:

import dataclasses

import dcargs


@dataclasses.dataclass
class Args:
    field1: str  # A string field.
    field2: int  # A numeric field.


if __name__ == "__main__":
    args = dcargs.parse(Args)
    print(args)

Running python simple.py --help would print:

usage: simple.py [-h] --field1 STR --field2 INT

optional arguments:
  -h, --help    show this help message and exit

required arguments:
  --field1 STR  A string field.
  --field2 INT  A numeric field.

And running python simple.py --field1 string --field2 4 would print:

Args(field1='string', field2=4)

Feature list

The parse function supports a wide range of dataclass definitions, while automatically generating helptext from comments/docstrings. Some of the basic features are shown in the example below.

Our unit tests cover many more complex type annotations, including classes containing:

  • Types natively accepted by argparse: str, int, float, pathlib.Path, etc.
  • Default values for optional parameters
  • Booleans, which are automatically converted to flags when provided a default value (e.g. action="store_true" or action="store_false"; in the latter case, we prefix names with no-)
  • Enums (via enum.Enum; argparse's choices is populated and arguments are converted automatically)
  • Various container types. Some examples:
    • typing.ClassVar types (omitted from parser)
    • typing.Optional types
    • typing.Literal types (populates argparse's choices)
    • typing.Sequence types (populates argparse's nargs)
    • typing.List types (populates argparse's nargs)
    • typing.Tuple types, such as typing.Tuple[T, T, T] or typing.Tuple[T, ...] (populates argparse's nargs, and converts automatically)
    • typing.Final types and typing.Annotated (for parsing, these are effectively no-ops)
    • Nested combinations of the above: Optional[Literal[T]], Final[Optional[Sequence[T]]], etc.
  • Nested dataclasses
    • Simple nesting (see OptimizerConfig example below)
    • Unions over nested dataclasses (subparsers)
    • Optional unions over nested dataclasses (optional subparsers)
  • Generic dataclasses (including nested generics, see ./examples/generics.py)

A usage example is available below. Examples of additional features can be found in the tests.
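
For a rough sense of how these combine, here's a small sketch (not taken from the repository's examples; the field names are illustrative) that mixes several of the annotations listed above:

import dataclasses
import enum
import pathlib
from typing import Literal, Optional, Tuple

import dcargs


class Color(enum.Enum):
    RED = enum.auto()
    BLUE = enum.auto()


@dataclasses.dataclass
class Args:
    # Fixed-length tuple; populates argparse's nargs and converts each element.
    bounds: Tuple[int, int]
    # Paths are parsed natively.
    output_dir: pathlib.Path = pathlib.Path("outputs")
    # A boolean with a default becomes a flag (--verbose).
    verbose: bool = False
    # Literal populates argparse's choices.
    mode: Literal["train", "eval"] = "train"
    # Optional enum; choices are taken from the enum members.
    color: Optional[Color] = None


if __name__ == "__main__":
    print(dcargs.parse(Args))

Running this with arguments like --bounds 0 10 --mode eval --verbose should print the populated Args instance.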

Comparisons to alternative tools

There are several alternative libraries to dcargs; here's a rough summary of some of them:

Features compared: parsers from dataclasses, parsers from attrs, nested dataclasses, subparsers (via unions), containers, choices from literals, and docstrings as helptext. The libraries compared against dcargs are:

  • datargs
  • simple-parsing (choices from literals: soon)
  • argparse-dataclass
  • argparse-dataclasses
  • dataclass-cli
  • hf_argparser

Some other distinguishing factors that dcargs has put effort into:

  • Robust handling of forward references
  • Support for nested containers and generics
  • Strong typing: we actively avoid relying on strings or dynamic namespace objects (e.g. argparse.Namespace); see the sketch below
  • Simplicity + strict abstractions: we're focused on a single-function API, and don't leak any argparse implementation details to the user level. We also intentionally don't offer any way to add argument-parsing-specific logic to dataclass definitions. (In contrast, some of the libraries above rely heavily on dataclass field metadata, or, at the more extreme end, on inheritance + decorators to define parsing-specific dataclasses.)
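
To make the strong-typing point concrete, here's a minimal sketch (not from the original README): the value returned by parse is an instance of the dataclass itself rather than an argparse.Namespace, so static checkers like mypy can verify attribute names and types.

import dataclasses

import dcargs


@dataclasses.dataclass
class Args:
    count: int = 1


if __name__ == "__main__":
    args = dcargs.parse(Args)
    # `args` is an Args instance: mypy knows `args.count` is an int, and a typo
    # like `args.cuont` would be caught statically instead of failing at runtime.
    print(args.count * 2)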

Example usage

This code:

"""An argument parsing example.

Note that there are multiple possible ways to document dataclass attributes, all
of which are supported by the automatic helptext generator.
"""

import dataclasses
import enum

import dcargs


class OptimizerType(enum.Enum):
    ADAM = enum.auto()
    SGD = enum.auto()


@dataclasses.dataclass
class OptimizerConfig:
    # Variant of SGD to use.
    type: OptimizerType

    # Learning rate to use.
    learning_rate: float = 3e-4

    # Coefficient for L2 regularization.
    weight_decay: float = 1e-2


@dataclasses.dataclass
class ExperimentConfig:
    experiment_name: str  # Experiment name to use.

    optimizer: OptimizerConfig

    seed: int = 0
    """Random seed. This is helpful for making sure that our experiments are
    all reproducible!"""


if __name__ == "__main__":
    config = dcargs.parse(ExperimentConfig, description=__doc__)
    print(config)

Generates the following argument parser:

$ python example.py --help
usage: example.py [-h] --experiment-name STR --optimizer.type {ADAM,SGD} [--optimizer.learning-rate FLOAT]
                  [--optimizer.weight-decay FLOAT] [--seed INT]

An argument parsing example.

Note that there are multiple possible ways to document dataclass attributes, all
of which are supported by the automatic helptext generator.

optional arguments:
  -h, --help            show this help message and exit
  --optimizer.learning-rate FLOAT
                        Learning rate to use. (default: 0.0003)
  --optimizer.weight-decay FLOAT
                        Coefficient for L2 regularization. (default: 0.01)
  --seed INT            Random seed. This is helpful for making sure that our experiments are
                        all reproducible! (default: 0)

required arguments:
  --experiment-name STR
                        Experiment name to use.
  --optimizer.type {ADAM,SGD}
                        Variant of SGD to use.

Comments

  • Cannot use `tuple` and `list` in python 3.9

    Hi all,

    I have a problem when using the new tuple and list type annotation with tyro. It gives me the following error:

    AttributeError: type object 'tuple' has no attribute 'copy_with'
    

    The code runs fine with Tuple and List.
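
    A minimal reproduction along the lines described might look like this (a sketch, not from the original report; it assumes tyro.cli and Python 3.9):

    import dataclasses

    import tyro


    @dataclasses.dataclass
    class Args:
        # PEP 585 built-in generics; reported to raise
        # "AttributeError: type object 'tuple' has no attribute 'copy_with'".
        values: tuple[int, ...]
        names: list[str]


    if __name__ == "__main__":
        print(tyro.cli(Args))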

    opened by Msadat97 5
  • Generic dataclass detection fails for unions

    Hi Brent,

    I have a very low-level bug to flag for you -- when saving/loading nested dataclasses to YAML (using extras.to_yaml() / extras.from_yaml()), if a dataclass has a Union of two custom types, they don't get detected as custom types for the yaml.Loader to construct.

    I wrote a MWE to replicate the issue:

    import dataclasses
    import dcargs
    
    from typing import Union
    
    @dataclasses.dataclass
    class TypeA:
        data: int
    
    @dataclasses.dataclass
    class TypeB:
        data: int
        
    @dataclasses.dataclass
    class Wrapper:
        subclass: Union[TypeA, TypeB] = TypeA(1)
        
    if __name__ == "__main__":
        wrapper1 = Wrapper() # Create Wrapper object.
        wrapper2 = dcargs.extras.from_yaml(Wrapper, dcargs.extras.to_yaml(wrapper1)) # Errors, no constructor for TypeA
    

    No worries if this is too low-level to deal with right now -- I think we can work around it by just pickling the configs, but wanted to flag something is going awry in the custom type detection.

    opened by pculbertson 5
  • Subcommands are broken

    Given this file,

    # tyro_test.py
    
    from dataclasses import dataclass
    from typing import Union
    
    import tyro
    
    
    @dataclass
    class DataparserA:
        pass
    
    
    @dataclass
    class DataparserB:
        pass
    
    
    Dataparser = Union[DataparserA, DataparserB]
    
    
    @dataclass
    class ModelA:
        pass
    
    @dataclass
    class ModelB:
        pass
    
    
    Model = Union[ModelA, ModelB]
    
    
    @dataclass
    class Pipeline:
        dataparser: Dataparser
        model: Model
    
    
    @dataclass
    class Trainer:
        pipeline: Pipeline
    
    
    tyro.cli(Trainer)
    

    When I run with python tyro_test.py pipeline.dataparser:dataparser-a pipeline.model:model-a, I get the following error with tyro==0.3.35 and ==0.3.36:

    Traceback (most recent call last):
      File "/home/kchen/mttr_nerfstudio/tyro_test.py", line 43, in <module>
        tyro.cli(Trainer)
      File "/home/kchen/miniconda3/envs/nerfstudio/lib/python3.10/site-packages/tyro/_cli.py", line 125, in cli
        _cli_impl(
      File "/home/kchen/miniconda3/envs/nerfstudio/lib/python3.10/site-packages/tyro/_cli.py", line 326, in _cli_impl
        out, consumed_keywords = _calling.call_from_args(
      File "/home/kchen/miniconda3/envs/nerfstudio/lib/python3.10/site-packages/tyro/_calling.py", line 100, in call_from_args
        value, consumed_keywords_child = call_from_args(
      File "/home/kchen/miniconda3/envs/nerfstudio/lib/python3.10/site-packages/tyro/_calling.py", line 110, in call_from_args
        subparser_def = parser_definition.subparsers_from_prefix[
    KeyError: 'pipeline.model'
    
    • I do not get an error if I use tyro==0.3.33
    • I do not get an error if I use tyro==0.3.35 but change the last line to tyro.cli(Pipeline) and then run python tyro_test.py dataparser:dataparser-a model:model-a

    Maybe this is related to the new tyro.conf.ConsolidateSubcommandArgs functionality? But I am not sure. Sorry for just dumping the error and not looking into the source; I don't have much time right now.

    opened by kevinddchen 3
  • Setting docstrings for dynamic dataclasses

    Another weird question:

    Is it possible to set docstrings programmatically for dynamic dataclasses?

    A quick look through the source seems to indicate that this isn't possible, since for dataclasses it relies on the source existing, and seems to default to returning None in the case it detects that the object is a dynamic dataclass.

    It could be super useful if you could add optional support for docs with dynamic dataclasses, for purposes like dynamically generating schemas from class definitions / function definitions. It's not super elegant but this could be exposed through the metadata field of dataclasses.field, which according to the docs is meant for 3rd party extensions for dynamic dataclasses.
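
    For reference, here's a sketch of the kind of dynamic dataclass being described. The "help" metadata key is hypothetical; it only illustrates where per-field docs could live, and isn't something dcargs is documented to read:

    import dataclasses

    # Dynamically generated dataclass: there is no source file for docstring
    # introspection to parse, which is why helptext currently comes out empty.
    DynConfig = dataclasses.make_dataclass(
        "DynConfig",
        [
            ("learning_rate", float, dataclasses.field(default=3e-4, metadata={"help": "Step size."})),
            ("seed", int, dataclasses.field(default=0, metadata={"help": "Random seed."})),
        ],
    )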

    opened by tovacinni 3
  • Visibility of parameter options across subcommands with defaults

    Hi Brent,

    First off, I'm really liking this framework! I have a use case that kind of combines "base configs as subcommands" with "sequenced subcommands".

    Say I have a module that has two submodules, A and B. Furthermore, say each submodule has several possible "typical" configurations, e.g. A1, A2..., B1, B2,...

    What I would like to do is simultaneously:

    1. Set up base configs for all combinations of the typical configs for both A and B, without having to enumerate all combinations e.g. A1B1, A1B2, etc...
    2. View, from -h, the possible options for both submodules A and B.

    Is there currently a way of doing this? I've attached 2 examples. The first one sets up all base configs for both, but doesn't list all options with -h (it only lists options for the most recent subcommand). The second one will display all the possible parameter options for both A and B with -h (after one of the subcommands is specified).

    I'm not even sure if what I'm trying to do is possible in a "subcommand" sense? I've also tried the AvoidSubcommands type but I can't really make that work either.

    Thanks, Mark

    a.py:

    from dataclasses import dataclass
    from typing import Annotated, Union
    
    import tyro
    from tyro.conf import subcommand
    
    @dataclass(frozen=True)
    class SubModuleAConfig:
        param1: float
    submoda_defaults = {
        'basic': SubModuleAConfig(param1=1.),
        'fancy': SubModuleAConfig(param1=2.2),
    }
    submoda_descriptions = {
        'basic': 'Basic config',
        'fancy': 'Fancy config'
    }
    SubModuleADefaultsType = tyro.extras.subcommand_type_from_defaults(
        submoda_defaults, submoda_descriptions
    )
    
    @dataclass(frozen=True)
    class SubModuleBConfig:
        param2: int
    submodb_defaults = {
        'basic': SubModuleBConfig(param2=0),
        'fancy': SubModuleBConfig(param2=-5),
    }
    submodb_descriptions = {
        'basic': 'Basic config',
        'fancy': 'Fancy config'
    }
    SubModuleBDefaultsType = tyro.extras.subcommand_type_from_defaults(
        submodb_defaults, submodb_descriptions
    )
    
    @dataclass
    class FullModuleConfig:
        suba: SubModuleADefaultsType
        subb: SubModuleBDefaultsType
    
    if __name__ == '__main__':
        full_module_config = tyro.cli(FullModuleConfig)
        print(full_module_config)
    

    Output:

    $ python a.py suba:basic subb:basic -h
    usage: a.py suba:basic subb:basic [-h] [--subb.param2 INT]
    
    Basic config
    
    ╭─ arguments ─────────────────────────────────────────────╮
    │ -h, --help              show this help message and exit │
    ╰─────────────────────────────────────────────────────────╯
    ╭─ subb arguments ────────────────────────────────────────╮
    │ --subb.param2 INT       (default: 0)                    │
    ╰─────────────────────────────────────────────────────────╯
    

    b.py

    from dataclasses import dataclass
    from itertools import product
    from typing import Annotated, Union
    
    import tyro
    from tyro.conf import subcommand
    
    @dataclass(frozen=True)
    class SubModuleAConfig:
        param1: float
    submoda_defaults = {
        'basic': SubModuleAConfig(param1=1.),
        'fancy': SubModuleAConfig(param1=2.2),
    }
    submoda_descriptions = {
        'basic': 'Basic config',
        'fancy': 'Fancy config'
    }
    
    @dataclass(frozen=True)
    class SubModuleBConfig:
        param2: int
    submodb_defaults = {
        'basic': SubModuleBConfig(param2=0),
        'fancy': SubModuleBConfig(param2=-5),
    }
    submodb_descriptions = {
        'basic': 'Basic config',
        'fancy': 'Fancy config'
    }
    
    @dataclass
    class FullModuleConfig:
        suba: SubModuleAConfig
        subb: SubModuleBConfig
    
    all_defaults = {}
    all_descriptions = {}
    combos = product(submoda_defaults.items(), submodb_defaults.items())
    for (suba_name, suba_config), (subb_name, subb_config) in combos:
        name = f'A{suba_name}_B{subb_name}'
        all_defaults[name] = FullModuleConfig(
            suba=suba_config,
            subb=subb_config,
        )
        all_descriptions[name] = f'A: {submoda_descriptions[suba_name]}, ' \
            + f'B: {submodb_descriptions[subb_name]}'
    
    if __name__ == '__main__':
        full_module_config = tyro.cli(
            tyro.extras.subcommand_type_from_defaults(
                all_defaults,
                all_descriptions,
            )
        )
        print(full_module_config)
    

    Output:

    $ python b.py Abasic_Bbasic -h
    usage: b.py Abasic_Bbasic [-h] [--suba.param1 FLOAT] [--subb.param2 INT]
    
    A: Basic config, B: Basic config
    
    ╭─ arguments ─────────────────────────────────────────────╮
    │ -h, --help              show this help message and exit │
    ╰─────────────────────────────────────────────────────────╯
    ╭─ suba arguments ────────────────────────────────────────╮
    │ --suba.param1 FLOAT     (default: 1.0)                  │
    ╰─────────────────────────────────────────────────────────╯
    ╭─ subb arguments ────────────────────────────────────────╮
    │ --subb.param2 INT       (default: 0)                    │
    ╰─────────────────────────────────────────────────────────╯
    
    opened by nishi951 2
  • Fallthrough args for subcommands

    One thing that's really nice about CLI11 is fallthrough args.

    This isn't supported by argparse natively, which means that instead of writing something like:

    python x.py subcommand1 subcommand2 {--options for the root parser of x.py, subcommand1, and subcommand2}
    

    we're forced to write:

    python x.py {--options for the root parser of x.py} subcommand1 {--options for subcommand1} subcommand2 {--options for subcommand2}
    

    Which requires much more cognitive energy, because we need to be careful about where arguments are placed.

    We should be able to partially solve this: it won't be as elegant as CLI11, but adding a flag that, when the subcommand tree is built, distributes arguments applied to intermediate nodes down to the leaves of the tree would enable the syntax in the first example.

    We basically have two approaches for this:

    1. Refactor ParserSpecification.apply() to support this. This would require big changes to the way "sibling" subcommands are handled.
    2. Keep the current ParserSpecification / argparse.ArgumentParser construction logic, but, as a post-processing step, move all argparse groups for intermediate subcommand nodes to the leaves below them. This feels hackier but might be simpler.

    opened by brentyi 1
  • Faster + lazy helptext generation

    nerfstudio's ns-train function currently has ~500 arguments, which results in nearly 0.4 seconds (!!) of dcargs overhead. That's huge!

    It's currently still a small part of overall startup time, but some profiling shows that most of it is spent on helptext formatting; about 0.1 seconds for rich operations and 0.2 seconds for docstring parsing.

    Most of the time, the helptext isn't even used; we should find ways to run less logic and faster logic. More intelligent caching and lazy strings would likely speed things up by ~an order of magnitude.
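
    As a sketch of the "lazy strings" idea (one possible shape, not code from the dcargs codebase; parse_docstring below is a hypothetical placeholder for the expensive work):

    import functools


    class LazyStr:
        """Defers an expensive string computation until the value is rendered."""

        def __init__(self, compute):
            self._compute = compute

        @functools.cached_property
        def _value(self) -> str:
            return self._compute()

        def __str__(self) -> str:
            return self._value


    # helptext = LazyStr(lambda: parse_docstring(SomeConfig))
    # str(helptext) only runs the docstring parsing if --help actually renders it.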

    opened by brentyi 1
  • Subparsing issue with Union types

    Hi Brent,

    We hit another issue with some nested configs we're using -- basically something is going awry (I think) due to a type Union.

    Here's a MWE:

    import dataclasses
    import dcargs
    
    from typing import Tuple, Union
    
    @dataclasses.dataclass(frozen=True)
    class Subtype:
        data: int = 1
        
    @dataclasses.dataclass(frozen=True)
    class TypeA:
        subtype: Subtype = Subtype(1)
    
    @dataclasses.dataclass(frozen=True)
    class TypeB:
        subtype: Subtype = Subtype(2)
        
    @dataclasses.dataclass(frozen=True)
    class Wrapper:
        supertype: Union[TypeA, TypeB] = TypeA()
        
    if __name__ == "__main__":
        wrapper = dcargs.cli(Wrapper) # errors when running with supertype:type-a
        print(wrapper)
    

    If you put this in a module subparsers.py and run $ python subparsers.py, everything works; if you run $ python subparsers.py supertype:type-a, it throws the following error:

    File "/opt/conda/lib/python3.7/site-packages/dcargs/_cli.py", line 272, in _cli_impl
        avoid_subparsers=avoid_subparsers,
      File "/opt/conda/lib/python3.7/site-packages/dcargs/_calling.py", line 169, in call_from_args
        avoid_subparsers=avoid_subparsers,
      File "/opt/conda/lib/python3.7/site-packages/dcargs/_calling.py", line 117, in call_from_args
        assert len(parser_definition.subparsers_from_name) > 0
    AssertionError
    

    Thanks again for the great package + sorry to raise obscure issues! No problem if this isn't high-priority.

    opened by pculbertson 1
  • Support for union + nested hierarchies?

    I have an experiment with two types of models that can run, each with its own configs. In my ExperimentConfig, I have a model_class parameter which is of type Union[ModelAConfig, ModelBConfig], and defaults to ModelAConfig. When I call dcargs.cli(ExperimentConfig), how do I set which config to use, as well as the respective parameters in it?

    In general, how can Union over two types of configs be used if each config has its own set of parameters?

    Edit: Closing this; I think this is addressed by using subparsers.
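
    A sketch of the subparser pattern that resolves this (the subcommand and flag spellings in the comment are assumptions based on dcargs' usual kebab-case naming):

    import dataclasses
    from typing import Union

    import dcargs


    @dataclasses.dataclass(frozen=True)
    class ModelAConfig:
        layers: int = 2


    @dataclasses.dataclass(frozen=True)
    class ModelBConfig:
        kernel_size: int = 3


    @dataclasses.dataclass
    class ExperimentConfig:
        # The union over nested dataclasses becomes a subparser; the subcommand
        # chosen on the command line selects which config is instantiated.
        model_class: Union[ModelAConfig, ModelBConfig] = ModelAConfig()


    if __name__ == "__main__":
        # e.g. python experiment.py model-class:model-b-config --model-class.kernel-size 5
        print(dcargs.cli(ExperimentConfig))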

    opened by krishpop 1
  • Improve default detection for subcommands

    https://github.com/brentyi/dcargs/blob/master/dcargs/_parsers.py#L360-L377

    When we define subcommands, the "default" text is currently determined solely based on the type of the specified default. This may cause issues when multiple subcommands with the same type are configured via dcargs.conf.subcommand.
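
    For context, here's a sketch of the configuration being referred to, assuming the Annotated + dcargs.conf.subcommand pattern (names are illustrative): two subcommands share the same type and differ only in their defaults, so type-based default detection can't tell them apart.

    import dataclasses
    from typing import Annotated, Union

    import dcargs


    @dataclasses.dataclass(frozen=True)
    class TrainConfig:
        steps: int = 100


    TrainCommand = Union[
        Annotated[TrainConfig, dcargs.conf.subcommand(name="small", default=TrainConfig(steps=100))],
        Annotated[TrainConfig, dcargs.conf.subcommand(name="large", default=TrainConfig(steps=10_000))],
    ]

    if __name__ == "__main__":
        print(dcargs.cli(TrainCommand))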

    opened by brentyi 0
  • YAML serialization helpers

    This API hasn't been touched in a while, which has caused some issues like #7.

    Things that probably won't work:

    • [x] Annotating a field with a base class, then assigning an instance of a subclass. This is fixable by iterating over cls.__subclasses__().
    • [x] Annotating a field with a protocol, then assigning a value that correctly implements the protocol.

    Some questions to consider:

    • Is the serialization API actually solving an issue people have; is it useful enough to keep around? PyYAML works pretty well.
    • Is this actually in scope for dcargs? We're only scratching the surface on potential features; readable + robust serialization could be its own project.
    • dcargs.cli() previously only supported dataclasses, but scope has expanded since then. Does it still make sense for the serialization helpers to be hyper-targeted on dataclasses?

    Two options for next steps:

    1. Fix or document all the bugs and caveats. Not sure we have the energy for this :smiling_face_with_tear:
    2. Deprecate the API.
    opened by brentyi 0
  • (WIP) Expose registry API for third-party integrations

    Motivated by #23, just starting a PR to track progress.

    The current goal is to:

    1. Expose an API for configuring custom behavior/constructors for types that match some criteria. This is particularly useful for protocols, which can only be instantiated by an external function.
    2. Prove that this API is powerful enough by internally migrating support for dataclasses, attrs, TypedDict, etc, over to it.

    It turns out, however, that the engineering complexity of (2) is pretty high once we start considering all of the corner cases (partials, generics, helptext generation, forward references, etc.) that the existing architecture was designed to handle.

    Given time constraints, we may need to choose between (a) not landing this feature for the foreseeable future or (b) reverting a bunch of changes and making the PR less ambitious.

    opened by brentyi 0
  • Assertion won't be formatted correctly in _arguments

    Sorry for the issue spamming, but here's a small bug report: this assertion needs to be replaced by an exception to properly display which fields have issues in the defaults. I would submit a PR, but I'll probably get the type of exception to use wrong :D

    https://github.com/brentyi/tyro/blob/e88e690e14ffe87e13419fa2b6427bc77c4ae336/tyro/_arguments.py#L198-L201

    opened by tovacinni 1
  • Overriding with YAML defaults on a dataclass config

    Hi again,

    Today I was trying to override a config defined by a dataclass using a YAML file. The docs (https://brentyi.github.io/tyro/examples/03_config_systems/02_overriding_yaml/) seem to show that using a simple dictionary does work for overriding, but for a dataclass-based config it yields a bunch of warnings. A look into the source suggests that it's looking for attributes, hence failing on a dictionary.

    Is this the intended behaviour? (Maybe it makes sense to assume attribute-based accessors considering the config itself is a dataclass, but I found this discrepancy with what's indicated in the docs a bit confusing, unless I missed something that specifies this behaviour.)

    For completeness, here's a small example to repro. Replacing the dict with an attrdict does work.

    import yaml
    
    import tyro
    import dataclasses
    import attrdict
    
    @dataclasses.dataclass
    class Config:
        exp_name : str
        batch_size : int
    
    # YAML configuration. Note that this could also be loaded from a file! Environment
    # variables are an easy way to select between different YAML files.
    default_yaml = r"""
    exp_name: test
    batch_size: 10
    """.strip()
    
    if __name__ == "__main__":
        # Convert our YAML config into a nested dictionary.
        default_config = dict(yaml.safe_load(default_yaml))
        
        # Using attrdict here instead will work
        #default_config = attrdict.AttrDict(default_config)
    
        # Override fields in the dictionary.
        overridden_config = tyro.cli(Config, default=default_config)
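
    One workaround sketch (not part of the original report): build the dataclass instance from the loaded dictionary before passing it as the default, so that attribute-based access works the way tyro expects.

    import dataclasses

    import tyro
    import yaml


    @dataclasses.dataclass
    class Config:
        exp_name: str
        batch_size: int


    default_yaml = r"""
    exp_name: test
    batch_size: 10
    """.strip()

    if __name__ == "__main__":
        # Build a real Config instance from the YAML dictionary; tyro then reads
        # the defaults via attribute access.
        default_config = Config(**yaml.safe_load(default_yaml))
        overridden_config = tyro.cli(Config, default=default_config)
        print(overridden_config)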
    
    opened by tovacinni 2
  • Getting YAMLs without populating defaults

    Hi tyro team,

    First of all thanks for this super cool configuration library!! It looks awesome.

    While reading the docs and playing around with the configurator I had a small question: is it possible to output a yaml file for a hierarchical config without first populating the argument defaults via the command line?

    The usecase is as follows:

    After defining a (hierarchical) dataclass schema, I want to populate a yaml with all null entries so that I can then fill in that yaml and use it as the default arguments. Ideally I can follow a flow like:

    1. The code looks for a default config.
    2. If none exists, it populates an empty config.
    3. Users can then fill in the defaults inside the yaml, and then override the yaml with CLI arguments.
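
    A rough sketch of generating such a null-filled template outside of tyro (the null_template helper below is illustrative, not a tyro API):

    import dataclasses

    import yaml


    @dataclasses.dataclass
    class OptimizerConfig:
        learning_rate: float = 3e-4


    @dataclasses.dataclass
    class ExperimentConfig:
        experiment_name: str = ""
        optimizer: OptimizerConfig = dataclasses.field(default_factory=OptimizerConfig)


    def null_template(cls) -> dict:
        """Build a nested dict with a null entry for every dataclass field."""
        return {
            f.name: null_template(f.type) if dataclasses.is_dataclass(f.type) else None
            for f in dataclasses.fields(cls)
        }


    if __name__ == "__main__":
        # Dump a YAML skeleton that users can fill in and later load back as defaults.
        print(yaml.safe_dump(null_template(ExperimentConfig)))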

    opened by tovacinni 2
  • hydra-zen + tyro ❤️

    Hello! I just came across tyro and it looks great!

    I wanted to put hydra-zen on your radar. It is a library designed to make Hydra-based projects more Pythonic and lighter on boilerplate code. It mainly does this by providing users with functions like builds and just, which dynamically generate dataclasses that describe how to build, via instantiate, various objects. There are plenty of bells and whistles that I could go into (e.g. nice support for partial'd targets), but I'll keep it brief-ish.

    That being said, hydra-zen's main features are quite independent of Hydra, and are more focused on generating dataclasses that can configure/build various Python interfaces. It seems like this might be the sort of thing that could be helpful for tyro users who want to generate nested, typed interfaces based on objects in their library or from third party libraries.

    This is just a rough idea at this point, but I figured that there might be some potential synergy here! I'd love to get your impressions if you think there might be any value here.
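
    To make the rough idea concrete, a sketch (assuming hydra-zen's builds/instantiate behave as described above; whether tyro renders every generated field cleanly isn't verified here):

    import tyro
    from hydra_zen import builds, instantiate


    class Optimizer:
        def __init__(self, lr: float = 1e-3, momentum: float = 0.9):
            self.lr = lr
            self.momentum = momentum


    # builds() dynamically generates a dataclass describing how to construct Optimizer.
    OptimizerConf = builds(Optimizer, populate_full_signature=True)

    if __name__ == "__main__":
        conf = tyro.cli(OptimizerConf)  # CLI generated from the dynamic dataclass
        optimizer = instantiate(conf)   # hydra-zen constructs the actual Optimizer
        print(optimizer.lr, optimizer.momentum)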

    opened by rsokl 13
  • Custom datamanager + tyro error

    Hi there!

    I asked a question in the Discord channel; I want to follow it up here. My question was the following:

    I want to implement new datamodules, but I don't want to include them in the nerfstudio directory; I'd rather keep them external. I copied the train.py code and added my own configs to the method_configs dictionary. I call tyro.cli with a modified AnnotatedBaseConfigUnion, but tyro gives an AssertionError. I realized I'd have to modify datamanagers.py and my new dataparser config, but I want to avoid modifying any code in nerfstudio. What is the correct approach to achieve this?

    Here is my code:

    
    import pathlib
    import sys
    
    current_path = pathlib.Path(__file__).parent.resolve()
    sys.path.append(str(current_path.parent / "dependencies/nerfstudio/scripts"))
    from scripts.train import main
    
    from dataclasses import dataclass, field
    from pathlib import Path
    from typing import Type
    
    import tyro
    # from nerfstudio.cameras.camera_optimizers import CameraOptimizerConfig
    from nerfstudio.configs.base_config import Config
    from nerfstudio.data.datamanagers import VanillaDataManagerConfig
    from nerfstudio.data.dataparsers.blender_dataparser import BlenderDataParserConfig, Blender
    # from nerfstudio.data.dataparsers.friends_dataparser import FriendsDataParserConfig
    # from nerfstudio.data.dataparsers.nerfstudio_dataparser import NerfstudioDataParserConfig
    from nerfstudio.engine.optimizers import AdamOptimizerConfig, RAdamOptimizerConfig
    from nerfstudio.models.base_model import VanillaModelConfig
    from nerfstudio.models.vanilla_nerf import NeRFModel
    from nerfstudio.pipelines.base_pipeline import VanillaPipelineConfig
    # from nerfstudio.pipelines.dynamic_batch import DynamicBatchPipelineConfig
    from nerfstudio.configs.config_utils import convert_markup_to_ansi
    from nerfstudio.configs.method_configs import method_configs, descriptions
    from nerfstudio.data.dataparsers.base_dataparser import DataParserConfig
    
    @dataclass
    class TempDataParserConfig(DataParserConfig):
        _target: Type = field(default_factory=lambda: Blender)
        """target class to instantiate"""
        data: Path = Path("data/blender/lego")
        """Directory specifying location of data."""
        scale_factor: float = 1.0
        """How much to scale the camera origins by."""
        alpha_color: str = "white"
        """alpha color of background"""
    
    def entrypoint():
        """Entrypoint for use with pyproject scripts."""
        # Choose a base configuration and override values.
        tyro.extras.set_accent_color("bright_yellow")
    
        # Add hyperlight models to the method configs and descriptions
        descriptions["temp-model"] = "Temp-model"
        method_configs["temp-model"] = Config(
            method_name="vanilla-nerf",
            pipeline=VanillaPipelineConfig(
                datamanager=VanillaDataManagerConfig(
                    dataparser=TempDataParserConfig(),
                    # dataparser=BlenderDataParserConfig(),
                    train_num_images_to_sample_from = 8,
                    train_num_times_to_repeat_images = 4,
                    eval_num_images_to_sample_from = 2,
                    eval_num_times_to_repeat_images = 1,
                ),
                model=VanillaModelConfig(_target=NeRFModel),
            ),
            optimizers={
                "fields": {
                    "optimizer": RAdamOptimizerConfig(lr=5e-4, eps=1e-08),
                    "scheduler": None,
                }
            },
        )
    
        AnnotatedBaseConfigUnion = tyro.conf.SuppressFixed[  # Don't show unparseable (fixed) arguments in helptext.
        tyro.extras.subcommand_type_from_defaults(defaults=method_configs, descriptions=descriptions)
        ]
        main(
            tyro.cli(
                AnnotatedBaseConfigUnion,
                description=convert_markup_to_ansi(__doc__),
            )
        )
    
    
    if __name__ == "__main__":
        entrypoint()
    

    Here is the error it produces:

    Traceback (most recent call last):
      File "/host/scripts/train_temp.py", line 83, in <module>
        entrypoint()
      File "/host/scripts/train_temp.py", line 75, in entrypoint
        tyro.cli(
      File "/opt/conda/lib/python3.9/site-packages/tyro/_cli.py", line 125, in cli
        _cli_impl(
      File "/opt/conda/lib/python3.9/site-packages/tyro/_cli.py", line 275, in _cli_impl
        parser_definition = _parsers.ParserSpecification.from_callable_or_type(
      File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 99, in from_callable_or_type
        subparsers_attempt = SubparsersSpecification.from_field(
      File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 383, in from_field
        subparser = ParserSpecification.from_callable_or_type(
      File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 129, in from_callable_or_type
        nested_parser = ParserSpecification.from_callable_or_type(
      File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 129, in from_callable_or_type
        nested_parser = ParserSpecification.from_callable_or_type(
      File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 99, in from_callable_or_type
        subparsers_attempt = SubparsersSpecification.from_field(
      File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 357, in from_field
        assert default_name is not None
    AssertionError
    

    Would appreciate your help to achieve this functionality.

    opened by myaldiz 2
Releases
  • v0.3.37

Owner
  Brent Yi