An Efficient Implementation of Analytic Mesh Algorithm for 3D Iso-surface Extraction from Neural Networks

Overview

AnalyticMesh

Analytic Marching is an exact meshing solution for deep implicit surface networks. Compared with standard sampling-based methods, it completely avoids the geometric and topological errors caused by insufficient sampling, thanks to a mathematically guaranteed analytic derivation.

This repository provides an implementation of the Analytic Marching algorithm, which was initially proposed in our conference paper Analytic Marching: An Analytic Meshing Solution from Deep Implicit Surface Networks and later improved in our journal paper Learning and Meshing from Deep Implicit Surface Networks Using an Efficient Implementation of Analytic Marching.

Our code provides web pages for manipulating your models through a graphical interface, as well as a backend that gives full control of the algorithm from Python code.

Installation

First, download the code:

git clone https://github.com/Karbo123/AnalyticMesh.git --depth=1
cd AnalyticMesh
export AMROOT=`pwd`

Backend

The backend provides a Python binding of analytic marching. After compiling it, you can call the algorithm from simple Python code in your own project.

Our implementation supports PyTorch and may also work with other deep learning frameworks (e.g. TensorFlow), but we have not tested other frameworks yet.

Requirements:

Compilation:

cd $AMROOT/backend
mkdir build && cd build
cmake ..
make -j8
cd ..

If your PyTorch version is lower than 1.5.1, you may need to fix a C++ extension compilation failure on some environments.

Make sure the compiled library passes the tests. Run:

CUDA_VISIBLE_DEVICES=0 PYTHONDONTWRITEBYTECODE=1 pytest -s -p no:warnings -p no:cacheprovider

This generates some files under the folder $AMROOT/backend/tmp. The generated meshes (.ply) should generally be watertight; you can check them with MeshLab.

If all tests pass, create a symlink so that Python can find the package:

ln -s $AMROOT `python -c 'import site; print(site.getsitepackages()[0])'`
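
To quickly verify that Python can now find the package, you can run a minimal import check (a simple sanity check, assuming the compilation and symlink above succeeded):

import AnalyticMesh
print(AnalyticMesh.__file__)  # should point into $AMROOT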

Frontend

We also provide an easy-to-use interactive interface that applies analytic marching to your input network model with just a few mouse clicks. To use the web interface, follow the steps below to install it.

Requirement:

Before compiling, you may need to modify the server information given in the file frontend/pages/src/assets/index.js. Then compile the pages by running:

cd $AMROOT/frontend/pages
npm install
npm run build

The $AMROOT/frontend/pages/dist directory is then ready to be deployed. If you want to deploy the web pages to a server, please additionally follow these instructions.

To start the server, simply run:

cd $AMROOT/frontend && python server.py

You can open the interface either by opening the file $AMROOT/frontend/pages/dist/index.html on your local machine or by visiting the URL at which the page is deployed.

Demo

We provide some examples in $AMROOT/examples; feel free to try them.

Here is a simple example (from $AMROOT/examples/2_polytope.py):

import os
import torch
from AnalyticMesh import save_model, load_model, AnalyticMarching

class MLPPolytope(torch.nn.Module):
    def __init__(self):
        super(MLPPolytope, self).__init__()
        self.linear0 = torch.nn.Linear(3, 14)
        self.linear1 = torch.nn.Linear(14, 1)
        with torch.no_grad(): # here we give the weights explicitly since training takes time
            weight0 = torch.tensor([[ 1,  1,  1],
                                    [-1, -1, -1],
                                    [ 0,  1,  1],
                                    [ 0, -1, -1],
                                    [ 1,  0,  1],
                                    [-1,  0, -1],
                                    [ 1,  1,  0],
                                    [-1, -1,  0],
                                    [ 1,  0,  0],
                                    [-1,  0,  0],
                                    [ 0,  1,  0],
                                    [ 0, -1,  0],
                                    [ 0,  0,  1],
                                    [ 0,  0, -1]], dtype=torch.float32)
            bias0 = torch.zeros(14)
            weight1 = torch.ones([14], dtype=torch.float32).unsqueeze(0)
            bias1 = torch.tensor([-2], dtype=torch.float32)

            add_noise = lambda x: x + torch.randn_like(x) * (1e-7)
            self.linear0.weight.copy_(add_noise(weight0))
            self.linear0.bias.copy_(add_noise(bias0))
            self.linear1.weight.copy_(add_noise(weight1))
            self.linear1.bias.copy_(add_noise(bias1))

    def forward(self, x):
        return self.linear1(torch.relu(self.linear0(x)))


if __name__ == "__main__":
    #### save onnx
    DIR = os.path.dirname(os.path.abspath(__file__)) # the directory to save files
    onnx_path = os.path.join(DIR, "polytope.onnx")
    save_model(MLPPolytope(), onnx_path) # we save the model as onnx format
    print(f"we save onnx to: {onnx_path}")

    #### save ply
    ply_path = os.path.join(DIR, "polytope.ply")
    model = load_model(onnx_path) # load as a specific model
    AnalyticMarching(model, ply_path) # do analytic marching
    print(f"we save ply to: {ply_path}")

API

We mainly provide the following two ways to use analytic marching:

  • Web interface (provides an easy-to-use graphic interface)
  • Python API (gives more detailed control)

  1. Web interface

    You should compile both the backend and frontend to use this web interface. Its usage is detailed in the user guide on the web page.

  2. Python API

    It's very simple to use: just three lines of code.

    from AnalyticMesh import load_model, AnalyticMarching 
    model = load_model(load_onnx_path) 
    AnalyticMarching(model, save_ply_path)

    If the results are not satisfactory, you may need to change the default arguments of the AnalyticMarching function.

    To obtain an onnx model file, you can just use the save_model function we provide.

    from AnalyticMesh import save_model
    save_model(your_custom_nn_module, save_onnx_path)
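
    Putting the two pieces together, a minimal end-to-end run might look as follows (MyNetwork and the file paths are placeholders for your own torch.nn.Module and output locations):

    from AnalyticMesh import save_model, load_model, AnalyticMarching

    save_model(MyNetwork(), "model.onnx")  # export your torch.nn.Module to onnx
    model = load_model("model.onnx")       # load it back as an internal model
    AnalyticMarching(model, "mesh.ply")    # run analytic marching and save the mesh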

Some tips:

  • It is highly recommended that you try dichotomy first as the initialization method.
  • If CUDA runs out of memory, try setting voxel_configs; it will partition the space and solve the sub-regions serially.
  • More details are commented in our source codes.

Use Analytic Marching in your own project

There are generally three ways to use Analytic Marching.

  1. Directly representing a single shape with a multi-layer perceptron. For a single object, you can represent the shape as one network, for example by directly fitting a point cloud with a multi-layer perceptron. In this way, the weights of the network uniquely determine the shape.
  2. Generating the weights of the multi-layer perceptron from a hyper-network. To learn from multiple shapes, one can use a hyper-network to generate the weights of the multi-layer perceptron in a learnable manner.
  3. Re-parameterizing the latent code into the bias of the first layer. To learn from multiple shapes, we can condition the network on a latent code concatenated to the coordinate input of the first layer (e.g. 3+256 -> 512 -> 512 -> 1). Note that the concatenated latent code can be re-parameterized and folded into the bias of the first layer: the computation of the first layer can be rewritten as W [x; z] + b = W_x x + (W_z z + b), where x is the coordinate, z is the latent code, and the newly computed bias is b' = W_z z + b. See the sketch after this list.
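
As a concrete illustration of the third way, the sketch below folds a latent code into the bias of the first layer so that the resulting network takes only the 3D coordinate as input. This is a minimal sketch, assuming the first layer receives the concatenation [coordinate, latent code]; the function and variable names are our own and not part of the AnalyticMesh API.

import torch

def fold_latent_into_bias(first_layer, latent_code, coord_dim=3):
    # first_layer: torch.nn.Linear(coord_dim + latent_dim, hidden_dim)
    # latent_code: tensor of shape (latent_dim,)
    # returns a torch.nn.Linear(coord_dim, hidden_dim) with the code absorbed into its bias
    W = first_layer.weight.data                      # shape: (hidden_dim, coord_dim + latent_dim)
    b = first_layer.bias.data                        # shape: (hidden_dim,)
    W_x, W_z = W[:, :coord_dim], W[:, coord_dim:]
    new_layer = torch.nn.Linear(coord_dim, W.shape[0])
    with torch.no_grad():
        new_layer.weight.copy_(W_x)                  # weights acting on the coordinate are unchanged
        new_layer.bias.copy_(b + W_z @ latent_code)  # new bias b' = b + W_z z
    return new_layer

After this re-parameterization, the shape associated with a given latent code is represented by a plain coordinate-only multi-layer perceptron, which can then be exported with save_model and meshed as in the examples above.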

About

This repository is mainly maintained by Jiabao Lei (backend) and Yongyi Su (frontend). If you have any questions, feel free to create an issue on GitHub.

If you find our work useful, please consider citing our papers.

@inproceedings{Lei2020,
    title     = {Analytic Marching: An Analytic Meshing Solution from Deep Implicit Surface Networks},
    author    = {Jiabao Lei and Kui Jia},
    booktitle = {International Conference on Machine Learning 2020 {ICML-20}},
    year      = {2020},
    month     = {7}
}

@misc{Lei2021,
    title         = {Learning and Meshing from Deep Implicit Surface Networks Using an Efficient Implementation of Analytic Marching},
    author        = {Jiabao Lei and Kui Jia and Yi Ma},
    year          = {2021},
    eprint        = {2106.10031},
    archivePrefix = {arXiv},
    primaryClass  = {cs.CV}
}

Contact: [email protected]
