TCV-X21 validation for divertor turbulence simulations

A FAIR dataset of TCV experimental results for validating edge/divertor turbulence models.


Intro

Welcome to TCV-X21. We're glad you've found us!

This repository is designed to let you perform the analysis presented in Oliveira and Body et al., Nuclear Fusion, 2021, both using the data given in the paper and with a turbulence simulation of your own. We hope that, by providing the analysis, the TCV-X21 case can be used as a standard validation and benchmarking case for turbulence simulations of the divertor in fusion experiments. The repository allows you to scrutinise and suggest improvements to the analysis (there's always room for improvement), to directly interact with and explore the data in greater depth than is possible in a paper, and, we hope, to use this case to test a simulation of your own.

To use this repository, you'll need either the mybinder.org link below or user rights on a computer with Python 3, conda and git-lfs pre-installed.

Video tutorial

This quick tutorial shows you how to navigate the repository and use some of the functionality of the library.

Video_tutorial.mp4

What can you find in this repository

  • 1.experimental_data: data from the TCV experimental campaign, in NetCDF, MATLAB and IMAS formats, as well as information about the reference scenario and the reference magnetic geometry (in .eqdsk, IMAS and PARALLAX-nc formats); a quick loading sketch is given after this list
  • 2.simulation_data: data from simulations of the TCV-X21 case, in NetCDF format, as well as raw data files and conversion routines
  • 3.results: high resolution PNGs and LaTeX-ready tables for a paper
  • tcvx21: a Python library of software, which includes
    • record_c: a class to interface with NetCDF/HDF5 formatted data files
    • observable_c: a class to interact with and plot observables
    • file_io: tools to interact with MATLAB and JSON files
    • quant_validation: routines to perform the quantitative validation
    • analysis: statistics, curve-fitting, bootstrap algorithms, contour finding
    • units_m.py: setting up pint-based unit-aware analysis (it's difficult to overstate how cool this library is)
    • grillix_post: a set of routines used for post-processing GRILLIX simulation data, which might help if you're trying to post-process your own simulation. You can see a worked example in simulation_postprocessing.ipynb
  • notebooks: Jupyter notebooks, which allow us to provide code with outputs and comments together
    • simulation_setup.ipynb: what you might need to set up a simulation to test
    • simulation_postprocessing.ipynb: how to post-process the data
    • data_exploration.ipynb: some examples to get you started exploring the data
    • bulk_process.ipynb: runs over every observable to make the results — which you'll need to do if you're writing a paper from the results
  • tests: tests to make sure that we haven't broken anything in the analysis routines
  • README.md: this file, which helps you get the software up and running and explains where you can find everything you need. It also provides the details of the licensing (below). There are more specific README.md files in several of the subfolders.

and lots more files. If you're not a developer, you can safely ignore these.
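If you want a first look at the data before picking up the tcvx21 library itself, the NetCDF files can be opened directly with standard tools. The following is a minimal sketch using xarray and pint, not part of the repository's own API; the file name and the handling of the units attribute are illustrative assumptions, so check 1.experimental_data for the actual files and their structure.

```python
# Minimal sketch (not part of the tcvx21 API): open one of the NetCDF data
# files with xarray and attach physical units with pint. The file path and
# the "units" attribute handling are assumptions for illustration -- check
# 1.experimental_data for the actual file names and structure.
import xarray as xr
import pint

ureg = pint.UnitRegistry()

# Lazily open a dataset and print its dimensions, coordinates and variables
ds = xr.open_dataset("1.experimental_data/TCV_experimental_reference.nc")  # hypothetical filename
print(ds)

# Wrap the first data variable in a pint Quantity, using whatever unit
# string is stored in its NetCDF attributes (falling back to dimensionless)
name = list(ds.data_vars)[0]
var = ds[name]
values = var.values * ureg(var.attrs.get("units", "dimensionless"))
print(name, values.units, values.shape)
```

For anything beyond a first look, the data_exploration.ipynb notebook and the observable_c class are the intended entry points.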

What can't you find in this repository

Due to licensing issues, the source code of the simulations is not provided. Sorry!

Also, the raw simulations are not provided here due to space limitations (some runs have more than a terabyte of data), but they are all backed up on archive servers. If you'd like to access the raw data, get in contact.

License and attribution notice

The TCV-X21 datasets are licensed under a Creative Commons Attribution 4.0 licence, given in LICENCE. The source code of the analysis routines and the tcvx21 Python library is licensed under an MIT licence, given in tcvx21/LICENCE.

For the datasets, we ask that you provide attribution when using this data, via the citation given in the CITATION.cff file. We additionally require that you mark any changes to the dataset, and state specifically that the authors do not endorse your work unless such endorsement has been expressly given.

For the software, you can use, modify and share it without attribution or marking changes.

Running the Jupyter notebooks (installation as non-root user)

To run the Jupyter notebooks, you have two options. The first is to use the mybinder.org interface, which lets you interact with the notebooks via a web interface. You can launch the binder for this repository by clicking the binder badge in the repository header. Note that not all of the repository content is copied to the Docker image (this is specified in .dockerignore). The large checkpoint files are not included in the image, although they can be found in the repository at 2.simulation_data/GRILLIX/checkpoints_for_1mm. Additionally, the default Docker image will not work with git.

Alternatively, if you'd like to run the notebooks locally or to extend the repository, you'll need to install additional Python packages. First of all, you need Python 3 and conda installed (latest versions recommended). Then, to install the necessary packages, we make a sandbox environment. This has a few advantages over installing packages globally: sudo rights are not required, you can install specific package versions without risking breaking other Python scripts, and if everything goes terribly wrong you can simply delete the environment and restart. We've included a simple shell script to perform the necessary steps, which you can execute with

./install_env.sh

This will install the conda environment in a subfolder of the TCV-X21 repository called tcvx21_env. It will also add a kernel to your global Jupyter installation. To remove the environment, delete the tcvx21_env folder and run jupyter kernelspec uninstall tcvx21.

To run tests and open Jupyter

Once you've installed via either option, you can activate the python environment with conda activate ./tcvx21_env. To deactivate, run conda deactivate.

Then, it is recommended to run the test suite with pytest, which ensures that everything is installed and working correctly. If something fails, let us know in the issues. Note that this executes all of the analysis notebooks, so it might take a while to run.

Finally, run jupyter lab to open a Jupyter server in the TCV-X21 repository. Then, you can open any of the notebooks (.ipynb extension) by clicking them in the sidebar.

A note on pinned dependencies

To ensure that the results are reproducible, the environment.yml file has pinned dependencies. However, if you want to use this software as a library, pinned dependencies are unnecessarily restrictive. You can remove the version constraints after the = sign in environment.yml (for example, change numpy=<pinned version> to just numpy), but be warned that things might break.
