Simulating Sycamore quantum circuits classically using a tensor network algorithm.

Overview

Simulating the Sycamore quantum supremacy circuit

This repo contains the data we have obtained by simulating the Sycamore quantum supremacy circuits with $n=53$ qubits and $m=20$ cycles, using the tensor network method proposed in arXiv:2103.03074.

We plan to release the code soon.

Explanation of data

  1. data/circuit_n53_m20_s0_e0_pABCDCDAB.py is the circuit file, downloaded from Google's data repository for the Sycamore circuits.
  2. data/bipartition_n53_m20_s0_ABCD_s24_simplify_.txt is the initial bipartition of the simplified tensor network corresponding to the Sycamore circuit with 53 qubits, 20 cycles, seed 0, elide 0, and the ABCDCDAB sequence. The file has two lines: the first line gives the tail partition, which includes the 21 open qubits, while the second line gives the head partition, which includes the 32 closed qubits. The simplification of the tensor network is done by sequentially contracting tensors with 2 or fewer dimensions.
  3. data/n53_m20_s0_ABCD_s24_simplify_gpulimit_30_edges.txt contains the 23 slicing edges that split the overall contraction task into $2^{23}$ subtasks, each of which has space complexity $2^{30}$ and hence can be contracted within 32 GB of memory (see the slicing sketch after this list).
  4. data/n53_m20_s0_ABCD_s24_simplify_gpulimit_30_ordernew.txt includes the contraction order. For each edge $(i, j)$ in the contraction order, the $i$-th and $j$-th tensors in the head partition are contracted by tracing out their shared indices, and the resulting tensor is put back into the $i$-th position (see the contraction sketch after this list).
  5. vector.pt contains the cut tensor of the head partition, whose overall dimension is $2^{23}$, together with annotations of the corresponding dimensions. The file is saved with PyTorch; use torch.load to load the data (a loading example follows the list).
  6. The obtained $2^{21}$ samples for the Sycamore circuits with $n=53$ qubits and $m=20$ cycles, together with their probabilities and amplitudes, are listed in the probs.txt file. Notice that the configuration assigned to all closed qubits is fixed to $\underbrace{0,0,0,\cdots,0}_{32}$, and the open qubit ids are 11, 12, 13, 19, 20, 21, 22, 23, 28, 29, 30, 31, 32, 37, 38, 39, 40, 41, 44, 45, 46.
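
Until the code is released, the following minimal sketch illustrates how the 23 slicing edges could be enumerated into $2^{23}$ independent subtasks. The edge-file format and the contract_subtask routine are assumptions made for illustration, not part of the released data or code.

```python
# Hypothetical sketch of enumerating the 2^23 sliced subtasks; the edge-file
# parsing and the per-subtask contraction are assumptions, not the released code.
from itertools import product


def read_sliced_edges(path):
    """Read one sliced edge per non-empty line (whitespace-separated fields assumed)."""
    with open(path) as f:
        return [tuple(line.split()) for line in f if line.strip()]


def contract_subtask(edges, assignment):
    """Placeholder: contract the network with all sliced edges fixed to the
    given 0/1 assignment; each such subtask has space complexity 2^30."""
    pass


if __name__ == "__main__":
    edges = read_sliced_edges("data/n53_m20_s0_ABCD_s24_simplify_gpulimit_30_edges.txt")
    assert len(edges) == 23
    for assignment in product((0, 1), repeat=len(edges)):
        # Each 0/1 assignment of the 23 sliced edges defines one independent subtask.
        contract_subtask(edges, assignment)
```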
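
The pairwise contraction step described in item 4 can be sketched as follows, assuming the head-partition tensors are kept in a dictionary mapping position to a (tensor, index labels) pair. This is an illustrative sketch, not the released implementation.

```python
# Illustrative sketch of applying the contraction order: contract the i-th and
# j-th head-partition tensors over their shared indices and store the result at
# the i-th position. The (tensor, labels) bookkeeping is an assumption.
import torch


def contract_pair(tensors, i, j):
    """tensors: dict mapping position -> (torch.Tensor, list of index labels)."""
    t_i, labels_i = tensors[i]
    t_j, labels_j = tensors[j]
    shared = [lab for lab in labels_i if lab in labels_j]
    axes_i = [labels_i.index(lab) for lab in shared]
    axes_j = [labels_j.index(lab) for lab in shared]
    result = torch.tensordot(t_i, t_j, dims=(axes_i, axes_j))
    kept = [lab for lab in labels_i if lab not in shared] + \
           [lab for lab in labels_j if lab not in shared]
    tensors[i] = (result, kept)  # the result goes back to the i-th position
    del tensors[j]


def contract_in_order(tensors, order):
    """order: list of (i, j) pairs read from the order file."""
    for i, j in order:
        contract_pair(tensors, i, j)
    return tensors
```

Here torch.tensordot performs the trace over the shared indices; a full implementation would also need to account for the slicing assignments and the $2^{30}$ space limit.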
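
The released files themselves can be inspected directly: torch.load is the loader mentioned in item 5, while the parsing of probs.txt below is only a guess at its layout.

```python
# Load the cut tensor of the head partition (overall dimension 2^23) together
# with its dimension annotations, as described in item 5.
import torch

data = torch.load("vector.pt", map_location="cpu")
print(type(data))  # inspect the stored object before using it

# Peek at the samples file from item 6; the exact column layout is assumed.
with open("probs.txt") as f:
    first = f.readline().split()
    print(first)  # expected: a bitstring together with its amplitude and probability
```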

Notice

We noticed that our paper arXiv:2103.03074 has a misprint in the first row of Tab. III, where "amplitude" should read "|amplitude|". Nevertheless, we provide the corrected table below.

[Corrected Table III of arXiv:2103.03074]

The $2^{21}$ bitstrings with amplitudes and probabilities can be downloaded here.

Owner

Feng Pan, PhD candidate in theoretical physics, with a personal interest in learning theory via statistical physics approaches.