Qcover
Overview
Qcover is an open source effort to help explore combinatorial optimization problems on Noisy Intermediate-Scale Quantum (NISQ) processors. It is developed by the quantum operating system team at the Beijing Academy of Quantum Information Sciences (BAQIS). Qcover supports fast computation of optimal parameters for shallow QAOA circuits, and can be used as a powerful tool to help NISQ processors demonstrate application-level quantum advantage.

Getting started

Use the following command to install Qcover:

pip install Qcover

or

git clone https://github.com/BAQIS-Quantum/Qcover
pip install -r requirements.yml
python setup.py install
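
After installing, you can confirm that the package imports correctly. This is just a minimal sanity check (not part of the original instructions); it only uses modules that appear in the examples below.

# Minimal sanity check that the installation succeeded.
from Qcover.core import Qcover
from Qcover.optimizers import COBYLA
print("Qcover imported successfully")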

More example code and tutorials can be found in the tests folder of the GitHub repository.

Examples

  1. Use the core algorithm module to generate a random weighted Ising graph and calculate its Hamiltonian expectation:
    from Qcover.core import Qcover
    from Qcover.backends import CircuitByQulacs
    from Qcover.optimizers import COBYLA

    # Generate a random weighted Ising graph with 6 nodes and 9 edges
    node_num, edge_num = 6, 9
    p = 1
    nodes, edges = Qcover.generate_graph_data(node_num, edge_num)
    g = Qcover.generate_weighted_graph(nodes, edges)

    # Run QAOA at depth p with the Qulacs backend and the COBYLA optimizer
    qulacs_bc = CircuitByQulacs()
    optc = COBYLA(options={'tol': 1e-3, 'disp': True})
    qc = Qcover(g, p=p, optimizer=optc, backend=qulacs_bc)
    res = qc.run()
    print("the result of the problem is:\n", res)
    qc.backend.visualization()
  2. Solve a specific binary combinatorial optimization problem by calculating the expectation value of the Hamiltonian of the circuit corresponding to the problem. For example, to solve a max-cut problem with Qcover, use the code below:
    from Qcover.core import Qcover
    from Qcover.backends import CircuitByQiskit
    from Qcover.optimizers import COBYLA
    from Qcover.applications.max_cut import MaxCut

    # Build a max-cut instance on a 3-regular graph with 6 nodes and map it to an Ising graph
    node_num, degree = 6, 3
    p = 1
    mxt = MaxCut(node_num=node_num, node_degree=degree)
    ising_g = mxt.run()

    # Run QAOA at depth p with the Qiskit statevector backend
    qiskit_bc = CircuitByQiskit(expectation_calc_method="statevector")
    optc = COBYLA(options={'tol': 1e-3, 'disp': True})
    qc = Qcover(ising_g, p=p, optimizer=optc, backend=qiskit_bc)
    res = qc.run()
    print("the result of the problem is:\n", res)
    qc.backend.visualization()
  3. If you want to customize an Ising weighted graph model and calculate its ground-state expectation with Qcover, you can use the following code:
    import networkx as nx
    from Qcover.core import Qcover
    from Qcover.backends import CircuitByTensor
    from Qcover.optimizers import COBYLA

    # Build a custom Ising weighted graph from (node, weight) and (node, node, weight) tuples
    ising_g = nx.Graph()
    nodes = [(0, 3), (1, 2), (2, 1), (3, 1)]
    edges = [(0, 1, 1), (0, 2, 1), (3, 1, 2), (2, 3, 3)]
    for nd in nodes:
        u, w = nd[0], nd[1]
        ising_g.add_node(int(u), weight=int(w))
    for ed in edges:
        u, v, w = ed[0], ed[1], ed[2]
        ising_g.add_edge(int(u), int(v), weight=int(w))

    # Run QAOA at depth p = 2 with the tensor-network backend
    p = 2
    optc = COBYLA(options={'tol': 1e-3, 'disp': True})
    ts_bc = CircuitByTensor()
    qc = Qcover(ising_g, p=p, optimizer=optc, backend=ts_bc)
    res = qc.run()
    print("the result of the problem is:\n", res)
    qc.backend.visualization()
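
Since Qcover targets fast optimization of shallow QAOA circuits, it is often useful to compare results at several circuit depths. The sketch below is not part of the original examples; it simply reuses the API from example 1 to run the same random graph at p = 1, 2, 3 (the exact contents of res depend on your Qcover version).

    # Illustrative sketch: scan several shallow depths p on the same random Ising graph,
    # reusing the API shown in example 1.
    from Qcover.core import Qcover
    from Qcover.backends import CircuitByQulacs
    from Qcover.optimizers import COBYLA

    node_num, edge_num = 6, 9
    nodes, edges = Qcover.generate_graph_data(node_num, edge_num)
    g = Qcover.generate_weighted_graph(nodes, edges)

    for p in (1, 2, 3):
        optc = COBYLA(options={'tol': 1e-3, 'disp': False})
        qc = Qcover(g, p=p, optimizer=optc, backend=CircuitByQulacs())
        res = qc.run()
        print("p =", p, "result:", res)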

How to contribute

For information on how to contribute, please send an e-mail to the developers of this project.

Please cite

When using Qcover for research projects, please cite

  • Wei-Feng Zhuang, Ya-Nan Pu, Hong-Ze Xu, Xudan Chai, Yanwu Gu, Yunheng Ma, Shahid Qamar, Chen Qian, Peng Qian, Xiao Xiao, Meng-Jun Hu, and Dong E. Liu, "Efficient Classical Computation of Quantum Mean Value for Shallow QAOA Circuits", arXiv:2112.11151 (2021).

Authors

The first release of Qcover was developed by the quantum operating system team at the Beijing Academy of Quantum Information Sciences.

Qcover is constantly growing and many other people have already contributed to it in the meantime.

License

Qcover is released under the Apache License 2.0.
