HODEmu

HODEmu is both an executable and a Python library, based on Ragagnin et al. 2021 (in prep.), that emulates satellite abundance as a function of the cosmological parameters Omega_m, Omega_b, sigma_8, h_0 and of redshift.

The emulator is trained on the satellite abundance of Magneticum Box1a/mr simulations spanning 15 cosmologies (see Table 1 of the paper), using all satellites above a stellar mass cut of M* > 2x10^11 M_sun. Use Eq. 3 of the paper to rescale the result to a stellar mass cut of 10^10 M_sun.

The emulator has been trained with scikit-learn GPR (Gaussian process regression); however, the class implemented in hod_emu.py is a standalone port and does not need sklearn to be installed.

Figure: average satellite abundance for two Magneticum Box1a/mr simulations, from Ragagnin et al. 2021.

TOC:

- Install
- Example 1: Obtain normalisation, log-slope and Gaussian scatter of the Ns-M relation
- Example 2: Produce a mock catalog of galaxies

Install

You can either (1) download the files hod_emu.py and _hod_emu_sklearn_gpr_serialized.py, or (2) install it with python -mpip install git+https://github.com/aragagnin/HODEmu. The package depends only on scipy. The file hod_emu.py can be executed from your command line interface by running ./hod_emu.py in the installation folder.

Check this ipython notebook for a guided tour of the Python interface: https://github.com/aragagnin/HODEmu/blob/main/examples.ipynb

Example 1: Obtain normalisation, log-slope and Gaussian scatter of the Ns-M relation

The following command will output, respectively, the normalisation A, the log-slope \beta, the log-scatter \sigma, and the respective standard deviations from the emulator. Since the emulator has been trained on the residuals of the power-law dependency in Eq. 6, the errors are, respectively, the standard deviations on log-A, log-\beta and log-\sigma. Note that --delta can only be 200c or vir, as the paper emulates only these two overdensities.

 ./hod_emu.py  200c  .27  .04   0.8  0.7   0.0 #overdensity omega_m omega_b sigma8 h0 redshift
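
The three parameters A, \beta and \sigma describe the Ns-M relation used throughout the examples below, roughly of the form N_s(M) = A (M / 5\cdot10^{14} M_\odot)^\beta with scatter \sigma (the pivot mass is inferred from the example code below; see the paper for the exact definition and for how the scatter enters).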

Below we use hod_emu as a Python library to plot the Ns-M relation. First we use hod_emu.get_emulator_m200c() to obtain an instance of the Emulator class trained on Delta_200c, and then the function emu.predict_A_beta_sigma(input) to retrieve A, \beta and \sigma.

Note that input can contain a number N of data points (in this example only one), so it is an N x 5 numpy array and the return value is an N x 3 numpy array. The parameter emulator_std=True will also return an N x 3 numpy array with the corresponding emulator standard deviations.

import numpy as np
import matplotlib.pyplot as plt
import hod_emu
Om0, Ob0, s8, h0, z = 0.3, 0.04, 0.8, 0.7, 0.9

input = [[Om0, Ob0, s8, h0, 1./(1.+z)]] #the input must be a 2d array because you can feed an array of data points

emu = hod_emu.get_emulator_m200c() # use get_emulator_mvir to obtain the emulator within Delta_vir

A, beta, sigma  =  emu.predict_A_beta_sigma(input).T #the function outputs a 1x3 matrix 

masses = np.logspace(14.5,15.5,20)
Ns = A*(masses/5e14)**beta 

plt.plot(masses,Ns)
plt.fill_between(masses, Ns*(1.-sigma), Ns*(1.+sigma),alpha=0.2)
plt.xlabel(r'$M_{\rm{halo}}$')
plt.ylabel(r'$N_s$')
plt.title(r'$M_\bigstar > 2\cdot10^{11} M_\odot$ and $\Delta_{200c}$')
plt.xscale('log')
plt.yscale('log')

params_tuple, stds_tuple  =  emu.predict_A_beta_sigma(input, emulator_std=True) # here we also ask for the emulator standard deviations

A, beta, sigma = params_tuple.T
error_logA, error_logbeta, error_logsigma = stds_tuple.T

print('A: %.3e, log-std A: %.3e'%(A[0], error_logA[0]))
print('B: %.3e, log-std beta: %.3e'%(beta[0], error_logbeta[0]))
print('sigma: %.3e, log-std sigma: %.3e'%(sigma[0], error_logsigma[0]))

This will show the following figure:

Ns-M relation produced by HODEmu

And print the following output:

A: 1.933e+00, log-std A: 1.242e-01
B: 1.002e+00, log-std beta: 8.275e-02
sigma: 6.723e-02, log-std sigma: 2.128e-01
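
As noted above, the input can contain several data points; here is a minimal sketch (not part of the original README, and assuming the emulator accepts a numpy array in place of the list of lists used above) that evaluates the emulator on a small grid of redshifts in a single call:

zs = np.array([0.0, 0.5, 1.0])
# each row: [Omega_m, Omega_b, sigma_8, h_0, a = 1/(1+z)]
grid = np.column_stack([np.full_like(zs, Om0),
                        np.full_like(zs, Ob0),
                        np.full_like(zs, s8),
                        np.full_like(zs, h0),
                        1./(1.+zs)])            # shape (3, 5)
params = emu.predict_A_beta_sigma(grid)         # shape (3, 3): columns A, beta, sigma
for zi, (A_i, beta_i, sigma_i) in zip(zs, params):
    print('z=%.1f  A=%.3f  beta=%.3f  sigma=%.3f' % (zi, A_i, beta_i, sigma_i))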

Example 2: Produce a mock catalog of galaxies

In this example we use the package hmf to produce a mock catalog of halo masses. Note that the mock number of satellites is drawn from a Gaussian distribution with a cut on negative values (see Eq. 5 of the paper), hence the helper function non_neg_normal_sample.

import hmf.helpers.sample
import scipy.stats
# numpy (np), matplotlib (plt), the emulator `emu` and the cosmological parameters are reused from Example 1

# draw 400 halo masses with log10(M) between 14 and 17 from a Press-Schechter mass function
masses = hmf.helpers.sample.sample_mf(400, 14.0, hmf_model="PS", Mmax=17, sort=True)[0]
    
def non_neg_normal_sample(loc, scale, max_iters=1000):
    "Given numpy arrays of loc and scale, sample a normal distribution truncated to non-negative values."
    vals = scipy.stats.norm.rvs(loc=loc, scale=scale)
    mask_negative = vals < 0.
    if np.any(mask_negative) and max_iters > 0:
        # re-draw the negative entries until they are all non-negative
        vals[mask_negative] = non_neg_normal_sample(loc[mask_negative], scale[mask_negative], max_iters=max_iters - 1)
    if np.any(vals < 0.):
        raise Exception("non_neg_normal_sample failed to provide positive-normal samples")
    return vals

A, beta, logscatter = emu.predict_A_beta_sigma([[Om0, Ob0, s8, h0, 1./(1.+z)]])[0] # input must be 2d; take the first (and only) row of the N x 3 output

Ns = A*(masses/5e14)**beta

modelmu = non_neg_normal_sample(loc=Ns, scale=logscatter*Ns) # scattered mean number of satellites per halo
modelmock = scipy.stats.poisson.rvs(modelmu)                 # Poisson-sample the actual satellite counts

plt.fill_between(masses, Ns *(1.-logscatter), Ns *(1.+logscatter), label='Ns +/- log scatter from Emu', color='black',alpha=0.5)
plt.scatter(masses, modelmock , label='Ns mock', color='orange')
plt.plot(masses, Ns, label=r'$N_s$ from Emu', color='black')
plt.ylim([0.1,100.])
plt.xscale('log')
plt.yscale('log')
plt.xlabel(r'$M_{\rm {halo}} [M_\odot]$')
plt.ylabel(r'$N_s$')
plt.title(r'$M_\bigstar > 2\cdot10^{11} M_\odot$ and $\Delta_{200c}$')

plt.legend();

This will show the following figure:

Mock catalog of halos and satellite abundance produced by HODEmu
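
As a quick sanity check (a sketch not in the original README, using only numpy and the variables defined above), one can compare the binned average of the mock satellite counts with the emulator mean relation:

bins = np.logspace(14.0, 16.0, 9)
centers = np.sqrt(bins[:-1] * bins[1:])   # geometric bin centres
idx = np.digitize(masses, bins)
mock_mean = np.array([modelmock[idx == i].mean() if np.any(idx == i) else np.nan
                      for i in range(1, len(bins))])
emu_mean = A * (centers / 5e14)**beta
print(np.c_[centers, mock_mean, emu_mean])  # the mock mean should track the emulator mean within the scatter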
