
LuniBot for Discord

Displays market info for the LUNI/LUNA token on the Terra Blockchain (web-scrape method, currently scraping CoinMarketCap). Will evolve over time :)

Install

--Install Dependencies

pip3 install bs4 discord python-dotenv flask requests

--Add a .env file to the repository and add the following line within it:

token = REPLACEWITHTOKEN <---- your Discord bot token, obtained from the Discord Developer Portal (https://discord.com/developers/applications). DO NOT SHARE THIS TOKEN WITH ANYONE

Set up your bot at the link above, then click "Bot" in the left-hand menu to find the "Token" section underneath the Username field.

Copy that token and use it to replace REPLACEWITHTOKEN in the line above.

This token is effectively a private key to your bot, so make sure not to leave it anywhere someone you don't trust can get it. They could potentially mess with your server.
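For reference, here is a minimal sketch (assuming the python-dotenv package) of how main.py might read the token from that .env file; the error message and variable names other than "token" are just illustrative:

import os
from dotenv import load_dotenv   # provided by the python-dotenv package

load_dotenv()                    # reads key=value pairs from .env in the working directory
TOKEN = os.getenv("token")       # same key name used in the .env file above

if TOKEN is None:
    raise RuntimeError("Discord token not found - check your .env file")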

Add Discord Bot to your server

Click "OAuth2" on the left menu on the Discord Developer Portal

Go to the second section, "URL Generator"

Assign the permissions you want the bot to have. Make sure you know what you're giving it access to, as too much control can cause problems.

Copy the generated link at the bottom of the page; it should look something like "https://discord.com/api/oauth2/authorize?client_id=??????????".

Open that link in a new tab and select the server you want the bot to join (you will have to confirm some permissions first), and your bot will then be connected.

It will show as offline until you finish setup and run the program.

Finished

And just like that, you should be good. As long as you have the necessary dependencies installed, you should see a smooth startup when running main.py in VS Code (or your preferred editor).

Now go ahead and type "$luni" into a channel the bot is in (and online) and watch it post the top market info for LUNI.

I also added a print call that logs the price to the console whenever it is requested from Discord.
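For orientation, here is a rough sketch of what a "$luni" handler could look like with discord.py 2.x, requests, and BeautifulSoup. The CoinMarketCap URL and the CSS selector below are illustrative placeholders, not necessarily what main.py actually uses:

import os
import requests
from bs4 import BeautifulSoup
from dotenv import load_dotenv
import discord
from discord.ext import commands

load_dotenv()

intents = discord.Intents.default()
intents.message_content = True   # needed for prefix commands on discord.py 2.x
bot = commands.Bot(command_prefix="$", intents=intents)

LUNI_URL = "https://coinmarketcap.com/currencies/luni/"   # placeholder URL

def fetch_luni_price():
    page = requests.get(LUNI_URL, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")
    price_tag = soup.select_one("span.priceValue")   # placeholder selector; inspect the live page
    return price_tag.get_text(strip=True) if price_tag else "price not found"

@bot.command(name="luni")
async def luni(ctx):
    price = fetch_luni_price()
    print(f"LUNI price requested: {price}")   # console print mentioned above
    await ctx.send(f"LUNI price: {price}")

bot.run(os.getenv("token"))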

Tips

**This bot will have access to all your channels, so if you want to limit the command to one channel, I recommend setting permissions for the bot's role (it is automatically named after whatever you called the bot in the Discord Developer Portal) so it cannot view the channels you don't want it replying in.

**If you ever do have your token compromised (i.e., you push to GitHub with your token visible to anybody who reads your code), there is usually no cause for concern: Discord will automatically let you know and invalidate the token. This does mean, however, that you must grab the new token and re-enter it in your local code for the bot to keep working (after rerunning main.py, of course).

**Lastly, I do recommend enabling "slow-mode" or otherwise limiting how often the command can be used, since the more info it pulls, the longer each request takes. I am currently looking into making the pull far more efficient so the program spends less effort accessing the data. (There is also a small bot-side cooldown sketch after these tips.)

**This bot is super simple, so feel free to mess with it, add to it, or change it. It serves as a template, since there aren't many solid ones out there for this sort of thing. Hopefully this helps with that!
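As one bot-side alternative to slow-mode (building on the sketch above, and assuming discord.py's commands extension), the built-in cooldown decorator can cap how often the command runs; the 30-second window here is just an example value:

from discord.ext import commands

@bot.command(name="luni")
@commands.cooldown(1, 30, commands.BucketType.user)   # 1 use per user every 30 seconds
async def luni(ctx):
    await ctx.send(f"LUNI price: {fetch_luni_price()}")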

PS: Added the logo for the LuniBot avatar :)

Let me know any issues or suggestions! I'm not an expert by any means, so I don't expect this to work flawlessly :P
