Automated network configuration backups using GitHub Actions and git-scraping

Overview

Network Config Scraper

This repository demonstrates the use of GitHub Actions and git-scraping to build an automated backup solution for network configuration files. Git already provides an efficient way to track and manage changes to textual data, and GitHub Actions provides automation that we can use to fetch and process configuration backups without relying on any additional infrastructure. The solution in this repository uses both to retrieve configurations from network devices on a defined schedule and commit any detected changes back to the repository.

This approach is heavily inspired by "Git scraping: track changes over time by scraping to a Git repository" by Simon Willison. His post provides an excellent overview of git-scraping data from various sources on the Internet.

(diagram: config scraper)

Could I use this?

Because of the textual nature of network configurations, git-scraping offers a simple yet effective way to back up and version configuration data. It runs entirely on GitHub Actions, so there's no complex infrastructure or orchestration to manage.

Because we retrieve data from devices in a private lab, the repository uses a self-hosted runner that connects to GitHub and has access to the lab. You'd have to decide whether this is an acceptable model for your environment.

That said, the solution works well and can be extended to fit into your existing automation. For example, GitHub webhooks let you build integrations with external automation systems that subscribe to specific repository events; a configured event can then trigger an HTTP POST to a system like Ansible Tower to kick off additional automation.
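As a rough illustration (not part of this repository), a minimal webhook receiver that verifies GitHub's payload signature and forwards push events to a downstream automation endpoint might look like the sketch below; the port, WEBHOOK_SECRET, and AUTOMATION_URL values are placeholders you would replace with your own.

# Minimal sketch of a GitHub webhook receiver (stdlib only). The port,
# WEBHOOK_SECRET, and AUTOMATION_URL are placeholders, not values from this repo.
import hashlib
import hmac
import json
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

WEBHOOK_SECRET = os.environ.get("WEBHOOK_SECRET", "change-me")
AUTOMATION_URL = os.environ.get("AUTOMATION_URL", "https://automation.example.com/trigger")


class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))

        # Verify the HMAC signature GitHub attaches to every delivery
        signature = self.headers.get("X-Hub-Signature-256", "")
        expected = "sha256=" + hmac.new(WEBHOOK_SECRET.encode(), body, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(signature, expected):
            self.send_response(401)
            self.end_headers()
            return

        # Forward push events (new config commits) to the downstream system
        if self.headers.get("X-GitHub-Event") == "push":
            payload = json.loads(body)
            summary = {"commits": [c["id"] for c in payload.get("commits", [])]}
            request = urllib.request.Request(
                AUTOMATION_URL,
                data=json.dumps(summary).encode(),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(request)

        self.send_response(204)
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()

In practice you would point a repository webhook (Settings > Webhooks) at a receiver like this and let it decide which events are worth forwarding to your automation platform.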

How does it work?

The configuration scraper is configured and scheduled in the .github/workflows/scrape.yml action workflow. It's a short and simple workflow that defines all the triggers and steps to run our automation.

trigger

The workflow can be triggered in three different ways: on a push event to the repo, manually via workflow_dispatch, or, most importantly, on a cron schedule. In the snippet below, the workflow is scheduled to run every 30 minutes.

on:
  push:
    branches:
      - main
  workflow_dispatch:
  schedule:
    - cron:  '*/30 * * * *'

self-hosted runner(s)

Because our use case differs slightly from scraping flat files on the Internet, we need to account for how GitHub reaches the devices we're fetching data from. The devices in inventory.yml sit in a lab behind a firewall, so a public GitHub Actions runner would not have the required access. For that reason, the repo is configured to use a self-hosted GitHub Actions runner that has access to the lab environment. Provisioning and configuring a runner in this environment took about 15 minutes, so it is a relatively easy and painless process. Once configured, the self-hosted runner opens a connection to GitHub and listens for job requests to execute your actions.

Within the action workflow, the only things to specify are that we're using a self-hosted runner and the labels that identify which runner to pick. The runs-on directive handles both, as shown below; the job will only run on a runner that carries all of the listed labels.

jobs:
  scheduled:
    runs-on: [self-hosted, atc-runners]

data

For data gathering, the retrieve_configs.py script retrieves and saves configuration data from each of the devices listed in the inventory.yml file. The script uses an async SSH connection transport from the scrapli library to handle parallel sessions to the devices. Once the files are saved, the action workflow uses git to stage any files in which changes have been detected, and a commit is created with a timestamp before the changes are pushed to the repository. A rough sketch of the retrieval pattern follows the workflow snippet below.

  - name: Fetch latest configs
    run: |-
      python retrieve_configs.py
    env:
      SSH_AUTH_USERNAME: ${{ secrets.SSH_AUTH_USERNAME }}
      SSH_AUTH_PASSWORD: ${{ secrets.SSH_AUTH_PASSWORD }}

  - name: Commit and push if it changed
    run: |-
      git config user.name "Automated"
      git config user.email "[email protected]"
      git add -A
      timestamp=$(date -u)
      git commit -m "Latest data: ${timestamp}" || exit 0
      git push
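
The git commit ... || exit 0 in the last step simply ends the job cleanly when there is nothing to commit, so scheduled runs that detect no configuration changes don't show up as failures.

The repository's retrieve_configs.py is not reproduced here, but a minimal sketch of the async scrapli pattern described above might look like the following; the inventory format (a simple name-to-host mapping), the cisco_iosxe platform, and the configs/ output directory are assumptions, not details taken from the actual script.

# Minimal sketch of async config retrieval with scrapli. The inventory format,
# device platform, and configs/ output directory are assumptions, not the
# repository's actual implementation.
import asyncio
import os

import yaml
from scrapli import AsyncScrapli


async def fetch_config(name, host, username, password):
    device = {
        "host": host,
        "auth_username": username,
        "auth_password": password,
        "auth_strict_key": False,
        "platform": "cisco_iosxe",  # assumption: set per device type
        "transport": "asyncssh",    # async SSH transport used by scrapli
    }
    async with AsyncScrapli(**device) as conn:
        response = await conn.send_command("show running-config")
    with open(f"configs/{name}.cfg", "w") as f:
        f.write(response.result)


async def main():
    # Assumed inventory format: a mapping of device name -> hostname/IP
    with open("inventory.yml") as f:
        inventory = yaml.safe_load(f)

    username = os.environ["SSH_AUTH_USERNAME"]
    password = os.environ["SSH_AUTH_PASSWORD"]

    os.makedirs("configs", exist_ok=True)
    # Open all device sessions concurrently
    await asyncio.gather(
        *(fetch_config(name, host, username, password) for name, host in inventory.items())
    )


if __name__ == "__main__":
    asyncio.run(main())

Each device session runs concurrently via asyncio.gather, which keeps retrieval time flat as the inventory grows.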

How do I track changes?

The neat thing about using Git to manage your configuration backups is that the commit log gives you a complete history of every change that has been made to your configs.

Author(s)

World Wide Technology, Inc. (WWT)