Tom-the-AI - A compound artificial intelligence software for Linux systems.

Overview

Tom the AI (version 0.82)

WARNING: This software is not yet ready to use; I'm still setting up the GitHub repository. It should be ready in a few days.

Tom is an open source AI desktop assistant for Linux systems, built from a set of independent response modules that generate replies to any input.

Tom uses natural language processing to determine which response module is best suited to generate a response for each input, thus avoiding the need for precise syntax.

By Analogy

Tom the AI is designed as a Linux alternative to software such as Apple's Siri or Microsoft's Cortana.

Set Up

Step 1 - Update repositories:

Update the apt package repositories using sudo apt update to ensure that the apt package manager has access to the latest versions of the dependencies below.

Step 2 - Install APT dependencies:

First, install Python by running sudo apt install python3.9 in a terminal. Tom is tested on Python 3.9, but any newer version should (probably) work just fine.

Next, install the latest version of VLC media player using sudo apt install vlc.

Step 3 - Download Tom:

Download Tom by cloning the GitHub repository into your home folder using git clone https://github.com/Mblizzard/Tom-the-AI.

Step 4 - Install Python dependencies:

Open a terminal inside Tom's application folder, or navigate to it using cd ~/Tom-the-AI/. Now run sudo pip3 install -r requirements.txt. Some systems may use pip in place of pip3.

Next, we need to download the required NLTK data by running the following code in a Python shell:

>>> import nltk
>>> nltk.download('all')

Step 5 - Running Tom:

Go ahead and run python3.9 ~/Tom-the-AI/tom.py. Tom will boot up, and after a minute or so of loading you'll be ready to go! If you feel inclined, make a desktop launcher for this command, link Tom into your application menu, or create a dock shortcut (one possible approach is sketched below).
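
If you do want a launcher, the following is an optional, illustrative sketch (not part of Tom) that writes a minimal .desktop entry; the launcher name, Exec path, and Terminal setting are assumptions you may want to adjust for your setup:

# make_launcher.py -- optional, illustrative sketch only (not part of Tom).
# The name, Exec path, and Terminal setting below are assumptions to adjust.
from pathlib import Path

desktop_entry = """[Desktop Entry]
Type=Application
Name=Tom the AI
Comment=Compound AI desktop assistant
Exec=python3.9 {home}/Tom-the-AI/tom.py
Terminal=false
""".format(home=Path.home())

launcher = Path.home() / ".local/share/applications/tom.desktop"
launcher.parent.mkdir(parents=True, exist_ok=True)  # create the folder if it doesn't exist
launcher.write_text(desktop_entry)
print("Wrote", launcher)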

Mission

The mission of Tom is to provide an open source compound AI for which anyone can program and contribute response modules, expanding Tom's capabilities to create useful and entertaining artificial intelligence software.

Examples

Tom responds to any input by using natural language processing to determine the most suitable response module from which to source the reply.

Give Tom natural language input, either via voice recognition or text input, for instance Hey Tom, what is petrichor?, and he'll respond in the most appropriate way. Note that the 'Hey Tom' activation phrase is only required for voice input.

The following is a non-exhaustive list of things you can do:

Hey Tom, I'm in an optimistic mood. I'm not sure if this is a good thing or not. Emotions (Using sentiment analysis + NLTK chatbots): ~> Hey Tom, you are a brilliant individual! I am but one, you are but one more. ~> Hey Tom, thou art a fool. Become more interesting before I die of fatal boredom. Fact Memory & Recall: ~> Hey Tom, the answer to life, the universe, and everything is 42. Ok. ~> Hey Tom, what is the answer to life, the universe, and everything?. The answer to life, the universe, and everything is 42. Playing music (From device or web, includes UI controls for the former): ~> Hey Tom, play up the shard. Playing /home/murray/Music/Dr Who/Up The Shard.webm. ~> Hey Tom, stop the music. Media stopped. *NOTE: File names do not have to match exactly.* ~> Hey Tom, open my English essay. Alright. *NOTE: File names do not have to match exactly.* Opening websites: ~> Hey Tom, open Reddit. Alright. Jokes (From PyJokes): ~> Hey Tom, tell me a joke. I went to a street where the houses were numbered 8k, 16k, 32k, 64k, 128k, 256k and 512k. It was a trip down Memory Lane. Trivia: ~> Hey Tom, ask me a trivia question. Question: What is "Sealed crustless sandwich"? 1) The part of Yellowstone National Park in Idaho, where any crime can technically be committed without punishment – but don't tempt fate! 2) I got a fever, and the only prescription... is more cowbell! 3) The only nuclear reactor in a 17th-century building. 4) A patented peanut butter and jelly sandwich. ~> 4. Correct! Colossal Cave Adventure (Willie Crowther's ADVENT-350): ~> Hey Tom, let's go on an adventure! Welcome to adventure!! would you like instructions? Fun facts: ~> Hey Tom, make me smarter. Spices were not used to mask the flavor of rotting meat before refrigeration. Spices were an expensive luxury item; those who could afford them could afford good meat, and there are no contemporaneous documents calling for spices to disguise the taste of bad meat. Dice Rolls (great for D&D): ~> Hey Tom, roll me a d20. I rolled a 14. Word generation (great for Articulate) ~> Hey Tom, give me a random action word. Your word is 'winning'. Complex Mathematics (using SymPy): ~> Hey Tom, integrate (tan(x))^1/2 ∫f(x) = -ln(cos(x))/2 + c Code generation (using howdoi): ~> Hey Tom, write a hello world script in C++. #include <\iostream> int main() { std::cout << "Hello World!" << std::endl; return 0; } Most of Betty's functionality (From https://github.com/pickhardt/betty): ~> Hey Tom, what time is it? Running date +"%r (%T)" ... 02:34:46 PM (14:34:46). ~> Hey Tom, what day is it? Running date +"%A" ... Saturday. ~> Hey Tom, whats my username? Running whoami ... murray ~> Hey Tom, what is my ip address? Wlo1: flags=4163 mtu 1500 inet 192.168.43.9 netmask 255.255.255.0 broadcast 192.168.43.255 inet6 fe80::5c61:caf:5614:7b82 prefixlen 64 scopeid 0x20 ether 54:35:30:60:a8:b9 txqueuelen 1000 (Ethernet) RX packets 401121 bytes 523184185 (523.1 MB) RX errors 0 dropped 0 overruns 0 frame 0 TX packets 235650 bytes 23471151 (23.4 MB) TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0.">
Objective Response (From anywhere on the internet):
~> Hey Tom, what is petrichor?
According to en.wikipedia.org... Petrichor is the earthy scent produced when rain falls on dry soil. The word is constructed from the Greek petra, "rock", or petros, "stone", and ichor, the fluid that flows in the veins of the gods in Greek mythology.

Subjective Response (From Cleverbot):
~> Hey Tom, I'm in an optimistic mood.
I'm not sure if this is a good thing or not.

Emotions (Using sentiment analysis + NLTK chatbots):
~> Hey Tom, you are a brilliant individual!
I am but one, you are but one more.
~> Hey Tom, thou art a fool.
Become more interesting before I die of fatal boredom.

Fact Memory & Recall:
~> Hey Tom, the answer to life, the universe, and everything is 42.
Ok.
~> Hey Tom, what is the answer to life, the universe, and everything?
The answer to life, the universe, and everything is 42.

Playing music (From device or web, includes UI controls for the former):
~> Hey Tom, play up the shard.
Playing /home/murray/Music/Dr Who/Up The Shard.webm.
~> Hey Tom, stop the music.
Media stopped.
*NOTE: File names do not have to match exactly.*

~> Hey Tom, open my English essay.
Alright.
*NOTE: File names do not have to match exactly.*

Opening websites:
~> Hey Tom, open Reddit.
Alright.

Jokes (From PyJokes):
~> Hey Tom, tell me a joke.
I went to a street where the houses were numbered 8k, 16k, 32k, 64k, 128k, 256k and 512k. It was a trip down Memory Lane.

Trivia:
~> Hey Tom, ask me a trivia question.
Question: What is "Sealed crustless sandwich"?
1) The part of Yellowstone National Park in Idaho, where any crime can technically be committed without punishment – but don't tempt fate!
2) I got a fever, and the only prescription... is more cowbell!
3) The only nuclear reactor in a 17th-century building.
4) A patented peanut butter and jelly sandwich.
~> 4.
Correct!

Colossal Cave Adventure (Will Crowther's ADVENT-350):
~> Hey Tom, let's go on an adventure!
Welcome to adventure!! would you like instructions?

Fun facts:
~> Hey Tom, make me smarter.
Spices were not used to mask the flavor of rotting meat before refrigeration. Spices were an expensive luxury item; those who could afford them could afford good meat, and there are no contemporaneous documents calling for spices to disguise the taste of bad meat.

Dice Rolls (great for D&D):
~> Hey Tom, roll me a d20.
I rolled a 14.

Word generation (great for Articulate):
~> Hey Tom, give me a random action word.
Your word is 'winning'.

Complex Mathematics (using SymPy):
~> Hey Tom, integrate (tan(x))^1/2
∫f(x) = -ln(cos(x))/2 + c

Code generation (using howdoi):
~> Hey Tom, write a hello world script in C++.
#include <iostream>
int main()
{
std::cout << "Hello World!" << std::endl;
return 0;
}

Most of Betty's functionality (From https://github.com/pickhardt/betty):
~> Hey Tom, what time is it?
Running date +"%r (%T)" ...
02:34:46 PM (14:34:46).
~> Hey Tom, what day is it?
Running date +"%A" ...
Saturday.
~> Hey Tom, whats my username?
Running whoami ...
murray
~> Hey Tom, what is my ip address?
Wlo1: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
    inet 192.168.43.9 netmask 255.255.255.0 broadcast 192.168.43.255
    inet6 fe80::5c61:caf:5614:7b82 prefixlen 64 scopeid 0x20<link>
    ether 54:35:30:60:a8:b9 txqueuelen 1000 (Ethernet)
    RX packets 401121 bytes 523184185 (523.1 MB)
    RX errors 0 dropped 0 overruns 0 frame 0
    TX packets 235650 bytes 23471151 (23.4 MB)
    TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0.

This is a fair representation of Tom's capabilities as they currently stand. See the following section on contributing for a guide to creating your own response modules for Tom and expanding upon the above abilities.

Contributing

How to write a custom response module for Tom:

Step 1 - Understanding how Tom will treat your module:

Tom is programmed in Python. Response modules are imported into Tom using the Python import statement, and the response is retrieved from the module with a call of the form output = <module>.respond(<input>). The output is then returned to the user. An illustrative sketch of this calling pattern is shown below.
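
The following sketch illustrates the import-and-respond pattern described above; it is not Tom's actual dispatcher code, and the module name jokes and the hardcoded selection stand in for whichever module Tom's NLP would pick.

# Illustrative sketch of the calling pattern only; this is not Tom's actual dispatcher.
# The module name "jokes" is an assumption standing in for whichever module Tom selects.
import importlib

def get_reply(user_input):
    module_name = "jokes"                          # Tom would choose this dynamically
    module = importlib.import_module(module_name)  # response modules are imported by name
    output = module.respond(user_input)            # every module exposes respond(inp)
    return output                                  # the reply is then delivered to the user

print(get_reply("Hey Tom, tell me a joke."))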

Step 2 - Programming the response module:

Go ahead and program your response. Your script should have a main function def respond(inp):, where inp is the user input parameter that will be passed to your function by Tom. Your function should provide its output through a return statement (NOT a print() statement). A minimal example module is sketched below.
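
For instance, a minimal hypothetical response module might look like this (the file name, greeting logic, and replies are placeholders for your own behaviour, not part of Tom):

# greeter.py -- a minimal, hypothetical response module; all names and replies are examples.
def respond(inp):
    # inp is the raw user input string passed in by Tom.
    if "hello" in inp.lower():
        return "Hello there!"
    # Always return the reply as a string; do not print() it.
    return "I heard: " + inp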

Step 3 - Testing your module:

Paste the following bit of code at the end of your Python script, then run your program:

")))">
if __name__ == "__main__":
    while True:
        print(respond(input("~> ")))

If this works as expected, and you can type inputs at the ~> prompt and see your module's output printed in the console, then continue to step 4.

Step 4 - Relative imports:

Rename your main response script to __init__.py, and make sure it sits at the top level of your project folder (not nested in other folders). Next, rename the folder containing your script to the name of your module (no white-space or special characters). Now, if you are importing any functions from other scripts (this does not include dependencies installed through pip), you will need to change the import statement by placing a '.' in front of the location. For example, from myOtherScript import customFunction becomes from .myOtherScript import customFunction, but import requests would remain unchanged. A sketch of the resulting layout is shown below.
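
As an illustration, a hypothetical module folder named myModule that uses the helper script from the example above might be laid out as follows (myModule is a placeholder name; myOtherScript and customFunction are taken from the example, not prescribed by Tom):

# Hypothetical layout for a response module folder named myModule:
#
#   myModule/
#       __init__.py        (your main response script, containing respond(inp))
#       myOtherScript.py   (helper functions used by __init__.py)
#       requirements.txt   (PyPI dependencies, if any -- see step 5)
#
# Inside __init__.py, imports of your own scripts become relative:
from .myOtherScript import customFunction   # was: from myOtherScript import customFunction
import requests                             # pip-installed packages remain unchanged

def respond(inp):
    return customFunction(inp)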

Step 5 - Dependencies:

If your response module requires Python packages from PyPI, make sure it includes a requirements.txt file. Any dependencies not available from PyPI should be bundled with the project, located in the project folder alongside __init__.py.

Step 6 - Using your module:

Paste the folder containing your response module into Tom's /responses directory. You will then need to activate the response module within Tom's modules interface, or manually add the name of your module to responseOrder.txt.

Step 7 - Creating a pull request:

If you feel inclined to share your module with the world, go ahead and create a pull request for your module on Tom's GitHub repository (https://github.com/Mblizzard/Tom-the-AI).

Planned Features

New response modules & capabilities to look forward to in future versions of Tom:

  • Timers & stopwatch capabilities.
  • Ability to execute terminal commands.
  • Automated module installation.
  • Releases and updates available on the Ubuntu apt repositories.

Features I'm not currently planning to include in Tom, but that I'll consider adding if enough people are interested:

  • Windows support.

Versioning

Releases will follow a semantic versioning format:

MAJOR.MINOR.PATCH

For more information on SemVer, visit http://semver.org/.

License

Tom the AI: A compound AI for Linux systems.
Copyright (C) 2021  Murray Jones

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.