Code Repository for The Kaggle Book, Published by Packt Publishing

Overview

The Kaggle Book

Data analysis and machine learning for competitive data science

"Luca and Konradˈs book helps make Kaggle even more accessible. They are both top-ranked users and well-respected members of the Kaggle community. Those who complete this book should expect to be able to engage confidently on Kaggle – and engaging confidently on Kaggle has many rewards." — Anthony Goldbloom, Kaggle Founder & CEO

Key Features

  • Learn how Kaggle works and how to make the most of competitions from two expert Kaggle Grandmasters
  • Sharpen your modeling skills with ensembling, feature engineering, adversarial validation, AutoML, transfer learning, and techniques for parameter tuning
  • Challenge yourself with problems involving tabular data, vision, and natural language, as well as simulation and optimization
  • Discover tips, tricks, and best practices for getting great results on Kaggle and becoming a better data scientist
  • Read interviews with 31 Kaggle Masters and Grandmasters sharing their experiences and tips

Get a step ahead of your competitors with a concise collection of smart data handling and modeling techniques

Getting started

You can run these notebooks on cloud platforms like Kaggle or Google Colab, or on your local machine. Note that most chapters require a GPU (and sometimes even a TPU) to run in a reasonable amount of time, so we recommend one of the cloud platforms, as they come pre-installed with CUDA.
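
If you do run locally, a quick check like the one below, a minimal sketch that assumes PyTorch is installed (individual notebooks may rely on other frameworks), confirms that a CUDA device is visible before you start a long run:

```python
# Minimal sketch: verify that a CUDA-capable GPU is visible locally.
# Assumes PyTorch is installed; swap in whichever framework a given notebook uses.
import torch

if torch.cuda.is_available():
    print(f"CUDA device found: {torch.cuda.get_device_name(0)}")
else:
    print("No CUDA device found - consider running on Kaggle or Colab instead.")
```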

Running on a cloud platform

To run these notebooks on a cloud platform, just click on one of the badges (Colab or Kaggle) in the table below. The code will be reproduced from GitHub directly onto the chosen platform (you may have to add the necessary data before running it). Alternatively, we also provide links to the fully working original notebooks on Kaggle that you can copy and immediately run.

| No | Chapter | Notebook | Colab | Kaggle |
| --- | --- | --- | --- | --- |
| 05 | Competition Tasks and Metrics | meta_kaggle | Open In Colab | Kaggle |
| 06 | Designing Good Validation | adversarial-validation-example | Open In Colab | Kaggle |
| 07 | Modeling for Tabular Competitions | interesting-eda-tsne-umap | Open In Colab | Kaggle |
|    |  | meta-features-and-target-encoding | Open In Colab | Kaggle |
|    |  | really-not-missing-at-random | Open In Colab | Kaggle |
|    |  | tutorial-feature-selection-with-boruta-shap | Open In Colab | Kaggle |
| 08 | Hyperparameter Optimization | basic-optimization-practices | Open In Colab | Kaggle |
|    |  | hacking-bayesian-optimization-for-dnns | Open In Colab | Kaggle |
|    |  | hacking-bayesian-optimization | Open In Colab | Kaggle |
|    |  | kerastuner-for-imdb | Open In Colab | Kaggle |
|    |  | optuna-bayesian-optimization | Open In Colab | Kaggle |
|    |  | scikit-optimize-for-lightgbm | Open In Colab | Kaggle |
|    |  | tutorial-bayesian-optimization-with-lightgbm | Open In Colab | Kaggle |
| 09 | Ensembling with Blending and Stacking Solutions | ensembling | Open In Colab | Kaggle |
| 10 | Modeling for Computer Vision | augmentations-examples | Open In Colab | Kaggle |
|    |  | images-classification | Open In Colab | Kaggle |
|    |  | prepare-annotations | Open In Colab | Kaggle |
|    |  | segmentation-inference | Open In Colab | Kaggle |
|    |  | segmentation | Open In Colab | Kaggle |
|    |  | object-detection-yolov5 | Open In Colab | Kaggle |
| 11 | Modeling for NLP | nlp-augmentations4 | Open In Colab | Kaggle |
|    |  | nlp-augmentation1 | Open In Colab | Kaggle |
|    |  | qanswering | Open In Colab | Kaggle |
|    |  | sentiment-extraction | Open In Colab | Kaggle |
| 12 | Simulation and Optimization Competitions | connectx | Open In Colab | Kaggle |
|    |  | mab-santa | Open In Colab | Kaggle |
|    |  | rps-notebook1 | Open In Colab | Kaggle |
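
If you would rather fetch notebooks and data locally instead of using the badges, the official Kaggle API is an option. The sketch below is only an illustrative example: it assumes the kaggle package is installed and an API token is configured, and the notebook and competition slugs are placeholders, not entries from the table above.

```python
# Minimal sketch using the official kaggle package (pip install kaggle).
# Requires an API token at ~/.kaggle/kaggle.json; the slugs below are
# placeholders, not real entries from the table above.
from kaggle.api.kaggle_api_extended import KaggleApi

api = KaggleApi()
api.authenticate()

# Copy a published notebook into a local folder.
api.kernels_pull("some-user/some-notebook-slug", path="./notebooks")

# Download the data that notebook expects into ./input.
api.competition_download_files("some-competition", path="./input")
```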

Book Description

Millions of data enthusiasts from around the world compete on Kaggle, the most famous data science competition platform of them all. Participating in Kaggle competitions is a surefire way to improve your data analysis skills, network with the rest of the community, and gain valuable experience to help grow your career.

The first book of its kind, The Kaggle Book assembles the techniques and skills you’ll need for success in competitions, data science projects, and beyond. Two masters of Kaggle walk you through modeling strategies you won’t easily find elsewhere, and the tacit knowledge they’ve accumulated along the way. As well as Kaggle-specific tips, you’ll learn more general techniques for approaching tasks based on image data, tabular data, textual data, and reinforcement learning. You’ll design better validation schemes and work more comfortably with different evaluation metrics.

Whether you want to climb the ranks of Kaggle, build some more data science skills, or improve the accuracy of your existing models, this book is for you.

What you will learn

  • Get acquainted with Kaggle and other competition platforms
  • Make the most of Kaggle Notebooks, Datasets, and Discussion forums
  • Understand different modeling tasks including binary and multi-class classification, object detection, NLP (Natural Language Processing), and time series
  • Design good validation schemes, learning about k-fold, probabilistic, and adversarial validation
  • Get to grips with evaluation metrics including MSE and its variants, precision and recall, IoU, mean average precision at k, as well as never-before-seen metrics
  • Handle simulation and optimization competitions on Kaggle
  • Create a portfolio of projects and ideas to get further in your career

Who This Book Is For

This book is suitable for Kaggle users and data analysts/scientists with at least a basic proficiency in data science topics and Python who are trying to do better in Kaggle competitions and secure jobs with tech giants. At the time this book was completed, there were 96,190 Kaggle novices (users who have just registered on the website) and 67,666 Kaggle contributors (users who have just filled in their profile) enrolled in Kaggle competitions. This book has been written with all of them in mind, as well as anyone else who wants to break the ice, start taking part in competitions on Kaggle, and learn from them.

Table of Contents

Part 1

  1. Introducing Kaggle and Other Data Science Competitions
  2. Organizing Data with Datasets
  3. Working and Learning with Kaggle Notebooks
  4. Leveraging Discussion Forums

Part 2

  5. Competition Tasks and Metrics
  6. Designing Good Validation
  7. Modeling for Tabular Competitions
  8. Hyperparameter Optimization
  9. Ensembling with Blending and Stacking Solutions
  10. Modeling for Computer Vision
  11. Modeling for NLP
  12. Simulation and Optimization Competitions

Part 3

  13. Creating Your Portfolio of Projects and Ideas
  14. Finding New Professional Opportunities