• Stars: 168
• Rank: 224,174 (top 5%)
• Language: Jupyter Notebook
• Created: almost 6 years ago
• Updated: over 5 years ago

Repository Details

Quick, Draw! Kaggle Competition Starter Pack v2

The code in this repo is all you need to make a first submission to the Quick, Draw! Kaggle competition. It uses the fastai library.

For additional information, please refer to the discussion thread on the Kaggle forums.

Instructions on how to run the code are provided below.

This code is based on code from a fast.ai MOOC that will be publicly available in January 2019.

You can find an earlier version of this starter pack here. This iteration eliminates some of the rough edges that resulted in a relatively low score and also introduces the data block API. A major change is that drawings are now generated on the fly - this should help with experimentation.
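
Generating drawings on the fly means rasterising each image from its stroke data at training time instead of pre-rendering everything to disk. As a rough illustration of the idea (this is not the code from the notebook), the sketch below assumes the simplified format, in which the drawing column of the CSVs holds a list of strokes and each stroke is a pair of x and y coordinate lists in the 0-255 range; the function name strokes_to_image is made up for this example.

    import ast
    from PIL import Image, ImageDraw

    def strokes_to_image(raw_strokes, size=256, line_width=3):
        """Rasterise one simplified Quick, Draw! drawing into a greyscale PIL image."""
        # The CSV stores the drawing as a string literal; parse it if necessary.
        strokes = ast.literal_eval(raw_strokes) if isinstance(raw_strokes, str) else raw_strokes
        img = Image.new("L", (size, size), color=0)
        draw = ImageDraw.Draw(img)
        for xs, ys in strokes:
            # Connect consecutive points of each stroke with line segments.
            draw.line(list(zip(xs, ys)), fill=255, width=line_width)
        return img

Rendering images like this at training time avoids writing hundreds of thousands of files to disk first, which is presumably why it makes experimentation easier.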

This version of the code runs with fastai 1.0.27 and is likely incompatible with more recent releases.
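
Since the code is pinned to an old release, a small (hypothetical) guard cell at the top of the notebook can save you from debugging version mismatches:

    import fastai

    # This repo targets fastai 1.0.27; newer releases are likely incompatible.
    assert fastai.__version__ == "1.0.27", (
        f"Expected fastai 1.0.27, found {fastai.__version__}"
    )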

Making the first submission

  1. You need to have the fastai library up and running. You can find installation instructions here.
  2. You will also need to download the competition data. The competition can be found at this URL. If you do not have a Kaggle account, you will need to register. There are many ways to download the data - I use the Kaggle CLI. To set it up you will need to generate an API key - all of this is explained in the Kaggle CLI repository. If you do not want to set this up at this point, you can download the data from the competition's data tab.
  3. We will need test_simplified.csv and train_simplified.zip. Please put them in the data directory. If you were to download one of the files using the Kaggle CLI, the command (executed from the root of the repository) would be kaggle competitions download -c quickdraw-doodle-recognition -f test_simplified.csv -p data.
  4. We now need to unzip the downloaded archive. cd into the data directory, create a new directory called train, and extract the archive by executing unzip train_simplified.zip -d train.
  5. Open first_submission.ipynb. If you have the Kaggle CLI installed and configured, you can uncomment the last line. Hit run all. If you want to sanity-check the generated submission file before uploading it, see the sketch after this list. See you on the leaderboard :)
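
Before uploading, it can be worth sanity-checking the generated submission file. The sketch below is not part of the repo; it assumes the competition's usual format (a key_id column and a word column holding up to three space-separated guesses, since the metric is MAP@3) and a hypothetical output filename submission.csv - adjust the name to whatever first_submission.ipynb actually writes.

    import pandas as pd

    # Hypothetical filename - change it to match the notebook's output.
    sub = pd.read_csv("submission.csv")

    # Expected columns for the Quick, Draw! Doodle Recognition competition.
    assert list(sub.columns) == ["key_id", "word"], sub.columns

    # Each prediction should contain at most three space-separated guesses (MAP@3).
    assert sub["word"].str.split().str.len().max() <= 3

    print(sub.head())
    print(f"{len(sub)} rows")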

More Repositories

  1. whale (Jupyter Notebook, 267 stars)
  2. aiquizzes-anki (151 stars)
  3. yolo_open_images (98 stars): yolov3 with SPP weights pretrained on Open Images dataset along with config files
  4. ask_ai (Jupyter Notebook, 97 stars)
  5. dogs_vs_cats (Jupyter Notebook, 75 stars)
  6. cifar10_docker (Jupyter Notebook, 52 stars)
  7. rsna-intracranial (Jupyter Notebook, 49 stars)
  8. personalized_fashion_recs (Jupyter Notebook, 42 stars)
  9. python_musings (Jupyter Notebook, 36 stars)
  10. aws-setup (Shell, 28 stars)
  11. nvt_op_examples (Jupyter Notebook, 27 stars)
  12. machine_learning_notebooks (Jupyter Notebook, 27 stars)
  13. fastai-rails (Jupyter Notebook, 25 stars)
  14. 10_neural_nets (Python, 24 stars)
  15. tgs_salt_solution (Jupyter Notebook, 23 stars)
  16. refactoring (Jupyter Notebook, 11 stars): Ideas and theory on how one might want to go about writing code.
  17. presidential (Jupyter Notebook, 11 stars)
  18. python_shorts (Jupyter Notebook, 10 stars)
  19. training_a_CNN_with_little_data (Jupyter Notebook, 8 stars): Design and train a CNN with few training examples using data augmentation and pseudo labeling with keras.
  20. personal-site (HTML, 7 stars): a one page personal site built with mvp.css
  21. meta_notebook (Python, 4 stars): Treat Jupyter notebooks as lego bricks to create something beautiful.
  22. serve-markdown (Ruby, 4 stars)
  23. ACT_refactor (Jupyter Notebook, 4 stars)
  24. paddy_doctor (Jupyter Notebook, 2 stars)
  25. utils (Python, 2 stars)
  26. zen_dataset (Python, 2 stars): Library for assembling pytorch dataset.
  27. error_surface_vs_generalizability (Jupyter Notebook, 1 star): An experiment to see if smoothness of the surrounding error surface helps with generalization.
  28. git_course_completion_bell (1 star)
  29. universe (Shell, 1 star)
  30. imagenette-LB-entry (Jupyter Notebook, 1 star)
  31. answers (HTML, 1 star)