• This repository has been archived on 18/Feb/2023

🌱 Deep Learning for Instance Segmentation of Agricultural Fields - Master thesis

Abstract

This thesis aims to delineate agricultural field parcels from satellite images via deep learning instance segmentation. Manual delineation is accurate but time-consuming, and many automated approaches based on traditional image segmentation techniques struggle to capture the variety of possible field appearances. Deep learning has proven successful in various computer vision tasks and is therefore a good candidate for accurate, performant and generalizable delineation of agricultural fields. Here, a fully convolutional instance segmentation architecture (adapted from Li et al., 2016) was trained on Sentinel-2 image data and corresponding agricultural field polygons from Denmark. In contrast to many other approaches, the model operates on raw RGB images without significant pre- and post-processing. After training, the model proved successful in predicting field boundaries on held-out image chips. The results generalize across different field sizes, shapes and other properties, but show characteristic problems in some cases. In a second experiment, the model was trained to simultaneously predict the crop type of the field instance. Performance in this setting was significantly worse: many fields were correctly delineated, but the wrong crop class was predicted. Overall, the results are promising and demonstrate the validity of the deep learning approach. The methodology also offers many directions for future improvement.

Results

Instructions

1. Installation of FCIS & MXNet

Install the FCIS model and the MXNet framework according to the instructions in the FCIS repository. The setup works well with an AWS EC2 P2 instance and the official AWS Deep Learning AMI (Ubuntu). Make sure that the installation was successful by running the FCIS demo:

> python FCIS/fcis/demo.py

2. Data Preprocessing

Follow the instructions and run the code in the Preprocessing Jupyter notebook. This will prepare the Denmark LPIS field data and create the image chips and COCO-format annotations. When finished, place the preprocessed vector folder .output/preprocessing/annotations and the image folder .output/preprocessing/images in .FCIS/data/coco.
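Moving the preprocessed folders into the FCIS data directory can also be scripted. A minimal sketch (the function name and the assumption that the destination folders do not yet exist are mine; adjust the paths to your checkout):

```python
import shutil
from pathlib import Path

def place_preprocessed(preprocessing_dir, coco_dir):
    """Move the preprocessed annotations and images folders into FCIS/data/coco."""
    coco_dir = Path(coco_dir)
    coco_dir.mkdir(parents=True, exist_ok=True)
    for folder in ("annotations", "images"):
        src = Path(preprocessing_dir) / folder
        dst = coco_dir / folder
        # Note: if dst already exists, shutil.move would nest src inside it,
        # so remove or rename any previous copy first.
        shutil.move(str(src), str(dst))

# Example, assuming the repository layout described above:
# place_preprocessed("output/preprocessing", "FCIS/data/coco")
```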

3. Configuration

Place the configuration file .model/resnet_v1_101_coco_fcis_end2end_ohem.yaml in .FCIS/experiments/fcis/cfgs. A more detailed description of the model and training parameters used for the thesis is given in chapter 3.3 of the thesis. Then delete the annotations cache (necessary each time you change a configuration parameter that could influence the model evaluation or training):

> rm -rf .FCIS/data/coco/annotations_cache/; rm -rf .FCIS/data/cache/COCOMask/  

4. Model Evaluation

This runs the prediction/model evaluation task with the model trained for the thesis. First, move the folder containing the model .model/resnet_v1_101_coco_fcis_end2end_ohem to FCIS/output/fcis/coco/resnet_v1_101_coco_fcis_end2end_ohem. Then run the evaluation:

> python experiments/fcis/fcis_end2end_test.py --cfg experiments/fcis/cfgs/resnet_v1_101_coco_fcis_end2end_ohem.yaml --ignore_cache

The resulting instance segmentation and object detection proposals will be saved to FCIS/output/fcis/coco/resnet_v1_101_coco_fcis_end2end_ohem/val2016/detections_val2016_results.json.
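The results file follows the standard COCO results convention: a JSON list of detections, each with image_id, category_id, score and a segmentation. A minimal sketch for inspecting the highest-scoring proposals (the helper name and exact field layout are assumptions; verify them against your output file):

```python
import json

def top_detections(results_path, k=5, min_score=0.0):
    """Load COCO-format detection results and return the k highest-scoring entries."""
    with open(results_path) as f:
        detections = json.load(f)  # list of dicts: image_id, category_id, score, ...
    detections = [d for d in detections if d["score"] >= min_score]
    return sorted(detections, key=lambda d: d["score"], reverse=True)[:k]

# Example:
# results = ("FCIS/output/fcis/coco/resnet_v1_101_coco_fcis_end2end_ohem/"
#            "val2016/detections_val2016_results.json")
# for det in top_detections(results):
#     print(det["image_id"], det["category_id"], round(det["score"], 3))
```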

5. Custom Model Training

You can carry out your own model training with custom configurations or datasets.

First, adjust the PIXEL_MEANS values in the configuration file to the RGB channel means of your dataset (the band means are saved to .output/preprocessed/statistics.json during preprocessing).
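As a sketch, the band means could be read from statistics.json and printed for pasting into the YAML file. The key name "band_means" is an assumption of mine; check the actual structure of your preprocessing output. Note that FCIS configuration files commonly list PIXEL_MEANS in BGR order (Caffe convention), so RGB means may need reversing:

```python
import json

def pixel_means_from_stats(stats_path, key="band_means"):
    """Read per-band means from the preprocessing statistics file.

    The key name is an assumption; adapt it to the actual JSON layout.
    Returns the means reversed (RGB -> BGR) for use as PIXEL_MEANS.
    """
    with open(stats_path) as f:
        stats = json.load(f)
    return list(reversed(stats[key]))

# Example:
# print("PIXEL_MEANS:", pixel_means_from_stats("output/preprocessing/statistics.json"))
```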

Delete existing model files:

> rm -rf /home/ubuntu/FCIS/output/fcis/coco/resnet_v1_101_coco_fcis_end2end_ohem/

Finally, run the training task:

> python experiments/fcis/fcis_end2end_train_test.py --cfg experiments/fcis/cfgs/resnet_v1_101_coco_fcis_end2end_ohem.yaml
