cw-eval
Evaluation metrics for Geospatial Machine Learning Challenges
cw-eval is an evaluation suite for scoring entries in geospatial image analysis competitions. It includes tools for calculating IoU, precision, recall, and F1 scores, along with scripts for scoring entire competition entries in either GeoJSON or CSV format.
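As a rough illustration of the metrics listed above (this is a minimal sketch, not the cw-eval API; cw-eval operates on GeoJSON/CSV footprints), IoU and F1 can be computed like this, here shown for axis-aligned boxes and raw match counts:

```python
# Minimal sketch of the metrics cw-eval computes (hypothetical helpers,
# not part of the cw-eval package).

def box_iou(a, b):
    """Intersection over Union of two boxes given as (xmin, ymin, xmax, ymax)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # intersection width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # intersection height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 from true/false positive and false negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
```

In a detection challenge, a proposal is typically counted as a true positive when its IoU with a ground-truth footprint exceeds a threshold (commonly 0.5); the F1 score then summarizes precision and recall in a single number.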
Summary
Evaluation metrics for Geospatial Machine Learning Challenges
Last Updated
Jan 5, 2019 at 17:56
License
Apache-2.0
GitHub Repository
https://github.com/cosmiq/cw-eval
Documentation
http://cw-eval.readthedocs.io/