A Python toolkit for fine-tuning Geospatial Foundation Models (GFMs).

<picture>
<source media="(prefers-color-scheme: light)" srcset="https://github.com/user-attachments/assets/f8c9586f-6220-4a53-9669-2aee3300b492">
<source media="(prefers-color-scheme: dark)" srcset="assets/logo_white.png">
<center><img style="display: block; margin-left: auto; margin-right: auto" src="https://github.com/user-attachments/assets/f7c9586f-6220-4a53-9669-2aee3300b492" alt="TerraTorch" width="400"/></center>
</picture>
[Hugging Face](https://huggingface.co/ibm-nasa-geospatial) | [PyPI](https://pypi.org/project/terratorch) | [Tests](https://github.com/ibm/terratorch/actions/workflows/test.yaml) | [Documentation](https://ibm.github.io/terratorch/)

## Overview
TerraTorch is a PyTorch domain library based on [PyTorch Lightning](https://lightning.ai/docs/pytorch/stable/) and the [TorchGeo](https://github.com/microsoft/torchgeo) domain library
for geospatial data.
<hr>
<a href="https://www.youtube.com/watch?v=CB3FKtmuPI8">
<img src="https://upload.wikimedia.org/wikipedia/commons/4/42/YouTube_icon_%282013-2017%29.png" alt="YouTube" width="20">
Watch the latest recording on YouTube: Earth observation foundation models with Prithvi-EO-2.0 and TerraTorch
<img src="https://upload.wikimedia.org/wikipedia/commons/4/42/YouTube_icon_%282013-2017%29.png" alt="YouTube" width="20">
</a>
<hr>
TerraTorch’s main purpose is to provide a flexible fine-tuning framework for Geospatial Foundation Models that can be used at different abstraction levels. The library provides:

- Convenient modelling tools:
  - Flexible trainers for image segmentation, classification and pixel-wise regression fine-tuning tasks
  - Model factories that make it easy to combine backbones and decoders for different tasks
  - Ready-to-go datasets and datamodules that only require pointing to your data, with no need to create new custom classes
  - Launching of fine-tuning tasks through the CLI and flexible configuration files, or via Jupyter notebooks (see the sketch after this list)
- Easy access to:
  - Open-source pre-trained Geospatial Foundation Model backbones:
    * [Prithvi](https://huggingface.co/ibm-nasa-geospatial/Prithvi-100M)
    * [TerraMind](https://research.ibm.com/blog/terramind-esa-earth-observation-model)
    * [SatMAE](https://sustainlab-group.github.io/SatMAE/)
    * [ScaleMAE](https://github.com/bair-climate-initiative/scale-mae)
    * Satlas (as implemented in [TorchGeo](https://github.com/microsoft/torchgeo))
    * DOFA (as implemented in [TorchGeo](https://github.com/microsoft/torchgeo))
    * SSL4EO-L and SSL4EO-S12 models (as implemented in [TorchGeo](https://github.com/microsoft/torchgeo))
    * [Clay](https://github.com/Clay-foundation/model)
  - Backbones available in [timm](https://github.com/huggingface/pytorch-image-models) (PyTorch Image Models)
  - Decoders available in the [SMP](https://github.com/qubvel/segmentation_models.pytorch) (PyTorch segmentation models with pretrained backbones) and [mmsegmentation](https://github.com/open-mmlab/mmsegmentation) packages
  - Fine-tuned models such as [granite-geospatial-biomass](https://huggingface.co/ibm-granite/granite-geospatial-biomass)
  - All GEO-Bench datasets and datamodules
  - All [TorchGeo](https://github.com/microsoft/torchgeo) datasets and datamodules
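
As a quick illustration of the Python-level workflow, the sketch below builds a semantic segmentation fine-tuning task from a backbone and a decoder via a model factory and hands it to a Lightning trainer. It is a minimal sketch: the specific backbone, decoder and argument names are assumptions chosen for illustration, so check the [quick start guide](https://ibm.github.io/terratorch/quick_start) for the options supported by your TerraTorch version.

```python
# Minimal sketch of Python-level fine-tuning with TerraTorch.
# Backbone, decoder and argument names below are illustrative assumptions;
# see the quick start guide for the exact options available in your version.
import lightning.pytorch as pl
from terratorch.tasks import SemanticSegmentationTask

task = SemanticSegmentationTask(
    model_factory="EncoderDecoderFactory",  # factory that combines a backbone with a decoder
    model_args={
        "backbone": "prithvi_eo_v2_300",    # pre-trained GFM backbone (assumed name)
        "decoder": "FCNDecoder",            # segmentation head (assumed name)
        "num_classes": 2,                   # binary segmentation for this example
    },
    loss="ce",                              # cross-entropy loss
)

# `datamodule` would be one of the ready-to-go TerraTorch/TorchGeo datamodules
# pointed at your data; it is omitted here for brevity.
trainer = pl.Trainer(max_epochs=10, accelerator="auto")
# trainer.fit(task, datamodule=datamodule)
```

Equivalent runs can also be launched without writing Python by passing a YAML configuration file to the CLI (for example, `terratorch fit --config <config.yaml>`), as described in the documentation.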
## Install
### Pip
Using the `pyproject.toml`-based build requires `pip>=21.8`; if needed, upgrade `pip` with `python -m pip install --upgrade pip`.
For a stable point-release, use `pip install terratorch==<version>`.
To get the most recent version of the main branch, install the library with `pip install git+https://github.com/IBM/terratorch.git`.
TerraTorch requires GDAL, which can be complex to install. If you don't have GDAL set up on your system, we recommend using a conda environment and installing it with `conda install -c conda-forge gdal`.
To install as a developer (e.g. to extend the library):
```sh
git clone https://github.com/IBM/terratorch.git
cd terratorch
pip install -r requirements_test.txt   # development and test dependencies
conda install -c conda-forge gdal      # GDAL from conda-forge
pip install -e .                       # editable install of terratorch
```
To install TerraTorch with partial (work-in-progress) support for Weather Foundation Models, run `pip install -e .[wxc]`; this extra currently requires `Python >= 3.11`.
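
As a quick sanity check after any of the installation routes above, the snippet below (a minimal sketch, assuming a standard pip-based install) confirms that the package imports and reports the installed version:

```python
# Sanity check: confirm terratorch is importable and print the installed version.
from importlib.metadata import version

import terratorch  # noqa: F401  # fails here if the package or its dependencies are missing

print("terratorch", version("terratorch"))
```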
## Documentation
To get started, check out the [quick start guide](https://ibm.github.io/terratorch/quick_start).
Developers, check out the [architecture overview](https://ibm.github.io/terratorch/architecture).
The accompanying paper, [TerraTorch: The Geospatial Foundation Models Toolkit](https://arxiv.org/abs/2503.20563), is available on arXiv.
## Contributing
This project welcomes contributions and suggestions. Ways to contribute or get involved:
- Join our [Slack](https://join.slack.com/t/terratorch/shared_invite/zt-397aik4dc-ObPV85BaW3kGz1PDzQfTdA)
- Create an [Issue](https://github.com/IBM/terratorch/issues) (for bugs or feature requests)
- Contribute via [PR](https://github.com/IBM/terratorch/pulls)
- Join our [duoweekly](https://romeokienzler.medium.com/the-duoweekly-manifesto-eaa6c1f542c8) community calls taking place [Tuesdays 4:30 PM - 5 PM CEST](https://teams.microsoft.com/l/meetup-join/19%3ameeting_MWJhMThhMTMtMjc3MS00YjAyLWI3NTMtYTI0NDQ3NWY3ZGU2%40thread.v2/0?context=%7b%22Tid%22%3a%22fcf67057-50c9-4ad4-98f3-ffca64add9e9%22%2c%22Oid%22%3a%227f7ab87a-680c-4c93-acc5-fbd7ec80823a%22%7d) and [Thursdays 2:30 PM - 3 PM CEST](https://teams.microsoft.com/l/meetup-join/19%3ameeting_MWJhMThhMTMtMjc3MS00YjAyLWI3NTMtYTI0NDQ3NWY3ZGU2%40thread.v2/0?context=%7b%22Tid%22%3a%22fcf67057-50c9-4ad4-98f3-ffca64add9e9%22%2c%22Oid%22%3a%227f7ab87a-680c-4c93-acc5-fbd7ec80823a%22%7d).
You can find more detailed contribution guidelines [here](https://ibm.github.io/terratorch/stable/contributing/).
A hint for contributors: to pass the GitHub DCO check, sign off your commits as shown below:
```sh
git commit -s -m "<message>"
```
This signs off the commit with your identity and satisfies the DCO check.
## Credits
TerraTorch is supported by the EU’s Horizon Europe program under Grant Agreement number 101131841 and also received funding from the Swiss State Secretariat for Education, Research and Innovation (SERI) and the UK Research and Innovation (UKRI).
## License
This project is primarily licensed under the **Apache License 2.0**.
However, some files contain code licensed under the **MIT License**. These files are explicitly listed in [`MIT_FILES.txt`](./MIT_FILES.txt).
By contributing to this repository, you agree that your contributions will be licensed under the Apache 2.0 License unless otherwise stated.
For more details, see the [LICENSE](./LICENSE) file.
", Assign "at most 3 tags" to the expected json: {"id":"13488","tags":[]} "only from the tags list I provide: [{"id":77,"name":"3d"},{"id":89,"name":"agent"},{"id":17,"name":"ai"},{"id":54,"name":"algorithm"},{"id":24,"name":"api"},{"id":44,"name":"authentication"},{"id":3,"name":"aws"},{"id":27,"name":"backend"},{"id":60,"name":"benchmark"},{"id":72,"name":"best-practices"},{"id":39,"name":"bitcoin"},{"id":37,"name":"blockchain"},{"id":1,"name":"blog"},{"id":45,"name":"bundler"},{"id":58,"name":"cache"},{"id":21,"name":"chat"},{"id":49,"name":"cicd"},{"id":4,"name":"cli"},{"id":64,"name":"cloud-native"},{"id":48,"name":"cms"},{"id":61,"name":"compiler"},{"id":68,"name":"containerization"},{"id":92,"name":"crm"},{"id":34,"name":"data"},{"id":47,"name":"database"},{"id":8,"name":"declarative-gui "},{"id":9,"name":"deploy-tool"},{"id":53,"name":"desktop-app"},{"id":6,"name":"dev-exp-lib"},{"id":59,"name":"dev-tool"},{"id":13,"name":"ecommerce"},{"id":26,"name":"editor"},{"id":66,"name":"emulator"},{"id":62,"name":"filesystem"},{"id":80,"name":"finance"},{"id":15,"name":"firmware"},{"id":73,"name":"for-fun"},{"id":2,"name":"framework"},{"id":11,"name":"frontend"},{"id":22,"name":"game"},{"id":81,"name":"game-engine "},{"id":23,"name":"graphql"},{"id":84,"name":"gui"},{"id":91,"name":"http"},{"id":5,"name":"http-client"},{"id":51,"name":"iac"},{"id":30,"name":"ide"},{"id":78,"name":"iot"},{"id":40,"name":"json"},{"id":83,"name":"julian"},{"id":38,"name":"k8s"},{"id":31,"name":"language"},{"id":10,"name":"learning-resource"},{"id":33,"name":"lib"},{"id":41,"name":"linter"},{"id":28,"name":"lms"},{"id":16,"name":"logging"},{"id":76,"name":"low-code"},{"id":90,"name":"message-queue"},{"id":42,"name":"mobile-app"},{"id":18,"name":"monitoring"},{"id":36,"name":"networking"},{"id":7,"name":"node-version"},{"id":55,"name":"nosql"},{"id":57,"name":"observability"},{"id":46,"name":"orm"},{"id":52,"name":"os"},{"id":14,"name":"parser"},{"id":74,"name":"react"},{"id":82,"name":"real-time"},{"id":56,"name":"robot"},{"id":65,"name":"runtime"},{"id":32,"name":"sdk"},{"id":71,"name":"search"},{"id":63,"name":"secrets"},{"id":25,"name":"security"},{"id":85,"name":"server"},{"id":86,"name":"serverless"},{"id":70,"name":"storage"},{"id":75,"name":"system-design"},{"id":79,"name":"terminal"},{"id":29,"name":"testing"},{"id":12,"name":"ui"},{"id":50,"name":"ux"},{"id":88,"name":"video"},{"id":20,"name":"web-app"},{"id":35,"name":"web-server"},{"id":43,"name":"webassembly"},{"id":69,"name":"workflow"},{"id":87,"name":"yaml"}]" returns me the "expected json"