Demonstration Platform Documentation

Large-scale batch processing pipelines and machine learning

Users can create and deploy multi-stage data processing pipelines that exploit the computing power and scalability of the IaaS tier. The core tool is Kubeflow, which was designed primarily for machine learning workflows, but the same constructs used in machine learning apply equally to EO data ingestion and pre-processing tasks.
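To illustrate what "multi-stage" means here, the sketch below chains three stages so that each consumes the previous stage's output. It is a minimal stdlib-only illustration of the data flow a pipeline encodes; the stage names, paths, and scene identifier are hypothetical, not the platform's actual steps. In Kubeflow each stage would run as its own containerized step on the IaaS tier.

```python
def ingest(scene_id: str) -> str:
    """Stage 1: fetch a raw scene (here just a placeholder path)."""
    return f"raw/{scene_id}.tif"

def preprocess(raw_path: str) -> str:
    """Stage 2: pre-process the raw scene, e.g. atmospheric correction."""
    return raw_path.replace("raw/", "corrected/")

def publish(corrected_path: str) -> str:
    """Stage 3: register the finished product in a catalogue."""
    return f"catalogue://{corrected_path}"

def run_pipeline(scene_id: str) -> str:
    # In Kubeflow each call below would be a separate containerized step;
    # here the stages run in-process to show the data flow only.
    return publish(preprocess(ingest(scene_id)))

print(run_pipeline("S2A_20240101"))
# catalogue://corrected/S2A_20240101.tif
```

The same chained structure scales out on the cluster: Kubeflow schedules each stage as a pod, so independent scenes can be processed in parallel.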

To support machine learning, Kubeflow integrates directly with the PyTorch and TensorFlow deep learning frameworks and provides methods for accessing GPU resources. Hatfield will further explore AI and deep learning approaches enabled by Kubeflow in future work.
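As one way this can look in practice, the fragment below sketches a Kubeflow Pipelines (kfp v2 SDK) definition that runs a training step in a PyTorch container and requests a GPU for it. The base image, accelerator type, and training logic are illustrative assumptions, not the platform's actual configuration; the fragment only defines the pipeline and is not executed here.

```python
from kfp import dsl

# Hypothetical training step; the base image is an assumption chosen to
# provide PyTorch with CUDA support.
@dsl.component(base_image="pytorch/pytorch:2.1.0-cuda11.8-cudnn8-runtime")
def train(epochs: int) -> str:
    # Placeholder for the actual deep learning training code.
    return "models/latest"

@dsl.pipeline(name="training-demo")
def training_pipeline(epochs: int = 10):
    train_task = train(epochs=epochs)
    # Request one GPU for this step (kfp v2 API); the accelerator type
    # shown is an example and depends on what the cluster offers.
    train_task.set_accelerator_type("NVIDIA_TESLA_T4")
    train_task.set_accelerator_limit(1)
```

Compiling this pipeline with the kfp compiler produces a workflow specification that the cluster's scheduler uses to place the training pod on a GPU node.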

An example pipeline for ingesting MODIS imagery
An example pipeline for processing Sentinel-2 data using Sen2Cor