Recent advances in cloud-based remote sensing platforms have revolutionized the routines for remote sensing big data (RSBD) analysis. However, user-defined algorithms that are not pre-implemented in RSBD platforms are difficult to reuse for RSBD applications, especially legacy algorithms written in specific programming languages and libraries. In recent years, the emergence of containerization, the core feature of cloud native computing, has provided effective solutions for porting user-defined algorithms to the cloud environment. In this research, we present a novel approach to deploying user-defined remote sensing algorithms for large-scale analysis based on the Data Cube and cloud-native containerization. A processing model is introduced to organize workflows of remote sensing analysis based on the Data Cube. Exploiting the homogeneity of the Data Cube and the parallelizability of remote sensing analysis, workflows can be decomposed into multiple independent steps and parallelizable tasks. Subsequently, the Composite Container is designed to process tasks with user-defined algorithms as if they were built-in algorithms. We then introduce the Data Cube Resilient Distributed Dataset (DRDD) to implement workflows with Composite Containers following the MapReduce paradigm. The proposed approach was implemented on the Science Earth Platform and validated with two sets of continental-scale land cover mapping at up to 10-m resolution. Experimental results show that the proposed approach can effectively implement remote sensing analysis with user-defined algorithms and performs well for continental-scale analysis.
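The decomposition described in this abstract can be sketched in a few lines: a cube is partitioned into independent tiles, a user-defined algorithm is mapped over each tile (in the paper, inside a Composite Container), and the partial results are reduced into a single product. This is a minimal illustrative sketch of the MapReduce-over-Data-Cube pattern only; the function names and the toy "vegetated pixel count" algorithm are assumptions, not the paper's actual API.

```python
def split_into_tiles(cube, tile_size):
    """Partition a square 2-D grid (list of lists) into independent tiles."""
    n = len(cube)
    tiles = []
    for r in range(0, n, tile_size):
        for c in range(0, n, tile_size):
            tiles.append([row[c:c + tile_size] for row in cube[r:r + tile_size]])
    return tiles

def user_defined_algorithm(tile):
    """Stand-in for a containerized user algorithm: count 'vegetated' pixels."""
    return sum(value > 0.5 for row in tile for value in row)

def map_reduce(cube, tile_size):
    """Map the user algorithm over every tile, then reduce the partial results."""
    partial_results = map(user_defined_algorithm, split_into_tiles(cube, tile_size))
    return sum(partial_results)

# A 4x4 toy "NDVI" cube, processed as four independent 2x2 tiles.
cube = [[0.2, 0.8, 0.9, 0.1],
        [0.7, 0.3, 0.6, 0.4],
        [0.9, 0.9, 0.1, 0.2],
        [0.5, 0.6, 0.8, 0.7]]
print(map_reduce(cube, 2))  # → 9
```

Because every tile is processed independently, the `map` step could be handed to any parallel executor (or, as in the paper, to distributed containers) without changing the reduce logic.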
Geospatial data and related technologies have become an increasingly important part of data analysis processes, playing a prominent role in most of them. The serverless paradigm has become one of the most popular and frequently used technologies within cloud computing. This paper reviews the serverless paradigm and examines how it can be leveraged for geospatial data processing using open standards from the geospatial community. We propose a system design and architecture that handles complex geospatial data processing jobs with minimal human intervention and resource consumption using serverless technologies. To define and execute workflows in the system, we also propose new models for both workflow and task definitions. Moreover, the proposed system offers web services based on the Open Geospatial Consortium (OGC) API - Processes specification to provide interoperability with other geospatial applications, in anticipation of the specification's wider adoption in the future. We implemented the proposed system on a public cloud provider as a proof of concept and evaluated it with sample geospatial workflows and cloud architecture best practices.
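A workflow/task definition model of the kind this abstract proposes can be sketched as a small declarative structure plus a scheduler that resolves task dependencies, much as a serverless orchestrator would trigger functions in order. The field names (`id`, `process`, `depends_on`) and the example tasks below are illustrative assumptions, not the paper's schema or the OGC API - Processes wire format.

```python
# Hypothetical workflow definition: each task names a process and the
# tasks it depends on. A real system would point "process" at a
# deployed serverless function or an OGC API - Processes endpoint.
workflow = {
    "id": "ndvi-mosaic",
    "tasks": [
        {"id": "ingest", "process": "load-scenes",  "depends_on": []},
        {"id": "ndvi",   "process": "compute-ndvi", "depends_on": ["ingest"]},
        {"id": "mosaic", "process": "mosaic-tiles", "depends_on": ["ndvi"]},
    ],
}

def execution_order(tasks):
    """Topologically sort tasks so each runs only after its dependencies."""
    done, order = set(), []
    pending = {t["id"]: t for t in tasks}
    while pending:
        ready = [t for t in pending.values() if set(t["depends_on"]) <= done]
        if not ready:
            raise ValueError("cycle in workflow definition")
        for t in ready:
            order.append(t["id"])
            done.add(t["id"])
            del pending[t["id"]]
    return order

print(execution_order(workflow["tasks"]))  # → ['ingest', 'ndvi', 'mosaic']
```

Keeping the workflow purely declarative is what allows the execution engine to be swapped (local runner, serverless orchestrator) without touching the task definitions.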
To leverage our past investments in ocean observations and modeling, and to fully exploit new observations, we must transform our infrastructure and tools for working with ocean data. Currently, data-intensive ocean research is accessible only to privileged institutions with the resources for high-performance computing and data storage. OpenOceanCloud will break down this barrier, providing a research platform to the thousands of potential oceanographers who lack such resources. Access to vast data sets and powerful computing environments can help remove the barriers related to low-bandwidth internet, intermittent power, and limited cyberinfrastructure. With this infrastructure, anyone can do science, anywhere, and this empowers communities that have been historically excluded from full participation in oceanography.
Digital Twins of the environment can help reach sustainability goals and tackle climate-change-related issues. They will rely strongly on geospatial data and on its processing and analytics. Cloud environments provide the flexibility and scalability needed to cope with potentially enormous geospatial datasets. This article explores the capabilities of the Azure cloud and places them in a broader multi-cloud perspective.
Radiant Earth’s first online course aims to strengthen practitioners’ capacity and skills to create impactful machine learning applications.
Nature article: Earth observation epidemiology, or tele-epidemiology, is defined as 'using space technology with remote sensing in epidemiology'. It is a useful tool that is increasingly being used by clinicians and stakeholders for zoonotic infections [1,2,4]. Tele-epidemiology helped map the spread of the Ebola virus among animals and can be used for risk mapping, risk communication, and identifying vulnerable populations. Similarly, geographic information science technology can improve the understanding and control of COVID-19 through surveillance, data sharing, digital contact tracing, investigation of risk factors, and infectious disease forecasting.
Countries are using forests to pad their climate commitments. New satellite images might call their bluff.
This Special Issue covers a broad range of topics, such as transfer learning, design of new Deep Neural Network (DNN), CNN, and GAN models, as well as a wide range of applications (Table 1), including agriculture (four papers), natural resources (three papers), marine environments (two papers), change detection (one paper), and disaster damage detection (one paper).
ML4EO training given by Radiant Earth, designed to strengthen practitioners' local capacity and skills in support of creating impactful machine learning applications.
The SpatioTemporal Asset Catalog (STAC) community is pleased to announce the release of version 1.0.0.