EarthServer Project goes into the second round

News Date: 21 Jul 2015

The EarthServer initiative is establishing Agile Analytics on Petabyte data cubes as a commodity.

Pushing the boundaries of Big Earth Data services, the intercontinental EarthServer initiative enables researchers to browse, access, and analyze massive multi-dimensional data sets from a wide range of sources. Big Earth Data at your fingertips - this is the vision of EarthServer for unleashing the potential of Big Data through a disruptive paradigm shift in technology:

  • from isolated silos of data with disparate functionality towards a single, uniform information space;
  • from a difficult, artificial differentiation between data and metadata access to unified retrieval;
  • from zillions of files towards few whatever-size datacubes;
  • from limited functionality to the freedom of asking anything, anytime, on any server in a peer network of data centers worldwide.

In phase 1, EarthServer has established open ad-hoc analytics on massive Earth Science data, based on and extending the leading Array Database technology, rasdaman. According to EU Commission and independent reviewers, rasdaman will "significantly transform the way that scientists in different areas of Earth Science will be able to access and use data in a way that hitherto was not possible" as demonstrated by portals with over 230 TB of spatio-temporal data. EarthServer "with no doubt has been shaping the Big Earth Data landscape through the standardization activities within OGC, ISO and beyond".

Now phase 2 of EarthServer has started, with an even more ambitious goal: data centers will provide at least 1 Petabyte of 3-D and 4-D datacubes. Advances in technology will enable real-time scaling of such Petabyte cubes, as well as intercontinental fusion. This power of data handling will be wrapped into direct visual interaction based on multi-dimensional visualization techniques, in particular NASA World Wind. Following the motto "a cube says more than a million images", EarthServer has set out to redefine the Big Data service landscape even further.

This way, critical support will be given to Copernicus and the Sentinel satellite data: a single 3D x/y/t datacube will be constructed for each satellite instrument so that millions of images form a single, simple data space, irrespective of the resulting size. Likewise, each climate dataset will form a single 4D datacube. Access to these cubes is through a clean-slate standards-based query language on n-D grids, OGC WCPS. This yields the agility that any query can be sent at any time, without admin intervention on the server side. Multiple cubes can be combined based on parallel, distributed processing. Altogether, the WCPS language allows navigation, extraction, aggregation, and fusion of any-size space/time data cubes using simple, yet powerful query operators.
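To make the query-based access pattern concrete, the following minimal Python sketch builds a WCPS request as a WCS ProcessCoverages HTTP GET call. The endpoint URL and the coverage name ("AvgLandTemp") are hypothetical placeholders, not actual EarthServer identifiers; the WCPS query shown aggregates a time slice of a 3D x/y/t cube on the server side, so only the result travels over the network.

```python
# Sketch: encoding a WCPS query as a WCS 2.0 ProcessCoverages request.
# Endpoint and coverage name below are illustrative placeholders only.
from urllib.parse import urlencode

def build_wcps_request(endpoint: str, query: str) -> str:
    """Return a key-value-pair GET URL carrying the WCPS query."""
    params = {
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": query,
    }
    return endpoint + "?" + urlencode(params)

# Server-side aggregation: average over one year of a time series,
# returning a single scalar instead of downloading the raw imagery.
query = (
    'for c in (AvgLandTemp) '
    'return avg(c[ansi("2014-01":"2014-12")])'
)
url = build_wcps_request("https://example.org/rasdaman/ows", query)
print(url)
```

The same request shape extends to extraction and fusion: the `for` clause can bind several coverages at once, and the `return` clause can combine them with arithmetic or `encode(...)` the result as an image format.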

The consortium consists of Jacobs University (Germany, coordinator), rasdaman GmbH (Germany, SME), Plymouth Marine Laboratory (UK), European Center for Medium-Range Weather Forecasts (UK), MEEO s.r.l. (Italy, SME), and CITE S.A. (Greece, SME). Additionally, two high-profile international organizations participate: NASA (US) and National Computational Infrastructure (Australia).

Heike Hoenig