How to Use the Tiles API
The /tiles API generates map tiles following the OGC Tiles standard.
This notebook demonstrates how to use the tiles API for the GHRSST Level 4 MUR Global Foundation Sea Surface Temperature Analysis (v4.1) product.
Setup
In [1]:
from datetime import datetime, timezone
import earthaccess
import httpx
from folium import Map, TileLayer
titiler_endpoint = "https://staging.openveda.cloud/api/titiler-cmr" # staging endpoint
Identify the dataset
You can find the MUR SST dataset using the earthaccess.search_datasets function.
In [2]:
datasets = earthaccess.search_datasets(doi="10.5067/GHGMR-4FJ04")
ds = datasets[0]
concept_id = ds["meta"]["concept-id"]
print("Concept-Id: ", concept_id)
print("Abstract: ", ds["umm"]["Abstract"])
Concept-Id: C1996881146-POCLOUD Abstract: A Group for High Resolution Sea Surface Temperature (GHRSST) Level 4 sea surface temperature analysis produced as a retrospective dataset (four day latency) and near-real-time dataset (one day latency) at the JPL Physical Oceanography DAAC using wavelets as basis functions in an optimal interpolation approach on a global 0.01 degree grid. The version 4 Multiscale Ultrahigh Resolution (MUR) L4 analysis is based upon nighttime GHRSST L2P skin and subskin SST observations from several instruments including the NASA Advanced Microwave Scanning Radiometer-EOS (AMSR-E), the JAXA Advanced Microwave Scanning Radiometer 2 on GCOM-W1, the Moderate Resolution Imaging Spectroradiometers (MODIS) on the NASA Aqua and Terra platforms, the US Navy microwave WindSat radiometer, the Advanced Very High Resolution Radiometer (AVHRR) on several NOAA satellites, and in situ SST observations from the NOAA iQuam project. The ice concentration data are from the archives at the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI SAF) High Latitude Processing Center and are also used for an improved SST parameterization for the high-latitudes. The dataset also contains additional variables for some granules including the SST anomaly (variable sst_anomaly) derived from a MUR climatology, and the temporal distance in hours to the nearest IR measurement for each pixel (variable dt_1km_data). Variable dt_1km_data first appears in the time series on October 4, 2015, while sst_anomaly starts July 23, 2019. This dataset was originally funded by the NASA MEaSUREs program (http://earthdata.nasa.gov/our-community/community-data-system-programs/measures-projects), and created by a team led by Dr. Toshio M. Chin from JPL. It adheres to the GHRSST Data Processing Specification (GDS) version 2 format specifications. Use the file global metadata "history:" attribute to determine if a granule is near-realtime or retrospective.
Explore the collection using the /compatibility endpoint
See How to Use the Compatibility API Endpoint for how the compatibility endpoint can be used to identify suitable variable, datetime, and rescale parameters.
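As a rough sketch, a compatibility check is just a GET request against the collection's concept ID. The exact path and parameter names below are illustrative assumptions; consult the linked guide for the real endpoint signature.

```python
from urllib.parse import urlencode

# Hypothetical request URL for a compatibility check; the actual
# titiler-cmr path and parameters may differ from this sketch.
titiler_endpoint = "https://staging.openveda.cloud/api/titiler-cmr"
params = urlencode({"concept_id": "C1996881146-POCLOUD"})
url = f"{titiler_endpoint}/compatibility?{params}"
print(url)
```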
Define a query for titiler-cmr
To use titiler-cmr's endpoints for a NetCDF dataset like this one, we need to define a date range for the CMR query and a variable to analyze.
In [3]:
variable = "sea_ice_fraction"
datetime_ = datetime(2024, 10, 10, tzinfo=timezone.utc).isoformat()
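The `datetime` parameter accepts either a single timestamp, as above, or a `start_date/end_date` range. A one-day window can be built like this (a minimal sketch; the variable name is illustrative):

```python
from datetime import datetime, timedelta, timezone

start = datetime(2024, 10, 10, tzinfo=timezone.utc)
end = start + timedelta(days=1)
# Join the two ISO-8601 timestamps with "/" to form a range
datetime_range = f"{start.isoformat()}/{end.isoformat()}"
print(datetime_range)
# → 2024-10-10T00:00:00+00:00/2024-10-11T00:00:00+00:00
```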
Display tiles in an interactive map
The /tilejson.json endpoint will provide a parameterized xyz tile URL that can be added to an interactive map.
In [4]:
r = httpx.get(
f"{titiler_endpoint}/WebMercatorQuad/tilejson.json",
params=(
("concept_id", concept_id),
# Datetime in form of `start_date/end_date`
("datetime", datetime_),
        # titiler-cmr can work with both Zarr and COG datasets,
        # but we need to tell the endpoint in advance which backend
        # to use
("backend", "xarray"),
("variable", variable),
        # We set min/max zoom because low zoom levels (e.g. 0)
        # would trigger needlessly large-scale queries
("minzoom", 2),
("maxzoom", 13),
("rescale", "0,1"),
("colormap_name", "blues_r"),
),
timeout=None,
).json()
print(r)
{'tilejson': '2.2.0', 'version': '1.0.0', 'scheme': 'xyz', 'tiles': ['https://staging.openveda.cloud/api/titiler-cmr/tiles/WebMercatorQuad/{z}/{x}/{y}@1x?concept_id=C1996881146-POCLOUD&datetime=2024-10-10T00%3A00%3A00%2B00%3A00&backend=xarray&variable=sea_ice_fraction&rescale=0%2C1&colormap_name=blues_r'], 'minzoom': 2, 'maxzoom': 13, 'bounds': [-180.0, -90.0, 180.0, 90.0], 'center': [0.0, 0.0, 2]}
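The `tiles` entry in the TileJSON response is an XYZ URL template: mapping libraries substitute `{z}`, `{x}`, and `{y}` for each tile they request. A single tile URL can also be formed by hand; the sketch below uses a shortened version of the template above with only a couple of its query parameters.

```python
# Abbreviated form of the template returned in the TileJSON response
template = (
    "https://staging.openveda.cloud/api/titiler-cmr/tiles/WebMercatorQuad"
    "/{z}/{x}/{y}@1x?concept_id=C1996881146-POCLOUD&variable=sea_ice_fraction"
)
# Fill in a specific zoom / column / row to address one tile
tile_url = template.format(z=3, x=2, y=1)
print(tile_url)
```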
In [5]:
bounds = r["bounds"]
m = Map(location=(70, -40), zoom_start=3)
TileLayer(
tiles=r["tiles"][0],
opacity=1,
attr="NASA",
).add_to(m)
m
Out[5]:
(An interactive folium map of the sea ice fraction tiles renders here.)