Daymet

Tags: Weather, Daymet, AI for Earth

Estimates of daily weather parameters in North America on a one-kilometer grid.

Daymet provides gridded estimates of near-surface meteorological conditions; its main purpose is to provide data estimates where no instrumentation exists. This dataset provides Daymet Version 3 data for North America; the island areas of Hawaii and Puerto Rico are available as files separate from the continental land mass. Daymet output variables include minimum temperature, maximum temperature, precipitation, shortwave radiation, vapor pressure, snow water equivalent, and day length. The dataset covers the period from January 1, 1980 to the present; each year is processed individually at the close of the calendar year. Daymet variables are continuous surfaces provided as individual files, by variable and year, at 1-kilometer spatial resolution and daily temporal resolution. Data are in a Lambert Conformal Conic projection for North America and are distributed in netCDF format compliant with Climate and Forecast (CF) metadata conventions (version 1.6).

Storage resources

Data are stored in blobs in the East US data center, in the following blob container:

https://daymet.blob.core.windows.net/daymetv3-raw

Within that container, files are named as:

daymet_v3_[variable]_[year]_[region].nc4

  • variable is one of:
    • tmin (minimum temperature)
    • tmax (maximum temperature)
    • prcp (precipitation)
    • srad (shortwave radiation)
    • vp (vapor pressure)
    • swe (snow water equivalent)
    • dayl (day length)
  • year is a four-digit year
  • region is a region code, one of “na” (North American continental mass), “hawaii”, or “puertorico”

For example, maximum temperature data from 1982 for the continental mass is available at:

https://daymet.blob.core.windows.net/daymetv3-raw/daymet_v3_tmax_1982_na.nc4
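
If you are scripting access, URLs can be assembled directly from the naming pattern above. A minimal Python sketch (the variable, year, and region values here are arbitrary examples):

# Assemble a Daymet blob URL from the naming pattern above
variable = 'prcp'
year = '1990'
region = 'puertorico'
url = ('https://daymet.blob.core.windows.net/daymetv3-raw/'
       'daymet_v3_{}_{}_{}.nc4'.format(variable, year, region))
print(url)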

A complete Python example of accessing and plotting a Daymet blob is available in the notebook provided under “data access”.

We also provide a read-only SAS (shared access signature) token for accessing Daymet data via, e.g., BlobFuse, which lets you mount blob containers as drives:

st=2020-01-03T00%3A12%3A06Z&se=2031-01-04T00%3A12%3A00Z&sp=rl&sv=2018-03-28&sr=c&sig=ca0OY7h4J6j0JxQiiTcM9PeZ%2FCWmX5wC5sjKUPqq0mk%3D

Mounting instructions for Linux are here.
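
If you prefer not to mount the container, the same SAS token can also be passed to the Azure Blob Storage Python SDK as a credential. A minimal sketch, assuming the azure-storage-blob package (the token string below is a placeholder for the full token above):

from azure.storage.blob import ContainerClient

# Placeholder: substitute the full read-only SAS token shown above
sas_token = '<SAS token from above>'

container_client = ContainerClient(account_url='https://daymet.blob.core.windows.net',
                                   container_name='daymetv3-raw',
                                   credential=sas_token)

# List a few blobs to verify access
for blob in container_client.list_blobs(name_starts_with='daymet_v3_tmax'):
    print(blob.name)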

Large-scale processing using this dataset is best performed in the East US Azure data center, where the data is stored. If you are using Daymet data for environmental science applications, consider applying for an AI for Earth grant to support your compute requirements.

Citation

If you use this data in a publication, please cite:

Thornton, P.E., M.M. Thornton, B.W. Mayer, Y. Wei, R. Devarakonda, R.S. Vose, and R.B. Cook. 2016. Daymet: Daily Surface Weather Data on a 1-km Grid for North America, Version 3. ORNL DAAC, Oak Ridge, Tennessee, USA.

See the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC)’s Data Use and Citations Policy for more information.

Resources

The following resources and references may be helpful when working with the Daymet dataset:

Pretty picture

[Figure: Average daily maximum temperature in Hawaii in 2017.]

Contact

For questions about this dataset, contact aiforearthdatasets@microsoft.com.

Notices

MICROSOFT PROVIDES AZURE OPEN DATASETS ON AN “AS IS” BASIS. MICROSOFT MAKES NO WARRANTIES, EXPRESS OR IMPLIED, GUARANTEES OR CONDITIONS WITH RESPECT TO YOUR USE OF THE DATASETS. TO THE EXTENT PERMITTED UNDER YOUR LOCAL LAW, MICROSOFT DISCLAIMS ALL LIABILITY FOR ANY DAMAGES OR LOSSES, INCLUDING DIRECT, CONSEQUENTIAL, SPECIAL, INDIRECT, INCIDENTAL OR PUNITIVE, RESULTING FROM YOUR USE OF THE DATASETS.

This dataset is provided under the original terms that Microsoft received source data. The dataset may include data sourced from Microsoft.

Access

Available in: Azure Notebooks

When to use: Quickly explore the dataset with Jupyter notebooks hosted on Azure or your local machine.

Language: Python

Demo notebook for accessing Daymet data on Azure

The Daymet dataset contains daily minimum temperature, maximum temperature, precipitation, shortwave radiation, vapor pressure, snow water equivalent, and day length at 1km resolution for North America. The dataset covers the period from January 1, 1980 to December 31, 2019.

The Daymet dataset is maintained at daac.ornl.gov/cgi-bin/dsviewer.pl?ds_id=1328 and mirrored on Azure Open Datasets at azure.microsoft.com/services/open-datasets/catalog/daymet.

In [6]:
# Standard or standard-ish imports
import os
import tempfile
import shutil
import numpy as np
import matplotlib.pyplot as plt
import urllib.request

# Less standard, but still pip- or conda-installable
import netCDF4 as nc 
from azure.storage.blob import ContainerClient
from netCDF4 import Dataset

container_name = 'daymetv3-raw'
daymet_azure_storage_url = 'https://daymet.blob.core.windows.net/'

daymet_container_client = ContainerClient(account_url=daymet_azure_storage_url,
                                          container_name=container_name,
                                          credential=None)

# Temporary folder for data we need during execution of this notebook (we'll clean up
# at the end, we promise)
temp_dir = os.path.join(tempfile.gettempdir(),'daymet')
os.makedirs(temp_dir,exist_ok=True)

Support functions

In [2]:
def download_url(url, destination_filename=None, progress_updater=None, force_download=False):
    """
    Download a URL to a temporary file
    """
    
    # This is not intended to guarantee uniqueness, we just know it happens to guarantee
    # uniqueness for this application.
    if destination_filename is None:
        url_as_filename = url.replace('://', '_').replace('.', '_').replace('/', '_')
        destination_filename = \
            os.path.join(temp_dir,url_as_filename)
    if (not force_download) and (os.path.isfile(destination_filename)):
        print('Bypassing download of already-downloaded file {}'.format(os.path.basename(url)))
        return destination_filename
    print('Downloading file {}'.format(os.path.basename(url)),end='')
    urllib.request.urlretrieve(url, destination_filename, progress_updater)  
    assert(os.path.isfile(destination_filename))
    nBytes = os.path.getsize(destination_filename)
    print('...done, {} bytes.'.format(nBytes))
    return destination_filename

List the available Daymet files

The Daymet dataset is available for anonymous public download on Azure. The following code shows how to list all data files that are currently available, trimmed to a specific region (hawaii, na, puertorico) and year for brevity.

In [3]:
# Hawaii sounds nice...
state_of_interest = 'hawaii'
year_of_interest = '1988'

# List the blobs in the container
generator = daymet_container_client.list_blobs()
for blob in generator:
    if state_of_interest in blob.name and year_of_interest in blob.name:
        print('Blob name: ' + blob.name)
Blob name: daymet_v3_dayl_1988_hawaii.nc4
Blob name: daymet_v3_prcp_1988_hawaii.nc4
Blob name: daymet_v3_srad_1988_hawaii.nc4
Blob name: daymet_v3_swe_1988_hawaii.nc4
Blob name: daymet_v3_tmax_1988_hawaii.nc4
Blob name: daymet_v3_tmin_1988_hawaii.nc4
Blob name: daymet_v3_vp_1988_hawaii.nc4

Download a specific file from Azure blob storage

This code shows how to download a specific file from Azure blob storage into a local temporary directory. It uses the example file daymet_v3_tmax_2017_hawaii.nc4, but you can change this as described below.

The following types of data are available: minimum temperature (tmin), maximum temperature (tmax), precipitation (prcp), shortwave radiation (srad), vapor pressure (vp), snow water equivalent (swe), and day length (dayl).

In [7]:
variable = 'tmax'
year = '2017'

# Choose your location.  The following are available: hawaii, na, puertorico.  The value 'na' stands for North America.
location = 'hawaii'

granule_name = 'daymet_v3_' + variable + '_' + year + '_' + location + '.nc4'
url = 'https://daymet.blob.core.windows.net/daymetv3-raw/' + granule_name

filename = download_url(url)
Bypassing download of already-downloaded file daymet_v3_tmax_2017_hawaii.nc4

Explore the NetCDF metadata

In [8]:
daymet_ds = Dataset(filename, 'r') 
print('netCDF file format:' + ' ' + daymet_ds.file_format)

print('netCDF dimensions:')
print(daymet_ds.dimensions.keys())
print('\ntime dimension:')
print(daymet_ds.dimensions['time'])
print('x dimension:')
print(daymet_ds.dimensions['x'])
print('y dimension:')
print(daymet_ds.dimensions['y'])
print('netCDF variables:')
print(daymet_ds.variables.keys())
print('\n' + variable + ' variable and attributes:')
print(daymet_ds.variables[variable])
netCDF file format: NETCDF4_CLASSIC
netCDF dimensions:
dict_keys(['x', 'y', 'time', 'nv'])

time dimension:
<class 'netCDF4._netCDF4.Dimension'> (unlimited): name = 'time', size = 365
x dimension:
<class 'netCDF4._netCDF4.Dimension'>: name = 'x', size = 284
y dimension:
<class 'netCDF4._netCDF4.Dimension'>: name = 'y', size = 584
netCDF variables:
dict_keys(['x', 'y', 'lat', 'lon', 'time', 'yearday', 'time_bnds', 'lambert_conformal_conic', 'tmax'])

tmax variable and attributes:
<class 'netCDF4._netCDF4.Variable'>
float32 tmax(time, y, x)
    _FillValue: -9999.0
    long_name: daily maximum temperature
    units: degrees C
    missing_value: -9999.0
    coordinates: lat lon
    grid_mapping: lambert_conformal_conic
    cell_methods: area: mean time: maximum
unlimited dimensions: time
current shape = (365, 584, 284)
filling on

Plot temperature data

Let's calculate the mean value for the variable that we care about, and then visualize this on a map. If you have kept the defaults above, this is the maximum temperature.

In [9]:
# Read the whole array
factor = daymet_ds.variables[variable][:]

# Calculate mean
factor_mean_comp = np.mean(factor, axis=0, keepdims=True)

# Reshape 
x_size = daymet_ds.dimensions['x'].size
y_size = daymet_ds.dimensions['y'].size
factor_mean_comp.shape = (y_size,x_size)

# Plot
%matplotlib inline
plt.rcParams['figure.figsize'] = (25,9)
plt.imshow(factor_mean_comp, cmap='rainbow')
plt.colorbar()
Out[9]:
<matplotlib.colorbar.Colorbar at 0x10cd4264b48>

Time conversion

Convert the time axis to a more human-readable format... mostly an excuse to demonstrate tinkering with NetCDF variables.

In [11]:
time = daymet_ds.variables['time'][:] 
time_unit = daymet_ds.variables['time'].getncattr('units') 
time_cal = daymet_ds.variables['time'].getncattr('calendar') 
local_time = nc.num2date(time, units=time_unit, calendar=time_cal)

print('Original time value: {}, human-readable time: {}'.format(time[0], local_time[0]))
Original time value: 13515.5, human-readable time: 2017-01-01 12:00:00

Cleanup

In [ ]:
shutil.rmtree(temp_dir)