
Daymet

Weather · Daymet · AI for Earth

Estimates of daily weather parameters on a 1 km grid over North America.

Daymet provides measurements of near-surface meteorological conditions; its primary purpose is to provide data estimates where no instrumentation exists. This dataset provides Daymet Version 3 data for North America; the island areas of Hawaii and Puerto Rico are available as files separate from the continental North America files. Daymet output variables include minimum temperature, maximum temperature, precipitation, shortwave radiation, vapor pressure, snow water equivalent, and day length. The dataset covers the period from January 1, 1980 to the present; each year is processed individually at the close of the calendar year. Daymet variables are provided as continuous surfaces, one file per variable per year, at 1 km spatial resolution and daily temporal resolution. Data are in a Lambert conformal conic projection for North America and are distributed in netCDF format compliant with the Climate and Forecast (CF) metadata conventions (version 1.6).

Storage resources

Data are stored in blobs in the East US Azure data center, in the following blob container:

https://daymet.blob.core.windows.net/daymetv3-raw

Within that container, files are named:

daymet_v3_[variable]_[year]_[region].nc4

  • [variable] is one of:
    • tmin (minimum temperature)
    • tmax (maximum temperature)
    • prcp (precipitation)
    • srad (shortwave radiation)
    • vp (vapor pressure)
    • swe (snow water equivalent)
    • dayl (day length)
  • [year] is a four-digit year
  • [region] is a region code, one of "na" (continental North America), "hawaii", or "puertorico"

For example, maximum temperature data for continental North America in 1982 is stored at:

https://daymet.blob.core.windows.net/daymetv3-raw/daymet_v3_tmax_1982_na.nc4
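The naming convention above can be sketched as a small helper; the base URL and filename pattern come from this page, while the function name itself is illustrative and not part of any Azure SDK.

```python
# Build a Daymet blob URL from the naming convention described above.
# (daymet_url is an illustrative helper, not an official API.)
def daymet_url(variable, year, region):
    base = 'https://daymet.blob.core.windows.net/daymetv3-raw'
    return '{}/daymet_v3_{}_{}_{}.nc4'.format(base, variable, year, region)

print(daymet_url('tmax', 1982, 'na'))
# → https://daymet.blob.core.windows.net/daymetv3-raw/daymet_v3_tmax_1982_na.nc4
```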

A complete Python example of accessing and plotting a Daymet blob is available in the notebook provided under "Data Access".

We also provide a read-only SAS (shared access signature) token to allow access to Daymet data via tools such as BlobFuse, which allows you to mount blob containers as drives:

st=2020-01-03T00%3A12%3A06Z&se=2031-01-04T00%3A12%3A00Z&sp=rl&sv=2018-03-28&sr=c&sig=ca0OY7h4J6j0JxQiiTcM9PeZ%2FCWmX5wC5sjKUPqq0mk%3D

Mounting instructions for Linux are available here.
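For direct HTTPS access, a common way to use an Azure SAS token is to append it to the blob URL as a query string. The sketch below assumes that pattern, using the read-only token published above:

```python
# Append the read-only SAS token to a blob URL as a query string.
# This follows the standard Azure blob SAS URL pattern; adjust if your
# access tool expects the token in a different form.
sas_token = ('st=2020-01-03T00%3A12%3A06Z&se=2031-01-04T00%3A12%3A00Z&sp=rl'
             '&sv=2018-03-28&sr=c&sig=ca0OY7h4J6j0JxQiiTcM9PeZ%2FCWmX5wC5sjKUPqq0mk%3D')
blob_url = 'https://daymet.blob.core.windows.net/daymetv3-raw/daymet_v3_tmax_1982_na.nc4'
signed_url = blob_url + '?' + sas_token
print(signed_url)
```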

Large-scale processing of this dataset performs best in the East US Azure data center, where the data is stored. If you are using Daymet data for environmental science applications, consider applying for an AI for Earth grant to support your compute requirements.

Citation

If you use this data in a publication, please cite:

Thornton, P.E., M.M. Thornton, B.W. Mayer, Y. Wei, R. Devarakonda, R.S. Vose, and R.B. Cook. 2016. Daymet: Daily Surface Weather Data on a 1-km Grid for North America, Version 3. ORNL DAAC, Oak Ridge, Tennessee, USA.

For more information, see the data use and citation policy of the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC).

Resources

The following resources and references may be helpful when working with the Daymet dataset:

Pretty picture


Daily average maximum temperature in Hawaii in 2017.

Contact

For questions about this dataset, contact aiforearthdatasets@microsoft.com.

Notices

Microsoft provides Azure Open Datasets on an "as is" basis. Microsoft makes no warranties, express or implied, and no guarantees or conditions with respect to your use of the datasets. To the extent permitted under your local law, Microsoft disclaims all liability for any damages or losses resulting from your use of the datasets, including direct, consequential, special, indirect, incidental, or punitive damages.

This dataset is provided under the original terms under which Microsoft received the source data. The dataset may include data sourced from Microsoft.

Access

Azure Notebooks

Quickly explore the dataset with Jupyter notebooks hosted on Azure or your local machine.


Language: Python

Demo notebook for accessing Daymet data on Azure

The Daymet dataset contains daily minimum temperature, maximum temperature, precipitation, shortwave radiation, vapor pressure, snow water equivalent, and day length at 1km resolution for North America. The dataset covers the period from January 1, 1980 to December 31, 2019.

The Daymet dataset is maintained at daac.ornl.gov/cgi-bin/dsviewer.pl?ds_id=1328 and mirrored on Azure Open Datasets at azure.microsoft.com/services/open-datasets/catalog/daymet.

In [6]:
# Standard or standard-ish imports
import os
import tempfile
import shutil
import numpy as np
import matplotlib.pyplot as plt
import urllib.request

# Less standard, but still pip- or conda-installable
import netCDF4 as nc 
from azure.storage.blob import ContainerClient
from netCDF4 import Dataset

container_name = 'daymetv3-raw'
daymet_azure_storage_url = 'https://daymet.blob.core.windows.net/'

daymet_container_client = ContainerClient(account_url=daymet_azure_storage_url, 
                                         container_name=container_name,
                                         credential=None)

# Temporary folder for data we need during execution of this notebook (we'll clean up
# at the end, we promise)
temp_dir = os.path.join(tempfile.gettempdir(),'daymet')
os.makedirs(temp_dir,exist_ok=True)

Support functions

In [2]:
def download_url(url, destination_filename=None, progress_updater=None, force_download=False):
    """
    Download a URL to a temporary file
    """
    
    # This is not intended to guarantee uniqueness, we just know it happens to guarantee
    # uniqueness for this application.
    if destination_filename is None:
        url_as_filename = url.replace('://', '_').replace('.', '_').replace('/', '_')
        destination_filename = \
            os.path.join(temp_dir,url_as_filename)
    if (not force_download) and (os.path.isfile(destination_filename)):
        print('Bypassing download of already-downloaded file {}'.format(os.path.basename(url)))
        return destination_filename
    print('Downloading file {}'.format(os.path.basename(url)),end='')
    urllib.request.urlretrieve(url, destination_filename, progress_updater)  
    assert(os.path.isfile(destination_filename))
    nBytes = os.path.getsize(destination_filename)
    print('...done, {} bytes.'.format(nBytes))
    return destination_filename

List the available Daymet files

The Daymet dataset is available for anonymous public download on Azure. The following code shows how to list all data files that are currently available, trimmed to a specific region (hawaii, na, puertorico) and year for brevity.

In [3]:
# Hawaii sounds nice...
state_of_interest = 'hawaii'
year_of_interest = '1988'

# List the blobs in the container
generator = daymet_container_client.list_blobs()
for blob in generator:
    if state_of_interest in blob.name and year_of_interest in blob.name:
        print('Blob name: ' + blob.name)
Blob name: daymet_v3_dayl_1988_hawaii.nc4
Blob name: daymet_v3_prcp_1988_hawaii.nc4
Blob name: daymet_v3_srad_1988_hawaii.nc4
Blob name: daymet_v3_swe_1988_hawaii.nc4
Blob name: daymet_v3_tmax_1988_hawaii.nc4
Blob name: daymet_v3_tmin_1988_hawaii.nc4
Blob name: daymet_v3_vp_1988_hawaii.nc4

Download a specific file from Azure blob storage

This code shows how to download a specific file from Azure blob storage into the current directory. It uses the example file daymet_v3_tmax_2017_hawaii.nc4, but you can change this as described below.

The following types of data are available: minimum temperature (tmin), maximum temperature (tmax), precipitation (prcp), shortwave radiation (srad), vapor pressure (vp), snow water equivalent (swe), and day length (dayl).

In [7]:
variable = 'tmax'
year = '2017'

# Choose your location.  The following are available: hawaii, na, puertorico.  The value 'na' stands for North America.
location = 'hawaii'

granule_name = 'daymet_v3_' + variable + '_' + year + '_' + location + '.nc4'
url = 'https://daymet.blob.core.windows.net/daymetv3-raw/' + granule_name

filename = download_url(url)
Bypassing download of already-downloaded file daymet_v3_tmax_2017_hawaii.nc4

Explore the NetCDF metadata

In [8]:
daymet_ds = Dataset(filename, 'r') 
print('netCDF file format:' + ' ' + daymet_ds.file_format)

print('netCDF dimensions:')
print(daymet_ds.dimensions.keys())
print('\ntime dimension:')
print(daymet_ds.dimensions['time'])
print('x dimension:')
print(daymet_ds.dimensions['x'])
print('y dimension:')
print(daymet_ds.dimensions['y'])
print('netCDF variables:')
print(daymet_ds.variables.keys())
print('\n' + variable + ' variable and attributes:')
print(daymet_ds.variables[variable])
netCDF file format: NETCDF4_CLASSIC
netCDF dimensions:
dict_keys(['x', 'y', 'time', 'nv'])

time dimension:
<class 'netCDF4._netCDF4.Dimension'> (unlimited): name = 'time', size = 365
x dimension:
<class 'netCDF4._netCDF4.Dimension'>: name = 'x', size = 284
y dimension:
<class 'netCDF4._netCDF4.Dimension'>: name = 'y', size = 584
netCDF variables:
dict_keys(['x', 'y', 'lat', 'lon', 'time', 'yearday', 'time_bnds', 'lambert_conformal_conic', 'tmax'])

tmax variable and attributes:
<class 'netCDF4._netCDF4.Variable'>
float32 tmax(time, y, x)
    _FillValue: -9999.0
    long_name: daily maximum temperature
    units: degrees C
    missing_value: -9999.0
    coordinates: lat lon
    grid_mapping: lambert_conformal_conic
    cell_methods: area: mean time: maximum
unlimited dimensions: time
current shape = (365, 584, 284)
filling on

Plot temperature data

Let's calculate the mean value for the variable that we care about, and then visualize this on a map. If you have kept the defaults above, this is the maximum temperature.

In [9]:
# Read the whole array
factor = daymet_ds.variables[variable][:]

# Average over the time axis, leaving a 2D (y, x) grid
factor_mean_comp = np.mean(factor, axis=0)

# Plot
%matplotlib inline
plt.rcParams['figure.figsize'] = (25,9)
plt.imshow(factor_mean_comp, cmap='rainbow')
plt.colorbar()
Out[9]:
<matplotlib.colorbar.Colorbar at 0x10cd4264b48>

Time conversion

Convert the time axis to a more human-readable format... mostly an excuse to demonstrate tinkering with NetCDF variables.

In [11]:
time = daymet_ds.variables['time'][:] 
time_unit = daymet_ds.variables['time'].getncattr('units') 
time_cal = daymet_ds.variables['time'].getncattr('calendar') 
local_time = nc.num2date(time, units=time_unit, calendar=time_cal)

print('Original time value: {}, human-readable time: {}'.format(time[0], local_time[0]))
Original time value: 13515.5, human-readable time: 2017-01-01 12:00:00
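The conversion above can be reproduced with the standard library, assuming the file's time units are 'days since 1980-01-01 00:00:00' on a standard calendar (consistent with the value 13515.5 mapping to 2017-01-01 12:00 in the output above):

```python
# Reproduce num2date by hand: offset in days from the 1980-01-01 epoch.
# Assumes 'days since 1980-01-01 00:00:00' units and a standard calendar.
from datetime import datetime, timedelta

epoch = datetime(1980, 1, 1)
human_time = epoch + timedelta(days=13515.5)
print(human_time)  # → 2017-01-01 12:00:00
```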

Cleanup

In [ ]:
shutil.rmtree(temp_dir)