
GOES-16

SatelliteImagery, EarthObservation, AIforEarth, NOAA

Weather imagery from the GOES-16 satellite.

The GOES-R (Geostationary Operational Environmental Satellite R Series) program images weather phenomena from a set of satellites in geostationary orbit. GOES-16 is the first of four planned GOES-R satellites; its orbit provides a view of the Americas.

This dataset currently includes the ABI-L2-MCMIPF product (Advanced Baseline Imager, Level 2, Cloud and Moisture Imagery, Full-disk) and the ABI-L1b-RadF product (Advanced Baseline Imager, Level 1b, Radiances, Full-disk). We may on-board other GOES-16 and GOES-17 products on request; please contact aiforearthdatasets@microsoft.com if you are interested in using additional GOES data on Azure.

This dataset is available on Azure thanks to the NOAA Big Data Program.

Storage resources

Data are stored in blobs in NetCDF format (one blob per image) in the East US data center, in the following blob container:

https://goes.blob.core.windows.net/noaa-goes16

Within that container, data are named as:

[product]/[year]/[day]/[hour]/[filename]

  • product is a product name; currently ABI-L2-MCMIPF and ABI-L1b-RadF are available on Azure
  • year is a four-digit year
  • day is a three-digit day-of-year code, starting with 001
  • hour is a two-digit hour-of-day code, starting with 00
  • filename encodes the product, date, and time; details are available in the GOES Users’ Guide

For example, this file:

https://goes.blob.core.windows.net/noaa-goes16/ABI-L2-MCMIPF/2020/003/00/OR_ABI-L2-MCMIPF-M6_G16_s20200030000215_e20200030009534_c20200030010031.nc

…contains data from January 3, 2020, between midnight and 1am UTC (hour 00).
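
For illustration only (not part of the demo notebook below), the following Python sketch builds the blob prefix for a given UTC time using the convention above, and decodes the start-time ("s") token from the example filename, which the GOES Users' Guide describes as year, day-of-year, hour, minute, second, and tenth of second:

from datetime import datetime, timezone

# Build the [product]/[year]/[day]/[hour]/ prefix for a given product and UTC time
product = 'ABI-L2-MCMIPF'
t = datetime(2020, 1, 3, 0, tzinfo=timezone.utc)
prefix = '{}/{}/{:03d}/{:02d}/'.format(product, t.year, t.timetuple().tm_yday, t.hour)
print(prefix)  # ABI-L2-MCMIPF/2020/003/00/

# Decode the scan start time from the "s" token in the example filename above
# (the trailing digit is tenths of a second, which strptime does not parse)
s_token = '20200030000215'
scan_start = datetime.strptime(s_token[:-1], '%Y%j%H%M%S')
print(scan_start)  # 2020-01-03 00:00:21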

Data channels and wavelengths are described here.

A complete Python example of accessing and plotting a GOES-16 image is available in the notebook provided under “data access”.

We also provide a read-only SAS (shared access signature) token to allow access to GOES-16 data via, e.g., BlobFuse, which allows you to mount blob containers as drives:

st=2020-04-11T23%3A55%3A25Z&se=2032-04-12T23%3A55%3A00Z&sp=rl&sv=2018-03-28&sr=c&sig=IVSoHKVscKyu8K99I7xfman%2Bzp0ISkFbnbAqE6wkv6A%3D

Mounting instructions for Linux are here.
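
If you prefer to stay in Python rather than mounting the container, the same SAS token can be passed as the credential to the azure-storage-blob ContainerClient used in the notebook below; the container also allows anonymous read access, so credential=None works as well. A minimal sketch:

from azure.storage.blob import ContainerClient

# Read-only SAS token from above (the container is also publicly readable)
sas_token = 'st=2020-04-11T23%3A55%3A25Z&se=2032-04-12T23%3A55%3A00Z&sp=rl&sv=2018-03-28&sr=c&sig=IVSoHKVscKyu8K99I7xfman%2Bzp0ISkFbnbAqE6wkv6A%3D'

container_client = ContainerClient(account_url='https://goes.blob.core.windows.net',
                                   container_name='noaa-goes16',
                                   credential=sas_token)

# List a few blobs for one hour of ABI-L2-MCMIPF data
for i, blob in enumerate(container_client.list_blobs(name_starts_with='ABI-L2-MCMIPF/2020/003/00/')):
    print(blob.name)
    if i >= 2:
        break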

Large-scale processing using this dataset is best performed in the East US Azure data center, where the data is stored. If you are using GOES data for environmental science applications, consider applying for an AI for Earth grant to support your compute requirements.

Citation

If you use this data in a publication, please cite as:

GOES-R Series Program, (2019): NOAA GOES-R Series Advanced Baseline Imager (ABI) Level 0 Data. [indicate subset used]. NOAA National Centers for Environmental Information. doi:10.25921/tvws-w071.

Pretty picture


Moisture imagery of the Americas on Jan 2, 2020.

Contact

For questions about this dataset, contact aiforearthdatasets@microsoft.com.

Notices

MICROSOFT PROVIDES AZURE OPEN DATASETS ON AN “AS IS” BASIS. MICROSOFT MAKES NO WARRANTIES, EXPRESS OR IMPLIED, GUARANTEES OR CONDITIONS WITH RESPECT TO YOUR USE OF THE DATASETS. TO THE EXTENT PERMITTED UNDER YOUR LOCAL LAW, MICROSOFT DISCLAIMS ALL LIABILITY FOR ANY DAMAGES OR LOSSES, INCLUDING DIRECT, CONSEQUENTIAL, SPECIAL, INDIRECT, INCIDENTAL OR PUNITIVE, RESULTING FROM YOUR USE OF THE DATASETS.

This dataset is provided under the original terms under which Microsoft received the source data. The dataset may include data sourced from Microsoft.

Access

Available in: Azure Notebooks

When to use: Quickly explore the dataset with Jupyter notebooks hosted on Azure or your local machine.

Language: Python

Demo notebook for accessing GOES-16 data on Azure

This notebook provides an example of accessing GOES-16 data from blob storage on Azure, including (1) finding the data file corresponding to a date and time, (2) retrieving that file from blob storage, (3) opening the downloaded file with the xarray library, and (4) rendering the image.

GOES-16 data are stored in the East US data center, so this notebook will run most efficiently on Azure compute located in East US. We recommend that substantial computation depending on GOES-16 data also be situated in East US. If you are using GOES-16 data for environmental science applications, consider applying for an AI for Earth grant to support your compute requirements.

Imports and environment

In [1]:
# Mostly-standard imports
import os
import tempfile
import numpy as np
import shutil
import urllib
import matplotlib.pyplot as plt

# Less-common-but-still-pip-installable imports
import xarray
from azure.storage.blob import ContainerClient

# pip install progressbar2, not progressbar
import progressbar

# Storage locations are documented at http://aka.ms/ai4edata-goes16
goes_account_name = 'goes'
goes_container_name = 'noaa-goes16'
goes_account_url = 'https://' + goes_account_name + '.blob.core.windows.net'
goes_blob_root = goes_account_url + '/' + goes_container_name + '/'

# Create a ContainerClient to enumerate blobs
goes_container_client = ContainerClient(account_url=goes_account_url, 
                                         container_name=goes_container_name,
                                         credential=None)

temp_dir = os.path.join(tempfile.gettempdir(),'goes')
os.makedirs(temp_dir,exist_ok=True)

%matplotlib inline

Functions

In [3]:
class DownloadProgressBar():
    """
    https://stackoverflow.com/questions/37748105/how-to-use-progressbar-module-with-urlretrieve
    """
    
    def __init__(self):
        self.pbar = None

    def __call__(self, block_num, block_size, total_size):
        if not self.pbar:
            self.pbar = progressbar.ProgressBar(max_value=total_size)
            self.pbar.start()
            
        downloaded = block_num * block_size
        if downloaded < total_size:
            self.pbar.update(downloaded)
        else:
            self.pbar.finish()
            

def download_url(url, destination_filename=None, progress_updater=None, force_download=False):
    """
    Download a URL to a temporary file
    """
    
    # This is not intended to guarantee uniqueness; we just know it happens to be
    # unique for this application.
    if destination_filename is None:
        url_as_filename = url.replace('://', '_').replace('/', '_')    
        destination_filename = \
            os.path.join(temp_dir,url_as_filename)
    if (not force_download) and (os.path.isfile(destination_filename)):
        print('Bypassing download of already-downloaded file {}'.format(
            os.path.basename(url)))
        return destination_filename
    print('Downloading file {} to {}'.format(os.path.basename(url),
                                             destination_filename),end='')
    urllib.request.urlretrieve(url, destination_filename, progress_updater)  
    assert(os.path.isfile(destination_filename))
    nBytes = os.path.getsize(destination_filename)
    print('...done, {} bytes.'.format(nBytes))
    return destination_filename

Choose a GOES data file for a known time

In [5]:
# Data are stored as product/year/day/hour/filename
product = 'ABI-L2-MCMIPF'
syear = '2020'; sday = '002'; shour = '14';

# There will be several scans this hour; we'll take the first
scan_index = 0

prefix = product + '/' + syear + '/' + sday + '/' + shour + '/'
print('Finding blobs matching prefix: {}'.format(prefix))
generator = goes_container_client.list_blobs(name_starts_with=prefix)
blobs = []
for blob in generator:
    blobs.append(blob.name)
print('Found {} scans'.format(len(blobs)))

scan_url = goes_blob_root + blobs[scan_index]
Finding blobs matching prefix: ABI-L2-MCMIPF/2020/002/14/
Found 6 scans

Load the image

In [7]:
# GOES-16 MCMIPF files are ~300MB.  Not too big to fit in memory, so sometimes it may be 
# preferable to download to file first, sometimes it will be better to load straight to 
# memory.
download_to_file = True

if download_to_file:
    
    filename = download_url(scan_url,progress_updater=DownloadProgressBar())
    dataset = xarray.open_dataset(filename)

else:
    
    import netCDF4
    import requests
    
    # If you know of a good way to show a progress bar with requests.get (i.e., without writing
    # to file), we're all ears, email aiforearthdatasets@microsoft.com!
    print('Downloading {} to memory...'.format(os.path.basename(scan_url)))
    response = requests.get(scan_url)
    print('Finished downloading')
    nc4_ds = netCDF4.Dataset(os.path.basename(scan_url), memory = response.content)
    store = xarray.backends.NetCDF4DataStore(nc4_ds)
    dataset = xarray.open_dataset(store)
Bypassing download of already-downloaded file OR_ABI-L2-MCMIPF-M6_G16_s20200021400218_e20200021409537_c20200021410039.nc

Explore the xarray dataset and prepare to plot the image

In [8]:
print('Scan starts at: {}'.format(dataset.time_coverage_start))
print('Scan ends at: {}'.format(dataset.time_coverage_end))

# Bands are documented at:
#
# https://www.ncdc.noaa.gov/data-access/satellite-data/goes-r-series-satellites/glossary
#
# We'll use the red/"veggie"/blue bands with wavelengths 0.64, 0.86, and 0.47, respectively.
#
# This is close enough to RGB for today, but there's a great tutorial on getting closer to
# true color (and doing other fancy rendering tricks with GOES data!) here:
#
# https://unidata.github.io/python-gallery/examples/mapping_GOES16_TrueColor.html
#
r = dataset['CMI_C02'].data; r = np.clip(r, 0, 1)
g = dataset['CMI_C03'].data; g = np.clip(g, 0, 1)
b = dataset['CMI_C01'].data; b = np.clip(b, 0, 1)

# Brighten the image a bit so it looks more stylish
gamma = 2.5; r = np.power(r, 1/gamma); g = np.power(g, 1/gamma); b = np.power(b, 1/gamma)

# Create a single RGB image for plotting
rgb = np.dstack((r, g, b))
Scan starts at: 2020-01-02T14:00:21.8Z
Scan ends at: 2020-01-02T14:09:53.7Z

Plot the image

In [10]:
fig = plt.figure(figsize=(7.5, 7.5), dpi=100)

# This definitely looks slicker with fancy borders on, at the cost of some extra
# imports.
show_fancy_borders = True

if not show_fancy_borders:
    
    plt.imshow(rgb); ax = plt.gca(); ax.axis('off');

else:
    
    import metpy
    import cartopy.crs as ccrs

    # Pick an arbitrary channel to get the x/y coordinates and projection information 
    # associated with the scan
    dummy_channel = dataset.metpy.parse_cf('CMI_C01')
    x = dummy_channel.x; y = dummy_channel.y

    ax = fig.add_subplot(1, 1, 1, projection=dummy_channel.metpy.cartopy_crs)
    ax.imshow(rgb, origin='upper', extent=(x.min(), x.max(), y.min(), y.max()))
    ax.coastlines(resolution='50m', color='black')
    ax.add_feature(ccrs.cartopy.feature.BORDERS);

Clean up temporary files

In [ ]:
shutil.rmtree(temp_dir)