
The MNIST database of handwritten digits


The MNIST database of handwritten digits has a training set of 60,000 examples and a test set of 10,000 examples. The digits have been size-normalized and centered in a fixed-size image.

This dataset is sourced from THE MNIST DATABASE of handwritten digits. It is a subset of the larger NIST Handprinted Forms and Characters Database published by the National Institute of Standards and Technology.

Storage Location

  • Blob account: azureopendatastorage

  • Container name: mnist

Four files are available directly in the container:

  • train-images-idx3-ubyte.gz: training set images (9912422 bytes)

  • train-labels-idx1-ubyte.gz: training set labels (28881 bytes)

  • t10k-images-idx3-ubyte.gz: test set images (1648877 bytes)

  • t10k-labels-idx1-ubyte.gz: test set labels (4542 bytes)
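
If you prefer not to use the azureml-opendatasets package, the files can also be fetched directly over public HTTPS. A minimal sketch follows; the URL pattern is assembled from the blob account and container names above (and matches the download cells later on this page), and the local folder name is just an example:

import os
import urllib.request

# Public base URL built from the blob account and container listed above.
base_url = 'https://azureopendatastorage.blob.core.windows.net/mnist/'
files = ['train-images-idx3-ubyte.gz', 'train-labels-idx1-ubyte.gz',
         't10k-images-idx3-ubyte.gz', 't10k-labels-idx1-ubyte.gz']

data_folder = 'mnist_raw'  # example destination folder
os.makedirs(data_folder, exist_ok=True)
for name in files:
    urllib.request.urlretrieve(base_url + name, os.path.join(data_folder, name))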

Notices

MICROSOFT PROVIDES AZURE OPEN DATASETS ON AN “AS IS” BASIS. MICROSOFT MAKES NO WARRANTIES, EXPRESS OR IMPLIED, GUARANTEES OR CONDITIONS WITH RESPECT TO YOUR USE OF THE DATASETS. TO THE EXTENT PERMITTED UNDER YOUR LOCAL LAW, MICROSOFT DISCLAIMS ALL LIABILITY FOR ANY DAMAGES OR LOSSES, INCLUDING DIRECT, CONSEQUENTIAL, SPECIAL, INDIRECT, INCIDENTAL OR PUNITIVE, RESULTING FROM YOUR USE OF THE DATASETS.

This dataset is provided under the original terms that Microsoft received source data. The dataset may include data sourced from Microsoft.

Access

This dataset is available from the following services:

  • Azure Notebooks: quickly explore the dataset with Jupyter notebooks hosted on Azure or on your local machine.

  • Azure Databricks: use this when you need the scale of an Azure managed Spark cluster to process the dataset.


Azure Notebooks

Language: Python

Load MNIST into a data frame using Azure Machine Learning tabular datasets.

See https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-create-register-datasets to learn more about datasets.

Get complete dataset into a data frame

In [1]:
from azureml.opendatasets import MNIST

mnist = MNIST.get_tabular_dataset()
mnist_df = mnist.to_pandas_dataframe()
mnist_df.info()
ActivityStarted, get_tabular_dataset
ActivityCompleted: Activity=get_tabular_dataset, HowEnded=Success, Duration=8343.18 [ms]
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 70000 entries, 0 to 69999
Columns: 785 entries, 0 to label
dtypes: int64(785)
memory usage: 419.2 MB

Get train and test data frames

In [2]:
mnist_train = MNIST.get_tabular_dataset(datasetFilter='train')
mnist_train_df = mnist_train.to_pandas_dataframe()
X_train = mnist_train_df.drop("label", axis=1).values/255.0
y_train = mnist_train_df.filter(items=["label"]).values

mnist_test = MNIST.get_tabular_dataset(datasetFilter='test')
mnist_test_df = mnist_test.to_pandas_dataframe()
X_test = mnist_test_df.drop("label", axis=1).values/255.0
y_test = mnist_test_df.filter(items=["label"]).values
ActivityStarted, get_tabular_dataset
ActivityCompleted: Activity=get_tabular_dataset, HowEnded=Success, Duration=3537.69 [ms]
ActivityStarted, get_tabular_dataset
ActivityCompleted: Activity=get_tabular_dataset, HowEnded=Success, Duration=4404.79 [ms]

Plot some images of the digits

In [3]:
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt

# now let's show some randomly chosen images from the training set.
count = 0
sample_size = 30
plt.figure(figsize=(16, 6))
for i in np.random.permutation(X_train.shape[0])[:sample_size]:
    count = count + 1
    plt.subplot(1, sample_size, count)
    plt.axis('off')  # hide axes for a cleaner digit thumbnail
    plt.text(x=10, y=-10, s=y_train[i], fontsize=18)
    plt.imshow(X_train[i].reshape(28, 28), cmap=plt.cm.Greys)
plt.show()

Download or mount MNIST raw files using Azure Machine Learning file datasets.

This works only on Linux-based compute. See https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-create-register-datasets to learn more about datasets.

In [4]:
mnist_file = MNIST.get_file_dataset()
mnist_file
ActivityStarted, get_file_dataset
ActivityCompleted: Activity=get_file_dataset, HowEnded=Success, Duration=2272.94 [ms]
Out[4]:
{
  "source": [
    "https://azureopendatastorage.blob.core.windows.net/mnist/**/*.gz"
  ],
  "definition": [
    "GetFiles"
  ]
}
In [5]:
mnist_file.to_path()
Out[5]:
array(['/t10k-images-idx3-ubyte.gz', '/t10k-labels-idx1-ubyte.gz',
       '/train-images-idx3-ubyte.gz', '/train-labels-idx1-ubyte.gz'],
      dtype=object)

Download files to local storage

In [6]:
import os
import tempfile

data_folder = tempfile.mkdtemp()
mnist_file.download(data_folder, overwrite=True)
Out[6]:
array(['/tmp/tmpxqh3jcf_/t10k-images-idx3-ubyte.gz',
       '/tmp/tmpxqh3jcf_/t10k-labels-idx1-ubyte.gz',
       '/tmp/tmpxqh3jcf_/train-images-idx3-ubyte.gz',
       '/tmp/tmpxqh3jcf_/train-labels-idx1-ubyte.gz'], dtype=object)
In [7]:
os.listdir(data_folder)
Out[7]:
['train-images-idx3-ubyte.gz',
 't10k-images-idx3-ubyte.gz',
 't10k-labels-idx1-ubyte.gz',
 'train-labels-idx1-ubyte.gz']

Mount files. This is useful when the training job will run on remote compute.

In [8]:
import gzip
import struct
import pandas as pd
import numpy as np

# load compressed MNIST gz files and return pandas dataframe of numpy arrays
def load_data(filename, label=False):
    with gzip.open(filename) as gz:
        gz.read(4)  # skip the magic number at the start of the IDX file
        n_items = struct.unpack('>I', gz.read(4))  # number of images or labels
        if not label:
            n_rows = struct.unpack('>I', gz.read(4))[0]
            n_cols = struct.unpack('>I', gz.read(4))[0]
            res = np.frombuffer(gz.read(n_items[0] * n_rows * n_cols), dtype=np.uint8)
            res = res.reshape(n_items[0], n_rows * n_cols)
        else:
            res = np.frombuffer(gz.read(n_items[0]), dtype=np.uint8)
            res = res.reshape(n_items[0], 1)
    return pd.DataFrame(res)
In [9]:
import sys
mount_point = tempfile.mkdtemp()
print(mount_point)
print(os.path.exists(mount_point))
print(os.listdir(mount_point))

if sys.platform == 'linux':
  print("start mounting....")
  with mnist_file.mount(mount_point):
    print("list dir...")
    print(os.listdir(mount_point))
    print("get the dataframe info of mounted data...")
    train_images_df = load_data(os.path.join(mount_point, 'train-images-idx3-ubyte.gz'))
    print(train_images_df.info())
/tmp/tmpmtcdrdqr
True
[]
start mounting....
list dir...
['t10k-images-idx3-ubyte.gz', 't10k-labels-idx1-ubyte.gz', 'train-images-idx3-ubyte.gz', 'train-labels-idx1-ubyte.gz']
get the dataframe info of mounted data...
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 60000 entries, 0 to 59999
Columns: 784 entries, 0 to 783
dtypes: uint8(784)
memory usage: 44.9 MB
None
In [10]:
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
In [11]:
import urllib.request
import os

data_folder = os.path.join(os.getcwd(), 'data')
os.makedirs(data_folder, exist_ok=True)

urllib.request.urlretrieve('https://azureopendatastorage.blob.core.windows.net/mnist/train-images-idx3-ubyte.gz',
                           filename=os.path.join(data_folder, 'train-images.gz'))
urllib.request.urlretrieve('https://azureopendatastorage.blob.core.windows.net/mnist/train-labels-idx1-ubyte.gz',
                           filename=os.path.join(data_folder, 'train-labels.gz'))
urllib.request.urlretrieve('https://azureopendatastorage.blob.core.windows.net/mnist/t10k-images-idx3-ubyte.gz',
                           filename=os.path.join(data_folder, 'test-images.gz'))
urllib.request.urlretrieve('https://azureopendatastorage.blob.core.windows.net/mnist/t10k-labels-idx1-ubyte.gz',
                           filename=os.path.join(data_folder, 'test-labels.gz'))
Out[11]:
('D:\\TEMP\\jupyter\\data\\test-labels.gz',
 <http.client.HTTPMessage at 0x21ac002cb38>)
In [12]:
import gzip
import struct

# load compressed MNIST gz files and return numpy arrays
def load_data(filename, label=False):
    with gzip.open(filename) as gz:
        struct.unpack('I', gz.read(4))  # skip the magic number at the start of the IDX file
        n_items = struct.unpack('>I', gz.read(4))  # number of images or labels
        if not label:
            n_rows = struct.unpack('>I', gz.read(4))[0]
            n_cols = struct.unpack('>I', gz.read(4))[0]
            res = np.frombuffer(gz.read(n_items[0] * n_rows * n_cols), dtype=np.uint8)
            res = res.reshape(n_items[0], n_rows * n_cols)
        else:
            res = np.frombuffer(gz.read(n_items[0]), dtype=np.uint8)
            res = res.reshape(n_items[0], 1)
    return res
In [13]:
# note we also shrink the intensity values (X) from 0-255 to 0-1. This helps the model converge faster.
X_train = load_data(os.path.join(
    data_folder, 'train-images.gz'), False) / 255.0
X_test = load_data(os.path.join(data_folder, 'test-images.gz'), False) / 255.0
y_train = load_data(os.path.join(
    data_folder, 'train-labels.gz'), True).reshape(-1)
y_test = load_data(os.path.join(
    data_folder, 'test-labels.gz'), True).reshape(-1)

# now let's show some randomly chosen images from the training set.
count = 0
sample_size = 30
plt.figure(figsize=(16, 6))
for i in np.random.permutation(X_train.shape[0])[:sample_size]:
    count = count + 1
    plt.subplot(1, sample_size, count)
    plt.axis('off')  # hide axes for a cleaner digit thumbnail
    plt.text(x=10, y=-10, s=y_train[i], fontsize=18)
    plt.imshow(X_train[i].reshape(28, 28), cmap=plt.cm.Greys)
plt.show()

Azure Databricks

Language: Python

Load MNIST into a data frame using Azure Machine Learning tabular datasets.

See https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-create-register-datasets to learn more about datasets.

Get complete dataset into a data frame

In [1]:
# This is a package in preview.
from azureml.opendatasets import MNIST

mnist = MNIST.get_tabular_dataset()
mnist_df = mnist.to_spark_dataframe()
ActivityStarted, get_tabular_dataset
ActivityCompleted: Activity=get_tabular_dataset, HowEnded=Success, Duration=6821.78 [ms]
In [2]:
display(mnist_df.limit(5))
[Output: the first five rows of the DataFrame, with pixel-intensity columns 0 through 783 and a label column.]
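
The pandas cells in the Azure Notebooks section above also split the data into train and test sets and separate the pixel columns from the label column; a similar step is sketched below for Spark. This is not part of the original sample: it assumes the same filter argument (datasetFilter) used in the pandas cells and the column layout reported by mnist_df.info() earlier on this page (pixel columns 0 through 783 plus a label column).

# Sketch (assumptions noted above): load the training split and separate
# the 784 pixel-intensity columns from the label column in Spark.
mnist_train = MNIST.get_tabular_dataset(datasetFilter='train')
train_df = mnist_train.to_spark_dataframe()

pixel_cols = [c for c in train_df.columns if c != 'label']  # columns "0" .. "783"
features_df = train_df.select(pixel_cols)  # pixel intensities, 0-255
labels_df = train_df.select('label')       # digit labels, 0-9

print(features_df.count(), len(features_df.columns))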