Matias Quaranta joins Scott Hanselman to share some best practices for creating serverless geo-distributed applications with Azure Cosmos DB. With the native integration between Azure Cosmos DB and Azure Functions, you can create database triggers, input bindings, and output bindings directly from your Azure Cosmos DB account. Using Azure Functions and Azure Cosmos DB, you can create and deploy event-driven serverless apps with low-latency access to rich data for a global user base.

- Serverless database computing using Azure Cosmos DB and Azure Functions
- Serverless geo-replicated event-based architecture sample for Azure Friday (GitHub)
- Change feed in Azure Cosmos DB - overview
- Create a free account (Azure)
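The trigger integration described above delivers batches of changed documents to your function. As a rough illustration, the handler logic might look like the following sketch, written as a plain Python function with the changed documents modeled as dicts (the `status` field and `handle_changes` name are hypothetical, not part of the Azure Functions API):

```python
# Hypothetical sketch of the logic inside an Azure Functions Cosmos DB
# trigger handler. In a real Function, the change feed binding delivers
# the changed documents; here they are plain dicts so the handler logic
# can be shown in isolation.

def handle_changes(documents):
    """Process a batch of documents delivered by a Cosmos DB trigger.

    Returns the ids of the documents that were handled, e.g. for logging.
    """
    handled = []
    for doc in documents:
        # The trigger fires on inserts and updates; branch on a
        # (hypothetical) document field to decide the downstream action.
        if doc.get("status") == "new-order":
            # e.g. hand off to an output binding (queue, email, etc.)
            pass
        handled.append(doc["id"])
    return handled

# Example batch, shaped like Cosmos DB documents (every document has an id).
batch = [
    {"id": "1", "status": "new-order"},
    {"id": "2", "status": "shipped"},
]
print(handle_changes(batch))  # ['1', '2']
```

In a deployed function, the binding configuration (connection string, database, container, lease container) would take the place of the hand-built `batch`.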
Spark is the world’s foremost distributed analytics platform, delivering in-memory analytics with a speed and ease of use unheard of in Hadoop. Azure Cosmos DB is the lightning-fast distributed database powering Fortune 500 companies like Walmart, Exxon Mobil, Toyota, and many others. Did you know you can combine them easily using our natively built azure-cosmosdb-spark connector? You can also use the new Spark API feature integration, which allows Spark to take full advantage of Cosmos DB to run real-time analytics directly on petabytes of operational data. In this session we’ll go over some of the most common use cases of the azure-cosmosdb-spark connector and highlight how to avoid the most common pitfalls. We will talk about the new Azure Cosmos DB Spark API and the native support it brings for Apache Spark engines executing directly on petabytes of operational data stored in your globally distributed Cosmos databases. We will walk through the capabilities the Spark API brings to developers, data engineers, and data scientists so that they can use Cosmos DB as a flexible, scalable, and performant planet-scale data platform for running both OLTP and HTAP workloads alike.
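To make the connector usage above concrete, here is a sketch of the read configuration typically passed to azure-cosmosdb-spark. The endpoint, key, database, and collection values are placeholders, and the option names should be verified against the connector version you actually use:

```python
# Sketch of a read configuration for the azure-cosmosdb-spark connector.
# All values are placeholders; option names follow the connector's
# commonly documented configuration (verify against your version).
read_config = {
    "Endpoint": "https://<your-account>.documents.azure.com:443/",
    "Masterkey": "<your-key>",
    "Database": "telemetry",            # hypothetical database name
    "Collection": "events",             # hypothetical collection name
    "query_custom": "SELECT * FROM c",  # optional server-side filter
}

# Inside a Spark session this would be used roughly as:
#   df = (spark.read
#               .format("com.microsoft.azure.cosmosdb.spark")
#               .options(**read_config)
#               .load())
```

Pushing a `query_custom` filter to the server keeps the data transferred into Spark (and the RUs consumed) to a minimum, which is one of the common pitfalls the session calls out.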
For many newcomers to Azure Cosmos DB, the learning process starts with data modeling and partitioning. How should I structure my data? When should I co-locate data in a single container? Should I de-normalize or normalize properties? What’s the best partition key for my model? In this demo-filled session, we discuss the strategies and thought process one should adopt for modeling and partitioning data effectively in Azure Cosmos DB. Using a real-world example, we explore key Cosmos DB concepts – request units (RU), partitioning, and data modeling – and how understanding them guides the path to a data model that yields the best performance and scalability.
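One partitioning strategy often covered in sessions like this is the *synthetic* partition key: combining two properties so writes spread across partitions while common queries stay within one. The property names below (`tenant_id`, `date`) are illustrative assumptions, not a prescribed schema:

```python
# Sketch of a synthetic partition key: concatenating two document
# properties spreads write load while keeping "one tenant, one day"
# queries inside a single logical partition. Field names are illustrative.

def synthetic_partition_key(doc):
    """Build a partition key like 'contoso-2019-05-01' from document fields."""
    return f"{doc['tenant_id']}-{doc['date']}"

doc = {"tenant_id": "contoso", "date": "2019-05-01", "reading": 42}
doc["partitionKey"] = synthetic_partition_key(doc)
print(doc["partitionKey"])  # contoso-2019-05-01
```

The right combination depends on your query patterns: a key that is too coarse creates hot partitions, while one that is too fine turns single-partition queries into expensive cross-partition fan-outs.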
In this session, we will discuss how to build mission-critical multi-tenant systems on Azure Cosmos DB that scale out to handle a global user base. We’ll begin by covering key concepts surrounding performance isolation, security isolation, high availability, disaster recovery, and SLAs. And then we’ll dive deep by walking through the Cosmos DB design considerations and capabilities – including partitioning strategies and multi-master.
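A core multi-tenant design decision mentioned above is performance isolation: small tenants can share a container partitioned by tenant id, while large or noisy tenants get a dedicated container. The threshold and naming below are illustrative assumptions, not a recommended configuration:

```python
# Sketch of a tenant-placement decision for a multi-tenant Cosmos DB
# design: shared container for small tenants, dedicated container for
# large ones. The RU threshold and names are hypothetical.

SHARED_RU_THRESHOLD = 10_000  # hypothetical RU/s budget for shared tenants

def place_tenant(tenant_id, expected_ru):
    """Return (container, partition_key) for a tenant."""
    if expected_ru > SHARED_RU_THRESHOLD:
        # Dedicated container: full performance and security isolation.
        return (f"tenant-{tenant_id}", tenant_id)
    # Shared container: cheaper; tenants isolated logically by partition key.
    return ("shared-tenants", tenant_id)

print(place_tenant("fabrikam", 500))    # ('shared-tenants', 'fabrikam')
print(place_tenant("contoso", 50_000))  # ('tenant-contoso', 'contoso')
```

The shared tier trades cost for weaker isolation; moving a tenant between tiers as it grows is one of the operational considerations a design like this has to plan for.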
As developers push intellectual property to registries, how will you secure and protect that IP? Additionally, how do you ensure that once those applications are deployed, they are running properly and in good shape? In this session we'll cover building container images, and image scanning, signing, and promotion across environments. Then we will look at the tools and knowledge you need to keep your containerized applications healthy, and how to detect when something goes wrong.
Make More of the Sky - Air Traffic Management (ATM) services affect the quality and performance of every commercial flight in the world, currently transporting about a billion passengers a year. Growing traffic, limited systems capacity, strict safety requirements, and high environmental awareness demand constant improvement of ATM services. Come and listen to Heiko Udluft, AirSense Technical Leader, Jesse Anderson, Big Data guru, and Vincent Chartier, Customer Success CTO at Microsoft France, tell how they use Azure to continuously develop, deploy, and release AirSense services, Airbus's new generation of ATM services. These services, powered by the Azure Data & Analytics platform, provide business insights based on real-time aircraft position data to Airbus's internal and external customers in order to shape the future of ATM. Find out how Azure Event Hubs, Azure Cosmos DB, and Azure Databricks, amongst others, were combined to handle real-time streams, derive analytics, and provide decision support directly relevant and valuable to the air transportation industry. Heiko, Jesse, and Vincent will also cover the collaborative approach they leveraged to overcome the challenges encountered along the way.
This session will talk about how various Azure technologies like Azure Event Hubs, Azure Cosmos DB, and Node.js can be used to process telematics data. We'll do a code demo showing how sample telematics data is ingested into Event Hubs, processed by multiple event processors, and stored in Azure Cosmos DB.
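As a rough sketch of the processing step in a pipeline like this, a consumer might parse each raw telematics event, enrich it, and give it an id before writing it to Cosmos DB. The field names (`vin`, `speed_kmh`, `ts`) and speeding threshold are illustrative assumptions, shown here in Python rather than the session's Node.js:

```python
# Sketch of enriching a raw telematics event pulled from Event Hubs
# before storing it in Cosmos DB. Field names and the speed threshold
# are hypothetical, not a real schema.
import json

def enrich_event(raw):
    """Parse a raw telematics event and flag speeding vehicles."""
    event = json.loads(raw)
    event["speeding"] = event["speed_kmh"] > 120  # hypothetical threshold
    # Cosmos DB documents need an id; derive one from vehicle + timestamp.
    event["id"] = f"{event['vin']}-{event['ts']}"
    return event

raw = '{"vin": "WDD123", "speed_kmh": 135, "ts": 1557240000}'
print(enrich_event(raw)["speeding"])  # True
```

Deriving a deterministic id from vehicle and timestamp also makes the write idempotent, which matters when Event Hubs delivers an event more than once.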
In this session we will start with a modern app and optimize it for Kubernetes using Azure Kubernetes Service (AKS). In addition to migrating to AKS, we will take advantage of native Azure services like the Cosmos DB API for MongoDB (comparing options like running Dockerized MongoDB or hosted MongoDB Atlas along the way). We will also discuss Kubernetes concepts like service discovery, external services, and the service catalog, and how they relate to AKS and Cosmos DB interaction. Finally, we will look at Cosmos DB capabilities like Virtual Network Service Endpoints as a way to secure the traffic between AKS and Cosmos DB.
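Because Cosmos DB's API for MongoDB speaks the MongoDB wire protocol, an app in AKS connects with an ordinary MongoDB connection string. The sketch below builds one; the account name and key are placeholders, and the host, port, and query options follow the commonly documented Cosmos DB Mongo API format, which you should confirm against the connection string shown for your account in the Azure portal:

```python
# Sketch of building a MongoDB-style connection string for Cosmos DB's
# API for MongoDB. Account and key are placeholders; verify the exact
# host/port/options against your account's connection string.

def cosmos_mongo_uri(account, key):
    return (
        f"mongodb://{account}:{key}@{account}.documents.azure.com:10255/"
        "?ssl=true&replicaSet=globaldb"
    )

uri = cosmos_mongo_uri("myaccount", "<primary-key>")
print(uri)
```

In AKS, the account name and key would typically be injected as a Kubernetes Secret rather than hard-coded, and with a Virtual Network Service Endpoint in place the traffic on that connection never leaves the virtual network.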
In this session, you will learn how to build event-driven apps using Azure Cosmos DB and Azure Functions. Users expect modern apps to offer event-driven, near real-time experiences. Learn how to subscribe to changes in Azure Cosmos DB collections and trigger logic in real time without needing to manage any servers. Understand real-world use cases in multi-billion dollar industries such as retail and gaming.
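For the gaming use case mentioned above, a typical change-feed pattern is maintaining a materialized view such as a leaderboard. In production the change feed would invoke an Azure Function on each batch; the sketch below shows only the aggregation logic, on plain dicts, with hypothetical field names:

```python
# Sketch of a change-feed-driven materialized view: fold batches of
# changed score documents into a leaderboard. In production this logic
# would run inside a Cosmos DB-triggered Azure Function; field names
# (playerId, score) are illustrative.

def apply_changes(leaderboard, changed_docs):
    """Fold a batch of changed score documents into a leaderboard dict."""
    for doc in changed_docs:
        player = doc["playerId"]
        leaderboard[player] = max(leaderboard.get(player, 0), doc["score"])
    return leaderboard

board = {}
apply_changes(board, [{"playerId": "ada", "score": 90},
                      {"playerId": "lin", "score": 70},
                      {"playerId": "ada", "score": 120}])
print(board)  # {'ada': 120, 'lin': 70}
```

Because the change feed delivers each document's latest version at least once, the fold uses `max` so replaying a batch leaves the leaderboard unchanged.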
Leading enterprises are using AI to power innovation across industries, including healthcare, automotive, and finance. In this session, gain insight into how these enterprise AI solutions are implemented using Azure Cognitive Services on Spark and Azure Cosmos DB. Cognitive Services on Spark enables working with Azure's intelligent services at massive scale within the Apache Spark distributed computing ecosystem. We will demonstrate how customers like SAP, NASCAR, MediaValet, and OpenText are using the Custom Vision service, Video Indexer, Text Analytics, Bing, and Speech services at cloud scale, and how customers like Kroger are able to mitigate privacy and security concerns by deploying Cognitive Services on-premises as containers.