
This video is not available in Danish. This video is available in English (US).

Leveraging Azure Databricks to minimize time to insight by combining Batch and Stream processing pipelines

Bringing data to life in a timely manner is every developer's dream. With Azure Databricks Delta, this dream is closer to reality than ever before. Join this session to see how you can build simple pipelines that bring together real-time data and merge it with massive batch datasets, leveraging the best of both worlds with minimal friction. With data driving automated decision-making processes infused into intelligent applications, this session will enable you to develop intelligence that integrates directly with your in-flight data.
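As a rough illustration of the pattern the session describes, the sketch below shows a stream-static join in Structured Streaming on Databricks: a real-time stream of events is enriched with a large batch Delta table and written back to Delta for downstream use. The paths, table layout, and join key (customer_id) are illustrative assumptions, not material from the session itself.

```python
# A minimal sketch (not the presenters' exact pipeline) of combining a
# real-time stream with a batch Delta dataset on Azure Databricks.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Batch side: a large historical reference dataset stored as a Delta table.
customers = spark.read.format("delta").load("/mnt/delta/customers")

# Streaming side: real-time events landing in another Delta table
# (the source could equally be Event Hubs or Kafka).
events = spark.readStream.format("delta").load("/mnt/delta/events")

# Stream-static join: each micro-batch of events is enriched with the
# batch dataset, so fresh and historical data meet in one pipeline.
enriched = (
    events
    .join(customers, on="customer_id", how="left")
    .withColumn("processed_at", F.current_timestamp())
)

# Sink: append the enriched stream to a Delta table that intelligent
# applications can query while the data is still in flight.
query = (
    enriched.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/delta/_checkpoints/enriched_events")
    .start("/mnt/delta/enriched_events")
)
```

Because both sides of the join live in Delta tables, the same enriched output can serve interactive batch queries and streaming consumers without maintaining two separate pipelines.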

Related videos

The Developer Data Scientist – Creating New Analytics Driven Applications using Apache Spark with Azure Databricks

Machine learning at scale

A Developer’s Introduction to Big Data Processing with Azure Databricks