Hands-On SQL Server 2019 Big Data Clusters with Spark


English | MP4 | AVC 1920×1080 | AAC 48 kHz 2ch | 1h 42m | 428 MB
eLearning | Skill level: All Levels


Deploy scalable clusters of SQL Server with Spark and HDFS containers running on Kubernetes using SQL Server 2019 Big Data Clusters

A SQL Server big data cluster is a cluster of Linux containers orchestrated by Kubernetes. Starting with SQL Server 2019, big data clusters allow you to deploy scalable clusters of SQL Server, Spark, and HDFS containers running on Kubernetes. This enables you to read, write, and process big data from T-SQL or Spark, allowing you to easily combine and analyze high-value relational data with high-volume big data.
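As a rough illustration of combining relational and HDFS data from T-SQL, the sketch below uses the `SqlStoragePool` data source that big data clusters expose over HDFS; the file path, column layout, and the `dbo.customer` table are hypothetical assumptions, not taken from the course.

```sql
-- Hypothetical sketch: query a CSV file in the cluster's HDFS from T-SQL.
-- Assumes a file under /clickstream and a relational dbo.customer table.
CREATE EXTERNAL FILE FORMAT csv_file
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', STRING_DELIMITER = '"', FIRST_ROW = 2)
);

CREATE EXTERNAL TABLE web_clickstream_hdfs
(
    wcs_user_sk   BIGINT,
    i_category_id BIGINT,
    clicks        BIGINT
)
WITH (
    DATA_SOURCE = SqlStoragePool,  -- built-in data source over the cluster's HDFS
    LOCATION    = '/clickstream',
    FILE_FORMAT = csv_file
);

-- Join high-volume HDFS data with high-value relational data in one query.
SELECT c.c_first_name, SUM(w.clicks) AS total_clicks
FROM web_clickstream_hdfs AS w
JOIN dbo.customer AS c ON c.c_customer_sk = w.wcs_user_sk
GROUP BY c.c_first_name;
```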

In this course, you’ll get up and running with SQL Server 2019 big data clusters. With the help of examples based on a real-world scenario, you’ll see how SQL Server big data clusters provide flexibility in your interactions with big data. We’ll begin by covering the architecture and concepts of SQL Server 2019 big data clusters. Then, after deploying a big data cluster on your Kubernetes cluster, you’ll learn how to query external data sources, combine data from multiple external sources through the cluster, and load your data into a SQL Server 2019 data pool.
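Loading data into a data pool can be sketched along the following lines, using the built-in `SqlDataPool` data source; the table names and columns are hypothetical, and the source table is assumed to be an external table already defined over HDFS.

```sql
-- Hypothetical sketch: cache data in the cluster's scale-out data pool.
CREATE EXTERNAL TABLE web_clickstream_pool
(
    wcs_user_sk   BIGINT,
    i_category_id BIGINT,
    clicks        BIGINT
)
WITH (
    DATA_SOURCE  = SqlDataPool,   -- built-in data source for the data pool
    DISTRIBUTION = ROUND_ROBIN    -- spread rows across data pool instances
);

-- Load from an assumed external table over HDFS into the data pool.
INSERT INTO web_clickstream_pool
SELECT wcs_user_sk, i_category_id, clicks
FROM web_clickstream_hdfs;
```

Once loaded, the data pool table can be queried like any other T-SQL table, with the work distributed across the pool's SQL Server instances.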

By the end of the course, you’ll be well-versed in managing SQL Server 2019 big data clusters with Spark.

This course guides you through SQL Server 2019 Big Data Clusters with Spark, teaching you to:

  • Understand what SQL Server big data clusters are
  • Set up SQL Server 2019 Big Data Clusters with Spark
  • Use installation tools to set up SQL Server 2019 Big Data Clusters with Spark
  • Deploy Azure Kubernetes Service correctly
  • Implement security in SQL Server 2019 Big Data Clusters with Spark
  • Connect to a big data cluster
  • Work with the cluster administration portal
  • Pick up Kubernetes troubleshooting tips and tricks
  • Implement Spark jobs