2. Azure HDInsight
• The most trusted and compliant platform
A secure and managed Apache Hadoop and Spark platform for building data lakes in the cloud
@ashishth
3. Reference architecture: data sources → ingest → store → serve → consumers
Data sources: corporate data; devices & sensors; streaming/real-time applications
Hot path: ingest real-time data with Kafka; stream processing (Storm/Apache Spark); real-time NoSQL store (HBase); store real-time data for long-term analysis
Cold path: ingest batch data; ETL (Apache Spark/Hive/Pig); ad-hoc query in the data lake
Storage: HDFS-compliant storage (Azure Data Lake Storage Gen2)
Serving layer: Apache Spark / Hive LLAP / Presto
Consumers: advanced analytics & data science (machine learning with R, Python, APIs); analytics and data exploration; corporate reporting; self-service BI
Cross-cutting: governance; metadata management; security/access control; orchestration
@ashishth
9. Estimated transfer time by data quantity and network bandwidth
Data Qty   45 Mbps (T3)   100 Mbps   1 Gbps
1 TB       2 days         1 day      2 hours
10 TB      22 days        10 days    1 day
35 TB      76 days        34 days    3 days
80 TB      173 days       78 days    8 days
100 TB     216 days       97 days    10 days
200 TB     1 year         194 days   19 days
500 TB     3 years        1 year     49 days
1 PB       6 years        3 years    97 days
2 PB       12 years       5 years    194 days
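The figures above follow from simple arithmetic. A quick sketch (my own helper, not part of any Azure tooling) reproduces the table's cells under the optimistic assumption of 100% sustained link utilization:

```python
# Rough estimator behind the table above: convert the data size to bits,
# divide by the link rate, then convert seconds to days. Real transfers
# run slower than this idealized figure.
def transfer_days(terabytes: float, mbps: float) -> float:
    bits = terabytes * 8e12          # decimal TB -> bits
    seconds = bits / (mbps * 1e6)    # link rate in bits/second
    return seconds / 86400

print(round(transfer_days(10, 100), 1))  # 9.3 -- the table rounds up to 10 days
```

The same function gives roughly 2.2 hours for 1 TB over 1 Gbps, matching the first row.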
@ashishth
10. Transfer data over the network with TLS
• Over the internet
• ExpressRoute
• Data Box online data transfer
Shipping data offline
• Data Box offline data transfer
@ashishth
11. Azure Data Box family
• Data Box Disk: USB 3.1 SSD disks, 8 TB each, up to 40 TB per order (order up to 5 in each pack)
• Data Box: ruggedized, self-contained appliance, 100 TB
• Data Box Heavy: ruggedized, self-contained appliance, 1 PB
Use Azure Data Box to migrate data from an on-premises HDFS store to Azure Storage
@ashishth
12. Storage options for HDInsight
Type                  | Namespace    | Latency (consistency) | Workloads                 | Bandwidth     | Key benefits
ADLS Gen2             | Hierarchical | 10-50 ms (medium)     | HDInsight 3.6 & 4.0       | Unconstrained | Atomic rename; file/folder-level ACLs
Standard Blob         | Object store | 10-50 ms (medium)     | HDInsight 3.6 & 4.0       | Unconstrained | Mature
Premium Blob          | Object store | ~5 ms (high)          | HBase in preview          | Unconstrained | Fast
Premium Managed Disks | Hierarchical | ~5 ms (high)          | Kafka; HBase in preview   | Based on disk | Consistent latency
ADLS Gen1             | Hierarchical | 10-100 ms (low)       | HDInsight 3.6 (no HBase)  | High          | Atomic rename; file/folder-level ACLs
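The trade-offs in the table lend themselves to a small lookup. The structure below is my own summary of the slide, not an Azure API; it just makes the latency comparison queryable:

```python
# My own distillation of the storage-options table above (not an Azure API).
# latency_ms holds the (min, max) of the quoted latency range.
STORAGE_OPTIONS = {
    "ADLS Gen2":             {"namespace": "hierarchical", "latency_ms": (10, 50),  "benefit": "atomic rename, file/folder ACLs"},
    "Standard Blob":         {"namespace": "object",       "latency_ms": (10, 50),  "benefit": "mature"},
    "Premium Blob":          {"namespace": "object",       "latency_ms": (5, 5),    "benefit": "fast"},
    "Premium Managed Disks": {"namespace": "hierarchical", "latency_ms": (5, 5),    "benefit": "consistent latency"},
    "ADLS Gen1":             {"namespace": "hierarchical", "latency_ms": (10, 100), "benefit": "atomic rename, file/folder ACLs"},
}

def low_latency_options(max_ms: int = 5) -> list[str]:
    """Options whose worst-case quoted latency is within max_ms."""
    return [name for name, o in STORAGE_OPTIONS.items() if o["latency_ms"][1] <= max_ms]

print(sorted(low_latency_options()))  # ['Premium Blob', 'Premium Managed Disks']
```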
@ashishth
22. Workload        | Caching option                          | Key benefits
Spark               | Spark IO Cache                          | Up to ~8-10x perf improvement
HBase & Phoenix     | Bucket cache                            | Up to 5-10x perf gains on recently read or written data
Hive + LLAP         | LLAP intelligent cache / result cache   | Up to ~4-100x gains on cached data
@ashishth
23. Azure Data Lake Storage
INSTANCE   CORES   RAM         TEMP SSD
D1 v2      1       3.50 GiB    50 GiB
D2 v2      2       7.00 GiB    100 GiB
D3 v2      4       14.00 GiB   200 GiB
D4 v2      8       28.00 GiB   400 GiB
D5 v2      16      56.00 GiB   800 GiB
• Significant Spark performance speed-up with IO cache (up to 9x perf gains)
• Automatic cache resource management
• DRAM + temp SSD make a large cache pool
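As a back-of-the-envelope illustration of the "DRAM + temp SSD" cache pool, consider a hypothetical 10-worker D4 v2 cluster (my own arithmetic, not an Azure sizing tool):

```python
# Raw cache capacity for a hypothetical 10-worker D4 v2 cluster:
# each worker contributes its RAM plus its temp SSD to the pool.
# In practice only a fraction of DRAM is usable for caching, so treat
# this as an upper bound, not a sizing recommendation.
ram_gib, ssd_gib, workers = 28, 400, 10   # D4 v2 specs from the table above
pool_gib = (ram_gib + ssd_gib) * workers
print(pool_gib)  # 4280 GiB of combined DRAM + SSD across the cluster
```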
@ashishth
24. PERIMETER
• Isolate clusters within VNETs
• Service Endpoint support for WASB, Azure DB, Cosmos DB
• Restrict outbound traffic using NVAs*
AUTHENTICATION
• Azure Active Directory
• Kerberos with Active Directory
AUTHORIZATION
• Role-Based Access Control
• Apache Ranger based access control
DATA PROTECTION
• Encryption on-the-wire with HTTPS enforced
• Encryption at rest using Azure Key Vault
• Auditing of all data operations and configuration changes
@ashishth
29. • Get safe mode status: hdfs dfsadmin -D 'fs.default.name=hdfs://mycluster/' -safemode get
• A report that shows the details of HDFS state: hdfs dfsadmin -D 'fs.default.name=hdfs://mycluster/' -report
• Get HDFS out of safe mode: hdfs dfsadmin -D 'fs.default.name=hdfs://mycluster/' -safemode leave
• Get HDFS into safe mode: hdfs dfsadmin -D 'fs.default.name=hdfs://mycluster/' -safemode enter
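A common use of the `-safemode get` command is to wait for the NameNode to leave safe mode before starting jobs. A minimal sketch (not an official API; it assumes the `hdfs` binary is on PATH, as on an HDInsight head node):

```python
# Sketch: poll `hdfs dfsadmin -safemode get` and block until the
# NameNode leaves safe mode, e.g. before kicking off batch jobs.
import subprocess
import time

def safemode_on(dfsadmin_output: str) -> bool:
    # `-safemode get` prints a line like "Safe mode is ON" or "Safe mode is OFF"
    return "Safe mode is ON" in dfsadmin_output

def wait_for_safemode_off(fs: str = "hdfs://mycluster/", poll_secs: int = 10) -> None:
    cmd = ["hdfs", "dfsadmin", "-D", f"fs.default.name={fs}", "-safemode", "get"]
    while safemode_on(subprocess.run(cmd, capture_output=True, text=True).stdout):
        time.sleep(poll_secs)
```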
@ashishth
Azure HDInsight is a secure and managed platform for building data lakes on Azure, based on the Apache Hadoop and Spark frameworks. So, what does HDInsight have to offer?
Reliable open source analytics with an industry-leading SLA
HDInsight allows you to easily spin up open source cluster types backed by the industry's best 99.9% SLA and 24/7 support. We guarantee this SLA for the entire big data solution, not just the VM instances. HDInsight is architected for full redundancy and high availability, including head node replication, data geo-replication, and a built-in standby NameNode, making HDInsight resilient to critical failures not addressed in standard Hadoop implementations. Azure also offers cluster monitoring and 24x7 enterprise support backed by Microsoft and Hortonworks, with 37 combined committers for Hadoop core (more than all other managed cloud providers combined) to support your deployment and the ability to fix and commit code back to Hadoop.
Enterprise-grade security and monitoring
HDInsight protects your data assets and easily extends your on-premises security and governance controls to the cloud. We feature single sign-on (SSO), multi-factor authentication, and seamless management of millions of identities through Azure Active Directory. You can authorize users and groups with fine-grained access control policies over all your enterprise data with Apache Ranger. HDInsight meets HIPAA, PCI, and SOC compliance requirements, ensuring your enterprise data assets are always protected with the highest security and regulatory compliance. To ensure the highest level of business continuity, HDInsight extends capabilities for alerting, monitoring, defining pre-emptive actions, and enhanced workload protection through native integration with Azure Operations Management Suite (OMS).
Most productive platform for developers and data scientists
HDInsight offers developers tailored experiences through rich productivity suites for Hadoop and Spark, with integrated development environments in Visual Studio, Eclipse, and IntelliJ supporting Scala, Python, R, Java, and .NET. HDInsight gives data scientists the ability to create narratives that combine code, statistical equations, and visualizations to tell a story about the data, through integration with the two most popular notebooks: Jupyter and Zeppelin. HDInsight is also the only managed cloud Hadoop solution with integration to Microsoft R Server. Multi-threaded math libraries and transparent parallelization in R Server mean handling up to 1000x more data at up to 50x faster speeds than open source R, helping you train more accurate models for better predictions than previously possible.
Cost-effective cloud scale
HDInsight decouples compute and storage, enabling you to cost-effectively scale workloads up or down, independent of storage. Local storage can still be used for caching and fast I/O. Spark and interactive Hive users can choose SSD memory for interactive performance, while Kafka users can retain all streaming data on premium managed disks. You only pay for the compute and storage you use, and you can choose whichever Azure VM type enables the best utilization of resources. A recent study showed HDInsight delivering 63% lower TCO than deploying Hadoop on premises over five years.*
Integration with leading productivity applications
In the broader Hadoop ecosystem, there is a thriving market of independent software vendors (ISVs) who provide value-added solutions. Through a unique design in which every cluster is extended with edge nodes and script actions, HDInsight lets customers spin up Hadoop and Spark clusters pre-integrated and pre-tuned with any ISV application out of the box. Datameer, Cask, AtScale, and StreamSets are a few such applications that are very popular on the HDInsight platform today.
Easy for administrators to manage
With HDInsight, administrators can deploy Hadoop in the cloud without buying new hardware or incurring other up-front costs, and with no time-consuming installation or setup. There is also no need to patch the operating system or upgrade Hadoop versions; Azure does it for you. Launch your first cluster in minutes.
Before I describe specific capabilities and value propositions of HDInsight, let us take a quick look at the architecture of an HDInsight cluster. We will build upon this when we talk about security later in the presentation.
First off, a key difference between an on-premises Hadoop cluster and an HDInsight cluster is that with HDInsight, the storage and compute layers are separated. This allows storage and compute to be scaled independently of each other. We have seen in numerous customer cases that trying to combine storage and compute on a single cluster often leads to underutilization of one or the other, or both. With HDInsight, you can keep loading data into Azure Data Lake Storage Gen1 or Gen2, or into WASB, and create small or large clusters as and when needed.
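Because storage is decoupled from compute, jobs address data by URI scheme rather than by local HDFS path. A small sketch of the two common URI shapes (the account and container names below are hypothetical):

```python
# Illustrative helper (my own, not an Azure SDK function): build the
# URI a job would use to address cluster-attached storage directly.
def storage_uri(scheme: str, container: str, account: str, path: str) -> str:
    """Build an Azure storage URI for the WASB or ABFS driver."""
    endpoints = {
        "wasbs": "blob.core.windows.net",  # Azure Blob storage (WASB driver)
        "abfss": "dfs.core.windows.net",   # ADLS Gen2 (ABFS driver)
    }
    return f"{scheme}://{container}@{account}.{endpoints[scheme]}/{path}"

print(storage_uri("abfss", "data", "mydatalake", "raw/events.csv"))
# abfss://data@mydatalake.dfs.core.windows.net/raw/events.csv
```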
Each HDInsight cluster comes with 2 gateway nodes, 2 head nodes and 3 ZooKeeper nodes. In most cases, these are free of charge. As we will discuss later, we provision multiple of these nodes to ensure high availability.
Each HDInsight cluster lives within a VNET. The gateway nodes are the ONLY public endpoints accessible from outside the VNET. As we will see later, this architecture allows you to securely lock down your HDInsight cluster.
Transfer data over the network with TLS
Over internet - You can transfer data to Azure storage over a regular internet connection using any one of several tools such as: Azure Storage Explorer, AzCopy, Azure Powershell, and Azure CLI. See Moving data to and from Azure Storage for more information.
Express Route - ExpressRoute is an Azure service that lets you create private connections between Microsoft datacenters and infrastructure that’s on your premises or in a colocation facility. ExpressRoute connections do not go over the public Internet, and offer higher security, reliability, and speeds with lower latencies than typical connections over the Internet. For more information, see Create and modify an ExpressRoute circuit.
Data Box online data transfer - Data Box Edge and Data Box Gateway are online data transfer products that act as network storage gateways to manage data between your site and Azure. Data Box Edge, an on-premises network device, transfers data to and from Azure and uses artificial intelligence (AI)-enabled edge compute to process data. Data Box Gateway is a virtual appliance with storage gateway capabilities. For more information, see Azure Data Box Documentation - Online Transfer.
Shipping data offline
Import / Export service - you can send physical disks to Azure and they will be uploaded for you. For more information, see What is Azure Import/Export service?.
Data Box offline data transfer - Data Box, Data Box Disk, and Data Box Heavy devices help you transfer large amounts of data to Azure when the network isn’t an option. These offline data transfer devices are shipped between your organization and the Azure datacenter. They use AES encryption to help protect your data in transit, and they undergo a thorough post-upload sanitization process to delete your data from the device. For more information, see Azure Data Box Documentation - Offline Transfer.