Marcin Szymaniuk

DataFrames in Spark
Data Analysts’ perspective
www.tantusdata.com
About me
• Data Engineer @TantusData
• Have worked for: Spotify, Apple, telcos, startups
• Cluster installations, application architecture and
development, training, data team support
marcin@tantusdata.com
marcin.szymaniuk@gmail.com
@mszymani
www.tantusdata.com
Agenda
• Spark for Data Analysts
• Spark execution model
• Case by case…
• Summary
www.tantusdata.com
Data Analysts’/Scientists’ perspective
• Explain what happened and why
• Drive decisions to be made
• Build ML models
www.tantusdata.com
Bring analysis to data
[Diagram: a sample of the full dataset (DATA) is pulled out into R / Python / SAS]
• Sample only (a region, the latest month…)
• Coarse aggregates, e.g. month vs. hour (1:720)
www.tantusdata.com
Bring analysis to data
• Base station position
• Network technology
• Antenna specs
Photo credit: productcoalition.com
www.tantusdata.com
Bring analysis to data
• Analyze all data
• Faster analysis
• No extra data copies (GDPR!)
www.tantusdata.com
Data Analysts’/Scientists’ perspective
• Ad-hoc queries and reports
• Data cleanup
Photo credit: CrowdFlower
www.tantusdata.com
Spark API - DataFrame
eventsDf
  .groupBy("userId")
  .agg(
    max("value").alias("maxVal"),
    avg("value").alias("avgValue")
  )
  .join(usersDf, usersDf("id") === eventsDf("userId"))
  .select("userId", "maxVal", "avgValue", "name")
www.tantusdata.com
Data Analysts’ perspective
• Simple query fails?
• Simple query never finishes?
Photo credit: thedeliberatemusician.com
www.tantusdata.com
Deep dive
Photo credit: amazon.ca
www.tantusdata.com
DataFrame
[Diagram: a DataFrame split into Partition 1, Partition 2, Partition 3, Partition 4, …]
www.tantusdata.com
Transformation type
• Narrow - operates on a single partition at a time
• Wide - requires exchanging data between partitions (compare the sketch below)
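A minimal sketch of the difference, assuming a DataFrame df with userId and value columns (the names are illustrative):
import org.apache.spark.sql.functions._
// Narrow: each output row depends only on rows in its own partition.
val narrow = df.withColumn("valueDoubled", col("value") * 2)
// Wide: rows sharing a userId may sit in different partitions,
// so Spark has to shuffle data across partitions (and nodes).
val wide = df.groupBy("userId").agg(sum("value").alias("totalValue"))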
www.tantusdata.com
Narrow transformation
dataFrame
  .withColumn("uIdHashed", sha2(col("userId"), 256))
  .withColumn("strReversed", reverse(col("str")))
  .withColumn("v1v2Sum", col("v1") + col("v2"))
  .write
  .save(output)
Each task reads its input data, does the magic and writes out the result - all of it within a single stage (Stage 1).
www.tantusdata.com
Wide transformations
[Diagram: a DataFrame with three partitions holding interleaved rows such as 2|VIEW, 1|SEND, 1|CLICK, 3|SEND, 1|CALL, … A groupBy shuffles the rows so that in the new DataFrame all user 1 events (SEND, CLICK, CALL, CLICK, CALL) land in one partition, all user 2 events (VIEW, SEND, VIEW) in another, and all user 3 events (SEND, CALL, SEND) in a third.]
www.tantusdata.com
Wide transformations
• join
• groupBy
• repartition
• coalesce
• distinct
• …
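All of the operations above introduce a shuffle; one way to confirm this for a given query is to inspect the physical plan, where an Exchange node marks a shuffle. A minimal sketch, assuming the events and users DataFrames used on the next slide:
events
  .join(users, users("id") === events("userId"))
  .explain()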
www.tantusdata.com
Wide transformations
events
  .join(users, users("id") === events("userId"))
  //Extra logic
  .write
  .parquet(output)
www.tantusdata.com
Wide transformation
[Diagram: the job splits into Stage 1, Stage 2 and Stage 3, with a shuffle - an exchange of data between nodes - between the stages]
www.tantusdata.com
Simplest scenario ever
val df = spark.read.parquet("…")
[Diagram: a task reads its input block from HDFS]
www.tantusdata.com
Simplest scenario ever
[Diagram: each task reads its block from HDFS, adds the columns and writes the result back to HDFS]
val df = spark.read.parquet("…")
df
  .withColumn("year", year(col("timestamp")))
  .withColumn("month", month(col("timestamp")))
  .withColumn("day", dayofmonth(col("timestamp")))
  .write.save(output)
www.tantusdata.com
Simplest scenario ever
1 TB of events = 8000 HDFS blocks
www.tantusdata.com
Simplest scenario ever
1 TB of events = 8000 HDFS blocks = 8000 tasks
[Diagram: 8000 independent tasks, each reading one block from HDFS, adding the columns and writing back to HDFS]
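A quick way to check how many tasks the read stage will actually get - a minimal sketch; for DataFrames the number of input partitions follows the file splits (governed by settings such as spark.sql.files.maxPartitionBytes), so it roughly, not exactly, tracks the block count:
val df = spark.read.parquet("…")
println(df.rdd.getNumPartitions)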
www.tantusdata.com
Case #1
• Faster queries with a date condition?
• Organize your data by date!
www.tantusdata.com
Case #1
.filter(
  col("year") === "2018"
)
[Diagram: /DATA laid out as year directories (2015, 2016, 2017, …), each containing month directories (01, 02, 03, …), so a query filtered on the date only has to read the matching directories]
www.tantusdata.com
Case #1
val df = spark.read.parquet("…")
df
  .withColumn("year", year(col("timestamp")))
  .withColumn("month", month(col("timestamp")))
  .withColumn("day", dayofmonth(col("timestamp")))
  .write
  .partitionBy("year", "month", "day")
  .save(output)
www.tantusdata.com
Case #1
$ hdfs dfs -ls /user/marcin/out/year=2019/month=3/day=1 | wc -l
22
$ hdfs dfs -ls /user/marcin/out/year=2019/month=3/day=1 | wc -l
25
$ hdfs dfs -ls /user/marcin/out/year=2019/month=3/day=1 | wc -l
40
www.tantusdata.com
Case #1
val df = spark.read.parquet("…")
df
  .withColumn("year", year(col("timestamp")))
  .withColumn("month", month(col("timestamp")))
  .withColumn("day", dayofmonth(col("timestamp")))
  .write
  .partitionBy("year", "month", "day")
  .save(output)
[Diagram: every task reads its block from HDFS, adds the columns, and then writes one file per day it encounters (Day 1, Day 2, …, Day 90) to HDFS]
www.tantusdata.com
Case #1
90 days of data = 8000 blocks
www.tantusdata.com
Case #1
90 days of data, 8000 blocks → up to 90 * 8000 = 720,000 files produced
www.tantusdata.com
Consequences
• Slow!
• Downstream jobs will struggle
• Memory pressure on NameNode
• Might kill the cluster!
www.tantusdata.com
Case #1
• repartition before partitionBy in order to limit the number of files
www.tantusdata.com
Repartition
df
  .withColumn("year", year(col("eventTimestamp")))
  .withColumn("month", month(col("eventTimestamp")))
  .withColumn("day", dayofmonth(col("eventTimestamp")))
  .repartition(col("year"), col("month"), col("day"))
  .write
  .partitionBy("year", "month", "day")
  .save(output)
www.tantusdata.com
[Diagram: Stage 1 tasks read from HDFS and write their shuffle output, grouped by day (Day 1, Day 2, Day 3), to local disk; each Stage 2 task then fetches one day's data from every Stage 1 task and writes a single file for that day to HDFS]
www.tantusdata.com
Case #1 - summary
• each partitionBy task may create many files in HDFS
• repartition to the rescue
www.tantusdata.com
Case #2
• It fails after repartitioning…
www.tantusdata.com
Case #2
One day of events is 100 GB
[Diagram: after repartition(year, month, day), each Stage 2 task receives all of one day's data - roughly 100 GB per task]
www.tantusdata.com
Problems with shuffle
• Spill to disk
• Timeouts
• GC overhead limit exceeded
• OOM
• ExecutorLostFailure
www.tantusdata.com
Case #2
df
  .withColumn("year", year(col("eventTimestamp")))
  .withColumn("month", month(col("eventTimestamp")))
  .withColumn("day", dayofmonth(col("eventTimestamp")))
  .repartition(col("year"), col("month"), col("day"), col("hour"))
  .write
  .partitionBy("year", "month", "day")
  .save(output)
www.tantusdata.com
day | month | year | hour | event | …
1   | 1     | 2017 | 11   | CALL  |
1   | 1     | 2017 | 1    | TEXT  |
1   | 1     | 2017 | 4    | CLICK |
1   | 1     | 2017 | 7    | SEND  |
2   | 1     | 2017 | 1    | CLICK |
2   | 1     | 2017 | 3    | TEXT  |
2   | 1     | 2017 | 1    | TEXT  |
2   | 1     | 2017 | 1    | CALL  |
.repartition("year", "month", "day") sends all of day 1 to one partition and all of day 2 to another - only two tasks do the work.
.repartition("year", "month", "day", "hour") additionally splits each day by hour, spreading the rows over more partitions and keeping more tasks busy.
www.tantusdata.com
Case #2 - recap
• Know your data distribution when repartitioning (see the sketch below)
• Keep all your tasks busy - repartition by more columns if necessary
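A minimal sketch of such a check on the df used above - it counts rows per repartitioning key, so overloaded partitions stand out before the full job is run:
import org.apache.spark.sql.functions._
df
  .groupBy(
    year(col("eventTimestamp")).as("year"),
    month(col("eventTimestamp")).as("month"),
    dayofmonth(col("eventTimestamp")).as("day"))
  .count()
  .orderBy(desc("count"))
  .show(20, truncate = false)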
www.tantusdata.com
Case #3
events
  .join(users, users("id") === events("userId"))
  //Extra logic
  .write
  .parquet(output)
www.tantusdata.com
[Diagram: Stage 1 tasks read the events from HDFS and write their shuffle output to local disk; Stage 2 tasks do the same for the users; Stage 3 tasks then fetch the matching shuffle blocks from both sides, perform the join and write the result to HDFS]
www.tantusdata.com
Case #3
10 TB of events, uniformly distributed; 1 GB of users
.config(
  "spark.sql.shuffle.partitions",
  "200"
)
With 200 shuffle partitions (the default), 10 TB / 200 = 50 GB per task.
[Diagram: each Stage 3 task has to handle roughly 50 GB of shuffled event data]
www.tantusdata.com
Problems with shuffle
• Spill to disk
• Timeouts
• GC overhead limit exceeded
• OOM
• ExecutorLostFailure
www.tantusdata.com
What to do?
• Understand your data!
• Control the level of parallelism
• Choose number of partitions according to your data size
.config("spark.sql.shuffle.partitions", "2000")
.repartition(2000)
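A minimal sketch of where those two knobs live, assuming the events DataFrame from this case (the value 2000 is illustrative and should follow from your data size):
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder()
  .appName("events-users-join")
  // number of partitions produced by shuffles (joins, groupBy, repartition by columns)
  .config("spark.sql.shuffle.partitions", "2000")
  .getOrCreate()
// ...or set the level of parallelism explicitly for one DataFrame before the shuffle:
val eventsRepartitioned = events.repartition(2000)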
www.tantusdata.com
Case #3 - result
www.tantusdata.com
Case #4 - Skewed join
Photo credit: hiveminer.com
www.tantusdata.com
Case #4
10 TB of events; one user accounts for 1 TB of events, the rest are uniformly distributed
[Diagram: the shuffle sends every event of the heavy user to a single Stage 3 task, which ends up processing about 1 TB on its own]
www.tantusdata.com
Skewed join
• Bad data?
• Wrong logic?
• Just ok?
Photo credit: hiveminer.com
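A minimal sketch for checking whether a handful of keys cause the skew, assuming the events DataFrame from the previous cases - it lists the heaviest userIds:
import org.apache.spark.sql.functions.desc
events
  .groupBy("userId")
  .count()
  .orderBy(desc("count"))
  .show(10)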
www.tantusdata.com
Skewed Join
Events, salted with a random value:
eventId | userId | … | salt
af8     | 1      |   | 1
bf9     | 1      |   | 2
ff1     | 1      |   | 1
881     | 1      |   | 3
91f     | 2      |   | 2
cc6     | 1      |   | 3
b22     | 1      |   | 3
ee4     | 1      |   | 1
Users, replicated once per salt value:
userId | … | salt
1      |   | 1
1      |   | 2
1      |   | 3
2      |   | 1
2      |   | 2
2      |   | 3
3      |   | 1
3      |   | 2
3      |   | 3
Joining on (userId, salt) instead of userId alone spreads the heavy user's events across as many partitions as there are salt values.
www.tantusdata.com
Skewed Join
val eventsSalted = events
  .withColumn("salt", (rand() * 100).cast("int"))
www.tantusdata.com
Skewed Join
import spark.implicits._
val allSalts = spark.sparkContext
  .parallelize((0 until 100).toList)
  .toDF("salt")
val usersSalted = users.join(allSalts)
www.tantusdata.com
Skewed Join
eventsSalted
  .join(
    usersSalted,
    usersSalted("id") === eventsSalted("userId")
      && usersSalted("salt") === eventsSalted("salt")
  )
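Putting the pieces together - a minimal end-to-end sketch of the salted join, assuming the events and users DataFrames from the earlier slides and a SparkSession named spark (the salt range of 100 is illustrative and should scale with how heavy the biggest key is):
import org.apache.spark.sql.functions._
import spark.implicits._
val numSalts = 100
// Give every event a random salt in [0, numSalts).
val eventsSalted = events.withColumn("salt", (rand() * numSalts).cast("int"))
// Replicate every user row once per salt value (an explicit cross join).
val allSalts = (0 until numSalts).toDF("salt")
val usersSalted = users.crossJoin(allSalts)
// Join on (userId, salt): the heavy user's events are now spread over numSalts partitions.
val joined = eventsSalted.join(
  usersSalted,
  usersSalted("id") === eventsSalted("userId") &&
    usersSalted("salt") === eventsSalted("salt"))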
www.tantusdata.com
Use cases - recap
• Know your data!
• Understand the transformations you are doing
• Keep your tasks busy
Photo credit: shamakern.com
www.tantusdata.com
Challenges ahead
• Broadcast
• Cache
• Locality
• File formats
• Sizing executors
• …
www.tantusdata.com
Challenges ahead
[Diagram: OPS, DEVELOPERS and BUSINESS ANALYTICS sharing common tools and knowledge]
www.tantusdata.com
Q&A
• marcin@tantusdata.com
• marcin.szymaniuk@gmail.com
• @mszymani
www.tantusdata.com
Thank you!
