Cisco's E-Commerce Transformation Using Kafka

2. © 2017 Cisco and/or its affiliates. All rights reserved. Cisco Confidential
Agenda
1. Kafka Use Cases
2. Kafka Architecture
3. Kafka Monitoring
4. Lessons Learnt
3. Cisco Commerce By The Numbers
$50+B orders booked
138 countries, 16 languages
185K users across 63 device types and 16 browsers
6M hits/day
6.9M estimates, 5.3M quotes, 1.9M orders
85.6% of orders autobook
Orders: 71% via portal, 29% via B2B
4. Commerce – Cloud Native
[Architecture diagram: Order Capture runs on Tomcat in DC1 and DC2. It reads reference data (addresses, items, preferences, roles, contacts, logging) from the DMPRD RDBMS reference data source, writes to a transaction data store of one primary and four secondary nodes (N1–N5) spread across DC1–DC3, and publishes transaction data downstream to 73 cross-functional services.]
6. Kafka – Use Cases

Data push to downstreams
1. Avoid point-to-point integration.
2. Avoid direct connections to the transactional DB.

Elastic Search data push
1. Reduces load on the transactional DB.
2. Eliminates Elastic Search falling out of sync across multiple DCs.

Machine learning use cases using Spark
1. Recommendation engine
2. Most popular configurations
3. Most popular products for a given category
7. ML Use Case 1: Recommendation Engine
1. Customers who bought X also bought Y.
2. Identify products that are frequently bought together, so we can create bundles or promotions accordingly.
Algorithm: Apriori
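The Apriori idea behind the recommendation engine can be sketched in a few lines. This is a minimal illustration of the pair-mining step, not Cisco's implementation; the product names and `min_support` threshold are made up for the example. Apriori's key pruning insight: a pair can only be frequent if both of its items are frequent on their own.

```python
from itertools import combinations
from collections import Counter

def frequent_pairs(transactions, min_support=2):
    """Apriori-style two-pass mining of frequently co-bought item pairs."""
    # Pass 1: count single items and keep only the frequent ones.
    item_counts = Counter(item for t in transactions for item in set(t))
    frequent_items = {i for i, c in item_counts.items() if c >= min_support}

    # Pass 2: count candidate pairs built only from frequent items.
    pair_counts = Counter()
    for t in transactions:
        kept = sorted(set(t) & frequent_items)
        pair_counts.update(combinations(kept, 2))
    return {p: c for p, c in pair_counts.items() if c >= min_support}

orders = [
    ["router", "switch", "support"],
    ["router", "switch"],
    ["router", "support"],
    ["firewall"],
]
print(frequent_pairs(orders))
# {('router', 'support'): 2, ('router', 'switch'): 2}
```

Pairs that clear the support threshold are the candidates for "also bought" suggestions or bundle promotions.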
8. ML Use Case 2: Popular Product Configuration
1. Provide visibility into the most popular configurations for a given product.
2. Provide visibility into the configuration the customer most recently bought for the given product.
Allow selection of pre-configured products instead of starting from scratch.
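The "most popular configurations" part of this use case reduces to counting configurations per product. A minimal sketch, assuming a configuration can be represented as a set of chosen options (class and product names here are illustrative, not from the deck):

```python
from collections import Counter, defaultdict

class PopularConfigs:
    """Count how often each configuration of a product is ordered and
    surface the most popular ones."""
    def __init__(self):
        self._counts = defaultdict(Counter)  # product -> Counter of configs

    def record(self, product, config):
        # Store configurations as sorted tuples so option order doesn't matter.
        self._counts[product][tuple(sorted(config))] += 1

    def top(self, product, n=3):
        return self._counts[product].most_common(n)

pc = PopularConfigs()
pc.record("switch-x", ["24-port", "PoE"])
pc.record("switch-x", ["PoE", "24-port"])  # same config, different order
pc.record("switch-x", ["48-port"])
print(pc.top("switch-x", 1))
# [(('24-port', 'PoE'), 2)]
```

In the streaming setup described in the deck, `record` would be driven by a Spark job consuming the order stream from Kafka.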
10. Kafka Architecture
[Diagram: producers (Capture Order, Return Order) in DC1 and DC2 write to a Kafka cluster of four brokers spread across DC1, DC2 and DC3; consumers include Smart-SW SC and EDW. A five-node ZooKeeper ensemble (ZK-1 to ZK-5) coordinates cluster membership and stores committed offsets (v0.10.x.x). DC1 – RCDN; DC2 – ALLEN; DC3 – RTP.]
11. Kafka Architecture – Elastic Search
[Diagram: custom producer code in DC1 and DC2 reads from the RDBMS and pushes to a fault-tolerant Kafka cluster spanning DC1 and DC2; a consumer group in each DC feeds its local Elastic Search cluster.]
14. Monitoring: Kafka Manager and Kafdrop
[Screenshots of the Kafka Manager and Kafdrop UIs.]
15. Kafka – Custom Scripts
1. A cron job checks the Kafka processes every minute and restarts any process that is not running (and sends an email notification).
2. Always back up the logs systematically when Kafka processes are restarted.
3. Maintain a test topic and push a test message to it every minute; trigger a notification in case of failures.
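The watchdog in point 1 can be sketched as follows. The check, restart, and notify hooks are injected as callables so the core logic stays testable; all function names are illustrative, and in production `is_running` would shell out to `ps` or `systemctl` and `notify` would send an email:

```python
def watchdog_tick(is_running, restart, notify):
    """One cron tick of the health check: restart the Kafka process and
    send a notification if it is not running. Returns True if action
    was taken."""
    if is_running():
        return False
    notify("Kafka process down; restarting")
    restart()
    return True

# Simulate a tick where the process is found dead.
events = []
acted = watchdog_tick(
    is_running=lambda: False,
    restart=lambda: events.append("restarted"),
    notify=events.append,
)
print(acted, events)
# True ['Kafka process down; restarting', 'restarted']
```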
17. Best Practices
1. Have a mechanism to reset Kafka offsets on demand.
2. Have a mechanism to re-push data to a Kafka topic.
3. Enable SSL for secure access.
4. Have an auto re-push mechanism in case the producer gets an error while pushing data into Kafka.
18. UI – Reset Offsets
19. UI – Re-push Data
20. Kafka Producer & Consumer Setup with SSL
The properties below are required to enable SSL for both the producer and the consumer.
1. If client authentication is not required by the broker, the truststore configuration suffices (kafka.client.truststore.jks is provided by the Kafka service host).
2. If client authentication is required by the broker, the keystore configuration is also needed (kafka.client.keystore.jks is provided by the Kafka service host).
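The original slide showed these properties as screenshots. The fragment below reconstructs the standard Kafka SSL client properties for both cases; the file paths and passwords are placeholders:

```properties
# Case 1: SSL without client authentication — truststore only,
# used to verify the broker's certificate.
security.protocol=SSL
ssl.truststore.location=/var/private/ssl/kafka.client.truststore.jks
ssl.truststore.password=<truststore-password>

# Case 2: add these only if the broker requires client authentication
# (broker side: ssl.client.auth=required).
ssl.keystore.location=/var/private/ssl/kafka.client.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
```

The same properties apply to producers and consumers alike, since both are Kafka clients.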
21. Auto Re-Push Mechanism
[Diagram: the source pushes data to Kafka; in case of failures, the failed records are persisted, and an offline scheduler re-pushes them.]
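The flow above can be sketched as follows, with an in-memory failure store and a stand-in send function in place of a real Kafka producer and durable store (all names are illustrative):

```python
failed_records = []  # in production: a durable store, e.g. a DB table

def push(record, send):
    """Try to push a record to Kafka; park it in the failure store on error."""
    try:
        send(record)
    except Exception:
        failed_records.append(record)

def repush_failed(send):
    """Offline scheduler pass: retry everything in the failure store."""
    pending = failed_records[:]
    failed_records.clear()
    for record in pending:
        push(record, send)

# Simulate a transient outage: the first send fails, the retry succeeds.
delivered = []
flaky = iter([True, False])  # fail once, then succeed

def send(record):
    if next(flaky):
        raise ConnectionError("broker unreachable")
    delivered.append(record)

push({"order_id": 1}, send)  # fails, record is parked
repush_failed(send)          # scheduler retries, record is delivered
print(delivered, failed_records)
# [{'order_id': 1}] []
```

Because `repush_failed` routes retries back through `push`, records that fail again simply land back in the failure store for the next scheduler pass.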
22. Lessons Learnt
1. Use a while loop when subscribing to a Kafka topic instead of creating a new consumer every time.
2. Always set a key if you want all messages for a particular key (e.g. order ID) to go to the same partition.
3. enable.auto.commit – the default is true. It is better to set it to false to get control over when the offset is committed.
4. Data size – the consumer's max.partition.fetch.bytes should be greater than or equal to the producer's max.request.size. The default is 1MB.
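Lesson 2 works because Kafka's default partitioner hashes the message key and takes it modulo the partition count, so equal keys always map to the same partition. A simplified stand-in for that idea (Kafka actually uses murmur2; md5 is used here only to keep the demo deterministic and dependency-free):

```python
import hashlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Hash the key and map it to a partition, mimicking the shape of
    Kafka's default partitioner (which really uses murmur2)."""
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# All messages keyed by the same order ID land on the same partition,
# which preserves per-order ordering for consumers.
p1 = partition_for(b"order-12345", 6)
p2 = partition_for(b"order-12345", 6)
print(p1 == p2)  # True
```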
23. Lessons Learnt
5. Have a custom script deployed to monitor and restart Kafka nodes in case of any issues.
6. heartbeat.interval.ms must be smaller than session.timeout.ms.
session.timeout.ms controls how long the broker waits without receiving heartbeats before it declares the consumer dead.
heartbeat.interval.ms is the expected time between heartbeats sent by the consumer.
7. auto.offset.reset – the default is latest.
8. Reset offset: make sure there is no active consumer on the topic for that consumer group.
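The consumer settings from lessons 3, 6, and 7 can be collected into one config, with a check for the constraint in lesson 6. The numeric values are illustrative; the rule of thumb that the heartbeat interval should be at most one third of the session timeout comes from the Kafka documentation:

```python
# Consumer settings from the lessons above (values are illustrative).
consumer_config = {
    "session.timeout.ms": 30000,     # broker declares the consumer dead after this
    "heartbeat.interval.ms": 10000,  # how often the consumer sends heartbeats
    "enable.auto.commit": False,     # lesson 3: commit offsets manually
    "auto.offset.reset": "latest",   # lesson 7: this is the default
}

def validate(cfg):
    """Enforce lesson 6: heartbeat interval must be well below session timeout."""
    hb, sess = cfg["heartbeat.interval.ms"], cfg["session.timeout.ms"]
    assert hb < sess, "heartbeat.interval.ms must be smaller than session.timeout.ms"
    assert hb <= sess // 3, "recommended: heartbeat at most 1/3 of session timeout"

validate(consumer_config)
print("config OK")
```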
25. Kafka Architecture – ML Use Case
[Diagram: the quote stream and the order stream feed the ML pipeline.]