Walmart has continuously redefined what can be achieved in the data streaming space
What problem was the nominee organization looking to solve with Data Streaming Technology?
Historically, like most retail companies, Walmart relied on batch platforms and data-ingest technologies to bring in data from its physical nodes (stores, distribution centers, etc.) and calculate order points and future order plans. The lag between when input data is created and when it is ingested and used often led to inefficient and inaccurate order plans. Walmart's team moved the entire platform onto a streaming architecture built on Kafka to make more real-time, accurate decisions.
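The shift described above can be pictured with a minimal, illustrative sketch (the names, event shape, and threshold here are assumptions for illustration, not Walmart's actual system): rather than recomputing order points from a day-old batch snapshot, a streaming consumer updates each SKU's inventory position as every sale or receipt event arrives, so the replenishment decision always sees current data.

```python
from collections import defaultdict

# Running inventory position per SKU, updated event by event.
inventory = defaultdict(int)
REORDER_POINT = 10  # hypothetical reorder threshold for illustration

def on_event(event):
    """Apply one store event and return True if the SKU now needs reordering."""
    delta = event["qty"] if event["type"] == "receipt" else -event["qty"]
    inventory[event["sku"]] += delta
    return inventory[event["sku"]] < REORDER_POINT

# Two events for one SKU: a receipt, then a sale that drops it
# below the reorder point. Each event triggers a decision immediately,
# with no wait for a nightly batch run.
stream = [
    {"sku": "0001", "type": "receipt", "qty": 12},
    {"sku": "0001", "type": "sale", "qty": 3},
]
signals = [on_event(e) for e in stream]
```

In a batch world, the sale would not influence the order plan until the next ingest cycle; in the streaming version, the reorder signal fires the moment the position crosses the threshold.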
How did they solve the problem?
On any given day, Walmart’s real-time replenishment system processes tens of billions of messages from close to 100 million SKUs in less than three hours. An array of processors generates an order plan for the entire network of Walmart stores with great accuracy, at a throughput of 85 GB of messages per minute. While doing so, the system also ensures there is no data loss, through event tracking and the necessary replays and retries.
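The no-data-loss guarantee described above typically rests on at-least-once processing: an offset is committed only after an event is handled successfully, so failed events are retried and, after a crash, processing replays from the last committed offset rather than silently skipping ahead. A minimal, broker-free sketch of that pattern (all names here are illustrative, not Walmart's implementation):

```python
from typing import Callable, List

def process_with_replay(events: List[dict],
                        handler: Callable[[dict], None],
                        max_retries: int = 3) -> int:
    """At-least-once sketch: the offset is committed only after the
    handler succeeds, so a failure before the commit causes the event
    to be retried (and on a crash, replayed), never dropped."""
    committed = 0  # last committed offset (exclusive)
    while committed < len(events):
        event = events[committed]
        for attempt in range(1, max_retries + 1):
            try:
                handler(event)
                committed += 1  # commit only after success
                break
            except Exception:
                if attempt == max_retries:
                    raise  # surface for replay or a dead-letter path

    return committed

# Usage: a handler that fails transiently on one call still results in
# every event landing in `seen`, because the failed attempt is retried.
seen = []
calls = {"n": 0}

def handler(event):
    calls["n"] += 1
    if event["sku"] == "B" and calls["n"] == 2:
        raise RuntimeError("transient failure")
    seen.append(event["sku"])

done = process_with_replay(
    [{"sku": "A"}, {"sku": "B"}, {"sku": "C"}], handler)
```

The key design choice is commit-after-process: duplicates are possible on retry, but loss is not, which matches the replay-and-retry behavior the section describes.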
The work of Walmart’s architects improved the quality of its order plans and, in turn, allowed the wider business and its merchants to make better decisions. It also reduced how often products go out of stock, which dramatically improved Walmart’s customer experience.