A Deep Dive into Supermarket Sales Data for Q1 2019

Overview

This project is aimed at generating insights from the sales data of a supermarket for the first quarter of 2019 (January to March) using Microsoft Excel.

Business Requirements

Recommended Analysis

  1. What is the total income of the store for the provided records? (KPI)

  2. How many orders were received in the provided records? (KPI)

  3. What is the overall rating of the store? (KPI)

  4. How many orders do we have each day? Are there any peak hours?

  5. At what times of the day do we get the most orders, and how much did we make during those periods?

    • Morning is 9am – 12 noon

    • Afternoon is 12 noon – 4pm

    • Evening is 4pm – 9pm

  6. Break down the gross income by month, customer type, gender, product type, city and branch of store

  7. Interactive review of the sales record using gender, customer type, city and branch of store.

Data Gathering / Source

The dataset used for this analysis was collected from three supermarket branches in Yangon, Mandalay, and Naypyitaw in Myanmar. You can access the dataset here. It contains attributes describing Products, Customers, Sales, and the City. I made a copy of the data, opened it in Excel 2019, and previewed the different sheets to understand the dataset.

Data Preparation

I cleaned the dataset by applying filters and manually scanning for empty cells, and I checked the data types and formatting of the columns to make sure they were consistent and complete.

Data Manipulation

I used formulas and functions to create custom attributes for the analysis. First, I created a Day of the Week attribute using the TEXT function to extract the weekday name from each date in the dataset. This helps in understanding sales patterns throughout the week.

=TEXT(F2,"ddd")
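Outside Excel, the same extraction can be sketched in Python (an illustrative equivalent, not part of the original workbook; month/day/year date strings are assumed):

```python
from datetime import datetime

def day_of_week(date_str: str) -> str:
    """Return the abbreviated weekday name, mirroring =TEXT(F2,"ddd")."""
    return datetime.strptime(date_str, "%m/%d/%Y").strftime("%a")

print(day_of_week("1/5/2019"))  # Sat
```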

Additionally, I categorized the time of day into morning, afternoon, and evening using the TIME function inside nested IF conditions; since all recorded orders fall within opening hours, only the upper bound of each window needs to be checked. This allows us to analyze when the store experiences peak order times.

=IF(H2<TIME(12,0,0),"Morning",IF(H2<TIME(16,0,0),"Afternoon","Evening"))
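The same bucketing logic reads naturally in Python; this sketch (illustrative, not part of the workbook) mirrors the nested IF:

```python
from datetime import time

def time_of_day(t: time) -> str:
    """Before 12:00 -> Morning, before 16:00 -> Afternoon, otherwise Evening."""
    if t < time(12, 0):
        return "Morning"
    if t < time(16, 0):
        return "Afternoon"
    return "Evening"

print(time_of_day(time(10, 30)))  # Morning
print(time_of_day(time(15, 59)))  # Afternoon
print(time_of_day(time(20, 0)))   # Evening
```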

I also merged the city and branch fields using the CONCAT function, giving each store location a single, readable label.

=CONCAT(C2, "_Branch ",B2)
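For illustration, the same label can be built in Python (the city and branch values shown are examples; the actual pairs come from columns C and B):

```python
def store_label(city: str, branch: str) -> str:
    """Mirror =CONCAT(C2, "_Branch ", B2) for one row."""
    return f"{city}_Branch {branch}"

print(store_label("Yangon", "A"))  # Yangon_Branch A
```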

Data Processing

To support a comprehensive analysis, I converted the cleaned dataset into an Excel table and named it appropriately from the Design tab; the newly created attributes were colored differently to distinguish them from the source columns. Converting to a table enables automatic updates and makes the data easier to work with.

I also established relationships between the different tables in the workbook by linking them through primary and foreign keys, which created a data model that supports more complex analyses. Using the Relationships command on the Data tab, I linked the tables through the Invoice ID column, which acts as the primary key in one table and the foreign key in the others.

Data Analysis

I used pivot tables to summarize the data with aggregations such as sum, average, and count, and to provide context for the analysis. Pivot tables are a powerful Excel tool that allows for dynamic summarization and exploration of data.
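The sum and average aggregations a pivot table performs can be mirrored in plain Python; the sketch below uses made-up rows and assumed field names purely to show the grouping logic:

```python
from collections import defaultdict

# Toy rows standing in for the cleaned sales sheet (values are invented).
sales = [
    {"city": "Yangon", "gross_income": 26.14, "rating": 9.1},
    {"city": "Yangon", "gross_income": 3.82, "rating": 7.0},
    {"city": "Mandalay", "gross_income": 16.22, "rating": 8.4},
]

# Group-and-aggregate: total income and mean rating per city,
# the same roll-up a pivot table produces with Sum and Average.
income_by_city = defaultdict(float)
ratings_by_city = defaultdict(list)
for row in sales:
    income_by_city[row["city"]] += row["gross_income"]
    ratings_by_city[row["city"]].append(row["rating"])

for city, total in income_by_city.items():
    avg_rating = sum(ratings_by_city[city]) / len(ratings_by_city[city])
    print(city, round(total, 2), round(avg_rating, 2))
```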

Visualizing Insights

To gain deeper insights from the data, I employed various chart types to visualize the key performance indicators (KPIs) and sales patterns. Charts such as bar graphs, line charts, and pie charts were used to represent different aspects of the analysis.

I also made sure that every title and text label used in the charts was a cell reference rather than static text, so that the charts update automatically whenever the pivot analysis changes.

For instance, the values in columns B and C of the pivot sheet were referenced by the objects created in the dashboard:

=pivotTable!B2

=pivotTable!B7

=pivotTable!C4

=pivotTable!B12

=pivotTable!B16

Key Findings

From the dashboard analysis, we can quickly discern important information:

Total Sales and Income: The store's total sales and gross income for the first quarter of 2019 were calculated, providing a snapshot of its financial performance.

Number of Orders: I determined the total number of orders, shedding light on customer demand.

Overall Store Rating: The overall store rating was analyzed, which is crucial for understanding customer satisfaction.

Order Patterns: I identified the busiest days and peak order hours. This information can help in staffing and inventory management.

Product Performance: I broke down gross income by various categories, such as product type, city, and branch. This allows for a detailed analysis of what products contribute most to revenue.

Dynamic Dashboard

The analysis doesn't stop here. You can explore the data interactively by filtering it based on gender, customer type, city, and branch of the store, which allows for deeper insights into specific segments of the business. This is achieved with slicers: insert a slicer, then use the Report Connections option to connect it to the required pivot tables so that a single filter drives them all.
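Conceptually, a slicer applies one filter across every connected pivot table. The hypothetical Python sketch below (field names and values are assumed, not taken from the dataset) shows the equivalent filtering logic:

```python
# Toy records standing in for the sales sheet (values are invented).
sales = [
    {"gender": "Female", "customer_type": "Member", "total": 548.97},
    {"gender": "Male", "customer_type": "Normal", "total": 80.22},
    {"gender": "Female", "customer_type": "Normal", "total": 340.53},
]

def slice_sales(rows, **filters):
    """Keep only rows matching every filter, e.g. gender='Female'."""
    return [r for r in rows if all(r[k] == v for k, v in filters.items())]

female_members = slice_sales(sales, gender="Female", customer_type="Member")
print(round(sum(r["total"] for r in female_members), 2))  # 548.97
```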

Recommendations

Based on the analysis, here are some recommendations:

Staffing: Consider adding extra staff during peak hours to ensure a smooth shopping experience for customers.

Customer Service: Enhance customer service and gather feedback to identify areas for improvement. Improving the overall customer experience can lead to higher ratings and increased loyalty.

Conclusion

This data analysis project provides valuable insights into the performance of a supermarket during the first quarter of 2019. By following the steps outlined, you can reproduce the analysis and gain a deeper understanding of the data. Feel free to share your thoughts and insights in the comments section; your input is greatly appreciated!