EMPOWERING A DATA-DRIVEN CULTURE FOR ONE OF THE LARGEST US FOOD MANUFACTURERS

Learn how we helped our client build a strong and scalable foundation for data lifecycle management (DLM), enabling data-driven decisions and paving the way for advanced AI/ML technologies to future-proof their business initiatives and optimize business expenses.

  • Industry

    Manufacturing & retail

  • Country

    USA

  • Team size

    5 IT experts

  • Period of collaboration

    August 2023 – present

About the client

Our client is a leading American food manufacturer and retailer that has been in business for over 30 years. Operating in both the food service and grocery sectors, the company manages a complex, high-volume supply chain feeding into multiple locations across the country.

Business context

Viewing data as a strategic asset, the client was looking to transition to a more comprehensive data ecosystem. Their key goals were to gain a unified view of their data and increase process visibility for faster, better-informed business insights, enabling data-driven decision-making.

However, to fully adopt a data-driven culture, the company had to solve several challenges their departments faced:

  • Limited visibility into data potential and possibilities for data use due to the lack of a full-fledged data processing system
  • Slow delivery of business insights on service quality and improvement opportunities due to outdated data processing and storage tools
  • Siloed data across departments with no centralized data warehouse
  • Data maintenance difficulties due to sluggish report generation

The client was already using Microsoft Power BI but wanted a more robust approach to adopting business intelligence capabilities. Lacking the resources to set up a structured data handling approach (from data collection and processing to deletion) within the required timeline, they requested support from Yalantis experts.

Solution overview

  • To expedite project progress, the Yalantis team has become fully integrated into the client’s software development lifecycle (SDLC), closely collaborating with their data experts and functioning as part of their team.

    Our goal was to help the client modernize their data processing technology by creating versatile functionality for storing and analyzing data. This involved handling growing data volumes through data lifecycle management (DLM) — the process of managing, organizing, and securing data from creation or collection to archival and deletion.


    Together with the client, we began by focusing on sales and finance data to improve sales and resource management, with the intention of extending the efforts to other departments. This involved three key stages:

  • Migrating data from manual files to a database to create a consistent data storage standard

    A key focus during this stage was on providing the sales and finance departments with access to accurate data on business activities, trends, and changes in a comprehensive form for quick retrieval. This involved:

    • analyzing source data by defining all available data sources and determining systems of record to avoid loading unnecessary datasets
    • standardizing the data structure and populating the database with data from raw sources in the target form
    • systematically and efficiently transferring logic from files to the database
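The migration steps above can be sketched as follows. This is a minimal illustration only, not the client's actual pipeline: the table, column names, and column mapping are hypothetical, and SQLite stands in for the production database.

```python
import csv
import io
import sqlite3

# Hypothetical raw export: column names and formats vary between source files.
RAW_CSV = """Region,SalesAmt,order_date
East,1200.50,2023-08-01
West,980.00,2023-08-02
"""

# Target standard: lowercase snake_case columns, numeric amounts, ISO dates.
COLUMN_MAP = {"Region": "region", "SalesAmt": "sales_amount", "order_date": "order_date"}

def standardize(row: dict) -> dict:
    """Rename columns to the target standard and coerce types."""
    out = {COLUMN_MAP[key]: value for key, value in row.items()}
    out["sales_amount"] = float(out["sales_amount"])
    return out

# Populate the database with data from the raw source in the target form.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, sales_amount REAL, order_date TEXT)")

with io.StringIO(RAW_CSV) as f:
    for row in csv.DictReader(f):
        conn.execute(
            "INSERT INTO sales VALUES (:region, :sales_amount, :order_date)",
            standardize(row),
        )
conn.commit()

total = conn.execute("SELECT SUM(sales_amount) FROM sales").fetchone()[0]
print(total)  # 2180.5
```

In practice, each source file would get its own column map and type rules, so logic once scattered across manual files lives in one repeatable loading step.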
  • Setting up a data warehouse for a unified data management solution

    This phase involved consolidating data from diverse sources, ensuring data quality and integrity, and optimizing storage for analytics and reporting. It encompassed the following steps:

    • Modeling a data schema. Together with the client’s team, we organized tables, schemas, logic, and data processing on the Azure SQL Server instance. This included clustering information and staging different types of processing for improved control and scalability.
    • Developing an extract, transform, load (ETL) process from the ground up using Azure Data Factory to build data pipelines. The ETL architecture, with its separate data clusters and schemas, ensured the system could process data seamlessly regardless of the raw data volume.
    • Introducing Python for automatic data loading, validation, and testing. Python scripts were scheduled to run automatically, ensuring consistent data processing, reducing manual errors, and efficiently handling large datasets and complex tasks.
    • Defining security and access controls to protect sensitive data and control user access, such as by encrypting sensitive data and establishing role-based access control (RBAC).
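The validation layer mentioned above can be illustrated with a small sketch. The rules and field names here are assumptions for the example, not the client's actual checks; the pattern is simply to reject or quarantine bad rows before they reach the warehouse.

```python
from datetime import date

def validate_row(row: dict) -> list:
    """Return a list of problems found in a single extracted row."""
    errors = []
    if not row.get("region"):
        errors.append("missing region")
    amount = row.get("sales_amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("invalid sales_amount")
    try:
        date.fromisoformat(row.get("order_date", ""))
    except ValueError:
        errors.append("order_date not ISO formatted")
    return errors

def run_batch(rows: list) -> tuple:
    """Split a batch into loadable rows and rejected rows with reasons."""
    good, bad = [], []
    for row in rows:
        problems = validate_row(row)
        if problems:
            bad.append((row, problems))
        else:
            good.append(row)
    return good, bad

batch = [
    {"region": "East", "sales_amount": 1200.5, "order_date": "2023-08-01"},
    {"region": "", "sales_amount": -5, "order_date": "08/02/2023"},
]
good, bad = run_batch(batch)
print(len(good), len(bad))  # 1 1
```

A scheduler (cron, Azure Data Factory triggers, or similar) would run such a script on each incoming batch, so validation is consistent and manual errors never propagate downstream.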


  • Optimizing reporting for a unified view of accurate data

    In this phase, data engineers and analysts enhanced reporting, which involved migrating all of the client’s reports and creating new ones based on datasets. Activities included:

    • shifting calculations to the database. We moved calculations from Power BI to the database with SQL, so Power BI now functions solely as a display tool and performs no calculations on received data. This accelerates report processing and improves the user experience: analysts and users get swift responses to filters and data updates, all executed automatically through SQL code in the background.
    • scheduling data refreshes. Data engineers defined and automated regular data refreshes to maintain the warehouse’s current state.
    • archiving data. The team set up tables to archive data, providing coverage for test cases and facilitating comparisons with uploaded raw data. This ensured database integrity and assisted in identifying potential data loss.
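The calculation shift described above can be sketched in miniature. Here SQLite stands in for the SQL Server warehouse, and the table, view, and measures are hypothetical: the aggregation is defined once in the database, and the BI tool only reads the finished result.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, sales_amount REAL, order_date TEXT);
INSERT INTO sales VALUES
  ('East', 1200.5, '2023-08-01'),
  ('East',  300.0, '2023-08-15'),
  ('West',  980.0, '2023-08-02');

-- The aggregation lives in the database; the BI tool only reads this view.
CREATE VIEW v_sales_by_region AS
SELECT region,
       SUM(sales_amount) AS total_sales,
       COUNT(*)          AS order_count
FROM sales
GROUP BY region;
""")

# A report tool such as Power BI would simply select from the view,
# displaying precomputed figures instead of recalculating them per filter.
rows = conn.execute(
    "SELECT region, total_sales, order_count FROM v_sales_by_region ORDER BY region"
).fetchall()
print(rows)  # [('East', 1500.5, 2), ('West', 980.0, 1)]
```

Keeping measures in SQL views also means a single definition of each metric: every report that selects from the view shows the same numbers.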

Value delivered

Together with the client, Yalantis continues to diligently refine the solution. In our collaboration, we have laid the groundwork for a solid data core, supporting an efficient data-driven culture within the company. We initiated a real-time data ecosystem, introducing a comprehensive format for how data should be entered and processed. This empowers the client’s sales and finance departments by:

  • enabling justified decision-making by providing quick access to accurate business operations data through properly processed Power BI reports that display valid trends and changes

  • facilitating resource management and cost reduction with accurate historical analysis

  • ensuring full operational transparency via optimized, auto-refreshed reports and dashboards, with the option of manual data processing when needed

  • providing simplified process migration and integration since a unified data system allows seamless scaling and quick deployment of new functionalities

  • paving the way for seamlessly implementing more complex technologies, such as AI/ML, which can’t be built without a solid data engineering basis

What’s next?

  • In the initial six months, we worked with our client to successfully provide access to valid data in a comprehensive form for the sales and finance departments. Moving forward into 2024, our client has strategic plans to:

    • extend data-driven decision-making capabilities to operations and stock management
    • establish a comprehensive enterprise data warehouse
    • create a robust data core that ensures swift and secure access to accurate company data based on an employee’s role and level, facilitating agile product control and proactive analysis of any internal or external impact on the product

GET A UNIFIED DATA SOURCE FOR INFORMED DECISION-MAKING

Maximize your business data’s potential by establishing a DLM process with Yalantis expertise.

Contact our team