
Nexus of Data and Software Solutions

Integration of tools that address specific, clear objectives

Inception

In the previous article, State of Your Data, we talked about the needs of a nascent business. In particular, a startup will choose its software systems and data storage based on its immediate needs and its perception of its needs in the near future. The focus is naturally on the budget and the ease with which it can perform the main lines of work or production at a point in time. Data is subsequently created out of the need to capture information.

 

Evolution

What about existing businesses? What prompts them to make changes to their data architectures? Businesses often make decisions that affect their existing systems when they have to resolve changing regulatory requirements or recurring problems, such as shortcomings in the quality of their outputs. These types of reactive approaches can be more expensive than planned changes. The cost of making needed changes may not be provided for in existing budgets. Time constraints also put pressure on decision-makers and may inhibit their ability to find the right cost-effective solutions.

 

An alternative approach

In development operations (DevOps), this risk is hedged to some degree through continuous integration/continuous delivery (CI/CD). DevOps is an approach used in software development that is defined as "a set of technical and management practices that aim to increase an organization's velocity in releasing high-speed software" (Gift & Deza, 2021, p. 5). One of the key aspects of this approach is that the actions performed on data are mapped out as part of an overall end-to-end, cyclical process. Each stage is broken down into a set of logical subprocesses, with a strong focus on oversight, automation of repeated actions, and quality. Changes are anticipated, and provisions are made to integrate them at each stage. This approach benefits the development of any data architecture and is considered a best practice when implementing analytics and machine learning solutions, particularly when the inputs of a machine learning model change constantly and the outputs are in continual use. Does that mean this approach is appropriate for a small business? Not exactly. But some of the key concepts can be adapted as businesses look to take a more proactive role in the management of their data.
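To make the idea of automating repeated checks concrete, here is a minimal sketch, not a full CI/CD pipeline. It assumes a hypothetical monthly CSV extract with month and units_produced columns; a scheduler or CI job could run it on every delivery and flag a bad file through the exit code.

```python
# Minimal sketch of an automated, repeatable data check.
# The file name, columns, and rules are hypothetical placeholders.
import sys
import pandas as pd


def check_monthly_extract(path: str) -> list[str]:
    """Return a list of problems found in a monthly data extract."""
    df = pd.read_csv(path)
    problems = []

    # Required columns must be present.
    for col in ("month", "units_produced"):
        if col not in df.columns:
            problems.append(f"missing column: {col}")

    # Production figures should be numeric and non-negative.
    if "units_produced" in df.columns:
        units = pd.to_numeric(df["units_produced"], errors="coerce")
        if units.isna().any():
            problems.append("non-numeric values in units_produced")
        elif (units < 0).any():
            problems.append("negative values in units_produced")

    return problems


if __name__ == "__main__":
    issues = check_monthly_extract(sys.argv[1])
    if issues:
        print("\n".join(issues))
        sys.exit(1)  # non-zero exit lets a scheduler or CI job flag the run
    print("extract passed basic checks")
```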

 

Let's consider a scenario where an established business is looking to make changes as a result of strategic decisions. For example, it wants to use its existing data to quantify the monthly changes in production as a factor of employee turnover. This information is expected to give it insight into the value of retaining employees and to gauge the real costs of turnover (a minimal analysis sketch follows the list below). In this situation, it becomes essential for the business to define three factors: budget, timeline, and expected outcomes. To do so, it will need to understand the state of its data. This prerequisite will enable it to draw a baseline, scope out what needs to be built, and identify the solutions that are right for it. Other important considerations are:

  1. Are the technical experts involved in defining the scope and needs of the project? 

  2. Does this effort need to be broken down into phases?

  3. Who will be held accountable for signing off on the deliverables? 

  4. Are there contingencies in place for the testing of those deliverables prior to sign-off?

  5. What is the timeline? 
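
As a first pass at the turnover scenario above, here is a minimal sketch assuming two hypothetical monthly extracts: production.csv (month, units_produced) and hr.csv (month, separations, headcount). It only computes a correlation between month-over-month production change and a simple turnover rate, which is a baseline view, not a causal estimate.

```python
# Minimal sketch: relate monthly production change to a simple turnover rate.
# File names and columns are hypothetical placeholders.
import pandas as pd

production = pd.read_csv("production.csv", parse_dates=["month"])
hr = pd.read_csv("hr.csv", parse_dates=["month"])

monthly = production.merge(hr, on="month").sort_values("month")

# Month-over-month change in output and a basic turnover rate.
monthly["production_change_pct"] = monthly["units_produced"].pct_change() * 100
monthly["turnover_rate_pct"] = monthly["separations"] / monthly["headcount"] * 100

# A rough first look at the relationship between the two series.
print(monthly[["production_change_pct", "turnover_rate_pct"]].corr())
```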

 

At other times, simpler solutions may be appropriate to address recurring issues or to make updates that comply with regulatory changes. For example, adding quality control checks to a spreadsheet may help to validate the integrity of the data that is entered. One approach may be to require a review of the minimum and maximum values for a particular month, or to limit a field to one specific data type (see the sketch below). When the fixes are not as evident, a process audit can help to identify the steps in the process that are leading to the issues being observed.
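One possible illustration is a minimal sketch using openpyxl, assuming a hypothetical workbook tracking.xlsx with a "Production" sheet and a units column in column B; the cell range and limits are placeholders that a business would replace with its own rules.

```python
# Minimal sketch: add built-in spreadsheet validation so a column only
# accepts whole numbers within an agreed range.
from openpyxl import load_workbook
from openpyxl.worksheet.datavalidation import DataValidation

wb = load_workbook("tracking.xlsx")
ws = wb["Production"]

# Only whole numbers between 0 and 10,000 are accepted in column B;
# anything else triggers an error prompt when the value is entered.
dv = DataValidation(
    type="whole",
    operator="between",
    formula1=0,
    formula2=10000,
    allow_blank=False,
)
dv.error = "Units produced must be a whole number between 0 and 10,000."
ws.add_data_validation(dv)
dv.add("B2:B1000")

wb.save("tracking.xlsx")
```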

 

The common thread among these different landscapes is that each piece of information captured as data has relevance and a specific use for the business. As such, it is a good idea to ensure it is captured accurately (data quality), particularly if the business intends to use it for inference (analytics) at any point in time.

 

From Reactive Fixes to Proactive Data Strategies

In contrast to new businesses, which are likely to design their data architectures around their immediate needs, established businesses make changes when they face issues like new regulations, recurring data quality problems, or errors in outputs. Unfortunately, these reactive decisions can be costly and constrained by time and budget pressures.

 

To address this, businesses can borrow concepts from DevOps, which uses structured, end-to-end processes with an emphasis on automation, oversight, and adaptability. While full-scale DevOps may not suit small businesses, its core principles, especially anticipating and managing change, can improve how data and analytics initiatives are approached.

We advocate for a proactive, strategic approach to data management to support smarter, more efficient decision-making. We are here to help you identify and implement solutions. 

Sources:

Gift, N., & Deza, A. (2021). Practical MLOps: Operationalizing Machine Learning Models. O'Reilly Media.
