“Two pairs of eyes” usage in data archiving, retention management, and data privacy

10-01-2024 | 4 min read | SAP Data Archiving, SAP Data Management

For years, I’ve been advising large corporations in these fields: document and data archiving, data privacy, and retention management (deletion management).

For all these topics, going back is either impossible or extremely complex. The stakes are also high:

  • A mistake can slow down corporate digital sales, the order entry system, or even the full ERP.
  • It can lead to a tax penalty (usually €300k or more) or a data privacy penalty (a percentage of the group revenue).
  • Or it can cause reputational damage (when data leaks, for instance).

A common approach suggested by customers is the ‘two pairs of eyes’ concept. In this post, we’ll delve into this concept, exploring where it comes from and whether it applies to such irreversible, high-penalty business topics.

The “two pairs of eyes” concept has emerged as a crucial quality control method, from the detailed scrutiny of legal documents to the intricate process of software development. Its origins could arguably be traced back to ancient times, when scribes might cross-check each other’s transcriptions of important texts. In modern contexts, it evolved from the best practices of critical fields like aviation and medicine, where the stakes of solitary errors are intolerably high.

In contemporary usage, this approach is seen as an asset in environments where the cost of mistakes can be prohibitive. For instance, in legal practices, a second pair of eyes is common practice: another lawyer reviews contracts or legal briefs, ensuring that every assertion is backed by solid evidence and every clause is airtight. Similarly, in medical diagnoses, a second opinion can be the difference between a successful treatment and a misdiagnosis. In today’s world, a second opinion may be a doctor’s review of an AI-based image analysis and diagnosis.

In the publishing process, multiple layers of editors and proofreaders ensure that what reaches the public domain is not just free of typographical errors, but also factually accurate and clear in its narrative (and it’s effective: have you ever read a self-published book?).

The software industry has also widely adopted this principle in the form of ‘code reviews’ or ‘peer reviews’. Two developers look over the same codebase, catching bugs and suggesting improvements, a practice that often leads to both higher-quality output and a shared understanding among team members.

Beyond reducing errors, the approach promotes a culture of shared responsibility and collective intelligence, where individuals benefit from each other’s insights and expertise. It can also foster a learning environment.

However, this seemingly foolproof system is not without its pitfalls. One of the primary risks is the ‘groupthink’ phenomenon, where the desire for consensus and the influence of dominant personalities can lead to a lack of critical scrutiny. There’s also the problem of assumed infallibility; knowing that another set of eyes will review their work might lead some to a false sense of security, resulting in a less vigilant initial review.

Another risk is the delay caused by the “two pairs of eyes”: every decision waits for a second reviewer’s availability.

In addition, there’s a danger of responsibility diffusion, where each reviewer assumes the other will catch the mistakes, potentially leading to a situation where errors are overlooked because each believed it was the other’s responsibility to identify them.

The key to reaping the benefits of the “two pairs of eyes” approach without falling victim to its drawbacks is to apply it judiciously. This means knowing when the extra scrutiny is essential and when it might be counterproductive. In fast-paced, creative fields like advertising, the approach might be used selectively, reserved for final outputs rather than the generative stages of ideation.

Another important aspect is to establish clear parameters for each reviewer’s responsibilities, ensuring that each set of eyes is looking out for different potential issues, reducing the likelihood of overlap and the subsequent diffusion of responsibility.

In my business, decisions are often based on regulations rather than technical needs. Conflicting regulations in international corporations are common. In theory, irreversible, high-penalty decisions are poor candidates for a four-eyes process, because merely ‘reducing errors’ is not good enough. Defining robust rules is, attention errors aside, the only way to reduce the cost of mistakes.

Ideally, clear rules for data archiving, data privacy, and data retention should be defined first. Then, avoiding errors is best achieved with automation rather than with a four-eyes process. At TJC Group, we champion this approach. A yearly review ensures the rules are updated when regulations change.
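To make this concrete, here is a minimal sketch of what “rules first, then automation” can look like. The rule table, retention periods, and function names are purely illustrative assumptions, not an actual TJC Group or SAP implementation; real retention periods come from the applicable regulations and are maintained through the yearly review.

```python
from datetime import date

# Hypothetical retention rules: document type -> retention period in years.
# The values below are placeholders; actual periods are dictated by law.
RETENTION_RULES = {
    "invoice": 10,        # e.g. a tax-law retention requirement
    "sales_order": 6,
    "applicant_data": 1,  # e.g. a data privacy limit
}

def is_due_for_deletion(doc_type, end_of_business_date, today=None):
    """Return True once the document's retention period has elapsed."""
    today = today or date.today()
    years = RETENTION_RULES.get(doc_type)
    if years is None:
        # Document types without a defined rule are never deleted
        # automatically; they are escalated so a rule gets defined,
        # instead of being left to ad-hoc human judgment.
        return False
    expiry = end_of_business_date.replace(year=end_of_business_date.year + years)
    return today >= expiry

# An invoice closed in 2012 with a 10-year rule is due by 2024:
print(is_due_for_deletion("invoice", date(2012, 3, 31), today=date(2024, 1, 10)))   # True
print(is_due_for_deletion("unknown", date(2000, 1, 1), today=date(2024, 1, 10)))    # False
```

The point of the sketch is the design choice: once the rules are codified, the deletion decision is deterministic and auditable, and a second pair of eyes is only needed when the rules themselves change.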

Yet, organizations typically consult experienced company members for their insights. Seeking expert advice from a seasoned person within the company is often the rationale for a four-eyes process.

I have seen my fair share of situations where validation is led by an experienced person within the company who was bold enough to make critical (and difficult) decisions. In most cases, this person was close to retirement and willing to take the blame should there be major consequences.

Sometimes, the four-eyes approach may be the only way forward for a corporation at a given point in time. But once the “second pair of eyes” retires, it will be time to retire the four-eyes process as well.