You might have first encountered the concept of auditing during a grade four math test, in the form of instructions to show your work. As adults, we experience audits daily: whether it is a punch clock or a logbook, our daily workflows are designed to create a trackable log as we interact with data systems in the workplace.
Today we’ll be looking through the auditor’s lens at how work is logged to ensure data integrity in any organization and specifically in a forensic DNA laboratory.
Log data is a valuable asset for every organization. As the body of evidence created by workplace operations, it is used in accounting, planning, analysis, quality control, safety, reporting, and accreditation. Mismanaged log data causes serious problems: incomplete records will contribute to a poor audit score or loss of accreditation; maintenance and performance issues may go unsolved if routine checks are recorded incorrectly; and without evidence that protocols were followed, workers and/or employers may be culpable in the event of a legal dispute.
Ultimately, the implications of mismanaged log data include lost time and money as well as increased risk, especially when an inefficient logging system is involved. Inefficiencies can result in missing or incomplete logs, regulatory fines, and potential reputational damage. But what separates good data from bad data? What defines a good logging system?
Part 1: Data is life
Data management has woven its way into all our lives. Organizations increasingly rely on computer-based enterprise systems over physical recordkeeping to log and manage data. We rely on data in routine assessments, when making big decisions, and when demonstrating compliance with safety or accreditation standards. Not only does data represent the focus of our work—whether we are scanning stock in a store or processing forensic samples in a lab—data represents the proof of our work.
When analyzing a client’s records, auditors assess both the log data and the system that captured that log data. It’s the same as in grade four: to pass a review, you must be able to validate your work. If an activity log doesn’t include identifiers such as a timestamp or the ID of the user who logged the entry, there is no way to prove that the activity was performed as described, or even at all.
So how can we ensure that our logging methods are error-resistant and make the most out of log data when we need to show our work? The concept of data integrity outlines the solution to both of these problems.
Part 2: Principles of data integrity
Data integrity refers to a set of conditions that ensure log data is captured:
- Contemporaneously, when the activity being logged occurs
- Thoroughly and without errors (such as missing or illegible information)
- Digitally, in an accessible, machine-readable format
- Using a system that prevents falsification and creates an additional record when log data is edited
- With frequent backups of information
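The conditions above can be illustrated with a small sketch of an append-only activity log. Everything here—the class names, fields, and the `supersedes` pointer—is hypothetical, a minimal illustration of the principle that edits should add records rather than overwrite them, not a description of any real logging product:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)
class LogEntry:
    """One immutable log record: who did what, and when."""
    user_id: str
    activity: str
    timestamp: str
    supersedes: Optional[int] = None  # index of the entry this edit corrects

class ActivityLog:
    """Append-only log: edits never overwrite an entry; they add a new
    record that points back at the original, preserving the audit trail."""

    def __init__(self):
        self._entries: list[LogEntry] = []

    def record(self, user_id: str, activity: str) -> int:
        # Timestamp is captured by the system, not typed in by the user.
        entry = LogEntry(user_id, activity,
                         datetime.now(timezone.utc).isoformat())
        self._entries.append(entry)
        return len(self._entries) - 1

    def amend(self, index: int, user_id: str, activity: str) -> int:
        # The original entry stays in place; the correction is a new record.
        entry = LogEntry(user_id, activity,
                         datetime.now(timezone.utc).isoformat(),
                         supersedes=index)
        self._entries.append(entry)
        return len(self._entries) - 1

    def history(self) -> list[LogEntry]:
        return list(self._entries)
```

Because every `record` and `amend` call stamps the entry with a user ID and a system-generated timestamp, and amendments reference rather than replace the original, a reviewer can reconstruct exactly what was logged, by whom, and in what order.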
Good data integrity is more than just auditability. An efficient logging system runs in parallel with the user workflow, allowing workers to create log data with minimal interruption as they perform their primary tasks. The best logging systems are automated and involve as little manual input as possible.
In a lab, the challenge of data integrity is multifaceted. Analysts need to perform thorough data capture for every processing step while employing multiple instruments, and require the ability to retrieve log data for internal reporting, troubleshooting, and information requests.
Part 3: The dilemma of digital
Most workplaces rely on both physical and digital logging based on what works best with the environment. For the forensic lab, replacing physical records with digital ones may seem like a natural first step towards ensuring a lab’s data integrity. By relying on a database for log data instead of printouts and logbooks, labs can make log data more accessible, but digital solutions can create new problems.
Digital data entry reduces illegible handwriting, but it doesn’t prevent transcription errors or issues that can arise from less-than-timely capture. A simple solution such as a set of Excel spreadsheets on a local network requires little specialized knowledge, but invites human error; attempting to juggle spreadsheets between multiple users can result in data loss and unauthorized changes.
Furthermore, in order to effectively maintain a lab’s digital logs, the resident IT specialist would need dedicated time to maintain the solution(s) in use. If an analyst is performing the role of IT specialist, they won’t be able to spend as much time on their primary work, and any absence on their part will leave the rest of the lab helpless if a technical problem arises.
A lab information management system (LIMS) can facilitate data capture and retrieval, but it may not be up to the task of accommodating the unique metrics and technology used in the lab, forcing analysts to rely on multiple systems when logging their work. Additionally, any digital logging system exists at the mercy of the technology itself. If log files are created or managed using legacy technology (e.g. platform-specific file formats, serial cables, diskettes) they may become inaccessible if that technology breaks or if the method for reading them is lost. Without proper backup and maintenance from IT, one system failure can potentially result in the loss of an entire database.
Part 4: Solutions
Despite the issues that accompany digital logging in the lab, it still represents the best first step on the path towards data integrity. Good maintenance is the cornerstone of every lab, and dedicated IT staff are a valuable investment towards reducing technological issues. Good IT also equips the lab to manage software tools such as change tracking and user access control in order to streamline data capture and retrieval without sacrificing security. By implementing an accessible logging system with minimal additional steps during processing and other tasks, the lab can ensure that analysts are able to focus on their primary role instead of spending precious time on repetitive data entry tasks.
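In practice, user access control can be as simple as checking a role-to-permission mapping before any log action is accepted. The roles and permissions below are hypothetical examples, a minimal sketch of the idea rather than a description of any particular system:

```python
# Hypothetical role-to-permission table: reviewers may read log data
# but not modify it; only admins may manage user accounts.
PERMISSIONS = {
    "analyst": {"read", "write"},
    "reviewer": {"read"},
    "admin": {"read", "write", "manage_users"},
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())
```

A logging system would call such a check before committing an entry, so that unauthorized changes are rejected at the point of capture rather than discovered later in an audit.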
Adopting a solution that is tailor-made for the work environment, such as STACS Casework (developed specifically for forensic DNA labs), allows the lab to automate routine tasks, including:
- Checking consumable expiry
- Tracking inventory
- Setting up instruments
- Creating work list/transaction files
- Logging user IDs and timestamps
- Populating forms with existing case/submission/batch data
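To give a sense of what automating one of these routine checks involves, here is a sketch of a consumable-expiry check. The function name and data shapes are hypothetical illustrations, not drawn from STACS Casework's actual interface:

```python
from datetime import date

def check_consumable_expiry(inventory, today=None):
    """Return the IDs of consumables that have expired as of `today`.

    `inventory` maps a consumable ID to its expiry date as an
    ISO-format string (e.g. "2025-06-30").
    """
    today = today or date.today()
    return [cid for cid, expiry in inventory.items()
            if date.fromisoformat(expiry) < today]
```

Run on a schedule, a check like this flags expired reagents before an analyst reaches for them, replacing a manual logbook review with an automated, consistently timestamped one.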
Log data and processing data—freed from a physical logbook and stored in an accessible form in the database—is available at the push of a button for reporting, troubleshooting, information requests, or in the case of an audit. Along with supporting the lab’s data integrity, a well-designed solution can vastly improve the lab’s throughput as well as the quality and usability of its log data.
Conclusion: Why you need to show your work
All workplaces require logging systems that balance the realities of their environment with effective recordkeeping. Maintaining data integrity is a challenge because of the human element. The chances for data errors and omissions to slip through the cracks increase when staff is overwhelmed and short on time.
For the forensic lab, which depends on a robust body of log data to support test results and secure grant funding, good data integrity can mean the difference between success and loss of accreditation. Fortunately, labs are well positioned to adopt secure digital logging methods in order to improve their data integrity and even the workflow itself. By implementing a solution designed for the lab's specialization, it becomes possible to automate routine checks and tasks, greatly enhancing the lab's productivity while reducing human error.