Wrap-up: Data quality
The quality of a survey is best judged not by its size, scope, or prominence, but by how much attention is given to [preventing, measuring and] dealing with the many important problems that can arise | American Association for Public Opinion Research (AAPOR, 2015).
How you organise, document and process data has a clear impact on data quality, and thus also on the reliability and adequacy of research findings. In scientific research even small things matter, and data organisation, documentation and processing procedures are no exception; this holds for quantitative as well as qualitative research. A systematic and meticulous approach to data management (Krejčí, 2010) is rewarded by the following:
- Preventing errors and false findings;
- Smooth course, time efficiency and transparency of your own research work;
- Establishing the conditions for effective re-use of research data outside of the original research team.
While in quantitative research quality is closely linked to standardization and control over the research situation, the prevailing approach in qualitative research is different.
In qualitative research, discussions about quality in research are not so much based on the idea of standardization and control, as this seems incompatible with many qualitative methods. Quality is rather seen as an issue of how to manage it. Sometimes it is linked to rigour in applying a certain method, but more often to soundness of the research as a whole | Flick (2007).
A comprehensive approach to data quality
In previous chapters, you have become familiar with a number of procedures and rules for developing an appropriate data file structure, creating rich metadata, and ensuring data integrity and authenticity. At the same time, however, we should bear in mind that data management is always an integral part of a much more complex research workflow.
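One routine way to support the data integrity mentioned above is to record a checksum for each data file when it is deposited and verify it later, so that silent corruption or unintended changes are detected. The sketch below assumes Python and SHA-256; the function names are illustrative, not a prescribed standard:

```python
import hashlib

def file_checksum(path, algorithm="sha256", chunk_size=65536):
    """Compute a file's checksum, reading in chunks so large datasets fit in memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_integrity(path, recorded_checksum, algorithm="sha256"):
    """Return True if the file's current checksum matches the one recorded at deposit."""
    return file_checksum(path, algorithm) == recorded_checksum
```

Storing the recorded checksums alongside the data (for example, in the study's metadata) lets anyone re-running the analysis confirm they are working with the authentic files.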
The quality of the outcome is achieved through the quality of the production process (Krejčí, 2010), and scientific research is no exception: quality stems from professionalism based on continuous improvement. Data management is one important part of this process. As such, it is interconnected with and influenced by other processes within the system and should contribute to the common long-term objective of continuously improving research work within the research organisation.
The mechanical quality control of survey operations such as coding and keying does not easily lend itself to continuous improvement. Rather, it must be complemented with feedback and learning where the survey workers themselves are part of an improvement process | Biemer & Lyberg (2003).
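To make "mechanical quality control" concrete, such a check might simply flag keyed survey values that fall outside the codes a codebook allows. The following sketch assumes hypothetical variable names and codes, not any real codebook:

```python
# Hypothetical codebook: for each variable, the set of valid codes.
CODEBOOK = {
    "sex": {1, 2, 9},                         # 1=male, 2=female, 9=missing
    "satisfaction": set(range(1, 6)) | {99},  # 1-5 scale, 99=missing
}

def find_invalid_codes(records):
    """Return (record_index, variable, value) for every out-of-range code."""
    problems = []
    for i, record in enumerate(records):
        for var, value in record.items():
            allowed = CODEBOOK.get(var)
            if allowed is not None and value not in allowed:
                problems.append((i, var, value))
    return problems
```

A check like this catches keying errors automatically, but, as the quotation above notes, it only drives continuous improvement when its findings are fed back to the people doing the coding and keying.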
In addition, quality always involves a number of different dimensions. Quality is often defined as “fitness for use”. However simple this sounds, it provides a point of departure for a comprehensive approach to data quality. The results must not only be accurate, but must also be delivered on time, be understandable and clear, and meet other potential users’ needs, e.g. comparability and coherence with other databases. Moreover, they must also be produced cost-efficiently.
For an example of how total quality management is handled by the European Statistical System, see Eurostat (2017).