Cyber Security and Privacy Experiments: A Design and Reporting Toolkit
Abstract
With cyber security increasingly maturing into a scientific discipline, there have been a number of proposals to advance evidence-based research, ranging from introductions of evidence-based methodology [8] and proposals to make experiments dependable [30], to guidance for experiment design [8, 38] and overviews of pitfalls to avoid when writing about experiments [42]. However, one is still given to wonder: what are the best practices in reporting research that act as tell-tale signs of reliable work?

We aim to develop a set of indicators for complete reporting that can drive the quality of experimental research as well as support the reviewing process.

As method, we review the literature on key ingredients of sound experiments and study fallacies and shortcomings in other fields. We draw on lessons learned and infuse them into indicators. For each indicator, we provide a definition, reporting examples, importance and impact, and guiding steps to be taken.

As results, we offer a toolkit with nine systematic indicators for designing and reporting experiments. We report on lessons and challenges from an initial sharing of this toolkit with the community.

The toolkit is a valuable companion for researchers. It incites the consideration of scientific foundations at the experiment design and reporting phases. It also supports program committees and reviewers in quality decisions, thereby impacting the state of our field.
Domains
Computer Science [cs]
Origin
Files produced by the author(s)