FAIR Metrics

The FAIR Metrics Group took on the challenge of designing a framework for evaluating "FAIRness".

Discoverability and reusability are not abstract concepts; they imply concrete behaviors and properties that must hold true for the FAIR objectives to be fulfilled. It must therefore be possible to precisely define a measurable set of properties and behaviors that assess FAIRness. Over the short one-month lifespan of the FAIR Metrics Working Group, we have created a cogent framework for developing FAIR metrics, manifested as a simple form with 8 questions that structures fruitful conversations about proposed metrics.

Our approach recognizes that diversity of opinion must play a key role in crafting fair and effective FAIR guidelines. Communities must not only understand what is meant by FAIR, but must also be able to monitor the FAIRness of their digital resources in a realistic yet quantitative manner. We recognize that what is considered FAIR in one community may be quite different from FAIRness in another community; differing community norms and practices make this a certainty! As such, our approach focuses on the mechanism by which metrics can be created by community members themselves, rather than attempting to create a set of one-size-fits-all metrics to apply to every resource.

With a mechanism in place to design metrics, we now open the process of generating metrics to community participation. We have created several exemplar metrics that we think will be broadly applicable; however, additional metrics may be designed and published through our open submission process, or simply shared within your community through your normal communication channels.

Our proposed FAIR Metrics can be found here.

We have selected an approach to publishing FAIR Metrics that is, itself, FAIR. This takes the form of a FAIR Accessor (a kind of Linked Data Platform Container), which describes a subset of metrics, the community to which they are applicable, and other relevant metadata, and links to each of the associated metric metadata documents. These metadata documents are written in YAML and follow the smartAPI annotation patterns for Web Services. As such, each of these documents contains a link to the Metric itself: a Web interface capable of testing a resource's compliance with that metric.
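As a rough illustration of the kind of information such a YAML metadata document might carry, the sketch below shows a hypothetical metric record. The field names, identifier, and URL here are illustrative assumptions, not the exact smartAPI vocabulary used by the Working Group's published documents:

```yaml
# Hypothetical sketch of a FAIR Metric metadata document.
# Field names are illustrative only; the actual documents follow
# the smartAPI annotation patterns for Web Services.
metric:
  identifier: "FM-EXAMPLE-1"        # hypothetical metric identifier
  principle: "F1"                   # the FAIR principle this metric assesses
  name: "Identifier Uniqueness"
  description: >
    Tests whether the resource's identifier is guaranteed
    to be globally unique.
  community: "all"                  # or a community-specific scope
  # Link to the Metric itself: a Web interface that runs the
  # compliance test against a given resource (URL is a placeholder).
  test_interface: "https://example.org/tests/identifier-uniqueness"
```

In this scheme, the FAIR Accessor would enumerate documents like this one, so that both the metrics and their compliance tests remain findable and machine-actionable.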