Commit

Update 3-assessment_content.md
npch authored Feb 12, 2024
1 parent 8b00bbe commit ca00925
Showing 1 changed file with 11 additions and 3 deletions.
resources/03 Put it into practice/3-assessment_content.md: 11 additions & 3 deletions
@@ -48,7 +48,7 @@ To perform FAIR assessment one can use a FAIR software checklist and manually go
- [howfairis](https://github.com/fair-software/howfairis)
- [isseven](https://github.com/meadsteve/isseven)

- 2. Use the developed FAIR software checklist to assess the FAIRness of the chosen software object
+ 2. Use the FAIR software checklist developed by FAIR-IMPACT to assess the FAIRness of the chosen software object
- analyse each item in the following table, paying attention to its requirements, its method, and its essential, important and useful aspects
- note that the specification for each of the metrics is available in [D5.2 - Metrics for automated FAIR software assessment in a disciplinary context](https://zenodo.org/records/10047401), chapter 2 (Metric Specification), page 12

@@ -73,11 +73,19 @@ To perform FAIR assessment one can use a FAIR software checklist and manually go
| **FRSM-17** | Does the software include provenance information that describes the development of the software? |

3. Use an automated tool to assess the software FAIRness
+ - Although there are FAIR assessment tools for data, few are currently available for assessing software.
+ - Some examples of FAIR assessment tools for software that you can try include:
+   - [FAIR Software Checklist](https://fairsoftwarechecklist.net/): a web checklist that asks you 15 questions about your software, similar in scope to the FAIR-IMPACT metrics.
+   - F-UJI: a web tool designed for data, though it can also be used for software; a software-specific set of metrics is being implemented.
+   - howfairis: a Python tool that needs to be installed on your computer; it doesn't directly assess against the FAIR for research software principles (a usage sketch follows the diff below).

- 4. Compare the results from both approaches
+ 5. Compare the results from both approaches
- discuss any problems or doubts during the process
- what metrics did you find most useful to consider?
- what metrics did you find hardest to check?
- what metrics would you change to be more useful for your community / discipline?

- 5. Prepare for retrospective
+ 6. Prepare for retrospective
- what are your main impressions of this activity, in 2-3 sentences?
- choose a group representative who will share these impressions with the rest of the workshop participants

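Of the tools listed under step 3, the FAIR Software Checklist and F-UJI run in the browser, while howfairis is installed and run locally. Below is a minimal sketch of invoking it from Python, assuming howfairis has been installed (for example with `pip install howfairis`) and using the tool's own repository as a placeholder target:

```python
# Minimal sketch: run howfairis against a repository URL and print its report.
# Assumes howfairis is already installed (e.g. `pip install howfairis`).
# The URL below is a placeholder -- substitute the software chosen in step 1.
import subprocess

repo_url = "https://github.com/fair-software/howfairis"  # placeholder target

# howfairis takes a single repository URL and prints a compliance summary
# for the five fair-software.eu recommendations (repository, license,
# registry, citation, checklist), plus a suggested README badge.
result = subprocess.run(["howfairis", repo_url], capture_output=True, text=True)
print(result.stdout)
```

The exact output and options vary between howfairis versions, so check `howfairis --help` before relying on this in a workshop setting.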
