The Mastery Rubric for Bioinformatics: A tool to support design and evaluation of career-spanning education and training

Authors: Rochelle E. Tractenberg aff001;  Jessica M. Lindvall aff002;  Teresa K. Attwood aff003;  Allegra Via aff004
Affiliations: Collaborative for Research on Outcomes and Metrics, and Departments of Neurology, Biostatistics, Biomathematics and Bioinformatics, and Rehabilitation Medicine, Georgetown University, Washington, DC, United States of America aff001;  National Bioinformatics Infrastructure Sweden (NBIS)/ELIXIR-SE, Science for Life Laboratory (SciLifeLab), Department of Biochemistry and Biophysics, Stockholm University, Stockholm, Sweden aff002;  Department of Computer Science, The University of Manchester, Manchester, England, United Kingdom; The GOBLET Foundation, Radboud University, Nijmegen Medical Centre, Nijmegen, The Netherlands aff003;  ELIXIR Italy, National Research Council of Italy, Institute of Molecular Biology and Pathology, Rome, Italy aff004
Published in: PLoS ONE 14(11), 2019
Category: Research Article
doi: 10.1371/journal.pone.0225256


As the life sciences have become more data intensive, the pressure to incorporate the requisite training into life-science education and training programs has increased. To facilitate curriculum development, various sets of (bio)informatics competencies have been articulated; however, these have proved difficult to implement in practice. Addressing this issue, we have created a curriculum-design and -evaluation tool to support the development of specific Knowledge, Skills and Abilities (KSAs) that reflect the scientific method and promote both bioinformatics practice and the achievement of competencies. Twelve KSAs were extracted via formal analysis, and stages along a developmental trajectory, from uninitiated student to independent practitioner, were identified. Demonstration of each KSA by a performer at each stage was initially described (Performance Level Descriptors, PLDs), evaluated, and revised at an international workshop. This work was subsequently extended and further refined to yield the Mastery Rubric for Bioinformatics (MR-Bi). The MR-Bi was validated by demonstrating alignment between the KSAs and competencies, and its consistency with principles of adult learning. The MR-Bi tool provides a formal framework to support curriculum building, training, and self-directed learning. It prioritizes the development of independence and scientific reasoning, and is structured to allow individuals (regardless of career stage, disciplinary background, or skill level) to locate themselves within the framework. The KSAs and their PLDs promote scientific problem formulation and problem solving, lending the MR-Bi durability and flexibility. With its explicit developmental trajectory, the tool can be used by developing or practicing scientists to direct their (and their team's) acquisition of new bioinformatics KSAs, or to deepen existing ones.
The MR-Bi is a tool that can contribute to the cultivation of a next generation of bioinformaticians who are able to design reproducible and rigorous research, and to critically analyze results from their own, and others’, work.

Keywords:

Bioinformatics – Cognition – Experimental design – Health informatics – Human learning – Learning – Reproducibility – Instructors


