- Our Director Eduarda Centeno Receives the French Prix Science Ouverte de la Thèse
On December 1st, our director Eduarda Centeno attended the École Normale Supérieure of Université Paris-Saclay (Paris, France) to receive the prestigious Prix Science Ouverte de la Thèse. Organized by the French Ministry of Higher Education, Research and Space, the award recognizes doctoral projects that stand out for their implementation of open and reproducible science practices. In total, there are eight awardees, two in each category. Eduarda was honored in the field of Medicine and Health Biology with her doctoral project, available here. Over four years of work at the Université de Bordeaux, Eduarda developed a workflow inspired by various open and reproducible science practices. This effort allowed her to standardize the collection, management, analysis, and sharing of data and metadata, supporting internal projects and collaborations with other European laboratories. The main focus was to investigate how zebra finches learn and consolidate their song – a topic that bridges biology, neuroscience, and data science. More information about the award can be found here and here.

A Word from Our Director

“Winning this award was an incredible highlight of my year and a special closure to this chapter of my career. Implementing this project was not simple – we know that putting open and reproducible science practices into action can be challenging, whether due to a lack of encouragement and acceptance from supervisors or the absence of consolidated information on how to carry out standardizations. I was fortunate to have a team that supported me greatly, both within my laboratory and through external collaborations, which made it possible to develop and implement each stage of the workflow. Writing the thesis itself was also complex, since placing open science at the center of the manuscript is not common and generated some resistance. But I am happy to see the work recognized precisely for its boldness.
I believe that this kind of incentive from the French Ministry makes all the difference. I would love to see something similar implemented in Brazil – and, as director of the Network, I am available to work on initiatives together with national research funders.”

Why Does This Matter?

This international recognition reinforces the importance of valuing open and reproducible science practices worldwide, including in Brazil. The Brazilian Reproducibility Network believes that initiatives like this strengthen trust in science, expand collaboration, and make results more transparent and accessible. We celebrate the achievement of our director and hope it inspires other researchers to be bold in implementing open practices in their projects.
- Brazilian Reproducibility Initiative receives international award
The Einstein Foundation, in cooperation with the BIH QUEST Center for Responsible Research, promotes an annual award to recognize efforts that strengthen research quality. This year, the Brazilian Reproducibility Initiative is one of the winners.

The Brazilian Reproducibility Initiative

The Brazilian Reproducibility Initiative was a multicenter project aimed at assessing replicability rates in Brazilian biomedical science. Launched in 2017 under the leadership of Prof. Olavo Amaral (UFRJ), the project brought together more than 200 researchers to carry out 143 replications of experiments published over the past two decades by scientists affiliated with Brazilian institutions. The project’s main findings were released as a preprint in March 2025, and the results are summarized here. The award recognizes the project’s impact beyond its empirical results, highlighting the opportunities it has created for improving current practices in academic research as well as the legacy it leaves for organizations such as BRISA and the Brazilian Reproducibility Network itself. Jürgen Zöllner, a member of the award jury, commented on the project’s selection: “The Brazilian Reproducibility Initiative proves that a coordinated, nationwide effort to strengthen research rigor and reproducibility is possible—and should inspire disciplines and funders worldwide to follow suit.” The project is in the final stages of results verification and is expected to release an updated preprint in the coming months. The €100,000 award will be dedicated to continuing efforts in support of reproducibility in Brazilian research, and the specific uses of the funds will be defined by the consortium.

The Award

The Einstein Foundation Berlin was founded in 2009 to support research in the state of Berlin, Germany.
In 2017, the BIH QUEST Center—affiliated with the Charité university hospital, also located in Berlin—was created to develop and implement new practices to improve the quality and reliability of biomedical research. The international award recognizes individuals, institutions, and projects committed to enhancing research quality. In its first edition in 2021, it honored the contributions of Paul Ginsparg, founder of the arXiv repository; the Center for Open Science, a research and infrastructure institution for open science; and the ManyBabies 5 project, a multicenter initiative dedicated to building a more robust scientific foundation in developmental psychology. Other notable awardees include initiatives focused on the publication system, such as PubPeer, and reforms in researcher evaluation, such as the project led by Anne Gärtner. In 2025, the Brazilian Reproducibility Initiative received the award in the institutional category, alongside Simine Vazire (individual category) for her academic trajectory, and Maximilian Sprang (Early Career Researcher category) for the project Erring Rigorously. Additional information about this edition's award recipients, as well as those from previous years, can be found at https://award.einsteinfoundation.de/award-winners-finalists .
- The first results of the Brazilian Reproducibility Initiative are now published in preprint format
The study successfully replicated between 15% and 45% of a sample of biomedical experiments published by Brazilian research groups and mapped the challenges faced by the laboratories participating in the project. Photo: Eduardo Anizelli/Folhapress. The Brazilian Reproducibility Initiative, a project that helped establish our network, has just published its first results in preprint format. The study, which involved 56 laboratories and 213 researchers across the country, successfully replicated between 15% and 45% of a sample of experiments published by Brazilian groups, depending on the criteria used. Funded by the Serrapilheira Institute, the project took seven years to complete and represents the world's first national-level estimate of experiment replicability. The initiative set out to replicate 60 randomly selected experiments from articles published by Brazilian research groups using three traditional biomedical research laboratory methods: the MTT assay, which measures the viability of cultured cells; reverse transcription polymerase chain reaction (RT-PCR), used to analyze the expression of specific genes; and the elevated plus maze, used to assess anxiety-related behavior in rodents. Each experiment was sent to three different laboratories in the consortium for replication. Of the 180 planned replications, 143 were carried out, with 37 canceled due to difficulties in acquiring supplies or laboratories failing to complete the experiments. However, only 97 were considered valid replications of the original experiment by a validation committee created for the project. The remaining ones were excluded from the main analysis due to issues such as protocol deviations, insufficient sample size, insufficient biological variability between experimental units, or inadequate documentation. The Initiative's main results include 97 replications of 47 experiments, with each experiment replicated between one and three times.
The replication success rate was measured using five pre-established criteria, yielding the following results:
a) 45% of the original effect estimates fell within the 95% prediction interval of a meta-analysis of the replications.
b) 26% of the replication effect estimates fell within the 95% confidence interval of the original effect.
c) 19% of replication effect estimates showed a statistically significant aggregated effect (p < 0.05) in the same direction as the original effect.
d) 15% of experiments had at least half of their replications with a statistically significant result in the same direction as the original.
e) 40% of experiments had at least half of their replications considered successful by the laboratories conducting them.
An important limitation is that the replications showed much greater variability between experimental units than the original experiments, making several of them statistically underpowered to detect the difference between groups observed in the original experiment. To address this issue, the study conducted various alternative analyses using different filters to select the experiments. When considering only the 27 experiments with adequate statistical power, even when accounting for the variability observed in the replication, the statistical significance rate obtained in the replications rises to 30%. The relative effect sizes (e.g., ratios between measurements in the treated group and the control group) were, at the median, almost 40% smaller in the replications than in the original articles. Conversely, the coefficients of variation (which measure the variability observed within each group) were 2.5 times higher, particularly in in vitro experiments using the MTT and RT-PCR techniques.
The differences between the original experiment and the replications were about 34% greater than those between individual replications, suggesting that part of the reason for low replicability is specific to experiments published in the scientific literature. Possible reasons for this include issues such as publication bias and researchers' preference for positive results. Still, considerable heterogeneity among results was observed across replications, even though the responsible laboratories were blind to the original article and had no incentive to seek specific outcomes. This heterogeneity, combined with the difficulties many laboratories faced in following their own protocols, led the Initiative to conduct a self-assessment process to identify reasons for protocol deviations and opportunities to improve experiment reproducibility. The results indicate that, in some cases, strictly following pre-established protocols is impossible—either because the biological model behaves differently than expected or due to external conditions such as infrastructure limitations, supply chain issues, or unpredictable factors like the COVID-19 pandemic forcing changes in plans. In other cases, however, non-adherence to protocols resulted from communication failures or misinterpretations, some of which stemmed from the lack of shared terminologies among laboratories to describe aspects of experimental design, such as the experimental unit used in cell culture studies. For Olavo Amaral, coordinator of the Initiative, this is one of the factors indicating that academic laboratories are poorly equipped to collaborate—largely because the research culture in academia is more focused on small groups conducting exploratory approaches rather than large confirmatory projects like the Initiative.
'We had the idea that bringing many laboratories together would be enough to establish reproducible protocols, but retrospectively, it's like expecting a collection of garage bands to form an orchestra.' For Amaral, the academic community should structure itself to develop confirmatory projects, either through large consortia like the Initiative or specialized centers dedicated to such projects with higher methodological rigor. 'We can't expect every scientific project to apply all possible reproducibility practices, as some are quite costly, and this could stifle the exploratory side of science that is also necessary. But it is crucial to increase rigor in selected cases, and our scientific workforce is not well-organized or prepared for this.' Amaral also highlights that biomedical scientists are rarely trained to manage large projects and are not accustomed to receiving external supervision, both of which are essential for large-scale projects to thrive. Beyond researcher training, simple measures such as adopting consensus terminologies among laboratories to describe experimental units, control groups, and other aspects of experimental design also present opportunities for improving reproducibility. Thus, the results of the Initiative reinforce one of the central premises behind the Brazilian Reproducibility Network—that improving the reliability of science requires coordinated actions among researchers, institutions, and other stakeholders to promote systemic changes that encourage and reward more rigorous science. Unsurprisingly, the Network may be the Initiative's most important legacy, helping the project continue with future articles dedicated to deeper analyses of its findings. See more articles on this subject here, here, and here.
- The Science Integrity Alliance is now a partner of BrRN
The Science Integrity Alliance (SIA) was born out of the recognition that science is facing a crisis of integrity, marked by reproducibility failures, predatory editorial practices, the systematic production of fraudulent articles (“paper mills”), and, more recently, the misuse of artificial intelligence tools. Many of the responses to these challenges remain fragmented or confined to specific fields, which limits collective progress and undermines public trust in science. In light of this scenario, SIA’s founder, Dr. Luciana Machado, envisioned the alliance as a space of convergence: a reliable environment to expand access to educational resources, provide practical tools, and connect initiatives that share a commitment to scientific integrity. Since its creation, SIA has been consolidating itself as an expanding international community, bringing together partners across different continents and more than 20 active collaborators. Its role is to provide visibility, support, and connection so that initiatives focused on research integrity—in areas such as academic infrastructure, scientific communication, research education, strengthening reproducibility, and promoting open science—can scale up and reach new audiences. Through resources such as a digital journal, interactive mappings, discussion forums, and learning environments, the alliance seeks to build not only a global cooperation network but also a movement capable of transforming the research culture and strengthening society’s trust in science. As a partner of the Brazilian Reproducibility Network, we aim to foster collaboration around the values of good scientific practices, integrity, open science, and reproducibility. This partnership will also provide exclusive access to resources and tools for four members of the Network. Partnership between the Brazilian Reproducibility Network and the Science Integrity Alliance. 
More information about the resources:

The Premium Membership includes:
- HIKE – Community forum for constructive discussions with experts on the challenges of research integrity
- REACH – Bi-monthly digital journal on new developments in the field of research integrity (only the inaugural issue will be open access to all)
- Marketplace – Catalog of paid solutions with exclusive discounts for Premium Membership subscribers
- STRIDE – Directory of entities dedicated to research integrity

All of this builds on the basic account, which offers free access to the following resources:
- Browse Solutions – Catalog of free solutions: www.sci-integrity.com/browse-solutions
- LEARN – Interactive educational platform: www.sci-integrity.com/learn
- MOSAIC – Mapping of the open science ecosystem (in development)
- MIRROR – Multimedia hub for metascience (in development)
- Junior Hub – Science and integrity platform aimed at younger audiences (in development)
- Member-exclusive portal – Interaction space for members with public accounts, including exclusive discussion groups such as the Integrity Café, and personalized notification management

Information about each of these resources can also be accessed at www.sci-integrity.com/resources-overview . All links will be updated as the resources under development are launched. We invite our community to learn more about this initiative at www.sci-integrity.com . Disclaimer: This text was jointly written by SIA and the BrRN.
- BrRN x JoVE: a shared mission for science reproducibility
When Dr. Jan Lötvall's lab published their game-changing RNA isolation method, the University of Gothenburg researchers faced a new challenge: a high volume of queries from scientists trying to run the complex new protocol in their own labs. By creating a video with JoVE, the authors enabled others to visualize every intricate detail of the experiment. Today, their video article has almost 100,000 views and over 600 citations. Reproducibility is the foundation of trustworthy science. Yet many researchers face difficulties replicating experiments described in traditional articles, where important details are sometimes left out. JoVE was created to address this challenge. Since its founding, it has grown into the world's largest collection of scientific videos, all designed to make methods clearer and more accessible, credible, and reproducible. With JoVE recently joining forces with the Brazilian Reproducibility Network as a partner, let's look at how the company began, how its mission has evolved, and its potential impact on the future of Brazilian and global science. An animation by JoVE.

A New Way to Share Methods

The idea for JoVE came in 2006 from Moshe Pritsker, then a PhD student at Princeton and now the company's CEO. He was attempting to conduct a new stem cell experiment but couldn't succeed based on the written description alone. With his lab able to fund international travel, he visited the group that had first published the method. After a few weeks of hands-on training, he returned with a deeper understanding of the technique and the realization that advancing science shouldn't hinge on travel budgets or rare chances to train with the original authors—everyone should be able to see exactly how a method is done. This experience sparked a new idea: to publish scientific methods not only in text but also in video. By showing the procedure step by step, the small but crucial details that determine success would be preserved.
As Moshe later explained, “Confusion over the smallest details can result in months of lost effort.” To solve this, he created JoVE, the first peer-reviewed science video journal.

Building a Global Video Library

From a single journal, JoVE has grown into a library of more than 26,000 videos, including over 18,000 research articles. More than 1,000 new peer-reviewed video articles are added each year. JoVE Journal, which is indexed in all major databases, covers a wide range of STEM disciplines, from biology and chemistry to nursing and bioengineering. The journal follows a hybrid publication policy: while all content is available through institutional subscriptions, authors can also choose an open access option to make their video articles freely accessible worldwide. This route aligns with Open Access values by promoting global visibility and transparency of science. Once a manuscript has been peer-reviewed and accepted for publication, JoVE offers interested authors the possibility of including a video. The team develops a script with the authors and films in their lab or polishes submitted footage. A global network of videographers, editors, and voice-over specialists ensures every video is scientifically accurate and professionally produced. Alongside the journal, the Encyclopedia of Experiments is a collection of comprehensive videos of advanced techniques for researchers in academia and industry. JoVE also supports education through videos designed for undergraduate students. Animated lessons and live-action demonstrations help them understand complex concepts and prepare for lab work.

Why Video Matters for Reproducibility

Reproducibility often fails because written methods cannot capture every detail. A slight nuance in technique, the way equipment is positioned, or the pace of adding a reagent can mean the difference between success and failure.
Filmed demonstrations capture subtle actions and experimental conditions that are hard to describe in text. The result is a clearer, more transparent record that can be replicated. “Sometimes you're reading directions, and the words just don't make sense,” says Jeanette Moore, researcher and lab manager at the University of Alaska Fairbanks. “That is the value of JoVE, actually seeing how to insert a needle in a laboratory animal before you do it, so you don't have to do it twice.” When experienced researchers leave a lab, their skills often disappear with them. Videos can act as a long-term archive. Recording methods ensures that knowledge is preserved and available to the next generation of scientists. Reliable science also depends on accessibility. Often, researchers and students struggle with dense, technical language in traditional journals. JoVE's platform reduces this barrier by pairing complex descriptions with clear visual demonstrations, as well as subtitles in 14+ languages (including Brazilian Portuguese) for educational content. JoVE's translation efforts are continually expanding.

Education and Training at Scale

Reproducibility does not begin in research laboratories. It depends on how students and early-career scientists are trained. JoVE videos demonstrate core concepts and techniques clearly, giving students a shared knowledge base before they even enter the lab. This helps reduce errors and supports reproducible practices from the start of their careers. Moreover, they're able to connect abstract concepts with real-life applications by seeing what happens in today's research labs. Institutions that integrate JoVE into their lab training report accelerated learning and reduced costs. By standardizing training, researchers across institutions and countries are able to start off with the same foundation in reproducible methods, leveling the playing field for those in resource-constrained environments.
Shared Goals with BrRN

The BrRN was created to make science in Brazil more trustworthy through rigorous, responsible, and collaborative practices. A core part of its mission is providing training and educational resources to researchers across disciplines. JoVE's history and mission align closely with these goals. Both organizations are committed to making methods clearer, reducing barriers to reliable research, and strengthening reproducibility across the global scientific community. By working together, BrRN and JoVE can provide researchers with new tools to ensure that methods are transparent and reproducible across institutions. As the first step of this new collaboration, JoVE is providing 6-month access to 60 members of BrRN's immediate network. This pilot will be supported by a series of webinars and training sessions. The goal is to demonstrate how video-based methods drive reproducibility in Brazil and elevate initiatives that promote rigorous science both locally and globally. As part of this collaboration, we'd also like to highlight that JoVE is already available at several major Brazilian universities, including the University of São Paulo (USP), São Paulo State University (UNESP), State University of Campinas (UNICAMP), Rio de Janeiro State University (UERJ), and Goiás State University (UEG). This means many researchers in the country already have direct access and can take advantage of JoVE's extensive resources for scientific research.

Conclusion

The story of Dr. Lötvall's lab shows how powerful video can be in turning a promising discovery into a widely applied method. Irreproducible findings can have a far-reaching ripple effect in science, affecting future research studies and scientific decision-making. JoVE was founded with reproducibility at its core. By pairing text with video, it addressed a fundamental gap in scientific communication and created a resource now used worldwide.
The partnership with BrRN builds on this foundation, bringing new opportunities for Brazilian researchers to access, apply, and share reproducible methods. Reliable science is a global priority, and collaborations like this show how we can work together to achieve it. Learn more about JoVE and explore how video supports reproducible research: https://jove.com/research Disclaimer: This text was written by JoVE and revised by the BrRN.
- Brazilian Reproducibility Network develops recommendations for incorporating open and reproducible research in the evaluation of graduate programs by CAPES
On April 18 and 19, the Brazilian Reproducibility Network held an online conference on Reproducibility in Brazilian Research as a preparatory event for the 5th National Conference on Science and Technology. There were two days of debates on the meaning of reproducibility in the natural sciences, health sciences, and humanities, as well as discussions on the roles that institutions, funders, and journals can take on to promote reproducible research practices. The recording of the two-day event is available on YouTube (day 1 and day 2) and the conference summary is available here. During the conference, CAPES' evaluation director, Dr. Antônio Gomes de Souza Filho, invited the BrRN to draw up a list of practical recommendations for incorporating open and reproducible research practices into the evaluation of Brazilian postgraduate courses. The drafting of the document was led by the BrRN, with extensive collaboration from members of the Open Government Partnership (Commitment 3) and our community. The document is available here and contains 8 recommendations, based on international initiatives to reform scientific evaluation and adapted to the context of postgraduate evaluation, which are summarized in the figure below. CAPES' evaluation takes place every four years, validating the country's master's and doctoral programs and informing the direction of the federal government's investments in the National Postgraduate System. As such, the evaluation has great potential to direct research practices within the programs. In recent evaluations, the assessment criteria related to the quality of intellectual production have been strongly linked to the Qualis ranking, which in turn is based on bibliometric indicators of journals, such as the impact factor.
Various international initiatives such as DORA, the Leiden Manifesto, CoARA, and the Hong Kong Principles recommend that the publication vehicle should not be used to assess the quality of research, which is the first of our recommendations. We also advocate for the qualitative evaluation of selected output, directed towards different quality dimensions such as methodological rigor and transparency. We further recommend that the open availability of articles, theses, and dissertations be considered, as well as the sharing of data, materials, code, and other research products. How open access is achieved should be up to researchers, and should not depend on the payment of publication fees in the case of articles. Other recommendations revolve around the training and support offered to students to adopt open and reproducible research practices. Graduate programs should include discussions on these topics in the curriculum, as well as implement policies and support services in this regard. Open education practices that make courses and subjects offered by the program available to a wider audience should also be valued. Finally, we recommend that the assessment of the research output of researchers and students should include dimensions other than articles and theses. Activities such as pre- and post-publication peer review, editing scientific journals, participating in organizations and committees dedicated to scientific policies, and taking up roles in stable research collaborations should also be taken into account. The recommendations were presented by the BrRN to the Technical-Scientific Council for Higher Education on May 21, and we are open to developing recommendations in this regard for other agencies, institutions, and graduate programs. If you are interested in adapting our recommendations to your context, please access the document and contact us via redereprodutibilidade@gmail.com .
- The Brazilian Reproducibility Network is a member of CoARA
As of June 2024, we are official members of CoARA (Coalition for Advancing Research Assessment). CoARA is a global coalition of research organizations, funders, and national and regional assessment agencies, among other types of organizations, mobilized to reform the way research products, researchers, and their institutions are assessed.

The Agreement on Reforming Research Assessment (ARRA)

In 2022, an initial CoARA group, originally focused on European organizations, published the ARRA (Agreement on Reforming Research Assessment). This agreement is based on the understanding that the evaluation of research products, researchers, and research institutions needs to be rethought, mainly by avoiding metrics based on publications. This can help avoid the negative impacts of pursuing publication in journals with a high impact factor, which include, for example, reduced methodological rigor and high publication costs. The four core commitments of the agreement are summarized in the box below. They highlight the need to diversify the scientific contributions considered in evaluations and to give qualitative evaluation a greater role than quantitative metrics.

Core commitments of the ARRA (Agreement on Reforming Research Assessment):
- Recognize the diversity of contributions to, and careers in, research, in accordance with the needs and nature of the research. Contributions to science are not just papers; other research products, research practices, and related activities (supervision, training, etc.) must also be recognized. Likewise, careers in science other than the role of researcher must be recognized and valued.
- Base research assessment primarily on qualitative evaluation for which peer review is central, supported by responsible use of quantitative indicators. Qualitative assessment by peer review is central. Review processes must follow standards of rigor and transparency, and must be constantly examined to ensure the quality and impartiality of evaluations.
- Abandon inappropriate uses in research assessment of journal- and publication-based metrics, in particular inappropriate uses of Journal Impact Factor (JIF) and h-index. Inappropriate uses include relying exclusively on metrics based on the author, journal, or language of publication, for example, to evaluate the quality of a work.
- Avoid the use of rankings of research organizations in research assessment. Criteria used in international rankings of universities and other institutions, generally defined by commercial organizations, should not influence the internal quality assessment of research or researchers. Each institution must have the autonomy to define how to evaluate quality internally.

Six supporting commitments are also included in the agreement, focusing on enabling the transition to new assessment systems and on continuous learning and the use of evidence in decision-making.

Activities of the Brazilian Reproducibility Network

By becoming a member of CoARA, we are also signatories to the ARRA. With this, we indicate that the BrRN shares the vision that reforms are needed and adopts the commitments of the agreement in its activities. As we are not an organization that evaluates researchers, we can work on supporting our members to implement reform measures and on seeking evidence on the best ways to do so. Our first activity in this direction has already been announced: in May we presented a list of recommendations to CAPES indicating that reforms should be made to the evaluation model for Brazilian postgraduate courses. The next steps should include recommendations aimed at the coordinators of these programs, with concrete actions that can already be implemented independently and suggestions of relevant tools and platforms for each area of knowledge. Additionally, all BrRN members can contribute to CoARA working groups.
Currently discussed topics include open infrastructures for responsible evaluations, multilingualism and biases in research evaluation, experimentation in research evaluation, responsible metrics and indicators, and recognition of high-quality peer reviews, among others. The complete list and more information about each group can be seen here. If you would like to contribute, please contact us so we can organize our contributions. Other ideas for how the BrRN can act are emerging, and as we organize an action plan over the next year, we are open to listening to our members and the broader interested community. Contribute by writing to us via email (redereprodutibilidade@gmail.com) or participating in Zulip (www.reprodutibilidade.zulipchat.com).







