
#reproducibility

Harald Klinke
Open AI needs open infrastructure, including during development. The f13 project on OpenCode provides tools for transparent, reproducible workflows, from model training to deployment. A step toward digital sovereignty through open tooling. #OpenSourceAI #Reproducibility #DigitaleSouveränität
https://gitlab.opencode.de/f13
Technology Tales
Docker Desktop for Statisticians revolutionises R use by creating isolated, reproducible environments. This eliminates version conflicts and simplifies setup. With Docker, you run pre-configured R containers, enabling efficient and clean analysis environments. Explore container management to enhance statistical work and ensure easy collaborative sharing. #Docker #Statistics #Reproducibility #RStats
https://www.statology.org/docker-desktop-for-statisticians-running-r-in-containers/
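A minimal sketch of what such a pinned R environment might look like, assuming the rocker project's r-ver images; the R version, the dplyr dependency, and the analysis.R script are placeholders, not details from the linked article:

```dockerfile
# Pin the R version so every collaborator runs the same interpreter
FROM rocker/r-ver:4.4.1

# Install an analysis dependency at a fixed version (dplyr is a placeholder)
RUN R -e "install.packages('remotes', repos = 'https://cloud.r-project.org'); \
          remotes::install_version('dplyr', version = '1.1.4', repos = 'https://cloud.r-project.org')"

# Copy the analysis script into the image (analysis.R is a placeholder)
COPY analysis.R /home/analysis.R
CMD ["Rscript", "/home/analysis.R"]
```

Sharing the image (or just the Dockerfile) gives collaborators an identical R version and package set, which is the isolation the post describes.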
Pete Bachant
When you describe the computational methods in your paper without sharing the code and data:
#OpenScience #reproducibility
Veit Schiele 🔜 @FrOSCon
Karl Popper in The Logic of Scientific Discovery, 1959: "Non-reproducible single occurrences are of no significance to science."
XKCD in Replication Crisis, 2025: "Replication Crisis Solved"
https://xkcd.com/3117/
#Science #Reproducibility #XKCD
David Philip Morgan
Congratulations to @ElenLeFoll and colleagues on joining the German Reproducibility Network! 🎊
Great to see representation of the humanities in #OpenScience and #Reproducibility, with lots of great progress underway there - researchers @unimannheim should definitely check out the talks from this initiative! 💡
@GermanRepro @ReproducibiliTeaGlobal @ElenLeFoll
Pete Bachant
"One button" reproducibility should be the standard.
#OpenScience #reproducibility
Daniel Hoffmann🌻
There is at least one subfield of biology where #reproducibility is relatively high: fruit fly immunology, where, according to a new study, the majority of results can be reproduced. I suspect one reason may be large sample sizes.
https://www.nature.com/articles/d41586-025-02250-1

Retractions and failures to replicate are signs of weak research. But they're also signs of laudable and necessary efforts to identify weak research and improve future research. The #Trump admin is systematically weaponizing these efforts to cast doubt on science as such.

"Research-integrity sleuths say their work is being ‘twisted’ to undermine science."
nature.com/articles/d41586-025


And yet another in the ever-increasing list of analyses showing that top journals are bad for science:

"Thus, our analysis show major claims published in low-impact journals are significantly more likely to be reproducible than major claims published in trophy journals. "

biorxiv.org/content/10.1101/20

bioRxiv · A retrospective analysis of 400 publications reveals patterns of irreproducibility across an entire life sciences research field
The ReproSci project retrospectively analyzed the reproducibility of 1006 claims from 400 papers published between 1959 and 2011 in the field of Drosophila immunity. The project attempts to provide a comprehensive assessment, 14 years later, of the replicability of nearly all publications across an entire scientific community in experimental life sciences. We found that 61% of claims were verified, while only 7% were directly challenged (not reproducible), a replicability rate higher than previous assessments. Notably, 24% of claims had never been independently tested and remain unchallenged. We performed experimental validations of a selection of 45 unchallenged claims, which revealed that a significant fraction (38/45) are in fact non-reproducible. We also found that high-impact journals and top-ranked institutions are more likely to publish challenged claims. In line with the reproducibility-crisis narrative, the rates of both challenged and unchallenged claims increased over time, especially as the field gained popularity. We characterized the uneven distribution of irreproducibility among first and last authors. Surprisingly, irreproducibility rates were similar between PhD students and postdocs, and did not decrease with experience or publication count. However, group leaders who had prior experience as first authors in another Drosophila immunity team had lower irreproducibility rates, underscoring the importance of early-career training. Finally, authors with a more exploratory, short-term engagement with the field exhibited slightly higher rates of challenged claims and a markedly higher proportion of unchallenged ones. This systematic, field-wide retrospective study offers meaningful insights into the ongoing discussion on reproducibility in experimental life sciences.

To my knowledge, this is the first time that not only prestigious journals but also prestigious institutions have been implicated as major drivers of irreproducibility:

"Higher representation of challenged claims in trophy journals and from top universities"

biorxiv.org/content/10.1101/20


🧑🔬 Starting your journey in scientific research?

Join the beginner-friendly tutorial at #EuroSciPy2025:
📦 Managing Scientific Data and Workflows with DataLad
Led by Ole Bialas & Michał Szczepanik

In this hands-on session, you'll learn:
✅ How to organize & track your data
✅ How to repeat your experiments reliably
✅ How to share your results with others

No Git experience required—just curiosity and Python basics!
📅 euroscipy.org/schedule
#FAIRdata #Reproducibility #Python
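For anyone curious before the session, the kind of workflow the tutorial covers can be sketched roughly like this (illustrative commands only, assuming DataLad is installed; the dataset, file, and script names are placeholders):

```shell
# Create a version-controlled dataset (text files tracked directly in git)
datalad create -c text2git my-project
cd my-project

# Record data files in the dataset's history
datalad save -m "Add raw data" data/raw.csv

# Run an analysis so that the command, inputs, and outputs are recorded as provenance
datalad run -m "Run analysis" "python analyze.py"

# Re-execute the recorded command to repeat the experiment
datalad rerun HEAD
```

The `datalad run`/`rerun` pair is what makes experiments repeatable: the exact command is stored alongside the results it produced.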

euroscipy.org · EuroSciPy 2025: The EuroSciPy meeting is a cross-disciplinary gathering focused on the use and development of the Python language in scientific research.

We invite staff and students at the University of #Groningen to share how they are making #research or #teaching more open, accessible, transparent, or reproducible, for the 6th annual #OpenResearch Award.

Looking for inspiration?
Explore the case studies submitted in previous years:
🔗 rug.nl/research/openscience/op

More info:
🔗 rug.nl/research/openscience/op

#OpenScience #OpenEducation #OpenAccess #Reproducibility
@oscgroningen


Jack Taylor is now presenting a new #Rstats package: "LexOPS: A Reproducible Solution to Stimuli Selection". Jack bravely did a live demonstration based on a German corpus ("because we're in Germany") that generated matched stimuli that certainly made the audience giggle... let's just say that one match involved the word "Erektion"... 😂

There is a paper about the LexOPS package: link.springer.com/article/10.3 and a detailed tutorial: jackedtaylor.github.io/LexOPSd. There is also a #Shiny app for those who really don't want to use R, which allows code download for #reproducibility: jackedtaylor.github.io/LexOPSd. Really cool and useful project! #WoReLa1 #linguistics #psycholinguistics

#statstab #378 Selective Inference: The Silent Killer of Replicability

Thoughts: Benjamini overviews the replicability crisis and alternatives to p-values (and their issues), and argues that selective reporting is itself a major problem.

#replication #crisis #pvalue #nhst #selectivereporting #replicability #reproducibility

hdsr.mitpress.mit.edu/pub/l39r
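The mechanism is easy to demonstrate with a quick, purely illustrative simulation (Python standard library only): run many experiments on pure noise, "publish" only those with p < 0.05, and the published effect sizes come out inflated even though every finding is a false positive.

```python
import random
import statistics
from math import erf, sqrt

random.seed(42)

def z_test_pvalue(sample, sigma=1.0):
    """Two-sided one-sample z-test against mean 0, known sigma."""
    z = statistics.mean(sample) * sqrt(len(sample)) / sigma
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# 2000 experiments, each n=30 draws from a null (true mean 0) distribution
experiments = [[random.gauss(0, 1) for _ in range(30)] for _ in range(2000)]

# Selective reporting: only "significant" results get written up
published = [s for s in experiments if z_test_pvalue(s) < 0.05]

false_positive_rate = len(published) / len(experiments)  # close to 0.05 by construction
mean_effect_all = statistics.mean(abs(statistics.mean(s)) for s in experiments)
mean_effect_pub = statistics.mean(abs(statistics.mean(s)) for s in published)

print(f"published {false_positive_rate:.1%} of null experiments")
print(f"mean |effect|: all={mean_effect_all:.3f}, published={mean_effect_pub:.3f}")
```

Anyone trying to replicate one of the "published" effects is, on average, chasing an effect near the significance threshold rather than the true (zero) effect, which is exactly why selective inference erodes replicability.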


Just pushed some updates. I have posted the #verilog serial link code. This involved setting up a set of simulation scripts.

If anybody is interested in helping on this project, I'd love to have somebody try to follow the readme.md to install the prerequisites and run the ./run_sim script.

The goal would be to take notes and flesh out which prerequisites are missing from the list and how to install them.

I'd be happy to do the same for somebody else's project.