Countless research-related issues have entered public discourse, some of them tied to the academic publishing process. Discussions on this topic range from the publish-or-perish thinking pervasive in academia, to the gatekeeping that occurs in the scientific review process, to the inaccessibility of quality research, including published research. These topics are of inherent concern to me, given that some of my time is spent as an academic researcher. But they also interest many people who are not academic researchers, particularly in light of current negative attitudes toward experts and the media, and the growing trend of people who “do their own research.”
I recently came across an article by Kulldorff (2025) that focuses on the rise and fall of scientific journals and on potential paths forward. His proposal rests on four pillars: (1) open access publishing, (2) open peer review, (3) reviewer payments, and (4) the removal of article gatekeeping (that is, eliminating acceptance/rejection processes, particularly desk rejection).
What made it most interesting wasn’t merely that it offered a way of thinking about an issue that has been debated ad nauseam. Rather, it was that the author pointed to journals already attempting to apply these pillars. The article is worth reading in full, so I won’t use this post simply to summarise Kulldorff’s argument. What I want to do here is push back a bit on Kulldorff’s fourth pillar, because I’m not ready to give up on the need for some degree of gatekeeping.
I was fully on board with Kulldorff’s pillars until I reached the fourth one. The complete wording of that pillar is as follows:
Remove article gatekeeping and allow scientists to freely publish all their research findings in a timely and efficient way.
While removing barriers to publication is attractive in theory, it overlooks the key role editorial oversight can play in ensuring the quality and rigour of published research. And that is my argument here: some article gatekeeping is required and can serve useful purposes in the scientific process. Rejection may be necessary as a means of pushing authors to improve (1) the strength of the argument underlying the research or (2) the quality of the evidence and analytical techniques used in the research.
Some of my own research serves as evidence of that, in particular Grawitch et al. (2023). The original submissions included only a single study (rather than the finished three-study series) and a less polished theoretical and logical foundation. So some of the rejections helped us sharpen our argument and strengthen our reasoning.
That being said, there was also a lot of questionable gatekeeping involved in the process, from long reviewers’ rants on tangential topics unrelated to our particular research, to desk rejections, to odd claims about the meaning of our research. Sadly, my experience with this single manuscript was not uncommon.
As such, some degree of gatekeeping appears to be necessary, especially given the increased use of AI to generate junk science (Schultz, 2025). The increased variability in the quality of submitted research requires some formalized system of gatekeeping. Such a system should operate within well-defined (and limited) parameters and minimize the use of desk rejection as a gatekeeping tool.
I generally agree with the direction Kulldorff argues published research should take, but I would argue that the four pillars are incomplete. As a basis for understanding what is missing, I point to Spector (2024). In it, he argued that the field of industrial/organisational psychology has become too invested in statistical modeling, describing such analyses as a “Trojan horse.” The reason for his argument is that, in the majority of cases, the modeling does little more than make correlational results appear more refined than they actually are. Such analyses lead to claims (and visuals) that imply causal relationships when no causality has been tested at all.
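To make that point concrete, here is a minimal sketch in Python (my own illustration with simulated data, not an analysis from Spector’s or Kulldorff’s articles): when two standardized variables are purely correlational, “modeling” one as a predictor of the other returns a coefficient identical to Pearson’s r, so the extra layer adds no information and licenses no causal claim.

    # Minimal sketch with simulated, purely correlational data (hypothetical values).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Cross-sectional data: no experimental design, so no causal test is possible.
    x = rng.normal(size=500)
    y = 0.4 * x + rng.normal(size=500)   # built-in association of roughly r = .37

    r, p = stats.pearsonr(x, y)          # the simple, sufficient analysis

    # The "fancier" framing: regress standardized y on standardized x.
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    fit = stats.linregress(zx, zy)

    print(f"Pearson r:          {r:.3f} (p = {p:.4g})")
    print(f"Standardized slope: {fit.slope:.3f}  # identical to r; no new insight, no causal claim")

The point is not that modeling is never useful, only that when the design is correlational, re-expressing the correlation as a model does not change what the data can support.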
Overuse of modeling is just one symptom of a wider problem: a research culture that prioritizes analytical sophistication over clarity and necessity. Many studies rely on complex methods not because the research questions require them, but because using such analytical techniques can improve the odds of publication. Structural equation modeling, multilevel analysis, or complex machine learning techniques may sound impressive, but when they add no meaningful insight beyond traditional statistical methods, they risk muddying rather than clarifying our understanding.
And my fifth pillar is:
Analysis should not be more complicated than necessary to answer research questions.
When analytical complexity becomes a virtue rather than a necessity, research loses its ability to promote clarity, accessibility, and knowledge.
Kulldorff’s proposal provides a compelling vision for reforming scientific publishing, addressing key issues such as accessibility, transparency, and peer review inefficiency. However, no reform comes without trade-offs. Removing publication barriers can accelerate knowledge dissemination, but it also risks undermining the quality control mechanisms that help refine research before it enters the public domain. A certain amount of gatekeeping is required, not to suppress ideas (as gatekeeping too often does now), but to ensure that published research meets basic criteria of rigor and validity.
At the same time, treating analytical sophistication as a marker of quality has created a different set of challenges. The overuse of complex analytical techniques such as structural equation modeling, hierarchical methods, and machine learning often adds layers of sophistication without adding real insight. When complexity is prioritized for its own sake rather than as a means of answering meaningful research questions, scientific findings risk becoming less transparent, less replicable, and ultimately less useful.
Scientific publishing needs to evolve, but it needs to evolve in a way that preserves the integrity of research rather than introducing new problems. Meaningful reform must balance accessibility with rigor, openness with quality, and innovation with clarity. Ensuring that research is rigorous and interpretable is not a barrier to progress; it is the very foundation of scientific advancement.