this post was submitted on 17 Mar 2025
280 points (99.0% liked)

Science Memes

 
top 25 comments
[–] Dogyote@slrpnk.net 59 points 1 day ago* (last edited 1 day ago) (4 children)

I think so. Journals are only in use today because that's how scientific reporting was done before the internet. They're still around because institutions and academics need some way of keeping score. What's the point of it all if you can't say you're better than someone else?

Journals could be replaced with something like Wikipedia, but more sophisticated: editing would be a tightly controlled process that requires reproducible data and peer review.

Score could be kept with citations. You'd be required to list the work you built on, as we do today, and the authors would receive credit. No citation would be worth more than another. If you published something useful for a particular field or made a major discovery that opened a new field, then your citation count would reflect it.

Perhaps competing labs could both receive citation credit if their results essentially showed the same thing. If nobody could scoop anyone else's work, then cooperation may be encouraged over competition.

The entire wiki would be a public good, funded by governments across the world, free for all to read and for those with the relevant credentials to publicly comment on.

Negative results could also be published. "We had this hypothesis, we tried this, it didn't work out." It'd probably save time and these works could be cited as well. Imagine making a very important mistake that saves everyone time and effort and being rewarded for it.

I also feel like there is opportunity here to expand a particular field's community. Since the wiki would be more free and open, academic silos may have more metaphorical doors, allowing more cross-field dialog.

I could go on, but I think the tools we need already exist; we're just not using them because... tradition. Some kind of wiki structure would be easier, more efficient, and more flexible than what's currently happening.

Edit: I thought of one more thing. Searching for information could be so easy. Instead of finding a dozen papers (some slightly off topic, some of questionable quality, some poorly written, some your institution isn't subscribed to, etc) and review articles, all of the information could be easily compiled into review wikis. The level of detail could be easily changed depending on what you want and it would all be right there.
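
To make the citation-scoring idea above concrete, here's a rough sketch of how a flat, equal-weight citation count over wiki-style entries might work. The entry names, fields, and numbers are invented for illustration, not any existing system:

```python
from collections import defaultdict

# Toy sketch (entry names and fields are made up): each wiki entry lists the
# entries it built on, and every citation counts exactly once.
entries = {
    "cas9_method_2025":   {"authors": ["Lab B"], "cites": []},
    "crispr_screen_2031": {"authors": ["Lab A"], "cites": ["cas9_method_2025"]},
    "negative_result_77": {"authors": ["Lab C"], "cites": ["cas9_method_2025"]},
}

def citation_scores(entries):
    """Count how many entries cite each entry; no citation is worth more than another."""
    scores = defaultdict(int)
    for entry in entries.values():
        for cited in set(entry["cites"]):  # duplicate cites within one entry count once
            scores[cited] += 1
    return dict(scores)

print(citation_scores(entries))  # {'cas9_method_2025': 2}
```

Note that in this toy example the negative result earns the methods entry a citation just like the positive study does, which is the point.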

[–] RiceMunk@sopuli.xyz 11 points 1 day ago* (last edited 1 day ago)

Maybe if the academic world behaved like that, I wouldn't have burned out and fucked off to the IT sector a decade ago.

Kind of adjacent to this: years ago, around said burnout, I kept floating an idea around my head about whether there was some way out of this tradition of creating giant monolithic papers every time, even when your effective research result could be distilled down to a paragraph with some numbers and preliminary handwaving, yet you still need to pad the whole thing out with a big-ass literature review to keep the citation circlejerk going.

So why not just have "papers" consist of effectively a few paragraphs? The citation tree of how you got there is still relevant, but you can put all of that stuff in what's effectively metadata and not clutter up the whole thing with it.

Have an idea for a lab experiment? Publish the methodology as-is, and link it to whatever other tidbits of knowledge make it relevant. Did the experiment and got results? Publish the data, and link to the experiment. Got some new theory out of the results? Publish the theory and link to the experiment results in the metadata. And so on.

Maybe this would have weird side effects of its own, but I can't help but imagine this would make the whole process so much less painful, and also allow for better organization of the knowledge produced.
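
A rough sketch of how those linked "tidbits" might look as records; every type, field, and identifier here is invented purely to make the idea concrete:

```python
from dataclasses import dataclass, field

# Hypothetical record types for the "publish small pieces, link everything" idea.
@dataclass
class ResearchItem:
    item_id: str
    kind: str                   # "methodology", "dataset", "theory", ...
    summary: str                # the few paragraphs that replace a monolithic paper
    links: list[str] = field(default_factory=list)  # ids of the items this builds on

method = ResearchItem("exp-001", "methodology", "Proposed lab protocol.")
data = ResearchItem("data-001", "dataset", "Results from running exp-001.", links=["exp-001"])
theory = ResearchItem("thr-001", "theory", "Interpretation of data-001.", links=["data-001"])

# The citation tree lives in the links metadata, not padded into the body text.
for item in (method, data, theory):
    print(item.item_id, "->", item.links)
```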

[–] jabathekek@sopuli.xyz 11 points 1 day ago

It's beautiful. I should learn to build a website so I could host all of human knowledge on the old desktop under my kitchen table.

[–] WolfLink@sh.itjust.works 9 points 1 day ago (1 children)

A lot of this is kinda already happening.

Score could be kept with citations.

This is already something people brag about / look at as a measure of success. There are plenty of free websites to keep track of it, but the most popular one is Google Scholar.

Perhaps competing labs could both receive citation credit if their results essentially showed the same thing.

When I find multiple good papers that have the information I need, I cite all of them, and even feel happy about it because citing a lot of papers can make your paper look like you put in more work.

If nobody could scoop anyone else's work, then cooperation may be encouraged over competition.

It’s a bit hard to completely do away with scooping. A possibly more practical way to increase cooperation would be to eliminate the idea of the “first author” getting the majority of the credit. It’s really annoying when like 5 people heavily contributed to the paper but whoever’s name is listed first ends up getting 90% of the credit because that’s what people look for.

The idea of doing things in a wiki format is interesting though.

[–] fristislurper@feddit.nl 5 points 22 hours ago (1 children)

I don't know if first authorship needs to go away. I've definitely been 2nd or 3rd author for a few days of work (as compared to months of work for the first author).

You can give detailed attribution statements (many papers require them nowadays), but no one ever reads them.

[–] pupbiru@aussie.zone 2 points 22 hours ago

in a structured and dynamic system, author order could be randomised - not entirely, but within the "tiers" of contributors… if everyone submitted detailed attribution, that could then be used to dynamically vary the order, so that nobody gets listed "first" on every view for the same amount of effort as others
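
a rough sketch of what that per-view shuffling could look like, assuming contributors are grouped into tiers from their attribution statements (all names and tiers here are made up for illustration):

```python
import random

# Hypothetical input: contributors grouped into effort "tiers" taken from a
# detailed attribution statement.
tiers = [
    ["alice", "bob"],         # major contributors
    ["carol"],                # moderate contributor
    ["dave", "erin", "fay"],  # minor contributors
]

def display_order(tiers):
    """Shuffle names within each tier, keeping tiers in order, so nobody is
    always listed first among contributors with the same level of effort."""
    order = []
    for tier in tiers:
        order.extend(random.sample(tier, len(tier)))
    return order

print(display_order(tiers))  # e.g. ['bob', 'alice', 'carol', 'fay', 'dave', 'erin']
```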

[–] qaz@lemmy.world 3 points 1 day ago (1 children)

Score could be kept with citations. You'd be required to list the work you built on, as we do today, and the authors would receive credit. No citation would be worth more than another. If you published something useful for a particular field or made a major discovery that opened a new field, then your citation count would reflect it.

Wouldn't you be able to game that by having 2 entities spamming citations for each other?

[–] Dogyote@slrpnk.net 3 points 1 day ago (2 children)

Your peers would know and they'd think you're pathetic. There'd be nothing to gain.

[–] qaz@lemmy.world 2 points 1 day ago (2 children)

If grants are tied to the "score" there is an incentive to abuse the system.

[–] pupbiru@aussie.zone 2 points 22 hours ago

and similarly, any metric tied to a reward is no longer a metric worth measuring for the purposes of maintaining system health

[–] Dogyote@slrpnk.net 2 points 22 hours ago

True, but I feel like abuse would be fairly obvious and grant panels would know if someone is gaming the system. The panels, at least in my former field, are composed mostly of people who know who's doing what in their field. If they saw a high citation count they would know if that was legitimate. If anything, faking citations would be shameful and embarrassing for most people.

[–] Lv_InSaNe_vL@lemmy.world 2 points 22 hours ago

On GitHub there is a tracker for how many commits you make in a year, but it's super easy to fake. But it's also pretty obvious when you fake it haha

[–] mrmacduggan@lemmy.ml 29 points 1 day ago (1 children)

The mathematicians already escaped to ArXiv. It's open.

[–] SmoothOperator@lemmy.world 7 points 1 day ago

It's not a journal though. No editor or peer review process.

[–] OldManBOMBIN@lemmy.world 20 points 1 day ago

Everyone email me your results and I will post them to 196 and lemmyshitpost. We will start a grass roots movement. Bowel movement.

[–] bjoern_tantau@swg-empire.de 16 points 1 day ago* (last edited 1 day ago)

Looks like a topic that needs more research.

[–] Sergio@slrpnk.net 14 points 1 day ago

Yeah I'm not buying it.

  • in academia, there are a lot more mid- and low-tier academics (like me!) than there are high-tier academics, and there are plenty of lower-tier venues and lower-tier institutes. In these tiers, you're not expected to publish at the highest levels.
  • in higher-tier academia, once you get tenure (to be precise: once you submit your tenure package) the urgent need for high-impact journal cites is greatly reduced. you write books or something I dunno.
  • industry scientists have far fewer publishing requirements. or they write articles for trade mags. One place I interviewed was actively hostile to publishing.
  • government-lab scientists, I dunno, I think they write technical articles that they give their sponsors?

The dynamic that u/mIRNAexpert describes does exist, but it's not the whole problem. And like any scientist will tell ya, figuring out the problem is halfway to figuring out the solution.

[–] SmoothOperator@lemmy.world 13 points 1 day ago

The Quantum journal is doing a pretty good job.

[–] Tja@programming.dev 8 points 1 day ago

There is a way out. It involves earning a bunch of money by moving into industry. It's a hard job but someone's gotta do it!

[–] pixeltree@lemmy.blahaj.zone 8 points 20 hours ago

is there a way out?

Yeah but my friends keep telling me not to do it

[–] pigup@lemmy.world 6 points 23 hours ago