I completely disagree: their brains are very simple neural networks, and their degree of consciousness is in the same range as electronic devices.
https://forum.effectivealtruism.org/posts/3nLDxEhJwqBEtgwJc/arthropod-non-sentience
All arguments based on behavioral similarity only prove that we all come from evolution: "we are neural networks trained by natural selection. We avoid destruction and pursue reproduction, and we are both effective and desperate in both goals. The (Darwinian) reinforcement learning process that has led to our behavior implies strong rewards and penalties, and since we are products of the same process (animal kingdom evolution), external similarity is inevitable. But to turn the penalty in the utility function of a neural network into pain, you need the neural network to produce a conscious self. Pain is penalty to a conscious self. Philosophers know that philosophical zombies are conceivable, and external similarity is far from enough to guarantee noumenal equivalence."
Now, regarding how much information is integrated: superadditivity implies that the amount of resources devoted to the shrimp should be proportional not to their number but (at most!) to their brain mass:
"As a rule, measures of information integration are superadditive (that is, the complexity of two neural networks that connect to each other is far bigger than the sum of the original networks), so neuron-count ratios (shrimp = 0.01% of human) are likely to underestimate differences in consciousness. The ethical consequence of superadditivity is that, ceteris paribus, a given pool of resources should be allocated in proportion not to the number of subjects but (at most!) to the number of neurons."
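The superadditivity claim above can be sketched as a toy calculation. The power-law exponent k = 1.5 is a hypothetical illustration chosen only to make the point, not a measured quantity:

```python
# Toy sketch of the superadditivity argument: assume integrated
# information scales as neurons**k with some k > 1 (hypothetical).

def integration(neurons: float, k: float = 1.5) -> float:
    """Assumed power law: integrated information ~ neurons**k."""
    return neurons ** k

# Superadditive: wiring two networks together yields more integration
# than the two networks kept separate.
a, b = 50_000.0, 50_000.0
assert integration(a + b) > integration(a) + integration(b)

# Allocation consequence: weighting by neurons instead of head count
# shrinks the per-shrimp share. Using the "0.01% of a human" ratio
# quoted above:
ratio_neurons = 1e-4
print(ratio_neurons ** 1.5)  # super-linear weight: ~1e-6 per shrimp
```

Under any exponent k > 1, the neuron-count ratio is an upper bound on the integration ratio, which is the comment's point that head counts overstate the shrimp side of the ledger.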
I think it's very plausible that shrimp experience something akin to pain. I have very low confidence that this adds up to being a lot more problematic than avian or mammalian pain. And I think intuition is off about decapods partially because their neurons are quite large, much larger than those of land animals. That means their body size is big relative to their neuron count compared with land animals. For instance, a wasp has several times more neurons than a shrimp, does more complex cognition, and is more likely to have moral worth. Mosquitoes also have more neurons than shrimp, though it's closer. It also seems off to use knowledge about crabs to infer anything about shrimp, because again shrimp are comparatively much simpler.
Let me repeat here that the reason I don’t think shrimp feel “pain” has nothing to do with “pain” and everything to do with the shrimp not being a conscious being. It is the shrimp subject, and not its pain, that I find (massively) implausible.
Consequently in this statement:
"I think it's very plausible that shrimp experience something akin to pain"
The use of “shrimp” as the subject is the problem. There is no “shrimp” self to feel the pain (or anything else).
Yes, given that the study linked in this article has low confidence in shrimp sentience due to lack of evidence, why doesn't EA fund the requisite research? From this piece it seems they just buy the electric stunners. Do they fund shrimp sentience research?
How do you scientifically study sentience? Even if you have a perfectly descriptive understanding of a physical system, this does not give much information about consciousness:
https://forum.effectivealtruism.org/posts/5zbmEPdB2wqhyFWdW/naturalistic-dualism
You study sentience by studying the underlying neuroanatomy and physiology and by conducting a panoply of experiments to ascertain whether the species displays the capacity to suffer, learn, think, etc. Scientists have been studying sentience in species for god knows how long.
Really? The classic essay on this:
https://en.m.wikipedia.org/wiki/What_Is_It_Like_to_Be_a_Bat%3F
is very clear that everything about consciousness is extrapolation from our own experience. Are AIs conscious? We know everything about them: we have the generating code. How conscious is ChatGPT? Nobody knows, except ChatGPT itself, if it is conscious. It is called the “pretty hard problem of consciousness” for that reason.
That's about consciousness, not sentience, which is more straightforward to study. We can extrapolate from what we know about our own neurophysiology to assign probabilities of sentience in other species based on their neurophysiology.
Sentience means there is a sentient being, a conscious one: for pain to exist, you need the sufferer.
You've made the argument very cogently, so I'll put to you the question that led to me starting to eat meat again: I'm still not sure how being predated by us is better than being predated by any other being.
The welfare violations you note would happen to shrimp in the wild, I think. In general, every living thing gets predated at the end of its life, either by something larger that eats it from the outside; or by a parasite or microorganism that eats it from the inside. The way in which humans slaughter their prey does not seem to be obviously nastier than any of those endings.
That said, if shrimp are to be farmed and slaughtered, doing it humanely would be better than doing it cruelly, so the value there is clear. It sounds like you do great work.
India, sadly, is terrible when it comes to the welfare of aquatic creatures, both fish and crustaceans. And these farms are being promoted as alternative sources of income because of pressure on agricultural land and decreasing sizes of land holdings. But, like in so much else, the underlying causes are to do with the globalization of food systems. I visited a prawn processing plant in Odisha at the end of 2023 and was surprised to find out how prawns are cultivated in South India, shipped a few hundred kilometres north to Odisha (with all the welfare concerns you mention), where they are processed by women from Bangladesh, then shipped once again to Vietnam or the Middle East, where they are processed further and then go god knows where. Ultimately, of course, all of this is driven by first-world consumption patterns. Every gathering I attend in the West has shrimp served alongside cocktails. While changing welfare practices is crucial, it's even more important to change the Western palate.
Your whole premise of counting the number of entities is incredibly misguided, and likely to mislead you and actively harm the world. You seem to be intentionally pursuing the repugnant conclusion that people use to criticize naive utilitarianism (the idea that naive linear utility results in wanting to tile the universe with unthinking pleasure circuits). Rather than recognizing that this thought experiment highlights the need for a more complex and sophisticated philosophy, you instead double down.
Shrimp have around 100,000 neurons; chickens around 220,000,000. In terms of the ability to experience pain and pleasure, the scaling with neuron count is likely super-linear.
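If the scaling really is super-linear, a toy calculation shows how fast the gap widens. The exponent k = 1.5 here is an assumed illustrative value, not an empirical one:

```python
# Chicken/shrimp gap under linear vs. super-linear scaling with neuron
# count. Neuron figures are the comment's round numbers; the exponent
# k = 1.5 is a hypothetical choice for illustration.

SHRIMP_NEURONS = 100_000
CHICKEN_NEURONS = 220_000_000

linear_ratio = CHICKEN_NEURONS / SHRIMP_NEURONS  # 2,200x
superlinear_ratio = linear_ratio ** 1.5          # ~103,000x

print(f"linear: {linear_ratio:,.0f}x  super-linear (k=1.5): {superlinear_ratio:,.0f}x")
```

So a 2,200x neuron-count gap becomes a roughly 100,000x gap in experienced intensity under this assumed exponent, which is the sense in which raw head counts (or even raw neuron counts) could understate the difference.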
I am curious what you would reply when I argue: "Even if shrimp are sentient, why is their suffering relevant to us? I think morality is a way for humans to live with each other in a society, shaped by evolution (and maybe other forces too), thus animals are not relevant in our moral paradigm."
Okay, so looking into shrimp brain architecture: shrimp have very small neuron counts (on the order of 10,000s), and most (60%?) appear to be in the eyestalks, which are the main central processing location of their brains. So ablating the shrimp eyestalk removes a majority of their neurons and their main central processing unit. Is this considered in shrimp welfare? We appear to be severing most of their brain at the beginning of their lives, thus removing most of their (low and far from proven) capacity to suffer. This only applies to the 5-10% of shrimp on a farm that are used for broodstock, but if it really severs their capacity for suffering, then we could do it to all shrimp? Seems like a good area for EA to fund research in. This could be a free lunch.
Commenting for the metrics
Aaaargg stop with the shrimp welfare and post cool technologies again