Never considered that the mods of this site are literally discussing with the people heading YC companies how to game their HN titles for better engagement. How naive I am.
I don’t know about “idiots”, but bias against women was obviously real and prevalent. Treating the idea that it might have influenced the medical literature as a “meme” is slightly bizarre to me.
The meme is that before [insert your contemporary period] people were so backwards that they would miss something like the clitoris entirely. The meme isn't that people and cultures were prejudiced or biased, but that they were prejudiced in an idiotic way. If you believe that's how prejudice works, then you'll be utterly blind to much contemporary prejudice.
EDIT: Relatedly, the Guardian article cites statistics about female genital mutilation. And you might think: how could people in this day be so cruel? Well, in some (but not all) of those cultures, such as parts of West Africa, female sexual pleasure is highly valued; a clitoral circumcision involves removing the clitoral hood only, similar to circumcision for men, and is viewed as enhancing female sexual pleasure, specifically for oral sex, an act that lacks any negative connotations there. Now, embedded in that narrative might be a deeper, more subtle bias against women, but by not appreciating and grappling with that dynamic you're ignoring and diminishing how many women in those cultures understand feminism, which is its own anti-feminine and culturally centric (i.e. "colonial") bias.
Isn't Type Ia circumcision (removal of the clitoral hood, but not other parts) very rare? At least that's what the Wikipedia article claims, referencing a 2008 WHO report.
This was several years ago and unfortunately I didn't archive my research. Every year it becomes more difficult to dig up old stuff, and I don't have time today to go back down that rabbit hole. (These days I'm much better at archiving things.)
Here are a couple of articles by one of the most vocal supporters of FGM in West Africa:
The second link of the four is a response to the last.
I was sloppy in being too specific in saying removing the clitoral hood was sometimes justified as enhancing oral sex. Now that I think about it, that might be one of the views regarding labial extension, which is often lumped in with FGM but obviously quite different from cutting the clitoral hood. The claims about enhancing sexual pleasure I think largely came from more polemical literature, as well as some English-language African feminist blogs and bulletin boards, and I would suspect those views may be, at least to some extent and in their specificity, recent revisionist justifications. In African discourse there's a reactionary vein that pushes against Western criticisms of traditional African practices, and one of the ways to do that would be to subvert the paternalistic disgust about FGM by explicitly arguing the practice promotes one of the West's other ideals, sex positivity.
To be clear, I'm not trying to defend any of this. Just trying to point out that the West's exceedingly simplistic and categorical perspective hides a very strong cultural prejudice, as well as other problematic assumptions about how and why these practices persist.
If this actually worked, you'd think there would be at least a few women without the cultural connection who get it done just for that purpose.
This sounds like the same sort of bullshit used to promote male circumcision. How about we just stop performing unnecessary surgery on our children? If someone wants to mess with their own junk, they can do it when they become an adult.
So, you admit you have no evidence supporting your bizarre claims, and aren't defending a practice you claimed was at least sometimes without negative connotations. Gotcha.
My comment about negative connotations was referring to oral sex: it was claimed the local culture never viewed performing oral sex on women as emasculating, but as something men were expected to do. Genital modification itself has, to some extent, negative connotations everywhere these days, if only because of the influence of Western media, but that has also given rise to a reactionary dynamic that tries to defend these practices using the language of contemporary Western morality, e.g. sex positivity.
I don't particularly agree with the OP but from my European pov, male circumcision doesn't seem to have negative connotations, certainly not in the US.
Negative connotations and actual negativity are two separate things. Alcohol tends not to have negative connotations, whereas things that are better for your health and less addictive, like cannabis and magic mushrooms, do.
What? That practice is absolutely terrible. Many people just have no idea about it, and then their offspring might grow up with terrible shame or something if they ever learn what was taken from them.
Alcohol is also terrible. Nicotine is terrible. Even caffeine can be terrible if you become too dependent on it without realizing. Harm reduction is a thing that can make things less terrible but most users don't practice it. That's the real terror IMO.
> Negative connotations and actual negativity are two separate things. Alcohol tends not to have negative connotations, whereas things that are better for your health and less addictive, like cannabis and magic mushrooms, do.
This is just legal vs illegal. Which is pretty much how morals are decided these days, especially for the non-autistic / "neurotypical" population.
> Which is pretty much how morals are decided these days, especially for the non-autistic / "neurotypical" population
Give it a break. Nothing isolates "neurodivergent" people from the rest of society faster than treating neurotypical people as a morally inferior out-group.
I don't know where you got inferiority from that, but it's a well-documented phenomenon that autistic people are more likely to go against the norm than non-autistic people. It's called "positive non-conformity". This suggests non-autistic people are more likely than autistic people to accept how things are and perpetuate it.
While I have many autistic friends from abusive living situations that were forced to accept how things were, I find that the autistic people I meet still tend to be much more varied than the non-autistic. Though I don't know for sure whether this is a side effect of their neurotype or of their societal treatment.
Your first two paragraphs miss the point that negative connotations are not the same as actual negativity.
My view is that circumcision is negative. I disagree that it has negative connotations though.
Do we have different understandings of what connotation means? I would say in most of the western population, having a glass of wine in the evening would be seen neutrally. Having a joint less so. I'm not saying having a joint is bad. But connotations are about the unspoken things: I'm not saying it, it's inserted by people based on their biases.
I think you're putting the cart before the horse. Things that society decides are immoral become illegal, and vice versa.
Fwiw I'm Autistic, so I don't know if the last comment was aimed at me, or whether I should class it as a compliment.
Dumb question, why do “sensitive” spots on the body need more nerves? Couldn’t you just have the normal touch-sensing nerves and map signals from specific spots on the body to stronger/pleasurable qualia in the brain?
Not a dumb question. The shortest (and at a glance unsatisfactory) answer is because it works, and therefore it evolved that way.
Going into detail, first consider that for a feature to be evolutionarily selected for, two things have to be true:
1. It must increase the fitness of the organism that carries it, i.e. the likelihood of its carrier having descendants as compared to non-carriers (or be a side effect of another feature that improves fitness enough to be a net positive, etc.)
2. It must be inheritable (and, in sexually reproduced organisms, mutually compatible during embryonic development).
Once such a feature has reached dominance in a given population, as long as it continues to be important for fitness it cannot really be deprecated in favour of an alternative built from scratch, even if that alternative is arguably better.
That's why, for instance, vertebrate optic nerves connect to our retinas on the inside of the eyeball, resulting in us having a blind spot. Cephalopods, on the other hand, evolved their eyes independently the "reasonable" way, connecting their nerves from behind the eyeball. There's no way a vertebrate could mutate from scratch for its optic nerve to connect to the retina from behind without causing absolute mayhem in embryonic development. Our hacky solution for the blind spot? Let the brain hide it in software.
Going back to your question: some spots of the body being more sensitive than others became critical for evolutionary fitness long before nervous systems were complex enough to generate conscious qualia, let alone for qualia to be consistently involved in decision making. Furthermore, mapping specific nerves to intensity of feeling in the CNS would imply complex hardcoding of something which is much easier to solve with "this place important, have more nerves", and it might even conflict with the fitness benefit of a CNS with enough neuroplasticity to learn anew during the development and lifetime of an organism.
So, in summary, the solution of having more nerves where it matters is simple, good enough, and has no reason to be rolled back in favour of a radically different alternative.
> Our hacky solution for the blind spot? Let the brain hide it in software.
I would say the solution is just having two eyes, since their respective blind spots don't overlap in the visual field.
I would also say that the brain doesn't hide the blind spots, but rather doesn't pay any attention to them in the first place. There's just a lack of information from them, and this deficit isn't normally noticeable because the other eye makes up for it. I think Dennett explains it that way somewhere, probably in Consciousness Explained.
The blind spot still isn't noticeable if you close one eye, though. You have to look for it carefully with a specific setup that allows you to detect the discrepancy between what you see and what's actually there.
>1. It must increase the fitness of the organism that carries it, i.e. the likelihood of its carrier having descendants as compared to non-carriers
This isn't necessarily true. If you map out changes through the history of species, you'll find no significant changes but a lot of diversity for long periods, followed by big changes and low diversity for a short period.
That's because during "abundant" times, the population will develop diversity as long as it doesn't significantly hinder reproductive rates. When an environmental pressure comes up, the diversity dies down because the ones lucky enough to have adaptations that suddenly became useful reproduce more.
So an animal might get a longer neck, but that doesn't significantly increase reproduction because food is aplenty. It's only when there's a drought that longer necks become an advantage and the trait is now selected for.
What you say is correct, and that is why I was referring specifically to what's necessary for "features" (the term I used for "phenotypes" to make it more HN) to become ubiquitous. It is a general rule of thumb in evolutionary biology that the more diversity is observed in a given population for a particular phenotype (e.g. hair colour, height, blood group, etc.), the less relevant it is for fitness within its observed range. When a phenotype is strongly selected for in a given population (e.g. bipedalism, opposable thumbs, the ability to speak), it soon becomes dominant and there is much less diversity.
As to your example of neck length during abundant times, that follows the same rule: during abundant times neck length simply does not matter for fitness, therefore (all else being equal) there can be phenotypic diversity in the population.
One caveat, though, as to how a given phenotype may become ubiquitous without favourable selection is of course genetic drift[0], given a small enough population that is isolated for a long enough timeframe. Eventually that phenotype may become selectively "advantageous" inasmuch as it is no longer compatible with alternatives, and individuals from the isolated population who carry this phenotype can no longer have successful offspring with individuals of a different phenotype, resulting in speciation. That's what I meant with regards to a "make nerves in important places generate more pleasure/pain in the brain" genotype being incompatible with a "have more nerves in important places" one. A hypothetical hybrid creature would be a mess.
As a software dev, I think this is actually quite a satisfying and sensible answer. A simple reliable hardware solution in favour of a brittle “clever” software one
> mapping specific nerves to intensity of feeling in the CNS would imply complex hardcoding of something which is much easier to solve with "this place important, have more nerves"
Not saying your answer is right or wrong, but I don't think this is a sufficient explanation. If the body can differentiate areas enough to produce more nerves in one area, then it could plausibly instead produce fewer nerves which inherently produce a stronger signal - just as we have nerves which respond differently to different stimuli (e.g. heat, light, etc). Also it could be neither and we kinda randomly ended up with what we have because no option was strongly disadvantageous at the time.
Having more independent samples helps filter out noise. If you had individual sensory neurons with outsized influence, then misfiring of such neurons would also have outsized influence.
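For intuition, a minimal sketch of that averaging argument (Python/NumPy, with all the numbers invented for illustration): the noise in an average of N independent sensors shrinks like 1/sqrt(N), so no single misfiring sensor can dominate the pooled reading.

```python
import numpy as np

rng = np.random.default_rng(0)

true_signal = 1.0   # the "real" stimulus intensity (made up)
noise_sd = 0.5      # per-neuron noise level (made up)
trials = 10_000

# Case A: one high-gain neuron; its single noisy reading is all you get.
single = true_signal + rng.normal(0, noise_sd, size=trials)

# Case B: 16 independent low-gain neurons whose readings are averaged downstream.
many = (true_signal + rng.normal(0, noise_sd, size=(trials, 16))).mean(axis=1)

print(f"spread of single-neuron estimate: {single.std():.3f}")  # ~0.500
print(f"spread of 16-neuron average:      {many.std():.3f}")    # ~0.125
```

With 16 sensors the noise drops by a factor of sqrt(16) = 4, and a misfire in any one of them shifts the average by only 1/16 of its own error.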
Sounds plausible at least, but I think the question isn't necessarily making a valid assumption. Why do men have to have nipples? Why is our retina installed backwards? Why do sinuses drain upwards? It's just a path evolution took, it doesn't jump to some optimal design.
Male nipples are developmental vestiges; the male condition is derived from the response to embryonic testosterone, and is a developmental variation from the default.
Early-stage embryos of both sexes are not easily distinguishable by genitalia; they look morphologically similar. Later developmental events culminate in a morphological rearrangement to the male form.
Lack of response to testosterone during development results in a curious state of affairs, where a person who is genetically male, having X and Y chromosomes, develops according to a female plan. External appearances are female, with loss of secondary sex development at puberty.
Fingers, for instance, not only have higher sensitivity, but also much higher spatial resolution due to the more dense nerve network.
I can't tell why other areas may have needed higher spatial resolution; maybe it was evolutionarily important in the past and remains so today. Or maybe just adding more nerves due to a random mutation correlated with better reproductive outcomes, via a stronger signal or higher sensitivity, so more nerves are present for no other reason.
Nerve density isn’t mainly about intensity, it’s about spatial resolution. More nerve endings per square centimeter means you can distinguish finer details of touch, texture, and pressure. The brain can’t invent spatial detail that isn’t in the incoming signal. Amplifying a sparse signal centrally would be like zooming into a low-res photo.
The brain does do some of what you’re describing though. The somatosensory cortex gives disproportionate space to certain body parts (the sensory homunculus). So there is central amplification, but it works on top of peripheral density, not instead of it. Without the dense nerve input, you’d basically have an on/off switch instead of nuanced sensation.
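A toy illustration of that last point (Python/NumPy; the sensor layout is invented for the example): once a patch of skin is sampled by only a few nerves, central amplification can scale the signal up but cannot recover the lost spatial detail.

```python
import numpy as np

rng = np.random.default_rng(1)

# A fine-grained touch pattern across a patch of skin,
# as seen by 16 densely packed "nerves".
fine = rng.integers(0, 2, size=16).astype(float)

# The same patch sampled by only 4 sparse nerves (each pools 4 points),
# then "amplified" centrally by stretching each reading back out 4x.
coarse = fine.reshape(4, 4).mean(axis=1)
upscaled = np.repeat(coarse, 4)  # more gain, no new detail

print("dense sampling:   ", fine)
print("sparse, amplified:", np.round(upscaled, 2))  # blurred; fine detail is gone
```

This is the low-res photo analogy in miniature: the four pooled readings can be stretched back to 16 values, but the pattern inside each pooled region is unrecoverable.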
> Dumb question, why do “sensitive” spots on the body need more nerves? Couldn’t you just have the normal touch-sensing nerves and map signals from specific spots on the body to stronger/pleasurable qualia in the brain?
Think of a television. What gives you a better picture, quadrupling the number of pixels or making the existing pixels 4x as intense?
I think you have it backwards. The brain doesn't "know" what's supposed to be sensitive or pleasurable. It boots up with no training data after all. It machine learns what's sensitive due to a combination of nerve density and other factors. We haven't figured out all the other factors yet. But that's why there's a correlation between nerve density and sensitivity: density means sensitivity.
Go take a picture in a dark room and edit the photo to try to make it like you've got the lights on in photo editing software. You get a noisy grainy mess and little to no detail.
It's not like evolution would leave significant signal-to-noise headroom on the table for all the other nerves.
Presuming nerves are already optimized, if you want more signal you have to add nerves.
Sensing nerves aren't especially energy hungry when considering their volume in the human body, so evolution doesn't have much reason to minimize them.
They assert that they don't have them, in the same way you (presumably) assert that you do have them. Neither has any further evidence, and one is not a priori more likely than the other.
Wow! I'm shocked Limbo took 3 years and had a larger team. I've played a bit of the game (but didn't complete it). Looking at the Steam reviews and a quick search shows me it takes 3-10 hours to beat the game. :O
IIRC the Limbo team grew over time, and it shows in the gameplay: the second half feels less personal and more like a generic platformer. They hired more people to get it done. That doesn't mean it is rushed, but to me the second half lacks the personality of the first half.
Also, I don't like the idea of using gameplay time as a measure of value. Maybe I say that because I don't have money problems, but I find the tendency certain gamers have of judging games in dollars per hour of gameplay pretty damaging, as it incentivizes developers to focus on gameplay time rather than polish. 3-10 hours is already plenty for that style of game. Note that there are many AAA games in the 10-hour range.
I'll say something positive here as a European: the number of diverse places I'd have assumed were broadly culturally aligned with Trump that have shown some form of resistance, or pretty vehement disagreement with this administration over the last year, suggests to me that there is a degree of widespread (kinda bipartisan) idealism in the US that's pretty unique in the West.
Americans individually are probably the most optimistic people in the world. The optimism might be myopically fixed on getting a promotion or winning the lottery or breaking the plate spinning world record. But if you don’t have some big project or self improvement scheme then many people (and most traditionally successful people) will give you a wide berth. People without big dreams might as well have already kicked the bucket.
Regardless of the government, this culture is infectious. I think Nike's famous tagline "Just do it" probably describes America better than any anthem or crusty document.
At the core of "Havana Syndrome" lies the idea that Cuba and/or Russia have managed to develop energy weapons so advanced that the American military command won't even entertain the thought of them existing. I'll let you draw your own conclusions.
> won't even entertain the thought of them existing
Careful, it's also possible that they have thought very hard about such things, and they've decided that revealing what they know would lose them a technological edge.
In other words, what if the CIA/DOD already knows there's a class of devices which could explain the problems, and the denial is about maintaining secrecy over their own operational capabilities?
Imagine something similar in the 1980s: "This tragic mid-air collision was obviously caused by faulty radar or gross pilot error by at least one of the two military planes... Our brightest minds have looked very hard at the problem and there is no such thing as a 'stealth' airplane which doesn't show up on radar."
Or, going back further, the Allies, having cracked the Enigma encryption, had to let Allied ships keep getting sunk and soldiers keep dying, because acting otherwise would have revealed that Enigma had been broken, which would have led to an even greater loss of life.
The assumption with these weapons was that they would require too much energy to be portable enough to be undetectable in all of these circumstances (at least based on other reporting on the subject).
If the device doesn't require a lot of power, then it's entirely possible that American military commanders and research leadership would miss it.
Add to that an incentive to avoid helping the victims from a cost and overhead perspective, and you get a big ol' mess.
>At the core of "Havana Syndrome" lies the idea that Cuba and/or Russia have managed to develop energy weapons so advanced that the American military command won't even entertain the thought of them existing.
I just don't think that's true at all. The answer could easily be that Cuba and Russia have developed energy weapons that we only know about from classified sources and therefore cannot discuss their existence.
There is precedent for this. IC satellite optics were years ahead of commercial. Same with cryptography: GCHQ invented asymmetric encryption years before the open literature and kept it secret. I wouldn't be surprised if the agencies know a few advanced things about quantum computing that IBM hasn't figured out yet.
I would; I've written about this before. A common error is to think that, say, NSA mathematicians have access to what's in the "free world", while those on the outside don't have access to the inside.
The reason this is an error is because research is an interactive process. Spooks can read papers, but they can't freely discuss with outside researchers - not what they themselves are working on of course, but even talking to outsiders about what the outsiders are working on, can be risky, since it can accidentally reveal a lot.
Secrecy cripples research. Even in areas where the TLAs hoover up the majority of graduates (was apparently true of math a while ago), they often fall behind.
There was a time when the TLAs could just call people like Claude Shannon, demand he work for them and never tell anyone about what he was working on, and he would say OK. That time is long gone. They don't have that goodwill any longer, and the price of isolation for a researcher has only gotten worse as communication has improved.
There's also the chance it's not a weapon, but something that mistakenly turned into a weapon when it was tested on live subjects.
I don't think randomly attacking embassy staff (iirc, not everyone was CIA - there were just desk people affected) makes sense for anyone to do, but trying to listen on them and fucking up sounds right up their (or our) alley.
This was the point I made in another comment here. My bet is the US deployed the weapon and accidentally sickened their own people. So of course they play stupid and deny that any such tech could exist.
I feel corny being so positive about a megacompany, but I bought my first MacBook Air half a year ago after a lifetime of PCs, and it has been genuinely surprising to use something made by a huge company that is consistently better than I expected.
I have a MacBook Air from 2022 and it is easily the "best" computer I have ever owned.
It's portable. It has a great keyboard, screen, and battery life. No fans or overheating. No issues with the operating system or installing the software I need.
I can even use it for some lighter software development directly, and for everything else I can ssh back to a beefier machine.
If I weren't already so happy with this macbook air, I would be ecstatic for the neo.
Same. I got the 2024 15" MacBook Air when Costco had it for $849.00*
Hadn't purchased a new laptop since a college scholarship decades ago. This machine made an immediate impression and continues to impress. The entire thing is thinner than just the bottom of my college Core Duo. It also lasts 8x longer on battery.
I just use mine as a tertiary machine (i.e. bedtime reading/podcast), but if you ever want to run the machine hard long-term, you can use 1mm thermal pads between the heatsink and bottom of external case (and then it'll never throttle).
> if you ever want to run the machine hard long-term, you can use 1mm thermal pads between the heatsink and bottom of external case (and then it'll never throttle).
That will spread the heat to the battery and degrade it much faster.
This removes heat from the internal compartment (which the logic board heat sink and battery cohabit [0]) by conducting it outside through the case. There is no detectable heat increase to the touch: consider the relative sizes of the heat masses (processor vs. the entire metal case).
The best computer, but with the worst software (well, maybe Windoze is even worse these days). If you could run Linux on them without compromises, it would be perfect.
It's not that bad really. Windows was always a bit flakey and crashy and Linux has a job running the software I use. I'll give you Apple can be a bit control freaky as to what you do with your own machine - getting rid of 32 bit annoyed me - but nothing's perfect.
Same. Equally comfortable on Windows, Mac and Linux. But almost all my new hardware choices for the last 25-plus years have been from Apple. The old Macs don't really die, even as I replace them with faster models, so my house is slowly becoming an Apple/Mac museum, starting with a Mac 512k, Mac IIci and Mac LC, and so on, right down to a trash can Mac in the mix, and then to M series Macs. All CPU generations from Apple: 6502 (Apple ][), 68000, 68040 (NeXT), PPC, ARM (Newton, iDevices), Intel and M series. Can't get myself to throw/give/sell them away.
Coming to terms with two uncomfortable truths: I'm a hoarder, and an unapologetically incorrigible Apple fanboi.