This is pretty good news, if only because we will get more research into this phenomenon.

In 1932, von Neumann published what amounted to a proof against the idea of any hidden-variable theory. It took 30 years, but Bell then discredited it.

My guess is that a lot of this will get revisited. The outcome of this in the next 10-20 years might be really cheap probes (via reusable launch vehicles like SpaceX is making) that need no propellant to continue on their merry way for a long time. The current way of doing this would be using solar sails or something along those lines, which is not exactly compact.

Having a way to push your little probe along (even at a little bit of acceleration) without having to carry a lot of heavy propellant is a big deal. Maybe just some solar panels to generate the energy needed.



Bell proved that there is no local hidden variable model that is consistent with quantum theory. He said the same thing as von Neumann (that quantum mechanics can't be understood merely in terms of hidden variables), only more robustly. No new physics was discovered.


No, that's not the whole story. Bell pointed out that Bohm's theory, already known but ignored, is a counterexample to von Neumann's and others' claims on the impossibility of hidden variables. He analyzed how that is possible and found out: 1) von Neumann made some unreasonable assumptions in his analysis; 2) the fixation on hidden variables is missing the point; the real surprise is that it is locality, not hidden variables, that is inconsistent with quantum theory. That's quite a discovery.

http://www.ijqf.org/wps/wp-content/uploads/2014/11/Bricmont-...

https://arxiv.org/pdf/1408.1826.pdf


Wait a minute, my understanding of QM rejects both non-locality and hidden variables. Put that way, however, you kinda have to believe in macroscopic decoherence, that is, Everett's multiverse: there are 2 universes, one in which the cat is dead, and one in which the cat is alive, and observing one universe doesn't rule out the existence of the other. (Similarly, sending a photon (or settlers!) outside of the observable universe doesn't end its existence.)

Non-locality sounds way, way weirder to me than macroscopic decoherence. Stuff like quantum collapse is really an additional, unfalsifiable claim: why would the universe zero out precisely the stuff that the math says we can't observe?


Er, are you familiar with the Bohmian mechanics [1] that was mentioned above? It has, basically, 'non-local hidden variables', and that works fine.

[1] (https://en.wikipedia.org/wiki/De_Broglie%E2%80%93Bohm_theory)


This sounds really interesting, but my physics is abysmal.

Could someone kindly explain this thread in conceptually simple terms?


These two Quanta Magazine articles are probably the most accessible-with-least-compromise articles about the subject, and also include a lot of the history behind it:

https://www.quantamagazine.org/20140624-fluid-tests-hint-at-...

https://www.quantamagazine.org/20160517-pilot-wave-theory-ga...


It's pretty hard to say anything meaningful about quantum mechanics and beyond without being precise with the math (case in point: there are so many frustrating examples of people who don't have the math down making statements that aren't consistent with it).

Here's a summary, 'cause talking about this stuff is fun, with the caveat that it's not enough information to let you make any concrete predictions.

Quantum mechanics in its usual interpretations involves some randomness, where a measurement can give multiple results with different probabilities. The interpretations tell us that this is truly random; that the values are not known until you measure them, and then afterwards they are known and true everywhere.

The obvious objection is "hey, maybe the values were there already, but until we measured them we just didn't know what they were". A theory with this property is called a 'hidden variable' theory.

A 'local' hidden variable is one attached to a particle, such as its velocity. If you measure two entangled particles at the same time, some distance apart, QM shows us that the results of one are correlated with the results of the other - for example, if you measure one particle as having a positive spin on the Z-axis, the other would have to have a negative spin, in a certain experiment.

Excluding superluminal communication (one particle did not send a message to the other), we might guess: well, one had spin up and the other had spin down to begin with, and we were just measuring them to find out which one was which.

Bell's Theorem [1] proves (in an experimentally verifiable way) that this is not the case. There is no way that local hidden variables can reproduce the results of quantum mechanics. This is one of the most amazing discoveries of the 20th century, in my opinion.
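
To see the size of the violation without any experimental details, here's a quick numeric sketch. It isn't an experiment; it just evaluates the QM prediction for the singlet state at the standard CHSH angles, where any local hidden variable theory is bounded by |S| <= 2:

    # CHSH check: local hidden variables force |S| <= 2, but QM for a spin
    # singlet predicts correlations E(a, b) = -cos(a - b), which violate it.
    import math

    def E(a, b):
        # QM correlation for spin measurements at angles a and b on a singlet pair
        return -math.cos(a - b)

    # Standard angle choices that maximize the violation
    a, ap = 0.0, math.pi / 2
    b, bp = math.pi / 4, 3 * math.pi / 4

    S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
    print(abs(S))  # ~2.828 = 2*sqrt(2), comfortably past the classical bound of 2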

Nonlocal hidden variables still work, which is what Bohmian mechanics is. You're allowed to say "there is a variable accessible to every measurement that determines what the result of a measurement is". In Bohmian mechanics - which is mathematically equivalent to regular QM, just more complex - there is a 'pilot wave' that is computed from the whole configuration of the universe, and then is used to determine what the spin of a particle is.

Basically you get to pick between nondeterminism (randomness) and a much more complicated global function that influences everything.

The theory is appealing to many because it avoids non-determinism. It's unappealing because it's strictly more complicated than the interpretations that don't have this extra object, but predicts nothing beyond them. By Occam's razor it's not as good as the simpler interpretations.

It is appealing to people who really don't want to accept the possibility of randomness in the universe, which I have no problem with. And it's certainly worth the time to investigate, because it's interesting.

(Some people think it's not worth splitting hairs over interpretations of QM, because they don't provide falsifiable predictions and so this stuff isn't science but philosophy. I disagree with this. Finding that one explanation is simpler than another is finding something, and constitutes, in my opinion, valid evidence for that explanation in a scientific sense.)

[1] https://en.wikipedia.org/wiki/Bell's_theorem


Just to clarify, Bell's result implies that our universe is either lacking locality or counterfactual definiteness. Counterfactual definiteness means that for any measurement you could perform there is a definite fact of the matter of what the result is prior to or without taking the measurement. The Many-Worlds Interpretation is consistent with Bell's result because it lacks counterfactual definiteness -- there's not one result of measurement; there's several results, all real!


Doesn't that mean that MWI has "global" counterfactual definiteness (across universe branches), but lacks it "locally" (within individual branches)?


It lacks counterfactual definiteness globally, because there's multiple outcomes, not a single fact-of-the-matter. It lacks counterfactual definiteness locally from a subjective perspective due to the seemingly random outcomes.


Isn't having multiple correct outcomes still the same thing? Because you can say in advance that all the outcomes will happen, isn't it just another perspective? As a physical entity inside the universe, you know that different copies of you will each experience a unique outcome, rather than being able to say "I will experience outcome X".

In other words, you can say what branches will exist but can't say with certainty what you'll see as a result for yourself. By the dictionary definition, isn't that still counterfactual definiteness? You'll know what will happen in advance, because you can calculate the possibilities.

Is the scientific definition strictly in the perspective of a physical observer?


I think we're talking past each other, and there may be some confusion about terminology. I will do my best to try to describe what's going on to the best of my abilities and see whether that answers your questions.

Bell's Theorem (also called Bell's Inequality) starts by assuming that the universe has locality and counterfactual definiteness and yields an inequality concerning correlations between certain measurements. Experiments later conclusively showed that these inequalities are violated in our universe. So one of the assumptions of locality or counterfactual definiteness does not hold in the universe we live in. Historically, this result was used to discredit and eliminate a class of theories about quantum mechanics called hidden variable theories -- the idea that quantum mechanics is deterministic, and we just haven't delved deep enough to find out what's determining the behavior of particles.

In my opinion it's most useful to think of Bell's result as a proof by contradiction: there are a bunch of assumptions (implicit and explicit) that go into it, and we're left with a prediction that reality violates. Therefore one of the assumptions is wrong, but we don't know which one, and different interpretations of QM will have different answers to this question. In the context of Bell's Theorem, counterfactual definiteness has a specific definition, and that's from the local perspective of an observer observing from within physics, because Bell's Theorem itself implicitly assumes a single universe and goes from there.

Under the Many-Worlds Interpretation, the result of a measurement is that the observer is split, and different copies of the observer will experience all results. This is the global and deterministic view. The local view is that the observer will see one result or the other, and there is simply no fact of the matter as to why the observer sees this particular result instead of the other.

The Many-Worlds Interpretation has some advantages over other interpretations in that it's local, deterministic, and mathematically the simplest. But it rubs a lot of people's intuitions the wrong way because it posits an unbelievable multitude of copies of what we know of as the universe.


It's for comments like this that I bother reading the comments here on HN. Thanks for distilling a complex subject into understandable-to-a-layman terms.


Agreed. I particularly valued this part, which helps me grasp some other content posted here:

"Bohmian mechanics [describes] a 'pilot wave' that is computed from the whole configuration of the universe... a global function that influences everything".


> The theory is appealing to many because it avoids non-determinism. It's unappealing because it's strictly more complicated than the interpretations that don't have this extra object, but predicts nothing beyond them. By Occam's razor it's not as good as the simpler interpretations.

I'm not a physicist, but doesn't classical QM have the problem of the "collapse of the wave function", i.e. "suddenly" Schrödinger's equation does not hold anymore - a problem that the De Broglie-Bohm theory does not have? Shouldn't this be considered as a strong sign against "typical" QM formulations and for the De Broglie-Bohm theory?


The wavefunction collapse is messy and awkward. It's taken as an axiom, basically, in the Copenhagen interpretation. It's not as awkward as introducing another equation and set of rules to skirt around it, as Bohmian mechanics does, in my opinion.

Many-worlds - which starts with being a lot more rigorous about what 'measurement' is (entangling yourself with the system in question) - is much simpler than either, I think.

My general impression is that Many-Worlds is getting more popular as physicists make precise how entanglement, measurement, and decoherence exactly work. I think it'll supplant Copenhagen as what we teach in intro classes eventually. But this is just my impression from the physics blogs and papers I've read; I'm not a physicist myself.


Classical QM (Copenhagen, if I'm not mistaken) does have that problem. The collapse is indeed an additional hypothesis about the part of the wavefunction the math already says we cannot observe. As always, unfalsifiable additional hypotheses are bad.

I can't speak for De Broglie-Bohm, but it would seem that theory also has a similar strike against it: the math is more complex than the equations QM physicists are familiar with. It's just not the simplest theory that fits the observations.

The obvious alternative is to just stick to the math. Problem is, the simplest math that explains our observations implies a universe that forks all the time. For some reason, possibly the intuition of a unique universe, many people cannot accept that.


>Basically you get to pick between nondeterminism (randomness) and a much more complicated global function that influences everything.

That sounds as if the universe is passing everything through an RNG monad.


An afterthought: the pilot wave model is definitely not directly compatible with relativity, since it purports to have a function 'defined everywhere' at once, and the concept of 'everywhere at a moment in time' doesn't work in relativity. There are apparently methods of working around this, and they're apparently very complicated.


That's not entirely true. Non-Einsteinian relativity allows for functions 'defined everywhere' at once. An example is Tangherlini relativity, which allows for anisotropy of the speed of light (and is not inconsistent with contemporary observations, IIRC).


Preferred foliation is the term. Ctrl-F it here:

https://en.wikipedia.org/wiki/De_Broglie%E2%80%93Bohm_theory


Well, I was looking at that page too, but didn't bother to quote what it's called because the term doesn't say anything useful.


In his book on QM, David Bohm gives a relativistic version of the pilot-wave theory.


There was a philosophical disagreement at the turn of the century where a bunch of ho-hums (including Einstein) decided to speak for everyone by saying everything here in this reality is discrete and isn't affected by anything anywhere else. A lot of work went into proving this over the last 100 years, with mixed results that just led to more and more testing at higher energy levels. With the Higgs, we started thinking we were "stuck" with our current theories. With the EM Drive, we're seeing things that don't fit in that model.

Going back to when the original disagreement started, there were a few physicists like Bohm and de Broglie who thought there was more than met the eye going on with discrete particles past a certain size limit. Their theories included a medium, or ether, in which all of this exists. To understand it a bit better, you might want to check out the "two slit experiment". There's one on YouTube that covers doing the experiment with photons, electrons, buckyballs and polarized lasers. Basically, when you understand the experimental results, you understand that matter acts like a wave sometimes and a particle others, even at large scales.

Those waves in the two slit experiment are likely responsible for the forces we're seeing in the EM drive. Also, I'm sure I've left out important bits here, but that's the basic idea.

To the moon!


Keep in mind there's no reason not to be hugely skeptical of the 'EM drive' right now.


> conceptually simple terms?

Physics does not seem to operate on conceptually simple terms. Three options: it is simple when approached in some way we have yet to think of; there is some inherent mistake we are making across a huge range of current experiments; or, as seems most likely, the universe operates very differently on different scales, making it conceptually difficult to deal with.


The Ptolemaic system was pretty complicated too...


So that would be option one.


I prefer to think of it as local variables over X amount of time. After that, they are non-local.


What?


Hmm, nope, sorry. I'll look that up.


Regular QM has non-locality. Entanglement is non-local, at least any way I've heard it explained for things like delayed choice erasers and the behavior of entangled anyons in confined electron gases. (Anyons (particularly fused anyons) themselves seem to have nonlocal quantum values associated with them.)


So we meet again, SomeStupidPoint. :) I still uphold my claim that regular QM is a local theory and just found time today to reply to your comment on non-locality that you posted a week ago:

https://news.ycombinator.com/item?id=12936699

Let me know what you think!


Hi there!

I hadn't expected such a delayed reply! Will get to reading it this afternoon. (:


The many-worlds interpretation doesn't need non-locality.


Bell's Theorem states that there exists no local hidden variable theory for QM. AFAIK that implies that QM must be non-local, because the other possibilities for a local theory either can be reduced to a hidden variable theory or have no evidence for them.


Not true. Bell's result precludes local realism. So physics is either not local or lacks realism (counterfactual definiteness). The Many Worlds Interpretation keeps the locality but lacks counterfactual definiteness. The fact that MWI is the only interpretation that manages to hold onto both determinism and locality is what makes it so aesthetically pleasing.


Many-worlds still has nonlocal entanglement, IIRC, and all the weird phenomena therein.


It does not require FTL signalling for the entanglement to work.


Care to make rigorous how other interpretations require FTL signaling? For that matter, can you actually rigorously explain what it means for worlds to "branch", or why Born probabilities should be interpreted as though they are probabilities in MWI? Saying "all the possibilities actually happen!" doesn't really explain the correlations we actually observe in any interesting way.


You make some good points. Regarding the Born probabilities, the extrapolation of the Schroedinger equation to the whole universe seems to make the concept of probability superfluous, and the Born rule loses its sense.


> This is pretty good news, if only because we will get more research into this phenomenon.

Agreed.

> Having a way to push your little probe along (even at a little bit of acceleration) without having to carry a lot of heavy propellant is a big deal. Maybe just some solar panels to generate the energy needed.

I'm not sure that actually works. Most use cases are for exploring the outer solar system, or even outside of our solar system. As the probe gets farther from the sun, the solar energy it receives decreases with the square of the distance, so eventually it won't provide even that little bit of acceleration.


> As the probe gets farther from the sun, the solar energy it receives decreases with the square of the distance, so eventually it won't provide even that little bit of acceleration.

In space you move in ellipses. It's possible to choose an elongated orbit that moves the probe closer and then further from the sun repeatedly, and each time it is close to the sun it accelerates forward, moving the "far" side of the orbit further away, until it gains enough speed to escape the sun's gravity.

Or at least it's possible in Kerbal Space Program :)


Yes, you move in ellipses; but you spend a lot of time far away and very little time close to the sun. I'm not sure that the average power is any higher than you would have for a circular orbit at the same energy...

EDIT: Ok, let's think through the calculus here. Kepler's 2nd law says that the rate at which the line between a planet and the Sun sweeps out area is constant, i.e., r^2 dθ/dt = 2 π a b / P, where a and b are the semi-major and semi-minor axes and P is the orbital period. Thus dt / r^2 = P dθ / (2 π a b), and the instantaneous power absorbed is proportional to 1/r^2. Integrating over an orbit, we get E ∝ P / (a b); i.e., the average power absorbed is proportional to 1 / (a b).

Since the energy of an orbit is determined solely by the semi-major axis a, this means the power absorbed for any given orbital energy is maximized for b ≪ a; so you're right that highly eccentric orbits are a better way to acquire energy for the purpose of orbital escape. (Subject of course to practical considerations of energy storage -- you'll absorb the most power at exactly the time you don't want to be changing your momentum.)
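
If anyone wants to check that numerically rather than trust my calculus, here's a sketch (arbitrary units, solar flux taken as 1/r^2, orbit parametrized by eccentric anomaly):

    # Time-averaged 1/r^2 over an orbit vs the claimed 1/(a*b) scaling.
    # For fixed a (fixed orbital energy), smaller b means more average power.
    import math

    def avg_inverse_r_squared(a, e, n=100_000):
        # r = a(1 - e cos E), dt = (P/2pi)(1 - e cos E) dE, so the time
        # average of 1/r^2 reduces to the mean of 1/(a^2 (1 - e cos E)).
        total = 0.0
        for i in range(n):
            E = 2 * math.pi * (i + 0.5) / n
            total += 1.0 / (a**2 * (1 - e * math.cos(E)))
        return total / n

    a = 1.0
    for e in (0.0, 0.5, 0.9, 0.99):
        b = a * math.sqrt(1 - e**2)
        print(e, avg_inverse_r_squared(a, e), 1 / (a * b))  # columns agree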


> you'll absorb the most power at exactly the time you don't want to be changing your momentum

Thanks for doing the math. But I'm still confused; I thought you do want to accelerate when near the sun, as that makes the far end of the orbit go further? Again, no math behind that, just how it worked in KSP :)


Whoops, you're quite right. I was thinking that you wanted to add momentum when you were moving the slowest, but that's for when you want to get into a circular orbit (which is the opposite of what you're talking about).

You should trust KSP more than you trust me. :-)


It's the Oberth effect, basically? :)

http://www.askamathematician.com/2013/01/q-how-does-the-ober...

By the way, I only know about it because I played KSP a lot ;).


Does the Oberth effect really apply when your propellant mass is always zero?


The simplest way (imo) to think about the Oberth effect is "you want to maximize the amount of time you're falling inwards (i.e. speeding up) and minimize the amount of time shooting outwards (i.e. slowing down.)"

It doesn't just work with orbits and rockets and propellants, it works just as well with an oscillating weight on a spring that you flick with your fingers.


It's about ∆v, so I'd say it does regardless of how you got that ∆v. I don't see a reason it couldn't work with solar sails, for example.


If anything, it should increase your inertia, since you're not sacrificing mass for ∆v.


My attempt at a thought experiment that seems to suggest elliptical orbits are better in this regard:

If you have an efficient insulator, you could absorb an arbitrarily large amount of heat at perihelion and retain it for the remainder of the orbit.

If you were in a circular orbit, you'd only be able to absorb up to the ambient temp.


Ambient temperature is a red herring; the sun is hotter than the ambient temperature so you can always win by absorbing energy from the sun and then emitting it to the rest of space.


Nuclear/Batteries


Wait, what? I thought Bell's Theorem made local hidden variables a no-go? Care to elaborate?


> This is pretty good news, if only because we will get more research into this phenomenon.

Or maybe it's just a waste of time/money? (I'm not being facetious for the sake of it. See cjensen's comment[1])

[1] https://news.ycombinator.com/item?id=12995341


Nothing in that comment is news, but if we've reached a point where we're completely unwilling to accept that experiments could show us something new before theorists predict and explain it, then we should hang up our science hats and go home.

Skepticism is warranted, but the team behind this study seems to be proceeding with a healthy dose of it, and AFAIK they have not made any inappropriate claims. The results so far are interesting, the experiments do not seem terribly expensive, and I see no reason why they shouldn't be followed to their conclusion (which will probably be the discovery of a mundane explanation for the phenomenon that's been observed).


Are you actually worried that all of the world's scientists are completely unimaginative and incompetent? I don't understand why you'd escalate it this way just because this topic comes up. That certainly doesn't sound like a reliable frame of mind in which to be.


You seem to be looking at my comment through a radically different lens than I was when I wrote it. You know it was in reply to another comment, right?


> Nothing in that comment is news, but if we've reached a point where we're completely unwilling to accept that experiments could show us something new before theorists predict and explain it, then we should hang up our science hats and go home.

Yes, but nobody's ACTUALLY saying that -- that's just the typical narrative of people selling perpetual-motion machines. (Who, you'll note, have a slightly different agenda.)

What's actually going on is that everybody's saying "this would require New Physics -- hold your horses!"... and the response seems to be "YEAH, BUT WHAT IF?!??!".

Pithy answer to the "what if" challenge: Not good enough.


But the question of how the scientific community should be reacting to this, and whether more time and money should be thrown at these experiments, seems wholly separate from the question of how the public should be reacting.

You asked "Or maybe it's just a waste of time/money?", which is a question for the scientific community, but most of the "WHAT IF?!??!" reactions seem to be coming from the public.

edit: I replied before you edited your post fairly heavily; I think what I said still applies, though.


Apologies for the ninja edit! I tend to leave fairly heavy markers around, but didn't this time[1] -- just didn't expect that someone would respond so quickly!

> You asked "Or maybe it's just a waste of time/money?", which is a question for the scientific community, but most of the "WHAT IF?!??!" reactions seem to be coming from the public.

I really don't understand this. Is this some sort of PoMo response to my original challenge, or...?

My contention is that the public interest may not necessarily have much to do with what the public is interested in. I don't think this a particularly controversial PoV...?

(Though I do have reservations about it, but -- personally -- I've just about given up on even basic scientific literacy. You may be less jaded.)

[1] Readability suffers, but DFW approves "from heaven". I must admit I'm quite DFW-like in that I cannot halt the loop before I write something, so here we are... with massive edits and such.


> Apologies for the ninja edit!

No worries

> I really don't understand this. Is this some sort of PoMo response to my original challenge, or...?

What's PoMo?

My interpretation of your first comment was that you were questioning whether this research is worth more time and money based on skepticism rooted in scientific knowledge i.e. these "new physics" needed to explain the EmDrive would conflict with a lot of what we currently think of as scientific "fact". Occam's razor, and so on.

My interpretation of your second comment was that you think the "YEAH, BUT WHAT IF?!??!" reactions to this research (generally associated with internet comment sections) are inappropriate and rooted in ignorance of scientific principles (FWIW, I agree with you on that).

I see those as separate concerns. The public is wholly ignorant, and their reaction to the results of this research to date (filtered as they are through awful clickbait articles) should have no bearing either way on whether the research continues. That's the point I was trying to make in my second comment.

> My contention is that the public interest may not necessarily have much to do with what the public is interested in. I don't think this a particularly controversial PoV...?

I don't think so either (and I agree).


> What's PoMo?

PoMo refers to Post-Modernism, which is, AFAIU, "Literary/whatever Theory" that, I think, seems to embrace relativism.


>Or maybe it's just a waste of time/money?

We are currently shooting missiles that cost $800,000 each at people in caves on the other side of the world, from boats that cost $4.4 billion each. If anyone can justify that expenditure, I think we can find a small fraction of that money to investigate this further.

http://news.nationalpost.com/news/world/at-800000-per-round-...


Well, for one thing, that article describes a future warship and ammunition that is not yet in service, so it is incorrect to suggest anyone is 'currently shooting [...] at people in caves' with it.

However, the justification (OK, a justification) for that action, as part of the war on terror (I believe this is your implication, based on the 'caves' reference) would be that the insured costs of 9/11 were around USD 40 billion [1] (ignoring the unknown intangible additional cost in human life) so spending USD 800000 to kill a terrorist who is planning to perform a similar event is something of a bargain.

But, this sort of logic is flawed to begin with: We spend X on Y, which I disapprove of, therefore we should spend epsilon on Z instead.

[1] https://en.wikipedia.org/wiki/Economic_effects_arising_from_...


It not a "future" warship. Its an existing warship that keeps breaking down, despite the fact that its new and cost $4.4 billion to make. It just broke down again in the Panama Canal. That's a picture of the actual, existing-in-the-present ship.

http://www.military.com/daily-news/2016/11/22/new-zumwalt-br...

>However, the justification (OK, a justification) for that action, as part of the war on terror (I believe this is your implication, based on the 'caves' reference) would be that the insured costs of 9/11 were around USD 40 billion [1] (ignoring the unknown intangible additional cost in human life) so spending USD 800000 to kill a terrorist who is planning to perform a similar event is something of a bargain.

Except for the fact that that justification is false. We've spent over $5 trillion fighting the fictional "war on terror" in the 15 years since 9/11, and by every measurable metric there are more "terrorists" (people who would like to attack us) today than there were on 9/11.

http://www.military.com/daily-news/2016/09/13/report-nearly-...

This dwarfs by many orders of magnitude the total science funding spent by the government since the government was formed. It's absurd to discuss "waste" or "inefficiency" in any context, let alone a tiny expenditure for investigating the EM drive, while ignoring the elephant in the room. The phrase "penny wise, pound foolish" applies to that line of thought.


> It not a "future" warship. Its an existing warship that keeps breaking down, despite the fact that its new and cost $4.4 billion to make. It just broke down again in the Panama Canal. That's a picture of the actual, existing-in-the-present ship.

It's in the Panama Canal going from its shipyard to its homeport. It isn't yet in service. Apparently the combat systems are not even installed yet. I believe that is what your parent poster means. [1]

[1] http://navaltoday.com/2016/11/22/uss-zumwalt-breaks-down-in-...


Speaking as an experimental physicist:

The research shown is relatively inexpensive, and can be pulled off with what many labs probably have lying around anyways.

I'm sure it's more a work of love anyways and the guys doing this are probably facing serious headwind for trying 'crazy stuff'.


This paper investigates how EM propulsion drives might generate thrust, and as a side effect, the theory explains certain phenomena that we attribute to dark matter and dark energy. I only have an undergrad physics degree, but it sounds interesting. Anyone with more experience have any thoughts about this?

http://arxiv.org/pdf/1604.03449v1.pdf

> McCulloch (2007) has proposed a new model for inertia (MiHsC) that assumes that the inertia of an object is due to the Unruh radiation it sees when it accelerates, radiation which is also subject to a Hubble-scale Casimir effect. In this model only Unruh wavelengths that fit exactly into twice the Hubble diameter are allowed, so that a greater proportion of the waves are disallowed for low accelerations (which see longer Unruh waves) leading to a gradual new loss of inertia as accelerations become tiny. MiHsC modifies the standard inertial mass (m) to a modified one (m_i) as follows: m_i = m (1 - (2c^2)/(|a|Θ)) = m (1 - λ/(4Θ)) (Eq. 1), where c is the speed of light, Θ is twice the Hubble distance, |a| is the magnitude of the relative acceleration of the object relative to surrounding matter and λ is the peak wavelength of the Unruh radiation it sees. Eq. 1 predicts that for terrestrial accelerations (eg: 9.8 m/s^2) the second term in the bracket is tiny and standard inertia is recovered, but in low acceleration environments, for example at the edges of galaxies (when a is small and λ is large) the second term in the bracket becomes larger and the inertial mass decreases in a new way, so that MiHsC can explain galaxy rotation without the need for dark matter (McCulloch, 2012) and cosmic acceleration without the need for dark energy (McCulloch, 2007, 2010).
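
To get a feel for Eq. 1, here are rough numbers. Θ ≈ 2.6e26 m (twice the Hubble distance) is my assumption; I believe it's close to the value McCulloch uses, and the exact choice doesn't change the picture:

    # The MiHsC correction term 2c^2/(|a| * Theta) from Eq. 1 above
    c = 3.0e8          # speed of light, m/s
    Theta = 2.6e26     # twice the Hubble distance, m (assumed)

    def correction(a):
        return 2 * c**2 / (a * Theta)

    print(correction(9.8))    # ~7e-11: utterly negligible at terrestrial accelerations
    print(correction(1e-9))   # ~0.7: inertial mass drops by ~70% at galaxy-edge accelerations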


McCulloch's papers have many, many problems[1]. No credible theory has been posited to date, and the experimental evidence fails to properly address systematic errors. If this device worked, one could use it to create an over-unity generator. It would also invalidate most of known physics. The chances of it working are nonzero, but that's the best that anyone can say about it. Most likely it will waste people's time and money for a few years before the public loses interest.

[1]https://www.reddit.com/r/EmDrive/comments/3hjmtv/my_conversa...


I mentioned this elsewhere in this thread. I'm really interested to hear some context around it from someone who knows what they're talking about; to me, it's a more appealing explanation than "conservation of momentum is wrong" or "we're using the quantum vacuum as a tractive medium which is supposed to be impossible but whatever".


IANAP, just someone with a lay understanding that's been following a lot of these topics for a while now.

As I understand it McCulloch's requirement that wavelengths fit within the cosmic horizon is an assumption, not a proven logical implication.

Additionally I think a lot of physicists are dismissive because they expect someone who desires to upend mainline theory to at least demonstrate fluency with the mathematics of mainline theory. McCulloch falls short of this.

All that said, I think he's genuine in his pursuit and I've really enjoyed reading his blog over the years.


McCulloch's papers aren't very well-regarded because he was originally a biologist, not a physicist, and his theories (accompanied by blog posts) are very unconventional. I'm not trying to slime him or anything, just explain why he isn't cited often - I'm not well-educated enough to understand his work.


Shouldn't crazy stuff be exactly why we fund tenured positions?

I would like to think tenure still produces crazy scientists


> Shouldn't crazy stuff be exactly why we fund tenured positions?

Crazy stuff becomes an option after one is granted tenure, it doesn't necessarily make one more likely to achieve tenure. On the other hand, theoretical physicists aren't very expensive -- they only need a blackboard and an eraser. Compare that to a philosopher -- much the same but without the eraser.


Badah dum tish!


Classic, textbook example of lifeisgood-oceanswave syndrome.


I think you misunderstand. Tenure is usually granted to people with a (sometimes minor) track record of their ideas working out. Essentially it amounts to a bet on things to come, or IOW of "is person X a crank or not?". This, of course, is also tied to the field in which they are working, etc.

(Sure, some good researchers turn into cranks later in life[1], but usually in a different -- often unrelated -- field.)

[1] Nobel prize winners often succumb to this, sadly.


Seems like having the preeminent researchers in a field be the ones to do the weirdest research while teaching students on the foundations and established research is a pretty solid decision.


I guess you're referring to Brian Josephson, who went from superconductors (Nobel prize) to things like telepathy etc. but I'm interested if there are other examples. A quick search turns up Kary Mullis as a possible other example, but I don't know too much about him, and it's a difficult topic to research.


Linus Pauling (two Nobel prizes!) jumps to mind with his Vitamin C work

https://en.wikipedia.org/wiki/Linus_Pauling#Medical_research...


What similar research is the time and money better spent on?


Actually it's not as useful as an engine as it may seem, at least at current parameters. Ion thrusters are pretty good as low-power engines, and it's not clear that the emdrive is better than that even if it works. Maybe with powerful space-based nuclear reactors it would be.


The difference is that eventually you run out of ions to thrust with.


Practically speaking though the power generation is the limiting factor. An emdrive requires way more power, which means way more generating capabilities, which ends up taking up way more mass on your spacecraft than the ion engine reaction mass.


If you have a nuclear reactor, the power vs mass loss could end up working in favor of the emdrive, especially for long range interstellar flights. None of this is, of course, practical anytime soon...


Nuclear reactors are incredibly massive though, especially with all of the required shielding. We've never launched anything even close to as massive as a space-borne nuclear reactor would be. I agree with you that it won't be practical anytime soon (or at least not politically), so as long as we're positing some future level of tech, why not nuclear fusion?


Due to political and environmental concerns alone no modern terrestrial reactor design would be allowed to fly. That's usually not what nuclear propulsion proponents argue for.

There are several other reactor designs that have been researched over the last half century that can operate in a closed loop with much smaller amounts of nuclear fuel and more manageable radiation emissions. Designs like the nuclear lightbulb, which was researched by United Aircraft Corporation and NASA for almost a decade before the plug on the Mars mission was pulled in the 70s, are much better suited and are what proponents of active nuclear propulsion most often have in mind. Once there is some political will, we have decades old research to start from to build a flight capable nuclear reactor.


The nuclear lightbulb doesn't generate power though -- it's a rocket engine, and "generates" reaction mass leaving at high velocities. So it's not suitable for use powering an emdrive, the whole point of which being indefinite flight without needing to use up any reaction mass, which a nuclear lightbulb can't do.


No, the nuclear lightbulb generates energy in the form of heat and electromagnetic radiation. How you use that energy output depends on the design of the reactor and propulsion system.

You can seed expanding liquid hydrogen in the outer cavity with tungsten nanoparticles which absorb the UV radiation to heat the hydrogen for use as a propellant. You can pump the outer cavity with a UV transparent coolant and line the walls with parabolic photovoltaics (that convert UV instead of visible spectrum) for direct conversion of the black body radiation to electricity. You can theoretically even create a magnetohydrodynamic "turbine" in the outer cavity that is coupled to the spinning nuclear fuel (which can be charged plasma).

You may be thinking of an earlier design that was mistakenly called a nuclear lightbulb, or another design that was lumped into the concept. The variation I have in mind is just as general purpose as any other nuclear reactor; it just uses a lot less fuel and has to actively maintain temperature and pressure to keep the neutron cross section energy high enough to sustain power-positive fission.


Do you have a link with more information to the specific concept that you're talking about? It looks like I was led astray by the Wikipedia article. I'm super interested.


Nuclear reactors have actually been launched into space multiple times [0][1]. Note, however, that their power output was on the order of a few kilowatts.

[0] https://en.wikipedia.org/wiki/US-A

[1] https://en.wikipedia.org/wiki/TOPAZ_nuclear_reactor


Wow, that's awesome. I hadn't realized. Unfortunately those systems were fraught with problems (that persist to this day in their orbital debris), but they flew and they worked. Looking at the power generation capacity of those reactors vs their mass, however, they weren't even that much better than plutonium RTGs! The TOPAZ generated 5 kW using a 320 kg reactor, and was only good for five years. That's 15.6 W/kg. Meanwhile, the standard RTG design we've been using in recent space probes, including New Horizons, generates 300 W at 57 kg, or 5.3 W/kg, but the half-life is a long 87.7 years and the system doesn't suffer the kind of wear problems that a nuclear reactor does, so the total lifetime energy output is substantially greater. This would matter for interstellar probes with near-present levels of technology, assuming a working emdrive but nothing else.
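
Quick sanity check on those numbers, plus the lifetime-energy point; the 50-year RTG mission length here is my own assumption:

    # Specific power and lifetime energy per kg, from the figures above
    import math

    topaz_w, topaz_kg, topaz_years = 5000.0, 320.0, 5.0
    rtg_w0, rtg_kg, half_life = 300.0, 57.0, 87.7

    print(topaz_w / topaz_kg)  # ~15.6 W/kg
    print(rtg_w0 / rtg_kg)     # ~5.3 W/kg

    # Lifetime energy per kg (W*yr/kg); RTG output decays as 2^(-t/half_life)
    topaz_energy = topaz_w * topaz_years / topaz_kg
    tau = half_life / math.log(2)
    rtg_energy = rtg_w0 * tau * (1 - 2 ** (-50 / half_life)) / rtg_kg
    print(topaz_energy, rtg_energy)  # ~78 vs ~217: the RTG wins per kg over the long haul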


I don't really see a huge difference between a spaceborne reactor and the kind that have been in constant use for decades in submarines. We just need a new Admiral Hyman Rickover to make it happen safely and effectively.


Two big problems:

1. Details on submarine nuclear reactors are classified, but five minutes' Googling shows that a reasonable guess for their total mass (including shielding) is 1,000 tons. Meanwhile, the entire payload to LEO of the largest rocket ever successfully launched, the Saturn V, is only 155 tons -- and that's for the entire top stage.

2. Submarine nuclear reactors use water for cooling. Water that is not available in space. Cooling would be a huge problem.

So, existing submarine designs are not practical. You'd need to design something from the ground up that is much lighter, and the cooling system would be entirely different and likely more massive, since you don't have all of that free water available to dump heat into. Instead you're talking massive radiators.


Good point re: the mass of the reactor, that's a hell of a lot more than I'd have guessed.

But any mission involving humans is likely to carry a large amount of water beyond the crew's personal needs, because it makes such a good radiation shield. So presumably the same water would be used for cooling the reactor.


I am not convinced. If you can fly a reactor: https://en.wikipedia.org/wiki/Convair_NB-36H then I feel it unlikely that it weighs 1000 tons.


It all depends on the size and design of the reactor, though. A reactor that you put on a plane just for the hell of it (which is what that was) is going to be a lot smaller and lighter than a nuclear reactor that needs to power an entire submarine. The point to my reply was that submarine nuclear reactors were in no way suitable for space use because they aren't optimized for weight at all. Reactors optimized for plane use would be a closer fit. The reactor in that plane could be lifted to orbit on a Saturn V, so we're making progress, but, and this is a huge but, it was air-cooled.

A 3 MW reactor puts out a hell of a lot of heat, and without the benefit of air-cooling in space, I'm not sure what exactly you would do with all of that waste heat. Consider how massive the space shuttle orbiter's radiators were (they are on the inside of the cargo bay here: http://i.stack.imgur.com/Flgzb.jpg ), and all of that is only capable of shedding waste heat in the amount of ~6 kW! We can put a much more capable reactor into space than we can possibly cool, so we haven't bothered. Cooling is the real problem. The total mass of the radiators and the structure required to support them ends up being way more than the reactor itself.
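
To put a rough number on it, here's the radiator area needed to reject reactor-scale waste heat by thermal radiation alone. The 300 K radiator temperature and 0.9 emissivity are my assumptions; real designs run hotter to shrink the area:

    # Stefan-Boltzmann estimate of radiator area for 3 MW of waste heat
    sigma = 5.67e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)
    T, eps = 300.0, 0.9
    P_waste = 3e6      # W, roughly the reactor output mentioned above

    area = P_waste / (eps * sigma * T**4)
    print(area)        # ~7300 m^2 of radiating surface -- enormous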

So for good long distance transportation in space, not only do we need a working, efficient emdrive, but we also need better power generation that is much more efficient from a waste heat perspective. These are really hard problems.


You don't have to set off the equivalent of 500,000 kg of TNT at the business end of a submarine to put it into the water.


Did you not read the article? To Mars in 70 days, I believe it says...


We can get to Mars in 90 days with chemical propulsion, today.


and this refutes how?


> Actually it's not as useful as an engine as it may seem,

It's worth more than any existing or previous theoretical engine. Put this rotating around a center point. Now put 1000s of them. Now we have power generation for outposts in space. The implications of a working EM drive are staggering.


The emdrive (at least the current version) requires way more power per Newton of force than even ion engines, which are themselves famously power hungry. You definitely could not get free energy out of an emdrive because the amount of power required to run it is way more than you'd get out.

The advantage of the emdrive is that it does not require reaction mass, at the cost of incredibly high power consumption. Unfortunately that makes it not really viable for anything; ion drives are limited by power, not reaction mass, so the emdrive is actually worse because it uses way more in power than it saves in not needing reaction mass.
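
Rough numbers behind that, quoted from memory so treat them as order-of-magnitude only (the ~1.2 mN/kW figure is what the Eagleworks paper reports; the ion figure is roughly NSTAR's ~92 mN at ~2.3 kW):

    emdrive_mn_per_kw = 1.2
    nstar_mn_per_kw = 92 / 2.3                   # ~40 mN/kW

    print(nstar_mn_per_kw / emdrive_mn_per_kw)   # ion engine: ~30x more thrust per watt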


The drive being power hungry would not change the fact that kinetic energy goes up quadratically with speed (E = (m*v^2)/2) while this drive supposedly can keep accelerating with only constant power input. So kinetic energy would go up quadratically with time while expended energy only goes up linearly. Eventually you'd get free energy.
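
To make that concrete with assumed numbers (1000 kg craft, 1 kW in, thrust at roughly the reported mN/kW scale):

    # Constant thrust F at constant input power: KE = F^2 t^2 / (2m) grows
    # quadratically while input energy P_in * t grows linearly, so they cross.
    m = 1000.0     # craft mass, kg (assumed)
    P_in = 1000.0  # input power, W (assumed)
    F = 1.2e-3     # thrust, N (assumed)

    t_cross = 2 * m * P_in / F**2
    print(t_cross / 3.15e7, "years")  # ~44,000 years -- but past that, free energy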

Since violation of conservation of energy is very unlikely, I'd say that this is a sign that the drive doesn't actually work. Or at least that there is some gotcha that we haven't understood yet.


Anything keeps accelerating with only constant power input. That's literally how all rockets in a vacuum work. Nothing special with an emdrive there. It doesn't mean that perpetual motion is possible, it just means that you're wrong about the ramifications of kinetic energy vis a vis power generation.


A rocket using chemical energy at a constant rate does not have this problem, because the excess energy it seems to gain from kinetic energy going up quadratically is balanced by a decrease in the kinetic energy that the expelled reaction mass is left with.


If you think of mass as the deformation of spacetime, mass reaction engines push off the wake from expelling mass rapidly in one direction to go in the other direction.

The EM Drive is an energy reaction engine, rather than a mass reaction engine (if it works as described), so instead of pushing off wake, it uses energy itself to "deepen the fold", as it were, and increase speed (and mass, since the two are exchanged).

That's interesting. I've never considered that the very act of increasing the velocity of the EM Drive would actually increase its mass. Hrm.


Not disagreeing with you there, but I don't see how that leads to free energy from an emdrive. How exactly do you get more energy out than you put in?

And anyway, kinetic energy isn't a conserved quantity. It can be converted into potential energy in a gravity well, or lost entirely in inelastic collisions. Momentum is the quantity that is always conserved, and that is what the emdrive violates directly (though still not in a way that allows for perpetual motion or free power, near as I can tell).


As jack9 proposed higher up in this thread, you could put the EmDrive on the outside of a rotating system to provide constant torque, which accelerates the rotation. Then wait until the system has sped up to the point where the kinetic energy increases faster than the power you put in. Then use the whole thing to drive a generator.


I would think that once we determine the cause of the thrust, assuming it is real, the design of the 'drive' might be improved to increase efficiency. We currently don't even know what characteristics of the device are causing the generation of thrust.

The first internal combustion engines were quite inefficient as well.


Here's hoping! It's so unbelievably inefficient right now though, it's hard to imagine it improving by orders of magnitude. The original internal combustion engines were comparatively way more efficient.


As I understand it, the version under test here was simply a proof-of-concept, not optimized for maximum efficiency. There are also plans for a version with a superconducting frustum, which should be even more efficient...


Naive questions from someone who does not understand a lot about physics (to put it mildly): Did you just invent the perpetuum mobile, or did I misunderstand what you said? Or do I just not understand how this thing is supposed to work?


Actually, yes!

By special relativity you can't break the momentum conservation principle without breaking the energy conservation principle.

Somehow most people feel more comfortable dismissing the momentum conservation principle, but they are intertwined.

A carousel with EmDrives like the one described in the previous comment should be a perpetuum mobile and create energy if it's spinning in the right direction. (It will destroy energy in the reverse direction, so be careful.)

[Anyway, I think that the emdrive doesn't work and the conservation of energy and momentum are safe.]


The Emdrive takes energy, though, doesn't it? It's not like you can plug it in and have it run, unless I misunderstand a lot of claims.


Yes.

However, any true reactionless thrust is equivalent to a perpetual motion machine, because reactionless thrust drives produce the same acceleration regardless of speed, but the kinetic energy produced by the same acceleration goes up as the speed goes up.

If you accelerate one kg from 0 m/s to 1 m/s, you impart 0.5 joules of energy on it. If you accelerate it from 1 m/s to 2 m/s, you impart 1.5 joules of energy on it. If you accelerate it from 2 m/s to 3 m/s, you impart 2.5 joules of energy on it, and so on. Each added m/s costs more in energy.
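
Spelled out in a couple of lines:

    # Each extra m/s on a 1 kg mass costs more kinetic energy than the last
    m = 1.0
    for v in range(4):
        dE = 0.5 * m * ((v + 1)**2 - v**2)
        print(f"{v} -> {v + 1} m/s: {dE} J")  # 0.5, 1.5, 2.5, 3.5 J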

A reactionless drive working in a system with no preferred frame would add the same amount of acceleration for the same amount of cost, regardless of how much energy there already is. This would mean that eventually it would be going fast enough that the added kinetic energy would be more than however much energy it draws in. Then you can build a gigantic carousel that is spun on the rim and takes energy from the middle and feeds some back in to move it, and now you have perpetual motion.

This is why no real physicist actually thinks that this will be reactionless thrust. However, that does not necessarily mean it's useless. If it, for example, allows you to push against the earth's magnetic field more weight-efficiently than current magnetic propulsion systems, it would be a major win for satellite stationkeeping.


> reactionless thrust drives produce the same acceleration regardless of speed

A) Where is that in the paper? Or am I missing something elementary to this?

B) We don't even know if it produces the same level of thrust as speed increases. If it produces the same level of thrust across all speeds, then we have a problem, because as velocity increases, so does mass, and the same level of thrust would produce less acceleration over time, because of the increase in mass. Of course, that may only affect things at relativistic speeds.


Speed relative to what, exactly? There are no preferred reference frames. A true reactionless thruster could always treat itself as if its initial speed were 0, and so it took 0.5 J to accelerate by 1 m/s.

Rockets do that too, but they get away with it because they stored the necessary extra energy into the kinetic energy of the fuel.

If there was a reactionless thruster that did, in fact, get less efficient at higher speeds, then that would also be world-shattering physics news, because then you could measure its performance after accelerating in different directions and eventually derive a true rest frame that is privileged over all other reference frames.


Put two emdrives on the ends of a rotating beam. Use the violation of conservation of energy/momentum to drive a generator. Infinite speed and energy are yours to command!


Driving the generator is going to rob the system of energy. That should be obvious, right? Nothing in the matter/energy equation is changing. This drive would just take electrical energy and convert it into kinetic energy.


He did, yeah. If it works as described, then you'd expect it to break conservation of energy along with conservation of momentum.

That's another good reason to think it doesn't work.


He did not. The emdrive requires a very large amount of power to operate, way more per Newton than even an ion drive.


> Did you just invent the perpetuum mobile,

That's the implication of the EM Drive. I didn't invent it; it's the very first thing that you think of once you violate the momentum conservation principle. This is why it's unlikely to be true, but at the same time an alchemist's dream scenario.


This takes a huge amount of electricity to work.


What generates the power to power them?


easy: 1000000's of them rotating around a fixed point!



