Physicists Superheated Gold to Hotter Than the Sun's Surface and Disproved a 40-Year-Old Idea
(www.smithsonianmag.com)
Sort of reminds me of the energy-time version uncertainty principle: if an interval is short enough, energy fluctuations can be extremely high.
What I'd like to know here is what duration threshold would allow melting to start.
Energy-time relations have no link to the uncertainty principle. They apply to classical cameras, for instance. There are no "energy fluctuations": you cannot magically get energy from nothing as long as you give it back quickly, like some kind of loan.
This is because the energy-time relation works only for particular kinds of time, such as the lifetime of an excitation or the shutter time of a camera, not just any time coordinate value.
Edit: downvotes from the scientifically illiterate are fun. Let's not listen to a domain expert, let's quote wiki and wallow in collective ignorance.
https://en.m.wikipedia.org/wiki/Uncertainty_principle
Did you read it?
Whether it's energy-time or position-momentum, the uncertainty principle is just a consequence of two variables being linked via Fourier transform. So position and wave-vector, therefore position and momentum; and time and angular frequency, therefore time and energy. Sure, it only has consequences when you're looking at time uncertainties and probabilistic durations, which is less common than space distributions. And sure, it also happens in classical optics; that's where all of this comes from. And I agree that "quantum fluctuations" is often a weird, misleading term to talk about uncertainties. But I'm not sure how you end up with "no link to the uncertainty principle"? It's literally the same relation between intervals in direct or Fourier space.
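For what it's worth, the Fourier claim is easy to check numerically. A minimal sketch (NumPy, all names mine): a Gaussian pulse saturates the bound sigma_t * sigma_omega = 1/2, which is the same inequality behind Delta x Delta p and Delta t Delta E.

```python
import numpy as np

sigma = 1.0
t = np.linspace(-50, 50, 1 << 14)          # fine, wide time grid
g = np.exp(-t**2 / (2 * sigma**2))         # Gaussian pulse

def std(x, density):
    """Standard deviation of x under a (uniform-grid) density."""
    p = density / density.sum()
    mean = (x * p).sum()
    return np.sqrt(((x - mean)**2 * p).sum())

# Width in time, from the intensity |g|^2
sigma_t = std(t, np.abs(g)**2)

# Width in angular frequency, from the spectral intensity |G|^2
dt = t[1] - t[0]
G = np.fft.fftshift(np.fft.fft(g))
omega = np.fft.fftshift(np.fft.fftfreq(t.size, d=dt)) * 2 * np.pi
sigma_w = std(omega, np.abs(G)**2)

print(sigma_t * sigma_w)  # ~0.5, the Fourier uncertainty bound
```

Any non-Gaussian pulse gives a product strictly above 1/2; the Gaussian is the equality case.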
Okay, explain to me what the standard deviation of time is. I will pre-empt nonsense, just "time", not just time in reference to the duration of a finite process. It must be abstract and universal, like the position-momentum case.
You know maybe I'm starting to understand your point.
On the surface your question is easy to answer: clock uncertainties are a thing, and are very analogous to space-position uncertainty. Also time-of-arrival is a question that you can pretty much always ask, and it's precisely the "uncertain t for given x" to the usual "uncertain x for given t". Conversely you don't have the standard deviation of "just space": as universal as it is, Delta x is always incarnated as some well-defined space variable in each setting.
But it's also true that clock and time-of-arrival uncertainties are not what's usually meant in the time-energy relation: in general it's a mean duration (rather than a standard deviation) linked to a spectral width. And that does make sense, because quantum mechanics is all about probability densities in space propagating in a well-parametrized time. So Fourier on space => uncertainties, while Fourier on time => actual duration/frequency. And if you go deeper than that: I'm used to thinking of the uncertainty principle in terms of Fourier because of the usual Delta x Delta p > 1/2 formulation, but for the full-blown Heisenberg-y formula you need operators, and you don't have a generally defined time operator in standard QM because of Pauli's argument.
But that's a whole thing in and of itself, because now I'm wondering about time of arrival operators, quantum clocks and their observables, and is Pauli's argument as solid as that since people do be defining time operators now and it's quite fun, so thanks for that.
Fine, I can say this in a way that does not violate energy conservation but still uses the energy-time uncertainty principle:
Say you have a system with two levels, hot and cold, like the gold sheet in this experiment. Then I can take a linear combination of these two (stationary) states, between which the period of oscillation would be deltat = h/deltaE, which would be the time for the system to "heat" and "cool", here within 45 femtoseconds. (lifted from Griffiths, page 143)
That would give deltaE > 1.5E-20 J, compared with kT(19000 K) = 27E-20 J and kT(1300 K) = 1.8E-20 J 🤔 So the melting temperature is close to the oscillation limit; the extra energy at 19000 K is not going to do anything unless the cooling slows down.
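A back-of-envelope check of those numbers (constants from CODATA; the 45 fs duration and the two temperatures are the ones quoted above, with 1300 K standing in for gold's ~1337 K melting point):

```python
h = 6.626e-34   # Planck constant, J*s
kB = 1.381e-23  # Boltzmann constant, J/K

dt = 45e-15                  # 45 fs, the duration in the experiment
dE = h / dt                  # energy scale of the oscillation limit
kT_hot = kB * 19000          # superheated temperature
kT_melt = kB * 1300          # near gold's melting point (~1337 K)

print(f"deltaE      = {dE:.2e} J")       # ~1.5e-20 J
print(f"kT(19000 K) = {kT_hot:.2e} J")   # ~2.6e-19 J
print(f"kT(1300 K)  = {kT_melt:.2e} J")  # ~1.8e-20 J
```

So kT at the melting point sits right at the h/deltat scale, while kT at 19000 K is an order of magnitude above it.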
Soo...I don't understand the point of the experiment. It just looks like they're exciting ~~atoms~~ metal and then letting them quickly deexcite radiatively...and then wonder why they won't absorb huge amounts of energy and melt (if the energy remained within the system, it would). I probably would have to get the actual paper, but I don't wanna 😛
A reasonable approach, but melting is a phase transition. It's a collective behaviour. What the experiment shows is that quantum phenomena happen fast enough to make thermodynamics a bit strange. Probably because it is formulated in terms of continuous maths and atoms are discrete.
They didn't say anything about cooling the gold film.
They measured that it remained solid at a certain temperature for a certain length of time after it had reached that temperature.
I'm sure it eventually melted, but the question was how long it stayed solid after being superheated past previously theoretical limits.
That's the problem: reading the quotes from my top reply, even they seem to admit that what they are calling temperature is not what is usually called temperature in thermal equilibrium.
It's a subtle distinction.
High temperature/energy leads to entropy/liquefaction, but I think what this experiment demonstrated is that there's a short delay, an "entropy build-up curve", between a high amount of energy and the "transmission" of entropy through the solid molecular structure into a liquid state.
I'm not sure if I'm wording all this correctly.