The Reference Frame


Sunday, September 8, 2013

Likely: latest Atlantic hurricane-free date at least since 1941

Posted on 10:54 PM by Unknown
Originally posted on September 4th. Now, 5 days later, it seems that no currently active systems will grow into a hurricane, so the record will be broken – a new record at least since 1941, indeed.
Remotely related: Henrik Svensmark et al. have a new paper (press release) on cosmoclimatology in PLA, experimentally arguing that UV rays increase the aerosol production from ozone, sulfur dioxide, and water vapor by the same factor even for nuclei above 50 nm in diameter – which may already be called cloud condensation nuclei. This strengthens his claim that cosmic rays influence the climate and falsifies some theories about the chemistry of the atmosphere. Via WUWT. See a previous TRF text on cosmoclimatology.
I have manually checked the dates of formation of the first hurricanes on Wikipedia pages, from the 1851 Atlantic hurricane season (older, sparser data are available at most on a "one page per decade" basis) through the 2013 Atlantic hurricane season. You should be able to manually edit the year in the URL to get to all the other pages.

This is what I found.

The first 2013 Atlantic hurricane hasn't started to form yet; only two 20–30 percent "glimpses" of a possible depression can be seen, and they're likely to be destroyed by their collision with the land (and even if they aren't, they will still be too weak to become a hurricane). It's September 4th. The probability is therefore high that this situation will continue past September 9th, i.e. next Monday. If so, 2013 will beat 2002 (Sep 9th), and the "most recent" hurricane seasons that the present one could still have to beat are 1941 (Sep 17th), 1922 (Sep 13th), 1914 and 1907 (the only two recorded hurricane-free seasons, with 1 and 5 named storms, respectively), 1905 (Oct 1st), 1877 (Sep 14th), and 1876 (Sep 12th).




For the sake of convenience, let me mention the first "hurricane birth date" for less spectacular years which were nevertheless late in the season: 1937 (Sep 9th), 1931 (Sep 6th), 1920 (Sep 7th), 1912 (Sep 10th), 1865, 1857 (both Sep 6th). Some seasons were very weak – 1952, 1939, 1930 – but some hurricanes materialized by the end of August, anyway.
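The comparison above can be sketched in a short script. The dates below are transcribed from the lists in this post (hurricane-free 1907 and 1914 are omitted because they have no first-hurricane date at all); treat it as a minimal sketch of the record-hunting logic, not as an authoritative dataset.

```python
from datetime import date

# First-hurricane dates transcribed from the text above.
first_hurricane = {
    2002: date(2002, 9, 9),  1941: date(1941, 9, 17),
    1922: date(1922, 9, 13), 1905: date(1905, 10, 1),
    1877: date(1877, 9, 14), 1876: date(1876, 9, 12),
    1937: date(1937, 9, 9),  1931: date(1931, 9, 6),
    1920: date(1920, 9, 7),  1912: date(1912, 9, 10),
}

# Seasons whose first hurricane formed later in the calendar than Sep 9th,
# i.e. the seasons that a hurricane-free-through-Sep-9th 2013 would still
# have to beat, latest first:
later = sorted(
    (yr for yr, d in first_hurricane.items() if (d.month, d.day) > (9, 9)),
    key=lambda yr: (first_hurricane[yr].month, first_hurricane[yr].day),
    reverse=True,
)
print(later)  # [1905, 1941, 1877, 1922, 1876, 1912]
```

Note that 2002 and 1937 (both Sep 9th) drop out of the list: to set a new record, 2013 has to remain hurricane-free strictly past that date.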

Most years see the first hurricane forming sometime in July. August births of the first hurricane are rarer, much like births in September or, on the contrary, in June (or even May). Several out-of-the-official-season hurricanes in January etc. would mess up the calendar, so I ignored them (treated them as if they hadn't occurred).




See also AccuWeather and WUWT articles about the possibly looming new record.

After the unusually vigorous 2005 Atlantic hurricane season, we were drowning in predictions of ever stronger and ever more frequent hurricanes. A statistical evaluation of the 2006-2013 seasons shows that all the hurricane-related data randomly fluctuate in the usual intervals we have been used to for quite some time. These numbers are very volatile but there's no indication that something is increasing and everything suggests that 2005 was an exceptional year that is unlikely to be repeated often.

So far, we don't know for sure whether the late earliest-hurricane date of 2002 will be beaten – I would bet that it probably will – or whether 2013 will join the shortlist of hurricane-free seasons (1914 and 1907: how many TRF readers remember those years?) – the chances for Yes and No seem comparable to me. And yes, I think it's more likely than not that at least one hurricane was overlooked either in 1907 or 1914, due to the absence of satellites and similar devices, so it's perfectly plausible that the ongoing season will actually be the weakest one since 1851 (unless a hurricane materializes).

But what is clear is that the absence of any unusually strong hurricane activity after 2005 is just another example of the spectacular failures of the climate alarmists. Like with other failures, they never learn any lesson. They only cherry-pick the data that agree with their scientifically pathological opinion that a dangerous climate change should be underway and simply switch to something else whenever the clash of their idea with the data becomes too self-evident.

Whenever it becomes indisputable that the data have falsified a particular prediction of their "dangerous climate change" framework – and be almost sure that the data ultimately rule out every single one of them – they just switch to something else in which the data aren't sufficiently detailed which is why sufficiently spun, cherry-picked anecdotes may be temporarily used to replace the data. This approach is unscientific and despicable.

It's not too important scientifically or socially whether the first 2013 Atlantic hurricane starts to form before September 9th but I will be personally checking what's going on above the Atlantic Ocean, just for the fun of it. Will you?
Posted in climate, weather records | No comments

Democrats of Europe, wake up!

Posted on 3:59 AM by Unknown
Daniel Cohn-Bendit is a notorious Franco-German leftist.

In 1968, he was a fighter at the barricades of Paris; his nickname has been Danny the Red (Dany le Rouge) ever since. In the 1970s, he would "love" children in an "anti-authoritarian kindergarten", which is why he also fought in the 1980s for sex with children to be legalized. Germany's Green Party recently made a huge U-turn and now seems to claim that it was "unacceptable" to demand the legalization of paedophilia.

In a normal society, such a man would probably oscillate between a mental asylum and a prison, but we live in countries that have incorporated themselves into the European Union, so this chap is much more than a rank-and-file member of the European Parliament. He co-leads the Greens–Marxists in the EU legislative body and is planning to create a new, modern incarnation of the Communist International that should take over Europe.




That's the main "planned action" according to the 5-day-old article by him and Felix Marquard (also published by The New York Times),
The Fix For Europe: People Power,
where they plan to screw Europe and take the power from the people and democrats on the continent. Marquard is a dropout who did fine thanks to his rich parents and runs a P.R. agency – a spoiled brat Engels for the Marx-Bendit, I would say. In Poland, the article was reprinted under the headline Get rid of the nation state. Wow.

Normally, I would consider Danny the Red a marginal figure who is not worth an answer from many of us. His job in the EU Parliament is good for him (he announced that he won't run again in the May 2014 elections), but I would doubt that his power goes beyond his preposterous individual existence. That doesn't mean that his opinions and plans aren't shared by a scary percentage of the people. But I would just feel that this particular politician doesn't have the power to change things today.




But I may be wrong. Czech ex-president Václav Klaus, his aides, and a few pro-freedom European politicians and pundits such as Nigel Farage have initiated the following manifesto:
Democrats of Europe, Wake Up!
You may join the supporters by sending a kind e-mail to democracy@vkinstitute.cz.

Klaus explains that some modernization of the language notwithstanding, Cohn-Bendit's and Marquard's rant structurally mimics Marx's and Lenin's projects and/or a roadmap to rebuild the EU into a federal melting pot of nations analogous to the USSR.

See also Bruce Bawer's analysis of Cohn-Bendit's and Marquard's rant, "Europe's Would-Be Masters", written for Front Page Magazine.

I agree with Klaus that those people do think in a more or less isomorphic way to the Marxists, Leninists, and Stalinists. I do agree that they're not a negligible fringe group that may be ignored. I do agree that democrats must show their teeth before the left-wing radicals sink their teeth into the necks of our European nation states and their democratic regimes.

It's just not clear to me whether the planned creation of yet another radical party is something that crosses the red lines, something that should suddenly energize and unite the opposition to these dangerous plans across the old continent. But maybe I will change my opinion later today. ;-) So far, it just seems to me that he will at most regroup some extremist parties enjoying something like 5 percent in the EU.

Am I wrong?
Posted in Czechoslovakia, Europe, politics | No comments

Saturday, September 7, 2013

Confusions about the relationships of special relativity and general relativity

Posted on 5:35 AM by Unknown
Sabine Hossenfelder wrote about the confusions surrounding the relationship of Einstein's 1905 special theory of relativity and Einstein's 1915 general theory of relativity. Edward Pig Measure is one of the laymen who are somewhat confused; many others are vastly more confused.



First of all, I find it very important that all the discussions on the two blogs above are about physics that has been settled for 100 years – about the high-school understanding of relativity. I think it is desirable to emphasize this point because much of the confusion arises when complete crackpots such as Lee Smolin say or write totally wrong things about relativity and sell these totally wrong things as cutting-edge research.




Special relativity is a 108-year-old theory of space and time that correctly accounts for new phenomena that are known to occur when observers' speeds approach the speed of light. It is a principled theory that constrains what particular constructive theories of individual phenomena and their classes may say and what they mustn't say.




All other theories must be made compatible with the two postulates of special relativity:
  • Relativity postulate: the laws of physics have the same form in the coordinate systems of all observers moving by constant speeds in a constant direction (inertial frames)
  • Constancy of the speed of light: the speed of light is constant, \(c\), regardless of the speed of the source and the speed of the observer
Maxwell's theory of electromagnetism was actually compatible with those principles before relativity was found; that's why Einstein's good understanding of electromagnetism helped him discover special relativity. However, ordinary mechanics was only compatible with the first postulate (which is referred to as Galilean invariance in non-relativistic mechanics); it didn't respect the constancy of the speed of light because the speed of light was supposed to become \(c\pm v\) if an observer was moving relative to the aether – a preferred medium in which the speed of light is \(c\) (independently of the speed of the source!) – by the speed \(v\). The 1887 Michelson–Morley experiments made it clear that the speed was always \(c\), regardless of the speed of the observer.

So Einstein's special relativity primarily modified mechanics – mechanics was forced to change. For example, the relative speed of two bodies on a collision course on a line, whose individual speeds are \(u,v\), is no longer \(u+v\) but \((u+v)/(1+uv/c^2)\). And any other class of phenomena aside from mechanics and electrodynamics – hydrodynamics, aerodynamics, thermodynamics, etc. (although these are really theories derived from mechanics and perhaps electrodynamics, not fundamentally new ones) – had to be adjusted to agree with the two postulates. Particle physics only accepts theories that agree with relativity, too. For particle physics, this is so automatic – quantum field theory and string theory are the frameworks of choice and all of them are relativistic – that we don't even realize how much the possible "theories of particles" have been constrained by relativity.
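The velocity-addition formula above is easy to check numerically – a minimal sketch (units and the 0.8c test speeds are my illustrative choices, not from the text):

```python
C = 299_792_458.0  # speed of light in m/s

def add_velocities(u, v):
    """Relativistic composition of two collinear speeds u and v:
    (u + v) / (1 + u*v/c^2)."""
    return (u + v) / (1.0 + u * v / C**2)

# Two bodies approaching each other, each at 0.8c. Galilean mechanics
# would give a relative speed of 1.6c; relativity gives
# 1.6c / (1 + 0.64) = (40/41)c ≈ 0.9756c, safely below c.
w = add_velocities(0.8 * C, 0.8 * C)
assert w < C
print(w / C)  # ≈ 0.97561
```

Note that the formula reduces to the familiar \(u+v\) whenever \(uv \ll c^2\), which is why everyday mechanics never hinted at it.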

The postulates imply – and Einstein was able to prove from them – that the length of objects shrinks in the direction of motion; that the rate at which (any) clocks are "ticking" slows down as the clocks speed up; that the total relativistic mass increases with the speed; that its conservation law is merged with the momentum conservation law into the 4-momentum conservation law; and that the mass and energy conservation laws therefore become one and the same part of the 4-momentum law, so that (what we used to call) energy may be converted to (what we used to call) mass and vice versa via \(E=mc^2\), the best-known equation of relativity among laymen.

While mechanics (I really mean kinematics, the description of motion influenced by forces that are given and whose origin isn't analyzed) was adjusted to relativity in 1905 – it was the main point of it – the physics of gravity (the description of a particular force that causes the motion – and such descriptions belong to "dynamics", not "kinematics") remained mysterious because (among related problems), Newton's gravity seems to operate instantaneously which violates the speed limit, \(c\), that relativity imposes on the speed of propagation of any usable information.

Einstein spent the decade after the discovery of special relativity, 1905-1915, by attempts to reconcile the laws of gravity with the principles of special relativity. The result of this long but successful work, the general theory of relativity, pretty much inevitably and uniquely follows from special relativity (that is required to hold whenever the gravitational fields are negligible) and the equivalence principle (the statement that all bodies accelerate in gravitational fields by the same acceleration which means that freely falling frames are indistinguishable from the life outside gravitational fields, and must therefore locally preserve special relativity).

I will discuss GR as an unavoidable extension of SR momentarily. But let me first address a more trivial question:
Is SR applicable to phenomena in which objects accelerate?
The answer is, of course, Yes. Special relativity would be useless if it required all objects to move without any acceleration; after all, almost everything in the real world accelerates, otherwise the world would be useless. The correct claim similar to the proposition above is that the laws of special relativity take the same, simpler form in coordinate systems associated with non-accelerating observers. But that doesn't mean that we can't translate the predictions of a special relativistic theory to an accelerating frame. Yes, we can. It's as straightforward as a coordinate transformation. Fictitious forces will appear in the description. All of them are fully calculable.

We should point out that if it were impossible to consider accelerating observers, special relativity couldn't tell us anything about the twin "paradox". At least one of the twins, the astronaut, has to accelerate intensely during his life. But the total time measured by his clock – and by the aging of his organs, which is just another type of clock (not a very accurate one) – is clearly composed of the proper times of the tiny intervals into which his world line may be divided. The infinitesimal pieces of his world line are straight, so special relativity simply has to hold for them. When we compute, i.e. integrate, the total proper time along the world line, we will of course find that the twin-astronaut is younger than his brother who spent the decades on Earth.
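The integration described above – chopping the world line into tiny straight pieces and summing \(d\tau = dt\sqrt{1-v^2/c^2}\) – can be sketched numerically. The itinerary below (coasting at 0.8c with an instantaneous turnaround, 20 years of Earth time) is my own toy assumption for illustration:

```python
import math

C = 1.0  # units where c = 1, time in years

def proper_time(speed_of, t_total, n=100_000):
    """Sum d_tau = dt*sqrt(1 - v(t)^2/c^2) over tiny straight pieces
    of the world line, using the midpoint of each piece."""
    dt = t_total / n
    return sum(
        dt * math.sqrt(1.0 - (speed_of((i + 0.5) * dt) / C) ** 2)
        for i in range(n)
    )

# Toy itinerary (an assumption, not from the text): the astronaut moves
# at a constant |v| = 0.8c for the whole 20-year round trip, with an
# instantaneous turnaround, so sqrt(1 - 0.64) = 0.6 along the way.
tau = proper_time(lambda t: 0.8, 20.0)
print(tau)  # ≈ 12.0 years, vs 20 years for the stay-at-home twin
```

The asymmetry is geometric: only the astronaut's world line has a kink, and the kinked path through spacetime accumulates less proper time.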

We don't need general relativity because the presence of acceleration doesn't mean that there's a gravitational field. The curvature of the spacetime is still zero. Acceleration is locally equivalent to gravity by the equivalence principle but the clever way to use it isn't to envision unnecessary gravitational fields but, on the contrary, to undo the gravity whenever we can by replacing it with acceleration combined with no gravity – and for this combination, special relativity is sufficient.

Not being able to produce this right answer to the twin "paradox" means not understanding special relativity at the high-school level (at least we did learn the basics of special relativity at high school, a pretty ordinary high school). It's not wise, deep, clever, or sophisticated to be doubtful about the usual resolution of the twin "paradox". It is nothing more than a sign of brutal ignorance. (Christine Dantas is among those who believe that special relativity doesn't imply that the astronaut-twin will be younger, because acceleration supposedly makes it impossible to use the theory. Holy cow. This lady has had a full big mouth about quantum gravity while high-school physics is apparently way too hard for her.)

Now, let me switch to general relativity again. Sabine promotes a particular definition of special relativity:
Ask some theoretical physicist what special relativity is and they’ll say something like “It’s the dynamics in Minkowski space” or “It’s the special case of general relativity in flat space”. (Representative survey taken among our household members, p=0.0003). But open a pop science book and they’ll try to tell you special relativity applies only to inertial frames, only to observers moving with constant velocities.
I don't think that it is downright incorrect to describe special relativity in Sabine's way. But I don't think it's the deepest or most natural way, either. More importantly, I do agree with the criticized books that at least something in special relativity does apply to observers moving with constant velocities only – the Lorentz symmetry only mixes the viewpoints of these observers and, consequently, the laws of physics only have the usual simple form in the coordinate systems connected with these observers. The difference between inertial and non-inertial systems is essential in special relativity and if that's the claim that Sabine criticizes, she is completely wrong.

Moreover, her "definition" of special relativity is useless. A definition is meant to be helpful to someone whose knowledge is at a lower level than the level at which the defined object is "obvious". If someone doesn't know special relativity, you won't help him much if your explanation assumes the knowledge of general relativity because, you know, general relativity is harder than special relativity.

But there's another, more conceptual reason why I consider Sabine's definition to be a sign of her (and her spouse's, as we were told) shallow knowledge of the subject. What is the reason? Her definition implicitly says that general relativity is the fundamental set of insights, rules, and principles and special relativity is just a minor corollary of it. While it's true that special relativity is a limit of general relativity obtained for gravitational fields going to zero, the actual "hierarchy of power" is the opposite: general relativity is just one application of special relativity – the incorporation of the gravitational field in a special-relativity-invariant way. While general relativity is arguably the prettiest (and geometrically most non-trivial) classical application of the rules of special relativity, in principle it is on par with Yang-Mills theory or any other (special) relativistic field theory.

This claim of mine may be interpreted as a modern interpretation of the philosophy underlying relativity – and widely appreciated by most of the competent modern theoretical/phenomenological particle physicists (people who were clearly not included in Sabine's low-brow survey). But there's a sense in which it's ancient, too. What's the sense? Well, the insight is ancient because Einstein simply didn't have a choice when he was searching for a relativistic theory of gravity between 1905 and 1915. General relativity is the unique theory obeying the postulates of special relativity that describes the gravitational force – by which I mean a force (and we can prove that it's the force because such a force must be unique for a physical system) that respects the equivalence principle.

The gravitational field must be given by some components of the mass/energy/momentum-encoding stress-energy tensor. Because the strength of the field around a physical system as measured at infinity cannot change (in analogy with the field around a charge in electrostatics), it must be conserved quantities that source the gravitational field/influence. Because our goal is a gravitational force that depends on the mass, it's clearly the whole stress-energy tensor \(T_{\mu\nu}\) that must be involved in sourcing the gravitational field (\(T_{00}\) which must surely influence the gravitational field isn't a Lorentz-invariant quantity and the Lorentz transformations of this quantity involve all other components of the tensor). The corresponding "potentials" of the gravitational field must be organized as a symmetric tensor with two indices, too. It's \(h_{\mu\nu}\).

However, the derivatives of the field \(h_{\mu\nu}\) contribute to the energy as well, like the derivatives of any matter field. We are led to the question how the field sources itself. We're brutally constrained by the equivalence principle because physics in the \(h_{\mu\nu}\) field that linearly depends on the coordinates must be indistinguishable from physics outside any nonzero fields: a freely falling observer (in the linear \(h\)-field) mustn't be able to figure out that he's in a gravitational field at all.

This is only possible if there is a rather large symmetry that is able to identify configurations with different profiles of \(h_{\mu\nu}\) – to identify some configurations where this field is nonzero (and even non-constant) with the configuration where it's zero. So this symmetry must be mixing the gravitational field \(h_{\mu\nu}\) with something that was nonzero to start with. It must have the same tensor structure, and we conclude that it must be the pre-existing metric tensor \(\eta_{\mu\nu}\). The only symmetry that is able to produce the right number of symmetries acting on these metric tensors is the diffeomorphism symmetry under which the "total metric"\[
g_{\mu\nu}=\eta_{\mu\nu} + h_{\mu\nu}
\] transforms as a tensor field. So we're led to general relativity as the only possible (special) relativistic description of gravity that uses fields.

This was a sequence of arguments that tried to be as classical as possible. Modern particle physicists would present a similar but quantum-field-theory-based version of the ideas. Because it's locally sourced by the stress-energy tensor, gravity must involve spin-two fields. In the covariant, manifestly Lorentz-invariant description, spin-two fields have some positive-norm components \(h_{ij}\) and perhaps \(h_{00}\) (which will also be mostly killed, despite its good sign) and some negative-norm components \(h_{0i}\), ghosts. The latter are unacceptable because they lead to the prediction of negative probabilities for some processes. So there must exist a symmetry that decouples all the ghosts. The symmetry has to be local and have a whole "vector" of parameters at each spacetime point. Coordinate redefinitions \(\delta x^\mu\) are the only solution. For gravity in terms of quantum fields, you need spin-two fields, and the diffeomorphism invariance is necessary to get rid of their pathological, negative-norm components. The rest of GR follows; the Ricci scalar is the lowest-order (in the number of derivatives) coupling compatible with the required symmetry, but there may also be higher-order corrections (whose effect becomes negligible at long distances).

Some people would declare all the derivations above to be heresies because they think it is a blasphemy to ever write the metric tensor as a sum of two or several pieces because such a blasphemy contradicts the holy beauty of general relativity as written in an unwritten commandment somewhere. ;-) The price they pay for this medieval, unjustifiable, irrational, stupid taboo (the commandment really says "you shall never make your hands dirty by any science that actually applies to a situation in the real world or answers some questions beyond the questions whose answers you have been given to start with, by science that requires you to write anything else than the most beautiful form of the basic equations") is very high: They can't understand some key facts about modern physics, e.g. that and why the general theory of relativity is unavoidable given the validity of special relativity and the existence of gravity sourced by the energy-and-momentum density and their fluxes/currents.

Many people in the Backreaction discussion are confused about many other things.

For example, is a charged object sitting somewhere on the Earth's surface emitting electromagnetic and/or Unruh and/or gravitational radiation?

The answer is, of course, No. If it were radiating in any of the three ways (to be precise, by radiation I mean sending physical photons, gravitons, or other particles to infinity), it would have to lose energy to avoid the violation of the energy conservation law. But the charged object is already sitting at a place where the energy is minimized so there's no way to extract more energy out of the particle.

Relatively to a freely falling frame, the charged object sitting on the Earth's surface is accelerating so it should emit all three kinds of radiation, some people could argue. If it emits no radiation, doesn't it violate the equivalence principle?

No, it doesn't. First of all, the equivalence principle is only guaranteed locally. But in the previous paragraphs, we were asking whether particles are emitted to infinity. This requires us to connect the vicinity of the Earth with infinity, to compare them. But such a global connection turns the existence of the Earth's gravitational field into an objective fact. There exists no flat-space-based equivalent description of a region that would include both Earth's vicinity as well as the asymptotic region at infinity. So the equivalence principle isn't really applicable. There's no justifiable way to argue that the charged sitting object should emit radiation.

There are other ways to argue and reach the same conclusion.

For example, the equivalence principle identifies the experience of a freely falling observer with those of an inertial observer in the flat spacetime. But the identification only holds if "all other factors are equal". The freely falling observer who is going to hit the Earth's surface soon doesn't have "all other factors equal". In particular, there may be some extra radiation coming from the rest of the Universe. It just happens that the radiation is such that it perfectly cancels the would-be electromagnetic/Unruh/gravitational radiation of the charged object sitting on the Earth.

To make this discussion really complete, I would have to describe a formalism that has something to cancel at all and distinguish the different amounts of radiation as seen by a nearby static, nearby accelerating, or infinitely distant detector. The discussion could get unnecessarily messy and repetitive. But my point that shouldn't get lost in this technical material is that only the black holes emit the Hawking radiation. One actually needs the horizon for that. If there's no horizon, there's no energy loss by the Hawking or another acceleration-based radiation. (And this 1999 paper is just wrong. It's not the only one.)

Why does the horizon matter? If there's the horizon, one simple fact holds: the black hole interior can't possibly send any radiation (positive-energy one or a "compensating one") in the outward direction; nothing gets out of the black hole. That's why the frame of an observer who is freely falling into a black hole (with a horizon) is as equivalent to an inertial observer in an empty space as you can get. He could have been freely falling throughout his life which explains that no radiation was going in his direction.

On the other hand, there's no radiation going from the black hole interior, against him, either. It's forbidden by the blackness of the black hole. It's this latter property that doesn't hold for the Earth. The Earth imposes different boundary conditions on the surface than the black hole enforces on the event horizon. If the Earth were a conductor, the electrostatic potential would vanish on the surface. The relevant modes of waves would be standing waves above the Earth's surface. While the condition "killing" one-half of the modes in the black hole case says that "nothing is coming in the outward direction", the conditions are different for the Earth: "no waves are inside the conducting Earth". The latter condition is past-future-symmetric, unlike the condition for the black hole.

The vacuum is Unruh and electromagnetic radiation-free in the "most natural frames". For black holes, it's the freely falling frame because you can just freely fall and you will never notice that something is unnatural about that frame (the singularity kills you before you realize that). That's why there's no radiation in this frame while the frame of an observer keeping himself above the horizon by jets experiences Unruh radiation that penetrates through the black hole's gravitational field and becomes real, physical Hawking radiation at infinity.

For the Earth, the most "vacuum-like" frame is one associated with the surface because the freely falling observer will hit the Earth's surface and the headache will convince him it's not the frame most similar to the empty space. ;-) So the Earth stabilizes all the surrounding fields relatively to its static surface and relatively to this frame, there's no radiation – and frames accelerating relatively to the surface's frame will see some radiation. Of course, a semiclassical analysis of GR coupled to electromagnetism offers you a more reliable but less funny derivation of the same conclusion.

One should emphasize that the Unruh/Hawking radiation for the Earth, even if there were any, would be ludicrously weak. The typical wavelength of the emitted photons would be comparable to \(c^2/g\), which is about \(10^{16}\,{\rm meters}\), not far from a light year. It's clearly just an academic debate for the Earth, as the very weak radiation would be totally unobservable – dozens of orders of magnitude weaker than the observable one. But it would still be an inconsistency if stable objects and particles like that radiated because of some incorrectly applied equivalence principle.
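The \(c^2/g\) estimate is a one-liner to verify (constants are the standard values; the light-year conversion is for orientation):

```python
C = 299_792_458.0        # speed of light, m/s
G_ACC = 9.81             # Earth's surface gravity, m/s^2
LIGHT_YEAR = 9.4607e15   # meters

# Characteristic wavelength of would-be Unruh/Hawking photons for an
# acceleration equal to Earth's surface gravity: lambda ~ c^2 / g.
wavelength = C**2 / G_ACC
print(wavelength)                # ≈ 9.16e15 m
print(wavelength / LIGHT_YEAR)   # ≈ 0.97, indeed "not far from a light year"
```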



Off-topic: Mr Ilja Hurník (*1922) died. He was a serious Czech composer of highly non-classical music for classical instruments and a piano virtuoso, but people like your humble correspondent know him as the author of small pieces such as the "Merry Postman" (yes, he's ringing the bell and knocking on the door) and "Little Soldier" above, which I liked to play when I was 8 or so. ;-)
Posted in science and society, string vacua and phenomenology, stringy quantum gravity | No comments

Friday, September 6, 2013

Yo-yo banned in Syria

Posted on 9:35 PM by Unknown
Blamed for drought by Muslims

BEIRUT (Syria), [date]. Drought and severe cold is disastrously affecting the cattle in Syria, and the Muslim chiefs at Damascus have attributed the wrath of the heavens to the recent introduction of the yo-yo.

They say that while the people are praying for rain to come down from above the yo-yo goes down, and before reaching the ground springs up through the subtle pull of the string.




The chiefs interviewed the Prime Minister, and exposed the evil influence of yo-yos, so they were immediately banned.




Today the police paraded the streets and confiscated the yo-yos from everyone they saw playing with them.



Source: Barrier Miner (Broken Hill, New South Wales: 1888 - 1954), 23 January 1933

The censored date at the top was January 21st. I changed "Moslems" to "Muslims" so that it's not clear from the beginning that the text is ancient. Educated readers realized that anyway because Beirut hasn't belonged to "Syria" since 1943 (the correct name of the state should have been The French Mandate for Syria and the Lebanon, anyway) and because Muslim chiefs will only return to command police in Damascus after the possibly looming war.

At any rate, the similarity with the IPCC-backed religion cannot be overlooked. People see two things happening at about the same time (one of them has been happening for billions of years but they don't care), they decide that correlation and even coincidence are the same thing as causation, and the rest is just about making sure that the law enforcement forces enforce this deep life-saving insight. ;-)

The yo-yos probably resemble the rain droplets. Before they can reach the ground thanks to the gravity of the prayers, the force from an evil string returns them to the clouds. The springs, if any, stretched between the clouds and the rain droplets may look too weak but that's OK because they're surely strengthened by positive feedbacks. The debate is over.

Thanks to Steve Goddard for the URL
Posted in climate, Middle East | No comments

Snowden: Internet encryption useless against eyes in NSA, GCHQ

Posted on 3:01 AM by Unknown
HTTPS, SSL, and VoIP only good against little fish

Edward Snowden has provided The New York Times and The Guardian and others with some eye-catching revelations:
N.S.A. Able to Foil Basic Safeguards of Privacy on Web (NYT)

NSA and GCHQ unlock privacy and security on the internet (Guardian)
The two U.S. and U.K. intelligence agencies "are investing in groundbreaking cryptanalytic capabilities to defeat adversarial cryptography and exploit internet traffic", according to the Director of National Intelligence, who was quoted in the latest Snowden document about the $52 billion black budget. See also the skeptical, technically sophisticated remarks by Wired. HTTPS, SSL, and VoIP are no longer safe; correctly implemented strong encryption still seems fine.



Of course, I got a bit excited: Have the agents finally built operational quantum computers? Have they made some progress that proves that \(P=NP\), after all?




Well, not really. The subtitle of the article in The Guardian already makes it clear that the weapons the agencies use aren't some groundbreaking advances in quantum computation or classical algorithms. Instead, they abuse the weaknesses of the human factor. Some big progress occurred in 2010, we're told.




So it seems that $250 million is spent every year to "encourage" the tech companies to insert weaknesses (backdoors and trapdoors) into their products. I suppose that to decrypt a message using state-of-the-art encryption programs, you either need to input a very long nonsensical sequence of characters that changes every day or you have to type "My name is Bond, James Bond". ;-) This sounds like a joke but it may be very close to the truth, too. The NSA influences international agreements about encryption standards. Lots of supercomputers are running to break the codes by brute force but this hard work would be useless if the agencies didn't have secret agreements with folks in the tech companies.
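To see why the backdoors matter more than the supercomputers, here is a crude toy estimate of my own (the \(10^{18}\) trial decryptions per second is a generously invented rate, not a number from any leaked document) of how long an exhaustive search of a symmetric keyspace would take:

```python
# Toy brute-force estimate: time to exhaust a 2**bits keyspace
# at an assumed (very generous) rate of 1e18 trials per second.
rate = 1e18                     # trial decryptions per second (assumption)
seconds_per_year = 3.156e7

for bits in (56, 80, 128, 256):
    years = 2**bits / rate / seconds_per_year
    print(f"{bits}-bit key: ~{years:.2e} years to exhaust")
```

Even at that absurd rate, a 128-bit key takes of order \(10^{13}\) years, so subverted standards and implementations are the cheaper attack.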

Analysts aren't allowed to ask or speculate about the sources of the data or the methods used to make the data readable. Having watched many secret-agent movies, I won't ask or speculate, either. The NSA claims that without this control, the U.S. couldn't keep access to cyberspace unrestricted. This claim surely sounds tough but it may have a point, too. A GCHQ team works with the "big four": Google, Facebook, Hotmail, Yahoo.

Well, as long as I feel that those agencies don't use their behind-the-scenes powerful tactics to harm free individuals for something that should always remain legal, I find the reports above just a little bit chilling. On the other hand, every capability or influence may be abused and what we're hearing about seems to be an extraordinary set of powers, indeed. It still sounds a bit more plausible when these powers belong to institutions whose composition may be refreshed according to the desires of the American (and British) voters.



Does this serious Gentleman have his own capabilities, too?

The British GCHQ seems to be among the "top two". That couldn't stop Dmitry Peskov, a Putin spokesman, from overlooking Northern Ireland and calling the United Kingdom "just a small island no one listens to" that plays no major role in world politics and whose Chelsea and other upmarket London districts are being bought up by Russian oligarchs. Cameron et al. claim that they believe that the U.K. continues to be a superpower. It's up to you to decide whose perspective is more ludicrous. ;-)
Posted in computers, mathematics, politics | No comments

Thursday, September 5, 2013

A universal derivation of Bekenstein-Hawking entropy from topology change, ER-EPR

Posted on 2:16 AM by Unknown
I have been intrigued by topology change in quantum gravity, especially its Euclidean version, for 15 years or so. Since the beginning, I liked a sketch of a derivation (that I invented) of the Bekenstein-Hawking entropy of a black hole that was based on a wormhole connecting two tips of the Euclidean black hole in the \(rt_E\) plane.



Ignore the wormhole-related captions.

Before the ER-EPR correspondence, I would interpret the two planes on the picture above (lower, upper) as the spacetimes in the ket vector and the bra vector, respectively. This need to double and complex-conjugate the whole spacetime made the details of the argument confusing because the thermal calculation (which is inevitably connected with the cigar-like Euclidean black hole pictures) involves a trace over ket vectors (or bra vectors, but not both).

Fortunately, one may now present the whole argument without any bra vectors. Thanks to Maldacena and Susskind, the doubling of the spacetime (note that there is an upper and a lower plane on the picture above) may be interpreted as the presence of two distinct spacetimes or of two faraway regions of the same spacetime; it won't really make a difference. With this reinterpretation of the pictures, I am more satisfied with the argument.




Try to calculate a thermal correlation function in a spacetime (or in a pair of spacetimes, if you really view the two planes as disconnected) at temperature \(1/\beta\), which will be chosen to agree with the black hole temperature below. The operators in the correlation functions don't matter; assume that they are low-energy operators far away from all the celestial bodies we will consider.

We want to know how much the states with two black holes at places \(A,B\) (in arbitrary microstates) contribute to the correlator; and how much the states with two neutron stars at the same places \(A,B\) contribute. The ratio of the two contributions should be \(\exp(S_A+S_B)\) where the terms in the exponent are the black hole entropies, up to some subleading corrections (all the neutron stars' entropies will be negligible). Just to be sure, the contribution from the two black holes should be exponentially larger. I will take the two celestial objects to be macroscopically identical so the ratio should be \(\exp(2S)\) where \(S=S_A=S_B\).




To confirm the Bekenstein-Hawking formula means to prove that the contribution from the two black holes is \(\exp(2A/4G) = \exp(A/2G)\) times greater than the contribution from the two neutron stars.

By the neutron stars (a nickname chosen for the sake of simplicity), I really mean celestial bodies that are on the verge of collapsing into black holes. I want \(g_{00}\) to be very small right above the surface of such a body. Because \(|g_{00}|\) would have to be even smaller in the stellar interior, a solid star can't have \(|g_{00}|\) near zero right above its surface, so I really need to consider a hollow star – a shell that is protected against the collapse by some skeleton or light gas inside or whatever. I hope that these awkward technicalities don't really matter and can be replaced by a less problematic treatment. Maybe it's enough to compare the two-black-hole contribution with the contribution having no objects at those places at all.

For the sake of clarity, let's assume that the black hole radii are equal to a few miles (a solar-mass black hole). The thermal correlators may be calculated from the path integrals\[

\langle \cdots \rangle = \int {\mathcal D}\,{\rm fields}(x,y,z,t_E)\,\exp(-S_E)\, (\cdots )

\] over the Euclidean geometries with Euclidean field configurations in a spacetime whose Euclidean time coordinate \(t_E\) has the periodicity \(\beta\).

Now, we don't want to study the detailed microscopic physics of the neutron stars. Their entropy (and any non-black-hole celestial object's entropy) is negligible in comparison with the black hole entropy. We don't even want to specify what exact short-distance degrees of freedom are responsible for the black hole entropy. Indeed, the goal is to derive the Bekenstein-Hawking formula "universally", for every quantum theory that resembles quantized general relativity in a limit.

But yes, in this geometrized picture of the degrees of freedom, all the entropy is carried by some degrees of freedom – field modes and their generalizations – that may be attached to the stretched horizon, a Planckian vicinity of the region that will host a throat in a minute.

To neglect the short-distance physics, why don't we integrate out all the field modes with wavelengths shorter than 1 millimeter (to be specific again)? When you do so, the two-neutron-star contribution looks like two disconnected pieces, essentially two planes (the upper and lower plane) not connected by the throat shown on the picture at the top. Even if there has been some entanglement between the stars, it was way too weak to produce the smooth throat. Instead, the thin tunnels disappeared as we integrated the high-energy degrees of freedom out. The stellar interior isn't clearly shown on the picture – the picture only shows the stellar exterior – but it's somewhere and the Ricci scalar \(R\) is essentially zero everywhere. Again, maybe I should replace the neutron stars by empty regions of space throughout this argument; I wanted the two compared situations (with and without black holes) to be as similar as possible, however, so that the difference may be blamed on the throat, as we will see momentarily.

What about the two-black-hole contribution?

Maldacena and Susskind taught us that the Hilbert space of 2 similar black holes – essentially \(\mathcal{H}_{2BH}=\mathcal{H}_{1BH}\otimes \mathcal{H}_{1BH}\) – is isomorphic to (really the same as) the Hilbert space of an Einstein-Rosen bridge geometry that connects them. Despite the apparently different topologies of the two descriptions, they're the same Hilbert spaces. The bridge-based description is better for highly entangled states in the Hilbert space; the description with 2 isolated black holes is better for the nearly unentangled states of the two black holes. (Note that "highly entangled states" and "almost unentangled states" don't form linear spaces because the properties "entangled" and "unentangled" aren't closed under addition.) The two-black-hole states that strongly entangle the two black holes look like smooth bridges; however, there are highly excited, non-smooth bridges that must describe all the other two-black-hole microstates as well.

In general, the two planes – see the picture at the top – are connected by "some" throat. When you integrate out all the field modes with wavelengths shorter than one millimeter, you also do it for the gravitational modes, so the geometry can't be too thin or too curved. In effect, the gradual integrating out thickens the throat in the black-hole case while it cuts the throat(s) in the stellar case. When you're finished, the throat itself is about one millimeter thick. It was a randomly chosen distance scale that is much longer than the Planck scale but much shorter than the black hole radius.

Looking at the two-neutron-star and the two-black-hole Euclidean geometries, they look very similar. The only difference is the throat near the event horizon (or near the would-be event horizon in the case of the stars). In that region, the \((d-2)\)-dimensional area of the angular variables is a constant, \(A\), which simply enters as an overall factor in the difference of the actions, and the major components of the curvature tensor exist only in the two-black-hole case, namely the Riemann component \(R_{rtrt}\) and its three copies dictated by the Riemann tensor's symmetries (\(t\) really denotes \(t_E\) as an index).

(The throat in the black-hole case isn't Ricci-flat; the nonzero Ricci tensor must be blamed on the high-energy matter that resides in the stretched horizon(s).)

So the two contributions to the path integral – from the two neutron stars; and from the two black holes – only differ by the extra "wormhole" in the two-black-hole case. This wormhole is a "handle" of a Riemann surface and the exponent of the Euclidean path integral is more negative in the black-hole case (I hope), relative to the neutron-star case, by the factor\[

\exp[-(S_E^{\rm BH}-S_E^{\rm neut})] =
\exp\left(\!-\frac{A\int d^2 x\sqrt{|g|}R_{(2)}}{16\pi G}\right)=\dots

\] over the handle (wormhole). But the two-dimensional integral – the Einstein-Hilbert action above – is proportional to the Euler characteristic\[

\chi = \frac{1}{4\pi}\int d^2 x\,\sqrt{|g|}R_{(2)}.

\] Note that a sphere of radius \(a\) has \(R_{(2)}=2/a^2\) and \(\chi=2\). Each added handle (which has a negative curvature \(R_{(2)}\) on average) reduces the Euler characteristic by two and (therefore) the integral of \(\sqrt{|g|}R_{(2)}\) by \(8\pi\). When you substitute this \(8\pi\) decrease above, it becomes an increase of the exponent due to the extra minus sign in the exponent and you will see that the two-black-hole contribution is greater by the factor of \[

\exp\left( \frac{A\cdot 8\pi}{16\pi G} \right) = \exp\left( \frac{A}{2G} \right),

\] exactly as expected from the Bekenstein-Hawking entropy of two black holes. This multiplicative increase implies that there are \(\exp(A/4G)\) black hole microstates per black hole whose precise identity doesn't significantly affect the correlator we agreed to compute. So if we trace over them (and we do so in a thermal calculation), they just influence the result by the simple multiplicative factor (the number of these microstates).
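Two quick numerical checks of the ingredients above (my own arithmetic with standard textbook constants, not part of the original argument): the Gauss-Bonnet normalization that gives \(\chi=2\) for a round sphere, and the size of \(A/4G\) for the solar-mass black hole chosen earlier:

```python
import math

# Check 1: Gauss-Bonnet for a round sphere of radius a:
# R2 = 2/a^2 and area = 4*pi*a^2, so chi = (1/4pi)*area*R2 = 2.
a = 1.7                                  # arbitrary radius
chi = (1 / (4 * math.pi)) * (4 * math.pi * a**2) * (2 / a**2)

# Check 2: Bekenstein-Hawking entropy S = A/(4G) in natural units,
# i.e. A/(4 l_P^2) in units of Boltzmann's constant, for one solar mass.
G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34   # SI units
M = 1.989e30                             # kg, one solar mass
r_s = 2 * G * M / c**2                   # Schwarzschild radius, ~3 km
A = 4 * math.pi * r_s**2                 # horizon area
l_P_sq = hbar * G / c**3                 # Planck length squared
S = A / (4 * l_P_sq)                     # ~1e77, a huge dimensionless entropy
print(chi, r_s, S)
```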

You may have some doubts about the sign of the Euclidean Einstein-Hilbert action used above. I have some doubts as well. I can enumerate about 6 things one must be careful about that may lead you to a wrong sign but I am not sure whether I am not missing some other sign flips. The probability that I keep on committing a sign error here is too close to 50 percent at the end ;-) which is why I must add that a more careful scrutiny is needed.

This argument may arguably be generalized to derive Wald's entropy formula for a more general action including higher-derivative terms. In these cases, one still has a contribution \(R_{rtrt}=2\pi\,\delta^{(2)}(r,t_E)\) per black hole, localized at the horizon, and if we treat this modification of the Riemann tensor perturbatively, the change of the gravitational action produces Wald's entropy formula instead of the Bekenstein-Hawking formula above.

Incidentally, I think that quite generally, the black hole entropy must also be interpretable as the total order/volume of an approximate symmetry group of the given spacetime because a black hole may be interpreted as a codimension-2 "cosmic string" in the Euclidean spacetime (which is analogous to 7-branes in F-theory and requires us to study the monodromies). But the explanation why this gives the right results in weakly coupled string theory (where you have a \(U(1)\) for each free field-theory mode produced by the string theory); in pure \(AdS_3\) with the monster symmetry group; and in BTZ-black-hole-based \(AdS/CFT\) calculations will be reserved for other blog entries, much like the connections of the ideas above with the representation of microstates as Mathur's fuzzballs.

String/M-theory gave us numerous pictures of the microscopic structure of the black holes. Those usually make it hard to see the locality in the bulk (and even hard to see into the black hole interior) and difficult to assign the degrees of freedom to the locations in the bulk. While unitarity etc. is manifest in these string/M-theoretical pictures, various geometric properties are less clear. Realizations such as the text above are meant to clarify all the remaining secrets of the black holes that are "universal" and independent of the microscopic description of the black holes.
Posted in stringy quantum gravity | No comments

Wednesday, September 4, 2013

Nathaniel Craig's State of the SUSY Union address

Posted on 10:50 AM by Unknown
I have known Nathaniel Craig since he was a brilliant Harvard undergraduate who was attending graduate courses – at least my string theory course (I believe he was the best student in the room). This young Gentleman has written 37 papers or preprints (if you subtract some namesakes) and the last one among them is sufficiently pedagogic for you to be interested in it:
The State of Supersymmetry after Run I of the LHC
These 71 pages are based on his talk at a June 2013 workshop.




The first section is an introduction. At the end of it, on page 5, Nathaniel summarizes 5 positive reasons why the LHC has strengthened our belief that SUSY is right and relevant and 1 way in which it has weakened the belief.

In the following section, he discusses the expectations – naturalness and parsimony (essentially minimality) of the right supersymmetric models. The section is summarized by an expected ordering of the superpartners' masses and reasons for this ordering.




The third section is about our knowledge, especially various limits. In this section, you start to encounter lots of hand-drawn yet colorful cartoons that reproduce various graphs that you might think only computers can draw well. ;-) Colored, electroweak, third-generation, and Higgs-related superpartners are given special attention.

The fourth section is about indirect limits, mainly ones from various rare decays.

Implications of the Higgs and its suddenly known (SUSY-compatible) mass as well as Standard-Model-like couplings are discussed in Section 5.

There have been no signals proving SUSY reported by the LHC yet. This disfavors the minimal naive models and Nature reconciles SUSY with the observations in at least one of two ways: by breaking the signal relative to the most visible naive models or by breaking the spectrum.

Section 6 is dedicated to the breaking of the signal. It's harder to see SUSY if the spectrum is compressed, if SUSY is stealth, or if SUSY is R-parity-violating. A compressed spectrum means that the LSP isn't much lighter than the colored superpartners. If that's so, not many energetic particles may be produced when the colored superpartners decay to the LSP and something else. Moreover, the missing transverse momentum tends to cancel because the two LSPs inherit it from the oppositely moving colored superpartners.
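A crude toy illustration of that cancellation (my own two-body sketch with made-up masses; I'm approximating the LSP's share of its parent's momentum by the naive fraction \(m_{\rm LSP}/m_{\rm parent}\) and ignoring all other decay kinematics): when the LSP is nearly degenerate with its parent, the two invisible momenta are nearly back-to-back and the visible jets are soft.

```python
import numpy as np

# Toy: two colored superpartners produced back-to-back in the
# transverse plane; in the compressed limit each LSP inherits
# roughly the fraction m_LSP/m_parent of its parent's momentum.
m_parent, m_lsp = 1000.0, 950.0           # GeV (made-up, nearly degenerate)
f = m_lsp / m_parent                      # naive momentum fraction

p = np.array([300.0, 0.0])                # parent 1 transverse momentum, GeV
jet1, jet2 = (1 - f) * p, (1 - f) * (-p)  # soft visible jets
met = -(jet1 + jet2)                      # missing pT = minus the visible sum

print(np.linalg.norm(met))                # invisible momenta cancel exactly here
print(np.linalg.norm(jet1))               # 15 GeV: a jet too soft to cut on
```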

Stealth supersymmetry has a light LSP (usually outside the MSSM) which decays into something truly "almost invisible", like a light gravitino, plus its R-even partner which is just a bit lighter than the LSP. This R-even particle subsequently decays into well-known SM particles so almost nothing new – and, more importantly, almost no missing energy – is produced in the reaction.

R-parity violation makes it harder to economically explain dark matter and may worsen the problems with the proton decay. For the latter reason, the RPV operators should still preserve either the lepton or the baryon number. SUSY becomes less visible because the (new) superpartners may decay completely into SM particles again.



Section 7 is about the breaking of the spectrum. "Natural SUSY" became a newly recycled term for SUSY models where only the particles that are "really needed" for the lightness of the Higgs' being are light – especially the third-generation squarks (primarily the stops). Light stops have been discussed on TRF many times, of course. This lightness of the third-generation superpartners should ideally be connected with the heaviness of the third generation of SM fermions. Such models are OK with the LHC data because the data still allow light third-generation squarks and sleptons; and there's nothing unnatural about the heavy (and safely LHC-compatible) first two generations of sfermions. Nathaniel discusses various strategies to obtain natural SUSY models by choices in the mediation.

By supersoft SUSY, he means a different way of breaking the spectrum. The squarks of all generations are comparably light but the gluino is much heavier, which is enough to suppress the production of superpartners at the LHC (which is mostly performing gluon-gluon collisions, from a microscopic perspective). This would be unnatural in the minimal models but it's OK if the gluino is a Dirac particle, something that I like, so it's been repeatedly discussed on this blog.

Nathaniel discusses one more unusual way of breaking the spectrum, folded or colorless SUSY, in which the relevant superpartners don't carry any color, unlike their known SM partners. I don't understand how this could be possible and will study it tonight. (I see, they're just some non-SUSY models that also cancel the quadratic divergences but in a more general way. This looks contrived to me and the only way string theory could endorse such things is via some non-supersymmetric orbifolds.)

Focus point SUSY – another way to break the spectrum, a way that is considered rubbish by Nima Arkani-Hamed, by the way – also gets a special subsection.

The final subsection of Section 7 is about minisplit SUSY – going in the direction of split SUSY by Arkani-Hamed et al. but not that extreme. In this approach, one sacrifices naturalness but tries to respect all the other attractive conditions.

The final Section 8 is dedicated to thoughts about the future and Nathaniel's recommendations on how people should approach the post-2015 LHC run at 13 or 14 TeV. The acknowledgements and 93 references are the only other things awaiting you after that section.
Posted in string vacua and phenomenology, stringy quantum gravity | No comments