Sunday, June 04, 2023

Penrose's "Missing" Link Between The Physics Of The Large And The Physics Of The Small

wikipedia  |  The Penrose interpretation is a speculation by Roger Penrose about the relationship between quantum mechanics and general relativity. Penrose proposes that a quantum state remains in superposition until the difference of space-time curvature attains a significant level.[1][2][3]

Penrose's idea is inspired by quantum gravity, because it uses both the physical constants ħ and G. It is an alternative to the Copenhagen interpretation, which posits that superposition fails when an observation is made (but that it is non-objective in nature), and the many-worlds interpretation, which states that alternative outcomes of a superposition are equally "real", while their mutual decoherence precludes subsequent observable interactions.

Penrose's idea is a type of objective collapse theory. For these theories, the wavefunction is a physical wave, which experiences wave function collapse as a physical process, with observers not having any special role. Penrose theorises that the wave function cannot be sustained in superposition beyond a certain energy difference between the quantum states. He gives an approximate value for this difference: a Planck mass worth of matter, which he calls the "'one-graviton' level".[1] He then hypothesizes that this energy difference causes the wave function to collapse to a single state, with a probability based on its amplitude in the original wave function, a procedure derived from standard quantum mechanics. Penrose's "'one-graviton' level" criterion forms the basis of his prediction, providing an objective criterion for wave function collapse.[1] Despite the difficulties of specifying this in a rigorous way, he proposes that the basis states into which the collapse takes place are mathematically described by the stationary solutions of the Schrödinger–Newton equation.[4][5] Recent work indicates an increasingly deep inter-relation between quantum mechanics and gravitation.[6][7]
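Written out, the Schrödinger–Newton equation referred to above is the ordinary Schrödinger equation with an added Newtonian potential sourced by the wave function's own mass density:

```latex
i\hbar \frac{\partial \psi(\mathbf{r},t)}{\partial t}
  = -\frac{\hbar^2}{2m} \nabla^2 \psi(\mathbf{r},t)
    - G m^2 \left( \int \frac{|\psi(\mathbf{r}',t)|^2}{|\mathbf{r}-\mathbf{r}'|} \, d^3r' \right) \psi(\mathbf{r},t)
```

The stationary solutions Penrose proposes as collapse basis states follow from the usual ansatz \psi(\mathbf{r},t) = e^{-iEt/\hbar}\,\phi(\mathbf{r}).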

Accepting that wavefunctions are physically real, Penrose believes that matter can exist in more than one place at one time. In his opinion, a macroscopic system, like a human being, cannot exist in more than one place for a measurable time, as the corresponding energy difference is very large. A microscopic system, like an electron, can exist in more than one location significantly longer (thousands of years), until its space–time curvature separation reaches the collapse threshold.[8][9]

In Einstein's theory, any object that has mass causes a warp in the structure of space and time around it. This warping produces the effect we experience as gravity. Penrose points out that tiny objects, such as dust specks, atoms and electrons, produce space-time warps as well. Ignoring these warps is where most physicists go awry. If a dust speck is in two locations at the same time, each one should create its own distortions in space-time, yielding two superposed gravitational fields. According to Penrose's theory, it takes energy to sustain these dual fields. The stability of a system depends on the amount of energy involved: the higher the energy required to sustain a system, the less stable it is. Over time, an unstable system tends to settle back to its simplest, lowest-energy state: in this case, one object in one location producing one gravitational field. If Penrose is right, gravity yanks objects back into a single location, without any need to invoke observers or parallel universes.[2]

Penrose speculates that the transition between macroscopic and quantum states begins at the scale of dust particles (whose mass is close to a Planck mass). He has proposed an experiment to test this theory, called FELIX (free-orbit experiment with laser interferometry X-rays), in which an X-ray laser in space is directed toward a tiny mirror and split by a beam splitter from tens of thousands of miles away, with which the photons are directed toward other mirrors and reflected back. One photon will strike the tiny mirror while moving to another mirror and move the tiny mirror back as it returns, and according to conventional quantum theories, the tiny mirror can exist in superposition for a significant period of time. This would prevent any photons from reaching the detector. If Penrose's hypothesis is correct, the mirror's superposition will collapse to one location in about a second, allowing half the photons to reach the detector.[2]

However, because this experiment would be difficult to arrange, a table-top version that uses optical cavities to trap the photons long enough for achieving the desired delay has been proposed instead.[10]
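A quick order-of-magnitude check of the "dust particle ≈ Planck mass" claim above (a sketch; the grain radius and density below are illustrative assumptions, not values from the source):

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G    = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8     # speed of light, m/s

# Planck mass: the scale at which Penrose places the quantum-classical transition
m_planck = math.sqrt(hbar * c / G)   # about 2.18e-8 kg, i.e. ~22 micrograms

# Illustrative dust grain: 0.1 mm radius, density 2000 kg/m^3 (silicate-like)
radius  = 1e-4     # m
density = 2000.0   # kg/m^3
m_grain = density * (4 / 3) * math.pi * radius**3

print(f"Planck mass: {m_planck:.3e} kg")
print(f"Dust grain:  {m_grain:.3e} kg")
print(f"ratio grain/Planck: {m_grain / m_planck:.2f}")
```

A large dust grain indeed lands within an order of magnitude of the Planck mass, which is why Penrose places the transition at roughly this scale.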

 

Saturday, June 03, 2023

Why Quantum Mechanics Is An Inconsistent Theory

wikipedia  | The Diósi–Penrose model was introduced as a possible solution to the measurement problem, where the wave function collapse is related to gravity. The model was first suggested by Lajos Diósi when studying how possible gravitational fluctuations may affect the dynamics of quantum systems.[1][2] Later, following a different line of reasoning, R. Penrose arrived at an estimation for the collapse time of a superposition due to gravitational effects, which is the same (within an unimportant numerical factor) as that found by Diósi, hence the name Diósi–Penrose model. However, it should be pointed out that while Diósi gave a precise dynamical equation for the collapse,[2] Penrose took a more conservative approach, estimating only the collapse time of a superposition.[3]

It is well known that general relativity and quantum mechanics, our most fundamental theories for describing the universe, are not compatible, and the unification of the two is still missing. The standard approach to overcoming this situation is to try to modify general relativity by quantizing gravity. Penrose suggests an opposite approach, what he calls "gravitization of quantum mechanics", where quantum mechanics gets modified when gravitational effects become relevant.[3][4][9][11][12][13] The reasoning underlying this approach is the following: take a massive system prepared in a well-localized state in space. In this case, since the state is well-localized, the induced space–time curvature is well defined. According to quantum mechanics, because of the superposition principle, the system can be placed (at least in principle) in a superposition of two well-localized states, which would lead to a superposition of two different space–times. The key idea is that since the space–time metric should be well defined, nature "dislikes" these space–time superpositions and suppresses them by collapsing the wave function to one of the two localized states.

To set these ideas on a more quantitative ground, Penrose suggested that a way for measuring the difference between two space–times, in the Newtonian limit, is

\Delta E_g = \frac{1}{8\pi G} \int d^3r \, \left[ \mathbf{g}_1(\mathbf{r}) - \mathbf{g}_2(\mathbf{r}) \right]^2     (9)

where \mathbf{g}_i(\mathbf{r}) is the Newtonian gravitational acceleration at the point \mathbf{r} when the system is localized around the i-th position. The acceleration can be written in terms of the corresponding gravitational potential \Phi_i, i.e. \mathbf{g}_i = -\nabla \Phi_i. Using this relation in Eq. (9), together with the Poisson equation \nabla^2 \Phi_i = 4\pi G \mu_i, with \mu_i the mass density when the state is localized around the i-th position, and its solution, one arrives at

\Delta E_g = \frac{G}{2} \int d^3r \int d^3r' \, \frac{\left[ \mu_1(\mathbf{r}) - \mu_2(\mathbf{r}) \right] \left[ \mu_1(\mathbf{r}') - \mu_2(\mathbf{r}') \right]}{|\mathbf{r} - \mathbf{r}'|}     (10)

The corresponding decay time can be obtained from the Heisenberg time–energy uncertainty relation:

\tau = \frac{\hbar}{\Delta E_g}     (11)

which, apart from a factor simply due to the use of different conventions, is exactly the same as the decay time derived by Diósi's model. This is the reason why the two proposals are named together as the Diósi–Penrose model.
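To get a feel for the time scales this predicts, one can evaluate the collapse time for a small sphere superposed at two widely separated locations. In the convention where the energy in Eq. (10) is the gravitational self-energy of the difference between the two mass distributions, a homogeneous sphere of mass m and radius R gives ΔE_g ≈ (6/5)Gm²/R in the limit of large separation. The sketch below uses an illustrative 100 nm nanoparticle (the size and density are assumptions, not values from the source):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2

# Illustrative system: a 100 nm silica-like nanosphere
radius  = 1e-7           # m
density = 2000.0         # kg/m^3
mass    = density * (4 / 3) * math.pi * radius**3

# Gravitational self-energy of the difference between the two mass
# distributions; for separation >> radius the double integral in Eq. (10)
# reduces to (6/5) G m^2 / R for a homogeneous sphere.
delta_E = (6 / 5) * G * mass**2 / radius

# Eq. (11): decay time from the time-energy uncertainty relation
tau = hbar / delta_E

print(f"mass    = {mass:.3e} kg")
print(f"Delta E = {delta_E:.3e} J")
print(f"tau     = {tau:.0f} s (~{tau / 60:.0f} minutes)")
```

Even a 100 nm particle takes roughly half an hour to collapse on this estimate, while macroscopic masses collapse essentially instantly and electrons survive for astronomical times, consistent with the qualitative picture above.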

More recently, Penrose suggested a new and quite elegant way to justify the need for a gravity-induced collapse, based on avoiding tensions between the superposition principle and the equivalence principle, the cornerstones of quantum mechanics and general relativity. In order to explain it, let us start by comparing the evolution of a generic state in the presence of a uniform gravitational acceleration \mathbf{g}. One way to perform the calculation, what Penrose calls the "Newtonian perspective",[4][9] consists in working in an inertial frame, with space–time coordinates (\mathbf{x}, t), and solving the Schrödinger equation in the presence of the potential V(\mathbf{x}) = -m \, \mathbf{g} \cdot \mathbf{x} (typically, one chooses the coordinates in such a way that the acceleration is directed along the z axis, in which case V(z) = -mgz). Alternatively, because of the equivalence principle, one can choose to go to the free-fall reference frame, with coordinates (\mathbf{X}, T) related to (\mathbf{x}, t) by \mathbf{X} = \mathbf{x} - \tfrac{1}{2} \mathbf{g} t^2 and T = t, solve the free Schrödinger equation in that reference frame, and then write the results in terms of the inertial coordinates (\mathbf{x}, t). This is what Penrose calls the "Einsteinian perspective". The solution \Psi^{(N)} obtained in the Newtonian perspective and the solution \Psi^{(E)} obtained in the Einsteinian perspective are related to each other by

\Psi^{(N)}(\mathbf{x}, t) = e^{\frac{im}{\hbar} \left( \mathbf{g} \cdot \mathbf{x} \, t - \frac{1}{6} g^2 t^3 \right)} \Psi^{(E)}(\mathbf{x}, t)     (12)

Since the two wave functions are equivalent apart from an overall phase, they lead to the same physical predictions, which implies that there are no problems in this situation, when the gravitational field always has a well-defined value. However, if the space–time metric is not well defined, then we will be in a situation where there is a superposition of a gravitational field corresponding to the acceleration \mathbf{g} and one corresponding to the acceleration \mathbf{g}'. This does not create problems as long as one sticks to the Newtonian perspective. However, when using the Einsteinian perspective, it implies a phase difference between the two branches of the superposition given by e^{\frac{im}{\hbar} \left[ \frac{t^3}{6} \left( g'^2 - g^2 \right) + t \, (\mathbf{g} - \mathbf{g}') \cdot \mathbf{x} \right]}. While the term in the exponent linear in the time t does not lead to any conceptual difficulty, the first term, proportional to t^3, is problematic, since it is a non-relativistic residue of the so-called Unruh effect: in other words, the two terms in the superposition belong to different Hilbert spaces and, strictly speaking, cannot be superposed. Here is where the gravity-induced collapse plays a role, collapsing the superposition when the first term of the phase becomes too large.
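A rough numerical sketch of how quickly the problematic cubic term grows (the mass and the pair of accelerations below are illustrative assumptions, not values from the source):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s

# Illustrative branch pair: one branch in free fall (g' = 0), the other
# supported in Earth's field (g = 9.81 m/s^2); microgram-scale mass.
m    = 1e-9              # kg (illustrative)
g    = 9.81              # m/s^2
g_p  = 0.0               # m/s^2

def cubic_phase(t):
    """The t^3 ('Unruh residue') part of the relative phase between branches."""
    return (m / hbar) * (t**3 / 6) * abs(g**2 - g_p**2)

# Time at which the cubic term reaches ~1 radian
t_star = (6 * hbar / (m * abs(g**2 - g_p**2))) ** (1 / 3)
print(f"t* = {t_star:.2e} s")

# Cubic growth: doubling t multiplies this part of the phase by 8
assert abs(cubic_phase(2.0) / cubic_phase(1.0) - 8) < 1e-9
```

For a microgram-scale mass the cubic term becomes order one within nanoseconds, illustrating why Penrose expects such superpositions to be suppressed almost immediately at macroscopic masses.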

The Collapse Of The Wave Function

wikipedia  |  In quantum mechanics, the measurement problem is the problem of how, or whether, wave function collapse occurs. The inability to observe such a collapse directly has given rise to different interpretations of quantum mechanics and poses a key set of questions that each interpretation must answer.

The wave function in quantum mechanics evolves deterministically according to the Schrödinger equation as a linear superposition of different states. However, actual measurements always find the physical system in a definite state. Any future evolution of the wave function is based on the state the system was discovered to be in when the measurement was made, meaning that the measurement "did something" to the system that is not obviously a consequence of Schrödinger evolution. The measurement problem is to describe what that "something" is: how a superposition of many possible values becomes a single measured value.

To express matters differently (paraphrasing Steven Weinberg),[1][2] the Schrödinger wave equation determines the wave function at any later time. If observers and their measuring apparatus are themselves described by a deterministic wave function, why can we not predict precise results for measurements, but only probabilities? As a general question: How can one establish a correspondence between quantum reality and classical reality?[3]

The views often grouped together as the Copenhagen interpretation are the oldest and, collectively, probably still the most widely held attitude about quantum mechanics.[4][5] N. David Mermin coined the phrase "Shut up and calculate!" to summarize Copenhagen-type views, a saying often misattributed to Richard Feynman and which Mermin later found insufficiently nuanced.[6][7]

Generally, views in the Copenhagen tradition posit something in the act of observation which results in the collapse of the wave function. This concept, though often attributed to Niels Bohr, was due to Werner Heisenberg, whose later writings obscured many disagreements he and Bohr had had during their collaboration and that the two never resolved.[8][9] In these schools of thought, wave functions may be regarded as statistical information about a quantum system, and wave function collapse is the updating of that information in response to new data.[10][11] Exactly how to understand this process remains a topic of dispute.[12]

Bohr offered an interpretation that is independent of a subjective observer, or measurement, or collapse; instead, an "irreversible" or effectively irreversible process causes the decay of quantum coherence which imparts the classical behavior of "observation" or "measurement".[13][14][15][16]

Hugh Everett's many-worlds interpretation attempts to solve the problem by suggesting that there is only one wave function, the superposition of the entire universe, and it never collapses—so there is no measurement problem. Instead, the act of measurement is simply an interaction between quantum entities, e.g. observer, measuring instrument, electron/positron etc., which entangle to form a single larger entity, for instance living cat/happy scientist. Everett also attempted to demonstrate how the probabilistic nature of quantum mechanics would appear in measurements, a work later extended by Bryce DeWitt. However, proponents of the Everettian program have not yet reached a consensus regarding the correct way to justify the use of the Born rule to calculate probabilities.[17][18]

De Broglie–Bohm theory tries to solve the measurement problem very differently: the information describing the system contains not only the wave function, but also supplementary data (a trajectory) giving the position of the particle(s). The role of the wave function is to generate the velocity field for the particles. These velocities are such that the probability distribution for the particle remains consistent with the predictions of the orthodox quantum mechanics. According to de Broglie–Bohm theory, interaction with the environment during a measurement procedure separates the wave packets in configuration space, which is where apparent wave function collapse comes from, even though there is no actual collapse.[19]

A fourth approach is given by objective-collapse models. In such models, the Schrödinger equation is modified and obtains nonlinear terms. These nonlinear modifications are of stochastic nature and lead to a behaviour that for microscopic quantum objects, e.g. electrons or atoms, is unmeasurably close to that given by the usual Schrödinger equation. For macroscopic objects, however, the nonlinear modification becomes important and induces the collapse of the wave function. Objective-collapse models are effective theories. The stochastic modification is thought to stem from some external non-quantum field, but the nature of this field is unknown. One possible candidate is the gravitational interaction as in the models of Diósi and Penrose. The main difference of objective-collapse models compared to the other approaches is that they make falsifiable predictions that differ from standard quantum mechanics. Experiments are already getting close to the parameter regime where these predictions can be tested.[20] The Ghirardi–Rimini–Weber (GRW) theory proposes that wave function collapse happens spontaneously as part of the dynamics. Particles have a non-zero probability of undergoing a "hit", or spontaneous collapse of the wave function, on the order of once every hundred million years.[21] Though collapse is extremely rare, the sheer number of particles in a measurement system means that the probability of a collapse occurring somewhere in the system is high. Since the entire measurement system is entangled (by quantum entanglement), the collapse of a single particle initiates the collapse of the entire measurement apparatus. Because the GRW theory makes different predictions from orthodox quantum mechanics in some conditions, it is not an interpretation of quantum mechanics in a strict sense.
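The GRW arithmetic in the paragraph above is easy to check: with a per-particle collapse rate of roughly once per hundred million years, an Avogadro-scale collection of particles still collapses almost instantly (the particle count below is illustrative):

```python
# Back-of-envelope check of the GRW claim: rare per-particle "hits" still
# make a macroscopic apparatus collapse almost instantly.

SECONDS_PER_YEAR = 3.156e7
lam = 1 / (1e8 * SECONDS_PER_YEAR)   # per-particle hit rate, ~3e-16 per second

N = 1e23                              # particles in a macroscopic pointer (illustrative)
system_rate = N * lam                 # hits per second somewhere in the system
mean_time = 1 / system_rate           # expected time until the first hit

print(f"per-particle rate:           {lam:.2e} /s")
print(f"whole-system rate:           {system_rate:.2e} /s")
print(f"mean time to first collapse: {mean_time:.2e} s")
```

Since the measurement apparatus is entangled, that first hit anywhere in the system suffices to collapse the whole superposition, within tens of nanoseconds on these numbers.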

Friday, June 02, 2023

Constructive Interference Patterns Give Rise To Unitary Conscious Experience

wikipedia  |  Smythies[27] defines the combination problem, also known as the subjective unity of perception, as "How do the brain mechanisms actually construct the phenomenal object?". Revonsuo[1] equates this to "consciousness-related binding", emphasizing the entailment of a phenomenal aspect. As Revonsuo explores in 2006,[28] there are nuances of difference beyond the basic BP1:BP2 division. Smythies speaks of constructing a phenomenal object ("local unity" for Revonsuo) but philosophers such as Descartes, Leibniz, Kant and James (see Brook and Raymont[29]) have typically been concerned with the broader unity of a phenomenal experience ("global unity" for Revonsuo) – which, as Bayne[30] illustrates may involve features as diverse as seeing a book, hearing a tune and feeling an emotion. Further discussion will focus on this more general problem of how sensory data that may have been segregated into, for instance, "blue square" and "yellow circle" are to be re-combined into a single phenomenal experience of a blue square next to a yellow circle, plus all other features of their context. There is a wide range of views on just how real this "unity" is, but the existence of medical conditions in which it appears to be subjectively impaired, or at least restricted, suggests that it is not entirely illusory.[31]

There are many neurobiological theories about the subjective unity of perception. Different visual features such as color, size, shape, and motion are computed by largely distinct neural circuits but we experience an integrated whole. The different visual features interact with each other in various ways. For example, shape discrimination of objects is strongly affected by orientation but only slightly affected by object size.[32] Some theories suggest that global perception of the integrated whole involves higher order visual areas.[33] There is also evidence that the posterior parietal cortex is responsible for perceptual scene segmentation and organization.[34] Bodies facing each other are processed as a single unit and there is increased coupling of the extrastriate body area (EBA) and the posterior superior temporal sulcus (pSTS) when bodies are facing each other.[35] This suggests that the brain is biased towards grouping humans in twos or dyads.[36]

Dennett[40] has proposed that our sense that our experiences are single events is illusory and that, instead, at any one time there are "multiple drafts" of sensory patterns at multiple sites. Each would only cover a fragment of what we think we experience. Arguably, Dennett is claiming that consciousness is not unified and there is no phenomenal binding problem. Most philosophers have difficulty with this position (see Bayne[30]) but some physiologists agree with it. In particular, the demonstration of perceptual asynchrony in psychophysical experiments by Moutoussis and Zeki,[48][49] when color is perceived before orientation of lines and before motion by 40 and 80 ms, respectively, constitutes an argument that, over these very short time periods, different attributes are consciously perceived at different times, leading to the view that at least over these brief periods of time after visual stimulation, different events are not bound to each other, leading to the view of a disunity of consciousness,[50] at least over these brief time intervals. Dennett's view might be in keeping with evidence from recall experiments and change blindness purporting to show that our experiences are much less rich than we sense them to be – what has been called the Grand Illusion.[51] However, few, if any, other authors suggest the existence of multiple partial "drafts". Moreover, also on the basis of recall experiments, Lamme[52] has challenged the idea that richness is illusory, emphasizing that phenomenal content cannot be equated with content to which there is cognitive access.

Dennett does not tie drafts to biophysical events. Multiple sites of causal convergence are invoked in specific biophysical terms by Edwards[53] and Sevush.[54] In this view the sensory signals to be combined in phenomenal experience are available, in full, at each of multiple sites. To avoid non-causal combination each site/event is placed within an individual neuronal dendritic tree. The advantage is that "compresence" is invoked just where convergence occurs neuro-anatomically. The disadvantage, as for Dennett, is the counter-intuitive concept of multiple "copies" of experience. The precise nature of an experiential event or "occasion", even if local, also remains uncertain.

The majority of theoretical frameworks for the unified richness of phenomenal experience adhere to the intuitive idea that experience exists as a single copy, and draw on "functional" descriptions of distributed networks of cells. Baars[55] has suggested that certain signals, encoding what we experience, enter a "Global Workspace" within which they are "broadcast" to many sites in the cortex for parallel processing. Dehaene, Changeux and colleagues[56] have developed a detailed neuro-anatomical version of such a workspace. Tononi and colleagues[57] have suggested that the level of richness of an experience is determined by the narrowest information interface "bottleneck" in the largest sub-network or "complex" that acts as an integrated functional unit. Lamme[52] has suggested that networks supporting reciprocal signaling rather than those merely involved in feed-forward signaling support experience. Edelman and colleagues have also emphasized the importance of re-entrant signaling.[58] Cleeremans[59] emphasizes meta-representation as the functional signature of signals contributing to consciousness.

In general, such network-based theories are not explicitly theories of how consciousness is unified, or "bound" but rather theories of functional domains within which signals contribute to unified conscious experience. A concern about functional domains is what Rosenberg[60] has called the boundary problem; it is hard to find a unique account of what is to be included and what excluded. Nevertheless, this is, if anything is, the consensus approach.

Within the network context, a role for synchrony has been invoked as a solution to the phenomenal binding problem as well as the computational one. In his book, The Astonishing Hypothesis,[61] Crick appears to be offering a solution to BP2 as much as BP1. Even von der Malsburg,[62] introduces detailed computational arguments about object feature binding with remarks about a "psychological moment". The Singer group[63] also appear to be interested as much in the role of synchrony in phenomenal awareness as in computational segregation.

The apparent incompatibility of using synchrony to both segregate and unify might be explained by sequential roles. However, Merker[20] points out what appears to be a contradiction in attempts to solve the subjective unity of perception in terms of a functional (effectively meaning computational) rather than a local biophysical, domain, in the context of synchrony.

Functional arguments for a role for synchrony are in fact underpinned by analysis of local biophysical events. However, Merker[20] points out that the explanatory work is done by the downstream integration of synchronized signals in post-synaptic neurons: "It is, however, by no means clear what is to be understood by 'binding by synchrony' other than the threshold advantage conferred by synchrony at, and only at, sites of axonal convergence onto single dendritic trees..." In other words, although synchrony is proposed as a way of explaining binding on a distributed, rather than a convergent, basis the justification rests on what happens at convergence. Signals for two features are proposed as bound by synchrony because synchrony effects downstream convergent interaction. Any theory of phenomenal binding based on this sort of computational function would seem to follow the same principle. The phenomenality would entail convergence, if the computational function does.

The assumptions in many of the quoted models suggest that computational and phenomenal events, at least at some point in the sequence of events, parallel each other in some way. The difficulty remains in identifying what that way might be. Merker's[20] analysis suggests either (1) that both computational and phenomenal aspects of binding are determined by convergence of signals on neuronal dendritic trees, or (2) that our intuitive ideas about the need for "binding" in a "holding together" sense in both computational and phenomenal contexts are misconceived. We may be looking for something extra that is not needed. Merker, for instance, argues that the homotopic connectivity of sensory pathways does the necessary work.

 

BeeDee Gave Me A Gentle Reminder To Get Back On Topic

wikipedia  |  In physics, interference is a phenomenon in which two coherent waves are combined by adding their intensities or displacements with due consideration for their phase difference. The resultant wave may have greater intensity (constructive interference) or lower amplitude (destructive interference) if the two waves are in phase or out of phase, respectively. Interference effects can be observed with all types of waves, for example, light, radio, acoustic, surface water waves, gravity waves, or matter waves as well as in loudspeakers as electrical waves. 

The word interference is derived from the Latin words inter which means "between" and fere which means "hit or strike", and was coined by Thomas Young in 1801.[1][2][3]

The principle of superposition of waves states that when two or more propagating waves of the same type are incident on the same point, the resultant amplitude at that point is equal to the vector sum of the amplitudes of the individual waves.[4] If a crest of a wave meets a crest of another wave of the same frequency at the same point, then the amplitude is the sum of the individual amplitudes; this is constructive interference. If a crest of one wave meets a trough of another wave, then the amplitude is equal to the difference in the individual amplitudes; this is known as destructive interference. In ideal media (water and air are almost ideal), energy is always conserved: at points of destructive interference, the energy is stored in the elasticity of the medium. For example, when we drop two pebbles in a pond we see a pattern, but eventually the waves continue, and only when they reach the shore is the energy absorbed away from the medium.

Constructive interference occurs when the phase difference between the waves is an even multiple of π (180°), whereas destructive interference occurs when the difference is an odd multiple of π. If the difference between the phases is intermediate between these two extremes, then the magnitude of the displacement of the summed waves lies between the minimum and maximum values.
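The rule above can be checked numerically for two equal-amplitude sinusoids: the resultant peak amplitude is 2A|cos(Δφ/2)|, maximal for even multiples of π and zero for odd multiples (a small self-contained sketch):

```python
import math

def resultant_amplitude(A, dphi, samples=10000):
    """Numerically find the peak of A*sin(x) + A*sin(x + dphi) over one period."""
    return max(abs(A * math.sin(x) + A * math.sin(x + dphi))
               for x in (2 * math.pi * k / samples for k in range(samples)))

A = 1.0
for dphi in (0.0, math.pi / 2, math.pi, 2 * math.pi):
    numeric = resultant_amplitude(A, dphi)
    closed  = 2 * A * abs(math.cos(dphi / 2))   # closed-form result
    print(f"dphi = {dphi:.3f}: numeric = {numeric:.4f}, 2A|cos(dphi/2)| = {closed:.4f}")
```

As expected, phase differences of 0 and 2π give the fully constructive amplitude 2A, π gives complete cancellation, and intermediate phases fall in between.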

Consider, for example, what happens when two identical stones are dropped into a still pool of water at different locations. Each stone generates a circular wave propagating outwards from the point where the stone was dropped. When the two waves overlap, the net displacement at a particular point is the sum of the displacements of the individual waves. At some points, these will be in phase, and will produce a maximum displacement. In other places, the waves will be in anti-phase, and there will be no net displacement at these points. Thus, parts of the surface will be stationary—these are seen in the figure above and to the right as stationary blue-green lines radiating from the centre.

Interference of light is a unique phenomenon in that we can never observe superposition of the EM field directly as we can, for example, in water. Superposition in the EM field is an assumed and necessary requirement; fundamentally, two light beams pass through each other and continue on their respective paths. Light can be explained classically by the superposition of waves, but a deeper understanding of light interference requires knowledge of the wave-particle duality of light, which is due to quantum mechanics. Prime examples of light interference are the famous double-slit experiment, laser speckle, anti-reflective coatings and interferometers. Traditionally, the classical wave model is taught as a basis for understanding optical interference, based on the Huygens–Fresnel principle; however, an explanation based on the Feynman path integral also exists, which takes quantum mechanical considerations into account.

 

 

Thursday, June 01, 2023

Is The Biden Family Crime Syndicate Finally Running Out Of Safe Harbor?

jonathanturley  |  In 2018, Hunter Biden’s world was collapsing.

The New York Times had run a story on one of his shady deals with the Chinese and his father, then vice president, was pulled into the vortex.

It appears that Hunter was in a free fall and his uncle Jim Biden reportedly reached out in newly discovered messages to offer him a “safe harbor.”

The exchange is an insight into a train wreck of a life of the scion of one of the most powerful families in the country.

However, it is also insight into a world of influence peddling where millions simply evaporated in the coffers of the Biden family.

On their face, the messages seem to contradict public statements from President Biden on the foreign-influence peddling that was used to fund Hunter’s drug-infused, self-destructive lifestyle.

The Times story caused a panic in the Biden family.

Despite a largely supportive media, the Bidens have long been known for influence peddling.

Jim Biden has been repeatedly criticized for marketing his access to his brother in pitches to clients.

Hunter knew that the Times story was only the tip of an iceberg.

There were deals all over the world with foreign figures worth millions and some of these figures had close ties to foreign intelligence or regimes.

As revealed recently by the House Oversight Committee, the Bidens constructed a labyrinth of corporations and accounts to transfer millions from these deals to a variety of Biden family members, including grandchildren.

Free fall

Nevertheless, Joe Biden repeatedly claimed as a presidential candidate and as president that he had no knowledge of any foreign dealings of his son.

Those denials now appear patently false.

The laptop includes pictures and appointments of Hunter’s foreign business associates with Joe Biden.

It also includes a recording concerning a Times report on Dec. 12, 2018, detailing Hunter’s dealings with Ye Jianming, the head of CEFC China Energy Company.

Ye would later be arrested for corruption.

As Biden associates pushed the Times to change aspects of the story, Joe Biden called to report on the results.

In his message, Biden ends his call to Hunter with the statement “I think you’re clear. And anyway if you get a chance, give me a call, I love you.”

The new messages indicate that the Bidens were worried that Hunter was in a free fall as these dealings were becoming known and revenue was declining.

Jim Biden appears to be rushing to get Hunter to work the problem with the family.

He assures him that they can find him “a safe harbor” and that “I can work with you[r] father alone!”

Hunter previously complained that he was giving as much as half of his proceeds to his father and was now facing towering financial demands.

He appears to have cut off the family.

That is a dangerous development for a man who had a long struggle with drugs and alcohol.

Biden Accuser Tara Reade: My Options - Live In A Cage Or Be Killed

sputnik  | Tara Reade, a US citizen, writer, and ex-assistant to Joe Biden, who has recently arrived in Russia, told Sputnik she no longer feels safe in Biden's America, adding that many Americans are ready to follow in her footsteps.

Tara Reade, a former Senate staffer, came forward in April 2020 and filed a criminal complaint against then-presumptive Democratic presidential hopeful Joe Biden, accusing him of sexual assault in 1993. Even though some Democratic congresswomen said they believe her, not only were her claims downplayed by the US mainstream press, but she was also subjected to smears, a criminal probe, and intimidation.
 
After Biden's 2024 re-election announcement, Reade reiterated her accusations and expressed willingness to testify in the GOP-controlled House of Representatives. However, in early May, Tara released a cryptic message saying that if something happens to her, all roads would lead to Biden. Reade opted to come to Russia to protect her life.
 
On May 30, Member of the State Duma Committee on International Affairs Maria Butina, who herself fell victim to the US punitive machine, promised to discuss the possibility of granting Russian citizenship to Reade and ask Russian President Vladimir Putin to fast track her citizenship request.

Tara Reade is not the only American truth-seeker who has come to Russia in order to evade political persecution from the US authorities. Earlier, NSA whistleblower Edward Snowden found refuge in the Russian Federation after revealing a US global spying program which targeted American citizens in sharp violation of the US Constitution. In September 2022, Vladimir Putin signed a decree granting Russian citizenship to Edward. He is now a full-fledged citizen of Russia.

As per Reade, there are a lot of people in the US who feel unsafe. Her message to them is to take action to protect themselves and their families "and to really look at who you're voting for."

"We need systemic change. So participate in that process and try to take command of your democracy if you want a democracy, because right now it's in disarray," Reade said, addressing her fellow Americans. "And that's the problem. And as far as like going to another safe haven, I mean, there are many Americans here, and I don't want to out a bunch of Americans, but there are people here that are coming to Russia - much like back in the day when Soviet Union people defected over to the US - now you have the opposite. Now you have US and European citizens looking for safe haven here. And luckily, the Kremlin is accommodating. So we're lucky."

Fuck Robert Kagan And Would He Please Now Just Go Quietly Burn In Hell?

politico | The Washington Post on Friday announced it will no longer endorse presidential candidates, breaking decades of tradition in a...