Larsson, Jan-Åke
Overview
Works:  52 works in 57 publications in 1 language and 62 library holdings 

Genres:  Academic theses 
Roles:  Author, Other 
Most widely held works by Jan-Åke Larsson
Quantum paradoxes, probability theory, and change of ensemble by Jan-Åke Larsson (Book)
4 editions published in 2000 in English and held by 7 WorldCat member libraries worldwide
In this thesis, the question "What kind of models can be used to describe microcosmos?" will be discussed. Being difficult and very large in scope, the question has here been restricted to whether or not Local Realistic models can be used to describe quantum-mechanical processes, one of a collection of questions often referred to as Quantum Paradoxes. Two such paradoxes will be investigated using techniques from probability theory: the Bell inequality and the Greenberger-Horne-Zeilinger (GHZ) paradox. A problem with the two mentioned paradoxes is that they are only valid when the detectors are 100% efficient, whereas present experimental efficiency is much lower than that. Here, an approach is presented which enables a generalization of both the Bell inequality and the GHZ paradox to the inefficient case. This is done by introducing the concept of change of ensemble, which provides both qualitative and quantitative information on the nature of the "loophole" in the 100% efficiency prerequisite, and is more fundamental in this regard than the efficiency concept. Efficiency estimates are presented which are easy to obtain from experimental coincidence data, and a connection is established between these estimates and the concept of change of ensemble. The concept is also studied in the context of Franson interferometry, where the Bell inequality cannot immediately be used. Unexpected subtleties occur when trying to establish whether or not a Local Realistic model of the data is possible even in the ideal case. A Local Realistic model of the experiment is presented, but nevertheless, by introducing an additional requirement on the experimental setup it is possible to refute the mentioned model and show that no other Local Realistic model exists.
Efficient classical simulation of the Deutsch-Jozsa and Simon's algorithms by Niklas Johansson
2 editions published in 2017 in English and held by 3 WorldCat member libraries worldwide
A long-standing aim of quantum information research is to understand what gives quantum computers their advantage. This requires separating problems that need genuinely quantum resources from those for which classical resources are enough. Two examples of quantum speedup are the Deutsch-Jozsa and Simon's problems, both efficiently solvable on a quantum Turing machine, and both believed to lack efficient classical solutions. Here we present a framework that can simulate both quantum algorithms efficiently, solving the Deutsch-Jozsa problem with probability 1 using only one oracle query, and Simon's problem using linearly many oracle queries, just as expected of an ideal quantum computer. The presented simulation framework is in turn efficiently simulatable on a classical probabilistic Turing machine. This shows that the Deutsch-Jozsa and Simon's problems do not require any genuinely quantum resources, and that the quantum algorithms show no speedup when compared with their corresponding classical simulation. Finally, this gives insight into what properties are needed in the two algorithms and calls for further study of oracle separation between quantum and classical computation.
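For context on the classical query cost being compared against, a minimal sketch of the Deutsch-Jozsa promise problem itself: a deterministic classical algorithm must make 2^(n−1)+1 oracle queries in the worst case, versus one query for the quantum algorithm. This illustrates the problem definition only, not the authors' simulation framework; all names are illustrative.

```python
from itertools import product

def classify(oracle, n):
    """Classically decide 'constant' vs 'balanced' for an n-bit oracle,
    querying inputs until the promise forces an answer."""
    seen = set()
    for i, x in enumerate(product([0, 1], repeat=n), start=1):
        seen.add(oracle(x))
        if len(seen) == 2:
            return "balanced", i      # two distinct outputs: cannot be constant
        if i == 2 ** (n - 1) + 1:
            return "constant", i      # a strict majority agrees: cannot be balanced
    return "constant", 2 ** n

constant = lambda x: 0
balanced = lambda x: x[0]             # 0 on half of the inputs, 1 on the other half
print(classify(constant, 4))          # worst case: 2**3 + 1 = 9 queries
print(classify(balanced, 4))
```

The worst-case count 2^(n−1)+1 arises because after that many identical answers the oracle can no longer be balanced, while any earlier disagreement immediately rules out "constant".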
A possible unification of the Copenhagen and the Bohm interpretations using local realism by Jan-Åke Larsson
1 edition published in 2000 in English and held by 2 WorldCat member libraries worldwide
Quantum paradoxes, probability theory, and change of ensemble by Jan-Åke Larsson (Book)
1 edition published in 2000 in English and held by 2 WorldCat member libraries worldwide
Comment on "Two-photon Franson-type experiment and local realism": Reply by Sven Aerts
1 edition published in 2001 in English and held by 1 WorldCat member library worldwide
A Reply to the Comment by Carlos Luiz Ryff
Vulnerability of "A Novel Protocol-Authentication Algorithm Ruling out a Man-in-the-Middle Attack in Quantum Cryptography" by Aysajan Abidin
1 edition published in 2009 in English and held by 1 WorldCat member library worldwide
In this paper, we review and comment on "A novel protocol-authentication algorithm ruling out a man-in-the-middle attack in quantum cryptography" [M. Peev et al., Int. J. Quant. Inf. 3 (2005) 225]. In particular, we point out that the proposed primitive is not secure when used in a generic protocol, and needs additional authenticating properties of the surrounding quantum-cryptographic protocol.
Comments on "New Results on Frame-Proof Codes and Traceability Schemes" by Jacob Löfvenberg
1 edition published in 2010 in English and held by 1 WorldCat member library worldwide
Mutually unbiased bases and Hadamard matrices of order six by Ingemar Bengtsson
1 edition published in 2007 in English and held by 1 WorldCat member library worldwide
We report on a search for mutually unbiased bases (MUBs) in six dimensions. We find only triplets of MUBs, and thus do not come close to the theoretical upper bound of 7. However, we point out that the natural habitat for sets of MUBs is the set of all complex Hadamard matrices of the given order, and we introduce a natural notion of distance between bases in Hilbert space. This allows us to draw a detailed map of where in the landscape the MUB triplets are situated. We use available tools, such as the theory of the discrete Fourier transform, to organize our results. Finally, we present some evidence for the conjecture that there exists a four-dimensional family of complex Hadamard matrices of order 6. If this conjecture is true, the landscape in which one may search for MUBs is much larger than previously thought.
Attacks on quantum key distribution protocols that employ non-ITS authentication by Christoph Pacher
1 edition published in 2016 in English and held by 1 WorldCat member library worldwide
We demonstrate how adversaries with unbounded computing resources can break Quantum Key Distribution (QKD) protocols which employ a particular message authentication code suggested previously. This authentication code, featuring low key consumption, is not Information-Theoretically Secure (ITS), since for each message the eavesdropper has intercepted she is able to send a different message from a set of messages that she can calculate by finding collisions of a cryptographic hash function. However, when this authentication code was introduced, it was shown to prevent straightforward Man-in-the-Middle (MITM) attacks against QKD protocols. In this paper, we prove that the set of messages that collide with any given message under this authentication code contains, with high probability, a message that has small Hamming distance to any other given message. Based on this fact, we present extended MITM attacks against different versions of BB84 QKD protocols using the addressed authentication code; for three protocols we describe every single action taken by the adversary. For all protocols the adversary can obtain complete knowledge of the key, and for most protocols her success probability in doing so approaches unity. Since the attacks work against all authentication methods that allow calculating colliding messages, the underlying building blocks of the presented attacks expose the potential pitfalls arising as a consequence of non-ITS authentication in QKD post-processing. We propose countermeasures, increasing the eavesdropper's demand for computational power, and also prove necessary and sufficient conditions for upgrading the discussed authentication code to the ITS level.
Modelling cell lineage using a meta-Boolean tree model with a relation to gene regulatory networks by Jan-Åke Larsson
1 edition published in 2011 in English and held by 1 WorldCat member library worldwide
A cell lineage is the ancestral relationship between a group of cells that originate from a single founder cell. For example, in the embryo of the nematode Caenorhabditis elegans an invariant cell lineage has been traced, and with this information at hand it is possible to theoretically model the emergence of different cell types in the lineage, starting from the single fertilized egg. In this report we outline a modelling technique for cell lineage trees, which can be used for the C. elegans embryonic cell lineage but also extended to other lineages. The model takes into account both cell-intrinsic (transcription-factor-based) and extrinsic (extracellular) factors, as well as synergies within and between these two types of factors. The model can faithfully recapitulate the entire C. elegans cell lineage, but is also general, i.e., it can be applied to describe any cell lineage. We show that synergy between factors, as well as the use of extrinsic factors, drastically reduces the number of regulatory factors needed for recapitulating the lineage. The model gives indications regarding covariation of factors, the number of involved genes, and where in the cell lineage tree asymmetry might be controlled by external influence. Furthermore, the model is able to emulate other (Boolean, discrete, and differential-equation-based) models. As an example, we show that the model can be translated into the language of a previous linear sigmoid-limited concentration-based model (Geard and Wiles, 2005). This means that this latter model can also exhibit synergy effects, and also that the cumbersome iterative technique for parameter estimation previously used is no longer needed. In conclusion, the proposed model is general and simple to use, can be mapped onto other models to extend and simplify their use, and can also be used to indicate where synergy and external influence would reduce the complexity of the regulatory process.
Meta-Boolean models of asymmetric division patterns in the C. elegans intestinal lineage: Implications for the posterior boundary of intestinal twist by Sofia Pettersson
1 edition published in 2013 in English and held by 1 WorldCat member library worldwide
The intestine of Caenorhabditis elegans is derived from 20 cells that are organized into nine intestinal rings. During embryogenesis, three of the rings rotate approximately 90 degrees in a process known as intestinal twist. The underlying mechanisms for this morphological event are not fully known, but it has been demonstrated that both left-right and anterior-posterior asymmetry are required for intestinal twist to occur. We have recently presented a rule-based meta-Boolean tree model intended to describe complex lineages. In this report we apply this model to the E lineage of C. elegans, specifically targeting the asymmetric anterior-posterior division patterns within the lineage. The resulting model indicates that cells with the same factor concentration are located next to each other in the intestine regardless of lineage origin. In addition, the shift in factor concentrations coincides with the boundary for intestinal twist. When modeling lit-1 mutant data according to the same principle, the factor distributions in each cell are altered, yet the concurrence between the shift in concentration and intestinal twist remains. This pattern suggests that intestinal twist is controlled by a threshold mechanism. In the current paper we present the factor concentrations for all possible combinations of symmetric and asymmetric divisions in the E lineage and relate these to the potential threshold by studying existing data for wild-type and mutant embryos. Finally, we discuss how the resulting models can serve as a basis for experimental design in order to reveal the underlying mechanisms of intestinal twist.
Minimum Detection Efficiency for a Loophole-Free Atom-Photon Bell Experiment by Adan Cabello
1 edition published in 2007 in English and held by 1 WorldCat member library worldwide
In Bell experiments, one problem is to achieve high enough photodetection efficiency to ensure that there is no possibility of describing the results via a local hidden-variable model. Using the Clauser-Horne inequality and a two-photon non-maximally entangled state, a photodetection efficiency higher than 0.67 is necessary. Here we discuss atom-photon Bell experiments. We show that, assuming perfect detection efficiency of the atom, it is possible to perform a loophole-free atom-photon Bell experiment whenever the photodetection efficiency exceeds 0.50.
A practical Trojan Horse for Bell-inequality-based quantum cryptography by Jan-Åke Larsson
1 edition published in 2002 in English and held by 1 WorldCat member library worldwide
Quantum Cryptography, or more accurately, Quantum Key Distribution (QKD), is based on using an unconditionally secure "quantum channel" to share a secret key between two users. A manufacturer of QKD devices could, intentionally or not, use a (semi)classical channel instead of the quantum channel, which would remove the supposedly unconditional security. One example is the BB84 protocol, where the quantum channel can be implemented in the polarization of single photons. Here, use of several photons instead of one to encode each bit of the key provides a similar but insecure system. For protocols based on violation of a Bell inequality (e.g., the Ekert protocol) the situation is somewhat different. While the possibility is mentioned by some authors, it is generally thought that an implementation of a (semi)classical channel will differ significantly from that of a quantum channel. Here, a counterexample will be given using a physical setup identical to that used in photon-polarization Ekert QKD. Since the physical implementation is identical, a manufacturer may include this modification as a Trojan Horse in manufactured systems, to be activated at will by an eavesdropper. Thus, the old truth of cryptography still holds: you have to trust the manufacturer of your cryptographic device. Even when you do violate the Bell inequality.
Optimal Inequalities for State-Independent Contextuality by Matthias Kleinmann
1 edition published in 2012 in English and held by 1 WorldCat member library worldwide
Contextuality is a natural generalization of nonlocality which does not need composite systems or space-like separation, and offers a wider spectrum of interesting phenomena. Most notably, in quantum mechanics there exist scenarios where the contextual behavior is independent of the quantum state. We show that the quest for an optimal inequality separating quantum from classical noncontextual correlations in a state-independent manner admits an exact solution, as it can be formulated as a linear program. We introduce the noncontextuality polytope as a generalization of the locality polytope, and apply our method to identify two different tight optimal inequalities for the most fundamental quantum scenario with state-independent contextuality.
Strict detector-efficiency bounds for n-site Clauser-Horne inequalities by Jan-Åke Larsson
1 edition published in 2001 in English and held by 1 WorldCat member library worldwide
An analysis of detector efficiency in many-site Clauser-Horne inequalities is presented for the case of perfect visibility. It is shown that there is a violation of the presented n-site Clauser-Horne inequalities if and only if the efficiency is greater than n/(2n−1). Thus, for a two-site two-setting experiment there are no quantum-mechanical predictions that violate local realism unless the efficiency is greater than 2/3. Second, there are n-site experiments for which the quantum-mechanical predictions violate local realism whenever the efficiency exceeds n/(2n−1), a bound that approaches 1/2 as n grows.
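The stated bound n/(2n−1) is easy to evaluate exactly; a minimal sketch (the function name is illustrative, not from the paper):

```python
from fractions import Fraction

def ch_efficiency_bound(n: int) -> Fraction:
    """Critical detector efficiency n/(2n-1) above which the n-site
    Clauser-Horne inequality can be violated (perfect visibility)."""
    if n < 2:
        raise ValueError("need at least two sites")
    return Fraction(n, 2 * n - 1)

# Two sites give the familiar 2/3 threshold; adding sites lowers the
# requirement, with the bound approaching 1/2 from above.
for n in (2, 3, 5, 100):
    b = ch_efficiency_bound(n)
    print(n, b, float(b))
```

Exact rational arithmetic avoids any floating-point ambiguity when comparing thresholds such as 2/3 against a measured efficiency.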
Quantum paradoxes, probability theory and detector-efficiency by Jan-Åke Larsson (Book)
1 edition published in 1998 in English and held by 1 WorldCat member library worldwide
A Kochen-Specker inequality by Jan-Åke Larsson
1 edition published in 2002 in English and held by 1 WorldCat member library worldwide
By probabilistic means, the concept of contextuality is extended so that it can be used in non-ideal situations. An inequality is presented, which at least in principle enables a test to discard noncontextual hidden-variable models at low error rates, in the spirit of the Kochen-Specker theorem. Assuming that the errors are independent, an explicit error bound of 1.42% is derived, below which a Kochen-Specker contradiction occurs.
Quantum contextuality for rational vectors by Adan Cabello
1 edition published in 2010 in English and held by 1 WorldCat member library worldwide
The Kochen-Specker theorem states that noncontextual hidden-variable models are inconsistent with the quantum predictions for every yes-no question on a qutrit, corresponding to every projector in three dimensions. It has been suggested [D.A. Meyer, Phys. Rev. Lett. 83 (1999) 3751] that the inconsistency would disappear when restricting to projectors on unit vectors with rational components, i.e., that noncontextual hidden variables could reproduce the quantum predictions for rational vectors. Here we show that a qutrit state with rational components violates an inequality valid for noncontextual hidden-variable models [A.A. Klyachko et al., Phys. Rev. Lett. 101 (2008) 020403] using rational projectors. This shows that the inconsistency remains even when using only rational vectors.
Quantum paradoxes, probability theory, and detector-efficiency by Jan-Åke Larsson (Book)
2 editions published in 1998 in English and held by 1 WorldCat member library worldwide
Discussion on "On quantum statistical inference" by O.E. Barndorff-Nielsen, R.D. Gill and P.E. Jupp, by L. Accardi
1 edition published in 2003 in English and held by 1 WorldCat member library worldwide
Related Identities
- Linköpings universitet, Tekniska högskolan (Publisher)
- Linköpings universitet, Matematiska institutionen (Publisher)
- Linköpings universitet, Institutionen för systemteknik (Publisher)
- Cabello, Adan (Author)
- SpringerLink (Online service) (Other)
- Johansson, Niklas (Author)
- Guehne, Otfried (Author)
- Bengtsson, Ingemar (Author)
- Kleinmann, Matthias (Author)
- Abidin, Aysajan (Author)