
Coherent Oscillations Between Two Weakly Coupled Bose-Einstein Condensates: Josephson Effects, π-Oscillations, and Macroscopic Quantum Self-Trapping

Physical Review A (1997) | SCI Q2

Abdus Salam International Centre for Theoretical Physics (ICTP) | SISSA

Cited 961 | Views 36
Abstract
We discuss the coherent atomic oscillations between two weakly coupled Bose-Einstein condensates. The weak link is provided by a laser barrier in a (possibly asymmetric) double-well trap or by Raman coupling between two condensates in different hyperfine levels. The Boson Josephson Junction (BJJ) dynamics is described by the two-mode nonlinear Gross-Pitaevskii equation, which is solved analytically in terms of elliptic functions. The BJJ, being a neutral, isolated system, allows the investigation of new dynamical regimes for the phase difference across the junction and for the population imbalance that are not accessible with Superconductor Josephson Junctions (SJJ). These include oscillations with either or both of the following properties: 1) the time-averaged value of the phase is equal to π (π-phase oscillations); 2) the average population imbalance is nonzero, in states with "macroscopic quantum self-trapping" (MQST). The (non-sinusoidal) generalization of the SJJ 'ac' and 'plasma' oscillations and the Shapiro resonance can also be observed. We predict the collapse of experimental data (corresponding to different trap geometries and total numbers of condensate atoms) onto a single universal curve for the inverse period of oscillations. Analogies with Josephson oscillations between two weakly coupled reservoirs of ^3He-B and with the internal Josephson effect in ^3He-A are also discussed.
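The two regimes named in the abstract (Josephson-like oscillations vs. macroscopic quantum self-trapping) can be seen numerically. Below is a minimal sketch, assuming the standard dimensionless form of the two-mode model the abstract refers to, ż = −√(1−z²) sin φ and φ̇ = ΔE + Λz + (z/√(1−z²)) cos φ, where z is the fractional population imbalance, φ the relative phase, and Λ the ratio of interaction to tunneling energy; the parameter values and function names here are illustrative, not taken from the paper.

```python
import math

def bjj_rhs(z, phi, lam, dE=0.0):
    """Right-hand side of the dimensionless two-mode BJJ equations.

    z   : fractional population imbalance between the two condensates
    phi : relative phase across the junction
    lam : Lambda, interaction-to-tunneling ratio (assumed notation)
    dE  : trap asymmetry (0 for a symmetric double well)
    """
    s = math.sqrt(max(1.0 - z * z, 1e-12))  # guard against z -> +/-1
    dz = -s * math.sin(phi)
    dphi = dE + lam * z + (z / s) * math.cos(phi)
    return dz, dphi

def mean_imbalance(z0, phi0, lam, dt=1e-3, steps=20000):
    """Integrate with classical RK4 and return the time-averaged imbalance <z>."""
    z, phi, acc = z0, phi0, 0.0
    for _ in range(steps):
        dz1, dp1 = bjj_rhs(z, phi, lam)
        dz2, dp2 = bjj_rhs(z + 0.5 * dt * dz1, phi + 0.5 * dt * dp1, lam)
        dz3, dp3 = bjj_rhs(z + 0.5 * dt * dz2, phi + 0.5 * dt * dp2, lam)
        dz4, dp4 = bjj_rhs(z + dt * dz3, phi + dt * dp3, lam)
        z += dt / 6.0 * (dz1 + 2 * dz2 + 2 * dz3 + dz4)
        phi += dt / 6.0 * (dp1 + 2 * dp2 + 2 * dp3 + dp4)
        acc += z
    return acc / steps

# For z0 = 0.6, phi0 = 0 the critical coupling is
# Lambda_c = [1 + sqrt(1 - z0^2)] / (z0^2 / 2) = 10:
# below it z oscillates around 0 (Josephson regime, <z> ~ 0);
# above it the imbalance stays trapped near z0 (MQST, <z> != 0).
```

Sweeping Λ through Λ_c and watching ⟨z⟩ jump from zero to a finite value reproduces the self-trapping transition discussed in the abstract.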
Keywords
Bose-Einstein Condensation
Related Papers
Bose
1924

Cited 1315 | Views

International Quantum Electronics Conference 1996

Cited 9696 | Views

D M Stamper-Kurn, M R Andrews, A P Chikkatur, S Inouye, H.-J Miesner, J Stenger, W Ketterle
2001

Cited 855 | Views
