Random-telegraph noise mitigation and qubit decoherence in solid-state experiments

Journal of Physics: Conference Series (2023)

Abstract
We investigate the theoretical models and algorithms recently proposed by Song et al. [1] for mitigating decoherence in solid-state qubit systems in which the qubits are affected by charge (random-telegraph) noise. The model consists of a logical (data) qubit and a spectator qubit, where the latter serves as a probe of the noise. The probe results can be used to correct the phase error and thereby reduce the decoherence of the data qubit. In this work, we apply the proposed model with parameters extracted from recent solid-state qubit experiments, namely the noise switching rates, the qubit sensitivities to the noise, and the measurement dead time. Using these parameters, we numerically simulate the data qubit's phase and its decoherence. We also show that the proposed phase-correction technique based on Bayesian estimation can significantly suppress the data qubit's decoherence.
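As a rough illustration of the setup described in the abstract, the sketch below simulates a data qubit dephased by symmetric random-telegraph noise, with a spectator qubit probed periodically (limited by a measurement dead time) and a Bayesian estimate of the noise state used to undo the accumulated phase. All parameter values and the probe model here are illustrative assumptions, not the authors' code or the values extracted in the paper.

```python
# Minimal sketch (not the authors' implementation) of RTN dephasing of a data
# qubit and Bayesian phase correction based on spectator-qubit probes.
# All parameters below are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

gamma = 1.0        # assumed RTN switching rate (symmetric)
v_data = 0.5       # assumed data-qubit sensitivity: phase rate is +/- v_data
v_spec = 2.0       # assumed spectator-qubit sensitivity (sets probe contrast)
dt = 0.01          # simulation time step
t_dead = 0.1       # assumed measurement dead time between spectator probes
n_steps = 2000
n_traj = 500

def simulate(correct_phase):
    """Return coherence |<exp(i*phi)>| vs time, with or without correction."""
    coherence = np.zeros(n_steps)
    phases = np.zeros(n_traj)                  # accumulated (corrected) phase
    states = rng.choice([-1, 1], n_traj)       # hidden RTN state per trajectory
    p_plus = np.full(n_traj, 0.5)              # Bayesian belief that state = +1
    steps_per_probe = int(round(t_dead / dt))
    for k in range(n_steps):
        # RTN state flips with probability gamma*dt in each step
        flips = rng.random(n_traj) < gamma * dt
        states = np.where(flips, -states, states)
        # data qubit accumulates phase set by the hidden noise state
        phases += v_data * states * dt
        if correct_phase and k > 0 and k % steps_per_probe == 0:
            # belief relaxes toward 1/2 because the RTN may have flipped
            p_plus = 0.5 + (p_plus - 0.5) * np.exp(-2 * gamma * t_dead)
            # spectator probe: noisy binary readout correlated with the RTN state
            p_click = 0.5 * (1 + np.tanh(v_spec) * states)
            clicks = rng.random(n_traj) < p_click
            # Bayesian update of the belief about the noise state
            like_plus = np.where(clicks, 0.5 * (1 + np.tanh(v_spec)),
                                          0.5 * (1 - np.tanh(v_spec)))
            like_minus = 1 - like_plus
            p_plus = (like_plus * p_plus /
                      (like_plus * p_plus + like_minus * (1 - p_plus)))
            # undo the phase predicted by the current noise-state estimate
            est = 2 * p_plus - 1
            phases -= v_data * est * steps_per_probe * dt
        coherence[k] = np.abs(np.mean(np.exp(1j * phases)))
    return coherence

w_free = simulate(correct_phase=False)
w_corr = simulate(correct_phase=True)
print("final-time coherence (free / corrected):", w_free[-1], w_corr[-1])
```

In this toy model the corrected coherence decays more slowly than the free case because each probe narrows the posterior over the noise state; the improvement shrinks as the dead time grows relative to the switching time, consistent with the role of the dead-time parameter discussed in the abstract.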
Keywords
qubit decoherence, noise, random-telegraph, solid-state