Pulse-level characterization of qubits on quantum devices
- Authors:
- Quiroga Salamanca, David Andrés
- Resource type:
- Undergraduate thesis
- Publication date:
- 2021
- Institution:
- Universidad de Antioquia
- Repository:
- Repositorio UdeA
- Language:
- eng
- OAI Identifier:
- oai:bibliotecadigital.udea.edu.co:10495/22001
- Online access:
- http://hdl.handle.net/10495/22001
- Keywords:
- Quantum theory
Algorithms
Calibration
Crosstalk
Machine Learning
Quantum Benchmarks
Quantum Computing
Quantum Machine Learning
Quantum Optimal Control
http://vocabularies.unesco.org/thesaurus/concept4810
http://vocabularies.unesco.org/thesaurus/concept2024
http://vocabularies.unesco.org/thesaurus/concept4530
- Rights
- openAccess
- License
- http://creativecommons.org/licenses/by/2.5/co/
Summary: ABSTRACT: Recent research has tackled the problem of mitigating noise present in quantum computers in the Noisy Intermediate-Scale Quantum (NISQ) era to enable precise computations and to benefit from the intrinsic properties of quantum mechanics. To this end, an important task is the characterization of the qubits available in quantum devices, so as to provide insight into how to reduce noise in the final output of a quantum circuit. Characterization comprises the analysis of noise sources, and this information can be used to reduce noise with methods such as Cycle Benchmarking, Quantum Error Mitigation, Quantum Error Correction, and others. Here, we study the optimization of pulses through Quantum Optimal Control (QOC) to obtain higher gate fidelity. We explore an algorithm that calibrates specific quantum gates by implementing optimized pulse schedules, and subsequently use the algorithm to analyze noise sources. Using calibrated gates as input, several benchmarking protocols, including pulse noise extrapolation, leakage analysis from quantum optimal control, and machine-learning-based classification of qubit readout, are tested to extract precise information on how noise influences the analyzed qubits. We explain and discuss different techniques for obtaining properties of qubits and quantum computers. We implement state discrimination with a Machine Learning (ML) focus to analyze readout errors caused by factors such as crosstalk and leakage into higher quantum states. We perform noise fitting of optimized pulses and evaluate the effectiveness of important quantum algorithms at the pulse level.
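As a hedged illustration of the gate-calibration step the abstract describes (and not the thesis's actual implementation), a π-pulse amplitude can be calibrated by sweeping the drive amplitude, measuring the excited-state population, and fitting a Rabi oscillation. The sketch below substitutes synthetic shot data for real hardware; the model, the "true" amplitude, and the shot count are all assumptions made for illustration:

```python
import numpy as np

def excited_population(amp, a_pi):
    # Simple Rabi model: the rotation angle grows linearly with drive
    # amplitude, and amp == a_pi produces a full pi rotation (population 1).
    return np.sin(0.5 * np.pi * amp / a_pi) ** 2

rng = np.random.default_rng(0)
a_pi_true = 0.62                    # assumed "true" pi-pulse amplitude (arb. units)
amps = np.linspace(0.05, 1.0, 40)   # amplitude sweep, as in a Rabi experiment
# Simulate 1024 measurement shots per amplitude point.
counts = rng.binomial(1024, excited_population(amps, a_pi_true)) / 1024

# Least-squares fit of the pi-pulse amplitude by a simple grid search.
candidates = np.linspace(0.1, 1.0, 2000)
residuals = [np.sum((counts - excited_population(amps, c)) ** 2)
             for c in candidates]
a_pi_fit = candidates[int(np.argmin(residuals))]
print(f"calibrated pi-pulse amplitude: {a_pi_fit:.3f} (true {a_pi_true})")
```

On real devices this sweep would be run as a pulse schedule (e.g. via Qiskit Pulse), with the fitted amplitude then used to define the calibrated gate that the benchmarking protocols take as input.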
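The ML-based readout-state discrimination mentioned in the abstract can likewise be sketched on synthetic data. Single-shot readout returns a point in the (I, Q) plane, and a classifier trained on calibration shots assigns each point to |0⟩ or |1⟩. The nearest-centroid discriminator below is a minimal stand-in for the ML classifiers the thesis studies; the state centroids and noise width are assumptions, not measured values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic single-shot readout data: one (I, Q) point per shot.
# Assumed centroids for |0> and |1>; real values come from calibration shots.
centers = {0: np.array([-1.0, 0.0]), 1: np.array([1.0, 0.3])}
shots, labels = [], []
for state, mu in centers.items():
    shots.append(rng.normal(mu, 0.45, size=(500, 2)))  # Gaussian IQ noise
    labels.append(np.full(500, state))
X, y = np.concatenate(shots), np.concatenate(labels)

# Train/test split, then classify each held-out shot by its nearest centroid.
idx = rng.permutation(len(X))
train, test = idx[:800], idx[800:]
mu0 = X[train][y[train] == 0].mean(axis=0)
mu1 = X[train][y[train] == 1].mean(axis=0)
pred = np.where(np.linalg.norm(X[test] - mu1, axis=1)
                < np.linalg.norm(X[test] - mu0, axis=1), 1, 0)
accuracy = (pred == y[test]).mean()
print(f"readout assignment fidelity on held-out shots: {accuracy:.3f}")
```

Crosstalk and leakage into higher states distort these IQ clouds, which is why the thesis analyzes such readout errors with more expressive ML discriminators than this linear baseline.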