Abstract

We investigate a species-selective cooling process for a trapped SU(N) Fermi gas that exploits entropy redistribution during the adiabatic loading of an optical lattice. Using a high-temperature expansion of the Hubbard model, we show that when a subset N_A < N of the single-atom levels experiences a stronger trapping potential in a certain region of space, the dimple, the cooling improves compared to working with an SU(N_A) Fermi gas alone. We show that optimal performance is achieved when all atomic levels experience the same potential outside the dimple, and we quantify the cooling for various N_A by evaluating the final entropy densities and temperatures as functions of the initial entropy. Furthermore, considering Sr-87 and Yb-173 for specificity, we provide a quantitative discussion of how such state-selective trapping can be achieved with readily available experimental techniques.
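
The sketch below is not the authors' code; it only illustrates the kind of entropy-matching calculation implied by the abstract, using the atomic (zero-hopping) limit of the SU(N) Hubbard model as the leading term of a high-temperature expansion. The parameter values (N, U, chemical potential, dimple depth, site counts, initial entropy) are hypothetical placeholders. Adiabatic loading conserves the total entropy, so the final temperature is obtained by solving S_total(T_f) = S_initial.

    # Illustrative sketch (hypothetical parameters): entropy matching for
    # adiabatic loading of a trapped SU(N) gas, with a deeper "dimple" region.
    import numpy as np
    from math import comb
    from scipy.optimize import brentq

    def site_entropy(T, N, U, mu, eps):
        """Entropy of one site with N flavors, on-site repulsion U, chemical
        potential mu and local potential offset eps, in the atomic limit."""
        n = np.arange(N + 1)
        degeneracy = np.array([comb(N, k) for k in n])
        energies = 0.5 * U * n * (n - 1) + (eps - mu) * n
        beta = 1.0 / T
        shifted = energies - energies.min()          # avoid overflow at low T
        weights = degeneracy * np.exp(-beta * shifted)
        Z = weights.sum()
        mean_E = (weights * energies).sum() / Z      # <H - mu*n> per site
        logZ = np.log(Z) - beta * energies.min()
        return logZ + beta * mean_E                  # S = ln Z + <H - mu*n>/T

    def total_entropy(T, N, U, mu, eps_sites):
        return sum(site_entropy(T, N, U, mu, eps) for eps in eps_sites)

    # Hypothetical trap: a deep dimple region on top of a shallow background.
    eps_sites = np.concatenate([np.full(20, -2.0),   # dimple sites
                                np.full(180, 0.0)])  # background sites
    N, U, mu = 6, 4.0, 1.0
    S_initial = 500.0   # entropy of the cloud before the lattice ramp (hypothetical)

    # Adiabatic loading: solve S_total(T_f) = S_initial for the final temperature.
    T_final = brentq(lambda T: total_entropy(T, N, U, mu, eps_sites) - S_initial,
                     0.01, 50.0)
    print(f"final temperature T_f = {T_final:.3f} (in units of the interaction scale)")

A state-selective version of this calculation would assign the deeper dimple offset only to the N_A selected levels; the single-site sum over occupations then splits into the N_A strongly trapped flavors and the remaining N - N_A flavors, which is the situation the abstract describes.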
