Memgpt-3x7b-MOE is a Mixture of Experts (MoE) made with the following models using LazyMergekit:
Base model
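Below is a minimal usage sketch for loading the merged model with `transformers`. The repo id `your-username/Memgpt-3x7b-MOE` is a hypothetical placeholder, and the example assumes the merge loads as a standard causal LM (e.g. a Mixtral-style MoE checkpoint).

```python
# Minimal usage sketch, assuming a hypothetical Hub repo id and a standard
# causal-LM MoE checkpoint loadable with transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/Memgpt-3x7b-MOE"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick a dtype suited to your hardware
    device_map="auto",    # requires `accelerate` for automatic placement
)

prompt = "Explain what a Mixture of Experts model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```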