Mixture of experts
Mixture of experts (MoE) is a machine learning technique in which multiple expert networks (learners) divide a problem space into homogeneous regions, with each expert specializing in its own region. MoE is a form of ensemble learning; such architectures were also historically called committee machines.
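The idea can be illustrated with a minimal sketch: a gating network produces per-input weights over the experts, and the model output is the gate-weighted combination of the expert outputs. All names and dimensions below are hypothetical, and each expert is reduced to a single linear map standing in for a full network.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical sizes: 4 experts, 8-dim inputs, 3-dim outputs.
n_experts, d_in, d_out = 4, 8, 3

# Each "expert" here is just a linear map (a stand-in for a full network).
W_experts = rng.normal(size=(n_experts, d_in, d_out))
# The gating network scores how relevant each expert is to a given input.
W_gate = rng.normal(size=(d_in, n_experts))

def moe_forward(x):
    """Combine expert outputs using weights from the gating network."""
    gate = softmax(x @ W_gate)                            # (batch, n_experts)
    expert_out = np.einsum('bi,eio->beo', x, W_experts)   # (batch, n_experts, d_out)
    return np.einsum('be,beo->bo', gate, expert_out)      # (batch, d_out)

x = rng.normal(size=(5, d_in))
y = moe_forward(x)
print(y.shape)  # (5, 3)
```

Because the softmax gate depends on the input, different inputs are routed (softly) to different experts, which is how the problem space ends up partitioned into regions that individual experts handle.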