Americentrism

Americentrism, also known as American-centrism or US-centrism, is the tendency to assume that the culture of the United States is more important than those of other countries, or to judge foreign cultures by American cultural standards. It describes viewing the world from an overly US-focused perspective, with an implied belief, whether conscious or subconscious, in the preeminence of American culture.

The term should not be confused with American exceptionalism, the assertion that the United States is qualitatively different from other nations, which is often accompanied by the notion that the United States is superior to every other nation.