The Western United States—commonly referred to as the American West or simply the West—traditionally refers to the region comprising the westernmost states of the United States (see the geographical terminology section for further discussion of these terms). Because the United States expanded westward after its founding, the meaning of the West has evolved over time. The Mississippi River is often cited as the easternmost possible boundary of the West.