Western United States
The Western United States, commonly referred to as the American West or simply the West, traditionally refers to the region comprising the westernmost states of the United States. Because the U.S. expanded westward after its founding, the meaning of the West has evolved over time. Prior to about 1800, the crest of the Appalachian Mountains was seen as the western frontier. Since then, the frontier has moved farther west, and the Mississippi River is now generally cited as the easternmost possible boundary of the West.
The "West" had played an important part in American history; the Old West is embedded in America's folklore.