Western United States  



"The novels of James Fenimore Cooper, Bret Harte, Mayne Reid, Gustave Aimard, and Karl May fanned enthusiasm for the American West."--Wild West Shows (1999) by Paul Reddin


The Western United States, commonly referred to as the American West or simply the West, traditionally refers to the region comprising the westernmost states of the United States. Because the U.S. expanded westward after its founding, the meaning of the West has evolved over time. Before about 1800, the crest of the Appalachian Mountains was seen as the western frontier. Since then, the frontier has moved farther west, and the Mississippi River came to be cited as the easternmost possible boundary of the West.

The "West" had played an important part in American history; the Old West is embedded in America's folklore.

Unless indicated otherwise, the text in this article is either based on the Wikipedia article "Western United States" or a corresponding page in another language edition of Wikipedia, used under the terms of the GNU Free Documentation License, or on research by Jahsonic and friends. See Art and Popular Culture's copyright notice.
