Western United States — Meaning, Definition & Usage

  1. noun — the region of the United States lying to the west of the Mississippi River.
     Synonym: West.

Source: WordNet