Western American Empire

The Western American Empire refers to the Empire of Australia's territory in the south-west of the American continent.