Originally Posted by
Gorrister
The southernmost state is not southern? 🤔
Originally Posted by
FiftySix
Probably rooted in the history of the United States. Florida was Spanish territory when the 13 colonies became U.S. states.
Then there was the U.S. Civil War, though I'm not sure how much Florida was involved in that war.
Even Texas (member of the C.S.A. in the Civil War) isn't necessarily considered Southern by people from the southeastern U.S. When I've been in the Carolinas and Georgia, I've been told I'm not from the South, I'm from Texas.
My, how things change. Florida is a very big state.
Whether or not Florida is considered "Southern" depends on where in the state you are. Parts of extreme South Florida are very Southern, as are portions closer to Georgia and Alabama, but even there it varies by locality—or at least it did when I was there, many long years ago. As for the "War Between the States," as it's called in Southern regions, or the Civil War (Northern regions), yes, Florida was very staunchly Confederate.
As for Texas, to a Southerner, that's not the South, that's "the West."