Warning: This blog post contains some disturbing pictures. One of these, in particular, is very graphic, and may merit special caution.
The Boer War in Southern Africa was more important than many Americans realize …
I would wager that most Americans have never heard of the Boer War. They might have heard of the Spanish-American War, which was fought around the same time, but they probably wouldn't remember much of it beyond Teddy Roosevelt charging up San Juan Hill. Their history classes, though, are unlikely to have even mentioned the Boer War. This is not surprising: the war was fought at the southern tip of Africa by various parts of the British Empire, and the United States was not involved, which may explain why our own history classes teach so little about it. Nonetheless, the Boer War was quite important, and it is still remembered as such in some other places.
Wounded British soldiers (circa 1900)