...followed a few days later by Germany.
Actually, IIRC, the US never declared war on Germany first - Hitler for some reason declared war on the United States a few days after the attack on Pearl Harbor (and Congress reciprocated the same day), which was a stupid move when you think about it. What if the US had only gone against Japan and stayed out of the European theater?
Roosevelt wanted that war.
Otherwise, I suppose the commies would have taken over most of Europe. There would have been coups in Switzerland and a war in Spain. I don't know how the Cold War would have played out with Stalin in control of Western Europe.
You think that if the Axis had won the war, there'd have been a Cold War between America and Nazi Germany, with America becoming increasingly left-wing and granting civil rights earlier in order to set itself apart from the rightist, eugenicist Nazis?
Eugenics didn't have a bad name until well after WWII; it was quite popular in certain circles in the early 20th century. The adherents of the Fabian Society (like George Bernard Shaw) in the UK and people like Margaret Sanger in the US were very much into eugenics. It was seen as progress, a hot new thing. This kind of 'progress' was mainly opposed by old-fashioned, mostly Roman Catholic men like G.K. Chesterton. Social Democratic Sweden, for example, continued its eugenics program well into the 70s.
Eugenics wasn't seen as a horrible crime against humanity, the way we see it now. And it certainly wasn't something popular on the right; it was a left-wing thing back then.
The American left of the day wasn't any less racist than the right; if anything, it was more so. The entire former Confederacy voted for FDR and his New Deal, while the more tolerant New Englanders disproportionately voted for Hoover and Alf Landon. If you wanted civil rights for blacks, the worst thing you could do was vote for the left. That's where the KKK was.
Nazis weren't really 'rightists', though that depends on your definition of the right and the left. If you go with the original definition - monarchists, classical conservatives and other supporters of the ancien régime on the right; socialists, liberals and other revolutionaries on the left - the Nazis would qualify as leftists. The Nazis and the fascists were self-consciously revolutionary. They often adopted revisionist or corporatist economic policies and hated free markets. They overthrew, neutered or declined to restore the traditional governing institutions of their countries. They were proponents of an all-powerful state outside of which nothing could legitimately exist: churches, youth groups, labour unions and so on in Nazi Germany were, as far as possible, abolished in favour of official, state-controlled National Socialist ones. They didn't support traditional morality (euthanasia and abortion were very popular with the Nazis, as long as they were done to the right people), and they antagonized Christians (look up Cardinal Clemens August Graf von Galen and Dietrich Bonhoeffer).
Now that I've addressed some of your faulty assumptions, I'll try to answer your actual question.
Before the war, Hitler was often seen by the establishment (the Chamberlain Tories, Americans on both sides of the aisle, the French Popular Front and the continental Christian Democrats) as a useful fool. They looked down on the demagogue and probably despised him, but their real enemy was Bolshevism. They wanted to destroy the Soviet Union, and they wanted to use Hitler to achieve that aim. Perhaps if Hitler had destroyed the Soviet Union, that would have somewhat appeased the West. America would probably have reverted to isolationism in case of an Axis victory. In fact, at the end of WWII America did nearly go back to isolationism. American liberals thought the Cold War could be prevented by not insulting Uncle Joe and by supporting the United Nations, while the right (like Sen. Robert Taft) wanted America to simply mind its own business. When Churchill proposed a renewed Anglo-American alliance against Russia in his 1946 Fulton Address, he was dismissed by both sides of the aisle. America would certainly have become isolationist if the Axis had won; it wouldn't have paid much attention to what happened abroad.
Besides, the West didn't really know about the Holocaust, or at least its magnitude, until the Allies walked into the concentration camps. If Hitler had won the war, the Allies might not have figured out what happened until years later. And even if they had found out as early, the Americans wouldn't (and, in fact, didn't) link the Holocaust to their own treatment of minorities, and not without reason: refusing blacks at a lunch counter is hardly comparable to butchering millions of Jews and gypsies. As far as I know, not even the leaders of the Civil Rights Movement compared segregation to the Holocaust. So it seems unlikely that civil rights would have been granted earlier if Germany had won WWII.
This might better go in the Politics forum rather than Other Topics. But nothing in my answer was about contemporary politics, so perhaps it can stay.