In what major country is it more popular and legitimate to be on the right? If I had to guess, it would be the UK: while most actors and celebrities there are lefties, it's by a much lower ratio than in the States. Also, leftist politicians, at least recently, were mocked as much as Trump was in this election, if not more so. There are also very few leftists in finance, unlike in the U.S. This might not have been the case ten years ago when Blair was in power, when, however much he was hated, Tories were still more mocked, but I think things have changed considerably since then.
Also, by "left" and "right," I am referring to the two broad political segments of any country.