In a book by another French author, René Girard, it is implied that the above lines of thought are insidious forms of nihilism, for:
Why work for actual success when you can fake it? Etc. Yes. Virtue signaling is a debasement of the currency.
Well, as it turns out, Girard's entire oeuvre can be said to advocate for that ineffective strategy of adopting extreme positions (e.g., if you think that dying on the cross was very ineffective for J. Christ).
My cynicism about politicians draws a shade over my eyes, and I cannot interpret it more charitably: I take this to mean that plucking heartstrings works as long as you take no substantive position on the issue. I think this can be seen empirically in Dem leadership communications strategies over the past few years/decades, with all the waffling, pathological centrism, and immediate appeasement of the squeaky wheels among us.
Mumford's "bribe" was always Faustian in my reckoning. At some maybe
unconscious level people know exactly what they're getting into but
think they can kick the can down the road. We hope at some unspecified
future time the beast can be appeased or tricked.
Baudrillard would, I think, follow McLuhan in blaming technology and pataphysics (e.g. the 'physics' of fiction) as opposed to "this is a conscious conspiracy to control people". A middle ground is that if CBS didn't do it, NBC would.
I've been in a conversation about "blaming technology" recently,
needing to disambiguate amorphous Technology (big-T), and its
possibility of neutrality, from specific malevolent creations. To
disambiguate land-mines and AK-47s from penicillin and baby incubators.
There's a mile of difference between creating an attractive nuisance,
a "Trap" to borrow from Adam Curtis, and failing to account for
negative side effects. We're still very much in the era of apologetics
and rationalisation with regard to knowingly creating controlling,
addictive and deleterious technologies. There is a kind of tragic
death drive or celebration of Thanatos/atrophy going on there. I
actually think the modern word that best captures it is "convenience".
(it must be pronounced with a sigh of resignation). :)
Link doesn't load, so with only the title to go on...
It feels like it's implying that sincere is the opposite of cynical? But someone (me) could be sincerely cynical. Or falsely optimistic.
In any case, signaling sincerity sounds a bit like fighting for peace, and doesn't give me high hopes for this article whenever we do get to find out what it says.
(It _is_ sort of fun to have this playground though, where we can all just freely admit we didn't read the article, and riff off the title alone)
I don't think they intend sincerity to be the opposite of cynicism. I think the idea is that people, being cynical, will assume that you are insincere. Presumably, TFA tells you how to avoid that.
So if you are sincerely cynical, how would people tell you that they are being sincere (when your assumption is that they are not)?
It turns out that your google cache page doesn't render the page, but the retrieved source does actually contain all of the content. If you view page source, you get:
How Candidates Can Signal Sincerity in an Era of Cynicism
Partisan polarization has reached historical highs, while politicians'
credibility has reached historical lows. For example, recent polls
suggest that as few as 8% of Americans think that politicians believe
most of the stances that they take on issues. This extreme level of
cynicism threatens to break a fundamental link in representation. If
candidates cannot credibly convey their positions, then voters cannot
evaluate them on policy. Yet, we know little about the strategies
politicians might take to convey the credibility of their claims. In
this paper, we investigate whether politicians can signal credibility by
taking extreme positions or by justifying their stances in moral terms.
Across three experiments, we show that moral justifications tend to
enhance credibility, while extreme positions do not. In a fourth study,
we show that while extreme stances increase polarization in candidate
ratings, moral justifications do not. Taken together, our findings
suggest that moral justifications are a useful strategy to enhance
credibility without contributing to rising levels of polarization.
Hypotheses
- H1: Candidates taking extreme issue positions will be perceived as more sincere.
- H2: Issue stances justified with moral language will be
perceived as more sincere.
Experimental Manipulations
A within-subjects vignette experiment. Respondents will be asked to
evaluate three hypothetical politicians, each taking a stance on a
particular issue. Within each candidate profile, the stance will be
randomly assigned to one of four conditions in a 2x2 design. The stance
will be either extreme or moderate and moral or pragmatic.
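The 2x2 assignment described above can be sketched in a few lines of Python. This is a hypothetical illustration of the design, not the authors' actual code; the names (`POSITION`, `JUSTIFICATION`, `assign_conditions`) are mine.

```python
import random

# Illustrative sketch of the within-subjects 2x2 design: each respondent
# evaluates three candidate profiles, and each profile's stance is
# independently randomized on two factors.
POSITION = ["extreme", "moderate"]
JUSTIFICATION = ["moral", "pragmatic"]

def assign_conditions(n_profiles=3, rng=random):
    """Return one randomly assigned (position, justification) pair per profile."""
    return [(rng.choice(POSITION), rng.choice(JUSTIFICATION))
            for _ in range(n_profiles)]

profiles = assign_conditions()
# e.g. [('extreme', 'moral'), ('moderate', 'pragmatic'), ('moderate', 'moral')]
```

Because both factors are randomized independently, the design also supports the interaction test the authors report (extremity x justification).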
Outcomes
An index of the following questions:
- Do you think this candidate truly believes in {stance}, or is just
saying what some people want to hear?
- In your opinion, how committed do you think this candidate is to
{stance}?
- In your opinion, how likely is it that this candidate will be a
leader on {stance}?
- In your opinion, how likely do you think it is that this candidate
will flip-flop on {stance} in the future?
Summary of Results
As expected, the moral justification is perceived as significantly more
credible than the pragmatic justification (b = .02, p = .002). The
extreme position, on the other hand, is seen as slightly, but not
significantly less credible than the more moderate position (b = -.007,
p = .295). Thus, consistent with Study 1, moral justifications increase
credibility, but extreme positions do not. Additionally, we find no
evidence of an interaction between the treatments.
References
Paper presented at the 2019 Texas American Politics Symposium
(TAPS).
It's genuinely horrifying to me that people appear to be upvoting the creation of a subhuman caste and delegating responsibility to them. Please think this through a bit more carefully, it's an interesting thought experiment but a terrible idea.
Consider also that not being able to lie is absolutely not the same thing as being trustworthy. If we've learned anything the last two years, it's that people can deeply believe some shocking things.
Could these public servants be subject to being misled? They can tell the truth but they aren't all knowing, they won't necessarily know when they've been lied to.
Could these public servants become a tool for a more powerful political entity? Perhaps truth is more subjective than we hoped, or perhaps they have been conditioned with a failsafe to keep them from speaking certain truths.
Could these public servants decide they don't like the life they've been given? They are humans, not mere tools.
> "And yes, you could be trusted to not lie. To eschew deceit."
Leaving aside that being unable to lie has nothing to do with knowing the truth, that poor wretch would be the most unelectable creature in existence. The voters do not want to hear "we don't know the right answer yet", "this solution is the right path in the long run but will cause poverty and suffering today", and other unpalatable truths that are largely unavoidable as part of governing a population.
It works well unless the person with such a mark had their wife kidnapped and the only way to ensure she was no longer being tortured was to tell a lie and cause an entire political dynasty to be wiped out. Or unless lying once enabled a double-crossing, and the heir to such a dynasty's escape was marked by the person with that mark.
1. Society already does this; most morality systems discourage lying (with exceptions of course).
2. Could your lobotomized person understand lying? If not, they won't be effective at governing the rest of us liars. If so, the "liar/truth-teller" logic puzzle applies: they could truthfully report what a liar would say, thereby truthfully lying.
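The "truthfully lying" move is just the classic knights-and-knaves double negation, which can be made concrete in a toy sketch (all names here are illustrative):

```python
def truth_teller(claim: bool) -> bool:
    """Reports any claim exactly as given."""
    return claim

def liar(claim: bool) -> bool:
    """Always inverts the claim."""
    return not claim

def truthful_report_of_liar(fact: bool) -> bool:
    """The truth-teller honestly relays what the liar would say:
    an accurate report whose content is nonetheless false."""
    return truth_teller(liar(fact))
```

So a constitutionally honest ruler who relays a liar's statement produces a false claim without ever lying themselves, which is the loophole the puzzle turns on.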
The lying part is secondary—no maliciousness is the important part. Maybe what you really want is a “Good” ruler; philosophy and ethics have been wrestling with how to define that for thousands of years. I’d be shocked if such a ruler would have no need to lie, unless they have no internal or external threats (so an all-encompassing autocracy, perhaps like the God Emperor of Dune). Speaking of sci-fi, the Culture novels’ hyper-intelligent AI “Minds” are, AFAIK, “good” rulers with external threats, though I don’t recall the Minds being limited by honesty.
(hosting at the Open Science Framework, "a free, open platform to support your research and enable collaboration")