[Philosophy of Social Cognition] Fifteenth Meeting

Martyna Meyer martyna.meyer at univie.ac.at
Sun Jun 25 10:31:33 CEST 2023


Dear all,

I hope you're doing well and enjoying the last days of the semester :) 

We have only one meeting* (*plus a bonus session) left!


For our next meeting, we will read:

Kiverstein, J. (2015). Empathy and the responsiveness to social affordances. Consciousness and Cognition, 36, 532–542.
If you have a UniWien account, you can access it here: https://www.sciencedirect.com/science/article/pii/S1053810015001105
Otherwise, please let me know!

You can join either online (Zoom link <https://univienna.zoom.us/j/65514918078?pwd=cVZTd2Ivb09uSUFVNTZORWFIOTA4UT09>) or in person. We meet in room 3B (NIG, third floor) at 6:30 CEST.
The meeting takes place on Tuesday, June 27, 2023.

- - - - - - - - - - - - 

I am also forwarding you an email from Bailey (thank you so much, Bailey!) on the topic of empathy theories (the M. Ratcliffe paper we discussed on 13.06.2023):

- - -

Hello Martyna/group! Sorry for the long ramble produced by a sleepless night. Hopefully it’s of use to someone. 

I still (of course) feel deeply bothered by the account given in “Empathy Without Simulation” with respect to its not requiring truth conditions. However, talking with you/Martyna afterwards, I do also feel a lot of the intuitive pull of treating empathy as constituted by “attitudes” rather than doxastic commitments. I’ll try to reframe my position here. 

Here’s how I understand my own argument in propositional form: 

1. It is an empirical fact that we are aware of the existence of other people’s mental states (e.g., the false-belief task). 
2. The attempt to account for the capacity described in Premise 1 is called “Theory of Mind” [ToM]. 
3. “Empathy” is a derivative application of ToM. For consider if it weren’t: if empathy were not an application of ToM, then empathizing would in no way involve the mental states of others. But empathy does involve the mental states of others. Therefore it must be an application of ToM. 
4. Because the mental states of others are facts of the world, ToM must be truth apt in order to be answerable to those facts (Necessary Condition). 
5. From 3-4, empathy must be truth apt (Necessary Condition). 
6. The definition of empathy presented in “Empathy Without Simulation” is not truth apt. 
7. From 5-6, the definition of empathy in “Empathy Without Simulation” fails to meet a necessary condition of empathy. 
Conclusion: The definition of empathy in “Empathy Without Simulation” fails. 
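
(For anyone who likes the skeleton laid bare: here is a minimal formalization sketch of the step from 5-6 to the conclusion, written in Lean 4. The names — Account, adequate, truthApt, ews — are placeholders of my own, not anything from the paper; all the philosophical weight sits in whether premises 5 and 6 are true, since the inference itself is just modus tollens.)

-- Sketch of the inference from premises 5-6 to the conclusion (Lean 4).
-- All names below (Account, adequate, truthApt, ews) are illustrative placeholders.

variable (Account : Type)                      -- candidate definitions of empathy
variable (adequate truthApt : Account → Prop)  -- "meets all necessary conditions", "is truth apt"
variable (ews : Account)                       -- the "Empathy Without Simulation" definition

-- Premise 5: any adequate definition of empathy is truth apt.
-- Premise 6: the EWS definition is not truth apt.
-- Conclusion: the EWS definition is not adequate.
theorem ews_fails
    (p5 : ∀ a, adequate a → truthApt a)
    (p6 : ¬ truthApt ews) :
    ¬ adequate ews :=
  fun h => p6 (p5 ews h)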

From a philosopher’s point of view, it would be weird to put an Intuition Pump *after* a formal argument. Nevertheless, I think that the following case might help make my worry seem more reasonable. Here’s a specific case, drawn (mostly) from real life. 

BICKLEY JR CASE (a slight adaptation of the real-life John Hinckley Jr case) 

Bickley Jr is a young American man who has recently seen the movie Limousine Driver. In it, a delusional limousine driver aims to impress a crush (played by Josy Albers) by assassinating a political candidate whom she dislikes. Bickley Jr becomes attracted to Albers, and eventually goes on a date with her that ends poorly. Mentally unwell, and feeling as if he has something to make up for, he begins to make increasingly dramatic apologies despite her ignoring him. Under the belief that her feelings are similar to those of the character she played, he wonders whether murdering a political candidate himself would both make up for his mistake and assuage the feelings of hurt towards him which he imagines she has. He goes through with the thought and attempts to assassinate the candidate, but fortunately is unsuccessful. 

It seems crazy to me to describe what Bickley Jr feels towards Josy Albers as empathy. Yet it seems like the account in “Empathy Without Simulation” would commit us to that view. He does have an *attitude* of empathy towards Albers, and he is (overly) open to whatever experience of the world he imagines she has. My immediate intuition that this is still not sufficient for calling him empathetic, however, could easily be accommodated by simply adding the condition that his beliefs about her mental states (particularly her beliefs and desires) must be *right*. His beliefs about what she wants are false and delusional. Following the propositional argument above, that would be sufficient to explain why he’s not empathetic. This is perhaps a metaethical worry and not a Cog Sci/Phil Mind one, but it’s a worry nonetheless. 

The burden on enactivist-y, norm-y accounts, it seems to me, is to explain premises 4-5 of the propositional argument. One possible way to do this might be to specify truth conditions for coordination. Simply two agents A and B *being* coordinated, however, would not be sufficient: Bickley Jr and Josy Albers *are* coordinated, in that their actions are in feedback loops with each other. Yet something seems to go wrong in their coordination. If enactivist-y, norm-y stuff can explain what goes wrong in a way in which propositional conditions obtain, then my worry might be assuaged and the “Empathy Without Simulation” account might be salvageable. 

- - -

I’m looking forward to seeing you on Tuesday :)



Best,

Martyna



