Using Face++ to read emotions in images of faces.
Theresa Kuentzler wrote a nice post, linking R to the Face++ API. So I thought I’d give it a go too.
First, we need to choose an image with faces, and for this crucial decision we formed a focus group in the family here. The basic Face++ plan limits us to five faces per image, so that ruled out Modern Family, Friends, and Ocean's 8. Images from Poirot, Stargate, and Line of Duty featured faces that were pretty stoic, so not too interesting emotion-wise. But then we looked at the magnificent Derry Girls television series and knew we had found our mark. It tells the story of a bunch of schoolchildren in 1990s Derry, set against the backdrop of the Troubles and the Peace Process, with a great soundtrack and an irreverent sense of humour.
The image chosen looks like this:
mypaths <- "images/derry-girls.jpg"
derry_girls <- magick::image_read(mypaths)
# image dimensions in pixels, needed later to flip the y-axis and set plot limits
width <- magick::image_info(derry_girls)$width
height <- magick::image_info(derry_girls)$height
plot(derry_girls)
Face++ needs registration and authorisation keys; the post from Theresa mentioned above discusses how to set these up. I wasn't keen to have my key on GitHub, so it's read here from a file outside the repo.
myauth <- readRDS("../../../myauth_faceplusplus")
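For reference, that credentials file can be created once, outside the repository, along these lines; the key and secret values shown are of course placeholders for your own credentials from the Face++ console:

```r
# Run once, outside the repo, substituting your own Face++ credentials
# (the values below are placeholders)
saveRDS(list(api_key = "YOUR_API_KEY",
             api_secret = "YOUR_API_SECRET"),
        "../../../myauth_faceplusplus")
```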
The function below is the workhorse of this post, again based largely on Theresa's code. Note the block of fromJSON() statements in the middle; one of Hadley's Rules of Programming is that if you repeat code more than twice it should become its own function, but in this case it seemed clearer to let these statements stand on their own.
face_plus_plus <- function(fullpath) {
  face <- httr::RETRY("POST", "https://api-us.faceplusplus.com/facepp/v3/detect",
                      body = list(api_key = myauth$api_key,
                                  api_secret = myauth$api_secret,
                                  image_file = httr::upload_file(fullpath),
                                  return_landmark = 0,
                                  return_attributes = "emotion,gender"),
                      times = 2,
                      encode = "multipart") %>%
    as.character()
  anger <- jsonlite::fromJSON(face)$faces$attributes$emotion$anger
  disgust <- jsonlite::fromJSON(face)$faces$attributes$emotion$disgust
  fear <- jsonlite::fromJSON(face)$faces$attributes$emotion$fear
  happiness <- jsonlite::fromJSON(face)$faces$attributes$emotion$happiness
  neutral <- jsonlite::fromJSON(face)$faces$attributes$emotion$neutral
  sadness <- jsonlite::fromJSON(face)$faces$attributes$emotion$sadness
  surprise <- jsonlite::fromJSON(face)$faces$attributes$emotion$surprise
  gender <- jsonlite::fromJSON(face)$faces$attributes$gender
  top <- jsonlite::fromJSON(face)$faces$face_rectangle$top
  left <- jsonlite::fromJSON(face)$faces$face_rectangle$left
  tibble(anger, disgust, fear, happiness, neutral,
         sadness, surprise, top, left,
         gender = gender$value, image = fullpath)
}
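For what it's worth, the repeated parsing could also be avoided by calling fromJSON() once and indexing the resulting list. A sketch of that alternative, as a hypothetical helper (not the function used in this post), assuming the same response structure as above:

```r
# Sketch: parse the JSON response once, then index the parsed result.
# `face` is the raw JSON string returned by the detect endpoint.
parse_face_response <- function(face, fullpath) {
  parsed <- jsonlite::fromJSON(face)$faces
  dplyr::bind_cols(
    parsed$attributes$emotion,    # anger, disgust, ..., surprise columns
    tibble::tibble(gender = parsed$attributes$gender$value,
                   top    = parsed$face_rectangle$top,
                   left   = parsed$face_rectangle$left,
                   image  = fullpath)
  )
}
```

Whether that is clearer than the explicit block above is a matter of taste.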
And now we can run our function. We have to code the character names by hand, and I decided to recode the vertical position so that the origin sits in the bottom-left corner (height here is the pixel height of the image), which synchronises better with the plot to come.
derry <- map_df(mypaths, face_plus_plus) %>%
  arrange(left) %>%
  mutate(name = c("Michelle", "James", "Erin", "Orla", "Claire"),
         x = left,
         y = height - top) %>%
  select(-c(image, top, left))

derry %>% gt()
| anger | disgust | fear  | happiness | neutral | sadness | surprise | gender | name     | x   | y   |
|-------|---------|-------|-----------|---------|---------|----------|--------|----------|-----|-----|
| 0.046 | 0.046   | 0.066 | 65.650    | 26.834  | 6.803   | 0.556    | Female | Michelle | 92  | 321 |
| 2.374 | 1.132   | 0.301 | 0.079     | 12.066  | 7.675   | 76.374   | Male   | James    | 186 | 332 |
| 0.006 | 0.217   | 0.006 | 0.006     | 21.087  | 0.006   | 78.670   | Female | Erin     | 315 | 326 |
| 0.000 | 0.009   | 0.009 | 99.926    | 0.002   | 0.000   | 0.053    | Female | Orla     | 438 | 329 |
| 0.002 | 0.018   | 1.847 | 0.002     | 0.002   | 0.002   | 98.125   | Female | Claire   | 532 | 259 |
Next up, we make a separate data table to generate labels for the plot below. It discards emotions scoring less than 10% for each character, and it builds in some HTML to format the labels. The font, Amiri, was the best match I could find to the text in the school logo; the colour, #004400, lines up with the uniform colour.
emotions <- derry %>%
  select(-c(gender, x, y)) %>%
  pivot_longer(cols = -c(name),
               names_to = "emotion",
               values_to = "percentage") %>%
  dplyr::filter(percentage > 10) %>%
  mutate(percentage = round(percentage, 1)) %>%
  unite("emotion", emotion:percentage, sep = ": ") %>%
  mutate(emotion = glue::glue("{emotion}%")) %>%
  group_by(name) %>%
  summarise(emotion = paste(emotion, collapse = "<br>")) %>%
  ungroup() %>%
  mutate(name1 = glue::glue("<b>{name}</b>")) %>%
  unite("emotion", name1:emotion, sep = "<br>") %>%
  mutate(emotion = glue::glue("<p style='color:#004400; font-size:28px; font-family:Amiri;'>{emotion}</p>"))
Putting this together, using ggplot() with background_image() from the ggpubr package, gives:
# centre of the image in pixels (assumed; the offsets below push labels
# away from this point)
centre <- c(width / 2, height / 2)

derry %>%
  left_join(emotions, by = "name") %>%
  ggplot(aes(x, y)) +
  coord_cartesian(xlim = c(0, width),
                  ylim = c(0, height)) +
  background_image(derry_girls) +
  ggtext::geom_richtext(aes(x = x + ifelse(x > 100,
                                           sign(x - centre[1]) * 50 + 50,
                                           sign(x - centre[1]) * 50 + 10),
                            y = y + ifelse(x > 400,
                                           sign(y - centre[2]) * 65,
                                           sign(y - centre[2]) * (-120)),
                            label = emotion)) +
  theme_void()
The label positions were a bit hit-and-miss, but I wanted to use the face positions as discovered by Face++ rather than code the positions by hand.
Seems like Face++ captured the emotions expressed on these faces pretty well; now if only it could produce a script as sharp as that of Lisa McGee…
Text and figures are licensed under Creative Commons Attribution CC BY 4.0. Source code is available at https://github.com/eugene100hickey/fizzics, unless otherwise noted. The figures that have been reused from other sources don't fall under this license and can be recognized by a note in their caption: "Figure from ...".
For attribution, please cite this work as
Eugene (2021, May 3). Euge: Derry Girls. Retrieved from https://www.fizzics.ie/posts/2021-05-03-derry-girls/
BibTeX citation

@misc{eugene2021derry,
  author = {Eugene, },
  title = {Euge: Derry Girls},
  url = {https://www.fizzics.ie/posts/2021-05-03-derry-girls/},
  year = {2021}
}