Carrie Fisher’s death renews questions about digital re-creation

Carrie Fisher’s death was a jolt to the broad community of “Star Wars” fans. But it could help spark a necessary and overdue conversation about what might be called digital etiquette, and the practice of using computer-generated imagery to re-create and manipulate the likenesses of dead actors.

Last week Lucasfilm responded to rumors that Fisher’s “Star Wars” character could be digitally inserted into future movies, issuing a statement that read, “We want to assure our fans that Lucasfilm has no plans to digitally re-create Carrie Fisher’s performance as Princess or General Leia Organa.”

The company added that it intended to “cherish her memory and legacy as Princess Leia, and will always strive to honor everything” she contributed to the franchise.

The announcement sounded appropriate, and even felt like something of a relief, especially given Fisher’s often-caustic views about Hollywood and celebrity, including the indignities associated with her “Star Wars” fame along with the perks.

Still, the late Peter Cushing’s presence in “Rogue One: A Star Wars Story,” as well as other instances of technology employed in that fashion, has understandably fueled discussion about what happens if a performer should die unexpectedly, as well as the ethics of replicating them from beyond the grave.

The issue is far from new, having stirred debate and even a 1999 legal challenge by Fred Astaire’s widow after his likeness turned up in a TV commercial. Through the years, it has frequently been punctuated by cases of studios using computer-rendered images to complete films.

Early examples range from the completion of Brandon Lee’s scenes in “The Crow” after his death during filming, to the posthumous appearance of the lead character’s mother in “The Sopranos,” to Oliver Reed’s supporting performance in “Gladiator.” More recently, scenes featuring Paul Walker, with his brothers serving as stand-ins, were incorporated into the most recent “Fast & Furious” sequel.

Directors have also capitalized on the process to conjure younger versions of living actors — as in “The Curious Case of Benjamin Button” — the kind of effect that can’t readily be achieved using makeup.

When the movie “Final Fantasy” was released in 2001, concerns arose about “photo-realistic animation” designed to approximate humans on screen. At the time, actor Tom Hanks said he was “troubled by it,” as was the Screen Actors Guild, which bristled at the then mostly theoretical notion that filmmakers could take the raw material actors leave behind and concoct “a kind of cyberslave who does the producer’s bidding without a whimper or salary,” as the New York Times put it.

Among those offering reassurance was, notably, “Star Wars” mastermind George Lucas. While he acknowledged his pioneering use of digital characters like Jar Jar Binks, he told the paper, “I don’t think I would ever use the computer to create a human character. It just doesn’t work. You need actors to do that.”

A lot has happened since, including refinements that have made the results more passable, if not perfect. But many fans and performers are still understandably queasy about seeing actors turned into the equivalent of digital widgets, their likenesses used in ways they might not have anticipated or of which they wouldn’t necessarily approve.

Obviously, there’s no easy solution in a situation like this, and even recasting a role has its drawbacks. The bottom line, though, is that with studios eager to wring revenue out of established properties, these questions won’t go away, which makes some sort of standard helpful going forward.

Where to draw those lines isn’t always clear, and no single rule is likely to fit every case. But as with so much that pertains to technology, a good starting point might be the assumption that just because you can doesn’t always mean that you should.