Chubby Checker and 230 Immunity

It’s always nice to get a straightforward case with a straightforward result.  The rather humorous facts involve singer Chubby Checker (best known for “The Twist”) suing Hewlett-Packard (HP) over an app in Palm’s app store called the “Chubby Checker.”  This app claimed that it could guess the size of a man’s genitalia from his shoe size.  Checker, not happy with the association, opted to sue HP for contributory infringement of various intellectual property rights (such as trademark and publicity rights).  Checker also claimed that he went through HP’s notice and takedown procedure, but was ignored.  HP in turn asserted a defense under section 230 of the Communications Decency Act (CDA), claiming that the statute immunized them.

The court agreed with HP, as it should have.  The purpose of section 230 of the CDA is to shield entities that host third-party content on their websites from liability for that content.  This case represents a fairly straightforward example of the exact situation the law envisions.  Basically, the CDA does not treat the service provider as the speaker or publisher of information as long as a three-pronged test is met: 1. the defendant has to provide an “interactive computer service”, 2. the claim has to treat the defendant as the speaker or publisher of the harmful information, and 3. the information has to be provided by another information content provider.  The law doesn’t apply to intellectual property claims, but does apply to claims like defamation or obscenity.

While HP didn’t get the trademark infringement claims dismissed (section 230 doesn’t reach intellectual property), they clearly meet the statute’s standards for everything else.  HP, after all, did not make the Chubby Checker app, even if it was available in their app store.  Checker’s lawsuit treats them as if they were the publisher or had some role in the creation and maintenance of that app.  The court, however, concluded that there is no evidence HP had any role in the creation of the app, and that HP thus qualifies for section 230 immunity.  The trademark case will go forward, and HP has some worries there (particularly if they ignored Checker’s notice).

As Eric Goldman points out, this was an easy case.  It is significant because there do not appear to be any other cases deciding whether section 230 immunity protects app stores.  This ruling should also help limit litigation against the other major app stores over similarly tawdry apps.

And It Begins: 3D Printing and the New Age of DRM

I suppose that there was bound to be some attempt at Digital Rights Management (DRM) when it came to 3D printing.  Any technology that potentially disrupts the intellectual property rights of companies with significant resources (in this case, manufacturers of consumer products such as toys and simple electronics) makes some form of DRM to limit that technology likely.  What’s intriguing is the nature of the DRM proposed by startup Authentise.

Authentise proposes a system that streams designs to the printer for a single use.  The idea is to have the company operate similarly to Netflix: stream the product to the customer when they need it.  Authentise even has a video explaining the basic technology: stream the blueprint to the client but never give them access to the full file (so it can’t simply get copied and shared). 
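As a rough illustration of the model described above, a one-time streaming license might look something like the sketch below. To be clear, everything here (the class name, the chunk size, the license object) is my own invention for illustration, not Authentise's actual design:

```python
import io

class StreamLicense:
    """Hypothetical single-use license: the full design file never
    leaves the server; the client receives it chunk by chunk, once."""

    def __init__(self, design_bytes, chunk_size=64):
        self._buffer = io.BytesIO(design_bytes)
        self._chunk_size = chunk_size
        self._consumed = False

    def stream(self):
        # A one-time generator: once the design has been streamed to
        # the printer, the license is spent and cannot be replayed.
        if self._consumed:
            raise PermissionError("license already consumed")
        self._consumed = True
        while True:
            chunk = self._buffer.read(self._chunk_size)
            if not chunk:
                break
            yield chunk

design = b"G1 X10 Y10\n" * 100  # stand-in for a sliced 3D model
lic = StreamLicense(design)
printed = b"".join(lic.stream())
assert printed == design  # the printer saw the whole design, once
try:
    list(lic.stream())  # a second stream attempt is refused
except PermissionError:
    pass
```

The design choice worth noticing is that the client never holds the complete file at rest, which is the whole point of the Netflix analogy: copying requires reassembling the stream, not just copying a file.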

Andre Wegner, Authentise’s founder, says that he hopes to make this process simpler and easier than using unauthorized copy methods.  If nothing else, it appears that Authentise learned some important lessons from the past decade and a half of attempts by various industries to control the distribution of their content.  Instead of trying to stop infringement through technological means, Authentise aims to provide a legitimate alternative to unauthorized copying of 3D blueprints.  Authentise knows that the goal is to provide an easier-to-use legal alternative (and even mentions services such as Spotify and Netflix as examples of that point).  This represents a much better starting point than the mess that characterized the rise of music and movie file sharing earlier in the millennium (for those that remember systems like StarForce and Sony’s rootkit DRM).

What sort of legal issues does Authentise portend?  The technology itself doesn’t appear to raise any specific legal issues, since it is not intrusive or overbearing (at least at first glance).  But starting with better DRM does not head off the problems caused by legal uncertainty.  The copyright questions alone create enough uncertainty to invite significant litigation.

Michael Weinberg, from the intellectual property advocacy group Public Knowledge, addresses one possibility in a white paper: the application of copyright law to 3D printed objects.  Copyright law potentially applies in two ways.  First, there is the issue of whether an individual can claim copyright protection over the blueprint.  Generally speaking, a party cannot own a copyright in a design for a useful object (see the definition of “useful article” in 17 U.S.C. § 101).  The design for that useful object has to go beyond a utilitarian description in some manner (such as adding aspects beyond merely utilitarian features).  That creates a significant number of potential legal issues on its own.  The test for determining which aspects of a utilitarian work with artistic elements are subject to copyright and which are not (the “separability” test) is not clear.  Separability requires that the object possess some creative elements conceived without regard for function (Brandir Int’l, Inc. v. Cascade Pac. Lumber Co., 834 F.2d 1142 (2d Cir. 1987)) for there to be any copyright considerations at all.  However, recall last Friday’s article about the Digital Millennium Copyright Act (DMCA) and how it functions, particularly the notice and takedown system.  That system assumes that the entity sending a takedown notice owns a valid copyright in the item named in the notice (512(f) can only be utilized by the party affected by the takedown notice).  Given the judgment call necessary to make that determination, DMCA enforcement will likely present a major issue for 3D printing websites in the near future.

The second copyright issue is related: how can these sites distinguish a useful object from a creative one?  Useful objects (such as screws or bolts) generally do not receive copyright protection, whereas creative objects generally do.  A useful object, as stated earlier, can only receive copyright protection in its artistically unique elements, which must be separable from the object’s functional elements.  In other words, artistic elements only receive protection if they aren’t too tied into how the object functions.  That determination may not be clear to users of websites such as Thingiverse, since the distinction may not show up in the CAD file used to print the object.

As these issues indicate, there are many unresolved legal matters in regard to 3D printing.  Authentise learning some lessons from the DRM of the past should reduce the pain of some of the coming legal battles (and at least make for a convenient legal service, from the sound of it).  Many issues remain unresolved, though, and resolving them will likely involve long legal battles reminiscent of the Napster and Grokster days.


DMCA Overreach: How a Little Used Provision Makes a Big Difference

Happy Friday everyone.  Recently, record label Liberation Music issued a Digital Millennium Copyright Act (DMCA) takedown for a video posted by digital copyright guru Lawrence Lessig over its use of the song “Lisztomania” by Phoenix.  The video showed a lecture by Lessig (entitled “Open”) that displayed different groups of people in a variety of countries dancing to the song, in order to make a point about how online culture allows different groups to comment on the same material.  Lessig, in tandem with the Electronic Frontier Foundation (EFF), decided to sue Liberation Music over the takedown.  Here is the complaint, for those so inclined.  Leaving aside how unwise it is to pick a copyright fight with a copyright expert, this case provides an excellent opportunity for a plaintiff to successfully utilize section 512(f) of the DMCA.

Here’s a basic overview of the DMCA notice and takedown system.  The DMCA provides a limitation on liability for websites that host third-party content, even if that content infringes on another entity’s copyright.  As long as a website meets the requirements of 17 U.S.C. 512 (such as not having prior knowledge of infringement and moving quickly to correct the issue when so informed), it is not liable for copyright infringement committed by third parties on its network (a protection referred to as a “safe harbor”).  In practice: a content owner finds something infringing, sends a letter asking the host (say, Youtube) to take it down (called a takedown notice), and the host takes it down if it thinks the content actually infringes on the owner’s copyright.
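The flow just described can be sketched as a toy decision rule. The field names and logic below are illustrative simplifications of my own, not the actual statutory requirements of section 512:

```python
# Toy sketch of the notice-and-takedown flow; the fields and rules
# here are invented simplifications, not the real 512 requirements.
def handle_notice(host, item, notice):
    """A host erring on the side of removal to preserve its safe harbor."""
    if notice.get("claims_ownership") and notice.get("identifies_work"):
        host["removed"].add(item)
        return "taken down"
    return "rejected as incomplete"

host = {"removed": set()}
notice = {"claims_ownership": True, "identifies_work": True}
result = handle_notice(host, "video123", notice)
assert result == "taken down"
assert "video123" in host["removed"]
```

Note what the sketch deliberately omits: nothing checks whether the claimed ownership is real or whether the use is fair, which is exactly the asymmetry discussed below.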

Theoretically, this system works fine for everyone involved.  Content owners gain an easy mechanism for dealing with infringement, sites that traffic heavily in user-generated content (UGC) have an enormous liability issue removed, and users get to use those websites.  Problems emerge in the execution.  Eager to preserve their safe harbor status, UGC websites often err on the side of caution when dealing with takedown notices.  These sites remove content without further investigating whether it does indeed infringe.  Some sites (like Youtube) even automate their takedown procedures, removing videos deemed infringing without human review (in Youtube’s case, using software called ContentID that matches user-submitted videos against clips provided by content holders).  Many of these sites (again, like Youtube) maintain an appeals process for when a user thinks their content was wrongfully taken down.  In Youtube’s case, this appeals process depends heavily on the content owner’s willingness to release the claim or unwillingness to file a formal notice.  Given the ease of issuing a formal DMCA takedown notice, uploaders possess very little leverage in this process.
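To make concrete what "matching user-submitted videos against clips provided by content holders" might involve, here is a deliberately simplified fingerprint-overlap check. The real ContentID uses robust perceptual audio/video fingerprints and proprietary thresholds; everything below, including the 0.8 cutoff, is invented for illustration:

```python
# A toy stand-in for automated content matching: hash overlapping
# byte windows so a match survives partial reuse of the clip.
import hashlib

def fingerprint(clip: bytes, window: int = 8) -> set:
    return {hashlib.sha256(clip[i:i + window]).hexdigest()
            for i in range(0, max(len(clip) - window + 1, 1))}

reference = fingerprint(b"never gonna give you up")      # rightsholder's clip
upload = fingerprint(b"...never gonna give you up...")   # user-submitted video

overlap = len(reference & upload) / len(reference)
action = "auto-takedown" if overlap > 0.8 else "no match"  # policy knob
```

The point of the sketch is the policy consequence: the pipeline compares signals and applies a threshold, with no step that asks whether the reuse might be fair use.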

What makes Lessig’s move interesting is that he and the EFF are attempting to utilize 512(f) of the DMCA, which allows an uploader to claim damages when a content owner files a bogus takedown notice.  512(f) requires that the content owner knowingly misrepresents either their ownership of the material or the infringing nature of the uploader’s content.  To this effect, Lessig claims that his use of Phoenix’s song constitutes fair use (which provides an exception to copyright infringement in the case of education, among many others) and that Liberation knew or should have known that Lessig’s use of the song represented a valid case of fair use.  Given that Lessig is a professor whose use of the song was in a lecture regarding his subject of expertise (copyright and the internet), Lessig’s argument for fair use is rather strong.  Education is one of the more straightforward fair use exceptions, and makes for a stronger argument that Liberation knew or should have known that their takedown order was bogus.  It should be interesting to see if the court agrees.

Unfortunately, Lessig’s position is not typical of individuals receiving overreaching takedown notices.  Most obviously, Lessig is a well-known copyright expert with significant resources.  Not every individual uploading a video to Youtube has similar access to the legal knowledge, public stage, or money of a man like Lessig.  Also, 512(f) requires that the party sending the takedown notice knowingly misrepresented either their ownership of the copyright or that the material infringed.  In other words, the content owner has to have known, or should have known, that their takedown notice was not valid.  Parties relying on fair use are going to have a hard time meeting this “knowing” standard, simply because fair use is a complicated exception evaluated on a case-by-case basis (I would go into more detail as to what constitutes fair use, but that’s at least another post’s worth of material).  Outside a few obvious situations (such as parody, criticism, or education), the uploader will have a harder time showing that the content holder knew or should have known that the upload met fair use requirements.  512(f) still potentially helps parties receiving a takedown notice from an entity that does not even own the material (which happens a lot more than you’d think), since knowing misrepresentation is much easier to prove in that case.

A successful court case for Lessig and the EFF would make people a lot more aware of 512(f), and their right to fight back against fraudulent takedown notices.  This court case will also provide the public with some well-known case law regarding how judges interpret 512(f), which in turn allows future content uploaders to know their rights and method of obtaining restitution beforehand.  It should be interesting to see the final result.

Lacking Authorization

What is unauthorized access?  That question confounds any lawyer attempting to discern the restrictions and limits of the Computer Fraud and Abuse Act (CFAA).  The US District Court for the Northern District of California recently addressed the issue in Craigslist v. 3taps, declining to dismiss Craigslist’s CFAA claims.  In this case, 3taps scraped and republished ads from Craigslist.  Craigslist reacted by sending a cease-and-desist letter, then blocking IP addresses associated with 3taps.  Craigslist then sued 3taps for violation of the CFAA, along with a number of other causes of action.

While the headlines appear jarring, this case is actually both rather straightforward and an interesting study in the problems with the CFAA.  The CFAA, a 1986 law designed to deter hacking, generally makes “accessing a computer without authorization or [by exceeding] authorized access” illegal.  There are more specific sections of the law as well, mostly dealing with various forms of fraud or breaking into government computers.  The issue with the CFAA and its accompanying case law is that the statute’s language is very broad.  Courts could potentially interpret “without authorization” to cover any number of activities.  Prosecutors have invoked the CFAA in cases ranging from violating a website’s terms of service to mass-downloading academic and scientific papers from a subscription database.  The courts don’t always agree, rejecting some of these interpretations (like the terms of service theory in US v. Lori Drew).

The judge in this case uses straightforward logic: Craigslist normally holds itself open for any user to access.  Craigslist then rescinded that access from 3taps by sending a cease-and-desist letter and specifically blocking 3taps’ IP addresses (removing 3taps’ authorization).  Any further attempt to access Craigslist therefore constituted access without authorization, and a CFAA violation.  Judge Charles Breyer, the judge in this case, felt that imposing a technological barrier to access constitutes revoking authorization for an otherwise public site.
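Mechanically, the "technological barrier" the court relied on is nothing more exotic than a server-side blocklist check. A minimal sketch (the blocked range below is a documentation address standing in for 3taps' IPs, not anything from the case) also shows why the dynamic-IP question matters:

```python
# A site open to everyone except specifically blocked addresses --
# the kind of access revocation at issue in Craigslist v. 3taps.
from ipaddress import ip_address, ip_network

BLOCKED = [ip_network("203.0.113.0/24")]  # hypothetical stand-in range

def authorized(client_ip: str) -> bool:
    addr = ip_address(client_ip)
    return not any(addr in net for net in BLOCKED)

assert authorized("198.51.100.7")        # the general public: still welcome
assert not authorized("203.0.113.42")    # a blocked address: access revoked
# The open question: the same visitor arriving from a new, unblocked
# IP (say, after a DHCP lease change) passes this check trivially.
assert authorized("198.51.100.8")
```

The last line is the crux of the questions the opinion leaves open: the check identifies an address, not a person, so "revoked authorization" and "blocked IP" are not the same thing.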

Orin Kerr of The Volokh Conspiracy finds it a little disappointing that the judge did not explore this notion of requiring some technological barrier for revoking authorization.  While I agree, it makes sense in the context of the ruling.  Judge Breyer spends most of the opinion explaining why 3taps’ situation involves an access restriction rather than a use restriction (which would not violate the CFAA under the Ninth Circuit’s case law), and views imposing technological barriers as an obvious limitation on access.  The judge does ignore some rather obvious technological questions, such as whether the blocked entity has a dynamic IP address (which changes from time to time) and whether accessing the website through a non-blocked IP still constitutes access without authorization.  Does the revocation only apply to the blocked IPs, or does it apply to the person broadly (that is, if a website blocks a few of a user’s IP addresses, does any further access by that user constitute access without authorization)?

There is also the issue of verification.  IP addresses are, simply put, an unreliable method of verifying a particular user.  In copyright infringement cases (where plaintiffs often claim IP addresses act as a form of verification), some courts have held that the address by itself is not enough to identify a defendant.  A plaintiff usually needs to prove that the defendant used that particular IP at the time of infringement, which requires supporting documentation (especially in households with more than one internet user).  While copyright infringement is obviously very different from a CFAA violation, the identification issue affects both.  In some ways, identification is a more serious issue in the context of the CFAA since the CFAA imposes criminal penalties. 

After the suicide of Aaron Swartz over CFAA charges, it should be interesting to see how legislators (and prosecutors) handle CFAA issues in the future.  The CFAA requires more clarity, even in cases like this one where the nature of the violation is clear.   

Constitutional Rights, Both Silly and Serious

Today, I’ve decided to tackle a few issues.  First, there is a rather funny case out of Massachusetts where a cyclist riding in the middle of a lane claimed that riding in this manner is protected by the First Amendment’s guarantee of freedom of expression.  The Supreme Court’s precedent doesn’t favor this cyclist: the context of the act must convey a larger message.  In other words, an act by itself does not constitute free speech unless that act communicates a message.  I guess at the end of the day, we can’t fault the cyclist for being creative.

The second issue is far more serious.  Laptop and hard drive decryption has recently emerged as a hot topic in criminal law (at least among lawyers like myself who care about how the law accounts for such technology).  There’s an unsettled issue that arises whenever the police demand that an individual decrypt their hard drive: do such requests violate the Fifth Amendment right against self-incrimination?  This came to mind due to a recent case against a man in Wisconsin accused of receiving and possessing child pornography.  After police found themselves unable to decrypt the man’s hard drives, he refused to supply his password, and prosecutors obtained a court order compelling him to decrypt the drives for them.  The police managed to decrypt the hard drive on their own later, rendering the issue moot and preventing the judge from ruling on it.

While courts haven’t had to answer this issue, I’m curious.  Does requiring an individual to supply the password for their encrypted hard drive constitute self-incrimination?  Normally this right allows a defendant to choose not to take the stand during a criminal trial or answer questions regarding their involvement in criminal matters.  It also covers questions where the context (such as where the question was asked) or the implications of the question lead the individual to believe that answering, or explaining a refusal to answer, would incriminate them.

From that perspective, there’s merit to the idea that these password requests implicate self-incrimination.  If the content on the hard drive is potentially incriminating, then providing the password would lead the individual to incriminate himself and thus violate the Fifth Amendment (in theory at least).  A judge would have to decide that responding to a password request constitutes testimony (the sort of act that makes a person “a witness against himself,” to paraphrase the Constitution) or a link in a chain that leads to the defendant providing incriminating evidence.  That’s where the debate gets thorny.  Providing the password is a statement of fact that potentially leads to the defendant incriminating him or herself, but is likely not incriminating on its own.  Given that the purpose of the Self-Incrimination Clause is to mitigate police coercion of suspects, I would personally lean towards viewing such requests as a violation.  This is especially true in light of the wording of the Supreme Court’s test: the individual must fear the repercussions of any answer to the question, even if the answer only provides part of the evidence (Hoffman v. United States).  Revealing the password is potentially injurious, which should support a finding of privilege when dealing with a TrueCrypt-encrypted hard drive.

This issue should be an interesting one to follow over the next few years.  While there haven’t been any high-level cases dealing with the compelled production of passwords, the likelihood of courts arriving at different conclusions should mean that the Supreme Court will step in eventually.  In the meantime, enjoy your weekend.

Can a User Expect Privacy in Gmail?

As if the recent NSA revelations weren’t enough, Google recently revealed a new concern when it comes to a user’s electronic privacy.  In an attempt to get a case regarding its data-mining practices dismissed (Google placed a special cookie in Safari that gathered information about users’ browsing habits despite offering assurances that Safari users would not be tracked without opting in, accusations that led to a settlement with the FTC), Google filed a brief (found here) arguing, as part of the motion to dismiss, that users have no expectation of privacy when using Gmail.  For some people, this isn’t a huge surprise.

Google’s wording in the filing, that a user has “no legitimate expectation of privacy”, should interest any attorney who took Constitutional Law and Criminal Procedure back in law school.  The term “expectation of privacy” comes straight out of the Supreme Court’s Fourth Amendment search and seizure jurisprudence.  Normally, a government entity requires a warrant when the citizen has a reasonable expectation of privacy (a standard established in Katz v. United States).  The application of that standard to electronic devices remains an ongoing question within the legal world (and has produced some interesting results, such as the entire United States v. Jones case).  Google even cited a famous case, Smith v. Maryland, which held that the police’s use of a pen register (a device that records phone numbers dialed by a particular phone line) did not constitute a search because individuals voluntarily share call routing information with the phone company (leaving the individual with no legitimate expectation of privacy in that information).  Google seems to view itself as the phone company in this situation: an entity with which users share their information for the purpose of sending emails.

Now, there is an obvious distinction between Google and the police: one is a private company and the other is a government entity.  Constitutional protections technically only apply in the case of a government entity, so the Fourth Amendment doesn’t necessarily apply against Google.  As a result, certain statutes are more relevant in this situation (for example, the Electronic Communications Privacy Act and the California Invasion of Privacy Act).  Google’s legal position, in regard to those statutes, is less worrisome.

Still, there are a number of problems with Google’s stance.  First, Smith v. Maryland involved a device that only logged routing information (i.e. the phone numbers).  Google’s data-mining program scanned the content of emails sent through Gmail, looking for keywords.  That represents a greater potential intrusion and may change the expectation-of-privacy analysis.  Second, Google argues in the brief that users consent to these practices.  Google bases this primarily on the Terms of Service for Gmail, which allow for advertisements based on the contents stored on various Google services (including Gmail).  However, Google ignores the consent issue for users who chose an email service that expressly does not engage in keyword scanning, and for individuals using some kind of email encryption (say PGP or Bitmessage).  One could reasonably argue that there is no consent when a user goes out of their way to encrypt their communications, since the user took steps to ensure that their communications could not be read.  Google does not acknowledge that situation or its implications.
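The gap between the pen-register analogy and keyword scanning is easy to see in code. This sketch (addresses and keywords invented for illustration) contrasts what each approach "sees" of the same message:

```python
# Routing metadata vs. content scanning: the distinction the
# Smith v. Maryland analogy glosses over.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@gmail.example"
msg["Subject"] = "weekend plans"
msg.set_content("Let's look at hiking boots and tents this weekend.")

def pen_register_view(m):
    # Routing information only -- the rough analogue of dialed numbers.
    return {"from": str(m["From"]), "to": str(m["To"])}

def keyword_scan(m, keywords):
    # Content scanning -- what an ad-targeting pipeline might do.
    body = m.get_content().lower()
    return [kw for kw in keywords if kw in body]

assert pen_register_view(msg) == {"from": "alice@example.com",
                                  "to": "bob@gmail.example"}
assert keyword_scan(msg, ["hiking", "tents", "mortgage"]) == ["hiking", "tents"]
```

The pen-register view never touches the body at all, which is why equating the two practices for expectation-of-privacy purposes is contestable.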

Now, Google probably doesn’t have to worry too much about this case.  The ECPA allows an Electronic Communications Service (ECS) to engage in a certain amount of scanning and filtering as part of the normal course of business (section 2701 permits a qualified ECS to access communications and the data therein on its own network, even if such access would otherwise be prohibited, as long as it is related to operating the service).  Even in regard to the consent issue, the federal wiretapping statute only requires the consent of one party (a fact stated repeatedly in Google’s brief).

That being said, Google citing Fourth Amendment language in a case like this is a little troubling (and probably unnecessary).  The information issues Google discusses in the brief (sharing keywords with advertisers) are mostly separate from the issues that require citing cases like Katz or Smith v. Maryland (warrant requirements).  A government entity’s involvement would change the legal analysis greatly, and might produce a different result regarding the legitimate expectation of privacy.  This issue becomes especially important in light of the recent NSA revelations (particularly XKeyscore), where any court case would have to answer whether a user really can expect privacy in Gmail.  A user’s expectation of privacy changes greatly depending on what Google shares and with whom.  In the meantime, the best thing a user can do is read the Terms of Service and keep informed (particularly through resources like the Electronic Frontier Foundation’s “Who Has Your Back?” white paper).

Infringement is Coming: Game of Thrones and IP

Today is going to be a policy day.  Time Warner CEO Jeff Bewkes recently made an interesting comment regarding the show Game of Thrones on HBO (a Time Warner subsidiary).  Bewkes said that Game of Thrones’ status as the most pirated TV show of 2012 was, in some respects, “better than an Emmy” because piracy generated “tremendous word-of-mouth.”  Bewkes is not the only person involved with Game of Thrones to think along those lines (author George R.R. Martin and director David Petrarca have made similar comments).

As someone who grew up reading about various court actions against people who downloaded music and movies, it’s interesting to see the head of a major media company shrug off the effects of copyright infringement.  There is no doubt in anyone’s mind that downloading episodes of Game of Thrones violates HBO’s ownership rights, particularly the right to make and distribute copies of those episodes.  Bewkes here focuses less on his legal rights and more on how to profit from his company’s copyrighted work.  Bewkes lays out his logic very simply: HBO gets more subscribers by making quality television, and people downloading episodes illegally helps get the word out about particularly good shows.  HBO appears to profit greatly from this strategy: total revenue was up 7%, with operating revenue up 13%, this past quarter.  Game of Thrones, given its cultural cachet, likely has a lot to do with that growth.

This news shouldn’t be particularly stunning, but Bewkes’ views on infringement are a nice shift from other media executives’.  For the longest time, a number of lobbying organizations (such as the RIAA and MPAA) argued that every illegally downloaded song, game, or movie represented a lost sale (and calculated their losses due to infringement accordingly).  This has never been entirely true, since there was always a set of people who (for one reason or another) never intended to pay for the content.  The other issue is that, unlike ordinary theft, copyright infringement does not deprive the owner of the property itself.  The owner still maintains possession of the copyrighted work and can still profit from it despite infringement.  This is in contrast to theft of physical property, which entirely deprives the owner of the property’s value.  It’s nice to see a person in Bewkes’ position acknowledge that illegal copying can promote a show in the right circumstances.  HBO almost provides a case study in such issues, since the number of legitimate points of access for their shows is minimal.  If you don’t have cable (or live in certain countries), obtaining legal access to Game of Thrones episodes is difficult.

Fortunately, the advent of streaming media services such as Netflix and Hulu makes providing legal avenues for customers relatively easy.  These services remain, in my opinion, severely underutilized by content owners.  Content owners are reluctant to place newer content on any of these services without some kind of caveat (such as a significant delay between the air date and the date available for streaming), and that reluctance creates an opening for less legal means of viewing.  HBO actually manages this fairly well with their own streaming service, HBO Go, which makes content airing on HBO available almost immediately (usually within five or ten minutes).  The subscriber never has to worry about missing Game of Thrones, because they can always stream it if they miss the television airing.

It should be interesting to see how other media providers respond to Bewkes’ comments.  In the meantime, I’ll wait patiently for Game of Thrones Season 4.

Daft Punk and Contracting

Today’s post is going to be a tad shorter and more straightforward than the previous posts.  I am a big fan of The Colbert Report.  I am also a big fan of Daft Punk.  When I heard that Daft Punk was going to play “Get Lucky” (their current single) on Colbert yesterday, I was appropriately stoked.  Unfortunately, Daft Punk opted not to appear on the show at the last minute due to a contractual conflict with MTV.  Apparently Daft Punk agreed to a surprise appearance at the Video Music Awards (VMAs) and to not appear on any other shows in the meantime.

Now, for the sake of argument, let’s assume that this wasn’t just a giant publicity stunt to promote a Daft Punk appearance at the Video Music Awards (which I think is the likely reason for all of this).  What happened here?  Well, Daft Punk (or their booking agent) signed a contract with Viacom (who owns both Comedy Central and MTV) with certain terms and conditions.  Most of the time, these terms are pretty straightforward: in exchange for money, Daft Punk offered to perform at the VMAs.  This contract apparently included some other terms as well, one of which was a promise not to perform on other TV shows during the month of August.  Violating any of these terms places Daft Punk in breach of contract. 

Now, Daft Punk likely signed a similar contract when Colbert booked them.  Why would Daft Punk be in breach of one but not the other?  That’s hard to say without seeing the actual contract Daft Punk signed with MTV.  Presumably, the MTV contract’s exclusivity term covered not just appearing on other shows but being booked on them, so the very act of getting booked by Colbert may have put Daft Punk in breach.  As a result, Daft Punk had to opt against playing on The Colbert Report (which is a shame).  Viacom may have negotiated some kind of settlement (allowing Colbert to do a whole sketch set to “Get Lucky”) as accord and satisfaction (where a party agrees that a different term meets the requirements of the contract, discharging the old term).  In that situation, letting Colbert play “Get Lucky” substituted for the live performance and fulfilled the contract.

The moral of the story?  Always make sure you read what you sign.  It makes life a lot easier in the long run.

College Athletes and Video Games

Originally, I intended to do an article on the Chromecast and its legal implications.  I just received mine, and partly wanted to share my thoughts and impressions.  Instead I’ll just give a brief review: the device works like a charm.  If you need an easy way to stream Netflix and YouTube onto your TV, this is probably the cheapest and easiest way to do so.  Hopefully it will gain support from some other services soon.

Now, on to the topic at hand.

This week presented a very interesting case in the world of sports video games.  A former quarterback named Samuel Keller (who played for Arizona State and Nebraska) sued Electronic Arts (EA) over their use of his likeness in their college football game (called NCAA Football).  EA claimed, as a defense, that their use of Keller’s likeness is protected by the First Amendment as a form of artistic expression.  The Ninth Circuit Court of Appeals held that the digital representation of Keller came close enough to the real Keller to reject EA’s First Amendment defense.

Before moving on to the legal reasoning, there is one matter that requires clarification.  The college sports games do not use the actual athletes in the same manner as the professional sports games.  For example, Robert Griffin III is Robert Griffin III in the upcoming version of Madden.  The NCAA version of the game from Griffin’s final year in college, however, would use an individual with similar abilities to Griffin’s, but with a different name and a slightly altered appearance.  The reasons for this have to do with the NCAA’s amateurism rules (which are extremely convoluted and beyond the scope of this article).

The Ninth Circuit, in this case, used California’s “Transformative Use” test to make its determination (for the record, the original suit was filed under California Civil Code section 3344).  For this test, the court seeks to balance the defendant’s First Amendment rights against the plaintiff’s right of publicity (Comedy III Productions, Inc. v. Gary Saderup, Inc., 25 Cal. 4th 387 (2001)).  The court then lists five factors for determining whether a transformative use occurs: 1. whether the celebrity’s image provides only the “raw material” from which the original work is synthesized, rather than being the “very sum and substance of the work in question”, 2. whether an individual is likely to purchase the work for the celebrity or for the expressive work of the artist, 3. whether literal, imitative elements predominate in the work, 4. the extent to which the economic value of the work derives from the celebrity’s likeness, and 5. how much of the artist’s skill was put towards recreating the celebrity.  The court ended up holding that EA’s use of Keller’s likeness was not transformative since EA sought to “[recreate] Keller in the very setting in which he has achieved renown.”  Basically, the Ninth Circuit felt that simply changing Keller’s name, appearance, and some basic biographical details was not enough to be transformative, given that the game recreated Keller in other ways (his physical abilities and skills as a football player) that were tied to his public persona.

Some parties, including the dissent in the case, point out some interesting legal implications to this ruling.  Both the Electronic Frontier Foundation (EFF) and Annalee Newitz over at io9 worry that this ruling could impact fictional representations of real people.  The majority, in footnote 10 of the opinion, believes that the requirement to evaluate whether the likely purchaser’s motivation was to buy “a reproduction of a celebrity or the expressive work of the artist” sufficiently limits this holding.  Basically, the majority feels that EA’s desire to “reproduce reality” (as mentioned in the opinion) distinguishes this case from other fictionalized representations of real people.

Like the EFF, I find the majority’s reasoning lacking here.  There are many fictional representations of real people that seek to recreate the setting in which they achieved renown, some of which go to far fewer lengths to change their representation of the real person in question (the EFF uses The Social Network and Mark Zuckerberg as an example, but almost any unauthorized biopic would seemingly qualify).  The majority also seems to neglect that NCAA athletes have a limited economic interest in their likeness due to the NCAA’s amateurism rules: they are not allowed to profit off their likeness while playing NCAA sports.  The majority felt that, since an athlete may eventually profit from their appearance after leaving the NCAA, they should be able to assert their right of publicity while still playing college sports.  This neglects the fact that many former players do profit from these very games: the player placed on the cover is always a prominent player who recently left college to play professionally.  Furthermore, this particular situation (an individual who can’t profit off their likeness now but may be able to do so later) potentially applies to many people.  There is no shortage of people attempting to break out in music, acting, sports, or other forms of entertainment who might have signed documents limiting their right to profit off their image in some respect.  Without a limiting principle, this ruling potentially touches on many other works (such as speculative fiction, as io9 points out).

Another interesting note: this court likely would have held differently had this case been tried under federal law (instead of California law).  In a very similar case against EA brought by Jim Brown (the famous Cleveland Browns running back) a little while ago, EA’s use of Brown’s image was upheld under the Lanham Act (the federal trademark law).

Now, with all that being said, I think the Ninth Circuit had reasons for its ruling that it didn’t mention in the opinion.  The majority’s language dwelt quite a bit on the commercial nature of the use without citing it as a major reason for the holding, focusing instead on how EA sought to recreate NCAA football in a realistic manner.  The inherently commercial nature of that recreation appears to distinguish a game like Madden from other works with representations based on real people (say The Social Network, or even a Grand Theft Auto game that employs a representation of a real person).  A sports game, by its nature, relies much more heavily on accurately recreating real people for the very simple reason that gamers want an accurate representation of the players and teams.  Other works rely far less on these recreations.  For example, a Grand Theft Auto game may include a recreation of a famous person portraying a gangster from a famous movie, but that is unlikely to be the major reason a gamer purchases the game.  What I can’t figure out is why the court didn’t lean more heavily on this logic (it is the fourth factor of the test, after all) to explain why rejecting EA’s First Amendment defense doesn’t impact other works, if only to rebut the dissent’s arguments.

It should be interesting to see if any other individuals make the argument for the more expansive ruling in the future.  Given the nature of some celebrities (and the major lawsuit led by Ed O’Bannon against the NCAA), I wouldn’t be surprised if this issue comes up again.