Sandra Day O'Connor College of Law

Jurimetrics

Abstracts

Design and Deviance: Patent as Symbol, Rhetoric as Metric—Part 2
Charles E. Colman

This project reveals the unrecognized power of gender and sexuality norms in the deep discourse of pivotal American case law on design patents. In Part 1, I showed that late nineteenth-century cultural developments in the urban Northeast gave rise to a stigma surrounding the “ornamental” and “decorative” works under the then-exclusive purview of design-patent protection. Among the politically dominant segments of American society, the creation, appreciation, and consumption of design “for its own sake” grew increasingly intertwined with notions of decadence, effeminacy, and sexual “deviance.” In Part 2, I now examine influential design-patent decisions from the 1870s through the 1930s against that cultural backdrop. My close reading of these decisions will demonstrate that federal judges, particularly in pivotal cases decided by the Second Circuit, increasingly used design-patent disputes as a vehicle for the performance and endorsement of gendered values. The resulting doctrine relegated design patents to near-total irrelevance as a viable form of intellectual property protection for a large and crucial portion of the twentieth century.

CITATION: Charles E. Colman, Design and Deviance: Patent as Symbol, Rhetoric as Metric—Part 2, 56 Jurimetrics J. 1–45 (2015).
A New Formula for Tribal Internet Gaming
Racheal M. White Hawk

Tribal gaming is an industry that generates more than $27 billion a year. It comprises forty percent of all gaming in the United States and has provided more than 628,000 jobs for Native and local communities. While tribal brick-and-mortar casinos contribute numerous economic, cultural, and social benefits to Native communities, Internet gaming profits are a potential boon. Internet gaming is well positioned for rapid growth because tens of millions of Americans use computers, cell phones, and tablets for shopping, games, and entertainment. Furthermore, with the advent of increasingly accurate geolocation technology, filtering, and blocking systems, the age and location of gamblers can be monitored, thus facilitating legal Internet gaming within state borders. Moreover, the potential for tax and licensing revenue from Internet gaming is immense, and states may enter into revenue-sharing agreements with tribes while offering exclusivity for tribal operators. For instance, in California, tribes contributed $467 million to state revenue in 2012 from brick-and-mortar casinos. States such as Delaware and New Jersey have legalized intrastate Internet gaming to reap tax revenue. California, however, has not yet legalized intrastate Internet gaming. Rather than wait for states to legalize intrastate Internet gaming, some tribes are launching their own online poker and bingo rooms to accept bets from players not located on Indian lands, asserting that doing so is legal under the Indian Gaming Regulatory Act (IGRA). Some states, however, disagree that this practice is legal under IGRA. To forestall expensive and time-consuming litigation and to support tribal economic development, Congress should reform the current regulatory patchwork of federal Internet gaming legislation by legalizing interstate Internet gaming, allowing states to opt out of the federal interstate Internet gaming scheme, and adding a new category specifically for Internet gaming to IGRA.

CITATION: Racheal M. White Hawk, Comment, A New Formula for Tribal Internet Gaming, 56 Jurimetrics J. 47–78 (2015).
Officer Body-Worn Cameras—Capturing Objective Evidence with Quality Technology and Focused Policies
David K. Bakardjiev

Body-worn cameras (BWCs) are one of the nation’s latest policing tools to be used in the effort to increase police department transparency, strengthen community trust, and fight crime. BWCs are touted as game-changing technology in policing because of their potential to monitor officer misconduct during police-citizen encounters and provide objective evidence. As more police agencies incorporate BWCs into their policing culture, an influx of video data is being collected for possible use in criminal prosecution. Although BWCs are revolutionizing criminal proceedings, inadequate or absent police department policies may threaten the potential use of recorded video as evidence in court. This comment identifies potential evidentiary challenges that may arise when introducing BWC recordings at trial, and argues that such challenges may be avoided with strong police department policies. It concludes by offering a BWC policy template designed to promote objectivity and the integrity of BWC video evidence.

CITATION: David K. Bakardjiev, Comment, Officer Body-Worn Cameras—Capturing Objective Evidence with Quality Technology and Focused Policies, 56 Jurimetrics J. 79–112 (2015).
Governing Emerging Technologies: The First 30 Years and the Next
Gary E. Marchant, Yvonne A. Stevens, Braden R. Allenby, & James M. Hennessy

The year 2015 is the 30th anniversary of the Center for Law, Science & Innovation (CLSI or the Center) at Arizona State University. With over 30 faculty members and 30 years of accomplishment, the Center is the oldest and largest law school center in the nation addressing the intersection of law and technology. From the outset, the Center, which publishes Jurimetrics in partnership with the ABA Section of Science & Technology Law, has been at the forefront in exploring the legal implications of emerging technologies.

Highlights of the first 30 years include sponsoring two national conferences in the mid-1980s that anticipated the coming merger of computer and communications technologies. The Center convened a major conference in the early 1990s to develop the first legal research agenda for the Human Genome Project. It created the first law school program on personalized medicine in 1990. It taught the nation’s first regularly offered law school course on nanotechnology in 2007. Since 2012, it has offered one of the nation’s leading annual conferences on e-discovery. And in 2013, it launched the annual conference on Governance of Emerging Technologies (GET), which will be held for the fourth time in May 2016.

In these and other activities, the Center takes a proactive approach that anticipates important new technology advances and explores how the legal system must adjust to keep pace with these rapidly emerging technologies. Today, the Center continues to actively address the legal response to a broad array of emerging technologies, including precision medicine, nanotechnology, biotechnology, synthetic biology, neuroscience, robotics, autonomous cars, e-discovery, surveillance technologies, big data, drones, sustainable technologies, digital medicine, antiaging technologies, national security technologies, and technologies affecting relationships.

As it kicks off its fourth decade, the Center is consolidating its work on emerging technologies into a new Governance of Emerging Technologies Program (GET Program), which will join the existing Public Health Law and Policy Program and the Program on Law and Sustainability as its third core program. The goal of the GET Program will be to promote beneficial technologies that can make individuals and society happier, more fulfilled, more sustainable, and more secure. A key tenet of this effort will be that law must go beyond its historically predominant role of trying to restrict harmful technologies and assume a new priority: affirmatively promoting beneficial technologies.

CITATION: Gary E. Marchant, Yvonne A. Stevens, Braden R. Allenby, & James M. Hennessy, Column, Governing Emerging Technologies: The First 30 Years and the Next, 56 Jurimetrics J. 113–116 (2015).
From Patent Thickets to Patent Networks: The Legal Infrastructure of the Digital Economy
Jonathan M. Barnett

Scholarly and popular commentary often asserts that markets characterized by intensive patent issuance and enforcement suffer from “patent thickets” that suppress innovation. This assertion is difficult to reconcile with continuous robust levels of research and development (R&D) investment, coupled with declining prices, in technology markets that have operated under intensive patent issuance and enforcement for several decades. Using network visualization software, I show that information and communication technology (ICT) markets rely on patent pools and other cross-licensing structures to mitigate or avoid patent thickets and associated inefficiencies. Based on the composition, structure, terms, and pricing of selected leading patent pools in the ICT market, I argue that those pools are best understood as mechanisms by which vertically integrated firms mitigate transactional frictions and reduce the cost of accessing technology inputs. Appropriately structured patent pools can yield cost savings for intermediate users, which may translate into reduced prices for end users, but at the risk of undercompensating R&D suppliers.
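As a rough illustration of the kind of structure such pools exhibit, the sketch below models a hypothetical patent pool as a cross-licensing network and computes simple structural summaries. The firm names, patent counts, and use of the networkx library are assumptions made for illustration; Barnett’s actual dataset and visualization software are not reproduced here.

# A minimal sketch (not the article's actual code or data): modeling a
# hypothetical patent pool as a cross-licensing network with networkx.
# Firm names and patent counts are invented for illustration only.
import networkx as nx

G = nx.Graph()

# Hypothetical pool members and the number of patents each contributes.
members = {"FirmA": 120, "FirmB": 80, "FirmC": 45, "FirmD": 200}
for firm, patents in members.items():
    G.add_node(firm, patents=patents)

# A pool effectively cross-licenses every member to every other member,
# so the pool forms a clique; the edge weight stands in for licensing volume.
firms = list(members)
for i, f1 in enumerate(firms):
    for f2 in firms[i + 1:]:
        G.add_edge(f1, f2, weight=min(members[f1], members[f2]))

# Simple structural summaries a network study might report.
print("Density:", nx.density(G))                      # 1.0 for a full clique
print("Degree centrality:", nx.degree_centrality(G))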

CITATION: Jonathan M. Barnett, From Patent Thickets to Patent Networks: The Legal Infrastructure of the Digital Economy, 55 Jurimetrics J. 1–53 (2014).
Privacy Mindset, Technological Mindset
Michael Birnhack, Eran Toch, and Irit Hadar

Policymakers around the world constantly search for new tools to address growing concerns about informational privacy (data protection). One solution that has gained support among policymakers in recent years is Privacy by Design (PbD). The idea is simple: think of privacy ex ante, and embed privacy within the design of a new technological system, rather than try to fix it ex post, when it is often too late. However, PbD has yet to gain an active role in engineering practices. Thus far, there are only a few success stories.

This article argues that a major obstacle for PbD is the discursive and conceptual gap between law and technology. A better diagnosis of the gaps between the legal and technological perceptions of privacy is a crucial step in seeking viable solutions. We juxtapose the two fields by reading each field in terms of the other: (1) by reverse engineering the law to expose its hidden assumptions about technology (the law’s technological mindset), and (2) by reading canonical technological texts to expose their hidden assumptions about privacy (technology’s privacy mindset). This article’s focus is on one set of informational privacy practices: the large corporation that collects data from individual data subjects.

This dual reverse engineering exercise indicates substantial gaps between the legal perception of informational privacy, as reflected in the set of principles commonly known as the Fair Information Practice Principles (FIPPs), and the perceptions of the engineering community. While both information technology and privacy law attempt to regulate the flow of data, they do so in utterly different ways, holding different goals and applying different constraints. The gaps between law and technology point to potential avenues to save PbD.

CITATION: Michael Birnhack, Eran Toch, and Irit Hadar, Privacy Mindset, Technological Mindset, 55 Jurimetrics J. 55–114 (2014).
With a Life on the Line, Emerging Technologies Can Contribute in the Determination of Intellectual Disability in Capital Sentencing
Kellie Manders

The use of capital punishment for intellectually disabled individuals violates the Eighth Amendment under the Supreme Court’s evolving standards of decency jurisprudence. However, with varying definitions and diagnostics from the American Association on Intellectual and Developmental Disabilities and the American Psychiatric Association, as well as varying state statutes defining intellectual disability, there is no consistent understanding of what qualifies a person as intellectually disabled. This problem can be addressed through a uniform definition in diagnostic materials and laws across the United States. Beyond policy definitions, emerging technologies may also establish objective standards that will increase accuracy and ensure justice within our capital sentencing system. With emerging technology, courts have the capability to make careful diagnoses of individuals with reduced mental capacities to mitigate the discrepancies across the nation and prevent intellectually disabled individuals from wrongfully being put to death.

CITATION: Kellie Manders, Comment, With a Life on the Line, Emerging Technologies Can Contribute in the Determination of Intellectual Disability in Capital Sentencing, 55 Jurimetrics J. 115–130 (2014).
Kevin Cook’s Kitty Genovese: The Murder, the Bystanders, the Crime That Changed America
Jon M. Sands and Katherine L. Hanna

The chilling indifference has haunted us over the past fifty years: the cry in the dead of night (“Oh God, he stabbed me. Help me!”), the windows opening, and the thirty-eight witnesses who turned away. The horrific murder of Kitty Genovese on March 13, 1964, in the Queens borough of New York City sparked a debate, still ongoing, about bystander indifference, moral responsibility, and legal duty. She became, for many, a tragic symbol of urban alienation. Her death spurred social science research into “the bystander effect” and led to the creation of 911 lines and changes in emergency response. But what if much of the story we know is wrong?

In his definitive book, Kitty Genovese: The Murder, the Bystanders, the Crime That Changed America, Kevin Cook explores what happened that cold March morning. Cook has done more than write a “true crime” exposé; his examination goes beyond retracing steps and recounting the terrible crime that occurred that night. He gives us a sense of what New York and America were like in 1964, including insight into closeted gay life, police work, and capital litigation. Cook also restores the victim as a person, rescuing her from being just an ignored stranger in an indifferent cityscape and presenting her as someone tragically killed and mourned to this day.

CITATION: Jon M. Sands and Katherine L. Hanna, Kevin Cook’s Kitty Genovese: The Murder, the Bystanders, the Crime That Changed America, 55 Jurimetrics J. 131–144 (2014) (book review).
Governance in Emerging Technologies Symposium: A Foreword
Gary E. Marchant

Emerging technologies such as biotechnology, genomics, geoengineering, human enhancement technologies, information technologies, nanotechnology, neuroscience, personalized medicine, 3D printing, robotics, synthetic biology, stem cell and regenerative medicine, digital currencies, surveillance technologies, autonomous vehicles, and telecommunications present unique governance challenges. Each of these technologies is emerging at a frantic pace, creates very uncertain risks and benefits, has a wide range of applications that cross industry sectors and agency jurisdictions, and presents concerns that go beyond traditional health, safety, and environmental risks to also include challenging social and ethical problems such as privacy, inequality, and economic dislocation.

In May 2014, the Center for Law, Science & Innovation (LSI Center) at the Sandra Day O’Connor College of Law–Arizona State University, with the support of cosponsors from across the country, hosted its second annual conference on Governance of Emerging Technologies: Law, Policy, and Ethics. The theme of this annual conference is that “there is much to be learned and shared from and across the governance experience and proposals for today’s emerging technologies.”

CITATION: Gary E. Marchant, Foreword, Governance in Emerging Technologies Symposium, 55 Jurimetrics J. 145–146 (2015).
Proactive International Regulatory Cooperation for Governance of Emerging Technologies
Marc A. Saner and Gary E. Marchant

This article provides a systematic checklist to guide proactive bilateral and international regulatory cooperation (in the sense of “alignment” or “harmonization”) in the context of emerging technologies. The article is structured along a regulatory life cycle, starting with preregulatory activities and ending with postregulatory processes. The background research is based on a series of interviews with American and Canadian experts carried out in late 2013, as well as studies of previous international regulatory alignment examples. Our aim is to inform the regulatory debate on how best to develop proactively aligned regulatory programs for emerging technologies in bilateral (e.g., United States-Canada) and international contexts.

CITATION: Marc A. Saner and Gary E. Marchant, Proactive International Regulatory Cooperation for Governance of Emerging Technologies, 55 Jurimetrics J. 147–178 (2015).
A Case for the Commonplace: Locating Nanotechnology within Existing Regulatory Frameworks
Jeffery T. Morris

The disruptive effects of new technologies extend to concerns over the viability of existing regulatory regimes to effectively govern the introduction of such technologies into society. Using the case of nanotechnology, this article argues that anchoring the development of a new technology within an existing regulatory regime can in fact stimulate new thinking and discussion around the societal implications of technology. Regulatory discussions within the Organisation for Economic Co-operation and Development (OECD) demonstrate that institutional knowledge about the regulation of traditional chemicals has served as a solid foundation for developing effective regulatory approaches for nanomaterials. In particular, the OECD’s Mutual Acceptance of Data framework has proven to be an effective means of channeling political power into purposeful discourse on how to fit nanomaterials into a regulatory regime that had been developed for traditional chemicals.

CITATION: Jeffery T. Morris, A Case for the Commonplace: Locating Nanotechnology within Existing Regulatory Frameworks, 55 Jurimetrics J. 179–200 (2015).
Innovation-Friendly Regulation: The Sunset of Regulation, the Sunrise of Innovation
Sofia Ranchordás

In recent years, the approach of governments towards innovation has changed dramatically. The myth that innovation is an activity carried out by lone inventors in their garages or in the laboratories of large private companies, with little or no place for governmental intervention, is passé. The advancement of innovation and economic growth has been included in the list of priorities of most governments. However, in times of crisis, governments expect to play more than a supporting role and to go beyond the traditional economic incentives and innovation policy documents. It is time to call regulators to center stage and tell them to start playing the innovation tune. Regulation and governance can play either an impeding or a facilitative role, depending on the instruments used. Regulation may give or take away incentives to innovate; it may accelerate or delay the introduction of innovations into the market; it may adapt itself to the speed of the innovation process or lag behind. This article focuses on two aspects of this regulatory approach to innovation: the pacing problem and the lack of information about the regulation of the innovation process.

Very often, regulation simply lags behind or tries to slow down the “pace” of innovation, whereas innovation moves at the speed of sound: it can “happen” anywhere and anytime, while regulators are limited by slow procedures and the need to confer some stability on regulations. In addition, regulators are being confronted with complex innovations in the different fields of emerging technologies and with apparently straightforward innovations that challenge existing regulatory paradigms (e.g., Aereo, Airbnb, Uber) about which regulators know very little. Do these innovations bring along risks, and how should they be regulated? In this article, I argue that the “pacing” and “informational” problems could be solved by enacting two largely overlooked regulatory instruments: sunset clauses and experimental legislation. Both of these instruments confer adaptability on the regulatory framework, set the stopwatch on obsolete legislation, and create room for regulatory flexibility and learning. Sunsetting limits the temporal scope of regulation, while experimental legislation can limit the temporal and geographic scope, or the object, of regulation. Experimenting with laws can be particularly useful for testing new regulations on a small scale, gathering more facts about the market’s response to an innovative product, and improving regulation as more information becomes available. Both temporary legislative instruments can be part of a more innovation-friendly approach to regulation, combining, on the one hand, an openness to innovation and, on the other, a responsible regulatory framework.

CITATION: Sofia Ranchordás, Innovation-Friendly Regulation: The Sunset of Regulation, the Sunrise of Innovation, 55 Jurimetrics J. 201–224 (2015).
Automatic License Plate Readers: An Effective Law Enforcement Tool or Big Brother’s Latest Instrument of Mass Surveillance? Some Suggestions for Legislative Action
Randy L. Dryer and S. Shane Stroud

Although license plate recognition technology has been around for years, privacy concerns over the technology have only recently come to the forefront of the public’s attention. The concern is due to several factors. First, Automatic License Plate Reader (ALPR) systems have proliferated throughout the country as law enforcement learns of their usefulness in crime interdiction. Second, the data collected are being aggregated in massive regional databases, raising the specter of location tracking. Third, there is a lack of uniform standards governing retention and use of ALPR data collected or accessed by government. Fourth, the collection and use of license plate data by private companies is growing rapidly in an unregulated environment, leading to concerns that the data may be abused or used for purposes other than as initially intended. Finally, there is a growing public awareness of the potential privacy dangers arising from geolocation tracking in light of (1) the U.S. Supreme Court’s decision in United States v. Jones, which held that the warrantless installation of a GPS tracking device on a suspect’s vehicle implicated Fourth Amendment protections, and (2) a widely circulated national study by the American Civil Liberties Union (ACLU) raising the specter of massive government surveillance through the use of ALPR technology.

ALPR technology respects no geographic boundaries, yet Congress has been reluctant to regulate or limit the technology. State legislatures, however, have begun to address the technology but are doing so without any consistent or principled legislative approach. The ALPR industry has vigorously lobbied against restrictive laws and has filed suits claiming various laws infringe upon First Amendment rights. As a consequence, we are heading toward an environment of piecemeal and inconsistent regulation that will be fertile ground for litigation for many years to come.

The article identifies privacy issues and other legal risks associated with unregulated use of ALPR technology, offers a brief analysis of the potential First and Fourth Amendment issues implicated, surveys the current legislative attempts to regulate the technology, and proposes guiding principles that legislators should follow in drafting laws to regulate ALPR technology. The overarching objectives of the principles are (1) to provide a rational and comprehensive framework in which to approach the regulation of the technology, and (2) to adequately address the principal privacy concern of most citizens—suspicionless government surveillance—without unduly hampering the effectiveness of law enforcement or negatively impacting the ALPR industry’s core business of serving the financial and repossession industries. The enactment of laws is always an exercise in compromise, and the ten suggested principles are offered as a cogent path toward rational legislation.

CITATION: Randy L. Dryer and S. Shane Stroud, Automatic License Plate Readers: An Effective Law Enforcement Tool or Big Brother’s Latest Instrument of Mass Surveillance? Some Suggestions for Legislative Action, 55 Jurimetrics J. 225–274 (2015).
Old Laws, New Tricks: Drunk Driving and Autonomous Vehicles
Katherine L. Hanna

Drunk driving, or driving under the influence (DUI), is a major public health problem in the United States. Despite attempts to educate drivers on the dangers of drunk driving and deter such behavior through criminal punishment, there are still thousands of deaths attributable to drunk driving every year and billions of dollars spent on damage from auto accidents, loss of life, injuries, deterrence, and punishment. Recent developments in autonomous vehicle technology could reduce or eliminate DUI-related accidents within the next decade. However, even in future autonomous vehicle systems that will have the potential to drive themselves in most circumstances, human intervention will sometimes be necessary. This article applies current DUI laws to autonomous vehicles and proposes a legislative change to clarify DUI laws and enhance public safety.

CITATION: Katherine L. Hanna, Comment, Old Laws, New Tricks: Drunk Driving and Autonomous Vehicles, 55 Jurimetrics J. 275–289 (2015).
Regan Brashear’s Fixed: The Science/Fiction of Human Enhancement
Maxwell J. Mehlman

In thinking about biomedical and biomechanical enhancement, commentators, including myself, tend to focus on improving the performance of people whose abilities already lie within a species-normal range, while giving little thought to how enhancement technologies might affect persons with disabilities. Additionally, while I have called attention to the potential use of what I call “compensatory enhancements,” which would offset a person’s disability by enhancing other capabilities, such as giving a person who is blind a supernormal hearing ability, I have not explored how people with disabilities perceive enhancements for themselves or for others.

One reason for the inattention to the perspective of persons with disabilities is that improving performance to remove a disability may not count as enhancement. A working definition of a biomedical enhancement is an intervention that employs medical and biological technology to improve performance, appearance, or capability “beyond what is necessary to achieve, sustain or restore health.” The movie Fixed portrays interventions such as wearable exoskeletons for paraplegics, artificial legs for a mountain climber, and computer implants for a quadriplegic that enable him to touch his girlfriend for the first time. Because an enhancement, by definition, does not aim to eliminate or mitigate the effects of a disability, many of these interventions are not properly considered enhancements, despite being remarkable scientific achievements that can make enormous differences in the lives of the people who obtain them.

CITATION: Maxwell J. Mehlman, Regan Brashear’s Fixed: The Science/Fiction of Human Enhancement, 55 Jurimetrics J. 291–296 (2015) (film review).
Growing Organs in the Lab: Tissue Engineers Confront Institutional “Immune” Responses
Lars Noah

In order to ease chronic shortages of transplantable human organs as well as circumvent the immune response of recipients, tissue engineers have seeded scaffolds with patients’ stem cells that develop into functional replacements. Several individuals already have benefitted from lab-grown bladders and tracheas, and scientists have made promising advances in laboratory animals with far more complex organs such as hearts and kidneys. As this form of regenerative medicine moves from fantasy to reality, it will pose various puzzles for legal institutions. For instance, does tissue engineering merely amount to an innovative medical procedure subject to state regulation, or instead does it qualify as an interstate commercial activity governed by federal law? Will the outputs of this novel technique closely enough resemble human organs so that the Health Resources and Services Administration enjoys primary authority, or instead will they qualify as therapeutic products regulated by the Food and Drug Administration (FDA)? If subject to the supervision of the latter agency, then would lab-grown organs face licensure as biologics or medical devices, and how exactly would the FDA apply such requirements to tailor-made items? Collateral questions will arise in connection with tort law, payment, and patent protection. Unless existing systems for controlling medical technologies develop satisfactory answers to such concerns, they may create roadblocks to the realization of the full potential of tissue engineering.

CITATION: Lars Noah, Growing Organs in the Lab: Tissue Engineers Confront Institutional “Immune” Responses, 55 Jurimetrics J. 297–338 (2015).
Search Bias and the Limits of Antitrust: An Empirical Perspective on Remedies
David A. Hyman and David J. Franklyn

As Google has moved from providing “ten blue links” to “universal search,” controversy has erupted over whether Google is favoring its own specialized search results over competing specialized results offered by other entities. Google’s competitors have complained about “search bias” and demanded that antitrust enforcers ensure “search neutrality.” Both the U.S. Federal Trade Commission (FTC) and the European Commission have considered these complaints. The FTC closed its investigation without taking any action, but the European Commission issued a formal statement of objections to Google in April 2015. This study empirically examines the impact of potential design remedies on search bias, including prominent links to rival specialized search services (“architectural remedies”) and clearer labeling of Google’s specialized search results (“labeling remedies”). The study finds that architectural remedies have much greater impact than labeling remedies. User awareness of labeling is low, and even labels far more explicit than those currently employed do not have much impact. Consumers have sticky expectations about how search results are presented, and their click-through behavior tracks those expectations irrespective of how the search results are labeled. However, major architectural changes can have a substantial impact on click-through rates. These findings suggest that the impact of architectural remedies will depend greatly on their design features, while labeling remedies are unlikely to have a significant impact. We explore the implications of these findings for other issues at the interface of Internet and intellectual property (IP) law.

CITATION: David A. Hyman and David J. Franklyn, Search Bias and the Limits of Antitrust: An Empirical Perspective on Remedies, 55 Jurimetrics J. 339–380 (2015).
Death and Social Media Implications for the Young and Will-less
Alexandra Elliott

Every minute, 108 people die worldwide. That’s almost two deaths per second. Every day, 8000 Facebook members die—and Facebook is just one of many social networks. With four in five Internet users on social networks and blogs, many users die leaving behind social networking accounts—arguably, significant parts of themselves. Who should decide what happens to these personal accounts? While some users have begun adding digital assets to wills, the current state of the law is neither up to date nor clear on whether social networking accounts are part of a deceased’s estate. And what about those users without wills? Ninety-four percent of U.S. teens use Facebook, and few are likely to have seriously contemplated death, let alone have wills in place. To eliminate this gap and confusion in the law, Congress needs to implement a uniform solution addressing social media access after death. The courts, and ultimately social networking sites themselves, would be bound to those standards, thus eliminating confusion and the need for wills pertaining to social networks. Moreover, social networking accounts are made to be modified and accessed solely by the creator, and thus should be legally construed as extensions of the self, whereby individual property rights in such accounts are extinguished upon death and remain nontransferable.

CITATION: Alexandra Elliott, Comment, Death and Social Media Implications for the Young and Will-less, 55 Jurimetrics J. 381–405 (2015).
Deborah Tuerkheimer’s Flawed Convictions: “Shaken Baby Syndrome” and the Inertia of Injustice
Jon M. Sands and Rachelle Jones-Rowe

In 2012, more than ten years into a twenty-year sentence for murdering his infant, Drayton Witt was released from an Arizona prison and subsequently had his charges dismissed. In granting Mr. Witt postconviction relief, the court acknowledged that the science used to convict him, the very evidence the court previously declared “overwhelming,” was now open to question. This science was “shaken baby syndrome” (SBS), the supposed “triad” of medical symptoms that conclusively proved child abuse. For over two decades, prosecutors used expert testimony in child abuse cases to prove that infants and babies were purposely and intentionally harmed and killed. This once-unquestioned evidence is now in doubt. Although the county prosecutor stated that he believed Mr. Witt was still guilty, the charges were dropped. To prove child abuse, prosecutors must now rely on proof beyond SBS alone. With the expert certainty that surrounded SBS now open to skepticism, the criminal justice system is confronting many possible wrongful convictions.

In Flawed Convictions, Deborah Tuerkheimer authoritatively chronicles the history of SBS. A former prosecutor with experience in child abuse cases, Tuerkheimer writes as an apostate. She acknowledges the widespread and horrific instances of child abuse but argues that SBS can no longer be taken as conclusive proof of the place and manner of abuse or the abuser’s identity. And yet, with the possibility of wrongful convictions having occurred, the legal means to exonerate are too often barred—or prosecutors, having embraced SBS, refuse to reconsider.

CITATION: Jon M. Sands and Rachelle Jones-Rowe, Deborah Tuerkheimer’s Flawed Convictions: “Shaken Baby Syndrome” and the Inertia of Injustice, 55 Jurimetrics J. 407–418 (2015) (book review).
Design and Deviance: Patent as Symbol, Rhetoric as Metric—Part 1
Charles E. Colman

This project reveals the unrecognized power of gender and sexuality norms in the deep discourse of pivotal American case law on design patents. In Part 1, I show that late nineteenth-century cultural developments in the urban Northeast gave rise to a stigma surrounding the “ornamental” and “decorative” works under the then-exclusive legal purview of design-patent protection. Among the politically dominant segments of American society, the creation, appreciation, and consumption of design “for its own sake” grew increasingly intertwined with notions of frivolity, effeminacy, and sexual “deviance.” In Part 2, I will examine influential design-patent decisions from the 1870s through the 1930s against this cultural backdrop. My close reading of these decisions will demonstrate that federal judges, particularly in leading cases decided by the Second Circuit, increasingly used design-patent disputes as a vehicle for the performance and endorsement of prevailing gender norms. The resulting doctrine relegated design patents to near-total irrelevance as a viable form of intellectual property protection for a large and crucial portion of the twentieth century.

CITATION: Charles E. Colman, Design and Deviance: Patent as Symbol, Rhetoric as Metric—Part 1, 55 Jurimetrics J. 419–462 (2015).
Austin Sarat’s Gruesome Spectacles: Botched Executions and America’s Death Penalty
Jon M. Sands and Dale A. Baich

The issue of botched executions most recently came before the United States Supreme Court in Glossip v. Gross. In that case, we were two of the lawyers who represented the petitioner. We argued that the Oklahoma procedure for administering lethal injections, the primary method of capital punishment in the jurisdictions which still administer the death penalty, was unconstitutional under the Eighth Amendment.

Following a series of executions where midazolam was used as part of the lethal injection drug formula, and where prisoners struggled against the restraints, spoke or gulped and gasped for an extended period of time before dying, the United States Supreme Court granted review in Glossip v. Gross. In Glossip, four Oklahoma petitioners sought a preliminary injunction to stay their executions. The Court affirmed the denial of the preliminary injunction, stating that the petitioners failed to present a known and available alternative and that the district court did not commit clear error when it found that midazolam is likely to render a person unable to feel pain associated with the administration of a paralytic and potassium chloride. The scientific and medical literature states that midazolam is not capable of placing a prisoner into a coma-like state and maintaining it before the second and third drugs are administered. Those two drugs, a paralytic and potassium chloride, cause pain—suffocation followed by liquid fire coursing through the prisoner’s veins. In the context of capital punishment, given its present constitutionality, the Court held that because there can never be an entirely painless way to execute an individual, some pain will be tolerated. The Court found the risk of pain, under this specific drug, in this specific posture, to be tolerable.

The issue, however, will not go away. How much pain is constitutionally permitted is still an open question. Some future execution will be botched; it is inevitable. The condemned writhing in agony will again reignite the debate on the method of execution and, as stated in the dissent in Glossip, whether the death penalty itself is constitutional. How, and by what means, the death penalty is carried out will continue to be an integral part of the debate. It is therefore timely that we now have a history of executions gone terribly wrong.

Gruesome Spectacles is a chronicle of botched executions from 1890 to 2010. It is also an account of how such mishandled and bungled executions, witnessed and reported, impacted the debate over capital punishment and the methods employed in carrying out the death penalty. As such, and deservedly, it will become a well-cited source.

CITATION: Jon M. Sands and Dale A. Baich, Austin Sarat’s Gruesome Spectacles: Botched Executions and America’s Death Penalty, 55 Jurimetrics J. 463–475 (2015).
Medical Device Regulation in the Information Age: A Mobile Health Perspective
Kevin Khachatryan

In September 2013, after two years of deliberation, the Food and Drug Administration (FDA) released its Final Guidance—nonbinding recommendations for mobile medical application (MMA) manufacturers. This revised guidance answered questions that plagued the health information technology (IT) industry. The agency explained its jurisdictional intent and limited its own authority to regulating higher risk mobile health apps. However, the guidance did not sufficiently address clinical decision support software or describe a classification mechanism. Instead of clear definitions, the Final Guidance relied on rigid examples. In the same time period, and at the behest of Congress, the FDA released the Food and Drug Administration Safety and Innovation Act (FDASIA) Health IT Report in 2014, proposing goals for health IT regulation and a risk-based classification system. Although the report was a step in the right direction and painted a picture of agency goals, the FDA’s deregulatory tone was worrisome. It left adverse event reporting to an outside private-public agency and described a regulatory universe where the FDA’s only jurisdiction would be over high-risk medical device IT. The report would leave the regulation of health IT functionality software to the Office of the National Coordinator for Health Information Technology (ONC) and other agencies. This comment proposes a more flexible, FDA-centric, mobile health (mHealth) regulatory pathway that pools the agency’s past experience in regulating medical devices. I also propose a less cumbersome method of obtaining clearances for frequently updated software, a more technology-friendly approval pathway, and an interagency certification model.

CITATION: Kevin Khachatryan, Comment, Medical Device Regulation in the Information Age: A Mobile Health Perspective, 55 Jurimetrics J. 477–507 (2015).
Rehabilitation of MAOA Deficient Criminals Could Lead to a Decrease in Violent Crime
Ashley Wiberg

Researchers have found that a low variant of the monoamine oxidase A (MAOA) gene, coupled with childhood abuse or mistreatment, creates a propensity for psychopathic tendencies, including violence and aggression. More recently, defense attorneys have called for this genetic evidence to be presented in criminal trials and during sentencing to prove defendants suffer from this genetic and environmental combination. The combination can explain psychopathy, which makes it difficult for individuals to control their aggressive tendencies. Some courts have allowed the presentation of genetic evidence, and there is some indication it can be used as a mitigating factor in more criminal trials as it gains acceptance in the legal and scientific communities. If this defense is the wave of the future, a better system must be created for sentencing and rehabilitating those who are genetically predisposed to act violently. Many fear the use of genetic evidence as a mitigating factor will cause further harm to society. Dangerous, violent individuals could be punished less severely and set free to harm again. Rehabilitating MAOA-low offenders would eliminate this fear and could significantly decrease violent crime, as it is estimated that MAOA-low males commit roughly forty percent of violent crimes in the United States.

CITATION: Ashley Wiberg, Comment, Rehabilitation of MAOA Deficient Criminals Could Lead to a Decrease in Violent Crime, 55 Jurimetrics J. 509–526 (2015).
Pleading Patterns and the Role of Litigation as a Driver of Federal Climate Change Legislation
Juscelino F. Colares and Kosta Ristovski

Based on a variant of the Elliott-Ackerman-Millian theory that variable, potentially inconsistent, and costly litigation outcomes induce industry to seek federal preemptive legislation to rein in such costs, we collect data on climate change-related litigation to determine whether litigation might motivate major greenhouse gas emitters to accept a preemptive, though possibly carbon-restricting, legislative compromise. We conduct a spectral cluster analysis on 178 initial federal and state judicial filings to reveal the most relevant groupings among climate change-related suits and their underlying pleading patterns. Besides exposing the general content and structure of climate change-related filings, this study identifies major specific pleading trends, such as the low frequency of tort claim pleading and the high level of segregation of state and federal causes of action. These data also allow us to investigate how generally applicable litigation doctrines have influenced pleading patterns, even subduing the impact of the two major U.S. Supreme Court rulings in this area. These findings lead us to conclude that this type of litigation has not induced, and is not likely to induce, major emitters to embrace preemptive emissions legislation as a risk-reducing compromise.
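For readers unfamiliar with the method, the sketch below shows, under invented assumptions, how a spectral cluster analysis of pleading patterns might be set up: each filing is coded as a vector of pleaded claim types, a similarity matrix is built from shared claims, and the filings are grouped with scikit-learn’s SpectralClustering. The claim categories, toy filings, and choice of three clusters are hypothetical and do not reproduce the authors’ coding of the 178 filings.

# A minimal sketch (hypothetical data, not the authors' coding of the 178
# filings): spectral clustering over binary pleading patterns.
import numpy as np
from sklearn.cluster import SpectralClustering

# Each row is one filing; each column flags whether a claim type was pleaded
# (e.g., federal statutory, state statutory, common-law tort, constitutional).
filings = np.array([
    [1, 0, 0, 0],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 0, 0, 1],
])

# Affinity matrix from shared claim types (higher = more similar pleadings).
affinity = filings @ filings.T
np.fill_diagonal(affinity, affinity.max())

model = SpectralClustering(n_clusters=3, affinity="precomputed", random_state=0)
labels = model.fit_predict(affinity)
print(labels)  # cluster assignment for each toy filing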

CITATION: Juscelino F. Colares and Kosta Ristovski, Pleading Patterns and the Role of Litigation as a Driver of Federal Climate Change Legislation, 54 Jurimetrics J. 329–373 (2014).
Thirteen Ways to Steal a Bicycle: Theft Law in the Information Age by Stuart P. Green
Reviewed by Jon M. Sands

This term, the United States Supreme Court heard and decided ABC Inc. v. Aereo, on whether a business that captures television broadcast signals and streams them to customers is a legitimate service or is engaged in a form of theft. Aereo argued that it was merely allowing its customers to do what they have done since the advent of television broadcasting, except now the customers could rent equipment from Aereo that allowed them to watch the shows at their convenience. This renting of antennas to individuals evaded the copyright prohibition against transmissions of public performances. Broadcasters argued that Aereo should have to pay fees, as the capturing of the broadcast was a form of retransmission. The taking of these transmissions prompted Chief Justice John G. Roberts to state to the Aereo lawyer, “[Y]our technological model is based solely on circumventing legal prohibitions that you don’t want to comply with.” Justice Ruth Bader Ginsburg and Justice Sonia Sotomayor asked questions that implied Aereo was taking content without paying for it. Justice Ginsburg told Aereo’s lawyer that the company “[was] the only player so far that doesn’t pay any royalties at any stage,” while Justice Sotomayor said, “It’s not logical to me . . . that you can make these millions of copies and . . . essentially sell them to the public.” The justices struggled, however, with the implications for new technologies and the unknown ramifications for, for example, digital cloud innovations and storage. Justice Stephen G. Breyer stated that what disturbed him is the impact on “all kinds of other technologies.” Ultimately, the Supreme Court decided against Aereo, in a decision that has far-reaching implications for broadcast television and for the cloud. The issues raised also concern what is, and is not, theft in the digital age.

CITATION: Jon M. Sands, Stuart P. Green’s Thirteen Ways to Steal a Bicycle: Theft Law in the Information Age, 54 Jurimetrics J. 375–384 (2014).
The Helpfulness of Copyright Misuse in DMCA Circumvention
Lauren Stewart

The Digital Millennium Copyright Act’s anticircumvention provisions grant a copyright owner a claim against a party who circumvents a technological measure controlling access to his work. Anticircumvention provides significant protection for computer programs and digitally distributed works that are vulnerable to more than traditional infringement. Protecting against more than infringement necessarily requires anticircumvention to operate beyond the scope of traditional copyright protection. The 2010 decision in MDY v. Blizzard left a circuit split on the provisions’ correlation with copyright limitations. The split carries a concern that unlimited anticircumvention permits copyright owners to leverage their copyright monopolies beyond intended protections. Herein lies a convergence of tensions among software’s ill fit with copyright law, digital copyright owners’ need for protection from rampant online abuse, and the public’s need for access that facilitates innovation and improvement. The copyright misuse doctrine has helpful application in dealing with these tensions. Applied to anticircumvention, copyright misuse allows courts to employ DMCA policy and enforce circumvention claims when access controls are connected with the incentives Congress intended to protect. Moreover, copyright misuse application resolves the split created by MDY while maintaining the desired results. Consequently, copyright misuse provides a means for defining an anticircumvention limit by analyzing access controls using justifiable copyright bounds.

CITATION: Lauren Stewart, The Helpfulness of Copyright Misuse in DMCA Circumvention, 54 Jurimetrics J. 385–408 (2014).
The Use of Nanotechnology within the Solar Industry: A Sustainability Perspective
Lauren A. Ferrigni

Nanotechnologies have garnered widespread support from the scientific community for their potential benefits in many areas, including improving efficiency within the solar energy generation, transportation, and storage sectors. Despite such promise, there is increasing concern regarding the potential risks such technologies may pose to human health and the environment. Hence, while nanotechnologies are forecasted to advance the solar industry and its contributions toward global sustainability, such efforts may be thwarted by the fragmented and lax regulations currently imposed on such technologies. While traditional top-down regulatory models have garnered support as potential solutions, such methods are infeasible due to the ever-evolving nature of nanotechnologies and the uncertainty regarding the potential risks associated with the use of nanomaterials. Likewise, strict privatization of solar nanotechnology governance is untenable because this model would permit industry members to freely produce solar nanotechnologies without considering the potential adverse effects such innovations may pose to human health and the environment. This comment supports adopting a new governance model as an effective oversight mechanism to ensure the adequate identification of the potential exposures associated with the use of nanotechnology within the developing solar industry. In view of the proposed new governance approach to solar nanotechnology oversight, a public-private partnership between the EPA and a consortium of diverse solar nanotechnology-invested stakeholders should be formed as a collaborative effort to ensure the sustainable use of nanomaterials within the developing solar industry.

CITATION: Lauren A. Ferrigni, The Use of Nanotechnology within the Solar Industry: A Sustainability Perspective, 54 Jurimetrics J. 409–432 (2014).
Forensic Fallacies and a Famous Judge
Jonathan J. Koehler

Probabilistic reasoning in the law is replete with well-documented errors and pitfalls. The errors include the prosecutor’s and defense attorney’s fallacies, the transposed conditional, the source probability error, the numerical conversion error, the probability of another match error, the false positive fallacy, the base rate fallacy, selection bias, the individualization fallacy, the fingerprint examiner fallacy, the uniqueness fallacy, the conjunction fallacy, disjunctive errors, the imperfection fallacy, misconceptions of chance, and pseudodiagnosticity. These errors arise most often in cases that include forensic science evidence. Because statistical reasoning can be both complex and counterintuitive (for example, the application of Bayes’ Theorem to conditional probability matters), it is not entirely surprising when jurors and attorneys commit these errors. But it is more surprising, and more damaging to the legal system as a whole, when those we trust to get it right commit and perpetuate those errors.
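As a concrete illustration of one such pitfall, the short sketch below works through the transposed conditional (the prosecutor’s fallacy) with hypothetical numbers; the match probability and prior are invented solely to show how Bayes’ Theorem separates P(match | not source) from P(source | match).

# A worked numeric illustration (hypothetical figures) of the transposed
# conditional, i.e., the prosecutor's fallacy: confusing P(match | not source)
# with P(not source | match).
random_match_prob = 1e-6        # P(match | defendant is not the source)
prior_source = 1.0 / 100_000    # prior P(defendant is the source)
p_match_given_source = 1.0      # assume a true source always matches

# Bayes' Theorem:
# P(source | match) = P(match | source) * P(source) /
#   [P(match | source) * P(source) + P(match | not source) * P(not source)]
posterior = (p_match_given_source * prior_source) / (
    p_match_given_source * prior_source
    + random_match_prob * (1 - prior_source)
)

print(f"P(match | not source) = {random_match_prob:.7f}")   # 0.0000010
print(f"P(source | match)     = {posterior:.3f}")           # about 0.909
# Equating the two quantities would overstate the evidence: the fallacy
# treats a one-in-a-million match probability as proof of near-certain guilt.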

CITATION: Jonathan J. Koehler, Column, Forensic Fallacies and a Famous Judge, 54 Jurimetrics J. 211–19 (2014).
Comparing the Use of Forensic Science Evidence in Australia, Switzerland, and the United States: Transcending the Adversarial-Nonadversarial Dichotomy
Gary Edmond and Joëlle Vuille

This article compares responses to incriminating expert evidence (that is, forensic science) in Australia, Switzerland, and the United States. It begins with an outline of the three systems. Then, drawing on recent reviews of the forensic sciences, it explains that many of the forensic sciences have not been formally evaluated—that is, never subjected to validation studies. This means that in many cases we do not know if techniques work, nor how well. It also means that standards, claims about proficiency and experience, as well as the expressions used by analysts, are not empirically based. These critical findings from a range of peak scientific organizations and commissions of inquiry (for example, the U.S. National Academy of Sciences and the National Institute of Standards and Technology) are then used to illuminate the impact of rules, procedures, and the performance of personnel (such as forensic scientists, prosecutors, defense lawyers, and judges) across our three jurisdictions. The article explains how three different criminal justice systems each failed to identify or credibly respond to deep structural and endemic problems with many types of forensic science and medicine evidence routinely used by investigators and prosecutors. Serious problems with forensic science techniques and derivative evidence are rarely identified, let alone explained and conveyed in ordinary criminal proceedings. Indeed, there is very limited evidence that lawyers and judges are conversant with emerging critiques or the corrosive impact of speculative expert evidence on criminal proof. The article endeavors to understand these failures and the weakness of processes and safeguards across advanced criminal justice systems that include adversarial and nonadversarial elements. It also considers why many of the problems were initially identified in the United States, although not necessarily in courts, and what might be done in each of these jurisdictions to improve awareness and enhance legal responses to weak and speculative forms of incriminating expert evidence.

CITATION: Gary Edmond and Joëlle Vuille, Comparing the Use of Forensic Science Evidence in Australia, Switzerland, and the United States: Transcending the Adversarial-Nonadversarial Dichotomy, 54 Jurimetrics J. 221–76 (2014).
Protecting Victims of Cyberstalking, Cyberharassment, and Online Impersonation through Prosecutions and Effective Laws
Cassie Cox

The Internet has spawned a dark side by creating new ways to commit crime. Cybercrimes are increasing exponentially, and current laws do not provide adequate redress. Emerging areas that have troubled, and will continue to trouble, the courts are cyberstalking, cyberharassment, and online impersonation. Although these crimes contain elements found in current stalking and harassment laws, they are unique offenses with distinguishing qualities. Beyond traditional stalking, cyberstalkers can use a wider range of methods, from tracking victims through social media to impersonating targeted individuals. To adequately address these crimes, lawmakers need to draft new laws specific to these offenses. Because prosecutions of cyberstalking and cyberharassment are stymied by the current statutory requirements of culpable mental state, I propose statutes and rules that individually address cyberstalking, cyberharassment, and online impersonation to improve protections for victims and accountability for offenders.

CITATION: Cassie Cox, Comment, Protecting Victims of Cyberstalking, Cyberharassment, and Online Impersonation through Prosecutions and Effective Laws, 54 Jurimetrics J. 277–302 (2014).
The Sancho Effect: Why the Large Hadron Collider Won’t Destroy the Earth, and How It Could Improve Science in the Courts
Marshall Chance Peterson

Many scientific theories begin life as highly abstract concepts spawned from the imaginative brains of “mad scientists.” Particularly in fields such as high energy particle physics, where the size, complexity, and cost of even setting up an experiment can be prohibitive, such theories often live or die as purely mathematical expressions, never living to see the inside of a laboratory. When various theories of this nature are at odds, the scientific process, in all its careful, calculated glory, provides for eventual—but never quite final—resolution. What happens, then, when rapid and final resolution is sought? When the scientific debate becomes a legal dispute, does the legal process complement the scientific, or interfere with it? When such disputes arise, who is the better judge: one in a black robe, one in a white coat, or somewhere in between (a grey cardigan)? Similar issues frequently arise in the context of the Federal Rules of Evidence, where courts are asked to determine the admissibility of forensic science evidence. In this comment, the focus is placed on cases where advanced scientific theories themselves are the subject of litigation, and modifications are proposed that would better enable courts to address such cases. A brief introduction to the European Organization for Nuclear Research’s (CERN) Large Hadron Collider (LHC) is provided. Part I examines Sancho v. U.S. Department of Energy, a case in which two private plaintiffs sought to halt construction of the LHC, to illustrate the difficulties courts face in effectively adjudicating scientific disputes. Part II proposes three possible modifications for courts to consider to improve the adjudicative process in cases involving scientific claims. First, a new preliminary injunction standard is examined that could be applied in cases involving low probability but potentially catastrophic risks. This option is ultimately rejected as misguided and ineffective. Second, a proposal for a Judicial Scientific Research Service (JSRS), similar to the Congressional Research Service, is offered as another viable alternative. Although a JSRS could significantly improve science-based adjudication, it would also present structural, constitutional, and political questions. To avoid these problems and to effectuate the most comprehensive improvements, an argument is made for the creation of a new science court. While the idea of a science court has garnered considerable interest and generated serious debate in the past, the discussion has, unfortunately, lain surprisingly dormant in recent decades. This comment’s aim is to reopen the discussion in hopes that our legal system will be able to keep pace with scientific and technological progress.

CITATION: Marshall Chance Peterson, Comment, The Sancho Effect: Why the Large Hadron Collider Won’t Destroy the Earth, and How It Could Improve Science in the Courts, 54 Jurimetrics J. 303–317 (2014).
Judith Flanders’s The Invention of Murder: How the Victorians Revelled in Death and Detection and Created Modern Crime
Reviewed by Jon M. Sands

Judith Flanders’s The Invention of Murder examines the advent of Sherlock Holmes and other literary detective stories, arguing, among other themes, that the creation of detective sleuths and police investigators coincided with the rise of modern policing, forensic science, and the entertainment industry of Victorian England. The prim and proper Victorians were aghast at the perceived rise of violent crime, yet simultaneously enthralled with the latest terrible murder accounts and stories of detection. Flanders has written a fascinating and sprawling, gruesome and engrossing, lucid and lively social history of murder in 19th-century England.

CITATION: Jon M. Sands, Judith Flanders’s The Invention of Murder: How the Victorians Revelled in Death and Detection and Created Modern Crime, 54 Jurimetrics J. 319–28 (2014) (book review).
Cruel and Unusual Punishments: A Comparison of Public and Judicial Opinion
Jason A. Cairns and Jonathan J. Koehler

The Eighth Amendment to the U.S. Constitution mandates that “cruel and unusual punishments” shall not be inflicted. Based on an “evolving standards of decency” doctrine, the Supreme Court has held that forbidden punishments include the death penalty for non-homicidal crimes, the death penalty for the insane, and life imprisonment without the possibility of parole for juveniles. But how do lower court judges and ordinary citizens view these punishments and their relationship to justice? We surveyed 127 trial and appellate judges and 180 jury-eligible members of the general public about these matters. We found that the standards of decency for our surveyed groups have not evolved to the place reached by the Supreme Court. In three Eighth Amendment scenarios, a large majority of the public favored punishments that were expressly rejected as cruel and unusual by the Supreme Court. The attitudes of lower court judges fell between those of the general public and the Supreme Court, but closer to those of the Supreme Court. We discuss some reasons why the public may be more punitive than the judges and Justices, and whether courts should attend to public opinion on this matter.

CITATION: Jason A. Cairns and Jonathan J. Koehler, Cruel and Unusual Punishments: A Comparison of Public and Judicial Opinion, 54 Jurimetrics J. 109–34 (2014).
Combating Baseless Patent Suits: Rule 11 Sanctions with Technology-Specific Application
Hana Oh Chen

Nuisance-value patent suits impose significant costs on a wide range of industries. Patent assertion entities (PAEs), or “patent trolls,” receive particular attention, although individual inventors and practicing corporations also contribute. Yet effective solutions in the political arena remain elusive. The situation calls for a practical approach that can be quickly implemented, is useful throughout the various stages of litigation, and is adaptable to meet the different needs of various industries. Sanctions under Rule 11 can be such a tool in curbing frivolous infringement suits. Rule 11 is particularly useful in the patent litigation context because it applies throughout the stages of litigation, including the initial stages, and its provisions help prevent the overdeterrence of claims with merit. A technology-specific application of the prefiling investigation requirement under Rule 11 can be an effective weapon in deterring plaintiffs from bringing meritless claims. Existing factors of the Rule should be considered within the context of the specific technology at issue.

CITATION: Hana Oh Chen, Combating Baseless Patent Suits: Rule 11 Sanctions with Technology-Specific Application, 54 Jurimetrics J. 135–78 (2014).
Producing Valuable Patents: The USPTO’s Missed Opportunity in Biotechnology
Jonathan T. McMichael

Charged with managing technological innovation across the nation, the United States Patent & Trademark Office (USPTO) has recently made attempts to encourage certain kinds of technology development over others. The impetus behind these so-called pilot programs is unclear; presumably, programs that accelerate protection of new technology contribute to the USPTO’s reduction of the often-cited patent application backlog. However, social and environmental calls to action might also serve as motivation for the USPTO’s actions. In its Green Technology Pilot Program, implemented in late 2009, the USPTO underscored its preference for certain technologies with greater relative potential to benefit society by offering examination-based incentives to inventors of environmental technologies. Nonetheless, the Green Technology Pilot Program leaves an obvious gap in incentive support for biomedical innovations, another field with a high potential for societal benefit. While the USPTO’s controversial approach to encouraging green technologies could be wholly lifted and applied to biomedical developments, such an approach would fail to recognize fundamental differences between these industries and perhaps even deter inventors’ participation in the program and the patent system. Therefore, because of inherent differences between the green technology and biotechnology industries, the USPTO must devise a unique and sustainable solution to encourage biomedical inventions for humanitarian purposes.

CITATION: Jonathan T. McMichael, Comment, Producing Valuable Patents: The USPTO’s Missed Opportunity in Biotechnology, 54 Jurimetrics J. 179–200 (2014).
The Techno-Human Shell: A Jump in the Evolutionary Gap by Joseph Carvalko
Reviewed by Ellen M. McGee

Joseph Carvalko’s The Techno-Human Shell: A Jump in the Evolutionary Gap is an indispensable guide to the future of mankind. The author has mastered the developments presently occurring in the worlds of biology, robotics, synthetic biology, nanotechnology, and digital technology, and has projected the future course of these research enterprises. It is an amazing tale. The book holds its own in that ever-growing field of volumes on technology and the future; its chief virtue is its attention to most areas of scientific progress that will intimately affect mankind.

CITATION: Ellen M. McGee, Joseph Carvalko’s The Techno-Human Shell: A Jump in the Evolutionary Gap, 54 Jurimetrics J. 201–09 (2014) (book review).
Private Assets, Public Mission: The Politics of Technology Transfer and the New American University
David E. Winickoff

The rise of biotechnology, in concert with the Bayh-Dole Act, greatly intensified the practice of university technology transfer in which academic institutions take ownership of federally funded discoveries and license them as private assets. Technology transfer has become a site through which law, life sciences, and the idea of the “entrepreneurial university” are coevolving. Scholars of technology transfer have tended to describe the process of institutionalization as a smooth one, overlooking the important ways that it has been uneven and contested. Particular licensing controversies in the 1990s and 2000s demonstrate how certain prevailing assumptions of the entrepreneurial university have been challenged, and how the boundaries of the public and private domain are under active negotiation. In particular, controversies surrounding biotechnological objects like transgenic mice, stem cells, and pharmaceutical targets evince fundamental disagreements about the obligations of the research university in its new role as an investor of knowledge capital. These controversies have also helped produce new answers to larger questions facing the university as it seeks to negotiate its private and public interests. Faced with increasing political scrutiny from diverse stakeholders, university technology transfer professionals have been forced to develop new practices of public accountability. This trend should be welcomed, as it has the potential to enable both a deeper democratic engagement and a richer distributive politics within the American innovation system.

CITATION: David E. Winickoff, Private Assets, Public Mission: The Politics of Technology Transfer and the New American University, 54 Jurimetrics J. 1–42 (2013).
A Brief of Genetics, Genomics and Forensic Science Researchers in Maryland v. King
Henry T. Greely and David H. Kaye

This issue of Jurimetrics includes the core of an amicus brief filed with the United States Supreme Court in the case of Maryland v. King. The brief was filed on December 28, 2012, the matter was argued on February 26, 2013, and the Supreme Court issued its decision on June 3, 2013. By a five-to-four vote, the Court reversed Maryland’s highest court by holding that Maryland’s statute requiring DNA samples from defendants arrested for, but not (yet) convicted of, attempted or completed crimes of violence or burglary did not violate the Fourth Amendment.
The amici, eight scientists and two law professors, filed this brief in support of neither party. Our goal was to lay out, as accurately, completely, and (we hoped) understandably as possible, what current science tells us about the information conveyed by the thirteen short tandem repeats known as “CODIS markers,” the variations in DNA generally used by American law enforcement for forensic identification. The extent to which those markers provide information beyond identification is often viewed as important in interpreting the intrusiveness of forensic DNA testing. The conclusion that emerges from the brief is that the salient information content is not nothing, but not very much. That is, of course, the answer as of December 2012. Although we know of no changes in the scientific understanding of the CODIS markers, such changes may always take place. The markers currently used for forensic identification—both the type of marker and the exact DNA locations involved—can also change. (The FBI plans to expand the set of CODIS markers, which may change some of the details discussed in this brief.) But we believe this brief represents the best understanding of the informational significance of the CODIS markers as of December 2012.

CITATION: Henry T. Greely and David H. Kaye, A Brief of Genetics, Genomics and Forensic Science Researchers in Maryland v. King, 54 Jurimetrics J. 43–64 (2013).
Patents without Teeth: Whole Genome Sequencing and Gene Patent Infringement after AMP v. Myriad
Blake Atkinson

Association for Molecular Pathology v. Myriad Genetics highlights the complexity of gene-related patents and the ethical questions they provoke. This recent Supreme Court decision, hailed by the popular press as placing human genes beyond the reach of patent protection, permits the continued patenting of isolated complementary DNA strands and leaves Myriad in a strong position to continue enforcing its patents to restrict access to genetic testing on the BRCA1 and BRCA2 breast cancer genes. A new technology, poised to become commonplace in the medical field in the next few years, promises to offer a novel resolution to the problem. Whole genome sequencing (WGS) techniques will soon decrease in price below $1,000 and rapidly grow in usage frequency, allowing for the sequencing of an individual's whole six-billion-base-pair genome for less than the cost of a handful of genes via the patented testing procedures used today. Furthermore, drawing on the conclusion of the Association for Molecular Pathology case and other legal arguments, WGS is highly unlikely to infringe any surviving gene patents, operating in a distinct manner that does not infringe the complementary DNA patents upheld by the Supreme Court. This technology will provide a robust solution to the arguments that gene patents drive up the cost of genetic testing and restrict patient access to care—even if it does not offer a satisfying philosophical denouement—by offering inexpensive and comprehensive genetic testing without infringing the gene patents owned by companies like Myriad Genetics.

CITATION: Blake Atkinson, Comment, Patents without Teeth: Whole Genome Sequencing and Gene Patent Infringement after AMP v. Myriad, 54 Jurimetrics J. 65–84 (2013).
You Can’t Handle the Truth: Rationally Limiting the Duty to Disclose Genetic Information
Lauren Burkhart

Health-care costs are a significant current concern, and overtreatment burdens both individuals and the national health-care system. One potential contributor to the current era of overtreatment is the development and rapid expansion of genetic testing. Despite a great deal of attention in the academic and legal communities, many issues surrounding genetic testing and the tests’ results have been left unresolved. The issue of when and how to disclose the results of genetic tests was identified more than a decade ago, and it has yet to be definitively addressed by court systems or regulatory agencies. Why? Because the clear, simplistic approaches proposed thus far are too strict, but accounting for the interests of all parties involved is daunting. Genetic information is as important to patients as any other health information, but is unique in that the reliability, versatility, and utility of a given piece of genetic information today may not be the same tomorrow. Legal rules must adapt as genetic technology improves, treatments progress, and care strategies change, but also be clear and straightforward in a particular case at any moment in time. Economics provides the ideal solution: a formula that balances the costs and benefits to all parties to establish a floor for when disclosure of genetic information to patients is mandatory, and gives parties a basis for evaluating if and when to disclose more at their own discretion. Variable factors for each gene, each associated condition, and the current status of treatment for any case can be inserted into such a formula to reach an appropriate result. The result may then give rise to a duty to disclose when it is beneficial and relieve parties of that duty when it is not, avoiding the inefficient delays and administrative costs of constantly updating legal rules. An economic formula should be adopted to limit mandatory disclosure of information to circumstances where the benefits outweigh the total costs to the patient and the burden on the health-care system.
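The abstract does not reproduce the formula itself; as a purely illustrative sketch (the variable names below are assumptions for exposition, not the author's), the proposed cost-benefit floor might take a form like the following:

```latex
% Hypothetical illustration of a disclosure-threshold formula (not the author's own).
% Disclosure is mandatory (D = 1) only when the expected benefit to the patient
% exceeds the total expected costs borne by the patient, the provider, and the system.
\[
  D =
  \begin{cases}
    1 & \text{if } p \cdot B_{\text{patient}} > C_{\text{patient}} + C_{\text{provider}} + C_{\text{system}}, \\
    0 & \text{otherwise,}
  \end{cases}
\]
% where $p$ is the current reliability of the genetic finding, $B_{\text{patient}}$ is the
% expected clinical benefit of disclosure given available treatments, and the $C$ terms
% capture the burdens that disclosure imposes on the patient, the provider, and the
% health-care system.
```

Because the inputs (reliability, associated condition, treatment status) change over time, the same threshold can yield different disclosure obligations for the same gene at different moments, which is the flexibility the comment emphasizes.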

CITATION: Lauren Burkhart, Comment, You Can’t Handle the Truth: Rationally Limiting the Duty to Disclose Genetic Information, 54 Jurimetrics J. 85–107 (2013).
The Dictionary and the Database
David H. Kaye

Almost without exception, courts have upheld the constitutionality of collecting DNA from convicted offenders and placing their identifying DNA profiles into computer-searchable databases that are trawled for matches to DNA recovered from crime scenes or from victims. Although adding profiles to databases before a conviction is far more controversial, just this June, the Supreme Court upheld the constitutionality of a Maryland law mandating DNA collection on arrest and analysis after arraignment. The Court’s endorsement of—and apparent enthusiasm for—DNA databases may encourage more states to adopt preconviction testing.

CITATION: David H. Kaye, Column, The Dictionary and the Database, 53 Jurimetrics J. 389–94 (2013).
The Effect of Adjusted Actuarial Risk Assessment on Mock-Jurors’ Decisions in a Sexual Predator Commitment Proceeding
Nicholas Scurich and Daniel A. Krauss

Twenty states and the federal government have adopted statutes that authorize the post-incarceration commitment of sexually violent predators. Actuarial risk assessment is commonly used, and in some states statutorily required, to assess the risk of sexual recidivism in these proceedings. Professionals sometimes modify actuarial risk estimates with their own clinical judgment, the so-called adjusted actuarial approach. Although this approach is controversial and courts almost uniformly permit it, the effect of this practice on fact finders is unknown. This experiment found that adjusting actuarial risk estimates affected mock-jurors’ decisions to commit a respondent, but only when the adjustment increased the risk estimate. Adjusting the risk estimate downwards did not decrease the commitment rate. Notably, this effect occurred without the expert providing any rationale for the adjustment. Further analyses suggest that participants engaged in motivated reasoning, which refers to the tendency to selectively credit or discredit information depending on whether it is congenial to the desired outcome. Participants who chose to commit the respondent deemed the assessment highly acceptable when it indicated high risk, and relatively unacceptable when it indicated low risk, even though the substance of the assessments was identical. Implications for the adjusted actuarial approach are discussed in conjunction with existing legal admissibility standards for expert testimony.

CITATION: Nicholas Scurich and Daniel A. Krauss, The Effect of Adjusted Actuarial Risk Assessment on Mock-Jurors’ Decisions in a Sexual Predator Commitment Proceeding, 53 Jurimetrics J. 395–413 (2013).
The Need to Carefully Interpret the Statistics Reporting the Accuracy of a Narcotics Detection Dog: Application to South Dakota v. Nguyen, State of Florida v. Harris and Similar Cases
Joseph L. Gastwirth

Before searching people or property, police need “probable cause.” In United States v. Place, the Supreme Court held that a dog sniff is not a Fourth Amendment search, and courts have since applied that holding to vehicles stopped for lawful purposes, holding that a positive alert by a trained narcotics dog can establish probable cause for a search of the car. Many courts use the fraction of positive identifications in which drugs were found to assess the reliability of the dog, and thus to decide whether the alert established probable cause. In the medical test literature, this summary statistic is called the predictive value of a positive test (PVP). By itself, it does not measure the accuracy of the sniffs or of a medical test. There are two components to assessing the accuracy of a dog sniff or test: the probability that the sniff correctly classifies an item containing contraband as having it, and the probability that the sniff correctly exonerates an item not containing contraband. The PVP depends on both of these probabilities and on the prevalence of contraband in the places the dog has examined. The same PVP can arise when (1) an accurate dog sniffs items with a low prevalence of contraband and (2) a much less accurate dog examines items with a high prevalence of drugs. It is mathematically impossible to estimate the two accuracy rates of a narcotics dog from the field performance data typically submitted by the state to show the dog is reliable; one needs three equations to estimate the prevalence and the two accuracy rates, but the data provide only two. These issues are illustrated with data from actual cases. Furthermore, the number of test sniffs in many certifications is too small to provide a statistically reliable measure of the dog’s accuracy, and the prevalence of drugs in the items sniffed is usually at least 50%. Rather than continuing to rely on an inappropriate measure of the accuracy of dog sniffs, courts should require more information concerning the accuracy of dogs in their training sessions, certifications, and field work. Combined with better information on the prevalence of drugs in commonly occurring settings, such as vehicles stopped for routine traffic violations or items examined after police have received a “tip,” access to dog accuracy rates would give the legal system sufficient information to estimate both measures of a narcotics dog's accuracy and its PVP, which would assist courts in determining whether the police had probable cause.
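To make that dependence concrete, the short Python sketch below (using accuracy rates and prevalences that are purely hypothetical, not drawn from the cases) computes the PVP from a dog's two accuracy rates and the prevalence of contraband, and shows that an accurate dog working low-prevalence items and a much less accurate dog working high-prevalence items can produce the same PVP.

```python
# Illustrative sketch only: hypothetical numbers, not data from Nguyen or Harris.
# PVP = P(contraband present | positive alert), computed via Bayes' rule from
# sensitivity (probability of alerting when contraband is present), specificity
# (probability of not alerting when it is absent), and prevalence (fraction of
# sniffed items that actually contain contraband).

def pvp(sensitivity: float, specificity: float, prevalence: float) -> float:
    true_positives = sensitivity * prevalence
    false_positives = (1.0 - specificity) * (1.0 - prevalence)
    return true_positives / (true_positives + false_positives)

# (1) an accurate dog sniffing items with a low prevalence of contraband ...
accurate_low_prevalence = pvp(sensitivity=0.95, specificity=0.95, prevalence=0.25)

# (2) ... and a much less accurate dog sniffing items with a high prevalence of drugs
weak_high_prevalence = pvp(sensitivity=0.76, specificity=0.52, prevalence=0.80)

print(f"(1) accurate dog, low prevalence:       PVP = {accurate_low_prevalence:.3f}")
print(f"(2) less accurate dog, high prevalence: PVP = {weak_high_prevalence:.3f}")
# Both lines print PVP = 0.864: the field hit rate alone cannot distinguish the two
# dogs, because the three unknowns (two accuracy rates and the prevalence) cannot be
# recovered from the two quantities that field records supply.
```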

CITATION: Joseph L. Gastwirth, The Need to Carefully Interpret the Statistics Reporting the Accuracy of a Narcotics Detection Dog: Application to South Dakota v. Nguyen, State of Florida v. Harris and Similar Cases, 53 Jurimetrics J. 415–446 (2013).
The Computer Fraud and Abuse Act: Hacking into the Authorization Debate
Alden Anderson

The Computer Fraud and Abuse Act (CFAA) makes it a federal crime for a person to intentionally access a computer without authorization or to exceed authorized access and thereby obtain information from any protected computer. The CFAA also provides civil remedies against individuals who violate the Act. While the Act broadly defines what constitutes exceeding authorized access, it does not define authorization. Although the CFAA was originally passed to criminalize computer hacking, employers have recently used the CFAA’s vague language to contend that employees who misuse employer information violate the Act. This has created a split among the circuit courts over whether an employee who misuses employer information pursuant to authorized physical computer access “accesses a computer without authorization” or “exceeds authorized access” and can thus be prosecuted or held liable for civil damages under the Act. This article examines those competing interpretations and argues that permitting employers to bring misappropriation claims under the Act is inconsistent with the CFAA’s legislative history and basic propositions of federalism. Specifically, allegations that an employee misused an employer’s information are often based on trade secret violation claims and claims that the employee breached his employment contract or duty of loyalty. These are state claims, and an employer should not be permitted to use the CFAA’s ambiguous language to pursue them in federal court under a substantially lower burden of proof than that required by the state and federal statutes that address these specific claims. This article proposes that Congress amend the CFAA to make clear that it creates criminal and civil liability for unauthorized computer access resulting from employee computer hacking, not simply for using information in a way that is contrary to the employer’s interests. Alternatively, the judiciary should not permit employers to use the CFAA to bring these claims into federal court.

CITATION: Alden Anderson, Comment, The Computer Fraud and Abuse Act: Hacking into the Authorization Debate, 53 Jurimetrics J. 447–466 (2013).
Canvassing the Emerging Law of Digital Information: Stephen Mason’s Electronic Evidence
George L. Paul

Stephen Mason’s Electronic Evidence, 3rd edition, explores the laws of Australia, Canada, England and Wales, the European Union, Hong Kong, India, the Republic of Ireland, New Zealand, Scotland, Singapore, South Africa and the United States. The tome comprises 934 pages and discusses over 1500 cases, statutes and treaties. Accordingly, it is appropriate to conceive of Electronic Evidence as an encyclopedia of sorts. The lead author is Stephen Mason of Bedfordshire, England, but the new edition of Electronic Evidence contains the work of 23 other contributors working under Mason’s direction. Why read a book discussing so many jurisdictions? Certainly, no one could practice in each of those nations. And certainly, the book does not exhaustively cover each jurisdiction. It is, rather, more in the nature of an overview. Why read it? For a start, there is nothing more important to a legal system than the analysis of “information.” Information is the lifeblood of commerce, societal dialogue, and interpersonal communication; and therefore, it is also the subject of litigation the world over. Our economy, indeed, is now overwhelmingly an “information economy,” and will only become more so.

CITATION: George L. Paul, Canvassing the Emerging Law of Digital Information: Stephen Mason’s Electronic Evidence, 53 Jurimetrics J. 467–481 (2013) (book review).
Shubha Ghosh’s Identity, Invention, and the Culture of Personalized Medicine Patenting
Adam L. Lunceford

The U.S. Supreme Court recently granted a writ of certiorari in Ass’n for Molecular Pathology v. Myriad Genetics, Inc., the famous case challenging the validity of patents issued for the use of the human genes BRCA1 and BRCA2 to diagnose breast and ovarian cancers. The Court limited its grant of the appeal to the question: “Are human genes patentable?” Can a person’s identity be reduced to a set of genes, a racial classification, or skin color, and how might the U.S. patent system promote such definitions of identity? How do the theories of patent law and those of race intersect? These and similar questions are addressed by Shubha Ghosh in Identity, Invention, and the Culture of Personalized Medicine Patenting.

CITATION: Adam L. Lunceford, Shubha Ghosh’s Identity, Invention, and the Culture of Personalized Medicine Patenting, 53 Jurimetrics J. 483–485 (2013) (book review).
Public Health Law Symposium: A Foreword
James Hodge, Jr.

Revitalization, revolution, renaissance. In the modern era, these three words depict the foundational changes in the role of law to protect, promote, and preserve the public’s health. Since the early 1990s, public health law has transformed into a distinct, identifiable field through the coupling of phenomenal scholarly vision (including from several scholars in this symposium issue) and public health practitioners’ bold experimentation in novel, applied uses of law to improve public health outcomes. After decades of sheer decline, reluctance, and even reversal of known, effective applications, law and policy have emerged as primary tools for making change in public health. Whether via constitutional interpretation, statutory or regulatory enactments, judicial interventions, or policy adjustments within legal systems, the notion that law can effectively contribute to the public’s health in areas as diverse as tobacco control, obesity prevention, and product safety is now relatively well accepted.

The role of law in furthering the public’s health, however, is controversial. Determining legal responsibility for addressing negative communal health outcomes—whether in the public or private sector—is fraught with political tension. At the heart of many uses of public health law are inexorable conflicts with individual rights. Few public health legal interventions arise without arguments from various entities and individuals about perceived or actual impacts on their specific interests.

CITATION: James G. Hodge, Jr., Foreword, Symposium: The Intersection of Law, Science, and Policy to Protect the Public’s Health, 53 Jurimetrics J. 249–254 (2013).
Valuing the Unidentified: The Potential of Public Health Law
Wendy E. Parmet

Social scientists have long recognized the strong tendency to favor identified over unidentified or merely statistical victims. This so-called unidentified lives phenomenon is visible in many areas of U.S. law and policy, including health law and policy. This article looks at public health law’s potential to serve as a counterbalance to this phenomenon. The article argues that public health law—with its roots in a preliberal vision of the police power, and its close association with public health practice, which emphasizes preventing disease in populations—stands apart from much of American law by focusing on the interests of unidentified lives. The article also discusses recent trends in public health practice and broader developments in American jurisprudence that are limiting public health law’s ability to protect the health of the unidentified, but concludes that public health law nevertheless offers both historical and rhetorical support for valuing the lives and interests of not-yet-identified victims.

CITATION: Wendy E. Parmet, Valuing the Unidentified: The Potential of Public Health Law, 53 Jurimetrics J. 255–277 (2013).
Risk Governance and Population Health
Peter D. Jacobson, Sarah J. McHugh, and Valerie K. Tran

Complex problems and emerging technologies challenge the public health system and create both opportunities and risks for population health. Risk governance is an appropriate tool with which public health practitioners can maximize benefit and minimize risk to the public’s health. To date, there is no risk governance framework that is robust, rigorous, and applicable to population health. This article first explores existing risk governance frameworks and discusses their potential applicability to population health challenges. It then explains specifically how biobanks, which have great potential to improve the population’s health, but threaten individual interests, could benefit from the application of a risk governance framework.

CITATION: Peter D. Jacobson, Sarah J. McHugh, and Valerie K. Tran, Risk Governance and Population Health, 53 Jurimetrics J. 279–292 (2013).
Enforcement: Linking Policy and Impact in Public Health
Julia F. Costich and Dana Patton

Public health law, even optimally developed and implemented, can fall short of its goals if it is not enforced. Enforcement theorists note four considerations in framing relevant law: likelihood that infractions will be detected, liability standards, sanction types, and penalty size. By these standards, enforcement provisions in public health law are variable, and somewhat inconsistent. Recent public health funding cutbacks threaten public health agencies’ ability to execute enforcement duties. In addition, enforcement can be controversial when its targets feel it infringes on their ability to act in their own interests. We present examples in current practice and identify cost-effective enforcement strategies that support objectives of public health law. We apply enforcement theory to four specific objects of public health regulation that experience enforcement challenges: all-terrain vehicles, motor vehicle safety belts, child passenger safety, and driving under the influence.

CITATION: Julia F. Costich and Dana Patton, Enforcement: Linking Policy and Impact in Public Health, 53 Jurimetrics J. 293–305 (2013).
Denialism and Its Adverse Effect on Public Health
Leila Barraza, Daniel G. Orenstein, and Doug Campos-Outcalt

A growing culture of denialism in the realm of public health seeks to reject long-standing interventions based on broad scientific consensus and attempts to manufacture the appearance of legitimate debate on important issues, where the weight of scientific evidence actually precludes genuine dispute. Within the context of two public health interventions, community water fluoridation and childhood immunization, this article will assess: (1) the science supporting these policies; (2) the legal doctrines that facilitate such interventions; (3) how a culture of denialism has created opposition that may lead to detrimental consequences for the public’s health; and (4) strategies to combat the denialist movement.

CITATION: Leila Barraza, Daniel G. Orenstein, and Doug Campos-Outcalt, Denialism and Its Adverse Effect on Public Health, 53 Jurimetrics J. 307–325 (2013).
Public Reporting of Hospital Infection Rates: Not All Change Is Progress
David A. Hyman & Bernard Black

Health-care associated infections (HAIs) are a major public health issue. In response, 29 states currently provide public reporting of hospital-specific HAI rates, but there is considerable diversity in how states present information. In related work, we assess the efficacy of these efforts by scoring individual states on the content, credibility, and usability of their public reports and websites. This article addresses a related but distinct topic, by focusing on three states (California, Pennsylvania, and Washington) that have made substantial changes in their HAI public reports, websites, or both during the short period since they began disclosing HAI rates. Indeed, Washington has made two sets of substantial changes to its website. How have these changes affected the content, credibility, and usability of these reports and websites? Stated more bluntly, does change mean progress? Sadly, as we show, the answer is sometimes “no.” We discuss the lessons that other states should draw from these case studies.

CITATION: David A. Hyman & Bernard Black, Public Reporting of Hospital Infection Rates: Not All Change Is Progress, 53 Jurimetrics J. 327–340 (2013).
Mandatory Vaccination of Health-Care Personnel: Good Policy, Law, and Outcomes
Alexandra M. Stewart, Arthur Caplan, Marisa A. Cox, Kristen H.M. Chang, and Jacqueline E. Miller

Experts agree that influenza vaccination of health-care personnel (HCP) is the best method of preventing nosocomial outbreaks. Because voluntary uptake among HCP has not achieved recommended levels, hundreds of health-care facilities and twenty states have implemented mandatory vaccination programs. However, employee unions assert that unilateral changes to working conditions violate established principles of collective bargaining.

In this article, we argue that mandatory vaccination in the health-care context is supported by ethics, science, and law. We examine nosocomial influenza outbreaks and how hospitals have attempted to curb infections through mandatory HCP influenza vaccination policies. We also explore employee unions’ administrative and legal challenges to the requirements. Finally, we suggest an approach to resolving the dispute between HCP labor representatives and health-care facility management.

CITATION: Alexandra M. Stewart, Arthur Caplan, Marisa A. Cox, Kristen H.M. Chang, and Jacqueline E. Miller, Mandatory Vaccination of Health-Care Personnel: Good Policy, Law, and Outcomes, 53 Jurimetrics J. 341–359 (2013).
The “Jurassic Park” Problem—Dual-Use Research of Concern, Privately Funded Research and Protecting Public Health
Vickie J. Williams

In November 2011, a controversy arose about publishing the results of two publicly funded experiments designed to transform the H5N1 “avian” influenza virus into a disease that would be easily transmissible between mammals through airborne means. The concern was that the published research could be used to produce biological weapons. Questions about the proper balance between encouraging the free flow of scientific information and protecting public health followed. This article discusses recent attempts by the U.S. government to balance these concerns and analyzes the effectiveness of these attempts, especially in light of increasing involvement of the private sector in biological research. It posits that the best way to protect the public health when dual-use research of concern is involved is to ensure that the implications of the proposed research are scrutinized by qualified people before the research begins. The article then proposes that requiring private sector researchers to demonstrate that they submitted their research protocols to the appropriate government body for a review of the potentially harmful uses of the research, before applying for a patent, is an effective way to ensure that privately funded research is subject to the same scrutiny as publicly funded research, while preserving First Amendment rights of free expression.

CITATION: Vickie J. Williams, The “Jurassic Park” Problem—Dual-Use Research of Concern, Privately Funded Research and Protecting Public Health, 53 Jurimetrics J. 361–374 (2013).
Factors Affecting Successful Enactment of Legislation Prohibiting Smoking in Cars with Children: Interviews with State Policy Makers
Amy R. Confair, Jon S. Vernick, Shannon Frattaroli, and Gregory J. Tung

Secondhand tobacco smoke is a recognized health hazard that poses an especially high risk for children. Four U.S. states recently enacted laws prohibiting smoking in cars with child passengers in attempts to reduce this risk. The authors sought to understand the legislative sponsors’ perceptions of factors associated with successful enactment of these laws. Items of particular interest included (1) motives for introducing the bill; (2) the development of a policy solution; and (3) the importance of the policy context. To understand these factors, telephone interviews were conducted with the primary sponsor or key staffers of the bills in all four states that have enacted these laws. An open-ended interview guide based on Kingdon’s theory of policy making provided an overall structure for interview content. Open and axial coding of the interviews yielded themes around the successful enactment of these laws. We also analyzed the language of the bills and the timing of introduction and enactment. The research indicates that the risk secondhand smoke exposure poses to children motivated all of the interviewees. Personal experiences were also important. Most interviewees noted little opposition to the bills. According to interviewees, a receptive political climate for smoke-free legislation contributed to the bills’ success. Nevertheless, sponsors carefully considered details of the laws and negotiated specific aspects, including applicable age, punishment, and enforcement. Kingdon’s framework proved an accurate model for understanding policy makers’ perceptions of factors associated with the bills’ success. Our results will be useful to advocates and policy makers in other U.S. jurisdictions or countries considering similar laws.

CITATION: Amy R. Confair, Jon S. Vernick, Shannon Frattaroli, and Gregory J. Tung, Factors Affecting Successful Enactment of Legislation Prohibiting Smoking in Cars with Children: Interviews with State Policy Makers, 53 Jurimetrics J. 375–387 (2013).
Hypothesis Testing of the Critical Underlying Premise of Discernible Uniqueness in Firearms-Toolmarks Forensic Practice
William A. Tobin and Peter J. Blau

The forensic practice of firearms-toolmarks (F/TM) identification rests on a currently indefensible conceptual foundation of discernible uniqueness of firearms and tools. However, as with the now-defunct forensic practice of comparative bullet-lead analysis, which was admitted in judicial proceedings for almost four decades, there has never been any scientifically acceptable hypothesis testing of the underlying premises required for practice validation. Existing studies in the domain literature, typically presented in court testimony as support for specific source attributions, are pervasively and fatally flawed, such that they have no external validity and cannot be extrapolated to a universal assumption. They are thus of no value for validating the critical premise of discernible uniqueness in real-world forensic scenarios and are largely irrelevant to any particular criminal judicial proceeding. A practical solution is offered that would allow a scientifically defensible opinion to be proffered to courts until comprehensive and meaningful hypothesis testing can be conducted by the mainstream scientific community.

CITATION: William A. Tobin and Peter J. Blau, Hypothesis Testing of the Critical Underlying Premise of Discernible Uniqueness in Firearms-Toolmarks Forensic Practice, 53 Jurimetrics J. 121–142 (2013).
Fishy Statistics: In re Consolidated Salmonid Cases and the Problem of Autocorrelation
Michael Vincent

Although regression analysis is a popular tool for proving causation in environmental litigation, its probative value depends upon the validity of assumptions concerning the observed data. When the data do not satisfy these assumptions, the statistical significance of the analysis may be overstated. One condition that violates a key assumption of standard linear regression models is autocorrelation: correlation among a regression model’s error terms across time.

Because autocorrelation may not be apparent, its presence is frequently overlooked, resulting in erroneous statistical conclusions. In In re Consolidated Salmonid Cases, 791 F. Supp. 2d 802 (E.D. Cal. 2011), California water agencies challenged a regression analysis supporting the federal government’s proposal for protecting a threatened trout species. Expert witnesses disagreed as to the presence of autocorrelation in the data, and if present, the extent to which it resulted in overstated statistical significance. This article reanalyzes the data from In re Consolidated Salmonid Cases to explain the fundamentals of detecting and remedying autocorrelation in simple linear regression. Despite the presence of autocorrelation in the data, the government’s conclusion remains statistically significant after applying remedial measures.
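As a rough illustration of the diagnostic issue (using simulated data, not the record in the Salmonid Cases), the sketch below fits an ordinary least squares regression whose errors follow an AR(1) process and checks for first-order autocorrelation with the Durbin-Watson statistic; values well below 2 indicate positively correlated errors, and one common remedy, shown here as an assumption rather than the article's own method, is to recompute standard errors with a HAC (Newey-West) estimator.

```python
# Illustrative sketch on simulated data (not the Salmonid Cases record):
# detecting first-order autocorrelation in a simple linear regression.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(0)
n = 120
x = np.linspace(0.0, 10.0, n)

# Build AR(1) errors so that consecutive residuals are positively correlated.
errors = np.zeros(n)
for t in range(1, n):
    errors[t] = 0.8 * errors[t - 1] + rng.normal(scale=1.0)

y = 2.0 + 0.5 * x + errors

model = sm.OLS(y, sm.add_constant(x)).fit()
dw = durbin_watson(model.resid)
print(f"Durbin-Watson statistic: {dw:.2f}")  # values near 2 suggest no autocorrelation

# One common remedy: HAC (Newey-West) standard errors, which widen the confidence
# intervals that ordinary OLS would otherwise understate when errors are autocorrelated.
robust = model.get_robustcov_results(cov_type="HAC", maxlags=4)
print(robust.summary().tables[1])
```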

CITATION: Michael Vincent, Fishy Statistics: In re Consolidated Salmonid Cases and the Problem of Autocorrelation, 53 Jurimetrics J. 143–162 (2013).
Technical Standards and Ex Ante Disclosure: Results and Analysis of an Empirical Study
Jorge L. Contreras

Ex ante disclosure of patent licensing terms has been proposed as one solution to the problem of holdup in standard setting. Critics of ex ante disclosure argue that requiring early disclosure of licensing terms will impede standards-development processes and create additional legal risks for participants. Yet, in a NIST-funded study that we conducted of three standards development organizations (SDOs), we found no evidence that ex ante disclosure policies resulted in measurable negative effects on the number of standards produced, staff time commitments, or quality of standards, nor was there compelling evidence that ex ante policies caused the lengthening of time required for standardization or the depression of royalty rates. There was also evidence to suggest that the adoption of ex ante policies may have contributed to positive effects observed on some of these variables. In addition, a significant majority of participants at one SDO felt that the information elicited by the organization’s ex ante policy was important and improved the overall openness and transparency of the standards-development process. We concluded, on the basis of the data reviewed, that the process-based criticisms of ex ante policies, and the predicted negative effects flowing from the adoption of such policies, were not supported by the evidence.

Nevertheless, ex ante policies have not achieved significant support among standards-development organizations. The causes for this lack of support may include self-interested advocacy by market participants pursuing a patent monetization strategy, as well as fear by other participants that complying with such policies could be costly, inconvenient, and likely to provoke otherwise passive patent holders to seek royalties on their standards-essential patents. Moreover, because of patent-stacking effects, the benefits of ex ante disclosures may not be clear in SDOs characterized by large quantities of standards-essential patents held by multiple participants. However, I argue that ex ante disclosure of licensing terms, while perhaps insufficient, standing alone, to address the effects of patent stacking in the standards context, is a necessary element of most proposed solutions to the stacking problem. Ex ante disclosure is also likely to lessen disputes regarding compliance with fair, reasonable, and nondiscriminatory (FRAND) licensing commitments. Accordingly, the standards-development community should reconsider the adoption of policies requiring ex ante disclosure of licensing terms for standards-essential patents.

CITATION: Jorge L. Contreras, Technical Standards and Ex Ante Disclosure: Results and Analysis of an Empirical Study, 53 Jurimetrics J. 163–211 (2013).
Autopsy Reports, the Confrontation Clause, and a Virtual Solution
Stephen Aiken

Since the United States Supreme Court’s 2004 decision in Crawford v. Washington, lower courts and commentators have wrestled with the application of the Sixth Amendment’s Confrontation Clause to scientific lab reports. In particular, courts have struggled with whether to permit the prosecution to use autopsy reports in homicide trials if the medical examiner who wrote the report is not available to testify. These reports are often crucial to homicide prosecutions, prompting the concern that their exclusion may function as a statute of limitations on murder.

Although there has been a series of Supreme Court cases addressing the Confrontation Clause and scientific evidence, these decisions provide little guidance on the issue. Nevertheless, conventional autopsy reports are likely to be considered testimonial statements that are subject to the Confrontation Clause and therefore cannot be introduced at trial without the testimony of their authors. However, the use of radiological imaging to perform virtual autopsies, called virtopsies, may provide a solution, as they rely on nontestimonial data. Law enforcement agencies can thus use this new technology to avoid Confrontation Clause issues and allow effective homicide prosecutions without requiring a particular medical examiner’s testimony.

CITATION: Stephen Aiken, Comment, Autopsy Reports, the Confrontation Clause, and a Virtual Solution, 53 Jurimetrics J. 213–238 (2013).
Identity in Law
Timothy S. Reiniger

In this analysis and critique of electronic signature forms and laws as they are developing around the world, Stephen Mason explains the authentication, evidentiary, and authority purposes of electronic signatures and their historical antecedents in signature law generally. This is not intended to be simply a “how to” guide for everyday practitioners. Instead, Mason’s mission is to remind policy makers and judges that electronic signatures are yet another technology that must be governed by existing legal principles and made to serve human needs. Technology and systems must not be allowed to supersede or change an “ancient protocol” of law.

Without attempting to be prescriptive, Mason presents electronic signatures as being essential to establish identity relationships and determine legal responsibility in the information economy. Electronic signatures are not only the primary method of authentication in law, they also represent or symbolize identity relationships in a given context. To study this book is to study the law of identity.

CITATION: Timothy S. Reiniger, Identity in Law, 53 Jurimetrics J. 239–247 (2013) (reviewing Stephen Mason, Electronic Signatures in Law (Cambridge University Press 3rd ed. 2012)).
Likelihoodism, Bayesianism, and a Pair of Shoes
David H. Kaye

In R. v. T, [2010] EWCA Crim. 2439, the Court of Appeal of England and Wales quashed a conviction based in part on testimony of “a moderate degree of scientific evidence to support the view that [a pair of defendant’s shoes] had made the footwear marks” found at the scene of the murder. The court’s opinion discusses how pattern-matching evidence that does not lend itself to a precise quantitative assessment should be presented to juries. In doing so, the opinion conflates the role of likelihoods in Bayesian inference and their role in the likelihood-based approach to inference. One need not be a Bayesian to advocate the presentation of likelihoods, but both likelihoodism and Bayesianism lay bare an important distinction—testimony about hypotheses concerning the origin of a trace differs from testimony about the extent to which the trace evidence supports these source hypotheses. Forensic science experts should testify with this distinction in mind so that they confine their testimony to assessments of how probative the evidence is and do not engage in source attribution.
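The distinction the column draws can be stated compactly in standard notation (a textbook formulation offered for orientation, not language from the opinion or the column):

```latex
% The likelihood ratio measures how much more probable the trace evidence E is
% under the source hypothesis H_s than under the alternative H_d. On the column's
% view, this is the quantity to which the forensic expert's testimony should be confined.
\[
  LR = \frac{\Pr(E \mid H_s)}{\Pr(E \mid H_d)}
\]
% Bayes' rule shows that the probability of the source hypothesis itself also
% depends on the prior odds, which are not the expert's to supply:
\[
  \underbrace{\frac{\Pr(H_s \mid E)}{\Pr(H_d \mid E)}}_{\text{posterior odds}}
  = LR \times
  \underbrace{\frac{\Pr(H_s)}{\Pr(H_d)}}_{\text{prior odds}}
\]
```

One need not adopt the Bayesian reading to use the first equation; the likelihoodist point is simply that the likelihood ratio measures evidential support, while the second equation is where source attribution occurs.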

CITATION: David H. Kaye, Column, Likelihoodism, Bayesianism, and a Pair of Shoes, 53 Jurimetrics J. 1–9 (2012).
Regulation by Means of Standardization: Key Legitimacy Issues of Health and Safety Nanotechnology Standards
Evisa Kica and Diana M. Bowman

Along with informal standardization organizations, trade associations, and private entities, classical standard-setting organizations such as the International Organization for Standardization (ISO) and the Organisation for Economic Co-operation and Development (OECD) have been actively involved in the development of standards and guidance materials in the field of nanotechnologies. These developments have led to new relationships and partnerships, shifting attention from traditional state regulation to polycentric regulatory structures in which the government is not the sole source of decision-making authority. Under these modes of regulatory governance, the formulation or implementation of decisions is in the hands of various actors who are not directly elected by the citizens to make decisions that may become (collectively or de facto) binding and that may affect the interests of the public and industry. These developments raise dilemmas about whether such nonstate arrangements can be considered legitimate and about the principles by which their legitimacy can be measured. Nevertheless, while the importance of standards has been widely recognized in nanotechnology, little attention has been paid to the issue of legitimacy, and in particular to the actors involved and the processes by which these standards are negotiated and developed at the international level. This article analyzes the potential and the legitimacy implications of standardization activities related to health and safety nanotechnology standards. In particular, it explores how the regulatory activities of nanotechnology standardization can be analyzed and evaluated from the viewpoint of legitimacy. What are the problems with the current approaches to legitimacy, and what sources of legitimacy can be used to analyze nanotechnology standardization? And what lessons can we distill by looking at the organizational processes, mechanisms, and strategies that transnational regulators follow to establish their legitimacy and authority in nanotechnology?

CITATION: Evisa Kica and Diana M. Bowman, Regulation by Means of Standardization: Key Legitimacy Issues of Health and Safety Nanotechnology Standards, 53 Jurimetrics J. 11–56 (2012).
Sustainability and Intellectual Property Rights in Traditional Knowledge
Dennis S. Karjala

This article makes a simple point: If sustainability (however defined) is the goal, intellectual property rights (IPRs) in traditional knowledge (TK) do not move us toward the achievement of that goal. The reason is that the only social policy justification for recognizing IPRs at all is that they supposedly serve as an incentive to create socially desirable works of authorship and inventions. They are not and should not serve as a reward for past achievements. In other words, outside of their usual incentive function of promoting new technology, IPRs in TK have no role to play in the sustainability analysis. This is not to say that TK is irrelevant to sustainability; indeed, there is good reason to believe that much can be learned from study and implementation of traditional practices in a wide range of fields. Nor is it to say that IPRs in general play no role in advancing the goal of sustainability. The incentives supplied by IPRs to authors and inventors may help induce new technologies and methods for preserving what is left of the natural state of the planet and its ecosystems. The point is only that IPRs in TK can do no good (in promoting sustainability) and may do much harm, by tying up knowledge in exclusive rights that inhibit its application to sustainability (or anything else) without any compensating social gains.

CITATION: Dennis S. Karjala, Sustainability and Intellectual Property Rights in Traditional Knowledge, 53 Jurimetrics J. 57–70 (2012).
The Hot News Misappropriation Doctrine: Confusion in the Internet Age and the Call for Legislative Action
Lindsay G. Rabicoff

In an increasingly connected world, copying information and disseminating it has never been easier. Unfortunately, this rise in technology and connectivity has also led to increased taking, without permission or attribution, from compilers of factual content and creators of uncopyrightable works. Currently, where the factual content is not copyright protected, the available remedy for the victims of such free riding is the state-law hot news misappropriation doctrine. Regardless of the availability of the remedy, courts in recent cases have struggled to adapt this doctrine to the situations created by the Internet age. Although some scholars argue that federal copyright law preempts the hot news misappropriation doctrine, even if the doctrine survives preemption it remains an inappropriate remedy in modern society.

Simply put, the hot news misappropriation doctrine is failing to work in today’s technological world. The confusion surrounding the hot news misappropriation doctrine indicates that the courts’ current solution addressing free riding concerns is ineffective. The doctrine fails because it is vague and can be interpreted to both over- and underprotect information, depending on the context. Legislative action is needed to protect uncopyrightable information that (1) is an essential feature of a plaintiff’s business, (2) provides a benefit to society, and (3) is unlikely to be generated without an incentive of protection for an appropriate time to allow the creator of the work to recover his costs.

CITATION: Lindsay G. Rabicoff, Comment, The Hot News Misappropriation Doctrine: Confusion in the Internet Age and the Call for Legislative Action, 53 Jurimetrics J. 71–95 (2012).
Juvenile Justice and Mental Health: Innovation in the Laboratory of Human Behavior
Sean C. McGarvey

Developed as a structure that seeks to rehabilitate rather than punish, the American juvenile justice system has yet to adequately address widespread mental health concerns among youth offenders. Scarred by childhood trauma and other societal challenges, youth within the juvenile justice system frequently struggle with mental health issues, which all too often go unnoticed and untreated. Meanwhile, modern developments in neuroscience have opened new windows into the juvenile mind and illuminated a path for reform within the juvenile justice system. Scientists have uncovered neural correlates underlying many diagnosable mental health disorders. Furthermore, neurological studies have demonstrated the effectiveness of treatment-based alternatives to juvenile detention in addressing the roots of mental growth and development. Such discoveries demand greater resource investment in the juvenile justice system to efficiently and effectively rehabilitate youth suffering from mental health issues.

CITATION: Sean C. McGarvey, Comment, Juvenile Justice and Mental Health: Innovation in the Laboratory of Human Behavior, 53 Jurimetrics J. 97–120 (2012).
Website Design and Liability
Nancy S. Kim

Two regrettable behaviors have emerged online: the posting of content about others without their consent, and impulsive postings made with no consideration of long-term consequences. Website operators can either encourage or discourage these regrettable behaviors and influence their consequences through the design of their website and by the fostering of norms and codes of conduct. Unfortunately, courts interpret section 230 of the Communications Decency Act as providing websites with broad immunity. In an earlier article, I argued that a proprietorship standard should be imposed upon websites, which would require them to take reasonable measures to prevent foreseeable harm. This article further champions the concept of website proprietorship liability and proposes that section 230 should be amended to recognize such liability with provisions for the following “safe harbors” for website operators that: (1) permit only postings by identified posters; (2) have nonprofit status and do not accept ad revenue; and (3) remove postings upon request of the victim. This article also addresses anticipated objections based on market and free speech concerns.

CITATION: Nancy S. Kim, Website Design and Liability, 52 Jurimetrics J. 383–431 (2012).
Jurors and Scientific Causation: What Don’t They Know, and What Can Be Done About It?
N.J. Schweitzer and Michael J. Saks

Past attempts to better equip jurors to comprehend scientific expert testimony have had little success. This paper describes a new approach to training jurors to be better consumers of such evidence. In addition to assessing how well jurors can differentiate between valid and flawed experimental research when that research is presented through expert testimony in a civil trial, we describe an experiment that tests the efficacy of a new educational intervention designed to enable jurors to more capably evaluate causal scientific research. The intervention, which is brief, not case specific, and transportable into trials of other causal issues, teaches jurors to understand and identify the three requisites of causal inference: temporal precedence, covariation, and nonspuriousness. Using 182 adult citizens as mock jurors, we presented a fairly realistic videotaped toxic tort trial in which the defendant’s product was alleged to cause a lung disease. The critical evidence in the trial was a study, presented by an expert, that tested the causal relationship between the defendant’s product and lung disease through either a properly designed experiment or one in which one or another of the key elements of causal inference was absent. Consistent with previous researchers’ findings, untrained jurors were unable to distinguish the well-designed experiment from any of the defectively designed experiments. However, trained jurors were better able to assess the quality of the research, and these more accurate assessments were reflected in their verdicts.

CITATION: N.J. Schweitzer and Michael J. Saks, Jurors and Scientific Causation: What Don’t They Know, and What Can Be Done About It?, 52 Jurimetrics J. 433–455 (2012).
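
The abstract above names the three requisites of causal inference taught by the intervention: temporal precedence, covariation, and nonspuriousness. The short Python sketch below is an editorial illustration of the latter two, using entirely hypothetical numbers (it is not the study’s data or analysis): a lurking confounder produces covariation between an “exposure” and a “disease” even though the exposure has no effect, and the association vanishes once the confounder is held fixed.

```python
"""Toy illustration of covariation and nonspuriousness, two of the three
requisites of causal inference named in the abstract above.  All numbers are
hypothetical; this is not the mock-trial study's data or analysis."""
import random

random.seed(1)

def simulate(n=100_000):
    rows = []
    for _ in range(n):
        c = random.random() < 0.5                   # hypothetical confounder
        e = random.random() < (0.7 if c else 0.2)   # confounder drives "exposure"
        d = random.random() < (0.3 if c else 0.05)  # confounder drives "disease"; exposure has NO effect
        rows.append((c, e, d))
    return rows

def risk_ratio(rows):
    exposed = [d for (_, e, d) in rows if e]
    unexposed = [d for (_, e, d) in rows if not e]
    return (sum(exposed) / len(exposed)) / (sum(unexposed) / len(unexposed))

rows = simulate()
print(f"crude risk ratio (covariation present): {risk_ratio(rows):.2f}")
for level in (True, False):
    stratum = [r for r in rows if r[0] == level]
    print(f"risk ratio within confounder={level}: {risk_ratio(stratum):.2f}")
# The crude association disappears within strata: the covariation was spurious.
```
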
Big Issues for Small Stuff: Nanotechnology Regulation and Risk Management
Gary E. Marchant, Blake Atkinson, David Banko, Joshua Bromley, Edith Cseke, Evan Feldstein, Devin Garcia, Justin M. Grant, Connor Hubach, Monika Silva, Robert L. Swinford, and Simon Willman

Emerging nanotechnologies hold much promise for addressing many of the nation’s most pressing priorities, but that promise is increasingly being threatened by growing concerns about potential health and environmental issues. The successful commercialization of nanotechnology, including the realization of the important health and environmental benefits of nanotechnology, depends on developing effective systems of oversight and governance. Yet, several aspects of nanotechnology limit the effectiveness of traditional regulation, including the lack of a precise definition, the rapid pace of change in the technology, the heterogeneity and diversity of nanotechnology materials and applications, and the tension between the environmental and health benefits and risks of nanotechnology. A variety of risk management approaches have been proposed or are being attempted to fill this regulatory void, including occupational safety and hygiene programs, voluntary and partnership programs, insurance, the precautionary principle, and liability.

CITATION: Gary E. Marchant, Blake Atkinson, David Banko, Joshua Bromley, Edith Cseke, Evan Feldstein, Devin Garcia, Justin M. Grant, Connor Hubach, Monika Silva, Robert L. Swinford, and Simon Willman, Big Issues for Small Stuff: Nanotechnology Regulation and Risk Management, 52 Jurimetrics J. 243–277 (2012).
Soft Law Oversight Mechanisms for Nanotechnology
Kenneth W. Abbott, Gary E. Marchant, and Elizabeth A. Corley

Few observers doubt that regulatory oversight is now or will ultimately be necessary for at least some nanotechnology products and processes. Yet oversight need not take the form of mandatory command-and-control regulation: regulation increasingly follows a more flexible governance paradigm that incorporates private and public-private arrangements. This article reviews eleven public and private soft law oversight mechanisms for nanotechnology operating within the United States, the European Union and transnationally. It then provides a framework for assessing such mechanisms, considering the characteristic strengths and weaknesses of public and private approaches as well as the impact of specific design choices. We find that the existing mechanisms have a mixed record: many incorporate shallow commitments, few actively promote implementation, and none engage in significant monitoring or provide strong incentives for compliance. Accordingly, we call on public authorities to strengthen current oversight schemes and promote the development of more effective mechanisms.

CITATION: Kenneth W. Abbott, Gary E. Marchant, and Elizabeth A. Corley, Soft Law Oversight Mechanisms for Nanotechnology, 52 Jurimetrics J. 279–312 (2012).
Nanotechnology Liability: Do We Steer or Just Go Along for the Ride?
Edward R. Glady, Jr., Gregorio M. Garcia, and Blair H. Moses

Each participant in the nanotechnology scene has different interests and motivations. Those in business want to get a product to market as fast as possible to start earning a return on their investments. The consumer wants the benefits of the new technology. The regulator wants to ensure the products on the market are safe for the consuming public, as does the consumer advocate. The plaintiff attorney wants to ensure that clients who claim to have been injured by nanoproducts are fairly compensated in a timely manner. The defense attorney, the insurer, and the client want to ensure that any liability imposed is both reasonable and scientifically supported. The court system wants nanotechnology claims processed efficiently and in accordance with due process. The media wants headlines that sell. And the internet makes possible communication that may far exceed verification. Given this mix of players, how will claims of alleged nanotechnology harm be examined, treated, and resolved? Examining past examples of legal redress for injury claims arising from “new” technology allows us to draw lessons and to suggest a new model for nanotechnology liability claims before they arise.

CITATION: Edward R. Glady, Jr., Gregorio M. Garcia, and Blair H. Moses, Nanotechnology Liability: Do We Steer or Just Go Along for the Ride?, 52 Jurimetrics J. 313–335 (2012).
Matching Solutions to Problems: Strategies for Nanotechnology Oversight
Daniel J. Fiorino

An effective strategy for exercising responsible oversight of a new and rapidly evolving challenge such as nanotechnology will require a combination of regulatory and more voluntary approaches. Much of the current system for environmental regulation is designed for large pollution sources and high-volume production of chemicals, and it is not well suited on its own to the characteristics of nanotechnologies and their effects. Two recent voluntary programs—the Nano Risk Framework and the Nanomaterials Stewardship Program—illustrate both the strengths and limitations of voluntary programs. Hard law (regulatory programs) and soft law (relatively voluntary programs) both offer advantages as policy strategies for nanotechnology oversight. The challenge for policy makers is to understand how best to combine them into an effective, long-term oversight strategy.

CITATION: Daniel J. Fiorino, Matching Solutions to Problems: Strategies for Nanotechnology Oversight, 52 Jurimetrics J. 337–345 (2012).
Soft Law and Nanotechnology: A Functional Perspective
Timothy F. Malloy

Whether and how to regulate nanotechnology is debated widely. While ad hoc bits of regulation shuffle forward, a comprehensive response eludes us. Some advocate using new governance approaches, seeking to transform regulation from an agency-centric exercise to a collaborative undertaking by actors from multiple segments of society. One central aspect of this new governance is reliance upon soft law approaches to regulation. In the area of nanotechnology particularly, numerous commentators have proposed a variety of soft law mechanisms. Yet, the concept of soft law is fuzzy in terms of its definition, specific functions, and optimal uses. This article addresses that fuzziness in two ways. First, it provides a definition of soft law informed by the four functions soft law serves: the precursive, normative, directive and complementary functions. Second, it comments upon the usefulness of soft law with respect to each of those functions in the specific context of nanotechnology.

CITATION: Timothy F. Malloy, Soft Law and Nanotechnology: A Functional Perspective, 52 Jurimetrics J. 347–358 (2012).
Scientific Challenges of Nanomaterial Risk Assessment
Kiril D. Hristovski

Development of an appropriate regulatory apparatus to successfully manage potential risks of nanomaterials necessitates adequate nanoparticle risk assessment. However, unique scientific challenges prevent the use of traditional risk assessment tools to describe and predict risks originating from the novel properties of nanomaterials. With the rapid introduction of nanomaterials in society, it is of paramount importance to address these potential risks now. This study outlines and discusses major scientific challenges hindering the progress of nanomaterial risk assessment.

CITATION: Kiril D. Hristovski, Scientific Challenges of Nanomaterial Risk Assessment, 52 Jurimetrics J. 359–370 (2012).
Public Challenges of Nanotechnology Regulation
Elizabeth A. Corley, Youngjae Kim, and Dietram A. Scheufele

Regulatory decisions are often approached with the assumption that decision making would be easier with full public knowledge of the topic and complete scientific certainty about risks and benefits. Unfortunately, for emerging technologies with potentially far-reaching and long-term societal implications, the assumption that regulatory decisions can be made with all relevant facts on the table is unrealistic. More importantly, however, many of the ethical, legal and social questions surrounding these technologies in public debate are inherently political questions, and—as a result—the technical or scientific facts behind these new technologies are only a small part of how societies come to agreement about the various regulatory options surrounding these emerging technologies. Given the growing presence of nanomaterials in consumer end markets worldwide and the uncertainties about the risks and benefits of nanomaterials, the development of nanotechnology regulations must move forward in the absence of full public knowledge and scientific certainty.

In this article, five core public challenges are identified that face regulators and policymakers as they move forward with nanoregulation in the United States. Within the context of introducing these challenges, data are presented that illuminate why these issues could be challenges for the development of nanotechnology regulations. The paper concludes with priority areas for nanoregulation based on the perceptions of leading U.S. nanoscientists. The data presented in this article were collected through the Center for Nanotechnology in Society at Arizona State University (CNS-ASU), which is funded by the National Science Foundation (NSF).

CITATION: Elizabeth A. Corley, Youngjae Kim, and Dietram A. Scheufele, Public Challenges of Nanotechnology Regulation, 52 Jurimetrics J. 371–381 (2012).
Meta-Analysis of “Sparse” Data: Perspectives from the Avandia Cases
Michael O. Finkelstein and Bruce Levin

Meta-analysis, a technique that combines the results of multiple small trials to increase accuracy and statistical power, has become well established and increasingly important in medical studies, particularly in connection with new drugs. When the data are sparse, as they are in many such cases, certain accepted practices, applied reflexively by researchers, may be misleading, both because they are biased and for other reasons. We illustrate some of the problems by examining a meta-analysis of the connection between the diabetes drug Avandia (rosiglitazone) and myocardial infarction that was strongly criticized as misleading, but that nonetheless led to thousands of lawsuits against the manufacturer and prompted the FDA to restrict access to the drug. Our scrutiny of the Avandia meta-analysis is particularly appropriate because it plays an important role in ongoing litigation, has been sharply criticized, and has been subject to a more searching review in court than meta-analyses of other drugs.

CITATION: Michael O. Finkelstein and Bruce Levin, Meta-Analysis of “Sparse” Data: Perspectives from the Avandia Cases, 52 Jurimetrics J. 123–153 (2012).
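
As a purely illustrative companion to the abstract above, the sketch below shows one way sparse data can make a reflexively applied practice misleading: when trials contain zero-event arms, the arbitrary continuity correction added before pooling log odds ratios can itself move the pooled estimate. The trial counts, the choice of corrections, and the inverse-variance pooling are editorial assumptions, not the Avandia data or the authors’ methodology.

```python
"""Minimal sketch of a fixed-effect meta-analysis of sparse 2x2 trial data,
showing how the arbitrary continuity correction added to zero cells can move
the pooled odds ratio.  The trial counts below are hypothetical, not the
Avandia data, and this is not the analysis used in the article above."""
import math

# (events_treatment, n_treatment, events_control, n_control) -- hypothetical small trials
trials = [(2, 200, 0, 200), (1, 150, 1, 150), (0, 100, 1, 100), (3, 250, 1, 250)]

def pooled_or(trials, cc):
    """Inverse-variance pooling of log odds ratios; cc is the continuity
    correction added to every cell of a table containing a zero."""
    num = den = 0.0
    for a, n1, c, n2 in trials:
        b, d = n1 - a, n2 - c
        if 0 in (a, b, c, d):
            a, b, c, d = a + cc, b + cc, c + cc, d + cc
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of the log odds ratio
        num += log_or / var
        den += 1 / var
    return math.exp(num / den)

for cc in (0.5, 0.25, 0.1):
    print(f"continuity correction {cc}: pooled OR = {pooled_or(trials, cc):.2f}")
# With so few events, the pooled estimate shifts noticeably with the correction alone.
```
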
Statistical Considerations Support the Supreme Court’s Decision in Matrixx Initiatives v. Siracusano
Joseph L. Gastwirth

In Matrixx Initiatives, Inc. v. Siracusano, the Supreme Court ruled that information about the number of adverse events occurring to users of a drug need not be sufficiently numerous to reach statistical significance before this knowledge should be disclosed to potential investors. Several commentators interpreted the decision as questioning the value of the concept of statistical significance or limiting its role when courts examine whether the failure of a firm to disclose information was material to investment decisions. None of the briefs, legal decisions or comments actually conducted a statistical analysis of the reports of anosmia (loss of smell) in users of the drug Zicam even though most of the early cases came from one large medical clinic. Apparently, because the adverse event reports were not obtained through a planned statistical survey or study, it was thought that they could not be subject to statistical analysis. However, statistical methods originally developed for the analysis of randomized clinical trials or a random sample from a population can be used to analyze “observational” data provided one also conducts a sensitivity analysis. This additional analysis assesses the potential impact of plausible deviations of the data set being examined from the assumptions underlying the statistical calculations. Applying the methodology to the available data in Matrixx indicates that by fall 2003 the firm should have realized that there was a substantial probability of an association between the use of Zicam and the subsequent development of anosmia; and by 2009, when the FDA issued a warning to the company, the statistical support for an association between the use of the product and anosmia was very strong.

CITATION: Joseph L. Gastwirth, Statistical Considerations Support the Supreme Court’s Decision in Matrixx Initiatives v. Siracusano, 52 Jurimetrics J. 155–175 (2012).
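
In the spirit of the observed-versus-expected reasoning and sensitivity analysis described in the abstract above, the following sketch compares a count of adverse-event reports with the count expected under a range of assumed background rates, using a simple Poisson tail probability. Every number here (user count, report count, background rates) is hypothetical, and the calculation is an editorial illustration rather than the article’s actual statistical method.

```python
"""Hedged sketch of an observed-vs-expected calculation with a crude
sensitivity analysis.  The user count, report count, and background rates are
hypothetical, not the Zicam figures, and this is not the article's methodology."""
from math import exp, factorial

def poisson_upper_tail(k, lam, terms=200):
    """P(X >= k) for X ~ Poisson(lam), summing the series term by term."""
    term = exp(-lam) * lam**k / factorial(k)   # P(X = k)
    total = 0.0
    for i in range(k, k + terms):
        total += term
        term *= lam / (i + 1)                  # P(X = i + 1) from P(X = i)
    return total

n_users = 50_000          # hypothetical number of product users
observed = 12             # hypothetical number of adverse-event reports

# Base-case background rate, plus larger rates to probe sensitivity to that assumption.
for background_rate in (1e-5, 5e-5, 1e-4):
    expected = n_users * background_rate
    p_tail = poisson_upper_tail(observed, expected)
    print(f"rate {background_rate:.0e}: expected {expected:.1f} reports, "
          f"P(at least {observed} by chance) = {p_tail:.2e}")
# If the tail probability stays small even under generous background rates,
# the excess of reports is hard to attribute to chance alone.
```
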
The Expanding Role and Importance of Standards in the Information and Communications Technology Industry
Brad Biddle, Frank X. Curci, Timothy Haslach, Gary E. Marchant, Andrew Askland, and Lyn Gaudet

Standards play a particularly critical role in the information and communications technology (ICT) industry: they facilitate important interoperability goals. Standards development processes in the ICT industry are extraordinarily complex, and many aspects of these processes are not well understood. Inspired by discussions at a workshop that included leading practitioners, academics, and policymakers specializing in standards, the authors identify factors that explain both the growing importance and the growing complexity of standards in the ICT industry. The authors provide a framework for understanding how standards development efforts are structured, with a particular focus on the more informal specification development groups known as consortia. The authors also explore two particular challenges in standard setting: the development of intellectual property policies that adequately balance different stakeholder interests, and the potential for ethical conflicts.

CITATION: Brad Biddle, Frank X. Curci, Timothy Haslach, Gary E. Marchant, Andrew Askland, and Lyn Gaudet, The Expanding Role and Importance of Standards in the Information and Communications Technology Industry, 52 Jurimetrics J. 177–208 (2012).
Advancing Evidence-Based Medicine by Expanding Coverage with Evidence Development
Rachel A. Lindor

The Patient Protection and Affordable Care Act (ACA), as amended by the Health Care and Education Reconciliation Act of 2010, laid the foundation for the future of health care in the United States and expanded health-care access to millions of Americans. The cost of this expansion is expected to be significantly offset by the savings generated by the bill’s lesser-known provisions, which are designed to improve the quality of Americans’ health care. History has shown, however, that these provisions, including multibillion-dollar investments in clinical research and electronic record keeping, will not successfully improve health care unless they are accompanied by strong financial incentives for providers to deliver more effective care. Medicare’s “coverage with evidence development” (CED), a policy that conditions Medicare payments for new products on the collection of clinical data about their effectiveness, provides such an incentive. Revising the guidance document currently governing the use of CED would allow the policy to be used more effectively than it has been in the past to improve both the effectiveness and value of health care.

CITATION: Rachel A. Lindor, Comment, Advancing Evidence-Based Medicine by Expanding Coverage with Evidence Development, 52 Jurimetrics J. 209–237 (2012).
The Quandary of Agricultural Biotechnology, Pure Economic Loss, and Non-Adopters: Comparing Australia, Canada, and the United States
Karinne Ludlow and Stuart J. Smyth

Innovations impact societies in a variety of ways. Successful innovations are utility enhancing, in that they create benefits that outweigh the innovation’s potential disadvantages. Unsuccessful innovations suffer from the reverse, in that they result in more disadvantages than benefits and are therefore ultimately rejected by society. The innovation of agricultural biotechnology and genetically modified (GM) crops has triggered substantial discussion regarding the advantages and disadvantages of the technology. Numerous financial and economic benefits are starting to be recognized by adopters, but some non-adopters are growing increasingly concerned about their ability to profit given the high levels of GM crop adoption. While some might argue that non-adopters of GM crops are the conventional economic losers of this innovation, the reality is that demand for non-GM products is higher, in large part, because of consumer desires to avoid GM food products. The concept of pure economic loss in relation to innovation posits that those negatively impacted by the innovation of GM crops are entitled to compensation that offsets the externality. In undertaking a thorough assessment of pure economic loss and GM crops, this article evaluates the logic for, and efficiencies of, having compensation funded via the use of courts versus government regulations. This article considers whether non-adopter rights are developing in the case of GM crops and what governance response mechanism is best suited to those claims. It concludes that the decision over whether to support or reject an innovation is too important to the larger society as a whole to be decided by the courts.

CITATION: Karinne Ludlow and Stuart J. Smyth, The Quandary of Agricultural Biotechnology, Pure Economic Loss, and Non-Adopters: Comparing Australia, Canada, and the United States, 52 Jurimetrics J. 7–41 (2011).
“Facebook Is Now Friends with the Court”: Current Federal Rules and Social Media Evidence
Megan Uncel

Today, hundreds of millions of people around the world are active users of social networking websites such as Facebook and MySpace. Users plaster their profiles with the play-by-play of their daily lives, often complemented by dozens of pictures to document the experience. Not so very long ago, this type of information was reserved for the few friends who actually saw a person daily. But today, almost anyone can have access to a user’s “private” thoughts and photographs, including lawyers looking for potentially useful evidence to help support a claim or defense. Information gathered on social networking sites provides valuable evidence in both criminal and civil cases about a party’s daily life and state of mind. Unique problems arise, however, in the form of discoverability, admissibility, and, most notably, authentication and reliability issues. Some courts may be reluctant to admit social media content as evidence because of unfamiliarity with its novel utility. Without proper guidance, courts produce a disjointed patchwork of inconsistent case law with no unified approach for this distinctive category of evidence. This comment seeks to guide courts toward a more accommodating and consistent acceptance of social media evidence. It analyzes the application of the current Federal Rules of Civil Procedure and Federal Rules of Evidence to social networking evidence and advocates for the recognition of an informal presumption of reliability that, in an appropriate case, an opponent easily can rebut with convincing proof that the site or content is inauthentic in any way.

CITATION: Megan Uncel, Comment, “Facebook Is Now Friends with the Court”: Current Federal Rules and Social Media Evidence, 52 Jurimetrics J. 43–69 (2011).
In re Nuijten: A Failure of the Patent System to Incentivize Innovation
Brian Greathouse

In In re Nuijten, the Federal Circuit ruled that a transitory electromagnetic signal is not patentable subject matter because it is intangible and does not fit neatly into one of the four statutory categories enumerated in 35 U.S.C. § 101. Although the invention in Nuijten related to an improved way of watermarking digital media, the court’s decision is much more far reaching. The court restricted the manufacture category of § 101 to exclude intangible signals. Although intangible, signal technologies such as satellite communications, encrypted digital signals, and brainwave advancements are viable as products in our future economy. Nuijten creates uncertainty and unnecessarily restricts investment in these so-called intangible technologies. This note analyzes the Nuijten decision and argues that electromagnetic signals should be patentable if policy supports incentivizing advanced signaling technologies.

CITATION: Brian Greathouse, Note, In re Nuijten: A Failure of the Patent System to Incentivize Innovation, 52 Jurimetrics J. 71–85 (2011).
What’s In a Word? Defining Registration Under the Copyright Act
Kenneth Moskow

Under § 411(a) of the Copyright Act, an owner of a copyright in a U.S. work may not bring a civil action for infringement unless the allegedly infringed work is registered with the Copyright Office. Authority among federal circuit courts is split on whether the registration requirement is satisfied upon the Copyright Office’s final disposition of an application for registration (the registration approach), or upon the Copyright Office’s receipt of the application (the application approach). Two cases, one from the Tenth Circuit and the other from the Ninth Circuit, best illustrate the interpretive split among the circuits. In La Resolana Architects, PA v. Clay Realtors Angel Fire, a 2005 opinion, the Tenth Circuit Court of Appeals held that under Copyright Act § 411(a), registration requires final disposition by the Copyright Office. Conversely, in a 2010 case, Cosmetic Ideas, Inc. v. IAC/Interactivecorp, the Ninth Circuit Court of Appeals held that merely filing an application with the Copyright Office satisfied the registration requirement. Recently, the United States Supreme Court denied IAC/Interactivecorp’s petition for a writ of certiorari, opting not to settle the split in authority on what constitutes registration for purposes of bringing an infringement suit. The Copyright Act is a uniform statutory scheme that controls copyright law in the United States. Therefore, the Act should apply equally to all copyright owners in the United States, regardless of the geographic location in which they bring their infringement actions. This case note argues that: (1) the Ninth Circuit’s recent holding in Cosmetic Ideas, Inc. v. IAC/Interactivecorp better served the policies underlying the registration requirement of § 411(a) of the Copyright Act and, accordingly, the registration requirement should be satisfied upon the Copyright Office’s receipt of an application for copyright registration, and not upon disposition of the application; and (2) in the wake of the Supreme Court’s denial of certiorari in the Cosmetic Ideas case, Congress should intervene and statutorily define registration under the Copyright Act to comport with the application approach.

CITATION: Kenneth Moskow, Note, What’s In a Word? Defining Registration Under the Copyright Act, 52 Jurimetrics J. 87–106 (2011).
A Quest for a Theory of Privacy: Context and Control—A Review of Helen Nissenbaum's Privacy in Context: Technology, Policy, and the Integrity of Social Life
Michael D. Birnhack

This review analyzes Helen Nissenbaum’s recent and highly important book, Privacy in Context: Technology, Policy and the Integrity of Social Life. Nissenbaum proposes a detailed framework to better understand privacy issues and assist in prescribing privacy policies that meet the needs of the 21st century. Her proposed framework is Contextual Integrity (CI), which seeks to identify the impact of a new socio-technological system on existing, entrenched norms (social and legal) relating to transmission of personal information within a specific context. The focus is on the flow of information. According to CI, once we observe a change in informational transmission norms there is a presumption that privacy has been violated, but the presumption can be rebutted. Thus, we should evaluate the change against general moral principles and against the values and goals of the particular context.

The review locates the book within the current heated debate about privacy and identifies a growing yearning for an up-to-date privacy theory that can answer current and forthcoming challenges. After summarizing the main arguments of the book, the review makes three critical arguments. First, it points to the limits of relying on contexts as an organizing unit of a privacy analysis. Many contexts are dynamic, unsettled, and lack any clear set of reliable informational norms. This is especially so in the digital environment. Second, the review argues that CI allocates an insufficient place to a metajustificatory principle (or principles) of privacy. Privacy theory, I argue, cannot avoid identifying its fundamental metaprinciple(s), difficult as such identification may be. Thus, as a theoretical inquiry, CI can be extremely helpful, but only as part of a broader normative analysis. It can supplement a justificatory theory of privacy, but not supplant it. Third, the review argues that we already have a metaprinciple that can best explain and justify privacy as a social and philosophical concept as well as a legal right: Privacy as Control (PaC). According to PaC, a right to privacy is the control an autonomous human being should have over her personal information, regarding its collection, processing, and further uses, including onward transfers. Nissenbaum’s discussion of this theory accords it a secondary place at best. However, the review argues that Privacy as Control remains the strongest conception of privacy. It surely has its problems and it faces pressing challenges that CI can assist in clearing, but it should not be reduced to a mere transmission norm.

CITATION: Michael D. Birnhack, A Quest for a Theory of Privacy: Context and Control, 51 Jurimetrics J. 447–479 (2011) (reviewing Helen Nissenbaum, Privacy in Context: Technology, Policy and the Integrity of Social Life (2009)).
Computer-Managed Perpetual Trusts
Michael Vincent

Many states have now abrogated the rule against perpetuities, opening their doors (and coffers) to perpetual trusts. When combined with advances in artificial intelligence (AI) technology, this creates a perfect storm for dead-hand control. This confluence is enabling the emergence of long or unlimited duration computer-managed trusts that operate autonomously as vehicles of vicarious immortality for the settlors who fund them. This article introduces the robo-trust, a trust using AI to flexibly respond to changing world conditions, and speculates about its likely legal and computational structure.

The robo-trust will be born from the age-old quest to live forever, the tools of which are often postmortem property controls and cutting-edge technology. Its unique adaptability will permit it to solve many of the problems that plague traditional perpetual trusts. It does, however, raise public policy concerns, namely indefiniteness. An analysis of these concerns is presented in the context of two approaches to the robo-trust’s uniquely bifurcated structure: whether the computer program is a component of the trust instrument or is a tool for the trustee’s administration. The latter view is essential for the robo-trust’s validity, and is the more likely conception of its structure. The best defense against the proliferation of these computer-managed trusts, and perpetual trusts in general, is a revival of the rule against perpetuities.

CITATION: Michael Vincent, Comment, Computer-Managed Perpetual Trusts, 51 Jurimetrics J. 399–446 (2011).
The Criminal Psychopath: History, Neuroscience, Treatment, and Economics
Kent A. Kiehl and Morris B. Hoffman

This article surveys the history of psychopathic personality, from its origins in psychiatric folklore to its modern assessment in the forensic arena. Individuals with psychopathic personality, or psychopaths, have a disproportionate impact on the criminal justice system. Psychopaths are twenty to twenty-five times more likely than non-psychopaths to be in prison, four to eight times more likely to violently recidivate, and resistant to most forms of treatment. This article presents the most current clinical efforts and neuroscience research in the field of psychopathy. Given psychopathy’s enormous impact on society in general and on the criminal justice system in particular, there are significant benefits to increasing awareness of the condition. This review also highlights a recent, compelling, and cost-effective treatment program that has shown a significant reduction in violent recidivism in youth on a putative trajectory to psychopathic personality.

CITATION: Kent A. Kiehl and Morris B. Hoffman, The Criminal Psychopath: History, Neuroscience, Treatment, and Economics, 51 Jurimetrics J. 355–397 (2011).
Safety Squeeze: Banning Non-Wood Bats Is Not the Answer to Amateur Baseball’s Bat Problem
David A. Palanzo

In 2007, the City of New York passed an ordinance banning the use of non-wood (metal) baseball bats in high school baseball games, citing the safety of high school athletes as the main concern. Proponents of the law argued that injuries from baseballs propelled by non-wood baseball bats had become more frequent and more violent. They pointed to amateur players like eighteen-year-old Brandon Patch, killed in Montana in 2003 by a line drive from a metal bat that struck his temple. While child and young adult safety is certainly a legitimate concern, requiring the use of wood baseball bats will create its own safety and economic issues and will not solve amateur baseball’s bat problem. Our National Pastime is in need of a compromise. By requiring manufacturers to build non-wood baseball bats that perform exactly like wood bats, amateur baseball can eliminate the safety conflict between wood and non-wood bats, limit the environmental footprint of baseball bats, reduce the economic impact of baseball bats, and maintain the statistical and fundamental integrity of the amateur game by preventing the inflation of amateur statistics. Maintaining the statistical and fundamental integrity in turn could increase both the economic and temporal efficiency of the Major League Baseball scouting and draft process. Thus, rather than ban the use of non-wood baseball bats altogether, amateur baseball regulatory bodies such as the NCAA and NFHS should work with MLB to regulate the technology of non-wood baseball bats used in amateur baseball leagues to require performance at exact wood bat levels.

CITATION: David A. Palanzo, Comment, Safety Squeeze: Banning Non-Wood Bats Is Not the Answer to Amateur Baseball’s Bat Problem, 51 Jurimetrics J. 319–353 (2011).
Brain Fingerprinting, Scientific Evidence, and Daubert: A Cautionary Lesson from India
Lyn M. Gaudet

Although the Supreme Court decided the seminal case of Daubert v. Merrell Dow Pharmaceuticals, Inc. nearly two decades ago, academic discourse about the value of the Daubert standard rages on. This note discusses Daubert in a new context, using the 2008 Indian case of State of Maharashtra v. Sharma as an example of how unreliable, questionable evidence can penetrate the courtroom when admissibility standards for expert evidence do not keep it at bay. This note also analyzes Daubert against the backdrop of rapidly emerging technologies and highlights the fact that courts can expect to confront increasing amounts of technical expert evidence in the future. Now, more than ever, courts must be armed with a mechanism to separate the legitimate from the illegitimate. Addressing Daubert critics, whose arguments are focused mainly on toxic tort cases, this note finds their criticisms do not apply in criminal trials, and thus the vast majority of the dissatisfaction with Daubert is one-sided. It is as a screening tool for expert evidence and testimony in criminal court that the Daubert standard is so valuable. Lastly, this paper argues that United States criminal law is fortunate to have the evidentiary filter provided by Daubert and warns of the potential consequences of relaxing admissibility standards.

CITATION: Lyn M. Gaudet, Note, Brain Fingerprinting, Scientific Evidence, and Daubert: A Cautionary Lesson from India, 51 Jurimetrics J. 263–318 (2011).
The Devil Is in the Details: Health-Care Reform, Biosimilars, and Implementation Challenges for the Food and Drug Administration
Jordan Paradise

The Biologics Price Competition and Innovation Act (BPCIA), part of the Patient Protection and Affordable Care Act (PPACA), placed a major regulatory challenge in front of the Food and Drug Administration (FDA). The BPCIA grants broad authority to the FDA to develop and implement a market approval pathway for “biosimilar” and “interchangeable” biological products. This article provides an overview of the nature and regulation of biological products as compared to conventional small molecule drugs; surveys the content and scope of the BPCIA provisions, highlighting core definitions, requirements for submissions to the FDA, and the basics of the intricate patent disclosure and resolution process; provides an overview of the FDA’s actions to date in the implementation of the biosimilar pathway; and identifies several challenges facing the FDA and various stakeholders.

CITATION: Jordan Paradise, The Devil Is in the Details: Health-Care Reform, Biosimilars, and Implementation Challenges for the Food and Drug Administration, 51 Jurimetrics J. 279–292 (2011).
Don't Hold Back: When and How Corporate Counsel Should Implement a Litigation Hold
Tim Winslow and Jason Malone

A party has a duty to preserve evidence when it knows the evidence is relevant to present litigation or should have known that the evidence may be relevant to future litigation. This duty has become more important due to the relative ease and frequency with which electronic documents are deleted in the course of business. Thus, it is important for a business to understand its obligations with regard to the preservation of electronic data, especially when the potential for litigation arises.

Once a party reasonably anticipates litigation, it must suspend its normal document retention policy and institute a litigation hold. Failure to implement a legal hold can have dire ramifications for a party both financially and in terms of litigation strategy. Consequently, it is paramount that all companies have a proper litigation hold policy in place at the appropriate time.

This article discusses what events may trigger the initiation of a litigation hold, how the culpability of the delinquent party and the relevance of the missing evidence factor into determining what sanctions a court may impose, the types of sanctions that courts have recently favored, and finally, what a proper legal hold policy should entail.

CITATION: Tim Winslow and Jason Malone, Don’t Hold Back: When and How Corporate Counsel Should Implement a Litigation Hold, 51 Jurimetrics J. 245–278 (2011).
A Duration No More than Necessary: A Proposed Test for the Duration Requirement of RAM-Copy Fixation
Li-Jen Shen

Circuit courts are split on whether a temporary duplicate of a copyrighted work in Random Access Memory (RAM) constitutes a copy under the Copyright Act of 1976. The Ninth Circuit interprets the statutory definition of fixation in § 101 of the 1976 Act by requiring only that the embodiment of a copyrighted work be “perceived, reproduced, or otherwise communicated.” But the Second and Fourth Circuits expand on this requirement and consider the language in § 101, “for a period of more than transitory duration,” an indispensable requirement as well. One problem with the duration requirement is that when an Internet Service Provider (ISP), at the request of a user, accesses certain materials by referring to copyrighted electronic files, the downloaded file automatically becomes an unauthorized copy under the Ninth Circuit standard. This is true even if the materials’ copyright belongs to the user. Another problem is that there is no bright-line rule on how long the period must be for the embodiment to be fixed. To effectively resolve this issue, courts should refer to the goals and policies underlying copyright law in determining whether an embodiment was “fixed.” The threshold “period of more than transitory duration” should not be a set figure. Rather, a copy should be considered fixed under § 101 if the period of caching is longer than what is necessary for fair-use conduct to acquire the required information. The key is whether such a temporary duplicate will put the copyrighted work at risk of being unlawfully distributed.

CITATION: Li-Jen Shen, Comment, A Duration No More than Necessary: A Proposed Test for the Duration Requirement of RAM-Copy Fixation, 51 Jurimetrics J. 217–243 (2011).
Prior Appropriation, Agriculture and the West: Caught in a Bad Romance
Nisha D. Noroian

The agricultural sector consumes 65% of the freshwater in the United States, with the majority of irrigated farmland located west of the 100th Meridian. The Western United States contains some of the fastest growing regions in the country and has the lowest average rainfall. Surprisingly, because of historical water policies dating back to the mid-1800s when the first settlers traveled west, there are few incentives for farmers to conserve water. The foundation of western water law, the doctrine of prior appropriation—first in time, first in right—gave landowners and permit holders as much water as they needed with little concern for how much they used. This era of limitless natural resources amid a burgeoning population has shifted to an era where municipalities, landowners, environmentalists, and states are scrambling to satisfy their basic water needs. Because the agricultural sector is the largest single consumer of freshwater, its members are the users most scrutinized for their water policies. Therefore, the agricultural sector must be proactive in its future conservation policies and advocate for beneficial alterations to the prior appropriation doctrine. In particular, prior appropriation law must be altered to reflect and address these future concerns of water scarcity, while maintaining an adequate balance between agricultural and urban needs. The doctrine of prior appropriation and state water policy should focus on three primary objectives: (1) assurance that water conserved by prior appropriators remains part of their appropriative right; (2) alteration of the prior appropriation doctrine to facilitate the development of a federal water transfer Clearing House; and (3) increased farmer education about the merits and methods of water conservation.

CITATION: Nisha D. Noroian, Comment, Prior Appropriation, Agriculture and the West: Caught in a Bad Romance, 51 Jurimetrics J. 181–215 (2011).
Cash for Clunkers, Dimes for Duracells: An Effective Model to Motivate the Proper Disposal of Household Toxic Waste
Marc L. Lerner

Conventional portrayals of sources of toxic waste show smokestacks puffing smoke into the sky and pipes spewing toxic ooze into waterways. These images are incomplete, however, because they overlook the significant aggregated disposal of toxic waste from households and its substantial impact on the environment. The perception that these small contributions do not cause much harm is prevalent; however, in reality the impact is significant and profound. Certain categories of toxic household waste require specific disposal, but general household waste is largely exempt from regulations intended to protect humans, animals, and the environment. Information regarding which categories of household products require safe disposal is lacking. Beyond such knowledge, the public needs incentives that promote the safe disposal of these categories of goods. Any effective process for safely disposing of the materials must be convenient while providing clear benefits.

This comment proposes a framework for three specific product categories (batteries, compact fluorescent lamps, and cell phones) that incorporates key lessons from other successful initiatives to reduce and control waste. Evaluating programs, like container-deposit legislation, that have successfully motivated individuals to safely dispose of consumer goods provides valuable guidance. Further, a regulatory scheme that informs the public, provides meaningful incentives, and makes disposal convenient could reduce the impact of household toxic waste on the environment. This comment proposes a mandatory rebate system modeled on bottle bill legislation that will motivate citizens to properly dispose of certain categories of toxic household waste.

CITATION: Marc L. Lerner, Comment, Cash for Clunkers, Dimes for Duracells: An Effective Model to Motivate the Proper Disposal of Household Toxic Waste, 51 Jurimetrics J. 141–179 (2011).
Problems in Common Interpretations of Statistics in Scientific Articles, Expert Reports, and Testimony
Sander Greenland and Charles Poole

Despite articles and books on the proper interpretation of statistics, it is still common in expert reports, as well as in the scientific and statistical literature, to see basic misinterpretations and neglect of the background assumptions that underlie all statistical inferences. This problem can be attributed to the complexities of correct definitions of concepts such as P-values, statistical significance, and confidence intervals. These complexities lead to oversimplifications and subsequent misinterpretations by authors and readers. Thus, the present article focuses on what these concepts are not, which allows a largely nonmathematical approach. The goal is to provide reference points for courts and other lay readers to identify misinterpretations and misleading claims.

CITATION: Sander Greenland and Charles Poole, Problems in Common Interpretations of Statistics in Scientific Articles, Expert Reports, and Testimony, 51 Jurimetrics J. 113–129 (2011).
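
One of the misreadings the abstract above targets concerns confidence intervals: the “95%” describes the long-run coverage of the interval-constructing procedure over repeated samples, not a 95% probability that any single computed interval contains the truth. The brief simulation below, an editorial illustration not drawn from the article, makes that distinction concrete.

```python
"""Small simulation of what a 95% confidence interval is (and is not): the 95%
refers to the long-run coverage of the procedure, not to a probability
statement about any one computed interval.  Illustration only."""
import random
import statistics

random.seed(7)

TRUE_MEAN, SD, N, REPS = 10.0, 2.0, 30, 20_000
Z = 1.96                                   # normal approximation for a 95% interval

covered = 0
for _ in range(REPS):
    sample = [random.gauss(TRUE_MEAN, SD) for _ in range(N)]
    m = statistics.mean(sample)
    se = statistics.stdev(sample) / N**0.5
    if m - Z * se <= TRUE_MEAN <= m + Z * se:
        covered += 1

print(f"coverage over {REPS} repetitions: {covered / REPS:.3f}")
# Roughly 95% of the intervals cover the true mean; each single interval either
# does or does not, which is why a "95% probability" claim about one observed
# interval misreads what the procedure guarantees.
```
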
The National Academy of Sciences Report on Forensic Sciences: What it Means for the Bench and Bar
Harry T. Edwards

On May 6, 2010, the Honorable Harry T. Edwards delivered a presentation to the Conference on The Role of the Court in an Age of Developing Science & Technology. Sponsored by the Superior Court of the District of Columbia, the conference was held in Washington, D.C. on May 6–7, 2010. Jurimetrics is pleased to present the text of Judge Edwards’s lecture.

CITATION: Harry T. Edwards, The National Academy of Sciences Report on Forensic Sciences: What it Means for the Bench and Bar, 51 Jurimetrics J. 1–15 (2010).
Disconnected: The Safe Prisons Communications Act Fails to Address Prison Communications
Jane C. Christie

Cell phones smuggled into federal and state prisons are considered a top security threat to public safety. Contraband cell phones, often smuggled in by corrupt prison guards and family visitors, are used by criminals for activities ranging from the innocuous, such as text messaging their families, to the more sinister, including ordering hits on witnesses. Prison officials argue that cell phone jamming devices, which operate by producing a signal on the same spectrum band that cancels or “jams” cell phone signals, are needed to halt the flow of cellular communications in and out of prisons. These jamming devices, however, have long been deemed illegal by the Federal Communications Commission (FCC) under the Communications Act of 1934, 47 U.S.C. § 333. The proposed Safe Prisons Communications Act of 2009, recently passed by unanimous consent in the U.S. Senate, would create an exception for prisons to petition the FCC for permission to install cell phone jamming devices. This potential exception raises several concerns, however. These jamming devices may create interference spillover beyond the boundaries of prisons, disrupting legitimate cell phone communication. Furthermore, private companies, schools, and even movie theaters are interested in gaining access to this technology to jam legal, but potentially annoying, cell phone communication. An exception for prisons might lead to an undesirable proliferation of jamming. Additionally, many cell phones are being smuggled in by desperate families who cannot afford to pay the exorbitant collect call rates traditionally charged by most prisons. Dealing just with illicit cell phones in prisons obscures the important social and rehabilitative interests in facilitating prisoner contact with families and loved ones.

At first glance, implementation of jamming devices seems compelling for combating contraband cell phones in prison, but it must be weighed appropriately against other technology and public policy alternatives. This comment will argue that implementation of cell phone jamming technologies in prisons, as proposed by the Safe Prisons Communications Act of 2009, is an inappropriate response to curbing illicit prisoner cell phone usage because the act (1) fails to address the threat of disrupting service to legitimate users, (2) ignores the real threat of jamming proliferation, (3) focuses solely on jamming technologies, and (4) fails to address the lack of inmate access to affordable telephone service.

CITATION: Jane C. Christie, Comment, Disconnected: The Safe Prisons Communications Act Fails to Address Prison Communications, 51 Jurimetrics J. 17–59 (2010).
Particles of What? A Call for Specificity in Airborne Particulate Regulation
Michael T. Poulton

The Clean Air Act assigns the Environmental Protection Agency the responsibility of regulating air pollutants nationwide. In response to this mandate, the EPA has promulgated National Ambient Air Quality Standards (NAAQS), which establish maximum permissible concentrations for certain pollutants. In addition to specific chemical pollutants, the NAAQS attempt to regulate coarse airborne particulate matter, or “PM10.” This airborne dust varies widely in chemical composition and morphology, so experts cannot accurately predict health effects based on concentration alone. Consequently, the Environmental Protection Agency’s current regulatory approach cannot produce a predictable or consistent public health outcome. An improved regulatory scheme is proposed, using newer technology to monitor particulate composition in detail and establish standards weighted by particulate pathogenic potential, to achieve a uniform public health outcome.

CITATION: Michael T. Poulton, Comment, Particles of What? A Call for Specificity in Airborne Particulate Regulation, 51 Jurimetrics J. 61–87 (2010).
The NAS/NRC Report on Forensic Science: A Glass Nine-Tenths Full (This Is About the Other Tenth)
D. Michael Risinger

The NAS/NRC committee report, Strengthening Forensic Science in the United States: A Path Forward, issued in February 2009, was a milestone in the decades-long struggle to get those who control the production and utilization of forensic science expertise to admit the various weaknesses of some of the techniques involved, and to take steps to strengthen the reliability of those techniques and their products. The NAS/NRC committee report is in some ways the culmination of those efforts and has made it now untenable to dismiss criticisms as simply the cavils of uninformed academics with nothing better to do.

In this sense, the report is a glass nine-tenths full and is to be celebrated as such. But then there is the other tenth, the tenth that may, as an unintended consequence, delay needed reform significantly and unnecessarily. The most significant part of this unwise tenth is the decision not to push strongly for the immediate adoption of masking and sequential unmasking protocols in forensic science practice, but instead to call for “more research” on the issue in advance of moving forward.

This paper explains in detail why the “await more research” approach is misguided.

CITATION: D. Michael Risinger, The NAS/NRC Report on Forensic Science: A Glass Nine-Tenths Full (This Is About the Other Tenth), 50 Jurimetrics J. 21–34 (2009).
The National Research Council’s Plan to Strengthen Forensic Science: Does the Path Forward Run Through the Courts?
William C. Thompson

According to the National Research Council’s (NRC) report on forensic science, courts have been “utterly ineffective” in screening problematic forensic testimony from criminal juries. This article discusses the NRC’s implicit critique of the courts and assesses future prospects for judicial gatekeeping in the area of forensic science. It argues that the NRC report may itself help turn the tide of judicial opinion against the admissibility of individualization testimony by practitioners of pattern-matching disciplines and that this development, if it occurs, will help generate political support for the NRC’s ambitious agenda of legislative and regulatory reforms.

CITATION: William C. Thompson, The National Research Council’s Plan to Strengthen Forensic Science: Does the Path Forward Run Through the Courts?, 50 Jurimetrics J. 35–51 (2009).
The NRC Report and Its Implications for Criminal Litigation
Paul C. Giannelli

The National Research Council (NRC), an arm of the National Academy of Sciences (NAS), issued a landmark report on forensic science in February 2009. In the long run, the report’s recommendations, if adopted, would benefit law enforcement and prosecutors. The recommendations would allow forensic science to develop a strong scientific basis and limit evidentiary challenges regarding the reliability of forensic evidence. In keeping with its congressional charge, however, the NRC committee did not directly address admissibility issues. Nevertheless, given its content, the report will inevitably be cited in criminal cases. Indeed, within months, the United States Supreme Court cited the report, noting that “[s]erious deficiencies have been found in the forensic evidence used in criminal trials.” Defense attorneys would be derelict if they did not use it, and prosecutors will have no choice but to respond to defense arguments. This essay examines how courts may respond to the NRC report in the near future.

CITATION: Paul C. Giannelli, The NRC Report and Its Implications for Criminal Litigation, 50 Jurimetrics J. 53–66 (2009).
Forensic Science and Miscarriages of Justice: Some Lessons from Comparative Experience
Kent Roach

This paper provides a critical assessment of the National Research Council’s (NRC) 2009 report in light of comparative experience in Australia, Canada, and the United Kingdom. It suggests that the NRC’s proposals for federal regulation of the forensic sciences are more appropriate for a unitary state than a federal system. The NRC report could have been strengthened by examining the British experience with a forensic regulator and a 2008 report on forensic pathology in the Canadian province of Ontario. The pathology report has already produced tangible reforms to the practice of forensic pathology within coroners’ systems while the NRC unrealistically calls for the abolition of all coroner systems. The dangers of superficial reforms are examined, including a Canadian example of published research being misapplied in a manner that contributed to a wrongful conviction. Finally, the NRC’s pessimistic conclusions about judicial exclusion of unreliable forensic science are contrasted with recent and more optimistic reform proposals in Canada and the United Kingdom.

CITATION: Kent Roach, Forensic Science and Miscarriages of Justice: Some Lessons from Comparative Experience, 50 Jurimetrics J. 67–92 (2009).
How Can Francis Bacon Help Forensic Science? The Four Idols of Human Biases
Itiel E. Dror

In this paper, I try to find ways to improve forensic science by identifying potential vulnerabilities. To this end, I use Francis Bacon’s doctrine of idols, which distinguishes between different types of human biases that may prevent scientific and objective inquiry. Bacon’s doctrine contains four sources for such biases: idola tribus (idols of the tribe), idola specus (idols of the den or cave), idola fori (idols of the market), and idola theatri (idols of the theatre). While his 400-year-old doctrine does not, of course, perfectly match up with our current world view, it still provides a productive framework for examining and cataloguing some of the potential weaknesses and limitations in our current approach to forensic science.

CITATION: Itiel E. Dror, How Can Francis Bacon Help Forensic Science? The Four Idols of Human Biases, 50 Jurimetrics J. 93–110 (2009).
When Nature’s Anticipation Inherently Prevents Your Discovery: A New Look at an Overlooked Requirement of Patentability and Its Impact on Inherent Anticipation
Ben Herbert

A valid patent must be novel, useful, and nonobvious. If a patent is not novel, it is said to be anticipated and is therefore invalid. Anticipation is determined by looking at the written claims describing the invention. An invention is anticipated if a single prior art reference contains all of the limitations of the claimed invention. The prior art can either explicitly or inherently anticipate. Explicit anticipation occurs when the prior art states all of the limitations of the subsequent invention. Inherent anticipation occurs when the prior art explicitly mentions some limitations, while other limitations are present in the physical embodiment described by the prior art reference but are not explicitly mentioned in the reference. This situation arises because the written description of the prior art is distinct from the actual physical product described by the prior art reference. Accordingly, it is possible for an actual product to display characteristics that are not explicitly mentioned as limitations in the language describing the prior art reference. Under the doctrine of inherent anticipation, a court, in effect, will read innate limitations present in the actual product as if they were explicit limitations in the prior art.

There has been widespread disagreement in the Federal Circuit and the legal community about the elements of the doctrine of inherent anticipation. All agreed that the nonexplicit “missing” limitation of a prior art reference needed to be innately present in the actual product described by the prior art reference for the doctrine to apply. In other words, the missing limitation must always occur or be present in the actual product. Some argued, however, that in addition to this element, a second element is required, namely that the innate limitation also must be recognized by a person having ordinary skill in the art (PHOSITA).

In Schering v. Geneva, the Federal Circuit held that inherent anticipation required only that the missing limitations be innately present in the product described by the prior art reference. The difficult factual scenario presented in Schering led the court to abandon PHOSITA recognition as an element of inherent anticipation. But the facts of the case did not require this result. Instead, as the history of patent law makes clear, the patent in Schering could have been invalidated by applying the newness analysis while maintaining both elements of the doctrine of inherent anticipation. Specifically, patent law dictates that the product or process seeking a patent actually be new; it requires an invention. Absent a showing of invention, a patent will not issue. This newness analysis examines the subject matter of a patent and asks whether it previously existed in nature. If the subject matter did exist in nature, then the patent is invalid. Thus, instead of abandoning the recognition requirement, the court in Schering should have applied the two-element inherent anticipation analysis and held that the patent was not anticipated. The court should then have applied the newness analysis and still held the patent invalid because the subject matter was not an invention.

CITATION: Ben Herbert, Comment, When Nature’s Anticipation Inherently Prevents Your Discovery: A New Look at an Overlooked Requirement of Patentability and Its Impact on Inherent Anticipation, 50 Jurimetrics J. 111–146 (2009).
Time for an Upgrade: Amending the Federal Rules of Evidence to Address the Challenges of Electronically Stored Information in Civil Litigation
Jonathan L. Moore

In recent years, electronically stored information (ESI) has begun to play an increasingly important role in civil litigation. Although the e-discovery amendments to the Federal Rules of Civil Procedure in 2006 provided guidelines for the discovery of this information, no accompanying changes were made to the Federal Rules of Evidence to govern the admissibility of this information at trial.

This article outlines the vastly different ways courts have addressed this problem in three areas: authentication, hearsay, and the best evidence rule. After discussing the various approaches courts take in these areas, this article proposes specific amendments to the Federal Rules of Evidence that would provide guidance to courts and litigants as to the admissibility of electronically stored information at trial.

CITATION: Jonathan L. Moore, Time for an Upgrade: Amending the Federal Rules of Evidence to Address the Challenges of Electronically Stored Information in Civil Litigation, 50 Jurimetrics J. 157–193 (2010).
Obtaining Copyright Licenses by Prescriptive Easement: A Solution to the Orphan Works Problem
Aryeh L. Pomerantz

This article addresses the issue of orphan works. It discusses legislation recently considered by Congress in an effort to solve the problem orphan works present, and it analyzes the shortcomings of the approach taken by the considered legislation.

In an effort to provide an alternate solution to the orphan works problem, the article analyzes the application to copyrights of the common law doctrines of adverse possession and prescriptive easements. It argues against the adoption of adverse possession to copyrights. The article, however, concludes that the doctrine of prescriptive easement can be adapted to copyrights, and the article offers a proposal based on such an adaptation that the author believes would provide a superior solution to the orphan works problem.

CITATION: Aryeh L. Pomerantz, Obtaining Copyright Licenses by Prescriptive Easement: A Solution to the Orphan Works Problem, 50 Jurimetrics J. 195–227 (2010).
The Impact of “Hurricane” Hannah: The Government’s Decision to Compensate in One Girl’s Vaccine Injury Case Could Drastically Alter the Face of Public Health
Michael J. Donovan

The Centers for Disease Control and Prevention (CDC) rank immunizations as one of the most significant public health achievements of the 20th century. In the last 100 years, vaccination has eradicated smallpox globally and brought many other infectious diseases under control. Even as vaccination has succeeded, however, the prevalence of Autism Spectrum Disorder (ASD), a broad organic disorder characterized by difficulties with social interaction and communication, has risen. A persistent, unsubstantiated hypothesis posits a connection between the mandatory childhood vaccination regimen and the increase in ASD. Although this theory is not supported by peer-reviewed literature, a recent development in the administrative Vaccine Court has fueled the debate. In Poling ex rel. Poling v. Secretary of Health and Human Services, Hannah Poling brought a claim for compensation, alleging that her ASD resulted from her childhood vaccinations. The United States’ conclusion, made through the Secretary of Health and Human Services, that Poling was due compensation ignited the mass media and advocacy groups. Even though it runs contrary to several recent decisions by the Vaccine Court, the government’s decision in Poling has been used by some media sources to support the autism-vaccination connection, which may in turn lead to lower vaccination rates among children. This lack of adherence to immunization protocol could result in an increased risk of infectious disease outbreaks, similar to what was observed globally after the initial report of a vaccine-autism connection. In the first seven months of 2008, more measles cases were reported to the CDC than at any time in the preceding dozen years. To address this concern, this comment suggests a two-part approach. First, although the United States did not concede compensation in the three February 2009 cases, it still faces approximately five thousand cases before the Vaccine Court, and it should ensure that its Vaccine Court strategy requires causation to be demonstrated before it concedes compensation. Second, personalized medicine should be used whenever available to minimize circumstances in which rare vaccine side effects are possible.

CITATION: Michael J. Donovan, Comment, The Impact of “Hurricane” Hannah: The Government’s Decision to Compensate in One Girl’s Vaccine Injury Case Could Drastically Alter the Face of Public Health, 50 Jurimetrics J. 229–259 (2010).
Copyleft Termination: Will the Termination Provision of the Copyright Act of 1976 Undermine the Free Software Foundation’s General Public License?
Jon L. Phelps

Unlike traditional copyright licenses, Free Open Source Software (FOSS) licenses have been termed copyleft rather than copyright. This is because copyleft licenses use intellectual property law to keep the source and object code of the licensed software available, at no charge, to anyone who would like to use it. The termination provision of the Copyright Act of 1976 could undermine or even destroy the FOSS movement. Licenses of FOSS programs that are old enough for the termination provision of the Copyright Act to apply could potentially be terminated as early as 2013. This article provides a brief discussion of FOSS licenses and the termination provision of the Copyright Act of 1976. It also describes the limitations on an author’s rights in computer programs under 17 U.S.C. § 106. Finally, it presents a possible resolution, which lies in the Copyright Act’s safe harbor provision for adaptation of computer software.

CITATION: Jon L. Phelps, Comment, Copyleft Termination: Will the Termination Provision of the Copyright Act of 1976 Undermine the Free Software Foundation’s General Public License?, 50 Jurimetrics J. 261–273 (2010).
Organizing the Unorganized: The Role of Nonprofit Organizations in the Commons Communities
Jyh-An Lee

In recent years, commons communities, such as Wikipedia and free or open source software communities, have received extensive attention from academia. Conventional wisdom holds that these communities have produced information across boundaries, rendering formal organizations obsolete. Nonetheless, this article demonstrates that nonprofit organizations (NPOs) are actually playing a vital role in the digital commons production process.

This conclusion is based on current commons and NPO scholarship and a series of in-depth, semistructured interviews with officers of NPOs and for-profit organizations associated with commons communities. NPOs serve those communities by managing property rights, transacting with other entities, protecting individuals from potential liabilities, and institutionalizing various decision-making and standard-setting processes. The unique organizational structures of NPOs, associated with a “nondistribution constraint,” have provided indispensable trust for community members. This article proposes that the commons environment has provided an “environmental niche” in which NPOs thrive. That is, the nature of NPOs is more consistent with commons-environment culture than is the nature of for-profit enterprises or governmental units.

CITATION: Jyh-An Lee, Organizing the Unorganized: The Role of Nonprofit Organizations in the Commons Communities, 50 Jurimetrics J. 275–327 (2010).
That Kind of Sexe Which Doth Prevaile: Shifting Legal Paradigms on the Ontology and Mutability of Sex
Michael Boulette

This article begins with a simple premise: sex matters, particularly in the context of marriage. While determinations of an individual’s sex are often taken for granted, situations involving the marriage of transsexual and intersex individuals present hard cases, cases that require the examination and reexamination of how sex, and the ability of individuals to change their sex, are understood. Using marriage as a lens, this article explores and critiques the various ways in which courts have come to grips with the ontology and mutability of sex. Lastly, this article explores potential answers to these questions, ultimately arguing that hard cases require hard answers, and proposes a general analytic structure capable of accounting for hard cases while avoiding the more politically polarizing question of same-sex marriage.

CITATION: Michael Boulette, That Kind of Sexe Which Doth Prevaile: Shifting Legal Paradigms on the Ontology and Mutability of Sex, 50 Jurimetrics J. 329–370 (2010).
State Trial Judge Use of Court Appointed Experts: Survey Results and Comparisons
Stephanie Domitrovich, Mara L. Merlino, and James T. Richardson

State trial judges from Texas, Michigan, Indiana, and Arizona were surveyed about their use of court-appointed experts. This article discusses the survey results concerning the types of experts most frequently appointed by judges in this sample and compares their use by judge gender, docket (civil, criminal, family, and combined), and type of expert appointment (statutorily mandated or discretionary). Many judges reported either appointing their own experts or being willing to do so. Significant differences were found by type of docket, type of expert, and type of expert appointment. Judges’ reasons for choosing or not choosing to appoint experts are discussed, as are comparisons with findings in the few other studies of this topic.

CITATION: Stephanie Domitrovich, Mara L. Merlino, and James T. Richardson, State Trial Judge Use of Court Appointed Experts: Survey Results and Comparisons, 50 Jurimetrics J. 371–389 (2010).
The Neuropsychology of Justifications and Excuses: Some Problematic Cases of Self-Defense, Duress, and Provocation
Theodore Y. Blumoff

Writing in 1984, Professor Greenawalt described cases on the excuse-justification border as “perplexing.” He concluded that two of the most frequently articulated reasons for distinguishing between justifications and excuses—warranted versus unwarranted conduct, objective and general versus subjective and individual—are not as descriptively clean as they sometimes purport to be. The “conceptual fuzziness” that Greenawalt documents is inherent in the nature of the acts themselves; they are neurobiologically indistinct. Justifications and excuses in the boundary cases trigger both our emotional and cognitive processing areas almost simultaneously. The emotions tend to precede the cognitive, but only long enough to focus attention on the immediate threat. The conceptual blur will continue as long as our jurisprudence rigidly categorizes conduct that exists only on a continuum. This is not a new problem. The law tends to break down into categories—guilty or not guilty, for example. But the world is not binary; it is continuous, and categorical thinking tends to distort our view of the world. The Model Penal Code divides mens rea into four categories. The drafters concede that mens rea exists only on a continuum and cannot be rationally determined without question-begging. This approach constitutes an implicit recognition of the way in which our control functions actually operate. It is time to acknowledge that these problematic excuse-justification cases defy categorization and thereby eliminate the confusion by adopting an advertently hybrid defense.

CITATION: Theodore Y. Blumoff, The Neuropsychology of Justifications and Excuses: Some Problematic Cases of Self-Defense, Duress, and Provocation, 50 Jurimetrics J. 391–424 (2010).
The Normative Threshold for Psychiatric Civil Commitment
Nicholas Scurich and Richard John

There is an ethical debate about whether mental health professionals should predict dangerousness. One powerful objection involves considering the nature and scope of what is being predicted. Dangerousness is a bifurcated concept with a descriptive component, corresponding to risk factors and their relation to violence, and a normative component, corresponding to a moral and legal judgment that sufficient risk exists to justify involuntary confinement. The normative judgment is exclusively reserved for legal decision makers, although due process constitutionally constrains these decisions. This article uses decision theory to show the general effects of due process on setting a minimal threshold of risk necessary to justify involuntary civil commitment. It then demonstrates the corollary of applying these effects to the results of the foremost actuarial tool, under varying base rate assumptions. It concludes that the involuntary confinement of only the highest risk individuals is legally justified.
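
The threshold reasoning summarized above can be illustrated with a brief numerical sketch. The Python fragment below is only an illustration under assumed numbers; the costs, sensitivity, specificity, and base rates are hypothetical placeholders rather than figures from the article, and the calculation is the textbook expected-loss threshold, not the authors’ full model.

    # Illustrative decision-theoretic commitment threshold (hypothetical numbers).

    def commitment_threshold(cost_false_positive, cost_false_negative):
        """Risk level above which commitment minimizes expected loss.

        Commit when p * cost_false_negative > (1 - p) * cost_false_positive,
        i.e., when p > cost_false_positive / (cost_false_positive + cost_false_negative).
        """
        return cost_false_positive / (cost_false_positive + cost_false_negative)

    def posterior_risk(base_rate, sensitivity, specificity):
        """P(violence | 'high risk' score) by Bayes' rule."""
        true_pos = base_rate * sensitivity
        false_pos = (1 - base_rate) * (1 - specificity)
        return true_pos / (true_pos + false_pos)

    if __name__ == "__main__":
        # Suppose due process is read as making wrongful confinement four times
        # as costly as wrongful release (a hypothetical weighting).
        threshold = commitment_threshold(cost_false_positive=4.0, cost_false_negative=1.0)
        for base_rate in (0.05, 0.20, 0.50):
            p = posterior_risk(base_rate, sensitivity=0.75, specificity=0.75)
            decision = "commit" if p > threshold else "do not commit"
            print(f"base rate {base_rate:.2f}: posterior risk {p:.2f} "
                  f"vs threshold {threshold:.2f} -> {decision}")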

CITATION: Nicholas Scurich and Richard John, The Normative Threshold for Psychiatric Civil Commitment, 50 Jurimetrics J. 425–452 (2010).
Halting Pig in the Parlor Patents: Nuisance Law as a Tool to Redress Crop Contamination
Amanda L. Kool

The legal discourse regarding the problem of crop contamination caused by stray genetically modified (GM) traits generally centers around two remedies: the reduction in patent protection afforded to subsequent generations of patented seed, and the use of common law to protect the property rights of the farmer whose crops have been contaminated. This article supports the latter approach, specifically advocating for a private nuisance suit by the owner of the contaminated crop against the owner of the patented traits. The author argues that the former approach is not only unlikely to succeed, but may also prove detrimental to those farmers who find themselves growing patented crops they do not want. The author contends that a private nuisance suit is the most appropriate of all the applicable tort law remedies because it allows the court to balance the interests of the farmer, the patent owner, and society as a whole, thereby fashioning a remedy that fits the distinct characteristics of the case at hand. Furthermore, the author argues that existing case law supports a private nuisance claim brought by a model plaintiff against a seed patent owner. The author then builds the model plaintiff’s case, using existing case law to guide the discussion.

CITATION: Amanda L. Kool, Halting Pig in the Parlor Patents: Nuisance Law as a Tool to Redress Crop Contamination, 50 Jurimetrics J. 453–507 (2010).
The Third-Party Assurance Model: A Legal Framework for Federated Identity Management
Jeff Nigriny and Randy V. Sabett

Computer network compromises continue to plague all sectors of business and government, particularly where multiple stakeholders need to share large amounts of data and collaborate online. While the ability to reliably and accurately authenticate a person is a critical aspect of securing access to computer network resources, such user authentication has not, historically, been a high priority. Federated identity management has evolved as a mechanism for certain entities to make assertions on which other entities can rely, by establishing a trust framework amongst such heterogeneous stakeholders. This trust framework can provide an acceptable level of assurance that the identity of a user or group of users associated with one stakeholder can be properly verified such that a user or group of users associated with another stakeholder can rely on the purported identity of the users in the other group. Traditional federated models, however, have not been optimized for operational use. In response, a third-party assurance (3PA) model is proposed that incorporates the best features of the industry-standard hub and consortium models, utilizing (a) existing bilateral agreements between asserting and relying parties and (b) a federation operator (FO). The 3PA model creates a double-binding obligation on the asserting party, typically an identity provider (IdP), to comply with the rules of the identity federation. Under the 3PA model, the IdP has one set of contractual obligations directly applicable to it via its contract with the FO. The IdP has a second set of contractual obligations to each relying party (RP) in the federation (as a third-party beneficiary) via the incorporation by reference of the rules of the federation.
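
The contractual structure described above can be made concrete with a minimal data model. The short Python sketch below is the editor’s illustration only: the class and field names (FederationRules, FederationOperator, IdentityProvider, RelyingParty) are assumptions introduced for exposition and do not come from the article, which proposes a legal framework rather than code.

    # Schematic of the 3PA model's double-binding obligation (illustrative only).
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class FederationRules:
        name: str
        obligations: List[str] = field(default_factory=list)

    @dataclass
    class FederationOperator:
        name: str
        rules: FederationRules

    @dataclass
    class RelyingParty:
        name: str

    @dataclass
    class IdentityProvider:  # the asserting party
        name: str
        # First binding: a direct contract with the federation operator.
        contract_with_fo: Optional[FederationOperator] = None
        # Second binding: each relying party is a third-party beneficiary of the
        # same federation rules, incorporated by reference.
        beneficiary_rps: List[RelyingParty] = field(default_factory=list)

        def parties_owed_obligations(self):
            """Everyone to whom the federation rules run as enforceable obligations."""
            owed = [self.contract_with_fo.name] if self.contract_with_fo else []
            return owed + [rp.name for rp in self.beneficiary_rps]

    if __name__ == "__main__":
        rules = FederationRules("Federation Operating Rules", ["identity proofing", "credential protection"])
        fo = FederationOperator("Federation Operator", rules)
        idp = IdentityProvider("Asserting IdP", fo, [RelyingParty("Relying Party A"), RelyingParty("Relying Party B")])
        print(idp.parties_owed_obligations())  # operator plus both relying parties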

CITATION: Jeff Nigriny and Randy V. Sabett, The Third-Party Assurance Model: A Legal Framework for Federated Identity Management, 50 Jurimetrics J. 509–537 (2010).
Causation and Uncertainty: Making Connections in a Time of Change
Andrew R. Klein

This paper considers how tort law should respond to scientific developments that improve our ability to connect toxic exposure to changes in human health. Starting from a normative position that encourages proof of traditional sine qua non causation, the paper suggests an allocation of cases between the tort system and legislatively created compensation programs. With regard to the latter, the paper sets out guidelines that lawmakers can use when deciding whether to replace tort law. With regard to the former, the paper divides its recommendations into two categories, one for cases where plaintiffs have existing clinical symptoms of disease, and another where plaintiffs have only an increased risk of disease. Recognizing the gray area that exists between these two groups, the paper proposes burdens consistent with traditional causation rules to serve tort law’s underlying goals.

CITATION: Andrew R. Klein, Causation and Uncertainty: Making Connections in a Time of Change, 49 Jurimetrics J. 5–50 (2008).
Recent Developments in the “Stem Cell Century”: Implications for Embryo Research, Egg Donor Compensation
Russell Korobkin

The past year has witnessed three developments with significant implications for important policy disputes concerning stem cell research: the creation of adult cells that are reprogrammed to behave like embryonic stem cells, new progress in the creation of embryonic stem cell lines using cloning technology, and a decision of the United States Patent and Trademark Office upholding the validity of foundational patents in embryonic stem cell technology. These developments directly implicate three important policy issues surrounding stem cell research: the appropriateness of public funding of embryo research, whether compensation of human egg donors should be permitted, and whether human embryonic stem cell lines ought to be patentable. This article describes and critically analyzes the implications of these developments for law and policy.

CITATION: Russell Korobkin, Recent Developments in the “Stem Cell Century”: Implications for Embryo Research, Egg Donor Compensation, 49 Jurimetrics J. 51–71 (2008).
In re Seagate Tech., L.L.C.: Is the Objective Recklessness Standard a Practical Change?
Christopher C. Bolten

In In re Seagate Tech., L.L.C., 497 F.3d 1360 (Fed. Cir. 2007), the Court of Appeals for the Federal Circuit held: (1) a showing of objective recklessness by an infringer is required to prove willful patent infringement for an award of enhanced damages; and (2) there is no longer an affirmative duty to obtain an opinion of counsel as a defense against a charge of willful infringement. This decision overruled the willful infringement standard in Underwater Devices Inc. v. Morrison-Knudsen Co., 717 F.2d 1380 (Fed. Cir. 1983), which imposed a duty of due care on potential infringers upon actual notice of another’s patent rights. This duty was normally satisfied when a potential infringer obtained a competent legal opinion stating that continuing action did not constitute infringement. The court’s rationale for overruling the Underwater Devices standard is persuasive with respect to showing that a defense to willful infringement should not require an affirmative duty to obtain an opinion of counsel. The elimination of the affirmative duty, however, effects little practical change because potential infringers will likely continue to seek opinions of competent counsel as a matter of sensible business practice.

CITATION: Christopher C. Bolten, Note, In re Seagate Tech., L.L.C.: Is the Objective Recklessness Standard a Practical Change?, 49 Jurimetrics J. 73–90 (2008).
Sykes v. Glaxo-SmithKline: Testing the Boundaries of Federal Preemption
Euan A. Lockhart

In Sykes v. Glaxo-SmithKline, 484 F. Supp. 2d 289 (E.D. Pa. 2007), the district court ruled that a failure to warn claim was preempted by the Food, Drug, and Cosmetic Act (FDCA) because a state tort claim would conflict with the Food and Drug Administration’s (FDA) labeling rules. The court also found that the National Vaccine Injury Compensation Act did not preempt a claim that a vaccine label was inadequate, instead finding the FDCA dispositive. The court did not account adequately for the purposes of the Vaccine Act, which allows only limited preemption and speaks of Congress’ intent to preserve state tort law, or for the fact that the FDA’s mandated ceiling on safety information is not scientifically feasible.

CITATION: Euan A. Lockhart, Note, Sykes v. Glaxo-SmithKline: Testing the Boundaries of Federal Preemption, 49 Jurimetrics J. 91–112 (2008).
Substitution and Schumpeterian Effects Over the Life Cycle of Copyrighted Works
Ariel Katz

This article develops two key insights. First, copyrighted works are affected by two types of competitive forces: substitutive competition and Schumpeterian competition. Second, the relevant magnitude of each of these competitive forces changes at various points over the life cycle of copyrighted works. The earlier stages of a work’s life cycle are dominated by substitution effects, whereby many other works can function as very close substitutes. As the work develops into a full product, to which many other inputs have been added, it becomes less easily substitutable. This process intensifies as network effects of various kinds secure successful works’ market position and render them less vulnerable to competition from close imitations. The competitive threat to which such works may be exposed becomes more Schumpeterian in nature: competition from other works that offer something new, and potentially preferable. Generally, copyright law unequivocally discourages merely substitutive competition, but is much less interested in discouraging Schumpeterian competition. This article’s time-based analysis provides both a justification for this distinction and grounds for evaluating various existing rules and doctrines. It suggests that broader copyright protection (perhaps broader than the current protection) may be desirable at the early stages of works’ life cycle, whereas narrower protection (perhaps narrower than the current) may be justified at later stages.

CITATION: Ariel Katz, Substitution and Schumpeterian Effects Over the Life Cycle of Copyrighted Works, 49 Jurimetrics J. 113–153 (2009).
Legal Transplants in Patent Law: Why “Utility” Is the New “Industrial Applicability”
Sivaramjani Thambisetty

This paper focuses on the transplantation of the “utility standard” from the U.S. legal system into the “industrial application” criterion of patentability as seen in EPO and UKIPO case law. The “specific, substantial, and credible” standard (SSCS) of utility lacks explicit statutory basis, is largely driven by a process of mimesis following collaboration between patent offices, and carries the potential to generate collateral damage to a number of neighboring legal standards in European patent law. The SSCS potentially undermines the “technical” requirement in Europe and highlights a growing conflation between industrial applicability and disclosure requirements. Additionally, the SSCS may increase research tool patentability in Europe, a development that exposes potential inadequacies in the institutional arrangements of the receiving legal system. Institutional dynamics incrementally entrench the legal transplant, accompanied by little or no discussion of its viability and legitimacy. The significant normative impact of the process of transplantation of the SSCS places the patent office, an entity that is arguably not fit for this purpose, at the center of legal and policy change.

CITATION: Sivaramjani Thambisetty, Legal Transplants in Patent Law: Why “Utility” Is the New “Industrial Applicability”, 49 Jurimetrics J. 155–201 (2009).
Deconstructing Jefferson’s Candle: Towards a Critical Realist Approach to Cultural Environmentalism and Information Policy
David W. Opderbeck

This paper introduces critical realism to information policy theory. It employs the “founding myth” of contemporary information policy, Thomas Jefferson’s Candle, to excavate the philosophical roots of the ubiquitous view that “information” is an inexhaustible “natural” resource. It shows how mathematical information theory has been employed to transport Jefferson’s Candle into a space where “information” and “culture” are merely social constructs. This social constructivism, the paper argues, saps the “cultural environmentalist” program of normative force. The paper suggests that a critical realist view of “information” and “culture,” tied to norms drawn from environmental virtue ethics, offers a helpful way forward. It concludes with suggestions about how the critical realist perspective could inform debates about Internet governance.

CITATION: David W. Opderbeck, Deconstructing Jefferson’s Candle: Towards a Critical Realist Approach to Cultural Environmentalism and Information Policy, 49 Jurimetrics J. 203–243 (2009).
Roper and the Scientific Amicus
William J. Katt

In Roper v. Simmons, the Supreme Court held it unconstitutional to execute criminals who were sixteen or seventeen years old at the time of their offense. To support this holding, the Court endorsed a scientific argument presented by amici curiae in a so-called Science Brief suggesting that juveniles are too cognitively and psychologically underdeveloped to justify the death penalty. This article examines the scientific claims presented in the Science Brief and the scientific studies cited in the brief to support those claims. Although the Science Brief for the most part relies on solid science, many of the brief’s claims go beyond what the cited studies actually prove. After reviewing the brief in detail, this article briefly discusses the inherent conflict between scientific accuracy and legal advocacy and what, if any, procedural truth-testing mechanisms should be in place for evidence presented to the Court by amici curiae.

CITATION: William J. Katt, Roper and the Scientific Amicus, 49 Jurimetrics J. 253–275 (2009).
Communicating the “Bottom Line” Probability Facilitates Decisions That Rely on Bayesian Reasoning
Yasmine L. Konheim-Kalkstein, Mark A. Stellmack, and Barb Strnad

Jurors often must have an understanding of Bayesian reasoning to interpret DNA forensic evidence. Previous studies have suggested that representing probabilistic information in natural frequency format may facilitate Bayesian reasoning. In this study, we presented 31 college students with a Bayesian problem involving DNA forensic evidence. Initially, we found that college students had difficulty with the calculation and with rendering an appropriate verdict despite the information being represented in natural frequencies. We found, however, that when the posterior probability (the probability, computed through Bayes’ theorem, of a particular DNA profile being correctly identified) was explicitly stated, our subjects were far more likely to render an appropriate verdict. Our research suggests that simplification of probabilistic information is the key to helping people understand and use risk information appropriately.
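
The natural-frequency framing and the “bottom line” posterior can be shown in a short worked example. The Python fragment below is illustrative only; the population size and random match probability are hypothetical and are not the materials used in the study.

    # Bayesian update with natural frequencies (hypothetical numbers).

    population = 1_000_000                  # people who could have left the sample
    guilty = 1                              # the true source
    innocent = population - guilty
    random_match_probability = 1 / 100_000  # chance an innocent person matches

    # Natural-frequency framing: how many people would match?
    matching_innocents = innocent * random_match_probability  # about 10 people
    matching_guilty = guilty * 1.0                            # the source always matches

    # "Bottom line" posterior probability that a matching person is the source
    posterior = matching_guilty / (matching_guilty + matching_innocents)
    print(f"About {matching_innocents:.0f} innocent matches are expected, so "
          f"P(source | match) is roughly {posterior:.2f}, not 1 - 1/100,000.")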

CITATION: Yasmine L. Konheim-Kalkstein, Mark A. Stellmack, and Barb Strnad, Communicating the “Bottom Line” Probability Facilitates Decisions That Rely on Bayesian Reasoning, 49 Jurimetrics J. 277–283 (2009).
How Much Did You Pay for Your Heart: Is a Centralized Entity Performing Health Technology Assessment with Cost-Effectiveness Analysis the Answer to the Rising Costs of Health Care?
Thomas J. Parisi

Despite having the most expensive health-care system in the world, costing over 16% of the gross domestic product, the United States continues to have a less healthy population and a greater proportion of uninsured individuals than all other westernized countries. The discordance between the amount of money spent and the actual level of health indicates that there is great room for improvement in the efficiency of health-care delivery in the United States. Although health technologies (new drugs, devices, and procedures) have been cited as a major reason for this rise in cost, the United States health-care system continues to operate without a centralized entity that performs health technology assessment with cost-effectiveness analysis. The lack of such an entity has not only resulted in the adoption of expensive technologies with unproven health outcomes, but has also prevented the acquisition of data on technologies that improve overall health and decrease health-care costs. While many insurance companies perform some health technology assessment with cost-effectiveness analysis during the decision-making process, the data are very fragmented, good studies are rare, and results are either not applicable to a greater audience or not available to those who might benefit from them. The involvement of the United States Food and Drug Administration in drug and device approval would make it an obvious choice for health technology assessment with cost-effectiveness analysis, and would align the United States with other countries that have implemented such a system. Because the United States has strong policies of promoting innovation and consumer choice, however, an alternate centralized agency, not involved in the approval process or coverage decisions, is a better option. Reasonable candidates for this role include the Agency for Healthcare Research and Quality, the Centers for Medicare and Medicaid Services, or a reinstated Office of Technology Assessment. Whichever entity is chosen would have to be insulated from politicians and from self-interested medical device manufacturers and pharmaceutical companies. Additionally, this entity would serve as a clearinghouse of unbiased, comparative information that would be made available to health-care workers, health-care payers, policy makers, and the public.
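
The core arithmetic of the cost-effectiveness analysis such an entity would perform can be sketched in a few lines of Python. The costs, outcomes, and willingness-to-pay threshold below are hypothetical placeholders, not figures from the comment.

    # Incremental cost-effectiveness ratio (ICER), with hypothetical figures.

    def icer(cost_new, cost_old, effect_new, effect_old):
        """Extra cost per extra unit of health, commonly dollars per
        quality-adjusted life year (QALY)."""
        return (cost_new - cost_old) / (effect_new - effect_old)

    new_cost, standard_cost = 60_000.0, 20_000.0  # lifetime treatment costs
    new_qalys, standard_qalys = 6.5, 5.5          # expected health outcomes

    ratio = icer(new_cost, standard_cost, new_qalys, standard_qalys)
    willingness_to_pay = 50_000.0                 # hypothetical threshold per QALY
    verdict = "within" if ratio <= willingness_to_pay else "above"
    print(f"ICER = ${ratio:,.0f}/QALY, {verdict} a ${willingness_to_pay:,.0f}/QALY threshold")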

CITATION: Thomas J. Parisi, Comment, How Much Did You Pay for Your Heart: Is a Centralized Entity Performing Health Technology Assessment with Cost-Effectiveness Analysis the Answer to the Rising Costs of Health Care?, 49 Jurimetrics J. 285–316 (2009).
MDY Industries, L.L.C. v. Blizzard Entertainment, Inc.: Software “Contracts” That Expand Copyrights Have Gone Too Far
Jordan Christopher Redman

The recent decision in MDY Industries, L.L.C. v. Blizzard Entertainment, Inc. is a setback for copyright law. In this case, MDY Industries, L.L.C., an Arizona limited liability company, created a “bot” software program that works in coordination with Blizzard’s massively multiplayer online role-playing game, World of Warcraft (WoW). Blizzard’s End-User License Agreement (EULA) and Terms of Use (TOU) for WoW, to which all users must agree before installing WoW, provide that purchasers of the game may not install any third-party software to be used in conjunction with WoW unless Blizzard authorizes such use. After agreeing to the EULA and TOU, some users installed Glider, MDY’s third-party software for use with WoW, leading Blizzard to give MDY an ultimatum: stop the sale of MDY’s Glider software, or Blizzard would file a lawsuit against MDY on multiple claims. MDY declined the offer and preemptively filed a declaratory judgment action in Arizona, claiming that the offer and sale of Glider did not infringe any of Blizzard’s rights. The Arizona court granted summary judgment in favor of Blizzard on its tortious interference claim, which may have been proper. But the court also granted Blizzard summary judgment on its copyright claims, and this could lead to disastrous results not only for copyright law, but also for users of software generally and for third-party software developers. The key issue in the decision was whether users of the WoW software were “owners” of the software or mere “licensees.” The court’s ruling that the users were “licensees” inappropriately allows the EULA and TOU, contracts governed by state law, to supersede federal copyright law. Sections 109 and 117 of the U.S. Copyright Act give users who purchase software the status of “owner” with respect to the copies they own. In addition, policy strongly supports treating these users as “owners.” Allowing contract to determine the owner-or-licensee issue for mass-marketed products would essentially allow the manufacturer of any such product containing software that is reexecuted on each use to regulate the use of the product through copyright law, whether or not the use in question violates any of the exclusive rights of copyright. For these reasons, the MDY court improperly decided this case, with potentially broad, detrimental repercussions.

CITATION: Jordan Christopher Redman, Note, MDY Industries, L.L.C. v. Blizzard Entertainment, Inc.: Software “Contracts” That Expand Copyrights Have Gone Too Far, 49 Jurimetrics J. 317–342 (2009).
Eligibility of Athletes Receiving Necessary Gene Therapy: The Oscar Pistorius Case as Procedural Precedent
Blair H. Moses

The 2008 Olympics revived the controversy over gene doping (the nontherapeutic use of genetic material to enhance athletic performance) and the regulations prohibiting its use by athletes. Regulations against gene doping were instituted by the International Olympic Committee in 2003 and by the World Anti-Doping Agency in 2004. The regulations ban only those athletes utilizing gene doping, but differentiating gene therapy (the therapeutic use of genetic material) from gene doping can be difficult. Future athletes receiving gene therapy for valid reasons, such as repairing sports injuries, could also be banned. The recent Oscar Pistorius case substantiated this concern. Pistorius, who has two artificial legs, was initially banned from competing in the 2008 Olympics. Even though his prosthetic legs are a therapeutic use of technology, the International Association of Athletics Federations ruled they were a technical enhancement that disqualified him from competing. Pistorius appealed the decision to the Court of Arbitration for Sport (CAS), whose recent ruling in his favor allows him to compete internationally against “able-bodied” runners. Although prosthetic devices are a different form of therapy than gene therapy, the Pistorius case serves as important procedural precedent for how regulatory agencies might analyze individuals who have been treated using gene therapy. In particular, four elements from the CAS decision are relevant for analyzing future alleged gene doping violations where the enhancement actually results from legitimate gene therapy.

CITATION: Blair H. Moses, Comment, Eligibility of Athletes Receiving Necessary Gene Therapy: The Oscar Pistorius Case as Procedural Precedent, 49 Jurimetrics J. 343–373 (2009).
Global Trends in Online Copyright Enforcement: A Non-Neutral Role for Network Intermediaries?
Jeremy de Beer and Christopher D. Clemmer

This article examines a worldwide shift in laws, policies, and practices pertaining to intermediaries’ role in online copyright enforcement. We use a comparative methodology to expose an emerging trend in jurisdictions, including Australia, Belgium, Canada, China, the European Union, France, Germany, Japan, New Zealand, Singapore, South Korea, the United Kingdom, and the United States.

Previously, the worldwide standard approach to issues of Internet service provider liability was to require carriers and hosts to behave passively until becoming aware of copyright-infringing activities on their networks, at which time a reaction typically involving the takedown of allegedly infringing content was required. Very recent events in several jurisdictions demonstrate a new trend away from a passive-reactive approach toward an active-preventative approach instead. Government policies, voluntary practices, legislative enactments, and judicial rulings are all contributing to this shift in the rules applicable to online intermediaries.

One reason for the shift is increased pressure from rights holders on legislators and policymakers to make intermediaries play a greater role in online copyright enforcement. Another less obvious reason is that intermediaries’ and rights-holders’ interests are aligning. While rights holders are concerned about copyright enforcement and intermediaries are concerned about network management, the result is a mutual interest in content filtering or traffic shaping.

The danger highlighted by this article is that policymakers might inadvertently craft inappropriate legal and regulatory responses by failing to appreciate the divergent motivations behind, and implications following, this trend. To help avoid those pitfalls, this article exposes a new global trend in the area of online copyright enforcement, and it suggests increased coordination among policymakers and affected stakeholders as an appropriate response.

CITATION: Jeremy de Beer and Christopher D. Clemmer, Global Trends in Online Copyright Enforcement: A Non-Neutral Role for Network Intermediaries?, 49 Jurimetrics J. 375–409 (2009).
Where Have All the Women Gone? The Gender Gap in Supreme Court Clerkships
David H. Kaye & Joseph L. Gastwirth

In the world of American law, a Supreme Court clerkship is a position desired by many but attained by few, particularly when it comes to women and minorities. Although women make up nearly half of all law students, they constitute only about a third of all Supreme Court clerks. This article examines the flow of aspiring clerks from law school to the Justices’ chambers in recent years in an effort to locate bottlenecks that lead to this gender gap. It also analyzes whether the Justices, as a group or as individuals, hire fewer women than would be expected in a random draw from a pool of men and women with comparable credentials. We find that some Justices hire fewer women than our simple model would predict, while others hire somewhat more. There are many possible explanations for this pattern, and we discuss several of them.
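
The comparison between observed hiring and a random draw from a credential-matched pool can be sketched with a simple binomial calculation. The counts and pool share in the Python fragment below are hypothetical and do not reproduce the authors’ data or their more refined model.

    # Binomial benchmark for hiring outcomes (hypothetical numbers).
    from math import comb

    def binomial_tail(n, k, p):
        """P(X <= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

    n_clerks = 36            # hypothetical: one Term's worth of clerkships
    observed_women = 12      # roughly a third, as the abstract reports overall
    pool_share_women = 0.45  # hypothetical share of women in the qualified pool

    expected = n_clerks * pool_share_women
    p_value = binomial_tail(n_clerks, observed_women, pool_share_women)
    print(f"Expected about {expected:.1f} women; observed {observed_women}. "
          f"P(this few or fewer under random hiring) = {p_value:.3f}")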

CITATION: David H. Kaye & Joseph L. Gastwirth, Where Have All the Women Gone? The Gender Gap in Supreme Court Clerkships, 49 Jurimetrics J. 411–437 (2009).
Diaz v. Eagle Produce Ltd. Partnership: The Potential for and Limitations of Formal Statistical Analysis to Assist Courts When Drawing Inferences from a Relatively Small Data Set
Joseph L. Gastwirth & Qing Pan

The purpose of this note is to demonstrate that courts ignore useful information when they fail to consider formal statistical analyses of small to moderate-sized data sets. Data from the Ninth Circuit’s reversal of a summary judgment decision against four plaintiffs who complained of age discrimination after being discharged in the context of a seasonal slowdown will be used to illustrate the usefulness of statistical analysis. The appellate decision considered both job performance and other factors specific to each plaintiff and statistical information on 44 employees derived from a chart in evidence. The opinion discounted a difference of 9.5 years between the average age of the 16 discharged employees and the average age of the 16 new hires, because the difference of 9.5 years was less than the court’s requirement of a minimum ten-year difference to demonstrate age discrimination. The court also found that the sample sizes were too small to form a reliable basis for analysis. In addition, the court described patterns of some simple measures, such as the decrease in the average age of new hires over time, but the court did not apply formal statistical methods to analyze either the hiring or the termination data. After giving a brief review of the case and the statistical evidence summarized in the decision, this note applies formal statistical tests to reexamine the data pertaining to the issues of whether there was a decrease in the average age of new hires over time and whether the age of an employee increased his probability of being reduced in force (RIFed) during the time period of most relevance to the case. Because the plaintiffs alleged that a newly hired supervisor was responsible for their being laid off, three time periods were considered in the opinion and turn out to be important in the current analysis: (1) before this supervisor was hired, (2) when he co-supervised the crew, and (3) when he became the crew’s sole supervisor. The results of the analysis presented here show that there was a statistically significant decline in the ages of new hires over time. Although only 16 employees were RIFed during the period, both a graphical display of the data and a formal statistical analysis indicate that the probability of an employee older than fifty years of age being discharged increased noticeably after the new supervisor took over. While a formal test of the layoff data does not quite reach statistical significance, when the formal test is combined with the result of the hiring data using Fisher’s summary Chi-square method, the overall result provides further support for the appellate decision.
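
Fisher’s summary Chi-square method mentioned above combines independent p-values into a single overall test. The Python sketch below shows only the mechanics; the two p-values are placeholders rather than the values computed in the note, and the example assumes the scipy library is available for the chi-square tail probability.

    # Fisher's method for combining independent p-values (placeholder inputs).
    from math import log
    from scipy.stats import chi2

    def fisher_combine(p_values):
        """X = -2 * sum(ln p) is chi-square with 2k degrees of freedom under the null."""
        statistic = -2 * sum(log(p) for p in p_values)
        df = 2 * len(p_values)
        return statistic, chi2.sf(statistic, df)

    p_hiring, p_layoff = 0.01, 0.08  # hypothetical one-sided p-values
    stat, combined_p = fisher_combine([p_hiring, p_layoff])
    print(f"Fisher statistic = {stat:.2f}, combined p-value = {combined_p:.4f}")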

CITATION: Joseph L. Gastwirth & Qing Pan, Diaz v. Eagle Produce Ltd. Partnership: The Potential for and Limitations of Formal Statistical Analysis to Assist Courts When Drawing Inferences from a Relatively Small Data Set, 49 Jurimetrics J. 439–466 (2009).
Post Tiffany (NJ) Inc. v. eBay, Inc.: Establishing a Clear, Legal Standard for Online Auctions
Justin Nicholas Redman

Recent cases such as Tiffany (NJ) Inc. v. eBay, Inc. bring to light an escalating international commercial problem. While the Internet offers tremendous opportunities for increased commerce, it also facilitates the growth of counterfeit sales, global crime, and consumer confusion via online auction houses. With the advent of the Internet, trademark violations and counterfeiting are more difficult to police and, therefore, more frequent. Specifically, Internet auctions, or “virtual flea markets,” feature an endless array of trademarked goods from around the world. These auction sites serve as an intermediary between buyer and seller through which Internet sales of counterfeit goods have skyrocketed. Tiffany, a premier luxury jeweler and specialty retailer offering high quality goods, faced this very problem. Counterfeit jewelry advertised as Tiffany brand was being sold on eBay, an Internet auction house. Tiffany argued that eBay was contributorily liable for trademark infringement because it had knowledge that counterfeits were being sold and failed to take appropriate steps to address the issue. The court disagreed and reasoned that eBay had done enough, under current law, to discourage counterfeit sales, that eBay did not know any specific seller who was selling counterfeits, and that the ultimate burden of policing a trademark rests with the mark owner.

The court was correct in determining that eBay satisfied the current legal standard set out by the Supreme Court in Inwood Laboratories, Inc. v. Ives Laboratories, Inc. Inwood, however, was decided before the Internet was publicly available and goods could be exchanged across such a medium. Without an updated statute or act regarding contributory liability in trademark law, similar to the safe harbor policy set forth by Congress in the Digital Millennium Copyright Act (DMCA), the court ruled that the full burden to police trademarks lies with Tiffany, not eBay. Trademark law simply has not kept up with copyright statutes, bringing into question whether current statutes and case rulings are sufficient in the nearly limitless marketplace of the World Wide Web, where new sites are added daily and the cost to monitor is impossibly high. The Tiffany case exposes the problems and potential unfairness of placing the brunt of the burden of policing trademark infringement on trademark owners in an Internet world.

Interestingly, a predominant focus of the court’s ruling was eBay’s programs to combat infringement, despite the fact that the Inwood test does not require such steps. This note concludes that Tiffany should become the test, with a prescribed set of actions that e-commerce sites must take to thwart counterfeit sales and to remove offending merchandise if they receive notice from trademark owners of infringement. The note explains why Inwood alone is not sufficient for online commerce and further argues that updated statutes are needed to bring trademark law to the same level as copyright law in this area.

CITATION: Justin Nicholas Redman, Note, Post Tiffany (NJ) Inc. v. eBay, Inc.: Establishing a Clear, Legal Standard for Online Auctions, 49 Jurimetrics J. 467–490 (2009).
On the Radar: Government Unmanned Aerial Vehicles and Their Effect on Public Privacy Interests from Fourth Amendment Jurisprudence and Legislative Policy Perspectives
Troy Roberts

The utility of Unmanned Aerial Vehicles (UAVs) in military operations and border security is well documented. As the technology becomes more affordable and available, domestic law enforcement agencies, other state and federal governmental agencies, and private enterprises envisage UAV technology in their future operations. As UAVs are introduced into domestic airspace more generally, they will test the Fourth Amendment’s protection of citizens against unreasonable searches and seizures. Existing Supreme Court cases relevant to the issue of aerial warrantless searches are not ultimately determinative of UAVs’ constitutionality in this regard. The Supreme Court was split in each of these previous cases, which dealt with manned flight, not unmanned flight. It is possible, however, to roughly evaluate the impact of UAV aerial surveillance on citizen privacy in a contemporary timeframe by extrapolating the Court’s logic into the future. Currently, the Federal Aviation Administration (FAA) does not allow generalized UAV flight in national airspace, so the issue is not immediately at hand. The FAA, however, is formulating regulations to admit them, so this is a propitious time to consider how to maintain citizens’ rights to privacy free from government infringement through this new technology. The Fourth Amendment most likely will provide only minimal protections. Thus a responsible legislative and administrative solution is required, one that incorporates accountability and restricts the warrantless use of visual and sensory-enhancing technology while providing necessary but clearly drawn statutory exceptions to the warrant requirement. Otherwise, UAV technology may diminish citizens’ reasonable expectations of privacy.

CITATION: Troy Roberts, Comment, On the Radar: Government Unmanned Aerial Vehicles and Their Effect on Public Privacy Interests from Fourth Amendment Jurisprudence and Legislative Policy Perspectives, 49 Jurimetrics J. 491–518 (2009).
From Free Riders to Fairness: A Cooperative System for Organ Transplantation
Christopher Tarver Robertson

Nearly 100,000 people are waiting for organ transplants, and 7,300 die each year. Meanwhile, useable cadaveric organs are destroyed, since the status quo “altruistic” system for procuring organs fails to cajole a sufficient number of donors. New survey evidence suggests that donor rates could be increased by linking the decisions to give and take organs. Indeed, justice requires such a linkage to prevent free riding. Even on its own terms, the status quo system is unsound. The rhetoric of altruism is philosophically inapposite to cadaveric donation. Rather than autonomous choices, the current system depends on defaults and proxies, which fail to honor choices made. To resolve these problems, I propose a cooperative system where individuals would decide, from behind a veil of ignorance, whether to be both potential suppliers and potential recipients of cadaveric organs. Such a system would respect autonomy, resolve the injustice, and minimize the shortage.

CITATION: Christopher Tarver Robertson, From Free Riders to Fairness: A Cooperative System for Organ Transplantation, 48 Jurimetrics J. 1–41 (2007).
Why a Conviction Should Not Be Based on a Single Piece of Evidence: A Proposal for Reform
Boaz Sangero and Mordechai Halpert

This article illustrates a serious flaw in the conventional legal approach that enables a conviction based solely on one piece of evidence. This flaw derives from a cognitive illusion referred to as “the fallacy of the transposed conditional.” People might assume that a low error rate in the evidence leads to only a small percentage of wrongful convictions. We show that, counterintuitively, even a very low error rate might lead to a wrongful conviction in most cases where the conviction is based on a single piece of evidence. Case law has indicated some awareness of this fallacy, primarily when considering the random match probability for DNA evidence. However, there is almost no awareness of the significance of this fallacy in assessing other types of evidence not considered probabilistic or of the significance of laboratory errors in DNA testing. We show that mistakes do happen with all key types of evidence: fingerprints, DNA, confessions, and eyewitness testimony. We then demonstrate the tremendous impact that even a small probability of error has on our confidence in a conviction. In the end, we propose legislative reform that would make it impossible to convict someone on the basis of any single piece of evidence linking him to a criminal offense.
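
The fallacy can be made concrete with a short numerical illustration. The prior probability and error rate in the Python fragment below are hypothetical and are not drawn from the article; they simply show how a small error rate can coexist with a large probability of innocence given the evidence.

    # The fallacy of the transposed conditional, numerically (hypothetical figures).

    prior_guilt = 1 / 10_000         # suspect drawn from a large pool
    p_match_if_guilty = 1.0          # the evidence always incriminates the true offender
    p_match_if_innocent = 1 / 1_000  # "error rate": false incrimination of an innocent person

    # P(guilt | incriminating evidence) by Bayes' rule
    numerator = prior_guilt * p_match_if_guilty
    denominator = numerator + (1 - prior_guilt) * p_match_if_innocent
    posterior_guilt = numerator / denominator

    print(f"Error rate = 0.1%, yet P(innocence | evidence) = {1 - posterior_guilt:.0%}")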

CITATION: Boaz Sangero and Mordechai Halpert, Why a Conviction Should Not Be Based on a Single Piece of Evidence: A Proposal for Reform, 48 Jurimetrics J. 43–94 (2007).
LabCorp v. Metabolite: Revisiting the Law of Nature Doctrine 25 Years after Diehr
David W. Klinger

In LabCorp v. Metabolite, 126 S. Ct. 2921 (2006), a patent for a process of determining vitamin deficiencies was challenged as an invalid attempt to patent a law of nature. LabCorp’s argument that the patent was an invalid attempt to patent a law of nature under § 101 of the Patent Act was first raised at the Supreme Court. The U.S. Supreme Court originally agreed to hear the case but, after oral argument, dismissed it, simply stating that certiorari had been improvidently granted. Although the patent is arguably valid under Federal Circuit case law, because the Supreme Court has not heard a case involving the patentability of laws of nature in twenty-five years, it is unclear whether the Court would adopt the Federal Circuit’s current expansive view of patentable subject matter, and thus uphold the validity of the patent, or narrow the scope of patentable subject matter based on its own precedent and likely invalidate the patent. Regardless, the Court was correct not to decide the case because LabCorp waived its argument that the patent was an invalid attempt to patent a law of nature under § 101.

CITATION: David W. Klinger, Note, LabCorp v. Metabolite: Revisiting the Law of Nature Doctrine 25 Years after Diehr, 48 Jurimetrics J. 95–117 (2007).
Coverage and Reimbursement for Pharmacogenomic Testing
Robert J. Milligan

Since the completion of the Human Genome Project there has been a great deal of publicity and discussion regarding genetic medicine. Genetic advances can be divided into two main categories: diagnostics and therapeutics. Although stem cell therapy and other genetic therapies have gotten most of the publicity relating to genetic advances, the diagnostic field in general and pharmacogenomic testing in particular are likely to have a greater impact on medicine in the near- to mid-term. The integration of pharmacogenomics into clinical practice is subject to several challenges, however, including the challenge of persuading government and commercial payors that they should provide reimbursement for pharmacogenomic testing and ensuring that the amount of reimbursement will make the development of these tests feasible.

This paper provides an overview of genetic testing and a brief discussion of pharmacogenomics. It also discusses the in vitro diagnostic test (IVD) industry, of which pharmacogenomic testing is a segment. Subsequent sections provide an overview of government and private payor coverage and reimbursement processes. The paper then identifies issues likely to influence coverage and reimbursement decisions for pharmacogenomic products and concludes with suggestions on how to improve the prospects for favorable coverage and reimbursement decisions.

CITATION: Robert J. Milligan, Coverage and Reimbursement for Pharmacogenomic Testing, 48 Jurimetrics J. 137–165 (2008).
The Washington University v. Catalona: Determining Ownership of Genetic Samples
Scott F. Gibson

The ownership of tissue samples donated for medical research is an ongoing subject of dispute. Some advocates assert that patients have ongoing ownership rights in their tissues, including an unfettered right to determine what happens to their tissue sample. Researchers argue that giving patients property rights in their samples will turn the human body into a commodity and bring research to a screeching halt. One thing is certain: the creation of commercial products from human tissue has generated very difficult legal and ethical questions that have no clear, universally accepted answers. When those questions have come up in litigation, the courts have struggled to adapt the tradition and precedent of the law to the challenges arising from the biotech era. The case of The Washington University v. Catalona is the most recent instance of a court seeking to resolve this dilemma.

CITATION: Scott F. Gibson, The Washington University v. Catalona: Determining Ownership of Genetic Samples, 48 Jurimetrics J. 167–191 (2008).
The Tissue Issue: A Wicked Problem
Sharon Lewis

Over three decades ago, Horst Rittel and Melvin Webber created a theoretical analytic structure for certain types of difficult and challenging problems, identifying them as “Wicked Problems.” This theory has seen only limited application in the social sciences and has rarely been applied to the question of how laws, regulations, or ethics affect these types of problems. This article looks at one of the most basic elements necessary for the development and advancement of personalized medicine, tissue, through the Wicked Problem analytic structure. The legal, regulatory, and ethical issues inherent in sharing, using, and maintaining tissue are complex, transcend national borders, and are anything but settled. The analysis highlights the nature of the discrepancies and disconnects between cultures, practices, and institutions and identifies difficulties often encountered in attempting to solve or “tame” such problems.

CITATION: Sharon Lewis, The Tissue Issue: A Wicked Problem, 48 Jurimetrics J. 193–215 (2008).
The FDA and Regulation of Genetic Tests: Building Confidence and Promoting Safety
Gregorio M. Garcia

The Food and Drug Administration recently announced that certain complex genetic tests developed and performed by clinical laboratories are medical devices subject to regulation. These complex genetic tests, often referred to as “home brew” tests, employ sophisticated algorithms and proprietary components in the prediction or diagnosis of disease or other conditions. There is no assurance that such home brew tests are safe and effective. In contrast, genetic test kits that are distributed for use outside of the developing laboratory must be proven to be safe and effective prior to distribution. Nonetheless, patients and medical care providers rely upon the outcome of home brew genetic tests in making health-care decisions. The proposed regulation would require proof that a subset of home brew genetic tests is safe and effective, but it fails to address the majority of home brew genetic tests. The regulation falls short of promoting safety and creates uncertainty in the market for home brew genetic tests.

CITATION: Gregorio M. Garcia, The FDA and Regulation of Genetic Tests: Building Confidence and Promoting Safety, 48 Jurimetrics J. 217–239 (2008).
Tiered Consent and the Tyranny of Choice
Natalie Ram

Regulations and doctrine governing human tissue research are facing immense pressure to ensure respect for the interests of tissue providers and of researchers. Tiered consent presents tissue providers with a menu of research categories to which they may consent, and it is a recognized best practice. Yet, evidence in consumer psychology suggests that abundant choice causes decision makers to experience information overload, make arbitrary choices, refrain from choosing altogether, and experience regret following decision making. These patterns result in systematically lower quality decision making. This article fleshes out the potential limitations of expanded choice in tiered consent situations so that use of this best practice, and the laws and doctrine governing it, best approaches the ethical paradigm of informed consent.

CITATION: Natalie Ram, Tiered Consent and the Tyranny of Choice, 48 Jurimetrics J. 253–284 (2008).
State v. Karl: An Unreasonable Rejection of the Learned Intermediary Doctrine
Jerica L. Peters

In State ex rel. Johnson & Johnson v. Karl, 647 S.E.2d 899 (W. Va. 2007), the West Virginia Supreme Court declined to adopt the learned intermediary doctrine as an exception to the general rule that manufacturers have a duty to warn consumers about the risks of their products. Pursuant to the doctrine, pharmaceutical manufacturers can fulfill this duty for prescription drugs by providing an adequate warning to the patients’ physicians. The West Virginia Supreme Court held that the justifications for the learned intermediary doctrine were no longer valid. The court concluded that, due primarily to their use of direct-to-consumer advertisements, drug manufacturers have reduced the role of physicians in the selection of prescription medications for their patients. The court’s reasoning was flawed because it rejected the learned intermediary doctrine mainly in response to the increased use of advertising. The court ignored the beneficial aspects of the doctrine. Specifically, the learned intermediary doctrine (1) encourages the informed consent process between physician and patient, (2) recognizes that medications are becoming increasingly personalized, and (3) eliminates needless litigation over the content, format, and adequacy of consumer warnings.

CITATION: Jerica L. Peters, Note, State v. Karl: An Unreasonable Rejection of the Learned Intermediary Doctrine, 48 Jurimetrics J. 285–308 (2008).
Banning Tax Strategy Patents—Should We Listen to the Tax Practitioners?
Tinna C. Otero

The Federal Circuit issued a landmark decision in State Street Bank & Trust Co. v. Signature Financial Group, Inc., 149 F.3d 1368 (Fed. Cir. 1998), upholding a patent on a business method. Patent law encourages innovation; however, if an industry encourages development through other mechanisms, patents in that industry are extraneous. Tax practitioners and accounting firms are concerned about the detrimental effects that certain business method patents, specifically tax strategy patents, will have on their industry. This note explores the current state of tax strategy patents; current views of tax practitioners, patent attorneys, and the IRS; and pending legislation. It pays particular attention to the reasoning provided by the leaders of the tax industry, namely tax practitioners and accounting firms, and concludes that the pending legislation that would ban tax strategy patents is the preferable course of action.

CITATION: Tinna C. Otero, Note, Banning Tax Strategy Patents—Should We Listen to the Tax Practitioners?, 48 Jurimetrics J. 309–328 (2008).
On a Mathematical Argument for Splitting the Ninth Circuit
D.H. Kaye

For more than 30 years, there have been calls to split the Ninth Circuit Court of Appeals into two smaller courts. This essay examines a recent mathematical argument, based on sampling theory, for splitting the court.

CITATION: D.H. Kaye, On a Mathematical Argument for Splitting the Ninth Circuit, 48 Jurimetrics J. 329–336 (2008).
Warshak v. United States: Fourth Amendment Risk Analysis in the 21st Century
Gustavo Enrique Schneider

In Warshak v. United States, the Sixth Circuit Court of Appeals upheld a district court injunction prohibiting the government from acting upon a magistrate’s order authorizing the seizure of e-mails stored by an Internet Service Provider (ISP) without prior notice to the e-mail account holder. In its analysis, the court relied on analogies between e-mails and postal letters in holding that portions of the Stored Communications Act (SCA) violated an e-mail account holder’s reasonable expectation of privacy by allowing the government to gain access to the contents of his e-mail messages on less than probable cause and without due process of law. The court’s decision suggests that the degree to which an ISP monitors an account holder’s e-mails during the daily course of business is dispositive on the issue of whether the subscriber has a reasonable expectation of privacy and is thus entitled to appropriate due process. A standard that allows ISPs to determine the subscriber’s expectation of privacy may be problematic because, in theory, an ISP is free to monitor everything a subscriber does, or conversely, to monitor nothing a subscriber does. Neither extreme is palatable. The latter approach would clearly frustrate the goals of the SCA, while the former would eliminate Internet privacy. Given the importance of developing a workable Fourth Amendment e-mail communications standard that balances governmental and private interests, the Sixth Circuit vacated its decision and granted rehearing en banc. Upon rehearing, the Sixth Circuit should set a bright-line rule to govern e-mail privacy that is deferential to the terms of service contract between the subscriber and the ISP. To defeat a presumption of privacy, the court should require specific language to that effect in the contract between the parties; in its absence, the user’s communications should be presumed to be private for purposes of the Fourth Amendment.

CITATION: Gustavo Enrique Schneider, Note, Warshak v. United States: Fourth Amendment Risk Analysis in the 21st Century, 48 Jurimetrics J. 357–377 (2008).
Using Hindsight in Determining Patent Obviousness: Observations on PharmaStem v. Viacell
Yu Cai

In PharmaStem v. Viacell, 491 F.3d 1342 (Fed. Cir. 2007), the Court of Appeals for the Federal Circuit ruled that PharmaStem’s patents relating to a medical procedure using umbilical cord blood were invalid for obviousness because (1) the prior art suggested the idea of using umbilical cord blood for human reconstitution and (2) the patent specification admitted that it was previously known that cord blood contained stem cells. The Federal Circuit held that there was a reasonable expectation of success in carrying out this idea. This paper points out the technical error the court made in reading the patent specification; it also shows that the court used hindsight to find that the prior prediction had given a reasonable expectation of success. The paper then suggests a new approach applicable to patents in the life sciences, which uses the advancement of the science to test whether a prior prediction is sufficient to render an invention invalid on the ground of obviousness.

CITATION: Yu Cai, Note, Using Hindsight in Determining Patent Obviousness: Observations on PharmaStem v. Viacell, 48 Jurimetrics J. 379–408 (2008).
In re Omeprazole Patent Litigation: Misapplication of Inherent Anticipation Opens the Door to Future Speculation in Patent Protection
Michelle L. Gross

In In re Omeprazole Patent Litigation, 483 F.3d 1364 (Fed. Cir. 2007), the United States Court of Appeals for the Federal Circuit affirmed the district court’s ruling that pharmaceutical manufacturer Astra’s ’281 patent was invalid because of inherent anticipation by Chong Kun Dan Corp.’s (CKD’s) prior Korean patent. The court relied on nonanalogous precedent, erroneously reasoned that newly discovered results of known processes are not patentable because those results are inherent in the known processes, and overlooked a significant difference between Astra’s and CKD’s claimed processes. The court incorrectly applied the infrequently utilized doctrine of inherent anticipation based on illogical reasoning, evidentiary errors, and its incorrect use of a trade secret as prior art. Finding inherent anticipation under these circumstances is inconsistent with the definition of acceptable forms of prior art and the enablement requirement of the Patent Act.

CITATION: Michelle L. Gross, Note, In re Omeprazole Patent Litigation: Misapplication of Inherent Anticipation Opens the Door to Future Speculation in Patent Protection, 48 Jurimetrics J. 409–425 (2008).
Patentable Subject Matter: Separating Abstract Ideas and Laws of Nature from Patentable Inventions
Bryan Treglia

The recent Supreme Court case of Lab. Corp. v. Metabolite, 126 S. Ct. 2921 (2006), created doubt concerning the patentability of the measurement plus formula class of inventions. The patent at issue contained a claim appearing to be nothing more than a measurement step tacked onto an algorithm. Certiorari was ultimately dismissed as being improvidently granted, but the three-Justice dissent argued that the case should be heard and that the claim was nothing more than an attempt to gain a monopoly over a law of nature. More recently, the Federal Circuit in In re Comiskey, 499 F.3d 1365 (Fed. Cir. 2007), placed inventions involving purely mental steps outside of patentable subject matter. The reasoning in Comiskey can be extended to create a specific solution for the Lab. Corp. class of inventions and a general solution for separating abstract ideas from patentable subject matter. The solution is to require that the class of inventions represented by Lab. Corp. operate on tangible objects, as opposed to abstract ideas. More specifically, the test requires that the class of inventions involve a physical measurement of a tangible thing. This solution adds an important limitation to the Lab. Corp. class of inventions, fills a logical gap in the current jurisprudence, satisfies the relevant policy goals, addresses the concern of the Lab. Corp. dissent, and provides a general framework for separating abstract ideas from patentable subject matter.

CITATION: Bryan Treglia, Comment, Patentable Subject Matter: Separating Abstract Ideas and Laws of Nature from Patentable Inventions, 48 Jurimetrics J. 427–455 (2008).
The Disappearance that Wasn’t? “Random Variation” in the Number of Women Supreme Court Clerks
D.H. Kaye & Joseph L. Gastwirth

In the world of American law, a Supreme Court clerkship is a position desired by many but attained by few. In the summer of 2006, news reports revealed that only seven out of the 37 clerks hired—a mere 19 percent—were women. This outcome represented a dramatic 50 percent drop from preceding years. Yet, two Justices portrayed the change as the result of “random variation,” a claim that struck many observers at the time as incredible. This essay applies standard statistical reasoning to analyze what the dip in 2006 might indicate. We show that the year’s decline in women, considered as one point in a time series, was not so improbable after all.
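
A rough sense of the arithmetic can be conveyed with a minimal binomial sketch. This is not the authors' time-series analysis; the baseline rate below is a hypothetical input (roughly double the observed 19 percent), and the single-year tail probability it produces understates the chance of ever observing such a dip across many years of hiring, which is closer to the point the essay makes.

```python
from math import comb

# Hypothetical inputs: 37 clerks hired, 7 of them women, and an assumed
# baseline probability that any given hire is a woman, roughly double
# the observed 19 percent. These are illustrative numbers only.
n_hires = 37
observed_women = 7
p_baseline = 0.37

# P(X <= 7) when X ~ Binomial(37, 0.37): the chance of a single year this
# low (or lower) arising by luck alone under the assumed baseline rate.
prob_low_year = sum(
    comb(n_hires, k) * p_baseline**k * (1 - p_baseline)**(n_hires - k)
    for k in range(observed_women + 1)
)
print(f"P(at most {observed_women} women among {n_hires} hires) = {prob_low_year:.3f}")
```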

CITATION: D.H. Kaye & Joseph L. Gastwirth, The Disappearance that Wasn’t? “Random Variation” in the Number of Women Supreme Court Clerks, 48 Jurimetrics J. 457–463 (2008).
Embryonic Histrionics: A Critical Evaluation of the Bush Stem Cell Funding Policy and the Congressional Alternative
Russell Korobkin

For the past decade, the debate over federal funding of human embryonic stem cell (hESC) research has dominated the national discussion of biomedical science policy. In 2001, President Bush announced the federal government would place significant limitations on funding of hESC research. In 2006, Congress passed legislation to overrule this decision but failed to override a presidential veto. This article goes beyond the headlines to critique the assumptions and the internal logic of the president's position and Congress' failed alternative, finding that neither constitutes a logically coherent public policy. It also evaluates the impact that the Bush policy has had on scientific progress in the field.

CITATION: Russell Korobkin, Embryonic Histrionics: A Critical Evaluation of the Bush Stem Cell Funding Policy and the Congressional Alternative, 47 Jurimetrics J. 1–29 (2006).
Google the Gozerian and Fair Use Slimed: Copyright Again in the Technocrat’s Den
Brian Sites

This article considers the fair use doctrine as it applies to Google’s Library Search Project and both predicts and advocates for a finding of fair use. Part I briefly reviews the past by considering the pertinent history of the fair use doctrine. It also explains the details of the current suit over Google’s Library Project. Part II moves on to consider the current state of fair use analysis by reviewing 110 fair use cases and conducting simple statistical analyses. It then explains and applies the fair use doctrine to Google’s project. Part III considers cases frequently compared to Google’s and discusses their impact on Google’s lawsuit. Part IV departs from a Google-centered analysis and examines the possible future of the four factors by suggesting modifications to the fair use doctrine.

CITATION: Brian Sites, Google the Gozerian and Fair Use Slimed: Copyright Again in the Technocrat’s Den, 47 Jurimetrics J. 31–87 (2006).
The Current State of Bullet-Lead Evidence
D.H. Kaye

This article describes recent opinions on the admissibility of bullet-lead evidence. In 2004, the National Research Council released a report identifying certain problems with this type of evidence and making recommendations for improvements in the analytical procedure, the interpretation of a match among bullet fragments, and the presentation of the laboratory findings in court. Even if the testimony were as circumspect as called for in the NRC report, however, it might not have sufficient value to the jury to warrant its admission. In any event, the FBI has stopped performing the procedure, and it appears that in any retrials involving previous testimony, the courts will view the past testimony of FBI examiners more skeptically.

CITATION: D.H. Kaye, The Current State of Bullet-Lead Evidence, 47 Jurimetrics J. 99–114 (2006).
Ecological Inference in Voting Rights Act Disputes: Where Are We Now, and Where Do We Want to Be?
D. James Greiner

Recent developments in law and in quantitative methods have combined to place greater emphasis on coherent and accurate techniques of drawing inferences about racial voting patterns in Voting Rights Act litigation. In this article, I examine the challenge the secret ballot poses for such inferences; I then discuss four so-called “ecological inference” methods designed to address the issue. I argue that the two techniques that have dominated this field for more than 20 years should be abandoned; that a third, well-publicized method should be used only when no other is feasible; and that a fourth, while representing the state of the art at present, has shortcomings that researchers should address. To aid in understanding, I apply each technique outlined to a concrete data set. I conclude with a discussion of what makes a good method.

CITATION: D. James Greiner, Ecological Inference in Voting Rights Act Disputes: Where Are We Now, and Where Do We Want to Be?, 47 Jurimetrics J. 115–167 (2007).
Just Reasonable: Can Linguistic Analysis Help Us Know What It Is to Be Reasonable?
Michel Paradis

This study provides a linguistic analysis of the use of the word “reasonable” in the law. It approaches the problem of legal vagueness in an experimental way in order to test the assumption that vague rules are unpredictable and leave the resolution of cases to extrinsic influences, such as political ideology or personal prejudices. Through the quantitative measurement of how “reasonable” is applied by judges and employed in legislation, we found considerable regularity in its interpretation and usage. The reason for this regularity, it is argued, stems from its ability to trigger associations of expectation and the robustness of associative judgment in resolving problems of greater complexity.

CITATION: Michel Paradis, Just Reasonable: Can Linguistic Analysis Help Us Know What It Is to Be Reasonable?, 47 Jurimetrics J. 169–191 (2007).
A Default-Logic Paradigm for Legal Fact-Finding
Vern R. Walker

Unlike research in linguistics and artificial intelligence, legal research has not used advances in logical theory very effectively. This article uses default logic to develop a paradigm for analyzing the reasoning behind legal fact-finding. The article provides a formal model that integrates legal rules and policies with the evaluation of both expert and nonexpert evidence—whether the fact-finding occurs in courts or administrative agencies, and whether in domestic, foreign, or international legal systems. This paradigm can standardize the representation of fact-finding reasoning, guide empirical research into the dynamics of such reasoning, and help prepare such representations and research results for automation through artificial intelligence software. This new model therefore has the potential to transform legal practice and legal education, as well as legal theory.
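
The article's formal model is not reproduced here. The toy sketch below conveys only the general flavor of default (defeasible) reasoning, in which a conclusion is drawn when a rule's premises are established and none of its listed exceptions is proven; the rule names and facts are invented for illustration.

```python
# Toy defeasible-rule sketch (not the article's paradigm): a conclusion is
# drawn by default when a rule's premises hold and no exception is proven.
facts = {"signed_writing", "consideration"}            # hypothetical evidence
rules = [
    {"if": {"signed_writing", "consideration"},        # premises
     "unless": {"duress", "fraud"},                     # defeaters
     "then": "enforceable_contract"},
    {"if": {"enforceable_contract", "nonperformance"},
     "unless": {"impossibility"},
     "then": "breach"},
]

changed = True
while changed:                                          # naive forward chaining
    changed = False
    for rule in rules:
        if (rule["if"] <= facts
                and not (rule["unless"] & facts)
                and rule["then"] not in facts):
            facts.add(rule["then"])
            changed = True

print(sorted(facts))   # proving duress or fraud would block the contract finding
```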

CITATION: Vern R. Walker, A Default-Logic Paradigm for Legal Fact-Finding, 47 Jurimetrics J. 193–243 (2007).
Homeland Security, Law, and Policy Through the Lens of Critical Infrastructure and Key Asset Protection
Joe D. Whitley, George A. Koenig, and Steven E. Roberts

Homeland security continues to be one of the principal priorities of government at all levels. Homeland security, however, is not static. What gets protected, how resources are allocated, and the manner in which threats are identified continue to evolve. In particular, critical infrastructure and key asset protection are fundamental components of homeland security greatly influenced by developments in law and policy.

CITATION: Joe D. Whitley, George A. Koenig, and Steven E. Roberts, Homeland Security, Law, and Policy Through the Lens of Critical Infrastructure and Key Asset Protection, 47 Jurimetrics J. 259–279 (2007).
Teaching New Dogs Old Tricks: Reshaping the Department of Homeland Security’s Technology Development Infrastructure
Michael Greenberger

This article discusses the Department of Homeland Security’s (DHS’s) use of technology to help fight the war on terror. First, this article reveals how DHS has made little progress in encouraging the development of important technology, despite receiving ample resources from Congress to do so. Second, this article looks to the Office of War Mobilization’s (OWM) work during World War II as a possible template for galvanizing the Nation’s technological talent and resources to fight terror. Third, this article suggests a program for refining the OWM template to meet modern-day needs. In this regard, DHS is the “new dog” that should be “taught” the “old tricks” that so ably helped this country during World War II.

CITATION: Michael Greenberger, Teaching New Dogs Old Tricks: Reshaping the Department of Homeland Security’s Technology Development Infrastructure, 47 Jurimetrics J. 281–296 (2007).
Countering Terrorism with Cyber Security
Jody R. Westby

Terrorist cells in more than 60 countries are using information and communication technologies (ICTs) to recruit and spread propaganda, raise money, train jihadists, and communicate and conspire. Their high-tech strategy outpaces that of the United States and other countries, which still rely upon traditional strategies aimed at tracking, capturing, and killing an enemy. The three problems that most frustrate governments in countering terrorists' use of ICTs are (1) difficulties in tracking and tracing cyber communications, (2) the lack of harmonized laws and procedures in investigating cybercrimes, and (3) inadequate or ineffective information sharing. Governments around the globe need to address these three areas of cyber security if they hope to win the battle against terror.

CITATION: Jody R. Westby, Countering Terrorism with Cyber Security, 47 Jurimetrics J. 297–313 (2007).
The Potential for an International Legal Approach to Critical Information Infrastructure Protection
David Satola and William J. Luddy, Jr.

This article reviews a wide range of global efforts (and their limitations) to achieve cyber security (security of critical information infrastructure) through cross-border collaboration in both the private and public sectors, involving technologists, lawyers, and business process managers. The importance of cyber security extends beyond any government’s legitimate concern for national security. There are significant interrelationships between terrorism, cybercrime, economic and human development, critical infrastructure protection, network security, regulatory reform, and Internet governance.

Technology and the legal and policy rationales behind its use on a global level, for both the public and private sectors, are the starting points for a fuller understanding of the work undertaken both domestically and internationally to provide “homeland security.” The analysis is intended to raise awareness and posit the need for further international cooperation, coordination, and collaboration focusing on the legal aspects of critical information infrastructure protection that implicate and are affected by technology in the field.

CITATION: David Satola and William J. Luddy, Jr., The Potential for an International Legal Approach to Critical Information Infrastructure Protection, 47 Jurimetrics J. 315–333 (2007).
Critical Issues in Identity Management-Challenges for Homeland Security
Lucy L. Thomson

Identity management has emerged as one of the most important and difficult homeland security challenges. This paper highlights the critical legal and policy challenges surrounding identity management and describes the recommendations and federal homeland security legislation that are driving proposed solutions. It describes the search for consensus among business leaders and government both in the United States and internationally. Significant issues about the nature of identity and the control individuals should have over their own personal information are at the heart of the debate about how to combat terrorism and keep the nation safe. The resolution of these issues will shape the relationship between individuals, business, and government for many years to come.

CITATION: Lucy L. Thomson, Critical Issues in Identity Management-Challenges for Homeland Security, 47 Jurimetrics J. 335–356 (2007).
The CSI Effect: Popular Fiction About Forensic Science Affects the Public’s Expectations About Real Forensic Science
N.J. Schweitzer and Michael J. Saks

Two of a number of hypotheses loosely referred to as the CSI Effect suggest that the television program and its spin-offs, which wildly exaggerate and glorify forensic science, affect the public, and in turn affect trials either by (a) burdening the prosecution by creating greater expectations about forensic science than can be delivered or (b) burdening the defense by creating exaggerated faith in the capabilities and reliability of the forensic sciences. The present study tested these hypotheses by presenting to mock jurors a simulated trial transcript that included the testimony of a forensic scientist. The case for conviction was relatively weak, unless the expert testimony could carry the case across the threshold of reasonable doubt. In addition to reacting to the trial evidence, respondents were asked about their television viewing habits. Compared to non-CSI viewers, CSI viewers were more critical of the forensic evidence presented at the trial, finding it less believable. Regarding their verdicts, 29% of non-CSI viewers said they would convict, compared to 18% of CSI viewers (not a statistically significant difference). Forensic science viewers expressed more confidence in their verdicts than did non-viewers. Viewers of general crime programs, however, did not differ significantly from their non-viewing counterparts on any of the other dependent measures, suggesting that skepticism toward the forensic science testimony was specific to those whose diet consisted of heavy doses of forensic science television programs.
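
Because the abstract reports the 29 percent and 18 percent conviction rates without group sizes, the sketch below uses hypothetical groups of 100 mock jurors each simply to illustrate how a difference of that size can fall short of statistical significance; it is not a reanalysis of the study's data.

```python
from math import sqrt, erf

# Hypothetical group sizes; the study's actual ns are not given in the abstract.
n_non_csi, n_csi = 100, 100
convict_non_csi, convict_csi = 29, 18     # 29% vs. 18% conviction rates

p1, p2 = convict_non_csi / n_non_csi, convict_csi / n_csi
p_pool = (convict_non_csi + convict_csi) / (n_non_csi + n_csi)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_non_csi + 1 / n_csi))
z = (p1 - p2) / se
p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # normal tail area

print(f"z = {z:.2f}, two-sided p = {p_two_sided:.3f}")       # about 0.07 here
```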

CITATION: N.J. Schweitzer and Michael J. Saks, The CSI Effect: Popular Fiction About Forensic Science Affects the Public’s Expectations About Real Forensic Science, 47 Jurimetrics J. 357–364 (2007).
Network Neutrality and the Economics of an Information Superhighway: A Reply to Professor Yoo
Brett M. Frischmann and Barbara van Schewick

Network neutrality has received a great deal of attention recently, not just from legal academics and telecommunications experts, but from our elected representatives, the relevant agencies, and the press. Our article directly replies to a series of articles published by Professor Christopher Yoo on this topic. Yoo’s scholarship has been very influential in shaping one side of the debate. In our article, we explain the critical flaws in Yoo’s arguments and present a series of important arguments that he and most other opponents of network neutrality regulation ignore.

CITATION: Brett M. Frischmann and Barbara van Schewick, Network Neutrality and the Economics of an Information Superhighway: A Reply to Professor Yoo, 47 Jurimetrics J. 383–428 (2007).
Intel Corp. v. Commonwealth Scientific & Industrial Research Organisation: Can Equity Step in Where Public Standards and the Patent System Seem at Odds?
Bruce A. Wagar

In Intel Corp. v. Commonwealth Scientific & Industrial Research Organisation, Inc., 455 F.3d 1364 (Fed. Cir. 2006), the Federal Circuit dismissed an Australian agency's interlocutory appeal claiming sovereign immunity. The underlying facts, however, paint an intriguing conflict between two institutions: public standards and the patent system. Australia's national science agency holds a patent on wireless network technology that every major U.S. computer company employs. Six years after the patented technology apparently became part of the underlying wireless network standard, the agency contacted numerous U.S. companies, alleging infringement without a license and seeking licensing royalties. As a result, six large computer companies initiated declaratory judgment actions to settle the infringement issue. In patent infringement cases, laches and equitable estoppel evolved as equitable defenses to patentees who slept on their rights or misrepresented their intended enforcement, and prejudiced potential infringers as a result. This note suggests that for policy reasons courts should accept these defenses more freely in patent infringement cases involving public standards.

CITATION: Bruce A. Wagar, Note, Intel Corp. v. Commonwealth Scientific & Industrial Research Organisation: Can Equity Step in Where Public Standards and the Patent System Seem at Odds?, 47 Jurimetrics J. 429–439 (2007).
Rescuecom Corp. v. Google Inc. and 800-JR Cigar v. Goto.com: Reaching a Fair Result in Keyword Triggered Advertising and Trademark Cases
Ian Gillies

The issue of whether search engine proprietors can sell the rights to use registered trademarks as keywords to trigger sponsored advertisements to a mark owner's competitors without violating the Lanham Act is unsettled. Recent cases demonstrate conflicting outcomes and policy rationales in this area, but in combination, the cases offer a potentially fair resolution to the infringement question. A useful framework for resolving these cases and a fair compromise for both search engine proprietors and trademark owners emerges from combining the narrow view of trademark use articulated in Rescuecom Corp. v. Google, Inc., 456 F. Supp. 2d 393 (N.D.N.Y. 2006) with the court's analysis of contributory trademark infringement and the distinction it draws between the search engine's advertising methods and displayed advertising content, as suggested in 800-JR Cigar, Inc. v. Goto.com, 437 F. Supp. 2d 273 (D.N.J. 2006).

CITATION: Ian Gillies, Note, Rescuecom Corp. v. Google Inc. and 800-JR Cigar v. Goto.com: Reaching a Fair Result in Keyword Triggered Advertising and Trademark Cases, 47 Jurimetrics J. 441–462 (2007).
American Council on Education v. FCC: Proper Outcome, Lack of Clarity in the Interpretation of CALEA
Sara E. Dirvianskis

In American Council on Education v. Federal Communications Commission, 451 F.3d 226 (D.C. Cir. 2006), the Court of Appeals for the D.C. Circuit upheld the FCC's interpretation that broadband Internet and Voice-over Internet Protocol (VoIP) providers fit the definition of "telecommunications carriers" under the Communications Assistance for Law Enforcement Act (CALEA) and therefore must make changes to allow law enforcement the ability to easily intercept communications over their networks. Although the court was correct to uphold the FCC's ultimate determination that broadband and VoIP providers are subject to CALEA, neither the FCC nor the court provided clear explanation of key terms and statutory constructions, creating further confusion about how the statute should be interpreted and applied.

CITATION: Sara E. Dirvianskis, Note, American Council on Education v. FCC: Proper Outcome, Lack of Clarity in the Interpretation of CALEA, 47 Jurimetrics J. 463–477 (2007).
The Importance of Checking the Assumptions Underlying Statistical Analysis: Graphical Methods for Assessing Normality
Yulia Gel, Weiwen Miao, and Joseph L. Gastwirth

Statistical methods, in general, rely on a variety of assumptions about the nature of the underlying data. When the data do not meet those assumptions, the results often are not valid. Therefore, it is important for courts assessing the evidentiary relevance of statistical studies to check that those assumptions are satisfied, at least approximately. This article describes a new robust graphical method for checking that data come from a normal distribution, or "bell" curve.
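
The authors' new robust method is not reproduced here. The sketch below shows only the conventional normal quantile-quantile (Q-Q) plot on which such graphical checks build, applied to simulated data.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

rng = np.random.default_rng(0)
data = rng.normal(loc=50, scale=10, size=200)   # simulated sample

# Conventional normal Q-Q plot: ordered sample values against the quantiles
# of a normal distribution fitted by the sample mean and standard deviation.
sample_q = np.sort(data)
probs = (np.arange(1, data.size + 1) - 0.5) / data.size
theor_q = norm.ppf(probs, loc=data.mean(), scale=data.std(ddof=1))

plt.scatter(theor_q, sample_q, s=10)
plt.plot([theor_q.min(), theor_q.max()],
         [theor_q.min(), theor_q.max()], color="red")   # 45-degree reference line
plt.xlabel("Theoretical normal quantiles")
plt.ylabel("Sample quantiles")
plt.title("Points falling near the line are consistent with normality")
plt.show()
```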

CITATION: Yulia Gel, Weiwen Miao, and Joseph L. Gastwirth, The Importance of Checking the Assumptions Underlying Statistical Analysis: Graphical Methods for Assessing Normality, 46 Jurimetrics J. 3–29 (2005).
Assessing Spatial Heterogeneity in the Refractive Index of Float Glass
Geva Maimon, Russell J. Steele, and James M. Curran

The intrinsic variability of the refractive index within a pane of glass is important in the statistical interpretation of forensic glass evidence. Spatial structure could drastically alter the mode of analysis or severely restrict the generalization in interpretation. This article proposes two methods for determining the amount of spatially structured variability of the refractive index in a pane of float glass. The first method is a standard Bayesian spatial analysis that allows one to infer whether there is evidence of spatially structured variability within a pane. Unfortunately, no formal notion of statistical significance can be attached to this standard analysis. Second, we construct a Monte Carlo hypothesis test to test the null hypothesis that there is no spatial structure of the refractive index within the pane of glass. In addition, we present a simulation study that shows one proposed test quantity to be a good gauge of spatial structure when used in conjunction with a Monte Carlo hypothesis test.
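
Neither the Bayesian spatial model nor the authors' specific test quantity is reproduced here. The sketch below is a generic Monte Carlo permutation test for spatial association, using a Mantel-style statistic and simulated measurement locations and refractive-index values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated stand-ins for refractive-index measurements on a pane:
# (x, y) sampling locations and a measured value at each location.
n = 60
coords = rng.uniform(0, 1, size=(n, 2))
values = 1.5180 + 0.00002 * rng.standard_normal(n)   # hypothetical RIs

def mantel_stat(coords, values):
    """Correlation between pairwise spatial distance and difference in value."""
    d_space = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    d_value = np.abs(values[:, None] - values[None, :])
    iu = np.triu_indices(len(values), k=1)            # unique pairs only
    return np.corrcoef(d_space[iu], d_value[iu])[0, 1]

observed = mantel_stat(coords, values)

# Null distribution: shuffle values over locations to break any spatial link.
n_sim = 2000
null = np.array([mantel_stat(coords, rng.permutation(values))
                 for _ in range(n_sim)])
p_value = (1 + np.sum(null >= observed)) / (n_sim + 1)
print(f"observed statistic = {observed:.3f}, Monte Carlo p = {p_value:.3f}")
```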

CITATION: Geva Maimon, Russell J. Steele, and James M. Curran, Assessing Spatial Heterogeneity in the Refractive Index of Float Glass, 46 Jurimetrics J. 31–51 (2005).
An Introduction to Data Fusion, Data Mining, and Pattern Recognition Applied to Fiber Analysis
Jennifer Wiseman Mercer and Suzanne C. Bell

A statistically defensible method of fiber analysis is vital to determining the probative value of forensic fiber data. For this purpose, a comprehensive database of information from a range of instrumental sources is being designed to characterize and quantitate variability using strict sampling and analysis protocols as well as rigorous quality assurance and quality control. The database could lead to a probability-based fiber-analysis model.

CITATION: Jennifer Wiseman Mercer and Suzanne C. Bell, An Introduction to Data Fusion, Data Mining, and Pattern Recognition Applied to Fiber Analysis, 46 Jurimetrics J. 53–64 (2005).
Analyzing the Relevance and Admissibility of Bullet-Lead Evidence: Did the NRC Report Miss the Target?
William C. Thompson

This article analyzes and criticizes the 2004 National Research Council report on bullet-lead evidence. Using likelihood ratios, it identifies four variables that affect the probative value of bullet-lead evidence and shows how insights from this model clarify legal issues surrounding the probative value of bullet-lead evidence and its admissibility under Daubert and the Federal Rules of Evidence. It also discusses empirical evidence needed to create a stronger foundation for bullet-lead testimony.
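
The four variables the article identifies are not enumerated here. The sketch below illustrates only the basic likelihood-ratio arithmetic, in odds form, with entirely hypothetical probabilities.

```python
# Hypothetical numbers, for arithmetic only (not figures from the article).
p_match_same_source = 0.95    # P(compositional "match" | same source)
p_match_diff_source = 0.05    # P(compositional "match" | different sources)

likelihood_ratio = p_match_same_source / p_match_diff_source   # = 19

# Bayes' rule in odds form: posterior odds = prior odds x likelihood ratio.
prior_odds = 1 / 99            # e.g., a 1% prior that the sources are the same
posterior_odds = prior_odds * likelihood_ratio
posterior_prob = posterior_odds / (1 + posterior_odds)
print(f"LR = {likelihood_ratio:.0f}, posterior P(same source) = {posterior_prob:.2f}")
```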

CITATION: William C. Thompson, Analyzing the Relevance and Admissibility of Bullet-Lead Evidence: Did the NRC Report Miss the Target?, 46 Jurimetrics J. 65–89 (2005).
The NRC Bullet-Lead Report: Should Science Committees Make Legal Findings?
D.H. Kaye

For decades, analyses of the concentrations of elements in bullet lead have been used in the United States to link defendants to crimes in which fragments of bullets have been recovered. In response to mounting criticism of the practice, the FBI commissioned the National Academy of Sciences to review the Bureau’s procedures for making these measurements and drawing inferences from them. The Academy’s report confirms the validity of the instrumentation but identifies weaknesses in the statistical methods for declaring matches and for describing the significance of these matches. It also concludes that courts correctly could apply existing legal doctrine to uphold testimony that a match makes it more probable (by an unspecified amount) that the defendant is the source of the fragments. The article questions the committee’s legal reasoning with regard to the probative value of the evidence and its conclusion as to the admissibility of the evidence under the scientific-validity standard articulated in Daubert v. Merrell Dow Pharmaceuticals, Inc. It also raises a question as to whether and how blue-ribbon scientific committees studying forms of scientific evidence should offer explicit legal opinions on admissibility.

CITATION: D.H. Kaye, The NRC Bullet-Lead Report: Should Science Committees Make Legal Findings?, 46 Jurimetrics J. 91–105 (2005).
To Tell the Truth: On the Probative Value of Polygraph Search Evidence
Stephen E. Fienberg

Finkelstein and Levin, in an otherwise interesting and informative article on the probative value of a screening search, convey misleading ideas regarding both our current knowledge about polygraph accuracy and its use in security and criminal screening. The examples they present are not based on credible data on polygraph accuracy. As a consequence, they make what I believe are unwarranted statements about the probative value of polygraph results. Courts have been and should continue to be skeptical of such evidence in legal settings.

CITATION: Stephen E. Fienberg, To Tell the Truth: On the Probative Value of Polygraph Search Evidence, 46 Jurimetrics J. 107–116 (2005).
"Implicit Testing": Can Casework Validate Forensic Techniques?
Simon A. Cole

Latent-print individualization, better known as "fingerprint identification," is an area of forensic endeavor that remains profoundly indifferent to statistics. Although there is sophisticated statistical work concerning fingerprints, this work has no impact on everyday practice, on how conclusions are reached, or on how evidence is presented in court. Moreover, no validation studies of latent-print individualization have been performed. In the absence of validation studies, latent-print proponents and courts have treated a long history of casework as a surrogate for actual experimental testing of latent-print examiners’ knowledge claims. This notion of casework validation, neatly summed up by the United States Court of Appeals for the Third Circuit's memorable phrase "implicit testing," obviously founders on the problem that ground truth is not known in casework. This article suggests that, regardless of the actual accuracy of latent-print individualization, "implicit testing" sets a poor example for the validation of forensic techniques.

CITATION: Simon A. Cole, "Implicit Testing": Can Casework Validate Forensic Techniques?, 46 Jurimetrics J. 117–128 (2006).
Statistical Assessment of Damages in Breach of Contract Litigation
Duane L. Steffey, Stephen E. Fienberg, and Robert H. Sturgess

A recent suit in U.S. District Court involved a breach of contract by a third-party administrator of workers’ compensation claims. Applying a theory of lost economic opportunity, the plaintiff’s attorney retained experts in claims adjustment, insurance, and statistics to assess damages. A stratified, systematic random sample was drawn from the population of claims and reviewed for processing and costs. Several ratio estimators were considered in estimating total damages. This article describes the sampling design and estimation methods and discusses how these procedures and results were received by the court. More aggressive application of the "loss of economic opportunity" or "loss-of-chance" doctrine in future cases will expand the scope of actions for which statistical assessments of damage may be indicated.
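
The case's sampling design and data are not reproduced here. The sketch below shows a classical ratio estimator of a population total from a sample, with invented claim amounts.

```python
import numpy as np

# Hypothetical sample of audited claims (not the case's actual data):
# x = paid amount recorded for each sampled claim,
# y = excess cost (damage) found on review of that claim.
x_sample = np.array([12_000, 8_500, 20_000, 15_500, 9_000, 30_000], dtype=float)
y_sample = np.array([   600,   300,  1_500,    900,   250,  2_400], dtype=float)

# Known population total of paid amounts across all claims (hypothetical).
x_population_total = 4_750_000.0

# Classical ratio estimator: scale the known total by the sample damage ratio.
ratio = y_sample.sum() / x_sample.sum()
estimated_total_damages = ratio * x_population_total
print(f"sample ratio = {ratio:.4f}, "
      f"estimated total damages = ${estimated_total_damages:,.0f}")
```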

CITATION: Duane L. Steffey, Stephen E. Fienberg, and Robert H. Sturgess, Statistical Assessment of Damages in Breach of Contract Litigation, 46 Jurimetrics J. 129–138 (2006).
Power-Law Distributions and the Federal Judiciary
Thomas Bak

Power-law distributions apply to a wide range of phenomena in which events with large values are very rare, while events with small values are common. An examination of different variables used to characterize federal judicial activity, most prominently filings, indicates that they follow a power-law distribution and that these distributions persist over time. In particular, federal judicial filings can be shown to closely approximate Zipf’s law, a distinct type of power law.
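
The article's filing data are not reproduced here. The sketch below shows the standard rank-size diagnostic for Zipf's law, a log-log regression of size on rank whose slope should be near negative one, applied to simulated counts.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated "filings per court" drawn from a heavy-tailed distribution,
# standing in for the judicial-activity counts the article analyzes.
filings = np.sort(rng.pareto(a=1.0, size=94) * 1_000 + 1_000)[::-1]

ranks = np.arange(1, len(filings) + 1)

# Zipf's law predicts size ~ C / rank, i.e., a slope near -1 on log-log axes.
slope, intercept = np.polyfit(np.log(ranks), np.log(filings), deg=1)
print(f"log-log slope = {slope:.2f} (Zipf's law corresponds to roughly -1)")
```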

CITATION: Thomas Bak, Power-Law Distributions and the Federal Judiciary, 46 Jurimetrics J. 139–160 (2006).
Assessing the Implications for Close Relatives in the Event of Similar but Nonmatching DNA Profiles
David R. Paoletti, Travis E. Doom, Michael L. Raymer, and Dan E. Krane

A complete match between the STR DNA profile of an evidence sample and that of an individual included in a database of profiles of convicted offenders has clear utility as an investigative tool. Very similar but nonetheless nonmatching DNA profiles also can provide useful information by suggesting that a close relative of the individual may be the source of the evidence sample. This article describes a general framework for determining the relative likelihood that an individual’s close relative is the source of an imperfectly matching DNA profile.
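
The general framework of the article is not reproduced here. The sketch below is a textbook single-locus calculation, with hypothetical allele frequencies, comparing the probability that a full sibling of the profiled individual carries a given heterozygous genotype with the probability that an unrelated person does.

```python
# Single-locus sketch with hypothetical allele frequencies (not the article's
# general framework): how much more probable is the observed genotype if the
# true source is a full sibling of the database individual rather than an
# unrelated person?
p, q = 0.10, 0.05        # hypothetical frequencies of alleles A_i and A_j

# Probability that the true source has genotype A_i A_j at a locus where the
# evidence profile and the database profile agree:
prob_unrelated = 2 * p * q                        # random unrelated person
prob_full_sibling = (1 + p + q + 2 * p * q) / 4   # full sibling of the donor

likelihood_ratio = prob_full_sibling / prob_unrelated
print(f"LR (sibling vs. unrelated) at this locus = {likelihood_ratio:.0f}")
```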

CITATION: David R. Paoletti, Travis E. Doom, Michael L. Raymer, and Dan E. Krane, Assessing the Implications for Close Relatives in the Event of Similar but Nonmatching DNA Profiles, 46 Jurimetrics J. 161–175 (2006).
Beyond the Ken? Testing Jurors’ Understanding of Eyewitness Reliability Evidence
Richard S. Schmechel, Timothy P. O’Toole, Catharine Easterly, and Elizabeth F. Loftus

Over the past thirty years, researchers have made substantial strides in understanding the workings and limitations of human memory. However, the application of these scientific advances to eyewitness identifications in the criminal justice system, though increasing, has been limited. Trial judges in most jurisdictions exercise their discretionary powers to exclude expert testimony about the reliability of eyewitness identifications. The most common rationale for excluding eyewitness identification expert witnesses is that their findings are not "beyond the ken" of the average juror.

To empirically test this "beyond the ken" rationale, an independent survey of potential jurors in the District of Columbia was designed to investigate whether jurors understand, as a matter of common sense, what makes some eyewitness identifications more or less reliable than others. The survey results, presented in this article, demonstrate that jurors misunderstand how memory generally works and how particular factors, such as the effects of stress or the use of a weapon, affect the accuracy of eyewitness testimony. In light of these findings, judicial practices of excluding expert testimony on the reliability of eyewitness identifications should be reexamined. Wrongful convictions, of which eyewitness identification error is the leading cause, will inevitably continue to result unless jurors can be better educated about these scientific findings.

CITATION: Richard S. Schmechel, Timothy P. O’Toole, Catharine Easterly, and Elizabeth F. Loftus, Beyond the Ken? Testing Jurors’ Understanding of Eyewitness Reliability Evidence, 46 Jurimetrics J. 177–214 (2006).
Reflections on the Legal, Social, and Ethical Implications of Pharmacogenomic Research
Domenic A. Crolla

Pharmacogenomics is the use of genomic technologies to study genetic variations and their differential response to drugs. Because people absorb, metabolize, and excrete medications differently, a drug that has a significant therapeutic effect in one individual may have minimal or no effect in another. If pharmacogenomic research leads to accurate predictions of a patient’s response to a particular drug, much of the guesswork can be removed from prescribing medication and adverse drug reactions can be avoided.

To achieve these potential benefits, it is necessary to collect genetic samples from research participants, raising significant privacy and confidentiality issues both for the research subjects and all persons with similar genetic makeup. This article discusses these privacy and confidentiality concerns as well as future legal implications of pharmacogenomics.

CITATION: Domenic A. Crolla, Reflections on the Legal, Social, and Ethical Implications of Pharmacogenomic Research, 46 Jurimetrics J. 239–248 (2006).
Intellectual Property Perspectives in Pharmacogenomics
Allen C. Nunnally

Patent protection will facilitate the development of pharmacogenomics and, in turn, help unlock the predictive medical value that pharmacogenomics holds. Because of the utility of genetic markers as predictors of drug efficacy and toxicity on individuals with varying genetic profiles, protection for these genetic components is critical for maintaining the impetus for optimal innovation in the pharmaceutical industry. Under current patent and trade secret regimes, industry participants will be motivated to invest in pharmacogenomic research. Intellectual property law is thus at the crux of the infrastructure of the nascent pharmacogenomics industry, providing a protective, incentive-based framework to help ensure the development of optimized drug therapies.

CITATION: Allen C. Nunnally, Intellectual Property Perspectives in Pharmacogenomics, 46 Jurimetrics J. 249–262 (2006).
Legal Pressure to Incorporate Pharmacogenetics in the U.K.
Arron Walthall

Pharmacogenetics has the potential to revolutionize the way pharmaceuticals are developed, marketed, and prescribed. Although this technology could bring great benefits to patients, pharmaceutical companies will be reluctant to implement it in areas where it does not reduce costs or increase profits. The legal framework in the United Kingdom, however, could exert pressures on the pharmaceutical industry outside the market functions of costs and profits. This paper discusses whether the legal framework in the United Kingdom might exert such pressure.

CITATION: Arron Walthall, Legal Pressure to Incorporate Pharmacogenetics in the U.K., 46 Jurimetrics J. 263–279 (2006).
First Pharmacogenomics, Next Nutrigenomics: Genohype or Genohealthy?
Nola M. Ries and Timothy Caulfield

The authors explore key legal, ethical, and social issues in the emerging field of nutritional genomics, including challenges associated with research with human subjects, implementing genetic testing, health claims, communicating complex messages about genetics, integrating nutrigenomic knowledge into public health advice, and allocation of scarce health resources. The authors compare and contrast pharmacogenomics with nutrigenomics and ask whether nutrigenomics is the next manifestation of “genohype” or whether it represents “genohealth,” the application of genetic knowledge and technology to enhance human health.

CITATION: Nola M. Ries and Timothy Caulfield, First Pharmacogenomics, Next Nutrigenomics: Genohype or Genohealthy?, 46 Jurimetrics J. 281–308 (2006).
Computer Models for Legal Prediction
Kevin D. Ashley and Stefanie Brüninghaus

Computerized algorithms for predicting the outcomes of legal problems can extract and present information from particular databases of cases to guide the legal analysis of new problems. They can have practical value despite the limitations that make reliance on predictions risky for other real-world purposes such as estimating settlement values. An algorithm’s ability to generate reasonable legal arguments also is important. In this article, computerized prediction algorithms are compared not only in terms of accuracy, but also in terms of their ability to explain predictions and to integrate predictions and arguments. Our approach, the Issue-Based Prediction algorithm, is a program that tests hypotheses about how issues in a new case will be decided. It attempts to explain away counterexamples inconsistent with a hypothesis, while apprising users of the counterexamples and making explanatory arguments based on them.
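
The Issue-Based Prediction algorithm itself is not reproduced here. The sketch below is a toy illustration of the general approach of hypothesizing an outcome from past cases that share a new case's factors and flagging, rather than ignoring, the counterexamples; all case names and factor labels are invented.

```python
from collections import Counter

# Invented mini-database of past cases: each has a set of issue-level factors
# favoring the plaintiff ("p:") or defendant ("d:") and a known outcome.
past_cases = [
    {"name": "Alpha v. Beta",   "factors": {"p:secret", "p:measures"},             "outcome": "plaintiff"},
    {"name": "Gamma v. Delta",  "factors": {"p:secret", "d:disclosure"},           "outcome": "defendant"},
    {"name": "Epsilon v. Zeta", "factors": {"p:secret", "p:measures", "d:reverse"}, "outcome": "plaintiff"},
]

def predict(new_factors):
    """Hypothesize the majority outcome among past cases sharing any factor,
    and report the counterexamples that cut against that hypothesis."""
    relevant = [c for c in past_cases if c["factors"] & new_factors]
    if not relevant:
        return "abstain", []
    tally = Counter(c["outcome"] for c in relevant)
    hypothesis, _ = tally.most_common(1)[0]
    counterexamples = [c["name"] for c in relevant if c["outcome"] != hypothesis]
    return hypothesis, counterexamples

prediction, against = predict({"p:secret", "p:measures"})
print(f"predicted outcome: {prediction}; counterexamples: {against}")
```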

CITATION: Kevin D. Ashley and Stefanie Brüninghaus, Computer Models for Legal Prediction, 46 Jurimetrics J. 309–352 (2006).
Carhart v. Gonzales: Rethinking Stenberg and the Partial-Birth Abortion Ban
Kevin T. Kelly

In the October 2006 term, the U.S. Supreme Court will hear the Government’s challenge to Carhart v. Gonzales, 413 F.3d 791 (8th Cir. 2005), cert. granted, 126 S. Ct. 1314 (2006) (No. 05-380). In this case, the Eighth Circuit Court of Appeals held the federal Partial-Birth Abortion Ban Act of 2003 invalid because it lacks a maternal health exception as required by Stenberg v. Carhart, 530 U.S. 914 (2000). This Note examines the Eighth Circuit’s reasoning. It observes that although Stenberg left little room for any other outcome, the test for deciding when a statute should be struck down as facially invalid is less clear than Stenberg and Carhart v. Gonzales might suggest. If the Supreme Court were to apply a different test, the Act might be upheld.

CITATION: Kevin T. Kelly, Note, Carhart v. Gonzales: Rethinking Stenberg and the Partial-Birth Abortion Ban, 46 Jurimetrics J. 353–372 (2006).
Lee v. Martinez: Does Polygraph Evidence Really Satisfy Daubert?
Jodi Meyers

In Lee v. Martinez, 96 P.3d 291 (N.M. 2004), the New Mexico Supreme Court ruled that polygraph evidence satisfies the Daubert standard for admitting scientific evidence. The court relied on the liberal thrust of the rules of evidence and the belief that jurors are generally knowledgeable in many areas to conclude that any doubt regarding the admissibility of scientific evidence should be resolved in favor of admission rather than exclusion. The reasoning the Lee court applied is flawed and creates a significant risk that invalid and unreliable polygraph evidence, as well as other forms of unreliable scientific evidence, will find their way into the courtroom.

CITATION: Jodi Meyers, Note, Lee v. Martinez: Does Polygraph Evidence Really Satisfy Daubert?, 46 Jurimetrics J. 391–406 (2006).
The AABB’s Autologous Blood Donation Suggested Guidance: Autologous Blood, HIV, and the Americans with Disabilities Act
Stefanie Shulman-Cutler

In 2005, the AABB, a leading international organization in transfusion medicine, reaffirmed its position that because of the Supreme Court’s ruling in Bragdon v. Abbott, 524 U.S. 624 (1998), “it would be discriminatory to offer autologous blood donation to most individuals, but not to those infected with HIV.” Yet, over one-third of medical facilities where blood is transfused do not allow storage or transfusion of HIV-infected autologous blood. Although the Supreme Court correctly determined that persons with asymptomatic HIV have a disability under the Americans with Disabilities Act of 1990, the medical community should consider whether the risk posed by erroneous autologous blood transfusions may be a “direct threat.”

CITATION: Stefanie Shulman-Cutler, Note, The AABB’s Autologous Blood Donation Suggested Guidance: Autologous Blood, HIV, and the Americans with Disabilities Act, 46 Jurimetrics J. 407–419 (2006).
K.M. v. E.G., Elisa B. v. Superior Court, and Kristine H. v. Lisa R.: Intent and Biology in California’s Lesbian Parenting Cases
Sara Xochitl Orozco

Three recent California Supreme Court cases have attempted to define and clarify a woman’s custodial right to a child gestated by her partner in a lesbian relationship. However, these cases have created differing requirements for the nongestational partner. Couples using in vitro fertilization of an egg of one partner carried to term by the other need not prove intent for both to be recognized as parents. Meanwhile, couples using artificial insemination must show the nongestational partner’s intent to be a parent for her to obtain custodial rights. Although these landmark decisions establish custody rights for nongestational lesbian partners, the decisions emphasize the procedure used to conceive the child, instead of the intent of the parties, when determining parental rights. Furthermore, these decisions do not maintain the traditional intent analysis used when two heterosexual women vie for custody. Thus, by using different analyses to determine custodial rights depending on the reproductive technology used, courts allow the custodial rights of the nongestational partner in a lesbian relationship to be unduly influenced by the technologies available in reproductive technology clinics.

CITATION: Sara Xochitl Orozco, Note, K.M. v. E.G., Elisa B. v. Superior Court, and Kristine H. v. Lisa R.: Intent and Biology in California’s Lesbian Parenting Cases, 46 Jurimetrics J. 421–436 (2006).
NTP, Inc. v. Research in Motion, Ltd.: Losing Control and Finding the Locus of Infringing Use
Bridget A. O’Leary Smith

Modern technological systems, particularly network computer, telecommunication, and e-commerce systems, frequently cross international borders. However, patents issued by the United States Patent and Trademark Office are enforceable only within the United States. Therefore, when deciding whether use of a transnational system infringes an American patent, courts must decide whether that use occurred inside United States territory. In the recent decision NTP, Inc. v. Research in Motion, Ltd., 418 F.3d 1282, 1317 (Fed. Cir. 2005), a panel of the Court of Appeals for the Federal Circuit held that infringing use of a geographically diverse system occurs “where control of the system is exercised and beneficial use of the system obtained.” However, the Patent Act does not require that an entity control a system to establish its infringing use of an invention. Thus, the Federal Circuit should abandon the control metric and steer future jurisprudence toward refining the locus of beneficial use.

CITATION: Bridget A. O’Leary Smith, Note, NTP, Inc. v. Research in Motion, Ltd.: Losing Control and Finding the Locus of Infringing Use, 46 Jurimetrics J. 437–458 (2006).
National Steel Car, Ltd. v. Canadian Pacific Railway, Ltd.: International Commerce and Exemption from Patent Infringement
John H. Evans

A United States patent normally confers upon the patent holder the exclusive right to preclude others from making, using, selling, offering to sell, or importing the patented invention within the United States. Under 35 U.S.C. § 272, however, the right does not apply to the use of a patented invention in a vessel, aircraft, or vehicle of a foreign country that enters the United States temporarily or accidentally. In National Steel Car, Ltd. v. Canadian Pacific Railway, Ltd., 357 F.3d 1319 (Fed. Cir. 2004), the Court of Appeals for the Federal Circuit construed this “foreign use” exemption expansively. This note traces the history of the statute, describes the court’s opinion, and proposes that the exception be construed in the light of its role in promoting international commerce.

CITATION: John H. Evans, Note, National Steel Car, Ltd. v. Canadian Pacific Railway, Ltd.: International Commerce and Exemption from Patent Infringement, 46 Jurimetrics J. 459–470 (2006).
Cardiac Pacemakers, Inc. v. St. Jude Medical, Inc.: Can the Patent-Term Extension of the Hatch-Waxman Act Be Used as Leverage in Drug Patent Infringement Settlements?
Andrew J. Paprocki

Medical device and drug manufacturers often market several products derived from the same patent; however, in such cases, the Hatch-Waxman Act applies differently to devices and drugs. In Cardiac Pacemakers, Inc. v. St. Jude Medical, Inc., 381 F.3d 1371 (Fed. Cir. 2004), the Court of Appeals for the Federal Circuit held that a patent-term extension for a medical device under the Hatch-Waxman Act could be based on any device produced under the patent. Prior case law and the text of the statute, however, suggest that any patent-term extension for a drug must be based on the FDA review period for the first approved drug. This is because the Act defines a “drug product” in terms of its active ingredient. Congress adopted this limited definition to prevent second uses and dosages of the same drug from receiving patent-term extensions under the Hatch-Waxman Act. The enacted statute, however, appears to allow an infringing drug manufacturer to prevent the patent holder from receiving a patent-term extension so long as the FDA approves the infringing product first.

CITATION: Andrew J. Paprocki, Note, Cardiac Pacemakers, Inc. v. St. Jude Medical, Inc.: Can the Patent-Term Extension of the Hatch-Waxman Act Be Used as Leverage in Drug Patent Infringement Settlements?, 46 Jurimetrics J. 471–488 (2006).
Nanotechnology and Policy
K. Eric Drexler with Jason Wejnert

Advances in nanotechnology are laying the foundations for developing molecular manufacturing systems, bringing a dramatic revolution in the price, performance, and ubiquity of smart products along with advances in medicine, computation, environmental improvements, and space exploration. Regulatory and intellectual property legal regimes will need to accommodate the coming technology. However, U.S. policy toward researching and funding nanotechnology is following an inconsistent model. Changes must be made in this approach to avoid falling behind other nations in the development and implementation of nanotechnology.

The paper first analyzes how we are able, with some confidence, to project at least a part of what will be possible as a result of future developments. It then reviews the current understanding of molecular manufacturing itself and discusses at some length the policy and legal issues surrounding nanotechnology—raising more questions than answers—but opening some directions that are fertile for further investigation. The paper concludes with a discussion of current nanotechnology policy in the United States.

CITATION: K. Eric Drexler with Jason Wejnert, Nanotechnology and Policy, 45 Jurimetrics J. 1–22 (2004).
The Democratic Public Domain: Reconnecting the Modern First Amendment and the Original Progress Clause (a.k.a. Copyright and Patent Clause)
Malla Pollack

Empirical investigation of public usage of the word "progress" in the United States of 1789 demonstrates that the word meant "dissemination." The original meaning of art. I, sec. 8, cl. 8, therefore, is that Congress has the right to grant only such temporally limited exclusive rights in writings and new technology as encourage the dissemination of knowledge and new technology to the population. This article explains the major differences between current United States positive intellectual property law and the logical dictates of this original constitutional meaning. Additionally, the article asserts that the original meaning of clause 8 supports modern calls for a public-empowering First Amendment doctrine, as suggested by scholars such as Jack M. Balkin.

CITATION: Malla Pollack, The Democratic Public Domain: Reconnecting the Modern First Amendment and the Original Progress Clause (a.k.a. Copyright and Patent Clause), 45 Jurimetrics J. 23–40 (2004).
Recognizing and Responding to a Problem with the Admissibility of Fingerprint Evidence under Daubert
Kristin Romandetti

This Note analyzes the admissibility of fingerprint expert testimony in light of Daubert. On the one hand, fingerprint expert testimony cannot easily survive an intellectually honest application of Daubert. On the other hand, courts have made it clear that they cannot bring themselves to exclude fingerprint expert testimony, notwithstanding the Daubert requirements. This Note suggests a policy of restricted admission as an alternative to the complete exclusion or unfettered admission of such evidence.

CITATION: Kristin Romandetti, Note, Recognizing and Responding to a Problem with the Admissibility of Fingerprint Evidence under Daubert, 45 Jurimetrics J. 41–58 (2004).
United States v. American Library Association, Inc.: The Internet as an Inherently Public Forum
Derrick Stomberg

In United States v. American Library Ass'n, Inc., 539 U.S. 194 (2003), the Supreme Court held that Congress may condition the receipt of federal funds for Internet access under the E-rate and Library Services and Technology Act programs upon a library’s compliance with the Children’s Internet Protection Act’s requirement that pornography filters be installed on all computer terminals with Internet access. The Court reasoned that Internet access in libraries is not a public forum, and therefore such a content-based restriction is subject to a rational basis test rather than strict scrutiny. Under this test, Congress is afforded broad discretion to attach conditions to the receipt of federal aid. This Note explores the Court’s public forum analysis and ultimately concludes that the Court’s definition of a public forum is flawed and should be amended.

CITATION: Derrick Stomberg, Note, United States v. American Library Association, Inc.: The Internet as an Inherently Public Forum, 45 Jurimetrics J. 59–73 (2004).
Kremen v. Cohen: The "Knotty" Saga of Sex.com
Jay Prendergast

In Kremen v. Cohen, 337 F.3d 1024 (9th Cir. 2003), the Ninth Circuit held that Internet domain names constitute intangible personal property subject to the tort of conversion. This Note concludes that the classification of domain names as intangible property is incorrect and suggests they are properly viewed as products of contracts for services.

CITATION: Jay Prendergast, Note, Kremen v. Cohen: The "Knotty" Saga of Sex.com, 45 Jurimetrics J. 75–91 (2004).
Plant Genetic Systems v. DeKalb: The Pioneer Doctrine Cannot Substitute for Defective Enablement
Karen Feng

In Plant Genetic Systems v. DeKalb Genetics Corp., 315 F.3d 1335 (Fed. Cir. 2003), the plaintiff held a patent covering genetically modified cells, seeds, and plants containing a gene that made the plants resistant to nonselective herbicides. The plaintiff sued the defendant for making and selling corn containing the gene. When the district court held for the defendant, the plaintiff appealed, arguing that its invention was "pioneering" and deserving of broader protection. The Court of Appeals for the Federal Circuit affirmed, finding that the plaintiff’s reliance on precedent concerning pioneer patents was misplaced because the patentee had not met the statutory requirements for enablement. The applicability of the pioneer doctrine is fading as the fields of technological art grow more crowded, and courts will not apply it when the statutory enablement requirements are not met.

CITATION: Karen Feng, Note, Plant Genetic Systems v. DeKalb: The Pioneer Doctrine Cannot Substitute for Defective Enablement, 45 Jurimetrics J. 93–102 (2004).
Applying International Guidelines on Ethical, Legal, and Social Issues to New International Genebanks
Melissa A. Austin, Julia Crouch, and Alyssa DiGiacomo

This paper identifies and compares current international guidelines in relation to ethical, legal, and social issues (ELSI) in genebank development and examines how they apply to seven proposed or newly established genebanks. More specifically, this article considers issues of sponsorship and benefit sharing, ethics committees, public engagement, consent, and data protection. In general, a consensus is emerging in the international guidelines that the human genome is the property of all humanity, that benefit sharing should include releasing relevant findings to the project participants, and that profit sharing may be appropriate. Similarly, there is agreement about the need for independent ethics committees, the requirement of voluntary informed consent, and the importance of protecting the privacy and confidentiality of patient information. However, the guidelines are less uniform with regard to consultation and education of the population, withdrawal of consent, data coding methods, and future use of data. New genebanks address these issues with a variety of policies and approaches. Because of the lack of global oversight and enforcement, it has been suggested that an international regulatory body be given the responsibility to oversee more standardized implementation of the guidelines.

CITATION: Melissa A. Austin, Julia Crouch, and Alyssa DiGiacomo, Applying International Guidelines on Ethical, Legal, and Social Issues to New International Genebanks, 45 Jurimetrics J. 115–134 (2005).
Informed Consent to Genetic Research on Banked Human Tissue
Daniel S. Strouse

People who provide their tissue for genetic research have strong interests in controlling its future uses; traditionally such interests have been safeguarded by the doctrine of informed consent. Yet, researchers often have strong competing interests in conducting research on "stored" tissue, research that was unforeseen at the time the tissue was collected and to which consent thus could not meaningfully be given at that time. Various mechanisms have been proposed over the years to resolve the resulting tension. Most of these have been flawed in important respects. By far the best solution, advanced by Stanford Professor Henry Greely, would substitute rigorous ethical safeguards for the subject’s diminished consent. However, Greely’s accompanying recommendation—to change the regulatory law of informed consent to accommodate his ethically sound proposal—seems unlikely to be achieved. Surprisingly, it may also be unnecessary and of ambiguous overall benefit.

CITATION: Daniel S. Strouse, Informed Consent to Genetic Research on Banked Human Tissue, 45 Jurimetrics J. 135–152 (2005).
Property Rights and Benefit-Sharing for DNA Donors?
Gary E. Marchant

The traditional assumption in most genetic research is that the donors of genetic material used in research act altruistically and are entitled to no property rights or direct benefit-sharing in the fruits of the research. This traditional assumption is now being challenged from several different directions. Some international ethics guidelines, advocacy organizations for families suffering from genetic diseases, and special populations such as Indian tribes are all pushing for greater control, rights, and benefit-sharing for genetic donors. The failure to resolve the tensions between these new demands and the traditional assumption of genetic research has the potential to create a bottleneck in the supply of genetic samples vital to the advancement of genetic research. This Article traces the recent controversies and trends that are challenging the traditional assumption that DNA donors in genetic research have no property or other rights in their donated material and outlines some alternative approaches for resolving this problem.

CITATION: Gary E. Marchant, Property Rights and Benefit-Sharing for DNA Donors?, 45 Jurimetrics J. 153–178 (2005).
Future Uses of Residual Newborn Blood Spots: Legal and Ethical Considerations
Nanette Elster

One of the first and most comprehensive banks of genetic information long predates the current interest in large DNA collections: the residual newborn dried blood spots (DBS) collected from nearly every infant born in the United States after 1965. Because samples are collected across the country and remain quite stable over time, these spots are increasingly appealing as a potential research resource. Currently, every state and the District of Columbia have laws establishing newborn screening programs. In this era of genomics, a range of potential future uses of this vast collection of DNA may be possible. This Article examines some of those uses and the considerations necessary before these previously collected samples can be used prospectively.

CITATION: Nanette Elster, Future Uses of Residual Newborn Blood Spots: Legal and Ethical Considerations, 45 Jurimetrics J. 179–189 (2005).
Genetics Research in American Indian Communities: Sociocultural Considerations and Participatory Research
Donald Warne

Genetics research has the potential to improve health care. American Indians (AIs) suffer from significant health disparities, including significantly higher incidence and prevalence of preventable diseases like diabetes, alcoholism, and their complications. Underfunding of health programs, including the Indian Health Service, and lower socioeconomic status among AIs contribute to these disparities. Improvements in disease prevention and treatment potentially offered by genetics research could help to reduce health disparities. However, a history of nonparticipation in the research process and a history of dishonest research practices have raised barriers to conducting research in AI communities. A paradigm for research that includes the community as a full research partner may be necessary to promote research in AI communities and to translate genetics research into reductions in health disparities.

CITATION: Donald Warne, Genetics Research in American Indian Communities: Sociocultural Considerations and Participatory Research, 45 Jurimetrics J. 191–203 (2005).
Genebank Management: A Review of Salient Ethical, Legal, and Social Issues
Michael D. Volk Jr., Christine Meis McAuliffe, and May Mowzoon

Genebanks are a revolutionary tool for studying and facilitating the development of therapies for human disease. However, human genebanks also raise several practical, ethical, legal, and social issues. This Article reviews the salient issues that genebanks should consider when engaging in human genetic sampling and human subjects research: intellectual property, privacy, informed consent, community consent, and risk-benefit inequities.

CITATION: Michael D. Volk Jr., Christine Meis McAuliffe, and May Mowzoon, Genebank Management: A Review of Salient Ethical, Legal, and Social Issues, 45 Jurimetrics J. 205–223 (2005).
Copyright Protection for Genetic Databases
Ray K. Harris and Susan Stone Rosenfield

Genetic research generates databases, and copyright liability could arise from unauthorized reproduction of those databases. Failure to consider copyright issues could result in unintended consequences for database owners. The fair use doctrine is not a complete bar to potential liability. Even though copyright protection for a database is "thin," owners should address the copyright issue by contract and may want to consider using an open source model for database ownership.

CITATION: Ray K. Harris and Susan Stone Rosenfield, Copyright Protection for Genetic Databases, 45 Jurimetrics J. 225–250 (2005).
Human Genetic Sampling and the HIPAA Privacy Standards
Kristen B. Rosati

The Privacy Standards, issued under the Health Insurance Portability and Accountability Act of 1996 (HIPAA), have had a significant impact on how health-care providers may use and release patient information for research purposes. This Article discusses the Privacy Standards’ rules on use and disclosure of health information for research, with a special emphasis on how these federal rules affect the storage and use of genetic samples for research. This Article also discusses the following topics: (1) de-identification of genetic samples; (2) when sampling is human subject research; (3) when researchers may release a "limited data set" that is stripped of elements directly identifying research participants; (4) how to seek HIPAA-compliant authorization from the participant, the limitations on combining a HIPAA authorization form with the informed consent to participate in the study, and the limitation on obtaining authorization to use or disclose health information for future unspecified research projects; (5) when HIPAA authorization may be waived or altered by an Institutional Review Board or privacy board; (6) how health information may be used to identify and recruit research participants; and (7) how the Privacy Standards affect research started before the HIPAA compliance date. The Article concludes that, while the Department of Health and Human Services (HHS) has made an effort to clarify these confusing rules, changes to the regulations are required to remove inconsistencies with the HHS Common Rule and to remove regulatory barriers to genetic sampling and to creating other research repositories.

CITATION: Kristen B. Rosati, Human Genetic Sampling and the HIPAA Privacy Standards, 45 Jurimetrics J. 251–271 (2005).
Frozen Embryos and Gamete Providers’ Rights: A Suggested Model for Embryo Disposition
Joseph Russell Falasco

Procreational autonomy is one’s ability to choose when to have a child. Because advances in human reproduction technology allow one to create embryos with donated sperm and egg and freeze them for future use, it is important to analyze the resulting legal implications. This Article proposes a complete disposition model in cases where the egg and sperm donors disagree about the embryo’s ultimate fate. While embryos should be accorded a level of respect as a potential life, referring to an embryo’s legal status as "chattel" is useful because it gives an embryo its deserved respect while bringing clarity to the law. This Article distills the policies alluded to in the scant case law dealing with the disposition of frozen embryos and argues that the right to avoid procreation is the stronger interest when attempting to resolve an embryo disposition dispute.

CITATION: Joseph Russell Falasco, Frozen Embryos and Gamete Providers’ Rights: A Suggested Model for Embryo Disposition, 45 Jurimetrics J. 273–300 (2005).
FIPs and PETs for RFID: Protecting Privacy in the Web of Radio Frequency Identification
Gal Eschet

Radio Frequency Identification (RFID), like many other technologies, is a double-edged sword. On the one hand, this automatic identification technology would enable entirely unobstructed visibility into the supply chain and would dramatically reduce industry costs. On the other hand, RFID raises consumer privacy concerns both within and outside retail environments. RFID not only extends data-collection capabilities but also creates a new threat of tracking individuals.

Inherent drawbacks in the Privacy Enhancing Technologies (PETs) developed to address these concerns make it impossible, at least at this stage of development, for PETs to provide a satisfactory response to the privacy concerns on their own. To achieve adequate privacy protection, industry behavior should not only be directed by technology but also be regulated by other means. This Article calls on users of RFID technology to embrace self-regulatory measures and argues that legislation or other governmental regulation may deprive businesses and consumers of the benefits of the technology and is therefore not yet warranted. As a self-regulatory measure, existing "fair information practices" (FIPs) offer a good starting point but cannot be adopted as-is for RFID technology. This Article assesses which FIPs should be adopted and how some of their existing principles should be adapted and tailored to the distinctive characteristics of RFID technology. The end result is a set of ten FIPs that could serve as the foundation of a strong privacy policy for the use of RFID tags.

CITATION: Gal Eschet, FIPs and PETs for RFID: Protecting Privacy in the Web of Radio Frequency Identification, 45 Jurimetrics J. 301–332 (2005).
Life After Sarbanes-Oxley: The Merger of Information Security and Accountability
Bruce H. Nearon, Jon Stanley, Steven W. Teppler, and Joseph Burton

This article explores the connection between the Sarbanes-Oxley Act of 2002 and information security. Although the statute and implementing regulations do not address information security explicitly, the authors argue that compliance is incomplete without an adequate information-security regime in place. If necessary, laws should be amended to make documenting, assessing, and testing information security compulsory. The cost of compliance is likely to be less than losses investors will suffer if the security laws and rules remain silent regarding information security.

CITATION: Bruce H. Nearon, Jon Stanley, Steven W. Teppler, and Joseph Burton, Life After Sarbanes-Oxley: The Merger of Information Security and Accountability, 45 Jurimetrics J. 379–412 (2005).
The Legal Employment Market: Determinants of Elite Firm Placement and How Law Schools Stack Up
Anthony Ciolli

Data collected from 1,295 employers on 15,293 law firm associates who graduated from law school between 2001 and 2003 were used to develop a "total quality score" for every ABA-accredited law school, both nationally and for nine geographic regions. Quantitative methods were then used to identify factors that help explain the variation in law schools’ national career placement success at elite law firms. The findings revealed that while a law school’s academic reputation is the single biggest predictor of placement, several other factors were also highly significant. Differences in grading systems, class rank disclosure policies, and the number of required first-year courses were responsible for significant variation. Numerical grading systems, such as the one used at the University of Chicago, and honors/pass/fail grading systems, such as the one used at Yale, both have a strong negative impact on placement when all else is held equal, likely because both systems impair the job prospects of the middle of the class relative to traditional letter-grade systems. Law schools that do not disclose class rank to students or employers place better than schools that do, all else held constant; it is unclear whether this reflects employer preferences or disparate psychological effects on students that shape their career placement strategies. Law schools that require a greater number of first-year classes, however, can make up for deficiencies in these other areas.
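To make the abstract’s quantitative approach concrete, the sketch below fits an ordinary least squares model to invented placement data, with dummy variables for rank nondisclosure and nontraditional grading. Every school, score, and coefficient here is hypothetical; this is a minimal sketch, not the study’s data or specification.

```python
import numpy as np

# Hypothetical data for 8 law schools (invented, not the study's data):
# columns: academic reputation (1-5), class-rank nondisclosure (0/1),
# nontraditional grading system (0/1), required first-year courses (count)
X_raw = np.array([
    [4.8, 1, 1, 6],
    [4.6, 1, 0, 7],
    [4.3, 1, 0, 6],
    [4.0, 0, 1, 5],
    [3.8, 0, 0, 7],
    [3.5, 0, 0, 6],
    [3.2, 0, 1, 6],
    [3.0, 0, 0, 5],
], dtype=float)

# Hypothetical outcome: share of graduates placed at elite firms
y = np.array([0.56, 0.52, 0.47, 0.33, 0.30, 0.24, 0.18, 0.15])

# Add an intercept column and fit ordinary least squares
X = np.column_stack([np.ones(len(y)), X_raw])
coef, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)

labels = ["intercept", "reputation", "rank nondisclosure",
          "nontraditional grading", "first-year courses"]
for name, b in zip(labels, coef):
    print(f"{name:>22s}: {b:+.3f}")
```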

CITATION: Anthony Ciolli, The Legal Employment Market: Determinants of Elite Firm Placement and How Law Schools Stack Up, 45 Jurimetrics J. 413–448 (2005).
Does "Yes" Really Mean Yes? The Attempt to Close Debate on the Admissibility of Fingerprint Testimony
Simon A. Cole

The Third Circuit recently issued an opinion in United States v. Mitchell, 365 F.3d 215 (3d Cir. 2004), the first case to challenge the admissibility of fingerprint evidence under Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993). Although the court rejected this challenge, it upheld the right of defendants to call "counter-experts" to testify about the limitations of fingerprint evidence. Unfortunately, the Third Circuit’s rationale for holding that fingerprint evidence meets the Daubert requirements suffers from fundamental misunderstandings of the nature of fingerprint evidence and fails to confront the lack of validation for forensic fingerprint identification.

CITATION: Simon A. Cole, Does "Yes" Really Mean Yes? The Attempt to Close Debate on the Admissibility of Fingerprint Testimony, 45 Jurimetrics J. 449–464 (2005).
On "Falsification" and "Falsifiability": The First Daubert Factor and the Philosophy of Science
D.H. Kaye

In Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579, 593 (1993), the Supreme Court suggested that in evaluating the admissibility of scientific evidence, federal courts should consider "whether a theory or technique . . . can be (and has been) tested." Several commentators have thought that this suggestion represents an adoption of the philosophy of science of Karl Popper, and several courts have treated the abstract possibility of falsification as sufficient to satisfy this aspect of the screening of scientific evidence called for in Daubert. This essay challenges these views. It first explains the distinct meanings of "falsification" and "falsifiability." It then argues that while the Court did not embrace the views of any specific philosopher of science, inquiring into the existence of meaningful attempts at falsification is an appropriate and crucial consideration in admissibility determinations. Consequently, it concludes that courts that are substituting mere "falsifiability" for actual empirical testing are misconstruing and misapplying Daubert.

CITATION: D.H. Kaye, On "Falsification" and "Falsifiability": The First Daubert Factor and the Philosophy of Science, 45 Jurimetrics J. 473–481 (2005).
Introduction
Gary E. Marchant

“Confidence-building measures” (CBMs) are concrete, incremental steps, acceptable to all parties, that can be implemented relatively easily to reduce tensions and build trust in a time of conflict. The concept of CBMs arose in the sphere of international relations, and such measures are frequently used in international conflicts as the initial steps for reducing hostilities between enemies. The concept of CBMs may be useful for conflicts over genetically modified (GM) foods and other biotechnology products. The lack of trust and confidence in GM foods has several adverse impacts on the development and use of biotechnology. Most public interest groups that lead campaigns against GM products, and consumers uneasy about GM foods, claim they are not opposed to GM products per se, but rather lack confidence in the ways that GM products have been introduced into the market. While no comprehensive solutions to bridge the disputes over GM products are on the horizon, there may be pragmatic CBMs available in the short term that can reduce controversy and build trust, thereby creating an atmosphere more conducive to reaching consensus on longer-term solutions.

CITATION: Gary E. Marchant, Introduction, 44 Jurimetrics J. 1–4 (2003).
Stewardship for Biotech Crops: Strategies for Improving Global Consumer Confidence
Thomas P. Redick

This article reviews a decade's worth of voluntary liability prevention ("stewardship") in food biotechnology. Consumer confidence in the biotechnology industry has prevailed in the U.S., despite concerted efforts by anti-biotech activists to exaggerate the risks of biotech crops and some unfortunate lapses in biotech industry stewardship. However, given the recall and ensuing litigation over unapproved-for-food StarLink™ corn and the proliferation of "precautionary" regulatory approaches to biotech crops worldwide, continuing scrutiny of stewardship standards is needed to help the biotechnology industry protect U.S. leadership in grain exports and biotech innovation. Improved stewardship will help build global consumer confidence in biotech's ability to manage long-term, uncertain risks before they are manifest on a broad scale. One of the most significant threats driving improved stewardship of biotech crops in the post-StarLink™ era is the threat of litigation from the plaintiffs' class action bar, now that courts may award damages to parties suffering economic injury from lost exports. Consumer confidence in biotech crops could be undermined by high-profile class action litigation. Moreover, the crucial linchpin in global markets, the U.S. challenge to European Union policy at the World Trade Organization, could turn on a showing of improved biotech industry stewardship that keeps biotech crops separate where particular crops are not approved for export to Europe or other markets.

CITATION: Thomas P. Redick, Stewardship for Biotech Crops: Strategies for Improving Global Consumer Confidence, 44 Jurimetrics J. 5–39 (2003).
Confidence-Building Measures for Genetically Modified Products: Stakeholder Teamwork on Regulatory Proposals
Gregory N. Mandel

The introduction of genetically modified products into the human food supply and into other commercial uses is one of the most socially and politically divisive technology issues facing the United States. This article presents a model for confidence building among opposed groups in areas of polarized regulatory conflict generally and details a proposal for confidence building in the genetically modified product arena in particular. The specific proposal entails private industry, activist organizations, and representatives of the public working together on a jointly proposed set of guidelines for improving the quality of genetically modified product regulation.

CITATION: Gregory N. Mandel, Confidence-Building Measures for Genetically Modified Products: Stakeholder Teamwork on Regulatory Proposals, 44 Jurimetrics J. 41–61 (2003).
Bridging the Genetic Divide: Confidence-Building Measures for Genetically Modified Crops
Rebecca M. Bratspies

Genetically modified crops are now widely planted throughout the United States. To date, GM crops have not been modified for improved taste, appearance, or nutrition (benefits that would accrue directly to the public). Rather, ag-biotech companies have directed most of their energies toward developing crops that can be grown more profitably in Iowa. While growers have embraced GM crops, the public has been less sanguine about both the science underlying GM crops and the trustworthiness of ag-biotech companies. In angry and divisive exchanges, the technology's proponents and opponents have been typecast and vilified. This inability to communicate has grown into a vicious cycle of misunderstanding and mistrust. Civil dialogue has become all but impossible.

This article proposes a series of confidence-building measures intended to break this cycle and permit GM advocates and opponents to move beyond empty rhetoric. These confidence-building measures focus on the environmental concerns that surround this technology and offer a way to create channels of trust and communication between the interested parties. Once communication is established, GM advocates and opponents should be able to begin a substantive dialogue about how GM technology can and should be exploited.

CITATION: Rebecca M. Bratspies, Bridging the Genetic Divide: Confidence-Building Measures for Genetically Modified Crops, 44 Jurimetrics J. 63–79 (2003).
Issues Surrounding the International Regulation of Adventitious Presence and Biotechnology
Serina Vandegrift and Christine Gould

The use of genetically modified (GM) crops has increased more than 30-fold globally in the past six years, with 58.7 million hectares grown in 2002. With the rapid and widespread adoption of biotechnology into agriculture, it has become increasingly difficult to guarantee the genetic purity of agricultural products cultivated in open environments and produced and distributed in traditional ways. At the same time, concerns have been raised in many parts of the world—particularly Europe—about the environmental, social, and economic consequences of biotechnology in general. This has translated into significant controversy about the unavoidable and accidental, or "adventitious," presence of GM material in seed, grain, and food products.

This Article serves as an overall introduction to the concept of adventitious presence, including how it is defined and measured and how it has historically been handled and regulated for "traditional," conventionally bred crops. It provides a general overview of domestic and international policies addressing the adventitious presence of GM material in agricultural products and of how regulatory disparities can potentially disrupt trade and lead to disputes within the WTO framework. In light of this, the paper also considers the relevance of establishing a standard international threshold, as well as the potential of a unique market-oriented proposal from the U.S. Grain Inspection, Packers and Stockyards Administration (GIPSA) that uses quality-control standards to meet customers' varying adventitious-presence thresholds.

CITATION: Serina Vandegrift and Christine Gould, Issues Surrounding the International Regulation of Adventitious Presence and Biotechnology, 44 Jurimetrics J. 81–98 (2003).
GM Foods: Potential Public Consultation and Participation Mechanisms
Gary E. Marchant and Andrew Askland

One direct mechanism for improving public confidence in genetically modified foods may be to provide a greater role for the public in making policy decisions about such products. There are compelling normative and practical reasons for involving the public in such decisions. Yet, effective and meaningful public participation is made difficult by several factors, most importantly the lack of knowledge by most members of the public about scientific subjects, including biotechnology. A number of mechanisms for public participation exist, but most suffer from one of two principal limitations. Either they provide for only a small number of participants, usually representatives of interest groups, or they provide for widespread public participation but have no means for ensuring that the public input is informed. The recently completed national debate on GM foods in the United Kingdom illustrates many of the difficulties in providing for informed and effective public participation. New innovative approaches, such as on-line deliberations, are needed to achieve the goal of meaningful public participation in science-based policy decisions about genetically modified foods.

CITATION: Gary E. Marchant and Andrew Askland, GM Foods: Potential Public Consultation and Participation Mechanisms, 44 Jurimetrics J. 99–137 (2003).
Enhancing Consumer Confidence in Agricultural Biotechnology and Genetically Engineered Food
Douglas A. Powell, Katija A. Blaine, and Ben Chapman

This case study in risk communication describes a three-year trial of consumer acceptance of genetically modified food conducted on a commercial fruit and vegetable farm near Hillsburgh, Ontario, Canada. From spring 2000 through 2002, genetically engineered (GE) Bt sweet corn was grown beside conventional sweet corn. Information posters, letters, pamphlets, and press conferences provided the community with information on the project and the GE crops. The corn and potatoes harvested through the trial were segregated and labeled, and direct consumer testing of purchasing preferences was conducted. The corn was clearly labeled as genetically engineered, and background information was provided on what "genetically engineered" meant. Overall, the Bt sweet corn outsold the regular sweet corn. This project highlights the importance of open and honest communication with customers in the introduction of new agricultural technologies and the importance of trust, especially in food producers. It also demonstrates that consumers can handle information about risks.

CITATION: Douglas A. Powell, Katija A. Blaine, and Ben Chapman, Enhancing Consumer Confidence in Agricultural Biotechnology and Genetically Engineered Food, 44 Jurimetrics J. 139–152 (2003).
Confidence Building: In What, for Whom, and Why?
Jane Maienschein

Arizona State University's Center for Law, Science, and Technology hosted an all-day workshop on "Confidence-Building Measures for Genetically Modified Foods" on December 6, 2002. The lively discussion covered a range of ethical, legal, and policy implications in a climate of failed confidence and considered the process of confidence building in the products and processes, as well as the implications of doing so. This paper provides an overview and commentary on the central themes of the conference and looks at the contributions of the papers in this symposium.

CITATION: Jane Maienschein, Confidence Building: In What, for Whom, and Why?, 44 Jurimetrics J. 153–160 (2003).
Driving While Black: A Skeptical Note
Stephan Michelson

The phrase “Driving While Black” may refer to the hypothesis that police officers make excessive highway stops of minorities or to differential treatment of drivers after such a stop, based on their race. This article reviews studies that claim to confirm the former hypothesis and finds their conclusions unsupported. Specific problems with the studies include (1) disparity between the way data on the race of speeders are collected and the way officers view drivers, (2) faulty definition of the population of speeders from which stops are selected, (3) incorrect statistical procedure (a single-pool test where a multiple-pool test is called for), (4) failure to exclude cause—be it speeding or other infraction—as the basis for stops, and (5) failure to account for differential police allocation to more “troubled” areas. The “Driving While Black” hypothesis might be correct, but the lack of statistical support for it is particularly troubling because these faulty studies have been presented as evidence in litigation.
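Points (3) and (5) can be illustrated with a two-area hypothetical, sketched below: within each patrol area the racial mix of stops exactly matches the racial mix of speeders, yet a single pooled comparison suggests minority drivers are stopped out of proportion, simply because more officers are assigned to the area with more minority speeders. All numbers are invented for illustration.

```python
# Hypothetical two-area illustration (all numbers invented): within each
# area, stops exactly mirror the racial composition of speeders, yet a
# single pooled comparison suggests minority drivers are over-stopped,
# because police are allocated more heavily to the area whose speeders
# are disproportionately minority.

areas = {
    # area: (minority share of speeders, total stops)
    "heavily patrolled": (0.80, 1000),
    "lightly patrolled": (0.20, 100),
}

total_stops = 0
minority_stops = 0.0
for name, (minority_share, stops) in areas.items():
    m_stops = minority_share * stops          # stops mirror the speeder mix
    total_stops += stops
    minority_stops += m_stops
    print(f"{name}: {minority_share:.0%} of speeders are minority, "
          f"{m_stops / stops:.0%} of stops are minority")

# Pooled benchmark: overall minority share of speeders, assuming (another
# invented simplification) that the two areas contain equally many speeders.
pooled_speeder_share = (0.80 + 0.20) / 2
pooled_stop_share = minority_stops / total_stops
print(f"pooled: {pooled_speeder_share:.0%} of speeders vs "
      f"{pooled_stop_share:.0%} of stops are minority")
```

Stratified (multiple-pool) comparisons show no disparity in this toy example, while the pooled (single-pool) comparison shows a large one driven entirely by police allocation.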

CITATION: Stephan Michelson, Driving While Black: A Skeptical Note, 44 Jurimetrics J. 161–179 (2004).
Bush v. Gore: Two Neglected Lessons from a Statistical Perspective
Michael O. Finkelstein and Bruce Levin

The decisions of both the Florida Supreme Court and the U.S. Supreme Court in Bush v. Gore were based on statistical assumptions that were not made explicit in the opinions. In this article, we examine these assumptions and find them to be untenable. The Florida Supreme Court, in providing for a statewide recount, assumed that there were a sufficient number of uncounted ballots in which the intent of the voter could be discerned (the recoverable undervote) to put the outcome of the election “in doubt.” On the evidence before the court, this was an unjustified conclusion: the court should have found that Bush would have increased his lead on the recount and that the election was not in doubt. The U.S. Supreme Court vacated the Florida Supreme Court’s order on equal protection grounds because Florida counties could apply different standards in determining the intent of the voter for the uncounted ballots; it implicitly assumed that this would increase the variation in rates of undervoting across counties using punch-card machines. However, computer simulation shows that this was unlikely to occur. Even if counties used very different vote-recovery standards, a recount would probably have reduced—not increased—the variation in undercount rates across such counties. These consequences are not particular to the Florida data but reflect general attributes of recounts in a challenged election, which should be noted by the courts in future contests.
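The simulation result the abstract reports can be illustrated, though not reproduced, with a toy Monte Carlo model: counties begin with differing punch-card undervote rates, each applies its own randomly drawn recovery standard, and the cross-county dispersion of undervote rates is compared before and after the recount. All parameters below are invented and are not the authors’ model.

```python
import numpy as np

# Toy Monte Carlo (all parameters invented): counties start with varying
# punch-card undervote rates; in a recount each county recovers a different,
# randomly drawn fraction of its undervotes, mimicking divergent standards.
rng = np.random.default_rng(0)
n_counties, n_trials = 40, 10_000

fell = 0
for _ in range(n_trials):
    before = rng.uniform(0.005, 0.030, size=n_counties)   # undervote rates
    recovery = rng.uniform(0.2, 0.8, size=n_counties)     # county standards
    after = before * (1.0 - recovery)                      # post-recount rates
    if after.std() < before.std():
        fell += 1

print(f"cross-county dispersion fell in {fell / n_trials:.1%} of trials")
```

Under these invented parameters the dispersion of undervote rates falls in nearly every simulated recount, which is the direction of the effect the authors describe.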

CITATION: Michael O. Finkelstein and Bruce Levin, Bush v. Gore: Two Neglected Lessons from a Statistical Perspective, 44 Jurimetrics J. 181–199 (2004).
“Some More Human Than Others”: The Scope of Patentability Related to Human Embryonic Stem Cell Research
Sina A. Muscati

This paper proposes changes to patent legislation in response to social concerns resulting from innovation in the field of human embryonic stem cell research. Its main proposal is for a prohibition on the patenting of “human” elements. Five types of subject matter are encompassed by the proposed prohibition: (a) fully developed human beings; (b) human embryos, a potential source of human life; (c) animal-human chimeric embryos, which contain the human genome and can potentially develop into some higher “humanoid” life-form; (d) animal-human transgenic organisms having genes that impart human qualities or that are of unknown function; and (e) human totipotent stem cells, which are analogous to human embryos by their unlimited developmental potential. All five subject matters are to be included in a new definition of “human” that the paper proposes for use in patent legislation. The paper also proposes a prohibition on patents lacking inventiveness, namely those claiming genetically unmodified stem cells, and on patents that involve the derivation of stem cells from “research” embryos. Lastly, suggestions for infringement relief for academic researchers and an expansion of pricing control and compulsory licensing mechanisms for critical patents are proposed.

CITATION: Sina A. Muscati, “Some More Human Than Others”: The Scope of Patentability Related to Human Embryonic Stem Cell Research, 44 Jurimetrics J. 201–227 (2004).
Capturing the Dialectic Between Principles and Cases
Kevin D. Ashley

Theorists in ethics and law posit a dialectical relationship between principles and cases; abstract principles both inform and are informed by the decisions of specific cases. Until recently, however, it has not been possible to investigate or confirm this relationship empirically. This work involves a systematic study of a set of ethics cases written by a professional association’s Board of Ethical Review. Like judges, the Board explains its decisions in opinions. It applies normative standards, namely principles from a code of ethics, and cites past cases. We hypothesized that the Board’s explanations of its decisions elaborated upon the meaning and applicability of the abstract code principles and past cases. In effect, the Board operationalizes the principles and cases. We hypothesized further that this operationalization could be captured computationally and used to improve automated information retrieval. A computer program was designed to retrieve from the on-line database those ethics code principles and past cases that are relevant to analyzing new problems. In an experiment, we used the computer program to test the hypotheses. The experiment demonstrated that the dialectical relationship between principles and cases exists and that the associated operationalization information improves the program’s ability to assess which codes and cases are relevant to analyzing new problems. The results have significance both to the study of legal reasoning and improvement of legal information retrieval.

CITATION: Kevin D. Ashley, Capturing the Dialectic Between Principles and Cases, 44 Jurimetrics J. 229–279 (2004).
The Continuing Balance: Federal Regulation of Biotechnology
W. Christopher Matton and F. Scott Thomas

This article briefly summarizes many of the federal statutes and regulations applicable to the broad variety of business organizations engaged in the biotechnology industry. It provides an introduction to federal regulation of biotechnology for attorneys counseling and working with start-up and early-stage biotechnology companies, as well as later-stage enterprises and other players in the industry, and it enables attorneys and business managers working in biotechnology to identify and understand the applicable federal regulations.

Part I of this article provides a brief introduction to the business of biotechnology and its development in the United States. Part II discusses the federal statutes and regulations applicable to the biotechnology industry in the United States.

CITATION: W. Christopher Matton and F. Scott Thomas, The Continuing Balance: Federal Regulation of Biotechnology, 44 Jurimetrics J. 283–321 (2004).
Regulatory Mechanisms for Molecular Nanotechnology
Jason Wejnert

Molecular nanotechnology will present enormous technological as well as legal, social, and ethical challenges as the science develops. In order to ensure a safe introduction and implementation of molecular nanotechnology, current control, dissemination, and property protection regimes will need re-evaluation to account for the fluid nature of the technology.

CITATION: Jason Wejnert, Regulatory Mechanisms for Molecular Nanotechnology, 44 Jurimetrics J. 323–350 (2004).
The Daubert Trilogy in the States
David E. Bernstein and Jeffrey D. Jackson

The Daubert trilogy of Supreme Court cases—Daubert, Joiner, and Kumho Tire, codified in Federal Rule of Evidence 702—has established new rules for the admissibility of expert testimony in federal court. The situation in state courts is far more unsettled. First, a significant number of courts have continued to adhere to the tests they used before Daubert, either the Frye general acceptance test or some other test. Even among those states that have adopted Daubert, its application has been decidedly nonuniform. Only a few states have adopted the Daubert trilogy in its entirety. Some states have adopted Daubert, but not yet adopted Kumho Tire or Joiner. Others have adopted Daubert and Kumho Tire, but not Joiner, or have adopted only part of Joiner. Still other states view the Daubert trilogy as only instructive, or as consistent with their own traditional state tests but not binding.

This article analyzes the degree to which the holdings of the Daubert trilogy have been adopted by state courts. This analysis shows that there is a rich diversity of tests within the states. Indeed, contrary to the prevailing impression, the Daubert trilogy is not yet the majority standard even among the states that have rejected Frye.

CITATION: David E. Bernstein and Jeffrey D. Jackson, The Daubert Trilogy in the States, 44 Jurimetrics J. 351–366 (2004).
Copyright Protection for DNA Sequences: Can the Biotech Industry Harmonize Science with Song?
Stephen R. Wilson

Technological advancements in the biotech industry have led to an explosion of scientific discoveries. Certain biotech discoveries, such as naturally occurring DNA sequences, however, are typically devoid of legal protection. Without legal protection, scientists may cease to create or may prevent others from accessing their creations or discoveries. While many intellectual property theorists and commentators have argued for and against the patentability of various DNA sequences, some theorists have suggested that copyright law provides an opportunity to protect naturally occurring DNA sequences. This Article explores the viability of acquiring copyright protection for DNA sequences by converting them into digital music files.
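As a purely mechanical illustration of the conversion the Article contemplates, the sketch below maps a nucleotide string onto MIDI pitch numbers. The base-to-pitch mapping and the sample fragment are invented; the Article’s own encoding scheme may differ.

```python
# Toy illustration (not the Article's method): map each nucleotide to a
# pitch and emit the resulting "melody" as MIDI note numbers, one note per
# base. The A/C/G/T-to-pitch mapping below is arbitrary.

PITCH = {"A": 60, "C": 64, "G": 67, "T": 72}   # C4, E4, G4, C5

def sequence_to_notes(dna: str) -> list[int]:
    """Return MIDI note numbers for a DNA string, skipping unknown bases."""
    return [PITCH[base] for base in dna.upper() if base in PITCH]

if __name__ == "__main__":
    fragment = "ATGGCCATTGTAATG"      # invented fragment, not a real gene
    print(sequence_to_notes(fragment))
```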

CITATION: Stephen R. Wilson, Copyright Protection for DNA Sequences: Can the Biotech Industry Harmonize Science with Song?, 44 Jurimetrics J. 409–463 (2004).
Voting for Progress: Copyright and the Role of the Judiciary in Eldred v. Ashcroft
Amelia E. Morrow

In Eldred v. Ashcroft, 537 U.S. 186 (2003), the Supreme Court held constitutional the 20-year copyright term extension passed by Congress in 1998. The Court rejected Petitioners’ arguments that the extension violated the language and the spirit of the Copyright Clause, as well as the First Amendment. This Note argues that an evaluation of the law consistent with constitutional limits, recent precedent, and the nature of copyright requires a more focused review than the Court used, and that the Court instead left Congress virtually without judicial oversight over copyright legislation.

CITATION: Amelia E. Morrow, Note, Voting for Progress: Copyright and the Role of the Judiciary in Eldred v. Ashcroft, 44 Jurimetrics J. 465–486 (2004).
Why Kelly v. Arriba Soft Corp., 336 F.3d 811 (9th Cir. 2003), Does and Doesn’t Matter
Adam B. Olson

Fair use is a defense to an allegation of copyright infringement. In Kelly v. Arriba Soft Corp., 336 F.3d 811 (9th Cir. 2003), a three-judge panel of the Court of Appeals for the Ninth Circuit held that an Internet search engine company’s creation and use of "thumbnail" versions of copyrighted images for Internet indexing purposes was a fair use. However, the court withdrew its previous holding that the company’s incorporation of copyrighted images in their entirety into its web site by reference to the images’ Internet locations was not a fair use. This note recognizes difficulties in determining the scope of the court’s holding regarding thumbnails. Additionally, this note identifies potential impacts of the court’s withdrawn holding, concluding that Kelly’s primary significance is its role as a catalyst to identify competing concerns relating to referencing copyrighted works on the Internet.

CITATION: Adam B. Olson, Note, Why Kelly v. Arriba Soft Corp., 336 F.3d 811 (9th Cir. 2003), Does and Doesn’t Matter, 44 Jurimetrics J. 487–498 (2004).
Taubman Co. v. Webfeats: The Introduction of Cybergriping to the Federal Court System
Kevin P. Wilson

In Taubman Company v. Webfeats, 319 F.3d 770 (6th Cir. 2003), the Sixth Circuit became the first federal appellate court to address the phenomenon of "cybergriping." In a trademark infringement suit, the court determined that plaintiff, The Taubman Company, was not entitled to a preliminary injunction against defendant’s use of the Internet domain name taubmansucks.com and other, similarly derogatory domain names. This Note describes the case, the previous case law on the subject, and the viability of various alternatives when dealing with future cybergriping domain names.

CITATION: Kevin P. Wilson, Note, Taubman Co. v. Webfeats: The Introduction of Cybergriping to the Federal Court System, 44 Jurimetrics J. 499–510 (2004).
Hoffman-La Roche, Inc. v. Promega Corp.: Spread of a "New Plague"?
Aaron Matz

The defense of inequitable conduct, which is asserted in many patent infringement lawsuits, allows the invalidation of a patent if the infringer can show that the patentee intentionally misrepresented or omitted material facts during the patent application process. In Hoffman-La Roche, Inc. v. Promega Corp., 323 F.3d 1534 (Fed. Cir. 2003), the Court of Appeals for the Federal Circuit affirmed a district court’s conclusion that the plaintiff had committed inequitable conduct. Applying a deferential standard of review, the majority found that some of the district court’s findings were clearly erroneous but also upheld most of the district court’s conclusions regarding materiality and intent. Despite the fears expressed in a strongly worded dissenting opinion, it is unlikely that the majority’s decision will bring about a "new plague" of charges of inequitable conduct.

CITATION: Aaron Matz, Note, Hoffman-La Roche, Inc. v. Promega Corp.: Spread of a "New Plague"?, 44 Jurimetrics J. 511–522 (2004).
Integra LifeSciences I, LTD. v. Merck: Upholding Fundamental Principles of Patent Law
Douglas C. Gardner

In Integra LifeSciences I, LTD. v. Merck KGaA, the United States Court of Appeals for the Federal Circuit determined that the safe harbor provision of the Drug Price Competition and Patent Term Restoration Act of 1984, which allows competitors to conduct experimentation that is reasonably related to obtaining regulatory approval in advance of patent expiration, does not apply to broad research to discover new drugs for later clinical testing. This Note describes the facts in Integra and the safe harbor provision of 35 U.S.C.S. § 271(e)(1), argues that, contrary to the dissent, the common law research exemption of patent law does not apply, and finally considers the ramifications of Integra’s holding with respect to the safe harbor principle.

CITATION: Douglas C. Gardner, Note, Integra LifeSciences I, LTD. v. Merck: Upholding Fundamental Principles of Patent Law, 44 Jurimetrics J. 523–530 (2004).
The Coming Pharmacogenomics Revolution: Tailoring Drugs to Fit Patients' Genetic Profiles
Lars Noah

The opportunity for increased precision in pharmaceutical therapy will represent one of the important legacies of the Human Genome Project. Medical researchers have long suspected that genetic differences account for some of the variability in patient response to drugs, but now they hope the identification of single nucleotide polymorphisms will allow physicians to customize pharmaceutical interventions. Pharmacogenomics will lead to fundamental changes in how drugs are discovered, tested, manufactured, labeled, and marketed. Federal regulators, the courts, and other policy makers will face challenges in accommodating these changes, and, in turn, their responses may have important impacts on the maturation and diffusion of this technology. This article describes these scientific developments as a prelude to asking whether legal institutions will manage to catch up to or, instead, hinder such advances.

CITATION: Lars Noah, The Coming Pharmacogenomics Revolution: Tailoring Drugs to Fit Patients' Genetic Profiles, 43 Jurimetrics J. 1–28 (2002).
Ashcroft v. ACLU: Coping with Online Community Standards
Laura Bates

In Ashcroft v. ACLU, the Supreme Court held that the Child Online Protection Act's reliance on "contemporary community standards" to define "material that is harmful to minors" did not render it unconstitutional. The Court determined that applying a local community standard to material placed on the World Wide Web did not impermissibly burden speech. This Note reviews the Third Circuit and Supreme Court opinions, describes why both courts erred, and suggests that the Supreme Court should have vacated the Third Circuit's decision for a different reason.

CITATION: Laura Bates, Note, Ashcroft v. ACLU: Coping with Online Community Standards, 43 Jurimetrics J. 29–41 (2002).
Should Criminal Case Filings Be Available Online?
Raya Tahan

In 2001, the Judicial Conference of the United States approved making civil case documents available to the public over the Internet. It also approved a pilot program of Internet access to criminal case documents. Although privacy advocates worry that such information could prevent applicants from obtaining jobs, lead to identity theft, and cause other harm, allowing the public and the media to inspect court documents is important to democratic self-governance. This note urges that, to accommodate both interests, most criminal case documents be made available electronically with sensitive information removed.

CITATION: Raya Tahan, Note, Should Criminal Case Filings Be Available Online?, 43 Jurimetrics J. 43–50 (2002).
Criminal DNA Databank Statutes and Medical Research
Davina Dana Bressler

Every state and the federal government collect DNA from individuals convicted of certain crimes. Furthermore, most states participate in the Federal Bureau of Investigation's Combined DNA Index System, which allows states to share criminal DNA records for solving crimes. Some commentators claim that the statutes authorizing this practice would also allow the government to use criminal DNA samples or records in medical or behavioral research. A careful reading of the statutes reveals that these assertions are either wrong or exaggerated. Only one state allows medical research with records, and no state allows medical or behavioral research with DNA samples. This note explains that DNA samples, not DNA records, are needed to conduct medical or behavioral research. Moreover, this note shows that the statutory phrases "law enforcement purposes," "other humanitarian purposes," and "research" into quality control or protocol probably do not authorize medical or behavioral research.

CITATION: Davina Dana Bressler, Note, Criminal DNA Databank Statutes and Medical Research, 43 Jurimetrics J. 51–78 (2002).
The Human Cloning Prohibition Act of 2001: Vagueness and Federalism
Jonathan S. Swartz

On July 31, 2001, the U.S. House of Representatives passed The Human Cloning Prohibition Act of 2001. The legislation proposes a complete ban on somatic cell nuclear transfer to create cloned human embryos; it threatens transgressors with criminal punishment and civil fines. House Bill 2505 is the first human cloning prohibition to pass either chamber of Congress. This note argues that the bill is unconstitutionally vague and inconsistent with the Supreme Court's recent Commerce Clause jurisprudence.

CITATION: Jonathan S. Swartz, Note, The Human Cloning Prohibition Act of 2001: Vagueness and Federalism, 43 Jurimetrics J. 79–90 (2002).
Expert Testimony on Fingerprints: An Internet Exchange
Richard Friedman, D.H. Kaye, Jennifer Mnookin, Dale Nance, and Michael Saks

In United States v. Llera Plaza, 188 F. Supp. 2d 549 (E.D. Pa. 2002), a federal district court initially limited expert opinion testimony on fingerprint identifications because the government was unable to show that such identifications were sufficiently valid and reliable under Federal Rule of Evidence 702. The court later withdrew the opinion. This article reproduces an exchange of notes on the initial opinion submitted by five law professors.

CITATION: Richard Friedman, D.H. Kaye, Jennifer Mnookin, Dale Nance, and Michael Saks, Expert Testimony on Fingerprints: An Internet Exchange, 43 Jurimetrics J. 91–98 (2002).
Measuring Salience on the Supreme Court: A Research Note
Saul Brenner and Theodore S. Arrington

In a recent article in the American Journal of Political Science, Epstein and Segal recommended using the front page of the New York Times to measure the salience of Supreme Court cases. We explore the virtues and drawbacks of using the Times list and the list of major cases compiled by Congressional Quarterly. We test both lists, and a combination of the two, against chief justice self-assignment rates and the percentage of cases with three or more written opinions. We find that cases appearing on both lists have the highest self-assignment rates and the largest share of cases with multiple opinions. Although there are reasons to use the Congressional Quarterly list alone, there are also situations in which the Times list ought to be used alone.

CITATION: Saul Brenner and Theodore S. Arrington, Measuring Salience on the Supreme Court: A Research Note, 43 Jurimetrics J. 99–113 (2002).
The Potential Effect of Statistical Dependence in the Analysis of Data in Jury Discrimination Cases: Moultrie v. Martin Reconsidered
Joseph L. Gastwirth and Weiwen Miao

This article describes a statistical analysis that incorporates the dependence between consecutive grand juries in jurisdictions that allow jurors to serve a second year. This holdover juror system is used in South Carolina and is permitted in California. The appropriate modifications to the usual standard deviation analysis are presented and applied to data from Moultrie v. Martin, 690 F.2d 1078 (4th Cir. 1982). The analysis shows that ignoring the dependence made the statistical evidence of discrimination appear stronger than it was. Furthermore, the standard method, which ignores the dependence, also exaggerates the power of the hypothesis test to determine whether there was a potentially discriminatory selection process.
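The direction of the adjustment can be seen in a stylized calculation, sketched below with invented numbers: when half of each year's grand jury is held over, a holdover juror is counted in two consecutive years, so the variance of the multi-year minority count exceeds the naive binomial variance and the z-score shrinks. The simplifying assumptions (jurors serve at most two years and are drawn independently from the population) are illustrative, not the authors' exact model.

```python
import math

# Hypothetical numbers, not the Moultrie v. Martin record.
p = 0.38        # minority share of the jury-eligible population
n = 18          # grand jurors per year
T = 10          # years reviewed
observed = 40   # minority juror-years observed over the period

expected = p * n * T                       # expected minority juror-years

# Naive analysis: treat all n*T juror-years as independent draws.
sd_naive = math.sqrt(n * T * p * (1 - p))

# Holdover system: half of each year's jury carries over, so a holdover
# person is counted in two consecutive years. Persons counted once have
# weight 1, persons counted twice have weight 2, and for independent
# persons Var(sum w_i * X_i) = p(1-p) * sum w_i^2.
once = n                                   # persons counted in one year
twice = (T - 1) * n // 2                   # persons counted in two years
sd_adjusted = math.sqrt(p * (1 - p) * (once * 1**2 + twice * 2**2))

z_naive = (observed - expected) / sd_naive
z_adjusted = (observed - expected) / sd_adjusted
print(f"expected {expected:.1f}, naive z = {z_naive:.2f}, "
      f"adjusted z = {z_adjusted:.2f}")
```

With these invented inputs the naive z-score is roughly minus 4.4 while the adjusted z-score is roughly minus 3.2, which is the sense in which ignoring the dependence overstates the strength of the evidence.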

CITATION: Joseph L. Gastwirth and Weiwen Miao, The Potential Effect of Statistical Dependence in the Analysis of Data in Jury Discrimination Cases: Moultrie v. Martin Reconsidered, 43 Jurimetrics J. 115–128 (2002).
Chance and Skill in Games: Electronic Draw Poker
Marc J. Ware and Joseph B. Kadane

Various laws distinguish between games of skill and games of chance. To help determine the appropriate classification of electronic draw poker, this article discusses the expected payoff from the optimal strategy for the game.
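A minimal sketch of the kind of expected-payoff computation such an analysis rests on appears below, for one hold decision: keeping four cards to a flush and drawing one. The dealt hand and paytable are hypothetical, and only the flush outcome is credited; a full analysis would also credit pairs, straights, and other paying hands.

```python
from itertools import product

RANKS = "23456789TJQKA"
SUITS = "shdc"
DECK = {rank + suit for rank, suit in product(RANKS, SUITS)}

# Hypothetical dealt hand: four hearts and an off-suit card. The four
# hearts are held and the 9 of clubs is discarded.
hand = ["Ah", "Jh", "7h", "3h", "9c"]

FLUSH_PAYOUT = 6                     # hypothetical paytable entry (per unit bet)

unseen = DECK - set(hand)            # the 47 cards the player has not seen
flush_outs = sum(1 for card in unseen if card[1] == "h")

# Expected payoff counting only the flush outcome (a deliberate simplification).
ev = flush_outs / len(unseen) * FLUSH_PAYOUT
print(f"{flush_outs}/{len(unseen)} draws complete the flush; "
      f"EV = {ev:.3f} units per unit bet")
```

Repeating this kind of enumeration over every possible hold for every dealt hand, with a full paytable, is what an optimal-strategy expected-payoff figure summarizes.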

CITATION: Marc J. Ware and Joseph B. Kadane, Chance and Skill in Games: Electronic Draw Poker, 43 Jurimetrics J. 129–134 (2002).
Communicating Statistical DNA Evidence
Samuel Lindsey, Ralph Hertwig, and Gerd Gigerenzer

There is a growing need to present statistical scientific evidence in a form that judges and jurors can understand and evaluate. After examining statistical issues surrounding forensic DNA evidence, this article presents research that demonstrates a way to improve judges' and jurors' understanding of evidence involving probabilities and statistics.
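One device studied in this line of research, restating probabilities as natural frequencies, can be illustrated with a hypothetical random-match probability. The numbers below, and the assumption that every member of the reference population is equally likely a priori to be the source, are purely illustrative.

```python
# Hypothetical numbers: a random-match probability of 1 in 100,000 and a
# reference population of 1,000,000 potential sources. The natural-frequency
# framing counts how many innocent people would match by coincidence.
# (Assumes a flat prior over the population and that the true source matches.)

match_probability = 1 / 100_000
population = 1_000_000

expected_coincidental_matches = (population - 1) * match_probability
# One person is the true source; the rest are expected coincidental matches.
prob_matching_person_is_source = 1 / (1 + expected_coincidental_matches)

print(f"about {expected_coincidental_matches:.0f} of the other "
      f"{population - 1:,} people would also match")
print(f"so a matching person is the source with probability "
      f"~{prob_matching_person_is_source:.0%}, far below the 99.999% a "
      f"common misreading of the match probability would suggest")
```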

CITATION: Samuel Lindsey, Ralph Hertwig, and Gerd Gigerenzer, Communicating Statistical DNA Evidence, 43 Jurimetrics J. 147–163 (2003).
Curing the Unique Health Identifier: A Reconciliation of New Technology and Privacy Rights
Wendy J. Netter

The Health Insurance Portability and Accountability Act has mandated the assignment of a universal individual health identifier in 2003. Such an identifier can increase patient confidentiality, improve patient care, lower the cost of services to the patient, enhance administrative efficiency, and increase the opportunity for medical research. Nevertheless, national identification systems raise concerns about confidentiality and privacy. Instead of a mandatory, government-assigned number, this article proposes a technologically multi-tiered system that would be administered by a mixed government and private entity. Consumers could voluntarily opt in to the system.

CITATION: Wendy J. Netter, Curing the Unique Health Identifier: A Reconciliation of New Technology and Privacy Rights, 43 Jurimetrics J. 165–186 (2003).
Governing Population Genomics: Law, Bioethics, and Biopolitics in Three Case Studies
David E. Winickoff

Existing scholarship on population genomics has only superficially addressed issues of power and political process. Accordingly, questions of politics and governance pervade the analysis of three population genomics case studies that follow: the Human Genome Diversity Project, Iceland's Health Sector Database, and "Clinical Genomics" as defined by the Beth Israel-Ardais collaboration. An examination of these case studies reveals that the common law, U.S. regulatory law, and international law have not developed the political sophistication to make the traditional promises of biomedical ethics (respect for autonomy, justice, and beneficence) come to fruition. Further, comparisons of these projects illuminate three areas ripe for reframing: informed consent, expert ethical oversight, and commercial benefits. Four avenues of reform are suggested.

CITATION: David E. Winickoff, Governing Population Genomics: Law, Bioethics, and Biopolitics in Three Case Studies, 43 Jurimetrics J. 187–228 (2003).
But-For Markets and Reasonable Royalties: The Rite-Hite v. Kelley Misdirection
Clement G. Krouse

U.S. patent law distinguishes between lost profit and reasonable royalty awards in infringement damage claims. Lost profits are calculated as a monetary damage; reasonable royalties are not. This is a false division. Only a reasonable royalty also calculated as a monetary damage is proper in the sense that it does not provide the patent holder with more than it would have earned had the infringement not occurred. Correspondingly, only an award of monetary damages accurately disgorges the infringer of the value gained from the illegal use of the patented technology.

CITATION: Clement G. Krouse, But-For Markets and Reasonable Royalties: The Rite-Hite v. Kelley Misdirection, 43 Jurimetrics J. 229–241 (2003).
On the Probative Value of Evidence from a Screening Search
Michael O. Finkelstein and Bruce Levin

Evidence obtained from large-scale screening of people and databases has become an important tool in criminal and security investigations and promises to become an increasingly significant source of proof in trials. Reliance on such evidence raises issues with regard to its probative value standing alone and its force when combined with other evidence. We explore these questions using simple hypothetical examples to bring out certain issues and then turn to three paradigm cases: the first involving profiling of passengers at airports; the second involving cheating on a multiple-choice examination; and the third involving polygraph tests used in counter-intelligence screening. We find in these cases that legal decision makers underestimate the rate of false positives when even very good tests are used to screen for rare conditions, underestimate the probative value of screening evidence when combined with other evidence, and fail to appreciate that tests used for screening must be sufficiently reliable to be admissible in criminal proceedings.
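
The false-positive point in the first paradigm case can be made concrete with a back-of-the-envelope calculation (the numbers below are hypothetical and are not drawn from the article): even a test with 99 percent sensitivity and 99 percent specificity produces overwhelmingly false positives when the screened-for condition is rare.

    # Hypothetical screening of one million people for a condition with a 1-in-10,000 base rate.
    sensitivity = 0.99          # P(positive test | condition present)
    specificity = 0.99          # P(negative test | condition absent)
    base_rate = 1 / 10_000
    screened = 1_000_000

    true_cases = screened * base_rate
    true_positives = true_cases * sensitivity
    false_positives = (screened - true_cases) * (1 - specificity)

    ppv = true_positives / (true_positives + false_positives)   # P(condition | positive test)
    print(f"true positives:  {true_positives:,.0f}")
    print(f"false positives: {false_positives:,.0f}")
    print(f"P(condition | positive result) = {ppv:.3f}")
    # Roughly 10,000 of the million screened test positive falsely, against about 99
    # genuine cases, so a positive result alone implies only about a 1% chance that
    # the condition is actually present.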

CITATION: Michael O. Finkelstein and Bruce Levin, On the Probative Value of Evidence from a Screening Search, 43 Jurimetrics J. 265–290 (2003).
Assessing the Quality of DNA-based Parentage Testing: Findings from a Survey of Laboratories
Mary R. Anderlik

Parentage testing has become a big business with dramatic consequences for individuals and families. This article reports the findings from an interview survey of parentage testing laboratories accredited by the American Association of Blood Banks, and findings from a supplemental survey of the web sites of accredited and unaccredited laboratories. Topics covered include the offer of controversial services such as home testing and fidelity testing, promotion of services, policies relating to privacy and confidentiality and the handling of informed consent in cases involving minor children, and quality assurance. The article identifies gaps in current law, describes the proposals for regulation from study groups in Australia and the United Kingdom, and advances a proposal for regulatory reform in the U.S.

CITATION: Mary R. Anderlik, Assessing the Quality of DNA-based Parentage Testing: Findings from a Survey of Laboratories, 43 Jurimetrics J. 291–314 (2003).
Governing Copyright in Cyberspace: The Penalty Default Problem with State-centric Sovereignty
Mary De Ming Fan

This Article examines the consequences for Internet governance of observing traditional, state-centric sovereignty in multilateral cyberspace regulation by analyzing the World Intellectual Property Organization's Copyright Treaty. Three layers of protection for state sovereignty in the Treaty interact to produce a possible nonenforcement default of its protections for digitally transmitted materials in contracting states that profit more from sidestepping than securing copyright protections.

This nonenforcement default operates like a "penalty default" because it gives copyright-profiting states incentive to further bargain to avoid the default. This penalty default is not consciously set, but results from observing state sovereignty in regulating a supranational common resource against the backdrop of a cleavage in state interests over regulation. States with high economic interest in copyrighted material might respond with out-of-treaty bargaining to induce other states to enact regulations avoiding the default. Copyright-profiting states and certain private actors also will have increased incentive to erect electronic security fences and deploy other technologies that chill information flow. The resulting deliberation-deficient and undemocratic process for achieving regulation and nontransparent restrictions on information access may act as a hidden penalty default against piracy-profiting states and all concerned about information access and cyberspace governance, if not ameliorated by a multilaterally agreed mandatory fair use rule and establishment of a strong central dispute settlement body to enforce the treaty's terms.

CITATION: Mary De Ming Fan, Governing Copyright in Cyberspace: The Penalty Default Problem with State-centric Sovereignty, 43 Jurimetrics J. 315–342 (2003).
Adversarial Econometrics in United States Tobacco Co. v. Conwood Co.
D.H. Kaye

This article describes a questionable statistical study used to estimate large antitrust damages in United States Tobacco Co. v. Conwood Co., 290 F.3d 768 (6th Cir. 2002). It suggests that the district and appellate courts did not recognize that the study was incapable of separating the effects of legal conduct from illegal conduct and therefore failed in performing their role as gatekeepers for scientific evidence.

CITATION: D.H. Kaye, Adversarial Econometrics in United States Tobacco Co. v. Conwood Co., 43 Jurimetrics J. 343–352 (2003).
Child Placement Decisions: The Relevance of Facial Resemblance and Biological Relationships
David J. Herring

This article discusses two studies of evolution and human behavior addressing child-adult relationships and explores implications for policies and practices surrounding placement of children in foster homes. The first study indicates that men favor children whose facial features resemble their own facial features. This study may justify public child welfare decisionmakers in considering facial resemblance as they attempt to place children in safe foster homes. The second study indicates that parents are likely to invest more in children who are biologically related to them, thus enhancing their long-term well-being. Among other implications, this study may justify public child welfare decisionmakers in attempting to preserve biological families and avoid the removal of children from biological parents. It may also justify maintaining contact between biological parents and children even if removal is necessary.

Although this article recognizes that the studies do not provide for comprehensive decisionmaking rules, the article articulates how the studies can be used to incrementally construct, test, and improve policies and practices in a specific area of public activity.

CITATION: David J. Herring, Child Placement Decisions: The Relevance of Facial Resemblance and Biological Relationships, 43 Jurimetrics J. 387–414 (2003).
Atkins v. Virginia: Suggestions for the Accurate Diagnosis of Mental Retardation
Daniel B. Kessler

In Atkins v. Virginia, 536 U.S. 304 (2002), the Supreme Court held that capital punishment of the mentally retarded constitutes cruel and unusual punishment under the Eighth Amendment. The Court determined a national consensus exists against the use of capital punishment on this population and that the penological goals of this penalty could not be furthered by such use. This note analyzes the opinions in this case and suggests how states can successfully protect the mentally retarded under this new limitation in capital jurisprudence.

CITATION: Daniel B. Kessler, Note, Atkins v. Virginia: Suggestions for the Accurate Diagnosis of Mental Retardation, 43 Jurimetrics J. 415–426 (2003).
Mitochondrial DNA Evidence in State v. Pappas
Marlan D. Walker

In State v. Pappas, 776 A.2d 1091 (Conn. 2001), the Supreme Court of Connecticut ruled that mitochondrial DNA identification evidence was admissible to link the defendant to a bank robbery. This note describes the case, its ramifications, and some possible solutions to the potential problems associated with mitochondrial DNA evidence.

CITATION: Marlan D. Walker, Note, Mitochondrial DNA Evidence in State v. Pappas, 43 Jurimetrics J. 427–440 (2003).
Utah v. Evans and Statistical Methodologies in Census Apportionment Calculations
Jason Wejnert

In Utah v. Evans, 536 U.S. 452 (2002), the state of Utah claimed that imputation, the procedure used to adjust apportionment figures by filling in missing household data, was a form of statistical sampling forbidden by the Census Act, 13 U.S.C. § 195 (1976), and the Census Clause of the Constitution. The Supreme Court rejected the challenge. This note suggests that the Court was correct in concluding that imputation is permissible.

CITATION: Jason Wejnert, Note, Utah v. Evans and Statistical Methodologies in Census Apportionment Calculations, 43 Jurimetrics J. 441–453 (2003).
"Out-of-Control Flatware, Livestock Run Amok, and Colliding Tubas": Suspicionless Drug Testing in Public Schools
Sharma K. Foley

In Board of Education v. Earls, 536 U.S. 822 (2002), the Supreme Court held that random drug testing of students involved in extracurricular activities does not violate the Fourth Amendment. The Court reasoned that the need for swift, informal disciplinary procedures in public schools placed them within the "special needs" exception to the warrant and probable cause requirements, and that notwithstanding minimal evidence of drug use among the students, the safety needs of schoolchildren participating in voluntary activities outweighed their diminished privacy expectations. This note argues that by emphasizing the national teenage drug problem, rather than requiring specific evidence of use among the students subject to testing, Earls upholds testing for students least likely to use drugs and thereby sets the outer limits of constitutionally permissible drug testing in the public schools.

CITATION: Sharma K. Foley, Note, "Out-of-Control Flatware, Livestock Run Amok, and Colliding Tubas": Suspicionless Drug Testing in Public Schools, 43 Jurimetrics J. 455–464 (2003).
Festo Continues to Fester: The Supreme Court Restores the Nebulous Flexible Bar Rule
Eric B. Chen

The doctrine of equivalents extends the scope of patent protection while prosecution history estoppel limits such protection. In Festo Corp. v. Shoketsu Kinzoku Kogyo Kabushiki Co., 535 U.S. 722 (2002), the Supreme Court overturned the bright-line approach of prosecution history estoppel created by the Federal Circuit. In doing so, the Court has undermined the public notice function of patent claims. Other less restrictive means are available to protect inventors' interests.

CITATION: Eric B. Chen, Note, Festo Continues to Fester: The Supreme Court Restores the Nebulous Flexible Bar Rule, 43 Jurimetrics J. 465–472 (2003).
Ag Supply, Inc. v. Pioneer Hi-Bred Int'l, Inc.: Statutory Construction and Plant Patents
Amy Sun

In Ag Supply, Inc. v. Pioneer Hi-Bred Int'l, Inc., 534 U.S. 124 (2001), the Supreme Court clarified the overlapping use of the utility patent statute, the Plant Patent Act (PPA), and the Plant Variety Protection Act (PVPA). It held that newly developed plant breeds could be protected under the utility patent statute and concluded that neither the PPA nor the PVPA bars obtaining utility patents for plants. This note outlines the factual background of the case, the analysis of patent protection availability for plants in the majority and dissenting opinions, and the reasoning and outcome.

CITATION: Amy Sun, Note, Ag Supply, Inc. v. Pioneer Hi-Bred Int'l, Inc.: Statutory Construction and Plant Patents, 43 Jurimetrics J. 473–481 (2003).
Intel v. Hamidi: Trespassing in Cyberspace
Julie Beauregard

In Intel Corp. v. Hamidi, 114 Cal. Rptr. 2d 244 (Ct. App. 2001), cert. granted, 43 P.3d 587 (Cal. 2002), the California Court of Appeal upheld Intel's permanent injunction prohibiting Hamidi from sending unauthorized, bulk e-mails to its employees. The court reasoned that Hamidi committed a trespass to Intel's chattels, namely its computer systems and proprietary e-mail lists, when he e-mailed thousands of Intel employees at their work e-mail addresses. This note suggests that the court correctly applied the trespass-to-chattels doctrine to the Internet and intangible property.

CITATION: Julie Beauregard, Note, Intel v. Hamidi: Trespassing in Cyberspace, 43 Jurimetrics J. 483–492 (2003).
You've Been Served! Rio Properties, Inc. v. Rio International Interlink
Heather A. Sapp

In Rio Properties, Inc. v. Rio International Interlink, 284 F.3d 1007 (9th Cir. 2002), the Ninth Circuit Court of Appeals applied an ad hoc balancing test to uphold a district court order permitting service of process by e-mail on a foreign defendant engaged in electronic commerce whose preferred means of communication was e-mail. This Note argues that the rules of civil procedure should be amended to permit e-mail service on domestic defendants without a court order under certain circumstances.

CITATION: Heather A. Sapp, Note, You've Been Served! Rio Properties, Inc. v. Rio International Interlink, 43 Jurimetrics J. 493–504 (2003).
Finding a (DMCA) Safe Harbor in the Turbulent Sea of Online Copyright Liability: A & M Records, Inc. v. Napster, Inc.
Thomas E. Barako

In A & M Records, Inc. v. Napster, Inc., the District Court for the Northern District of California decided that a company that provided tools for digital audio file sharing over a peer-to-peer computer network did not meet the section 512(a) safe harbor requirements of Title II of the Digital Millennium Copyright Act of 1998. This Note explores the early stages of A & M Records, Inc. v. Napster, Inc., analyzes the district court's rulings in the case, and examines what effects Napster may have on future cases involving peer-to-peer networks and the safe harbor provisions of the Digital Millennium Copyright Act.

CITATION: Thomas E. Barako, Note, Finding a (DMCA) Safe Harbor in the Turbulent Sea of Online Copyright Liability: A & M Records, Inc. v. Napster, Inc., 42 Jurimetrics J. 1–20 (2001).
Will ESIGN Force States to Adopt UETA?
Stephanie Lillie

The Electronic Signatures in Global and National Commerce Act of 2000 prevents states from implementing electronic and digital signature laws that do not conform to it or the Uniform Electronic Transactions Act. This Note discusses the differences between ESIGN and UETA. It also discusses the preemptive effect of ESIGN and the difficulty in creating a uniform national law to promote e-commerce growth.

CITATION: Stephanie Lillie, Note, Will ESIGN Force States to Adopt UETA?, 42 Jurimetrics J. 21–30 (2001).
NEPA and the Danger of Alien Species Introduction: Taking a Hard Look at National Parks & Conservation Ass'n v. United States Department of Transportation
Stephanie J. Gliege

In National Parks & Conservation Ass'n v. United States Department of Transportation, 222 F.3d 677 (9th Cir. 2000), the Court of Appeals for the Ninth Circuit denied a petition for review of the FAA's approval of an extension of Kahului Airport on Maui. The court held that the FAA satisfied the National Environmental Policy Act (NEPA) and other federal statutes. This Note argues, however, that the agency failed to undertake the "hard look" required by NEPA because it did not adequately consider the dangers of introducing alien species. The Note reviews the opinion, describes the requirements of NEPA and the "hard look" doctrine, and identifies problems with the court's reasoning.

CITATION: Stephanie J. Gliege, Note, NEPA and the Danger of Alien Species Introduction: Taking a Hard Look at National Parks & Conservation Ass'n v. United States Department of Transportation, 42 Jurimetrics J. 31–45 (2001).
Cellular Phone Taskforce v. FCC: The Broadening of the NEPA Functional Equivalence Doctrine
Sherry D. Haller

In Cellular Phone Taskforce v. FCC, 205 F.3d 82 (2d Cir. 2000), various groups attacked the Federal Communications Commission's safety regulations limiting radio frequency emissions from communications devices as too weak. The Court of Appeals for the Second Circuit upheld the standards as within the agency's discretion under the Administrative Procedure Act. In addition, it allowed the standards to stand even though the FCC did not prepare the environmental analysis required by the National Environmental Policy Act. This Note describes the standards, examines the opinion of the court of appeals, and questions the court's treatment of the FCC's promulgation of the standards as the "functional equivalent" of the necessary environmental analysis.

CITATION: Sherry D. Haller, Note, Cellular Phone Taskforce v. FCC: The Broadening of the NEPA Functional Equivalence Doctrine, 42 Jurimetrics J. 47–60 (2001).
Bioprospecting in the National Parks: Edmonds Institute v. Babbitt
Todd Weaver

In Edmonds Institute v. Babbitt, 93 F. Supp. 2d 63 (D.D.C. 2000), the District Court for the District of Columbia upheld the National Park Service's power to enter into agreements with private companies "bioprospecting" within Yellowstone National Park. At issue was an agreement that allows the removal of organisms to analyze their genotypes and to exploit any commercially profitable applications in exchange for annual payments and possible royalties. The court determined that this agreement is valid under the Federal Technology Transfer Act and the acts establishing the National Park Service and Yellowstone National Park.

CITATION: Todd Weaver, Note, Bioprospecting in the National Parks: Edmonds Institute v. Babbitt, 42 Jurimetrics J. 61–72 (2001).
Privacy on Thin Ice? Considering the California Court of Appeal Decision in Johnson v. Superior Court
Joshua A. Plosker

In Johnson v. Superior Court, the California Court of Appeal determined that a provision of a contract limiting the discovery of the identity of a sperm donor was against public policy and that the donor's privacy interest did not protect against disclosure of this information. Although the court's analysis of the public policy exception to the enforcement of contracts was unnecessary, the opinion properly balances California's and the petitioners' interests against an anonymous donor's privacy right.

CITATION: Joshua A. Plosker, Note, Privacy on Thin Ice? Considering the California Court of Appeal Decision in Johnson v. Superior Court, 42 Jurimetrics J. 73–83 (2001).
Logerquist v. McVey: Frye, Daubert or Non-Scientific Expert Testimony?
Lori A. Van Daele

In Logerquist v. McVey, 1 P.3d 113 (Ariz. 2000), the Arizona Supreme Court held that expert testimony regarding repressed memory syndrome is admissible even if it does not satisfy the standards for scientific evidence established in Frye v. United States or Daubert v. Merrell Dow Pharmaceuticals, Inc. The court also stated that Arizona would continue to follow the Frye test for other scientific expert testimony. This Note explains why the Logerquist decision is not sound.

CITATION: Lori A. Van Daele, Note, Logerquist v. McVey: Frye, Daubert or Non-Scientific Expert Testimony?, 42 Jurimetrics J. 85–95 (2001).
A Brief History of the Political Work of Genetics
Dorothy Nelkin

The biological sciences have long been used to define distinctions between people and to define inequalities as a natural consequence of essential biological traits. Today, geneticists draw distinctions on the basis of genetic predispositions. Their population-based methods can reinforce stereotypes about race and ethnic differences, providing concepts, validated by science, through which group differences can be interpreted as biologically ordained. Cases suggest how genetic variants can be used in social policies as individuals are differentially treated, not on the basis of their individual condition, but because of predispositions attributed to their group.

CITATION: Dorothy Nelkin, A Brief History of the Political Work of Genetics, 42 Jurimetrics J. 121–132 (2002).
The Central Role of Variation in Human Genetics
Michael E. Zwick

The single greatest hope in applying human genetics is that through an understanding of the genetic causes of common disease, improved methods of prevention and treatment can be developed and become widely available. This article examines the goals and practices of human genetics and explains why variation in populations of organisms is central to the science. The properties of genetic variants underlying discrete Mendelian traits and continuous complex traits are discussed. The nature of these variants is likely to influence the risks and benefits of population-specific genetic research.

CITATION: Michael E. Zwick, The Central Role of Variation in Human Genetics, 42 Jurimetrics J. 133–139 (2002).
Scientific Rationales for Population-Specific Genetic Research: Pharmacogenetics in Indigenous Peoples
Wendell W. Weber

Drug effectiveness and toxicity, like many other human physical and mental traits, depend on ethnicity. Knowledge of ethnic specificity is an essential part of pharmacogenetics, the scientific study of variation in human drug response, because it suggests ways of tailoring drug therapy to individual patients as well as approaches to the rational design and clinical testing of new drugs. Unfortunately, knowledge of ethnic variation in human drug response in Native American populations is meager and incomplete compared to other populations. Without greater participation in such research, these populations will be unable to share fully in the benefits of pharmacogenetics.

CITATION: Wendell W. Weber, Scientific Rationales for Population-Specific Genetic Research: Pharmacogenetics in Indigenous Peoples, 42 Jurimetrics J. 141–144 (2002).
Perspectives on Research in American Indian Communities
Malcolm B. Bowekaty

This article discusses research-oriented responsibilities of the Zuni governor and tribal council to the Zuni people. To reduce potential negative effects and to enhance the lifestyle of the Zuni, these bodies screen and review research in an effort to ascertain compliance with tribal law, to be culturally respectful, and to determine what, if any, beneficial effects the research will have for the Zuni people. As a result, studies concerning high-prevalence diseases, such as diabetes, are given preference. These principles may apply to other American Indian and Alaskan native communities.

CITATION: Malcolm B. Bowekaty, Perspectives on Research in American Indian Communities, 42 Jurimetrics J. 145–148 (2002).
Native American Recommendations for Genetic Research to Be Culturally Respectful
Linda Burhansstipanov, Lynne T. Bemis, and Mark Dignan

This article describes genetic research issues and recommendations identified by inter-tribal Native American groups in meetings with tribal leaders from 1995 through 1999.

CITATION: Linda Burhansstipanov, Lynne T. Bemis, and Mark Dignan, Native American Recommendations for Genetic Research to Be Culturally Respectful, 42 Jurimetrics J. 149–157 (2002).
A Caution to Native American Institutional Review Boards about Scientism and Censorship
Andrew Askland

Native American Institutional Review Boards (IRBs) promote the health and welfare of tribes by reviewing protocols for research studies that focus on their tribes. The benefits of approved protocols should not be overstated lest good studies disappoint because they do not satisfy unachievable expectations. IRBs also should avoid the temptation to censor the outcomes of those studies. Science relies on candor and clarity about results and methods to move forward.

CITATION: Andrew Askland, A Caution to Native American Institutional Review Boards about Scientism and Censorship, 42 Jurimetrics J. 159–163 (2002).
An Analysis of Research Guidelines on the Collection and Use of Human Biological Materials from American Indian and Alaskan Native Communities
Richard R. Sharp and Morris W. Foster

American Indian and Alaskan Native communities have expressed concern about the use of human biological materials in research. These concerns have prompted research sponsors and professional organizations to develop guidelines for investigators working with these communities. This paper reviews research guidelines and presents recommendations that reflect "best practices" for working with North American indigenous communities in the collection, storage, and distribution of human biological materials for research. These recommendations strike a reasonable balance among three imperatives in research: (1) minimizing harm, (2) treating sample contributors with respect, and (3) promoting intellectual freedom to pursue a range of research questions. The recommendations can be used in designing appropriate methods of collecting and using human biological materials from members of American Indian and Alaskan Native communities and will likely be applicable to other historically disadvantaged communities as well.

CITATION: Richard R. Sharp and Morris W. Foster, An Analysis of Research Guidelines on the Collection and Use of Human Biological Materials from American Indian and Alaskan Native Communities, 42 Jurimetrics J. 165–186 (2002).
A Critical Appraisal of Protections for Aboriginal Communities in Biomedical Research
Charles Weijer and James A. Anderson

As scientists target communities for research into the etiology of common diseases, especially their genetic determinants, there have been calls for the protection of communities. This paper identifies the distinct characteristics of aboriginal communities and their implications for research in these communities. It also contends that the framework in the Belmont Report is inadequate in this context and suggests a fourth principle, respect for communities. To explore how such a principle might be specified and operationalized, it reviews existing guidelines for protecting aboriginal communities and points out problems with these guidelines and areas for further work.

CITATION: Charles Weijer and James A. Anderson, A Critical Appraisal of Protections for Aboriginal Communities in Biomedical Research, 42 Jurimetrics J. 187–198 (2002).
Genetic Research and Communal Narratives
Dena S. Davis

The use of DNA evidence to prove Thomas Jefferson's paternity of Sally Hemings' children is a powerful example of how genetic research can have an impact upon the communal narratives of families and nations.

CITATION: Dena S. Davis, Genetic Research and Communal Narratives, 42 Jurimetrics J. 199–207 (2002).
Genes, Religion, and History: The Creation of a Discourse of Origin Among a Judaizing African Tribe
Tudor Parfitt

A striking achievement of modernism is the overall success of the natural sciences in explaining a wide range of human conditions, not least the condition posited by the Enlightenment that all men are equal and that our shared and common humanity can be reduced to an ultimate chemical basis. However, the rapid development of genetic science is in many ways anti-modernist. Many of the supposed differences in peoples, nations, and sexes appear to find some sort of corroboration in the differences found in group and individual genetic make-up. This paper looks at the historical tradition of a small African tribe, the Lemba. In recent years, genetic findings have entered the discourse of tribal origins. Perceptions of genetic research substantially bolster and modify issues of group self-identity and, in addition, provide ammunition both for conservative forces in the preservation of their prejudices and for liberal groups who seek the elimination of differences among peoples.

CITATION: Tudor Parfitt, Genes, Religion, and History: The Creation of a Discourse of Origin Among a Judaizing African Tribe, 42 Jurimetrics J. 209–219 (2002).
Quasi-Objective Bayesianism and Legal Evidence
Alvin I. Goldman

Legal adjudication aims at truth determination. Augmenting subjective Bayesianism with an element of objectivity, we obtain "quasi-objective Bayesianism," which identifies conditions for objectively expected increases in truth possession. An appropriate relation is required between subjective and objective likelihoods. A theory of objective likelihoods for singular states of affairs is sketched. Then quasi-objective Bayesianism is applied to three topics in legal evidence: interpreting "beyond reasonable doubt," the rationale for court-appointed experts, and the exclusion of character evidence.

CITATION: Alvin I. Goldman, Quasi-Objective Bayesianism and Legal Evidence, 42 Jurimetrics J. 237–260 (2002).
Yahoo and Democracy on the Internet
Joel R. Reidenberg

This article examines the French court order requiring Yahoo to prevent French Internet users from accessing images of Nazi memorabilia available for auction on the company's American web site. The article uses the French case to challenge the popular belief that an entirely borderless Internet favors democratic values. The article starts from the premise that while the Internet enables actors to reach a geographically dispersed audience, the Internet should not change the accountability of those actors for their conduct within national borders. The article shows that Yahoo's extensive business in France justifies the application of France's democratically chosen law and argues that the decision has important normative implications for pluralistic democracy on the global network. Namely, the decision promotes technical changes in the Internet architecture that empower democratic states to be able to enforce their freely chosen public policies within their territories. At the same time, the infrastructure changes will not enhance the ability of non-democratic states to pursue repressive policies within their territories in violation of international law. The article shows the French decision as a maturing of the Internet regulatory framework and argues that the policy rules embedded in the technical infrastructure must recognize values adopted by different states and must not be dictated by technical elites.

CITATION: Joel R. Reidenberg, Yahoo and Democracy on the Internet, 42 Jurimetrics J. 261–280 (2002).
When Do Courts Think Base Rate Statistics Are Relevant?
Jonathan J. Koehler

This paper identifies conditions under which appellate courts are more and less likely to treat background probabilities (i.e., base rates) as relevant. Courts are likely to view base rates as relevant when base rates (a) arise in cases that appear to have a statistical structure, (b) are offered to rebut an it-happened-by-chance theory, (c) are computed using reference classes that incorporate specific features of the focal case, or (d) are offered in cases when it is difficult or impossible to obtain evidence of a more individuating sort.

CITATION: Jonathan J. Koehler, When Do Courts Think Base Rate Statistics Are Relevant?, 42 Jurimetrics J. 373–402 (2002).
An Empirical Assessment of Presentation Formats for Trace Evidence with a Relatively Large and Quantifiable Random Match Probability
Scott B. Morris and Dale A. Nance

Since the early 1970s, academics have debated the wisdom of an explicit use of statistical information concerning the probability of a coincidental match in the presentation of forensic match evidence at trial. Critics pointed to the dangers of such numbers, arguing that jurors would misunderstand the numbers and exaggerate their importance in the case, being seduced into ignoring the other unquantified evidence. Data from a pool of people called for jury service in Kane County, Illinois, support the conclusion reached in most previous empirical research: Jurors tend to undervalue the scientific evidence when measured against a Bayesian norm. More important, the present results indicate that a careful use of Bayesian methods in the courtroom can assist the jury in reaching more accurate verdicts, a conclusion with less support in previous studies. This study also suggests that it may be possible in some contexts to achieve comparable improvements in accuracy by giving juries less information than is conventionally given to them. The significance of this last result, however, is more ambiguous.
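
For reference, the "Bayesian norm" against which juror judgments are typically measured in this literature is the odds form of Bayes' rule. Assuming an error-free match report with random match probability p (the textbook formulation, not necessarily the precise benchmark used in this study), it reads:

    \frac{P(\text{source} \mid \text{match})}{P(\text{not source} \mid \text{match})}
      = \frac{P(\text{match} \mid \text{source})}{P(\text{match} \mid \text{not source})}
        \times \frac{P(\text{source})}{P(\text{not source})}
      \approx \frac{1}{p} \times \frac{P(\text{source})}{P(\text{not source})}.

Undervaluing the evidence, in this sense, means reporting posterior odds lower than the prior odds multiplied by 1/p.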

CITATION: Scott B. Morris and Dale A. Nance, An Empirical Assessment of Presentation Formats for Trace Evidence with a Relatively Large and Quantifiable Random Match Probability, 42 Jurimetrics J. 403–448 (2002).
Ferguson v. City of Charleston: The Boundaries of the Special Needs Exception
Burt C. Binenfeld

In Ferguson v. City of Charleston, 532 U.S. 67 (2001), the Supreme Court ruled that a state hospital's policy of testing pregnant women for drug use was unconstitutional. It held that the program fell outside the scope of the "special needs exception" to the Fourth Amendment's normal requirement of probable cause and a warrant. The Court reasoned that the exception did not apply because the immediate goal was to obtain evidence for prosecution. This note describes the opinions in the case and considers other drug testing programs that might pass constitutional muster.

CITATION: Burt C. Binenfeld, Note, Ferguson v. City of Charleston: The Boundaries of the Special Needs Exception, 42 Jurimetrics J. 449–463 (2002).
Dendrite v. Doe: A New Standard for Protecting Anonymity on Internet Message Boards
Kevin Wein

In Dendrite v. Doe, 775 A.2d 756 (N.J. Super. Ct. App. Div. 2001), a New Jersey appellate court sought to create a test to determine when a plaintiff may obtain the identity of anonymous on-line message board posters. While the court correctly insisted on a higher standard before granting a request to unmask potential defendants, it failed to develop a clear framework that sufficiently protects the important constitutional rights in question. A more appropriate approach would be to institute a qualified privilege for the identity of on-line message board posters.

CITATION: Kevin Wein, Note, Dendrite v. Doe: A New Standard for Protecting Anonymity on Internet Message Boards, 42 Jurimetrics J. 465–477 (2002).
Eli Lilly & Co. v. Barr Labs., Inc.: The Muddling of the Obviousness-Type Double Patenting Doctrine
Hsin Pai

In Eli Lilly & Co. v. Barr Labs., Inc., 251 F.3d 955 (Fed. Cir. 2001), the Court of Appeals for the Federal Circuit applied the doctrine of obviousness-type double patenting to hold that a patent on the active ingredient in Prozac was invalid. This note describes that doctrine and suggests that although the court's conclusion was correct, one portion of its analysis is problematic.

CITATION: Hsin Pai, Note, Eli Lilly & Co. v. Barr Labs., Inc.: The Muddling of the Obviousness-Type Double Patenting Doctrine, 42 Jurimetrics J. 479–491 (2002).
Chou v. University of Chicago: Assigned Patent Rights-Gone but Not Forgotten
Daniel P. Valentine

As the commercial value of research conducted at universities has increased, so have disputes among university researchers over who invented a patented process or material. Chou v. University of Chicago, 254 F.3d 1347 (Fed. Cir. 2001), illustrates one difficulty faced by courts in resolving such inventorship disputes—assessing whether a plaintiff who does not own the patent has standing to correct inventorship. Chou reveals that when the plaintiff retains a contractual right to benefit from royalties earned by the patent, sufficient grounds for standing exist. The Federal Circuit also suggested that a non-monetary, reputational interest may be sufficient to establish standing.

CITATION: Daniel P. Valentine, Note, Chou v. University of Chicago: Assigned Patent Rights-Gone but Not Forgotten, 42 Jurimetrics J. 493–500 (2002).
Genetic Testing, Liability, and Regulatory Policy: The Canadian Situation
Timothy Caulfield

As in many countries throughout the world, Canada is struggling to address the legal and ethical challenges created by the emerging genetic testing technologies. Though Canada currently has little "genetic-testing" jurisprudence and no formal regulations, there has been a significant amount of policy discussion. Moreover, there are reasons to believe that genetic malpractice cases may become more common. Canada's distinctive socio-political environment, marked by its mix of American individualism and a European deference to community interests, will ensure that this emerging legal framework has a unique Canadian tenor.

CITATION: Timothy Caulfield, Genetic Testing, Liability, and Regulatory Policy: The Canadian Situation, 41 Jurimetrics J. 7–21 (2000).
Separating Predictive Genetic Testing from Snake Oil: Regulation, Liabilities, and Lost Opportunities
Michael J. Malinowski

This article explores the extent to which completion of maps of the human genome, coupled with the introduction of technology that will accelerate the identification of gene and protein function, has introduced immeasurable potential to advance life science and health care through genetic profiling. In light of definitional uncertainty, the regulatory and legal environment surrounding predictive genetic testing threatens to impede clinical utilization of genetic profiling technologies that could significantly improve human health. Especially given that genetic testing technologies have been stigmatized in the public and medical community, they must enter the marketplace with a regulatory framework that assures safety, efficacy, and market responsibility.

CITATION: Michael J. Malinowski, Separating Predictive Genetic Testing from Snake Oil: Regulation, Liabilities, and Lost Opportunities, 41 Jurimetrics J. 23–52 (2000).
FDA and the Regulation of Genetic Tests
Neil A. Holtzman

The Secretary's Advisory Committee on Genetic Testing recently recommended that FDA be involved in the regulation of all new genetic tests. This culminates a long effort to end a double standard under which the FDA regulates genetic tests marketed as "kits" but not those marketed as clinical laboratory services. For kits, FDA requires data on the accuracy of predictions or diagnoses of disease. Yet, for tests marketed as services, the FDA applies no such requirement. Using existing policies for classifying medical devices, FDA could readily implement the committee's recommendation, assuring that data on clinical validity are available to assist decision making.

CITATION: Neil A. Holtzman, FDA and the Regulation of Genetic Tests, 41 Jurimetrics J. 53–62 (2000).
Genetic Testing: A Role for FDA?
Richard A. Merrill

Calls for FDA to regulate genetic testing are possibly premature. The concerns that animate the desire for closer regulation lie generally beyond FDA expertise and experience, and to the extent they involve medical judgments, they invite the agency to enter the practice of medicine, an arena it has historically declined to regulate. Further, FDA's current statutory authority is not well suited to the activities that trouble the advocates of reform.

CITATION: Richard A. Merrill, Genetic Testing: A Role for FDA?, 41 Jurimetrics J. 63–66 (2000).
Genetic Susceptibility and Biomarkers in Toxic Injury Litigation
Gary E. Marchant

The tort system generally treats plaintiffs as indistinguishable black boxes, entitled to compensation when a defendant's wrongful act or defective product causes some manifest disease or injury. This paradigm is likely to change dramatically with recent advances in genetic and related technologies. By peering inside the individual plaintiff to identify cellular and molecular markers that indicate both the status and etiology of pre-symptomatic disease processes, it will be possible to differentiate among individuals with respect to susceptibility and predispositions. This article surveys potential, and in some cases existing, uses of such biomarkers in toxic injury litigation and assesses the doctrinal, procedural, policy, and normative issues they present.

CITATION: Gary E. Marchant, Genetic Susceptibility and Biomarkers in Toxic Injury Litigation, 41 Jurimetrics J. 67–109 (2000).
The Importance of Test Validity and Predictive Value to Screening Programs
Ralph R. Cook

Sensitivity and specificity are two important and relatively fixed characteristics of any clinical test. Arguably, a test's predictive values (positive and negative) are even more important, especially for screening. Unless care is taken at the design stage, the majority of test results in screening programs may be false positives, because the predictive values change dramatically based on the background frequency of the disease. These concepts are illustrated with a hypothetical, periodic disease surveillance program mandated by the court following a community exposure to a toxic agent.
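
A minimal sketch of that dependence, using hypothetical inputs rather than figures from the article: sensitivity and specificity stay fixed while the positive predictive value collapses as the background frequency of disease falls.

    # Positive and negative predictive value as a function of disease prevalence,
    # for a test with fixed sensitivity and specificity (hypothetical values).
    def predictive_values(sensitivity: float, specificity: float, prevalence: float):
        tp = sensitivity * prevalence                 # true positives, per person screened
        fp = (1 - specificity) * (1 - prevalence)     # false positives
        fn = (1 - sensitivity) * prevalence           # false negatives
        tn = specificity * (1 - prevalence)           # true negatives
        return tp / (tp + fp), tn / (tn + fn)         # (PPV, NPV)

    for prevalence in (0.20, 0.02, 0.002):
        ppv, npv = predictive_values(sensitivity=0.95, specificity=0.95, prevalence=prevalence)
        print(f"prevalence {prevalence:>6.3f}:  PPV = {ppv:.2f}   NPV = {npv:.4f}")
    # At 20% prevalence most positive results are true; at 0.2% prevalence the same
    # test yields mostly false positives, which is the design-stage concern the
    # abstract raises.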

CITATION: Ralph R. Cook, The Importance of Test Validity and Predictive Value to Screening Programs, 41 Jurimetrics J. 111–120 (2000).
The Evolution of Predictive Genetic Testing: Deciphering Gene-Environment Interactions
Richard R. Sharp

The evolution of predictive genetic testing is best described as a shift from the detection of rare, highly penetrant "disease genes" to the detection of more common, less predictive genetic susceptibilities to disease and sensitivities to environmental agents. This paper describes how a better understanding of gene-environment interactions will cast new light on familiar bioethical issues and present a number of unique ethical and social challenges. Three topics are examined in detail: the protection of research participants in environmental genomics, how a better understanding of gene-environment interactions could impact socially identifiable groups, and potential shifts in social priorities and assignments of responsibility for health based on our knowledge of genetic sensitivities to environmental agents. Clarifying these emerging areas of concern, many of which have not received adequate attention in the existing bioethics literature and legal scholarship, is critical to ensuring that the benefits of predictive genetic testing are not overshadowed by unintended misuses.

CITATION: Richard R. Sharp, The Evolution of Predictive Genetic Testing: Deciphering Gene-Environment Interactions, 41 Jurimetrics J. 145–163 (2001).
Genetic Testing for Susceptibility to Disease from Exposure to Toxic Chemicals: Implications for Public and Worker Health Policies
Michael Baram

The Environmental Genome Program intends to identify "susceptibility genes" that would indicate if a person is more vulnerable to cancer or other disease as a result of exposure to certain chemicals in the workplace, the environment, foods, or other products. Research findings and the capability to test persons for such genes are likely to impugn and challenge health policies and regulatory programs that do not take genetic susceptibility into account when conferring health benefits and restricting chemical exposures. This article focuses on the Occupational Safety and Health Administration (OSHA) and discusses four options available to this agency for protecting genetically susceptible workers and the issues involved in designing and implementing each option. The options involve amending each workplace chemical standard to incorporate genetic testing in a medically supervised program akin to OSHA's Lead Standard, generic revision of all standards so they are sufficiently stringent to protect susceptible workers, requiring information dissemination to prompt management and workforce initiatives, and incorporating genetic susceptibility in holding employers accountable to OSHA's "general duty clause."

CITATION: Michael Baram, Genetic Testing for Susceptibility to Disease from Exposure to Toxic Chemicals: Implications for Public and Worker Health Policies, 41 Jurimetrics J. 165–176 (2001).
Uses of Biomarkers for Genetic Susceptibility and Exposure in the Regulatory Context
Dale Hattis and Sue Swedis

Biomarkers have the potential to make the connections between exposures and adverse health outcomes clearer, more specific to particular individuals and groups, and more quantitative. As a general matter, biomarker information and other data on individual variability in susceptibility are relevant for risk management assessments of both overall economic efficiency issues and equity or fairness issues. Equity concerns can be particularly important. As people who are at greatest risk are identified within particular groups, they are better able to mobilize social resources for their protection. There are, however, formidable obstacles to using susceptibility information in regulatory decision making. To articulate practical and quantitative risk management standards, decision makers need to address ambiguities in policies toward control of uncertain and variable risks that have persisted in the face of advancing scientific information and risk assessment techniques. Clarification of such ambiguities may come from the relative openness of the U.S. regulatory system, which fosters public articulation and both legislative and judicial review of the technical and policy bases of regulatory decisions.

CITATION: Dale Hattis and Sue Swedis, Uses of Biomarkers for Genetic Susceptibility and Exposure in the Regulatory Context, 41 Jurimetrics J. 177–194 (2001).
Relative Risk Greater than Two in Proof of Causation in Toxic Tort Litigation
Russellyn S. Carruth and Bernard D. Goldstein

The concept of relative risk has been of increasing interest in establishing causation in toxic tort suits. Specifically, courts are asking whether epidemiological data demonstrating a relative risk greater than 2 are required to meet the standard of proof ("more likely than not") or to admit an expert's opinion of causation in toxic tort cases. This article analyzes 31 such cases. Although the frequency of judicial opinions referring to a relative risk greater than 2 is increasing, courts disagree as to the use of 2 as a bright-line test. These cases also illuminate judicial understanding of epidemiology and toxicology as a scientific basis for legal decisions.
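
The arithmetic usually offered in support of the bright line runs as follows (a standard derivation that assumes the excess cases among the exposed are the ones caused by the exposure, not a description of any particular court's reasoning): if exposure multiplies the background disease rate by a relative risk RR, then for an exposed person who develops the disease,

    P(\text{disease caused by exposure} \mid \text{disease, exposed}) = \frac{RR - 1}{RR},

which exceeds one-half, and so satisfies "more likely than not," exactly when RR > 2.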

CITATION: Russellyn S. Carruth and Bernard D. Goldstein, Relative Risk Greater than Two in Proof of Causation in Toxic Tort Litigation, 41 Jurimetrics J. 195–209 (2001).
Genetic Testing in Toxic Injury Litigation: The Path to Scientific Certainty or Blind Alley?
Susan R. Poulter

Rapid increases in the ability to identify disease genes and disease susceptibility genes and the expanding field of toxicogenomics suggest that genetic testing could become an important part of causal proof in toxic injury litigation. This article examines the implications of genetic testing for disease genes or susceptibility genes in establishing or excluding a genetic cause for the plaintiff's injury, particularly in light of the commonly posited theory that genetics and toxic exposures are alternative causes of injury. It discusses both the mechanistic and statistical implications of additive and synergistic models of combined action of toxic and genetic causes and concludes that genetic testing for disease susceptibility will be probative only where there is statistical or mechanistic information on the combined effects of the genetic susceptibility and toxin.

CITATION: Susan R. Poulter, Genetic Testing in Toxic Injury Litigation: The Path to Scientific Certainty or Blind Alley?, 41 Jurimetrics J. 211–238 (2001).
Product Liability for Predictive Genetic Tests
Pilar N. Ossorio

Most genetic testing is currently conducted by test manufacturers, in-house. In-house testing is not subject to liability for product defects because it is considered a service, not a product. However, changes in the regulatory climate and increasing demand for genetic tests will likely result in increasing numbers of genetic tests being sold for use by others, as kit products.

The author argues that genetic test kits should be subject to negligence liability and to liability for product defects, including manufacturing, design and instruction or warning defects. The special rules for medical products that essentially exempt them from design-defect liability should not apply to genetic test kits because (1) genetic tests are not unavoidably unsafe; (2) as with other durable goods, alternative safer designs for genetic test kits may exist or be feasible; (3) genetic tests may not be regulated to the same extent as prescription drugs; and (4) the public cannot rely on physicians as effective gatekeepers for determining the safe and appropriate use of genetic tests, because physicians' training in genetics is inadequate.

CITATION: Pilar N. Ossorio, Product Liability for Predictive Genetic Tests, 41 Jurimetrics J. 239–260 (2001).
The Evolution of Irrationality
Owen D. Jones

The place of the rational actor model in the analysis of individual and social behavior relevant to law remains unresolved. In recent years, scholars have sought frameworks to explain (a) disjunctions between seemingly rational behavior and seemingly irrational behavior, (b) the origins of and influences on law-relevant preferences, and (c) the nonrandom development of norms. This essay explains two components of an evolutionary framework that can encompass all three. They are, respectively, time-shifted rationality and the law of law's leverage.

CITATION: Owen D. Jones, The Evolution of Irrationality, 41 Jurimetrics J. 289–318 (2001).
A Multi-Disciplinary Approach to Legal Scholarship: Economics, Behavioral Economics, and Evolutionary Psychology
Russell Korobkin

Legal scholars interested in the behavior of citizens subject to the law's dictates have long relied on economic analysis to inform their normative analysis of legal problems. In recent years they have begun using insights provided by behavioral economics and evolutionary psychology as well. This article presents a framework for understanding how these disciplines individually and together can inform legal scholarship. First, the article distinguishes between motivational theories of behavior (that underlie both traditional law-and-economics analysis and evolutionary psychology) and empirical observation (exemplified by experimental social sciences such as behavioral economics) and contends that neither approach is alone sufficient to inform sound legal policy. Second, the article presents a model of how motivational theory and empirical observation can be combined to create a more useful multi-disciplinary approach to the study of law-relevant behavior and provides an example of how the model can be used.

CITATION: Russell Korobkin, A Multi-Disciplinary Approach to Legal Scholarship: Economics, Behavioral Economics, and Evolutionary Psychology, 41 Jurimetrics J. 319–336 (2001).
How Humans Make Political Decisions
Paul H. Rubin

Humans evolved in small communities, but now must make political decisions in vastly larger societies. Our evolved decision mechanisms have had to adapt to this change. Economists and psychologists disagree about the degree of human rationality, but consideration of evolutionary forces can help resolve this argument. For evolved reasons, humans pay excessive attention to factors involving identifiable individuals. Other factors that influence decision making are likewise rooted in evolution in small societies.

CITATION: Paul H. Rubin, How Humans Make Political Decisions, 41 Jurimetrics J. 337–356 (2001).
The Divergence of Neuroscience and Law
Jacob R. Waldbauer and Michael S. Gazzaniga

Recent developments in the neurosciences have produced profound insights into brain function and human behavior. There is a hope that neuroscientific studies may conclusively resolve questions about a criminal offender's diminished responsibility. However, neuroscience will never be able to conclusively answer legal questions of individual culpability for criminal actions. The very concept of legal responsibility derives from a particular model of human behavior that neuroscience does not share. Hence, neuroscience will remain mute on the issue of legal responsibility.

CITATION: Jacob R. Waldbauer and Michael S. Gazzaniga, The Divergence of Neuroscience and Law, 41 Jurimetrics J. 357–364 (2001).
Is Evolutionary Analysis of Law Science or Storytelling?
Jeffrey J. Rachlinski

In recent years, some legal scholars have argued that legal scholarship could benefit from a greater reliance on theories of human behavior that arise from biological evolution. These scholars contend that reliance on biological evolution would successfully combine the rigor of economics with the scientific aspects of psychology. Complex legal systems, however, are uniquely human. Law has always been the product of cognitive processes that are unique to humans and that developed as a response to an environment that no longer exists. Consequently, the evolutionary development of the cognitive mechanisms upon which law depends cannot be rigorously modeled or studied empirically.

CITATION: Jeffrey J. Rachlinski, Is Evolutionary Analysis of Law Science or Storytelling?, 41 Jurimetrics J. 365–370 (2001).
Science and Human Behavior: A Reply
Owen D. Jones

Human behavior originates in the human brain, an organ with an evolutionary history. Because that history has shaped the brain's psychological capacities and predispositions over evolutionary time, it continues to influence patterns in human behaviors. Integrating this life science perspective with existing social science perspectives can deepen our understanding of the multiple causes of behaviors relevant to law. Deepened understanding can, in turn, help us to better effect those behavioral changes we hope to achieve with the tools of law.

CITATION: Owen D. Jones, Science and Human Behavior: A Reply, 41 Jurimetrics J. 371–378 (2001).
Can Evolutionary Science Contribute to Discussions of Law?
Jeffrey Evans Stake

Evolutionary theory can be helpful in understanding the law and determining what it should be. There are two ways in which the evolutionary perspective differs from an economic perspective on law. Not only does the evolutionary approach shift our attention from the world today to the environment of evolutionary adaptation; it also shifts our focus from rational individuals to rational genes and from rational behaviors to the rational design of mental architecture. Finally, the law of law's leverage makes predictions about the relative elasticities of demand for all sorts of behaviors, including those that did and did not exist in the environment of evolutionary adaptation.

CITATION: Jeffrey Evans Stake, Can Evolutionary Science Contribute to Discussions of Law?, 41 Jurimetrics J. 379–384 (2001).
Frye, Frye, Again: The Past, Present, and Future of the General Acceptance Test
David E. Bernstein

This article begins by reviewing the history of the Frye general acceptance test for the admissibility of scientific evidence, from its origins in 1923 to its demise in federal court in Daubert v. Merrell Dow Pharmaceuticals, Inc., in 1993. This section focuses especially on how the Frye rule, which for decades applied almost exclusively in criminal cases, came to be the focal point of the controversy over the admissibility of scientific evidence in toxic tort cases in the early 1990s. Next, the article discusses the development of the Frye test since 1993. Despite Daubert, Frye has remained the plurality rule in state courts. Following the lead of federal courts operating under Daubert's broad gatekeeper mandate, Frye jurisdictions are increasingly applying the general acceptance test to scientific evidence in civil cases, especially toxic tort cases. However, Frye jurisdictions are divided regarding whether the general acceptance test applies primarily to the expert's general methodologies or must be applied to the expert's conclusions. Recently, several courts have followed the Supreme Court's lead in General Elec. Co. v. Joiner. Instead of focusing on the methodologies-conclusions distinction, these courts have scrutinized experts' reasoning process. Meanwhile, most Frye jurisdictions do not apply the general acceptance test to nonscientific evidence, although some Frye courts apply a Kumho Tire-like reliability test to such evidence. This article concludes that case law under Frye is slowly converging with Daubert jurisprudence. Rather than allowing this process to continue haphazardly and inconsistently, state legislatures should enact state versions of new Federal Rule of Evidence 702, which explicitly adopts Daubert and its progeny.

CITATION: David E. Bernstein, Frye, Frye, Again: The Past, Present, and Future of the General Acceptance Test, 41 Jurimetrics J. 385–407 (2001).
Mapping Cortical Areas Associated with Legal Reasoning and Moral Intuition
Oliver R. Goodenough

The prevailing tools of legal scholarship have focused the study of law on questions of doctrine. Recent developments in cognitive neuroscience allow us to explore a different kind of problem: how people think when they apply law. First, we must update the accepted model of cognition, replacing the unified Cartesian approach with a multi-capacity, "modular" view of the human mind. Such an approach suggests that the classic, apparently intractable, arguments between positive-law and natural-law adherents may reflect the workings of two separate mental capacities for judging human actions: the application of word-based rules on the one hand and of unarticulated understandings of justice on the other. This hypothesis need not remain just a plausible assertion. The techniques of functional neuroimaging provide an experimental means of testing it. A series of brain-scanning experiments could reveal whether there are significant differences in the brain regions employed in using legal rules and moral intuition to judge human behavior, in the process helping us understand the neurological basis of the distinction between natural and positive law.

CITATION: Oliver R. Goodenough, Mapping Cortical Areas Associated with Legal Reasoning and Moral Intuition, 41 Jurimetrics J. 429–442 (2001).
Criminal Invasion of Privacy: A Survey of Computer Crimes
Jordan M. Blanke

Computers, databases, and the Internet have made personal information readily available. All states have enacted criminal laws to protect against abuses of accessing or using such data. This article traces the history of privacy as it pertains to personal information and explores the criminal laws against invasion of privacy.

CITATION: Jordan M. Blanke, Criminal Invasion of Privacy: A Survey of Computer Crimes, 41 Jurimetrics J. 443–463 (2001).
Childproofing on the World Wide Web: A Survey of Adult Webservers
Daniel Orr and Josephine Ferrigno-Stack

Using a random sample of web servers, this article examines the prevalence of various forms of on-line content, including adult content. We found adult content accounts for 2.1 percent of information on the World Wide Web, substantially less than commercial, educational, personal, or organizational content. Additionally, we examined the age-verification measures and business practices of adult web sites. Currently 73.8 percent of these sites provide access to adult content on their index page without age verification. We further assessed the prevalence of site ratings, disclaimers, and privacy policies, as well as the use of services that verify age through a credit card and "mousetraps," technical devices designed to prevent users from leaving the site. We present our results in these areas and discuss their policy implications.

CITATION: Daniel Orr and Josephine Ferrigno-Stack, Childproofing on the World Wide Web: A Survey of Adult Webservers, 41 Jurimetrics J. 465–475 (2001).
Protecting Farmer Innovation: The Convention on Biological Diversity and the Question of Origin
Cary Fowler

The objectives of the Convention on Biological Diversity (CBD) are "the conservation of biological diversity, the sustainable use of its components, and the fair and equitable sharing of the benefits arising out of the utilization of genetic resources . . . ." The CBD states that access is provided on the basis of "prior informed consent" and under "mutually agreed terms." Only countries that are "countries of origin" are empowered to give this consent and agree to terms. The definition of "countries of origin," however, lacks clarity and scientific rigor as applied to domesticated and cultivated species. Agricultural biodiversity is the product of innovation, whether in farmer-selected crop varieties or the latest biotechnologically produced gene construct. How such innovations and associated technologies will be protected and derivative benefits apportioned has been the subject of controversy for centuries. The CBD aimed, in part, to address this question. The particular strategy employed by the CBD, however, is not likely to be successful, given the difficulties that will surely be encountered in identifying "countries of origin" for plant genetic resources for food and agriculture.

CITATION: Cary Fowler, Protecting Farmer Innovation: The Convention on Biological Diversity and the Question of Origin, 41 Jurimetrics J. 477–488 (2001).
Toward an Understanding of the Effect of Changes in Standards of Proof on Errors of Justice
Brian Forst

This article explores the effects of standards of proof on verdict error rates, particularly errors related to the identity of the offender. It focuses on the number of offenders set free per innocent person erroneously convicted and the overall error rate. By combining available data on conviction rates with assumptions about the percentage of accused persons who actually committed the crimes, the effects of various procedural changes on errors in ascertaining guilt can be estimated. These include the effect of (1) raising or lowering the standard of proof required for conviction; and (2) the prosecutor's tradeoff between the number and accuracy of convictions. The accuracy of convictions increases as the prosecutor spends more time on each case and seeks fewer sentencing concessions. Effects of other changes also are considered, including the effects of improvements in forensic analysis, the effects of alternative investigative profiling rules on errors in pre-arrest decision making, and jury size and voting rules. Some interventions shift errors from false acquittals to false convictions or vice versa; others reduce or increase both types of errors.
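
The bookkeeping behind such estimates can be made concrete with a short calculation. The sketch below is not drawn from the article; every number in it is a hypothetical placeholder, chosen only to show how an assumed guilt rate and assumed conviction probabilities translate into counts of the two error types.

    # Hypothetical illustration of error-rate arithmetic; none of these rates are Forst's.
    cases = 1000                  # defendants brought to trial
    share_guilty = 0.80           # assumed share who actually committed the crime
    p_convict_guilty = 0.75       # assumed conviction probability for guilty defendants
    p_convict_innocent = 0.05     # assumed conviction probability for innocent defendants

    guilty = cases * share_guilty
    innocent = cases - guilty

    false_convictions = innocent * p_convict_innocent    # innocent persons convicted
    false_acquittals = guilty * (1 - p_convict_guilty)   # offenders set free

    # The ratio the article focuses on: offenders set free per innocent person convicted.
    print(false_convictions, false_acquittals, false_acquittals / false_convictions)

Raising the standard of proof lowers both conviction probabilities and so trades false convictions for false acquittals; the sketch shows only the arithmetic, not the article's estimates.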

CITATION: Brian Forst, Toward an Understanding of the Effect of Changes in Standards of Proof on Errors of Justice, 41 Jurimetrics J. 489–504 (2001).
The Meaning of Genes and "Genetic Rights"
Lee M. Silver

In recent years, the notion of "genetic rights" has been promulgated in both the lay press and academic discourse. This paper reviews the concept of a gene and the question of genetic uniqueness. After explaining how genes are packets of immaterial information, it argues that because "genetic rights" is a vague concept with diverse meanings, the term should be avoided. Instead, scholars should focus on questions of reproductive rights, genetic privacy, and genetic discrimination.

CITATION: Lee M. Silver, The Meaning of Genes and "Genetic Rights", 40 Jurimetrics J. 9–19 (1999).
Genetic Privacy and the Law: An End to Genetics Exceptionalism
Lawrence O. Gostin and James G. Hodge, Jr.

While the proliferation of human genetic information promises to achieve many public benefits, the acquisition, use, retention, and disclosure of genetic data threaten individual liberties. States (and to a lesser degree, the federal government) have responded to the anticipated and actual threats of privacy invasion and discrimination by enacting several types of genetic-specific legislation. These laws emphasize the differences between genetic information and other health information. By articulating these differences, governments afford genetic data an "exceptional" status.

The authors argue that genetic exceptionalism is flawed for two reasons: (1) strict protections of autonomy, privacy, and equal treatment of persons with genetic conditions threaten the accomplishment of public goods; and (2) there is no clear demarcation separating genetic data from other health data; other health data deserve protections in a national health-information infrastructure. The authors present ideas for individual privacy protections that balance the societal need for genetic information and the claims for privacy by individuals and families.

CITATION: Lawrence O. Gostin and James G. Hodge, Jr., Genetic Privacy and the Law: An End to Genetics Exceptionalism, 40 Jurimetrics J. 21–58 (1999).
Privacy Issues in Second Stage Genomics
John A. Robertson

Research that identifies genes useful in the prevention and treatment of disease will require access to biologic samples and medical records protected by traditional notions of privacy and confidentiality. Resolving conflicts between privacy and genomic research will require articulating the ethical rules that should govern such practices and then implementing those rules in the national, regional, or local health systems in which the data of interest exists. As consensus develops about the ethical rules that should govern such research, attention will shift to the practical and political problems of installing and implementing those rules in the agencies and institutions where such research will occur.

CITATION: John A. Robertson, Privacy Issues in Second Stage Genomics, 40 Jurimetrics J. 59–76 (1999).
Ethical Genetic Research on Human Subjects
John Harris

Since the Nuremberg trials and the Nazi doctors' trial following World War II, international ethics protocols have emerged designed to protect human subjects from the atrocities of medical experimentation that were literally routine under the Nazis. Some of the apparent "lessons" from the Nazi period have been encapsulated in the Declaration of Helsinki, perhaps the leading medical ethics protocol. This paper argues that these protocols have not been notably conducive to human welfare or to the protection of human rights in the field of human genetics research. The paper proposes new protocols and a new approach to the ethics of research on human subjects.

CITATION: John Harris, Ethical Genetic Research on Human Subjects, 40 Jurimetrics J. 77–91 (1999).
Legal Rules and Industry Norms: The Impact of Laws Restricting Health Insurers' Use of Genetic Information
Mark A. Hall

Since 1991, twenty-eight states have enacted laws that prohibit insurers' use of genetic information in pricing, issuing, or structuring health insurance. This article evaluates whether these laws reduce the extent of genetic discrimination by health insurers. Using multiple data sources, it concludes that there are almost no well-documented cases of health insurers asking for or using pre-symptomatic genetic test results in their underwriting decisions, either before or after these laws, or in states with or without these laws. At present, health insurers are not thinking about or interested in using genetic information of this sort. Using this information is not cost effective and is not seen as contributing significantly to underwriting accuracy. However, if genetic testing information were easily available, some health insurers would consider using it in some fashion if that were legal. In the future, such information could become much more relevant to health insurers than it is now. Therefore, the major effect of these laws is to make it less likely that insurers will use genetic information in the future. Although insurers and agents are only vaguely aware of these laws, the laws have helped to convince the industry that it is not appropriate or socially legitimate to use this information. Thus, these laws have caused the insurance industry to embrace more socially oriented norms and attitudes.

CITATION: Mark A. Hall, Legal Rules and Industry Norms: The Impact of Laws Restricting Health Insurers' Use of Genetic Information, 40 Jurimetrics J. 93–122 (1999).
Understanding Prohibitions Against Genetic Discrimination in Insurance
Kenneth S. Abraham

The justification for laws prohibiting genetic discrimination in health insurance is not at all clear. Neither privacy protection, the distinctive features of health insurance, nor the distinction between presymptomatic genetic tendencies and actually manifested disease provide a justification, although certain practical considerations may justify these laws.

CITATION: Kenneth S. Abraham, Understanding Prohibitions Against Genetic Discrimination in Insurance, 40 Jurimetrics J. 123–128 (1999).
Genetic Difference in the Workplace
Michael S. Yesley

Many states have enacted laws prohibiting genetic discrimination by employers. This article considers the need for these laws and the applicability of federal laws to such discrimination. The article also examines situations in which limiting employees' opportunities on the basis of their genetic status may be appropriate.

CITATION: Michael S. Yesley, Genetic Difference in the Workplace, 40 Jurimetrics J. 129–142 (1999).
Iceland's Plan for Genomics Research: Facts and Implications
Henry T. Greely

The government of Iceland has authorized a private, for-profit firm, deCODE Genetics, to construct a database of the population's medical records as part of a larger plan by deCODE for human genetics research. This article presents the background for genetics research in Iceland, the history of deCODE, and the terms of the law authorizing the database. It then examines five objections to the law, based on commercialization, lack of informed consent, risks to privacy, the effects on other research, and financial unfairness. It concludes that the Icelandic model is not a good precedent for similar research elsewhere.

CITATION: Henry T. Greely, Iceland's Plan for Genomics Research: Facts and Implications, 40 Jurimetrics J. 153–191 (2000).
Bioethics, Bench, and Bar: Selected Arguments in Landry v. Attorney General
D.H. Kaye

In Landry v. Attorney General, the Massachusetts Supreme Judicial Court upheld a statute that requires individuals convicted of a wide range of felonies to submit to the extraction of samples of their DNA for the analysis of individualizing features and for the inclusion of that data in a computerized database. Various organizations submitted amicus briefs to help the court understand the underlying science or technology or to appreciate the bioethical issues in using the data or samples in subsequent research. This article reviews portions of two of these briefs for their accuracy and completeness. It concludes that they are no less adversarial than those of the parties. It suggests that the arguments about invasions of a right to genetic privacy suffer in the translation from medical genetics to law enforcement identification data bases. It also contends that whether research uses of data or samples should be allowed without the consent of the offenders is a question of public policy that cannot be resolved by absolute and sweeping claims that information on people can never be used without their consent. It urges that the developing norms of research on human subjects be examined with greater recognition of the difference between clinical or research uses of genetic data and law enforcement data banking.

CITATION: D.H. Kaye, Bioethics, Bench, and Bar: Selected Arguments in Landry v. Attorney General, 40 Jurimetrics J. 193–216 (2000).
A Most Uncommon Carrier: On-Line Service Provider Immunity Against Defamation Claims in Blumenthal v. Drudge
Joshua M. Masur

Blumenthal v. Drudge, 992 F.Supp. 44 (D.D.C. 1998), found that the Communications Decency Act of 1996 extended statutory immunity against a libel claim to any on-line service provider that republished material, whether the republication was automatic or volitional. This result is not mandated by the ambiguous statutory text and runs counter to the vast body of common law distinguishing common-carrier immunity from liability for volitional republication. As the decision laments, this result is at odds with public policies underlying libel law. This article examines the evolution of republication liability and immunity in Internet and analogous telephone case law, and proposes an alternative reading of the statute.

CITATION: Joshua M. Masur, A Most Uncommon Carrier: On-Line Service Provider Immunity Against Defamation Claims in Blumenthal v. Drudge, 40 Jurimetrics J. 217–228 (2000).
The Aftermath of Daubert: An Evolving Jurisprudence of Expert Evidence
Michael J. Saks

This essay examines the developing jurisprudence of expert evidence under the Federal Rules of Evidence by reviewing the Supreme Court's holdings in its post-Daubert scientific evidence cases. In particular, it considers the questions of the standard of appellate review, the distinction between methodology and conclusions, and Daubert's applicability to claims of nonscientific expertise.

CITATION: Michael J. Saks, The Aftermath of Daubert: An Evolving Jurisprudence of Expert Evidence, 40 Jurimetrics J. 229–241 (2000).
Acclimation Effects Revisited
Charles R. Shipan

New Supreme Court Justices may have difficulty in adjusting to the work of the Court and as a result may exhibit acclimation effects. Some previous studies have examined this idea by comparing the behavior of new Justices with that of more senior colleagues. Others have compared each new Justice's initial behavior to his or her behavior in later years. This study also examines data both cross-sectionally and longitudinally, but it reconceptualizes the notion of acclimation effects. This reconceptualization draws directly on the definition of acclimation effects found in previous accounts and yields an empirical test that differs from those in other empirical studies. In particular, it shows that if a new Justice does exhibit an acclimation effect, then (1) his or her early-career voting patterns should be more unstable, or volatile, than his or her later-career patterns, and (2) his or her early-career voting patterns should be more unstable than the patterns of his or her more senior colleagues. I test these hypotheses using vote swings as a measure of instability. The results show that acclimation effects do exist for some Justices but cannot be considered a general phenomenon.
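
As a purely illustrative aside, instability of this kind might be operationalized as an average "vote swing" across consecutive terms; the sketch below uses invented support scores, and Shipan's actual measure and data may differ.

    # Illustrative only: one plausible way to measure "vote swings" as
    # term-to-term instability in a Justice's support for an issue area.
    def mean_swing(support_by_term):
        """Average absolute change between consecutive terms."""
        changes = [abs(b - a) for a, b in zip(support_by_term, support_by_term[1:])]
        return sum(changes) / len(changes)

    early_terms = [0.42, 0.61, 0.35, 0.58, 0.40]   # hypothetical first five terms
    later_terms = [0.48, 0.51, 0.47, 0.52, 0.49]   # hypothetical later terms

    # An acclimation effect would appear as larger swings in the early terms than in
    # the later ones, and larger early swings than those of more senior colleagues.
    print(mean_swing(early_terms), mean_swing(later_terms))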

CITATION: Charles R. Shipan, Acclimation Effects Revisited, 40 Jurimetrics J. 243–256 (2000).
Designing Electronic Casebooks That Talk Back: The CATO Program
Kevin D. Ashley

Electronic casebooks offer important benefits of flexibility in control of presentation, connectivity, and interactivity. These additional degrees of freedom, however, also threaten to overwhelm students. If casebook authors and instructors are to achieve their pedagogical goals, they will need new methods for guiding students. This paper presents three such methods developed in an intelligent tutoring environment for engaging students in legal role-playing, making abstract concepts explicit and manipulable, and supporting pedagogical dialogues. This environment is built around a program known as CATO, which employs artificial intelligence techniques to teach first-year law students how to make basic legal arguments with cases. Ongoing improvements in CATO point the way for electronic casebooks to engage students in realistic analogical legal arguments. By reorganizing the electronic casebook's explicit information about cases and implicit knowledge of argumentation along the lines of CATO's knowledge sources, it is possible to orchestrate a real dialog between a book and its reader.

CITATION: Kevin D. Ashley, Designing Electronic Casebooks That Talk Back: The CATO Program, 40 Jurimetrics J. 275–319 (2000).
Epidemiology, Justice, and the Probability of Causation
Sander Greenland and James M. Robins

The concept of "probability of causation" forms the basis of important legal standards, legislation, and compensation schemes, which in turn use epidemiologic data to estimate the probability of causation. This usage is a misapplication of epidemiology, because it has been shown that without imposing restrictive biologic assumptions, epidemiologic data cannot supply estimates of the probability of causation. Although the misapplication of the probability of causation concept responds to the need to resolve cases in a rational and consistent manner, this need does not justify continued misuse of epidemiologic data in compensation decisions. Compensation schemes and legal standards must recognize that an upper bound on the probability of causation cannot be determined from epidemiologic data alone; biologic models also are needed. Although equitable compensation schemes can be formulated without reference to the probability of causation, all schemes must deal with fundamental methodologic uncertainties in estimation.

CITATION: Sander Greenland and James M. Robins, Epidemiology, Justice, and the Probability of Causation, 40 Jurimetrics J. 321–340 (2000).
Sampling-Based Adjustment of the 2000 Census-A Balanced Perspective
Margo Anderson, Beth Osborne Daponte, Stephen E. Fienberg, Joseph B. Kadane, Bruce D. Spencer, and Duane L. Steffey

Long before the issue of adjustment for differential undercount of blacks and other minorities arose in connection with the 1980 and 1990 censuses, public controversy attended the publication of census results. A recent criticism by Brown et al. (Jurimetrics J. 39: 347-375) concludes that the proposed design for Census 2000, which includes an updated version of such adjustment, is too risky. Their account of past practices is deeply flawed. This article attempts to provide a more balanced perspective of past practices and the fascinating statistical and legal issues that Census 2000 raises.

CITATION: Margo Anderson, Beth Osborne Daponte, Stephen E. Fienberg, Joseph B. Kadane, Bruce D. Spencer, and Duane L. Steffey, Sampling-Based Adjustment of the 2000 Census-A Balanced Perspective, 40 Jurimetrics J. 341–356 (2000).
Ready, Set, Patent! How the Supreme Court in Pfaff v. Wells Electronics Jumped the Gun
Lucius L. Lockwood

35 U.S.C. § 102(b) prohibits an inventor from obtaining a patent for an invention placed on sale in the U.S. more than one year prior to the date of the application for the patent. To determine whether an invention is on sale, courts have looked to whether the invention was substantially complete, whether it was "on hand" at the time of sale, and whether the totality of the circumstances indicated that it was on sale. The Supreme Court in Pfaff v. Wells Electronics decided that the relevant inquiry in "on-sale bar" cases is whether an invention offered for sale was "ready for patenting." This Note reviews the Supreme Court's decision, analyzes its policy implications, identifies problems with the Court's new test, and proposes an alternative standard.

CITATION: Lucius L. Lockwood, Note, Ready, Set, Patent! How the Supreme Court in Pfaff v. Wells Electronics Jumped the Gun, 40 Jurimetrics J. 399–415 (2000).
RIAA v. Diamond Multimedia: Can Music Copyright Owners Protect Themselves from the Rio?
Laura Sawicki

In RIAA v. Diamond Multimedia, the Ninth Circuit Court of Appeals decided that the Rio, a device that transfers digital audio music files in MP3 format to a portable player, is not a digital audio recording device, and thus does not fall under the ambit of the Audio Home Recording Act of 1992. This Note suggests that a dynamic theory of statutory interpretation should have been used to reach the opposite conclusion.

CITATION: Laura Sawicki, Note, RIAA v. Diamond Multimedia: Can Music Copyright Owners Protect Themselves from the Rio?, 40 Jurimetrics J. 417–430 (2000).
Encryption Source Code and the First Amendment
John J. Browder

In Junger v. Daley, the Court of Appeals for the Sixth Circuit held that the usual First Amendment standards apply to export regulations requiring a license before distributing encryption source code. In Bernstein v. United States Department of Justice, a panel of the Court of Appeals for the Ninth Circuit reached the same conclusion and struck down the licensing system as an unconstitutional prior restraint on speech. The Ninth Circuit agreed to hear the matter en banc, however, and vacated the panel opinion. This Note argues that the opinions in Junger and Bernstein are correct, and it shows that, although regulations promulgated after Bernstein are somewhat less restrictive, they too represent an unconstitutional prior restraint on the dissemination of encryption source code on the Internet.

CITATION: John J. Browder, Note, Encryption Source Code and the First Amendment, 40 Jurimetrics J. 431–444 (2000).
The Propriety of "Facial Challenges" to Prior Restraints on the Internet
D.H. Kaye

Two federal courts of appeals recently ruled that posting encryption source code on the Internet is a form of scientific speech and that export regulations requiring a license for such activity are a prior restraint on expression. However, in their present posture, these cases leave open the possibility that the courts will decide that the plaintiffs, who did not themselves apply for a license, lack standing to attack the export regulations as unconstitutional. The temptation to avoid the merits of the litigation in this way should be rejected. The First Amendment confers standing on litigants to challenge the export regulations as a prior restraint without submitting to their demands.

CITATION: D.H. Kaye, The Propriety of "Facial Challenges" to Prior Restraints on the Internet, 40 Jurimetrics J. 445–456 (2000).
Due Process Analysis in Millennium Enterprises, Inc. v. Millennium Music, L.P.
Angela R. Probasco

This Note analyzes the opinion of the District Court for the District of Oregon in Millennium Enterprises, Inc. v. Millennium Music, L.P., in the context of the development of the law on Internet personal jurisdiction.

CITATION: Angela R. Probasco, Note, Due Process Analysis in Millennium Enterprises, Inc. v. Millennium Music, L.P., 40 Jurimetrics J. 457–468 (2000).
Protection of Trademarks from Use in Internet Advertising Banner Triggers: Playboy v. Netscape
Erik T. Anderson

This Note discusses Playboy Enterprises v. Netscape Communications Corp., 55 F. Supp. 2d 1070 (C.D. Cal. 1999), and the application of trademark law to Internet banner advertising that responds to queries involving the trademarks of others. The Note suggests that such cases will turn on particular facts relating to the likelihood of confusion.

CITATION: Erik T. Anderson, Note, Protection of Trademarks from Use in Internet Advertising Banner Triggers: Playboy v. Netscape, 40 Jurimetrics J. 469–484 (2000).
American Trucking and the Nondelegation Doctrine: A New Twist on an Old Doctrine
Brigham A. Cluff

In American Trucking Associations v. EPA, the Court of Appeals for the District of Columbia Circuit purported to use the nondelegation doctrine to invalidate air-quality standards promulgated by the Environmental Protection Agency under the Clean Air Act. This Note suggests that the court did not revive the nondelegation doctrine, but rather fashioned a new doctrine that undermines the traditional nondelegation doctrine by allowing statutes that represent unconstitutional delegations of legislative power to be rescued by interpretations of administrative agencies.

CITATION: Brigham A. Cluff, Note, American Trucking and the Nondelegation Doctrine: A New Twist on an Old Doctrine, 40 Jurimetrics J. 485–497 (2000).
United States v. Playboy Entertainment Group, Inc., and Television Channel Blocking Technology
Andrea K. Rodgers

In United States v. Playboy Entertainment Group, Inc., the Supreme Court invalidated on First Amendment grounds a provision of the Communications Decency Act of 1996. This law required cable television operators who provide channels "primarily dedicated to sexually oriented programming" either to "fully scramble or otherwise fully block" those channels or to limit their transmission to hours when children are unlikely to be viewing. This Note analyzes the opinions in the case and explores technology that might enable parents to block channels. The Note argues that the Court correctly applied strict scrutiny, and it suggests that channel blocking technology offers a less restrictive alternative to the legislation and should play a role in the analysis of content-based restrictions on television programming.

CITATION: Andrea K. Rodgers, Note, United States v. Playboy Entertainment Group, Inc., and Television Channel Blocking Technology, 40 Jurimetrics J. 499–516 (2000).
A Scholar's Privilege: In re Cusumano
Judith G. Shelling

In re Cusumano, 162 F.3d 708 (1st Cir. 1998), holds that interview materials collected by two scholars for a book were privileged from discovery. This Note describes the case and the limited scope of its holding. It also outlines arguments for a broader or more powerful privilege than the one recognized in Cusumano.

CITATION: Judith G. Shelling, Note, A Scholar's Privilege: In re Cusumano, 40 Jurimetrics J. 517–526 (2000).
Codifying the "Daubert Trilogy:" The Amendment to Federal Rule of Evidence 702
Catherine E. Brixen and Christine M. Meis

In December 2000, an amendment to Federal Rule of Evidence 702 will take effect. This Note examines the amendment, which codifies Daubert v. Merrell Dow Pharmaceuticals, Inc., and its progeny.

CITATION: Catherine E. Brixen and Christine M. Meis, Note, Codifying the "Daubert Trilogy:" The Amendment to Federal Rule of Evidence 702, 40 Jurimetrics J. 527–536 (2000).
Non-human DNA Evidence
George Sensabaugh and David Kaye

After a decade of use, the novelty of DNA evidence has worn off. Nonetheless, new methods of DNA analysis are being introduced, new loci within the human genome are being employed, and DNA samples from other organisms are being tested. When evidence made possible by these advances is presented in courts, judges must decide whether the newer procedures are scientifically sound or generally accepted in the scientific community. Focusing on nonhuman DNA evidence, this article identifies and describes factors that courts should examine in passing on the admissibility of novel DNA evidence. The framework that we construct is neither rigid nor self-applying. Some understanding of and appreciation for the nature, structure, and process of scientific reasoning in general, and of the special characteristics of forensic science in particular, are necessary to evaluate the scientific quality of the evidence.

CITATION: George Sensabaugh and David Kaye, Non-human DNA Evidence, 39 Jurimetrics J. 1–16 (1998).
Computational Complexity and the Scope of Software Patents
Andrew Chin

Recent developments in patent law, most notably the effective nullification of the Supreme Court's 1972 Benson decision excluding mathematical algorithms from patentable subject matter, have attempted to reflect an increasingly sophisticated approach to computer science and technology. Despite this, the patent system has continued to disregard computational complexity, an issue of central concern to computer scientists and of strategic importance to U.S. information technology policy. This article proposes a development of patent scope doctrine that would introduce the issue of computational complexity into patent infringement analysis, thereby encouraging more efficient algorithm design, enhancing public benefits from complementary improvements in computer hardware, and strengthening the institutional competence of the patent system.

CITATION: Andrew Chin, Computational Complexity and the Scope of Software Patents, 39 Jurimetrics J. 17–28 (1998).
What's Wrong with the Probability of Causation?
Mark Parascandola

In mass toxic tort cases, it is usually impossible to identify any single factor as the exclusive cause of an individual's disease. In these cases, epidemiologic evidence may help prove causation, but most jurisdictions require studies to show that the "relative risk" associated with exposure to the toxin is more than two. Implicit in this threshold requirement are numerous assumptions about the biological bases underlying chronic disease. This article identifies those assumptions and argues that they are unjustified. Requiring a relative risk in excess of two fails to fulfill the policy goals it seeks to achieve.
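
The arithmetic behind the threshold is worth spelling out (a standard reconstruction, not language from the article). The fraction of exposed cases attributable to the exposure is usually taken to be (RR - 1) / RR, and

    (RR - 1) / RR > 1/2   exactly when   RR > 2,

which courts read as making causation "more likely than not" for an individual plaintiff. The article's point is that treating this attributable fraction as an individual probability of causation builds in precisely the biological assumptions it identifies and challenges.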

CITATION: Mark Parascandola, What's Wrong with the Probability of Causation?, 39 Jurimetrics J. 29–44 (1998).
Have the Courts Grounded the Space Launch Industry? Reciprocal Waivers and the Commercial Space Launch Act
Kim B. Watson

The Commercial Space Launch Act of 1984 (CSLA) manifests Congress's efforts to facilitate the growth of the U.S. private space launch industry. The CSLA requires, among other things, that launch participants execute reciprocal waivers of claims, under which each party agrees to be responsible for property loss or personal injury it sustains as a result of the launch activities. A recent Fourth Circuit Court of Appeals decision, Martin Marietta v. INTELSAT, has brought into question the effectiveness of this reciprocal waiver requirement in limiting both insurance premiums and liability for private launch companies. This article explores the history of the commercial launch industry and the CSLA. It focuses primarily on the Martin Marietta case and discusses how the Fourth Circuit's interpretation of the CSLA's reciprocal waiver requirement could produce a chilling effect on the growth of the private launch industry.

CITATION: Kim B. Watson, Have the Courts Grounded the Space Launch Industry? Reciprocal Waivers and the Commercial Space Launch Act, 39 Jurimetrics J. 45–58 (1998).
Diagnostic Test Methodology in the Design and Analysis of Judge-Jury Agreement Studies
Joseph L. Gastwirth and Michael D. Sinclair

The Hui-Walter model for estimating the accuracy of diagnostic tests is applied and extended to judge-jury agreement data from The American Jury. When the assumptions required by these methods are valid, they provide numerical estimates of the overall accuracy rates of jury verdicts, the corresponding judicial assessments, and the proportion of defendants who should have been found guilty. This method provides a more detailed quantitative analysis of the findings of Kalven and Zeisel that juries tend to be more lenient than judges in criminal cases. Since similar studies are ongoing, we present an alternative study design that is less dependent on assumptions required by the original diagnostic test protocol.
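
A compressed statement of the setup may help (this is the generic Hui-Walter structure, not the authors' extensions). Treat the judge and the jury as two imperfect "tests" of guilt applied to groups of cases with different underlying guilt rates, and assume their errors are independent given the defendant's true status. Then, for a group g in which a proportion pi_g of defendants are truly guilty, the probability that judge and jury both favor conviction is

    pi_g * Se_judge * Se_jury + (1 - pi_g) * (1 - Sp_judge) * (1 - Sp_jury),

where each decision maker's sensitivity (Se) is the probability of convicting the guilty and specificity (Sp) is the probability of acquitting the innocent. With two groups of cases, the observed agreement tables yield enough equations to estimate the two guilt rates and all four accuracy parameters.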

CITATION: Joseph L. Gastwirth and Michael D. Sinclair, Diagnostic Test Methodology in the Design and Analysis of Judge-Jury Agreement Studies, 39 Jurimetrics J. 59–78 (1998).
Scientific Evidence of Paternity: A Survey of State Statutes
Allan Z. Litovsky and Kirsten Schultz

Scientific advances in paternity testing have prompted the American legal system to adopt these technologies for courtroom use by means of state statutes. This Note surveys the current statutory scheme regarding the admissibility of scientific paternity tests in the fifty states and the District of Columbia. Variations and ambiguities among the states' statutes are explored, and the progress made in the scheme is noted. Tables summarizing the state statutory provisions follow the text.

CITATION: Allan Z. Litovsky and Kirsten Schultz, Scientific Evidence of Paternity: A Survey of State Statutes, 39 Jurimetrics J. 79–94 (1998).
The Rape Reform Movement: The Traditional Common Law and Rape Law Reforms
Cassia C. Spohn

The past three decades have witnessed a virtual revolution in rape law in the United States. This article summarizes the criticisms of traditional rape law and discusses the rape reform movement, focusing on the four most common reforms and their predicted effects.

CITATION: Cassia C. Spohn, The Rape Reform Movement: The Traditional Common Law and Rape Law Reforms, 39 Jurimetrics J. 119–130 (1999).
The Biology of Human Rape
Randy Thornhill

This article discusses rape from a biological perspective and dispels some myths about what that perspective is, and what it entails. It emphasizes that evolution applies to rape, just as it does to any feature of life. How exactly evolution applies is unclear, but various hypotheses are presented and evaluated. A rape-specific adaptation in men seems likely, but more research is needed to discriminate the rape adaptation hypothesis from the incidental effect hypothesis. In addition, this essay suggests that the widespread opposition to studying rape from an evolutionary biological perspective stems from a bias against viewing rape in biological terms, a bias we must understand and confront if rape research is to progress.

CITATION: Randy Thornhill, The Biology of Human Rape, 39 Jurimetrics J. 137–147 (1999).
Sexual Aggression in the Great Apes: Implications for Human Law
Ronald D. Nadler

Laboratory research shows that the males of three species of great ape (chimpanzee, gorilla, and orangutan) forcibly copulate with females under certain circumstances. Variables that permit or facilitate this type of male sexual aggression are male dominance over females (such that the female cannot safely refuse the male) and the restriction of female activity (such that she cannot avoid or escape from the male). Circumstances associated with this male behavior that contribute to a female's vulnerability and stimulate male sexual aggression include social isolation of a female from other conspecifics, close proximity of a female to a male, reunion of a female and male after a period of separation, novelty of a female to a male, and sudden exposure of a female to a male. Male sexual aggression in these species, moreover, occurs most frequently in mature but relatively young males. The close biological relationship between the great apes and humans suggests that these results have implications for humans; in particular, for a legal remedy to sexual aggression in our species.

CITATION: Ronald D. Nadler, Sexual Aggression in the Great Apes: Implications for Human Law, 39 Jurimetrics J. 149–155 (1999).
Integrating Multiple Levels of Scientific Analysis and the Confluence Model of Sexual Coercers
Neil M. Malamuth and Eldad Z. Malamuth

By transcending traditional disciplinary boundaries and integrating "embedded" levels of scientific analysis, researchers can enhance the value of knowledge derived from each level. We emphasize herein the need to integrate knowledge from evolutionary, genetic, cultural, and developmental levels. Examining how factors emanating from each level interact, and how such interaction shapes individual personality and other characteristics, results in a deeper understanding of the interplay between these critical variables. Such an integrative framework in turn enhances understanding of the more proximate "person by situation" interactional levels that social psychologists often use to analyze behaviors. Unfortunately, there are common misconceptions relating to the use of multi-level approaches, and we address some of them here. Finally, the utility of such integrative approaches is illustrated by a well-supported model of the characteristics of sexually aggressive men. This model incorporates factors traditionally thought to be in opposition and demonstrates that, rather than conflict, they can clarify and amplify each other.

CITATION: Neil M. Malamuth and Eldad Z. Malamuth, Integrating Multiple Levels of Scientific Analysis and the Confluence Model of Sexual Coercers, 39 Jurimetrics J. 157–179 (1999).
An Evolutionary Model of Courtship and Mating as Social Exchange: Implications for Rape Law Reform
Charles Crawford and Marc A. Johnston

Changes to rape legislation have generally failed to reduce the incidence of rape, in large part due to the discretionary power exerted by decision makers in the criminal justice system. Given that law attempts to regulate human conflicts of interest, understanding the evolutionary origin of these conflicts may help explain why changes to rape law have not been fully embraced. We present an evolutionary model of courtship and mating in ancestral environments, in terms of social exchange theory (parental investment by males is exchanged for sexual access to females). This model suggests that rape may be more likely to occur when perceived female value is high, perceived future reproductive prospects of males are low and perceived costs (risk and severity of punishment) are low. This evolutionary perspective suggests that desexualizing rape may be doing more harm than good, and explains why judges and jurors may be reluctant to convict alleged acquaintance rapists without corroboration of the victim's testimony, proof of non-consent and knowledge of the victim's sexual history. It may also help lawmakers design laws that foster environments that discourage rape.

CITATION: Charles Crawford and Marc A. Johnston, An Evolutionary Model of Courtship and Mating as Social Exchange: Implications for Rape Law Reform, 39 Jurimetrics J. 181–200 (1999).
A Darwinian Interpretation of Individual Differences in Male Propensity for Sexual Aggression
Martin L. Lalumière and Vernon L. Quinsey

This article reviews studies of individual differences in male propensity for sexual aggression, examines the implications of the findings for the formulation of Darwinian theories of sexual aggression, and discusses implications for law and policy. A Darwinian view of sexual aggression is likely to have practical implications through better understanding of its etiology.

CITATION: Martin L. Lalumière and Vernon L. Quinsey, A Darwinian Interpretation of Individual Differences in Male Propensity for Sexual Aggression, 39 Jurimetrics J. 201–216 (1999).
The Multiplicity of Rape: From Life History Strategies to Prevention Strategies
Linda Mealey

Evolutionary theory and the social sciences converge on the conclusion that "rape" is not a single crime committed by a single type of perpetrator with a single motivation. To reduce rape, the multiplicity of motivations must be addressed.

CITATION: Linda Mealey, The Multiplicity of Rape: From Life History Strategies to Prevention Strategies, 39 Jurimetrics J. 217–226 (1999).
What Rape Is and What It Ought Not to Be
Katharine K. Baker

This article suggests that the social meanings of both rape and sex help explain the prevalence of rape. Evolutionary biology can explain why it is evolutionarily beneficial for men to rape, but one needs to examine social norms before one can explain why rape is so hard to punish, why men are socially motivated to rape, and what the law should do about it. This article suggests that only by debunking the stereotypes that inhibit rape prosecutions, changing the social norms that encourage men to rape, and clearly distinguishing, as a matter of social understanding, the biologically identical acts of rape and sex can the law hope to curb the incidence of rape.

CITATION: Katharine K. Baker, What Rape Is and What It Ought Not to Be, 39 Jurimetrics J. 233–242 (1999).
Evolutionary Biology and Rape
Deborah W. Denno

This article queries whether an evolutionary analysis of rape may be more compelling in explaining a rape victim's fear than a defendant's sexual aggression. Such a victim-oriented approach could help legal decision makers assess the reasonableness of the victim's fear when determining whether sex was forced or threatened. These ideas are explored in the context of two well-known rape trials, State v. Rusk and State v. Smith. This article concludes that evolutionary biology can contribute to an understanding of rape. However, the supposed evolutionary underpinnings of male sexual aggression should not justify such behavior or render it acceptable as a criminal defense. Moreover, evolutionary research must be evaluated in a social frame so that generalizations do not unfairly or inaccurately bias plaintiffs or defendants.

CITATION: Deborah W. Denno, Evolutionary Biology and Rape, 39 Jurimetrics J. 243–254 (1999).
Statistics, Law, and Justice
Steven Goldberg

This article focuses on the extent to which the evolutionary psychology approach to rape, as to many other law-relevant behaviors, often depends on the utility of statistical generalizations about human behavior, rather than explanations of why a single individual behaved as he did. It consequently explores the tension between the broad understanding represented by statistics and the sharp desire that justice be done in each case.

CITATION: Steven Goldberg, Statistics, Law, and Justice, 39 Jurimetrics J. 255–259 (1999).
Sometimes Sex Matters: Reflections on Biology, Sexual Aggression, and Its Implications for the Law
Cheryl Hanna

In this essay, the author explores some implications a biosocial model of rape can have for the prevention of sexual assault. After briefly explaining biological theories of rape, the author examines the "girl power" movement and how it sends mixed messages about men's and women's sexual motivations. She concludes by considering how a re-examination of rape through a biosocial model can enhance the teaching of rape law.

CITATION: Cheryl Hanna, Sometimes Sex Matters: Reflections on Biology, Sexual Aggression, and Its Implications for the Law, 39 Jurimetrics J. 261–269 (1999).
Is It Sex Yet? Theoretical and Practical Implications of the Debate over Rapists' Motives
Craig T. Palmer, David N. DiBari, and Scott A. Wright

A review of recent literature on rape reveals that most social scientists continue to ignore or deny that rapists are sexually motivated. This position is a major obstacle to informed legal analysis of rape that recognizes the evolved difference in male and female sexual desires.

CITATION: Craig T. Palmer, David N. DiBari, and Scott A. Wright, Is It Sex Yet? Theoretical and Practical Implications of the Debate over Rapists' Motives, 39 Jurimetrics J. 271–282 (1999).
The Evolution of Legal Concepts: The Memetic Perspective
Michael S. Fried

The evolutionary biologist Richard Dawkins has observed that certain verbally transmitted concepts possess all of the properties necessary for them to evolve according to the principles of natural selection. This article discusses the application of Dawkins' insight to the development of legal concepts over time. Incorporating this "memetic" perspective into the substantial literature on common law legal evolution yields some significant insights for that tradition. Conversely, many of the serious practical difficulties inherent in developing a general science of memetics are eliminated or greatly reduced if one restricts one's attention to the memes that occur in the legal system.

CITATION: Michael S. Fried, The Evolution of Legal Concepts: The Memetic Perspective, 39 Jurimetrics J. 291–316 (1999).
Genetic Privacy and Discrimination: A Survey of State Legislation
William F. Mulholland and Ami S. Jaeger

As of January 15, 1999, forty-four states had enacted legislation concerning genetic privacy or discrimination. This Comment catalogs these statutes and tabulates their most salient provisions.

CITATION: William F. Mulholland and Ami S. Jaeger, Genetic Privacy and Discrimination: A Survey of State Legislation, 39 Jurimetrics J. 317–326 (1999).
Statistical Controversies in Census 2000
Lawrence D. Brown, Morris L. Eaton, David A. Freedman, Stephen P. Klein, Richard A. Olshen, Kenneth W. Wachter, Martin T. Wells, and Donald Ylvisaker

This paper discusses Census 2000, focusing on sampling techniques for adjustment. Experience with similar adjustment methods in 1980 and 1990 suggests that the design for Census 2000 is quite risky. The use of sampling to obtain population counts for apportionment was rejected by the Supreme Court as violating the Census Act. Statistical adjustments for the purpose of redistricting may or may not be constitutional.

CITATION: Lawrence D. Brown, Morris L. Eaton, David A. Freedman, Stephen P. Klein, Richard A. Olshen, Kenneth W. Wachter, Martin T. Wells, and Donald Ylvisaker, Statistical Controversies in Census 2000, 39 Jurimetrics J. 347–375 (1999).
Equity in Supreme Court Opinion Assignment
Sara C. Benesh, Reginald S. Sheehan, and Harold J. Spaeth

When the chief justice of the Supreme Court votes with the majority, he assigns the writing of the majority opinion to a particular justice. The chief justice is expected to distribute the opportunity evenly among the associate justices. This article develops a new method for measuring distributional equality, applies it to a 41-year period extending across four chief justiceships, and finds that assigners adhere to the norm of equal distribution. Even so, we demonstrate that the equal assignment norm does not preclude assigners from behaving strategically, and we find evidence of such behavior in the Burger and Rehnquist Courts.

CITATION: Sara C. Benesh, Reginald S. Sheehan, and Harold J. Spaeth, Equity in Supreme Court Opinion Assignment, 39 Jurimetrics J. 377–389 (1999).
The Genetic Privacy Act: A Proposal for National Legislation
Patricia (Winnie) Roche, Leonard H. Glantz, and George J. Annas

In late June 1996, Senator Pete V. Domenici (R-N.M.) introduced S.1898, the Genetic Confidentiality and Nondiscrimination Act of 1996 (GCNA), which was based in large part on the proposed "Genetic Privacy Act of 1995" (GPA) drafted by the authors. This article outlines the purpose and provisions of the GPA, and it highlights some of the differences between the GPA and the GCNA.

CITATION: Patricia (Winnie) Roche, Leonard H. Glantz, and George J. Annas, The Genetic Privacy Act: A Proposal for National Legislation, 37 Jurimetrics J. 1–11 (1996).
Insurers' Use of Genetic Information
Mark A. Hall

This article surveys the arguments for and against allowing insurers to use genetic information in setting premiums or determining coverage. Noting that access to this information raises both discrimination and privacy issues, the article discusses the fit between various rationales for prohibiting this use of genetic information and the actual statutes that have been enacted and proposed. The article concludes with a call for more empirical research into the actual impact of these statutes and the magnitude of the social problems that led to their enactment.

CITATION: Mark A. Hall, Insurers' Use of Genetic Information, 37 Jurimetrics J. 13–22 (1996).
Assessing the Believability of Expert Witnesses: Science in the Jurybox
Daniel W. Shuman, Anthony Champagne, and Elizabeth Whitaker

This article reports on a study of the factors associated with jurors' assessments of experts' believability. A total of 156 former jurors from 24 civil cases tried in Dallas County, Texas, responded to a telephone survey. Multivariate analysis revealed no statistically significant associations between the occupations of the experts or the characteristics of the jurors, on the one hand, and the believability of the experts, on the other. Perceived qualifications, familiarity with the facts, good reasoning, impartiality, and the side calling the expert were associated with believability. We conclude that jurors make expert-specific decisions based on rational criteria.

CITATION: Daniel W. Shuman, Anthony Champagne, and Elizabeth Whitaker, Assessing the Believability of Expert Witnesses: Science in the Jurybox, 37 Jurimetrics J. 23–34 (1996).
The Census Adjustment Cases: The Hunt for the Wily Trout
James Pack

The Bureau of the Census conducts a decennial count of the population of the United States. Unfortunately, the census does not count everyone, and the undercount is concentrated among racial and ethnic minorities. To combat this "differential undercount," the Census Bureau experimented with a statistical method to adjust the census count, known as "dual-systems estimation." After the Secretary of Commerce declined to apply this adjustment to the 1990 census, various states, cities, and interest groups sued to force its use. The case reached the Supreme Court, which held in 1996 that because census inaccuracy is different from state malapportionment, the one-person, one-vote strict scrutiny standard does not apply. Because the Constitution grants Congress broad discretion to conduct the census, the Secretary of Commerce's decision not to adjust need only be reasonable in light of the constitutional purpose of the census.
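
For readers unfamiliar with the method, dual-systems estimation is at bottom a capture-recapture calculation. In its simplest form (a sketch of the general idea, not the Bureau's full procedure, which stratifies by demographic group and geography and adjusts for erroneous enumerations), the estimated true population of a stratum is

    N = (C × P) / M,

where C is the census count for the stratum, P is the count from the independent post-enumeration survey, and M is the number of people matched in both. The differential undercount then shows up as systematically different ratios of N to C across racial and geographic strata.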

CITATION: James Pack, Note, The Census Adjustment Cases: The Hunt for the Wily Trout, 37 Jurimetrics J. 35–52 (1996).
Hopkins v. Dow Corning Corp.: Silicone and Science
Donald A. Lawson

In Hopkins v. Dow Corning Corp., the Ninth Circuit Court of Appeals upheld a pivotal and controversial jury verdict against a manufacturer of silicone gel breast implants. The defendant argued that the plaintiff's proof that the implants caused disease lacked a scientific foundation and should not have been allowed into evidence. This note discusses the status of the scientific literature on the issue of causation of autoimmune diseases in women with silicone gel breast implants and the critical importance of epidemiologic evidence establishing causation. It concludes that Hopkins v. Dow Corning Corp. was wrongly decided.

CITATION: Donald A. Lawson, Note, Hopkins v. Dow Corning Corp.: Silicone and Science, 37 Jurimetrics J. 53–68 (1996).
Recent Legislation on Genetics and Insurance
Helen R. Davis and Janice V. Mitrius

This note catalogs and describes federal and state legislation regulating the use of genetic information by insurance companies.

CITATION: Helen R. Davis and Janice V. Mitrius, Note, Recent Legislation on Genetics and Insurance, 37 Jurimetrics J. 69–82 (1996).
Can an Operating System Vendor Have a Duty to Aid Its Competitors?
Bryce J. Jones II and James R. Turner

Microsoft Corporation is often accused of using the monopoly power of its Windows operating system to unfair advantage in applications software markets, such as word processing and spreadsheets. Various cases under Section 2 of the Sherman Act provide a basis for analyzing Microsoft's behavior. The essential facilities doctrine, along with other precedents on intent, refusal to deal, and leveraging, indicates that Microsoft has a duty to be more cooperative in informing its applications software competitors of details about its operating systems. This would help to maintain diversity in software innovation and to protect reliance on cooperative arrangements within industries.

CITATION: Bryce J. Jones II and James R. Turner, Can an Operating System Vendor Have a Duty to Aid Its Competitors?, 37 Jurimetrics J. 355–394 (1997).
The National Research Council's Second Report on Forensic DNA Evidence: A Critique
William C. Thompson

This article criticizes the National Research Council's second report on forensic DNA evidence for giving inadequate attention to problems surrounding the interpretation of DNA test results, for acquiescing to interpretive standards less rigorous than those recommended in the first NRC report, and for assuming, unrealistically, that retesting by criminal defendants will resolve the problem of laboratory error. It also comments on the calculation and presentation of quantitative error rate estimates and posterior probabilities in connection with DNA evidence.

CITATION: William C. Thompson, The National Research Council's Second Report on Forensic DNA Evidence: A Critique, 37 Jurimetrics J. 405–423 (1997).
Why DNA Likelihood Ratios Should Account for Error
Jonathan J. Koehler

The possibility of error limits the strength of DNA evidence in the same way that it limits the strength of other kinds of legal evidence. However, a 1996 report by the National Research Council recommends against estimating an error rate derived from proficiency tests to help identify the probative value of DNA evidence. The Committee's arguments are identified and critiqued. It is argued that error rate data derived from broad reference classes such as "all DNA laboratories" provide a relevant starting point for estimating the risk of error in individual cases. Likelihood ratios that fail to incorporate this estimate may be misleading.
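
The core of the argument can be compressed into one approximation (a standard simplification of the error-rate critique, not taken verbatim from the article). If p is the random match probability and e is the probability that the laboratory erroneously reports a match when the suspect is not the source, then the likelihood ratio for a reported match is roughly

    LR ≈ 1 / (p + e),

not 1 / p. To the extent proficiency-test data suggest that e is much larger than p, the probative value of the evidence is effectively capped near 1 / e, however small the reported random match probability.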

CITATION: Jonathan J. Koehler, Why DNA Likelihood Ratios Should Account for Error, 37 Jurimetrics J. 425–437 (1997).
After the DNA Wars: Skirmishing with NRC II
Richard O. Lempert

This article traces some of the controversies surrounding DNA evidence and argues that, although many have been laid to rest by scientific developments confirmed in the National Research Council's second DNA report, there remain several problems that are likely to lead to continued questioning of standard ways prosecutors present DNA evidence. Although much about the report is to be commended, it falls short in several ways, the most important of which is in its support of presenting random match probabilities independent of plausible error rates. The article argues that although one can sympathize with the NRC committee's decision as an effort to say no more than what science reliably tells us, it is not a good forensic science recommendation because following it means that the probative value of DNA evidence is likely to be substantially overstated. Fortunately, it will be the rare case where this matters.

CITATION: Richard O. Lempert, After the DNA Wars: Skirmishing with NRC II, 37 Jurimetrics J. 439–468 (1997).
Errors and Misunderstandings in the Second NRC Report
David J. Balding

Criticisms are presented of the second NRC report on DNA evidence. The underlying theme is that the report does not focus on evidential weight; consequently much of its discussion is tangential to the issues that matter in court, and in some cases the report is positively misleading. In particular, a recommendation concerning database searches and another concerning small groups or tribes are seriously flawed, erring in the former case in favor of defendants and in the latter case against defendants.

CITATION: David J. Balding, Errors and Misunderstandings in the Second NRC Report, 37 Jurimetrics J. 469–476 (1997).
The Forensic DNA Endgame
N.E. Morton

The National Research Council report on DNA evidence left a few unsolved problems, nearly all at the level of forensic assimilation of established scientific principles. This article reviews the major post-NRC topics: the theoretical framework, the coincidence test for unrelated suspect and evidentiary sample, the kinship test for pairs from the same subpopulation, alternative hypotheses, Bayesian analysis, multiple culprits or suspects, and what remains to be done if DNA evidence is to be trustworthy and suspect trawls are to be efficient and without prejudice to defendants. A simple solution to the trawling controversy through confirmatory markers is proposed.

CITATION: N.E. Morton, The Forensic DNA Endgame, 37 Jurimetrics J. 477–494 (1997).
The Admissibility of PCR-based DNA Evidence: State v. Lyons
Karla K. Hotis

In State v. Lyons, the Oregon Supreme Court held a form of PCR-based DNA evidence admissible. It determined that the PCR-DQA test is generally accepted in the scientific community and that the evidence satisfied a standard similar to that adopted for federal courts in Daubert v. Merrell Dow Pharmaceuticals, Inc. The reasoning in Lyons is persuasive, and it is likely that courts in other jurisdictions will reach similar results.

CITATION: Karla K. Hotis, Note, The Admissibility of PCR-based DNA Evidence: State v. Lyons, 37 Jurimetrics J. 495–506 (1997).
Understanding and Evaluating Statistical Evidence in Litigation
Stephen E. Fienberg, Samuel H. Krislov, and Miron L. Straf

The theory, methods, and applications of statistics underlie much of the scientific evidence presented in courts. This paper explores some of the roles of statistics as scientific evidence and develops three themes: (i) the role of models and assumptions in understanding scientific inferences, (ii) the frequent need to combine disparate statistical evidence, and (iii) what to consider in evaluating scientific evidence. We discuss statistical tools, their role in the presentation of scientific evidence, and some common misunderstandings in statistical reasoning. We suggest ways in which the trier of fact can deal more effectively with expert statistical testimony.

CITATION: Stephen E. Fienberg, Samuel H. Krislov, and Miron L. Straf, Understanding and Evaluating Statistical Evidence in Litigation, 36 Jurimetrics J. 1–32 (1995).
Issues in Cancer Screening: A Prostate Case Study
Catherine M. Polizzi

As new methods are developed for detecting disease at earlier stages, difficult health-care policy issues arise. Improved screening techniques do not necessarily prevent or cure disease. The natural history of the disease, the appropriate treatment approach, and other factors are important. Understanding these factors and their interaction is imperative in controlling health care costs while maintaining an acceptable standard of health care. Ideally, vigorous research into the factors affecting the efficacy of screening programs will lead to a highly selective screening policy that is efficient and effective. As a step in this direction, this article examines the multi-faceted controversy surrounding prostate-cancer screening.

CITATION: Catherine M. Polizzi, Issues in Cancer Screening: A Prostate Case Study, 36 Jurimetrics J. 33–58 (1995).
"Building-Block" Cost Methods for Pricing and Unbundling Telecommunications Services: Implications for the Law and Regulatory Policy
Alexander C. Larson and Steve G. Parsons

Building-block proposals for calculating the costs of local telephone company services and network functions have appeared in regulatory proceedings over the past several years. These proposals have been touted as a significant contribution to future regulatory compliance of local exchange carriers, but from a pragmatic and economic perspective they have several serious shortcomings. These include specific fixed costs, allocations of consumer costs, and an over-reliance on textbook notions of "long-run" costs. More important, building-block proposals are intertwined with telecommunications public policy issues such as pricing, "unbundling," and "imputation." An adherence to building-block cost proposals by regulators when pursuing issues of pricing, unbundling, and imputation can easily lead to ineffective and socially costly public policies. For example, adherence to building-block methods may preclude the efficient pricing of telecommunications services, create regulatory price floors that are too high to be efficient, and create an environment in which all discrete network facilities or functions are viewed as de facto "essential facilities" in the antitrust sense.

CITATION: Alexander C. Larson and Steve G. Parsons, "Building-Block" Cost Methods for Pricing and Unbundling Telecommunications Services: Implications for the Law and Regulatory Policy, 36 Jurimetrics J. 59–97 (1995).
Fluidity and Coalition Sizes on the Supreme Court
Saul Brenner, Tony Caporale, and Harold Winter

In every case in which the United States Supreme Court hears oral arguments and decides by an opinion or judgment of the Court, the justices vote twice: once in secret conference and again when they hand down the final decision. We examine two main aspects of the relationship between fluidity (i.e., vote-switching between the two votes) and coalition sizes: (1) the relationship between switching and the size of the coalition at the conference vote, and (2) the relationship between switching and the size of the coalition at the final vote. Unlike previous studies, we examine all types of switching that can occur between the conference and final votes.

Our results highlight two effects: (1) majority effects (i.e., the tendency of members of a majority coalition at the conference vote not to switch, and the tendency of fluidity to increase the size of the majority coalition); and (2) unanimous effects (i.e., the tendency of members of a unanimous coalition at the conference vote not to switch, and the tendency of fluidity to generate a unanimous final vote). Unanimous effects have not previously been identified in the literature.

CITATION: Saul Brenner, Tony Caporale, and Harold Winter, Fluidity and Coalition Sizes on the Supreme Court, 36 Jurimetrics J. 245–254 (1996).
One-Way Fee Shifting Statutes and Offer of Judgment Rules: An Experiment
Thomas D. Rowe, Jr. and David A. Anderson

Many federal statutes, most notably the Civil Rights Attorneys' Fees Awards Act (§ 1988), make defendants liable for the attorneys' fees of prevailing plaintiffs. Federal Rule of Civil Procedure 68 makes plaintiffs who reject a formal defense offer of judgment and fail to improve on it at trial liable for defendants' post-offer "costs." Although "costs" ordinarily do not include attorneys' fees, several federal fee-award statutes refer to fees as part of costs, suggesting that plaintiffs cannot recover their own post-offer fees and become liable for post-offer defense attorneys' fees. This article identifies three alternative interpretations of the interplay of § 1988 and Rule 68 and provides data on the effects of these interpretations on settlements. The study finds that how Rule 68 is interpreted to affect § 1988 attorney-fee entitlements and liabilities can have a significant effect on settlement bargaining. If the objective is to maximize out-of-court settlements, the most effective rule would have a rejected but unbettered Rule 68 offer reverse a plaintiff's § 1988 attorney-fee entitlement, making the plaintiff liable for post-offer defense attorney's fees.

CITATION: Thomas D. Rowe, Jr. and David A. Anderson, One-Way Fee Shifting Statutes and Offer of Judgment Rules: An Experiment, 36 Jurimetrics J. 255–273 (1996).
Do Special Verdicts Improve the Structure of Jury Decision-Making?
David A. Lombardero

According to a probabilistic (Bayesian) model, each element of a claim can satisfy the preponderance of the evidence standard even when the conjunction of those elements does not. In such situations, special verdicts, which cause the jury to consider each element separately and leave it to the court to enter judgment according to the conjunction of these findings, exacerbate this conjunction problem. This article discusses the conjunction problem in relation to special verdicts, including those with non-unanimous voting requirements.

It concludes that (1) when the plaintiff asserts multiple alternative causes of action that would afford substantially the same relief, the disjunction rule, which is logically equivalent to the conjunction rule, may work to the defendant's advantage; (2) with a special verdict, the outcome may depend upon the order in which the verdict form presents issues; (3) ignoring the order of presentation but requiring that issues be decided sequentially, rather than simultaneously, is likely to favor the plaintiff; and (4) when jury unanimity is not required, a special verdict is even more advantageous to the plaintiff, especially if any majority can support a verdict for the plaintiff.
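
A minimal numerical sketch of the conjunction problem discussed above (the probabilities and the independence assumption are invented for illustration, not taken from the article):

    # Hypothetical numbers only: each element of a two-element claim clears the
    # preponderance (greater than 0.5) threshold, yet their conjunction does not.
    # The elements are assumed independent for simplicity.

    p_element_1 = 0.6                      # probability that element 1 is true
    p_element_2 = 0.6                      # probability that element 2 is true

    p_claim = p_element_1 * p_element_2    # probability that both elements are true
    print(p_claim)                         # 0.36
    print(p_claim > 0.5)                   # False: element-by-element findings favor the
                                           # plaintiff, but the claim as a whole fails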

CITATION: David A. Lombardero, Do Special Verdicts Improve the Structure of Jury Decision-Making?, 36 Jurimetrics J. 275–324 (1996).
Higgledy-Piggledy Awards for Lost Earnings
Rolando Pelaez

This paper examines the theoretical and empirical basis of the below-market discount rate method--a method sanctioned by the U.S. Supreme Court, mandated by the Fifth Circuit, and widely used for computing awards for lost earnings. The analysis shows that the method is bereft of logical, theoretical, and empirical bases. Small variations in the below-market or real interest rate cause spectacular award errors, and the method accomplishes results opposite to those intended.
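
To see the kind of sensitivity at issue, consider a hypothetical sketch (the earnings figure, horizon, and rates are invented; this is not the article's own calculation) of how the present value of a level stream of lost earnings moves with the assumed real discount rate:

    # Hypothetical figures: a level annual loss of $50,000 over 25 years,
    # discounted at constant real rates of 1%, 2%, and 3%.

    def present_value(annual_loss, real_rate, years):
        """Present value of a level annual loss discounted at a constant real rate."""
        return sum(annual_loss / (1 + real_rate) ** t for t in range(1, years + 1))

    for rate in (0.01, 0.02, 0.03):
        print(rate, round(present_value(50_000, rate, 25)))
    # Each percentage point in the assumed real rate moves the award by more
    # than $100,000 under these assumptions.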

CITATION: Rolando Pelaez, Higgledy-Piggledy Awards for Lost Earnings, 36 Jurimetrics J. 325–336 (1996).
Determination of Both Parents Using DNA Profiling
Wing K. Fung, D.M. Wong, and P. Tsui

The genetic relationship between a couple whose newborn infant disappeared from a hospital and an abandoned newborn baby was investigated using a restriction fragment length polymorphism (RFLP) analysis. Likelihood ratios for different cases of matching DNA profiles were derived. The proposed methods can be applied more generally to other cases of alleged parenthood.
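
A simplified, hypothetical single-locus sketch of how such a parentage likelihood ratio can be formed (the allele frequencies and genotype configuration are invented; the article's analysis covers multiple RFLP loci and a fuller set of matching cases):

    # Hypothetical example only.

    p_a, p_b = 0.05, 0.10    # assumed population frequencies of alleles a and b

    # The child is heterozygous a/b; the alleged mother carries a but not b,
    # and the alleged father carries b but not a, each as a heterozygote.
    p_child_given_alleged_parents = 0.5 * 0.5      # mother transmits a, father transmits b
    p_child_given_random_parents = 2 * p_a * p_b   # Hardy-Weinberg frequency of genotype a/b

    likelihood_ratio = p_child_given_alleged_parents / p_child_given_random_parents
    print(round(likelihood_ratio, 2))    # 25.0: the profiles are 25 times more probable if the
                                         # couple are the parents than if the parents are unrelated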

CITATION: Wing K. Fung, D.M. Wong, and P. Tsui, Determination of Both Parents Using DNA Profiling, 36 Jurimetrics J. 337–342 (1996).
Statistical Issues Arising in Equal Employment Litigation
Joseph L. Gastwirth

Statistical analysis can help show whether race, gender, or another prohibited factor played a significant role in an employment decision. But few data sets are perfect, and it is important to examine criticisms of the data, or of the analysis of the data, to determine whether the alleged flaws are severe enough to alter the ultimate inference. This article discusses procedures for this purpose. It reviews a method for studying the potential effect of an omitted factor and a method for checking for a change in the data (to see whether increased success of minority applicants occurred before or after a charge was filed), and it questions the technique, accepted by some courts, of modifying the data to explore the sensitivity of a finding of statistical significance.
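
For orientation only, here is a hypothetical sketch of the baseline disparity test from which such analyses typically start (the applicant and hire counts are invented; the article's focus is on how to evaluate challenges to analyses of this general kind):

    # Hypothetical counts only.

    import math

    minority_hired, minority_applicants = 12, 60      # 20.0% selection rate
    majority_hired, majority_applicants = 45, 120     # 37.5% selection rate

    p1 = minority_hired / minority_applicants
    p2 = majority_hired / majority_applicants
    pooled = (minority_hired + majority_hired) / (minority_applicants + majority_applicants)

    # Two-proportion z-statistic for the difference in selection rates.
    se = math.sqrt(pooled * (1 - pooled) * (1 / minority_applicants + 1 / majority_applicants))
    z = (p1 - p2) / se
    print(round(z, 1))    # about -2.4 standard deviations, conventionally "significant";
                          # the question is how robust such a finding is to flaws in the data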

CITATION: Joseph L. Gastwirth, Statistical Issues Arising in Equal Employment Litigation, 36 Jurimetrics J. 353–370 (1996).
Juror Assessments of the Believability of Expert Witnesses: A Literature Review
Daniel W. Shuman, Anthony Champagne, and Elizabeth Whitaker

Although a heated debate rages about the use of expert witnesses, that debate has been grounded in anecdotes rather than rigorous study. One issue that has received scant attention is the role that the expert's field of expertise, as contrasted with the characteristics of the jurors or the experts themselves, plays in juror assessments of believability. This issue is important because courts often apply different levels of scrutiny to expert testimony from the hard and soft sciences, based on their assumptions about how jurors assess this testimony. This article reviews this differential approach and the research relevant to juror assessments of expert believability.

CITATION: Daniel W. Shuman, Anthony Champagne, and Elizabeth Whitaker, Juror Assessments of the Believability of Expert Witnesses: A Literature Review, 36 Jurimetrics J. 371–382 (1996).
Horizontal Gaze Nystagmus: A Closer Look
Joseph R. Meaney

Police officers in every state routinely administer the "pen-waving" test to drunk-driving suspects. This test is formally known as the Horizontal Gaze Nystagmus (HGN) test. All ten state supreme courts that have directly considered the admissibility of HGN testimony to show that a defendant was operating under the influence have decided in favor of admissibility. However, two state supreme courts have recently questioned this result.

This Note analyzes the admissibility of HGN as proof of intoxication. It applies the two dominant approaches to the admissibility of scientific evidence: scientific validity and general acceptance. It shows that, despite being generally accepted by some scientific communities for some purposes, HGN is not scientifically valid as a field sobriety test. It concludes that, before HGN evidence is admitted in certain situations, more testing is required to determine its accuracy in evaluating blood-alcohol levels near the legal limit.

CITATION: Joseph R. Meaney, Note, Horizontal Gaze Nystagmus: A Closer Look, 36 Jurimetrics J. 383–407 (1996).

Archive

Volume 56 (2015-16)
Volume 55 (2014-15)
Volume 54 (2013-14)
Volume 53 (2012-13)
Volume 52 (2011-12)
Volume 51 (2010-11)
Volume 50 (2009-10)
Volume 49 (2008-09)
Volume 48 (2007-08)
Volume 47 (2006-07)
Volume 46 (2005-06)
Volume 45 (2004-05)
Volume 44 (2003-04)
Volume 43 (2002-03)
Volume 42 (2001-02)
Volume 41 (2000-01)
Volume 40 (1999-00)
Volume 39 (1998-99)
Volume 38 (1997-98)
Volume 37 (1996-97)
Volume 36 (1995-96)