Articles by mjlst

Google Glass: Augmented Reality or ADmented Reality?

by Sarvesh Desai, UMN Law Student, MJLST Staff

Google glasses . . . like a wearable smartphone, but “weighing a few ounces, the sleek electronic device has a tiny embedded camera. The glasses also deploy what’s known as a ‘heads-up display,’ in which data are projected into the user’s field of vision on a small screen above the right eye.”

The glasses are designed to provide an augmented reality experience in which (hopefully useful) information can be displayed to the wearer based on what the wearer is observing in the world at that particular moment. The result could be a stunning and useful achievement, but as one commentator pointed out, Google is an advertising company. The upshot of Google glasses, or as Google prefers to call them, “Google Glass” (since they actually have no lenses), is that advertisements that follow you around and continuously update as you move through the world may soon be a reality.

As the digital age advances, more of our movements, preferences, and lives are incessantly tracked. A large portion of the American population carries a mobile phone at all times, and as iPhone users learned in 2011, a smartphone is not only a handy way to keep Facebook up to date but also a potential GPS tracking device.

With technologies like smartphones, movement data is combined with location data to create a detailed profile of each person. Google Glass extends this personal profile even further by recording not only where you are but what you are looking at. This technology makes the kind of targeted advertising displayed in the hit movie Minority Report a reality, while also creating privacy issues that previously could not even be conceptualized outside science fiction.

Wondering what it might look like to wander the world as context-sensitive advertisements flood your field of vision? Jonathan McIntosh, a pop culture hacker, has the answer. He released a video titled ADmented Reality in which he placed ads onto Google’s Project Glass promotional video, demonstrating what the combination of this technology, tracking, and advertising might yield. McIntosh discussed the potential implications of such technology on the ABC News Technology Blog. “Google’s an ad company. I think it’s something people should be mindful of and critical of, especially in the frame of these awesome new glasses,” McIntosh said.

As this technology continues to improve and become a more integrated part of our lives, the issue of tracking becomes ever more important. For a thorough analysis of these issues, take a look at Omer Tene and Jules Polonetsky’s article in the Minnesota Journal of Law, Science & Technology, “To Track or ‘Do Not Track’: Advancing Transparency and Individual Control in Online Behavioral Advertising.” The article covers current online tracking devices, the use of tracking, and recent developments in the regulation of online tracking. The issues are not simple, and there are many competing interests involved: efficiency vs. privacy, law enforcement vs. individual rights, and reputation vs. freedom of speech, to name a few. As this technology inexorably marches on, it is worth considering whether legislation is needed and, if so, how it would balance those competing interests. In addition, what values do we consider to be of greatest importance and worth preserving, even at the risk of hindering “awesome new” technology?


Being Green by Helping the Giants Beat the Eagles

by Nathanial Weimer, UMN Law Student, MJLST Staff

Sporting events are an environmental nightmare. The vast numbers of spectators involved (over 16 million paying fans attended NFL games last year, according to NBC Sports) leave behind massive amounts of trash, while stadiums face huge challenges with water conservation and electricity consumption on game days. Fans also have to transport themselves to and from the event, using large quantities of fuel. And, of course, the problem extends to all stadium events, whether professional or college, football or a different sport. Such a widespread problem needs a powerful solution, one that goes beyond merely suggesting that teams “do the right thing.” The fact is, teams that effectively deal with this problem must be rewarded, and those rewards must contribute to on-field success. By linking sustainability to team performance, the green movement can benefit from the competitive spirit that drives sports.

Many sports teams have already taken steps toward making their stadiums green. SustainableBusiness.com lists professional sports teams with effective environmental strategies, while the EPA has organized waste reduction competitions between collegiate football programs. The University of Minnesota became a leader with the construction of its new football stadium; upon completion, TCF Bank Stadium became the first collegiate or professional football facility to achieve LEED Silver Certification for environmental design.

Several motivations have contributed to this move toward sustainability. Some owners have used environmental campaigns as a way to strengthen community ties or improve a team’s brand image to attract sponsors, according to Switchboard. Reductions in energy consumption, often through the installation of solar panels, can greatly reduce utility costs. Groups such as the Green Sports Alliance, a non-profit originating in the Pacific Northwest, have collaborated with professional teams across different sports to encourage a higher level of environmental responsibility. Still, the greatest motivation in sports is noticeably missing: winning.

Achieving environmental sustainability requires continuous improvement. To ensure that sports teams continue to innovate and strive for improvement, their waste management accomplishments must be able to contribute to their on-field success. In professional leagues, this could easily be accomplished through a salary-cap bump. An NBA team with a model sustainability program could be allowed to spend, say, $5 million more a year on its roster than a team without such a program. Alternatively, draft odds could be adjusted. Instead of losing 59 games in the hopes of landing number one draft pick Anthony Davis, the Charlotte Bobcats could have installed low-flush, dual-flush toilets and aerated faucets like those at Target Field. College programs, “arguably the next frontier for the sports greening movement” according to Switchboard, could be rewarded for their environmental initiatives through postseason considerations. Bowl games could be allowed, or even encouraged, to take a program’s sustainability accomplishments into consideration. NCAA basketball tournament seeds could be similarly tweaked.

While going green might save money on utilities and attract corporate sponsors, the fastest way to make money in sports is to put a successful product on the field. Connecting greenness to on-field benefits would boost community involvement as well: an NBA fan is far more likely to volunteer to sort recycling when she thinks her efforts might help her team find cap room to sign a Dwight Howard. By the same token, collegiate boosters are more likely to donate money toward sustainability projects when those projects earn benefits that would otherwise go to a bitter rival. Sports, after all, are about competition, and winning feels better when you defeat somebody. Giants owner John Mara, when asked about the competitive outlet provided by greening efforts, agreed: “Most of all, I want to beat the Philadelphia Eagles.” It shouldn’t matter to him that, by bringing competition into the quest for sustainability, we all win.

The environmental responsibility of sports events has come a long way. Many stadiums feature technology aimed at tackling the challenging problem of waste management. Still, the fight for sustainability remains an uphill battle, and teams must strive to find new ways to improve their stadiums. Rewarding committed teams with performance-related benefits not only preserves this commitment to innovation but strengthens it.

Interested in law and sports? You might also like:
Fantasy Baseball Litigation: “C.B.C. Distribution and Marketing, Inc. v. Major League Baseball Advanced Media, LP: Why Major League Baseball Struck Out and Won’t Have Better Luck in its Next Trip to the Plate” by Daniel Mead


Pandemic Flu and You

by Eric Nielson, UMN Law Student, MJLST Staff

Welcome to flu season, that wonderful time of year when we cross-contaminate millions of bioreactors in our schools and unleash the resulting concoction on humanity.

Flu kills thousands of Americans each year. The good news is that since H1N1 in 2009, we’ve gone without a serious flu pandemic threat. The bad news, according to researchers, is that this may be just a matter of luck.

Researchers have recently published multiple methodologies for converting existing animal strains of flu into pandemic-capable versions. Flu strains are tested on unimmunized ferrets, which are believed to best represent the human disease response to flu (and are kind of cute in a weaselly way). In Korea, researchers created a highly contagious swine flu variant that produces 100% fatalities in brave test ferrets. While it is expected that humanity’s general immunity to flu would provide a significant protective effect, it’s still a bit worrisome that a pandemic strain can be produced with equipment little better than a couple of cages and some animals.

Work on bird flu variants that had been mutated to produce contagious versions was also recently described by researchers in the Netherlands. The article states, “The introduction of receptor-binding site mutations Q222L/G224S and the mutations H103Y and T156A in HA, acquired during ferret passage, did not result in increased cross-reactivity with human antisera (table S6), indicating that humans do not have antibodies against the HA of the airborne-transmissible A/H5N1 virus that was selected in our experiments.” Or, in plain English, this variant, made with minor mutagenic exposure and some ferrets, was indeed a pandemic-capable virus.

It is hard to know how bad a flu pandemic would be. The classic example, the Spanish flu of 1918, had a death rate of 3-7% of the population. The CDC estimates that a similar disease treated with modern medical techniques would have a 1.2% death rate, which would mean approximately 3.77 million deaths in the United States. It should be recognized that the Spanish flu pandemic came in two waves, with the flu mutating and becoming much more deadly partway through. Anthrax (not a flu) was estimated to have a 75% or higher respiratory kill rate prior to the letter attacks on Congress in 2001; the actual death rate from those attacks was 5 of 22 infected, or 23%. While modern antivirals, antibiotics, hydration, and ventilators are effective, these resources would be limited in the event of a true pandemic, especially considering that the CDC estimates 55 million Americans contracted H1N1.
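For readers who want to check the arithmetic behind those figures, here is a minimal back-of-envelope sketch. The 1.2% death rate and the 5-of-22 anthrax figure come from the sources cited above; the U.S. population figure of roughly 314 million is an assumption added purely for illustration.

```python
# Back-of-envelope pandemic arithmetic (illustrative only).
# The death rate and case counts are the figures cited above;
# the U.S. population value is an assumption, not a CDC figure.

US_POPULATION = 314_000_000    # assumed 2012 U.S. population
MODERN_DEATH_RATE = 0.012      # CDC estimate for a 1918-style flu with modern care

estimated_deaths = US_POPULATION * MODERN_DEATH_RATE
print(f"Estimated U.S. deaths: {estimated_deaths / 1e6:.2f} million")  # ~3.77 million

# 2001 anthrax letter attacks: 5 deaths among 22 people infected
anthrax_fatality_rate = 5 / 22
print(f"Anthrax attack case fatality rate: {anthrax_fatality_rate:.0%}")  # ~23%
```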

There has not been significant legislation since James Hodge, Jr. stated in his article “Global Legal Triage in Response to the 2009 H1N1 Outbreak,” published in 2010 in the Minnesota Journal of Law, Science & Technology, that “If H1N1 was a ‘test’ run of the modern global public health system, then the system has fallen short.” While states have incorporated pandemic preparedness into their planning, the overall level of preparedness is mixed.

A fact of American life is that our politics are reactive to crisis. Even shocks like the bird flu and swine flu have not been enough for our federal and local governments to develop plans to prepare for a pandemic. Instead, the lesson learned has been that there is nothing to worry about. Stay healthy.


FBI Face Recognition Concerns Privacy Advocates

by Rebecca Boxhorn, Consortium Research Associate, Former MJLST Staff & Editor

Helen of Troy’s face launched a thousand ships, but yours might provide probable cause. The FBI is developing a nationwide facial recognition database that has privacy experts fretting about the definition of privacy in a technologically advanced society. The $1 billion Next Generation Identification initiative seeks to harness the power of biometric data in the fight against crime. Part of the initiative is the creation of a facial photograph database that will allow officials to match pictures to mug shots, electronically identify suspects in crowds, or even find fugitives on Facebook. The use of biometrics in law enforcement is nothing new, of course. Fingerprint and DNA evidence have led to the successful incarceration of thousands. What privacy gurus worry about is the power of facial recognition technology and the potential destruction of anonymity.

Most facial recognition technology relies on the matching of “face prints” to reference photographs. Your face print is composed of as many as 80 measurements, including nose width, eye socket depth, and cheekbone shape. Sophisticated computer software then matches images or video to a stored face print and any data accompanying that face print. The accuracy of facial recognition programs varies, with estimates ranging from as low as 61% to as high as 95%.
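The FBI has not published the details of its matching algorithms, but the general “face print” idea described above can be sketched in a few lines: reduce each face to a vector of measurements and compare a probe image’s vector against stored reference vectors. The measurements, names, and threshold below are hypothetical, chosen only to illustrate the concept.

```python
import math

# Toy "face print": a short vector of normalized facial measurements
# (e.g., nose width, eye socket depth, cheekbone shape). Real systems use
# far more features and far more sophisticated statistical models.

def distance(print_a, print_b):
    """Euclidean distance between two equal-length face prints."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(print_a, print_b)))

def best_match(probe, database, threshold=0.5):
    """Return the closest stored identity, or None if no print is close enough."""
    best_name, best_dist = None, float("inf")
    for name, reference in database.items():
        d = distance(probe, reference)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Hypothetical reference database of face prints keyed by identity.
mugshots = {
    "subject_001": [0.42, 0.77, 0.31],
    "subject_002": [0.58, 0.64, 0.49],
}

print(best_match([0.41, 0.75, 0.33], mugshots))  # -> subject_001 (close match)
print(best_match([0.95, 0.10, 0.90], mugshots))  # -> None (nothing within threshold)
```

In a scheme like this, the trade-off between catching true matches and producing misidentifications comes down largely to where that matching threshold is set, which is one way to understand the wide range of accuracy estimates quoted above.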

While facial recognition technology may prove useful for suspect identification, your face print could reveal much more than your identity to someone with a cell phone camera and a Wi-Fi connection. Researchers at Carnegie Mellon University were able to link face print data to deeply personal information using the Internet: Facebook pages, dating profiles, even Social Security numbers! Although the FBI has assured the public that it only intends to include criminals in its nationwide database, this has not quieted concerns in the privacy community. The presumption of innocence does not apply to the collection of biometrics. Police commonly collect fingerprints from arrestees, and California’s Proposition 69 allows police to collect DNA samples from all people they arrest, no matter the charge, circumstances, or eventual guilt or innocence. With the legality of pre-conviction DNA collection largely unsettled, the legal implications of new facial recognition technology are anything but certain.

It is not difficult to understand, then, why facial recognition has captured the attention of the federal government, including Senator Al Franken of Minnesota. During a Judiciary Committee hearing in July, Senator Franken underscored the free speech and privacy implications of the national face print database. From cataloging political demonstration attendees to misidentifications, the specter of facial recognition technology has privacy organizations and Senator Franken concerned.

But is new facial recognition technology worth all the fuss? Instead of tin foil hats, should we don ski masks? The Internet is inundated with deeply private information voluntarily shared by individuals. Thousands of people log on to Patientslikeme.com to describe their diagnoses and symptoms; 23andme.com allows users to connect to previously unknown relatives based on shared genetic information. Advances in technology seem to be chipping away at traditional notions of privacy. Despite all of this sharing, however, many users find solace and protection in the anonymity of the Internet. The ability to hide your identity and, indeed, your face is a defining feature of the Internet and the utility and chaos it provides. But as Omer Tene and Jules Polonetsky identify in their article “To Track or ‘Do Not Track’: Advancing Transparency and Individual Control in Online Behavioral Advertising,” online advertising “fuels the majority of free content and services online” while amassing enormous amounts of data on users. Facial recognition technology only exacerbates concerns about Internet privacy by making it possible to harvest user-generated data, shared under the guise of anonymity, and give faces to usernames.

Facial recognition technology undoubtedly provides law enforcement officers with a powerful crime-fighting tool. As with all new technology, it is easy to overstate the danger of governmental abuse. Despite FBI assurances to use facial recognition technology only to catch criminals, concerns regarding privacy and domestic spying persist. Need the average American fear the FBI’s facial recognition initiative? Likely not. To be safe, however, it might be time to invest in those oversized sunglasses you have been pining after.


Got GMOs?

by Ude Lu, UMN Law Student, MJLST Staff

GMOs, genetically modified organisms, have long been a part of our daily diet. For example, most of the soybeans and corn on supermarket shelves are GMOs. Currently, the question of whether these GMOs should be labeled so that customers can make informed purchases is the subject of heated debate in California. California Proposition 37, which would require labeling of GMOs, will be voted on this November. Advocates on both sides have poured millions of dollars into the campaign.

GMOs are plants that have been genetically engineered with characteristics that do not occur naturally, so that harvests can be increased and costs lowered. One prominent example is the soybean. Monsanto, a Missouri-based chemical and agriculture company, introduced its genetically modified soybean, Roundup Ready, in 1996. Roundup Ready soybeans carry genes that make them resistant to weed killers. In 2010, 93% of soybeans planted in the United States were Roundup Ready soybeans.

Although GMOs are one of the most promising ways to sustain the food supply as the global population grows, there is public concern about their safety and confusion about which federal agency is responsible for regulating them.

Amanda Welters, in her article “Striking a Balance: Revising USDA Regulations to Promote Competition Without Stifling Innovation,” published in the Minnesota Journal of Law, Science & Technology, explains the current regulatory scheme for GMOs. Three primary agencies regulate GMOs: the Food & Drug Administration (FDA), the Environmental Protection Agency (EPA), and the United States Department of Agriculture (USDA). The FDA regulates GMOs in interstate commerce that are intended to be consumed by animals or humans as food, the EPA monitors how growing GMOs impacts the environment, and the USDA assesses the safety of growing the GMO plants themselves.

Specifically, the Animal and Plant Health Inspection Service (APHIS) within the USDA is responsible for ensuring that crops are free of pests and diseases. APHIS is currently revising its regulations for GMOs in an attempt to improve transparency, eliminate unnecessary requirements, and enhance clarity. Under the proposed regulations, there will be three types of permits for GMOs: interstate movement, importation, and environmental release.

Taking the position that GMOs are generally beneficial and unavoidable, Welters suggests that the USDA should frame a regulatory structure similar to the Hatch-Waxman Act and the Biosimilar Act to promote both innovation and competition. Readers interested in the regulatory issues of GMOs and the balance between the interests of patent innovators and generic follow-ons would find Welters’ article informative and insightful.


Hurricane Highlights Need for Better Regulatory Tools

by Kenzie Johnson, UMN Law Student, MJLST Managing Editor

The Gulf Coast just can’t seem to catch a break. From the devastation of Hurricane Katrina to the BP Deepwater Horizon oil spill, the region has had its fair share of environmental and natural disasters in recent years. Events this summer have placed the region in the news again: Hurricane Isaac and, perhaps less publicized, a drought that has threatened the fresh water supply in southern Louisiana. On the seventh anniversary of Hurricane Katrina, Hurricane Isaac made landfall, causing severe flooding in rural areas along the Louisiana coast. The drought, meanwhile, has caused water levels in the Mississippi River to drop significantly, allowing saltwater to work its way upstream and threaten some areas’ fresh water supply.

These two events have yet again brought attention to environmental and natural resource issues on the Gulf Coast, but as Daniel Farber points out, environmental degradation in the Gulf Coast region is not a new phenomenon. In an article published in MJLST, “The BP Blowout and the Social and Environmental Erosion of the Louisiana Coast,” Farber explains that the Gulf Coast has long suffered from disappearing wetlands that are important in reducing storm surges, a large aquatic dead zone that threatens marine life, coastal erosion, and numerous threats to biodiversity. He also discusses the effects climate change will have on the region. Farber argues that improved regulatory tools are needed to restore the region’s ecosystems and prepare for the challenges the region is likely to face in the future. He also calls for increased restoration funding, including directing Clean Water Act civil penalties toward Gulf Coast restoration.

In June 2012, Congress passed the RESTORE Act, which directs 80 percent of Clean Water Act penalties into a Gulf Coast Restoration Trust Fund. The Act also creates a Gulf Coast Ecosystem Restoration Council charged with comprehensive planning for restoration of the region and with overseeing the use of Trust Fund money. On September 10, 2012, President Obama signed an Executive Order terminating the Gulf Coast Ecosystem Restoration Task Force and moving forward the establishment of the Gulf Coast Ecosystem Restoration Council. The order also names the Environmental Protection Agency and the Department of Agriculture as trustees on the Natural Resources Damage Assessment Trustee Council, which is charged with assessing natural resource damages from the Deepwater Horizon oil spill, restoring natural resources, and seeking compensation for lost resources.

As recent events show, the Gulf Coast region will continue to face natural disasters as well as environmental and natural resource challenges, and the region needs a regulatory system structured to address such events. Recent actions by Congress and President Obama show promise for long-term restoration, but as Farber points out, the complexity of these issues will require continued action and improvements in regulatory tools to fully restore the region.


An Individual Right to Return of Research Results

by Keli Holzapfel, MJLST Student Editor-in-Chief

Given the importance of results discovered by biorepositories and their implications for an individual’s health care choices, I believe that the individual has the right to receive his results despite their lack of verification. However, this right should be premised upon the individual’s explicit consent to receive his results, and upon the understanding that by receiving these results, the burden of verifying them shifts from the biorepository to the individual.

Biorepositories are collections of biospecimens that are tested and analyzed for scientific purposes. The testing performed on these biospecimens has become the basis for the development of various molecular tests, which are becoming critical to the shift toward personalized medicine. Therefore, as technology advances, the quality and management of biorepositories are becoming more important. This is especially critical for the return of accurate patient data resulting from biospecimen analysis. However, managing and operating a biorepository in the way necessary for the return of results can be very complex and expensive. Many measures must be in place to prevent mistakes in identification and to ensure the quality of the biospecimen being tested. Currently, many existing biorepositories do not meet the Clinical Laboratory Improvement Amendments (CLIA) standards needed for the return of results. For an in-depth discussion of the current state of biorepositories and the issues surrounding return of results, see the article “Perspective on Biorepository Return of Results and Incidental Findings” by Steve Jewell. For an example of what biorepositories need to do to improve their management and specimen oversight, see the College of American Pathologists’ Accreditation Information.

As alluded to above, some of the important questions that arise from the return of results to an individual are inherently linked to the reliability of the result. For example, what should the necessary standard be for a result to be returned to the individual? Is the current threshold for returning results too high? As mentioned, many biorepositories do not meet the guidelines for CLIA certification, which is required for the return of results. This means that potentially critical information is not shared with the individual involved. Is this ethical? Should biorepositories that discover critical information be required to return results to an individual even though the results are not CLIA certified? But if the results are wrong, is the emotional distress that may ensue from the return of results as unethical as withholding them?

Due to the current state of biorepositories, and the huge implications that return of results may have, I think the best solution is to allow for consent-based return to an individual, with the understanding that any returned result needs to be independently CLIA certified. Therefore, only individuals who consent to receive results would get them, the individuals would receive the results with the understanding they could be incorrect, and then further testing would be done to validate the results to the necessary high standards. For additional in-depth discussion of issues surrounding CLIA and non-CLIA certified return of results, see “Ethical and Practical Guidelines for Reporting Genetic Research Results To Study Participants: Updated Guidelines from an NHLBI Working Group.”

For other insights and recommendations regarding return of research results, see MJLST’s Winter 2012 symposium issue, “Debating Return of Incidental Findings and Research Results in Genomic Biobank Research–Law, Ethics, and Oversight.”


The Written Description Requirement Strikes Back

by Nihal Parkar, UMN Law Student, MJLST Staff

The written description requirement for patents often resembles the proverbial neglected middle child: it is left to its own devices and entrusted with its own care. The typical patent practitioner carefully chisels away at the claims with a thesaurus, and then proceeds to encase the exquisite sculpture with a glob of written description. Yes, the detailed description of the drawings and alternative embodiments may follow the core structure of the claims, but let’s face it: the average specification is hardly as painfully beautiful as the average claim.

A recent paper by Aaron Rabinowitz in Volume 12 of the Minnesota Journal of Law, Science & Technology, “Ending the Invalidity Shell Game: Stabilizing the Application of the Written Description Requirement in Patent Litigation,” analyzes this paradox in the context of how courts apply the written description requirement to routinely invalidate patents issued by the USPTO, those very patents that have gone through the tortuous path of examination at the PTO, been vetted by examiners, and been reshaped in the course of the ping-pong game of office actions and their replies.

A high level of invalidation by the courts seems problematic. After all, shouldn’t patent owners be entitled to rely on the PTO’s evaluation of their patent, and be free to assert it against alleged infringers in court, without fearing that the court will find the written description to be as riddled with holes as the typical chunk of Swiss cheese? Well, the courts can’t quite be blamed, given that the PTO works in mysterious ways. Reviewing the file wrapper often does not explain how each claim fulfilled the written description requirement in combination with the rest of the specification. A law firm helpfully points out, “Make Sure Your Patents Do Not Prove Their Own Invalidity!” To add to the complexity of the situation, the “written description requirement is separate from enablement.”

Patent owners would be wise to worry about the potential pitfalls of the written description getting shredded between Scylla and Charybdis. As Rabinowitz points out, “over 2000-2009, parties that attacked a patent on written description grounds succeeded more than forty percent of the time.”

Fortunately, Rabinowitz does not merely cry wolf, but supplies some solutions to keep the wolf at bay. To strengthen a patent’s validity, patent applicants can choose “to affirmatively identify the written description support for their claims in the application” and then wait for “the PTO to either approve or question the applicant’s statement of support.” Of course, with patent examiners already somewhat overburdened, the PTO may not be enthusiastic about yet another step in prosecution.

Rabinowitz’s article raises an interesting question that is often overlooked, and provides a practical, workable solution that is likely to benefit patent owners, patent challengers, the PTO, and the courts.


Social Media Evidence: Not Just an Attorney Niche

It is not breaking news that social media, once just the province of Generation Y and high-tech culture, is now as mainstream as . . . well . . . the internet. What is new is that social media issues are no longer just an interesting specialty niche for tech-savvy lawyers, but something that likely touches most attorneys’ practices.

A look at the rapid rise of appellate-level cases involving social media evidence gives a hint at just how common such evidence is becoming in civil litigation and criminal prosecution. The chart accompanying this post, while not a definitive study, shows the results of a Westlaw search for the number of appellate cases that likely involved the admission of evidence related to the major social media outlets: the count has increased eight-fold since 2008 and doubled in the past two years.

In separate research, the eDiscovery firm X1 Discovery recently reported finding 674 appellate cases in 2010-2011 that mentioned social media evidence. With that many cases reaching the appellate level, it is not unreasonable to conclude that social media evidence must be seen frequently by the lower courts.

Whether it is understanding how to authenticate a Tweet during trial, or avoiding a career-ending discovery sanction for spoliation of Facebook evidence, there is a growing need for litigators and other attorneys to understand the implications of social media for clients.

In Issue 13.1 of the Minnesota Journal of Law, Science & Technology, Professor Ira P. Robbins of American University’s Washington College of Law outlines the challenges involved in authenticating social media evidence and proposes an authorship-centric approach to the authentication of such evidence. Read “Writings on the Wall: The Need for an Authorship-Centric Approach to the Authentication of Social-Networking Evidence.”


Bioethical Concerns 34 Years After 1st Test Tube Baby

Professor Susan Wolf, Founding Chair of the Consortium on Law and Values in Health, Environment & the Life Sciences (which oversees and manages MJLST), discusses the latest bioethical concerns related to in vitro fertilization (IVF) on Minnesota Public Radio’s The Daily Circuit program.

In related content, MJLST Issue 10.1 included an article by Debora Spar, author of The Baby Business: How Money, Science and Politics Drive the Commerce of Conception, and attorney Anna M. Harrington, entitled “Building a Better Baby Business,” which offers a road map to ensuring quality and equity in the reproductive technology industry.

For insights into understanding legal responses to technological change, using in vitro fertilization as an example, see “Understanding Legal Responses to Technological Change: The Example of In Vitro Fertilization” by Lyria Bennett Moses in MJLST Issue 6.2.