To Serve or Not to Serve the Shareholders; That is the Question

by Maya Suresh, UMN Law Student, MJLST Staff

Hikma Pharmaceutical Company recently received the International Finance Corporation’s (IFC) 2012 Client Leadership Award for its strong commitment to the community and its leadership in the pharmaceutical industry. The award recognizes companies that demonstrate this commitment through a variety of factors, including strong corporate governance. As the largest pharmaceutical manufacturer in the Middle East, Hikma has served the public by providing affordable, lifesaving medicines to those in need. The CEO of the IFC lauded the company for setting the standard for corporate social responsibility within the industry.

The importance of Hikma’s actions was the basis for Martin Hirsch’s article, “Side Effects of Corporate Greed: Pharmaceutical Companies Need a Dose of Corporate Social Responsibility,” published in Issue 9.2 of the Minnesota Journal of Law, Science & Technology. The article examines the tension between the shareholders of pharmaceutical companies and the public those companies strive to serve. The shareholder model of corporate governance focuses on maximizing shareholder profit, which often results in the production of lifestyle drugs over drugs that cure life-threatening diseases. Lifestyle drugs, such as medicines for baldness and toe fungus, are in high demand and are therefore sold at large profits. The lifesaving drugs that are most needed, however, are the ones pharmaceutical companies refuse to produce. Demand for these drugs comes largely from people living in poorer regions of the world, who typically cannot afford the high prices pharmaceutical companies set. Because the drugs go unsold, even though they are desperately needed, companies stop investing in their development and production.

Hirsch argues that some companies go further, influencing doctors’ diagnoses of patients in an effort to increase sales of higher-revenue drugs. Hikma’s actions could lead consumers to believe there is hope for the public, and that some companies are beginning to take a stand against the skewed model that has plagued the industry. Other companies, however, continue to favor lifestyle drugs over lifesaving ones.

WebMD has recently come under criticism for succumbing to that pressure when it should be serving as an objective medical resource for the public. A rigged online test for depression led test takers to believe they might be at risk for depression when in fact they were not, a clear example of a company working to serve the Big Pharma industry rather than the public. Unfortunately, as Research Associate Rallis asserts in the article, WebMD has no plans to alter its business model and, as such, won’t be breaking ties with the industry anytime soon.

There is some hope that this tension within the industry will ease as Hikma’s example rubs off on others. It is also clear, however, that there is still work to be done.


Brains on Drugs: The Need for a Drug Policy That Embraces Scientific Understanding of Addiction

by Mike Borchardt, UMN Law Student, MJLST Managing Editor

The strong showing in polls for marijuana legalization efforts in Colorado and Washington illustrates that America’s attitudes toward illegal drugs are starting to shift. But even as some voters’ views on pot change, there is still a strong disconnect, especially when it comes to harder drugs, between what we know about addiction and the policies we use to curb drug use. In their article in MJLST 11.1, “Why Neuroscience Matters for Rational Drug Policy,” David M. Eagleman, Mark A. Correro, and Jyotpal Singh outline this disconnect between science and policy. As they explain, “Although addiction may involve volitional choices early on, it is best understood as a brain disease.” Despite this being the general consensus of the scientific community, our drug policies do too little to address addiction as a disease.

A good example is Suboxone (buprenorphine), a drug used to treat opiate addiction. The U.S. government spent millions of dollars funding Reckitt Benckiser’s development of Suboxone. It is an opiate that is much more difficult to overdose on than drugs like heroin, and it is used to help manage withdrawal and cravings. Due to fears that it will be abused, Suboxone is difficult for many addicts to get. Doctors must undergo special training to prescribe it and may treat only a limited number of patients (roughly 30 to 100) at a time. Additionally, many doctors are wary of prescribing it because they don’t want to draw addicts to their offices. This makes access to Suboxone more difficult than necessary; addicts turn to street dealers for a supply of it, and when the dealers don’t have it, they use heroin or other opiates to satisfy their addiction.

Making Suboxone unnecessarily difficult for addicts to get is only one example of the disregard our drug policy shows toward the scientific understanding of addiction. As Eagleman, Correro, and Singh explain (at page 20), “The United States has a history of combating the drug problem with increased law enforcement rather than customized intervention and rehabilitation.” Although treatment has been shown to be both more cost-effective and more effective at reducing drug use than incarceration, drug-treatment programs are underfunded and stigmatized. As the economic recession in the U.S. has led to tighter budgets, these programs are often among the first things on the chopping block. Though U.S. drug policy has generally been, and still is, heavily focused on law enforcement as a solution to the drug problem, there have been some hopeful developments. The Affordable Care Act includes addiction treatment as one of the “Essential Health Benefits” insurers are required to provide. If the law succeeds in getting more Americans, especially low-income Americans, health insurance, it could open avenues of treatment that were formerly unavailable to drug addicts because of their cost.


Censorship Remains Viable in China--but for How Long?

by Greg Singer, UMN Law Student, MJLST Managing Editor

In the West, perhaps no right is held in higher regard than the freedom of speech. It is almost universally agreed that people have an inherent right to speak their minds as they please, without fear of censorship or reprisal by the state. Yet for the more than 1.3 billion people currently residing in one of the oldest civilizations on the planet, such a concept is either unknown or wholly unreflective of the reality they live in.

Despite the exploding number of internet users in China (from 200 million in 2007 to over 530 million by the end of the first half of 2012, more than the entire population of North America), the Chinese government has remained remarkably effective at banishing almost all traces of dissenting thought from the wires. A recent New York Times article detailing the fabulous wealth of Chinese Premier Wen Jiabao and his family members (at least $2.7 billion) resulted in the almost immediate censorship of the newspaper’s English and Chinese web presence in China. Not stopping there, the censorship apparatus went on to scrub almost all links, reproductions, and blog posts based on the article, leaving little trace of its existence for the average Chinese citizen. Earlier this year, Bloomberg News suffered a similar fate when it, too, published an unwelcome report on the unusual wealth of Xi Jinping, the Chinese Vice President and expected successor to current President Hu Jintao.

In “Forbidden City Enclosed by the Great Firewall: The Law and Power of Internet Filtering in China,” published in the Winter 2012 issue of the Minnesota Journal of Law, Science & Technology, Jyh-An Lee and Ching-Yi Liu explain that it is not mere tenacity that permits such effective censorship: the structure of the Chinese internet itself has been designed to allow a centralized authority to control and filter the flow of all communications over the network. Even with the decentralizing force of content creation on the web, it appears that censorship will remain technically possible in China for the foreseeable future.

Still, technical capability is not synonymous with political permissibility. A powerful middle class is emerging in the country, with particular strength in the large urban areas, where ideas and sentiments spread quickly even in the face of government censorship. At the same time, GDP growth is steadily declining from its tremendous peak in the mid-2000s. These two factors may combine to produce a population with the time, education, and wherewithal to challenge a status quo that may look somewhat less like marvelous prosperity in the coming years. If China wishes to enter the developed world as a peer to the West (with an economy based on skilled and educated individuals rather than mass labor), addressing its ongoing civil rights issues seems an almost unavoidable prerequisite.


Political Data-Mining and Election 2012

by Chris Evans, UMN Law Student, MJLST Managing Editor

In “It’s the Autonomy, Stupid: Political Data-Mining and Voter Privacy in the Information Age,” I wrote about the compilation and aggregation of voter data by political campaigns and how data-mining can upset the balance of power between voters and politicians. The Democratic and Republican data operations have evolved rapidly and quietly since my Note went to press, so I’d like to point out a couple of recent articles on data-mining in the 2012 campaign.

In August, the AP ran this exclusive: “Romney uses secretive data-mining.” Romney has hired an analytics firm, Buxton Co., to help his fundraising by identifying untapped wealthy donors. The AP reports:

“The effort by Romney appears to be the first example of a political campaign using such extensive data analysis. President Barack Obama’s re-election campaign has long been known as data-savvy, but Romney’s project appears to take a page from the Fortune 500 business world and dig deeper into available consumer data.”

I’m not sure it’s true that Buxton is digging any deeper than the Democrats’ Catalist or Obama’s fundraising operation. Campaigns from both parties have been scouring consumer data for years. As for labeling Romney’s operation “secretive,” the Obama campaign wouldn’t even comment on its fundraising practices for the article, which strikes me as equally if not more secretive. Political data-mining has always been covert on both sides of the aisle; that’s part of the problem. When voters don’t know they’re being monitored by campaigns, they are at a disadvantage relative to candidates. (And when they do know they’re being monitored, they may alter their behavior.) This is why I argued in my Note for greater transparency in candidates’ data-mining practices.

A more positive spin on political data-mining appeared last week, also by way of the AP: “Voter registration drives using data mining to target their efforts, avoid restrictive laws.” Better, cheaper technology and Republican efforts to restrict voting around the country are inducing interest groups to change how they register voters, swapping their clipboards for motherboards. This is the bright side of political data-mining: being able to identify non-voters, speak to them on the issues they care about, and bring them into the political process.

The amount of personal voter data available to campaigns this fall is remarkable, and the ways data-miners aggregate and sort that data are fascinating. Individuals ought to be let in on the process, though, so they know which candidates and groups are collecting what types of personal information, and so they can opt out of the data-mining.


510(k) Process Comes Under Renewed Scrutiny by Legislators, Awaiting FDA Response as Proposed Legislation Remains Stagnant

by Ashley Zborowsky, UMN Law Student, MJLST Notes & Comments Editor

Several months ago, Representative Edward J. Markey (D-Mass.) and Senator Jeff Merkley (D-Ore.) wrote a letter to Jeffrey Shuren, Director of the Center for Devices and Radiological Health (CDRH) at the U.S. Food and Drug Administration, calling for an overhaul of the 510(k) pre-market notification database. The legislators cite reports of defective medical devices cleared via the 510(k) process in recent years, such as the DePuy® artificial hip, that have caused “grievous[] and irrevocabl[e]” harm to patients.

The issue? Most devices subject to premarket review are cleared through FDA’s 510(k) process, which provides expedited clearance for products that are deemed “substantially equivalent” to an existing predicate device. While the 510(k) process allows patients earlier access to devices, the system is inherently flawed. Through what is referred to as “predicate creep,” the 510(k) process clears generation after generation of devices, none of which has been subject to the exacting scrutiny of pre-market approval (the PMA process). As Markey and Merkley note in their letter to Shuren, many predicate devices upon which new products rely for 510(k) clearance have themselves been recalled by manufacturers due to “fundamental design flaw[s].”

The legislators asked Shuren and the FDA to retrospectively update the 510(k) database to clearly indicate devices recalled for serious design flaws that could adversely affect safety or effectiveness. The letter also asked FDA, among other things, to develop a mechanism for identifying certain 510(k) entries to “reflect instances where a device’s clearance traces back to a predicate recalled for a serious design flaw adversely impacting its safety, even if the original problematic device is not the immediate predicate.” For additional in-depth discussion of issues surrounding the existing 510(k) process and substantial equivalence, including product liability considerations, see “Rethinking Lohr: Does ‘SE’ Mean Safe and Effective, Substantially Equivalent, or Both?”

After the Institute of Medicine released its highly controversial report on the current 510(k) process last year (stating that the process is flawed and recommending a new pre- and post-market regulatory framework to provide a reasonable assurance of safety and efficacy), the issue of device safety has been omnipresent in policy debates surrounding related concerns of access and innovation. For a critique of the IOM report and a summary of its findings, see “A Failure to Comply: An Initial Assessment of Gaps in IOM’s Medical Device Study Committee” and “Left to Their Own Devices: IOM’s Medical Device Committee’s Failure to Comply.” In light of these concerns, Representative Markey and others introduced H.R. 3847, coined The SOUND Devices (Safety of Untested and New Devices) Act of 2012, in January. The bill proposes to amend the Federal Food, Drug and Cosmetic Act to allow the FDA to reject a claim of substantial equivalence for a device whose predicate has been “recalled, corrected or removed from the market because of an intrinsic flaw in technology or design that adversely affects safety . . . .”

In testimony given to the House Committee on Energy and Commerce on device user fees back in February, Shuren discussed strategic priorities for the 510(k) process, including developing methods and procedures for the systematic analysis and use of medical device recall information by September 30, 2012. However, now that the Medical Device User Fee Amendments (MDUFA III) have been enacted, reauthorizing device user fees through fiscal year 2017, perhaps the FDA and CDRH will finally be able to make progress in revamping the 510(k) system. As Shuren noted in his testimony, “[w]hile it is true that providing more user fee resources alone won’t solve the problems with our premarket programs, insufficient funding is at the root of, or a contributing factor to, several of these problems. Adequate and stable funding is one key component to our and industry’s success in bringing safe and effective devices to market quickly and efficiently.”

Currently, the 510(k) process remains unchanged. Though the legislators requested an official response no later than September 19, 2012, the FDA and Shuren have yet to release a statement (at least publicly) regarding these concerns. Additionally, it is unclear whether CDRH has made any headway in meeting its target goals. As we approach the end of 2012, the 510(k) process still leaves much to be desired; the highly anticipated changes to the pre-market clearance process remain provisional at best. It seems as though Markey isn’t having much luck in Congress either. H.R. 3847 is awaiting approval from the House Committee on Energy and Commerce (and has been since shortly after its introduction). According to GovTrack.us, an online bill-tracking website, the bill has only an estimated three percent chance of being enacted.


Obama, Romney probably know what you read, where you shop, and what you buy. Is that a problem?

by Bryan Dooley, UMN Law Student, MJLST Staff

Most voters who use the internet frequently are probably aware of “tracking cookies,” used to monitor online activity and target ads and other materials specifically to individual users. Many may not be aware, however, of the increasing sophistication of such measures and the growing extent of their use, in combination with other “data-mining” techniques, in the political arena. In “It’s the Autonomy, Stupid: Political Data-Mining and Voter Privacy in the Information Age,” published in the Spring 2012 issue of the Minnesota Journal of Law, Science & Technology, Chris Evans discusses the practice and its implications for personal privacy and voter autonomy.

Both parties rely extensively on data-mining to identify potentially sympathetic voters and target them, often with messages tailored carefully to the political leanings suggested by detailed individual profiles. Technological developments and the widespread commercial collection of consumer data, of which politicians readily avail themselves, allow political operatives to develop (and retain for future campaigns, and share) personal voter profiles with a broad swath of information about online and market activity.

As Evans discusses, this allows campaigns to allocate their resources more efficiently, and likely increases voter turnout by actively engaging those receptive to a certain message. It also has the potential to chill online discourse and violate the anonymity of the voting booth, a central underpinning of modern American democracy. Evans ultimately argues that existing law fails to adequately address the privacy issues stemming from political data-mining. He suggests additional protections are necessary: First, campaigns should be required to disclose information contained in voter profiles upon request. Second, voters should be given an option to be excluded from such profiling altogether.


Biobanks Revisited

by Jeremy So, UMN Law Student, MJLST Managing Editor

On October 28, Australian researchers published new information about the genetic basis of endometriosis, a condition in which tissue like the lining of the uterus grows in other areas of the body. Instead of recruiting their own research subjects, the researchers analyzed samples stored in biobanks in Australia, Japan, and Europe. Because of this approach, they were able to identify common markers that appeared across the ethnically diverse study population. The Australian team’s findings highlight the increasing importance of biobanks, repositories for biological research samples, which have become a valuable resource in the fields of genomics and personalized medicine.

The increasing importance of biobanks was recently highlighted in a symposium sponsored by MJLST. In the accompanying Spring 2012 issue, researchers and lawyers discussed one of the primary problems facing researchers who use biobanks: whether to return research results and incidental findings to research participants.

While the Australian researchers have decided to track down the original participants in order to share their findings, other researchers have hesitated to take the same approach. Karen J. Maschke highlighted several reasons for this hesitation in her recent article “Returning Genetic Research Results: Considerations for Existing No-Return and Future Biobanks.” In the article, Maschke focuses on the approaches of American biobank researchers, who generally do not share their results with the individuals whose DNA was analyzed.

Maschke notes that samples stored for biobank research in the United States are regularly de-identified, making it difficult, if not impossible, to contact the original donor. Such a system exists in part because of concerns over whether consent would be granted for samples to be used in certain types of research. This, combined with conflicting interpretations of government regulations and other practical difficulties in returning sample results, has made researchers hesitant to adopt a disclosure-based system for research results.

Although some researchers remain hesitant, cooperation between researchers and biobank participants has not necessarily led to negative outcomes.

The need to resolve this conflict is underscored by the growing prevalence and importance of biobanks in scientific research. Several countries are working to expand their biobank networks. Now, before competing standards come to dominate the field, a uniform system for the return of results should be agreed upon and implemented.


Ask Not What Your Country Can Do For Your Patent . . .

by Caroline Marsili, UMN Law Student, MJLST Staff

The candidates aren’t talking about patents (with the exception of a brief quip about IP piracy in last Tuesday’s debate). But if it’s “all about the economy,” they should be talking patent policy.

In the presidential and vice-presidential debates of recent weeks, the candidates have exchanged vitriol and “gotchas” and have established a contrast in both policy and character for voters. Notably absent from the debates has been any discussion of innovation, and more specifically, of the role of IP policy in innovation. IP policy would seem an attractive platform for discussing job creation, as IP industries account for a vast portion of the nation’s jobs and GDP (“IP-intensive industries” accounted for 27.7 percent of all jobs in the economy in 2010). It’s possible that the candidates find common ground on this issue. Alternatively, the topic is, for the time being, moot in the wake of the America Invents Act, the full effects of which are yet to be seen.

Since its passage just over a year ago, some critics have expressed doubt that the Act will create jobs and promote innovation as promised. Others argue not that the Act is failing, but that it represents a misplaced effort to reform patent policy.

The solution? “Don’t just reform patents, get rid of them.” A recent working paper by Boldrin & Levine makes the bold case that our patent system is ultimately more trouble than it’s worth. The authors admit that abolishing patents “may seem ‘pie-in-the-sky'” and acknowledge the glut of transitional issues that would need addressing; just the same, they conclude that the key to reforming our patent system is to get rid of it. Their central beef with the system is the want of empirical evidence that it does what it purports to do: promote innovation and productivity. Meanwhile, there are other incentives for innovation and many negative externalities of the patent system.

Other authors have proposed less radical approaches to revamping the patent system. In her recent MJLST article, “An Organizational Approach to the Design of Patent Law,” Liza Vertinsky also finds that the empirical literature fails to decisively connect patents to innovation and economic growth. However, Vertinsky takes a more optimistic approach to the floundering patent system, arguing that policy-makers should seize reform efforts as an opportunity to tailor patent law to innovation objectives. The America Invents Act, she argues, isn’t a significant change in the direction of patent policy; instead it seeks to remedy narrower concerns with administrative backlog, litigation costs, and patent quality. In her view, patent policy should be revamped to encourage innovation based on how individuals and organizations–corporations, Congress, the PTO–really function.

Vertinsky’s “organizational approach” entails a new way of thinking about patents: patent policy, informed by economic theory, should be fashioned to strengthen the organization of innovation rather than to focus narrowly on incentivizing individual acts of invention. For example, patent laws can be tailored to the needs of different innovation processes across different industries. While sweeping changes in patent policy are unlikely at this time (witness the battles encountered in passing and implementing the America Invents Act), Vertinsky’s proposals should inform discussion among policy-makers about what the patent system can and should do. The Obama Administration’s national innovation strategy neglected to give patent policy a central role in encouraging innovation, but the desire to build an “innovation economy” is certainly there, and a rational, successful patent policy is vital to attaining the kinds of high-level jobs and industry the country needs and the candidates promise.

(Others think patent policy may not matter at all. What do you think?).


Juggling GMOs: Balancing Benefits, Risks, & Unknowns

by George Kidd, UMN Law Student, MJLST Staff

The recent multi-billion-dollar losses resulting from the fifth-worst drought ever recorded in U.S. history add fuel to an already raging debate over genetically modified organisms (“GMOs”). Amanda Welters, in “Striking a Balance: Revising USDA Regulations to Promote Competition Without Stifling Innovation,” delivers a fantastic overview of key issues in the GMO debate while also introducing novel legislative ideas drawn from the pharmaceutical industry. Ms. Welters’ article provides important insights into the continuing struggle to provide society with an optimal outcome.

While recent documentaries such as “Food, Inc.” and “King Corn” give informative, although one-sided, analyses of the GMO debate, GMOs may indeed be necessary for the future. The recent drought only underscores why GMO crops may be so important. Their benefits could include increased resistance to severe weather, greater food production from less land, and decreased pesticide use. With so many benefits, it is easy to see why these crops may have a lasting future.

But the road to societal riches through GMOs may be a tightrope walk with a long fall. Most of the pushback stems from the fact that the effects of consuming GMO products are largely unknown. Further, when all farmers use GMO seed, biodiversity is reduced, creating problems if a disease were to effectively eradicate a particular GMO crop. Lastly, while Monsanto has done a good job of creating essentially “self-destructing” seed, reducing the crop yield of later generations of its soybeans to encourage farmers to purchase new seed each year, the introduction of modified genetic material may still have an irreversible environmental impact.

In light of the World Bank’s recent global hunger warning, perhaps we should accelerate our efforts to strike a legislative balance, whether that means moving forward with GMO crops or looking elsewhere for innovative ideas. Producers of new GMO technology need to remain adequately incentivized to make GMOs more effective and safer for human consumption. But competition also plays an important role in improving GMOs’ future viability. The expiration of Monsanto’s Roundup Ready soybean patents in 2014 will allow generic competition to spur price drops and competitive innovation.

In the end, when we do find that optimal balance between innovation and competition, the only winners are us.

