Noteworthy: Are Genes Patentable?

From a scientist’s standpoint, the Supreme Court’s ruling this week in Association for Molecular Pathology et al. v. Myriad Genetics, Inc., et al. is perversely logical. For the molecular biologists reading this, frustrated by the ambiguity in the mainstream media coverage, the ruling can be summarized in three bullet points:

  • DNA sequences are not patentable
  • mRNA sequences are not patentable
  • cDNA sequences ARE patentable

DNA sequences are not patentable because they exist in nature. Identifying a gene using common techniques does not entitle the discoverer to patent either the gene’s location or its sequence, because she neither created the sequence nor caused it to sit at that location in the genome; others therefore remain free to use the information. This exclusion from patent coverage applies to all variations of the gene, including sequences with specific mutations linked to disease states.

mRNA (both as transcribed and once spliced) is not patentable because it is found in nature. A scientist who isolates mRNA did not create it, and is therefore not creating a new product.

It should be noted that the courts have held that if a scientist discovers a new method of isolating DNA, mRNA, cDNA, or any similar scientific tool, that new method would be patentable, even though the sequences discovered using it might not be.

cDNA, however, is created in a lab, even if naturally occurring enzymes and chemicals are used to make it. The reasoning behind this part of the ruling is that, in the absence of human intervention, these specific DNA sequences (the gene with its introns removed) would not exist. Thus the scientist has created a new, patentable product. I’ll be discussing this ruling over a few posts, from different angles. In my next post I’ll discuss the impact this ruling has for molecular biologists.
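
To make the intron distinction concrete, here is a minimal sketch in Python. The sequences and the exon/intron layout are invented purely for illustration; they are not from any real gene.

    # Toy illustration of why cDNA is not a "product of nature".
    # The exon/intron pieces below are invented for demonstration only.

    exons   = ["ATGGCC", "GGATTT", "TGCTAA"]   # pieces kept in the mature mRNA
    introns = ["GTAAGTAG", "GTTTCCAG"]         # pieces spliced out

    # The genomic DNA, as found in nature: exons interrupted by introns.
    genomic_dna = exons[0] + introns[0] + exons[1] + introns[1] + exons[2]

    # The mature mRNA, also found in nature: introns removed, T -> U.
    mrna = "".join(exons).replace("T", "U")

    # cDNA: made in the lab by reverse-transcribing the mRNA. Its intron-free
    # sequence does not occur as such anywhere in the genome.
    complement = {"A": "T", "U": "A", "G": "C", "C": "G"}
    cdna = "".join(complement[base] for base in reversed(mrna))

    print("genomic DNA:", genomic_dna)   # not patentable (exists in nature)
    print("mRNA:       ", mrna)          # not patentable (exists in nature)
    print("cDNA:       ", cdna)          # patentable under the ruling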

The official ruling can be found here.

Fatigue related to outdoor darkness

Living in England in the heart of winter can make for quite a dark time, literally. For a number of months the sun does not rise until after 8 AM and sets before 4 PM. Even people who have lived in the country all their lives will tell you that this makes it seem much later than it is in the evenings. There is no denying that the environmental darkness we experience has an effect on our psychological state, and possibly our physical state as well. I recently experienced this first-hand after spending almost two weeks in South Africa on holiday, where the sun rose before 6 AM and set after 8 PM. Upon my return to Oxford I immediately lost over six hours of sunlight with no jet lag (the time difference is only two hours). I find myself asking for dinner around 4:30 in the afternoon because it looks like dinner time outside, and simply getting more tired in the evenings.

But what does science have to say about all this? The science of how our bodies regulate themselves on a daily cycle is the study of circadian rhythms. And indeed, circadian rhythms are not just psychological: body temperature and gastrointestinal, endocrine, and respiratory functions all vary over the daily cycle. Metabolism is also affected (though whether that or my brain is more at work in my 4:30 desire for dinner is debatable). Popular topics of study among circadian rhythm scientists are jet lag, shift work (i.e. working night shifts), and seasonal affective disorder (SAD). All of these are instances where our bodies experience environmental conditions that depart from the norm.

Jet lag is easily explained through circadian rhythms. When you cross time zones, your body immediately senses the new environmental conditions and your rhythms gradually shift to fall into line with them. Because the cycles take time to realign, you experience the feelings of jet lag in the interim, with measurable changes in behaviour and performance. Working the night shift is more difficult than switching time zones because it pits the body against some of the environmental cues (sunlight) that normally govern its functions. In these instances other cues, such as social interactions, become more important in trying to shift the cycle. Even so, studies have shown that individuals who work night shifts exhibit more behavioural problems and social isolation than those working normal shifts.
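
For intuition only, here is a toy calculation of the realignment process. This is not a model from the circadian literature; the one-hour-per-day adjustment rate is just a common rule of thumb used as an assumption, and the trips are my own examples.

    # Toy sketch: how long re-entrainment takes if the internal clock drifts
    # toward local time by roughly one hour per day (an assumed rule of thumb,
    # not a measured constant; real rates vary by person and travel direction).
    import math

    def days_to_realign(hours_shifted, drift_per_day=1.0):
        """Approximate days of residual misalignment after a time-zone change."""
        return math.ceil(abs(hours_shifted) / drift_per_day)

    print(days_to_realign(2))   # my South Africa trip: ~2 days, barely noticeable
    print(days_to_realign(8))   # a long-haul trip: roughly a week of jet lag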

So how does this all play out with respect to the English winter? Well, in winter we get a conflicting set of cues. True, the hours of sunlight we experience diminish, but our work schedules and social interactions stay on the same timetable, in effect helping out where the environmental cues are now absent. The hypothesis is that our circadian rhythms are phase-delayed in winter compared to summer, but this has not been conclusively demonstrated. Interestingly, much research has been done with special bright-light therapies intended to reduce the symptoms of SAD. While these therapies do generally improve symptoms, they do not appear to be affected by when they are administered. This puzzles scientists, who suspect the improvement may partly reflect a “sophisticated placebo or expectation response”.

The general outlook of the English is simply to accept the situation as it is and take the occasional winter holiday to Spain or Mallorca. But perhaps the finding that night-shift workers cope better with the situation in older age, thanks to more developed coping mechanisms, is the most illuminating.

Disorders of the sleep-wake cycle in adults

On New Year’s Resolutions

Do New Year’s resolutions work? Do they set us up for disappointment? Are they a marketing ploy to increase sales, particularly of fitness equipment and gym memberships, at the beginning of every year? Or can they actually be helpful, an honest effort and impetus, however brief, to genuinely improve our lives?

Different studies over the years have shown high levels of initial success (e.g. at one week) that diminish markedly by a month, with only limited success remaining after one or two years. In general, it appears that individuals fail more frequently with weight-related resolutions than with non-weight-related ones, e.g. quitting smoking or drinking.

An interesting study, published in 2002, aimed not just to measure rates of resolution success, but also the driving forces behind it, and to answer directly whether making a resolution actually effected change compared with merely contemplating change. The study acknowledges that its success rates may be slightly inflated by the nature of self-reporting, with less successful participants declining follow-up interviews or overstating their results, and with some participants motivated to stick to their resolutions precisely because they knew they would be interviewed again. Still, several things from the study are worth noting. First, that 40% of American adults make New Year’s resolutions every year. Second, that unsuccessful resolvers are likely to make the same resolution in following years until some degree of success has been achieved. And third, that resolving is not in vain: most participants do achieve initial success, even if it is not particularly long-lasting. As the authors point out, the rate of success should be compared to that of non-resolvers, i.e. people who don’t make a conscious effort to improve at all.

Most importantly, and surprisingly, once the initial resolution had been made, desire to change had no correlation with resolution success. Instead, success was found among participants who used techniques of “self-liberation, stimulus control, reinforcement management, positive thinking, and avoidance strategies”, whereas those who were unsuccessful were characterized by “self-reevaluation, wishful thinking, self-blame, and minimized threat”. That is, those who were more successful better managed and kept on top of the situation and their goals, whereas those who were unsuccessful were more likely to get frustrated, blame themselves, and/or talk themselves out of the need to change.

Thus, it’s better to attempt a New Year’s resolution than not to try at all. And above all, stay positive. What’s your New Year’s resolution?

Auld Lang Syne: Success Predictors, Change Processes, and Self-Reported Outcomes of New Year’s Resolvers and Nonresolvers

The NIH-OxCam Program

I thought I would give a brief shout-out to the NIH-OxCam program that I’m on, for anyone reading this blog who is considering graduate programs. Applications are now open until January 2nd, and the official website and application information can be found here:

NIH OxCam Webpage

Just over a decade ago, as graduate students were first starting to make their way onto the National Institutes of Health (NIH) campus, the NIH formed what is now known as the Graduate Partnerships Program (GPP). The NIH decided it was not going to become a degree-granting institution; instead, it would fund students to do research on campus while working towards degrees at partner universities. One of the first partnerships was with Oxford and Cambridge.

Each graduate partnership at the NIH works differently, but the OxCam program, as it is known, gives American citizens the opportunity to study a biomedical-related field at either Oxford or Cambridge as well as at the NIH. (There is a similar program called the NIH-Wellcome program for non-US citizens.) While the NIH funds the degree, the university in England is in charge of the academic side of affairs, including examination and degree conferral. The goal is for students to split their time evenly between the two locations, working in collaborating labs and pulling together a cohesive project.

As with any course of graduate study, the success of the project depends on many variables, the three main ones being mentorship from supervisors, tenacity of the student, and topic choice. The great thing about the program is that it offers, in many respects, a middle ground between British and American attitudes towards PhD projects. One of the main aspects of this is that funding is given for five years, which is longer than the British timeframe but shorter than the American one. This seems to work out just right for students in the program, who generally graduate after about 4.5 years.

The program is certainly not for everyone, in that on top of the usual qualities of a PhD candidate you also need good communication skills, self-motivation, and extra tenacity. But it can, and does, work out, and I would definitely encourage anyone interested to apply. And please do contact me if you have any questions.

Holidays Can Be Dangerous

While searching PubMed for relevant Hanukkah-related articles, I came across an interesting one.

Child Injury in Israel

In this article, the authors studied trends in injuries seen in the emergency room at the Petach Tikvah Children’s Medical Center in Israel. Interestingly, they found that injury rates in children rose in specific ways tied to certain Jewish holidays. In particular, there was a significant rise in burn injuries around the holidays of Hanukkah, Lag BaOmer, and Passover. Hanukkah involves the nightly lighting of candles, and Lag BaOmer traditionally involves bonfires. While Passover does not explicitly involve any fire-related activity, the authors speculate that the rise is due to an overall increase in cooking in the house around this festive period. What is clear, however, is that there is a sudden jump in accidental poisonings around Passover, due to the cleaning products used during holiday-mandated spring cleaning. In the days surrounding and including Passover there is also a spate of bicycle- and skateboard-related injuries: cars are not driven during the holiday, so children have free rein of the roads and use the opportunity to promptly injure themselves.

So be careful when lighting those Hanukkah candles.

Oh, and if you’re Christian, this is advance warning to beware pokey objects near eyeballs and other Christmas-related injuries:

Christmas-related eye injuries: a prospective study.

Propeller and jet-ski injuries during Christmas and New Year in Western Australia

News in Tissue Engineering: Growing Meat in a Petri Dish?

BBC: How growing meat in a petri dish may be the future

This is an excellent video made by the BBC in conversation with Professor Post at Maastricht University, and it addresses many of the key issues. Briefly, here are some important points to bring up.

Macro-impact:
o Growing meat the current way has huge environmental effects, driven by the amount of feed given to animals and the amount of waste they produce.
o Not discussed: the effects on the economy of transitioning from an agricultural system to a biological factory system.
o Also not discussed: what would happen to all the waste from a meat factory?

Taste:
o No one knows exactly why meat tastes the way it does.
o Fat, which is believed to give meat from an animal its flavor, would have to be grown separately and mixed in, and there is no guarantee that fat grown in a dish tastes like fat from an animal. Back to square one.
o Texture was not directly discussed in the video, though it appears that Professor Post is applying mechanical pressure to the cells (conjecture on my part, based on an apparatus I observed in the video); this would suggest that he is actually producing muscle tissue and not just layers of muscle cells – an important distinction.

Scale:
o Professor Post rightly raises the vascularization issue, which I discussed in my tissue engineering post on August 5 – it’s hard to grow a thick chunk of meat in a dish because of the inside-outside problem.
o Professor Post suggests bioreactor systems, but these still produce many thin layers that would have to be pressed together to form a steak (a bit like a really thick version of the chicken slices seen at the deli counter).
o When Professor Post talks about the $250,000 burger, he is almost certainly going to take many of these smaller pieces and put them together to get the final product.
o Scale is also a problem because, although cells can be expanded exponentially in culture, as discussed in my August 11 post on passaging cells, their characteristics change with each passage away from the animal (a rough sense of the numbers is sketched just below). (I presume he is not talking about using immortalized cell lines, because he intends to keep donor animals.)
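
As a rough sense of the arithmetic, here is a back-of-the-envelope sketch. Every number in it is my own order-of-magnitude assumption for illustration, not a figure from Professor Post’s work.

    # Back-of-the-envelope sketch of the cell-expansion problem.
    # All constants are assumed orders of magnitude, for illustration only.
    import math

    cells_per_gram = 1e8       # assumed cell density of muscle tissue
    burger_grams   = 150       # one modest burger
    biopsy_cells   = 1e6       # assumed yield from a small muscle biopsy
    doubling_limit = 30        # assumed doublings before primary cells drift
                               # too far from their in vivo character

    cells_needed = cells_per_gram * burger_grams
    doublings = math.ceil(math.log2(cells_needed / biopsy_cells))

    print(f"cells needed: ~{cells_needed:.0e}")   # ~1.5e10 for one burger
    print(f"doublings required: {doublings}")     # ~14, within the assumed limit...
    # ...but every passage needed to get there moves the cells further from the
    # starting animal, which is exactly the characterization problem above.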

Summary: Professor Post is working on a project that, for obvious reasons, gets a lot of attention from the mainstream media. He is clearly well aware of the limitations of his project, but is taking a practical approach to getting it done right. It’s a great concept, and deserves investment, but we shouldn’t expect to be eating faux meat (“in vitro meat”) for years to come.

News in Translational Research: Too Much Information?

Although today’s blog has little to do with tissue engineering specifically, I thought it would be a great opportunity to discuss last Saturday’s article in the New York Times, linked below:

New York Times: Genes Now Tell Doctors Secrets They Can’t Utter

This article discusses the quandaries faced by researchers and doctors who discover things about their patients that are not covered by the consent forms. Instances include discovering that a patient has a gene variant linked to a significant risk of a particular cancer; discovering that a patient sample contains the HIV sequence (though whether the infection is transient or permanent can’t be determined without further testing); or, on the flip side, discovering that a patient doesn’t have a gene variant she thought she had, and could therefore avoid breast surgery.
It seems that doctors and researchers tend to go to bat with ethics boards to let patients know when they discover something the patient can take preventative action on. However, scientists are less certain about what to do when they discover that a patient is at risk of a disease they can’t be proactive about. And what if the patient specifically stated they never wanted to be contacted? If the institution contacts the patient and asks whether he wants to reconsider, that in itself implies there is something he ought to know.
As a scientist a few levels removed from the clinic, I generally don’t have these sorts of worries. However, a few months ago some of my transplants became tumorigenic. These transplants had been seeded with cells donated by an actual patient (at my level we are given only a number identifier for the patient). As I am not involved in the clinic, I spoke to the relevant people, and down the chain all the cells were pulled and marked as not suitable for transplantation. The tumors could have arisen from any one of multiple factors: the chemical sterilization treatments of the scaffolds, contact with tumorigenic mouse cells transplanted in the same mice, and so on. A repeat of the experiment with slightly different conditions yielded no tumors; still, that ugly incident is probably the closest I’m going to get to a bench-to-bedside effect.

News in Tissue Engineering: Tracheas

The trachea is a vital part of the human body. It is the tube that connects the larynx to the lungs, allowing us to breathe. Tracheal collapse can happen as a result of some heart conditions, Cushing’s syndrome, and some respiratory conditions, including lung cancer that has spread. Without a trachea it is impossible to breathe. There are extreme options, such as a tracheotomy to avoid having to breathe through the mouth, but these are very invasive and uncomfortable procedures. Tissue engineering has been applied to the trachea over the past few years in an astonishing way, and will continue to be, setting new standards of treatment.

Science Daily, 2008: First Tissue-Engineered Trachea Successfully Transplanted

In this article we see that the scientists utilized two of the three components I talked about in my August 8th post, i.e. cells and a scaffold. The scaffold, in this case, was a 7 cm section of decellularized trachea from a donor. The donor was deceased – it is not possible to donate a section of trachea while still alive. 7 cm is a really large piece of tissue by the standards of modern tissue engineering, as much research is done on a far smaller scale. Decellularized means that the tissue was stripped of all its cells to make it less immunogenic to the recipient. However, the physical structure of the organ is retained and, importantly, so is its biological activity, as the proteins that make up the structure are kept and remain active. This scaffold was then seeded with cells collected from the recipient, i.e. autologous cells, so that there would be no graft/host response. Seeding the cells is important because it gives the organ a dynamic, living presence rather than being an empty implanted construct.

New York Times 2011: Synthetic Windpipe is Used to Replace Cancerous One

In this article we see that a similar approach of seeding autologous cells onto a tracheal scaffold was taken. However, in this instance the scaffold material is different: instead of a donated natural scaffold, it is a synthetic polymer. This has several associated pros and cons. The pros of this approach are that it doesn’t require a donor and the associated organ-processing logistics and, more importantly, that the scaffold can be shaped to fit the recipient exactly using modeling and advanced fabrication techniques, whereas donor tracheas are often not a good fit. The major con of the system is that the scaffold is inert, neither biologically active nor covered in normal tracheal proteins the way the donor scaffold described by the previous group was. To overcome this issue, the group at the Karolinska Institute also grew the cells on the scaffold in a bioreactor, adding several growth factors that they hoped would induce the cells to differentiate as desired and kick off biological pathways that would continue to affect how the cells behave even after transplantation.

Noteworthy: Silk Scaffolds

BBC: Silkworms Could Aid Breakthrough in Tissue Engineering

Every Monday I plan to talk about something in the news and how it relates to the field of tissue engineering. This week I’ve posted a link to a video about the creation of silk. Textile specialists have long looked to silk because it is a naturally produced material, from either spiders or silkworms. However, the first review of the scientific use of silk as a biomaterial wasn’t published until 2003. This is yet another example of just how rapidly the field of tissue engineering is progressing and growing: in 2003 just 189 papers on the topic were indexed in the scientific publishing database PubMed, compared to 438 in 2011.
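
As an aside, counts like these are easy to reproduce. Here is a minimal sketch using Biopython’s Entrez module; the search term and email address are placeholders I have assumed, and since the exact query behind the numbers above isn’t specified, your counts may well differ.

    # Minimal sketch: yearly PubMed counts via Biopython's Entrez module.
    # The search term and email are placeholder assumptions; the exact query
    # behind the counts quoted above isn't specified, so results may differ.
    from Bio import Entrez

    Entrez.email = "you@example.com"   # NCBI asks for a contact address

    def pubmed_count(term, year):
        """Number of PubMed records matching `term` published in `year`."""
        handle = Entrez.esearch(db="pubmed", term=f"{term} AND {year}[dp]", retmax=0)
        record = Entrez.read(handle)   # [dp] restricts by date of publication
        handle.close()
        return int(record["Count"])

    for year in (2003, 2011):
        print(year, pubmed_count("silk scaffold tissue engineering", year))
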
Today, it’s very evident that silk scaffolds are being tested around the world, from the USA to Germany to China to Australia and everywhere in between. Silk is, however, very rarely used on its own, and when it is, it needs to be chemically treated to change its mechanical, and sometimes biological, properties. Silk from various spiders, for example, has to be processed to remove an outer coating that is biologically toxic. Much more common is silk that has been modified, whether by adding adenovirus to stimulate a biological process or by attaching various proteins that confer particular properties. Crosslinking the fibers strengthens the network and allows it to hold its shape better. And very commonly silk is mixed with synthetic polymers such as polyacrylamide, whose biological properties are much easier to control to get reproducible results.
What the group at the Institute of Materials Research and Engineering in Singapore accomplished in creating colored silkworms that generate fluorescently colored silk is very interesting because, as stated in the video, it makes it easier to do fluorescence imaging of the scaffold. From a biological standpoint it is interesting not just for the new product it yields, but because it raises the possibility of incorporating many different characteristics into the silk, such as altered mechanical properties or modified biological moieties (both would come from modifying the structure of the silk fibroin molecules during production by the silkworm). Previously, all materials had to be modified after extraction and purification; now they can be modified before production even starts. It’s the biological equivalent of producing spinach-flavored pasta instead of taking plain pasta, cooking it, and adding spinach.