A hallucination is the perception of something that seems real but is not. Some references treat hallucination as a synonym for delusion, since both involve perceiving or believing something that is not real. The difference lies in what follows: a person who hallucinates senses a vision, a sound, or some other perception but can later deny that it was real based on evidence or logic. A person with a delusion, in contrast, continues to believe something is real despite refuting evidence.
Common causes of hallucination
Hallucinations do not occur frequently in the general population. They are, however, a common experience in people with mental disorders such as schizophrenia: more than 70% of those with schizophrenia experience hallucinations, and 60-90% of them report hearing voices. Other conditions that can produce hallucinations include certain cases of Parkinson's disease, Alzheimer's disease, migraine, brain tumors, and epilepsy. Apart from these conditions, certain drugs, called "hallucinogens", also cause hallucinations. Lysergic acid diethylamide (LSD), for example, induces hallucinations chiefly by acting on serotonin (5-hydroxytryptamine [5-HT]) receptors.
High caffeine intake has also been linked to hallucination proneness. In one study, people who drank more than seven cups of instant coffee a day turned out to be three times more likely to "hear voices" than those who drank less. The researchers proposed that high caffeine intake raises levels of cortisol, a stress hormone, which in turn increases proneness to hallucination.
People experiencing hallucinations may be frightened by the perceptual experience. Seeing a vision such as a seemingly floating light, hearing sounds like footsteps, or feeling something crawling on the skin, and only later recognizing that none of it was real, can be genuinely scary.
Why do hallucinations occur? In essence, hallucinations involve defects in the structure and function of the primary and secondary sensory cortices of the brain. In Alzheimer's disease, for instance, visual hallucinations are associated with grey- and white-matter abnormalities. "Seeing", "hearing", or "feeling" things is a spontaneous and transitory personal experience, which makes it hard to observe directly. Thus, understanding the biological phenomenon of hallucination remains a challenge to neurobiologists and scientists alike to this day.
Do animals hallucinate?
Do animals hallucinate, too? Scientists can hardly tell. Studies on animal models have shown, for instance, that lab mice make a head-twitch response (interpreted as hallucinatory behavior) when administered a hallucinogen. However, some scientists argue that this is not compelling proof that the animals are hallucinating.
Recently, though, a team of researchers from Stanford Medicine claimed that they made lab mice hallucinate without injecting any hallucinogen. Instead, they used an optogenetics technique: they inserted light-sensitive genes into the brains of the mice so that certain neurons would fire when exposed to particular wavelengths of light. The genes produce two types of proteins: one causes neurons to fire when exposed to infrared laser light, and the other causes neurons to glow green when activated.
The scientists then trained the mice to lick a water spout when shown a pattern of moving parallel lines (i.e. perfectly vertical or horizontal lines). Based on the green glow in the visual cortex, the scientists could tell which neurons were firing in response. These neurons were presumably the ones responsible for "seeing" the pattern of lines. [4,5]
Gradually, the researchers dimmed the projected patterns while triggering the target neurons with their laser. Eventually, they stopped showing the line patterns altogether, and yet the mice would still lick the water spout whenever the scientists hit the same target neurons with the laser. The result implies that the mice might have experienced a "true hallucination", seeing "ghost" line patterns.
— written by Maria Victoria Gonzaga
1 Fowler, P. (2015, August 27). Hallucinations. Retrieved from WebMD website: Link
2 Durham University. (2009, January 14). High Caffeine Intake Linked To Hallucination Proneness. ScienceDaily. Retrieved from Link
3 Can animals have hallucinations? – Quora. (2018). Retrieved from Quora.com website: Link
4 Stanford Medicine. (2019, July 18). Scientists stimulate neurons to induce particular perceptions in mice’s minds. ScienceDaily. Retrieved from Link
5 Specktor, B. (2019, July 19). It’s a Mystery Why We Are Not Constantly Hallucinating, Trippy New Study Suggests. Retrieved from Live Science website: Link
Plant geneticists from the University of Tokyo have created novel plant lines that seem to be "more polite" than they already are.1,2,3 Their technique, however, does not involve implanting a "social" gene of some sort. Rather, the scientists edited plant mitochondrial DNA. In that way, they can, for instance, make a plant bow down even more because of the heavier seeds it yields, which could mean a more secure food supply. More interestingly, this is reportedly the first time plant mitochondrial DNA has ever been modified.
The mitochondrion is one of the three cellular compartments that contain genetic material; the nucleus and the chloroplast are the other two. Scientists have been successfully modifying nuclear DNA since the 1970s, and another team of researchers pioneered the modification of chloroplast DNA in 1988. For mitochondrial DNA, however, researchers had found success only in animals, not in plants; the first successful animal mitochondrial DNA modification happened in 2008. Recently, a team of researchers from the University of Tokyo showed success in modifying plant mitochondrial DNA as well, apparently for the first time.
Basically, mitochondrial DNA is the genetic material in the mitochondrion that carries the code for the RNAs and proteins essential to the organelle's various functions. Because the mitochondrion has its own genetic material, it is described as a semi-autonomous, self-reproducing organelle.
First plant mitochondrial DNA modification
Researchers from the University of Tokyo devised genetic tools that can edit plant mitochondrial DNA, and with their technique they produced four new lines of rice and three new lines of rapeseed (canola). Plant mitochondrial genomes are larger and more complex than their animal counterparts. Prof. Arimura explained that they are more complicated in that some mitochondria have duplicated genes while others lack them, which made manipulating the plant mitochondrial genome more challenging. A collaboration with other researchers, particularly from Tohoku University and Tamagawa University, led them to the mitoTALENs technique, with which they were able to manipulate mitochondrial genes in plants.1 To learn their methods in detail, you may read their published work here.
What plant mitochondrial DNA modification can do
After the successful editing of plant mitochondrial DNA, what could be the next big thing? Associate Professor Shin-ichi Arimura, leader of the research team, was certainly enthusiastic about the accomplishment. In jest, he said, "We knew we were successful when we saw that the rice plant was more polite — it had a deep bow", implying that a fertile rice plant bends lower under the heavier weight of the seeds it yields.1,3
Weak genetic diversity in crops poses a threat to species survival over time and, as a domino effect, that is bad news for our food supply. The team therefore hopes to use their technique to provide solutions that could significantly enhance genetic diversity in crops and thereby improve plant species survival and yield. As Arimura put it, "We still have a big risk now because there are so few plant mitochondrial genomes used in the world."1 He also mentioned using their technique to add much-needed mitochondrial DNA diversity to plants.
Cytoplasmic male sterility
Cytoplasmic male sterility (CMS) refers to male sterility in plants, that is, the failure to produce functional pollen, anthers, or male gametes. It occurs naturally, although rarely, and probably involves certain nuclear-mitochondrial interactions.4 Others, however, believe that CMS is caused primarily by plant mitochondrial genes.1 In particular, the presence of a CMS gene leads to this condition, so removing the CMS gene could make the plant fertile again. This is just a start, but the researchers are already optimistic that with their technique they can improve crop lines and consequently help secure the food supply.
— written by Maria Victoria Gonzaga
1 University of Tokyo. (2019, July 8). Researchers can finally modify plant mitochondrial DNA: Tool could ensure genetic diversity of crops. ScienceDaily. Retrieved from [Link]
2 Arimura, S. -i., Yamamoto, J., Aida, G. P., Nakazono, M., & Tsutsumi, N. (2004). Frequent fusion and fission of plant mitochondria with unequal nucleoid distribution. Proceedings of the National Academy of Sciences, 101(20), 7805–7808. [Link]
3 Researchers can finally modify plant mitochondrial DNA | The University of Tokyo. (2019). Retrieved from The University of Tokyo website: [Link]
4 Campo, C. (1999). Biology of Brassica coenospecies. Amsterdam New York: Elsevier. pp.186-89.
Having a dog as a pet offers a myriad of benefits, one of which is a companion reputed for being charismatic and loyal. Dogs, apparently, render a "cure" when melancholy "strikes". However, there are also risks to avoid or deal with when keeping a dog. One of the most important concerns is preventing dog bites. A dog bite is, in fact, one way microbes can find their way through the skin. Dogs, inopportunely, can be agents of medically important diseases such as rabies.
Rabies is a viral disease that is almost always deadly. It is acquired chiefly through the bite of an infected dog, but one could also get it when broken skin is exposed to infected saliva; other potential routes include the eyes, mouth, and nose. Not all dogs carry the rabies virus, and dogs are not the only animals that can transmit it. Most warm-blooded vertebrates (e.g. monkeys, raccoons, cattle, cats, and bats) can carry the virus and transmit it to a human host, and the virus has even adapted to grow in cold-blooded vertebrates. Nevertheless, because of the widespread domestication of dogs in human households, dogs account for most rabies cases in humans.
Lyssavirus – the viral agent
The rabies virus is a Lyssavirus, a type of RNA virus belonging to the family Rhabdoviridae, order Mononegavirales. It is bullet-shaped and carries a single negative-sense RNA strand as its genome, which encodes the proteins — namely, nucleoprotein, phosphoprotein, matrix protein, glycoprotein, and RNA polymerase — that it needs to establish itself within the host cell.
In particular, the virus enters the host cell (e.g. a muscle cell or nerve cell) by binding cell-surface receptors with its glycoprotein G and being taken up into an endosome. The viral envelope then fuses with the endosomal membrane, releasing the viral RNA and proteins into the cytosol, where the viral polymerase transcribes the genome.
The matrix protein regulates both transcription and replication of the virus. At some point, the polymerase shifts from transcribing to replicating the genome. The nucleoprotein binds tightly to the newly replicated genome, forming a ribonucleoprotein complex, which can then be assembled into new virus particles.
The virus performs transcription and replication via a specialized inclusion body referred to as the Negri body. In fact, the presence of Negri bodies in the cytoplasm of a host cell is histological proof of Lyssavirus infection.
Rabies – two types
Early symptoms of rabies include fever, discomfort, and paraesthesia (a burning sensation at the bite site). The symptoms progress to behavioral changes as the virus spreads to the central nervous system.
Lyssavirus enters and hijacks muscle cells to replicate. From the muscle tissue, it travels to the nervous system through the neuromuscular junctions. The virus enters the peripheral nervous system directly and then spreads to the central nervous system where it can cause fatal inflammation in the brain and spinal cord.
Depending on the symptoms, rabies may be described as "furious" or "paralytic". Furious rabies, the more common form (about 80% of cases), is characterized by hyperactivity, confusion, abnormal behavior, paranoia, terror, hallucinations, and hydrophobia ("fear of water"). Paralytic rabies, as the name implies, causes paralysis starting from the site of the bite (or entry). Both types may lead to coma and eventually to the patient's death, although patients with the furious type are at higher risk because of likely cardio-respiratory arrest. Without early and proper medical intervention, death typically ensues two to ten days after these symptoms manifest.
Rabies – pathobiology
How rabies causes behavioural changes baffles scientists. In the 1980s and 1990s, researchers explained how the virus causes paralysis: the glycoprotein on the surface of the Lyssavirus competes with acetylcholine for binding to specific muscle receptors (e.g. nicotinic acetylcholine receptors). More recently, researchers conjectured that the virus could be doing the same with the similar receptors found in the brain, and presumed that this interaction could affect how brain cells normally communicate, thereby inducing changes in the host's behavior.
Recently, researchers from The Ohio State University College of Medicine and The Ohio State University Wexner Medical Center conducted a study to identify dog breeds and physical traits that pose a high risk of biting with severe injury. Their data could provide an empirical basis for deciding which dog to own. Still, further studies on rabies are necessary, since the disease is almost always fatal once clinical symptoms set in. Although vaccine-preventable, rabies, especially via dog bite, remains a significant cause of death in humans, both young and old, every year. Novel treatments and vaccines that are both effective and economical could prevent such deaths; at present, the staggering cost of treatment remains a major health-care constraint. Without proper and early treatment, death from rabies is, unfortunately, almost certain.
— written by Maria Victoria Gonzaga
2 World Health Organization (WHO). (2019, May 21). Rabies. Retrieved from Who.int website: [Link]
4 Albertini, A. A., Schoehn, G., Weissenhorn, W., & Ruigrok, R. W. (January 2008). “Structural aspects of rabies virus replication”. Cell. Mol. Life Sci. 65 (2): 282–294. doi:10.1007/s00018-007-7298-1
5 Newman, T. (2017, November 15). Rabies: Symptoms, causes, treatment, and prevention. Retrieved May 23, 2019, from Medical News Today website: https://www.medicalnewstoday.com/articles/181980.php
6 University of Alaska Fairbanks. (2017, October 11). How rabies can induce frenzied behavior: Researchers better understand the disease that kills 59,000 people annually. ScienceDaily. Retrieved from website: [Link]
7 The Ohio State University Wexner Medical Center. (2019, May 22). Study identifies dog breeds, physical traits that pose highest risk of biting children. ScienceDaily. Retrieved from website: [Link]
How did life as we know it start? In the scientific community, the "RNA World Hypothesis" has many adherents. Many believe that life came about through the existence of a simple molecule such as RNA. Perceptibly, RNA shows signs of being somewhat "alive", at least in the sense that it carries a genetic code and is capable of self-replication. In essence, RNA could be the earliest biomolecule, with other organic molecules coming about later. However, a new hypothesis is gaining ground: RNA and its close relative, DNA, might have existed side by side in primordial times, even before life began.
RNA World Hypothesis
In the RNA world hypothesis, primitive life is presumed to have been RNA-based. This assumption arose from the notion that RNA can act both as a genetic material and as a catalyst. In due time, primitive RNA-based entities transitioned into compartmentalized life forms (cells) over many millions of years. Possibly, RNA-based life dominated the primitive Earth and served as the ancestor of present-day living organisms.1 Carl Woese, an American microbiologist and biophysicist, is hailed as the originator of the RNA World hypothesis; in 1967, he conjectured that the earliest self-replicating entities could have relied on RNA.2
Theory on the Origin of RNA
How RNA emerged still puzzles scientists. Where did RNA come from? Did it arise from the Earth's more rudimentary building blocks, or did the building blocks for RNA come down to Earth from outer space? According to some scientists, the building blocks of RNA were seemingly synthesized in asteroids in outer space and reached the Earth through meteorites. NASA reported finding RNA and DNA nucleobases (e.g. adenine and guanine) in meteorites, which could have led to the spontaneous formation of RNA and DNA on Earth.3 In March 2015, researchers reported that the pyrimidines uracil, cytosine, and thymine had formed in their laboratory under outer-space conditions, using precursors such as compounds found in meteorites.4
In essence, DNA is a more complex compound than RNA. While RNA occurs as a single strand, DNA exists as two strands that typically wind into a helix. Like RNA, DNA consists of multiple nucleotides covalently bonded by 3′,5′-phosphodiester linkages. Each nucleotide, in turn, contains a phosphate group, a deoxyribose (5-carbon) sugar, and a nucleobase (cytosine, guanine, adenine, or thymine). Thymine is a distinctive structural feature of DNA: in DNA, thymine takes the place of uracil. The discovery that these building blocks could form under pre-biotic conditions has caused scientists to rethink the origin of life.
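The base-alphabet difference described above is small enough to capture in a toy Python sketch (illustrative only, not from the cited studies): spelling an RNA sequence in the DNA alphabet simply swaps uracil for thymine, while the other three bases are shared.

```python
# Toy illustration of the base-alphabet difference between RNA and DNA:
# RNA uses uracil (U) where DNA uses thymine (T); adenine (A),
# cytosine (C), and guanine (G) are common to both nucleic acids.

def rna_to_dna_alphabet(rna: str) -> str:
    """Spell an RNA sequence using the DNA base alphabet (U -> T)."""
    return rna.upper().replace("U", "T")

print(rna_to_dna_alphabet("AUGGCU"))  # prints ATGGCT
```

Note that this concerns only the base alphabet; it says nothing about the sugar difference (ribose vs. deoxyribose) or double-strandedness, which are the other distinctions the paragraph above describes.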
A research team reported how chains of nucleic acids could form in a pre-biotic environment and how RNA could easily turn into DNA components even without the assistance of enzymes. Ramanarayanan Krishnamurthy, one of the researchers, said, “These new findings suggest that it may not be reasonable for chemists to be so heavily guided by the RNA World hypothesis in investigating the origins of life on Earth.”5
They surmised that the primitive Earth was not purely an RNA world. DNA might have existed side by side with RNA, probably even competing for supremacy, until the DNA-based system eventually reigned.
They published their report in Nature Chemistry.6
— written by Maria Victoria Gonzaga
1 Biology-Online Editors. (2014, May 12). Ribonucleic acid. Retrieved from Biology-online.org website: [Link]
2 Woese, C. (1967). The Genetic Code: the Molecular basis for Genetic Expression. New York: Harper & Row.
3 NASA – NASA Researchers: DNA Building Blocks Can Be Made in Space. (2011, January 1). Retrieved NASA website: [Link]
4 Marlaire, R. (3 March 2015). “NASA Ames Reproduces the Building Blocks of Life in Laboratory”. Retrieved from NASA website: [Link]
5 McRae, M. (2019). DNA And RNA May Have Existed Together Before Life Began on Earth. Retrieved April 6, 2019, from ScienceAlert website: [Link]
6 Xu, J., Green, N. J., Gibard, C., Krishnamurthy, R., & Sutherland, J. D. (2019). Prebiotic phosphorylation of 2-thiouridine provides either nucleotides or DNA building blocks via photoreduction. Nature Chemistry. [Link]
Regeneration in humans is much more limited than in other animals. If, for instance, we lose a limb, we may as well say goodbye to it for the rest of our life. It would be nice if we had a higher capacity to regenerate our indispensable body parts, like the head, limbs, and the many other "regeneration-incapables". Then we might not have to worry much about losing any of them, knowing they would eventually re-grow in due time.
Regeneration vs. Healing
Humans do have the capacity to regenerate, but it is very limited, restricted to restoring parts of our skin, hair, nails, fingertips, and liver. At the tissue level, we certainly have dedicated cells that replace lost and damaged cells. For instance, an uninjured bone eventually replaces itself with a completely new one, though over a span of about ten years. Our skin naturally renews, too, but give it two weeks. The story swerves, though, in the case of an injury.
Rather than expending energy on replacing an injured part with a new one, our body directs its efforts into healing it. So when our skin is deeply damaged, the body fixes it with a scar. Tissue-repair mechanisms such as wound healing aren't really a shortcoming: they forestall pathogenic microbes from using an injured body part as an easy gateway into the body. (Besides, we already have ample microbiota naturally thriving inside us.) The main goal is to fix the damage efficaciously, with relatively little effort.
Natural regeneration in humans
In humans, the only tissue that regenerates naturally, consistently, and completely is the endometrium.1 After it sloughs off during a woman's menstrual period, it grows back by re-epithelialization before the next period. Humans can also regenerate an injured liver from as little as 25% of the original liver mass; the liver grows back to its original size, though not necessarily to its original shape. Damaged tubular parts of the kidney can also re-grow: the surviving epithelial cells undergo migration, dedifferentiation, proliferation, and re-differentiation to lay down a new epithelial lining of the tubule.
Animals with higher regeneration capacities
Some animals have a higher capacity to re-grow lost body parts. Sharks, skates, and rays can regenerate their kidneys; they can regrow an entire nephron, which humans cannot. A lizard will drop its tail as a mode of escape, knowing the tail will be fully restored over time. Sharks have no qualms about losing teeth, as they can replace any of them more than a hundred times in a lifetime. The axolotl can repair its broken heart. A starfish will once again be stellar upon the return of a lost arm; in fact, even the lost arm can regenerate into an entire starfish, as long as the central nerve ring remains intact.2 A decapitated planarian worm need not worry about losing its head; it can grow the head back, brain and all, including the memories.2 Without a doubt, many of these animals are simply masters of their craft: regeneration.
Researchers from Harvard University published new findings on the whole-body regeneration capacity of the three-banded panther worm.3 They uncovered DNA switches that seem to regulate genes involved in the regeneration process. In particular, they found a section of non-coding DNA that controls the activation of a master gene they called the "early growth response" (EGR) gene. When active, the EGR gene acts like a power switch, turning certain genes in the coding region on and off during regeneration. When it was deactivated, no regeneration occurred.
Surprisingly, humans have an EGR gene, too. So why doesn't it grant us the regeneration capacity it gives the three-banded panther worm? The researchers explained that while the gene works in the worm, it doesn't work the same way in humans; the wiring may be different. The worm's EGR gene may have connections that are absent in humans.
Switching the gene on
Induced regeneration in humans is one of the goals of regenerative medicine, a field that seeks new ways to give our regenerative capacity a boost. One approach is to look at the problem "molecularly". Researchers are looking into the gene Lin28a: when active, this gene can reprogram somatic cells into embryonic-like stem cells, and it appears to play a role in tissue regeneration and recovery. However, the gene is naturally turned off in adults. Research into boosting our regenerative capacities is ongoing; switching our organs from regeneration-incapable to regeneration-capable may just be a matter of discovering the right gene switch.
— written by Maria Victoria Gonzaga
1 Min, S., Wang, S. W., & Orr, W. (2006). “Graphic general pathology: 2.2 complete regeneration”. Pathology. pathol.med.stu.edu.cn. Retrieved from [Link]
2 Langley, L. (2013, August 28). “Pictures: 5 Animals That Regrow Body Parts”. National Geographic News. Retrieved from [Link]
With the advent of 2019, we are inspired to set new goals, pursue life-long dreams, or simply make better choices. Perhaps one of the most common resolutions is to adopt a healthier lifestyle. With this in mind, some of us look for ways to become healthier, such as by managing our weight, and many turn to fad diets and caloric restriction plans that promise to help. One of them is intermittent fasting. Based on studies, intermittent fasting does not only help trim weight; it seems to offer further health benefits as well.
Intermittent fasting – overview
In May 2018, I wrote the article "Intermittent Fasting – benefits and caution", in which I briefly tackled intermittent fasting, its benefits, and its potential risks. In essence, intermittent fasting is a cyclic pattern of a period of fasting followed by a period of non-fasting. The most common forms are (1) whole-day fasting and (2) time-restricted eating. Whole-day fasting entails one full day of "no eating", done twice a week (thus referred to as the "5:2 plan"). In time-restricted eating, fasting and non-fasting intervals alternate on a daily basis; it could be half a day of fasting, with the remaining half as the non-fasting period. With intermittent fasting, it's not so much about "what to eat" or "how much". Rather, it's a question of when.
Intermittent fasting became popular because it does not only help curb weight; it has also been linked to other health benefits, apparently slowing aging and boosting the immune defense. However, as I pointed out in that article, caution should still be taken. Intermittent fasting is not for everyone, especially those who are immunocompromised or underweight.
Rejuvenating effects of fasting
Previously, I mentioned that the studies confirming the health benefits of fasting were done on non-human subjects (e.g. rodent models); without much scientific proof of efficacy in humans, doubt remained. On January 29 of this year, however, a team of scientists from the Okinawa Institute of Science and Technology Graduate University (OIST) and Kyoto University reported rejuvenating effects of fasting on human subjects. They published their findings in Scientific Reports. The team analyzed blood samples from four fasting individuals and monitored the levels of metabolites involved in growth and energy metabolism. What they found was quite interesting and promising.
Dr. Takayuki Teruya, one of the researchers on the team, said that their results point to rejuvenating effects of fasting. They found that many metabolites increased significantly, about 1.5- to 60-fold, within just 58 hours of fasting. In a previous study, the team had identified some of these metabolites (e.g. leucine, isoleucine, and ophthalmic acid) as ones that typically become depleted with age; according to Dr. Teruya, their amounts increased again in individuals who fasted. Based on the metabolites they found, the team also conjectured that fasting could promote muscle maintenance and antioxidant activity, and hence possibly longevity. Dr. Teruya added that this had not been shown in humans before, since most studies making such claims used animal models.
Fasting increased metabolism
During fasting, the body turns to alternative energy stores when carbohydrates are not available. Thus, less-common metabolites from alternative metabolic pathways took the place of the typical metabolites of carbohydrate metabolism; the researchers identified butyrates, carnitines, and branched-chain amino acids among the metabolites that accumulated during fasting. Apart from this, they also found an increase in citric acid cycle intermediates, which means that besides prompting alternative metabolic pathways, fasting also augmented the usual metabolic activities. Purine and pyrimidine metabolism also seemed heightened, indicating increased gene expression and protein synthesis. Consistent with this, the researchers saw a boost in antioxidants (e.g. ergothioneine and carnosine) that protect cells from the free radicals produced by metabolism. The researchers believe they are the first to provide evidence of antioxidants as a fasting marker.
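The fold-change comparison the study reports is simple arithmetic: the fasted level of a metabolite divided by its baseline level. The Python sketch below illustrates the computation; the metabolite names come from the article, but the numbers are invented for demonstration and are not the study's data.

```python
# Fold change = metabolite level after fasting / baseline level.
# A value of 1.5 means a 1.5-fold increase, matching the article's
# reported range of roughly 1.5- to 60-fold over 58 hours of fasting.

def fold_change(fasted: float, baseline: float) -> float:
    """Return the fold increase of a metabolite after fasting."""
    return fasted / baseline

# Invented example levels (arbitrary units) for illustration only.
levels = {
    "leucine":   {"baseline": 2.0, "fasted": 3.6},
    "carnitine": {"baseline": 0.5, "fasted": 9.0},
    "butyrate":  {"baseline": 0.1, "fasted": 6.0},
}

for name, v in levels.items():
    print(f"{name}: {fold_change(v['fasted'], v['baseline']):.1f}-fold")
```

A non-targeted metabolomic analysis like the one in the study computes this ratio for every detected compound and then asks which ones rise consistently across all fasting subjects.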
This new-found evidence suggests that fasting has some anti-aging effects, this time in human subjects. The researchers' next step is to see whether they can duplicate the results in a larger-scale study. For now, let us remain cautious, look for indubitable substantiation, and weigh the benefits and risks of all available options.
— written by Maria Victoria Gonzaga
1 Cohut, M. (2018). Intermittent fasting may have ‘profound health benefits’. Retrieved from [Link]
2 Longo, V. D., & Mattson, M. P. (2014). Fasting: Molecular Mechanisms and Clinical Applications. Cell Metabolism, 19 (2), 181–192. [Link]
3 Teruya, T., Chaleckis, R., Takada, J., Yanagida, M. & Kondoh, H. (2019). Diverse metabolic reactions activated during 58-hr fasting are revealed by non-targeted metabolomic analysis of human blood. Scientific Reports, 9(1). DOI: 10.1038/s41598-018-36674-9
4 Okinawa Institute of Science and Technology (OIST) Graduate University. (2019, January 31). Fasting ramps up human metabolism, study shows. ScienceDaily. Retrieved from [Link]
Scientists found dead tardigrades beneath the Antarctic ice, according to a recently published report. It was a surprising discovery, since tardigrades have earned a reputation as tiny "infinities": they are so resistant to extreme conditions that they are thought of as some sort of "immortals". Nonetheless, scientists found the remains of tardigrades, together with crustaceans, in a deep, frozen Antarctic lake.
Antarctic Realm – The Cold Realm
The Antarctic is the region at the southernmost tip of the Earth, and the biogeographic realm that includes it is called the Antarctic realm. A biogeographic realm is an area of land where similar organisms thrived and then evolved in relative isolation over long periods of time. The Antarctic rouses extensive research, with the paramount objective of understanding the extent of its biodiversity, especially the distribution patterns of resident organisms and their evolutionary history.
The Antarctic biogeographic realm is the smallest of all the realms, spanning a total area of about 0.12 million square miles. Its components include the land area, the Antarctic tectonic plate, the ice in the waters, and the ocean itself. Because of the cold temperature, few plant species are able to persist and thrive: at present, around 250 lichens, 100 mosses, 25-30 liverworts, 700 algal species, and two flowering plant species (the Antarctic hair grass and the Antarctic pearlwort) inhabit the region. The fauna includes penguins, seals, and whales.
An Icy Surprise
The discovery of the tardigrade remains was unexpected, according to micropaleontologist David Harwood. Late last year, Harwood and his research team drilled a hole into the subglacial Lake Mercer, a frozen lake that had been undisturbed for millennia; their research project, SALSA (Subglacial Antarctic Lakes Scientific Access), was the first to sample it directly. They were absolutely surprised to find these water bears, frozen and dead.
Astounded, the animal ecologist Byron Adams conjectured that these tardigrades might have come from the Transantarctic Mountains and were then carried down to Lake Mercer. Further, he said, "What was sort of stunning about the stuff from Lake Mercer is it's not super, super-old. They've not been dead that long."
In September 2015, Jean-Michel Claverie and colleagues reported two giant viruses (i.e., Pithovirus sibericum and Mollivirus sibericum) that they revived from 30,000-year-old permafrost in Siberia.[3,5] Once revived, the viruses quickly became infectious to their natural hosts, amoebae. Luckily, these chilly giants do not prefer humans as hosts. Nonetheless, the melting of these frozen habitats could pose a danger to public health should pathogens that can infect humans escape the icy trap.
A frozen Pandora’s Box
The frozen regions of the Earth hold many astonishing surprises waiting to be "thawed". In August 2016, a 12-year-old boy from the Yamalo-Nenets region of Siberia died from anthrax; reports indicated that a number of locals were infected and thousands of grazing reindeer died as well. Prior to the anthrax outbreak, a summer heatwave had melted the permafrost in the Yamal Peninsula in the Arctic Circle. The thawing of the frozen soil unleashed anthrax bacteria presumed to have come from the carcass of a reindeer host that died over 75 years ago. The released bacteria apparently reached the nearby soil, water, and food supply, and eventually new hosts. The anthrax bacteria survived because they form spores that protect them during dormancy.
A Hotter Earth
Global warming increases the average temperature of the Earth's surface enough to cause climate change. Accordingly, the global surface temperature increased by 0.74 ± 0.18 °C (1.33 ± 0.32 °F) during the last century. The temperature rise poses a threat as it could lead to environmental changes with adverse effects of massive magnitude. One of these is the destruction of habitats due to the rise of water levels from melting ice. Deadly pathogens could also rise again from their cold slumber and plausibly cause another major mass extinction. So, while we try to explore the deeper mysteries lurking beneath the ice, we should also make sure that we remain a step ahead. As Claverie aptly put it:
The possibility that we could catch a virus from a long-extinct Neanderthal suggests that the idea that a virus could be ‘eradicated’ from the planet is wrong, and gives us a false sense of security. This is why stocks of vaccine should be kept, just in case.
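As a quick check on the figures above: the cited warming of 0.74 ± 0.18 °C corresponds to 1.33 ± 0.32 °F because temperature differences scale by a factor of 9/5, while the familiar +32 offset applies only to absolute temperatures. A minimal sketch:

```python
def delta_c_to_f(delta_c):
    """Convert a temperature *difference* from Celsius to Fahrenheit.

    Intervals scale by 9/5 only; the +32 offset applies to absolute
    temperatures, not to differences like a century's warming.
    """
    return delta_c * 9.0 / 5.0

print(round(delta_c_to_f(0.74), 2))  # 1.33, matching the cited figure
print(round(delta_c_to_f(0.18), 2))  # 0.32, matching the cited margin
```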
— written by Maria Victoria Gonzaga
1 Berman, R. (2019, January 18). Dead – yes, dead – tardigrade found beneath Antarctica. Retrieved from [link]
2 Pariona, A. (2018, May 18). What Are The Eight Biogeographic Realms? Retrieved from [link]
3 CNRS. (2015, September 9). New giant virus discovered in Siberia’s permafrost. ScienceDaily. Retrieved from [link]
4 Wikipedia Contributors. (2018, November 10). Antarctic realm. Retrieved from [link]
5 Fox-Skelly, J. (2017, January 1). There are diseases hidden in ice, and they are waking up. Retrieved from [link]
6 Russia anthrax outbreak affects dozens in north Siberia. (2016, August 2). BBC News. Retrieved from [link]
7 Biology-Online Editors. (2014, May 12). Biology Online. Retrieved from [link]
Antibiotics are among the most common compounds found in groundwater, surface water, drinking water, and wastewater. Traces of these antibiotics are also found in sewage sludge, soil, and sediments, raising environmental concern. Meanwhile, the emergence of antimicrobial resistance has become a major health problem worldwide, and the therapeutic use of antimicrobials in human and veterinary medicine contributes to the spread of resistant microorganisms. Walnut shells, on the other hand, are among the waste materials suggested as efficient sorbent alternatives: owing to their low ash content, they have been used as low-cost sorbents for metal and oil removal.
Walnut shell activated carbon in antibiotic removal
Advanced treatment of wastewater has shown positive results in lowering the presence of antibiotic residues. Such treatments include ozonation, membrane separation, advanced oxidation, reverse osmosis, and nanofiltration. The applicability of activated carbon in pollutant removal, in turn, depends largely on the raw material used. In this particular research, walnut shell was chosen as the precursor material for activated carbon production. The ability of activated carbon to remove organic micro-pollutants depends on the properties of both the solution and the contaminants. The adsorption of the antibiotic metronidazole was therefore studied to determine the conditions that maximize removal.
The influence of temperature on the adsorption capacity for the antibiotic is only slightly significant. Rather, the adsorption capacity depends on the nature of the activated carbon, its chemical characteristics and morphology, and the solutes. The nature of the solutes, by affecting electronic density, influences their interactions with the matrix of the adsorbent. Activated carbon adsorption is, moreover, the most common process for removing dissolved organic and inorganic compounds; its great flexibility in applications arises from the physical and chemical properties of specifically treated carbon materials.
In summary, the amount of organic compound adsorbed depends strongly on the essential properties of the adsorbent, though it can be slightly affected by variables such as temperature, pH, ionic strength, and contact time. Activated carbon from walnut shell might therefore represent a good agent for removing antibiotic residues.
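Adsorption behavior of this kind is commonly summarized with an empirical isotherm. As an illustrative sketch only (the study's actual model and parameter values are not given here, so the `k_f` and `n` values below are made up), the widely used Freundlich isotherm relates equilibrium uptake to equilibrium concentration:

```python
def freundlich(c_e, k_f, n):
    """Freundlich isotherm: q_e = K_f * C_e**(1/n).

    c_e : equilibrium antibiotic concentration in solution (mg/L)
    k_f : capacity constant ((mg/g) * (L/mg)**(1/n)); higher means more uptake
    n   : intensity constant; n > 1 indicates favorable adsorption
    Returns q_e, the amount adsorbed per gram of carbon (mg/g).
    """
    return k_f * c_e ** (1.0 / n)

# Hypothetical parameters for metronidazole on walnut-shell carbon
for c in (1.0, 4.0, 16.0):
    print(c, "mg/L ->", freundlich(c, k_f=8.0, n=2.0), "mg/g")
# 1.0 mg/L -> 8.0 mg/g, 4.0 mg/L -> 16.0 mg/g, 16.0 mg/L -> 32.0 mg/g
```

Fitting such a curve to measured data recovers K_f and n for a given carbon; the point of the sketch is that uptake is governed chiefly by the adsorbent's properties (captured in K_f and n) rather than by small changes in temperature.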
Source: Prepared by Joan Tura from ScienceDirect: Science of the Total Environment, Volume 646, 1 January 2019, Pages 168-176.
A study published in Science on January 11 seems to be the first to provide empirical evidence that concurs with Charles Darwin's hypothesis that mate selection might have contributed to the evolution of intelligence or cognitive abilities. Scientists from China and the Netherlands collaborated on a study of budgerigars, Melopsittacus undulatus. Based on their observations, problem-solving skills apparently increased the attractiveness of male birds: female birds chose to spend more time with males that appeared to be smarter.
Darwin on mate selection
In the animal kingdom, mate selection is a big deal. One of the general traits that distinguishes animals from plants is the former's tendency to select a mate. Animals, including humans, have their own set of preferences when it comes to choosing a mate. While plants chiefly let nature do the "selection" for them, animals tend to seek out a potential mate themselves. And when they find a suitable mate, they often make a conscientious effort to succeed at coupling. In particular, males engage first in a courtship ritual, for example by wooing a female with a song, a dance, or a display of beauty or prowess.
Sexual selection evolved as one of the modes of natural selection. A male, for instance, chooses a female to mate with and, if need be, may tenaciously compete against other males to stack the odds in his favor. Charles Darwin's long-standing theories on sexual selection remain relevant to this day. Darwin believed that sexual selection played a key role in how humans evolved and diverged into distinct populations. In view of that, sexual selection could also have contributed to how intelligence evolved.
Intelligent males, more attractive
Many studies on birds revolved around the notion that female birds favor male birds with vibrant feathers or stylish songs. A recent study claims that intelligence is preferred over such fancy features and skills.
In the first experiment conducted by Chen and colleagues, small budgerigars (Australian parrots) were observed inside their cages to test the hypothesis that intelligence might affect mate selection. To do that, the researchers allowed each female budgerigar to choose between a pair of similar-looking male budgerigars to interact with. The chosen males were called "preferred" whereas those that were not were referred to as "less-preferred". Next, they trained the less-preferred males to open closed lids or boxes, and then allowed the female budgerigars to observe the less-preferred males demonstrate the skill. Consequently, almost all of the females changed their preference, choosing the less-preferred males over the initially preferred males.
To test if this preference was social rather than sexual, they conducted a second experiment with a similar experimental design but this time a female budgerigar was exposed to two females (instead of males). The results showed that none of the female budgerigars changed their preferences. [1, 3] Based on these experiments, the researchers concluded that the demonstration of cognitive skills altered mate preference but not necessarily social preference.
Video of the animal model: a male budgerigar that learned a problem-solving skill that seemingly increased its attractiveness to females. [Credit: Hedwig Pöllöläinen]
Why did mate selection evolve? The answer could be associated with species survival or longevity. Individuals must be able to stay in the mate-selection pool, if not on top of it. In general, males deemed superior or "preferred" gain higher chances at mating, and thereby better opportunities to transmit their genes as they dominate access to fertile females. Females, on the other hand, gain an upper hand from mate selection by being able to choose the seemingly finest among the rest. Females must choose because they have a generally limited reproductive opportunity; the energy a female invests in producing an offspring is so great that it has to be worth it.
— written by Maria Victoria Gonzaga
1 Chen, J., Zou, Y., Sun, Y.-H., & ten Cate, C. (2019). Problem-solving males become more attractive to female budgerigars. Science, 363(6423), 166–167. https://doi.org/10.1126/science.aau8181
2 Jones, A. G., & Ratterman, N. L. (2009). Mate choice and sexual selection: What have we learned since Darwin? Proceedings of the National Academy of Sciences, 106(Supplement_1), 10001–10008. https://doi.org/10.1073/pnas.0901129106
3 GrrlScientist. (2019, January 11). Problem-Solving Budgies Make More Attractive Mates. Forbes. Retrieved from https://www.forbes.com/sites/grrlscientist/2019/01/10/problem-solving-budgies-make-more-attractive-mates/#515f24d66407
Netflix's recent hit flick, Bird Box, startled viewers with thrilling scenarios revolving around the premise that once an entity is seen, an abrupt, ferocious death follows. Given that, Malorie (the protagonist portrayed by Sandra Bullock) blindfolded herself and two children and embarked down a perilous river to seek safer refuge. (N.B.: if you have not seen it yet, you may want to pause here to dodge the spoilers ahead.) Ultimately, they reached the haven, which was revealed to be an old school for the blind. The surviving community was a population of primarily blind, and as such immune, people. By and large, the film conveyed the message that blindness should not be taken as an utter handicap but as a trait that may confer an evolutionary edge.
Blindness is a complete, or nearly complete, lack of vision. Basically, two major forms exist. Partial blindness means very limited vision. In contrast, complete blindness means a total lack of vision: not seeing anything, even light.1
Causes of blindness
Some of the common causes of blindness include eye accidents or injuries, diabetes, glaucoma, macular degeneration, blocked blood vessels, retrolental fibroplasia, lazy eye, optic neuritis, stroke, retinitis pigmentosa, optic glioma, and retinoblastoma.1
Congenital blindness refers to a condition wherein a person has been blind since birth. Several instances of infant blindness are due to inherited eye diseases, such as cataracts, glaucoma, and certain eye malformations; in these cases, genetic factors play a role. Retinitis pigmentosa, for example, is a hereditary condition in which the retinal cells slowly disintegrate, ultimately leading to incurable blindness later in life. Albinism can also lead to vision loss that, at times, reaches the category of "legally blind".
The mapping of the human genome led to the identification of certain genetic causes of blindness. Scientists recently identified hundreds of new genes associated with blindness and other vision disorders. Bret Moore and colleagues found 261 new genes linked to eye diseases.2 Furthermore, they noted that these newly identified genes from mouse models likely have analogous counterpart genes in humans. Thus, their findings could shed light on the genes causing blindness in humans.
Humans evolved eyes that enabled sight or vision. About 500 million years ago, the earliest predecessors had eyes that could merely distinguish light from dark. This early eye, called an "eyespot", could sense ambient brightness (but not shapes), which sufficed to orient single-celled organisms (e.g., Euglena) to circadian rhythm and photoperiodism, and, of course, to food.3
Soon, the eyespot evolved into a more complex light-detecting structure, such as that found in flatworms. Their eyes could detect the direction of light and enabled them to seek better spots to hide from predators. As light was able to penetrate the deep seas, organisms such as Nautilus evolved a pinhole eye: a small opening that allowed only a thin beam of light to enter, dramatically improving resolution and directional sensing.3
The pinhole eye, in turn, evolved a lens that regulated the degree of convergence or divergence of the transmitted rays. Furthermore, the lens helped the organism gauge the spatial distance between itself and objects in its environment.3
The modern human eye has become more intricate through the presence of additional eye structures. For instance, a transparent layer called the cornea covers the opening (pupil) of the eye, and the inside of the eye contains a transparent body fluid called the vitreous humor. The iris is the colored part surrounding the pupil. The light-sensitive membrane, the retina, contains the photoreceptor cells, i.e., the rods and the cones. Apparently, the evolution of the human eye concurred with the evolution of the visual cortex of the human brain.3
Blindness – an evolutionary regression or a gain?
Should blindness be considered an evolutionary regression or an evolutionary gain? Blind beetle species that live in lightless caves in the underground aquifers of Western Australia and the eyeless Mexican cavefish are some of the animals that once had sight but lost it over millions of years.
Simon Tierney from the University of Adelaide offered an explanation for this seemingly evolutionary regression.4 Accordingly, the loss of sight in the cavefish apparently led to the evolution of an increased number of taste buds. Pleiotropy might explain this manifestation: a pleiotropic gene controls multiple (and possibly unrelated) phenotypic traits. In this case, the gene responsible for the eye loss might have also caused the increased number of taste buds. Eyesight may not be imperative in a light-deprived habitat; an improved sense of taste through more taste buds, however, is. Douglas Futuyma of the State University of New York at Stony Brook explained:4
“So the argument is these mutations are actually advantageous to the organism because the trade off for getting rid of the eye is enhancing the fish’s tastebuds. It really looks like these evolutionary regressions are not a violation of Darwin’s idea at all. It’s just a more subtle expression of Darwin’s idea of natural selection.”
In 2017, a research team posited that blind people do have enhanced abilities in their other senses. To test this, they scanned the brains of blind participants with magnetic resonance imaging (MRI). The scans revealed heightened senses of hearing, smell, and touch among blind participants as opposed to sighted participants. Moreover, they found that blind people had enhanced memory and language abilities. Lotfi Merabet of the Laboratory for Visual Neuroplasticity at Schepens Eye Research Institute of Massachusetts Eye and Ear said:5
“Even in the case of being profoundly blind, the brain rewires itself in a manner to use the information at its disposal so that it can interact with the environment in a more effective manner.”
As the popular maxim goes, the eyes are the windows to the soul. In the presence of light, our eyes perceive all the seemingly playful colors and spatiality that surround us. At times, a simple stare is all it takes to convey what we could have said in words. And yet, despite the loss of sight in some of our conspecifics, their brains have reconfigured in remarkable ways that enable them to do most of what a sighted person can. Based on what researchers observed, they had enhanced interconnections in the brain that seemed to compensate for the lack of sight. Hence, blindness appears not as an evolutionary regression but, plausibly, as a shift of path forward along the evolutionary line.
— written by Maria Victoria Gonzaga
1 Blindness and vision loss: MedlinePlus Medical Encyclopedia. (2019, January 1). Retrieved from https://medlineplus.gov/ency/article/003040.htm
2 University of California – Davis. (2018, December 21). 300 blind mice uncover genetic causes of eye disease. ScienceDaily. Retrieved from www.sciencedaily.com/releases/2018/12/181221142516.htm
3 TED-Ed. (2015, January 8). YouTube. Retrieved from https://www.youtube.com/watch?v=qrKZBh8BL_U
4 How does evolution explain animals losing vision? (2015, March 18). Abc.Net.Au. Retrieved from http://abc.net.au/science/articles/2015/03/18/4192819.htm
5 Miller, S. G. (2017, March 22). Why Other Senses May Be Heightened in Blind People. Retrieved from https://www.livescience.com/58373-blindness-heightened-senses.html