California’s Prop 37 exposes Monsanto GMO agenda
California voters will have an opportunity to end decades of a conspiracy of silence when Proposition 37 is presented on this November’s ballot.
Prop 37 requires labeling of Genetically Modified Organism (GMO) based food monopolized by agri-giant Monsanto.
Proponents criticize a labeling blackout of GMO foods largely forced on the marketplace by Monsanto.
According to critics, consumers need to demand that GMO food producers accurately label their products, because such foods have not been adequately tested for adverse effects on humans.
Lab rats experienced kidney failure in studies when fed GMO food, a fact often glossed over by the GMO food producers.
More often than not, it is the political influence of Monsanto that has stalled the GMO labeling efforts.
Concern was raised last year over the appointment of Michael J. Taylor, former chief legal counsel for Monsanto, as the Food Safety Czar of the Food and Drug Administration (FDA).
Critics say Taylor was responsible for a regulatory loophole that removed the requirement to label GMO products while he was a lower-level FDA official in the mid-1990s.
Taylor has passed through the Monsanto-FDA revolving door several times, critics charge, to the benefit of his former employer.
Can Genetic Modification Really Lead to Renewable Fuel?
Despite its faults and what it is typically used for, we have to admit that genetic modification is quite an advancement. Can it be corralled to help mankind, similar to the way Tesla technology could have been used more fully to benefit the world?
A lowly, seemingly ordinary soil bacterium has been found to use its energy to create complex carbon compounds when under stress. Ralstonia eutropha can convert carbon into its own bioplastic, a biodegradable plastic.
Instead of letting the bacterium make plastic, Christopher Brigham and his MIT biology research team modified its genes, knocking out one and replacing a few others, to coax the bacterium into producing isobutanol, an alcohol. The alcohol can be easily collected without the hassle of filtering out the original bacteria.
This could be an amazing kind of alchemy indeed: isobutanol can be blended with, or even completely replace, gasoline.
It’s already used in some racing cars. Currently, ethanol is the touted advancement in fuel, but it is only renewable to a point. Corn-based ethanol is partially blamed for rising food prices because of the increased demand for corn, and it is not the most reliable source during droughts. Many argue that its production wastes land that could be used to grow edible food.
Furthermore, Brigham is now working on getting the microbe to use carbon dioxide as its carbon source, in order to create fuel out of emissions. During the team’s lab tests, the bacteria used fructose, a sugar, as their carbon source, but the microbe could potentially clean up farming and municipal waste and turn it into fuel. Last month, Brigham’s paper on this work was published in the journal Applied Microbiology and Biotechnology.
Like a bear storing fat for hibernation, the bacterium in its natural environment will store carbon when essential nutrients like nitrate or phosphate are unavailable. Brigham says:
What it does is take whatever carbon is available, and stores it in the form of a polymer, which is similar in its properties to a lot of petroleum-based plastics. (Source)
The team has had long-term success in modifying the genes to turn carbon into isobutanol, projects that it can produce larger amounts more quickly, and is planning to design bioreactors for industrial-scale use. Other companies are currently researching ways to produce isobutanol for fuel or as a chemical feedstock.
One red flag, however, is that the work was funded by the U.S. Department of Energy’s Advanced Research Projects Agency-Energy (ARPA-E).
Would they allow an advancement such as this to relieve the burden of rising fuel prices and environmental devastation? And what is the ultimate price tag?
It is unclear what this research is costing taxpayers and where it will ultimately lead. Also, in order to get the fuel, we are still locked into dependency. And it looks like isobutanol production is already falling to the corporate dogs: DuPont and BP succeeded in tripping up new companies this summer to prevent them from shipping to new customers. It is now a heated race for the patent rights.
We already have seen examples of government colluding with Big Oil to keep innovation out of America through the suppression of technology that could lead to better fuel efficiency, so skepticism by taxpayers is certainly warranted.
Ideally, we’d live in a world that runs our vehicles on other kinds of advancements: air, water, or moisture pulled from the atmosphere. Name any one of many discoveries that could power a car, and it is likely to remain suppressed and hidden. Such advancements might not even require more factories. While factories would obviously create more jobs, paying little to nothing for fuel would be a financial boost we hardly dare to hope for today. And, of course, some of the concerns about genetic modification center on changing our ecology forever, unleashing something that could permanently alter the fabric of nature. But perhaps this is one area where science, technology, and nature could coexist and shine.
So it appears that technology itself, as a double-edged sword, can be used to actually benefit mankind if applied properly, but we must always keep in mind whose hands are steering the direction of innovation, as well as who is likely to benefit the most from its suppression or release.
Stanford Anti-Organic Study Plays into UN Codex Alimentarius Outline for Global Depopulation
Globalist-funded researchers at Stanford University have created a propaganda study asserting that organic food is no more nutritious than conventionally grown and GMO food. The study claims that the higher price of organic food, combined with the researchers’ allegation that organic food supporters overblow its health benefits, results in “no advantages of organic meat and produce”.
Dr. Dena Bravata, lead author of the study and an affiliate of Stanford’s Center for Health Policy, explained: “When we began this project, we thought that there would likely be some findings that would support the superiority of organics over conventional food. I think we were definitely surprised.”
The summation of the study: organic food is a marketing scheme to coerce people into paying higher prices for the same quality food. The study says: “The evidence does not suggest marked health benefits from consuming organic versus conventional foods although organic produce may reduce exposure to pesticide residues and organic chicken and pork may reduce exposure to antibiotic resistant-bacteria.”
Claiming that the purpose of the study was to inform the public on the nutritional value of conventional versus organic food, Bravata asserts that there was no outside financing that would have created a bias.
Bravata believes that organic produce and meat bear no more nutritional value, and no more beneficial vitamins, than their conventional and GMO counterparts.
In a two-year study, scientists from Washington State University found that “organically grown strawberries were far more nutritious than their chemically grown counterparts.”
John Reganold, lead researcher and professor, states that the data they collected, comparing chemical growing methods with organic techniques, show that the way food is grown affects its nutritional value. The use of pesticides and chemicals creates dangerous food laced with carcinogenic properties.
Molecular milestone: scientists unravel the human genome
By Stephanie Pappas
Published September 09, 2012
In a milestone for the understanding of human genetics, scientists just announced the results of five years of work in unraveling the secrets of how the genome operates.
The ENCODE project, as it is known, dispensed with the idea that our DNA is largely “junk,” repeating sequences with no function, finding instead that at least 80 percent of the genome is important.
The new findings are the latest in a series of increasingly deep looks at the human genome. Here are some of the major milestones scientists have passed along the way.
1. An understanding of heredity, 1866
The realization that traits and certain diseases can be passed from parent to offspring stretches back at least to the ancient Greeks, well before any genome was actually decoded. The Greek physician Hippocrates theorized that “seeds” from different parts of the body were transmitted to newly conceived embryos, a theory known as pangenesis. Charles Darwin would later espouse similar ideas.
What exactly these “seeds” might be was destined to remain a mystery for centuries. But the first person to put heredity to the test was Gregor Mendel, who systematically tracked dominant and recessive traits in his famous pea plants. Mendel published his work on the statistics of genetic dominance in 1866 to little notice. [Genetics by the Numbers: 10 Tantalizing Tales]
2. Chromosomes come to light, 1902
But the painstaking work of cross-breeding pea plants wouldn’t languish for long. In 1869, Swiss physician Johannes Friedrich Miescher became the first scientist to isolate nucleic acids, the chemical family to which DNA belongs. Over the next several decades, scientists peering deeper into the cell discovered mitosis and meiosis, the two types of cell division, and chromosomes, the long strands of DNA and protein in cell nuclei.
In 1902, early geneticist Walter Sutton put two and two together, discovering through his work on grasshopper chromosomes that these mysterious filaments occur in pairs and separate during meiosis, providing a vehicle for mom and dad to pass on their genetic material.
“I may finally call attention to the probability that the associations of paternal and maternal chromosomes in pairs and their subsequent separation … may constitute the physical basis of the Mendelian law of heredity,” Sutton wrote in the journal The Biological Bulletin in 1902. He followed up with a more comprehensive paper, “The Chromosomes in Heredity” in 1903. (German biologist Theodor Boveri came to similar conclusions about chromosomes at the same time Sutton was working on his chromosome discovery.)
3. What genes do, 1941
With the link between chromosomes and heredity confirmed, geneticists delved deeper into the mysteries of the genome. In 1941, geneticists Edward Tatum and George Beadle published their work revealing that genes code for proteins, explaining for the first time how genes direct metabolism in cells. Tatum and Beadle would share half of the 1958 Nobel Prize in Physiology or Medicine for their discovery, which they made by mutating bread mold with X-rays.
4. DNA structure decoded, 1953
Now scientists knew that DNA was the molecule responsible for carrying genetic information. But how? And what did this molecule look like?
The pieces of the puzzle were beginning to come together throughout the 1940s. In 1950, biochemist Erwin Chargaff figured out that the nucleotides, or building blocks, of DNA occur in specific patterns. These nucleotides are represented by four letters (A, T, G and C), and Chargaff was the first to discover that no matter the species, A and T always appeared in equal measures, as did G and C.
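Chargaff’s observation follows directly from complementary base pairing: every A on one strand pairs with a T on the other, and every G with a C. A small illustrative script (a sketch for this article, not anything from the original research) shows that counting bases across both strands of a duplex always yields equal amounts of A and T, and of G and C:

```python
# Toy illustration of Chargaff's rule: in double-stranded DNA,
# complementary base pairing forces A to equal T and G to equal C
# when bases are counted across both strands of the duplex.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement_strand(strand):
    """Return the complementary strand of a DNA sequence."""
    return "".join(COMPLEMENT[base] for base in strand)

def duplex_base_counts(strand):
    """Count each base across both strands of the duplex."""
    duplex = strand + complement_strand(strand)
    return {base: duplex.count(base) for base in "ATGC"}

counts = duplex_base_counts("ATGCGGATTACA")  # arbitrary example strand
assert counts["A"] == counts["T"]
assert counts["G"] == counts["C"]
print(counts)
```

No matter what example strand is chosen, the parity holds, which is exactly what Chargaff measured species after species.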
This discovery would be crucial to James Watson and Francis Crick, the scientists who would describe the structure of DNA for the first time in 1953. Combining Chargaff’s work with studies by Maurice Wilkins and Rosalind Franklin and other scientists, the pair worked out the iconic double helix shape of DNA, a discovery Crick reportedly called “the secret of life.”
5. Human Genome catalogued, 2001
With DNA becoming an increasingly open book, scientists began to tackle genomics, the study of the complete genetic library of organisms. In 1977, researchers sequenced a complete genome for the first time, starting with a rotund little bacteriophage known as Phi X 174. By 1990, science was ready to start something much bigger: a complete cataloguing of the human genome. [Animal Code: Our Favorite Genomes]
The result was the Human Genome Project, a 13-year international effort that resulted in the complete sequencing of the human genome in 2001. (More detailed analyses of the initial sequence continued after the release of this first draft.) The project revealed that humans have about 23,000 protein-coding genes, whose coding sequences make up a mere 1.5 percent of the genome. The rest is made up of what has been called “junk DNA,” including fragments of DNA that don’t code for any proteins and chunks of genes that regulate other portions of the genome.
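As a rough sanity check on those numbers (assuming a total genome length of about 3.2 billion base pairs, a figure the article does not state), 1.5 percent of coding sequence spread across roughly 23,000 genes works out to about 2,000 coding base pairs per gene:

```python
# Back-of-the-envelope check of the Human Genome Project figures.
# Assumes a genome of ~3.2 billion base pairs (an outside figure,
# not stated in the article).
GENOME_BP = 3.2e9
CODING_FRACTION = 0.015   # "a mere 1.5 percent of the genome"
GENE_COUNT = 23_000       # "about 23,000 protein-coding genes"

coding_bp = GENOME_BP * CODING_FRACTION          # total coding base pairs
avg_coding_bp_per_gene = coding_bp / GENE_COUNT  # average coding length

print(f"coding DNA: {coding_bp:.2e} bp")
print(f"average coding sequence per gene: {avg_coding_bp_per_gene:.0f} bp")
```

That average of roughly two kilobases of coding sequence per gene is a reminder of how small the protein-coding portion is next to the regulatory “junk” ENCODE set out to map.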
6. Junk DNA de-junked (2012)
Now, the ENCODE project has looked deeper into this “junk DNA” than ever before. And junk it is not: According to more than 30 research papers published today (Sept. 5) in a number of journals including Science and Nature, at least 80 percent of the genome is biologically active, with much non-protein-coding DNA regulating nearby genes in a complex dance of influence. [Mysteries of Human Evolution]
The findings reveal that the genetic basis of many diseases may not be in protein-coding genes at all, but in their regulatory neighbors. For example, genetic variants related to metabolic diseases pop up in genetic regions that activated only in liver cells. Likewise, regions activated in immune cells hold variants that have been associated with autoimmune disorders such as lupus.
“These breakthrough studies provide the first extensive maps of the DNA switches that control human genes,” study researcher John Stamatoyannopoulos, associate professor of genome sciences and medicine at the University of Washington, said in a statement. “This information is vital to understanding how the body makes different kinds of cells, and how normal gene circuitry gets rewired in disease. We are now able to read the living human genome at an unprecedented level of detail, and to begin to make sense of the complex instruction set that ultimately influences a wide range of human biology.”
Biocompatible Material Much Tougher than Cartilage Developed, May be able to Replace Damaged Cartilage in Joints
September 9, 2012 By Nathan
Experts from a wide variety of fields have collaborated to research and create an extremely tough and stretchy biocompatible material that may be used in the future to replace damaged cartilage in human joints.
The material is a hydrogel, meaning its main component is water; it is a hybrid of two weak gels combined to create a material much stronger than either is on its own.
This newly created gel is able to stretch to 21 times its original length, and is, more impressively, also extremely tough, biocompatible, and capable of self-healing. That’s an extremely valuable collection of attributes when taken together, opening up many new possibilities and opportunities in medical and tissue engineering fields.
The research on the new material, its properties, and an easy way to synthesize it are described in the September 6 issue of Nature.
“Conventional hydrogels are very weak and brittle — imagine a spoon breaking through jelly,” explains lead author Jeong-Yun Sun, a postdoctoral fellow at the Harvard School of Engineering and Applied Sciences (SEAS). “But because they are water-based and biocompatible, people would like to use them for some very challenging applications like artificial cartilage or spinal disks. For a gel to work in those settings, it has to be able to stretch and expand under compression and tension without breaking.”
The very tough new hydrogel was created by combining two common polymers, the primary being polyacrylamide (used in soft contact lenses), and the secondary being alginate (a seaweed extract used to thicken food).
“Separately, these gels are both quite weak — alginate, for instance, can stretch to only 1.2 times its length before it breaks. Combined in an 8:1 ratio, however, the two polymers form a complex network of crosslinked chains that reinforce one another. The chemical structure of this network allows the molecules to pull apart very slightly over a large area instead of allowing the gel to crack.”
The portion of the gel that is alginate is made of polymer chains that make weak ionic bonds with one another, “capturing calcium ions (added to the water) in the process.” If the gel is stretched out it allows some of these bonds between the chains to break, releasing the calcium. When this happens the gel slightly expands, while still leaving the polymer chains themselves intact. At the same time as this, the polyacrylamide chains form into a “grid-like structure that bonds covalently (very tightly) with the alginate chains.”
So, if the gel forms even a tiny crack as it’s stretched, the polyacrylamide grid spreads the pulling force out over a large area, putting pressure on the alginate’s ionic bonds and breaking them in some spots. Even with a huge crack, the hybrid gel is still able to stretch to 17 times its beginning length.
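To put the figures above in conventional strain terms (a simple arithmetic sketch, not part of the original paper), a stretch ratio converts to engineering strain as (final length − original length) / original length, and the 8:1 mixing ratio implies the mass fractions below:

```python
# Convert the reported stretch ratios into engineering strain (percent),
# and the 8:1 polyacrylamide-to-alginate ratio into mass fractions.

def strain_percent(stretch_ratio):
    """Engineering strain = (final - original) / original, as a percentage."""
    return (stretch_ratio - 1) * 100

print(f"alginate alone (1.2x): {strain_percent(1.2):.0f}% strain")
print(f"hybrid gel (21x): {strain_percent(21):.0f}% strain")
print(f"hybrid gel with a crack (17x): {strain_percent(17):.0f}% strain")

poly, alg = 8, 1  # the paper's 8:1 polyacrylamide:alginate ratio
print(f"polyacrylamide fraction: {poly / (poly + alg):.1%}")
print(f"alginate fraction: {alg / (poly + alg):.1%}")
```

So the hybrid sustains roughly 2,000 percent strain, against only about 20 percent for alginate alone, with the alginate making up just one-ninth of the blend.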
An important thing to note is that the new hydrogel is able to maintain its toughness and elasticity even after being stretched many times. As long as some time is provided to relax in between the stretches, the ionic bonds between the alginate and the calcium can “un-break.” In experiments, it was demonstrated that this healing process can be intentionally accelerated by increasing the ambient temperature.
“The unusually high stretchability and toughness of this gel, along with recovery, are exciting,” says senior author Zhigang Suo, a professor at Harvard SEAS. “Now that we’ve demonstrated that this is possible, we can use it as a model system for studying the mechanics of hydrogels further, and explore various applications.”
Beyond artificial cartilage, the researchers suggest that the new hydrogel could be used in soft robotics, optics, artificial muscle, as a tough protective covering for wounds, or “any other place where we need hydrogels of high stretchability and high toughness.” Perhaps in some new cleantech?
Source: Harvard School of Engineering and Applied Sciences
Image Credit: Jeong-Yun Sun and Widusha R. K. Illeperuma
Clean Technica (http://s.tt/1mRLP)
Nanotech, Terra-Forming, Transhumanism, and You
Ann Gordon, RN, Contributor
Many things are being altered right before our eyes, mostly without our consent or knowledge. These changes are coming from well-funded and classified experiments, accidents, and new science, often with creepy ‘modern’ agendas driving them. This sort of broad-scale change is unfolding in nearly every facet of our lives, and the signs and symptoms already indicate that our health may suffer as a result.
Let’s start with food, for example. The bio-availability of the nutrients in our food seems to be getting interrupted. Could it be the irradiation or modification of foods? Some believe that the real reason for the obesity problem could actually be a lack of nutrition, a slow starvation, because nutrients cannot get into the cells.
Weather modification is nothing new, and researchers claim that the residues from spraying create toxic effects on our health, plants, and animals. We seem to be ingesting experimental aerosolized air, tainted water (with fluoride, pharmaceuticals, etc.), faked food, and untested medicines. Asthma cases are soaring, as are mysterious flu-like symptoms. But that is only the tip of a very chilling future according to many thinkers.
Hollywood is helping to desensitize us to these words through popular movies. For example, many science fiction movies feature flesh-eating zombies (mutated humans) and the terra-forming of the earth. Academic futurists are talking about Transhumanism, and popular magazines illustrate how a future man might physically appear.
Transhumanism, a new and growing movement, asserts claims to augment ‘natural humans’, replacing them with technologically ‘advanced’ artificial intelligence, fueled by nanotechnology and bio-engineering. As change accelerates, we hear the term ‘die-off’ and shrug it off as natural ‘extinctions’. The skies are often white, not blue, as the weather experiments (a race to control the weather, or other motives) become extreme. Sadly, as we get desensitized, we accept a new normal.
Yet terra-forming the environment (geo-engineering) and changing the nature of man need some forethought. If the skies are sprayed with aerosols to reflect sunlight and prevent warming, as many documents note, serious consequences seem obvious. Ultimately, are we hurting ourselves and future generations?
We have already created plastic antibodies. We have smart pills, smart chicken, smart water, smart washers, and a smart grid. Smart is a marketing word, like the new modern, but often with an altered brew. Today, we get artificial (plastic) body parts, and create new life in vitro, (cellular engineering).
In short, we’re already ingesting new chemicals, plastics, and nano-particles, while being bombarded 24/7 by invisible extremely low-frequency electromagnetic fields (ELFs). We are told everything is safe, but is it? How do you feel?
Transhumanism looks decades into man’s future, based on today’s technologies. We already have cryonics (the preservation of cells).
Virtual reality already mimics reality, and will soon include our sense of touch (haptics). It is suggested that people might prefer the virtual reality of the future to reality itself.
Sophisticated biotechnology combined with nanotechnology will become so convenient and tiny that everyday people will want implants (for convenience or prestige). New vaccines are on the horizon for countless ailments, with needle-free, creative ways to take them. Yum.
Before embracing every new technology sold to us to improve our lives, it might be wise to take a very close look at the non-monetary price you will pay, especially when it is linked to potentially harmful side effects. Start with what you can do for yourself and your loved ones, and learn what you are willingly putting into your body.
This article first appeared at GreenMedInfo. Please visit to access their vast database of articles and the latest information in natural health.
University Researchers Develop a Technique to Remotely Control Cockroaches
By Madison Ruppert
Researchers at North Carolina State University (NCSU) have taken a major step forward in developing a technique that leverages an electronic interface to remotely control cockroaches.
While this is supposedly intended for positive purposes, namely finding survivors in a building destroyed by an earthquake, I think it is quite obvious that this is not the only thing it could be used for. The earthquake application is strikingly similar to the one proposed for the technology that allows people to “see” around corners and through skin via advanced optics and photons.
With an increasing demand for small robots capable of navigating tight spaces, it wouldn’t be the least bit surprising to see this type of technology used by the military or law enforcement.
The researchers have developed a technique, outlined in a paper entitled “Line Following Terrestrial Insect Robots,” which leverages an electronic interface placed on the cockroach’s back, allowing them to remotely steer the cockroaches and “allow first responders to create a mobile web of smart sensors that uses cockroaches to collect and transmit information,” according to Homeland Security News Wire.
“Our aim was to determine whether we could create a wireless biological interface with cockroaches, which are robust and able to infiltrate small spaces,” said Alper Bozkurt, assistant professor of electrical engineering at North Carolina State University and co-author of the paper on the breakthrough.
The ultimate goal, according to Bozkurt, is the creation of a “mobile web of smart sensors” using cockroaches to both collect and transmit information.
While this obviously could be used for incredibly positive and useful ends such as earthquake rescue, it could also be used for surveillance and other law enforcement or military purposes.
“Building small-scale robots that can perform in such uncertain, dynamic conditions is enormously difficult,” said Bozkurt.
“We decided to use biobotic cockroaches in place of robots, as designing robots at that scale is very challenging and cockroaches are experts at performing in such a hostile environment,” explained Bozkurt.
There is also the cost factor: robots might well be lost in the process. However, the systems needed to control cockroaches aren’t all that cheap either.
Researchers needed a method of controlling the cockroaches that is both cost-effective and electrically safe, which they achieved by making the roach think that danger was near or an object was blocking its path, according to an Engadget article linked to by NCSU.
Bozkurt’s team developed a method which involves embedding the roaches – in this case Madagascar hissing cockroaches – with a relatively simple, low-cost and lightweight commercially available microchip outfitted with a wireless receiver and transmitter.
The “cockroach backpack” weighs in at a mere 0.7 grams and also contains a microcontroller which is connected to the roach’s antennae and cerci, allowing monitoring of the interface between the electrodes implanted in the roach and the roach’s tissue in order to avoid possible neural damage.
The microcontroller uses the cerci because they are sensory organs normally used by the roach to detect movement in the air, possibly indicating a predator. When the cerci on the roach’s abdomen detect movement they cause the roach to scurry away to safety.
The researchers found that they could use wires attached to the cerci to make the roach move in a desired direction since the roach thinks something is sneaking up behind it. Thus, the roach is spurred to move forward and the wires attached to the cockroach’s antennae act as something like “electronic reins, injecting small charges into the roach’s neural tissue,” according to Homeland Security News Wire.
These “electronic reins” essentially trick the cockroach into thinking their antennae are in contact with some kind of physical barrier, pushing them to move in a different direction.
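The steering logic described above can be sketched as a toy simulation. This is purely illustrative; the class, method names, and turn angles below are invented for the example and are not from the NCSU interface:

```python
# Toy model of the "electronic reins": pulsing the cerci simulates a
# threat from behind (the roach scurries forward), while pulsing an
# antenna simulates a barrier on that side (the roach turns away from it).
# All names and angles here are hypothetical.
import math

class ToyRoach:
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0  # degrees; 0 = facing along +x

    def pulse_cerci(self, distance=1.0):
        """Simulated predator behind: move forward along current heading."""
        rad = math.radians(self.heading)
        self.x += distance * math.cos(rad)
        self.y += distance * math.sin(rad)

    def pulse_antenna(self, side, turn=30.0):
        """Simulated barrier on one side: turn away from it."""
        if side == "left":
            self.heading -= turn   # barrier on the left -> veer right
        elif side == "right":
            self.heading += turn   # barrier on the right -> veer left
        else:
            raise ValueError("side must be 'left' or 'right'")

roach = ToyRoach()
roach.pulse_cerci()            # forward one unit along +x
roach.pulse_antenna("left")    # perceived barrier on the left: turn right
roach.pulse_cerci()            # forward along the new heading
print(round(roach.x, 3), round(roach.y, 3), roach.heading)
```

Alternating antenna and cerci pulses in this way is the basic idea behind steering the insect along a curved line.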
The precision with which they can make these cockroaches move is quite astounding, especially since this technology is still in its relative infancy.
The researchers have already demonstrated the ability to use a microcontroller to make a cockroach move along a line curving in different directions, and one must realize that this capability will only become more precise as the technology is refined.
While this could indeed be a phenomenal step forward in earthquake rescue and save many people’s lives, it could just as easily be employed in military and law enforcement contexts for ends that are far less laudable. All we can really do is hope that it is used for positive ends instead of the many less-than-admirable applications.
This article originally appeared on End the Lie