Scientists Develop Spider Silk Skin

No, this isn't the tagline for the new Spider-Man movie; it's an actual news story. In Germany, researchers have been working on human skin replacements made from the silk of orb-weaving spiders. Apparently, under precisely controlled conditions, human skin cells can be grown on the spider silk. With no toxic byproducts and a biodegradable structure, spider silk would make a much more eco-friendly alternative to current skin replacement methods, such as petroleum-based formulas.

It probably wouldn't be considered vegan, though I'm not sure whether harvesting the silk actually harms the spiders. And it's not really that new a concept: throughout history, spider webs have been used to cover wounds, as well as to make nets, textiles, and other goods. Researchers say silk as a skin replacement also looks promising for brain cell regeneration and for delivering medication to small, localized areas of the body.

Whoever said spiders weren't helpful to humans? We can only imagine what else the medical world might do with spider silk in the future. Hopefully it will be used to save lives rather than for frivolity. I cringe as I imagine the facelifts and silk Botox treatments…

Genetically Engineered T-Cells May Provide a Cure for Cancer; Scientists Optimistic

A preliminary test on leukemia patients shows unprecedented results.

A recent report in the Health and Science section of The Week provides some incredibly encouraging news for cancer patients and their loved ones. Researchers at the University of Pennsylvania have conducted only a preliminary study, but they are "wildly buoyant" about the results. Three leukemia patients underwent a special treatment in which their T-cells, the immune system's natural disease-fighting cells, were genetically re-engineered to target cancer cells specifically: a kind of cellular serial killer. The results are startling: two of the patients are now cancer-free, and the third saw his cancer cell count reduced by 70%.

According to a report by ABC News, earlier attempts at repurposing T-cells have not gone well, with the cells reproducing poorly and eventually disappearing altogether. However, Dr. Carl June, a gene therapy expert at UPenn and one of the researchers on the project, changed the approach. His team used a new carrier to bring the new genes into the T-cells, instructing them to multiply and to kill the particular leukemia cells in their patients. Each of the patients was a middle-aged man with advanced chronic lymphocytic leukemia (CLL) whose options had nearly run out. With June and his team's efforts, however, the men experienced unprecedented reversals in their conditions. June said that T-cells typically attack viruses, killing viral material in the body and then going after any new viruses that pop up, and that is precisely how they attacked the cancer.

Blood was drawn from all three individuals, and their T-cells were removed, altered, and then returned to the patients in three infusions. The millions of altered T-cells multiplied a thousandfold as they attacked the leukemia cells in each patient. June reported that there was little change in the men for the first several weeks, and then they all experienced crippling flu-like symptoms. The symptoms (nausea, cramping, chills, fever) are a condition called "tumor lysis syndrome," the result of a rapid die-off of cancer cells in the body. According to June, "Within three weeks, the tumors had been blown away, in a way that was much more violent than we ever expected." One of the patients, according to the report, had more than seven pounds of tumor material in his body, and after three weeks it was completely gone.

Other than tumor lysis syndrome, there is also the potential for the genetically altered T-cells to attack and kill other infection-fighting cells in the body, leaving the patients open to infection and sickness that required treatment at the clinic. Even so, the extreme efficiency and efficacy of this treatment lends real hope to people dealing with cancer all over the world. The only unanswered question is how long the effects may last. Researchers designed the T-cells to multiply and to create dormant offspring T-cells that will "wake up" should cancer cells reappear later in life, partly in an effort to fight any re-emergence in their patients. In time, this treatment may become commercially available, but at this point it is a highly specialized procedure that will stay largely in the realm of academia and research. Watch the introductory video on this breakthrough treatment here.

Scientists in movies are either loveable buffoons or really want to take over the world

As I see it, there are about five scientist archetypes in American movies. The frizzy-haired, whacked-out scientist makes an invention that may be brilliant, but it goes COMPLETELY OUT OF CONTROL! The young, geeky science kid grows up, gets the braces off, buys a nice suit and becomes a grown-up, rich scientist all the kids are sorry for teasing in the first place. You get the picture. It's a trend. And none of these stereotypes serve as very good role models for getting kids to pick up their Bunsen burners anytime soon. They might think science could make them rich in their 20s or 30s, but those dreams will be thwarted once they discover how much labor is actually involved. These movies tell them that they may be smart, but who wants that anyway? The best thing a scientist can do, film tells us, is hide her scientific proclivities behind physicality, money or fame. Let's take a look at some of the major scientist archetypes in American cinema and see what they mean for our next budding science geniuses:

Rich Scientist who isn't so geeky anymore because he has $erious Ca$h. Remember that kid with all the pimples and braces who made that volcano explode in the high school science fair? (Seriously, Hollywood, quit it with the volcano science experiments.) Well, look at him now! I guess this stereotype offers some redemption, saying that the nerds will rule the earth while the popular kids end up with 12 kids, but seriously, what kid wants to wait until they're 30 to have any friends? And most of the time, the character has to completely change himself even as a rich adult. Remember Sandy Frink from Romy and Michele's High School Reunion? He can barely contain his khaki-clad erection in high school and doesn't catch Michele's eye with his pocket calculator, but at the reunion, after he becomes a famous, good-looking inventor, she's all over him.

Evil Mad Scientist who wants to destroy the world. This is the big one: combine science and evil and you have a deadly combination! Okay, I guess this was sort of justified because there were a lot of big-brained, evil superpowers in the world when Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb came out in 1964. But the evil scientists that followed it? Going too far.

Scientists whose intelligence we forgive/forget because they are soooooooo attractive! Hermione Granger in the Harry Potter films, Val Kilmer in Real Genius: it doesn't matter that they are smart, because their attractiveness seriously outweighs their intelligence. We can forgive them their genius, which is typically portrayed as effortless, perhaps burdensome, because who works at science if they have gorgeously flowing locks? Science, again, is a back-up for when good looks or social charm fail.

Loveable, Bumbling Mad Scientists who are super-brilliant in science but have zero social skills to speak of. This one is the most common in recent years. There's Dr. Dolittle, Philip Brainard in Flubber, Wayne Szalinski in Honey, I Shrunk the Kids. I could go on and on. These loveable, brilliant dimwits can't control their creations and usually can't remember to take out the trash, but they still get foxy women to marry them. If you're a scientist, you don't need to have social skills. Probably because you're rich.

Scientists who accidentally ruin the world. Americans are constantly told that it's really not a good idea to be too smart. Bill Clinton and George W. Bush downplayed their Ivy League educations because those credentials might spook voters. But what better way to illustrate how often brains can get in the way than to show a brainiac accidentally destroying the world? I sure don't know of one. Remember Matthew Broderick in WarGames? He hacks into what he thinks is a game company's computer system, actually a military supercomputer, and nearly starts World War III. Better stick to blissful ignorance, American kids.

GeoHazards Strives for International Solutions for Disaster Relief

Brian Tucker: The loss of human life can be prevented or lessened.


Here in Seattle, we are always waiting for the Big One: a potentially devastating earthquake, since we sit near the Cascadia Subduction Zone. City officials here are constantly arguing about projects that would make the city safer in the event of an earthquake. Seismologist Brian Tucker has been looking at ways that earthquake-prone cities and communities everywhere, including those in developing nations, can prevent the loss of human life.

Brian Tucker compares earthquakes to diseases: just as diseases can be prevented with vaccines, much of the damage caused by earthquakes can be prevented with mitigation measures. His organization, GeoHazards International (GHI), has worked in as many as forty different countries to build safer buildings and infrastructure and to lessen the loss of human life when natural disasters such as earthquakes and tsunamis occur.

As this Slate article points out, Brian Tucker was inspired to start GHI when he noticed the large discrepancy between the number of lives lost in an Armenian earthquake (approximately 25,000 people) and in a similar earthquake in California (63 people). He realized that the loss of life was largely preventable with solid construction and infrastructure. Tucker and GHI also look for solutions to the other problems that accompany a disaster. Evacuation challenges are often specific to the community or city where the natural disaster takes place, and different notification systems need to be put in place for each location. There isn't always a one-size-fits-all solution for disaster relief.

Through GHI's work, Brian Tucker has come to appreciate the value of funding for disaster prevention (or mitigation, since the disasters will still occur). According to Slate, he would like to see more international funding for safe infrastructure projects in the developing world, just as there is significant funding for vaccines in the developing world.

I've always found it interesting how many people undervalue the importance of infrastructure until there is a natural disaster. At a party recently, I was introduced to a young guy from an undisclosed country in South America. I asked him what his profession was, and he said he had a boring job; he was "just a structural engineer." I couldn't believe how much he undervalued his work. Truthfully, structural engineers save just as many lives as doctors and firemen; they just don't get any credit for it because all of their work is preventive rather than after the fact.

The Personal Information Device

The Future of Everyday Computing

The earliest version of the mobile phone was implemented in 1946. The first handheld mobile phone placed its first call in 1973. Six years later, NTT launched the very first 1G network in Japan. IBM premiered the first smartphone, the IBM Simon, in 1992, and Apple Inc. put smartphones in the hands of the general public in 2007 with the first iPhone. From there, things have only moved faster. Mobile is where information technology is going, not just for high-end luxury gadgets but for everyday personal and business computing. It's only a matter of time before the desktop and laptop disappear from our lives, replaced by 100% mobile tech that, for our purposes, I'm calling the PID (Personal Information Device). Here's how such a pocket computer might look.

The two biggest hurdles between current desktop computing and a true PID are data management and hardware interface. Neither is insurmountable, though both present some challenges. When it comes to data management, i.e. memory and processing power, we're already well on our way to solving the problem. Fairly large amounts of data can be stored in tiny spaces; it's no big deal to have tens of gigabytes of information in a palm-sized device, or in just a few flash drives. Of course, this alone can't keep up with the increasing size of programs. For that, we need only turn to cloud computing and virtual storage. There's little reason to keep personal copies of entirely digital programs in one's possession when an Internet connection provides identical functionality. The PID will be a powerful processor, but it won't need to store very much information to be as functional as a memory-loaded desktop.
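
To make that storage model concrete, here's a minimal sketch in Python of the "small local cache, cloud-backed storage" pattern described above. It's purely illustrative: the class name, the endpoint URL, and the cache policy are my own placeholders, not part of any actual PID design.

```python
# A minimal sketch (not from the article) of a thin, cloud-backed store:
# keep a small local cache on the device and fetch everything else from a
# remote endpoint on demand. The endpoint and class are hypothetical.

import urllib.request

class CloudBackedStore:
    def __init__(self, remote_base_url, cache_limit=32):
        self.remote_base_url = remote_base_url  # hypothetical cloud endpoint
        self.cache = {}                         # small on-device cache
        self.cache_limit = cache_limit

    def get(self, key):
        if key in self.cache:                   # cache hit: no network needed
            return self.cache[key]
        # Cache miss: pull the object from the cloud, then cache it locally.
        with urllib.request.urlopen(f"{self.remote_base_url}/{key}") as resp:
            data = resp.read()
        if len(self.cache) >= self.cache_limit: # evict an arbitrary older entry
            self.cache.pop(next(iter(self.cache)))
        self.cache[key] = data
        return data

# Usage (hypothetical endpoint):
# store = CloudBackedStore("https://example.com/pid-storage")
# document = store.get("notes/today.txt")
```

The point of the design is simply that the device holds only what it has touched recently; everything else lives remotely and is fetched over the network as needed.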

Hardware is a more significant issue where the interface is concerned. The big speed bumps are input devices (mouse, keyboard, stylus, etc.) and output devices (monitors, speakers, etc.). We currently have rudimentary answers to these problems, but the technology is going to have to get a lot more elegant and reliable before it can hope to replace our clunky, peripheral-heavy desktops.

First, let's consider the monitor. Even our best mobile devices today have tiny displays compared to a middle-of-the-road desktop or laptop, but that's only because the display is limited to a physical screen. The easiest answer is simple projection. Fitting a mobile device with a built-in projector, complete with size and resolution adjustment options, could eliminate the need for monitors altogether. This would require a surface to catch the projection, which is only slightly inconvenient, but it's at best a stepping stone to something a little more sci-fi, like holographic projection, which would render the need for surfaces moot. As for audio, I'd say peripheral speakers or earpieces are the most reasonable answer. No need to get fancy, especially when an increasingly mobile device implies increased public usage, where built-in speakers would be a problem.

Input is a little more difficult to overcome. Modern mobile devices have already embraced the touch screen and made it easy to use, but that won't fly for typing or other more nuanced actions. For this issue, we'll need to turn to motion capture technology. Currently experiencing growing pains, motion capture is far from ready for daily, nuanced use. We need a capture cam that is both smaller and more accurate than anything used in today's console gaming technology, but that's just a matter of time and research. A motion capture element in a PID would allow for full-sized virtual keyboards, touch screens and other interactive graphics that would render our desktop and laptop input hardware obsolete.

A PID wouldn't just be a nifty gadget. Just as the cell phone became the modern standard for communication, the PID would become the modern standard for all data transfer. We're already heading in that direction with the increasing functionality of smartphones and tablets, so it's less a question of if and more a matter of when the Personal Information Device becomes the way we live, work and connect with others.

Early Modern Humans Outnumbered Neanderthal Populations By Almost 9 to 1

The massive discrepancy in population size may help explain how modern humans displaced their earlier cousins.

As a companion piece to my earlier article on "interspecies breeding" between early modern humans and Neanderthals, a new report shows that the sheer numbers of early modern humans may have pushed out, overtaken, or destroyed much of the Neanderthal population in France, home to the densest concentration of archaeological sites for this period in hominid history. The period in question runs roughly from 45,000 to 35,000 years ago, as human populations migrated out of the African continent thousands of years after Neanderthal populations had already braved the colder northern wastes of the "mini ice age." What's unclear is how these two populations interacted. Were they peaceable? After all, there is genetic evidence to suggest that the two peoples mated (primarily in the Middle East, before migrating further). Did they fight for land, territory, or resources? Was the decline of the Neanderthals removed enough from the rise of early humans that, by the time humans arrived in Europe, their predecessors were little more than ghost stories? New evidence may offer at least a partial answer to these questions.

According to a recent report in Science, covered by Ars Technica, early humans may have significantly outnumbered their Neanderthal cousins. Based on data pulled from a dense cluster of archaeological sites in southern France, two Cambridge researchers have managed to piece together rough population extrapolations using several metrics. The results show that early Cro-Magnon and modern human populations inhabited sites sometimes more than three times the size of Neanderthal sites, and the human sites were often simply Neanderthal sites that had been reinhabited. Neanderthal sites averaged around 200 square meters, within which those early hominid populations lived and conducted most of their activities. Human encampments, on the other hand, averaged between 400 and 600 square meters, indicating a much larger number of individuals.

Additionally, the numbers of stone tools, implements, and other artifacts left by the two populations indicate a vast difference in their size and productivity. The timeline in question spans two significant ages: the Chatelperronian era, in which the last vestiges of the Neanderthals were dominant, and the Aurignacian period, which was dominated by early modern humans. Between the Chatelperronian and the Aurignacian periods, the densities of stone tools and animal remains at the sites increased dramatically, indicating that Neanderthals were vastly outnumbered by humans, possibly by a factor of 9 to 1.
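
To see how several rough proxies like these could multiply out to a figure like 9 to 1, here is a toy calculation in Python. Only the average site areas (roughly 200 versus 400 to 600 square meters) echo the article; the site-count and artifact-density ratios below are invented purely to illustrate the arithmetic and are not taken from the study.

```python
# Toy illustration only: combining independent proxies into one population ratio.
# The site-area figures echo the article; the other two ratios are hypothetical.

site_area_ratio = 500 / 200        # avg. human site area vs. avg. Neanderthal site area
site_count_ratio = 2.0             # hypothetical: twice as many human-era sites
artifact_density_ratio = 1.8       # hypothetical: denser tool and bone deposits

population_ratio = site_area_ratio * site_count_ratio * artifact_density_ratio
print(f"Rough population ratio: {population_ratio:.0f} to 1")  # prints "9 to 1"
```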

Although this does not tell us specifically how the two hominid species interacted or related as contemporaries, it does provide significant evidence that humans were a much more productive and successful population than Neanderthals, and that by sheer dint of numbers the Neanderthals may have been dislocated, or possibly forcefully removed, from this region of Europe. Despite earlier evidence that there had been interbreeding between the two species of hominid, there is no evidence that the two groups ever cohabited. Could the mating merely have been a byproduct of war, as Neanderthal females were raped by marauding humans while being driven from their territories? Could there have been instances where Neanderthal and human populations did live symbiotically and peaceably? Could it be that Neanderthals were simply bred out of existence, assimilated into the human genetic line? Likely it was a combination of all of the above, but only further study (and a good dose of imagination) will ever clarify this critical period in human history.

Scientists Find Evidence that Humans and Neanderthals Mated Outside of Africa

Recent evidence shows that humans outside the African continent mated with Neanderthals.


It's long been known that Neanderthals and early Cro-Magnon man coexisted, but it has never been established how or why Neanderthals went extinct well before the advent of civilization. Some theories include interspecies warfare, or even interbreeding that eventually "bred out" the Neanderthal line. In fact, author Jean Auel and others have made careers of postulating what the relationship between early humans and Neanderthals may have looked like. Michael Crichton's novel Eaters of the Dead even reimagined the story of Beowulf, with its hero fighting not dragons or monsters but a race of mountain-dwelling Neanderthals that cannibalized the local human population. According to recent findings from pediatrics researchers at the University of Montreal, as reported on Wired Science, there is strong evidence of what that relationship may have actually looked like, and it seems Neanderthals and early humans were close... very close.

For the findings, published in Molecular Biology and Evolution, the research team analyzed 6,092 X chromosomes from individuals on "all inhabited continents" to look for genetic markers, or segments of the X chromosome common along genetic lines. They found that a significant share, nine percent of the contemporary individuals analyzed, carried a Neanderthal-derived segment of the X chromosome. The official report explains, "this confirms earlier hypotheses that early modern man and Neanderthals mixed and mated."

Neanderthals and modern man both originally evolved on the African continent, eventually migrating out to populate much of the world. However, Neanderthals left Africa 200,000 to 400,000 years ago and had disappeared from the face of the Earth by around 30,000 BC, while early modern man migrated out between 80,000 BC and 50,000 BC. That leaves a significant span of time in which early humans and Neanderthals were contemporaries, and their populations certainly maintained some form of contact with one another.

Furthermore, an earlier study explains how the team was able to identify the particular segment of the X chromosome, called a haplotype, as being derived from Neanderthals. The Neanderthal genome was sequenced in 2010 and quickly compared to the human genome, with which roughly 6,000 genes are held in common. Among those was a particular haplotype on the X chromosome that was identical. This sequence was found in men on all inhabited continents with the exception of sub-Saharan Africa. The researchers were also able to pinpoint where the interspecies mating most likely occurred. According to Discovery News, "The team believes most, if not all, of the interbreeding took place in the Middle East, while modern humans were migrating out of Africa and spreading to other regions."

Though evolution remains a controversial topic among small segments of the population, these recent findings have powerful implications for the interconnectedness of human populations. Derogatory remarks that someone is a "caveman" or a "Neanderthal," the stuff of popular Geico commercial campaigns, have less punch when it's revealed that every population maintains some connection to our predecessors. Races, ethnicities, and populations all over the world seem to share the same genetic markers. Ultimately, even those populations that may have intermixed with an aberrant branch of the human lineage have spread throughout the world and evolved into societies as culturally different, but genetically similar, as any other.


Dr. Aubrey de Grey on Longevity

Will humans live for hundreds of years?

Dr. Aubrey de Grey is working on ways that he and other transhumanists believe would allow humans to live for hundreds of years. While Dr. de Grey's ideas might seem like something more likely to be found in a Tom Robbins or Philip K. Dick novel, his research suggests that the seven biochemical effects of aging can be reversed in humans.

Dr. de Grey was recently interviewed by H+ Magazine, which questioned the scientist about common criticisms of his ideas. Specifically, the interviewer asked Dr. de Grey about overpopulation, whether people would actually want to live for centuries, and the morality of the decision to live for an extended period of time. In addition, he questioned Dr. de Grey about the pros and cons of specific medical advances in the area of longevity.

Dr. de Grey didn't back down from his stance on transhumanism or from his plan for extended lifespans, "Strategies for Engineered Negligible Senescence" (SENS). In response to the question about population, he argued that overpopulation could be managed in a variety of ways, including less procreation and future technological advances that might help manage a crowded planet more effectively.

On the morality issues surrounding SENS, Dr. de Grey also remained firm in his thinking. He answered that those in developing nations would be unlikely to be affected by more Westerners living long lives, and that God, if he could, would offer the option of an extended lifetime.

Dr. de Grey argues that SENS will rejuvenate people from the inside out, so that they will look younger than they actually are. The battery of treatments that de Grey envisions includes enzyme therapy and somatic gene therapy, which alters the DNA in a person's body cells without changing the genes passed on to the next generation. Some believe that somatic gene therapy could entirely wipe out cancer.

He believes that SENS is on the horizon, but because a majority of the therapies won't be available for another twenty-five to fifty years, it may not come soon enough for some people.

Some of the ideas that Dr. de Grey and the H+ interviewer mention for tacking extra years onto life expectancy are strange. For example, Dr. de Grey mentions long-term caloric restriction as one possible way to extend a person's lifetime, because it has worked in experiments with other animals.

Physicists Propose Space-Time Cloak

Fiber optics may hold the key to masking events

I love it when a headline straight out of a science fiction movie shows up on a website like National Geographic. Physicists at Imperial College London are now playing with the idea of space-time invisibility cloaks: cloaks that mask not just an object but an entire event from outside observers.

Can you imagine? A bank heist could take place right under all the security in the world, completely invisible, noticed only when a few million dollars mysteriously go missing. There would be no observable trace of the robbers or their misdeeds. By the time the bank figured out what it was missing, the crafty wielders of physics would be driving away with piles of cash. It sounds like the foundation for a new futuristic crime show.

The cloak would work by slowing down light around a particular location. Once the event was finished, it would speed the light back up. You'd essentially be bending space-time around a particular event. I think. My knowledge of high-level physics is generally limited to what I've gleaned from Mass Effect and episodes of Battlestar Galactica. I do know that this kind of cloak works differently from the invisibility cloaks that have been proven to work at very small scales. Those simply bend light around an object. But bending light and bending time are two very different games. These English scientists seem to think both are possible. 
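
As a rough back-of-the-envelope illustration (mine, not from the paper): if the light ahead of an event passes through a length L of fiber at group velocity v_fast while the light behind it travels at v_slow, the dark interval opened up for the hidden event is roughly

\[ \Delta t \approx L\left(\frac{1}{v_{\text{slow}}} - \frac{1}{v_{\text{fast}}}\right). \]

For a meter of fiber and group velocities differing by a few percent near the speed of light, that works out to roughly a tenth of a nanosecond, which fits the article's point that only vanishingly brief events could be hidden.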

Fiber optic technology might be manipulated in such a way as to construct the materials for a space-time cloak. But you'd need an extremely powerful laser for it to work for even the tiniest fraction of a second; normal daylight just wouldn't cut it. So unless you're robbing banks in a world lit by lasers, where you can crack a safe in a femtosecond, your perfect sci-fi getaway is still a long way from working.

Personality tests encourage and categorize identical behaviors


I have had to take the Myers-Briggs Type Indicator (MBTI) several times in my life. My personality apparently changed from high school to college (I became more extroverted as I emerged from teenaged angst, perhaps?), so the solidity of the test's outcome crumbled for me then. My college-aged personality indicated that I would be a CEO, an unlikely career for someone who hates business politics and power plays.

Every time I took the test, I felt like I was reading a fortune cookie. The type descriptions described me (and, with slightly different adjectives, everyone else I knew) as warmhearted, outgoing and friendly, and said that I wanted to make work fun! Each personality type's description is vague and non-threatening, crafted, it seems, to fit everyone fairly well and to make each person think that his or her personality type is best. I remember bonding with my fellow (exceedingly rare) INTJs in high school, but what would we have in common now that I have changed?

Most people take the MBTI more seriously than I do, sincerely using it to pick careers and shape social interactions. What is it about this short questionnaire that makes people think the Type Indicator has them pegged? And, perhaps more frightening, what makes people want to get pegged in the first place?

Swiss psychiatrist Carl Jung introduced the idea of psychological types in the 1920s. Isabel Briggs Myers continued his research in the 1940s and 1950s, and the first version of the indicator was published in 1962. The MBTI is said to be valid and reliable: psychologists say that it both measures what it claims to measure and produces the same result each time the tester takes the assessment.

The MBTI result is divided into four letters that represent different parts of a person's personality: the flow of energy (extroverted or introverted), how a person takes in information (sensing or intuiting), the way a person prefers to make decisions (thinking or feeling), and lifestyle preference (judging or perceiving). Like Jung's original model, the MBTI places each of these parts on a scale, so that a person is dominant in one aspect but also has aspects of the others in his or her personality.
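
Just to make that structure concrete, here's a small sketch in Python, with invented scores and thresholds rather than the official MBTI scoring, of how four scaled preferences collapse into a four-letter type like the INTJ mentioned above.

```python
# A minimal sketch (not the official MBTI scoring) of how four scaled
# preferences combine into a four-letter type. Scores run from -1.0 to 1.0:
# negative leans toward the first letter of each pair, positive toward the
# second. The field names and thresholds here are invented for illustration.

from dataclasses import dataclass

@dataclass
class PreferenceScores:
    energy: float        # -1.0 = Extroverted (E) ... +1.0 = Introverted (I)
    information: float   # -1.0 = Sensing (S)     ... +1.0 = Intuiting (N)
    decisions: float     # -1.0 = Thinking (T)    ... +1.0 = Feeling (F)
    lifestyle: float     # -1.0 = Judging (J)     ... +1.0 = Perceiving (P)

def four_letter_type(scores: PreferenceScores) -> str:
    pairs = [
        ("E", "I", scores.energy),
        ("S", "N", scores.information),
        ("T", "F", scores.decisions),
        ("J", "P", scores.lifestyle),
    ]
    # Each dimension is a scale: the sign picks the dominant letter, while the
    # magnitude (unused here) would show how strong the lean is.
    return "".join(first if value < 0 else second for first, second, value in pairs)

print(four_letter_type(PreferenceScores(0.6, 0.4, -0.2, -0.7)))  # -> "INTJ"
```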

The MBTI website is quick to note that all personality types are equal. Separate but equal, apparently, since much of the accompanying material notes which jobs suit which types. Only four types are destined to be the business-savvy Trumps and Gateses out there.

To me, the saddest part about this test, and about psychology in general, is its striving to take the randomness and individuality out of people. According to the MBTI website, “The essence of the theory is that much seemingly random variation in the behavior is actually quite orderly and consistent, being due to basic differences in the ways individuals prefer to use their perception and judgment.” In this way, the MBTI tells people that they will behave a certain way (it is their personality destiny) and that everyone else will behave in ways they can predict and understand. Who wants to predict and understand everything?

Personality tests take the uniqueness out of people, along with the excitement that comes from believing there is spontaneity in every person. They both predict and encourage people to become cookie cutouts of their alleged INTP, ENTJ or ISFP selves.
