A team of international researchers has set a new data transfer record: a memory-to-memory rate of 186 gigabits per second. The team cranked up the network at the SuperComputing 2011 conference in Seattle in mid-November, transferring data in opposite directions over a wide-area network circuit. The rate is equivalent to moving two million gigabytes per day, fast enough to transfer nearly 100,000 full Blu-ray discs (each with a complete movie and all the extras) every day. The researchers reached transfer rates of 98 gigabits per second between the University of Victoria Computing Centre in Victoria, British Columbia, and the Washington State Convention Center in Seattle. Coupled with a simultaneous rate of 88 Gbps in the opposite direction, the team reached an astounding two-way data rate of 186 Gbps, breaking their own previous peak-rate record of 119 Gbps set in 2009.
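The headline figures are easy to sanity-check. A quick back-of-the-envelope calculation (the assumptions here, a full 24 hours at 186 Gb/s and a 25 GB single-layer Blu-ray disc, are not stated in the article) lands close to the quoted numbers:

```python
# Back-of-the-envelope check of the record's arithmetic.
# Assumptions (ours, not the article's): 186 Gb/s sustained for
# 24 hours, 25 GB per single-layer Blu-ray disc.
rate_gbps = 186                     # gigabits per second
seconds_per_day = 24 * 60 * 60      # 86,400 seconds

gigabytes_per_day = rate_gbps * seconds_per_day / 8   # 8 bits per byte
blu_rays_per_day = gigabytes_per_day / 25             # 25 GB per disc

print(f"{gigabytes_per_day:,.0f} GB/day")    # 2,008,800 GB/day
print(f"{blu_rays_per_day:,.0f} discs/day")  # 80,352 discs/day
```

Roughly two million gigabytes per day checks out exactly; "nearly 100,000" Blu-rays is on the generous side for single-layer discs, closer if some capacity is left unfilled.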
Optical fiber cable wiggles its way between continents, beneath oceans, and under streets: hundreds of thousands of kilometers of acrylate-polymer- or polyimide-clad silica cable, allowing nearly lossless communication. With less data loss and higher bandwidth, optical-fiber technology lets information bounce freely around the globe, bringing pictures, video, and other data from every corner of the planet to your computer in a split second. Although optical fibers and their clever photonic relay of information are increasingly replacing copper wire and its inefficient electron-based data transmission in the communications world, today's computer technology still relies entirely on electron-based CPU microprocessing.
Artificial intelligence (AI) has been the inspiration for some of our favorite books, movies, and games, as well as the aspiration of countless scientists and engineers. Researchers at the California Institute of Technology (Caltech) have now taken a major step toward creating artificial intelligence—not in a robot or a silicon chip, but in a test tube. The researchers are the first to have made an artificial neural network out of DNA, creating a circuit of interacting molecules that can recall memories based on incomplete patterns, just as a brain can.
One of the things the human brain excels at is recognizing what things are, even when presented with an incomplete data set. In a first-time-ever moment, boffins at the California Institute of Technology (Caltech) have created a DNA-based artificial neural network that can do the same thing. They believe it could have huge implications for the development of true artificial intelligence.
Caltech's neural network is made up of just four artificial neurons, as opposed to the human brain's 100 billion real ones…
“The brain is incredible,” says Lulu Qian, a Caltech senior postdoctoral scholar in bioengineering and lead author on the paper describing this work, published in the July 21 issue of the journal Nature. “It allows us to recognize patterns of events, form memories, make decisions, and take actions. So we asked, instead of having a physically connected network of neural cells, can a soup of interacting molecules exhibit brainlike behavior?”
The answer, as the researchers show, is yes.
The researchers' neural network, consisting of four artificial neurons made from 112 distinct DNA strands, plays a mind-reading game in which it tries to identify a mystery scientist. The researchers “trained” the neural network to “know” four scientists, whose identities are each represented by a specific, unique set of answers to four yes-or-no questions, such as whether the scientist was British.
After thinking of a scientist, a human player provides an incomplete subset of answers that partially identifies the scientist. The player then conveys those clues to the network by dropping DNA strands that correspond to those answers into the test tube. Communicating via fluorescent signals, the network then identifies which scientist the player has in mind. Or, the network can “say” that it has insufficient information to pick just one of the scientists in its memory or that the clues contradict what it has remembered. The researchers played this game with the network using 27 different ways of answering the questions (out of 81 total combinations), and it responded correctly each time.
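A toy software analogue makes the game concrete. The 81 total combinations come from three possibilities per question: yes, no, or unanswered (3⁴ = 81). The scientists and their answer patterns below are invented for illustration, and the real network computes with DNA strand concentrations and fluorescent readouts, not a lookup over bit patterns:

```python
# Toy analogue of the DNA network's mind-reading game.
# Scientists and their 4-bit answer patterns are invented here.
SCIENTISTS = {
    "Rosalind Franklin":      (True, False, True, False),
    "Alan Turing":            (True, True, False, False),
    "Richard Feynman":        (False, False, True, True),
    "Santiago Ramon y Cajal": (False, True, False, True),
}

def identify(clues):
    """clues: 4-tuple of True/False/None (None = question unanswered)."""
    matches = [
        name for name, pattern in SCIENTISTS.items()
        if all(c is None or c == p for c, p in zip(clues, pattern))
    ]
    if not matches:
        return "clues contradict every stored memory"
    if len(matches) > 1:
        return "insufficient information"
    return matches[0]

print(identify((True, None, True, None)))    # Rosalind Franklin
print(identify((None, None, None, None)))    # insufficient information
print(identify((True, True, True, True)))    # contradicts every memory
```

Like the real network, the toy version can name a scientist, admit it lacks information, or report that the clues fit none of its memories.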
This DNA-based neural network demonstrates the ability to take an incomplete pattern and figure out what it might represent—one of the brain’s unique features. “What we are good at is recognizing things,” says coauthor Jehoshua “Shuki” Bruck, the Gordon and Betty Moore Professor of Computation and Neural Systems and Electrical Engineering. “We can recognize things based on looking only at a subset of features.” The DNA neural network does just that, albeit in a rudimentary way.
Biochemical systems with artificial intelligence—or at least some basic, decision-making capabilities—could have powerful applications in medicine, chemistry, and biological research, the researchers say. In the future, such systems could operate within cells, helping to answer fundamental biological questions or diagnose a disease. Biochemical processes that can intelligently respond to the presence of other molecules could allow engineers to produce increasingly complex chemicals or build new kinds of structures, molecule by molecule.
“Although brainlike behaviors within artificial biochemical systems have been hypothesized for decades,” Qian says, “they appeared to be very difficult to realize.”
The researchers based their biochemical neural network on a simple model of a neuron, called a linear threshold function. The model neuron receives input signals, multiplies each by a positive or negative weight, and fires, producing an output, only if the weighted sum of the inputs surpasses a certain threshold. This model is an oversimplification of real neurons, says paper coauthor Erik Winfree, professor of computer science, computation and neural systems, and bioengineering. Nevertheless, it’s a good one. “It has been an extremely productive model for exploring how the collective behavior of many simple computational elements can lead to brainlike behaviors, such as associative recall and pattern completion.”
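The linear threshold model fits in a few lines of code. The weights and threshold below are arbitrary illustrative values, not parameters from the paper:

```python
# A minimal linear threshold neuron: it fires (outputs 1) only when
# the weighted sum of its inputs reaches the threshold.
def threshold_neuron(inputs, weights, threshold):
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum >= threshold else 0

# Two excitatory (positive) weights and one inhibitory (negative) one:
weights = [2, 1, -1]
print(threshold_neuron([1, 1, 0], weights, threshold=3))  # 1: 2 + 1 = 3, fires
print(threshold_neuron([1, 0, 1], weights, threshold=3))  # 0: 2 - 1 = 1, silent
```

The negative weight shows why inhibition matters: an active third input can veto a neuron that would otherwise fire.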
To build the DNA neural network, the researchers used a process called a strand-displacement cascade. Previously, the team developed this technique to create the largest and most complex DNA circuit yet, one that computes square roots.
This method uses single and partially double-stranded DNA molecules. The latter are double helices, one strand of which sticks out like a tail. While floating around in a water solution, a single strand can run into a partially double-stranded one, and if their bases (the letters in the DNA sequence) are complementary, the single strand will grab the double strand’s tail and bind, kicking off the other strand of the double helix. The single strand thus acts as an input while the displaced strand acts as an output, which can then interact with other molecules.
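The displacement step can be caricatured in code. This is a crude software cartoon of the mechanism, not the chemistry: the sequences are invented, binding is modeled as exact reverse-complement matching, and real reaction kinetics are ignored:

```python
# Toy model of toehold-mediated strand displacement. A "gate" holds an
# output strand hybridized to a template whose toehold sticks out like
# a tail; an input strand complementary to the whole template grabs the
# toehold and kicks the output strand free. All sequences are invented.
def complement(seq):
    """Reverse complement of a DNA sequence."""
    return seq.translate(str.maketrans("ATCG", "TAGC"))[::-1]

class Gate:
    def __init__(self, toehold, domain, output):
        self.template = toehold + domain  # exposed toehold + duplex domain
        self.output = output              # strand currently bound to the domain
        self.spent = False

    def react(self, input_strand):
        """Return the released output strand, or None if no reaction."""
        if not self.spent and input_strand == complement(self.template):
            self.spent = True             # single-use, like the real molecules
            return self.output
        return None

gate = Gate(toehold="ATTGC", domain="CGGATT", output="signal-strand")
trigger = complement("ATTGCCGGATT")
print(gate.react(trigger))  # releases "signal-strand"
print(gate.react(trigger))  # None: the gate is already consumed
```

The `spent` flag mirrors a real limitation discussed below: each gate molecule is consumed by its reaction, which is why the test-tube game can only be played once per batch of strands.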
Because they can synthesize DNA strands with whatever base sequences they want, the researchers can program these interactions to behave like a network of model neurons. By tuning the concentrations of every DNA strand in the network, the researchers can teach it to remember the unique patterns of yes-or-no answers that belong to each of the four scientists. Unlike with some artificial neural networks that can directly learn from examples, the researchers used computer simulations to determine the molecular concentration levels needed to implant memories into the DNA neural network.
While this proof-of-principle experiment shows the promise of creating DNA-based networks that can—in essence—think, this neural network is limited, the researchers say. The human brain consists of 100 billion neurons, but creating a network with just 40 of these DNA-based neurons—ten times larger than the demonstrated network—would be a challenge, according to the researchers. Furthermore, the system is slow; the test-tube network took eight hours to identify each mystery scientist. The molecules are also used up—unable to detach and pair up with a different strand of DNA—after completing their task, so the game can only be played once. Perhaps in the future, a biochemical neural network could learn to improve its performance after many repeated games, or learn new memories from encountering new situations. Creating biochemical neural networks that operate inside the body—or even just inside a cell on a Petri dish—is also a long way away, since making this technology work in vivo poses an entirely different set of challenges.
Beyond technological challenges, engineering these systems could also provide indirect insight into the evolution of intelligence. “Before the brain evolved, single-celled organisms were also capable of processing information, making decisions, and acting in response to their environment,” Qian explains. The source of such complex behaviors must have been a network of molecules floating around in the cell. “Perhaps the highly evolved brain and the limited form of intelligence seen in single cells share a similar computational model that’s just programmed in different substrates.”
“Our paper can be interpreted as a simple demonstration of neural-computing principles at the molecular and intracellular levels,” Bruck adds. “One possible interpretation is that perhaps these principles are universal in biological information processing.”
The research described in the Nature paper, “Neural network computation with DNA strand displacement cascades,” is supported by a National Science Foundation grant to the Molecular Programming Project and by the Human Frontiers Science Program.
First published in the journal Nature.
Metallic glasses: the new wonder material for a world pushing the boundaries. When titanium is no longer strong enough, where do you turn? BMGs (bulk metallic glasses) are 2.5 times stronger than titanium, but with the production costs of plastic. Cheap, tough, and very quick to form. Could metallic glass marbles be the toy of the future?
Metallic glass was first discovered by Klement, Willens, and Duwez at Caltech (the California Institute of Technology) in the 1960s, but commercially viable production wouldn't arrive until the early '90s, when William L. Johnson and his team developed a workable process. Liquidmetal Technologies was incorporated in 2003 to manage the patents and products, including Vitreloy, the first commercially available metallic glass.
The findings of their latest research were published in the May 13, 2011 issue of the journal Science, establishing a new, far more efficient production process. The team was able to heat small alloy slugs (blanks) at a rate of a million degrees a second. Once heated, the metal is injection-moulded and cooled. Try doing all of that in 40 milliseconds. Sadly it doesn't pop out of the mould transparent like glass; it isn't transparent metal, as the name may imply. The "glass" in the name refers to the metal having the internal structure of glass (an amorphous, random layout of the metal's atoms), not its transparency. This crazy internal structure is the metal's secret.
This next generation of BMG production is all about speed. The alloy rod is heated to 600 degrees Celsius in a millisecond, then injected into a mould and cooled, with the whole process complete in 40 milliseconds. To heat the alloy rod fast enough, ohmic heating was used: firing a short, intense pulse of electrical current into the object to be heated. Shock therapy for steel. The whole process is over so fast the metal doesn't have time to form crystal bonds; it's still sitting there going "what the hell was that?" The new process is called Rapid Discharge Forming (patents pending) and will be commercialized through Glassimetal Technology, a company formed in 2010 by Johnson and his Caltech alumni. The new process should be able to produce even more perfectly imperfect parts than is currently possible, since the sheer speed of the process eliminates any order.
Metallic glass starts off its life as a fairly standard slug of metal alloy. Various alloys have been tested over the years; some recipes increase strength, some affect electrical properties. Vitreloy's ingredient list includes zirconium and titanium as its main components. The same recipe rules that apply to producing any metal apply here just the same. In a sense it's a new style of forging alloys into products, a new kind of casting: extremely rapid heating, moulding, and cooling.
Metallic glass's biggest enemies are crystals and time. As ordinary metals slowly cool, the bonds between atoms re-form into orderly, crystal-like structures; that is what they naturally want to do, and until recently it was accepted in metallurgy as the strongest way to make metals. The Japanese samurai sword makers were the masters of this, with their cycles of heating, folding, and cooling while making a sword. Metallic glass takes the opposite approach: it works by making a random structure without any straight lines or points of weakness. Vitreloy, one metallic glass product made by Liquidmetal Technologies, is twice as strong as titanium. It's tough stuff, but still considered previous-generation, one step behind what is discussed in Johnson's latest research papers.
“We’ve taken the economics of plastic manufacturing and applied it to a metal with superior engineering properties,” Johnson says. “We end up with inexpensive, high-performance, precision net-shape parts made in the same way plastic parts are made—but made of a metal that’s 20 times stronger and stiffer than plastic.”
Is there room in our fast-paced world for another wonder material? Is metallic glass destined to be the next Ginsu knife sold on Danoz Direct, or is it a game changer? The aerospace and commercial aircraft industries will be all over this technology like a rash; more strength and less weight is their mantra. It may take 10 or 20 years to become established in other areas like manufacturing, but BMGs will be huge. Now, has anyone seen my transparent aluminium?