August 16, 2014

Quantum Machine Learning Google Tech Talk

Seth Lloyd visited the Quantum AI Lab at Google LA to give a tech talk on "Quantum Machine Learning." This talk took place on January 29, 2014.

Speaker Info:

Seth Lloyd is one of the pioneers of quantum information science, with several seminal contributions to quantum computing, quantum communication, and quantum control. He developed the first quantum algorithms for efficient simulation of many-body systems at the quantum scale. He also introduced the first realizable model for quantum computation and is working with a variety of groups to construct and operate quantum computers and quantum communication systems. Dr. Lloyd is the author of over one hundred and fifty scientific papers and of 'Programming the Universe' (Knopf, 2004). He is currently professor of quantum-mechanical engineering at MIT.


Machine learning algorithms find patterns in big data sets. This talk presents quantum machine learning algorithms that give exponential speed-ups over their best existing classical counterparts. The algorithms work by mapping the data set into a quantum state (big quantum data) that contains the data in quantum superposition. Quantum coherence is then used to reveal patterns in the data. The quantum algorithms scale as the logarithm of the size of the database.

At the bottom of quantum mechanics are vectors: quantum states are vectors in a Hilbert space, which is why linear-algebra-heavy machine learning maps so naturally onto quantum algorithms.

We should support moderate forces who can bring stability to America

Vox has an amusing article about how the US media would cover Ferguson, Missouri, if it happened in another country.

The crisis began a week ago in Ferguson, a remote Missouri village that has been a hotbed of sectarian tension. State security forces shot and killed an unarmed man, which regional analysts say has angered the local population by surfacing deep-seated sectarian grievances. Regime security forces cracked down brutally on largely peaceful protests, worsening the crisis.
We should support moderate forces who can bring stability to America.

Read the whole thing.

In real news on this situation

Missouri governor Jay Nixon has imposed curfew in Ferguson and declared a state of emergency.

Wikipedia summary - The shooting of Michael Brown occurred on August 9, 2014, in Ferguson, Missouri, United States. Brown was an unarmed 18-year-old African-American male who died after being shot multiple times by Ferguson police officer Darren Wilson. Brown had no history of arrests or criminal convictions. According to Ferguson police, Brown was a suspect in a robbery minutes before the shooting, although the initial contact between Wilson and Brown was unrelated to the robbery. Wilson had served four years with the Ferguson Police Department and two years with another local police department, and had no disciplinary history.

A huge advance in optical telescope interferometry could be developing that would enable imaging of exoplanets

Centauri Dreams reports on what may be a big step towards one of the greatest astronomical instrument breakthroughs since the invention of the telescope. This could be a genuine advance in interferometry.

The longest optical telescope baselines are now 437 meters. The researchers are proposing changes to get to 7 kilometer baselines. Longer baselines (putting telescopes further apart) can enable higher resolution if the telescopes can still work together.

An interferometer essentially combines the light of several different telescopes, all in the same phase, so it adds together “constructively” or coherently, to create an image via a rather complex mathematical process called a Fourier transform (no need to go into detail but suffice to say it works). We wind up with detail or angular resolution equivalent to the distance between the two telescopes. In other words, it’s like having a single telescope with an aperture equivalent to the distance, or “baseline” between the two. If you combine several telescopes, this creates more baselines which in effect help fill in more detail to the virtual single telescope’s “diluted aperture”. The equation for baseline number is n(n-1)/2, where n is the number of telescopes. If you have 30 telescopes this gives an impressive 435 baselines with angular resolution orders of magnitude beyond the biggest singular telescope. So far so easy? Wrong.
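The pair-counting formula in the quoted passage is easy to check directly: every pair of telescopes contributes one baseline.

```python
# Number of unique telescope-pair baselines in an interferometer array.
# Each pair of telescopes contributes one baseline, so for n telescopes
# the count is "n choose 2" = n*(n-1)/2.
def baseline_count(n: int) -> int:
    return n * (n - 1) // 2

print(baseline_count(2))   # 1: the classic two-telescope interferometer
print(baseline_count(30))  # 435: the figure quoted above
```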

The problem is the coherent mixing of the individual wavelengths of light. It must be accurate to a tiny fraction of a wavelength, which for optical light is a few billionths of a metre. Worse still, how do you arrange for light, each signal at a slightly different phase, to be mixed from telescopes a large distance apart?

The Advance in Interferometry

What the researchers are advocating is heterodyne interferometry, an old-fashioned idea, again like interferometry itself. Basically it involves creating an electrical impulse as near in frequency as possible to the one entering the telescope, and then mixing it with the incoming light to produce an “intermediate frequency” signal. This signal still holds the phase information of the incoming light but in a stable electrical proxy that can be converted to the original source light and mixed with light from other telescopes in the interferometer to create an image. This avoids most of the complex light-losing “optical train”.

Arxiv - A Dispersed Heterodyne Design for the Planet Formation Imager

The Planet Formation Imager (PFI) is a future world facility that will image the process of planetary formation. It will have an angular resolution and sensitivity sufficient to resolve sub-Hill sphere structures around newly formed giant planets orbiting solar-type stars in nearby star formation regions. We present one concept for this design consisting of twenty-seven or more 4m telescopes with kilometric baselines feeding a mid-infrared spectrograph where starlight is mixed with a frequency-comb laser. Fringe tracking will be undertaken in H-band using a fiber-fed direct detection interferometer, meaning that all beam transport is done by communications band fibers. Although heterodyne interferometry typically has lower signal-to-noise than direct detection interferometry, it has an advantage for imaging fields of view with many resolution elements, because the signal in direct detection has to be split many ways while the signal in heterodyne interferometry can be amplified prior to combining every baseline pair. We compare the performance and cost envelope of this design to a comparable direct-detection design.

Megaprojects are often needed for transformational impact but too many have budget and schedule overruns

A new report by EY finds that 64% of multibillion-dollar, technically and operationally demanding oil and gas megaprojects continue to exceed budgets, with 73% missing project schedule deadlines. On average, current project estimated completion costs were 59% above the initial estimate. In absolute terms, the cumulative cost of the projects reviewed for the report has increased to $1.7 trillion from an original estimate of $1.2 trillion, an incremental increase of $500 billion.

KPMG also has a lengthy look at world megaprojects

Megaprojects are needed to have ‘transformational’ impact. Ask anyone involved in megaproject delivery what their greatest legacy is, and they’ll probably point to a slew of social and economic benefits that they have unleashed through their megaprojects.

In Canada, there are currently [2013] more than 175 megaprojects either on the books or under construction across a multitude of sectors. In total, these projects represent around USD420 billion worth of investment. Topping the list of sectors being developed is power and utilities which has nearly 50 projects underway with a total price-tag of around USD170 billion.

Carnival of Space 367

The Carnival of Space 367 is up at Everyday Spacer

NASA’s Nuclear Spectroscopic Telescope Array (NuSTAR) has captured a spectacular event: a supermassive black hole’s gravity tugging on nearby X-ray light.

In just a matter of days, the corona — a cloud of particles traveling near the speed of light — fell in toward the black hole. The observations are a powerful test of Einstein’s theory of general relativity, which says gravity can bend space-time, the fabric that shapes our universe, and the light that travels through it.

Supermassive black holes are thought to reside in the centers of all galaxies. Some are more massive and rotate faster than others. The black hole in this new study, referred to as Markarian 335, or Mrk 335, is about 324 million light-years from Earth in the direction of the Pegasus constellation. It is one of the most extreme of the systems for which the mass and spin rate have ever been measured. The black hole squeezes about 10 million times the mass of our sun into a region only 30 times the diameter of the sun, and it spins so rapidly that space and time are dragged around with it.

Even though some light falls into a supermassive black hole never to be seen again, other high-energy light emanates from both the corona and the surrounding accretion disk of superheated material. Though astronomers are uncertain of the shape and temperature of coronas, they know that they contain particles that move close to the speed of light.

NASA's Swift satellite has monitored Mrk 335 for years, and recently noted a dramatic change in its X-ray brightness. In what is called a target-of-opportunity observation, NuSTAR was redirected to take a look at high-energy X-rays from this source in the range of 3 to 79 kiloelectron volts. This particular energy range offers astronomers a detailed look at what is happening near the event horizon, the region around a black hole from which light can no longer escape gravity's grasp.

Some Chinese Cities shifting from GDP to anti-pollution and poverty reduction for performance measurement

[Financial Times] More than 70 Chinese smaller cities and counties have dropped gross domestic product as a performance metric for government officials, in an effort to shift the focus to environmental protection and reducing poverty. The move, which follows a directive issued by top leaders last year, is among the first concrete signs of China switching its blind pursuit of economic growth at all costs towards measures that encourage better quality of life.

Icarus Interstellar has the X-Physics Propulsion & Power Project (XP4) and Mark Rademaker's visuals based on the technical work of Sonny White

People like Sonny White, Richard Obousy, Jeff Lee, and Miguel Alcubierre are the principals of advanced propulsion physics. Icarus Interstellar has the X-Physics Propulsion and Power Project (XP4). The XP4 Group seeks to do visionary and scientifically rigorous research to ascertain how the known limits of propulsion and power physics for space exploration can be surpassed, and to implement technology breakthroughs that can be approached using credible experimentation and engineering solutions.

The Mission Statement

Toward the goal of achieving interstellar travel on manageable human time scales, the XP4 Group focuses on the study and advancement of long term propulsion and power concepts, which are at Technology Readiness Levels 1 and 2. This will be realized by: conducting leading-edge, world-class research in foundational theoretical physics and subsequent peer reviewed publications; lecturing at symposia; writing popular press articles and giving interviews; recruiting exceptional researchers and students; and promoting the Icarus mission of interstellar flight through volunteer opportunities and educational outreach, which would increase space literacy among students, teachers, and the general public throughout the United States and beyond.

The research areas of the XP4 Group include, but are not limited to:

* Faster than Light (FTL) Spacetime Geometries (i.e. warp drives and wormholes) and Associated Energies (e.g. Casimir vacuum energy).
* Matter/Antimatter (MAM) production.
* Primordial to Near-Planck-Scale Black Holes.
* Relativistic Thermodynamics and Radiation.
* Space Drives and Gravity Control.
* Sakharov’s emergent spacetime/gravity.
* Paradigm changes in physics that define new breakthrough propulsion physics.

Mark Rademaker created the IXS-Enterprise (IXS-110). His artwork was guided by the technical work toward detecting warped spacetimes by Sonny White, a NASA and Icarus XP4 researcher. The design is thus more than just an imagining, as Sonny explained in his speech at SpaceVision 2013.

Mark Rademaker has at least eight renderings of IXS-110.

Project Voyager – Design a 2d/3d Interstellar Trajectory and Mission Planning Tool

Icarus Interstellar’s latest research project is Project Voyager. Voyager will be led by Project Leader Zach Fejes, who has built a team of engineers, physicists, and coders, with the objective of designing a 2d/3d interstellar trajectory and mission planning tool from the ground up.

Icarus Interstellar is a nonprofit foundation dedicated to achieving interstellar flight by the year 2100. The organization was founded in 2011 and received its tax-exempt status in January 2013. The organization grew out of Project Icarus, a five-year design study for a fusion-powered starship that began on September 30, 2009, launched jointly by the British Interplanetary Society and the Tau Zero Foundation.

The Toronto-based team of 14 researchers has already settled in and is working on Euler vs. fourth-order Runge-Kutta approximations of the inner solar system. Here’s a very early screen of the 2d and 3d visualizer (which uses Unity).
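The Euler-versus-Runge-Kutta comparison the team is running can be illustrated with a minimal sketch (this is illustrative Python, not Project Voyager's code): integrate a circular two-body orbit for one period and compare how far each integrator drifts from the true radius.

```python
import math

GM = 1.0  # gravitational parameter, normalized units

def accel(x, y):
    # Newtonian point-mass gravity: a = -GM * r / |r|^3
    r3 = (x * x + y * y) ** 1.5
    return -GM * x / r3, -GM * y / r3

def euler_step(s, dt):
    # Forward Euler: first-order, systematically spirals outward on orbits.
    x, y, vx, vy = s
    ax, ay = accel(x, y)
    return (x + vx * dt, y + vy * dt, vx + ax * dt, vy + ay * dt)

def rk4_step(s, dt):
    # Classic fourth-order Runge-Kutta.
    def deriv(s):
        x, y, vx, vy = s
        ax, ay = accel(x, y)
        return (vx, vy, ax, ay)
    def add(s, k, h):
        return tuple(si + ki * h for si, ki in zip(s, k))
    k1 = deriv(s)
    k2 = deriv(add(s, k1, dt / 2))
    k3 = deriv(add(s, k2, dt / 2))
    k4 = deriv(add(s, k3, dt))
    return tuple(si + dt / 6 * (a + 2 * b + 2 * c + d)
                 for si, a, b, c, d in zip(s, k1, k2, k3, k4))

def radius_error(step, dt, steps):
    s = (1.0, 0.0, 0.0, 1.0)  # circular orbit of radius 1
    for _ in range(steps):
        s = step(s, dt)
    return abs(math.hypot(s[0], s[1]) - 1.0)

dt = 0.01
steps = int(2 * math.pi / dt)  # roughly one orbital period
print(radius_error(euler_step, dt, steps))  # noticeable drift
print(radius_error(rk4_step, dt, steps))    # orders of magnitude smaller
```

With the same step size, Euler drifts by a few percent per orbit while RK4 stays accurate to many decimal places, which is why higher-order integrators matter for multi-year trajectory planning.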

Project Voyager is about mapping a path to the stars. Not tomorrow, not in a decade, but today.
In essence, the project involves the creation of a mission planning software system, to enable interplanetary and interstellar trajectory planning. This program will enable scientists, engineers, and enthusiasts from Icarus and other space organizations to accurately plan missions not only to bodies in our solar system, but to the known bodies in other star systems as well. We are building the program to be extendable, so that every new discovery (whether an asteroid, dwarf planet, or exoplanet) can be added to the system and used to plan more accurate missions. As our knowledge of the universe around us expands, so will our map.

We are planning the software to be divided into two distinct parts. The first is the mission planning stage. Here, we will have a graphical interface that allows the user to visually travel around the entire mapped region of space. In this component all of the bodies will be on ‘rails’ in their respective predicted orbits. The user should be able to manipulate their vessel’s trajectory in real time (using patched conics), and produce a relatively accurate trajectory to their chosen destination. After this is done, the second part of the software will come online. All force vectors from the earlier planning stage are logged, and the start time determined. Starting from this data, a full gravitational simulator is run to determine a trajectory more accurate than that provided by the patched conic approximation. Once this has been determined, the map will overlay the two trajectories, and switch back into real-time mode, allowing the user to iteratively design their mission plan.

Icarus Interstellar is starting project Astrolabe to study the future of civilization

Icarus Interstellar has started project Astrolabe to study the future of civilization.

If we really want to quantify signs of technological civilization in the universe, we need to think about how SETI and METI programs are or will be integral to any interstellar effort, as both are about the quest for knowledge as an active engagement with the world.

In parallel with these efforts, we will want to study the technological trends emerging from our rapidly changing civilization, for what they portend for the future. What technologies our science develops for us, and which among these technologies prove to be practicable and adaptable to the peculiar architecture of our civilization, will shape every detail of the future, and may prove the difference between human civilization being viable or non-viable. Every particular interstellar propulsion technology, or life sciences technology, or computing technology – everything, in short, that goes toward building a technological civilization – interacts differently with the individual life making use of such technologies and the socioeconomic structure within which the individual finds a home.

The study of the future of civilization, then, requires that we engage with questions of detail in regard to the particular means of securing our long-term future, as well as engaging with the big picture of difficult issues that will be posed by future developments, such as the economics of spacefaring civilization, transhumanism, computational infrastructure, and the profound moral dilemmas of expanding the terrestrial biosphere and human civilization beyond Earth.

Nick Nielsen will lead Project Astrolabe in these investigations, with Heath Rezabek as Deputy Project Lead. The Project Astrolabe proposal outline is here.

If you’re interested in joining the study team, please email with a short statement of interest and brief background information. We are looking for people with interest and/or experience in anthropology, sociology, transhumanism, futurism and other disciplines relating to assessing the past and future of social and technical evolution and thought.

Paper suggests NASA's spacetime-warping experiment needs about 1 million times better detection, or a design change to increase the possible effect

Arxiv paper contends that the spacetime distortions resulting from the experimentally obtainable electric field of a parallel plate capacitor configuration cannot be detected by the White-Juday Warp Field Interferometer. Any post-processing results indicating a vanishing, non-zero difference between the charged and uncharged states of the capacitor are due to local effects rather than spacetime perturbations.

The White-Juday Warp Field Interferometer (WJWFI), a modified, seismically isolated Fabry-Pérot interferometer, has been developed to detect spacetime distortions created by a ~10^6 V/m static electric field. The interferometer employs a 6328 Å HeNe laser, in which one of the two beams passes between two electrically charged parallel plates. The beams are recombined on a CCD array.

However, the spacetime distortions produced by such an electric field are far below the detection threshold of all present-day interferometry techniques. Additionally, an analysis of refractive index variations, due to plausible air temperature differences in the laboratory, was conducted, and the resulting beam refraction is shown to be potentially above the lower limit of detectability of the WJWFI.

The WJWFI is incapable of detecting the minute distortions of spacetime produced by an electric field with an energy density of 4.4 J/m³. The static electric field of equivalent radius required to reach the microlensing detection threshold would be ~10^12 V/m. Therefore, any vanishing non-zero difference between the charged and uncharged states of the plates is clearly due to other factors.
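The 4.4 J·m⁻³ figure is simply the classical vacuum energy density of the ~10^6 V/m capacitor field, which a few lines of Python can verify:

```python
# Energy density of a static electric field in vacuum: u = (1/2) * eps0 * E^2.
EPS0 = 8.854e-12  # vacuum permittivity, F/m (CODATA value, rounded)

def energy_density(E):
    """Return field energy density in J/m^3 for field strength E in V/m."""
    return 0.5 * EPS0 * E ** 2

print(energy_density(1e6))  # ~4.4 J/m^3, matching the paper's figure
```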

August 15, 2014

A self-organizing thousand-robot swarm

The first thousand-robot flash mob has assembled at Harvard University.

“Form a sea star shape,” directs a computer scientist, sending the command to 1,024 little bots simultaneously via an infrared light. The robots begin to blink at one another and then gradually arrange themselves into a five-pointed star. “Now form the letter K.”

The ‘K’ stands for Kilobots, the name given to these extremely simple robots, each just a few centimeters across, standing on three pin-like legs. Instead of one highly-complex robot, a “kilo” of robots collaborate, providing a simple platform for the enactment of complex behaviors.

Just as trillions of individual cells can assemble into an intelligent organism, or a thousand starlings can form a great flowing murmuration across the sky, the Kilobots demonstrate how complexity can arise from very simple behaviors performed en masse (see video). To computer scientists, they also represent a significant milestone in the development of collective artificial intelligence (AI).

Given a two-dimensional image, the Kilobots follow simple rules to form the same shape. Visually, the effect is similar to a flock of birds wheeling across the sky. “At some level you no longer even see the individuals; you just see the collective as an entity to itself,” says Radhika Nagpal. (Image courtesy of Mike Rubenstein and Science/AAAS.)

Japan's nuclear shutdown continues to cost $35.2 billion per year and means 26 percentage points more fossil fuel dependence

Japan's ongoing reliance on imported fossil fuels while its nuclear reactors await permission to restart continues to impact the country's greenhouse gas emissions and trade deficit.

Japan depended on imported fossil fuels for 88% of its electricity in fiscal year 2013, compared with 62% in fiscal 2010, the last full year before the March 2011 accident at the Fukushima Daiichi plant. With almost its entire nuclear fleet offline, Japan's reliance on fossil fuels peaked in fiscal year 2012 at 92.2%.

The additional fuel costs that Japan faced in fiscal 2013 to compensate for its idled nuclear reactors were ¥3.6 trillion ($35.2 billion). Japan reported a trade deficit of ¥11.5 trillion ($112 billion) for the year, largely directly and indirectly due to these additional fuel costs. This compares with trade deficits of ¥6.9 trillion ($68 billion) in 2012 and ¥2.6 trillion ($25 billion) in 2011, following a ¥6.6 trillion ($65 billion) surplus in 2010.

Kazakhstan produces 5,650 tons of uranium in second quarter

Kazakhstan produced 5,650 tons of uranium in the second quarter of 2014.

Kazakhstan produced 5,590 tons of uranium in the second quarter of 2013.

Kazakhstan possesses 0.85 million tons of uranium reserves. It ranks second in the world in terms of reserves, and first in terms of uranium mining.

Small magnifying glasses burn ants - a six-foot lens can make safe drinking water for the world's poor and save millions of lives per year

Millions of people die every year from diseases and pathogens found in unclean water, and they can’t help it because that’s all they have. Either they drink it or they die.

Deshawn Henry, a University at Buffalo sophomore civil engineering major, researched how to improve a 6-foot-tall, self-sustaining magnifying glass.

Properly termed a water lens, the device uses another abundant resource — sunlight — to heat and disinfect polluted water. Since the frame for the lens can be constructed from commonly found materials — wood, plastic sheeting and water — the lens can be built for almost no cost, offering an inexpensive method to treat water.

The device may not look like much, but it can heat a liter of water to between 130 and 150 degrees Fahrenheit in a little more than an hour, destroying 99.9 percent of bacteria and pathogens.

Deshawn Henry working on the water lens that can heat a liter of water to between 130 and 150 degrees Fahrenheit in a little more than an hour, destroying 99.9 percent of bacteria and pathogens.
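A rough energy check on those figures (the start temperature and exact timing are assumed here, not stated in the article): heating one liter of water from about 25 °C to 60 °C (140 °F) in roughly 70 minutes implies only a few tens of watts of delivered heat, well within reach of a large sunlight-focusing lens.

```python
# Back-of-envelope check on the water lens numbers. Assumptions:
# 1 liter of water heated from ~25 C to ~60 C (140 F) in about 70 minutes.
SPECIFIC_HEAT = 4186   # J/(kg*K) for liquid water
MASS = 1.0             # kg (one liter)
DELTA_T = 60 - 25      # K, assumed temperature rise
TIME_S = 70 * 60       # seconds, assumed heating time

energy_j = SPECIFIC_HEAT * MASS * DELTA_T
avg_power_w = energy_j / TIME_S
print(energy_j)      # ~147 kJ total
print(avg_power_w)   # ~35 W of delivered heat on average
```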

August 14, 2014

17 ExaFLOP Deep Learning Optical Computer by 2020 ?

Baidu, Google and others are competing with Deep learning Artificial Intelligence.

Baidu built a neural network that roughly matched the Google Brain system for a 50th of the cost—only $20,000—using off-the-shelf graphics chips from Nvidia.

The Google Brain was designed to test the potential of deep learning, which involves feeding data through networks of simulated brain cells to mimic the electrical activity of real neurons in the neocortex, the seat of thought and perception. Such software can learn to identify patterns in images, sounds, and other sensory data. In one now-famous experiment, the researchers built a “brain” with one billion connections among its virtual neurons; it ran on 1,000 computers with 16 processors apiece. By processing 10 million images taken from YouTube videos, it learned to recognize cats, human faces, and other objects without any human help. The result validated deep learning as a practical way to make software that was smarter than anything possible with established approaches to machine learning. It led Google to invest heavily in the technology—quickly moving the Google Brain software into some of its products, hiring experts in the technique, and acquiring startups.

The GPGPUs that implemented the Baidu Deep learning brain may be replaced by new optical computers.

A startup company called Optalysis is trying to invent a fully optical computer that would be aimed at many of the same tasks for which GPUs are currently used. Amazingly, Optalysis is claiming that it can create an optical solver supercomputer, an astonishing 17 exaFLOPS machine, by 2020.

Deep Learning + 17 exaFLOP optical computer = 17 ExaFLOP Deep learning system by 2020.

This would be about 1 million times faster than the current Baidu Brain and might approach the human brain in scale.

Nextbigfuture interviewed Helion Energy CEO David Kirtley

Nextbigfuture interviewed Helion Energy CEO David Kirtley. An NSF, NASA, and DOD fellow, Dr. Kirtley has 13 years of experience in nuclear engineering, fusion, and aerospace and holds Nuclear and Aerospace Engineering degrees from the University of Michigan. He leads the MSNW propulsion research and development, serves as Helion’s CEO, and has raised and managed many high technology programs.

Helion Energy is trying to achieve commercial magneto-inertial fusion. This combines the stability of steady magnetic fusion with the heating of pulsed inertial fusion, yielding a commercially practical system that is smaller and lower cost than existing programs. Helion Energy will be magnetically accelerating plasmas together and then compressing them once per second.

Helion Energy has raised $2 million from Mithril Capital Management and Y Combinator.

They plan to perform the remaining research and experiments to enable the final design of their breakeven fusion engine.

The breakeven machine will need about $35 million in funding (2015-2016), and the target is to complete it in 2016.

If all proceeds on schedule, then a Helion Energy machine that proves commercial energy gain would be a 50 megawatt system built in 2019. $200 million will be needed for the commercial pilot plant. The plan would be to start building commercial systems by 2022.

Dr David Kirtley kindly indicated that he felt the discussions on Nextbigfuture were the most technically interesting.

Previously there was talk of creating a fusion engine to help burn fission waste (unburned fission fuel), which would need to produce a lot of neutrons. This turned out to be more difficult and less practical.

Now the work indicates that aneutronic helium-3/deuterium fusion is the best path forward.

²D + ³He → ⁴He + ¹p + 18.3 MeV
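A quick back-of-envelope on that 18.3 MeV yield (the 50 MW figure from the interview is used purely for scale, and conversion losses are ignored):

```python
# Energy bookkeeping for the D-He3 reaction quoted above: 18.3 MeV per fusion.
MEV_TO_J = 1.602e-13               # joules per MeV
E_PER_REACTION = 18.3 * MEV_TO_J   # joules released per fusion event

# Reactions per second needed to sustain a 50 MW output
# (thermal-to-electric losses ignored; a rough illustration only).
reactions_per_sec = 50e6 / E_PER_REACTION
print(E_PER_REACTION)      # ~2.9e-12 J per reaction
print(reactions_per_sec)   # ~1.7e19 reactions per second
```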

Prototypes every two years

Another view of the fourth prototype

Baidu hires Google's Andrew Ng, plans to compete in deep learning artificial intelligence, and has built a $20,000 simulated brain with one billion connections

[Technology Review]Andrew Ng is the newly appointed chief scientist at Baidu, China’s dominant search company. He has plans to advance deep learning, a powerful new approach to artificial intelligence loosely modeled on the way the brain works. It has already made computers vastly better at recognizing speech, translating languages, and identifying images—and Ng’s work at Google and Stanford University, where he was a professor of computer science, is behind some of the biggest breakthroughs.

Often called China’s Google, Baidu plans to invest $300 million in the new lab and a development office on the same floor over the next five years. Ng (it is pronounced “Eng”) aims to hire 70 artificial-intelligence researchers and computer systems engineers to work in the new lab by the end of 2015. “It will really target fundamental technology,” says Kai Yu, the director of Baidu’s Beijing deep-learning lab, a friend of Ng’s who urged him to join the company.

August 13, 2014

Micro chiplets are accurately placed four at a time, but in a few years it could be millions or billions

[Technology Review] PARC’s technique of mincing chips into printer ink could revolutionize the way electronics are made. Researchers at PARC, in Palo Alto, California, envision doing something different with the wafers: chopping them up into hair's-width “chiplets,” mixing them into an ink, and guiding the tiny pieces electrostatically to just the right spot and orientation on a substrate, from which a roller could pick them up and print them.

Although now at a very early stage, the technology could lead to novel kinds of computing devices, such as high-resolution imaging arrays made from tiny ultrasensitive detectors assembled by the million. Because printers can deposit materials on different substrates, this technology could be used to make high-performance flexible electronic devices, tiny sensors festooned with dense arrays of diverse sensors, or 3-D objects with computing functions woven in, says Janos Veres, who manages PARC’s printed-electronics team. And the approach could make it easier for more people and small companies to design and manufacture custom computing devices.

PARC’s vision for the technology starts with wafers made by conventional methods and designed to hold thousands of tiny functioning devices. These could include LEDs or lasers, processors and memory, or sensors based on microelectromechanical devices, or MEMS. They’d all become feedstocks for a palette of chip-infused inks. Existing electronics-printing systems generally use lower-performance materials, but “potentially, we can use the absolutely highest-performance chips on the market,” says Eugene Chow, an electrical engineer who leads the project.

The technology marshals chiplets into place using software-controlled electrical fields generated by arrays of wires beneath an assembly substrate. Like balls rolling into divots, the chiplets go to locations defined by the electrical fields. “The fields are changing in time and space in all kinds of fancy patterns that can be controlled to allow high-throughput assembly,” Chow says.

The chiplets, each 200 micrometers by 300 micrometers, are placed in a fluid.

Transatomic Power got $2 million in funding

Transatomic Power, developers of a molten salt nuclear reactor design, announced today that FF Science, an investment vehicle of Founders Fund, has invested $2 million to assist the company with its seed-stage development. The funds will be used for bench-top laboratory testing and refinement of the company’s designs and computer models.

Transatomic Power is based on inventions developed by Dr. Leslie Dewan and Mark Massie while graduate students in the MIT Department of Nuclear Science and Engineering. The reactor uses nuclear fuel dissolved into a molten salt, rather than the solid fuel of conventional nuclear reactors. This liquid fuel makes it possible to generate power at atmospheric pressure, greatly reduce the creation of long-lived nuclear waste, and improve safety and cost. The basic approach was demonstrated in the 1960s, and now the pair has developed key material and design improvements that could increase the reactor's effectiveness up to 100-fold and transform the nuclear industry.

Carnival of Nuclear Energy 221

The Carnival of Nuclear Energy 221 is up at ANS Nuclear Cafe

Atomic Insights - Why does conventional wisdom ignore hormesis?

Hormesis – the stimulation by some agent, such as radiation, of biological responses that are protective against damage done by that agent. It has the same etymological root as hormone, which is a molecule secreted by a gland to stimulate a response in some other part of the body.

In light of repeated assertions that all ionizing radiation is harmful no matter how high or how low the dose, the existence of a beneficial health effect may be surprising. But nearly a century of laboratory experimentation and epidemiological observation of both humans and animals supports the protective response region and contradicts the conventional wisdom. Why then does the concept that all ionizing radiation is harmful hang on with such tenacity, and how did it gain a foothold against all evidence to the contrary?

They rule out hormesis by fiat rather than by scientific evidence. They are forced to this maneuver since the evidence supports hormesis and contradicts LNT. The only reason that LNT is widely accepted is that virtually all political power stands behind LNT, so that it has long been the default position. This is not science.

Iraq Situation Maps at Institute for the Study of War

The Graph Says we have to start killing and neutering people or there will be a mass of humanity expanding at lightspeed

Until about 10,000 BC, the world lived as hunter-gatherers. Some view the hunter-gatherer lifestyle as more in tune with nature.

[Hunger Math] The land needed for each hunter-gatherer ranges from 40 hectares per person in an ideal ecosystem, to 150-250 hectares per person for a moderately favorable ecosystem. In unfavorable ecosystems, the number can go very high, to over 1,000 hectares per person. Let's assume an overly-optimistic 100 hectares per person for hunting and gathering. This means the Earth could support 136 million people.

So returning to being hunter-gatherers would merely cost 7.1 billion lives.
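The arithmetic behind those figures is simple division. Here is a quick check, using the roughly 13.6 billion hectares of usable land implied by the article's own 136-million-person result (an assumption back-derived from that number):

```python
# Back-of-envelope hunter-gatherer carrying capacity, from the figures above.
land_ha = 13.6e9  # usable land in hectares, implied by the article's numbers

for ha_per_person in (40, 100, 250, 1000):
    supportable = land_ha / ha_per_person
    print(f"{ha_per_person:>5} ha/person -> {supportable / 1e6:,.0f} million people")

# At the 'overly-optimistic' 100 ha/person figure, that is 136 million people,
# versus a 2014 population of about 7.2 billion.
lives_lost = 7.2e9 - land_ha / 100   # roughly 7.1 billion
```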

Some take current population growth rates and project that unchanged exponential growth would lead, in about 3,500 years, to a massive ball of human bodies expanding outward at the speed of light.
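One version of that projection can be reproduced with a toy model (all assumptions here are mine, not from the source): every person is 70 kg of roughly water-density mass, packed into a solid sphere, with the population growing at a constant 3 percent per year forever. The sphere's radius then grows as r(t) = r0·e^(gt/3), and its surface speed reaches c after a few millennia:

```python
import math

# Toy model: constant 3%/yr population growth (assumed), 70 kg per person
# packed at ~1000 kg/m^3 into a solid sphere. Radius grows as
# r(t) = r0*exp(g*t/3), so the surface speed dr/dt = (g/3)*r reaches the
# speed of light when r = 3c/g.
C_M_PER_YEAR = 299_792_458 * 3.156e7    # speed of light in metres per year
g = 0.03                                 # assumed annual growth rate
N0 = 7.2e9                               # approximate 2014 population
vol_per_person = 0.07                    # m^3 (70 kg at water density)

r0 = (3 * N0 * vol_per_person / (4 * math.pi)) ** (1 / 3)  # ~500 m today
r_crit = 3 * C_M_PER_YEAR / g           # radius at which dr/dt = c
t_years = (3 / g) * math.log(r_crit / r0)
print(f"Surface reaches light speed after about {t_years:,.0f} years")
```

With these assumptions the answer comes out near 3,500 years, in line with the projection quoted above; lower growth rates push it out to roughly 5,000-10,000 years, but the absurd conclusion is the same.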

Monty Python Lifeboat sketch

US has nearly one thousand Marines and special ops in Iraq who apparently will not be in combat roles

The United States has nearly one thousand troops in Iraq now.

The U.S. sent 129 additional military personnel to Iraq to help develop an evacuation plan for tens of thousands of Iraqis stranded on a mountain under threat from Islamic extremists in northern Iraq. The new deployment raises the level of American military personnel in the country to 935, including 250 military advisors already in place, as well as an additional 100 assigned to the U.S. Embassy and the airport in Baghdad. At least 100 aircraft and more than half a dozen ships are also available to assist in the ongoing crisis.

A top Pentagon official told reporters Monday that the effect of U.S. airstrikes is unlikely to stop the momentum of the fast-advancing Sunni militants intent on creating a new Islamic state in Iraq, Syria and beyond.

“This fiction that Americans are not going to be in a combat role is just that,” retired Air Force Lt. Col. Rick Francona told CNN.

The new batch of troops consists of Marines and special operations forces from within the U.S. Central Command region.

SkyTran 62-mile-per-hour two-person pods that are 90 times cheaper than subways

Tel Aviv may be the site of the first maglev PRT system [Personal Rapid Transit], which has been designed by skyTran, an American company headquartered in Mountain View, California.

As of June 23, 2014, skyTran reached an agreement with Israel Aerospace Industries to begin developing a test track on the grounds of IAI’s corporate campus. If testing proves successful, construction will begin on the Tel Aviv commuter line.

The power used in two hair dryers can fly you at over 62 mph (100 kph) with skyTran.
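Taking "two hair dryers" as roughly 3 kW (a typical hair dryer draws about 1.5 kW; that figure is my assumption, not skyTran's), the implied energy use per distance is easy to work out:

```python
# Rough energy math for the "two hair dryers" claim (assumed 1.5 kW each).
power_w = 2 * 1500          # ~3 kW while cruising (assumption)
speed_kph = 100             # quoted cruise speed

wh_per_km = power_w / speed_kph          # 3000 W for one hour covers 100 km
wh_per_passenger_km = wh_per_km / 2      # two seats per pod
print(wh_per_km, wh_per_passenger_km)    # 30 Wh/km, 15 Wh/passenger-km
```

For comparison, a typical electric car uses on the order of 150-200 Wh per km, so the claim amounts to roughly an order-of-magnitude efficiency advantage; treat it as a promotional figure until independently measured.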

SkyTran has computer-controlled, 2-person vehicles. They can accommodate much of the world's commuting population within a smaller footprint and for less than other mass transit systems. Because skyTran is built as an expandable grid, it will never be filled to capacity. As demand grows, more track can be installed and additional vehicles can be added to the network. The robust, state-of-the-art skyTran system grows in the same way that the Internet grows: exponentially and immediately. In fact, you can think of skyTran as a "Physical Internet."

The sophisticated skyTran computer network paces vehicles at optimum spacing and speeds to handle massive numbers of commuters many times over in a safe and efficient manner. The skyTran system recognizes you, responds to where you would like to go, and gets you there in the fastest time possible.

Stabilized Permanent Magnet Maglev promises the same cost as one lane of freeway with twenty times the carrying capacity

Engineers at LaunchPoint Technologies have been working with Applied Levitation and Fastransit Inc. to develop a completely new, revolutionary mode of maglev transportation using stabilized permanent magnets. They seem not to have obtained funding since the prototypes were made, but it is interesting technology.

Applied Levitation LLC has its origins in Magtube Inc. where our permanent magnet levitation technology was initially developed and tested on a full-scale prototype capsule. Since then we have expanded our research and changed the name of the company to reflect the departure from a focus on underground freight transport. With funding from our partners LaunchPoint Technologies Inc. and Fastransit Inc. we have just finished the proof-of-concept phase of our electromagnetic stabilization technology which will allow vehicles to levitate above a magnetic track without the need for side rails to control lateral position. We are currently seeking investment partners to help us with the third and final phase of our technology development: a full-scale vehicle and demonstration track.

Applied Levitation's Stabilized Permanent Magnet (SPM) suspension technology (U.S. Patent #6,684,794) provides capabilities that were not previously possible and at a lower cost than any other high-capacity approach.

* Low capital cost--SPM maglev guideways can be built, mile for mile, for about the same cost as one lane of freeway with twenty times the carrying capacity.

* Easily integrated--Existing rail and subway systems can be easily retrofitted with SPM maglev capabilities, enabling incremental upgrades of an aging infrastructure.

* Fast switching--SPM maglev technology uses instant magnetic switching with no moving parts which enables more efficient routing of computer-controlled maglev vehicles.

* Network capability (Mag-Net™)--With SPM maglev technology and instant magnetic switching, transportation networks can be developed to route traffic in much the same way as information is routed over the internet.

* Highly scalable--SPM maglev technology can be used with smaller, lighter vehicles for more efficient passenger transportation, or with bigger, heavier vehicles for the transport of freight.

Cross-section of Maglev Rails with Standard Train Rails

SPM maglev technology will permit incremental upgrade of our current rail and subway systems. By installing SPM guideways on the same ties as existing tracks, with one maglev rail outside each of the existing steel rails and a motor rail down the center, SPM maglev vehicles can operate simultaneously with standard rail and subway vehicles. As a result, existing rail and subway cars can remain in operation as we gradually replace them with new maglev vehicles that cost less and perform far better. This capability avoids the need to completely overhaul our current infrastructure at tremendous cost.

Machine learning algorithm is able to predict cardiac arrest 4 hours in advance and is accurate 66% of the time

[New Scientist] The researchers trained a machine-learning algorithm on data from 133,000 patients who visited the NorthShore University HealthSystem, a partnership of four Chicago hospitals, between 2006 and 2011. Doctors called a Code Blue 815 times. By looking at 72 parameters in patients' medical history including vital signs, age, blood glucose and platelet counts, the system was able to tell, sometimes from data recorded 4 hours before an event, whether a patient would go into arrest. It guessed correctly about two-thirds of the time, while a scorecard flagged just 30 per cent of events.

The algorithm still needs work – it reports a false positive 20 per cent of the time, says Somanchi. To improve its performance, his team is planning to train the system with data from other hospitals.

The system could be combined with wearable sensors to monitor blood glucose and platelet counts in real time.
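The quoted accuracy numbers hide a base-rate problem worth making explicit. If we read "correct two-thirds of the time" as sensitivity and the 20 per cent figure as a per-patient false-positive rate (my interpretation; the article does not define either precisely), then the alarm's precision is tiny, simply because arrests are rare:

```python
# Base-rate arithmetic on the quoted figures. Interpretation is mine:
# 66% = sensitivity, 20% = per-patient false-positive rate.
patients = 133_000
events = 815            # Code Blues in the study period
sensitivity = 0.66
false_positive_rate = 0.20

true_alarms = sensitivity * events                          # ~538
false_alarms = false_positive_rate * (patients - events)    # ~26,400
precision = true_alarms / (true_alarms + false_alarms)      # ~2%
print(f"Precision: {precision:.1%}")
```

Under this reading, only about one alarm in fifty would precede a real arrest, which is why cutting the false-positive rate is the team's stated next step.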

SpaceX success has made enemies of the allies of big government contractors

[Slate - Phil Plait] SpaceX, as you may know, is making good on its promise to make access to space cheaper and more reliable. Their Falcon 9 rocket is putting payloads into orbit for less money than the big government contractors charge.

As one might expect, government officials who have such contractors in their own districts and states are unhappy with this. And apparently some are willing to smear SpaceX as retribution.

Three House members—Mike Coffman (R-Colo.), Mo Brooks (R-Ala.), and Cory Gardner (R-Colo.)—have sent a memo to NASA demanding that the agency investigate what they call “an epidemic of anomalies” with SpaceX missions.

The congressmen say that SpaceX should be accountable to the American taxpayer.

Space News reported that NASA didn't actually pay for the development of the Falcon 9; Elon Musk did.

Development of Falcon 9 and Dragon was supported, but not exclusively funded, by NASA through the Commercial Orbital Transportation Services (COTS) program, using Space Act Agreements versus conventional contracts. SpaceX supplemented the NASA funding with its own; SpaceX Chief Executive Officer Elon Musk has said on a number of occasions that the company used no NASA funding for development of the Falcon 9.

Anti-population growth is anti-black and anti-minority

The New York Times had a recent article about reducing carbon by curbing population.

The world is expected to add about 1.3 billion people in Africa by 2050 and 700 million in Asia, from the 2014 level of 7.3 billion. [250 million more in Latin America and 80 million in North America]

More recent census projections are for Africa to have 2.7 billion in 2050. This would be an increase of 1.65 billion from current levels.

Many groups call for reduced global population growth. More than half of the expected population growth is in Africa, which has a mostly black population. The stated intent is not supposed to be anti-black, but the difference of having 800 million fewer black people in 2050 can be construed as anti-black, just as the one-child policy, which resulted in 400 million fewer Chinese, could be construed as an anti-Chinese policy. Clearly the one-child policy was a Chinese government action. Population growth is clearly concentrated in particular races.

These are just statistical facts, but the end results and goals have clear racial implications.

Many "population environmentalists" are just anti-people; race does not matter to them. Population Matters thinks Europe should have half of its current population and North America should have 152 million fewer people.

August 12, 2014

Technological Takeoff and Resource Mobilization for Aspects of Mundane Singularity

This site has looked at a list of technologies for what I call the Mundane Singularity. Technological Singularity and Transhumanism are often criticized because the primary technologies that justify it are Molecular Nanotechnology and greater than human intelligence general AI, which some believe are not possible. Much of the projected benefits of a technological singularity could be achieved even without Molecular Nanotechnology and without greater than human Artificial General Intelligence as the technology triggers.

A Mundane Singularity could bring about a large amount of
1. Economic abundance
2. Radical life extension
3. Physical and Cognitive enhancement
4. Blood Stream Robots
5. Supermaterials
6. Open Access to space
7. Pollution elimination
8. Computer Advancement
9. Shape changing functional devices like utility fog

Early versions of the controversial molecular nanotechnology are emerging with DNA nanotechnology, DNA origami and synthetic biology. The vision and work of Shawn Douglas, Ido Bachelet and George Church could be part of realizing radical life extension and something more powerful than mere blood stream robots.

DNA nanorobots have been demonstrated in live cockroaches, could be in humans by 2019, and could scale to Commodore 64-level (eight-bit) computing power.

Nanoparticles with computational logic have already been demonstrated.

An ensemble of drugs can be loaded into many particles for programmed release based on the situation found in the body.

J. Storrs Hall defines a technical takeoff as something that:
- embodies the essential function of the proposed technology
- is proof that the concept works
- focuses technical effort
- is a vehicle for practical experience
- attracts financial (and other) resources
- forms a crack in the dam

The list of technologies and policies that I believe will play a major part in achieving those things over the next 20 years is:

1. Pro-growth Policies and aggressive adoption and deployment of best practices
2. Energy Efficiency - superconductors, thermoelectrics, improved grid
3. Energy Revolution - Mass produced fission, fusion, and maybe cold fusion, battery singularity
4. Additive manufacturing
5. Not so mundane - neuromorphic chips, quantum computers, photonics
6. Automated transportation (leading to robotic cars and planes)
7. Urbanization MegaCities
8. Urbanization Broad Group skyscrapers, Tata flat packed buildings
9. Robotics
10. Hyperbroadband
11. Supermaterials
12. Improve medicine and public health
13. Space
14. Synthetic biology and recombineering
15. Sensors everywhere
16. Education transformed and accelerated innovation
17. Supersmartphones, exoskeletons and wearable systems
18. Memristors and other significant computing and electronic improvements.

The Mundane Singularity still has a normal adoption and deployment cycle, so the impact will increase over time, i.e., more robots in 2020 and still more in 2025 and 2030.

On track to human brain scale neuromorphic systems with 20 billion neurons and 200 trillion synapses in 2019 or 2020

The DARPA 2015 budget (page 192) reported that a 1 million neuron chip was developed in 2013. This was announced in the last week by IBM, with the research published in the journal Science.

The neuromorphic chip work outlined in 2010 and 2011 is roughly on track to the goals of human brain scale emulation in 2019.

In 2015 the goal [stated in 2010] is a prototype chip system simulating 10 billion neurons connected via 1 trillion synapses. The device must use 1 kilowatt or less (about what a space heater uses) and take up less than 2 liters in volume. 100 of the systems would have 1 trillion neurons and 100 trillion synapses and would be about the complexity of the human brain.

In 2014, IBM should have integrated the board with 16 chips into a larger rack with 4 billion neurons using 4 kilowatts of power. IBM would need another iteration or two of chip design to get to about triple the density with four times lower power usage.

The current TrueNorth chip consumes merely 70 milliwatts and is capable of 46 billion synaptic operations per second per watt–literally a synaptic supercomputer in your palm.

In 2011, IBM research suggested that a full-scale model of the human brain—which has 20 billion neurons connected by about 200 trillion synapses—could be reached by 2019, given enough processing power. It would be a hardware model. This does not indicate the actual intelligence that would be in the system. It also does not specify the quality of the neurons and synapses that are part of the system.

Still being at human brain scale would be interesting and it would be interesting to see what could be possible and what will be learned. Refinement to better neurons and synapses could progress in the 2020s.

The current SyNAPSE-developed chip, which can be tiled to create large arrays, has one million electronic “neurons” and 256 million electronic synapses between neurons. Built on Samsung Foundry's 28nm process technology, the 5.4 billion transistor chip has one of the highest transistor counts of any chip ever produced. Each chip consumes less than 100 milliWatts of electrical power during operation. When applied to benchmark tasks of pattern recognition, the new chip achieved two orders of magnitude in energy savings compared to state-of-the-art traditional computing systems.
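Combining the chip figures just quoted (1 million neurons and 256 million synapses per chip) with the 20-billion-neuron, 200-trillion-synapse brain estimate above shows that the synapse count, not the neuron count, sets the required scale:

```python
import math

# How many TrueNorth-class chips (1e6 neurons, 256e6 synapses each) would a
# 20e9-neuron, 200e12-synapse human-brain-scale model need?
neurons_per_chip = 1_000_000
synapses_per_chip = 256_000_000

chips_for_neurons = math.ceil(20e9 / neurons_per_chip)      # 20,000 chips
chips_for_synapses = math.ceil(200e12 / synapses_per_chip)  # 781,250 chips
print(chips_for_neurons, chips_for_synapses)
```

So about five racks of 4,096 chips would cover the neuron count, but roughly 190 such racks would be needed to match the synapse count, which underlines the article's caveat that reaching "brain scale" in hardware says nothing about the quality of the neurons and synapses modeled.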

August 11, 2014

Robin Williams RIP

Helion Energy now has a Helium 3 fuel cycle for its magneto-inertial fusion process

Helion Energy has an update on their project to achieve commercial nuclear fusion.

Helion Energy uses Magneto-Inertial Fusion: By combining the stability of steady magnetic fusion and the heating of pulsed inertial fusion, a commercially practical system has been realized that is smaller and lower cost than existing programs.

They want to create modular, distributed power using shipping-container-sized, 50 Megawatt modules for base load power generation. (Nextbigfuture note - the longest shipping containers are 17 meters long, and Helion talks about being 25-28 meters long)

They are using self-supplied Helium-3 fusion. Pulsed D-He3 fusion simplifies the engineering of a fusion power plant, lowers costs, and is even cleaner than traditional fusion.

Magnetic Compression: Fuel is compressed and heated purely by magnetic fields operated with modern solid state electronics.

This eliminates inefficient, expensive laser, piston, or beam techniques used by other fusion approaches.

Direct Energy Conversion: Enabled by pulsed operation, efficient direct conversion decreases plant costs and fusion’s engineering challenges.

It is safe: with no possibility of meltdown or hazardous nuclear waste, fusion does not suffer the drawbacks that make fission an unattractive alternative.

Helion plans to substantially improve their Fusion Engine for 2016 and have a commercially capable system by 2019.

China's online payment volume is projected to be about US$3 trillion in 2017

China's online payment volume is projected to increase by 500% from 2012 to 2017. This will be nearly US$3 trillion. The estimated level for 2014 is US$1.2 trillion.

China's economic transformation and the risk to the global economy of a financial crisis thirty times bigger than the 2008 crisis

Reorient is a financial services group in Hong Kong

Reorient provides a 62 page view of China's economic transformation

Why does this matter? Because China's banks are collectively over thirty times bigger than Lehman was.

It took the world a few years to recover from the worst of the 2008 financial crisis. A financial crisis that was 30 times larger could take about five to ten years for a world recovery.

Reorient Group argues that the old Chinese smokestack economy (dependent on low-value-added manufacturing exports and fixed-asset investments in infrastructure, mining, and real estate) is fading away and may drag somewhat on the banking sector; but nonperforming loans, even if the actual numbers are several times larger than officially reported figures, are still manageable; and accordingly, the banks may be undervalued instead of posing a systemic risk. John Mauldin, in his China series, believes bad debts, if properly classified, could add up to nearly 20% of GDP.

Mauldin notes that China's rapid credit expansion over the last five years puts it among the top five credit booms of the modern era. He argues it will be very difficult to make the transition without a great deal of pain.

Conventional wisdom suggests that China's bank leverage ratios are more than manageable; whether they actually are will ultimately depend on whether China's banks are reclassifying and rolling over all but the worst loans.

Scaling ISRU for using Mars Resources for Space Missions

Being able to extract oxygen and produce methane from the Mars atmosphere would greatly enhance colonization missions to Mars. Here we review how this technically feasible technology would likely scale.

Mars In-Situ Resource Utilization Based on the Reverse Water Gas Shift: Experiments and Mission Applications, by Zubrin and others, 1997

Basis for Scaling in Zubrin Analysis

The masses and power requirements of the S-E and Z-E systems at the 0.5 kg per day production rate are known with considerable accuracy from the experimental work done at Lockheed Martin and the University of Arizona. Power requirements for larger systems can also be estimated with confidence, since for all subsystems except controls, power requirements will increase linearly with production rate. Mass of sorption pump systems is estimated to increase by a factor of four for every factor of 10 increase in output rate. This is based upon a relative decrease in parasitic mass as the total sorption pump system becomes larger.

Mass of the chemical synthesis gear is assumed to be linear with respect to the roughly ~0.3 kg of actual chemical reactors contained within the 3 kg mass of the chemical reactor system required for the 0.5 kg per day production rate. This is based upon the author's knowledge of the details of the Lockheed Martin S-E system (0.1 kg Sabatier reactor + 0.2 kg of solid polymer electrolyte contained within the ~3 kg chemical synthesis subsystem) and reports from K.R. Sridhar of the University of Arizona of ~0.3 kg of actual Z-E cells within a ~0.5 kg per day output unit there. Control system mass and power is estimated to scale up by a factor of two for every factor of 10 increase in output.

Mass of lines and valves for all systems except the Z-E are assumed to scale up by a factor of 3 for every factor of 10 increase in output. For the Z-E system, a factor of 5 increase in mass for every factor of 10 increase in output is assumed. This is because the Z-E system is composed of large numbers of small tubes. As the system scales up, more and more manifolds are required. This contrasts unfavorably with the other systems, which can simply employ larger reactor vessels as output rates are increased. Refrigerator mass is assumed to increase by a factor of four for every factor of 10 increase in output. This is based upon scaling observed in existing Stirling cycle refrigerators.
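The scaling rules above all have the same form: mass multiplies by a fixed factor for every factor-of-10 increase in output, i.e. mass ∝ rate^log10(factor). A small helper makes the comparison concrete. The subsystem base masses below are illustrative placeholders, except for the ~3 kg chemical-synthesis unit quoted for the 0.5 kg/day rate:

```python
import math

def scaled_mass(base_mass_kg, output_ratio, factor_per_decade):
    """Mass after scaling output by `output_ratio`, where mass multiplies by
    `factor_per_decade` for every factor-of-10 increase in output."""
    return base_mass_kg * factor_per_decade ** math.log10(output_ratio)

# Factors per decade, from the Zubrin analysis above: sorption pumps x4,
# controls x2, lines/valves x3 (x5 for Z-E), refrigerators x4,
# chemical reactors and power linear (x10).
ratio = 100  # scale output from 0.5 kg/day up to 50 kg/day
sorption = scaled_mass(10.0, ratio, 4)   # 10 kg base (illustrative) -> 160 kg
chem = scaled_mass(3.0, ratio, 10)       # ~3 kg base (quoted)      -> 300 kg
controls = scaled_mass(2.0, ratio, 2)    # 2 kg base (illustrative) -> 8 kg
```

The sub-linear exponents (log10(4) ≈ 0.6 for sorption pumps, log10(2) ≈ 0.3 for controls) are why larger ISRU plants are proportionally lighter per kilogram of daily output.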

Nanostructured Metal-Oxide Catalyst Efficiently Converts CO2 to Methanol

Separately, Brookhaven National Laboratory scientists have discovered a new catalytic system for converting carbon dioxide to methanol. The new system offers significantly higher activity than other catalysts now in use and could make it easier to get normally unreactive CO2 to participate in these reactions. The resulting catalyst converts CO2 to methanol more than a thousand times faster than plain copper particles, and almost 90 times faster than a common copper/zinc-oxide catalyst currently in industrial use.

Highly reactive sites at the interface of the two nanoscale components could help overcome the hurdle of using CO2 as a starting point in producing useful products.

(H/T New Energy and Fuel)

Scanning tunneling microscope image of a cerium-oxide and copper catalyst (CeOx-Cu) used in the transformation of carbon dioxide (CO2) and hydrogen (H2) gases to methanol (CH3OH) and water (H2O). In the presence of hydrogen, the Ce4+ and Cu+1 are reduced to Ce3+ and Cu0 with a change in the structure of the catalyst surface

Science - Highly active copper-ceria and copper-ceria-titania catalysts for methanol synthesis from CO2

NASA will send a system to extract oxygen from the Mars Atmosphere in 2020

NASA plans to make oxygen, a central ingredient of rocket fuel, on Mars early in the next decade.

Space agency officials on Thursday unveiled seven instruments they plan to put on a Martian rover that would launch in 2020, including two devices aimed at bigger future Mars missions.

MOXIE will produce about 22 grams (0.78 ounces) of oxygen per hour and will operate on at least 50 different Martian days during the course of the mission.

MOXIE — short for Mars OXygen In situ resource utilization Experiment — was selected from 58 instrument proposals submitted by research teams around the world. The experiment, currently scheduled to launch in the summer of 2020, is a specialized reverse fuel cell whose primary function is to consume electricity in order to produce oxygen on Mars, where the atmosphere is 96 percent carbon dioxide. If proven to work on the Mars 2020 mission, a MOXIE-like system could later be used to produce oxygen on a larger scale, both for life-sustaining activities for human travelers and to provide liquid oxygen needed to burn the rocket fuel for a return trip to Earth.

US Army Vision for Force 2025 and beyond

By 2025, a leaner, smarter, more lethal, and more flexible Army must operate differently, enable forces differently, and organize differently to maintain overmatch, respond to a myriad of threats to our national interests, and set the conditions for fundamental long-term change. To determine the optimal design for the Army of the future, the Force 2025 and Beyond effort consists of activities along three primary lines of effort: force employment; science and technology and human performance optimization; and force design.

Here is a one page poster with an outline of the US Army Force 2025 and Beyond Strategy.

IAEA projections for nuclear power in 2030

There are currently 435 operational nuclear power reactors in 30 countries around the world and 72 are under construction in 15 countries. Nuclear power generated 2359 terawatt-hours (TW·h) of electricity in 2013, corresponding to less than 11% of world electricity production, the lowest value since 1982. The share of renewable energy continues to expand, but fossil fuels, especially coal, are still the global fuel of choice. Global nuclear electricity generation in 2013 was 2359 TW·h, 220 TW·h less than the average for the first decade of the 21st century. This drop resulted mainly from decreases due to permanent and temporary shutdowns in Japan (266 TW·h), permanent shutdowns in Germany (41 TW·h) and the USA (17 TW·h), offset partly by increases in China (34 TW·h) and other countries.

August 10, 2014

10 meter Sub-Orbital Large Balloon Reflector (LBR)

A new NIAC Phase II project is the “10 meter Sub-Orbital Large Balloon Reflector (LBR)”. They propose to develop and demonstrate the technology required to realize a suborbital, 10 meter class telescope suitable for operation from radio to THz frequencies. The telescope consists of an inflatable, half-aluminized spherical reflector deployed within a much larger carrier stratospheric balloon. Besides serving as a launch vehicle, the carrier balloon provides a stable mount for the enclosed telescope.

Looking up, the LBR will serve as a telescope. Looking down, the LBR can be used for remote sensing or telecommunication activities. By combining successful suborbital balloon and ground-based telescope technologies, the dream of a 10 meter class telescope free of ~99% of the Earth’s atmospheric absorption in the far-infrared can be realized. The same telescope can also be used to perform sensitive, high spectral and spatial resolution limb sounding studies of greenhouse gases in the Earth’s atmosphere and serve as a high flying hub for any number of telecommunications and surveillance activities.

LBR is a multi-institution effort between the University of Arizona (the PI institution), SWRI, JPL, and APL. LBR was selected in 2013 by the NASA Innovative Advanced Concepts (NIAC) program to proceed into Step B of the NIAC Phase I program, making LBR eligible to propose for a 2014 Phase II award. The goal of our NIAC Phase II effort is to bring LBR concepts to a Technology Readiness Level of at least 2 in maturity, by addressing key unknowns, assumptions, risks, and paths forward remaining after the completion of our Phase I study.

IBM has integrated TrueNorth chips into a 16 million neuron system and is targeting a 4 billion neuron system in a rack, and DARPA plans robots with neuromorphic chips

The IBM TrueNorth neuromorphic chip is inspired by the brain. The inputs to and outputs of this computer are spikes. Functionally, it transforms a spatio–temporal stream of input spikes into a spatio–temporal stream of output spikes.

If one were to measure activities of one million neurons in TrueNorth, one would see something akin to a night cityscape with blinking lights. Given this unconventional computing paradigm, compiling C++ to TrueNorth is like using a hammer for a screw. As a result, to harness TrueNorth, IBM has designed an end–to–end ecosystem complete with a new simulator, a new programming language, an integrated programming environment, new libraries, new (and old) algorithms as well as applications, and a new teaching curriculum (affectionately called, “SyNAPSE University”). The goal of the ecosystem is to dramatically increase programmer productivity. Metaphorically, if TrueNorth is ENIAC, then our ecosystem is the corresponding FORTRAN.

Dharmendra S. Modha's team is working, at a feverish pace, to make the ecosystem available–as widely as possible–to IBMers, universities, business partners, start–ups, and customers. In collaboration with the international academic community, by leveraging the ecosystem, they foresee being able to map the existing body of neural network algorithms to the architecture in an efficient manner, as well as being able to imagine and invent entirely new algorithms.

To support these algorithms at ever increasing scale, TrueNorth chips can be seamlessly tiled to create vast, scalable neuromorphic systems. In fact, we have already built systems with 16 million neurons and 4 billion synapses. Our sights are now set high on the ambitious goal of integrating 4,096 chips in a single rack with 4 billion neurons and 1 trillion synapses while consuming ~4kW of power.
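The rack numbers follow directly from the per-chip specifications quoted earlier (1 million neurons, 256 million synapses, ~70 mW per chip):

```python
# Tiling arithmetic for the planned 4,096-chip TrueNorth rack.
chips = 4096
neurons = chips * 1_000_000        # ~4.1 billion neurons
synapses = chips * 256_000_000     # ~1.05 trillion synapses
chip_power_w = chips * 0.070       # ~287 W for the chips alone

# The quoted ~4 kW rack budget therefore implies that most of the power
# would go to interconnect, memory, cooling, and support electronics
# rather than to the neurosynaptic chips themselves.
overhead_w = 4000 - chip_power_w
```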

They envision augmenting the neurosynaptic cores with synaptic plasticity to create a new generation of field–adaptable neurosynaptic computers capable of online learning.

TrueNorth is a direction and not a destination! The end goal is building intelligent business machines that enable a cognitive planet, while transforming industries.

DARPA targets neuromorphic chips in more capable robots

Second Open thread for August

This is the second open August thread for discussing topics that you want to discuss or share links of interest.

Let us keep the discussion polite and fact-based. Thanks.

Carnival of Space 366

The Carnival of Space 366 is up at Cosmoquest

Universe Today - “We’re at the comet! Yes,” exclaimed Rosetta Spacecraft Operations Manager Sylvain Lodiot, confirming the spacecraft’s historic arrival at Comet 67P/Churyumov-Gerasimenko during a live webcast on the morning of Aug. 6, from mission control at ESA’s spacecraft operations centre (ESOC) in Darmstadt, Germany.

Universe Today - Who can imagine Uranus as a quiet planet now? The Keck Observatory caught some spectacular pictures of the gas giant undergoing a large storm surge a few days ago, which took astronomers by surprise because the planet is well past its 2007 equinox, when the Sun was highest above its equator.

Comet Rendezvous and Landing

Comet Rendezvous and Landing: Rosetta and Philae  
A Guest Post by Joseph Friedlander

The Rosetta spacecraft is in the complex process of being captured by a low-mass body, Comet 67P/Churyumov–Gerasimenko. (The comet orbits between roughly Earth's distance from the sun and Jupiter's, and was discovered in 1969 by Klim Ivanovych Churyumov and Svetlana Ivanovna Gerasimenko.)

Rosetta's mass was around 3,000 kg at launch, including the 100 kg lander and 165 kg of scientific instruments. Most of the mission's delta-v has been supplied by three Earth flybys (gravity assists), but a delta-v budget of around 2,200 m/s was carried onboard. The craft is now considerably lighter than at launch, having spent quite a bit of this. Dry mass of the orbiter alone is 1,230 kg. Power at 3.4 AU is around 850 watts.

Data on Comet 67P/Churyumov–Gerasimenko

  • Orbital period: 6.45 Earth years
  • Idealized radius: (if spherical--it ain't--) around 2-2.2 km (mean density about one-tenth to one-ninth that of water--amazingly low)

This is the first body visited between Earth and Jupiter whose density virtually guarantees 
a majority of the body mass being water ice (snow) and carbonaceous matter.

As of June the body was outgassing at around a kilogram a second of water. That is roughly 90 tons a day, over 30,000 tons a year. And peak outgassing season (perihelion and after) hasn't arrived yet.
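Those round figures can be checked with a quick unit conversion (a sketch, using the approximate 1 kg/s rate quoted above):

```python
# Convert the quoted outgassing rate of ~1 kg/s of water into daily and yearly totals.
rate_kg_s = 1.0                              # approximate June 2014 water production
per_day_tons = rate_kg_s * 86400 / 1000      # seconds per day -> metric tons per day
per_year_tons = per_day_tons * 365.25        # metric tons per year

print(f"~{per_day_tons:.0f} tons/day")   # ~86 tons/day, i.e. "roughly 90"
print(f"~{per_year_tons:.0f} tons/year") # ~31,600 tons/year, i.e. "30,000 plus"
```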

  • Escape velocity: 0.46 meters (not kilometers!) per second--under half a meter a second
  • 12.7-hour day, 6.35 hours sunrise to sunset
  • Distance from sun: aphelion 5.6839 AU (850,300,000 km), perihelion 1.2429 AU (185,940,000 km)
  • Mass of Comet 67P/Churyumov–Gerasimenko: around 3.14 gigatons (3,140 megatons)
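These figures are mutually consistent. A short check, assuming a spherical body of radius 2 km and the quoted mass of 3.14 gigatons (3.14e12 kg):

```python
import math

G = 6.674e-11                     # gravitational constant, m^3 kg^-1 s^-2
M = 3.14e12                       # quoted comet mass, kg (3.14 gigatons)
r = 2000.0                        # idealized radius, m

v_esc = math.sqrt(2 * G * M / r)  # surface escape velocity
volume = 4 / 3 * math.pi * r**3   # spherical-body volume
density = M / volume              # bulk density, kg/m^3

print(f"escape velocity ~{v_esc:.2f} m/s")    # ~0.46 m/s, matching the list above
print(f"bulk density ~{density:.0f} kg/m^3")  # ~94 kg/m^3, about a tenth of water
```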

So the comet weighs over a billion times as much as the now much lighter (after delta-v burns) spacecraft that is trying to get captured.

In May the spacecraft's encounter velocity was around 775 meters per second, comparable to an SR-71 still picking up speed. Now it has been cancelled down to around a meter a second (a leisurely walking pace), and the spacecraft is now associated with Comet 67P/Churyumov–Gerasimenko but not yet orbiting it. Its complex path is a dance whose segments are hyperbolic escape trajectories, each cancelled out in turn by delta-v-subtracting thruster burns.

Where things stood at launch, now, and in the coming year.

The ESA Giotto spacecraft in 1986 captured this picture of Halley's comet in full outgassing 
glory which was considered amazing at the time:

But a new generation of instruments has arisen which now yields pictures of the still-cold Comet 67P/Churyumov–Gerasimenko (next perihelion is on 13 August 2015) with this level of detail from 285 km away (5.3 meters/pixel):

Credit & Copyright: ESA 
/ Rosetta / MPS for OSIRIS Team MPS / UPD / LAM /IAA / SSO / INTA / UPM / DASP / IDA.

Capture radius (for orbit) should be somewhere between 25 and 40 kilometers, probably somewhere in the low 30s. As Rosetta flies lower, it will pick up speed and perform minor but complex delta-v maneuvers to be captured in the first orbital mission around a comet.
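To get a feel for how gentle "orbit" is around such a small body, here is a rough circular-orbit estimate at a 30 km radius (a sketch using the quoted mass, not mission data):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 3.14e12          # quoted comet mass, kg
r = 30_000.0         # assumed orbit radius in the "low 30s" of km, m

v_circ = math.sqrt(G * M / r)                   # circular orbital speed
period_days = 2 * math.pi * r / v_circ / 86400  # one revolution, in days

print(f"orbital speed ~{v_circ * 100:.0f} cm/s")  # ~8 cm/s, slower than a snail's sprint
print(f"orbital period ~{period_days:.0f} days")  # ~26 days per revolution
```

At these speeds, even tiny thruster burns dominate the orbital dynamics, which is why the approach is flown as a sequence of cancelled hyperbolic arcs rather than a single capture burn.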

When orbital coverage has been sufficient, a landing site selection process will ensue. The Philae lander weighs under 100 kg.

It will not be the first landing on a small body, but it will be the first on a known active comet. November 2014 is the target landing date. The pictures should be even more amazing.

This next year will also see a flyby of Pluto on 14 July 2015 (the New Horizons probe) and an orbiter of Ceres some months earlier (the Dawn mission).

If this were not a one-ton probe but a thousand-ton, one-way, do-or-die manned colonizing effort with 30 people (or a million-ton colonizing effort with 30,000 people), the greatest reality TV show in history would be about to begin.

This is one of the most recent pictures, from a distance of 130 km and the image resolution is 2.4 metres per pixel. The large boulders at lower right are perhaps the size of a small school or commercial building.

How would you approach the problem of colonizing a body between Earth and Jupiter with this gravity (1/10000th Earth), this orbital and escape velocity (under half a meter a second), this 'geostationary height' (around 2-3 km radius), and this distance from the sun--aphelion 5.6839 AU (850,300,000 km), perihelion 1.2429 AU (185,940,000 km)? (Next perihelion is on 13 August 2015.)
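That "geostationary height" falls out of Kepler's third law applied to the 12.7-hour rotation period (again a back-of-envelope sketch using the quoted mass):

```python
import math

G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
M = 3.14e12               # quoted comet mass, kg
T = 12.7 * 3600           # rotation period, s

# Synchronous ("comet-stationary") orbit: orbital period equals rotation period.
r_sync = (G * M * T**2 / (4 * math.pi**2)) ** (1 / 3)
print(f"synchronous orbit radius ~{r_sync / 1000:.1f} km")  # ~2.2 km from the center
```

A ~2.2 km synchronous radius is barely above the ~2 km idealized surface, consistent with the "around 2-3 km" figure in the question.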

At the very least that implies a regular schedule of expanding and shrinking solar concentrators made of thin foils. And remember, sunrise to sunset is under 7 hours. (If you are planning on a mirror system in orbit to relay constant sunlight, remember outgassing season.) Many colonization architectures are possible. But would you bet your life on them?
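The concentrator-sizing problem is driven by the inverse-square falloff of sunlight. With the orbit quoted above (1.2429 AU perihelion, 5.6839 AU aphelion), the available flux swings by a factor of about 21 over one comet year:

```python
SOLAR_CONSTANT = 1361.0   # W/m^2 at 1 AU

def flux(au):
    """Solar flux at a given distance from the sun, W/m^2 (inverse-square law)."""
    return SOLAR_CONSTANT / au**2

perihelion, aphelion = 1.2429, 5.6839
print(f"perihelion: ~{flux(perihelion):.0f} W/m^2")        # ~881 W/m^2
print(f"aphelion:   ~{flux(aphelion):.0f} W/m^2")          # ~42 W/m^2
print(f"ratio: ~{flux(perihelion) / flux(aphelion):.0f}x") # ~21x swing
```

So any fixed collector sized for perihelion delivers only about 5% of that power at aphelion, which is exactly why the foils would have to expand and shrink on a schedule.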

  Reflections and comments welcome below.

Links of interest (The lander) (The orbiter) (The comet) (Why Clark Kent might have problems walking on 
Earth--your muscles can reach escape velocity too easily :)

The Rosetta Blog

It is controlled from ESA's Operations Centre in Darmstadt, Germany, the 'Houston' of ESA.
