March 07, 2008

Speaking in Toronto tomorrow, 2000th post

US Air force plans nuclear power plant and navy geothermal for bases

Proper framing of the transhumanist debate

Michael Anissimov has another article related to the defence of transhumanism from misguided attacks.

Transhumanism (sometimes symbolized by >H or H+), a term often used as a synonym for "human enhancement", is an international intellectual and cultural movement supporting the use of new sciences and technologies to enhance human mental and physical abilities and aptitudes, and ameliorate what it regards as undesirable and unnecessary aspects of the human condition, such as stupidity, suffering, disease, aging and involuntary death.

I think the discussions around transhumanism get bogged down because they too often skip over the history and trends in using technology to enhance human mental and physical abilities, and they add a bunch of convoluted and unnecessary intellectual baggage. Enhancement has been going on since someone first picked up a rock or stick to fight an animal, or to prepare one for food or clothing. Enhancement improved when someone sharpened the stick or the rock, or tied the sharpened rock to the stick. Riding a horse enhanced mobility. Writing and paper enhanced communication and memory. Steam engines enhanced mechanical and physical productivity.

Does every attempt at improvement succeed? No, but since individuals switch to whatever works better for them, there is a strong general trend toward improvement.

Wired has an interesting little article which I would categorize as transhumanist lite. It compares a hypothetical augmentation drug with coffee.

I would like to see some kind of definitions around “end of history”.

History is the continuous, systematic narrative and the research of events in the past of importance to the human race, including the study of events over time and their relation to humanity.

So there is history before Homo sapiens, and there would still be history even if Homo sapiens were modified.

As for augmentation with technology, both biological and non-biological: many people now carry smartphones around all the time, and some have cochlear implants, pacemakers and so on. People get lasik eye surgery; millions take steroids, test-score-enhancing drugs and vitamins; millions use cosmetics and cosmetic surgery. People drive cars and forklifts and ride Segways. Sometimes we integrate with technology, as with prostheses or surgery, and sometimes we don't. Either way, voluntary modification of all types is commonplace.

Smartphones provide access to the internet, Google, Wikipedia and so on, augmenting a person's ability to look up facts and other resources. They also provide tools such as calculators that enhance the ability to do math.

So technology has caused and is causing enhancement and modification.

So the only things at issue are how much more there will be and how this will change in the future.

Will the bandwidth to my smartphone increase? Will its processing power improve? Will the interface get better? (True, some companies could backslide on this, but people would then choose not to buy their phones.)

Some feature trends seem certain. Some of the products can be bought now, and some seem likely to become more available and cheaper in the future.

There is recent lab work on human power generation from regenerative braking while walking; it generates about 5 watts while someone is walking. A square meter of flexible solar cells could generate 19-56 watts depending upon location and sunlight conditions. The US army has solar tent material for generating up to 1,000 watts.

There is plenty of other energy that could be captured to power devices.

Broken into usable terms, waiting to be harvested are 81 watts from a sleeping person, 128 from a soldier standing at ease, 163 from a walking person, 407 from a briskly walking person, 1,048 from a long-distance runner, and 1,630 from a sprinter, according to the center. But of course there’s not 100% capture. Body heat, for example, can only be converted with 3% efficiency with current thermoelectric materials.
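A short sketch of the arithmetic, using the gross figures from the passage above and treating the ~3% thermoelectric figure as an illustrative capture efficiency (strictly, the article applies 3% only to body-heat conversion):

```python
# Rough harvestable-power estimates from the figures above.
# Gross body power output in watts; 3% is the current thermoelectric
# conversion efficiency, used here as an illustrative capture rate.
gross_watts = {
    "sleeping": 81,
    "standing at ease": 128,
    "walking": 163,
    "brisk walking": 407,
    "long-distance running": 1048,
    "sprinting": 1630,
}

CAPTURE_EFFICIENCY = 0.03  # ~3% with current thermoelectric materials

for activity, watts in gross_watts.items():
    usable = watts * CAPTURE_EFFICIENCY
    print(f"{activity}: {watts} W gross -> ~{usable:.1f} W harvested at 3%")
```

Note that walking gives about 4.9 W at this efficiency, in line with the ~5 W regenerative-braking figure mentioned earlier.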

The best new chips compute at about 19.4 gigaflops per watt. This is a continually improving metric (like flops per dollar, it tracks Moore's law).

Exaflop processing with configurable semi-custom processors is achievable by 2015. Other kinds of computers could also succeed and provide different kinds of computational advantages: quantum computers, optical computers, artificial intelligence and neural simulators.

Lightweight batteries and ultracapacitors offer 100-300 watt-hours/kg.

So within, say, 10 years, carrying around teraflop-plus-class smartphones that are constantly charged and have gigabit-plus connections seems likely. One would also be able to network to quantum computers and supercomputers of all types for remote processing. Other components and gear could be carried and powered as well: 36V power tools (not used constantly unless you had access to a solar tent or were capturing a lot more of the ambient power), human-body-powered tasers, vision enhancement gear and so on. This does not include all the gear you might have in a future car or around your home. You could also have availed yourself of immune system enhancement, myostatin inhibitors and the like.
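A back-of-envelope check on the teraflop smartphone timeline, assuming flops-per-watt keeps doubling about every 18 months from the 19.4 gigaflops-per-watt figure above, and assuming a rough 2-watt sustained handheld power budget (both assumptions are mine, not from the sources):

```python
# Sketch: when could a 1-teraflop chip fit a smartphone power budget?
gflops_per_watt = 19.4    # best chips, 2008 (figure quoted above)
doubling_years = 1.5      # assumed Moore's-law-like pace
phone_budget_watts = 2.0  # assumed sustained budget for a handheld
target_gflops = 1000.0    # 1 teraflop

year = 2008.0
while target_gflops / gflops_per_watt > phone_budget_watts:
    gflops_per_watt *= 2
    year += doubling_years

print(f"~{year:.1f}: 1 teraflop at about "
      f"{target_gflops / gflops_per_watt:.2f} W")
# -> ~2015.5: 1 teraflop at about 1.61 W
```

Five doublings suffice, which lands comfortably inside the 10-year window suggested above.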

Would individuals choose a better, faster phone? Those who argue against enhancement are saying no: that at some point people will stop choosing better products.

Comments on the class arguments of rich versus poor
Would a businessperson or someone with more money have a better phone? Maybe. Students without much income still own iPhones. How about medical procedures? Who is paying $500-1,000 for lasik now? Is it only the wealthy?
There are 3 billion people with some kind of cellphone now. Some technology is penetrating to even the poorest people in the world.

There are choices and priorities now and there will be options, choices and priorities in the future.

Those who ridicule the idea that technology can make us even richer are wrong

People who take advantage of opportunity, technological or otherwise, are and have become richer. Ray Kurzweil and Bill Joy are both what I consider rich. I do not know what quantity of wealth counts as "beyond dreams of avarice". It appears to be a useless, subjective phrase with an anti-wealth bias, most frequently used by Berkeley communists.

One could look at the economic history of the world: take the list of regions by past GDP on a PPP basis and divide by populations to get per capita levels of wealth. In the thought experiment of asking someone from one of those past times to compare their economic lot with someone at a place and time with more technology, it seems pretty plain that the place with more technology is richer, with a larger fraction of people who would be classified as rich. There seems to be no reason to believe that this trend to wealth, enabled by better technology and economy, will end. Those who argue about resource limits are ignoring vastly improved nuclear fission, successful development of nuclear fusion, and development of economical space travel and technology for tapping the resources of the solar system, which I discuss throughout my thousands of articles.

Nick Bostrom discusses the impact of a safe treatment producing a 1% general cognitive improvement.

One person who posted anonymously on the Chronicle of Higher Education Web site said that a daily regimen of three 20-milligram doses of Adderall transformed his career: “I’m not talking about being able to work longer hours without sleep (although that helps),” the posting said. “I’m talking about being able to take on twice the responsibility, work twice as fast, write more effectively, manage better, be more attentive, devise better and more creative strategies.”

Surveys of college students have found that from 4 percent to 16 percent say they have used stimulants or other prescription drugs to improve their academic performance — usually getting the pills from other students.

In a recent commentary in the journal Nature, two Cambridge University researchers reported that about a dozen of their colleagues had admitted to regular use of prescription drugs like Adderall, a stimulant, and Provigil, which promotes wakefulness, to improve their academic performance.

March 06, 2008

Bakken oil update for Saskatchewan and Crescent Point

February's sale of oil and natural gas drilling rights in Saskatchewan smashed the record for single-sale revenues, bringing in a whopping $197 million in bonus bids, more than double the previous record of $85 million set in 1994. The overall Bakken oil play stretches across North Dakota, South Dakota, Montana, Manitoba and Saskatchewan. It is potentially a Saudi Arabia of oil. Estimates of the ultimate oil contained in the entire Bakken play range from 271 billion to 503 billion barrels, with a mean of 413 billion barrels of technically recoverable and irrecoverable oil. Saskatchewan could have 25% of that oil.
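A quick check of what a 25% Saskatchewan share would imply, using the oil-in-place estimates quoted above (these are in-place figures, not recoverable reserves):

```python
# Saskatchewan's potential share of the Bakken play, at 25% of the
# total in-place estimates quoted above (billions of barrels).
estimates = {"low": 271, "mean": 413, "high": 503}
sask_share = 0.25

for label, billions in estimates.items():
    sask = billions * sask_share
    print(f"{label}: ~{sask:.0f} billion barrels under Saskatchewan")
```

The mean case works out to roughly 103 billion barrels in place under the province.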

The focus of attention was the Bakken play, which accounted for more than 80 per cent of the $132 million in bonus bid revenues in the southeast. One company that has been at the forefront of the Bakken play is Crescent Point, a Calgary-based energy trust. "We're the largest (player in the Bakken) in land and production and facilities and drilling," said Crescent Point president and CEO Scott Saxberg.

And Crescent Point plans to make the Bakken the focus of its operations again this year. "In 2008, we've budgeted for Crescent Point about $175 million." Crescent Point is also 20-per-cent owner and operator of Shelter Bay Energy Inc., a privately held oil and gas company, which plans to spend up to $150 million in the Bakken play in Saskatchewan this year.

Extensive details on Crescent Point Energy Trust's activity in Saskatchewan to exploit the Bakken oil play

Crescent Point is currently the dominant producer in the southeast Saskatchewan Bakken resource play with more than 12,000 boe/d of production. The Trust also has the largest undeveloped land base in the play, with 360 net sections of undeveloped Bakken land and more than 1,000 net low risk Bakken drilling locations representing over 10 years of inventory.

Crescent Point believes the Viewfield Bakken play is the second largest conventional oil play ever discovered in western Canada, containing an estimated 3.0 billion barrels of Original Oil in Place ("OOIP"). Bakken oil reserves are high quality, consisting of 42 degree API light sweet oil and liquids rich associated gas. Crescent Point's third quarter 2007 Bakken netback was CDN$62.71 per boe.

As part of its commitment to Shelter Bay, Crescent Point will farmout to the Company 22 net sections of its inventory of 360 net undeveloped Bakken sections. Under the terms of the farmout agreement, Crescent Point will retain interests in up to 50 percent of the lands and production, earning cash flow and reserves on these sections and increasing the Trust's net asset value with limited capital requirements. Shelter Bay is expected to drill up to 40 gross wells on these farmin lands in 2008 and a further 40 gross locations in 2009.

Crescent Point is producing in excess of 33,500 boe/d [2008], mainly due to drilling and fracture stimulation success in the Bakken play.

Between the two companies, about 140 to 150 wells will be drilled in southeast Saskatchewan, and close to $400 million will be spent in the province, Saxberg said.

Another oil play that attracted bidding was the Shaunavon heavy oil play, which has about 470 million barrels in place.

Carnival of Space Week 44

Uprating wind turbine blades and more efficient ceiling fans

WhalePower, based in Toronto, Ontario, is testing this wind-turbine blade at a wind-testing facility in Prince Edward Island. The bumps, or "tubercles," on the blade's leading edge reduce noise, increase its stability, and enable it to capture more energy from the wind.
Credit: WhalePower

I have talked frequently about nuclear power uprating (changes in fuel design and other changes that can increase the power generated at existing nuclear power plants). Now there are reports that bumpy wind turbine blades could uprate existing and future wind turbines. The new blades come from WhalePower of Toronto, Ontario, Canada. Uprates can be very good because you spend a lot less to get a lot more out of what is already there.

Prototypes of wind-turbine blades have shown that the delayed stall doubles the performance of the turbines at wind speeds of about 17 miles per hour and allows the turbine to capture more energy out of lower-speed winds. For example, the turbines generate the same amount of power at 10 miles per hour that conventional turbines generate at 17 miles per hour. The tubercles effectively channel the air flow across the blades and create swirling vortices that enhance lift.
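Available wind power scales with the cube of wind speed, so matching a conventional blade's 17 mph output at only 10 mph is a large effective gain. A quick check of the cubic ratio (standard wind-power physics, not a figure from the article):

```python
# Available wind power scales as v**3 (kinetic energy flux through
# the rotor). Compare the energy budgets at 17 mph and 10 mph.
v_conventional = 17.0  # mph, conventional blade's reference point
v_tubercle = 10.0      # mph, where tubercle blades match that output

ratio = (v_conventional / v_tubercle) ** 3
print(f"Available energy at 17 mph vs 10 mph: {ratio:.1f}x")
```

The claimed equal output at 10 mph, with roughly 4.9 times less energy available in the wind, would represent a large jump in capture efficiency at low wind speeds, which is why delayed stall matters.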

WhalePower can rapidly develop precise designs for retrofit leading edges or fully integrated tubercle-technology blades for any turbine.

-Retrofit blades are stronger than the original unmodified blades.
-Integrated blades meet or exceed all required performance criteria.

They should be commercially available later this year (2008).

Stephen Dewar, director of research and development at WhalePower, says that ongoing tests at the Wind Energy Institute of Canada, in the province of Prince Edward Island, have shown the tubercle-lined blades to be more stable, quiet, and durable than conventional blades. "The turbine has survived being hit by the edge of a hurricane, and it survived wind-driven snow and ice," he says.

WhalePower has also shown in demonstrations that tubercle-lined blades on industrial ceiling fans can operate 20 percent more efficiently than conventional blades can, and they do a better job at circulating air flow in a building.

March 05, 2008

Instead of Mechs - robotically driven Hummers

Michael Anissimov has a picture of a progression from World War 2 soldier to a mech.

The hypothetical Mech has a hypothetical price tag of $500K-700K.

Military hummers cost $150K armored and $60-75K unarmored. Of course they are not autonomous yet, although with the DARPA program for self-driving cars we could add that (and tack on some price, though not necessarily that much given continually falling computer and electronics prices, if acquired in volume). All the weapons that the 5th-gen system has could be mounted on a self-driving humvee. Since the armor probably would not stand up to the weapons, it is probably better to go cheap on armor and buy two unarmored vehicles instead of one armored.

The actual vehicle being developed is the CMU Crusher

The alternative to the "super alloy" of the Mech would be better ceramic armor or the new durable liquid metal (twice as strong as titanium), if you could get the exotic armor cheap enough and if it is worthwhile to make one tougher armored system instead of more, cheaper systems.

More on the CMU Crusher, the actual unmanned ground vehicle being developed

The Crusher program has cost $35 million so far. It is not clear what any final per-unit costs would be.

Current progress toward autonomous robots is substantially due to better 3D LADAR

More on the DARPA robotic driving systems

Unmanned ground vehicles

Wealth chart update

Forbes has its latest survey of the world's billionaires. Warren Buffett is back to being the world's richest man.

Wealth level   2007    2006        2005      2004
US$30B+        8       5           3         2
US$10B+        77      67          49        32
US$5B+         203     167         124       102
US$1B+         1,125   946         793       691
US$160M+       ??      9,400       8,200     7,500    (my own estimate)
US$30M+        ??      95,000      85,400    77,500   (UHNW, ultra high net worth class)
US$5-30M       ??      930,000(e)  820,000   745,000
US$1-5M        ??      8.6M        7.8M      7.4M     (global; US share about 33%, 2.6 million)
US$1M+         ??      9.5M        8.7M      8.2M     (global; US share about 33%, 2.6 million)
US$500K-1M     ??      ~26M        ~24M      ~22M     (excludes primary residence; estimate)

(The Forbes list mainly catches owners of public assets and can underestimate some, like the CTO of Cisco, who may be a billionaire from Cisco stock plus large startup positions.)

Sources of wealth for the rich and how much money it takes to be rich

Blue Brain status and the future of whole brain simulation

Seed magazine has a 9-page review of the Blue Brain project, the IBM project to simulate a human brain on a supercomputer. Currently they have simulated one column of a neocortex (of a rat) with 10,000 neurons and 30 million synapses using a 22.8-teraflop supercomputer. (A human neocortex column has 60,000 neurons.) Their simulation uses 400 segments for each neuron, and they have precisely researched individual ion channels and biological functions to generate the simulation, so its emergent results come from the ground up, matched to physical measurements.

I think this project has similarities to the human genome project. There is value, and things to be learned, in having a precise functioning computer model of a human brain, or even parts of a human brain. There will be even more to learn when we can affordably make many such models for different people. A follow-on goal of the human genome project is to sequence the genomes of 100,000 people, which is a step toward the goal of everyone having a copy of their personal genome. The follow-on to Blue Brain, or other projects to simulate the human brain, is to simulate the brains of different types of people and then personal brain simulations. In parallel would be whole-body simulations down to the intracellular and then molecular level. After you have working systems, you can reduce the complexity where it is unnecessary; for example, 1% of the genome is where the important differences between people are located. Of course, long-running and molecularly accurate personal brain simulations would be equivalent to mind uploading.

A whole human brain has 100 billion neurons and 1 trillion synapses, so they need to scale up the neurons by 10 million times and the synapses by about 33,000 times. They believe the computing power to do this will be available in less than 10 years, so by 2017 there should be a single whole-brain simulation. Personalized whole-brain simulation would follow by 2027-2037.
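The scale-up factors quoted above can be checked directly from the figures:

```python
# Check the scale-up from the simulated rat cortical column to a
# whole human brain, using the figures quoted above.
sim_neurons, sim_synapses = 10_000, 30_000_000
brain_neurons, brain_synapses = 100_000_000_000, 1_000_000_000_000

neuron_factor = brain_neurons // sim_neurons
synapse_factor = brain_synapses // sim_synapses

print(f"neuron scale-up: {neuron_factor:,}x")     # 10,000,000x
print(f"synapse scale-up: ~{synapse_factor:,}x")  # ~33,333x
```

The neuron count is the harder target: it needs a factor of 10 million, versus roughly 33,000 for synapses.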

Benefits of the blue brain project

Gathering and Testing 100 Years of Data

Cracking the Neural Code

The Neural Code refers to how the brain builds objects using electrical patterns. In the same way that the neuron is the elementary cell for computing in the brain, the NCC (neocortical column) is the elementary network for computing in the neocortex. Creating an accurate replica of the NCC, one which faithfully reproduces the emergent electrical dynamics of the real microcircuit, is an absolute requirement for revealing how the neocortex processes, stores and retrieves information.

Understanding Neocortical Information Processing

A Novel Tool for Drug Discovery for Brain Disorders

A Global Facility [to test theories of brain function]

A Foundation for Whole Brain Simulations

A Foundation for Molecular Modeling of Brain Function

Now that the column is finished, the project is pursuing two separate goals:
- construction of a simulation at the molecular level, which is desirable since it allows researchers to study the effects of gene expression;
- simplification of the column simulation to allow for parallel simulation of large numbers of connected columns, with the ultimate goal of simulating a whole neocortex (which in humans consists of about 1 million cortical columns).

UPDATE: Reader svante pointed out the need for non-destructive brain scanning for personalized brain simulation. It would indeed be needed, and below is a survey of the current status of that work.

Neuroimaging now

There has been precise monitoring of animal brain function via 2-photon microscopy, using a clear window into living brains. If the functional end of the microscope could be made a lot smaller and placed under the skull, it could gather much more of the information needed to personalize the scans.

There is also detection of chemical reactions in living human cells

Other advances in faster and more sensitive cell viewing

An MIT group is working on brain cell activity monitoring

An optical microscope has better than 10nm resolution

Al Fin comments on this bottom-up approach, the progress on top-down analysis of brain function, and merging both efforts.

Nanomagnets with controlled Quantum tunneling could create powerful quantum computers

There is the possibility that single-molecule nanomagnets or nanoparticle nanomagnets could be used to create powerful quantum computers. If quantum tunneling could be controlled in nanomagnets, then they could be used to create quantum computer gates or devices. The current work shows that quantum tunneling can be turned on and off in nanomagnets.

According to quantum mechanics, small magnetic objects called nanomagnets can exist in two distinct states (i.e. north pole up and north pole down). They can switch their state through a phenomenon called quantum tunneling.

When the nanomagnet switches its poles, the abrupt change in its magnetization can be observed with low-temperature magnetometry techniques used in del Barco’s lab. The switch is called quantum tunneling because it looks like a funnel cloud tunneling from one pole to another.

Del Barco's published paper shows that two almost-independent halves of a new magnetic molecule can tunnel, or switch poles, at once under certain conditions. In the process, they appear to cancel out quantum tunneling.

“It’s similar to what can be observed when two rays of light run into interference,” del Barco said. “Once they run into the interference you can expect darkness.”

Controlling quantum tunneling shifts could help create the quantum logic gates necessary to create quantum computers. It is believed that among the different existing proposals to obtain a practical quantum computer, the spin (magnetic moment) of solid-state devices is the most promising one.

Here is a paper on their previous work to make single electron transistors to study single molecule nanomagnets

Quantum computers could make very powerful pattern recognizers

Here is a research paper that discusses using adiabatic quantum computation using liquid state nuclear magnetic resonance quantum computers for pattern recognition.

Pattern recognition is a critical aspect of artificial intelligence. I have a new article, "Artificial Intelligence: You are soaking in it", which discusses the mostly ignored penetration of artificial intelligence and its future, with a lot of links to my articles on near-term pathways to computers millions of times more powerful than today's.

Via Ars technica

Dwave systems has used their quantum computer for image matching.

A novel quantum pattern recognition scheme is presented, which combines the idea of a classic Hopfield neural network [shown to the left] with quantum adiabatic computation. Both the input and the memorized patterns are represented by means of the problem Hamiltonian. In contrast to classic neural networks, the algorithm can simultaneously return multiple recognized patterns. The approach also promises extension of classic memory capacity. A proof of principle for the algorithm for two qubits is provided using a liquid state NMR quantum computer.

In contrast to classic neural networks, a quantum neural register can represent a superposition of recognized patterns. Quantum superposition allows each of these patterns to be identified which is not the case for linearly combined mixture states in classic neural networks.
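As a point of comparison for the quantum scheme described above, here is a minimal classical Hopfield sketch (the patterns, sizes, and update rule are illustrative choices of mine, not taken from the paper): store two bipolar patterns via Hebbian weights, then recover one from a corrupted input. The quantum adiabatic version encodes the same memories in a problem Hamiltonian and, unlike this classical loop, can return a superposition of recognized patterns.

```python
import numpy as np

# Minimal classical Hopfield network: store two bipolar patterns,
# then recall one from a corrupted input.
patterns = np.array([
    [1, -1, 1, -1, 1, -1],
    [1, 1, 1, -1, -1, -1],
])

# Hebbian weight matrix with zeroed diagonal (no self-connections).
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    """Iterate synchronous sign updates until (hopefully) a fixed point."""
    state = state.copy()
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1  # break ties consistently
    return state

noisy = np.array([1, -1, 1, -1, 1, 1])  # first pattern, last bit flipped
print(recall(noisy))  # settles back to the first stored pattern
```

A classical recall returns exactly one settled pattern per run; the quantum register described above could represent several recognized patterns simultaneously.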

March 04, 2008

Weekly Iraq update with a view on current and future pocketbook

The US State Department gives a view of the oil situation in Iraq. A better oil situation means more chance for oil prices to improve and better chances for the US to spend less money and fewer resources there at some point, which might be earlier than otherwise.

Nanoscale Metamaterial could create wireless optical computer

Nader Engheta, a professor at the University of Pennsylvania, has work that provides "a vision, consisting of building blocks, along with instructions on how to arrange them together to enable transplanting well-known passive inductor-capacitor-resistor [LCR] electrical networks to the optical domain. This includes the direct optical realization of filters, antennas, power-distribution networks, microwave transmission-line metamaterials and many more." (H/T to EE Times.) He has theories about creating equivalent optical versions of all of the components of electrical computers. Some of the components would require new types of not-yet-created metamaterials to be developed.

UPDATE: This new development target of nanoscale metamaterial components for precisely controlling photons for optical computers is related to the Alex Zettl nanoradio development in that both would use nanoscale structures to control spectrum and enable exceptional capabilities.

An optical computer with nanoscale components would be insanely fast. Designs could have massive levels of parallelization, plus the speed boost from optical communication speeds and volumes versus slower electrical propagation.

Negative-refractive-index materials were demonstrated less than a decade after John Pendry, a professor of theoretical physics at Imperial College London, proposed (hugely controversially) that they were possible. Such precedent may bode well for Engheta's vision, some believe.

Two teams are already at work trying to demonstrate the basic nanocircuit principles. With his colleagues, Rohit Prasankumar of the Center for Integrated Nanotechnologies at Los Alamos National Laboratory is working on optical nanoantennas that he says should operate as lumped nanocircuit elements at visible wavelengths. "We are currently fabricating these nanoantennas and hope to test their operation as nanocircuits using optical scattering experiments shortly afterward," Prasankumar said. "Subsequent experiments will include design, fabrication and testing of more-complex nanocircuits to achieve a desired functionality"--for example, nanotransmission lines.

Prasankumar sees the endeavor as "one of the most exciting developments to emerge from research into metamaterials and their applications in the last few years, particularly if we are successful in making Prof. Engheta's theoretical ideas a reality. I am excited to be working on this project, and hope to have a working optical nanocircuit in the near future."

Penn's physics department is also working on the problem. "We plan to construct specially designed grating structures with periods much less than the operating wavelengths, and then experimentally verify the performance of such nanostructures in terms of optical reflection and transmission," said Penn physicist Marija Drndic.

According to Engheta's predictions, such nanostructures should act as optical filters at nanoscales, she said--for example, bandpass or bandstop filters depending on incident polarization. If successful, Drndic said, the experiment will show "that his concept of lumped circuit elements at optical frequencies will indeed provide useful recipes for design of optical nanocircuits with various functionalities."

Here is Nader Engheta's website

Engheta said he is interested in the possibility of creating switches from metananocircuitry. They could lead to a new kind of optical information processing and, perhaps, a new form of nanoscale computational unit, said Engheta, the H. Nedwill Ramsey Professor of electrical and systems engineering at Penn.

Engheta's theory relies on three basic ideas. The first is that nanoparticles of various materials have properties that can be matched to electronic equivalents (such as resistance, inductance and capacitance). Further, the nanoparticles can be thought of as "lumped components" that can be connected together into circuits by using additional guiding structures. Finally, the concept of metamaterials--in which composite materials exhibit properties that are dictated by their nanoscale structures rather than their chemistry--is crucial for the design of efficient devices.

The building blocks in Engheta's world are dielectric nanoparticles, Eleftheriades explained. Conventional dielectric nanoparticles--those with positive permittivity--"can realize optical capacitors," he said, whereas negative plasmonic nanoparticles, which have negative permittivity, can realize optical inductors and resistors.

March 03, 2008

Key protein, FOX03a, helps shield some people from HIV, could lead to HIV vaccine

In an advance online edition of Nature Medicine, the scientists explain how the protein FOX03a shields against viral attacks and how the discovery will help in the development of an HIV vaccine. This research could also advance therapies for other viral diseases.

"Given their perfect resistance to HIV infection, elite controllers represent the ideal study group to examine how proteins are responsible for the maintenance of an immune system with good anti-viral memory," said Dr. Haddad. "This is the first study to examine, in people rather than animals, what shields the body's immune system from infection and to pinpoint the fundamental role of FOX03a in defending the body."

Beyond HIV treatment, Dr. Sékaly said his team's discovery offers promise for other immune diseases. "The discovery of FOX03a will enable scientists to develop appropriate therapies for other viral diseases that weaken the immune system," he said, citing cancer, rheumatoid arthritis, hepatitis C, as well as organ or bone marrow transplant rejection.

In separate but related news, Rockefeller University researchers have built a device that, by allowing scientists to turn genes on and off in actively multiplying budding yeast cells, will help them figure out more precisely than before how genes and proteins interact with one another and how these interactions drive cellular functions.

Although scientists have had the tools to track single cells and measure the protein levels within them, the new device allows scientists to track them for a longer period of time while not only monitoring but also controlling the activity of genes. The precision with which the device can track single cells also allows scientists to construct pedigrees, making it possible to compare gene activity from one cell to the next.

The device relies on electrovalves to control a flow of media, which travels through a tube and then diffuses across a porous membrane to reach the budding yeast cells. The cells are clamped between this membrane and a soft material, which forces them to bud horizontally without damage.

“That was the major design hurdle,” says Charvin. “To create a device in which cells don’t move, so that you can track hundreds of single cells for a long time — about eight rounds of cell division — which typically lasts 12 hours.”

In order to induce the activity of a gene, the researchers used inducer molecules that diffuse through the cell membrane and control DNA segments called promoters. The molecule’s presence silences the promoter, which silences the expression of the gene; the molecule’s absence, on the other hand, activates the promoter, which activates the gene to crank up the molecule’s production.

By exploiting this principle, the scientists showed that they could successfully turn specific genes on and off by controlling the flow of an inducer molecule called methionine. They observed that pulses as short as 10 minutes led to changes in protein levels that could be measured.

Craig Venter does not make small claims

Craig Venter, the billionaire who previously created a company that was a key part of sequencing the human genome, claims that he thinks he will have fourth-generation fuels in about 18 months, with CO2 as the feedstock.

"We have modest goals of replacing the whole petrochemical industry and becoming a major source of energy," Venter told an audience.

Simple organisms can be genetically re-engineered to produce vaccines or octane-based fuels as waste, according to Venter.

Biofuel alternatives to oil are third-generation. The next step is life forms that feed on CO2 and give off fuel such as methane gas as waste, according to Venter.

His team is using synthetic chromosomes to modify organisms that already exist, not making new life, he said. Organisms already exist that produce octane, but not in amounts needed to be a fuel supply.

Hat tip to Alfin for the following links on biofuels and bio-ethanol:
Global Research Technologies in Arizona, Sandia Labs in New Mexico, and Los Alamos are working on making biofuel from CO2. Los Alamos has the Green Freedom project, which would also produce biofuel from CO2. The price of gasoline at the pump would need to be US$4.60 per gallon for the current Green Freedom process to be profitable; the calculation assumes the incorporation of a nuclear reactor to provide electricity for the process.

The Green Freedom process is described in this PDF.

There are several groups working on very inexpensive bioethanol.

ZeaChem and Coskata are promising high-volume bio-ethanol at US$1 a gallon by 2012. New Canadian startup Syntec promises even cheaper bio-ethanol--40 cents a gallon.

Why the Nanodynamics IPO failed

The story about why the Nanodynamics Dubai IPO failed/was withdrawn.

According to a Nanoclarity investigation it was Global Crown Capital Ltd. (the lead underwriter) who did not provide all the money required for the IPO.

Semiconductor fabrication status

IBM is producing PS3 cell processors using 45 nm processes on 300mm wafers

The die size of the Cell processor at 90 nm features was 221 mm**2. Therefore, the 45nm die size should be about 55-70 mm**2.

So a perfect wafer would have about 300 cell processors at 90nm and would have 1000 cell processors at 45nm.
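The die-size scaling and dies-per-wafer figures above can be sanity-checked with a standard dies-per-wafer approximation (the edge-loss correction term is a common rule of thumb, not IBM's actual yield model):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Standard approximation: gross dies minus an edge-loss correction."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Die area scales with the square of the feature-size ratio:
die_45nm = 221 * (45 / 90) ** 2        # 221 mm**2 at 90 nm -> ~55 mm**2

cell_dies_90nm = dies_per_wafer(300, 221)       # ~275 Cell dies per 300 mm wafer
cell_dies_45nm = dies_per_wafer(300, die_45nm)  # ~1190 dies per wafer
```

These come out close to the round figures quoted above (about 300 dies at 90 nm and about 1000 at 45 nm, before yield losses).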

Intel has announced 16 processors using the 45nm process

Most Intel processors are still in the 100-220 mm**2 die size range

Intel is spending about $7.5 billion to open four 45nm chip processing plants.

Intel is planning to start the shift to 32 nm processes in late 2009

Semiconductor plants produce about 50,000 wafers per month at maximum production, though plants can be in the 15,000-20,000 wafer-per-month range as well.

Some of the newest fabrication plants are targeting 60,000 wafers per month

These plants cost about $3 billion each.

Advanced printable electronics or nanofactories would need to reach those levels of production given similar investments. For the same $3 billion, one thousand $3 million printable-electronics machines would each need to produce only 60 wafers (18,000 chips at a 200 mm**2 die size) per month of similar-performance chips to be competitive with current semiconductor fabrication.
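The competitiveness claim above reduces to a simple capital-efficiency comparison. A sketch, using only the figures quoted in this post:

```python
fab_cost = 3_000_000_000   # one modern fab, USD
fab_output = 60_000        # wafers per month from the newest fabs
machine_cost = 3_000_000   # one hypothetical printable-electronics machine, USD
dies_per_wafer = 300       # ~200 mm**2 dies on a 300 mm wafer, after edge loss

machines = fab_cost // machine_cost           # 1000 machines for the same capex
wafers_per_machine = fab_output // machines   # 60 wafers per machine per month
chips_per_machine = wafers_per_machine * dies_per_wafer  # 18,000 chips per month
```

In other words, at equal capital spending each cheap machine only has to match 1/1000th of a fab's output to break even on throughput.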

Printable electronics, and Fujitsu uses carbon nanotubes to grow graphene

China currency and economy will be third largest in 2008

The Chinese yuan is 7.10 to the US dollar

The Chinese yuan is 10.85 to the Euro

I have previously predicted that China's economy will be larger than the US economy on an exchange rate basis before 2020, and I think it is most likely to happen in the 2017-2019 range.

The German economy ended 2007 at about 2.364 trillion Euro.
China's economy ended 2007 at about 26.3 trillion Yuan. This does not include Hong Kong and Macau.
China's economy grew at about 10% in 2007 and Germany's at about 1.6-1.8%.

So currently, as we approach the end of Q1 2008, if the euro is at 10.23 yuan or less then China is the number 3 world economy. By the middle of the year, 10.5 or better would put China at number 3, and by the end of the year 10.9 or better would be enough.

If Hong Kong and Macau are included, then an exchange rate of 10.86 would put China at number 3, at about US$3.63 trillion now.
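The ranking logic here is just a currency conversion: China takes the number 3 spot whenever its yuan-denominated GDP, converted at the prevailing yuan-per-euro rate, exceeds Germany's euro-denominated GDP. A sketch using the year-end 2007 figures quoted above (the quarter-by-quarter thresholds in this post evidently also fold in projected 2008 growth, so they differ somewhat from the raw 2007 ratio):

```python
def china_gdp_in_euros(gdp_cn_trillion_yuan, yuan_per_euro):
    """Convert China's GDP from trillion yuan to trillion euros."""
    return gdp_cn_trillion_yuan / yuan_per_euro

def breakeven_rate(gdp_cn_trillion_yuan, gdp_de_trillion_euro):
    """Yuan-per-euro rate below which China's GDP exceeds Germany's.

    A stronger yuan (fewer yuan per euro) raises China's euro-denominated GDP.
    """
    return gdp_cn_trillion_yuan / gdp_de_trillion_euro

def china_is_third(gdp_cn_trillion_yuan, gdp_de_trillion_euro, yuan_per_euro):
    return china_gdp_in_euros(gdp_cn_trillion_yuan, yuan_per_euro) > gdp_de_trillion_euro
```

Plugging in the 2007 year-end figures (26.3 trillion yuan vs 2.364 trillion euro) gives a breakeven around 11.1 yuan per euro; the tighter thresholds quoted in the post reflect comparing against Germany's growing 2008 GDP.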

The expectation is that the euro will weaken again over the next few months, to around 10.6 yuan.

Something to keep an eye on is the Taiwanese presidential election on March 22, 2008. I have predicted that Ma Ying-jeou of the KMT will win; the KMT's landslide legislative victory and most of the polls suggest that this is a safe prediction. The main differences with a KMT victory would be direct airplane flights between China and Taiwan and pretty much unrestricted investment from Taiwan into China. It should also mean greatly reduced political and military tensions between Taiwan and China, with the likely prospect of some kind of peace and unification talks at some point in the near future. Positive results there seem likely to help the long-term economic outlook for both Taiwan and China.
