May 11, 2007

Nuclear powered oil sands follow up


UPDATE: A Globe and Mail article appears to show that the business and political pieces are in place; formalities of approval are still needed. Shell will buy 70% of the power. Local support (a vote of 300 to 5), along with provincial and federal support, is in line.

For nuclear power to replace the burning of natural gas in extracting oil from the oil sands would take about 4.4 GW of nuclear power per million barrels per day of oil extracted (according to Wayne Henuset, director of Energy Alberta Corporation, whose estimate has a 2.2 GW twin reactor separating 500,000 bpd). Ten million bpd would take about twenty 2.2 GW twin reactors. A detailed analysis is provided in the journal Nuclear Energy, written by Atomic Energy of Canada and Canadian Energy Research Institute scientists.

A paper, "Opportunities for CANDU for the Alberta Oilsands", from the journal Nuclear Energy (peer reviewed), is probably the definitive word on how much oil could be separated using a CANDU reactor.

SAGD can recover over 50% of the initial volume of crude bitumen in place.
An average steam/oil ratio of 2-3 is required.
Working from a high-quality steam output of 62,400 m³/day, a cumulative steam/oil ratio of 2.5 and an operating capacity of 93%, the results were consistent with 146,000 barrels per day of bitumen production.
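A quick sanity check of those numbers (the barrels-per-cubic-metre conversion factor is mine, not the paper's):

```python
BBL_PER_M3 = 6.2898         # barrels per cubic metre (standard conversion)

steam_m3_per_day = 62_400   # high-quality steam output (cold water equivalent)
steam_oil_ratio = 2.5       # cumulative steam/oil ratio
capacity_factor = 0.93      # operating capacity

# oil volume is steam volume divided by the steam/oil ratio
bitumen_bpd = steam_m3_per_day / steam_oil_ratio * BBL_PER_M3 * capacity_factor
print(round(bitumen_bpd))   # ~146,000 bbl/day, matching the paper's figure
```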

A typical advanced CANDU reactor

Configuration of a reactor as part of oilsands project

A 728 MWe (gross) nominal electric output ACR-700 design generates 1983 MW (thermal).
The CANDU reactor can be adapted to provide steam of 2-6 MPa.
An ACR-700 in one configuration would provide 140 MWe (net) and 420,000 barrels/day of steam at a supply pressure of 2.2 MPa. The production rate of bitumen using this steam would depend on the steam/oil ratios required in the SAGD wells. For steam/oil ratios of 2.0-2.5 the bitumen production rates would be 168,000-210,000 bbl/day. The project would achieve a 10% advantage in steam cost even if natural gas were at USD 3.25/mmbtu.

The twin 2.2 GWe reactor proposal would generate 507,000 to 634,000 bbl/day in a similar configuration with similar assumptions.

Cost sensitivity of the project

"One reactor (would be) in 2016 and the second one would be in 2017 ... We're taking it to where we feel there's less resistance (from the public)," corporation director Wayne Henuset told Reuters.

"We hope to site it and talk to the communities in the next two months," he said in an interview on the sidelines of a nuclear industry seminar.

Two further reactors are planned for a later unspecified date.

Canada's Natural Resources Minister Gary Lunn told Reuters in January that, in theory, he liked the idea of nuclear power for the oil sands.

Henuset said Atomic Energy of Canada Ltd. -- the government-owned manufacturer of the Candu -- estimated it could build the first reactor in 36 months.

He also said he hoped that nuclear waste from the plant would be stored either on site or in special chambers until it could be reused.

Shell Canada Ltd. Chief Executive Clive Mather told Reuters in January that although he was not ready to buy into the nuclear concept, it could offer a price advantage over time. Shell is a major oil sands operator.

The World Nuclear Association estimates natural gas is 60 percent of an oil-sands facility's operating costs.

A new Steam Assisted Gravity Drainage (SAGD) process has been developed and proven in Canada during the last decade. It is compatible with the steam conditions from CANDU reactors and would release about 0.10 tonnes of CO2 per barrel for extraction and upgrading of bitumen from much deeper deposits.

Reviewing the short history of oil sands production suggests that, based on current production and past rates of growth, production in 2050 would reach about 1.5 billion barrels/year (4 million bpd). About half would come from in situ projects. Presuming the production rate increases at a higher than historical rate of 5%/year, compounded annually, results in production of 3 billion barrels/year (8 million bpd) by 2050.
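The growth arithmetic can be checked directly (the 2007 start year and the roughly 1 million bpd starting rate, cited later in this post, are my assumptions):

```python
start_year, end_year = 2007, 2050
start_bpd = 1_000_000    # assumed current oil sands output (~1 million bpd)
growth = 0.05            # higher-than-historical annual growth rate

bpd_2050 = start_bpd * (1 + growth) ** (end_year - start_year)
print(round(bpd_2050 / 1e6, 1))        # ~8.1 million bpd, the article's ~8
print(round(bpd_2050 * 365 / 1e9, 1))  # ~3.0 billion barrels/year
```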

A single large dedicated CANDU 9 reactor could supply the steam and electricity to extract and upgrade about 600 million barrels of bitumen over a period of 30 years. The land area from which bitumen would be extracted is about 18 square miles requiring steam distribution and bitumen recovery piping from a centrally located 60,000 barrel/day plant of up to about 3 miles. Smaller reactors would be suitable for smaller production rates with shorter piping distance.

Henuset is quoted as saying that the 2.2 GW twin reactor would separate up to 500,000 barrels of oil a day.

In a speech to a high-powered business audience in New York last week, Prime Minister Stephen Harper said production from the oil sands — which now supply about one million barrels of crude a day — is now “on its way” to four million barrels by 2015, a target that exceeds the bullish 3.5 million barrels forecast used by the Canadian Association of Petroleum Producers.

Here is a pdf of a Wayne Henuset speech

The National Energy Board has forecast that oil sands production will increase fourfold by 2015, largely using steam assisted gravity drainage. If all the planned SAGD projects were to use natural gas as the fuel in cogeneration systems, natural gas demand in the region would skyrocket to over 3 billion cubic feet/day by 2015. That is more natural gas than all the rest of Alberta uses now. A 2.2 GW twin reactor replaces 375 million cubic feet/day, which is worth USD 1.095 billion/year if natural gas is at $8 per 1000 cubic feet.

[1000 cubic feet of natural gas is equal to 293 kWh. That works out to about 320 billion kWh per year in 2015. Twenty 2 GW reactors would be needed to fully replace the natural gas usage.]
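The bracketed conversion, spelled out (the 90% reactor capacity factor is my assumption):

```python
KWH_PER_MCF = 293        # kWh in 1000 cubic feet of natural gas

gas_cf_per_day = 3e9     # projected 2015 SAGD natural gas demand
kwh_per_year = gas_cf_per_day / 1000 * KWH_PER_MCF * 365
print(round(kwh_per_year / 1e9))           # ~321 billion kWh/year

reactor_kwh = 2e6 * 8760 * 0.90            # one 2 GW reactor, assumed 90% capacity factor
print(round(kwh_per_year / reactor_kwh))   # ~20 reactors
```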

Putting the brakes on the oil sands development is not the answer. Canada needs the oil sands. Conventional oil and gas reserves are declining. Energy conservation is important and necessary but it won't eliminate the need for fossil fuels in our economy. In addition, oil sands activities will lead to significant economic impact not only for Albertans but across the nation, with substantial increases in gross domestic product and employment, generating over $123 billion in tax revenues.

It makes no sense to squander precious and declining reserves of natural gas to make oil in the oil sands. That's simply like burning gold to make coal. The answer for this is using nuclear power.

The first strategy is using CANDU nuclear electricity generation to extract the oil from the carbonate triangle, involving potentially 450 [billion] barrels of bitumen. This is new. The second strategy is generating hydrogen and electricity for the upgrading of the bitumen. The third is providing steam supply for the SAGD process in the oil sands; and finally, generating the electricity needed by the utility companies in Alberta to keep up with our province's projected growth.

One ACR-1000 reactor would displace around 5 million tonnes of CO2 annually compared to an equivalent gas-fired generator, amounting to a savings of roughly $100 million annually in carbon dioxide costs if CO2 were priced at $20/tonne.
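As a rough cross-check of that displacement figure (the reactor rating, capacity factor and gas-plant emissions intensity below are my assumptions, not figures from the source):

```python
acr1000_mwe = 1_100          # assumed ACR-1000 net output, MWe
capacity_factor = 0.90       # assumed
gas_t_co2_per_mwh = 0.5      # assumed gas-fired emissions intensity, t CO2/MWh

mwh_per_year = acr1000_mwe * 8760 * capacity_factor
tonnes_co2 = mwh_per_year * gas_t_co2_per_mwh
print(round(tonnes_co2 / 1e6, 1))    # ~4.3 million tonnes/year displaced
print(round(tonnes_co2 * 20 / 1e6))  # ~$87 million/year at $20/tonne
```

The result lands in the right ballpark of the article's ~5 million tonnes and ~$100 million per year.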

So either you burn natural gas to get at the oil in the oil sands, or you use new processes where you burn some of the oil sand itself to get oil from it, or you build a lot of nuclear reactors. If oil prices stay high and we go past peak oil and prices go higher, then it seems that building the nuclear reactors to extract the most oil for other purposes is the way to go. If all current conventional oil in North America had to be replaced with oil from the oil sands, that would be about 24 million bpd, or 9 billion barrels per year. If Henuset/AECL/CERI are correct in the 500,000-630,000 bpd estimate, then 48 of the 2.2 GW twin reactors would be needed for the SAGD extraction process.
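Checking the replacement arithmetic (a sketch; 500,000 bpd is the low end of the estimate quoted above):

```python
import math

conventional_bpd = 24e6      # North American conventional oil to replace
per_twin_bpd = 500_000       # low end of the Henuset/AECL/CERI estimate

billions_per_year = conventional_bpd * 365 / 1e9
print(round(billions_per_year, 1))   # ~8.8, i.e. about 9 billion bbl/year

twin_reactors = math.ceil(conventional_bpd / per_twin_bpd)
print(twin_reactors)                 # 48 twin 2.2 GW reactors
```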

One way of viewing nuclear powered oil sands is to think of it as making a hybrid version of the global energy system.

Zyvex has interesting nanostructured materials planned

The Center for Responsible Nanotechnology talks about an interesting project that Zyvex has planned. Zyvex will create nanostructures with elements of molecular precision, which will help enable more powerful capabilities for controlling structures. The plans are to use silicon and silicon carbide, and possibly other materials, and to create connected layers.

Nanostructured material control would enable memory 100 times faster than hard drives

Whitespace wireless broadband: 80 Mbps by February 2009

The White Spaces Coalition (WSC) is a group of companies devoted to making use of white space in the analog television spectrum to offer wireless broadband. It's an impressive lineup: Microsoft, Google, Dell, HP, Intel, Philips, Earthlink, and Samsung are the group's public members; there are also a couple of Coalition members who prefer to remain unknown.

This article has links to petitions to ensure the freed spectrum is used for open wireless usage

May 10, 2007

Engines could be made 15-20% more efficient

New combustion engines in cars and other vehicles could be made 15-20% more efficient

In today's internal combustion engines, the pistons turn a crankshaft, which is linked to a camshaft that opens and closes the valves, directing the flow of air and exhaust into and out of the cylinders. The new method would eliminate the mechanism linking the crankshaft to the camshaft, providing an independent control system for the valves.

It would take 30-40 years for this kind of modification to filter through the cars we have running. About 4% of cars each year are new cars, and it would take 10 years or longer for the modification to be introduced into most new cars. Old cars without the modification would have to be taken off the road. Still, improvements like this are a good thing.
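A toy fleet-turnover model shows where the 30-40 year figure comes from (the 75% threshold and the assumption that every new car carries the change from day one are mine):

```python
new_share = 0.04     # fraction of the running fleet replaced by new cars each year
target = 0.75        # assumed threshold for "filtered through the fleet"

fleet_frac, years = 0.0, 0
while fleet_frac < target:
    # each year, 4% of the fleet is replaced by new (modified) cars
    fleet_frac = fleet_frac * (1 - new_share) + new_share
    years += 1
print(years)   # 34 years to reach 75% of the fleet
```

Add the decade it takes the modification to reach most new cars in the first place and you are in the 30-40 year range.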

Scalable process for growing long arrays of carbon nanotubes

Carbon nanotube arrays have been grown that are 2 cm long. More importantly, the process can be scaled up. A scalable process for building a lot of long carbon nanotubes could spin fibers into large components with most of the strength of carbon nanotubes. If we are able to make light and strong carbon nanotubes cheaply for large objects then we get a lot of the early nanotechnology vision: radically improved transportation and industry.

Internet lessons for space: NASA should only enable space infrastructure


Al Fin has two interesting articles: Mining space, the gold rush that never ends, and Making space launch more affordable. The articles also refer to my own article about using laser arrays that can be enhanced with mirrors to create a scalable, modular space launch and infrastructure system.

There are lessons to be learned from the growth of internet infrastructure:
- What were the enabling elements in the explosive growth of internet infrastructure, and which ones would help enable explosive space infrastructure growth?
- How could the internet service provider model be used to help create a space access provider model?
- How was commerce enabled, and how was private capital used to leverage government funds?
- How were standards funded and managed?
- How were improvements encouraged?

DARPA primed the pump for the internet. NASA should prime the pump for space access, and that should be its primary and maybe even sole purpose. Space access is a more capital-intensive task, so it justifies more focus.

What are some of the key design elements of the internet that should be followed for a good space infrastructure system?

1. The internet is about moving information but space access is about moving energy and matter. The internet drove down all sorts of costs associated with moving information. A revolution in space access needs to be architected to drive down the costs of getting more energy and matter.

2. Highly utilized tiered system that can be built modularly and incrementally

The internet has the backbone network and primary nodes and routers. It also has tiers of providers.

How internet infrastructure works

There are lessons to be learned in how the internet backbone was privatized, to plan how a modular space infrastructure could be incentivized and maximum leverage achieved.

Space access technology should look more preferentially on systems that are modularly expandable and scalable.

3. Most of the pieces of the internet infrastructure are automated with minimal configuration. The early internet also had a lot of manual work/configuration and knowledge barriers. Custom coding and configuration were costs and barriers. The space infrastructure should be cheap and automated.

4. As much as possible use commodity components and hardware that is used for other purposes. Computers were not just used for the internet. There was money and market forces already at play pushing the improvement of computers. This is an advantage to using laser modules. Lasers are being developed for military and industrial purposes. Lasers have a rapid improvement rate.

On the ground laser module:

Power module in space:

5. The build out of space infrastructure does not need to involve people in space. I am a huge proponent of space colonization and of a manned space program. However, those pieces should come after the space launch and energy generation systems are built (or from a separate pool of funding). Designs for space structures can be made to be completely automated and modular.

If something (a commoditized, cheap module of the overall system) does not work then it should be thrown away. There should be redundancy. Every part is cheap and replaceable.

Google has over 450,000 servers built from commodity components. Custom-build or assemble only when it helps in the scaling process.

The goal should be to drive $10,000/kg launches down to near the cost of the electricity needed to lift a kilogram into space ($5-10/kg). Then the modular space energy infrastructure should drive down the cost of electrical energy as well.
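A rough derivation of that electricity floor (the orbital speed, altitude, electricity price and end-to-end efficiency are my illustrative assumptions):

```python
v_orbit = 7_800.0        # m/s, approximate low-earth-orbit velocity
altitude = 200e3         # m, assumed orbit altitude
g = 9.81                 # m/s^2

# ideal energy to put 1 kg in orbit: kinetic plus potential
energy_kwh = (0.5 * v_orbit**2 + g * altitude) / 3.6e6
print(round(energy_kwh, 1))                # ~9.0 kWh/kg

# at an assumed $0.08/kWh and 10% wall-plug-to-payload efficiency
print(round(energy_kwh * 0.08 / 0.10, 2))  # ~$7.2/kg, inside the $5-10/kg range
```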

Supplying power for space and back to earth is an infrastructure that could be built up without the complexity and cost of involving people. The low-to-no gravity of space means that we do not have to build structures; we just have to have the pieces stay positioned relative to each other. The systems that are put into space should avoid complicated construction. Systems using magnetic or photonic positioning and formation flying could avoid the need for costly astronaut construction. Space telescopes are also infrastructure that is amenable to floating/inflating-in-place, non-construction construction.

Once the launch costs come down then the other activities will be justified. Space mining, colonization etc...

Comcast 150 mbps modem could launch in 2008

Comcast CEO Brian Roberts said his company plans to roll out a new cable modem that delivers 150 megabits per second (Mbps) of bandwidth. A 2008 trial and start of rollout seems likely. It is a DOCSIS 3.0 modem; DOCSIS information is at Wikipedia. A recent study predicted that by 2011, DOCSIS 3.0 CMTS use will be at 60% while DOCSIS 3.0 CPEs (modems, set-tops, eMTAs, etc.) will only be in 40% of cable broadband subscriber homes. CableLabs is also accelerating DOCSIS 3.0 trials and efforts.

Fiber is still four times faster, and Verizon has already been rolling out its service. The speed of the fiber is being increased 4-8 times. Verizon could have over 1.5 million FiOS subscribers by the end of 2007.

Currently in Verizon's network, a single fiber from a Verizon switching office has transmission speeds of 622 Mbps (megabits per second) downstream and 155 Mbps upstream. When the fiber reaches a neighborhood it is split up to feed multiple fibers, serving as many as 32 customers. With G-PON electronics, that same fiber from the switching office will have a downstream transmission speed of 2.4 Gbps (gigabits per second) and an upstream speed of 1.2 Gbps.

Comcast is sending a message that it intends to compete with Verizon's FiOS service at every step to woo customers with the fastest broadband in the land.

Verizon delivers more than 100 Mbps of bandwidth to customers connected to its $23 billion fiber optic FiOS network. However, in most FiOS markets, the top speed Verizon sells is 30 Mbps. In some markets where broadband competition is especially strong--like New York and New Jersey--a 50 Mbps service tier is available at the same price as the 30 Mbps tier sold elsewhere. FiOS-connected households number about 348,000 in 16 states.

While Verizon does have the fancier (fiber-optic) network, it's still pushing for better speeds. Verizon is now in the process of converting the network to a new and faster flavor of optical networking equipment.

Carnival of Space Week 2

Week 2 of the carnival of space is up at whyhomeschool

My post on laser arrays for low cost launches is part of the carnival

The pricing of the SpaceX Falcon 1 and 1E rockets is discussed by Space Transportation news.
The Falcon 1 can launch 670 kg to Low Earth Orbit, so at a $7 million price tag that is about $10,000/kg. The Falcon 9 variations, at prices of $35-78 million to launch 8,700 kg to 24,750 kg, translate to pricing of about $3,000-4,000 per kg to low earth orbit. The Falcon 9 should have test flights in 2009 and could be ready in 2010 or 2011.
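The price-per-kilogram figures follow directly from the quoted prices and payloads:

```python
def cost_per_kg(price_usd, payload_kg):
    """Launch price divided by payload mass to LEO."""
    return price_usd / payload_kg

print(round(cost_per_kg(7e6, 670)))       # Falcon 1: ~$10,400/kg
print(round(cost_per_kg(35e6, 8_700)))    # Falcon 9, small variant: ~$4,000/kg
print(round(cost_per_kg(78e6, 24_750)))   # Falcon 9, large variant: ~$3,200/kg
```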

Colony Worlds discusses the long term possibilities of Jupiter's moon Ganymede. There are a total of ten posts on space in the carnival of space week 2.

May 09, 2007

Thermoelectric nanostructure advances

From Nanowerk, Thermoelectric nanostructures could double the efficiency of engines and generators by converting wasted heat to electricity

Ted Harman at MIT's Lincoln Laboratory--building in part on Chen's earlier, unrelated work--showed that by using nanostructures, you can create materials that outdo nature: Some of Harman's materials, thanks to their unique heat-impeding qualities, are twice as efficient as their conventional cousins.

Chen says innovations like an exhaust-mounted energy-mining device for vehicles needn't wait until you hit Lincoln Lab realms of efficiency. "If you can reach a 10-to-15 percent conversion efficiency," he says, "that would be attractive for many applications." In fact, results he's had at that level are already drawing interest from companies.

Thermoelectric devices are energy converters. When they're producing electricity, this puts them in the same broad category as power plants and solar-generating systems. When outputting heat or its opposite, meanwhile, they're like heat pumps and air conditioners, respectively.

In design terms, thermoelectric devices have key pluses. For one, they're solid state: no liquid fuels, no moving parts. They're also easily scalable up or down.

"Cars are about 20 percent efficient," notes Chen, "and turning some of the energy wasted into electricity could increase that figure by as much as one-third."

'Exercise pill' switches on gene that tells cells to burn fat

By giving ordinary adult mice a drug - a synthetic designed to mimic fat - Salk Institute scientist Dr. Ronald M. Evans is now able to chemically switch on PPAR-d, the master regulator that controls the ability of cells to burn fat. Even when the mice are not active, turning on the chemical switch activates the same fat-burning process that occurs during exercise. The resulting shift in energy balance (calories in, calories burned) makes the mice resistant to weight gain on a high fat diet.

By permanently turning on this delta switch in mice through genetic engineering, he was able to create a mouse with an innate resistance to weight gain and twice the physical endurance of normal mice. Because they were able to run an hour longer than a normal mouse, they were dubbed "marathon mice."

Subsequent work in the Evans laboratory found that activation of PPAR-d in these mice also suppresses the inflammatory response associated with atherosclerosis.

But the genetic metabolic engineering that created the marathon mouse is permanent, turned on before birth. While a dramatic proof of concept that metabolic engineering is a potentially viable approach, it offers no help to an adult whose muscles are already formed and who now would benefit greatly from having more active, fat-burning muscles.

That is why the potential of chemical metabolic engineering - possibly a one-a-day pill as opposed to permanent genetic metabolic engineering - is so exciting, says Dr. Evans. In today's society, too few people get an ideal amount of exercise, some because of medical problems or excess weight that makes exercise difficult. Having access to an "exercise pill" would improve the quality of muscles, since muscles like to be exercised, and increase the burning of energy or excess fat in the body. And that would result in less fatty tissue, lower amounts of fat circulating in the blood, lower blood glucose levels and less resistance to insulin, lowering the risks of heart disease and diabetes.

The ability to chemically engineer changes in metabolism also has given the researchers more insight into how the PPAR-d switch works, says Dr. Evans. Genetically engineering changes in metabolism in the marathon mice triggers both increased fat burning and increased endurance. Adult normal mice that receive the drug to switch on PPAR-d show increased fat burning and resistance to weight gain, but they do not show increased endurance. Dr. Evans says this suggests the delta switch can operate in different modes, and the laboratory is in the process of figuring out exactly how. He hopes his strategy will make it possible.

Silicon surface plasmonics could achieve terahertz

Zhang and his colleagues have shown how the use of laser pulses can create a surface plasmon resonance from a photonic crystal effect.

Surface plasmons can only exist at a metal/dielectric interface. They are electromagnetic waves that run along the surface of this interface. “What we wanted to do,” explains Zhang, “is start with a non-conductive material to see if we could excite surface plasmons in the terahertz region.” For their attempt, Zhang and his colleagues use silicon because of its properties as a semiconductor. “We used ultra-fast laser pulses that resulted in photodoping.”

Biomedicine is a field where terahertz systems can find especially good use. Terahertz radiation can be used to “look” deep inside organic materials, and it does so without causing the damage that X-rays do. Additionally, terahertz radiation is being considered for use in screening airport passengers.

Zhang also points out that surface plasmon resonance to direct terahertz systems can also be used to enhance space communication: “This would be ideal for making tunable switches.” Indeed, astronomers are interested in using terahertz technology to study the particles that fall into the category of “far-infrared.”

“Because silicon is cheap, rigid, and tunable,” concludes Zhang, “this is an important and exciting finding. The applications for technology are just beginning.”

May 08, 2007

Lasers and magnetic launch for cheap launches within ten years


The $3000-10,000/kg cost of getting things into space has been crippling what is possible in space. Any low cost system will also need to have a high volume purpose. I discuss the best system that would still involve chemical propulsion and laser and magnetic launch systems. The focus is on laser launch array systems (and mirror reflecting enhancement). I believe there is no technical roadblock for the laser array launch system being developed within 10 years. As with any significant project it would take a coordinated effort and funding.

Chemical rockets and incremental developments will take a long time to radically alter the cost equation for space as well as the volume of material that can be delivered into space. The best systems involving chemical propulsion would swap out the bottom stage with a magnetic boost and the top with either tethers (skyhooks) or magbeam. The chemical rockets would be made three times as efficient using hypersonic aircraft that took in oxygen from the atmosphere.
Scramjets are being actively studied but are unlikely to lead to the first working space launch system for 15-25 years. Then it would take longer to have frequent flights and scale up volume. (There was a pdf of a proposal for a hypersonic project but it has been removed from the web. It had all of the project elements laid out, and it would take about 25 years before scramjets were manned. The high-volume economic motivation would be the $4 billion market for two-hour package delivery around the world, and the fact that skipping a hypersonic passenger plane on the atmosphere would be twice as fuel-efficient as current commercial jets.)

Ultimately such systems could achieve $100/kg, and might achieve $1000/kg to space at lower volume. Development and build-up costs would easily be in the range of $100 billion to $1 trillion, taking 15-30 years to a first system and 50 years to a scaled-up infrastructure [unless nanotechnology is developed].

Starting with the laser array launch system

Proof of concept Scale model test vehicle for laser launch

Lasers in arrays could use cheap $7-10/watt lasers. This would allow full systems to be developed for about $2 billion. The first modules can be made at low risk.

Each module would have the components above

Larger 67 kilowatt to 100 kilowatt systems are as pictured above

A laser photonic mirror system could launch things into orbit and could enhance the laser array launch system with mirrors to multiply efficiency by 1,000 to 100,000 times.

High-volume magnetic launch (4000 g acceleration) with ion propulsion at the top could bring launch costs down to $10/kg. High-volume systems whose operational costs are only the cost of electricity tend to converge to the $10/kg price. The laser launch and laser launch mirror systems also converge to those prices at high volume.

The high volume purpose should be space colonization and to tap into the resources of space.

Using magnetically inflated cable to capture solar energy in space

Larger 10 kilometer structures could generate 18 GW, about 15 times more than large nuclear power plants.

The systems that are put into space should avoid complicated construction. Systems using magnetic or photonic position and formation flying could avoid the need for astronaut construction.

Structures such as solar power arrays and telescopes can avoid complicated construction using modular systems. Large structures can also be made with less complex construction using inflation (magnetic or gas).

Arrays of lasers for a cheaper yet powerful launch system and inflated components for large, inexpensive and high capability space power systems and telescopes.

Instead of spending $14-16 billion per year on the current limited NASA efforts, build the non-chemical infrastructure over the next 10 years. The key is to build a lot of power, both for non-space-program needs and with 20-50% dedicated to scaling up space infrastructure.

Gigawatts on the ground dedicated to launches. Situate near a nuclear reactor or large dam (much as Google builds server farms near cheap power). Also, choose cheap power located near areas where less power is required for launches.

Powering laser arrays. Mass produce the arrays.

Highly reusable power infrastructure on the ground and in space is the key.

As noted in the movie Apollo 13:
Gene! Gene! We gotta talk about power here, Gene.
- Whoa, whoa, guys!

Power is everything. Uh, power is everything.

- What do you mean?
- Without it, they don't talk to us; they don't correct their trajectory; they don't turn the heat shield around.

Power to launch. Power to build.

Launch magnetically inflating structures for space solar power at whatever size can be launched, and then have them hold formation to direct lots of power to localized power collection on the ground.

Scale up to build more power as fast as possible for the space program and for power needs on earth. Also, launch other reusable power and launch infrastructure that has a low ratio of electrical power to kinetic energy generated.

Modular and scalable is what makes the internet work so well. Government money helped develop, subsidize and prove out the modules and protocols for the internet. That is what is needed for space as well. Modular and scalable power and launch systems. Companies can come in and buy and build the launch and power modules.

How internet infrastructure works

There are lessons to be learned in how the internet backbone was privatized, to plan how a modular space infrastructure could be incentivized and maximum leverage achieved.

A solar powered magbeam could be more efficient as noted at crowlspace.

Fuel economy standards set to be increased

Legislation to increase fuel standards in the USA is in motion

Hopefully this legislation will lag changes in the market, as plug-in hybrids and all-electric vehicles become dominant over the next 20 years.

A plan to increase fuel efficiency standards to an average of 35 miles per gallon by 2020 won approval from a Senate panel Tuesday in a vote closely watched by automakers and environmental groups. The Senate Commerce, Science and Transportation Committee approved the measure, which would raise the nationwide fleet fuel economy average by about 40 percent from current levels of 25 mpg for cars and trucks. The bill, approved on a voice vote, would also increase standards by 4 percent a year from 2020 through 2030.
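Compounding the 4 percent annual increase from the 2020 target gives the implied 2030 level (a simple extrapolation, not a figure from the bill):

```python
mpg_2020 = 35.0          # fleet average targeted for 2020
annual_increase = 0.04   # 4 percent per year, 2020 through 2030

mpg_2030 = mpg_2020 * (1 + annual_increase) ** 10
print(round(mpg_2030, 1))   # ~51.8 mpg implied by 2030
```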

Fuel economy standards have made little progress in the past 20 years. Passenger cars are required to meet a fleetwide average of 27.5 miles per gallon while SUVs, pickup trucks and vans must meet a standard of 22.2 mpg.

Lawmakers said the bill was a compromise that would likely face a number of changes on the Senate floor. Sen. Ted Stevens of Alaska, the committee's top Republican, and Trent Lott, R-Miss., said they had concerns about how it might affect trucks and its overall fairness.

Sen. Bill Nelson, D-Fla., meanwhile, said he would aim for a fleet increase of up to 40 mpg by 2020, while Sen. John Kerry, D-Mass., wants to guarantee 31 mpg by 2015 and 35 mpg by 2020.

Alan Reuther, the United Auto Workers' legislative director, wrote Inouye that it would force manufacturers "to close more facilities, destroying tens of thousands of additional jobs and undermining the economic base of communities across this country."

Environmentalists said they were concerned that the proposal was weaker than one offered by President Bush, which would set a goal of a 4 percent annual increase while expanding use of alternative fuels.

"When you look at all the loopholes in this 35 mpg bill, it kind of looks like Swiss cheese," said David Friedman of the Union of Concerned Scientists.

China's economy

A pdf discussing China's statistics

Here is a pdf from Carsten Holz that examines different China economic growth projections

The World Trade Organization data shows that China surpassed the USA in terms of trade in 2006.

China's global trade exceeded $1.758 trillion at the end of 2006. It first broke the $1 trillion mark ($1.15 trillion) in 2004, more than doubling from 2001. At the end of 2004, China became the world's third largest trading nation behind the United States and Germany. The trade surplus, however, was stable at $30 billion (over $40 billion in 1998, under $30 billion in 2003). China's primary trading partners include Japan, the U.S., South Korea, Germany, Singapore, Malaysia, Russia, and the Netherlands. According to U.S. statistics, China had a trade surplus with the U.S. of $170 billion in 2004, more than doubling from 1999. Wal-Mart, the United States' largest retailer, is China's 7th largest export partner, just ahead of the United Kingdom. Of the 5 busiest ports in the world, 3 are in China.

An article that makes a case for China's performance being exaggerated and that India will be close to catching China in 2025

World Bank analysis of China successes against poverty

Economic data by the regions in China

Projections of China, US, EU and OECD economies until 2030

Foresight's Technology Roadmap for Productive Nanosystems to be revealed Oct 9, 2007, in Arlington

21st Century will be a China Century

Al Fin points to an article by City Journal about flaws in China's success story. Yes, there are huge environmental challenges, and yes, many people are still below Western standards of middle class. However, the extremely poor have mostly been raised to just poor.

I do not agree with the claim that China is not raising the living standards of most of its populace.

Jeffrey Sachs (author of The End of Poverty: Economic Possibilities for Our Time), as well as organizations such as the IMF and the World Bank, lists China as being successful in raising people out of poverty.

People in the urban areas are wealthier, and China is shifting more people into the cities.

China has started a $300 billion investment fund, which will be used to buy whole industries to continue fueling China's growth. I predict that this fund will be successful: by 2010 it will be $500 billion, and by 2015 it will be a trillion-dollar fund (or multiple funds totaling that amount). China will have the best advisors in the world (Goldman Sachs, Salomon, Morgan Stanley, and others) helping to run these funds.
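For the prediction above to hold from returns alone (ignoring new government contributions, which would make it easier), the fund would need roughly the following annual growth rates. A minimal sketch, assuming the fund starts at $300 billion in 2007:

```python
def required_annual_return(start_bn, end_bn, years):
    """Annual growth rate needed to move from start to end over the given years."""
    return (end_bn / start_bn) ** (1 / years) - 1

# $300B (2007 start, assumed) -> $500B by 2010, then -> $1T by 2015:
print(f"2007-2010 needed: {required_annual_return(300, 500, 3):.1%}/yr")   # ~18.6%/yr
print(f"2010-2015 needed: {required_annual_return(500, 1000, 5):.1%}/yr")  # ~14.9%/yr
```

Those are aggressive but not absurd targets for the era, and fresh capital injections from trade surpluses would lower the required return considerably.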

China is spending to make a cleaner environment: it is investing in wind power and cleaner coal.

China has a rising middle class

Not all of China's transportation plans are bad, and certainly more should be done.
But China is investing in public transportation, and the Chinese people are buying electric bicycles.

India and Vietnam are also rising and will be successful in the 21st century.

May 07, 2007

Stable, high capacity rechargeable lithium batteries created

Stable, rechargeable lithium-ion batteries with double the charge capacity (greater than 250 mAh/g) have been created. The technology is based on a new material for the positive electrode, which comprises a unique nano-crystalline, layered-composite structure. In addition, by focusing on manganese-rich systems instead of the more expensive cobalt and nickel versions of lithium batteries, overall battery cost is reduced. In larger batteries, the technology could be used in the next generation of hybrid electric vehicles and plug-in hybrid electric vehicles.
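To put the "double the charge" claim in perspective, a back-of-envelope comparison against a conventional cathode. Only the >250 mAh/g figure comes from the article; the LiCoO2 baseline and nominal voltage are assumed typical values:

```python
new_cathode = 250      # mAh/g, manganese-rich layered composite (from the article)
licoo2 = 140           # mAh/g, typical practical LiCoO2 cathode (assumed baseline)
nominal_voltage = 3.6  # V, rough lithium-ion cell average (assumption)

# Specific capacity ratio and approximate cathode specific energy:
print(f"Capacity ratio: {new_cathode / licoo2:.2f}x")
print(f"Cathode energy: ~{new_cathode * nominal_voltage / 1000:.2f} Wh/g "
      f"vs ~{licoo2 * nominal_voltage / 1000:.2f} Wh/g")
```

Against that baseline the new material is roughly 1.8x on cathode capacity alone; whole-cell energy gains would be smaller, since the anode, electrolyte, and packaging masses are unchanged.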

Mini-bacteria could reduce chemotherapy side effects more than 1000-fold and enable RNA interference

"Mini bacteria", or EnGeneIC Delivery Vehicles (EDVs) as the company has dubbed them, are cheap and easy to produce, and can be used as targeted drug delivery vehicles.

They could be adapted to target virtually any tumour tissue in the body, and could put an end to many of the toxic side effects associated with chemotherapy drugs because they do not release their payload until they are inside the target cell. This also means that far less drug is required.

The EDVs are able to selectively target different tissues thanks to bispecific antibodies attached to their surface. One arm of the antibody is specific to the EDV and is connected via a linker molecule to the second antibody, which is specific to a protein on the target tissue - for example, the Her2 receptor on breast cancer cells, which is also targeted by Herceptin.

Targeting is also aided by the fact that the blood vessels supplying cancer cells are often leaky, and by coincidence the 400 nanometre EDVs are the perfect size to fall through these holes into the tumour tissue. "Within 2 hours of intravenous administration greater than 30% of the dose ends up in the tumour microenvironment" says Brahmbhatt, who presented the findings at RNAi 2007 in Boston, Massachusetts, on 3 May.

Once EDVs bind to the correct cells, they are internalised and broken down - releasing the drug into the cell, where it can take effect.

EnGeneIC hopes to begin human trials towards the end of 2007.

"We haven't yet found a drug that you couldn't load," says MacDiarmid, and EnGeneIC believes EDVs could enable cancer patients to be given high doses of multiple drugs, thus increasing the chances of finding one that works for them. Oncologists are often reluctant to prescribe multiple drugs because of the risk of serious side effects - and if they do they will usually reduce the doses to limit the toxicity.

"The amount of drug given with EDVs is thousands of fold less than if they were given directly," says Bruce Stillman, director of Cold Spring Harbor Laboratory in New York, US, and an advisor to the EnGeneIc team. "This, coupled with the fact that you can also target the drugs directly to the target tissue, means that the reduction in side effects could be extraordinary."

Preliminary results in mice suggest that EDVs could also be used to deliver novel therapies like RNA interference (RNAi), where one of the major hurdles is finding a targeted delivery method.

This would be a great advance for fighting cancer, and it would speed up RNA interference and gene therapies and make them more effective.

Chris Phoenix discusses a computational chemistry achievement

May 06, 2007

India Looks To Make $10 Laptop, current design $47

The efforts thus far have yielded designs for a laptop that would cost about $47; a $10 system remains the ultimate goal. Negroponte's One Laptop Per Child Organization submitted a proposal to the Indian government under which the group would have worked to produce laptops for Indian students starting at $100. The OLPC organization has produced a reference design for a $100 laptop that features an AMD Geode processor, a range of open-source software, and an attached hand crank for power generation.

Times of India has more info

Sources say it would be another two years before the laptops become a reality.

“We do not want to rush into it. Many issues remain to be resolved like royalty to the designer after the design is patented. Prototyping would also take time. We would even conduct destructive testing and create a proper maintenance network,” said one official.
