June 14, 2008

High superconducting temperature predicted for boron-doped diamond, and a spin-hole theory for all superconductors

Superconductivity in superhard boron-doped diamond could persist up to 45 K, according to a computational model.

A study calculates that boron-doped diamond (BC5) should remain superconducting at temperatures of up to 45 K. If borne out in experiments, that would give this class of materials the highest transition temperature for superconductivity mediated by phonons.

A paper by Peter Wachter proposes that spin holes in antiferromagnetic clusters combine to make nonmagnetic bipolarons, which can condense and lead to superconductivity. (Cu, Pu and Fe high-Tc superconductors: all the same mechanism.)

In conclusion, it has been shown that the parent materials of high Tc superconductors are antiferromagnets, where long-range magnetic order has been interrupted by 5-20% substitution of the magnetic ions by nonmagnetic ions. These nonmagnetic ions have been provoked by chemical doping, but are of the same kind as the magnetic ions, only in another valence state or another spin configuration. The remaining short-range antiferromagnetic clusters or fluctuations will surround such a spin hole with charge as a magnetic polaron. Two such polarons have an attractive interaction and form a boson nonmagnetic bipolaron. This can make a Bose condensation and lead to superconductivity, which has been shown in many papers by Alexandrov and Mott. We could show that the same mechanism works for all three (Cu, Pu and Fe) high Tc superconducting systems.

A superconductivity paper examines how the pairing of electrons works and which physical model might be the better explanation. Whether there is a "pairing glue" in the Hubbard and t-J models is basically a question about the dynamics of the pairing interaction.

Synthesis and Microstructural Studies of Iron Based LaO1−xFxFeAs Superconducting Materials

Materialise a current leader in Rapid Manufacturing

Rapid manufacturing was used to make important components of a concept car. The Sintesi is a sports car with four doors and four seats, developed through a highly innovative approach: it does not treat the car as a shape that covers the mechanicals, but as one that shapes the mechanicals around the passengers, starting from the passengers. (H/T to Pantopicon.be)

Materialise, a rapid manufacturing company, used the additive technology stereolithography (SLA) to produce the radiator, control panels, roof antenna, remote controller, roof light cover and, most importantly, the instrument panel, which is the centrepiece of the car's interior. During the file preparation phase, a complex webbing structure was integrated into the dashboard to give it functional strength.

Pininfarina Sintesi stereolithography dashboard. Materialise uses about twelve different kinds of rapid manufacturing and rapid prototyping systems, which can work with a variety of materials.

The eventual panel was “printed” in its full width on a Materialise Mammoth SLA machine, with a build volume up to 2150 x 700 x 800 mm, in a translucent PP-like epoxy (Poly 1500).

Due to its complexity, the radiator also had to be manufactured using additive technologies. The production of the smaller components, like the roof antenna and remote controller, shows the endless personalisation possibilities of additive manufacturing. The state of the art of additive technologies now allows this type of product to be manufactured in small series for production cars or as one-offs. This is a big step forward towards real personalised manufacturing.

Mammoth Stereolithography has a build area of more than 2 meters

In order to build single-piece SLA models with dimensions of more than 2 meters, Materialise has developed a unique technology: mammoth stereolithography.

Mammoth systems offer not only the ability to print very large parts, but are also extremely fast and productive, thanks to a patented curtain recoating technology which minimises the dead time between layers. Mammoth parts are built layer by layer in a liquid polymer that hardens when struck by a laser beam. After each layer is lasered, the part is lowered together with the vessel's resin level. A small reservoir then moves over the vessel and deposits a film of liquid polymer onto the whole vessel. This curtain recoating technology needs less time between layers than traditional SLA technology, which uses a scraper.
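The build-time advantage of curtain recoating can be sketched with a simple per-layer model. Every number below (part height, layer thickness, cure and recoat times) is an illustrative assumption, not a Materialise specification:

```python
# Toy build-time model for large-format SLA. All figures are illustrative
# assumptions, not Materialise specifications.
def sla_build_hours(height_mm, layer_mm, cure_s, recoat_s):
    """Total build time: each layer needs laser cure time plus recoat dead time."""
    layers = height_mm / layer_mm
    return layers * (cure_s + recoat_s) / 3600

# Assumed: an 800 mm tall part, 0.15 mm layers, 20 s of laser scanning per layer.
scraper = sla_build_hours(800, 0.15, 20, 30)  # assume ~30 s scraper sweep per layer
curtain = sla_build_hours(800, 0.15, 20, 8)   # assume ~8 s curtain recoat per layer
print(f"scraper: {scraper:.0f} h, curtain: {curtain:.0f} h")
```

Under these assumed numbers, shaving the recoat dead time cuts total build time by roughly 40%, which is why the dead time between layers, not laser speed, dominates large builds.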

As a result of their unique combination of build size and speed, mammoth systems are especially suited to high volume prototyping operations requiring both large numbers of parts, and large parts.

Improved rapid manufacturing is one of the seeds of a manufacturing and construction revolution.

This revolution will require rethinking designs, modelling and other systems to fully realize its potential. Totally rethinking cars means, for example, using inflatable bodies and then using a new paper that is stronger than cast iron, combined with epoxy, to bring material costs down.

The mammoth prototyping system could be used to make larger ecomodifications to existing cars. Aeromodding a car can increase fuel economy on the highway by 50%; less extreme modifications can achieve 25% increases in fuel efficiency.

Aerodynamic modifications for cars
- Lower the car - Lowering the car reduces the effective frontal area, increasing efficiency. 2.7" ground clearance is a good minimum height. According to Mercedes, "Lowering the ride height at speed results in a 3-percent improvement in drag."
- Remove that wing - Many "sports" cars have a non-functional wing on the back. Removing it will improve the fuel economy. The exceptions are the small rear fairings that are designed to detach the airflow from a rounded trunk.
- Clean up the underside of the car. - Installation of a "body pan", while a labor intensive operation, will provide a significant improvement in mileage. More...
- If a body pan is not practical, an air dam will redirect air that would normally pile up under the car causing drag. Not as good as a body pan, but better than nothing. Should be combined with side fairings.
- Fair in the wheel wells, add racing disk wheel covers, and make many other modifications.
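The payoff from these modifications follows from the aerodynamic drag power law, P = ½·ρ·Cd·A·v³: at highway speed, cutting the drag area Cd·A cuts drag power proportionally. A quick sketch with illustrative compact-car values (assumptions, not measurements from any specific build):

```python
RHO = 1.2   # air density, kg/m^3
V = 29.0    # ~65 mph highway speed, in m/s

def drag_power_kw(cd, frontal_area_m2):
    """Aerodynamic drag power P = 0.5 * rho * Cd * A * v^3, returned in kW."""
    return 0.5 * RHO * cd * frontal_area_m2 * V**3 / 1000

stock = drag_power_kw(0.32, 2.0)          # assumed typical sedan
modded = drag_power_kw(0.32 * 0.75, 2.0)  # assume body pan + fairings cut Cd by 25%
print(f"stock: {stock:.1f} kW, modded: {modded:.1f} kW")
```

Because aerodynamic drag dominates at highway speed (rolling resistance and drivetrain losses matter less there), a 25% drop in drag power translates into a substantial highway fuel-economy gain, consistent with the 25-50% figures quoted above.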

An extreme custom modification gets about 70 mpg on the highway from an old Honda Civic.

A discussion group for improving fuel efficiency.

June 13, 2008

US Navy may get more nuclear powered

Research and development work is adapting the design of the Ford (CVN-78) class aircraft carrier nuclear power plant for use in the Navy's new CG(X) cruisers, and could be extended to amphibious assault vessels.

Congress in 2007 passed the National Defense Authorization Act for 2008, an annual piece of legislation that tells the Pentagon how it should spend its budget. Under the act all future aircraft carriers, submarines and battle cruisers have to be built with a nuclear power system at their heart.

The National Defense Authorization Bill for 2009, which the Senate has still to pass, aims to shift the process up a gear by adding various types of amphibious assault ships to the list of those that must be powered by nuclear reactors in the future. Amphibious ships come in various forms, from those that incorporate a dock for landing craft, to undersized aircraft carriers for helicopters and vertical take-off aircraft - or a mixture of both. The vessels' position in combat can also vary - from a "stand-off" over-the-horizon location to being moored to a pier in a combat zone.

Equipping such ships with nuclear reactors would have another advantage in military operations, says Wright. "Assault ships are carrier escort vehicles and will no longer be holding up a carrier task force's progress by having to be refuelled every three to five days," she says.

There was a 2007 study on the use of more nuclear power in the United States Navy.

A potential advantage of nuclear power postulated by some observers is that a nuclear-powered ship can use its reactor to provide electrical power for use ashore for extended periods of time, particularly to help localities that are experiencing brownouts during peak use periods or whose access to electrical power from the grid has been disrupted by a significant natural disaster or terrorist attack. The Navy has stated that the CG(X) is to have a total power-generating capacity of about 80 megawatts (MW). Some portion of that would be needed to operate the reactor plant itself and other essential equipment aboard the ship. Much of the rest might be available for transfer off the ship. For purposes of comparison, a typical U.S. commercial power plant might have a capacity of 300 MW to 1000 MW. A single megawatt can be enough to meet the needs of several hundred U.S. homes, depending on the region of the country and other factors.

The Navy is looking to install radar requiring 30 or 31 megawatts of power onto its new Cruiser.

A nuclear-powered CG(X) could cost roughly 32% to 37% more than a conventionally powered CG(X). The Navy estimates that building the CG(X) or other future Navy surface ships with nuclear power could reduce the production cost of nuclear-propulsion components for submarines and aircraft carriers by 5% to 9%, depending on the number of nuclear-powered surface ships that are built. Building one nuclear-powered cruiser every two years, the Navy has testified, might reduce nuclear-propulsion component costs by about 7%.

At a crude oil cost of $74.15 per barrel (which was a market price at certain points in 2006), the life-cycle cost premium of nuclear power is:
— 17% to 37% for a small surface combatant;
— 0% to 10% for a medium sized surface combatant; and
— 7% to 8% for an amphibious ship.

Newly calculated life-cycle cost break-even cost-ranges, which supersede the break-even cost figures from the 2005 NR quick look analysis, are as follows:
— $210 per barrel to $670 per barrel for a small surface combatant;
— $70 per barrel to $225 per barrel for a medium-size surface combatant; and
— $210 per barrel to $290 per barrel for an amphibious ship.

In each case, the lower dollar figure is for a high ship operating tempo, and the higher dollar figure is for a low ship operating tempo.

A 2006 Navy study states that for a medium-size surface combatant that is larger than the DDG-1000, an additional cost of about $600 million to $700 million would equate to a procurement cost increase of about 22%. If building a Navy surface combatant or amphibious ship with nuclear power rather than conventional power would add roughly $600 million to $700 million to its procurement cost, then procuring one or two nuclear-powered CG(X)s per year, as called for in the Navy's 30-year shipbuilding plan, would cost roughly $600 million to $1,400 million more per year than procuring one or two conventionally powered CG(X)s per year, and procuring a force of 19 nuclear-powered CG(X)s would cost roughly $11.4 billion to $13.3 billion more than procuring a force of 19 conventionally powered CG(X)s. For purposes of comparison, the Navy has requested a total of $13.7 billion for the SCN account for FY2008.
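The study's fleet-level arithmetic checks out: the per-ship nuclear premium times the planned force size reproduces the quoted totals.

```python
# Per-ship nuclear-power premium from the Navy study, times the 19-ship CG(X) force.
low_premium_m, high_premium_m = 600, 700  # $ millions added per ship
ships = 19
low_total_b = low_premium_m * ships / 1000   # $ billions
high_total_b = high_premium_m * ships / 1000
print(f"${low_total_b} billion to ${high_total_b} billion for {ships} ships")
```

That $11.4-13.3 billion premium, spread over the 30-year shipbuilding plan, is the figure to weigh against the fuel-cost break-even oil prices listed above.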

UPDATE: The United States Navy has 280 active ships. The aircraft carriers (12 current, 2 under construction, 2 planned) and submarines (70 now, 5 under construction or ordered, at least 9 more planned) in the US Navy are nuclear powered already.

The US has 10 amphibious assault ships (helicopter carriers) and 11-18 amphibious transport docks.

Amphibious assault ships (small aircraft carriers for marines)
* Tarawa class (3 in commission, 2 decommissioned)
* Wasp class (7 in commission, 1 under construction)

Amphibious transport docks (200 meters long versus 173 meters for a cruiser)
* Austin class (9 in commission, 2 decommissioned, 1 converted to an auxiliary command ship)
* San Antonio class (2 in commission, 3 under construction, 4 more planned)

The US Navy has 22 cruisers and 52 destroyers with 3 under construction, 7 more planned.

Dock Landing ships
* Whidbey Island class (8 in commission)
* Harpers Ferry class (4 in commission)

So 32-50 ships in the amphibious and cruiser categories could become nuclear powered at about 2 at a time over 16-25 years from 2015-2040.

NXP Semiconductor's 150 Mbps LTE modem and Clearwire's 6 Mbps WiMAX network

NXP launched the world's fastest cellular modem, the PNX6910, which is capable of data transfer rates of 150 Mbps downlink and 50 Mbps uplink, and supports multi-mode LTE/HSPA/UMTS/EDGE/GPRS/GSM operation.

The world's fastest cellular modem from NXP, the Nexperia Cellular System Solution PNX6910, will be available to early access customers in the second quarter of 2009.

Advanced cellular modem
- LTE category 4: up to 150 Mbps downlink, 50 Mbps uplink
- TDD and FDD support
- MIMO, Rx diversity with 2Rx/1Tx path
- Quad-band GSM/GPRS/EDGE
- HSPA/LTE bands I-XIV, TD-SCDMA bands 33-37 & 39
Interfaces to cellular-connected smartphones and PCs
- USB 2.0 high-speed device and high-speed MIPI HSI, SDIO
- Drivers for Linux, Windows, Apple OS

Sprint Nextel and Clearwire are promising that their new 6 Mbps WiMAX network will support both open access and wholesale access, and that it will reach 140 million people by the end of 2010; executives announced that they hope to cover 220 million people by 2017.

A lengthy document filed this week with the FCC asks for permission to merge the 2.5GHz spectrum assets of Sprint and Clearwire into "New Clearwire," the company backed by Sprint, Clearwire, Intel, Time Warner, Google, and Bright House. In the filing, Clearwire makes the case that it will provide true "third pipe" Internet access to home and mobile users at speeds of 6Mbps (and 3Mbps uplink).

Clearwire claims that "each of these systems has consistently demonstrated the ability to deliver up to 6Mbps downlink and up to 3Mbps uplink while the end user is moving at speeds of up to 60 miles per hour."
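For a sense of scale between the two headline rates, here is the time to move a 700 MB file at each quoted peak link rate. These are peak link rates, not real-world throughput, and the file size is an arbitrary example:

```python
# Transfer time at the quoted peak rates (decimal megabytes and megabits).
FILE_BITS = 700 * 8 * 10**6  # a 700 MB file, in bits

for name, mbps in [("LTE PNX6910 peak", 150), ("Clearwire WiMAX", 6)]:
    seconds = FILE_BITS / (mbps * 10**6)
    print(f"{name}: {seconds:.0f} s")
```

At peak rates the LTE modem moves the file in well under a minute, while the 6 Mbps WiMAX link needs roughly a quarter of an hour, a 25x gap that mirrors the ratio of the two headline figures.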

Carnival of Space Week 58

June 12, 2008

Iraq Oil production could increase by 400,000 bpd by the end of 2008

Following up on the Iraq oil status: oil production and output are holding steady at the higher levels that started in May. Iraq will produce up to 2.9 million bpd by the end of 2008, according to Iraqi Oil Minister Hussein al-Shahristani. 2.9 million bpd would be an increase of about 400,000 bpd from the 2.53 million bpd level at the end of May, 2008.

Starved of access to oil and gas prospects by governments who increasingly favour development by their state oil companies, Western oil companies are eager to invest in Iraq, home to the world's third biggest oil reserves. However, the security situation and an uncertain legal framework have deterred the majors from making significant investment.

Major oil companies have all turned in their proposals for oil service deals and some will be signed this month. Shahristani had warned that Baghdad might drop the oil service contracts, worth about $500 million apiece, if the majors failed to sign deals by June. Five of the deals under discussion are with Royal Dutch Shell (nyse: RDSA), Shell in partnership with BHP Billiton (nyse: BBL), BP (nyse: BP), Exxon Mobil (nyse: XOM), and Chevron (nyse: CVX) in partnership with Total. Iraq is also in talks with a consortium of Anadarko (APC.N), Vitol and Dome for a sixth contract on the Luhais field.

Dow Jones news reports the Iraqi oil ministry is planning to announce the first round of tenders to develop its vast oil fields, which are among the world's largest, at the end of June or the beginning of July, an Iraqi oil official said Monday.

"Iraq is going to announce the first round of tenders to develop super giant oil fields in southern and northern Iraq either at the end of June or the beginning of July," the official told Dow Jones Newswires by telephone from Baghdad.

The official named seven oil fields and two gas fields that would be included in the first tender announcement. They are North Rumaila, South Rumaila, Zubair, West Qurna, and Buzurgan in southern Iraq and Kirkuk and Bai Hassan in northern Iraq. The two gas fields are Akkaz in western part of the country and Mansouriya in the east.

Over the last few months, the ministry has been working to prepare contract models for these fields, the official said. The ministry has signaled that more restrictive service contracts may be used to develop these fields, rather than controversial production-sharing contracts.

The official said the ministry would hold a news conference to announce these new tenders.

Iraq is currently in the final stages of striking what are called Technical Services Contracts, or TSCs, with oil majors to help boost crude oil production in the country's largest producing fields.

Iraqi oil sources said these TSCs could be signed as early as June. Each would last two years and could be extended for another year.

Oil Minister Hussein al-Shahristani has threatened to cancel these TSCs if they aren't signed in June. The TSCs are designed to boost Iraq's crude oil production from producing oil fields.

Iraq wants to boost production by 600,000 barrels a day in six producing oil fields in northern and southern Iraq. They are Kirkuk in the north, West Qurna 1, Zubair, Missan, Rumaila and Luhais in the south.

State Department Iraq Weekly Report for June 11, 2008

United States might finally build a new oil refinery in 2013

The first new oil refinery in the United States in thirty years is one step closer.

Union County residents voted 58 percent to 42 percent Tuesday to endorse the rezoning of almost 3,300 acres of pristine farm land north of Elk Point for the oil refinery. Texas-based Hyperion Resources requested the rezoning for the $10 billion refinery, billed as a potential step toward national energy independence.

It will process 400,000 barrels of oil per day (mostly from Canada's oilsands) and will likely be in operation in 2013.

North Dakota reported 5,700 more barrels of oil per day in March 2008: March production was 143,738 bopd versus 138,013 bopd in February.

Reece Energy Exploration Corp. rose to a third-straight record in Toronto after saying it found oil in the first well it drilled in the Bakken area of Saskatchewan.


Previous coverage of North Dakota's Bakken Oil

Daily North Dakota drilling and production reports (after 6 months on confidential list)

North Dakota oil statistics

Latest update on Bussard Fusion Prototype WB7

The Emc2 team has been ramping up its tests over the past few months, with the aim of using WB-7 to verify Bussard's WB-6 results. Today, Nebel said he's confident that the answers will be forthcoming, one way or the other.

"We're fully operational and we're getting data," Nebel said. "The machine runs like a top. You can just sit there and take data all afternoon."

Nebel may be low-key about the experiment, but he has high hopes for Bussard's Polywell fusion concept. If it works the way Nebel hopes, the system could open the way for larger-scale, commercially viable fusion reactors and even new types of space propulsion systems.

"We're looking at power generation with this machine," Nebel said. "This machine is so inexpensive going into the 100-megawatt range that there's no compelling reason for not just doing it. We're trying to take bigger steps than you would with a conventional fusion machine."

EMC2 built the laboratory and an experiment in nine months. If a working, scaled-up production system could be built in comparable time, then the main part of any new reactor (excluding site preparation and power lines) could be produced in nine months or less.

This site had an article about the space propulsion breakthrough that this fusion system would enable if it is successful.

Stronger paper and rapid manufacturing

A new kind of paper is stronger than cast iron and could be used to reinforce conventional paper, produce extra-strong sticky tape or help create tough synthetic replacements for biological tissues, says Lars Berglund from the Swedish Royal Institute of Technology in Stockholm, Sweden.

Despite its great strength, Berglund's "nanopaper" is produced from a biological material found in conventional paper: cellulose. This long sugar molecule is a principal component of plant cell walls and is the most common organic compound on Earth. Wood is typically about half cellulose, mixed with other structural compounds.

Cellulose is extracted from wood to make paper, is the basis of cellophane, and has also recently been used by materials scientists developing novel plastic materials. But they have used it only as a cheap filler material, ignoring its mechanical properties.

However, the mechanical processes used to pulp wood and process it into paper damage the individual cellulose fibres, greatly reducing their strength. So Berglund and colleagues have developed a gentler process that preserves the fibres' strength.

The new method involves breaking down wood pulp with enzymes and then fragmenting it using a mechanical beater. The shear forces produced cause the cellulose to gently disintegrate into its component fibres. The end result is undamaged cellulose fibres suspended in water.

It also means the new paper has a tensile strength of 214 megapascals (versus 1 megapascal for regular paper).

There is already a person who used 3D modelling and computerized cutting to create a cardboard based surfboard covered with epoxy.

The new paper made only from plant cellulose would be cheaper and strong enough for many applications.

Here is a link to video showing the assembly of the cardboard surfboard

Previous discussion on new ideas for a manufacturing and construction revolution. The new nanopaper will enable more rapid manufacturing with cheaper materials.

Feature on cardboard surfboard in surfer magazine
He looked to aerospace blogs for insight into the strength-to-weight ratio as it relates to design, and into how to graphically manipulate lines and rib interactions for his cardboard cores. "I'd look for the math that explained how to apply curves using programming language." Sheldrake cut the cardboard-core surfboard pieces using the stone company's laser cutter.

June 11, 2008

Death and Taxes

U.S. life expectancy has been steadily rising, usually by about two to three months from year to year. This year's jump of four months to 78.1 years is "an unusually rapid improvement," Preston said.

The preliminary number of deaths in the United States in 2006 was 2,425,900, a 22,117 decrease from the 2005 total. With a rapidly growing older population, declines in the number of deaths (as opposed to death rates) are unusual, and the 2006 decline is likely the result of more mild influenza mortality in 2006 compared with 2005.

A 37-page preliminary analysis of the proposed McCain and Obama tax plans finds the plans would reduce tax revenues by $3.7 trillion (McCain) and $2.7 trillion (Obama) over the next 10 years, or approximately 10 and 7 percent of the revenues scheduled for collection under current law, respectively.

This confirms this site's previous analysis that Obama's plan to tax the rich will not result in more tax revenue: more of what people in the higher income brackets report will be taxed, but there will be less revenue because of changes in tax strategies to avoid taxation.

So the CDC report on 2006 US life expectancy and the Urban Institute-Brookings Institution Tax Policy Center analysis are indicating less death and less tax revenue.

Tax plan summaries
Senator McCain would permanently extend the 2001 and 2003 tax cuts, increase deductions for taxpayers supporting dependents, reduce the corporate income tax rate, and allow immediate deductions for the cost of certain short-lived capital equipment. Senator Obama would permanently extend certain provisions of the 2001 and 2003 tax cuts primarily affecting taxpayers with incomes under $250,000; increase the maximum rate on capital gains and qualified dividends; and enact new and expanded targeted tax breaks for workers, retirees, homeowners, savers, students, and new farmers. Senator McCain proposes to extend permanently the AMT "patch" that has prevented most individuals and families with incomes below $200,000 from being affected by the tax, and in our interpretation of his proposal, Senator Obama would do the same. Each candidate would also increase the estate tax exemption and reduce the estate tax rate compared with current law in 2011 and beyond, although Senator McCain would cut the tax much more than Senator Obama. Finally, each candidate promises to broaden the tax base and reduce corporate loopholes. McCain lists eight breaks for oil companies as targets but, other than that, is short on details for his pledge to eliminate "corporate welfare." Obama identifies a variety of steps, including basis reporting for capital gains, taxing carried interest as ordinary income, and enacting sanctions on international tax havens that don't cooperate with enforcement efforts, but he would also need additional as-yet-unspecified policies to achieve his revenue target for base broadening.

How there were fewer deaths in 2006
The 2006 increase is due mainly to falling mortality rates for nine of the 15 leading causes of death, including heart disease, cancer, accidents and diabetes.

"I think the most surprising thing is that we had declines in just about every major cause of death," said Robert Anderson, who oversaw work on the report for the health statistics center.

The overall death rate fell from 799 per 100,000 in 2005 to about 776 the following year.

Perhaps the most influential factor in the 2006 success story, however, was the flu. Flu and pneumonia deaths dropped by 13 percent from 2005, reflecting a mild flu season in 2006, Anderson said. That also meant a diminished threat to people with heart disease and other conditions. Taken together, it's a primary explanation for the 22,000 fewer deaths in 2006 from 2005, experts said.

History of tax in the USA

2007 tax brackets at wikipedia

Top incomes and composition (salary, business income, capital gains)

Year top rate and income level of top rate

Tax policy center site

tax brackets by year

Individual tax rates

How many in the higher tax brackets

Sources of historical tax information

Historical networth 1989 to now

Effective tax rates

Updates on World Oil Production and Demand

The IEA (International Energy Agency) monthly oil report was issued June 10. Global oil product demand is expected to average 86.8 mb/d in 2008, 80 kb/d below last month's estimate, following the reduction of price subsidies in several non-OECD countries. Global growth is cut even more steeply, by 230 kb/d, to +0.9% or +800 kb/d, when historical upward revisions to 2006 and 2007 data are factored in.

Global oil supply rebounded by 490 kb/d in May to average 86.6 mb/d, lifted by higher OPEC crude supply. The rise however comes after extensive downward revisions to 1Q08 non-OPEC production and lower biofuels and NGLs for the rest of this year. Despite this, a recovery in non-OPEC output is forecast for the second half of 2008.

World Oil Demand is still larger than supply by 1 million b/d.

The US EIA (Energy Information Administration) posted their May, 2008 International Petroleum Monthly on June 9, 2008.

In thousands of barrels per day. Oil Production.

Time Period        USA     P. Gulf   OAPEC    OPEC     World
2008 January   E   8,624   23,979    25,121   36,594   85,530
     February  E   8,625   24,208    ......   36,885   85,827
     March     PE  8,664   24,219    25,361   36,784   85,730
2008 3-Mth Avg PE  8,638   24,134    ......   36,751   85,693

The Persian Gulf countries are Bahrain, Iran, Iraq, Kuwait, Qatar, Saudi Arabia, and the United Arab Emirates. Production from the Kuwait-Saudi Arabia Neutral Zone is included in Persian Gulf production.
OAPEC: Organization of Arab Petroleum Exporting Countries: Algeria, Iraq, Kuwait, Libya, Qatar, Saudi Arabia, and the United Arab Emirates.
OPEC: Organization of the Petroleum Exporting Countries: Algeria, Angola, Ecuador, Indonesia, Iran, Iraq, Kuwait, Libya, Nigeria, Qatar, Saudi Arabia, the United Arab Emirates, and Venezuela.
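The table's three-month average is a day-weighted mean rather than a simple average of the monthly figures, which is why the world figure comes out at 85,693 rather than the simple mean of about 85,696:

```python
# World oil production (thousand barrels/day) with days per month; 2008 is a leap year.
world = [(85530, 31), (85827, 29), (85730, 31)]  # Jan, Feb, Mar 2008

weighted = sum(rate * days for rate, days in world) / sum(days for _, days in world)
print(round(weighted))  # day-weighted average, matching the table's 85,693
```

The same day-weighted calculation reproduces the 24,134 kb/d three-month Persian Gulf average from the monthly figures in the table.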

The Full IEA Oil Market Report. 60 page PDF.

June 10, 2008

Blog Economics

Chitika, an ad network, did a study of the revenues of the top 50,000 blogs for 2006. Chitika assumed that total advertising revenue for a blog would be 3 times the blog's Chitika revenue, using revenue trends and statistics from a representative sample of its 12,000+ publisher network. Ad revenue is more sensitive to the rank of the blog than one would expect in a typical Zipf's-law 80/20 curve [more money for the top blogs and less for the bottom]. Blog ranking was determined by Technorati rank.

Top 50,000 made $500 million in advertising in 2006.

Top 10 blogs in Technorati, $40 million in ad revenue.

The top 1% accounted for approximately 20% of the total revenue.
Top 500 blogs made $100 million. Avg $200,000
11-500 blogs made $60 million. Avg $120,000

The top 5% accounted for approximately 50% of the total revenue.
Top 2500 blogs made $250 million
$150 million for 501-2500 Avg $75,000

The top 10% accounted for approximately 80% of the total revenue.
Top 5000 blogs made $400 million
2501 through 5000 made $150 million Avg $60,000

The top 15% accounted for approximately 90% of the total revenue.
Top 7500 blogs made $450 million
5001 through 7500 made $50 million Avg $20,000

7501 through 50,000 made $50 million Avg $1176.
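The per-blog averages follow directly from dividing each tier's revenue by its blog count (the averages quoted above appear to be rounded):

```python
# Tier totals from the Chitika study: (rank range, tier revenue in $ millions).
tiers = [((11, 500), 60), ((501, 2500), 150), ((2501, 5000), 150),
         ((5001, 7500), 50), ((7501, 50000), 50)]

for (lo, hi), revenue_m in tiers:
    blogs = hi - lo + 1
    avg = revenue_m * 1_000_000 / blogs
    print(f"ranks {lo}-{hi}: {blogs} blogs, avg ${avg:,.0f}")
```

The steep fall-off in per-blog averages, from six figures for the top 500 to about $1,176 below rank 7,500, is the "more sensitive than Zipf" concentration the study describes.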

Converting Alexa ranking to daily pageviews
242 www.saatchi-gallery.co.uk 0.5% reach 68 million page views/day
500 www.huffingtonpost.com 40 million page views/day
5000 www.space.com (4600) 6 million pv/day
35000 800,000 pv/day
70000 20-100,000pv/day

CPMs of $1 are low and $4 are average, but it depends upon the topic and on being able to sell the CPM inventory. It also matters how hard the site is trying to monetize.
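CPM math is simple: revenue = (pageviews / 1000) × CPM, scaled by how much of the inventory actually sells. A sketch with assumed example numbers:

```python
def monthly_ad_revenue(pageviews_per_day, cpm_dollars, sell_through=1.0):
    """Revenue = impressions / 1000 * CPM; sell_through is the sold fraction of inventory."""
    return pageviews_per_day * 30 / 1000 * cpm_dollars * sell_through

# Assumed example: 100k pageviews/day at a $4 average CPM, with 60% of inventory sold.
print(f"${monthly_ad_revenue(100_000, 4, 0.6):,.0f}/month")
```

Plugging in the Alexa-derived pageview estimates above with $1 to $4 CPMs gives the rough per-site revenue ranges used in the valuations below.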

Some Technorati to Alexa Ranks

Technorati Rank: 10,810 Alexa: 6400 4 million pv/day

Technorati Rank: 280 Alexa: 35000 250,000 pv/day

Technorati Rank: 2780 Alexa: 32000 70,000 pv/day

Technorati Rank: 5752 Alexa: 18621 300,000+ pv/month

Technorati Rank: 9073 Alexa: 50530 10k-30k per day

Top 25 blog properties (valued at 5-15 times annual profit) for 2008

1. The Gawker Properties: $150 million. Gawker (#228), ValleyWag (#34 Technorati), Gizmodo (#3 Technorati), Wonkette (#678 Technorati), and a number of smaller websites. The company claims 30 million monthly unique visitors. $11 million in revenue.

2. MacRumors: $85 million. Blog. It ranks No. 2,700 in Alexa. Page views at 33 million, which seems a bit high. Advertising at least $30 per page CPM. Est. $12 million revenue with a 60% margin.

3. Huffington Post: $70 million. In late 2007, management claimed that the website had 4 million unique visitors per month and would bring in $7.5 million for the year. #1 ranking on Technorati.

4. PerezHilton: $48 million. It is No. 400-755 in Alexa. Compete shows 1.3 million visitors a month. Quantcast puts monthly page views at 191 million. That seems high. It would put revenue at about $900,000 a month with a $5 CPM.

5. TechCrunch: $36 million. The TechCrunch network claims almost 3.2 million unique visitors and 14.6 million page views. Alexa 951-1795. CPM yield estimate $30. Revenue from advertising at $438,000 a month, or $5.3 million a year. #2 ranking on Technorati.

6 (tied): Ars Technica, $15 million. The site ranks 2,500 in Alexa; #7 on Technorati. The audience is growing very rapidly. Quantcast has reach at 1.1 million. Ads are all premium clients. Est. $40 per page CPM. Page views are probably six million a month. Revenue of almost $3 million.

Craigslist revenue for 2008 is about $80 million

TV station web revenue $1.2 billion

Myspace revenue $800 million

Guy Kawasaki blog revenue was not very good in 2007

Adsense case study of Weblogs inc. $90K/month, 1 million/year, 60 million pageviews

10 steps to seven figure income from your site

Modelling and Enabling a Manufacturing and Construction Revolution

This site recently discussed the seeds of Manufacturing and construction revolution.

The seeds of the revolution are:
- Contour crafting (scaling up inkjet/rapid prototyping to making buildings, with cement as the ink). Layer-by-layer additive construction, 200 times faster than conventional methods and about 5 times lower cost for construction.

- Inflatable electric cars, shipped flat from the factory like Ikea furniture, which could be as cheap as $2,500 for an environmentally friendly car.

- Reel-to-reel production of electronics can be hundreds to thousands of times faster than current lithography-based factories for making computers, televisions and video monitors.

Other seeds are
- wafer scale self assembly of nanoscale components
- Nanotubes and more new materials (nanosteel able to withstand higher temperatures and retain strength)
- wood based fibers able to make paper and cardboard stronger than cast iron. Cheap and plentiful material that could be strong enough for many applications.

Making things 100 times faster than we do now would require a lot more planning to prevent many unintended problems. We need to take the best methods of today like Building Information Modelling and city planning and take those to the next level as well.

Modelling and Planning the Manufacturing and Construction Revolution

Once a computer model of a building has been created, it is possible to extract detailed plans of particular subsystems, such as cooling, water and electrical wiring

The Economist magazine talks about the shift for architects from 2D blueprints to 3D databases. The amount of data and the variables that are modeled need to be increased. A denser-data version of Second Life [virtual world modeling] needs to be made. Various proposed constructions can be planned out to end of life.

Elaborate digital models for cities. Currently architecture and city planning are mostly 2-Dimensional professions.

Modeling to get better estimates, schedules and then simulate.

Building Information Modelling detail or greater fed into Second Life virtual reality with many scenarios and faster-than-real-time simulation modes.

There should also be various inexpensive real-time sensors tracking various aspects of safety and feeding models with updates on the current situation.

- Time and infrastructure health of surrounding systems and buildings relative to next maintenance task
- Actual emissions at and around the building site
- Traffic and people flow and usage patterns

More rapid and cheap construction could help address things like the California Dike problem.

Advanced City Planning
More detailed data, with more frequent updates at the city and larger scales. Various links on the subject are below.

Urban info modeling

Virtual reality in city planning process.

More frequently updated and detailed views of the real world from Everyscape and Google Earth and other sources.

Virtual reality cityscapes

Plan NYC

New true 3 dimensional displays will help with the visualization process

Open geospatial BIM

Accelerating the Economy
Accelerating the economy while maintaining or improving safety will require coordination and effort, just as making trains run faster and with fewer delays requires planning, coordination and effort.

Looking at the "mundane possible speedups" [not using nanofactory level molecular nanotechnology or Artificial general intelligence] will also flesh out the requirements for MNT speeds.

Each of the levels of faster speed would require consideration.

10 times faster construction would mean - less time for various checks from weeks to days.

100 times faster means minutes for turnarounds or everything pre-checked and approved.

1000 times faster means all interested parties must have their issues pre-thought-out for work in the pipeline up to one year in advance: a pre-planned city-wide wiki of intersecting projects. New software and new project planning may be required to enable each level.
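A rough illustration of why each speedup level changes the approval process qualitatively. The four-week baseline review period is an assumed figure, not from the text:

```python
# Illustrative only: assume roughly four weeks of checks and approvals
# for a conventional construction project today.
BASELINE_REVIEW_DAYS = 28

for speedup in (10, 100, 1000):
    window_days = BASELINE_REVIEW_DAYS / speedup
    hours = window_days * 24
    print(f"{speedup:>4}x faster construction -> "
          f"{window_days:.2f} days ({hours:.1f} hours) per review")
# 10x leaves a few days, 100x leaves hours, and 1000x leaves about
# 40 minutes: at that point everything must be pre-checked and queued.
```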

Plans would be going into a queue for simulation, software-agent first pass comments and validations.

How modularized and disconnected can things safely be? The more compartmentalized things can be, the more simplicity and speed can be retained. There is value to higher safe development speeds.

20% growth - 1997-98 Internet time across the whole economy
If Robin Hanson is correct about the economics of the singularity, this would be the real long economic boom.

building-information modelling (BIM).

BIMStorm, open source BIM

BIM and Beyond
Beyond BIM article
BIM article
BIM at wikipedia
Virtual design and construction at wikipedia
Google search on beyond BIM

Five fallacies of BIM from Autodesk (CAD software maker)


Chuck Eastman, a professor of architecture and computing at the Georgia Institute of Technology in Atlanta is one of the champions of BIM.

June 09, 2008

Top Ten Near Term Developments for Vastly Improved Capabilities in Space

1. Fuel depots. 2-17 times more payload to the moon or other space missions. Brings GTO launch costs closer to LEO costs.

Propellant fuel depot
Boeing Propellant fuel depot
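A hedged sketch of why topping off tanks at an orbital depot multiplies lunar payload, using the Tsiolkovsky rocket equation. The Isp, delta-v and structural-mass figures below are illustrative assumptions, not numbers from the article:

```python
import math

def payload_fraction(delta_v_ms, isp_s, structure_frac=0.10, g0=9.81):
    # Tsiolkovsky rocket equation: m_final / m_initial = exp(-dv / (Isp * g0)).
    # Subtract an assumed 10% structural mass to get the usable payload share.
    mass_ratio = math.exp(-delta_v_ms / (isp_s * g0))
    return max(mass_ratio - structure_frac, 0.0)

# Assumed LOX/LH2 stage (Isp ~450 s) departing from low earth orbit:
tli = payload_fraction(3_100, 450)      # LEO -> trans-lunar injection, ~40%
surface = payload_fraction(5_900, 450)  # LEO -> lunar surface, ~16%
```

A stage that leaves LEO with full tanks from a depot can devote its entire launched mass to hardware and payload, while the propellant rides up separately on cheap launchers; that is the kind of leverage behind multipliers in the 2-17x range.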

2. Lunar concrete would reduce the amount of material needed to build things on the moon by ten times.

A 50 meter telescope could be built from lunar concrete, with the mirror covered with a thin layer of aluminum. With no atmosphere on the moon to distort the light reaching its massive gathering area, it could directly image any potential continents on planets around nearby stars.

3. Successful Big and cheap rockets by Spacex or others

Spacex is trying to bring costs down to $500-3200/kg to get into space

Spacex Falcon 9 Heavy
Spacex Dragon space capsule

4. Bigelow - inflatable space stations
Bigelow's planned habitable private space station

Bigelow Aerospace Lagrange point and lunar plans

5. Vasimr
The Vasimr 200 kW unit is almost flight-ready

A 12 MW Vasimr system could send a ship to Mars in less than 120 days one way. A 200 MW Vasimr could go to Mars in 39 days.

A 1-2 MW Vasimr lunar cargo vehicle could transfer up to 39% of the mass from low earth orbit to the moon.
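The trip times quoted above are consistent with a constant-thrust "accelerate halfway, flip, decelerate" profile. The transfer distance and acceleration below are assumptions chosen for illustration, not figures from the article:

```python
import math

AU_M = 1.496e11  # meters per astronomical unit

def oneway_days(distance_m, accel_ms2):
    # Flip-and-burn profile: accelerate over the first half of the trip,
    # decelerate over the second. Total time t = 2 * sqrt(d / a).
    return 2 * math.sqrt(distance_m / accel_ms2) / 86_400

# Assumed 0.5 AU Earth-Mars transfer distance; an acceleration of about
# 26 mm/s^2 (plausible for a light ship with a 200 MW Vasimr) gives
# roughly the quoted 39-day trip:
days = oneway_days(0.5 * AU_M, 0.026)
```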

6. Solar electric sail

A simplified picture of the electric sail: an actual system would have 50 to 100 or more 20-kilometer wires. 100 kg spaceships could be accelerated to final speeds of 40-100 km/second. The electric sail is an extremely promising new propulsion technique which is nearly ready to be tested. If electron heating turns out to be successful, performance may increase even more. Costs for solar system missions would go down, and new capabilities and performance would become possible.
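A rough feel for those final speeds, assuming the ~1 mm/s² characteristic acceleration commonly quoted for electric sail concepts (an assumption, not a figure from the article):

```python
# Assumed characteristic acceleration of ~1 mm/s^2 near 1 AU. Real
# electric-sail thrust falls off slowly with distance from the sun, so an
# actual spiral trajectory takes longer than this constant-acceleration
# estimate.
def days_to_speed(target_ms, accel_ms2=1e-3):
    return target_ms / accel_ms2 / 86_400

days = days_to_speed(50_000)   # roughly 1.6 years to reach 50 km/s
```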

7. Virgin Galactic getting LEO orbital capabilities with SpaceShipThree

SpaceShipTwo at apogee

Virgin Galactic could expand the number of people (passengers) who get to fly by 10 to 1000 times versus the NASA plans. Seats on SpaceShipTwo cost $200,000. Virgin Galactic says more than 200 individuals have booked, and another 85,000 have registered an interest to fly. Tens of millions of dollars in deposits have already been taken. If seat prices drop to $100,000 each, then 85,000 people would generate $8.5 billion in revenue. This could make SpaceShipThree (an orbital system) fully fundable from Virgin Galactic operational profits. Virgin Galactic appears to be offering a path forward to safer (100 times or more safer) and cheaper travel into space for a lot more people.

8. LEO solar power

Low earth orbit (LEO) systems offer the advantage of reducing the scale of the solar power systems. An interesting concept for solar power appears to be on track for testing in Palau by 2012. The Space Island Group is also proposing low earth orbit solar power beamed to multiple locations. Space Island Group hopes to have its first system up by 2010. Space Island Group is targeting 10 cents per kilowatt hour (kWh). The Space Island Group has almost completed financing for a prototype system that it claims will be in orbit within 18 months, at a total cost of $200 million. "The satellite will deliver between 10 to 25 megawatts of power," says Meyers. "It will 'site-hop' across base stations in Europe, beaming 90 minutes of power to each one by microwave."

9. Lorentz force propulsion

Successful simulated space conditions test of lorentz force propulsion

Refueling nuclear rockets using lorentz force propulsion

10. Power source breakthroughs: IEC fusion, focus fusion, uranium hydride reactors, Blacklight Power

Tri-alpha energy

Farther out:
Laser arrays. Technology is possible but not funded.
Space elevators and space piers. Technology will take two decades or more to mature or to be funded.
Tethers. Minor development and projects funded.

Obama's plan to tax the rich won't work

Businessweek discusses Obama's plan to increase the marginal tax rate back to the level under Bill Clinton and before the Bush tax cuts

Of the 149 million households filing federal income taxes for 2006, some 3% reported income between $200,000 and $500,000; fewer than 1% claimed income above half a million dollars.

The Bush administration instituted a federal tax cut for all taxpayers. Among other changes, the lowest income tax rate was lowered from 15% to 10%, the 27% rate went to 25%, the 30% rate went to 28%, the 35% rate went to 33%, and the top marginal tax rate went from 39.6% to 35%

Many people believe that increasing the marginal rate will collect more revenue from the rich or for the government in general. Historically, it does not matter whether the top marginal rate is 90% or 25%: the government collects about 19.5% of GDP. The only way to get more tax revenue is to increase GDP, such as with a concerted effort to accelerate a manufacturing and construction revolution using new systems and technology.

Economists of all persuasions accept that a tax rate hike will reduce GDP, in which case Hauser's Law says it will also lower tax revenue. That's a highly inconvenient truth for redistributive tax policy, and it flies in the face of deeply felt beliefs about social justice. It would surely be unpopular today with those presidential candidates who plan to raise tax rates on the rich – if they knew about it.

Although Hauser's Law sounds like a restatement of the Laffer Curve (and Mr. Hauser did cite Arthur Laffer in his original article), it has independent validity. Because Mr. Laffer's curve is a theoretical insight, theoreticians find it easy to quibble with. Test cases, where the economy responds to a tax change, always lend themselves to many alternative explanations. Conventional economists, despite immense publicity, have yet to swallow the Laffer Curve. When it is mentioned at all by critics, it is often as an object of scorn.

Because Mr. Hauser's horizontal straight line is a simple fact, it is ultimately far more compelling. It also presents a major opportunity. It seems likely that the tax system could maintain a 19.5% yield with a top bracket even lower than 35%.

The fact that, no matter what the rates and brackets, all that can be obtained is 19.5% argues for as simple a tax code as possible for collecting that 19.5%.
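The Hauser's Law argument reduces to simple arithmetic: if receipts are pinned near 19.5% of GDP, revenue moves with GDP, not with rates. The GDP figures below are illustrative assumptions:

```python
# Hauser's Law as stated in the passage: federal receipts hover around
# 19.5% of GDP regardless of the top marginal rate.
HAUSER_SHARE = 0.195

def receipts(gdp_dollars):
    return HAUSER_SHARE * gdp_dollars

# Illustrative: on an assumed $14 trillion economy, receipts are about
# $2.73 trillion. If a rate hike shaved 2% off GDP (also assumed),
# revenue would fall rather than rise:
base = receipts(14.0e12)
after_hike = receipts(14.0e12 * 0.98)
```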

The fair tax
or a
Relatively flat tax

The wealthier someone is then the more control they can have over their financial profile. Money can be shifted between income, corporate profits, dividends and capital gains and new income can be shifted between jurisdictions.


Tax brackets 1971-1978

1975: Median income $11,800, mean income $13,779.
Someone making 5 times the median income would be in the $60k-70k range
(equal to someone now making $200,000).
The tax rate would be 53-55%.

1965: Median income was $6,900. Five times that was about $35,000, for a 50-53% tax rate.

CBO analysis of long term taxes

Heritage examination of taxes

Comparing some tax burdens between countries

Comparing top marginal individual and corporate tax rates

Historical lessons of lower tax rates

New Apple iPhone 2.0 Available July 11, 2008 starting at $199

The new Apple iPhone 2.0, able to use faster 3G communication, was announced today. It will be available for $199 for the 8GB model and $299 for the 16GB model. iPhone 2.0 will be available July 11, 2008.

Competing Smartphones
The Economist magazine notes that while the Apple iPhone has 20% of the smartphone market in the United States it only has 5% worldwide.

There are competing phones from Samsung with superior digital cameras (5 megapixel instead of 2 megapixel for the iPhone). The iPhone sets the standard for the quality of its interface. The new iPhone will also have improved enterprise software and integration with Microsoft Outlook email.

The Samsung Instinct will be on the Sprint network at 3G speeds

Samsung Omnia i900 will become available in Southeast Asia first and then be launched to other markets over the second half of 2008, according to Samsung.

More info on the Samsung Omnia i900

Blackberry Bold 9000 is a 3G smartphone.

The Blackberry Thunder touchscreen phone is featured at Blackberry Cool. They have a picture which shows the Thunder as having a large screen like the iPhone.

Nokia is still the world smartphone leader with the N95 and the Nokia S60

The Times Online discusses the competing smartphones

CEO Steve Jobs said the new iPhone, which is based on 3G technology, is 36% faster than top rival Nokia's N95 smartphone.

Jobs says the new iPhone will be available worldwide starting July 11. It will allow up to six hours of Web browsing and five hours of talk time.

Jobs announced the 3G iPhone, which had been rumored for months, at the company's annual World Wide Developers conference in San Francisco.

During the show, Jobs also introduced a slew of new applications for the iPhone, including a wireless system that automatically forwards e-mail to other devices, a friend-finding service called Loopt and mobile blogging software from TypePad.

Other new applications for the iPhone include a service from MLB.com that provides a live scoreboard of major league games, and music-making software, called Cow Terry, for creating songs on the phone.

The new iPhone applications are aimed at boosting revenue from data services. Wireless companies increasingly are looking to these services to offset slowing growth in mobile phone sales. Apple, for instance, will charge $99 a year for its new MobileMe service, which sends e-mail, contact and calendar updates to a user's devices.

The official Apple iPhone site

iPhone 2.0 features listed at Apple.com

Lorentz Force propulsion Successful test

Since the recent trial [with explosive arcing problems], Peck and his colleagues at the University of Michigan and State University of New York, Binghamton, have successfully tested (but not yet published) their propulsion system, which could speed satellites along at more than four and a half miles a second. More recent tests of solderless satellites at the University of Michigan have been successful, said Peck.

Peck and his colleagues argue this new kind of mini device could make satellite missions more affordable and feasible.

The propellant-less satellite idea works a lot like a TV. A 'gun' at the back of the TV shoots out negatively charged electrons. As they speed towards the viewer, a magnet changes their direction. On a planetary scale, the electron would be the satellite zooming around the magnet, in this case the Earth. As the satellite zooms around the spinning Earth it would experience a force (known as the Lorentz force) pushing it at an angle perpendicular to its direction. The satellite would steal a tiny bit of the Earth's energy to propel it forward.
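The magnitude of the effect can be estimated from the Lorentz force law, F = qvB, for velocity perpendicular to the field. The charge, speed and field values below are illustrative assumptions, not the researchers' figures:

```python
# Back-of-envelope Lorentz force on a charged satellite in low earth orbit.
def lorentz_force_n(charge_c, speed_ms, field_t):
    # F = q * v * B, with velocity perpendicular to the magnetic field
    return charge_c * speed_ms * field_t

# Assumed: 1 millicoulomb of net charge, 7,500 m/s orbital speed,
# ~30 microtesla geomagnetic field at LEO altitude:
force = lorentz_force_n(1e-3, 7_500, 30e-6)   # about 2.25e-4 newtons
```

The force is tiny, which is why the concept suits very small satellites: with no propellant being expended, even a fraction of a millinewton can integrate into useful velocity change over months.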

Other designs using the same principle, including the Electro Dynamic Tether, have been successfully used in orbit. One difference between the EDT and the new system is that the tether has to be aligned in a specific direction, whereas the new satellites wouldn't need to be.

Lorentz force propulsion could be used to refuel Orion nuclear rockets.

Los Alamos Roadrunner supercomputer will run at over one petaflop/second sustained speed in 2009

Roadrunner is a cluster of approximately 3,250 compute nodes interconnected by an off-the-shelf parallel-computing network. Each compute node consists of two AMD Opteron dual-core microprocessors, with each of the Opteron cores internally attached to one of four enhanced Cell microprocessors. This enhanced Cell does double-precision arithmetic faster and can access more memory than can the original Cell in a PlayStation 3. The entire machine will have almost 13,000 Cells and half as many dual-core Opterons.

Scientists at the Los Alamos government weapons lab will have the world's fastest computer. It will run at a sustained 1,000 trillion operations per second. Roadrunner will also be the first computer to run LINPACK, the universally recognized code used to test supercomputer performance, at over 1 petaflop/s. The Roadrunner supercomputer is scheduled for installation at Los Alamos starting in summer 2008, with full operation targeted for early 2009.

The $133 million Roadrunner was just assembled and tested by IBM to run at 1.026 petaflops and has been disassembled for installation at Los Alamos
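The peak numbers line up with the article's chip counts. The per-chip double-precision figure below is the published PowerXCell 8i spec; the totals are back-of-envelope:

```python
# Back-of-envelope peak performance from the article's chip counts.
NODES = 3_250             # "approximately 3,250 compute nodes"
CELLS_PER_NODE = 4        # one enhanced Cell per Opteron core
CELL_DP_GFLOPS = 102.4    # published double-precision peak per PowerXCell 8i

cells = NODES * CELLS_PER_NODE             # ~13,000 Cells, as the article says
peak_pflops = cells * CELL_DP_GFLOPS / 1e6
# About 1.33 petaflops of peak from the Cells alone, versus the 1.026
# petaflop sustained LINPACK result: roughly 77% efficiency, quite good
# for a hybrid machine.
```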

The Cell microprocessor contains a Power PC compute core that oversees all the system operations and a set of eight simple processing elements, known as SPEs, that are optimized for both image processing and arithmetic operations at the heart of numerical simulations. Each is specialized to work on multiple data items at a time (a process called vector processing, or SIMD), which is very efficient for repetitive mathematical operations on well-defined groups of data.

The Roadrunner has a standard cluster of microprocessors (in this case AMD Opteron dual-core microprocessors). Nothing new here except that each chip has two compute cores instead of one. The hybrid element enters the picture when each Opteron core is internally attached to another type of chip, the enhanced Cell (the PowerXCell 8i), which has been designed specially for Roadrunner. The enhanced Cell can act like a turbocharger, potentially boosting the performance up to 25 times over that of an Opteron compute core alone.

The rub is that achieving a good speedup (from 4 to 10 times) is not automatic. It comes about only if the programmers can get all the Cell and Opteron microprocessors and their memories working together efficiently.

“We replace our high-performance supercomputers every 4 or 5 years,” says Andy White, longtime leader of supercomputer development at Los Alamos. “They become outdated in terms of speed, and the maintenance costs and failure rates get too high.”

The Cell was designed with enough computer power to enhance interactivity, allowing video games to be even less scripted. It has eight specialized processing elements (SPEs) that get around the speed barrier by working together. They can generate dynamic image sequences in record time, sequences that reflect the game player's intention and even have the correct physics.

The Cell gets around the memory barrier as well. It does so by having a small, fast local (on-chip) memory plus a memory engine for each SPE and an ultra high speed bus to move data within the Cell. The local memories store exactly the data and instructions needed to perform the next computations while all eight memory engines act like runners, simultaneously retrieving from off-chip memory the data that will be needed for computations further down the line.
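The overlap trick described above is classic double buffering: fetch block i+1 while computing on block i. A minimal sketch of the pattern in plain Python, with a thread standing in for an SPE's DMA memory engine:

```python
import threading

def process_stream(blocks, compute):
    """Double buffering: prefetch the next block while computing on the
    current one, the way each SPE's memory engine hides off-chip latency."""
    results = []
    if not blocks:
        return results
    current = list(blocks[0])          # initial fetch into "local store"
    for i in range(len(blocks)):
        fetched = {}
        t = None
        if i + 1 < len(blocks):
            # stand-in for an asynchronous DMA get of the next block
            t = threading.Thread(
                target=lambda j=i + 1: fetched.update(buf=list(blocks[j])))
            t.start()
        results.append(compute(current))  # compute overlaps the fetch
        if t is not None:
            t.join()                      # wait for the "DMA" to complete
            current = fetched['buf']
    return results

# process_stream([[1, 2], [3, 4], [5, 6]], sum) -> [3, 7, 11]
```

On the real hardware the fetch is a hardware DMA into each SPE's small local store rather than a thread, but the scheduling idea is the same: the processor never waits on memory if the next block can be staged in time.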

Optimized for maximum computation per watt of electricity, the Cell looked like a good bet for accelerating supercomputing performance. Los Alamos knew, however, that the Cell would need some modifications for petaflop/s scientific computing. IBM was willing to work on the enhancements.

Japan's NEC working towards 10 petaflop supercomputer

Tensilica's configurable processors could make exaflop supercomputers practical and petaflop computers cheaper

Cell processors and FPGAs and GPGPUs compared

Substantial rearchitecting of supercomputers will likely be needed to make practical zettaflop computers. An Extreme Computing conference in 2007 examined the issues, and it seems things like on-chip photonics are necessary to bring cost and power down to reasonable levels.

Going beyond Zettaflops to Yottaflops and Xeraflops would probably require all optical computers or some other completely new architectures.
