Monday, August 04, 2008

Constraints to wind power

Jesse Ausubel has a pretty good essay which describes what he thinks the future of power production will look like. It's a somewhat rosy picture, and although I'm also optimistic (but not for the next few years), I disagree on a couple of points.

Like me, he thinks we've got to get away from coal, but I don't follow his reasoning for why. He seems to have identified a long-term trend toward fuels with less carbon and more hydrogen, and he thinks we should make choices that perpetuate that trend. As near as I can tell, he's skipped the part about why the trend is a good thing. Perhaps he thinks consumers like lower-carbon fuels because they tend to burn with fewer combustion byproducts, but he doesn't back this claim up with any market analysis.

I think we as a country need to stop burning coal because
  • we import a lot of oil to burn coal (we spend almost as much on oil to move the coal as on the coal itself), so that the price of coal-fired power is quite sensitive to the cost of oil,
  • it is politically possible to install lots more windpower, but coal is seeing opposition, and it is vital to our economic health to get a lot more electric supply,
  • wind power is inelastic supply, whereas coal power is elastic. That is, a coal plant will shut down if the price of electricity falls below its operating costs, but a wind turbine costs almost nothing to run and will keep generating through a larger swing in electricity prices, which will make our electricity supply more predictable,
  • and finally and perhaps most importantly, because climate change matters.
My biggest point of disagreement is with Jesse's assertion that windpower is impractical due to land use constraints. Other, perhaps clearer-thinking people have made this same point. Jesse makes a very sobering calculation: he figures a wind farm produces 1.2 watts per square meter, average. To produce all of the U.S. grid's 450 gigawatts (average), you'd need a lot of land. Jesse calculates 780,000 square kilometers. The area of the U.S., for reference, is 9.16 million square kilometers, with 1.75 million square kilometers of cultivated cropland. I don't get quite as large a number as Jesse, but we'll take his 780k km^2 for now.
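Running the same arithmetic (a minimal sketch; the 450 GW grid output and area figures are from the essay), the land requirement falls out directly:

```python
# Land needed to supply the entire U.S. grid from wind at Jesse's
# assumed average yield of 1.2 watts per square meter.
grid_avg_watts = 450e9      # average U.S. grid output
yield_w_per_m2 = 1.2        # Jesse's wind-farm power density
area_km2 = grid_avg_watts / yield_w_per_m2 / 1e6
cropland_km2 = 1.75e6
print(area_km2)                   # about 375,000 km^2
print(area_km2 / cropland_km2)    # about 21% of cultivated cropland
```

I get about 375,000 km^2, which is why my number comes out smaller than Jesse's 780,000; he may be sizing for a larger future grid.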

He figures that's just too much land. But this argument is trite. I'll skip over the point that farmland with wind turbines is still farmed land, and instead focus on a more basic question: How much is too much? I think too much is when the next wind turbine to be installed is projected to make no money. That could be all the farmland plus a lot of offshore turbines, or it could be just a few places in North Dakota. It won't be decided by people getting scared of erecting some more infrastructure on 44% of our existing cropland. Farms in the Netherlands in the 1800s were dotted with windmills, because that's what drove the pumps to keep the water out. Farms in the U.S. in the late 1800s were dotted with windmills, with parts shipped at enormous expense across the continent, because that's what pumped the irrigation water wells. Modern farms aren't currently dotted with wind turbines because they've been using oil instead.

Jesse's argument is also trite because it ignores the huge variation in windiness around the U.S. In North Dakota, the entire state is class 4 or above. That means the power available at 50 meters above the ground is 400-500 watts/meter^2. Even during the summer doldrums, the average power available is 300-400 watts/meter^2.

Jesse's 1.2 watts/meter^2 number comes from a wind farm in Lamar, Colorado. That wind farm has 108 1.5 MW turbines spread over an 11,840 acre area. Multiply by a 30% capacity factor, and you get 1.01 watts/meter^2. (I'm not sure how he got the extra 20%.) Why is this number so low?
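You can check the Lamar density from the numbers above; the acre-to-square-meter conversion is the only thing I'm adding:

```python
turbines = 108
rating_watts = 1.5e6
capacity_factor = 0.30
area_m2 = 11840 * 4046.86   # acres to square meters

density = turbines * rating_watts * capacity_factor / area_m2
print(density)              # about 1.01 W/m^2
```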

It's economics. The company that owns the wind turbines pays the company that owns the land on which the turbine is sited approximately $3000 to $6000 per year per turbine. The net present value of that payment stream is $60,000 to $120,000. The turbine costs $1,500,000, which is a lot more. Spacing the turbines farther apart slightly increases the power from each turbine, at small increases in royalty payments and road and cable construction costs. If land scarcity ever becomes an issue for wind farmers, I would expect $ per watt and watts per km^2 to go up. Note that $/watt may go up slightly, while watts per km^2 may go up a lot.
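The net-present-value figures are what you get by treating the royalty as a perpetuity; the 5% discount rate is my assumption, chosen because it reproduces the $60,000 to $120,000 range:

```python
def perpetuity_npv(annual_payment, discount_rate):
    # NPV of a payment stream that continues forever
    return annual_payment / discount_rate

turbine_cost = 1_500_000
low = perpetuity_npv(3000, 0.05)    # 60,000
high = perpetuity_npv(6000, 0.05)   # 120,000
print(low / turbine_cost, high / turbine_cost)   # land is only 4-8% of turbine cost
```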

Consider that the first big wind farm, on the Altamont Pass, has a power density of 0.86 watts/m^2, which is lower than Lamar's density. If you follow that link, you'll note that wind farms vary from 0.24 watts/m^2 (Pierce County, N.D.) to 5.3 watts/m^2 (Braes of Doune, Scotland). I think land prices, more than turbine capability, are driving the energy density of these farms.

Note that the wind power map above quotes wind at 10 and 50 meters above the ground. Back when the Department of Energy began collecting data for these maps, those were considered the likely bounds of practically sized wind turbines. However, the Lamar turbine towers are 70 m tall. It turns out that the tower costs are mostly just steel, and the higher up you go, the faster the wind blows. After the industry got experience with the costs of siting, permitting, building, bird strikes, aesthetics, and so forth, it turned out worthwhile to spend more on steel in the tower and concrete in the foundation. As a result, watts per km^2 has gone up.

Is there a limit? Placing turbines closer together can collect more wind energy, but fundamentally most wind power is still being dissipated as turbulence and then heat higher up in the atmosphere. Bigger wind turbines reach farther up to capture more energy. It is hard for me to imagine that ground-based wind turbines are going to get substantially taller than they are now, and so I do not expect the average power yield to increase much beyond, say, 2 or 3 watts/m^2 average. 2 watts/m^2 across all of North and South Dakota would yield 750 gigawatts, which is why you hear wind advocates claiming that the Dakotas can power the rest of the U.S. They could, if you could transport the electricity to market.
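The Dakotas claim checks out with rough state areas (the 183,000 and 200,000 km^2 land-area figures are approximations I'm supplying, not numbers from the essay):

```python
north_dakota_km2 = 183e3    # approximate land area
south_dakota_km2 = 200e3    # approximate land area
yield_w_per_m2 = 2.0        # my guessed ceiling for future farm density

power_gw = (north_dakota_km2 + south_dakota_km2) * 1e6 * yield_w_per_m2 / 1e9
print(power_gw)             # about 766 GW, call it 750
```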

Finally, I doubt very much that, even if windpower is wildly successful, it will ever account for anything like 100% of the U.S. grid's production. If many coal plants are forced out of production by lower cost wind plants, I would expect that some very efficient mine-mouth plants will remain. I will be astonished (and pleased) if wind ever produces half the U.S. capacity. If that ever happens, wind turbines will be a familiar sight, but not an overwhelming use of land.

Jesse also complains that wind turbines take significantly more steel and concrete than nuclear powerplants. Obviously the steel and concrete are factored into the current prices of turbines, so it's already part of the price comparisons being made. There are two future risks to large use of concrete and steel, however:
  • Wind turbine prices in the future could be more closely tied to raw material prices (which in turn depend on the cost of energy) than to the price of labor (which depends on the state of the economy). This question resolves to whether future wind turbine prices are more sensitive to the cost of imported energy than electricity from coal is. Coal-fired electricity is fairly sensitive to oil prices, so I doubt this is a problem.
  • A large bump in wind turbine construction could use so much concrete and steel that it would distort the markets and cause large price increases.
The second issue got me to pull out the calculator again. Here are Jesse's numbers, actually Per Peterson's numbers, in context of the production necessary to build a 250 GWe average windpower grid (about half U.S. electric consumption):
  • Steel: 460 metric tons per MWe. The U.S. produces about 90 million metric tons of steel every year. Over the 30 years it would take to build a new US grid, wind turbines would require 1.3 years' worth of production.
  • Concrete: 870 cubic meters per MWe. The U.S. ready-mix industry produces about 350 million cubic meters a year, so we'd need 0.6 years' worth of concrete production.
These constitute a nice bump to domestic production, but are significantly less than ordinary year-to-year variation.
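The arithmetic behind those two bullets, reading the per-MWe figures as per average MWe (the reading that reproduces the totals):

```python
avg_mwe = 250_000                  # a 250 GWe average wind build

steel_t = 460 * avg_mwe            # 115 million metric tons total
steel_years = steel_t / 90e6       # vs. 90 Mt/yr of U.S. steel output

concrete_m3 = 870 * avg_mwe        # 217.5 million cubic meters total
concrete_years = concrete_m3 / 350e6   # vs. 350 Mm^3/yr of ready-mix output

print(steel_years, concrete_years)     # about 1.3 and 0.6 years' worth
```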

The bottom line: if the price is right (or even close), let's have all the wind turbines we can build, because it really could help with our foreign trade deficit, economic sensitivity to energy prices, and global warming.

Tuesday, July 22, 2008

Dumping Quicklime into the Oceans

Tim Kruger at Cquestrate has an idea for sequestering large amounts of CO2: dump quicklime (CaO) in the ocean.

The basic idea is to convert limestone (CaCO3) and CO2 into calcium bicarbonate (Ca(HCO3)2).

CaCO3 + energy -> CaO + CO2 Burn limestone into quicklime
CaO + H2O -> Ca(OH)2 Dissolve quicklime in ocean to make calcium hydroxide
Ca(OH)2 + 2CO2 -> Ca(HCO3)2 Calcium hydroxide absorbs CO2 to make calcium bicarbonate

Net:
CaCO3 + H2O + CO2 + 178 kJ/mol -> Ca(HCO3)2

The problem is the amount of energy required. Let's say it comes from coal. Typically, you can get 30 MJ/kg out of coal. To get your 178 kJ above, you'll produce a half mol of CO2 just burning coal, assuming perfect efficiency. That's half your benefit gone right there.
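Here's where the half mol comes from, assuming the coal is essentially pure carbon (my simplification):

```python
calcination_kj_per_mol = 178.0   # heat needed per mol of limestone
coal_mj_per_kg = 30.0
carbon_molar_mass_g = 12.0

coal_g = calcination_kj_per_mol / coal_mj_per_kg   # grams of coal per mol of limestone
co2_mol = coal_g / carbon_molar_mass_g
print(co2_mol)   # about 0.49 mol of CO2 per mol of limestone calcined
```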

But, it's a high temperature reaction (840 C). That means you have to get the reactants (calcium carbonate, coal and coal oxidizer, e.g. air) up to that temperature, react them, then drop the reaction products back down to normal temperature. To get perfect efficiency, all of the heat from the cooling products has to be transferred to the reactants. There is going to be some loss.

Let's say you lose 25% of the coal heat, and 75% goes to making quicklime. Then, for every 2 kg of coal burned, you will eventually absorb the CO2 that was produced by burning another kg of coal somewhere else.

Bottom line: we'd have to triple the rate at which we burn coal to get carbon neutral with this scheme. That's not practical. It'll get better if we use natural gas or oil, but it won't change the basic calculation that we'd have to multiply our existing consumption of fossil fuels to get carbon neutral.
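The tripling comes straight out of the CO2 bookkeeping per mol of quicklime; the 75% heat recovery is the guess from above:

```python
def coal_multiplier(heat_efficiency):
    # CO2 accounting per mol of quicklime made and dumped:
    calcination_co2 = 1.0               # driven out of the limestone itself
    coal_co2 = 0.5 / heat_efficiency    # from the coal supplying the 178 kJ
    absorbed_co2 = 2.0                  # taken up by Ca(OH)2 as bicarbonate
    net_absorbed = absorbed_co2 - calcination_co2 - coal_co2
    # coal burned for lime per unit of other coal offset, plus the original coal:
    return 1 + coal_co2 / net_absorbed

m = coal_multiplier(0.75)
print(m)   # 3.0 -- burn three times as much coal overall
```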

Now, if someone wants to tell me about a scheme in which limestone is burned in a solar furnace to make cement, I'm all ears. CO2 sequestration from such cement manufacture makes more sense than it does from coal-fired powerplants, because limestone burning (no air) releases pure CO2, whereas coal burning releases CO2 mixed with lots of nitrogen from the air. However, there are lots of other problems.

Sigh. We're not getting out of this mess easily.

Tuesday, July 15, 2008

The Pickens Plan

Check out the guy's website, if you haven't already. There is not a lot of meat there. Basically, the idea is that if we build enough wind turbines to provide 20% of our electricity, we can reduce the amount of natural gas that we burn to make electricity. This natural gas can be used to power special new cars, which will reduce our imports of petroleum.

Mr. Pickens' chief aim is to reduce U.S. petroleum imports. That's great, because that's the energy policy issue I care about most, too. However, I see two problems with his plan:
  1. As things stand now, large fast changes in wind turbine output will have to be accommodated by throttling natural gas turbines. Gas turbines cannot throttle down to zero power efficiently. So, even when the wind is blowing, a large amount of power will have to come from gas turbines running at partial throttle, ready to take over if the wind cuts out. If wind is supplying 20% of our domestic power, these partial-load gas turbines will have to supply some similarly large amount, and as a result there may not be much gas actually saved.
  2. I don't foresee a switch to compressed natural gas burning cars. I suspect it would be cheaper and have a larger, more immediate impact to convert the natural gas (and some coal) into gasoline in a refinery, and then feed that into the existing transportation system.
I have two humble suggestions for Mr. Pickens, or energy policymakers.

1. Switch home heating to electric heat pumps.

In 2006, 5 billion gallons of distillate fuel oil was sold to residential users, almost all of it used to heat their homes. Ignoring refinery gain, this is 160.8 million barrels, or about 3.6% of the 4.5 billion barrels of oil imported that year.

Nearly all the houses heated by distillate fuel oil have grid electricity. These houses can be upgraded to air-source heat pumps for a few thousand dollars each. Electricity can come from coal or natural gas, either one of which is better than petroleum. The economics are probably already there for the switch, so some public education and low-cost financing should push homeowners to embrace heat pumps en masse. This can happen a lot sooner than moving the U.S. car fleet to compressed natural gas.
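A back-of-the-envelope on those economics, with my own assumed prices (heating oil near $4/gallon, electricity at 10 cents/kWh, a seasonal heat-pump COP of 2.5 -- none of these are from an EIA table):

```python
BTU_PER_GALLON = 138_500   # heating oil
BTU_PER_KWH = 3_412

# Cost to deliver one million BTU of heat into the house:
oil_cost = 4.00 / (BTU_PER_GALLON * 0.80) * 1e6     # 80%-efficient oil furnace
heat_pump_cost = 1e6 / BTU_PER_KWH / 2.5 * 0.10     # heat pump, COP 2.5

print(oil_cost, heat_pump_cost)   # roughly $36 vs. $12 per million BTU
```

Even with pessimistic assumptions the heat pump wins by a wide margin, which is why I think the switch mostly needs education and financing, not subsidy.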

This switch can reduce our oil import bill without requiring the first step of lots of wind turbines. Maybe I'm just nitpicking, but $21.4 billion dollars per year (for the 160.8 million barrels imported) seems like an interesting amount of money.

2. Make air conditioners work on intermittent electricity

This is also known as "Direct Load Control" or "Demand-side Management".

One of the problems with wind energy is that it's intermittent. Increasing the amount of wind generation in the national grid will increase the variation in load that the other generators must accommodate. This will cost money. It will cost less money if the other generators have 10 or 15 minutes to accommodate variation.

Air conditioners and heat pumps naturally store energy. It takes time to cool or heat a building. Usually, the pump cycles on or off every few minutes. If the utility has a fast way to shut down large numbers of compressors for a few minutes, it can filter out much of the short-term variation in load and supply. Instead of throttling gas turbines from 50% to 100%, a few minutes' notice gives the utility time to turn on gas turbines -- from 0% to 100%. That means that the 50% rated capacity that was otherwise being produced by a gas turbine can be produced by a coal-fired turbine instead, which is much cheaper.

This change is a good idea regardless of whether a massive wind turbine build happens, because it will allow utilities to use less natural gas and more coal. That may alarm some folks. Some may see a hidden agenda here. I think if the same bill in Congress mandates Direct Load Control on HVAC devices, and guarantees a production tax credit for all non-carbon domestic sources for a decade, that should assure doubters and put some real fire in the market.

Right now, hydroelectric turbines are the cheapest load-following generation around. They produce just 7.1% of the electricity in the United States (2006). Unfortunately, all of this load following capacity is already used.

For comparison, HVAC uses more than 29% (page 44 here, plus this, both from the EIA) of our generated electricity. Instantaneous control over this much load would be sufficient to accommodate any amount of wind power that we care to build. Of course, the utilities (really the system operators) can't control HVAC, yet. I don't think this is a problem, because we don't have 450 gigawatts of wind turbines yet either.

I suspect the average lifetime of HVAC equipment is around 20 years. If the government mandated that all HVAC equipment sold after, say, 2009 had Direct Load Control features, then we'd see about 15 new gigawatts of Direct Load Control every year. There is little danger of us building wind turbines faster than that in the near future.
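The 15 GW/year figure implies roughly 300 GW of connected compressor capacity; that stock number is my inference, not an EIA statistic:

```python
installed_hvac_gw = 300    # assumed U.S. connected compressor capacity (my guess)
lifetime_years = 20        # assumed average HVAC equipment life

new_dlc_gw_per_year = installed_hvac_gw / lifetime_years
print(new_dlc_gw_per_year)   # 15 GW of controllable load added per year
```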

Wednesday, July 09, 2008

Burning coal is burning oil

I found some numbers for the oil cost of burning coal.

Freight trains in the United States burn 1 gallon of diesel to move a ton of freight 436 miles.

Average distance coal travels in US: 628 miles from mine mouth to powerplant. At $4.03/gallon, that's $5.80 for the diesel to move a ton of coal from the mine mouth to the powerplant, on average. Wyoming coal costs $9 at the mine mouth. So, electric producers pay almost as much for the diesel to move the coal as for the coal itself. Since marginal petroleum is imported, it's fair to say that coal is not entirely a domestic fuel.
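The diesel figure comes together like this:

```python
ton_miles_per_gallon = 436   # rail freight efficiency
haul_miles = 628             # average mine-to-plant distance
diesel_price = 4.03          # dollars per gallon

diesel_cost_per_ton = haul_miles / ton_miles_per_gallon * diesel_price
print(diesel_cost_per_ton)   # about $5.80, vs. $9 for the Wyoming coal itself
```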

The average powerplant cost for coal in the U.S. in 2006 was $34.26/ton. That's because coal mined outside of the Powder River basin in Wyoming costs a lot more to dig out -- the average mine-mouth price across the U.S. in 2006 was $25.16/ton. The difference is $9.10/ton, which is the cost of transport. The cost of diesel was a bit lower in 2006, but it looks like around half the transport cost is the diesel.

If the coal is 22 MJ/kg, and the plant is 35% efficient, then for each kWh at the powerplant you spend on average 1.8 cents for the coal. Just the fuel cost of the coal plant is more than the total operating cost of the Palo Verde nuclear powerplant, per kWh. This result is entirely independent of subsidies or clean coal. The black stuff is apparently just really expensive.
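Checking the fuel cost per kWh: with these round inputs I get a bit over 1.6 cents; a lower plant efficiency or higher delivered price pushes it toward 1.8:

```python
kwh_joules = 3.6e6
plant_efficiency = 0.35
coal_mj_per_kg = 22.0
delivered_price_per_ton = 34.26   # average 2006 delivered price

coal_kg_per_kwh = kwh_joules / plant_efficiency / (coal_mj_per_kg * 1e6)
fuel_cents_per_kwh = coal_kg_per_kwh * delivered_price_per_ton / 1000 * 100
print(coal_kg_per_kwh, fuel_cents_per_kwh)   # ~0.47 kg and ~1.6 cents per kWh
```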

A while back, I snarkily suggested that mine mouth coal powerplants were a way to keep the pollution away from rich people. Looks like I was wrong:
  • Transporting a kWh of electricity 1000 miles increases the cost by 19%.
  • Transporting the coal necessary to make that electricity 1000 miles costs $14.49/ton, assuming cost is linear with distance. That's a 58% increase in the cost of the fuel. Assuming the fuel cost is 70% of the cost of producing electricity, that's a 40% increase in the cost of the electricity.
  • 4000 miles (across the continent) by electricity: increase cost by 107%.
  • 4000 miles by coal train: 160% increase.
What about the extra carbon? Transporting 1000 miles as electricity means you must make an extra 8.7% more electricity which gets lost in the wires, which produces 8.7% more CO2. Transporting 1000 miles by coal train burns 6.3 kg of carbon in the diesel to deliver perhaps 800 kg of carbon, which increases the total carbon released by 0.8%. Clearly the diesel locomotive is the lower carbon, if much more expensive, alternative.
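The carbon comparison, using an assumed 2.75 kg of carbon per gallon of diesel (the other numbers are from the paragraph above):

```python
# Moving 1 ton of coal 1000 miles by rail:
diesel_gallons = 1000 / 436
diesel_carbon_kg = diesel_gallons * 2.75   # assumed carbon content of diesel
coal_carbon_kg = 800                       # carbon in a ton of typical coal
rail_penalty = diesel_carbon_kg / coal_carbon_kg   # ~0.8% extra carbon

wire_penalty = 0.087   # extra generation lost in 1000 miles of wires
print(rail_penalty, wire_penalty)
```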

Average distance coal travels in China: 230 miles. They're burning a lot less diesel to take advantage of their domestic coal.

Sunday, June 29, 2008

Keep women away from stairs!

I have summarized for your convenience the top 7 consumer killers in the United States in the year 2001, and swimming pools, for comparison. I think the conclusion here is inescapable: women must be kept away from stairs. This is a significant issue for me because I live in a house with two stories and a basement, with my wife and three daughters. Although the statistics presented are not specific, it does appear that the problem is largely with older women, so we'll definitely have my mother-in-law stay in the downstairs bedroom.

As I write this my two older girls are directly behind me, messing around in the crib that we generally use for our youngest. A quick check shows that I should escort them outside where they can safely fool around in traffic on some ATVs!

Total deaths   Male accidents   Female accidents   Category
     202,104          767,142          1,274,004   Stairs, Ramps, Landings, Floors
      45,964          248,445            291,530   Beds, Mattresses, Pillows
      30,271          203,930            252,960   Chairs, Sofas, Sofa beds
      25,023          125,312            168,238   Bathroom structures and fixtures
      24,750          414,008            151,660   Bicycles
      21,239          169,834             38,022   ATVs, Mopeds, Minibikes
      19,085          150,667             72,498   Ladders, Stools
       5,322           88,864             72,894   Swimming Pools, Equipment

Wednesday, June 25, 2008

CO2 sequestration -- size of the kill zone

Sometimes, the underground reservoirs that store natural gas explode. Drilling wells into them makes this more likely. When wells explode, the gas generally ignites, making a spectacular flame that can be seen for miles. Aside from the loss of valuable fuel and equipment damage, well explosions generally aren't too big a problem for people living nearby.

One less noteworthy effect of a well explosion is that the CO2 generated from the combustion of the methane is carried high up into the atmosphere by the heat of combustion, where it is mixed by high-altitude winds (routinely 100 MPH).

One plan for CO2 sequestration from coal-fired powerplants is to inject the CO2 into old, empty gas wells. Like the methane, the CO2 is in a supercritical state in the well -- not so much a liquid as a very dense high pressure gas.

The difference between CO2 and CH4 comes when the well explodes. CO2 does not start a fire. Instead, it expands, and cools, and the cold CO2 will flow with the wind, against the ground, eventually dissipating.

A 1 GW (electrical) coal-fired powerplant will burn 2.2 GW (thermal) of coal (because it's about 45% efficient). That's about 7000 metric tons every day. It will produce 4.7 cubic kilometers of carbon dioxide per year, at standard temperature and pressure. That CO2 is fatal to mammals at concentrations greater than 4%.

So, if a sequestration field explodes after 10 years of sequestering the output from a 1 GW coal plant, it will create an invisible blob of CO2 that will be at least 7 km across before it dissipates to the point of being nonlethal.
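The 7 km is conservative. Modeling the lethal cloud as a sphere diluted just to the 4% threshold:

```python
import math

co2_km3_per_year = 4.7   # CO2 at STP from a 1 GWe coal plant
years = 10
lethal_fraction = 0.04   # CO2 concentration fatal to mammals

pure_km3 = co2_km3_per_year * years
diluted_km3 = pure_km3 / lethal_fraction
diameter_km = (6 * diluted_km3 / math.pi) ** (1 / 3)   # sphere of that volume
print(diameter_km)       # about 13 km across
```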

Think about this thing for a bit. CO2 inhalation is fatal within a couple of minutes, and I suspect it is disabling well before that. Who is going to detect this blob of gas before being overcome? You cannot see it. You cannot run from it. You cannot stay indoors to escape. You cannot start your car to drive away from it. As the wind wafts it across the scenery, it kills every animal in its path. It could go for 50 kilometers or more before wind shear mixes it with enough air to become safe.

Not in my back yard, if you please.

Monday, June 09, 2008

Anya's Gift


For Anya's 6th birthday, we had a huge party. Over 50 people came. It was a blast.

We've been trying to reduce the accumulation of toys in our house, and presents from that number of people was going to be a problem. So, we told people not to bring presents, or if they did, bring something suitable for the Ronald McDonald house, which is a temporary home for the families of kids undergoing serious treatments at Stanford Hospital.

If anything, the haul got better (oy! consumerism). Here is Anya delivering her presents to the charity.

Monday, June 02, 2008

Discovery Launch


I just got back from watching the Discovery launch. My boss, Ed Lu (former 3-time astronaut, second from left), hosted us, which really made the experience for me because he was able to introduce us to lots of folks. Every time we walked into a restaurant, and every 5 minutes while we were at Kennedy Space Center, someone would smile and come over to talk with Ed. NASA doesn't pay well and most folks don't get to try wacky things like we do at Google, but they seem to have great interpersonal relationships. It's heartwarming to see.



On launch day, we were 3 miles from the pad at the media site. This is as close as you can get. We had a lot of waiting around to do. Here is a cherry spitting contest.



I know there is a great deal of speculation out there about whether hacking on camera hardware at Google makes one a babe magnet. While such questions are only academic for me personally, I can tell you that getting out in the midst of a bunch of media types with some very customized photographic hardware attracts all sorts of attention. I don't actually know who this person is but I think we can all agree she's gorgeous, and she was very interested in the camera hardware and what Google was doing with it.



From our vantage point 3 miles away, the shuttle stack was just a little bigger than the full moon, which meant that the flame coming out the back was about that size too. There have been some comparisons to the shuttle exhaust being as bright as day....

Let me put that myth to rest. After two years of designing outdoor cameras, I can tell you that just about nothing is as bright as the sun. From our vantage point it had more angular size than the sun -- maybe 400 feet long by 100 feet wide, viewed from 3 miles, is 1.5 by 0.5 degrees.  The sun is 0.5 degrees across.  But the Shuttle plume is not as hot as the sun -- 2500 K at most, compared to 6000 K for the sun.  Brightness increases as the 4th power of the temperature, so the Sun's delivered power per square meter is something like 11x larger.  Furthermore, most of the light coming from the Shuttle is in the deep infrared where you can't see it, compared to the Sun's peak right at yellow.  So my guess is that the shuttle was lighting us up to 9,000 lux illumination.  That's twice as bright as an operating room, and way brighter than standard office bright (400 lux).  But it's just nothing like the 100,000 lux that you get outside in bright sunlight.  Nobody's going to get a suntan exposing themselves to the shuttle.  (Yes, the shuttle flame reflects off the exhaust plume, but the sun reflects off clouds, which are much bigger, so there is no relative gain there.)
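For the curious, here's a sketch of that estimate, treating both the Sun and the plume as black bodies and scaling by apparent angular area. The spectral correction for the plume's infrared output is left out, which is why this lands above my 9,000 lux guess:

```python
import math

sun_temp, plume_temp = 6000.0, 2500.0   # kelvin
sun_solid = math.pi * 0.25 ** 2         # sun: 0.5 degree disk, in square degrees
plume_solid = 1.5 * 0.5                 # plume: 1.5 x 0.5 degrees, as seen from 3 miles

radiance_ratio = (sun_temp / plume_temp) ** 4    # ~33x per unit of apparent area
flux_ratio = radiance_ratio * sun_solid / plume_solid
lux = 100_000 / flux_ratio              # scaled down from full sunlight
print(flux_ratio, lux)                  # roughly 9x, roughly 11,000 lux
```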

Anyway, back to the people we got to meet. Here we are at lunch in the KSC cafeteria, the day before the launch. That guy two to my right is... named at the bottom of the blog. Have a guess. He had a really neat retro electronic watch and talked about how much he likes his Segway. Picture was shot by Jim Dutton, one of the F-22 test pilots who is now an unflown astronaut.


Here's a terrible picture of Scott Horowitz (former #2 at NASA, the guy who set the direction for the Ares rockets and Orion capsule) talking with Ed. The two were talking about their airplanes, a subject that gets both of them fairly animated ("I love my airplane. It's trying to kill me.")  Sadly, Ed's plane was subsequently destroyed by Hurricane Gustav (while in a supposedly hurricane-proof hangar) later that year.

Sorry about the quality, it was incredibly crowded and Ed and Scott weren't posing. This was on the day of the launch. Scott came out and looked at our Street View vehicle, then narrated the launch for us. Scott is a former 4-time astronaut and has a great deadpan delivery ("okay we just burned off a million pounds of propellant"); he's probably done it a hundred times.

Here's Mike Foale, who Ed has closed the hatch on twice (that means Mike was in the crew after Ed at the ISS twice).


I enjoyed meeting the people and looking at the hardware quite a bit more than the spectacle of the actual launch itself. Basically, the Shuttle makes a big white cloud, climbs out, loud noises ensue, and within two minutes you can just make out the SRB separation with your unaided eyes, and it's gone. The Indy 500, for instance, is louder, and more interesting because there are always going to be crashes and various anomalies, which are not usually injurious and therefore lots of fun for the crowd. After meeting all those competent people who are working so hard to thread this finicky beast through a loophole in Murphy's law, I was just praying the thing wouldn't break on the way up.


P.S. That's Steve Wozniak, cofounder of Apple Computer.

Tuesday, April 29, 2008

How GPUs are better than CPUs

Intel has a great CPU core right now, AMD does not, and in combination with Intel having higher-performance silicon, Intel is currently beating AMD handily. Meanwhile, Intel and AMD are both integrating graphics into the CPU and NVidia probably feels sidelined. So NVidia says that the CPU is dead. I agree, a little.

Many things people want to do these days are memory bandwidth limited. Editing/recoding video, or even tweaking still pictures and playing games are all memory bandwidth limited. GPUs have far better memory bandwidth than CPUs, because they are sold differently.

The extra bandwidth comes from five advantages that GPUs enjoy:
  • GPU and memory come together on one board (faster, more pins)
  • point-to-point memory interface (faster, lower power)
  • cheap GPU silicon real estate means more pins
  • occasional bit errors in GPU memory are considered acceptable
  • GPUs typically have less memory than CPUs
When people buy CPUs, they buy the memory separately from the CPU. There are 2 chip carriers, one socket, a PC board, and one DIMM connector between the two. In comparison, when people buy GPUs, they buy the memory and the GPU chip together. There are 2 chip carriers and a PC board between the two.

CPU memory interfaces are expected to be expandable. Expandability has dropped somewhat, so that currently you get two slots, one of which starts out populated and the other of which is sometimes populated and sometimes not. The consequence is that the CPU to DRAM connection has multiple drops on each pin.

GPUs always have one DRAM pin to each GPU pin. If they use more DRAM chips, those chips have narrower interfaces. Because they are guaranteed point-to-point interfaces, the interfaces can run at higher speed, generally about twice the rate of CPU interfaces.

CPU silicon is optimized for single-thread performance -- both Intel and AMD have very high performance silicon. As a result, the silicon costs more per unit area than the commodity silicon the GPUs are built with. The "programs" that run on GPUs are much more amenable to parallelization, which is why GPUs can be competitive with lower-performance silicon.

It turns out that I/O pins require drivers and ESD protection structures that have not scaled down with logic transistors over time. As a result, pins on CPUs cost more than pins on GPUs, and so GPUs have more pins. That means they can talk to more DRAM pins and get more bandwidth.
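To put numbers on the gap, here's the bandwidth arithmetic with illustrative 2008-era figures; the specific bus widths and transfer rates are examples I'm supplying, not measurements of any particular part:

```python
def bandwidth_gb_per_s(bus_bits, transfers_per_sec):
    # bytes per transfer, times transfers per second
    return bus_bits / 8 * transfers_per_sec / 1e9

cpu = bandwidth_gb_per_s(128, 800e6)    # dual-channel DDR2-800
gpu = bandwidth_gb_per_s(256, 2000e6)   # 256-bit GDDR3 at 2 GT/s
print(cpu, gpu)   # 12.8 vs. 64 GB/s: a 5x gap from more pins at double the rate
```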

All of the above advantages would apply to a CPU if you sold it the same way a GPU is sold. The final two advantages that GPUs enjoy would not apply, but are easy to work around.

The first is the acceptability of bit errors. GPUs do not have ECC. It would be easy to make a CPU/GPU that had a big wide interface with ECC.

The second is the memory size. GPUs typically connect to 8 or 16 DRAM chips with 32b interfaces each. It would be straightforward to connect with 64 DRAM chips with 8b interfaces each. If fanout to the control pins of the DRAMs becomes a problem, off-chip dedicated drivers would be cheap to implement.

So, I think integrated CPU/GPU combinations will be interesting for the market, but I think they will be more interesting once they are sold the way GPUs are sold today. Essentially, you will buy a motherboard from Iwill with an AMD CPU/GPU and 2 to 8 GB of memory, and the memory and processor will not be upgradable.

For servers, I think AMD is going in the right direction: very low power (very cheap) mux chips which connect perhaps 4 or even 8 DRAM pins to each GPU/CPU pin. This solution can maintain point-to-point electrical connections to DIMM-mounted DRAMs, and get connectivity to 512 DRAM chips for 64 GB per GPU/CPU chip.

Sunday, April 20, 2008

Fountain Prototype


Martha and I are building a pool in the back yard. In that pool will be a hot tub, and pouring into that hot tub will be a fountain. I want lots of water flow, and curves, especially since the overall pool will be rectangular (due to the automatic cover). To give you an idea, here's the pool:


The hot tub is circular, and has a 1 foot thick wall that separates it from the pool. Out of the center of that wall, water will leap up, arch over, and fall into the tub. This will pour nicely over your shoulders if you are an adult, and it will make a fancy tube to explore if you are a child.

The trouble is that nobody sells a curved fountain like this. No problem, I'll just assemble it from a number of straight sections. Also, I do not want to use high-pressure pool pumps for this thing. Instead, I want to use low-power, low-pressure pond pumps. The manufacturer of the fountain has specs for the amount of water flow you need, but not the pressure. I smell project risk. Time for a prototype. Here's the overall arrangement: two fountain units, 1 foot wide each, one Sequence 4200seq12 pump, and some pipes to move the water.





I've got a flow gauge, two pressure gauges, and a ball valve so I can figure out how many gallons per minute throws the water how far.


I've also got a peanut gallery. They're interested because they're going to get to dance around in the water in a bit.


The fountains throw water about as far as the manufacturer claims. Note that my flow rates are for two 1 foot units.

Flow rate   Throw         Notes
48 GPM      26.0 inches   7 inch rise
45 GPM      23.5 inches
41 GPM      18.7 inches
37 GPM      14.7 inches
35 GPM      11.5 inches   3+ psi pressure drop
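As a sanity check on the throw numbers, here's a simple projectile-motion sketch (my own model, not the manufacturer's; it assumes the sheet launches and lands at roughly the same height, and takes the 7 inch rise and 26 inch throw from the 48 GPM row):

```python
import math

G = 9.81                    # m/s^2
GPM_TO_M3S = 3.785e-3 / 60  # gallons/minute to cubic meters/second
IN_TO_M = 0.0254

def launch_estimate(rise_in, throw_in, gpm, width_in=12):
    """Treat the water sheet as a projectile launched and landing at the
    same height: the rise fixes the vertical launch speed, the throw
    fixes the horizontal speed."""
    h = rise_in * IN_TO_M
    x = throw_in * IN_TO_M
    vy = math.sqrt(2 * G * h)      # vertical launch speed
    t = 2 * vy / G                 # time of flight, up and back down
    vx = x / t                     # horizontal speed
    speed = math.hypot(vx, vy)
    angle = math.degrees(math.atan2(vy, vx))
    # Sheet thickness from continuity: Q = speed * width * thickness
    q = gpm * GPM_TO_M3S
    thickness_mm = q / (speed * width_in * IN_TO_M) * 1000
    return speed, angle, thickness_mm

# 24 GPM per 1-foot unit (48 GPM through two units), 7" rise, 26" throw
speed, angle, t_mm = launch_estimate(7, 26.0, 24)
```

At roughly 2.5 m/s and a sheet under 2 mm thick, it's not surprising that surface tension and turbulence visibly break the sheet up in flight.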

I learned a bunch of things from this prototype:
  • The flow through the two units was not identical. One moved about 8% more water than the other, and threw the water a little further.
  • The flow through each unit was not uniform. The unit throwing farther was throwing farther on one end.
  • With no fine filtration, and just a skimmer before the pump, the fountain units quickly accumulated debris that interfered with the flow.
  • The water sheet from each unit contracts from surface tension as it gets farther from the fountain. A 14 degree included angle between the two units turned out to roughly match the contraction, but this still left a constant gap from one to the next. I may try to fix that by mitering the two fountains together.
  • Martha and I agreed that 15 or 20 GPM per linear foot is not enough. We really like 25 GPM/foot better.
  • The fountain water entering the water surface was the cause of all the noise. The pump was really quiet, and you could only hear it when you walked right over to it.
  • The pump really doesn't prime itself. I had to stuff a hose up the intake and fill it full of water before the pump would move anything.
  • This pump can just move 48 GPM with this setup (which implies it is seeing about 5 feet of head). With more angles and losses in the system, I am going to need more pressure at that flow.
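For going back and forth between the pressure gauges and the pump curve, the handy conversion is that a foot of water column is about 0.43 psi. A quick sketch of the unit bookkeeping (reading 5 feet of head off the pump's published curve at 48 GPM is my interpretation, not something these formulas can derive):

```python
PSI_PER_FT = 0.4335    # 1 foot of water column is about 0.4335 psi

def psi_to_feet(psi):
    """Convert a gauge pressure in psi to feet of water head."""
    return psi / PSI_PER_FT

def gpm_to_gph(gpm):
    """Pond pump curves are usually published in gallons per hour."""
    return gpm * 60

# The 3+ psi drop seen at the 35 GPM setting is itself ~7 feet of head
# burned in the throttling valve and plumbing:
drop_ft = psi_to_feet(3.0)

# 48 GPM is 2880 GPH, which is where this pump's curve sits at ~5 ft of head.
flow_gph = gpm_to_gph(48)
```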
I also noted that the water sheet was rough. Water entry was noisy. I took a high-speed shot of the water, and sure enough, it's breaking up in flight. Note also how much shorter the rear fountain is than the front.


I noticed that the flow gauge was bouncing around a fair bit, so I presume I'm getting a bunch of turbulence, which probably does not help the fountains at all. These units are the "short lip" version of these fountains, which means they have just 1" of flow straightener before they launch the water. The standard version has a 6 inch lip, which I think might damp the turbulence more and lead to a cleaner sheet of water.

Inside the unit there are apparently 3 supports of some sort. These have visible wakes, but I wasn't able to see that the wakes caused more breaking up when they hit the edges.


So, my plan is not yet validated.
  • I need bigger pumps. 3 of the 5100SEQ22 will produce 200 gpm total at 10' head. That should give me enough extra force to push through the extra twists and turns.
  • Each fountain unit is going to need its own throttle. The best way to implement this is probably a bank of eight $20 ball valves, and a separate run to each fountain unit.
  • As long as I'm doing a separate run to each fountain unit, I might arrange for the final connection to be long and straight to reduce turbulence. There will be a lot of turbulence in the fountain unit itself, so maybe this is hopeless.
  • I should order a fountain unit with a 6" lip, and see if I like that flow better.

Tuesday, April 08, 2008

Conservation versus outsourcing

Read "The Wonderful Curse of Natural Gas Price Volatility". It's short, just 12 pages long.

Check out the graph at the top of page 9: "U.S. Industrial Gas Demand Destruction". That's a 22% drop in industrial natural gas utilization between 1997 and 2006. That's not efficiency, that's offshoring! What's going on here?
  • Natural gas is a feedstock for the fertilizer, chemical, and plastics industries, and a fuel for the electric generation industry.
  • Electric power generators are less sensitive to the price of their fuel than the fertilizer, ethanol, and plastics industries, since those three products can all be shipped to us from overseas, and electricity cannot.
  • The electric generation industry is sensitive to the capital necessary to build capacity, because the rent on the capital to build their plants has to be priced into the electricity sold, and different plants do compete to produce and sell electricity. Thus, more capital-intensive plants are more likely to have lower return on investment if electricity prices dip.
  • Gas turbine power plants have exceptionally low capital costs, making them very desirable to the power producers, and gas prices were low during the 1980s and 90s.
  • So, electric generators built 200 gigawatts of gas turbine powerplants during the 1990s and early 00s, so that gas turbine plants now constitute 41% of our nameplate capacity (EIA figures). These gas turbine plants are now running at a capacity factor of 21%, and produce 20% of our domestic power (once again, EIA).
  • Figure 7 of page 8 of the Ventyx report shows that between 1997 and 2006, gas consumption by the power generators rose from 11 to 17 billion cubic feet a day. That's all those gas turbines coming on line.
  • It turns out there is a limited supply of domestic natural gas. Demand rose, supply stayed constant, and thus prices rose.
  • Over the same time, industrial consumption dropped from 23 to 18 billion cubic feet a day. That's domestic fertilizer, chemical, and plastics production being moved overseas in response to higher feedstock costs.
  • U.S. consumption of fertilizer, chemicals, and plastics has not dropped, and conversion from the feedstock to the final product increases value, so offshoring has driven the jobs overseas and also increased our trade deficit by much more than the cost of the natural gas consumed by the electric generation industry.
What we have here is another example of a strong negative correlation between the performance of the U.S. power generation industry and the U.S. economy as a whole. This is a tragedy, partially responsible for our $708 billion/year trade deficit. That's an unpaid $2360 bill, per man, woman, and child, per year, for everyone in the United States.
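The arithmetic behind these figures, spelled out (the 300 million population figure is my round number):

```python
# Gas demand shifted from industry to power generation, 1997-2006 (bcf/day)
power_gen = 17 - 11        # +6 bcf/day of new gas-turbine burn
industrial = 18 - 23       # -5 bcf/day of offshored feedstock demand

# Trade deficit per capita, assuming roughly 300 million people in the U.S.
deficit = 708e9
per_person = deficit / 300e6   # ~ $2360 per man, woman, and child
```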

This post and the last one may lead some of you to think I'm all for a command economy. No. I'm pretty sure that if we nationalized the electric power generation industry, we'd end up running it less efficiently, which would also lead to higher domestic power costs. I do think we need to bring the measure of performance of the electric power generation industry into better alignment with the domestic economy.

The domestic economy does well with cheap energy. In this context, gas turbines are a disaster, since they redirect a feedstock away from high-value-added uses (plastics) into low-value-added uses (electric generation). We have readily available substitutes for electric generation (coal and nuclear), but not natural gas. In some sense, all a gas turbine does is convert one kind of energy into another without increasing the domestic supply.

I don't know how to make domestic power producers profit more when the US economy has cheaper energy. The benefit of marginally cheaper power is probably nonlinear, and possibly unmeasurable in any way that would allow accountants to calculate a credit to power producers. I do not want to see more coal powerplants, because of the currently externalized cost of CO2 production, even though they are a cheap source of power. Perhaps the simplest way forward is what we have now: tax credits or subsidies for the obvious answers, like wind and nuclear, and just feel our way through, year by year, guessing which subsidies will distort the electricity market to best serve the interests of our citizens.

I'm sorry to keep harping on this energy and trade stuff, but to be honest, I'm scared. I don't understand how to predict what this trade deficit will do, nor do I understand how big is too big, but $700 billion feels too big. Our trade deficit, national budget deficit, credit crisis, housing market meltdown, and war in Iraq give me the feeling that this nation has derailed and is about to make a very expensive and possibly bloody mess.

The last time we got into a World War, we had just splurged on national infrastructure. Think about this: 90% of the Allied aluminum flying over Germany was made with power from the Grand Coulee Dam, built from 1933 to 1942, i.e. just in time. I'm not saying I expect another World War, but I am saying that when times get tough it's good to have serious infrastructure in your back pocket.

Wednesday, March 26, 2008

Let's drive electricity prices into the ground

Read this report. It's basically a big apology for why electricity prices have been going up.

On page 31, it shows the EIA estimate that a 10% increase in the price of electricity in 2006 would cause a 4% (175 billion kWh/year) drop in electricity demand in 2014, down from 4.2 trillion kWh/year. This is basic supply and demand, with the EIA doing the error-prone work of predicting the demand curve in the future. The first thing I'll note here is that a 10% price increase, coupled to a 4% sales drop, leaves a 6% revenue increase (at least $12 billion/year) coupled with decreased costs for the folks selling electricity. It's an inelastic demand curve. So, if the folks making electricity can do anything to reduce the overall supply, it's well worth their effort.

When the price of electricity goes up, some of that reduction in demand is accomplished by economic activity (buying a more efficient air conditioner), and some is accomplished by reducing economic activity (shutting down the night shift of a marginal plant). Overall, how much of each? My guess is that the reduction in economic activity is the main reducer of demand. Let's suppose I'm right, and that a 4% drop in electric demand is accompanied by a 1% drop in GDP. That's a $130 billion dollar drop.

You can see that price fixing among electricity producers would be seriously damaging to me and you. It is in the national interest that electricity prices not rise 10%. Note that this is true regardless of whether the utilities make or lose money, because as a nation we are making or losing quite a bit more money than the utilities are.

So let's consider a different investor, the U.S. government. Suppose that the electric demand curve slope is locally smooth. A 10% decrease in the cost of electricity, then, should lead to a 4% increase in sales, and a corresponding 1% increase in GDP. This is what Rod Adams is talking about when he calls electricity an economic lubricant.

How much is that 1% increase worth to the federal government? They tax the GDP at about 18.4%, so it's worth around $24 billion per year. To review:
  • A 10% decrease in the cost of electricity, from $0.07/kWh to $0.063/kWh, would lead, 10 years later, to
  • ...a 4% (175 billion kWh) increase in electricity sales, for a net revenue loss to the industry of
  • ...$12 billion/year. The federal government, however, would be raking in an extra
  • ...$24 billion/year, and the rest of us would be enjoying an additional
  • ...$130 billion/year in GDP.
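Here's the review as arithmetic (the ~$13 trillion GDP is my round figure; note that my revenue delta for the price-cut case comes out somewhat larger than the $12 billion above, though the sign and rough size agree):

```python
sales_kwh = 4.2e12     # projected 2014 electricity sales, kWh/year
price = 0.07           # $/kWh
gdp = 13e12            # rough U.S. GDP, my round number
fed_take = 0.184       # federal tax as a fraction of GDP

# 10% price cut, 4% volume gain (the EIA elasticity, run in reverse)
new_revenue = sales_kwh * 1.04 * price * 0.90
industry_delta = new_revenue - sales_kwh * price   # negative: industry loses

# 1% GDP gain, and the federal government's cut of it
gdp_gain = gdp * 0.01            # ~ $130 billion/year
fed_gain = gdp_gain * fed_take   # ~ $24 billion/year
```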
Sounds good. Let's mandate a drop in prices! Who says we can't have a centrally controlled command economy?

Well, it's not that simple. First, we need to know how much investment is required to drive electric prices down 10%. Presuming that the government has to somehow compensate utilities for taking a $12B/year hit for the team, that leaves $12B/year to pay for the capital required. The federal government currently borrows money for 30 years at 4.5% (they are a better credit risk than you), so the capital required for this investment had better be significantly less than $266B.

The Palo Verde nuclear power plant supplies power for $0.027/kWh, including operations (fuel), maintenance, and interest and depreciation costs. In 2002, the marginal cost (not including capital) was 42% less than that for coal in the area, and since then the difference has increased as coal costs have risen. This is the best lever we can use to drive down electricity prices.

To drive down wholesale prices by 10%, we'd need to bring the cost of production down approximately 10%. Using the Palo Verde area numbers from this report, and assuming we keep the same coal and hydro production (as they are both low cost), but reduce gas and increase nuclear, we'd need 49 gigawatts of new nuclear production nationwide. That's not going to happen by 2014, but we would probably see some fraction of the benefit for some fraction of the cost. Just incidentally, 49 gigawatts of new nuclear production scaled up from Palo Verde's employment base is 89,000 extra jobs here in the U.S., paying an average of 13% more than the average American salary.

Palo Verde cost $5.9 billion, was finished in 1988, and has a peak capacity of 3.72 GW and sustains a capacity factor in excess of 90%. We would need 13 more Palo Verdes to produce enough electricity to make that 10% cost reduction happen, at a present-day cost of around $120 billion [edited; thanks]. The generating utilities are not going to take this on, given that the "benefit" is a $12 billion/year loss to them. But for the U.S. government, looking at $24 billion/year in increased tax revenue, the cost of the plants is easily worth it. What remains is determining a way to have the government provide the capital and offset the revenue losses associated with a huge expansion of the nuclear reactor fleet, without getting ourselves further into the management disaster of a command economy.
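Two of the numbers above are one-liners worth checking (treating the 30-year loan as a perpetuity is my simplification):

```python
# How many Palo Verdes does 49 GW of new nuclear come to?
new_nuclear_gw = 49
palo_verde_gw = 3.72
plants = new_nuclear_gw / palo_verde_gw   # ~13.2, call it 13 more Palo Verdes

# If the government covers a $12B/year revenue hit out of $24B/year of extra
# tax take, the remaining $12B/year at 4.5% supports (as a perpetuity):
rate = 0.045
max_capital = 12e9 / rate                 # ~ $266 billion of capital
```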

I'll note that we're going into a recession, and interest rates are falling. This is a good (cheap) time for the government to borrow a bunch of money to invest in long term economic infrastructure. The reactor buildout I'm proposing would cost about the same as the $300/person economic stimulus package our leaders just conjured up. To my mind, the difference is very much teaching a man to fish versus giving him fish.

Tuesday, March 11, 2008

Clinton's choice

I'm watching replays of the CNN Obama/Clinton debate. This is painful.

The argument that Clinton needs, and is failing to make, is that there is a difference between how Senators and Presidents collect the information they need to make their decisions. The Congress does not have an NSA. The President does. Clinton made her decision, one she regrets, on the basis of information provided by George Bush's team. Had she been in President Bush's position, things would be entirely different because she would have had a completely different set of options, including better discovery of what the facts actually were.

She's not making that argument. I'm not sure why, and it suggests to me that she still doesn't think about how to be a President. She's thinking about how to argue about stuff, not how to find the right answer.

There is another angle that Clinton is missing. To win, the Democratic presidential candidate will have to appeal to some Republicans. What is going to go over better? "I was right, you shouldn't have gone to war, now I'm going to fix your mistake and pin the cost on you?" or "We got into this tragedy together, and I will help get us out of it together?" Obama's Iraq message is actually more divisive.

Finally, for what it's worth, the idea of scheduling a withdrawal scares me a lot. I think our withdrawal from Mogadishu contributed directly to the planning of 9/11. I worry about what we're going to be dealing with in 10 years, and where we're going to be dealing with it.

Monday, February 11, 2008

Dessert Recommendation

On Sunday night my wife and three kids had the "Lemon Meringue Ice Cream Pie" at the Half Moon Bay Inn. It was one of the best desserts I have ever had. For dinner I had the cheeseburger, also one of the best burgers I've ever had.

I'd like to put in a Google Maps link, but Maps doesn't have it! Half Moon Bay Inn is at 401 Main Street, Half Moon Bay, CA 650-560-9758.

Subsidizing wheat in Afghanistan

Afghanistan grows most of the world's opium. Opium is technically an illegal crop there, and it is one of the few crops that makes enough money to support a farmer in Afghanistan. If you grow opium, the central government is officially supposed to stop you, and the local official will probably look the other way if you pay him off. It may seem cheaper and easier for the folks growing opium in the Taliban-controlled areas, since the Taliban actively helps farmers sell their crop, in exchange for some of the profit. I'm sure many farmers prefer the Taliban for purely economic reasons.

If wheat sold for more money, perhaps 3 times the world price (which is around $350-$400/metric ton), some folks think the value of the wheat crop would be large enough to encourage many farmers to switch to wheat production. Wheat is legal to grow, so there is no disadvantage for a wheat farmer in having a functional Afghan government. Foreign aid organizations could run grain mills which bought wheat at $1100/ton and sold the flour for $350/ton. Bread prices would presumably stay low as flour flooded the market, and Afghanistan would presumably become an exporter of flour.

Folks in Pakistan and Iran would be encouraged to sell grain to Afghanistan for milling. I'm not sure this is an entirely bad thing. Presumably economic conditions do not vary dramatically as you cross the border, so areas just outside Afghanistan are probably also growing opium. And, as long as we stop bulk cargo deliveries of grain to Afghanistan, one would think it would be expensive to move large quantities of grain by, say, mule across the border. There is some subsidy level at which it is not worth moving grain by mule. Hopefully it's cheaper for small Afghan farmers to get their product to the mills than it is for Pakistani importers.

So, how much would this cost? Afghanistan produced 4.4 million metric tons of wheat in 2007/2008, so someone would have to cough up $3.3 billion/year to carry this subsidy. That's real money, and apparently we'd have to keep it up for a decade or so. If there are not large agribusinesses in Afghanistan now, there will be within a year or two. These businesses will get efficient at growing grain in Afghanistan, and start to produce the majority of the grain there. The subsidy on grain will decrease over time, large efficient businesses will capture nearly all of it (as they capture farm subsidies in the U.S.), and the marginal farmers will move back to poppies. I don't have a great deal of hope for this effort.
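The subsidy bill is just tonnage times the gap between the buying and selling prices:

```python
tons = 4.4e6            # Afghan wheat production, 2007/2008, metric tons
buy, sell = 1100, 350   # $/ton paid to farmers vs. charged for the flour

subsidy = tons * (buy - sell)   # ~ $3.3 billion/year
```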

By the way: anyone have a clue what this is?

Tuesday, February 05, 2008

Cost of oil, revisited

Last time I looked, oil was priced at $22/barrel and we were importing 9.14 million barrels a day, which made up 20% of our trade deficit of $374 billion. We were actually importing more, but I hadn't counted the refined stuff. So it was actually 12.6 million barrels/day, so $101 billion or 27% of the trade deficit.

Now, as you know, the oil spot price is around $95/barrel, but $72/barrel is closer to the average price, and we are importing 12.2 million barrels a day (crude plus some refined products). The census bureau has nicely summarized the data here, which doesn't quite match the simple math I would do. For Dec 2006-Nov 2007, they see petroleum imports as $283 billion (35%) of a $813 billion deficit.

Grim.

How much does a plug-in hybrid help?
  • Over a 20-year lifetime, the car is driven 250k miles.
  • It gets 75 mpg rather than 25 mpg.
  • It burns 80 barrels of oil rather than 320 (and burns a bunch of domestic coal instead).
  • It saves the importation of $15,500 of crude.
  • It saves the user $23,000 in gas.
  • It costs the user $5800 in electricity. (250k miles) / (3 miles/kw-hr) * (0.07 $/kw-hr)
My guess is that a practical plug-in hybrid chews up more electricity and gasoline than this, but it still seems pretty good. Unfortunately,
  • It's made by Toyota in Japan, and costs $25,000, so the net trade debt increases. At least the money is going to a responsible nation like Japan. I will concede that eventually Toyota will make most of these plug-in hybrids here, and so only the profits will go to Japan.
  • If 10 million cars in the U.S. were plug-in hybrids, it would reduce our oil imports by 282,000 barrels/day, or 2.3%.
That last point is a killer. It is just incredibly hard to replace oil.
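Here is the per-car and fleet arithmetic, spelled out (the ~$3.45/gallon gasoline and ~$65/barrel crude prices are what I back out of the dollar figures above, not quoted numbers):

```python
miles = 250_000
gal_old = miles / 25      # 10,000 gallons of gas at 25 mpg
gal_new = miles / 75      # ~3,333 gallons at 75 mpg

gas_saved = (gal_old - gal_new) * 3.45   # ~ $23,000 at ~$3.45/gallon
electricity = miles / 3 * 0.07           # ~ $5,800 at 3 mi/kWh, $0.07/kWh

barrels_saved = 320 - 80                 # 240 barrels over the car's life
crude_saved = barrels_saved * 65         # ~ $15,600 at ~$65/barrel

# Fleet scale: 10 million such cars, with the barrels spread over 20 years
fleet_bbl_per_day = 10e6 * barrels_saved / (20 * 365)
```

My fleet figure lands a bit above the 282,000 barrels/day quoted above, presumably from derating for real-world driving, but either way it's only a couple percent of imports.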

Sunday, January 27, 2008

Correcting a Newtonian

Part of the reason that contrast is so bad on my Newtonian is that it has flare. The other reason is that it is uncorrected. Let's see what it takes to correct the thing.

The mirror is a 300 mm diameter diffraction-limited 1/4 wave parabolic mirror. Sounds awesome, especially the diffraction limited part. When viewing 550 nm (green) light, a 300 mm aperture scope should resolve 1.22*wavelength/diameter = 2.2 microradians. With a 1500 mm focal length, those details are 3.4 microns across on the image plane. My Canon 40D has 5.7 micron pixels, which isn't quite going to catch the details.

Sadly, it turns out that even a perfect parabola produces just a single perfectly focussed dot in the center of the image, and resolution goes downhill out from there. One way to measure resolution is to measure the amount of contrast transmitted by the lens at a particular spatial frequency. We could measure transmission at the diffraction-limited spatial frequency (227 line pairs/mm), but we don't need it to be that good. A more useful frequency is the maximum spatial frequency that the camera supports. The pixels themselves sample 87.7 lp/mm. Because the camera has a Bayer filter to sample colors, it has an antialiasing filter, and the maximum frequency it can sample correctly is a factor of 1.8 smaller, about 48.7 lp/mm.
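Here's the resolution bookkeeping from the last two paragraphs, worked through with the standard Rayleigh and Nyquist formulas:

```python
wavelength = 550e-9   # green light, meters
aperture = 0.300      # mirror diameter, meters
focal_length = 1.5    # meters
pixel = 5.7e-6        # Canon 40D pixel pitch, meters

# Rayleigh criterion: smallest resolvable angle
angular_res = 1.22 * wavelength / aperture     # ~2.2 microradians

# Projected onto the image plane
spot = angular_res * focal_length              # ~3.4 microns

# Nyquist limit of the sensor, in line pairs per millimeter
nyquist = 1 / (2 * pixel * 1000)               # ~87.7 lp/mm
bayer_limit = nyquist / 1.8                    # ~48.7 lp/mm after the AA filter
```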

Here's a graph of the contrast you'd expect to transfer, at both 227 lp/mm (diffraction limit) and 48.7 lp/mm (Canon 40D limit). You can see that resolution from a simple parabolic mirror is only good within a small image circle around the center, and there is almost no contrast available at the diffraction limit.

Once contrast drops to zero, there is no detail left at that frequency. Lower spatial frequencies, corresponding to less detailed imagery, will have contrast at larger and larger radii, and so the picture will look more blurry as you get farther from the center. For photographic lenses, I like to see MTF at the maximum camera frequency of something like 30-40% across the whole field, although I'm willing to accept some dropoff at the corners.

Compare the parabolic mirror graph to a similar graph for the Canon 50mm/1.4 lens, in particular, looking at the 40 lp/mm line (the bottom pair). 60-70% MTF. That's a good lens. (I got this graph from photodo.com, which has great data on hundreds of lenses.)

The graphs aren't perfectly comparable, but they're close. The Canon here is stopped down to f/8, which vignettes away the least corrected portion of the aperture. But it performs nearly as well at f/5. Also, this graph is at 40 lp/mm, which is a little easier than the 48.7 lp/mm I'm using to judge the Newtonian. One other detail: in both graphs, there are two lines, one dashed (tangential) and one solid (sagittal). Sagittal means "in the direction towards and away from the image center", and tangential means "along a curve centered at the image center".

So the reflector looks terrible, but there is hope. Al Nagler at Tele Vue has designed a corrector lens (the Paracorr) that, when combined with a parabolic mirror, gives a well corrected image. I don't have the Paracorr's prescription (I checked the U.S. Patent Office, and found Al's eyepiece patents but no patent for the Paracorr), but I know the basic idea, so I was able to slap something together with Zemax to demonstrate.

Here is the overall scheme: light comes in from infinity from the left, bounces off a parabolic mirror at the far right, and then comes back through a negative doublet, followed by a positive doublet, finally arriving at the image sensor at the far left. I've left out the planar secondary mirror that reflects the light out the side, but I've left in the obstruction that it causes. The corrector assembly, as shown, sits just outside the main optical tube. The eyepiece would sit on the other side of the image plane.

When comparing this to a photographic lens, it's best to think of the mirror combined with the two doublets as being the "lens". It actually extends the focal length of the telescope a bit. This picture shows rays bouncing off the primary mirror, going through the two doublets, and arriving at the image plane. Because the doublets are so small relative to the mirror and focal length, it's hard to see the detail.

So, here's the detail. I'm quite pleased with how this turned out: the elements are not ridiculously thick, and there is 50 mm of clearance between the last element and the focus plane, good enough to mount a DSLR (44 mm clearance required) if not a T thread mount (55 mm required). The negative doublet is a bit too close to the mirror, in the sense that it would probably sit right at the edge of the main optical tube, when we'd prefer it to be back a bit so that the focus tube can be baffled and stray light from the front opening of the tube can't speckle off the doublet.

Designing this wasn't too hard. I left Zemax running overnight doing a global search for an optimum, with no constraints on the glass choice. A real optical designer has more constraints to deal with.

And here's the resulting MTF, polychromatic, no less! This is pretty incredible, I doubt the real Paracorr is this good. It's slightly better, all the way across the field, than the Canon 50mm/1.4. That's astonishing, given that this thing has 7 surfaces (one aspheric -- the mirror) with which to bend the light, compared to 13 in the Canon refractor. Or, compare this to the plot at the top of the post for the performance of the parabolic mirror alone. Night and day.

I think the bottom line is that a Paracorr is a necessary part of a Newtonian telescope, unless it's only used at very high magnifications. Unsurprisingly, the Paracorr is the best-selling product made by Al Nagler's company.

Thursday, January 24, 2008

Light bucket

I have a 12 inch Newtonian telescope on a Dobsonian mount. It's a cheap light bucket. Over the weekend I tried hooking up a DSLR camera to it.

Two initial results:
  1. The adaptor physically connects the DSLR to the scope, but it doesn't guarantee that it will work. In my case, it doesn't. The problem probably applies to most telescopes designed to be used with eyepieces:
    • The eyepiece, e.g. a 30mm eyepiece, is a 30mm focal length lens designed to take an image 30mm in front of the lens and make it appear to be at infinity. Your eye looks through the lens to see the image at infinity.
    • The image that the eyepiece is focussed on is 30mm in front of the first nodal point of that lens. It appears that the standard for telescope eyepieces is to have that image in front of the shoulder of the eyepiece. When you remove the eyepiece and put a plain piece of paper on the image, you find that it is 5-10mm inside the focussing tube.
    • The focussing tube has some range, such that you can rack it forward and get the image to be 10mm behind the focussing tube, but:
    • DSLRs all want about 42mm between their front flange and the sensor plane. You focus them by placing the image on the sensor plane.
    • There is no way to focus a camera attached to this thing at infinity, without altering the telescope to move the image focus out.
    • If the image focus is moved out, some sort of extension tube, about 2 inches long, will be necessary with all eyepieces to make it possible to focus them.
    • I'll simulate it, but I suspect that extension tube will then limit the field of view of some of the larger FOV eyepieces.
  2. Surprise, the DSLR will focus on things at finite distances! If you pull the imager on a 1500mm scope out 2 inches from focussed-at-infinity, you are focussed 43.5m away. So, I took some shots of some tree branches at about that distance while pointed close to the sun.
    • This was a little dangerous, because if I'd accidentally pointed it at the sun while looking into it I could have hurt myself. I got lucky this time, and I'll not be impatient again.
    • Depth of field is awful. Spot size is about 10 microns (pixel size x 1.8 for the Bayer sensor), so an f/5 scope focussed at 43.5m has a depth of focus of +/- 5 x 10 = 50 microns, which corresponds to +/- 45 mm out by the tree branches.
    • The focus was only okay, not great.
      • Global contrast issues below
      • The scope has, at minimum, nasty coma which will smear images. I had a Paracorr lens between the camera and the telescope, but I don't think I had it adjusted properly, and I have not verified that the scope actually has a parabolic and not spherical mirror.
    • Contrast was ridiculously bad. If you saw a photographic lens this bad you might chuckle, but you would never, ever consider buying it. Saturday night I tried looking at the moon, and found that anywhere within 20 degrees of the moon the sky had a uniform grey background that hid most of the stars.
    • I need to clean my optics, there is dust on them.
    • I need to flock the interior of the telescope.
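The finite-focus and depth-of-field claims in item 2 can be checked with the thin-lens equation (this ignores the Paracorr in the light path, which is probably why my object distance comes out a little past 43.5m):

```python
f = 1.5                 # focal length, meters
extension = 2 * 0.0254  # imager racked 2 inches past infinity focus
v = f + extension       # image distance from the mirror

# Thin-lens equation: 1/f = 1/u + 1/v, solved for object distance u
u = f * v / (v - f)     # ~46 m out to the tree branches

# Depth of field: the +/-50 micron focus budget at the sensor, divided by
# the longitudinal magnification (transverse magnification squared)
m = v / u
dof = 50e-6 / m**2      # ~ +/-44 mm at the subject
```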
Maybe amateur telescopes all have terrible contrast because amateur astronomers are used to looking at stuff that's mostly black, so a little scattered light is no problem, unlike daytime scenes where it does matter.

So, photography through the telescope is not a trivially implemented idea. It does have me thinking about how to design a Newtonian telescope that can do photography, daytime or night, as well as stargazing. I know a thing or two about flare suppression and camera design as a result of my work on Street View, and I can see a bunch of obvious problems that might be fixable.
  • The secondary mirror is not balanced on the spider, so that it twists as the telescope is changed in altitude. I can actually see this with the autocollimator in the scope, which measures maybe 4mm of drift between horizontal and vertical. That's an angle of 2.67 milliradians. The autocollimator doubles the actual angle, so it's about 1.33 milliradians. Across the 43mm field of a DSLR, that's 57 microns of tilt, which is a smidge more than half the focus budget of +/- 50 microns. It would be good to balance the secondary on the spider with a counterweight.
  • The eyepiece should not view the opposite side of the tube around the secondary mirror. Instead, it should view a recessed surface which is itself shaded from both the aperture and the mirror.
  • The eyepiece tube should have a baffle, which is recessed from the tube so that it is not lit by the aperture, and which prevents the eyepiece from seeing anything but the secondary mirror and the recessed light trap behind it.
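The secondary-drift arithmetic from the first bullet, spelled out (taking the 1500 mm focal length as the lever arm for the autocollimator measurement is my assumption):

```python
drift = 0.004    # meters of drift seen in the autocollimator
lever = 1.5      # meters, roughly the focal length

apparent = drift / lever   # 2.67 mrad apparent tilt
actual = apparent / 2      # the autocollimator doubles the true angle: 1.33 mrad

field = 0.043              # DSLR field width, meters
focus_tilt = actual * field  # ~57 microns of focus error across the frame
```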
Hmm.... this is looking like a lot of work.

Monday, January 21, 2008

Honda S2000 vs BMW 330i ZHP

Yes, I know, apples vs oranges.

Martha has not liked my S2000 since I got it, at first because it's loud and has a hard ride, and more recently because it has no back seat. She wanted me to switch to something with 4 doors. We toyed with the idea of getting a Prius for a while, until I realized that I wasn't going to be happy with anything that couldn't wag its tail on dry pavement.

It may seem odd to be comparing these two (or three) cars. I'm not trying to figure out which is the best car in a particular category, rather, I'm trying to figure out which category I want. I decided that I could live without the convertible top, but not without that sense of engagement I get while driving.

The BMW has most of that engagement, so I got that. It's a used 2004 model year car. I am much happier buying a used car than a new one. I bought the Honda (a 2002) new because at the time the new ones were only a couple thousand dollars more than the used ones. The resale value has held up well enough that it should end up costing about $13/day (depreciation, gas, insurance, tires, and maintenance), which is about what I think I should be paying for a car. Because the BMW is used, it will hopefully depreciate at about the same dollar rate even though the car was more expensive new. The Prius would have been a lot less expensive.

Both cars claim almost exactly the same peak horsepower, though the BMW is 500 pounds porkier. Even so, it accelerates faster, because I'm not willing to thrash the Honda's clutch, and because the broader torque curve makes it much easier to be in the right gear in the BMW. The BMW engine feels more practical; it can relax, and it can lunge, and it is inline-6 smooooth. The Honda engine is more exciting, and responds quicker. Where the BMW takes maybe 200 ms between throttle lift-off and actual engine braking, the Honda's delay is unnoticeable -- maybe 50 ms. Throttle-on delay is tiny in both engines. The Honda sounds better, too, and between 6000 and 9000 RPM it is literally in a class by itself. I wish Honda had built a 9000 rpm inline-6 for their car(s). Fuel efficiency scales with body mass, as usual: almost all cars eat their weight in gasoline every year. Just think about that the next time you are considering buying a big pickup.
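The "cars eat their weight in gasoline every year" claim checks out with round numbers. The annual mileage and fuel economy below are my assumptions, not figures from either car:

```python
# Sanity check: does a typical car burn roughly its own weight in fuel per year?
miles_per_year = 12000  # assumed typical annual mileage
mpg = 25                # assumed average fuel economy
lb_per_gallon = 6.1     # approximate density of gasoline

gallons = miles_per_year / mpg            # 480 gallons per year
fuel_weight_lb = gallons * lb_per_gallon  # ~2900 lb, about a midsize sedan

print(round(fuel_weight_lb), "lb of gasoline per year")
```

A heavier vehicle with worse fuel economy scales both sides of the comparison up together, which is why the rule of thumb holds across classes.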

The Honda transmission is better. The throws are easier and much shorter, it's less rubbery, and it feels better going into gear. I am really going to miss this shifter. The BMW has wider spaced gears (the ZHP comes with a 6-speed manual), which would be a disaster on the Honda because of the peaky torque curve but in this car give a very relaxed engine note on the freeway. A mechanical engineering friend told me that the Honda transmission is probably the best ever made for a production car, so everything else is a step down. That's the problem with really nice stuff -- transitioning away.

The Honda steering is better. The ratio is faster, and it's lighter. The BMW tends to kick back approaching stop signs with uneven pavement. This last issue is probably due to the heavier car on wider rubber on the BMW. The BMW steering wheel is thicker and nicer to hold.

I haven't really pushed the BMW around yet, and I never did push the Honda past its cornering limits, but I can say that the very first thing I liked about the Honda is still true: there is less commitment in corners. When you are tearing around a bend, you can change your mind, change your line, get into the brakes, roll on the gas, you can do all kinds of things and the Honda reacts in a predictable manner. The BMW feels committed, and that feels scary. Operators vill not exceed zeez limits.

The BMW has a nicer interior than the Honda. It's quieter, the stereo is better, the instruments tell you your averaged MPG, and there is more room. The pedal position is way better than the Honda -- you can really toe-and-toe in the BMW; the Honda pedals were too far apart to do that reliably. Toe-and-toe'ing is lame, though, compared to heel-and-toeing. The best pedals I ever had were in my VW Bug. Yes, the car had a lot of other problems (it rusted all the way through the roof in one spot, the gas pedal would occasionally stick to the floor, and the steering oscillated badly at 70 MPH) but I've never driven any other car in which I could reliably match revs while under hard braking.

I like driving the new car, and I've taken all three girls in it now and they like it. (Well, Ava is noncommittal, but she is just one year old and hasn't yet developed her appreciation of these things.) Anya yells "two wheels" going around brisk corners in this one too, although I sometimes think I hear a wistful note in her voice. Ah well, life goes on. I'm sure the next car will be a barge.

Thursday, January 10, 2008

Lady Jane

We have a new puppy, Lady Jane.

She's a black Lab, just like Iniki was. She's really cute and full of sharp teeth.

Saturday, January 05, 2008

Chernobyl, ghost town



Elena here is holding a Geiger counter reading 763 microroentgen/hour. In the background is the sarcophagus built around the reactor that exploded at Chernobyl. Normal background radiation levels are 10-20 uR/hr. She has an interesting photo tour of Chernobyl here.
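For scale, a quick comparison of those dose rates, assuming the counter reads true and exposure at that spot were continuous (both simplifications):

```python
# How the quoted Chernobyl reading compares to normal background.
site_uR_hr = 763
normal_uR_hr = 15      # midpoint of the 10-20 uR/hr background range
hours_per_year = 8766

ratio = site_uR_hr / normal_uR_hr             # ~50x background
annual_R = site_uR_hr * hours_per_year / 1e6  # ~6.7 roentgen/year if you stayed

print(round(ratio), "times background,", round(annual_R, 1), "R/yr")
```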

The sarcophagus looks a lot better than I expected it would.

Saturday, December 29, 2007

Why are there no GTCC plants doing CO2 sequestration?

Rod Adams makes an excellent point here. Go down a bit. 11th paragraph:
If it is relatively easy to capture the CO2 from an IGCC [Integrated Gasification and Combined Cycle coal-burning plant], why wouldn't we start working to prove that assumption by capturing the CO2 from at least several of the existing GTCC (gas turbine combined cycle) plants that use natural gas as their heat source?
CO2 sequestration for coal-fired powerplants is held out as the major way that America will reduce its CO2 emissions significantly over the next two decades. But, CO2 sequestration requires a lot of tinkering with the plant. An IGCC is nice for efficiency, but is not required. Several other really serious pieces of equipment are required, however:
  • Sequestration costs big money. Since you really don't want to unnecessarily sequester 4 times as much nitrogen as CO2, you separate that nitrogen and vent it. Since you don't want to separate nitrogen from the exhaust gas (you'd have to cool it), you separate it from the incoming airstream. Thus, the air filter on an ordinary plant is replaced with an expensive and energy-hungry plant with cryogenics, multiple turbines, and heat exchangers galore.
  • The exhaust must be compressed and liquefied to inject it into the ground. Most of the heat must be removed from the exhaust in order to compress it. In a normal coal-fired powerplant, a large fraction of the waste heat is rejected by simply venting the exhaust into the air. In a CO2 sequestrating facility, you need a big heat exchanger and a cooling tower to do that work. Oh, and a larger fresh water supply.
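The "4 times as much nitrogen as CO2" figure follows from simple stoichiometry, if you idealize coal as pure carbon burned in air with no excess air (a simplification on my part):

```python
# Burning carbon in air:  C + O2 + 3.7 N2 -> CO2 + 3.7 N2
# Air carries about 3.7 moles of N2 along with every mole of O2,
# and each mole of O2 consumed makes one mole of CO2.
n2_per_o2 = 0.78 / 0.21  # mole ratio of N2 to O2 in air
n2_per_co2 = n2_per_o2   # one mole CO2 per mole O2 burned

print(round(n2_per_co2, 1), "moles of N2 per mole of CO2")  # ~3.7, roughly 4x
```

Real plants run with excess air, so the actual ratio in the flue gas is even higher, which only strengthens the argument for separating the nitrogen upstream.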
Rod is right: the economics of all this stuff could be proved out on a GTCC plant, or even a plain old combustion turbine fired by nearly anything. I think it's pretty obvious that the carbon-burning electricity producers (coal and gas) benefit from deferring the installation of CO2 sequestration equipment. And there's no better way to defer installation than to tie it to a brand-new burner technology (IGCC) which will take a decade or two to roll out.

So, they talk about sequestration while they defer it as long as possible.

Interestingly, one of the side effects of concentrating the oxygen in the gas being burned is that the operating temperature increases, which could improve efficiency. Unfortunately, combustion turbines already run at temperatures higher than the melting point of the turbine blades... and probably cannot be run hotter. My guess is that exhaust CO2 will be cooled, recirculated and recompressed, and then used to dilute the oxygen in the incoming stream to lower flame temperature.

[Update: check the comments on this post. Harry Jaeger makes some nice points.]

Sunday, December 16, 2007

Teddy Bear Tea

I took my daughter to the Ritz-Carlton's Teddy Bear Tea today. $184 for a few dried-out finger sandwiches and a bunch of chocolates, a teddy bear, some singing, and a chance to get pictures with... a person-sized teddy bear. I couldn't help but think of how tasty a $184 dinner can be. Or how fun the local production of "'Twas the night before Christmas" had been the day before.
Children of all ages gather for a favorite family tradition at The Ritz-Carlton. Guests enjoy a fun-filled afternoon in festive surroundings featuring a storytelling Teddy Bear, a pianist, hot cocoa, tea, a selection of tea pastries and mini finger sandwiches, and a Christmas candy and sweets buffet table. Each child takes home a teddy bear and photo as souvenirs. $75 per guest, $65 for children 12 years and under, exclusive of tax and gratuity. For additional information or reservations, please call (650) 712-7040.
I could wonder how the Ritz-Carlton ended up serving crud for such an expensive lunch. The stories from Teddy may have happened before we got there, 10 minutes late. But why bother with these specifics? A more important question is: how did I ever end up at such a travesty?

I did ask, several times before going, what exactly this "tea" entailed. Martha was nonspecific. Since the other folks going were all in one of Martha's mother's groups, I knew essentially no-one. I'm antisocial as it is; dropping me into a mother's group without something specific to contribute to the proceedings turns me into a stone wall. I went because I was led to believe the event had already been paid for, and since Martha had a cold and could not attend, I figured I might as well see what we had paid for. Instead, I got a 3-digit bill. I think the lesson here is to (a) ask for specifics beforehand, which I did, but then (b) refuse to go when specifics are not provided.

From Kathleen's point of view, there was: (a) nothing to climb on, (b) nothing to legitimately squish with her fingers, (c) nothing with which to draw on herself, nor stickers, fake tattoos, or dress-up clothes, (d) no pool, and (e) no kids singing or doing something else to be emulated. Even a desert wasteland would at least have had rocks to turn over.

If anyone from the mother's club reads this, let me get in a last word: it's not you, it's me. Given something specific to do and at least some semblance of DIY flair, I can have a great time with y'all. But I'm never going to convincingly pull off an hour of small talk.

Tuesday, December 11, 2007

ISS does not smell like old feet

I work with Ed Lu, a former astronaut who spent 6 months aboard the ISS without taking a shower. I asked the obvious question: didn't you, and everything else, just stink?

No. Ed says that the air conditioning/purification system was ridiculously good, so much so that the only time you ever smelled anything was when you opened a food packet. Even then, the smell was whisked away pretty quickly.

I asked if there were problems with vapor from breathing condensing all over the interior of the spacecraft walls. Apparently not. The thing has hot spots as well as cold spots, and heat pipes to balance it all out, and lots of insulation over that. Apparently stuff doesn't freeze. Given that the thing is cold soaked in sub-liquid-nitrogen temps 45 of every 90 minutes, I'm amazed. I was expecting a story of two-inch-thick ice sheets on the interior walls.

Thursday, December 06, 2007

The US is building more wind power than coal

I've just read this report from the DOE, and though it doesn't talk about windpower at all, I find it quite exciting for wind's prospects.

The conventional wisdom has been that the small size of the turbines (generally about 2 MW each) and the unreliability of both the wind and the turbines make it improbable that the bulk of our power needs can be met with wind.

Meantime, the installed cost of windpower has been dropping, and is now at something like $1300/kilowatt of peak capacity, and coal-fired powerplants have been getting more expensive ($2200/kilowatt), and gas-fired powerplants have been getting more expensive to run (they remain cheap to build at $600/kilowatt). That doesn't explain everything, but check out this statistic from the DOE report:

From 2000 to 2007, the U.S. built an average of 293 MW/year of new coal-fired capacity. In that time, wind build rate went from essentially nothing to... about 4000 MW in 2007! Holy cow, that's an order of magnitude more build than coal!

Now I understand that, like the long Nuclear Pause, there has been something of a moratorium on new Coal for a (shorter) while. And, I'm told there are lots of coal-fired plants in planning right now. But just for scale, note that the EIA projects that the U.S. needs 6000 MW/year of new capacity for the next couple decades. Even assuming a 33% utilization rate, wind is within an order of magnitude of producing ALL of that new capacity, right now.
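Putting those numbers together in a quick scale check (the 33% utilization rate is the same divide-by-3 applied to the wind figures throughout):

```python
# How much of the annual new-capacity need did 2007's wind build cover?
wind_build_mw = 4000     # nameplate wind capacity added in 2007
capacity_factor = 0.33   # the wind only blows about a third of the time
need_mw_per_year = 6000  # projected annual new-capacity need

effective_mw = wind_build_mw * capacity_factor  # ~1320 MW of average output
share = effective_mw / need_mw_per_year         # ~22% of the annual need

print(round(effective_mw), "MW effective,", round(share * 100), "% of need")
```

So one year's wind build already covers roughly a fifth of the projected need, well within an order of magnitude of all of it.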

It's no longer a question of whether wind can ever dominate coal... it's a question of whether coal can come back! Look at figure 2 in the DOE report, and project a growth curve for windpower at 1300 MW/year in 2007 rising to 3200 MW/year in 2012. Why is my 2007 wind number small? Because you have to divide windpower by 3 to account for the wind not blowing much of the time.

Anyway, what you see is that wind will outpace coal again in 2008, but coal will win in 2009 and 2010. But after that, all this new wind capacity is going to meet most of the need for new capacity, reducing the need for new coal plants (and greatly increasing the need for long distance power lines at the same time).

And, by the way, there are about a dozen new nuclear plants in the works, perhaps half of which will come online in 2012 or thereabouts. They'll eat even more of the demand that would otherwise go to coal.

Here's a satisfying question to ponder: what year will U.S. coal production peak, not from lack of supply, but from lack of demand?