
A scary strategic problem - no oil

Mini nuclear reactors might fill some gaps. Notice this is a type of nuclear reactor which does not use the same fissile material as nuclear weapons, nor does it produce the same fissile byproducts:

http://nextbigfuture.com/2011/05/thorenco-llc-presents-little-40-mw.html

Thorenco LLC presents a little 40 MW Liquid Fluoride Thorium Reactor

Charles S. Holden, founder of Thorenco LLC, working with Lawrence Berkeley National Laboratory physicists, has proposed a small, transportable 50-megawatt-thermal thorium converter reactor for multiple uses: producing electricity (15 megawatts), burning up high-level actinides from spent fuel, and producing low-cost, high-temperature steam (or process industrial heat). This high-temperature steam can be used for extracting oil from tar sands, or for desalinating, purifying, and cracking water. The reactor’s fuel matrix can be “tuned” to provide the right output for each particular work process.

The reactor core is a squat cylinder, about 140 centimeters in diameter and 50 centimeters tall. Its size makes it portable, so it can be brought to remote work sites and supply heat and electricity there without dependence on long-distance transmission lines. Its small size also allows it to be factory-built and transported to its destination, “plugged in” in a deep underground containment structure, and put to work quickly. The core can be shipped back to the factory when the fuel needs to be changed.

The Thorium Energy Alliance held its third national conference on May 12, 2011, where Charles S. “Rusty” Holden, founder of Thorenco LLC, offered a specific design: a 40 MW pilot plant that he called “a little LFTR.” Using fissile uranium-235 as a source of ignition neutrons and a mix of thorium tetrafluoride in a beryllium fluoride molten salt, Thorenco’s design includes a deep salt pool with a honeycomb geometry that offers “a superior way to clean and condition the fuel during operations,” Holden said.

23-page presentation - Liquid Fueled Thorium Reactor: 40 Megawatt Pilot Plant Outline

• Neutrons convert fertile Thorium-232 to fissile Uranium-233
• No plutonium produced
• No meltdowns
• No fuel rods
• No cooling ponds
• No 10,000+ year spent fuel storage
• 10 years at 40 megawatts
• 141 kg of U-233 “burned” during the decade
• More than 100 kg of fissile produced
• 1600 kg of U-233 fissile load
• 9000 kg of Th-232 fertile load
• 23 grams of U-232 produced in the fuel over the decade of operations
• Hexagonal prism, 160 cm width and height
• Fuel volume 2330 liters
• Fuel 11.65 metric tonnes; 1-2 metric tonnes fissile in fuel
• Coolant 93,200 liters; 450 tonnes
• Reflector volume 1420 liters; 16.65 metric tonnes
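
The burn-up figure in the list above is easy to sanity-check. Here is a minimal back-of-the-envelope sketch, assuming roughly 200 MeV of recoverable energy per fission and continuous full-power operation (both assumptions are mine, not the presentation's):

```python
# Sanity check of "141 kg U-233 burned in a decade at 40 MW" from the list above.
# Assumptions (mine, not the presentation's): ~200 MeV recoverable per fission,
# continuous operation at full thermal power for ten years.

MEV_PER_FISSION = 200.0
J_PER_MEV = 1.602e-13            # joules per MeV
AVOGADRO = 6.022e23              # atoms per mole
U233_MOLAR_MASS_G = 233.0        # grams per mole

power_w = 40e6                   # 40 MW thermal
seconds = 10 * 365.25 * 24 * 3600

energy_j = power_w * seconds     # total thermal energy over the decade
fissions = energy_j / (MEV_PER_FISSION * J_PER_MEV)
mass_kg = fissions / AVOGADRO * U233_MOLAR_MASS_G / 1000.0

print(f"Total energy released: {energy_j:.2e} J")
print(f"U-233 fissioned: {mass_kg:.0f} kg")
# Prints ~153 kg, within about 10% of the quoted 141 kg; a capacity factor
# a bit under 100% closes the gap.
```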

Thorenco’s ceramic fuel is dispersed in an inert metal matrix covered by Holden’s Patent Cooperation Treaty application. This solid-state metal alloy is composed of four materials. The thorium and uranium fuel particles are embedded in the alloy, which both slows and moderates the fissioning process, and moderating materials are dispersed in the alloy along with the actinide particles. Using metallic alloys as moderators (instead of the water used in other thorium reactor designs) allows Thorenco’s reactor to operate in a more energetic neutron spectrum, so its core can have a long life.

The self-regulating reactor is expected to operate for 10 years without needing refueling.
 
More good news: SOFCs are becoming practical for more applications:

http://www.technologyreview.com/energy/37439/?nlid=4369

Cooling Down Solid-Oxide Fuel Cells

A startup moves toward thin-film solid-oxide fuel cells suitable for practical devices.

By Katherine Bourzac

Startup company SiEnergy Systems has overcome a major barrier to commercializing solid-oxide fuel cells with a prototype that operates at temperatures hundreds of degrees lower than those on the market today. Working with Harvard materials science professor Shriram Ramanathan, SiEnergy Systems, based in Boston, has demonstrated a solid-oxide fuel cell that can operate at 500 degrees Celsius, as opposed to the 800 to 1,000 degrees required by existing devices. This allows the cell, which uses a thin-film electrolyte mechanically supported by a metal grid, to be much larger than similar devices fabricated before—on the order of centimeters in area, the size needed for practical applications, rather than micrometers.

Solid-oxide fuel cells, which can run a variety of fuels including diesel or natural gas, bring in oxygen from the air to be reduced at the cathode, and then pass the oxygen ions through a solid-oxide electrolyte membrane to the anode, where the fuel is oxidized to produce electrons that are drawn out of the device. Their high operating temperatures are dictated by the fact that the ions move more quickly through the electrolyte at higher temperatures.

If the electrolyte is very thin—just a few hundred nanometers thick—a solid-oxide fuel cell can operate at lower temperatures. Such electrolytes can power very small demonstration devices, but until SiEnergy and Ramanathan's work, no one had been able to make an ultrathin solid-oxide membrane large enough for practical devices, says Harry Tuller, professor of materials science and engineering at MIT. "The challenge has been that the films, being so thin, are fragile and easily tear during processing or during heating and cooling cycles," says Tuller. When heated and cooled, the different materials of which they are made expand and contract at different rates, damaging the delicate film. "We and others have tried to support the films by one or more structural supports," he says, "but have not succeeded in doing so over as large an area."
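
To make the thickness-versus-temperature trade-off concrete, here is a minimal sketch of the electrolyte's area-specific resistance under an Arrhenius conductivity model. The activation energy and prefactor are typical literature values for yttria-stabilized zirconia (YSZ), not figures from the article:

```python
# Why a thin electrolyte allows a cooler fuel cell: the electrolyte's
# area-specific resistance (ASR) is thickness / conductivity, and ionic
# conductivity follows an Arrhenius law. Values are typical for YSZ,
# not taken from the article.
import math

K_B = 8.617e-5      # Boltzmann constant, eV/K
E_A = 1.0           # assumed activation energy for O2- conduction, eV
SIGMA_0 = 1.0e3     # assumed prefactor, S/cm (calibrated so that
                    # sigma is ~0.02 S/cm at 800 C)

def conductivity(t_celsius):
    """Ionic conductivity in S/cm at the given temperature."""
    t_kelvin = t_celsius + 273.15
    return SIGMA_0 * math.exp(-E_A / (K_B * t_kelvin))

def asr(thickness_cm, t_celsius):
    """Electrolyte area-specific resistance in ohm*cm^2."""
    return thickness_cm / conductivity(t_celsius)

# Conventional cell: ~10 micrometer electrolyte at 800 C.
print(f"10 um at 800 C:  {asr(10e-4, 800):.3f} ohm*cm^2")
# Thin-film cell: ~100 nanometer electrolyte at 500 C.
print(f"100 nm at 500 C: {asr(100e-7, 500):.3f} ohm*cm^2")
# The 100x thinner film more than compensates for the ~70x conductivity
# loss at 500 C, which is the article's central point.
```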

In a paper published in the journal Nature Nanotechnology, the researchers describe making an electrolyte membrane that is more stable both thermally and mechanically. They started with a 100-nanometer-thick electrolyte membrane made up of zirconia and yttrium. They deposited a supportive metallic grid on top of it, to hold the membrane in place while it was heated and cooled and, since the grid was made of conductive material, to act as the anode. They combined this with a dense, high-performance cathode previously developed by Ramanathan. In their published work, SiEnergy has demonstrated arrays of fuel cells each about five millimeters square. Ramanathan says the method can be scaled up to the centimeter-scale areas needed for devices.

SiEnergy's general manager, Vincent Chun, says this is just a first demonstration and the company is now working on integrating the thin fuel cells into full systems and testing fuels. Chun hopes the company's fuel cells will save on materials costs because they are so thin. Chun says the company plans to offer replacements for diesel generators and home heating and power-generation systems.

Copyright Technology Review 2011.
 
Conventional wisdom turned on its head:

http://www.salon.com/news/politics/war_room/2011/05/31/linbd_fossil_fuels/index.html

War Room
Tuesday, May 31, 2011 07:01 ET
Everything you've heard about fossil fuels may be wrong
The future of energy is not what you think it is
By Michael Lind

Are we living at the beginning of the Age of Fossil Fuels, not its final decades? The very thought goes against everything that politicians and the educated public have been taught to believe in the past generation. According to the conventional wisdom, the U.S. and other industrial nations must undertake a rapid and expensive transition from fossil fuels to renewable energy for three reasons: The imminent depletion of fossil fuels, national security and the danger of global warming.

What if the conventional wisdom about the energy future of America and the world has been completely wrong?

As everyone who follows news about energy knows by now, in the last decade the technique of hydraulic fracturing or "fracking," long used in the oil industry, has evolved to permit energy companies to access reserves of previously-unrecoverable “shale gas” or unconventional natural gas. According to the U.S. Energy Information Administration, these advances mean there is at least six times as much recoverable natural gas today as there was a decade ago.

Natural gas, which emits less carbon dioxide than coal, can be used in both electricity generation and as a fuel for automobiles.

The implications for energy security are startling. Natural gas may be only the beginning. Fracking also permits the extraction of previously-unrecoverable “tight oil,” thereby postponing the day when the world runs out of petroleum. There is enough coal to produce energy for centuries. And governments, universities and corporations in the U.S., Canada, Japan and other countries are studying ways to obtain energy from gas hydrates, which mix methane with ice in high-density formations under the seafloor. The potential energy in gas hydrates may equal that of all other fossil fuels, including other forms of natural gas, combined.

If gas hydrates as well as shale gas, tight oil, oil sands and other unconventional sources can be tapped at reasonable cost, then the global energy picture looks radically different than it did only a few years ago. Suddenly it appears that there may be enough accessible hydrocarbons to power industrial civilization for centuries, if not millennia, to come.

So much for the specter of depletion, as a reason to adopt renewable energy technologies like solar power and wind power. Whatever may be the case with Peak Oil in particular, the date of Peak Fossil Fuels has been pushed indefinitely into the future. What about national security as a reason to switch to renewable energy?

The U.S., Canada and Mexico, it turns out, are sitting on oceans of recoverable natural gas. Shale gas is combined with recoverable oil in the Bakken "play" along the U.S.-Canadian border and the Eagle Ford play in Texas. The shale gas reserves of China turn out to be enormous, too. Other countries with now-accessible natural gas reserves, according to the U.S. government, include Australia, South Africa, Argentina, Chile, France, Poland and India.

Because shale gas reserves are so widespread, the potential for blackmail by Middle Eastern producers and Russia will diminish over time. Unless opponents of fracking shut down gas production in Europe, a European Union with its own natural gas reserves will be far less subject to blackmail by Russia (whose state monopoly Gazprom has opportunistically echoed western Greens in warning of the dangers of fracking).

The U.S. may become a major exporter of natural gas to China -- at least until China borrows the technology to extract its own vast gas reserves.

Two arguments for switching to renewable energy -- the depletion of fossil fuels and national security -- are no longer plausible. What about the claim that a rapid transition to wind and solar energy is necessary, to avert catastrophic global warming?

The scenarios with the most catastrophic outcomes of global warming are low probability outcomes -- a fact that explains why the world’s governments in practice treat reducing CO2 emissions as a low priority, despite paying lip service to it. But even if the worst outcomes were likely, the rational response would not be a conversion to wind and solar power but a massive build-out of nuclear power. Nuclear energy already provides around 13-14 percent of the world’s electricity and nearly 3 percent of global final energy consumption, while wind, solar and geothermal power combined account for less than one percent of global final energy consumption.


(The majority of renewable energy consists of CO2-emitting biomass -- wood and dung used for fires by the world’s poor, plus crops used to make fuel; most of the remainder comes from hydropower dams denounced by Greens.)

The disasters at Chernobyl and Fukushima have dramatized the real but limited and localized dangers of nuclear energy. While their initial costs are high, nuclear power plants generate vast amounts of cheap electricity -- and no greenhouse gases. If runaway global warming were a clear and present danger rather than a low probability, then the problems of nuclear waste disposal and occasional local disasters would be minor compared to the benefits to the climate of switching from coal to nuclear power.

The arguments for converting the U.S. economy to wind, solar and biomass energy have collapsed. The date of depletion of fossil fuels has been pushed back into the future by centuries -- or millennia. The abundance and geographic diversity of fossil fuels made possible by technology in time will reduce the dependence of the U.S. on particular foreign energy exporters, eliminating the national security argument for renewable energy. And if the worst-case scenarios for climate change were plausible, then the most effective way to avert catastrophic global warming would be the rapid expansion of nuclear power, not over-complicated schemes worthy of Rube Goldberg or Wile E. Coyote to carpet the world’s deserts and prairies with solar panels and wind farms that would provide only intermittent energy from weak and diffuse sources.

The mainstream environmental lobby has yet to acknowledge the challenge that the new energy realities pose to their assumptions about the future. Some environmentalists have welcomed natural gas because it is cleaner than coal and can supplement intermittent solar power and wind power, at times when the sun isn’t shining or the wind isn’t blowing. But if natural gas is permanently cheaper than solar and wind, then there is no reason, other than ideology, to combine it with renewables, instead of simply using natural gas to replace coal in electricity generation.

Without massive, permanent government subsidies or equally massive penalty taxes imposed on inexpensive fossil fuels like shale gas, wind power and solar power may never be able to compete. For that reason, some Greens hope to shut down shale gas and gas hydrate production in advance. In their haste, however, many Greens have hyped studies that turned out to be erroneous.

In 2010 a Cornell University ecology professor and anti-fracking activist named Robert Howarth published a paper making the sensational claim that natural gas is a greater threat to the climate than coal. Howarth admitted, "A lot of the data we use are really low quality..."

Howarth’s error-ridden study was debunked by Michael Levi of the Council on Foreign Relations and criticized even by the Worldwatch Institute, a leading environmentalist organization, which wrote: "While we share Dr. Howarth’s urgency about the need to transition to a renewable-based economy, we believe based on our research that natural gas, not coal, affords the cleanest pathway to such a future."

A few years ago, many Green alarmists seized upon a theory that an ice age 600 million years ago came to an abrupt end because of massive global warming caused by methane bubbling up from the ocean floor. They warned that the melting of the ice caps or drilling for methane hydrates might suddenly release enough methane to cook the earth. But before it could be turned into a Hollywood blockbuster, the methane apocalypse theory was debunked recently by a team of Caltech scientists in a report for the science journal Nature.

All energy sources have potentially harmful side effects. The genuine problems caused by fracking and possible large-scale future drilling of methane hydrates should be carefully monitored and dealt with by government regulation. But the Green lobby’s alarm about the environmental side-effects of energy sources is highly selective. The environmental movement since the 1970s has been fixated religiously on a few "soft energy" panaceas -- wind, solar, and biofuels -- and can be counted on to exaggerate or invent problems caused by alternatives. Many of the same Greens who oppose fracking because it might contaminate some underground aquifers favor wind turbines and high-voltage power lines that slaughter eagles and other birds and support blanketing huge desert areas with solar panels, at the cost of exterminating much of the local wildlife and vegetation. Wilderness preservation, the original goal of environmentalism, has been sacrificed to the giant metallic idols of the sun and the wind.

The renewable energy movement is not the only campaign that will be marginalized in the future by the global abundance of fossil fuels produced by advancing technology. Champions of small-scale organic farming can no longer claim that shortages of fossil fuel feedstocks will force a return to pre-industrial agriculture.

Another casualty of energy abundance is the new urbanism. Because cars and trucks and buses can run on natural gas as well as gasoline and diesel fuel, the proposition that peak oil will soon force people around the world to abandon automobile-centered suburbs and office parks for dense downtowns connected by light rail and inter-city trains can no longer be taken seriously. Deprived of the arguments from depletion, national security and global warming, the campaign to increase urban density and mass transit rests on nothing but a personal taste for expensive downtown living, a taste which the suburban working-class majorities in most developed nations manifestly do not share.

Eventually civilization may well run out of natural gas and other fossil fuels that are recoverable at a reasonable cost, and may be forced to switch permanently to other sources of energy. These are more likely to be nuclear fission or nuclear fusion than solar or wind power, which will be as weak, diffuse and intermittent a thousand years from now as they are today. But that is a problem for the inhabitants of the world of 2500 or 3000 A.D.

In the meantime, it appears that the prophets of an age of renewable energy following Peak Oil got things backwards. We may be living in the era of Peak Renewables, which will be followed by a very long Age of Fossil Fuels that has only just begun.

    Michael Lind is Policy Director of the Economic Growth Program at the New America Foundation and is the author of "The Next American Nation: The New Nationalism and the Fourth American Revolution."
 
Maturing technologies bring natural gas into play in new ways. The ability to capture "flare" gas with this technology alone could put a huge amount of "new" fuel on the market:

http://pajamasmedia.com/tatler/2011/06/04/arabian-alchemy-shell-makes-black-gold-in-qatar/?print=1

Arabian Alchemy: Shell makes black gold in Qatar
Posted by Charlie Martin on June 4, 2011 @ 8:12 pm in Politics

PJ contributor Leon de Winter mailed us an article from The Volkskrant, which Leon calls the New York Times of the Netherlands (which seems damning with faint praise to me, but never mind that).  The title of the article is (roughly) “Shell’s magic makes black gold from natural gas.” Frustratingly, I can’t get the Volkskrant website to deliver me a link to the original article, but I was able to find it on Lexis, and between my small ability to read Dutch (see, you learn how the Dutch spell things, then read it out loud — if you speak both English and German, you can make sense of it) and Google Translate, I was able to get the gist.

And it’s a pretty interesting gist, I’ll tell you what.

Here’s the basic story: In a June 4th story, Michael Persson reports that the first product is coming from the Shell-Qatari joint project a half-hour out of Doha. The project is called “Pearl” and its function is to transform natural gas into synthetic replacements for petroleum products. In other words, turning natural gas into oil.

Conceptually the process is simple. Natural crude oil is a mix of a variety of hydrocarbons, which of course are simply molecules made of hydrogen and carbon. (This is distinct from carbohydrates like sugar, which also include oxygen.) A lot of the hydrocarbons in crude oil are long chains — they have many carbon atoms joined together. When crude oil is refined, the refinery basically does two things: first, it separates out any naturally occurring shorter chains, which includes things like pentane, hexane, heptane, and octane. We call a mixture of those things (and some other stuff, this is a bit oversimplified) “gasoline.” Second, since that isn’t sufficient to provide as much gasoline as we’d like, a catalytic process is used to convert other fractions into the right components for gasoline. Natural crude usually contains some other compounds, like sulfur compounds. When you hear the TV business people talk about “sweet crude”, they mean crude with relatively little sulfur. “Sour” crude, naturally, has more sulfur. The sulfur compounds have some commercial uses, but they present processing problems, so “sour” crude is less desirable than “sweet” crude.

The effect is that long-chain hydrocarbons are broken down into shorter chains; this process naturally releases some energy, so it’s “downhill”.

What Shell is doing, through a process called GTL (“gas to liquid”, aren’t scientists just poets?), is running a refinery backwards. Natural gas is primarily methane, the simplest hydrocarbon; the GTL process pushes it back up the hill to form longer hydrocarbon chains. This process consumes some energy, so it’s a little bit counter-intuitive why you’d want to do such a thing, but there are some real advantages. First off, natural gas is hard to handle — you can’t run a pipeline across the Pacific, and storing it in tankers means either storing it under very great pressure, or cooling it to very low temperatures to make it a liquid.
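
One way to see why the process is "uphill" overall: methane carries more chemical energy per carbon atom than the long-chain alkanes it is converted into, so some of the feed's energy is necessarily given up along the way. A quick sketch using standard handbook heats of combustion (my numbers, not the article's):

```python
# Energy per carbon atom for methane versus longer alkanes. Heats of
# combustion are standard handbook values in kJ/mol (approximate).
heats_of_combustion = {
    "methane (CH4)":   (1, 890),      # (carbon atoms, kJ/mol released)
    "octane (C8H18)":  (8, 5470),
    "cetane (C16H34)": (16, 10700),   # hexadecane, a diesel-like alkane
}

for name, (carbons, kj_per_mol) in heats_of_combustion.items():
    print(f"{name}: {kj_per_mol / carbons:.0f} kJ per mole of carbon")
# methane ~890, octane ~684, cetane ~669 kJ per mole of carbon: building
# chains from methane must shed the difference, so GTL consumes energy overall.
```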

Longer-chain hydrocarbons naturally have a higher boiling point, so they’re easier to keep liquid, easier to store in tanks. Years of transporting liquefied natural gas in large quantities have proven that the whole effort is an expensive pain in the — expensive pain.

What’s much more important about the synthetic long-chain hydrocarbons from the Pearl operation is that they are very much like gasoline, diesel fuel, and kerosene (jet fuel). Since they’re being built to order, so to speak, they have an advantage over conventional fuels too — they’re much purer. Natural gas is perfectly “sweet” — in fact, a sulfur compound, ethyl mercaptan, is added artificially to give it the “smell of gas”; otherwise, natural gas would be odorless. The GTL process produces synthetic longer-chain hydrocarbons that never contained the odd sulfur compound, or the aldehydes and ketones (more smelly stuff), that would otherwise have to be removed.

One result is a fuel like diesel fuel that can be used without engine changes in a conventional diesel engine, but that produces cleaner exhaust, little or no soot, and much less smell. Changing the process slightly creates something very much like gasoline; a slightly different change creates something like kerosene.

The other result is this: we have lots of natural gas, throughout the world and in the USA in particular. Hydraulic fracturing — “fracking” — has opened up amazing reserves that weren’t thought to be practical a few years ago. The discovery of immense clathrate deposits — so-called “burning ice” — is another immense source of natural gas. (In fact, the estimates right now are that clathrate deposits are the equivalent of twice all other fossil fuels on Earth.) And perhaps best of all, the current costs of transporting natural gas are high enough that many refineries and oil fields simply flare off — burn — waste natural gas.

The gas-to-liquid process makes all those sources available not just to generate power and heat, but to replace fuel oils, and even lubrication oils. The combination of natural gas production and GTL technology could conceivably replace the whole expensive infrastructure required to move oil and gas from the Middle East to Europe and the Americas, and make “waste” natural gas into a commercially useful product. As long as the price is right.

That’s the literal bottom line to the GTL process: it appears that the Shell process, as it stands right now, is financially feasible if the price of oil exceeds $20 a barrel.

Article printed from The PJ Tatler: http://pajamasmedia.com/tatler

URL to article: http://pajamasmedia.com/tatler/2011/06/04/arabian-alchemy-shell-makes-black-gold-in-qatar/
 
Two approaches to fuel economy and substitution. I don't think that flex fuel is quite as easy as the author seems to think (methanol is quite corrosive to the ordinary plastics and rubber used in the fuel system, for example), but it is still cheaper than the technical and infrastructure changes needed to convert a large portion of the fleet to CNG:

http://www.nationalreview.com/articles/268621/two-approaches-fuel-choice-robert-zubrin

Two Approaches to Fuel Choice
Open Fuel Standards is the right choice.

Americans are currently being heavily taxed by the governments of the OPEC cartel, who are using a policy of restricting oil production to drive up prices. Indeed, with prices inflated to the $100-per-barrel range, America’s 5 billion barrels per year of petroleum imports will cost our economy $500 billion, an amount equal to 25 percent of the federal government’s tax receipts or, alternatively, the nation’s whole balance-of-trade deficit.

The only way to break the power of the oil cartel to set global liquid-fuel prices is to open the market to competition from non-petroleum-based fuels. With this in mind, two bipartisan bills have recently been introduced in the U.S. House of Representatives. One is H.R. 1380, known as the “New Alternative Transportation to Give Americans Solutions Act,” or “NAT GAS Act” for short. The other is H.R. 1687, the Open Fuel Standards Act. The approaches adopted in these two pieces of legislation are very different.

The NAT GAS Act, which is strongly supported by oil and gas tycoon T. Boone Pickens, would provide a $7,500 tax-credit subsidy for the purchase of natural-gas cars, as well as a further subsidy to their manufacturers of $4,000 each, for a total of $11,500 per car. Natural-gas-truck subsidies would be at least double this, with the amount of the subsidy increased to as much as $64,000 per truck, depending on size. Further subsidies of up to $100,000 each would be available to filling stations to install natural-gas pumps.

The budgetary impact of this bill could be quite significant. For example, if we assume a sales rate of 1 million cars per year subsidized at $11,500 each, plus 100,000 small trucks subsidized at $23,000 each, and forget about the larger trucks and filling stations, the total tab would come to $13.8 billion per year. This would be triple the $4.5 billion per year ($0.45 per gallon times 10 billion gallons) currently being spent on the controversial corn-ethanol program, which has replaced 8 percent of our gasoline use. In contrast, it would take 18 years of such subsidies, with no vehicle losses, for the NAT GAS Act to replace 8 percent of the American automobile fleet, at a total cost to the treasury of $248 billion. Thus, at the end of 18 years, assuming a 2 percent compound rate of growth, the U.S. vehicle fleet will expand from 180 million to 257 million, of which 237 million will still be gasoline-powered, leaving us more dependent on foreign oil than at the program’s start. But since the average life of a car is only 17 years, it is unlikely that even this very modest degree of accomplishment will be achieved.
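
Zubrin's arithmetic in the paragraph above is easy to reproduce; the sketch below uses only figures stated in the article:

```python
# Reproducing the article's NAT GAS Act arithmetic.
cars_per_year, car_subsidy = 1_000_000, 11_500        # $ per car
trucks_per_year, truck_subsidy = 100_000, 23_000      # $ per small truck

annual_cost = cars_per_year * car_subsidy + trucks_per_year * truck_subsidy
print(f"Annual cost: ${annual_cost / 1e9:.1f} billion")                  # $13.8 billion

years = 18
print(f"{years}-year total: ${annual_cost * years / 1e9:.0f} billion")   # ~$248 billion

fleet_now, growth = 180e6, 0.02
fleet_then = fleet_now * (1 + growth) ** years
converted = (cars_per_year + trucks_per_year) * years
print(f"Fleet after {years} years: {fleet_then / 1e6:.0f} million vehicles, "
      f"of which only {converted / 1e6:.1f} million would run on natural gas")
```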

Another remarkable feature of the NAT GAS Act is the degree to which it has been championed by an openly self-interested party who would profit from increased sales of the sole alternative fuel chosen for support by the bill. However, it should be noted that the total amount of natural gas sold per car is unlikely to exceed $1,000 annually, of which perhaps 20 percent might be profit. Thus, even after several hundred billion dollars are spent to create a 20-million-car natural-gas fleet, the resulting profits to the entire natural-gas industry would be only about $4 billion per year. So, if helping the natural-gas industry were the objective, this could be accomplished at much lower cost to the treasury just by giving them their cut.

In contrast, the Open Fuel Standard bill does not choose a single winner, and would not cost the treasury anything. Instead, it stipulates that within several years the majority of new cars sold in the U.S. must give the consumer fuel choice by being any one of the following: full flex fuel (i.e., capable of using methanol, ethanol, and gasoline), natural gas, plug-in hybrid, or biodiesel compatible. Of these, the cheapest to produce will be flex fuel (zero to at most $100 additional cost per car), as many gasoline-powered vehicles now sold in the U.S. are already built with flex-fuel capability in mind, and need only a software upgrade to realize it. However, should consumers wish to spend their own money for the other alternatives, they will have every right to do so.

That said, it is the flex-fuel car’s methanol capability that will truly open up the source market for liquid fuels, as methanol can be made cheaply from coal, natural gas, or biomass. In fact, if the goal is to open up the vehicle-fuel market to natural gas, that can be much more readily accomplished, in a much bigger way, by the Open Fuel Standard legislation than by the NAT GAS Act, without any cost to the taxpayers at all — provided, of course, that natural-gas-sourced methanol continues to beat coal- or biomass-sourced methanol on price. This is as it should be.

Furthermore, unlike the NAT GAS Act, which will have near-zero impact on global oil prices, the worldwide effects of the Open Fuel Standard bill would be profound. This is because foreign carmakers will not wish to walk away from the American automobile market. If flex fuel becomes the standard for U.S. auto sales, foreign carmakers will switch their lines over, and their products worldwide will be predominantly flex fuel as well. This will subject gasoline to competition from methanol, and in some places ethanol, made from the cheapest local sources everywhere, thereby creating a permanent global competitive constraint on future oil prices.

The NAT GAS Act would cost the treasury a fortune, while accomplishing next to nothing. The Open Fuel Standard bill would cost the treasury nothing, while protecting both the U.S. and world economies from continued taxation by the oil cartel.

One can only hope that Congress makes the right choice.

— Dr. Robert Zubrin is president of the aerospace-engineering firm Pioneer Astronautics, a fellow with the Center for Security Policy, and the author of Energy Victory: Winning the War on Terror by Breaking Free of Oil.
 
Free markets and energy supplies:

http://washingtonexaminer.com/politics/2011/06/free-market-not-government-policies-drives-energy-boom

Free market, not government policies, drives energy boom
By: Michael Barone 06/07/11 8:05 PM
Senior Political Analyst

There's an awful lot that's stale in the debate on government energy policy.

Some stale arguments are nevertheless valid: It's dangerous to depend heavily on Middle Eastern oil. Others have increasingly been seen as dubious: that global warming caused by human activity will result in catastrophe.

There's stale talk about federal and state laws that promised great change but have produced very little. Electric cars, even with subsidies, are no larger a part of the auto fleet than they were 100 years ago.

Renewable energy sources like wind and solar still produce only a tiny percentage of electricity. That offshore wind farm hasn't gone up in Nantucket Sound, and the Mojave Desert is never going to be covered with solar panels.

Ethanol subsidies have jacked up the price of corn, raising the price of meat here and tortillas in Mexico. But the subsidies haven't done much for gas mileage, and presidential candidates heading to Iowa now call for abolishing them.

In contrast to the marginal effects of these much-ballyhooed public policies, there has been a huge breakthrough in energy production in the past couple of years.

Petroleum engineers working for private companies have used a technique called "hydraulic fracking," injecting vast amounts of water into rock, to release commercially viable amounts of natural gas and oil.

Hydraulic fracking has resulted in a boom in the Bakken oil shale formation under North Dakota and Montana. North Dakota is now the No. 4 state in oil production.

And hydraulic fracking has made commercially viable huge volumes of natural gas previously imprisoned in shale rock in western Pennsylvania and West Virginia.

The U.S. Energy Information Administration has estimated that there is at least six times as much natural gas available now as a decade ago as well as a big increase in commercially recoverable oil.

All thanks to hydraulic fracking, a phrase I bet you didn't hear in the energy debate in the 2008 presidential campaign or in the debate over the cap-and-trade bill passed by House Democrats in June 2009. My (perhaps defective) search for the phrase in the New York Times and Washington Post websites didn't yield any mentions earlier than 2010.

While government's ethanol subsidies and renewable requirements have made little difference, the private sector's hydraulic fracking has increased our energy supply and reduced our dependence on dicey Middle Eastern oil.

This kicks back against the efforts of government under the Obama administration to restrict energy supply. The administration has shut down much offshore drilling in the Gulf of Mexico (even though Obama cheered Petrobras' drilling off the shore of Brazil) and has been denying permits for oil drilling in Alaska that is needed to keep the pipeline pumping. This on top of environmental groups' successful attempts to prevent drilling on the desolate tundra of the Arctic National Wildlife Refuge.

The State Department has even been stalling on approving the Keystone pipeline from the tar sands of Alberta to refineries in Oklahoma and Texas. Environmental groups object to drilling techniques Canada allows.

It's unclear why we should feel called on to second-guess the internal regulations of a competent and environmentally-conscious nation like Canada. And it's incomprehensible why we should want to keep out a plentiful supply of oil from a dependable and friendly neighbor.

There is a lesson here for public policy generally, including health care. No centralized government expert predicted the vast expansion in energy supply from hydraulic fracking. It was produced by decentralized specialists in firms subject to market competition.

Just as Friedrich Hayek taught, no central planner can know or foresee enough to produce the beneficial results regularly produced by competition in free markets regulated in accordance with the rule of law. And no central planner can accurately predict the course of innovation that can be achieved in decentralized markets. That's something you might want to keep in mind when someone tells you that Medicare costs can be controlled by 15 members of an unelected board created by Obamacare. Better results and lower costs can be expected with the kind of market competition set up by the 2003 Medicare prescription drug law.

No one can tell you just how that will happen, just as no one was telling you three years ago just how hydraulic fracking would expand our energy supply. But it did. That's what market competition can do -- and government control can't.

Michael Barone, The Examiner's senior political analyst, can be contacted at mbarone@washingtonexaminer.com. His column appears Wednesday and Sunday, and his stories and blog posts appear on ExaminerPolitics.com.


 
Every little bit helps (I think this should be added to existing vehicle fleets post haste):

http://nextbigfuture.com/2011/06/system-for-bringing-engine-oil-to.html

System for bringing engine oil to optimal temperature could increase fuel efficiency by 7% in old and new cars

Frank Will with the Formula SAE racing car

A minor modification to your car could reduce fuel consumption by over seven per cent. The Deakin University (Australia) invention uses waste heat to reduce friction by warming the engine oil. A prototype has been built and tested and the inventors are now talking to the car manufacturers and developing an aftermarket conversion kit. The system, which can be retrofitted, works by diverting waste heat to bring engine oil up to its optimal operating temperature. It was developed by researchers at Deakin University led by Mr Frank Will of the School of Engineering during his PhD project.

Half of all oil usage is for gasoline in cars and trucks. A 7% fuel saving that can be applied to existing cars and new cars would save 2.8 million barrels per day worldwide. It could save 700,000 barrels per day in the United States.
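
The barrels-per-day claims above check out roughly. A quick sketch; the world and U.S. consumption figures are my assumed circa-2011 values, not from the article:

```python
# Back-of-the-envelope check of the fuel-saving claims above.
world_oil_bpd = 87e6     # assumed world oil consumption, barrels/day (~2011)
us_oil_bpd = 19e6        # assumed US oil consumption, barrels/day (~2011)
gasoline_share = 0.5     # article: half of all oil fuels cars and trucks
saving = 0.07            # claimed fuel-economy gain

print(f"World: {world_oil_bpd * gasoline_share * saving / 1e6:.1f} million bbl/day")
print(f"US:    {us_oil_bpd * gasoline_share * saving / 1e6:.2f} million bbl/day")
# ~3.0 and ~0.67 million bbl/day, close to the article's 2.8 million and 700,000.
```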

“Preliminary testing of our system has demonstrated fuel savings of over seven per cent as well as significant reductions in exhaust emissions,” Frank says.

The work is being presented through Fresh Science, a communication boot camp for early career scientists held at the Melbourne Museum. Frank was one of 16 winners from across Australia.

A typical car engine wastes about 80 per cent of the fuel consumed. Only 20 per cent of the fuel’s energy is used to drive the car forward; the rest is lost as heat. Frank believes his invention – which he has named OVER7™ – represents a smarter approach to vehicle engine design.

“One of its most important features is that it doesn’t have to heat all the oil in the sump. Instead, it heats only the active oil in the engine lubrication system. This makes the overall heat transfer process much more efficient.

“The system has the potential to be retrofitted to existing engines and we don’t think it will require big changes. It should be much cheaper to fit than an LPG conversion for example. Built into a new car it should pay for itself within a month or two,” he says.

“We also think the system will be suitable for a range of vehicles, including diesels, hybrids and those using alternative fuels.” Other benefits include the potential to reduce engine wear and improve performance.
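
The physical mechanism behind the saving is simple: lubricant viscosity falls steeply with temperature, and viscous friction falls with it. Below is a minimal sketch fitting an Andrade-type model (mu = A * exp(B/T)) through two typical 10W-40 oil data points; the viscosity figures are generic handbook values, not Deakin's measurements:

```python
# Why pre-warming the oil helps: viscosity versus temperature for a typical
# 10W-40 oil, from a two-point Andrade fit (mu = A * exp(B / T)).
import math

# (temperature in C, kinematic viscosity in cSt) - generic 10W-40 values
t1, v1 = 40.0, 100.0
t2, v2 = 100.0, 14.0

T1, T2 = t1 + 273.15, t2 + 273.15
B = math.log(v1 / v2) / (1.0 / T1 - 1.0 / T2)
A = v1 / math.exp(B / T1)

def viscosity_cst(t_celsius):
    """Kinematic viscosity (cSt) predicted by the fitted Andrade model."""
    return A * math.exp(B / (t_celsius + 273.15))

for t in (20, 40, 60, 90):
    print(f"{t:3d} C: {viscosity_cst(t):6.0f} cSt")
# A 20 C cold start runs oil roughly ten times thicker than at the ~90 C
# operating point, hence the payoff from reaching temperature sooner.
```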
 
While we wait for the political roadblocks to cheap oil to evaporate, here is another bit of work on biofuel alternatives:

http://www.physorg.com/news/2011-06-wood-digesting-enzyme-bacteria-boost-biofuel.html

First wood-digesting enzyme found in bacteria could boost biofuel production
June 9, 2011

(PhysOrg.com) -- University of Warwick researchers funded by the Biotechnology and Biological Sciences Research Council (BBSRC)-led Integrated Biorefining Research and Technology (IBTI) Club have identified an enzyme in bacteria which could be used to make biofuel production more efficient. The research is published in the 14 June Issue of the American Chemical Society journal Biochemistry.

This research, carried out by teams at the Universities of Warwick and British Columbia, could make sustainable sources of biofuels, such as woody plants and the inedible parts of crops, more economically viable.

The researchers, who were also supported by the Engineering and Physical Sciences Research Council, have discovered an enzyme which is important in breaking down lignin, one of the components of the woody parts of plants. Lignin is important in making plants sturdy and rigid but, because it is difficult to break down, it makes extracting the energy-rich sugars used to produce bioethanol more difficult. Fast-growing woody plants and the inedible by-products of crops could both be valuable sources of biofuels but it is difficult to extract enough sugar from them for the process to be economically viable. Using an enzyme to break down lignin would allow more fuel to be produced from the same amount of plant mass.

The researchers identified the gene for breaking down lignin in a soil-living bacterium called Rhodococcus jostii. Although such enzymes have been found before in fungi, this is the first time that they have been identified in bacteria. The bacterium’s genome has already been sequenced which means that it could be modified more easily to produce large amounts of the required enzyme. In addition, bacteria are quick and easy to grow, so this research raises the prospect of producing enzymes which can break down lignin on an industrial scale.

Professor Timothy Bugg, from the University of Warwick, who led the team, said: “For biofuels to be a sustainable alternative to fossil fuels we need to extract the maximum possible energy available from plants. By raising the exciting possibility of being able to produce lignin-degrading enzymes from bacteria on an industrial scale this research could help unlock currently unattainable sources of biofuels.

“By making woody plants and the inedible by-products of crops economically viable the eventual hope is to be able to produce biofuels that don’t compete with food production.”

The team at Warwick have been collaborating with colleagues in Canada at the University of British Columbia who have been working to unravel the structure of the enzyme. They hope next to find similar enzymes in bacteria which live in very hot environments such as near volcanic vents. Enzymes in these bacteria have evolved to work best at high temperatures meaning they are ideally suited to be used in industrial processes.

Duncan Eggar, BBSRC Sustainable Bioenergy Champion, said: “Burning wood has long been a significant source of energy. Using modern bioscience we can use woody plants in more sophisticated ways to fuel our vehicles and to produce materials and industrial chemicals. This must all be done both ethically and sustainably. Work like this which develops conversion processes and improves efficiencies is vital.”
More information: This paper is available online here: http://pubs.acs.or … 21/bi101892z
Provided by University of Warwick
 
If subsidies can be eliminated, then market forces will have a much greater impact on supply and demand (ethanol subsidies distort the market in favour of an inferior fuel with the negative results described):

http://princearthurherald.com/archives/5673

Vaughan: U.S. Energy Subsidies Must Go
Posted on June 24th, 2011 by Eleanor Vaughan in Politics

Some called it a green miracle. On Thursday, a supermajority in the U.S. Senate voted to scrap a $6 billion annual corn ethanol subsidy. The amendment, sponsored by Sen. Dianne Feinstein (D-CA), removes the 45 cent tax break that oil companies receive for every gallon of ethanol that they blend into their gasoline. The amendment also ends the 54 cent per gallon import tax on foreign ethanol designed to protect domestic industry. If enacted, the amendment would save an estimated $6 billion per year. The victory is, however, a symbolic one. It is doubtful that the amendment will actually be enacted as the underlying bill—a new push for green subsidies—is unlikely to pass through the House or Senate. President Obama also opposes the amendment, claiming ethanol helps reduce foreign energy dependence.


Will the necessity of deficit reduction finally put an end to unjust and counter-productive U.S. energy subsidies?

Yet the battle against ethanol subsidies—and indeed all energy subsidies—must be pursued. Broad support from a motley coalition of Republicans and Democrats highlights the widespread consensus on ditching ethanol subsidies. While corn producers have tried to claim that bio-fuels reduce greenhouse gas emissions and foreign energy dependence, these arguments don’t hold up to scrutiny.

Ethanol does not significantly reduce carbon emissions. Growing huge amounts of corn and distilling it to produce ethanol consumes vast amounts of energy and stresses natural resources. It takes more energy to produce a gallon of ethanol than that gallon actually contains, and its gas mileage is poor compared to conventional gasoline. The Energy Information Administration reports that a gallon of fuel ethanol is equal to only 0.67 gallons of conventional gas. Ethanol puts out 11% less energy per gallon than petroleum diesel. Ethanol subsidies also damage the environment by encouraging farmers to irresponsibly expand corn production, often on ecologically sensitive land.
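
The EIA's 0.67 figure follows directly from the two fuels' heating values; a one-line check with standard numbers (mine, not the article's):

```python
# Gasoline-equivalence of ethanol from standard heating values (Btu/gallon).
ethanol_btu_per_gal = 76_100
gasoline_btu_per_gal = 114_100

ratio = ethanol_btu_per_gal / gasoline_btu_per_gal
print(f"1 gallon of ethanol = {ratio:.2f} gallons of gasoline")   # 0.67
```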

Ethanol subsidies also increase global corn prices, leading to food shortages and starvation around the world. Livestock producers, whose animals feed largely on corn, say that ethanol has pushed up their costs, borne out as higher meat prices for consumers. The injustice is clear—the amount of corn it takes to produce a single tank of ethanol could feed one person for an entire year.

Ethanol also does little to alleviate foreign energy dependence. According to the U.S. Department of Agriculture, the ethanol industry already uses about 40 per cent of the nation’s corn crop but produces less than 10% of the fuel supply. There is simply not enough corn to support significantly greater production. For the United States to completely depend on ethanol as a fuel, the country would need to use every single acre of its land to grow corn—and would still need 20% more land on top of that to meet its energy needs.
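
The "not enough corn" point follows from simple proportionality, assuming fuel output scales linearly with the share of the crop used (my assumption; the article's every-acre claim also folds in yield per acre):

```python
# If ~40% of the corn crop yields <10% of the fuel supply, a naive linear
# scale-up to 100% of the fuel supply needs ~4x the entire current crop.
corn_share_used = 0.40
fuel_share_supplied = 0.10

crops_needed = corn_share_used / fuel_share_supplied
print(f"Full replacement would need ~{crops_needed:.0f}x the current corn crop")
```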


“Ethanol subsidies,” concludes Sen. Tom Coburn (R-OK), “are bad economic policy, bad energy policy, and bad environmental policy.”

Indeed, why subsidize the energy industry at all?

As fiscal strain mounts in Washington, the pressure to cut subsidies will build. Sen. Lamar Alexander (R-TN) is reportedly working on legislation to wipe out an array of tax subsidies for other energy sources. “When we are borrowing 40 cents out of every dollar that we spend,” Alexander notes, “it is a good time to take a hard look at unwarranted tax breaks and one appropriate use of those funds is to reduce the deficit.” Cutting energy subsidies would lead to savings in the tens of billions.

The premise that the federal government must intervene to promote alternative energy sources is dubious at best. Energy subsidies distort market signals, diverting resources from their most effective uses to their most politically favorable ones. In a free market, investors will invest in the alternative energy technology that makes economic sense. Subsidies discourage investment into more promising energy alternatives, like cellulosic ethanol made from non-food crops.

While incentives and tax breaks can be used to help get new technologies off the ground, they should not be used to permanently prop up the energy industry. Subsidies for ethanol, natural gas, oil, and coal directly harm the environment by keeping the price of pollution-producing energy sources artificially low and thus encouraging their consumption. For green energy innovation to occur, markets must be free to allocate capital to the best technology. Government, swayed by the demands of its corn-growing constituents, should not pick winners and losers through the tax code.

Eleanor Vaughan graduated from McGill University in June and is interning in Washington, D.C.
 
Getting clean, cheap electrical energy is another big issue. Since so many people get bent out of shape by the word "nuclear", here is one possible alternative:

http://nextbigfuture.com/2011/06/meeting-all-of-earths-energy-needs-with.html

Meeting all of the earth's energy needs with tethered platforms

A tethered platform hovering at an altitude of 20 kilometers would operate in the stratosphere, above most clouds and weather. At that altitude, a platform covered with photovoltaic cells would be able to collect considerably more sunlight than a ground-based solar collector. A company called StratoSolar has developed a proposal to use either concentrated solar collectors or straight photovoltaic cells to generate electricity. Such schemes could potentially bring the cost of solar power down to 1 cent per kilowatt-hour. In an interview with Sander Olson, StratoSolar President Edmund Kelly describes what will be needed to make this concept a reality.

Edmund Kelly

Question: What is StratoSolar?

It's a proposal to collect solar power at an altitude of 20 kilometers, where the troposphere gives way to the stratosphere. The stratosphere is a nearly ideal location for collecting solar power, since at that altitude the atmosphere has absorbed little of the sun's rays. There are no clouds and low winds. There is day and night, but that's totally predictable.

Question: StratoSolar has researched two versions, one using Concentrated Solar Power (CSP) and the other involving floating platforms of straight photovoltaic cells. How does the CSP system work?

The basic idea is a huge tethered concentrator at 20 km that tracks the sun. It continually collects light during the day and sends it to the ground through a light pipe. Some of the energy is stored as heat for 24-hour power production. Each square meter of a 2 km dish collects six times as much sunlight as a ground-based heliostat mirror. In regard to converting sunlight into electric power, it is as efficient as the best thermal plants. It is a fairly complicated system involving light pipes, heat storage, turbine generators, and so forth.

Question: What will it cost?

Based on the cost of materials (plastic, aluminum, hydrogen, and steel wire or Kevlar) and a ten-year payoff, we expect the cost of the CSP version to eventually fall to around 1 cent per kilowatt-hour as the technology matures. Thermal storage is remarkably low cost; in this design it adds no more than a tenth of a cent per kWh.

Even an early version looks like it will initially generate electricity for 6 cents per kilowatt-hour.

At 20km, a tethered platform gets the benefits of abundant sunlight and a benign environment. A tethered platform will get nearly constant sunlight from dawn till dusk, whereas on the ground a platform will only get peak sunlight for a brief period.
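
A rough yield comparison supports this, using assumed round numbers rather than StratoSolar's figures: near-constant insolation of about 1300 W per square meter for roughly 12 hours a day at altitude, versus a mid-latitude ground site averaging around 125 W per square meter over 24 hours:

```python
# Daily energy collected per square meter: 20 km platform vs. ground site.
# All inputs are assumed round numbers, not StratoSolar's figures.
strato_wh = 1300 * 12   # Wh/m^2/day: ~1300 W/m^2 for ~12 daylight hours
ground_wh = 125 * 24    # Wh/m^2/day: mid-latitude ground average ~125 W/m^2

print(f"20 km platform: {strato_wh / 1000:.1f} kWh/m^2/day")
print(f"Ground site:    {ground_wh / 1000:.1f} kWh/m^2/day")
print(f"Ratio: ~{strato_wh / ground_wh:.1f}x")
# ~5x, in the ballpark of the 6x claimed for the dish versus a heliostat.
```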

Question: Can't ground-based photovoltaics provide all of the earth's energy needs?

Photovoltaic technology has made considerable progress during the past two decades, but it is not anywhere close to being competitive with coal in terms of cost per kilowatt-hour. In ground installations, the cell only receives sunlight on sunny days, and even then only intermittently. More efficient and less expensive cells can ameliorate these problems, but those advantages would also make our concepts even more viable. Instead, we have developed an inherently better solution, one that can be deployed as far north as Stockholm.

Question: Is the CSP approach the more sophisticated of the two?

We believe that the most efficient way to get solar energy is by floating large solar collectors at an altitude of 20 kilometers. However, the technology for concentrated solar power is newer and trickier than for photovoltaics, and involves greater R&D costs and higher risks. For that reason we developed the photovoltaic system, which is simpler and easier to implement. The PV system floats a platform using hydrogen gas, also at 20 kilometers. The photovoltaic (PV) system should result in costs of 8 cents per kilowatt-hour, but with continuing improvements in PV panels, we can reduce the cost even further.

Question: What R&D costs are we talking about to fully develop the CSP and PV systems? How much electricity could one platform provide?

The CSP systems contain many elements and the minimum system is large. This means the R&D cost to build a complete functioning system is several hundred million dollars. It will ultimately require billions of dollars to perfect this technology and to develop the infrastructure. However, the first platforms will generate revenue, which will help fund the R&D. The PV platforms contain fewer elements, are smaller initially, and are modular, so the initial R&D cost is much lower than CSP, in the region of tens of millions of dollars for the first operational platform, and they can be easily scaled up to larger systems without additional R&D. We have calculated that 30 of these larger PV platforms could provide all of the daylight electricity needs of all of California. California could achieve energy independence within a decade.
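
A rough sizing of the California claim; the demand and efficiency figures below are my assumptions, not StratoSolar's:

```python
# How big would each of 30 platforms need to be to carry California's
# daytime load? Demand and efficiency are assumptions, not StratoSolar's.
import math

ca_daytime_load_w = 30e9    # assumed ~30 GW average daytime demand
platforms = 30
insolation_w_m2 = 1300.0    # W/m^2 at 20 km
efficiency = 0.20           # assumed PV module efficiency

per_platform_w = ca_daytime_load_w / platforms
area_m2 = per_platform_w / (insolation_w_m2 * efficiency)
diameter_km = 2 * math.sqrt(area_m2 / math.pi) / 1000.0

print(f"Each platform: {per_platform_w / 1e9:.1f} GW, "
      f"{area_m2 / 1e6:.1f} km^2, ~{diameter_km:.1f} km across if circular")
# ~1 GW and ~2.2 km across, consistent with the 2 km dish scale mentioned earlier.
```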

Question: Given sufficient funding, how long would it take to launch the first CSP platform?

We would require at least five years to get a full-scale CSP platform up and running, since significant R&D is needed to make that concept viable. Within three years, we should be able to create a small platform that confirms the feasibility of the approach. The photovoltaic approach is modular, and therefore can be scaled up faster.

Question: How confident are you that a 20 kilometer tether could be made sufficiently strong? What materials will be required for the floating platform?

We will use Kevlar or UHMWPE for the tether, which have sufficient strength to support the platform without breaking. For a high voltage line, the tether will be very narrow in diameter. For the platform there is no need for exotic materials - the required materials are mostly aluminum, Mylar, and structural fabric.
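
A quick plausibility check on the tether: Kevlar's "breaking length" (the length at which a uniform hanging fiber snaps under its own weight) is far more than 20 km. Standard Kevlar 49 properties are assumed here, not StratoSolar's numbers:

```python
# Can a 20 km Kevlar tether hold itself up? Compare its breaking length,
# strength / (density * g), with the 20 km span. Generic Kevlar 49 values.
strength_pa = 3.6e9     # tensile strength, ~3.6 GPa
density_kg_m3 = 1440.0
g = 9.81                # m/s^2

breaking_length_km = strength_pa / (density_kg_m3 * g) / 1000.0
print(f"Kevlar breaking length: ~{breaking_length_km:.0f} km")   # ~255 km
# 20 km of self-weight uses under a tenth of the fiber's strength, leaving
# ample margin for payload tension and wind loading.
```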

Question: What sort of efficiency losses would be incurred getting the power down to the ground?

Low. No different from 20 km of ground-based power lines.

Question: At what latitudes can tethered platforms operate?

Our platforms are designed to operate at between latitudes 30 and 60, which pretty much covers the industrial zones. We don't recommend deploying these platforms over cities, at least initially. However, you could place them close to cities, obviating the need for long power cables and efficiency losses. Therefore, these platforms could operate over Germany, and collect substantially more energy than ground-based solar cells in a desert.

Question: To what extent will weather be a problem? What about static electricity or hydrogen safety issues?

With the CSP, there is a possibility of catastrophic risk from a major hurricane or tornado. That is one of the reasons that we developed the PV system. The platform itself will be above most clouds, so only the cables will have to endure weather. The PV system is designed to withstand any expected wind, and will in general be unaffected by weather. We have also specifically designed the platforms to be unaffected by electrical discharges. In a properly designed system, the risks from using hydrogen are negligible; for example, the hydrogen bags can be surrounded with nitrogen.

Question: To what extent can the costs of conventional ground-based solar power be reduced?

During the past decade, packaging costs have come down substantially, and the costs for packaging the cell and the electronics have been reduced. There are limits to how much lower costs can go: fixed costs for land and paving won't go down. A conventional solar station on the ground requires square miles of land and huge amounts of material, so there is a bottom limit to costs. By contrast, the StratoSolar approach has costs that are only a third of those of a conventional solar station. Our system is the only feasible way to generate solar power that is cost-competitive with coal.

Question: Could this technology be used to eventually provide all of the earth's energy needs?

Within a few decades, there could be thousands of platforms in operation. That sounds like a lot, but these platforms are surprisingly easy to build and actually quite safe to operate. Building and operating thousands of these platforms is a feasible and cost-effective proposition, even without any subsidies.

Question: How much progress could be made with CSP and PV within the next decade?

Within a decade, we could have PV deployed on a very large scale. CSP will take longer, but we could see multiple CSP platforms operating within ten years. PV is on a technology development curve, and if we could develop an efficient electricity storage system, we might not even need CSP.
 
I wonder how big a shadow that thing casts at 9 o'clock in the morning?
 
Several of the potential replacements for oil depend on biological systems, such as biofuels derived from algae, artificial enzymes to break down agricultural and biological wastes for conversion into ethanol, methanol, or biodiesel, and synthetic bacteria to produce fuels from various raw materials. This technology has the potential to unlock these biological systems:

http://nextbigfuture.com/2011/07/accelerated-evolution-machine-being.html

Accelerated evolution machine being adapted to human stem cells, creating the new field of recombineering

In May, 2010, Venter and his team successfully inserted a fully customized strand of DNA into a living cell, creating what they call the "first synthetic genome." Church says MAGE (Multiplex Automated Genome Engineering) can achieve similar results faster and cheaper. His lab's device will go on sale later this year for about $90,000, and at least a dozen companies, including chemical giant DuPont (DD) and biotech startup Amyris, are considering purchasing it.

The new term is recombineering.

Recombineering (recombination-mediated genetic engineering) is a genetic and molecular biology technique based on homologous recombination systems, as opposed to the older and more common method of using restriction enzymes and ligases to cut and glue DNA sequences. It was developed in E. coli, is now expanding to other bacterial species, and is used to modify DNA in a precise and simple manner. The procedure is widely used for bacterial genetics, in the generation of target vectors for making a conditional mouse knockout, and for modifying DNA of any source, often contained on a bacterial artificial chromosome (BAC).

New Scientist had updated coverage

The machine let the E. coli multiply, mixed them with the DNA strands, and applied an electric shock to open up the bacterial cells and let the DNA get inside. There, some of the added DNA was swapped with the matching target sequences in the cells' genomes. This process, called homologous recombination, is usually very rare, which is where the viral enzymes come in. They trick cells into treating the added DNA as their own, greatly increasing the chance of homologous recombination.

* Church is adapting MAGE for genetically modifying human stem cell lines. The work, funded by the US National Human Genome Research Institute, aims to create human cell lines with subtly different genomes in order to test ideas about which mutations cause disease and how.

* If MAGE really can be used to edit the genome of human cells, it would provide a way to fix the mutations that cause inherited disease. It could be the technology that opens the door to the genetic engineering of humans.

Recombineering utilizes linear DNA substrates that are either double-stranded (dsDNA) or single-stranded (ssDNA). Most commonly, dsDNA recombineering has been used to create gene replacements, deletions, insertions, and inversions. Gene cloning and gene/protein tagging is also common. For gene replacements or deletions, a cassette encoding a drug-resistance gene is usually made by PCR using bi-partite primers. These primers consist of (from 5’→3’) 50 bases of homology to the target region, where the cassette is to be inserted, followed by 20 bases to prime the drug-resistance cassette. The exact junction sequence of the final construct is determined by primer design. These events typically occur at a frequency of approximately 10^4/10^8 cells that survive electroporation; electroporation is the method used to transform the linear substrate into the recombining cell.
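(Interpolation: the bi-partite primer layout is easy to see in code. A minimal sketch follows; the sequences are invented placeholders, and a real design would copy 50 bases flanking the genomic insertion site plus 20 bases from the actual resistance cassette.)

[code]
# Sketch of bi-partite primer construction for dsDNA recombineering.
# Sequences are invented placeholders, not a real locus or cassette.

def make_bipartite_primer(homology_50nt, cassette_prime_20nt):
    """Concatenate 5'->3': 50 nt of target homology, then 20 nt priming the cassette."""
    assert len(homology_50nt) == 50, "need exactly 50 bases of target homology"
    assert len(cassette_prime_20nt) == 20, "need exactly 20 bases to prime the cassette"
    return homology_50nt + cassette_prime_20nt

homology = "ATG" + "ACGT" * 11 + "CGA"  # placeholder 50-mer around the insertion point
cassette = "GATCCTGCAGGTCGACTCTA"       # placeholder 20-mer matching one cassette end
primer = make_bipartite_primer(homology, cassette)
print(len(primer), primer)              # a 70-mer; the junction is set purely by design
[/code]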

Recombineering with ssDNA provided a breakthrough both in the efficiency of the reaction and in the ease of making point mutations. This technique was further enhanced by the discovery that by avoiding the methyl-directed mismatch repair system, the frequency of obtaining recombinants can be increased to over 10^7/10^8 viable cells. This frequency is high enough that alterations can now be made without selection. With optimized protocols, over 50% of the cells that survive electroporation contain the desired change. Recombineering with ssDNA requires only the Red Beta protein; Exo, Gamma and the host recombination proteins are not required. As proteins homologous to Beta and RecT are found in many bacteria and bacteriophages (over 100 as of February 2010), recombineering is likely to work in many different bacteria. Thus, recombineering with ssDNA is expanding the genetic tools available for research in a variety of organisms. To date, recombineering has been performed in E. coli, S. enterica, Y. pseudotuberculosis, and M. tuberculosis.
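(Interpolation: those frequencies are the whole story on why selection becomes unnecessary. With P(at least one hit among n colonies) = 1 - (1 - f)^n, a ten-colony screen almost never works at the classic 10^-4 dsDNA frequency, usually works at 10%, and essentially always works at 50%.)

[code]
# Chance of finding at least one recombinant among n screened colonies.

def p_hit(freq, n=10):
    return 1 - (1 - freq) ** n

for freq in (1e-4, 0.10, 0.50):  # classic dsDNA rate, ssDNA rate, optimized protocol
    print(f"f = {freq:g}: P(hit in 10 colonies) = {p_hit(freq):.4f}")
[/code]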

The biggest advantage of recombineering is that it obviates the need for conveniently positioned restriction sites; in conventional genetic engineering, DNA modification is often compromised by the limited availability of unique restriction sites. In engineering large constructs of over 100 kb, such as Bacterial Artificial Chromosomes (BACs) or whole chromosomes, recombineering has become a necessity. Recombineering can generate the desired modifications without leaving any 'footprints' behind. It also forgoes multiple cloning stages for generating intermediate vectors and is therefore used to modify DNA constructs in a relatively short time frame. The homology required is short enough that it can be generated in synthetic oligonucleotides, and recombination with short oligonucleotides themselves is remarkably efficient. Recently, recombineering has been developed for high-throughput DNA engineering applications termed 'recombineering pipelines'. Recombineering pipelines support the large-scale production of BAC transgenes and gene-targeting constructs for functional genomics programs such as EUCOMM (European Conditional Mouse Mutagenesis Consortium) and KOMP (Knock-Out Mouse Program). Recombineering has also been automated in the Church lab, a process called MAGE (Multiplex Automated Genome Engineering).

Human MAGE

Next-Gen Reading and Writing of Microbial and Human Genomes talk by George Church

The cost of sequencing has plummeted a million fold in six years and integration with next-gen genome engineering is following a similar path. Our SynBERC SynBioSIS - BIOFAB group supports the community via high-throughput production and characterization of synthetic genes and genomes by novel and cost-effective resources like oligonucleotide chips and Multiplex Automated Genome Engineering (MAGE). Starting with up to 244K 300-mers or 1M 75-mers per chip (for as little as $500 per chip), with enzymatic error correction these assemble into 600-mers with error rates as good as 1/6000 (and with sequencing even better). E. coli MAGE (via ss-oligomers) can incorporate up to 5 mutations per 90-mer and up to seven 90-mers per cell per 2 hr. One MAGE device can produce up to 4 billion combinatorial genomes (cells) per day in each of 8 growth chambers. The engineered cells are characterized by FACS, automated microscopy, selective growth and quantitative sequencing assays of RNA and protein-NA interactions. Applications include optimization of metabolite, fuel, drug, and macromolecular production levels. New translation codes are aimed at efficient incorporation of multiple non-standard amino acids, multi-virus resistance and safety through nutritional and genetic isolation. For human MAGE we are optimizing various combinations of ss-oligos, Zn-finger and TALE targeting, ds-nucleases, deaminases, and recombinases. The SynBioSIS chip pipeline and E. coli MAGE help with constructing and selecting new ZnF and TALE and BACs for use in fibroblast, stem cells (hiPS) etc. Next-gen sequencing measures off-target mutational and epigenomic impacts and ratios of bar-codes.
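(Interpolation: the throughput figures in that abstract compound quickly; here is my own back-of-envelope reading of them. Note that 2^32 is about 4.3 billion, the same order as the quoted 4 billion combinatorial genomes per day per chamber.)

[code]
# Back-of-envelope on the quoted MAGE rates; the interpretation is mine.

oligos_per_cycle = 7      # up to seven 90-mers incorporated per cell per 2 h cycle
cycles_per_day = 24 // 2  # twelve 2-hour cycles per day
print(f"up to {oligos_per_cycle * cycles_per_day} oligo incorporations per cell per day")

# If n sites are each independently edited or not, the design space is 2**n.
for n in (10, 32, 64):
    print(f"{n} independent sites -> {2 ** n:.3e} possible genotypes")
[/code]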
 
Another amazing technology:

http://nextbigfuture.com/2011/07/modified-carbon-nanotubes-can-store.html

Modified carbon nanotubes can store solar energy indefinitely

Modified carbon nanotubes can store solar energy indefinitely, then be recharged by exposure to the sun.

Storing the sun’s heat in chemical form — rather than converting it to electricity or storing the heat itself in a heavily insulated container — has significant advantages, since in principle the chemical material can be stored for long periods of time without losing any of its stored energy. The problem with that approach has been that until now the chemicals needed to perform this conversion and storage either degraded within a few cycles, or included the element ruthenium, which is rare and expensive.

Nano Letters - Azobenzene-Functionalized Carbon Nanotubes As High-Energy Density Solar Thermal Fuels

Solar thermal fuels, which reversibly store solar energy in molecular bonds, are a tantalizing prospect for clean, renewable, and transportable energy conversion/storage. However, large-scale adoption requires enhanced energy storage capacity and thermal stability. Here we present a novel solar thermal fuel, composed of azobenzene-functionalized carbon nanotubes, with the volumetric energy density of Li-ion batteries. Our work also demonstrates that the inclusion of nanoscale templates is an effective strategy for design of highly cyclable, thermally stable, and energy-dense solar thermal fuels.

Using carbon nanotubes, the new chemical system is less expensive than the earlier ruthenium-containing compound, and it is also vastly more efficient at storing energy in a given amount of space — about 10,000 times higher in volumetric energy density, Kolpak says — making its energy density comparable to lithium-ion batteries. By using nanofabrication methods, “you can control [the molecules’] interactions, increasing the amount of energy they can store and the length of time for which they can store it — and most importantly, you can control both independently,” she says.
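(Interpolation: to make "the volumetric energy density of Li-ion batteries" concrete as heat, here is a rough conversion. The ~600 Wh/L figure is an assumed Li-ion-class density, not a number from the paper.)

[code]
# What Li-ion-class volumetric density means as stored heat (figures are rough).

DENSITY_MJ_PER_L = 2.16            # assumed ~600 Wh/L Li-ion-class density
tank_heat_mj = 80 * 4.186e-3 * 50  # heat an 80 L water tank by 50 C: m*c*dT ~ 16.7 MJ

print(f"heat needed: {tank_heat_mj:.1f} MJ")
print(f"charged fuel required: ~{tank_heat_mj / DENSITY_MJ_PER_L:.1f} liters")
[/code]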

Thermo-chemical storage of solar energy uses a molecule whose structure changes when exposed to sunlight, and can remain stable in that form indefinitely. Then, when nudged by a stimulus — a catalyst, a small temperature change, a flash of light — it can quickly snap back to its other form, releasing its stored energy in a burst of heat. Grossman describes it as creating a rechargeable heat battery with a long shelf life, like a conventional battery.

One of the great advantages of the new approach to harnessing solar energy, Grossman says, is that it simplifies the process by combining energy harvesting and storage into a single step. “You’ve got a material that both converts and stores energy,” he says. “It’s robust, it doesn’t degrade, and it’s cheap.” One limitation, however, is that while this process is useful for heating applications, to produce electricity would require another conversion step, using thermoelectric devices or producing steam to run a generator.

While the new work shows the energy-storage capability of a specific type of molecule — azobenzene-functionalized carbon nanotubes — Grossman says the way the material was designed involves “a general concept that can be applied to many new materials.” Many of these have already been synthesized by other researchers for different applications, and would simply need to have their properties fine-tuned for solar thermal storage.

The key to controlling solar thermal storage is an energy barrier separating the two stable states the molecule can adopt; a detailed understanding of that barrier was central to Grossman’s earlier research on fulvalene diruthenium, and it accounts for that material’s long-term stability. Too low a barrier, and the molecule returns too easily to its “uncharged” state, failing to store energy for long periods; too high a barrier, and it cannot easily release its energy when needed. “The barrier has to be optimized,” Grossman says.
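(Interpolation: the barrier trade-off can be sketched with simple Arrhenius kinetics, rate = A * exp(-Ea/kT). The prefactor and barrier values below are generic illustrations, not the paper's numbers; they show how a 0.6 eV barrier discharges in under a millisecond while 1.4 eV holds for centuries.)

[code]
# Storage half-life vs. barrier height via Arrhenius kinetics (illustrative only).
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K
A = 1e13        # assumed molecular attempt frequency, 1/s

def half_life_s(barrier_ev, temp_k=300.0):
    rate = A * math.exp(-barrier_ev / (K_B * temp_k))  # escape rate, 1/s
    return math.log(2) / rate

for ea in (0.6, 1.0, 1.4):
    t = half_life_s(ea)
    print(f"Ea = {ea} eV -> half-life at 300 K ~ {t:.2e} s ({t / 3.15e7:.2e} years)")
[/code]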
 
More oil from the Bakken formation:

http://nextbigfuture.com/2011/07/eco-pad-oil-recovery-in-bakken.html

Eco-pad oil recovery in the Bakken

Billionaire Harold Hamm is convinced there’s 24 billion barrels of oil to be coaxed from the Bakken field of North Dakota and Montana. Continental Resources has already prospered from Hamm’s Bakken bet—shares are up 250% since early 2009. Hamm’s 72% stake is worth $8 billion. Hamm currently has 25 of the 175 rigs working the Bakken. In the past year Continental’s Bakken output has exploded 70% to 28,000 barrels per day.

Operators like Continental, EOG Resources, Hess Corp., Occidental Petroleum and Marathon Oil have drilled some 3,000 wells there since 2008, and learn more with each one. A primary discovery: that just 100 feet below the primary Bakken formation (itself 10,000 feet down) is a whole other layer of oil-bearing rock called the Three Forks, which is separate from the Bakken and sealed off by a layer of shale. Watching flow rates, the companies agree that the average well drilled into either layer will produce around 500,000 barrels of oil in its lifetime.

Hamm’s number is aggressive because his drilling technique is aggressive. Most analysts and operators assume one well per 640 acres of reservoir. Too conservative. Continental has developed a new drilling concept it calls Eco-Pad to exploit both reservoirs. One rig will develop a 2-square-mile area by drilling eight wells—four into the Bakken layer and four into the Three Forks. Each well goes down two miles, then horizontally two miles through the reservoir. Using explosive charges, the drillers will make hundreds of holes (called “perforations”) in the pipe of each well. Then comes the hydraulic fracturing, where the well is injected with 1.8 million gallons of water and sand that props open tiny fractures in the dolomite rock to let out the oil. The “Eco” in this Eco-Pad concept? All this work on eight giant wells gets done from one spot, causing less surface impact.

From there, it’s simple arithmetic. The basin covers about 8 million acres. Hamm figures there’s room for 48,000 wells. If each one delivers that 500,000-barrel average, you get 24 billion barrels. Even then, drillers will be harvesting well under 10% of what geologist Edward Murphy of the North Dakota Geological Survey figures is 250 billion barrels of original oil in place. The Williston basin is churning out 450,000 bpd now. Within four years, says Hamm, it will be producing 1.2 million bpd — as much oil as is currently recovered from the entire U.S. side of the Gulf of Mexico.
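(Interpolation: the arithmetic checks out against the quoted figures. Eight wells per 1,280-acre Eco-Pad is one well per 160 acres:)

[code]
# Checking Hamm's "simple arithmetic" with the figures quoted above.

acres_total = 8_000_000
acres_per_well = 1280 // 8  # 2 sq mi pad = 1,280 acres, 8 wells -> 160 acres/well
barrels_per_well = 500_000  # quoted lifetime average

wells = acres_total // acres_per_well
print(f"{wells:,} well slots (Hamm quotes 48,000)")
print(f"{48_000 * barrels_per_well / 1e9:.0f} billion barrels at 500,000 bbl/well")
[/code]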

Bakken production has been slowed this year because of flooding in North Dakota.

Carbon dioxide injection is starting to be used to increase production in Montana.

Injecting the greenhouse gas underground could produce 40 billion barrels of oil in the United States, Evans said.

A DOE estimate is that enhanced oil recovery (next-gen CO2 injection) could unlock 240 billion barrels of oil.

This would involve hundreds of billions, if not trillions, of dollars in investment. Massive pipeline projects would be needed to carry captured CO2 from coal plants to oil wells.
 
A new approach to extracting oil from the oil sands:

http://nextbigfuture.com/2011/07/nsolv-has-solvent-based-approach-to.html

N-Solv has a solvent-based approach to the oilsands that uses no water and 85% less energy

N-Solv Corporation holds patents for proprietary technology for in situ solvent extraction of bitumen from oil sands. The process uses no water and 85% less energy than Steam Assisted Gravity Drainage (SAGD). They have been awarded $10.5 million by the Canadian government.

The N-Solv process is also expected to have lower operating and capital costs than SAGD with fewer restrictions on the reservoir conditions under which it can operate.

Other members of the consortium are oil sands producer Suncor Energy Inc. and Hatch Ltd.

In making the award, SDTC (Sustainable Development Technology Canada) noted that Canada has some 170 billion barrels of recoverable crude oil stored in the oil sands. Of these remaining established reserves in Alberta, 80% are too deep to be mined and are currently recovered using in situ processes such as SAGD, which is water- and energy-intensive.

N-Solv injects heated solvent (such as propane) vapor at moderate pressures into the gravity drainage chamber. The vapor flows from the injection well to the colder perimeter of the chamber where it condenses, delivering heat and fresh solvent directly to the bitumen extraction interface.

In solvent extraction, the production rate is limited by the rate at which the solvent diffuses into the bitumen, and the penetration rate of solvent into bitumen is determined by the bitumen viscosity. With Athabasca bitumen, a 25-30°C temperature rise typically reduces the bitumen viscosity by a factor of 100. Thus, says N-Solv, a substantial acceleration in the bitumen extraction rate is achieved with a very modest increase in temperature. This is the key principle of N-Solv.
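(Interpolation: that rule of thumb implies a roughly exponential viscosity-temperature relation. The one-parameter fit below is mine, anchored only to the quoted 100x-per-27.5 C data point; it suggests even a 10 C rise cuts viscosity about fivefold.)

[code]
# Exponential viscosity model fitted to the quoted rule of thumb:
# a 25-30 C rise cuts Athabasca bitumen viscosity ~100x.
import math

b = math.log(100) / 27.5  # ~0.167 per deg C, from the midpoint of 25-30 C

def viscosity_ratio(temp_rise_c):
    """Viscosity relative to its value at the starting temperature."""
    return math.exp(-b * temp_rise_c)

for dt in (10, 27.5, 50):
    print(f"+{dt} C -> viscosity x {viscosity_ratio(dt):.4f}")
[/code]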
 
When good intentions go wrong:

http://www.greencarcongress.com/2011/08/researchers-castigate-planning-bodies-for-ill-conceived-jatropha-programs.html

Researchers castigate planning bodies for ill-conceived Jatropha programs
3 August 2011

The results of massive plantings of Jatropha worldwide for use as a biofuel feedstock—some 12.8 million ha (49,421 square miles) are expected to be planted by 2015—are “anything but encouraging”, according to Promode Kant from the Institute of Green Economy in India and Shuirong Wu of the Chinese Academy of Forestry.

In a Viewpoint published in the ACS journal Environmental Science & Technology, Kant and Wu suggest that what they call the “extraordinary collapse of Jatropha as a biofuel” appears to be due to “an extreme case of a well intentioned top down climate mitigation approach, undertaken without adequate preparation and ignoring conflict of interest, and adopted in good faith by other countries, gone awry bringing misery to millions of poorest people across the world”.

The current situation began in 2003 with the decision by the Planning Commission of India to introduce mandatory biofuel blending over increasingly larger parts of the country, with a target of 30% by 2020. The Planning Commission pushed for Jatropha because it was considered to be high- and early-yielding, non-browsable, and to require little irrigation and even less management.

India encouraged millions of marginal farmers and landless people to plant Jatropha across India, Kant and Wu said. In 2006, China decided to meet 15% of its transportation energy needs by 2020 and, following India’s example, focused on Jatropha, with plans to raise it on more than 1 million ha of marginal lands. Other developing countries took similar measures, in the hope that the crop would provide enhanced income for farmers as well as renewable energy. By 2008, Jatropha had been planted on more than an estimated 900,000 ha, of which 85% was in Asia, 13% in Africa and the rest in Latin America.

According to the authors:

        In India the provisions of mandatory blending could not be enforced, as seed production fell far short of expectations. A recent study has reported discontinuance by 85% of the Jatropha farmers.
 
        China is seeing very little production of biodiesel from Jatropha seeds.
 
        Research on Jatropha planting in Tanzania found that the net present value of a five-year investment in a Jatropha plantation was negative: a loss of US$65 per ha on lands yielding 2 tons/ha of seed, and only slightly positive, at US$9 per ha, with yields of 3 tons/ha. The average expected Jatropha seed yield on poor, barren soils is only 1.7 to 2.2 tons/ha.
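(Interpolation: a toy five-year NPV model shows how thin these margins are. Every parameter below - seed price, costs, yield ramp, discount rate - is my own assumption; the study reports only the bottom lines of -US$65/ha and +US$9/ha. Even so, the toy model reproduces the sign flip between 2 and 3 t/ha.)

[code]
# Toy per-hectare NPV of a 5-year Jatropha plantation. All parameters assumed.

def npv(cashflows, rate=0.10):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

SEED_PRICE = 80.0   # US$/ton of seed (assumed)
ANNUAL_COST = 60.0  # US$/ha/year, labour and inputs (assumed)
ESTABLISH = 120.0   # US$/ha one-time planting cost in year 1 (assumed)

def plantation_npv(mature_yield_t_ha):
    ramp = (0.0, 0.25, 0.50, 0.75, 1.0)  # assumed ramp-up to mature yield
    flows = [mature_yield_t_ha * f * SEED_PRICE
             - ANNUAL_COST - (ESTABLISH if yr == 1 else 0.0)
             for yr, f in enumerate(ramp, start=1)]
    return npv(flows)

for y in (2.0, 3.0):
    print(f"mature yield {y} t/ha -> NPV ~ US${plantation_npv(y):+.0f}/ha")
[/code]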

Jatropha, the authors note, was never considered economically important enough for domestication; as a result, seed and oil productivity is highly variable.

    ...its phenotypic, physiological, and biochemical variability expressed in flowering age, intensity, and frequency, and seed size and oil content, is largely an epigenetic response to the varied environment it encounters as the phenotypic plasticity of genetic traits allows morphological and physiological adjustments with the environ. But such epigenetic accommodation lowers plant efficiency which is also reflected in its lowered seed production capacity.

    These observations are, however, nothing out of ordinary and should have been anticipated by the Planning Commission of India, the powerful apex body that decides national priorities and allocates funds for them, before taking up such a continent sized program involving millions of low income farmers. But the Commission may have relied too heavily on the opinion of one of its top functionaries, who expected an internal rate of return ranging from 19 to 28% across India. National planners’ enthusiasm for the species rubbed off easily on research organizations and Universities that rely heavily on the Planning Commission for funding and some of these institutions themselves became partners in raising Jatropha plantations. (Interpolation: sounds familiar. See the Global Warming superthread for other examples of this sort of behaviour.)

    It appears to be an extreme case of a well intentioned top down climate mitigation approach, undertaken without adequate preparation and ignoring conflict of interest, and adopted in good faith by other countries, gone awry bringing misery to millions of poorest people across the world. And it happened because the principle of “due diligence” before taking up large ventures was ignored everywhere. As climate mitigation and adaptation activities intensify attracting large investments there is danger of such lapses becoming more frequent unless “due diligence” is institutionalized and appropriate protocols developed to avoid conflict of interest of research organizations.
    —Kant and Wu

Resources
   
      Promode Kant and Shuirong Wu (2011) The Extraordinary Collapse of Jatropha as a Global Biofuel. Environmental Science & Technology, Article ASAP. doi: 10.1021/es201943
 
No oil indeed:

http://www.foreignpolicy.com/articles/2011/08/15/the_americas_not_the_middle_east_will_be_the_world_capital_of_energy?print=yes&hidecomments=yes&page=full

The Americas, Not the Middle East, Will Be the World Capital of Energy
Adios, OPEC.
BY AMY MYERS JAFFE | SEPT/OCT 2011

For half a century, the global energy supply's center of gravity has been the Middle East. This fact has had self-evidently enormous implications for the world we live in -- and it's about to change.

By the 2020s, the capital of energy will likely have shifted back to the Western Hemisphere, where it was prior to the ascendancy of Middle Eastern megasuppliers such as Saudi Arabia and Kuwait in the 1960s. The reasons for this shift are partly technological and partly political. Geologists have long known that the Americas are home to plentiful hydrocarbons trapped in hard-to-reach offshore deposits, on-land shale rock, oil sands, and heavy oil formations. The U.S. endowment of unconventional oil is more than 2 trillion barrels, with another 2.4 trillion in Canada and 2 trillion-plus in South America -- compared with conventional Middle Eastern and North African oil resources of 1.2 trillion. The problem was always how to unlock them economically.

But since the early 2000s, the energy industry has largely solved that problem. With the help of horizontal drilling and other innovations, shale gas production in the United States has skyrocketed from virtually nothing to 15 to 20 percent of the U.S. natural gas supply in less than a decade. By 2040, it could account for more than half of it. This tremendous change in volume has turned the conversation in the U.S. natural gas industry on its head; where Americans once fretted about meeting the country's natural gas needs, they now worry about finding potential buyers for the country's surplus.

Meanwhile, onshore oil production in the United States, condemned to predictions of inexorable decline by analysts for two decades, is about to stage an unexpected comeback. Oil production from shale rock, a technically complex process of squeezing hydrocarbons from sedimentary deposits, is just beginning. But analysts are predicting production of as much as 1.5 million barrels a day in the next few years from resources beneath the Great Plains and Texas alone -- the equivalent of 8 percent of current U.S. oil consumption. The development raises the question of what else the U.S. energy industry might accomplish if prices remain high and technology continues to advance. Rising recovery rates from old wells, for example, could also stem previous declines. On top of all this, analysts expect an additional 1 to 2 million barrels a day from the Gulf of Mexico now that drilling is resuming. Peak oil? Not anytime soon.

The picture elsewhere in the Americas is similarly promising. Brazil is believed to have the capacity to pump 2 million barrels a day from "pre-salt" deepwater resources, deposits of crude found more than a mile below the surface of the Atlantic Ocean that until the last couple of years were technologically inaccessible. Similar gains are to be had in Canadian oil sands, where petroleum is extracted from tarry sediment in open pits. And production of perhaps 3 million to 7 million barrels a day more is possible if U.S. in situ heavy oil, or kerogen, can be produced commercially, a process that involves heating rock to allow the oil contained within it to be pumped out in a liquid form. There is no question that such developments face environmental hurdles. But industry is starting to see that it must find ways to get over them, investing in nontoxic drilling fluids, less-invasive hydraulic-fracturing techniques, and new water-recycling processes, among other technologies, in hopes of shrinking the environmental impact of drilling. And like the U.S. oil industry, oil-thirsty China has also recognized the energy potential of the Americas, investing billions in Canada, the United States, and Latin America.

The revolution-swept Middle East and North Africa, meanwhile, will soon be facing up to an inconvenient truth about their own fossil-fuel legacy: Changes of government in the region have historically led to long and steep declines in oil production. Libya's oil output has never recovered to the 3.5 million barrels a day it was producing when Col. Muammar al-Qaddafi overthrew King Idris in 1969; instead it has been stuck at under 2 million barrels a day for three decades and is now close to zero. Iran produced more than 6 million barrels a day in the times of the shah, but saw oil production fall precipitously below 2 million barrels a day in the aftermath of the 1979 Islamic Revolution. It failed to recover significantly during the 1980s and has only crept back to 4 million in recent years. Iraq's production has also suffered during its many years of turmoil and now sits at 2.7 million barrels a day, lower than the 3.5 million it produced before Saddam Hussein came to power.

The Arab Spring stands to complicate matters even further: A 1979-style disruption in Middle Eastern oil exports is hardly out of the question, nor are work stoppages or strikes by oil workers caught up in the region's political zeitgeist. All in all, upwards of 21 million barrels a day of Arab oil production are at stake -- about a quarter of global demand. The boom in the Americas, meanwhile, should be food for thought for the Middle East's remaining autocrats: It means they may not be able to count on ever-rising oil prices to calm restive populations.

This hydrocarbon-driven reordering of geopolitics is already taking place. The petropower of Iran, Russia, and Venezuela has faltered on the back of plentiful American natural gas supply: A surplus of resources in the Americas is sending other foreign suppliers scrambling to line up buyers in Europe and Asia, making it more difficult for such exporters to assert themselves via heavy-handed energy "diplomacy." The U.S. energy industry may also be able to provide the technical assistance necessary for Europe and China to tap unconventional resources of their own, scuttling their need to kowtow to Moscow or the Persian Gulf. So watch this space: America may be back in the energy leadership saddle again.
 
What a lot of folks don't realize is that much of the technology and many of the recovery techniques used to recover shale gas and new oil from old wells were, and are, developed right here in Canada. We have long been the innovators of the worldwide oil industry. Progressive cavity pumps and SAGD both started here.

Hydraulic fraccing as well, although from the news you would think this is brand-new technology. Not so.... It has been used here and elsewhere for decades. I am unsure why it is such an issue with the environmental movement nowadays. Is it the new bogeyman now that they are being called on their BS about the oilsands?
 
Frakking is not new; what is new is its application to cracking oil deposits and the refining of the technology to crack open natural gas deposits in very hard rock. I suppose this might be considered a case of something reaching a tipping point: enough people have heard about it now that it is "common" knowledge.

Another unconventional source is being examined:

http://energeopolitics.com/2011/08/23/pre-development-of-huge-utah-oil-shale-block-begins-energy/

[quote]
Pre-development of huge Utah oil shale block begins
August 23, 2011

TomCo Energy is a London-based company which owns leases on over 3,000 acres of oil shale land in Utah’s Uinta Basin. As I have noted several times (most recently just last week), the Uinta Basin is the site of the massive Eocene Green River Shale formation – potentially the largest reservoir of unconventional petroleum in the world. With total reserves estimated at up to 1.3 trillion barrels, and ultimately recoverable reserves of 800 billion barrels or more, this formation holds three times or more the amount of Saudi Arabia’s proven reserves. Unlocking this formation would change the energy outlook of the nation – and of the world – for a century or more.

Today, TomCo has announced that it has awarded contracts toward the development of this resource. These are pre-development contracts intended to provide the baseline operational and environmental information required to move forward.

There is a long way to go in developing this resource. As I have noted in the past, the Uinta Basin is a place of scenic beauty, and we can anticipate very strong resistance from environmental interests to any development. However, TomCo will be able to avoid the most visible environmental damage by refraining from traditional mining methods. TomCo envisions using an in situ heating process to develop its lease blocks, a new approach it calls EcoShale In-Capsule technology. While the EcoShale process sounds similar to the in situ process long under development by Shell Oil, a major difference seems to be in the use of water. Shell’s process was said to utilize three barrels of water for every barrel of oil produced. The EcoShale process, on the other hand, claims to use no water. If so, this would of course be a tremendous development and would disarm one of the main points of attack against shale oil development. If successful, TomCo’s development would mark the beginning of The Second Age of Oil.

EGP will watch this story closely over the coming months.
[/quote]
 
$2.00 a gallon?

http://www.nationalreview.com/articles/277246/achieving-2-gas-robert-zubrin

Achieving $2 Gas
It’s possible, with the right policy.

Republican presidential contender Michele Bachmann has said that if she is elected, gas prices will fall to $2 per gallon. Such promises have understandably been greeted with considerable skepticism. But $2 gas is exactly what America needs. The question is, how can we get it?

We can’t do it just by expanding domestic drilling. In order for gasoline prices to fall to $2 per gallon, oil prices must be cut to $50 per barrel. And oil prices are set globally, with the dominating influence being the OPEC oil cartel. Since 1973, this cartel, which controls 80 percent of the earth’s commercially viable oil reserves, has refused to expand production, thus keeping petroleum prices artificially high. While, with a more pro-business government, the United States might conceivably be able to expand its production by a million or two barrels per day, OPEC could easily counter by cutting its production to match or, more likely, by simply continuing its non-expansion policy and letting increased Chinese demand take care of the slack.

If we are ever to get $2 gas, the power of OPEC to control oil prices needs to be broken. The United States Congress could do this with a stroke of the pen, simply by passing the bipartisan Open Fuel Standard bill (H.R. 1687). This act would effectively destroy OPEC by requiring that all new cars sold in the USA be fully flex fuel, able to run equally well on gasoline, ethanol, and — most important — methanol. This latter capability is critical because methanol can be, and is, made cheaply in large quantities from coal, natural gas, or any kind of biomass without exception. The United States has only 4 billion tons of oil reserves, but we have 270 billion tons of coal, vast amounts of natural gas, and an enormous capacity to produce biomass. By requiring that all cars sold here (and thus all cars made worldwide) be compatible with methanol, the act would force oil to compete with a fuel whose sources are not controlled by the cartel, and that we and our allies possess in abundance.

Methanol has only about half the energy per gallon of gasoline, but it is 105 octane, which means it can be burned more efficiently. Taken together, these two factors make methanol’s current spot price of $1.38 per gallon roughly competitive with $2 gasoline.
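(Interpolation: it is worth spelling out that arithmetic. On energy content alone, $1.38 methanol is $2.76 per gasoline-equivalent gallon; "roughly competitive with $2 gasoline" therefore depends on engines actually exploiting the higher octane. The size of that efficiency credit below is my assumption.)

[code]
# Gasoline-equivalent pricing of methanol, using the article's own ratios.

METHANOL_SPOT = 1.38  # US$/gal, as quoted
ENERGY_RATIO = 0.5    # "about half the energy per gallon" of gasoline
OCTANE_CREDIT = 1.30  # assumed ~30% engine-efficiency gain at 105 octane

energy_only = METHANOL_SPOT / ENERGY_RATIO  # $2.76 per gasoline-equivalent gallon
with_credit = energy_only / OCTANE_CREDIT   # ~$2.12

print(f"energy-only equivalent: ${energy_only:.2f}/gal")
print(f"with octane credit:     ${with_credit:.2f}/gal (vs. $2.00 gasoline)")
[/code]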

Of course, the passage of the OFS bill would not cause gasoline prices to crash instantly. While it would no doubt hit oil futures hard, and thus cut the speculative premium on petroleum prices, the most immediate result of allowing methanol to compete against gasoline in the vehicle-fuel market would be to send methanol prices up, perhaps by as much as 60 percent. This situation would not, however, last for long. Methanol can be made and sold profitably today for $1.38 per gallon. At a 60 percent markup, its manufacture would be super-profitable, and massive amounts of capital would rush in to expand production. This would drive the price of methanol down, dragging gasoline and oil prices down with it, until methanol reached a price point where its production offered no greater profit than that prevailing in the economy at large. The fact that methanol would reach this price — what Adam Smith would term its natural price — follows from the fact that the sources to make methanol are plentiful and diverse, so that no cartel can artificially limit its production.

This underscores the key issue. There is not a free market in oil. Adjusted for inflation, the price of oil has increased eightfold since 1973, but OPEC production has not increased at all. In a free market, such a price increase would spur increased investment, with subsequent expanded production driving the price right back down again. That is why the inflation-adjusted price of coal, and nearly every other industrial commodity, has not risen in four decades. But because of the cartel, oil production has not responded to price increases in the way that it should in a properly functioning capitalist economy. In order for the free-enterprise system to do its work and deliver the cheap fuel the world needs, the ability of this cartel to limit the world’s liquid-fuel supplies needs to be broken. The Open Fuel Standard bill would accomplish that.

High oil prices are wrecking our economy. Since the United States imports 5 billion barrels of oil per year, the current price of nearly $90 per barrel will hit us for $450 billion this year alone, a huge tax on our economy. As a result, millions of jobs and thousands of businesses are being lost. If this wealth-draining process is allowed to continue, fiscal necessity will require us to withdraw the military forces protecting our national interests abroad, without a shot being fired.

Instead of seeking to exploit this catastrophe by placing its blame on their opponents, or posing with empty promises of salvation contingent upon their promotion to higher office, politicians need to take action. Two-dollar gas is not just a nice idea for inclusion in a campaign speech. It’s a critical necessity for economic recovery.

Either we break the cartel, or the cartel breaks us. The Open Fuel Standard bill needs to be passed.

— Robert Zubrin is a member of the Board of Advisors of Americans for Energy and author of Energy Victory: Winning the War on Terror by Breaking Free of Oil.
 