
A scary strategic problem - no oil

If this keeps up, Quebec will be the only have-not province.
 
Interesting. Many people don't take the train because it is inconvenient (to use an example I am familiar with, it takes just as long to drive from London to Kingston as it does to take the train, because of a three-hour stopover in Toronto):

http://nextbigfuture.com/2011/02/train-scheduling-algorithm-optimized.html#more

Train scheduling algorithm optimized for shorter commuter travel times can shorten average trips from 60 to 48 minutes

Dr. Tal Raviv and his graduate student have developed a tool that makes passenger train journeys shorter, especially when transfers are involved — a computer-based system to shave precious travel minutes off a passenger's journey.

    Dr. Raviv's solution, the "Service Oriented Timetable," relies on computers and complicated algorithms to do the scheduling. "Our solution is useful for any metropolitan region where passengers are transferring from one train to another, and where train service providers need to ensure that the highest number of travellers can make it from Point A to Point B as quickly as possible.

    "Let's say you commute to Manhattan from New Jersey every day. We can find a way to synchronize trains to minimize the average travel time of passengers," says Dr. Raviv. "That will make people working in New York a lot happier."

    The project has already been simulated on the Israel Railway, reducing the average travel time per commuter from 60 to 48 minutes. The tool can be most useful in countries and cities, he notes, where train schedules are robust and very complicated.
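
The article doesn't describe Dr. Raviv's algorithm in detail, but the core idea of synchronizing transfers can be illustrated with a toy sketch. Everything below (arrival times, headway, the brute-force search) is invented for illustration; a real "Service Oriented Timetable" is a much larger optimization problem.

```python
# Toy illustration of transfer synchronization (not Raviv's actual algorithm).
# Line A trains arrive at a transfer station at known times; line B departs
# every `headway` minutes starting at some offset. We pick the offset that
# minimizes the average wait between arrival and the next line B departure.

def average_wait(arrivals, offset, headway):
    """Mean minutes passengers wait for the next line B departure."""
    total = 0.0
    for t in arrivals:
        # Wait until the next departure at offset, offset+headway, ...
        total += (offset - t) % headway
    return total / len(arrivals)

def best_offset(arrivals, headway, step=1):
    """Brute-force search over candidate offsets (minutes past the hour)."""
    candidates = [m * step for m in range(0, headway, step)]
    return min(candidates, key=lambda o: average_wait(arrivals, o, headway))

if __name__ == "__main__":
    # Hypothetical line A arrival times (minutes past the hour) at the hub.
    arrivals = [3, 19, 34, 52]
    headway = 15  # line B departs every 15 minutes
    o = best_offset(arrivals, headway)
    print(f"best offset: {o} min, average wait: {average_wait(arrivals, o, headway):.1f} min")
```

Scale that kind of decision up to every transfer point on a network and you get the sort of scheduling problem the Tel Aviv group is solving.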
 
More on Joule Unlimited's claims. This would be very exciting if they are as good as claimed; oil prices would collapse and long-term energy sufficiency would be assured:

http://nextbigfuture.com/2011/02/joule-unlimited-claims-5-50-times-more.html#more

Joule Unlimited claims 5-50 times more fuel per acre than other biofuel processes

Schematic comparison between algal biomass and direct photosynthetic processes. The direct process, developed by Joule and called Helioculture™, combines an engineered cyanobacterial organism supplemented with a product pathway and secretion system to produce and secrete a fungible alkane diesel product continuously in a SolarConverter™ designed to efficiently and economically collect and convert photonic energy. The process is closed and uses industrial waste CO2 at concentrations 50–100× higher than atmospheric. The organism is further engineered to provide a switchable control between carbon partitioning for biomass or product. The algal process is based on growth of an oil-producing culture in an industrial pond on atmospheric CO2, biomass harvesting, oil extraction, and chemical esterification to produce a biodiesel ester

A new dawn for industrial photosynthesis by Dan E. Robertson, Stuart A. Jacobson, Frederick Morgan, David Berry, George M. Church and Noubar B. Afeyan

The conversion efficiency for the direct process is about seven times larger than that for an algal open pond.

The article, entitled “A New Dawn for Industrial Photosynthesis,” quantitatively affirms the advantages of Joule’s direct conversion process as compared to the indirect production of fuel from biomass, including algae. Though both processes aim to convert solar energy into fuel, the latter method requires the costly culturing, harvesting and processing of algal biomass – a multi-step intermediate stage that Joule’s process avoids. Moreover, Joule’s process directly yields hydrocarbons that are fungible with existing diesel infrastructure, unlike the biodiesel product that is produced from algal oil.

Highlights include:

* Based on empirical measurements, Joule can directly produce 15,000 gallons of diesel per acre annually, as compared to 3,000 gallons of biodiesel produced indirectly from algae.

* The solar-to-product conversion efficiency of Joule’s direct, continuous process for producing diesel, ethanol and chemicals is between 5 and 50X greater than any biomass-dependent process, and gains additional efficiencies by avoiding downstream refining.

* Joule’s combined advances in genome engineering, solar capture and bioprocessing result in photosynthetic conversion efficiency of more than 7% relative to available yearly solar energy striking the ground, many times greater than prior industry assumptions.

  Several emerging technologies are aiming to meet renewable fuel standards, mitigate greenhouse gas emissions, and provide viable alternatives to fossil fuels. Direct conversion of solar energy into fungible liquid fuel is a particularly attractive option, though conversion of that energy on an industrial scale depends on the efficiency of its capture and conversion. Large-scale programs have been undertaken in the recent past that used solar energy to grow innately oil-producing algae for biomass processing to biodiesel fuel. These efforts were ultimately deemed to be uneconomical because the costs of culturing, harvesting, and processing of algal biomass were not balanced by the process efficiencies for solar photon capture and conversion. This analysis addresses solar capture and conversion efficiencies and introduces a unique systems approach, enabled by advances in strain engineering, photobioreactor design, and a process that contradicts prejudicial opinions about the viability of industrial photosynthesis. We calculate efficiencies for this direct, continuous solar process based on common boundary conditions, empirical measurements and validated assumptions wherein genetically engineered cyanobacteria convert industrially sourced, high-concentration CO2 into secreted, fungible hydrocarbon products in a continuous process. These innovations are projected to operate at areal productivities far exceeding those based on accumulation and refining of plant or algal biomass or on prior assumptions of photosynthetic productivity. This concept, currently enabled for production of ethanol and alkane diesel fuel molecules, and operating at pilot scale, establishes a new paradigm for high productivity manufacturing of nonfossil-derived fuels and chemicals.

Sum of individual contributions and accumulated photon losses for two fuel processes and a theoretical maximum for energy conversion. The losses are represented on a logarithmic scale and accumulated serially for the processes beginning with the percent of PAR in empirically measured solar ground insolation. Total practical conversion efficiency after accounting for losses is indicated by the green arrows
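
As a rough back-of-the-envelope check on the 15,000 gallons/acre figure implied by the ~7% conversion efficiency: the insolation and diesel energy-content values below are my own round-number assumptions, not numbers from the paper, but the result lands in the same range.

```python
# Rough sanity check of the ~15,000 gal/acre/yr claim from the ~7% efficiency.
# Insolation and diesel energy content are assumed round numbers.

insolation_kwh_m2_yr = 2000        # assumed annual solar insolation at a sunny site
efficiency = 0.07                  # claimed photon-to-fuel conversion efficiency
acre_m2 = 4047                     # square metres per acre
diesel_kwh_per_gal = 40.0          # approximate energy content of a gallon of diesel

fuel_energy_kwh = insolation_kwh_m2_yr * efficiency * acre_m2
gallons_per_acre = fuel_energy_kwh / diesel_kwh_per_gal
print(f"~{gallons_per_acre:,.0f} gallons of diesel-equivalent per acre per year")
# ~14,000 gal/acre/yr -- the same order as the 15,000 figure quoted above.
```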
 
High energy demand = high prices, which really focus the mind. The description is a bit garbled (gravity has very little to do with the process described here, but centrifugal force has everything to do with it):

http://www.prweb.com/releases/2011/02/prweb5084164.htm

Breakthrough Gas Separation Technology Poised to Allow Use of Up to 30% of The World’s Natural Gas Reserves

Small nozzles to have a large impact on energy, computer chips, the environment, and 40 to 70% of capital and processing costs in industry

  "These nozzles look simple, but there has been a great deal of effort and analysis that has gone into their development"

Ponte Vedra, Florida (PRWEB) February 22, 2011

Unique methods of separating gases are in development by Armington Technologies, LLC’s affiliate Tenoroc, LLC.

Separating impurities from highly contaminated natural gas reserves is the main focus of Tenoroc’s research. According to an Innovations Report article of 5/12/2006, “Huge underground gas reserves, up to 16% of the total reserves, remain unused. The natural gas in these fields is too contaminated for exploitation. With existing technology, cleaning these fields is much too costly… It is almost impossible to convey the economic value of 16% of the world’s reserves. They represent more than 360 times the annual natural gas production of Shell, Exxon, and BP put together.” Other estimates put the level of these unused reserves as high as 30%.

Established in 2005, Tenoroc has been developing its patent pending curved nozzle technologies since 2006 at its Mankato, Minnesota research facility. These small nozzles, with no moving parts, incorporate gravitational forces that can exceed conventional spinning centrifuges, achieving improved separation levels. Tenoroc’s nozzle technology is divided into two areas, “condensation based separation” and “gas-to-gas separation”.

“Condensation based separation” is a method of using the expansion that occurs within the nozzle to convert one gas in a mix of gases to a liquid. When the liquefied constituent and the remaining, different gas constituents are exposed to the curve in the nozzle, centrifugal force drives the liquid, which is heavier than the gas, to the outside wall, and it exits through the outside wall outlet.

According to Paul Donovan, Director of Technology Development, “We see our niche in the natural gas industry in applications where there are high levels of contamination, too high for today’s methods of cleansing natural gas. We also hope to improve or supplement cleansing on less contaminated natural gas currently being processed. Our small footprint and versatility in placement is an added bonus.” When asked about commercialization Mr. Donovan added, “The key to commercialization will be our ability to license our technology to a strategic partner that provides equipment and service to the natural gas processing industry. We intend to begin demonstrating our prototype immediately as a first step in this process.”

“Gas-to-gas separation” and isotope enrichment make use of the extreme gravitational force produced by the nozzle curve to move the heavier gas to the outside wall where it exits away from the lighter gas.
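
To put the comment at the top of this post in context: what is doing the work here is centrifugal acceleration, a = v^2/r, and even a modest curved nozzle can produce enormous accelerations. A rough sketch with invented numbers (Tenoroc's actual velocities and radii are not given in the release):

```python
import math

# Centrifugal acceleration for gas following a curved path: a = v^2 / r.
# The velocity and radius below are illustrative guesses, not Tenoroc's design values.

g = 9.81                 # m/s^2
v = 400.0                # gas velocity in the nozzle, m/s (near sonic; assumed)
r = 0.002                # radius of curvature of the nozzle bend, m (assumed)

a = v**2 / r
print(f"curved nozzle: ~{a:.1e} m/s^2, or about {a/g:,.0f} g")

# Compare with a mechanical gas centrifuge spinning at 60,000 rpm with a 0.1 m radius.
omega = 60000 * 2 * math.pi / 60     # rad/s
a_centrifuge = omega**2 * 0.1
print(f"spinning centrifuge: about {a_centrifuge/g:,.0f} g")
```

Millions of g from a static part with no moving components is the pitch, which is why the "gravitational forces that can exceed conventional spinning centrifuges" wording, while garbled, is gesturing at something real.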

Michael Bloom, Principal Tenoroc Researcher, offered, “In addition to natural gas, an application that we have our sights on is isotope enrichment, including isotopically pure silicon for semiconductor wafers. Pure silicon has been studied by the industry for years and is believed to be the answer to the debilitating heat generated by today’s computers. However, no one has ever been able to purify the gas that silicon is made from at a reasonable cost or in the quantities needed.” Mr. Bloom has almost two decades of experience developing separation methods. He was granted a patent on a gas centrifuge in 1999.

“These nozzles look simple, but there has been a great deal of effort and analysis that has gone into their development,” stated Tenoroc’s President, Gary Capuano. He added, “The need for improved separation methods is all around us. The Department of Energy estimates that separation processes represent 40 to 70 percent of both capital and operating costs in industry. They also account for 45 percent of all the process energy used by the chemical and petroleum refining industries every year. There are numerous applications for our technologies, including water de-salination. For now though, we must maintain our focus, and that focus is natural gas.”

When asked about the challenges of commercializing a technology, Capuano replied, “Our company’s management has enjoyed success licensing technology in the past. With today’s interest in energy and the environment, this technology seems to have put us in the right place at the right time. We intend to find the correct industry partner for each application while we continue to improve what we already have.”

Please direct inquiries to:
Armington Technologies, LLC
P.O. Box 3492
Ponte Vedra, FL 32004
Tel: 407-236-7023
Fax: 904-285-2156
pd(at)armingtontech(dot)com
 
Battery power that is more energy-dense and hence more practical:

http://nextbigfuture.com/2011/03/low-temperature-molten-salt-battery-ten.html

Low temperature molten-salt battery ten times cheaper than lithium ion by 2015

Sumitomo Electric Industries Ltd., in partnership with Kyoto University, has developed a lower temperature molten-salt rechargeable battery that promises to cost only about 10% as much as lithium ion batteries. Sumitomo intends to commercialize the battery around 2015 and market it as an alternative to lithium-ion batteries used in automobiles and homes.



    The new battery uses sodium-containing substances melted at a high temperature. The technology has been around for decades, but existing molten-salt batteries require keeping the electrolyte in a liquid state at a temperature higher than 300 C. Sumitomo Electric worked with researchers at Kyoto University to develop a sodium material that melts at 57 C.

    Having roughly double the energy density of a typical lithium ion battery, the new battery would let an electric vehicle travel twice as far as a lithium ion battery of the same size. Automakers would be able to reduce the space taken up by batteries in their EVs. Molten-salt batteries also boast high heat and impact resistance and are said to be less susceptible to igniting than lithium ion batteries.

    Sodium is cheaper than lithium because it is in abundant supply. The new battery is expected to be priced at about ¥20,000 per kilowatt-hour--about 10% as much as domestic lithium ion batteries and one-fifth as much as Chinese products.

    But unlike a room-temperature lithium ion battery, the new battery must be kept at 80 C to output power. So for the time being, Sumitomo Electric envisions it being used in applications where it is operating continuously, such as homes and electric buses. The company and the university have applied to have the battery patented.


Molten-salt batteries use highly conductive molten salts as an electrolyte, and can offer high energy and power densities. The ZEBRA battery is an example of a molten salt battery. A drawback to the general class of molten salt batteries has been high operating temperatures.
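
A quick illustration of what "roughly double the energy density" and "¥20,000 per kWh" would mean for an EV pack. The 24 kWh pack size and the lithium-ion baseline density are my own assumptions; the cost figures follow from the article's "about 10% as much" claim.

```python
# Illustrative pack comparison using the article's claims. The pack size and the
# Li-ion volumetric density are assumptions, not Sumitomo figures.

pack_kwh = 24                        # assumed EV pack size
liion_wh_per_l = 250                 # assumed volumetric density of a typical Li-ion pack
liion_yen_per_kwh = 200_000          # implied by "about 10% as much" vs ¥20,000/kWh

molten_wh_per_l = 2 * liion_wh_per_l     # "roughly double the energy density"
molten_yen_per_kwh = 20_000              # article figure

print(f"Li-ion pack:      {pack_kwh*1000/liion_wh_per_l:.0f} L, "
      f"{pack_kwh*liion_yen_per_kwh:,} yen")
print(f"molten-salt pack: {pack_kwh*1000/molten_wh_per_l:.0f} L, "
      f"{pack_kwh*molten_yen_per_kwh:,} yen")
# Same energy in half the volume (or twice the range in the same volume), at
# roughly a tenth of the cell cost -- before accounting for the 80 C
# thermal-management overhead mentioned above.
```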
 
An update on the world's oil reserves:

http://www.cnbc.com/id/33550165?slide=1
(derived from Energy Information Administration, Department of Energy)

I converted the info into a chart for myself so I could read it better.  A few things stuck out for me though:

  • The US consumes twice as much oil as it is able to produce (it produces 9.14 million barrels per day and consumes 18.81 million).
  • The US imports virtually all of the oil that Brazil exports -- 2.52 million barrels out of the 2.57 million Brazil produces -- and this source is closer to finite, with 11.65 billion barrels in reserve.  Brazil itself consumes an additional 2.57 million barrels (so Brazil must rely on oil exported back to it?).
  • The US relies heavily on Saudi Arabian oil exports: over a million barrels (Dec. 2010).
  • Canada has the second-largest oil reserves in the world, 175.2 billion barrels in reserve, 13% of the world total.  We produce 3.20 million barrels, consume 2.15 million and export 2.71 million to the US, which means we also rely on imported oil for our own needs (see the quick arithmetic check below).
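
A quick arithmetic check of those figures (all numbers taken from the list above, in millions of barrels per day):

```python
# Quick check of the figures listed above (millions of barrels per day).

us_production, us_consumption = 9.14, 18.81
ca_production, ca_consumption, ca_exports_to_us = 3.20, 2.15, 2.71

us_net_imports = us_consumption - us_production
print(f"US net imports: {us_net_imports:.2f} Mbbl/d "
      f"({us_net_imports/us_consumption:.0%} of consumption)")

ca_left_for_domestic_use = ca_production - ca_exports_to_us
ca_imports_needed = ca_consumption - ca_left_for_domestic_use
print(f"Canada keeps {ca_left_for_domestic_use:.2f} Mbbl/d after exports, "
      f"so ~{ca_imports_needed:.2f} Mbbl/d must be imported")
```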

President Jimmy Carter tried to warn the American people re: oil dependence and consumption patterns in general:  http://www.pbs.org/wgbh/americanexperience/features/primary-resources/carter-energy/
It looks to me like an energy crisis.

How do we protect our democracy and the well-being of our citizens when BIG OIL has such huge lobbying power in politics?  With the Bush family it was easy to see the relationship between BIG OIL and politics, and likewise with Cheney's energy companies.  Having to suck up to Saudi Arabia, despite their horrid human rights abuses.

Obviously the dependence is high in industries, but we can develop useful alternative energies for consumers, and bring great reductions there.

Canada has huge oil reserves, some operating drills in Alberta and Saskatchewan, but also the Tar Sands.  They've developed cheaper extraction, but the impact of such intensive industry on water is troubling -- and water might be our greatest asset that needs to be protected -- humans can't live without clean water.

On geothermal energy: I have a friend, a retired scientist (from Atomic Canada), who has developed an Atmospheric Energy System -- it can both heat and cool houses and buildings.

Some of the barriers to development include reduced funding for research and development in Canada and for green energy development, so he's been trying to find corporate investors, but when the research end is underfunded, it makes that difficult.  He did arrange for a test home via private funding through one of the colleges.  So far, so good -- the house is warm and cozy through the winter months.

I have some research papers comparing the Enwave system and Okotoks in Alberta with the Atmospheric Energy system; the AE system seems to out-perform them.

We have brilliant minds here in Canada and in the US; it's a matter of the system working to support innovation, research and development, without hindrance from the powerful lobbying of energy corporations.

 
Is there any publicly available information on this system? A link to a company website would be nice.

From what you have described so far it sounds like a heat pump, except its reservoir is the atmosphere rather than the ground or water.

 
Atmospheric Energy

This is the test house:
http://volkerthomsen.com/ae-atmospheric-energy-storage-system/


From: http://sustainability-journal.ca/  tolmie129@rogers.com

Basic Principles of AE systems:

Atmospheric Energy Systems

Operating at ambient ground temperature
If you inject heat into the ground in a way that permanently raises the temperature of the ground then there will be a constant loss of the injected heat to the surrounding ground. Such a loss is acceptable in the case of a closely packed housing community – an AE Array – but not in cases where the heat stores are widely separated. For such cases the ground temperature can be cycled annually above and below the ambient ground temperature so that there is no net loss or gain to the surrounding ground at the end of the annual cycle. In that case the amount of energy that you recover from the ground is exactly the same as what was injected. It is a lossless system that illustrates the principle of conservation of energy.

Injection deficiency
In practice it is desirable to inject slightly less heat than is expected to be recovered. In that case the surrounding ground will automatically make up for the deficiency so the system will still deliver the required amount of heat. There are a number of advantages to operating in such an imbalanced mode:
(1) Since the total building heat load varies from year to year this takes care of that variance
(2) The capital cost can be reduced because the borehole depth is reduced
(3) If there is a net deficit then the direction of heat flow outside of the store will always be in the direction of the store, throughout the year and from year to year. This guarantees that there will be no conductive heat losses in such systems

Ground water flow
Operating with an injection deficit also ensures that there will be no net loss caused by the flow of water through the ground. While the ground is above the ambient temperature, ground water could carry away some of the heat, but in the winter, when the ground temperature falls below the ambient temperature, the ground water will carry heat into the storage volume. If there is an injection deficit then the gain in the winter will exceed the earlier loss, so the effect of ground water flow is to improve the performance of the system.

Multiple boreholes
If you were to use a single borehole to both inject and recover the heat then the above considerations would apply and over a period of many years you would recover all of the heat that you had injected. However, in a given year some of the injected heat will disperse out to distances that are too far from the injection site to make it possible to recover all of the injected heat. The injected heat has had twice as long a period to disperse so the winter recovery is relatively poor even though technically the injected heat will eventually be recovered. For that reason it is desirable to surround the injection site with recovery boreholes that are spaced far enough out so that they recover the heat that slowly moves outwards. Since both the injection and the recovery boreholes can be used in the winter this practice does not increase the total borehole length, so it does not have a substantial impact on the cost. Such systems can recover all of the injected heat during the following winter. (N.B. An AE Array can efficiently utilize a single borehole per house because it does not need to use the travel velocity to trap the heat.)

Ground heat exchangers
In a conventional ground source heat pump system the designer is dependent on the nature of the ground to determine the system performance. The amount of natural heat stored in the ground at the subject location, the past history of heat extraction, the ground water flow, the thermal conductivity of the rock or other ground material, its specific heat, etc., all affect the results. The equivalent calculations for an AE system are much simpler. It can be treated much like a storage battery. You still need to establish the capacity of the “battery” from the storage characteristics of the rock, but you can then proceed to inject and recover the amount of heat that will be needed for the building without the need of complex computer programs.
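
The "storage battery" analogy can be made concrete: the capacity of a borehole store is roughly the rock volume times its volumetric heat capacity times the allowed temperature swing. The store dimensions and rock properties below are generic textbook-style assumptions, not figures from the AE project:

```python
import math

# Rough capacity of a cylindrical borehole heat store, treated as a "battery":
# capacity ≈ volume × volumetric heat capacity of rock × temperature swing.
# All numbers below are generic assumptions, not values from the AE project.

radius_m = 5.0             # assumed radius of the storage volume
depth_m = 100.0            # assumed borehole depth
rock_c_mj_m3_k = 2.3       # typical volumetric heat capacity of rock, MJ/(m^3*K)
delta_t_k = 15.0           # assumed annual temperature swing above/below ambient

volume_m3 = math.pi * radius_m**2 * depth_m
capacity_mj = volume_m3 * rock_c_mj_m3_k * delta_t_k
capacity_kwh = capacity_mj / 3.6

print(f"store volume: {volume_m3:,.0f} m^3")
print(f"thermal capacity: ~{capacity_kwh:,.0f} kWh per annual cycle")
# Roughly 7.5e4 kWh -- the same order as the 60,000 kWh/yr delivered by the
# test-bed system described below.
```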

This simplicity of operation has led to the development of techniques for reducing the borehole depth and of coping with short term, diurnal and longer term load variations. In Canada, conventional GSHPs deliver from 20 to 30 watts per metre of borehole length (including the heat pump's contribution). The test bed AE system (a 40 kW, 60,000 kWh per year system) is capable of delivering 208 watts per metre, and AE systems do not diminish in output from year to year or interfere with nearby systems.
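
The difference between 20-30 W/m and 208 W/m translates directly into borehole length, and hence drilling cost, for a given peak load. A minimal check using the 40 kW figure above:

```python
# Borehole length needed for a 40 kW peak load at the quoted specific outputs.

peak_load_w = 40_000

conventional_w_per_m = 25      # midpoint of the 20-30 W/m quoted for GSHPs
ae_w_per_m = 208               # quoted for the test-bed AE system

print(f"conventional GSHP: ~{peak_load_w / conventional_w_per_m:,.0f} m of borehole")
print(f"AE test bed:       ~{peak_load_w / ae_w_per_m:,.0f} m of borehole")
# ~1,600 m vs ~190 m -- roughly an eight-fold reduction in drilling.
```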

AE systems utilize an energy source (the summer air) that has a capacity that is orders of magnitude greater than the total amount of energy needed to heat all of the buildings in Canada. They inherently provide a means of cooling those buildings by a process that requires relatively little power. The ground has sufficient capacity to store whatever amount of heat is needed, even for very large buildings. Their capital cost is competitive with conventional heating/cooling systems and the cost of the energy supply is limited to the cost of running the heat pumps and circulation pumps. That residual cost is minimized because the heat pumps operate with high COPs in both the winter and the summer.

Ron Tolmie is a retired scientist from Atomic Canada.  In his retirement, he's developed a burgeoning interest in local and global environmental impact issues.  This has been non-funded, out-of-his-pocket research with other out-of-their-pocket partners.  He lucked out in forming a partnership, with private funding from the former CEO of St. Lawrence College, Volker Thomsen.  We have great promise of innovation in Canada, but a lot of stumbling blocks re: underfunding of research and development.

Are banks eager to make loans on this sort of research and development -- is it in their interest (does it compete with the profitability/investment return of other energy sector investment and growth)?  Are they willing to fund research and development, something risky at the beginning stages?  More work on "marketing" can be done, and on seeking out corporate partners.  Ron's a scientist, not a business person.  This is a retirement hobby, an interesting one at that.

It looks like Stanford University is positioned to possibly embrace this technology, with a supportive education program: http://cee.stanford.edu/programs/atmosenergy/index.html

A 68% reduction in consumer household energy consumption would leave more for industry and military needs, and benefit Canada through the export of energy -- which would seem to be a win-win re: US energy needs, production and industry.  We could easily conform to the Kyoto Protocols without damaging industry, but there is this complex economic beast to deal with -- the marketplace and growth/profitability, the oil/energy companies which drive the economy. . .?  There are practical solutions, real sustainability within reach -- but it looks like the companies seek to drain every resource first for their profit, and poison our waters, which we will also need for future generations.  It looks like madness to me.  Our governments won't govern this way; powerful lobby groups will block practical innovation, along with the craziness of the market economy and an unwillingness, both ideological and pragmatic, to seek solutions to adjust for this.

I'm an idealist; my priorities would be different.  No one starves, no one is denied needed health care; care for our vulnerable, our elderly, children-at-risk, etc. -- people first, a humane society.  I'm not anti-capitalist, but critical of some of its destructive elements; I wish there were some thinking which allows for the health of citizens and the environment, with some help re: economic solutions to provide flex room for solutions that in the long run will benefit future generations.  It's about how things are prioritized: set the right priorities, have the planning to build the foundations to support them, and find the economic resolutions to support them (but there is no agenda existing among those in power to even entertain it -- alternative think tanks are not funded either, here in Canada).  But it seems that there is no middle ground.  It seems that when we're not at war abroad, we're bringing war against our own citizens -- that, to me, is a broken system.  I stand up for veteran rights, on principle; that's who I am.  I do the same for the elderly, or the sick -- my compassion goes beyond self-interest (only that I do self-identify with oppression; that's PTSD-related) -- I retained older-generation values, an idealist not willing to bend my principles.

Are there economic solutions which don't require war on the basic dignity of our own citizens?  I'm appalled, for example, that seniors in my area have to wait 6 years for long-term care facility beds -- I can imagine the hardships for families, and without that essential care, the door is wide open to abuse and neglect of the vulnerable elderly -- when they paid into the system through their taxes.  What happened to the 'social contract'?  Have we explored all other options, rather than bringing hardship and suffering to our own citizens?  It seems like it's a power-grab among the most powerful, to the bitter end, literally. . .?  I'm not convinced that this is the only way.  I guess globally, we probably don't even own much of our own resources, so things like this are impacting our democracy.  Lost sovereignty -- what an enormous mess.  :(  (enormous debt. . .foreign ownership).  Too bad there weren't more patriotic billionaires to help out. . .:(  Probably not enough.

If I'm misunderstanding, I'm open to lots of correction.  I'm a neonate re: economics-- definitely no PhD here ;)
 
Field generators get a makeover. The second- and third-order effects (fewer trucks needed to transport fuel, fewer trucks and fuel needed in the logistics pipeline back to Canada, etc.) would help pay for the system:

http://www.technologyreview.com/energy/35080/page1/

Hybrid Power for the Frontline

Diesel-battery generators could cut troop fuel use at least by half.

By Phil McKenna

The U.S. Armed Forces are heavily burdened by the financial and tactical costs of transporting fuel to the battlefield. This July, in an effort to address the problem, the United States Marine Corps will deploy a pair of diesel generators coupled with powerful batteries to frontline troops in Afghanistan. The hybrid power systems should cut by 50 to 70 percent the amount of fuel needed to generate electricity, according to the manufacturer, Earl Energy of Portsmouth, Virginia.

The generators that U.S. military camps currently use operate inefficiently because they need to handle occasional peaks in demand. "You may have a 10-kilowatt generator that at any time is only producing 1.5 kilowatts of power to satisfy its load," says Doug Moorehead, president of Earl Energy. "So you are wasting 8.5 kilowatts of power that you aren't storing for later use," he says.

The diesel-battery hybrid the company developed instead runs generators for short bursts to maximize energy utilization. Not only does this satisfy the immediate energy requirements of a camp, but the system also charges a bank of lithium-ion batteries. When the batteries are fully charged, the generator shuts off and the system begins drawing power from the batteries instead. "Generators can go from running 24 hours a day to three to four hours a day—it's that good in some cases," Moorehead says.
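
The cycling described here amounts to a simple hysteresis controller: run the generator flat out until the batteries are full, then let the batteries carry the load until they drain to a low threshold. A toy simulation of that logic (all load, battery and threshold numbers are invented for illustration, not Earl Energy's parameters, and demand peaks are ignored):

```python
# Toy hysteresis controller for a diesel-battery hybrid. All numbers are
# illustrative, not Earl Energy's actual parameters.

GEN_KW = 18.0           # generator output when running
BATT_KWH = 40.0         # battery bank capacity
LOAD_KW = 3.0           # assumed average camp load
HIGH, LOW = 0.95, 0.20  # state-of-charge thresholds for switching

def simulate(hours=24, dt=0.1):
    soc, gen_on, gen_hours = 0.5 * BATT_KWH, False, 0.0
    for _ in range(int(hours / dt)):
        if soc <= LOW * BATT_KWH:
            gen_on = True          # battery nearly drained: start the generator
        elif soc >= HIGH * BATT_KWH:
            gen_on = False         # battery full: shut the generator off
        if gen_on:
            soc += (GEN_KW - LOAD_KW) * dt   # surplus charges the battery
            gen_hours += dt
        else:
            soc -= LOAD_KW * dt              # battery carries the load
        soc = min(soc, BATT_KWH)
    return gen_hours

print(f"generator runtime: ~{simulate():.1f} h per day instead of 24 h")
```

With these made-up numbers the generator runs about four hours a day, which is the same ballpark as the "three to four hours a day" Moorehead describes.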

The hybrid systems to be deployed in July will combine an 18-kilowatt diesel generator, similar to those currently used in the battlefield, with a 40-kilowatt-hour bank of lithium-ion batteries. The system will also include a 10-kilowatt photovoltaic solar panel array that will further lower fuel consumption.

The entire system, including photovoltaics, sells for "over $100,000," as compared with $80,000 to $100,000 for a similarly sized conventional generator, Moorehead says. The cost to Earl Energy for just the batteries—which have built-in safeguards against the high temperatures and dusty field conditions of Afghanistan—is $750 to $1,500 per kilowatt-hour of storage.

Moorehead estimates the system will pay for itself within seven to 12 months, depending on the cost of fuel. Bigger savings would come from using the hybrid system without the photovoltaics, which are expensive, and the company is now developing a stand-alone generator without the added solar power, he says.
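
The seven-to-twelve-month payback claim is easy to gut-check against the burdened fuel cost quoted further down in the article. The generator's baseline fuel-burn rate and the savings fraction below are my own assumptions:

```python
# Rough payback check. The baseline fuel burn and savings fraction are assumptions;
# the $20-40/gal delivered fuel cost is quoted later in the article.

system_cost = 100_000          # USD, "over $100,000"
baseline_gal_per_hr = 1.0      # assumed average burn of a lightly loaded 18 kW genset
savings_fraction = 0.6         # middle of the claimed 50-70% fuel reduction

for fuel_cost in (20, 40):     # USD per gallon, delivered to the front line
    saved_per_year = baseline_gal_per_hr * 24 * 365 * savings_fraction * fuel_cost
    months = system_cost / saved_per_year * 12
    print(f"fuel at ${fuel_cost}/gal: payback in ~{months:.0f} months")
# Roughly 6-11 months under these assumptions, in line with the quoted estimate.
```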

Maximizing the unit's energy efficiency requires repeatedly deep cycling the batteries—discharging them to their full capacity before recharging them. Conventional lead-acid and nickel-cadmium batteries quickly lose storage capacity if repeatedly deep cycled. The advanced lithium-ion technology in Earl Energy's batteries allows them to last close to 4,000 cycles, or 18 to 24 months, according to the company. Moorehead developed lithium-ion battery technology for battery maker A123 Systems before joining Earl Energy.

The hybrid power system also employs energy-management software that uses complex algorithms to maximize the generator's efficiency. Steven Minnihan, an analyst at Lux Research, says this energy management, together with the power electronics that allow the system to quickly switch between generator and battery power, is very important. "Companies will speak quite freely about the chemistry of the batteries they are using, but they are very tight-lipped about the energy-management systems and power electronics," he says. "It is becoming an increasingly important piece of intellectual property."

The technology is competitive in the battlefield because transporting diesel fuel to the front lines in heavily armed convoys is very expensive.  Moorehead, a former Navy SEAL, says delivery costs for fuel transported to the front lines in Afghanistan typically range between $20 and $40 per gallon.

Reducing fuel use on the front lines saves more than money. Ray Mabus, Secretary of the Navy, spoke of the "fully burdened" cost of fuel at a recent DOE Advanced Research Projects Agency for Energy (ARPA-E) conference in Washington, D.C. "For every 24 [fuel] convoys, we lose a soldier or a Marine [who] is killed or wounded guarding that convoy," Mabus said. "That's a high price to pay for fuel."

Earl Energy hopes to begin scaling up production of its high-efficiency generator systems. According to the company, the number of fuel convoys could be cut in half if its devices are widely deployed.
 
Thanks Thucydides for sharing that great article. 

It gives me further appreciation of the military as a driver of innovation, and of the positive cultural trickle-down effects when it comes to the production of science and innovation.  Life-and-death situations and responding to immediate needs help feed the impetus and the will to seek pragmatic resolutions re: energy (and the broader meanings of energy, including people as resources -- to protect life).

At home, alternative energy solutions can stimulate domestic economic growth: jobs for our newer generation in specialized trades re: production, manufacture, installation and maintenance -- products and services that benefit us directly, on the most pragmatic of levels (vs. over-consumption of imported 'dollar store' junk ;) ); help combat stagnation of growth for our younger starter generations; and free up resources for export while reducing the need/demand to import resources from countries we'd rather not be doing business with.

Problems remain, however, with declining wages and a shrinking middle class to do the purchasing. . . (which made us stronger economically in the past, made the country healthy, the backbone of a healthy democracy, IMO). . .

The article was interesting, as you said; the inefficiency of transporting oil to the front line points to unnecessary losses of people and resources.  That makes sense as well in a larger context, re: domestic oil needs and consumption and its import from afar -- though that is less immediately catastrophic, re: loss of life.

It's good to see the power of military common sense prevail -- the innovation and pragmatics of pursuing alternative energy solutions.  It's liberating to hear, another very positive light. "Winning" ;)  Maybe the power of Canadian and American innovation can pull us out of our economic troubles -- a sustainable recovery, light at the end of the tunnel?

An update on the AE systems: they are good to go -- they did it without reliance on government R&D funding.  Maybe that's better, with less wastage of taxpayer dollars -- the power of sacrifice among genuinely committed individuals and local partnerships, patriotic to both country and environment.  ;)

I imagine military members are directly engaged with problems re: government funding of military needs.
Defense Secretary Robert Gates expressed some frustration re: the F-35 (which has doubled in predicted cost, seemingly not as a result of Harper Government miscalculation, but from Lockheed Martin itself):
"The culture of endless money that has taken hold must be replaced by a culture of restraint."
http://en.wikipedia.org/wiki/Lockheed_Martin_F-35_Lightning_II  There are security concerns as well, re: stolen terabytes.  The planes need to be made safe for use; why that wasn't in the original figures, though. . . ?
 
Canada steps forward (again):

http://www.financialpost.com/news/features/Canadian+technology+goes+global/4459773/story.html

Canadian gas technology goes global

Claudia Cattaneo, Western Business Columnist, Financial Post · Mar. 17, 2011 | Last Updated: Mar. 17, 2011 7:06 PM ET

CALGARY — It took a decade for the Bakken play, centred in North Dakota and Saskatchewan, to evolve from a wild idea to one of North America’s most significant new sources of oil.

Now the technology that made it possible, a combination of horizontal drilling and targeted hydraulic fracturing — much of it developed and proven by Canadians — is on the cusp of global oil field deployment.

While it may not mean relief from high oil prices today, it could delay the sunset of the fuel if it’s as successful in the oil sector as it has been in the gas side of the business, where it made fields so much more prolific the flood of new supply has overwhelmed North American markets.

Adding to its promise is that this new type of unconventional oil involves a renewal of mature oil fields, many located outside politically risky places like the Middle East, where only a fraction of the oil has been recovered using old drilling methods.

Those on the leading edge of unlocking the world’s next Bakkens are enthusiastic about this new/old global oil source.

“There is absolutely huge interest from companies throughout the world that are looking for alternative sources of energy,” said Dan Themig, president of Packers Plus, the Calgary-based private company that pioneered the technology’s implementation in North America. It is now building operations in China, Argentina, Brazil, across North Africa, Romania, Russia, the U.K. and Norway.

“Certainly fracturing for oil will eventually be just every bit as big as fracturing in natural gas,” he said. “Almost every major oil basin, including Saudi Arabia, will utilize this technology to enhance recovery and revitalize some declining fields and possibly arrest the decline for a number of years.” Mr. Themig, an engineer, co-founded Packers Plus in 2001 with Peter Krabben and Ken Paltzat.

Houston-based Ryder Scott Petroleum Consultants, a top reservoir evaluation consulting firm, said in its latest quarterly newsletter, Reservoir Solutions, that net importers of oil like China, France and Poland are studying the Bakken as a model for similar deposits in their countries. Producers are already using Bakken technology in France’s Paris basin, Australia’s Georgina basin and New Zealand’s Taranaki basin.

“The U.S. and Canada have fairly drilled up basins, and so (they) tend to need to implement innovation faster,” said Scott Wilson, Denver-based senior vice president at Ryder Scott.

“If you have wells in Saudi Arabia that flow 10,000 b/d, you are probably not going to be spending $10-million a well to get a Bakken style producer. But if the best you can do is a Bakken style producer, your innovation will start to shine, and others can watch what you are doing and take advantage of that to the best of their abilities.”

Bakken-type oilfield revivals around the world could be the next frontier, said Ward Polzin, managing director in Denver at Tudor Pickering Holt & Co., who specializes in mergers and acquisitions in oil and gas shale for the top oil and gas investment bank.

“It’s not on the scale of the oil sands in Canada or the Middle East, but when you add it all up, it has the potential to be the next significant find,” he said.

In North Dakota, oil companies had given up on the Bakken, known since the 1950s but seen as a marginal field, until a wildcat geologist in Billings, Montana, Richard “Dick” Findley, owner of a struggling two-staff independent geology firm, started re-thinking the play.

His work led to the development of the Bakken in the last decade, making North Dakota the fourth-largest oil producing state in the U.S. by quadrupling oil production to about 400,000 barrels a day from 5,000 wells.

In Saskatchewan, production from the Bakken by companies like Crescent Point Energy Corp. increased from 750 b/d in 2004 to 65,000 b/d, and the number of producing wells increased from 100 to 1,800.

Geological studies estimate the ultimate hydrocarbon potential of the Bakken Formation is between 100-billion to 400-billion barrels. Saskatchewan’s share of the resource could range from 25-billion to 100-billion barrels.

But that’s just the beginning. Canadian producers, mostly smaller companies, successfully applied the same methods in new areas in Saskatchewan and Alberta like the Lower Shaunavon, the Pembina Cardium and continue to look at new possibilities, including an Alberta version of the Bakken.

In the U.S., the search for the next Bakken is in full swing, with the Niobrara formation near the border of Colorado and Wyoming showing the most promise, and similar approaches being tried in Texas and Oklahoma.

Like all new oil frontiers, it has its challenges.

The drilling advancements are costly and require oil prices above $50 a barrel to be economic. Once in production, wells decline rapidly.

Adoption of the technology globally could take a long time, as international companies gain expertise, often by investing in U.S. and Canadian firms that know how to do it. And political interference is a given, once the off-oil movement gets up to speed.

For those who are concerned about oil peaking, it could represent a brilliant reprieve with a big Canadian stamp.

Financial Post

ccattaneo@nationalpost.com
 
From the comments thread in The Truth About Cars, putting solar energy into perspective against hydrocarbons. Doing the numbers is very illuminating:

http://www.thetruthaboutcars.com/2010/10/the-chevy-volt-as-efficient-as-you-want-it-to-be/

The natural resources board of Canada says you can, on average, expect 5.2 kWh/day per square meter of surface area POINTED DIRECTLY AT THE SUN. This panel would be on a roof facing straight up…and a slightly curved roof at that…and not tracking the sun across the sky, either.  That’d be fine, at noon, if you lived on the equator.  At 55 degrees latitude, a horizontal surface will only collect about 5.2 sin(35deg) of that energy on average (on the equinox).  That brings us down to 57%, for 2.98 kWh/day.  Let’s reduce that further for the fact it doesn’t track the sun across the sky, and for shade, garage time, etc.  We’ll estimate half, for about 1.5 kWh/day.  With 10% cell efficiency likely to be found in affordable, lightweight cells, and neglecting any inefficiency in charging and discharging the batteries, we’re looking at 150 watt hours per day up in Canada.  That’s the energy equivalent of 0.4 ounces of gas -- just over a quarter of a shot glass, or about one cent worth of gas at $3/gallon.  One cent per day is $3.65 a year if you drive every day of the year, from a 1 square meter panel.  At $600 for the panel alone (neglecting fitment, wiring, electronics, etc.), you’ll break even in a brisk 164 years.

From that, subtract a bit for the energy required to accelerate the extra mass.  Even if you live in Arizona and get several times that, it still isn’t worth it.  Now…those solar race cars use top of the line everything, and only require a couple hundred watts to do 35 mph, so they can get away with it.  On a normal car though, a solar panel is nothing but green washing until we can coat an entire car in thin-film cells for under, say, 60 pounds weight gain and under, say, $450 or so.
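
That back-of-envelope chain is easy to reproduce. The numbers below come from the quoted comment itself; the only outside figure is the usual ~33.7 kWh energy content of a gallon of gasoline.

```python
import math

# Reproducing the arithmetic from the quoted comment.

insolation = 5.2                 # kWh/day per m^2, surface pointed at the sun
tilt_factor = math.sin(math.radians(35))   # horizontal roof at 55 deg latitude
derate = 0.5                     # no tracking, shade, garage time, etc.
cell_efficiency = 0.10           # affordable lightweight cells
panel_area_m2 = 1.0
panel_cost = 600                 # USD, panel only

kwh_per_day = insolation * tilt_factor * derate * cell_efficiency * panel_area_m2
gas_equiv_gal = kwh_per_day / 33.7               # gasoline ~33.7 kWh per gallon
dollars_per_year = gas_equiv_gal * 3.0 * 365     # gas at $3/gallon

print(f"{kwh_per_day*1000:.0f} Wh/day ≈ {gas_equiv_gal*128:.2f} oz of gas per day")
print(f"≈ ${dollars_per_year:.2f}/year, payback ≈ {panel_cost/dollars_per_year:.0f} years")
# ~150 Wh/day; the comment rounds the value to about a cent a day, which gives
# its 164-year figure -- either way the payback is well over a century.
```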
 
Using rivers and oceans to generate electricity:

http://nextbigfuture.com/2011/03/stanford-researchers-use-river-water.html#more

Stanford researchers use river water and salty ocean water to generate electricity

Stanford researchers have developed a rechargeable battery that uses freshwater and seawater to create electricity. Aided by nanotechnology, the battery employs the difference in salinity between fresh and saltwater to generate a current. A power station might be built wherever a river flows into the ocean.

Stanford researchers have developed a battery that takes advantage of the difference in salinity between freshwater and seawater to produce electricity.

Anywhere freshwater enters the sea, such as river mouths or estuaries, could be potential sites for a power plant using such a battery, said Yi Cui, associate professor of materials science and engineering, who led the research team.

The theoretical limiting factor, he said, is the amount of freshwater available. "We actually have an infinite amount of ocean water; unfortunately we don't have an infinite amount of freshwater," he said.

Nanoletters - Batteries for Efficient Energy Extraction from a Water Salinity Difference



The salinity difference between seawater and river water is a renewable source of enormous entropic energy, but extracting it efficiently as a form of useful energy remains a challenge. Here we demonstrate a device called “mixing entropy battery”, which can extract and store it as useful electrochemical energy. The battery, containing a Na2−xMn5O10 nanorod electrode, was shown to extract energy from real seawater and river water and can be applied to a variety of salt waters. We demonstrated energy extraction efficiencies of up to 74%. Considering the flow rate of river water into oceans as the limiting factor, the renewable energy production could potentially reach 2 TW, or 13% of the current world energy consumption. The mixing entropy battery is simple to fabricate and could contribute significantly to renewable energy in the future.

As an indicator of the battery's potential for producing power, Cui's team calculated that if all the world's rivers were put to use, their batteries could supply about 2 terawatts of electricity annually – that's roughly 13 percent of the world's current energy consumption.

The battery itself is simple, consisting of two electrodes – one positive, one negative – immersed in a liquid containing electrically charged particles, or ions. In water, the ions are sodium and chlorine, the components of ordinary table salt.

Initially, the battery is filled with freshwater and a small electric current is applied to charge it up. The freshwater is then drained and replaced with seawater. Because seawater is salty, containing 60 to 100 times more ions than freshwater, it increases the electrical potential, or voltage, between the two electrodes. That makes it possible to reap far more electricity than the amount used to charge the battery.

"The voltage really depends on the concentration of the sodium and chlorine ions you have," Cui said. "If you charge at low voltage in freshwater, then discharge at high voltage in sea water, that means you gain energy. You get more energy than you put in."

Once the discharge is complete, the seawater is drained and replaced with freshwater and the cycle can begin again. "The key thing here is that you need to exchange the electrolyte, the liquid in the battery," Cui said. He is lead author of a study published in the journal Nano Letters earlier this month.

In their lab experiments, Cui's team used seawater they collected from the Pacific Ocean off the California coast and freshwater from Donner Lake, high in the Sierra Nevada. They achieved 74 percent efficiency in converting the potential energy in the battery to electrical current, but Cui thinks with simple modifications, the battery could be 85 percent efficient.

To enhance efficiency, the positive electrode of the battery is made from nanorods of manganese dioxide. That increases the surface area available for interaction with the sodium ions by roughly 100 times compared with other materials. The nanorods make it possible for the sodium ions to move in and out of the electrode with ease, speeding up the process.

Other researchers have used the salinity contrast between freshwater and seawater to produce electricity, but those processes typically require ions to move through a membrane to generate current. Cui said those membranes tend to be fragile, which is a drawback. Those methods also typically make use of only one type of ion, while his battery uses both the sodium and chlorine ions to generate power.

Cui's team had the potential environmental impact of their battery in mind when they designed it. They chose manganese dioxide for the positive electrode in part because it is environmentally benign.

The group knows that river mouths and estuaries, while logical sites for their power plants, are environmentally sensitive areas.

"You would want to pick a site some distance away, miles away, from any critical habitat," Cui said. "We don't need to disturb the whole system, we just need to route some of the river water through our system before it reaches the ocean. We are just borrowing and returning it," he said.

The process itself should have little environmental impact. The discharge water would be a mixture of fresh and seawater, released into an area where the two waters are already mixing, at the natural temperature.

One of Cui's concerns is finding a good material for the negative electrode. He used silver for the experiments, but silver is too expensive to be practical.

His group did an estimate for various regions and countries and determined that South America, with the Amazon River draining a large part of the continent, has the most potential. Africa also has an abundance of rivers, as do Canada, the United States and India.

But river water doesn't necessarily have to be the source of the freshwater, Cui said.

"The water for this method does not have to be extremely clean," he said. Storm runoff and gray water could potentially be useable.

A power plant operating with 50 cubic meters of freshwater per second could produce up to 100 megawatts of power, according to the team's calculations. That would be enough to provide electricity for about 100,000 households.
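
Those figures are mutually consistent and can be checked against the thermodynamic limit. The ~27 bar osmotic pressure of seawater and the rough global river discharge used below are standard approximate values, not numbers from the paper:

```python
# Sanity check: 100 MW from 50 m^3/s implies ~2 MJ extracted per m^3 of freshwater,
# and applying a similar figure to total global river flow gives ~2 TW.
# The osmotic pressure and global discharge are standard approximate values.

power_w = 100e6
flow_m3_s = 50.0
energy_per_m3 = power_w / flow_m3_s
print(f"implied energy per m^3: {energy_per_m3/1e6:.1f} MJ")

# Thermodynamic limit: max work ≈ osmotic pressure of seawater × freshwater volume.
osmotic_pressure_pa = 27e5       # ~27 bar
print(f"theoretical limit: {osmotic_pressure_pa/1e6:.1f} MJ/m^3, "
      f"so ~{energy_per_m3/osmotic_pressure_pa:.0%} of it is captured")
# ~74% -- consistent with the extraction efficiency reported above.

# Global scale: total river discharge to the oceans is very roughly 1.2e6 m^3/s.
global_flow_m3_s = 1.2e6
print(f"global potential at ~2 MJ/m^3: ~{energy_per_m3*global_flow_m3_s/1e12:.1f} TW")
# Same order as the ~2 TW quoted above.
```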

Cui said it is possible that even treated sewage water might work.

"I think we need to study using sewage water," he said. "If we can use sewage water, this will sell really well.
 
A new energy player in the Middle East?:

http://www.energytribune.com/articles.cfm/6987/Israel-Targets-Energy-Superpower-Status

Israel Targets Energy Superpower Status
By Peter C Glover
Posted on Mar. 30, 2011


First it was two major offshore natural gas field discoveries. Now it’s an ambitious plan to exploit Israel’s massive shale oil deposits in the Shfela Basin. The gas finds alone will make the country self-sufficient in natural gas for decades and debut Israel as a key regional energy exporter. The latter, if successful, would quite simply catapult Israel into the energy superpower league.

Not surprisingly domestic excitement over Israel’s prospective new energy status is palpable, with the state’s energy insiders barely able to contain themselves, and with good reason.

Levantine riches

Speaking at the CERAWeek conference in Houston in early March 2011, CEO Charles Davidson, chairman of Noble Energy which, with Israeli partners Delek Group and Ratio Oil, made the Tamar natural gas discovery, announced that the $3 billion investment in the Tamar field will supply the Israeli domestic market for decades. Currently appraised at 8.4 trillion cubic feet (Tcf) Tamar is expected to deliver its first sales in 2013.

But what really drew the attention of potential suitors for Israeli natural gas was the announcement of the discovery of the Leviathan natural gas field as 2011 dawned. Lying to the north-west of Tamar, Leviathan holds around a further 16 Tcf, almost all of which could be slated purely for export. According to businessman Yitzhak Tshuva, part owner of Leviathan, its gas too will be ready for production by 2013, well ahead of schedule. But even at a combined total of 25 Tcf, Tamar and Leviathan only represent around a fifth of the 122 Tcf the US Geological Survey estimates lies in the Levantine Basin, much of which falls within Israeli jurisdiction.

Just how Israel’s vast reserves are to be monetized is yet to be seen. Already a national debate is raging in Israel over the royalties from the revenue and taxes that petroleum firms should pay the state. In the few years since the state’s changeover from oil to gas powered electricity generating plants Israel is already believed to have saved around $5 billion in revenue. With the state’s unique geological position however, Israel’s options for selling the gas include Europe, China or even India. In terms of development, a partnership with Cyprus tying in its gas fields and co-operating on building subsea gas pipes makes sense. And Greece has proposed becoming a distribution hub for eastern Mediterranean gas throughout Europe.

As if the sudden emergence of Israel on the natural gas stage was not enough, however, a new plan to develop Israel’s significant shale oil and gas deposits south-west of Jerusalem could put Israel alongside the energy-rich super-elite.

Vinegar’s Oil Plan

Harold Vinegar, the former chief scientist of Royal Dutch Shell, has devised an ambitious plan that would, if successful, turn Israel into one of the world’s leading oil producers. Now chief scientist for Israel Energy Initiatives (IEI), Vinegar maintains that the 238 sq km Shefla Basin holds the world’s second largest shale deposits outside the United States, from which around 250 billion barrels of oil – about the same as Saudi Arabia’s proven reserves – could be extractable. IEI estimates the marginal cost of production at between US$35 and US$40 per barrel. That, says Vinegar, would be cheaper than the US$60 or so per barrel it would cost to extract crude oil in more hospitable locations such as the Arctic, and compares favourably with the US$30-US$40 in Brazilian deepwater.

IEI, owned by the American telecom giant IDT Corp, anticipates starting commercial production by 2020, producing 50,000 barrels a day initially. While that figure is a fraction of the 270,000 barrels per day Israel currently consumes, Vinegar maintains it is a further key step toward achieving energy independence. Vinegar proposes thermal recovery for Israeli shale oil.

The IEI shale oil project has already attracted serious interest from investors. In November last year, Jacob Rothschild and media mogul Rupert Murdoch bought an $11 million stake in Genie Oil and Gas, the division of IDT that is the parent company of IEI. Genie’s advisory board also includes former vice-president Dick Cheney and hedge fund investor Michael Steinhardt. But it seems development funding is likely to be no bar to the Shefla project. Vinegar states, “Funding is not needed for the pilot and demonstration, although once we are getting 50,000 barrels per day, we would want to have a partner. We have been approached by all the majors.”

Not that it’s likely to be all plain sailing for the Shefla shale development. The size of the geological resource still needs to be confirmed. Environmental concerns and issues over whether the technology will work in situ also need to be addressed. But when it comes to the commercial long-term viability of the project, Vinegar believes it is validated, predicting, “the price of oil is going to continue rising” and “by 2030, will be around $200 per barrel.”

Land of energy promise

All of which could amount to a significant geopolitical power shift for the troubled wider region. First and foremost, the clutch of new gas and oil initiatives would secure Israel’s longer-term energy security. Second, with the social upheaval in neighboring, energy-producing Arab states, governments in the West may also soon embark on their own domestic shale revolutions – becoming less energy-dependent on Middle East oil and gas. For Israel, it would also mean there could be no repeat of the economic ransom to which the country was held in the 1970s when a pan-Arab energy embargo forced Israel to turn to the expensive and unpredictable international energy market.

The full extent of Israel’s subsea natural gas and onshore shale oil deposits will likely be confirmed over the coming year. But the old joke that Moses got it wrong, turning left and settling for ‘milk and honey’ instead of turning right and getting the oil, is already redundant. Israel is looking every bit a land of energy promise after all.
 
Things not to do:

http://www.theregister.co.uk/2011/04/07/wind_power_actually_25_per_cent/

Wind power: Even worse than you thought

But your 'leccy bill will keep going up to buy more of it

By Lewis Page

Posted in Environment, 7th April 2011 09:36 GMT


A new analysis of wind energy supplied to the UK National Grid in recent years has shown that wind farms produce significantly less electricity than had been thought, and that they cause more problems for the Grid than had been believed.

The report (28-page PDF/944 KB [1]) was commissioned by conservation charity the John Muir Trust and carried out by consulting engineer Stuart Young. It measured electricity actually metered as being delivered to the National Grid.

In general it tends to be assumed that a wind farm will generate an average of 30 per cent of its maximum capacity over time. However the new study shows that this is actually untrue, with the turbines measured by the Grid turning in performances which were significantly worse:

    Average output from wind was 27.18% of metered capacity in 2009, 21.14% in 2010, and 24.08% between November 2008 and December 2010 inclusive.

In general, then, one should assume that a wind farm will generate no more than 25 per cent of maximum capacity over time (and indeed this seems set to get worse [2] as new super-large turbines come into service). Even over a year this will be up or down by a few per cent, making planning more difficult.
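
The gap between the assumed 30 per cent and the measured ~24 per cent is easy to translate into energy terms; the 100 MW wind-farm size below is just an arbitrary example:

```python
# What the 30%-assumed vs ~24%-measured capacity factor means in energy terms
# for an illustrative 100 MW wind farm (the farm size is an arbitrary example).

capacity_mw = 100
hours_per_year = 8760

for label, cf in (("assumed", 0.30), ("measured 2008-2010", 0.2408)):
    print(f"{label:>20}: {capacity_mw*hours_per_year*cf/1000:,.0f} GWh/yr")
# ~263 GWh vs ~211 GWh: roughly a fifth less energy than planners had counted on.
```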

It gets worse, too, as wind power frequently drops to almost nothing. It tends to do this quite often just when demand is at its early-evening peak:

    At each of the four highest peak demands of 2010 wind output was low being respectively 4.72%, 5.51%, 2.59% and 2.51% of capacity at peak demand.

And unfortunately the average capacity over time is pulled up significantly by brief windy periods. Wind output is actually below 20 per cent of maximum most of the time; it is below 10 per cent fully one-third of the time. Wind power needs a lot of thermal backup running most of the time to keep the lights on, but it also needs that backup to go away rapidly whenever the wind blows hard, or it won't deliver even 25 per cent of capacity.
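
The capacity-factor and below-threshold figures above are straightforward to reproduce from a metered output series. A minimal sketch follows (not the John Muir Trust's methodology), using randomly generated stand-in data rather than the Grid's actual half-hourly records (see links [4] and [5] at the end of the article):

```python
# Minimal sketch: given a list of metered wind output samples (MW) and the
# connected capacity (MW), compute the capacity factor and the fraction of
# time spent below 20% and 10% of capacity.
import random

def wind_stats(output_mw, capacity_mw):
    n = len(output_mw)
    capacity_factor = sum(output_mw) / (n * capacity_mw)
    below_20 = sum(1 for x in output_mw if x < 0.20 * capacity_mw) / n
    below_10 = sum(1 for x in output_mw if x < 0.10 * capacity_mw) / n
    return capacity_factor, below_20, below_10

# Illustrative fake data only; a real analysis would use National Grid's
# half-hourly metered series.
random.seed(1)
capacity = 2400.0                      # MW of metered wind capacity (assumed)
samples = [capacity * min(1.0, random.weibullvariate(0.28, 1.2))
           for _ in range(17520)]      # one year of half-hourly samples
cf, p20, p10 = wind_stats(samples, capacity)
print(f"capacity factor: {cf:.1%}, below 20%: {p20:.1%}, below 10%: {p10:.1%}")
```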

Quite often windy periods come when demand is low, as in the middle of the night. Wind power nonetheless forces its way onto the grid, as wind-farm operators make most of their money not from selling electricity but from selling the renewables obligation certificates (ROCs) which they obtain for putting power onto the grid. Companies supplying power to end users in the UK must obtain a certain amount of ROCs by law or pay a "buy-out" fine: as a result ROCs can be sold for money to end-use suppliers.

Thus when wind farmers have a lot of power they will actually pay to get it onto the grid if necessary in order to obtain the lucrative ROCs which provide most of their revenue, forcing all non-renewable providers out of the market. If the wind is blowing hard and demand is low, there may nonetheless be just too much wind electricity for the grid to use, and this may happen quite often:

    The incidence of high wind and low demand can occur at any time of year. As connected wind capacity increases there will come a point when no more thermal plant can be constrained off to accommodate wind power. In the illustrated 30GW connected wind capacity model [as planned for by the UK government at the moment] this scenario occurs 78 times, or three times a month on average. This indicates the requirement for a major reassessment of how much wind capacity can be tolerated by the Grid.

Want to know why your 'leccy bill is climbing, and will keep on climbing no matter what happens to coal and gas prices? Yes - it's wind farms.

Or, in other words, there is little point building more wind turbines above a certain point: after that stage, not only will they miss out on revenue by often being at low output when demand is high, but they will also miss out by producing unsaleable surplus electricity at times of low demand. The economic case for wind – already unsupportable without the ROC scheme – will become even worse, and wind will require still more government support (it already often needs large amounts [3] above and beyond ROCs).

The idea that pumped storage will be able to compensate for absent wind – meaning that there will be no need for full thermal capacity able to meet peak demand – is also exposed as unsound. The UK has just 2,788 megawatts of pumped-storage capacity and it can run at that level for just five hours. UK national demand is above 40,000 megawatts for 15 hours a day and seldom drops below 27,000. Pumped storage would have to increase enormously both in capacity and duration – at immense cost – before it could cope even with routine lulls hitting the planned 30-gigawatt wind sector, let alone rare (but certain to occur) prolonged calms.
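
Those storage numbers are easy to check on the back of an envelope; the length of the lull in the sketch below is an assumption:

```python
# Back-of-envelope check of the pumped-storage argument.
pumped_mw = 2_788          # UK pumped-storage capacity (MW), from the article
pumped_hours = 5           # hours it can sustain that output, from the article
stored_energy_mwh = pumped_mw * pumped_hours           # about 13,940 MWh

wind_capacity_mw = 30_000  # planned wind build-out cited in the article
lull_hours = 24            # assumed length of a routine calm spell
missing_wind_mwh = wind_capacity_mw * 0.25 * lull_hours  # energy a 25%-average
                                                         # wind fleet would have
                                                         # supplied over the lull
print(f"stored: {stored_energy_mwh:,.0f} MWh, "
      f"missing wind over {lull_hours} h: {missing_wind_mwh:,.0f} MWh, "
      f"coverage: {stored_energy_mwh / missing_wind_mwh:.0%}")
# Roughly 14 GWh of storage against 180 GWh of absent wind energy, so pumped
# storage covers well under a tenth of a single day-long lull.
```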

The John Muir analysis goes on:

    The nature of wind output has been obscured by reliance on "average output" figures. Analysis of hard data from National Grid shows that wind behaves in a quite different manner from that suggested by study of average output derived from the Renewable Obligation Certificates (ROCs) record, or from wind speed records which in themselves are averaged. It is clear from this analysis that wind cannot be relied upon to provide any significant level of generation at any defined time in the future. There is an urgent need to re-evaluate the implications of reliance on wind for any significant proportion of our energy requirement.

Unfortunately, given all this, the ROC scheme is on an escalator: the amount of ROCs an end-use 'leccy supplier must obtain will rise to 15.4 per cent of megawatt-hours supplied in 2014, up from 10.4 per cent last year. The effect is to provide the large extra funds a wind farm needs to compete with thermal generation by driving up electricity prices for the user: the ROC scheme is a stealth tax which appears neither on the electricity bill nor in the Treasury accounts.
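
A rough sketch of how that escalator feeds through to bills; the per-ROC value used here is a placeholder assumption, not the actual buy-out price, and the pass-through is simplified:

```python
# Simplified view of the Renewables Obligation pass-through: suppliers must
# present ROCs (or pay a buy-out) for a rising fraction of the MWh they sell,
# and that cost ends up spread across every unit supplied.
def roc_levy_per_mwh(obligation_fraction, roc_value):
    """Extra cost per MWh supplied to end users, assuming ROCs trade at
    roughly roc_value (GBP)."""
    return obligation_fraction * roc_value

ROC_VALUE = 40.0   # GBP per ROC, an illustrative assumption only
for year, obligation in [(2010, 0.104), (2014, 0.154)]:
    print(f"{year}: obligation {obligation:.1%} "
          f"adds ~GBP {roc_levy_per_mwh(obligation, ROC_VALUE):.2f} per MWh supplied")
# The levy scales with the obligation percentage regardless of what coal and
# gas prices do, which is the article's point about bills continuing to rise.
```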

High electricity prices worsen the case for electric transport, electric heating and electric industry, so there are reasons to dislike windfarms even from a carbon-emissions point of view. There would be little point going to partially-wind electricity if the effect is to drive people more and more into using fossil fuels wherever possible.

But that's the way we're headed. ®
Bootnote

You can look up all the current National Grid power figures here [4]: archives since 2008 are here [5] (registration required).
Links

  1. http://www.jmt.org/assets/pdf/wind-report.pdf
  2. http://www.theregister.co.uk/2011/01/21/wind_turbines_too_close_together/
  3. http://www.theregister.co.uk/2011/01/20/hull_wind_turbine_bonanza/
  4. http://www.bmreports.com
  5. https://elexonexchange.bsccentralservices.com/ref=HISTORICGENERATIONBYFUELTYPE

Some very radical storage and load leveling technology needs to be invented to make windpower an even halfway practical source of supply.
 
Interesting numbers. OTOH, Pickens has been pushing for a vast government subsidy for windmill power in Texas and the Great Plains, so I wonder how much of this push for natural gas is related to his own energy interests and holdings. Much of this can be done, and will happen, without any market subsidies (and there may be many places where subsidies and market distortions are preventing or delaying changeovers).

http://blogs.forbes.com/richkarlgaard/2011/04/11/what-i-learned-about-natural-gas-from-boone-pickens/

What I Learned About Natural Gas from Boone Pickens

Last week I interviewed the Texas energy baron T. Boone Pickens four consecutive nights in front of a live audience. Pickens would talk for 40 minutes and then I would interview him for 50 minutes. (Full disclosure: I was paid a fee to do this, not from Pickens but from the event’s owner.)

The Pickens presentations had an interesting underlying tension: Texas billionaire, oilman and Republican trying to convince earnest San Francisco Bay Area liberals about the virtues of natural gas. How did Pickens do in front of liberal, vaguely hostile audiences? Surprisingly well. He made his case with numbers.

Here is what Pickens said:

– Global demand for oil is 86-88 million barrels per day. It will be 90 million by the end of the year, due to global growth.

– Global production is 84 million barrels per day. Since production falls short of demand, prices have risen.

– America consumes 20 million barrels of oil per day. We produce 7 million barrels domestically and import the other 13 million barrels. Of the 13 million barrels of imported oil, 5 million come from OPEC – “nations that hate us,” says Pickens.

– The true cost of Middle Eastern oil is over $300 a barrel if you account for U.S. military presence in the Middle East, according to Pickens.

– “Drill baby, drill” – the conservative mantra to drill more oil from the Gulf of Mexico, off the East and West Coast shelves, and the Arctic National Wildlife Refuge (ANWR) – would produce an extra 2 million barrels a day at best, says Pickens. That would raise America’s domestic production from 7 million to 9 million barrels but still leave America 11 million barrels short each day.

– In ANWR, the bottleneck is the pipeline from Alaska’s north shore. “It would take 30 years to build another pipeline,” says Pickens.

Hence the allure of natural gas: Pickens claims the U.S. has natural gas reserves equivalent to three times that of Saudi Arabia’s known 260 billion-barrel oil reserve when you use a Barrel of Oil Equivalent (BOE) comparison.

– Using BOE, natural gas, at its current price, would be about $1.50 per gallon cheaper than diesel fuel.

– Using BOE, natural gas emits 30% less carbon

Boone Pickens wants to convert America’s 140,000-unit fleet of 18-wheel trucks to run on natural gas. Pickens says the cost of converting the next-generation fleet of 18-wheelers would be about $60,000 per vehicle – or roughly $9 billion for the entire 140,000-truck fleet. Where will that money come from?

Last week, Congressmen John Sullivan (R-OK), Dan Boren (D-OK), John Larson (D-CT) and Kevin Brady (R-TX) introduced H.R. 1380, the  ‘New Alternative Transportation to Give Americans Solutions’ (NAT GAS) Act to supply the funds. It would ladle out a billion or two a year.

Is this a smart use of government funds at a time when the government is essentially broke? Yes, I think so. If you believe the Pickens numbers, our imported OPEC oil is costing America $2 billion a day and would cost $6 billion a day if unsubsidized by the U.S. military presence in the Middle East. Also, some percentage of the money we send to Saudi Arabia makes its way to our enemies, such as the Taliban.
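
The arithmetic behind those figures is easy to tally; the oil price in the sketch below is an assumption, and everything else comes from the numbers quoted above:

```python
# Tallying the Pickens numbers. The oil price is an assumption; the rest
# comes from the figures quoted in the article.
OIL_PRICE = 100.0                      # USD per barrel (assumed)

us_demand_bpd = 20e6                   # barrels per day consumed
us_production_bpd = 7e6                # barrels per day produced domestically
imports_bpd = us_demand_bpd - us_production_bpd      # 13 million bbl/day
drill_baby_drill_bpd = 2e6             # extra barrels per day "at best"
remaining_gap_bpd = imports_bpd - drill_baby_drill_bpd

daily_import_bill = imports_bpd * OIL_PRICE
fleet_conversion = 140_000 * 60_000    # trucks times cost per truck

print(f"gap even with more drilling: {remaining_gap_bpd / 1e6:.0f} M bbl/day")
print(f"daily import bill at ${OIL_PRICE:.0f}/bbl: ${daily_import_bill / 1e9:.1f} B")
print(f"one-time fleet conversion: ${fleet_conversion / 1e9:.1f} B")
# At an assumed $100/bbl the import bill runs about $1.3B per day, so the
# roughly $8.4B truck-fleet conversion is on the order of a week of imports.
```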

But if natural gas is so economically compelling, why won’t private investors come up with the funds? It’s a critical mass problem, argues Pickens. America needs to prime the pump, as it were, to get the wheels turning. Start with 18-wheelers, he says, and that will create a national infrastructure of conversion technology and delivery. To my libertarian friends: Don’t forget that the U.S. government bought the first billion dollars worth of semiconductors in the 1960s. That created the funds for factories and volume manufacturing which in turn drove prices down to affordable levels for civilian uses. Industrial policy? Yes.

America’s commercial use of its vast, cheap, natural gas reserves will take a bipartisan political effort. Democrats will have to say no to the radical environmentalists and their hostility toward fossil fuels. Republicans will have to say no to the Tea Party and their hostility toward government funding.

Bipartisan consensus is a rarity these days. It is certainly out of fashion. But energy independence will demand it.
 
The first cut of this study seems to indicate it is on the edge of possibility and economic practicality. I wonder how the figures would change when substituting more efficient bioreactors for open ponds, or using sewage rather than fresh irrigation water. As well, the figures do not take into account the energy required to process algae into biodiesel. Still, we do have to get rid of sewage, and getting into adventures like Libya to secure Middle Eastern oil is not a winning proposition for us. Reducing oil imports would also save the US economy hundreds of billions of dollars annually, something which should be at the top of everyone's to-do list:

http://www.wired.com/autopia/2011/04/algal-fuels-could-cut-oil-imports-17-percent/

Algal Fuels Could Cut Oil Imports 17 Percent
By Chuck Squatriglia  April 15, 2011  |  5:00 am  |  Categories: Alt Fuel

Forget hydrogen. Algae may be the fuel of the future.

A study by researchers at Pacific Northwest National Laboratory finds algal fuel could replace 17 percent of the petroleum the United States imports for transportation fuel each year.

Of course, one of the major concerns about algae is the volume of water needed to produce it. But the researchers note that water use can be significantly curtailed by raising algae in the humid climates of the Gulf Coast, Southeastern Seaboard and Great Lakes regions.

“Algae has been a hot topic of biofuel discussions recently, but no one has taken such a detailed look at how much America could make — and how much water and land it would require — until now,” said Mark Wigmosta, a hydrologist and lead author of the study, in a statement. “This research provides the groundwork and initial estimates needed to better inform renewable energy decisions.”

Algal fuels are made by extracting and refining the lipids within algae. Algae are an attractive biofuel feedstock because they grow quickly and thrive in everything from seawater to irrigation runoff to sewage. Such fuels could go a long way toward meeting the Energy Independence and Security Act. That law requires that biofuels replace more than 10 percent of our current petroleum consumption by 2022. Half of that biofuel must come from something other than corn.

The PNNL study provides an in-depth assessment of the United States’ annual algal fuel potential given the available land and water.

The researchers, who detail their findings in Water Resources Research, analyzed currently available data to determine how much algae can be grown in open, outdoor ponds of fresh water — as is typical — using current tech.

First, they created a database analyzing topography, population, land use and other data about the contiguous United States. The info, spaced every 100 feet, allowed them to determine which areas are ideally suited to raising algae.

Then they gathered 30 years of weather info to determine how much sunlight the algae could realistically photosynthesize and how warm the ponds would become. They used that data in a mathematical model to calculate how much algae could be produced each hour at a given site.
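
As a toy illustration only (this is not the PNNL model), an hourly, weather-driven estimate of that kind might be structured as below; every constant is a placeholder assumption:

```python
# Toy illustration of an hourly, weather-driven productivity estimate.
# Growth is taken as proportional to incident solar energy, discounted when
# the simulated pond temperature leaves an assumed comfortable range.
def hourly_oil_litres(insolation_kwh_m2, pond_temp_c, pond_area_m2,
                      sun_to_oil_efficiency=0.03,    # assumed overall efficiency
                      oil_energy_mj_per_litre=34.0): # approximate energy density
    temp_factor = 1.0 if 15.0 <= pond_temp_c <= 35.0 else 0.3   # assumption
    captured_mj = insolation_kwh_m2 * 3.6 * pond_area_m2 * sun_to_oil_efficiency
    return temp_factor * captured_mj / oil_energy_mj_per_litre

# One site-hour: 0.6 kWh/m^2 of sun on a one-hectare pond at 28 C.
print(f"{hourly_oil_litres(0.6, 28.0, 10_000):.1f} litres of oil in one site-hour")
```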

Using that model, they say 21 billion gallons of algal oil could be produced domestically. That’s equivalent to 17 percent of the petroleum the United States imported for transportation fuel in 2008.

Growing all that algae would require land roughly the size of South Carolina and 350 gallons of water for each gallon of algal oil. All told, that comes to about 25 percent of the water we currently use for crop irrigation. (The researchers say that’s on par with ethanol.)
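
Those figures can be cross-checked against one another using only the numbers quoted in the article:

```python
# Cross-checking the article's algae figures against each other.
GALLONS_PER_BARREL = 42

algal_oil_gal = 21e9          # gallons per year in the main scenario
share_of_imports = 0.17       # fraction of 2008 transport-fuel imports replaced
water_per_gal = 350           # gallons of freshwater per gallon of algal oil

implied_imports_gal = algal_oil_gal / share_of_imports
implied_imports_bpd = implied_imports_gal / GALLONS_PER_BARREL / 365
total_water_gal = algal_oil_gal * water_per_gal

print(f"implied 2008 transport-fuel imports: ~{implied_imports_bpd / 1e6:.1f} M bbl/day")
print(f"freshwater required: ~{total_water_gal / 1e12:.2f} trillion gallons/yr")

# The go-for-broke 48% scenario scales production up by about 2.8x, to roughly
# 59 billion gallons (about 224 gigaliters), which lines up with the Ars
# Technica figure quoted further down.
print(f"48% scenario: ~{implied_imports_gal * 0.48 / 1e9:.0f} B gallons/yr")
```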

“Water is an important consideration when choosing a biofuel source,” Wigmosta said. “And so are many other factors. Algae could be part of the solution to the nation’s energy puzzle – if we’re smart about where we place growth ponds and the technical challenges to achieving commercial-scale algal biofuel production are met.”

If we went for broke and maxed out our capacity to produce algae, we could cut petroleum imports by 48 percent, the researchers say. But we’d need several times our annual consumption of irrigation water to do so. It isn’t terribly practical.

John Timmer, a biochemist who writes for our sister publication Ars Technica, offers some interesting analysis of the study.

He says the 48 percent figure is based on unrealistic assumptions. Even the possibility of replacing 17 percent of our oil imports with algal fuel must be taken with a grain of salt.

“Even for the more realistic scenarios, the list of caveats is pretty extensive,” he writes. “Water and nutrients are unlimited, only evaporation is considered, only open ponds are used, and the authors ignore the energy demand involved in keeping the ponds from freezing or processing the algae into fuel.”

The authors identified areas that could be used for open ponds and focused on land that is relatively flat and isn’t farmland or parkland. That includes roughly 5 percent of the country’s land. If we used it all for biofuel production, we could produce 220 gigaliters a year, or about half our current oil imports, Timmer writes.

But we don’t have the water needed to do that.

With that in mind, Timmer writes, the researchers balance productivity and water requirements. That left the Gulf Coast, Southeast Seaboard and Great Lakes as ideal locations. And that led them to conclude we could replace 17 percent of our imported oil while consuming one-quarter of the water used each year for agriculture irrigation.

“That’s still quite high, but remember that this assumes unpolluted freshwater,” Timmer writes. “The areas along the Gulf and Atlantic coast could easily use a combination of saltwater and municipal waste. The latter source could potentially provide for facilities in some of the areas in the Southwest that are otherwise ruled out due to their high water use.”

Such details were beyond the scope of the PNNL study, but critical to consider, Timmer writes. The authors hope other researchers use their model to conduct further studies.

“Ideally, if they’re taken up on this offer,” Timmer writes, “we’ll have a clearer picture of the potential of algal biofuels.”
 
More developments in solar energy:

http://www.technologyreview.com/energy/37481/?p1=A3&a=f

More Power from Rooftop Solar
A startup says technology inspired by RAID hard drives can boost power output by up to 50 percent.
By Kevin Bullis

A startup called TenKsolar, based in Minneapolis, says it can increase the amount of solar power generated on rooftops by 25 to 50 percent, and also reduce the overall cost of solar power by changing the way solar cells are wired together and adding inexpensive reflectors to gather more light.

TenKsolar says its systems can produce power for as little as eight cents a kilowatt-hour in sunny locations. That's significantly more expensive than electricity from typical coal or natural-gas power plants, but it is less than the average price of electricity in the United States.

Solar cells have become more efficient in recent years, but much of the improvement has gone to waste because of the way solar cells are put together in solar panels, the way the panels are wired together, and the way the electricity is converted into AC power for use in homes or on the grid. Typically, the power output from a string of solar cells is limited by the lowest-performing cell. So if a shadow falls on just one cell in a panel, the power output of the whole system drops dramatically. And failure at any point in the string can shut down the whole system.

TenKsolar has opted for a more complex wiring system—inspired by a reliable type of computer memory known as RAID (for "redundant array of independent disks"), in which hard disks are connected in ways that maintain performance even if some fail. TenKsolar's design allows current to take many different paths through a solar-panel array, thus avoiding bottlenecks at low-performing cells and making it possible to extract far more of the electricity that the cells produce.

The wiring also makes it practical to attach reflectors to solar panels to gather more light. When solar panels are installed on flat roofs, they're typically mounted on racks that angle them toward the sun, and spaced apart to keep them from shading each other over the course of the day. Reflectors increase the amount of light that hits a solar array, but they reflect the sunlight unevenly. So in a conventional solar array, the output is limited by the cell receiving the least amount of reflected light. The new system can capture all the energy from the extra, reflected light. "The small added cost we put in on the electronics is paid back, plus a bunch, from the fact that we basically take in all of this reflected light," says Dallas Meyer, founder and president of TenKsolar. "We've architected a system that's completely redundant from the cell down to the inverter," he says. "If anything fails in the system, it basically has very low impact on the power production of the array."
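
The RAID analogy is easy to illustrate with a toy comparison. This is not TenKsolar's actual circuit design, and the electronics efficiency is an assumption; it only shows why a series string is hostage to its weakest cell while redundant wiring is not:

```python
# Toy comparison of idealised series-string wiring versus fully redundant
# wiring. Not TenKsolar's actual circuit design.
def series_string_power(cell_powers):
    # In a simple series string every cell is forced to the current of the
    # weakest one, so usable output is roughly n times the minimum cell.
    return len(cell_powers) * min(cell_powers)

def redundant_mesh_power(cell_powers, electronics_efficiency=0.97):
    # With redundant current paths each cell contributes what it can, minus
    # a small conversion loss (the efficiency figure is an assumption).
    return electronics_efficiency * sum(cell_powers)

unshaded = [3.0] * 60                 # 60 cells producing 3 W each
shaded = [3.0] * 59 + [0.5]           # the same panel with one cell shaded

for label, cells in [("unshaded", unshaded), ("one cell shaded", shaded)]:
    print(f"{label}: series {series_string_power(cells):.0f} W, "
          f"redundant {redundant_mesh_power(cells):.0f} W")
# Shading one cell costs the series string over 80% of its output, but the
# redundant layout only a couple of per cent, which is the effect the RAID
# analogy points at.
```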

The reflectors use a film made by 3M that reflects only selected wavelengths of light, reducing visible glare. The material also reflects less infrared light, which can overheat a solar panel and reduce its performance.

Meyer says the system costs about the same as those made by Chinese manufacturers but produces about 50 percent more power for a given roof area. Power output is about 25 percent higher than from the more expensive, high-performance systems made by SunPower, he says.

The new wiring approach does have a drawback: because it's new, the banks that finance solar-power installations may have doubts that the system will last for the duration of the warranty, and this could complicate financing, says Travis Bradford, an industry analyst and president of the Prometheus Institute for Sustainable Development.

TenKSolar, which has so far raised $11 million in venture funding and has the capacity to produce 10 to 12 megawatts of systems a year, is working on partnerships with larger companies to help provide financial backing for guarantees of its products.

Copyright Technology Review 2011.
 
Recharging using microbes that live in dirt. Now there is a technology that can really be used anywhere!

http://www.seas.harvard.edu/news-events/press-releases/gates-grant

SEAS receives $100k Grand Challenges Explorations Grant

April 28, 2011

Aviva Presser Aiden '09 and colleagues to develop microbial-based cell phone charger to increase access to health care via mobile apps

Seattle, Washington and Cambridge, Mass. – April 28, 2011 – A project to use dirt-powered batteries to charge cell phones in Africa won a $100,000 grant from The Bill & Melinda Gates Foundation today.

Aviva Presser Aiden '09 (Ph.D.), an affiliate of the Harvard School of Engineering and Applied Sciences (SEAS) who is now a student at Harvard Medical School, and colleagues will help to develop a Microbial Fuel Cell-based charger that could be readily and cheaply assembled out of basic components to increase access to health care via mobile applications in the developing world.

The project, hosted by the Laboratory-at-Large at Harvard, will have an initial field-test site in sub-Saharan Africa. Harvard Fellow Erez Lieberman-Aiden will serve as the lead investigator on the project.

This grant was made under the call for Gates Grand Challenges Explorations (GCE) proposals to "Create Low-Cost Cell Phone-Based Applications for Priority Global Health Conditions."

GCE funds scientists and researchers worldwide to explore ideas that can break the mold in how to solve persistent global health and development challenges. Aiden’s project is one of over 85 Grand Challenges Explorations Round 6 grants.

Cell phones are becoming a ubiquitous and increasingly crucial part of the health care infrastructure of the developing world. The devices provide a critical gateway to health information and offer contact with physicians who cannot reach remote locations.

For instance, even in Sub-Saharan Africa, where 500 million people lack power in their homes, 22 percent of households have cell phones. Keeping the devices charged, however, can be a challenge.

"For households lacking power in Sub-Saharan Africa, recharging a cell phone battery often means a long, possibly multi-hour walk to a charging station, where recharges cost between 50 cents and a dollar," says Aiden. "Because the per-capita income is several hundred dollars per year, this is a significant cost. Existing solutions for charging cell phones in off-grid areas are inadequate. For instance, a solar-panel based charger costs around $20, and is difficult to even bring to market because of poor access and inability to repair them if they break."

The solution is the use of a naturally abundant source of energy: microbial power. Certain naturally occurring soil microbes produce free electrons during the course of their ordinary metabolic processes. A Microbial Fuel Cell (MFC) uses a conductive surface to harvest these electrons and use them as a power source.

"We plan to develop an MFC-based cell phone charger," says Aiden "Our goal is to make a charger would cost of order a dollar and could completely charge a phone in 24 hours. Furthermore, unlike solar panels, MFCs do not require any sophisticated materials: they can be easily assembled in only a few minutes. As cultural knowledge of MFC technology spreads, Africans will become capable of assembling their own chargers almost entirely from scratch, and at minimal cost that will be recouped with the very first recharge."

Aiden has already demonstrated the effectiveness of the MFC approach, building MFCs that produce enough power to run LED lights in homes in regions such as Tanzania and Namibia. Moreover, the MFCs were able to operate continuously in the lab for 14 months.

"With the funding from the Gates Foundation, our plan is to send two researchers to Africa for this deployment," she says. "The researchers will spend two weeks introducing themselves and their work to the community and collecting data regarding typical phone usage behavior and recharge frequency. After this introductory period, the researchers will install the prototypes in the homes of volunteer families, showing these families about how to plug in their phones."

Following the completion of the pilot program, Aiden hopes to follow up with a larger-scale project, distributing chargers across a broader region, thereby demonstrating the viability of this approach to charging cellular phones in developing-world contexts.

“GCE winners are expanding the pipeline of ideas for serious global health and development challenges where creative thinking is most urgently needed. These grants are meant to spur on new discoveries that could ultimately save millions of lives,” said Chris Wilson, Director of Global Health Discovery at the Bill & Melinda Gates Foundation.
 
USGS assesses the Bakken deposit:

http://www.doi.gov/news/pressreleases/Bakken-Formation-Oil-Assessment-in-North-Dakota-Montana-will-be-updated-by-US-Geological-Survey.cfm

Bakken Formation Oil Assessment in North Dakota, Montana will be updated by U.S. Geological Survey
World-class formation developing into major source of onshore domestic energy, benefiting nation, American Indian tribes, rural communities

05/19/2011

Contact: Kendra Barkoff (DOI) 202-208-6416
Jessica Robertson (USGS) 703-648-6624
Alex Demas (USGS) 703-648-4421

WASHINGTON, DC – Secretary of the Interior Ken Salazar today announced that the U.S. Geological Survey will update its 2008 estimate of undiscovered, technically recoverable oil and gas in the U.S. portion of the Bakken Formation, an important domestic petroleum resource located in North Dakota and Montana.

“The Administration supports safe and responsible oil and gas production as part of our nation’s comprehensive energy portfolio,” Salazar said. “We must develop our resources armed with the best science available, and with wells drilled in the Bakken during the past three years, there is significant new geological information. With ever-advancing production technologies, this could mean more oil could potentially be recovered in the formation.”

The 2008 USGS assessment estimated 3.0 to 4.3 billion barrels of undiscovered, technically recoverable oil in the U.S. portion of the Bakken Formation, elevating it to a “world-class” accumulation. The estimate had a mean value of 3.65 billion barrels. The USGS routinely conducts updates to oil and gas assessments when significant new information is available, such as new understanding of a resource basin’s geology or when advances in technology occur for drilling and production.

The 2008 Bakken Formation estimate was larger than all other current USGS oil assessments of the lower 48 states and is the largest "continuous" oil accumulation ever assessed by the USGS. A "continuous” or "unconventional" oil accumulation means that the oil resource is dispersed throughout a geologic formation rather than existing as discrete, localized occurrences, such as those in conventional accumulations. Unconventional resources require special technical drilling and recovery methods.

“The new scientific information presented to us from technical experts clearly warrants a new resource assessment of the Bakken,” said USGS Energy Resources Program Coordinator Brenda Pierce. “The new information is significant enough for the evaluation to begin sooner than it normally would. It is important to look at this resource and its potential contribution to the national energy portfolio.”

The 2008 USGS assessment showed a 25-fold increase in the amount of technically recoverable oil as compared to the agency's 1995 estimate of 151 million barrels of oil. New geologic models applied to the Bakken Formation, advances in drilling and production technologies, and additional oil discoveries resulted in these substantially larger technically recoverable oil volumes. About 135 million barrels of oil were produced from the Bakken between 1953 and 2008; 36 million barrels in 2008 alone. According to state statistics, oil production from the Bakken in North Dakota has steadily increased from about 28 million barrels in 2008, to 50 million barrels in 2009 to approximately 86 million barrels in 2010.
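
A quick check of those figures:

```python
# Quick check of the Bakken numbers quoted above.
estimate_1995 = 151e6                 # barrels, 1995 USGS estimate
estimate_2008_mean = 3.65e9           # barrels, mean of the 3.0 to 4.3 billion range
print(f"increase: ~{estimate_2008_mean / estimate_1995:.0f}-fold")

nd_output = {2008: 28e6, 2009: 50e6, 2010: 86e6}   # barrels/yr, North Dakota
years = sorted(nd_output)
for prev, cur in zip(years, years[1:]):
    print(f"{prev} to {cur}: {nd_output[cur] / nd_output[prev] - 1:+.0%}")
# 3.65 billion barrels is about 24 times the 1995 figure (rounded to "25-fold"
# in the release), and North Dakota output grew ~79% and then ~72% year over year.
```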

“The Bakken Formation is producing an ever-increasing amount of oil for domestic consumption while providing increasing royalty revenues to American Indian tribes and individual Indian mineral owners in North Dakota and Montana,” Salazar noted. Interior agencies have been working closely, for example, with the Three Affiliated Tribes (the Mandan, Hidatsa and Arikara) and individual Indian mineral owners on the Ft. Berthold Reservation in North Dakota to facilitate this development.

Technically recoverable oil resources are those producible using currently available technology and industry practices. USGS is the only provider of publicly available estimates of undiscovered technically recoverable oil and gas resources.

The new update effort will be a standard assessment task under the existing USGS National Oil and Gas Assessment. It will begin in October 2011, at the start of the 2012 fiscal year. Depending on funding, it is expected to take two years to complete. Drilling and production will continue while the USGS conducts its assessment update.

For more information about the Bakken Formation, please visit the USGS frequently asked questions that were developed after the 2008 resource assessment at http://www.usgs.gov/faq/index.php?action=show&cat=21
 