Category Archives: nuclear fusion

The Interstellar Mind of Robert Goddard

From Centauri Dreams:

Astronautics pioneer Robert H. Goddard is usually thought of in connection with liquid fuel rockets. It was his test flight of such a rocket in March of 1926 that demonstrated a principle he had been working on since patenting two concepts for future engines, one a liquid fuel design, the other a staged rocket using solid fuels. His treatise “A Method of Reaching Extreme Altitudes,” published by the Smithsonian in 1920, developed the mathematics behind rocket flight and discussed the possibility of a rocket reaching the Moon.

While Goddard’s work could be said to have anticipated many technologies subsequently developed by later engineers, the man was not without a visionary streak that went well beyond the near-term, expressing itself on at least one occasion on the subject of interstellar flight. Written in January of 1918, “The Ultimate Migration” was not a scientific paper but merely a set of notes, one that Goddard carefully tucked away from view, as seen in this excerpt from his later document “Material for an Autobiography” (1927):

“A manuscript I wrote on January 14, 1918 … and deposited in a friend’s safe … speculated as to the last migration of the human race, as consisting of a number of expeditions sent out into the regions of thickly distributed stars, taking in a condensed form all the knowledge of the race, using either atomic energy or hydrogen, oxygen and solar energy… [It] was contained in an inner envelope which suggested that the writing inside should be read only by an optimist.”

Optimism is, of course, standard currency in these pages, so it seems natural to reconsider Goddard’s ideas here. As to his caution, we might remember that the idea of a lunar mission discussed in “A Method of Reaching Extreme Altitudes” would, not long after, bring him ridicule from some elements of the press, who lectured him on the infeasibility of a rocket engine functioning in space without air to push against. It was Goddard, of course, who was right, but he was ever a cautious man, and his dislike of the press was, I suspect, not so much born of this incident as confirmed by it.

In the event, Goddard’s manuscript remained sealed and was not published until 1972. What I hadn’t realized was that Goddard, on the same day he wrote the original manuscript, also wrote a condensed version that David Baker recently published for the British Interplanetary Society. It’s an interesting distillation of the rocket scientist’s thoughts that speculates on how we might use an asteroid or a small moon as the vehicle for a journey to another star. The ideal propulsion method would, in Goddard’s view, be through the control of what he called ‘intra-atomic energy.’


Image: Rocket pioneer Robert H. Goddard, whose notes on an interstellar future discuss human migration to the stars.

Atomic propulsion would allow journeys to the stars lasting thousands of years with the passengers living inside a generation ship, one in which, he noted, “the characteristics and natures of the passengers might change, with the succeeding generations.” We’ve made the same speculation here, wondering whether a crew living and dying inside an artificial world wouldn’t so adapt to the environment that it would eventually choose not to live on a planetary surface, no matter what it found in the destination solar system.

And if atomic energy could not be harnessed? In that case, Goddard speculated that humans could be placed in what we today would think of as suspended animation, the crew awakened at intervals of 10,000 years for a passage to the nearest stars, and intervals of a million years for greater distances. Goddard speculates on how an accurate clock could be built to ensure awakening, which he thought would be necessary so that humans could intervene to steer the spacecraft if it drifted off course. Suspended animation would involve huge changes to the body:

…will it be possible to reduce the protoplasm in the human body to the granular state, so that it can withstand the intense cold of interstellar space? It would probably be necessary to desiccate the body, more or less, before this state could be produced. Awakening may have to be done very slowly. It might be necessary to have people evolve, through a number of generations, for this purpose.

As to destinations, Goddard saw the ideal as a star like the Sun or, interestingly, a binary system with two suns like ours — perhaps he was thinking of the Alpha Centauri stars here. But that was only the beginning, for Goddard thought in terms of migration, not just exploration. His notes tell us that expeditions should be sent to all parts of the Milky Way, wherever new stars are thickly clustered. Each expedition should include “…all the knowledge, literature, art (in a condensed form), and description of tools, appliances, and processes, in as condensed, light, and indestructible a form as possible, so that a new civilisation could begin where the old ended.”

The notes end with the thought that if neither of these scenarios develops, it might still be possible to spread our species to the stars by sending human protoplasm, “…this protoplasm being of such a nature as to produce human beings eventually, by evolution.” Given that Goddard locked his manuscript away, it could have had no influence on Konstantin Tsiolkovsky’s essay “The Future of Earth and Mankind,” which in 1928 speculated that humans might travel on millennial voyages to the stars aboard the future equivalent of a Noah’s Ark.

Interstellar voyages lasting thousands of years would become a familiar trope of science fiction in the ensuing decades, but it is interesting to see how, at the dawn of liquid fuel rocketry, rocket pioneers were already thinking ahead to far-future implications of the technology. Goddard was writing at a time when estimates of the Sun’s lifetime gave our species just millions of years before its demise — a cooling Sun was a reason for future migration. We would later learn the Sun’s lifetime was much longer, but the migration of humans to the stars would retain its fascination for those who contemplate not only worldships but much faster journeys.

 

Goddard’s notes anticipate his contemporary J.D. Bernal, whose later The World, the Flesh and the Devil (1929) predicted Man’s spread out into the Solar System and interstellar space in artificial worlds and hollowed-out asteroids.

 

These worlds are needed because such journeys will take hundreds or perhaps thousands of years.

 

Of course, that raises the question of natural evolution: what will the people inside these worlds have become by the time they reach their destinations, and will they actually have need of them?

 

Robert Goddard’s Interstellar Migration


Laser Powered Bussard Interstellar Ramscoops

We do a change of pace today as we move from the UFO community to the mainstream (sort of) of Paul Gilster’s Centauri Dreams and Tau Zero’s discussion of real-life interstellar propulsion methods and starflight.

Here Robert Bussard’s ramjet is linked with Robert Forward’s idea of laser-powered starflight to yield a more efficient method of vehicle acceleration – and, more importantly, deceleration at the appointed destination:

Many of the interstellar concepts I write about in these pages take on a life of their own. After the initial brainstorming, the idea gets widely enough disseminated that other scientists take it on, looking to modify and improve on the original concept. That’s been true in the case of solar sails and the more recently devised ‘lightsails,’ which use beamed energy from a laser or microwave source to drive the vehicle. We continue to study magnetic sails — ‘magsails’ — and various nuclear options like the inertial confinement fusion that powered Daedalus and perhaps Icarus. Sometimes insights arise when ideas are grafted onto each other to create a hybrid solution.

The idea I want to examine today — a hybrid design combining a Bussard-style interstellar ramjet with laser beaming — exemplifies this mix and match process. Working with Daniel Whitmire, A. A. Jackson, a frequent commenter and contributor here on Centauri Dreams, pondered the various issues the Bussard ramjet had run into, including the difficulty in lighting the proton/proton fusion reaction Bussard advocated early in the process. Writing at a time not long after he had finished up a PhD in relativistic physics (at the University of Texas), Jackson conceived the idea of beaming energy to the spacecraft and discovered that the method offered advantages over the baseline Bussard design. The laser-powered ramjet is a fascinating concept that has received less attention than it deserves.

Image: Physicist and interstellar theorist Al Jackson, originator of the laser-powered ramjet concept.

Bussard’s ramjet, you’ll recall, lit its fusion fires using reaction mass gathered from the interstellar medium by a huge magnetic ram scoop, which itself has proven problematic given the drag issues such a scoop introduces. The other way to power up a starship using an external source of energy is to beam a terrestrial or Solar System-based laser at the departing craft, which has deployed a lightsail to draw momentum from the incoming photons. Jackson and Whitmire found the latter method inefficient. Their solution was to beam the laser at a ramjet that would use reaction mass obtained from a Bussard-style magnetic ram scoop. The ramjet uses the laser beam as a source of energy but, unlike the sail, not as a source of momentum.

Running the numbers and assuming all photons transmitted by the laser will be absorbed by the ship, the authors discovered that the laser-powered ramjet (LPR) is superior to the baseline Bussard ramjet at low velocities, while superior to the laser-pushed sail at all velocities. The Bussard design becomes the most efficient of the three at velocities equal to and above about 0.14 c. The laser-powered ramjet, then, solves at least one of the Bussard vehicle’s problems, the fact that it has to get up to a significant percentage of lightspeed before lighting its fusion reaction. LPR propulsion could be used up to 0.14 c, with the vehicle switching over to full interstellar ramjet mode to achieve high efficiency at relativistic velocities.

The laser-powered ramjet offers other advantages as well. Think back to some of Robert Forward’s laser sail concepts and you’ll recall the problem of deceleration. With the sail powered by a laser beam from the Solar System, it’s possible to reach velocities high enough to take you to the nearest stars in a matter of decades rather than centuries. But how do you slow down once you arrive? Conceiving a manned mission to Epsilon Eridani, Forward came up with a ‘staged’ solution in which the sail separates upon arrival, with the large outer sail ring moving ahead of the vehicle and reflecting beamed laser energy to the now smaller inner sail, thus slowing it down. It would be so much easier if the beam worked in both directions!

But with the laser-powered ramjet, a round trip can be made using a single laser beam because the beam is being used as a source of energy rather than momentum. Jackson and Whitmire showed that the efficiency in the deceleration phase of the outbound journey as a function of velocity is the same as for the acceleration phase. And on the return trip, the energy utilisation efficiency is more favorable in both the acceleration and deceleration phases because the ship is traveling into the beam. In fact, the laser-powered ramjet is superior to both the laser sail and the Bussard ramjet even at high fractions of the speed of light when traveling into the laser beam.

Let’s go over that again: Jackson and Whitmire’s calculations focus on the energy utilisation efficiency parameter, showing that the laser-powered ramjet is superior to the laser sail at all velocities, whether the ship is receding from the beam or approaching (moving into the beam). The LPR is also superior to the Bussard ramjet at velocities less than about 0.14 c when receding from the beam, and superior to the Bussard design at all velocities when approaching. Add to this that the LPR concept requires no onboard proton-burning reactor — the authors assume the use of Whitmire’s ‘catalytic’ ramjet using the CNO (carbon-nitrogen-oxygen) cycle — and that the LPR’s power requirements are less than those of the laser sail.
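The rankings above can be condensed into a small sketch. This simply encodes the comparisons as stated (the roughly 0.14 c crossover and the orderings Jackson and Whitmire report), not the paper’s underlying equations:

```python
# Preferred propulsion mode by velocity, per the stated comparisons:
# receding from the beam, the LPR wins below ~0.14 c and the Bussard
# ramjet above; approaching the beam, the LPR wins at all velocities.
CROSSOVER = 0.14  # fraction of c, approximate figure from the paper

def preferred_mode(v_over_c: float, receding: bool = True) -> str:
    """Most efficient of the designs compared, at a given velocity."""
    if not receding:
        return "laser-powered ramjet"
    return "laser-powered ramjet" if v_over_c < CROSSOVER else "Bussard ramjet"

# A mission profile might use the LPR up to 0.14 c, then switch modes:
assert preferred_mode(0.05) == "laser-powered ramjet"
assert preferred_mode(0.20) == "Bussard ramjet"
```

Note that the laser sail never appears as the preferred option here, since the paper finds the LPR superior to it at all velocities.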

As this topic is more ‘mainstream’ than usual, note that interstellar craft of this type incoming to our Solar System would be easy to spot, given the power output of the craft. Any invasion would be highly visible.

But the UFO phenomenon excludes that – no incoming craft are visible until they are already in the atmosphere.

So is wormhole technology being used, or are other folding-door-type technologies (if one can call such things technology) being utilized?

A Laser-Powered Interstellar Ramjet

Virgin Satellites, Tunguska Tesla and the Nuclear Imperative

Virgin Galactic Satellite Company?

The company is working with UK space exploration company Surrey Small Satellites on plans to develop a launcher that could propel a 200kg satellite into space at roughly 10pc of the cost of current technology.

Will Whitehorn, president of Virgin Galactic, said: “We have the technology and the investment to put this together. We hope to develop a preliminary satellite launch vehicle ourselves, but will go to the wider market to produce something capable of carrying 200kg, which we believe is the sweet spot in the market.”

Mr Whitehorn said that the company hoped to have proposals to put to the market for the development of the satellite launch vehicle in the next four months.

Virgin Galactic has secured $100m of funding from Abu Dhabi’s Aabar Investments for the commercial satellite business on top of the $280m co-investment in its space tourism business announced last week. The extra investment would take Aabar’s stake in Virgin Galactic from 32pc to 38pc.

The satellite business will target the growing market for low-orbit earth observation and communication satellites.

According to Mr Whitehorn, it could also be used to start construction of server farms in space and to create mobile and broadband networks that could serve areas such as Africa that do not have good cable networks.

Although the development is in its early stages, it could provide a significant boost to the UK space industry, which according to Mr Whitehorn employs around 70,000 people and represents £2.5bn per year in net exports.

Mr Whitehorn said: “This is a hidden industry in the UK but a very important one. In terms of net exports it is bigger than the car industry.

“We hope to be able to use the development of our commercial satellite business to leverage off the tourism work we are already doing and to add real value to the UK economy.”

Virgin Goes Galactic with Satellites

Was the 1908 Tunguska, Siberia explosion actually ‘Tesla Tech’?

1908: Tesla repeated the idea of destruction by electrical waves to the newspaper on April 21st. His letter to the editor stated, “When I spoke of future warfare I meant that it should be conducted by direct application of electrical waves without the use of aerial engines or other implements of destruction.” He added: “This is not a dream. Even now wireless power plants could be constructed by which any region of the globe might be rendered uninhabitable without subjecting the population of other parts to serious danger or inconvenience.”(27)

In the period from 1900 to 1910 Tesla’s creative thrust was to establish his plan for wireless transmission of energy. Undercut by Marconi’s accomplishment, beset by financial problems, and spurned by the scientific establishment, Tesla was in a desperate situation by mid-decade. The strain became too great by 1906-1907 and, according to Tesla biographers, he suffered an emotional collapse.(28),(29) In order to make a final effort to have his grand scheme recognized, he may have tried one high power test of his transmitter to show off its destructive potential. This would have been in 1908.

The Tunguska event took place on the morning of June 30th, 1908. An explosion estimated to be equivalent to 10-15 megatons of TNT flattened 500,000 acres of pine forest near the Stony Tunguska River in central Siberia. Whole herds of reindeer were destroyed. Several nomadic villages were reported to have vanished. The explosion was heard over a radius of 620 miles. When an expedition was made to the area in 1927 to find evidence of the meteorite presumed to have caused the blast, no impact crater was found. When the ground was drilled for pieces of nickel, iron, or stone, the main constituents of meteorites, none were found down to a depth of 118 feet.

Several explanations have been given for the Tunguska event. The officially accepted version is that a 100,000 ton fragment of Encke’s Comet, composed mainly of dust and ice, entered the atmosphere at 62,000 mph, heated up, and exploded over the earth’s surface creating a fireball and shock wave but no crater. Alternative explanations of the disaster include a renegade mini-black hole or an alien space ship crashing into the earth with the resulting release of energy.

Associating Tesla with the Tunguska event comes close to putting the inventor’s power transmission idea in the same speculative category as ancient astronauts. However, historical facts point to the possibility that this event was caused by a test firing of Tesla’s energy weapon.

In 1907 and 1908, Tesla wrote about the destructive effects of his energy transmitter. His Wardenclyffe facility was much larger than the Colorado Springs device that destroyed the power station’s generator. Then, in 1915, he stated bluntly:

It is perfectly practical to transmit electrical energy without wires and produce destructive effects at a distance. I have already constructed a wireless transmitter which makes this possible. … But when unavoidable [it] may be used to destroy property and life. The art is already so far developed that the great destructive effects can be produced at any point on the globe, defined beforehand with great accuracy (emphasis added).(30) Nikola Tesla, 1915

He seems to confess to such a test having taken place before 1915, and, though the evidence is circumstantial, Tesla had the motive and the means to cause the Tunguska event. His transmitter could generate energy levels and frequencies capable of releasing the destructive force of 10 megatons, or more, of TNT. And the overlooked genius was desperate.

The nature of the Tunguska event is also consistent with what would happen during the sudden release of wireless power. No fiery object was reported in the skies at the time by professional or amateur astronomers, as would be expected when a 200,000,000 pound object enters the atmosphere at tens of thousands of miles an hour. Also, the first reporter from the town of Tomsk to reach the area judged the stories about a body falling from the sky to be the product of the imagination of an impressionable people. He noted there was considerable noise from the explosion, but no stones fell. The absence of an impact crater can be explained by there having been no material body to impact. An explosion caused by broadcast power would not leave a crater.
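For a sense of scale, the 10-15 megaton estimate quoted above converts directly to joules (a back-of-the-envelope sketch; one megaton of TNT is defined as 4.184 × 10¹⁵ J):

```python
# Rough energy scale of the Tunguska blast, from the quoted estimate.
MEGATON_TNT_J = 4.184e15  # joules per megaton of TNT (by definition)

def blast_energy_joules(megatons: float) -> float:
    """Convert a blast yield in megatons of TNT to joules."""
    return megatons * MEGATON_TNT_J

low = blast_energy_joules(10)    # ~4.2e16 J
high = blast_energy_joules(15)   # ~6.3e16 J
```

That range is hundreds of times the yield of the Hiroshima bomb, which gives some idea of what any claimed Tesla transmitter would have had to deliver.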

This sounds amazingly like HAARP tech also.

Are the two related?

Tesla Wireless and the Tunguska Explosion

Nuclear Energy Redux

We can make a case for improving living standards through space exploration, but only if we take the necessary next steps. Today, our launch technologies are essentially half a century old, with only minor improvements along the way. In our attempt to bootstrap a spacefaring civilization, we need to be thinking long-term and improving our ways of getting out of Earth’s gravity well. On this score, Genta is a proponent of nuclear energy, believing it alone will allow our emergence as a true spacefaring species. Here he speaks from his perspective as a deeply practical mechanical engineer:

The use of nuclear energy for space propulsion in Earth orbit and beyond is just a matter of political will and only marginally of technology: sure, technological advances are required, but after more than 50 years of theoretical studies the ideas are clear and what are still needed are just details. Nuclear-thermal propulsion was demonstrated on the ground in the 1970s and could be used by now for deep-space propulsion. It is true that the performance of such systems can be improved well beyond those demonstrated up to now, but what we have could allow anyway a large improvement if compared with chemical propulsion.

But transitioning to next generation technologies — or catching up in terms of a developing but unused capability — is a demanding process. More on this:

What we really need is to have nuclear powered spacecraft for interplanetary missions, even if their performance were only marginally better than those of chemical propulsion: we need to gain experience in building and operating nuclear systems in space and to make people used to this technology. Performance of nuclear thermal propulsion will improve in due course, but if we wait to start until improved systems are available, everything will be delayed indefinitely.

[…]

Anyone advocating nuclear propulsion in today’s climate of opinion is sure to have a fight on his hands, but Genta believes the time for this fight is propitious. We’re already seeing signs that in the power industry, nuclear options are making a comeback in terms of public acceptance — the phrase ‘nuclear renaissance’ is in the air in some quarters, indicating that we may be ready to move past the era of kneejerk rejection of the nuclear idea. Funding remains a problem, but we come back again to having to sell our future in space one mission at a time, a laborious task but an essential one.

The space option is a long-term perspective, which will naturally be implemented in due time. Perhaps it is hard to accept that progress toward space must be done step by step, but trying shortcuts may be dangerous. In a situation of scarce funds a hard competition between missions and technologies should be avoided. The efforts should be concentrated in areas that may prove to be enabling technologies, even if this may result in postponing some important scientific results.

There is no more important enabling technology than one that would get us to low-Earth orbit cheaply. Genta noted the space elevator concept in his talk but expressed concerns about the size of the investment needed to build it. In any case, a space elevator raises its own safety concerns. He sees nuclear technology as an achievable solution to the low-Earth orbit problem that should not be put off in hopes of a vastly more expensive future solution. Political will is a tricky thing to summon, but making a sustained, long-term case for space as a key player in our economic future may help overcome the obstacle.

Paul makes an excellent case for the use of nuclear power and uses Genta’s paper to great effect, and I agree with it 100%.

Without utilizing nuclear energy of some sort, mankind will never make it off its planet in numbers large enough to colonize the Solar System, let alone interstellar space.

Somehow, I’m not too optimistic about our prospects lately.

On the Nuclear Imperative

Red-Shift Mistakes

Conventional wisdom in astronomy and astrophysics holds that the more red-shifted an observed object is, the farther away it is. This is one of the bedrock assumptions of astronomy and a linchpin of the Big Bang theory.

But what if that assumption is wrong? What if red-shifts are being misinterpreted, and red-shift might not have anything to do with distance?

According to astronomer Halton Arp, quasars that have a high red-shift (and thus are interpreted to be very far away) are intrinsically linked to galaxies that are only slightly red-shifted, and thus presumed to be much closer. How can that be?

The distribution on the sky of clusters of galaxies started to be cataloged about 40 years ago by George Abell and collaborators. The cores of these clusters were predominantly old stellar population E galaxies which were believed to be mostly gas free and inactive. With the advent of X-ray surveys, however, it became evident that many clusters of galaxies were strong X-ray emitters. This evidence for non-equilibrium behavior was not easily explained. In these active properties, however, the clusters joined AGN’s and quasars as the three principal kinds of extragalactic X-ray sources. Evidence then developed that quasars, and now some galaxy clusters were physically associated with much lower redshift galaxies. Surprisingly, the cluster redshifts were sharply peaked at the preferred quasar redshifts of z = .061, .30 etc. (This evidence has been discussed principally in Arp 1997; 1998a; Arp and Russell 2001).

It was possible to explore these properties further by plotting the distribution of galaxy clusters on other, larger areas on the sky. Some appeared projected along the spine of the Virgo Cluster. It turned out that the Abell clusters which were located in that part of the sky in the direction of Fornax fell in the same distinctively elongated area as the large, low redshift Fornax Cluster. (The Abell clusters reach to about z = .2 limit and the brightest galaxy in the Fornax cluster is z = .0025.) On the sky, in the direction of the giant, low redshift galaxy CenA/NGC5128, the Abell clusters fell almost exclusively along a broadening extension of the X-ray, radio jet going northward from this active galaxy. This is the same line occupied by a number of active, higher redshift galaxies which have been previously associated with ejection of radio plasma from CenA (Arp 1998a).

I am not an astronomer or astrophysicist, but my interpretation of the above paragraphs is that these radio and X-ray emitting galaxies (and quasars) with varying degrees of red-shift are physically linked with one another along a straight line, and that their red-shifts have nothing to do with distance at all!
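To see what is at stake, here is the conventional low-redshift distance conversion that Arp’s associations would overturn: recession velocity v ≈ cz for small z, then Hubble’s law d = v/H0. This is a sketch assuming H0 ≈ 70 km/s per megaparsec, and the linear v ≈ cz approximation gets rough by z = 0.3:

```python
# Conventional redshift -> distance, the interpretation Arp questions.
C_KM_S = 299_792.458   # speed of light in km/s
H0 = 70.0              # Hubble constant in km/s per Mpc (assumed value)

def conventional_distance_mpc(z: float) -> float:
    """Distance in megaparsecs implied by redshift z, conventionally."""
    return C_KM_S * z / H0

# The 'preferred' quasar redshifts quoted above imply very different distances:
d_low = conventional_distance_mpc(0.061)   # ~260 Mpc
d_high = conventional_distance_mpc(0.30)   # ~1300 Mpc
```

If objects with these two redshifts really are physically connected, the conventional reading would have them separated by roughly a thousand megaparsecs, which is the heart of the controversy.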

So what is causing the perceived variations of red-shift?

According to Plasma Cosmology, the variations of red-shift are temperature differences of the objects’ plasma:

Astronomers consider plasma to be an ionized gas that behaves according to the same laws that a neutral gas follows, with some modification for magnetic effects. Because they cannot directly measure the properties of extragalactic space, they have developed mathematical models based on the behavior of neutral gases.

Hannes Alfvén, the father of plasma cosmology, took a different approach. In the opening to his monograph, Cosmic Plasma, he describes how the pure theory approach lost touch with reality. Rather than theorize about how plasma is supposed to act, he studied how it actually behaves in the laboratory. Among the many differences between plasma’s actual behavior and the theoretical model are temperature anomalies like the one in the galaxy clusters above: temperatures of ions and electrons are 10 to 100 times higher than expected in neutral gases. So, from an Electric Universe point of view, the anomalous temperatures seen in the above galaxy clusters are a normal property of the plasma interaction between clusters.

So what do we take from this? Are our present cosmological models wrong, and have they been wrong for 100 years?

Is our Sun a huge ball of plasma, and not a fusion furnace?

We know fusion exists, because we weaponized it.

But could this explain why we haven’t come up with a way to build a reactor capable of controlling the process, more than 50 years after exploding the first H-bomb?

Are we going down the wrong track here?

Origins of Quasars and Galaxy Clusters

Cluster Collisions

Hat Tip

Modern archetypes, or something different?

Battlestar Galactica as a Jungian archetype?

…are we a race of people whose roots are out there, somewhere beyond the Milky Way, on worlds unknown, of a time long forgotten, of a people long dead? Wouldn’t our children say the same if suddenly the Earth were destroyed and only a few of us made it out there, only to settle on another world, to begin anew?

I’m not gullible and I don’t take science fiction shows and add them to my reality. But I do always and often wonder where all ideas and stories begin, and the ideology behind BSG is as old as humanity itself. So, why tell the same tale over and over again in different incarnations if not to serve a purpose? What purpose would that be? To help us to remember, perhaps?

The author makes a point; who, or what, are we?

In the first psychology class I took in college 25 years ago, the professor stressed that human beings are greater than the sum of their parts.

Are we digging up images from our past and just giving them modern clothes to wear?

All of this has happened before and will probably happen again…

______________________________

Cold fusion isn’t an archetype, I think.

But that doesn’t stop the ever present pursuit for it:

A U.S. Navy researcher announced today that her lab has produced “significant” new results that indicate cold fusion-like reactions.

If the work by analytical chemist Pamela Mosier-Boss and her colleagues is confirmed, it could open the door to a cheap, near-limitless reservoir of energy.

That’s a big if, however.

Today’s announcement at the national meeting of the American Chemical Society comes in the same location – Salt Lake City – as one of science’s most infamous episodes, the announcement 20 years ago by chemists Stanley Pons and Martin Fleischmann that they had produced cold fusion.

Unlike nuclear energy reactors and bombs, which split atoms, the atoms in stars such as the sun fuse together to produce spectacular amounts of energy, so much so that we are warmed by a stellar furnace 93 million miles away.

Devising a fusion-based source of energy on Earth has long been a “clean-energy” holy grail of physicists.

Present-day fusion research is high-tech, intense, and requires a lot of energy to sustain; often more energy goes in than comes out. That’s why we don’t have fusion reactors dotting the countryside and lining sources of water yet; it’s too inefficient.

But if a sustainable fusion reaction can be produced without all of the supermagnets required, less energy could be put in and more energy produced.
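The energy balance described above is usually expressed as the gain factor Q = power out / power in: Q below 1 is today’s situation, Q = 1 is scientific breakeven, and a practical power plant needs Q well above 1. A minimal sketch:

```python
def fusion_gain(power_out_mw: float, power_in_mw: float) -> float:
    """Fusion energy gain factor Q: net producer when Q > 1."""
    return power_out_mw / power_in_mw

# The situation described above: more energy in than out.
q_today = fusion_gain(10.0, 25.0)     # Q = 0.4, a net loss
q_breakeven = fusion_gain(25.0, 25.0) # Q = 1.0, scientific breakeven
```

The example power figures are illustrative only, not measurements from any actual experiment.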

Time will tell I guess.

Navy scientist announces possible cold fusion reactions

Hat tip to The Anomalist

__________________________

More on Project Aurora, from Great Britain:


One of the key themes to emerge from these papers is the curious Aurora spyplane saga. This is linked with a little-known set of colour photographs, apparently taken in the Scottish highlands, which appear to show a large diamond-shaped UFO shadowed by military jets.

From the late 1980s the British press was buzzing with rumours about a stealthy, cutting-edge aircraft that some experts believed was an advanced US ‘black project’. Codenamed Aurora, the spy plane was said to be capable of hypersonic speed. Alleged sightings frequently made headlines in UFO magazines and in aviation weeklies such as Jane’s Defence Weekly. But the US Defence Department always denied such a project existed and two decades have passed without any real evidence that it ever did.

European nations decided last year to start disclosing information on investigated UFO sightings from the late 1940s on through the 1980s. This has produced a wealth of documents (largely redacted) and corresponding photographs.

Except the good ol’ US of A, naturally, which still remains ominously silent on all things ‘UFO-ish.’

Project Aurora was obviously a 1980s military effort, and if such a thing exists (existed?), the Pentagon/DARPA most certainly has something even better nowadays and is keeping its cards close to the vest.

You wouldn’t want a potential rival to know what you have in your hand/arsenal, would you?

The Calvine Photos

Hat tip to The Daily Grail

_________________________


Just Google for Nuclear Fusion

Dr. Robert Bussard, inventor of the Bussard interstellar ramjet, passed away last fall. A lifelong advocate of nuclear fusion energy who worked diligently to make it come to pass, he had been working on a method of nuclear fusion that converts hydrogen and boron directly into electricity, leaving helium as the only by-product of the process:

This is not your father’s fusion reactor! Forget everything you know about conventional thinking on nuclear fusion: high-temperature plasmas, steam turbines, neutron radiation and even nuclear waste are a thing of the past. Goodbye thermonuclear fusion; hello inertial electrostatic confinement fusion (IEC), an old idea that’s been made new. While the international community debates the fate of the politically-turmoiled $12 billion ITER (an experimental thermonuclear reactor), simple IEC reactors are being built as high-school science fair projects…

Dr. Bussard will discuss his recent results and details of this potentially world-altering technology, whose conception dates back as far as 1924, and even includes a reactor design by Philo T. Farnsworth (inventor of the scanning television).

Can a 100 MW fusion reactor be built for less than Google’s annual electricity bill? Come see what’s possible when you think outside the thermonuclear box and ignore the herd…
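For reference, the hydrogen-boron reaction Bussard pursued (p + ¹¹B → 3 ⁴He) releases roughly 8.7 MeV per reaction and no neutrons, which is why helium is the only by-product. A quick mass-defect check from standard atomic masses:

```python
# Q-value of the aneutronic reaction:  p + B-11 -> 3 He-4
# computed from standard atomic masses (unified atomic mass units).
M_H1 = 1.007825     # hydrogen-1
M_B11 = 11.009305   # boron-11
M_HE4 = 4.002602    # helium-4
U_TO_MEV = 931.494  # MeV released per atomic mass unit of mass defect

mass_defect = (M_H1 + M_B11) - 3 * M_HE4
q_value_mev = mass_defect * U_TO_MEV   # ~8.7 MeV released per reaction
```

Because the products are charged helium nuclei rather than neutrons, that energy can in principle be converted directly to electricity, which is the appeal of the scheme.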

The following is a Google video showing Dr. Bussard giving a lecture in November 2006 about his fusion process and how it could meet Google’s present and future energy needs.

In fact, the Defense Department (the Navy, I believe) was funding his research. After his death there was speculation the funding would be pulled, but as of this posting the project is still funded through the end of this fiscal year.