Monthly Archives: March 2009

Big Brother is Watching

And He Knows Everything You Have Ever Done! Sometimes our paranoid government wants to do things that technology does not yet allow, or that it does not know about yet. As soon as they find out, or the technology is developed, they do it. Case in point is the paranoia that followed 11 Sept 2001 (9/11), in which Cheney and Bush wanted to be able to track and monitor every person in the US. There were immediate efforts to do this with the so-called Patriot Act, which bypassed a lot of constitutional protections and existing laws and rights – like FISA. They also instructed NSA to monitor all radio and phone traffic, which was also illegal and against NSA's charter. Lesser known was the hacking into computer databases and the monitoring of emails by NSA computers. They have computers that can download and read every email on every circuit from every Internet user, as well as every form of voice communication.

Claims of being able to track everyone, everywhere have been made before, and it seems that lots of people simply don't believe that level of monitoring is possible. Well, I'm here to tell you that it not only is possible, it is all automated, and you can read about the tool that started it all online. Look up "starlight" in combination with "PNNL" on Google and you will find references to a software program that was the first generation of the kind of tool I am talking about. This massive amount of communications data is screened by a program called STARLIGHT, which was created by the CIA and the Army and a team of contractors led by Battelle's Pacific Northwest National Lab (PNNL).

It does two things that very few other programs can do: it can process free-form text, and it can display complex queries as visual 3-D output. Free-form text processing means it can read text in its natural form, as it is spoken, written in letters and emails, and printed or published in documents. For a database program to do this as easily and as fast as it handles the formally defined records and fields of a relational database is a remarkable design achievement. Understand, this is not just a word search – although that is part of it. It is not just a text-scanning tool; it can treat the text of a book as if it were an interlinked, indexed and cataloged database in which it can recall every aspect of the book (data). It can associate and find any word or phrase in relation to any parameter you can think of related to the book – page numbers, nearby words, word use per page, chapter or book, and so on. By using the most sophisticated voice-to-text processing, it can perform this kind of expansive searching on everything written or spoken, emailed, texted or said on cell phones or landline phones in the US!

The visual presentation of that data is the key to being able to use it without information overload and to having the software prioritize the data for you. It does this by translating the database query parameters into colors and dimensional elements of a 3-D display. To view this data, you put on a special set of glasses similar to the ones that put a tiny TV screen in front of each eye. Such eye-mounted viewing is already available for watching video and TV – giving the impression you are looking at a 60-inch TV screen from 5 feet away. In the case of STARLIGHT, it gives a completely 3-D effect and more. It can sense which way you are looking, so it shows you a full 3-D environment that can be expanded to any size the viewer wants. And then they add interactive elements.
You can put on a special glove that can be seen in the projected image in front of your eyes. As you move this glove in the 3-D space you are in, it moves in the 3-D computer images that you see in your binocular eye-mounted screens. Plus, this glove can interact with the projected data elements.

Let's see how this might work in a simple example. The first civilian application of STARLIGHT was for the FAA, to analyze private aircraft crashes over a 10-year period. Every scrap of information was scanned in from accident reports, FAA investigations and police records – almost all of it free-form text. This included full specs on the aircraft, passengers, pilot and type of flight plan (IFR, VFR). It also included geospatial data listing departure and destination airports, peak flight plan altitude, elevation of impact, and distance and heading data, as well as temporal data for the times of day, week and year that each event happened. This was hundreds of thousands of documents that would have taken years to key into a computer if a conventional database were used. Instead, high-speed scanners were used that read in reports at a rate of 200 double-sided pages per minute. Using a half dozen of these scanners, the data entry was completed in less than one month.

The operator then assigned colors to various ranges of data. For instance, he first assigned red and blue to male and female pilots and then looked at the data projected on a map. What popped up were hundreds of mostly red (male) dots spread out over the entire US map. Not real helpful. Next he assigned a spread of colors to all the makes of aircraft – Cessna, Beechcraft, etc. Now all the dots changed to a rainbow of colors with no particular concentration of any given color in any given geographic area. Next he assigned colors to hours of the day, doing 12 hours at a time – midnight to noon and then noon to midnight. Now something interesting came up. The colors assigned to 6 AM and 6 PM (green), and the shades of green just before and after 6 AM and 6 PM, were dominant on the map. This meant that the majority of the accidents happened around dusk or dawn. Next the operator assigned colors to distances from the departing airport – red being within 5 miles, orange 5 to 10 miles, and so on, with blue being the longest (over 100 miles). Again, a surprise in the image: the map showed mostly red or blue with very few colors in between. When he refined the query so that red meant within 5 miles of either the departure or destination airport, almost the whole map was red.

Using these simple techniques, an operator was able to determine in a matter of a few hours that 87% of all private aircraft accidents happen within 5 miles of the takeoff or landing runway, 73% happen in the twilight hours of dawn or dusk, 77% happen with the landing gear lowered or the landing lights on, and 61% of the pilots reported being confused by ground lights. This gave the FAA the information it needed to improve approach lighting and navigation aids in the terminal control areas (TCAs) of private aircraft airports. This was a very simple application that used a limited number of visual parameters at a time.
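To make the color-coding idea above concrete, here is a minimal sketch in Python of how an analyst might re-color the same accident dots under different queries and look for concentrations. This is an illustration only, not the STARLIGHT software; the handful of accident records, the field layout and the color bands are all invented for this example (a real run would use the scanned FAA data).

```python
# Minimal sketch of color-coded visual exploration of accident records.
# The records below are invented: (longitude, latitude, hour of day, miles from airport).
import matplotlib.pyplot as plt

accidents = [
    (-118.2, 34.1, 6, 3.2),
    (-97.7, 30.3, 18, 4.1),
    (-87.6, 41.9, 14, 62.0),
    (-122.4, 37.8, 5, 2.5),
    (-80.2, 25.8, 19, 7.8),
]

def color_for_hour(hour):
    """Map time of day to a color: green near dawn/dusk, gray otherwise."""
    return "green" if hour in (5, 6, 7, 17, 18, 19) else "gray"

def color_for_distance(miles):
    """Map distance from the departure/destination airport to a color band."""
    if miles <= 5:
        return "red"
    if miles <= 10:
        return "orange"
    return "blue"

# Re-color the same dots under two different queries and look for clusters.
for title, color_fn, idx in [("By hour of day", color_for_hour, 2),
                             ("By distance from airport", color_for_distance, 3)]:
    xs = [a[0] for a in accidents]
    ys = [a[1] for a in accidents]
    colors = [color_fn(a[idx]) for a in accidents]
    plt.figure()
    plt.scatter(xs, ys, c=colors)
    plt.title(title)
    plt.xlabel("longitude")
    plt.ylabel("latitude")
plt.show()
```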
But STARLIGHT is capable of so much more. It can assign things like the direction and length of a vector, the color of a line or its tip, curvature, width and taper to various elements of a search. It can give one shape to one result and a different shape to another. This gives significance to "seeing" a cube versus a sphere, or to seeing rounded corners on a flat surface instead of square corners on an egg-shaped surface. Everything visual can have meaning. Having 20+ variables at a time that can be interlaced with geospatial and temporal (historical) parameters allows the program to search an incredible amount of data. Since the operator is looking for trends, anomalies and outliers, the visual representation of the data is ideal for spotting them without the operator actually reading the underlying data. Since the operator is seeing an image that is devoid of the details of numbers or words, he can easily spot some aspect of the image that warrants a closer look.

In each of these trial queries, the operator can use his gloved hand to point to any given dot and call up the original source of the information in the form of a scanned image of the accident report. He can also touch virtual screen elements to bring up other data or query elements. For instance, he can merge two queries to see how many accidents near airports (red dots) had more than two passengers, or were single-engine aircraft, and so on. Someone looking on would see a guy with weird glasses waving his hand in the air, but in his eyes he is pressing buttons, rotating knobs and selecting colors and shapes to alter his 3-D view of the data.

In its use at NSA, they add one other interesting capability: pattern recognition. It can automatically find patterns in the data that would be impossible for any person to find by looking at the data directly. For instance, they put in a long list of words that are linked to risk assessments – such as plutonium, bomb, kill, jihad, etc. Then they let it search for patterns. Suppose there are dozens of phone calls being made to coordinate an attack, but the callers are from all over the US and every caller is calling someone different, so no one number or caller can be linked to a lot of risk words. STARLIGHT can collate these calls and find the common linkage between them, and then it can track the calls, callers and discussions in all other media forms (a toy sketch of this collation idea appears below). Now imagine the list of risk words and phrases being tens of thousands of words long, including code words and words used in other languages. It can include consideration of the source or destination of the call – from public phones or unregistered cell phones. It can link the call to a geographic location within a few feet and then track the caller in all subsequent calls. It can use voice-print technology to match calls made on different devices (radio, CB, cell phone, landline, VOIP, etc.).

This is still just a sample of the possibilities. STARLIGHT was the first generation and was only as good as the data that was fed into it through scanned documents and other databases of information. A later version, code-named Quasar, was created that used advanced data mining and ERP (enterprise resource planning) system architecture to integrate direct feeds from information-gathering resources. For instance, the old STARLIGHT system had to feed recordings of phone calls into a speech-to-text processor, and the resulting text was then fed into STARLIGHT. In the Quasar system, the voice-monitoring equipment (radios, cell phones, landlines) is fed directly into Quasar, as is the direct feed of emails, telegrams, text messages, Internet traffic, etc. So does the government have the ability to track you? Absolutely! Are they? Absolutely! But wait, there's more!
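As mentioned above, here is a toy sketch of the collation idea: group intercepted calls that share "risk" keywords even when no two calls share a phone number. The transcripts, numbers and keyword list are invented, and real pattern recognition would obviously be far more elaborate than keyword matching.

```python
# Toy sketch: collate call transcripts by shared risk keywords.
# All data below is invented for illustration.
RISK_WORDS = {"plutonium", "bomb", "detonator", "jihad"}

calls = [
    {"caller": "555-0101", "text": "the package and the detonator arrive tuesday"},
    {"caller": "555-0199", "text": "pick up the detonator near the bridge"},
    {"caller": "555-0142", "text": "grandma's birthday cake is ready"},
]

def risk_terms(call):
    """Return the risk keywords that appear in one call transcript."""
    return {w for w in call["text"].lower().split() if w in RISK_WORDS}

# Group callers by the risk terms they have in common.
clusters = {}
for call in calls:
    for term in risk_terms(call):
        clusters.setdefault(term, []).append(call["caller"])

for term, callers in clusters.items():
    if len(callers) > 1:
        print(f"Possible linkage via '{term}': {callers}")
```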
Above, I said that Quasar was a "later version." It's not the latest version. Thanks to the Patriot Act, Presidential orders on warrantless searches and the ability to hack into any database, NSA can now do much more. This newer system is miles ahead of the relatively well-known Echelon program of information gathering (which was dead even before it became widely known). It is also beyond another older program called Total Information Awareness (TIA). This new capability is made possible by the bank of NSA Cray computers and memory storage that are said to make Google's entire system look like an abacus, combined with the latest integration (ERP) software and the latest pattern recognition and visual data representation systems.

Added to all of the Internet and phone monitoring and screening are two more additions in a new program called "Kontur." Kontur is the Danish word for profile. You will see why in a moment.

Kontur adds geospatial monitoring of a person's location to the database. Since 2005, every cell phone broadcasts its GPS location at the beginning of every transmission, as well as at regular intervals even when you are not using it to make a call. This was mandated by the Feds supposedly to assist with 911 emergency calls, but the real motive was to be able to track people's locations at all times. For the few people still using older model cell phones, they employ "tower tracking," which uses the relative signal strength and timing of the cell phone signal reaching each of several cell phone towers to pinpoint a person within a few feet.

A holdover from the Quasar program was the tracking of commercial data, which included every purchase made by credit card or any purchase where a customer discount card is used – like at grocery stores. This not only gives the Feds an idea of a person's lifestyle and income but, by recording what they buy, lets them infer other behaviors. When you combine cell phone and purchase tracking with the ability to track other forms of transactions – like banking, doctors, insurance, police and public records – there are relatively few gaps in what they can know about you.

Kontur also mixed in something called geofencing, which allows the government to create digital virtual fences around anything they want. Then, when anyone crosses this virtual fence, they can be tracked. For instance, there is a virtual fence around every government building in Washington DC. (A minimal sketch of how such a fence check might work appears at the end of this post.) Using predictive automated behavior monitoring and cohesion assessment software combined with location monitoring, geofencing and sophisticated social behavior modeling, pattern mining and inference, they are able to recognize patterns of people's movements and actions as threatening. Several would-be shooters and bombers have been stopped using this equipment.

To talk about the "profile" aspect of Kontur, we must first talk about how it became possible, because it became possible only when the Feds were able to create very, very large databases of information and still make effective use of that data. It took NSA 35 years of computer use to get to the point of using a terabyte (10^12 bytes) of data. That was back in 1990, using ferrite core memory. It took 10 more years to get to a petabyte (10^15 bytes) of storage – that was in early 2001, using 14-inch videodisks and RAID banks of hard drives. It took four more years to create and make use of an exabyte (10^18 bytes) of storage.
With the advent of quantum memory using gradient echo and EIT (electromagnetically induced transparency), the NSA computers now have the capacity to store and rapidly search a yottabyte (10^24 bytes) of data, and they expect to be able to raise that to 1,000 yottabytes within two years. To search this much data, they use a bank of Cray XT Jaguar computers that do nothing but read and write to and from the QMEM – quantum memory. The look-ahead and read-ahead capabilities are possible because of the massively parallel processing of a bank of other Crays that gives an effective speed of about 270 petaflops. Speeds are increasing at NSA at a rate of about 1 petaflop every two to four weeks. This kind of speed is necessary for things like pattern recognition and making use of the massive profile database of Kontur.

In late 2006, it was decided that NSA and the rest of the intelligence and right-wing government agencies would move beyond real-time monitoring and begin developing a historical record of what everyone does. Being able to search historical data was seen as essential for back-tracking a person's movements to find out what he has been doing and whom he has been seeing or talking with. This was so that no one would ever again accuse them of not "connecting the dots." But that means what EVERYONE does! As you have seen from the description above, they can already track your movements and all your commercial activities, as well as what you say on phones or in emails, what you buy and what you watch on TV or listen to on the radio. The difference now is that they save this data in a profile about you. All of that and more.

Using geofencing, they have marked out millions of locations around the world, including obvious things like stores that sell pornography, guns, chemicals or lab equipment. Geofenced locations include churches and organizations like Greenpeace and Amnesty International. They have moving geofences around people they are tracking – terrorists, but also political opponents, left-wing radio and TV personalities, and leaders of social movements and churches. If you enter their personal space – close enough to talk – then you are flagged, and then you are geofenced and tracked. If your income level is low and you travel to the rich side of town, you are flagged. If you are rich and travel to the poor side of town, you are flagged. If you buy a gun or ammo and cross the wrong geofence, you will be followed. The pattern recognition of Kontur might match something you said in an email with something you bought and somewhere you drove in your car to determine that you are a threat. Kontur is watching and recording your entire life.

There is only one limitation to the system right now: the availability of soldiers or "men in black" to follow up on people who have been flagged is limited, so they are prioritizing whom they act upon. You are still flagged and recorded, but they are only acting on the ones judged to be a serious threat now. It is only a matter of time before they find a way to reach out to anyone they want and curb or destroy them. It might come in the form of a government-mandated electronic tag inserted under the skin or implanted at birth. They have been testing these devices on animals under the guise of tracking and identification of lost pets. They have tried twice to introduce them to everyone in the military. They have also tried to justify putting them into kids for "safety."
They are still pushing them for use in medical monitoring. Perhaps this will take the form of a nanobot. If they are successful in getting the population to accept these devices and they then determine you are a risk, they simply deactivate you by remotely popping open a poison capsule with a radio signal. Such a device might remain totally passive in a person who is not a threat, but it could be lethal, or it could be programmed to inhibit the motor-neuron system or otherwise disable someone deemed to be high-risk. Watch out for things like this. It's the next thing they will do. You can count on it.
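As referenced earlier, here is a minimal sketch of how a circular geofence check might work: compare a phone's reported GPS position against the center and radius of a virtual fence. The coordinates and the 200-meter radius are invented for illustration; this is the general idea, not any actual government system.

```python
# Minimal sketch of a circular geofence check using great-circle distance.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical fence: a 200 m circle around a building.
FENCE_LAT, FENCE_LON, FENCE_RADIUS_M = 38.8977, -77.0365, 200.0

def crossed_fence(lat, lon):
    """Return True if the reported position falls inside the virtual fence."""
    return haversine_m(lat, lon, FENCE_LAT, FENCE_LON) <= FENCE_RADIUS_M

print(crossed_fence(38.8979, -77.0362))   # a few dozen meters away -> True (flagged)
print(crossed_fence(38.9100, -77.0365))   # roughly 1.4 km away -> False
```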

Plato: Unlimited Energy – Here Already!

Plato: Unlimited Energy

 

If you are a reader of my blog, you know about Plato. It is what I call a software program that I have been working on since the late 1980's that does what I call "concept searches." The complete description of Plato is in another story on this blog, but the short of it is that it will do web searches for complex, interlinked, related or supporting data that form the basis for a conceptual idea. I developed Plato using a variety of techniques including natural language queries, thesaurus lookups, pattern recognition, morphology, logic and artificial intelligence. It is able to accept complex natural-language questions, search for real or possible solutions and present the results in a form that logically justifies and validates the solution. Its real strength is that it can find solutions or possibilities that don't yet exist or have not yet been discovered. I could go on and on about all the wild and weird stuff I have used Plato for, but this story is about a recent search for an alternative energy source…and Plato found one.
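For readers who want a feel for what a "concept search" might look like in miniature, here is a toy sketch: expand each query term through a small thesaurus and score documents by how many distinct concepts they touch. This is only an illustration of the general idea, not the Plato program; the thesaurus and documents are invented.

```python
# Toy concept search: thesaurus expansion plus concept-coverage scoring.
THESAURUS = {
    "energy": {"power", "fuel", "watt"},
    "cheap": {"inexpensive", "low-cost", "affordable"},
    "source": {"generator", "supply", "origin"},
}

def expand(term):
    """A query term stands for itself plus its related terms."""
    return {term} | THESAURUS.get(term, set())

def concept_score(query_terms, document):
    """Count how many distinct query concepts are represented in the document."""
    words = set(document.lower().split())
    return sum(1 for t in query_terms if expand(t) & words)

docs = [
    "a low-cost generator design delivering unlimited power from nuclear isomers",
    "history of the steam engine",
]
query = ["energy", "cheap", "source"]
for d in docs:
    print(concept_score(query, d), "-", d)
```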

As a research scientist, I have done a considerable amount of R&D in various fields of energy production and alternative energy sources. Since my retirement, I have been busy doing other things and have not kept up with the latest, so I decided to let Plato do a search for me to find out the current state of the art in alternative energy and the status of fusion power. What Plato came back with is a huge list of references in support of a source of energy that is being used by the government but is being withheld from the public. This energy source is technically complex but is far more powerful than anything being used today short of the largest nuclear power plants. I have read over most of what Plato found and am convinced that this source of power exists, is being used, and is being actively suppressed by our government. Here is the truth:

On January 25, 1999, a rogue physicist researcher at the University of Texas named Carl Collins claimed to have achieved stimulated decay of nuclear isomers using a second-hand dental x-ray machine. As early as 1988, Collins was saying that this was possible, but it took 11 years to get the funding and lab work to do it. It was later confirmed by several labs, including that of Dr. Belic at the Stuttgart Nuclear Physics Group, and Collins' results were published in the peer-reviewed journal Physical Review Letters. The science of this is complex, but what it amounts to is a kind of cold fusion. Nuclear isomers are atoms with a metastable nucleus. That means that when they are created in certain radioactive materials, the protons and neutrons (nucleons) in the nucleus of the atom are bonded or pooled together in what is called an excited state.

An analogue would be stacking balls into a pyramid. It took energy to get them into that stacked state, but what Collins found is that it takes relatively little energy to destabilize the stack and release a lot of energy. Hafnium and tantalum are two naturally occurring elements with metastable isomers that can be triggered to release their energy with relatively little external excitation.

Hafnium, for instance, releases a photon with an energy of 75 keV (75,000 electron volts), and one gram produces 1,330 megajoules of energy – the equivalent of about 700 pounds of TNT. A five-pound ball is said to be able to create a two-kiloton blast – the equivalent of 4,000,000 pounds of TNT. A special type of hafnium called Hf-178-m2 is capable of producing energy in the exawatt range, that is, 10,000,000,000,000,000,000 (10^18) watts! This is far more than all the energy created by all the nuclear plants in the US. As a comparison, the largest energy producer in the world today is the Large Hadron Collider (LHC) near Geneva, which cost more than $10 billion and can produce a beam with a power estimated at 10 trillion watts (10^12), but only for about 30 nanoseconds (billionths of a second) at a time.
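As a quick back-of-the-envelope check on that 700-pounds-of-TNT-per-gram figure, using the standard TNT equivalence of 4.184 MJ/kg:

```python
# Sanity check of the "1 g of Hf-178-m2 ~ 700 lb of TNT" figure quoted above.
MJ_PER_GRAM_HF = 1330.0        # claimed energy release per gram of the isomer
MJ_PER_KG_TNT = 4.184          # standard TNT equivalence
KG_PER_POUND = 0.4536

tnt_kg = MJ_PER_GRAM_HF / MJ_PER_KG_TNT
tnt_lb = tnt_kg / KG_PER_POUND
print(f"1 g of isomer ~ {tnt_kg:.0f} kg ({tnt_lb:.0f} lb) of TNT")
# -> roughly 318 kg, i.e. about 700 lb, consistent with the figure above.
```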

Imagine being able to create a million (10^6) times the LHC's beam power but sustain it indefinitely. We actually don't have a power grid capable of handling that, but because we are talking about a generator that might be the size of a small house, this technology could be inexpensively replicated all over the US or the world to deliver as much power as needed.

These are, of course, calculated estimates based on extrapolation of Collins' initial work and the follow-on experiments, but not one scientist has put forth a single peer-reviewed paper that disputes these estimates or the viability of the entire experiment. It is also obvious that the mechanism of excitation would have to be larger than a dental x-ray machine in order to get 10^18 watts out of it. In fact, when Brookhaven National Lab conducted its Triggering Isomer Proof (TRIP) test, it used its National Synchrotron Light Source (NSLS) – an intense synchrotron x-ray source – as the excitation.

Obviously, this was met with a lot of critical reviews and open hostility from the world of physics. It looked like just another "cold fusion" fiasco, which was still fresh in everyone's minds. It was in 1989 that Pons and Fleischmann claimed to have created fusion in a lab at temperatures well below what was then thought to be necessary. It took just months to prove them wrong, and the whole idea of cold fusion and unlimited energy was placed right next to astrology, perpetual motion and pet rocks.

Now Collins was claiming that he had done it again – a tiny amount of energy in and a lot of energy out. He was not reporting the microscopic "indications of excess energy" that Pons and Fleischmann claimed. Collins was saying he got large amounts of excess energy (more energy out than went in), many orders of magnitude above what Pons and Fleischmann claimed.

Dozens of labs across the world began trying to verify or duplicate his results. The biggest problem was getting hold of the hafnium needed for the experiments – it is expensive and hard to come by, so it took mostly government-sponsored studies to be able to afford it. Some confirmed his results, some had mixed results and some discredited him.

In the US, DARPA was very interested because this had the potential to be a serious weapon – one that would deliver a nuclear-bomb-scale explosion but would not violate the worldwide ban on nuclear weapons. The US Navy was very interested because it had the potential to be not only a warhead but also a new and better form of power for its nuclear-powered fleet of ships and subs.

By 2004, the controversy over whether it was viable was still raging, so DARPA, which had funded some of the labs that had gotten contradictory results, decided to have a final test. They called it the Triggering Isomer Proof (TRIP) test, and it was funded to be done at Brookhaven National Lab.

The test had generated so much news interest that everyone wanted the results. NASA, Navy, Dept. of Energy (DOE), Dept. of Defense (DoD), NRL, Defense Threat Reduction Agency, State Department, Defense Intelligence Agency (DIA), Argonne Labs, Arms Control and Disarmament Agency (ACDA), Los Alamos, MIT Radiation Lab, MITRE, JASON, and dozens of others were standing in line to hear the outcome of this test being conducted by DARPA.

So what happened in the test? No one knows. The test was conducted, and DARPA put a lockdown on every scrap of news about the results. In fact, since that test, they have shut down all other government-funded contracts in civilian labs on isomer triggering. The only break in that cover has been a statement from the most senior DOE scientist involved, Dr. Ehsan Khan, who said:

“TRIP had been so successful that an independent evaluation board has recommended further research….with only the most seasoned and outstanding individuals allowed to be engaged”.

There has been no peer review of the TRIP report. It has been seen by a select group of scientists, but no one has leaked anything about it. What is even more astounding is that none of those many other government agencies and organizations have raised the issue. In fact, any serious inquiry into the status of isomer-triggering research is met with closed doors, misdirection or outright hostility. The government has pushed it almost entirely behind the black curtain of black projects. Everything related to this subject is now either classified Top Secret or is openly and outwardly discredited and denounced as nonsense.

This has not, however, stopped other nations or other civilian labs and companies from looking into it. But even here, they cannot openly pursue isomer triggering or cold fusion. Research into such subjects is now called "low-energy nuclear reactions" (LENR) or "chemically assisted nuclear reactions" (CANR). Success in these researchers' experiments is measured by the creation of "excess heat," meaning that an experiment has created more (excess) energy than was put into it. Plato found that the people and labs that have achieved this level of success include:

Lab or company – Researcher

University of Osaka, Japan – Arata

ENEA, Frascati (Rome), Italy – Vittorio Violante

Hokkaido University, Japan – Mizuno

Energetic Technology, LLC, Omer, Israel – Shaoul Lesin

Portland State University, USA – Dash

Jet Thermal Products, Inc., USA – Swartz

SRI, USA – McKubre

Lattice Energy, Inc., USA – E. Storms

In addition, the British and Russians have both published papers, and intelligence reports indicate they may both be working on a TRIP bomb. The British have a group called the Atomic Weapons Establishment (AWE) that has developed a technique called Nuclear Excitation by Electron Transition and is actively seeking production solutions. The Russians may have created an entire isolated research center just for studying TRIP for both weapons and energy sources.

In addition to the obvious use of such a power source to allow us to wean ourselves off fossil fuels, there are lots of other motivations for seeking a high-density, low-cost power source: global warming, desalination, robotics, mass transportation, long-distance air travel, space exploration, etc.

These applications are normal, common-sense uses, but what application might motivate our government to suppress news coverage of further research and to wage a disinformation and discrediting campaign against anyone who works on this subject? One obvious answer is its potential as a weapon, but since that is also well known and common sense, there must be some other reason the government does not want this pursued. What that reason is will not be found by searching for it directly. If it is a black project, it will not have Internet news reports about it, but it might have a combined group of indicators and seemingly disconnected facts that form a pattern when viewed in light of some common motive or cause. Doing that kind of searching is precisely what Plato was designed to do.

What my Plato program discovered is that there are a number of unexplained events and sightings that have a common thread. These events and sightings are all at the fringes of science, or are outright science fiction if you go by current common knowledge of science or listen to the government denounce and discredit the observers. Things like UFOs that move fast but make no noise, space vehicles that can approach the speed of light, underwater vessels reported to travel faster than the fastest surface ships, and beam weapons (light, RF, rail) that can destroy objects as far away as the moon. What they have in common is that if you assume there is a compact, high-density source of extremely high-powered energy, then these fantastic sightings suddenly become quite plausible.

A power source that can create 10 TeV (tera-electron volts) is well within the realm of possibility for an isomer-triggered device and is powerful enough to create and/or control gravitons and the Higgs boson and the Higgs field. See my other blog stories on faster-than-light travel and on dark energy, and you will see that if you have enough power, you can manipulate the most fundamental particles and forces of nature, including gravity, mass and even time.

If you can control that much power, you can create particle-beam weapons, lasers and rail guns that can penetrate anything – even miles of earth or ocean. If you can create enough energy (about 15 TeV), you can create a negative graviton – essentially negative gravity – which can be used to move an aircraft with no sound at supersonic speeds. It would also let you break all the rules of normal aerodynamics and create aircraft that are very large, in odd shapes (like triangles and arcs), and still able to travel slowly. Collins estimated that a full-scale isomer-triggered generator could generate power in the 1,000 TeV range when combined with the proper magnetic infrastructure of a collider like the LHC.

Plato found evidence that this is exactly what is happening. The possibility that all of these sightings share this one single thread by coincidence is beyond logic or probability. The fact that these sightings and events have occurred by the hundreds in just the past few years – since the DARPA TRIP test – is way beyond coincidence. It is clear that DARPA put the wraps on this technology because of its potential as a weapon and as an unlimited high-density power source.

The fact that this has been kept hushed up is mostly due to the impact it would have on the economies of the world if we were suddenly given unlimited power that was not based on fossil fuels, coal or hydroelectric power. Imagine the instant availability of all the electricity you could use at next to nothing in cost. Markets would collapse in the wake of drops in everything related to oil, gas and coal. That is not a desirable outcome when we are already in such a bad financial recession.

Plato comes up with some wild ideas sometimes, and I often check them out to see whether they are really true. Plato gave me perhaps 75 references, of which I have listed only a few in this article – but enough that you can see they are all there and true. I encourage you to search for all the key words, people and labs listed here. Prove it to yourself – it's all true.

NASA Astrophysics Data System (ADS): Physical Review Letters, Vol. 99, Issue 17, id. 172502, titled "Isomer Triggering via Nuclear Excitation by Electron Capture (NEEC)," reported confirmed low-energy triggering with high energy yields.

Brookhaven National Lab conducted a Triggering Isomer Proof (TRIP) test using its National Synchrotron Light Source (NSLS), in which it reported: "A successful independent confirmation of this valuable scientific achievement has been made … and presented in a Sandia Report (SAND2007-2690, January 2008)." The work was funded by DARPA, which pulled the funding right after the test.

FCC Warning: Anomalous Content

SECRET

Compartment Coded: Megaphone

 

 

FEDERAL COMMUNICATIONS COMMISSION

Enforcement Bureau

Content Enforcement Division (CED)

 

 

*****************************************************

NOTICE

*****************************************************

FCC Violation Notice for the Executive Office of the President

*****************************************************

 

Continuous Custody Courier Delivery

April 20, 2009

Subject: Commercial Broadcast Radio Stations KARB, KARV, KBBR, KCRB, et al

Commercial Radio License CB8I: Warning Notice, Case #EB-2008-2997-RB

Dear Sir:

On August 1, 2007, the FCC/CED discovered a Part 15 violation regarding inappropriate content within the assigned bands of operation of 173 commercial AM and FM broadcast radio stations located in every state. The nature of the inappropriate content appears to be an extremely sophisticated subliminal message that is undetectable by routine spectrum analysis because it is dynamically created by the beat frequencies of the broadcast. This means that any specific analysis of the broadcast content will show no embedded or side-band signals; however, the audio modulation of the received broadcast at the receiver's speaker creates an artificial but highly effective analog influence upon and within any listener.

The effect appears to result from the signal creating binaural beat tones inside the superior olivary nucleus of the brain stem. Preliminary research has shown that these temporal modulations are creating multiple brainwave synchronizations below the conscious perception threshold, and they are having measurable effects (see below) on the listeners in each of the radio broadcast regions. The signal is not a voice, per se, but rather it has a direct and immediate influence on the inferior colliculus neurons of the brain. The effect of this influence has been measured as activation of the primary sensorimotor and cingulate areas, bilateral opercular premotor areas, bilateral SII, ventral prefrontal cortex and, subcortically, the anterior insula, putamen and thalamus. These and the other affected areas of the brain control motor reflexes, hunger, vision, decision-making, body temperature, temperament, smell and memory.

Collaboration with NSA and NRL has provided us with a complete analysis of the signal, but this has been of only limited help in establishing the cause and its effect on the listening public. At the suggestion of Dr. Wayne Sponson at NSA, the FCC/CED contacted the Sensory Exploitation Division (SED) of NIH at Fort Detrick, Maryland. We were delayed for 4 weeks in order to process clearances for two members of the FCC/CED (myself and Dr. Edward Willingsley).

In late February, we were able to obtain the following information. The NIH/SED has been working on binaural beats to explore the phenomenon called the frequency following response, or entrainment. They have been highly successful in this field of study; however, their efforts have focused on the creation of infrasound-induced beat frequencies to entrain brain waves. This has been shown to affect the delta, theta, alpha, beta and gamma brainwaves. By contrast, the contaminated signals from these radio stations are created using sounds well above the infrasound range and well within the range of normal music listening.
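For context, the beat phenomenon itself is textbook physics: two ordinary tones a few hertz apart produce an amplitude modulation at their difference frequency, even though a spectrum of the signal shows only the two carrier tones. The following is a minimal illustration of that general effect, not a reconstruction of the anomalous signal; the sample rate and tone frequencies are arbitrary choices.

```python
# Minimal demonstration of a beat frequency arising from two audible tones.
import numpy as np

fs = 44100                     # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)  # one second of audio
f1, f2 = 440.0, 447.0          # two tones 7 Hz apart

tone = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# The sum is amplitude-modulated at |f1 - f2| = 7 Hz, even though a spectrum
# of the signal shows only the two carrier tones, not a 7 Hz component.
envelope = np.abs(2 * np.cos(np.pi * (f1 - f2) * t))
print("beat frequency:", abs(f1 - f2), "Hz")
```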

Dr. Alan Cupfer of NIH's Neuroscience Research confirmed that entrainment using binaural beat stimulation (or using light) has been shown to be quite effective in affecting dream states, focus, anxiety, addiction, attention, relaxation, learning, mood and performance. He also admitted that by first achieving brain synchronization and then applying entrainment to create constructive or destructive interference with brain frequencies, it is possible to significantly enhance or suppress these brain functions.

NSA computers discovered these signals during routine monitoring of the broad frequency spectrum of all transmissions. The computers had been recording the signals as an automatic consequence of detecting an anomaly; however, because no specific threatening content was recognized by the computers, the signals were not flagged to any human operators or analysts at NSA. This is a procedural error that has been corrected.

Once the FCC/CED discovered the nature of these anomalous signals in August 2008 and coordinated with NSA, NSA provided our office with archived recordings that date back to 2001 and show increasing coverage, from the first contaminated station found in California to the present 173. The affected stations appear to be increasing at a rate of about two per month. It is estimated that approximately 61 million people are currently within the broadcast coverage areas of these stations.

In our two-month exploration of what impact or objective, if any, these broadcasts are having on the listening audience, we have discovered the following:

  1. The subliminal signals appear to be constantly varying at each station and between stations, even when the same music or other recordings are being played. It appears that the anomalous signals are being injected into the broadcast systems at each station's transmission facility from an exterior source, but the means and mechanism of this signal injection have not yet been determined. Until they are, we cannot stop it.
  2. The anomalous signals can be distinguished from non-contaminated signals by comparing signal analyses before and after the use of adaptive filtering. The detection uses recursive least squares (RLS) and least mean squares (LMS) algorithms in an automated swept variable filter that seeks a zero cost function (error signal) when compared to a reference baseline; when the computed correction factor is non-zero, the NSA computers determine that the signal is contaminated and record it. These finite impulse response (FIR) filter structures have proven effective at detecting changes to the baseline reference as small as one cycle at one gigahertz over a period of 24 hours. (A minimal sketch of this detection approach appears after this list.)
  3. Despite being able to detect and isolate the anomalous signal, the combined efforts of NSA, FCC, NIH and NRL have been unable to decode the signal with respect to intelligent content. However, Dr. Tanya Huber and Joel Shiv, two researchers from the National Institute of Standards and Technology (NIST), suggested that by examining the non-conscious behavior of the listeners against a baseline, a correlation between signal content and responses might be found. These two researchers have been studying the psychological manipulation of consumer judgments, behavior and motivation since 2004.
  4. Conducting the first macro-survey of listener behavior in each of the broadcast areas initially yielded no anomalous behavior, but when micro-communities and community activities were individually examined, some conspicuous changes were noted.
  5. In Mesquite, NV, a change in the recorded anomalous signal coincided with a controversial referendum on the long-term problems with the Oasis Golf Club. This referendum was notable because it unexpectedly and nearly unanimously reversed a voter survey taken the previous day.
  6. La Pine, OR, a small farm community with a low-power publicly owned station, experienced an uncommonly large increase in the sale of over-the-counter non-steroidal anti-inflammatory agents/analgesics (NSAIAs) such as aspirin, naproxen, Tylenol and ibuprofen. The sales appear to have been initially driven by a three-week surge in demand for the analgesic qualities of these drugs; following a week-long lull, demand peaked again for three weeks for their antipyretic effects. This was validated by a large increase in the sales of thermometers and by examination reports from doctor visits. What is unusual is that this appears to have affected nearly every person in the broadcast area of this small station. The only ones not affected were the deaf.
  7. Over the survey of cities and towns, it was discovered that there was a surge in consumer activity associated with a variety of drugs and foods in more than 70 communities over the period analyzed. In each instance, the surge in sales had no prior precedent, lasted for one or two weeks and then returned to normal without recurrence.
  8. By contrast, it was also discovered that there was a corresponding decrease in sales of specific drugs, foods and drinks in 67 communities – some of which were also involved in the above-mentioned increases in sales. These decreases included a drop to nearly zero in sales of all drinks containing any form of alcohol or milk. The decreases were especially significant because doctors and local advertisers actively opposed them without effect.
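A minimal sketch of the adaptive-filtering detection approach described in item 2 follows. It adapts an FIR filter with the LMS rule so that a clean reference predicts the received signal, and treats a persistently non-zero residual as evidence of an injected component. The synthetic signals, filter length and step size below are assumptions chosen for illustration; they are not the parameters of any deployed system.

```python
# Minimal LMS adaptive-filter sketch: a non-zero residual after convergence
# suggests the received signal contains something the reference cannot explain.
import numpy as np

rng = np.random.default_rng(0)
n, taps, mu = 5000, 16, 0.01

reference = rng.standard_normal(n)                        # clean baseline programme
channel = np.convolve(reference, [1.0, 0.4, 0.1])[:n]     # ordinary linear distortion
anomaly = 0.2 * np.sin(2 * np.pi * 0.01 * np.arange(n))   # injected component
received = channel + anomaly

w = np.zeros(taps)
residual = np.zeros(n)
for i in range(taps, n):
    x = reference[i - taps:i][::-1]      # most recent reference samples
    e = received[i] - w @ x              # error between received and prediction
    w += mu * e * x                      # LMS weight update
    residual[i] = e

print("mean residual power:", np.mean(residual[taps:] ** 2))
# A residual well above the noise floor after convergence flags contamination.
```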

Dozens of other changes in consumer behavior, voter response, mood swings and entertainment sales were discovered, but no specific patterns of products, locations, responses or demographics were identified.

Summary:

The findings of the FCC/CED indicate that a significant and growing population has been and is being manipulated and controlled through radio broadcasts. The degree of control exerted has been nothing short of extraordinary and without precedent. The technology involved has so far eluded detection. The source and objectives of these anomalous signals have also not yet been determined.

It is the speculation of the FCC/CED and of the NIH/SED that this has all the signs of someone or some organization actively testing their capabilities on a live and diverse group of test subjects. These tests appear to be random but are systematically exploring the degree of influence and the parts of the brain that can be exploited by these signals. What cannot be determined is the final intent or objective, or whether it has already been accomplished or is still ongoing.

Recommendations:

It is recommended that the general public NOT be informed of this situation until we are able to define it further.

We recommend that deaf analysts be assigned to monitor on-site listening stations in all of the largest radio coverage areas to maintain observation of changes in behavior. In other areas, automated monitoring can be used to isolate the signals before sending encrypted files to NSA for analysis.

We recommend the use of the FBI and CIA to examine any commonality between these stations.

We recommend that NIST and NIH continue their survey of behavior changes in all of the affected communities.

We recommend that NRL and the FCC collaborate on the creation of a selective RF countermeasure to the anomalous signals.

We recommend that a cabinet-level task force be created within Homeland Security to assist and coordinate all of the above activities.

Sincerely,

Dr. W. Riley Hollingswood Ph.D.

FCC Director, Content Enforcement Division

April 21, 2009 Update:

Following the creation and coordination of the above report, this office was notified by NSA that the anomalous signals have been detected in both national broadcast and cable television signals.

SECRET

Government Secrets #2 They Control You!!

They Control You!!

After reading Government Secrets #1, you should know that I had access to a lot of intelligence over a long career and had a lot of insight into our government's actions on the international political stage. What I observed first hand, and in my historical research, is that repeatedly over decades the US government has gone to great effort to create wars. You will never hear a military person admit this, because most of them are not part of the decision process that commits us to war; but because they believe in the idea that we are always right, and because they will go to prison if they disobey, they will execute the orders to go to war with great gusto.

We have a very warped view of our own history. In every war we are the heroes, we fought on the side of right, and we did it honorably and with great integrity. Well, that is what the history books would have you believe. Did you ever learn that we issued orders to take no prisoners at the battle of Iwo Jima? Thousands of Japanese were shot with their hands raised in surrender. To be fair, some of them would feign surrender and then pop a grenade, but you won't see this in our history books.

Did you know that our attack strategy in Europe was to destroy the civilian population? The worst example occurred on the evening of February 13, 1945, when Allied bombers and fighters attacked a defenseless German city, one of the greatest cultural centers of northern Europe. Within less than 14 hours, not only was it reduced to flaming ruins, but an estimated one-third of its inhabitants, more than half a million, had perished in what was the worst single-event massacre of all time. More people died there in the firestorm than died in Hiroshima and Nagasaki combined.

Dresden, known as the Florence of the North, was a hospital city for wounded soldiers. Not one military unit, not one anti-aircraft battery, was deployed in the city. Together with the 600,000 refugees from Breslau, Dresden held nearly 1.2 million people. More than 700,000 phosphorus bombs were dropped on those 1.2 million people – more than one bomb for every 2 people. The temperature in the center of the city reached 1,600 degrees centigrade (nearly 3,000 degrees Fahrenheit). More than 260,000 bodies and residues of bodies were counted, but those who perished in the center of the city cannot be counted because their bodies were vaporized or were never recovered from the hundreds of underground shelters. Approximately 500,000 children, women, the elderly and wounded soldiers were slaughtered in one night.

Following the bomber attack, U.S. Mustangs appeared low over the city, strafing anything that moved, including a column of rescue vehicles rushing in to evacuate survivors. One assault was aimed at the banks of the Elbe River, where refugees had huddled during the night. The low-flying Mustangs machine-gunned those along the river, as well as thousands who were escaping in large columns of old men, women and children streaming out of the city.

Did you ever read that in your history books? Did you know that we deliberately avoided all earlier attacks on Hiroshima and Nagasaki so as to ensure that the civilian populations would not flee those cities?

This sparked my interest in looking into "my war" – Viet Nam – and I began to study it in detail. I read about its start and how the famous Tonkin Gulf incident was a complete ruse to let Lyndon Johnson boost troop levels for political gain and out of a personal fear that America might be seen as weak. He had great faith in our might and our ability to win a quick and decisive victory, so he trumped up a fake excuse to get the famous Tonkin Gulf Resolution passed to give him more power to send troops. The whole war had been just a political whim by a misguided politician, bolstered by the military-industrial complex that profited from massive arms sales and that also happened to be the largest contributor to the political campaigns. More than 50,000 US lives and countless Vietnamese lives later, we left Viet Nam having had almost no effect on the political outcome of the initial civil war to reunite the North and the South under communism – except that there were a lot fewer people to do it.

Even our basis for most of the Cold War was mostly fake. For instance, I found pretty solid evidence that as early as the 1960's there was a massive campaign to create a false missile-gap mentality in order to funnel massive amounts of money into the military.

Look up Operation Paperclip; it actually gave us a huge advantage in missile technology, so the whole basis for the Cold War, from before the Cuban Missile Crisis to the present, is based on a lie. Despite having the largest nuclear warheads, Russia's missiles were so poorly guided that an ICBM had only a 20% probability of landing within the effective range of its warhead – meaning it could be expected to come down anywhere within a radius of roughly 30 miles of its target. Our missiles, by contrast, are rated at less than 1,000 feet. In every crisis involving Russia in which we refused to back down, the Russians gave in because they knew they did not have a chance in a nuclear exchange with the US. There was never any real missile gap nor any real threat to our world from communism. It was a scapegoat for all our mistakes and expenditures.

Did you know about the testing of bio-weapons, nuclear weapons and designer drugs on our own US military? Do you know the truth about the start of Viet Nam? How about Angola, Nicaragua, the Congo, Grenada, Guatemala, Panama, El Salvador, Iran, Iraq, Israel, Argentina and dozens of others? Do you know the real story of the USS Liberty? The list of what is not fully known or understood by the US public is huge. I can guarantee that what you think happened – what is in the history books and the press – is NOT what really happened.

Here's just one example of how the news is not really the news as it happened, but as our government wants us to hear it. Britain and Argentina went to war over the Falkland Islands in 1982. One incident we had a lot of intelligence about was the sinking of several British warships. One of these ships was hit and sunk by an Exocet air-to-surface missile despite the use of extensive electronic countermeasures. Or so it was reported in the news.

Because of my access to intelligence reports, I found out that the British use of electronic countermeasures was nearly flawless at diverting or confusing these missiles. The skipper of the HMS Sheffield, in the middle of a battle, ordered the electronic countermeasures equipment to be shut off because he could not get a message to and from Britain with it on. As soon as his equipment was off, an Argentine Super Etendard launched the Exocet.

OK, this was a tragic screw-up by a British officer, but what our military planners and politicians did with it was the REAL tragedy. The bit about shutting off the electronic countermeasures equipment was deleted from all of the news reports, and only the effectiveness of the Exocet was allowed to be published by the US press. The Navy and the Air Force both used this event to create the illusion of an anti-missile defense gap in the minds of the public and politicians, and to justify the purchase of massive new defensive systems and ships at a cost of billions of dollars. All based on a false report.

In fact, an objective look at how we have been playing an aggressive game of manifest destiny with the world for the past 150 years would make you wonder how we can have any pride in our nation. From the enslavement of millions of blacks to the genocide of the American Indian to the forceful imposition of our form of government on dozens of sovereign nations, we have been playing the role of a worldwide dictator for decades. It has all been a very rude awakening for me.

The military-industrial complex that President Eisenhower warned us about is real, but latter-day analysts now call it the "military-industrial-congressional complex." Congress, and some of the Presidents we have had, form the power side of the triangle of power, money and control.

The money buys the power because we have the best government in the world – for sale on a daily basis – and that sale is so institutionalized that it is accepted as a routine way of doing business. The bribing agents are called lobbyists, but there is little doubt that when they visit a congressman to influence his vote, they are clearly and openly bribing him with money or with votes. The congressmen, in return, vote to give tax money to the companies that the lobbyists represent. Or perhaps they vote to allow those companies to retain their status, earnings or advantages even when that comes at the cost of damage to the environment, to other people or to other nations.

The control comes in the form of propaganda to sway and manipulate the masses; the military might to exert control over our enemies and our allies; and the control of the workers and voters who empower the congressmen – thus making the interlocking triangle complete.

What is not well known is a basic psychological mechanism that the military-industrial-congressional complex employs and that few people understand or recognize. Historical sociologists (people who study how societies think over time and history) have discovered that every successful society in the world, over all of history, has had a scapegoat group of people, country or culture on which to blame all its problems.

Scapegoating is a hostile social-psychological discrediting routine by which people move blame and responsibility away from themselves and toward a target person or group. It is also a practice by which angry feelings and feelings of hostility may be projected, via inappropriate accusation, toward others. The target feels wrongly persecuted and receives misplaced vilification, blame and criticism; he is likely to suffer rejection from those whom the perpetrator seeks to influence. Scapegoating has a wide range of focus: from "approved" enemies of very large groups of people down to the scapegoating of individuals by other individuals. Distortion is always a feature.

In scapegoating, feelings of guilt, aggression, blame and suffering are transferred away from a person or group so as to fulfill an unconscious drive to resolve or avoid such bad feelings. This is done by displacing responsibility and blame onto another who serves as a target for blame, both for the scapegoater and for his supporters.

Primary examples of this include 1930s Germany, in which Hitler used a variety of scapegoats to offset the German guilt and shame of World War I. He eventually chose the Jews, and the entire population of Germany readily accepted them as the evil cause of all their problems. The US did this in the South for more than a century after the Civil War by blaming everything on the black population. And it is true today for most successful countries: the Japanese hate the Koreans, the Arabs hate the Jews; in the southwest of the US the Mexicans are the targets, while in the southeast it is still the blacks; the Turks hate the Kurds… and so it goes for nearly every country in the world and for all of history.

In some cases the scapegoat might be one religious belief blaming another, as in the Muslims blaming the Jews or the Catholics blaming the Protestants. These kinds of scapegoats can extend beyond national boundaries but often are confined to regional areas like the Middle East or Central Europe. Finally, there are the political and ideological scapegoats. For many years, the US has pitted conservatives against liberals and Democrats against Republicans. This often has the effect of stopping progress, because each side blames the other for the lack of progress and then opposes any positive step that might favor the other side or give it credit. Unfortunately, this scapegoat blame-game ends up being the essence of the struggle for power and control.

What is not well understood or appreciated is that our government is very well versed in this scapegoating and blame-game as a means to avoid accountability and to confuse the objectives. By creating an enemy that we can blame all our insecurities on – as we did with communism in the Cold War – we can justify almost any expense, any sacrifice demanded of the public. If you question or oppose the decisions, then you are branded a communist sympathizer and are ostracized by society. Joseph McCarthy is the worst example of this, but it exists today when we say someone is not patriotic enough if they dare to question a funding allocation for Iraq or for a new weapon system.

We, the public, are being manipulated by a powerful and highly effective psychological mechanism that is so well refined that both the Democratic and Republican parties maintain active but highly secretive staffs composed of experts in social-psychological propaganda techniques, including scapegoating. In the Democratic Party this office is called the Committee for Public Outreach. In the Republican Party, the staff is called Specialized Public Relations. Even the names they choose make use of misdirection and reframing. Right now the Democratic Party has the better group of experts, partly because it raided the staff of the Republican office of Specialized Public Relations back in 1996 by offering huge salary increases. By paying them half a million dollars per year plus bonuses that can reach an additional $50 million, they have secured the best propaganda minds in the world.

In both cases, the staffs are relatively unknown and work in obscure private offices located away from the main congressional buildings. Their work is circulated as quietly and with as low a profile as possible, and only to the most senior party members. The reports begin with clearly defined objectives – diverting public attention, countering fact-based reports, or justifying some political action or inaction – but as they work their way through the system of reviewers and writers, the objective stays the same while the method of delivery is altered so that the intent is not at all obvious. It is here that the experts in psychology and social science tweak the wording or the events to manipulate the public, allies or voters.

The bottom line is that the federal government of the US has a long and verifiable history of lying, and the lies that have been discovered are perhaps 5% of the lies that have emanated from the government. If you care to look, you will find that a great deal of what you think you know about our nation’s history, our political motivations and accomplishments, and our current motives and justifications is not at all what you think it is. But I warn you – don’t begin this exploration unless you are willing to have your view of your country, and even of yourself, seriously shaken up. And if you don’t want to see the truth, then at least be open-minded enough to listen to what will be declared the radical views that oppose the popular political positions of the day.

Nanobots Contamination of over-the-counter (OTC) Drugs

Topics on this Page

Introduction

Update on FDA’s Investigation

FDA’s Executive Office Warnings/Advisories

Introduction

September 12, 2008: In light of recent evidence from the National Security Agency (NSA) concerning over-the-counter (OTC) drugs contaminated with nanobots, the FDA has issued a Health Information Advisory to proactively reassure the Office of the President that there is no known health threat from contaminated OTC drugs manufactured by companies that have met the requirements to sell such products in the United States. Nanobot contamination, if present, poses no apparent risk to health, even to children; however, there may be a risk to privacy.

The nanobots were discovered by NSA because they appear to be activated by an external radio frequency (RF) signal and, in response, emit a coded signal. They were found to be less than 1 centimeter long and apparently contain a passive RFID device in addition to a rudimentary mechanism for sensing and memory retention. So far, neither NSA nor FDA has been able to decipher the coded signal. Although this is considerably smaller than the VeriChip developed by Kevin Warwick, it is well within current technology.

These nanites have been found embedded in the center of OTC drugs that come in solid pill form of 325 mg and larger. Contaminated pills range from a low of 1% to a high of 3% of all pills sampled. This is an unusually high level, but the method by which these contaminated pills are inserted into the manufacturing processes of multiple producers has not yet been determined.

Analysis of their exact nature has been complicated by the fact that they seem to be encased in a protective coating that is highly reactive to light. If a contaminated OTC pill is broken open and the nanite is exposed to light, it immediately disintegrates. Further studies are underway.

The FDA had no knowledge of the presence of these nanobots prior to the notification by NSA in August 2008 and has been hampered in its analysis by a near-total lack of cooperation from the NSA. Even so, with NSA’s limited help we have been able to determine that in most urban centers approximately one in four adults is contaminated, with slightly greater percentages found in the larger urban centers of New York, Boston, Miami and Dallas.

For people who take OTC drugs on a regular basis (more than two a week), it is possible that they might accumulate more than one nanobot in their system. This does not appear to increase or decrease the health risk to the person, but it does appear to alter the RF signals emanating from the RFID circuits of the nanites.
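
As a back-of-the-envelope check on how quickly a regular user could end up carrying more than one nanite, the arithmetic from the figures above can be worked out directly (the 2% rate is simply an assumed midpoint of the 1%–3% range quoted earlier):

```python
# Expected contaminated pills ingested per year by a "regular" OTC user.
PILLS_PER_WEEK = 2          # the "regular basis" threshold quoted above
CONTAMINATION_RATE = 0.02   # assumed midpoint of the 1%-3% range quoted above

expected_per_year = PILLS_PER_WEEK * 52 * CONTAMINATION_RATE
print(f"Expected contaminated pills per year: {expected_per_year:.1f}")
```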

The FDA has broadened its domestic and import sampling and testing of OTC drugs from suspected sources but has been unable to identify the exact source or sources. FDA has recommended that consumers not use certain products because of possible contamination with nanobots. A list of those products is below.

Update on FDA’s Investigation

February 19, 2009: FDA’s ongoing investigation continues to show that the domestic supply of over-the-counter (OTC) Drugs is safe and that consumers can continue using U.S. manufactured OTC Drugs. FDA has concluded that levels of Nanobots alone are at or below 1 pill per thousand (ppt) among all OTC Drugs. This level does not raise public health concerns. FDA has updated its interim risk assessment, issued in early October, with this information:

The FDA has been collecting and analyzing samples of domestically manufactured OTC drugs for the presence of nanobots and nanobot-related RF signal responses. To date, FDA tests have found extremely low levels of nanobots in one OTC drug sample and moderate levels of RF signal responses from concentrations of OTC drugs, such as in a commercial drug store. The benign nature of the nanobots found so far indicates that they were designed for tagging, tracking and collection of health information; they do not interact with the body or its systems and therefore pose no health risk to the public.

To date, statistical data on individuals contaminated with the nanobots has been limited, but several trends have begun to emerge. The number of people contaminated appears to be equally divided between men and women and proportionally distributed among ethnic and racial groups. The passive RFID tag is responsive to various frequencies in the high UHF and SHF range (922 MHz to 2202 GHz) and appears to make use of the backscatter coupling method; however, a few known contaminations could not be activated with any signal source.

Studies have shown that these passive RFID tags can be activated by signals from satellites but have to be read by a receiver located within ten feet. During the testing of nanobots that had actually been ingested by people, NSA discovered that the cell phones of the people being tested emitted an unusual signal pattern in response to a band sweep of SHF RF signals. The cell phone activation is being investigated further.
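
As a rough illustration of why a passive tag can be triggered from far away yet still has to be read from within a few feet, here is a minimal sketch of the forward-link power budget using the standard free-space (Friis) equation; the frequency, transmit power, antenna gains and wake-up threshold are assumed values for illustration only, not figures from this investigation:

```python
import math

def friis_received_power_dbm(p_tx_dbm, g_tx_dbi, g_rx_dbi, freq_hz, dist_m):
    """Received power (dBm) over a free-space path, per the Friis equation."""
    wavelength = 3.0e8 / freq_hz
    path_loss_db = 20 * math.log10(4 * math.pi * dist_m / wavelength)
    return p_tx_dbm + g_tx_dbi + g_rx_dbi - path_loss_db

# Assumed, illustrative values: 915 MHz reader, 1 W (30 dBm) transmit power,
# 6 dBi reader antenna, 0 dBi tag antenna, -15 dBm tag wake-up threshold.
FREQ_HZ = 915e6
WAKEUP_DBM = -15.0

for dist_ft in (1, 3, 10, 30, 100):
    dist_m = dist_ft * 0.3048
    p_at_tag = friis_received_power_dbm(30.0, 6.0, 0.0, FREQ_HZ, dist_m)
    status = "enough to wake the tag" if p_at_tag >= WAKEUP_DBM else "too weak"
    print(f"{dist_ft:>4} ft: {p_at_tag:6.1f} dBm at the tag -> {status}")
```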

For unknown reasons, some people eliminate or pass their nanobot out of their systems relatively quickly, while others retain the nanobots either for extended periods or permanently (until surgically removed). Further studies are trying to determine what health condition, if any, is common among those who retain their nanites. In our sampling of US cities using roaming teams with sweep generators and receivers, it was discovered that in one city the signal emanating from the RFID tags lasted about 21.7 milliseconds longer than in any other urban center.

As of this FDA warning, there appears to be no immediate health risk and no reason to unduly alarm the public with a general announcement. NSA has indicated it will report separately to the Executive Office of the President on its findings.

Transcript for FDA’s Executive Office Briefing: FDA’s Updated Interim Safety and Risk Assessment of Nanobots and its Analogues in OTC drugs for Humans

November 28, 2008

FDA’s Warnings/Advisories


A few of you doubt me?!!

 

I have gotten a number of comments about the science in my stories. Since I spent most of my life in hard-core R&D, science is my life and the way I talk. To read my stories, you have to be willing either to accept that the science behind them is fact or to go look it up yourself. You will quickly find that there is damn little fiction, if any, in my stories. I take exception to people who say the science is wrong, so I’m going to analyze one of the stories I have gotten the most questions about.

 

In the story about the accidental weapon discovery, I described a C-130 with a multi-bladed prop – see US Patent 4171183. Also see http://usmilnet.com/smf/index.php?topic=9941.15 and http://www.edwards.af.mil/news/story.asp?id=123089573. As I said in the story, the long telescoping blade is still classified, so there are no public pictures of it.

 

The ATL (airborne tactical laser) program is being run out of the ACTD program by the DUSD(AS&C), an office within OSD. The ACTD program is where the original project was started, in cooperation with the Naval Research Lab (NRL). The original objective was to improve the speed and range of long-distance transport aircraft. It followed research showing that if the variable pitch of the prop were extended further outward from the hub, efficiency would improve.

 

Since a prop is a lifting wing that lifts horizontally, it must maintain a constant angle of attack (AoA) over the entire length of the blade. AoA is the angle between the chord line of the blade and the axis of the airflow over it. Since the relative speed of the blade increases with distance from the hub, the blade must be progressively twisted, or change pitch, along its length to keep that angle constant. This was the essential secret the Wright Brothers discovered in 1902 and is the basic difference between a screw propeller and a wing propeller.
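
To make the twist requirement concrete, here is a minimal sketch that computes the local blade pitch needed to hold a constant AoA at several stations along a blade; the airspeed, RPM and 4-degree AoA are assumed round numbers, not figures from the actual aircraft:

```python
import math

def pitch_angle_deg(v_forward_ms, rpm, radius_m, aoa_deg):
    """Blade pitch = local inflow angle + desired angle of attack.

    The inflow angle is atan(forward speed / tangential speed), so it shrinks
    toward the tip and the blade setting has to change along its length.
    """
    omega = rpm * 2.0 * math.pi / 60.0   # shaft speed in rad/s
    inflow_deg = math.degrees(math.atan2(v_forward_ms, omega * radius_m))
    return inflow_deg + aoa_deg

V_FORWARD_MS = 154.0   # ~300 knots, assumed cruise speed
PROP_RPM = 1020        # assumed prop speed
AOA_DEG = 4.0          # assumed constant angle of attack

for r_m in (0.5, 1.0, 2.0, 3.0, 4.0):
    pitch = pitch_angle_deg(V_FORWARD_MS, PROP_RPM, r_m, AOA_DEG)
    print(f"r = {r_m:3.1f} m -> blade pitch ~ {pitch:5.1f} deg")
```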

What was discovered in the development of vertical wind turbines is that blades as long as 50 feet but as thin as 5 inches could be made more efficient and with higher torque than conventional blades. In wind power, the added torque lets you turn a larger generator, because the wind passing over the blade makes it spin. In an aircraft, the engine spins the blade so that it takes a bigger (more efficient) bite out of the air; this would mean being able to create more thrust, or being able to operate at a higher altitude (in thinner air). Do a Google search for “Vertical Wind Turbine”. You’ll see designs like the WindSpire, which is 30 feet tall with blades less than 8 inches wide, yet is efficient enough to produce about 2,000 kilowatt-hours per year, operate in 8 MPH winds and handle 100 MPH gusts.
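
For a sanity check on what a rotor that tall and that narrow can actually capture, here is a minimal sketch of the standard swept-area power estimate; the rotor dimensions and the 30% power coefficient are assumptions for illustration, not the WindSpire’s published specification:

```python
# Swept-area wind power estimate: P = 0.5 * rho * A * v^3 * Cp
RHO = 1.225            # kg/m^3, sea-level air density
AREA_M2 = 1.2 * 6.1    # assumed rotor: 1.2 m wide x 6.1 m tall
CP = 0.30              # assumed power coefficient (Betz limit is ~0.593)

def turbine_watts(wind_ms):
    return 0.5 * RHO * AREA_M2 * wind_ms**3 * CP

for mph in (8, 11, 25):
    v_ms = mph * 0.44704                 # mph -> m/s
    p_w = turbine_watts(v_ms)
    annual_kwh = p_w * 8760 / 1000       # if that wind blew constantly
    print(f"{mph:>2} mph wind: ~{p_w:6.0f} W  (~{annual_kwh:5.0f} kWh/yr if constant)")
```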

 

The guys at NRL took that and reversed it into an efficient propeller design for the C-130, in the hope that it would give a similar improvement in performance. The carbon-fiber telescoping blade was just a natural extension of that thinking.

 

As to the laser beam creating a wide range of frequencies, that is also easy to explain. The Doppler effect says that an increase in wavelength is observed when a source of electromagnetic radiation is moving away from the observer, and a decrease in wavelength is observed when the source is moving toward the observer. This is the basis for the redshift used by astronomers to examine the movement of stars. It is the reason a train’s whistle has a rising pitch as it comes toward you and a falling pitch as it passes and moves away from you. This is basic high school physics.
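
For anyone who wants to put numbers on it, here is a minimal sketch of the relativistic Doppler formula; the source frequency and the speeds are assumed values for illustration:

```python
import math

C = 2.998e8  # speed of light, m/s

def doppler_observed_hz(f_source_hz, speed_ms, receding=True):
    """Relativistic Doppler shift for a source moving directly away from
    (receding=True, redshift) or toward (receding=False, blueshift) the observer."""
    beta = speed_ms / C
    factor = math.sqrt((1.0 - beta) / (1.0 + beta))
    return f_source_hz * (factor if receding else 1.0 / factor)

F_GREEN_HZ = 5.6e14  # assumed source: green laser light

for label, v_ms in (("300 mph", 134.1), ("3,000 miles/s", 4.83e6), ("0.9 c", 0.9 * C)):
    red = doppler_observed_hz(F_GREEN_HZ, v_ms, receding=True)
    blue = doppler_observed_hz(F_GREEN_HZ, v_ms, receding=False)
    print(f"{label:>14}: redshifted to {red:.3e} Hz, blueshifted to {blue:.3e} Hz")
```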

 

As the laser beam was rotated, any observer in a lateral position to the aircraft would see one part of the rotating beam moving toward them (for example, the part above the prop hub) and another part moving away from them (in this example, the part below the prop hub). The bottom part would have a redshift to its visible light because it is moving away from the observer. The part of the beam nearest the hub, moving the slowest, would have the least redshift, but the farther out along the beam the observer looked, the greater the speed and the greater the redshift, until the Doppler shift was so great that the light shifted to frequencies below the visible spectrum. This would move the light energy into the infrared, and as the source point along the beam moved faster and faster, it would shift lower and lower. Since the laser beam extended for miles and points along it were moving at speeds from a few hundred MPH to thousands of miles per second, the redshift along the beam path moved steadily down the electromagnetic spectrum past radar, TV and shortwave radio and down into the ELF range.

 

The portion of the beam above the hub was doing the same thing, but it was moving toward the observer in the lateral position, so it produced a blueshift – toward higher frequencies. As the light frequencies compressed into the blue and ultraviolet range, the light became invisible to the naked eye, but it was still emitting energy at higher and higher frequencies – moving into X-rays and gamma rays at points toward the end of the beam.

 

The end result of this red and blue shifting of the laser light is a cone of electromagnetic radiation emanating from the hub of each of the two engines (on the C-130) or the one engine on the retrofitted 707. This cone radiates out from the hub, with the frequency of the emissions changing continuously as the cone widens out behind the aircraft. The intensity of the emissions is directly proportional to the power of the laser and the speed of the props, so the highest and lowest frequencies were the most intense. These also happened to be the most destructive.

 

This is just one story that is firmly based in real and actual science. You have to be the judge of whether it is true or not, but I defy you to find any real flaw in the logic or the science. As with all of my stories, I don’t talk about space-cadet, tin-foil-hat stuff. I have 40 years of hard-core R&D experience along with four degrees in math, computer modeling, physics and engineering, so I’m not your usual science writer, but whether it is science fiction or not is up to you to decide. Just don’t make that decision because you don’t believe or understand the science – that is the part that should not be questioned. If you doubt any of it, I encourage you to look it up. It will educate you and allow me to get these very important ideas across to people.

Government Secrets #1 – Be Afraid…Be Very Afraid

I was involved in a long career of classified work for the military and then did classified work for the government after I got out of the military. Doing classified work is often misunderstood by the public. If a person has a Top Secret clearance, that does not mean they have access to all classified information. In fact, it is not uncommon for two people to both have Top Secret (TS) clearances and still not be allowed to talk to each other. It has to do with what the government calls “compartments”. You are allowed your clearance only within certain compartments or subject areas. For instance, a guy who has a TS for Navy weapons systems may not know, or be allowed to know, anything about Army weapon systems. If a compartment is very closely held – meaning it is separately controlled even among TS-cleared people – then it is given a special but often obscure name and additional controls. For instance, for years (back in the days of Corona, but not any more) the compartment for satellite reconnaissance was called “talent-keyhole” and “byeman” and was usually restricted to people within the NRO – the National Reconnaissance Office.

These code words were abbreviated with two letters, so talent-keyhole became TK and byeman became BY. As a further safeguard, it is forbidden to tell anyone the code word for your compartment – you are only allowed to tell him or her the two-letter abbreviation. And you cannot ask someone if they are cleared for any particular compartment; you have to check with a third-party security force. So if you work in a place like CIA or NRL or NSA and you want to have a meeting with someone from another department, you meet them at the security station outside your department’s office area (every department has one). When they arrive, you ask the guard if they are cleared for “TK” and “BY”. The guard looks at the visitors’ badges and checks them against a picture logbook he keeps. The pictures and codes on the badges and in the logbook have to match, and if they do, he gets out another book that lists just the visitor’s numeric badge number and looks up his clearances. If he has TK and BY after his badge number, then you are told that he can be admitted to your area for discussions on just the TK and BY programs and subjects. In some departments, the visitors are given brightly colored badges identifying them as being cleared only for specific subject areas. This warns others in the department to cover their work or stop talking about other clearance areas when these visitors are nearby.
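
To picture the check the guard performs, here is a toy model of the badge-and-logbook lookup; the badge numbers and clearances are invented for illustration and do not describe any real system:

```python
# Toy model of the compartment check at the department door.
# Badge numbers and clearances are invented for illustration only.
CLEARANCE_BOOK = {
    "4471": {"TK", "BY"},
    "9902": {"TK"},
}

def may_admit(badge_number, requested_compartments):
    """Admit the visitor only if every requested compartment appears
    next to their badge number in the guard's logbook."""
    held = CLEARANCE_BOOK.get(badge_number, set())
    return requested_compartments <= held

print(may_admit("4471", {"TK", "BY"}))  # True  -> admitted for TK/BY discussions
print(may_admit("9902", {"TK", "BY"}))  # False -> not cleared for BY
```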

There are hundreds of these coded compartments covering all aspects of military and civilian classified topics and programs. If you are high enough, or if your work involves a lot of cross-discipline work, you might have a long string of these code words after your name… as I did.

If a program has any involvement with intelligence gathering (HUMINT – human intelligence, SIGINT – signals intelligence, or IMINT – imagery intelligence), then it may get additional controls that go well beyond the usual TS background checks. For instance, you might be subjected to frequent polygraph tests or be placed in the PRP – the Personnel Reliability Program. PRP is a program that constantly monitors people’s lives to see if they ever come even close to being vulnerable or easy targets for spies. In the PRP, your phone might be tapped, your checking accounts are monitored, your debt and income are watched, and your computer is hacked. This is all with the intent of making sure you never get into debt or become psychologically unstable. PRP administers a series of psychological tests, which can take up to three days to complete, every year. These tests can peer into your mind so well that the administrators can feel reasonably confident you are mentally stable if the tests say so.

Because of my work, I had a TS clearance for more than 40 years and had a string of two-letter codes after my name that ran on for three or four lines on a typewritten page. I was in the “poly” program and in the PRP and some others that I still can’t talk about. The reason I had so many was that I was involved in decision support using computer modeling – operations research, math modeling and simulations. This meant I had to have access to a huge range of information from a wide variety of intelligence sources as well as other kinds of R&D work. I then had to be able to analyze this information, model it and present it to the senior decision-makers in an easy-to-understand form. This meant I was often briefing congressmen, senators, people from the CIA and FBI, and high-ranking officers from all of the services, JCS and OSD, as well as the working-level analysts who were giving me their classified data for analysis.

Now I can begin telling you some of what I learned by being exposed to all of that intelligence over all those years, but I still have to be careful because, although most of my restrictions have expired, some are still in effect and I can’t violate them or I will join the ranks of the “disappeared”.

First, let me make it clear that the entire military is run by the topmost 1% of the people in the services combined with the top 5% of the federal government. Imagine a pyramid in which only the guys at the very top point decide where all the rest will go. I’ll call them the “Power Elite”.

There are just a handful of officers in the Pentagon and the JCS who make all of the decisions about what the services will do and what direction they will take – perhaps 50 officers in total. These guys are so high in rank and so close to retirement that they have, for all intents and purposes, ceased being military people and are simply politicians who wear a uniform. They are essentially the liaison officers for the highest-ranking congressmen and the office of the President. They cater to these politicians in order to gain additional power through the control of more money, or to feather their nests for future involvement in the political arena.

There are, of course, a few – a very few – notable exceptions. Colin Powell and Dwight Eisenhower are two that come to mind. Officers like General Norman Schwarzkopf are not in this group because they chose not to seek political office or extend their power or control beyond doing their military jobs.

It is easy to see why the entire military is controlled by 1% of its officers. This is an organization based on the chain of command, and everyone is taught to follow orders. In fact, once you are in the military, you can go to jail if you do not follow orders, and in time of war you can be executed for not following orders. Most of the military is so steeped in the indoctrination and propaganda created and put out by the government that they willingly follow orders without questioning them.

What this 1% of high-ranking military and 5% of the federal government have in common is that they measure their success in money and power. The source of that money and power is commercial, industrial and business interests. By making decisions that favor these businesses, they ensure that those businesses, in turn, empower and enrich those involved. What is truly tragic is that this is not a recent occurrence; there has been a Power Elite in our government for many decades – going back to the mid-1800s.

The 5% of the federal government refers to the most powerful members of the Executive branch – President, VP, Secretary of Defense, Secretary of State, etc. – and the most powerful congressmen and senators. The reason that the newer, younger and less powerful legislators do not fall into this group is the way the political parties are set up behind the scenes. The most senior congressmen and senators are put into positions of power and influence over the committees and programs that have the most influence on contracts, budget money and funding controls. When one congressman can control or seriously impact the budget for the entire military or for any major area of commerce, then he has control over all of the people in those areas. To see who these people are, list all of the congressmen and senators by length of service and take the top 5%, and you will have 99% of the list. Not surprisingly, this top 5% also includes some of the most corrupt members of Congress – Murtha, Stevens, Rangel, Renzi, Mollohan, Don Young and others.

At the highest levels of security clearance, many people gain insights into how this Power Elite manipulates and twists the system to its gain. When Dick Cheney orchestrated the fake intelligence to support his war on Iraq, don’t think for a minute that the CIA, NSA and Pentagon did not know exactly what he was doing; but being good little soldiers who are, by law, not allowed to have a political opinion, they kept quiet. If they had not kept quiet, their personal careers would have been destroyed and their departments or agencies would have been punished with underfunded budgets for years to come.

The money and power come from lobbyists, donations of funds and promises of votes, so that the Power Elite can remain in power and extend their control and riches. A study by Transparency International found that of all the professions and jobs in the world, the one most likely to make you a millionaire the soonest is being a congressman or senator in the US. In a job that pays less than $200K per year, the net wealth of most congressmen and senators rises by 30-40% per year while they are active members of the legislature. That’s a fact!

So where’s the sci-fi in all this? It’s just this: these members of the Power Elite have so much control that they can operate a virtual parallel government that functions out of sight of the public and often in complete opposition to their publicly expressed policies. Of course, statements like this cannot be made without positive and verifiable evidence, and I can provide facts you can check and a long history of this occurring going back decades. Read about these incidents in the rest of this series of stories – Government Secrets #2, #3 and #4.

Ocean Dumping – A Summary of Studies

Ocean Dumping – A Summary of 12 Studies Conducted between 1970 and 2001

By Jerry Botana

The dumping of industrial, nuclear and other waste into the oceans was legal until the early 1970s, when it became regulated; however, dumping still occurs illegally everywhere. Governments worldwide were urged by the 1972 Stockholm Conference to control the dumping of waste in their oceans by implementing new laws. The United Nations met in London after this recommendation to begin the Convention on the Prevention of Marine Pollution by Dumping of Wastes and Other Matter, which was implemented in 1975. The International Maritime Organization was given responsibility for this convention, and a protocol was finally adopted in 1996 – a major step in the regulation of ocean dumping.

The most toxic waste material dumped into the ocean includes dredged material, industrial waste, sewage sludge, and radioactive waste. Dredging contributes about 80% of all waste dumped into the ocean, adding up to several million tons of material dumped each year. About 10% of all dredged material is polluted with heavy metals such as cadmium, mercury, and chromium; hydrocarbons such as heavy oils; nutrients including phosphorus and nitrogen; and organochlorines from pesticides. Waterways, and therefore silt and sand, accumulate these toxins from land runoff, shipping practices, industrial and community waste, and other sources. This sludge is then dumped in the littoral zone of each country’s ocean coastline. In some areas, like the so-called “vanishing point” off the coast of New Jersey in the United States, such toxic waste dumping has been concentrated in a very small geographic area over an extended period of time.

In the 1970s, 17 million tons of industrial waste was legally dumped into the ocean by the United States alone. In the 1980s, even after the Stockholm Conference, 8 million tons were dumped, including acids, alkaline waste, scrap metals, waste from fish processing, flue desulphurization sludge, and coal ash.

If sludge from the treatment of sewage is not contaminated by oils, organic chemicals and metals, it can be recycled as fertilizer for crops, but it is cheaper for treatment centers to dump this material into the ocean, particularly if it is chemically contaminated. The UN position is that properly treated sludge from cities does not contain enough contaminants to be a significant cause of eutrophication (an increase in chemical nutrients, typically compounds containing nitrogen or phosphorus, in an ecosystem) or to pose any risk to humans if dumped into the ocean. However, that position was based solely on an examination of the immediate toxic effects on the food chain and did not take into account how the marine biome will assimilate and be affected by this toxicity over time. The peak of sewage dumping was 18 million tons in 1980, a number that was reduced to 12 million tons in the 1990s.

Radioactive Waste

Radioactive waste is also dumped in the oceans; it usually comes from the nuclear power process, medical and research uses of radioisotopes, and industrial uses. The difference between industrial waste and nuclear waste is that nuclear waste usually remains radioactive for decades. The protocol for disposing of nuclear waste involves sealing it in concrete drums so that it doesn’t spread when it hits the ocean floor; however, poorly made containers and illegal dumping are estimated to account for more than 45% of all radioactive waste dumped.

Surprisingly, nuclear power plants produce by far the largest amount of radioactive waste but contribute almost nothing to the illegal (post-Stockholm Conference) ocean dumping. This is because the nuclear power industry is so closely regulated and accountable for its waste storage. The greatest accumulation of nuclear waste lies off the coast of southern Africa and in the Indian Ocean.

The dumping of radioactive material has reached a total of about 84,000 terabecquerels (TBq); a terabecquerel is a unit of radioactivity equal to 10^12 atomic disintegrations per second, or 27.027 curies. The curie (Ci) is another unit of radioactivity, originally defined as the radioactivity of one gram of pure radium. The high points of nuclear waste dumping were 1954 and 1962, but this nuclear waste accounts for only 1% of the total TBq that has been dumped in the ocean. The concentration of radioactive waste in the concrete drums varies, as does the ability of the drums to hold it. To date, it is estimated that the equivalent of about 2.27 million grams (about 5,000 pounds) of pure radium has been dumped on the ocean floor.
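
Because the becquerel-to-curie arithmetic is easy to get wrong by a factor of ten or a hundred, here is the conversion worked out explicitly, using only the standard unit definitions and the total quoted above:

```python
# Unit bookkeeping for the total quoted above.
TBQ_TOTAL = 84_000       # terabecquerels dumped, from the text
BQ_PER_TBQ = 1e12        # 1 TBq = 10^12 disintegrations per second
BQ_PER_CI = 3.7e10       # 1 curie = 3.7e10 Bq, roughly 1 gram of radium-226

curies = TBQ_TOTAL * BQ_PER_TBQ / BQ_PER_CI
grams_radium_equivalent = curies      # ~1 Ci per gram of pure radium
pounds = grams_radium_equivalent / 453.592

print(f"{TBQ_TOTAL} TBq is about {curies:,.0f} Ci, i.e. roughly "
      f"{grams_radium_equivalent:,.0f} g ({pounds:,.0f} lb) of radium equivalent")
```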

Until it was banned, ocean dumping of radioactive waste was considered a safe and inexpensive way to get rid of tons of such material. It is estimated that the 1960s- and early 1970s-era nuclear power plants in New Jersey (like Oyster Creek, located just 21 miles from the Barnegat Lighthouse) and 12 other nuclear power plants in Pennsylvania, New Jersey, and New York have dumped more than 100,000 pounds of radioactive material into the ocean off the New Jersey coast.

Although some claim the risk to human health is small, the long-term effects of nuclear dumping are not known, and some estimate up to 1,000 deaths over the next 10,000 years as a result of just the evaporated nuclear waste.

By contrast, biologists have estimated that the ocean’s biome has been and will continue to be permanently damaged by exposure to radioactive material. Large-scale and rapid genetic mutations are known to occur as radiation dosage levels increase. Plants, animals and micro-organisms in the immediate vicinity of leaking radioactive waste will experience the greatest and most radical mutations between successive generations. However, tests show that even long-term exposure to diluted radioactive waste will create accelerated mutations and adaptations.

The Problems with Ocean Dumping

Although policies on ocean dumping in the recent past took an “out of sight, out of mind” approach, it is now known that the accumulation of waste in the ocean is detrimental to marine and human health. Another unwanted effect is eutrophication, a biological process in which dissolved nutrients cause oxygen-depleting bacteria and plants to proliferate, creating a hypoxic, or oxygen-poor, environment that kills marine life. In addition to eutrophication, ocean dumping can destroy entire habitats and ecosystems when excess sediment builds up and toxins are released. Although ocean dumping is now managed to some degree, and dumping in critical habitats and at critical times is regulated, toxins are still spread by ocean currents. Alternatives to ocean dumping include recycling, producing less wasteful products, saving energy and converting dangerous materials into more benign waste.

According to the United Nations Group of Experts on the Scientific Aspects of Marine Pollution, ocean dumping actually contributes less pollution than maritime transportation, atmospheric deposition, and land-based pollution such as run-off. However, when waste is dumped it is often close to the coast and very concentrated, as is the case off the coast of New Jersey.

Waste dumped into the ocean is categorized into the black list, the gray list, and the white list. On the black list are organohalogen compounds, mercury compounds and pure mercury, cadmium compounds and pure cadmium, any type of plastic, crude oil and oil products, refined petroleum and residue, highly radioactive waste, and any material made for biological or chemical warfare.

The gray list includes water highly contaminated with arsenic, copper, lead, zinc, organosilicon compounds, any type of cyanide, fluoride, pesticides, pesticide by-products, acids and bases, beryllium, chromium, nickel and nickel compounds, vanadium, scrap metal, containers, bulky wastes, lower-level radioactive material, and any material that will affect the ecosystem because of the amount in which it is dumped.

The white list includes all other materials not mentioned on the other two lists. The white list was developed to ensure that materials on this list are safe and will not be dumped on vulnerable areas such as coral reefs.

In 1995, a Global Waste Survey and the National Waste Management Profiles inventoried waste dumped worldwide to determine what countries were dumping waste and how much was going into the ocean. Countries that exceeded an acceptable level would then be assisted in the development of a workable plan to dispose of their waste.

The impact of a global ban on ocean dumping of industrial waste was determined in the Global Waste Survey Final Report the same year. In addition to giving the impact for every nation, the report also concluded that the unregulated disposal of waste, pollution of water, and buildup of materials in the ocean were serious problems for a multitude of countries. The report also concluded that dumping industrial waste anywhere in the ocean is like dumping it anywhere on land. The dumping of industrial waste had reached unacceptable levels in some regions, particularly in developing countries that lacked the resources to dispose of their waste properly.

The ocean is the basin that catches almost all the water in the world. Eventually, water evaporates from the ocean, leaves the salt behind, and becomes rainfall over land. Water from melted snow ends up in rivers, which flow through estuaries and meet the saltwater. River deltas and canyons that cut into the continental shelf – like the Hudson Canyon and the Mississippi Cone – create natural channels and funnels that direct concentrated waste into relatively small geographic areas, where it accumulates into highly concentrated deposits of fertilizers, pesticides, oil, human and animal wastes, industrial chemicals and radioactive materials. For instance, feedlots in the United States produce more than 500 million tons of manure each year – more than the amount of human waste – and about half of it eventually reaches the ocean basin.

Not only does the waste flow into the ocean, but it also encourages algal blooms that clog the waterways, causing meadows of seagrass, kelp beds and entire ecosystems to die. A zone without any remaining life is referred to as a dead zone, and dead zones can be the size of entire states, as in the coastal zones of Texas and Louisiana and north-east of Puerto Rico and the Turks and Caicos Islands. All major bays and estuaries now have dead zones from pollution run-off. Often, pollutants like mercury, PCBs and pesticides are found in seafood meant for the dinner table and cause birth defects, cancer and neurological problems, especially in infants.

One of the most dangerous forms of dumping is of animal and human bodies. The decomposition of these bodies creates a natural breeding ground for bacteria and micro-organisms that are known to mutate into more aggressive and deadly forms with particular toxicity to the animals or humans they fed on. Off the mid-Atlantic coast of the United States was a common dumping zone for animals – particularly horses – and human bodies up until the early 1900s. Today, the most common area for human body dumping is India, where religious beliefs advocate burial in water. The results of this dumping may be seen in the rise of extremely drug-resistant strains of leprosy, dengue fever and necrotizing fasciitis bacteria.

One of the largest deep-ocean dead zones is in the area between Bermuda and the Bahamas. This area was a rich and productive fishing ground in the 1700s and early 1800s, but by the early 20th century it was no longer productive, and by the mid-1900s it was virtually lifeless below 200 feet of depth. This loss of life seems to have coincided with massive ocean dumping along the New Jersey and Carolina coasts.

Recreation

Water recreation is another aspect of human life compromised by marine pollution from human activities like roads, shopping areas, and development in general. Swimming is becoming unsafe, as over 12,000 beaches in the United States have been quarantined due to contamination from pollutants. Developed areas like parking lots allow runoff to occur at a much higher volume than a naturally absorbent field. Even everyday activities like driving a car or heating a house collectively leak an estimated 28 million gallons of oil into lakes, streams and rivers. The hunt for petroleum through offshore gas and oil drilling leaks extremely dangerous toxins into the ocean and, luckily, is one aspect of pollution that has been halted by environmental laws.

Environmental Laws

In addition to the lack of underwater national parks, there is no universal law like the Clean Air Act or the Clean Water Act to protect the United States’ ocean territory. Instead, there are many different laws, like the Magnuson-Stevens Fishery Conservation and Management Act, which apply only to certain aspects of overfishing and are relatively ineffective. That act, developed in the 1970s, is not based on scientific findings and is administered instead by regional fisheries councils. In 2000, the Oceans Act was implemented as a way to create a policy similar to the nationwide laws protecting natural resources on land. However, this act still needs further development and, like many of the conservation laws that exist at this time, it needs to be enforced.

The total effects of ocean dumping will not be known for years, but most scientists agree that, as with global warming, we have passed the tipping point and the worst is yet to come.

Perpetual Motion = Unlimited Power….Sort of…

The serious pursuit of perpetual motion has always intrigued me. Of course I know the basic science of conservation of energy and the complexities of friction, resistance, drag and less-than-100% mechanical advantage that doom any pursuit of perpetual motion to failure…but still, I am fascinated at how close some attempts have come. One college professor built a four-foot-tall Ferris wheel and enclosed its drive mechanism in a box around the hub. He said it was not perpetual motion but that it had no input from any external energy source. It did, however, make a slight sound from inside that box. The students were to figure out how the wheel was turning without any apparent outside power source. It turned without stopping for more than two years and none of his students could figure out how. At the end of his third year, he revealed his mechanism. He was using a rolling-marble design that was common among perpetual motion machines but that had also been proven not to work. What he added was a tiny IC-powered microcircuit feeding a motor that came out of a watch. A watch! The entire four-foot-high Ferris wheel needed only the additional torque of a watch motor to keep it running for years.

This got me to thinking that if I could find a way to make up that tiny little additional energy input, I could indeed make perpetual motion. Unlike most of my other ideas, this was not something that could easily be simulated in a computer model first. Most of what does not work in perpetual motion is totally unknown until you build it. I also knew that the exchange of energy to and from mechanical motion was too inefficient to ever work so I concentrated on other forms of energy exchange. Then I realized I had already solved this – back in 1963!

Back in 1963, I was a senior in high school. Since 1958, I had been active in science fairs and wanted my last one to be the best. To make a long story short, I won the national science fair that year – sponsored by Bell Telephone. My project was “How Far Will Sound Travel?” and it showed that the accepted theory that sound diminishes by one over the square of the distance (the inverse square law) is, in fact, wrong. Although that may occur in an absolutely perfect environment – a point source of emission in a perfectly spherical and perfectly homogeneous atmosphere – it never occurs in the real world.
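
For reference, the textbook inverse-square falloff that the project was testing against looks like this; the reference level at one meter is an arbitrary assumption for illustration:

```python
import math

def spl_at_distance_db(spl_ref_db, ref_dist_m, dist_m):
    """Ideal point-source falloff: intensity ~ 1/r^2, i.e. -6 dB per doubling of distance."""
    return spl_ref_db - 20.0 * math.log10(dist_m / ref_dist_m)

SPL_AT_1M = 94.0   # dB SPL at 1 m, an arbitrary assumed reference level

for d_m in (1, 2, 4, 8, 16, 32):
    print(f"{d_m:>3} m: {spl_at_distance_db(SPL_AT_1M, 1.0, d_m):5.1f} dB SPL (ideal point source)")
```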

I used a binary-counting flashing-light circuit to time sound travel and a “shotgun” microphone with a VOX to trigger a measurement of the speed and power of the sound under hundreds of conditions. This gave me the ability to measure to 1/1000th of a second and down to levels that could distinguish between the compressions and rarefactions of individual sound waves. Bell was impressed and I got a free trip to the World’s Fair in 1964 and to Bell Labs in Murray Hill, NJ.

As a side project of my experiments, I attempted to design a sound laser – a narrow beam of sound that would travel great distances. I did. It was a closed, ten-foot-long, Teflon-lined tube that contained a compressed gas – I used Freon. A transducer (a flat speaker) at one end would inject a single wavelength of a high-frequency sound into the tube. It would travel to the other end and back. At exactly 0.017621145 seconds, the transducer would pulse one more cycle, timed to coincide exactly with the moment the first pulse reflected and returned to it. Because the pulses added, the first pulse nearly doubled in amplitude. Since the inside of the tube was smooth and kept at a constant temperature, the losses in one pass through the tube were almost zero. In less than 5 minutes, these reinforcing waves would build the moving pulse to the point where nearly all of the gas in the tube was concentrated in the wave front of a single pulse. This creates all kinds of problems, so I estimated that it would only be about 75% efficient, but that was still a lot.
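
The quoted re-trigger interval follows directly from the round-trip time in the tube; here is a minimal check, with the speed of sound simply back-calculated from the figures in this paragraph rather than taken from any gas table:

```python
TUBE_LENGTH_M = 10 * 0.3048        # the 10-foot tube, in metres
PULSE_INTERVAL_S = 0.017621145     # the quoted re-trigger interval

# One interval covers a full round trip, so the implied speed of sound is:
implied_sound_speed = 2 * TUBE_LENGTH_M / PULSE_INTERVAL_S
print(f"Implied speed of sound in the tube: {implied_sound_speed:.1f} m/s")

# And going the other way, the re-trigger interval for any assumed sound speed:
def retrigger_interval_s(tube_length_m, sound_speed_ms):
    return 2 * tube_length_m / sound_speed_ms

print(f"Interval at 346 m/s: {retrigger_interval_s(TUBE_LENGTH_M, 346.0) * 1000:.3f} ms")
```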

Using a specially shaped and designed series of chambers at the end opposite the transducer, I could rapidly open that end and emit the pulse in one powerful burst – so strong that the wave front of the sound pulse would be visible, and it would remain cohesive for hundreds of feet. It was dense enough that I computed it would have just over 5 million pascals (Pa) of pressure, or about 750 PSI. The beam would widen to a square foot at about 97 meters from the tube. This is a force sufficient to knock down a brick wall.

One way to make the kind of transducer I needed for this sound laser was to use a carefully cut crystal or ceramic disc. Using the reverse piezoelectric effect, the disc will uniformly expand when an electric field is applied. A lead zirconate titanate crystal would give me the right expansion while also being able to respond at the high frequency. The exit chambers were modeled after parabolic chambers used in specially made microphones for recording bird sounds. The whole thing was perfectly logical, and I modeled it in a number of math equations that I worked out on my “slip stick” (slide rule).

When I got to Bell Labs, I was able to get one scientist to look at my design, and he was very intrigued by it. He said he had not seen anything like it but found no reason it would not work. I was asked back the next day to see two other guys who wanted to hear more about it. It was sort of fun and a huge ego boost for me to be talking to these guys about my ideas. In the end, they encouraged me to keep thinking and said they would welcome me to work there when I was old enough.

I did keep thinking about it and eventually figured out that if I could improve the speed of response of the sensors and transducer, I could shorten the tube to inches. I also wanted more power out of it, so I researched which gas has the greatest density. Even that was not enough power or speed, so I imagined using a liquid – water – but it turns out that water molecules act like foam rubber and, after a certain point, absorb too much of the pulses and energy. The next logical phase of matter was a solid, but that meant there was nothing that could be emitted. I was stumped…for a while.

In the late 1970s I figured, what if I extended the piezoelectric transducer crystal to the entire length of the tube – no air, just crystal? Then place a second transducer at one end to pulse the crystal tube with a sound wave. As the wave travels the length of the crystal tube, the compressions and rarefactions of the sound pulse create stress and strain on the piezoelectric crystal, making it give off electricity by the direct piezoelectric effect. This is how a phonograph needle works as it bounces along the grooves of a record.

Since the sound pulse will reflect off the end of the tube and bounce back, it will create this direct piezoelectric effect hundreds of times – perhaps thousands of times – before it is dissipated as heat. As with my sound laser, I designed it to pulse on every single bounce to magnify the amplitude of the initial wave front, but now the wave speed was above 15,000 feet per second, so the pulses had to come every 0.0001333 seconds. That is fast, and I did not know whether the technology of the day was up to the task. I also did not know what it would do to the crystal. I was involved in other work and mostly forgot about it for a long time.
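
The same round-trip arithmetic applies to the solid rod; working backwards from the figures above (the rod length is implied by those numbers rather than stated):

```python
SOUND_SPEED_FPS = 15_000       # ft/s in the crystal, from the text
PULSE_INTERVAL_S = 0.0001333   # the quoted pulse interval

# Distance covered in one interval is a full round trip, so:
round_trip_ft = SOUND_SPEED_FPS * PULSE_INTERVAL_S
print(f"Round trip per pulse: {round_trip_ft:.2f} ft -> implied rod length ~{round_trip_ft / 2:.1f} ft")
```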

In the late 1980s I was working for DARPA and had access to some great lab equipment and computers. I dug out my old notes and began working on it again. This time I had the chance to actually model it and run experiments in the lab. My first surprise was that these direct piezoelectric effects created voltages in the hundreds or even thousands of volts. I was able to get more than 10,000 volts from a relatively small crystal (8 inches long and 2 inches in diameter) using a hammer tap. I never thought it would create that much of a charge. If you doubt this, just take a look at the Mechanism section of the Wikipedia article on piezoelectricity.

When I created a simple prototype version of my sound laser using a tube of direct piezoelectric crystal, I could draw off a rapid series of pulses of more than 900 volts using a 1/16-watt amplifier feeding the transducer. Using rectifiers and large capacitors, I was able to store this energy and charge some NiCads, power a small transmitter and even light a bulb.

This was of great interest to my bosses and they immediately wanted to apply it to war fighting. A friend of mine and I cooked up the idea of putting these crystals into the heels of army boots so that the pressure of walking created electricity to power some low-power devices on the soldier. This worked great, but the wires, converter boxes, batteries, etc., ended up being too much to carry for the amount of power gained, so it was dropped. I got into other projects and dropped it as well.

Now flash forward to about 18 months ago and my renewed interest in perpetual motion. I dug out my old notes, computer models and prototype from my DARPA days. I updated the circuitry with some newer, faster IC circuits and improved the sensor and power take-off tabs. When I turned it on, I got sparks immediately. I then rebuilt the power control circuit and lowered the amplitude of the input sound into the transducer. I was now down to using only a 9-volt battery and about 30 mA of current drain to feed the amplifier. I estimate it is about a 1/40th-watt amplifier. The recovered power was used to charge a NiMH bank of 45 penlight cells of 1.2 volts each.
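
To keep the electrical bookkeeping straight, here is the supply-side arithmetic from the figures in this paragraph; the numbers are taken as stated and nothing else is assumed:

```python
# Supply-side arithmetic for the driver stage, using the figures above.
SUPPLY_VOLTS = 9.0        # 9 V battery
SUPPLY_AMPS = 0.030       # ~30 mA drain
AMP_RATING_W = 1 / 40     # the ~1/40 W amplifier rating quoted above

supply_power_w = SUPPLY_VOLTS * SUPPLY_AMPS
bank_volts = 45 * 1.2     # 45 NiMH cells at 1.2 V nominal

print(f"Power drawn from the 9 V battery: {supply_power_w:.2f} W")
print(f"Rated amplifier output:           {AMP_RATING_W:.3f} W")
print(f"Nominal charge-bank voltage:      {bank_volts:.1f} V")
```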

Then came my epiphany – why not feed the amplifier with the charging battery! DUH!

I did, and it worked. I then boosted the amplifier’s amplitude, redesigned the power take-off circuit and fed it into a battery bank arranged to give me a higher power density. It worked great. I then fed the battery back into an inverter to give me AC. The whole thing is about the size of a large briefcase and weighs about 30 pounds – mostly from the batteries and transformers. I am getting about 75 watts out of the system now, but I’m using a relatively small crystal. I don’t have the milling tools to make a larger, properly cut crystal, but my modeling says that I can get about 500 watts out of a crystal about 3 inches in diameter and about 12 inches long.

I call my device “rock power” and when I am not using it for power in my shop or on camping trips, I leave it hooked up to a 60 watt bulb. That bulb has been burning now for almost 7 months with no signs of it diminishing. It works! Try it!!!

The Down Side to Lucid Dreams

Some of you may have read my other stories about my experiences with Lucid Dreaming. See LUCID DREAMS and THE POWER OF THE MIND. Now I am going to tell you there is a down side to doing that.

It started when I noticed that I was constantly playing music in my head. Everybody does that but this was different. It was like the background music in a movie. I could “think” this music in my head even while I was actively thinking and even talking about something totally unrelated to the music. Like the music in the movies, I was not always aware that this background music was there but if I had a lull in other thoughts, I would immediately become aware of it.

It was my subconscious mind playing this music, and my conscious mind was hearing it while it was busy with other thoughts. What was worse, I could not stop it easily. I would think to myself – NO MORE MUSIC – over and over again, and after several minutes it would stop…only to start again 10, 20 or 60 minutes later.

This sounds silly, but my subconscious mind seems to have a mind of its own. Yeah, I know that is crazy, but why else would I not be able to control it? In my lucid dreams I have complete control and even instruct my subconscious mind not to do this any more, but it doesn’t help much. Outside of those moments when I am expressly trying to control my subconscious mind, it seems to be thinking almost independently of my conscious mind. I say “almost” because it has begun a new “background activity”.

I am very well aware of both the jokes and the reality of hearing “voices in your head”. These are just a few I found on a bumper sticker site: “You’re just jealous because the voices are talking to me”; “The voices in my head are stealing my sanity”; “I can’t go to work today – the voices in my head said stay home and clean the guns”. But this is no joke. I really do have voices in my head that I don’t seem to have full control over.

In my story THE POWER OF THE MIND, I described over a decade of work with my lucid dreaming and my interactions with my subconscious mind. I have been able to take that to some very remarkable levels, including being able to invade other people’s thoughts and dreams and to extend my remote viewing to some amazing levels. Well, now it seems I have a back-seat driver for these events. My subconscious mind seems to be working at making these contacts and invasions even during the day, when I am otherwise engaged in other activities. It is really annoying.

The other day I visited a friend; I’ll call her Jane. She had company and I was introduced to “Terry”. As I was introduced, I heard this weak voice in my mind saying she was a smoker, a bad driver and a heavy drinker. I was shocked by these comments and could not imagine where they came from, since she looked and talked perfectly normally, was well dressed and certainly appeared sober. There was nothing to indicate these awful things about a woman I had just met for the first time.

In my mind, I was literally having an argument between my subconscious mind telling me awful things about Terry and my conscious mind shouting that all of it was nonsense. Meanwhile, I was also having a conversation with Jane and Terry and sitting down for some coffee.

I can’t tell you how distracting these mind games were while I was trying to smile and act cordial. I had to work at not saying some of my responses to my subconscious mind out loud. Just the fact that this was happening at all was annoying and very disconcerting, but it also re-framed the entire visit from a pleasant exchange with a friend into a mental brawl and shouting match. Jane had to ask me several questions twice before I responded because I was so distracted.

I finally had to excuse myself, but as I did, so did Terry. As Terry stood up, Jane rushed to help her. From the way Jane was trying to hold her up, I thought Terry might be disabled or injured, but she was in her mid-50s and seemed quite capable. While Terry was looking for her purse, I wrinkled my brow and shrugged my shoulders at Jane as if to say, what is going on? Without Terry seeing her, Jane curved her hand as if holding a glass and raised it to her face while rolling her head back – the obvious sign that Terry had been drinking. It was only then that I noticed a large empty wineglass next to where Terry had been sitting.

Jane and I helped Terry out to her car, and she was definitely not able to drive safely. Jane repeatedly said she would drive Terry home – it was just a few blocks. After some effort, we got Terry to agree, and I followed them to Terry’s house and then picked up Jane and drove her back home. Just as I was backing out of Terry’s driveway, I noticed deep tire marks on the lawn going right up to the front steps. The first few steps were broken or missing. I commented to Jane that somebody had missed the driveway. Jane said that happened when Terry was driving home drunk one night and dropped a cigarette into her lap.

I am still annoyed by this running commentary on my conscious world and by the continuous background music, but I am learning to live with it. It has not yet told me to go home and clean the guns, and I am not hearing messages from God. What I am hearing is a sort of news flash or intelligence report from my subconscious mind about matters that I am not immediately aware of and that have, so far, all proved to be correct. I can live with that.