Tuesday, January 31, 2012

Forbidden Colors!

Not actually "forbidden" in any real sense of the word, the color mixtures of red-green and blue-yellow are not easily visible to the human eye. That's because the signals produced by these pairs of colors cancel each other out when the colors reach the eye at the same time.

It's especially true with the red-green border -- blue and yellow can mix to show green, and red and blue can mix to show purple, but red and green stay distinct in most cases. Apparently, the neurons which carry the "This is red" signal to the brain are shut down by light in the green wavelength of the spectrum. The same thing happens between neurons which register light in the blue and yellow wavelengths: The neurons that carry one signal cancel out the neurons that carry the other one.

A 1983 experiment showed a way the forbidden colors could be perceived: stripes of the two opposing colors were put next to each other and subjects stared at them for a while. A special "eye tracker" held the striped surface stable relative to the eye, moving it whenever the eyes themselves moved. The subjects' eyes thus didn't shift back and forth between red and green but instead stayed focused on the border. After some time, the border seemed to disappear and the viewers saw a reddish green and a bluish yellow -- the colors previously "forbidden" by the eye's structure.

Later experiments cast some shadows on the earlier ones, but the scientists working from the earlier set of experiments point out that their version used the eye tracker but the later ones didn't. One of them thinks that when the image is stabilized in front of the retina the opponent neurons no longer cancel each other out and the brain can process the dual signals in a way it ordinarily can't.

Which of course won't help you if you claim that your eyes caught the traffic signal as it changed and you couldn't distinguish between red and green, or even if the light was yellow, sir. Especially if the officers got SCMODS.

Monday, January 30, 2012

Brief Moment of Sanity

A rare instance of firing neurons at a major television network prevented the airing of a Fear Factor episode that involved contestants drinking donkey semen, followed by donkey urine.

Fear Factor, for those of my readers who haven't bothered to care about it, is a show in which contestants perform nerve-wracking stunts and consume gross things in order to become full members of the fraternity...wait, no, that was the Omegas in Animal House. On Fear Factor, the person who does all of these things the fastest or something wins some money.

I have always said that were I for some reason to be a contestant on Fear Factor, my goal would be to get through the first stunt round and then eat the gross stuff so I could puke all over the über-obnoxious Joe Rogan.

Sunday, January 29, 2012

Fear of an Oozing Planet

As writer Ian O'Neill notes here, some of the planets astronomers have been finding in other solar systems are weird.

Among the weird worlds is 55 Cancri e, a planet about 45 light years from Earth orbiting the star 55 Cancri A. It's roughly the size of Neptune and it orbits its star every 18 hours because it's 26 times as close to its star as Mercury is to our sun.

So you'd think it would be a hot ball of half-melted rock, but that's only part of the story. Immense pressures beneath the surface of the planet mean that liquids which would ordinarily boil away instead persist in a dense state called a "supercritical fluid." Under great pressure, liquids don't boil at their usual temperatures. Supercritical carbon dioxide, O'Neill notes, is used to decaffeinate coffee beans.

This means that the supercritical fluids ooze to the surface of the planet through cracks in the overheated rocky surface and there flash into vapor to make an atmosphere we can actually see from 45 light years away. In other words, this thing is a Neptune-sized super-sauna. It is doubtful that any life exists under such extreme conditions, but scientists have theorized that if it did, it would have very open pores.
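The "supercritical" threshold works like a simple gate: past a critical temperature and pressure, a substance is neither liquid nor gas. Here's a toy sketch in Python -- the critical-point values for carbon dioxide are assumptions pulled from standard reference tables, not from O'Neill's article:

```python
# Critical point of CO2, from standard reference tables (assumed values).
CO2_CRITICAL_T = 304.13   # kelvin, about 31 C
CO2_CRITICAL_P = 7.38e6   # pascals, about 73 atmospheres

def co2_is_supercritical(temp_k: float, pressure_pa: float) -> bool:
    """Above both critical values, CO2 is neither liquid nor gas."""
    return temp_k > CO2_CRITICAL_T and pressure_pa > CO2_CRITICAL_P

print(co2_is_supercritical(298.0, 1.0e5))   # room conditions: False
print(co2_is_supercritical(343.0, 2.0e7))   # decaf-plant-like conditions: True
```

The point of the toy: pressure alone or heat alone won't do it; a planet like 55 Cancri e supplies both at once.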

Saturday, January 28, 2012

Prescience Fiction?

The young Friar avidly consumed science fiction novels, but being young he leaned towards some more action-oriented stuff. The measured, cerebral work of Lithuanian-born Algis Budrys didn't hold his attention, and he returned Budrys' 1977 media-centered tale Michaelmas to the library with only a chapter or two read.

Today's Friar, thanks to a trip to the superb little store Aladdin Book Shoppe, gave Mr. Budrys a second try and found him to have had more than a little of the prophet in him when he created the story of the famous independent investigative reporter Laurent Michaelmas, pursuing an amazing story of an astronaut's resurrection in 1999.

Michaelmas also happens to more or less rule the world through the secret artificial intelligence he calls Domino, a computer program able to spy out things in almost every corner of the world, eavesdrop on almost any conversation and control what other computers do. Using Domino's abilities and influence, he has calmed most world conflicts and brought countries to work together through the United Nations. On the eve of the millennium, the greatest achievement of his combined space agency is preparing for its launch of a manned mission to the outer planets of the solar system.

But the miraculous return of the American mission commander, presumed dead in a training accident, could restart old rivalries. The Soviet Union's astronaut was elevated to command of the mission when the American died and the Soviets are unlikely to quietly accept his demotion. But a nationalist group within the U.S. may have evidence that the accident which injured the American astronaut wasn't an accident. Michaelmas must use every advantage his media celebrity and Domino can give him to find out who is behind these developments before the world resets to its Cold War footing. He also has to see where Clementine Gervaise, a video producer who strongly resembles his late wife, may fit into the situation as well as his own personal life.

Budrys does a very good job of predicting some media developments which played out in real life, such as the kind of information glut brought about by the internet and the meaningless nature of a good deal of modern news, entertainment-based and otherwise. Michaelmas and other independent reporters file their stories via personal recording/transmitting units that also gather up and play back information from other sources -- not unlike the role laptop computers and personal data pads play today. Politically, shady Gulf state oil barons fund unrest in the Middle East similarly to the way they still do today. Budrys has some misses -- having the Soviet Union still around seven or eight years after it fell, for example, and his imagined technology of 1999 manages both to undershoot and to overshoot the real thing. Unless there really is a secret artificial intelligence named Domino hanging around in the electronosphere, in which case, howdy!

Much of the novel is made up of conversations between Michaelmas and Domino as they try to puzzle out what happened with the astronaut's return. They frequently wax philosophical, and Michaelmas' own thoughts about the world he more or less helped to make are sometimes rueful. But Budrys' rich prose and low exposition quotient reduce the boredom level of such passages considerably. Michaelmas is a short 183 pages packed with fascinating retro-speculation, food for thought and an intriguing premise. The young Friar might not have been able to stick with it when it was published, but his older counterpart found it worth the time.

Friday, January 27, 2012

Not Exactly? Exactly!

Talking about a book turned into a post better for the long-post blog, here.

Wednesday, January 25, 2012

A Mighty Long Time

If you were asked to name the earliest-serving United States president who still has living grandchildren, whom would you pick?

Given that it's grandchildren and not children, you might figure you could safely go back into the 1890s, maybe even the 1880s. Garfield? Cleveland? Hayes? Well, it seems that would be shooting too late in the game -- by almost forty years. Lyon Gardiner Tyler, Jr. (87), and Harrison Ruffin Tyler (83), can look at a picture of our tenth president, John Tyler, and say, "That's my grandpa." Tyler, born in 1790, served as president from 1841 to 1845. He died in 1862, but not before becoming a father (for the 13th of 15 times) at 63 to Lyon Gardiner Tyler, Sr., in 1853. The elder Lyon welcomed his namesake in 1925, when he was 71, and his younger brother Harrison in 1929. So although Pres. Tyler is their grandfather, they obviously never met him.

Tyler was the first president to take the office when it became vacant. His predecessor, William Henry Harrison, was the man who wouldn't wear a coat to his cold and damp inauguration, caught pneumonia, and died a month after taking the oath of office. Tyler set the precedent that a vice-president taking the presidential office became president in his or her own right, rather than serving as an acting president in between elections. The 25th Amendment to the U.S. Constitution established that precedent as law when it was passed in 1967.

The story notes that the oldest living presidential grandchild is Jane Garfield, granddaughter of James Garfield. She's 99. The oldest living presidential child is John Eisenhower, 89, son of Dwight Eisenhower.

For those who still have a little room for the trivial in their heads, the Tyler family has a long history with the College of William and Mary, stretching back to John Tyler -- the grandfather of the president -- who attended it in 1704, 11 years after it was founded.

Tuesday, January 24, 2012


I have heard people say that the failure of the United States Senate to pass a budget in the last thousand days is a failure of leadership on the part of Senate Majority Leader Harry Reid (D-Nev).

Nonsense. Senator Reid, being a longtime Washington, D.C., fixture, knows that government budgets are meaningless. Spending can be shifted from one budgetary year to the next by simply moving it ahead one day and thus charging it to next year's budget. It can be moved "off budget" completely and not counted against a year's expenditures. Senator Reid is simply eliminating the middleman of pretending the government has a plan on how it spends money and going straight to the spending because he knows any claims that the government will spend money the way it says it will spend it are a sham.

And Senator Reid knows shams, having used the word "leader" in his job titles for the last seven years -- first as Senate Minority Leader from 2005-2007 and as Senate Majority Leader beginning in 2007 -- while never actually displaying any leadership.

Get Back, Get Back...

The young Friar read a lot of science fiction, and among those offerings were Robert Heinlein's "juveniles" written for Charles Scribner's Sons. Those stories and a number of Heinlein's short stories in his "future history" series were all set inside our solar system. They dealt with what were at the time science fiction ideas of space stations and trips to the moon, things that later became reality. They also dealt with ideas that have yet to come to pass, such as moonbases and manned travel to other planets in the solar system, as well as ideas which have been proven wrong or unlikely by later exploration, such as life on Venus or advanced civilizations on Mars.

Back to the Moon, by NASA scientists Travis S. Taylor and Les Johnson, brings to mind some of those old Heinlein juveniles, following in that author's path of accurate scientific descriptions and real-world feel of the technology and situations. Heinlein, in writing for younger readers, didn't varnish his style a great deal, nor did he spend a lot of time adding depth to his characters. Neither do Taylor and Johnson -- the lead character is a stalwart astronaut named Bill Stetson, fer cryin' out loud -- and they don't display half of Heinlein's style and skill even though they're not writing for a younger crowd.

But those things aside, Back to the Moon is still a fun romp, a just-the-facts-ma'am kind of story about events surrounding the United States' first manned mission to the moon since Apollo 17 left in 1972. The story seems set in the early 2020s and relies on the now-canceled Constellation program as the basis for the U.S. effort. The manned mission is only months off when a private company also launches a flight to the moon, although this one is just a flyby carrying wealthy tourists. The tourists, though, catch a distress signal from a wrecked Chinese moon mission. What had been announced as a robotic test flight had actually carried a crew and is now stranded on the moon's surface. Stetson convinces his NASA superiors to scramble his planned flight for an immediate launch to rescue the stranded Chinese crew. But will the glitches shown in test flights mean his ship can't reach the moon? And will the Chinese crew, facing political pressure from a system that would rather have a failure on its own than success with help, actually go through with the rescue?

Taylor and Johnson move us through the mostly predictable plot with an engineer's straightforward prose -- no frills and not a lot of flavor. The appeal is in watching tried-and-true heroes do tried-and-true heroic things and seeing resourceful quick thinkers solve the problems that come their way quickly and resourcefully. Back is also fun because it uses recognizable and plausible technology instead of way-out stuff like warp drives and hyperspace jumps that are far beyond anything current science can manage.

Of course, a moon landing in the early 2020s is also far beyond anything current NASA technology can manage. In an afterword, Taylor describes how bipartisan presidential and congressional indifference starting with the Nixon administration starved the space agency of funds, requiring it to put off spacecraft development time and time again in order to keep what it had running. That culminated in the current administration's myopic ending of manned U.S. spaceflight, it being one of the very few things that the president and congressional leadership didn't want to spend money on. Both the possible Chinese moon mission and spaceflight by private corporations could happen within Back to the Moon's timeframe, but the idea that there would be a NASA mission waiting in the wings could not.

Taylor and Johnson offer a clue about what they probably think the solution is, as their privately-owned spacecraft and its wealthy owner play important roles at crucial points in the story. Private enterprise and free-market forces may or may not be the actual future of humanity's presence in space, but at least betting on them takes the matter out of the hands of people who ask whether or not additional soldiers on an island might make it capsize.

Monday, January 23, 2012


Wondering: if I stood outside the gate with my thumb out, could I catch a ride?

Probably not, but since NASA's been tasked to do things like investigate Toyota accelerations and other silly non-space stuff, I'd have a better shot that way than some of the actual astronauts would.

Sunday, January 22, 2012

Everything Gets Digital?

This article in Scientific American (it's behind the paywall but you can read a preview at the link) describes an experiment by Craig Hogan, a University of Chicago physicist and director of the Fermilab Center for Particle Astrophysics, who hopes to test out something about the way the universe is put together.

Most standard views of the universe hold that space, objects and random protoplasm like you and me are at their most basic level "smooth" or continuous. But Hogan's experiment will see if that is actually true. If he is right, at the smallest possible scale, we are made up of discrete particles and the universe is actually "fuzzy" -- the same way that a smooth curve on a digital picture becomes a clunky series of squares if you zoom in close enough. These are not atoms, like we may remember from school science courses, but something much, much smaller, on a scale called the "Planck length."

If he is right, we do not actually move smoothly through space, either, but rather we kind of jitter along by occupying first one set of the Planck-sized doodads (real physicists say "quanta" instead of "doodads," by the way) and then actually jumping to another set of them a Planck-distance away. Planck distances are also so small that they can't be detected by any usual measuring instrument, but Hogan's experiment will show certain results if the lasers it uses are affected by this jitteriness.
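Just how small is a Planck length? It falls out of three fundamental constants. A rough sketch -- the constant values below are assumptions from standard physics references, not from Hogan's work:

```python
import math

# Fundamental constants (standard reference values, assumed here).
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s

# Planck length: sqrt(hbar * G / c^3)
planck_length = math.sqrt(hbar * G / c**3)
print(f"Planck length: {planck_length:.2e} m")   # about 1.6e-35 meters

# For scale: a hydrogen atom (~5e-11 m across) is bigger than the
# Planck length by a factor of roughly 3e24.
print(f"{5e-11 / planck_length:.1e}")
```

No wonder no usual measuring instrument can get anywhere near it; Hogan's lasers have to look for its fingerprints indirectly.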

The Holographic Principle is another idea that may go along with the universe's "jittery" nature. It's kind of fuzzy to me its own self, but the upshot is that each of these little Plancks is actually encoded information, like the bits in a computer that store its information. I will be a good boy at this point and deliberately not say "mind of God, anyone?" although I am sore tempted. Not that I believe Hogan's experiment would prove that the universe is somehow contained within God's thoughts. But it points out that folks who suggest that scientific views of the universe crowd God from the picture just aren't conversant with how weird the universe might really be.

I was asked once why I like this weirdo scientific physics stuff, considering that I make a point of describing myself as a pretty orthodox Christian theist. Aren't we the ones that deny evolution and insist the universe is 6,000 years old and stuff? Some of us do. But I personally believe that if I worship a God who is among other things, the source of all truth, then nothing that moves me closer to the truth can move me away from God. So neither side -- atheists or "young-earthers" -- has it right when they insist that the scientifically discovered or theorized view of the universe pushes God aside.

You might say that both views have a flaw -- and if Hogan is right, the flaw is that they are pointing out the mote in their opponents' eyes while ignoring the Planck in their own (Oh, c'mon, you knew I'd do it sooner or later).

Saturday, January 21, 2012

Market Forces

You remember the last time you flew on an airplane and how you thought, "This experience -- removing my shoes, dumping my pocket contents into a plastic bin, being radioactively scanned and/or groped, getting six pretzels as a 'snack' -- would be absolutely perfect if my chair cushion was just a little bit thinner, there was an inch less legroom in front of me and my seatback could recline all the way from 'vertical' to 'still vertical, who are you kidding?'"

Yeah, me neither.

Friday, January 20, 2012

Just a Suggestion...

...to one of the high school basketball coaches in the local tournament this evening. When your game strategy seems to include the idea that time of possession during a basketball game earns you points like riding time does in wrestling, you should perhaps be less willing to disgustedly bawl out one of your players for a defensive lapse. After all, you don't let them play defense that much and they may be a little rusty by the time the other team actually gets the ball back in its hands.

Thursday, January 19, 2012

99 Words But He Uses Just One

Apparently soon after his daughter with wife Beyonce was born, rapper Jay-Z was supposed to have said her presence in the world would make him forswear the use of the starts-with-a-B-rhymes-with-witch word that many of hip-hop's most popular songs use as a synonym for "woman."

This has turned out not to be the case, according to Mr. Z's own representatives. The assertion was made in a poem erroneously attributed to the rapper but actually written by a blogger named Renee Gardner. Those representatives did not comment on the possibility that if Mr. Z were to actually stop using offensive and derogatory words, his next album would be an instrumental.

Wednesday, January 18, 2012


Rosie O'Donnell, in an interview with Piers Morgan, commented on the campaigns for the Republican presidential nomination. The positions and comments of the candidates and their supporters or opponents led her to say "We're a backward nation in many ways."

I was going to post something arguing against her statement, but then it hit me that Rosie O'Donnell was being asked about her political opinions on national television, and I realized I had to agree with her.

Tuesday, January 17, 2012

Look It Up

So in a few minutes from this post, Wikipedia is going to "go dark" to protest the Stop Online Piracy Act (SOPA), which many online folks say will result in government censorship of the internet like China has.

Word is many college professors are expecting an upturn in the amount of legitimate factual information that appears in their students' assignments. On the other hand, you could always go find a couple of students, tell them that Eli Whitney invented a way to make gin from cotton and then take them to a library and watch them wander around aimlessly, unable to use a reference section.

ETA: Or not; as of 7:30 AM Wednesday, Wiki was still pedia-ing.

ETA 2: Ah, just have to hit refresh, apparently, and now we have the blackout screen.

Monday, January 16, 2012

Fame and Glory Imminent!

Well, I guess we can now start betting on which one of the acting pair of Ben Koldyke and Amaury Nolasco becomes the multiple Oscar winner and major league Hollywood royalty and which one winds up on Honey I Shrunk the Kids: The TV Show.

Their show Work It, about two men who cross-dress because the network boss thinks it'd be funny, has been pulled from ABC's lineup. It will be replaced by reruns of Tim Allen's new show, Last Man Standing.

Rumors that Allen engineered the cancellation in order to make sure that Work It did not surpass the monumental two-season run of his Toy Story co-star Tom Hanks' early '80s cross-dressing comedy Bosom Buddies have yet to be addressed by Allen or Hanks.

From the Rental Vault (1982): My Favorite Year

Everybody's got a favorite year of their memory, and for Benjy Stone it was the year Alan Swann guest-starred on the TV show where he gofered, King Kaiser's Comedy Cavalcade. Benjy's an aspiring comedy writer, and Cavalcade is the hottest sketch comedy show on television in the late 1950s. Swann is a swashbuckling movie star with an -- ahem -- somewhat dissolute private life whom Benjy is assigned to squire around in the week before the show airs. His mission: Keep Swann more or less sober and out of trouble long enough to make the broadcast.

Complicating Benjy's mission is his own personal quest to woo the lovely K.C. Downing as well as the threats that mob boss Karl Rojeck has been making against the show for lampooning him in its sketches. Complicating it even more is that Swann wasn't born yesterday, and his ability to elude Benjy to partake of the random debauch far outweighs Benjy's nursemaiding skills.

Although the story is important to Year, the movie is more or less a showcase for Peter O'Toole as Alan Swann. The role earned him one of his eight Academy Award nominations, although he lost to Ben Kingsley's Gandhi (The others nominated were Paul Newman for The Verdict, Jack Lemmon for Missing and Dustin Hoffman for Tootsie. 1982 gave the Academy three of the best performances of many years, and voters bravely chose the fourth best of the year as the winner). Year asks O'Toole to range from more or less straight-up slapstick to anguish to charm to wordless introspection, and he pretty much never hits a false note. Despite his character's well-known lament towards the end of the movie, O'Toole is an actor and a movie star, and one without many peers.

As Benjy Stone, Mark Linn-Baker doesn't stink up the joint. He manages to keep Stone a nicely-wrapped collection of nervous jitters as he rides herd on his movie hero. Although he is not really up to some of the emotional confrontation required of him to set up the movie's great finish, those scenes go pretty quickly and anyway, Linn-Baker would soon have much more to atone for by starring in eight seasons of Perfect Strangers.

Year is littered with great "small roles" that help fine-tune its impact, such as Selma Diamond playing a cigarette-puffing wardrobe mistress, Lainie Kazan as Benjy's mother, Bill Macy as the spineless head writer Sy Benson and Joseph Bologna as comedy legend King Kaiser. Jessica Harper as Benjy's love interest K.C. is mercifully outside the main line of the plot and thus not onscreen for any great length of time.

Executive producer Mel Brooks helped shape the movie based on his years as a comedy writer for Sid Caesar's Your Show of Shows, and Dennis Palumbo used the idea of Errol Flynn's appearance on a Show of Shows episode as the basis for his script, although Flynn's guest-star turn was actually uneventful. Year was director Richard Benjamin's first movie behind the camera and it benefits from his long experience as a comedic actor. But the real draw as well as the real center on which the movie turns is O'Toole, in one of the best of a slew of great performances in his distinguished career. Although as he himself noted in wavering about accepting an Academy Honorary Award in recognition of his body of work, he's still "in the game" and so there's no telling what he might produce between this stage and the one to come.

Sunday, January 15, 2012

Time...Is on My Side

Were I still a deadline-working fellow, I'd be waaay behind to pick up this April 2011 profile of David Eagleman, a neuroscientist at the Baylor College of Medicine. But since Eagleman's work focuses on how our brains perceive time, I think I could get myself off the hook.

You should read the whole thing, but there were some really neat pull-out nuggets in the story. One pointed out how our brains work to match up our sense of sight with our sense of hearing. As you may remember from school, light travels much faster than sound does. This is why you may see lightning in a thunderstorm and not hear the thunder until several seconds later.

But closer than a certain distance, we seem to see something and hear it happen more or less simultaneously. We might think that was because the distance was so small that even the slower sound waves traveled too fast to perceive the gap, but some of Eagleman's experiments showed that the human ear can distinguish gaps between sounds as small as five milliseconds. So even at short distances, the gap between sight and sound is enough that some people might notice it. Except, Eagleman found, the brain's audio processor works many times faster than its visual processor, meaning that even though the light gets a head start on its way from the eyes to the brain, the sound makes up for it by running a faster race. Only when the distance is great enough to overcome that speed difference do we start hearing things after we see them.
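A back-of-the-envelope sketch of that sight-versus-sound gap, assuming the textbook speed of sound (about 343 m/s in room-temperature air -- my figure, not Eagleman's) and treating light's travel time as effectively zero at these distances:

```python
SPEED_OF_SOUND = 343.0    # meters per second, dry air at ~20 C (assumed)
EAR_RESOLUTION = 0.005    # the ~5 ms gap the ear can reportedly distinguish

def sound_lag(distance_m: float) -> float:
    """Seconds by which an event's sound trails its light."""
    return distance_m / SPEED_OF_SOUND

# If the brain processed both senses at the same speed, the lag would be
# noticeable past less than two meters:
print(f"{SPEED_OF_SOUND * EAR_RESOLUTION:.1f} m")   # about 1.7 m
# Across a street, the raw lag is already many times the ear's resolution:
print(f"{sound_lag(20) * 1000:.0f} ms at 20 m")
# A lightning strike 1.7 km away arrives about five seconds late:
print(f"{sound_lag(1700):.1f} s at 1.7 km")
```

That first number is the striking one: without the brain's faster audio pipeline papering over the difference, anything beyond arm's length would look out of sync with its own soundtrack.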

In fact, because our brains do take a definite time to process sensory inputs, we actually are a little bit behind the world around us. Not much -- we're still talking in fractions of a second here -- but a time discernible to sensitive instruments. So all of us, even the hippest hipster that ever hipped, are behind the times.

One other thing that caught my eye was an experiment in which a person was shown a series of pictures of the same object several times in a row, but every now and again a different picture was inserted into the series. Test subjects almost always said the picture of the different item was on the screen longer than the others, even though they were all on for the same amount of time. Eagleman says it may be because our brains develop a kind of short-hand processing for things that we're familiar with, and that means we don't pay attention to them for as long as we do to something new and different. There's a lesson in that somewhere. Probably a bunch of them.

Friday, January 13, 2012

Second Leap

I'm telling you this now so you'll be ready on June 30th. On that day, we will add a second to the official time as it is kept at the International Earth Rotation and Reference Systems Service (IERS) in Paris, France.

The IERS uses an atomic clock based on the rate of vibration of a cesium atom. That atom shimmies 9,192,631,770 times a second -- a bit more than 9.1 billion -- and so that atomic clock keeps precise time that never changes. Our dumb ol' Earth, on the other hand, isn't nearly as careful about how quickly it spins on its axis, and so the amount of time in a day isn't always the same.

Every now and again, the folks at IERS either add or subtract a second from the world's calendars and clocks in order to make the Earth's time match the atomic clock time. They last did it in 2008 and this year will do it between June 30 and July 1. Once your clock hits 11:59:59 on June 30, it will actually take it two seconds to go to 0:00:00 on July 1 instead of one second.
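In UTC terms, the convention is an inserted 61st second -- a 23:59:60 that exists only on leap-second days -- rather than a clock that stalls. A sketch of the minute around the switch, with the cesium figure from the SI definition of the second:

```python
# One SI second = 9,192,631,770 cesium-133 oscillations.
CESIUM_HZ = 9_192_631_770

ordinary = [f"23:59:{s:02d}" for s in (58, 59)] + ["00:00:00"]
leap_day = [f"23:59:{s:02d}" for s in (58, 59, 60)] + ["00:00:00"]

print("any other midnight: ", " -> ".join(ordinary))
print("June 30 (leap second):", " -> ".join(leap_day))
print(f"extra oscillations counted: {CESIUM_HZ:,}")
```

So the leap-second minute really does contain 61 ticks; software that assumes every minute has 60 is the reason these announcements make sysadmins nervous.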

This also marks one of the few times that France manages to tell other countries what to do instead of surrendering.

(ETA: Clock reset thanks to Dustbury)

Thursday, January 12, 2012

From the Rental Vault (1983): Never Say Never Again

If you're a Bond-ophile, you know that the iconic James Bond, Sean Connery, quit the series after the fifth movie, You Only Live Twice, and was replaced by George Lazenby for On Her Majesty's Secret Service. Lazenby's agent argued against accepting the reported seven-film contract offered by the Bond series production company, thus ensuring his client's enshrinement on Trivial Pursuit cards everywhere. The studio chief made it clear that Sean Connery was to be brought back, with money as no object, and so he was offered 1.25 million pounds to do Diamonds Are Forever, after which he reportedly said of playing Bond, "Never again."

"Never" came in 1983, when Connery agreed to do the "non-canon" Bond adventure Never Say Never Again, a title suggested by his wife Micheline in light of his earlier declaration. The movie is a second screen version of Ian Fleming's 1961 novel Thunderball, the first being Connery's 1965 outing as Bond in...Thunderball. 

Never wasn't made by Albert Broccoli and Harry Saltzman's Eon Productions, so it lacked the gun-barrel opening sequence and Monty Norman's iconic "James Bond Theme." Producer Kevin McClory, after a long legal battle with Eon Productions and Ian Fleming, had the rights to film a version of the Thunderball novel based on his claims to have supplied much of its plot. McClory won his fight and tried to make his movie a couple of times before finally succeeding. He didn't have the music and he didn't have the opening sequence, but he did have one thing that the "official" Bond franchise lacked: Sean Frickin' Connery wielding his official license to kill and unofficial license to thrill. Given that the official franchise's 1983 entry was the tired Octopussy, featuring the increasingly tired-looking Roger Moore, that's a major-league head start right there.

Never also bests its competition in the form of its villain, pitting Klaus Maria Brandauer's playful psychopathic Largo against Louis Jourdan's surprisingly small-scale Kamal Khan. Largo is aided by the scenery-chewing Barbara Carrera as Fatima Blush and bossed by the sinister Blofeld, master of SPECTRE. His weakness -- Domino Petachi (played by Kim Basinger), sister of traitorous air force officer Jack Petachi -- proves to be the handle that Bond will use to pry his way into SPECTRE's plan to use nuclear weapons to extort money from NATO. Bernie Casey also uses his brief time to good effect as CIA agent Felix Leiter.

At 52, Connery was actually three years younger than Moore and at that point had aged much better. The Never screenwriters and director Irvin Kershner used his slightly long-in-the-tooth status to good advantage in the movie, highlighting Bond as something of a Cold War dinosaur in an age of supposedly different needs and standards. Service Director M has little use for Bond and gadget-man Q's slashed budget can't offer him much in the way of spyware. But when it counts, the somewhat older and wilier 007 can outsmart any would-be world-dominating megalomaniac and still outfight most of them. The idea that Bond's time has come and gone gives Never a kind of wry tone that Connery carries off well, probably appreciating the fact that he was back playing spies again a dozen years after he said he wouldn't. And although he's 23 years older than Basinger, she was already 30 when the movie was made -- not some 20-year-old coed -- which reduces the creep factor for this movie, at least. That, and the fact that he's Sean Frickin' Connery.

Never isn't flawless -- it's at least a half-hour too long and it's one of the many movies that labors under the delusion that murky underwater fight scenes between stunt doubles are exciting. Basinger is kind of bland, even more so against the colorful backdrop of Connery, Brandauer and Carrera. Bond movies had been trending towards more active heroines, especially in For Your Eyes Only's Melina Havelock, but Basinger was a throwback to the more passive model of the early 1960s.

It was a splash of fun in the Bond series, though, and fun was something that Octopussy and A View to a Kill, Moore's final Bond outing two years later, lacked. It didn't always take itself so completely seriously, making it a nice change of pace as well as a much better Bond finale for Connery than the jokey Diamonds. It was also a welcome chance to see the original "Nobody does it better" guy back in action, one last time. We might wish it wasn't the last time -- even though Connery is 81 and officially retired from movies and Dame Judi Dench has been spectacular as M in the last six Bond outings, it could be a hoot seeing Sir Sean glower and harrumph at his just-this-side-of-rogue agent 007 as he overstepped his authority...again.

MGM, which distributes Eon's Bond movies, bought the distribution rights to Never Say Never Again in 1997, which means it's included in multivolume Bond sets of DVDs and is no longer an orphan.

Wednesday, January 11, 2012

Corporate Criminals!

The 2007 Energy Independence and Security Act requires U.S. oil companies to include substances called cellulosic biofuels in their petroleum production. These are biological fuel substances made from things like corn cobs or wood chips, and the plan was that the increased use of cellulosic biofuels would decrease U.S. dependence on foreign oil and sidestep the problem of farmers selling their corn to be made into fuel rather than food.

According to the law, the amount of this kind of biofuel the companies are required to mix into their gasoline and diesel supplies is pretty small -- the goal for 2011 was 250 million gallons of biofuel mixed into the overall gasoline production of 135 billion gallons, or less than 0.2 percent. The actual required amount was even smaller -- just 6.6 million gallons for 2011 and only 8.65 million gallons for 2012. Or a percentage so small my calculator gives me an error message when I try to divide the 6.6 million by 135 billion in order to compute it.
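For the curious, the arithmetic is easy to sketch -- here's a quick Python calculation using the post's own figures (the percentages are my own division, nothing more):

```python
# Figures from the post: annual U.S. gasoline production and the
# biofuel amounts, all in gallons.
total_gasoline = 135e9
goal_2011 = 250e6       # original 2011 goal
required_2011 = 6.6e6   # actual 2011 requirement
required_2012 = 8.65e6  # actual 2012 requirement

# Each amount as a share of total production
print(f"2011 goal:        {goal_2011 / total_gasoline:.3%}")    # 0.185%
print(f"2011 requirement: {required_2011 / total_gasoline:.4%}")  # 0.0049%
print(f"2012 requirement: {required_2012 / total_gasoline:.4%}")  # 0.0064%
```

No error message required -- just a number with a lot of zeroes in front of it.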

And yet these arrogant oil companies, these corporate pirates, these one-percent big-business environment-destroying doo-doo heads wouldn't even do that little. Just 6.6 million gallons, an amount equal to how much gasoline the oil companies make just about every four and a half hours, and they wouldn't do it. Sure, the Environmental Protection Agency fined them, but it was just a paltry $6.8 million -- barely more than a dollar a gallon for what they didn't make and surely nowhere near what you and I pay at the pump for our gasoline. Is this all the EPA, our watchdog, our protector of the planet, our environmental conscience could do? Couldn't they fine the companies an amount that actually got to them, or stage surprise inspections and threaten shutdowns if the inspectors didn't find any cellulosic biofuels? This paltry little fine?

I'm sorry, what? Cellulosic biofuels don't actually exist yet? You mean that elected officials heard about something that sounded good and decided to tell people they had to do it even though technically there wasn't yet an "it" to do?

I guess that explains the warning letter the captain of the aircraft carrier U.S.S. Enterprise received from the EPA telling him that at least one quarter of the trips made on and off the ship during the next year had to be by transporter.

Tuesday, January 10, 2012

Collegiate Till

Apparently last night's BCS National Championship Game was so bad (Q: Did you hear the LSU team bus is still stuck in the parking lot? A: Yeah, someone painted a 50-yard line across the exit) that even people involved with the system think it will be different next year. Tampa Bay Times writer Michael Kruse, in a piece at Grantland, suggests the awful game will make the creation of four 16-team "superconferences" and an actual playoff system that much closer to reality.

Ordinarily I disagree with people who suggest a playoff system would create a "real" national champion, because the suggested systems would boot the problem of arbitrary team selection from the top two teams down to the top eight or ten. However, Kruse hits on several ideas that would help overcome that problem.

And most importantly, he acknowledges that the superconference idea -- which would sort collegiate football teams into groups of haves and have-nots and leave the have-nots out of the discussion of who got to play for the title -- would pretty much bring to an end the fiction that collegiate athletics is a non-profit operation that shouldn't have to pay taxes. So what, Kruse says? That's almost certain to happen anyway, so why not bite the bullet and set the circumstances under which it does? I think here he overestimates the ability of NCAA officials and college sports folks to accept the reality that Uncle Sam is sniffin' the fine scent of greenbacks on the ol' quad, and Uncle doesn't let you play with his money unless you let him wet his beak a little. I think the superconference move will happen after the NCAA loses some future legal fight to maintain its nonprofit fairy tale.

Kruse also addresses the matter of the have-nots by saying that may wind up being the best thing for them. Too often schools without Division 1 NCAA athletic programs try to make the jump to that level and spend money they don't have in doing so, with dismal results. Snake-oil salesmen in the administration somehow convince university trustees that within five years of becoming a D-1 program, it'll be their school logo on the jerseys surrounding the national trophy and one of their own SAT-challenged "students" flashing his three-quarters of a degree in interpersonal communications in the first round of the draft. At the college where I used to work, the university president's wife always told people she wanted to see us "the best" in everything, including moving from NAIA competition to NCAA. The immense costs involved were one of the many things she didn't understand, and that idea was one of the unfortunately-not-as-many things on which she was ignored.

As Kruse points out, if you say from the get-go that these 64 teams (whichever ones they are) are the only ones who'll have any chance to play for the national title, then you keep schools below that level from spending themselves into the "world lit only by fire" status they'll be forced into when they sell all of two season tickets and have to choose between paying off the loan on the Jumbotron or lights in the library after 8 PM.

In the end, we'll see how it happens, whether Kruse's higher opinion of university athletic officials' intelligence or my pessimism about the same is warranted. But I'm right behind him on the fact that it will happen one way or the other, as all the school colors bleed into one -- deceased presidential green.

Monday, January 9, 2012

Muon It on Over

While it's important to know what goes on inside a volcano, it's pretty darn difficult to get one to come by the doc's office for a checkup. That makes it tough to x-ray the thing, and although the Transportation Security Administration swears its scanner machines don't show your face to the people they let see you naked, volcanoes are apparently having none of that either.

So enter the muons. Literally. Muons are subatomic particles that pass through most things because, at their scale, even what we call solid objects are mostly empty space and a muon only rarely bumps into anything on the way through. They're set free when cosmic rays strike other atoms and break them up -- which is not what Stan Lee and Jack Kirby told us that cosmic rays can do, but the real world sometimes just doesn't recognize a great idea when it sees one.

After being liberated from the oppressive power structure of their former atom (#OccupyNucleus), muons sail along their merry way until they strike another atom. As mentioned above, that doesn't happen very often because of the small size of the particles involved, but there are enough muons movin' on that they can be tracked and the rate of this interruption can be measured. Denser materials stop more muons (Washington, D.C. is pretty much a muon-free zone), so the muon detectors get a kind of a picture of what something looks like by imaging the denser and less dense areas. This is pretty much exactly how an x-ray machine works.
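The density-imaging trick can be sketched with a toy model. To be clear, this is a loose illustration of the idea, not real muon physics -- the attenuation constant and the density figures are invented for the example:

```python
import math

def surviving_fraction(density_g_cm3, path_length_m, k=1e-5):
    """Toy model: muon flux falls off roughly exponentially with the
    amount of matter crossed (density times path length). The constant
    k is made up for illustration, not a measured value."""
    path_length_cm = path_length_m * 100
    return math.exp(-k * density_g_cm3 * path_length_cm)

# Two equal-length sight lines through a volcano's flank: one through
# ordinary rock, one through a (hypothetically denser) magma body.
through_rock = surviving_fraction(density_g_cm3=2.7, path_length_m=500)
through_magma = surviving_fraction(density_g_cm3=2.9, path_length_m=500)

# The denser path stops more muons, so counting survivors along many
# sight lines builds up a density picture, x-ray style.
print(through_rock > through_magma)  # prints True
```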

In volcanoes, the liquid magma is denser than the solid mountain because it is under pressure. Muon detectors can track more or less where the magma is inside the volcano and see if it is close to the surface or if it is flowing near weaker spots in the volcano's rock. If it is, then folks living nearby can be alerted to have their bags packed and fire insurance updated.

Saturday, January 7, 2012

Discriminating Taste?

Actress Huong Hoang of Texas has identified herself as the woman who sued Amazon.com, owner of the Internet Movie Database (IMDb), for publishing her birthdate on her IMDb profile.

Ms. Hoang works under the name Junie Hoang and originally sued the company anonymously, claiming that Hollywood age discrimination meant she would not be considered for some movie roles when it became clear that her actual age is 40. A Seattle judge threw out the suit until she refiled it under her actual name. A quick check of her IMDb profile shows an attractive woman who could pretty easily play a decade or so younger than her age, especially with movie makeup. It also shows that her résumé lists such credits as "Sandy" in Gingerdead Man 3: Saturday Night Cleaver, the third entry in the series telling the story of a cookie possessed by the spirit of an executed murderer. In it, it seems, the murderous pastry time travels to 1976 to slay some disco folks.

This lawsuit would seem to be another example of how our nation has too many folks whose first response is to reach for an attorney, but as it happens I agree that information from IMDb's page for Ms. Hoang could keep her from getting roles in movies. Of course, the problem information is not so much her birthdate as it is the fact that she lists such credits as "Sandy" in Gingerdead Man 3: Saturday Night Cleaver, but I don't know what to advise her about that.

Friday, January 6, 2012


San Diego veterinarian James Paul Czajkowski sold his first manuscript when fantasy mainstay Terry Brooks was one of the judges in a writing contest he won. But an obvious problem loomed, so he wrote under the pen name James Clemens. As Clemens, Czajkowski wrote a fantasy series called The Banned and the Banished.

He adopted another pen name for some standalone techno-thrillers and the SIGMA Force series, and it's under that name -- James Rollins -- that we meet him here with 2010's Altar of Eden. He also wrote the novelization of the poorly-received Indiana Jones and the Kingdom of the Crystal Skull, which he would probably rather you keep to yourself.

In Altar, we meet veterinarian Lorna Polk as she hurries to her special southern Louisiana research facility after its power has gone off in a storm. Just as power is restored and her team sees there is no damage, Dr. Polk is whisked away by a division of the Border Patrol that's investigating a beached boat on the Louisiana coast. They find no survivors except for some strangely changed -- and in some cases deformed -- animals. She helps the team, which is led by Jack Menard, brother of a high school boyfriend whose death she witnessed many years ago and is blamed for by his family except for Jack, determine that one of the deadliest of the altered animals escaped the boat and is headed for the Louisiana swamps. And yes, the relationship between Polk and Menard does to this novel exactly what that dependent clause did to the previous sentence -- makes you scratch your head and wonder why such a complication exists.

The pair discover the animals are part of some kind of experiment that has affected their brains as well as their bodies, and that the secret corporation behind it is very interested in keeping a low profile. So interested, in fact, that its employees are more than ready to murder to ensure it.

Rollins writes a peppy, taut action scene and his veterinary background gives him a good working knowledge of much of the medical science at the root of his thriller. In its three main action set pieces -- the hunt for and confrontation with the escaped animal, fighting off corporate thugs at Polk's lab and storming an island research base -- Altar hums along uncluttered.

The exposition that comes in between the first two set pieces and the concluding one does almost the exact opposite. Clunky, wordy and littered with useless features, it sticks out in the book like a Windows Vista operating system among Mac Snow Leopards. It doesn't help that it also serves to introduce the shadowy CEO megalomaniac at the root of the experiments, who is so much of a stock character he's got to exist on some thriller author keyboard macro somewhere. He is, of course, greedy, ruthless and eeeeevil, but has the "unexpected character trait" of being a religious hypocrite. The mad scientist responsible for the experiments is just about as cookie-cutter, but he does allow Rollins to identify John 1:1 as a quote from Genesis so he -- or perhaps his creator -- is not without comic relief.

If you allow your inner Evelyn Wood to take over during the exposition passage and skim through it to the action and overlook Rollins' attempt to build Dramatic Tension by giving his two leads a Shadowy Shared Past, Altar of Eden is a diverting read and not likely to be the worst book you could buy from the Wal-Mart discount bin.

Thursday, January 5, 2012

Some Similarities?

Yes, technically the sharing of content is an important part of my faith. It is not, however, the primary part of my faith, as it would be if I followed the Missionary Church of Kopimism, an officially recognized religion in Sweden.

Like many Christians, I hold it important to share the gospel message with others, because I want them to understand what I believe to be the truth about the human condition and God's response to that condition. Also, Jesus told us to. The recently-approved church of Kopimism believes in file-sharing as its most important value -- the free sharing of any information between anybody who wants to do so, regardless of copyright laws or other restrictions. So for them, the act of sharing is far more important than the content of what is being shared.

The opposite is supposed to hold true for us -- we think it's most important that we share the central message of God's love for humanity and his desire to heal the relationship with humanity that was broken by human sin, and how that desire was made reality through the life, death and resurrection of Jesus Christ. We certainly don't know everything about that message and we might get it wrong now and again, which is one reason we trust God to do the work our limitations prevent. But we usually try to match our sharing with that message.

Of course, there are some pray-now-send-money-get-rich-quick churches, "Buddy Jesus" messages and hipper-than-thou schools of thought that offer different ideas, but those have histories of fading into the background after a little while anyway.

Tuesday, January 3, 2012

Schrödinger's Rinse Cycle?

This has apparently been around the net for a while, but I just ran across it for the first time and since I've been on a little geek binge I thought it was worth some riffing.

Physicist Brian J. Reardon brought his knowledge of quantum physics to the problem of socks that go missing from the dryer, as well as the mysterious but related problem of other people's socks appearing in a public washer or dryer that was empty when checked.

Professor Reardon suggests that the same forces at work in the arcane corners of quantum mechanics somehow operate at the macro level to produce similar results in socks. You will note he uses a lot of equations, so please take necessary precautions when reading his article, like keeping your index finger on the scroll button of your mouse to enable a quick escape before catatonia sets in. Please, profit from my hard-earned experience.

Anyway, the professor's Quantum Theory of Laundry (QTL) takes advantage of a peculiar property of some subatomic doodads, like photons or electrons. These doodads can appear at one point to be particles, but at other times they act like waves -- this is why I called them doodads, because saying they are "subatomic particles" is not always accurate. Experiments designed to detect photons or electrons if they are particles will measure them and prove they are particles. But experiments designed to detect them if they are waves will measure them and prove they are waves. In other words, there is no way to know whether they are particles or waves until the experiments are done, and the experiments themselves will "cause" them to act like particles or waves.

An Austrian physicist named Erwin Schrödinger described some aspects of this relationship in a very complicated equation and illustrated it in a somewhat gruesome "thought experiment." He imagined a cat in a box that was rigged with a device that would release a poison gas if triggered by a certain kind of radioactive decay. The decay was random, so there was no way to predict whether or not it would go off and trigger the gas. Since the box was opaque, there was no way to know whether or not the cat was alive without opening the lid, meaning that the cat could be alive or could be dead, just like the photon or electron could be a wave or could be a particle. To the observer, the cat was both alive and dead at the same time (or neither alive nor dead) until the lid opened. In the same way, the photon or electron is both a wave and a particle until it is measured and its "wave function" collapses.

Reardon uses the same kind of idea to talk about socks. The washer or dryer is a closed system, and there is no way to know if the sock is inside the machine until the door is opened and the laundry is retrieved. Once the door is opened, the sock function collapses, and it becomes a sock or it disappears. Reardon's theory also allows for the possibility that a sock in the dryer becomes lint, because it can't leave the closed system of the washer or dryer. So someone doing a future load of laundry may open the door and find that the sock function has again collapsed, but this time instead of an existing sock disappearing, a previously lost sock has reappeared. Unless, of course, you clean your lint trap and mess up the closed system.
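Reardon's collapse idea lends itself to a quick and entirely tongue-in-cheek simulation. The possible outcomes and their odds below are invented for the joke, not taken from his paper:

```python
import random

# Until the dryer door opens, each sock sits in a superposition of
# states; opening the door "measures" it and collapses the outcome.
OUTCOMES = ["sock", "lint", "vanished"]
WEIGHTS = [0.90, 0.05, 0.05]  # invented probabilities

def open_dryer(n_socks, rng=random):
    """Collapse the sock function for every sock in the load."""
    return [rng.choices(OUTCOMES, weights=WEIGHTS)[0] for _ in range(n_socks)]

load = open_dryer(12)
print(load.count("sock"), "socks survived,",
      load.count("lint"), "became lint,",
      load.count("vanished"), "simply vanished")
```

Run it enough times and, like the real-world dryer, you will eventually open the door on a load that is one sock short with no explanation whatsoever.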

Although I can't put together the equations that would describe it, I would like to propose a corollary to the Schrödinger-Reardon model that actually brings both of them together. As I said, I don't know the math for it, but I have observed it to be true on many occasions. I suggest that, for every unique load of laundry that is brought out of the dryer, there exists the potential of one unique cat which will appear at the center of the warm and dry clothes to take a nap there, whether the cat was visible anywhere near the dryer before the load was removed from it or not.

(H/T The Newton Blog)

Monday, January 2, 2012

Raise Your Hand

...if you can't wait until the Iowa caucuses are over so it'll be at least six or seven weeks before we hear the blathering class talk about who'll win them in 2016.

Sunday, January 1, 2012

Tell Me Why

Another tasty treat allowing me to get my geek on can be found here in the online edition of Wired magazine, dealing with some of the problems that scientists have when studying very complex systems like the human body.

Jonah Lehrer outlines a drug study and the history of treatment for back pain as two ways to look at how gathering more information about a problem doesn't always lead to its solution. A drug company studied what's called the "cholesterol pathway" to figure out a new way to treat people who have high levels of the so-called "bad cholesterol" in their systems. This pathway is the way that the body and its enzymes use and then break down both good and bad cholesterol. It is pretty widely understood, and the drug company had a compound that would assist the good cholesterol in removing the bad cholesterol and thus improve the health of people whose cholesterol levels were too high. The drug did exactly what researchers thought it would do and was in its final trials before being approved for public prescriptions. Except that when it did exactly what the researchers thought it would do, it actually endangered people's health instead of improving it. Although all of the chemicals acted exactly as they had in experiments, the effect of those actions was the opposite of what was intended.

Lehrer's article is long but is worth the read as he explores one of the problems the drug company's situation demonstrates: Complex systems are not easily understood. Well duh, we might say, but the truth is that a whole lot of the dietary and medical advice given out today is based on assumptions about causes and effects that themselves might overlook crucial elements.

He mentions an experiment done by Belgian psychologist Albert Michotte in the 1940s in which people saw short films with a moving red ball and a moving blue ball. The red ball would move across the screen and touch the blue ball and then stop. The blue ball would then move in the direction the red ball had been traveling. When people described it, they said the red ball hit the blue ball and made it move -- they almost automatically spoke in terms of causation, even though the film showed nothing that supported that idea. That interpretation matched most people's experience of watching what happens when a moving object hits a stationary object, of course. But there was no evidence for saying what happened in the film was the same thing that happened in those other cases. People shortened the process and supplied their own understanding.

Lehrer points out that's the way the brain works and if it doesn't work that way we can't operate in the world. It's sort of like we're hard-wired to produce a narrative explanation for things we see, whether there's any kind of explanation like that or not. We simplify a process by removing steps. Instead of giving the most complete and accurate description:  "The red ball moved until it touched the blue ball. The red ball stopped. The blue ball then started moving in the same direction that the red ball had been moving," we take the shortcut and say "The red ball hit the blue ball and knocked it away."

Most of the time that works. In the experiment, we don't lose much information by using the shortcut version. But when systems become very complex, we don't know what kind of impact removing steps can have, and that's why the drug company's compound didn't do what they thought it would do.

Scientists, of course, usually expect this sort of thing. They're used to not knowing things, and even not really knowing what it is they don't yet know. But a lot of goofballs who insist on a rigid cause-and-effect relationship in systems where it may not even be possible to know all the causes mostly overlook it and make all sorts of wild claims about the universe and knowledge that they can't really back up. Your Friar, mired as he is in his traditional Christian theism, understands this because he takes a lot of things on faith as well.

Of course, he admits it.