Sunday, May 31, 2015


Vicki Dodds, a detective sergeant with the Dundee Metropolitan Police, has a hot-headed trainee to wrangle, a preschool-age daughter to raise, nagging parents to endure and a blind date with a friend of a friend.

And a kidnapping to solve, on top.

Ed James creates Dodds as the central character of an ensemble cast that's trying to figure out why someone kidnapped a brother and sister connected with a dog breeding business and left them in a cage. But before the investigators get very far into their inquiry, another incident occurs, and the probe itself suggests there have been similar cases before. The level of violence is increasing, meaning it may not be long before someone is seriously injured, unless Dodds and the department can track down the criminals.

James writes an excellent by-the-book procedural, revealing as all good procedurals do that detectives rely on determination, persistent questioning and the checking and re-checking of information far more often than they do on service weapons or back-room suspect beatdowns. Dodds and company may be fairly standard characters -- Vicki herself is a Modern Career Woman Attempting to Juggle Her Profession and Family Obligations While Looking for Love, Law-Enforcement Model. But James paints them realistically enough to move beyond the stereotype level, recognizing that we have stereotypes because at their roots are actual people from whom the types come. His series of crimes seems a little out of place, more suited to a grim-and-gritty psychological suspense thriller than the realism-influenced world he's created, but it sorts out well enough in the end.

Snared is James' first novel from a traditional publisher; he has six self-published mysteries featuring Detective Constable Scott Cullen in the Scottish detective genre sometimes called "tartan noir." It's obviously intended as a series launch itself, but is well-done enough to make subsequent entries, if they come, worth a look.

Rick Murcer traces a similar arc, with several self-published books preceding Drop Dead Perfect, also a product of Amazon's Thomas & Mercer imprint.

Ellen Harper has a temper problem, which creates a number of other problems for her work as a forensics investigator for the Chicago Police Department. Complicating matters for her are a series of murders of young women, all found perfectly dressed and made up yet bearing the same written sign: "Not her." Harper will need to keep a rein on that temper to solve the crime and keep herself from getting fired -- or worse.

Like James, Murcer deals in well-traveled stereotypes. Aside from the twist of the Nearly Burned-Out, Walking-the-Tightrope-of-Barely-Repressed-Rage cop being a woman instead of a man, he offers very little that hasn't been done before. But unlike James, Murcer fails to put those pieces together in any kind of intriguing way or add any flavor to them. We have the requisite scenes in which the killer sadistically toys with and terrorizes his victim, or in which the higher-ups almost lose their patience with Harper but pull back because she's the only one who can get the job done. Weighing all of this down is the leaden style that Murcer uses, maybe a couple of steps above a Stratemeyer Syndicate entry but quickly tiresome.

Murcer's Amazon page shows a lot of approval for his self-published Manny Williams series, so Perfect might be a one-off clunker -- no single Amazon review tells you all that much about a book, but aggregated over several different volumes they can point one way or another. Murcer's own page says he is working on the second Ellen Harper book, and the relatively low price of a Kindle edition means that giving him a second chance is not a particularly big gamble. But that's because a smart gambler only bets what he or she is willing to lose, and any jump to much higher prices will put Mr. Murcer on the outside of that particular set.

Saturday, May 30, 2015

A Good Con

Some scenes from FanExpo Dallas, formerly known as the Dallas Comic Con, a science-fiction/fantasy convention.

-- Northbound I-35E at Lake Dallas is saved from being underwater by a railroad track that runs alongside. On the southbound side, we can see the pillars of the raised roadbed that is under construction. Unfortunately, we can also see various pieces of construction equipment more or less submerged in the risen lake, which would explain why construction is a bit slow these days.

-- Billie Piper, best known in sci-fi circles as Rose Tyler in Doctor Who and currently starring in Showtime's "all the monsters at once" series Penny Dreadful, is a charming lady who graciously answered all kinds of questions about a series she finished seven years ago, from people who didn't ask much at all about the work she's doing now. Not every actor would do that.

-- Nathan Fillion and Alan Tudyk, two of the stars of Joss Whedon's late, lamented Firefly, are an absolute hoot together. They talked about their current project, the web-based comedy series Con Man, which loosely retells some stories of Tudyk's experiences on the sci-fi and comic convention circuit. They enjoy working together, riff off each other very well and obviously greatly appreciate the "fandom" that enables them to do work they love. The absence of fellow castmate Morena Baccarin, occasioned by her work on the upcoming movie Deadpool, was an unfortunate occurrence but didn't diminish the fun of the panel with Fillion and Tudyk. It does leave me less inclined to pay money for Deadpool, however.

-- As always at such a convention, many people were dressed in costumes as their favorite characters. Some were more obviously homemade, while some were evidence of a lot of work and skill. Some are strictly in continuity and some blend or re-interpret different characters or fictional universes. I like those the best, usually. Most of the "cosplayers," as they are called, are willing to pose for photos if you ask and they are not currently doing something else, like eating their lunch or trying to buy something themselves. The most fun is watching people from different cosplay worlds get excited about taking each other's pictures. I am sure those who worked very hard on their costumes appreciate compliments from others who also worked very hard on their costumes.

-- One gentleman was wearing a male version of Princess Leia's "slave" costume from Return of the Jedi, sans metallic top. He had to have put that on after reaching the convention center, because there is no way he could have walked five feet in public in Texas without someone saying, "Let's get you some pants and a shirt, friend!" If he had demurred and explained he was in costume, I suspect the response would have been, "It wasn't a suggestion, son."

-- It's cool to visit with friends at these events. A ministerial colleague and his son were also present (He blogs here) and enjoying the festivities. My friend's son scored a sweet cast-resin replica of the revolver used by Nathan Fillion's character in the aforementioned Firefly.

-- Four-way tie for the cutest things I saw: 1) A four-year-old Drax the Destroyer, complete with shirtless gray torso and orange tattoo-like markings, as well as pre-schooler-sized plastic replicas of Drax's twin blades. Everybody should have taken a picture of this kid. 2) A nine-year-old young lady dressed as Matt Smith's version of the Doctor, complete with bow tie, tweed jacket and fez. 3) The Spider-Gwen in the ticket line next to me. She was not as young as the children mentioned previously, but the obvious enthusiasm and excitement she displayed while chattering non-stop to her companion was contagious and evidence of the main reason to come to a con dressed as a character -- to have a lot of fun. 4) The mom and daughter I saw walking into the center as I left, with the mom dressed as Cruella de Vil and her daughter in a black-and-white polka-dot dress and a spot of black make-up on the end of her nose.

ETA: My comments regarding not paying money for Deadpool do not represent a childish snit against Ms. Baccarin for not showing up, i.e., "She skipped the con so I won't see her movie." No, they represent a childish snit against the movie for wrecking her schedule. Just to clarify.

Friday, May 29, 2015

Foul Bud

At The Daily Beast, William O'Connor summarizes Jon Pessah's new book about Bud Selig's tenure as commissioner of Major League Baseball, and indicates it is just as bad as you might imagine.

Pessah goes into what he claims are Selig's financial shenanigans and his Captain-Renault-like complacency regarding steroids, which he was shocked to find running rampant among the bigger, faster and stronger players who were keeping records high, turnstiles spinning and greenbacks flowing. He also mentions Selig's greatest sin, allowing the 2002 All-Star Game to end in a tie, although O'Connor doesn't include that in his summary. That comes a little later in the book, which makes sense, because once you read something like that you already know the guy's a loser.

Thursday, May 28, 2015

Le Potpourri

-- So, I wonder if soccer is happy that the U.S. finally paid it some attention.

-- Rick Santorum has decided to let everyone know that he will not be president from 2017 to 2021.

-- Donald Trump may be planning to tell everyone the same thing, come June 16.

-- An article at Big Think talks about how to make digital branding less insufferable. Since the author didn't type "Stop digital branding" and then quit writing, I pretty much just skimmed.

-- This is a neat article about the Wham-O corporation's "Super Ball" toy, but the coolest thing about it is learning that the polymer which makes up the bouncy sphere is called "Zectron." No word on whether or not one should taunt Zectron, but I'm guessing no, based on the fact that "Zectron the Malevolent" is a phrase that comes all too easily to the tongue.

Wednesday, May 27, 2015

Engage Brain Before Speaking

Some scientists at the Beacon of Light, Knowledge and Decency in This Corner of the Universe (aka Northwestern University) have discovered that we begin to understand some abstract concepts as babies, well before we can talk or express them.

The study showed infants around seven months old pairs of objects, then placed a barrier in front of them and either rearranged the objects, replaced one of a pair of identical objects with a different object, or performed some other manipulation. The babies looked longer at the different objects than they did at the pair of identical objects, which is the usual sign in infant research that a baby is thinking about something. A longer stare is baby-speak for, "Well, what's all this then?" while a shorter glance means the baby has become bored and wants to look at something else. Teenage children often exhibit the opposite behavior, staring at something for many hours while frequently proclaiming boredom.

Now here might be the place where you would expect me to make a joke about how we lose the habit of thinking before speaking as we age -- we do -- and linking that to politicians. That would be a good joke, but not necessarily good enough. Nor would it be entirely true.

In fact, politicians actually develop higher perceptive skills as they gain experience. You and I look at what we have earned or what we own and we only see what we own. But simply by looking at a person, group of people or a company, politicians are able to discern differences in those very same things. They look at what we have earned or what we own and see which part of it belongs to us and which part of it belongs to them.

The part about them not thinking before they speak used to be more true than it is today. Today there are spokespeople, media relations firms and PR folks who spend most of their working hours saying the things that the politician would say if the politician were thinking before speaking. Often those things are what a poll tells the media management people that voters are thinking, but sometimes those things are things that the politician might actually say or believe. Wonders never cease.

Tuesday, May 26, 2015

Generation Which?

This post at Aeon may go a little too far, but there's some truth in it: A lot of times, the characteristics that are ascribed to a generation don't really apply well to the people in that generation -- or they apply at some times in life but not others. In which case they are probably more descriptive of a time in life than they are of some kind of generational cohort.

I recall that in the mid to late 1990s, "Gen-X'ers" were seen as disaffected slackers. There was something to that, but since a generation is usually taken to span about 20 years, it didn't apply across the board. I was a Gen-X'er, but from the earliest end of that time frame and didn't have all that much in common with the alienated goofball of Reality Bites. But on the other hand, the alienated goofball represented real people as well. Some of their alienation remained as they aged, but not all, and truth to tell, disaffection is not uncommon to late teens and early 20-somethings.

Major events can affect large groups of people -- World War II made the generation that fought it something it never would have been otherwise. Other events can affect a wide range of people, but not as deeply. People my age take note of things like the murder of John Lennon, the attempted assassination of Ronald Reagan and the explosion of the space shuttle Challenger, and we will probably always remember them and what we felt when we learned of them. But I can't see how even the three of those added together could have the same kind of impact brought on by the most massive conflict in human history.

Plus, a lot of the research into generational characteristics focuses on Western or developed nations, and often only on the U.S. What a Rwandan teen saw in the 1990s probably worked on them a little differently than, say, the death of Kurt Cobain did on American kids.

I often work with college students, young adults and youth, and I try to keep abreast of the kinds of things written about them. Some of them ring true, but some of them don't. The Aeon writer has some points about how a significant amount of the stuff I read oversells the ideas of generational characteristics and differences, but there's some baby in that bathwater and I can't buy the suggestion it's time to toss them out completely. But I would make sure there's some salt handy.

Monday, May 25, 2015

Gratitude, In Memoriam

A Dutch village remembers the sacrifices made by U.S. soldiers on their behalf by tending the graves at a World War II military cemetery. In some cases, the third generation of a family is now visiting the graves, placing flowers, and remembering what was given.

That's a pretty good thank you.

Sunday, May 24, 2015

Sunspot, Baby

This struck me as interesting, especially since I'm currently reading -- at a very slow and mystified pace -- a book about helioseismology, or the study of the movement within the sun. The picture is the magnetic field over a group of sunspots, color-enhanced to be visible in regular light.

Even though it's on the sun, it's still pretty cool.

Saturday, May 23, 2015

Did We Take the Red Pill?

A couple of other things along these lines have shown up here before, but this one poses an interesting question: What if the core "stuff" of the universe isn't stuff after all, but information?

Robert Lawrence Kuhn, creator of the public television show Closer to Truth, ran across the idea when interviewing some scientists. Science has almost always held that the universe and everything in it is made up of things, so to speak. Although as time and technology have progressed we have found smaller and smaller things, the idea is still that these things are little pieces of bigger things. Even the word "atom," which we use to describe the building blocks of matter, comes from an ancient Greek word meaning "uncuttable," showing how long this concept of the universe has been ingrained in our thinking.

But what if the core stuff of the universe was not stuff at all? What if it was information instead?

The idea requires a specific understanding of information that diverges a little from our common use of the word. It doesn't mean knowledge, data or trivia. The latter may make up the entire programming schedule of Comedy Central but not the universe.

As Kuhn notes, the concept of information being used here is much more like the one a computer uses in its binary code. A circuit is either on or off, and if you string together those ons and offs in a series, you create programming directions that tell the computer what to do. Whether the circuit is on or off is the kind of "information" that Kuhn's interviewees were speaking of.

As he notes in the piece, this could be seen as something like the spin of an electron. Electrons spin one way or another -- they don't actually spin around like a planet, but physicists have decided to label one of their characteristics "spin" in order to mess with non-physicists' heads. Whether the electron's spin is up or down determines a lot about it, and that "information" is basic to what the electron is doing. By extension that's basic to what the atom is doing, which in turn influences what the element is, which in turn influences what compound it is or isn't a part of, and so on.

Several physicists are not convinced, and the theory itself is in the early stages of development. At this point scientists aren't even sure what kind of experiments would be needed to test the idea, let alone what results would come about from them.

On the other hand, if a Laurence Fishburne pops up and offers to show you how deep the rabbit hole goes, that might be an indicator.

Friday, May 22, 2015

It Just Came to Mind

I went to college too long ago for this report about the perks university presidents get to include either of the men who held that job during my undergraduate tenure.

But since I just made my monthly offering on the altar of Sallie Mae (now called "Navient" for some reason that I am sure I got an e-mail about, written by someone whose position is partially funded by said monthly offerings and yet whom I cannot fire, demote or direct to do something more useful than change names and send annoying e-mails), I felt like posting it.

I'll also remember it the next time our state's flagship university presidents come before our legislature, wearing their best Dickensian orphan faces but traveling in late-model luxury sedans paid for and driven by someone else, to ask for a funding increase even though state revenue is projected to fall about $600 million below last year's figures.

'Cause those mooches do fall under the survey time frame...

Thursday, May 21, 2015

Captain Christopher Pike of the Starship Enterprise

Star Trek fans know that Captain Pike was the character played by Jeffrey Hunter in the first pilot of the famous show, "The Cage." When that pilot didn't sell, Hunter decided not to stick with the project, and a second pilot was made, featuring William Shatner as Captain James T. Kirk. The rest, of course, is history.

The Pike character came back in a two-part episode called "The Menagerie," but he had been horribly scarred in an accident and behind the heavy make-up was Sean Kenney (who also played characters in two other Star Trek episodes). In the reboot movies from J. J. Abrams, Pike is played by Bruce Greenwood and has the great line, "Your father was captain of a starship for 12 minutes. He saved 800 lives, including your mother's. And yours. I dare you to do better."

Some fans of the original series have gotten together an online fund-raising campaign to create a fan film called Star Trek: Captain Pike, with that serving as a bridge to the 90-minute movie Star Trek: Encounter at Rigel. A lot of these movies float around out there, of varying degrees of quality in acting, effects and production. The Pike group seems to have secured a pretty high-level cast for a production of this type and many of the listed members have previous connections to Star Trek shows and movies (including Kenney, who's been working as a professional photographer since 1980).

Here's hoping this goes off well. I've become curmudgeonly enough to hold a lot more appreciation for the original series than the later ones and I've enjoyed several of the stories fans have created using those characters. The Star Trek Continues folks, for one, have done four episodes so far and a couple of them have been better than anything the original show aired in its awful third season, even if their Dr. McCoy is atrocious and former Mythbuster Grant Imahara cannot convince me (or anyone else older than four) that he is Lieutenant Sulu. It would be nice to see a little bit of the Enterprise before she became famous.

Wednesday, May 20, 2015

Heavenly Kiev

Or at least, that's what it looks like, as Ukrainian art director Alexey Kondakov used photo software to insert figures of angels and the gods of ancient mythology into some everyday spaces in Kiev, the city where he lives and works. Here's one:

Betcha didn't know that the Virgin Mary rode the subway, did you?

It's kind of interesting, because if the story of the Nativity were to take place today, in all probability Mary would be the kind of person who would ride a subway to work. Maybe cleaning houses in an upscale neighborhood. She might have had to go back to work when Jesus was still an infant, and maybe carried him on the ride home.

The angels, on the other hand, have it easy. When you have wings, you don't even have to jump the turnstile to ride for free. You just float over it.

Tuesday, May 19, 2015

See Dick and Jane Chart

According to Andrew Powell-Morse at Seatsmart, the reading level of the top-selling chart hits has been getting lower and lower for the past 10 years. The average reading level for the lyrics of a chartbuster in 2005 was between the third and fourth grade. The average level last year was between the second and third grade, and a number of the major hits are well below that.

The reading level algorithm takes into account sentence length, word length, number of syllables and so on. That right away points to one of the problems in assessing whether or not a song is really dumb. A song that repeated the word "Mississippi," for example, could score higher than a song that repeated "Utah" just as often. The two would be equally meaningless, though, and could be described fairly as equally stupid.
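Powell-Morse doesn't say exactly which formula he ran, but readability scores of this general kind work like the Flesch-Kincaid grade level, which rewards long sentences and polysyllabic words. A minimal sketch (the syllable counter is a crude vowel-group estimate, not a dictionary lookup) shows why "Mississippi" on repeat would outscore "Utah" on repeat:

```python
import re

def count_syllables(word):
    # Crude estimate: each run of consecutive vowels counts as one syllable
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    # Flesch-Kincaid grade level:
    # 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)

# Two equally meaningless "lyrics," very different scores:
print(fk_grade("Mississippi. " * 10))  # ≈ 32.0 (four syllables per word)
print(fk_grade("Utah. " * 10))         # ≈ 8.4 (two syllables per word)
```

Same repetition, same information content of zero, but the four-syllable word reads as college-level and the two-syllable word as grade school -- which is why syllable-counting formulas say little about whether a lyric is actually smart.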

Powell-Morse notes it's just a fun exercise, which he pretty much has to say because his research shows that the "smartest" lyrics for any rock performer who charted between 2005 and 2014 were written by one Chad Kroeger and company -- the hated Nickelback.

On the other hand, the same research shows that the dumbest lyrics, reading-level wise, come from Ke$ha, and that is most certainly true.

Monday, May 18, 2015

Test of Time

What I like best about this story of a 102-year-old woman earning the Ph.D. originally denied her by the Nazis in 1938 is not that she was simply awarded the degree. It's that she defended her original thesis successfully and earned it.

Ingeborg Syllm-Rapoport produced a thesis on diphtheria when she was a medical student at the University of Hamburg. But since her mother was Jewish, the Nazis denied her entrance to the oral defense portion of the degree requirements and the university did not award the degree. Dr. Syllm-Rapoport emigrated to the United States and began practicing medicine but returned to Germany with her husband following the war. She retired as a full professor of pediatrics and director of the Neonatology Department at a top Berlin hospital.

I imagine no one would have been scandalized if the University of Hamburg had simply awarded her the degree that the Nazis had denied her the chance to earn. But the good doctor did not settle for that and sat for the thesis defense just as she would have done 77 years ago, and she passed.

In a way, this is a much more powerful answer to the ideas Nazism espoused about racial inferiority and the like than it would have been if the university had just decided she should get the degree. The Nazis claimed that having a Jewish mother disqualified Dr. Syllm-Rapoport from earning a Ph.D., but she demonstrated that she was not only qualified then, she still is. The only way the Nazi ideal could win against her was to cheat, which it did. But the victory was only temporary, and the Nazis were wrong.

Of course, they should be used to that by now.

Sunday, May 17, 2015

Is This Truth Self-Evident?

Over at Neurologica, Steven Novella explores the frequent disconnect between what people seem to think they know and what is actually so. Think of Jay Leno's old "Jaywalking" sketch in which he would ask random people on the street some fairly simple questions which some of them could not answer -- even though most folks, with a moment or two to think, probably should have come up with a correct answer.

Novella is commenting on the same kind of phenomenon, and wondering why it happens. He suggests a combination of several factors, such as media sensationalism, the tendency toward cognitive bias (we tend to believe an answer that fits what we think rather than one that does not) and the fitting of facts into narratives whether they actually fit or not.

I suspect he is probably on target -- at least, on the frequent occasions when I am mistaken, one of those factors seems to have played a considerable role. I may be a slow learner.

What piqued my curiosity was Novella's post title: "Why Is the Public So Wrong?" I'm afraid my own confirmation bias sprang to action even before I clicked the link and read, because I had an immediate answer: "Because it's made up of people."

Saturday, May 16, 2015

Maddening Max

Say whatever else you want to about Mad Max: Fury Road, but this much is certain: No one other than George Miller could have made it. Yes, anyone can make post-apocalyptic loner movies that re-tell Shane or some other Western only with MTV costumes -- the "Action Movie" shelves at Blockbuster in the late 80s testify to that. Yes, anyone can depict death-defying stunts on screen -- these days, with some clicks of a mouse. And more than a few people can make movies with barely a handful of lines of dialogue -- although every time I see Seth Rogen onscreen I realize that number isn't as large as I'd like it to be.

But nobody can mix all of these things with the fuel additive of flat-out crazy in the way that Miller can. His absence from Mad Max Beyond Thunderdome -- the death of his filmmaking partner and friend Byron Kennedy while Kennedy was scouting locations left Miller unwilling to agree to handle more than the action set pieces -- was among that movie's many problems. And now, 30 years later, comes a fourth Mad Max movie telling us about Max Rockatansky (Tom Hardy), a police officer before the world went meshuggah who lost his wife and baby daughter to a biker gang and who now just wanders from place to place, seeking only survival. Fury Road begins with Max's capture by a warlord named Immortan Joe (Hugh Keays-Byrne) and his being pressed into service as a human blood bank due to his universal donor profile. It isn't long before Joe leaves his Citadel oasis to chase a turncoat soldier, Imperator Furiosa (Charlize Theron), who has spirited his five wives out of their captivity. Max, taken along as a captive to supply blood for one of Joe's ailing suicidal soldiers, manages to escape his own captivity during the chase and winds up joining Furiosa as they flee Joe in search of a safe place in the wasteland.

The cars and culture Miller whipped up for the movie are the dream of every six-year-old boy who figured gluing Hot Wheels together would turn a cool car into a super car, every person stuck in traffic who wishes they could just stomp on the gas and shove the lesser mortals aside with the huge iron spikes that stick out the front of the car, and so on. The action sequences -- and the movie is basically one long chase, so there are many -- are like demolition derby ballet. Somehow Miller makes the violent collisions of multiple tons of Detroit rolling iron happen with the grace of a grand jeté. Even though most of the stunt work is practical rather than computer-generated, it and the visuals are not particularly realistic. The story, though, works within that frame rather like a fable told around the campfire some years later -- with the escapes narrower, the villains villainouser, all of the guns and cars and equipment straight from the same place we got Babe the Blue Ox and everything exaggerated to make the tale even taller.

The only real problem with Fury Road is that there have been three Mad Max movies before it. I don't mean the problem of continuity -- "But the last of the V-8 Interceptors was already wrecked!" would be one of those complaints. Miller said he explicitly saw Fury Road as a franchise reboot, rather than a continuation of the stories told in the three movies where Mel Gibson plays Max. He said he deliberately made that choice to avoid telling a story he's already told. That's a good idea -- another one of the problems with Thunderdome was that Road Warrior basically completed Max's character arc. Anything that happened after it was much less interesting.

But in Fury Road, Miller does tell the same story he's told before -- that even in a wasteland, an "only the strong survive" mentality creates a society of brutality that eventually collapses under its own weight -- as Auntie Entity learns in Thunderdome and the Humungus figures out when he finds out the hard way that brute strength is not the only way to win a fight. Joe's wackadoodle Citadel combines Auntie's personality cult and the Humungus's mohawked looney-toon predatory punk bikers.

Max reconnects with the humanity he thought lost when he realizes that although he couldn't help his wife and daughter yesterday, he can help someone today, just as he does in both Road Warrior and Thunderdome. Heck, Miller uses some of the same people: Hugh Keays-Byrne played the Toecutter, the main antagonist in Mad Max.

The he-man tough-guy warrior schtick of both Joe and the Humungus falls to a society that values both genders. That idea, by the way, seems to be the extent of the back-and-forth about Fury Road being a "feminist movie." Pro- and anti-feminist blatherers made much of Miller using playwright Eve Ensler as a coach to help the models playing Joe's wives understand the mindset of someone basically held as a sex slave. That's fine. My main worry was that he'd hired Ensler to write some of the movie; I've read The Vagina Monologues.

Road Warrior ended as the narrative built to a spectacular chase scene that's rarely if ever been equaled onscreen. In Fury Road, Miller has expanded the chase scene to be the whole movie and worked his story into it. No problem with that concept: John Ford and John Wayne managed a pretty decent movie using it. But Miller is telling the same story he has told before and using many of the same pieces. People who didn't see either of the first two Mad Max movies when they came out and kicked off a wave in action cinema that still circulates today, or who know Thunderdome only through the IMDB quote board ("Two men enter! One man leaves!" "Ain't we a pair, raggedy man?") and a Tina Turner video might not realize it, but the amazing spectacle and stunning visual fable they're watching has at its base a story that's just spinning its wheels.

Friday, May 15, 2015

Blues Never Dies

According to his 1996 autobiography Blues All Around Me, co-written with David Ritz, blues guitar legend B. B. King may have led less than an exemplary life and been something other than a role model as a family man. OK, given. I work for an outfit whose first high-profile spokesman denied the founder three times and whose most effective marketer spent his early years trying to kill the company he'd later work for. That same marketer mentioned that everyone's got flaws, so B. B.'s don't freak me out.

Even if they did, when I think of a world that didn't have "The Thrill Is Gone," "How Blue Can You Get," "Nobody Loves Me but My Mother," "When Love Came to Town," or heck, even "Into the Night," I think of a grayer and shallower place. That would be a universe significantly short-changed in the ol' "many-worlds hypothesis" sweepstakes, and one whose people would be justified in skedaddling for another reality as soon as possible.

And when they come to this world, one in which a woman named Lucille can weep silver tears that are not seen but heard, in every note, and one in which a King can roar as loud as the lion that shares his name and fill it with pain and regret and defiance and triumph every time, I imagine them stunned to their knees as they take in a flood of the joy and sorrow of life all at once.

They have seen love conquer the great divide.

Requiescat in Pace, Rex Caerulei

Thursday, May 14, 2015

The Company of Friends

Pluto may not be considered a planet, but it's got more moons than some of the "full" planets, and the New Horizons probe has recently sent photo proof of their existence.

Take that, Venus and Mars!

Wednesday, May 13, 2015


At Science 2.0, Hank Campbell investigates just how far a home-run can be hit, and who probably hit the longest one.

Although there was much talk in days of yore about 500-foot blasts, Campbell suggests that they were rarer than believed, and 500 feet was probably the limit. In today's game, which includes a stadium in Denver's rarefied air, the great Mickey Mantle might have sent one 580 feet at the outer limits of earthly conditions.

Based on his math, measurements and suppositions, Campbell figures that Mantle or Babe Ruth probably holds the unofficial record for the longest hit ball. That seems about right, although I think that Josh Gibson should be in the mix, and Bo Jackson in his all-too-brief prime might have managed that or a little more.

Either way, it's always nice to see science applied to the higher arts.

Tuesday, May 12, 2015


Today's Astronomy Picture of the Day shows how different a sunset can look from a different planet. The side-by-side compares a sunset on Earth with pictures the Curiosity rover took of one on Mars.

Unsurprisingly, the sun from Mars looks slightly smaller. But what scientists are trying to figure out is why it's bluer than the Earthview edition. Martian dust and its light-scattering properties are the prime candidates.

Monday, May 11, 2015

I Missed Something

Ray Kurzweil, a smart guy who talks a lot about what may (or may not) happen in the future, suggests that human beings will develop computerized personal assistants that will be able to read hundreds of millions of web pages in just a few seconds.

Unanswered by Mr. Kurzweil are these questions:

1) What will the assistant do after that? Not just in terms of what it will do with the data, but in terms of what the human beings using it will then use it for once it's glommed up the hundreds of millions of web pages. Web pages are created at a pretty fair clip, but if you're downing them hundreds of millions at a gulp, you're likely to stay ahead of the curve. Not to mention that there are any number of web pages -- present company excluded -- that don't exactly add to your knowledge if you process what's on them.

2) Why the heck would you want it to do that?

3) Why do you want to waste time slurping up web pages with a computerized digital assistant when it can create a suit of powered armor that lets you fly around the world and fight crime?

Mr. Kurzweil may need to put this one back in the oven for a bit.

Sunday, May 10, 2015

Good Speech? Good Question!

This year's set of commencement speaker squawking has at least sparked a worthwhile question to ask: Just what the heck is that speech supposed to do?

Harry Painter, writing at The Pope Center, outlines some of the issues swirling around this matter after noting that some folks are asking the University of Houston why they are paying Matthew McConaughey $135,000 for about twenty minutes' work. That's kind of ludicrous even by Hollywood standards, even though McConaughey had some interesting things to say when he won his Academy Award in 2014.

Colleges use big-name speakers to help build brand identity, one theory goes, although it can backfire if the profile of the speaker is something that ticks off donating alums. It can also garner some negative publicity when the precious little snowflakes who make up the graduating class believe their graduation experience will be ruined because the speaker espouses causes in which they do not believe or otherwise fails to measure up to some arcane standard of university perfection. Of course, their employment experience after college will be ruined the first time they expect the world to conform to their standards, as employers frequently insist on things being the way they like them. That is, if whatever micro-specialized subset of social theory in which they earned their degree allows them to secure employment at all.

In any event, I think Mr. Painter has a point. My college commencement speaker was then-U.S. Secretary of State George Shultz. The man who spoke to our journalism class diploma ceremony was either a founder or manager of Crain's Chicago Business, a magazine. In addition to not remembering his precise position, I also can't remember his name. Whether or not this had to do with the bottles of champagne we were passing up and down the rows -- it was not a state school and we were all over 21 -- I can't say.

I imagine it really didn't, because the idea that a 20-minute speech would have some application that four years of school hadn't already had seems a bit off. As a pastor who has served in small towns, I am sometimes called on to speak at the local baccalaureate service, and that idea forms the kernel of my usual speech. My key point to them is that, now that they are leaving high school, they should go ahead and leave high school. Because of reruns, I can still get away with reminding them that someone who peaks in high school, like say Al Bundy and his four touchdowns in one game, has more than 50 years to watch other people forget the only thing that has mattered to those who peaked pre-18. If some of them get the idea, great. If not, well, I am rather used to saying things that people don't listen to so well. Either way, the school isn't out a dime and I wouldn't expect them to be.

But for $135,000, I might come up with a new speech. And maybe even go twenty-two minutes.

Saturday, May 9, 2015


Had some torrentialness this evening, and had to open up the church for the neighbors. We'll return tomorrow to the regularly scheduled middle-aged grumpiness and exhortations to remove yourself from my lawn.

Friday, May 8, 2015

Repeat Performance Needed?

So 70 years ago today, we could finally get started fixing Europe after Germany, Italy and (off-again, on-again) Russia broke it.

Germany and Italy seem to have figured things out, but ol' Vladdy P. and his all-you-can-eat Ukraine sandwich make you wonder about Russia. And Greece must figure any attention is better than no attention at all and 70 years of relative stability and prosperity are boring, so why not whip up some fun?

Sometimes I wonder if these people need a baby-sitter. Then I look at the UN and I figure, let's not push things.

Thursday, May 7, 2015

No Bedtime Story

Adam Swift, a professor of political theory at the University of Warwick, demonstrates that in order to get an idea that's weapons-grade stupid, you need an extensive education.

Professor Swift was recently interviewed by the Australian Broadcasting Corporation's The Philosophers' Zone, where he discussed the theory of egalitarian families he is trying to develop with his colleague Harry Brighouse. Among the things that Professor Swift would like to see is the elimination of private schools, which help create advantage gaps that exacerbate inequalities in society. In other words, rich and (sometimes) smart kids who go to private schools have an unfair advantage over poor and smart kids who go to regular public schools. I think a better path than closing down good schools is improving less good ones, but we can live and let live on disagreements about that sort of thing.

The one thing Professor Swift talked about that got some press, though, was noting how parents who read to their kids at night should be aware they are providing their child an unfair advantage over the parents who don't. He magnanimously allows such parents a few nights off of feeling guilty about all of the kids who don't get read to, but still says they should think about that occasionally. Whether or not they should actually do anything that might help out some of those kids and families apparently is a matter for smaller minds than Professor Swift's.

And while we're at it, I don't believe parents are under any effin' obligation to think about this matter in terms of "unfair advantages." Feeling something about kids whose parents' decisions, freely made or otherwise, disadvantage them even before they get to the starting line is a good thing. Wanting to somehow respond to help those so disadvantaged is also a good thing. But the mere existence of my reading parents did not cause someone else's non-reading parents, and pretending otherwise as Professor Swift seems to suggest is just plain silly.

Professor Swift compares the existence of elite private schools to reading parents, and says that while the elimination of the schools wouldn't harm the family bonding process, elimination of reading at bedtime would. It has value beyond its educational benefits. He's a prince, ain't he? It takes someone who gets paid to be listened to by impressionable 18-year-olds about whatever airy fancy flits through his brain to even talk about conditions under which bedtime reading could somehow be reduced or eliminated.

Leave aside practical considerations, such as how such behavior could ever be curtailed outside of a world where clocks strike 13, 2+2=5 and Winston Smith loves Big Brother. Philosophy professors have spent a lot of money -- mostly mom and dad's and the government's -- to get to a place where they can ignore things like the impossibility of surveilling every pre-schooler at bedtime, and we shouldn't ask them to just waste all of that by reminding them how much of the world gets by without encountering philosophy professors for years at a time.

No, let's just look at the big ol' existential elephant in junior's bedroom, which is that whether or not parents read to kids at bedtime is none of Professor Swift's gorram business. His belief that there's a utility to the activity that makes it acceptable is completely irrelevant. Also, for that matter, is why parents read to their kids or what they think when they do it. My dad did most of the bedtime reading, and I think some of it was because he didn't get to spend as much time with us as mom did since he worked at an office during the day. And sometimes it was because he liked reading and his mom had read to him and he wanted us to like reading too. And sometimes it was because that's what dads did. And sometimes it was because we were wound up and needed the semi-hypnotic effect of a bedtime story to get us ready for sleep.

But even if he was reading excerpts from the Montgomery Burns' Guide to Serf-Crushing with the deliberate intention of helping us to make other kids look dumb and fail at life, it still would have been something about which Professor Swift's opinions were neither required nor appropriate to offer.

So please, if you have kids, read to them at night and do so free of guilt about what other parents don't do. Read to them to make them smarter, read to them to show them you care, read to them to spend time with them, read to them to help them develop a love of reading and language, read to them whyeverso you want to. And whatever you want to, and do so free of worry about what Professor Swift, I or anyone else thinks about it, because it's none of our business.

Unless you read to them from How Not to Be a Hypocrite: School Choice for the Morally Perplexed Parent, or something like that. Then I'm calling the cops on you.

Wednesday, May 6, 2015

All the Cool Theories Never Work

My own paranoia that the Large Hadron Collider will create a black hole that will swallow the universe is both well-documented and completely made-up. But it is by no means the wackiest theory anyone ever had about the gigantic particle accelerator lurking beneath placid Switzerland.

In 2008, Holger Bech Nielsen of the Niels Bohr Institute in Copenhagen and Masao Ninomiya of the Yukawa Institute for Theoretical Physics published a series of papers that said the LHC violated enough natural laws that it could never be operational and in fact attempts to turn it on would cause space-time ripples that would flow backwards in time and wreck the thing before it ever fully fired up.

No experiments regarding the Nielsen-Ninomiya hypothesis were ever conducted, but it was proven incorrect when the LHC did discover the Higgs boson and did not spontaneously TARDIS itself out of existence. The jury is still out, of course, on the whole black-hole-swallowing-the-universe thing.

The Nielsen-Ninomiya hypothesis was always iffy, since no one has ever recorded an instance of time correcting itself backwards, and there are plenty of candidates that defy rational thinking to the degree that they should have triggered it. Nancy Pelosi winning elective office, Will Ferrell getting work as a comedian, people taking Bill O'Reilly seriously, Mike Huckabee running for president twice -- any or all of these events should have triggered the backwards-correction effect as the universe attempted to regain its equilibrium as a rational entity capable of being understood. So the hypothesis was never really practical.

Unless the universe isn't rational. In which case we might see something like college students arguing not for their right to see and say whatever they wished, but to be protected from seeing or hearing anything that might offend them...uh-oh.

Tuesday, May 5, 2015


You may have been wondering, which exerts a greater gravitational pull on you: the sun or spiders? Or you may not have been wondering, but just sort of came upon the question while reading the what-if? blog at xkcd.

Obviously the greatest gravitational pull on each of us is the earth itself, since it is the nearest large object to us. But what about after that? The writer does some math and shows that the sun's sheer immensity more than makes up for the number of spiders that may be surmised to exist in the world, and even for the fact that those spiders are much nearer to us than the sun is.
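For the curious, the comparison falls straight out of Newton's law of gravitation, a = GM/r². Here is a rough back-of-the-envelope sketch in Python; the spider figures (total mass and average distance) are made-up round numbers for illustration, not the what-if? blog's actual estimates:

```python
# Back-of-the-envelope: gravitational acceleration on you from the sun
# versus from all the world's spiders, via a = G * M / r^2.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2

# The sun: well-established figures.
M_sun = 1.989e30     # kg
r_sun = 1.496e11     # m (one astronomical unit)
a_sun = G * M_sun / r_sun**2

# The spiders: illustrative guesses, NOT measured values.
M_spiders = 1e10     # kg -- pretend ~10 million tonnes of spiders worldwide
r_spiders = 1e6      # m -- pretend an average distance of ~1,000 km
a_spiders = G * M_spiders / r_spiders**2

print(f"Sun:     {a_sun:.2e} m/s^2")      # on the order of 6e-3 m/s^2
print(f"Spiders: {a_spiders:.2e} m/s^2")  # on the order of 7e-13 m/s^2
print(f"The sun wins by a factor of about {a_sun / a_spiders:.0e}")
```

Even with generous guesses for the spiders, the sun comes out ahead by something like ten orders of magnitude; proximity just can't beat thirty powers of ten in mass.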

On the other hand, I can guarantee that seeing a spider in the house -- especially since the area in which I live is prone to fiddlebacks -- creates a pull of my foot towards it with a force that simulates gravity quite nicely. And the end result for the spider is not completely different than the end result of experiencing immense crushing gravity might be for us. Except that in the case of the little Shelob-wannabe, it's richly deserved.

Monday, May 4, 2015

Wrong-Sized Hat

Or technically, the wrong-sized planet for the cool dwarf star HATS-6, which is about 500 light-years from Earth.

The planet, which is called HATS-6b, is a large gas giant about the size of Jupiter -- even though it has a total mass about equal to Saturn, which is smaller. This means that it is not very dense at all, because even Saturn itself is less dense than water and would actually float if you could find an ocean big enough to hold it -- and the gas didn't collapse into liquid or disperse or something. It orbits its star about once every 3.3 days, meaning that every week for us is a bit more than two "years" on HATS-6b.
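The arithmetic behind both claims -- the puffiness and the two-"years"-a-week orbit -- is simple enough to check. A quick sketch, using round textbook values for Saturn's mass and Jupiter's radius as stand-ins for HATS-6b's reported bulk:

```python
import math

# HATS-6b: roughly Jupiter's size with roughly Saturn's mass.
# These are round reference values, not the paper's precise measurements.
M_saturn = 5.683e26    # kg
R_jupiter = 6.9911e7   # m

# Density of a Saturn-mass ball filling a Jupiter-size volume.
volume = (4 / 3) * math.pi * R_jupiter**3
density = M_saturn / volume   # roughly 400 kg/m^3
print(f"Density: {density:.0f} kg/m^3 (water is 1000; Saturn itself is ~690)")

# An orbital period of 3.3 days means one Earth week holds:
local_years_per_week = 7 / 3.3
print(f"{local_years_per_week:.1f} HATS-6b 'years' per Earth week")
```

So the planet works out to well under half the density of water, and 7 / 3.3 does indeed come to a bit more than two of its "years" per week.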

Most theories of planetary formation would predict that M-class dwarf stars like HATS-6 would not have giant planets, and certainly not ones that would orbit so close to the star itself. A scientist quoted in the story said that the planet probably formed at a greater distance than it is now and migrated in, but current theories don't allow for that either.

In essence, this "hat" holds a large ball of rather dispersed gas. I suggest we call it Biden.

Sunday, May 3, 2015


A recent check of the bank balance via the trusty electronic banking interface revealed that Uncle Sam had deposited the refund I had coming. Meaning that the money which I had lent Uncle interest free, which belonged to me in the first place, once again resided with the person who had earned it.

I feel something about this, but I am not sure it's gratitude.

Saturday, May 2, 2015

Oh, Yeah

There is about to be some serious jamming; that's the only good news about this item.

Herewith, a taste of what's going to be going on in eternity for an eon or two (or three or four)...

Friday, May 1, 2015

It Sounds Pretty Good, Anyway...

We present another in our occasional series of article headlines that sound too good to pass up, even if the amount of scientific education required to understand them far exceeds the basic astronomy course which we took as a freshman in college more than three decades ago:
Multifractals Point To Existence Of Unknown Physical Mechanism On The Sun
And as usual, we should note that "Multifractals" would make an excellent name for a band -- probably some kind of techno-revivalist outfit that covers a lot of M, Gary Numan and Flock of Seagulls, with a couple of Devo's weirder numbers thrown in.

The story is actually about how a special kind of mathematical analysis of sunspot activity, called "multifractal analysis," has shown that something seems to influence sunspot activity in a way that provides an unsuspected correlation. Since "multifractal analysis" essentially means running a fractal analysis on a fractal analysis, and since fractals are patterns that keep repeating their structure no matter how far you zoom in (graphing them often makes those cool psychedelic patterns), I have no way of knowing what exactly the analysis has shown. But at this point, neither do the scientists, although one suspects they will eventually arrive at an answer -- that I can later read about.