Monday, February 25, 2013


          Two generations have now lived their entire lives under a government-mandated regime of racial preferences that effectively discriminates against white Americans in employment, education, and government contracting.  Although relief from this divisive race-based regime is long overdue, developments under the Obama Administration indicate instead that worse is yet to come -- while those on the unfavorable end of these policies seem to be steeped in lassitude.

            The federal government's so-called affirmative action policies, an institutionalized euphemism for mandatory preferences for minorities, have been with us for some 47 years, i.e., since 1966.  Although one might expect that nearly half a century of official discrimination is enough, and that the remarkable advances made by minorities during that period would have brought an end to, or at least a substantial curtailment of, such policies, nothing of the sort has occurred.

            As demonstrated by the events summarized below, radical government racial preferences have actually been expanded rather than reduced – despite repeated Supreme Court opinions establishing that most of those policies and practices violate the Constitution's guarantee of equal protection of the laws.  And in what may be the most insidious result of nearly a half century of this discriminatory regime, many, if not most, white Americans now seem resigned to this state of affairs as some kind of bizarre status quo that is pointless to resist.  Rarely in human history has such a large class of persons so tamely acquiesced in policies so harmful to itself and its offspring.

            In 1964, Congress enacted the landmark civil rights legislation that outlawed race discrimination in employment.  Similar laws affecting housing, education, and public accommodations soon followed.  Had these laws been enforced in keeping with their original language and purpose – equal opportunity for all, and discriminatory preference for none – the nation could have avoided the half-century of racial friction, litigation, and recrimination that has actually followed their enactment.  Instead, federal agencies, Congress, and the courts transformed the law's anti-discrimination mandate into a relentless regime of racial preferences and quotas under the benign label of "affirmative action." 

            The flawed central premise of these policies was, and is, that the anti-discrimination laws mandated not merely equal opportunity and freedom from discrimination, but equal results.  The critical machinery adapted to achieve that end was the infamous "disparate impact" doctrine.  Under that theory, any aspect of a hiring or other selection process (e.g., college admissions) that resulted in less favorable outcomes for preferred racial or ethnic groups was deemed discriminatory.  Perversely, this doctrine expressly eliminated the requirement of discriminatory intent to prove unlawful discrimination.  And the whole machinery of the federal government and the liberal federal judiciary stood ready to strictly enforce this oppressive doctrine of no-fault discrimination.  Predictably, employers, colleges, and other institutions went to extreme lengths to avoid the harsh penalties that would follow a purported violation – they began to adopt and expand preferences that favored the protected minorities and, ipso facto, discriminated against whites.

            The practical reality of the anti-white discrimination was soon demonstrated.  Institutionalized "reverse discrimination" was (and remains today) especially tangible in the realm of university admissions, and a few intrepid white students denied admission in favor of less qualified minorities began to stand up and sue.  In 1971 (42 years ago), a white man named Marco DeFunis established in court that he had been discriminatorily denied admission to the University of Washington's state-operated law school.  By the time his case reached the Supreme Court on appeal (DeFunis v. Odegaard, 416 U.S. 312 (1974)), however, it was held moot.  But a dissenting opinion by Justice Douglas – one of the most liberal justices in history – eloquently stated the core principle that has since been crudely ignored by federal and state governments, generally with impunity:  "There is no constitutional right for any race to be preferred. . . .  Whatever his race, [DeFunis] had a constitutional right to have his application considered on its individual merits in a racially neutral manner."
                     Allan Bakke's well-deserved graduation from med school                              
            Four years later, in the landmark case of Regents of the Univ. of California v. Bakke, 438 U.S. 265 (1978) – in which this writer co-drafted an amicus brief supporting Allan Bakke – the High Court held that California had unconstitutionally excluded a qualified white applicant (Mr. Bakke) from the state medical school by setting aside a quota of 16 of the 100 available places exclusively for designated minorities.  But even while acknowledging the reality of institutionalized anti-white discrimination, the Court held that race-based affirmative action plans could be maintained as long as race was "merely" one of several factors considered – i.e., race could not be the dispositive factor in itself.

          This constitutional compromise proved to be an ineffectual fig leaf.  Although whites have succeeded with various specific claims of reverse discrimination in employment, government contracting, and college admissions since Bakke, the Supreme Court has failed to formulate a dispositive ruling on the issue.  Given the constitutional leeway allowed by the Court's equivocation – especially its ruling in Grutter v. Bollinger, 539 U.S. 306 (2003), that some racial preferences can be justified merely by the desirability of achieving racial diversity – a broad range of Government racial preference programs has continued in force to the present.  And the Obama Administration is busy expanding them.

            A few examples demonstrate the pernicious consequences of the persistent racial preference and disparate impact regime that is now simply taken for granted by an American public preoccupied by more stimulating matters, like the Super Bowl and the Academy Awards.

            Going back to the early 1970's, an alliance of anti-death penalty liberals and black organizations like the NAACP has invoked disparate impact analysis to argue that the death penalty is discriminatorily applied to blacks.  In fact, however, the opposite is true.  Blacks commit roughly half (or a bit more than half) of the murders, the only crimes generally subject to the death penalty, so one would logically expect blacks to constitute about half of those sentenced to death or executed.  Yet Bureau of Justice Statistics data have consistently shown for decades that whites are both sentenced to death and executed in far larger numbers than blacks.  In 2010, for example, 33 whites were executed in the U.S., while only 13 blacks were executed, see Bureau of Justice Statistics, Capital Punishment 2010 – Statistical Tables, Tables 4, 9, and 13 (Dec. 2011) – whereas roughly 52% of convicted homicide offenders are black.  See Office of Justice Programs, Homicide Trends in the United States, 1980-2008 (Nov. 2011).

            Faced with this statistical reality, the advocates of race-based "justice" shifted gears and argued that it is the race of the victims that should be examined to determine whether the death penalty is racially discriminatory:  i.e., if the murderers of white victims were more likely to receive the death penalty than the murderers of black victims, the "system" was deemed discriminatory on disparate impact grounds and in violation of the equal protection clause.  In short, these advocates wanted black murderers to be able to escape the death penalty because their victims were white – grotesquely arguing, in effect, that murderers should be able to invoke the race of the person they chose to murder as a means of evading execution.  But in McCleskey v. Kemp, 481 U.S. 279 (1987) – another case in which I prepared an amicus brief opposing the race-based justice arguments – the Supreme Court held that mere statistical disparities were insufficient to establish that the death penalty was unconstitutionally discriminatory; it must be demonstrated that racial discrimination was actually intended in the defendant's individual case.

            Despite this ruling, and despite the demonstrable fallacy of their premise that the death penalty discriminates against black murderers, these racial justice advocates have persisted in pressing their false argument in the legislative and judicial arenas.  In 1988, the late Senator Edward Kennedy championed radical federal legislation that would have prohibited states from imposing the death penalty if statistics could be manipulated to show any "disparate impact" in their past administration of capital punishment.  Only determined efforts by a small group of stand-up senators (I was active in advising them in my then role as Counsel to one of those senators) prevented passage of this radical legislation.  But in North Carolina, legislators have since ignored the reasoning of the McCleskey case, as well as basic logic, in enacting a law that bars the death penalty based on the purported statistical patterns of past cases, even if there is no evidence of discrimination in the defendant's individual case.  In short, racial politics continues to be employed to trump truth and basic justice in the administration of the death penalty.

            Although the race-based distortions in the death penalty area are egregious, their impact is relatively narrow compared to the more far-reaching application of "disparate impact" doctrine and racial politics that produced disastrous results for the national housing market and the economy in the 2008 collapse of the subprime mortgage market.  As has been well-documented elsewhere, the subprime mortgage crisis resulted in large part from the loosening of standards of creditworthiness that led in turn to extensive defaults and devalued mortgage debt.  Mortgage credit standards were drastically loosened because of threats and pressure from federal regulatory agencies based upon the notion that normal credit standards discriminated against blacks and other minorities under the disparate impact doctrine and other contrived racial justice theories. Obama Attorney General Eric Holder, then the Deputy AG at the Justice Department, was a key player in pressing these disastrous policies.  In short, federal regulators encouraged or effectively required lending banks to make risky subprime mortgage loans to meet affirmative action goals in housing, with the disastrous results that were a major cause of the 2008 recession.

            Even while the subprime mortgage disaster demonstrates the damaging practical consequences of race-based policies, the realities of life in 21st century America demonstrate the obsolescence of the premise for such policies – that African-Americans constitute, in the Supreme Court's famous phrasing in the case of U.S. v. Carolene Products, a "discrete and insular minority" that is powerless to protect itself through normal political processes.  Not only was the extremely cohesive black vote sufficient to assure the election of a black President in the last two elections, as well as 42 black members of the House of Representatives, but the increasingly influential role of blacks in high public life is further evidenced by the expanding succession of black Secretaries of State, Attorneys General, Presidential Advisers, federal judges, and other high officials.  It can no longer be credibly argued that blacks are entitled to preferential protection by law because they lack the political power to advance and defend their own interests.

             Nonetheless, far from moderating racial preference policies, the Obama Administration continues to reinforce and expand them.  The most egregious recent example is a radical regulatory mandate from the Equal Employment Opportunity Commission (EEOC) designed to deter employers from performing the criminal history checks they use to avoid hiring dangerous felons.  The premise for this dangerous and insidious policy is, once again, the discredited disparate impact canard.

            Blacks represent a proportionately greater percentage of those with criminal records than whites for the obvious reason, firmly demonstrated by BJS data, that they have committed proportionately more crimes.  It naturally follows that screening potential employees to avoid hiring dangerous or dishonest offenders tends to have a mathematically "disparate" impact on black applicants, in that more black applicants with criminal records will be identified.  But such a practice is in no way "discriminatory" under any reasonable understanding of the term; on the contrary, it is a necessary measure to assure a safe, secure, and honestly-run workplace.  Nevertheless, on the bogus grounds that this entirely sensible hiring practice has a discriminatory impact, the EEOC has issued a "guidance" that imposes such harsh conditions on the use of criminal background checks as to effectively discourage employers from using them.  Specifically, if the check reveals that the applicant has a criminal background (e.g., he is a convicted burglar or rapist), the employer is expected to conduct a thorough "individualized assessment" to establish that there is a genuine "business necessity" for refusing to hire the felon.  It is important to note in this respect that an employer is not required to conduct an "individualized assessment" before rejecting a perfectly honest applicant with a clean record on almost any grounds (e.g., his personality wouldn't be a good fit) not covered by the antidiscrimination laws.  In short, the EEOC's latest extension of the race-obsessed disparate impact doctrine effectively makes convicted felons a specially protected class in the eyes of the law.

            The EEOC's outrageous effort to prevent businesses from performing necessary criminal background checks on job applicants is merely one recent example of tyrannical and over-reaching race-based policies that have long outlived the purposes embodied in the civil rights acts of the 1960's.  Despite the obsolescence of the remedial basis for such policies, they are so deeply entrenched in our government and other institutions that only a determined and broadly-based movement could bring about their abolition or even substantially curtail them.  But given the political lassitude of much of the citizenry who are on the losing side of these racially preferential policies -- and their predilection for conflict avoidance on racial issues even when the welfare of their children may be at stake -- the prospects of eliminating institutionalized racial preferences are not promising in the absence of a widespread national attitude adjustment.



Monday, February 18, 2013


                Karl Marx, the father of the communist movement, is often quoted as saying that "Religion . . . is the opium of the people."  Whatever validity Marx's dictum may have had when he wrote it some 170 years ago, it has little relevance in the secular world of 2013.  In the United States, at least, it is Big Time athletics that has become the opiate of the people.  And America's voyeuristic preoccupation with Big Time Sports not only diverts popular attention from the more important things in life, it has a degrading effect on the outlook and behavior of the growing legions of spectator sports addicts.
                                                                     Coach Marx

            Oddly, the obsessive and excessive quality of big-time spectator sports today is proudly embraced rather than denied.  Numerous popular sports radio programs are unashamedly but revealingly called "The Sports Fix" or "The Sports Addicts." Similarly, a frequently run advertisement on the sports mega-network, ESPN, gleefully depicts the extreme and irrational behavior of various categories of sports fans and then triumphantly and dogmatically proclaims, "It's not crazy.  It's sports."  It is simply taken for granted that the irrational excess associated with the fans of pro or college sports programs is normal and healthy, and anyone who disagrees is considered pompous, prissy, and pretentious.
            Of course, enjoyment and admiration of popular teams and athletes has long been a part of American culture, and, like millions of my own generation, I spent a good portion of my boyhood as an avid sports fan when I was not actually playing basketball.  But what was once a wholesome diversion has evolved for too many Americans into an obsessive preoccupation that is divorced from all sense of balance, proportion, or common sense. Consider just a few examples of bloated excess in big-time spectator sports that readily come to mind:

·         Governments and their taxpayers continue to finance obscenely lavish sports stadiums with public funds and subsidies, to the tune of about $500 million per year (the Dallas Cowboys' new stadium alone cost some $1.2 billion).  These pro sports Taj Mahals are constructed for the benefit of billionaire owners, multi-millionaire players, and mostly affluent ticket-holders, with funds that would be better spent on police forces, firefighters, and public schools in cities mired in crime, degenerate slums, and third-rate school systems.  That the mass of citizens do not rise up in rebellion against this grotesque misallocation of resources is mainly attributable to those citizens' own delusional addiction to the sports programs in question.
                                                Dallas Cowboys' $1.2 BILLION stadium

·         The salaries of coaches and professional athletes have long surpassed all bounds of sanity and proportion.  The average salaries of collegiate football and basketball coaches are over $1.6 million, with those at the high end making $5.5 million.  Even assistant coaches at major universities can earn salaries in the mid six figures.  The average salaries of professional basketball, baseball, and football players are around $5.15 million, $3.3 million, and $2 million, respectively, with the so-called superstars making astronomically more than that.  Meanwhile, the physicians who save and prolong our lives earn only a small fraction of those salaries even after ten years of med school, internship, and residency, and many years of practice.  And while pro athletes who are fraudulently identified as "warriors" collect their millions in between their off-the-field misbehaviors, the real warriors of the Armed Forces who fight our wars, including the man who shot Bin Laden, often struggle to find any job at all.  And those who lamely defend this lunacy on the argument that it is merely the natural workings of our capitalist system miss the mark entirely – the point is not that such imbalances should be outlawed or regulated by government, but that the value system that creates them is grotesquely distorted.  The fault, dear Brutus, is in ourselves.

·         In what has become a truly bizarre media circus, legions of deadly serious "sports reporters" line up behind their laptops every year to breathlessly report the announcements of 17-year-old high school football players that they will deign to accept what are laughably still called "scholarships" to the universities of their choice.  This media-created sideshow has evolved into a highly-publicized annual "event," and has even been accorded an official title with portentous initial caps, like the Super Bowl:  "National Signing Day."  Preening in the glow of their instant celebrity, the oversized teenage athletes make their announcements behind podiums and banks of microphones with all the fanfare of a press conference announcing the nomination of a Supreme Court Justice; and the hundreds of fawning reporters who attend this contrived farce treat it with equivalent gravity.  And weeks later the whole twisted scenario is repeated, except on an even larger and more elaborate scale, when an even larger horde of media drones and sycophants gathers to broadcast the spectacle of collegiate "scholar-athletes" donning the caps of the NFL teams that have drafted them.  The breathless drama with which the selections are announced is beyond satire.

·         While in past decades little boys would wear the ball caps of their favorite teams, we have now come to take for granted the odd custom of middle-aged men routinely wearing jerseys boldly emblazoned with the names of their favorite pro athletes.  Putting aside the questionable character of some of those same athletes, exactly when did it become commonplace and accepted for presumably mature grown men to act like hero-worshipping little boys?  It is hard to imagine the men of the World War II generation (aptly called The Greatest Generation), in contrast, acting as poster-boys for pro athletes; they were too comfortable in their own skin.  What makes this curious phenomenon even harder to fathom is that the fans in question pay very high prices for the "privilege" of providing the players free publicity on their billboard shirtbacks.

·         The NFL's Super Bowl has assumed the status of a grotesque and gargantuan Circus Maximus that would put the excesses of Caligulan Rome to shame.  Fellini in his wildest dreams would have difficulty conjuring a more sordid spectacle, one that at once combines gladiatorial violence, sexual exhibitionism, conspicuous gluttony, and Madison Avenue Madness in a lurid extravaganza that originated as a simple game.  Persons who would otherwise recoil at the displays of vulgar excess that have become hallmarks of this national revelry feel somehow obligated not only to accept it, but to enthusiastically embrace it at one of the millions of Super Bowl parties that have become a National Ritual.  For at least two weeks preceding the game, whole batteries of sports commentators devote thousands of program hours to painstakingly discussing every possible aspect and angle of the teams, the players, and even the entertainers and the advertisements that add to the grotesque glitter surrounding the game.  Had anything approaching such minute analysis been devoted to the National Health Care bill or to the national debt and deficit crises, the Nation might have avoided the disasters about to descend upon it.

·         The celebration of sports victories has expanded out of all proportion to the significance of the games themselves.  A routine victory over a conference rival in college basketball now invariably triggers a mandatory student-body "rush" onto the court, an effusion once reserved for a truly special event, such as an upset victory in a major championship game.  A really significant victory of the latter kind is now apt to trigger a virtual public riot, replete with mobs rampaging in the streets and mass acts of pyromania that would put the Brits' Guy Fawkes Day bonfires to shame.  But the excess at the collegiate level is nothing compared to the grossly outsized celebrations and ceremonies that follow a championship victory in the major pro sports.  Municipal parades reminiscent of those that honored our troops for winning World War II are now de rigueur to honor the victors of these staged corporate contests – even though the honorees have merely performed the services for which they are excessively compensated in the first place.  And for some odd reason beyond rational understanding, it has somehow become mandatory for the President to invite every pro sports championship team to be honored at a staged White House ceremony, as though they had earned the Congressional Medal of Honor.

·         None of the sports excess outlined above would have occurred, of course, without the cultivation, complicity, subsidy, and support of the corporate sports media.  Whole networks (both TV and radio), most notably the notorious and odious ESPN, are devoted to 24-hour per day sports coverage.  The never-ending cycle of media promotion and sensationalism both stimulates and reinforces the sports public's seemingly insatiable appetite for its "sports fix."
            Participatory sports activities continue to occupy a positive and enjoyable place in American life, and the expansion of opportunities for participation in an enormous variety of sports for persons of all ages is one of the more beneficial developments of our age.  But the disturbing excesses and distortions that have evolved around Big Time spectator and media sports are another matter altogether.  Restoring some sense of proportion in this increasingly bizarre arena may seem a hopeless task, but unless and until that happens, our Big Time Sports culture will continue to undermine the very virtues and ideals that sports were intended to develop.


Monday, February 4, 2013


                Although I have written recently about the Carpenters, I have no hesitation in posting the piece below today, on the 30th anniversary of the passing of the twentieth century's greatest and most interesting female vocalist -- the unforgettable Karen Carpenter.   The Drummer Girl from Downey, California, continues to provide a musical beacon of innocence and light in a darker world.

                                                            * * * *

                            Karen Carpenter's Burial Site, Thousand Oaks, CA

            In the summer of 1970, pop radio's relentless cacophony of hard rock, acid rock, and Motown was interrupted by something delightfully offbeat and original.  It was the pure and mellifluous contralto of a guileless young girl, accompanied by a seamless arrangement of instrumental harmonies and multi-layered voice-overs that sounded like nothing anyone had heard before, this side of Les Paul and Mary Ford.  And if this intriguing group's multi-tracked vocals were not enough of a novelty, the ingénue with the angelic voice turned out to be a prodigious drummer as well, who could riff the Ludwigs with infectious verve and authority – the first female drummer in history to achieve any significant prominence and recognition.

            The breakthrough song was "Close to You," the drummer-girl was Karen Carpenter, and the captivating new sound was the brainchild of Karen's creative big brother, Richard.  "Close to You" quickly soared to #1 on the Billboard chart and the Carpenters were off to the races.  Between 1970 and 1975, hit after Carpenters hit rose to the top of the charts in dizzying succession.  "We've Only Just Begun," "For All We Know," "Top of the World," "Superstar," "Rainy Days and Mondays," "Yesterday Once More," and countless other classics made the likeable, clean-cut siblings from Downey, California, instant international superstars.  Their popularity reached phenomenal heights in countries as varied as England, Holland, and Japan (and years later, even China), where their concerts drew record-setting crowds, they were sometimes mobbed like the Beatles, and their records achieved international sales that ultimately exceeded their success in the U.S.  And many thousands of 1970's newlyweds chose the dawn-like romantic optimism of "We've Only Just Begun" as their wedding song.
             But 30 years ago today, on February 4, 1983, the Carpenters' dazzling melodic run came to a heart-rending and premature conclusion.  After some seven years of anorexic self-starvation, Miss Carpenter succumbed to the debilitating effects of a disease that was then only dimly understood.  She was only 32.  Although in the last months of her life she had seemingly overcome the compulsive disorder, resumed reasonable eating, and regained weight, the years of nutritional deprivation had critically sapped the strength of her weakened constitution.  The world had lost a vocalist of unique virtuosity and emotional conviction, a pioneering female drummer, and, even more sadly, an unforgettable icon of romantic innocence.  Karen Carpenter was The Girl Next Door writ large.  But just beneath the rich, round tones of her soothing ballads lurked a subtle but distinctive undercurrent of melancholy – what one discerning critic later described as "the sound of a human heart breaking."

            In the annals of underappreciated heroines, few can surpass the painful experience of Miss Carpenter.  The extraordinary success she and her brother achieved on the record charts and concert stages of the world was rivaled only by the harsh and mean-spirited vituperation that was inflicted on them at home by the fashionable music critics and cultural cognoscenti of that disjointed era.

            The condescending criticism had little to do with the Carpenters' music itself, which was indisputably superior both in Karen's vocals and Richard's arrangements.  But in what can only be described as a bizarre twist of cultural judgment, the Carpenters were mocked and belittled by the critics for the very qualities that attracted the affection of the many millions who bought their records – they were civil, good-natured, sincere, modest, and romantic.  In an era when tout le monde was embracing free love and egotistical exhibitionism, a demure suburban chick from unfashionable Downey who wore pinafores, cameo lockets, and lace collars up to her neck offered an easy object of ridicule.  Even worse, the Carpenters' songs were devoid of "attitude," raunchiness, or rebellion, and they never made the slightest effort to deny what they were or where they came from – middle-class white kids from suburban SoCal.  But rather than receiving due credit for their authenticity and willingness to sail against the winds of the times, the Carpenters were smugly castigated as too vanilla and "too white."

            While the stinging critical dismissal of their work failed to undercut their overwhelming popularity in the first half of the 1970's – when they were probably the most successful recording artists in the world – it eventually took its toll when it achieved the status of a kind of received truth among the arbiters of hip culture in America.  Karen's descent into anorexia and Richard's difficulties with prescription sleeping pills soon followed, and the Carpenters never recovered the productive heights they had maintained from 1970 to 1975.

            To this day, the scornful seed planted by the critics and hipsters of the post-Woodstock culture has undermined the Carpenters' musical legacy in the United States, even while that legacy has grown and prospered internationally (see my previous posts below).  Among other things, the liberals who dominate American musical orthodoxy could never forgive the Carpenters for graciously accepting an invitation from President Nixon to perform at the White House during the height of the Watergate scandal.  Under the incredible tension of that moment, exacerbated by the confining space of the small White House stage, Karen performed with extraordinary grace, poise, and good cheer.  Yet the Carpenters' classy performance at the White House (which is viewable on a YouTube video) has always been considered a strike against them by the left-oriented musical establishment.  The perverse bias built up against the Carpenters over the years for their association with white, middle-class values has not only greatly reduced their airplay on so-called Classic Rock radio stations, but is undoubtedly responsible for their continued exclusion from the Rock and Roll Hall of Fame, despite credentials which dwarf those of most inductees into that Philistine institution.

             Of all the cultural misjudgments inflicted on Karen Carpenter, none was more mindless or more ironic than the canard that the Carpenters were one-dimensional poster-children whose music was little more than superficial treacle.  Behind the flawless white teeth, the shining Breck Girl hair, and the puffed-shoulder dresses lay a deep and lovely lake of complex emotions that swelled to the surface in the songs that gave voice to the dark and melancholy side of romantic loss and emotional isolation.  Karen's moody rendition of Leon Russell's "Superstar" is the most prominent of these, but the Carpenters' catalogue includes numerous lesser-known ballads, such as "Crescent Noon" and "Eve," where she pushed her signature lower register to emotional depths rarely approached in contemporary pop.  And she delivered some of these haunting renditions when she was only 19 or 20 years old, revealing a vocal maturity that now seems prophetic in the hindsight of her tragically shortened life.

          Whatever the verdict of the nameless Woodstock-era critics, Karen Carpenter's place among the great pop vocalists of the 20th Century is firmly recognized in a far more telling and credible quarter – the enduring admiration of her performing peers, past and present.  Musical giants from Henry Mancini to Paul McCartney to Burt Bacharach have acknowledged her stature among the great ladies of song.  When the Carpenters happily agreed to record a song Mancini's daughter had written ("Sometimes"), the maestro remarked:  "It was like having Sinatra do your song."  Carpenters biographer Ray Coleman records that McCartney and his brother Michael described Karen as "the best female voice in the world, melodic, tuneful, distinctive."  And Elton John has praised her as "one of the greatest voices of our lifetime."  To experience one of her live performances, such as her 1971 BBC concert (see the YouTube link above), is to understand the appreciation these giants had for Miss Carpenter.

            Karen's admirers run the gamut, crossing both musical and cultural divides.  Shania Twain called her "my favorite singer of all time . . . .  She has the voice of perfection."  Madonna, whom one might consider the antithesis of the very modest Miss Carpenter, is actually a great admirer, acknowledging, "I'm completely influenced by her harmonic sensibility."  Even Barbra Streisand has extolled Karen's voice as a "marvelous instrument."

            Indeed, some of Karen's strongest admirers are found in the most unexpected quarters.  After revealing that he listened to the Carpenters on his iPod, the inimitable Alice Cooper was asked why.  He tersely responded, "They're the best."  Actor Nicolas Cage made the motorcycle tough guy he played in the film "Ghost Rider" a dedicated Karen Carpenter fan, and the film's soundtrack includes not only the haunting "Superstar," but an original instrumental piece called "A Thing for Karen Carpenter."  And more recently, the eccentric Tim Burton featured a video of Karen singing "Top of the World" in his film "Dark Shadows," moving the Johnny Depp vampire character to exclaim, "Reveal thyself, tiny songstress!"  Interestingly and somewhat ironically, large elements of the gay community are especially fond of Miss Carpenter's music and persona, possibly because they identify with the self-image struggles that led to her anorexia.

             The enduring affection for Karen's unique vocals extends even to the denizens of the Alternative Rock world.  In 1994, a motley assemblage of alt rockers combined to produce what can only be described as a highly singular tribute album called "If I Were a Carpenter."  The free-wheeling interpretations of Carpenters classics by cutting-edge groups like the Cranberries, Sonic Youth, and Japan's Shonen Knife ranged from the slightly distorted to the completely over-the-top, but all reflected genuine admiration and affection for the Carpenters' sound and Karen's unforgettable vocals.  And just two years ago, California "roots rock" legend Dave Alvin, a fellow Downey native, recorded a moving tribute ballad about Karen called "Downey Girl."

            Thankfully, 30 years after her death, the Drummer Girl's mellifluous voice, and the bright and rosy face of her first television appearances, are still accessible in a forum Karen never lived to see.  Unlike commercial radio, the Internet is an unfiltered democratic medium, and tens of millions of her fans log onto YouTube to resurrect the timeless images of Karen Carpenter in her golden years.  The raunchy caravan of today's musical mayhem moves on, but the Drummer Girl's incomparable voice endures as a welcoming refuge for those who crave a gentler song.