What It Takes to Have a ‘Good’ Marriage

Over fifty years of marriage should give me some authority to share what I’ve learned about the most important union one can have and the commitment it takes to make a ‘good’ marriage…

An ever-growing number of marriages end in divorce and even more are in name only. In reality, just a small percentage can be considered ‘good’ and none ‘perfect’. Why, when two people appear to love each other so deeply at the start of their journey, do so many of these relationships end tragically?

Some would say that today most are not willing to make the commitment necessary to build a lasting marriage. After all, marriage is work that takes continued nurturing, not only to grow but to even last. It is two people standing together to face not only the problems that happen to each as individuals, but collectively as well. It is two people doing their part in their own way; each bringing certain strengths as well as weaknesses to the relationship. Not keeping score of who does more or less.

Marriage is not easy. The good marriages work only when both partners make the commitment. When both put the other first. Many start out that way, but gradually one or both partners begin to falter. One can’t do it alone.

Most marriages fail for one of three reasons— fear, selfishness or abandoning one’s commitment. The first is fear: an unwillingness to make a total commitment for fear of losing one’s identity in the relationship, or a fear that if he or she lets down the protective barriers that were so carefully constructed while growing up, they will be vulnerable to hurt and pain. The second is selfishness— never being able to put the partner’s needs ahead of his or her own. The third is forgetting the commitment they made to God when their relationship was formalized.

It is frequently said that marriages fail because people change. Everyone changes. What often happens is a loss of self-confidence. In an attempt to prove one’s own self-esteem, one looks in other directions. It may be to exercise, a different career, or a casual affair. While this ‘searching’ was not meant to destroy the relationship, it either does, or permanently damages the trust that holds the marriage together.

It is only a fortunate few who enter a relationship where both individuals are brave enough and unselfish enough to make the sacrifices. But for them, a relationship is born that is second to none— a true partnership to share the pleasure and the pain, the joy and the sorrow, and the good and the bad. Each partner finds that they don’t lose their identity; because of their mutual support, they grow and flourish. Sure, they are vulnerable to hurt and pain if the relationship does not work out, but nothing in life comes without some risk.

In the movie Love Story, there is the quote, “Love means never having to say you’re sorry.” That’s not really the way life goes. It should have been, “Love means always being willing to say you’re sorry.” Not necessarily sorry for being wrong, but sorry because whatever happened might have hurt the one they love.

If a good marriage is so fulfilling, why then are there so few of them? What is the secret that makes these relationships special? Each partner has different ways to keep their love alive.

The first is trust. One can’t keep looking over their shoulder and ahead at the same time. Next is respect for the partner and his/her needs. Forgiveness without reservations, knowing that any hurt was not done intentionally. And finally, loving that person, more than one’s self. Finding that the greatest pleasure comes from making their partner happy.

Good marriages don’t just happen. They are built by two people, who love each other, trust each other, forgive each other and respect each other.

Many never stay around to see the rewards of their commitment. But then again, some very fortunate ones do, and a lifetime together never seems long enough!

Rob Tenery            1998 (re-edited 2015)

We Live in One Big Infomercial

What do Keith Olbermann and Mark Levin have in common? It might be open to debate, but I would call both of them smart and well informed. They both, however, take the same facts and spin out almost completely different messages. Not that either is lying; each is telling the truth in his own way. But what they share is that both bring the debate down to a lower level with their degrading characterizations of individuals with whom they disagree. Even those who support their particular position on a subject are either turned off or become as radical as they are. In either case, these commentators, and those who support them, are often left out of the debate by those who are willing to look at the differing positions in a more reasoned way. The two make for ‘good press’ but not necessarily for meaningful dialogue. The good they might accomplish is that they bring attention to subjects that can then be debated in a more subdued environment, but usually without them.

Being conservative, I support Mark Levin’s positions almost down the line, but when he labels our Secretary of State ‘mashed potato face’, it reflects poorly on Levin, even if his negative point about John Kerry was valid. That alone exonerates Kerry to a certain extent. Levin and Olbermann are just trying to entertain their listeners or viewers with their added ‘color’, but they, and many other media outlets, are instrumental in adding to the unrest in this country. Maybe even more important, they and other commentators are influencing the tone and manner in which we dialogue. We have become less understanding because, in most situations, our minds are already made up. Far too many postings on Facebook are nothing more than unsubstantiated babble with an occasional shred of truth dropped in. As with Olbermann and Levin, most of us know who they are and take their comments accordingly.

Reporters in this country don’t just report the news any more. They have become journalists, commentators and editors who analyze the facts and slant their message, before releasing it to the public. Most reflect the bias of their upbringing, their education and their employer. The consumers get the news from these sources, but not all these sources— usually only the ones that are sympathetic to their points of view. We, the consumers, are basically lazy. Rather than doing the background work to come to the opinions we espouse, we rely on others (our preselected media sources) to shape our opinions for us.

Let’s examine how many, if not most, of us come to an opinion on a particular political candidate. First, we take note of the candidate’s political party. Next, we check with the media sources that we rely on, which virtually always lean the same way we do. We may also contact individuals with whom we regularly communicate for their points of view. Finally, we might interact with the candidates by listening to their recorded or printed excerpts, or even attending a rally where they appear. In that order. That’s the exact reverse of the way our positions should be formulated. Truth be told, most of us who do care about this country don’t have an open mind. How many people listen to both CNBC and Fox on a regular basis? Very few, I would guess. Thus, elections are won or lost by how well those of us who are concerned about the issues that affect this country get those who don’t seem to care as much out to vote.

Another controversial commentator, Rush Limbaugh, once said, “You can never change a liberal’s mind.” I think he was only half-right, because he didn’t include conservatives as well.

Our ‘Kick It Down the Road’ Generation

My generation came along at just the right time. We saw the birth of Rock and Roll, Dwight Eisenhower was President, and Chet Huntley and David Brinkley were broadcasting each night at 5:30. Coming off the victory of World War II and winding down from the Korean War, we were proud to be called Americans. In 1955, there was virtually no inflation; the Consumer Price Index rose just 0.4% and unemployment hung around 5%. Rates on Treasury bonds were less than 3%. Real GDP rose 7.2% and the Dow Jones Industrial Average increased by close to 20%.

Dial ahead 60 years to 2015. Although we’ve been in wars in Vietnam, Iraq and Afghanistan over that time, we have not won any of them, except for the short sojourn in Kuwait. The predominant music of the day raps on about drugs, calls ‘cops’ the enemy and labels girls as whores. The news is what they want to tell us, depending on whether the media outlet is liberal or conservative. Close to 50% of the population is getting some sort of government subsidy. Our President has recommended more entitlement spending in his new budget, even though our national debt is increasing at $3 million per minute.

The shooting deaths of Trayvon Martin in Florida and Michael Brown in Ferguson, Missouri have created unrest in the Black community. Having elected a black President, who appointed another black American as Attorney General, we could have hoped they would take the lead in easing racial tensions. Instead, these differences appear to have worsened. Islamic terrorists are trying to force their radical religion on the rest of the world, and the most extreme (ISIS) has set its goal on reaching the White House. Our administration has failed to comply with current law and adequately close our southern border, while thousands of illegal children and potential terrorists pour across each day. Our standing in the world community is taking a nosedive as we fail to take a leadership position in defending the world against terrorism. Our President has distanced himself from the current leadership in Israel, while possibly being manipulated by Iran as it continues to push ahead with its nuclear weapons program.

What can we do to address our skyrocketing financial commitments, increasing racial disharmony, a porous border that hurts our working class and opens us up to terrorism and our dwindling influence in the world community? A start might be to call our children and apologize to them, because we haven’t done a very good job of carrying on what our fathers gave to us.

What Can President Obama Learn from Neville Chamberlain?

History has a way of repeating itself. Smart people take advantage of that by learning from the mistakes of others or building on their accomplishments. Only a fool disregards the past.

Many judge Bush 43 as too hawkish, involving the United States in wars in Afghanistan and Iraq. Although the Soviet Union had ended its nine-year involvement in Afghanistan because of the interminable nature of the conflict there, Bush still felt compelled to act in response to the 9/11 attack on this country. Like Russia, the United States has learned that wars between religious factions are virtually unwinnable. But then again, Bush, with Congressional approval, plunged headlong into attacking Iraq after Saddam Hussein’s regime failed to comply with United Nations resolutions to allow inspectors to verify that it had eliminated its cache of ‘weapons of mass destruction.’ Some have also claimed that Bush’s aggression into Iraq was possibly a reprisal for Hussein’s invasion of Kuwait when the elder Bush was President. Right or wrong, history will place the yoke of these wars on his back.

Most rate Ronald Reagan very high in his ability to handle foreign affairs. During his tenure as President, he sought a massive buildup in United States military capabilities, which led to the victory in Grenada and, ultimately, an end to the Cold War with Russia. He promoted new and more advanced military technologies and granted aid to paramilitary forces that were committed to overthrowing communist and leftist governments. In 1986, in response to learning that Libyan leader Muammar Kaddafi was behind the terrorist bombing of the La Belle Discotheque in Berlin, Germany, which killed two American soldiers and injured 150 more, Reagan authorized what became known as Operation El Dorado Canyon. United States air and naval forces launched a series of strikes against the headquarters, terrorist facilities and military assets that supported Kaddafi. Dozens were reported killed, including Kaddafi’s daughter.

Neville Chamberlain, Prime Minister of the United Kingdom from May 1937 to May 1940, will be remembered most for his foreign policy of appeasement, because of his endorsement and signing of the Munich Agreement in 1938, which conceded to Germany the German-speaking Sudetenland region of Czechoslovakia. Hitler then invaded Poland on September 1, 1939. Chamberlain, having pledged Britain to defend Poland’s independence, declared war on Germany on September 3. When the Labour and Liberal parties would not join a government headed by him, Chamberlain resigned, but he stayed on until his death as a member of the War Cabinet of his successor, Winston Churchill.

Churchill is regarded as one of the greatest wartime leaders of the 20th century, because he refused to consider surrendering during the early days of World War II when his country and the countries of the British Empire were alone in their opposition to Hitler’s Germany.

What can be learned from these leaders? First, if a war is inevitable, it is better fought on foreign soil. Second, negotiating is more effective if it comes from a position of strength. Isolationism is an idealistic theory. If we have something someone else wants, they will eventually come and take it from us, unless we are the stronger of the two parties. Terrorism, in some ways, defies that logic. The proponents of terrorism are more concerned with inflicting damage and imposing their form of dominance on others than with just taking our property. The latest terrorist threat is that ISIS appears to have added another element: the eradication of those who oppose its radical way of thinking.

Our leader, President Barack Obama, should learn from George W. Bush that we need to fight our enemies on their soil. From Ronald Reagan, that we cannot negotiate from the position of a weakened military. From Neville Chamberlain, that yielding to those elements that advocate dominance over others only encourages them to take more. Finally, from Sir Winston Churchill, who inspired his country and the free world to never give up.

On the other hand, maybe the world can learn from President Obama that the role of the United States should change from a world power to an entitlement state. That the United States should pull out of wars on foreign lands and instead treat them as police actions when others bring their violence to this country. Lend token support while terrorist dictatorships wipe out populations that don’t adhere to their religious beliefs. Finally, let Russia’s Vladimir Putin, Prime Minister Tammam Salam of Lebanon, Egypt’s President Sisi and almost everyone’s favorite, Israel’s Benjamin Netanyahu, take over as the world leaders. I suppose that is why our President sent the bust of Sir Winston Churchill that resided in the White House back to England when he moved into the Oval Office.

Are We Going to Look the Other Way, Again?

Do 30,000+ radical extremist Islamic militants in the Middle East really pose a threat to the rest of the world? When it first began, did Hitler’s Nazi Party pose a threat to world peace? The answer to the latter nearly brought the world to its knees before Germany fell. What do these two movements have in common?

What we have here is a clash of ideologies. Although it has been reported that only slightly over 20% of the world’s one and a half billion Muslims are sympathetic to the ISIS cause, that number alone is astronomical.

Make no mistake, what is happening in the Middle East is a Holy War with immense implications for the rest of the world. The participants of ISIS, mostly Sunni Muslims, have declared war on their fellow Shia Muslims, Jews and Christians. In a matter of months, their ranks have grown to be in excess of 30,000, and they have established their Caliphate state by occupying portions of Iraq and Syria. Already their presence in other adjacent countries is being felt. This intense fervor comes not just from a desire to acquire territory, but from an ideology of forcing their Muslim religion, under Sharia law, on those they conquer and killing those who don’t comply. It appears no one who comes under the control of ISIS is immune. Their brutality is shocking— from beheadings and torchings to mass killings of innocent children and the selling of ‘donor’ organs on the black market. Convert to their form of the Muslim religion or die.

This is not so different from when Hitler’s Germany took control of Austria in March 1938 and then, with the acquiescence of England and France, completed the takeover of Czechoslovakia by March 1939. Citing the harsh limitations put on Germany by the Treaty of Versailles after World War I, Hitler’s pretext was Germany’s right to acquire land where German-speaking people lived (the policy of Lebensraum). But quickly, the takeover became more. Jews, from all parts of the German-occupied territory, were rounded up and put in camps, then by the millions worked to death or systematically exterminated in gas chambers, while the majority of German citizens either didn’t know or looked the other way at what was going on right under their noses. At the end of World War II, the world community promised this almost unimaginable brutality of ethnic cleansing would never be allowed to happen again. In sub-Saharan Africa alone, history has proven that promise wrong on numerous occasions.

What seems different about the participants of ISIS, compared with Hitler’s Nazis, is their willingness to sacrifice their own lives for their ideology and their senseless brutality. A brutality, broadcast to the world community, that almost begs for reprisal. Either the ISIS terrorists feel the world community is so burned out from the protracted wars in Iraq and Afghanistan that it is unwilling to make the commitment to eradicate them, or they want to draw the opposing countries into a third World War— the flash-points being Israel and Iran’s development of a nuclear weapon.

The complicating problems are immense: The American public is burned out on getting into another war that results in ‘boots on the ground.’ They have accepted drone attacks on Al-Qaeda leaders and targeted bombings. But the ISIS confrontation is different— the terrorists are too spread out, and more forces pour into their occupied territories daily. The limited sorties flown by US forces, the retaliation by Jordan after their captured pilot was burned to death, and Egypt’s response when 21 of their citizens were beheaded in a mass execution just because they were Christians are clearly not enough to bring a halt to this expanding threat by ISIS. The President’s representatives have made it known that the US military has planned an invasion into the ISIS-controlled territory to retake Mosul in April using Iraqi troops. So, there is hope other, less public, plans to eradicate ISIS are in the works.

The United States has been the world’s policeman since the end of World War II. President Obama campaigned and won two elections on the promise he would extricate this country from Iraq and Afghanistan. Additionally, even though Obama claims to be a Christian, his Muslim background has shaped his belief that this movement represents only a small part of those who hold to the Muslim faith. So, he is fundamentally torn between dealing with ISIS as a band of fanatical terrorists and as the religious fanatics they really are. Although he recognizes the threat they pose, I feel he greatly underestimates their future capabilities if they are not stopped soon.

As ISIS advances its agenda of domination and destabilization across the Middle East, it has already set its eyes on Rome and, ultimately, Washington, D.C. Is this just wishful thinking by a band of ‘junior varsity’ terrorists, as President Obama characterized them in an interview with New Yorker magazine, released January 20, 2014? Close observers of their advances in taking large areas of Iraq and Syria, and of their brutality toward those they capture, would probably disagree.

History has a way of repeating itself.  Hopefully, the world leaders won’t keep looking the other way, until it’s too late!

The Fall of the “American Empire”?

Growing up, Sunday mornings were when my mother would drag my dress shirt, cuffed pants, seersucker jacket, clip-on bow tie and one-day-a-week pair of shoes out of the closet and lay them neatly at the foot of my unmade bed. It was church time, a weekly ritual for as long as I can remember. For me and my baby sister, it wasn’t really church, but Sunday school. That would come when we were older. Going to church was part of our lives. Granted, I’ve not been as disciplined as my parents, but the lessons they taught me have shaped my thinking throughout my life.

The extra time it took to dress up, instead of just dropping into yesterday’s jeans and a fresh t-shirt, was all part of the process of preparing for church. Not that I couldn’t have benefitted just as much in my jeans and t-shirt, but there was a certain decorum that was acceptable in those days. I thought of it as my church uniform.

Dial ahead thirty years to many of the Pentecostal and so-called Cowboy churches, where the tradition of dressing up was no longer expected. Many attendees still did, but many didn’t. The argument put forward was that by relaxing the ‘dress code’ more people would go to church. And that’s what counted! The standards that were fine for our parents had passed.

This change struck home when my wife and I attended our church on a recent Sunday. The greeters who welcomed and directed us to a pew weren’t wearing ties. Many of the older attendees scattered around the sanctuary were still dressed in their Sunday best, but almost as many were dressed anywhere from dress-casual to jeans, even one in her sweat suit, as if she had just dropped by after an early-morning jog.

It was a special Sunday, called the Kirkin’ o’ th’ Tartan, when our church’s Scottish roots were celebrated by some of the members bringing to the service tartans of their clans’ heritage. We were barely seated when the music of bagpipes filled the air. Leading the choir and the minister in procession into the sanctuary was the North Texas Caledonian Pipes and Drums Corps, dressed in their full Scottish regalia, with their bagpipes tightly tucked under their arms. The look on their faces told it all as they proudly carried on the traditions of their forefathers’ native country. I was struck by the irony of holding on to the traditions of the country where our church had its roots while throwing other traditions aside just to get more people to attend. I wondered if the effect would have been the same if the bagpipe corps had turned in their kilts for jeans or a sweat suit.

There is something to be said for decorum— behavior in keeping with good taste and propriety. Tradition for tradition’s sake, without meaning, only stifles progress. Dress codes in church may hold back attendance, but they show respect for the church. They also demonstrate sacrifice, the willingness to take the extra time to ‘clean up,’ so to speak. Does it matter to God? Probably not! But it may matter to others who attend the service.

Dress code in church is only one example of how social norms are ‘dumbing down.’ The freer use of curse words in public, toothpicks hanging out of the mouth, and chewing gum in public are the most obvious.

Chewing gum in public, especially when it is done by our President, has garnered lots of attention. Recently, the press in India was critical when Obama, in conversation with their Prime Minister Shri Modi, took the gum he was chewing out of his mouth, examined it and then popped it back in. When Obama was photographed ‘chomping’ his gum at the somber 70th-anniversary ceremony of the D-Day landing, the French media had a field day with comments like ‘lack of respect’ and ‘shocking.’ The comment that said it all came in a French tweet that claimed the President’s chewing gum was an example of American class.

Collectively, these examples of changing decorum don’t demonstrate how far our society and our leadership have come in shattering old taboos and setting new norms; they are possibly the first signs of the fall of the “American Empire.”

Physicians Desperately Need an American Medical Association

Physicians need a national organization that speaks with a united voice on issues that are corporatizing this once-noble profession. For without that representation, they become nothing more than independent contractors answering to the highest bidder, and not to their patients.

In the early 1950s, the American Medical Association’s (AMA) membership was almost 75% of the eligible physician population. Today that number hovers in the upper teens, and it includes medical student and resident members. From 1980 to 2002, the membership of the American College of Surgeons (ACS) grew by 55.6%, and from 2002 to the present it has grown by an additional 16%.

Although these statistics don’t exactly equate, they point out a glaring difference: While the AMA’s market share of eligible physician members continues on a long downhill slide, the ACS continues to grow its membership. Understanding this difference is important, since the AMA, even with its low market share, is still usually considered the spokesman when it comes to national issues that affect the medical profession. At some point, if the trend continues, that may no longer be the case. Either some other voice will take the AMA’s place, or the disparate entities that make up the body of practicing physicians will have to fend for themselves.

With respect to its charter, the American College of Surgeons is targeted to advocate for issues that particularly affect the surgical specialties. The AMA tries to represent the needs and desires of all the disciplines that comprise its broad membership, which, all too often, puts it in a no-win situation with at least some of its constituency.

The reasons physicians join organizations generally fall into four categories: education, certification, representation and duty. The first three are obvious, although they vary depending on area of interest and locality. It is the fourth, duty to the profession, that has changed this paradigm— a moral obligation that supersedes personal reward.

The American public’s support for this country’s efforts during World War II, versus the Vietnam conflict, seems the most analogous comparison. In the former, there was almost 100% support, both on the battlefield and at home. During the Vietnam effort, the goals shifted from defending liberty to protesting— an attitude shift from public good to self-fulfillment.

It is the responsibility of the leadership of these two organizations, which claim to speak for and to the physicians they represent, to reevaluate and adjust when necessary along three basics: responsibility, relevance and representation. Responsibility to their members, and, just as important, to the patients these physicians serve. Staying relevant to the current needs and wishes of the medical profession and their membership. Finally, representing the physicians’ best interests in those areas that impact the profession.

These basics require ongoing diligence by the leadership concerning the changing norms and expectations of both their physician populations and society as a whole— an understanding of the big picture. Unfortunately, that is where entrenched leadership often fails. Caught up with their own personal agendas and with precepts established when the climate was different, they often try to lead rather than follow.

Isn’t that the question? When should leadership lead, and when should it follow the wishes of its constituency? When does the collective knowledge of those who are elected to lead outweigh the apparent wishes of the membership?

While the ACS continues to grow its membership, the drop in AMA penetration from close to 75% of eligible members in the 1950s to somewhere in the upper teens deserves a closer look.

Although the AMA is more representative of the body of physicians as a whole, it is less representative of many of the specific groups that make up its membership. That is the major reason the ACS continues to grow its penetration rate in the surgical sectors of the profession.

The membership drop in the AMA speaks particularly to apathy on the part of today’s physicians and a lack of perceived value. Potential members feel that they can get more targeted representation through other organizations and alliances. But it is the loss of allegiance to duty that is the most critical. This social trend affects not only medical organizations, but traditional churches, many social organizations and volunteer efforts too. It is a re-prioritization away from traditional allegiances and toward arrangements that create more direct benefit to the participants.

Because of its diverse audience, the AMA has always had difficulty with communication. With the loss of American Medical News, revelations about the regulatory and political issues that are impacting the medical profession have deteriorated even further. These subjects are not routinely part of JAMA’s purview, or even that of the specialty periodicals the AMA publishes. Nor do they come through the minutes of the AMA meetings, which are usually distributed through the state organizations’ periodicals and the minutes of some of the specialty societies. If there ever was a wake-up call, it is for the AMA to prove its relevance.

The Patient Protection and Affordable Care Act (ACA) appears to be the ‘sword’ on which the AMA has chosen to make its stand. Virtually every poll of physicians has shown, and continues to show, opposition to the legislation in its current form. Still, the AMA publicly stands in support of most of the mandates in the ACA.

Although no one, except maybe the authors of the original legislation, understood the full implications of the proposed law, the AMA’s almost blind support set the path for a new era for the organization— compliance. Its position, put forward by its Board of Trustees and supported by its House of Delegates, is one of compliance: evaluate ways that physicians, as providers of care, and small-business employers can maintain compliance with the proposed Employer and Individual Mandate clauses. The stated intent was to do this without eliminating patients’ choice of their doctors and many of their health care plans. At least, that’s what the American public was told as the plan was initially touted.

Instead of supporting changes to those aspects of the law that potentially create access and financial hardships for employers and patients (the Employer and Individual Mandate clauses), the AMA appears to look for ways to improve compliance with those mandates. Instead of saying ‘no’ to the government mandates concerning ICD-10 guidelines (to which it has recently raised objections), EMR stipulations, and punitive rules concerning hospitals that treat Medicare recipients, the AMA appears to sit idly by, apparently afraid to ‘ruffle the feathers’ of the Centers for Medicare and Medicaid Services (CMS) and the Administration, which are the root causes of these intrusions into health care delivery.

We must question why the AMA continues to lose penetration in the physician population while the ACS maintains and grows its own. Maybe it’s because the ACS is seen as an advocacy organization, with its frequent postings to its membership through ACS NewsScope and other methods of communication, while the AMA loiters in the relative silence of compliance.

This concern is not about the number of members in either organization, but about being able to protect the precepts of this noble profession and the health and wellbeing of the patients they serve.

Some might claim the AMA is too big, because it tries to represent too many divergent interests. With membership penetration only in the upper teens, some would say it’s not big enough. The numbers may not be what matters; what matters is being able to deliver the message!

Physicians desperately do need an American Medical Association, but is it the one we have now?

For additional thoughts:

Why the AMA Endorses Obamacare— But Your Doctor Does Not, Lee Heib, M.D., http://www.theblaze.com/contributions/why-the-ama-endorses-obamacare-but-your-doctor-does-not/

Our Changing Health Care System Since the Inception of the Affordable Care Act, The American College of Surgeons, https://www.facs.org/advocacy/federal/health-care-reform.

Doctors for Hire

When medical care transitions from private to institutional, as in a socialized medicine delivery system, depersonalization occurs: providers shift their priority from the patients under their care to the system that employs them.

The Medical Group Management Association has shown that physician productivity falls, sometimes by more than 25%, in hospital-based practices versus their counterparts in the private sector. This lost productivity is a consequence of the more fragmented, less accountable care that results from these arrangements.

Most hospitals measure the productivity of the physician practices they purchase in Relative Value Units (RVUs), a formula that Medicare already uses to set doctor-payment rates. Hospitals are beholden to the RVU system only because that is how they get paid. RVUs are supposed to measure how much time and physical effort a doctor requires to perform different clinical endeavors.

All of this lost productivity translates into the loss of what should be a critical factor in the effort to offer more health care while containing costs. Hospitals aren’t buying doctors’ practices because they want to reform the delivery of medical care. They are making these purchases to gain local market share and develop monopolies.1

The Affordable Care Act (ACA) pushes hospital-based practices on the assumption that models that worked well in one community can be made to work everywhere. President Obama has touted “staff models” like the Geisinger Health System in Pennsylvania and the Mayo Clinic in Minnesota that employ doctors and then succeed in reducing costs by closely managing what they do. Yet when integrated delivery networks succeed, they are rarely led by a hospital. The ACA seeks to replicate these institutions nationwide (Exchanges), even though their successes had more to do with local traditions and superior management.1,2

The movement of physicians into an employee status, under the control of these multidisciplinary health care delivery systems, changes more than just their productivity; it potentially changes their priorities and their attitudes.3

To physicians, patients are people in need— vulnerable, sick and afraid. To health care delivery systems, these same people are customers— purchasers of health care services. When physicians’ roles are subordinated to those who employ them, their role as their patients’ advocates comes into conflict with their obligations as employees.

Institutions measure their success by outcomes and the bottom line— physicians by the lives they have helped.

In the past, most doctors held a shared vision of what it meant to be a physician. It was the bedrock on which the medical profession was established and evolved. That vision also served as the foundation on which patients built their trust. As physicians grapple with their new roles as employees, they now have to divide their loyalty among their own patients, the other patients within the system who must share those same resources, and the system itself. For without awareness of these other, potentially conflicting needs, these systems fail.

This realignment is diverting physicians away from addressing the core problems that are eroding this profession’s autonomy. The beneficence and compassion of their forefathers— the very qualities they hoped to emulate when first choosing medicine as their life’s calling— are being strangled by employer demands and the compounding regulations heaped upon them.

This is not to imply any less dedication by physicians today. It is a resetting of their priorities. Although health care, with respect to the science and the outcomes, is vastly better, there is a proportionate increase in the depersonalization of the doctor/patient relationship. Often, the physical examination and the history are secondary to the diagnostic studies. Doctors spend more time updating their electronic medical records and reviewing test results than examining their patients. They spend more time inputting into their computers and talking to consultants than to their patients.

When the encounter changes from relational to transactional, the practice of medicine is no longer a profession, but merely an occupation.

An article by The New York Times (NYT) writer Gardiner Harris, titled More Choose Less Hectic Schedules, highlights the sweeping changes in career choices by the emerging physician population. Telling is its reference to a survey by Merritt Hawkins, a top doctor-recruitment firm, finding that quality of life was more important to new physicians than finances. In growing numbers, they want to work fewer hours or even part time and are willing to take salaried positions to achieve their goals.4

The Merritt Hawkins survey also reported that 51% of the positions the firm filled last year were for hospitals, almost a four-fold increase from eight years ago. Added to that number were private-sector positions that included income guarantees by hospitals.3 If an increasing number of entering and practicing physicians work for or receive income from hospitals, what does this trend say about our profession’s battle against the corporate practice of medicine? There seems to be a contradiction when the organizations that represent physicians fight to block the corporate takeover of our profession, while growing numbers of entering physicians are no longer interested in the ‘private practice’ models of their forefathers.

Even more important, this shift indicates “a sweeping cultural overhaul in medicine’s ethos…from being an individual to a team sport…to the point that many patients now see doctors as interchangeable.”4

The resident physician highlighted in the New York Times article, Dr. Kate Dewar, who has chosen a different career path than her primary care physician father, is quoted as equating her father’s practice to the movie Groundhog Day, in which “the same boring problems recur endlessly.” She goes on to state, “I like to fix stuff and then move on.” Where in her comments does she personalize her effort to the patient instead of the malady?

Depersonalization is not just occurring in medicine. Self-serve gas stations, Home Depot, Wal-Mart and McDonald’s have changed the world. By cutting back on personal services, these giant corporations create cost savings, some of which are passed on to their customers. The delivery of health care has become ‘big business’ controlled by federal regulations, hospital corporations and insurance conglomerates. As referenced in the NYT’s article, in growing numbers, we, the physicians, are letting it happen because of the “sweeping cultural overhaul in medicine’s ethos.”4

Are employed physicians less dedicated than their counterparts in the private sector? Let’s just say they are less consumed with the practice of medicine. Employed physicians are losing control of their profession by legislative fiat (Obamacare), payer mandates and the very corporations they willingly choose to join.

Dr. Kate Dewar’s response in the NYT’s article represents the thinking of many of today’s emerging physicians— focusing on the malady and not the person.

Is the growing trend of triaging through physician extenders (nurse practitioners, physician assistants, etc.) alarming? At the very least, it is adding to the depersonalization of health care delivery.

A sad commentary on a profession that built its reputation on trust!

The soul of the medical profession lies in the hearts of those who share their knowledge and ply their skills to relieve the pain and suffering of those who are in need. Not because of what they receive for their efforts, but because of the good it accomplishes.


  1. Gottlieb, S., American Enterprise Institute.
  2. Graham, J.R., Free of Obamacare Taxes, the Future of Health is Digital, NCPA posting, October 22, 2014.
  3. Tenery, R.M., Working for the Man, Echoes for the Future, November 2014.
  4. Harris, G., More Doctors Say No to Endless Workdays, The New York Times, April 2, 2011.


The Patient Protection and Affordable Care Act (ACA) became law on March 23, 2010. Appearing at a January 2012 symposium, MIT professor Jonathan Gruber stated, “What’s important to remember… if your state hasn’t set up an Exchange, your citizens don’t get their tax credits (that fund their costs of participation in the ACA program), but your citizens still pay the taxes that support this bill.”

Gruber, who advised the Massachusetts legislature when it created Romneycare and the President’s administration when it crafted the Obamacare legislation, went on to explain: “So you’re essentially saying to your citizens, you’re going to pay all the taxes to help all the other states in the country.”

“We just tax the insurance companies. They pass on the higher prices that offset the tax break we get. It ends up being the same thing. It’s a very clever, basic exploitation of the lack of economic understanding of the American voter,” Gruber said in remarks in 2012 that aired on the television show, “On the Record with Greta Van Susteren.” On another occasion, Gruber said, “lack of transparency is a huge political advantage. And basically, call it the stupidity of the American voter, or whatever, but basically that was really critical for the thing (the Affordable Care Act) to pass.” At a separate event, while talking about tax credits in the Affordable Care Act, Gruber said, “American voters are too stupid to understand the difference.”

Recently, the Supreme Court agreed to hear a legal challenge, outlined in the King v. Burwell brief, that asks whether the ACA should be taken literally as written. The core question raised by the brief is whether it is constitutional for citizens in states without state-run Exchanges to use tax credits for payment of their health care fees in the Exchanges.

Even though the Supreme Court has ruled that the ACA is constitutional, its ruling was based on the premise that the law was not a mandate, but a tax that falls under the constitutional authority of the Congress. At the same time, the high court ruled that the individual states reserved the right to decide whether or not to establish an Exchange; if a state declined, the law left it to the federal authority to establish an Exchange there. The Supreme Court will hopefully sort out this question.

The important question raised by Jonathan Gruber’s presentation, made on this occasion and evidently on at least four others, is whether the President and members of his administration willfully deceived the members of Congress and certain influential members of the business and medical communities into supporting his legislation.

Can patients keep their health plan if they want? Can they keep their doctor? The President told them they could. In the literal sense he was right, if they are willing to pay enough for their coverage, or pay out of their own pockets. It would not necessarily be with their previous doctor or their previous insurance carrier, since some of these plans may no longer be available.

What about the health insurance companies that cannot afford to meet the criteria outlined in the ACA mandates for coverage? Are they being forced to raise their rates to noncompetitive levels, or will they drop out of the health care market altogether? Where was that conversation brought into the dialogue before the vote in Congress was taken? Or when eliciting the support of the Board of Directors of the American Association of Retired Persons and the Board of Trustees of the American Medical Association?

Were these the revelations Congresswoman Nancy Pelosi alluded to when she was quoted on March 9, 2010: “we have to pass the (health care) bill so that you can find out what is in it”?

The questions that should now be asked of our elected representatives in Washington, D.C. are: Do they still feel that they were correct in their support for the proposed ACA legislation, given these new revelations concerning transparency? In retrospect, did they perform their due diligence before they cast their vote in early 2010? After hearing Jonathan Gruber’s presentations, do they feel that the President and his administration knowingly deceived them?

The same questions should be asked of the leaders of two of this country’s largest medical organizations— the American Medical Association (AMA) and the American Academy of Family Physicians (AAFP),* which have supported and continue to support most of the proposed mandates put forth in the Affordable Care Act.

*AAFP Letter Regarding Support for the Patient Protection and Affordable Care Act, www.aafp.org, Mar. 2009, 2010.

‘Working for the Man’

It’s time for a reality check! The independent physicians are rapidly disappearing. Replacing them are contract and salaried physicians, whose income is based on their skill sets, seniority or productivity. In all these employment models, their reimbursement does not come directly from their patients, but from some third party.

In the 1970s, doctors first agreed to accept assignment as full payment for their services. The turning point wasn’t the onset of private health insurance, nor the Medicare program, nor even when these programs set out the fee schedules that the payers would reimburse their enrollees. It was when the providers of services (doctors) agreed to accept those rates as full payment. That’s the beginning of when doctors started ‘working for the man’— an often-quoted idiom used when independent decision-making and control of one’s own self-interest are subverted to superiors.

Initially, this seemed innocent enough. Patients were being reimbursed by their insurers and were, in turn, supposed to turn the money over to their doctors. Knowing that many doctors were reluctant to rely on collection agencies, a few patients kept the money. As the practice became more commonplace, doctors in increasing numbers made arrangements with the insurers for direct reimbursement, feeling that the payers’ reduced payment was better than nothing.

Dial ahead to the advent of managed care health care systems, which linked reimbursement to improved efficiency on the part of the providers— more specifically, the capitation systems that set out total reimbursements to a fixed group of providers for a fixed group of patients over a defined time. Again, ‘the man’ is setting not only the fees charged, but also the reimbursements that would be accepted.

It is estimated that by next year, about 50% of U.S. doctors will be working for a hospital or hospital-owned health system. A recent survey by the Medical Group Management Association shows a nearly 75% increase in the number of active doctors employed by hospitals or hospital systems since 2000, reflecting a trend that sharply accelerated around the time that Obamacare was enacted.*

Many factors have contributed to this trend: the growing complexity of the rapidly expanding base of medical knowledge, necessitating ready consultations across specialties; the variances in the requirements of the reimbursement models; and the changing attitude of the emerging physician population— that they would rather give up some of their independence than fight the intrusions restricting their ability to care for patients.

Fearful of being left off the panels selected for payer contracts, many doctors have joined or are joining with other physician groups (PPOs), or have sold their practices to hospital-run and hospital-owned organizations. As competition with health care carriers intensifies, the independent PPOs find themselves either creating revenue-distribution arrangements with the hospitals or being absorbed by these hospital systems.

With the number of health care payer options narrowing, the providers of health care services concentrate into fewer, but larger and more inclusive, entities. The physician participants within these systems increasingly lose control of their decision-making and become distanced from the payer. Thus, the increasing control over physicians by these multidisciplinary health care delivery systems— the integrated managed care consortium Kaiser Permanente, founded in 1945, is an example.

Now introduce the Affordable Care Act’s formation of the State and Federal Exchanges. Couple that with Medicare and the expansion of the states’ Medicaid coverage. Allow the Individual Mandate and the Employer Mandate requirements of the ACA to be fully implemented.

What is the future for private health care insurers in this scenario? Many insurers will find other markets in which to provide coverage. The remaining carriers may consolidate or continue to go it alone. Because of the mandates set out in the ACA, with respect to what their policies must cover, the private insurers will be increasingly at a disadvantage as they compete with the coverage offered by the Exchanges that are shored up with government subsidies.

If the Employer Mandate and the Individual Mandate clauses of the ACA are fully implemented in their current form, it will totally rewrite this country’s health care payment model. Over time, employers will increasingly dump their employees into the Exchanges as the costs of the private coverage options escalate. With a shrinking patient base and without federal subsidy support to meet the coverage demands outlined in the ACA, the private carriers will not be able to compete. Those individuals who aren’t eligible for Medicaid or Medicare and are not covered by the Employer Mandate provision will be forced to pay the escalating premium costs for private coverage, turn to the Exchanges or pay the fine and go ‘uncovered’.

With the physician population moving voluntarily into a subordinate role in these multidisciplinary health care delivery systems, and the Exchanges poised to squeeze out the remaining so-called private health care payers, the end result seems obvious. Most physicians will be hired, fired and reimbursed by an entity that derives its primary revenue stream from federally funded programs. The owners of the treatment facilities may not be the federal government in the projected United States model, but with the main revenue stream coming from government funding, by default, it becomes a single payer system— just a variant of socialized medicine painted in red, white and blue. **

My physician father warned me in July 1965, when President Lyndon Johnson signed the Social Security Amendments into law creating the Medicare program, that it was only a matter of time until socialized medicine would take over.

Physicians should have seen it coming: accepting assignment, instead of collecting from their patients; entering managed care arrangements under a capitation model; agreeing to let hospital-based systems broker with the payers on their behalf; giving up many of their freedoms to avoid some of the hassles in their practices; and finally, supporting legislation (the ACA) that defaults to a single payer, because the law is unsustainable in its current form.

Make no mistake. Within the next decade, our once noble and independent profession will be “working for the man.”

*Graham, J.R., Free of Obamacare Taxes, the Future of Health is Digital, NCPA posting, October 22, 2014.

** Socialized medicine is, by definition, a health care system in which the government owns and operates health care facilities and employs the health care professionals, thus also paying for all health care services.