Test Driving the USPX Guidelines on Say-on-Pay

Author Brett Davidson takes the USPX guidelines on voting say-on-pay for a test drive.

Brett Davidson is a member of the USPX and publishes Investletter. He serves on the committee that released the USPX guidelines for voting on say-on-pay.  In this article, he assesses those guidelines, applying them to the 36 corporations that failed their say-on-pay votes this proxy season. For that purpose, he also experiments with how the guidelines might be extended to apply to small and medium-sized companies. In their current form, the guidelines apply only to large corporations.

 

Say-on-Pay (SOP) is finally here.  Shareholders now have a voice on executive pay levels.  So how did we do?  It appears that after the first proxy season of voting, shareholders have a very meek voice.  More than 98% of SOP proposals were approved.  This certainly is not the impact proponents of SOP envisioned when the measure was enacted.

Having worked on the United States Proxy Exchange SOP white paper, I was intimately involved with the development of the tools that were presented in the guidelines.  These tools are designed to function as a rule of thumb for retail shareholders to help guide them when voting on SOP proposals.  Having shortcut methods to determine how to vote is a great idea, but do they work?

Here is my experience using these tools on actual company SOP proposals.  Below is an analysis of 30 of the companies that saw their SOP proposals defeated this proxy season; the full list of companies follows the discussion.  To write this paper I had to perform the same steps any investor would need to perform to use the tools in the white paper.

Are the tools we used scientifically tested and fully vetted to guarantee they work?  Heck no!  Nor do they need to be.  The aim is to function as a rule of thumb, seeing that executive compensation is not an exact science to begin with.  Just read the proxy statements and the opinions of the compensation consultants writing them.  It is much closer to science fiction.  Each company’s circumstances are different.  Would you treat a company with a super voting class of shares the same as a company with one vote per share, each paying their CEO the same amount?  I wouldn’t, but some investors might.  None of that changes the fact that CEO pay in general is too high.  Detailed analysis is great, but we need tools we can use now to help push down exorbitant pay.

Do the tools proposed in the SOP white paper agree with shareholders that CEO pay is too high?  If the tools work as intended, then the 36 proposals that failed to receive shareholder support this proxy season should, ideally, also fail when the Guidelines are applied.  (To date, 36 SOP proposals have failed to pass this proxy season.)  This by no means validates the tools, but it would indicate that they are a step in the right direction.  But don’t take my word for it.  As a colleague of mine is fond of saying, “that’s why they play the game”.  Let’s buckle our chin straps and see how the proposed tools held up.

The 36 companies with failed SOP proposals varied considerably in size.  This presented a challenge when using the tools as provided in the USPX guidelines.  In their current form, the tools are designed to be applied to large firms only, where they stand the chance to make the biggest impact.  For purposes of this paper I felt it necessary to expand the analysis to smaller companies.  To do this I had to make some adjustments.

The easiest adjustment was eliminating the six companies whose market cap fell below the small-company threshold.  Reducing executive pay is better accomplished by taking on large companies rather than small ones, for a number of reasons: the size of the pay packages and media attention are two.  The guidelines present executive pay segregated by company size but do not spell out how the size groups were drawn.  A footnote on page ten of the paper notes that the data for medium and small companies was gathered from the S&P 400 and S&P 600.  I used the market cap ranges of the three indexes (including the S&P 500) to break the companies into groups and compare each CEO’s pay to the median pay for similar-sized companies.

Comparing CEO pay as a multiple of the average worker’s wage was easy: the CEO’s pay was divided by the average worker’s pay provided in the paper.  The results were not surprising.  The largest pay package was 1,108 times the average worker’s pay.  The second smallest company by market cap actually satisfied this measure: the pay for the CEO of Shuffle Master, Inc. was a mere 30 times the average worker’s pay.  Apparently, other issues resulted in his pay package failing to receive approval.

With the threshold set at 50 times the average worker’s pay, Shuffle Master was the only company that passed the test.  In 2010, average CEO pay was 343 times the average worker’s pay.  On this prong of the two tests, 29 of the 30 companies failed.
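The first tool reduces to one division and one comparison. Here is a minimal sketch; the 50× threshold is from the guidelines, but the average worker pay figure of roughly $33,190 is back-solved from the multiples in the table below, so treat it as illustrative and substitute the figure from the white paper for real analysis.

```python
# Sketch of the first USPX tool: flag a pay package when CEO pay
# exceeds 50 times the average worker's pay.
# AVG_WORKER_PAY is back-solved from this article's table (illustrative);
# use the figure from the white paper in practice.

AVG_WORKER_PAY = 33_190
THRESHOLD = 50

def pay_multiple(ceo_pay, avg_worker_pay=AVG_WORKER_PAY):
    """CEO pay expressed as a multiple of the average worker's pay."""
    return ceo_pay / avg_worker_pay

def passes_multiple_test(ceo_pay, avg_worker_pay=AVG_WORKER_PAY):
    """True when the multiple is at or below the 50x threshold."""
    return pay_multiple(ceo_pay, avg_worker_pay) <= THRESHOLD
```

For example, Shuffle Master's $1,011,591 package comes in around 30 times average worker pay and passes, while Freeport-McMoRan's $36.8 million package is over 1,100 times and fails.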

The comparison of CEO pay to the median CEO pay using companies segregated by market cap yielded this breakdown: 6 large companies; 10 medium companies; and 14 small companies.  Company size is determined as the chart below indicates.

Company Size | Median Compensation
Small Cap: $300mm to $1.4b | $2.2 mil.
Mid Cap: $1b to $4.4b | $4.3 mil.
Large Cap: $4b and up | $9 mil.
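The second tool can be sketched the same way. The medians below are the ones applied in the company table in this article; the clean cutoffs at $4b and $1.4b are my simplification, since the index market-cap ranges overlap and border cases took judgment in the actual analysis.

```python
# Sketch of the second USPX tool: compare CEO pay to the median CEO
# pay for similar-sized companies. Size bands follow the S&P
# 500/400/600 breakdown used in this article; the hard cutoffs at
# $4b and $1.4b are a simplification of the overlapping index ranges.

SIZE_BANDS = [
    # (market-cap floor in $millions, size label, median CEO pay)
    (4_000, "large", 9_000_000),   # Large Cap: $4b and up
    (1_400, "mid",   4_300_000),   # Mid Cap:   $1b to $4.4b
    (300,   "small", 2_200_000),   # Small Cap: $300mm to $1.4b
]

def passes_median_test(ceo_pay, market_cap_mil):
    """True/False against the median for the company's size band, or
    None if the company is below the $300mm floor covered by the paper."""
    for floor, _label, median in SIZE_BANDS:
        if market_cap_mil >= floor:
            return ceo_pay <= median
    return None
```

For example, Hewlett-Packard ($75.6b market cap, $23.9 million in CEO pay) fails against the $9 million large-cap median, while Shuffle Master ($517 million market cap, about $1 million in CEO pay) passes against the $2.2 million small-cap median.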

 

All six of the large companies failed the measure: each CEO’s pay exceeded the $9 million median for companies in the S&P 500.  Eight of the ten medium companies failed, as did 12 of the 14 small companies.  Overall, 26 of the 30 companies analyzed failed the measure.

When the two tools were combined, Shuffle Master was the only company that satisfied both.  The 29 companies that failed one or both tests would have been strong candidates to see their pay packages voted down.  By definition, roughly half of the S&P 500, around 250 companies, would pass the median pay test.  Of the companies analyzed, only 4 out of 30, or 13%, passed.  Shareholders are singling out companies that the median pay test indicates are worthy of a no vote.

So how did the tools perform in the real world?  With the objective of helping drive down executive pay, the tools proposed in the paper agreed with shareholders on 29 of the 30 pay packages that were voted down.  Used individually, the CEO-pay-to-average-worker-pay test and the median pay test would have failed 29 of 30 and 26 of 30 companies, respectively.

Using the tools did take a bit of time.  Gathering CEO pay from regulatory filings consumed most of it.  Once all of the data was gathered, the analysis took minutes.  Analyzing a single company would take about 10 minutes.  This assumes you are familiar with the sec.gov website and can navigate to company filings.

 http://sec.gov/edgar/searchedgar/companysearch.html
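The link above is EDGAR's general company search. As a small convenience, you can build a query that jumps straight to a company's proxy statements (form DEF 14A, where executive compensation is disclosed). This sketch uses EDGAR's standard browse-edgar interface; the parameter names are EDGAR's, not anything from the white paper.

```python
# Build an EDGAR browse URL listing a company's DEF 14A (annual proxy
# statement) filings, where executive compensation is disclosed.
from urllib.parse import urlencode

def def14a_search_url(company_or_ticker):
    """Return an EDGAR search URL for a company's DEF 14A filings."""
    params = {
        "action": "getcompany",
        "company": company_or_ticker,
        "type": "DEF 14A",   # the annual proxy statement form type
        "count": "40",
    }
    return "https://www.sec.gov/cgi-bin/browse-edgar?" + urlencode(params)
```

Opening, say, `def14a_search_url("Shuffle Master")` in a browser lists the company's proxy filings, from which the Summary Compensation Table supplies the CEO pay figure.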

In live action, of course, the tools will not be used to re-analyze companies whose SOP proposals were already defeated.  The aim is to analyze pay packages as companies release their yearly proxy statements and determine which companies’ SOP proposals should be voted down.  After taking a dip into the pool of 2011 defeated SOP proposals, I feel confident that the two tools perform as advertised.  You will need to decide whether to use the tools together or latch on to one or the other.

By no means do the tools provide the whole picture.  The shareholders of Shuffle Master, Inc. saw something that caused them to vote down the company’s SOP proposal even though the proposal passed both of the proposed tools in the paper.  That is perfectly acceptable and shareholders always have the ability to take actions based on other issues, but it takes time to perform an analysis on that level.  Shareholders with a large number of positions may not have the time to follow through with an in-depth analysis of all of the companies in their portfolio.  The tools provide a shortcut.

With enough SOP proposals going down to defeat we have a chance to drive executive pay downward.  These two tools stand ready to help in that quest.

A few changes could prove helpful.  Adding the median pay by company size to future revisions of the guidelines would aid in the use of the median CEO pay measure.  A mechanism is also needed to make it easy for investors to obtain CEO pay levels.  A service that gathers proxy forms as they are filed and parses out the CEO pay level would make this analysis much easier.  Today this takes about 15 minutes per company, a prime candidate for automation.  All of the information could be provided in a database, accessible with nothing more than the company name or ticker symbol, that reports the results of both proposed measures.  The rest would be up to concerned shareholders.

Scroll down to see the companies discussed in this article and how they fared using the USPX SOP tools.

Company | Ticker | Market Cap ($mil.) | CEO Comp | Median Pay Test | X Avg. Worker Pay
1 Hewlett-Packard HPQ $75,600 23,863,744 >$9 Failed 719 Failed
2 Freeport-McMoRan FCX $50,800 36,752,989 >$9 Failed 1108 Failed
3 Weatherford International WFT $14,090 13,163,770 >$9 Failed 397 Failed
4 Stanley Black & Decker SWK $12,450 32,730,259 >$9 Failed 987 Failed
5 Constellation Energy CEG $7,650 15,716,378 >$9 Failed 474 Failed
6 Nabors Industries NBR $7,060 13,537,486 >$9 Failed 408 Failed
7 Jacobs Engineering JEC $5,540 6,378,250 passed 192 Failed
8 Masco Corp. MAS $4,390 7,051,130 >$4.3 Failed 213 Failed
9 NVR, Inc. NVR $4,380 30,879,812 >$4.3 Failed 931 Failed
10 Superior Energy Services SPN $3,060 23,935,388 >$4.3 Failed 721 Failed
11 BioMed Realty Trust BMR $2,570 5,032,814 >$4.3 Failed 152 Failed
12 Kilroy Realty KRC $2,360 6,399,322 >$4.3 Failed 193 Failed
13 Helix Energy Solutions HLX $1,780 4,001,116 passed 121 Failed
14 Janus Capital JNS $1,780 20,337,868 >$4.3 Failed 613 Failed
15 Intersil Corp. ISIL $1,610 4,446,636 >$4.3 Failed 134 Failed
16 Curtiss-Wright CW $1,530 7,948,056 >$4.3 Failed 240 Failed
17 Umpqua Holdings UMPQ $1,340 3,731,340 >$2.2 Failed 112 Failed
18 Blackbaud, Inc. BLKB $1,220 4,552,265 >$2.2 Failed 137 Failed
19 M.D.C. Holdings MDC $1,190 9,206,403 >$2.2 Failed 278 Failed
20 Tutor Perini TPC $887 9,001,900 >$2.2 Failed 271 Failed
21 Cogent Communications CCOI $793 3,990,873 >$2.2 Failed 120 Failed
22 Ameron International AMN $786 3,930,100 >$2.2 Failed 118 Failed
23 Hercules Offshore HERO $757 2,516,064 >$2.2 Failed 76 Failed
24 PICO Holdings PICO $688 14,278,401 >$2.2 Failed 430 Failed
25 Cincinnati Bell CBB $650 20,259,761 >$2.2 Failed 611 Failed
26 Penn Virginia PVA $623 4,039,592 >$2.2 Failed 122 Failed
27 Navigant Consulting NCI $532 1,883,293 passed 57 Failed
28 Monolithic Power Systems MPWR $526 5,625,500 >$2.2 Failed 170 Failed
29 Shuffle Master, Inc. SHFL $517 1,011,591 passed 30 passed
30 Nutrisystem, Inc. NTRI $381 5,264,513 >$2.2 Failed 159 Failed
 

Companies Below Minimum Market Cap Levels Covered in White Paper

31 Beazer Homes USA BZH $269 6,893,362 208
32 The Talbots, Inc. TLB $221 6,268,760 189
33 Stewart Information Svcs. STC $194 1,260,276 38
34 Cadiz Inc. CDZI $156 2,324,641 70
35 Dex One DEXO $139 8,020,297 242
36 Cutera, Inc. CUTR $118 1,149,772 35
 

Company Size | Median Comp
Small Cap: $300mm to $1.4b | $2.2 mil.
Mid Cap: $1b to $4.4b | $4.3 mil.
Large Cap: $4b and up | $9 mil.

Times Average Worker Pay: threshold set at 50

About Brett Davidson

Brett Davidson is the publisher of the Commonsense Investletter Investment Advisory newsletter. www.investletter.com
