April 29, 2011
Look at the chart below from Chart of the Day. One question, 5 seconds: how much has the variance between the two lines changed from 2001 to 2011?
I’m going to guess you came up with:
- 2001 – The red line is about 1% lower than the blue line
- 2011 - The red line is about 10% lower than the blue line
- Therefore, the total change is 9%
If that's what you guessed, you’re wrong, completely wrong! Why?
- The axes are not synchronized
- The axis on the left is for the blue line, whereas the axis on the right is for the red line
- How can I be sure? There are gray headers at the top of each axis.
This is incredibly confusing and incredibly misleading!
What’s the real answer?
- 2001 – The red line is at 88%, while the blue line is at 79%. The red line is 9% higher than the blue line, not 1% lower like I thought
- 2011 - The red line and blue line are both at about 114%, not the 10% variance I supposed
- Therefore, the total change is actually 0%, not 9%
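The arithmetic is trivial once you read each line off its correct axis. A quick sketch of both readings (values are approximate, read from the chart; the "misread" values are hypothetical numbers consistent with the wrong guesses above):

```python
def gap(red, blue):
    """Signed difference between the red and blue lines, in points."""
    return red - blue

# What the unsynchronized axes make you see (hypothetical misread values):
misread_2001 = gap(78, 79)    # -1: red looks slightly behind
misread_2011 = gap(104, 114)  # -10: red looks far behind

# What the chart actually says once each line uses its own axis:
actual_2001 = gap(88, 79)     # +9: red is actually ahead
actual_2011 = gap(114, 114)   # 0: the lines have converged
```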
This is one of the worst and most misleading dual-axis charts I’ve ever seen. At a glance it looks like blue was slightly ahead of red, and then the gap widened dramatically in blue’s favor. In reality, it’s the complete opposite.
Scary, scary, scary!
I was contacted yesterday by a follower of my blog who lives in the UK and works for Autonet Insurance Group. She asked if I would host their infographic on this blog, and I agreed only once she confirmed it was ok for me to critique it a bit. I have very limited experience creating infographics, but I am a firm believer that you should always present data in the simplest format to understand.
Keep in mind that Wikipedia defines Information Graphics as “visual representations of information, data or knowledge. These graphics present complex information quickly and clearly.” Maybe this is why I’m not a huge fan of them. Scroll down below Autonet’s infographic for my improvements.
Infographic by Autonet Insurance
First, I love the content and the descriptions; they’re very informative. However,
- I despise donut charts just as much as pie charts, although donut charts give me the sensation of riding a bike for some reason. In this donut, it’s challenging to differentiate some of the colors and there’s no particular order that I can discern.
- The bubbles at the bottom are designed to show the relative differences in the cost of the insurance. But no one can tell me that the bubble representing Prince Charles’ wedding to Lady Diana in 1981 is 7,000 times smaller than the bubble representing Prince William’s upcoming marriage to Kate Middleton.
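There’s a well-known reason bubbles like these mislead: a value should map to a circle’s area, not its radius. If one wedding’s insurance cost really were 7,000 times the other’s, the radii should differ by only √7000 ≈ 84 times. A minimal sketch of area-proportional sizing:

```python
import math

def radius_for_value(value, scale=1.0):
    """Radius that makes the circle's AREA proportional to the value."""
    return scale * math.sqrt(value)

ratio = 7000  # the cost ratio quoted above
radius_ratio = radius_for_value(ratio) / radius_for_value(1)
# radius_ratio ≈ 83.7 -- the bigger bubble should be ~84x wider, not 7,000x
```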
While this view below doesn’t give any of the great details of the infographic and doesn’t look as snazzy, it does make the comparisons much, much easier.
And if you insist on a donut chart, how about something like this?
April 28, 2011
Via the Guardian Datablog, questions to ponder:
- Which is the most violent state?
- Which are getting better/worse?
Click on a state either in the map or the bar chart to update the trend. Filter to the years you want to see with the table on the lower-left.
It’s pretty clear that the peace index is worst in the South. I wonder whether there is any correlation with gun control laws.
April 19, 2011
From Nielsen. There’s no explanation in the article as to the scale, nor are any numbers mentioned. How am I supposed to compare men 35+ who are planning to purchase HDTVs to women 35+?
This chart could be showing whole numbers, percentages, or the number of people planning to buy on the 2nd Saturday of next week. This is a really poor effort.
April 15, 2011
Have you ever received a chart similar to this? I did and was flabbergasted. What disturbed me most was that it was intentionally misleading. The other people who saw the chart didn’t notice the major problem that the dual-axes are not synchronized (and a dual-axis chart is unnecessary in this case anyway).
I quickly corrected the chart and shared how it should look. They were stunned at the different stories these charts tell at first glance.
Bottom line: be skeptical when someone sends you a chart and quickly correct the situation if needed. Take Mar-11 as an example:
- In the first chart it looks like Group E is only 2% better than Group Y
- In the second chart you can clearly see the gap is much, much bigger
Now take a quick look at Dec-10 (don’t cheat and look at the data). In the 1st chart you’d think Group Y is killing Group E, when in fact Group E is outperforming Group Y.
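If you must plot two series together, the safest fix is a single shared axis scaled to cover both series. A minimal sketch of that idea (the Group E and Group Y numbers are made up for illustration, not the actual data):

```python
def shared_axis_range(*series, pad=0.05):
    """Common y-axis range covering every series, with a little padding."""
    lo = min(min(s) for s in series)
    hi = max(max(s) for s in series)
    span = hi - lo
    return lo - pad * span, hi + pad * span

group_e = [70, 74, 79]  # hypothetical monthly values
group_y = [55, 58, 60]
lo, hi = shared_axis_range(group_e, group_y)
# Plot both series against (lo, hi) and the gap is no longer distorted.
```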
It’s a scary world out there when people try to intentionally mislead you in an effort to support their personal agenda. A friend of mine likes to say “Facts are friendly”. Presenting facts needs to be friendly as well.
April 7, 2011
Nielsen has released another interesting report, this time measuring the internet connection speeds of nine countries. They presented the results as a stacked column chart.
One of the biggest problems with stacked bar or column charts is that it’s very difficult to compare values other than those at the bottom or top. In the Nielsen chart, it takes some time for your eyes to rank any of the measures other than “Above 8Mb”. I’ve turned to Tableau and presented several alternatives. Scroll through the tabs to see each.
I also used a parameter control to allow sorting by any of the measures. This allows you to quickly rank the countries for any of the speeds. My personal favorite is the “Favorite” tab. Which is yours? How else would you present the data?
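The parameter-driven sort boils down to re-ranking the rows by whichever measure the viewer picks. A rough sketch of the idea (countries and speed-band shares are illustrative, not Nielsen’s actual figures):

```python
# Hypothetical speed-band shares (percent of connections per country):
data = [
    {"country": "A", "above_8mb": 30, "2-8mb": 45, "below_2mb": 25},
    {"country": "B", "above_8mb": 22, "2-8mb": 55, "below_2mb": 23},
    {"country": "C", "above_8mb": 38, "2-8mb": 40, "below_2mb": 22},
]

sort_by = "2-8mb"  # the "parameter" a viewer would choose
ranked = sorted(data, key=lambda row: row[sort_by], reverse=True)
ranking = [row["country"] for row in ranked]  # countries ordered by that band
```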
April 5, 2011
Contrary to popular belief, Scott Boras does not represent every big league player. He doesn't even represent all of the biggest names. In fact, Boras Corporation doesn't even represent the most baseball players. That distinction belongs to SFX World.
The Pareto principle is nearly in effect here – 29 of the 117 agencies (25%) account for 80% of the players. Click on any of the lines or bars to see the players and teams those agencies represent. Visit MLB Trade Rumors by clicking on the little red star on the logo.
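The Pareto check is a simple cumulative-share calculation; a sketch (the per-agency player counts at the bottom are hypothetical, only the 29-of-117 figure comes from the post):

```python
# 29 of the 117 agencies account for roughly 80% of the players:
top_share = 29 / 117  # ≈ 0.248, close to the classic 20-25%

def agencies_covering(counts, target=0.80):
    """How many agencies (largest first) cover `target` share of players."""
    counts = sorted(counts, reverse=True)
    total = sum(counts)
    covered = 0
    for i, c in enumerate(counts, start=1):
        covered += c
        if covered / total >= target:
            return i
    return len(counts)

# Hypothetical player counts per agency, just to show the calculation:
assert agencies_covering([50, 30, 10, 5, 3, 2]) == 2  # top 2 cover 80/100
```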
April 4, 2011
A blog post from Nielsen says:
According to a new mobile video report from The Nielsen Company, the number of U.S. mobile subscribers watching video on their mobile devices rose more than 40 percent year-over-year in both the third and fourth quarters of 2010, ending the year at nearly 25 million people.
They presented their results with this table:
What’s wrong here?
- Their summary says viewing rose year-over-year in both the third and fourth quarters of 2010, so why isn’t Q3 09 in the table?
- Why do they include Q3 10 if they don’t include the quarter-over-quarter comparison?
- The table is quite dark and dreary, with unnecessary color
I would reformat the table like this.
- Added Q3 09 along with quarter over quarter comparisons (% diff on the right) and year over year comparisons (% diff at the bottom)
- Removed any unnecessary color
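The comparisons in the reformatted table are all one percentage-change formula. A sketch of the calculation (only the ~25M Q4 10 figure and the “more than 40% year-over-year” claim come from the post; the other subscriber counts are hypothetical, in millions, just to show how the slowdown appears):

```python
def pct_change(new, old):
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Hypothetical subscriber counts (millions):
q3_09, q4_09, q3_10, q4_10 = 12.0, 17.0, 20.5, 25.0

yoy_q3 = pct_change(q3_10, q3_09)  # year-over-year, Q3
yoy_q4 = pct_change(q4_10, q4_09)  # year-over-year, Q4
qoq_09 = pct_change(q4_09, q3_09)  # quarter-over-quarter in 2009
qoq_10 = pct_change(q4_10, q3_10)  # quarter-over-quarter in 2010
# Both YoY figures can exceed 40% while qoq_10 < qoq_09 -- growth slowing,
# which is exactly what dropping Q3 09 from the table hides.
```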
While Nielsen’s report says “the number of U.S. mobile subscribers watching video on their mobile devices rose more than 40 percent year-over-year in both the third and fourth quarters of 2010”, they fail to point out that growth has actually slowed in 2010, both quarter over quarter and year over year.
Bottom line: be careful what you read. I hope this wasn’t the case, but Nielsen may have intentionally left out Q3 09 to paint a rosier picture.
I’ve previously published research regarding the Cobb County School Board’s decision to revert to the “traditional” calendar. Much to the dismay of the board members, many of the parents have not let this unpopular change go away.
In a very rare move, the Cobb school board was called to testify in front of a grand jury to discuss the recent calendar decision and the board’s overall decision-making process. The outcome/recommendation from the grand jury will likely not be known until early-May.
The remainder of this blog post contains excerpts from research conducted by a friend and fellow Cobb County parent with respect to academic gains achieved to-date under the “balanced” calendar. View the research in its entirety here or see the bottom of this post.
In November of 2009, the CCSD approved a balanced calendar for a three-year period to begin in 2010-2011. The district committed to monitor the impact of the balanced calendar on key operational areas including student achievement.
On February 17, 2011, the Board of Education overturned this decision before the first year under the balanced calendar had even been completed, approving “traditional” calendars for the following two school years. Although this reversal was shocking, even more stunning was the fact that the board overturned its 2009 decision without honoring the commitment made to the district when the balanced calendar plan was established.
Because the 2010-2011 school year has not yet been completed, a full assessment of student achievement under the balanced calendar is not possible. However, the three assessments completed to date – The Iowa Tests of Basic Skills (ITBS), The Georgia High School Writing Test (GHSWT), and the End-of-Course Test (EOCT) – can provide some insight into the impact of the balanced calendar on student achievement.
The Iowa Tests of Basic Skills (ITBS)
Students are assessed in the fall on the Iowa Tests of Basic Skills (ITBS). This is a norm-referenced assessment that measures student achievement in comparison to other students nationwide. Students in grades 3, 5, and 7 are tested in Reading, Mathematics, Language Arts, Science, Social Studies and Sources of Information. Results include:
- ITBS Grade Equivalency (GE) scores for Cobb’s third, fifth, and seventh graders had all been declining in recent years. However, for the first time since 2007, ITBS GE scores in Cobb County increased in 2010
- In 2010, GE scores for third graders increased to 3.59, an increase of 3.5% in just one year
- GE scores for fifth graders increased 1.8% in 2010 to 6.05
- GE scores for Cobb’s seventh graders increased slightly by 0.6% to reach 8.05 in 2010
- Percentile rankings for all three grade levels increased by 1.5% in 2010
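The percentage gains above can be sanity-checked by backing out the prior-year scores from this year’s score and the stated gain:

```python
def prior_score(current, pct_gain):
    """Back out last year's score from this year's score and % gain."""
    return current / (1 + pct_gain / 100)

# From the ITBS figures above:
third_2009   = prior_score(3.59, 3.5)  # ≈ 3.47
fifth_2009   = prior_score(6.05, 1.8)  # ≈ 5.94
seventh_2009 = prior_score(8.05, 0.6)  # ≈ 8.00
```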
The Georgia High School Writing Test (GHSWT)
High school students are assessed on the Georgia High School Graduation Tests (GHSGT) and the Georgia High School Writing Test (GHSWT). Most students take the GHSWT during the fall of their junior year and are required to pass the test to earn a regular education diploma. Results include:
- GHSWT results improved by 3.3% in 2008 with 95% of the students passing the assessment
- Results declined slightly in 2009, falling to 94%
- Results have increased dramatically in 2010 with a 4.3% increase to an all-time high of 98%
End-of-Course Test (EOCT)
All students take the End-of-Course Test (EOCT) after completing various courses in four different categories: English and Language Arts (ELA), Math, Science, and Social Studies. Results of the EOCT are used for diagnostic, remedial and accountability purposes to gauge the quality of education in the state and also count as part of the student's final grade in the course. Results include:
- Results in ELA and Social Studies have maintained a slight but steady increase through the 2010-2011 school year
- Results in Science showed a slight decrease in 2010-2011, returning to their 2008 level of 71%
- After a 23% decline in 2008-2009, results in Math dramatically improved by 26% in 2009-2010. They continued to improve by another 11% to 75% in 2010-2011, reaching a level not obtained since 2006-2007
Quantifiable year-to-date student achievement in the Cobb County School District during the first year of the Balanced Calendar can be summarized as follows:
- ITBS Grade Equivalency scores and national percentile rankings had all been declining in recent years but increased in 2010-2011 for the first time since 2007
- GHSWT passing rates had improved in 2008-2009 and declined in 2009-2010, but then they increased dramatically in 2010-2011 to an all-time high
- In 2010-2011, EOCT passing rates declined slightly in science, continued a steady increase in ELA and social studies, and continued to increase significantly in math
Of course, the results of only three assessments cannot definitively establish the impact of the balanced calendar on student achievement. Ten total assessments are published each year in the CCSD, and full evaluations of all ten at the end of the 2010-2011 and 2011-2012 school years would have provided the most reliable gauge of the balanced calendar’s impact.
However, these are the only assessments that have been completed to date, and one thing can be said for certain: the balanced calendar has not had an adverse effect on student achievement in the Cobb County School District to date in 2010-2011. In fact, all indications so far suggest that the balanced calendar has had a positive impact on student achievement and is promoting improved results.
April 1, 2011
From Chart of the Day:
For me, this chart is quite hard to read because (1) you have to tilt your head 90 degrees to read anything and (2) the background is unnecessarily dark. A bar chart would be far superior; you’d be able to read left-to-right, just like a book.