In 2013, not-seasonally-adjusted jobless claims fell to their lowest level under Obama, totaling 17.8 million. That was just 100,000 more than in 2004 and 2005, back-to-back years in the Bush administration when not-seasonally-adjusted claims came in at 17.7 million. Which felt worse, 2013 or 2004/2005, given that the levels were nearly the same? One way to measure that is to compare the level of claims to the labor force participation rate. Using the not-seasonally-adjusted annual averages of that rate, dividing total jobless claims (in millions) by the rate gives the following: 2004 = .268, 2005 = .268, 2013 = .281.
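To make the arithmetic explicit, here is a minimal sketch of that calculation in Python, using only the figures quoted in this piece, with claims in millions and the participation rate in percent. The 66.0% average for 2004 and 2005 is taken from the closing comparison below and assumed to apply to both years.

```python
# Ratio of not-seasonally-adjusted initial claims (millions) to the annual
# average labor force participation rate (percent), per the figures above.
claims_millions = {2004: 17.7, 2005: 17.7, 2013: 17.8}
lfpr_percent = {2004: 66.0, 2005: 66.0, 2013: 63.3}  # 66.0 for 2004/2005 assumed from the closing comparison

for year in sorted(claims_millions):
    print(f"{year}: {claims_millions[year] / lfpr_percent[year]:.3f}")
# 2004: 0.268
# 2005: 0.268
# 2013: 0.281
```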
How would you know which was worse? For scale: over the whole period in question, jobless claims hit their lowest level in 2006 at 16.2 million, when the labor force participation rate averaged 66.2%. Dividing the claims by the rate gives .244, the lowest result for the period. The highest result, not coincidentally, was .451 in 2009, when claims soared to 29.5 million and the participation rate averaged 65.4%. So it seems reasonable to suggest that 2013, the best year to date for aggregate claims since 2007, still feels worse than either 2004 or 2005. About 4.9% worse. Indeed, even if 2013 had seen 100,000 fewer claims, matching the 17.7 million not-seasonally-adjusted first-time claims of 2004/2005 instead of 17.8 million, the result would still come in above .268, at .279, because the civilian labor force participation rate had fallen from 66.0% to 63.3%. So just because a similar number of people are losing jobs as at some point in the past doesn't mean things have returned to normal. If they had, fewer people would now be filing jobless claims in proportion to the smaller share of the population participating in the labor force, and they aren't. Not yet.
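For completeness, a sketch of the same calculation for the extremes and the counterfactual described above, again using only figures quoted in the text; the comments show four decimal places, which the text trims to three.

```python
# Same claims-per-participation-point ratio for the period's extremes and
# for a counterfactual 2013 with the same 17.7 million claims as 2004/2005.
def ratio(claims_millions, lfpr_percent):
    return claims_millions / lfpr_percent

low_2006    = ratio(16.2, 66.2)  # ~0.2447, the lowest result for the period
high_2009   = ratio(29.5, 65.4)  # ~0.4511, the highest result for the period
actual_2013 = ratio(17.8, 63.3)  # ~0.2812
bush_years  = ratio(17.7, 66.0)  # ~0.2682 in both 2004 and 2005

# How much worse 2013 "felt" than 2004/2005, in percent:
gap_pct = (actual_2013 / bush_years - 1) * 100  # ~4.9%

# Even with 100,000 fewer claims, the lower participation rate keeps the
# 2013 ratio above the 2004/2005 level of ~0.268:
counterfactual_2013 = ratio(17.7, 63.3)  # ~0.2796

print(f"2006 low: {low_2006:.4f}   2009 high: {high_2009:.4f}")
print(f"2013 vs 2004/2005: {gap_pct:.1f}% worse")
print(f"2013 with 17.7M claims: {counterfactual_2013:.4f}")
```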