The only "problem" with it is that it no longer works. When trying to view The New York Times, it returns the "12ft has been disabled for this site" message.
So what's that old idea? If outlets want to rank high and be indexed by search engines - and they do - then they have to let crawlers read their content. So, supposedly, crawlers get served the full articles. The trick is to pretend to be Googlebot, Bingbot, or Whateverbot.
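The simplest version of that trick is just setting the `User-Agent` header of a request to a crawler's advertised string. A minimal sketch with Python's standard library (the URL is a made-up placeholder; the Googlebot string is the one Google publishes):

```python
from urllib.request import Request

# Googlebot's published User-Agent string
GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

def spoofed_request(url: str) -> Request:
    """Build a request that claims to come from Googlebot."""
    return Request(url, headers={"User-Agent": GOOGLEBOT_UA})

# Hypothetical article URL, just for illustration
req = spoofed_request("https://example.com/some-paywalled-article")
print(req.get_header("User-agent"))
```

This alone rarely gets you far: publishers can verify genuine Googlebot traffic by doing a reverse-DNS lookup on the requesting IP, so a faked header from a residential address is easy to spot.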
Of course, it's more complicated than it sounds. Portals, journals, and magazines are smart enough to detect the most obvious ways of faking requests. And if a paywall-removal service finds a backdoor and becomes popular, it ends up like 12ft Ladder: blocked.
Popularity is not always beneficial.