Where should we start? How about this: internet traffic is half-fake, and everyone has known it for years, but nobody has any incentive to actually acknowledge it. The situation is technically improving: 2015 was hailed (quietly, among people who aren’t in charge of selling advertising) as a banner year because humans took back the majority with a stunning 51.5% share of online traffic, so hurray for that, I guess.

All the analytics suites, ad networks and tracking pixels can try as they might to filter out the rest, and there’s plenty of advice on that endless Sisyphean task of helping them do so. But at least half of all that bot traffic comes from bots in the “malicious” or at least “unauthorized” category, bots that have every incentive to subvert the mostly-voluntary systems that are our first line of defence against them. So: good luck.

We already know that Alexa rankings are garbage, but what does this say about even the internal numbers that sites use to sell ad space? Could they be off by a factor of 10? I don’t know, and neither do you. Hell, we don’t even know how accurate the 51.5% figure is; it could be way off, in either direction. More at techdirt.
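For what it’s worth, the “mostly-voluntary” part is easy to see in code. A typical first-pass filter just checks the user-agent string, which only catches bots polite enough to identify themselves; a malicious bot sends a browser-like user-agent and sails straight through. This is a minimal sketch, and everything in it (the token list, the sample log lines) is invented for illustration:

```python
# Naive bot filtering by user-agent string: the "voluntary" defence.
# Honest crawlers announce themselves; malicious bots just pretend
# to be a browser. Tokens and hits below are made up for illustration.

KNOWN_BOT_TOKENS = ("googlebot", "bingbot", "crawler", "spider")

def is_declared_bot(user_agent: str) -> bool:
    """True only for bots that choose to say they are bots."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

hits = [
    "Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36",  # actual human
    "Mozilla/5.0 (compatible; Googlebot/2.1)",           # honest bot
    "Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36",  # bot faking a browser
]

# The filter counts two "humans" here, and one of them is a bot.
human_looking = [ua for ua in hits if not is_declared_bot(ua)]
print(len(human_looking))
```

The point being: this kind of filter measures honesty, not humanity, which is exactly why nobody can say how far off the 51.5% figure really is.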
It can be awfully dispiriting to run any website and try to work out whether it is genuinely getting more popular or just caught up in all the weird stuff that happens on the internet. I gave up caring how popular LIM is a long time ago.