
Anti-Big G Knuckleheads and Unintended Consequences

I recently learned of the Knuckleheads (yes, that is their name) initiative. On the surface they advocate for more competition in search engines. However, after digging deeper, what they are really about is enabling said competition through the creation of a new “US Bureau of the Shared Internet Cache.” And how? By essentially nationalizing Internet search, specifically by confiscating Google’s successful Internet crawler search cache.

Now, past readers of this blog are aware that I’m not a fan of Big G and 99% of the time I use Bing. However, as the saying goes, two wrongs don’t make a right.

The beef is supposedly about the de facto preference shown by millions and millions of Web site owners – some big, some tiny in terms of Web site traffic volume. Some of them explicitly give preference to Big G via their robots.txt configuration file, disallowing (blocking) all the other crawlers! Knuckleheads claims that these independent site owners are biased and presents detailed, documented evidence.
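For illustration, the kind of robots.txt being complained about looks roughly like the hypothetical example below (the site is made up; Googlebot is Big G’s actual crawler token):

    # Hypothetical robots.txt that favors Big G's crawler
    # An empty Disallow means "crawl everything"
    User-agent: Googlebot
    Disallow:

    # Every other crawler is blocked from the entire site
    User-agent: *
    Disallow: /

A site owner who wants to be crawled only by Big G needs nothing more than those few lines.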

The first problem is that what Knuckleheads call “bias” is the site owner’s freedom of choice. Predictably, the search wannabes don’t like that and presume to know better.

Lipstick on the Pig

All told, what these search engine wannabes are pushing for is a really bad idea, and all-too-familiar rent-seeking. Instead of developing something better, they seek special privilege by appropriating the successful incumbent’s asset.

Here are a few reasons to consider that suggest darker motives:

  1. Robots.txt can be ignored by crawlers (think red herring)
  2. Calls for either outright property theft and/or a massive taxpayer gift to Big G
  3. Terrible precedent for more and new digital grifters
  4. Unintended consequence of likely disinvestment

First, robots.txt truly is an optional guideline for crawlers and bots. According to Moz.com: “Some user agents (robots) may choose to ignore your robots.txt file.” New crawlers in stealth mode, student CS projects, and black hats alike ignore it. That means the supposed rationale doesn’t even jibe.
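To make that concrete, here is a minimal Python sketch using the standard library’s urllib.robotparser; the site and user-agent string are placeholders. The point is that honoring robots.txt is a check the crawler must choose to run – nothing enforces it:

    # Minimal sketch: robots.txt only matters if the crawler bothers to ask.
    # example.com and PoliteCrawler/1.0 are placeholders.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://example.com/robots.txt")
    rp.read()  # fetch and parse the site's robots.txt

    url = "https://example.com/some-page.html"
    if rp.can_fetch("PoliteCrawler/1.0", url):
        print("Allowed by robots.txt, fetching:", url)
    else:
        print("Disallowed by robots.txt, skipping:", url)

    # A stealth or black-hat crawler simply never runs this check
    # and requests the page anyway; robots.txt cannot stop it.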

Most Web sites with freely available content only prevent access where the Web server requires user authentication. Everything else is essentially “come and get it.” Would-be competitors to Big G can and do crawl Web sites, which translates into serious misdirection about what Knuckleheads is really about. Kinda’ shady.

Second, the purported essential facilities doctrine that is referenced is a euphemism that economically resembles eminent domain, in that government force is used for the taking. The difference is that eminent domain is laid out in the 5th Amendment and requires just compensation. It makes sense that would-be competitors and their investors would be on board.

Perhaps less immoral is that Big G might get compensated for the taking. With real estate that is easy enough: recent comparables establish the price to be paid out for seizing the asset. Who would establish the price for Big G’s Internet crawler cache? Politicians, bureaucrats and eager competitors, of course. How would it be paid for? Most likely by baking the cost into some budget covered by the taxpayers, or worse still, by pumping up the money supply just a little bit more – or both. It is also not hard to imagine the process becoming punitive, e.g. a one-time fine, covering legal costs, or maybe ongoing administrative fees. Which Faustian bargain it turns out to be just depends on the legal argument.

Next would be the setting of a precedent, i.e. the path being cleared for even more Internet regulations built on the tortured logic of essential facilities. Ginning up government-enabled franchises – what could go wrong? For professional lobbyists this is a feature, not a bug.

Last, as usual there are unintended consequences of such privatization of gain (the windfall to the Big G wannabes) at public cost (all searchers). In recent years economics has circled around to the idea of externalities, which can be good as well as bad. Consider the contrasting examples of alcohol and education: more expensive booze is supposedly better for society (definitely better for government revenues), while a more educated populace has positive “spillover effects.”

In Big G’s defense, the main positive externality of its search engine is that everyone benefits from everyone else’s search behavior via the PageRank algorithm. The Knuckleheads want to put the future quality of search at risk in order to clearly benefit themselves and MAYBE benefit some abstract Internet searchers. Once the cache becomes public, one problem gets solved but several others pop up. For Big G, the incremental benefit of spending on crawler cadence, better algorithms or speedy servers just went through the floor. That means the future state is seriously put at risk with worse quality (file under: Tragedy of the Commons).

Be Careful What You Wish For…

End Result: Flexible Ethics

In the Knucklehead dream world, US taxpayers/Internet users trade the technical availability of more search choice (which they were always free to use beforehand) from new players in exchange for a more invasive/powerful government bureaucracy, combined with either the outright theft of private property or a big gift of cash to Big G (that said US taxpayers pay for). Plus the added benefit of likely disinvestment by Big G, yielding worse search quality in the future.

These flexible ethical standards are illustrated by the diagram below. Nearly everybody understands that #1 is immoral behavior: they will not put a gun to John’s head to force him to give up his possessions. They also get that it is wrong to pay a 3rd party intermediary to do it, as shown in #2 – still immoral, criminal and wrong for society. However, when we engage a 4th entity the ethics change, and far too many are now OK with force being used.

Knuckleheads purport that their position is ethical – it is not. If they somehow stole the technology we’d all see it for what it is; even if they had an offshore team or hackers do the dirty work it would still be plain to see. Only by whitewashing through a 4th party, one that is supposed to be disinterested (the government), is it possible to pull this off. Knuckleheads is #3, plain and simple.

Visual Showing Illogic of Flexible Ethics

Alternatives

It is disappointing, then, that there isn’t much in the way of free market solutions being proposed. These could incent site owners or Big G to modify their behavior. Behavioral economics and game theory could likely solve this problem without the heavy hand and the nasty unintended consequences. Some thought-starters:

  • Open source consortium offering alternative cache
  • Tax incentives to make it more appealing for Big G (and any others) sharing cache or cloud server capacity
  • Crowdfunding campaign to support the effort
  • Education of the market
  • Etc…
Interesting Framework on Rent-seeking

I ran across the above framework, which is telling and relevant. Seeking a monopoly from the government is the strategy of weak competitors and weak co-operators. Not a good look.

Not that I say it lightly, but it would be better to break Big G up into many smaller businesses and force more competition and fairer dealings within the advertising, analytics, hardware, consumer products and data businesses. That is a tall order, since nobody is forced to use Big G. Sadly, it is easier for Knuckleheads to find some contemptible career politician looking for a cause and a donation.

Part III – Big G & Media Minion Maneuvers

3rd Post in a 6-Part Series About the Behavioral Data Land Grab

Big G and many of their media minions are quick to point out that by using the new global site tag, they can get around ITP’s 3rd party tracking limitations. The reason is that the GTM tag architecture tricks the user’s Web browser into treating it as 1st party by changing the context. The legacy DFA Floodlight tag cannot do this, as it is a plain and simple 3rd party tracker in a 3rd party context. That DoubleClick impression cookie, served up on ad delivery (on the media publisher’s site) and then later checked for by the DFA Floodlight (on the advertiser’s site), is notorious enough at this point to be easily black-listed by blockers and anti-virus platforms.
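For context, a legacy Floodlight tag is essentially a 1×1 iframe pointing at a doubleclick.net URL – unmistakably 3rd party to the browser. The sketch below is simplified, with placeholder IDs; real tags carry additional parameters:

    <!-- Simplified legacy Floodlight tag; src/type/cat values are placeholders -->
    <!-- The request goes to a doubleclick.net domain, i.e. a 3rd party context -->
    <iframe src="https://fls.doubleclick.net/activityi;src=1234567;type=sales;cat=purch;ord=1?"
            width="1" height="1" frameborder="0" style="display:none"></iframe>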

Voted Most Likely to be Blocked, Deleted or Purged

Manufactured Crisis?

The global site tag (gtag.js) request often comes from the media agency team as a panicked rush to install the new code snippet – tout de suite. The implication is that if these new global site tags are *not* used, then campaign measurement and therefore campaign performance will dramatically suffer or become questionable. The implied benefit of the new global site tag is that, at minimum, current paid search measurement accuracy will be better. What this really means is that Big G AdWords conversions (click-through and post-click-through) can be more accurately counted.
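For reference, the snippet in question is pasted directly into the advertiser’s own pages, roughly in the shape below (AW-123456789 is a placeholder conversion ID). Because it loads and runs in the advertiser’s own domain context, the cookies it writes look 1st party to the browser – which is exactly the end-run described above:

    <!-- Global site tag (gtag.js), placed on the advertiser's own pages -->
    <!-- AW-123456789 is a placeholder conversion ID -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=AW-123456789"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'AW-123456789');
    </script>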

Most advertisers and their agencies will miss the nuance. They may not realize that simply showing more conversions in AdWords reports does not mean that Big G paid search actually caused more of them to happen. Questionable incrementality is a broader problem with paid search attribution and Big G’s walled garden of search performance data. That aside, more accurately showing the universe of conversions an advertiser is already receiving only means that Big G’s AdWords reporting approaches the conversion tracking accuracy of site analytics like Adobe’s. Stated differently, Big G fixed its own conversion tracking problem (caused by 3rd party blocking from ITP, plug-ins and anti-virus deleters); before the global site tag, AdWords relied on a predicted count, and that is what has been reported out for years. It is all about Big G more confidently taking greater credit for more of the conversions in their analytics system (not advertisers’).

Dropping the Ball: Who Do They Work for Anyway?

Instead of pushing back on Big G on behalf of their clients or suggesting alternative solutions, too many media agencies are not doing their diligence. They are pressuring clients to just go along with the request, merely parroting that Big G recommends this without much question. The implication is that the global site tag is needed for the media agency to measure better and therefore to do their very job. At the same time, most digital advertisers today do not want to give their media agency another excuse for bad analytics and poor measurement. Meanwhile Apple ITP is conveniently blamed for the problem.

Judas Goat Leads the Sheep Up to Slaughter at Chicago Stockyards

Agencies are expected to be stewards of their clients’ digital media business, so this is an unabashed agency fail. All told, the new global tag combined with an expanded tag footprint on the Web site is a shifty way for Big G to also ingest more highly valuable behavioral data at the expense of digital advertisers. Even more unseemly is that this is a clever end-run around advertisers that sought to limit Big G’s behavioral data access by not using their Analytics/Tag Manager products in the first place. Worse, the end result is dysfunction within analytics teams and the hidden operational costs of maintaining a redundant, de facto tag management system.

Such conduct by those representing themselves as agents of marketers is disappointing. Unfortunately, it is consistent with the unflattering issues of undisclosed incentives and rebates from tech companies, media vendors and others that were revealed in the ANA Media Transparency Report of 2015. Digital advertiser clients themselves are not blameless: the buck needs to stop with them.

Next: Part IV – A Trojan Horse for Digital Marketers

Response: Did Google Just Kill Independent Attribution?

Response to Martin Kihn of Gartner’s piece at AdExchanger, Did Google Just Kill Independent Attribution?

———————–

Interesting. Digital advertisers relying on those IDs for MTA sacrificed quality for expedience a long time ago. That cookie, and anything relying on its stability, is the poster child for unreliable 3rd party tracking, i.e. bogus measurement and imprecise targeting.

#1: I’m glad you brought up trust, which hopefully advertisers are paying attention to now in the post-ANA world. The best approach is also “to verify”.

#2: It is hard to believe that advertisers are willingly uploading their data to ADS.

#3: Well-said. Some media agencies may be conflicted about this, while others are moving their clients away from the expedient choice.

#4: It should now be clear: digital advertisers that chose expedience have lost the competitive advantage. Worse, they have been measuring and targeting with garbage data for years.

My recommendation to clients is to run from reliance on this dirty bird and find point solutions where they can leverage 1st party methods, do quality control, and own their own destiny.

Stepping Over Data Dollars to Save Pennies