Untagged Sponsor

We sponsored Untagged IO 2016

On November 9, the first edition of Untagged, an international web analytics event, took place at the Google Campus in Madrid, which we attended as sponsors.

It was an event held entirely in English, with a high level both in the profile of the speakers and in the audience. Since it is not easy to summarize more than six hours of talks in a single post, here we collect the main ideas from each speaker's presentation, along with links to the slides the authors have made public.

Table of Contents

1. Brian Clifton: “The Current State of Digital Analytics”

Brian took a brief tour of web analytics from its inception to today, and his presentation revolved primarily around the importance of data quality.

He posed an essential question that must be asked before performing any analysis: can you trust the data your web analytics tool gives you? It's a basic question, especially if you invest more than $100,000 in a premium web analytics tool.

And, according to Brian's research, half of all websites get even the basic implementation of the tool wrong, with pages where the measurement code is missing or duplicated. Other figures he shared: only 21% correctly track form submissions, 26% measure transactions well, 25% measure campaigns correctly, and only 4% segment their users correctly…

To know whether you can draw accurate conclusions from the data, it is essential to audit its quality before any other process… and that is not a straightforward job: an audit can take more than three days of work.
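As a rough illustration of the kind of check such an audit involves (this is a sketch of ours, not Brian's actual tooling), a script can scan a page's HTML for the Google Analytics snippet and flag pages where the code is missing or duplicated:

```javascript
// Illustrative audit check: count occurrences of the analytics.js
// reference in a page's HTML and flag missing or duplicated code.
function auditGaSnippet(html) {
  var matches = html.match(/google-analytics\.com\/analytics\.js/g) || [];
  if (matches.length === 0) return 'missing';
  if (matches.length > 1) return 'duplicated';
  return 'ok';
}
```

Run over every page of a site, even a check this simple surfaces the missing/duplicated tracking codes Brian mentioned.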

In any case, his recommendation was never to build a dashboard until the data quality index exceeds 50%.

Twitter: @BrianClifton

2. Kristoffer Ewald: “Innovations in Digital: New Models and Metrics”

In an environment where data is everywhere, Kristoffer referenced the “Cluetrain Manifesto,” which starts by claiming that markets are conversations and that those conversations create dialogues.

Concepts such as the six degrees of separation are no longer valid, simply because with social networks we are now much more connected to each other than before. We have gone from a world where information was centralized in a single node to one where it is totally decentralized, where popularity is not influence and where the web is not the center but just one more channel.

Currently, in the digital world we are facing challenges such as the following:

  • We know that we have to segment, but we do not know what the right criteria are: among a sample of users we can segment by countless criteria — those who wear hats, women, those who wear green jerseys, those who wear blue pants… which segmentation do we actually need?
  • Even today, many tools continue attributing the conversion to the last click, when we live in a world characterized by multichannel journeys and device overlap: the challenge is to find the right model.
  • In standard analytics we usually lack a fundamental metric: impressions… how, then, can we measure conversion correctly?
  • The expiration time of a cookie is ridiculously short, as many conversions happen beyond 90 days; they can even take years. If we buy a car, it makes very little sense to remarket to us within a few days of the purchase, but it does make sense years later, when it is presumable that we are considering changing cars again.
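The attribution challenge above can be sketched with two toy models over a single conversion path (channel names and figures are invented for illustration):

```javascript
// Last-click attribution: all credit goes to the final touchpoint.
function lastClick(path, value) {
  var credit = {};
  credit[path[path.length - 1]] = value;
  return credit;
}

// Linear attribution: credit is split evenly across every touchpoint.
function linear(path, value) {
  var credit = {};
  path.forEach(function (channel) {
    credit[channel] = (credit[channel] || 0) + value / path.length;
  });
  return credit;
}

var path = ['display', 'organic', 'email', 'direct'];
// lastClick(path, 100) → { direct: 100 }
// linear(path, 100)    → 25 to each of the four channels
```

The last-click model erases the display and organic touches entirely, which is exactly the distortion Kristoffer warned about.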

In short, all these challenges lead us to constantly rethink the models and metrics to be used.

Twitter: @kewald

3. Simo Ahava: “Measuring engagement: Google Tag Manager & Google Analytics”

Google Analytics is an excellent tool, but even measuring a metric as basic as the session can be problematic:

  • It resets after 30 minutes.
  • It changes at midnight and depending on the time zone selected in each view.
  • It changes depending on whether the cross-domain configuration is implemented.
  • It may have been implemented incorrectly in Google Analytics or in Google Tag Manager.

Google Tag Manager allows us to make configurations that are very useful for content measurement.

Simo gave us advice on implementing measurement through content groupings, making it possible to group content by author, publication date, header, title length, article length… all through custom JavaScript variables that we can then map to the content grouping fields in Google Tag Manager, or even send as custom dimensions on pageviews.
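As a minimal sketch of the kind of variable Simo described (in GTM this would be an anonymous Custom JavaScript variable reading from the DOM; here it takes the article text as a parameter so the example is self-contained, and the bucket thresholds are our own invention):

```javascript
// Hypothetical content-grouping variable: bucket an article by its
// word count so engagement can be compared across article lengths.
function contentGroupByLength(articleText) {
  var words = articleText.trim().split(/\s+/).length;
  if (words < 500) return 'short';
  if (words < 1500) return 'medium';
  return 'long';
}
```

The returned value would feed a content grouping field or a custom dimension on the pageview hit.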

With this, you can perform very useful advanced analyses, such as studying the engagement of blog articles based on their word count or the number of images they include…

To wrap up, Simo left us these five points to keep in mind:

  1. Data quality must be achieved; it does not come by default.
  2. Google Analytics is too generic in its basic settings.
  3. Web analytics platforms must be customized to obtain data-driven information that serves the organization.
  4. Understanding how content can be measured is a great opportunity to measure goals.
  5. Knowledge of web technology is a fundamental requirement for a modern web analyst.

Twitter: @SimoAhava

Download the presentation here

4. Ani López: “Store, Extract, Transform, Load, Visualize”

Ani explained the data processing pipeline, from storage to visualization. To perform an analysis, we must follow these steps:

1) Store

We must create our own database, integrating all the data from the different sources. This is not simple; sometimes it is difficult even to get access to the data we are asked to analyze.

This database can be anything from a simple Excel file to a cloud solution like Amazon Redshift.

2) Extract

There are affordable tools, such as Next Analytics or Supermetrics; others not so affordable, such as Analytics Canvas or Xplenty; and others, such as Google Cloud Dataflow or Amazon Kinesis, that require the support of DBAs (database administrators) or engineers.

3) Transform

First of all, you have to clean, enrich and bring consistency to the database.

Then comes the data modeling phase, for which it is important to have knowledge of SQL and JavaScript and/or Python.
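A toy version of such a transform step might look like this (the row shape is invented for illustration): drop malformed rows, normalize the channel name, and aggregate sessions per channel:

```javascript
// Toy transform step: clean rows from heterogeneous sources and
// aggregate sessions per normalized channel name.
function transform(rows) {
  var totals = {};
  rows
    // Clean: drop rows missing a channel or a numeric session count.
    .filter(function (r) { return r.channel && typeof r.sessions === 'number'; })
    // Consolidate: normalize the key and sum sessions per channel.
    .forEach(function (r) {
      var channel = r.channel.trim().toLowerCase();
      totals[channel] = (totals[channel] || 0) + r.sessions;
    });
  return totals;
}
```

In practice this logic would live in SQL or Python over the intermediate tables Ani describes; the point is that cleaning and consolidation happen before any visualization tool sees the data.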

4) Load

It is not advisable to connect the visualization tool directly to the database, since we can run into calculations that take several days, with the resulting computing costs.

It is better to load the data into intermediate tables and connect those to visualization tools such as Tableau.

As advice for those who do not know SQL, Xplenty can be used.

5) View

One caveat: a dashboard is not the same as a visualization tool. Insights do not come from dashboards, but from the analyst's work.

Recommended tools: Excel, Qlik, Tableau, Data Studio…

In visualizations, it is advisable to avoid outdated designs with extravagant colors (“Lady Gaga dashboards”) or pretentiously futuristic ones (“Tron Legacy”).

Twitter: @anilopez

Download the presentation here

5. Virginia Relea and Nicolás Lozano: “DMP, segment & PUT your DATA to work”

We were introduced to Data Management Platforms (DMPs), which allow us to centralize, define, customize, automate and enrich data.

They gave us a demonstration of how they use their DMP to track a job offer: from its publication on a specialized portal, through the sessions the user makes on the blog, to creating personalized ads that take the user to the company's recruitment page. If the user then applies and is interviewed, the interviewer has at their disposal the complete trace of the steps the user has followed and, for example, which topics interested them most.

Twitter: @virelga @nlozanoargue

6. Yehoshua Coren: “Google Analytics For Smarties”

In a jovial and relaxed atmosphere Yehoshua insisted on the importance of segmentation and remarketing to segmented users.

He told us about the possibilities offered by analyzing horizontal funnels in Google Analytics, building funnels from Google Analytics goals.

He recommended using custom metrics for the site's main interactions, as well as calculated metrics, with special emphasis on putting users in the denominator rather than sessions.
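A minimal sketch of that calculated-metric idea, with invented figures:

```javascript
// Sketch of a calculated metric: conversion rate per session vs per
// user. Figures below are invented for illustration.
function conversionRates(conversions, sessions, users) {
  return {
    perSession: conversions / sessions,
    perUser: conversions / users
  };
}

// With 50 conversions, 2000 sessions and 800 users:
// perSession = 0.025, perUser = 0.0625
```

When users average several sessions, the per-user rate tells a very different story from the per-session rate, which is why Yehoshua favors users in the denominator.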

He also touched on the concept of two-tiered segmentation, integrating the customer database with that of the web analytics tool, as well as the concept of a Smart Data Layer: placing relevant parameters in the dataLayer to incorporate into the analytics data.

Finally, he stressed analyzing user intent through sequential segments, and the importance of the Remarketing Grid concept.

Twitter: @AnalyticsNinja

7. Marie Loridan O’Meara: “Firebase Analytics: Apps success made simple”

She introduced us to Firebase, the tool acquired by Google that offers great advantages for both developers and analysts. Firebase Analytics is a free analytics tool that works on both iOS and Android apps.

She told us about the potential of apps: just as talk began that download growth was about to hit a ceiling and reach maturity, app revenues have been observed to grow at a much greater rate, offering a growing horizon of profitability.

Among the advantages for developers: thanks to Firebase, modifications can be made to the app without constantly asking users to update it, crash reports are available…

For marketers, it provides an analytics platform with event tracking, engagement and growth metrics, deeplinking, and integrated data from both Android and iOS.

Twitter: @marieloridan

8. Dave Sottimano: “Log Analytics and PRO use cases for search marketers”

Dave told us about the direct relationship between Googlebot's crawl frequency and organic traffic, all to highlight the importance of analyzing the trail these bots leave in the server logs.

He explained the process of preparing those logs: gathering all the logs for the correct host with the right fields, verifying the IP addresses, identifying genuine Googlebot visits, and integrating the crawl data with Google Analytics.

All this in order to then perform a complete analysis and improve crawling, with the aim that every page of a site is visited by Googlebot at least once every 28 days.
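A rough sketch of the first step of such a log audit (assuming Apache combined log format; a real audit must also verify the bot's IP via reverse DNS, since the user-agent string can be spoofed):

```javascript
// Filter log lines whose user agent claims to be Googlebot and count
// hits per requested URL. Assumes Apache combined log format.
function googlebotHits(logLines) {
  var hits = {};
  logLines
    .filter(function (line) { return line.indexOf('Googlebot') !== -1; })
    .forEach(function (line) {
      // Extract the request path from the "GET /path HTTP/1.1" part.
      var m = line.match(/"(?:GET|POST) ([^ ]+) /);
      if (m) hits[m[1]] = (hits[m[1]] || 0) + 1;
    });
  return hits;
}
```

Pages that never appear in this count over a 28-day window are exactly the ones Dave's analysis would flag.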

Twitter: @dsottimano

Link to the presentation

Conclusion

In short, great speakers, a lot of information and excellent networking for an event that we hope will be repeated.

Our congratulations to the organizers, David Vallejo and Natzir Turrado, for successfully putting on a high-level international event like this in Madrid.