• Telefónica México offered insights on building momentum behind replicable, data-driven local systems with potential for wider deployment.
  • Groundwork on establishing principles and ensuring standardisation is vital if benefits are to flow through to subsequent use cases.
  • Real-world implementation of a standardised approach can cause initial localised pain, but is proving an effective way to pursue a central data vision flexible enough to address regional expectations.

Daniel Vaughan, Telefónica México, on a panel discussing the realities of AI

Source: TM Forum

Daniel Vaughan, Head of Data Science and Chief Data Officer, Telefónica México, discussed choosing the right data model in a presentation at Digital Transformation World 2019. During his talk, entitled What is the right data model for telcos to adopt?, he considered the choices Telefónica Group has made in developing a centrally driven data strategy that can accommodate local market requirements, while ensuring new data services are also replicable across its footprint.

Vaughan acknowledged that the right data model will vary depending on the operator involved, but noted that Telefónica approached the process by considering two key questions: “what type of data company are we?” and “what type of data company do we want to be?”.

From here, the operator was able to begin the development, and subsequent implementation, of its Unified Data Reference Model (URM – Telefónicawatch, #121, passim), a process described as “very hard, very complicated”, but one that Vaughan considers is now providing valuable benefits.

Design phase considerations – breaking silos in a straitjacket

Vaughan said that the two key questions revealed a set of core values and objectives for Telefónica as a customer-centric telco, which the URM had to complement.

The three core values identified are:

  1. Empowerment – the data is owned by customers and Telefónica must ensure they have access to it at any time.
  2. Transparency – being clear with customers about what type of data Telefónica holds about them and how the company uses it.
  3. Security – assuring customers that their data is safe with Telefónica at all times.

Telefónica Group was recently listed as the highest-performing telco in the 2019 Corporate Accountability Index compiled by non-profit citizens’ rights entity Ranking Digital Rights (RDR). However, RDR found that Telefónica and the other leading companies in the Index fell short in key areas affecting freedom of expression and privacy, with the Spanish telco scoring 47% and 49% in the two categories, respectively. Although Telefónica was found to have made a number of improvements to its privacy policies, it still fell short on disclosure in several areas, particularly around keeping users informed of the way data is shared with third parties (Telefónicawatch, #135).

Building on these core values, Vaughan moved on to the company’s four major objectives for using data as an asset:

  1. Creating value from the data and making intelligent decisions, based on the right data. This entails building use cases to solve specific business problems using predictive modelling, machine learning and AI.
  2. Producing a data model in a fast, agile way to get to market as quickly as possible, to support partners in its ecosystem, including non-telcos.
  3. Sharing data and knowledge across its operating businesses (OBs). Vaughan noted that Telefónica has found that OBs historically solve the same problems over and over, with valuable knowledge trapped in geographical silos. Vaughan said, “We wanted to break this and create one knowledge base”.
  4. Providing structure with flexibility. Vaughan acknowledged the challenge of striking this balance, noting that too much flexibility creates trouble in the future, while too little creates problems in the present.

Data models are like straitjackets; they restrict what you can do in future. So we have to think about it in two ways in the design phase. It has to be ‘forward-compatible’, that is, able to adapt to changing market circumstances, and to be backward compatible. ”

– Vaughan.
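
Vaughan’s two-way compatibility point can be made concrete with a short sketch. The following Python fragment is purely illustrative, with invented entity and field names that are not drawn from the URM; it shows one common way a record loader can bend in both directions, loading older rows that lack newer optional fields while tolerating newer rows that carry fields it does not yet know about.

```python
# Illustrative only: invented entity and field names, not the URM's.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CustomerRecord:
    customer_id: str                     # mandatory in every schema version
    market: str                          # e.g. "MX", "ES"
    tenure_months: Optional[int] = None  # optional field added in a later version

    @classmethod
    def from_row(cls, row: dict) -> "CustomerRecord":
        # Backward-compatible: old rows without 'tenure_months' still load.
        # Forward-compatible: unknown keys in newer rows are ignored rather
        # than rejected, so existing code keeps working as the model evolves.
        known = set(cls.__dataclass_fields__)
        return cls(**{k: v for k, v in row.items() if k in known})

old_row = {"customer_id": "c-001", "market": "MX"}
new_row = {"customer_id": "c-002", "market": "ES",
           "tenure_months": 18, "loyalty_tier": "gold"}  # unknown extra field
print(CustomerRecord.from_row(old_row))
print(CustomerRecord.from_row(new_row))
```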

URM enablers

Vaughan talked through three enablers which he said were instrumental in the creation of the URM, based on the data principles identified by the Group.

  1. The first enabler was the creation of a common data model for all OBs. Vaughan stressed that the models used for the URM would look familiar to anyone who has worked with other data models: “a set of tables or entities, in data jargon, with headings and columns”. The key distinguishing element is that these tables are uniform across geographies, which he said means the model can be implemented in unique ways to address local operational issues, but at the same time be applied by other OBs (see implementation below, and the sketch after this list).
  2. The second enabler was a standard set of APIs. These common tools were said to be vital for ensuring communication across countries regardless of any local systems in place. “We want to share code, to transfer software and code among us, independent of whatever vendor we have,” he said.
  3. The third enabler was installing a consistent technology stack – operating with the same software and same version – across all operating businesses.
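
As a rough illustration of how the first two enablers combine, consider the Python sketch below. The table definition, class names, and API shape are assumptions made for this example, not Telefónica’s actual schema or interfaces: one schema shared by every OB, plus one standard access interface that each OB implements over whatever vendor stack it runs, so analytical code written in one market can run unchanged in another.

```python
# Illustrative only: invented table, class, and function names.
from typing import Protocol, Iterable

# Enabler 1: one table definition shared by every operating business (OB).
SUBSCRIBER_TABLE = {
    "name": "subscriber",
    "columns": ["subscriber_id", "market", "plan_code", "activation_date"],
}

# Enabler 2: a standard API every OB implements, whatever vendor system
# sits underneath, so code can be transferred between markets.
class SubscriberStore(Protocol):
    def fetch(self, market: str, columns: Iterable[str]) -> list[dict]: ...

class MexicoStore:
    """One OB's implementation against its local backend (stubbed here)."""
    def fetch(self, market: str, columns: Iterable[str]) -> list[dict]:
        row = {"subscriber_id": "s-1", "market": market,
               "plan_code": "P10", "activation_date": "2019-01-15"}
        return [{c: row[c] for c in columns}]

def churn_inputs(store: SubscriberStore, market: str) -> list[dict]:
    # Written once, runnable against any OB's store, because the schema
    # and the API are identical across geographies.
    return store.fetch(market, ["subscriber_id", "plan_code"])

print(churn_inputs(MexicoStore(), "MX"))
```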

That’s how we create value in the data economy, doing these three things and sharing code. ”

– Vaughan.

While Telefónica built the URM internally, Vaughan acknowledged that in doing so it had drawn on best practices from TM Forum’s long-established data reference model, the Information Framework, usually referred to as the SID (shared information/data model).

Implementation pains providing lasting lessons

With the theory in place, Vaughan said putting the URM into practice proved very painful – more so for the local teams than the global ones. The early deployments did, though, provide five main lessons that have been applied to the future development of the URM.

  • Finding the right use cases. Use cases are the fundamental unit for validation of data models, and global data teams need to identify the most effective and valuable local OB models, and convert these to global models to be implemented across all markets.
  • Ensuring data consistency to enable efficient re-use. Vaughan stressed that data prepared for one use case should be normalised so that savings flow through to future applications (see the sketch after this list). Giving the example of Mexico, he said it took several months of work to ensure consistent data could be loaded into the first URM-based use case, a network optimisation algorithm. However, when the team moved on to the second use case, device recommendation, an estimated 70% of the data had already been “normalised” as part of the network optimisation work, meaning the cost of deploying the second application was substantially lower than the first.
  • Recycle where possible. Vaughan said that some OBs had successfully deployed use cases using machine learning, and the Group was looking at whether it is possible to refactor the code and deploy it elsewhere. This remains a challenge, however, where OBs have developed code that may have been built with pre-URM local data models in mind.
  • Move quickly, but do not break things. Vaughan said the Group’s core values mean a note of caution needs to be added to any rush to develop new systems, “because if you put garbage in, you get garbage out… even using the best algorithms in the market is only as good as the data you put in”. Telefónica is said to run automatic data governance checks on new developments, and also uses an external benchmark to monitor practices within OBs. Nevertheless, the Group wants to make rapid progress on new solutions, and is currently exploring use cases based on “minimal viable data”, which minimises the number of data fields that need to be normalised for any single application.
  • Patience is a virtue. Vaughan highlighted that existing data models across OBs had their own unique naming conventions. On occasion, this required a parallel data model to be built based on the standardised practices in order to leverage the value of machine learning technology. Some OBs apparently still have two data models in operation. “This is painful”, Vaughan acknowledged, “you have to be patient”.
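
The reuse effect Vaughan quantifies for Mexico can be sketched in a few lines of Python. The feature names and the registry below are illustrative assumptions, not Telefónica’s pipeline; the point is simply that fields normalised for the first use case are recorded once, so a second use case only pays the preparation cost for the genuinely new ones.

```python
# Hedged sketch: invented feature names, toy registry.
normalised = {}  # feature name -> prepared column, built up per use case

def ensure_feature(name: str, build):
    """Normalise a feature only if no earlier use case already has."""
    if name not in normalised:
        normalised[name] = build()  # the expensive cleaning/alignment work
    return normalised[name]

# Use case 1: network optimisation pays the full preparation cost.
for f in ["cell_load", "signal_quality", "data_usage", "device_model"]:
    ensure_feature(f, lambda f=f: f"normalised({f})")

# Use case 2: device recommendation reuses most of that work; only the
# genuinely new fields are normalised, so it ships far more cheaply.
needed = ["data_usage", "device_model", "handset_age"]
reused = [f for f in needed if f in normalised]
print(f"reused {len(reused)}/{len(needed)} features")  # reused 2/3 features
```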

Significant gains feeding into a virtuous cycle

Despite the challenges, Vaughan was predictably upbeat on the benefits of the process, and stressed the success of the components built on top of the URM in his home market of Mexico. He also noted that the groundwork in the OB is expected to provide smoother sailing for future solutions, both within the OB and across the Group as a whole.

In Mexico we found the URM worked well, but it still needed extra data to deploy machine learning solutions, which is what data scientists call predictive variables or predictive models. So we decided to create a feature meta-store, because data scientists [across the Group] are creating the same features or variables over and over again. The feature store has been very helpful, enabling the team in Mexico to build machine learning-based and AI solutions very quickly as a common strategy. ”

– Vaughan.
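
A feature meta-store of the kind described in the quote can be approximated with a minimal registry. The sketch below is a toy under assumed names, not Telefónica’s implementation: each predictive variable is defined once, and later use cases retrieve it rather than rebuilding it.

```python
# Hedged sketch of a feature store: assumed names and API shape.
from typing import Callable, Dict

class FeatureStore:
    def __init__(self) -> None:
        self._features: Dict[str, Callable[[dict], float]] = {}

    def register(self, name: str, fn: Callable[[dict], float]) -> None:
        # Reject duplicates: the whole point is to define a feature once.
        if name in self._features:
            raise ValueError(f"{name} already defined; reuse it instead")
        self._features[name] = fn

    def compute(self, name: str, record: dict) -> float:
        return self._features[name](record)

store = FeatureStore()
# Defined once by whichever team needs it first...
store.register("avg_daily_mb", lambda r: r["monthly_mb"] / 30)
# ...then reused by later use cases instead of being rebuilt.
print(store.compute("avg_daily_mb", {"monthly_mb": 3000}))  # 100.0
```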

Vaughan said that use cases are now beginning to be effectively reused across OBs, with local resources creating universal solutions, and a virtuous circle emerging between local development teams and the centralised global office.

This is also said to ensure that Telefónica’s core values are adhered to, and regulatory standards met, in all markets, while supporting the principal objective of strengthening and deepening customer relationships.