
Machine Learning, a key component in business model transformation

The digital revolution is driving deep changes in consumer habits, caused, among other factors, by greater access to information and the rapid development of new technologies. All this invites an in-depth review of the business models currently in use.

A fundamental driver of business model transformation is data science, which is based on the combined use of machine learning techniques, artificial intelligence, mathematics, statistics, databases and optimization.

Several factors, mainly technology related, are driving the adoption of data science techniques across many different areas. These factors can be grouped around four themes: (i) the unprecedented increase in the volume and variety of available data, (ii) data connectivity and access, (iii) improvements in algorithms, and (iv) the increased computational capacity of systems.

Regarding the volume of data, several studies show metrics that raise awareness of the magnitude of such growth. Some of the most relevant are as follows:

  • According to recent reports, 90% of all data created in the history of humanity was produced during the last two years, and 40% annual growth is expected over the next decade. The volume of available data today is even higher due to developments in both communications, known as Machine to Machine (M2M), and the so-called Internet of Things (IoT).
  • Studies published by large telecommunications companies estimate that the number of devices connected to the Internet will be more than three times the world population by 2020, and that the number of IoT connections will reach 13.7 billion that year, up from 5.8 billion in 2016.
  • As a result, by 2020 the total volume of existing data will reach 44 trillion gigabytes. 
  • Of these, a large amount of data is generated directly in the digital environment, as is the case with Google searches (40,000 searches per second), Facebook messages (31 million messages per minute) or video and picture data (300 hours of video uploaded to YouTube every minute).
  • By 2020 it is estimated that all mobile devices will include biometric technology. It is also expected that, by that year, at least a third of all data will be sent through the cloud.

Secondly, connectivity improvements represent a qualitative leap that enables new services and business models based on real-time data generation and analysis, with services and/or prices adapted according to usage: data are generated and collected automatically through sensorized and digitalized point-of-sale terminals, creating a continuous flow of information. Much of this connectivity takes place between machines: once an action is performed, the data generated by the different digital components involved are sent to servers to be stored and analyzed. This type of M2M connection has grown to reach 1.1 billion connections in 2017.

Thirdly, improvements in algorithms have made it possible to optimize the processing of large data volumes (through scaling, resampling, etc.), to obtain more efficient and robust methods, and to handle missing data, non-numerical variables and outliers. Although most of these algorithms were developed before 2000, companies are now making a major effort to implement them, in some cases achieving better results than those produced by humans. To provide a few examples:

  • DeepMind's AlphaZero and AlphaGo algorithms play chess and Go at a level beyond what humans are currently capable of.
  • An algorithm based on artificial intelligence can detect breast cancer 30 times faster than a doctor and with 99% accuracy.
  • In the United States, robo-advisors had 25.83 million users in 2018, representing a market penetration rate of 1.8%. This rate is expected to reach 8.3% in 2022.
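The robustness improvements mentioned above (handling missing data, non-numerical variables and outliers) can be illustrated with a minimal sketch in plain Python. All function names and data below are invented for illustration; real pipelines would rely on dedicated libraries:

```python
def clean_numeric(values, lower_pct=0.05, upper_pct=0.95):
    """Impute missing values with the median and cap extreme outliers."""
    present = sorted(v for v in values if v is not None)
    median = present[len(present) // 2]
    lo = present[int(lower_pct * (len(present) - 1))]
    hi = present[int(upper_pct * (len(present) - 1))]
    return [min(max(v if v is not None else median, lo), hi) for v in values]

def one_hot(categories):
    """Encode a non-numerical variable as indicator columns."""
    levels = sorted(set(categories))
    return [[1 if c == level else 0 for level in levels] for c in categories]

incomes = [1200, None, 1500, 99999, 1300]   # a missing value and an outlier
segments = ["retail", "sme", "retail"]       # a non-numerical variable

print(clean_numeric(incomes))  # → [1200, 1500, 1500, 1500, 1300]
print(one_hot(segments))       # → [[1, 0], [0, 1], [1, 0]]
```

The point of the sketch is that such cleaning steps, once manual, are now routinely automated as part of the model estimation process.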

Finally, improvements in computing capacity, which have been huge over the last few decades due to advances in processor technology, are now being driven by other key factors such as the significant development of programming languages (both general-purpose languages and those used in data processing, visualization, algorithms, etc.), cloud computing, and especially the design of new computing architectures aimed at machine learning tasks, data analysis and engineering applications, such as graphics processing units (GPUs).

In summary, over the last two decades the availability of digital data has increased almost 1,000 times, while algorithms have become 10 times more efficient and computing speed has increased 100 times. All this has led to a renewed interest in these techniques as a formula for adding value to information in the new business environment. 

Machine Learning: more than half a century of history

Machine learning techniques are experiencing an unprecedented boom in a number of fields, both academic and business, and are an important lever for transformation. While these techniques were already known in both worlds, several factors are making their use widespread in areas where it was previously limited, and extending it to fields where they were hardly used before, owing both to high implementation costs and to the small profit initially expected from their application.

Machine learning techniques can be defined as a set of methods capable of automatically detecting patterns in data. Under this definition, the concept of machine learning has existed since at least the 1950s, a period in which various statistical methods were developed, refined and applied through simple algorithms, though almost exclusively within the academic field.

Since then, the concept of machine learning has involved using the identified patterns to make predictions, or to make other types of decisions, in environments of uncertainty.
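The idea of detecting patterns and using them to predict can be sketched with one of the simplest such algorithms, a nearest-neighbour classifier, written here in plain Python with invented data:

```python
def predict(train, labels, point):
    """Label a new point with the label of its closest training example."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(range(len(train)), key=lambda i: dist(train[i], point))
    return labels[nearest]

# Two toy "patterns" in the data: low-spend vs high-spend customers.
train = [(1.0, 2.0), (1.5, 1.8), (8.0, 8.5), (9.0, 7.5)]
labels = ["low", "low", "high", "high"]

print(predict(train, labels, (8.5, 8.0)))  # → high
```

The model makes no assumption about the data-generating process: the "pattern" it exploits is simply the geometric clustering of past observations, which is then used to decide about an unseen case.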

Machine learning techniques represent a step beyond classical statistical techniques in that they enhance the model estimation process: not only do they increase predictive power through new methodologies and techniques for selecting variables, but they also improve process efficiency through automation.
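As a hedged illustration of automated variable selection (one driver of the improved efficiency described above), the sketch below keeps a variable only if a simple univariate fit on it explains a large enough share of the target's variability. The rule, thresholds and data are all invented for illustration:

```python
def sse(xs, y):
    """Squared error of a univariate least-squares fit of y on xs."""
    n = len(y)
    mx, my = sum(xs) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, y))
    var = sum((a - mx) ** 2 for a in xs) or 1e-12
    beta = cov / var
    return sum((b - my - beta * (a - mx)) ** 2 for a, b in zip(xs, y))

def select_variables(X, y, min_gain=0.5):
    """Keep variables whose fit removes at least min_gain of the baseline error."""
    base = sum((b - sum(y) / len(y)) ** 2 for b in y)
    return [name for name, xs in X.items() if sse(xs, y) <= (1 - min_gain) * base]

X = {
    "income": [1, 2, 3, 4, 5],   # strongly related to the target
    "noise":  [5, 1, 4, 2, 3],   # unrelated
}
y = [2, 4, 6, 8, 10]

print(select_variables(X, y))  # → ['income']
```

In practice such screening would be one automated step among many (regularization, cross-validation, etc.), but it conveys how the selection of variables can be delegated to the algorithm rather than performed by hand.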

Within this context, the present study aims to provide insight into the digital revolution and its impact on the transformation of business, with a special focus on machine learning techniques.

For this purpose, the document is structured in three sections, which correspond with three objectives:

  • Illustrate the development of the digital revolution and its impact on different fronts.
  • Introduce the machine learning discipline, describe different approaches and outline current trends in this field.
  • Present a case study to illustrate the use of Machine Learning techniques in the specific case of the financial industry.

For more information, click here to access the full document in pdf (also available in Spanish and Portuguese).

