Emilio Moreno

Telecommunications Engineer from the Polytechnic University of Madrid.

I have been living and working in the cloud for a decade, and I currently work in product management at Telefónica Tech, where I combine technology with business (with the usual Excel sheets) across our VDC and VDC-Edge services, the latter being our commercial Edge Computing offering.

When I'm not in the cloud I like to travel, go for walks with my wife and read about historical topics.

Cloud
How to ensure business continuity in the face of disasters: keys to prevent your company from 'going down'
Until recently, the term 'going down' wasn't widely known outside of technical circles. But incidents like large-scale power outages have brought it into everyday conversation. As the saying goes, it's better to prepare for the storm before it hits. And while this wasn't thunder or a major storm, it certainly served as an important wake-up call. Many companies have experienced first-hand the risk of not being adequately prepared for an incident that can 'knock a business out', threatening their very survival.

Business Continuity strategies have been with us for some time, but it is in situations like this one that we become aware of their relevance, and also of the complexity involved.

The importance of analyzing risks to establish key parameters

It is sometimes assumed that if we have a Disaster Recovery solution in place, we are already protected. And while this is a fundamental part, it is far from being enough. If we don't consider all the pieces, we may find ourselves with unpleasant surprises.

The starting point is to have detailed knowledge of our business, the IT systems that deliver it and the human teams that support it. A tool such as Risk Analysis allows us to identify the impact of not being able to offer different functionalities of our services over different time intervals, for example, half an hour, four hours, a day, weeks, etc. This impact can be analyzed at different levels: the economic impact (income we will not receive, or financial penalties we may be subject to), the legal impact and also the reputational impact.

This analysis provides requirements for two of the most common parameters in Business Continuity and Disaster Recovery: the RTO and the RPO. RTO (Recovery Time Objective) indicates the time it takes to recover a system. RPO (Recovery Point Objective) indicates the point in time before the disaster to which we can restore data without loss.

By combining the requirements of the risk analysis with the different technical solutions on which our services are built, we can make a first design of the measures to be adopted. This process is not simple, and here a very relevant factor comes into play: cost. If we invest more, we will improve our RTO and RPO values, but unfortunately, the budgets for these activities are limited and, with many other urgent matters competing for them, they are often not among the highest priorities.

■ Identifying which elements are essential, or estimating the capacity required for a temporary contingency situation, helps to optimize the solution.

The importance of considering all scenarios and resources

On the other hand, it is essential to analyze the possible failure scenarios and to be aware of which scenarios a given solution actually protects us against. It would be of no use to have a solution that brings up our servers in another location if we have not considered the connectivity our users and clients need to access them.

One aspect that is sometimes not emphasized is human resources. It is not just a matter of servers, machines and networks. Just as important is being clear about how teams should respond, what they have to do, or simply how and by whom recovery procedures should be activated.
In some cases it may be obvious, such as a complete loss of infrastructure, but in others, more focused at the application level, it may not be so simple to assess what is happening or what actions need to be taken.

■ Business Continuity involves continuously reviewing the design and the scenarios, and improving after each contingency.

Another relevant aspect is the execution of periodic tests. This is not simple, because a test can usually affect the provision of our services. Fortunately, technology is making it possible to carry out more and more non-disruptive tests, which give us the reassurance that if we find ourselves in a real incident, we will be prepared and there will be no surprises.

On the other hand, as a major outage can show, Business Continuity is a continuous process in which we must review the validity of the design and the scenarios we contemplate and, if we have had a contingency, review the execution and identify areas for improvement.

Solutions to ensure continuity

At Telefónica Tech, we offer consulting capabilities to support our clients in designing effective Business Continuity policies. In addition, through our Telefónica Tech Cloud Platform service, we provide Disaster Recovery as a Service (DRaaS) solutions that safeguard customer infrastructure in the event of a disaster and automate geographic replication across multiple nodes.

These solutions also allow non-disruptive testing and make it possible to fine-tune those small details that are not always taken into account in the initial design. A common issue detected during testing, for example, is failing to account for all the connectivity end users need to consume the services.

Don't forget Business Continuity if you don't want your business to go down.
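As a purely illustrative note, this minimal Python sketch (all figures and step names are hypothetical examples, not a Telefónica Tech tool) shows how the RTO and RPO derived from a risk analysis can be checked against a concrete backup schedule and recovery runbook:

# Illustrative sketch only: all durations are hypothetical examples.
from datetime import timedelta

# Objectives derived from the risk analysis
rto = timedelta(hours=4)   # maximum tolerable time to restore service
rpo = timedelta(hours=1)   # maximum tolerable window of data loss

# Current design
backup_interval = timedelta(minutes=30)   # worst-case data loss between backups
recovery_steps = {                        # estimated duration of each runbook step
    "activate DR site": timedelta(minutes=45),
    "restore latest backup": timedelta(hours=2),
    "redirect user connectivity": timedelta(minutes=30),
}

estimated_rto = sum(recovery_steps.values(), timedelta())
print(f"Estimated recovery time: {estimated_rto} (RTO {rto}) ->",
      "OK" if estimated_rto <= rto else "NOT MET")
print(f"Worst-case data loss: {backup_interval} (RPO {rpo}) ->",
      "OK" if backup_interval <= rpo else "NOT MET")

In a real exercise, these durations would come from measured test runs rather than estimates, which is one more reason why the periodic testing discussed above matters.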
June 18, 2025
Cloud
VMware Explore: AI, VDC, sovereign Cloud, and Edge Computing
Once again, VMware celebrated the EMEA (Europe, Middle East and Africa) edition of its VMware Explore event, formerly known as VMworld, in Barcelona. Held at the Fira de Barcelona, thousands of attendees got first-hand insights into VMware's latest product developments and its vision for the future of technology and innovation. It was also an excellent opportunity to learn from customer experiences and other industry providers.

During the General Session, Hock Tan, CEO of Broadcom, took the stage. He stressed VMware's innovation capabilities and pledged to increase resources to maintain the company's innovative tradition and commitment to its partners.

VMware was the pioneer in developing computer virtualization technology and remains the reference in enterprise environments. Over time, through its own capabilities and a slew of acquisitions such as DynamicOps, Nicira, AirWatch, CloudHealth, VeloCloud, Heptio, Bitnami, Pivotal and Carbon Black, VMware has built an extensive portfolio of Cloud Computing solutions. VMware's strategy revolves around positioning itself in Cross-Cloud solutions, enabling workload management across different cloud and on-premises environments.

◾ One of the overarching themes of this edition was the impact of VMware's acquisition by Broadcom. Although it couldn't be executed on the originally planned date of November 1st due to regulatory and competition authority delays, some indicators of impending changes started to emerge. Finally, on November 22nd, after the event concluded, the last pending approval was received and the acquisition was completed.

Our participation in VMware Explore 2023

As in previous years, Telefónica Tech participated in various sessions at the event, showcasing our collaborative work with VMware. Telefónica is the leading Cloud provider with VMware technology on the Spanish market. The alliance with VMware has been fruitful for over 10 years, including services like Virtual Data Center (VDC), a reference among VMware-based service solutions. Telefónica Tech continues to strengthen its alliance with VMware to deliver innovative solutions that support our customers' businesses.

Our colleague Miguel Gómez, Technical Product Manager at Telefónica Tech, participated in the roundtable discussion at the Sovereign Cloud Summit, where VMware's ecosystem of service providers could learn about our experience, along with the experiences of other providers participating in VMware's Sovereign Cloud Program.

Miguel Gómez, Technical Product Manager at Telefónica Tech, in the roundtable discussion at the Sovereign Cloud Summit.

Some of the highlights from the roundtable included the importance of ensuring data sovereignty for government and regulated clients, as well as the challenges faced by clients and providers in offering such solutions.

Our commitment to VDC (Virtual Data Center)

Within the broader context of Explore, and open to all participants, Miguel shared the stage with Madhup Gulati, Senior Director of Product Management for the Cloud Providers business unit, in the session titled "How VMware innovation is shaping Global Market Agendas with Sovereign Cloud." During this session, Miguel discussed Telefónica Tech's experience, where VDC has become the cornerstone of our sovereign cloud proposal. VDC is the only solution in our Cloud portfolio that covers scenarios involving the protection of confidential, secret, and top-secret information.
Miguel also presented some ongoing developments, such as Bring Your Own Encryption and Confidential Computing with Intel SGX.

In another session, Fernando Bertrán, Head of Cloud Services Platform Operations at Telefónica Tech, presented the dashboards built by our colleagues using the Aria Operations tool. He was joined by Galina Kostova, Product Line Manager for Cloud Operations at VMware.

Fernando Bertrán, Head of Cloud Services Platform Operations at Telefónica Tech.

Aria Operations assists the teams operating the VDC platform in managing VMware infrastructure. It will soon be offered to our customers as an additional capability to monitor and optimize their infrastructure.

VMware Private AI: secure and private Generative Artificial Intelligence

Artificial Intelligence played a prominent role in both general and technical sessions. The partnership with NVIDIA, particularly the announcement of VMware Private AI, was a highlight. This solution will make Generative Artificial Intelligence accessible to our customers in a secure and private environment.

For several editions now, VMware has been committed to a MultiCloud or Cross-Cloud approach, recognizing the heterogeneity of customer environments and offering cross-cutting solutions for managing infrastructure, from on-premises environments to VMware technology-based clouds like VDC and the hyperscalers. Another area where VMware is investing strongly is cloud-native applications, with Tanzu as its proposal for container development using Kubernetes, aiming to bridge the worlds of virtualization and containers.

VMware drives sovereign Cloud and Edge Computing

As mentioned earlier, VMware's commitment to sovereign Cloud is evident, with the development of a specific program for service providers like Telefónica Tech. Through this program, it is helping to position a distinctive proposal against the hyperscalers, fully aligned with the European Union's data sovereignty strategy.

Edge Computing also had a presence, with new orchestration solutions and integration with Cloud offerings. Edge Computing allows companies to decentralize data processing and bring it closer to devices and end users, improving response times.

Miguel Gómez speaks at VMware Explore 2023 about Sovereign Cloud.

Furthermore, Edge Computing offers increased data security and privacy by avoiding the transmission of sensitive information to the cloud. It also provides greater flexibility and scalability by enabling workload distribution between the cloud and edge devices. This is particularly useful where Edge AI or real-time data processing is required, such as in manufacturing, logistics or healthcare.

However, it wasn't all technical sessions and meetings. There was also the opportunity to visit the exhibition area, where a diverse range of companies showcased specific solutions for the VMware ecosystem. Additionally, there were networking opportunities at various events tailored for event participants, VMware Iberia customers and global service providers.

VMware is part of the Telefónica Tech partner ecosystem, the network of alliances that allows us to develop the most advanced solutions on the market for our clients.
December 8, 2023
Cloud
Edge Computing and ultra-low latency: Why is it important?
For many years we have been in a race to increase the speed of our connections. Ever since the days of modems that treated us to a symphony of beeps while we waited anxiously to see what speed we had finally connected at, higher speeds have always been the goal. The incorporation of new technologies, such as ADSL, fibre optics, 3G and 4G mobile communications, and private MPLS networks, has gradually brought higher and higher speeds. And in many cases, the commercial claim has been to promise more kilobits, more megabits, in a technical and commercial race so that we can consume new services. For example, mobile internet consumption did not become widespread until the arrival of 3G, and HD or UHD video is unthinkable without these higher bandwidth values.

But bandwidth is not the only parameter that matters when consuming digital services. This is where latency comes in.

■ According to IDC's 2025 Worldwide Edge Computing Spending Guide, global spending on edge computing solutions is expected to reach nearly $261 billion in 2025 and grow at a compound annual growth rate (CAGR) of 13.8%, reaching $380 billion by 2028.

Latency, the great protagonist

Latency basically measures the time that elapses between a client initiating a communication and receiving the response. The order of magnitude in which we move is milliseconds.

Latency, even if it has not been very visible, has always been there, and some of its consequences are occasionally perceptible. When transatlantic communications were carried out via satellites in geostationary orbit, more than 35,000 km above the Earth's surface, the time taken for the signal to travel from the earth station up to the satellite and back down to another earth station added enough delay to complicate conversations, with awkward pauses, speakers talking over one another, etc. Here the latency is in the order of hundreds of milliseconds.

Another example is in data centres, when replicating data between two locations. There are hardware solutions that do not commit write operations to disk until the equivalent write has been committed on the secondary system, to ensure that the copy has been performed correctly. This is why many vendors have at least two data centres in the same metropolitan area to offer synchronous replication solutions.

In contrast, there are many other situations where latency is not relevant, because communication response times are much shorter than the processing time or the reaction time of a human being. For example, most web query applications are not particularly sensitive to latency.

In mobile communications, the advent of 5G has been a major departure from previous generations. While this technology promises a growth in speed, it has put latency at the centre: on the one hand, to achieve much lower values, and on the other, to ensure stable, tightly controlled values with little variation. But this is not only happening in mobile communications: fibre networks also allow for lower and more stable latency values.

And it is latency where Edge Computing really comes into its own. Edge means, in simplified terms, that we are bringing computing capabilities to the edge. To the edge of the network. Why bring this compute capacity to the edge of the network? The main advantage is to improve the latency perceived by the consumer of this capacity.
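To get a feel for the orders of magnitude involved, here is a back-of-the-envelope sketch in Python (the distances and the propagation speed in fibre are rough approximations; real latency adds routing, queuing and processing time on top of this physical minimum):

# Back-of-the-envelope propagation delay; approximate values only.
C_VACUUM_KM_S = 300_000   # speed of light in vacuum, km/s (approx.)
C_FIBRE_KM_S = 200_000    # roughly 2/3 of c in optical fibre, km/s

def round_trip_ms(distance_km: float, speed_km_s: float) -> float:
    # Request plus response: the signal covers the distance twice.
    return 2 * distance_km / speed_km_s * 1_000

# Geostationary satellite: up and down on each leg (~4 x 35,786 km in total)
print(f"Satellite hop:            {round_trip_ms(2 * 35_786, C_VACUUM_KM_S):.0f} ms")
print(f"Data centre 1,000 km away: {round_trip_ms(1_000, C_FIBRE_KM_S):.0f} ms")
print(f"Edge node 10 km away:      {round_trip_ms(10, C_FIBRE_KM_S):.2f} ms")

The satellite case lands in the hundreds of milliseconds mentioned above, while the edge case sits well below a millisecond of propagation delay, which is precisely what makes the difference for latency-sensitive services.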
If instead of the hundreds or thousands of kilometres that the signal would have to travel to reach a traditional Data Centre, it only has to travel a very short distance of a few kilometres, the latency drops to very low single-digit milliseconds. But is it really worth the effort to deploy multiple nodes to bring computing closer to end users? For some use cases, it certainly is. And this is where one of the most important lines of work begins: identifying the use cases that really need a very low latency value.

Along these lines, at Telefónica we have been working for some time with our customers and partners to identify those use cases that can only happen on an Edge Computing infrastructure. Many of them are the result of the most advanced lines of research and are still at a very preliminary stage. We can mention some of them, such as augmented reality, Smart Industry, real-time image recognition, gaming, drone management, etc.

For this reason, Next Generation Networks (5G and fibre) combined with Edge Computing are the winning option to optimally develop solutions that are sensitive to latency.

* * *

Edge Computing and sustainability

According to the International Energy Agency (IEA), data centres already consume between 1% and 1.5% of global electricity, and their demand could double by 2030 if energy efficiency measures are not implemented.

Academic research confirms that shifting workloads to Edge Computing helps reduce energy consumption. A meta-analysis published by Springer in 2024 estimates average energy savings of around 30% when processing is offloaded to the edge. In IoT scenarios, simply avoiding unnecessary round trips to the cloud can reduce traffic by 60% to 90%, significantly lowering the CO₂ footprint.
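As a toy illustration of the kind of traffic reduction cited above (the sensor data, sampling rate and byte counts are hypothetical), consider aggregating readings at the edge and sending only a compact summary to the cloud:

# Toy example with hypothetical sensor data: edge-side aggregation.
import random
import statistics

raw_samples = [random.gauss(21.0, 0.5) for _ in range(600)]  # e.g. 10 min at 1 Hz

# Without edge processing: every raw sample travels to the cloud.
bytes_without_edge = len(raw_samples) * 8      # ~8 bytes per reading

# With edge processing: only a small summary leaves the site.
summary = {
    "mean": statistics.fmean(raw_samples),
    "min": min(raw_samples),
    "max": max(raw_samples),
}
bytes_with_edge = len(summary) * 8

saving = 1 - bytes_with_edge / bytes_without_edge
print(f"Upstream traffic reduced by {saving:.1%}")  # well above 90% in this toy case

Real deployments typically save less than this toy case, in the 60% to 90% range cited above, because some raw data must still reach the cloud for auditing, model training or troubleshooting.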
Updated: June 2025
September 20, 2022
Cloud
2021: the year Edge Computing came into our lives
As we come to the end of the year, it is a good time to look back and see how the Edge has come into our lives. In my personal case, Edge Computing fully entered my life in 2021, as I have been lucky enough to take on responsibilities in this area.

2021 has been an important year in Telefónica's relationship with Edge Computing. From an infrastructure point of view, we already have a commercial offering (VDC-Edge) that allows our customers to enjoy the benefits of the Edge. In parallel, we have shown new examples of innovation with our customers in this area, demonstrating that there are technological challenges that can be solved by this new paradigm.

However, there is always a margin of doubt about the real impact it is having: is it just a fancy term that the whole industry is eager to embrace? Some time ago, when the term Cloud Computing started to become popular, I remember seeing "classic" solutions that, with a little coat of varnish, were quickly repackaged as Cloud solutions. One of the leading technology analysts, Gartner, positioned the term Edge Computing in August this year at the top of the "peak of inflated expectations" of its classic Hype Cycle. Obviously, when expectations are high, there is a risk that they may not fully materialise, leading to disappointment.

In this particular case, regardless of the hopes of each actor in the industry, I see some factors that I find interesting and that make me think the impact of the Edge is going to be very significant. It is unlikely that we will ever have "before the Edge" or "after the Edge" time references, but it is going to have a significant impact on our lives.

One of these factors is the arrival of "real" 5G: 5G SA (StandAlone). Although we have already been enjoying 5G for months, the deployment of the SA version will mean the arrival of new capabilities, such as Network Slicing, which represent a huge leap forward. And we must not forget the latent geopolitical conflict between China and the USA in relation to 5G, which could lead to the world being divided into isolated technological blocs. This latent rivalry shows that 5G is more than just a multi-billion-dollar market: it directly and powerfully influences other economic sectors and our lives in general.

The other is that we are working intensively with partners and end users to identify use cases that demand this technology, to improve the offering and to create a new generation of services for users. This is a substantial change from the classic approach of designing a product, launching it and waiting for demand to come. In my view, it will help our business customers to imagine services, technologies and use cases that they could not even contemplate before, and that will rely on the Edge to get to market.

So, let's hope we see the Edge explode in 2022. And in the meantime, Merry Christmas and a happy 2022 to all of you!
December 21, 2021