The Theme - New Media and New Digital Economy Ventures
"The Rise of the New Digital Economy: Trends, Opportunities and Challenges"
1. Overview
The world is changing fast. Technology has permeated every aspect of modern life, both personal and professional. We are at the dawn of the Fourth Industrial Revolution, which will bring together digital, biological and physical technologies in new and powerful combinations. Rapid developments in technology and science are changing the way we live, work and do business. These changes come with challenges for our industries, workplaces and communities. Digital technologies have immense potential to drive competition, innovation and productivity.
Innovation in the business world is accelerating exponentially, with new, disruptive technologies and trends emerging that are fundamentally changing how businesses and the global economy operate. Today, most business professionals spend almost every waking second of their day either interacting with a computer or carrying one on their person - and many people even sleep tethered to a computer via wearable devices that track activity and sleep patterns.
Today’s new digital economy is forcing organizations to think faster, more openly and more flexibly. By understanding emerging digital ecosystems, businesses can implement strategies that differentiate the customer experience and drive competitive advantage. The strategies for succeeding in the knowledge economy are predicated on the notions that knowledge and information are costly to generate and can be protected. It makes sense to build your enterprise’s competitive differentiation around its knowledge capital only if that information is unique to the firm.
- Digital Changes Driving the New Digital Economy:
Today, the notion that knowledge and information are costly and protectable is being challenged by the four forces of digital change. In combination, these forces are pushing the knowledge economy to the margins and giving rise to the "New" Digital Economy (NDE). The Internet has underpinned, enabled and accelerated many of these trends, and it lies at the core of the NDE as well.
- Sharing, collaboration, and on-demand. Digital technologies make it easy to share information freely. Collaboration platforms create new pathways for knowledge production that depend on connections between people rather than on hierarchical controls. We are now seeing products become platforms that drive impressive new services in ways that were just not imagined when the products were first designed. A number of new commercial online services have emerged in recent years, each promising to reshape some aspect of the way people go about their lives. Some of these services offer on-demand access to goods or services with the click of a mouse or swipe of a smartphone app. Others promote the commercialized sharing of products or expertise, while still others seek to connect communities of interest and solve problems using open, collaborative platforms.
- Hyperconnectivity. Information systems, particularly the Internet of Things (IoT), are generating powerful live information flows. Our enthusiasm for creating these data streams - from devices and ourselves - implies that information at the point of creation is more valuable than any legacy knowledge. Soon everything will be connected: every asset, supplier, worker and stakeholder. This means products can now work together to get jobs done faster and more safely than ever before. However, hyperconnectivity driven by the rise of the digital-everything economy and IoT will soon disrupt the cybersecurity landscape on a scale not seen before.
- Artificial intelligence (AI) and machine learning. AI and other advanced analytics technologies decrease information processing costs. With the proper dataset behind it, AI can help alleviate many repetitive and redundant tasks, changing the way humans approach work. Today, AIs can write poetry and songs, discover new compounds for medicine, and be used both to create information and to counter fake news. It may not be long before an AI receives patients. Just as companies used capital to accelerate through the experience curve in manufacturing, it is now possible to use machine learning to power through the experience curve of the digital economy. We are now in a world where things get done faster, more easily, with more accuracy, and based on better knowledge. AI touches nearly everything we do and is becoming ever more ingrained in our lives.
- Crowdsourcing the world’s cognitive surplus. Wikipedia, YouTube, and Linux are early examples of using collaboration technology to harness latent cognitive capacity around the globe to topple traditional sources of competitive advantage. Tapping into the intelligence of groups - within a company or around the globe - can help organizations combat bias, make better decisions, and compete for talent and ideas with the help of artificial intelligence. Think of crowdsourcing as applying the principles of the sharing economy to cognitive surpluses. Many people have thoughts, ideas, and skills with real business value that often go unused. Companies can tap into those surpluses both internally and externally, often with the help of technology. In fact, Alphabet’s former executive chairman, Eric Schmidt, has predicted that the next $100 billion company will likely result from the wisdom of the many.
Together, these forces mean all knowledge has true competitive value only at the moment it is created. It decays quickly into legacy knowledge. The only way to find competitive advantage in this digital economy is to become a Live Business: to learn to use information in the moment to make decisions, meet demand, and respond to customers. The difference between thriving in the legacy knowledge economy and thriving in the new digital economy is the speed at which companies can act on data from all sources.
- What is the "New" Digital Economy (NDE)?
The New Digital Economy (NDE) is emerging from a combination of technologies, mainly from the ICT (Information and Communications Technology) space, that are becoming pervasive across mechanical systems, communications, infrastructure, and the built environment, and thus playing an increasingly important role, not only in social and political life, but in research, manufacturing, services, transportation, and even agriculture.
The technologies underpinning the NDE include, most importantly: advanced robotics and factory automation (sometimes referred to as advanced manufacturing); new sources of data from mobile and ubiquitous Internet connectivity (sometimes referred to as the Internet of Things); cloud computing; big data analytics; and artificial intelligence (AI).
"The main driver of the NDE is the continued exponential improvement in the cost-performance of information and communications technology (ICT), mainly microelectronics, following Moore’s Law. This is not new. The digitization of design, advanced manufacturing, robotics, communications, and distributed computer networking (e.g. the Internet) have been altering innovation processes, the content of tasks, and the possibilities for the relocation of work for decades. However, three features of the NDE are relatively novel. First, new sources of data, from smart phones to factory sensors, are sending vast quantities of data into the “cloud,” where they can be analysed to generate new insights, products, and services. Second, new business models based on technology and product platforms - platform innovation, platform ownership, and platform complimenting - are significantly altering the organization of industries and the terms of competition in a range of leading-edge industries and product categories. Third, the performance of ICT hardware and software has advanced to the point where artificial intelligence and machine learning applications are proliferating. What these novel features share is reliance on very advanced and nearly ubiquitous ICT, embedded in a growing platform ecosystem characterized by high levels of interoperability and modularity." - [United Nations, UNCTAD]
The rise of new digital industrial technology, known as Industry 4.0, is a transformation that makes it possible to gather and analyze data across machines, enabling faster, more flexible, and more efficient processes to produce higher-quality goods at reduced costs. This manufacturing revolution will increase productivity, shift economics, foster industrial growth, and modify the profile of the workforce—ultimately changing the competitiveness of companies and regions.
- What is New Media?
New Media is a 21st-century catchall term for all that is related to the Internet and the interplay between technology, images and sound. The definition of new media changes daily, because new media themselves evolve and morph continuously; what they will be tomorrow is virtually unpredictable for most of us, but we do know that they will continue to evolve in fast and furious ways.
Digital technologies underpin innovation and competitiveness across private and public sectors and enable scientific progress in all disciplines. ICT and Digital Media are now integrated into almost every technology, industry and job. New media are forms of media that are native to computers: computational, and relying on computers for distribution. Currently, some examples of new media are websites, mobile apps, virtual worlds, multimedia, computer games, human-computer interfaces, computer animation and interactive computer installations.
2. Technologies for Digitizing and Transforming Industry and Services
Beyond today's conventional electronics, candidate approaches include different types of logic using cellular automata or quantum entanglement and superposition; 3D spatial architectures; and information-carrying variables other than electron charge, such as photon polarization, electron spin, and position and states of atoms and molecules. Approaches based on nanoscale science, engineering, and technology are most promising for realizing these radical changes and are expected to change the very nature of electronics and the essence of how electronic devices are manufactured. Rapidly reinforcing domestic R&D successes in these arenas could establish a U.S. domestic manufacturing base that will dominate 21st-century electronics commerce. The goal of research initiatives in this area is to accelerate the discovery and use of novel nanoscale fabrication processes and innovative concepts to produce revolutionary materials, devices, systems, and architectures to advance the field of nanoelectronics.
Electronic smart systems identify a broad class of intelligent and miniaturized devices that are usually energy-autonomous and ubiquitously connected. To support functions like sensing, actuation, and control, electronic smart systems must include sophisticated and heterogeneous components and subsystems, such as digital signal processing devices, analog devices for RF and wireless communication, discrete elements, application-specific sensors and actuators, energy sources, and energy storage devices. These systems take advantage of progress in the miniaturization of electronic systems; they are highly energy-efficient, increasingly energy-autonomous, and able to communicate with their environment.
Thanks to their heterogeneous nature, smart embedded and cyber-physical applications are able to deliver a wide range of services, and their application can provide solutions to grand social, economic, and environmental challenges such as environmental and pollution control, energy efficiency at various scales, aging populations and demographic change, the risk of industrial decline, security from the micro- to the macro-level, safety in transportation, increased needs for the mobility of people and goods, and health and lifestyle improvements, to name the most relevant.
- Robotic process automation, alongside blockchain, AI, cognitive computing and the Internet of Things (IoT), is one of the new and emerging technologies expected to profoundly impact and transform the workforce of the future across the financial services sector. Robotic Process Automation (RPA) is quickly becoming the go-to solution for financial institutions that want to improve digital speed to market and take out costs.
- AI and robotics are transforming healthcare. AI is getting increasingly sophisticated at doing what humans do, but more efficiently, more quickly and at a lower cost. The potential for both AI and robotics in healthcare is vast. Just like in our everyday lives, AI and robotics are increasingly a part of our healthcare ecosystem.
- The food industry is being revolutionized by robotics and automation. There are real problems in modern agriculture. Traditional farming methods struggle to keep up with the efficiencies required by the market. Farmers in developed countries are suffering from a lack of workforce. The rise of automated farming is an attempt to solve these problems by using robotics and advanced sensing. Five ways robotics is changing the food industry are: nursery automation; autonomous precision seeding; crop monitoring and analysis; fertilizing and irrigation; and crop weeding and spraying.
3. Data Infrastructure: HPC, Big Data and Cloud Technologies
Big data refers to extremely large datasets that are difficult to analyze with traditional tools. It is often boiled down to a few varieties of data generated by machines, people, and organizations. Big data is being generated by everything around us at all times. Every digital process and social media exchange produces it. Systems, sensors and mobile devices transmit it. Big data can be either structured, semi-structured, or unstructured. IDC estimates that 90 percent of big data is unstructured data.
Big data is arriving from multiple sources at an alarming velocity, volume and variety. To extract meaningful value from big data, you need optimal processing power, analytics capabilities and skills. In most business use cases, any single source of data on its own is not useful. Real value often comes from combining these streams of big data sources with each other and analyzing them to generate new insights.
Analyzing large data sets - so-called big data - will become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer surplus. Big data must pass through a series of steps before it generates value: data access, storage, cleaning, and analysis.
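As a concrete illustration, here is a minimal sketch of those four steps in Python with pandas. The file name and column names (clickstream_events.csv, timestamp, user_id) are illustrative assumptions, not a reference to any particular dataset.

```python
# Minimal sketch of the four value-generating steps: access, storage,
# cleaning, analysis. File and column names are invented for illustration.
import pandas as pd

# 1. Access: ingest a raw event stream (a hypothetical CSV export).
events = pd.read_csv("clickstream_events.csv")

# 2. Storage: persist the raw data in a columnar format for reuse
#    (to_parquet requires the pyarrow or fastparquet package).
events.to_parquet("events_raw.parquet")

# 3. Cleaning: drop duplicates and malformed rows, normalize timestamps.
events = events.drop_duplicates()
events["timestamp"] = pd.to_datetime(events["timestamp"], errors="coerce")
events = events.dropna(subset=["timestamp", "user_id"])

# 4. Analysis: a simple aggregate - daily active users.
daily_active_users = events.groupby(events["timestamp"].dt.date)["user_id"].nunique()
print(daily_active_users.head())
```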
4. Artificial Intelligence, Machine Learning, and Neural Networks
- Artificial Intelligence (AI):
Artificial Intelligence (AI) is the broader concept of machines being able to carry out tasks in a way that we would consider “smart”. Machine Learning (ML) is a current application of AI based around the idea that we should really just be able to give machines access to data and let them learn for themselves.
AI has exploded over the past few years, especially since 2015. Much of that has to do with the wide availability of GPUs that make parallel processing ever faster, cheaper, and more powerful. It also has to do with the simultaneous one-two punch of practically infinite storage and a flood of data of every stripe (that whole Big Data movement) - images, text, transactions, mapping data, you name it. The hype around AI has become tangible.
The most important thing to understand about AI is that it is not a static formula to solve. It’s a constantly evolving system designed to identify, sort, and present the data that is most likely to meet the needs of users at that specific time, based on a multitude of variables that go far beyond just a simple keyword phrase.
An AI of this kind is trained on known data - content, links, user behavior, trust, citations, patterns - and then analyzes that data, drawing on user-experience signals, big data, and machine learning, to develop new ranking factors capable of producing the results most likely to meet user needs.
The goal of Artificial Intelligence (AI) is to understand intelligence by constructing computational models of intelligent behavior. This entails developing and testing falsifiable algorithmic theories of (aspects of) intelligent behavior, including sensing, representation, reasoning, learning, decision-making, communication, coordination, action, and interaction. AI is also concerned with the engineering of systems that exhibit intelligence. AI fuels analytics, which fuels actionable intelligence, which fuels business growth.
- Machine and Deep Learning:
Machine-learning algorithms use statistics to find patterns in massive amounts of data. And data encompasses a lot of things - numbers, words, images, clicks, what have you. If it can be digitally stored, it can be fed into a machine-learning algorithm.
Machine learning is the process that powers many of the services we use today - recommendation systems like Netflix's, search engines like Google, social-media feeds like Facebook's, voice assistants like Siri, and so on. In all of these instances, each platform is collecting as much data about you as possible - what genres you like watching, what links you are clicking, which statuses you are reacting to - and using machine learning to make a highly educated guess about what you might want next. Or, in the case of a voice assistant, about which words match best with the funny sounds coming out of your mouth. Frankly, this process is quite basic: find the pattern, apply the pattern. But it pretty much runs the world.
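The "find the pattern, apply the pattern" loop can be shown in a few lines. The sketch below uses scikit-learn on made-up viewing-history data; a real recommender would use far richer features and models.

```python
# "Find the pattern, apply the pattern" with scikit-learn.
# The viewing-history data here is synthetic and purely illustrative.
from sklearn.linear_model import LogisticRegression

# Each row: [hours of sci-fi watched, hours of romance watched]
X = [[9, 1], [8, 2], [1, 9], [2, 8], [7, 3], [3, 7]]
y = [1, 1, 0, 0, 1, 0]  # 1 = user clicked the sci-fi recommendation

model = LogisticRegression().fit(X, y)  # find the pattern
print(model.predict([[6, 4]]))          # apply the pattern to a new user
```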
Deep learning is machine learning on steroids: it uses a technique that gives machines an enhanced ability to find - and amplify - even the smallest patterns. This technique is called a deep neural network - deep because it has many, many layers of simple computational nodes that work together to munch through data and deliver a final result in the form of a prediction. Machine (and deep) learning comes in three flavors: supervised, unsupervised, and reinforcement.
- Neural Networks:
Neural networks were vaguely inspired by the inner workings of the human brain. The nodes are sort of like neurons, and the network is sort of like the brain itself.
Neural networks are a set of algorithms, modeled loosely after the human brain, that are designed to recognize patterns. They interpret sensory data through a kind of machine perception, labeling or clustering raw input. The patterns they recognize are numerical, contained in vectors, into which all real-world data, be it images, sound, text or time series, must be translated. A neural network is a computer system designed to work by classifying information in the same way a human brain does. It can be taught to recognize, for example, images, and classify them according to elements they contain. The development of neural networks has been key to teaching computers to think and understand the world in the way we do, while retaining the innate advantages they hold over us such as speed, accuracy and lack of bias.
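To make the "layers of simple computational nodes" idea concrete, here is a toy forward pass through a small network in NumPy. The weights are random, so this illustrates the structure of a network, not a trained model.

```python
# A tiny feedforward neural network: each layer computes a weighted sum
# of its inputs followed by a nonlinearity. Random weights, untrained.
import numpy as np

rng = np.random.default_rng(0)

def layer(x, n_out):
    """One fully connected layer with a ReLU nonlinearity."""
    W = rng.normal(size=(x.shape[0], n_out))
    b = np.zeros(n_out)
    return np.maximum(0.0, x @ W + b)  # weighted sum, then ReLU

x = rng.normal(size=4)           # a 4-dimensional input vector
h1 = layer(x, 8)                 # first hidden layer (8 nodes)
h2 = layer(h1, 8)                # second hidden layer - "deep" = many layers
score = h2 @ rng.normal(size=8)  # final output: a single prediction score
print(score)
```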
5. 5G and Beyond Mobile Wireless Technology
- Mobile is the Largest Technology Platform in Human History
"AI, machine learning, deep learning, autonomous systems and neural networks are not just buzzwords and phrases. Increased computing power, more efficient hardware and robust software, as well as an explosion in sensor data from the Internet of Things - are fueling machine learning, and moving actionable data and intelligence towards edge devices. As AI makes devices, including smartphones and automobiles, more intelligent, mobile is becoming the key platform for enhancing all aspects of our lives, having an impact now and in the future." -- (MIT)
Mobile is the largest technology platform in human history. The next-generation super-fast wireless networks known as 5G will operate at vastly higher speeds and be able to handle tens of times more devices than existing 4G networks. 5G is about more than fast data rates and greater capacity: it's about the seamless, real-time interaction between humans and billions of intelligent devices. 4G turned mobile phones into movie-streaming platforms, but 5G promises more than speedy downloads. It could pave the way for surgeons operating remotely on patients, cars that rarely crash, and events that can be vividly experienced from thousands of miles away.
- 5G Standards Are Not Yet Finalised
5G wireless technology promises a rich, reliable, and hyperconnected world, built on new bands, wider bandwidth and new beamforming technology. The actual 5G radio system, known as 5G-NR (New Radio), won't be compatible with 4G. Initially, though, all 5G devices will need 4G, because they'll lean on it to make initial connections before trading up to 5G where it's available. 4G will continue to improve with time, as well.
5G standards are not yet finalised and the most advanced services are still in the pre-commercial phase. 5G needs spectrum within three key frequency ranges to deliver widespread coverage and support all use cases: sub-1 GHz, 1-6 GHz and above 6 GHz. Spectrum above 6 GHz is needed to meet the ultra-high broadband speeds envisioned for 5G. Players in the U.S. national wireless industry (AT&T, Verizon and others) are developing their 5G networks and working to acquire spectrum. AT&T is gearing up to launch the first standards-based 5G services in multiple U.S. markets by the end of 2018.
5G will achieve speeds of 20 gigabits per second, fast enough to download an entire Hollywood movie in a few seconds. It also will reduce latency - the measure of how long it takes a packet of data to be transmitted between two points - by a factor of 15. 5G networks will combine numerous wireless technologies, such as 4G LTE, Wi-Fi, and millimeter wave technology. 5G will also leverage cloud infrastructure, intelligent edge services and a virtualized network core.
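Those headline numbers are easy to sanity-check. The movie size and 4G latency figures below are illustrative assumptions:

```python
# Back-of-the-envelope check of the 5G claims above.
peak_5g_bps = 20e9       # 20 gigabits per second (peak rate)
movie_bits = 10 * 8e9    # a 10-gigabyte movie = 80 gigabits (assumed size)
print(movie_bits / peak_5g_bps, "seconds to download")  # 4.0 seconds

lte_latency_ms = 50      # a typical 4G LTE round trip (assumed)
print(lte_latency_ms / 15, "ms 5G latency target")      # ~3.3 ms
```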
- Four Factors Distinguish 5G from Its Predecessors
Instead of point-to-point communications provided by legacy mobile networks, 5G will move packets of data following the most efficient path to their destination. This shift enables real time aggregation and analysis of data, moving wireless technology from communication to computing. Four factors distinguish 5G from its predecessors: connected devices, fast and intelligent networks, back-end services and extremely low latency. These qualities enable a fully connected and interactive world with a variety of new applications.
5G technology is more secure than 4G, the current highest mobile internet standard. One of the reasons is that 5G encrypts data in a way so advanced that hackers would need a "quantum computer" to break it. The data protection rules in the European Union, known as the General Data Protection Regulation (GDPR), came into force in May 2018. The law requires companies that handle data to meet a very high standard of data protection or face potentially huge fines. With massive amounts of data expected to be flowing along 5G networks, GDPR is likely to become even more important for the business world.
- 5G is the Primary Catalyst for Next-Generation Internet of Things (IoT) Services
Leveraging state-of-the-art communication network architectures, 5G is touted to be the primary catalyst for next-generation Internet of Things (IoT) services. 5G will provide the backbone for IoT, greatly improving data transfer speeds and processing power over its predecessors. This combination of speed and computing power will enable new applications. These include connected cars coupled with augmented reality and virtual reality platforms, smart cities and connected devices that revolutionize key industry verticals.
By 2020, the 5G network will support more than 20 billion connected devices and 212 billion connected sensors, and enable access to 44 zettabytes of data gathered from a wide range of sources, from smartphones to remote monitoring devices. Healthcare organizations are eager to embrace IoT devices because they save money by keeping patients out of the hospital. If IoT devices can diagnose patients in advance, that saves huge costs.
6. The Next Generation Internet (NGI) and Quantum Computing
The move to the next-generation Internet rests in part on IPv6. Important features of IPv6 include: Stateless Address Autoconfiguration, which allows an IPv6 host to configure itself automatically when connected to a routed IPv6 network; and network-layer security, since IPv6 implements network-layer encryption and authentication via IPsec.
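As a sketch of how stateless autoconfiguration works in practice, the snippet below forms an address by combining a router-advertised /64 prefix with a modified EUI-64 interface identifier derived from the device's MAC address. The prefix and MAC shown are documentation examples, not real ones.

```python
# Stateless Address Autoconfiguration (SLAAC) sketch: prefix + modified
# EUI-64 interface ID. Prefix and MAC below are example values only.
import ipaddress

def eui64_interface_id(mac: str) -> int:
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02                                # flip universal/local bit
    eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]   # insert FF:FE in middle
    return int.from_bytes(bytes(eui64), "big")

prefix = ipaddress.ip_network("2001:db8:1234:5678::/64")  # from router advert
mac = "52:54:00:12:34:56"                                 # device MAC address
address = prefix[eui64_interface_id(mac)]
print(address)   # 2001:db8:1234:5678:5054:ff:fe12:3456
```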
[World Economic Forum]: As China moves closer to building a working quantum communications network, the possibility of a quantum Internet becomes more and more real. In the simplest of terms, a quantum Internet would be one that uses quantum signals instead of radio waves to send information. The Internet as we know it uses radio frequencies to connect various computers through a global web in which electronic signals are sent back and forth. In a quantum internet, signals would be sent through a quantum network using entangled quantum particles.
Researchers have recently made significant progress in building this quantum communication network. China launched the world’s first quantum communication satellite in 2016, and they’ve since been busy testing and extending the limitations of sending entangled photons from space to ground stations on Earth and then back again. They’ve also managed to store information using quantum memory. By the end of August, 2017, the nation plans to have a working quantum communication network to boost the Beijing-Shanghai internet. Leading these efforts is Jian-Wei Pan of the University of Science and Technology of China, and he expects that a global quantum network could exist by 2030. That means a quantum internet is just 13 years away, if all goes well.
- What is a Quantum Computer?
[BBC]: "A quantum computer is a machine that is able to crack very tough computation problems with incredible speed - beyond that of today's "classical" computers. In conventional computers, the unit of information is called a "bit" and can have a value of either 1 or 0. But its equivalent in a quantum system - the qubit (quantum bit) - can be both 1 and 0 at the same time. This phenomenon opens the door for multiple calculations to be performed simultaneously. However, qubits need to be synchronised using a quantum effect known as entanglement, which Albert Einstein termed "spooky action at a distance". There are four types of quantum computers currently being developed, which use: Light particles; Trapped ions; Superconducting qubits; Nitrogen. vacancy centres in diamonds
Quantum computers will enable a multitude of useful applications, such as being able to model many variations of a chemical reaction to discover new medications; developing new imaging technologies for healthcare to better detect problems in the body; or to speed up how we design batteries, new materials and flexible electronics."
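The "both 1 and 0 at the same time" idea can be illustrated with a toy state-vector simulation in NumPy. This is purely pedagogical - amplitudes in an array - and is not how real quantum hardware works at scale.

```python
import numpy as np

# One qubit: amplitudes for the |0> and |1> states. An equal superposition
# gives a 50/50 chance of measuring either value - "both 1 and 0 at once".
qubit = np.array([1.0, 1.0]) / np.sqrt(2)
print(np.abs(qubit) ** 2)   # [0.5 0.5]

# Two entangled qubits (a Bell state): amplitudes for |00>, |01>, |10>, |11>.
# Only |00> and |11> can ever be measured, so reading one qubit instantly
# fixes the other - Einstein's "spooky action at a distance".
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
print(np.abs(bell) ** 2)    # [0.5 0.  0.  0.5]
```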
- How do you build the next-generation Internet?:
[BBC]: It's not easy to develop technology for a device that hasn't technically been invented yet, but quantum communications is an attractive field of research because the technology will enable us to send messages that are much more secure.
There are several problems that will need to be solved in order to make a quantum Internet possible: getting quantum computers to talk to each other; making communications secure from hacking; transmitting messages over long distances without losing parts of the message; and routing messages across a quantum network.
7. Building the Digital Platforms of the Future
The new digital technologies have led to widespread use of cloud computing, recognition of the potential of big data analytics, artificial intelligence, and significant progress in aspects of the Internet of Things, such as home automation, smart cities and grids and digital manufacturing. In addition to closing gaps in respect of the basic necessities of access and usage, now the conditions must be established for using the new platforms and finding ways to participate actively in the creation of content and even new applications and platforms.
A website is an essential element for running a successful business. A business without a website can potentially lose out on great opportunities since potential customers can’t reach you, find you and learn about you online.
The enterprise world is changing faster than ever. To compete, it is now necessary to do business at an almost unprecedented size and scale. In order to achieve this scale, winning companies are establishing digital platforms that extend their organizational boundaries. With the Internet as the platform for innovation and the emergence of the information-fueled economy, technology is both a strategic requirement and a strategic advantage.
Based on current trends, digital platforms will become the preferred and dominant business model for banks and financial institutions in the future. Digital platforms offer consumers and small businesses the ability to connect to financial and other service providers through an online or mobile channel as an integrated part of their day-to-day activities.
In emerging markets, where billions of people lack access to traditional financial services, FinTech could lead to a revolution in financial inclusion and membership in the new global digital economy. Financial inclusion means that individuals and businesses have access to useful and affordable financial products and services that meet their needs – transactions, payments, savings, credit and insurance – delivered in a responsible and sustainable way. Financial inclusion is a key enabler to reducing poverty and boosting prosperity.
With lower distribution costs and simplified engagement, the movement from paper to digital is picking up speed and increasing consumer expectations. This provides traditional financial institutions the opportunity to transform legacy delivery options, while also challenging the business case for existing physical infrastructures.
The digitization of financial services will also improve identity management through enhanced biometrics. This will improve access to banking services in underserved markets and enhance traditional payments and global money movement.
Needed first and foremost in the digital agriculture ecosystem is an integrating digital platform. Much like how Apple opened up the iPhone to independent application providers, a standardized digital platform would provide a hub where all agtech providers can essentially sell their wares, while capturing their data and integrating it into the platform. This integrated platform gives farmers the ability to track their operations from several different angles, from soil-moisture sensing to satellite imagery to weather data, to better make predictions and decisions on how their operations are faring. The integrating platform enables and protects stakeholder access and information; automates the development and analysis of massive bodies of data; and develops, reveals, and manages the potential costs – and revenues – of these decisions. These decisions can then be quickly implemented with greater accuracy through robotics and advanced machinery, and farmers can get real-time feedback on the impact of their actions.
Digital agriculture has the potential to transform the way we produce the world's food, but the approach is still very new, costs are high and details of the long-term benefits are rarely available. Securing its widespread adoption will therefore require collaboration and consensus across the value chain on how to overcome these challenges.
In the developed world, the emergence of utility-based cloud computing is shifting focus from technical barriers to the business environment challenges facing digital entrepreneurs. This shift reinforces the growing importance of implementing effective policies that foster the best climate for digital service incubation, growth and successful development. However, in many rural areas and developing countries, even basic infrastructure remains a challenge, from the hardware, the network, the content, the ICT eco-system, to the skills on both consumer and business sides.
Virtual Reality and Van Gogh collide: technology is turning museums into a booming industry. A virtual museum (VM) is a digital entity that draws on the characteristics of a museum in order to complement, enhance, or augment the museum experience through personalization, interactivity, user experience and richness of content. A VM is not a real museum transposed to the web, nor an archive or database of virtual digital assets, but a provider of information on top of being an exhibition room. A VM provides opportunities for people to access digital content before, during and after a visit in a range of digital 'encounters'. VMs are technologically demanding, especially in terms of virtual and augmented reality and storytelling authoring tools, which must cover various types of digital creations, including virtual reality and 3D experiences, located online, in museums or on heritage sites. The challenge will be to give further emphasis to improving access, establishing meaningful narratives for collections and displays, and story-led interpretation through the development of VMs. It will also require addressing the fundamental issues that make this possible, e.g. image rights, licensing and the ability of museums to support new ICT technology. Virtual museums offer visitors the possibility to see art works residing in different places in context and to experience objects or sites inaccessible to the public.
Cultural and creative industries are the economic activities of artists, arts enterprises, and cultural entrepreneurs in the production, distribution and consumption of film, literature, theatre, dance, visual arts, broadcasting, and fashion. New digital and information and communication technologies have revolutionized the industry's production process, distribution channels, and consumption modes.
- Digital Platforms for Interoperable and Smart Homes, Smart Buildings, Smart Environments, and Smart Grids:
Digital data and analytics can reduce operations and maintenance (O&M) costs by enabling predictive maintenance, which can lower the price of electricity for end users. Digital data and analytics can help achieve greater efficiencies through improved planning, improved efficiency of combustion in power plants and lower loss rates in networks, as well as better project design throughout the power system. In networks, efficiency gains can be achieved by lowering the rate of losses in the delivery of power to consumers, for example through remote monitoring that allows equipment to be operated closer to its optimal conditions, and flows and bottlenecks to be better managed by grid operators. Digital data and analytics can also reduce the frequency of unplanned outages through better monitoring and predictive maintenance, as well as limiting the duration of downtime by rapidly identifying the point of failure. This reduces costs and increases the resilience and reliability of supply.
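A minimal sketch of the predictive-maintenance idea: flag sensor readings that drift beyond their recent rolling behavior so maintenance can be scheduled before failure. The data, window and threshold below are illustrative assumptions.

```python
# Toy predictive maintenance: alert when a sensor reading exceeds its
# recent rolling mean by 3 standard deviations. Synthetic data only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
temps = pd.Series(70 + rng.normal(0, 1, 200))  # e.g. transformer temperature
temps.iloc[180:] += 8                           # simulated developing fault

rolling_mean = temps.rolling(window=24).mean()
rolling_std = temps.rolling(window=24).std()
alerts = temps > rolling_mean.shift(1) + 3 * rolling_std.shift(1)
print("first alert at sample", alerts.idxmax())  # flags the fault early
```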
Artificial Intelligence (AI) is making its way into all types of industries, including the energy sector, with significant growth in the use of AI to leverage big data and draw inferences from very large data sets. AI is the application of machine learning for the purposes of automation and computational support of decision-making in a complex system. AI has great potential to coordinate and optimize the use of distributed energy resources, electric vehicles, and IoT. Use of AI aligns well with the pace of change that utilities, regulators and customers expect, with improvements to common utility operations including: reliability (e.g., self-healing grids, operations improvement and efficient use of renewable resources and energy storage); safety (e.g., outage prediction and outage response); cybersecurity of systems (e.g., threat detection and response); optimization (e.g., asset, maintenance, workflow and portfolio management); and enhancements for the customer experience (e.g., faster and more intuitive interactive voice response, personalization, and product and service matching).
8. Cybersecurity and Advanced Software Engineering
Cybersecurity protects the data and integrity of computing assets belonging to or connecting to an organization’s network. Its purpose is to defend those assets against all threat actors throughout the entire life cycle of a cyber attack.
We’re going to see more mega-breaches and ransomware attacks in the years to come. Planning to deal with these and other established risks, like threats to web-connected consumer devices and critical infrastructure such as electrical grids and transport systems, will be a top priority for security teams. But cyber-defenders should be paying attention to new threats, too.
- Why Is Cybersecurity So Important?
[U.S. Department of Homeland Security]: "Our daily life, economic vitality, and national security depend on a stable, safe, and resilient cyberspace. Cyberspace and its underlying infrastructure are vulnerable to a wide range of risks stemming from both physical and cyber threats and hazards. Sophisticated cyber actors and nation-states exploit vulnerabilities to steal information and money and are developing capabilities to disrupt, destroy, or threaten the delivery of essential services."
- Exploiting AI-generated fake video and audio: Thanks to advances in artificial intelligence (AI), it’s now possible to create fake video and audio messages that are incredibly difficult to distinguish from the real thing. These “deepfakes” could be a boon to hackers in a couple of ways. One is AI-generated “phishing” e-mails that aim to trick people into handing over passwords and other sensitive data. Cybercriminals could also use the technology to manipulate stock prices by, say, posting a fake video of a CEO announcing that a company is facing a financing problem or some other crisis. There’s also the danger that deepfakes could be used to spread false news in elections and to stoke geopolitical tensions. Such ploys would once have required the resources of a big movie studio, but now they can be pulled off by anyone with a decent computer and a powerful graphics card.
- Poisoning AI defenses: Security companies have rushed to embrace AI models as a way to help anticipate and detect cyberattacks. AI can help us parse signals from noise, but in the hands of the wrong people, it’s also AI that’s going to generate the most sophisticated attacks.
- Hacking smart contracts: Smart contracts are software programs stored on a blockchain that automatically execute some form of digital asset exchange if conditions encoded in them are met (a toy sketch of this mechanism follows this list). Entrepreneurs are pitching their use for everything from money transfers to intellectual-property protection. But it’s still early in their development, and researchers are finding bugs in some of them. So are hackers, who have exploited flaws to steal millions of dollars’ worth of cryptocurrencies.
- Breaking encryption using quantum computers: Security experts predict that quantum computers, which harness exotic phenomena from quantum physics to produce exponential leaps in processing power, could crack encryption that currently helps protect everything from e-commerce transactions to health records. Quantum machines are still in their infancy, and it could be some years before they pose a serious threat. But products like cars whose software can be updated remotely will still be in use a decade or more from now. The encryption baked into them today could ultimately become vulnerable to quantum attack. The same holds true for code used to protect sensitive data, like financial records, that need to be stored for many years."
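To make the smart-contract idea above concrete, here is a toy, non-blockchain Python model of an escrow agreement: the asset moves only when the encoded condition is met. Real smart contracts are written in languages such as Solidity; this sketch only mirrors the logic, including the kind of balance and condition checks whose omission has led to real exploits.

```python
# Toy model of a smart contract: funds are released automatically,
# but only when the encoded conditions hold. Illustrative only.
class EscrowContract:
    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.delivered = False
        self.balance = 0

    def deposit(self, who, value):
        assert who == self.buyer and value == self.amount
        self.balance += value

    def confirm_delivery(self, who):
        assert who == self.buyer
        self.delivered = True

    def release(self):
        # The encoded conditions: delivery confirmed AND contract funded.
        # Omitting checks like these is the kind of flaw hackers exploit.
        assert self.delivered, "condition not met"
        assert self.balance >= self.amount, "unfunded contract"
        self.balance -= self.amount
        return (self.seller, self.amount)

c = EscrowContract("alice", "bob", 100)
c.deposit("alice", 100)
c.confirm_delivery("alice")
print(c.release())   # ('bob', 100)
```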
- Tighten Security with Better Software Development
What’s to stop someone from hacking into an online software system or application and stealing data or access to critical processes? Both the threats and the solutions depend on software. One wall of defense is secure development with a focus on quality assurance, testing and code review.
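One concrete example of this wall of defense is preventing SQL injection with parameterized queries, a staple of secure code review. The sketch below uses Python's built-in sqlite3 module; the table and data are invented for illustration.

```python
# Secure-development example: never build SQL by string concatenation;
# parameterized queries pass user input strictly as data, never as code.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"   # a classic injection attempt

# Vulnerable pattern (don't do this): the attacker's quotes become SQL.
# rows = conn.execute(f"SELECT * FROM users WHERE name = '{user_input}'")

# Safe pattern: the driver treats user_input as a literal value.
rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,))
print(rows.fetchall())   # [] - the injection matches nothing
```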
<drafted by hhw: 1/9/19>