Chris White, president of NEC Labs America, had an in-depth discussion with Telecom Review North America to shed light on the strategic value and advantage of NEC’s industrial research arm over traditional corporate R&D, and how industrial research contributes to pursuing innovative solutions to real-world problems.

Can you share with us the essence behind NEC Labs America and how it differs from traditional corporate R&D?

Our philosophy at NEC Labs America is: smart people, hard problems, magic happens.

Most R&D is done to take an existing product and make it ten to fifteen percent better. NEC Labs America is not focused on incremental innovation, but on disruptive innovation. Think of us as NEC’s internal innovation incubator, one that allows us not only to evolve a product, as typical R&D would, but to create entirely new markets.

The way we do research is also different. You will often hear me say that an electrical engineer shouldn’t solve an electrical engineering problem, because the solution they will give tends to be limited, incremental, and not disruptive. The electrical engineer should be solving chemistry, biology, or math problems; take a chemist, biologist, or mathematician to solve the electrical engineering problem. Better yet, have them all work together as a team.

This will result in a solution that is different from the conventional solution. Having a different perspective brings disruption. This is what we are trying to do at the Labs. We gather smart people with diverse perspectives and apply them to the problems that come from NEC’s business units. As they talk to customers, they get to know the real problems, and we solve these in a way that differentiates NEC’s products.

Research is tremendously valuable, but the value goes up exponentially if you’re solving a real-world problem. The only way to have a real disruption is to know what those problems are, what the barriers are, and to try to remove them using research.

Technological innovation empowers people, businesses, and society. How will industrial research contribute to this, alongside responding to real-world problems?

Let’s visualize a six-month-old child tapping on a tablet. When I was young, if you tapped on the TV screen, you only got in trouble for leaving fingerprints on it. And at a very early age, I knew that the TV would not change unless I changed it. But now even an infant can recognize that when they act, the world reacts back to them.

We’ve created an entire generation that expects the world to adapt to them rather than them adapting to the world. That is why smart cities, smart factories, smart cars, and smart homes are not going to be innovative and different anymore. These are going to be the natural state of things.

We are seeing the evolution of automation in our everyday actions as well as across industries, factories, and enterprises, driving productivity and global connectivity across supply chains.

What we have in the Labs connects our research to that vision: sensing (fiber), understanding (video, audio, and text), and action, with some level of deployment through optical and wireless networking. Additionally, the predictive piece is where modern machine learning is moving.

All those pieces come together in a fairly nice way, aligned with the goal of making the world more responsive to us. That’s driven by the fact that everybody now has this expectation.

Years from now, we’re going to look back and say, “How could we have ever lived without the world actually optimizing and orienting itself towards our needs?”

Thus, those who predict the future based only on technology evolution will fail. You always want to start with the human need and with what the exponential driving force is; in our scenario, that’s the six-month-old child.

If you can’t say what problem you’re solving, and who is impacted by the solution, that’s not a good research challenge. If we have a technology that enables us to do something that we couldn’t have done before and it satisfies a human need, that’s the big win. This will result in trends that are going to evolve over time.

How does NEC Labs America's research impact modern ICT infrastructure? Why is this relevant?

Inside NEC Labs America, we view efficiently automating the world as key. That means we can’t use all of the world’s resources in order to make it more adaptive to people.

We have to have an underlying system for sensing how a person needs to act in near real time. Hence, you have to be cognizant of latency and resource utilization, understanding not only what resources you need but also where you want to use them.

Building up from the bottom, we’re working on high capacity, low latency, and well-controlled optical and wireless networking. What sits on top of that is a layer that has to make it easier for an application creator to deploy new applications. The person doesn’t need to know about the complexity of what’s actually built underneath.

5G, in comparison to previous mobile generations, breaks many barriers, enabling networks to function reliably and consistently. 5G is also pushing timescales down, adapting the network in near real time. The optical side is no longer isolated from the IP and wireless sides, which brings the need to be smarter in building these interrelated, complex systems.

One of the big pushes for NEC is Open RAN (O-RAN). Having an open architecture allows the industry to pay the integration costs between those layers in exchange for optimization at the global level. It’s a problem that goes beyond what a single vendor could do.

The 5G slicing layer that sits above the network provides an abstraction: applications talk to the slicing layer, which in turn talks to a layer below it that coordinates the compute and network-connectivity orchestrators. This process localizes the compute where it needs to be in order to achieve the right level of latency.
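As a rough illustration of that flow, here is a minimal, hypothetical Python sketch; none of these class or function names come from NEC’s actual platform. It shows an application expressing its latency and bandwidth needs to a slicing layer, which then places the workload on the closest site that can meet the latency target.

```python
from dataclasses import dataclass

@dataclass
class SliceRequest:
    """What an application tells the slicing layer, not the network internals."""
    app_name: str
    max_latency_ms: float      # end-to-end latency the application can tolerate
    min_bandwidth_mbps: float  # sustained throughput it needs

@dataclass
class Site:
    name: str
    latency_to_user_ms: float
    free_bandwidth_mbps: float

def place_slice(request: SliceRequest, sites: list[Site]) -> Site | None:
    """Pick the lowest-latency site that satisfies both latency and bandwidth targets.

    A real slicing layer would call separate compute and network-connectivity
    orchestrators; here both decisions are collapsed into one loop for brevity.
    """
    candidates = [
        s for s in sites
        if s.latency_to_user_ms <= request.max_latency_ms
        and s.free_bandwidth_mbps >= request.min_bandwidth_mbps
    ]
    # Prefer the closest viable site, i.e. localize the compute near the user.
    return min(candidates, key=lambda s: s.latency_to_user_ms, default=None)

if __name__ == "__main__":
    sites = [
        Site("regional-dc", latency_to_user_ms=35.0, free_bandwidth_mbps=500.0),
        Site("edge-node", latency_to_user_ms=8.0, free_bandwidth_mbps=120.0),
    ]
    req = SliceRequest("video-analytics", max_latency_ms=20.0, min_bandwidth_mbps=100.0)
    chosen = place_slice(req, sites)
    print(chosen.name if chosen else "no feasible placement")  # -> edge-node
```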

Moreover, NEC has invested heavily in building its AI application business. We are number one in face recognition and state-of-the-art in action and object recognition. Our ability to sell those into a wide range of deployments at scale means that we need to be able to write the application once and deploy it a thousand times without any modification.

The 5G slicing layer is critical for us because it enables us to do just that. My team is working on that boundary to understand how to define a slice so that the applications we’re already deploying work well and the network has the ability to optimize and provide critical resources.

The idea of matching supply and demand is also crucial. We’re not talking about economics, but about building efficient systems where supply and demand are perfectly matched. 3G and 4G networks have statically defined resources, so bandwidth goes to waste when the network does not use it. In 5G, the network can dynamically adjust the bandwidth based on the amount needed. This dynamic adaptation is critically important not just for networks but for all kinds of systems.
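To make the contrast concrete, here is a small illustrative sketch; the tenant names, numbers, and function names are hypothetical and not drawn from any NEC system. A statically provisioned link leaves unused headroom idle, while a dynamic allocator gives each tenant only what it currently demands, scaled down if total demand exceeds capacity.

```python
def static_waste(provisioned_mbps: dict, demand_mbps: dict) -> float:
    """Bandwidth reserved per tenant up front and left idle when demand is lower."""
    return sum(max(provisioned_mbps[t] - demand_mbps.get(t, 0.0), 0.0)
               for t in provisioned_mbps)

def dynamic_allocation(capacity_mbps: float, demand_mbps: dict) -> dict:
    """Give each tenant what it asks for, scaled down if total demand exceeds capacity."""
    total = sum(demand_mbps.values())
    scale = min(1.0, capacity_mbps / total) if total else 0.0
    return {t: d * scale for t, d in demand_mbps.items()}

demand = {"video": 40.0, "sensors": 5.0, "voice": 10.0}
print(static_waste({"video": 100.0, "sensors": 50.0, "voice": 50.0}, demand))  # 145.0 Mbps sits idle
print(dynamic_allocation(200.0, demand))  # each tenant gets exactly its current demand
```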

From the AI perspective, we’re also shifting from static AI to elastic AI systems. Elastic AI systems apply just the right resources and effort based on the amount of utility that we’re going to get from an IT/ICT optimization problem.
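A minimal sketch of that idea, under purely illustrative assumptions (the model names, costs, and accuracies below are made up), is to spend more compute only when the expected utility of a better answer justifies it:

```python
# Hypothetical model tiers: larger models cost more compute but answer more accurately.
MODELS = [
    {"name": "small",  "cost": 1.0,  "accuracy": 0.80},
    {"name": "medium", "cost": 4.0,  "accuracy": 0.90},
    {"name": "large",  "cost": 16.0, "accuracy": 0.95},
]

def pick_model(value_of_correct_answer: float) -> dict:
    """Choose the model whose expected utility (value * accuracy - cost) is highest."""
    return max(MODELS, key=lambda m: value_of_correct_answer * m["accuracy"] - m["cost"])

print(pick_model(10.0)["name"])   # low-stakes query  -> "small"
print(pick_model(500.0)["name"])  # high-stakes query -> "large"
```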

One way of managing high-level resources is to start putting in constraints and regulations so that we reduce the amount of energy that people use. Different companies need to be able to collaborate to reduce the total amount of energy and drive better efficiency.

All of those levels are ICT: the network itself, the layer that abstracts it and makes the applications we need work better, and the longer-term applications that match energy usage between companies in order to drive efficiency.

How do you foresee the growth of ICT use cases impacting NEC's multidisciplinary areas of research?

We aim not just to be a technology-driven lab but to be a problem-driven lab. As the ICT industry evolves, we also evolve our research program to remove key barriers and open new markets.

My personal view is that we’re going to see this automation and optimization covering a wide range of areas. And that’s what’s going to drive a lot of what we’re doing in the networking and hardware ICT side.

We’re also doing a lot of work around natural language task automation. If you look at where AI systems have been deployed to date, they don’t necessarily remove the large number of routine tasks to which we’re applying a lot of resources right now. For example, governments are actually receiving more documents, and they need more people to process those documents and move them through the system. This is a form of digitization, which the world has started doing over the past twenty years. The key is not just digitization but digital representation.

Instead of having a digital clone of what we have in the physical world, what’s necessary is to go beyond simple digitization and find a better representation of that information. That’s what natural language processing does. It takes a 100-page document and breaks it down into a set of features that we can use to actually optimize tasks and make decisions.

If you need to read 100 emails, a system that actually breaks that down and provides which emails to read first and which are for later is a game changer. That’s the kind of trivial optimization, the next layer of understanding, that’s not yet here but sits close on the horizon.
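A toy version of that triage step might look like the following sketch; the keywords, weights, and field names are purely illustrative. It scores each email by a few crude urgency signals and returns a reading order, so the likely important messages surface first:

```python
URGENT_WORDS = {"deadline": 3, "outage": 5, "invoice": 2, "urgent": 4}

def score(email: dict) -> float:
    """Crude urgency score: keyword hits in the subject plus a bonus for direct addressing."""
    subject = email["subject"].lower()
    keyword_score = sum(w for kw, w in URGENT_WORDS.items() if kw in subject)
    return keyword_score + (2 if email.get("to_me_directly") else 0)

def reading_order(emails: list) -> list:
    """Return emails sorted from most to least urgent."""
    return sorted(emails, key=score, reverse=True)

inbox = [
    {"subject": "Team lunch on Friday", "to_me_directly": False},
    {"subject": "URGENT: production outage", "to_me_directly": True},
    {"subject": "Invoice deadline reminder", "to_me_directly": False},
]
for e in reading_order(inbox):
    print(e["subject"])  # outage first, lunch last
```

A real system would of course learn these signals from data rather than hard-code them; the point is only that a better representation of the inbox enables the prioritization decision.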

Considering human augmentation, even just enabling a human to be 10% more efficient at making decisions, finding documents, or reading emails would be a huge efficiency gain when done globally.

In line with this, we’re evolving toward collaboration between humans and machines, or rather between humans and the world. The world being adaptive is itself a collaborative process.

Delivering distributed, collaborative, trusted, and private work is a key research task. Bringing all the pieces, network and people, together should be done in a way that allows us to trust what the system is going to do, and to do it quickly. If the system is designed well, the collaborators can solve a task faster than any of them could individually.

But in reality, coordinating people is hard and can slow down productivity. Hence, depending on the type of task and method, you can either gain or lose tremendously from collaborative efforts. There’s a lot of work that needs to happen in that space to understand, at a more fundamental level, which problems can be solved among collaborators.

How do you plan to continue delivering high-impact, problem-driven research to generate new and up-to-date information and innovative solutions?

I hope that we’re on the right trajectory with what we’re doing now. What we do inside NEC is work shoulder to shoulder with our business units, defining a vision for the future. In that conversation with them and their customers, we will find where the key challenges are.

Knowing the right problem is half the battle to doing good research. We’re going to take those problems and we’ll go back to our manifesto: smart people, hard problems, magic happens.

We’re going to collaborate broadly, working across all the global NEC Labs to give us a perspective from different countries. We have PhD researchers trained across a wide variety of disciplines, so we have the expertise to bring in when collaborating with academia and universities.

We draw in a wide range of both new and established talent from universities, who may be focused on more fundamental topics. We become the coupling agent between the problems that come from our business units and the cutting-edge academic research that could hopefully bring solutions to disrupt, create new markets, and make the world a better place for all of us.