Edge Executive Interview – Bill Walker, Tensor Networks

In the lead-up to Edge Computing World, we’re taking some time to speak to key executives from the leading companies supporting the show. Today we’re talking to Bill Walker, Co-Founder and CTO at Tensor Networks.


Tell us a bit about yourself – what led you to get involved in the edge computing market and Tensor Networks?

BW: I’m Bill Walker, CTO at Tensor Networks.  I have spent most of my career in the IT field with Sun Microsystems and some other great innovation-centered companies.  In 2011, I went overseas to work for a Network Equipment Provider, focusing on the IT side of things, including SDN at scale and the early work in NFV (Network Functions Virtualization).  In 2015, I came back to the US to work for a large Tier-1 carrier in the NFV space, running traditional network functions as software.  

It soon became apparent that the network functions made more sense closer to the customer edge.  We looked at Quality of Service and Latency as components of an overall application architecture and focused on the economics involved, in both cost (to us) and value (to them).  And if you happen to be placing IT hardware closer to the customer, or even at the customer premises, how can application and cloud architectures change to distribute the data to where it has the most value and context?

Seeing how the large telcos were progressing, and the difficulties they were having in implementing largely repetitive “invented here” solutions in this space, I saw a need for a mainstream vendor approach, providing more unified and standardized platform solutions to the edge market.  The telco mindset is changing, but is still largely a vendor-driven, large-purchase, long procurement cycle world.  So I decided that I could make change happen quicker and better from the vendor side of the market.

What is it you & your company are uniquely bringing to the edge market?

BW: I always ask people to first define edge.  To us, the “edge” is where data changes context and value.  There is the datacenter edge, the cloud edge, the campus edge, the building edge, the transport edge, and many more.  Data has context and value to many different consumers.  In retail, a store manager cares about how many customers are in the store and how long the checkout lines are.  Now.  Corporate headquarters might care more about customer satisfaction numbers based on wait times across stores and labor costs.  Same basic data, different levels of granularity and urgency.
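As a rough sketch of that “same data, different granularity” idea, the snippet below shows the same raw queue measurements serving two consumers: the store manager acting on the latest sample now, and headquarters receiving only a summary. The store identifier, threshold, and numbers are invented for illustration, not taken from any real deployment.

```python
# Toy illustration of "same basic data, different granularity": the store
# manager watches per-minute queue lengths right now, while headquarters
# only needs a summary row per store.  All names and numbers are illustrative.

from statistics import mean

# Raw edge data: checkout queue length sampled once a minute in one store.
queue_lengths = [2, 3, 5, 8, 7, 4, 2, 1, 6, 9, 10, 8]

# Edge view (store manager, "here and now"): act on the latest sample immediately.
current = queue_lengths[-1]
if current > 6:  # hypothetical local threshold
    print(f"Open another register: {current} customers waiting")

# Core view (headquarters, later): ship one summary row, not the raw stream.
daily_rollup = {
    "store_id": "store-042",               # hypothetical identifier
    "avg_queue": round(mean(queue_lengths), 1),
    "peak_queue": max(queue_lengths),
}
print(daily_rollup)
```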

We looked at NFV and the opportunities for carriers to distribute workloads around the edge, but also at the telcos’ customers’ markets.  If you are dispersing workloads, and I don’t mean just datacenter to cloud but everywhere data moves across the internet, there is a huge opportunity, from micro to macro to global, to optimize the processing of all data with locality and latency as key components of the solution architecture.  There is also a huge opportunity to manage the network as a function of data instead of the other way around, and to include the network itself in the evolution of processing.

Our solutions combine the processing, acceleration, networking, and deployment tools into a single platform for locality and latency sensitive AI and ML processing, and data security and acceleration.

Tell us more about the company: what are your advantages compared to others on the market, who is involved, and what are your major milestones so far?

BW: AI and large-volume data was a great use case from day one.  Adding acceleration like GPUs and FPGAs closer to the data meant that the customer was in control of their data, and that only critical data would be sent to the datacenter or the cloud for further analysis.  A simple example would be video security.  Why would you send 24/7 data that no one will ever see or care about to the cloud?  If we use local processing to recognize motion, people, cars, uniforms, or even squirrels, the data being transported to the cloud is reduced by 90% or more in most deployments.  If facilities management cares about squirrel counts, they can receive squirrel statistics from the local system without paying to transport and store videos of squirrels in the cloud.
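As a minimal sketch of that edge-filtering pattern (not Tensor Networks’ actual platform or APIs), the example below keeps the uneventful frames local and forwards only the ones that show change; the frame-difference check, the threshold, and the synthetic camera stream are all assumptions made purely for illustration.

```python
# Minimal sketch of edge filtering for video: process frames locally and
# only forward the ones that look interesting.  The detector here is a
# crude frame-difference check standing in for real motion/object detection.

import numpy as np

MOTION_THRESHOLD = 12.0  # assumed tuning value: mean per-pixel change that counts as "motion"


def frames_differ(prev, curr):
    """Cheap local motion check: mean absolute pixel difference between frames."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return float(diff.mean()) > MOTION_THRESHOLD


def filter_stream(frames):
    """Yield only frames worth sending upstream; keep the counts local."""
    prev, kept, total = None, 0, 0
    for frame in frames:
        total += 1
        if prev is not None and frames_differ(prev, frame):
            kept += 1
            yield frame  # only these frames would travel to the cloud
        prev = frame
    reduction = 100 * (1 - kept / max(total, 1))
    print(f"uploaded {kept}/{total} frames ({reduction:.0f}% reduction)")


if __name__ == "__main__":
    # Synthetic stand-in for a camera feed: mostly static frames with one short event.
    rng = np.random.default_rng(0)
    static = rng.integers(0, 255, (120, 160), dtype=np.uint8)
    stream = [static.copy() for _ in range(100)]
    for i in range(40, 45):  # a few frames with "motion"
        stream[i] = rng.integers(0, 255, (120, 160), dtype=np.uint8)
    list(filter_stream(stream))
```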

We have great relationships with early adopter customers and partners in the DoD, Research Labs, and Video processing worlds.  These markets have the greatest urgency for controlling and processing large volumes of data, and are the most latency and acceleration sensitive.  They have helped us to evolve our hardware and software, our platform, to deal with the hardest problems.  Now we are starting to see more adoption and traction in the more traditional markets including retail, telecommunications providers, and even shipping.

Our platform started shipping almost two years ago.  Since then, we have evolved from an operating system and reference hardware, to specific solution stacks for target industries.  We have added the tools, developer resources, and partnerships to get closer to a turnkey option.  Of course, nothing is “turnkey”, but if you can make an AI developer productive on day one, reduce the time to deployment, and streamline the process of creating and evolving system deployments (not just systems and software), we see huge wins in our customers’ productivity.

How do you see the edge market developing over the next few years?

BW: The future is bright.  I see a revolution coming, with companies beginning to take control of their data.  There is absolutely nothing wrong with “cloud”, but we see large corporations and even small companies starting to look at cloud as a non-binary solution.  “Put everything in the cloud” has turned out to be very inefficient, and often tragic.  I see IT and Telcos cooperating to optimize and really look deeper into how they use network, cloud, datacenter, and processing.

Combine this with the shift to more AI and ML workloads, more accelerated processing, more awareness of the true context of data, and what value that can provide, and we see a real tectonic shift.  The CIO will no longer be seen as an HR manager and spreadsheet guy.  The CIO will truly be recognized as a strategic asset of a corporation.  Data will be seen as an asset to be leveraged and exploited for advantage, instead of a cost center.

What are the main trends you see about integrating edge computing into different verticals?

BW: I see higher expectations for “now” and “here” in the corporate world, similar to what we have seen over the past decade with smartphones.  Perhaps it is the iPhone generation beginning to assume leadership roles in the corporate world.  Maybe it is just a cultural shift in our lives that is invading our workplaces.  But it is happening.  I am seeing end customers, both inside and outside of companies, become leaders in driving change and pushing for innovation.  The top-down world of corporate management is fading.

The other huge driver is, of course, data.  The sheer volume of data being produced is growing at a faster rate than telcos and clouds can possibly build infrastructure and evolve the technology to transport it.  How many petabytes of video cross the internet through streaming services?  How many new viruses do we need to model and find answers for?  We are becoming not only a data-driven society, but an impatient data-driven society.

I see the need for “situational awareness” growing very fast.  This is the intersection of “here/now” and “data-driven”.  And the edges, all of the intelligent edges, are critical to making that happen.

Thanks Bill – looking forward to hearing more from Tensor Networks at the event!