A product business can double its revenue and quadruple its margins by moving to a service business. What is a service? It's information that is personal and relevant to you.
Amazon delivers information that is personal and relevant to you, for example, with its recommendations: customers like you bought this book, or customers like you like this music. Now think about your favorite banking site and log in. I will contend that there's very little personal and relevant information there. The only reason you're being asked to log in is security. After that you are really looking at a big shopping cart to move money from savings to checking, buy a stock, sell a bond, etc.
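The "customers like you" idea above can be sketched with a few lines of code. This is a minimal illustration of collaborative filtering, not Amazon's actual method; all the names and purchase data are made up for the example.

```python
# Minimal "customers like you bought this" sketch (illustrative only).
# Purchase histories are represented as sets of product ids.
from collections import Counter

purchases = {
    "alice": {"book_a", "book_b", "album_x"},
    "bob":   {"book_a", "album_x", "album_y"},
    "carol": {"book_c"},
}

def recommend(user, histories, top_n=3):
    """Suggest items bought by customers whose purchases overlap with yours."""
    mine = histories[user]
    scores = Counter()
    for other, theirs in histories.items():
        if other == user:
            continue
        overlap = len(mine & theirs)      # shared purchases = similarity
        if overlap:
            for item in theirs - mine:    # items this user hasn't bought yet
                scores[item] += overlap
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("alice", purchases))
```

Here bob shares two purchases with alice, so the album only bob bought is recommended to her; carol shares nothing, so her book is not. The same pattern applies directly to "people like you bought this stock."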
Could the bank deliver information that's personal and relevant to you? Could they say that people like you bought this stock, or people like you refinanced their mortgage? Yes, they could, so why don't they? Well, you probably never thought about this, but the consumer Internet that Google and Bing let you see through search is believed to be only about 100 or 200 terabytes. That's it. Now, I'll guarantee your current IT systems have 10, 100, or 1,000 times that amount of information; so why can't they deliver information that is personal and relevant to you? Well, I say they are held hostage by the SQL monster. So let's just have a little fun here.
It’s the late ’90s and I have several SQL engineers in the room. I come in with a brilliant business idea. My idea is that we are going to index the consumer Internet and we’re going to monetize it with ads. We’re going to be billionaires! Can you guess what the SQL engineers would do?
The first thing they’re going to do is design a master global-data schema to index all information on the planet. The second thing they’re going to do is write ETL and data-cleansing tools to import all that information into this master global-data schema. And the last thing they are going to do is write reports, for instance, the best place to camp in France or great places to eat in San Francisco.
Any of you who are technical are probably laughing right now thinking, “Well that’s a completely stupid thing to do.” But if you try to attack the problem using SQL and BI tools, you’re also going to fail.
Furthermore, as you connect your machines, you have the opportunity to bring in large amounts of time-series data. Modern wind turbines have 500 sensors and the ability to transmit those sensor readings once a second. Most analytic techniques depend on the idea that the data scientist can try to visualize the data, but how is that possible if I have 1,000 wind turbines and data for 12, 24, or 36 months? How can we learn from that?
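The scale of those numbers is worth making concrete. This back-of-the-envelope sketch uses the figures from the text (500 sensors per turbine, one reading per second, 1,000 turbines) plus one assumption of my own: that each reading is stored as an 8-byte value.

```python
# Back-of-the-envelope estimate of wind-turbine sensor data volume.
# 500 sensors, 1 reading/sec, 1,000 turbines are from the text;
# 8 bytes per reading (one 64-bit float) is an assumption.

SENSORS_PER_TURBINE = 500
READINGS_PER_SECOND = 1
TURBINES = 1_000
BYTES_PER_READING = 8          # assumed
SECONDS_PER_YEAR = 365 * 24 * 3600

readings_per_year = (SENSORS_PER_TURBINE * READINGS_PER_SECOND
                     * TURBINES * SECONDS_PER_YEAR)
bytes_per_year = readings_per_year * BYTES_PER_READING

print(f"{readings_per_year:,} readings per year")       # ~15.8 trillion
print(f"{bytes_per_year / 1e12:.0f} TB of raw values per year")
```

On these assumptions, a single fleet of 1,000 turbines produces roughly 126 TB of raw readings in one year, the same order of magnitude as the 100 or 200 terabytes often cited for the entire searchable consumer Internet. No one is going to eyeball a chart of that.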
Artificial Intelligence (AI) has been increasingly in the news. Google’s DeepMind made headlines when its Go-playing machine, AlphaGo, defeated Lee Sedol, one of the best players in the world, 4-1. Amazon’s Echo, with its voice assistant Alexa, is widely praised for its voice recognition capabilities, and many people remember how Watson handily beat the best Jeopardy players in the world.
Things have been changing quickly and here is a great example. ImageNet is a database of millions of images. Beginning in 2010 the ImageNet Challenge was established to see how well a machine would do at object recognition. As a point of reference, an average person achieves about 95% accuracy. In 2010, the winning machine could correctly label an image 72% of the time. By 2012, accuracy had improved to 85%, and in 2015 the machine achieved 96% accuracy, surpassing the human benchmark.
So why have things been changing so quickly?
First, we’re continuing to get more computing and more storage at lower and lower prices. Next-generation compute and storage cloud services can provide thousands of computers for an hour or a day, and AI and machine learning software require lots of computing during the learning phase. The second reason is the emergence of neural network algorithms. Third, it’s not possible to apply these advanced AI technologies without data, and lots of it. Consumer Internet companies like Facebook are able to use billions of photos to train facial recognition systems; AlphaGo learned from millions of games of Go, and Alexa learned from millions of voice patterns.
While we’ll continue to see progress in replicating what humans do, we have the opportunity to apply these AI technologies to even more important challenges. Today, many of the machines that generate electricity, transport goods, farm food, or sequence genes generate large amounts of data. If we were able to connect these machines and collect the sensor data from them, we would have the opportunity to use AI and machine learning technologies to operate a more precise planet. Imagine a future farm that can use less pesticide, which not only reduces the cost of the food, but also makes it healthier. A future power utility could draw on a vast array of solar panels, wind turbines, small hydro generators, and batteries to generate more power, much more efficiently. A pediatric hospital could share the results of millions of MRI scans and diagnose patients far faster.
Next-generation machine companies could not only double their revenues and quadruple their margins, but build a better planet in the process.
Timothy Chou, Ph.D.
Timothy Chou has lectured at Stanford University for over twenty-five years and is the Alchemist Accelerator IoT Chair. Beyond his academic credentials, he has served as President of Oracle's cloud business and today is a board member at both Blackbaud and Teradata. He began his career at one of the first Kleiner Perkins startups, Tandem Computers, and today is working with several Silicon Valley startups, including serving as Executive Chairman of Lecida, which is building precision assistants for the IoT using AI technologies. Timothy has published several landmark books, including The End of Software and Precision: Principles, Practices and Solutions for the Internet of Things, which was recently named one of the top ten books for CIOs. He's lectured at over twenty universities and delivered keynotes on all six continents.