Artificial Intelligence (AI) is surrounded by a lot of hype and buzzwords. If you believe the marketing, AI seems shiny and new. Technology companies are rushing to tout their AI offerings, which come in a wide array of flavors.
Along with massive hype comes a massive price tag – right?
That used to be the case. A few years ago, projects that deployed the technologies we now call AI over business processes required a lot of resources. Most of those resources (hardware, software and people) were assembled for the explicit purpose of a single AI project. That was typically expensive, so only large companies could afford it.
It is no longer just for big business! I cannot point to a single event, announcement or product offering that changed everything, but I can point to a time: last year, 2017.
By sometime in 2017, much in AI had changed. The surge in hype during 2015 and 2016 preceded the cost changes of 2017, and in its wake a lot of great ideas blossomed. Beginning in 2017, AI became more affordable and more widely in demand in the mid-market. The economics of AI had most definitely changed.
Sure, there were lots of new products and services, and their price points continued to drop, as happens with just about everything in IT. However, no one product or vendor alone changed the market. Taken together, products and services became more affordable as they began to combine cloud and open source technologies. Don’t you love seeing the benefits of true competition?
Roughly speaking, 10 years ago deploying AI was costly and usually involved specialized software on specialized servers. Now, the widespread adoption of open source analytic languages has broadened the reach of AI. Many basic computer science courses use Python, and some use R, to train new analysts; these two open source languages are the most common ones supported by hosted AI services. Developing business-process-focused predictive models can now be done with relatively simple code. This increases the number of people who can develop models while lessening the need for big, costly, licensed statistical analysis packages.
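To give a sense of just how little code a basic predictive model now takes, here is a minimal sketch in Python using the open source scikit-learn library. The scenario and data are invented for illustration: each row describes a customer (monthly spend, support tickets) and the label marks whether that customer churned.

```python
# A minimal predictive model in open source Python (scikit-learn).
# The data is invented: (monthly_spend, support_tickets) per customer,
# with label 1 = churned, 0 = retained.
from sklearn.linear_model import LogisticRegression

X = [[100, 1], [95, 0], [20, 8], [15, 9], [80, 2], [10, 7]]
y = [0, 0, 1, 1, 0, 1]

model = LogisticRegression()
model.fit(X, y)

# Score a new customer with low spend and many support tickets.
print(model.predict([[12, 8]])[0])
```

A few lines like these replace what once required a licensed statistical package and a specialist to operate it; the same script runs unchanged on a laptop or a cloud-hosted notebook.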
Specialized hardware has also made strides as computer engineers continue to advance specialized processors. Midsize companies can now afford the power to run complex analysis over large amounts of data on commonly available servers. And for those who don’t want to host their own servers, technology vendors offer the same open source systems as cloud services. That lets you pay as you go without hiring expensive technical staff just to maintain specialty hardware.
The development of new techniques also pays a dividend. Part of the drop in the cost of AI processing in the cloud comes from the hot topics of deep learning and neural nets: as large companies race toward the latest data science algorithms to solve HUGE problems, service providers have lowered the price of more traditional AI services, presumably because those techniques are already developed, proven and stable.
Last year, 2017, appears to have been the year of the AI price inflection. It may take a year or two for people to realize it, but a wave is coming to midsize companies.