From Details to Disinterest
Before the dawn of AWS and the official branding of the “cloud,” expanding your data storage meant determining which servers, routers, UPS units, switches, hard drives, applications, and storage-management solutions to buy. You also had to choose a hosting facility, then source and test every element of the system.
Now, you log into your cloud console, make a few choices, and automatically allocate another terabyte or petabyte of storage, without needing to know a single fact about the back-end hardware or software.
You’ve Seen One Platform…
Gartner’s Magic Quadrant for Data Science platforms lists 16 different options. First, you have to tell them apart. Then you need to determine which one your team can work with best. After you’ve spent a few million dollars integrating it into your existing corporate data infrastructure, your data science team can finally get to work.
I Can’t Tell Them Apart. Can You?
Wikipedia lists almost 275 cloud providers. What are the differences among them? Cataloging every factor you’d need to tell them apart, if that’s even possible, would produce the spreadsheet from hell. Do you need to fully understand how your electric car’s motor maximizes battery life to be confident you won’t have to charge it every 100 miles?
The Bottom Line
When making decisions about AI and data science, the mathematical elements are far less important than the measurable results.
As with cloud computing, AI is going to become ubiquitous. Ultimately, it won’t matter which platform you use.
The highest level of AI’s evolution is simply the application: the most effective use cases, delivering the accurate, stable, high-performance results that most benefit your bottom line.
You don’t need to care how they got there – just that they deliver.