Artificial intelligence (AI) is steadily making its presence felt in the enterprise mainstream, but it still faces significant challenges before it can become a viable option for everyday operations. Unless those challenges are met, the technology risks losing its economic impact and being relegated to a niche.
This is why AI deployment has been such a prominent theme this year. Moving any technology from the lab into production is never easy, but AI is especially challenging because there are so many possible outcomes for each problem it tries to solve. Organizations must be both careful and quick to stay ahead of the competition in an ever-changing landscape.
We are making steady progress in deploying AI into production
According to IDC, 31% of IT decision-makers say they have put AI into production, but only 33% of those consider their deployments mature. Maturity is the point at which AI begins to benefit enterprise-wide business models, whether by improving customer satisfaction, automating decision-making, or streamlining processes.
As one might expect, the biggest challenge is extracting real value while dealing with data and infrastructure at the scale AI requires. Even in the cloud, it is difficult to build and maintain a data infrastructure of this size. Just as difficult is efficiently conditioning data to remove bias, duplicates, and other factors that can skew results. Many organizations therefore turn to pre-trained, off-the-shelf AI platforms, which can be deployed quickly but are less adaptable and harder to integrate into legacy workflows.
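Data conditioning at this scale is partly an engineering problem. As a minimal sketch (the `dedupe` helper and record format here are hypothetical, not drawn from any particular platform), duplicate records can be filtered out before training by collapsing each record to a normalized key:

```python
def dedupe(records, key=lambda r: r["text"].strip().lower()):
    """Keep the first occurrence of each record, judged by a normalized key."""
    seen, unique = set(), []
    for record in records:
        k = key(record)
        if k not in seen:  # near-duplicates collapse to the same key
            seen.add(k)
            unique.append(record)
    return unique

raw = [
    {"text": "Great product", "label": 1},
    {"text": "great product ", "label": 1},  # duplicate after normalization
    {"text": "Terrible support", "label": 0},
]
clean = dedupe(raw)
```

Real pipelines use fuzzier matching than exact string keys, but the principle — normalize, then filter — is the same.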
Scale is not only about the size of a project, but also about coordination. Sumanth Vakada, founder and CEO of Qualetics Data Machines, explains that infrastructure and a lack of dedicated resources are key barriers to scaling, but that issues such as siloed architectures and isolated work cultures also persist in many organizations. These can prevent critical data from reaching AI models, which leads to inaccurate outcomes. And few organizations have thought about enterprise-wide governance, which not only helps AI serve common goals but also provides crucial support functions such as security and compliance.
The case for AI infrastructure on-premises
Although it may seem tempting to rely on the cloud for the infrastructure that large-scale AI deployments require, that is not always the best option. A recent white paper from Nvidia and Supermicro attempts to discredit this notion, at least partially. The companies argue that on-premises infrastructure is better suited to certain situations, such as:
- When applications require sensitive or proprietary data
- When the infrastructure can also support other data-heavy applications, such as VDI
- When cloud costs would rise to unsustainable levels as data loads increase
- When specific hardware configurations or performance requirements cannot be guaranteed in the cloud
- When enterprise-grade support is required to complement in-house staff or expertise
An on-premises strategy is only viable, of course, if the infrastructure's cost and physical footprint remain within reason. Where direct control is needed, though, an on-prem deployment can be designed around the same ROI factors as a third-party solution.
Yet in terms of both scale and operational proficiency, many companies seem to have put the AI cart before the horse: they want to reap all the benefits of AI without investing in the right support.
Jeff Boudier, head of product and growth at AI language developer Hugging Face, recently pointed out to VentureBeat that it is extremely difficult to share and version AI models, code, and datasets without proper backing from data science teams. That burden falls on project managers as they try to bring these elements into production environments, and it ultimately makes the technology harder to use and less useful.
Many organizations are trying to force AI into traditional software development workflows from the pre-collaboration, pre-version-control era, rather than using it as an opportunity to build a modern MLOps environment. AI is like any other technology: without adequate support for development and training, the whole project can fall apart.
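The versioning problem described above can be illustrated with a minimal sketch (the `fingerprint` helper is hypothetical, not any library's API): content-hashing a dataset's canonical form yields a stable version identifier, so any change to the data produces a new id that a trained model can be traced back to.

```python
import hashlib
import json

def fingerprint(records):
    """Derive a short, deterministic version id from a dataset's contents."""
    canonical = json.dumps(records, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()[:12]

v1 = fingerprint([{"text": "hello", "label": 0}])
v2 = fingerprint([{"text": "hello!", "label": 0}])  # one-character change
changed = v1 != v2
```

Tools like Git LFS, DVC, or the Hugging Face Hub build on the same content-addressing idea, at the scale of multi-gigabyte model weights and datasets.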
The most important stage in AI’s evolution is its deployment into real-world settings, where it will ultimately prove its value to the business model. Although a full accounting of that worth could take decades, for now it is better to implement AI and fail than to hold back.
VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Learn more about membership.